Delete a BigQuery table from Spark using Scala

Stack Overflow Asked by Counter10000 on December 17, 2020

Is there a way to drop a BigQuery table from Spark by using Scala?

I have only found ways to read and write a BigQuery table from Spark using Scala, from the example here:
https://cloud.google.com/dataproc/docs/tutorials/bigquery-connector-spark-example

Can someone provide an example of dropping a BigQuery table? For example, in the BigQuery console I can drop a table with the statement "drop table if exists projectid1.dataset1.table1".

Please note that my purpose in removing the existing table is NOT to overwrite it; I simply want to remove it. Please help. Thanks.

One Answer

Please refer to the BigQuery API:

// Uses the BigQuery client library that is bundled (repackaged) inside the
// spark-bigquery connector, so no extra dependency is needed.
import com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.{BigQueryOptions, TableId}

val bq = BigQueryOptions.getDefaultInstance().getService()

// getTable() returns null if the table does not exist, so this behaves
// like "drop table if exists".
val table = bq.getTable(TableId.of("projectid1", "dataset1", "table1"))
if (table != null) {
  table.delete()
}

Note that this works out of the box on Dataproc, where default credentials are available. On other clusters you will need to set the credentials properly.
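For non-Dataproc environments, a minimal sketch of supplying explicit credentials, assuming a service-account JSON key at a hypothetical path and the standalone google-cloud-bigquery client library on the classpath (the connector's repackaged classes expose the same API under a different package prefix):

```scala
import java.io.FileInputStream
import com.google.auth.oauth2.ServiceAccountCredentials
import com.google.cloud.bigquery.{BigQueryOptions, TableId}

object DropTableExample {
  def main(args: Array[String]): Unit = {
    // Hypothetical path to a service-account key file; adjust for your cluster.
    val keyPath = "/path/to/service-account-key.json"

    val credentials = ServiceAccountCredentials.fromStream(new FileInputStream(keyPath))

    // Build a client with explicit credentials instead of the default
    // credentials that Dataproc provides automatically.
    val bq = BigQueryOptions.newBuilder()
      .setCredentials(credentials)
      .setProjectId("projectid1")
      .build()
      .getService()

    // delete() returns true if the table was deleted and false if it did
    // not exist, giving "drop table if exists" semantics.
    val deleted = bq.delete(TableId.of("projectid1", "dataset1", "table1"))
    println(s"Table deleted: $deleted")
  }
}
```

The same credentials can also be passed to the Spark connector itself via the `credentialsFile` or `credentials` options when reading or writing.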

Correct answer by David Rabinowitz on December 17, 2020
