Hi
It would be great to have `query` as an optional field on the `extract_table_to_storage` method in `google.cloud.bigquery.client`, so that it is easy to extract data from a table with a custom query and write it to Google Cloud Storage. Alternatively, is there a way to write the results of a custom query directly after creating the job?
import uuid

from google.cloud import bigquery

client = bigquery.Client(project='test-sample')

query = """
SELECT * FROM test_database.activity
WHERE activity_date = DATE_ADD(CURRENT_DATE(), -1, 'DAY')
ORDER BY activity_date
"""

# Current workaround: run the query into a destination table first ...
dataset = client.dataset('test_database')
job_name = 'test_job_name_' + str(uuid.uuid4())
query_job = client.run_async_query(job_name, query)
query_job.destination = dataset.table('activity_export')
query_job.begin()
query_job.result()  # wait for the query to finish

# ... then export that table to Cloud Storage (bucket name is a placeholder).
extract_job = client.extract_table_to_storage(
    'extract-' + job_name, query_job.destination,
    'gs://my-bucket/activity_export.csv')
extract_job.begin()
Or is there any other workaround? Some of my tables are partitioned and some are not, and I sometimes need to export data for a given date from both kinds of tables for analysis with other tools.
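For the ingestion-time partitioned tables specifically, BigQuery lets a job address a single partition with the `$YYYYMMDD` decorator, so one day's data can be extracted without running a query at all. A minimal sketch of building such a reference (dataset and table names are placeholders from the snippet above):

```python
from datetime import date

def partition_table_id(dataset, table, day):
    """Return the partition-decorator form (table$YYYYMMDD) that
    BigQuery job APIs accept as a source for a single day's partition."""
    return '{}.{}${}'.format(dataset, table, day.strftime('%Y%m%d'))

print(partition_table_id('test_database', 'activity', date(2017, 3, 14)))
# test_database.activity$20170314
```

The decorated id can then be used as the source table of an extract job, which avoids the query-to-destination-table step for partitioned tables.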
Thanks