Consistent with the recent work to allow the API to accept some primitives as well as specific objects, the schema argument to LoadJobConfig* could accept a list of dicts. Currently it requires constructing a bunch of SchemaField objects:
```python
schema = [
    # could be just column_dicts
    bigquery.schema.SchemaField.from_api_repr(column_dict)
    for column_dict in column_dicts
]
with open(empty_path, "rb") as source_file:
    job_config = bigquery.job.LoadJobConfig(
        schema=schema, write_disposition="WRITE_TRUNCATE"
    )
    job = gbq_client.load_table_from_file(
        source_file,
        table_ref,
        location="US",
        job_config=job_config,
    )
```
We're trying to move to using the BQ Python API rather than subprocessing out to a shell with bq; it's gotten much better, but it's still a bit Java-esque, and these are examples of times it's a tougher sell than constructing a string and sending it to bash. As ever, lmk if the API isn't designed for these cases and you'd encourage users to use bash.
*this could also be a dict, though less of an imperative.
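For illustration, the kind of coercion being requested could be sketched like this. This is only a sketch: `StubSchemaField` below is a stand-in for `bigquery.schema.SchemaField` (so the snippet runs without the library), and `coerce_schema` is a hypothetical helper, not an existing API:

```python
from typing import NamedTuple, Union


class StubSchemaField(NamedTuple):
    """Stand-in for bigquery.schema.SchemaField, for illustration only."""
    name: str
    field_type: str
    mode: str = "NULLABLE"

    @classmethod
    def from_api_repr(cls, d: dict) -> "StubSchemaField":
        # Mirrors the API representation keys: "name", "type", "mode".
        return cls(d["name"], d["type"], d.get("mode", "NULLABLE"))


def coerce_schema(schema: list) -> list:
    """Accept a mix of plain dicts and field objects, returning field objects."""
    return [
        StubSchemaField.from_api_repr(f) if isinstance(f, dict) else f
        for f in schema
    ]


# With this kind of coercion in LoadJobConfig, callers could pass the
# dicts directly instead of constructing SchemaField objects themselves:
column_dicts = [
    {"name": "id", "type": "INTEGER", "mode": "REQUIRED"},
    {"name": "label", "type": "STRING"},
]
fields = coerce_schema(column_dicts)
```

The point is just that the dict-to-object conversion is mechanical, so it seems like something the library could do internally when a dict is passed.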