I am using the BigQuery connector to transfer data to a BQ table. I am wondering if there is an option to set a table expiration when using the `dataframe.write()` operation. I see that the connector supports other types of expiration, but not this one.
Thank you!
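As a workaround until the connector supports this, the expiration can be set on the table after the Spark write finishes, using the separate `google-cloud-bigquery` Python client (this is a sketch of that client's API, not a connector option; project/dataset/table names are placeholders):

```python
from datetime import datetime, timedelta, timezone


def expiration_time(hours: float) -> datetime:
    """Compute an absolute expiration timestamp `hours` from now (UTC)."""
    return datetime.now(timezone.utc) + timedelta(hours=hours)


def set_table_expiration(project: str, dataset: str, table: str, hours: float) -> None:
    """Set `expires` on an existing BigQuery table via google-cloud-bigquery.

    Runs outside the connector, after dataframe.write() has completed.
    Requires the google-cloud-bigquery package and valid credentials.
    """
    from google.cloud import bigquery

    client = bigquery.Client(project=project)
    tbl = client.get_table(f"{project}.{dataset}.{table}")
    tbl.expires = expiration_time(hours)
    # Patch only the expiration field, leaving everything else untouched.
    client.update_table(tbl, ["expires"])
```

After that call, BigQuery deletes the table automatically once the timestamp passes, so no explicit cleanup job is needed.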
This would be super-helpful in the case of merge queries. It often happens that you need to update and/or insert rows in a table. The usual pattern is to write the new data to a temporary table and then run the MERGE query. Currently, using the connector, you can't run the MERGE query or delete the temporary table afterwards. If you could set an expiration timestamp on the table, you would only need to perform the MERGE outside of Spark.
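The upsert step of that pattern can be sketched as follows. This builds a standard BigQuery `MERGE` statement against a staging table that the connector wrote; the table names, key, and column list are hypothetical, and the statement would be run with the separate `google-cloud-bigquery` client rather than through the connector:

```python
def build_merge_sql(target: str, staging: str, key: str, cols: list[str]) -> str:
    """Construct a BigQuery MERGE that upserts rows from a staging table
    (written by the Spark connector) into the target table.

    All identifiers are caller-supplied placeholders.
    """
    update_set = ", ".join(f"T.{c} = S.{c}" for c in cols)
    col_list = ", ".join(cols)
    val_list = ", ".join(f"S.{c}" for c in cols)
    return (
        f"MERGE `{target}` T USING `{staging}` S ON T.{key} = S.{key} "
        f"WHEN MATCHED THEN UPDATE SET {update_set} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list})"
    )


# Run it outside Spark, e.g.:
#   client.query(build_merge_sql("p.d.target", "p.d.staging", "id", ["id", "val"])).result()
# If the staging table was created with an expiration, no explicit DELETE is needed:
# BigQuery drops it automatically when the expiration passes.
```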