I have a DataFrame that I write to a BigQuery table using the direct write method. I would like to collect some metrics during the write operation. The write succeeds, but reading the metric value gets stuck and never returns.
Thank you for the prompt response. I tried the latest version, 0.32.2, but I still see the same issue.
I tried starting the Spark shell once with --packages com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.32.2 and once with com.google.cloud.spark:spark-3.3-bigquery:0.32.2, and both behave the same.
Here is the workflow.
Run a Spark shell as follows:
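For example, something along these lines (either of the two package coordinates mentioned above reproduces the issue; any other flags from the original invocation are omitted here):

spark-shell --packages com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.32.2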
Then run the below snippet inside it.
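The following is a minimal sketch of the kind of snippet involved, assuming a 100-row DataFrame observed via org.apache.spark.sql.Observation with a metric named "count" and written with the connector's direct write method; the DataFrame contents and the table name are placeholders:

import org.apache.spark.sql.Observation
import org.apache.spark.sql.functions._

// Hypothetical 100-row DataFrame; the actual data does not matter for the repro.
val df = spark.range(100).toDF("id")

// Attach an observation that counts the rows as the query runs.
val obs = Observation("metrics")
val df_wrapped = df.observe(obs, count(lit(1)).as("count"))

// Direct write to BigQuery (placeholder table).
df_wrapped.write
  .format("bigquery")
  .option("writeMethod", "direct")
  .mode("append")
  .save("<PROJECT>.<DATASET>.<TABLE>")

// Blocks until the observed metric values are available.
obs.get("count")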
The last statement, obs.get("count"), just hangs forever and never returns. If we instead change the write operation to write Parquet to GCS, the last statement returns fine and gives 100, which is the record count.
df_wrapped.write.parquet("gs://<SOME-GCS-PATH>")