Hi, I'm using Airbyte to sync Universal Analytics data to BigQuery. Over the past few days the sync has repeatedly failed, and the failure reason appears to be "Failure reason: message='activity timeout', timeoutType=TIMEOUT_TYPE_HEARTBEAT".
The full log is below. Could you help me figure out the cause? Thank you.
2024-05-06 09:03:40 INFO i.a.c.i.LineGobbler(voidCall):149 - Checking if airbyte/source-google-analytics-v4:0.3.1 exists...
2024-05-06 09:03:41 INFO i.a.c.i.LineGobbler(voidCall):149 - airbyte/source-google-analytics-v4:0.3.1 was found locally.
2024-05-06 09:03:41 INFO i.a.w.p.DockerProcessFactory(create):139 - Creating docker container = destination-bigquery-write-477-2-hmvws with resources io.airbyte.config.ResourceRequirements@888b1a6[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2024-05-06 09:03:41 INFO i.a.w.p.DockerProcessFactory(create):139 - Creating docker container = source-google-analytics-v4-read-477-2-werrr with resources io.airbyte.config.ResourceRequirements@4e0283e7[cpuRequest=0.5,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@4e51959[hosts=[oauth2.googleapis.com, www.googleapis.com, analyticsdata.googleapis.com, analyticsreporting.googleapis.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-05-06 09:03:41 INFO i.a.c.i.LineGobbler(voidCall):149 - airbyte/destination-bigquery:1.5.1 was found locally.
2024-05-06 09:03:41 INFO i.a.w.p.DockerProcessFactory(create):192 - Preparing command: docker run --rm --init -i -w /data/477/2 --log-driver none --name source-google-analytics-v4-read-477-2-werrr -e CONCURRENT_SOURCE_STREAM_READ=false --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-google-analytics-v4:0.3.1 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e USE_STREAM_CAPABLE_STATE=true -e FIELD_SELECTION_WORKSPACES= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=2 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.50.7 -e WORKER_JOB_ID=477 airbyte/source-google-analytics-v4:0.3.1 read --config source_config.json --catalog source_catalog.json
2024-05-06 09:03:41 INFO i.a.w.p.DockerProcessFactory(create):192 - Preparing command: docker run --rm --init -i -w /data/477/2 --log-driver none --name destination-bigquery-write-477-2-hmvws --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.5.1 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e USE_STREAM_CAPABLE_STATE=true -e FIELD_SELECTION_WORKSPACES= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=2 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.50.7 -e WORKER_JOB_ID=477 airbyte/destination-bigquery:1.5.1 write --config destination_config.json --catalog destination_catalog.json
2024-05-06 09:03:41 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2024-05-06 09:03:41 INFO i.a.w.i.VersionedAirbyteMessageBufferedWriterFactory(createWriter):41 - Writing messages to protocol version 0.2.0
2024-05-06 09:03:41 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):177 - Reading messages from protocol version 0.2.0
2024-05-06 09:03:41 INFO i.a.w.g.BufferedReplicationWorker(readFromSource):291 - readFromSource: start
2024-05-06 09:03:41 INFO i.a.w.i.HeartbeatTimeoutChaperone(runWithHeartbeatThread):94 - Starting source heartbeat check. Will check every 1 minutes.
2024-05-06 09:03:41 INFO i.a.w.g.BufferedReplicationWorker(processMessage):327 - processMessage: start
2024-05-06 09:03:41 INFO i.a.w.g.BufferedReplicationWorker(writeToDestination):368 - writeToDestination: start
2024-05-06 09:03:41 INFO i.a.w.g.BufferedReplicationWorker(readFromDestination):401 - readFromDestination: start
2024-05-06 09:03:45 source > Starting syncing SourceGoogleAnalyticsV4
2024-05-06 09:03:49 destination > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2024-05-06 09:03:49 destination > INFO i.a.i.b.IntegrationRunner(runInternal):106 Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2024-05-06 09:03:49 destination > INFO i.a.i.b.IntegrationRunner(runInternal):107 Command: WRITE
2024-05-06 09:03:49 destination > INFO i.a.i.b.IntegrationRunner(runInternal):108 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2024-05-06 09:03:49 source > Marking stream carrefour_landingpage_01 as STARTED
2024-05-06 09:03:49 source > Syncing stream: carrefour_landingpage_01
2024-05-06 09:03:50 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):236 - Attempt 0 to stream status started null:carrefour_landingpage_01
2024-05-06 09:03:50 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-06 09:03:50 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-06 09:03:50 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword always_show - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-05-06 09:03:50 destination > INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):422 Selected loading method is set to: STANDARD
2024-05-06 09:03:50 destination > WARN i.a.i.d.b.BigQueryDestination(getConsumer):213 The "standard" upload mode is not performant, and is not recommended for production. Please use the GCS upload mode if you are syncing a large amount of data.
2024-05-06 09:03:51 destination > INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):422 Selected loading method is set to: STANDARD
2024-05-06 09:03:53 source > Google Analytics data is sampled. Consider using a smaller window_in_days parameter. For more info check https://developers.google.com/analytics/devguides/reporting/core/v4/basics#sampling
2024-05-06 09:03:54 source > Marking stream carrefour_landingpage_01 as RUNNING
2024-05-06 09:03:54 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):236 - Attempt 0 to update stream status running null:carrefour_landingpage_01
2024-05-06 09:03:56 destination > INFO i.a.i.d.b.BigQueryUtils(createPartitionedTableIfNotExists):227 Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=jessica_airbyte_ua, tableId=_airbyte_tmp_spn_carrefour_landingpage_01}}
2024-05-06 09:03:56 destination > INFO i.a.i.d.b.BigQueryUtils(getLoadingMethod):422 Selected loading method is set to: STANDARD
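One thing I noticed in the log is the source warning that the Google Analytics data is sampled and that a smaller window_in_days may help. In case it's relevant, this is roughly the shape of my source config with window_in_days lowered (all values here are placeholders, not my real settings):

```json
{
  "view_id": "XXXXXXXX",
  "start_date": "2024-01-01",
  "window_in_days": 1,
  "credentials": {
    "auth_type": "Service",
    "credentials_json": "<redacted>"
  }
}
```

I'm not sure whether the sampling warning is related to the heartbeat timeout, or whether the source is simply stalling for more than a minute without emitting records.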