
Start ape_dts as an HTTP server to provide data to consumers

Refer to tutorial

ape_dts starts as an HTTP server, pulls CDC data from MySQL/Postgres, and caches it in memory.

Consumers can pull and consume data from ape_dts via the API. The data format is Avro, the same as in MySQL -> Kafka.

Snapshot tasks are NOT supported, since snapshot data is more conveniently queried and consumed directly through SQL.

API

info

Get the current information of the server.

curl "http://127.0.0.1:10231/info"

Response

{"acked_batch_id":0,"sent_batch_id":0}
  • batch_id: Generated by ape_dts; increments by 1 with each data pull by the consumer, starts from 0, and resets when ape_dts restarts.
  • sent_batch_id: The maximum batch_id that has been sent to the client.
  • acked_batch_id: The maximum batch_id that has been acknowledged by the client, acknowledged data will be removed from ape_dts's cache.
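The bookkeeping these fields describe can be sketched as a small in-memory model. This is a hypothetical illustration, not ape_dts's actual implementation; only the field names sent_batch_id and acked_batch_id come from the /info response.

```python
# Hypothetical model of ape_dts's batch bookkeeping; field names mirror
# the /info response, everything else is illustrative.
class BatchCache:
    def __init__(self):
        self.sent_batch_id = 0   # highest batch_id handed to the client
        self.acked_batch_id = 0  # highest batch_id the client confirmed
        self.cache = {}          # batch_id -> data still held in memory

    def record_sent(self, batch_id, data):
        self.cache[batch_id] = data
        self.sent_batch_id = max(self.sent_batch_id, batch_id)

    def ack(self, ack_batch_id):
        # ack must fall between the last ack and the last sent batch.
        assert self.acked_batch_id <= ack_batch_id <= self.sent_batch_id
        # Acknowledged batches are evicted from the cache.
        for bid in [b for b in self.cache if b <= ack_batch_id]:
            del self.cache[bid]
        self.acked_batch_id = ack_batch_id
        return {"acked_batch_id": self.acked_batch_id}
```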

fetch_new

Fetch new data from the server.

curl "http://127.0.0.1:10231/fetch_new?batch_size=2&ack_batch_id=1"

Parameters

  • batch_size: The maximum number of records to pull. If ape_dts's cache holds fewer, all available data is returned.
  • ack_batch_id: Optional.
    • If set, data with batch_id <= ack_batch_id will be removed from ape_dts's cache.
    • ack_batch_id must be >= acked_batch_id returned by info.
    • ack_batch_id must be <= sent_batch_id returned by info.
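A small client-side helper can enforce these ack_batch_id rules before issuing the request. This is a hypothetical sketch: the function name and the acked_batch_id/sent_batch_id arguments (the values last returned by /info) are illustrative.

```python
from urllib.parse import urlencode

def fetch_new_url(base, batch_size, ack_batch_id=None,
                  acked_batch_id=0, sent_batch_id=0):
    """Build a /fetch_new URL, validating ack_batch_id against the
    acked/sent range reported by /info. Hypothetical helper."""
    params = {"batch_size": batch_size}
    if ack_batch_id is not None:
        if not (acked_batch_id <= ack_batch_id <= sent_batch_id):
            raise ValueError("ack_batch_id outside [acked_batch_id, sent_batch_id]")
        params["ack_batch_id"] = ack_batch_id
    return f"{base}/fetch_new?{urlencode(params)}"
```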

Response

{"data":[[14,116,101,115,116,95,100,98,8,116,98,95,49,12,105,110,115,101,114,116,2,4,4,105,100,6,105,110,116,8,76,111,110,103,10,118,97,108,117,101,6,105,110,116,8,76,111,110,103,0,0,2,4,4,105,100,4,2,10,118,97,108,117,101,4,2,0,0],[14,116,101,115,116,95,100,98,8,116,98,95,49,12,105,110,115,101,114,116,2,4,4,105,100,6,105,110,116,8,76,111,110,103,10,118,97,108,117,101,6,105,110,116,8,76,111,110,103,0,0,2,4,4,105,100,4,4,10,118,97,108,117,101,4,4,0,0]],"batch_id":1}
  • data: Multiple data entries encoded in Avro format. Refer to the Parse and Consume section later in this article.
  • batch_id: The ID generated by ape_dts for this pull.
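Each entry in data is a byte array of Avro binary. Assuming the leading fields are Avro strings (schema, table, operation), as in the MySQL -> Kafka Avro format, the sample entry above can be hand-decoded like this. A real consumer should use an Avro library with the published schema; this sketch only shows the wire encoding.

```python
def read_varint(buf, pos):
    """Read one Avro zigzag-encoded varint starting at pos."""
    shift, result = 0, 0
    while True:
        b = buf[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not b & 0x80:
            break
        shift += 7
    return (result >> 1) ^ -(result & 1), pos

def read_string(buf, pos):
    """Read one Avro string: zigzag length, then UTF-8 bytes."""
    length, pos = read_varint(buf, pos)
    return buf[pos:pos + length].decode("utf-8"), pos + length

# Leading bytes of the first entry in the sample response above.
entry = bytes([14, 116, 101, 115, 116, 95, 100, 98,   # "test_db"
               8, 116, 98, 95, 49,                     # "tb_1"
               12, 105, 110, 115, 101, 114, 116])      # "insert"
schema, pos = read_string(entry, 0)
table, pos = read_string(entry, pos)
operation, pos = read_string(entry, pos)
# schema == "test_db", table == "tb_1", operation == "insert"
```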

fetch_old

Fetch previously sent data from the server; the same batch can be fetched repeatedly.

curl "http://127.0.0.1:10232/fetch_old?old_batch_id=1"

Parameters

  • old_batch_id: The batch_id of the old data to be fetched.
    • old_batch_id must be <= sent_batch_id returned by info.
    • old_batch_id must be > acked_batch_id returned by info.
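The two constraints together say a batch can be re-fetched only while it is still cached: already sent, but not yet acknowledged (acknowledged batches are evicted). A hypothetical predicate capturing this:

```python
def can_refetch(old_batch_id, acked_batch_id, sent_batch_id):
    """True if old_batch_id is still cached by ape_dts:
    already sent, but not yet acknowledged."""
    return acked_batch_id < old_batch_id <= sent_batch_id
```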

Response

{"data":[[14,116,101,115,116,95,100,98,8,116,98,95,49,12,105,110,115,101,114,116,2,4,4,105,100,6,105,110,116,8,76,111,110,103,10,118,97,108,117,101,6,105,110,116,8,76,111,110,103,0,0,2,4,4,105,100,4,2,10,118,97,108,117,101,4,2,0,0],[14,116,101,115,116,95,100,98,8,116,98,95,49,12,105,110,115,101,114,116,2,4,4,105,100,6,105,110,116,8,76,111,110,103,10,118,97,108,117,101,6,105,110,116,8,76,111,110,103,0,0,2,4,4,105,100,4,4,10,118,97,108,117,101,4,4,0,0]],"batch_id":1}
  • Same as fetch_new.

ack

Send an acknowledgement to ape_dts.

curl -X POST "http://127.0.0.1:10232/ack" -H "Content-Type: application/json" -d '{"ack_batch_id": 6}'

Parameters

  • ack_batch_id: Same as the ack_batch_id parameter in fetch_new.

Response

{"acked_batch_id":1}
  • acked_batch_id: Same as the acked_batch_id returned by info.
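Putting fetch_new and ack together, a consumer typically acks only after it has durably processed a batch, so unprocessed data stays in ape_dts's cache across a crash. A hypothetical loop, with http_get/http_post injected so any HTTP client (urllib, requests, ...) can be plugged in; all names here are illustrative:

```python
import json

def consume_loop(base, http_get, http_post, batch_size=100, max_batches=10):
    """Hypothetical consumer: repeatedly fetch, process, then ack.

    http_get(url) -> str and http_post(url, json_body) -> str are
    supplied by the caller.
    """
    processed = []
    last_batch_id = None
    for _ in range(max_batches):
        resp = json.loads(http_get(f"{base}/fetch_new?batch_size={batch_size}"))
        if not resp["data"]:
            break
        processed.extend(resp["data"])  # decode Avro entries here
        last_batch_id = resp["batch_id"]
        # Ack only after successful processing, so unacked data
        # remains cached and can be re-fetched via /fetch_old.
        http_post(f"{base}/ack", json.dumps({"ack_batch_id": last_batch_id}))
    return processed, last_batch_id
```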

Parse and Consume

python / golang consumer demo