Ingest REST API

You can run INSERT INTO stream (column1, column2) VALUES (..) SQL to insert data into Proton. You can also call the ingest REST API to push data to Proton from any language you prefer. This may work better when integrating Proton with other systems or live data sources (such as IoT sensors).

Prerequisites

Expose port 3218 from Proton container

The Proton ingest REST API listens on port 3218 by default. Start the Proton container with port 3218 exposed. For example:

docker run -d -p 3218:3218 --pull always --name proton ghcr.io/timeplus-io/proton:latest 

Create a stream in Proton

You need to create a stream in Proton via CREATE STREAM, defining each column with an appropriate name and type.

First run the SQL client

docker exec -it proton proton-client

Then run the following SQL to create the stream.

CREATE STREAM foo(id int, name string)

Push data to Timeplus

The endpoint for real-time data ingestion is http://localhost:3218/proton/v1/ingest/streams/{stream_name}. The HTTP method is POST.

Request sample:

curl -s -X POST http://localhost:3218/proton/v1/ingest/streams/foo \
  -H "Content-Type: application/json" \
  -d '{
    "columns": ["id", "name"],
    "data": [
      [1, "hello"],
      [2, "world"]
    ]
  }'
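The same request can be made from application code. Below is a minimal sketch using only the Python standard library; the stream name foo, the port 3218, and the rows match the curl example above, while the helper names (build_payload, ingest) are illustrative, not part of Proton:

```python
import json
import urllib.request

INGEST_URL = "http://localhost:3218/proton/v1/ingest/streams/{stream}"

def build_payload(columns, rows):
    """Serialize column names and rows into the ingest request body."""
    return json.dumps({"columns": columns, "data": rows}).encode("utf-8")

def ingest(stream, columns, rows):
    """POST one batch of rows to the Proton ingest endpoint."""
    req = urllib.request.Request(
        INGEST_URL.format(stream=stream),
        data=build_payload(columns, rows),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# With a running Proton container, this inserts the same two rows
# as the curl call above:
# ingest("foo", ["id", "name"], [[1, "hello"], [2, "world"]])
```

Batching several rows into one request, as shown, is generally cheaper than one request per row.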

This method should work well for most system integrations. The column names are listed only once, no matter how many rows you insert, which keeps the request body compact.

The request body has this format:

{
  "columns": [..],
  "data": [
    [..],
    [..]
  ]
}

Note:

  • columns is an array of strings: the column names.
  • data is an array of arrays. Each nested array represents a row of data, and its values must appear in the exact same order as the columns array.

For example:

{
  "columns": ["key1", "key2"],
  "data": [
    ["value11", "value12"],
    ["value21", "value22"]
  ]
}
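If your application holds records as dictionaries, converting them to this columns/data layout is straightforward. A hypothetical helper (not part of Proton or its SDKs) that fixes the column order once and flattens each record to match:

```python
def rows_to_payload(records, columns):
    """Flatten a list of dicts into the columns/data request body.

    The columns list fixes the value order for every row, satisfying
    the requirement that each nested array follow the columns array.
    """
    return {
        "columns": columns,
        "data": [[rec[col] for col in columns] for rec in records],
    }

payload = rows_to_payload(
    [{"key1": "value11", "key2": "value12"},
     {"key1": "value21", "key2": "value22"}],
    columns=["key1", "key2"],
)
# payload == {"columns": ["key1", "key2"],
#             "data": [["value11", "value12"], ["value21", "value22"]]}
```

Deriving each row from the same columns list guards against accidentally sending values in a different order than the header declares.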

You can also use one of our SDKs to ingest data without handling the low-level details of the REST API.