Kafka External Stream
You can read data from Apache Kafka (as well as Confluent Cloud or Redpanda) in Timeplus with External Stream. Combined with Materialized Views and Target Streams, you can also use External Streams to write data to Apache Kafka.
Create External Stream
In Timeplus Proton, Kafka API is the only supported type of external stream.
In Timeplus Enterprise, it also supports External Stream for Apache Pulsar and External Stream for other Timeplus deployments.
To create an external stream for Apache Kafka or Kafka-compatible messaging platforms, you can run the following DDL SQL:
```sql
CREATE EXTERNAL STREAM [IF NOT EXISTS] stream_name
(<col_name1> <col_type>)
SETTINGS
    type='kafka',
    brokers='ip:9092',
    topic='..',
    security_protocol='..',
    username='..',
    password='..',
    sasl_mechanism='..',
    data_format='..',
    kafka_schema_registry_url='..',
    kafka_schema_registry_credentials='..',
    ssl_ca_cert_file='..',
    ssl_ca_pem='..',
    skip_ssl_cert_check=..
```
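For example, a minimal external stream that reads a topic as plain text might look like the sketch below; the stream name, broker address, and topic are placeholder values, assuming a local broker without authentication:

```sql
-- A minimal sketch: read each Kafka message as raw text (the default data_format).
-- 'localhost:9092' and 'frontend_events' are placeholder values for illustration.
CREATE EXTERNAL STREAM IF NOT EXISTS frontend_events
(raw string)
SETTINGS
    type = 'kafka',
    brokers = 'localhost:9092',
    topic = 'frontend_events';
```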
The supported values for security_protocol are:
- PLAINTEXT: this is the default value when the setting is omitted.
- SASL_SSL: when this value is set, username and password should be specified (see the sketch after this list).
  - If you need to specify your own SSL certification file, add another setting ssl_ca_cert_file='/ssl/ca.pem'.
  - New in Proton 1.5.5: if you don't want to, or cannot, use a file path, e.g. in Timeplus Cloud or in Docker/Kubernetes environments, you can put the full content of the PEM file as a string in the ssl_ca_pem setting.
  - You can also skip the SSL certification verification by setting skip_ssl_cert_check=true.
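As a sketch, a TLS-protected connection could combine these settings as follows; the broker address, topic, and credentials are placeholder values:

```sql
-- Sketch: connect over SASL_SSL with a custom CA certificate file.
-- Broker, topic, username, and password below are placeholders.
CREATE EXTERNAL STREAM secure_events
(raw string)
SETTINGS
    type = 'kafka',
    brokers = 'broker1.example.com:9093',
    topic = 'events',
    security_protocol = 'SASL_SSL',
    username = 'demo_user',
    password = 'demo_password',
    ssl_ca_cert_file = '/ssl/ca.pem';
```

If no file path is available, you could instead pass the certificate content as a string via ssl_ca_pem, or set skip_ssl_cert_check=true to bypass verification (not recommended for production).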
The supported values for sasl_mechanism are:
- PLAIN: this is the default value of sasl_mechanism when you set security_protocol to SASL_SSL.
- SCRAM-SHA-256
- SCRAM-SHA-512
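For a cluster that requires one of the SCRAM mechanisms listed above, the mechanism can be set explicitly, as in the sketch below (all connection values are placeholders for illustration):

```sql
-- Sketch: SASL_SSL with SCRAM-SHA-256 instead of the default PLAIN mechanism.
CREATE EXTERNAL STREAM scram_events
(raw string)
SETTINGS
    type = 'kafka',
    brokers = 'broker1.example.com:9093',
    topic = 'events',
    security_protocol = 'SASL_SSL',
    sasl_mechanism = 'SCRAM-SHA-256',
    username = 'demo_user',
    password = 'demo_password';
```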
The supported values for data_format are:
- JSONEachRow: parse each row of the message as a single JSON document. The top-level JSON key/value pairs will be parsed as the columns (see the sketch after this list). Learn more.
- CSV: less commonly used. Learn more.
- ProtobufSingle: for a single Protobuf message per Kafka message.
- Protobuf: there could be multiple Protobuf messages in a single Kafka message.
- Avro: added in Proton 1.5.2.
- RawBLOB: the default value. Read/write messages as plain text.
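As an illustration of JSONEachRow, the sketch below declares typed columns that map to top-level JSON keys and then queries the stream like a regular one; the topic name and column names are assumptions for the example:

```sql
-- Sketch: parse each Kafka message as a single JSON document.
-- Top-level keys 'order_id', 'amount', and 'created_at' map to the declared columns.
CREATE EXTERNAL STREAM orders_json
(
    order_id   string,
    amount     float64,
    created_at datetime64(3)
)
SETTINGS
    type = 'kafka',
    brokers = 'localhost:9092',
    topic = 'orders',
    data_format = 'JSONEachRow';

-- Query the external stream like a regular Timeplus stream.
SELECT order_id, amount FROM orders_json WHERE amount > 100;
```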
Note
For examples of connecting to various Kafka-API-compatible messaging platforms, please check this doc.