
# Configuration

Doris references

  1. http://doris.apache.org/master/zh-CN/
  2. https://cloud.baidu.com/doc/PALO/s/Ikivhcwb5
  3. http://doc.dorisdb.com/2146003 (Doris does not support Superset or Metabase; to be solved later through custom development)

Doris hardware requirements

See http://doc.dorisdb.com/2228586. BE nodes: 16 cores and 64 GB RAM or more recommended; FE nodes: 8 cores and 16 GB RAM or more. Disks can be HDD or SSD. The CPU must support the AVX2 instruction set: run cat /proc/cpuinfo | grep avx2 and confirm there is output. If AVX2 is not supported, switching machines is recommended, since DorisDB's vectorization needs this instruction set to perform well. The network requires 10 GbE NICs and a 10 GbE switch.
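
A quick way to run the AVX2 check mentioned above on a candidate machine (a minimal sketch, equivalent to the cat /proc/cpuinfo command):

```sh
# If any CPU reports the avx2 flag, the machine can use DorisDB's vectorized execution.
if grep -q avx2 /proc/cpuinfo; then
    echo "AVX2 supported"
else
    echo "AVX2 NOT supported - consider a different machine"
fi
```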

kafka connect

  1. https://www.confluent.io/hub/confluentinc/kafka-connect-elasticsearch
  2. https://www.confluent.io/hub/debezium/debezium-connector-mysql
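 
A Debezium MySQL source connector is registered through the Kafka Connect REST API. The sketch below is only an example under assumptions: the connector name mysql-source, the MySQL host/credentials, and the exact property names (which vary slightly between Debezium versions) are placeholders. With "database.server.name" set to mysql, change events for example.User land on the mysql.example.User topic used later in this lab.

```sh
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "mysql-source",
    "config": {
      "connector.class": "io.debezium.connector.mysql.MySqlConnector",
      "database.hostname": "mysql",
      "database.port": "3306",
      "database.user": "debezium",
      "database.password": "dbz",
      "database.server.id": "184054",
      "database.server.name": "mysql",
      "database.include.list": "example",
      "database.history.kafka.bootstrap.servers": "kafka:9092",
      "database.history.kafka.topic": "schema-changes.example"
    }
  }'
```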

cockroachdb cdc

  1. cockroachdb to kafka: https://www.cockroachlabs.com/docs/v21.1/stream-data-out-of-cockroachdb-using-changefeeds.html#create-a-core-changefeed
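
For reference, a minimal sketch of the two changefeed flavors described in that page; the table and broker names are assumptions drawn from this lab's setup:

```sql
-- Rangefeeds must be enabled before any changefeed can run.
SET CLUSTER SETTING kv.rangefeed.enabled = true;

-- Core changefeed: streams row changes back to the SQL client session.
EXPERIMENTAL CHANGEFEED FOR User;

-- Enterprise changefeed: streams row changes into Kafka instead.
CREATE CHANGEFEED FOR TABLE User INTO 'kafka://kafka:9092';
```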

mysql

MySQL is set up with master/replica replication; the database example has been created, along with a User table.
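
The source DDL is not included here; as an illustration only, a minimal sketch of what example.User could look like on the MySQL side (abbreviated to a few of the columns that appear in the Doris table further down):

```sql
CREATE DATABASE IF NOT EXISTS example;

-- Abbreviated sketch; the real table carries all the columns defined
-- in the Doris User table later in this document.
CREATE TABLE example.User (
    Id        char(36)     NOT NULL COMMENT 'Id',
    UserName  varchar(255) NOT NULL COMMENT 'UserName',
    Email     varchar(255) NULL DEFAULT NULL COMMENT 'Email',
    Created   datetime     NOT NULL COMMENT 'Created',
    PRIMARY KEY (Id)
);
```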

kafka

Use kafkacat to inspect messages in real time.
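
For example (broker address and topic name assumed from this lab's setup):

```sh
# Consume the CDC topic from the beginning and print each message.
kafkacat -b kafka:9092 -t mysql.example.User -C -o beginning
```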

kafka-connect

  1. List all plugins: http://localhost:8083/connector-plugins
  2. List running connectors: http://localhost:8083/connectors
  3. Inspect a connector (see the curl examples below)
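
These are plain HTTP GETs against the Kafka Connect REST API, e.g.:

```sh
# List every installed connector plugin.
curl -s http://localhost:8083/connector-plugins

# List connectors that are currently configured.
curl -s http://localhost:8083/connectors

# Inspect one connector's status (replace mysql-source with the actual connector name).
curl -s http://localhost:8083/connectors/mysql-source/status
```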

kibana

Check Elasticsearch status: http://localhost:5601

From inside the mysql container, connect to the Doris FE:

mysql -h doris-fe -P 9030 -u root
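
If the containers are managed by docker-compose, something along these lines works (the service name mysql is an assumption):

```sh
# Run the mysql client inside the mysql container, pointed at the Doris FE's MySQL-protocol port.
docker exec -it mysql mysql -h doris-fe -P 9030 -u root
```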

Set the root password:

SET PASSWORD FOR 'root' = PASSWORD('aA123456!');

Add a Doris BE

Inside the doris-be container, run cat /etc/hosts to get its current IP. Use that IP in place of the hostname (or pin the container's IP):

ALTER SYSTEM ADD BACKEND "172.172.0.41:9050";

Remove a Doris BE (DROPP, with the double P, is Doris's intended syntax for a forced drop)

ALTER SYSTEM DROPP BACKEND "172.172.0.41:9050";

Check BE status:

SHOW PROC '/backends';

Create a database:

CREATE DATABASE example;

Switch to the database:

USE example;

Create a table:

CREATE TABLE User (
    Id char(36) NOT NULL COMMENT 'Id',
    UserName varchar(255) NOT NULL COMMENT 'UserName',
    SecurityStamp varchar(255) REPLACE NULL COMMENT 'SecurityStamp',
    PasswordHash varchar(255) REPLACE NULL COMMENT 'PasswordHash',
    PasswordConfirmed tinyint(1) REPLACE NOT NULL COMMENT 'PasswordConfirmed',
    Email varchar(255) REPLACE NULL DEFAULT NULL COMMENT 'Email',
    EmailConfirmed tinyint(1) REPLACE NOT NULL COMMENT 'EmailConfirmed',
    PhoneNumber varchar(255) REPLACE NULL DEFAULT NULL COMMENT 'PhoneNumber',
    PhoneNumberConfirmed tinyint(1) REPLACE NOT NULL COMMENT 'PhoneNumberConfirmed',
    RealName varchar(255) REPLACE NULL COMMENT 'RealName',
    IdentityNumber varchar(255) REPLACE NULL COMMENT 'IdentityNumber',
    IdentityConfirmed tinyint(1) REPLACE NOT NULL COMMENT 'IdentityConfirmed',
    NickName varchar(255) REPLACE NULL COMMENT 'NickName',
    Avatar varchar(255) REPLACE NULL COMMENT 'Avatar',
    Sex int(0) REPLACE NULL DEFAULT NULL COMMENT 'Sex',
    Birthday datetime REPLACE NULL DEFAULT NULL COMMENT 'Birthday',
    LockoutEnabled tinyint(1) REPLACE NOT NULL COMMENT 'LockoutEnabled',
    AccessFailedCount int(0) REPLACE NOT NULL COMMENT 'AccessFailedCount',
    LockoutEnd datetime REPLACE NULL DEFAULT NULL COMMENT 'LockoutEnd',
    RowVersion varchar(255) REPLACE NULL COMMENT 'RowVersion',
    Created datetime REPLACE NOT NULL COMMENT 'Created',
    Modified datetime REPLACE NULL DEFAULT NULL COMMENT 'Modified',
    Deleted datetime REPLACE NULL DEFAULT NULL COMMENT 'Deleted'
)
AGGREGATE KEY(Id, UserName)
DISTRIBUTED BY HASH(Id) BUCKETS 10
PROPERTIES("replication_num" = "1");
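
Because the table uses the aggregate model with REPLACE value columns, rows sharing the same AGGREGATE KEY(Id, UserName) collapse into one row whose value columns keep the most recently loaded values. A small illustrative sketch with hypothetical data:

```sql
-- Two loads of the same key: the second Email replaces the first.
INSERT INTO User (Id, UserName, Email, PasswordConfirmed, EmailConfirmed,
                  PhoneNumberConfirmed, IdentityConfirmed, LockoutEnabled,
                  AccessFailedCount, Created)
VALUES ('0001', 'alice', 'old@example.com', 0, 0, 0, 0, 0, 0, '2021-01-01 00:00:00');

INSERT INTO User (Id, UserName, Email, PasswordConfirmed, EmailConfirmed,
                  PhoneNumberConfirmed, IdentityConfirmed, LockoutEnabled,
                  AccessFailedCount, Created)
VALUES ('0001', 'alice', 'new@example.com', 0, 0, 0, 0, 0, 0, '2021-01-02 00:00:00');

-- Returns a single row with Email = 'new@example.com'.
SELECT Id, UserName, Email FROM User WHERE Id = '0001';
```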

Import from Kafka into Doris (unfinished)

View import jobs:

SHOW ALL ROUTINE LOAD;

Create an import job:

CREATE ROUTINE LOAD example.job1 ON User
PROPERTIES (
    "format" = "json",
    "json_root" = "$.after",
    "desired_concurrent_number" = "1",
    "max_error_number" = "1000",
    "timezone" = "Asia/Shanghai"
)
FROM KAFKA (
    "kafka_broker_list" = "kafka:9092",
    "kafka_topic" = "mysql.example.User"
);