1. Define the Model

### --- Define the Model

~~~ 1. Model Designer
~~~ 2. Data Model: the fact table already carries all dimension attributes, so no lookup tables are involved; just select the data source
~~~ 3. Dimensions
~~~ 4. Measures
~~~ 5. Settings ——> Save (the saved model can be verified with the REST call sketched below)
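~~~ # A quick sanity check after saving: list the project's models over the REST API.
~~~ # This is a sketch; it assumes the ADMIN:KYLIN account, the hadoop02:7070 endpoint,
~~~ # and the yanqi_sales_olap project that appear later in this note.
[root@hadoop02 ~]# curl -X GET --user ADMIN:KYLIN \
"http://hadoop02:7070/kylin/api/models?projectName=yanqi_sales_olap"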

2. Define the Cube

### --- Define the Cube

~~~ # 1. Define a Cube: Cube Info
~~~ # 2. Dimensions
~~~ # 3. Measures
~~~ # 4. Refresh Setting
~~~ # 5. Advanced Setting
~~~ Set the derived time dimensions as a Hierarchy to cut out unnecessary cuboid combinations; in the Rowkeys part,
~~~ place the columns users filter on most often, and the most selective ones, at the start of the rowkey
~~~ # 6. Configuration Overwrites: keep the defaults
~~~ # 7. Overview: keep the defaults ——> Save (the saved definition can be inspected as sketched below)
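~~~ # Optional: pull back the saved cube definition to confirm the rowkey order and
~~~ # hierarchy settings took effect. A sketch against the cube-descriptor endpoint of
~~~ # Kylin 3.x, using the streamingcube1 name from the build steps below.
[root@hadoop02 ~]# curl -X GET --user ADMIN:KYLIN \
"http://hadoop02:7070/kylin/api/cube_desc/streamingcube1"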

3. Streaming Cube Considerations


### --- A Streaming Cube is largely the same as an ordinary cube. Note the following points:

~~~ The partition time column should be one of the Cube's dimensions.
~~~ In streaming OLAP, time is always a query condition, and Kylin uses it to narrow the range of partitions to scan.
~~~ Do not use order_time as a dimension, because it is far too fine-grained;
~~~ use minute_start, hour_start, or similar, depending on how users will query the data.
~~~ Define year_start, quarter_start, month_start, day_start, hour_start, minute_start as a hierarchy to reduce combination calculations.
~~~ In the Refresh Setting, create several merge ranges,
~~~ such as 0.5 hour, 4 hours, 1 day, 7 days; this helps keep the number of cube segments under control.
~~~ In the Rowkeys section, drag minute_start to the top position; streaming queries always carry a time condition,
~~~ and putting it first narrows the scan range (see the query sketch after this list).
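~~~ # To see the effect of the time column, issue a time-bounded query through the query
~~~ # API: the MINUTE_START filter is what lets Kylin narrow the scan range. A sketch
~~~ # using this note's streamingds1 table and project; the timestamp is a placeholder.
[root@hadoop02 ~]# curl -X POST --user ADMIN:KYLIN \
-H "Content-Type: application/json;charset=utf-8" \
-d '{ "sql": "select minute_start, count(*) from streamingds1 where minute_start >= TIMESTAMP '\''2021-11-20 22:00:00'\'' group by minute_start", "project": "yanqi_sales_olap", "limit": 50 }' \
http://hadoop02:7070/kylin/api/query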

4. Build the Cube: Manual Build

### --- Build the Cube: manual build
### --- Check whether the cube has been built

~~~ # Trigger a build from the command line: run the build on the host where Kylin is installed
[root@hadoop02 ~]# curl -X PUT --user ADMIN:KYLIN \
-H "Content-Type: application/json;charset=utf-8" \
-d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' \
http://hadoop02:7070/kylin/api/cubes/streamingcube1/build2
~~~ Output: the response is a JSON document
{
"uuid": "8eaf6513-572c-19d9-c6e8-e8a963af2753",
"last_modified": 1637418659513,
"version": "3.0.0.20500",
"name": "BUILD CUBE - streamingcube1 - 0_28 - CST 2021-11-20 22:30:59",
"projectName": "yanqi_sales_olap",
"type": "BUILD",
"duration": 0,
"related_cube": "streamingcube1",
"display_cube_name": "streamingcube1",
"related_segment": "46e4710d-3ba0-14d0-ef28-80f8892162d2",
"related_segment_name": "0_28",
"exec_start_time": 0,
"exec_end_time": 0,
"exec_interrupt_time": 0,
"mr_waiting": 0,
"steps": [{
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-00",
"name": "Save data from Kafka",
"sequence_id": 0,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/kylin_intermediate_streamingcube1_46e4710d_3ba0_14d0_ef28_80f8892162d2 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -jobname Kylin_Save_Kafka_Data_streamingcube1_Step",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-01",
"name": "Extract Fact Table Distinct Columns",
"sequence_id": 1,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/fact_distinct_columns -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -statisticsoutput hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/fact_distinct_columns/statistics -statisticssamplingpercent 100 -jobname Kylin_Fact_Distinct_Columns_streamingcube1_Step -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-02",
"name": "Build Dimension Dictionary",
"sequence_id": 2,
"exec_cmd": " -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/fact_distinct_columns -dictPath hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/dict -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-03",
"name": "Save Cuboid Statistics",
"sequence_id": 3,
"exec_cmd": null,
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-04",
"name": "Create HTable",
"sequence_id": 4,
"exec_cmd": " -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -partitions hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/rowkey_stats/part-r-00000 -cuboidMode CURRENT -hbaseConfPath hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/hbase-conf.xml",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-05",
"name": "Build Base Cuboid",
"sequence_id": 5,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input FLAT_TABLE -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_base_cuboid -jobname Kylin_Base_Cuboid_Builder_streamingcube1 -level 0 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-06",
"name": "Build N-Dimension Cuboid : level 1",
"sequence_id": 6,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_base_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_1_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 1 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-07",
"name": "Build N-Dimension Cuboid : level 2",
"sequence_id": 7,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_1_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_2_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 2 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-08",
"name": "Build N-Dimension Cuboid : level 3",
"sequence_id": 8,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_2_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_3_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 3 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-09",
"name": "Build N-Dimension Cuboid : level 4",
"sequence_id": 9,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_3_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_4_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 4 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-10",
"name": "Build N-Dimension Cuboid : level 5",
"sequence_id": 10,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_4_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_5_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 5 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-11",
"name": "Build N-Dimension Cuboid : level 6",
"sequence_id": 11,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_5_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_6_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 6 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-12",
"name": "Build N-Dimension Cuboid : level 7",
"sequence_id": 12,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_6_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_7_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 7 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-13",
"name": "Build N-Dimension Cuboid : level 8",
"sequence_id": 13,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_7_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_8_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 8 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-14",
"name": "Build N-Dimension Cuboid : level 9",
"sequence_id": 14,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_8_cuboid -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/level_9_cuboid -jobname Kylin_ND-Cuboid_Builder_streamingcube1_Step -level 9 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-15",
"name": "Build Cube In-Mem",
"sequence_id": 15,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf_inmem.xml -cubename streamingcube1 -segmentid 46e4710d-3ba0-14d0-ef28-80f8892162d2 -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/ -jobname Kylin_Cube_Builder_streamingcube1 -cubingJobId 8eaf6513-572c-19d9-c6e8-e8a963af2753",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-16",
"name": "Convert Cuboid Data to HFile",
"sequence_id": 16,
"exec_cmd": " -conf /opt/yanqi/servers/kylin-3.1.1/conf/kylin_job_conf.xml -cubename streamingcube1 -partitions hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/rowkey_stats/part-r-00000_hfile -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/cuboid/* -output hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/hfile -htablename KYLIN_YTRZDXUV6N -jobname Kylin_HFile_Generator_streamingcube1_Step",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-17",
"name": "Load HFile to HBase Table",
"sequence_id": 17,
"exec_cmd": " -input hdfs://hadoop01:9000/kylin/kylin_metadata/kylin-8eaf6513-572c-19d9-c6e8-e8a963af2753/streamingcube1/hfile -htablename KYLIN_YTRZDXUV6N -cubename streamingcube1",
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-18",
"name": "Update Cube Info",
"sequence_id": 18,
"exec_cmd": null,
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-19",
"name": "Hive Cleanup",
"sequence_id": 19,
"exec_cmd": null,
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}, {
"interruptCmd": null,
"id": "8eaf6513-572c-19d9-c6e8-e8a963af2753-20",
"name": "Garbage Collection on HDFS",
"sequence_id": 20,
"exec_cmd": null,
"interrupt_cmd": null,
"exec_start_time": 0,
"exec_end_time": 0,
"exec_wait_time": 0,
"step_status": "PENDING",
"cmd_type": "SHELL_CMD_HADOOP",
"info": {},
"run_async": false
}],
"submitter": "ADMIN",
"job_status": "PENDING",
"build_instance": "unknown",
"progress": 0.0
}
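~~~ # The uuid in the response identifies the job; it can be polled while the build runs.
~~~ # A sketch: GET /kylin/api/jobs/{jobId} returns the same job document with updated
~~~ # step_status and progress fields.
[root@hadoop02 ~]# curl -X GET --user ADMIN:KYLIN \
http://hadoop02:7070/kylin/api/jobs/8eaf6513-572c-19d9-c6e8-e8a963af2753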

### --- Run a SQL query

~~~ # Run a SQL query: show the results aggregated per minute
select minute_start, count(*), sum(amount), sum(qty)
from streamingds1
group by minute_start
order by minute_start

5. Build the Cube: Scheduled Automatic Builds

### --- Automatic builds

~~~ Once the first Cube has been built and queried successfully, incremental builds can be scheduled at a fixed frequency.
~~~ Kylin records the offsets of every build;
~~~ when it receives a build request, it starts from where the last build ended and fetches the latest offsets from Kafka.
~~~ With the REST API, any scheduler such as Linux cron can trigger build jobs.

### --- Add the following entry via crontab -e

[root@hadoop02 ~]# crontab -e
~~~ Add the following entry: build every 20 minutes
*/20 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://hadoop02:7070/kylin/api/cubes/streamingcube1/build2
~~~ Output
{
"code": "999",
"data": null, //数据是空,因为没有新的数据增加;它不会再构建,只有等待有新的数据进来,才会触发构建
"msg": "The cube streamingcube1 has segments [streamingcube1[0_28]], but none of them is READY. It's not allowed for parallel building",
"stacktrace": "org.apache.kylin.rest.exception.InternalErrorException: The cube streamingcube1 has segments [streamingcube1[0_28]], but none of them is READY. It's not allowed for parallel building\n\tat org.apache.kylin.rest.controller.CubeController.buildInternal(CubeController.java:435)\n\tat org.apache.kylin.rest.controller.CubeController.rebuild2(CubeController.java:418)\n\tat org.apache.kylin.rest.controller.CubeController.build2(CubeController.java:409)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:498)\n\tat org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)\n\tat org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)\n\tat org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)\n\tat org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)\n\tat org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)\n\tat org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)\n\tat org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)\n\tat org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)\n\tat org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)\n\tat org.springframework.web.servlet.FrameworkServlet.doPut(FrameworkServlet.java:883)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:650)\n\tat org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:728)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)\n\tat org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)\n\tat org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)\n\tat org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)\n\tat 
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:66)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)\n\tat org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)\n\tat org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)\n\tat org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)\n\tat org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)\n\tat org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)\n\tat org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)\n\tat com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)\n\tat com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)\n\tat org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)\n\tat org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)\n\tat 
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)\n\tat org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)\n\tat org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)\n\tat org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)\n\tat org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)\n\tat org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)\n\tat org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)\n\tat org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)\n\tat org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)\n\tat org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)\n\tat org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: org.apache.kylin.rest.exception.BadRequestException: The cube streamingcube1 has segments [streamingcube1[0_28]], but none of them is READY. It's not allowed for parallel building\n\tat org.apache.kylin.rest.service.JobService.checkAllowParallelBuilding(JobService.java:443)\n\tat org.apache.kylin.rest.service.JobService.submitJobInternal(JobService.java:242)\n\tat org.apache.kylin.rest.service.JobService.submitJob(JobService.java:223)\n\tat org.apache.kylin.rest.controller.CubeController.buildInternal(CubeController.java:431)\n\t... 78 more\n",
"exception": "The cube streamingcube1 has segments [streamingcube1[0_28]], but none of them is READY. It's not allowed for parallel building",
"url": "http://hadoop02:7070/kylin/api/cubes/streamingcube1/build2"
}
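~~~ # Because Kylin refuses a parallel build while the previous segment is not READY, the
~~~ # cron entry can point at a small wrapper that logs rejected runs and simply retries on
~~~ # the next cycle. A sketch; the script and log paths are hypothetical.
[root@hadoop02 ~]# cat /root/kylin_build.sh
#!/bin/bash
# Trigger an incremental streaming build; Kylin resumes from the last committed offset.
resp=$(curl -s -X PUT --user ADMIN:KYLIN \
  -H "Content-Type: application/json;charset=utf-8" \
  -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' \
  http://hadoop02:7070/kylin/api/cubes/streamingcube1/build2)
# A rejection (e.g. "not allowed for parallel building") is just logged; cron retries later.
echo "$(date '+%F %T') ${resp}" >> /var/log/kylin_streaming_build.log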
