Problem summary:
Using the ClickHouse reader together with the Hive writer, I can submit the Flink job successfully, but a few seconds after it starts running in Flink the task is marked failed.
The Flink error log shows the following:
```
Caused by: org.apache.thrift.TException: Unsupported Hive2 protocol
    at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:595)
    ... 32 more
    at com.dtstack.chunjun.connector.hive.util.HiveDbUtil.connect(HiveDbUtil.java:222)
    at com.dtstack.chunjun.connector.hive.util.HiveDbUtil.connect(HiveDbUtil.java:203)
    at com.dtstack.chunjun.connector.hive.util.HiveDbUtil$1.call(HiveDbUtil.java:96)
    at com.dtstack.chunjun.connector.hive.util.HiveDbUtil$1.call(HiveDbUtil.java:93)
    at com.dtstack.chunjun.util.RetryUtil$Retry.call(RetryUtil.java:141)
    at com.dtstack.chunjun.util.RetryUtil$Retry.doRetry(RetryUtil.java:80)
    ... 21 more
    at com.dtstack.chunjun.connector.hive.util.HiveDbUtil.getConnectionWithRetry(HiveDbUtil.java:104)
    at com.dtstack.chunjun.connector.hive.util.HiveDbUtil.getConnection(HiveDbUtil.java:86)
    at com.dtstack.chunjun.connector.hive.util.HiveUtil.createHiveTableWithTableInfo(HiveUtil.java:90)
    at com.dtstack.chunjun.connector.hive.sink.HiveOutputFormat.checkCreateTable(HiveOutputFormat.java:386)
    at com.dtstack.chunjun.connector.hive.sink.HiveOutputFormat.primaryCreateTable(HiveOutputFormat.java:357)
    at com.dtstack.chunjun.connector.hive.sink.HiveOutputFormat.openInternal(HiveOutputFormat.java:114)
    at com.dtstack.chunjun.sink.format.BaseRichOutputFormat.open(BaseRichOutputFormat.java:262)
    at com.dtstack.chunjun.connector.hive.sink.HiveOutputFormat.open(HiveOutputFormat.java:94)
    at com.dtstack.chunjun.sink.DtOutputFormatSinkFunction.open(DtOutputFormatSinkFunction.java:95)
    at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:34)
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
    at org.apache.flink.streaming.api.operators.StreamSink.open(StreamSink.java:46)
    at org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:433)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$beforeInvoke$2(StreamTask.java:545)
    at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.runThrowing(StreamTaskActionExecutor.java:93)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.beforeInvoke(StreamTask.java:535)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:575)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
    at java.lang.Thread.run(Thread.java:745)
```
Environment:
OS: Ubuntu 16.04.7
Hive version: 2.3.9
Flink version: 1.12.7
ChunJun version: 1.12.7 (latest)
The JSON job file I wrote for testing is as follows:

```json
{
  "job": {
    "content": [
      {
        "reader": {
          "name": "clickhousereader",
          "parameter": {
            "column": ["packet_name", "node_id", "gateway_id"],
            "connection": [
              {
                "jdbcUrl": ["jdbc:clickhouse://xxxx:8123/default"],
                "schema": "bp001",
                "table": ["HOS_4000"]
              }
            ],
            "username": "xxx",
            "password": "xxxxxx"
          }
        },
        "writer": {
          "name": "hivewriter",
          "parameter": {
            "jdbcUrl": "jdbc:hive2://192.168.111.111:10000/chunjun",
            "username": "",
            "password": "",
            "fileType": "text",
            "writeMode": "overwrite",
            "charsetName": "UTF-8",
            "tablesColumn": "{\"chunjun.bp001\":[{\"type\":\"STRING\",\"key\":\"packet_name\"},{\"type\":\"STRING\",\"key\":\"node_id\"},{\"type\":\"STRING\",\"key\":\"gateway_id\"}]}",
            "partition": "pt",
            "partitionType": "MINUTE",
            "defaultFS": "hdfs://ns",
            "hadoopConfig": {
              "dfs.ha.namenodes.ns": "nn1,nn2",
              "fs.defaultFS": "hdfs://ns",
              "dfs.namenode.rpc-address.ns.nn2": "ip:9000",
              "dfs.client.failover.proxy.provider.ns": "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
              "dfs.namenode.rpc-address.ns.nn1": "ip:9000",
              "dfs.nameservices": "ns",
              "fs.hdfs.impl.disable.cache": "true",
              "hadoop.user.name": "root",
              "fs.hdfs.impl": "org.apache.hadoop.hdfs.DistributedFileSystem"
            }
          }
        }
      }
    ],
    "setting": {
      "speed": {
        "channel": 1
      }
    }
  }
}
```
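One easy-to-miss detail in the config above: `tablesColumn` is itself a JSON document embedded as a string inside the outer JSON, so its inner quotes must stay escaped. A quick standalone sanity check of that value (a Python sketch, not part of ChunJun; the variable names are mine):

```python
import json

# tablesColumn maps "database.table" to its column list; here the value
# from the job file is parsed on its own after unescaping the quotes.
tables_column = (
    '{"chunjun.bp001":[{"type":"STRING","key":"packet_name"},'
    '{"type":"STRING","key":"node_id"},'
    '{"type":"STRING","key":"gateway_id"}]}'
)

tables = json.loads(tables_column)
columns = tables["chunjun.bp001"]
print([c["key"] for c in columns])  # ['packet_name', 'node_id', 'gateway_id']
```

If `json.loads` raises here, the escaping inside the job file is broken before ChunJun ever sees it.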
Additional notes:
1. The Flink SQL client can connect to and operate on Hive successfully, so the Flink/Hive integration itself is fine.
2. Hive's beeline client works normally, and other clients can also connect to Hive successfully using the Hive jdbcUrl from the JSON config above.