Spark 2.1.0 Kafka connection fails

知行合一1 · posted on 2017/07/03 11:27
Views: 84
Favorites: 0

Exception in thread "main" java.lang.IllegalArgumentException: 'path' is not specified
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$9.apply(DataSource.scala:205)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$9.apply(DataSource.scala:205)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at org.apache.spark.sql.catalyst.util.CaseInsensitiveMap.getOrElse(CaseInsensitiveMap.scala:23)
    at org.apache.spark.sql.execution.datasources.DataSource.sourceSchema(DataSource.scala:204)
    at org.apache.spark.sql.execution.datasources.DataSource.sourceInfo$lzycompute(DataSource.scala:87)
    at org.apache.spark.sql.execution.datasources.DataSource.sourceInfo(DataSource.scala:87)
    at org.apache.spark.sql.execution.streaming.StreamingRelation$.apply(StreamingRelation.scala:30)
    at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:124)
    at org.apache.spark.examp.JavaKafkaWordCountDataRow.main(JavaKafkaWordCountDataRow.java:63)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/07/03 11:24:03 INFO spark.SparkContext: Invoking stop() from shutdown hook
17/07/03 11:24:03 INFO server.ServerConnector: Stopped ServerConnector@1eef9aef{HTTP/1.1}{0.0.0.0:4040}
17/07/03 11:24:03 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@37eeec90{/stages/stage/kill,null,UNAVAILABLE}
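
The top frames point at DataSource.sourceSchema looking up a 'path' option. In Spark 2.1.0, a readStream() call that never specifies format(...) falls back to the default file-based source (parquet, unless spark.sql.sources.default was changed), and every file source insists on a path, so any Kafka options are simply ignored. A minimal sketch of a call that reproduces the same exception, assuming an existing SparkSession named spark (the topic name is only illustrative):

    // No .format("kafka") is set, so Spark resolves the default file source
    // and fails immediately because the 'path' option is missing.
    Dataset<Row> broken = spark
            .readStream()
            .option("subscribe", "some-topic")   // ignored by the file source
            .load();                             // IllegalArgumentException: 'path' is not specified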
 

知行合一1:

    Dataset<Row> ds1 = spark
            .readStream()
            .option("kafka.bootstrap.servers", "192.168.28.101:2181,192.168.28.102:2181,192.168.28.103:2181")
            .option("subscribe", "3719e24b66abea87")
            .load();
    ds1.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

    System.out.println("---ds1--" + ds1);
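
Adding format("kafka") selects the Structured Streaming Kafka source and removes the 'path' error. Below is a minimal sketch, not the original program: the broker addresses on port 9092 are an assumption (the Kafka source connects to Kafka brokers, not to ZooKeeper on 2181), and it assumes the spark-sql-kafka-0-10_2.11:2.1.0 connector is on the classpath (for example via spark-submit --packages). The console query at the end is only there to verify that records arrive.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class JavaKafkaWordCountDataRow {
        public static void main(String[] args) throws Exception {
            SparkSession spark = SparkSession.builder()
                    .appName("JavaKafkaWordCountDataRow")
                    .getOrCreate();

            // Selecting the Kafka source explicitly is what the failing run was missing.
            Dataset<Row> ds1 = spark
                    .readStream()
                    .format("kafka")
                    // Assumed broker list: host:9092, not the ZooKeeper quorum on 2181.
                    .option("kafka.bootstrap.servers",
                            "192.168.28.101:9092,192.168.28.102:9092,192.168.28.103:9092")
                    .option("subscribe", "3719e24b66abea87")
                    .load();

            // selectExpr returns a new Dataset; capture it instead of discarding the result.
            Dataset<Row> kv = ds1.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

            // Console sink, just to confirm that records are flowing.
            kv.writeStream()
              .format("console")
              .start()
              .awaitTermination();
        }
    }

If the connector jar were missing, the failure mode would be different (an error about failing to find the "kafka" data source rather than the 'path' one), which is another sign that the exception above comes from the missing format(...) call and not from the Kafka cluster itself.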
