Spark 2.0 + Python 2.7.5 fails, how do I fix this?

js2java, posted 2017/03/13 10:56

 python.PythonRunner: Times: total = 578, boot = 325, init = 252, finish = 1
17/03/12 22:53:33 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 1793 bytes result sent to driver
17/03/12 22:53:33 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 776 ms on localhost (1/1)
17/03/12 22:53:33 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
17/03/12 22:53:33 INFO scheduler.DAGScheduler: ResultStage 0 (runJob at PythonRDD.scala:441) finished in 0.809 s
17/03/12 22:53:33 INFO scheduler.DAGScheduler: Job 0 finished: runJob at PythonRDD.scala:441, took 0.956975 s
17/03/12 22:53:51 ERROR python.PythonRDD: Error while sending iterator
java.net.SocketTimeoutException: Accept timed out
    at java.net.PlainSocketImpl.socketAccept(Native Method)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
    at java.net.ServerSocket.implAccept(ServerSocket.java:545)
    at java.net.ServerSocket.accept(ServerSocket.java:513)
    at org.apache.spark.api.python.PythonRDD$$anon$2.run(PythonRDD.scala:697)
Traceback (most recent call last):
  File "/home/test/spark/bin/examples/src/main/python/ml/als_example.py", line 43, in <module>
    ratings = spark.createDataFrame(ratingsRDD)
  File "/home/test/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 520, in createDataFrame
  File "/home/test/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 360, in _createFromRDD
  File "/home/test/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 331, in _inferSchema
  File "/home/test/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 1328, in first
  File "/home/test/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 1310, in take
  File "/home/test/spark/python/lib/pyspark.zip/pyspark/context.py", line 942, in runJob
  File "/home/test/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 139, in _load_from_socket
Exception: could not open socket
17/03/12 22:54:03 INFO spark.SparkContext: Invoking stop() from shutdown hook
17/03/12 22:54:03 INFO server.ServerConnector: Stopped ServerConnector@7fe7f474{HTTP/1.1}{0.0.0.0:4040}
17/03/12 22:54:03 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@11d003a1{/stages/stage/kill,null,UNAVAILABLE}
17/03/12 22:54:03 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@23553b8d{/api,null,UNAVAILABLE}
17/03/12 22:54:03 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2cde81ca{/,null,UNAVAILABLE}
17/03/12 22:54:03 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2606ec23{/static,null,UNAVAILABLE}
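For context: line 43 of als_example.py in the traceback is the createDataFrame call. With no explicit schema, PySpark infers one by calling first()/take(1) on the RDD (visible in the stack: _inferSchema -> first -> take -> runJob -> _load_from_socket). That step ships a sample of the data back to the Python driver through a local server socket opened by the JVM, and the "Accept timed out" / "could not open socket" pair means the Python side never connected to it. A minimal sketch of the relevant part of the stock Spark 2.0 example, for reference (data path and Row field names are assumed from that example and may differ in your copy):

    from pyspark.sql import SparkSession, Row

    spark = SparkSession.builder.appName("ALSExample").getOrCreate()

    # Parse the MovieLens sample file into Row objects.
    lines = spark.read.text("data/mllib/als/sample_movielens_ratings.txt").rdd
    parts = lines.map(lambda row: row.value.split("::"))
    ratingsRDD = parts.map(lambda p: Row(userId=int(p[0]), movieId=int(p[1]),
                                         rating=float(p[2]), timestamp=int(p[3])))

    # The failing line (43 in the traceback): schema inference here triggers
    # take(1), which opens the loopback socket between the JVM and the driver.
    ratings = spark.createDataFrame(ratingsRDD)

Passing an explicit StructType schema to createDataFrame would skip this particular inference round trip, but the same loopback socket is used whenever results are pulled back to the Python driver, so if something on the machine (a firewall rule, unusual localhost resolution) is blocking that connection, the timeout would likely just reappear later.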
 
