This repository has been archived by the owner on May 27, 2020. It is now read-only.

[Bug] When using Spark 2.x with Hadoop 2.7.x, an error occurs as follows #166

Open
DeeeFOX opened this issue Oct 8, 2016 · 0 comments

DeeeFOX commented Oct 8, 2016

  • When submitting a Python Spark job using the command:
    spark-submit --master spark://192.168.10.67:7077 --deploy-mode client --packages com.stratio.datasource:spark-mongodb_2.11:0.12.0 sqls/spark_mgdb_sql/spsql_v2.0.py
  • The following error occurs:

[ERROR] [10/08/2016 16:41:35.993] [mongodbClientFactory-akka.actor.default-dispatcher-3] [akka://mongodbClientFactory/user/mongoConnectionActor] host and port should be specified in host:port format
com.mongodb.MongoException: host and port should be specified in host:port format
at com.mongodb.ServerAddress.<init>(ServerAddress.java:95)
at com.mongodb.ServerAddress.<init>(ServerAddress.java:53)
at com.stratio.datasource.mongodb.client.MongodbClientActor.com$stratio$datasource$mongodb$client$MongodbClientActor$$doGetClient(MongodbClientActor.scala:108)
at com.stratio.datasource.mongodb.client.MongodbClientActor$$anonfun$receive$1.applyOrElse(MongodbClientActor.scala:45)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at com.stratio.datasource.mongodb.client.MongodbClientActor.aroundReceive(MongodbClientActor.scala:34)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

  • What I provided:
    The Mongo host I pass is already in host:port format, for example:
    192.168.0.1:30000
  • It works fine when using Spark 1.6.2 with Hadoop 2.7.x and the following command:
    spark-submit --master spark://192.168.10.67:7077 --deploy-mode client --packages com.stratio.datasource:spark-mongodb_2.10:0.11.2 sqls/spark_mgdb_sql/spsql_v1.6.py
  • Supplement
    The 1.6 and 2.0 scripts differ only in the entry point: one uses SparkContext/SQLContext and the other uses SparkSession, as the official docs suggest (see the sketch after this list).
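For reference, here is a minimal sketch of what the 2.0 script might look like. The real sqls/spark_mgdb_sql/spsql_v2.0.py is not included in this issue, so the app, database, and collection names below are placeholders; only the host value (192.168.0.1:30000) and the com.stratio.datasource.mongodb data source come from the report above.

```python
# Hypothetical reconstruction of the relevant part of spsql_v2.0.py;
# only the host value and the datasource format come from this issue,
# every other name is a placeholder.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("spark_mgdb_sql")  # assumed app name
    .getOrCreate()
)

# The host is passed in host:port format, exactly what the
# "host and port should be specified in host:port format" error asks for.
df = (
    spark.read.format("com.stratio.datasource.mongodb")
    .options(
        host="192.168.0.1:30000",  # same value that works on Spark 1.6.2
        database="testdb",         # assumed database name
        collection="testcol",      # assumed collection name
    )
    .load()
)

df.show()
spark.stop()

# The Spark 1.6.2 variant (spsql_v1.6.py) would differ only in the entry point,
# e.g. using SparkContext/SQLContext instead of SparkSession:
#
#   from pyspark import SparkContext
#   from pyspark.sql import SQLContext
#   sc = SparkContext(appName="spark_mgdb_sql")
#   sqlContext = SQLContext(sc)
#   df = (sqlContext.read.format("com.stratio.datasource.mongodb")
#         .options(host="192.168.0.1:30000",
#                  database="testdb", collection="testcol")
#         .load())
```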