When submitting a Python Spark job with the following command:

spark-submit --master spark://192.168.10.67:7077 --deploy-mode client --packages com.stratio.datasource:spark-mongodb_2.11:0.12.0 sqls/spark_mgdb_sql/spsql_v2.0.py

the following error occurs:
[ERROR] [10/08/2016 16:41:35.993] [mongodbClientFactory-akka.actor.default-dispatcher-3] [akka://mongodbClientFactory/user/mongoConnectionActor] host and port should be specified in host:port format
com.mongodb.MongoException: host and port should be specified in host:port format
at com.mongodb.ServerAddress.<init>(ServerAddress.java:95)
at com.mongodb.ServerAddress.<init>(ServerAddress.java:53)
at com.stratio.datasource.mongodb.client.MongodbClientActor.com$stratio$datasource$mongodb$client$MongodbClientActor$$doGetClient(MongodbClientActor.scala:108)
at com.stratio.datasource.mongodb.client.MongodbClientActor$$anonfun$receive$1.applyOrElse(MongodbClientActor.scala:45)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at com.stratio.datasource.mongodb.client.MongodbClientActor.aroundReceive(MongodbClientActor.scala:34)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
What I have given
The Mongo host URL I pass is already in host:port format, e.g.:
192.168.0.1:30000
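For reference, the read in spsql_v2.0.py looks roughly like this (a minimal sketch, not the actual script; the database and collection names are placeholders):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spsql_v2.0").getOrCreate()

# Read through the Stratio spark-mongodb data source; the host option is
# already given in host:port form ("mydb"/"mycoll" are placeholders).
df = (spark.read
      .format("com.stratio.datasource.mongodb")
      .option("host", "192.168.0.1:30000")
      .option("database", "mydb")
      .option("collection", "mycoll")
      .load())

df.createOrReplaceTempView("mycoll")
spark.sql("SELECT * FROM mycoll").show()
```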
It works fine when using Spark 1.6.2 with Hadoop 2.7.x and the following command:

spark-submit --master spark://192.168.10.67:7077 --deploy-mode client --packages com.stratio.datasource:spark-mongodb_2.10:0.11.2 sqls/spark_mgdb_sql/spsql_v1.6.py
Supplement
The 1.6 and 2.0 scripts differ only in how they drive the job: the 1.6 version goes through SparkContext/SQLContext while the 2.0 version uses SparkSession, as suggested by the official documentation.
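For comparison, the 1.6 version does essentially the same thing through SQLContext (again a rough sketch with placeholder database/collection names):

```python
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="spsql_v1.6")
sqlContext = SQLContext(sc)

# Same data source and the same host:port value, driven by SQLContext
# instead of SparkSession ("mydb"/"mycoll" are placeholders).
df = (sqlContext.read
      .format("com.stratio.datasource.mongodb")
      .option("host", "192.168.0.1:30000")
      .option("database", "mydb")
      .option("collection", "mycoll")
      .load())

df.registerTempTable("mycoll")
sqlContext.sql("SELECT * FROM mycoll").show()
```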