gurpreetgosal changed the title from "Access Mongodb with python API with credentials." to "Access Mongodb using python API with credentials." on Mar 8, 2017.
I want to read a collection from a MongoDB instance that is secured with authentication. I am using the following query:
df_raw_eventdata = sqlContext.read.format("com.stratio.datasource.mongodb") \
    .options(host='server_ip:port', database='db1', collection='collec_1',
             credentials='username,db1,password') \
    .load()
But I am getting the following error:
py4j.protocol.Py4JJavaError: An error occurred while calling o33.load.
: com.mongodb.CommandFailureException: { "serverUsed" : "myserver:27017" , "ok" : 0.0 , "errmsg" : "Authentication failed." , "code" : 18 , "codeName" : "AuthenticationFailed"}
    at com.mongodb.CommandResult.getException(CommandResult.java:76)
    at com.mongodb.CommandResult.throwOnError(CommandResult.java:140)
    at com.mongodb.DBPort$SaslAuthenticator.authenticate(DBPort.java:899)
    at com.mongodb.DBPort.authenticate(DBPort.java:432)
    at com.mongodb.DBPort.checkAuth(DBPort.java:443)
    at com.mongodb.DBTCPConnector.innerCall(DBTCPConnector.java:289)
    at com.mongodb.DBTCPConnector.call(DBTCPConnector.java:269)
    at com.mongodb.DBCollectionImpl.find(DBCollectionImpl.java:84)
    at com.mongodb.DB.command(DB.java:320)
    at com.mongodb.DB.command(DB.java:299)
    at com.mongodb.DBCollection.getStats(DBCollection.java:1808)
    at com.mongodb.casbah.MongoCollectionBase$class.getStats(MongoCollection.scala:608)
    at com.mongodb.casbah.MongoCollection.getStats(MongoCollection.scala:1162)
    at com.mongodb.casbah.MongoCollectionBase$class.stats(MongoCollection.scala:606)
    at com.mongodb.casbah.MongoCollection.stats(MongoCollection.scala:1162)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner.isShardedCollection(MongodbPartitioner.scala:79)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner$$anonfun$computePartitions$1.apply(MongodbPartitioner.scala:67)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner$$anonfun$computePartitions$1.apply(MongodbPartitioner.scala:66)
    at com.stratio.datasource.mongodb.util.usingMongoClient$.apply(usingMongoClient.scala:27)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner.computePartitions(MongodbPartitioner.scala:66)
    at com.stratio.datasource.mongodb.rdd.MongodbRDD.getPartitions(MongodbRDD.scala:42)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
    at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:65)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:328)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:328)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:327)
    at com.stratio.datasource.mongodb.schema.MongodbSchema.schema(MongodbSchema.scala:46)
    at com.stratio.datasource.mongodb.MongodbRelation.com$stratio$datasource$mongodb$MongodbRelation$$lazySchema$lzycompute(MongodbRelation.scala:63)
    at com.stratio.datasource.mongodb.MongodbRelation.com$stratio$datasource$mongodb$MongodbRelation$$lazySchema(MongodbRelation.scala:60)
    at com.stratio.datasource.mongodb.MongodbRelation$$anonfun$1.apply(MongodbRelation.scala:65)
    at com.stratio.datasource.mongodb.MongodbRelation$$anonfun$1.apply(MongodbRelation.scala:65)
    at scala.Option.getOrElse(Option.scala:121)
    at com.stratio.datasource.mongodb.MongodbRelation.<init>(MongodbRelation.scala:65)
    at com.stratio.datasource.mongodb.DefaultSource.createRelation(DefaultSource.scala:36)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:345)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:237)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:280)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:745)
Any help is appreciated.
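MongoDB error code 18 ("Authentication failed.") generally means the server rejected the user name, password, or the database being authenticated against. One way to narrow this down, before looking at the Stratio connector itself, is to try the same credentials with plain pymongo outside Spark. This is only a minimal troubleshooting sketch, not taken from this issue: the host, port, user name, password, and the admin authentication database below are placeholder assumptions.

from pymongo import MongoClient
from pymongo.errors import OperationFailure

# Placeholder connection string; adjust host, port, user, password and
# authSource to match the actual deployment. authSource must be the database
# in which the MongoDB user was created, which is often 'admin' rather than
# the database that holds the data.
uri = "mongodb://username:password@server_ip:27017/?authSource=admin"

client = MongoClient(uri, serverSelectionTimeoutMS=5000)
try:
    # A ping forces a round trip that includes authentication.
    client.admin.command("ping")
    print("Authentication succeeded; the credentials themselves look fine.")
except OperationFailure as exc:
    print("Authentication also fails outside Spark:", exc)

If this check only succeeds with authSource=admin, then the middle value of the connector's credentials option ('username,db1,password') may need to be the database where the user was actually created (often admin) rather than db1. That reading of the credentials format is an assumption and worth verifying against the connector's documentation.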