This repository has been archived by the owner on May 27, 2020. It is now read-only.

spark.sqlContext.fromMongoDB(readConfig) retrieve error #173

Open
dark-spark2 opened this issue Jan 23, 2017 · 3 comments

Comments

@dark-spark2

Hi,
I installed Spark 2.0.2 and ran the spark shell:

bin]$ ./spark-shell --jars ~/spark-mongodb_2.10-0.11.2.jar --packages org.mongodb:casbah-core_2.10:3.0.0
Ivy Default Cache set to: /home/centos/.ivy2/cache
The jars for the packages stored in: /home/centos/.ivy2/jars
:: loading settings :: url = jar:file:/home/centos/spark/spark-2.0.2-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.mongodb#casbah-core_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found org.mongodb#casbah-core_2.10;3.0.0 in central
found org.mongodb#casbah-commons_2.10;3.0.0 in central
found com.github.nscala-time#nscala-time_2.10;1.0.0 in central
found joda-time#joda-time;2.3 in central
found org.joda#joda-convert;1.2 in central
found org.mongodb#mongo-java-driver;3.0.4 in central
found org.slf4j#slf4j-api;1.6.0 in central
found org.mongodb#casbah-query_2.10;3.0.0 in central
:: resolution report :: resolve 377ms :: artifacts dl 13ms
:: modules in use:
com.github.nscala-time#nscala-time_2.10;1.0.0 from central in [default]
joda-time#joda-time;2.3 from central in [default]
org.joda#joda-convert;1.2 from central in [default]
org.mongodb#casbah-commons_2.10;3.0.0 from central in [default]
org.mongodb#casbah-core_2.10;3.0.0 from central in [default]
org.mongodb#casbah-query_2.10;3.0.0 from central in [default]
org.mongodb#mongo-java-driver;3.0.4 from central in [default]
org.slf4j#slf4j-api;1.6.0 from central in [default]
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |   8   |   0   |   0   |   0   ||   8   |   0   |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
0 artifacts copied, 8 already retrieved (0kB/8ms)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
17/01/23 14:17:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/23 14:17:55 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://172.31.19.188:4040
Spark context available as 'sc' (master = local[*], app id = local-1485181074885).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

scala> import org.apache.spark.sql._
import org.apache.spark.sql._

scala> import com.mongodb.casbah.{WriteConcern => MongodbWriteConcern}
import com.mongodb.casbah.{WriteConcern=>MongodbWriteConcern}

scala> import com.stratio.datasource.mongodb._
import com.stratio.datasource.mongodb._

scala> import com.stratio.datasource.mongodb.config._
import com.stratio.datasource.mongodb.config._

scala> import com.stratio.datasource.mongodb.config.MongodbConfig._
import com.stratio.datasource.mongodb.config.MongodbConfig._

scala> val builder = MongodbConfigBuilder(Map(Host -> List("localhost:27017"), Database -> "db1", Collection ->"coll1", SamplingRatio -> 0.001, WriteConcern -> "normal"))
builder: com.stratio.datasource.mongodb.config.MongodbConfigBuilder = MongodbConfigBuilder(Map(database -> db1, writeConcern -> normal, schema_samplingRatio -> 0.001, collection -> coll1, host -> List(localhost:27017)))

scala>

scala> val readConfig = builder.build()
readConfig: com.stratio.datasource.util.Config = com.stratio.datasource.util.ConfigBuilder$$anon$1@f3cee0fa

scala> val mongoRDD = spark.sqlContext.fromMongoDB(readConfig)
java.lang.NoSuchMethodError: com.stratio.datasource.mongodb.MongodbContext.fromMongoDB(Lcom/stratio/datasource/util/Config;Lscala/Option;)Lorg/apache/spark/sql/Dataset;
... 56 elided
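
Note: the shell banner above reports Scala 2.11.8, while the attached jar (spark-mongodb_2.10-0.11.2.jar) and the casbah --packages are Scala 2.10 builds. A NoSuchMethodError on the datasource call is the typical symptom of that kind of Scala binary mismatch. A minimal sketch of a launch with matching Scala 2.11 artifacts follows; the exact _2.11 coordinates and version are an assumption, so check Maven Central for the real ones:

# Sketch only: the _2.11 artifact name and version below are assumed, not verified
./spark-shell --packages com.stratio.datasource:spark-mongodb_2.11:0.12.0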

@sanjosh commented Apr 13, 2017

I am seeing the same error. Did you find a workaround, @dark-spark2?

@cyjj commented May 8, 2017

I am currently getting the same error with Spark 2.1.0. Did you find a solution for this?

@thirumalalagu

I am still getting the same error with Spark 2.3.1.
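
For anyone still hitting this: once the Scala versions match, the same read can also be expressed through the generic DataFrame source API instead of the fromMongoDB implicit. A minimal sketch, reusing the option keys visible in the MongodbConfigBuilder output above (host, database, collection, schema_samplingRatio); the format name is an assumption based on the package imports:

// Sketch: assumes the datasource registers under "com.stratio.datasource.mongodb"
val df = spark.sqlContext.read
  .format("com.stratio.datasource.mongodb")
  .options(Map(
    "host" -> "localhost:27017",       // same host as the builder example above
    "database" -> "db1",
    "collection" -> "coll1",
    "schema_samplingRatio" -> "0.001"  // options are passed as plain strings here
  ))
  .load()
df.printSchema()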
