java - Can't access sqlite db from Spark


I have the following code:

val conf = new SparkConf().setAppName("spark test")
val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)

val data = sqlContext.read.format("jdbc").options(
  Map(
    "url" -> "jdbc:sqlite:/nv/pricing/ix_tri_pi.sqlite3",
    "dbtable" -> "select security_id from ix_tri_pi")).load()

data.foreach {
  row => println(row.getInt(1))
}

and try to submit it with:

spark-submit \
  --class "com.novus.analytics.spark.SparkTest" \
  --master "local[4]" \
  /Users/smabie/workspace/analytics/analytics-spark/target/scala-2.10/analytics-spark.jar \
  --conf spark.executer.extraClassPath=sqlite-jdbc-3.8.7.jar \
  --conf spark.driver.extraClassPath=sqlite-jdbc-3.8.7.jar \
  --driver-class-path sqlite-jdbc-3.8.7.jar \
  --jars sqlite-jdbc-3.8.7.jar

but get the following exception:

Exception in thread "main" java.sql.SQLException: No suitable driver

I am using Spark version 1.6.1, if that helps. Thanks!

Try defining the application jar as the last parameter of spark-submit. Everything placed after the application jar on the command line is passed to your application as program arguments rather than to spark-submit itself, so in the command above the --conf, --driver-class-path, and --jars options never reach spark-submit, and the SQLite driver is never put on the classpath.
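With that ordering fixed, the invocation from the question might look like the sketch below (same jar names and paths as in the question; note the command also corrects the question's spark.executer typo to spark.executor.extraClassPath, since misspelled configuration keys are silently ignored):

```shell
spark-submit \
  --class "com.novus.analytics.spark.SparkTest" \
  --master "local[4]" \
  --conf spark.executor.extraClassPath=sqlite-jdbc-3.8.7.jar \
  --conf spark.driver.extraClassPath=sqlite-jdbc-3.8.7.jar \
  --driver-class-path sqlite-jdbc-3.8.7.jar \
  --jars sqlite-jdbc-3.8.7.jar \
  /Users/smabie/workspace/analytics/analytics-spark/target/scala-2.10/analytics-spark.jar
```

If the driver still is not found, Spark's JDBC data source also accepts an explicit "driver" option (e.g. "driver" -> "org.sqlite.JDBC" in the options Map), which forces the named driver class to be registered before connecting.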

