apache spark - Automatically including jars to PySpark classpath


I'm trying to automatically include jars in the PySpark classpath. Right now I can type the following command and it works:

$ pyspark --jars /path/to/my.jar 

I'd like to have the jar included by default, so that I can just type pyspark and use it in the IPython notebook.

I've read that I can include the argument by setting PYSPARK_SUBMIT_ARGS in the environment:

export PYSPARK_SUBMIT_ARGS="--jars /path/to/my.jar" 

Unfortunately, the above doesn't work. I get a runtime error: failed to load class for data source.

I'm running Spark 1.3.1.
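
In case it helps, on later Spark releases (1.4 and up) PYSPARK_SUBMIT_ARGS apparently has to end with pyspark-shell, otherwise the extra arguments are dropped when the shell starts. A rough sketch of what that would look like:

export PYSPARK_SUBMIT_ARGS="--jars /path/to/my.jar pyspark-shell"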

EDIT

My workaround when using the IPython notebook is the following:

$ IPYTHON_OPTS="notebook" pyspark --jars /path/to/my.jar 
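
As a quick sanity check from inside the notebook, printing the driver configuration shows whether the jar was actually picked up (a minimal sketch, assuming the --jars flag is reflected as spark.jars on the SparkContext the shell creates):

# sc is the SparkContext created by the pyspark launcher
print(sc._conf.get("spark.jars"))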

You can add the jar files to the spark-defaults.conf file (located in the conf folder of your Spark installation). If there is more than one entry in the jars list, use : as the separator.

spark.driver.extraClassPath /path/to/my.jar 
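
With more than one jar, the same line would look roughly like this (the second path is just a placeholder):

spark.driver.extraClassPath /path/to/my.jar:/path/to/other.jar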

This property is documented at https://spark.apache.org/docs/1.3.1/configuration.html#runtime-environment

