./spark-submit --jars $LIBJARS --class test.MyApp --master local myApp.jar

Add the following to ~/.bashrc:

# Collect every third-party jar into a comma-separated list for --jars
for i in /path/to/your/third/party/*.jar
do
  LIBJARS=$i,$LIBJARS
done
# Strip the trailing comma left by the loop
export LIBJARS=${LIBJARS%,}
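
After editing ~/.bashrc, re-source it so $LIBJARS is visible in the current shell (a standard bash step, not part of the original snippet):

source ~/.bashrc
echo $LIBJARS    # should print the comma-separated jar list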

http://stackoverflow.com/questions/24855368/spark-throws-classnotfoundexception-when-using-jars-option

If a ClassNotFoundException is still thrown for the database driver, you can set SPARK_CLASSPATH to your external jars in spark-env.sh.
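
For example, a minimal sketch of the spark-env.sh entry (the jar path is taken from the warning quoted below; substitute your own driver jar):

# spark-env.sh: put the JDBC driver jar on the classpath
export SPARK_CLASSPATH=/usr/local/cloudwave-ha/cloudwave/lib/cloudwave-jdbc.jar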

When the job actually runs, Spark prints a warning:

SPARK_CLASSPATH was detected (set to '/usr/local/cloudwave-ha/cloudwave/lib/cloudwave-jdbc.jar').
This is deprecated in Spark 1.0+.

Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath
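
Following that advice, the non-deprecated equivalent of the command at the top would look roughly like this (a sketch; adjust the jar paths to your environment):

./spark-submit --class test.MyApp --master local \
  --jars $LIBJARS \
  --driver-class-path /usr/local/cloudwave-ha/cloudwave/lib/cloudwave-jdbc.jar \
  --conf spark.executor.extraClassPath=/usr/local/cloudwave-ha/cloudwave/lib/cloudwave-jdbc.jar \
  myApp.jar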