MySQL Driver Error in Apache Spark

Datetime: 2016-08-23 04:16:06         Topic: MySQL, Spark

I was following the Spark example to load data from a MySQL database over JDBC.

Executing the job failed with the following error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 20.0 failed 4 times, most recent failure: Lost task 0.3 in stage 20.0 (TID 233, ip-172-22-11-249.ap-southeast-1.compute.internal): java.lang.IllegalStateException: Did not find registered driver with class com.mysql.jdbc.Driver

To force Spark to register the "com.mysql.jdbc.Driver" class, add the "driver" option as highlighted below:

val df = sqlContext.read
  .format("jdbc")
  .option("url", url)
  .option("driver", "com.mysql.jdbc.Driver")   // <-- explicitly name the JDBC driver class
  .option("dbtable", "people")
  .load()

