Run Spark locally and access S3

By changing the code:
val sparkConfig = new SparkConf()
   .setMaster("local[*]")
   .set("spark.hadoop.fs.s3a.aws.credentials.provider", "com.amazonaws.auth.DefaultAWSCredentialsProviderChain")
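A fuller sketch of this approach, assuming spark-sql and hadoop-aws are on the classpath; the app name, class name, and the s3a://my-bucket/data.csv path are placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object LocalS3App {
  def main(args: Array[String]): Unit = {
    // Run Spark in local mode and let the AWS SDK resolve credentials
    // via its default chain (env vars, ~/.aws/credentials, instance profile, ...).
    val sparkConfig = new SparkConf()
      .setAppName("local-s3")          // placeholder app name
      .setMaster("local[*]")
      .set("spark.hadoop.fs.s3a.aws.credentials.provider",
           "com.amazonaws.auth.DefaultAWSCredentialsProviderChain")

    val spark = SparkSession.builder().config(sparkConfig).getOrCreate()

    // Read directly from S3 through the s3a connector
    val df = spark.read.option("header", "true").csv("s3a://my-bucket/data.csv")
    df.show()

    spark.stop()
  }
}
```

The `spark.hadoop.` prefix forwards the property to the underlying Hadoop configuration, which is where the s3a connector looks for `fs.s3a.aws.credentials.provider`.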
By adding JVM arguments when launching the application:
   -Dspark.master=local[*]
   -Dspark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain
By setting the JVM properties from Java before creating the Spark context (I have not tested whether this works for the credentials provider, but it should):
System.setProperty("spark.master", "local[*]")
System.setProperty("spark.hadoop.fs.s3a.aws.credentials.provider", "com.amazonaws.auth.DefaultAWSCredentialsProviderChain")
The AWS credentials are taken from the default profile, or you can select a different one with the environment variable AWS_PROFILE=<your profile>.
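The same settings can also be passed on the command line instead of in code; a sketch with spark-submit, where my-profile, the class name, and the jar path are placeholders:

```shell
# Pick the AWS profile whose credentials the default chain should use
export AWS_PROFILE=my-profile

spark-submit \
  --master "local[*]" \
  --conf spark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain \
  --class example.LocalS3App \
  target/my-app.jar
```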