Run Spark locally and access S3

By setting the options directly in the code:
import org.apache.spark.SparkConf

val sparkConfig = new SparkConf()
   // Hadoop options must carry the "spark.hadoop." prefix when set through SparkConf
   .set("spark.hadoop.fs.s3a.aws.credentials.provider", "com.amazonaws.auth.DefaultAWSCredentialsProviderChain")
   .setMaster("local[*]")
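A minimal usage sketch building on the config above, assuming the hadoop-aws module and the matching AWS SDK are on the classpath; the bucket name and key are placeholders:
import org.apache.spark.sql.SparkSession

// Build a local session from sparkConfig and read a file from S3
// ("my-bucket" and the key are purely illustrative).
val spark = SparkSession.builder().config(sparkConfig).getOrCreate()
val df = spark.read.text("s3a://my-bucket/some/key.txt")
df.show(5)
spark.stop()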
By adding JVM arguments when launching the application:
-Dspark.master=local[*]
-Dspark.hadoop.fs.s3a.aws.credentials.provider=com.amazonaws.auth.DefaultAWSCredentialsProviderChain
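With this approach the application code needs no Spark-specific setup, since SparkConf picks up JVM system properties that start with spark. at construction time; a minimal sketch:
import org.apache.spark.sql.SparkSession

// spark.master and spark.hadoop.* come from the -D system properties above,
// so no explicit configuration is needed here.
val spark = SparkSession.builder().getOrCreate()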
By setting the JVM properties programmatically (I have not tested whether this works for the credentials provider, but it should):
System.setProperty("spark.master", "local[*]")
System.setProperty("spark.hadoop.fs.s3a.aws.credentials.provider", "com.amazonaws.auth.DefaultAWSCredentialsProviderChain")
The AWS credentials will be taken from the default profile, or you can select a different profile with the environment variable AWS_PROFILE=<your profile>.