PySpark - SparkContext: Error initializing SparkContext File does not exist
I have a small piece of code in PySpark, but I keep getting errors. I'm new to this, so I'm not sure where to start.

    from pyspark import SparkContext, SparkConf

    conf = SparkConf().setAppName("Open json").setMaster("local[3]")
    sc = SparkContext(conf=conf)
    print("Done")

I ran this in cmd with the command:

    spark-submit .\PySpark\Open.py

I then get the following error output:

    C:\Users\Abdullah\Documents\Master Thesis>spark-submit .\PySpark\Open.py
    18/06/30 15:21:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    18/06/30 15:22:01 ERROR SparkContext: Error initializing SparkContext.
    java.io.FileNotFoundException: File file:/C:/Users/Abdullah/Documents/Master%20Thesis/PySpark/Open.py does not exist
            at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
            at ...
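One detail that stands out in the stack trace: the exception path shows `Master%20Thesis`, i.e. the space in the folder name has been URL-encoded before the local file lookup. A minimal sketch of a sanity check, assuming the space is the culprit (the `C:\Temp` location is just an assumed example, not a required path):

    REM Assumption: the %20 in the error means the space in "Master Thesis"
    REM breaks the file lookup; try submitting the same script from a path
    REM that contains no spaces.
    if not exist C:\Temp mkdir C:\Temp
    copy "PySpark\Open.py" C:\Temp\Open.py
    spark-submit C:\Temp\Open.py

If the job prints "Done" from the space-free path, the working directory with the space in its name is the likely cause of the FileNotFoundException.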