Unable to read data from S3 into a Spark dataframe in SageMaker

Date: 2018-07-23 23:08:14

Tags: amazon-s3 pyspark amazon-sagemaker

I am trying to read a csv file from an S3 bucket (the SageMaker notebook has full access to it) into a Spark dataframe, but I run into the error below complaining that sagemaker-spark_2.11-spark_2.2.0-1.1.1.jar cannot be found. Any tips for resolving this are appreciated!

bucket = "mybucket"
prefix = "folder/file.csv"
df = spark.read.csv("s3://{}/{}/".format(bucket,prefix))

Py4JJavaError: An error occurred while calling o388.csv.
: java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Error reading configuration file
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.parse(ServiceLoader.java:309)
at java.util.ServiceLoader.access$200(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.hasNextService(ServiceLoader.java:357)
at java.util.ServiceLoader$LazyIterator.hasNext(ServiceLoader.java:393)
at java.util.ServiceLoader$1.hasNext(ServiceLoader.java:474)
at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:42)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:614)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:596)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages/sagemaker_pyspark/jars/sagemaker-spark_2.11-spark_2.2.0-1.1.1.jar (No such file or directory)
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:219)
    at java.util.zip.ZipFile.<init>(ZipFile.java:149)
    at java.util.jar.JarFile.<init>(JarFile.java:166)
    at java.util.jar.JarFile.<init>(JarFile.java:103)
    at sun.net.www.protocol.jar.URLJarFile.<init>(URLJarFile.java:93)
    at sun.net.www.protocol.jar.URLJarFile.getJarFile(URLJarFile.java:69)
    at sun.net.www.protocol.jar.JarFileFactory.get(JarFileFactory.java:84)
    at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:122)
    at sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:150)
    at java.net.URL.openStream(URL.java:1045)
    at java.util.ServiceLoader.parse(ServiceLoader.java:304)
    ... 26 more

1 answer:

Answer 0 (score: 1)

(Posting my comment on the original question as an answer.)

It looks like a Jupyter kernel issue. I had a similar problem; after switching from the Sparkmagic (pyspark3) kernel to the Sparkmagic (pyspark) kernel it worked fine. Follow the instructions in this blog and see if it helps.
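
If you prefer to stay on your current kernel, another thing worth trying is to rebuild the SparkSession with the jars that the installed sagemaker_pyspark package actually ships, so Spark stops pointing at a jar version that no longer exists on disk. Below is a minimal sketch, assuming sagemaker_pyspark is importable in the notebook's active environment; adapt the bucket/prefix to your own data.

# Sketch: put the jars bundled with the installed sagemaker_pyspark package
# on the driver classpath, then retry the S3 read.
# Assumes sagemaker_pyspark is installed in the environment backing this kernel.
import sagemaker_pyspark
from pyspark.sql import SparkSession

classpath = ":".join(sagemaker_pyspark.classpath_jars())  # jars present on disk

spark = (SparkSession.builder
         .config("spark.driver.extraClassPath", classpath)
         .getOrCreate())

bucket = "mybucket"
prefix = "folder/file.csv"
df = spark.read.csv("s3://{}/{}".format(bucket, prefix), header=True)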