Hi,
I have run Spark (from the Spyder IDE) many times before. Today I got this error (the code is the same):
import os
import sys

from py4j.java_gateway import JavaGateway
gateway = JavaGateway()

# point the environment at the local Spark, Java and winutils installations
os.environ['SPARK_HOME'] = "C:/Apache/spark-1.6.0"
os.environ['JAVA_HOME'] = "C:/Program Files/Java/jre1.8.0_71"
sys.path.append("C:/Apache/spark-1.6.0/python/")
os.environ['HADOOP_HOME'] = "C:/Apache/spark-1.6.0/winutils/"

from pyspark import SparkContext
from pyspark import SparkConf
conf = SparkConf()
The system cannot find the path specified.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Apache\spark-1.6.0\python\pyspark\conf.py", line 104, in __init__
SparkContext._ensure_initialized()
File "C:\Apache\spark-1.6.0\python\pyspark\context.py", line 245, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway()
File "C:\Apache\spark-1.6.0\python\pyspark\java_gateway.py", line 94, in launch_gateway
raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
What went wrong? Thanks for your time.
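(For anyone hitting the same "The system cannot find the path specified." message: a minimal sketch of a sanity check, using the same environment variable names as the snippet above, to see whether each configured directory actually exists on disk. This is only a diagnostic idea, not part of the original post.)

import os

# print each configured directory and whether it really exists
for name in ('SPARK_HOME', 'JAVA_HOME', 'HADOOP_HOME'):
    path = os.environ.get(name)
    print(name, path, os.path.isdir(path) if path else 'not set')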
Answer 0 (score: 5)
OK... it turns out someone had installed a new Java version on the virtual machine. I just changed the JAVA_HOME path to point at the new installation
and it works again. Thanks for your time.
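(A minimal sketch of that kind of fix, assuming the new runtime was installed under a different JRE folder; the "jre1.8.0_xx" path below is a placeholder, not the actual path from the original post.)

import os

# point JAVA_HOME at the newly installed Java runtime
# (placeholder path; replace with the real installation directory)
os.environ['JAVA_HOME'] = "C:/Program Files/Java/jre1.8.0_xx"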