Error when starting spark-shell

Time: 2017-11-07 10:28:58

Tags: apache-spark pyspark

Could someone please help me? For the past two days I have been trying to set up a Spark and Python environment, but unfortunately I haven't been able to get it working.

Error:


Spark-shell
failed to initialize compiler: object java.lang.Object in compiler mirror not found

These are the paths I set as environment variables:

Anaconda root: C:\Users\raj\AppData\Local\Continuum\Anaconda3
Python: C:\Users\raj\AppData\Local\Programs\Python\Python36-32
c:\spark
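For reference, PySpark on Windows typically needs SPARK_HOME pointing at the Spark install, PYSPARK_PYTHON pointing at the Python interpreter, and %SPARK_HOME%\bin on the PATH. The sketch below is a minimal illustration of how those variables could be assembled from the paths above; build_spark_env is a hypothetical helper, not part of Spark, and the paths are taken from the question.

```python
import ntpath  # Windows-style path joining, regardless of the OS running this sketch

# Hypothetical locations, mirroring the paths given in the question.
SPARK_HOME = r"c:\spark"
PYTHON_HOME = r"C:\Users\raj\AppData\Local\Programs\Python\Python36-32"

def build_spark_env(spark_home, python_home):
    """Return the environment variables PySpark typically expects on Windows."""
    return {
        "SPARK_HOME": spark_home,
        # Tells PySpark which interpreter to use as the driver/worker Python.
        "PYSPARK_PYTHON": ntpath.join(python_home, "python.exe"),
        # spark-shell and pyspark launchers live under %SPARK_HOME%\bin;
        # this directory would be appended to PATH.
        "PATH_ADDITION": ntpath.join(spark_home, "bin"),
    }

env = build_spark_env(SPARK_HOME, PYTHON_HOME)
for name, value in env.items():
    print(f"{name}={value}")
```

On an actual machine these would be set via System Properties → Environment Variables (or setx), not from Python; the dict here only documents the expected values.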

I have not installed Scala yet.

Could someone please help me get the setup right?

Is there a way to resolve the above error, or to invoke PySpark from a Jupyter notebook under Anaconda?

0 Answers:

There are no answers yet.