How to enable CORS allowed origins in WebHDFS - HDFS - Hadoop - Access-Control-Allow-Origin

Asked: 2018-10-11 20:33:02

Tags: angular hadoop cors hdfs webhdfs

When I try to access WebHDFS from an Angular 6 application, I get the errors below. Unfortunately, I have tried almost everything, including changing settings in core-site.xml and hdfs-site.xml, without any positive result. Apparently Hadoop most likely needs to be configured correctly for this. Does anyone know how to solve this problem?

[Error] Origin http://localhost:4200 is not allowed by Access-Control-Allow-Origin.
[Error] XMLHttpRequest cannot load http://192.168.0.16:9870/webhdfs/v1/user/myuser/myfile.csv?op=CREATE&user.name=myuser&createflag=&createparent=true&overwrite=false due to access control checks.
[Error] Failed to load resource: Origin http://localhost:4200 is not allowed by Access-Control-Allow-Origin. (myfile.csv, line 0)
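For context, these errors are raised by the browser, not by Hadoop: before handing a cross-origin response to the page, the browser checks the Access-Control-Allow-Origin header. A minimal, purely illustrative sketch of that check (the header name is real; the function is not part of any library):

```python
# A minimal sketch of the check the browser performs before allowing a
# cross-origin XMLHttpRequest. The header name is real; the function itself
# is only illustrative.
def origin_allowed(request_origin, response_headers):
    """Return True if the response's Access-Control-Allow-Origin header
    permits request_origin (the origin of the page issuing the request)."""
    allowed = response_headers.get("Access-Control-Allow-Origin")
    # No header at all -> the browser blocks the response.
    if allowed is None:
        return False
    # "*" permits any origin; otherwise the value must match exactly.
    return allowed == "*" or allowed == request_origin

# Without CORS configured, WebHDFS responses carry no such header:
print(origin_allowed("http://localhost:4200", {}))  # False
print(origin_allowed("http://localhost:4200",
                     {"Access-Control-Allow-Origin": "*"}))  # True
```

This is why the requests work from curl or from the same host but fail from the Angular dev server on http://localhost:4200.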

4 Answers:

Answer 0 (score: 1)

From the docs:

To enable cross-origin support (CORS), set the following configuration parameters:

Add org.apache.hadoop.security.HttpCrossOriginFilterInitializer to hadoop.http.filter.initializers in core-site.xml. You will also need to set the following properties in core-site.xml:

hadoop.http.cross-origin.enabled = true

hadoop.http.cross-origin.allowed-origins = *

hadoop.http.cross-origin.allowed-methods = GET,POST,HEAD,DELETE,OPTIONS

hadoop.http.cross-origin.allowed-headers = X-Requested-With,Content-Type,Accept,Origin

hadoop.http.cross-origin.max-age = 1800
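Since Hadoop expects these settings as `<property>` entries in core-site.xml, here is a standard-library sketch (not an official Hadoop tool) that appends the properties above to an existing core-site.xml; back up the file first, and note that if hadoop.http.filter.initializers already exists, the filter class should be appended to its comma-separated value rather than skipped as this sketch does:

```python
# Sketch: append the CORS properties listed above to a core-site.xml file.
# Not an official Hadoop utility; assumes the file has a <configuration> root.
import xml.etree.ElementTree as ET

CORS_PROPERTIES = {
    "hadoop.http.filter.initializers":
        "org.apache.hadoop.security.HttpCrossOriginFilterInitializer",
    "hadoop.http.cross-origin.enabled": "true",
    "hadoop.http.cross-origin.allowed-origins": "*",
    "hadoop.http.cross-origin.allowed-methods": "GET,POST,HEAD,DELETE,OPTIONS",
    "hadoop.http.cross-origin.allowed-headers":
        "X-Requested-With,Content-Type,Accept,Origin",
    "hadoop.http.cross-origin.max-age": "1800",
}

def add_cors_properties(path):
    tree = ET.parse(path)
    root = tree.getroot()  # the <configuration> element
    existing = {p.findtext("name") for p in root.findall("property")}
    for name, value in CORS_PROPERTIES.items():
        if name in existing:
            # Do not clobber a property the admin already set; in particular,
            # an existing filter.initializers list must be extended by hand.
            continue
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    tree.write(path)
```

Usage would be `add_cors_properties("/etc/hadoop/conf/core-site.xml")` (path is an assumption; it varies by distribution). The NameNode must be restarted afterwards for the filter to load.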

Answer 1 (score: 0)

You should configure hdfs-site.xml and add this property:

<property>
    <name>dfs.permissions</name>
    <value>false</value>
    <description>If "true", enable permission checking in HDFS. If "false", permission checking is turned off, but all other behavior is unchanged. Switching from one parameter value to the other does not change the mode, owner or group of files or directories.</description>
</property>

Answer 2 (score: 0)

If they are not already present, add the following to core-site.xml:

<property>
  <name>hadoop.http.filter.initializers</name>
  <value>org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.security.HttpCrossOriginFilterInitializer</value>
  <description>A comma separated list of class names. Each class in the list
  must extend org.apache.hadoop.http.FilterInitializer. The corresponding
  Filter will be initialized. Then, the Filter will be applied to all user
  facing jsp and servlet web pages.  The ordering of the list defines the
  ordering of the filters.</description>
</property>
<property>
  <name>hadoop.http.cross-origin.enabled</name>
  <value>true</value>
  <description>Enables cross origin support for all web-services</description>
</property>
<property>
  <name>hadoop.http.cross-origin.allowed-origins</name>
  <value>*</value>
  <description>Comma separated list of origins that are allowed, wildcards (*) and patterns allowed</description>
</property>
<property>
  <name>hadoop.http.cross-origin.allowed-methods</name>
  <value>GET,POST,HEAD,PUT,OPTIONS,DELETE</value>
  <description>Comma separated list of methods that are allowed</description>
</property>
<property>
  <name>hadoop.http.cross-origin.allowed-headers</name>
  <value>X-Requested-With,Content-Type,Accept,Origin,WWW-Authenticate,Accept-Encoding,Transfer-Encoding</value>
  <description>Comma separated list of headers that are allowed</description>
</property>
<property>
  <name>hadoop.http.cross-origin.max-age</name>
  <value>1800</value>
  <description>Number of seconds a pre-flighted request can be cached</description>
</property>
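With these properties in place (and the NameNode restarted so the filter is loaded), a successful preflight exchange should look roughly like the transcript below; the host, port, and path are taken from the question, and the exact response headers may vary by Hadoop version:

```
OPTIONS /webhdfs/v1/user/myuser/myfile.csv?op=CREATE HTTP/1.1
Host: 192.168.0.16:9870
Origin: http://localhost:4200
Access-Control-Request-Method: PUT

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://localhost:4200
Access-Control-Allow-Methods: GET,POST,HEAD,PUT,OPTIONS,DELETE
Access-Control-Max-Age: 1800
```

If the response still lacks the Access-Control-Allow-* headers, the filter was not picked up, which usually means hadoop.http.filter.initializers was not updated or the daemon was not restarted.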

Answer 3 (score: 0)

You must append these properties to etc/hadoop/core-site.xml:

<configuration>
    <property>
            <name>hadoop.http.cross-origin.enabled</name>
            <value>true</value>
    </property>
    <property>
            <name>hadoop.http.cross-origin.allowed-origins</name>
            <value>*</value>
    </property>
    <property>
            <name>hadoop.http.cross-origin.allowed-methods</name>
            <value>GET,POST,HEAD</value>
    </property>
    <property>
            <name>hadoop.http.cross-origin.allowed-headers</name>
            <value>X-Requested-With,Content-Type,Accept,Origin</value>
    </property>
    <property>
            <name>hadoop.http.cross-origin.max-age</name>
            <value>1800</value>
    </property></configuration>