Saving a URL in a bash variable causes curl to fail

Posted: 2013-04-26 09:07:00

Tags: linux bash curl

In a bash script, I store a URL produced by an earlier command in the bash variable $DESTINATION_URL. I want to run a curl command using this variable.

If I use the $DESTINATION_URL variable, the curl command fails.

If I run the same curl command with the URL written out literally, it works fine. The & seems to be causing the problem, but I can't see why.

Example below:

ha@hadoop-fullslot1:~$ echo $DESTINATION_URL
http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true


ha@hadoop-fullslot1:~$ curl -v -s -i -X PUT -T $SOURCE "$DESTINATION_URL"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
*   Trying 10.1.3.39... connected
 HTTP/1.1bhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
* Empty reply from server
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0


ha@hadoop-fullslot1:~$ curl -v -s -i -X PUT -T $SOURCE "http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
*   Trying 10.1.3.39... connected
> PUT /webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
HTTP/1.1 100 Continue

* We are completely uploaded and fine
< HTTP/1.1 201 Created
HTTP/1.1 201 Created
< Cache-Control: no-cache
Cache-Control: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
< Content-Type: application/octet-stream
Content-Type: application/octet-stream
< Content-Length: 0
Content-Length: 0
< Server: Jetty(6.1.26.cloudera.2)
Server: Jetty(6.1.26.cloudera.2)

<
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0
ha@hadoop-fullslot1:~$

2 Answers:

Answer 0 (score: 3)

Your variable contains something more than just the URL (garbage). My guess is a CR byte or the like — notice how "HTTP/1.1" is printed at the start of the line in your verbose output, when it should appear to the right of the URL...
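If that guess is right, the stray byte can be made visible with od and stripped with bash parameter expansion. A minimal sketch — the $'\r' suffix below simulates the suspected garbage, since the real value would come from whatever command produced the URL:

```shell
# Simulate a URL that picked up a trailing carriage return (the suspected garbage).
DESTINATION_URL=$'http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE\r'

# Make the hidden byte visible: od -c prints \r explicitly, unlike echo.
printf '%s' "$DESTINATION_URL" | od -c | tail -n 2

# Strip every carriage return before handing the URL to curl.
DESTINATION_URL=${DESTINATION_URL//$'\r'/}
printf '%s\n' "$DESTINATION_URL"
```

The same expansion works for any byte you find in the od output, e.g. ${VAR//$'\n'/} for stray newlines.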

Answer 1 (score: -1)

Use single quotes ' instead of double quotes ".