In a bash script, I store a URL produced by a previous command in the bash variable $DESTINATION_URL, and I want to use that variable in a curl command. If I use $DESTINATION_URL, the curl command fails. If I run the same curl command with the URL written out literally, it works fine. The & characters seem to be causing the problem, but I don't understand why. Example below:
[email protected]:~$ echo $DESTINATION_URL
http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true
[email protected]:~$ curl -v -s -i -X PUT -T $SOURCE "$DESTINATION_URL"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
* Trying 10.1.3.39... connected
HTTP/1.1bhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
* Empty reply from server
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0
[email protected]:~$ curl -v -s -i -X PUT -T $SOURCE "http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
* Trying 10.1.3.39... connected
> PUT /webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
HTTP/1.1 100 Continue
* We are completely uploaded and fine
< HTTP/1.1 201 Created
HTTP/1.1 201 Created
< Cache-Control: no-cache
Cache-Control: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
< Content-Type: application/octet-stream
Content-Type: application/octet-stream
< Content-Length: 0
Content-Length: 0
< Server: Jetty(6.1.26.cloudera.2)
Server: Jetty(6.1.26.cloudera.2)
<
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0
[email protected]:~$
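Whether `&` is really the culprit can be checked in isolation. A minimal sketch (the helper function, variable contents, and `example.invalid` host are made up for illustration, not taken from the question):

```shell
#!/bin/bash
# When '&' is typed literally and unquoted, bash parses it as a control
# operator: 'curl http://h/p?op=CREATE&user.name=hdfs' backgrounds
# 'curl http://h/p?op=CREATE' and runs 'user.name=hdfs' as a second command.
# But an '&' that comes out of a variable expansion is NOT re-parsed as an
# operator: word splitting only splits on $IFS, so the URL survives intact.
count_args() { echo "$#"; }

URL='http://example.invalid/webhdfs/v1/f?op=CREATE&user.name=hdfs&overwrite=true'

count_args $URL      # unquoted expansion: still one word, '&' stays literal
count_args "$URL"    # quoted: one word, and also safe if the URL had spaces
```

Both calls report a single argument, which suggests the `&` characters are a red herring when they come from a variable; the comments below point at invisible whitespace instead.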
To check whether that is the case, try something like `echo "|${DESTINATION_URL}|"` so that you can see any whitespace. – 2013-04-26 11:38:53
There are no special characters in the URL, because the curl command works fine if I put the URL directly into the command instead of using the variable. – 2013-04-26 13:40:42
We get "HTTP/1.1 100 Continue" because we try to send the data before the redirect. – [See more](http://hadoop.apache.org/docs/r1.0.4/webhdfs.html#CREATE) – 2013-04-26 13:51:39
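The whitespace check suggested in the comments can be sketched as follows. A trailing carriage return is a common way to corrupt a URL captured from another command's output, and would also explain the overwritten `PUT` line in the first transcript; the variable contents and host below are made up:

```shell
#!/bin/bash
# A trailing '\r' (e.g. picked up when the URL was captured from a previous
# command) is invisible to plain echo, but it corrupts the HTTP request line.
DESTINATION_URL=$'http://example.invalid/webhdfs/v1/f?op=CREATE&user.name=hdfs\r'

# The delimiters give it away: the '\r' makes the closing '|' jump back and
# overwrite the start of the line on a terminal.
echo "|${DESTINATION_URL}|"

# cat -A makes it explicit: the carriage return shows up as '^M'.
printf '%s\n' "$DESTINATION_URL" | cat -A

# Strip a trailing carriage return before handing the URL to curl:
DESTINATION_URL=${DESTINATION_URL%$'\r'}
# curl -v -s -i -X PUT -T "$SOURCE" "$DESTINATION_URL"   # now sees a clean URL
```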