2013-04-26

Keeping a URL in a bash variable causes curl to fail

In a bash script, I store the URL produced by a previous command in a bash variable, $DESTINATION_URL, and I want to use that variable to run a curl command.

If I use the $DESTINATION_URL variable, the curl command fails.

If I try the same curl command with the URL written out literally, it works fine. It seems the & is causing the problem, but I don't understand why. See below.

Example:

$ echo $DESTINATION_URL 
http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true 


$ curl -v -s -i -X PUT -T $SOURCE "$DESTINATION_URL" 
* About to connect() to hadoop-fullslot1 port 50075 (#0) 
* Trying 10.1.3.39... connected 
HTTP/1.1bhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true 
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 
> Host: hadoop-fullslot1:50075 
> Accept: */* 
> Content-Length: 1907377 
> Expect: 100-continue 
> 
* Empty reply from server 
* Connection #0 to host hadoop-fullslot1 left intact 
* Closing connection #0 


$ curl -v -s -i -X PUT -T $SOURCE "http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true" 
* About to connect() to hadoop-fullslot1 port 50075 (#0) 
* Trying 10.1.3.39... connected 
> PUT /webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true HTTP/1.1 
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 
> Host: hadoop-fullslot1:50075 
> Accept: */* 
> Content-Length: 1907377 
> Expect: 100-continue 
> 
< HTTP/1.1 100 Continue 
HTTP/1.1 100 Continue 

* We are completely uploaded and fine 
< HTTP/1.1 201 Created 
HTTP/1.1 201 Created 
< Cache-Control: no-cache 
Cache-Control: no-cache 
< Expires: Fri, 26 Apr 2013 09:01:38 GMT 
Expires: Fri, 26 Apr 2013 09:01:38 GMT 
< Date: Fri, 26 Apr 2013 09:01:38 GMT 
Date: Fri, 26 Apr 2013 09:01:38 GMT 
< Pragma: no-cache 
Pragma: no-cache 
< Expires: Fri, 26 Apr 2013 09:01:38 GMT 
Expires: Fri, 26 Apr 2013 09:01:38 GMT 
< Date: Fri, 26 Apr 2013 09:01:38 GMT 
Date: Fri, 26 Apr 2013 09:01:38 GMT 
< Pragma: no-cache 
Pragma: no-cache 
< Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar 
Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar 
< Content-Type: application/octet-stream 
Content-Type: application/octet-stream 
< Content-Length: 0 
Content-Length: 0 
< Server: Jetty(6.1.26.cloudera.2) 
Server: Jetty(6.1.26.cloudera.2) 

< 
* Connection #0 to host hadoop-fullslot1 left intact 
* Closing connection #0 
$
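As a side note on the `&` suspicion: an unquoted `&` typed literally on the command line is parsed by the shell as the background operator, but a `&` that comes out of a quoted variable expansion is passed through as an ordinary character. A minimal sketch (the URL below is made up):

```shell
#!/bin/bash
# A '&' inside a quoted variable expansion is an ordinary character,
# so curl receives the whole query string as one argument. (Made-up URL.)
url='http://hadoop-fullslot1:50075/webhdfs/v1/x?op=CREATE&user.name=hdfs&overwrite=true'

# Quoted expansion: the full URL survives as a single word.
echo "$url"

# Count how many arguments the quoted expansion produces.
set -- "$url"
echo "curl would see $# argument(s)"
```

Since the failing command already double-quotes `"$DESTINATION_URL"`, the `&` alone does not explain the failure.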

Answers

3

Your variable contains more than just the URL (junk). My guess would be a CR byte or similar: note how the "HTTP/1.1", which should appear to the right of the URL in the request line, ends up printed at the start of the line instead, because the carriage return moves the cursor back to column one...
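If the variable does end with a carriage return (common when the URL was parsed out of CRLF-terminated output or a DOS-formatted file), it can be stripped before calling curl. A minimal sketch with a made-up URL standing in for the real one:

```shell
#!/bin/bash
# Simulate a URL that picked up a trailing CR byte. (Made-up URL.)
DESTINATION_URL=$'http://hadoop-fullslot1:50075/webhdfs/v1/x?op=CREATE\r'

# bash parameter expansion: remove one trailing CR if present.
clean=${DESTINATION_URL%$'\r'}

# Portable alternative: delete every CR byte with tr.
clean2=$(printf '%s' "$DESTINATION_URL" | tr -d '\r')

echo "before: ${#DESTINATION_URL} bytes, after: ${#clean} bytes"
```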

+0

To check whether that is the case, try something like `echo "|${DESTINATION_URL}|"` so that any whitespace becomes visible. – 2013-04-26 11:38:53
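One caveat with the `echo "|...|"` trick: a carriage return moves the cursor back to column one, so the output can look deceptively clean in a terminal. bash's `printf '%q'` spells control bytes out explicitly. A sketch with a deliberately dirty value (hypothetical URL):

```shell
#!/bin/bash
# A value with a hidden trailing CR byte. (Made-up URL.)
url=$'http://hadoop-fullslot1:50075/webhdfs/v1/x\r'

# %q prints a shell-quoted form in which control bytes are spelled out
# (e.g. $'...\r'), so the junk byte becomes visible.
printf '%q\n' "$url"
```

`cat -A` (GNU coreutils) is another option: piping the value through it marks a CR as `^M`.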

+0

There are no special characters in the URL, because the same curl command works fine when I put the URL directly on the command line (instead of using the variable). – 2013-04-26 13:40:42

+0

We get "HTTP/1.1 100 Continue" because we try to send the data before the redirect. - [See more](http://hadoop.apache.org/docs/r1.0.4/webhdfs.html#CREATE) – 2013-04-26 13:51:39

-1

Use single quotes ' instead of double quotes "

+0

I have to use double quotes, because I want to pass a variable as the argument to the curl command. – 2013-04-26 13:55:32
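For reference, the distinction these two comments are circling: single quotes suppress variable expansion entirely, while double quotes expand the variable but still protect the result from word splitting and from shell operators such as `&`. So double quotes are indeed what is needed here. A small sketch (made-up URL):

```shell
#!/bin/bash
DESTINATION_URL='http://example.invalid/path?op=CREATE&user.name=hdfs'  # made-up URL

echo '$DESTINATION_URL'   # single quotes: prints the literal text $DESTINATION_URL
echo "$DESTINATION_URL"   # double quotes: prints the URL, '&' and all
```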