
When I try to deploy using shub deploy, I get this error and cannot deploy to Scrapinghub:

Removing intermediate container fccf1ec715e6
Step 10 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Running in 729e0d414f46
Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')

{"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "details": {"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1"}, "error": "build_error"}

{"message": "Internal build error", "status": "error"} Deploy log location: c:\users\dr521f~1.pri\appdata\local\temp\shub_deploy_pvx7dk.log Error: Deploy failed: {"message": "Internal build error", "status": "error"}

Here is my requirements.txt:

attrs==16.1.0 
beautifulsoup4==4.5.1 
cffi==1.8.2 
click==6.6 
cryptography==1.5 
cssselect==0.9.2 
enum34==1.1.6 
fake-useragent==0.1.2 
hubstorage==0.23.1 
idna==2.1 
ipaddress==1.0.17 
lxml==3.6.1 
parsel==1.0.3 
pyasn1==0.1.9 
pyasn1-modules==0.0.8 
pycparser==2.14 
PyDispatcher==2.0.5 
pyOpenSSL==16.1.0 
pypiwin32==219 
queuelib==1.4.2 
requests==2.11.1 
retrying==1.3.3 
ruamel.ordereddict==0.4.9 
ruamel.yaml==0.12.13 
scrapinghub==1.8.0 
Scrapy==1.1.2 
scrapy-fake-useragent==0.0.1 
service-identity==16.0.0 
shub==2.4.0 
six==1.10.0 
Twisted==16.4.0 
typing==3.5.2.2 
w3lib==1.15.0 
zope.interface==4.3.2 

Why can't I deploy?

Answer


From the documentation here:

Note that this requirements file is an extension of the Scrapy Cloud stack, and therefore should not contain packages that are already part of the stack, such as scrapy.

As you can see in the error:

Running in 729e0d414f46 Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')

It says Double requirement given: Scrapy Cloud merges your requirements.txt with the packages already pinned in the stack, so pinning attrs (or any other stack package) again produces this conflict.
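If you want to spot these collisions before deploying, here is a minimal Python sketch. It assumes you have saved the stack's pinned requirements locally as stack-requirements.txt (a hypothetical filename; the stack definitions are published in Scrapinghub's stack repositories on GitHub):

# Minimal sketch: report packages in requirements.txt that the stack already pins.
# stack-requirements.txt is an assumed local copy of the stack's requirements.
import re

def pinned_names(path):
    """Collect lowercase package names from a pip requirements file."""
    names = set()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#"):
                # keep only the name before any version specifier or extras
                names.add(re.split(r"[=<>!~\[;]", line, maxsplit=1)[0].strip().lower())
    return names

overlap = pinned_names("requirements.txt") & pinned_names("stack-requirements.txt")
for name in sorted(overlap):
    print("already in the stack, remove it:", name)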

Use separate requirements.txt files for your local project and for Scrapinghub. I ended up creating a shub-requirements.txt containing only this (see the scrapinghub.yml sketch after the list):

beautifulsoup4==4.5.1 
fake-useragent==0.1.2
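
To make shub deploy pick up the trimmed file, you can point scrapinghub.yml at it. A minimal sketch, assuming a placeholder project ID of 12345 and the requirements_file key from shub 2.x:

# scrapinghub.yml
projects:
  default: 12345  # placeholder project ID
requirements_file: shub-requirements.txt

With this in place, shub deploy installs only the packages that are not already part of the Scrapy Cloud stack, while your full requirements.txt stays available for local development.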