
Hello. I have been trying to get Scrapy working for a day now, and it finally installed. However, since I am completely new to Python, I suspect I have not set it up correctly: I get an error when running my first Scrapy project.

I tried to run my first Scrapy project, the one from the scrapy.org manual (page 9), but I get an error when I try to run it.
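
(The contents of quotes_spider.py are not shown in the question; it is presumably the quotes spider from the front page of the Scrapy documentation, reproduced here only as an assumption for context. It looks roughly like this:)

import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"
    # start on the humor tag page of the Scrapy demo site
    start_urls = ['http://quotes.toscrape.com/tag/humor/']

    def parse(self, response):
        # yield one item per quote box on the page
        for quote in response.css('div.quote'):
            yield {
                'author': quote.xpath('span/small/text()').extract_first(),
                'text': quote.css('span.text::text').extract_first(),
            }

        # follow the pagination link, if there is one
        next_page = response.css('li.next a::attr("href")').extract_first()
        if next_page is not None:
            next_page = response.urljoin(next_page)
            yield scrapy.Request(next_page, callback=self.parse)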

Here is the error I am getting:

[root@… sproject]# scrapy runspider quotes_spider.py -o quotes.json 
2016-12-13 01:33:44 [scrapy] INFO: Scrapy 1.2.2 started (bot: scrapybot) 
2016-12-13 01:33:44 [scrapy] INFO: Overridden settings: {'FEED_URI': 'quotes.json', 'FEED_FORMAT': 'json'} 
2016-12-13 01:33:45 [scrapy] INFO: Enabled extensions: 
['scrapy.extensions.feedexport.FeedExporter', 
'scrapy.extensions.logstats.LogStats', 
'scrapy.extensions.corestats.CoreStats', 
'scrapy.extensions.telnet.TelnetConsole'] 
2016-12-13 01:33:45 [scrapy] INFO: Enabled downloader middlewares: 
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 
'scrapy.downloadermiddlewares.retry.RetryMiddleware', 
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 
'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware', 
'scrapy.downloadermiddlewares.stats.DownloaderStats'] 
2016-12-13 01:33:45 [scrapy] INFO: Enabled spider middlewares: 
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 
'scrapy.spidermiddlewares.referer.RefererMiddleware', 
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 
'scrapy.spidermiddlewares.depth.DepthMiddleware'] 
2016-12-13 01:33:45 [scrapy] INFO: Enabled item pipelines: 
[] 
2016-12-13 01:33:45 [scrapy] INFO: Spider opened 
Unhandled error in Deferred: 
2016-12-13 01:33:45 [twisted] CRITICAL: Unhandled error in Deferred: 

2016-12-13 01:33:45 [twisted] CRITICAL: 
Traceback (most recent call last): 
    File "/usr/local/lib/python3.5/site-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks 
    result = g.send(result) 
    File "/usr/local/lib/python3.5/site-packages/scrapy/crawler.py", line 74, in crawl 
    yield self.engine.open_spider(self.spider, start_requests) 
ImportError: No module named '_sqlite3' 

And here is the Python version I am using:

[root]# cd ~ 
[root]# python -V 
Python 3.5.2 
[root]# pip -V 
pip 9.0.1 from /usr/local/lib/python3.5/site-packages (python 3.5) 

Thanks. Any help is appreciated.


[sqlite3 is part of the Python standard library](https://docs.python.org/3/library/sqlite3.html). It is strange that it is not available on your system. Do you get the same error if you type `import sqlite3` in a `python` shell? If so, Python 3.5 is probably not installed correctly on your system (at least as far as the sqlite3 dependency is concerned).
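
A quick way to run that check from the shell, sketched in the same console style as above (your prompt will differ):

[root]# python -c "import sqlite3"

If that command fails with the same ImportError: No module named '_sqlite3', the problem is in the Python build itself rather than in Scrapy.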

Answer


This does not look like a Scrapy problem. You may need to explicitly install libsqlite3-dev (assuming you are on a Debian-based system) or sqlite-devel (if you are on the Red Hat family).
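
A rough sketch of what that could look like. Since the traceback shows Python 3.5 living under /usr/local, it was most likely built from source, so after installing the development headers Python needs to be rebuilt for the _sqlite3 extension module to be compiled; the package names and the source directory below are assumptions:

# Red Hat family
[root]# yum install sqlite-devel
# Debian family (instead of the above)
# [root]# apt-get install libsqlite3-dev

# rebuild Python 3.5 from its source tree so that _sqlite3 is built
# against the headers that are now available
[root]# cd /path/to/Python-3.5.2
[root]# ./configure --prefix=/usr/local
[root]# make
[root]# make install

After the rebuild, running python -c "import sqlite3" again should succeed, and the spider can be retried.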