
Scrapy "[twisted] CRITICAL: Unhandled error in Deferred" error

I installed Scrapy today on 64-bit CentOS 6.7, but running it fails with the error in the title.
Python version: 2.7.11
pip list output:
beautifulsoup4 (4.4.1)
cffi (1.4.2)
characteristic (14.3.0)
cryptography (1.1.2)
cssselect (0.9.1)
enum34 (1.1.2)
idna (2.0)
ipaddress (1.0.16)
lxml (3.5.0)
meld3 (1.0.2)
pip (7.1.2)
pyasn1 (0.1.9)
pyasn1-modules (0.0.8)
pycparser (2.14)
pyOpenSSL (0.15.1)
queuelib (1.4.2)
Scrapy (1.0.4)
service-identity (14.0.0)
setuptools (19.2)
six (1.10.0)
supervisor (3.2.0)
Twisted (15.5.0)
w3lib (1.13.0)
web.py (0.37)
wheel (0.26.0)
zope.interface (4.1.3)
Following the official tutorial, I created a new project:
1. scrapy startproject tutorial
2. Edit tutorial/tutorial/items.py:

import scrapy

class DmozItem(scrapy.Item):
    title = scrapy.Field()
    link = scrapy.Field()
    desc = scrapy.Field()

3. Edit tutorial/tutorial/spiders/dmoz_spider.py:

import scrapy

class DmozSpider(scrapy.Spider):
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/",
    ]

    def parse(self, response):
        filename = response.url.split("/")[-2] + '.html'
        with open(filename, 'wb') as f:
            f.write(response.body)
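(As a side note on the parse() callback above: the file name is just the second-to-last path segment of the response URL. The string logic can be checked on its own, without Scrapy installed — the snippet below is a standalone sketch, not Scrapy code.)

```python
# Standalone sketch of the filename logic in parse():
# split the URL on "/" and keep the second-to-last segment.
url = "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/"
filename = url.split("/")[-2] + '.html'
print(filename)  # -> Books.html
```

So the two start URLs would be saved as Books.html and Resources.html in the directory where the crawl is run.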

4. In the project root, I ran scrapy crawl dmoz, and it failed:

2016-01-07 16:10:42 [scrapy] INFO: Scrapy 1.0.4 started (bot: tutorial)
2016-01-07 16:10:42 [scrapy] INFO: Optional features available: ssl, http11
2016-01-07 16:10:42 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2016-01-07 16:10:42 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, LogStats, CoreStats, SpiderState
Unhandled error in Deferred:
2016-01-07 16:10:42 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 71, in crawl
    self.engine = self._create_engine()
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 83, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/local/lib/python2.7/site-packages/scrapy/core/engine.py", line 66, in __init__
    self.scheduler_cls = load_object(self.settings['SCHEDULER'])
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 44, in load_object
    mod = import_module(module)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/usr/local/lib/python2.7/site-packages/scrapy/core/scheduler.py", line 6, in <module>
    from queuelib import PriorityQueue
  File "/usr/local/lib/python2.7/site-packages/queuelib/__init__.py", line 1, in <module>
    from queuelib.queue import FifoDiskQueue, LifoDiskQueue
  File "/usr/local/lib/python2.7/site-packages/queuelib/queue.py", line 5, in <module>
    import sqlite3
  File "/usr/local/lib/python2.7/sqlite3/__init__.py", line 24, in <module>
    from dbapi2 import *
  File "/usr/local/lib/python2.7/sqlite3/dbapi2.py", line 28, in <module>
    from _sqlite3 import *
exceptions.ImportError: No module named _sqlite3
2016-01-07 16:10:42 [twisted] CRITICAL:

That is roughly the error. I have only just started using Python, so any advice from anyone who has run into this would be much appreciated.
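The last line of the traceback is the real problem: "No module named _sqlite3" means this interpreter was built without the sqlite3 C extension (a Python under /usr/local is typically compiled from source, and if the sqlite development headers were not installed at build time, _sqlite3 is silently skipped). Scrapy only triggers it because its disk queues import sqlite3. A minimal check, independent of Scrapy:

```python
# Check whether this interpreter was built with sqlite3 support.
# Scrapy's scheduler imports queuelib, which imports sqlite3, which
# needs the _sqlite3 C extension -- the exact chain in the traceback.
import importlib

try:
    importlib.import_module("_sqlite3")
    status = "sqlite3 support is present"
except ImportError:
    status = "sqlite3 support is missing"
print(status)
```

If the check reports missing support, the usual remedy on CentOS (an assumption based on this setup, where Python appears to be compiled from source) is to install the headers with `yum install sqlite-devel` and then rebuild and reinstall Python so the _sqlite3 extension gets compiled.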
