I'm using Python 3.7. After installing Scrapy and some dependent libraries, I ran scrapy crawl spidertieba and got an error. The error message is:
2018-01-21 17:55:38 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: hellospider)
2018-01-21 17:55:38 [scrapy.utils.log] INFO: Versions: lxml 4.1.1.0, libxml2 2.9.7, cssselect 1.0.3, parsel 1.3.1, w3lib 1.18.0, Twisted 17.9.0, Python 3.7.0a3 (v3.7.0a3:90a6785, Dec  5 2017, 22:04:17) [MSC v.1900 32 bit (Intel)], pyOpenSSL 17.5.0 (OpenSSL 1.1.0f  25 May 2017), cryptography 2.1.4, Platform Windows-7-6.1.7601-SP1
2018-01-21 17:55:38 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'hellospider', 'NEWSPIDER_MODULE': 'hellospider.spiders', 'ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['hellospider.spiders']}
Traceback (most recent call last):
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\Administrator.USER-20160420AE\AppData\Local\Programs\Python\Python37-32\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\crawler.py", line 170, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\crawler.py", line 198, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\crawler.py", line 203, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\crawler.py", line 55, in __init__
    self.extensions = ExtensionManager.from_crawler(self)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 680, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\scrapy\extensions\telnet.py", line 12, in <module>
    from twisted.conch import manhole, telnet
  File "c:\users\administrator.user-20160420ae\appdata\local\programs\python\python37-32\lib\site-packages\twisted\conch\manhole.py", line 154
    def write(self, data, async=False):
                              ^
SyntaxError: invalid syntax



OK, I've solved it.




This post was last edited by xgugu on 2018-1-22 12:49.
How did you solve it? I'm running into the same problem.



Fixed it. The workaround is as follows:
rename the `async` parameter to something else, e.g. `shark`, and the file compiles again:
def write(self, data, shark=False):
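A quick standard-library check shows why that one-word rename works: Python 3.7 promoted `async` (and `await`) to reserved keywords, so any code that still uses them as identifiers fails to compile at import time, exactly as in the traceback above.

```python
import keyword

# Since Python 3.7, `async` is a full reserved keyword.
print("async" in keyword.kwlist)  # True on 3.7+

# Using it as a parameter name is therefore a SyntaxError at compile time,
# before any code runs:
try:
    compile("def write(self, data, async=False): pass", "<manhole>", "exec")
    compiled = True
except SyntaxError:
    compiled = False
print(compiled)  # False

# The renamed version compiles fine:
compile("def write(self, data, shark=False): pass", "<manhole>", "exec")
```

This is also why the error appears while merely importing `twisted.conch.manhole`: the `SyntaxError` is raised during compilation of the module, not when `write()` is called.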



Quote:
Originally posted by SU152 on 2018-1-29 17:48:
Fixed it. The workaround is as follows:
rename the `async` parameter to something else, e.g. `shark`, and the file compiles again:
def write(self, data, shark=False):
What's the principle behind this?



The guy above worked it out from the source code; I ran into this problem too.
The `async` parameter in `addOutput` has to be renamed to `shark` as well, along with the `if` checks that use it. Python 3.7 made `async` a reserved keyword, so any remaining use of it as an identifier is still a SyntaxError; the rename has to be applied consistently in `write`, in `addOutput`, and everywhere the parameter is referenced:


    def write(self, data, shark=False):  # parameter renamed from `async`
        self.handler.addOutput(data, shark)


    def addOutput(self, data, shark=False):  # renamed here as well
        if shark:
            self.terminal.eraseLine()
            self.terminal.cursorBackward(len(self.lineBuffer) + len(self.ps[self.pn]))

        self.terminal.write(data)

        if shark:
            if self._needsNewline():
                self.terminal.nextLine()



This post was last edited by wqg_dba on 2018-7-29 10:16.
Nice work, man.


