
Python 3 compatibility

Open · redapple opened this issue 7 years ago · 6 comments

scrapy-jsonrpc is not compatible with Python 3.

Apart from the example client code, which uses the Python 2-only urllib.urlopen() (moved to urllib.request.urlopen() in Python 3):

  • the crawler resource is not found: the child resource name "crawler" needs to be passed to Twisted as bytes
  • the responses are not bytes, which Twisted also complains about (see the sketch after this list)
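
A minimal sketch of the two server-side fixes described above, based on Twisted's web API; the class and route names here are hypothetical and not the actual scrapy-jsonrpc source:

```python
import json

from twisted.web import resource


class ExampleRoot(resource.Resource):
    """Hypothetical root resource illustrating the Python 3 fixes above."""

    def __init__(self):
        resource.Resource.__init__(self)
        # On Python 3, Twisted expects child path segments as bytes,
        # so the "crawler" resource has to be registered as b"crawler".
        self.putChild(b"crawler", resource.Resource())

    def render_GET(self, request):
        # render_*() must return bytes on Python 3; returning a str makes
        # Twisted complain that the response body is not bytes.
        return json.dumps({"status": "ok"}).encode("utf-8")
```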

redapple avatar Sep 27 '16 13:09 redapple

Any updates on this issue?

mirceachira avatar Jun 16 '17 11:06 mirceachira

I see there's been a pull request for this for a while now. I'm running it locally and it works, but it would be nice to merge the branch so that master can be used directly. Please fix this ASAP so that people don't have to deal with it in the future :)

mirceachira avatar Jul 13 '17 09:07 mirceachira

Any update on this issue?

rustanacexd avatar Jul 02 '18 08:07 rustanacexd

Any update? I cannot make it work with Python 3.


2019-02-26 22:57:06 [py.warnings] WARNING: /Users/XXX/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/webservice.py:4: ScrapyDeprecationWarning: Module `scrapy.log` has been deprecated, Scrapy now relies on the builtin Python library for logging. Read the updated logging entry in the documentation to learn more.
  from scrapy import log, signals

Traceback (most recent call last):
  File "/Users/XXX/anaconda3/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/Users/XXX/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 171, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 200, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 205, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 55, in __init__
    self.extensions = ExtensionManager.from_crawler(self)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/utils/misc.py", line 44, in load_object
    mod = import_module(module)
  File "/Users/kourosh/anaconda3/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/webservice.py", line 7, in <module>
    from scrapy_jsonrpc.jsonrpc import jsonrpc_server_call
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/jsonrpc.py", line 11, in <module>
    from scrapy_jsonrpc.serialize import ScrapyJSONDecoder
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/serialize.py", line 8, in <module>
    from scrapy.spider import Spider
ModuleNotFoundError: No module named 'scrapy.spider'


kouroshshafi avatar Feb 27 '19 04:02 kouroshshafi

@kouroshshafi, try downgrading Scrapy to 1.5; this error is probably unrelated to Python 3.
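
For reference, the traceback above fails in scrapy_jsonrpc/serialize.py on `from scrapy.spider import Spider`, a module path that newer Scrapy releases no longer provide (the class lives in scrapy.spiders). Besides downgrading, a local workaround (untested sketch, patching your installed copy of serialize.py) would be:

```python
# Untested local patch to scrapy_jsonrpc/serialize.py: prefer the current
# module path, but fall back to the legacy one for old Scrapy versions.
try:
    from scrapy.spiders import Spider  # Scrapy >= 1.0 module path
except ImportError:
    from scrapy.spider import Spider   # legacy path removed in newer Scrapy
```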

Digenis avatar Apr 20 '19 12:04 Digenis

Is anyone working on this?

ShrinkDW avatar Aug 23 '20 17:08 ShrinkDW