Andrey Rakhmatullin
For the record, in the described example the raised exception is *unhandled*, bubbling out of `Crawler.crawl_async()` or `Crawler.crawl()` depending on the runner used (raised in `self.engine = self._create_engine()` and reraised...
> I have non-scrapy external scripts that use the logging module

Can you please explain how they would be affected by Scrapy logging if they are external scripts?
I wouldn't call this "non-scrapy", "external", or a "script", and it works as intended for me. It's likely that you don't want to use `LOG_ENABLED=False` here. Does passing `install_root_handler=False` to your...
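To illustrate what a "root handler" means here: with `install_root_handler=True` (the `CrawlerProcess` default), Scrapy attaches a handler to the root logger, so records from *any* logger in the process, Scrapy-related or not, are emitted through it. Below is a minimal stdlib sketch of that mechanism only; the buffer, the `myscript` logger name, and the levels are illustrative choices, not Scrapy APIs:

```python
import io
import logging

# A "root handler" is just a handler attached to the root logger. Once one
# is installed, every logger in the process propagates records to it.
buf = io.StringIO()
logging.getLogger().addHandler(logging.StreamHandler(buf))
logging.getLogger().setLevel(logging.INFO)

# A module-level logger in an unrelated script now reaches the root handler.
logging.getLogger("myscript").info("from an external script")
print(buf.getvalue().strip())
```

Passing `install_root_handler=False` skips this step, leaving logging configuration entirely to your own code.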
> by external script I meant a simple thing just using the built-in module directly, 'logger.info()', rather than calling it within scrapy code like 'self.logger.info()'.

Using `CrawlerProcess`, as opposed to e.g. `CrawlerRunner`,...
My question still stands. If that's easier, you can provide a minimal reproducible example of your problem.
Assuming you didn't omit some custom logging configuration, this indeed won't log anything even after you remove all Scrapy-related parts. You can minimize it to `python3 -c 'import logging; logging.info("foo")'`.
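For context on why that one-liner prints nothing even without Scrapy: an unconfigured root logger has an effective level of WARNING, so INFO records are dropped before any handler (including the last-resort one) is consulted. A quick stdlib check:

```python
import logging

# With no logging configuration at all, the root logger has no handlers
# and its default effective level is WARNING, so INFO records are dropped.
root = logging.getLogger()
assert root.getEffectiveLevel() == logging.WARNING
assert not root.isEnabledFor(logging.INFO)  # logging.info("foo") emits nothing

# A single logging.basicConfig(level=logging.INFO) call before logging
# would make INFO records visible.
```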
> yes, it will not show anything

So what was your minimal reproducible example intended to reproduce?

> Anyway the issue is still there.

So far you weren't able to...
> this will not print when LOG_ENABLED=False

Or when it's True. But this helped me find the source of the confusion: by default Scrapy installs `scrapy.utils.log.LogCounterHandler` and so `logging.lastResort` is...
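For reference, this is how `logging.lastResort` interacts with installed handlers: it only fires when a record finds no handler anywhere up the logger hierarchy. A stdlib sketch of that interaction, using a plain `StreamHandler` as a stand-in for Scrapy's `LogCounterHandler`:

```python
import io
import logging

# When a record finds no handler anywhere up the logger hierarchy, Python
# falls back to logging.lastResort, a stderr handler with level WARNING.
assert isinstance(logging.lastResort, logging.Handler)
assert logging.lastResort.level == logging.WARNING

# Attaching any handler to the root logger (as Scrapy effectively does with
# LogCounterHandler) means lastResort no longer fires: records now go only
# to the installed handler.
buf = io.StringIO()
logging.getLogger().addHandler(logging.StreamHandler(buf))
logging.warning("handled by the installed handler, not lastResort")
assert "installed handler" in buf.getvalue()
```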
@K-Preetham-Reddy yes
@VARUN3WARE it's tagged "discuss", it's too early to provide PRs, sorry.