Bump scrapy from 1.7.3 to 2.6.2
Bumps scrapy from 1.7.3 to 2.6.2.
Release notes
Sourced from scrapy's releases.
2.6.2
Fixes a security issue around HTTP proxy usage, and addresses a few regressions introduced in Scrapy 2.6.0.
See the changelog.
2.6.1
Fixes a regression introduced in 2.6.0 that would unset the request method when following redirects.
2.6.0
- Security fixes for cookie handling (see details below)
- Python 3.10 support
- asyncio support is no longer considered experimental, and works out-of-the-box on Windows regardless of your Python version
- Feed exports now support `pathlib.Path` output paths and per-feed item filtering and post-processing (see the sketch after this list)
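As an illustration of the `pathlib.Path` feed export support, here is a minimal sketch; the spider name, output path, and format are hypothetical, not taken from this PR:

```python
from pathlib import Path

import scrapy


class FeedExampleSpider(scrapy.Spider):
    # Hypothetical spider showing a pathlib.Path output path in the
    # FEEDS setting, supported as of Scrapy 2.6.0.
    name = "feed_example"
    start_urls = ["https://www.example.com/"]  # hypothetical URL
    custom_settings = {
        "FEEDS": {
            Path("output/items.json"): {"format": "json"},
        },
    }

    def parse(self, response):
        yield {"url": response.url}
```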
Security bug fixes
When a `Request` object with cookies defined gets a redirect response causing a new `Request` object to be scheduled, the cookies defined in the original `Request` object are no longer copied into the new `Request` object.

If you manually set the `Cookie` header on a `Request` object and the domain name of the redirect URL is not an exact match for the domain of the URL of the original `Request` object, your `Cookie` header is now dropped from the new `Request` object.

The old behavior could be exploited by an attacker to gain access to your cookies. Please see the cjvr-mfj7-j4j8 security advisory for more information.
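To make the redirect behavior concrete, here is a minimal sketch, assuming the first URL redirects to a different domain; the spider name, URLs, and cookie values are hypothetical:

```python
import scrapy


class CookieRedirectSpider(scrapy.Spider):
    # Hypothetical spider illustrating the 2.6.0 cookie hardening.
    name = "cookie_redirect_example"

    def start_requests(self):
        # Suppose this URL responds with a redirect to another domain.
        # Before 2.6.0, the cookie below would be copied onto the
        # redirected request; as of 2.6.0 it is not, unless the redirect
        # target's domain matches the original request's domain.
        yield scrapy.Request(
            "https://www.example.com/login",  # hypothetical URL
            cookies={"session": "secret-token"},  # hypothetical cookie
            callback=self.parse,
        )

    def parse(self, response):
        self.logger.info("Final URL after redirects: %s", response.url)
```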
Note: it is still possible to enable the sharing of cookies between different domains with a shared domain suffix (e.g. `example.com` and any subdomain) by defining the shared domain suffix (e.g. `example.com`) as the cookie domain when defining your cookies. See the documentation of the `Request` class for more information.

When the domain of a cookie, either received in the `Set-Cookie` header of a response or defined in a `Request` object, is set to a [public suffix](https://publicsuffix.org/), the cookie is now ignored unless the cookie domain is the same as the request domain.

The old behavior could be exploited by an attacker to inject cookies from a controlled domain into your cookiejar that could be sent to other domains not controlled by the attacker. Please see the mfjm-vh54-3f96 security advisory for more information.
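Returning to the note above about shared domain suffixes, a minimal sketch of declaring the suffix as the cookie domain, using the list-of-dicts `cookies` form of `Request`; the URL and cookie values are hypothetical:

```python
import scrapy

# Declaring the shared suffix (example.com) as the cookie domain keeps the
# cookie shared between example.com and its subdomains after the 2.6.0
# hardening. The URL and cookie values below are hypothetical.
request = scrapy.Request(
    "https://www.example.com/",
    cookies=[
        {
            "name": "session",
            "value": "secret-token",
            "domain": "example.com",  # shared domain suffix
            "path": "/",
        }
    ],
)
```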
2.5.1
Security bug fix:
If you use `HttpAuthMiddleware` (i.e. the `http_user` and `http_pass` spider attributes) for HTTP authentication, any request exposes your credentials to the request target.

To prevent unintended exposure of authentication credentials to unintended domains, you must now also set a new spider attribute, `http_auth_domain`, and point it to the specific domain to which the authentication credentials must be sent.

If the `http_auth_domain` spider attribute is not set, the domain of the first request will be considered the HTTP authentication target, and authentication credentials will only be sent in requests targeting that domain.

If you need to send the same HTTP authentication credentials to multiple domains, you can use `w3lib.http.basic_auth_header` instead to set the value of the `Authorization` header of your requests.

If you really want your spider to send the same HTTP authentication credentials to any domain, set the `http_auth_domain` spider attribute to `None`.

Finally, if you are a user of scrapy-splash, know that this version of Scrapy breaks compatibility with scrapy-splash 0.7.2 and earlier. You will need to upgrade scrapy-splash to a later version for it to continue to work.
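A minimal sketch of the attributes involved, assuming a spider that authenticates against one domain via `HttpAuthMiddleware` and uses `w3lib.http.basic_auth_header` for a second one; the spider name, URLs, and credentials are hypothetical:

```python
import scrapy
from w3lib.http import basic_auth_header


class AuthExampleSpider(scrapy.Spider):
    # Hypothetical spider showing HttpAuthMiddleware usage after 2.5.1.
    name = "auth_example"
    http_user = "user"  # hypothetical credentials
    http_pass = "pass"
    # Credentials are only sent with requests targeting this domain.
    http_auth_domain = "api.example.com"  # hypothetical domain

    def start_requests(self):
        # Authenticated via HttpAuthMiddleware (domain matches).
        yield scrapy.Request("https://api.example.com/items")
        # For a different domain, set the Authorization header explicitly.
        yield scrapy.Request(
            "https://other.example.org/items",  # hypothetical URL
            headers={"Authorization": basic_auth_header("user", "pass")},
        )

    def parse(self, response):
        yield {"url": response.url}
```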
2.5.0
- Official Python 3.9 support
- Experimental HTTP/2 support
- New `get_retry_request()` function to retry requests from spider callbacks (see the sketch after this excerpt)
... (truncated)
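As a sketch of the `get_retry_request()` feature listed above, here is how it might be called from a spider callback; the spider name, URL, and retry condition are hypothetical:

```python
import scrapy
from scrapy.downloadermiddlewares.retry import get_retry_request


class RetryExampleSpider(scrapy.Spider):
    # Hypothetical spider retrying requests from a callback (Scrapy 2.5.0+).
    name = "retry_example"
    start_urls = ["https://www.example.com/"]  # hypothetical URL

    def parse(self, response):
        if not response.css("article"):  # hypothetical retry condition
            # Returns a retry copy of the request, or None once the
            # configured maximum number of retries is exhausted.
            retry = get_retry_request(
                response.request, spider=self, reason="empty page"
            )
            if retry is not None:
                yield retry
            return
        yield {"url": response.url}
```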
Changelog
Sourced from scrapy's changelog.
Scrapy 2.6.2 (2022-07-25)
Security bug fix:
When `HttpProxyMiddleware` processes a request with `proxy` metadata, and that `proxy` metadata includes proxy credentials, `HttpProxyMiddleware` sets the `Proxy-Authorization` header, but only if that header is not already set.

There are third-party proxy-rotation downloader middlewares that set different `proxy` metadata every time they process a request.

Because of request retries and redirects, the same request can be processed by downloader middlewares more than once, including both `HttpProxyMiddleware` and any third-party proxy-rotation downloader middleware.

These third-party proxy-rotation downloader middlewares could change the `proxy` metadata of a request to a new value, but fail to remove the `Proxy-Authorization` header from the previous value of the `proxy` metadata, causing the credentials of one proxy to be sent to a different proxy.

To prevent the unintended leaking of proxy credentials, the behavior of `HttpProxyMiddleware` is now as follows when processing a request:

- If the request being processed defines `proxy` metadata that includes credentials, the `Proxy-Authorization` header is always updated to feature those credentials.
- If the request being processed defines `proxy` metadata without credentials, the `Proxy-Authorization` header is removed unless it was originally defined for the same proxy URL. To remove proxy credentials while keeping the same proxy URL, remove the `Proxy-Authorization` header.
- If the request has no `proxy` metadata, or that metadata is a falsy value (e.g. `None`), the `Proxy-Authorization` header is removed.
- It is no longer possible to set a proxy URL through the `proxy` metadata but set the credentials through the `Proxy-Authorization` header. Set proxy credentials through the `proxy` metadata instead.
... (truncated)
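In practice this means proxy credentials belong in the `proxy` metadata URL itself, as in this minimal sketch; the target URL, proxy host, and credentials are hypothetical:

```python
import scrapy

# As of Scrapy 2.6.2, set proxy credentials through the proxy metadata,
# not through a manually crafted Proxy-Authorization header.
request = scrapy.Request(
    "https://www.example.com/",  # hypothetical target URL
    meta={"proxy": "https://user:pass@proxy.example.com:8080"},  # hypothetical proxy
)
```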
Commits
- `aecbccb` Bump version: 2.6.1 → 2.6.2
- `af7dd16` Merge pull request from GHSA-9x8m-2xpf-crp3
- `4205609` Fixed intersphinx references
- `e3e69d1` Pin documentation requirements (#5536)
- `54bfb96` Cover #5525 in the 2.6.2 release notes (#5535)
- `4ef7182` If TWISTED_REACTOR is None, reuse any pre-installed reactor (#5528)
- `1c1cd5d` Update the 2.6.2 release notes
- `84c29a2` Unset the release date of still-unreleased 2.6.2 (#5503)
- `b9b9422` Merge pull request #5482 from alexpdev/parse_help_msg
- `915c288` edit
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
You can disable automated security fix PRs for this repo from the Security Alerts page.