Remove the experimental name space from cookie parameters #144
base: main
Conversation
Codecov Report
Additional details and impacted files

@@ Coverage Diff @@
##              main     #144      +/-   ##
==========================================
- Coverage   98.18%   97.83%   -0.36%
==========================================
  Files          11       11
  Lines         828      877      +49
==========================================
+ Hits          813      858      +45
- Misses         15       19       +4
* If the ``COOKIES_ENABLED`` setting is ``True`` (default), automatic request
  parameter mapping now sets ``responseCookies`` to ``True`` and maps request
  cookies to ``requestCookies``.
This means that, by default, all Zyte API requests with automap will start including `"responseCookies": True` among their parameters. This is similar to what we do with `httpResponseHeaders`, but in this case the behavior also affects browser rendering and automatic extraction scenarios. The test expectation updates are a great way to get an idea of the impact.
This is a big one, and the reason I went for 0.13.0 instead of 0.12.3, and I wonder whether or not this is the right call. Should we implement an opt-in setting for `responseCookies`, or make it so that it is only added to requests if `requestCookies` is also added (either manually by the user or through automatic mapping)?
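A minimal sketch of the coupling suggested above: only opt into `responseCookies` when request cookies are actually being sent. The function name and signature are illustrative, not part of the actual scrapy-zyte-api code.

```python
def map_cookie_params(params, request_cookies, cookies_enabled=True):
    """Add cookie-related Zyte API parameters to *params* in place.

    Hypothetical helper: ``responseCookies`` is only requested when there
    are request cookies to map, instead of unconditionally.
    """
    if not cookies_enabled:
        return params
    if request_cookies:
        params["requestCookies"] = request_cookies
        # Only opt into response cookies when request cookies exist.
        params["responseCookies"] = True
    return params


params = map_cookie_params({}, [{"name": "session", "value": "x"}])
# params now includes both requestCookies and responseCookies.
empty = map_cookie_params({}, [])
# empty stays {} because there were no request cookies to map.
```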
scrapy_zyte_api/_params.py
):
    for field in list(self._unreported_deprecated_experimental_fields):
        if field in params["experimental"]:
            self._unreported_deprecated_experimental_fields.remove(field)
Went with this to warn only once. I wonder if it is overkill, or even whether it would be better to log a warning on every request, to encourage a fix and minimize the chance of users missing the warning message (i.e. if you are already ignoring a couple of warnings, you might not notice the extra one).
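The warn-once bookkeeping described above can be sketched as a set of not-yet-reported fields that shrinks as warnings are logged. Class and attribute names here are illustrative, not the plugin's actual implementation.

```python
import logging

logger = logging.getLogger(__name__)


class ExperimentalFieldReporter:
    """Log a deprecation warning at most once per experimental field."""

    def __init__(self, fields):
        # Fields for which a deprecation warning has not been logged yet.
        self._unreported = set(fields)

    def report(self, params):
        # Iterate over a copy so we can remove entries while looping.
        for field in list(self._unreported):
            if field in params.get("experimental", {}):
                self._unreported.remove(field)
                logger.warning(
                    "experimental.%s is deprecated; use %s instead.",
                    field,
                    field,
                )


reporter = ExperimentalFieldReporter({"cookieManagement"})
reporter.report({"experimental": {"cookieManagement": "auto"}})  # logs once
reporter.report({"experimental": {"cookieManagement": "auto"}})  # silent now
```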
scrapy_zyte_api/_params.py
logger.warning(
    f"Zyte API parameters for request {request} include "
    f"experimental.{field}, which is deprecated. Please, "
    f"replace it with {field}, both in request parameters "
    f"and in any response parsing logic that might rely "
    f"on the old parameter."
)
I went with logging warning messages (here and in all other places in this PR) instead of using `warnings.warn` with `DeprecationWarning`, mostly because it made testing easier (the tests already had support for warning logs). I also wonder if logging a warning might be better for Scrapy Cloud support, or other cloud systems. But no strong opinion.
To do: