Limit decoded body size by manually decoding the compressed content
This creates one (more) copy of the content if we limit the output
because Zlib and Bzip2 want to remove the consumed input from the
input string.
Also, this moves away from IO::Uncompress::Gunzip and IO::Uncompress::Bzip2
in favour of Compress::Raw::Zlib and Compress::Raw::Bzip2 because I
found no way to convince IO::Uncompress::Gunzip::gunzip to pass through
the appropriate limiting options.
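A minimal sketch of what limit-aware decompression with Compress::Raw::Zlib can look like. The function name and structure are hypothetical, not the PR's actual implementation; it only illustrates the LimitOutput/Bufsize mechanism and the consumed-input behaviour mentioned above:

```perl
use strict;
use warnings;
use Compress::Raw::Zlib qw(Z_OK Z_STREAM_END Z_BUF_ERROR WANT_GZIP);

# Inflate gzip data, refusing to produce more than $limit output bytes.
# LimitOutput makes each inflate() call stop once roughly Bufsize bytes
# of output exist, so a zipbomb cannot exhaust memory in one call.
sub gunzip_limited {
    my ($input, $limit) = @_;
    my ($inflater, $status) = Compress::Raw::Zlib::Inflate->new(
        WindowBits  => WANT_GZIP,
        LimitOutput => 1,
        Bufsize     => $limit,
    );
    die "inflate init failed: $status" unless $inflater;

    my $output = '';
    # inflate() removes the consumed bytes from $input — which is why the
    # caller needs to pass a copy if the compressed body must survive.
    while (length $input) {
        my $chunk;
        $status = $inflater->inflate(\$input, \$chunk);
        $output .= $chunk if defined $chunk;
        die "decoded content exceeds $limit bytes\n"
            if length($output) > $limit;
        last if $status == Z_STREAM_END;
        # With LimitOutput, Z_BUF_ERROR just means "call me again".
        die "inflation failed: $status"
            unless $status == Z_OK || $status == Z_BUF_ERROR;
    }
    return $output;
}
```

The explicit `$output .= $chunk` plus length check is the point of moving to Compress::Raw::Zlib: IO::Uncompress::Gunzip::gunzip offers no hook between inflate calls at which the accumulated size can be checked.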
The API is extended (but not yet documented) in three ways:
1) A global variable, $HTTP::Message::MAX_BODY_SIZE, to limit the
maximum size of ->decoded_content
2) An accessor, ->max_body_size, which can be set on individual
HTTP::Response objects
3) An optional parameter to ->decoded_content, which is certainly the
preferable option but requires cooperation from every call site of
->decoded_content.
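The three extensions imply a precedence order: an explicit argument to ->decoded_content should win over the per-object accessor, which in turn should win over the global default. A minimal sketch of that resolution logic, with a hypothetical helper name and a plain hashref standing in for an HTTP::Response (this is not the PR's actual code):

```perl
use strict;
use warnings;

# Stand-in for the global $HTTP::Message::MAX_BODY_SIZE.
our $MAX_BODY_SIZE;

# Hypothetical helper: pick the effective limit for one call.
# Precedence: per-call option > per-object setting > global default.
sub effective_max_body_size {
    my ($response, %opts) = @_;
    return $opts{max_body_size}       if defined $opts{max_body_size};
    return $response->{max_body_size} if defined $response->{max_body_size};
    return $MAX_BODY_SIZE;
}

$MAX_BODY_SIZE = 1_000;
my $res = { max_body_size => 500 };    # stand-in for an HTTP::Response
print effective_max_body_size($res), "\n";                        # 500
print effective_max_body_size($res, max_body_size => 10), "\n";   # 10
print effective_max_body_size({}), "\n";                          # 1000
```

The per-call parameter is the cleanest option precisely because nothing global or per-object has to be mutated, but as the commit notes, it only helps if every caller passes it.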
Output the Compress::Raw::Zlib version in case a test fails
We might be fine with version 2.061...
Up our prerequisite to 2.061 for the time being...
Update META.json as well...
Amend changes
* Eliminate use of wantarray() in ->max_body_size
* Eliminate use of vars.pm
Also handle Brotli (de)compression
Reindent to match source
Only run zipbomb tests if we have a recent version of Compress::Raw::Zlib
The Bufsize parameter was introduced in 2.060, so requiring 2.061
should be fairly safe here.
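Gating a test file on a module version is a standard Test::More idiom; a sketch of how the zipbomb tests might be guarded (the skip message and placeholder test are illustrative, not the PR's actual test file):

```perl
use strict;
use warnings;
use Test::More;

# Skip the whole file unless Compress::Raw::Zlib is new enough to
# support the Bufsize/LimitOutput combination (2.061 per the commit).
BEGIN {
    eval {
        require Compress::Raw::Zlib;
        Compress::Raw::Zlib->VERSION('2.061');
        1;
    } or plan skip_all =>
        "Compress::Raw::Zlib >= 2.061 required for zipbomb tests";
}

plan tests => 1;
ok(1, 'zipbomb tests would run here');
```

`plan skip_all` exits immediately with a passing TAP plan, so CI boxes with an old Compress::Raw::Zlib report a skip rather than a failure.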
Remove debugging comments, remove misleading comments
In #181, #181 (review)
Add Changes blurb
... mostly to pacify the gods of CI