Dumping traffic bodies with an eCAP plugin, but the HTML is garbled.

Asked by hyr666

I modified a sample (adapter_mongo) to save every traffic body to a file.
Most of the code is based on the mongo sample, but the results are not good: many of the saved HTML bodies are garbled instead of readable HTML.
Yet for those same problem pages, if I fetch them through Squid with curl, the saved file is readable.
Is this caused by character encoding, gzip, or something else?
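One quick way to test the gzip hypothesis is to look at the first two bytes of a stored body file: gzip streams always start with the magic bytes 0x1f 0x8b. A minimal sketch (the dump-file path is hypothetical, not part of the adapter):

```python
import gzip

def looks_gzipped(path):
    # gzip streams begin with the two magic bytes 0x1f 0x8b
    with open(path, "rb") as f:
        return f.read(2) == b"\x1f\x8b"

def try_decode(path):
    # If the stored body is gzip-compressed, decompress it;
    # otherwise return the raw bytes unchanged.
    with open(path, "rb") as f:
        body = f.read()
    return gzip.decompress(body) if body[:2] == b"\x1f\x8b" else body
```

If `looks_gzipped()` returns True for the "garbled" files, the adapter is storing compressed bodies as-is.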

Question information

Language: English
Project: eCAP
Assignee: None
Alex Rousskov (rousskov) said :

I do not know what "adapter_mongo" is.

The combination of the HTTP Transfer-Encoding (*) and Content-Encoding message headers usually determines the message body encoding(s). Log/store those HTTP headers and you may better understand what body you are dealing with.
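The advice above can be sketched in code: once the Content-Encoding header is logged next to the body, decoding the stored bytes is mechanical. This is an illustrative sketch, not the eCAP API; the `decode_body` helper and its parameters are invented for the example:

```python
import gzip
import zlib

def decode_body(body, content_encoding):
    # Decode a stored response body according to its logged
    # Content-Encoding header value (None means no header).
    if not content_encoding:
        return body
    enc = content_encoding.strip().lower()
    if enc == "gzip":
        return gzip.decompress(body)
    if enc == "deflate":
        # Some servers send raw deflate without the zlib wrapper,
        # so fall back to a raw-deflate window if needed.
        try:
            return zlib.decompress(body)
        except zlib.error:
            return zlib.decompress(body, -zlib.MAX_WBITS)
    return body  # identity, or an encoding this sketch does not handle
```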

You can also examine the raw body (possibly chunk-encoded!) using Wireshark or similar traffic-sniffing tools. What you store ought to be similar to what Wireshark shows, with the exception of chunked encoding (*).

(*) The host application would normally strip at least the chunked encoding though.
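For reference, this is what removing the chunked transfer coding looks like: each chunk is a hex size line followed by that many bytes, terminated by a zero-size chunk (per RFC 7230). A minimal sketch for a raw captured body; it ignores chunk extensions and trailers:

```python
def dechunk(raw):
    # Remove HTTP/1.1 chunked transfer coding from a raw body
    # as captured off the wire (e.g. what Wireshark shows).
    out = bytearray()
    pos = 0
    while True:
        eol = raw.index(b"\r\n", pos)
        # Chunk size is hexadecimal; drop any ";ext" chunk extensions.
        size = int(raw[pos:eol].split(b";")[0], 16)
        if size == 0:
            break  # last-chunk; optional trailers follow
        start = eol + 2
        out += raw[start:start + size]
        pos = start + size + 2  # skip the chunk's trailing CRLF
    return bytes(out)
```

A host application such as Squid normally does this step for the adapter, which is why an eCAP adapter usually sees a de-chunked body.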
