Forum

Chunked Encoding with FastCGI does not work as expected

Martin Lesser
13 June 2009, 21:37
I'm trying to stream chunked data from a FastCGI (py) application so the client is informed about the steps the FastCGI app is performing.

The returned data (except the header) is not delivered in chunks; instead, all results from the app are delivered once the whole job is finished. Is this data buffered by Hiawatha?

When I run the app directly it behaves as expected and sends the chunks as they are produced.

Hiawatha version: 6.14
Operating System: FreeBSD 7.2-RELEASE

TIA, Martin
Hugo Leisink
14 June 2009, 00:03
Hiawatha does indeed buffer data from (Fast)CGI applications. This is done to improve speed. Delivering data to the clients in the same chunks as received from the (Fast)CGI process is not part of the CGI or FastCGI specifications, so you should not depend on it.
Martin Lesser
14 June 2009, 09:42
Is there any possibility to change this? Or is there another/better approach for the situation where the client has to wait for results from the server but should be informed about the individual steps being performed on the server?

So that, for example, http://webpy.org/cookbook/streaming_large_files works as expected.
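For reference, the web.py cookbook approach linked above boils down to returning a generator that yields each chunk as it becomes ready; a minimal sketch (the step names are made up for illustration, and in web.py the generator would be the return value of a GET handler):

```python
def long_running_job():
    # Generator that yields one status line per step; with
    # Transfer-Encoding: chunked, each yielded string should go out
    # to the client as its own chunk.
    for step in ("connecting", "processing", "done"):
        # ... do the actual work for this step here ...
        yield "step: %s\n" % step

# Returning this generator from a web.py handler streams the chunks;
# here we just collect them to show what would be sent.
chunks = list(long_running_job())
```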
Martin Lesser
14 June 2009, 10:29
Solved: Looking through the code made it easy: by padding each chunk with blanks so that it reaches a size of > 2048 B, every chunk is sent to the client right away. Buffering only happens when a chunk is too small.
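In code, the workaround looks roughly like this (the 2048-byte threshold is what the Hiawatha 6.14 source suggested above; the helper name is mine):

```python
def pad_chunk(data, min_size=2048):
    # Hiawatha buffers (Fast)CGI output until it has accumulated
    # enough bytes, so pad small chunks with blanks to push them
    # past the threshold and force immediate delivery.
    if len(data) < min_size:
        data += " " * (min_size - len(data))
    return data

chunk = pad_chunk("step 1 finished\n")
```

The obvious downside is the wasted bandwidth: every small status message balloons to 2 KB on the wire.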
Hugo Leisink
14 June 2009, 12:51
Normally, such a thing is done via AJAX. The client sends a request which starts a process that takes some time. The response to that request contains JavaScript which polls the server for the status. When the result of that poll is 'ready', the JavaScript reloads the page, which then shows the result of the process. The poll results can of course contain progress information.

That's a more robust solution than yours. It's webserver independent.
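The server side of that polling scheme can be sketched in a few lines; the in-memory status store and the handler names are hypothetical (a real app would key the status per job and serve `poll()` from its own URL):

```python
import json

# Hypothetical shared status store, updated by the worker and read
# by the AJAX poll handler.
status = {"state": "running", "progress": 0}

def worker():
    # The long-running job updates its progress as it goes.
    for pct in (25, 50, 75, 100):
        status["progress"] = pct
    status["state"] = "ready"

def poll():
    # The poll handler just serializes the current status; the
    # client-side javascript reloads the page once state == "ready".
    return json.dumps(status)

worker()
result = json.loads(poll())
```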
This topic has been closed.