Flush pending text after every chunk of Tokens
If a very slow-loading page stalled in the middle of a
text node, we would not show the contents of that text
node until we encountered the first token after it.
This broke a web-based traceroute which used
http chunked encoding to keep the connection open
while it slowly piped the output of traceroute
over the http connection.
http://www.net.princeton.edu/cgi-bin/traceroute.pl?target=google.com&cmd=Go
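For context, a minimal sketch of the wire format involved: HTTP chunked transfer encoding frames each piece of data as a hex length, CRLF, the payload, and another CRLF, which lets a server like the traceroute CGI above keep the connection open and emit one hop at a time. (This helper and the sample hop lines are illustrative, not from the page above.)

```python
def encode_chunk(data: bytes) -> bytes:
    # One chunked-encoding frame: hex payload length, CRLF, payload, CRLF.
    return b"%x\r\n%s\r\n" % (len(data), data)

# A slow traceroute server would send each hop as its own chunk,
# then a zero-length chunk to terminate the response:
frames = [
    encode_chunk(b"1  gateway  1.2 ms\n"),
    encode_chunk(b"2  core-router  8.7 ms\n"),
    b"0\r\n\r\n",
]
```

Each frame arrives as a separate network buffer, so the parser sees the document grow one chunk at a time.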
This changes the behavior of our one test with a
gigantic text node by changing how the text is split.
I believe this also affected progressive loading of
text/plain documents. I'm slightly surprised
no one complained. I tested with:
http://norvig.com/big.txt
and we now progressively load that much better than
we do with dev channel chrome.
I'm not sure of a good way to test this that won't be flaky.
If we encounter anything other than more text node
content in the http stream we'll immediately emit
the pending text. For now I'm going to presume
the one affected test is sufficient for coverage.
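The behavior described above can be sketched as follows. This is an illustrative Python model, not Blink's actual C++ tokenizer: character data is buffered as before and still flushed when any non-character token arrives, but with this change the buffer is also flushed at the end of every network chunk, so slowly streamed text becomes visible without waiting for the next tag.

```python
class StreamingTextBuilder:
    """Toy model of the fix: flush pending text per chunk,
    not only when the next non-character token shows up."""

    def __init__(self):
        self.pending = []   # buffered character data
        self.emitted = []   # stands in for text visible in the DOM

    def process_chunk(self, tokens):
        """tokens: list of (kind, data) pairs from one network chunk."""
        for kind, data in tokens:
            if kind == "chars":
                self.pending.append(data)
            else:
                self._flush()            # old behavior flushed only here
                self.emitted.append(data)
        self._flush()                    # the fix: flush after every chunk

    def _flush(self):
        if self.pending:
            self.emitted.append("".join(self.pending))
            self.pending.clear()
```

With the old behavior, a chunk containing only character data would sit in `pending` until a later chunk delivered a tag; with the per-chunk flush, each traceroute hop line appears as soon as its chunk arrives. A side effect, matching the test-expectation change noted above, is that one logical run of text may now be split across several flushes.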
BUG=363974
Committed: https://src.chromium.org/viewvc/blink?view=rev&revision=172094
Patch Set 1 #
Total comments: 1
Patch Set 2 : Mark as NeedsRebaseline #
Patch Set 3 : add another NeedsRebaseline #
Messages
Total messages: 17 (0 generated)