Idea: wf:commit_content/1
#158
That's an interesting idea.
I'm admittedly not super sure the best way to deliver that (especially, as you say, with the other backends).
One method that could be employed to sort-of achieve this is to use the #delay_body{} element:
https://github.com/nitrogen/nitrogen_core/blob/rebar3/src/elements/other/element_delay_body.erl
The demo for it on nitrogenproject.com is currently breaking (I keep forgetting to fix it on live), but the way it works is as follows:
#delay_body{
    tag=load_slow_content
}.

delay_body_event(load_slow_content) ->
    get_some_data_from_some_slow_website(),
    #panel{text="This took so long to render, or retrieve from a slow website"}.
For a large, slow list of something, this also works: just drop in a batch of items at a time. Just like any postback, you can bind variables in the postback term and parse them out accordingly.
You could even, theoretically, use it to make a continuously loading list, something like this:
#delay_body{
    tag={get_100_items, 1}
}.

delay_body_event({get_100_items, FirstItem}) ->
    case get_100_items_from_database(FirstItem) of
        eod -> [];
        Items ->
            [
                render_items(Items),
                #delay_body{tag={get_100_items, FirstItem + 100}}
            ]
    end.
Then it would load and render 100 items at a time until get_100_items_from_database(FirstItem) returns eod.
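To make the chained example above concrete, here is a minimal, self-contained sketch of the data source it assumes. The module name and the fixed 250-row "table" are invented for illustration; a real implementation would query an actual database.

```erlang
%% Hypothetical data source for the chained #delay_body{} example:
%% returns up to 100 items starting at FirstItem, or eod when exhausted.
-module(paged_load).
-export([get_100_items_from_database/1]).

-define(TOTAL_ITEMS, 250).  % pretend the table holds 250 rows

get_100_items_from_database(FirstItem) when FirstItem > ?TOTAL_ITEMS ->
    eod;
get_100_items_from_database(FirstItem) ->
    Last = min(FirstItem + 99, ?TOTAL_ITEMS),
    lists:seq(FirstItem, Last).
```

With this source, the chained tags fire for FirstItem = 1, 101, and 201, and the fourth call returns eod, which stops the chain by rendering an empty list.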
Jesse Gumm
http://jessegumm.com
On May 19, 2023, at 9:19 am, Dennis Snell wrote:
This is a low-priority idea that came up during some conversations about React streaming and general streaming designs to minimize the latency from requesting a page and receiving the first bytes. In some cases, getting unfinished HTML to the browser can have a measurable impact on page load experience.
The fundamental problem is demonstrated in the following main() function:
main() -> [
    page_header(),
    #panel { body = [ slow_to_compute_function() ] },
    page_footer()
].
In this function, the body of the page takes a long time to generate, but the page header and footer are fast. Because we have to wait for main() to fully complete before sending any bytes to the browser, the browser sits waiting with no content.
A new function, wf:commit_content(HTML), could split up the rendering so that some HTML can be sent while waiting for the rest of the function to compute.
main() ->
    wf:commit_content(page_header()),
    [
        #panel { body = [ slow_to_compute_function() ] },
        page_footer()
    ].
In this version of the function, we're communicating to nitrogen that the page header is ready to send to the client and that no more HTTP headers will follow; everything else is the rest of the HTML document.
This is a fairly specific optimization, but it could be relevant for Nitrogen projects, especially those that send the Nitrogen SCRIPT tags towards the top of the HTML. A browser currently has to wait for the full response to generate, then download it, then load the JS, then render the page.
If such a commit function existed, it would be possible to send the head of the HTML to the browser, including some of the page structure (e.g. navigation menu and logo/header) and, importantly, a list of JavaScript files and stylesheets to load. This gives the browser the opportunity to start fetching the JavaScript files and some images and stylesheets before the server is done creating the page.
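For illustration, the first committed chunk might look something like this. The file names are hypothetical; the key point is that the chunk is deliberately incomplete HTML, so the browser can start fetching resources before the closing tags arrive.

```html
<!-- First chunk, sent via the proposed wf:commit_content/1: the browser
     can begin fetching these resources while the server keeps rendering. -->
<html>
<head>
  <link rel="stylesheet" href="/css/site.css">
  <link rel="preload" href="/images/logo.png" as="image">
  <script src="/nitrogen/nitrogen.min.js"></script>
</head>
<body>
  <div id="header"><!-- navigation menu and logo --></div>
  <!-- the rest of the document arrives once main() finishes -->
```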
I'm not sure if this can be done with all the supported backends, but with cowboy at least I believe it should be possible simply by sending data to the request process and then sending the rest after main() finishes and returns. Once the initial committed payload is sent, however, Nitrogen would have to ignore further calls that change the status code or send headers (wf:status_code/1, wf:content_type/1, wf:header/2, etc.).
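As a sketch of the underlying mechanism, here is roughly what this looks like in a plain cowboy 2.x handler, outside of Nitrogen entirely. The module and helper names are made up; the point is that cowboy_req:stream_reply/3 commits the status and headers immediately, after which only body data can follow, which is exactly the constraint described above.

```erlang
%% Hypothetical plain cowboy 2.x handler demonstrating early flush:
%% the header chunk goes out immediately; the slow body follows later
%% on the same response.
-module(early_flush_handler).
-export([init/2, page_header_html/0, slow_body_html/0]).

init(Req0, State) ->
    %% Status and headers are committed here; after this point they
    %% can no longer be changed.
    Req = cowboy_req:stream_reply(200,
        #{<<"content-type">> => <<"text/html">>}, Req0),
    ok = cowboy_req:stream_body(page_header_html(), nofin, Req),
    ok = cowboy_req:stream_body(slow_body_html(), fin, Req),
    {ok, Req, State}.

page_header_html() ->
    <<"<html><head><script src='/nitrogen/nitrogen.js'></script></head><body>">>.

slow_body_html() ->
    timer:sleep(1000),  % stand-in for slow_to_compute_function()
    <<"<div>slow content</div></body></html>">>.
```

A wf:commit_content/1 built on this would essentially move everything before the call into the first stream_body, and everything after into the second.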
That is correct, it wouldn't help with a curl request.
In your original statement, you mentioned sending a bunch of javascript files, but I misinterpreted that as a bunch of other javascript files rather than just the normal one on page-load.
I do see the value in what you've proposed.
I'm really unsure how this would work given the way the tags are rendered in wf_tags and wf_render_elements.
I suspect it would require the ability to send partially rendered content, or just sending whenever any top-level tag is fully rendered.
It's worth experimenting with, in any case.
Jesse Gumm
http://jessegumm.com
On May 20, 2023, at 1:11 am, Dennis Snell wrote:
If I understand this correctly, #delay_body{} ends up sending the rest of the HTML via communication after page load. I did experiment with this using wf:comet/1 and yeah that works fine too, gets an even-more progressive page load.
However, one major problem with this (unless I'm misreading #delay_body{}) is that a call to the page via curl demonstrates that you don't get the delayed content unless and until the Nitrogen JavaScript loads.
The approach with wf:commit_content/1 still sends the full page on a single HTTP call; it merely gets some of the response out to the browser earlier on the socket than the rest of the page. This stands in contrast to waiting until the entire page's HTML is ready and then sending it all at once. If you curl it you will still get the entire rendered page on that initial call.
Again, not a priority, but I wanted to toss the idea out here because I think it might be workable, and can help cut down the initial load and render and interactivity for pages.
Reading comprehension was not my strongest skill this weekend. You're right: whatever is wrapped in wf:commit_content would, by necessity, need to be fully rendered and then sent. I had been thinking about it as a configuration setting for continuous sending.
This will definitely require more thought, though, as one big challenge would be the templates. There might be a way to partially transmit templates with some reworking.
Jesse Gumm
http://jessegumm.com
On May 21, 2023, at 5:09 am, Dennis Snell wrote:
> given the way the tags are rendered in wf_tags and wf_render_elements
I think it would be the same way for rendering, as I don't think it would be viable to do this automatically. Rather, someone would need to decide, as in my example, that they want to render part of their content, commit it, and then continue generating additional content.
> In your original statement, you mentioned sending a bunch of javascript files, but I misinterpreted that as a bunch of other javascript files rather than just the normal one on page-load.
Either way is relevant, as are styles and anything that might get sent via <link rel="prefetch"> or <link rel="preload">.
cheers!