
HTTP/2 Request and response prioritization #48

Open · LPardue opened this issue Apr 19, 2022 · 2 comments

Labels: enhancement (New feature or request)

@LPardue (Contributor) commented Apr 19, 2022

This is most relevant when probe requests are made on existing connections. The text touches on it:

> At the HTTP/2 layer it is important that the load-generating data is not interfering with the latency-measuring probes. For example, the different streams should not be stacked one after the other but rather be allowed to be multiplexed for optimal latency.

But I think you probably need to say more about this as other HTTP/2 implementations come into the mix.

@cpaasch (Contributor) commented Jul 7, 2022

@LPardue - what comes to your mind here? Do you have some suggestions?

@cpaasch cpaasch added the enhancement New feature or request label Jul 7, 2022
@LPardue (Contributor, Author) commented Jul 11, 2022

So I think it would help to clarify what is meant by "different streams stacking one after the other". Are you talking about the requests, the responses, or both?

For instance, if the client is doing a large upload while sending probe requests, it will want to prioritize sending the probe requests as soon as possible, in order to complete as many probes as possible within the test time period. The server has no real competition for sending responses to those probe requests.

In the reverse direction, if a client has requested a large download it can easily emit probe requests (no contention) but the server is going to want to respond to probes ASAP, so that it looks good.

Some servers follow client-provided signals of response prioritization. So you could recommend that probe requests are sent with high priority (using whatever scheme implementations choose).
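One concrete form such a client-provided signal could take is the Extensible Priorities scheme (RFC 9218), where a probe request carries the `priority` header with the highest urgency. A minimal sketch of building that field value (the function name is illustrative, not from any particular library):

```python
def priority_field_value(urgency: int = 3, incremental: bool = False) -> str:
    """Build an RFC 9218 `priority` header field value.

    urgency: 0 (highest priority) through 7 (lowest); the RFC's default is 3.
    incremental: whether the response can usefully be processed as it arrives.
    """
    if not 0 <= urgency <= 7:
        raise ValueError("urgency must be between 0 and 7")
    params = [f"u={urgency}"]
    if incremental:
        params.append("i")
    return ", ".join(params)

# A latency probe marked with the highest urgency, so servers that honor
# client signals schedule its response ahead of bulk transfer data:
probe_headers = {"priority": priority_field_value(urgency=0)}
```

Whether this helps depends entirely on the server: RFC 9218 makes the signal advisory, so implementations are free to ignore it.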

Scheduling of requests on the client side is not really standardized. Different browsers do different things, which can be based on content type or request context. Tuning that can be tricky or impossible. Somebody trying to send probes alongside large uploads might be surprised by how those things contend. So we should probably just highlight in the considerations that this can happen, and what problems to look out for (skewed results, for example).
