Batching with array #108

Open · wants to merge 8 commits into new-index
Conversation

RCasatta (Collaborator) commented Sep 16, 2024

Currently electrs supports batching by separating requests with newlines; however, other implementations also support batching via a JSON array, as explained in the cited issue. This adds support for batching requests via a JSON array.

Closes Blockstream/esplora#516
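
Not the actual diff from this PR, just a minimal sketch of the idea, assuming serde_json: a line of input is parsed and, if it turns out to be a JSON array, each element is dispatched and the responses are collected into an array, matching JSON-RPC 2.0 batch semantics. `handle_line` and `handle_request` are hypothetical names standing in for the server's real handlers.

```rust
use serde_json::{json, Value};

/// Hypothetical stand-in for the server's real per-request handler.
fn handle_request(req: Value) -> Value {
    json!({
        "jsonrpc": "2.0",
        "id": req.get("id").cloned().unwrap_or(Value::Null),
        "result": null
    })
}

/// Dispatch one line of input either as a single request or as a JSON-array batch.
fn handle_line(line: &str) -> Result<Value, serde_json::Error> {
    let parsed: Value = serde_json::from_str(line)?;
    Ok(match parsed {
        // Array batching: handle every element, reply with an array of responses.
        Value::Array(reqs) => Value::Array(reqs.into_iter().map(handle_request).collect()),
        // Existing behaviour: one request per newline-separated line.
        single => handle_request(single),
    })
}
```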

shesek (Collaborator) commented Oct 18, 2024

Should we limit the number of requests per batch?

Review thread on the test code:

@@ -137,3 +140,44 @@ fn test_electrum() -> Result<()> {

Ok(())
}

/// Test the Electrum RPC server using an headless Electrum wallet
Comment (Collaborator):

The comment here should be updated to "using a raw TCP socket" or something similar.

RCasatta (Collaborator, Author) commented:

> Should we limit the number of requests per batch?

Do we have limits for "newline-separated" batch requests? Is it different here? How much do you propose as the limit here?

shesek (Collaborator) commented Oct 24, 2024

> Do we have limits for "newline-separated" batch requests? Is it different here?

We don't, but it is different here. Newline-separated requests are streamed, and individual responses are sent as they become available. We only hold up to 10 pending requests in the queue, then block and stop reading from the TCP socket until a pending request gets processed to free room (a bounded mpsc::sync_channel).

With array batching, all the requests and all the responses have to be buffered in memory at once, which could use significant resources and be a DoS vector if there are no limits in place.
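
Not the server's actual code, but a minimal standalone sketch of the backpressure behaviour described above, assuming a std::sync::mpsc::sync_channel with capacity 10: the sender blocks once ten requests are pending, which is what stops the reader from pulling more from the TCP socket until the worker catches up.

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

fn main() {
    // Bounded queue: at most 10 requests can be pending at once.
    let (tx, rx) = sync_channel::<String>(10);

    // Worker drains the queue, answering each request as it is processed.
    let worker = thread::spawn(move || {
        for req in rx {
            println!("processed: {req}");
        }
    });

    // Stands in for the TCP reader: `send` blocks whenever the queue is
    // full, so no further input would be read until room is freed.
    for i in 0..100 {
        tx.send(format!("request {i}")).unwrap();
    }
    drop(tx);
    worker.join().unwrap();
}
```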

RCasatta (Collaborator, Author) commented Oct 28, 2024

> How much do you propose as the limit here?

Do you think we can afford 20 as the limit for batch requests? I think it's a common number because it is used as the gap limit...
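
If the cap lands at 20, a guard along these lines could reject oversized array batches before anything gets buffered. This is a sketch only; `MAX_BATCH_SIZE` and the error shape are chosen here for illustration rather than taken from the PR.

```rust
use serde_json::{json, Value};

// Hypothetical constant; the limit discussed is 20 requests per array batch.
const MAX_BATCH_SIZE: usize = 20;

/// Reject array batches larger than the cap with a single JSON-RPC error
/// instead of buffering an unbounded number of requests and responses.
fn check_batch_size(reqs: &[Value]) -> Result<(), Value> {
    if reqs.len() > MAX_BATCH_SIZE {
        return Err(json!({
            "jsonrpc": "2.0",
            "id": null,
            "error": { "code": -32600, "message": "batch too large" }
        }));
    }
    Ok(())
}
```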

shesek (Collaborator) commented Nov 3, 2024

20 seems fine, yes 👍

However, it seems the tests are failing. The failure isn't directly related to the batching but to the TestRunner used to test it.

Commit message:

At the moment we cannot run tests concurrently because the servers would conflict with each other. This makes the test_electrum_raw test ignored by default and adds a CI step specifically for this test.
RCasatta (Collaborator, Author) commented Nov 4, 2024

I didn't notice this locally because I ran the test on its own; I think we don't support running multiple tests concurrently.
My solution is to mark the test as ignored by default and run it specifically.
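
A sketch of the shape this takes, assuming the standard #[ignore] attribute; the actual test signature in the PR may differ:

```rust
// Ignored by default so it does not run alongside the other tests, which
// would conflict over the shared test server; CI runs it on its own.
#[test]
#[ignore]
fn test_electrum_raw() -> Result<(), Box<dyn std::error::Error>> {
    // ... exercise the Electrum RPC server over a raw TCP socket ...
    Ok(())
}
```

The ignored test can then be invoked explicitly, for example with `cargo test test_electrum_raw -- --ignored`.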

May close: electrum rpc batch request not working