Merged
1 change: 1 addition & 0 deletions docs/CHANGELOG.md
Expand Up @@ -16,6 +16,7 @@ Test fixtures for use by clients are available for each release on the [Github r

#### `execute`

- ✨ Add transaction batching to avoid RPC overload when executing tests with many transactions. Transactions are now sent in configurable batches (default: 750) with progress logging. Use `--max-tx-per-batch` to configure the batch size ([#1907](https://github.com/ethereum/execution-specs/pull/1907)).
- ✨ `execute hive` and `execute remote` now defer funding of accounts until the minimum amount required to send the test transactions is calculated, in order to optimize the amount of Eth used to execute the tests ([#1822](https://github.com/ethereum/execution-specs/pull/1822)).
- ✨ Dynamically fetch gas prices from the network and update all transactions to use 1.5x the current values ([#1822](https://github.com/ethereum/execution-specs/pull/1822)).
- ✨ New `--dry-run` flag to calculate the amount of Eth that will be spent executing a test given the current network gas prices ([#1822](https://github.com/ethereum/execution-specs/pull/1822)).
Expand Down
32 changes: 32 additions & 0 deletions docs/running_tests/execute/index.md
Expand Up @@ -48,3 +48,35 @@ EOAs are funded after gas prices are determined, enabling accurate balance calcu
### Blob Transaction Support

Blob transactions are fully supported in execute mode, including automatic gas pricing for blob gas fees and validation via `engine_getBlobsVX` endpoints when the Engine RPC is available.

### Transaction Batching

When executing tests with many transactions (e.g., benchmark tests), the `execute` plugin automatically batches transactions to avoid overloading the RPC service (experimentally, RPC endpoints become unstable above roughly 1000 simultaneous requests). This is particularly important for large-scale tests that may generate hundreds or thousands of transactions.

**Default Behavior:**

- Transactions are sent in batches of up to 750 transactions by default
- Each batch is sent and confirmed before the next batch begins
- Progress logging shows batch number and transaction ranges
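The batch boundaries follow simple fixed-size chunking; a minimal sketch of the scheme (the helper name `batch_ranges` is illustrative, not part of the plugin):

```python
def batch_ranges(total_txs: int, batch_size: int = 750) -> list[tuple[int, int]]:
    """Return (start, end) index pairs that cover total_txs transactions
    in consecutive batches of at most batch_size each."""
    return [
        (start, min(start + batch_size, total_txs))
        for start in range(0, total_txs, batch_size)
    ]
```

For example, 1600 transactions with the default batch size produce batches covering indices 0-750, 750-1500, and 1500-1600; each range is sent and confirmed before the next begins.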

**CLI Configuration:**

The batch size can be configured via the `--max-tx-per-batch` option:

```bash
# Use smaller batches for slower RPC endpoints
execute --max-tx-per-batch 100 tests/

# Use larger batches for high-performance RPC endpoints
execute --max-tx-per-batch 1000 tests/
```

**Safety Threshold:**

A warning is logged when `max_transactions_per_batch` exceeds 1000, as this may cause RPC service instability or failures depending on the RPC endpoint's capacity.
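The check behind this warning is a straightforward threshold comparison; a minimal sketch (the function name `exceeds_safe_threshold` is illustrative, not part of the client API):

```python
# Threshold above which RPC endpoints have been observed to become unstable.
OVERLOAD_THRESHOLD = 1000

def exceeds_safe_threshold(batch_size: int) -> bool:
    """True when a configured batch size is large enough to risk
    RPC service instability, so a warning should be logged."""
    return batch_size > OVERLOAD_THRESHOLD
```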

**Use Cases:**

- **Benchmark tests**: Tests that measure gas consumption often generate many transactions
- **Stress testing**: When intentionally testing RPC endpoint limits
- **Slow RPC endpoints**: Reduce batch size to avoid timeouts on slower endpoints
16 changes: 16 additions & 0 deletions docs/running_tests/execute/remote.md
Expand Up @@ -205,6 +205,22 @@ Once the sender account is funded, the command will start executing tests one by

Note, however, that test transactions are not sent from the main sender account; each test sends them from its own unique account (the accounts returned by `pre.fund_eoa`).

### Transaction Batching

When executing tests that generate many transactions (such as benchmark tests), transactions are automatically batched to avoid overloading the RPC endpoint. By default, transactions are sent in batches of 750.

You can configure the batch size using the `--max-tx-per-batch` flag:

```bash
# Reduce batch size for slower RPC endpoints
uv run execute remote --fork=Prague --rpc-endpoint=https://rpc.endpoint.io --max-tx-per-batch 100 --rpc-seed-key 0x... --chain-id 12345

# Increase batch size for high-performance endpoints
uv run execute remote --fork=Prague --rpc-endpoint=https://rpc.endpoint.io --max-tx-per-batch 1000 --rpc-seed-key 0x... --chain-id 12345
```

A warning is logged when the batch size exceeds 1000, as this may cause RPC service instability.

### Use with Parallel Execution

If `execute` is run with the `-n=N` flag (equivalently `--sim-parallelism=N`) where N > 1, the tests are executed in parallel and each process has its own separate sender account. The amount swept from the seed account is therefore divided by the number of processes; take this into account both when setting the sweep amount and when funding the seed account.
Expand Down
Expand Up @@ -138,6 +138,17 @@ def pytest_addoption(parser: pytest.Parser) -> None:
default=False,
help="Don't send transactions, just print the minimum balance required per test.",
)
execute_group.addoption(
"--max-tx-per-batch",
action="store",
dest="max_tx_per_batch",
type=int,
default=None,
help=(
"Maximum number of transactions to send in a single batch to the RPC. "
"Default=750. Higher values may cause RPC instability."
),
)

report_group = parser.getgroup(
"tests", "Arguments defining html report behavior"
Expand Down Expand Up @@ -311,6 +322,12 @@ def dry_run(request: pytest.FixtureRequest) -> bool:
return request.config.getoption("dry_run")


@pytest.fixture(scope="session")
def max_transactions_per_batch(request: pytest.FixtureRequest) -> int | None:
"""Return the maximum number of transactions per batch, or None for default."""
return request.config.getoption("max_tx_per_batch")


@pytest.fixture(scope="session")
def default_max_fee_per_gas(
request: pytest.FixtureRequest,
Expand Down
Expand Up @@ -205,11 +205,13 @@ def __init__(
get_payload_wait_time: float,
initial_forkchoice_update_retries: int = 5,
transaction_wait_timeout: int = 60,
max_transactions_per_batch: int | None = None,
):
"""Initialize the Ethereum RPC client for the hive simulator."""
super().__init__(
rpc_endpoint,
transaction_wait_timeout=transaction_wait_timeout,
max_transactions_per_batch=max_transactions_per_batch,
)
self.fork = fork
self.engine_rpc = engine_rpc
Expand Down
Expand Up @@ -383,6 +383,7 @@ def eth_rpc(
session_fork: Fork,
transactions_per_block: int,
session_temp_folder: Path,
max_transactions_per_batch: int | None,
) -> EthRPC:
"""Initialize ethereum RPC client for the execution client under test."""
get_payload_wait_time = request.config.getoption("get_payload_wait_time")
Expand All @@ -395,4 +396,5 @@ def eth_rpc(
session_temp_folder=session_temp_folder,
get_payload_wait_time=get_payload_wait_time,
transaction_wait_timeout=tx_wait_timeout,
max_transactions_per_batch=max_transactions_per_batch,
)
Expand Up @@ -149,11 +149,16 @@ def eth_rpc(
session_fork: Fork,
transactions_per_block: int,
session_temp_folder: Path,
max_transactions_per_batch: int | None,
) -> EthRPC:
"""Initialize ethereum RPC client for the execution client under test."""
tx_wait_timeout = request.config.getoption("tx_wait_timeout")
if engine_rpc is None:
return EthRPC(rpc_endpoint, transaction_wait_timeout=tx_wait_timeout)
return EthRPC(
rpc_endpoint,
transaction_wait_timeout=tx_wait_timeout,
max_transactions_per_batch=max_transactions_per_batch,
)
get_payload_wait_time = request.config.getoption("get_payload_wait_time")
return ChainBuilderEthRPC(
rpc_endpoint=rpc_endpoint,
Expand All @@ -163,4 +168,5 @@ def eth_rpc(
session_temp_folder=session_temp_folder,
get_payload_wait_time=get_payload_wait_time,
transaction_wait_timeout=tx_wait_timeout,
max_transactions_per_batch=max_transactions_per_batch,
)
Expand Up @@ -80,6 +80,7 @@ def execute(
"""Execute the format."""
del fork
del engine_rpc

for block in self.blocks:
for tx in block:
if not isinstance(tx, NetworkWrappedTransaction):
Expand Down Expand Up @@ -131,6 +132,7 @@ def execute(
f"Transaction rejected as expected: {exc_info.value}"
)
else:
# Send transactions (batching is handled by eth_rpc internally)
eth_rpc.send_wait_transactions(signed_txs)
all_tx_hashes.extend([tx.hash for tx in signed_txs])

Expand Down
39 changes: 36 additions & 3 deletions packages/testing/src/execution_testing/rpc/rpc.py
Expand Up @@ -267,8 +267,12 @@ class EthRPC(BaseRPC):
within EEST based hive simulators.
"""

OVERLOAD_THRESHOLD: int = 1000
DEFAULT_MAX_TRANSACTIONS_PER_BATCH: int = 750

transaction_wait_timeout: int = 60
poll_interval: float = 1.0 # how often to poll for tx inclusion
max_transactions_per_batch: int = DEFAULT_MAX_TRANSACTIONS_PER_BATCH

gas_information_stale_seconds: int

Expand All @@ -283,6 +287,7 @@ def __init__(
transaction_wait_timeout: int = 60,
poll_interval: float | None = None,
gas_information_stale_seconds: int = 12,
max_transactions_per_batch: int | None = None,
**kwargs: Any,
) -> None:
"""
Expand Down Expand Up @@ -320,6 +325,19 @@ def __init__(
"blobBaseFee": 0.0,
}

# Transaction batching configuration
if max_transactions_per_batch is None:
max_transactions_per_batch = (
self.DEFAULT_MAX_TRANSACTIONS_PER_BATCH
)
self.max_transactions_per_batch = max_transactions_per_batch
if max_transactions_per_batch > self.OVERLOAD_THRESHOLD:
logger.warning(
f"max_transactions_per_batch ({max_transactions_per_batch}) exceeds "
f"the safe threshold ({self.OVERLOAD_THRESHOLD}). "
"This may cause RPC service instability or failures."
)

def config(self, timeout: int | None = None) -> EthConfigResponse | None:
"""
`eth_config`: Returns information about a fork configuration of the
Expand Down Expand Up @@ -707,10 +725,25 @@ def send_wait_transactions(
) -> List[Any]:
"""
Send list of transactions and waits until all of them are included in a
block.
block. Transactions are sent in batches to avoid RPC overload.
"""
self.send_transactions(transactions)
return self.wait_for_transactions(transactions)
results: List[Any] = []
batch_size = self.max_transactions_per_batch
total_txs = len(transactions)

for i in range(0, total_txs, batch_size):
batch = transactions[i : i + batch_size]
if total_txs > batch_size:
logger.info(
f"Sending transaction batch {i // batch_size + 1} "
f"({len(batch)} transactions, "
f"{i + 1}-{min(i + batch_size, total_txs)} "
f"of {total_txs})"
)
self.send_transactions(batch)
results.extend(self.wait_for_transactions(batch))

return results


class DebugRPC(EthRPC):
Expand Down