
Lelantus strip #1805

Open
levoncrypto wants to merge 12 commits into master from lelantus-strip

Conversation

Contributor

@levoncrypto levoncrypto commented Mar 20, 2026

Note

High Risk
High-impact removal of protocol and wallet code paths (Lelantus/HD mint tracking) plus consensus/init/spork touchpoints; risk is primarily from unintended behavior changes around transaction validation, wallet migration assumptions, and chain/index compatibility.

Overview
Removes the legacy Lelantus implementation and its wallet/HD mint tracking from the codebase, leaving Spark as the only privacy pool logic wired into batching, spork gating, and startup/init flows.

This drops Lelantus build targets and large swaths of core functionality (e.g., src/lelantus.cpp, src/hdmint/*), removes Lelantus-specific consensus fields (e.g., nLelantusV3PayloadStartBlock), and simplifies transaction handling so TRANSACTION_LELANTUS is effectively treated as a no-op special-tx type.

QA and docs are aligned with the removal: Lelantus RPC tests (and the bip47-sendreceive test that depended on Lelantus JoinSplits) are deleted from the test runner, Spark tests are adjusted (clean-chain setup, class rename), and wallet dump expectations are updated to reflect no generated sigma/hdmint keys.

Reviewed by Cursor Bugbot for commit 9f0b130. Bugbot is set up for automated code reviews on this repo.

Contributor

coderabbitai Bot commented Mar 20, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


Walkthrough

This PR removes the Lelantus protocol and associated HD-mint infrastructure and tests, strips liblelantus cryptographic/prover/verifier code, updates consensus/validation, RPC and UI surfaces to Spark-only behavior, and simplifies Spark proof verification to a synchronous flow.

Changes

| Cohort | File(s) | Summary |
| --- | --- | --- |
| Core Lelantus removal | src/lelantus.cpp, src/lelantus.h, src/liblelantus/*, src/hdmint/*, src/wallet/lelantusjoinsplitbuilder.*, src/primitives/mint_spend.* | Removed entire Lelantus implementation: cryptographic primitives, prover/verifier, joinsplit logic, HD-mint types (CHDMint/CMintPool/CHDMintTracker/CHDMintWallet), spend metadata, and many related headers/implementations. |
| Consensus & validation | src/validation.cpp, src/validation.h, src/consensus/params.h, src/chainparams.cpp, src/init.cpp, src/init.h | Removed Lelantus-specific checks/flags and VerifyPrivateTxOwn; changed CheckTransaction signature to drop lelantusTxInfo; removed nLelantusV3PayloadStartBlock field and related assignments. |
| Batching / proof container | src/batchproof_container.cpp, src/batchproof_container.h | Dropped Lelantus sigma/range proof batching and related APIs/state; container now only handles Spark spend batching. |
| Spark state & threadpool | src/spark/state.cpp, src/spark/state.h, src/spark/sparkwallet.cpp, src/spark/threadpool.h | Removed async in-progress proof scheduling and global proof thread-pool; CheckSparkBlock now takes block height; thread-pool construction/teardown and sparkwallet thread-safety adjusted. |
| RPC & CLI | src/rpc/misc.cpp, src/rpc/client.cpp, src/rpc/rawtransaction.cpp, src/rpc/rpcevo.cpp, src/wallet/rpcwallet.cpp | Removed many Lelantus RPCs or replaced them with stubs that throw, removed Lelantus param conversions/decoding and verifyprivatetxown, updated valid spork names to exclude Lelantus. |
| Mempool / miner / block assembly | src/txmempool.*, src/miner.cpp, src/txmempool.h | Removed Lelantus mempool bookkeeping/cleanup and miner block-level Lelantus spend limits/blacklisting; Spark limits retained. |
| Wallet core & helpers | src/hdmint/*, src/wallet/* (many), src/wallet/CMakeLists.txt, src/wallet/rpcdump.cpp, src/wallet/txbuilder.cpp | Removed HD-mint wallet, mintpool/tracker/wallet regeneration, join-split builder, many wallet RPCs and mint/import handling; CMake wallet build lists updated. |
| BIP47 / payload handling | src/bip47/account.cpp, src/bip47/secretpoint.cpp, qa/rpc-tests/bip47-sendreceive.py | Removed Lelantus-based masked payload acceptance (now returns false) and deleted the BIP47 RPC test. |
| QA tests & fixtures | qa/rpc-tests/*, src/test/*, src/wallet/test/*, src/test/CMakeLists.txt | Removed or updated many Lelantus-related RPC tests, unit tests, fixtures and test sources; adjusted a wallet-dump expected value. |
| Qt UI & resources | src/qt/bitcoingui.*, src/qt/optionsmodel.*, src/qt/optionsdialog.ui, src/qt/overviewpage.*, src/qt/walletmodel.*, src/qt/* | Removed Lelantus UI actions/options/migration dialogs/icon/styles/translations and related bindings; OverviewPage resize logic added; wallet UI methods removed/updated. |
| Transaction UI logic | src/qt/transactiondesc.cpp, src/qt/transactionrecord.cpp, src/qt/coincontroldialog.cpp, src/qt/sendcoinsdialog.cpp | Removed Lelantus-specific fee parsing and transaction-type handling; CoinControl now only treats Spark j-mints; removed migration gating on send. |
| Libraries & build | src/CMakeLists.txt, src/libspark/coin.h, src/libspark/grootle.h, src/libspark/spend_transaction.h | Removed many liblelantus sources from build lists; added sigma/spark coin types and headers in libspark; adjusted include ordering. |
| Special tx / spork / llmq | src/evo/specialtx.cpp, src/evo/spork.cpp/.h, src/llmq/quorums_instantsend.cpp, qa/rpc-tests/llmq-is-spark.py | Adjusted special-tx switch handling, removed Lelantus spork features/constants, limited InstantSend adaptation to Spark spends, renamed test class to LLMQ_IS_Spark. |
| Docs | README.md | Reworded README to replace explicit “Lelantus Spark” terminology with “Spark protocol” and removed extra audit mentions. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client as Client (RPC/UI)
    participant Node as Node (validation)
    participant Spark as SparkState
    participant Verifier as Verifier (sync)
    participant DB as Wallet/DB
    Note over Node,Spark: Previously used async threadpool/futures for proof checks
    Client->>Node: Submit transaction / ConnectBlock
    Node->>Spark: CheckSparkBlock(block, height)
    Spark->>Verifier: CheckSparkSpendTransaction(tx) (synchronous verify)
    Verifier-->>Spark: result (pass/fail)
    Spark->>DB: update checked-spend cache
    Node-->>Client: accept/reject
```

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120 minutes

Possibly related PRs

Suggested reviewers

  • psolstice
  • levonpetrosyan93


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 12

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (6)
src/qt/locale/bitcoin_zh_CN.ts (1)

874-880: ⚠️ Potential issue | 🟡 Minor

Keep this translation aligned with the source string.

Line 879 removes the Lelantus reference only in zh_CN, but the <source> text in this message still explicitly mentions it. That leaves this locale semantically different from the English fallback and the rest of the translations. If the UI copy is meant to be protocol-neutral now, please update the source string and regenerate the locale files instead of patching a single translation.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/qt/locale/bitcoin_zh_CN.ts` around lines 874 - 880, The zh_CN translation
drops the "Lelantus" reference but the English <source> string still mentions
it, causing a semantic mismatch; instead of editing this single translation,
update the canonical <source> string (the message starting "Sending funds to a
RAP address requires a notification transaction...") to the protocol-neutral
wording (remove or generalize the "Lelantus" mention) and then regenerate the
locale files so all translations (including src/qt/locale/bitcoin_zh_CN.ts) are
in sync; ensure the updated source text is used by the i18n
extraction/regeneration tooling so translators can update their locales.
src/chain.h (1)

17-25: ⚠️ Potential issue | 🟡 Minor

Duplicate include of libspark/coin.h.

The header libspark/coin.h is included twice: once at line 17 and again at line 25. This appears to be a leftover from replacing liblelantus/coin.h with libspark/coin.h. Remove one of the duplicate includes.

🔧 Proposed fix
```diff
 #include <secp256k1/include/GroupElement.h>
-#include "libspark/coin.h"
 #include "evo/spork.h"
 #include "firo_params.h"
 #include "util.h"
```

Or alternatively, remove line 25 if line 17 is the intended location.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/chain.h` around lines 17 - 25, Remove the duplicate include of
"libspark/coin.h" in the header list: keep a single `#include` "libspark/coin.h"
and delete the second occurrence so the file only includes that header once
(reference: the duplicated "libspark/coin.h" include in the top-of-file includes
block).
src/evo/spork.cpp (1)

54-68: ⚠️ Potential issue | 🔴 Critical

This removes historical Lelantus spork enforcement.

ConnectBlock() still sends blocks through sporkManager->IsTransactionAllowed() / IsBlockAllowed() inside the evo-spork window (src/validation.cpp:2875-2890), and validation.cpp:2700-2745 still routes Lelantus mints/join-splits through that path. Dropping the Lelantus branches here means old featureLelantus / transparent-limit sporks stop participating in revalidation, which is consensus-visible for historical blocks. Please keep the legacy checks until the chain no longer depends on those spork states.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/evo/spork.cpp` around lines 54 - 68, The change removed legacy Lelantus
spork checks from IsTransactionAllowed(const CTransaction &tx, const
ActiveSporkMap &sporkMap, CValidationState &state); restore the historical
branches so Lelantus mints/join-splits are still validated against
featureLelantus and any transparent-limit sporks during evo-spork window.
Specifically, in IsTransactionAllowed keep the original conditional branches
that check tx.IsLelantusMint()/tx.IsLelantusJoinSplit() (or equivalent methods
used previously) against sporkMap entries for CSporkAction::featureLelantus and
its transparent limit spork, mirroring how Spark handling is done (use
sporkMap.find(...) and compare amounts against limitSpork->second.second) so
ConnectBlock()/sporkManager->IsTransactionAllowed() continues to enforce legacy
Lelantus sporks for historical blocks.
src/spark/state.cpp (1)

689-815: ⚠️ Potential issue | 🟠 Major

Expensive Spark proof verification runs synchronously on mempool validation thread for recent transactions.

The variable scheduledAsync is declared on line 772 but never assigned a true value anywhere in the codebase. This means non-batched spends always execute spark::SpendTransaction::verify() synchronously on line 813.

Batching only activates for blocks older than 1 day (fCollectProofs set in validation.cpp lines 2670, 3781), which occurs during sync/reindex scenarios. For normal mempool validation of recent transactions, useBatching is false, forcing expensive proof verification inline on the validation thread. This weakens DoS resistance by allowing malicious transactions to monopolize CPU during proof verification.

The code comment on line 812 ("defense-in-depth") confirms this is a fallback after async dispatch was removed.

src/wallet/rpcwallet.cpp (1)

1668-1683: ⚠️ Potential issue | 🔴 Critical

Don’t drop transparent funds from gettotalbalance.

This now returns only Spark available balance, so transparent UTXOs disappear from the RPC result, and wallets without a Spark wallet will error instead of reporting their total funds.

Suggested fix

```diff
 UniValue gettotalbalance(const JSONRPCRequest& request)
 {
     CWallet * const pwallet = GetWalletForJSONRPCRequest(request);
     if (!EnsureWalletIsAvailable(pwallet, request.fHelp)) {
         return NullUniValue;
@@
-    EnsureSparkWalletIsAvailable();
     LOCK2(cs_main, pwallet->cs_wallet);
 
-    return  ValueFromAmount(pwallet->sparkWallet->getAvailableBalance());
+    CAmount sparkBalance = 0;
+    if (pwallet->sparkWallet) {
+        sparkBalance = pwallet->sparkWallet->getAvailableBalance();
+    }
+
+    return ValueFromAmount(pwallet->GetBalance() + sparkBalance);
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/wallet/rpcwallet.cpp` around lines 1668 - 1683, The current
implementation always calls EnsureSparkWalletIsAvailable() and returns only
pwallet->sparkWallet->getAvailableBalance(), which drops transparent UTXOs and
errors when no Spark wallet exists; change the logic to compute a total =
transparent_balance + spark_balance where transparent_balance is obtained from
the wallet (use pwallet->GetBalance() or the wallet's appropriate method to read
confirmed transparent UTXOs) and spark_balance is 0 if pwallet->sparkWallet is
null otherwise pwallet->sparkWallet->getAvailableBalance(); remove or skip the
EnsureSparkWalletIsAvailable() call so absence of a Spark wallet doesn't error,
sum both amounts and return ValueFromAmount(total).
src/validation.cpp (1)

2337-2344: ⚠️ Potential issue | 🟠 Major

Historical Lelantus fees still need to reach nFees for supply accounting.

Using the coinbase value as the reward cap fixes the overpayment check, but it does not fix the later AddTotalSupply(block.vtx[0]->GetValueOut() - nFees) / disconnect mirror. With Lelantus join-split fees no longer added to nFees, a reindex or reorg on an -addressindex node will overstate supply by the sum of those historical fees.

Also applies to: 2801-2809

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/validation.cpp` around lines 2337 - 2344, The code currently skips adding
fees for Spark/Lelantus spends when spark::ParseSparkSpend(tx) throws, which
causes supply accounting mismatches (AddTotalSupply(block.vtx[0]->GetValueOut()
- nFees)). Fix by adding a fallback that still increments nFees for historical
Lelantus join-split spends when tx.IsSparkSpend() but ParseSparkSpend throws:
inside the catch block compute the historical fee (e.g. via a dedicated helper
such as spark::GetHistoricalLelantusFee(tx) or by deriving fee from the
transaction’s inputs/outputs in the legacy format) and add that value to nFees
so AddTotalSupply sees the same total fees; ensure the fallback helper is
referenced/implemented and used wherever spark::ParseSparkSpend(tx).getFee() is
used.
🧹 Nitpick comments (5)
qa/rpc-tests/llmq-is-spark.py (1)

43-43: Optional: Remove unnecessary semicolon.

Python doesn't require semicolons after statements. While valid, removing it would be more idiomatic.

♻️ Proposed fix

```diff
-            break;
+            break
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@qa/rpc-tests/llmq-is-spark.py` at line 43, Remove the unnecessary semicolon
after the break statement in qa/rpc-tests/llmq-is-spark.py (the line containing
"break;"); simply change the "break;" to "break" to follow Python idioms and
eliminate the redundant character within the loop/block where break is used.
src/llmq/quorums_instantsend.cpp (1)

1598-1604: Inconsistent call sites still reference IsLelantusJoinSplit().

With Lelantus support removed from AdaptPrivateTx, the unchanged call sites at lines 1016 and 1525 now have dead code:

```cpp
// Line 1016:
CTransaction const & tx{(tx_.IsLelantusJoinSplit() || tx_.IsSparkSpend()) ? isutils::AdaptPrivateTx(tx_) : tx_};

// Line 1525:
CTransaction const & tx{(tx_.IsLelantusJoinSplit() || tx_.IsSparkSpend()) ? isutils::AdaptPrivateTx(tx_) : tx_};
```

The IsLelantusJoinSplit() check is now meaningless since AdaptPrivateTx returns the transaction unchanged for non-Spark spends. These should be simplified to only check IsSparkSpend().

♻️ Suggested simplification for lines 1016 and 1525

```diff
-CTransaction const & tx{(tx_.IsLelantusJoinSplit() || tx_.IsSparkSpend()) ? isutils::AdaptPrivateTx(tx_) : tx_};
+CTransaction const & tx{tx_.IsSparkSpend() ? isutils::AdaptPrivateTx(tx_) : tx_};
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/llmq/quorums_instantsend.cpp` around lines 1598 - 1604, Call sites still
test tx_.IsLelantusJoinSplit() even though AdaptPrivateTx only handles Spark
spends; update both usages that currently do (tx_.IsLelantusJoinSplit() ||
tx_.IsSparkSpend()) ? isutils::AdaptPrivateTx(tx_) : tx_ to only check
tx_.IsSparkSpend(), i.e. replace the whole boolean condition with
tx_.IsSparkSpend(), so AdaptPrivateTx is only invoked for Spark spends and the
redundant Lelantus check is removed (references: AdaptPrivateTx,
isutils::AdaptPrivateTx, IsLelantusJoinSplit, IsSparkSpend).
src/wallet/txbuilder.cpp (1)

105-106: Comment text is inaccurate for the actual initialization.

The code initializes fee with payTxFee.GetFeePerK(), so it does not start with “no fee.” Please adjust the comment wording.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/wallet/txbuilder.cpp` around lines 105 - 106, The comment above the fee
loop is misleading: it says "Start with no fee" but the loop actually
initializes fee with payTxFee.GetFeePerK(); update the comment to accurately
reflect that fee starts at payTxFee.GetFeePerK() (e.g., "Start with base fee
from payTxFee.GetFeePerK() and loop until there is enough fee") so readers can
match the comment to the initialization in the for(...) that assigns fee.
src/libspark/coin.h (1)

237-240: Prefer qualified names over using namespace in this header.

This is a public header, so adding another namespace-wide import makes symbol collisions harder to reason about for every includer. Qualifying secp_primitives::GroupElement directly is safer here. As per coding guidelines "Avoid using namespace declarations".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/libspark/coin.h` around lines 237 - 240, Remove the "using namespace
secp_primitives;" declaration from the header and replace usages of unqualified
secp_primitives symbols with fully qualified names (e.g., change GroupElement to
secp_primitives::GroupElement and similarly qualify any other types/functions
from secp_primitives referenced in this file, including inside namespace
lelantus and in class/struct/function declarations like any constructors,
methods, or typedefs that currently rely on the using directive).
src/qt/overviewpage.cpp (1)

201-253: Prefer layout policies over per-resize fixed geometry here.

This handler hard-codes child widths/heights from the top-level window size and clamps text labels to 20px tall. That is brittle for translations and HiDPI font metrics, and it will fight Qt's layout engine on every resize. Prefer stretch factors and size policies in the .ui/layout instead. Please verify this page with long translations and 125%-200% scaling before merging.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/qt/overviewpage.cpp` around lines 201 - 253, The resize handler
(OverviewPage::resizeEvent) is forcing fixed widths/heights on widgets
(ui->labelBalance, ui->labelUnconfirmed, ui->anonymizeButton, etc.) which fights
Qt layouts and breaks translations/HiDPI; remove per-resize geometry adjustments
and instead configure the layout in the .ui: set appropriate
QHBoxLayout/QVBoxLayout stretch factors, use
QSizePolicy::Preferred/Minimum/Expanding for labels and buttons, and avoid
setFixedWidth/setFixedHeight calls (replace with setMinimumWidth only where
truly needed); keep or adapt adjustTextSize only if it computes font scaling
correctly for HiDPI and translations, and after making layout changes verify the
page with long translations and 125%–200% scaling to confirm no clipping.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@README.md`:
- Line 11: Replace the unhyphenated compound adjective "privacy focused" with
the hyphenated form "privacy-focused" in the README sentence containing the Firo
description (the phrase "privacy focused cryptocurrency" should be updated to
"privacy-focused cryptocurrency"); update the one occurrence of that phrase to
maintain correct grammar and readability.

In `@src/bip47/account.cpp`:
- Around line 251-256: The change unconditionally returns false from
CAccountReceiver::acceptMaskedPayload, disabling BIP47 notifications; restore
behavior by either re-routing callers in src/wallet/wallet.cpp to the surviving
overload acceptMaskedPayload(maskedPayload, outpoint, outpointPubkey) or
implement a Spark-era body for CAccountReceiver::acceptMaskedPayload that parses
the maskedPayload and invokes acceptPcode() (and flips the same success/path the
original did) so the wallet persists channels and derives addresses; update the
wallet.cpp callsite(s) to use the outpoint/outpointPubkey overload if Lelantus
parsing is truly removed, or reintroduce equivalent parsing and success handling
in CAccountReceiver::acceptMaskedPayload so acceptPcode() runs.

In `@src/libspark/coin.h`:
- Around line 176-193: The Serialize/Unserialize in Coin (functions Serialize
and template Unserialize) currently reserve sizeof(int32_t) but only copy the
single-byte CoinDenomination (std::uint8_t) into that space, leaking
uninitialized padding; fix by writing/reading an explicit int32_t value for
denomination (e.g. cast/assign denomination into a local int32_t and memcpy that
into buffer before s.write, and read into an int32_t then assign back to
denomination in Unserialize) so the serialized layout is deterministic and
matches the legacy 4-byte layout referenced (keep GroupElement::memoryRequired
and value.serialize/value.deserialize usage unchanged).
- Around line 172-197: The sigma namespace code references GroupElement and
Scalar but those names are out of scope inside namespace sigma; qualify them as
secp_primitives::GroupElement and secp_primitives::Scalar (or add using
secp_primitives::GroupElement; using secp_primitives::Scalar; at the top of
namespace sigma) for PublicCoin, CScalarHash, and spend_info_container; also fix
PublicCoin::Serialize() and Unserialize() so the denomination size matches what
is written/read (either allocate size + sizeof(CoinDenomination) and write
exactly size + sizeof(CoinDenomination), or explicitly zero/pack a 4-byte
int32_t for denomination consistently in both Serialize and Unserialize) so no
uninitialized bytes are written/read.

In `@src/llmq/quorums_instantsend.cpp`:
- Around line 493-502: The loop in the IsSparkSpend check repeatedly acquires
LOCK(cs_main) for every CTxIn; hoist the LOCK(cs_main) so it is taken once
before iterating tx.vin to avoid repeated lock acquisition. Specifically, around
the block using lTag.deserialize(&in.scriptSig.front()) and
spark::CSparkState::GetState()->IsUsedLTag(lTag), take LOCK(cs_main) once
(outside the for loop), then iterate through CTxIn entries and call IsUsedLTag,
returning false if any tag is used and true afterwards.

In `@src/rpc/misc.cpp`:
- Around line 1075-1098: The RPC handlers getanonymityset, getmintmetadata,
getusedcoinserials, and getlatestcoinid currently remain listed in the public
commands[] table but always throw, so they should not be advertised; update the
commands[] registration to either remove these entries (if fully removed) or
mark them as hidden/deprecated (using the existing RPC registration
flags/mechanism your project uses) so they no longer appear in capability
discovery/help, and ensure the implementations (getanonymityset,
getmintmetadata, getusedcoinserials, getlatestcoinid) are either unregistered or
updated to use the deprecation path rather than being publicly advertised as
supported.

In `@src/spark/sparkwallet.cpp`:
- Line 85: The Spark payload parsing and wallet recovery are running
synchronously on the validation path (mempool admission / block connection) and
performing CWalletDB I/O there; change these code paths to offload heavy work to
the background thread pool instead of blocking validation. Specifically, ensure
threadPool is initialized (make threadPool non-null) and replace direct calls
that parse Spark payloads, identify/recover coins, or perform synchronous
CWalletDB writes (the methods invoked from src/validation.cpp) with tasks posted
to threadPool; move wallet DB writes and full-view-key recovery logic into those
background tasks and return quickly from the validation-call sites so validation
does not perform wallet I/O or long-running recovery synchronously.

In `@src/spark/state.cpp`:
- Around line 473-488: The block total uses size_t which can overflow on 32-bit
builds; change the accumulator blockSpendsValue in CheckSparkBlock to use
CAmount (int64_t) to match GetSpendTransparentAmount() and
consensus.GetMaxValueSparkSpendPerBlock()/GetMaxValueSparkSpendPerTransaction(),
and keep all comparisons using cmp::greater or equivalent overloads that accept
CAmount to avoid architecture-dependent overflow; update any related variable
names/usages in CheckSparkBlock accordingly.

In `@src/validation.cpp`:
- Around line 696-702: Remove the consensus behavior depending on local
sync/param gates: in the Sigma check (the if using
::Params().GetConsensus().nLelantusStartBlock) drop the "nLelantusStartBlock >
1" condition so the rejection only depends on !isVerifyDB and the height
comparison; in the Lelantus checks for
tx.IsLelantusJoinSplit()/tx.IsLelantusMint() remove the
"!IsInitialBlockDownload()" clause so the state.DoS(...) decision only depends
on !isVerifyDB and the nHeight >= (::Params().GetConsensus().nSparkStartBlock +
5) comparison. Ensure you edit the if conditions around state.DoS(...) and leave
isVerifyDB as the historical/reindex escape hatch.
- Around line 4197-4200: The code currently sets block.fChecked when fCheckPOW
&& fCheckMerkleRoot before calling spark::CheckSparkBlock, which causes
Spark-specific failures to be cached; change the logic so that block.fChecked is
only set after spark::CheckSparkBlock(state, block, nHeight) returns
successfully (i.e., move the block.fChecked = true assignment to after the
spark::CheckSparkBlock call and only set it if that call succeeds and no error
state is returned).

In `@src/wallet/rpcdump.cpp`:
- Around line 578-580: The call to sparkWallet->ListSparkMints() inside the
fMintUpdate branch is a no-op because CSparkWallet::ListSparkMints() is a const
accessor that returns a vector which is ignored; either remove this call or use
its return value (e.g., auto mints = sparkWallet->ListSparkMints()) if you
intend to inspect mints, and do not perform a second full rescan via
ScanForWalletTransactions(chainActive.Genesis()) unless necessary; if
post-import Spark bookkeeping (rebuild/sync/refresh) is required, replace the
ListSparkMints() call with the appropriate Spark API method on sparkWallet that
performs that bookkeeping (rather than the read-only accessor) while keeping the
ScanForWalletTransactions usage limited to the intended rescan flow.

In `@src/wallet/rpcwallet.cpp`:
- Around line 65-70: EnsureSparkWalletIsAvailable currently throws
JSONRPCError(RPC_WALLET_ERROR, "spark mint is not allowed for legacy wallet")
which is misleading for non-mint RPCs; update the thrown error message in
EnsureSparkWalletIsAvailable (the check using pwalletMain and
pwalletMain->sparkWallet) to a generic Spark-unavailable message (e.g. "spark
wallet is not available for this wallet") while keeping the same JSONRPCError
type and RPC_WALLET_ERROR code so all callers (balance/signing/address queries
and mint) receive a correct, generic error.

---

Outside diff comments:
In `@src/chain.h`:
- Around line 17-25: Remove the duplicate include of "libspark/coin.h" in the
header list: keep a single `#include` "libspark/coin.h" and delete the second
occurrence so the file only includes that header once (reference: the duplicated
"libspark/coin.h" include in the top-of-file includes block).

In `@src/evo/spork.cpp`:
- Around line 54-68: The change removed legacy Lelantus spork checks from
IsTransactionAllowed(const CTransaction &tx, const ActiveSporkMap &sporkMap,
CValidationState &state); restore the historical branches so Lelantus
mints/join-splits are still validated against featureLelantus and any
transparent-limit sporks during evo-spork window. Specifically, in
IsTransactionAllowed keep the original conditional branches that check
tx.IsLelantusMint()/tx.IsLelantusJoinSplit() (or equivalent methods used
previously) against sporkMap entries for CSporkAction::featureLelantus and its
transparent limit spork, mirroring how Spark handling is done (use
sporkMap.find(...) and compare amounts against limitSpork->second.second) so
ConnectBlock()/sporkManager->IsTransactionAllowed() continues to enforce legacy
Lelantus sporks for historical blocks.

In `@src/qt/locale/bitcoin_zh_CN.ts`:
- Around line 874-880: The zh_CN translation drops the "Lelantus" reference but
the English <source> string still mentions it, causing a semantic mismatch;
instead of editing this single translation, update the canonical <source> string
(the message starting "Sending funds to a RAP address requires a notification
transaction...") to the protocol-neutral wording (remove or generalize the
"Lelantus" mention) and then regenerate the locale files so all translations
(including src/qt/locale/bitcoin_zh_CN.ts) are in sync; ensure the updated
source text is used by the i18n extraction/regeneration tooling so translators
can update their locales.

In `@src/validation.cpp`:
- Around line 2337-2344: The code currently skips adding fees for Spark/Lelantus
spends when spark::ParseSparkSpend(tx) throws, which causes supply accounting
mismatches (AddTotalSupply(block.vtx[0]->GetValueOut() - nFees)). Fix by adding
a fallback that still increments nFees for historical Lelantus join-split spends
when tx.IsSparkSpend() but ParseSparkSpend throws: inside the catch block
compute the historical fee (e.g. via a dedicated helper such as
spark::GetHistoricalLelantusFee(tx) or by deriving fee from the transaction’s
inputs/outputs in the legacy format) and add that value to nFees so
AddTotalSupply sees the same total fees; ensure the fallback helper is
referenced/implemented and used wherever spark::ParseSparkSpend(tx).getFee() is
used.

In `@src/wallet/rpcwallet.cpp`:
- Around line 1668-1683: The current implementation always calls
EnsureSparkWalletIsAvailable() and returns only
pwallet->sparkWallet->getAvailableBalance(), which drops transparent UTXOs and
errors when no Spark wallet exists; change the logic to compute a total =
transparent_balance + spark_balance where transparent_balance is obtained from
the wallet (use pwallet->GetBalance() or the wallet's appropriate method to read
confirmed transparent UTXOs) and spark_balance is 0 if pwallet->sparkWallet is
null otherwise pwallet->sparkWallet->getAvailableBalance(); remove or skip the
EnsureSparkWalletIsAvailable() call so absence of a Spark wallet doesn't error,
sum both amounts and return ValueFromAmount(total).

---

Nitpick comments:
In `@qa/rpc-tests/llmq-is-spark.py`:
- Line 43: Remove the unnecessary semicolon after the break statement in
qa/rpc-tests/llmq-is-spark.py: change "break;" to "break" to follow Python
idiom and drop the redundant character.

In `@src/libspark/coin.h`:
- Around line 237-240: Remove the "using namespace secp_primitives;" declaration
from the header and replace usages of unqualified secp_primitives symbols with
fully qualified names (e.g., change GroupElement to
secp_primitives::GroupElement and similarly qualify any other types/functions
from secp_primitives referenced in this file, including inside namespace
lelantus and in class/struct/function declarations like any constructors,
methods, or typedefs that currently rely on the using directive).

In `@src/llmq/quorums_instantsend.cpp`:
- Around line 1598-1604: Call sites still test tx_.IsLelantusJoinSplit() even
though AdaptPrivateTx only handles Spark spends; update both usages that
currently do (tx_.IsLelantusJoinSplit() || tx_.IsSparkSpend()) ?
isutils::AdaptPrivateTx(tx_) : tx_ to only check tx_.IsSparkSpend(), i.e.
replace the whole boolean condition with tx_.IsSparkSpend(), so AdaptPrivateTx
is only invoked for Spark spends and the redundant Lelantus check is removed
(references: AdaptPrivateTx, isutils::AdaptPrivateTx, IsLelantusJoinSplit,
IsSparkSpend).

In `@src/qt/overviewpage.cpp`:
- Around line 201-253: The resize handler (OverviewPage::resizeEvent) is forcing
fixed widths/heights on widgets (ui->labelBalance, ui->labelUnconfirmed,
ui->anonymizeButton, etc.) which fights Qt layouts and breaks
translations/HiDPI; remove per-resize geometry adjustments and instead configure
the layout in the .ui: set appropriate QHBoxLayout/QVBoxLayout stretch factors,
use QSizePolicy::Preferred/Minimum/Expanding for labels and buttons, and avoid
setFixedWidth/setFixedHeight calls (replace with setMinimumWidth only where
truly needed); keep or adapt adjustTextSize only if it computes font scaling
correctly for HiDPI and translations, and after making layout changes verify the
page with long translations and 125%–200% scaling to confirm no clipping.

In `@src/wallet/txbuilder.cpp`:
- Around line 105-106: The comment above the fee loop is misleading: it says
"Start with no fee" but the loop actually initializes fee with
payTxFee.GetFeePerK(); update the comment to accurately reflect that fee starts
at payTxFee.GetFeePerK() (e.g., "Start with base fee from payTxFee.GetFeePerK()
and loop until there is enough fee") so readers can match the comment to the
initialization in the for(...) that assigns fee.

📥 Commits

Reviewing files that changed from the base of the PR and between eefaa0a and 20de5f9.

⛔ Files ignored due to path filters (1)
  • contrib/flathub/org.firo.firo-qt.metainfo.xml is excluded by !**/*.xml
📒 Files selected for processing (144)
  • README.md
  • qa/pull-tester/rpc-tests.py
  • qa/rpc-tests/bip47-sendreceive.py
  • qa/rpc-tests/lelantus_mint.py
  • qa/rpc-tests/lelantus_mintspend.py
  • qa/rpc-tests/lelantus_setmintstatus_validation.py
  • qa/rpc-tests/lelantus_spend_gettransaction.py
  • qa/rpc-tests/llmq-is-spark.py
  • qa/rpc-tests/wallet-dump.py
  • src/CMakeLists.txt
  • src/batchproof_container.cpp
  • src/batchproof_container.h
  • src/bip47/account.cpp
  • src/bip47/secretpoint.cpp
  • src/bloom.cpp
  • src/chain.h
  • src/chainparams.cpp
  • src/coin_containers.cpp
  • src/coin_containers.h
  • src/evo/specialtx.cpp
  • src/evo/spork.cpp
  • src/evo/spork.h
  • src/hdmint/hdmint.cpp
  • src/hdmint/hdmint.h
  • src/hdmint/mintpool.cpp
  • src/hdmint/mintpool.h
  • src/hdmint/test/lelantus_tests.cpp
  • src/hdmint/tracker.cpp
  • src/hdmint/tracker.h
  • src/hdmint/wallet.cpp
  • src/hdmint/wallet.h
  • src/init.cpp
  • src/init.h
  • src/lelantus.cpp
  • src/lelantus.h
  • src/liblelantus/challenge_generator.h
  • src/liblelantus/challenge_generator_impl.h
  • src/liblelantus/coin.cpp
  • src/liblelantus/coin.h
  • src/liblelantus/innerproduct_proof.h
  • src/liblelantus/innerproduct_proof_generator.cpp
  • src/liblelantus/innerproduct_proof_generator.h
  • src/liblelantus/innerproduct_proof_verifier.cpp
  • src/liblelantus/innerproduct_proof_verifier.h
  • src/liblelantus/joinsplit.cpp
  • src/liblelantus/joinsplit.h
  • src/liblelantus/lelantus_primitives.cpp
  • src/liblelantus/lelantus_primitives.h
  • src/liblelantus/lelantus_proof.h
  • src/liblelantus/lelantus_prover.cpp
  • src/liblelantus/lelantus_prover.h
  • src/liblelantus/lelantus_verifier.cpp
  • src/liblelantus/lelantus_verifier.h
  • src/liblelantus/openssl_context.h
  • src/liblelantus/params.cpp
  • src/liblelantus/params.h
  • src/liblelantus/range_proof.h
  • src/liblelantus/range_prover.cpp
  • src/liblelantus/range_prover.h
  • src/liblelantus/range_verifier.cpp
  • src/liblelantus/range_verifier.h
  • src/liblelantus/schnorr_proof.h
  • src/liblelantus/schnorr_prover.cpp
  • src/liblelantus/schnorr_prover.h
  • src/liblelantus/schnorr_verifier.cpp
  • src/liblelantus/schnorr_verifier.h
  • src/liblelantus/sigmaextended_proof.h
  • src/liblelantus/sigmaextended_prover.cpp
  • src/liblelantus/sigmaextended_prover.h
  • src/liblelantus/sigmaextended_verifier.cpp
  • src/liblelantus/sigmaextended_verifier.h
  • src/liblelantus/spend_metadata.cpp
  • src/liblelantus/spend_metadata.h
  • src/liblelantus/test/challenge_generator_tests.cpp
  • src/liblelantus/test/coin_tests.cpp
  • src/liblelantus/test/inner_product_test.cpp
  • src/liblelantus/test/joinsplit_tests.cpp
  • src/liblelantus/test/lelantus_primitives_tests.cpp
  • src/liblelantus/test/lelantus_test.cpp
  • src/liblelantus/test/lelantus_test_fixture.cpp
  • src/liblelantus/test/lelantus_test_fixture.h
  • src/liblelantus/test/range_proof_test.cpp
  • src/liblelantus/test/schnorr_test.cpp
  • src/liblelantus/test/serialize_test.cpp
  • src/liblelantus/test/sigma_extended_test.cpp
  • src/liblelantus/threadpool.h
  • src/libspark/coin.h
  • src/libspark/grootle.h
  • src/libspark/spend_transaction.h
  • src/llmq/quorums_instantsend.cpp
  • src/miner.cpp
  • src/primitives/mint_spend.cpp
  • src/primitives/mint_spend.h
  • src/qt/bitcoin.qrc
  • src/qt/bitcoingui.cpp
  • src/qt/bitcoingui.h
  • src/qt/coincontroldialog.cpp
  • src/qt/forms/optionsdialog.ui
  • src/qt/locale/bitcoin_zh_CN.ts
  • src/qt/optionsdialog.cpp
  • src/qt/optionsmodel.cpp
  • src/qt/optionsmodel.h
  • src/qt/overviewpage.cpp
  • src/qt/overviewpage.h
  • src/qt/res/css/firo.css
  • src/qt/sendcoinsdialog.cpp
  • src/qt/sparkmodel.cpp
  • src/qt/transactiondesc.cpp
  • src/qt/transactionrecord.cpp
  • src/qt/walletmodel.cpp
  • src/qt/walletmodel.h
  • src/qt/walletmodeltransaction.cpp
  • src/qt/walletmodeltransaction.h
  • src/rpc/client.cpp
  • src/rpc/misc.cpp
  • src/rpc/rawtransaction.cpp
  • src/rpc/rpcevo.cpp
  • src/spark/sparkwallet.cpp
  • src/spark/state.cpp
  • src/spark/state.h
  • src/test/CMakeLists.txt
  • src/test/fixtures.cpp
  • src/test/fixtures.h
  • src/test/lelantus_mintspend_test.cpp
  • src/test/lelantus_state_tests.cpp
  • src/test/lelantus_tests.cpp
  • src/test/test_bitcoin.cpp
  • src/txmempool.cpp
  • src/txmempool.h
  • src/validation.cpp
  • src/validation.h
  • src/wallet/CMakeLists.txt
  • src/wallet/crypter.cpp
  • src/wallet/lelantusjoinsplitbuilder.cpp
  • src/wallet/lelantusjoinsplitbuilder.h
  • src/wallet/rpcdump.cpp
  • src/wallet/rpcwallet.cpp
  • src/wallet/test/CMakeLists.txt
  • src/wallet/test/lelantus_tests.cpp
  • src/wallet/txbuilder.cpp
  • src/wallet/wallet.cpp
  • src/wallet/wallet.h
  • src/wallet/walletdb.cpp
  • src/wallet/walletdb.h
💤 Files with no reviewable changes (104)
  • src/bip47/secretpoint.cpp
  • src/init.h
  • src/rpc/rpcevo.cpp
  • src/CMakeLists.txt
  • src/liblelantus/params.cpp
  • src/wallet/test/CMakeLists.txt
  • src/hdmint/hdmint.cpp
  • src/test/test_bitcoin.cpp
  • src/liblelantus/spend_metadata.cpp
  • src/liblelantus/joinsplit.h
  • src/qt/sparkmodel.cpp
  • qa/rpc-tests/lelantus_mintspend.py
  • src/bloom.cpp
  • qa/rpc-tests/lelantus_spend_gettransaction.py
  • src/liblelantus/lelantus_verifier.cpp
  • src/liblelantus/lelantus_primitives.cpp
  • qa/pull-tester/rpc-tests.py
  • src/evo/spork.h
  • src/liblelantus/test/challenge_generator_tests.cpp
  • src/qt/walletmodeltransaction.cpp
  • src/wallet/crypter.cpp
  • src/qt/bitcoingui.cpp
  • src/rpc/rawtransaction.cpp
  • src/qt/bitcoin.qrc
  • src/liblelantus/test/lelantus_primitives_tests.cpp
  • src/liblelantus/openssl_context.h
  • src/liblelantus/test/lelantus_test.cpp
  • src/liblelantus/coin.cpp
  • src/qt/res/css/firo.css
  • src/primitives/mint_spend.cpp
  • src/txmempool.cpp
  • src/test/lelantus_tests.cpp
  • src/qt/walletmodel.h
  • src/liblelantus/test/schnorr_test.cpp
  • qa/rpc-tests/lelantus_setmintstatus_validation.py
  • src/liblelantus/sigmaextended_prover.cpp
  • src/hdmint/mintpool.cpp
  • src/test/CMakeLists.txt
  • src/liblelantus/innerproduct_proof.h
  • src/liblelantus/test/lelantus_test_fixture.cpp
  • src/liblelantus/test/serialize_test.cpp
  • src/qt/walletmodel.cpp
  • src/liblelantus/test/sigma_extended_test.cpp
  • src/hdmint/hdmint.h
  • src/qt/forms/optionsdialog.ui
  • src/liblelantus/test/range_proof_test.cpp
  • src/liblelantus/test/inner_product_test.cpp
  • src/wallet/CMakeLists.txt
  • src/qt/bitcoingui.h
  • src/batchproof_container.cpp
  • qa/rpc-tests/lelantus_mint.py
  • src/liblelantus/schnorr_verifier.h
  • src/qt/optionsmodel.h
  • src/lelantus.cpp
  • src/qt/optionsmodel.cpp
  • src/liblelantus/schnorr_prover.cpp
  • src/liblelantus/lelantus_prover.cpp
  • src/liblelantus/lelantus_proof.h
  • src/liblelantus/challenge_generator.h
  • src/liblelantus/schnorr_prover.h
  • src/liblelantus/schnorr_verifier.cpp
  • qa/rpc-tests/bip47-sendreceive.py
  • src/miner.cpp
  • src/liblelantus/innerproduct_proof_verifier.h
  • src/hdmint/test/lelantus_tests.cpp
  • src/liblelantus/spend_metadata.h
  • src/hdmint/tracker.h
  • src/wallet/test/lelantus_tests.cpp
  • src/liblelantus/sigmaextended_proof.h
  • src/liblelantus/params.h
  • src/liblelantus/range_proof.h
  • src/test/lelantus_mintspend_test.cpp
  • src/liblelantus/sigmaextended_verifier.cpp
  • src/liblelantus/test/coin_tests.cpp
  • src/liblelantus/schnorr_proof.h
  • src/liblelantus/challenge_generator_impl.h
  • src/liblelantus/innerproduct_proof_generator.h
  • src/liblelantus/lelantus_prover.h
  • src/wallet/lelantusjoinsplitbuilder.h
  • src/liblelantus/range_prover.h
  • src/liblelantus/range_verifier.cpp
  • src/liblelantus/innerproduct_proof_verifier.cpp
  • src/liblelantus/range_verifier.h
  • src/liblelantus/test/joinsplit_tests.cpp
  • src/liblelantus/test/lelantus_test_fixture.h
  • src/liblelantus/joinsplit.cpp
  • src/liblelantus/range_prover.cpp
  • src/test/fixtures.h
  • src/liblelantus/sigmaextended_prover.h
  • src/liblelantus/threadpool.h
  • src/wallet/lelantusjoinsplitbuilder.cpp
  • src/hdmint/wallet.cpp
  • src/batchproof_container.h
  • src/liblelantus/sigmaextended_verifier.h
  • src/hdmint/mintpool.h
  • src/liblelantus/coin.h
  • src/test/fixtures.cpp
  • src/test/lelantus_state_tests.cpp
  • src/liblelantus/lelantus_verifier.h
  • src/liblelantus/lelantus_primitives.h
  • src/hdmint/wallet.h
  • src/liblelantus/innerproduct_proof_generator.cpp
  • src/lelantus.h
  • src/hdmint/tracker.cpp

@reubenyap
Member

PR #1805 Review: Lelantus Strip

145 files changed, +439 / -18,081 — Full removal of Lelantus protocol code, replacing with Spark equivalents.

Static Analysis Findings

Clean — cppcheck and flawfinder found no issues in the changed files (most changes are deletions).


CRITICAL Issues

1. ConnectBlock fee accounting gap
src/validation.cpp removes Lelantus fee tracking (nLelantusSpendAmount, nLelantusTransparentAmount) but doesn't verify Spark fee accounting fully covers the gap. If any transaction type previously validated via Lelantus fee paths now falls through without validation, it could allow fee-free transactions.
Verify that the Spark fee validation path in ConnectBlock covers all transaction types that Lelantus previously handled.

2. Incomplete Lelantus state cleanup
CLelantusTxInfo and CSigmaTxInfo references removed from block processing, but sigma.h / sigma.cpp still exist in the tree. Partial removal risks inconsistent state — old Sigma/Lelantus data in the chainstate DB may cause assertion failures on resync.
Confirm Sigma removal is handled in a separate PR, and add a migration/check that old Lelantus state is safely ignored or pruned.

3. Missing historical verification path
Nodes syncing from genesis must still validate historical Lelantus transactions in blocks before the fork height. The removal of CheckLelantusTransaction and CheckSigmaTransaction means pre-fork blocks cannot be validated.
Must retain read-only verification code gated on block height, or add a checkpoint that skips detailed validation before the fork.

4. RPC getpoolinfo returns empty Lelantus data
getpoolinfo RPC (if it still exists) may crash or return malformed responses if it references removed Lelantus pool metrics without null checks.
Check all RPCs that referenced Lelantus pool/mint/spend data.

5. Wallet migration for Lelantus balances
Users who still hold Lelantus mints (not yet spent to Spark) will lose access to those funds. The PR doesn't include a wallet migration path or user-facing warning.
Must either: (a) force-spend remaining Lelantus mints to Spark before activation, or (b) retain minimal Lelantus spend capability for migration.

HIGH Issues

6. Lelantus tests removed but no Spark equivalents added
All Lelantus/Sigma test files are deleted. If Spark test coverage doesn't mirror what was removed, the project loses regression protection.
Audit Spark test coverage against the deleted Lelantus test cases.

7. CTransaction serialization backward compatibility
Changes to transaction serialization (removing Lelantus fields) could break deserialization of historical transactions from disk or network peers running older versions.
Ensure CTransaction::SerializationOp still handles old tx versions with Lelantus payloads.

8. Masternode/LLMQ interaction
Masternodes validate transactions. If masternode code paths still reference Lelantus types that were removed, masternode consensus could break.
Check src/evo/ and src/llmq/ for any Lelantus references.

9. GUI wallet Lelantus references
Qt wallet code may have UI elements or menu items referencing Lelantus mint/spend operations that would crash or confuse users after removal.
Grep src/qt/ for Lelantus/Sigma references.

MEDIUM Issues

10. Build system cleanup
Makefile.am / CMakeLists.txt may still reference removed liblelantus source files, causing build failures on clean builds.

11. Documentation references
RPC help text and developer docs may still mention Lelantus operations.

12. Protocol version bump
A removal this significant should bump the protocol version to ensure peers can distinguish pre- and post-strip nodes.

13. Log message cleanup
Logging categories referencing Lelantus (LogPrint(BCLog::LELANTUS, ...)) may remain in untouched files, causing log noise or build warnings.


Summary

This is a massive, high-risk removal. The deletion itself is straightforward, but the transition safety (historical validation, wallet migration, fee accounting continuity) needs careful verification before merge. Recommend addressing Critical items 1, 3, and 5 before merge.


Review assisted by AI + static analysis (cppcheck, flawfinder, clang-tidy). Posted by @reubenyap.

@coderabbitai coderabbitai Bot requested a review from psolstice March 24, 2026 14:42
Contributor

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 5

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
src/llmq/quorums_instantsend.cpp (1)

1598-1604: ⚠️ Potential issue | 🟡 Minor

Lelantus removal from AdaptPrivateTx is consistent with grace period but lacks safeguard documentation.

The AdaptPrivateTx function now only handles Spark spends. The Lelantus adaptation (via AdaptJsplitTx) was removed entirely from the codebase, which aligns with the grace period ending January 30, 2026.

Since we're now in March 2026 (past the grace period), Lelantus transactions are actively rejected from the mempool (validation.cpp:840-842 checks chainActive.Height() >= nLelantusGracefulPeriod). However, SyncTransaction is called for both current blocks and reorg'd blocks (validation.cpp:3779). A deep reorg to pre-grace-period blocks could theoretically resurrect Lelantus transactions, which would then pass through AdaptPrivateTx unchanged instead of being adapted.

While this is an extremely low-probability edge case given the grace period has already passed, consider adding a comment explaining the intentional removal of Lelantus handling to prevent future confusion.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/llmq/quorums_instantsend.cpp` around lines 1598 - 1604, AdaptPrivateTx
now only handles Spark spends via AdaptSparkTx and intentionally omits the
former Lelantus handling (AdaptJsplitTx); add a concise comment above
AdaptPrivateTx that documents this intentional removal, noting the grace period
end (nLelantusGracefulPeriod) and the mempool rejection check
(chainActive.Height() >= nLelantusGracefulPeriod in validation logic), and
mention the low-probability reorg edge case where SyncTransaction could
encounter pre-grace-period blocks—clarify that Lelantus is deliberately not
adapted here to avoid confusion for future maintainers.
src/wallet/rpcwallet.cpp (2)

2856-2863: ⚠️ Potential issue | 🟠 Major

Scope Spark fee override to outgoing transactions only.

At Line 2856, fee parsing runs for any Spark spend tx. Because Line 2865 uses nNet - nFee, this can distort reported amount for non-sender wallet views. Guard this with wtx.IsFromMe(filter).

Suggested guard
-    if (wtx.tx->IsSparkSpend()) {
+    if (wtx.IsFromMe(filter) && wtx.tx->IsSparkSpend()) {
         try {
             nFee = (0 - spark::ParseSparkSpend(*wtx.tx).getFee());
         }
         catch (const std::exception &) {
             // do nothing
         }
     }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/wallet/rpcwallet.cpp` around lines 2856 - 2863, The Spark fee override is
currently applied for any Spark spend transaction which can misreport amounts
for non-sender views; update the block that checks wtx.tx->IsSparkSpend() to
only parse and override nFee when the transaction is outgoing by adding a guard
using wtx.IsFromMe(filter). In other words, inside the function handling wallet
transactions (the code using variables wtx, nFee, nNet and calling
spark::ParseSparkSpend), wrap the ParseSparkSpend try/catch so it runs only if
wtx.IsFromMe(filter) returns true, leaving all other logic unchanged.

1643-1655: ⚠️ Potential issue | 🟡 Minor

getprivatebalance help text overstates coin origin.

Line 1643 says “created by the wallet”, but Line 1655 returns getAvailableBalance(), which includes all confirmed tracked Spark coins without origin filtering. This can mislead operators about what is counted.

Suggested wording fix
-            "Private balance is the sum of all confirmed spark mints which are created by the wallet.\n"
+            "Private balance is the sum of confirmed Spark funds tracked by this wallet.\n"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/wallet/rpcwallet.cpp` around lines 1643 - 1655, The getprivatebalance
help text incorrectly states that private coins are "created by the wallet"
while the RPC actually returns
ValueFromAmount(pwallet->sparkWallet->getAvailableBalance()), which includes all
confirmed tracked Spark coins regardless of origin; update the help string in
the getprivatebalance block (near EnsureSparkWalletIsAvailable() / LOCK2 usage)
to accurately describe the result as the sum of confirmed tracked/available
Spark mints known to the wallet (or similar wording like "confirmed tracked
Spark coins known to the wallet"), keeping the rest of the function
(EnsureSparkWalletIsAvailable, LOCK2, and the return of getAvailableBalance())
unchanged.
🧹 Nitpick comments (4)
src/spark/state.cpp (1)

771-815: Dead code from async removal can be cleaned up.

After removing the async thread pool infrastructure, several constructs are now vestigial:

  1. scheduledAsync (line 772) is always false, making lines 808-810 unreachable dead code
  2. fRecheckNeeded (line 775) is always false, so the do-while loop executes exactly once (it's effectively just a block scope)

Consider simplifying to remove these artifacts:

Suggested cleanup
-            bool fChecked = false;
-            bool scheduledAsync = false;
+            bool fChecked = false;
 
             try {
-                bool fRecheckNeeded;
-                do {
-                    fRecheckNeeded = false;
-
-                    LOCK(cs_checkedSparkSpendTransactions);
-                    if (gCheckedSparkSpendTransactions.count(hashTx)) {
-                        auto& checkState = gCheckedSparkSpendTransactions[hashTx];
-                        if (checkState.fChecked) {
-                            if (!checkState.fResult)
-                                return state.DoS(100, false, REJECT_INVALID, "CheckSparkSpendTransaction: previously checked and failed");
-                            else {
-                                LogPrintf("CheckSparkSpendTransaction: already checked tx %s\n", hashTx.ToString());
-                                fChecked = true;
-                            }
+                {
+                    LOCK(cs_checkedSparkSpendTransactions);
+                    if (gCheckedSparkSpendTransactions.count(hashTx)) {
+                        auto& checkState = gCheckedSparkSpendTransactions[hashTx];
+                        if (checkState.fChecked) {
+                            if (!checkState.fResult)
+                                return state.DoS(100, false, REJECT_INVALID, "CheckSparkSpendTransaction: previously checked and failed");
+                            LogPrintf("CheckSparkSpendTransaction: already checked tx %s\n", hashTx.ToString());
+                            fChecked = true;
                         }
                     }
                 }
-                while (fRecheckNeeded);
     
                 if (fChecked) {
                     passVerify = true;
                 }
                 else {
                     if (fStatefulSigmaCheck) {
                         passVerify = spark::SpendTransaction::verify(*spend, cover_sets);
                         {
                             LOCK(cs_checkedSparkSpendTransactions);
                             gCheckedSparkSpendTransactions[hashTx] = {true, passVerify};
                         }
                     }
                     else {
-                        if (scheduledAsync) {
-                            passVerify = true;
-                        } else {
-                            passVerify = spark::SpendTransaction::verify(*spend, cover_sets);
-                        }
+                        passVerify = spark::SpendTransaction::verify(*spend, cover_sets);
                     }
                 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/spark/state.cpp` around lines 771 - 815, The code still contains
vestigial async-related variables and a redundant do-while loop; remove the
unused bool scheduledAsync and the fRecheckNeeded/do-while construct in
CheckSparkSpendTransaction, replacing the do-while with a single scoped check
using LOCK(cs_checkedSparkSpendTransactions) to look up
gCheckedSparkSpendTransactions[hashTx] and set fChecked appropriately, and then
simplify the subsequent branch: drop the scheduledAsync branch (lines that
assume async scheduling) so the non-stateful path simply falls back to
synchronous verify via spark::SpendTransaction::verify and ensure you still
write back gCheckedSparkSpendTransactions[hashTx] = {true, passVerify} inside
the LOCK where appropriate.
src/libspark/coin.h (3)

207-219: Prefer static_cast over C-style casts.

The serialization uses C-style casts which are less explicit than static_cast. While functionally correct, this is inconsistent with the more explicit casting style used in PublicCoin::Serialize above.

♻️ Suggested improvement
     template<typename Stream>
     void Serialize(Stream& s) const {
-        int64_t tmp = uint8_t(denomination);
+        int64_t tmp = static_cast<int64_t>(static_cast<std::uint8_t>(denomination));
         s << tmp;
         tmp = coinGroupId;
         s << tmp;
     }
     template<typename Stream>
     void Unserialize(Stream& s) {
         int64_t tmp;
-        s >> tmp; denomination = CoinDenomination(tmp);
-        s >> tmp; coinGroupId = int(tmp);
+        s >> tmp; denomination = static_cast<CoinDenomination>(static_cast<std::uint8_t>(tmp));
+        s >> tmp; coinGroupId = static_cast<int>(tmp);
     }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/libspark/coin.h` around lines 207 - 219, Replace the C-style casts in
Coin::Serialize and Coin::Unserialize with explicit C++ casts: when assigning to
tmp use static_cast<int64_t>(denomination) in Serialize, and in Unserialize cast
tmp to CoinDenomination via denomination = static_cast<CoinDenomination>(tmp)
and cast coinGroupId using coinGroupId = static_cast<int>(tmp); update both
Serialize(Stream& s) and Unserialize(Stream& s) to use these static_casts for
clarity and consistency with PublicCoin::Serialize.

223-234: Use .data() instead of &bnData[0] for vector access.

The coding guidelines specify using .data() instead of &v[0] to avoid potential out-of-bounds access issues and for clarity.

♻️ Suggested fix
 struct CScalarHash {
     std::size_t operator ()(const secp_primitives::Scalar& bn) const noexcept {
         std::vector<unsigned char> bnData(bn.memoryRequired());
-        bn.serialize(&bnData[0]);
+        bn.serialize(bnData.data());
         unsigned char hash[CSHA256::OUTPUT_SIZE];
-        CSHA256().Write(&bnData[0], bnData.size()).Finalize(hash);
+        CSHA256().Write(bnData.data(), bnData.size()).Finalize(hash);
         // take the first bytes of "hash".
         std::size_t result;
         std::memcpy(&result, hash, sizeof(std::size_t));
         return result;
     }
 };

As per coding guidelines: "Watch for out-of-bounds vector access; use .data() instead of &v[0]".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/libspark/coin.h` around lines 223 - 234, In CScalarHash::operator()
replace the vector element-pointer uses with vector::data(): when serializing
and when hashing, use bnData.data() instead of &bnData[0] to follow guidelines
and avoid potential out-of-bounds access; update the two call sites inside
operator() (the bn.serialize(...) call and the CSHA256().Write(...) call) to
pass bnData.data() and keep the rest of the logic (CSHA256 hash, memcpy into
result) unchanged.

238-238: Missing namespace closing comment.

For consistency with other namespaces in this file (lines 143 and 275), add a closing comment.

♻️ Suggested fix
-}
+} // namespace sigma
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/libspark/coin.h` at line 238, Add a trailing comment after the final
closing brace to indicate which namespace is being closed (match the exact
identifier used at the opening of that namespace in this file), e.g., change the
lone "}" to "}" followed by "// namespace <NAME>" (use the same namespace name
used earlier in this file so it is consistent with the other closing comments
near lines 143 and 275).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/chainparams.cpp`:
- Around line 1022-1023: The mempool and block acceptance boundaries diverge
because consensus.nLelantusGracefulPeriod is 0 while block logic falls back to
consensus.nSparkStartBlock; fix by making the two behaviors consistent: either
set consensus.nLelantusGracefulPeriod to the same non‑zero fallback used for
blocks (e.g., the devnet/regtest value currently in nSparkStartBlock) in
chainparams (so the parameter itself is correct), or change the mempool check in
AcceptToMemoryPool (the Lelantus/mempool validation path) to apply the same
fallback logic (if consensus.nLelantusGracefulPeriod == 0 then use
consensus.nSparkStartBlock) before rejecting transactions. Ensure you update the
regtest comment if you change the boundary value.
- Around line 804-805: Remove the dead Lelantus config field and its
assignments: delete the nLelantusV3PayloadStartBlock member from the consensus
params struct (the consensus params declaration) and remove every assignment to
nLelantusV3PayloadStartBlock in chainparams.cpp (the Lelantus strip assignment
and the later assignment around lines ~1320). Ensure no other code references
nLelantusV3PayloadStartBlock after removal; if any compiler errors point to
residual uses, remove or replace those usages with the active nSparkStartBlock
logic.

In `@src/llmq/quorums_instantsend.cpp`:
- Around line 493-502: Add a defensive early rejection for Lelantus JoinSplit
transactions: inside the CheckCanLock / equivalent function (the block handling
tx.IsSparkSpend() and later UTXO checks), add a check for
tx.IsLelantusJoinSplit() that immediately returns false so Lelantus transactions
are explicitly rejected at this layer; keep the existing LOCK(cs_main) placement
and ensure the IsLelantusJoinSplit() check occurs before any UTXO or
synthetic-input processing to improve readability and clarity.

In `@src/wallet/rpcdump.cpp`:
- Around line 577-580: The call to pwallet->ScanForWalletTransactions(pindex)
omits the fUpdate flag causing inconsistent behavior; update the call in the
else branch to pass fUpdate=true (i.e.,
pwallet->ScanForWalletTransactions(pindex, true)) to match other usages and
ensure AddToWalletIfInvolvingMe fully updates wallet transaction state for
discovered transactions; locate the call that uses pindex and
LogPrintf("Rescanning last %i blocks...") and add the missing boolean argument.

In `@src/wallet/rpcwallet.cpp`:
- Around line 1680-1685: EnsureSparkWalletIsAvailable() currently causes
gettotalbalance to hard-fail when Spark is missing; change the logic to compute
the transparent balance unconditionally (use pwallet->GetBalance()), then
attempt to add the Spark balance only if pwallet->sparkWallet is non-null and/or
EnsureSparkWalletIsAvailable() does not throw. Specifically, remove the
unconditional EnsureSparkWalletIsAvailable() call in the gettotalbalance path,
or wrap it in a try/catch, check that pwallet->sparkWallet is valid before
calling pwallet->sparkWallet->getAvailableBalance(), and if Spark is unavailable
return ValueFromAmount(transparent) instead of failing; otherwise return
ValueFromAmount(transparent + spark).
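The suggested fallback logic can be sketched as below. `SparkView` and `TotalBalanceSketch` are hypothetical stand-ins for `pwallet->sparkWallet` and the gettotalbalance path; amounts are plain integers rather than CAmount.

```cpp
#include <cassert>
#include <stdexcept>

// Hedged sketch of the suggested gettotalbalance behavior: compute the
// transparent balance unconditionally, then add the Spark balance only if
// the Spark wallet is present and usable.
struct SparkView {
    bool available;
    long long balance;
    long long getAvailableBalance() const {
        if (!available) throw std::runtime_error("Spark wallet unavailable");
        return balance;
    }
};

long long TotalBalanceSketch(long long transparent, const SparkView* spark) {
    long long total = transparent;              // never fails
    if (spark) {
        try {
            total += spark->getAvailableBalance();
        } catch (const std::runtime_error&) {
            // Spark unavailable: fall back to the transparent balance alone.
        }
    }
    return total;
}
```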

---

Outside diff comments:
In `@src/llmq/quorums_instantsend.cpp`:
- Around line 1598-1604: AdaptPrivateTx now only handles Spark spends via
AdaptSparkTx and intentionally omits the former Lelantus handling
(AdaptJsplitTx); add a concise comment above AdaptPrivateTx that documents this
intentional removal, noting the grace period end (nLelantusGracefulPeriod) and
the mempool rejection check (chainActive.Height() >= nLelantusGracefulPeriod in
validation logic), and mention the low-probability reorg edge case where
SyncTransaction could encounter pre-grace-period blocks—clarify that Lelantus is
deliberately not adapted here to avoid confusion for future maintainers.

In `@src/wallet/rpcwallet.cpp`:
- Around line 2856-2863: The Spark fee override is currently applied for any
Spark spend transaction which can misreport amounts for non-sender views; update
the block that checks wtx.tx->IsSparkSpend() to only parse and override nFee
when the transaction is outgoing by adding a guard using wtx.IsFromMe(filter).
In other words, inside the function handling wallet transactions (the code using
variables wtx, nFee, nNet and calling spark::ParseSparkSpend), wrap the
ParseSparkSpend try/catch so it runs only if wtx.IsFromMe(filter) returns true,
leaving all other logic unchanged.
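The guard described above reduces to a small pattern. `WtxView` models only the relevant bits of `CWalletTx`, and `parsedFee` stands in for the fee extracted by `spark::ParseSparkSpend`; both names are hypothetical.

```cpp
#include <cassert>

// Illustrative sketch of the suggested guard: only override nFee from the
// parsed Spark spend when the wallet actually sent the transaction, so a
// receiver's view is not misreported.
struct WtxView {
    bool isSparkSpend;
    bool isFromMe;      // stand-in for wtx.IsFromMe(filter)
    long long parsedFee;
};

long long EffectiveFeeSketch(const WtxView& wtx, long long nFee) {
    if (wtx.isSparkSpend && wtx.isFromMe)   // guard added here
        nFee = wtx.parsedFee;
    return nFee;
}
```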
- Around line 1643-1655: The getprivatebalance help text incorrectly states that
private coins are "created by the wallet" while the RPC actually returns
ValueFromAmount(pwallet->sparkWallet->getAvailableBalance()), which includes all
confirmed tracked Spark coins regardless of origin; update the help string in
the getprivatebalance block (near EnsureSparkWalletIsAvailable() / LOCK2 usage)
to accurately describe the result as the sum of confirmed tracked/available
Spark mints known to the wallet (or similar wording like "confirmed tracked
Spark coins known to the wallet"), keeping the rest of the function
(EnsureSparkWalletIsAvailable, LOCK2, and the return of getAvailableBalance())
unchanged.

---

Nitpick comments:
In `@src/libspark/coin.h`:
- Around line 207-219: Replace the C-style casts in Coin::Serialize and
Coin::Unserialize with explicit C++ casts: when assigning to tmp use
static_cast<int64_t>(denomination) in Serialize, and in Unserialize cast tmp to
CoinDenomination via denomination = static_cast<CoinDenomination>(tmp) and cast
coinGroupId using coinGroupId = static_cast<int>(tmp); update both
Serialize(Stream& s) and Unserialize(Stream& s) to use these static_casts for
clarity and consistency with PublicCoin::Serialize.
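The cast pattern being recommended looks like this in isolation. The `CoinDenomination` enum below is illustrative, not the project's real type; the point is the explicit `static_cast` round-trip through a fixed-width integer instead of C-style casts.

```cpp
#include <cassert>
#include <cstdint>

// Minimal sketch of the suggested static_cast pattern for serializing an
// enum through an int64_t, as in Coin::Serialize/Unserialize.
enum class CoinDenomination : int64_t { D1 = 1, D10 = 10, D100 = 100 };

int64_t SerializeDenom(CoinDenomination d) {
    return static_cast<int64_t>(d);            // explicit widening, no C cast
}

CoinDenomination UnserializeDenom(int64_t tmp) {
    return static_cast<CoinDenomination>(tmp); // explicit conversion back
}
```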
- Around line 223-234: In CScalarHash::operator() replace the vector
element-pointer uses with vector::data(): when serializing and when hashing, use
bnData.data() instead of &bnData[0] to follow guidelines and avoid potential
out-of-bounds access; update the two call sites inside operator() (the
bn.serialize(...) call and the CSHA256().Write(...) call) to pass bnData.data()
and keep the rest of the logic (CSHA256 hash, memcpy into result) unchanged.
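The `data()` versus `&bnData[0]` distinction is worth spelling out: `data()` is well-defined even when the vector is empty, while dereferencing element 0 of an empty vector is undefined behavior. `FillBuffer` below is a generic stand-in for the serialize/`CSHA256().Write` call sites.

```cpp
#include <cassert>
#include <cstring>
#include <vector>

// Sketch of the &bnData[0] vs bnData.data() point: data() is safe on an
// empty vector, &bnData[0] is not.
std::size_t FillBuffer(std::vector<unsigned char>& bnData) {
    if (!bnData.empty())
        std::memset(bnData.data(), 0xAB, bnData.size()); // data(), not &bnData[0]
    return bnData.size();
}
```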
- Line 238: Add a trailing comment after the final closing brace to indicate
which namespace is being closed (match the exact identifier used at the opening
of that namespace in this file), e.g., change the lone "}" to "}" followed by
"// namespace <NAME>" (use the same namespace name used earlier in this file so
it is consistent with the other closing comments near lines 143 and 275).

In `@src/spark/state.cpp`:
- Around line 771-815: The code still contains vestigial async-related variables
and a redundant do-while loop; remove the unused bool scheduledAsync and the
fRecheckNeeded/do-while construct in CheckSparkSpendTransaction, replacing the
do-while with a single scoped check using LOCK(cs_checkedSparkSpendTransactions)
to look up gCheckedSparkSpendTransactions[hashTx] and set fChecked
appropriately, and then simplify the subsequent branch: drop the scheduledAsync
branch (lines that assume async scheduling) so the non-stateful path simply
falls back to synchronous verify via spark::SpendTransaction::verify and ensure
you still write back gCheckedSparkSpendTransactions[hashTx] = {true, passVerify}
inside the LOCK where appropriate.
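The simplified control flow being asked for can be sketched with standard-library primitives. `gCheckedCache` and `Verify` are stand-ins for `gCheckedSparkSpendTransactions` and `spark::SpendTransaction::verify`; the real code uses the project's LOCK macro rather than `std::lock_guard`.

```cpp
#include <cassert>
#include <mutex>
#include <unordered_map>

// Sketch of replacing the do-while with one scoped lookup under the lock,
// then a synchronous verify and a write-back, also under the lock.
std::mutex gCacheMutex;
std::unordered_map<int, bool> gCheckedCache;   // txid -> passVerify

bool VerifyOnceSketch(int hashTx, bool (*Verify)(int)) {
    {   // single scoped check: no fRecheckNeeded / scheduledAsync machinery
        std::lock_guard<std::mutex> lock(gCacheMutex);
        auto it = gCheckedCache.find(hashTx);
        if (it != gCheckedCache.end())
            return it->second;                 // already checked
    }
    bool passVerify = Verify(hashTx);          // synchronous fallback path
    std::lock_guard<std::mutex> lock(gCacheMutex);
    gCheckedCache[hashTx] = passVerify;        // write back under the lock
    return passVerify;
}
```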
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 42c044e3-f560-47ae-b3bf-a379e80e6acc

📥 Commits

Reviewing files that changed from the base of the PR and between 20de5f9 and b107f2a.

📒 Files selected for processing (9)
  • README.md
  • src/chain.h
  • src/chainparams.cpp
  • src/libspark/coin.h
  • src/llmq/quorums_instantsend.cpp
  • src/spark/state.cpp
  • src/validation.cpp
  • src/wallet/rpcdump.cpp
  • src/wallet/rpcwallet.cpp
💤 Files with no reviewable changes (1)
  • src/chain.h
🚧 Files skipped from review as they are similar to previous changes (1)
  • README.md
👮 Files not reviewed due to content moderation or server errors (1)
  • src/validation.cpp


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (2)
src/spark/threadpool.h (1)

5-6: Header guard doesn't follow project convention.

The header guard FIRO_SPARK_THREADPOOL_H doesn't follow the project's documented pattern BITCOIN_<PATH>_<FILE>_H. Consider renaming it to BITCOIN_SPARK_THREADPOOL_H, or keeping FIRO_SPARK_THREADPOOL_H only if the project has standardized on the FIRO prefix for new files.

As per coding guidelines: "Header guards should follow format #ifndef BITCOIN_<PATH>_<FILE>_H"

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/spark/threadpool.h` around lines 5 - 6, change the header guard macro to
follow the project's convention: replace the existing FIRO_SPARK_THREADPOOL_H
guard with the standardized BITCOIN_SPARK_THREADPOOL_H (update the
`#ifndef`/`#define` pair and the corresponding `#endif` comment if present) so the header
guard in threadpool.h uses BITCOIN_SPARK_THREADPOOL_H consistently.
src/spark/sparkwallet.cpp (1)

95-105: Defensive null checks are good; consider typed pointer.

The null check in FinishTasks() is a sensible defensive measure. However, the repeated (ParallelOpThreadPool<void>*)threadPool casts throughout the file suggest threadPool is stored as void*. Consider declaring it as ParallelOpThreadPool<void>* or std::unique_ptr<ParallelOpThreadPool<void>> to eliminate unsafe casts and improve type safety.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/spark/sparkwallet.cpp` around lines 95 - 105, The member threadPool is
currently stored as a void* causing repeated unsafe casts; change its
declaration in CSparkWallet to a typed pointer (ParallelOpThreadPool<void>* or
better std::unique_ptr<ParallelOpThreadPool<void>>) and update
CSparkWallet::~CSparkWallet and CSparkWallet::FinishTasks to use the typed
pointer (call shutdown on the typed instance and delete or reset the unique_ptr)
so you can remove all (ParallelOpThreadPool<void>*) casts and improve type
safety.
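The typed-pointer suggestion reduces to the following shape. `PoolSketch` models `ParallelOpThreadPool<void>` in outline only; the member and method names mirror the review comment, not the real class.

```cpp
#include <cassert>
#include <memory>

// Sketch: hold the pool as a std::unique_ptr of its real type instead of
// void*, so shutdown/reset need no (ParallelOpThreadPool<void>*) casts.
struct PoolSketch {
    bool running = true;
    void Shutdown() { running = false; }
};

struct SparkWalletSketch {
    std::unique_ptr<PoolSketch> threadPool = std::make_unique<PoolSketch>();

    void FinishTasks() {
        if (threadPool) {            // defensive null check survives
            threadPool->Shutdown();  // typed call: no unsafe cast needed
            threadPool.reset();      // replaces manual delete
        }
    }
};
```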
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/llmq/quorums_instantsend.cpp`:
- Around line 493-506: The loop inside the tx.IsSparkSpend() branch accesses
in.scriptSig.front() without ensuring the scriptSig buffer is non-empty, which
can cause undefined behavior; update the loop in the IsSparkSpend handling
(where you iterate CTxIn in tx.vin and call GroupElement::deserialize on
&in.scriptSig.front()) to first check that in.scriptSig is non-empty (e.g., if
(in.scriptSig.empty()) return false or continue based on intended validation),
only call deserialize when non-empty, and preserve the existing IsUsedLTag check
against spark::CSparkState::GetState()->IsUsedLTag(lTag).

In `@src/wallet/rpcdump.cpp`:
- Around line 574-576: The rescan-only path for legacy mint seed keys (detected
via nChange.first == 2 and guarded by fMintUpdate) currently only calls
pwallet->ScanForWalletTransactions which rebuilds tx history but does not
rebuild the mint pool/state those keys rely on; update the fMintUpdate branch so
that when a legacy mint seed key is detected you either invoke the existing
legacy-mint migration routine (e.g., the wallet's mint pool/state rebuild
function—refer to functions named like RebuildMintPool/RestoreLegacyMints or a
migration helper) to fully reconstruct mint state, or if no migration exists,
reject the import (return an error from importwallet) before persisting keys;
ensure this change is applied in the block that currently contains
pwallet->ScanForWalletTransactions and references fMintUpdate and nChange.first
so legacy mints are not silently left unrecoverable.

---

Nitpick comments:
In `@src/spark/sparkwallet.cpp`:
- Around line 95-105: The member threadPool is currently stored as a void*
causing repeated unsafe casts; change its declaration in CSparkWallet to a typed
pointer (ParallelOpThreadPool<void>* or better
std::unique_ptr<ParallelOpThreadPool<void>>) and update
CSparkWallet::~CSparkWallet and CSparkWallet::FinishTasks to use the typed
pointer (call shutdown on the typed instance and delete or reset the unique_ptr)
so you can remove all (ParallelOpThreadPool<void>*) casts and improve type
safety.

In `@src/spark/threadpool.h`:
- Around line 5-6: Change the header guard macro to follow the project's
convention: replace the existing FIRO_SPARK_THREADPOOL_H guard with the
standardized BITCOIN_SPARK_THREADPOOL_H (update the `#ifndef`/`#define` pair and the
corresponding `#endif` comment if present) so the header guard in threadpool.h
uses BITCOIN_SPARK_THREADPOOL_H consistently.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: e20ce7b4-1ab8-4d5a-ac60-6daa909517c6

📥 Commits

Reviewing files that changed from the base of the PR and between b107f2a and 588c4ef.

📒 Files selected for processing (7)
  • src/chainparams.cpp
  • src/consensus/params.h
  • src/llmq/quorums_instantsend.cpp
  • src/spark/sparkwallet.cpp
  • src/spark/threadpool.h
  • src/wallet/rpcdump.cpp
  • src/wallet/rpcwallet.cpp
💤 Files with no reviewable changes (1)
  • src/consensus/params.h
🚧 Files skipped from review as they are similar to previous changes (1)
  • src/wallet/rpcwallet.cpp


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/llmq/quorums_instantsend.cpp`:
- Around line 497-507: CheckCanLock currently only checks tx.IsSparkSpend() so
adapted Spark transactions rewritten to INSTANTSEND_ADAPTED_TX slip past the
Spark-specific logic; update this check to include the adapted-transaction flag
(INSTANTSEND_ADAPTED_TX) as an alternate branch and treat adapted transactions
identically to Spark spends (iterate vin, ensure scriptSig/non-empty,
deserialize GroupElement and consult
spark::CSparkState::GetState()->IsUsedLTag(lTag) before returning true), or
alternatively avoid adapting in AdaptSparkTx()/SyncTransaction() before
ProcessTx()/retry queue; adjust the conditional that now reads tx.IsSparkSpend()
to also detect INSTANTSEND_ADAPTED_TX and preserve the existing Spark-path
behavior.

In `@src/wallet/rpcdump.cpp`:
- Around line 548-553: The loop that adds keys (calls like AddKeyPubKey()) may
persist supported keys and then abort immediately when encountering a legacy
mint-seed (pwallet->mapKeyMetadata[keyid].nChange.first == 2), leaving the
wallet partially imported; to fix, perform a pre-scan of the parsed dump for any
unsupported legacy mint-seed entries (check pwallet->mapKeyMetadata for
nChange.first == 2) and abort before calling AddKeyPubKey(), OR accumulate a
flag/error when such entries are found and defer throwing until after
post-import cleanup steps (UpdateTimeFirstKey(), the final rescan, and
ShowProgress("", 100)) so that previously added keys and rescans complete before
failing; ensure the change is applied in the import routine in rpcdump.cpp (the
import loop/ImportWallet logic) and preserve existing persistence and cleanup on
early-detection paths.
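The pre-scan option described above can be sketched as a two-pass import. `KeyEntry` and the `nChangeFirst == 2` marker only loosely mirror the dump format; `ImportSketch` is a hypothetical stand-in for the importwallet loop, with a counter standing in for `AddKeyPubKey()`.

```cpp
#include <cassert>
#include <vector>

// Sketch of the suggested pre-scan: reject unsupported legacy mint-seed
// entries before any key is persisted, so a failed import never leaves
// the wallet half-populated.
struct KeyEntry { int nChangeFirst; };

// Returns the number of keys imported, or -1 if the dump holds an
// unsupported legacy mint seed (nChange.first == 2). `persisted` counts
// keys actually written (stand-in for AddKeyPubKey calls).
int ImportSketch(const std::vector<KeyEntry>& dump, int& persisted) {
    for (const KeyEntry& k : dump)          // pass 1: nothing persisted yet
        if (k.nChangeFirst == 2)
            return -1;                      // abort before any side effects
    for (const KeyEntry& k : dump) {        // pass 2: safe to persist all
        (void)k;
        ++persisted;
    }
    return persisted;
}
```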

ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 283131c3-9bcb-404d-b0e4-15ce532be103

📥 Commits

Reviewing files that changed from the base of the PR and between 588c4ef and aca1bad.

📒 Files selected for processing (3)
  • qa/pull-tester/rpc-tests.py
  • src/llmq/quorums_instantsend.cpp
  • src/wallet/rpcdump.cpp
💤 Files with no reviewable changes (1)
  • qa/pull-tester/rpc-tests.py


@coderabbitai coderabbitai Bot left a comment


Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/llmq/quorums_instantsend.cpp (1)

497-509: ⚠️ Potential issue | 🔴 Critical

Critical: tx.IsSparkSpend() will always be false inside the INSTANTSEND_ADAPTED_TX branch, bypassing LTag validation.

The check at line 498 can never pass because:

  1. AdaptSparkTx() sets nType = INSTANTSEND_ADAPTED_TX (line 1600)
  2. IsSparkSpend() requires nType == TRANSACTION_SPARK (per relevant snippet from transaction.cpp:218-223)

When an adapted Spark spend reaches CheckCanLock, it enters the INSTANTSEND_ADAPTED_TX branch, but the nested IsSparkSpend() check fails, causing the LTag validation to be completely skipped. The function returns true at line 508 without verifying that the linking tags haven't been used, effectively disabling double-spend protection for Spark InstantSend.

🐛 Proposed fix: Remove the redundant nested check
     if (tx.nType == isutils::INSTANTSEND_ADAPTED_TX) {
-        if (tx.IsSparkSpend()) {
-            for (CTxIn const & in : tx.vin) {
-                if (in.scriptSig.empty())
-                    return false;
-                GroupElement lTag;
-                lTag.deserialize(&in.scriptSig.front());
-                if (spark::CSparkState::GetState()->IsUsedLTag(lTag))
-                    return false;
-            }
+        for (CTxIn const & in : tx.vin) {
+            if (in.scriptSig.empty())
+                return false;
+            GroupElement lTag;
+            lTag.deserialize(&in.scriptSig.front());
+            if (spark::CSparkState::GetState()->IsUsedLTag(lTag))
+                return false;
         }
         return true;
     }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/llmq/quorums_instantsend.cpp` around lines 497 - 509, Inside
CheckCanLock, the nested check if (tx.IsSparkSpend()) inside the branch if
(tx.nType == isutils::INSTANTSEND_ADAPTED_TX) prevents LTag validation because
adapted Spark txs set nType to INSTANTSEND_ADAPTED_TX (AdaptSparkTx) while
IsSparkSpend() expects TRANSACTION_SPARK; remove the redundant IsSparkSpend()
condition so that when tx.nType == INSTANTSEND_ADAPTED_TX you always iterate
tx.vin, ensure in.scriptSig is non-empty, deserialize GroupElement lTag from
in.scriptSig and call spark::CSparkState::GetState()->IsUsedLTag(lTag) to reject
used tags (i.e., delete the if (tx.IsSparkSpend()) guard and its braces while
keeping the input loop and IsUsedLTag check).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Outside diff comments:
In `@src/llmq/quorums_instantsend.cpp`:
- Around line 497-509: Inside CheckCanLock, the nested check if
(tx.IsSparkSpend()) inside the branch if (tx.nType ==
isutils::INSTANTSEND_ADAPTED_TX) prevents LTag validation because adapted Spark
txs set nType to INSTANTSEND_ADAPTED_TX (AdaptSparkTx) while IsSparkSpend()
expects TRANSACTION_SPARK; remove the redundant IsSparkSpend() condition so that
when tx.nType == INSTANTSEND_ADAPTED_TX you always iterate tx.vin, ensure
in.scriptSig is non-empty, deserialize GroupElement lTag from in.scriptSig and
call spark::CSparkState::GetState()->IsUsedLTag(lTag) to reject used tags (i.e.,
delete the if (tx.IsSparkSpend()) guard and its braces while keeping the input
loop and IsUsedLTag check).

ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 4136dc0a-295f-43ac-a4e5-685ffe581c83

📥 Commits

Reviewing files that changed from the base of the PR and between aca1bad and 5b5611a.

📒 Files selected for processing (2)
  • src/llmq/quorums_instantsend.cpp
  • src/wallet/rpcdump.cpp
💤 Files with no reviewable changes (1)
  • src/wallet/rpcdump.cpp

@codeant-ai

codeant-ai Bot commented Apr 17, 2026

User petrosyan.levon93@gmail.com does not have a PR Review subscription.

Go to Team management and add this email to the PR Review subscription.


@cursor cursor Bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.


❌ Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.

Reviewed by Cursor Bugbot for commit 9f0b130. Configure here.

@codeant-ai

codeant-ai Bot commented May 2, 2026

User levoncrypto1994@gmail.com does not have a PR Review subscription.

Go to Team management and add this email to the PR Review subscription.

@codeant-ai

codeant-ai Bot commented May 3, 2026

User petrosyan.levon93@gmail.com does not have a PR Review subscription.

Go to Team management and add this email to the PR Review subscription.

@levonpetrosyan93 levonpetrosyan93 self-requested a review May 10, 2026 23:56
3 participants