Compare commits


45 Commits

Author SHA1 Message Date
Steve Myers
7876c8fd06 Merge bitcoindevkit/bdk#1437: Bump bdk version to 1.0.0-alpha.11
db9fdccc18 Bump bdk version to 1.0.0-alpha.11 (Steve Myers)

Pull request description:

  ### Description

  fixes #1435

  bdk_chain to 0.14.0
  bdk_bitcoind_rpc to 0.10.0
  bdk_electrum to 0.13.0
  bdk_esplora to 0.13.0
  bdk_file_store to 0.11.0
  bdk_testenv to 0.4.0
  bdk_persist to 0.2.0

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

ACKs for top commit:
  ValuedMammal:
    ACK db9fdccc18
  storopoli:
    ACK db9fdccc18

Tree-SHA512: 804475927461a4bcf37786313123a0a0cc68390af6c556111551dc126a06614296eb657a8a5662a36cf4569ab332f5f9285c99c5f1992d93f43568e881961895
2024-05-10 17:23:21 -05:00
Steve Myers
db9fdccc18 Bump bdk version to 1.0.0-alpha.11
bdk_chain to 0.14.0
bdk_bitcoind_rpc to 0.10.0
bdk_electrum to 0.13.0
bdk_esplora to 0.13.0
bdk_file_store to 0.11.0
bdk_testenv to 0.4.0
bdk_persist to 0.2.0
2024-05-10 14:05:40 -05:00
Steve Myers
63e3bbe820 Merge bitcoindevkit/bdk#1403: Update bdk_electrum crate to use sync/full-scan structs
b45897e6fe feat(electrum): update docs and simplify logic of `ElectrumExt` (志宇)
92fb6cb373 chore(electrum): do not use `anyhow::Result` directly (志宇)
b2f3cacce6 feat(electrum): include option for previous `TxOut`s for fee calculation (Wei Chen)
c0d7d60a58 feat(chain)!: use custom return types for `ElectrumExt` methods (志宇)
2945c6be88 fix(electrum): fixed `sync` functionality (Wei Chen)
9ed33c25ea docs(electrum): fixed `full_scan`, `sync`, and crate documentation (Wei Chen)
b1f861b932 feat: update logging of electrum examples (志宇)
a6fdfb2ae4 feat(electrum)!: use new sync/full-scan structs for `ElectrumExt` (志宇)
653e4fed6d feat(wallet): cache txs when constructing full-scan/sync requests (志宇)
58f27b38eb feat(chain): introduce `TxCache` to `SyncRequest` and `FullScanRequest` (志宇)
721bb7f519 fix(chain): Make `Anchor` type in `FullScanResult` generic (志宇)
e3cfb84898 feat(chain): `TxGraph::insert_tx` reuses `Arc` (志宇)
2ffb65618a refactor(electrum): remove `RelevantTxids` and track txs in `TxGraph` (Wei Chen)

Pull request description:

  Fixes #1265
  Possibly fixes #1419

  ### Context

  Previous changes such as

  * Universal structures for full-scan/sync (PR #1413)
  * Making `CheckPoint` linked list query-able (PR #1369)
  * Making `Transaction`s cheaply-clonable (PR #1373)

  have allowed us to simplify the interaction between chain-source and receiving-structures (`bdk_chain`).

  The motivation is to accomplish something like this ([as mentioned here](https://github.com/bitcoindevkit/bdk/issues/1153#issuecomment-1752263555)):
  ```rust
  let things_i_am_interested_in = wallet.lock().unwrap().start_sync();
  let update = electrum_or_esplora.sync(things_i_am_interested_in)?;
  wallet.lock().unwrap().apply_update(update)?;
  ```

  ### Description

  This PR greatly simplifies the API of our Electrum chain-source (`bdk_electrum`) by making use of the aforementioned changes. Instead of referring back to the receiving `TxGraph` mid-sync/scan to determine which full transactions to fetch, we provide the Electrum chain-source with already-fetched full transactions up front (this is cheap, as transactions are wrapped in `Arc`s since #1373).

  In addition, an option has been added to include the previous `TxOut` for transactions received from an external wallet for fee calculation.
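
  The fee figure this option enables is plain arithmetic over the inputs' previous `TxOut` values. A minimal sketch (the `calculate_fee` helper and bare `u64` sat values are illustrative stand-ins, not the `bdk_electrum` API):

  ```rust
  // Hypothetical helper: `prev_values` stands in for the floating `TxOut`
  // values fetched for each input of a received transaction.
  fn calculate_fee(prev_values: &[u64], output_values: &[u64]) -> Option<u64> {
      let total_in: u64 = prev_values.iter().sum();
      let total_out: u64 = output_values.iter().sum();
      // None signals missing or inconsistent prevout data.
      total_in.checked_sub(total_out)
  }

  fn main() {
      // One input worth 100_000 sats, outputs of 60_000 and 39_500 sats.
      assert_eq!(calculate_fee(&[100_000], &[60_000, 39_500]), Some(500));
      // Without the previous TxOuts, the fee cannot be computed at all.
      assert_eq!(calculate_fee(&[], &[60_000]), None);
  }
  ```

  Without the previous `TxOut`s, only the output side of the equation is known, which is why the option exists.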

  ### Changelog notice

  * Change `TxGraph::insert_tx` to take in anything that satisfies `Into<Arc<Transaction>>`. This allows us to reuse the `Arc` pointer of what is being inserted.
  * Add `tx_cache` field to `SyncRequest` and `FullScanRequest`.
  * Make `Anchor` type in `FullScanResult` generic for more flexibility.
  * Change `ElectrumExt` methods to take in `SyncRequest`/`FullScanRequest` and return `SyncResult`/`FullScanResult`. Also update electrum examples accordingly.
  * Add `ElectrumResultExt` trait which allows us to convert the update `TxGraph` of `SyncResult`/`FullScanResult` for `bdk_electrum`.
  * Added an option for `full_scan` and `sync` to also fetch previous `TxOut`s to allow for fee calculation.

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

  #### New Features:

  * [x] I've added tests for the new feature
  * [x] I've added docs for the new feature

ACKs for top commit:
  ValuedMammal:
    ACK b45897e6fe
  notmandatory:
    ACK b45897e6fe

Tree-SHA512: 1e274546015e7c7257965b36079ffe0cb3c2c0b7c2e0c322bcf32a06925a0c3e1119da1c8fd5318f1dbd82c2e952f6a07f227a9b023c48f506a62c93045d96d3
2024-05-10 12:33:09 -05:00
志宇
b45897e6fe feat(electrum): update docs and simplify logic of ElectrumExt
Helper method docs are updated to explain what they are updating. Logic
is simplified as we do not need to check whether a tx exists already in
`update_graph` before inserting it.
2024-05-10 16:40:55 +08:00
志宇
92fb6cb373 chore(electrum): do not use anyhow::Result directly 2024-05-10 14:54:29 +08:00
Wei Chen
b2f3cacce6 feat(electrum): include option for previous TxOuts for fee calculation
The previous `TxOut` for transactions received from an external
wallet may be optionally added as floating `TxOut`s to `TxGraph`
to allow for fee calculation.
2024-05-10 14:54:29 +08:00
志宇
c0d7d60a58 feat(chain)!: use custom return types for ElectrumExt methods
This is more code, but a much more elegant solution than having
`ElectrumExt` methods return `SyncResult`/`FullScanResult` and having an
`ElectrumResultExt` extension trait.
2024-05-10 14:54:29 +08:00
Wei Chen
2945c6be88 fix(electrum): fixed sync functionality 2024-05-10 14:54:28 +08:00
Wei Chen
9ed33c25ea docs(electrum): fixed full_scan, sync, and crate documentation 2024-05-10 14:54:28 +08:00
志宇
b1f861b932 feat: update logging of electrum examples
* Syncing with `example_electrum` now shows progress as a percentage.
* Flush stdout more aggressively.
2024-05-10 14:54:28 +08:00
志宇
a6fdfb2ae4 feat(electrum)!: use new sync/full-scan structs for ElectrumExt
`ElectrumResultExt` trait is also introduced that adds methods which can
convert the `Anchor` type for the update `TxGraph`.

We also make use of the new `TxCache` fields in
`SyncRequest`/`FullScanRequest`. This way, we can avoid re-fetching full
transactions from Electrum if not needed.

Examples and tests are updated to use the new `ElectrumExt` API.
2024-05-10 14:54:28 +08:00
志宇
653e4fed6d feat(wallet): cache txs when constructing full-scan/sync requests 2024-05-10 14:11:20 +08:00
志宇
58f27b38eb feat(chain): introduce TxCache to SyncRequest and FullScanRequest
This transaction cache can be provided so the chain-source can avoid
re-fetching transactions.
2024-05-10 14:11:19 +08:00
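The caching idea can be sketched with a toy cache (all names here — `TxCache`, `Tx`, `get_or_fetch`, the `fetch_count` bookkeeping — are illustrative stand-ins, not `bdk_chain` types):

```rust
use std::collections::HashMap;
use std::sync::Arc;

type Txid = u32;

#[derive(Debug, PartialEq)]
struct Tx(Txid);

struct TxCache {
    txs: HashMap<Txid, Arc<Tx>>,
    fetch_count: usize, // counts simulated round trips to the chain source
}

impl TxCache {
    fn get_or_fetch(&mut self, txid: Txid) -> Arc<Tx> {
        if let Some(tx) = self.txs.get(&txid) {
            return Arc::clone(tx); // cache hit: no re-fetch needed
        }
        self.fetch_count += 1; // simulate fetching from the chain source
        let tx = Arc::new(Tx(txid));
        self.txs.insert(txid, Arc::clone(&tx));
        tx
    }
}

fn main() {
    let mut cache = TxCache { txs: HashMap::new(), fetch_count: 0 };
    cache.get_or_fetch(7);
    cache.get_or_fetch(7); // second request is served from the cache
    assert_eq!(cache.fetch_count, 1);
}
```

Pre-populating such a cache from the wallet's own `TxGraph` is what lets the chain-source skip redundant network requests.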
志宇
721bb7f519 fix(chain): Make Anchor type in FullScanResult generic 2024-05-10 14:11:19 +08:00
志宇
e3cfb84898 feat(chain): TxGraph::insert_tx reuses Arc
When we insert a transaction that is already wrapped in `Arc`, we should
reuse the `Arc`.
2024-05-10 14:11:19 +08:00
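The reuse described here is the standard `impl Into<Arc<T>>` trick: accepting anything convertible to an `Arc` lets callers pass either a bare value (which gets wrapped) or an existing `Arc` (whose pointer is reused). A self-contained sketch with a toy `Graph` standing in for `TxGraph`:

```rust
use std::sync::Arc;

// Illustrative stand-in for TxGraph; `String` stands in for Transaction.
struct Graph {
    txs: Vec<Arc<String>>,
}

impl Graph {
    fn insert_tx(&mut self, tx: impl Into<Arc<String>>) -> Arc<String> {
        let tx: Arc<String> = tx.into(); // reuses the Arc if one was passed
        self.txs.push(Arc::clone(&tx));
        tx
    }
}

fn main() {
    let mut graph = Graph { txs: Vec::new() };
    let original = Arc::new("tx-bytes".to_string());
    let stored = graph.insert_tx(Arc::clone(&original));
    // Same allocation: the Arc pointer is reused, not re-wrapped.
    assert!(Arc::ptr_eq(&original, &stored));
    // A bare value still works and gets wrapped on the way in.
    graph.insert_tx("other-tx".to_string());
    assert_eq!(graph.txs.len(), 2);
}
```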
Wei Chen
2ffb65618a refactor(electrum): remove RelevantTxids and track txs in TxGraph
This PR removes `RelevantTxids` from the electrum crate and tracks
transactions in a `TxGraph`. This removes the need to separately
construct a `TxGraph` after a `full_scan` or `sync`.
2024-05-10 14:11:18 +08:00
Steve Myers
fb7ff298a4 Merge bitcoindevkit/bdk#1203: Include the descriptor in keychain::Changeset
86711d4f46 doc(chain): add section for non-recommended K to descriptor assignments (Daniela Brozzoni)
de53d72191 test: Only the highest ord keychain is returned (Daniela Brozzoni)
9d8023bf56 fix(chain): introduce keychain-variant-ranking to `KeychainTxOutIndex` (志宇)
6c8748124f chore(chain): move `use` in `indexed_tx_graph.rs` so clippy is happy (志宇)
537aa03ae0 chore(chain): update test so clippy does not complain (志宇)
ed117de7a5 test(chain): applying changesets one-by-one vs aggregate should be same (志宇)
6a3fb849e8 fix(chain): simplify `Append::append` impl for `keychain::ChangeSet` (志宇)
1d294b734d fix: Run tests only if the miniscript feature is.. ..enabled, enable it by default (Daniela Brozzoni)
0e3e136f6f doc(bdk): Add instructions for manually inserting... ...secret keys in the wallet in Wallet::load (Daniela Brozzoni)
76afccc555 fix(wallet): add expected descriptors as signers after creating from wallet::ChangeSet (Steve Myers)
4f05441a00 keychain::ChangeSet includes the descriptor (Daniela Brozzoni)
8ff99f27df ref(chain): Define test descriptors, use them... ...everywhere (Daniela Brozzoni)
b9902936a0 ref(chain): move `keychain::ChangeSet` into `txout_index.rs` (志宇)

Pull request description:

  Fixes #1101

  - Moves keychain::ChangeSet inside `keychain/txout_index.rs` as now the `ChangeSet` depends on miniscript
  - Slightly cleans up tests by introducing some constant descriptors
  - The KeychainTxOutIndex's internal SpkIterator now uses DescriptorId
  instead of K. The DescriptorId -> K translation is made at the
  KeychainTxOutIndex level.
  - The keychain::Changeset is now a struct, which includes a map for last
  revealed indexes, and one for newly added keychains and their
  descriptor.

  ### Changelog notice

  API changes in bdk:
  - Wallet::keychains returns an `impl Iterator` instead of `BTreeMap`
  - Wallet::load doesn't take descriptors anymore, since they're stored in the db
  - Wallet::new_or_load checks if the loaded descriptor from db is the same as the provided one

  API changes in bdk_chain:
  - `ChangeSet` is now a struct, which includes a map for last revealed
        indexes, and one for keychains and descriptors.
  - `KeychainTxOutIndex::inner` returns a `SpkIterator<(DescriptorId, u32)>`
  - `KeychainTxOutIndex::outpoints` returns a `BTreeSet` instead of `&BTreeSet`
  - `KeychainTxOutIndex::keychains` returns an `impl Iterator` instead of
        `&BTreeMap`
  - `KeychainTxOutIndex::txouts` doesn't return an `ExactSizeIterator` anymore
  - `KeychainTxOutIndex::last_revealed_indices` returns a `BTreeMap`
        instead of `&BTreeMap`
  - `KeychainTxOutIndex::add_keychain` has been renamed to `KeychainTxOutIndex::insert_descriptor`, and now it returns a ChangeSet
  - `KeychainTxOutIndex::reveal_next_spk` returns Option
  - `KeychainTxOutIndex::next_unused_spk` returns Option
  - `KeychainTxOutIndex::unbounded_spk_iter` returns Option
  - `KeychainTxOutIndex::next_index` returns Option
  - `KeychainTxOutIndex::reveal_to_target` returns Option
  - `KeychainTxOutIndex::revealed_keychain_spks` returns Option
  - `KeychainTxOutIndex::unused_keychain_spks` returns Option
  - `KeychainTxOutIndex::last_revealed_index` returns Option
  - `KeychainTxOutIndex::keychain_outpoints` returns Option
  - `KeychainTxOutIndex::keychain_outpoints_in_range` returns Option
  - `KeychainTxOutIndex::last_used_index` returns None if the keychain has never been used, or if it doesn't exist

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

  #### New Features:

  * [x] I've added tests for the new feature
  * [x] I've added docs for the new feature

ACKs for top commit:
  evanlinjin:
    ACK 86711d4f46

Tree-SHA512: 4b1c9a31951f67b18037b7dd9837acbc35823f21de644ab833754b74d20f5373549f81e66965ecd3953ebf4f99644c9fd834812acfa65f9188950f1bda17ab60
2024-05-09 13:18:57 -05:00
Daniela Brozzoni
86711d4f46 doc(chain): add section for non-recommended K to descriptor assignments 2024-05-09 14:40:19 +08:00
Steve Myers
86408b90a5 Merge bitcoindevkit/bdk#1430: ci: Pin clippy to rust 1.78.0
de2763a4b8 ci: Pin clippy to rust 1.78.0 (valued mammal)

Pull request description:

  This PR pins clippy check in CI to the rust 1.78 toolchain, which prevents new lints in stable releases from interrupting the usual workflow. Because rust versions are released on a predictable schedule, we can revisit this setting in the future as needed (say 3-6 months).

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [ ] I ran `cargo fmt` and `cargo clippy` before committing

ACKs for top commit:
  danielabrozzoni:
    ACK de2763a4b8
  storopoli:
    ACK de2763a4b8
  notmandatory:
    ACK de2763a4b8
  oleonardolima:
    ACK de2763a4b8

Tree-SHA512: 73cad29a5ff437290aca8f85a011c4f5fc4d9ff5755f3d3ef9fa1820f5631eda857b1a67955adfc6ef98145958c674cc09f7613b96f38cc30c75a656d872edbc
2024-05-08 19:37:06 -05:00
Daniela Brozzoni
de53d72191 test: Only the highest ord keychain is returned 2024-05-08 15:49:51 +02:00
志宇
9d8023bf56 fix(chain): introduce keychain-variant-ranking to KeychainTxOutIndex
This fixes the bug with changesets not being monotone. Previously, the
result of applying changesets individually v.s. applying the aggregate
of changesets may result in different `KeychainTxOutIndex` states.

The nature of the changeset allows different keychain types to share the
same descriptor. However, the previous design did not take this into
account. To do this properly, we should keep track of all keychains
currently associated with a given descriptor. However, the API only
allows returning one keychain per spk/txout/outpoint (which is a good
API).

Therefore, we rank keychain variants by `Ord`. Earlier keychain variants
have a higher rank, and the first keychain will be returned.
2024-05-08 15:49:50 +02:00
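The ranking rule can be sketched with a toy keychain enum (illustrative, not bdk's types). Earlier variants compare as smaller under `Ord`, and a `BTreeSet` iterates in `Ord` order, so the first element is the winning keychain for a shared descriptor:

```rust
use std::collections::BTreeSet;

// Hypothetical keychain enum; deriving Ord makes earlier variants
// "rank higher" per the commit's rule (External beats Internal).
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum Keychain {
    External,
    Internal,
}

// Given every keychain currently assigned to one descriptor,
// return the single keychain the API should report.
fn keychain_for_descriptor(keychains: &BTreeSet<Keychain>) -> Option<Keychain> {
    keychains.iter().next().copied() // smallest Ord value wins
}

fn main() {
    let shared: BTreeSet<Keychain> =
        [Keychain::Internal, Keychain::External].into();
    assert_eq!(keychain_for_descriptor(&shared), Some(Keychain::External));
    assert_eq!(keychain_for_descriptor(&BTreeSet::new()), None);
}
```

Because the winner depends only on the set's contents, the result is the same whether changesets are applied one-by-one or as an aggregate, which is what restores monotonicity.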
志宇
6c8748124f chore(chain): move use in indexed_tx_graph.rs so clippy is happy 2024-05-08 15:49:48 +02:00
志宇
537aa03ae0 chore(chain): update test so clippy does not complain 2024-05-08 15:49:47 +02:00
志宇
ed117de7a5 test(chain): applying changesets one-by-one vs aggregate should be same 2024-05-08 15:49:46 +02:00
志宇
6a3fb849e8 fix(chain): simplify Append::append impl for keychain::ChangeSet
We only need to loop though entries of `other`. The logic before was
wasteful because we were also looping though all entries of `self` even
if we do not need to modify the `self` entry.
2024-05-08 15:49:45 +02:00
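The simplification amounts to merging only `other`'s entries into `self`. A sketch with plain `BTreeMap`s (the keychain names and the keep-the-larger-index merge rule are illustrative assumptions, not the exact `Append` impl):

```rust
use std::collections::BTreeMap;

// Merge `other` into `this`, keeping the larger last-revealed index per
// keychain. Only `other`'s entries are visited; untouched `this` entries
// are never iterated, which was the wasteful part before.
fn append(this: &mut BTreeMap<&'static str, u32>, other: BTreeMap<&'static str, u32>) {
    for (keychain, index) in other {
        let entry = this.entry(keychain).or_insert(index);
        if *entry < index {
            *entry = index;
        }
    }
}

fn main() {
    let mut a = BTreeMap::from([("external", 5), ("internal", 2)]);
    let b = BTreeMap::from([("internal", 9), ("change", 1)]);
    append(&mut a, b);
    assert_eq!(
        a,
        BTreeMap::from([("change", 1), ("external", 5), ("internal", 9)])
    );
}
```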
Daniela Brozzoni
1d294b734d fix: Run tests only if the miniscript feature is..
..enabled, enable it by default
2024-05-08 15:49:44 +02:00
Daniela Brozzoni
0e3e136f6f doc(bdk): Add instructions for manually inserting...
...secret keys in the wallet in Wallet::load
2024-05-08 15:49:43 +02:00
Steve Myers
76afccc555 fix(wallet): add expected descriptors as signers after creating from wallet::ChangeSet 2024-05-08 15:49:42 +02:00
Daniela Brozzoni
4f05441a00 keychain::ChangeSet includes the descriptor
- The KeychainTxOutIndex's internal SpkIterator now uses DescriptorId
  instead of K. The DescriptorId -> K translation is made at the
  KeychainTxOutIndex level.
- The keychain::Changeset is now a struct, which includes a map for last
  revealed indexes, and one for newly added keychains and their
  descriptor.

API changes in bdk:
- Wallet::keychains returns an `impl Iterator` instead of `BTreeMap`
- Wallet::load doesn't take descriptors anymore, since they're stored in
  the db
- Wallet::new_or_load checks if the loaded descriptor from db is the
  same as the provided one

API changes in bdk_chain:
- `ChangeSet` is now a struct, which includes a map for last revealed
  indexes, and one for keychains and descriptors.
- `KeychainTxOutIndex::inner` returns a `SpkIterator<(DescriptorId, u32)>`
- `KeychainTxOutIndex::outpoints` returns an `impl Iterator` instead of `&BTreeSet`
- `KeychainTxOutIndex::keychains` returns an `impl Iterator` instead of
  `&BTreeMap`
- `KeychainTxOutIndex::txouts` doesn't return an `ExactSizeIterator`
  anymore
- `KeychainTxOutIndex::unbounded_spk_iter` returns an `Option`
- `KeychainTxOutIndex::next_index` returns an `Option`
- `KeychainTxOutIndex::last_revealed_indices` returns a `BTreeMap`
  instead of `&BTreeMap`
- `KeychainTxOutIndex::reveal_to_target` returns an `Option`
- `KeychainTxOutIndex::reveal_next_spk` returns an `Option`
- `KeychainTxOutIndex::next_unused_spk` returns an `Option`
- `KeychainTxOutIndex::add_keychain` has been renamed to
  `KeychainTxOutIndex::insert_descriptor`, and now it returns a
  ChangeSet
2024-05-08 15:49:41 +02:00
Daniela Brozzoni
8ff99f27df ref(chain): Define test descriptors, use them...
...everywhere
2024-05-08 13:23:28 +02:00
志宇
b9902936a0 ref(chain): move keychain::ChangeSet into txout_index.rs
We plan to record `Descriptor` additions into persistence. Hence, we
need to add `Descriptor`s to the changeset. This depends on
`miniscript`. Moving this into `txout_index.rs` makes sense as this is
consistent with all the other files. The only reason why this wasn't
this way before, is because the changeset didn't need miniscript.

Co-Authored-By: Daniela Brozzoni <danielabrozzoni@protonmail.com>
2024-05-08 13:23:27 +02:00
Steve Myers
66abc73c3d Merge bitcoindevkit/bdk#1423: fix(persist): add default feature to enable bdk_chain/std
a577c22b12 fix(persist): add default feature to enable bdk_chain/std (Steve Myers)

Pull request description:

  ### Description

  This PR adds a `default` feature to `bdk_persist` so it can be built on its own. Once #1422 is done we can remove the `default` again.

  ### Notes to the reviewers

  I need to be able to build `bdk_persist` on its own so I can publish it to crates.io.

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

ACKs for top commit:
  ValuedMammal:
    ACK a577c22b12
  oleonardolima:
    ACK a577c22b12
  storopoli:
    ACK a577c22b12

Tree-SHA512: 8b07a9e4974dec8812ca19ce7226dcaece064270a0be8b83d3c326fdf1e89b051eb0bd8aa0eda9362b2c8233ecd6003b70c92ee046603973d8d40611418c3841
2024-05-07 19:32:09 -05:00
valued mammal
de2763a4b8 ci: Pin clippy to rust 1.78.0 2024-05-07 09:55:14 -04:00
志宇
dcd2d4741d Merge bitcoindevkit/bdk#1411: feat: update keychain::Balance to use bitcoin::Amount
22aa534d76 feat: use `Amount` on `TxBuilder::add_recipient` (Leonardo Lima)
d5c0e7200c feat: use `Amount` on `spk_txout_index` and related (Leonardo Lima)
8a33d98db9 feat: update `wallet::Balance` to use `bitcoin::Amount` (Leonardo Lima)

Pull request description:

  fixes #823


  ### Description

  On `Balance`, and throughout the code, a `u64` represents the amount, relying on the user to infer that it denotes sats rather than millisats or any other unit.

  It updates the usage of `u64` on `Balance`, and other APIs:
  - `TxParams::add_recipient`
  - `KeyChainTxOutIndex::sent_and_received`, `KeyChainTxOutIndex::net_value`
  -  `SpkTxOutIndex::sent_and_received`, `SpkTxOutIndex::net_value`


  ### Notes to the reviewers


  It updates some of the APIs to expect `bitcoin::Amount`, but it does not update internal usage of `u64`; for example, `TxParams` still expects and uses `u64`. Please see the PR comments for related discussion.

  ### Changelog notice


  - Changed the `keychain::Balance` struct fields to use `Amount` instead of `u64`.
  - Changed the `add_recipient` method on `TxBuilder` implementation to expect `bitcoin::Amount`.
  - Changed the `sent_and_received`, and `net_value` methods on `KeyChainTxOutIndex` to expect `bitcoin::Amount`.
  - Changed the `sent_and_received`, and `net_value` methods on `SpkTxOutIndex` to expect `bitcoin::Amount`.
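
  The motivation behind these changes can be seen with a minimal `Amount`-style newtype (a stand-in sketch; only the `from_sat`/`to_sat` names mirror the real `bitcoin::Amount` API, the rest is illustrative):

  ```rust
  // The newtype fixes the unit (sats) at the type level, so callers can't
  // accidentally pass millisats where an API expects sats.
  #[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
  struct Amount(u64);

  impl Amount {
      const fn from_sat(sat: u64) -> Self { Amount(sat) }
      const fn to_sat(self) -> u64 { self.0 }
      fn checked_add(self, rhs: Amount) -> Option<Amount> {
          self.0.checked_add(rhs.0).map(Amount)
      }
  }

  // Hypothetical API: taking `Amount` instead of `u64` makes the expected
  // unit explicit in the signature itself.
  fn add_recipient(amount: Amount) -> u64 {
      amount.to_sat()
  }

  fn main() {
      let a = Amount::from_sat(50_000);
      assert_eq!(add_recipient(a), 50_000);
      assert_eq!(a.checked_add(Amount::from_sat(1)), Some(Amount::from_sat(50_001)));
  }
  ```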

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

  #### New Features:

  * [x] I've added tests for the new feature
  * [x] I've added docs for the new feature

  #### Bugfixes:

  * [x] This pull request breaks the existing API
  * [ ] I've added tests to reproduce the issue which are now passing
  * [x] I'm linking the issue being fixed by this PR

ACKs for top commit:
  evanlinjin:
    ACK 22aa534d76

Tree-SHA512: c4e8198d96c0d66cc3d2e4149e8a56bb7565b9cd49ff42113eaebd24b1d7bfeecd7124db0b06524b78b8891ee1bde1546705b80afad408f48495cf3c02446d02
2024-05-06 20:23:48 +08:00
志宇
23538c4039 Merge bitcoindevkit/bdk#1414: chore: clean up electrsd and anyhow dev dependencies
f6218e4741 chore: reexport crates in `TestEnv` (Wei Chen)
125959976f chore: remove `anyhow` dev dependency from `electrum`, `esplora`, and `bitcoind_rpc` (Wei Chen)

Pull request description:


  ### Description

  Reexports `electrsd` in `TestEnv` to remove the `electrsd` dev dependency from `bdk_electrum` and `bdk_esplora`.
  Credit to @ValuedMammal for the idea.

  Since `bitcoind` reexports `anyhow`, this dev dependency was also removed from `bdk_electrum`, `bdk_esplora`, and `bdk_bitcoind_rpc`. `bitcoind`, `bitcoincore_rpc` and `electrum_client` were also reexported for convenience.

  ### Changelog notice

  * Change `bdk_testenv` to re-export internally used crates.

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

ACKs for top commit:
  evanlinjin:
    ACK f6218e4741

Tree-SHA512: c7645eb91d08d4ccb80982a992f691b5a8c0df39df506f6b361bc6f2bb076d62cbe5bb5d88b4c684c36e22464c0674f21f6ef4e23733f89b03aa12ec43a67cba
2024-05-06 20:12:23 +08:00
志宇
a9f7377934 Merge bitcoindevkit/bdk#1427: docs(esplora): fixed full_scan and sync documentation
f6dc6890c3 docs(esplora): fixed `full_scan` and `sync` documentation (Wei Chen)

Pull request description:


  ### Description

  Fixed documentation for `full_scan` and `sync` in `bdk_esplora`.

  ### Changelog notice

  * Updated documentation for `full_scan` and `sync` in `bdk_esplora`.

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

ACKs for top commit:
  evanlinjin:
    ACK f6dc6890c3
  storopoli:
    ACK f6dc6890c3

Tree-SHA512: 900fb1a2839379af867a6effad32ec4bdfb897330a72ee1e1ec203299e7f3d5fae576550aeed8fd93c5c70a13ad2b0e898033d8b45b490319b5d74216b93f332
2024-05-06 20:09:31 +08:00
Wei Chen
f6dc6890c3 docs(esplora): fixed full_scan and sync documentation 2024-05-06 16:51:19 +08:00
Leonardo Lima
22aa534d76 feat: use Amount on TxBuilder::add_recipient 2024-05-05 12:07:07 -03:00
Leonardo Lima
d5c0e7200c feat: use Amount on spk_txout_index and related
- update `wallet.rs` fns: `sent_and_received`
- update `keychain` `txout_index` fns: `sent_and_received` and `net_value`
2024-05-05 12:07:01 -03:00
Wei Chen
f6218e4741 chore: reexport crates in TestEnv 2024-05-05 19:28:18 +08:00
Wei Chen
125959976f chore: remove anyhow dev dependency from electrum, esplora, and bitcoind_rpc 2024-05-05 19:28:18 +08:00
Leonardo Lima
8a33d98db9 feat: update wallet::Balance to use bitcoin::Amount
- update all fields `immature`, `trusted_pending`, `untrusted_pending`
  and `confirmed` to use `bitcoin::Amount` instead of `u64`
- update all `impl Balance` methods to use `bitcoin::Amount`
- update all tests that rely on `keychain::Balance`
2024-05-04 21:59:07 -03:00
志宇
2703cc6e78 Merge bitcoindevkit/bdk#1417: test(wallet): add thread safety test
db47347472 test(wallet): add thread safety test (Rob N)

Pull request description:

  ### Description

  `Wallet` auto-implements `Send` and `Sync` after removing the generic. This test is a compile time error if there are changes to `Wallet` in the future that make it unsafe to send between threads. See #1387 for discussion.
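
  The usual shape of such a compile-time test is a generic function bounded on `Send + Sync`: the call only compiles if the type satisfies both bounds. Sketched here with a stand-in `Wallet` struct, not bdk's actual type:

  ```rust
  // Compiles only for types that are Send + Sync; does nothing at runtime.
  fn assert_send_sync<T: Send + Sync>() {}

  // Stand-in for bdk's Wallet; a plain-data struct is Send + Sync.
  struct Wallet {
      balance: u64,
  }

  fn main() {
      // If Wallet ever gains a field like Rc or RefCell, this line becomes
      // a compile error rather than a runtime failure.
      assert_send_sync::<Wallet>();
      let w = Wallet { balance: 0 };
      assert_eq!(w.balance, 0);
  }
  ```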

  ### Checklists

  #### All Submissions:

  * [x] I've signed all my commits
  * [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
  * [x] I ran `cargo fmt` and `cargo clippy` before committing

  #### New Features:

  * [x] I've added tests for the new feature
  * [ ] I've added docs for the new feature

  #### Bugfixes:

  * [ ] This pull request breaks the existing API
  * [ ] I've added tests to reproduce the issue which are now passing
  * [ ] I'm linking the issue being fixed by this PR

ACKs for top commit:
  evanlinjin:
    ACK db47347472

Tree-SHA512: 490e666bc503f15286268db7e5e2f75ee44ad2f80251d6f7a01af2a435023b87607eee33623712433ea8d27511be63c6c1e9cad4159b3fe66a4644cfa9e344fb
2024-05-04 20:24:30 +08:00
Rob N
db47347472 test(wallet): add thread safety test 2024-05-02 08:43:02 -10:00
Steve Myers
a577c22b12 fix(persist): add default feature to enable bdk_chain/std 2024-05-02 13:30:13 -05:00
47 changed files with 2154 additions and 1247 deletions

@@ -118,7 +118,7 @@ jobs:
       - uses: actions/checkout@v1
       - uses: actions-rs/toolchain@v1
         with:
-          toolchain: stable
+          toolchain: 1.78.0
           components: clippy
           override: true
       - name: Rust Cache

@@ -1,7 +1,7 @@
 [package]
 name = "bdk"
 homepage = "https://bitcoindevkit.org"
-version = "1.0.0-alpha.10"
+version = "1.0.0-alpha.11"
 repository = "https://github.com/bitcoindevkit/bdk"
 documentation = "https://docs.rs/bdk"
 description = "A modern, lightweight, descriptor-based wallet library"
@@ -19,8 +19,8 @@ miniscript = { version = "11.0.0", features = ["serde"], default-features = fals
 bitcoin = { version = "0.31.0", features = ["serde", "base64", "rand-std"], default-features = false }
 serde = { version = "^1.0", features = ["derive"] }
 serde_json = { version = "^1.0" }
-bdk_chain = { path = "../chain", version = "0.13.0", features = ["miniscript", "serde"], default-features = false }
+bdk_chain = { path = "../chain", version = "0.14.0", features = ["miniscript", "serde"], default-features = false }
-bdk_persist = { path = "../persist", version = "0.1.0" }
+bdk_persist = { path = "../persist", version = "0.2.0" }
 # Optional dependencies
 bip39 = { version = "2.0", optional = true }

@@ -92,7 +92,7 @@
 //!     .unwrap();
 //! let psbt = {
 //!     let mut builder = wallet.build_tx().coin_selection(AlwaysSpendEverything);
-//!     builder.add_recipient(to_address.script_pubkey(), 50_000);
+//!     builder.add_recipient(to_address.script_pubkey(), Amount::from_sat(50_000));
 //!     builder.finish()?
 //! };
 //!

View File

@@ -32,14 +32,14 @@ use bdk_chain::{
IndexedTxGraph, IndexedTxGraph,
}; };
use bdk_persist::{Persist, PersistBackend}; use bdk_persist::{Persist, PersistBackend};
use bitcoin::constants::genesis_block;
use bitcoin::secp256k1::{All, Secp256k1}; use bitcoin::secp256k1::{All, Secp256k1};
use bitcoin::sighash::{EcdsaSighashType, TapSighashType}; use bitcoin::sighash::{EcdsaSighashType, TapSighashType};
use bitcoin::{ use bitcoin::{
absolute, psbt, Address, Block, FeeRate, Network, OutPoint, Script, ScriptBuf, Sequence, absolute, psbt, Address, Block, FeeRate, Network, OutPoint, Script, ScriptBuf, Sequence,
Transaction, TxOut, Txid, Witness, Transaction, TxOut, Txid, Witness,
}; };
use bitcoin::{consensus::encode::serialize, transaction, Amount, BlockHash, Psbt}; use bitcoin::{consensus::encode::serialize, transaction, BlockHash, Psbt};
use bitcoin::{constants::genesis_block, Amount};
use core::fmt; use core::fmt;
use core::ops::Deref; use core::ops::Deref;
use descriptor::error::Error as DescriptorError; use descriptor::error::Error as DescriptorError;
@@ -54,6 +54,7 @@ pub mod tx_builder;
pub(crate) mod utils; pub(crate) mod utils;
pub mod error; pub mod error;
pub use utils::IsDust; pub use utils::IsDust;
use coin_selection::DefaultCoinSelectionAlgorithm; use coin_selection::DefaultCoinSelectionAlgorithm;
@@ -305,6 +306,8 @@ pub enum LoadError {
MissingNetwork, MissingNetwork,
/// Data loaded from persistence is missing genesis hash. /// Data loaded from persistence is missing genesis hash.
MissingGenesis, MissingGenesis,
/// Data loaded from persistence is missing descriptor.
MissingDescriptor,
} }
impl fmt::Display for LoadError { impl fmt::Display for LoadError {
@@ -317,6 +320,7 @@ impl fmt::Display for LoadError {
} }
LoadError::MissingNetwork => write!(f, "loaded data is missing network type"), LoadError::MissingNetwork => write!(f, "loaded data is missing network type"),
LoadError::MissingGenesis => write!(f, "loaded data is missing genesis hash"), LoadError::MissingGenesis => write!(f, "loaded data is missing genesis hash"),
LoadError::MissingDescriptor => write!(f, "loaded data is missing descriptor"),
} }
} }
} }
@@ -352,6 +356,13 @@ pub enum NewOrLoadError {
/// The network type loaded from persistence. /// The network type loaded from persistence.
got: Option<Network>, got: Option<Network>,
}, },
/// The loaded desccriptor does not match what was provided.
LoadedDescriptorDoesNotMatch {
/// The descriptor loaded from persistence.
got: Option<ExtendedDescriptor>,
/// The keychain of the descriptor not matching
keychain: KeychainKind,
},
} }
impl fmt::Display for NewOrLoadError { impl fmt::Display for NewOrLoadError {
@@ -372,6 +383,13 @@ impl fmt::Display for NewOrLoadError {
NewOrLoadError::LoadedNetworkDoesNotMatch { expected, got } => { NewOrLoadError::LoadedNetworkDoesNotMatch { expected, got } => {
write!(f, "loaded network type is not {}, got {:?}", expected, got) write!(f, "loaded network type is not {}, got {:?}", expected, got)
} }
NewOrLoadError::LoadedDescriptorDoesNotMatch { got, keychain } => {
write!(
f,
"loaded descriptor is different from what was provided, got {:?} for keychain {:?}",
got, keychain
)
}
} }
} }
} }
@@ -499,21 +517,57 @@ impl Wallet {
} }
/// Load [`Wallet`] from the given persistence backend. /// Load [`Wallet`] from the given persistence backend.
pub fn load<E: IntoWalletDescriptor>( ///
descriptor: E, /// Note that the descriptor secret keys are not persisted to the db; this means that after
change_descriptor: Option<E>, /// calling this method the [`Wallet`] **won't** know the secret keys, and as such, won't be
/// able to sign transactions.
///
/// If you wish to use the wallet to sign transactions, you need to add the secret keys
/// manually to the [`Wallet`]:
///
/// ```rust,no_run
/// # use bdk::Wallet;
/// # use bdk::signer::{SignersContainer, SignerOrdering};
/// # use bdk::descriptor::Descriptor;
/// # use bitcoin::key::Secp256k1;
/// # use bdk::KeychainKind;
/// # use bdk_file_store::Store;
/// #
/// # fn main() -> Result<(), anyhow::Error> {
/// # let temp_dir = tempfile::tempdir().expect("must create tempdir");
/// # let file_path = temp_dir.path().join("store.db");
/// # let db: Store<bdk::wallet::ChangeSet> = Store::create_new(&[], &file_path).expect("must create db");
/// let secp = Secp256k1::new();
///
/// let (external_descriptor, external_keymap) = Descriptor::parse_descriptor(&secp, "wpkh(tprv8ZgxMBicQKsPdy6LMhUtFHAgpocR8GC6QmwMSFpZs7h6Eziw3SpThFfczTDh5rW2krkqffa11UpX3XkeTTB2FvzZKWXqPY54Y6Rq4AQ5R8L/84'/1'/0'/0/*)").unwrap();
/// let (internal_descriptor, internal_keymap) = Descriptor::parse_descriptor(&secp, "wpkh(tprv8ZgxMBicQKsPdy6LMhUtFHAgpocR8GC6QmwMSFpZs7h6Eziw3SpThFfczTDh5rW2krkqffa11UpX3XkeTTB2FvzZKWXqPY54Y6Rq4AQ5R8L/84'/1'/0'/1/*)").unwrap();
///
/// let external_signer_container = SignersContainer::build(external_keymap, &external_descriptor, &secp);
/// let internal_signer_container = SignersContainer::build(internal_keymap, &internal_descriptor, &secp);
///
/// let mut wallet = Wallet::load(db)?;
///
/// external_signer_container.signers().into_iter()
/// .for_each(|s| wallet.add_signer(KeychainKind::External, SignerOrdering::default(), s.clone()));
/// internal_signer_container.signers().into_iter()
/// .for_each(|s| wallet.add_signer(KeychainKind::Internal, SignerOrdering::default(), s.clone()));
/// # Ok(())
/// # }
/// ```
///
/// Alternatively, you can call [`Wallet::new_or_load`], which will add the private keys of the
/// passed-in descriptors to the [`Wallet`].
pub fn load(
mut db: impl PersistBackend<ChangeSet> + Send + Sync + 'static,
) -> Result<Self, LoadError> {
let changeset = db
.load_from_persistence()
.map_err(LoadError::Persist)?
.ok_or(LoadError::NotInitialized)?;
Self::load_from_changeset(db, changeset)
}
fn load_from_changeset(
db: impl PersistBackend<ChangeSet> + Send + Sync + 'static,
changeset: ChangeSet,
) -> Result<Self, LoadError> {
@@ -522,10 +576,23 @@ impl Wallet {
let chain =
LocalChain::from_changeset(changeset.chain).map_err(|_| LoadError::MissingGenesis)?;
let mut index = KeychainTxOutIndex::<KeychainKind>::default();
let descriptor = changeset
.indexed_tx_graph
.indexer
.keychains_added
.get(&KeychainKind::External)
.ok_or(LoadError::MissingDescriptor)?
.clone();
let change_descriptor = changeset
.indexed_tx_graph
.indexer
.keychains_added
.get(&KeychainKind::Internal)
.cloned();
let (signers, change_signers) =
create_signers(&mut index, &secp, descriptor, change_descriptor, network)
.expect("Can't fail: we passed in valid descriptors, recovered from the changeset");
let mut indexed_graph = IndexedTxGraph::new(index);
indexed_graph.apply_changeset(changeset.indexed_tx_graph);
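The load path above recovers the descriptors from the persisted changeset's keychain map, erroring only when the external descriptor is absent while the change descriptor stays optional. A minimal std-only sketch of that required/optional lookup split (toy `LoadError` and string keys, not bdk's actual types):

```rust
use std::collections::BTreeMap;

#[derive(Debug, PartialEq)]
enum LoadError {
    MissingDescriptor,
}

// External descriptor is required; the change (Internal) one is optional.
fn recover(
    keychains_added: &BTreeMap<&'static str, String>,
) -> Result<(String, Option<String>), LoadError> {
    let descriptor = keychains_added
        .get("External")
        .ok_or(LoadError::MissingDescriptor)?
        .clone();
    let change_descriptor = keychains_added.get("Internal").cloned();
    Ok((descriptor, change_descriptor))
}

fn main() {
    let mut m = BTreeMap::new();
    m.insert("External", "wpkh(...)".to_string());
    let (d, c) = recover(&m).unwrap();
    assert_eq!(d, "wpkh(...)");
    assert!(c.is_none());
    // No external descriptor persisted => load fails.
    assert_eq!(recover(&BTreeMap::new()), Err(LoadError::MissingDescriptor));
}
```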
@@ -562,8 +629,8 @@ impl Wallet {
)
}
/// Either loads [`Wallet`] from persistence, or initializes it if it does not exist, using the
/// provided descriptor, change descriptor, network, and custom genesis hash.
///
/// This method will fail if the loaded [`Wallet`] has different parameters to those provided.
/// This is like [`Wallet::new_or_load`] with an additional `genesis_hash` parameter. This is
@@ -580,25 +647,23 @@ impl Wallet {
.map_err(NewOrLoadError::Persist)?;
match changeset {
Some(changeset) => {
let mut wallet = Self::load_from_changeset(db, changeset).map_err(|e| match e {
LoadError::Descriptor(e) => NewOrLoadError::Descriptor(e),
LoadError::Persist(e) => NewOrLoadError::Persist(e),
LoadError::NotInitialized => NewOrLoadError::NotInitialized,
LoadError::MissingNetwork => NewOrLoadError::LoadedNetworkDoesNotMatch {
expected: network,
got: None,
},
LoadError::MissingGenesis => NewOrLoadError::LoadedGenesisDoesNotMatch {
expected: genesis_hash,
got: None,
},
LoadError::MissingDescriptor => NewOrLoadError::LoadedDescriptorDoesNotMatch {
got: None,
keychain: KeychainKind::External,
},
})?;
if wallet.network != network {
return Err(NewOrLoadError::LoadedNetworkDoesNotMatch {
expected: network,
@@ -611,6 +676,73 @@ impl Wallet {
got: Some(wallet.chain.genesis_hash()),
});
}
let (expected_descriptor, expected_descriptor_keymap) = descriptor
.into_wallet_descriptor(&wallet.secp, network)
.map_err(NewOrLoadError::Descriptor)?;
let wallet_descriptor = wallet.public_descriptor(KeychainKind::External).cloned();
if wallet_descriptor != Some(expected_descriptor.clone()) {
return Err(NewOrLoadError::LoadedDescriptorDoesNotMatch {
got: wallet_descriptor,
keychain: KeychainKind::External,
});
}
// if expected descriptor has private keys add them as new signers
if !expected_descriptor_keymap.is_empty() {
let signer_container = SignersContainer::build(
expected_descriptor_keymap,
&expected_descriptor,
&wallet.secp,
);
signer_container.signers().into_iter().for_each(|signer| {
wallet.add_signer(
KeychainKind::External,
SignerOrdering::default(),
signer.clone(),
)
});
}
let expected_change_descriptor = if let Some(c) = change_descriptor {
Some(
c.into_wallet_descriptor(&wallet.secp, network)
.map_err(NewOrLoadError::Descriptor)?,
)
} else {
None
};
let wallet_change_descriptor =
wallet.public_descriptor(KeychainKind::Internal).cloned();
match (expected_change_descriptor, wallet_change_descriptor) {
(Some((expected_descriptor, expected_keymap)), Some(wallet_descriptor))
if wallet_descriptor == expected_descriptor =>
{
// if expected change descriptor has private keys add them as new signers
if !expected_keymap.is_empty() {
let signer_container = SignersContainer::build(
expected_keymap,
&expected_descriptor,
&wallet.secp,
);
signer_container.signers().into_iter().for_each(|signer| {
wallet.add_signer(
KeychainKind::Internal,
SignerOrdering::default(),
signer.clone(),
)
});
}
}
(None, None) => (),
(_, wallet_descriptor) => {
return Err(NewOrLoadError::LoadedDescriptorDoesNotMatch {
got: wallet_descriptor,
keychain: KeychainKind::Internal,
});
}
}
Ok(wallet)
}
None => Self::new_with_genesis_hash(
@@ -636,7 +768,7 @@ impl Wallet {
}
/// Iterator over all keychains in this wallet
pub fn keychains(&self) -> impl Iterator<Item = (&KeychainKind, &ExtendedDescriptor)> {
self.indexed_graph.index.keychains()
}
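Since `keychains()` now yields an iterator of pairs rather than a `&BTreeMap`, callers that need map-style lookup must collect it themselves (as `create_tx` does later in this diff). A std-only sketch of that pattern, with a toy keychain enum and string descriptors standing in for bdk's types:

```rust
use std::collections::BTreeMap;

#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum KeychainKind {
    External,
    Internal,
}

// Collect an iterator of (keychain, descriptor) pairs back into a map,
// regaining keyed lookup that the old `&BTreeMap` return type gave for free.
fn to_map<I>(iter: I) -> BTreeMap<KeychainKind, String>
where
    I: IntoIterator<Item = (KeychainKind, String)>,
{
    iter.into_iter().collect()
}

fn main() {
    let pairs = vec![
        (KeychainKind::External, "wpkh(xpub.../0/*)".to_string()),
        (KeychainKind::Internal, "wpkh(xpub.../1/*)".to_string()),
    ];
    let map = to_map(pairs);
    assert!(map.contains_key(&KeychainKind::External));
    assert_eq!(map.len(), 2);
}
```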
@@ -650,7 +782,11 @@ impl Wallet {
/// [BIP32](https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki) max index.
pub fn peek_address(&self, keychain: KeychainKind, mut index: u32) -> AddressInfo {
let keychain = self.map_keychain(keychain);
let mut spk_iter = self
.indexed_graph
.index
.unbounded_spk_iter(&keychain)
.expect("Must exist (we called map_keychain)");
if !spk_iter.descriptor().has_wildcard() {
index = 0;
}
@@ -677,7 +813,11 @@ impl Wallet {
/// If writing to persistent storage fails.
pub fn reveal_next_address(&mut self, keychain: KeychainKind) -> anyhow::Result<AddressInfo> {
let keychain = self.map_keychain(keychain);
let ((index, spk), index_changeset) = self
.indexed_graph
.index
.reveal_next_spk(&keychain)
.expect("Must exist (we called map_keychain)");
self.persist
.stage_and_commit(indexed_tx_graph::ChangeSet::from(index_changeset).into())?;
@@ -705,8 +845,11 @@ impl Wallet {
index: u32,
) -> anyhow::Result<impl Iterator<Item = AddressInfo> + '_> {
let keychain = self.map_keychain(keychain);
let (spk_iter, index_changeset) = self
.indexed_graph
.index
.reveal_to_target(&keychain, index)
.expect("must exist (we called map_keychain)");
self.persist
.stage_and_commit(indexed_tx_graph::ChangeSet::from(index_changeset).into())?;
@@ -729,7 +872,11 @@ impl Wallet {
/// If writing to persistent storage fails.
pub fn next_unused_address(&mut self, keychain: KeychainKind) -> anyhow::Result<AddressInfo> {
let keychain = self.map_keychain(keychain);
let ((index, spk), index_changeset) = self
.indexed_graph
.index
.next_unused_spk(&keychain)
.expect("must exist (we called map_keychain)");
self.persist
.stage_and_commit(indexed_tx_graph::ChangeSet::from(index_changeset).into())?;
@@ -799,7 +946,7 @@ impl Wallet {
.filter_chain_unspents(
&self.chain,
self.chain.tip().block_id(),
self.indexed_graph.index.outpoints(),
)
.map(|((k, i), full_txo)| new_local_utxo(k, i, full_txo))
}
@@ -813,7 +960,7 @@ impl Wallet {
.filter_chain_txouts(
&self.chain,
self.chain.tip().block_id(),
self.indexed_graph.index.outpoints(),
)
.map(|((k, i), full_txo)| new_local_utxo(k, i, full_txo))
}
@@ -851,7 +998,11 @@ impl Wallet {
&self,
keychain: KeychainKind,
) -> impl Iterator<Item = (u32, ScriptBuf)> + Clone {
let keychain = self.map_keychain(keychain);
self.indexed_graph
.index
.unbounded_spk_iter(&keychain)
.expect("Must exist (we called map_keychain)")
}
/// Returns the utxo owned by this wallet corresponding to `outpoint` if it exists in the
@@ -950,10 +1101,10 @@ impl Wallet {
/// [`insert_txout`]: Self::insert_txout
pub fn calculate_fee_rate(&self, tx: &Transaction) -> Result<FeeRate, CalculateFeeError> {
self.calculate_fee(tx)
.map(|fee| Amount::from_sat(fee) / tx.weight())
}
/// Compute the `tx`'s sent and received [`Amount`]s.
///
/// This method returns a tuple `(sent, received)`. Sent is the sum of the txin amounts
/// that spend from previous txouts tracked by this wallet. Received is the summation
@@ -978,7 +1129,7 @@ impl Wallet {
/// let tx = &psbt.clone().extract_tx().expect("tx");
/// let (sent, received) = wallet.sent_and_received(tx);
/// ```
pub fn sent_and_received(&self, tx: &Transaction) -> (Amount, Amount) {
self.indexed_graph.index.sent_and_received(tx, ..)
}
@@ -1133,7 +1284,7 @@ impl Wallet {
self.indexed_graph.graph().balance(
&self.chain,
self.chain.tip().block_id(),
self.indexed_graph.index.outpoints(),
|&(k, _), _| k == KeychainKind::Internal,
)
}
@@ -1197,7 +1348,7 @@ impl Wallet {
/// let psbt = {
/// let mut builder = wallet.build_tx();
/// builder
/// .add_recipient(to_address.script_pubkey(), Amount::from_sat(50_000));
/// builder.finish()?
/// };
///
@@ -1220,17 +1371,9 @@ impl Wallet {
coin_selection: Cs,
params: TxParams,
) -> Result<Psbt, CreateTxError> {
let keychains: BTreeMap<_, _> = self.indexed_graph.index.keychains().collect();
let external_descriptor = keychains.get(&KeychainKind::External).expect("must exist");
let internal_descriptor = keychains.get(&KeychainKind::Internal);
let external_policy = external_descriptor
.extract_policy(&self.signers, BuildSatisfaction::None, &self.secp)?
@@ -1464,8 +1607,11 @@ impl Wallet {
Some(ref drain_recipient) => drain_recipient.clone(),
None => {
let change_keychain = self.map_keychain(KeychainKind::Internal);
let ((index, spk), index_changeset) = self
.indexed_graph
.index
.next_unused_spk(&change_keychain)
.expect("Keychain exists (we called map_keychain)");
let spk = spk.into();
self.indexed_graph.index.mark_used(change_keychain, index);
self.persist
@@ -1579,7 +1725,7 @@ impl Wallet {
/// let mut psbt = {
/// let mut builder = wallet.build_tx();
/// builder
/// .add_recipient(to_address.script_pubkey(), Amount::from_sat(50_000))
/// .enable_rbf();
/// builder.finish()?
/// };
@@ -1752,7 +1898,7 @@ impl Wallet {
/// # let to_address = Address::from_str("2N4eQYCbKUHCCTUjBJeHcJp9ok6J2GZsTDt").unwrap().assume_checked();
/// let mut psbt = {
/// let mut builder = wallet.build_tx();
/// builder.add_recipient(to_address.script_pubkey(), Amount::from_sat(50_000));
/// builder.finish()?
/// };
/// let finalized = wallet.sign(&mut psbt, SignOptions::default())?;
@@ -1825,7 +1971,11 @@ impl Wallet {
///
/// This can be used to build a watch-only version of a wallet
pub fn public_descriptor(&self, keychain: KeychainKind) -> Option<&ExtendedDescriptor> {
self.indexed_graph
.index
.keychains()
.find(|(k, _)| *k == &keychain)
.map(|(_, d)| d)
}
/// Finalize a PSBT, i.e., for each input determine if sufficient data is available to pass
@@ -1876,17 +2026,9 @@ impl Wallet {
.get_utxo_for(n)
.and_then(|txout| self.get_descriptor_for_txout(&txout))
.or_else(|| {
self.indexed_graph.index.keychains().find_map(|(_, desc)| {
desc.derive_from_psbt_input(psbt_input, psbt.get_utxo_for(n), &self.secp)
})
});
match desc {
@@ -1952,7 +2094,12 @@ impl Wallet {
/// The index of the next address that you would get if you were to ask the wallet for a new address
pub fn next_derivation_index(&self, keychain: KeychainKind) -> u32 {
let keychain = self.map_keychain(keychain);
self.indexed_graph
.index
.next_index(&keychain)
.expect("Keychain must exist (we called map_keychain)")
.0
}
/// Informs the wallet that you no longer intend to broadcast a tx that was built from it.
@@ -2119,7 +2266,6 @@ impl Wallet {
if params.add_global_xpubs {
let all_xpubs = self
.keychains()
.flat_map(|(_, desc)| desc.get_extended_keys())
.collect::<Vec<_>>();
@@ -2419,6 +2565,7 @@ impl Wallet {
/// start a blockchain sync with a spk based blockchain client.
pub fn start_sync_with_revealed_spks(&self) -> SyncRequest {
SyncRequest::from_chain_tip(self.chain.tip())
.cache_graph_txs(self.tx_graph())
.populate_with_revealed_spks(&self.indexed_graph.index, ..)
}
@@ -2432,6 +2579,7 @@ impl Wallet {
/// in which the list of used scripts is not known.
pub fn start_full_scan(&self) -> FullScanRequest<KeychainKind> {
FullScanRequest::from_keychain_txout_index(self.chain.tip(), &self.indexed_graph.index)
.cache_graph_txs(self.tx_graph())
}
}
@@ -2496,13 +2644,13 @@ fn create_signers<E: IntoWalletDescriptor>(
) -> Result<(Arc<SignersContainer>, Arc<SignersContainer>), crate::descriptor::error::Error> {
let (descriptor, keymap) = into_wallet_descriptor_checked(descriptor, secp, network)?;
let signers = Arc::new(SignersContainer::build(keymap, &descriptor, secp));
let _ = index.insert_descriptor(KeychainKind::External, descriptor);
let change_signers = match change_descriptor {
Some(descriptor) => {
let (descriptor, keymap) = into_wallet_descriptor_checked(descriptor, secp, network)?;
let signers = Arc::new(SignersContainer::build(keymap, &descriptor, secp));
let _ = index.insert_descriptor(KeychainKind::Internal, descriptor);
signers
}
None => Arc::new(SignersContainer::new()),


@@ -29,7 +29,7 @@
//!
//! tx_builder
//! // Create a transaction with one output to `to_address` of 50_000 satoshi
//! .add_recipient(to_address.script_pubkey(), Amount::from_sat(50_000))
//! // With a custom fee rate of 5.0 satoshi/vbyte
//! .fee_rate(FeeRate::from_sat_per_vb(5).expect("valid feerate"))
//! // Only spend non-change outputs
@@ -47,7 +47,7 @@ use core::marker::PhantomData;
use bitcoin::psbt::{self, Psbt};
use bitcoin::script::PushBytes;
use bitcoin::{absolute, Amount, FeeRate, OutPoint, ScriptBuf, Sequence, Transaction, Txid};
use super::coin_selection::{CoinSelectionAlgorithm, DefaultCoinSelectionAlgorithm};
use super::{CreateTxError, Wallet};
@@ -94,8 +94,8 @@ impl TxBuilderContext for BumpFee {}
/// let mut builder = wallet.build_tx();
/// builder
/// .ordering(TxOrdering::Untouched)
/// .add_recipient(addr1.script_pubkey(), Amount::from_sat(50_000))
/// .add_recipient(addr2.script_pubkey(), Amount::from_sat(50_000));
/// builder.finish()?
/// };
///
@@ -104,7 +104,7 @@ impl TxBuilderContext for BumpFee {}
/// let mut builder = wallet.build_tx();
/// builder.ordering(TxOrdering::Untouched);
/// for addr in &[addr1, addr2] {
/// builder.add_recipient(addr.script_pubkey(), Amount::from_sat(50_000));
/// }
/// builder.finish()?
/// };
@@ -274,7 +274,7 @@ impl<'a, Cs, Ctx> TxBuilder<'a, Cs, Ctx> {
///
/// let builder = wallet
/// .build_tx()
/// .add_recipient(to_address.script_pubkey(), Amount::from_sat(50_000))
/// .policy_path(path, KeychainKind::External);
///
/// # Ok::<(), anyhow::Error>(())
@@ -713,21 +713,26 @@ impl std::error::Error for AllowShrinkingError {}
impl<'a, Cs: CoinSelectionAlgorithm> TxBuilder<'a, Cs, CreateTx> {
/// Replace the recipients already added with a new list
pub fn set_recipients(&mut self, recipients: Vec<(ScriptBuf, Amount)>) -> &mut Self {
self.params.recipients = recipients
.into_iter()
.map(|(script, amount)| (script, amount.to_sat()))
.collect();
self
}
/// Add a recipient to the internal list
pub fn add_recipient(&mut self, script_pubkey: ScriptBuf, amount: Amount) -> &mut Self {
self.params
.recipients
.push((script_pubkey, amount.to_sat()));
self
}
/// Add data as an output, using OP_RETURN
pub fn add_data<T: AsRef<PushBytes>>(&mut self, data: &T) -> &mut Self {
let script = ScriptBuf::new_op_return(data);
self.add_recipient(script, Amount::ZERO);
self
}
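The builder API now takes `Amount` at its public boundary while the internal `TxParams` still stores raw sats via `to_sat()`. A minimal std-only sketch of that boundary pattern, with toy `Amount`, `TxParams`, and `TxBuilder` types standing in for the real ones:

```rust
// Hypothetical satoshi newtype at the API surface.
#[derive(Debug, Clone, Copy)]
struct Amount(u64);

impl Amount {
    const ZERO: Amount = Amount(0);
    fn from_sat(sat: u64) -> Self {
        Amount(sat)
    }
    fn to_sat(self) -> u64 {
        self.0
    }
}

#[derive(Default)]
struct TxParams {
    // (script, sats): storage stays u64, so persistence is unchanged.
    recipients: Vec<(String, u64)>,
}

struct TxBuilder {
    params: TxParams,
}

impl TxBuilder {
    // Callers pass a typed Amount; the builder unwraps it at the boundary.
    fn add_recipient(&mut self, spk: String, amount: Amount) -> &mut Self {
        self.params.recipients.push((spk, amount.to_sat()));
        self
    }
}

fn main() {
    let mut b = TxBuilder { params: TxParams::default() };
    b.add_recipient("spk0".into(), Amount::from_sat(50_000));
    b.add_recipient("spk1".into(), Amount::ZERO);
    assert_eq!(b.params.recipients[0].1, 50_000);
    assert_eq!(b.params.recipients.len(), 2);
}
```

Keeping `u64` in `TxParams` lets the typed API land without touching the coin-selection internals in the same change.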


@@ -14,7 +14,7 @@ fn test_psbt_malformed_psbt_input_legacy() {
let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
let send_to = wallet.peek_address(KeychainKind::External, 0);
let mut builder = wallet.build_tx();
builder.add_recipient(send_to.script_pubkey(), Amount::from_sat(10_000));
let mut psbt = builder.finish().unwrap();
psbt.inputs.push(psbt_bip.inputs[0].clone());
let options = SignOptions {
@@ -31,7 +31,7 @@ fn test_psbt_malformed_psbt_input_segwit() {
let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
let send_to = wallet.peek_address(KeychainKind::External, 0);
let mut builder = wallet.build_tx();
builder.add_recipient(send_to.script_pubkey(), Amount::from_sat(10_000));
let mut psbt = builder.finish().unwrap();
psbt.inputs.push(psbt_bip.inputs[1].clone());
let options = SignOptions {
@@ -47,7 +47,7 @@ fn test_psbt_malformed_tx_input() {
let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
let send_to = wallet.peek_address(KeychainKind::External, 0);
let mut builder = wallet.build_tx();
builder.add_recipient(send_to.script_pubkey(), Amount::from_sat(10_000));
let mut psbt = builder.finish().unwrap();
psbt.unsigned_tx.input.push(TxIn::default());
let options = SignOptions {
@@ -63,7 +63,7 @@ fn test_psbt_sign_with_finalized() {
let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
let send_to = wallet.peek_address(KeychainKind::External, 0);
let mut builder = wallet.build_tx();
builder.add_recipient(send_to.script_pubkey(), Amount::from_sat(10_000));
let mut psbt = builder.finish().unwrap();
// add a finalized input
@@ -201,7 +201,7 @@ fn test_psbt_multiple_internalkey_signers() {
// the prevout we're spending
let prevouts = &[TxOut {
script_pubkey: send_to.script_pubkey(),
value: to_spend,
}];
let prevouts = Prevouts::All(prevouts);
let input_index = 0;


@@ -1,7 +1,7 @@
use std::str::FromStr;
use assert_matches::assert_matches;
use bdk::descriptor::{calc_checksum, IntoWalletDescriptor};
use bdk::psbt::PsbtUtils;
use bdk::signer::{SignOptions, SignerError};
use bdk::wallet::coin_selection::{self, LargestFirstCoinSelection};
@@ -10,9 +10,11 @@ use bdk::wallet::tx_builder::AddForeignUtxoError;
use bdk::wallet::NewError;
use bdk::wallet::{AddressInfo, Balance, Wallet};
use bdk::KeychainKind;
use bdk_chain::collections::BTreeMap;
use bdk_chain::COINBASE_MATURITY;
use bdk_chain::{BlockId, ConfirmationTime};
use bitcoin::hashes::Hash;
use bitcoin::key::Secp256k1;
use bitcoin::psbt;
use bitcoin::script::PushBytesBuf;
use bitcoin::sighash::{EcdsaSighashType, TapSighashType};
@@ -84,14 +86,24 @@ fn load_recovers_wallet() {
// recover wallet
{
let db = bdk_file_store::Store::open(DB_MAGIC, &file_path).expect("must recover db");
let wallet = Wallet::load(db).expect("must recover wallet");
assert_eq!(wallet.network(), Network::Testnet);
assert_eq!(
wallet.spk_index().keychains().collect::<Vec<_>>(),
wallet_spk_index.keychains().collect::<Vec<_>>()
);
assert_eq!(
wallet.spk_index().last_revealed_indices(),
wallet_spk_index.last_revealed_indices()
);
let secp = Secp256k1::new();
assert_eq!(
*wallet.get_descriptor_for_keychain(KeychainKind::External),
get_test_tr_single_sig_xprv()
.into_wallet_descriptor(&secp, wallet.network())
.unwrap()
.0
);
} }
// `new` can only be called on empty db // `new` can only be called on empty db
@@ -108,12 +120,12 @@ fn new_or_load() {
     let file_path = temp_dir.path().join("store.db");
     // init wallet when non-existent
-    let wallet_keychains = {
+    let wallet_keychains: BTreeMap<_, _> = {
         let db = bdk_file_store::Store::open_or_create_new(DB_MAGIC, &file_path)
             .expect("must create db");
         let wallet = Wallet::new_or_load(get_test_wpkh(), None, db, Network::Testnet)
             .expect("must init wallet");
-        wallet.keychains().clone()
+        wallet.keychains().map(|(k, v)| (*k, v.clone())).collect()
     };
     // wrong network
@@ -162,6 +174,49 @@ fn new_or_load() {
         );
     }
+    // wrong external descriptor
+    {
+        let exp_descriptor = get_test_tr_single_sig();
+        let got_descriptor = get_test_wpkh()
+            .into_wallet_descriptor(&Secp256k1::new(), Network::Testnet)
+            .unwrap()
+            .0;
+        let db =
+            bdk_file_store::Store::open_or_create_new(DB_MAGIC, &file_path).expect("must open db");
+        let err = Wallet::new_or_load(exp_descriptor, None, db, Network::Testnet)
+            .expect_err("wrong external descriptor");
+        assert!(
+            matches!(
+                err,
+                bdk::wallet::NewOrLoadError::LoadedDescriptorDoesNotMatch { ref got, keychain }
+                if got == &Some(got_descriptor) && keychain == KeychainKind::External
+            ),
+            "err: {}",
+            err,
+        );
+    }
+    // wrong internal descriptor
+    {
+        let exp_descriptor = Some(get_test_tr_single_sig());
+        let got_descriptor = None;
+        let db =
+            bdk_file_store::Store::open_or_create_new(DB_MAGIC, &file_path).expect("must open db");
+        let err = Wallet::new_or_load(get_test_wpkh(), exp_descriptor, db, Network::Testnet)
+            .expect_err("wrong internal descriptor");
+        assert!(
+            matches!(
+                err,
+                bdk::wallet::NewOrLoadError::LoadedDescriptorDoesNotMatch { ref got, keychain }
+                if got == &got_descriptor && keychain == KeychainKind::Internal
+            ),
+            "err: {}",
+            err,
+        );
+    }
     // all parameters match
     {
         let db =
@@ -169,7 +224,10 @@ fn new_or_load() {
         let wallet = Wallet::new_or_load(get_test_wpkh(), None, db, Network::Testnet)
             .expect("must recover wallet");
         assert_eq!(wallet.network(), Network::Testnet);
-        assert_eq!(wallet.keychains(), &wallet_keychains);
+        assert!(wallet
+            .keychains()
+            .map(|(k, v)| (*k, v.clone()))
+            .eq(wallet_keychains));
     }
 }
@@ -181,7 +239,6 @@ fn test_descriptor_checksum() {
     let raw_descriptor = wallet
         .keychains()
-        .iter()
         .next()
         .unwrap()
         .1
@@ -200,14 +257,14 @@ fn test_get_funded_wallet_balance() {
     // The funded wallet contains a tx with a 76_000 sats input and two outputs, one spending 25_000
     // to a foreign address and one returning 50_000 back to the wallet as change. The remaining 1000
     // sats are the transaction fee.
-    assert_eq!(wallet.get_balance().confirmed, 50_000);
+    assert_eq!(wallet.get_balance().confirmed, Amount::from_sat(50_000));
 }

 #[test]
 fn test_get_funded_wallet_sent_and_received() {
     let (wallet, txid) = get_funded_wallet(get_test_wpkh());
-    let mut tx_amounts: Vec<(Txid, (u64, u64))> = wallet
+    let mut tx_amounts: Vec<(Txid, (Amount, Amount))> = wallet
         .transactions()
         .map(|ct| (ct.tx_node.txid, wallet.sent_and_received(&ct.tx_node)))
         .collect();
@@ -219,8 +276,8 @@ fn test_get_funded_wallet_sent_and_received() {
     // The funded wallet contains a tx with a 76_000 sats input and two outputs, one spending 25_000
     // to a foreign address and one returning 50_000 back to the wallet as change. The remaining 1000
     // sats are the transaction fee.
-    assert_eq!(sent, 76_000);
-    assert_eq!(received, 50_000);
+    assert_eq!(sent.to_sat(), 76_000);
+    assert_eq!(received.to_sat(), 50_000);
 }

 #[test]
@@ -347,7 +404,7 @@ fn test_create_tx_manually_selected_empty_utxos() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .manually_selected_only();
     builder.finish().unwrap();
 }
@@ -358,7 +415,7 @@ fn test_create_tx_version_0() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .version(0);
     assert!(matches!(builder.finish(), Err(CreateTxError::Version0)));
 }
@@ -369,7 +426,7 @@ fn test_create_tx_version_1_csv() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .version(1);
     assert!(matches!(builder.finish(), Err(CreateTxError::Version1Csv)));
 }
@@ -380,7 +437,7 @@ fn test_create_tx_custom_version() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .version(42);
     let psbt = builder.finish().unwrap();
@@ -393,7 +450,7 @@ fn test_create_tx_default_locktime_is_last_sync_height() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     // Since we never synced the wallet we don't have a last_sync_height
@@ -406,7 +463,7 @@ fn test_create_tx_fee_sniping_locktime_last_sync() {
     let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
@@ -422,7 +479,7 @@ fn test_create_tx_default_locktime_cltv() {
     let (mut wallet, _) = get_funded_wallet(get_test_single_sig_cltv());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     assert_eq!(psbt.unsigned_tx.lock_time.to_consensus_u32(), 100_000);
@@ -434,7 +491,7 @@ fn test_create_tx_custom_locktime() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .current_height(630_001)
         .nlocktime(absolute::LockTime::from_height(630_000).unwrap());
     let psbt = builder.finish().unwrap();
@@ -451,7 +508,7 @@ fn test_create_tx_custom_locktime_compatible_with_cltv() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .nlocktime(absolute::LockTime::from_height(630_000).unwrap());
     let psbt = builder.finish().unwrap();
@@ -464,7 +521,7 @@ fn test_create_tx_custom_locktime_incompatible_with_cltv() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .nlocktime(absolute::LockTime::from_height(50000).unwrap());
     assert!(matches!(builder.finish(),
         Err(CreateTxError::LockTime { requested, required })
@@ -476,7 +533,7 @@ fn test_create_tx_no_rbf_csv() {
     let (mut wallet, _) = get_funded_wallet(get_test_single_sig_csv());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     assert_eq!(psbt.unsigned_tx.input[0].sequence, Sequence(6));
@@ -488,7 +545,7 @@ fn test_create_tx_with_default_rbf_csv() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf();
     let psbt = builder.finish().unwrap();
     // When CSV is enabled it takes precedence over the rbf value (unless forced by the user).
@@ -502,7 +559,7 @@ fn test_create_tx_with_custom_rbf_csv() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf_with_sequence(Sequence(3));
     assert!(matches!(builder.finish(),
         Err(CreateTxError::RbfSequenceCsv { rbf, csv })
@@ -514,7 +571,7 @@ fn test_create_tx_no_rbf_cltv() {
     let (mut wallet, _) = get_funded_wallet(get_test_single_sig_cltv());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     assert_eq!(psbt.unsigned_tx.input[0].sequence, Sequence(0xFFFFFFFE));
@@ -526,7 +583,7 @@ fn test_create_tx_invalid_rbf_sequence() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf_with_sequence(Sequence(0xFFFFFFFE));
     assert!(matches!(builder.finish(), Err(CreateTxError::RbfSequence)));
 }
@@ -537,7 +594,7 @@ fn test_create_tx_custom_rbf_sequence() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf_with_sequence(Sequence(0xDEADBEEF));
     let psbt = builder.finish().unwrap();
@@ -549,7 +606,7 @@ fn test_create_tx_default_sequence() {
     let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     assert_eq!(psbt.unsigned_tx.input[0].sequence, Sequence(0xFFFFFFFE));
@@ -561,7 +618,7 @@ fn test_create_tx_change_policy_no_internal() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .do_not_spend_change();
     assert!(matches!(
         builder.finish(),
@@ -603,7 +660,7 @@ fn test_create_tx_drain_wallet_and_drain_to_and_with_recipient() {
     let drain_addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 20_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(20_000))
         .drain_to(drain_addr.script_pubkey())
         .drain_wallet();
     let psbt = builder.finish().unwrap();
@@ -658,7 +715,7 @@ fn test_create_tx_default_fee_rate() {
     let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     let fee = check_fee!(wallet, psbt);
@@ -671,7 +728,7 @@ fn test_create_tx_custom_fee_rate() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .fee_rate(FeeRate::from_sat_per_vb_unchecked(5));
     let psbt = builder.finish().unwrap();
     let fee = check_fee!(wallet, psbt);
@@ -740,7 +797,7 @@ fn test_create_tx_add_change() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .ordering(TxOrdering::Untouched);
     let psbt = builder.finish().unwrap();
     let fee = check_fee!(wallet, psbt);
@@ -758,7 +815,7 @@ fn test_create_tx_skip_change_dust() {
     let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 49_800);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(49_800));
     let psbt = builder.finish().unwrap();
     let fee = check_fee!(wallet, psbt);
@@ -787,8 +844,8 @@ fn test_create_tx_ordering_respected() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 30_000)
-        .add_recipient(addr.script_pubkey(), 10_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(30_000))
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(10_000))
         .ordering(bdk::wallet::tx_builder::TxOrdering::Bip69Lexicographic);
     let psbt = builder.finish().unwrap();
     let fee = check_fee!(wallet, psbt);
@@ -807,7 +864,7 @@ fn test_create_tx_default_sighash() {
     let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 30_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(30_000));
     let psbt = builder.finish().unwrap();
     assert_eq!(psbt.inputs[0].sighash_type, None);
@@ -819,7 +876,7 @@ fn test_create_tx_custom_sighash() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 30_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(30_000))
         .sighash(EcdsaSighashType::Single.into());
     let psbt = builder.finish().unwrap();
@@ -1018,7 +1075,7 @@ fn test_create_tx_add_utxo() {
         .assume_checked();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 30_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(30_000))
         .add_utxo(OutPoint {
             txid: small_output_tx.txid(),
             vout: 0,
@@ -1034,7 +1091,8 @@ fn test_create_tx_add_utxo() {
         "should add an additional input since 25_000 < 30_000"
     );
     assert_eq!(
-        sent_received.0, 75_000,
+        sent_received.0,
+        Amount::from_sat(75_000),
         "total should be sum of both inputs"
     );
 }
@@ -1068,7 +1126,7 @@ fn test_create_tx_manually_selected_insufficient() {
        .assume_checked();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 30_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(30_000))
         .add_utxo(OutPoint {
             txid: small_output_tx.txid(),
             vout: 0,
@@ -1087,7 +1145,7 @@ fn test_create_tx_policy_path_required() {
         .unwrap()
         .assume_checked();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 30_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(10_000));
     builder.finish().unwrap();
 }
@@ -1122,7 +1180,7 @@ fn test_create_tx_policy_path_no_csv() {
         .assume_checked();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 30_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(30_000))
         .policy_path(path, KeychainKind::External);
     let psbt = builder.finish().unwrap();
@@ -1143,7 +1201,7 @@ fn test_create_tx_policy_path_use_csv() {
         .assume_checked();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 30_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(30_000))
         .policy_path(path, KeychainKind::External);
     let psbt = builder.finish().unwrap();
@@ -1164,7 +1222,7 @@ fn test_create_tx_policy_path_ignored_subtree_with_csv() {
         .assume_checked();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 30_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(30_000))
         .policy_path(path, KeychainKind::External);
     let psbt = builder.finish().unwrap();
@@ -1180,7 +1238,7 @@ fn test_create_tx_global_xpubs_with_origin() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .add_global_xpubs();
     let psbt = builder.finish().unwrap();
@@ -1214,7 +1272,7 @@ fn test_add_foreign_utxo() {
     let mut builder = wallet1.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 60_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(60_000))
         .only_witness_utxo()
         .add_foreign_utxo(utxo.outpoint, psbt_input, foreign_utxo_satisfaction)
         .unwrap();
@@ -1225,7 +1283,7 @@ fn test_add_foreign_utxo() {
         wallet1.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx"));
     assert_eq!(
-        sent_received.0 - sent_received.1,
+        (sent_received.0 - sent_received.1).to_sat(),
         10_000 + fee.unwrap_or(0),
         "we should have only net spent ~10_000"
     );
@@ -1290,7 +1348,7 @@ fn test_calculate_fee_with_missing_foreign_utxo() {
     let mut builder = wallet1.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 60_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(60_000))
         .only_witness_utxo()
         .add_foreign_utxo(utxo.outpoint, psbt_input, foreign_utxo_satisfaction)
         .unwrap();
@@ -1374,7 +1432,7 @@ fn test_add_foreign_utxo_only_witness_utxo() {
         .unwrap();
     let mut builder = wallet1.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 60_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(60_000));
     {
         let mut builder = builder.clone();
@@ -1443,7 +1501,7 @@ fn test_create_tx_global_xpubs_origin_missing() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .add_global_xpubs();
     builder.finish().unwrap();
 }
@@ -1457,7 +1515,7 @@ fn test_create_tx_global_xpubs_master_without_origin() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .add_global_xpubs();
     let psbt = builder.finish().unwrap();
@@ -1477,7 +1535,7 @@ fn test_bump_fee_irreplaceable_tx() {
     let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     let tx = psbt.extract_tx().expect("failed to extract tx");
@@ -1494,7 +1552,7 @@ fn test_bump_fee_confirmed_tx() {
     let (mut wallet, _) = get_funded_wallet(get_test_wpkh());
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
-    builder.add_recipient(addr.script_pubkey(), 25_000);
+    builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
     let psbt = builder.finish().unwrap();
     let tx = psbt.extract_tx().expect("failed to extract tx");
@@ -1519,7 +1577,7 @@ fn test_bump_fee_low_fee_rate() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf();
     let psbt = builder.finish().unwrap();
     let feerate = psbt.fee_rate().unwrap();
@@ -1553,7 +1611,7 @@ fn test_bump_fee_low_abs() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf();
     let psbt = builder.finish().unwrap();
@@ -1576,7 +1634,7 @@ fn test_bump_fee_zero_abs() {
     let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf();
     let psbt = builder.finish().unwrap();
@@ -1599,7 +1657,7 @@ fn test_bump_fee_reduce_change() {
         .assume_checked();
     let mut builder = wallet.build_tx();
     builder
-        .add_recipient(addr.script_pubkey(), 25_000)
+        .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
         .enable_rbf();
     let psbt = builder.finish().unwrap();
     let original_sent_received =
@@ -1622,8 +1680,8 @@ fn test_bump_fee_reduce_change() {
     assert_eq!(sent_received.0, original_sent_received.0);
     assert_eq!(
-        sent_received.1 + fee.unwrap_or(0),
-        original_sent_received.1 + original_fee.unwrap_or(0)
+        sent_received.1 + Amount::from_sat(fee.unwrap_or(0)),
+        original_sent_received.1 + Amount::from_sat(original_fee.unwrap_or(0))
     );
     assert!(fee.unwrap_or(0) > original_fee.unwrap_or(0));
@@ -1642,8 +1700,7 @@ fn test_bump_fee_reduce_change() {
             .iter()
             .find(|txout| txout.script_pubkey != addr.script_pubkey())
             .unwrap()
-            .value
-            .to_sat(),
+            .value,
         sent_received.1
     );
@@ -1659,8 +1716,8 @@ fn test_bump_fee_reduce_change() {
     assert_eq!(sent_received.0, original_sent_received.0);
     assert_eq!(
-        sent_received.1 + fee.unwrap_or(0),
-        original_sent_received.1 + original_fee.unwrap_or(0)
+        sent_received.1 + Amount::from_sat(fee.unwrap_or(0)),
+        original_sent_received.1 + Amount::from_sat(original_fee.unwrap_or(0))
     );
     assert!(
         fee.unwrap_or(0) > original_fee.unwrap_or(0),
@@ -1684,8 +1741,7 @@ fn test_bump_fee_reduce_change() {
             .iter()
             .find(|txout| txout.script_pubkey != addr.script_pubkey())
             .unwrap()
-            .value
-            .to_sat(),
+            .value,
         sent_received.1
     );
@@ -1729,7 +1785,7 @@ fn test_bump_fee_reduce_single_recipient() {
let tx = &psbt.unsigned_tx; let tx = &psbt.unsigned_tx;
assert_eq!(tx.output.len(), 1); assert_eq!(tx.output.len(), 1);
assert_eq!( assert_eq!(
tx.output[0].value.to_sat() + fee.unwrap_or(0), tx.output[0].value + Amount::from_sat(fee.unwrap_or(0)),
sent_received.0 sent_received.0
); );
@@ -1771,7 +1827,7 @@ fn test_bump_fee_absolute_reduce_single_recipient() {
assert_eq!(tx.output.len(), 1); assert_eq!(tx.output.len(), 1);
assert_eq!( assert_eq!(
tx.output[0].value.to_sat() + fee.unwrap_or(0), tx.output[0].value + Amount::from_sat(fee.unwrap_or(0)),
sent_received.0 sent_received.0
); );
@@ -1825,7 +1881,7 @@ fn test_bump_fee_drain_wallet() {
wallet wallet
.insert_tx(tx, ConfirmationTime::Unconfirmed { last_seen: 0 }) .insert_tx(tx, ConfirmationTime::Unconfirmed { last_seen: 0 })
.unwrap(); .unwrap();
assert_eq!(original_sent_received.0, 25_000); assert_eq!(original_sent_received.0, Amount::from_sat(25_000));
// for the new feerate, it should be enough to reduce the output, but since we specify // for the new feerate, it should be enough to reduce the output, but since we specify
// `drain_wallet` we expect to spend everything // `drain_wallet` we expect to spend everything
@@ -1838,7 +1894,7 @@ fn test_bump_fee_drain_wallet() {
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
let sent_received = wallet.sent_and_received(&psbt.extract_tx().expect("failed to extract tx")); let sent_received = wallet.sent_and_received(&psbt.extract_tx().expect("failed to extract tx"));
assert_eq!(sent_received.0, 75_000); assert_eq!(sent_received.0, Amount::from_sat(75_000));
} }
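The hunks above migrate raw `u64` satoshi values to `bitcoin::Amount` in assertions and builder calls. A minimal sketch of the newtype pattern behind that migration — the `Amount` here is a stand-in defined locally, not the real `bitcoin::Amount` (whose API is larger), but it mirrors the `from_sat`/`to_sat`/`ZERO` names and operator arithmetic used in the tests:

```rust
// Stand-in for `bitcoin::Amount`: a newtype over u64 satoshis so the
// compiler distinguishes amounts from other integers.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Amount(u64);

impl Amount {
    const ZERO: Amount = Amount(0);
    fn from_sat(sat: u64) -> Self {
        Amount(sat)
    }
    fn to_sat(self) -> u64 {
        self.0
    }
}

impl core::ops::Add for Amount {
    type Output = Amount;
    fn add(self, rhs: Amount) -> Amount {
        Amount(self.0 + rhs.0)
    }
}

impl core::ops::Sub for Amount {
    type Output = Amount;
    fn sub(self, rhs: Amount) -> Amount {
        Amount(self.0 - rhs.0)
    }
}

fn main() {
    // Assertions now compare typed amounts instead of bare u64 sats,
    // e.g. `sent + fee == total` as in the fee-bump tests above.
    let sent = Amount::from_sat(25_000);
    let fee = Amount::from_sat(5_000);
    assert_eq!(sent + fee, Amount::from_sat(30_000));
    assert_eq!((sent - fee).to_sat(), 20_000);
    assert_eq!(Amount::ZERO.to_sat(), 0);
}
```

The point of the churn in the diff is exactly this: once balances and outputs carry `Amount`, a stray `u64` (a fee, an index, a count) can no longer be added to a satoshi value without an explicit `Amount::from_sat` conversion.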
#[test] #[test]
@@ -1895,7 +1951,7 @@ fn test_bump_fee_remove_output_manually_selected_only() {
wallet wallet
.insert_tx(tx, ConfirmationTime::Unconfirmed { last_seen: 0 }) .insert_tx(tx, ConfirmationTime::Unconfirmed { last_seen: 0 })
.unwrap(); .unwrap();
assert_eq!(original_sent_received.0, 25_000); assert_eq!(original_sent_received.0, Amount::from_sat(25_000));
let mut builder = wallet.build_fee_bump(txid).unwrap(); let mut builder = wallet.build_fee_bump(txid).unwrap();
builder builder
@@ -1933,7 +1989,7 @@ fn test_bump_fee_add_input() {
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection); let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection);
builder builder
.add_recipient(addr.script_pubkey(), 45_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(45_000))
.enable_rbf(); .enable_rbf();
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
let tx = psbt.extract_tx().expect("failed to extract tx"); let tx = psbt.extract_tx().expect("failed to extract tx");
@@ -1949,8 +2005,14 @@ fn test_bump_fee_add_input() {
let sent_received = let sent_received =
wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx")); wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx"));
let fee = check_fee!(wallet, psbt); let fee = check_fee!(wallet, psbt);
assert_eq!(sent_received.0, original_details.0 + 25_000); assert_eq!(
assert_eq!(fee.unwrap_or(0) + sent_received.1, 30_000); sent_received.0,
original_details.0 + Amount::from_sat(25_000)
);
assert_eq!(
Amount::from_sat(fee.unwrap_or(0)) + sent_received.1,
Amount::from_sat(30_000)
);
let tx = &psbt.unsigned_tx; let tx = &psbt.unsigned_tx;
assert_eq!(tx.input.len(), 2); assert_eq!(tx.input.len(), 2);
@@ -1968,8 +2030,7 @@ fn test_bump_fee_add_input() {
.iter() .iter()
.find(|txout| txout.script_pubkey != addr.script_pubkey()) .find(|txout| txout.script_pubkey != addr.script_pubkey())
.unwrap() .unwrap()
.value .value,
.to_sat(),
sent_received.1 sent_received.1
); );
@@ -1985,7 +2046,7 @@ fn test_bump_fee_absolute_add_input() {
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection); let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection);
builder builder
.add_recipient(addr.script_pubkey(), 45_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(45_000))
.enable_rbf(); .enable_rbf();
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
let tx = psbt.extract_tx().expect("failed to extract tx"); let tx = psbt.extract_tx().expect("failed to extract tx");
@@ -2002,8 +2063,14 @@ fn test_bump_fee_absolute_add_input() {
wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx")); wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx"));
let fee = check_fee!(wallet, psbt); let fee = check_fee!(wallet, psbt);
assert_eq!(sent_received.0, original_sent_received.0 + 25_000); assert_eq!(
assert_eq!(fee.unwrap_or(0) + sent_received.1, 30_000); sent_received.0,
original_sent_received.0 + Amount::from_sat(25_000)
);
assert_eq!(
Amount::from_sat(fee.unwrap_or(0)) + sent_received.1,
Amount::from_sat(30_000)
);
let tx = &psbt.unsigned_tx; let tx = &psbt.unsigned_tx;
assert_eq!(tx.input.len(), 2); assert_eq!(tx.input.len(), 2);
@@ -2021,8 +2088,7 @@ fn test_bump_fee_absolute_add_input() {
.iter() .iter()
.find(|txout| txout.script_pubkey != addr.script_pubkey()) .find(|txout| txout.script_pubkey != addr.script_pubkey())
.unwrap() .unwrap()
.value .value,
.to_sat(),
sent_received.1 sent_received.1
); );
@@ -2065,11 +2131,15 @@ fn test_bump_fee_no_change_add_input_and_change() {
wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx")); wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx"));
let fee = check_fee!(wallet, psbt); let fee = check_fee!(wallet, psbt);
let original_send_all_amount = original_sent_received.0 - original_fee.unwrap_or(0); let original_send_all_amount =
assert_eq!(sent_received.0, original_sent_received.0 + 50_000); original_sent_received.0 - Amount::from_sat(original_fee.unwrap_or(0));
assert_eq!(
sent_received.0,
original_sent_received.0 + Amount::from_sat(50_000)
);
assert_eq!( assert_eq!(
sent_received.1, sent_received.1,
75_000 - original_send_all_amount - fee.unwrap_or(0) Amount::from_sat(75_000) - original_send_all_amount - Amount::from_sat(fee.unwrap_or(0))
); );
let tx = &psbt.unsigned_tx; let tx = &psbt.unsigned_tx;
@@ -2081,16 +2151,15 @@ fn test_bump_fee_no_change_add_input_and_change() {
.find(|txout| txout.script_pubkey == addr.script_pubkey()) .find(|txout| txout.script_pubkey == addr.script_pubkey())
.unwrap() .unwrap()
.value, .value,
Amount::from_sat(original_send_all_amount) original_send_all_amount
); );
assert_eq!( assert_eq!(
tx.output tx.output
.iter() .iter()
.find(|txout| txout.script_pubkey != addr.script_pubkey()) .find(|txout| txout.script_pubkey != addr.script_pubkey())
.unwrap() .unwrap()
.value .value,
.to_sat(), Amount::from_sat(75_000) - original_send_all_amount - Amount::from_sat(fee.unwrap_or(0))
75_000 - original_send_all_amount - fee.unwrap_or(0)
); );
assert_fee_rate!(psbt, fee.unwrap_or(0), FeeRate::from_sat_per_vb_unchecked(50), @add_signature); assert_fee_rate!(psbt, fee.unwrap_or(0), FeeRate::from_sat_per_vb_unchecked(50), @add_signature);
@@ -2105,7 +2174,7 @@ fn test_bump_fee_add_input_change_dust() {
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection); let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection);
builder builder
.add_recipient(addr.script_pubkey(), 45_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(45_000))
.enable_rbf(); .enable_rbf();
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
let original_sent_received = let original_sent_received =
@@ -2145,11 +2214,17 @@ fn test_bump_fee_add_input_change_dust() {
wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx")); wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx"));
let fee = check_fee!(wallet, psbt); let fee = check_fee!(wallet, psbt);
assert_eq!(original_sent_received.1, 5_000 - original_fee.unwrap_or(0)); assert_eq!(
original_sent_received.1,
Amount::from_sat(5_000 - original_fee.unwrap_or(0))
);
assert_eq!(sent_received.0, original_sent_received.0 + 25_000); assert_eq!(
sent_received.0,
original_sent_received.0 + Amount::from_sat(25_000)
);
assert_eq!(fee.unwrap_or(0), 30_000); assert_eq!(fee.unwrap_or(0), 30_000);
assert_eq!(sent_received.1, 0); assert_eq!(sent_received.1, Amount::ZERO);
let tx = &psbt.unsigned_tx; let tx = &psbt.unsigned_tx;
assert_eq!(tx.input.len(), 2); assert_eq!(tx.input.len(), 2);
@@ -2176,7 +2251,7 @@ fn test_bump_fee_force_add_input() {
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection); let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection);
builder builder
.add_recipient(addr.script_pubkey(), 45_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(45_000))
.enable_rbf(); .enable_rbf();
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
let mut tx = psbt.extract_tx().expect("failed to extract tx"); let mut tx = psbt.extract_tx().expect("failed to extract tx");
@@ -2200,8 +2275,14 @@ fn test_bump_fee_force_add_input() {
wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx")); wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx"));
let fee = check_fee!(wallet, psbt); let fee = check_fee!(wallet, psbt);
assert_eq!(sent_received.0, original_sent_received.0 + 25_000); assert_eq!(
assert_eq!(fee.unwrap_or(0) + sent_received.1, 30_000); sent_received.0,
original_sent_received.0 + Amount::from_sat(25_000)
);
assert_eq!(
Amount::from_sat(fee.unwrap_or(0)) + sent_received.1,
Amount::from_sat(30_000)
);
let tx = &psbt.unsigned_tx; let tx = &psbt.unsigned_tx;
assert_eq!(tx.input.len(), 2); assert_eq!(tx.input.len(), 2);
@@ -2219,8 +2300,7 @@ fn test_bump_fee_force_add_input() {
.iter() .iter()
.find(|txout| txout.script_pubkey != addr.script_pubkey()) .find(|txout| txout.script_pubkey != addr.script_pubkey())
.unwrap() .unwrap()
.value .value,
.to_sat(),
sent_received.1 sent_received.1
); );
@@ -2237,7 +2317,7 @@ fn test_bump_fee_absolute_force_add_input() {
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection); let mut builder = wallet.build_tx().coin_selection(LargestFirstCoinSelection);
builder builder
.add_recipient(addr.script_pubkey(), 45_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(45_000))
.enable_rbf(); .enable_rbf();
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
let mut tx = psbt.extract_tx().expect("failed to extract tx"); let mut tx = psbt.extract_tx().expect("failed to extract tx");
@@ -2260,8 +2340,14 @@ fn test_bump_fee_absolute_force_add_input() {
wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx")); wallet.sent_and_received(&psbt.clone().extract_tx().expect("failed to extract tx"));
let fee = check_fee!(wallet, psbt); let fee = check_fee!(wallet, psbt);
assert_eq!(sent_received.0, original_sent_received.0 + 25_000); assert_eq!(
assert_eq!(fee.unwrap_or(0) + sent_received.1, 30_000); sent_received.0,
original_sent_received.0 + Amount::from_sat(25_000)
);
assert_eq!(
Amount::from_sat(fee.unwrap_or(0)) + sent_received.1,
Amount::from_sat(30_000)
);
let tx = &psbt.unsigned_tx; let tx = &psbt.unsigned_tx;
assert_eq!(tx.input.len(), 2); assert_eq!(tx.input.len(), 2);
@@ -2279,8 +2365,7 @@ fn test_bump_fee_absolute_force_add_input() {
.iter() .iter()
.find(|txout| txout.script_pubkey != addr.script_pubkey()) .find(|txout| txout.script_pubkey != addr.script_pubkey())
.unwrap() .unwrap()
.value .value,
.to_sat(),
sent_received.1 sent_received.1
); );
@@ -2382,7 +2467,7 @@ fn test_fee_amount_negative_drain_val() {
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder builder
.add_recipient(send_to.script_pubkey(), 8630) .add_recipient(send_to.script_pubkey(), Amount::from_sat(8630))
.add_utxo(incoming_op) .add_utxo(incoming_op)
.unwrap() .unwrap()
.enable_rbf() .enable_rbf()
@@ -2496,7 +2581,7 @@ fn test_include_output_redeem_witness_script() {
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder builder
.add_recipient(addr.script_pubkey(), 45_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(45_000))
.include_output_redeem_witness_script(); .include_output_redeem_witness_script();
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
@@ -2515,7 +2600,7 @@ fn test_signing_only_one_of_multiple_inputs() {
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder builder
.add_recipient(addr.script_pubkey(), 45_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(45_000))
.include_output_redeem_witness_script(); .include_output_redeem_witness_script();
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
@@ -2860,7 +2945,7 @@ fn test_sending_to_bip350_bech32m_address() {
.unwrap() .unwrap()
.assume_checked(); .assume_checked();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 45_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(45_000));
builder.finish().unwrap(); builder.finish().unwrap();
} }
@@ -2993,7 +3078,7 @@ fn test_taproot_psbt_populate_tap_key_origins() {
let addr = wallet.reveal_next_address(KeychainKind::External).unwrap(); let addr = wallet.reveal_next_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
assert_eq!( assert_eq!(
@@ -3033,7 +3118,7 @@ fn test_taproot_psbt_populate_tap_key_origins_repeated_key() {
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder builder
.add_recipient(addr.script_pubkey(), 25_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(25_000))
.policy_path(path, KeychainKind::External); .policy_path(path, KeychainKind::External);
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
@@ -3217,7 +3302,7 @@ fn test_taproot_foreign_utxo() {
let mut builder = wallet1.build_tx(); let mut builder = wallet1.build_tx();
builder builder
.add_recipient(addr.script_pubkey(), 60_000) .add_recipient(addr.script_pubkey(), Amount::from_sat(60_000))
.add_foreign_utxo(utxo.outpoint, psbt_input, foreign_utxo_satisfaction) .add_foreign_utxo(utxo.outpoint, psbt_input, foreign_utxo_satisfaction)
.unwrap(); .unwrap();
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
@@ -3228,7 +3313,7 @@ fn test_taproot_foreign_utxo() {
assert_eq!( assert_eq!(
sent_received.0 - sent_received.1, sent_received.0 - sent_received.1,
10_000 + fee.unwrap_or(0), Amount::from_sat(10_000 + fee.unwrap_or(0)),
"we should have only net spent ~10_000" "we should have only net spent ~10_000"
); );
@@ -3245,7 +3330,7 @@ fn test_spend_from_wallet(mut wallet: Wallet) {
let addr = wallet.next_unused_address(KeychainKind::External).unwrap(); let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
assert!( assert!(
@@ -3269,7 +3354,7 @@ fn test_taproot_no_key_spend() {
let addr = wallet.next_unused_address(KeychainKind::External).unwrap(); let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
assert!( assert!(
@@ -3304,7 +3389,7 @@ fn test_taproot_script_spend_sign_all_leaves() {
let addr = wallet.next_unused_address(KeychainKind::External).unwrap(); let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
assert!( assert!(
@@ -3335,7 +3420,7 @@ fn test_taproot_script_spend_sign_include_some_leaves() {
let addr = wallet.next_unused_address(KeychainKind::External).unwrap(); let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
let mut script_leaves: Vec<_> = psbt.inputs[0] let mut script_leaves: Vec<_> = psbt.inputs[0]
.tap_scripts .tap_scripts
@@ -3375,7 +3460,7 @@ fn test_taproot_script_spend_sign_exclude_some_leaves() {
let addr = wallet.next_unused_address(KeychainKind::External).unwrap(); let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
let mut script_leaves: Vec<_> = psbt.inputs[0] let mut script_leaves: Vec<_> = psbt.inputs[0]
.tap_scripts .tap_scripts
@@ -3413,7 +3498,7 @@ fn test_taproot_script_spend_sign_no_leaves() {
let addr = wallet.next_unused_address(KeychainKind::External).unwrap(); let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
wallet wallet
@@ -3436,7 +3521,7 @@ fn test_taproot_sign_derive_index_from_psbt() {
let addr = wallet.next_unused_address(KeychainKind::External).unwrap(); let addr = wallet.next_unused_address(KeychainKind::External).unwrap();
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 25_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(25_000));
let mut psbt = builder.finish().unwrap(); let mut psbt = builder.finish().unwrap();
// re-create the wallet with an empty db // re-create the wallet with an empty db
@@ -3582,10 +3667,10 @@ fn test_spend_coinbase() {
assert_eq!( assert_eq!(
balance, balance,
Balance { Balance {
immature: 25_000, immature: Amount::from_sat(25_000),
trusted_pending: 0, trusted_pending: Amount::ZERO,
untrusted_pending: 0, untrusted_pending: Amount::ZERO,
confirmed: 0 confirmed: Amount::ZERO
} }
); );
@@ -3633,10 +3718,10 @@ fn test_spend_coinbase() {
assert_eq!( assert_eq!(
balance, balance,
Balance { Balance {
immature: 0, immature: Amount::ZERO,
trusted_pending: 0, trusted_pending: Amount::ZERO,
untrusted_pending: 0, untrusted_pending: Amount::ZERO,
confirmed: 25_000 confirmed: Amount::from_sat(25_000)
} }
); );
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
@@ -3654,7 +3739,7 @@ fn test_allow_dust_limit() {
let mut builder = wallet.build_tx(); let mut builder = wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 0); builder.add_recipient(addr.script_pubkey(), Amount::ZERO);
assert_matches!( assert_matches!(
builder.finish(), builder.finish(),
@@ -3665,7 +3750,7 @@ fn test_allow_dust_limit() {
builder builder
.allow_dust(true) .allow_dust(true)
.add_recipient(addr.script_pubkey(), 0); .add_recipient(addr.script_pubkey(), Amount::ZERO);
assert!(builder.finish().is_ok()); assert!(builder.finish().is_ok());
} }
@@ -3793,7 +3878,7 @@ fn test_tx_cancellation() {
.unwrap() .unwrap()
.assume_checked(); .assume_checked();
let mut builder = $wallet.build_tx(); let mut builder = $wallet.build_tx();
builder.add_recipient(addr.script_pubkey(), 10_000); builder.add_recipient(addr.script_pubkey(), Amount::from_sat(10_000));
let psbt = builder.finish().unwrap(); let psbt = builder.finish().unwrap();
@@ -3854,3 +3939,9 @@ fn test_tx_cancellation() {
.unwrap(); .unwrap();
assert_eq!(change_derivation_4, (KeychainKind::Internal, 2)); assert_eq!(change_derivation_4, (KeychainKind::Internal, 2));
} }
#[test]
fn test_thread_safety() {
fn thread_safe<T: Send + Sync>() {}
thread_safe::<Wallet>(); // compiles only if true
}
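The new `test_thread_safety` test uses a generic function bounded by `Send + Sync` as a compile-time assertion: the call compiles only if the type satisfies the bounds. A standalone sketch of the same trick (returning `bool` here, an addition of this sketch, purely so there is something to assert at runtime):

```rust
// Compile-time assertion: a call to this only compiles if `T: Send + Sync`,
// i.e. values of T can be moved to and shared across threads.
fn thread_safe<T: Send + Sync>() -> bool {
    true
}

fn main() {
    assert!(thread_safe::<u64>()); // primitives are Send + Sync
    assert!(thread_safe::<Vec<String>>()); // containers of Send + Sync types too
    // thread_safe::<std::rc::Rc<u8>>(); // would NOT compile: Rc is !Send
}
```

Because the check happens at compile time, the test body is empty at runtime; if `Wallet` ever gained a non-thread-safe field, the test would fail to build rather than fail an assertion.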


@@ -1,6 +1,6 @@
[package] [package]
name = "bdk_bitcoind_rpc" name = "bdk_bitcoind_rpc"
version = "0.9.0" version = "0.10.0"
edition = "2021" edition = "2021"
rust-version = "1.63" rust-version = "1.63"
homepage = "https://bitcoindevkit.org" homepage = "https://bitcoindevkit.org"
@@ -16,11 +16,10 @@ readme = "README.md"
# For no-std, remember to enable the bitcoin/no-std feature # For no-std, remember to enable the bitcoin/no-std feature
bitcoin = { version = "0.31", default-features = false } bitcoin = { version = "0.31", default-features = false }
bitcoincore-rpc = { version = "0.18" } bitcoincore-rpc = { version = "0.18" }
bdk_chain = { path = "../chain", version = "0.13", default-features = false } bdk_chain = { path = "../chain", version = "0.14", default-features = false }
[dev-dependencies] [dev-dependencies]
bdk_testenv = { path = "../testenv", default_features = false } bdk_testenv = { path = "../testenv", default_features = false }
anyhow = { version = "1" }
[features] [features]
default = ["std"] default = ["std"]


@@ -7,7 +7,7 @@ use bdk_chain::{
local_chain::{CheckPoint, LocalChain}, local_chain::{CheckPoint, LocalChain},
Append, BlockId, IndexedTxGraph, SpkTxOutIndex, Append, BlockId, IndexedTxGraph, SpkTxOutIndex,
}; };
use bdk_testenv::TestEnv; use bdk_testenv::{anyhow, TestEnv};
use bitcoin::{hashes::Hash, Block, OutPoint, ScriptBuf, WScriptHash}; use bitcoin::{hashes::Hash, Block, OutPoint, ScriptBuf, WScriptHash};
use bitcoincore_rpc::RpcApi; use bitcoincore_rpc::RpcApi;
@@ -377,7 +377,7 @@ fn tx_can_become_unconfirmed_after_reorg() -> anyhow::Result<()> {
assert_eq!( assert_eq!(
get_balance(&recv_chain, &recv_graph)?, get_balance(&recv_chain, &recv_graph)?,
Balance { Balance {
confirmed: SEND_AMOUNT.to_sat() * ADDITIONAL_COUNT as u64, confirmed: SEND_AMOUNT * ADDITIONAL_COUNT as u64,
..Balance::default() ..Balance::default()
}, },
"initial balance must be correct", "initial balance must be correct",
@@ -391,8 +391,8 @@ fn tx_can_become_unconfirmed_after_reorg() -> anyhow::Result<()> {
assert_eq!( assert_eq!(
get_balance(&recv_chain, &recv_graph)?, get_balance(&recv_chain, &recv_graph)?,
Balance { Balance {
confirmed: SEND_AMOUNT.to_sat() * (ADDITIONAL_COUNT - reorg_count) as u64, confirmed: SEND_AMOUNT * (ADDITIONAL_COUNT - reorg_count) as u64,
trusted_pending: SEND_AMOUNT.to_sat() * reorg_count as u64, trusted_pending: SEND_AMOUNT * reorg_count as u64,
..Balance::default() ..Balance::default()
}, },
"reorg_count: {}", "reorg_count: {}",


@@ -1,6 +1,6 @@
[package] [package]
name = "bdk_chain" name = "bdk_chain"
version = "0.13.0" version = "0.14.0"
edition = "2021" edition = "2021"
rust-version = "1.63" rust-version = "1.63"
homepage = "https://bitcoindevkit.org" homepage = "https://bitcoindevkit.org"
@@ -26,6 +26,6 @@ rand = "0.8"
proptest = "1.2.0" proptest = "1.2.0"
[features] [features]
default = ["std"] default = ["std", "miniscript"]
std = ["bitcoin/std", "miniscript/std"] std = ["bitcoin/std", "miniscript?/std"]
serde = ["serde_crate", "bitcoin/serde"] serde = ["serde_crate", "bitcoin/serde", "miniscript?/serde"]


@@ -1,10 +1,29 @@
use crate::miniscript::{Descriptor, DescriptorPublicKey}; use crate::{
alloc::{string::ToString, vec::Vec},
miniscript::{Descriptor, DescriptorPublicKey},
};
use bitcoin::hashes::{hash_newtype, sha256, Hash};
hash_newtype! {
/// Represents the ID of a descriptor, defined as the sha256 hash of
/// the descriptor string, checksum excluded.
///
/// This is useful for having a fixed-length unique representation of a descriptor,
/// in particular, we use it to persist application state changes related to the
/// descriptor without having to re-write the whole descriptor each time.
///
pub struct DescriptorId(pub sha256::Hash);
}
/// A trait to extend the functionality of a miniscript descriptor. /// A trait to extend the functionality of a miniscript descriptor.
pub trait DescriptorExt { pub trait DescriptorExt {
/// Returns the minimum value (in satoshis) at which an output is broadcastable. /// Returns the minimum value (in satoshis) at which an output is broadcastable.
/// Panics if the descriptor wildcard is hardened. /// Panics if the descriptor wildcard is hardened.
fn dust_value(&self) -> u64; fn dust_value(&self) -> u64;
/// Returns the descriptor id, calculated as the sha256 of the descriptor, checksum not
/// included.
fn descriptor_id(&self) -> DescriptorId;
} }
impl DescriptorExt for Descriptor<DescriptorPublicKey> { impl DescriptorExt for Descriptor<DescriptorPublicKey> {
@@ -15,4 +34,11 @@ impl DescriptorExt for Descriptor<DescriptorPublicKey> {
.dust_value() .dust_value()
.to_sat() .to_sat()
} }
fn descriptor_id(&self) -> DescriptorId {
let desc = self.to_string();
let desc_without_checksum = desc.split('#').next().expect("Must be here");
let descriptor_bytes = <Vec<u8>>::from(desc_without_checksum.as_bytes());
DescriptorId(sha256::Hash::hash(&descriptor_bytes))
}
} }
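`descriptor_id` hashes the descriptor string with its `#checksum` suffix removed, so the same logical descriptor always maps to the same ID regardless of whether a checksum is attached. A std-only sketch of the checksum-stripping step (the example descriptor string is illustrative; the real method then sha256-hashes the remaining bytes via `bitcoin::hashes`, which is omitted here):

```rust
/// Strip the `#checksum` suffix (if any) from a descriptor string,
/// mirroring the `desc.split('#').next()` step in `descriptor_id`.
fn strip_checksum(desc: &str) -> &str {
    // `split` always yields at least one item, so `next()` cannot be None.
    desc.split('#').next().expect("split yields at least one item")
}

fn main() {
    // With a checksum suffix, only the part before '#' is kept.
    assert_eq!(strip_checksum("pkh(key)#abcd1234"), "pkh(key)");
    // Descriptors without a checksum are returned unchanged.
    assert_eq!(strip_checksum("pkh(key)"), "pkh(key)");
}
```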


@@ -4,7 +4,6 @@ use alloc::vec::Vec;
use bitcoin::{Block, OutPoint, Transaction, TxOut, Txid}; use bitcoin::{Block, OutPoint, Transaction, TxOut, Txid};
use crate::{ use crate::{
keychain,
tx_graph::{self, TxGraph}, tx_graph::{self, TxGraph},
Anchor, AnchorFromBlockPosition, Append, BlockId, Anchor, AnchorFromBlockPosition, Append, BlockId,
}; };
@@ -320,8 +319,9 @@ impl<A, IA: Default> From<tx_graph::ChangeSet<A>> for ChangeSet<A, IA> {
} }
} }
impl<A, K> From<keychain::ChangeSet<K>> for ChangeSet<A, keychain::ChangeSet<K>> { #[cfg(feature = "miniscript")]
fn from(indexer: keychain::ChangeSet<K>) -> Self { impl<A, K> From<crate::keychain::ChangeSet<K>> for ChangeSet<A, crate::keychain::ChangeSet<K>> {
fn from(indexer: crate::keychain::ChangeSet<K>) -> Self {
Self { Self {
graph: Default::default(), graph: Default::default(),
indexer, indexer,


@@ -10,77 +10,12 @@
//! //!
//! [`SpkTxOutIndex`]: crate::SpkTxOutIndex //! [`SpkTxOutIndex`]: crate::SpkTxOutIndex
use crate::{collections::BTreeMap, Append};
#[cfg(feature = "miniscript")] #[cfg(feature = "miniscript")]
mod txout_index; mod txout_index;
use bitcoin::Amount;
#[cfg(feature = "miniscript")] #[cfg(feature = "miniscript")]
pub use txout_index::*; pub use txout_index::*;
/// Represents updates to the derivation index of a [`KeychainTxOutIndex`].
/// It maps each keychain `K` to its last revealed index.
///
/// It can be applied to [`KeychainTxOutIndex`] with [`apply_changeset`]. [`ChangeSet`]s are
/// monotone in that they will never decrease the revealed derivation index.
///
/// [`KeychainTxOutIndex`]: crate::keychain::KeychainTxOutIndex
/// [`apply_changeset`]: crate::keychain::KeychainTxOutIndex::apply_changeset
#[derive(Clone, Debug, PartialEq)]
#[cfg_attr(
feature = "serde",
derive(serde::Deserialize, serde::Serialize),
serde(
crate = "serde_crate",
bound(
deserialize = "K: Ord + serde::Deserialize<'de>",
serialize = "K: Ord + serde::Serialize"
)
)
)]
#[must_use]
pub struct ChangeSet<K>(pub BTreeMap<K, u32>);
impl<K> ChangeSet<K> {
/// Get the inner map of the keychain to its new derivation index.
pub fn as_inner(&self) -> &BTreeMap<K, u32> {
&self.0
}
}
impl<K: Ord> Append for ChangeSet<K> {
/// Append another [`ChangeSet`] into self.
///
/// If the keychain already exists, increase the index when the other's index > self's index.
/// If the keychain did not exist, append the new keychain.
fn append(&mut self, mut other: Self) {
self.0.iter_mut().for_each(|(key, index)| {
if let Some(other_index) = other.0.remove(key) {
*index = other_index.max(*index);
}
});
// We use `extend` instead of `BTreeMap::append` due to performance issues with `append`.
// Refer to https://github.com/rust-lang/rust/issues/34666#issuecomment-675658420
self.0.extend(other.0);
}
/// Returns whether the changeset is empty.
fn is_empty(&self) -> bool {
self.0.is_empty()
}
}
impl<K> Default for ChangeSet<K> {
fn default() -> Self {
Self(Default::default())
}
}
impl<K> AsRef<BTreeMap<K, u32>> for ChangeSet<K> {
fn as_ref(&self) -> &BTreeMap<K, u32> {
&self.0
}
}
/// Balance, differentiated into various categories.
#[derive(Debug, PartialEq, Eq, Clone, Default)]
#[cfg_attr(
@@ -90,13 +25,13 @@ impl<K> AsRef<BTreeMap<K, u32>> for ChangeSet<K> {
)]
pub struct Balance {
    /// All coinbase outputs not yet matured
-    pub immature: u64,
+    pub immature: Amount,
    /// Unconfirmed UTXOs generated by a wallet tx
-    pub trusted_pending: u64,
+    pub trusted_pending: Amount,
    /// Unconfirmed UTXOs received from an external wallet
-    pub untrusted_pending: u64,
+    pub untrusted_pending: Amount,
    /// Confirmed and immediately spendable balance
-    pub confirmed: u64,
+    pub confirmed: Amount,
}
impl Balance {
@@ -104,12 +39,12 @@ impl Balance {
    ///
    /// This is the balance you can spend right now that shouldn't get cancelled via another party
    /// double spending it.
-    pub fn trusted_spendable(&self) -> u64 {
+    pub fn trusted_spendable(&self) -> Amount {
        self.confirmed + self.trusted_pending
    }
    /// Get the whole balance visible to the wallet.
-    pub fn total(&self) -> u64 {
+    pub fn total(&self) -> Amount {
        self.confirmed + self.trusted_pending + self.untrusted_pending + self.immature
    }
}
@@ -136,40 +71,3 @@ impl core::ops::Add for Balance {
        }
    }
}
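For a concrete feel of the categories above, here is a minimal stand-in for the `Balance` arithmetic using plain `u64` satoshi values (illustrative only; the real fields use `bitcoin::Amount`):

```rust
// Minimal stand-in for `Balance` with plain u64 sats (the real struct uses
// `bitcoin::Amount`): `trusted_spendable` excludes untrusted and immature
// funds, while `total` sums every category.
#[derive(Debug, Default, PartialEq)]
struct Balance {
    immature: u64,          // coinbase outputs not yet matured
    trusted_pending: u64,   // unconfirmed, created by our own wallet
    untrusted_pending: u64, // unconfirmed, received from an external wallet
    confirmed: u64,         // confirmed and immediately spendable
}

impl Balance {
    fn trusted_spendable(&self) -> u64 {
        self.confirmed + self.trusted_pending
    }
    fn total(&self) -> u64 {
        self.confirmed + self.trusted_pending + self.untrusted_pending + self.immature
    }
}

fn main() {
    let b = Balance { immature: 50, trusted_pending: 20, untrusted_pending: 10, confirmed: 100 };
    assert_eq!(b.trusted_spendable(), 120);
    assert_eq!(b.total(), 180);
}
```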
#[cfg(test)]
mod test {
use super::*;
#[test]
fn append_keychain_derivation_indices() {
#[derive(Ord, PartialOrd, Eq, PartialEq, Clone, Debug)]
enum Keychain {
One,
Two,
Three,
Four,
}
let mut lhs_di = BTreeMap::<Keychain, u32>::default();
let mut rhs_di = BTreeMap::<Keychain, u32>::default();
lhs_di.insert(Keychain::One, 7);
lhs_di.insert(Keychain::Two, 0);
rhs_di.insert(Keychain::One, 3);
rhs_di.insert(Keychain::Two, 5);
lhs_di.insert(Keychain::Three, 3);
rhs_di.insert(Keychain::Four, 4);
let mut lhs = ChangeSet(lhs_di);
let rhs = ChangeSet(rhs_di);
lhs.append(rhs);
// Existing index doesn't update if the new index in `other` is lower than `self`.
assert_eq!(lhs.0.get(&Keychain::One), Some(&7));
// Existing index updates if the new index in `other` is higher than `self`.
assert_eq!(lhs.0.get(&Keychain::Two), Some(&5));
// Existing index is unchanged if keychain doesn't exist in `other`.
assert_eq!(lhs.0.get(&Keychain::Three), Some(&3));
// New keychain gets added if the keychain is in `other` but not in `self`.
assert_eq!(lhs.0.get(&Keychain::Four), Some(&4));
}
}

View File

@@ -3,9 +3,9 @@ use crate::{
    indexed_tx_graph::Indexer,
    miniscript::{Descriptor, DescriptorPublicKey},
    spk_iter::BIP32_MAX_INDEX,
-    SpkIterator, SpkTxOutIndex,
+    DescriptorExt, DescriptorId, SpkIterator, SpkTxOutIndex,
};
-use bitcoin::{OutPoint, Script, Transaction, TxOut, Txid};
+use bitcoin::{hashes::Hash, Amount, OutPoint, Script, SignedAmount, Transaction, TxOut, Txid};
use core::{
    fmt::Debug,
    ops::{Bound, RangeBounds},
@@ -13,6 +13,79 @@ use core::{
use crate::Append;
/// Represents updates to the derivation index of a [`KeychainTxOutIndex`].
/// It maps each keychain `K` to a descriptor and its last revealed index.
///
/// It can be applied to [`KeychainTxOutIndex`] with [`apply_changeset`]. [`ChangeSet`]s are
/// monotone in that they will never decrease the revealed derivation index.
///
/// [`KeychainTxOutIndex`]: crate::keychain::KeychainTxOutIndex
/// [`apply_changeset`]: crate::keychain::KeychainTxOutIndex::apply_changeset
#[derive(Clone, Debug, PartialEq)]
#[cfg_attr(
feature = "serde",
derive(serde::Deserialize, serde::Serialize),
serde(
crate = "serde_crate",
bound(
deserialize = "K: Ord + serde::Deserialize<'de>",
serialize = "K: Ord + serde::Serialize"
)
)
)]
#[must_use]
pub struct ChangeSet<K> {
/// Contains the keychains that have been added and their respective descriptor
pub keychains_added: BTreeMap<K, Descriptor<DescriptorPublicKey>>,
/// Contains for each descriptor_id the last revealed index of derivation
pub last_revealed: BTreeMap<DescriptorId, u32>,
}
impl<K: Ord> Append for ChangeSet<K> {
/// Append another [`ChangeSet`] into self.
///
/// For each keychain in `keychains_added` in the given [`ChangeSet`]:
/// If the keychain already exists with a different descriptor, we overwrite the old descriptor.
///
/// For each entry of `last_revealed` in the given [`ChangeSet`]:
/// If the descriptor ID already exists, the index is only increased when the other's index is
/// greater than self's index.
fn append(&mut self, other: Self) {
// We use `extend` instead of `BTreeMap::append` due to performance issues with `append`.
// Refer to https://github.com/rust-lang/rust/issues/34666#issuecomment-675658420
self.keychains_added.extend(other.keychains_added);
// for `last_revealed`, entries of `other` will take precedence ONLY if it is greater than
// what was originally in `self`.
for (desc_id, index) in other.last_revealed {
use crate::collections::btree_map::Entry;
match self.last_revealed.entry(desc_id) {
Entry::Vacant(entry) => {
entry.insert(index);
}
Entry::Occupied(mut entry) => {
if *entry.get() < index {
entry.insert(index);
}
}
}
}
}
/// Returns whether the changeset is empty.
fn is_empty(&self) -> bool {
self.last_revealed.is_empty() && self.keychains_added.is_empty()
}
}
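The merge rule for `last_revealed` (an entry is only overwritten by a strictly greater index, so a merged changeset never decreases a revealed derivation index) can be sketched with plain `std` maps; this is an illustrative stand-in, not the actual `ChangeSet` type:

```rust
use std::collections::BTreeMap;

// Illustrative stand-in for the `last_revealed` merge in `Append::append`:
// an existing entry is only replaced by a strictly greater index, which is
// what makes the changeset monotone.
fn append_last_revealed(lhs: &mut BTreeMap<&'static str, u32>, rhs: BTreeMap<&'static str, u32>) {
    for (desc_id, index) in rhs {
        let entry = lhs.entry(desc_id).or_insert(index);
        if *entry < index {
            *entry = index;
        }
    }
}

fn main() {
    let mut lhs = BTreeMap::from([("desc_a", 7), ("desc_b", 0)]);
    let rhs = BTreeMap::from([("desc_a", 3), ("desc_b", 5), ("desc_c", 4)]);
    append_last_revealed(&mut lhs, rhs);
    assert_eq!(lhs.get("desc_a"), Some(&7)); // rhs index lower: unchanged
    assert_eq!(lhs.get("desc_b"), Some(&5)); // rhs index higher: updated
    assert_eq!(lhs.get("desc_c"), Some(&4)); // new descriptor id: inserted
}
```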
impl<K> Default for ChangeSet<K> {
fn default() -> Self {
Self {
last_revealed: BTreeMap::default(),
keychains_added: BTreeMap::default(),
}
}
}
const DEFAULT_LOOKAHEAD: u32 = 25;
/// [`KeychainTxOutIndex`] controls how script pubkeys are revealed for multiple keychains, and
@@ -54,7 +127,7 @@ const DEFAULT_LOOKAHEAD: u32 = 25;
///
/// # Change sets
///
-/// Methods that can update the last revealed index will return [`super::ChangeSet`] to report
+/// Methods that can update the last revealed index or add keychains will return [`super::ChangeSet`] to report
/// these changes. This can be persisted for future recovery.
///
/// ## Synopsis
@@ -79,14 +152,43 @@ const DEFAULT_LOOKAHEAD: u32 = 25;
/// # let secp = bdk_chain::bitcoin::secp256k1::Secp256k1::signing_only();
/// # let (external_descriptor,_) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/0/*)").unwrap();
/// # let (internal_descriptor,_) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/1/*)").unwrap();
-/// # let (descriptor_for_user_42, _) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/2/*)").unwrap();
-/// txout_index.add_keychain(MyKeychain::External, external_descriptor);
-/// txout_index.add_keychain(MyKeychain::Internal, internal_descriptor);
-/// txout_index.add_keychain(MyKeychain::MyAppUser { user_id: 42 }, descriptor_for_user_42);
+/// # let (descriptor_42, _) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/2/*)").unwrap();
+/// let _ = txout_index.insert_descriptor(MyKeychain::External, external_descriptor);
+/// let _ = txout_index.insert_descriptor(MyKeychain::Internal, internal_descriptor);
+/// let _ = txout_index.insert_descriptor(MyKeychain::MyAppUser { user_id: 42 }, descriptor_42);
///
/// let new_spk_for_user = txout_index.reveal_next_spk(&MyKeychain::MyAppUser{ user_id: 42 });
/// ```
///
/// # Non-recommended keychain-to-descriptor assignments
///
/// A keychain (`K`) is used to identify a descriptor. However, the following keychain-to-descriptor
/// arrangements result in behavior that is harder to reason about and are not recommended.
///
/// ## Multiple keychains identifying the same descriptor
///
/// Although a single keychain variant can only identify a single descriptor, multiple keychain
/// variants can identify the same descriptor.
///
/// If multiple keychains identify the same descriptor:
/// 1. Methods that take in a keychain (such as [`reveal_next_spk`]) will work normally when any
/// keychain (that identifies that descriptor) is passed in.
/// 2. Methods that return data associated with a descriptor (such as [`outpoints`],
///    [`txouts`], [`unused_spks`], etc.) will return the highest-ranked keychain variant
///    that identifies the descriptor. Rank is determined by the [`Ord`] implementation of the
///    keychain type.
///
/// This arrangement is not recommended since some methods will return a single keychain variant
/// even though multiple keychain variants identify the same descriptor.
///
/// ## Reassigning the descriptor of a single keychain
///
/// Descriptors added to [`KeychainTxOutIndex`] are never removed. However, a keychain that
/// identifies a descriptor can be reassigned to identify a different descriptor. This may result in
/// a situation where a descriptor has no associated keychain(s), and relevant [`TxOut`]s,
/// [`OutPoint`]s and [`Script`]s (of that descriptor) will not be returned by [`KeychainTxOutIndex`].
/// Therefore, reassigning the descriptor of a single keychain is not recommended.
///
/// [`Ord`]: core::cmp::Ord
/// [`SpkTxOutIndex`]: crate::spk_txout_index::SpkTxOutIndex
/// [`Descriptor`]: crate::miniscript::Descriptor
@@ -99,13 +201,27 @@ const DEFAULT_LOOKAHEAD: u32 = 25;
/// [`new`]: KeychainTxOutIndex::new
/// [`unbounded_spk_iter`]: KeychainTxOutIndex::unbounded_spk_iter
/// [`all_unbounded_spk_iters`]: KeychainTxOutIndex::all_unbounded_spk_iters
/// [`outpoints`]: KeychainTxOutIndex::outpoints
/// [`txouts`]: KeychainTxOutIndex::txouts
/// [`unused_spks`]: KeychainTxOutIndex::unused_spks
#[derive(Clone, Debug)]
pub struct KeychainTxOutIndex<K> {
-    inner: SpkTxOutIndex<(K, u32)>,
-    // descriptors of each keychain
-    keychains: BTreeMap<K, Descriptor<DescriptorPublicKey>>,
+    inner: SpkTxOutIndex<(DescriptorId, u32)>,
+    // keychain -> (descriptor, descriptor id) map
+    keychains_to_descriptors: BTreeMap<K, (DescriptorId, Descriptor<DescriptorPublicKey>)>,
+    // descriptor id -> keychain set
+    // Because different keychains can have the same descriptor, we rank keychains by `Ord` so
+    // that the first keychain variant (according to `Ord`) has the highest rank. When associated
+    // data (such as spks, outpoints) are returned with a keychain, we return the highest-ranked
+    // keychain with it.
+    descriptor_ids_to_keychain_set: HashMap<DescriptorId, BTreeSet<K>>,
+    // descriptor_id -> descriptor map
+    // This is a "monotone" map, meaning that its size keeps growing, i.e., we never delete
+    // descriptors from it. This is useful for revealing spks for descriptors that don't have
+    // keychains associated.
+    descriptor_ids_to_descriptors: BTreeMap<DescriptorId, Descriptor<DescriptorPublicKey>>,
    // last revealed indexes
-    last_revealed: BTreeMap<K, u32>,
+    last_revealed: BTreeMap<DescriptorId, u32>,
    // lookahead settings for each keychain
    lookahead: u32,
}
@@ -121,7 +237,13 @@ impl<K: Clone + Ord + Debug> Indexer for KeychainTxOutIndex<K> {
    fn index_txout(&mut self, outpoint: OutPoint, txout: &TxOut) -> Self::ChangeSet {
        match self.inner.scan_txout(outpoint, txout).cloned() {
-            Some((keychain, index)) => self.reveal_to_target(&keychain, index).1,
+            Some((descriptor_id, index)) => {
+                // We want to reveal spks for descriptors that aren't tracked by any keychain,
+                // and so we call reveal with descriptor_id
+                let (_, changeset) = self
+                    .reveal_to_target_with_id(descriptor_id, index)
+                    .expect("descriptors are added in a monotone manner, there cannot be a descriptor id with no corresponding descriptor");
+                changeset
+            }
            None => super::ChangeSet::default(),
        }
    }
@@ -135,7 +257,13 @@ impl<K: Clone + Ord + Debug> Indexer for KeychainTxOutIndex<K> {
    }
    fn initial_changeset(&self) -> Self::ChangeSet {
-        super::ChangeSet(self.last_revealed.clone())
+        super::ChangeSet {
+            keychains_added: self
+                .keychains()
+                .map(|(k, v)| (k.clone(), v.clone()))
+                .collect(),
+            last_revealed: self.last_revealed.clone(),
+        }
    }
    fn apply_changeset(&mut self, changeset: Self::ChangeSet) {
@@ -161,7 +289,9 @@ impl<K> KeychainTxOutIndex<K> {
    pub fn new(lookahead: u32) -> Self {
        Self {
            inner: SpkTxOutIndex::default(),
-            keychains: BTreeMap::new(),
+            keychains_to_descriptors: BTreeMap::new(),
+            descriptor_ids_to_keychain_set: HashMap::new(),
+            descriptor_ids_to_descriptors: BTreeMap::new(),
            last_revealed: BTreeMap::new(),
            lookahead,
        }
@@ -170,26 +300,37 @@ impl<K> KeychainTxOutIndex<K> {
/// Methods that are *re-exposed* from the internal [`SpkTxOutIndex`].
impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
/// Get the highest-ranked keychain that is currently associated with the given `desc_id`.
fn keychain_of_desc_id(&self, desc_id: &DescriptorId) -> Option<&K> {
let keychains = self.descriptor_ids_to_keychain_set.get(desc_id)?;
keychains.iter().next()
}
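The ranking rule here relies on `BTreeSet` iteration order. A std-only sketch (with a hypothetical `Keychain` enum, not the crate's code) shows why `iter().next()` yields the highest-ranked variant:

```rust
use std::collections::BTreeSet;

// Hypothetical keychain type; rank is given by the derived `Ord`
// (variants declared earlier compare as smaller, i.e. higher rank).
#[derive(Ord, PartialOrd, Eq, PartialEq, Debug)]
enum Keychain {
    External,
    Internal,
}

// Mirrors the idea behind `keychain_of_desc_id`: `BTreeSet::iter` is
// ascending in `Ord`, so `next()` returns the highest-ranked keychain.
fn highest_ranked(set: &BTreeSet<Keychain>) -> Option<&Keychain> {
    set.iter().next()
}

fn main() {
    let set = BTreeSet::from([Keychain::Internal, Keychain::External]);
    assert_eq!(highest_ranked(&set), Some(&Keychain::External));
}
```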
    /// Return a reference to the internal [`SpkTxOutIndex`].
    ///
    /// **WARNING:** The internal index will contain lookahead spks. Refer to
    /// [struct-level docs](KeychainTxOutIndex) for more about `lookahead`.
-    pub fn inner(&self) -> &SpkTxOutIndex<(K, u32)> {
+    pub fn inner(&self) -> &SpkTxOutIndex<(DescriptorId, u32)> {
        &self.inner
    }
-    /// Get a reference to the set of indexed outpoints.
-    pub fn outpoints(&self) -> &BTreeSet<((K, u32), OutPoint)> {
-        self.inner.outpoints()
-    }
+    /// Get the set of indexed outpoints, corresponding to tracked keychains.
+    pub fn outpoints(&self) -> impl DoubleEndedIterator<Item = ((K, u32), OutPoint)> + '_ {
+        self.inner
+            .outpoints()
+            .iter()
+            .filter_map(|((desc_id, index), op)| {
+                let keychain = self.keychain_of_desc_id(desc_id)?;
+                Some(((keychain.clone(), *index), *op))
+            })
+    }
    /// Iterate over known txouts that spend to tracked script pubkeys.
-    pub fn txouts(
-        &self,
-    ) -> impl DoubleEndedIterator<Item = (K, u32, OutPoint, &TxOut)> + ExactSizeIterator {
-        self.inner
-            .txouts()
-            .map(|((k, i), op, txo)| (k.clone(), *i, op, txo))
-    }
+    pub fn txouts(&self) -> impl DoubleEndedIterator<Item = (K, u32, OutPoint, &TxOut)> + '_ {
+        self.inner.txouts().filter_map(|((desc_id, i), op, txo)| {
+            let keychain = self.keychain_of_desc_id(desc_id)?;
+            Some((keychain.clone(), *i, op, txo))
+        })
+    }
    /// Finds all txouts on a transaction that has previously been scanned and indexed.
@@ -199,32 +340,39 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    ) -> impl DoubleEndedIterator<Item = (K, u32, OutPoint, &TxOut)> {
        self.inner
            .txouts_in_tx(txid)
-            .map(|((k, i), op, txo)| (k.clone(), *i, op, txo))
+            .filter_map(|((desc_id, i), op, txo)| {
+                let keychain = self.keychain_of_desc_id(desc_id)?;
+                Some((keychain.clone(), *i, op, txo))
+            })
    }
-    /// Return the [`TxOut`] of `outpoint` if it has been indexed.
+    /// Return the [`TxOut`] of `outpoint` if it has been indexed, and if it corresponds to a
+    /// tracked keychain.
    ///
    /// The associated keychain and keychain index of the txout's spk is also returned.
    ///
    /// This calls [`SpkTxOutIndex::txout`] internally.
    pub fn txout(&self, outpoint: OutPoint) -> Option<(K, u32, &TxOut)> {
-        self.inner
-            .txout(outpoint)
-            .map(|((k, i), txo)| (k.clone(), *i, txo))
+        let ((descriptor_id, index), txo) = self.inner.txout(outpoint)?;
+        let keychain = self.keychain_of_desc_id(descriptor_id)?;
+        Some((keychain.clone(), *index, txo))
    }
    /// Return the script that exists under the given `keychain`'s `index`.
    ///
    /// This calls [`SpkTxOutIndex::spk_at_index`] internally.
    pub fn spk_at_index(&self, keychain: K, index: u32) -> Option<&Script> {
-        self.inner.spk_at_index(&(keychain, index))
+        let descriptor_id = self.keychains_to_descriptors.get(&keychain)?.0;
+        self.inner.spk_at_index(&(descriptor_id, index))
    }
    /// Returns the keychain and keychain index associated with the spk.
    ///
    /// This calls [`SpkTxOutIndex::index_of_spk`] internally.
    pub fn index_of_spk(&self, script: &Script) -> Option<(K, u32)> {
-        self.inner.index_of_spk(script).cloned()
+        let (desc_id, last_index) = self.inner.index_of_spk(script)?;
+        let keychain = self.keychain_of_desc_id(desc_id)?;
+        Some((keychain.clone(), *last_index))
    }
    /// Returns whether the spk under the `keychain`'s `index` has been used.
@@ -234,7 +382,11 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    ///
    /// This calls [`SpkTxOutIndex::is_used`] internally.
    pub fn is_used(&self, keychain: K, index: u32) -> bool {
-        self.inner.is_used(&(keychain, index))
+        let descriptor_id = self.keychains_to_descriptors.get(&keychain).map(|k| k.0);
+        match descriptor_id {
+            Some(descriptor_id) => self.inner.is_used(&(descriptor_id, index)),
+            None => false,
+        }
    }
    /// Marks the script pubkey at `index` as used even though the tracker hasn't seen an output
@@ -242,7 +394,9 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    ///
    /// This only has an effect when the `index` had been added to `self` already and was unused.
    ///
-    /// Returns whether the `index` was initially present as `unused`.
+    /// Returns whether the spk under the given `keychain` and `index` is successfully
+    /// marked as used. Returns false either when there is no descriptor under the given
+    /// keychain, or when the spk is already marked as used.
    ///
    /// This is useful when you want to reserve a script pubkey for something but don't want to add
    /// the transaction output using it to the index yet. Other callers will consider `index` on
@@ -252,7 +406,11 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    ///
    /// [`unmark_used`]: Self::unmark_used
    pub fn mark_used(&mut self, keychain: K, index: u32) -> bool {
-        self.inner.mark_used(&(keychain, index))
+        let descriptor_id = self.keychains_to_descriptors.get(&keychain).map(|k| k.0);
+        match descriptor_id {
+            Some(descriptor_id) => self.inner.mark_used(&(descriptor_id, index)),
+            None => false,
+        }
    }
    /// Undoes the effect of [`mark_used`]. Returns whether the `index` is inserted back into
@@ -265,7 +423,11 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    ///
    /// [`mark_used`]: Self::mark_used
    pub fn unmark_used(&mut self, keychain: K, index: u32) -> bool {
-        self.inner.unmark_used(&(keychain, index))
+        let descriptor_id = self.keychains_to_descriptors.get(&keychain).map(|k| k.0);
+        match descriptor_id {
+            Some(descriptor_id) => self.inner.unmark_used(&(descriptor_id, index)),
+            None => false,
+        }
    }
    /// Computes the total value transfer effect `tx` has on the script pubkeys belonging to the
@@ -273,9 +435,13 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    /// *received* when it is on an output. For `sent` to be computed correctly, the output being
    /// spent must have already been scanned by the index. Calculating received just uses the
    /// [`Transaction`] outputs directly, so it will be correct even if it has not been scanned.
-    pub fn sent_and_received(&self, tx: &Transaction, range: impl RangeBounds<K>) -> (u64, u64) {
+    pub fn sent_and_received(
+        &self,
+        tx: &Transaction,
+        range: impl RangeBounds<K>,
+    ) -> (Amount, Amount) {
        self.inner
-            .sent_and_received(tx, Self::map_to_inner_bounds(range))
+            .sent_and_received(tx, self.map_to_inner_bounds(range))
    }
    /// Computes the net value that this transaction gives to the script pubkeys in the index and
@@ -285,35 +451,77 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    /// This calls [`SpkTxOutIndex::net_value`] internally.
    ///
    /// [`sent_and_received`]: Self::sent_and_received
-    pub fn net_value(&self, tx: &Transaction, range: impl RangeBounds<K>) -> i64 {
-        self.inner.net_value(tx, Self::map_to_inner_bounds(range))
+    pub fn net_value(&self, tx: &Transaction, range: impl RangeBounds<K>) -> SignedAmount {
+        self.inner.net_value(tx, self.map_to_inner_bounds(range))
    }
}
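The sent/received split described in the doc comment above can be sketched with plain types (spks as strings, values as `u64` sats; illustrative only, not the crate's API): `received` counts the transaction's own outputs paying to tracked spks, while `sent` counts previously indexed txouts that the transaction spends.

```rust
// Illustrative accounting: `received` comes straight from the tx's outputs,
// so it is correct even for an unscanned tx; `sent` relies on the spent
// txouts having been indexed, which is why unscanned prevouts would make
// `sent` undercount.
fn sent_and_received(
    tracked: &[&str],
    tx_outputs: &[(&str, u64)],
    spent_indexed_txouts: &[(&str, u64)],
) -> (u64, u64) {
    let received = tx_outputs
        .iter()
        .filter(|(spk, _)| tracked.contains(spk))
        .map(|(_, v)| *v)
        .sum();
    let sent = spent_indexed_txouts
        .iter()
        .filter(|(spk, _)| tracked.contains(spk))
        .map(|(_, v)| *v)
        .sum();
    (sent, received)
}

fn main() {
    let tracked = ["spk_a", "spk_b"];
    let outputs = [("spk_a", 30), ("spk_x", 70)]; // spk_x is not ours
    let spent = [("spk_b", 50)];
    assert_eq!(sent_and_received(&tracked, &outputs, &spent), (50, 30));
}
```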
impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
-    /// Return a reference to the internal map of keychain to descriptors.
-    pub fn keychains(&self) -> &BTreeMap<K, Descriptor<DescriptorPublicKey>> {
-        &self.keychains
+    /// Return the map of the keychain to descriptors.
+    pub fn keychains(
+        &self,
+    ) -> impl DoubleEndedIterator<Item = (&K, &Descriptor<DescriptorPublicKey>)> + ExactSizeIterator + '_
+    {
+        self.keychains_to_descriptors
+            .iter()
+            .map(|(k, (_, d))| (k, d))
    }
-    /// Add a keychain to the tracker's `txout_index` with a descriptor to derive addresses.
+    /// Insert a descriptor with a keychain associated to it.
    ///
-    /// Adding a keychain means you will be able to derive new script pubkeys under that keychain
+    /// Adding a descriptor means you will be able to derive new script pubkeys under it
    /// and the txout index will discover transaction outputs with those script pubkeys.
    ///
-    /// # Panics
-    ///
-    /// This will panic if a different `descriptor` is introduced to the same `keychain`.
-    pub fn add_keychain(&mut self, keychain: K, descriptor: Descriptor<DescriptorPublicKey>) {
-        let old_descriptor = &*self
-            .keychains
-            .entry(keychain.clone())
-            .or_insert_with(|| descriptor.clone());
-        assert_eq!(
-            &descriptor, old_descriptor,
-            "keychain already contains a different descriptor"
-        );
+    /// When trying to add a keychain that already existed under a different descriptor, or a
+    /// descriptor that already existed with a different keychain, the old keychain (or descriptor)
+    /// will be overwritten.
+    pub fn insert_descriptor(
+        &mut self,
+        keychain: K,
+        descriptor: Descriptor<DescriptorPublicKey>,
+    ) -> super::ChangeSet<K> {
+        let mut changeset = super::ChangeSet::<K>::default();
+        let desc_id = descriptor.descriptor_id();
+
+        let old_desc = self
+            .keychains_to_descriptors
+            .insert(keychain.clone(), (desc_id, descriptor.clone()));
+        if let Some((old_desc_id, _)) = old_desc {
+            // nothing needs to be done if the caller reinserted the same descriptor under the
+            // same keychain
+            if old_desc_id == desc_id {
+                return changeset;
+            }
+            // we should remove the old descriptor that is associated with this keychain as the
+            // index is designed to track one descriptor per keychain (however different keychains
+            // can share the same descriptor)
+            let _is_keychain_removed = self
+                .descriptor_ids_to_keychain_set
+                .get_mut(&old_desc_id)
+                .expect("we must have already inserted this descriptor")
+                .remove(&keychain);
+            debug_assert!(_is_keychain_removed);
+        }
+
+        self.descriptor_ids_to_keychain_set
+            .entry(desc_id)
+            .or_default()
+            .insert(keychain.clone());
+        self.descriptor_ids_to_descriptors
+            .insert(desc_id, descriptor.clone());
        self.replenish_lookahead(&keychain, self.lookahead);
+
+        changeset
+            .keychains_added
+            .insert(keychain.clone(), descriptor);
+        changeset
+    }
+
+    /// Gets the descriptor associated with the keychain. Returns `None` if the keychain doesn't
+    /// have a descriptor associated with it.
+    pub fn get_descriptor(&self, keychain: &K) -> Option<&Descriptor<DescriptorPublicKey>> {
+        self.keychains_to_descriptors.get(keychain).map(|(_, d)| d)
    }
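The reassignment bookkeeping in `insert_descriptor` can be sketched with plain `std` maps (here a `u8` stands in for `DescriptorId`, and `assign` is a hypothetical helper, not the crate's API): the keychain moves to the new descriptor's keychain set and is removed from the old one, but the old descriptor's entry itself is never deleted.

```rust
use std::collections::{BTreeMap, BTreeSet, HashMap};

// Illustrative stand-in for the keychain/descriptor bookkeeping:
// one descriptor per keychain, but several keychains may share a descriptor,
// and descriptor entries are only ever added, never removed.
fn assign(
    keychain: &'static str,
    desc_id: u8,
    keychain_to_desc: &mut BTreeMap<&'static str, u8>,
    desc_to_keychains: &mut HashMap<u8, BTreeSet<&'static str>>,
) {
    if let Some(old_id) = keychain_to_desc.insert(keychain, desc_id) {
        if old_id == desc_id {
            return; // same descriptor reinserted: nothing to do
        }
        desc_to_keychains
            .get_mut(&old_id)
            .expect("old descriptor must be tracked")
            .remove(keychain);
    }
    desc_to_keychains.entry(desc_id).or_default().insert(keychain);
}

fn main() {
    let mut k2d = BTreeMap::new();
    let mut d2k = HashMap::new();
    assign("external", 1, &mut k2d, &mut d2k);
    assign("external", 2, &mut k2d, &mut d2k); // reassign to descriptor 2
    assert_eq!(k2d["external"], 2);
    assert!(d2k[&1].is_empty()); // descriptor 1 keeps its (now empty) entry
    assert!(d2k[&2].contains("external"));
}
```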
    /// Get the lookahead setting.
@@ -329,63 +537,60 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    ///
    /// This does not change the global `lookahead` setting.
    pub fn lookahead_to_target(&mut self, keychain: &K, target_index: u32) {
-        let (next_index, _) = self.next_index(keychain);
-        let temp_lookahead = (target_index + 1)
-            .checked_sub(next_index)
-            .filter(|&index| index > 0);
-        if let Some(temp_lookahead) = temp_lookahead {
-            self.replenish_lookahead(keychain, temp_lookahead);
-        }
+        if let Some((next_index, _)) = self.next_index(keychain) {
+            let temp_lookahead = (target_index + 1)
+                .checked_sub(next_index)
+                .filter(|&index| index > 0);
+            if let Some(temp_lookahead) = temp_lookahead {
+                self.replenish_lookahead(keychain, temp_lookahead);
+            }
+        }
    }
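The arithmetic above (how many extra spks to derive so that `target_index` is covered) can be isolated as a small helper; a sketch, not the crate's API:

```rust
// Sketch of the `lookahead_to_target` arithmetic: given the next unrevealed
// index, how many extra spks must be stored so `target_index` is covered?
// `None` means the target is already covered (or behind `next_index`).
fn temp_lookahead(target_index: u32, next_index: u32) -> Option<u32> {
    (target_index + 1).checked_sub(next_index).filter(|&n| n > 0)
}

fn main() {
    assert_eq!(temp_lookahead(10, 5), Some(6)); // need indices 5..=10: six spks
    assert_eq!(temp_lookahead(4, 5), None);     // target already covered
    assert_eq!(temp_lookahead(3, 9), None);     // target far behind next index
}
```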
    fn replenish_lookahead(&mut self, keychain: &K, lookahead: u32) {
-        let descriptor = self.keychains.get(keychain).expect("keychain must exist");
-        let next_store_index = self.next_store_index(keychain);
-        let next_reveal_index = self.last_revealed.get(keychain).map_or(0, |v| *v + 1);
-        for (new_index, new_spk) in
-            SpkIterator::new_with_range(descriptor, next_store_index..next_reveal_index + lookahead)
-        {
-            let _inserted = self
-                .inner
-                .insert_spk((keychain.clone(), new_index), new_spk);
-            debug_assert!(_inserted, "replenish lookahead: must not have existing spk: keychain={:?}, lookahead={}, next_store_index={}, next_reveal_index={}", keychain, lookahead, next_store_index, next_reveal_index);
-        }
-    }
+        let descriptor_opt = self.keychains_to_descriptors.get(keychain).cloned();
+        if let Some((descriptor_id, descriptor)) = descriptor_opt {
+            let next_store_index = self.next_store_index(descriptor_id);
+            let next_reveal_index = self.last_revealed.get(&descriptor_id).map_or(0, |v| *v + 1);
+            for (new_index, new_spk) in SpkIterator::new_with_range(
+                descriptor,
+                next_store_index..next_reveal_index + lookahead,
+            ) {
+                let _inserted = self.inner.insert_spk((descriptor_id, new_index), new_spk);
+                debug_assert!(_inserted, "replenish lookahead: must not have existing spk: keychain={:?}, lookahead={}, next_store_index={}, next_reveal_index={}", keychain, lookahead, next_store_index, next_reveal_index);
+            }
+        }
+    }
-    fn next_store_index(&self, keychain: &K) -> u32 {
+    fn next_store_index(&self, descriptor_id: DescriptorId) -> u32 {
        self.inner()
            .all_spks()
-            // This range is filtering out the spks with a keychain different than
-            // `keychain`. We don't use filter here as range is more optimized.
-            .range((keychain.clone(), u32::MIN)..(keychain.clone(), u32::MAX))
+            // This range is keeping only the spks with descriptor_id equal to
+            // `descriptor_id`. We don't use filter here as range is more optimized.
+            .range((descriptor_id, u32::MIN)..(descriptor_id, u32::MAX))
            .last()
            .map_or(0, |((_, index), _)| *index + 1)
    }
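The range trick in `next_store_index` works because tuple keys sort lexicographically, so one `range` query replaces a linear `filter` scan. A std-only sketch (a `u8` stands in for `DescriptorId`):

```rust
use std::collections::BTreeMap;

// Tuple keys `(id, index)` sort lexicographically, so a half-open range over
// `(id, u32::MIN)..(id, u32::MAX)` selects exactly the entries stored for one
// id; the last entry's index plus one is the next free storage index.
fn next_store_index(spks: &BTreeMap<(u8, u32), &'static str>, id: u8) -> u32 {
    spks.range((id, u32::MIN)..(id, u32::MAX))
        .last()
        .map_or(0, |((_, index), _)| *index + 1)
}

fn main() {
    let mut spks = BTreeMap::new();
    spks.insert((1, 0), "spk_a0");
    spks.insert((1, 1), "spk_a1");
    spks.insert((2, 0), "spk_b0");
    assert_eq!(next_store_index(&spks, 1), 2);
    assert_eq!(next_store_index(&spks, 2), 1);
    assert_eq!(next_store_index(&spks, 3), 0); // no entries for id 3
}
```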
-    /// Get an unbounded spk iterator over a given `keychain`.
-    ///
-    /// # Panics
-    ///
-    /// This will panic if the given `keychain`'s descriptor does not exist.
-    pub fn unbounded_spk_iter(&self, keychain: &K) -> SpkIterator<Descriptor<DescriptorPublicKey>> {
-        SpkIterator::new(
-            self.keychains
-                .get(keychain)
-                .expect("keychain does not exist")
-                .clone(),
-        )
+    /// Get an unbounded spk iterator over a given `keychain`. Returns `None` if the provided
+    /// keychain doesn't exist.
+    pub fn unbounded_spk_iter(
+        &self,
+        keychain: &K,
+    ) -> Option<SpkIterator<Descriptor<DescriptorPublicKey>>> {
+        let descriptor = self.keychains_to_descriptors.get(keychain)?.1.clone();
+        Some(SpkIterator::new(descriptor))
    }
    /// Get unbounded spk iterators for all keychains.
    pub fn all_unbounded_spk_iters(
        &self,
    ) -> BTreeMap<K, SpkIterator<Descriptor<DescriptorPublicKey>>> {
-        self.keychains
-            .iter()
-            .map(|(k, descriptor)| (k.clone(), SpkIterator::new(descriptor.clone())))
+        self.keychains_to_descriptors
+            .iter()
+            .map(|(k, (_, descriptor))| (k.clone(), SpkIterator::new(descriptor.clone())))
            .collect()
    }
@@ -394,18 +599,27 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
        &self,
        range: impl RangeBounds<K>,
    ) -> impl DoubleEndedIterator<Item = (&K, u32, &Script)> + Clone {
-        self.keychains.range(range).flat_map(|(keychain, _)| {
-            let start = Bound::Included((keychain.clone(), u32::MIN));
-            let end = match self.last_revealed.get(keychain) {
-                Some(last_revealed) => Bound::Included((keychain.clone(), *last_revealed)),
-                None => Bound::Excluded((keychain.clone(), u32::MIN)),
-            };
-            self.inner
-                .all_spks()
-                .range((start, end))
-                .map(|((keychain, i), spk)| (keychain, *i, spk.as_script()))
-        })
+        self.keychains_to_descriptors
+            .range(range)
+            .flat_map(|(_, (descriptor_id, _))| {
+                let start = Bound::Included((*descriptor_id, u32::MIN));
+                let end = match self.last_revealed.get(descriptor_id) {
+                    Some(last_revealed) => Bound::Included((*descriptor_id, *last_revealed)),
+                    None => Bound::Excluded((*descriptor_id, u32::MIN)),
+                };
+                self.inner
+                    .all_spks()
+                    .range((start, end))
+                    .map(|((descriptor_id, i), spk)| {
+                        (
+                            self.keychain_of_desc_id(descriptor_id)
+                                .expect("must have keychain"),
+                            *i,
+                            spk.as_script(),
+                        )
+                    })
+            })
    }
    /// Iterate over revealed spks of the given `keychain`.
@@ -419,20 +633,29 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
    /// Iterate over revealed, but unused, spks of all keychains.
    pub fn unused_spks(&self) -> impl DoubleEndedIterator<Item = (K, u32, &Script)> + Clone {
-        self.keychains.keys().flat_map(|keychain| {
+        self.keychains_to_descriptors.keys().flat_map(|keychain| {
            self.unused_keychain_spks(keychain)
                .map(|(i, spk)| (keychain.clone(), i, spk))
        })
    }
     /// Iterate over revealed, but unused, spks of the given `keychain`.
+    /// Returns an empty iterator if the provided keychain doesn't exist.
     pub fn unused_keychain_spks(
         &self,
         keychain: &K,
     ) -> impl DoubleEndedIterator<Item = (u32, &Script)> + Clone {
-        let next_i = self.last_revealed.get(keychain).map_or(0, |&i| i + 1);
+        let desc_id = self
+            .keychains_to_descriptors
+            .get(keychain)
+            .map(|(desc_id, _)| *desc_id)
+            // We use a dummy desc id if we can't find the real one in our map. In this way,
+            // if this method was to be called with a non-existent keychain, we would return an
+            // empty iterator
+            .unwrap_or_else(|| DescriptorId::from_byte_array([0; 32]));
+        let next_i = self.last_revealed.get(&desc_id).map_or(0, |&i| i + 1);
         self.inner
-            .unused_spks((keychain.clone(), u32::MIN)..(keychain.clone(), next_i))
+            .unused_spks((desc_id, u32::MIN)..(desc_id, next_i))
             .map(|((_, i), spk)| (*i, spk))
     }
@@ -447,17 +670,15 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
     ///
     /// Not checking the second field of the tuple may result in address reuse.
     ///
-    /// # Panics
-    ///
-    /// Panics if the `keychain` does not exist.
-    pub fn next_index(&self, keychain: &K) -> (u32, bool) {
-        let descriptor = self.keychains.get(keychain).expect("keychain must exist");
-        let last_index = self.last_revealed.get(keychain).cloned();
+    /// Returns None if the provided `keychain` doesn't exist.
+    pub fn next_index(&self, keychain: &K) -> Option<(u32, bool)> {
+        let (descriptor_id, descriptor) = self.keychains_to_descriptors.get(keychain)?;
+        let last_index = self.last_revealed.get(descriptor_id).cloned();

         // we can only get the next index if the wildcard exists.
         let has_wildcard = descriptor.has_wildcard();

-        match last_index {
+        Some(match last_index {
             // if there is no index, next_index is always 0.
             None => (0, true),
             // descriptors without wildcards can only have one index.
@@ -469,19 +690,27 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
             Some(index) if index == BIP32_MAX_INDEX => (index, false),
             // get the next derivation index.
             Some(index) => (index + 1, true),
-        }
+        })
     }
     /// Get the last derivation index that is revealed for each keychain.
     ///
     /// Keychains with no revealed indices will not be included in the returned [`BTreeMap`].
-    pub fn last_revealed_indices(&self) -> &BTreeMap<K, u32> {
-        &self.last_revealed
+    pub fn last_revealed_indices(&self) -> BTreeMap<K, u32> {
+        self.last_revealed
+            .iter()
+            .filter_map(|(desc_id, index)| {
+                let keychain = self.keychain_of_desc_id(desc_id)?;
+                Some((keychain.clone(), *index))
+            })
+            .collect()
     }

-    /// Get the last derivation index revealed for `keychain`.
+    /// Get the last derivation index revealed for `keychain`. Returns None if the keychain doesn't
+    /// exist, or if the keychain doesn't have any revealed scripts.
     pub fn last_revealed_index(&self, keychain: &K) -> Option<u32> {
-        self.last_revealed.get(keychain).cloned()
+        let descriptor_id = self.keychains_to_descriptors.get(keychain)?.0;
+        self.last_revealed.get(&descriptor_id).cloned()
     }

     /// Convenience method to call [`Self::reveal_to_target`] on multiple keychains.
@@ -496,16 +725,77 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
         let mut spks = BTreeMap::new();
         for (keychain, &index) in keychains {
-            let (new_spks, new_changeset) = self.reveal_to_target(keychain, index);
-            if !new_changeset.is_empty() {
-                spks.insert(keychain.clone(), new_spks);
-                changeset.append(new_changeset.clone());
+            if let Some((new_spks, new_changeset)) = self.reveal_to_target(keychain, index) {
+                if !new_changeset.is_empty() {
+                    spks.insert(keychain.clone(), new_spks);
+                    changeset.append(new_changeset.clone());
+                }
             }
         }

         (spks, changeset)
     }

+    /// Convenience method to call `reveal_to_target` with a descriptor_id instead of a keychain.
+    /// This is useful for revealing spks of descriptors for which we don't have a keychain
+    /// tracked.
+    /// Refer to the `reveal_to_target` documentation for more.
+    ///
+    /// Returns None if the provided `descriptor_id` doesn't correspond to a tracked descriptor.
+    fn reveal_to_target_with_id(
+        &mut self,
+        descriptor_id: DescriptorId,
+        target_index: u32,
+    ) -> Option<(
+        SpkIterator<Descriptor<DescriptorPublicKey>>,
+        super::ChangeSet<K>,
+    )> {
+        let descriptor = self
+            .descriptor_ids_to_descriptors
+            .get(&descriptor_id)?
+            .clone();
+        let has_wildcard = descriptor.has_wildcard();
+        let target_index = if has_wildcard { target_index } else { 0 };
+        let next_reveal_index = self
+            .last_revealed
+            .get(&descriptor_id)
+            .map_or(0, |index| *index + 1);
+        debug_assert!(next_reveal_index + self.lookahead >= self.next_store_index(descriptor_id));
+
+        // If the target_index is already revealed, we are done
+        if next_reveal_index > target_index {
+            return Some((
+                SpkIterator::new_with_range(descriptor, next_reveal_index..next_reveal_index),
+                super::ChangeSet::default(),
+            ));
+        }
+
+        // We range over the indexes that are not stored and insert their spks in the index.
+        // Indexes from next_reveal_index to next_reveal_index + lookahead are already stored (due
+        // to lookahead), so we only range from next_reveal_index + lookahead to target + lookahead
+        let range = next_reveal_index + self.lookahead..=target_index + self.lookahead;
+        for (new_index, new_spk) in SpkIterator::new_with_range(descriptor.clone(), range) {
+            let _inserted = self.inner.insert_spk((descriptor_id, new_index), new_spk);
+            debug_assert!(_inserted, "must not have existing spk");
+            debug_assert!(
+                has_wildcard || new_index == 0,
+                "non-wildcard descriptors must not iterate past index 0"
+            );
+        }
+
+        let _old_index = self.last_revealed.insert(descriptor_id, target_index);
+        debug_assert!(_old_index < Some(target_index));
+        Some((
+            SpkIterator::new_with_range(descriptor, next_reveal_index..target_index + 1),
+            super::ChangeSet {
+                keychains_added: BTreeMap::new(),
+                last_revealed: core::iter::once((descriptor_id, target_index)).collect(),
+            },
+        ))
+    }
     /// Reveals script pubkeys of the `keychain`'s descriptor **up to and including** the
     /// `target_index`.
     ///
@@ -517,84 +807,46 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
     /// [`super::ChangeSet`], which reports updates to the latest revealed index. If no new script
     /// pubkeys are revealed, then both of these will be empty.
     ///
-    /// # Panics
-    ///
-    /// Panics if `keychain` does not exist.
+    /// Returns None if the provided `keychain` doesn't exist.
     pub fn reveal_to_target(
         &mut self,
         keychain: &K,
         target_index: u32,
-    ) -> (
+    ) -> Option<(
         SpkIterator<Descriptor<DescriptorPublicKey>>,
         super::ChangeSet<K>,
-    ) {
-        let descriptor = self.keychains.get(keychain).expect("keychain must exist");
-        let has_wildcard = descriptor.has_wildcard();
-        let target_index = if has_wildcard { target_index } else { 0 };
-        let next_reveal_index = self
-            .last_revealed
-            .get(keychain)
-            .map_or(0, |index| *index + 1);
-        debug_assert!(next_reveal_index + self.lookahead >= self.next_store_index(keychain));
-
-        // If the target_index is already revealed, we are done
-        if next_reveal_index > target_index {
-            return (
-                SpkIterator::new_with_range(
-                    descriptor.clone(),
-                    next_reveal_index..next_reveal_index,
-                ),
-                super::ChangeSet::default(),
-            );
-        }
-
-        // We range over the indexes that are not stored and insert their spks in the index.
-        // Indexes from next_reveal_index to next_reveal_index + lookahead are already stored (due
-        // to lookahead), so we only range from next_reveal_index + lookahead to target + lookahead
-        let range = next_reveal_index + self.lookahead..=target_index + self.lookahead;
-        for (new_index, new_spk) in SpkIterator::new_with_range(descriptor, range) {
-            let _inserted = self
-                .inner
-                .insert_spk((keychain.clone(), new_index), new_spk);
-            debug_assert!(_inserted, "must not have existing spk");
-            debug_assert!(
-                has_wildcard || new_index == 0,
-                "non-wildcard descriptors must not iterate past index 0"
-            );
-        }
-
-        let _old_index = self.last_revealed.insert(keychain.clone(), target_index);
-        debug_assert!(_old_index < Some(target_index));
-        (
-            SpkIterator::new_with_range(descriptor.clone(), next_reveal_index..target_index + 1),
-            super::ChangeSet(core::iter::once((keychain.clone(), target_index)).collect()),
-        )
+    )> {
+        let descriptor_id = self.keychains_to_descriptors.get(keychain)?.0;
+        self.reveal_to_target_with_id(descriptor_id, target_index)
     }
     /// Attempts to reveal the next script pubkey for `keychain`.
     ///
     /// Returns the derivation index of the revealed script pubkey, the revealed script pubkey and a
     /// [`super::ChangeSet`] which represents changes in the last revealed index (if any).
+    /// Returns None if the provided keychain doesn't exist.
     ///
     /// When a new script cannot be revealed, we return the last revealed script and an empty
     /// [`super::ChangeSet`]. There are two scenarios when a new script pubkey cannot be derived:
     ///
     /// 1. The descriptor has no wildcard and already has one script revealed.
     /// 2. The descriptor has already revealed scripts up to the numeric bound.
-    ///
-    /// # Panics
-    ///
-    /// Panics if the `keychain` does not exist.
-    pub fn reveal_next_spk(&mut self, keychain: &K) -> ((u32, &Script), super::ChangeSet<K>) {
-        let (next_index, _) = self.next_index(keychain);
-        let changeset = self.reveal_to_target(keychain, next_index).1;
+    /// 3. There is no descriptor associated with the given keychain.
+    pub fn reveal_next_spk(
+        &mut self,
+        keychain: &K,
+    ) -> Option<((u32, &Script), super::ChangeSet<K>)> {
+        let descriptor_id = self.keychains_to_descriptors.get(keychain)?.0;
+        let (next_index, _) = self.next_index(keychain).expect("We know keychain exists");
+        let changeset = self
+            .reveal_to_target(keychain, next_index)
+            .expect("We know keychain exists")
+            .1;
         let script = self
             .inner
-            .spk_at_index(&(keychain.clone(), next_index))
+            .spk_at_index(&(descriptor_id, next_index))
             .expect("script must already be stored");
-        ((next_index, script), changeset)
+        Some(((next_index, script), changeset))
     }
     /// Gets the next unused script pubkey in the keychain. I.e., the script pubkey with the lowest
@@ -606,21 +858,22 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
     /// has used all scripts up to the derivation bounds, then the last derived script pubkey will be
     /// returned.
     ///
-    /// # Panics
-    ///
-    /// Panics if `keychain` has never been added to the index
-    pub fn next_unused_spk(&mut self, keychain: &K) -> ((u32, &Script), super::ChangeSet<K>) {
+    /// Returns None if the provided keychain doesn't exist.
+    pub fn next_unused_spk(
+        &mut self,
+        keychain: &K,
+    ) -> Option<((u32, &Script), super::ChangeSet<K>)> {
         let need_new = self.unused_keychain_spks(keychain).next().is_none();
         // this rather strange branch is needed because of some lifetime issues
         if need_new {
             self.reveal_next_spk(keychain)
         } else {
-            (
+            Some((
                 self.unused_keychain_spks(keychain)
                     .next()
                     .expect("we already know next exists"),
                 super::ChangeSet::default(),
-            )
+            ))
         }
     }
@@ -639,21 +892,35 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
         &'a self,
         range: impl RangeBounds<K> + 'a,
     ) -> impl DoubleEndedIterator<Item = (&'a K, u32, OutPoint)> + 'a {
-        let bounds = Self::map_to_inner_bounds(range);
+        let bounds = self.map_to_inner_bounds(range);
         self.inner
             .outputs_in_range(bounds)
-            .map(move |((keychain, i), op)| (keychain, *i, op))
+            .map(move |((desc_id, i), op)| {
+                let keychain = self
+                    .keychain_of_desc_id(desc_id)
+                    .expect("keychain must exist");
+                (keychain, *i, op)
+            })
     }

-    fn map_to_inner_bounds(bound: impl RangeBounds<K>) -> impl RangeBounds<(K, u32)> {
+    fn map_to_inner_bounds(
+        &self,
+        bound: impl RangeBounds<K>,
+    ) -> impl RangeBounds<(DescriptorId, u32)> {
+        let get_desc_id = |keychain| {
+            self.keychains_to_descriptors
+                .get(keychain)
+                .map(|(desc_id, _)| *desc_id)
+                .unwrap_or_else(|| DescriptorId::from_byte_array([0; 32]))
+        };
         let start = match bound.start_bound() {
-            Bound::Included(keychain) => Bound::Included((keychain.clone(), u32::MIN)),
-            Bound::Excluded(keychain) => Bound::Excluded((keychain.clone(), u32::MAX)),
+            Bound::Included(keychain) => Bound::Included((get_desc_id(keychain), u32::MIN)),
+            Bound::Excluded(keychain) => Bound::Excluded((get_desc_id(keychain), u32::MAX)),
             Bound::Unbounded => Bound::Unbounded,
         };
         let end = match bound.end_bound() {
-            Bound::Included(keychain) => Bound::Included((keychain.clone(), u32::MAX)),
-            Bound::Excluded(keychain) => Bound::Excluded((keychain.clone(), u32::MIN)),
+            Bound::Included(keychain) => Bound::Included((get_desc_id(keychain), u32::MAX)),
+            Bound::Excluded(keychain) => Bound::Excluded((get_desc_id(keychain), u32::MIN)),
             Bound::Unbounded => Bound::Unbounded,
         };
@@ -669,7 +936,7 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
     /// Returns the highest derivation index of each keychain that [`KeychainTxOutIndex`] has found
     /// a [`TxOut`] with it's script pubkey.
     pub fn last_used_indices(&self) -> BTreeMap<K, u32> {
-        self.keychains
+        self.keychains_to_descriptors
             .iter()
             .filter_map(|(keychain, _)| {
                 self.last_used_index(keychain)
@@ -678,9 +945,27 @@ impl<K: Clone + Ord + Debug> KeychainTxOutIndex<K> {
             .collect()
     }

-    /// Applies the derivation changeset to the [`KeychainTxOutIndex`], extending the number of
-    /// derived scripts per keychain, as specified in the `changeset`.
+    /// Applies the derivation changeset to the [`KeychainTxOutIndex`], as specified in the
+    /// [`ChangeSet::append`] documentation:
+    /// - Extends the number of derived scripts per keychain
+    /// - Adds new descriptors introduced
+    /// - If a descriptor is introduced for a keychain that already had a descriptor, overwrites
+    ///   the old descriptor
     pub fn apply_changeset(&mut self, changeset: super::ChangeSet<K>) {
-        let _ = self.reveal_to_target_multi(&changeset.0);
+        let ChangeSet {
+            keychains_added,
+            last_revealed,
+        } = changeset;
+        for (keychain, descriptor) in keychains_added {
+            let _ = self.insert_descriptor(keychain, descriptor);
+        }
+        let last_revealed = last_revealed
+            .into_iter()
+            .filter_map(|(desc_id, index)| {
+                let keychain = self.keychain_of_desc_id(&desc_id)?;
+                Some((keychain.clone(), index))
+            })
+            .collect();
+        let _ = self.reveal_to_target_multi(&last_revealed);
     }
 }

View File

@@ -44,7 +44,7 @@ pub use miniscript;
 #[cfg(feature = "miniscript")]
 mod descriptor_ext;
 #[cfg(feature = "miniscript")]
-pub use descriptor_ext::DescriptorExt;
+pub use descriptor_ext::{DescriptorExt, DescriptorId};
 #[cfg(feature = "miniscript")]
 mod spk_iter;
#[cfg(feature = "miniscript")] #[cfg(feature = "miniscript")]

View File

@@ -1,11 +1,18 @@
 //! Helper types for spk-based blockchain clients.

+use crate::{
+    collections::{BTreeMap, HashMap},
+    local_chain::CheckPoint,
+    ConfirmationTimeHeightAnchor, TxGraph,
+};
+use alloc::{boxed::Box, sync::Arc, vec::Vec};
+use bitcoin::{OutPoint, Script, ScriptBuf, Transaction, Txid};
 use core::{fmt::Debug, marker::PhantomData, ops::RangeBounds};
-use alloc::{boxed::Box, collections::BTreeMap, vec::Vec};
-use bitcoin::{OutPoint, Script, ScriptBuf, Txid};
-
-use crate::{local_chain::CheckPoint, ConfirmationTimeHeightAnchor, TxGraph};
+
+/// A cache of [`Arc`]-wrapped full transactions, identified by their [`Txid`]s.
+///
+/// This is used by the chain-source to avoid re-fetching full transactions.
+pub type TxCache = HashMap<Txid, Arc<Transaction>>;
 /// Data required to perform a spk-based blockchain client sync.
 ///
@@ -17,6 +24,8 @@ pub struct SyncRequest {
     ///
     /// [`LocalChain::tip`]: crate::local_chain::LocalChain::tip
     pub chain_tip: CheckPoint,
+    /// Cache of full transactions, so the chain-source can avoid re-fetching.
+    pub tx_cache: TxCache,
     /// Transactions that spend from or to these indexed script pubkeys.
     pub spks: Box<dyn ExactSizeIterator<Item = ScriptBuf> + Send>,
     /// Transactions with these txids.
@@ -30,12 +39,36 @@ impl SyncRequest {
     pub fn from_chain_tip(cp: CheckPoint) -> Self {
         Self {
             chain_tip: cp,
+            tx_cache: TxCache::new(),
             spks: Box::new(core::iter::empty()),
             txids: Box::new(core::iter::empty()),
             outpoints: Box::new(core::iter::empty()),
         }
     }

+    /// Add to the [`TxCache`] held by the request.
+    ///
+    /// This consumes the [`SyncRequest`] and returns the updated one.
+    #[must_use]
+    pub fn cache_txs<T>(mut self, full_txs: impl IntoIterator<Item = (Txid, T)>) -> Self
+    where
+        T: Into<Arc<Transaction>>,
+    {
+        self.tx_cache = full_txs
+            .into_iter()
+            .map(|(txid, tx)| (txid, tx.into()))
+            .collect();
+        self
+    }
+
+    /// Add all transactions from [`TxGraph`] into the [`TxCache`].
+    ///
+    /// This consumes the [`SyncRequest`] and returns the updated one.
+    #[must_use]
+    pub fn cache_graph_txs<A>(self, graph: &TxGraph<A>) -> Self {
+        self.cache_txs(graph.full_txs().map(|tx_node| (tx_node.txid, tx_node.tx)))
+    }
+
     /// Set the [`Script`]s that will be synced against.
     ///
     /// This consumes the [`SyncRequest`] and returns the updated one.
@@ -194,6 +227,8 @@ pub struct FullScanRequest<K> {
     ///
     /// [`LocalChain::tip`]: crate::local_chain::LocalChain::tip
     pub chain_tip: CheckPoint,
+    /// Cache of full transactions, so the chain-source can avoid re-fetching.
+    pub tx_cache: TxCache,
     /// Iterators of script pubkeys indexed by the keychain index.
     pub spks_by_keychain: BTreeMap<K, Box<dyn Iterator<Item = (u32, ScriptBuf)> + Send>>,
 }
@@ -204,10 +239,34 @@ impl<K: Ord + Clone> FullScanRequest<K> {
     pub fn from_chain_tip(chain_tip: CheckPoint) -> Self {
         Self {
             chain_tip,
+            tx_cache: TxCache::new(),
             spks_by_keychain: BTreeMap::new(),
         }
     }

+    /// Add to the [`TxCache`] held by the request.
+    ///
+    /// This consumes the [`SyncRequest`] and returns the updated one.
+    #[must_use]
+    pub fn cache_txs<T>(mut self, full_txs: impl IntoIterator<Item = (Txid, T)>) -> Self
+    where
+        T: Into<Arc<Transaction>>,
+    {
+        self.tx_cache = full_txs
+            .into_iter()
+            .map(|(txid, tx)| (txid, tx.into()))
+            .collect();
+        self
+    }
+
+    /// Add all transactions from [`TxGraph`] into the [`TxCache`].
+    ///
+    /// This consumes the [`SyncRequest`] and returns the updated one.
+    #[must_use]
+    pub fn cache_graph_txs<A>(self, graph: &TxGraph<A>) -> Self {
+        self.cache_txs(graph.full_txs().map(|tx_node| (tx_node.txid, tx_node.tx)))
+    }
+
     /// Construct a new [`FullScanRequest`] from a given `chain_tip` and `index`.
     ///
     /// Unbounded script pubkey iterators for each keychain (`K`) are extracted using
@@ -316,9 +375,9 @@ impl<K: Ord + Clone> FullScanRequest<K> {
 /// Data returned from a spk-based blockchain client full scan.
 ///
 /// See also [`FullScanRequest`].
-pub struct FullScanResult<K> {
+pub struct FullScanResult<K, A = ConfirmationTimeHeightAnchor> {
     /// The update to apply to the receiving [`LocalChain`](crate::local_chain::LocalChain).
-    pub graph_update: TxGraph<ConfirmationTimeHeightAnchor>,
+    pub graph_update: TxGraph<A>,
     /// The update to apply to the receiving [`TxGraph`].
     pub chain_update: CheckPoint,
     /// Last active indices for the corresponding keychains (`K`).

View File

@@ -158,8 +158,8 @@ mod test {
         let (external_descriptor,_) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/0/*)").unwrap();
         let (internal_descriptor,_) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/1/*)").unwrap();
-        txout_index.add_keychain(TestKeychain::External, external_descriptor.clone());
-        txout_index.add_keychain(TestKeychain::Internal, internal_descriptor.clone());
+        let _ = txout_index.insert_descriptor(TestKeychain::External, external_descriptor.clone());
+        let _ = txout_index.insert_descriptor(TestKeychain::Internal, internal_descriptor.clone());
         (txout_index, external_descriptor, internal_descriptor)
     }
@@ -258,18 +258,10 @@ mod test {
             None
         );
     }

-    // The following dummy traits were created to test if SpkIterator is working properly.
-    #[allow(unused)]
-    trait TestSendStatic: Send + 'static {
-        fn test(&self) -> u32 {
-            20
-        }
-    }
-
-    impl TestSendStatic for SpkIterator<Descriptor<DescriptorPublicKey>> {
-        fn test(&self) -> u32 {
-            20
-        }
-    }
+    #[test]
+    fn spk_iterator_is_send_and_static() {
+        fn is_send_and_static<A: Send + 'static>() {}
+        is_send_and_static::<SpkIterator<Descriptor<DescriptorPublicKey>>>()
+    }
 }

View File

@@ -4,7 +4,7 @@ use crate::{
     collections::{hash_map::Entry, BTreeMap, BTreeSet, HashMap},
     indexed_tx_graph::Indexer,
 };
-use bitcoin::{OutPoint, Script, ScriptBuf, Transaction, TxOut, Txid};
+use bitcoin::{Amount, OutPoint, Script, ScriptBuf, SignedAmount, Transaction, TxOut, Txid};

 /// An index storing [`TxOut`]s that have a script pubkey that matches those in a list.
 ///
@@ -275,21 +275,25 @@ impl<I: Clone + Ord> SpkTxOutIndex<I> {
     /// output. For `sent` to be computed correctly, the output being spent must have already been
     /// scanned by the index. Calculating received just uses the [`Transaction`] outputs directly,
     /// so it will be correct even if it has not been scanned.
-    pub fn sent_and_received(&self, tx: &Transaction, range: impl RangeBounds<I>) -> (u64, u64) {
-        let mut sent = 0;
-        let mut received = 0;
+    pub fn sent_and_received(
+        &self,
+        tx: &Transaction,
+        range: impl RangeBounds<I>,
+    ) -> (Amount, Amount) {
+        let mut sent = Amount::ZERO;
+        let mut received = Amount::ZERO;

         for txin in &tx.input {
             if let Some((index, txout)) = self.txout(txin.previous_output) {
                 if range.contains(index) {
-                    sent += txout.value.to_sat();
+                    sent += txout.value;
                 }
             }
         }
         for txout in &tx.output {
             if let Some(index) = self.index_of_spk(&txout.script_pubkey) {
                 if range.contains(index) {
-                    received += txout.value.to_sat();
+                    received += txout.value;
                 }
             }
         }
@@ -301,9 +305,10 @@ impl<I: Clone + Ord> SpkTxOutIndex<I> {
     /// for calling [`sent_and_received`] and subtracting sent from received.
     ///
     /// [`sent_and_received`]: Self::sent_and_received
-    pub fn net_value(&self, tx: &Transaction, range: impl RangeBounds<I>) -> i64 {
+    pub fn net_value(&self, tx: &Transaction, range: impl RangeBounds<I>) -> SignedAmount {
         let (sent, received) = self.sent_and_received(tx, range);
-        received as i64 - sent as i64
+        received.to_signed().expect("valid `SignedAmount`")
+            - sent.to_signed().expect("valid `SignedAmount`")
     }

     /// Whether any of the inputs of this transaction spend a txout tracked or whether any output

View File

@@ -95,7 +95,7 @@ use crate::{
 use alloc::collections::vec_deque::VecDeque;
 use alloc::sync::Arc;
 use alloc::vec::Vec;
-use bitcoin::{OutPoint, Script, Transaction, TxOut, Txid};
+use bitcoin::{Amount, OutPoint, Script, Transaction, TxOut, Txid};
 use core::fmt::{self, Formatter};
 use core::{
     convert::Infallible,
@@ -516,12 +516,12 @@ impl<A: Clone + Ord> TxGraph<A> {
     /// Inserts the given transaction into [`TxGraph`].
     ///
     /// The [`ChangeSet`] returned will be empty if `tx` already exists.
-    pub fn insert_tx(&mut self, tx: Transaction) -> ChangeSet<A> {
+    pub fn insert_tx<T: Into<Arc<Transaction>>>(&mut self, tx: T) -> ChangeSet<A> {
+        let tx = tx.into();
         let mut update = Self::default();
-        update.txs.insert(
-            tx.txid(),
-            (TxNodeInternal::Whole(tx.into()), BTreeSet::new(), 0),
-        );
+        update
+            .txs
+            .insert(tx.txid(), (TxNodeInternal::Whole(tx), BTreeSet::new(), 0));
         self.apply_update(update)
     }
@@ -1155,10 +1155,10 @@ impl<A: Anchor> TxGraph<A> {
         outpoints: impl IntoIterator<Item = (OI, OutPoint)>,
         mut trust_predicate: impl FnMut(&OI, &Script) -> bool,
     ) -> Result<Balance, C::Error> {
-        let mut immature = 0;
-        let mut trusted_pending = 0;
-        let mut untrusted_pending = 0;
-        let mut confirmed = 0;
+        let mut immature = Amount::ZERO;
+        let mut trusted_pending = Amount::ZERO;
+        let mut untrusted_pending = Amount::ZERO;
+        let mut confirmed = Amount::ZERO;

         for res in self.try_filter_chain_unspents(chain, chain_tip, outpoints) {
             let (spk_i, txout) = res?;
@@ -1166,16 +1166,16 @@ impl<A: Anchor> TxGraph<A> {
             match &txout.chain_position {
                 ChainPosition::Confirmed(_) => {
                     if txout.is_confirmed_and_spendable(chain_tip.height) {
-                        confirmed += txout.txout.value.to_sat();
+                        confirmed += txout.txout.value;
                     } else if !txout.is_mature(chain_tip.height) {
-                        immature += txout.txout.value.to_sat();
+                        immature += txout.txout.value;
                     }
                 }
                 ChainPosition::Unconfirmed(_) => {
                     if trust_predicate(&spk_i, &txout.txout.script_pubkey) {
-                        trusted_pending += txout.txout.value.to_sat();
+                        trusted_pending += txout.txout.value;
                     } else {
-                        untrusted_pending += txout.txout.value.to_sat();
+                        untrusted_pending += txout.txout.value;
                     }
                 }
             }

View File

@@ -1,3 +1,5 @@
+#![cfg(feature = "miniscript")]
+
 mod tx_template;
 #[allow(unused_imports)]
 pub use tx_template::*;
@@ -73,3 +75,15 @@ pub fn new_tx(lt: u32) -> bitcoin::Transaction {
         output: vec![],
     }
 }

+#[allow(unused)]
+pub const DESCRIPTORS: [&str; 7] = [
+    "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/0/*)",
+    "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/1/*)",
+    "wpkh([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/1/0/*)",
+    "tr(tprv8ZgxMBicQKsPd3krDUsBAmtnRsK3rb8u5yi1zhQgMhF1tR8MW7xfE4rnrbbsrbPR52e7rKapu6ztw1jXveJSCGHEriUGZV7mCe88duLp5pj/86'/1'/0'/0/*)",
+    "tr(tprv8ZgxMBicQKsPd3krDUsBAmtnRsK3rb8u5yi1zhQgMhF1tR8MW7xfE4rnrbbsrbPR52e7rKapu6ztw1jXveJSCGHEriUGZV7mCe88duLp5pj/86'/1'/0'/1/*)",
+    "wpkh(xprv9s21ZrQH143K4EXURwMHuLS469fFzZyXk7UUpdKfQwhoHcAiYTakpe8pMU2RiEdvrU9McyuE7YDoKcXkoAwEGoK53WBDnKKv2zZbb9BzttX/1/0/*)",
+    // non-wildcard
+    "wpkh([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/1/0)",
+];

View File

@@ -1,3 +1,5 @@
+#![cfg(feature = "miniscript")]
+
 use rand::distributions::{Alphanumeric, DistString};
 use std::collections::HashMap;
@@ -52,7 +54,8 @@ impl TxOutTemplate {
 pub fn init_graph<'a, A: Anchor + Clone + 'a>(
     tx_templates: impl IntoIterator<Item = &'a TxTemplate<'a, A>>,
 ) -> (TxGraph<A>, SpkTxOutIndex<u32>, HashMap<&'a str, Txid>) {
-    let (descriptor, _) = Descriptor::parse_descriptor(&Secp256k1::signing_only(), "tr(tprv8ZgxMBicQKsPd3krDUsBAmtnRsK3rb8u5yi1zhQgMhF1tR8MW7xfE4rnrbbsrbPR52e7rKapu6ztw1jXveJSCGHEriUGZV7mCe88duLp5pj/86'/1'/0'/0/*)").unwrap();
+    let (descriptor, _) =
+        Descriptor::parse_descriptor(&Secp256k1::signing_only(), super::DESCRIPTORS[2]).unwrap();
     let mut graph = TxGraph::<A>::default();
     let mut spk_index = SpkTxOutIndex::default();
     (0..10).for_each(|index| {

View File

@@ -1,13 +1,16 @@
+#![cfg(feature = "miniscript")]
+
 #[macro_use]
 mod common;
 use std::{collections::BTreeSet, sync::Arc};
 
+use crate::common::DESCRIPTORS;
 use bdk_chain::{
     indexed_tx_graph::{self, IndexedTxGraph},
     keychain::{self, Balance, KeychainTxOutIndex},
     local_chain::LocalChain,
-    tx_graph, ChainPosition, ConfirmationHeightAnchor,
+    tx_graph, ChainPosition, ConfirmationHeightAnchor, DescriptorExt,
 };
 use bitcoin::{
     secp256k1::Secp256k1, Amount, OutPoint, Script, ScriptBuf, Transaction, TxIn, TxOut,
@@ -23,8 +26,7 @@ use miniscript::Descriptor;
 /// agnostic.
 #[test]
 fn insert_relevant_txs() {
-    const DESCRIPTOR: &str = "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/0/*)";
-    let (descriptor, _) = Descriptor::parse_descriptor(&Secp256k1::signing_only(), DESCRIPTOR)
+    let (descriptor, _) = Descriptor::parse_descriptor(&Secp256k1::signing_only(), DESCRIPTORS[0])
         .expect("must be valid");
     let spk_0 = descriptor.at_derivation_index(0).unwrap().script_pubkey();
     let spk_1 = descriptor.at_derivation_index(9).unwrap().script_pubkey();
@@ -32,7 +34,7 @@ fn insert_relevant_txs() {
     let mut graph = IndexedTxGraph::<ConfirmationHeightAnchor, KeychainTxOutIndex<()>>::new(
         KeychainTxOutIndex::new(10),
     );
-    graph.index.add_keychain((), descriptor);
+    let _ = graph.index.insert_descriptor((), descriptor.clone());
 
     let tx_a = Transaction {
         output: vec![
@@ -71,7 +73,10 @@ fn insert_relevant_txs() {
             txs: txs.iter().cloned().map(Arc::new).collect(),
             ..Default::default()
         },
-        indexer: keychain::ChangeSet([((), 9_u32)].into()),
+        indexer: keychain::ChangeSet {
+            last_revealed: [(descriptor.descriptor_id(), 9_u32)].into(),
+            keychains_added: [].into(),
+        },
     };
 
     assert_eq!(
@@ -79,7 +84,16 @@ fn insert_relevant_txs() {
         changeset,
     );
 
-    assert_eq!(graph.initial_changeset(), changeset,);
+    // The initial changeset will also contain info about the keychain we added
+    let initial_changeset = indexed_tx_graph::ChangeSet {
+        graph: changeset.graph,
+        indexer: keychain::ChangeSet {
+            last_revealed: changeset.indexer.last_revealed,
+            keychains_added: [((), descriptor)].into(),
+        },
+    };
+    assert_eq!(graph.initial_changeset(), initial_changeset);
 }
 
 /// Ensure consistency IndexedTxGraph list_* and balance methods. These methods lists
@@ -117,15 +131,17 @@ fn test_list_owned_txouts() {
     // Initiate IndexedTxGraph
-    let (desc_1, _) = Descriptor::parse_descriptor(&Secp256k1::signing_only(), "tr(tprv8ZgxMBicQKsPd3krDUsBAmtnRsK3rb8u5yi1zhQgMhF1tR8MW7xfE4rnrbbsrbPR52e7rKapu6ztw1jXveJSCGHEriUGZV7mCe88duLp5pj/86'/1'/0'/0/*)").unwrap();
-    let (desc_2, _) = Descriptor::parse_descriptor(&Secp256k1::signing_only(), "tr(tprv8ZgxMBicQKsPd3krDUsBAmtnRsK3rb8u5yi1zhQgMhF1tR8MW7xfE4rnrbbsrbPR52e7rKapu6ztw1jXveJSCGHEriUGZV7mCe88duLp5pj/86'/1'/0'/1/*)").unwrap();
+    let (desc_1, _) =
+        Descriptor::parse_descriptor(&Secp256k1::signing_only(), common::DESCRIPTORS[2]).unwrap();
+    let (desc_2, _) =
+        Descriptor::parse_descriptor(&Secp256k1::signing_only(), common::DESCRIPTORS[3]).unwrap();
 
     let mut graph = IndexedTxGraph::<ConfirmationHeightAnchor, KeychainTxOutIndex<String>>::new(
         KeychainTxOutIndex::new(10),
     );
-    graph.index.add_keychain("keychain_1".into(), desc_1);
-    graph.index.add_keychain("keychain_2".into(), desc_2);
+    let _ = graph.index.insert_descriptor("keychain_1".into(), desc_1);
+    let _ = graph.index.insert_descriptor("keychain_2".into(), desc_2);
 
     // Get trusted and untrusted addresses
@@ -135,14 +151,20 @@ fn test_list_owned_txouts() {
     {
         // we need to scope here to take immutanble reference of the graph
         for _ in 0..10 {
-            let ((_, script), _) = graph.index.reveal_next_spk(&"keychain_1".to_string());
+            let ((_, script), _) = graph
+                .index
+                .reveal_next_spk(&"keychain_1".to_string())
+                .unwrap();
             // TODO Assert indexes
             trusted_spks.push(script.to_owned());
         }
     }
     {
         for _ in 0..10 {
-            let ((_, script), _) = graph.index.reveal_next_spk(&"keychain_2".to_string());
+            let ((_, script), _) = graph
+                .index
+                .reveal_next_spk(&"keychain_2".to_string())
+                .unwrap();
             untrusted_spks.push(script.to_owned());
         }
     }
@@ -235,26 +257,18 @@ fn test_list_owned_txouts() {
             .unwrap_or_else(|| panic!("block must exist at {}", height));
         let txouts = graph
             .graph()
-            .filter_chain_txouts(
-                &local_chain,
-                chain_tip,
-                graph.index.outpoints().iter().cloned(),
-            )
+            .filter_chain_txouts(&local_chain, chain_tip, graph.index.outpoints())
             .collect::<Vec<_>>();
 
         let utxos = graph
             .graph()
-            .filter_chain_unspents(
-                &local_chain,
-                chain_tip,
-                graph.index.outpoints().iter().cloned(),
-            )
+            .filter_chain_unspents(&local_chain, chain_tip, graph.index.outpoints())
             .collect::<Vec<_>>();
 
         let balance = graph.graph().balance(
             &local_chain,
             chain_tip,
-            graph.index.outpoints().iter().cloned(),
+            graph.index.outpoints(),
             |_, spk: &Script| trusted_spks.contains(&spk.to_owned()),
         );
@@ -341,10 +355,10 @@ fn test_list_owned_txouts() {
         assert_eq!(
             balance,
             Balance {
-                immature: 70000,          // immature coinbase
-                trusted_pending: 25000,   // tx3 + tx5
-                untrusted_pending: 20000, // tx4
-                confirmed: 0              // Nothing is confirmed yet
+                immature: Amount::from_sat(70000),          // immature coinbase
+                trusted_pending: Amount::from_sat(25000),   // tx3 + tx5
+                untrusted_pending: Amount::from_sat(20000), // tx4
+                confirmed: Amount::ZERO                     // Nothing is confirmed yet
             }
         );
     }
@@ -376,10 +390,10 @@ fn test_list_owned_txouts() {
         assert_eq!(
             balance,
             Balance {
-                immature: 70000,          // immature coinbase
-                trusted_pending: 25000,   // tx3 + tx5
-                untrusted_pending: 20000, // tx4
-                confirmed: 0              // Nothing is confirmed yet
+                immature: Amount::from_sat(70000),          // immature coinbase
+                trusted_pending: Amount::from_sat(25000),   // tx3 + tx5
+                untrusted_pending: Amount::from_sat(20000), // tx4
+                confirmed: Amount::ZERO                     // Nothing is confirmed yet
             }
         );
     }
@@ -408,10 +422,10 @@ fn test_list_owned_txouts() {
         assert_eq!(
             balance,
             Balance {
-                immature: 70000,          // immature coinbase
-                trusted_pending: 15000,   // tx5
-                untrusted_pending: 20000, // tx4
-                confirmed: 10000          // tx3 got confirmed
+                immature: Amount::from_sat(70000),          // immature coinbase
+                trusted_pending: Amount::from_sat(15000),   // tx5
+                untrusted_pending: Amount::from_sat(20000), // tx4
+                confirmed: Amount::from_sat(10000)          // tx3 got confirmed
             }
         );
     }
@@ -439,10 +453,10 @@ fn test_list_owned_txouts() {
         assert_eq!(
            balance,
            Balance {
-                immature: 70000,          // immature coinbase
-                trusted_pending: 15000,   // tx5
-                untrusted_pending: 20000, // tx4
-                confirmed: 10000          // tx1 got matured
+                immature: Amount::from_sat(70000),          // immature coinbase
+                trusted_pending: Amount::from_sat(15000),   // tx5
+                untrusted_pending: Amount::from_sat(20000), // tx4
+                confirmed: Amount::from_sat(10000)          // tx1 got matured
            }
        );
    }
@@ -455,10 +469,10 @@ fn test_list_owned_txouts() {
        assert_eq!(
            balance,
            Balance {
-                immature: 0,              // coinbase matured
-                trusted_pending: 15000,   // tx5
-                untrusted_pending: 20000, // tx4
-                confirmed: 80000          // tx1 + tx3
+                immature: Amount::ZERO,                     // coinbase matured
+                trusted_pending: Amount::from_sat(15000),   // tx5
+                untrusted_pending: Amount::from_sat(20000), // tx4
+                confirmed: Amount::from_sat(80000)          // tx1 + tx3
            }
        );
    }

View File

@@ -5,36 +5,39 @@ mod common;
 use bdk_chain::{
     collections::BTreeMap,
     indexed_tx_graph::Indexer,
-    keychain::{self, KeychainTxOutIndex},
-    Append,
+    keychain::{self, ChangeSet, KeychainTxOutIndex},
+    Append, DescriptorExt, DescriptorId,
 };
 use bitcoin::{secp256k1::Secp256k1, Amount, OutPoint, ScriptBuf, Transaction, TxOut};
 use miniscript::{Descriptor, DescriptorPublicKey};
 
+use crate::common::DESCRIPTORS;
+
 #[derive(Clone, Debug, PartialEq, Eq, Ord, PartialOrd)]
 enum TestKeychain {
     External,
     Internal,
 }
 
+fn parse_descriptor(descriptor: &str) -> Descriptor<DescriptorPublicKey> {
+    let secp = bdk_chain::bitcoin::secp256k1::Secp256k1::signing_only();
+    Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, descriptor)
+        .unwrap()
+        .0
+}
+
 fn init_txout_index(
+    external_descriptor: Descriptor<DescriptorPublicKey>,
+    internal_descriptor: Descriptor<DescriptorPublicKey>,
     lookahead: u32,
-) -> (
-    bdk_chain::keychain::KeychainTxOutIndex<TestKeychain>,
-    Descriptor<DescriptorPublicKey>,
-    Descriptor<DescriptorPublicKey>,
-) {
+) -> bdk_chain::keychain::KeychainTxOutIndex<TestKeychain> {
     let mut txout_index = bdk_chain::keychain::KeychainTxOutIndex::<TestKeychain>::new(lookahead);
-    let secp = bdk_chain::bitcoin::secp256k1::Secp256k1::signing_only();
-    let (external_descriptor,_) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/0/*)").unwrap();
-    let (internal_descriptor,_) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "tr([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/1/*)").unwrap();
-    txout_index.add_keychain(TestKeychain::External, external_descriptor.clone());
-    txout_index.add_keychain(TestKeychain::Internal, internal_descriptor.clone());
-    (txout_index, external_descriptor, internal_descriptor)
+    let _ = txout_index.insert_descriptor(TestKeychain::External, external_descriptor);
+    let _ = txout_index.insert_descriptor(TestKeychain::Internal, internal_descriptor);
+    txout_index
 }
 fn spk_at_index(descriptor: &Descriptor<DescriptorPublicKey>, index: u32) -> ScriptBuf {
@@ -44,29 +47,136 @@ fn spk_at_index(descriptor: &Descriptor<DescriptorPublicKey>, index: u32) -> Scr
         .script_pubkey()
 }
 
+// We create two empty changesets lhs and rhs, then insert various descriptors with various
+// last_revealed, append rhs to lhs, and check that the result is consistent with these rules:
+// - Existing index doesn't update if the new index in `other` is lower than `self`.
+// - Existing index updates if the new index in `other` is higher than `self`.
+// - Existing index is unchanged if keychain doesn't exist in `other`.
+// - New keychain gets added if the keychain is in `other` but not in `self`.
+#[test]
+fn append_changesets_check_last_revealed() {
+    let secp = bitcoin::secp256k1::Secp256k1::signing_only();
+    let descriptor_ids: Vec<_> = DESCRIPTORS
+        .iter()
+        .take(4)
+        .map(|d| {
+            Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, d)
+                .unwrap()
+                .0
+                .descriptor_id()
+        })
+        .collect();
+
+    let mut lhs_di = BTreeMap::<DescriptorId, u32>::default();
+    let mut rhs_di = BTreeMap::<DescriptorId, u32>::default();
+    lhs_di.insert(descriptor_ids[0], 7);
+    lhs_di.insert(descriptor_ids[1], 0);
+    lhs_di.insert(descriptor_ids[2], 3);
+
+    rhs_di.insert(descriptor_ids[0], 3); // value less than lhs desc 0
+    rhs_di.insert(descriptor_ids[1], 5); // value more than lhs desc 1
+    rhs_di.insert(descriptor_ids[3], 4); // key doesn't exist in lhs
+
+    let mut lhs = ChangeSet {
+        keychains_added: BTreeMap::<(), _>::new(),
+        last_revealed: lhs_di,
+    };
+    let rhs = ChangeSet {
+        keychains_added: BTreeMap::<(), _>::new(),
+        last_revealed: rhs_di,
+    };
+    lhs.append(rhs);
+
+    // Existing index doesn't update if the new index in `other` is lower than `self`.
+    assert_eq!(lhs.last_revealed.get(&descriptor_ids[0]), Some(&7));
+    // Existing index updates if the new index in `other` is higher than `self`.
+    assert_eq!(lhs.last_revealed.get(&descriptor_ids[1]), Some(&5));
+    // Existing index is unchanged if keychain doesn't exist in `other`.
+    assert_eq!(lhs.last_revealed.get(&descriptor_ids[2]), Some(&3));
+    // New keychain gets added if the keychain is in `other` but not in `self`.
+    assert_eq!(lhs.last_revealed.get(&descriptor_ids[3]), Some(&4));
+}
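The merge rules exercised by the test above can be sketched without any bdk types. This is a hypothetical stand-alone model of the `last_revealed` half of `keychain::ChangeSet::append` (descriptor ids simplified to `u8`), not the crate's actual implementation:

```rust
use std::collections::BTreeMap;

// For each descriptor id, keep the highest revealed index seen on
// either side; ids present on only one side are carried over as-is.
fn append_last_revealed(lhs: &mut BTreeMap<u8, u32>, rhs: BTreeMap<u8, u32>) {
    for (id, idx) in rhs {
        let entry = lhs.entry(id).or_insert(idx);
        if *entry < idx {
            *entry = idx;
        }
    }
}

fn main() {
    let mut lhs = BTreeMap::from([(0, 7), (1, 0), (2, 3)]);
    let rhs = BTreeMap::from([(0, 3), (1, 5), (3, 4)]);
    append_last_revealed(&mut lhs, rhs);
    assert_eq!(lhs.get(&0), Some(&7)); // rhs index lower: unchanged
    assert_eq!(lhs.get(&1), Some(&5)); // rhs index higher: updated
    assert_eq!(lhs.get(&2), Some(&3)); // absent in rhs: unchanged
    assert_eq!(lhs.get(&3), Some(&4)); // new id: added
    println!("ok");
}
```

Revealed indices only ever move forward, which is why the merge is a per-key maximum rather than a plain overwrite.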
+#[test]
+fn test_apply_changeset_with_different_descriptors_to_same_keychain() {
+    let external_descriptor = parse_descriptor(DESCRIPTORS[0]);
+    let internal_descriptor = parse_descriptor(DESCRIPTORS[1]);
+    let mut txout_index =
+        init_txout_index(external_descriptor.clone(), internal_descriptor.clone(), 0);
+    assert_eq!(
+        txout_index.keychains().collect::<Vec<_>>(),
+        vec![
+            (&TestKeychain::External, &external_descriptor),
+            (&TestKeychain::Internal, &internal_descriptor)
+        ]
+    );
+
+    let changeset = ChangeSet {
+        keychains_added: [(TestKeychain::External, internal_descriptor.clone())].into(),
+        last_revealed: [].into(),
+    };
+    txout_index.apply_changeset(changeset);
+    assert_eq!(
+        txout_index.keychains().collect::<Vec<_>>(),
+        vec![
+            (&TestKeychain::External, &internal_descriptor),
+            (&TestKeychain::Internal, &internal_descriptor)
+        ]
+    );
+
+    let changeset = ChangeSet {
+        keychains_added: [(TestKeychain::Internal, external_descriptor.clone())].into(),
+        last_revealed: [].into(),
+    };
+    txout_index.apply_changeset(changeset);
+    assert_eq!(
+        txout_index.keychains().collect::<Vec<_>>(),
+        vec![
+            (&TestKeychain::External, &internal_descriptor),
+            (&TestKeychain::Internal, &external_descriptor)
+        ]
+    );
+}
 #[test]
 fn test_set_all_derivation_indices() {
     use bdk_chain::indexed_tx_graph::Indexer;
-    let (mut txout_index, _, _) = init_txout_index(0);
+    let external_descriptor = parse_descriptor(DESCRIPTORS[0]);
+    let internal_descriptor = parse_descriptor(DESCRIPTORS[1]);
+    let mut txout_index =
+        init_txout_index(external_descriptor.clone(), internal_descriptor.clone(), 0);
     let derive_to: BTreeMap<_, _> =
         [(TestKeychain::External, 12), (TestKeychain::Internal, 24)].into();
+    let last_revealed: BTreeMap<_, _> = [
+        (external_descriptor.descriptor_id(), 12),
+        (internal_descriptor.descriptor_id(), 24),
+    ]
+    .into();
     assert_eq!(
-        txout_index.reveal_to_target_multi(&derive_to).1.as_inner(),
-        &derive_to
+        txout_index.reveal_to_target_multi(&derive_to).1,
+        ChangeSet {
+            keychains_added: BTreeMap::new(),
+            last_revealed: last_revealed.clone()
+        }
     );
-    assert_eq!(txout_index.last_revealed_indices(), &derive_to);
+    assert_eq!(txout_index.last_revealed_indices(), derive_to);
     assert_eq!(
         txout_index.reveal_to_target_multi(&derive_to).1,
         keychain::ChangeSet::default(),
         "no changes if we set to the same thing"
     );
-    assert_eq!(txout_index.initial_changeset().as_inner(), &derive_to);
+    assert_eq!(txout_index.initial_changeset().last_revealed, last_revealed);
 }
 #[test]
 fn test_lookahead() {
-    let (mut txout_index, external_desc, internal_desc) = init_txout_index(10);
+    let external_descriptor = parse_descriptor(DESCRIPTORS[0]);
+    let internal_descriptor = parse_descriptor(DESCRIPTORS[1]);
+    let mut txout_index =
+        init_txout_index(external_descriptor.clone(), internal_descriptor.clone(), 10);
 
     // given:
     // - external lookahead set to 10
@@ -76,15 +186,16 @@ fn test_lookahead() {
     // - scripts cached in spk_txout_index should increase correctly
     // - stored scripts of external keychain should be of expected counts
     for index in (0..20).skip_while(|i| i % 2 == 1) {
-        let (revealed_spks, revealed_changeset) =
-            txout_index.reveal_to_target(&TestKeychain::External, index);
+        let (revealed_spks, revealed_changeset) = txout_index
+            .reveal_to_target(&TestKeychain::External, index)
+            .unwrap();
         assert_eq!(
             revealed_spks.collect::<Vec<_>>(),
-            vec![(index, spk_at_index(&external_desc, index))],
+            vec![(index, spk_at_index(&external_descriptor, index))],
         );
         assert_eq!(
-            revealed_changeset.as_inner(),
-            &[(TestKeychain::External, index)].into()
+            &revealed_changeset.last_revealed,
+            &[(external_descriptor.descriptor_id(), index)].into()
         );
 
         assert_eq!(
@@ -126,17 +237,18 @@ fn test_lookahead() {
     // - derivation index is set ahead of current derivation index + lookahead
     // expect:
     // - scripts cached in spk_txout_index should increase correctly, a.k.a. no scripts are skipped
-    let (revealed_spks, revealed_changeset) =
-        txout_index.reveal_to_target(&TestKeychain::Internal, 24);
+    let (revealed_spks, revealed_changeset) = txout_index
+        .reveal_to_target(&TestKeychain::Internal, 24)
+        .unwrap();
     assert_eq!(
         revealed_spks.collect::<Vec<_>>(),
         (0..=24)
-            .map(|index| (index, spk_at_index(&internal_desc, index)))
+            .map(|index| (index, spk_at_index(&internal_descriptor, index)))
             .collect::<Vec<_>>(),
     );
     assert_eq!(
-        revealed_changeset.as_inner(),
-        &[(TestKeychain::Internal, 24)].into()
+        &revealed_changeset.last_revealed,
+        &[(internal_descriptor.descriptor_id(), 24)].into()
     );
     assert_eq!(
         txout_index.inner().all_spks().len(),
@@ -172,14 +284,14 @@ fn test_lookahead() {
     let tx = Transaction {
         output: vec![
             TxOut {
-                script_pubkey: external_desc
+                script_pubkey: external_descriptor
                     .at_derivation_index(external_index)
                     .unwrap()
                     .script_pubkey(),
                 value: Amount::from_sat(10_000),
             },
             TxOut {
-                script_pubkey: internal_desc
+                script_pubkey: internal_descriptor
                     .at_derivation_index(internal_index)
                     .unwrap()
                     .script_pubkey(),
// - last used index should change as expected // - last used index should change as expected
#[test] #[test]
fn test_scan_with_lookahead() { fn test_scan_with_lookahead() {
let (mut txout_index, external_desc, _) = init_txout_index(10); let external_descriptor = parse_descriptor(DESCRIPTORS[0]);
let internal_descriptor = parse_descriptor(DESCRIPTORS[1]);
let mut txout_index =
init_txout_index(external_descriptor.clone(), internal_descriptor.clone(), 10);
let spks: BTreeMap<u32, ScriptBuf> = [0, 10, 20, 30] let spks: BTreeMap<u32, ScriptBuf> = [0, 10, 20, 30]
.into_iter() .into_iter()
.map(|i| { .map(|i| {
( (
i, i,
external_desc external_descriptor
.at_derivation_index(i) .at_derivation_index(i)
.unwrap() .unwrap()
.script_pubkey(), .script_pubkey(),
@@ -243,8 +358,8 @@ fn test_scan_with_lookahead() {
         let changeset = txout_index.index_txout(op, &txout);
         assert_eq!(
-            changeset.as_inner(),
-            &[(TestKeychain::External, spk_i)].into()
+            &changeset.last_revealed,
+            &[(external_descriptor.descriptor_id(), spk_i)].into()
         );
         assert_eq!(
             txout_index.last_revealed_index(&TestKeychain::External),
@@ -257,7 +372,7 @@ fn test_scan_with_lookahead() {
     }
 
     // now try with index 41 (lookahead surpassed), we expect that the txout to not be indexed
-    let spk_41 = external_desc
+    let spk_41 = external_descriptor
         .at_derivation_index(41)
         .unwrap()
         .script_pubkey();
@@ -273,11 +388,13 @@ fn test_scan_with_lookahead() {
 #[test]
 #[rustfmt::skip]
 fn test_wildcard_derivations() {
-    let (mut txout_index, external_desc, _) = init_txout_index(0);
-    let external_spk_0 = external_desc.at_derivation_index(0).unwrap().script_pubkey();
-    let external_spk_16 = external_desc.at_derivation_index(16).unwrap().script_pubkey();
-    let external_spk_26 = external_desc.at_derivation_index(26).unwrap().script_pubkey();
-    let external_spk_27 = external_desc.at_derivation_index(27).unwrap().script_pubkey();
+    let external_descriptor = parse_descriptor(DESCRIPTORS[0]);
+    let internal_descriptor = parse_descriptor(DESCRIPTORS[1]);
+    let mut txout_index = init_txout_index(external_descriptor.clone(), internal_descriptor.clone(), 0);
+    let external_spk_0 = external_descriptor.at_derivation_index(0).unwrap().script_pubkey();
+    let external_spk_16 = external_descriptor.at_derivation_index(16).unwrap().script_pubkey();
+    let external_spk_26 = external_descriptor.at_derivation_index(26).unwrap().script_pubkey();
+    let external_spk_27 = external_descriptor.at_derivation_index(27).unwrap().script_pubkey();
 
     // - nothing is derived
     // - unused list is also empty
@@ -285,13 +402,13 @@ fn test_wildcard_derivations() {
     // - next_derivation_index() == (0, true)
     // - derive_new() == ((0, <spk>), keychain::ChangeSet)
     // - next_unused() == ((0, <spk>), keychain::ChangeSet:is_empty())
-    assert_eq!(txout_index.next_index(&TestKeychain::External), (0, true));
-    let (spk, changeset) = txout_index.reveal_next_spk(&TestKeychain::External);
+    assert_eq!(txout_index.next_index(&TestKeychain::External).unwrap(), (0, true));
+    let (spk, changeset) = txout_index.reveal_next_spk(&TestKeychain::External).unwrap();
     assert_eq!(spk, (0_u32, external_spk_0.as_script()));
-    assert_eq!(changeset.as_inner(), &[(TestKeychain::External, 0)].into());
+    assert_eq!(&changeset.last_revealed, &[(external_descriptor.descriptor_id(), 0)].into());
 
-    let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External);
+    let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External).unwrap();
     assert_eq!(spk, (0_u32, external_spk_0.as_script()));
-    assert_eq!(changeset.as_inner(), &[].into());
+    assert_eq!(&changeset.last_revealed, &[].into());
 
     // - derived till 25
     // - used all spks till 15.
@@ -307,16 +424,16 @@ fn test_wildcard_derivations() {
         .chain([17, 20, 23])
         .for_each(|index| assert!(txout_index.mark_used(TestKeychain::External, index)));
 
-    assert_eq!(txout_index.next_index(&TestKeychain::External), (26, true));
+    assert_eq!(txout_index.next_index(&TestKeychain::External).unwrap(), (26, true));
 
-    let (spk, changeset) = txout_index.reveal_next_spk(&TestKeychain::External);
+    let (spk, changeset) = txout_index.reveal_next_spk(&TestKeychain::External).unwrap();
     assert_eq!(spk, (26, external_spk_26.as_script()));
-    assert_eq!(changeset.as_inner(), &[(TestKeychain::External, 26)].into());
+    assert_eq!(&changeset.last_revealed, &[(external_descriptor.descriptor_id(), 26)].into());
 
-    let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External);
+    let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External).unwrap();
     assert_eq!(spk, (16, external_spk_16.as_script()));
-    assert_eq!(changeset.as_inner(), &[].into());
+    assert_eq!(&changeset.last_revealed, &[].into());
 
     // - Use all the derived till 26.
     // - next_unused() = ((27, <spk>), keychain::ChangeSet)
@@ -324,9 +441,9 @@ fn test_wildcard_derivations() {
         txout_index.mark_used(TestKeychain::External, index);
     });
 
-    let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External);
+    let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External).unwrap();
     assert_eq!(spk, (27, external_spk_27.as_script()));
-    assert_eq!(changeset.as_inner(), &[(TestKeychain::External, 27)].into());
+    assert_eq!(&changeset.last_revealed, &[(external_descriptor.descriptor_id(), 27)].into());
 }
 #[test]
@@ -334,13 +451,14 @@ fn test_non_wildcard_derivations() {
     let mut txout_index = KeychainTxOutIndex::<TestKeychain>::new(0);
 
     let secp = bitcoin::secp256k1::Secp256k1::signing_only();
-    let (no_wildcard_descriptor, _) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, "wpkh([73c5da0a/86'/0'/0']xprv9xgqHN7yz9MwCkxsBPN5qetuNdQSUttZNKw1dcYTV4mkaAFiBVGQziHs3NRSWMkCzvgjEe3n9xV8oYywvM8at9yRqyaZVz6TYYhX98VjsUk/1/0)").unwrap();
+    let (no_wildcard_descriptor, _) =
+        Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, DESCRIPTORS[6]).unwrap();
     let external_spk = no_wildcard_descriptor
         .at_derivation_index(0)
         .unwrap()
         .script_pubkey();
 
-    txout_index.add_keychain(TestKeychain::External, no_wildcard_descriptor);
+    let _ = txout_index.insert_descriptor(TestKeychain::External, no_wildcard_descriptor.clone());
 
     // given:
     // - `txout_index` with no stored scripts
@@ -348,14 +466,24 @@ fn test_non_wildcard_derivations() {
     // - next derivation index should be new
     // - when we derive a new script, script @ index 0
     // - when we get the next unused script, script @ index 0
-    assert_eq!(txout_index.next_index(&TestKeychain::External), (0, true));
-    let (spk, changeset) = txout_index.reveal_next_spk(&TestKeychain::External);
+    assert_eq!(
+        txout_index.next_index(&TestKeychain::External).unwrap(),
+        (0, true)
+    );
+    let (spk, changeset) = txout_index
+        .reveal_next_spk(&TestKeychain::External)
+        .unwrap();
     assert_eq!(spk, (0, external_spk.as_script()));
-    assert_eq!(changeset.as_inner(), &[(TestKeychain::External, 0)].into());
+    assert_eq!(
+        &changeset.last_revealed,
+        &[(no_wildcard_descriptor.descriptor_id(), 0)].into()
+    );
 
-    let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External);
+    let (spk, changeset) = txout_index
+        .next_unused_spk(&TestKeychain::External)
+        .unwrap();
     assert_eq!(spk, (0, external_spk.as_script()));
-    assert_eq!(changeset.as_inner(), &[].into());
+    assert_eq!(&changeset.last_revealed, &[].into());
 
     // given:
     // - the non-wildcard descriptor already has a stored and used script
// given: // given:
// - the non-wildcard descriptor already has a stored and used script // - the non-wildcard descriptor already has a stored and used script
@@ -363,18 +491,26 @@ fn test_non_wildcard_derivations() {
// - next derivation index should not be new // - next derivation index should not be new
// - derive new and next unused should return the old script // - derive new and next unused should return the old script
// - store_up_to should not panic and return empty changeset // - store_up_to should not panic and return empty changeset
assert_eq!(txout_index.next_index(&TestKeychain::External), (0, false)); assert_eq!(
txout_index.next_index(&TestKeychain::External).unwrap(),
(0, false)
);
txout_index.mark_used(TestKeychain::External, 0); txout_index.mark_used(TestKeychain::External, 0);
let (spk, changeset) = txout_index.reveal_next_spk(&TestKeychain::External); let (spk, changeset) = txout_index
.reveal_next_spk(&TestKeychain::External)
.unwrap();
assert_eq!(spk, (0, external_spk.as_script())); assert_eq!(spk, (0, external_spk.as_script()));
assert_eq!(changeset.as_inner(), &[].into()); assert_eq!(&changeset.last_revealed, &[].into());
let (spk, changeset) = txout_index.next_unused_spk(&TestKeychain::External); let (spk, changeset) = txout_index
.next_unused_spk(&TestKeychain::External)
.unwrap();
assert_eq!(spk, (0, external_spk.as_script())); assert_eq!(spk, (0, external_spk.as_script()));
assert_eq!(changeset.as_inner(), &[].into()); assert_eq!(&changeset.last_revealed, &[].into());
let (revealed_spks, revealed_changeset) = let (revealed_spks, revealed_changeset) = txout_index
txout_index.reveal_to_target(&TestKeychain::External, 200); .reveal_to_target(&TestKeychain::External, 200)
.unwrap();
assert_eq!(revealed_spks.count(), 0); assert_eq!(revealed_spks.count(), 0);
assert!(revealed_changeset.is_empty()); assert!(revealed_changeset.is_empty());
@@ -438,7 +574,13 @@ fn lookahead_to_target() {
    ];

    for t in test_cases {
        let external_descriptor = parse_descriptor(DESCRIPTORS[0]);
        let internal_descriptor = parse_descriptor(DESCRIPTORS[1]);
        let mut index = init_txout_index(
            external_descriptor.clone(),
            internal_descriptor.clone(),
            t.lookahead,
        );

        if let Some(last_revealed) = t.external_last_revealed {
            let _ = index.reveal_to_target(&TestKeychain::External, last_revealed);
@@ -449,17 +591,19 @@ fn lookahead_to_target() {
        let keychain_test_cases = [
            (
                external_descriptor.descriptor_id(),
                TestKeychain::External,
                t.external_last_revealed,
                t.external_target,
            ),
            (
                internal_descriptor.descriptor_id(),
                TestKeychain::Internal,
                t.internal_last_revealed,
                t.internal_target,
            ),
        ];
        for (descriptor_id, keychain, last_revealed, target) in keychain_test_cases {
            if let Some(target) = target {
                let original_last_stored_index = match last_revealed {
                    Some(last_revealed) => Some(last_revealed + t.lookahead),
@@ -475,10 +619,10 @@ fn lookahead_to_target() {
                let keys = index
                    .inner()
                    .all_spks()
                    .range((descriptor_id, 0)..=(descriptor_id, u32::MAX))
                    .map(|(k, _)| *k)
                    .collect::<Vec<_>>();
                let exp_keys = core::iter::repeat(descriptor_id)
                    .zip(0_u32..=exp_last_stored_index)
                    .collect::<Vec<_>>();
                assert_eq!(keys, exp_keys);
@@ -486,3 +630,150 @@ fn lookahead_to_target() {
        }
    }
}
/// `::index_txout` should still index txouts with spks derived from descriptors without keychains.
/// This includes properly refilling the lookahead for said descriptors.
#[test]
fn index_txout_after_changing_descriptor_under_keychain() {
    let secp = bdk_chain::bitcoin::secp256k1::Secp256k1::signing_only();
    let (desc_a, _) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, DESCRIPTORS[0])
        .expect("descriptor 0 must be valid");
    let (desc_b, _) = Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, DESCRIPTORS[1])
        .expect("descriptor 1 must be valid");
    let desc_id_a = desc_a.descriptor_id();

    let mut txout_index = bdk_chain::keychain::KeychainTxOutIndex::<()>::new(10);

    // Introduce `desc_a` under keychain `()` and replace the descriptor.
    let _ = txout_index.insert_descriptor((), desc_a.clone());
    let _ = txout_index.insert_descriptor((), desc_b.clone());

    // Loop through spks in intervals of `lookahead` to create outputs with. We should always be
    // able to index these outputs if `lookahead` is respected.
    let spk_indices = [9, 19, 29, 39];
    for i in spk_indices {
        let spk_at_index = desc_a
            .at_derivation_index(i)
            .expect("must derive")
            .script_pubkey();
        let index_changeset = txout_index.index_txout(
            // Use the spk derivation index as vout as we just want a unique outpoint.
            OutPoint::new(h!("mock_tx"), i as _),
            &TxOut {
                value: Amount::from_sat(10_000),
                script_pubkey: spk_at_index,
            },
        );
        assert_eq!(
            index_changeset,
            bdk_chain::keychain::ChangeSet {
                keychains_added: BTreeMap::default(),
                last_revealed: [(desc_id_a, i)].into(),
            },
            "must always increase last active if impl respects lookahead"
        );
    }
}
#[test]
fn insert_descriptor_no_change() {
    let secp = Secp256k1::signing_only();
    let (desc, _) =
        Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, DESCRIPTORS[0]).unwrap();
    let mut txout_index = KeychainTxOutIndex::<()>::default();
    assert_eq!(
        txout_index.insert_descriptor((), desc.clone()),
        keychain::ChangeSet {
            keychains_added: [((), desc.clone())].into(),
            last_revealed: Default::default()
        },
    );
    assert_eq!(
        txout_index.insert_descriptor((), desc.clone()),
        keychain::ChangeSet::default(),
        "inserting the same descriptor for keychain should return an empty changeset",
    );
}
#[test]
fn applying_changesets_one_by_one_vs_aggregate_must_have_same_result() {
    let desc = parse_descriptor(DESCRIPTORS[0]);
    let changesets: &[ChangeSet<TestKeychain>] = &[
        ChangeSet {
            keychains_added: [(TestKeychain::Internal, desc.clone())].into(),
            last_revealed: [].into(),
        },
        ChangeSet {
            keychains_added: [(TestKeychain::External, desc.clone())].into(),
            last_revealed: [(desc.descriptor_id(), 12)].into(),
        },
    ];

    let mut indexer_a = KeychainTxOutIndex::<TestKeychain>::new(0);
    for changeset in changesets {
        indexer_a.apply_changeset(changeset.clone());
    }

    let mut indexer_b = KeychainTxOutIndex::<TestKeychain>::new(0);
    let aggregate_changesets = changesets
        .iter()
        .cloned()
        .reduce(|mut agg, cs| {
            agg.append(cs);
            agg
        })
        .expect("must aggregate changesets");
    indexer_b.apply_changeset(aggregate_changesets);

    assert_eq!(
        indexer_a.keychains().collect::<Vec<_>>(),
        indexer_b.keychains().collect::<Vec<_>>()
    );
    assert_eq!(
        indexer_a.spk_at_index(TestKeychain::External, 0),
        indexer_b.spk_at_index(TestKeychain::External, 0)
    );
    assert_eq!(
        indexer_a.spk_at_index(TestKeychain::Internal, 0),
        indexer_b.spk_at_index(TestKeychain::Internal, 0)
    );
    assert_eq!(
        indexer_a.last_revealed_indices(),
        indexer_b.last_revealed_indices()
    );
}
// When the same descriptor is associated with various keychains,
// index methods only return the highest keychain by Ord
#[test]
fn test_only_highest_ord_keychain_is_returned() {
    let desc = parse_descriptor(DESCRIPTORS[0]);

    let mut indexer = KeychainTxOutIndex::<TestKeychain>::new(0);
    let _ = indexer.insert_descriptor(TestKeychain::Internal, desc.clone());
    let _ = indexer.insert_descriptor(TestKeychain::External, desc);

    // reveal_next_spk will work with either keychain
    let spk0: ScriptBuf = indexer
        .reveal_next_spk(&TestKeychain::External)
        .unwrap()
        .0
        .1
        .into();
    let spk1: ScriptBuf = indexer
        .reveal_next_spk(&TestKeychain::Internal)
        .unwrap()
        .0
        .1
        .into();

    // index_of_spk will always return External
    assert_eq!(
        indexer.index_of_spk(&spk0),
        Some((TestKeychain::External, 0))
    );
    assert_eq!(
        indexer.index_of_spk(&spk1),
        Some((TestKeychain::External, 1))
    );
}
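The `applying_changesets_one_by_one_vs_aggregate_must_have_same_result` test above hinges on changeset merging behaving the same whether changesets are applied incrementally or pre-aggregated with `append`. A minimal, self-contained sketch of that property, using a hypothetical `MiniChangeSet` in place of the real `keychain::ChangeSet` (the real merge semantics for `last_revealed` are assumed here to keep the maximum revealed index per descriptor):

```rust
use std::collections::BTreeMap;

// Hypothetical, simplified stand-in for `keychain::ChangeSet`: it only tracks
// the last revealed derivation index per descriptor id (here, a `&str`).
#[derive(Clone, Debug, Default, PartialEq)]
struct MiniChangeSet {
    last_revealed: BTreeMap<&'static str, u32>,
}

impl MiniChangeSet {
    // Merging keeps the *maximum* revealed index per descriptor, which makes
    // appending changesets associative for this field.
    fn append(&mut self, other: MiniChangeSet) {
        for (desc_id, idx) in other.last_revealed {
            let entry = self.last_revealed.entry(desc_id).or_insert(idx);
            *entry = (*entry).max(idx);
        }
    }
}

// Apply each changeset to fresh state, one at a time.
fn apply_one_by_one(changesets: &[MiniChangeSet]) -> MiniChangeSet {
    let mut state = MiniChangeSet::default();
    for cs in changesets {
        state.append(cs.clone());
    }
    state
}

// Fold all changesets into one aggregate first, then "apply" it.
fn apply_aggregated(changesets: &[MiniChangeSet]) -> MiniChangeSet {
    changesets
        .iter()
        .cloned()
        .reduce(|mut agg, cs| {
            agg.append(cs);
            agg
        })
        .unwrap_or_default()
}
```

Both paths must converge on the same state, which is exactly what the test asserts for the real indexer.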


@@ -1,3 +1,5 @@
#![cfg(feature = "miniscript")]
use std::ops::{Bound, RangeBounds};

use bdk_chain::{


@@ -1,5 +1,7 @@
use bdk_chain::{indexed_tx_graph::Indexer, SpkTxOutIndex};
use bitcoin::{
    absolute, transaction, Amount, OutPoint, ScriptBuf, SignedAmount, Transaction, TxIn, TxOut,
};
#[test]
fn spk_txout_sent_and_received() {
@@ -20,14 +22,23 @@ fn spk_txout_sent_and_received() {
        }],
    };

    assert_eq!(
        index.sent_and_received(&tx1, ..),
        (Amount::from_sat(0), Amount::from_sat(42_000))
    );
    assert_eq!(
        index.sent_and_received(&tx1, ..1),
        (Amount::from_sat(0), Amount::from_sat(42_000))
    );
    assert_eq!(
        index.sent_and_received(&tx1, 1..),
        (Amount::from_sat(0), Amount::from_sat(0))
    );
    assert_eq!(index.net_value(&tx1, ..), SignedAmount::from_sat(42_000));
    index.index_tx(&tx1);
    assert_eq!(
        index.sent_and_received(&tx1, ..),
        (Amount::from_sat(0), Amount::from_sat(42_000)),
        "shouldn't change after scanning"
    );
@@ -53,10 +64,19 @@ fn spk_txout_sent_and_received() {
        ],
    };

    assert_eq!(
        index.sent_and_received(&tx2, ..),
        (Amount::from_sat(42_000), Amount::from_sat(50_000))
    );
    assert_eq!(
        index.sent_and_received(&tx2, ..1),
        (Amount::from_sat(42_000), Amount::from_sat(30_000))
    );
    assert_eq!(
        index.sent_and_received(&tx2, 1..),
        (Amount::from_sat(0), Amount::from_sat(20_000))
    );
    assert_eq!(index.net_value(&tx2, ..), SignedAmount::from_sat(8_000));
}

#[test]


@@ -1,3 +1,5 @@
#![cfg(feature = "miniscript")]
#[macro_use]
mod common;
use bdk_chain::tx_graph::CalculateFeeError;


@@ -1,10 +1,12 @@
#![cfg(feature = "miniscript")]
#[macro_use]
mod common;
use std::collections::{BTreeSet, HashSet};

use bdk_chain::{keychain::Balance, BlockId};
use bitcoin::{Amount, OutPoint, Script};
use common::*;

#[allow(dead_code)]
@@ -79,10 +81,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("confirmed_genesis", 0), ("confirmed_conflict", 0)]),
            exp_unspents: HashSet::from([("confirmed_conflict", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::ZERO,
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::from_sat(20000),
            },
        },
        Scenario {
@@ -115,10 +117,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("tx1", 0), ("tx_conflict_2", 0)]),
            exp_unspents: HashSet::from([("tx_conflict_2", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(30000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -150,10 +152,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("tx1", 0), ("tx1", 1), ("tx_conflict_2", 0)]),
            exp_unspents: HashSet::from([("tx_conflict_2", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(30000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -192,10 +194,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("tx1", 0), ("tx_conflict_3", 0)]),
            exp_unspents: HashSet::from([("tx_conflict_3", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(40000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -227,10 +229,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("tx1", 0), ("tx_orphaned_conflict", 0)]),
            exp_unspents: HashSet::from([("tx_orphaned_conflict", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(30000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -262,10 +264,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("tx1", 0), ("tx_conflict_1", 0)]),
            exp_unspents: HashSet::from([("tx_conflict_1", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(20000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -311,10 +313,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("tx1", 0), ("tx_confirmed_conflict", 0)]),
            exp_unspents: HashSet::from([("tx_confirmed_conflict", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::ZERO,
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::from_sat(50000),
            },
        },
        Scenario {
@@ -356,10 +358,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("A", 0), ("B", 0), ("C", 0)]),
            exp_unspents: HashSet::from([("C", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(30000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -397,10 +399,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("A", 0), ("B'", 0)]),
            exp_unspents: HashSet::from([("B'", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::ZERO,
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::from_sat(20000),
            },
        },
        Scenario {
@@ -442,10 +444,10 @@ fn test_tx_conflict_handling() {
            ]),
            exp_unspents: HashSet::from([("C", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(30000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -487,10 +489,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("A", 0), ("B'", 0)]),
            exp_unspents: HashSet::from([("B'", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::from_sat(30000),
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::ZERO,
            },
        },
        Scenario {
@@ -532,10 +534,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("A", 0), ("B'", 0)]),
            exp_unspents: HashSet::from([("B'", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::ZERO,
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::from_sat(50000),
            },
        },
        Scenario {
@@ -583,10 +585,10 @@ fn test_tx_conflict_handling() {
            exp_chain_txouts: HashSet::from([("A", 0), ("B'", 0)]),
            exp_unspents: HashSet::from([("B'", 0)]),
            exp_balance: Balance {
                immature: Amount::ZERO,
                trusted_pending: Amount::ZERO,
                untrusted_pending: Amount::ZERO,
                confirmed: Amount::from_sat(50000),
            },
        },
    ];


@@ -1,6 +1,6 @@
[package]
name = "bdk_electrum"
version = "0.13.0"
edition = "2021"
homepage = "https://bitcoindevkit.org"
repository = "https://github.com/bitcoindevkit/bdk"
@@ -12,11 +12,9 @@ readme = "README.md"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
bdk_chain = { path = "../chain", version = "0.14.0" }
electrum-client = { version = "0.19" }
#rustls = { version = "=0.21.1", optional = true, features = ["dangerous_configuration"] }

[dev-dependencies]
bdk_testenv = { path = "../testenv", default-features = false }
electrsd = { version= "0.27.1", features = ["bitcoind_25_0", "esplora_a33e97e1", "legacy"] }
anyhow = "1"


@@ -1,164 +1,48 @@
use bdk_chain::{
    bitcoin::{OutPoint, ScriptBuf, Transaction, Txid},
    collections::{BTreeMap, HashMap, HashSet},
    local_chain::CheckPoint,
    spk_client::{FullScanRequest, FullScanResult, SyncRequest, SyncResult, TxCache},
    tx_graph::TxGraph,
    BlockId, ConfirmationHeightAnchor, ConfirmationTimeHeightAnchor,
};
use core::str::FromStr;
use electrum_client::{ElectrumApi, Error, HeaderNotification};
use std::sync::Arc;

/// We include a chain suffix of a certain length for the purpose of robustness.
const CHAIN_SUFFIX_LENGTH: u32 = 8;
/// Represents updates fetched from an Electrum server, but excludes full transactions.
///
/// To provide a complete update to [`TxGraph`], you'll need to call [`Self::missing_full_txs`] to
/// determine the full transactions missing from [`TxGraph`]. Then call [`Self::into_tx_graph`] to
/// fetch the full transactions from Electrum and finalize the update.
#[derive(Debug, Default, Clone)]
pub struct RelevantTxids(HashMap<Txid, BTreeSet<ConfirmationHeightAnchor>>);
impl RelevantTxids {
/// Determine the full transactions that are missing from `graph`.
///
/// Refer to [`RelevantTxids`] for more details.
pub fn missing_full_txs<A: Anchor>(&self, graph: &TxGraph<A>) -> Vec<Txid> {
self.0
.keys()
.filter(move |&&txid| graph.as_ref().get_tx(txid).is_none())
.cloned()
.collect()
}
/// Finalizes the [`TxGraph`] update by fetching `missing` txids from the `client`.
///
/// Refer to [`RelevantTxids`] for more details.
pub fn into_tx_graph(
self,
client: &Client,
missing: Vec<Txid>,
) -> Result<TxGraph<ConfirmationHeightAnchor>, Error> {
let new_txs = client.batch_transaction_get(&missing)?;
let mut graph = TxGraph::<ConfirmationHeightAnchor>::new(new_txs);
for (txid, anchors) in self.0 {
for anchor in anchors {
let _ = graph.insert_anchor(txid, anchor);
}
}
Ok(graph)
}
/// Finalizes the update by fetching `missing` txids from the `client`, where the
/// resulting [`TxGraph`] has anchors of type [`ConfirmationTimeHeightAnchor`].
///
/// Refer to [`RelevantTxids`] for more details.
///
/// **Note:** The confirmation time might not be precisely correct if there has been a reorg.
// Electrum's API intends that we use the merkle proof API, we should change `bdk_electrum` to
// use it.
pub fn into_confirmation_time_tx_graph(
self,
client: &Client,
missing: Vec<Txid>,
) -> Result<TxGraph<ConfirmationTimeHeightAnchor>, Error> {
let graph = self.into_tx_graph(client, missing)?;
let relevant_heights = {
let mut visited_heights = HashSet::new();
graph
.all_anchors()
.iter()
.map(|(a, _)| a.confirmation_height_upper_bound())
.filter(move |&h| visited_heights.insert(h))
.collect::<Vec<_>>()
};
let height_to_time = relevant_heights
.clone()
.into_iter()
.zip(
client
.batch_block_header(relevant_heights)?
.into_iter()
.map(|bh| bh.time as u64),
)
.collect::<HashMap<u32, u64>>();
let graph_changeset = {
let old_changeset = TxGraph::default().apply_update(graph);
tx_graph::ChangeSet {
txs: old_changeset.txs,
txouts: old_changeset.txouts,
last_seen: old_changeset.last_seen,
anchors: old_changeset
.anchors
.into_iter()
.map(|(height_anchor, txid)| {
let confirmation_height = height_anchor.confirmation_height;
let confirmation_time = height_to_time[&confirmation_height];
let time_anchor = ConfirmationTimeHeightAnchor {
anchor_block: height_anchor.anchor_block,
confirmation_height,
confirmation_time,
};
(time_anchor, txid)
})
.collect(),
}
};
let mut new_graph = TxGraph::default();
new_graph.apply_changeset(graph_changeset);
Ok(new_graph)
}
}
/// Combination of chain and transactions updates from electrum
///
/// We have to update the chain and the txids at the same time since we anchor the txids to
/// the same chain tip that we check before and after we gather the txids.
#[derive(Debug)]
pub struct ElectrumUpdate {
/// Chain update
pub chain_update: CheckPoint,
/// Transaction updates from electrum
pub relevant_txids: RelevantTxids,
}
/// Trait to extend [`electrum_client::Client`] functionality.
pub trait ElectrumExt {
    /// Full scan the keychain scripts specified with the blockchain (via an Electrum client) and
    /// returns updates for [`bdk_chain`] data structures.
    ///
    /// - `request`: struct with data required to perform a spk-based blockchain client full scan,
    ///   see [`FullScanRequest`]
    /// - `stop_gap`: the full scan for each keychain stops after a gap of script pubkeys with no
    ///   associated transactions
    /// - `batch_size`: specifies the max number of script pubkeys to request for in a single batch
    ///   request
    /// - `fetch_prev_txouts`: specifies whether or not we want previous `TxOut`s for fee
    ///   calculation
    fn full_scan<K: Ord + Clone>(
        &self,
        request: FullScanRequest<K>,
        stop_gap: usize,
        batch_size: usize,
        fetch_prev_txouts: bool,
    ) -> Result<ElectrumFullScanResult<K>, Error>;
    /// Sync a set of scripts with the blockchain (via an Electrum client) for the data specified
    /// and returns updates for [`bdk_chain`] data structures.
    ///
    /// - `request`: struct with data required to perform a spk-based blockchain client sync,
    ///   see [`SyncRequest`]
    /// - `batch_size`: specifies the max number of script pubkeys to request for in a single batch
    ///   request
    /// - `fetch_prev_txouts`: specifies whether or not we want previous `TxOut`s for fee
    ///   calculation
    ///
    /// If the scripts to sync are unknown, such as when restoring or importing a keychain that
    /// may include scripts that have been used, use [`full_scan`] with the keychain.
@@ -166,31 +50,33 @@ pub trait ElectrumExt {
    /// [`full_scan`]: ElectrumExt::full_scan
    fn sync(
        &self,
        request: SyncRequest,
        batch_size: usize,
        fetch_prev_txouts: bool,
    ) -> Result<ElectrumSyncResult, Error>;
}
impl<E: ElectrumApi> ElectrumExt for E {
    fn full_scan<K: Ord + Clone>(
        &self,
        mut request: FullScanRequest<K>,
        stop_gap: usize,
        batch_size: usize,
        fetch_prev_txouts: bool,
    ) -> Result<ElectrumFullScanResult<K>, Error> {
        let mut request_spks = request.spks_by_keychain;

        // We keep track of already-scanned spks just in case a reorg happens and we need to do a
        // rescan. We need to keep track of this as iterators in `keychain_spks` are "unbounded" so
        // cannot be collected. In addition, we keep track of whether an spk has an active tx
        // history for determining the `last_active_index`.
        // * key: (keychain, spk_index) that identifies the spk.
        // * val: (script_pubkey, has_tx_history).
        let mut scanned_spks = BTreeMap::<(K, u32), (ScriptBuf, bool)>::new();

        let update = loop {
            let (tip, _) = construct_update_tip(self, request.chain_tip.clone())?;
            let mut graph_update = TxGraph::<ConfirmationHeightAnchor>::default();
            let cps = tip
                .iter()
                .take(10)
@@ -202,7 +88,8 @@ impl<A: ElectrumApi> ElectrumExt for A {
                scanned_spks.append(&mut populate_with_spks(
                    self,
                    &cps,
                    &mut request.tx_cache,
                    &mut graph_update,
                    &mut scanned_spks
                        .iter()
                        .map(|(i, (spk, _))| (i.clone(), spk.clone())),
@@ -215,7 +102,8 @@ impl<A: ElectrumApi> ElectrumExt for A {
                populate_with_spks(
                    self,
                    &cps,
                    &mut request.tx_cache,
                    &mut graph_update,
                    keychain_spks,
                    stop_gap,
                    batch_size,
@@ -232,6 +120,11 @@ impl<A: ElectrumApi> ElectrumExt for A {
                continue; // reorg
            }

            // Fetch previous `TxOut`s for fee calculation if flag is enabled.
            if fetch_prev_txouts {
                fetch_prev_txout(self, &mut request.tx_cache, &mut graph_update)?;
            }

            let chain_update = tip;
            let keychain_update = request_spks
@@ -245,54 +138,148 @@ impl<A: ElectrumApi> ElectrumExt for A {
                })
                .collect::<BTreeMap<_, _>>();

            break FullScanResult {
                graph_update,
                chain_update,
                last_active_indices: keychain_update,
            };
        };

        Ok(ElectrumFullScanResult(update))
    }
    fn sync(
        &self,
        request: SyncRequest,
        batch_size: usize,
        fetch_prev_txouts: bool,
    ) -> Result<ElectrumSyncResult, Error> {
        let mut tx_cache = request.tx_cache.clone();

        let full_scan_req = FullScanRequest::from_chain_tip(request.chain_tip.clone())
            .cache_txs(request.tx_cache)
            .set_spks_for_keychain((), request.spks.enumerate().map(|(i, spk)| (i as u32, spk)));
        let mut full_scan_res = self
            .full_scan(full_scan_req, usize::MAX, batch_size, false)?
            .with_confirmation_height_anchor();

        let (tip, _) = construct_update_tip(self, request.chain_tip)?;
        let cps = tip
            .iter()
            .take(10)
            .map(|cp| (cp.height(), cp))
            .collect::<BTreeMap<u32, CheckPoint>>();
        populate_with_txids(
            self,
            &cps,
            &mut tx_cache,
            &mut full_scan_res.graph_update,
            request.txids,
        )?;
        populate_with_outpoints(
            self,
            &cps,
            &mut tx_cache,
            &mut full_scan_res.graph_update,
            request.outpoints,
        )?;
        // Fetch previous `TxOut`s for fee calculation if flag is enabled.
        if fetch_prev_txouts {
            fetch_prev_txout(self, &mut tx_cache, &mut full_scan_res.graph_update)?;
        }

        Ok(ElectrumSyncResult(SyncResult {
            chain_update: full_scan_res.chain_update,
            graph_update: full_scan_res.graph_update,
        }))
    }
}
/// The result of [`ElectrumExt::full_scan`].
///
/// This can be transformed into a [`FullScanResult`] with either [`ConfirmationHeightAnchor`] or
/// [`ConfirmationTimeHeightAnchor`] anchor types.
pub struct ElectrumFullScanResult<K>(FullScanResult<K, ConfirmationHeightAnchor>);
impl<K> ElectrumFullScanResult<K> {
/// Return [`FullScanResult`] with [`ConfirmationHeightAnchor`].
pub fn with_confirmation_height_anchor(self) -> FullScanResult<K, ConfirmationHeightAnchor> {
self.0
}
/// Return [`FullScanResult`] with [`ConfirmationTimeHeightAnchor`].
///
/// This requires additional calls to the Electrum server.
pub fn with_confirmation_time_height_anchor(
self,
client: &impl ElectrumApi,
) -> Result<FullScanResult<K, ConfirmationTimeHeightAnchor>, Error> {
let res = self.0;
Ok(FullScanResult {
graph_update: try_into_confirmation_time_result(res.graph_update, client)?,
chain_update: res.chain_update,
last_active_indices: res.last_active_indices,
})
}
}
/// The result of [`ElectrumExt::sync`].
///
/// This can be transformed into a [`SyncResult`] with either [`ConfirmationHeightAnchor`] or
/// [`ConfirmationTimeHeightAnchor`] anchor types.
pub struct ElectrumSyncResult(SyncResult<ConfirmationHeightAnchor>);

impl ElectrumSyncResult {
    /// Return [`SyncResult`] with [`ConfirmationHeightAnchor`].
    pub fn with_confirmation_height_anchor(self) -> SyncResult<ConfirmationHeightAnchor> {
        self.0
    }

    /// Return [`SyncResult`] with [`ConfirmationTimeHeightAnchor`].
    ///
    /// This requires additional calls to the Electrum server.
    pub fn with_confirmation_time_height_anchor(
        self,
        client: &impl ElectrumApi,
    ) -> Result<SyncResult<ConfirmationTimeHeightAnchor>, Error> {
        let res = self.0;
        Ok(SyncResult {
            graph_update: try_into_confirmation_time_result(res.graph_update, client)?,
            chain_update: res.chain_update,
        })
    }
}
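Both wrapper types above follow the same pattern: hold the cheap height-only result and defer the network-touching upgrade until the caller explicitly asks for it. A toy sketch of that deferred-conversion newtype (all names hypothetical; plain `u32` heights stand in for anchors, and `lookup` stands in for the extra server round trips):

```rust
// Holds height-only data; conversion to times is deferred and explicit,
// mirroring `ElectrumSyncResult::with_confirmation_time_height_anchor`.
struct HeightResult(Vec<u32>);

impl HeightResult {
    // Free path: just unwrap the newtype.
    fn with_heights(self) -> Vec<u32> {
        self.0
    }

    // Costly path: `lookup` stands in for additional server calls.
    fn with_times(self, lookup: impl Fn(u32) -> u64) -> Vec<u64> {
        self.0.into_iter().map(lookup).collect()
    }
}
```

The caller pays for the anchor upgrade only when it is actually needed, which is the same design choice these result types make.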
fn try_into_confirmation_time_result(
    graph_update: TxGraph<ConfirmationHeightAnchor>,
    client: &impl ElectrumApi,
) -> Result<TxGraph<ConfirmationTimeHeightAnchor>, Error> {
    let relevant_heights = graph_update
        .all_anchors()
        .iter()
        .map(|(a, _)| a.confirmation_height)
        .collect::<HashSet<_>>();

    let height_to_time = relevant_heights
        .clone()
        .into_iter()
        .zip(
            client
                .batch_block_header(relevant_heights)?
                .into_iter()
                .map(|bh| bh.time as u64),
        )
        .collect::<HashMap<u32, u64>>();

    Ok(graph_update.map_anchors(|a| ConfirmationTimeHeightAnchor {
        anchor_block: a.anchor_block,
        confirmation_height: a.confirmation_height,
        confirmation_time: height_to_time[&a.confirmation_height],
    }))
}
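The conversion above hinges on the batched header call returning one header per requested height, in request order, so that heights and times can simply be zipped. A minimal sketch of that lookup, with `batch_block_times` as a hypothetical stand-in for the Electrum round trip:

```rust
use std::collections::{HashMap, HashSet};

// Hypothetical stand-in for `ElectrumApi::batch_block_header`: returns one
// block time per requested height, in request order.
fn batch_block_times(heights: &[u32]) -> Vec<u64> {
    heights
        .iter()
        .map(|h| 1_500_000_000 + u64::from(*h) * 600)
        .collect()
}

// Build the height -> time lookup the same way as above: dedupe the anchor
// heights, fetch the headers in one batch, then zip heights with times.
fn height_to_time(anchor_heights: impl IntoIterator<Item = u32>) -> HashMap<u32, u64> {
    let relevant_heights: HashSet<u32> = anchor_heights.into_iter().collect();
    let heights: Vec<u32> = relevant_heights.into_iter().collect();
    let times = batch_block_times(&heights);
    heights.into_iter().zip(times).collect()
}
```

Deduplicating first means each block header is fetched at most once, even when many anchors share a confirmation height.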
/// Return a [`CheckPoint`] of the latest tip, that connects with `prev_tip`.
fn construct_update_tip(
    client: &impl ElectrumApi,
@@ -408,48 +395,48 @@ fn determine_tx_anchor(
    }
}
/// Populate the `graph_update` with associated transactions/anchors of `outpoints`.
///
/// Transactions in which the outpoint resides, and transactions that spend from the outpoint are
/// included. Anchors of the aforementioned transactions are included.
///
/// Checkpoints (in `cps`) are used to create anchors. The `tx_cache` is self-explanatory.
fn populate_with_outpoints(
    client: &impl ElectrumApi,
    cps: &BTreeMap<u32, CheckPoint>,
    tx_cache: &mut TxCache,
    graph_update: &mut TxGraph<ConfirmationHeightAnchor>,
    outpoints: impl IntoIterator<Item = OutPoint>,
) -> Result<(), Error> {
    for outpoint in outpoints {
        let op_txid = outpoint.txid;
        let op_tx = fetch_tx(client, tx_cache, op_txid)?;
        let op_txout = match op_tx.output.get(outpoint.vout as usize) {
            Some(txout) => txout,
            None => continue,
        };
        debug_assert_eq!(op_tx.txid(), op_txid);

        // attempt to find the following transactions (alongside their chain positions), and
        // add to our sparsechain `update`:
        let mut has_residing = false; // tx in which the outpoint resides
        let mut has_spending = false; // tx that spends the outpoint
        for res in client.script_get_history(&op_txout.script_pubkey)? {
            if has_residing && has_spending {
                break;
            }

            if !has_residing && res.tx_hash == op_txid {
                has_residing = true;
                let _ = graph_update.insert_tx(Arc::clone(&op_tx));
                if let Some(anchor) = determine_tx_anchor(cps, res.height, res.tx_hash) {
                    let _ = graph_update.insert_anchor(res.tx_hash, anchor);
                }
            }

            if !has_spending && res.tx_hash != op_txid {
                let res_tx = fetch_tx(client, tx_cache, res.tx_hash)?;
                // we exclude txs/anchors that do not spend our specified outpoint(s)
                has_spending = res_tx
                    .input
                    .iter()
@@ -457,26 +444,26 @@ fn populate_with_outpoints(
                if !has_spending {
                    continue;
                }
                let _ = graph_update.insert_tx(Arc::clone(&res_tx));
                if let Some(anchor) = determine_tx_anchor(cps, res.height, res.tx_hash) {
                    let _ = graph_update.insert_anchor(res.tx_hash, anchor);
                }
            }
        }
    }
    Ok(())
}
/// Populate the `graph_update` with transactions/anchors of the provided `txids`.
fn populate_with_txids(
    client: &impl ElectrumApi,
    cps: &BTreeMap<u32, CheckPoint>,
    tx_cache: &mut TxCache,
    graph_update: &mut TxGraph<ConfirmationHeightAnchor>,
    txids: impl IntoIterator<Item = Txid>,
) -> Result<(), Error> {
    for txid in txids {
        let tx = match fetch_tx(client, tx_cache, txid) {
            Ok(tx) => tx,
            Err(electrum_client::Error::Protocol(_)) => continue,
            Err(other_err) => return Err(other_err),
@@ -488,6 +475,8 @@ fn populate_with_txids(
            .map(|txo| &txo.script_pubkey)
            .expect("tx must have an output");

        // because of restrictions of the Electrum API, we have to use the `script_get_history`
        // call to get confirmation status of our transaction
        let anchor = match client
            .script_get_history(spk)?
            .into_iter()
@@ -497,18 +486,64 @@ fn populate_with_txids(
            None => continue,
        };

        let _ = graph_update.insert_tx(tx);
        if let Some(anchor) = anchor {
            let _ = graph_update.insert_anchor(txid, anchor);
        }
    }
    Ok(())
}
/// Fetch transaction of given `txid`.
///
/// We maintain a `tx_cache` so that we won't need to fetch from Electrum with every call.
fn fetch_tx<C: ElectrumApi>(
    client: &C,
    tx_cache: &mut TxCache,
    txid: Txid,
) -> Result<Arc<Transaction>, Error> {
    use bdk_chain::collections::hash_map::Entry;
    Ok(match tx_cache.entry(txid) {
        Entry::Occupied(entry) => entry.get().clone(),
        Entry::Vacant(entry) => entry
            .insert(Arc::new(client.transaction_get(&txid)?))
            .clone(),
    })
}
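The cache above uses the standard `Entry` API so the network fetch runs only on a miss, and hands out `Arc` clones so repeated callers share one allocation. A self-contained sketch of the same pattern (hypothetical `u32` ids and `String` payloads in place of txids and transactions; `fetch` stands in for `transaction_get`):

```rust
use std::collections::hash_map::{Entry, HashMap};
use std::sync::Arc;

// Fetch-through cache: `fetch` is invoked only when `id` is not yet cached.
fn fetch_with_cache(
    cache: &mut HashMap<u32, Arc<String>>,
    id: u32,
    fetch: impl FnOnce(u32) -> String,
) -> Arc<String> {
    match cache.entry(id) {
        // Cache hit: clone the Arc (a cheap refcount bump), no fetch.
        Entry::Occupied(entry) => Arc::clone(entry.get()),
        // Cache miss: fetch once, store behind an Arc, hand out a clone.
        Entry::Vacant(entry) => Arc::clone(entry.insert(Arc::new(fetch(id)))),
    }
}
```

Returning `Arc<T>` rather than `&T` lets callers keep the value past further mutations of the cache.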
// Helper function which fetches the `TxOut`s of our relevant transactions' previous transactions,
// which we do not have by default. This data is needed to calculate the transaction fee.
fn fetch_prev_txout<C: ElectrumApi>(
    client: &C,
    tx_cache: &mut TxCache,
    graph_update: &mut TxGraph<ConfirmationHeightAnchor>,
) -> Result<(), Error> {
    let full_txs: Vec<Arc<Transaction>> =
        graph_update.full_txs().map(|tx_node| tx_node.tx).collect();
    for tx in full_txs {
        for vin in &tx.input {
            let outpoint = vin.previous_output;
            let prev_tx = fetch_tx(client, tx_cache, outpoint.txid)?;
            for txout in prev_tx.output.clone() {
                let _ = graph_update.insert_txout(outpoint, txout);
            }
        }
    }
    Ok(())
}
/// Populate the `graph_update` with transactions/anchors associated with the given `spks`.
///
/// Transactions that contain an output with a requested spk, or spend from an output with a
/// requested spk, will be added to `graph_update`. Anchors of the aforementioned transactions are
/// also included.
///
/// Checkpoints (in `cps`) are used to create anchors. The `tx_cache` is self-explanatory.
fn populate_with_spks<I: Ord + Clone>(
    client: &impl ElectrumApi,
    cps: &BTreeMap<u32, CheckPoint>,
    tx_cache: &mut TxCache,
    graph_update: &mut TxGraph<ConfirmationHeightAnchor>,
    spks: &mut impl Iterator<Item = (I, ScriptBuf)>,
    stop_gap: usize,
    batch_size: usize,
@@ -540,10 +575,10 @@ fn populate_with_spks<I: Ord + Clone>(
            unused_spk_count = 0;
        }

        for tx_res in spk_history {
            let _ = graph_update.insert_tx(fetch_tx(client, tx_cache, tx_res.tx_hash)?);
            if let Some(anchor) = determine_tx_anchor(cps, tx_res.height, tx_res.tx_hash) {
                let _ = graph_update.insert_anchor(tx_res.tx_hash, anchor);
            }
        }
    }
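The hunk header above elides the stop-gap bookkeeping this function also performs: scanning stops once `stop_gap` consecutive script pubkeys with no history have been seen (and, per the trait docs elsewhere in this diff, a `stop_gap` of 0 is treated as 1). A self-contained sketch of that termination rule, with each spk's history flattened to a single `bool`:

```rust
// Returns the highest spk index that had history, scanning until `stop_gap`
// consecutive unused spks are seen. `history[i]` is true if spk `i` has txs.
fn scan_with_stop_gap(history: &[bool], stop_gap: usize) -> Option<usize> {
    let stop_gap = stop_gap.max(1); // a stop_gap of 0 is treated as 1
    let mut last_active = None;
    let mut unused_count = 0;
    for (i, used) in history.iter().enumerate() {
        if *used {
            last_active = Some(i);
            unused_count = 0; // any hit resets the gap counter
        } else {
            unused_count += 1;
            if unused_count >= stop_gap {
                break; // gap limit reached: stop deriving/scanning
            }
        }
    }
    last_active
}
```

Resetting the counter on every hit is what makes the gap "consecutive": used spks interleaved with unused ones keep the scan alive.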


@@ -7,19 +7,10 @@
//! keychain where the range of possibly used scripts is not known. In this case it is necessary to
//! scan all keychain scripts until a number (the "stop gap") of unused scripts is discovered. For a
//! sync or full scan the user receives relevant blockchain data and output updates for
//! [`bdk_chain`].
//!
//! Refer to [`example_electrum`] for a complete example.
//!
//! [`example_electrum`]: https://github.com/bitcoindevkit/bdk/tree/master/example-crates/example_electrum

#![warn(missing_docs)]


@@ -1,18 +1,17 @@
use bdk_chain::{
    bitcoin::{hashes::Hash, Address, Amount, ScriptBuf, WScriptHash},
    keychain::Balance,
    local_chain::LocalChain,
    spk_client::SyncRequest,
    ConfirmationTimeHeightAnchor, IndexedTxGraph, SpkTxOutIndex,
};
use bdk_electrum::ElectrumExt;
use bdk_testenv::{anyhow, bitcoincore_rpc::RpcApi, TestEnv};
fn get_balance(
    recv_chain: &LocalChain,
    recv_graph: &IndexedTxGraph<ConfirmationTimeHeightAnchor, SpkTxOutIndex<()>>,
) -> anyhow::Result<Balance> {
    let chain_tip = recv_chain.tip().block_id();
    let outpoints = recv_graph.index.outpoints().clone();
    let balance = recv_graph
@@ -28,7 +27,7 @@ fn get_balance(
/// 3. Mine extra block to confirm sent tx.
/// 4. Check [`Balance`] to ensure tx is confirmed.
#[test]
fn scan_detects_confirmed_tx() -> anyhow::Result<()> {
    const SEND_AMOUNT: Amount = Amount::from_sat(10_000);

    let env = TestEnv::new()?;
@@ -62,27 +61,52 @@ fn scan_detects_confirmed_tx() -> Result<()> {
    // Sync up to tip.
    env.wait_until_electrum_sees_block()?;
    let update = client
        .sync(
            SyncRequest::from_chain_tip(recv_chain.tip())
                .chain_spks(core::iter::once(spk_to_track)),
            5,
            true,
        )?
        .with_confirmation_time_height_anchor(&client)?;

    let _ = recv_chain
        .apply_update(update.chain_update)
        .map_err(|err| anyhow::anyhow!("LocalChain update error: {:?}", err))?;
    let _ = recv_graph.apply_update(update.graph_update);

    // Check to see if tx is confirmed.
    assert_eq!(
        get_balance(&recv_chain, &recv_graph)?,
        Balance {
            confirmed: SEND_AMOUNT,
            ..Balance::default()
        },
    );

    for tx in recv_graph.graph().full_txs() {
        // Retrieve the calculated fee from `TxGraph`, which will panic if we do not have the
        // floating txouts available from the transaction's previous outputs.
        let fee = recv_graph
            .graph()
            .calculate_fee(&tx.tx)
            .expect("fee must exist");

        // Retrieve the fee in the transaction data from `bitcoind`.
        let tx_fee = env
            .bitcoind
            .client
            .get_transaction(&tx.txid, None)
            .expect("Tx must exist")
            .fee
            .expect("Fee must exist")
            .abs()
            .to_sat() as u64;

        // Check that the calculated fee matches the fee from the transaction data.
        assert_eq!(fee, tx_fee);
    }
    Ok(())
}
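The fee assertion in the test above relies on `TxGraph::calculate_fee`, which is conceptually just the sum of input values minus the sum of output values; that is why the previous `TxOut`s (which carry the input values) must be fetched first. A minimal sketch of the arithmetic, independent of bdk types:

```rust
// Fee = value consumed by inputs - value created by outputs. Returns `None`
// when outputs exceed inputs, i.e. some previous `TxOut` value is missing or
// inconsistent (a real transaction cannot create value).
fn calculate_fee(input_values: &[u64], output_values: &[u64]) -> Option<u64> {
    let spent: u64 = input_values.iter().sum();
    let created: u64 = output_values.iter().sum();
    spent.checked_sub(created)
}
```

`checked_sub` makes the "missing prevout" failure mode explicit instead of panicking on underflow.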
@@ -93,7 +117,7 @@ fn scan_detects_confirmed_tx() -> Result<()> {
/// 3. Perform 8 separate reorgs on each block with a confirmed tx.
/// 4. Check [`Balance`] after each reorg to ensure unconfirmed amount is correct.
#[test]
fn tx_can_become_unconfirmed_after_reorg() -> anyhow::Result<()> {
    const REORG_COUNT: usize = 8;
    const SEND_AMOUNT: Amount = Amount::from_sat(10_000);
@@ -128,26 +152,27 @@ fn tx_can_become_unconfirmed_after_reorg() -> Result<()> {
    // Sync up to tip.
    env.wait_until_electrum_sees_block()?;
    let update = client
        .sync(
            SyncRequest::from_chain_tip(recv_chain.tip()).chain_spks([spk_to_track.clone()]),
            5,
            false,
        )?
        .with_confirmation_time_height_anchor(&client)?;

    let _ = recv_chain
        .apply_update(update.chain_update)
        .map_err(|err| anyhow::anyhow!("LocalChain update error: {:?}", err))?;
    let _ = recv_graph.apply_update(update.graph_update.clone());

    // Retain a snapshot of all anchors before reorg process.
    let initial_anchors = update.graph_update.all_anchors();

    // Check if initial balance is correct.
    assert_eq!(
        get_balance(&recv_chain, &recv_graph)?,
        Balance {
            confirmed: SEND_AMOUNT * REORG_COUNT as u64,
            ..Balance::default()
        },
        "initial balance must be correct",
    );
@@ -158,28 +183,29 @@ fn tx_can_become_unconfirmed_after_reorg() -> Result<()> {
        env.reorg_empty_blocks(depth)?;

        env.wait_until_electrum_sees_block()?;
        let update = client
            .sync(
                SyncRequest::from_chain_tip(recv_chain.tip()).chain_spks([spk_to_track.clone()]),
                5,
                false,
            )?
            .with_confirmation_time_height_anchor(&client)?;

        let _ = recv_chain
            .apply_update(update.chain_update)
            .map_err(|err| anyhow::anyhow!("LocalChain update error: {:?}", err))?;

        // Check to see if a new anchor is added during current reorg.
        if !initial_anchors.is_superset(update.graph_update.all_anchors()) {
            println!("New anchor added at reorg depth {}", depth);
        }
        let _ = recv_graph.apply_update(update.graph_update);

        assert_eq!(
            get_balance(&recv_chain, &recv_graph)?,
            Balance {
                confirmed: SEND_AMOUNT * (REORG_COUNT - depth) as u64,
                trusted_pending: SEND_AMOUNT * depth as u64,
                ..Balance::default()
            },
            "reorg_count: {}",


@@ -1,6 +1,6 @@
[package]
name = "bdk_esplora"
version = "0.13.0"
edition = "2021"
homepage = "https://bitcoindevkit.org"
repository = "https://github.com/bitcoindevkit/bdk"
@@ -12,7 +12,7 @@ readme = "README.md"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
bdk_chain = { path = "../chain", version = "0.14.0", default-features = false }
esplora-client = { version = "0.7.0", default-features = false }
async-trait = { version = "0.1.66", optional = true }
futures = { version = "0.3.26", optional = true }
@@ -23,9 +23,7 @@ miniscript = { version = "11.0.0", optional = true, default-features = false }
[dev-dependencies]
bdk_testenv = { path = "../testenv", default_features = false }
tokio = { version = "1", features = ["rt", "rt-multi-thread", "macros"] }

[features]
default = ["std", "async-https", "blocking-https-rustls"]


@@ -28,8 +28,8 @@ pub trait EsploraAsyncExt {
    /// Scan keychain scripts for transactions against Esplora, returning an update that can be
    /// applied to the receiving structures.
    ///
    /// - `request`: struct with data required to perform a spk-based blockchain client full scan,
    ///   see [`FullScanRequest`]
    ///
    /// The full scan for each keychain stops after a gap of `stop_gap` script pubkeys with no
    /// associated transactions. `parallel_requests` specifies the max number of HTTP requests to
@@ -47,8 +47,6 @@ pub trait EsploraAsyncExt {
    /// and [Sparrow](https://www.sparrowwallet.com/docs/faq.html#ive-restored-my-wallet-but-some-of-my-funds-are-missing).
    ///
    /// A `stop_gap` of 0 will be treated as a `stop_gap` of 1.
    async fn full_scan<K: Ord + Clone + Send>(
        &self,
        request: FullScanRequest<K>,
@@ -59,16 +57,12 @@ pub trait EsploraAsyncExt {
    /// Sync a set of scripts with the blockchain (via an Esplora client) for the data
    /// specified and return a [`TxGraph`].
    ///
    /// - `request`: struct with data required to perform a spk-based blockchain client sync, see
    ///   [`SyncRequest`]
    ///
    /// If the scripts to sync are unknown, such as when restoring or importing a keychain that
    /// may include scripts that have been used, use [`full_scan`] with the keychain.
    ///
    /// [`full_scan`]: EsploraAsyncExt::full_scan
    async fn sync(
        &self,
@@ -417,8 +411,7 @@ mod test {
        local_chain::LocalChain,
        BlockId,
    };
    use bdk_testenv::{anyhow, bitcoincore_rpc::RpcApi, TestEnv};
    use esplora_client::Builder;

    use crate::async_ext::{chain_update, fetch_latest_blocks};


@@ -26,8 +26,8 @@ pub trait EsploraExt {
    /// Scan keychain scripts for transactions against Esplora, returning an update that can be
    /// applied to the receiving structures.
    ///
    /// - `request`: struct with data required to perform a spk-based blockchain client full scan,
    ///   see [`FullScanRequest`]
    ///
    /// The full scan for each keychain stops after a gap of `stop_gap` script pubkeys with no
    /// associated transactions. `parallel_requests` specifies the max number of HTTP requests to
@@ -45,8 +45,6 @@ pub trait EsploraExt {
    /// and [Sparrow](https://www.sparrowwallet.com/docs/faq.html#ive-restored-my-wallet-but-some-of-my-funds-are-missing).
    ///
    /// A `stop_gap` of 0 will be treated as a `stop_gap` of 1.
    fn full_scan<K: Ord + Clone>(
        &self,
        request: FullScanRequest<K>,
@@ -57,16 +55,12 @@ pub trait EsploraExt {
    /// Sync a set of scripts with the blockchain (via an Esplora client) for the data
    /// specified and return a [`TxGraph`].
    ///
    /// - `request`: struct with data required to perform a spk-based blockchain client sync, see
    ///   [`SyncRequest`]
    ///
    /// If the scripts to sync are unknown, such as when restoring or importing a keychain that
    /// may include scripts that have been used, use [`full_scan`] with the keychain.
    ///
    /// [`full_scan`]: EsploraExt::full_scan
    fn sync(&self, request: SyncRequest, parallel_requests: usize) -> Result<SyncResult, Error>;
}
@@ -407,8 +401,7 @@ mod test {
    use bdk_chain::bitcoin::Txid;
    use bdk_chain::local_chain::LocalChain;
    use bdk_chain::BlockId;
    use bdk_testenv::{anyhow, bitcoincore_rpc::RpcApi, TestEnv};
    use esplora_client::{BlockHash, Builder};
    use std::collections::{BTreeMap, BTreeSet};
    use std::time::Duration;


@@ -1,7 +1,5 @@
use bdk_chain::spk_client::{FullScanRequest, SyncRequest};
use bdk_esplora::EsploraAsyncExt;
use esplora_client::{self, Builder};
use std::collections::{BTreeSet, HashSet};
use std::str::FromStr;
@@ -9,7 +7,7 @@ use std::thread::sleep;
use std::time::Duration;

use bdk_chain::bitcoin::{Address, Amount, Txid};
use bdk_testenv::{anyhow, bitcoincore_rpc::RpcApi, TestEnv};

#[tokio::test]
pub async fn test_update_tx_graph_without_keychain() -> anyhow::Result<()> {


@@ -1,7 +1,5 @@
use bdk_chain::spk_client::{FullScanRequest, SyncRequest};
use bdk_esplora::EsploraExt;
use esplora_client::{self, Builder};
use std::collections::{BTreeSet, HashSet};
use std::str::FromStr;
@@ -9,7 +7,7 @@ use std::thread::sleep;
use std::time::Duration;

use bdk_chain::bitcoin::{Address, Amount, Txid};
use bdk_testenv::{anyhow, bitcoincore_rpc::RpcApi, TestEnv};

#[test]
pub fn test_update_tx_graph_without_keychain() -> anyhow::Result<()> {


@@ -1,6 +1,6 @@
[package]
name = "bdk_file_store"
version = "0.11.0"
edition = "2021"
license = "MIT OR Apache-2.0"
repository = "https://github.com/bitcoindevkit/bdk"
@@ -12,8 +12,8 @@ readme = "README.md"
[dependencies]
anyhow = { version = "1", default-features = false }
bdk_chain = { path = "../chain", version = "0.14.0", features = [ "serde", "miniscript" ] }
bdk_persist = { path = "../persist", version = "0.2.0"}
bincode = { version = "1" }
serde = { version = "1", features = ["derive"] }


@@ -1,7 +1,7 @@
[package]
name = "bdk_persist"
homepage = "https://bitcoindevkit.org"
version = "0.2.0"
repository = "https://github.com/bitcoindevkit/bdk"
documentation = "https://docs.rs/bdk_persist"
description = "Types that define data persistence of a BDK wallet"
@@ -14,6 +14,9 @@ rust-version = "1.63"
[dependencies]
anyhow = { version = "1", default-features = false }
bdk_chain = { path = "../chain", version = "0.14.0", default-features = false }
[features]
default = ["bdk_chain/std"]


@@ -1,6 +1,6 @@
[package]
name = "bdk_testenv"
version = "0.4.0"
edition = "2021"
rust-version = "1.63"
homepage = "https://bitcoindevkit.org"
@@ -13,10 +13,8 @@ readme = "README.md"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
bdk_chain = { path = "../chain", version = "0.14", default-features = false }
electrsd = { version= "0.27.1", features = ["bitcoind_25_0", "esplora_a33e97e1", "legacy"] }

[features]
default = ["std"]


@@ -11,6 +11,11 @@ use bitcoincore_rpc::{
     bitcoincore_rpc_json::{GetBlockTemplateModes, GetBlockTemplateRules},
     RpcApi,
 };
+pub use electrsd;
+pub use electrsd::bitcoind;
+pub use electrsd::bitcoind::anyhow;
+pub use electrsd::bitcoind::bitcoincore_rpc;
+pub use electrsd::electrum_client;
 use electrsd::electrum_client::ElectrumApi;
 use std::time::Duration;
@@ -261,8 +266,7 @@ impl TestEnv {
 #[cfg(test)]
 mod test {
     use crate::TestEnv;
-    use anyhow::Result;
-    use bitcoincore_rpc::RpcApi;
+    use electrsd::bitcoind::{anyhow::Result, bitcoincore_rpc::RpcApi};
     /// This checks that reorgs initiated by `bitcoind` is detected by our `electrsd` instance.
     #[test]
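The hunk above turns `bdk_testenv` into a facade over `electrsd`'s transitive dependencies: by `pub use`-ing `bitcoind`, `anyhow`, and `bitcoincore_rpc`, downstream code (including the crate's own tests) no longer needs direct manifest entries for them. A minimal, self-contained sketch of that re-export pattern (the module names `backend` and `env` are illustrative stand-ins, not the real crates):

```rust
// `backend` stands in for the wrapped dependency (`electrsd`), and `env`
// for the facade crate (`bdk_testenv`). Illustrative names only.
mod backend {
    pub mod rpc {
        pub fn ping() -> &'static str {
            "pong"
        }
    }
}

mod env {
    // Re-export the transitive dependency so consumers can write
    // `env::rpc::...` without depending on `backend` directly.
    pub use crate::backend::rpc;
}

fn main() {
    // Consumers reach the wrapped dependency through the facade.
    println!("{}", env::rpc::ping());
}
```

This keeps version selection in one place: only the facade crate pins the backend's version, so consumers cannot drift onto an incompatible one.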


@@ -212,7 +212,7 @@ fn main() -> anyhow::Result<()> {
         graph.graph().balance(
             &*chain,
             synced_to.block_id(),
-            graph.index.outpoints().iter().cloned(),
+            graph.index.outpoints(),
             |(k, _), _| k == &Keychain::Internal,
         )
     };
@@ -336,7 +336,7 @@ fn main() -> anyhow::Result<()> {
         graph.graph().balance(
             &*chain,
             synced_to.block_id(),
-            graph.index.outpoints().iter().cloned(),
+            graph.index.outpoints(),
             |(k, _), _| k == &Keychain::Internal,
         )
     };


@@ -249,14 +249,20 @@ where
         script_pubkey: address.script_pubkey(),
     }];
-    let internal_keychain = if graph.index.keychains().get(&Keychain::Internal).is_some() {
+    let internal_keychain = if graph
+        .index
+        .keychains()
+        .any(|(k, _)| *k == Keychain::Internal)
+    {
         Keychain::Internal
     } else {
         Keychain::External
     };
-    let ((change_index, change_script), change_changeset) =
-        graph.index.next_unused_spk(&internal_keychain);
+    let ((change_index, change_script), change_changeset) = graph
+        .index
+        .next_unused_spk(&internal_keychain)
+        .expect("Must exist");
     changeset.append(change_changeset);
     // Clone to drop the immutable reference.
@@ -266,8 +272,9 @@
         &graph
             .index
             .keychains()
-            .get(&internal_keychain)
+            .find(|(k, _)| *k == &internal_keychain)
             .expect("must exist")
+            .1
             .at_derivation_index(change_index)
             .expect("change_index can't be hardened"),
         &assets,
@@ -284,8 +291,9 @@
         min_drain_value: graph
             .index
             .keychains()
-            .get(&internal_keychain)
+            .find(|(k, _)| *k == &internal_keychain)
             .expect("must exist")
+            .1
             .dust_value(),
         ..CoinSelectorOpt::fund_outputs(
             &outputs,
@@ -416,7 +424,7 @@ pub fn planned_utxos<A: Anchor, O: ChainOracle, K: Clone + bdk_tmp_plan::CanDeri
     assets: &bdk_tmp_plan::Assets<K>,
 ) -> Result<Vec<PlannedUtxo<K, A>>, O::Error> {
     let chain_tip = chain.get_chain_tip()?;
-    let outpoints = graph.index.outpoints().iter().cloned();
+    let outpoints = graph.index.outpoints();
     graph
         .graph()
         .try_filter_chain_unspents(chain, chain_tip, outpoints)
@@ -428,8 +436,9 @@ pub fn planned_utxos<A: Anchor, O: ChainOracle, K: Clone + bdk_tmp_plan::CanDeri
             let desc = graph
                 .index
                 .keychains()
-                .get(&k)
+                .find(|(keychain, _)| *keychain == &k)
                 .expect("keychain must exist")
+                .1
                 .at_derivation_index(i)
                 .expect("i can't be hardened");
             let plan = bdk_tmp_plan::plan_satisfaction(&desc, assets)?;
@@ -465,7 +474,8 @@ where
         _ => unreachable!("only these two variants exist in match arm"),
     };
-    let ((spk_i, spk), index_changeset) = spk_chooser(index, &Keychain::External);
+    let ((spk_i, spk), index_changeset) =
+        spk_chooser(index, &Keychain::External).expect("Must exist");
     let db = &mut *db.lock().unwrap();
     db.stage_and_commit(C::from((
         local_chain::ChangeSet::default(),
@@ -506,18 +516,18 @@
     let chain = &*chain.lock().unwrap();
     fn print_balances<'a>(
         title_str: &'a str,
-        items: impl IntoIterator<Item = (&'a str, u64)>,
+        items: impl IntoIterator<Item = (&'a str, Amount)>,
     ) {
         println!("{}:", title_str);
         for (name, amount) in items.into_iter() {
-            println!("    {:<10} {:>12} sats", name, amount)
+            println!("    {:<10} {:>12} sats", name, amount.to_sat())
         }
     }
     let balance = graph.graph().try_balance(
         chain,
         chain.get_chain_tip()?,
-        graph.index.outpoints().iter().cloned(),
+        graph.index.outpoints(),
         |(k, _), _| k == &Keychain::Internal,
     )?;
@@ -547,7 +557,7 @@ where
     let graph = &*graph.lock().unwrap();
     let chain = &*chain.lock().unwrap();
     let chain_tip = chain.get_chain_tip()?;
-    let outpoints = graph.index.outpoints().iter().cloned();
+    let outpoints = graph.index.outpoints();
     match txout_cmd {
         TxOutCmd::List {
@@ -695,9 +705,11 @@ where
     let mut index = KeychainTxOutIndex::<Keychain>::default();
+    // TODO: descriptors are already stored in the db, so we shouldn't re-insert
+    // them in the index here. However, the keymap is not stored in the database.
     let (descriptor, mut keymap) =
         Descriptor::<DescriptorPublicKey>::parse_descriptor(&secp, &args.descriptor)?;
-    index.add_keychain(Keychain::External, descriptor);
+    let _ = index.insert_descriptor(Keychain::External, descriptor);
     if let Some((internal_descriptor, internal_keymap)) = args
         .change_descriptor
@@ -706,7 +718,7 @@
         .transpose()?
     {
         keymap.extend(internal_keymap);
-        index.add_keychain(Keychain::Internal, internal_descriptor);
+        let _ = index.insert_descriptor(Keychain::Internal, internal_descriptor);
     }
     let mut db_backend = match Store::<C>::open_or_create_new(db_magic, &args.db_path) {
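Several hunks above follow from a single API change: `keychains()` now yields `(keychain, descriptor)` pairs instead of returning a map, so the map-style `.get(&k)` lookup becomes an iterator-style `.find(..)` followed by `.1` to take the descriptor half of the pair. A self-contained sketch of that lookup shape (the enum and descriptor strings are stand-ins, not BDK's real types):

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Keychain {
    External,
    Internal,
}

// Stand-in for the index's `keychains()`, which now yields an iterator of
// `(keychain, descriptor)` pairs rather than a map reference.
fn keychains() -> impl Iterator<Item = (Keychain, &'static str)> {
    [
        (Keychain::External, "wpkh(external/*)"),
        (Keychain::Internal, "wpkh(internal/*)"),
    ]
    .into_iter()
}

// Iterator-style lookup mirroring the diff: `.find(..)` replaces the old
// `.get(&k)`, and `.1` extracts the descriptor from the matched pair.
fn descriptor_for(k: Keychain) -> &'static str {
    keychains()
        .find(|(keychain, _)| *keychain == k)
        .expect("keychain must exist")
        .1
}

fn main() {
    println!("{}", descriptor_for(Keychain::Internal));
}
```

The same shape covers the existence check: `keychains().any(|(k, _)| *k == Keychain::Internal)` replaces the old `.get(..).is_some()`.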


@@ -1,19 +1,20 @@
 use std::{
-    collections::BTreeMap,
     io::{self, Write},
     sync::Mutex,
 };
 use bdk_chain::{
-    bitcoin::{constants::genesis_block, Address, Network, OutPoint, Txid},
+    bitcoin::{constants::genesis_block, Address, Network, Txid},
+    collections::BTreeSet,
     indexed_tx_graph::{self, IndexedTxGraph},
     keychain,
     local_chain::{self, LocalChain},
+    spk_client::{FullScanRequest, SyncRequest},
     Append, ConfirmationHeightAnchor,
 };
 use bdk_electrum::{
     electrum_client::{self, Client, ElectrumApi},
-    ElectrumExt, ElectrumUpdate,
+    ElectrumExt,
 };
 use example_cli::{
     anyhow::{self, Context},
@@ -147,42 +148,56 @@ fn main() -> anyhow::Result<()> {
     let client = electrum_cmd.electrum_args().client(args.network)?;
-    let response = match electrum_cmd.clone() {
+    let (chain_update, mut graph_update, keychain_update) = match electrum_cmd.clone() {
         ElectrumCommands::Scan {
             stop_gap,
             scan_options,
             ..
         } => {
-            let (keychain_spks, tip) = {
-                let graph = &*graph.lock().unwrap();
-                let chain = &*chain.lock().unwrap();
-                let keychain_spks = graph
-                    .index
-                    .all_unbounded_spk_iters()
-                    .into_iter()
-                    .map(|(keychain, iter)| {
-                        let mut first = true;
-                        let spk_iter = iter.inspect(move |(i, _)| {
-                            if first {
-                                eprint!("\nscanning {}: ", keychain);
-                                first = false;
-                            }
-                            eprint!("{} ", i);
-                            let _ = io::stdout().flush();
-                        });
-                        (keychain, spk_iter)
-                    })
-                    .collect::<BTreeMap<_, _>>();
-                let tip = chain.tip();
-                (keychain_spks, tip)
-            };
-            client
-                .full_scan(tip, keychain_spks, stop_gap, scan_options.batch_size)
-                .context("scanning the blockchain")?
+            let request = {
+                let graph = &*graph.lock().unwrap();
+                let chain = &*chain.lock().unwrap();
+                FullScanRequest::from_chain_tip(chain.tip())
+                    .cache_graph_txs(graph.graph())
+                    .set_spks_for_keychain(
+                        Keychain::External,
+                        graph
+                            .index
+                            .unbounded_spk_iter(&Keychain::External)
+                            .into_iter()
+                            .flatten(),
+                    )
+                    .set_spks_for_keychain(
+                        Keychain::Internal,
+                        graph
+                            .index
+                            .unbounded_spk_iter(&Keychain::Internal)
+                            .into_iter()
+                            .flatten(),
+                    )
+                    .inspect_spks_for_all_keychains({
+                        let mut once = BTreeSet::new();
+                        move |k, spk_i, _| {
+                            if once.insert(k) {
+                                eprint!("\nScanning {}: {} ", k, spk_i);
+                            } else {
+                                eprint!("{} ", spk_i);
+                            }
+                            io::stdout().flush().expect("must flush");
+                        }
+                    })
+            };
+            let res = client
+                .full_scan::<_>(request, stop_gap, scan_options.batch_size, false)
+                .context("scanning the blockchain")?
+                .with_confirmation_height_anchor();
+            (
+                res.chain_update,
+                res.graph_update,
+                Some(res.last_active_indices),
+            )
         }
         ElectrumCommands::Sync {
             mut unused_spks,
@@ -195,7 +210,6 @@ fn main() -> anyhow::Result<()> {
             // Get a short lock on the tracker to get the spks we're interested in
             let graph = graph.lock().unwrap();
            let chain = chain.lock().unwrap();
-            let chain_tip = chain.tip().block_id();
             if !(all_spks || unused_spks || utxos || unconfirmed) {
                 unused_spks = true;
@@ -205,18 +219,20 @@
                 unused_spks = false;
             }
-            let mut spks: Box<dyn Iterator<Item = bdk_chain::bitcoin::ScriptBuf>> =
-                Box::new(core::iter::empty());
+            let chain_tip = chain.tip();
+            let mut request =
+                SyncRequest::from_chain_tip(chain_tip.clone()).cache_graph_txs(graph.graph());
             if all_spks {
                 let all_spks = graph
                     .index
                     .revealed_spks(..)
                     .map(|(k, i, spk)| (k.to_owned(), i, spk.to_owned()))
                     .collect::<Vec<_>>();
-                spks = Box::new(spks.chain(all_spks.into_iter().map(|(k, i, spk)| {
-                    eprintln!("scanning {}:{}", k, i);
-                    spk
-                })));
+                request = request.chain_spks(all_spks.into_iter().map(|(k, spk_i, spk)| {
+                    eprint!("Scanning {}: {}", k, spk_i);
+                    spk
+                }));
             }
             if unused_spks {
                 let unused_spks = graph
@@ -224,82 +240,88 @@
                     .unused_spks()
                     .map(|(k, i, spk)| (k, i, spk.to_owned()))
                     .collect::<Vec<_>>();
-                spks = Box::new(spks.chain(unused_spks.into_iter().map(|(k, i, spk)| {
-                    eprintln!(
-                        "Checking if address {} {}:{} has been used",
-                        Address::from_script(&spk, args.network).unwrap(),
-                        k,
-                        i,
-                    );
-                    spk
-                })));
+                request =
+                    request.chain_spks(unused_spks.into_iter().map(move |(k, spk_i, spk)| {
+                        eprint!(
+                            "Checking if address {} {}:{} has been used",
+                            Address::from_script(&spk, args.network).unwrap(),
+                            k,
+                            spk_i,
+                        );
+                        spk
+                    }));
             }
-            let mut outpoints: Box<dyn Iterator<Item = OutPoint>> = Box::new(core::iter::empty());
             if utxos {
-                let init_outpoints = graph.index.outpoints().iter().cloned();
+                let init_outpoints = graph.index.outpoints();
                 let utxos = graph
                     .graph()
-                    .filter_chain_unspents(&*chain, chain_tip, init_outpoints)
+                    .filter_chain_unspents(&*chain, chain_tip.block_id(), init_outpoints)
                     .map(|(_, utxo)| utxo)
                     .collect::<Vec<_>>();
-                outpoints = Box::new(
-                    utxos
-                        .into_iter()
-                        .inspect(|utxo| {
-                            eprintln!(
-                                "Checking if outpoint {} (value: {}) has been spent",
-                                utxo.outpoint, utxo.txout.value
-                            );
-                        })
-                        .map(|utxo| utxo.outpoint),
-                );
+                request = request.chain_outpoints(utxos.into_iter().map(|utxo| {
+                    eprint!(
+                        "Checking if outpoint {} (value: {}) has been spent",
+                        utxo.outpoint, utxo.txout.value
+                    );
+                    utxo.outpoint
+                }));
             };
-            let mut txids: Box<dyn Iterator<Item = Txid>> = Box::new(core::iter::empty());
             if unconfirmed {
                 let unconfirmed_txids = graph
                     .graph()
-                    .list_chain_txs(&*chain, chain_tip)
+                    .list_chain_txs(&*chain, chain_tip.block_id())
                     .filter(|canonical_tx| !canonical_tx.chain_position.is_confirmed())
                     .map(|canonical_tx| canonical_tx.tx_node.txid)
                     .collect::<Vec<Txid>>();
-                txids = Box::new(unconfirmed_txids.into_iter().inspect(|txid| {
-                    eprintln!("Checking if {} is confirmed yet", txid);
-                }));
+                request = request.chain_txids(
+                    unconfirmed_txids
+                        .into_iter()
+                        .inspect(|txid| eprint!("Checking if {} is confirmed yet", txid)),
+                );
             }
-            let tip = chain.tip();
+            let total_spks = request.spks.len();
+            let total_txids = request.txids.len();
+            let total_ops = request.outpoints.len();
+            request = request
+                .inspect_spks({
+                    let mut visited = 0;
+                    move |_| {
+                        visited += 1;
+                        eprintln!(" [ {:>6.2}% ]", (visited * 100) as f32 / total_spks as f32)
+                    }
+                })
+                .inspect_txids({
+                    let mut visited = 0;
+                    move |_| {
+                        visited += 1;
+                        eprintln!(" [ {:>6.2}% ]", (visited * 100) as f32 / total_txids as f32)
+                    }
+                })
+                .inspect_outpoints({
+                    let mut visited = 0;
+                    move |_| {
+                        visited += 1;
+                        eprintln!(" [ {:>6.2}% ]", (visited * 100) as f32 / total_ops as f32)
+                    }
+                });
+            let res = client
+                .sync(request, scan_options.batch_size, false)
+                .context("scanning the blockchain")?
+                .with_confirmation_height_anchor();
             // drop lock on graph and chain
             drop((graph, chain));
-            let electrum_update = client
-                .sync(tip, spks, txids, outpoints, scan_options.batch_size)
-                .context("scanning the blockchain")?;
-            (electrum_update, BTreeMap::new())
+            (res.chain_update, res.graph_update, None)
         }
     };
-    let (
-        ElectrumUpdate {
-            chain_update,
-            relevant_txids,
-        },
-        keychain_update,
-    ) = response;
-    let missing_txids = {
-        let graph = &*graph.lock().unwrap();
-        relevant_txids.missing_full_txs(graph.graph())
-    };
-    let mut graph_update = relevant_txids.into_tx_graph(&client, missing_txids)?;
     let now = std::time::UNIX_EPOCH
         .elapsed()
         .expect("must get time")
@@ -310,21 +332,17 @@
         let mut chain = chain.lock().unwrap();
         let mut graph = graph.lock().unwrap();
-        let chain = chain.apply_update(chain_update)?;
-        let indexed_tx_graph = {
-            let mut changeset =
-                indexed_tx_graph::ChangeSet::<ConfirmationHeightAnchor, _>::default();
-            let (_, indexer) = graph.index.reveal_to_target_multi(&keychain_update);
-            changeset.append(indexed_tx_graph::ChangeSet {
-                indexer,
-                ..Default::default()
-            });
-            changeset.append(graph.apply_update(graph_update));
-            changeset
-        };
-        (chain, indexed_tx_graph)
+        let chain_changeset = chain.apply_update(chain_update)?;
+        let mut indexed_tx_graph_changeset =
+            indexed_tx_graph::ChangeSet::<ConfirmationHeightAnchor, _>::default();
+        if let Some(keychain_update) = keychain_update {
+            let (_, keychain_changeset) = graph.index.reveal_to_target_multi(&keychain_update);
+            indexed_tx_graph_changeset.append(keychain_changeset.into());
+        }
+        indexed_tx_graph_changeset.append(graph.apply_update(graph_update));
+        (chain_changeset, indexed_tx_graph_changeset)
     };
     let mut db = db.lock().unwrap();
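The rewritten sync path above counts the requested spks, txids, and outpoints up front, then reports percentage progress from stateful `move` closures passed to the request's inspectors. A self-contained sketch of that counter pattern (`progress_inspector` is an illustrative helper, not a BDK API):

```rust
// Build a progress callback in the style of the `inspect_spks` closures
// above: a `move` closure owning a `visited` counter, returning the
// percent complete each time it fires.
fn progress_inspector(total: usize) -> impl FnMut() -> f32 {
    let mut visited = 0usize;
    move || {
        visited += 1;
        (visited * 100) as f32 / total as f32
    }
}

fn main() {
    let items = ["spk-0", "spk-1", "spk-2", "spk-3"];
    let mut inspect = progress_inspector(items.len());
    for item in items {
        // In the real example this fires once per item the server is queried for.
        eprintln!("{item} [ {:>6.2}% ]", inspect());
    }
}
```

Because the closure owns its counter, the request can carry it across the batched network calls without any shared mutable state.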


@@ -277,7 +277,7 @@ fn main() -> anyhow::Result<()> {
     // We want to search for whether the UTXO is spent, and spent by which
     // transaction. We provide the outpoint of the UTXO to
     // `EsploraExt::update_tx_graph_without_keychain`.
-    let init_outpoints = graph.index.outpoints().iter().cloned();
+    let init_outpoints = graph.index.outpoints();
     let utxos = graph
         .graph()
         .filter_chain_unspents(&*chain, local_tip.block_id(), init_outpoints)


@@ -1,18 +1,18 @@
 const DB_MAGIC: &str = "bdk_wallet_electrum_example";
-const SEND_AMOUNT: u64 = 5000;
+const SEND_AMOUNT: Amount = Amount::from_sat(5000);
 const STOP_GAP: usize = 50;
 const BATCH_SIZE: usize = 5;
 use std::io::Write;
 use std::str::FromStr;
-use bdk::bitcoin::Address;
-use bdk::wallet::Update;
+use bdk::bitcoin::{Address, Amount};
+use bdk::chain::collections::HashSet;
 use bdk::{bitcoin::Network, Wallet};
 use bdk::{KeychainKind, SignOptions};
 use bdk_electrum::{
     electrum_client::{self, ElectrumApi},
-    ElectrumExt, ElectrumUpdate,
+    ElectrumExt,
 };
 use bdk_file_store::Store;
@@ -38,44 +38,30 @@
     print!("Syncing...");
     let client = electrum_client::Client::new("ssl://electrum.blockstream.info:60002")?;
-    let prev_tip = wallet.latest_checkpoint();
-    let keychain_spks = wallet
-        .all_unbounded_spk_iters()
-        .into_iter()
-        .map(|(k, k_spks)| {
-            let mut once = Some(());
-            let mut stdout = std::io::stdout();
-            let k_spks = k_spks
-                .inspect(move |(spk_i, _)| match once.take() {
-                    Some(_) => print!("\nScanning keychain [{:?}]", k),
-                    None => print!(" {:<3}", spk_i),
-                })
-                .inspect(move |_| stdout.flush().expect("must flush"));
-            (k, k_spks)
-        })
-        .collect();
-    let (
-        ElectrumUpdate {
-            chain_update,
-            relevant_txids,
-        },
-        keychain_update,
-    ) = client.full_scan(prev_tip, keychain_spks, STOP_GAP, BATCH_SIZE)?;
+    let request = wallet
+        .start_full_scan()
+        .inspect_spks_for_all_keychains({
+            let mut once = HashSet::<KeychainKind>::new();
+            move |k, spk_i, _| {
+                if once.insert(k) {
+                    print!("\nScanning keychain [{:?}]", k)
+                } else {
+                    print!(" {:<3}", spk_i)
+                }
+            }
+        })
+        .inspect_spks_for_all_keychains(|_, _, _| std::io::stdout().flush().expect("must flush"));
+    let mut update = client
+        .full_scan(request, STOP_GAP, BATCH_SIZE, false)?
+        .with_confirmation_time_height_anchor(&client)?;
+
+    let now = std::time::UNIX_EPOCH.elapsed().unwrap().as_secs();
+    let _ = update.graph_update.update_last_seen_unconfirmed(now);
     println!();
-    let missing = relevant_txids.missing_full_txs(wallet.as_ref());
-    let mut graph_update = relevant_txids.into_confirmation_time_tx_graph(&client, missing)?;
-    let now = std::time::UNIX_EPOCH.elapsed().unwrap().as_secs();
-    let _ = graph_update.update_last_seen_unconfirmed(now);
-    let wallet_update = Update {
-        last_active_indices: keychain_update,
-        graph: graph_update,
-        chain: Some(chain_update),
-    };
-    wallet.apply_update(wallet_update)?;
+    wallet.apply_update(update)?;
     wallet.commit()?;
     let balance = wallet.get_balance();
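The new scan inspector replaces the `Option::take` trick with a `HashSet`: the keychain header is printed only the first time a keychain is seen, and just the index afterwards. A self-contained sketch of that `once.insert(k)` pattern using the stdlib `HashSet` (`format_progress` is an illustrative helper):

```rust
use std::collections::HashSet;

// First visit to a keychain prints a header; later visits print only the
// index. `HashSet::insert` returns true exactly once per key, which is
// the whole trick.
fn format_progress(seen: &mut HashSet<&'static str>, keychain: &'static str, index: u32) -> String {
    if seen.insert(keychain) {
        format!("\nScanning keychain [{keychain}]")
    } else {
        format!(" {index:<3}")
    }
}

fn main() {
    let mut seen = HashSet::new();
    for (k, i) in [("External", 0), ("External", 1), ("Internal", 0)] {
        print!("{}", format_progress(&mut seen, k, i));
    }
    println!();
}
```

Unlike the old per-keychain `Option::take`, one set shared across all keychains works even when the scan interleaves them.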


@@ -1,14 +1,14 @@
 use std::{collections::BTreeSet, io::Write, str::FromStr};
 use bdk::{
-    bitcoin::{Address, Network, Script},
+    bitcoin::{Address, Amount, Network, Script},
     KeychainKind, SignOptions, Wallet,
 };
 use bdk_esplora::{esplora_client, EsploraAsyncExt};
 use bdk_file_store::Store;
 const DB_MAGIC: &str = "bdk_wallet_esplora_async_example";
-const SEND_AMOUNT: u64 = 5000;
+const SEND_AMOUNT: Amount = Amount::from_sat(5000);
 const STOP_GAP: usize = 50;
 const PARALLEL_REQUESTS: usize = 5;


@@ -1,12 +1,12 @@
 const DB_MAGIC: &str = "bdk_wallet_esplora_example";
-const SEND_AMOUNT: u64 = 1000;
+const SEND_AMOUNT: Amount = Amount::from_sat(1000);
 const STOP_GAP: usize = 5;
 const PARALLEL_REQUESTS: usize = 1;
 use std::{collections::BTreeSet, io::Write, str::FromStr};
 use bdk::{
-    bitcoin::{Address, Network},
+    bitcoin::{Address, Amount, Network},
     KeychainKind, SignOptions, Wallet,
 };
 use bdk_esplora::{esplora_client, EsploraExt};
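Across the examples, `SEND_AMOUNT` moves from a raw `u64` of satoshis to the typed `Amount`, and display sites call `.to_sat()` at the edge. A minimal sketch of that sat-denominated newtype pattern (this is a stand-in, not rust-bitcoin's actual `Amount` implementation):

```rust
// Stand-in for `bitcoin::Amount`: a newtype over satoshis, so amounts
// cannot be mixed up with other u64 quantities at compile time.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Amount(u64);

impl Amount {
    // `const fn` allows `const SEND_AMOUNT: Amount = ...` as in the diff.
    const fn from_sat(sat: u64) -> Self {
        Amount(sat)
    }
    fn to_sat(self) -> u64 {
        self.0
    }
}

const SEND_AMOUNT: Amount = Amount::from_sat(5000);

fn main() {
    // Convert back to raw sats only at the display boundary, as the
    // updated `print_balances` does.
    println!("{:>12} sats", SEND_AMOUNT.to_sat());
}
```

The `print_balances` hunk in the CLI example is the other half of this change: its items now carry `Amount`, so the only `u64` left is the one produced by `to_sat()` at print time.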