Clippy fixes after alpha release #392

Merged: 7 commits, Jun 7, 2023
4 changes: 2 additions & 2 deletions .github/workflows/rust.yml
@@ -31,8 +31,8 @@ jobs:
run: cargo check
- name: Run cargo fmt check
run: |
-if ! cargo fmt --check ; then
-echo "Formatting errors detected, please run 'cargo fmt' to fix it";
+if ! cargo fmt --all --check ; then
+echo "Formatting errors detected, please run 'cargo fmt --all' to fix it";
exit 1
fi

2 changes: 1 addition & 1 deletion Makefile
@@ -27,7 +27,7 @@ fix: ## cargo fmt and fix
lint: ## cargo check and clippy
cargo check
cargo check --all-targets --all-features
-cargo clippy --all
+cargo clippy --all-targets --all-features

check-features: ## Checks that project compiles with all combinations of features
cargo hack --feature-powerset check
8 changes: 4 additions & 4 deletions README.md
@@ -84,12 +84,12 @@ issue! All of the core developers can be reached via [Discord](https://discord.g

The easiest way to build a rollup is to use the Module System. You can find a tutorial [here](./examples/demo-nft-module/README.md).

-We also provide two examples - [`demo-stf`](./examples/demo-stf/), which shows how to use the Module System to implement a
-state transition, and [`demo-rollup`](./examples/demo-rollup/), which shows how to combine the demo STF with a DA layer and a ZKVM to
+We also provide two examples - [`demo-stf`](./examples/demo-stf/README.md), which shows how to use the Module System to implement a
+state transition, and [`demo-rollup`](./examples/demo-rollup/README.md), which shows how to combine the demo STF with a DA layer and a ZKVM to
get a complete rollup implementation.

If you want even more control over your rollup's functionality, you can implement a completely custom State Transition Function
-without using the Module System. You can find a tutorial [here](./examples/demo-simple-stf/).
+without using the Module System. You can find a tutorial [here](./examples/demo-simple-stf/README.md).

### Adding a new Data Availability Layer

@@ -100,7 +100,7 @@ If you want to add support for a new data availability layer, the easiest way to

Adapters contain the logic integrating 3rd party codebases into the Sovereign SDK. Over time, we expect Sovereign SDK
to have adapters for almost all Data Availability Layers and LLVM-compatible proof systems. Currently, we
-maintain adapters for [`Risc0`](www.risczero.com) (a ZKVM) and [`Celestia`](www.celestia.org) a (DA layer).
+maintain adapters for [`Risc0`](https://www.risczero.com) (a ZKVM) and [`Celestia`](https://www.celestia.org) a (DA layer).
The Avail project also maintains an adapter for their DA layer, which can be found [here](https://github.com/availproject/avail-sovereign-da-adapter).

## Warning
4 changes: 2 additions & 2 deletions adapters/celestia/Cargo.toml
@@ -1,7 +1,7 @@
[package]
name = "jupiter"
-version = { workspace = true }
-edition = { workspace = true }
+version = { workspace = true }
+edition = { workspace = true }

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

4 changes: 2 additions & 2 deletions adapters/celestia/src/celestia.rs
@@ -349,10 +349,10 @@ impl Address for H160 {}
pub fn parse_pfb_namespace(
group: NamespaceGroup,
) -> Result<Vec<(MsgPayForBlobs, TxPosition)>, BoxError> {
-if group.shares().len() == 0 {
+if group.shares().is_empty() {
return Ok(vec![]);
}
-assert!(group.shares()[0].namespace() == PFB_NAMESPACE);
+assert_eq!(group.shares()[0].namespace(), PFB_NAMESPACE);
let mut pfbs = Vec::new();
for blob in group.blobs() {
let mut data = blob.data();
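For readers skimming the diff, here is a minimal, self-contained sketch (not part of this PR, using made-up data) of the two idioms Clippy pushes toward in this hunk: `is_empty()` instead of `len() == 0` (the `clippy::len_zero` lint), and `assert_eq!` instead of `assert!(a == b)` so that a failure prints both values:

```rust
fn main() {
    let shares: Vec<u8> = Vec::new();

    // `clippy::len_zero`: `is_empty()` states the intent directly.
    assert!(shares.is_empty()); // rather than `shares.len() == 0`

    // `assert_eq!` reports both operands when it fails
    // (`left: ..., right: ...`), while `assert!(a == b)` only says
    // the condition was false.
    assert_eq!(shares.len(), 0);
}
```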
15 changes: 9 additions & 6 deletions adapters/celestia/src/da_service.rs
@@ -138,7 +138,7 @@ impl DaService for CelestiaService {

fn get_finalized_at(&self, height: u64) -> Self::Future<Self::FilteredBlock> {
let client = self.client.clone();
-let rollup_namespace = self.rollup_namespace.clone();
+let rollup_namespace = self.rollup_namespace;
Box::pin(async move {
let _span = span!(Level::TRACE, "fetching finalized block", height = height);
// Fetch the header and relevant shares via RPC
@@ -287,20 +287,20 @@ mod tests {
shares::{NamespaceGroup, Share},
};

const SERIALIZED_PFB_SHARES: &'static str = r#"["AAAAAAAAAAQBAAABRQAAABHDAgq3AgqKAQqHAQogL2NlbGVzdGlhLmJsb2IudjEuTXNnUGF5Rm9yQmxvYnMSYwovY2VsZXN0aWExemZ2cnJmYXE5dWQ2Zzl0NGt6bXNscGYyNHlzYXhxZm56ZWU1dzkSCHNvdi10ZXN0GgEoIiCB8FoaUuOPrX2wFBbl4MnWY3qE72tns7sSY8xyHnQtr0IBABJmClAKRgofL2Nvc21vcy5jcnlwdG8uc2VjcDI1NmsxLlB1YktleRIjCiEDmXaTf6RVIgUVdG0XZ6bqecEn8jWeAi+LjzTis5QZdd4SBAoCCAEYARISCgwKBHV0aWESBDIwMDAQgPEEGkAhq2CzD1DqxsVXIriANXYyLAmJlnnt8YTNXiwHgMQQGUbl65QUe37UhnbNVrOzDVYK/nQV9TgI+5NetB2JbIz6EgEBGgRJTkRYAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="]"#;
const SERIALIZED_ROLLUP_DATA_SHARES: &'static str = r#"["c292LXRlc3QBAAAAKHsia2V5IjogInRlc3RrZXkiLCAidmFsdWUiOiAidGVzdHZhbHVlIn0AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="]"#;
const SERIALIZED_PFB_SHARES: &str = r#"["AAAAAAAAAAQBAAABRQAAABHDAgq3AgqKAQqHAQogL2NlbGVzdGlhLmJsb2IudjEuTXNnUGF5Rm9yQmxvYnMSYwovY2VsZXN0aWExemZ2cnJmYXE5dWQ2Zzl0NGt6bXNscGYyNHlzYXhxZm56ZWU1dzkSCHNvdi10ZXN0GgEoIiCB8FoaUuOPrX2wFBbl4MnWY3qE72tns7sSY8xyHnQtr0IBABJmClAKRgofL2Nvc21vcy5jcnlwdG8uc2VjcDI1NmsxLlB1YktleRIjCiEDmXaTf6RVIgUVdG0XZ6bqecEn8jWeAi+LjzTis5QZdd4SBAoCCAEYARISCgwKBHV0aWESBDIwMDAQgPEEGkAhq2CzD1DqxsVXIriANXYyLAmJlnnt8YTNXiwHgMQQGUbl65QUe37UhnbNVrOzDVYK/nQV9TgI+5NetB2JbIz6EgEBGgRJTkRYAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="]"#;
const SERIALIZED_ROLLUP_DATA_SHARES: &str = r#"["c292LXRlc3QBAAAAKHsia2V5IjogInRlc3RrZXkiLCAidmFsdWUiOiAidGVzdHZhbHVlIn0AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="]"#;

#[test]
fn test_get_pfbs() {
// the following test case is taken from arabica-6, block 275345
let shares: Vec<Share> =
serde_json::from_str(SERIALIZED_PFB_SHARES).expect("failed to deserialize pfb shares");

-assert!(shares.len() == 1);
+assert_eq!(shares.len(), 1);

let pfb_ns = NamespaceGroup::Compact(shares);
let pfbs = parse_pfb_namespace(pfb_ns).expect("failed to parse pfb shares");
-assert!(pfbs.len() == 1);
+assert_eq!(pfbs.len(), 1);
}

#[test]
@@ -315,7 +315,10 @@ mod tests {
.expect("iterator should contain exactly one blob");

let found_data: Vec<u8> = first_blob.data().collect();
-assert!(&found_data == r#"{"key": "testkey", "value": "testvalue"}"#.as_bytes());
+assert_eq!(
+found_data,
+r#"{"key": "testkey", "value": "testvalue"}"#.as_bytes()
+);

assert!(blobs.next().is_none());
}
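Two of the lints behind this file's changes, sketched with a hypothetical `Namespace` type (the real SDK type is not shown in the diff): `clippy::redundant_static_lifetimes` for the `&'static str` consts, and `clippy::clone_on_copy` for the dropped `.clone()`:

```rust
// `'static` is already implied for a `const`, so Clippy suggests plain `&str`.
const SAMPLE_NAMESPACE: &str = "sov-test";

// Hypothetical stand-in for the rollup namespace type; the point is that
// it is `Copy`, so calling `.clone()` on it is redundant.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Namespace([u8; 8]);

fn main() {
    let ns = Namespace(*b"sov-test");
    let duplicate = ns; // a plain copy; `ns.clone()` would trigger `clone_on_copy`
    assert_eq!(duplicate, ns);
    assert_eq!(SAMPLE_NAMESPACE.len(), 8);
}
```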
4 changes: 2 additions & 2 deletions adapters/celestia/src/share_commit.rs
@@ -66,11 +66,11 @@ fn power_of_2_mountain_range(mut len: usize, square_size: usize) -> Vec<usize> {
while len != 0 {
if len >= square_size {
output.push(square_size);
-len = len - square_size;
+len -= square_size;
} else {
let p = next_lower_power_of_2(len);
output.push(p);
-len = len - p;
+len -= p;
}
}
output
13 changes: 7 additions & 6 deletions adapters/celestia/src/shares.rs
@@ -10,7 +10,7 @@ use prost::{
};
use serde::{de::Error, Deserialize, Serialize, Serializer};
use sov_rollup_interface::Bytes;
-use tracing::error;
+use tracing::{error, info};

use crate::verifier::PFB_NAMESPACE;

@@ -114,7 +114,7 @@ fn is_continuation_unchecked(share: &[u8]) -> bool {
}

fn enforce_version_zero(share: &[u8]) {
-assert!(share[8] & !0x01 == 0)
+assert_eq!(share[8] & !0x01, 0)
}

#[derive(Debug, Clone, PartialEq, Copy)]
@@ -279,7 +279,8 @@ impl NamespaceGroup {
pub fn from_b64(b64: &str) -> Result<Self, ShareParsingError> {
let mut decoded = Vec::with_capacity((b64.len() + 3) / 4 * 3);
// unsafe { decoded.set_len((b64.len() / 4 * 3)) }
-if let Err(_) = base64::decode_config_buf(b64, base64::STANDARD, &mut decoded) {
+if let Err(err) = base64::decode_config_buf(b64, base64::STANDARD, &mut decoded) {
+info!("Error decoding NamespaceGroup from base64: {}", err);
return Err(ShareParsingError::ErrInvalidBase64);
}
let mut output: Bytes = decoded.into();
@@ -347,7 +348,7 @@ impl NamespaceGroup {
}
}

-pub fn blobs<'a>(&self) -> NamespaceIterator {
+pub fn blobs(&self) -> NamespaceIterator {
NamespaceIterator {
offset: 0,
shares: self,
@@ -359,7 +360,7 @@ pub struct Blob(pub Vec<Share>);

impl<'a> From<BlobRef<'a>> for Blob {
fn from(value: BlobRef<'a>) -> Self {
-Self(value.0.iter().map(|s| s.clone()).collect())
+Self(value.0.to_vec())
}
}

@@ -639,7 +640,7 @@ mod tests {
let share = Share::new(bytes);
let serialized = share.try_to_vec().unwrap();
let deserialized: Share = Share::try_from_slice(&serialized).unwrap();
-prop_assert_eq!(share, deserialized.clone());
+prop_assert_eq!(share, deserialized);
}
}
}
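The `from_b64` change above stops throwing the decoder's error away. A small sketch of the same pattern with a toy parser (using `eprintln!` in place of `tracing::info!` so it runs standalone; the error type and message are illustrative only):

```rust
fn parse_count(input: &str) -> Result<u64, String> {
    match input.parse::<u64>() {
        Ok(n) => Ok(n),
        Err(err) => {
            // Bind the error and log it instead of matching on `Err(_)`,
            // so the original failure reason is not lost.
            eprintln!("Error parsing count from {:?}: {}", input, err);
            Err("invalid count".to_string())
        }
    }
}

fn main() {
    assert_eq!(parse_count("42"), Ok(42));
    assert!(parse_count("forty-two").is_err());
}
```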
8 changes: 4 additions & 4 deletions adapters/celestia/src/types.rs
@@ -181,17 +181,17 @@ use nmt_rs::{
NAMESPACED_HASH_LEN,
};

-impl Into<NamespaceProof<NamespacedSha2Hasher>> for JsonNamespaceProof {
-fn into(self) -> NamespaceProof<NamespacedSha2Hasher> {
+impl From<JsonNamespaceProof> for NamespaceProof<NamespacedSha2Hasher> {
+fn from(val: JsonNamespaceProof) -> Self {
NamespaceProof::PresenceProof {
proof: Proof {
-siblings: self
+siblings: val
.nodes
.unwrap_or_default()
.into_iter()
.map(|v| ns_hash_from_b64(&v.inner))
.collect(),
-start_idx: self.start as u32,
+start_idx: val.start as u32,
},
ignore_max_ns: true,
}
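The `Into` to `From` swap follows `clippy::from_over_into`: implementing `From` gives the matching `Into` for free through the standard library's blanket impl, while a hand-written `Into` does not provide `From`. A sketch with hypothetical types:

```rust
struct JsonProof(u32);
struct Proof(u32);

impl From<JsonProof> for Proof {
    fn from(val: JsonProof) -> Self {
        Proof(val.0)
    }
}

fn main() {
    let a: Proof = JsonProof(7).into(); // `Into` comes for free from `From`
    let b = Proof::from(JsonProof(8)); // and `From` is also directly callable
    assert_eq!(a.0 + b.0, 15);
}
```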
5 changes: 2 additions & 3 deletions adapters/celestia/src/verifier/mod.rs
@@ -161,12 +161,11 @@ impl da::DaVerifier for CelestiaVerifier {
// Collect all of the shares data into a single array
let trailing_shares = tx_shares[1..]
.iter()
-.map(|share| share.data_ref().iter())
-.flatten();
+.flat_map(|share| share.data_ref().iter());
let tx_data: Vec<u8> = tx_shares[0].data_ref()[start_offset..]
.iter()
.chain(trailing_shares)
-.map(|x| *x)
+.copied()
.collect();

// Deserialize the pfb transaction
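A sketch of the two iterator lints fixed in this hunk, with made-up share data: `clippy::map_flatten` (`map(..).flatten()` becomes `flat_map(..)`) and `clippy::map_clone` (`.map(|x| *x)` over references to `Copy` items becomes `.copied()`):

```rust
fn main() {
    let shares: Vec<Vec<u8>> = vec![vec![1, 2], vec![3, 4], vec![5]];

    // `flat_map` replaces `.map(|share| share.iter()).flatten()`.
    let trailing = shares[1..].iter().flat_map(|share| share.iter());

    // `.copied()` replaces `.map(|x| *x)` when turning `&u8` into `u8`.
    let data: Vec<u8> = shares[0].iter().chain(trailing).copied().collect();

    assert_eq!(data, vec![1, 2, 3, 4, 5]);
}
```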
6 changes: 3 additions & 3 deletions examples/const-rollup-config/Cargo.toml
@@ -1,7 +1,7 @@
[package]
name = "const-rollup-config"
-version = { workspace = true }
-edition = { workspace = true }
-authors = { workspace = true }
+version = { workspace = true }
+edition = { workspace = true }
+authors = { workspace = true }
homepage = "sovereign.xyz"
publish = false
13 changes: 6 additions & 7 deletions examples/demo-nft-module/Cargo.toml
@@ -1,9 +1,9 @@
[package]
name = "demo-nft-module"
-version = { workspace = true }
-edition = { workspace = true }
-authors = { workspace = true }
-license = { workspace = true }
+version = { workspace = true }
+edition = { workspace = true }
+authors = { workspace = true }
+license = { workspace = true }
homepage = "sovereign.xyz"
publish = false

@@ -13,18 +13,17 @@ publish = false
anyhow = { workspace = true }
borsh = { workspace = true, features = ["rc"] }
serde = { workspace = true, optional = true }
-serde_json = { workspace = true, optional = true }

sov-modules-api = { path = "../../module-system/sov-modules-api", default-features = false }
sov-modules-macros = { path = "../../module-system/sov-modules-macros" }
sov-state = { path = "../../module-system/sov-state", default-features = false }

[dev-dependencies]
-sov-rollup-interface = { path = "../../rollup-interface" }
+sov-rollup-interface = { path = "../../rollup-interface" }
tempfile = { workspace = true }


[features]
default = ["native"]
serde = ["dep:serde", "dep:serde_json"]
serde = ["dep:serde"]
native = ["serde", "sov-state/native", "sov-modules-api/native"]
4 changes: 2 additions & 2 deletions examples/demo-prover/host/Cargo.toml
@@ -22,7 +22,7 @@ demo-stf = { path = "../../demo-stf" }
sov-rollup-interface = { path = "../../../rollup-interface" }
risc0-adapter = { path = "../../../adapters/risc0" }
const-rollup-config = { path = "../../const-rollup-config" }
-sov-modules-api = { path = "../../../module-system/sov-modules-api", features=["native"] }
-sov-state = { path = "../../../module-system/sov-state", features=["native"] }
+sov-modules-api = { path = "../../../module-system/sov-modules-api", features = ["native"] }
+sov-state = { path = "../../../module-system/sov-state", features = ["native"] }

methods = { path = "../methods" }
12 changes: 6 additions & 6 deletions examples/demo-rollup/Cargo.toml
@@ -1,11 +1,11 @@
[package]
name = "sov-demo-rollup"
-version = { workspace = true }
-edition = { workspace = true }
-resolver = "2"
-authors = { workspace = true }
+version = { workspace = true }
+edition = { workspace = true }
+authors = { workspace = true }
homepage = "sovereign.xyz"
publish = false
resolver = "2"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

@@ -36,8 +36,8 @@ sov-modules-stf-template = { path = "../../module-system/sov-modules-stf-templat
sov-bank = { path = "../../module-system/module-implementations/sov-bank", default-features = false }
sov-election = { path = "../../module-system/module-implementations/examples/sov-election", default-features = false }
sov-value-setter = { path = "../../module-system/module-implementations/examples/sov-value-setter", default-features = false }
-sov-modules-api = { path = "../../module-system/sov-modules-api", features=["native"] }
-sov-state = { path = "../../module-system/sov-state", features=["native"] }
+sov-modules-api = { path = "../../module-system/sov-modules-api", features = ["native"] }
+sov-state = { path = "../../module-system/sov-state", features = ["native"] }
const-rollup-config = { path = "../const-rollup-config" }


4 changes: 2 additions & 2 deletions examples/demo-stf/src/batch_builder.rs
@@ -142,7 +142,7 @@ mod tests {
use sov_modules_api::default_context::DefaultContext;
use sov_modules_api::default_signature::private_key::DefaultPrivateKey;
use sov_modules_api::transaction::Transaction;
-use sov_modules_api::{Context, Genesis, ModuleInfo};
+use sov_modules_api::{Context, Genesis};
use sov_modules_macros::{DispatchCall, Genesis, MessageCodec};
use sov_rollup_interface::services::batch_builder::BatchBuilder;
use sov_state::{DefaultStorageSpec, ProverStorage, Storage};
@@ -311,7 +311,7 @@

let batch_size = txs[0].len() + txs[4].len() + 1;

-let working_set = WorkingSet::new(storage.clone());
+let working_set = WorkingSet::new(storage);
let mut batch_builder = build_test_batch_builder(batch_size);
batch_builder.set_working_set(working_set);

14 changes: 9 additions & 5 deletions examples/demo-stf/src/sov-cli/main.rs
@@ -167,7 +167,7 @@ impl SerializedTx {
call_data_path.as_ref()
)
})?;
-cmd_parser(&module_name, &call_data)
+cmd_parser(module_name, &call_data)
}
}

@@ -205,10 +205,14 @@ pub fn main() {
sender_address,
salt,
} => {
-let sender_address =
-Address::from(AddressBech32::try_from(sender_address.clone()).expect(
-&format!("Failed to derive pub key from string: {}", sender_address),
-));
+let sender_address = Address::from(
+AddressBech32::try_from(sender_address.clone()).unwrap_or_else(|e| {
+panic!(
+"Failed to derive pub key from string: {}: {}",
+sender_address, e
+)
+}),
+);
let token_address =
sov_bank::create_token_address::<C>(&token_name, sender_address.as_ref(), salt);
println!("{}", token_address);
8 changes: 4 additions & 4 deletions examples/demo-stf/src/tests/data_generation/election_data.rs
@@ -127,7 +127,7 @@ impl MessageGenerator for ElectionCallMessages {
_is_last: bool,
) -> Transaction<DefaultContext> {
let message = Runtime::<DefaultContext>::encode_election_call(message);
-Transaction::<DefaultContext>::new_signed_tx(&sender, message, nonce)
+Transaction::<DefaultContext>::new_signed_tx(sender, message, nonce)
}
}

@@ -172,7 +172,7 @@ impl MessageGenerator for InvalidElectionCallMessages {
_is_last: bool,
) -> Transaction<DefaultContext> {
let message = Runtime::<DefaultContext>::encode_election_call(message);
-Transaction::<DefaultContext>::new_signed_tx(&sender, message, nonce)
+Transaction::<DefaultContext>::new_signed_tx(sender, message, nonce)
}
}

@@ -249,7 +249,7 @@ impl MessageGenerator for BadNonceElectionCallMessages {
let nonce = if flag { nonce + 1 } else { nonce };

let message = Runtime::<DefaultContext>::encode_election_call(message);
-Transaction::<DefaultContext>::new_signed_tx(&sender, message, nonce)
+Transaction::<DefaultContext>::new_signed_tx(sender, message, nonce)
}
}

@@ -286,6 +286,6 @@ impl MessageGenerator for BadSerializationElectionCallMessages {
Runtime::<DefaultContext>::encode_election_call(message)
};

-Transaction::<DefaultContext>::new_signed_tx(&sender, call_data, nonce)
+Transaction::<DefaultContext>::new_signed_tx(sender, call_data, nonce)
}
}
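These four hunks are all `clippy::needless_borrow`: `sender` is evidently already a reference (an assumption about `new_signed_tx`'s signature, which is not shown in this diff), so writing `&sender` adds an extra borrow the compiler immediately strips again. A sketch with a stand-in signing function:

```rust
struct Key(u64);

fn sign(key: &Key, message: u64) -> u64 {
    message ^ key.0
}

fn main() {
    let key = Key(7);
    let sender: &Key = &key;

    // `sign(&sender, 42)` would pass `&&Key` and rely on auto-deref;
    // Clippy suggests passing the existing reference directly.
    let signed = sign(sender, 42);
    assert_eq!(signed, 42 ^ 7);
}
```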