
[DO NOT MERGE] Error when loading wasm #1

Closed
wants to merge 10 commits

Conversation

Contributor

mhchia commented Jul 11, 2023

Issue Encountered

When incorporating bind_prover from the tlsn-prover dev branch in this PR, I encountered an error in the final execution stage.

Although both wasm-pack build --target web and webpack-dev-server compile without issues, the page crashes upon loading, failing to execute the wasm code. The detailed error message is shown in the screenshot.

bind_prover(config, client_ws_stream_into, notary_ws_stream_into)
    .await
    .unwrap();
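
For context, a minimal sketch of how an async call like this can be driven from the browser via wasm-bindgen and wasm-bindgen-futures. The start_prover entry point, the run_prover helper, and the use of the web-sys console feature are illustrative assumptions, not the actual bindings in this branch:

use wasm_bindgen::prelude::*;
use wasm_bindgen_futures::spawn_local;

// Hypothetical stand-in for the prover routine, which would build the config
// and the two WebSocket streams and then await `bind_prover` as shown above.
async fn run_prover() -> Result<(), JsValue> {
    Ok(())
}

// Illustrative entry point exported to JavaScript. There is no tokio runtime
// in the browser, so the future is handed to the event loop via `spawn_local`.
#[wasm_bindgen]
pub fn start_prover() {
    spawn_local(async {
        if let Err(e) = run_prover().await {
            web_sys::console::error_1(&e);
        }
    });
}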


[screenshot: error message in the browser]

The issue appears to involve the GFp_x25519_sc_reduce function from the ring crate, which currently does not compile to wasm (a known issue). Since tlsn-prover depends on the ring crate indirectly, this prevents the wasm code from running. The dependency chain is shown in the following cargo tree output:

# Run from the project root of `tlsn`. Lists the non-dev dependencies that depend on the `ring` crate.
% cargo tree -i ring -e normal,build
ring v0.16.20
├── rustls-webpki v0.100.1
│   └── webpki-roots v0.23.1
│       ├── tlsn-core v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-core)
│       │   ├── tlsn-notary v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-notary)
│       │   └── tlsn-prover v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-prover)
│       └── tlsn-prover v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-prover)
├── sct v0.7.0
│   ├── tlsn-tls-client v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-client)
│   │   ├── tlsn-prover v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-prover)
│   │   └── tlsn-tls-client-async v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-client-async)
│   │       └── tlsn-prover v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-prover)
│   └── tlsn-tls-core v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-core)
│       ├── tlsn-core v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-core) (*)
│       ├── tlsn-prover v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-prover)
│       ├── tlsn-tls-backend v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-backend)
│       │   ├── tlsn-tls-client v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-client) (*)
│       │   └── tlsn-tls-mpc v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-mpc)
│       │       ├── tlsn-notary v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-notary)
│       │       └── tlsn-prover v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/tlsn/tlsn-prover)
│       ├── tlsn-tls-client v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-client) (*)
│       └── tlsn-tls-mpc v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-mpc) (*)
├── tlsn-tls-client v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-client) (*)
├── tlsn-tls-core v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-core) (*)
└── webpki v0.22.0
    ├── tlsn-tls-client v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-client) (*)
    └── tlsn-tls-core v0.1.0 (/Users/mhchia/projects/work/pse/tlsn/components/tls/tls-core) (*)

Reproduction Steps

To reproduce the issue, follow these steps:

  1. Clone the repo and check out this branch.
git clone git@github.com:tlsnotary/tlsn-extension.git
cd tlsn-extension
git checkout mhchia/exploring-wasm-bindgen
  2. Install wasm-pack if it is not already installed.
yarn global add wasm-pack
  3. Run yarn build-and-start to build the project, generate bindings with wasm-bindgen, and start a webserver on port 8080.
yarn build-and-start
  4. Open the page at http://localhost:8080 in the browser.
open http://localhost:8080
  5. The error message should now be visible on the page.

Potential Solution

Based on offline discussions, one potential solution would be to replace ring with an alternative that supports wasm compilation.

mhchia changed the title from "Generate bindings for rust websocket client" to "[DO NOT MERGE] Error when loading wasm" on Jul 26, 2023
Contributor Author

mhchia commented Aug 9, 2023

Current Status

Both the notary and tlsn-extension (tlsn-prover) are stuck waiting for futures that never resolve.

Steps to Reproduce

1. Run websocket proxy

docker run -it --rm -p 55688:80 novnc/websockify 80 twitter.com:443

2. Run notary server in twitter example

git clone git@github.com:mhchia/tlsn.git && cd tlsn && git checkout tlsn-examples-ws
cd examples
RUST_LOG=trace,yamux=info cargo run --release --example notary

3. Setup tlsn extension

Install wasm-pack

yarn global add wasm-pack

Build wasm and start webserver

git clone git@github.com:tlsnotary/tlsn-extension.git && cd tlsn-extension && git checkout mhchia/exploring-wasm-bindgen
yarn build-and-start

4. Open the page (Prover)

open http://localhost:8080

Logs

Prover

You should be able to see the following in the console. An example can be seen here.

!@# 0
!@# 1
2023-08-09T13:24:19.716Z DEBUG bind_prover{config=ProverConfig { id: "example", server_dns: "twitter.com", root_cert_store: RootCertStore { roots: [OwnedTrustAnchor { subject: [49, 11, 48, 9, 6, 3, 85, 4, 6, 19, 2, 85, 83, 49, 23, 48, 21, 6, 3, 85, 4, 10, 19, 14, 68, 105, 103, 105, 67, 101, 114, 116, 44, 32, 73, 110, 99, 46, 49, 38, 48, 36, 6, 3, 85, 4, 3, 19, 29, 68, 105, 103, 105, 67, 101, 114, 116, 32, 84, 76, 83, 32, 69, 67, 67, 32, 80, 51, 56, 52, 32, 82, 111, 111, 116, 32, 71, 53], spki: [48, 16, 6, 7, 42, 134, 72, 206, 61, 2, 1, 6, 5, 43, 129, 4, 0, 34, 3, 98, 0, 4, 193, 68, 161, 207, 17, 151, 80, 154, 222, 35, 130, 53, 7, 205, 208, 203, 24, 157, 210, 241, 127, 119, 53, 79, 59, 221, 148, 114, 82, 237, 194, 59, 248, 236, 250, 123, 107, 88, 32, 236, 153, 174, 201, 252, 104, 179, 117, 185, 219, 9, 236, 200, 19, 245, 78, 198, 10, 29, 102, 48, 76, 187, 31, 71, 10, 60, 97, 16, 66, 41, 124, 165, 8, 14, 224, 34, 233, 211, 53, 104, 206, 155, 99, 159, 132, 181, 153, 77, 88, 160, 142, 245, 84, 231, 149, 201], name_constraints: None }, OwnedTrustAnchor { subject: [49, 11, 48, 9, 6, 3, 85, 4, 6, 19, 2, 85, 83, 49, 22, 48, 20, 6, 3, 85, 4, 10, 19, 13, 69, 110, 116, 114, 117, 115, 116, 44, 32, 73, 110, 99, 46, 49, 40, 48, 38, 6, 3, 85, 4, 11, 19, 31, 83, 101, 101, 32, 119, 119, 119, 46, 101, 110, 116, 114, 117, 115, 116, 46, 110, 101, 116, 47, 108, 101, 103, 97, 108, 45, 116, 101, 114, 109, 115, 49, 57, 48, 55, 6, 3, 85, 4, 11, 19, 48, 40, 99, 41, 32, 50, 48, 49, 50, 32, 69, 110, 116, 114, 117, 115, 116, 44, 32, 73, 110, 99, 46, 32, 45, 32, 102, 111, 114, 32, 97, 117, 116, 104, 111, 114, 105, 122, 101, 100, 32, 117, 115, 101, 32, 111, 110, 108, 121, 49, 51, 48, 49, 6, 3, 85, 4, 3, 19, 42, 69, 110, 116, 114, 117, 115, 116, 32, 82, 111, 111, 116, 32, 67, 101, 114, 116, 105, 102, 105, 99, 97, 116, 105, 111, 110, 32, 65, 117, 116, 104, 111, 114, 105, 116, 121, 32, 45, 32, 69, 67, 49], spki: [48, 16, 6, 7, 42, 134, 72, 206, 61, 2, 1, 6, 5, 43, 129, 4, 0, 34, 3, 98, 0, 4, 132, 19, 201, 208, 186, 109, 65, 123, 226, 108, 208, 235, 85, 95, 102, 2, 26, 36, 244, 91, 137, 105, 71, 227, 184, 194, 125, 241, 242, 2, 197, 159, 160, 246, 91, 213, 139, 6, 25, 134, 79, 83, 16, 109, 7, 36, 39, 161, 160, 248, 213, 71, 25, 97, 76, 125, 202, 147, 39, 234, 116, 12, 239, 111, 150, 9, 254, 99, 236, 112, 93, 54, 173, 103, 119, 174, 201, 157, 124, 85, 68, 58, 162, 99, 81, 31, 245, 227, 98, 212, 169, 71, 7, 62, 204, 32], name_constraints: None }, OwnedTrustAnchor { subject: [49, 11, 48, 9, 6, 3, 85, 4, 6, 19, 2, 85, 83, 49, 20, 48, 18, 6, 3, 85, 4, 10, 12, 11, 65, 102, 102, 105, 114, 109, 84, 114, 117, 115, 116, 49, 31, 48, 29, 6, 3, 85, 4, 3, 12, 22, 65, 102, 102, 105, 114, 109, 84, 114, 117, 115, 116, 32, 67, 111, 109, 109, 101, 114, 99, 105, 97, 108], spki: [48, 13, 6, 9, 42, 134, 72, 134, 247, 13, 1, 1, 1, 5, 0, 3, 130, 1, 15, 0, 48, 130, 1, 10, 2, 130, 1, 1, 0, 246, 27, 79, 103, 7, 43, 161, 21, 245, 6, 34, 203, 31, 1, 178, 227, 115, 69, 6, 68, 73, 44, 187, 73, 37, 20, 214, 206, 195, 183, 171, 44, 79, 198, 65, 50, 148, 87, 250, 18, 167, 91, 14, 226, 143, 31, 30, 134, 25, 167, 170, 181, 45, 185, 95, 13, 138, 194, 175, 133, 53, 121, 50, 45, 187, 28, 98, 55, 242, 177, 91, 74, 61, 202, 205, 113, 95, 233, 66, 190, 148, 232, 200, 222, 249, 34, 72, 100, 198, 229, 171, 198, 43, 109, 173, 5, 240, 250, 213, 11, 207, 154, 229, 240, 80, 164, 139, 59, 71, 165, 35, 91, 122, 122, 248, 51, 63, 184, 239, 153, 151, 227, 32, 193, 214, 40, 137, 207, 148, 251, 185, 69, 237, 227, 64, 23, 17, 212, 116, 240, 11, 49, 226, 43, 38, 106, 155, 76, 
87, 174, 172, 32, 62, 186, 69, 12
...

Notary

You should see something like the output below (details can be seen here).

% RUST_LOG=trace,yamux=info cargo run --release --example notary
...
Listening on: 127.0.0.1:7788
Accepted connection from: 127.0.0.1:65478
2023-08-09T12:55:19.932099Z TRACE tungstenite::handshake::server: Server handshake initiated.    
2023-08-09T12:55:19.932167Z TRACE tungstenite::handshake::machine: Doing handshake round.    
2023-08-09T12:55:19.932389Z TRACE tungstenite::handshake::machine: Doing handshake round.    
2023-08-09T12:55:19.932458Z DEBUG tungstenite::handshake::server: Server handshake done.    
2023-08-09T12:55:19.932777Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.932805Z TRACE tungstenite::protocol: Frames still in queue: 0    
!@# notarize: 0
!@# notarize: 1
!@# notarize: 2

...

2023-08-09T12:55:19.943526Z TRACE tungstenite::protocol: Received message ot/1/parent    
2023-08-09T12:55:19.943534Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.943538Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.943572Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.943576Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.943585Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.943588Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.943755Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.943770Z TRACE tungstenite::protocol: Frames still in queue: 0    
!@# notarize: 3
2023-08-09T12:55:19.943834Z  INFO tlsn_notary: Created OT senders and receivers
!@# notarize: 27
!@# notarize: 28
!@# notarize: 29
!@# notarize: loop start
!@# notarize: 4
!@# notarize: 5: encoder_seed: [96, 215, 192, 173, 80, 65, 210, 138, 97, 163, 217, 79, 2, 30, 207, 132, 13, 66, 82, 126, 182, 96, 143, 97, 33, 96, 119, 37, 210, 252, 185, 51]
2023-08-09T12:55:19.944003Z TRACE tokio_util::codec::framed_impl: flushing framed transport
2023-08-09T12:55:19.944009Z TRACE tokio_util::codec::framed_impl: writing; remaining=37
2023-08-09T12:55:19.944016Z TRACE tokio_util::codec::framed_impl: framed transport flushed
2023-08-09T12:55:19.944459Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.944481Z TRACE tungstenite::protocol: Frames still in queue: 0    
2023-08-09T12:55:19.944489Z TRACE tungstenite::protocol: Frames still in queue: 1    
2023-08-09T12:55:19.944494Z TRACE tungstenite::protocol: Sending frame: Frame { header: FrameHeader { is_final: true, rsv1: false, rsv2: false, rsv3: false, opcode: Data(Binary), mask: None }, payload: [0, 0, 0, 2, 0, 0, 0, 7, 0, 0, 0, 37] }    
2023-08-09T12:55:19.944504Z TRACE tungstenite::protocol::frame: writing frame 

@chcharcharlie

Not sure how useful it is, but I added some more logging and found that both the sender setup and the receiver setup get stuck at the parent_ot.rand_setup step: https://github.com/mhchia/tlsn/blob/0213adf7b547ae75977a9f195c862b7a539136d3/components/actors/actor-ot/src/sender.rs#L104.

@chcharcharlie

Dug a bit further, and it seems Backend::spawn is not actually spawning or running anything here: https://github.com/privacy-scaling-explorations/mpz/blob/dev/ot/mpz-ot/src/kos/sender.rs#L42.

I guess if we could get wasm-bindgen-rayon to translate this into web workers it would work, but that path always fails with the websocket-close issue, so I guess we will have to work around it?

Contributor Author

mhchia commented Aug 19, 2023

Just a note on the recent changes: the prover from the Twitter example can now run in the browser by switching from the rayon backend to a single thread, though it takes ~350 seconds.
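
For illustration, a minimal sketch of what switching from the rayon backend to a single thread can look like. This is not the actual tlsn-utils/mpz code; the run_cpu_bound name, the rayon dependency on native targets, and the channel-based hand-off are all assumptions:

// Native targets: offload CPU-bound work to rayon's global pool and block
// until the result comes back over a channel.
#[cfg(not(target_arch = "wasm32"))]
pub fn run_cpu_bound<F, R>(f: F) -> R
where
    F: FnOnce() -> R + Send + 'static,
    R: Send + 'static,
{
    let (tx, rx) = std::sync::mpsc::channel();
    rayon::spawn(move || {
        let _ = tx.send(f());
    });
    rx.recv().expect("rayon worker panicked")
}

// wasm32 in the browser: without wasm-bindgen-rayon there is only one thread,
// so the same work runs inline on the caller. This is why the end-to-end run
// completes but is slow.
#[cfg(target_arch = "wasm32")]
pub fn run_cpu_bound<F, R>(f: F) -> R
where
    F: FnOnce() -> R,
{
    f()
}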

Changes

  • tlsn-utils: use a single-CPU backend instead of rayon
  • tlsn (branch tlsn-examples-ws-single-cpu-backend, diff)
    • Use the modified tlsn-utils
    • Use web-time instead of std::time, because parts of the std::time API are not supported in wasm (see the sketch after this list)
      • web-time is used here since it supports wasm and provides the same API as std::time
      • We could probably use time or instant instead, since they also seem to support wasm and are a lot more popular
  • tlsn-extension (branch mhchia/exploring-wasm-bindgen)
    • Use wasm-bindgen-futures::spawn_local instead of tokio::spawn
    • Run the tlsn-prover operations from the Twitter example
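
A minimal sketch of the web-time swap mentioned in the list above (the time_section helper is illustrative, and web-time is assumed to be added as a dependency):

// On wasm32-unknown-unknown, std::time::Instant::now() panics at runtime,
// while web_time::Instant is backed by the browser clock and keeps the same
// API; on native targets web-time simply re-exports std::time.
use web_time::Instant;

// Illustrative helper: measure how long a closure takes to run.
fn time_section<F: FnOnce()>(f: F) -> std::time::Duration {
    let start = Instant::now();
    f();
    // `elapsed()` returns a std::time::Duration, exactly as with std's Instant.
    start.elapsed()
}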

Steps to Reproduce

  1. Check out tlsn to the branch tlsn-examples-ws-single-cpu-backend.
  2. The other steps remain the same as in the earlier comment on [DO NOT MERGE] Error when loading wasm #1.
  3. You should see something like the screenshot below.

[screenshot]

@0xtsukino
Collaborator

merged with #4

0xtsukino closed this on Aug 25, 2023
mhchia deleted the mhchia/exploring-wasm-bindgen branch on August 25, 2023 at 14:34