28 Commits

Author SHA1 Message Date
hdbg
1b4369b1cb feat(transport): add domain error type to GrpcTransportActor
Some checks failed
ci/woodpecker/pr/server-audit Pipeline was successful
ci/woodpecker/pr/server-lint Pipeline failed
ci/woodpecker/pr/server-vet Pipeline failed
ci/woodpecker/pr/server-test Pipeline failed
2026-02-26 15:07:11 +01:00
hdbg
7bd37b3c4a refactor: introduce TransportActor abstraction 2026-02-25 21:44:01 +01:00
hdbg
fe8c5e1bd2 housekeeping(useragent): rename
Some checks failed
ci/woodpecker/push/server-lint Pipeline failed
ci/woodpecker/push/server-audit Pipeline was successful
ci/woodpecker/push/server-vet Pipeline failed
ci/woodpecker/push/server-test Pipeline was successful
2026-02-21 13:54:47 +01:00
hdbg
cbbe1f8881 feat(proto): add URL parsing and TLS certificate management 2026-02-18 14:09:58 +01:00
hdbg
7438d62695 misc: create license and readme 2026-02-17 22:20:30 +01:00
hdbg
4236f2c36d refactor(server): reorganized actors, context, and db modules into <dir>/mod.rs structure
Some checks failed
ci/woodpecker/push/server-lint Pipeline was successful
ci/woodpecker/push/server-vet Pipeline failed
ci/woodpecker/push/server-audit Pipeline was successful
ci/woodpecker/push/server-test Pipeline was successful
2026-02-16 22:29:48 +01:00
hdbg
76ff535619 refactor(server::tests): moved integration-like tests into tests/ 2026-02-16 22:27:59 +01:00
hdbg
b3566c8af6 refactor(server): separated global actors into their own handle 2026-02-16 21:58:14 +01:00
hdbg
bdb9f01757 refactor(server): actors reorganization & linter fixes 2026-02-16 21:43:59 +01:00
hdbg
0805e7a846 feat(keyholder): add seal method and unseal integration tests 2026-02-16 21:38:29 +01:00
hdbg
eb9cbc88e9 feat(server::user-agent): Unseal implemented 2026-02-16 21:17:06 +01:00
hdbg
dd716da4cd test(keyholder): remove unused imports from test modules 2026-02-16 21:15:13 +01:00
hdbg
1545db7428 fix(ci): add protoc installation for lints
Some checks failed
ci/woodpecker/push/server-audit Pipeline was successful
ci/woodpecker/push/server-vet Pipeline failed
ci/woodpecker/push/server-lint Pipeline failed
ci/woodpecker/push/server-test Pipeline was successful
2026-02-16 21:14:55 +01:00
hdbg
20ac84b60c fix(ci): add clippy installation in mise.toml
Some checks failed
ci/woodpecker/push/server-audit Pipeline was successful
ci/woodpecker/push/server-vet Pipeline failed
ci/woodpecker/push/server-lint Pipeline failed
ci/woodpecker/push/server-test Pipeline was successful
2026-02-16 21:04:13 +01:00
hdbg
8f6dda871b refactor(actors): rename BootstrapActor to Bootstrapper 2026-02-16 21:01:53 +01:00
hdbg
47108ed8ad chore(supply-chain): update cargo-vet audits and trusted publishers
Some checks failed
ci/woodpecker/pr/server-lint Pipeline failed
ci/woodpecker/pr/server-audit Pipeline was successful
ci/woodpecker/pr/server-vet Pipeline failed
ci/woodpecker/pr/server-test Pipeline was successful
ci/woodpecker/push/server-lint Pipeline failed
ci/woodpecker/push/server-audit Pipeline was successful
ci/woodpecker/push/server-vet Pipeline failed
ci/woodpecker/push/server-test Pipeline was successful
2026-02-16 20:52:31 +01:00
hdbg
359df73c2e feat(server::key_holder): unique index on (root_key_id, nonce) to avoid nonce reuse 2026-02-16 20:45:15 +01:00
hdbg
ce03b7e15d feat(server::key_holder): ability to remotely get current state 2026-02-16 20:40:36 +01:00
hdbg
e4038d9188 refactor(keyholder): rename KeyHolderActor to KeyHolder and optimize db connection lifetime 2026-02-16 20:36:47 +01:00
hdbg
c82339d764 security(server::key_holder): replaced nonce-caching with exclusive transaction fetching nonce from the database 2026-02-16 18:23:25 +01:00
hdbg
c5b51f4b70 feat(server): UserAgent seal/unseal
Some checks failed
ci/woodpecker/pr/server-lint Pipeline failed
ci/woodpecker/pr/server-audit Pipeline was successful
ci/woodpecker/pr/server-vet Pipeline failed
ci/woodpecker/pr/server-test Pipeline was successful
2026-02-16 14:00:23 +01:00
hdbg
6b8f8c9ff7 feat(unseal): add unseal protocol support for user agents 2026-02-15 13:04:55 +01:00
hdbg
8263bc6b6f feat(server): boot mechanism 2026-02-15 01:44:12 +01:00
hdbg
a6c849f268 ci: add server linting pipeline for Rust code quality checks 2026-02-14 23:44:16 +01:00
hdbg
d8d65da0b4 test(user-agent): add challenge-response auth flow test 2026-02-14 23:43:36 +01:00
hdbg
abdf4e3893 tests(server): UserAgent invalid bootstrap token 2026-02-14 19:48:37 +01:00
hdbg
4bac70a6e9 security(server): cargo-vet proper init
All checks were successful
ci/woodpecker/push/server-audit Pipeline was successful
ci/woodpecker/push/server-vet Pipeline was successful
ci/woodpecker/push/server-test Pipeline was successful
2026-02-14 19:16:09 +01:00
hdbg
54a41743be housekeeping(server): trimmed-down dependencies 2026-02-14 19:04:50 +01:00
37 changed files with 937 additions and 2224 deletions


@@ -1,31 +0,0 @@
-Extension Discovery Cache
-=========================
-
-This folder is used by `package:extension_discovery` to cache lists of
-packages that contains extensions for other packages.
-
-DO NOT USE THIS FOLDER
-----------------------
-
- * Do not read (or rely) the contents of this folder.
- * Do write to this folder.
-
-If you're interested in the lists of extensions stored in this folder use the
-API offered by package `extension_discovery` to get this information.
-
-If this package doesn't work for your use-case, then don't try to read the
-contents of this folder. It may change, and will not remain stable.
-
-Use package `extension_discovery`
----------------------------------
-
-If you want to access information from this folder.
-
-Feel free to delete this folder
--------------------------------
-
-Files in this folder act as a cache, and the cache is discarded if the files
-are older than the modification time of `.dart_tool/package_config.json`.
-
-Hence, it should never be necessary to clear this cache manually, if you find a
-need to do please file a bug.


@@ -1 +0,0 @@
-{"version":2,"entries":[{"package":"app","rootUri":"../","packageUri":"lib/"}]}


@@ -2,6 +2,7 @@ syntax = "proto3";
 package arbiter;
 
+import "auth.proto";
 import "client.proto";
 import "user_agent.proto";
@@ -11,6 +12,6 @@ message ServerInfo {
 }
 
 service ArbiterService {
-  rpc Client(stream arbiter.client.ClientRequest) returns (stream arbiter.client.ClientResponse);
-  rpc UserAgent(stream arbiter.user_agent.UserAgentRequest) returns (stream arbiter.user_agent.UserAgentResponse);
+  rpc Client(stream ClientRequest) returns (stream ClientResponse);
+  rpc UserAgent(stream UserAgentRequest) returns (stream UserAgentResponse);
 }

protobufs/auth.proto (new file, 35 lines)

@@ -0,0 +1,35 @@
+syntax = "proto3";
+
+package arbiter.auth;
+
+import "google/protobuf/timestamp.proto";
+
+message AuthChallengeRequest {
+  bytes pubkey = 1;
+  optional string bootstrap_token = 2;
+}
+
+message AuthChallenge {
+  bytes pubkey = 1;
+  int32 nonce = 2;
+}
+
+message AuthChallengeSolution {
+  bytes signature = 1;
+}
+
+message AuthOk {}
+
+message ClientMessage {
+  oneof payload {
+    AuthChallengeRequest auth_challenge_request = 1;
+    AuthChallengeSolution auth_challenge_solution = 2;
+  }
+}
+
+message ServerMessage {
+  oneof payload {
+    AuthChallenge auth_challenge = 1;
+    AuthOk auth_ok = 2;
+  }
+}
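The new auth.proto factors the challenge-response handshake into shared ClientMessage/ServerMessage envelopes: the client asks for a challenge, the server answers with a nonce, the client returns a solution, and the server replies AuthOk. A minimal sketch of that round trip follows; the message names mirror the proto, but the crypto is a stand-in (a keyed hash instead of the real signature scheme — the server's dependency list points at ed25519-dalek), and `ServerSession`, `peer_secret`, and the fixed nonce are illustrative assumptions, not code from this repository:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in "signature": a keyed hash over the nonce. The real protocol
// presumably verifies a signature against the registered `pubkey`.
fn sign(secret: u64, nonce: i32) -> u64 {
    let mut h = DefaultHasher::new();
    secret.hash(&mut h);
    nonce.hash(&mut h);
    h.finish()
}

// Mirrors the ServerMessage oneof: a challenge, then an AuthOk.
#[derive(Debug, PartialEq)]
enum ServerMessage {
    AuthChallenge { nonce: i32 },
    AuthOk,
}

struct ServerSession {
    peer_secret: u64,          // stand-in for the peer's registered key
    pending_nonce: Option<i32>,
}

impl ServerSession {
    // AuthChallengeRequest -> AuthChallenge
    fn on_challenge_request(&mut self) -> ServerMessage {
        let nonce = 42; // would be freshly generated per session
        self.pending_nonce = Some(nonce);
        ServerMessage::AuthChallenge { nonce }
    }

    // AuthChallengeSolution -> AuthOk on success, None on rejection.
    // The nonce is consumed either way, so a replayed solution fails.
    fn on_solution(&mut self, signature: u64) -> Option<ServerMessage> {
        let nonce = self.pending_nonce.take()?;
        (signature == sign(self.peer_secret, nonce)).then_some(ServerMessage::AuthOk)
    }
}

fn main() {
    let secret = 7;
    let mut session = ServerSession { peer_secret: secret, pending_nonce: None };
    let ServerMessage::AuthChallenge { nonce } = session.on_challenge_request() else {
        unreachable!()
    };
    assert_eq!(session.on_solution(sign(secret, nonce)), Some(ServerMessage::AuthOk));
    println!("handshake ok");
}
```

Taking the nonce out of `pending_nonce` before verifying is the detail worth copying: it makes each challenge single-use, which lines up with the unique `(root_key_id, nonce)` index added elsewhere in this history to avoid nonce reuse.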


@@ -1,32 +1,17 @@
 syntax = "proto3";
 
-package arbiter.client;
+package arbiter;
 
-message AuthChallengeRequest {
-  bytes pubkey = 1;
-}
-
-message AuthChallenge {
-  bytes pubkey = 1;
-  int32 nonce = 2;
-}
-
-message AuthChallengeSolution {
-  bytes signature = 1;
-}
-
-message AuthOk {}
+import "auth.proto";
 
 message ClientRequest {
   oneof payload {
-    AuthChallengeRequest auth_challenge_request = 1;
-    AuthChallengeSolution auth_challenge_solution = 2;
+    arbiter.auth.ClientMessage auth_message = 1;
   }
 }
 
 message ClientResponse {
   oneof payload {
-    AuthChallenge auth_challenge = 1;
-    AuthOk auth_ok = 2;
+    arbiter.auth.ServerMessage auth_message = 1;
   }
 }
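With the shared auth package in place, both the Client and UserAgent streams wrap the same auth envelope instead of duplicating the challenge messages. A rough model of the resulting nesting — hand-written stand-in enums, since the actual prost-generated types in the arbiter-proto crate will differ in shape — and of the deduplication it buys:

```rust
// Hand-modeled stand-in for arbiter.auth.ClientMessage.
#[derive(Debug, Clone, PartialEq)]
enum AuthClientMessage {
    ChallengeRequest { pubkey: Vec<u8>, bootstrap_token: Option<String> },
    ChallengeSolution { signature: Vec<u8> },
}

// Both request oneofs now start with the same shared auth variant.
#[derive(Debug, PartialEq)]
enum ClientRequest {
    AuthMessage(AuthClientMessage),
}

#[derive(Debug, PartialEq)]
enum UserAgentRequest {
    AuthMessage(AuthClientMessage),
    UnsealStart { client_pubkey: Vec<u8> },
}

// One helper can build the handshake opener for either stream.
fn auth_opener(pubkey: &[u8]) -> AuthClientMessage {
    AuthClientMessage::ChallengeRequest {
        pubkey: pubkey.to_vec(),
        bootstrap_token: None,
    }
}

fn main() {
    let opener = auth_opener(b"pk");
    let as_client = ClientRequest::AuthMessage(opener.clone());
    let as_agent = UserAgentRequest::AuthMessage(opener);
    println!("{:?} / {:?}", as_client, as_agent);
}
```

Before this refactor the client and user-agent protos each carried their own AuthChallenge* messages, so a helper like `auth_opener` would have needed one copy per stream type.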


@@ -1,25 +1,10 @@
 syntax = "proto3";
 
-package arbiter.user_agent;
+package arbiter;
 
+import "auth.proto";
 import "google/protobuf/empty.proto";
 
-message AuthChallengeRequest {
-  bytes pubkey = 1;
-  optional string bootstrap_token = 2;
-}
-
-message AuthChallenge {
-  bytes pubkey = 1;
-  int32 nonce = 2;
-}
-
-message AuthChallengeSolution {
-  bytes signature = 1;
-}
-
-message AuthOk {}
-
 message UnsealStart {
   bytes client_pubkey = 1;
 }
@@ -50,19 +35,17 @@ enum VaultState {
 message UserAgentRequest {
   oneof payload {
-    AuthChallengeRequest auth_challenge_request = 1;
-    AuthChallengeSolution auth_challenge_solution = 2;
-    UnsealStart unseal_start = 3;
-    UnsealEncryptedKey unseal_encrypted_key = 4;
-    google.protobuf.Empty query_vault_state = 5;
+    arbiter.auth.ClientMessage auth_message = 1;
+    UnsealStart unseal_start = 2;
+    UnsealEncryptedKey unseal_encrypted_key = 3;
+    google.protobuf.Empty query_vault_state = 4;
   }
 }
 
 message UserAgentResponse {
   oneof payload {
-    AuthChallenge auth_challenge = 1;
-    AuthOk auth_ok = 2;
-    UnsealStartResponse unseal_start_response = 3;
-    UnsealResult unseal_result = 4;
-    VaultState vault_state = 5;
+    arbiter.auth.ServerMessage auth_message = 1;
+    UnsealStartResponse unseal_start_response = 2;
+    UnsealResult unseal_result = 3;
+    VaultState vault_state = 4;
   }
 }

server/Cargo.lock (generated, 292 lines changed)

@@ -47,9 +47,9 @@ dependencies = [
 [[package]]
 name = "anyhow"
-version = "1.0.102"
+version = "1.0.101"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7f202df86484c868dbad7eaa557ef785d5c66295e41b460ef922eca0723b842c"
+checksum = "5f0e0fee31ef5ed1ba1316088939cea399010ed7731dba877ed44aeb407a75ea"
 
 [[package]]
 name = "arbiter-client"
@@ -61,11 +61,9 @@ version = "0.1.0"
 dependencies = [
  "base64",
  "futures",
- "hex",
  "kameo",
  "miette",
  "prost",
- "prost-types",
  "rand",
  "rcgen",
  "rstest",
@@ -123,13 +121,9 @@ version = "0.1.0"
 dependencies = [
  "arbiter-proto",
  "ed25519-dalek",
- "http",
  "kameo",
- "rustls-webpki",
  "smlang",
- "thiserror",
  "tokio",
- "tokio-stream",
  "tonic",
  "tracing",
  "x25519-dalek",
@@ -172,7 +166,7 @@ checksum = "3109e49b1e4909e9db6515a30c633684d68cdeaa252f215214cb4fa1a5bfee2c"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
  "synstructure",
 ]
@@ -184,7 +178,7 @@ checksum = "7b18050c2cd6fe86c3a76584ef5e0baf286d038cda203eb6223df2cc413565f7"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -195,7 +189,7 @@ checksum = "9035ad2d096bed7955a320ee7e2230574d28fd3c3a0f186cbea1ff3c7eed5dbb"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -212,9 +206,9 @@ checksum = "c08606f8c3cbf4ce6ec8e28fb0014a2c086708fe954eaa885384a6165172e7e8"
 [[package]]
 name = "aws-lc-rs"
-version = "1.16.1"
+version = "1.15.4"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "94bffc006df10ac2a68c83692d734a465f8ee6c5b384d8545a636f81d858f4bf"
+checksum = "7b7b6141e96a8c160799cc2d5adecd5cbbe5054cb8c7c4af53da0f83bb7ad256"
 dependencies = [
  "aws-lc-sys",
  "untrusted 0.7.1",
@@ -223,9 +217,9 @@ dependencies = [
 [[package]]
 name = "aws-lc-sys"
-version = "0.38.0"
+version = "0.37.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "4321e568ed89bb5a7d291a7f37997c2c0df89809d7b6d12062c81ddb54aa782e"
+checksum = "b092fe214090261288111db7a2b2c2118e5a7f30dc2569f1732c4069a6840549"
 dependencies = [
  "cc",
  "cmake",
@@ -350,18 +344,18 @@ dependencies = [
 [[package]]
 name = "block-buffer"
-version = "0.12.0"
+version = "0.11.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "cdd35008169921d80bc60d3d0ab416eecb028c4cd653352907921d95084790be"
+checksum = "96eb4cdd6cf1b31d671e9efe75c5d1ec614776856cefbe109ca373554a6d514f"
 dependencies = [
  "hybrid-array",
 ]
 
 [[package]]
 name = "bumpalo"
-version = "3.20.2"
+version = "3.19.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5d20789868f4b01b2f2caec9f5c4e0213b41e3e5702a50157d699ae31ced2fcb"
+checksum = "5dd9dc738b7a8311c7ade152424974d8115f2cdad61e8dab8dac9f2362298510"
 
 [[package]]
 name = "bytes"
@@ -424,9 +418,9 @@ dependencies = [
 [[package]]
 name = "chrono"
-version = "0.4.44"
+version = "0.4.43"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c673075a2e0e5f4a1dde27ce9dee1ea4558c7ffe648f576438a20ca1d2acc4b0"
+checksum = "fac4744fb15ae8337dc853fee7fb3f4e48c0fbaa23d0afe49c447b4fab126118"
 dependencies = [
  "iana-time-zone",
  "js-sys",
@@ -520,9 +514,9 @@ dependencies = [
 [[package]]
 name = "crypto-common"
-version = "0.2.1"
+version = "0.2.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "77727bb15fa921304124b128af125e7e3b968275d1b108b379190264f4423710"
+checksum = "211f05e03c7d03754740fd9e585de910a095d6b99f8bcfffdef8319fa02a8331"
 dependencies = [
  "hybrid-array",
 ]
@@ -551,7 +545,7 @@ dependencies = [
  "cfg-if",
  "cpufeatures 0.2.17",
  "curve25519-dalek-derive",
- "digest 0.11.1",
+ "digest 0.11.0",
  "fiat-crypto 0.3.0",
  "rustc_version",
  "subtle",
@@ -566,7 +560,7 @@ checksum = "f46882e17999c6cc590af592290432be3bce0428cb0d5f8b6715e4dc7b383eb3"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -590,7 +584,7 @@ dependencies = [
  "proc-macro2",
  "quote",
  "strsim",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -601,7 +595,7 @@ checksum = "d38308df82d1080de0afee5d069fa14b0326a88c14f15c5ccda35b4a6c414c81"
 dependencies = [
  "darling_core",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -640,9 +634,9 @@ dependencies = [
 [[package]]
 name = "deranged"
-version = "0.5.8"
+version = "0.5.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7cd812cc2bc1d69d4764bd80df88b4317eaef9e773c75226407d9bc0876b211c"
+checksum = "cc3dc5ad92c2e2d1c193bbbbdf2ea477cb81331de4f3103f267ca18368b988c4"
 dependencies = [
  "powerfmt",
 ]
@@ -688,7 +682,7 @@ dependencies = [
  "dsl_auto_type",
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -708,7 +702,7 @@ version = "0.3.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "fe2444076b48641147115697648dc743c2c00b61adade0f01ce67133c7babe8c"
 dependencies = [
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -724,12 +718,12 @@ dependencies = [
 [[package]]
 name = "digest"
-version = "0.11.1"
+version = "0.11.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "285743a676ccb6b3e116bc14cc69319b957867930ae9c4822f8e0f54509d7243"
+checksum = "f8bf3682cdec91817be507e4aa104314898b95b84d74f3d43882210101a545b6"
 dependencies = [
- "block-buffer 0.12.0",
- "crypto-common 0.2.1",
+ "block-buffer 0.11.0",
+ "crypto-common 0.2.0",
 ]
 
 [[package]]
@@ -740,7 +734,7 @@ checksum = "97369cbbc041bc366949bc74d34658d6cda5621039731c6310521892a3a20ae0"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -760,7 +754,7 @@ dependencies = [
  "heck",
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -955,7 +949,7 @@ checksum = "e835b70203e41293343137df5c0664546da5745f82ec9b84d40be8336958447b"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -1022,19 +1016,19 @@ checksum = "899def5c37c4fd7b2664648c28120ecec138e4d395b459e5ca34f9cce2dd77fd"
 dependencies = [
  "cfg-if",
  "libc",
- "r-efi 5.3.0",
+ "r-efi",
  "wasip2",
 ]
 
 [[package]]
 name = "getrandom"
-version = "0.4.2"
+version = "0.4.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "0de51e6874e94e7bf76d726fc5d13ba782deca734ff60d5bb2fb2607c7406555"
+checksum = "139ef39800118c7683f2fd3c98c1b23c09ae076556b435f8e9064ae108aaeeec"
 dependencies = [
  "cfg-if",
  "libc",
- "r-efi 6.0.0",
+ "r-efi",
  "rand_core 0.10.0",
  "wasip2",
  "wasip3",
@@ -1101,12 +1095,6 @@ version = "0.5.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "2304e00983f87ffb38b55b444b5e3b60a884b5d30c0fca7d82fe33449bbe55ea"
 
-[[package]]
-name = "hex"
-version = "0.4.3"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7f24254aa9a54b5c858eaee2f5bccdb46aaf0e486a595ed5fd8f86ba55232a70"
-
 [[package]]
 name = "http"
 version = "1.4.0"
@@ -1154,9 +1142,9 @@ checksum = "df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9"
 [[package]]
 name = "hybrid-array"
-version = "0.4.8"
+version = "0.4.7"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "8655f91cd07f2b9d0c24137bd650fe69617773435ee5ec83022377777ce65ef1"
+checksum = "e1b229d73f5803b562cc26e4da0396c8610a4ee209f4fac8fa4f8d709166dc45"
 dependencies = [
  "typenum",
 ]
@@ -1421,9 +1409,9 @@ dependencies = [
 [[package]]
 name = "js-sys"
-version = "0.3.91"
+version = "0.3.85"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b49715b7073f385ba4bc528e5747d02e66cb39c6146efb66b781f131f0fb399c"
+checksum = "8c942ebf8e95485ca0d52d97da7c5a2c387d0e7f0ba4c35e93bfcaee045955b3"
 dependencies = [
  "once_cell",
  "wasm-bindgen",
@@ -1453,7 +1441,7 @@ dependencies = [
  "heck",
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -1470,9 +1458,9 @@ checksum = "09edd9e8b54e49e587e4f6295a7d29c3ea94d469cb40ab8ca70b288248a81db2"
 [[package]]
 name = "libc"
-version = "0.2.183"
+version = "0.2.182"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b5b646652bf6661599e1da8901b3b9522896f01e736bad5f723fe7a3a27f899d"
+checksum = "6800badb6cb2082ffd7b6a67e6125bb39f18782f793520caee8cb8846be06112"
 
 [[package]]
 name = "libsqlite3-sys"
@@ -1486,9 +1474,9 @@ dependencies = [
 [[package]]
 name = "linux-raw-sys"
-version = "0.12.1"
+version = "0.11.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "32a66949e030da00e8c7d4434b251670a91556f4144941d37452769c25d58a53"
+checksum = "df1d3c3b53da64cf5760482273a98e575c651a67eec7f77df96b5b642de8f039"
 
 [[package]]
 name = "litemap"
@@ -1570,7 +1558,7 @@ checksum = "db5b29714e950dbb20d5e6f74f9dcec4edbcc1067bb7f8ed198c097b8c1a818b"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -1718,9 +1706,9 @@ checksum = "c08d65885ee38876c4f86fa503fb49d7b507c2b62552df7c70b2fce627e06381"
 [[package]]
 name = "owo-colors"
-version = "4.3.0"
+version = "4.2.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d211803b9b6b570f68772237e415a029d5a50c65d382910b879fb19d3271f94d"
+checksum = "9c6901729fa79e91a0913333229e9ca5dc725089d1c363b2f4b4760709dc4a52"
 
 [[package]]
 name = "parking_lot"
@@ -1785,29 +1773,29 @@ dependencies = [
 [[package]]
 name = "pin-project"
-version = "1.1.11"
+version = "1.1.10"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f1749c7ed4bcaf4c3d0a3efc28538844fb29bcdd7d2b67b2be7e20ba861ff517"
+checksum = "677f1add503faace112b9f1373e43e9e054bfdd22ff1a63c1bc485eaec6a6a8a"
 dependencies = [
  "pin-project-internal",
 ]
 
 [[package]]
 name = "pin-project-internal"
-version = "1.1.11"
+version = "1.1.10"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d9b20ed30f105399776b9c883e68e536ef602a16ae6f596d2c473591d6ad64c6"
+checksum = "6e918e4ff8c4549eb882f14b3a4bc8c8bc93de829416eacf579f1207a8fbf861"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
 name = "pin-project-lite"
-version = "0.2.17"
+version = "0.2.16"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a89322df9ebe1c1578d689c92318e070967d1042b512afbe49518723f4e6d5cd"
+checksum = "3b3cff922bd51709b605d9ead9aa71031d81447142d828eb4a6eba76fe619f9b"
 
 [[package]]
 name = "pin-utils"
@@ -1860,14 +1848,14 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "479ca8adacdd7ce8f1fb39ce9ecccbfe93a3f1344b3d0d97f20bc0196208f62b"
 dependencies = [
  "proc-macro2",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
 name = "proc-macro-crate"
-version = "3.5.0"
+version = "3.4.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "e67ba7e9b2b56446f1d419b1d807906278ffa1a658a8a5d8a39dcb1f5a78614f"
+checksum = "219cb19e96be00ab2e37d6e299658a0cfa83e52429179969b0f0121b4ac46983"
 dependencies = [
  "toml_edit",
 ]
@@ -1932,7 +1920,7 @@ dependencies = [
  "pulldown-cmark",
  "pulldown-cmark-to-cmark",
  "regex",
- "syn 2.0.117",
+ "syn 2.0.115",
  "tempfile",
 ]
@@ -1946,7 +1934,7 @@ dependencies = [
  "itertools",
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -1955,15 +1943,14 @@ version = "0.14.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "8991c4cbdb8bc5b11f0b074ffe286c30e523de90fee5ba8132f1399f23cb3dd7"
 dependencies = [
- "chrono",
  "prost",
 ]
 
 [[package]]
 name = "pulldown-cmark"
-version = "0.13.1"
+version = "0.13.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "83c41efbf8f90ac44de7f3a868f0867851d261b56291732d0cbf7cceaaeb55a6"
+checksum = "1e8bbe1a966bd2f362681a44f6edce3c2310ac21e4d5067a6e7ec396297a6ea0"
 dependencies = [
  "bitflags",
  "memchr",
@@ -1981,9 +1968,9 @@ dependencies = [
 [[package]]
 name = "quote"
-version = "1.0.45"
+version = "1.0.44"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "41f2619966050689382d2b44f664f4bc593e129785a36d6ee376ddf37259b924"
+checksum = "21b2ebcf727b7760c461f091f9f0f539b77b8e87f2fd88131e7f1b433b3cece4"
 dependencies = [
  "proc-macro2",
 ]
@@ -1994,12 +1981,6 @@ version = "5.3.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "69cdb34c158ceb288df11e18b4bd39de994f6657d83847bdffdbd7f346754b0f"
 
-[[package]]
-name = "r-efi"
-version = "6.0.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f8dcc9c7d52a811697d2151c701e0d08956f92b0e24136cf4cf27b57a6a0d9bf"
-
 [[package]]
 name = "rand"
 version = "0.10.0"
@@ -2007,7 +1988,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "bc266eb313df6c5c09c1c7b1fbe2510961e5bcd3add930c1e31f7ed9da0feff8"
 dependencies = [
  "chacha20 0.10.0",
- "getrandom 0.4.2",
+ "getrandom 0.4.1",
  "rand_core 0.10.0",
 ]
@@ -2075,9 +2056,9 @@ dependencies = [
 [[package]]
 name = "regex-syntax"
-version = "0.8.10"
+version = "0.8.9"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "dc897dd8d9e8bd1ed8cdad82b5966c3e0ecae09fb1907d58efaa013543185d0a"
+checksum = "a96887878f22d7bad8a3b6dc5b7440e0ada9a245242924394987b21cf2210a4c"
 
 [[package]]
 name = "relative-path"
@@ -2094,7 +2075,7 @@ dependencies = [
  "proc-macro-error",
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -2146,7 +2127,7 @@ dependencies = [
  "regex",
  "relative-path",
  "rustc_version",
- "syn 2.0.117",
+ "syn 2.0.115",
  "unicode-ident",
 ]
@@ -2176,9 +2157,9 @@ dependencies = [
 [[package]]
 name = "rustix"
-version = "1.1.4"
+version = "1.1.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b6fe4565b9518b83ef4f91bb47ce29620ca828bd32cb7e408f0062e9930ba190"
+checksum = "146c9e247ccc180c1f61615433868c99f3de3ae256a30a43b49f67c2d9171f34"
 dependencies = [
  "bitflags",
  "errno",
@@ -2189,9 +2170,9 @@ dependencies = [
 [[package]]
 name = "rustls"
-version = "0.23.37"
+version = "0.23.36"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "758025cb5fccfd3bc2fd74708fd4682be41d99e5dff73c377c0646c6012c73a4"
+checksum = "c665f33d38cea657d9614f766881e4d510e0eda4239891eea56b4cadcf01801b"
 dependencies = [
  "aws-lc-rs",
  "log",
@@ -2286,7 +2267,7 @@ checksum = "d540f220d3187173da220f885ab66608367b6574e925011a9353e4badda91d79"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -2319,7 +2300,7 @@ checksum = "7c5f3b1e2dc8aad28310d8410bd4d7e180eca65fca176c52ab00d364475d0024"
 dependencies = [
  "cfg-if",
  "cpufeatures 0.2.17",
- "digest 0.11.1",
+ "digest 0.11.0",
 ]
 
 [[package]]
@@ -2400,12 +2381,12 @@ dependencies = [
 [[package]]
 name = "socket2"
-version = "0.6.3"
+version = "0.6.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "3a766e1110788c36f4fa1c2b71b387a7815aa65f88ce0229841826633d93723e"
+checksum = "86f4aa3ad99f2088c990dfa82d367e19cb29268ed67c574d10d0a4bfe71f07e0"
 dependencies = [
  "libc",
- "windows-sys 0.61.2",
+ "windows-sys 0.60.2",
 ]
 
 [[package]]
@@ -2456,7 +2437,7 @@ dependencies = [
  "heck",
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -2499,9 +2480,9 @@ dependencies = [
 [[package]]
 name = "syn"
-version = "2.0.117"
+version = "2.0.115"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "e665b8803e7b1d2a727f4023456bbbbe74da67099c585258af0ad9c5013b9b99"
+checksum = "6e614ed320ac28113fa64972c4262d5dbc89deacdfd00c34a3e4cea073243c12"
 dependencies = [
  "proc-macro2",
  "quote",
@@ -2522,17 +2503,17 @@ checksum = "728a70f3dbaf5bab7f0c4b1ac8d7ae5ea60a4b5549c8a5914361c99147a709d2"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
 name = "tempfile"
-version = "3.27.0"
+version = "3.25.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "32497e9a4c7b38532efcdebeef879707aa9f794296a4f0244f6f69e9bc8574bd"
+checksum = "0136791f7c95b1f6dd99f9cc786b91bb81c3800b639b3478e561ddb7be95e5f1"
 dependencies = [
  "fastrand",
- "getrandom 0.4.2",
+ "getrandom 0.4.1",
  "once_cell",
  "rustix",
  "windows-sys 0.61.2",
@@ -2566,7 +2547,7 @@ checksum = "be35209fd0781c5401458ab66e4f98accf63553e8fae7425503e92fdd319783b"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -2596,7 +2577,7 @@ checksum = "ebc4ee7f67670e9b64d05fa4253e753e016c6c95ff35b89b7941d6b856dec1d5"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -2651,9 +2632,9 @@ dependencies = [
 [[package]]
 name = "tokio"
-version = "1.50.0"
+version = "1.49.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "27ad5e34374e03cfffefc301becb44e9dc3c17584f414349ebe29ed26661822d"
+checksum = "72a2903cd7736441aac9df9d7688bd0ce48edccaadf181c3b90be801e81d3d86"
 dependencies = [
  "bytes",
  "libc",
@@ -2669,13 +2650,13 @@ dependencies = [
 [[package]]
 name = "tokio-macros"
-version = "2.6.1"
+version = "2.6.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5c55a2eff8b69ce66c84f85e1da1c233edc36ceb85a2058d11b0d6a3c7e7569c"
+checksum = "af407857209536a95c8e56f8231ef2c2e2aff839b22e07a1ffcbc617e9db9fa5"
 dependencies = [
  "proc-macro2",
  "quote",
- "syn 2.0.117",
+ "syn 2.0.115",
 ]
 
 [[package]]
@@ -2721,7 +2702,7 @@ checksum = "cf92845e79fc2e2def6a5d828f0801e29a2f8acc037becc5ab08595c7d5e9863"
 dependencies = [
  "serde_core",
  "serde_spanned",
- "toml_datetime 0.7.5+spec-1.1.0",
+ "toml_datetime",
  "toml_parser",
  "winnow",
 ]
@@ -2735,41 +2716,32 @@ dependencies = [
  "serde_core",
 ]
 
-[[package]]
-name = "toml_datetime"
-version = "1.0.0+spec-1.1.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "32c2555c699578a4f59f0cc68e5116c8d7cabbd45e1409b989d4be085b53f13e"
-dependencies = [
- "serde_core",
-]
-
 [[package]]
 name = "toml_edit"
-version = "0.25.4+spec-1.1.0"
+version = "0.23.10+spec-1.0.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7193cbd0ce53dc966037f54351dbbcf0d5a642c7f0038c382ef9e677ce8c13f2"
+checksum = "84c8b9f757e028cee9fa244aea147aab2a9ec09d5325a9b01e0a49730c2b5269"
 dependencies = [
  "indexmap",
- "toml_datetime 1.0.0+spec-1.1.0",
+ "toml_datetime",
  "toml_parser",
  "winnow",
 ]
 
 [[package]]
name = "toml_parser" name = "toml_parser"
version = "1.0.9+spec-1.1.0" version = "1.0.8+spec-1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "702d4415e08923e7e1ef96cd5727c0dfed80b4d2fa25db9647fe5eb6f7c5a4c4" checksum = "0742ff5ff03ea7e67c8ae6c93cac239e0d9784833362da3f9a9c1da8dfefcbdc"
dependencies = [ dependencies = [
"winnow", "winnow",
] ]
[[package]] [[package]]
name = "tonic" name = "tonic"
version = "0.14.5" version = "0.14.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fec7c61a0695dc1887c1b53952990f3ad2e3a31453e1f49f10e75424943a93ec" checksum = "7f32a6f80051a4111560201420c7885d0082ba9efe2ab61875c587bb6b18b9a0"
dependencies = [ dependencies = [
"async-trait", "async-trait",
"axum", "axum",
@@ -2799,21 +2771,21 @@ dependencies = [
[[package]] [[package]]
name = "tonic-build" name = "tonic-build"
version = "0.14.5" version = "0.14.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1882ac3bf5ef12877d7ed57aad87e75154c11931c2ba7e6cde5e22d63522c734" checksum = "ce6d8958ed3be404120ca43ffa0fb1e1fc7be214e96c8d33bd43a131b6eebc9e"
dependencies = [ dependencies = [
"prettyplease", "prettyplease",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
] ]
[[package]] [[package]]
name = "tonic-prost" name = "tonic-prost"
version = "0.14.5" version = "0.14.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a55376a0bbaa4975a3f10d009ad763d8f4108f067c7c2e74f3001fb49778d309" checksum = "9f86539c0089bfd09b1f8c0ab0239d80392af74c21bc9e0f15e1b4aca4c1647f"
dependencies = [ dependencies = [
"bytes", "bytes",
"prost", "prost",
@@ -2822,16 +2794,16 @@ dependencies = [
[[package]] [[package]]
name = "tonic-prost-build" name = "tonic-prost-build"
version = "0.14.5" version = "0.14.4"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f3144df636917574672e93d0f56d7edec49f90305749c668df5101751bb8f95a" checksum = "65873ace111e90344b8973e94a1fc817c924473affff24629281f90daed1cd2e"
dependencies = [ dependencies = [
"prettyplease", "prettyplease",
"proc-macro2", "proc-macro2",
"prost-build", "prost-build",
"prost-types", "prost-types",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
"tempfile", "tempfile",
"tonic-build", "tonic-build",
] ]
@@ -2886,7 +2858,7 @@ checksum = "7490cfa5ec963746568740651ac6781f701c9c5ea257c58e057f3ba8cf69e8da"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
] ]
[[package]] [[package]]
@@ -2948,9 +2920,9 @@ checksum = "dbc4bc3a9f746d862c45cb89d705aa10f187bb96c76001afab07a0d35ce60142"
[[package]] [[package]]
name = "unicode-ident" name = "unicode-ident"
version = "1.0.24" version = "1.0.23"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e6e4313cd5fcd3dad5cafa179702e2b244f760991f45397d14d4ebf38247da75" checksum = "537dd038a89878be9b64dd4bd1b260315c1bb94f4d784956b81e27a088d9a09e"
[[package]] [[package]]
name = "unicode-linebreak" name = "unicode-linebreak"
@@ -3018,9 +2990,9 @@ checksum = "b6c140620e7ffbb22c2dee59cafe6084a59b5ffc27a8859a5f0d494b5d52b6be"
[[package]] [[package]]
name = "uuid" name = "uuid"
version = "1.22.0" version = "1.21.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a68d3c8f01c0cfa54a75291d83601161799e4a89a39e0929f4b0354d88757a37" checksum = "b672338555252d43fd2240c714dc444b8c6fb0a5c5335e65a07bba7742735ddb"
dependencies = [ dependencies = [
"js-sys", "js-sys",
"wasm-bindgen", "wasm-bindgen",
@@ -3079,9 +3051,9 @@ dependencies = [
[[package]] [[package]]
name = "wasm-bindgen" name = "wasm-bindgen"
version = "0.2.114" version = "0.2.108"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6532f9a5c1ece3798cb1c2cfdba640b9b3ba884f5db45973a6f442510a87d38e" checksum = "64024a30ec1e37399cf85a7ffefebdb72205ca1c972291c51512360d90bd8566"
dependencies = [ dependencies = [
"cfg-if", "cfg-if",
"once_cell", "once_cell",
@@ -3092,9 +3064,9 @@ dependencies = [
[[package]] [[package]]
name = "wasm-bindgen-macro" name = "wasm-bindgen-macro"
version = "0.2.114" version = "0.2.108"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "18a2d50fcf105fb33bb15f00e7a77b772945a2ee45dcf454961fd843e74c18e6" checksum = "008b239d9c740232e71bd39e8ef6429d27097518b6b30bdf9086833bd5b6d608"
dependencies = [ dependencies = [
"quote", "quote",
"wasm-bindgen-macro-support", "wasm-bindgen-macro-support",
@@ -3102,22 +3074,22 @@ dependencies = [
[[package]] [[package]]
name = "wasm-bindgen-macro-support" name = "wasm-bindgen-macro-support"
version = "0.2.114" version = "0.2.108"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "03ce4caeaac547cdf713d280eda22a730824dd11e6b8c3ca9e42247b25c631e3" checksum = "5256bae2d58f54820e6490f9839c49780dff84c65aeab9e772f15d5f0e913a55"
dependencies = [ dependencies = [
"bumpalo", "bumpalo",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
"wasm-bindgen-shared", "wasm-bindgen-shared",
] ]
[[package]] [[package]]
name = "wasm-bindgen-shared" name = "wasm-bindgen-shared"
version = "0.2.114" version = "0.2.108"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75a326b8c223ee17883a4251907455a2431acc2791c98c26279376490c378c16" checksum = "1f01b580c9ac74c8d8f0c0e4afb04eeef2acf145458e52c03845ee9cd23e3d12"
dependencies = [ dependencies = [
"unicode-ident", "unicode-ident",
] ]
@@ -3199,7 +3171,7 @@ checksum = "053e2e040ab57b9dc951b72c264860db7eb3b0200ba345b4e4c3b14f67855ddf"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
] ]
[[package]] [[package]]
@@ -3210,7 +3182,7 @@ checksum = "3f316c4a2570ba26bbec722032c4099d8c8bc095efccdc15688708623367e358"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
] ]
[[package]] [[package]]
@@ -3404,9 +3376,9 @@ checksum = "d6bbff5f0aada427a1e5a6da5f1f98158182f26556f345ac9e04d36d0ebed650"
[[package]] [[package]]
name = "winnow" name = "winnow"
version = "0.7.15" version = "0.7.14"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df79d97927682d2fd8adb29682d1140b343be4ac0f08fd68b7765d9c059d3945" checksum = "5a5364e9d77fcdeeaa6062ced926ee3381faa2ee02d3eb83a5c27a8825540829"
dependencies = [ dependencies = [
"memchr", "memchr",
] ]
@@ -3441,7 +3413,7 @@ dependencies = [
"heck", "heck",
"indexmap", "indexmap",
"prettyplease", "prettyplease",
"syn 2.0.117", "syn 2.0.115",
"wasm-metadata", "wasm-metadata",
"wit-bindgen-core", "wit-bindgen-core",
"wit-component", "wit-component",
@@ -3457,7 +3429,7 @@ dependencies = [
"prettyplease", "prettyplease",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
"wit-bindgen-core", "wit-bindgen-core",
"wit-bindgen-rust", "wit-bindgen-rust",
] ]
@@ -3563,7 +3535,7 @@ checksum = "b659052874eb698efe5b9e8cf382204678a0086ebf46982b79d6ca3182927e5d"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
"synstructure", "synstructure",
] ]
@@ -3584,7 +3556,7 @@ checksum = "d71e5d6e06ab090c67b5e44993ec16b72dcbaabc526db883a360057678b48502"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
"synstructure", "synstructure",
] ]
@@ -3605,7 +3577,7 @@ checksum = "85a5b4158499876c763cb03bc4e49185d3cccbabb15b33c627f7884f43db852e"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
] ]
[[package]] [[package]]
@@ -3638,7 +3610,7 @@ checksum = "eadce39539ca5cb3985590102671f2567e659fca9666581ad3411d59207951f3"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.117", "syn 2.0.115",
] ]
[[package]] [[package]]


@@ -23,7 +23,6 @@ async-trait = "0.1.89"
 futures = "0.3.31"
 tokio-stream = { version = "0.1.18", features = ["full"] }
 kameo = "0.19.2"
-prost-types = { version = "0.14.3", features = ["chrono"] }
 x25519-dalek = { version = "2.0.1", features = ["getrandom"] }
 rstest = "0.26.1"
 rustls-pki-types = "1.14.0"


@@ -9,7 +9,6 @@ license = "Apache-2.0"
 tonic.workspace = true
 tokio.workspace = true
 futures.workspace = true
-hex = "0.4.3"
 tonic-prost = "0.14.3"
 prost = "0.14.3"
 kameo.workspace = true
@@ -18,9 +17,9 @@ miette.workspace = true
 thiserror.workspace = true
 rustls-pki-types.workspace = true
 base64 = "0.22.1"
-prost-types.workspace = true
 tracing.workspace = true

 [build-dependencies]
 tonic-prost-build = "0.14.3"


@@ -11,8 +11,7 @@ fn main() -> Result<(), Box<dyn std::error::Error>> {
         .compile_protos(
             &[
                 format!("{}/arbiter.proto", PROTOBUF_DIR),
-                format!("{}/user_agent.proto", PROTOBUF_DIR),
-                format!("{}/client.proto", PROTOBUF_DIR),
+                format!("{}/auth.proto", PROTOBUF_DIR),
             ],
             &[PROTOBUF_DIR.to_string()],
         )


@@ -3,15 +3,13 @@ pub mod url;
 use base64::{Engine, prelude::BASE64_STANDARD};

+use crate::proto::auth::AuthChallenge;
+
 pub mod proto {
     tonic::include_proto!("arbiter");

-    pub mod user_agent {
-        tonic::include_proto!("arbiter.user_agent");
-    }
-
-    pub mod client {
-        tonic::include_proto!("arbiter.client");
+    pub mod auth {
+        tonic::include_proto!("arbiter.auth");
     }
 }
@@ -30,7 +28,7 @@ pub fn home_path() -> Result<std::path::PathBuf, std::io::Error> {
     Ok(arbiter_home)
 }

-pub fn format_challenge(nonce: i32, pubkey: &[u8]) -> Vec<u8> {
-    let concat_form = format!("{}:{}", nonce, BASE64_STANDARD.encode(pubkey));
-    concat_form.into_bytes()
+pub fn format_challenge(challenge: &AuthChallenge) -> Vec<u8> {
+    let concat_form = format!("{}:{}", challenge.nonce, BASE64_STANDARD.encode(&challenge.pubkey));
+    concat_form.into_bytes().to_vec()
 }


@@ -1,293 +1,371 @@
//! Transport-facing abstractions for protocol/session code. //! Transport abstraction layer for bridging gRPC bidirectional streaming with kameo actors.
//! //!
//! This module separates three concerns: //! This module provides a clean separation between the gRPC transport layer and business logic
//! by modeling the connection as two linked kameo actors:
//! //!
//! - protocol/session logic wants a small duplex interface ([`Bi`]) //! - A **transport actor** ([`GrpcTransportActor`]) that owns the gRPC stream and channel,
//! - transport adapters push concrete stream items to an underlying IO layer //! forwarding inbound messages to the business actor and outbound messages to the client.
//! - transport boundaries translate between protocol-facing and transport-facing //! - A **business logic actor** that receives inbound messages from the transport actor and
//! item types via direction-specific converters //! sends outbound messages back through the transport actor.
//! //!
//! [`Bi`] is intentionally minimal and transport-agnostic: //! The [`wire()`] function sets up bidirectional linking between the two actors, ensuring
//! - [`Bi::recv`] yields inbound protocol messages //! that if either actor dies, the other is notified and can shut down gracefully.
//! - [`Bi::send`] accepts outbound protocol/domain items
//! //!
//! # Generic Ordering Rule //! # Terminology
//! //!
//! This module uses a single convention consistently: when a type or trait is //! - **InboundMessage**: a message received by the transport actor from the channel/socket
//! parameterized by protocol message directions, the generic parameters are //! and forwarded to the business actor.
//! declared as `Inbound` first, then `Outbound`. //! - **OutboundMessage**: a message produced by the business actor and sent to the transport
//! actor to be forwarded to the channel/socket.
//! //!
//! For [`Bi`], that means `Bi<Inbound, Outbound>`: //! # Architecture
//! - `recv() -> Option<Inbound>`
//! - `send(Outbound)`
//!
//! For adapter types that are parameterized by direction-specific converters,
//! inbound-related converter parameters are declared before outbound-related
//! converter parameters.
//!
//! [`RecvConverter`] and [`SendConverter`] are infallible conversion traits used
//! by adapters to map between protocol-facing and transport-facing item types.
//! The traits themselves are not result-aware; adapters decide how transport
//! errors are handled before (or instead of) conversion.
//!
//! [`grpc::GrpcAdapter`] combines:
//! - a tonic inbound stream
//! - a Tokio sender for outbound transport items
//! - a [`RecvConverter`] for the receive path
//! - a [`SendConverter`] for the send path
//!
//! [`DummyTransport`] is a no-op implementation useful for tests and local actor
//! execution where no real network stream exists.
//!
//! # Component Interaction
//! //!
//! ```text //! ```text
//! inbound (network -> protocol) //! gRPC Stream ──InboundMessage──▶ GrpcTransportActor ──tell(InboundMessage)──▶ BusinessActor
//! ============================ //! ▲ │
//! //! └─tell(Result<OutboundMessage, _>)────┘
//! tonic::Streaming<RecvTransport> //!
//! -> grpc::GrpcAdapter::recv() //! mpsc::Sender ──▶ Client
//! |
//! +--> on `Ok(item)`: RecvConverter::convert(RecvTransport) -> Inbound
//! +--> on `Err(status)`: log error and close stream (`None`)
//! -> Bi::recv()
//! -> protocol/session actor
//!
//! outbound (protocol -> network)
//! ==============================
//!
//! protocol/session actor
//! -> Bi::send(Outbound)
//! -> grpc::GrpcAdapter::send()
//! |
//! +--> SendConverter::convert(Outbound) -> SendTransport
//! -> Tokio mpsc::Sender<SendTransport>
//! -> tonic response stream
//! ``` //! ```
//! //!
//! # Design Notes //! # Example
//! //!
//! - `send()` returns [`Error`] only for transport delivery failures (for //! ```rust,ignore
//! example, when the outbound channel is closed). //! let (tx, rx) = mpsc::channel(1000);
//! - [`grpc::GrpcAdapter`] logs tonic receive errors and treats them as stream //! let context = server_context.clone();
//! closure (`None`). //!
//! - When protocol-facing and transport-facing types are identical, use //! wire(
//! [`IdentityRecvConverter`] / [`IdentitySendConverter`]. //! |transport_ref| MyBusinessActor::new(context, transport_ref),
//! |business_recipient, business_id| GrpcTransportActor {
//! sender: tx,
//! receiver: grpc_stream,
//! business_logic_actor: business_recipient,
//! business_logic_actor_id: business_id,
//! },
//! ).await;
//!
//! Ok(Response::new(ReceiverStream::new(rx)))
//! ```
use std::marker::PhantomData; use futures::{Stream, StreamExt};
use kameo::{
Actor,
actor::{ActorRef, PreparedActor, Recipient, Spawn, WeakActorRef},
mailbox::Signal,
prelude::Message,
};
use tokio::{
select,
sync::mpsc::{self, error::SendError},
};
use tonic::{Status, Streaming};
use tracing::{debug, error};
/// Errors returned by transport adapters implementing [`Bi`]. /// A bidirectional stream abstraction for sans-io testing.
pub enum Error {
/// The outbound side of the transport is no longer accepting messages.
ChannelClosed,
}
/// Minimal bidirectional transport abstraction used by protocol code.
/// ///
/// `Bi<Inbound, Outbound>` models a duplex channel with: /// Combines a [`Stream`] of incoming messages with the ability to [`send`](Bi::send)
/// - inbound items of type `Inbound` read via [`Bi::recv`] /// outgoing responses. This trait allows business logic to be tested without a real
/// - outbound items of type `Outbound` written via [`Bi::send`] /// gRPC connection by swapping in an in-memory implementation.
pub trait Bi<Inbound, Outbound>: Send + Sync + 'static { ///
/// # Type Parameters
/// - `T`: `InboundMessage` received from the channel/socket (e.g., `UserAgentRequest`)
/// - `U`: `OutboundMessage` sent to the channel/socket (e.g., `UserAgentResponse`)
pub trait Bi<T, U>: Stream<Item = Result<T, Status>> + Send + Sync + 'static {
type Error;
fn send( fn send(
&mut self, &mut self,
item: Outbound, item: Result<U, Status>,
) -> impl std::future::Future<Output = Result<(), Error>> + Send; ) -> impl std::future::Future<Output = Result<(), Self::Error>> + Send;
fn recv(&mut self) -> impl std::future::Future<Output = Option<Inbound>> + Send;
} }
/// Converts transport-facing inbound items into protocol-facing inbound items. /// Concrete [`Bi`] implementation backed by a tonic gRPC [`Streaming`] and an [`mpsc::Sender`].
pub trait RecvConverter: Send + Sync + 'static { ///
type Input; /// This is the production implementation used in gRPC service handlers. The `request_stream`
type Output; /// receives messages from the client, and `response_sender` sends responses back.
pub struct BiStream<T, U> {
fn convert(&self, item: Self::Input) -> Self::Output; pub request_stream: Streaming<T>,
pub response_sender: mpsc::Sender<Result<U, Status>>,
} }
/// Converts protocol/domain outbound items into transport-facing outbound items. impl<T, U> Stream for BiStream<T, U>
pub trait SendConverter: Send + Sync + 'static {
type Input;
type Output;
fn convert(&self, item: Self::Input) -> Self::Output;
}
/// A [`RecvConverter`] that forwards values unchanged.
pub struct IdentityRecvConverter<T> {
_marker: PhantomData<T>,
}
impl<T> IdentityRecvConverter<T> {
pub fn new() -> Self {
Self {
_marker: PhantomData,
}
}
}
impl<T> Default for IdentityRecvConverter<T> {
fn default() -> Self {
Self::new()
}
}
impl<T> RecvConverter for IdentityRecvConverter<T>
where where
T: Send + Sync + 'static, T: Send + 'static,
U: Send + 'static,
{ {
type Input = T; type Item = Result<T, Status>;
type Output = T;
fn convert(&self, item: Self::Input) -> Self::Output { fn poll_next(
item mut self: std::pin::Pin<&mut Self>,
cx: &mut std::task::Context<'_>,
) -> std::task::Poll<Option<Self::Item>> {
self.request_stream.poll_next_unpin(cx)
} }
} }
/// A [`SendConverter`] that forwards values unchanged. impl<T, U> Bi<T, U> for BiStream<T, U>
pub struct IdentitySendConverter<T> {
_marker: PhantomData<T>,
}
impl<T> IdentitySendConverter<T> {
pub fn new() -> Self {
Self {
_marker: PhantomData,
}
}
}
impl<T> Default for IdentitySendConverter<T> {
fn default() -> Self {
Self::new()
}
}
impl<T> SendConverter for IdentitySendConverter<T>
where where
T: Send + Sync + 'static, T: Send + 'static,
U: Send + 'static,
{ {
type Input = T; type Error = SendError<Result<U, Status>>;
type Output = T;
fn convert(&self, item: Self::Input) -> Self::Output { async fn send(&mut self, item: Result<U, Status>) -> Result<(), Self::Error> {
item self.response_sender.send(item).await
} }
} }
/// gRPC-specific transport adapters and helpers. /// Marker trait for transport actors that can receive outbound messages of type `T`.
pub mod grpc { ///
use futures::StreamExt; /// Implement this on your transport actor to indicate it can handle outbound messages
use tokio::sync::mpsc; /// produced by the business actor. Requires the actor to implement [`Message<Result<T, E>>`]
use tonic::Streaming; /// so business logic can forward responses via [`tell()`](ActorRef::tell).
///
/// # Example
///
/// ```rust,ignore
/// #[derive(Actor)]
/// struct MyTransportActor { /* ... */ }
///
/// impl Message<Result<MyResponse, MyError>> for MyTransportActor {
/// type Reply = ();
/// async fn handle(&mut self, msg: Result<MyResponse, MyError>, _ctx: &mut Context<Self, Self::Reply>) -> Self::Reply {
/// // forward outbound message to channel/socket
/// }
/// }
///
/// impl TransportActor<MyResponse, MyError> for MyTransportActor {}
/// ```
pub trait TransportActor<Outbound: Send + 'static, DomainError: Send + 'static>:
Actor + Send + Message<Result<Outbound, DomainError>>
{
}
use super::{Bi, Error, RecvConverter, SendConverter}; /// A kameo actor that bridges a gRPC bidirectional stream with a business logic actor.
///
/// This actor owns the gRPC [`Streaming`] receiver and an [`mpsc::Sender`] for responses.
/// It multiplexes between its own mailbox (for outbound messages from the business actor)
/// and the gRPC stream (for inbound client messages) using [`tokio::select!`].
///
/// # Message Flow
///
/// - **Inbound**: Messages from the gRPC stream are forwarded to `business_logic_actor`
/// via [`tell()`](Recipient::tell).
/// - **Outbound**: The business actor sends `Result<Outbound, DomainError>` messages to this
/// actor, which forwards them through the `sender` channel to the gRPC response stream.
///
/// # Lifecycle
///
/// - If the business logic actor dies (detected via actor linking), this actor stops,
/// which closes the gRPC stream.
/// - If the gRPC stream closes or errors, this actor stops, which (via linking) notifies
/// the business actor.
/// - Error responses (`Err(DomainError)`) are forwarded to the client and then the actor stops,
/// closing the connection.
///
/// # Type Parameters
/// - `Outbound`: `OutboundMessage` sent to the client (e.g., `UserAgentResponse`)
/// - `Inbound`: `InboundMessage` received from the client (e.g., `UserAgentRequest`)
/// - `E`: The domain error type, must implement `Into<tonic::Status>` for gRPC conversion
pub struct GrpcTransportActor<Outbound, Inbound, DomainError>
where
Outbound: Send + 'static,
Inbound: Send + 'static,
DomainError: Into<tonic::Status> + Send + 'static,
{
sender: mpsc::Sender<Result<Outbound, tonic::Status>>,
receiver: tonic::Streaming<Inbound>,
business_logic_actor: Recipient<Inbound>,
_error: std::marker::PhantomData<DomainError>,
}
/// [`Bi`] adapter backed by a tonic gRPC bidirectional stream. impl<Outbound, Inbound, DomainError> GrpcTransportActor<Outbound, Inbound, DomainError>
/// where
Outbound: Send + 'static,
/// Tonic receive errors are logged and treated as stream closure (`None`). Inbound: Send + 'static,
/// The receive converter is only invoked for successful inbound transport DomainError: Into<tonic::Status> + Send + 'static,
/// items. {
pub struct GrpcAdapter<InboundConverter, OutboundConverter>
where
InboundConverter: RecvConverter,
OutboundConverter: SendConverter,
{
sender: mpsc::Sender<OutboundConverter::Output>,
receiver: Streaming<InboundConverter::Input>,
inbound_converter: InboundConverter,
outbound_converter: OutboundConverter,
}
impl<InboundTransport, Inbound, InboundConverter, OutboundConverter>
GrpcAdapter<InboundConverter, OutboundConverter>
where
InboundConverter: RecvConverter<Input = InboundTransport, Output = Inbound>,
OutboundConverter: SendConverter,
{
pub fn new( pub fn new(
sender: mpsc::Sender<OutboundConverter::Output>, sender: mpsc::Sender<Result<Outbound, tonic::Status>>,
receiver: Streaming<InboundTransport>, receiver: tonic::Streaming<Inbound>,
inbound_converter: InboundConverter, business_logic_actor: Recipient<Inbound>,
outbound_converter: OutboundConverter,
) -> Self { ) -> Self {
Self { Self {
sender, sender,
receiver, receiver,
inbound_converter, business_logic_actor,
outbound_converter, _error: std::marker::PhantomData,
}
}
}
impl< InboundConverter, OutboundConverter> Bi<InboundConverter::Output, OutboundConverter::Input>
for GrpcAdapter<InboundConverter, OutboundConverter>
where
InboundConverter: RecvConverter,
OutboundConverter: SendConverter,
OutboundConverter::Input: Send + 'static,
OutboundConverter::Output: Send + 'static,
{
#[tracing::instrument(level = "trace", skip(self, item))]
async fn send(&mut self, item: OutboundConverter::Input) -> Result<(), Error> {
let outbound = self.outbound_converter.convert(item);
self.sender
.send(outbound)
.await
.map_err(|_| Error::ChannelClosed)
}
#[tracing::instrument(level = "trace", skip(self))]
async fn recv(&mut self) -> Option<InboundConverter::Output> {
match self.receiver.next().await {
Some(Ok(item)) => Some(self.inbound_converter.convert(item)),
Some(Err(error)) => {
tracing::error!(error = ?error, "grpc transport recv failed; closing stream");
None
}
None => None,
}
}
}
}
/// No-op [`Bi`] transport for tests and manual actor usage.
///
/// `send` drops all items and succeeds. [`Bi::recv`] never resolves and therefore
/// does not busy-wait or spuriously close the stream.
pub struct DummyTransport<Inbound, Outbound> {
_marker: PhantomData<(Inbound, Outbound)>,
}
impl<Inbound, Outbound> DummyTransport<Inbound, Outbound> {
pub fn new() -> Self {
Self {
_marker: PhantomData,
} }
} }
} }
impl<Inbound, Outbound> Default for DummyTransport<Inbound, Outbound> { impl<Outbound, Inbound, E> Actor for GrpcTransportActor<Outbound, Inbound, E>
fn default() -> Self {
Self::new()
}
}
impl<Inbound, Outbound> Bi<Inbound, Outbound> for DummyTransport<Inbound, Outbound>
where where
Inbound: Send + Sync + 'static, Outbound: Send + 'static,
Outbound: Send + Sync + 'static, Inbound: Send + 'static,
E: Into<tonic::Status> + Send + 'static,
{ {
async fn send(&mut self, _item: Outbound) -> Result<(), Error> { type Args = Self;
Ok(())
type Error = ();
async fn on_start(args: Self::Args, _: ActorRef<Self>) -> Result<Self, Self::Error> {
Ok(args)
} }
fn recv(&mut self) -> impl std::future::Future<Output = Option<Inbound>> + Send { fn on_link_died(
async { &mut self,
std::future::pending::<()>().await; _: WeakActorRef<Self>,
None id: kameo::prelude::ActorId,
_: kameo::prelude::ActorStopReason,
) -> impl Future<
Output = Result<std::ops::ControlFlow<kameo::prelude::ActorStopReason>, Self::Error>,
> + Send {
async move {
if id == self.business_logic_actor.id() {
error!("Business logic actor died, stopping GrpcTransportActor");
Ok(std::ops::ControlFlow::Break(
kameo::prelude::ActorStopReason::Normal,
))
} else {
debug!(
"Linked actor {} died, but it's not the business logic actor, ignoring",
id
);
Ok(std::ops::ControlFlow::Continue(()))
}
}
}
async fn next(
&mut self,
_: WeakActorRef<Self>,
mailbox_rx: &mut kameo::prelude::MailboxReceiver<Self>,
) -> Option<kameo::mailbox::Signal<Self>> {
select! {
msg = mailbox_rx.recv() => {
msg
}
recv_msg = self.receiver.next() => {
match recv_msg {
Some(Ok(msg)) => {
match self.business_logic_actor.tell(msg).await {
Ok(_) => None,
Err(e) => {
// TODO: this would probably require better error handling - or resending if backpressure is the issue
error!("Failed to send message to business logic actor: {}", e);
Some(Signal::Stop)
}
}
}
Some(Err(e)) => {
error!("Received error from stream: {}, stopping GrpcTransportActor", e);
Some(Signal::Stop)
}
None => {
error!("Receiver channel closed, stopping GrpcTransportActor");
Some(Signal::Stop)
}
}
}
} }
} }
} }
impl<Outbound, Inbound, E> Message<Result<Outbound, E>> for GrpcTransportActor<Outbound, Inbound, E>
where
Outbound: Send + 'static,
Inbound: Send + 'static,
E: Into<tonic::Status> + Send + 'static,
{
type Reply = ();
async fn handle(
&mut self,
msg: Result<Outbound, E>,
ctx: &mut kameo::prelude::Context<Self, Self::Reply>,
) -> Self::Reply {
let is_err = msg.is_err();
let grpc_msg = msg.map_err(Into::into);
match self.sender.send(grpc_msg).await {
Ok(_) => {
if is_err {
ctx.stop();
}
}
Err(e) => {
error!("Failed to send message: {}", e);
ctx.stop();
}
}
}
}
impl<Outbound, Inbound, E> TransportActor<Outbound, E> for GrpcTransportActor<Outbound, Inbound, E>
where
Outbound: Send + 'static,
Inbound: Send + 'static,
E: Into<tonic::Status> + Send + 'static,
{
}
/// Wires together a transport actor and a business logic actor with bidirectional linking.
///
/// This function handles the chicken-and-egg problem of two actors that need references
/// to each other at construction time. It uses kameo's [`PreparedActor`] to obtain
/// [`ActorRef`]s before spawning, then links both actors so that if either dies,
/// the other is notified via [`on_link_died`](Actor::on_link_died).
///
/// The business actor receives a type-erased [`Recipient<Result<Outbound, DomainError>>`] instead of an
/// `ActorRef<Transport>`, keeping it decoupled from the concrete transport implementation.
///
/// # Type Parameters
/// - `Transport`: The transport actor type (e.g., [`GrpcTransportActor`])
/// - `Inbound`: `InboundMessage` received by the business actor from the transport
/// - `Outbound`: `OutboundMessage` sent by the business actor back to the transport
/// - `Business`: The business logic actor
/// - `BusinessCtor`: Closure that receives a prepared business actor and transport recipient,
/// spawns the business actor, and returns its [`ActorRef`]
/// - `TransportCtor`: Closure that receives a prepared transport actor, a recipient for
/// inbound messages, and the business actor id, then spawns the transport actor
///
/// # Returns
/// A tuple of `(transport_ref, business_ref)` — actor references for both spawned actors.
pub async fn wire<
Transport,
Inbound,
Outbound,
DomainError,
Business,
BusinessCtor,
TransportCtor,
>(
business_ctor: BusinessCtor,
transport_ctor: TransportCtor,
) -> (ActorRef<Transport>, ActorRef<Business>)
where
Transport: TransportActor<Outbound, DomainError>,
Inbound: Send + 'static,
Outbound: Send + 'static,
DomainError: Send + 'static,
Business: Actor + Message<Inbound> + Send + 'static,
BusinessCtor: FnOnce(PreparedActor<Business>, Recipient<Result<Outbound, DomainError>>),
TransportCtor: FnOnce(PreparedActor<Transport>, Recipient<Inbound>),
{
let prepared_business: PreparedActor<Business> = Spawn::prepare();
let prepared_transport: PreparedActor<Transport> = Spawn::prepare();
let business_ref = prepared_business.actor_ref().clone();
let transport_ref = prepared_transport.actor_ref().clone();
transport_ref.link(&business_ref).await;
business_ref.link(&transport_ref).await;
let recipient = transport_ref.clone().recipient();
business_ctor(prepared_business, recipient);
let business_recipient = business_ref.clone().recipient();
transport_ctor(prepared_transport, business_recipient);
(transport_ref, business_ref)
}
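The two-phase pattern described above can be sketched with plain std channels as a stand-in for kameo's `PreparedActor`/`ActorRef` (this is not the kameo API itself, just an illustration of the wiring order): a handle exists before the receive loop starts, so both sides can be handed each other's handle at construction time.

```rust
use std::sync::mpsc::{channel, Receiver, Sender};
use std::thread;

// Hypothetical stand-in for a prepared actor: the handle (Sender) is
// available before any worker loop runs, resolving the chicken-and-egg
// problem of two parties that need each other's handle up front.
struct Prepared<T> {
    tx: Sender<T>,
    rx: Receiver<T>,
}

fn prepare<T>() -> Prepared<T> {
    let (tx, rx) = channel();
    Prepared { tx, rx }
}

fn wire_demo() -> i32 {
    // Phase 1: obtain both handles before either loop is started.
    let transport: Prepared<i32> = prepare();
    let business: Prepared<i32> = prepare();

    // Phase 2: give the business side the transport's handle, then start it.
    let to_transport = transport.tx.clone();
    let business_rx = business.rx;
    let worker = thread::spawn(move || {
        // Business loop: doubles each inbound value and replies outbound.
        for msg in business_rx.iter().take(1) {
            to_transport.send(msg * 2).unwrap();
        }
    });

    // The transport side delivers one inbound message and reads the reply.
    business.tx.send(21).unwrap();
    worker.join().unwrap();
    transport.rx.recv().unwrap()
}

fn main() {
    println!("{}", wire_demo());
}
```

In kameo the same ordering is achieved with `Spawn::prepare()` plus `link`, but the essential point is identical: handles first, loops second.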

Binary file not shown.

View File

@@ -0,0 +1,12 @@
use arbiter_proto::{
proto::{ClientRequest, ClientResponse},
transport::Bi,
};
use crate::ServerContext;
pub(crate) async fn handle_client(
_context: ServerContext,
_bistream: impl Bi<ClientRequest, ClientResponse>,
) {
}

View File

@@ -1,289 +0,0 @@
use arbiter_proto::{
proto::client::{
AuthChallenge, AuthChallengeRequest, AuthChallengeSolution, AuthOk, ClientRequest,
ClientResponse,
client_request::Payload as ClientRequestPayload,
client_response::Payload as ClientResponsePayload,
},
transport::{Bi, DummyTransport},
};
use diesel::{ExpressionMethods as _, OptionalExtension as _, QueryDsl, dsl::update};
use diesel_async::RunQueryDsl;
use ed25519_dalek::VerifyingKey;
use kameo::Actor;
use tokio::select;
use tracing::{error, info};
use crate::{
ServerContext,
actors::client::state::{
ChallengeContext, ClientEvents, ClientStateMachine, ClientStates, DummyContext,
},
db::{self, schema},
};
mod state;
#[derive(Debug, Clone, PartialEq, Eq, thiserror::Error)]
pub enum ClientError {
#[error("Expected message with payload")]
MissingRequestPayload,
#[error("Unexpected request payload")]
UnexpectedRequestPayload,
#[error("Invalid state for challenge solution")]
InvalidStateForChallengeSolution,
#[error("Expected pubkey to have specific length")]
InvalidAuthPubkeyLength,
#[error("Failed to convert pubkey to VerifyingKey")]
InvalidAuthPubkeyEncoding,
#[error("Invalid signature length")]
InvalidSignatureLength,
#[error("Public key not registered")]
PublicKeyNotRegistered,
#[error("Invalid challenge solution")]
InvalidChallengeSolution,
#[error("State machine error")]
StateTransitionFailed,
#[error("Database pool error")]
DatabasePoolUnavailable,
#[error("Database error")]
DatabaseOperationFailed,
}
pub struct ClientActor<Transport>
where
Transport: Bi<ClientRequest, Result<ClientResponse, ClientError>>,
{
db: db::DatabasePool,
state: ClientStateMachine<DummyContext>,
transport: Transport,
}
impl<Transport> ClientActor<Transport>
where
Transport: Bi<ClientRequest, Result<ClientResponse, ClientError>>,
{
pub(crate) fn new(context: ServerContext, transport: Transport) -> Self {
Self {
db: context.db.clone(),
state: ClientStateMachine::new(DummyContext),
transport,
}
}
fn transition(&mut self, event: ClientEvents) -> Result<(), ClientError> {
self.state.process_event(event).map_err(|e| {
error!(?e, "State transition failed");
ClientError::StateTransitionFailed
})?;
Ok(())
}
pub async fn process_transport_inbound(&mut self, req: ClientRequest) -> Output {
let msg = req.payload.ok_or_else(|| {
error!(actor = "client", "Received message with no payload");
ClientError::MissingRequestPayload
})?;
match msg {
ClientRequestPayload::AuthChallengeRequest(req) => {
self.handle_auth_challenge_request(req).await
}
ClientRequestPayload::AuthChallengeSolution(solution) => {
self.handle_auth_challenge_solution(solution).await
}
}
}
async fn handle_auth_challenge_request(&mut self, req: AuthChallengeRequest) -> Output {
let pubkey = req
.pubkey
.as_array()
.ok_or(ClientError::InvalidAuthPubkeyLength)?;
let pubkey = VerifyingKey::from_bytes(pubkey).map_err(|_err| {
error!(?pubkey, "Failed to convert to VerifyingKey");
ClientError::InvalidAuthPubkeyEncoding
})?;
self.transition(ClientEvents::AuthRequest)?;
self.auth_with_challenge(pubkey, req.pubkey).await
}
async fn auth_with_challenge(&mut self, pubkey: VerifyingKey, pubkey_bytes: Vec<u8>) -> Output {
let nonce: Option<i32> = {
let mut db_conn = self.db.get().await.map_err(|e| {
error!(error = ?e, "Database pool error");
ClientError::DatabasePoolUnavailable
})?;
db_conn
.exclusive_transaction(|conn| {
Box::pin(async move {
let current_nonce = schema::program_client::table
.filter(
schema::program_client::public_key.eq(pubkey.as_bytes().to_vec()),
)
.select(schema::program_client::nonce)
.first::<i32>(conn)
.await?;
update(schema::program_client::table)
.filter(
schema::program_client::public_key.eq(pubkey.as_bytes().to_vec()),
)
.set(schema::program_client::nonce.eq(current_nonce + 1))
.execute(conn)
.await?;
Result::<_, diesel::result::Error>::Ok(current_nonce)
})
})
.await
.optional()
.map_err(|e| {
error!(error = ?e, "Database error");
ClientError::DatabaseOperationFailed
})?
};
let Some(nonce) = nonce else {
error!(?pubkey, "Public key not found in database");
return Err(ClientError::PublicKeyNotRegistered);
};
let challenge = AuthChallenge {
pubkey: pubkey_bytes,
nonce,
};
self.transition(ClientEvents::SentChallenge(ChallengeContext {
challenge: challenge.clone(),
key: pubkey,
}))?;
info!(
?pubkey,
?challenge,
"Sent authentication challenge to client"
);
Ok(response(ClientResponsePayload::AuthChallenge(challenge)))
}
fn verify_challenge_solution(
&self,
solution: &AuthChallengeSolution,
) -> Result<(bool, &ChallengeContext), ClientError> {
let ClientStates::WaitingForChallengeSolution(challenge_context) = self.state.state()
else {
error!("Received challenge solution in invalid state");
return Err(ClientError::InvalidStateForChallengeSolution);
};
let formatted_challenge = arbiter_proto::format_challenge(
challenge_context.challenge.nonce,
&challenge_context.challenge.pubkey,
);
let signature = solution.signature.as_slice().try_into().map_err(|_| {
error!(?solution, "Invalid signature length");
ClientError::InvalidSignatureLength
})?;
let valid = challenge_context
.key
.verify_strict(&formatted_challenge, &signature)
.is_ok();
Ok((valid, challenge_context))
}
async fn handle_auth_challenge_solution(
&mut self,
solution: AuthChallengeSolution,
) -> Output {
let (valid, challenge_context) = self.verify_challenge_solution(&solution)?;
if valid {
info!(
?challenge_context,
"Client provided valid solution to authentication challenge"
);
self.transition(ClientEvents::ReceivedGoodSolution)?;
Ok(response(ClientResponsePayload::AuthOk(AuthOk {})))
} else {
error!("Client provided invalid solution to authentication challenge");
self.transition(ClientEvents::ReceivedBadSolution)?;
Err(ClientError::InvalidChallengeSolution)
}
}
}
type Output = Result<ClientResponse, ClientError>;
fn response(payload: ClientResponsePayload) -> ClientResponse {
ClientResponse {
payload: Some(payload),
}
}
impl<Transport> Actor for ClientActor<Transport>
where
Transport: Bi<ClientRequest, Result<ClientResponse, ClientError>>,
{
type Args = Self;
type Error = ();
async fn on_start(
args: Self::Args,
_: kameo::prelude::ActorRef<Self>,
) -> Result<Self, Self::Error> {
Ok(args)
}
async fn next(
&mut self,
_actor_ref: kameo::prelude::WeakActorRef<Self>,
mailbox_rx: &mut kameo::prelude::MailboxReceiver<Self>,
) -> Option<kameo::mailbox::Signal<Self>> {
loop {
select! {
signal = mailbox_rx.recv() => {
return signal;
}
msg = self.transport.recv() => {
match msg {
Some(request) => {
match self.process_transport_inbound(request).await {
Ok(resp) => {
if self.transport.send(Ok(resp)).await.is_err() {
error!(actor = "client", reason = "channel closed", "send.failed");
return Some(kameo::mailbox::Signal::Stop);
}
}
Err(err) => {
let _ = self.transport.send(Err(err)).await;
return Some(kameo::mailbox::Signal::Stop);
}
}
}
None => {
info!(actor = "client", "transport.closed");
return Some(kameo::mailbox::Signal::Stop);
}
}
}
}
}
}
}
impl ClientActor<DummyTransport<ClientRequest, Result<ClientResponse, ClientError>>> {
pub fn new_manual(db: db::DatabasePool) -> Self {
Self {
db,
state: ClientStateMachine::new(DummyContext),
transport: DummyTransport::new(),
}
}
}

View File

@@ -1,31 +0,0 @@
use arbiter_proto::proto::client::AuthChallenge;
use ed25519_dalek::VerifyingKey;
/// Context for state machine with validated key and sent challenge
#[derive(Clone, Debug)]
pub struct ChallengeContext {
pub challenge: AuthChallenge,
pub key: VerifyingKey,
}
smlang::statemachine!(
name: Client,
custom_error: false,
transitions: {
*Init + AuthRequest = ReceivedAuthRequest,
ReceivedAuthRequest + SentChallenge(ChallengeContext) / move_challenge = WaitingForChallengeSolution(ChallengeContext),
WaitingForChallengeSolution(ChallengeContext) + ReceivedGoodSolution = Idle,
WaitingForChallengeSolution(ChallengeContext) + ReceivedBadSolution = AuthError,
}
);
pub struct DummyContext;
impl ClientStateMachineContext for DummyContext {
#[allow(missing_docs)]
#[allow(clippy::unused_unit)]
fn move_challenge(&mut self, event_data: ChallengeContext) -> Result<ChallengeContext, ()> {
Ok(event_data)
}
}

View File

@@ -0,0 +1,57 @@
use tonic::Status;
use crate::db;
#[derive(Debug, thiserror::Error)]
pub enum UserAgentError {
#[error("Missing payload in request")]
MissingPayload,
#[error("Invalid bootstrap token")]
InvalidBootstrapToken,
#[error("Public key not registered")]
PubkeyNotRegistered,
#[error("Invalid public key format")]
InvalidPubkey,
#[error("Invalid signature length")]
InvalidSignatureLength,
#[error("Invalid challenge solution")]
InvalidChallengeSolution,
#[error("Invalid state for operation")]
InvalidState,
#[error("Actor unavailable")]
ActorUnavailable,
#[error("Database error")]
Database(#[from] diesel::result::Error),
#[error("Database pool error")]
DatabasePool(#[from] db::PoolError),
}
impl From<UserAgentError> for Status {
fn from(err: UserAgentError) -> Self {
match err {
UserAgentError::MissingPayload
| UserAgentError::InvalidBootstrapToken
| UserAgentError::InvalidPubkey
| UserAgentError::InvalidSignatureLength => Status::invalid_argument(err.to_string()),
UserAgentError::PubkeyNotRegistered | UserAgentError::InvalidChallengeSolution => {
Status::unauthenticated(err.to_string())
}
UserAgentError::InvalidState => Status::failed_precondition(err.to_string()),
UserAgentError::ActorUnavailable
| UserAgentError::Database(_)
| UserAgentError::DatabasePool(_) => Status::internal(err.to_string()),
}
}
}

View File

@@ -1,21 +1,23 @@
 use std::{ops::DerefMut, sync::Mutex};
-use arbiter_proto::{
-    proto::user_agent::{
-        AuthChallenge, AuthChallengeRequest, AuthChallengeSolution, AuthOk, UnsealEncryptedKey,
-        UnsealResult, UnsealStart, UnsealStartResponse, UserAgentRequest, UserAgentResponse,
+use arbiter_proto::proto::{
+    UnsealEncryptedKey, UnsealResult, UnsealStart, UnsealStartResponse, UserAgentRequest,
+    UserAgentResponse,
+    auth::{
+        self, AuthChallengeRequest, AuthOk, ClientMessage as ClientAuthMessage,
+        ServerMessage as AuthServerMessage,
+        client_message::Payload as ClientAuthPayload,
+        server_message::Payload as ServerAuthPayload,
+    },
     user_agent_request::Payload as UserAgentRequestPayload,
     user_agent_response::Payload as UserAgentResponsePayload,
-    },
-    transport::{Bi, DummyTransport},
 };
 use chacha20poly1305::{AeadInPlace, XChaCha20Poly1305, XNonce, aead::KeyInit};
 use diesel::{ExpressionMethods as _, OptionalExtension as _, QueryDsl, dsl::update};
 use diesel_async::RunQueryDsl;
 use ed25519_dalek::VerifyingKey;
-use kameo::{Actor, error::SendError};
+use kameo::{Actor, actor::Recipient, error::SendError, messages, prelude::Message};
 use memsafe::MemSafe;
-use tokio::select;
 use tracing::{error, info};
 use x25519_dalek::{EphemeralSecret, PublicKey};
@@ -33,59 +35,24 @@ use crate::{
     db::{self, schema},
 };
+mod error;
 mod state;
-#[derive(Debug, Clone, PartialEq, Eq, thiserror::Error)]
-pub enum UserAgentError {
-    #[error("Expected message with payload")]
-    MissingRequestPayload,
-    #[error("Expected message with payload")]
-    UnexpectedRequestPayload,
-    #[error("Invalid state for challenge solution")]
-    InvalidStateForChallengeSolution,
-    #[error("Invalid state for unseal encrypted key")]
-    InvalidStateForUnsealEncryptedKey,
-    #[error("client_pubkey must be 32 bytes")]
-    InvalidClientPubkeyLength,
-    #[error("Expected pubkey to have specific length")]
-    InvalidAuthPubkeyLength,
-    #[error("Failed to convert pubkey to VerifyingKey")]
-    InvalidAuthPubkeyEncoding,
-    #[error("Invalid signature length")]
-    InvalidSignatureLength,
-    #[error("Invalid bootstrap token")]
-    InvalidBootstrapToken,
-    #[error("Public key not registered")]
-    PublicKeyNotRegistered,
-    #[error("Invalid challenge solution")]
-    InvalidChallengeSolution,
-    #[error("State machine error")]
-    StateTransitionFailed,
-    #[error("Bootstrap token consumption failed")]
-    BootstrapperActorUnreachable,
-    #[error("Vault is not available")]
-    KeyHolderActorUnreachable,
-    #[error("Database pool error")]
-    DatabasePoolUnavailable,
-    #[error("Database error")]
-    DatabaseOperationFailed,
-}
-pub struct UserAgentActor<Transport>
-where
-    Transport: Bi<UserAgentRequest, Result<UserAgentResponse, UserAgentError>>,
-{
+pub use error::UserAgentError;
+#[derive(Actor)]
+pub struct UserAgentActor {
     db: db::DatabasePool,
     actors: GlobalActors,
     state: UserAgentStateMachine<DummyContext>,
-    transport: Transport,
+    transport: Recipient<Result<UserAgentResponse, UserAgentError>>,
 }
-impl<Transport> UserAgentActor<Transport>
-where
-    Transport: Bi<UserAgentRequest, Result<UserAgentResponse, UserAgentError>>,
-{
-    pub(crate) fn new(context: ServerContext, transport: Transport) -> Self {
+impl UserAgentActor {
+    pub(crate) fn new(
+        context: ServerContext,
+        transport: Recipient<Result<UserAgentResponse, UserAgentError>>,
+    ) -> Self {
         Self {
             db: context.db.clone(),
             actors: context.actors.clone(),
@@ -94,42 +61,55 @@ where
         }
     }
-    fn transition(&mut self, event: UserAgentEvents) -> Result<(), UserAgentError> {
-        self.state.process_event(event).map_err(|e| {
-            error!(?e, "State transition failed");
-            UserAgentError::StateTransitionFailed
-        })?;
-        Ok(())
+    pub fn new_manual(
+        db: db::DatabasePool,
+        actors: GlobalActors,
+        transport: Recipient<Result<UserAgentResponse, UserAgentError>>,
+    ) -> Self {
+        Self {
+            db,
+            actors,
+            state: UserAgentStateMachine::new(DummyContext),
+            transport,
+        }
     }
-    pub async fn process_transport_inbound(&mut self, req: UserAgentRequest) -> Output {
+    async fn process_request(&mut self, req: UserAgentRequest) -> Output {
         let msg = req.payload.ok_or_else(|| {
             error!(actor = "useragent", "Received message with no payload");
-            UserAgentError::MissingRequestPayload
+            UserAgentError::MissingPayload
         })?;
         match msg {
-            UserAgentRequestPayload::AuthChallengeRequest(req) => {
-                self.handle_auth_challenge_request(req).await
-            }
-            UserAgentRequestPayload::AuthChallengeSolution(solution) => {
-                self.handle_auth_challenge_solution(solution).await
-            }
+            UserAgentRequestPayload::AuthMessage(ClientAuthMessage {
+                payload: Some(ClientAuthPayload::AuthChallengeRequest(req)),
+            }) => self.handle_auth_challenge_request(req).await,
+            UserAgentRequestPayload::AuthMessage(ClientAuthMessage {
+                payload: Some(ClientAuthPayload::AuthChallengeSolution(solution)),
+            }) => self.handle_auth_challenge_solution(solution).await,
             UserAgentRequestPayload::UnsealStart(unseal_start) => {
                 self.handle_unseal_request(unseal_start).await
             }
             UserAgentRequestPayload::UnsealEncryptedKey(unseal_encrypted_key) => {
                 self.handle_unseal_encrypted_key(unseal_encrypted_key).await
             }
-            _ => Err(UserAgentError::UnexpectedRequestPayload),
+            _ => Err(UserAgentError::MissingPayload),
         }
     }
+    fn transition(&mut self, event: UserAgentEvents) -> Result<(), UserAgentError> {
+        self.state.process_event(event).map_err(|e| {
+            error!(?e, "State transition failed");
+            UserAgentError::InvalidState
+        })?;
+        Ok(())
+    }
     async fn auth_with_bootstrap_token(
         &mut self,
         pubkey: ed25519_dalek::VerifyingKey,
         token: String,
-    ) -> Result<UserAgentResponse, UserAgentError> {
+    ) -> Output {
         let token_ok: bool = self
             .actors
             .bootstrapper
@@ -137,7 +117,7 @@ where
             .await
             .map_err(|e| {
                 error!(?pubkey, "Failed to consume bootstrap token: {e}");
-                UserAgentError::BootstrapperActorUnreachable
+                UserAgentError::ActorUnavailable
             })?;
         if !token_ok {
@@ -146,10 +126,7 @@ where
         }
         {
-            let mut conn = self.db.get().await.map_err(|e| {
-                error!(error = ?e, "Database pool error");
-                UserAgentError::DatabasePoolUnavailable
-            })?;
+            let mut conn = self.db.get().await?;
             diesel::insert_into(schema::useragent_client::table)
                 .values((
@@ -157,24 +134,17 @@ where
                     schema::useragent_client::nonce.eq(1),
                 ))
                 .execute(&mut conn)
-                .await
-                .map_err(|e| {
-                    error!(error = ?e, "Database error");
-                    UserAgentError::DatabaseOperationFailed
-                })?;
+                .await?;
         }
         self.transition(UserAgentEvents::ReceivedBootstrapToken)?;
-        Ok(response(UserAgentResponsePayload::AuthOk(AuthOk {})))
+        Ok(auth_response(ServerAuthPayload::AuthOk(AuthOk {})))
     }
     async fn auth_with_challenge(&mut self, pubkey: VerifyingKey, pubkey_bytes: Vec<u8>) -> Output {
         let nonce: Option<i32> = {
-            let mut db_conn = self.db.get().await.map_err(|e| {
-                error!(error = ?e, "Database pool error");
-                UserAgentError::DatabasePoolUnavailable
-            })?;
+            let mut db_conn = self.db.get().await?;
             db_conn
                 .exclusive_transaction(|conn| {
                     Box::pin(async move {
@@ -198,19 +168,15 @@ where
                         })
                     })
                     .await
-                    .optional()
-                    .map_err(|e| {
-                        error!(error = ?e, "Database error");
-                        UserAgentError::DatabaseOperationFailed
-                    })?
+                    .optional()?
         };
         let Some(nonce) = nonce else {
             error!(?pubkey, "Public key not found in database");
-            return Err(UserAgentError::PublicKeyNotRegistered);
+            return Err(UserAgentError::PubkeyNotRegistered);
         };
-        let challenge = AuthChallenge {
+        let challenge = auth::AuthChallenge {
             pubkey: pubkey_bytes,
             nonce,
         };
@@ -226,22 +192,19 @@ where
             "Sent authentication challenge to client"
         );
-        Ok(response(UserAgentResponsePayload::AuthChallenge(challenge)))
+        Ok(auth_response(ServerAuthPayload::AuthChallenge(challenge)))
     }
     fn verify_challenge_solution(
         &self,
-        solution: &AuthChallengeSolution,
+        solution: &auth::AuthChallengeSolution,
     ) -> Result<(bool, &ChallengeContext), UserAgentError> {
         let UserAgentStates::WaitingForChallengeSolution(challenge_context) = self.state.state()
         else {
             error!("Received challenge solution in invalid state");
-            return Err(UserAgentError::InvalidStateForChallengeSolution);
+            return Err(UserAgentError::InvalidState);
         };
-        let formatted_challenge = arbiter_proto::format_challenge(
-            challenge_context.challenge.nonce,
-            &challenge_context.challenge.pubkey,
-        );
+        let formatted_challenge = arbiter_proto::format_challenge(&challenge_context.challenge);
         let signature = solution.signature.as_slice().try_into().map_err(|_| {
             error!(?solution, "Invalid signature length");
@@ -259,24 +222,31 @@ where
 type Output = Result<UserAgentResponse, UserAgentError>;
-fn response(payload: UserAgentResponsePayload) -> UserAgentResponse {
+fn auth_response(payload: ServerAuthPayload) -> UserAgentResponse {
+    UserAgentResponse {
+        payload: Some(UserAgentResponsePayload::AuthMessage(AuthServerMessage {
+            payload: Some(payload),
+        })),
+    }
+}
+fn unseal_response(payload: UserAgentResponsePayload) -> UserAgentResponse {
     UserAgentResponse {
         payload: Some(payload),
     }
 }
-impl<Transport> UserAgentActor<Transport>
-where
-    Transport: Bi<UserAgentRequest, Result<UserAgentResponse, UserAgentError>>,
-{
-    async fn handle_unseal_request(&mut self, req: UnsealStart) -> Output {
+#[messages]
+impl UserAgentActor {
+    #[message]
+    pub async fn handle_unseal_request(&mut self, req: UnsealStart) -> Output {
         let secret = EphemeralSecret::random();
         let public_key = PublicKey::from(&secret);
         let client_pubkey_bytes: [u8; 32] = req
             .client_pubkey
             .try_into()
-            .map_err(|_| UserAgentError::InvalidClientPubkeyLength)?;
+            .map_err(|_| UserAgentError::InvalidPubkey)?;
         let client_public_key = PublicKey::from(client_pubkey_bytes);
@@ -285,17 +255,18 @@ where
             client_public_key,
         }))?;
-        Ok(response(
+        Ok(unseal_response(
             UserAgentResponsePayload::UnsealStartResponse(UnsealStartResponse {
                 server_pubkey: public_key.as_bytes().to_vec(),
             }),
         ))
     }
-    async fn handle_unseal_encrypted_key(&mut self, req: UnsealEncryptedKey) -> Output {
+    #[message]
+    pub async fn handle_unseal_encrypted_key(&mut self, req: UnsealEncryptedKey) -> Output {
         let UserAgentStates::WaitingForUnsealKey(unseal_context) = self.state.state() else {
             error!("Received unseal encrypted key in invalid state");
-            return Err(UserAgentError::InvalidStateForUnsealEncryptedKey);
+            return Err(UserAgentError::InvalidState);
         };
         let ephemeral_secret = {
             let mut secret_lock = unseal_context.secret.lock().unwrap();
@@ -306,7 +277,7 @@ where
                 drop(secret_lock);
                 error!("Ephemeral secret already taken");
                 self.transition(UserAgentEvents::ReceivedInvalidKey)?;
-                return Ok(response(UserAgentResponsePayload::UnsealResult(
+                return Ok(unseal_response(UserAgentResponsePayload::UnsealResult(
                     UnsealResult::InvalidKey.into(),
                 )));
             }
@@ -339,48 +310,49 @@ where
                     Ok(_) => {
                         info!("Successfully unsealed key with client-provided key");
                         self.transition(UserAgentEvents::ReceivedValidKey)?;
-                        Ok(response(UserAgentResponsePayload::UnsealResult(
+                        Ok(unseal_response(UserAgentResponsePayload::UnsealResult(
                             UnsealResult::Success.into(),
                         )))
                     }
                     Err(SendError::HandlerError(keyholder::Error::InvalidKey)) => {
                         self.transition(UserAgentEvents::ReceivedInvalidKey)?;
-                        Ok(response(UserAgentResponsePayload::UnsealResult(
+                        Ok(unseal_response(UserAgentResponsePayload::UnsealResult(
                             UnsealResult::InvalidKey.into(),
                         )))
                     }
                     Err(SendError::HandlerError(err)) => {
                         error!(?err, "Keyholder failed to unseal key");
                         self.transition(UserAgentEvents::ReceivedInvalidKey)?;
-                        Ok(response(UserAgentResponsePayload::UnsealResult(
+                        Ok(unseal_response(UserAgentResponsePayload::UnsealResult(
                             UnsealResult::InvalidKey.into(),
                         )))
                     }
                     Err(err) => {
                         error!(?err, "Failed to send unseal request to keyholder");
                         self.transition(UserAgentEvents::ReceivedInvalidKey)?;
-                        Err(UserAgentError::KeyHolderActorUnreachable)
+                        Err(UserAgentError::ActorUnavailable)
                     }
                 }
             }
             Err(err) => {
                 error!(?err, "Failed to decrypt unseal key");
                 self.transition(UserAgentEvents::ReceivedInvalidKey)?;
-                Ok(response(UserAgentResponsePayload::UnsealResult(
+                Ok(unseal_response(UserAgentResponsePayload::UnsealResult(
                     UnsealResult::InvalidKey.into(),
                 )))
             }
         }
     }
-    async fn handle_auth_challenge_request(&mut self, req: AuthChallengeRequest) -> Output {
+    #[message]
+    pub async fn handle_auth_challenge_request(&mut self, req: AuthChallengeRequest) -> Output {
         let pubkey = req
             .pubkey
             .as_array()
-            .ok_or(UserAgentError::InvalidAuthPubkeyLength)?;
+            .ok_or(UserAgentError::InvalidPubkey)?;
         let pubkey = VerifyingKey::from_bytes(pubkey).map_err(|_err| {
             error!(?pubkey, "Failed to convert to VerifyingKey");
-            UserAgentError::InvalidAuthPubkeyEncoding
+            UserAgentError::InvalidPubkey
         })?;
         self.transition(UserAgentEvents::AuthRequest)?;
@@ -391,9 +363,10 @@ where
         }
     }
-    async fn handle_auth_challenge_solution(
+    #[message]
+    pub async fn handle_auth_challenge_solution(
         &mut self,
-        solution: AuthChallengeSolution,
+        solution: auth::AuthChallengeSolution,
     ) -> Output {
         let (valid, challenge_context) = self.verify_challenge_solution(&solution)?;
@@ -403,7 +376,7 @@ where
                 "Client provided valid solution to authentication challenge"
             );
             self.transition(UserAgentEvents::ReceivedGoodSolution)?;
-            Ok(response(UserAgentResponsePayload::AuthOk(AuthOk {})))
+            Ok(auth_response(ServerAuthPayload::AuthOk(AuthOk {})))
         } else {
             error!("Client provided invalid solution to authentication challenge");
             self.transition(UserAgentEvents::ReceivedBadSolution)?;
@@ -412,67 +385,17 @@ where
     }
 }
-impl<Transport> Actor for UserAgentActor<Transport>
-where
-    Transport: Bi<UserAgentRequest, Result<UserAgentResponse, UserAgentError>>,
-{
-    type Args = Self;
-    type Error = ();
-    async fn on_start(
-        args: Self::Args,
-        _: kameo::prelude::ActorRef<Self>,
-    ) -> Result<Self, Self::Error> {
-        Ok(args)
-    }
-    async fn next(
-        &mut self,
-        _actor_ref: kameo::prelude::WeakActorRef<Self>,
-        mailbox_rx: &mut kameo::prelude::MailboxReceiver<Self>,
-    ) -> Option<kameo::mailbox::Signal<Self>> {
-        loop {
-            select! {
-                signal = mailbox_rx.recv() => {
-                    return signal;
-                }
-                msg = self.transport.recv() => {
-                    match msg {
-                        Some(request) => {
-                            match self.process_transport_inbound(request).await {
-                                Ok(response) => {
-                                    if self.transport.send(Ok(response)).await.is_err() {
-                                        error!(actor = "useragent", reason = "channel closed", "send.failed");
-                                        return Some(kameo::mailbox::Signal::Stop);
-                                    }
-                                }
-                                Err(err) => {
-                                    let _ = self.transport.send(Err(err)).await;
-                                    return Some(kameo::mailbox::Signal::Stop);
-                                }
-                            }
-                        }
-                        None => {
-                            info!(actor = "useragent", "transport.closed");
-                            return Some(kameo::mailbox::Signal::Stop);
-                        }
-                    }
-                }
-            }
-        }
-    }
-}
-impl UserAgentActor<DummyTransport<UserAgentRequest, Result<UserAgentResponse, UserAgentError>>> {
-    pub fn new_manual(db: db::DatabasePool, actors: GlobalActors) -> Self {
-        Self {
-            db,
-            actors,
-            state: UserAgentStateMachine::new(DummyContext),
-            transport: DummyTransport::new(),
+impl Message<UserAgentRequest> for UserAgentActor {
+    type Reply = ();
+    async fn handle(
+        &mut self,
+        msg: UserAgentRequest,
+        _ctx: &mut kameo::prelude::Context<Self, Self::Reply>,
+    ) -> Self::Reply {
+        let result = self.process_request(msg).await;
+        if let Err(e) = self.transport.tell(result).await {
+            error!(actor = "useragent", "Failed to send response to transport: {}", e);
         }
     }
 }

View File

@@ -1,6 +1,6 @@
 use std::sync::Mutex;
-use arbiter_proto::proto::user_agent::AuthChallenge;
+use arbiter_proto::proto::auth::AuthChallenge;
 use ed25519_dalek::VerifyingKey;
 use x25519_dalek::{EphemeralSecret, PublicKey};

View File

@@ -1,23 +1,19 @@
 #![forbid(unsafe_code)]
 use arbiter_proto::{
-    proto::{
-        client::{ClientRequest, ClientResponse},
-        user_agent::{UserAgentRequest, UserAgentResponse},
-    },
-    transport::{IdentityRecvConverter, SendConverter, grpc},
+    proto::{ClientRequest, ClientResponse, UserAgentRequest, UserAgentResponse},
+    transport::{BiStream, GrpcTransportActor, wire},
 };
 use async_trait::async_trait;
-use kameo::actor::Spawn;
+use kameo::actor::PreparedActor;
 use tokio_stream::wrappers::ReceiverStream;
 use tokio::sync::mpsc;
 use tonic::{Request, Response, Status};
+use tracing::info;
 use crate::{
     actors::{
-        client::{ClientActor, ClientError},
-        user_agent::{UserAgentActor, UserAgentError},
+        client::handle_client,
+        user_agent::UserAgentActor,
     },
     context::ServerContext,
 };
@@ -28,119 +24,6 @@ pub mod db;
const DEFAULT_CHANNEL_SIZE: usize = 1000;
/// Converts User Agent domain outbounds into the tonic stream item emitted by
/// the server.
///
/// The conversion is defined at the server boundary so the actor module remains
/// focused on domain semantics and does not depend on tonic status encoding.
struct UserAgentGrpcSender;
impl SendConverter for UserAgentGrpcSender {
type Input = Result<UserAgentResponse, UserAgentError>;
type Output = Result<UserAgentResponse, Status>;
fn convert(&self, item: Self::Input) -> Self::Output {
match item {
Ok(message) => Ok(message),
Err(err) => Err(user_agent_error_status(err)),
}
}
}
/// Converts Client domain outbounds into the tonic stream item emitted by the
/// server.
///
/// The conversion is defined at the server boundary so the actor module remains
/// focused on domain semantics and does not depend on tonic status encoding.
struct ClientGrpcSender;
impl SendConverter for ClientGrpcSender {
type Input = Result<ClientResponse, ClientError>;
type Output = Result<ClientResponse, Status>;
fn convert(&self, item: Self::Input) -> Self::Output {
match item {
Ok(message) => Ok(message),
Err(err) => Err(client_error_status(err)),
}
}
}
/// Maps Client domain errors to public gRPC transport errors for the `client`
/// streaming endpoint.
fn client_error_status(value: ClientError) -> Status {
match value {
ClientError::MissingRequestPayload | ClientError::UnexpectedRequestPayload => {
Status::invalid_argument("Expected message with payload")
}
ClientError::InvalidStateForChallengeSolution => {
Status::invalid_argument("Invalid state for challenge solution")
}
ClientError::InvalidAuthPubkeyLength => {
Status::invalid_argument("Expected pubkey to have specific length")
}
ClientError::InvalidAuthPubkeyEncoding => {
Status::invalid_argument("Failed to convert pubkey to VerifyingKey")
}
ClientError::InvalidSignatureLength => {
Status::invalid_argument("Invalid signature length")
}
ClientError::PublicKeyNotRegistered => {
Status::unauthenticated("Public key not registered")
}
ClientError::InvalidChallengeSolution => {
Status::unauthenticated("Invalid challenge solution")
}
ClientError::StateTransitionFailed => Status::internal("State machine error"),
ClientError::DatabasePoolUnavailable => Status::internal("Database pool error"),
ClientError::DatabaseOperationFailed => Status::internal("Database error"),
}
}
/// Maps User Agent domain errors to public gRPC transport errors for the
/// `user_agent` streaming endpoint.
fn user_agent_error_status(value: UserAgentError) -> Status {
match value {
UserAgentError::MissingRequestPayload | UserAgentError::UnexpectedRequestPayload => {
Status::invalid_argument("Expected message with payload")
}
UserAgentError::InvalidStateForChallengeSolution => {
Status::invalid_argument("Invalid state for challenge solution")
}
UserAgentError::InvalidStateForUnsealEncryptedKey => {
Status::failed_precondition("Invalid state for unseal encrypted key")
}
UserAgentError::InvalidClientPubkeyLength => {
Status::invalid_argument("client_pubkey must be 32 bytes")
}
UserAgentError::InvalidAuthPubkeyLength => {
Status::invalid_argument("Expected pubkey to have specific length")
}
UserAgentError::InvalidAuthPubkeyEncoding => {
Status::invalid_argument("Failed to convert pubkey to VerifyingKey")
}
UserAgentError::InvalidSignatureLength => {
Status::invalid_argument("Invalid signature length")
}
UserAgentError::InvalidBootstrapToken => {
Status::invalid_argument("Invalid bootstrap token")
}
UserAgentError::PublicKeyNotRegistered => {
Status::unauthenticated("Public key not registered")
}
UserAgentError::InvalidChallengeSolution => {
Status::unauthenticated("Invalid challenge solution")
}
UserAgentError::StateTransitionFailed => Status::internal("State machine error"),
UserAgentError::BootstrapperActorUnreachable => {
Status::internal("Bootstrap token consumption failed")
}
UserAgentError::KeyHolderActorUnreachable => Status::internal("Vault is not available"),
UserAgentError::DatabasePoolUnavailable => Status::internal("Database pool error"),
UserAgentError::DatabaseOperationFailed => Status::internal("Database error"),
}
}
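The two mapping functions above partition every domain error into one of three gRPC status classes: caller mistakes become `invalid_argument`, identity failures become `unauthenticated`, and internal faults collapse to opaque `internal` statuses so implementation details never reach the wire. A std-only sketch of the same classification pattern, with hypothetical error names and plain strings standing in for `tonic::Status` constructors:

```rust
// Hypothetical domain error, mirroring the shape of ClientError above.
#[derive(Debug)]
enum DomainError {
    MissingPayload,   // caller sent a malformed request
    UnknownPublicKey, // caller is not registered
    StorageFailure,   // internal fault; details must not leak
}

// The status class stands in for the tonic::Status constructor that would be used.
fn status_class(err: &DomainError) -> &'static str {
    match err {
        DomainError::MissingPayload => "invalid_argument",
        DomainError::UnknownPublicKey => "unauthenticated",
        // Internal faults map to an opaque status so storage details
        // never cross the transport boundary.
        DomainError::StorageFailure => "internal",
    }
}

fn main() {
    assert_eq!(status_class(&DomainError::MissingPayload), "invalid_argument");
    assert_eq!(status_class(&DomainError::StorageFailure), "internal");
}
```

Keeping the mapping exhaustive (no `_` arm) means a new domain error variant fails to compile until it is explicitly classified, which is the property the functions above rely on.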
 pub struct Server {
     context: ServerContext,
 }

@@ -156,44 +39,44 @@ impl arbiter_proto::proto::arbiter_service_server::ArbiterService for Server {
     type UserAgentStream = ReceiverStream<Result<UserAgentResponse, Status>>;
     type ClientStream = ReceiverStream<Result<ClientResponse, Status>>;

-    #[tracing::instrument(level = "debug", skip(self))]
     async fn client(
         &self,
         request: Request<tonic::Streaming<ClientRequest>>,
     ) -> Result<Response<Self::ClientStream>, Status> {
         let req_stream = request.into_inner();
         let (tx, rx) = mpsc::channel(DEFAULT_CHANNEL_SIZE);

-        let transport = grpc::GrpcAdapter::new(
-            tx,
-            req_stream,
-            IdentityRecvConverter::<ClientRequest>::new(),
-            ClientGrpcSender,
-        );
-        ClientActor::spawn(ClientActor::new(self.context.clone(), transport));
-        info!(event = "connection established", "grpc.client");
+        tokio::spawn(handle_client(
+            self.context.clone(),
+            BiStream {
+                request_stream: req_stream,
+                response_sender: tx,
+            },
+        ));

         Ok(Response::new(ReceiverStream::new(rx)))
     }

-    #[tracing::instrument(level = "debug", skip(self))]
     async fn user_agent(
         &self,
         request: Request<tonic::Streaming<UserAgentRequest>>,
     ) -> Result<Response<Self::UserAgentStream>, Status> {
         let req_stream = request.into_inner();
         let (tx, rx) = mpsc::channel(DEFAULT_CHANNEL_SIZE);
+        let context = self.context.clone();

-        let transport = grpc::GrpcAdapter::new(
-            tx,
-            req_stream,
-            IdentityRecvConverter::<UserAgentRequest>::new(),
-            UserAgentGrpcSender,
-        );
-        UserAgentActor::spawn(UserAgentActor::new(self.context.clone(), transport));
-        info!(event = "connection established", "grpc.user_agent");
+        wire(
+            |prepared: PreparedActor<UserAgentActor>, recipient| {
+                prepared.spawn(UserAgentActor::new(context, recipient));
+            },
+            |prepared: PreparedActor<GrpcTransportActor<_, _, _>>, business_recipient| {
+                prepared.spawn(GrpcTransportActor::new(
+                    tx,
+                    req_stream,
+                    business_recipient,
+                ));
+            },
+        )
+        .await;

         Ok(Response::new(ReceiverStream::new(rx)))
     }


@@ -1,2 +0,0 @@
#[path = "client/auth.rs"]
mod auth;


@@ -1,102 +0,0 @@
use arbiter_proto::proto::client::{
AuthChallengeRequest, AuthChallengeSolution, AuthOk, ClientRequest, ClientResponse,
client_request::Payload as ClientRequestPayload,
client_response::Payload as ClientResponsePayload,
};
use arbiter_server::{
actors::client::{ClientActor, ClientError},
db::{self, schema},
};
use diesel::{ExpressionMethods as _, insert_into};
use diesel_async::RunQueryDsl;
use ed25519_dalek::Signer as _;
#[tokio::test]
#[test_log::test]
pub async fn test_unregistered_pubkey_rejected() {
let db = db::create_test_pool().await;
let mut client = ClientActor::new_manual(db.clone());
let new_key = ed25519_dalek::SigningKey::generate(&mut rand::rng());
let pubkey_bytes = new_key.verifying_key().to_bytes().to_vec();
let result = client
.process_transport_inbound(ClientRequest {
payload: Some(ClientRequestPayload::AuthChallengeRequest(
AuthChallengeRequest {
pubkey: pubkey_bytes,
},
)),
})
.await;
match result {
Err(err) => {
assert_eq!(err, ClientError::PublicKeyNotRegistered);
}
Ok(_) => {
panic!("Expected error due to unregistered pubkey, but got success");
}
}
}
#[tokio::test]
#[test_log::test]
pub async fn test_challenge_auth() {
let db = db::create_test_pool().await;
let mut client = ClientActor::new_manual(db.clone());
let new_key = ed25519_dalek::SigningKey::generate(&mut rand::rng());
let pubkey_bytes = new_key.verifying_key().to_bytes().to_vec();
{
let mut conn = db.get().await.unwrap();
insert_into(schema::program_client::table)
.values(schema::program_client::public_key.eq(pubkey_bytes.clone()))
.execute(&mut conn)
.await
.unwrap();
}
let result = client
.process_transport_inbound(ClientRequest {
payload: Some(ClientRequestPayload::AuthChallengeRequest(
AuthChallengeRequest {
pubkey: pubkey_bytes,
},
)),
})
.await
.expect("Shouldn't fail to process message");
let ClientResponse {
payload: Some(ClientResponsePayload::AuthChallenge(challenge)),
} = result
else {
panic!("Expected auth challenge response, got {result:?}");
};
let formatted_challenge = arbiter_proto::format_challenge(challenge.nonce, &challenge.pubkey);
let signature = new_key.sign(&formatted_challenge);
let serialized_signature = signature.to_bytes().to_vec();
let result = client
.process_transport_inbound(ClientRequest {
payload: Some(ClientRequestPayload::AuthChallengeSolution(
AuthChallengeSolution {
signature: serialized_signature,
},
)),
})
.await
.expect("Shouldn't fail to process message");
assert_eq!(
result,
ClientResponse {
payload: Some(ClientResponsePayload::AuthOk(AuthOk {})),
}
);
}


@@ -1,5 +1,30 @@
 mod common;
use arbiter_proto::proto::UserAgentResponse;
use arbiter_server::actors::user_agent::UserAgentError;
use kameo::{Actor, actor::Recipient, actor::Spawn, prelude::Message};
/// A no-op actor that discards any messages it receives.
#[derive(Actor)]
struct NullSink;
impl Message<Result<UserAgentResponse, UserAgentError>> for NullSink {
type Reply = ();
async fn handle(
&mut self,
_msg: Result<UserAgentResponse, UserAgentError>,
_ctx: &mut kameo::prelude::Context<Self, Self::Reply>,
) -> Self::Reply {
}
}
/// Creates a `Recipient` that silently discards all messages.
fn null_recipient() -> Recipient<Result<UserAgentResponse, UserAgentError>> {
let actor_ref = NullSink::spawn(NullSink);
actor_ref.recipient()
}
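`NullSink` above gives tests a `Recipient` that silently drops outbound responses, so actors that require an outbound channel can be exercised without wiring up a real transport. The same discard-sink idea, reduced to a std-only sketch with a hypothetical `Sink` trait (kameo's `Recipient` plays this role in the tests above):

```rust
// Hypothetical minimal sink trait; in the real tests, kameo's Recipient
// fills this role for Result<UserAgentResponse, UserAgentError> messages.
trait Sink<T> {
    fn send(&mut self, msg: T);
}

// Discards every message, like NullSink: useful when the unit under test
// must be handed an outbound channel the test never inspects.
struct NullSink;

impl<T> Sink<T> for NullSink {
    fn send(&mut self, _msg: T) {}
}

// A counting sink shows the contrast: tests that do care about outbound
// traffic can swap this in without touching the unit under test.
struct CountingSink {
    sent: usize,
}

impl<T> Sink<T> for CountingSink {
    fn send(&mut self, _msg: T) {
        self.sent += 1;
    }
}

fn main() {
    let mut null = NullSink;
    null.send("dropped");

    let mut counter = CountingSink { sent: 0 };
    counter.send(1);
    counter.send(2);
    assert_eq!(counter.sent, 2);
}
```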
 #[path = "user_agent/auth.rs"]
 mod auth;
 #[path = "user_agent/unseal.rs"]


@@ -1,48 +1,55 @@
-use arbiter_proto::proto::user_agent::{
-    AuthChallengeRequest, AuthChallengeSolution, AuthOk, UserAgentRequest, UserAgentResponse,
-    user_agent_request::Payload as UserAgentRequestPayload,
+use arbiter_proto::proto::{
+    UserAgentResponse,
+    auth::{self, AuthChallengeRequest, AuthOk},
     user_agent_response::Payload as UserAgentResponsePayload,
 };
 use arbiter_server::{
     actors::{
         GlobalActors,
         bootstrap::GetToken,
-        user_agent::{UserAgentActor, UserAgentError},
+        user_agent::{HandleAuthChallengeRequest, HandleAuthChallengeSolution, UserAgentActor},
     },
     db::{self, schema},
 };
 use diesel::{ExpressionMethods as _, QueryDsl, insert_into};
 use diesel_async::RunQueryDsl;
 use ed25519_dalek::Signer as _;
+use kameo::actor::Spawn;

 #[tokio::test]
 #[test_log::test]
 pub async fn test_bootstrap_token_auth() {
     let db = db::create_test_pool().await;
     let actors = GlobalActors::spawn(db.clone()).await.unwrap();
     let token = actors.bootstrapper.ask(GetToken).await.unwrap().unwrap();
-    let mut user_agent = UserAgentActor::new_manual(db.clone(), actors);
+    let user_agent =
+        UserAgentActor::new_manual(db.clone(), actors, super::null_recipient());
+    let user_agent_ref = UserAgentActor::spawn(user_agent);
     let new_key = ed25519_dalek::SigningKey::generate(&mut rand::rng());
     let pubkey_bytes = new_key.verifying_key().to_bytes().to_vec();
-    let result = user_agent
-        .process_transport_inbound(UserAgentRequest {
-            payload: Some(UserAgentRequestPayload::AuthChallengeRequest(
-                AuthChallengeRequest {
-                    pubkey: pubkey_bytes,
-                    bootstrap_token: Some(token),
-                },
-            )),
-        })
+    let result = user_agent_ref
+        .ask(HandleAuthChallengeRequest {
+            req: AuthChallengeRequest {
+                pubkey: pubkey_bytes,
+                bootstrap_token: Some(token),
+            },
+        })
         .await
-        .expect("Shouldn't fail to process message");
+        .expect("Shouldn't fail to send message");
     assert_eq!(
         result,
         UserAgentResponse {
-            payload: Some(UserAgentResponsePayload::AuthOk(AuthOk {})),
+            payload: Some(UserAgentResponsePayload::AuthMessage(
+                arbiter_proto::proto::auth::ServerMessage {
+                    payload: Some(arbiter_proto::proto::auth::server_message::Payload::AuthOk(
+                        AuthOk {},
+                    )),
+                },
+            )),
         }
     );
@@ -61,25 +68,31 @@ pub async fn test_bootstrap_invalid_token_auth() {
     let db = db::create_test_pool().await;
     let actors = GlobalActors::spawn(db.clone()).await.unwrap();
-    let mut user_agent = UserAgentActor::new_manual(db.clone(), actors);
+    let user_agent =
+        UserAgentActor::new_manual(db.clone(), actors, super::null_recipient());
+    let user_agent_ref = UserAgentActor::spawn(user_agent);
     let new_key = ed25519_dalek::SigningKey::generate(&mut rand::rng());
     let pubkey_bytes = new_key.verifying_key().to_bytes().to_vec();
-    let result = user_agent
-        .process_transport_inbound(UserAgentRequest {
-            payload: Some(UserAgentRequestPayload::AuthChallengeRequest(
-                AuthChallengeRequest {
-                    pubkey: pubkey_bytes,
-                    bootstrap_token: Some("invalid_token".to_string()),
-                },
-            )),
-        })
+    let result = user_agent_ref
+        .ask(HandleAuthChallengeRequest {
+            req: AuthChallengeRequest {
+                pubkey: pubkey_bytes,
+                bootstrap_token: Some("invalid_token".to_string()),
+            },
+        })
         .await;
     match result {
-        Err(err) => {
-            assert_eq!(err, UserAgentError::InvalidBootstrapToken);
+        Err(kameo::error::SendError::HandlerError(err)) => {
+            assert!(
+                matches!(err, arbiter_server::actors::user_agent::UserAgentError::InvalidBootstrapToken),
+                "Expected InvalidBootstrapToken, got {err:?}"
+            );
+        }
+        Err(other) => {
+            panic!("Expected SendError::HandlerError, got {other:?}");
         }
         Ok(_) => {
             panic!("Expected error due to invalid bootstrap token, but got success");
@@ -93,7 +106,9 @@ pub async fn test_challenge_auth() {
     let db = db::create_test_pool().await;
     let actors = GlobalActors::spawn(db.clone()).await.unwrap();
-    let mut user_agent = UserAgentActor::new_manual(db.clone(), actors);
+    let user_agent =
+        UserAgentActor::new_manual(db.clone(), actors, super::null_recipient());
+    let user_agent_ref = UserAgentActor::spawn(user_agent);
     let new_key = ed25519_dalek::SigningKey::generate(&mut rand::rng());
     let pubkey_bytes = new_key.verifying_key().to_bytes().to_vec();
@@ -107,44 +122,50 @@ pub async fn test_challenge_auth() {
         .unwrap();
     }
-    let result = user_agent
-        .process_transport_inbound(UserAgentRequest {
-            payload: Some(UserAgentRequestPayload::AuthChallengeRequest(
-                AuthChallengeRequest {
-                    pubkey: pubkey_bytes,
-                    bootstrap_token: None,
-                },
-            )),
-        })
+    let result = user_agent_ref
+        .ask(HandleAuthChallengeRequest {
+            req: AuthChallengeRequest {
+                pubkey: pubkey_bytes,
+                bootstrap_token: None,
+            },
+        })
         .await
-        .expect("Shouldn't fail to process message");
+        .expect("Shouldn't fail to send message");
     let UserAgentResponse {
-        payload: Some(UserAgentResponsePayload::AuthChallenge(challenge)),
+        payload:
+            Some(UserAgentResponsePayload::AuthMessage(arbiter_proto::proto::auth::ServerMessage {
+                payload:
+                    Some(arbiter_proto::proto::auth::server_message::Payload::AuthChallenge(challenge)),
+            })),
     } = result
     else {
         panic!("Expected auth challenge response, got {result:?}");
     };
-    let formatted_challenge = arbiter_proto::format_challenge(challenge.nonce, &challenge.pubkey);
+    let formatted_challenge = arbiter_proto::format_challenge(&challenge);
     let signature = new_key.sign(&formatted_challenge);
     let serialized_signature = signature.to_bytes().to_vec();
-    let result = user_agent
-        .process_transport_inbound(UserAgentRequest {
-            payload: Some(UserAgentRequestPayload::AuthChallengeSolution(
-                AuthChallengeSolution {
-                    signature: serialized_signature,
-                },
-            )),
-        })
+    let result = user_agent_ref
+        .ask(HandleAuthChallengeSolution {
+            solution: auth::AuthChallengeSolution {
+                signature: serialized_signature,
+            },
+        })
         .await
-        .expect("Shouldn't fail to process message");
+        .expect("Shouldn't fail to send message");
     assert_eq!(
         result,
         UserAgentResponse {
-            payload: Some(UserAgentResponsePayload::AuthOk(AuthOk {})),
+            payload: Some(UserAgentResponsePayload::AuthMessage(
+                arbiter_proto::proto::auth::ServerMessage {
+                    payload: Some(arbiter_proto::proto::auth::server_message::Payload::AuthOk(
+                        AuthOk {},
+                    )),
+                },
+            )),
         }
     );
 }


@@ -1,32 +1,27 @@
use arbiter_proto::proto::user_agent::{ use arbiter_proto::proto::{
AuthChallengeRequest, UnsealEncryptedKey, UnsealResult, UnsealStart, UnsealEncryptedKey, UnsealResult, UnsealStart, auth::AuthChallengeRequest,
UserAgentRequest, UserAgentResponse,
user_agent_request::Payload as UserAgentRequestPayload,
user_agent_response::Payload as UserAgentResponsePayload, user_agent_response::Payload as UserAgentResponsePayload,
}; };
use arbiter_proto::transport::DummyTransport;
use arbiter_server::{ use arbiter_server::{
actors::{ actors::{
GlobalActors, GlobalActors,
bootstrap::GetToken, bootstrap::GetToken,
keyholder::{Bootstrap, Seal}, keyholder::{Bootstrap, Seal},
user_agent::{UserAgentActor, UserAgentError}, user_agent::{
HandleAuthChallengeRequest, HandleUnsealEncryptedKey, HandleUnsealRequest,
UserAgentActor,
},
}, },
db, db,
}; };
use chacha20poly1305::{AeadInPlace, XChaCha20Poly1305, XNonce, aead::KeyInit}; use chacha20poly1305::{AeadInPlace, XChaCha20Poly1305, XNonce, aead::KeyInit};
use kameo::actor::{ActorRef, Spawn};
use memsafe::MemSafe; use memsafe::MemSafe;
use x25519_dalek::{EphemeralSecret, PublicKey}; use x25519_dalek::{EphemeralSecret, PublicKey};
type TestUserAgent =
UserAgentActor<DummyTransport<UserAgentRequest, Result<UserAgentResponse, UserAgentError>>>;
async fn setup_authenticated_user_agent( async fn setup_authenticated_user_agent(
seal_key: &[u8], seal_key: &[u8],
) -> ( ) -> (arbiter_server::db::DatabasePool, ActorRef<UserAgentActor>) {
arbiter_server::db::DatabasePool,
TestUserAgent,
) {
let db = db::create_test_pool().await; let db = db::create_test_pool().await;
let actors = GlobalActors::spawn(db.clone()).await.unwrap(); let actors = GlobalActors::spawn(db.clone()).await.unwrap();
@@ -39,37 +34,37 @@ async fn setup_authenticated_user_agent(
.unwrap(); .unwrap();
actors.key_holder.ask(Seal).await.unwrap(); actors.key_holder.ask(Seal).await.unwrap();
let mut user_agent = UserAgentActor::new_manual(db.clone(), actors.clone()); let user_agent =
UserAgentActor::new_manual(db.clone(), actors.clone(), super::null_recipient());
let user_agent_ref = UserAgentActor::spawn(user_agent);
let token = actors.bootstrapper.ask(GetToken).await.unwrap().unwrap(); let token = actors.bootstrapper.ask(GetToken).await.unwrap().unwrap();
let auth_key = ed25519_dalek::SigningKey::generate(&mut rand::rng()); let auth_key = ed25519_dalek::SigningKey::generate(&mut rand::rng());
user_agent user_agent_ref
.process_transport_inbound(UserAgentRequest { .ask(HandleAuthChallengeRequest {
payload: Some(UserAgentRequestPayload::AuthChallengeRequest( req: AuthChallengeRequest {
AuthChallengeRequest {
pubkey: auth_key.verifying_key().to_bytes().to_vec(), pubkey: auth_key.verifying_key().to_bytes().to_vec(),
bootstrap_token: Some(token), bootstrap_token: Some(token),
}, },
)),
}) })
.await .await
.unwrap(); .unwrap();
(db, user_agent) (db, user_agent_ref)
} }
async fn client_dh_encrypt( async fn client_dh_encrypt(
user_agent: &mut TestUserAgent, user_agent_ref: &ActorRef<UserAgentActor>,
key_to_send: &[u8], key_to_send: &[u8],
) -> UnsealEncryptedKey { ) -> UnsealEncryptedKey {
let client_secret = EphemeralSecret::random(); let client_secret = EphemeralSecret::random();
let client_public = PublicKey::from(&client_secret); let client_public = PublicKey::from(&client_secret);
let response = user_agent let response = user_agent_ref
.process_transport_inbound(UserAgentRequest { .ask(HandleUnsealRequest {
payload: Some(UserAgentRequestPayload::UnsealStart(UnsealStart { req: UnsealStart {
client_pubkey: client_public.as_bytes().to_vec(), client_pubkey: client_public.as_bytes().to_vec(),
})), },
}) })
.await .await
.unwrap(); .unwrap();
@@ -96,22 +91,16 @@ async fn client_dh_encrypt(
} }
} }
fn unseal_key_request(req: UnsealEncryptedKey) -> UserAgentRequest {
UserAgentRequest {
payload: Some(UserAgentRequestPayload::UnsealEncryptedKey(req)),
}
}
#[tokio::test] #[tokio::test]
#[test_log::test] #[test_log::test]
pub async fn test_unseal_success() { pub async fn test_unseal_success() {
let seal_key = b"test-seal-key"; let seal_key = b"test-seal-key";
let (_db, mut user_agent) = setup_authenticated_user_agent(seal_key).await; let (_db, user_agent_ref) = setup_authenticated_user_agent(seal_key).await;
let encrypted_key = client_dh_encrypt(&mut user_agent, seal_key).await; let encrypted_key = client_dh_encrypt(&user_agent_ref, seal_key).await;
let response = user_agent let response = user_agent_ref
.process_transport_inbound(unseal_key_request(encrypted_key)) .ask(HandleUnsealEncryptedKey { req: encrypted_key })
.await .await
.unwrap(); .unwrap();
@@ -124,12 +113,12 @@ pub async fn test_unseal_success() {
#[tokio::test] #[tokio::test]
#[test_log::test] #[test_log::test]
pub async fn test_unseal_wrong_seal_key() { pub async fn test_unseal_wrong_seal_key() {
let (_db, mut user_agent) = setup_authenticated_user_agent(b"correct-key").await; let (_db, user_agent_ref) = setup_authenticated_user_agent(b"correct-key").await;
let encrypted_key = client_dh_encrypt(&mut user_agent, b"wrong-key").await; let encrypted_key = client_dh_encrypt(&user_agent_ref, b"wrong-key").await;
let response = user_agent let response = user_agent_ref
.process_transport_inbound(unseal_key_request(encrypted_key)) .ask(HandleUnsealEncryptedKey { req: encrypted_key })
.await .await
.unwrap(); .unwrap();
@@ -142,26 +131,28 @@ pub async fn test_unseal_wrong_seal_key() {
#[tokio::test] #[tokio::test]
#[test_log::test] #[test_log::test]
pub async fn test_unseal_corrupted_ciphertext() { pub async fn test_unseal_corrupted_ciphertext() {
let (_db, mut user_agent) = setup_authenticated_user_agent(b"test-key").await; let (_db, user_agent_ref) = setup_authenticated_user_agent(b"test-key").await;
let client_secret = EphemeralSecret::random(); let client_secret = EphemeralSecret::random();
let client_public = PublicKey::from(&client_secret); let client_public = PublicKey::from(&client_secret);
user_agent user_agent_ref
.process_transport_inbound(UserAgentRequest { .ask(HandleUnsealRequest {
payload: Some(UserAgentRequestPayload::UnsealStart(UnsealStart { req: UnsealStart {
client_pubkey: client_public.as_bytes().to_vec(), client_pubkey: client_public.as_bytes().to_vec(),
})), },
}) })
.await .await
.unwrap(); .unwrap();
let response = user_agent let response = user_agent_ref
.process_transport_inbound(unseal_key_request(UnsealEncryptedKey { .ask(HandleUnsealEncryptedKey {
req: UnsealEncryptedKey {
nonce: vec![0u8; 24], nonce: vec![0u8; 24],
ciphertext: vec![0u8; 32], ciphertext: vec![0u8; 32],
associated_data: vec![], associated_data: vec![],
})) },
})
.await .await
.unwrap(); .unwrap();
@@ -177,22 +168,27 @@ pub async fn test_unseal_start_without_auth_fails() {
let db = db::create_test_pool().await; let db = db::create_test_pool().await;
let actors = GlobalActors::spawn(db.clone()).await.unwrap(); let actors = GlobalActors::spawn(db.clone()).await.unwrap();
let mut user_agent = UserAgentActor::new_manual(db.clone(), actors); let user_agent =
UserAgentActor::new_manual(db.clone(), actors, super::null_recipient());
let user_agent_ref = UserAgentActor::spawn(user_agent);
let client_secret = EphemeralSecret::random(); let client_secret = EphemeralSecret::random();
let client_public = PublicKey::from(&client_secret); let client_public = PublicKey::from(&client_secret);
let result = user_agent let result = user_agent_ref
.process_transport_inbound(UserAgentRequest { .ask(HandleUnsealRequest {
payload: Some(UserAgentRequestPayload::UnsealStart(UnsealStart { req: UnsealStart {
client_pubkey: client_public.as_bytes().to_vec(), client_pubkey: client_public.as_bytes().to_vec(),
})), },
}) })
.await; .await;
match result { match result {
Err(err) => { Err(kameo::error::SendError::HandlerError(err)) => {
assert_eq!(err, UserAgentError::StateTransitionFailed); assert!(
matches!(err, arbiter_server::actors::user_agent::UserAgentError::InvalidState),
"Expected InvalidState, got {err:?}"
);
} }
other => panic!("Expected state machine error, got {other:?}"), other => panic!("Expected state machine error, got {other:?}"),
} }
@@ -202,13 +198,13 @@ pub async fn test_unseal_start_without_auth_fails() {
#[test_log::test] #[test_log::test]
pub async fn test_unseal_retry_after_invalid_key() { pub async fn test_unseal_retry_after_invalid_key() {
let seal_key = b"real-seal-key"; let seal_key = b"real-seal-key";
let (_db, mut user_agent) = setup_authenticated_user_agent(seal_key).await; let (_db, user_agent_ref) = setup_authenticated_user_agent(seal_key).await;
{ {
let encrypted_key = client_dh_encrypt(&mut user_agent, b"wrong-key").await; let encrypted_key = client_dh_encrypt(&user_agent_ref, b"wrong-key").await;
let response = user_agent let response = user_agent_ref
.process_transport_inbound(unseal_key_request(encrypted_key)) .ask(HandleUnsealEncryptedKey { req: encrypted_key })
.await .await
.unwrap(); .unwrap();
@@ -219,10 +215,10 @@ pub async fn test_unseal_retry_after_invalid_key() {
} }
{ {
let encrypted_key = client_dh_encrypt(&mut user_agent, seal_key).await; let encrypted_key = client_dh_encrypt(&user_agent_ref, seal_key).await;
let response = user_agent let response = user_agent_ref
.process_transport_inbound(unseal_key_request(encrypted_key)) .ask(HandleUnsealEncryptedKey { req: encrypted_key })
.await .await
.unwrap(); .unwrap();


@@ -9,12 +9,7 @@ arbiter-proto.path = "../arbiter-proto"
 kameo.workspace = true
 tokio = {workspace = true, features = ["net"]}
 tonic.workspace = true
-tonic.features = ["tls-aws-lc"]
 tracing.workspace = true
 ed25519-dalek.workspace = true
 smlang.workspace = true
 x25519-dalek.workspace = true
-thiserror.workspace = true
-tokio-stream.workspace = true
-http = "1.4.0"
-rustls-webpki = { version = "0.103.9", features = ["aws-lc-rs"] }


@@ -1,72 +0,0 @@
use arbiter_proto::{
proto::{
user_agent::{UserAgentRequest, UserAgentResponse},
arbiter_service_client::ArbiterServiceClient,
},
transport::{IdentityRecvConverter, IdentitySendConverter, grpc},
url::ArbiterUrl,
};
use ed25519_dalek::SigningKey;
use kameo::actor::{ActorRef, Spawn};
use tokio::sync::mpsc;
use tokio_stream::wrappers::ReceiverStream;
use tonic::transport::ClientTlsConfig;
#[derive(Debug, thiserror::Error)]
pub enum ConnectError {
    #[error("Could not establish connection")]
Connection(#[from] tonic::transport::Error),
#[error("Invalid server URI")]
InvalidUri(#[from] http::uri::InvalidUri),
#[error("Invalid CA certificate")]
InvalidCaCert(#[from] webpki::Error),
#[error("gRPC error")]
Grpc(#[from] tonic::Status),
}
use super::UserAgentActor;
pub type UserAgentGrpc = ActorRef<
UserAgentActor<
grpc::GrpcAdapter<
IdentityRecvConverter<UserAgentResponse>,
IdentitySendConverter<UserAgentRequest>,
>,
>,
>;
pub async fn connect_grpc(
url: ArbiterUrl,
key: SigningKey,
) -> Result<UserAgentGrpc, ConnectError> {
let bootstrap_token = url.bootstrap_token.clone();
let anchor = webpki::anchor_from_trusted_cert(&url.ca_cert)?.to_owned();
let tls = ClientTlsConfig::new().trust_anchor(anchor);
// TODO: if `host` is localhost, we need to verify server's process authenticity
let channel = tonic::transport::Channel::from_shared(format!("{}:{}", url.host, url.port))?
.tls_config(tls)?
.connect()
.await?;
let mut client = ArbiterServiceClient::new(channel);
let (tx, rx) = mpsc::channel(16);
let bistream = client.user_agent(ReceiverStream::new(rx)).await?;
let bistream = bistream.into_inner();
let adapter = grpc::GrpcAdapter::new(
tx,
bistream,
IdentityRecvConverter::new(),
IdentitySendConverter::new(),
);
let actor = UserAgentActor::spawn(UserAgentActor::new(key, bootstrap_token, adapter));
Ok(actor)
}


@@ -1,195 +1,66 @@
use arbiter_proto::{ use arbiter_proto::{proto::UserAgentRequest, transport::TransportActor};
format_challenge, use ed25519_dalek::SigningKey;
proto::user_agent::{ use kameo::{
AuthChallengeRequest, AuthChallengeSolution, AuthOk, Actor, Reply,
UserAgentRequest, UserAgentResponse, actor::{ActorRef, WeakActorRef},
user_agent_request::Payload as UserAgentRequestPayload, prelude::Message,
user_agent_response::Payload as UserAgentResponsePayload,
},
transport::Bi,
}; };
use ed25519_dalek::{Signer, SigningKey};
use kameo::{Actor, actor::ActorRef};
use smlang::statemachine; use smlang::statemachine;
use tokio::select; use tonic::transport::CertificateDer;
use tracing::{error, info}; use tracing::{debug, error};
struct Storage {
pub identity: SigningKey,
pub server_ca_cert: CertificateDer<'static>,
}
#[derive(Debug)]
pub enum InitError {
StorageError,
Other(String),
}
statemachine! { statemachine! {
name: UserAgent, name: UserAgentStateMachine,
custom_error: false, custom_error: false,
transitions: { transitions: {
*Init + SentAuthChallengeRequest = WaitingForServerAuth, *Init + SendAuthChallenge = WaitingForAuthSolution
WaitingForServerAuth + ReceivedAuthChallenge = WaitingForAuthOk,
WaitingForServerAuth + ReceivedAuthOk = Authenticated,
WaitingForAuthOk + ReceivedAuthOk = Authenticated,
} }
} }
pub struct DummyContext;
impl UserAgentStateMachineContext for DummyContext {}
#[derive(Debug, thiserror::Error)]
pub enum InboundError {
#[error("Invalid user agent response")]
InvalidResponse,
#[error("Expected response payload")]
MissingResponsePayload,
#[error("Unexpected response payload")]
UnexpectedResponsePayload,
#[error("Invalid state for auth challenge")]
InvalidStateForAuthChallenge,
#[error("Invalid state for auth ok")]
InvalidStateForAuthOk,
#[error("State machine error")]
StateTransitionFailed,
#[error("Transport send failed")]
TransportSendFailed,
}
-pub struct UserAgentActor<Transport>
-where
-    Transport: Bi<UserAgentResponse, UserAgentRequest>,
-{
+pub struct UserAgentActor<A: TransportActor<UserAgentRequest>> {
     key: SigningKey,
-    bootstrap_token: Option<String>,
-    state: UserAgentStateMachine<DummyContext>,
-    transport: Transport,
+    server_ca_cert: CertificateDer<'static>,
+    sender: ActorRef<A>,
 }
-impl<Transport> UserAgentActor<Transport>
-where
-    Transport: Bi<UserAgentResponse, UserAgentRequest>,
-{
-    pub fn new(key: SigningKey, bootstrap_token: Option<String>, transport: Transport) -> Self {
-        Self {
-            key,
-            bootstrap_token,
-            state: UserAgentStateMachine::new(DummyContext),
-            transport,
-        }
-    }
-    fn transition(&mut self, event: UserAgentEvents) -> Result<(), InboundError> {
-        self.state.process_event(event).map_err(|e| {
-            error!(?e, "useragent state transition failed");
-            InboundError::StateTransitionFailed
-        })?;
-        Ok(())
-    }
-    async fn send_auth_challenge_request(&mut self) -> Result<(), InboundError> {
-        let req = AuthChallengeRequest {
-            pubkey: self.key.verifying_key().to_bytes().to_vec(),
-            bootstrap_token: self.bootstrap_token.take(),
-        };
-        self.transition(UserAgentEvents::SentAuthChallengeRequest)?;
-        self.transport
-            .send(UserAgentRequest {
-                payload: Some(UserAgentRequestPayload::AuthChallengeRequest(req)),
-            })
-            .await
-            .map_err(|_| InboundError::TransportSendFailed)?;
-        info!(actor = "useragent", "auth.request.sent");
-        Ok(())
-    }
-    async fn handle_auth_challenge(
-        &mut self,
-        challenge: arbiter_proto::proto::user_agent::AuthChallenge,
-    ) -> Result<(), InboundError> {
-        self.transition(UserAgentEvents::ReceivedAuthChallenge)?;
-        let formatted = format_challenge(challenge.nonce, &challenge.pubkey);
-        let signature = self.key.sign(&formatted);
-        let solution = AuthChallengeSolution {
-            signature: signature.to_bytes().to_vec(),
-        };
-        self.transport
-            .send(UserAgentRequest {
-                payload: Some(UserAgentRequestPayload::AuthChallengeSolution(solution)),
-            })
-            .await
-            .map_err(|_| InboundError::TransportSendFailed)?;
-        info!(actor = "useragent", "auth.solution.sent");
-        Ok(())
-    }
-    fn handle_auth_ok(&mut self, _ok: AuthOk) -> Result<(), InboundError> {
-        self.transition(UserAgentEvents::ReceivedAuthOk)?;
-        info!(actor = "useragent", "auth.ok");
-        Ok(())
-    }
-    pub async fn process_inbound_transport(
-        &mut self,
-        inbound: UserAgentResponse,
-    ) -> Result<(), InboundError> {
-        let payload = inbound
-            .payload
-            .ok_or(InboundError::MissingResponsePayload)?;
-        match payload {
-            UserAgentResponsePayload::AuthChallenge(challenge) => {
-                self.handle_auth_challenge(challenge).await
-            }
-            UserAgentResponsePayload::AuthOk(ok) => self.handle_auth_ok(ok),
-            _ => Err(InboundError::UnexpectedResponsePayload),
-        }
-    }
-}
-impl<Transport> Actor for UserAgentActor<Transport>
-where
-    Transport: Bi<UserAgentResponse, UserAgentRequest>,
-{
+impl<A: TransportActor<UserAgentRequest>> Actor for UserAgentActor<A> {
     type Args = Self;
-    type Error = ();
-    async fn on_start(
-        mut args: Self::Args,
-        _actor_ref: ActorRef<Self>,
-    ) -> Result<Self, Self::Error> {
-        if let Err(err) = args.send_auth_challenge_request().await {
-            error!(?err, actor = "useragent", "auth.start.failed");
-            return Err(());
-        }
-        Ok(args)
+    type Error = InitError;
+    async fn on_start(args: Self::Args, actor_ref: ActorRef<Self>) -> Result<Self, Self::Error> {
+        todo!()
     }
-    async fn next(
+    async fn on_link_died(
         &mut self,
-        _actor_ref: kameo::prelude::WeakActorRef<Self>,
-        mailbox_rx: &mut kameo::prelude::MailboxReceiver<Self>,
-    ) -> Option<kameo::mailbox::Signal<Self>> {
-        loop {
-            select! {
-                signal = mailbox_rx.recv() => {
-                    return signal;
-                }
-                inbound = self.transport.recv() => {
-                    match inbound {
-                        Some(inbound) => {
-                            if let Err(err) = self.process_inbound_transport(inbound).await {
-                                error!(?err, actor = "useragent", "transport.inbound.failed");
-                                return Some(kameo::mailbox::Signal::Stop);
-                            }
-                        }
-                        None => {
-                            info!(actor = "useragent", "transport.closed");
-                            return Some(kameo::mailbox::Signal::Stop);
-                        }
-                    }
-                }
-            }
-        }
+        _: WeakActorRef<Self>,
+        id: kameo::prelude::ActorId,
+        _: kameo::prelude::ActorStopReason,
+    ) -> Result<std::ops::ControlFlow<kameo::prelude::ActorStopReason>, Self::Error> {
+        if id == self.sender.id() {
+            error!("Transport actor died, stopping UserAgentActor");
+            Ok(std::ops::ControlFlow::Break(
+                kameo::prelude::ActorStopReason::Normal,
+            ))
+        } else {
+            debug!(
+                "Linked actor {} died, but it's not the transport actor, ignoring",
+                id
+            );
+            Ok(std::ops::ControlFlow::Continue(()))
+        }
     }
 }
mod grpc;
pub use grpc::{connect_grpc, ConnectError};
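
The auth handshake in the pre-refactor actor above advances a state machine in a strict order (request sent, challenge received, ok received) and maps any out-of-order event to `InboundError::StateTransitionFailed`. A minimal std-only sketch of that transition discipline — `AuthState` and `AuthEvent` are illustrative stand-ins, not the project's `UserAgentStateMachine`:

```rust
// Stand-in state machine mirroring the handshake's event order.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum AuthState {
    Idle,
    AwaitingChallenge,
    AwaitingOk,
    Authenticated,
}

#[derive(Debug, Clone, Copy)]
enum AuthEvent {
    SentAuthChallengeRequest,
    ReceivedAuthChallenge,
    ReceivedAuthOk,
}

// Returns the next state, or Err for an out-of-order event
// (the analogue of InboundError::StateTransitionFailed).
fn transition(state: AuthState, event: AuthEvent) -> Result<AuthState, ()> {
    use AuthEvent::*;
    use AuthState::*;
    match (state, event) {
        (Idle, SentAuthChallengeRequest) => Ok(AwaitingChallenge),
        (AwaitingChallenge, ReceivedAuthChallenge) => Ok(AwaitingOk),
        (AwaitingOk, ReceivedAuthOk) => Ok(Authenticated),
        _ => Err(()),
    }
}

fn main() {
    // Events in the expected order walk the machine to Authenticated.
    let mut s = AuthState::Idle;
    for ev in [
        AuthEvent::SentAuthChallengeRequest,
        AuthEvent::ReceivedAuthChallenge,
        AuthEvent::ReceivedAuthOk,
    ] {
        s = transition(s, ev).expect("in-order event");
    }
    assert_eq!(s, AuthState::Authenticated);
    // An out-of-order event is rejected rather than silently accepted.
    assert!(transition(AuthState::Idle, AuthEvent::ReceivedAuthOk).is_err());
    println!("ok");
}
```

This is why `handle_auth_challenge` calls `transition` before acting on the payload: a challenge that arrives in the wrong state fails fast instead of producing a signature at an unexpected point in the protocol.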


@@ -1,139 +0,0 @@
use arbiter_proto::{
format_challenge,
proto::user_agent::{
AuthChallenge, AuthOk,
UserAgentRequest, UserAgentResponse,
user_agent_request::Payload as UserAgentRequestPayload,
user_agent_response::Payload as UserAgentResponsePayload,
},
transport::Bi,
};
use arbiter_useragent::UserAgentActor;
use ed25519_dalek::SigningKey;
use kameo::actor::Spawn;
use tokio::sync::mpsc;
use tokio::time::{Duration, timeout};
struct TestTransport {
inbound_rx: mpsc::Receiver<UserAgentResponse>,
outbound_tx: mpsc::Sender<UserAgentRequest>,
}
impl Bi<UserAgentResponse, UserAgentRequest> for TestTransport {
async fn send(&mut self, item: UserAgentRequest) -> Result<(), arbiter_proto::transport::Error> {
self.outbound_tx
.send(item)
.await
.map_err(|_| arbiter_proto::transport::Error::ChannelClosed)
}
async fn recv(&mut self) -> Option<UserAgentResponse> {
self.inbound_rx.recv().await
}
}
fn make_transport() -> (
TestTransport,
mpsc::Sender<UserAgentResponse>,
mpsc::Receiver<UserAgentRequest>,
) {
let (inbound_tx, inbound_rx) = mpsc::channel(8);
let (outbound_tx, outbound_rx) = mpsc::channel(8);
(
TestTransport {
inbound_rx,
outbound_tx,
},
inbound_tx,
outbound_rx,
)
}
fn test_key() -> SigningKey {
SigningKey::from_bytes(&[7u8; 32])
}
#[tokio::test]
async fn sends_auth_request_on_start_with_bootstrap_token() {
let key = test_key();
let pubkey = key.verifying_key().to_bytes().to_vec();
let bootstrap_token = Some("bootstrap-123".to_string());
let (transport, inbound_tx, mut outbound_rx) = make_transport();
let actor = UserAgentActor::spawn(UserAgentActor::new(key, bootstrap_token.clone(), transport));
let outbound = timeout(Duration::from_secs(1), outbound_rx.recv())
.await
.expect("timed out waiting for auth request")
.expect("channel closed before auth request");
let UserAgentRequest {
payload: Some(UserAgentRequestPayload::AuthChallengeRequest(req)),
} = outbound
else {
panic!("expected auth challenge request");
};
assert_eq!(req.pubkey, pubkey);
assert_eq!(req.bootstrap_token, bootstrap_token);
drop(inbound_tx);
drop(actor);
}
#[tokio::test]
async fn challenge_flow_sends_solution_from_transport_inbound() {
let key = test_key();
let verify_key = key.verifying_key();
let (transport, inbound_tx, mut outbound_rx) = make_transport();
let actor = UserAgentActor::spawn(UserAgentActor::new(key, None, transport));
let _initial_auth_request = timeout(Duration::from_secs(1), outbound_rx.recv())
.await
.expect("timed out waiting for initial auth request")
.expect("missing initial auth request");
let challenge = AuthChallenge {
pubkey: verify_key.to_bytes().to_vec(),
nonce: 42,
};
inbound_tx
.send(UserAgentResponse {
payload: Some(UserAgentResponsePayload::AuthChallenge(challenge.clone())),
})
.await
.unwrap();
let outbound = timeout(Duration::from_secs(1), outbound_rx.recv())
.await
.expect("timed out waiting for challenge solution")
.expect("missing challenge solution");
let UserAgentRequest {
payload: Some(UserAgentRequestPayload::AuthChallengeSolution(solution)),
} = outbound
else {
panic!("expected auth challenge solution");
};
let formatted = format_challenge(challenge.nonce, &challenge.pubkey);
let sig: ed25519_dalek::Signature = solution
.signature
.as_slice()
.try_into()
.expect("signature bytes length");
verify_key
.verify_strict(&formatted, &sig)
.expect("solution signature should verify");
inbound_tx
.send(UserAgentResponse {
payload: Some(UserAgentResponsePayload::AuthOk(AuthOk {})),
})
.await
.unwrap();
drop(inbound_tx);
drop(actor);
}


@@ -1,31 +0,0 @@
Extension Discovery Cache
=========================
This folder is used by `package:extension_discovery` to cache lists of
packages that contain extensions for other packages.
DO NOT USE THIS FOLDER
----------------------
* Do not read (or rely on) the contents of this folder.
* Do not write to this folder.
If you're interested in the lists of extensions stored in this folder use the
API offered by package `extension_discovery` to get this information.
If this package doesn't work for your use-case, then don't try to read the
contents of this folder. It may change, and will not remain stable.
Use package `extension_discovery`
---------------------------------
If you want to access information from this folder.
Feel free to delete this folder
-------------------------------
Files in this folder act as a cache, and the cache is discarded if the files
are older than the modification time of `.dart_tool/package_config.json`.
Hence, it should never be necessary to clear this cache manually; if you find a
need to do so, please file a bug.


@@ -1 +0,0 @@
{"version":2,"entries":[{"package":"arbiter","rootUri":"../","packageUri":"lib/"}]}


@@ -1,172 +0,0 @@
{
"configVersion": 2,
"packages": [
{
"name": "async",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/async-2.13.0",
"packageUri": "lib/",
"languageVersion": "3.4"
},
{
"name": "boolean_selector",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/boolean_selector-2.1.2",
"packageUri": "lib/",
"languageVersion": "3.1"
},
{
"name": "characters",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/characters-1.4.0",
"packageUri": "lib/",
"languageVersion": "3.4"
},
{
"name": "clock",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/clock-1.1.2",
"packageUri": "lib/",
"languageVersion": "3.4"
},
{
"name": "collection",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/collection-1.19.1",
"packageUri": "lib/",
"languageVersion": "3.4"
},
{
"name": "fake_async",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/fake_async-1.3.3",
"packageUri": "lib/",
"languageVersion": "3.3"
},
{
"name": "flutter",
"rootUri": "file:///Users/kaska/.local/share/mise/installs/flutter/3.38.9-stable/packages/flutter",
"packageUri": "lib/",
"languageVersion": "3.8"
},
{
"name": "flutter_lints",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/flutter_lints-6.0.0",
"packageUri": "lib/",
"languageVersion": "3.8"
},
{
"name": "flutter_test",
"rootUri": "file:///Users/kaska/.local/share/mise/installs/flutter/3.38.9-stable/packages/flutter_test",
"packageUri": "lib/",
"languageVersion": "3.8"
},
{
"name": "leak_tracker",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/leak_tracker-11.0.2",
"packageUri": "lib/",
"languageVersion": "3.2"
},
{
"name": "leak_tracker_flutter_testing",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/leak_tracker_flutter_testing-3.0.10",
"packageUri": "lib/",
"languageVersion": "3.2"
},
{
"name": "leak_tracker_testing",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/leak_tracker_testing-3.0.2",
"packageUri": "lib/",
"languageVersion": "3.2"
},
{
"name": "lints",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/lints-6.1.0",
"packageUri": "lib/",
"languageVersion": "3.8"
},
{
"name": "matcher",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/matcher-0.12.17",
"packageUri": "lib/",
"languageVersion": "3.4"
},
{
"name": "material_color_utilities",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/material_color_utilities-0.11.1",
"packageUri": "lib/",
"languageVersion": "2.17"
},
{
"name": "meta",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/meta-1.17.0",
"packageUri": "lib/",
"languageVersion": "3.5"
},
{
"name": "path",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/path-1.9.1",
"packageUri": "lib/",
"languageVersion": "3.4"
},
{
"name": "sky_engine",
"rootUri": "file:///Users/kaska/.local/share/mise/installs/flutter/3.38.9-stable/bin/cache/pkg/sky_engine",
"packageUri": "lib/",
"languageVersion": "3.8"
},
{
"name": "source_span",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/source_span-1.10.2",
"packageUri": "lib/",
"languageVersion": "3.1"
},
{
"name": "stack_trace",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/stack_trace-1.12.1",
"packageUri": "lib/",
"languageVersion": "3.4"
},
{
"name": "stream_channel",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/stream_channel-2.1.4",
"packageUri": "lib/",
"languageVersion": "3.3"
},
{
"name": "string_scanner",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/string_scanner-1.4.1",
"packageUri": "lib/",
"languageVersion": "3.1"
},
{
"name": "term_glyph",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/term_glyph-1.2.2",
"packageUri": "lib/",
"languageVersion": "3.1"
},
{
"name": "test_api",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/test_api-0.7.7",
"packageUri": "lib/",
"languageVersion": "3.5"
},
{
"name": "vector_math",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/vector_math-2.2.0",
"packageUri": "lib/",
"languageVersion": "3.1"
},
{
"name": "vm_service",
"rootUri": "file:///Users/kaska/.pub-cache/hosted/pub.dev/vm_service-15.0.2",
"packageUri": "lib/",
"languageVersion": "3.5"
},
{
"name": "arbiter",
"rootUri": "../",
"packageUri": "lib/",
"languageVersion": "3.10"
}
],
"generator": "pub",
"generatorVersion": "3.10.8",
"flutterRoot": "file:///Users/kaska/.local/share/mise/installs/flutter/3.38.9-stable",
"flutterVersion": "3.38.9",
"pubCache": "file:///Users/kaska/.pub-cache"
}


@@ -1,224 +0,0 @@
{
"roots": [
"arbiter"
],
"packages": [
{
"name": "arbiter",
"version": "0.1.0",
"dependencies": [
"flutter"
],
"devDependencies": [
"flutter_lints",
"flutter_test"
]
},
{
"name": "flutter_lints",
"version": "6.0.0",
"dependencies": [
"lints"
]
},
{
"name": "flutter_test",
"version": "0.0.0",
"dependencies": [
"clock",
"collection",
"fake_async",
"flutter",
"leak_tracker_flutter_testing",
"matcher",
"meta",
"path",
"stack_trace",
"stream_channel",
"test_api",
"vector_math"
]
},
{
"name": "flutter",
"version": "0.0.0",
"dependencies": [
"characters",
"collection",
"material_color_utilities",
"meta",
"sky_engine",
"vector_math"
]
},
{
"name": "lints",
"version": "6.1.0",
"dependencies": []
},
{
"name": "stream_channel",
"version": "2.1.4",
"dependencies": [
"async"
]
},
{
"name": "meta",
"version": "1.17.0",
"dependencies": []
},
{
"name": "collection",
"version": "1.19.1",
"dependencies": []
},
{
"name": "leak_tracker_flutter_testing",
"version": "3.0.10",
"dependencies": [
"flutter",
"leak_tracker",
"leak_tracker_testing",
"matcher",
"meta"
]
},
{
"name": "vector_math",
"version": "2.2.0",
"dependencies": []
},
{
"name": "stack_trace",
"version": "1.12.1",
"dependencies": [
"path"
]
},
{
"name": "clock",
"version": "1.1.2",
"dependencies": []
},
{
"name": "fake_async",
"version": "1.3.3",
"dependencies": [
"clock",
"collection"
]
},
{
"name": "path",
"version": "1.9.1",
"dependencies": []
},
{
"name": "matcher",
"version": "0.12.17",
"dependencies": [
"async",
"meta",
"stack_trace",
"term_glyph",
"test_api"
]
},
{
"name": "test_api",
"version": "0.7.7",
"dependencies": [
"async",
"boolean_selector",
"collection",
"meta",
"source_span",
"stack_trace",
"stream_channel",
"string_scanner",
"term_glyph"
]
},
{
"name": "sky_engine",
"version": "0.0.0",
"dependencies": []
},
{
"name": "material_color_utilities",
"version": "0.11.1",
"dependencies": [
"collection"
]
},
{
"name": "characters",
"version": "1.4.0",
"dependencies": []
},
{
"name": "async",
"version": "2.13.0",
"dependencies": [
"collection",
"meta"
]
},
{
"name": "leak_tracker_testing",
"version": "3.0.2",
"dependencies": [
"leak_tracker",
"matcher",
"meta"
]
},
{
"name": "leak_tracker",
"version": "11.0.2",
"dependencies": [
"clock",
"collection",
"meta",
"path",
"vm_service"
]
},
{
"name": "term_glyph",
"version": "1.2.2",
"dependencies": []
},
{
"name": "string_scanner",
"version": "1.4.1",
"dependencies": [
"source_span"
]
},
{
"name": "source_span",
"version": "1.10.2",
"dependencies": [
"collection",
"path",
"term_glyph"
]
},
{
"name": "boolean_selector",
"version": "2.1.2",
"dependencies": [
"source_span",
"string_scanner"
]
},
{
"name": "vm_service",
"version": "15.0.2",
"dependencies": []
}
],
"configVersion": 1
}


@@ -1 +0,0 @@
3.38.9


@@ -1,11 +0,0 @@
// This is a generated file; do not edit or check into version control.
FLUTTER_ROOT=/Users/kaska/.local/share/mise/installs/flutter/3.38.9-stable
FLUTTER_APPLICATION_PATH=/Users/kaska/Documents/Projects/Major/arbiter/useragent
COCOAPODS_PARALLEL_CODE_SIGN=true
FLUTTER_BUILD_DIR=build
FLUTTER_BUILD_NAME=0.1.0
FLUTTER_BUILD_NUMBER=0.1.0
DART_OBFUSCATION=false
TRACK_WIDGET_CREATION=true
TREE_SHAKE_ICONS=false
PACKAGE_CONFIG=.dart_tool/package_config.json


@@ -1,12 +0,0 @@
#!/bin/sh
# This is a generated file; do not edit or check into version control.
export "FLUTTER_ROOT=/Users/kaska/.local/share/mise/installs/flutter/3.38.9-stable"
export "FLUTTER_APPLICATION_PATH=/Users/kaska/Documents/Projects/Major/arbiter/useragent"
export "COCOAPODS_PARALLEL_CODE_SIGN=true"
export "FLUTTER_BUILD_DIR=build"
export "FLUTTER_BUILD_NAME=0.1.0"
export "FLUTTER_BUILD_NUMBER=0.1.0"
export "DART_OBFUSCATION=false"
export "TRACK_WIDGET_CREATION=true"
export "TREE_SHAKE_ICONS=false"
export "PACKAGE_CONFIG=.dart_tool/package_config.json"