feat!: World ID 3.0 compatibility #44
Conversation
@codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 01af092d1b
```ts
// Map requests to WASM format
const reqs = requests.map((req) => ({
  credential_type: req.credential_type,
  signal: typeof req.signal === 'string' ? req.signal : undefined,
  signal_bytes: req.signal_bytes,
```
Preserve ABI-encoded request signals
In createClient you only forward req.signal when it is a string. RequestConfig.signal still allows AbiEncodedValue, but those values are dropped here, so the request is sent without a signal. With verification_level/top-level signal removed in this commit, there is no remaining path to pass ABI‑encoded signals, which means proofs will be bound to an empty signal and on‑chain verification that expects the ABI signal will fail. Consider converting AbiEncodedValue to signal_bytes (or rejecting non‑string signals unless signal_bytes is provided).
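One way to address this without widening the WASM interface is to make non-string signals an explicit error unless `signal_bytes` was supplied. The sketch below is illustrative only: the `AbiEncodedValue` and `RequestConfig` shapes are assumptions standing in for the real IDKit types, and `toWasmRequest` is a hypothetical helper, not the actual `createClient` code.

```typescript
// Hypothetical stand-in for IDKit's AbiEncodedValue; field names are illustrative.
type AbiEncodedValue = { types: string[]; values: unknown[] }

// Assumed shape of a single credential request config.
type RequestConfig = {
  credential_type: string
  signal?: string | AbiEncodedValue
  signal_bytes?: Uint8Array
}

// Forward string signals as-is; reject ABI-encoded signals unless the caller
// already supplied raw signal_bytes, so no signal is ever silently dropped.
function toWasmRequest(req: RequestConfig) {
  if (req.signal !== undefined && typeof req.signal !== 'string' && !req.signal_bytes) {
    throw new Error('ABI-encoded signals must be provided as signal_bytes')
  }
  return {
    credential_type: req.credential_type,
    signal: typeof req.signal === 'string' ? req.signal : undefined,
    signal_bytes: req.signal_bytes,
  }
}
```

Failing loudly here keeps proofs from being bound to an empty signal; converting `AbiEncodedValue` to bytes in the SDK would be the more permissive alternative.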
will address in another PR
```rust
pub fn from_credential_types(types: &[CredentialType]) -> Self {
    // Check in reverse priority order (least restrictive first)
    if types.contains(&CredentialType::Device) {
        Self::Device
    } else if types.contains(&CredentialType::Face) {
```
Avoid dropping face when deriving legacy level
from_credential_types chooses Device whenever any device request is present. For mixed request sets like { face, device }, this derives verification_level=Device, which in World App 3.0 allows only device or orb. That silently excludes face even though the request set allowed it, so legacy clients will reject valid face proofs. Consider detecting incompatible mixes (face+device, document+device, etc.) and either error/warn or pick a level that preserves all requested credential types when possible.
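The incompatible-mix detection suggested above could look roughly like this. This is a TypeScript sketch of the logic (the actual code is Rust), and the `LEGACY_ADMITS` mapping of which credential types each legacy level admits in World App 3.0 is an assumption for illustration, not taken from the source.

```typescript
type CredentialType = 'orb' | 'secure_document' | 'document' | 'face' | 'device'

// Least-restrictive-first priority, mirroring the reverse order in from_credential_types.
const PRIORITY: CredentialType[] = ['device', 'face', 'document', 'secure_document', 'orb']

// Assumed mapping: which requested types each legacy level still admits.
const LEGACY_ADMITS: Record<CredentialType, CredentialType[]> = {
  device: ['device', 'orb'],
  face: ['face', 'orb'],
  document: ['document', 'secure_document', 'orb'],
  secure_document: ['secure_document', 'orb'],
  orb: ['orb'],
}

// Derive the legacy level, but fail loudly when the requested set cannot be
// represented by any single legacy level (e.g. face + device).
function deriveLegacyLevel(types: CredentialType[]): CredentialType {
  const level = PRIORITY.find((t) => types.includes(t))
  if (!level) throw new Error('no credential types requested')
  const dropped = types.filter((t) => !LEGACY_ADMITS[level].includes(t))
  if (dropped.length > 0) {
    throw new Error(`legacy level "${level}" would drop requested types: ${dropped.join(', ')}`)
  }
  return level
}
```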
not accurate, this will be refactored
```rust
///
/// Priority (most to least restrictive): orb > `secure_document` > document > face > device
//TODO: This is not 100% accurate, will refine this in a followup PR, but for now it "works".
fn derive_legacy_fields(requests: &[Request]) -> (VerificationLevel, String) {
```
This is not final, but basically, I'm picking the minimum credential and hashing the signal.
As I think about this more, I think the better path to migration is to create four preset types: OrbMin, DocMin, SecureDocMin, DeviceMin, all under CredentialPreset.Legacy.
If you request anything else, we don't support backwards compatibility. After a certain time they can choose to request a new credential. Otherwise there are a ton of edge cases here that I think will be confusing to anyone who tries.
What if you have a signal for 4 proofs and you actually want them all? You don't want to just take one proof. What if you have multiple requests with the same credential type? Say I want "any secure with face auth + orb face", or "orb + secure with face auth".
I feel like a bimodal approach here is actually the cleanest:
you use our predefined legacy types that you know will work like what you had before. By the time this rolls out formally, most people should be on the new type. People will need to update the app anyway to get the new authenticators.
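The bimodal API described above might be sketched as a discriminated union: either one of the four fixed legacy presets, or the new request-based config. The preset names follow the comment; the config shapes and the `toLegacyLevel` helper are hypothetical.

```typescript
// Proposed fixed legacy presets (names from the discussion above).
type LegacyPreset = 'OrbMin' | 'DocMin' | 'SecureDocMin' | 'DeviceMin'

// Assumed shape of a new-style credential request.
type CredentialRequest = { credential_type: string; face_auth?: boolean }

// Bimodal config: legacy preset OR new request set, never both.
type VerifyConfig =
  | { preset: LegacyPreset }          // backwards-compatible path
  | { requests: CredentialRequest[] } // new World ID 3.0 path

// Only presets translate to the old verification_level field;
// arbitrary request sets are never down-converted.
function toLegacyLevel(config: VerifyConfig): string | undefined {
  if (!('preset' in config)) return undefined
  const map: Record<LegacyPreset, string> = {
    OrbMin: 'orb',
    DocMin: 'document',
    SecureDocMin: 'secure_document',
    DeviceMin: 'device',
  }
  return map[config.preset]
}
```

This keeps the edge cases (mixed types, duplicate credential types, multi-proof signals) entirely out of the compatibility path.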
I feel like this roughly outlines what I was thinking. It simplifies the code and our mental model greatly
agree, i wanted to get this merged and come back to refactor
```rust
action_description,
requests: requests.clone(),
constraints,
signal,
```
I think this might be breaking too, right? In the old IDKit we hash the signal; in the new one we rely on the authenticator to do so.
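For context, legacy IDKit hashes the signal to a field element before sending it (keccak256 of the encoded signal, right-shifted by 8 bits so it fits the SNARK scalar field). The sketch below is a minimal illustration of that scheme, not the actual IDKit code; it takes the keccak function as a parameter to stay self-contained, where real code would use a library such as viem or ethers.

```typescript
// hashToField sketch: interpret the 32-byte keccak digest as a big-endian
// integer, then shift right by 8 bits to fit the field.
function hashToField(
  signalBytes: Uint8Array,
  keccak256: (b: Uint8Array) => Uint8Array,
): bigint {
  const digest = keccak256(signalBytes)
  let n = 0n
  for (const byte of digest) n = (n << 8n) | BigInt(byte)
  return n >> 8n
}
```

If the authenticator now performs this step, old verifiers that expect a pre-hashed signal and new ones that expect the raw signal cannot both be satisfied by the same payload, which is why this looks breaking.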
```ts
action: ACTION,
verification_level: verificationLevel,
signal: 'demo-signal-' + Date.now(),
requests: requests,
```
Nit: not sure requests is the right word here. Maybe challenges?
```ts
import type { IDKitConfig, ISuccessResult, IErrorState } from '../'

export type WidgetConfig = Pick<IDKitConfig, 'app_id' | 'action' | 'signal' | 'requests' | 'constraints' | 'bridge_url' | 'partner' | 'action_description'> & {
```
We need to figure out how to migrate partner since it is used to trigger the Razer flow right now
```ts
  face_auth?: boolean
}>
/** Credential requests - at least one required */
requests: RequestConfig[]
```
Nit: Constraints below
```ts
await initIDKit()

// Validate requests
if (!requests || requests.length === 0) {
```
I feel like we should just decide on the policy/requests question we've been talking about; it's a decision that's one day away.
```ts
let session: WasmModule.Session

if (requests && requests.length > 0) {
  if (verification_level) {
```
I do think this warn is good, to flag when they try to request a type of credential that won't be backwards compatible.
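The warning discussed here might be sketched like this: when `verification_level` is set alongside requests, flag any request whose credential type can't be expressed through the legacy field instead of silently narrowing. The `LEGACY_COMPATIBLE` set and the helper name are assumptions for illustration.

```typescript
// Assumed set of credential types representable via the legacy verification_level.
const LEGACY_COMPATIBLE = new Set(['orb', 'secure_document', 'document', 'face', 'device'])

// Returns the incompatible types so callers can decide whether to warn or error.
function warnIfNotBackwardsCompatible(requests: { credential_type: string }[]): string[] {
  const incompatible = requests
    .map((r) => r.credential_type)
    .filter((t) => !LEGACY_COMPATIBLE.has(t))
  if (incompatible.length > 0) {
    console.warn(
      `These credential types are not backwards compatible with verification_level: ${incompatible.join(', ')}`,
    )
  }
  return incompatible
}
```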
```rust
///
/// This is the credential type that determines the level (the least restrictive one).
#[must_use]
pub const fn primary_credential(&self) -> CredentialType {
```
Not a fan of this
This PR:
- Adds `verification_level` and `signal` to the Bridge payload, to keep compatibility with old World App versions and World ID 3.0
- Removes `VerificationLevel` from the SDKs in favor of requests and credentials.

Try this yourself by: