Non-SSI Identity Standards



The OpenID Foundation is pleased to share its new whitepaper, “Open Banking, Open Data and Financial-Grade APIs”. The paper documents the international movement toward Open Banking, Open Finance, and secure, consent-driven access to user data. It describes the experience of the OpenID Foundation, and in particular its Financial-Grade API (FAPI) Working Group, with Open Banking ecosystems internationally.

  1. An identity standard’s adoption is driven by the value of the reliability, repeatability, and security of its implementations.
  2. A standard’s value can be measured by the number of instances of certified technical conformance extant in the market.
  3. Certified technical conformance is necessary but insufficient for global adoption.
  4. Adoption at scale requires widespread awareness, ongoing technical improvement, and an open and authoritative reference source.
  5. When Libraries/Directories/Registries act as authoritative sources, they amplify awareness, extend adoption, and promote certification.
  6. Certified technical conformance importantly complements legal compliance; together they optimize interoperability.
  7. Interoperability enhances security, contains costs and drives profitability.

GAIN is marked by cross-sector, crowd-sourced, open, global due diligence. GAIN’s self-organized participants are actively seeking evidence that disconfirms the GAIN hypothesis.

Board participation requires a substantial investment of time and energy. It is a volunteer effort that should not be undertaken lightly. Should you be elected, expect to be called upon to serve both on the board and on its committees. You should have your employer’s agreement to attend two or more in-person board meetings a year, which are typically collocated with important identity conferences around the world.

This specification defines event types and their contents based on the SSE Framework that are required to implement Risk Incident Sharing and Coordination.





Secure QR Code



I think what happens is that a first blank node is created for the proof, and since that node has @container @graph, instead of being able to trace the relationships directly from credential to proof to verification method…

Each proof is being treated as a disjoint subgraph, and the relationship is not being preserved during import… […]

I suspect this is solvable with a more complicated graph config:

But I wonder if we might correct this behavior in VC Data Model 2.0, such that RDF representations don’t have this odd behavior when imported as labeled property graphs. […]
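The behavior described above can be sketched with a small, purely illustrative JSON-LD fragment (the term definition below is an assumption for illustration, not the actual VC context): a term defined with `"@container": "@graph"` places its values into their own named graphs on expansion to RDF, which is why a naive labeled-property-graph import that only walks the default graph sees each proof as a disconnected subgraph.

```python
import json

# Hypothetical context (NOT the real VC context): "proof" is declared as a
# graph container, so each proof value becomes its own named graph.
credential = {
    "@context": {
        "proof": {
            "@id": "https://w3id.org/security#proof",
            "@container": "@graph",
        }
    },
    "@id": "urn:example:credential",
    "proof": {
        "@id": "urn:example:proof-1",
        "https://w3id.org/security#verificationMethod": {"@id": "urn:example:key-1"},
    },
}

# On conversion to RDF, the credential points at a blank node that *names*
# the proof graph, while the proof -> verificationMethod triples live inside
# that named graph rather than in the default graph. An importer that only
# traverses the default graph therefore loses the credential-to-proof path.
print(json.dumps(credential, indent=2))
```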

[…] answer on the GitHub issue for the standard, I raised it here:

The goal of this group is to standardize the way many of us digitally sign Verifiable Credentials. This working group has been about a decade in the making (some would say two decades) and is important for achieving things like BBS+ selective disclosure, as well as for standardizing the way we format Verifiable Credentials before they are digitally signed.

The announcement is here

The proposed charter is here

I’ve instrumented the rdf-canonicalize library so I can inspect the order of execution, and it appears that what differs between my implementation and the JavaScript one is the order of the permutations. The spec doesn’t say how the permutations should be ordered, and my intuition is that the order does indeed matter - though I’m happy to be corrected if I’m wrong.
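One plausible source of the divergence, sketched under the assumption that both implementations enumerate candidate blank-node orderings with a permutations generator: Python’s `itertools.permutations` (and many equivalents) emits tuples in an order determined by the *input* order, not by value, so two implementations fed the same set of blank nodes in different orders will enumerate permutations differently. If an algorithm short-circuits or breaks ties based on the first acceptable candidate, that enumeration order becomes observable.

```python
from itertools import permutations

# Illustrative blank-node identifiers, deliberately out of sorted order.
nodes = ["_:b2", "_:b0", "_:b1"]

# The first permutation emitted is simply the input order...
first_as_given = next(iter(permutations(nodes)))

# ...so normalizing (e.g. sorting) the input first makes the enumeration
# order independent of how the nodes happened to arrive.
first_if_sorted = next(iter(permutations(sorted(nodes))))

print(first_as_given)   # ('_:b2', '_:b0', '_:b1')
print(first_if_sorted)  # ('_:b0', '_:b1', '_:b2')
```

Sorting the inputs before permuting is one way to make two independent implementations enumerate identically, regardless of whether the final result should in principle be order-independent.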

So, here are my questions:

Award recipients will be studying, researching, interning, or working in a field relevant to one or more OpenID Foundation working groups and consistent with the Foundation’s mission. The recipients will also be invited to participate in Foundation breakout meetings at the European Identity Conference and Identiverse, which will provide exposure both to the Foundation’s business and to leading technologists.

GAIN was a big topic of discussion

GAIN: The Global Assured Identity Network @OIX_Nick and @gailhodges on the main stage.


[…] just like trade unions helped the working class fight for their rights during the industrial revolution. In this panel session, we will discuss the enablers of such a different approach and the requirements to actually be successful.

The OpenID Foundation formed the “Shared Signals and Events” (SSE) Working Group as a combination of the previous OpenID RISC working group and an informal industry group that was focused on standardizing Google’s CAEP proposal. These represented two distinct applications of the same underlying mechanism of managing asynchronous streams of events. Therefore the SSE Framework is now proposed to be a standard for managing such streams of events for any application, not just CAEP and RISC. In effect, it is a standard for generalized Webhooks.
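The event streams described above carry Security Event Tokens (SETs) in the RFC 8417 shape. A minimal sketch of a SET claims set follows; the issuer/audience URLs are assumptions for illustration, and in practice the payload would be signed as a JWT before being pushed to (or polled by) a receiver. The CAEP-style event-type URI is shown only as an example of the `events` claim structure.

```python
import json
import time
import uuid

# A minimal Security Event Token (SET) claims set, per the RFC 8417 layout
# that the SSE Framework builds on. URLs and subject values are illustrative.
set_claims = {
    "iss": "https://transmitter.example.com",  # assumed transmitter
    "jti": str(uuid.uuid4()),                  # unique event identifier
    "iat": int(time.time()),                   # issued-at timestamp
    "aud": "https://receiver.example.com",     # assumed receiver
    "events": {
        # One event per SET; the key is the event-type URI, the value is
        # that event's payload (here, a CAEP-style session revocation).
        "https://schemas.openid.net/secevent/caep/event-type/session-revoked": {
            "subject": {
                "subject_type": "iss_sub",
                "iss": "https://idp.example.com",
                "sub": "user-1234",
            },
            "event_timestamp": int(time.time()),
        }
    },
}

print(json.dumps(set_claims, indent=2))
```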

I’ve defined an Authentication Method Reference (AMR) value called “pop” to indicate that Proof-of-possession of a key was performed. Unlike the existing “hwk” (hardware key) and “swk” (software key) methods […] Among other use cases, this AMR method is applicable whenever a WebAuthn or FIDO authenticator is used.
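A minimal sketch (not from the announcement itself) of how a relying party might check a decoded ID Token’s `amr` claim for the “pop” value; the claim values below are illustrative:

```python
def key_proof_performed(id_token_claims: dict) -> bool:
    """Return True if the token's AMR list asserts proof-of-possession ("pop")."""
    # "amr" is optional, so default to an empty list when absent.
    return "pop" in id_token_claims.get("amr", [])

# Illustrative decoded claims, e.g. after a WebAuthn/FIDO authentication.
claims = {"sub": "user-1234", "amr": ["pop", "mfa"]}
print(key_proof_performed(claims))                  # True
print(key_proof_performed({"amr": ["pwd"]}))        # False
```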

The official voting period will be between Tuesday, February 1, 2022 and Tuesday, February 8, 2022, following the 45-day review of the specifications.

The goal of this whitepaper is to inform and educate readers about the work on the OpenID for Verifiable Credentials (OpenID4VC) family of specifications. It addresses use cases referred to as Self-Sovereign Identity, Decentralized Identity, or User-Centric Identity.