The World Wide Web Consortium (W3C) has officially published Verifiable Credentials 2.0 (VC 2.0) as a web standard. It's a technical milestone that sets a solid foundation for the way trusted data is issued, shared, and verified at scale.
As Brent Zundel, co-chair of the Verifiable Credentials Working Group, put it: "Verifiable credentials are poised to make a significant impact on the way people and systems share data."
For those working on global interoperability projects like the UN Transparency Protocol (UNTP), this is a meaningful step forward.
At Pyx, we've advocated for the use of Verifiable Credentials because they can reduce dependency on centralised databases, improve data provenance, and enable selective disclosure. Now, with VC 2.0, there's a recognised standard to anchor this work — from open supply chains and ESG reporting to digital product passports and beyond.
The bones of VC 2.0: What's inside the new standard
VC 2.0 defines a consistent and flexible way to represent digital credentials (such as product certifications, regulatory declarations, business licenses, or identity attributes) in a format that is:
☑ Verifiable
☑ Privacy-preserving
☑ Decentralised
☑ Tamper-evident
VC 2.0 supersedes VC 1.1, which was already widely referenced in practice, and formalises a technical stack that includes compact JSON-LD, Decentralized Identifiers (DIDs), scalable status lists, and cryptographic proof formats such as JOSE/COSE.
In short, it gives systems a way to issue and verify claims without relying on a single source of truth (i.e. a centralised database), and without exposing more data than necessary.
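In practice, a VC 2.0 credential is a JSON-LD document with a small set of required properties that any verifier can check before doing cryptographic work. A minimal sketch in Python, shaped after the VC Data Model 2.0 (the issuer DID, product URL, and claim text are hypothetical):

```python
import json

# A minimal credential shaped after the W3C VC Data Model 2.0.
# The issuer, subject id, and claim below are hypothetical examples.
credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential", "ProductPassport"],
    "issuer": "did:web:manufacturer.example",
    "validFrom": "2025-01-15T00:00:00Z",
    "credentialSubject": {
        "id": "https://products.example/passports/42",
        "claim": "Product conformity declaration",
    },
}

# Basic shape checks a verifier runs before verifying any proof:
# the v2 context must come first, and the type must include
# "VerifiableCredential".
assert credential["@context"][0] == "https://www.w3.org/ns/credentials/v2"
assert "VerifiableCredential" in credential["type"]

print(json.dumps(credential, indent=2))
```

A real credential would additionally carry an embedded or enveloping proof (e.g. JOSE/COSE), which is what makes the claims tamper-evident.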
Why this matters for UNTP
Verifiable Credentials are the backbone of how UNTP represents traceability, conformity, and product-level sustainability claims. These include product passports, conformity credentials, and supply chain events, each issued as a Verifiable Credential.
According to the UNTP specification, implementations should:
- Use W3C VC Data Model 2.0 in compact JSON-LD form
- Support did:web identifiers and JOSE/COSE proof formats
- Enable status and revocation with the Bitstring Status List
- Render credentials in human-readable ways to support adoption across the supply chain
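The Bitstring Status List mentioned above works by packing one status bit per credential into a compressed bitstring that verifiers fetch and index into, so revocation checks need no per-credential callback to the issuer. A rough sketch of the encode/check round trip (the actual specification adds details omitted here, such as a multibase prefix on the encoded list and a minimum list size):

```python
import base64
import gzip

def make_status_list(bits: list[int]) -> str:
    """Pack bits MSB-first within each byte, gzip, and base64url-encode.

    A simplified sketch of Bitstring Status List encoding; the spec
    additionally prefixes the result with a multibase marker.
    """
    buf = bytearray((len(bits) + 7) // 8)
    for i, bit in enumerate(bits):
        if bit:
            buf[i // 8] |= 0x80 >> (i % 8)
    return base64.urlsafe_b64encode(gzip.compress(bytes(buf))).decode()

def is_revoked(encoded_list: str, index: int) -> bool:
    """Check the status bit at `index` in an encoded status list."""
    raw = gzip.decompress(base64.urlsafe_b64decode(encoded_list))
    return bool(raw[index // 8] & (0x80 >> (index % 8)))

# Issuer publishes a list with entry 3 revoked; verifiers check bits.
encoded = make_status_list([0, 0, 0, 1, 0, 0, 0, 0])
print(is_revoked(encoded, 3))  # True
print(is_revoked(encoded, 5))  # False
```

Because the whole list compresses to a few kilobytes, a verifier learns only the one bit it looks up, which is also what gives the mechanism its privacy properties (herd privacy across all credentials sharing a list).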
The VC 2.0 standard provides a clear, recommended profile for these credentials and a common foundation that reduces guesswork and fragmentation. Because the W3C is the body responsible for the standards on which the World Wide Web is built, VC 2.0 carries the authority needed to support policy, procurement, and infrastructure-level adoption.
Besides helping developers, it will allow governments, industry bodies, and global trading partners to align on how to issue and verify evidence without centralisation, without vendor lock-in, and without needing to reinvent the wheel.
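The did:web identifiers recommended by UNTP illustrate how verification can work without a central registry: the identifier itself determines where the issuer's DID document (containing their public keys) is fetched from. A sketch of that transformation, following the did:web method specification (domain names are hypothetical):

```python
from urllib.parse import unquote

def did_web_to_url(did: str) -> str:
    """Transform a did:web identifier into the HTTPS URL of its
    DID document, per the did:web method specification."""
    assert did.startswith("did:web:")
    # Colons after the method separate path segments; percent-encoded
    # characters (e.g. %3A for a port) are decoded per segment.
    parts = [unquote(p) for p in did[len("did:web:"):].split(":")]
    host, path = parts[0], parts[1:]
    if path:
        return f"https://{host}/{'/'.join(path)}/did.json"
    # No path: the document lives at the well-known location.
    return f"https://{host}/.well-known/did.json"

print(did_web_to_url("did:web:issuer.example"))
# https://issuer.example/.well-known/did.json
print(did_web_to_url("did:web:registry.example:suppliers:acme"))
# https://registry.example/suppliers/acme/did.json
```

Any party that can serve a JSON file over HTTPS can therefore act as an issuer, which is what makes the model workable across governments, industry bodies, and trading partners of very different sizes.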
Now that Verifiable Credentials are standardised — what's next?
The publication of VC 2.0 removes a key obstacle to broader uptake. With a formal standard in place, more organisations can move forward with clearer guidance and greater confidence.
Crucially, VC 2.0 provides the baseline needed to evaluate real-world systems and processes against an accepted model. As a result, familiar questions are becoming easier to address:
- Do existing credential models and data formats align with VC 2.0?
- Can credentials issued in one environment be reliably verified in another?
- Are implementation decisions about credential issuance, verification, and lifecycle management being made with scalability, portability, and openness in mind?
These are the kinds of questions Pyx helps organisations navigate. With the standard now in place, they're becoming more actionable across the broader digital trust ecosystem.
🔗 Read the W3C announcement here: World Wide Web Consortium Verifiable Credentials 2.0 Press Release
Join the Pyx Community!
- 📢 Chat on Pyx Zulip – Join real-time conversations with the community
- 💬 Join the Pyx Forums – Engage in discussions about the UNTP and related topics