Standardizing Quantum Technologies: Global Efforts and the Rise of ETSI’s Quantum Committee

Global efforts to standardize quantum technologies are expanding. ETSI's new TC QT leads work spanning quantum communications and networking, QKD, quantum sensing, QRNGs, and security testing.


Quantum computing and related quantum technologies are advancing rapidly, prompting a global push for standards to ensure these new systems are secure, interoperable, and widely adoptable. Around the world, quantum communication networks are moving from labs into real deployment – from metropolitan QKD (quantum key distribution) links in Cambridge and Tokyo to a 2,000 km quantum backbone between Beijing and Shanghai, with China’s Micius satellite extending quantum-encrypted keys globally. This surge in activity has created a pressing need for industry standards to guarantee that quantum devices and protocols can work together seamlessly. In parallel, the looming threat of quantum computers to current encryption has galvanized efforts on quantum-safe cryptography, ensuring today’s data remains secure in the quantum era. In this overview, we examine major standardization initiatives across the globe – with a special focus on the newly established ETSI Technical Committee on Quantum Technologies (TC QT) – covering security and beyond: from post-quantum encryption and network protocols to quantum sensing, computing benchmarks, and integration with classical systems.

ETSI Technical Committee on Quantum Technologies (TC QT)

In late 2025, the European Telecommunications Standards Institute (ETSI) launched a dedicated Technical Committee on Quantum Technologies (TC QT) to consolidate ongoing standards work and address future needs. This marks a significant milestone, building on ETSI’s pioneering efforts (ETSI formed the first QKD working group back in 2008). The new TC QT has a broad scope spanning multiple domains of quantum tech. Its primary objective is to develop specifications for quantum communications and networks across sectors – essentially laying the foundation for secure global quantum-enhanced communication infrastructure. Notably, TC QT aims to align with European strategic initiatives like the EuroQCI (European Quantum Communication Infrastructure), reinforcing ETSI’s role in Europe’s quantum strategy. While ETSI’s standards are global in applicability, EuroQCI is seen as a key target for early adoption of these quantum standards.

Key areas of ETSI TC QT’s activity include:

  • Quantum Communications: Secure data transmission using quantum principles (e.g. leveraging superposition and entanglement for encryption).
  • Quantum Networking: Connecting quantum devices and systems for distributed computing, quantum cryptography, and advanced sensing applications.
  • Quantum Sensing: Enabling ultra-precise measurements within quantum communication networks, to improve network timing and detection capabilities.
  • Satellite Quantum Communications: Standardizing classical beacon signals (wavelengths, channels) and interfaces for space-based quantum links, including satellite QKD systems, and defining secure operational procedures for optical ground stations. (Coordination with space standards bodies like ECSS is explicitly emphasized.)
  • Quantum Random Number Generators (QRNGs): Defining requirements for generators of true randomness from quantum processes (photon detection, electron spin, etc.), crucial for cryptography, simulations, and secure communications.
  • Quantum Security & Vulnerability Testing: Establishing methodologies to assess hardware vulnerabilities in quantum tech (e.g. QKD devices), validate security proofs, and evaluate susceptibility to side-channel attacks. This means creating standard testing procedures for quantum cryptographic equipment – a collaborative effort between theoretical security experts and experimental quantum physicists.

ETSI’s new committee is not starting from scratch; it leverages ongoing work by ETSI’s Quantum Key Distribution (QKD) Industry Specification Group and Quantum-Safe Cryptography initiatives. For example, ETSI’s QKD group has already published specs on QKD module interfaces, component characterization, and an API for delivering quantum keys to applications. Going forward, TC QT will take such work to the next level and broaden it. It plans to engage other standards organizations internationally and even open-source communities to promote alignment and avoid duplicated effort. By integrating research outcomes (e.g. from EU Horizon Europe programs) and collaborating with bodies like the European Commission, ETSI’s TC QT is positioned to ensure its standards meet global, regional, and national requirements. In short, the formation of TC QT underscores a concerted move to create a comprehensive framework of quantum tech standards, from Europe to the world.
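
To make the key-delivery idea concrete, below is a minimal Python sketch of an application pulling symmetric keys from a key-management entity (KME) over a REST interface in the style of ETSI GS QKD 014. The hostname, SAE identifier, and certificate file names are placeholders, and the exact request and response schema should be taken from the published specification rather than this sketch.

```python
# Minimal sketch: fetch keys from a QKD key-management entity (KME) over a
# REST interface in the style of ETSI GS QKD 014. The URL, SAE ID, and
# certificate paths are placeholders, not values from the specification.
import base64
import requests

KME_URL = "https://kme.example.net/api/v1/keys"  # hypothetical KME endpoint
SLAVE_SAE_ID = "sae-receiver-01"                 # ID of the peer application

def get_enc_keys(number: int = 1, size_bits: int = 256) -> list[dict]:
    """Request fresh symmetric keys destined for the given peer (slave) SAE."""
    resp = requests.get(
        f"{KME_URL}/{SLAVE_SAE_ID}/enc_keys",
        params={"number": number, "size": size_bits},
        cert=("client.crt", "client.key"),       # mutual TLS toward the KME
        verify="kme-ca.pem",
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["keys"]                   # [{"key_ID": ..., "key": <base64>}]

for entry in get_enc_keys():
    raw = base64.b64decode(entry["key"])         # key material for, e.g., AES-256
    print(entry["key_ID"], len(raw) * 8, "bits")
```

In the GS QKD 014 model, the returned key_ID is what the two applications share, so that the peer can retrieve the matching key from its own KME.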

Post-Quantum Cryptography: Preparing Encryption for the Quantum Era

One of the most urgent standardization efforts globally is post-quantum cryptography (PQC) – new cryptographic algorithms designed to withstand attacks by quantum computers. Traditional public-key algorithms (RSA, ECC) could be broken by a future quantum computer, so the race is on to deploy quantum-resistant alternatives. The U.S. National Institute of Standards and Technology (NIST) has led a multi-year international competition to select and standardize these algorithms. In August 2024, NIST published the first three official PQC standards: FIPS 203, 204, and 205, specifying ML-KEM (derived from CRYSTALS-Kyber, for key encapsulation), ML-DSA (from CRYSTALS-Dilithium, for digital signatures), and SLH-DSA (from SPHINCS+, an alternate stateless hash-based signature). This milestone – “quantum-secure” encryption algorithms approved for general use – is driving global adoption of PQC. Major tech companies, telecom providers, and government agencies have announced support for NIST’s PQC standards, and many countries are expected to follow NIST’s selections in their own cryptographic guidelines.

NIST isn’t stopping there: a second cohort of PQC algorithms is under evaluation for standardization, including candidates like HQC (a code-based encryption scheme), which was selected for further standardization in 2025. Internationally, coordination is underway so that PQC standards are consistent. For instance, the ISO/IEC Joint Technical Committee 1 has been working on ISO/IEC 23837, a framework for security requirements and evaluation methods for QKD systems that also aligns with common criteria (ISO/IEC 15408) – showcasing how classical security evaluation standards are being extended to quantum tech. Similarly, Europe’s standards bodies and cybersecurity agencies (ENISA, BSI, etc.) have issued PQC migration recommendations to ensure a smooth transition. ETSI itself has run annual Quantum-Safe Cryptography (QSC) workshops and produced a well-regarded “Quantum Safe Cryptography and Security” white paper, which outlines approaches for deploying PQC alongside quantum key distribution.

Beyond just defining algorithms, standards for integrating PQC into existing protocols and systems are crucial. This is where the Internet Engineering Task Force (IETF) has taken the lead: the IETF is updating protocols like TLS, IPsec, and X.509 certificates to support PQC algorithms. For example, the IETF has defined hybrid key exchange modes for TLS 1.3 (combining classical and post-quantum keys), and its IPsec work has standardized a mechanism to mix pre-shared keys in IKEv2 (RFC 8784) for post-quantum security. The IEEE is also contributing: IEEE P1943 aims to define a post-quantum optimized version of network security protocols, and IEEE P3172 provides a recommended practice for PQC migration, outlining multi-step processes to implement hybrid (classic + quantum-resistant) cryptography in enterprise environments. Taken together, these efforts ensure that PQC algorithms can be interoperably implemented in everything from web browsers and VPNs to government communications systems. The end goal is a quantum-safe cryptographic infrastructure where even if a large-scale quantum computer arrives, our confidential data – whether financial transactions, health records, or state secrets – remains secure.
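
To illustrate the hybrid pattern the IETF drafts describe, the sketch below combines a classical X25519 exchange with a post-quantum shared secret by concatenating the two secrets and deriving the final key through HKDF. Because Python ML-KEM support is still uneven across libraries, the post-quantum half is stubbed with random bytes; a real implementation would substitute an actual FIPS 203 encapsulation.

```python
# Sketch of the concatenate-then-derive "hybrid" pattern from the IETF TLS 1.3
# hybrid key-exchange work: run a classical ECDH and a post-quantum KEM, join
# both shared secrets, and feed them through a KDF. The ML-KEM half is stubbed
# with random bytes here; a real implementation would use a FIPS 203 library.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: ephemeral X25519 key agreement.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
ecdh_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half: stand-in for an ML-KEM encapsulated shared secret.
mlkem_secret = os.urandom(32)  # placeholder; a real KEM yields (ciphertext, secret)

# Hybrid combination: concatenate both secrets, then derive the session key.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-kex-demo",
).derive(ecdh_secret + mlkem_secret)
print(session_key.hex())
```

The security argument is that the derived key stays safe as long as either component resists attack, which is exactly the property hybrid deployments are after.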

Quantum Communication and QKD: Towards a Global Quantum Network

On another front, standards are emerging to govern quantum communication protocols and networks, particularly quantum key distribution. QKD uses quantum physics to exchange encryption keys with information-theoretic security (any eavesdropping introduces disturbances that can be detected). While PQC is a software solution, QKD requires new hardware and network architecture, so standardization is vital to make it practical on a large scale. In recent years we’ve seen a flurry of QKD network deployments (as noted, across Europe, Asia, and even satellites), and standards bodies have moved quickly to support them.

ETSI’s QKD group has been a trailblazer in this area. It developed some of the first QKD interface standards – for example, defining how a QKD device delivers secret keys to applications via a standard API. It has also published specifications on QKD module security, optical component characterization, and protocols to guard against known attacks (like Trojan-horse attacks on QKD systems). These ETSI Group Specifications have laid groundwork for interoperable QKD equipment, such that, say, a QKD transmitter from company A and a receiver from company B can operate together under common protocols. Indeed, industry trials have demonstrated multi-vendor QKD interoperability: in one 2023 European trial, Orange and Adtran achieved a 400 Gbps data transmission secured by QKD, using equipment from different providers and citing the importance of open standards like the ETSI key delivery interface to combine classical encryption with QKD keys. Such real-world tests underscore how standard interfaces help integrate quantum links into today’s telecom networks.

At the International Telecommunication Union (ITU), efforts have focused on network-level standards for quantum communications. The ITU-T has published the Y.3800-series Recommendations on Quantum Key Distribution Networks (QKDN). ITU-T Y.3800, issued in late 2019, provides one of the first comprehensive frameworks for networks supporting QKD. It outlines how to incorporate QKD into existing telecom infrastructure, defining components like QKD nodes, key management systems, and their interfaces with classical network layers. Subsequent ITU standards and technical reports delve into specifics: for example, QKDN architectures, key management functional models, and even a standardization roadmap for quantum networking and services. There are ongoing studies in the ITU on satellite-based QKD as well, given the interest in global quantum-secure links – a recent ITU technical report (TR.SQKDN, 2025) surveys satellite QKD experiments (like China’s Micius) and the need for standards to handle things like interoperable ground stations and trusted nodes. All this work in the ITU ensures that as telcos and nations roll out quantum networks, they have a common reference architecture and agreed-upon protocols, much as today’s internet relies on standardized TCP/IP and telecom relies on standardized optical network interfaces.

Global consensus is also being built through ISO/IEC JTC 3 – Quantum Technologies, the new joint technical committee formed in 2024 to coordinate quantum tech standards internationally. Within JTC 3, there is a dedicated working group on “Quantum secure communication”, which covers QKD and related security aspects. Notably, in 2024 ISO released its first QKD security standard, which defines a comprehensive framework for evaluating the security of QKD systems. This standard (aligned to the ISO/IEC 15408 Common Criteria approach) lays out common security functional requirements covering everything from the quantum optical components to the conventional network elements of QKD setups. By having ISO/IEC endorse such criteria, vendors of QKD systems can design to an international “gold standard” for QKD security, and independent labs can certify QKD devices against these benchmarks. This is a big step toward assurance and trust in quantum cryptography: buyers and users will be able to rely on certified claims (e.g. that a QKD device meets certain tamper-resistance or side-channel mitigation requirements, as per ISO specs).

Beyond QKD, quantum networks of the future may enable more than just key distribution – e.g. distributed quantum computing or sensor networks – but these are still nascent. However, pre-standardization work is happening. The IETF Quantum Internet Research Group (QIRG), for instance, has been discussing architectures for a quantum internet (entanglement swapping, quantum repeaters, etc.), though formal standards are likely a few years away. In the meantime, today’s focus remains on interoperability frameworks for QKD integration into classical networks: things like standardized handover between quantum links and classical routers, and common formats for quantum keys so they can be used by classical encryption applications. ETSI’s ongoing work on a software-defined networking (SDN) control interface for QKD is one example, aiming to let network controllers manage QKD equipment much as they do classical optical gear. In summary, the standardization of quantum communications is bridging two worlds – marrying novel quantum physics-based tech with the proven frameworks of classical telecom. The outcome will be networks where quantum devices (whether ground-based or satellite) plug-and-play securely, handing off keys and quantum signals across a global quantum-ready infrastructure.

Standardizing Quantum Computing Performance and Hardware

Moving from communications to computation: quantum computing standards are crucial to foster a healthy ecosystem as hardware and software mature. Unlike classical computing, which had decades to develop common architectures (like x86) and benchmarks, quantum computing is in a formative stage with diverse approaches. Recognizing this, international bodies have begun laying down foundational standards for quantum computing terminology, performance metrics, and even hardware architectures.

A significant step was the publication of ISO/IEC 4879:2024, a vocabulary standard that defines key terms in quantum computing. This helps ensure that engineers, researchers, and vendors use a common language – whether discussing “qubits”, “quantum gates”, or “entanglement”. IEEE has a parallel effort (IEEE P7130) for quantum technology definitions, indicating the strong demand for clarity in this new field. Consistent terminology might seem basic, but it underpins all other standards; for instance, when specifying a performance metric, all parties must agree on what constitutes a “quantum logic operation” or an “error rate” in comparable terms.

When it comes to performance benchmarking, the goal is to move beyond proprietary metrics (like IBM’s well-known “Quantum Volume”) to standardized ones that allow apples-to-apples comparison of different quantum processors. The IEEE Standards Association has a project P7131 to define quantum computing performance metrics and benchmarking criteria. This standard aims to cover a range of metrics – possibly including gate fidelity, circuit depth, error rates, and algorithmic benchmarks – so that the community can regularly evaluate progress in quantum hardware and software using agreed methods. Similarly, ISO/IEC JTC 3 has an ad-hoc group on “Quantum terminology and metrics”, reflecting that metrics (from qubit quality to algorithm runtime scaling) are a priority at the international level. In the near future, we might see standard benchmark suites for quantum computers (analogous to SPEC benchmarks in classical computing) that vendors and labs run to measure their systems’ capabilities. This will be especially valuable to R&D and industry adopters, to know whether one machine can reliably outperform another on certain tasks, or to track improvements as new generations of quantum processors come online.
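
To give a flavor of what such a benchmarking standard might formalize, the toy sketch below computes a heavy-output fraction, the statistic behind quantum-volume-style benchmarks; both the ideal distribution and the device counts are invented for demonstration.

```python
# Toy illustration of a "heavy output" check: an output bitstring is "heavy"
# if its ideal probability exceeds the median of the ideal distribution, and
# a device passes when more than 2/3 of its measured samples are heavy. The
# distributions below are made up for demonstration.
from statistics import median

# Hypothetical ideal output distribution of a 3-qubit model circuit.
ideal_probs = {
    "000": 0.30, "001": 0.05, "010": 0.02, "011": 0.18,
    "100": 0.20, "101": 0.03, "110": 0.07, "111": 0.15,
}
heavy_set = {s for s, p in ideal_probs.items() if p > median(ideal_probs.values())}

# Hypothetical measurement counts from a real device running the same circuit.
counts = {"000": 290, "011": 160, "100": 210, "111": 130, "001": 90, "110": 120}
shots = sum(counts.values())
heavy_fraction = sum(n for s, n in counts.items() if s in heavy_set) / shots

print(f"heavy-output fraction: {heavy_fraction:.3f} (pass threshold: 2/3)")
```

An actual standard would also pin down circuit selection, shot counts, and confidence intervals; the point here is only the shape of the metric.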

Another area of focus is creating common hardware and architecture standards. Quantum hardware is technologically diverse – superconducting qubits, trapped ions, photonic qubits, etc. – each with its own control systems and qubit connectivity. IEEE’s P3120 project is working on a standard for quantum computing architecture, which seeks to define technical architectures for quantum computers based on different qubit modalities. The idea is to provide a reference model or taxonomy for quantum computer designs, which could include how qubits, quantum logic units, memory, and classical control units interrelate. Such a standard can guide both developers and users in understanding and comparing architectures (e.g., what does a “64-qubit superconducting processor” entail vs a “64-qubit trapped-ion system”, in a standardized descriptive sense). There’s also work on quantum programming interfaces and algorithm design: IEEE P2995, for instance, defines methods for quantum algorithm design and development, aiming for a standardized workflow or language for describing quantum algorithms irrespective of the underlying hardware.

Crucially, energy efficiency and error correction are not being overlooked. IEEE P3329 is set to establish a universal metric for quantum computing energy efficiency, acknowledging that as quantum computers scale up, their power consumption (cooling systems, control electronics) and efficiency will matter for practical deployment. Meanwhile, topics like quantum error correction and fault tolerance might see future standards once the field converges on certain techniques (for example, if a particular error correction code becomes a de-facto choice, standards could ensure it’s implemented uniformly).

International coordination in quantum computing standards is evident: the ISO/IEC JTC 3 scope explicitly covers quantum computing and simulation, and the committee has working groups on quantum computing as well as related enabling technologies. This means national bodies from around the world (with leadership from BSI in the UK as secretariat) are collaborating on standards that span from the fundamental (terminology, metrics) to the practical (components and interfaces). As these standards take shape, they will provide the scaffolding needed for the quantum computing industry to grow — much like how classical computing benefited from standards like ASCII, IEEE floating-point, or USB interfaces in its developmental years. With agreed standards, an engineer will be able to, say, run a quantum program on different hardware backends with minimal changes, or a data center planner will be able to trust benchmark figures when deciding which quantum processor to deploy for a particular use case.

Quantum Sensing and Metrology: Beyond Computing and Security

Not all quantum breakthroughs are in computing or cryptography; quantum sensing and metrology is a thriving area with its own standardization needs. Quantum sensors (devices that exploit quantum effects to measure physical quantities with extreme precision) are poised to revolutionize fields from navigation to medical imaging. As these devices inch toward practical use, creating standards for their performance and integration is important to ensure they can be widely adopted.

One aspect is quantum metrology, which deals with using quantum phenomena to define measurement standards. In fact, the international metrology community has already embraced quantum principles in the global system of units (SI) – for example, the redefinition of the kilogram and other units (approved in 2018, in force since 2019) fixed the numerical values of fundamental constants, effectively relying on quantum effects (like the Planck constant via Kibble balances, and atomic transition frequencies for the second). Now, research is underway on next-generation standards such as optical atomic clocks that could redefine the second with 100× better precision than today’s cesium clocks. As these advances mature, standards will be needed so that different national labs and companies can calibrate and compare quantum-enhanced sensors. The new ISO/IEC JTC 3 explicitly includes quantum metrology in its scope, indicating that developing standards for quantum-based measurement techniques (e.g. standard procedures for using a quantum gravimeter, or specifications for a single-photon source used in metrology) will be on the agenda.

Quantum sensing covers devices like quantum magnetometers (for detecting minute magnetic fields), quantum LIDAR or radar, quantum accelerometers, and quantum-enhanced imaging systems. A prominent near-term example is quantum timing: sensors like optical clocks or networked atomic clocks can provide ultra-precise time signals for navigation (GPS/GNSS) or financial transaction timestamping. Standards efforts here might focus on formats and protocols to disseminate this enhanced timing (perhaps through existing time distribution standards like NTP or PTP, extended for higher precision). We are also seeing initiatives to standardize how quantum sensors report data and interface with classical systems. For instance, if multiple companies produce quantum accelerometers for self-driving cars or aerospace, standards would ensure they can all report measurements in a common format and be tested against the same sensitivity benchmarks.

Another angle is benchmarking and characterizing quantum sensors. Just as quantum computers need benchmarks, sensors need agreed performance metrics: what constitutes the sensitivity of a quantum gravimeter, how to define dynamic range or response time of a quantum sensor, etc. IEEE’s quantum standards program and others will likely encompass some of these. In the CEN-CENELEC quantum technology roadmap (a European standards roadmap), there are sections identifying gaps in standards for things like single-photon detectors and quantum imaging, noting that many such technologies are not yet covered by standards. Efforts to plug these gaps are expected, for example by developing calibration standards for single-photon sources and detectors – crucial components not only in sensors but also in quantum communication systems.

In summary, quantum metrology and sensing standards ensure that the quantum-enhanced measurements can be trusted and compared universally. They pave the way for quantum sensors to be certified for use in critical applications (imagine quantum accelerometers in airplanes needing regulatory approval – standards will be the basis for tests and certification). They also facilitate innovation: when researchers know there’s a common reference, they can push the limits of sensitivity or accuracy and objectively claim improvements. From defining how to verify a quantum random number generator’s randomness to setting safety guidelines for quantum magnetic resonance imagers, this is a broad but increasingly important frontier in standardization.

Integrating Quantum and Classical Systems: Interoperability Frameworks

A recurring theme in quantum standardization is integration – quantum technologies will not operate in a vacuum, they must interface with classical information and communication technology (ICT) systems. Several standardization efforts focus on this intersection, ensuring a smooth bridge between quantum and classical realms.

One key area is hybrid quantum-classical computing. Because quantum computers are not stand-alone devices (they rely on classical computers for control, pre-/post-processing, and error correction), standards are being drafted to describe hybrid architectures. The IEEE P3185 project is developing a Standard for Hybrid Quantum-Classical Computing, which defines the hardware and software architecture of systems where quantum processors work in tandem with classical computers. This could cover how tasks are divided, how data moves between classical and quantum parts, and performance considerations for the overall system. Having such standards will help in building modular quantum accelerators that can plug into classical data centers or HPC (high-performance computing) facilities.
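
The control flow such a standard needs to describe can be sketched in a few lines: a classical optimizer loops around calls to a quantum backend, here replaced by a noisy classical stand-in since no real hardware is assumed.

```python
# Minimal sketch of the hybrid quantum-classical pattern that standards such
# as IEEE P3185 aim to describe: a classical optimizer repeatedly prepares
# parameters, dispatches a parameterized circuit to a quantum processor, and
# updates the parameters from the measured cost. The quantum call is stubbed.
import math
import random

def quantum_execute(theta: float) -> float:
    """Stand-in for submitting a parameterized circuit and reading back an
    expectation value; a real backend call would go here."""
    return math.cos(theta) + random.gauss(0.0, 0.02)   # noisy <Z> surrogate

def optimize(steps: int = 50, lr: float = 0.2) -> float:
    theta = random.uniform(0.0, 2.0 * math.pi)
    for _ in range(steps):
        # Parameter-shift-style gradient estimate from two quantum evaluations.
        grad = 0.5 * (quantum_execute(theta + math.pi / 2)
                      - quantum_execute(theta - math.pi / 2))
        theta -= lr * grad                              # classical update step
    return theta

theta_opt = optimize()
print(f"optimized theta = {theta_opt:.3f}, cost = {quantum_execute(theta_opt):.3f}")
```

The interesting standardization questions live at the boundary marked by quantum_execute: job formats, latency guarantees, and how classical and quantum resources are scheduled together.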

Another important piece is at the software layer: how quantum programs are represented and passed between platforms. This is the focus of the Quantum Intermediate Representation (QIR) Alliance, a joint industry effort under the Linux Foundation to create an open standard intermediate language for quantum programs. QIR is based on the LLVM compiler framework and serves as a common interface between quantum programming languages/frameworks and the low-level hardware instructions. By standardizing an intermediate representation, QIR allows quantum software to be portable across different hardware backends – much like how today’s high-level code can be compiled down to a common machine-independent bytecode. The QIR Alliance explicitly targets the interface between quantum and classical computing, enabling quantum instructions to be linked with classical computing routines in one workflow. For example, a developer could write a quantum algorithm in a high-level language, compile to QIR, and then any vendor’s quantum processor that supports QIR can run it, with classical control code interwoven as needed. Such interoperability is crucial for quantum cloud services where users want to write hardware-agnostic code. While QIR is an industry-driven “open standard” rather than an official SDO output, it complements formal standards by addressing practical integration in the software toolchain.
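
For a taste of what this looks like in practice, here is a short sketch that emits QIR for a Bell-pair program using the Alliance’s pyqir package; it follows the project’s published example, though exact API details can shift between releases.

```python
# Sketch: emit QIR for a Bell-pair program with the QIR Alliance's pyqir
# package (pip install pyqir). Follows the project's published example; API
# details may differ between versions.
from pyqir import BasicQisBuilder, SimpleModule

module = SimpleModule("bell_pair", num_qubits=2, num_results=2)
qis = BasicQisBuilder(module.builder)

qis.h(module.qubits[0])                      # put qubit 0 into superposition
qis.cx(module.qubits[0], module.qubits[1])   # entangle qubits 0 and 1
qis.mz(module.qubits[0], module.results[0])  # measure both qubits
qis.mz(module.qubits[1], module.results[1])

print(module.ir())   # LLVM-based QIR, consumable by any QIR-aware backend
```

The printed module is plain LLVM IR with quantum intrinsics, which is what makes it consumable both by classical compiler tooling and by any QIR-aware backend.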

In communication networks, integration means making quantum security tools work with existing network equipment and practices. We’ve touched on how standards define APIs for QKD systems to deliver keys to classical encryption devices – essentially linking quantum key generation with classical encryption algorithms like AES. Similarly, key management systems (KMS) are being adapted to quantum-safe operations. In a classical enterprise KMS or a military key distribution system, standards are being updated so they can incorporate PQC keys or QKD-derived keys alongside traditional keys. For instance, X.509 certificates (the standard for public-key certificates in TLS/SSL) are being extended to support PQC public keys and signatures of larger sizes, and new certificate types for hybrid signatures are being standardized in IETF. Ensuring that a certificate carrying a Dilithium signature can be handled by certificate authorities and browsers is a matter of agreeing on identifier codes and data formats – all of which is in scope for standards development right now.
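
A small example of the plumbing involved: the sketch below reads the signature-algorithm OID out of a certificate using Python’s cryptography package (the file name is a placeholder). Today it prints a classical OID; PQC-aware tooling will need to map newly registered OIDs, such as those being assigned for ML-DSA, to the right verification routines.

```python
# Sketch: inspect which signature algorithm a certificate carries - the kind
# of identifier that PQC-aware validators must learn to recognize as new
# algorithm OIDs are registered. "server.pem" is a placeholder file name.
from cryptography import x509

with open("server.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# Today this prints an OID such as 1.2.840.113549.1.1.11 (sha256WithRSA);
# a PQC or hybrid certificate would carry a newly assigned OID instead.
print(cert.signature_algorithm_oid.dotted_string)
```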

Another concrete example of integration is the use of SDN (Software-Defined Networking) principles for quantum networks. Classical networks use SDN controllers to dynamically manage resources; ETSI’s work on a QKD control interface means a telecom operator’s SDN software could treat quantum links somewhat like regular links, routing keys and monitoring QKD link status via standardized commands. Likewise, trusted node architectures in QKD (where keys are relayed over multiple network nodes) benefit from standardized protocols so that, say, a trusted node made by one vendor can interoperate with another’s QKD link.

On the hardware side, integration also involves physical and electrical standards – for example, if quantum computers are to be installed in data centers, one might need standards for their form factors, cooling interfaces, or control signal connections to classical computers. While not yet formalized, these will become relevant as systems get larger (perhaps standards akin to how rack servers are standardized, or how peripheral interconnects like PCI-Express could be extended to attach quantum co-processors).

Finally, testing and certification frameworks form an often-unsung part of integration. Bodies like NIST and ETSI are considering how to test quantum devices in standardized ways so they can be certified for use in classical infrastructure. For instance, a quantum random number generator might undergo a battery of standard statistical tests and entropy evaluations (NIST’s SP 800-90 series covers random bit generation; work is ongoing to tailor such standards to QRNGs). Similarly, as noted earlier, ISO’s QKD security standard leverages Common Criteria evaluation – integration at the process level, so that existing security certification regimes can accommodate quantum products.
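
As a taste of what such an evaluation involves, the sketch below implements the frequency (monobit) test from NIST SP 800-22, one of the simplest tests in the battery; ordinary OS randomness stands in here for raw QRNG output.

```python
# Sketch of the NIST SP 800-22 "frequency (monobit)" test, one of the basic
# statistical checks a QRNG's raw output might face during evaluation. The
# sample bits come from os.urandom purely for demonstration.
import math
import os

def monobit_p_value(bits: str) -> float:
    """P-value of the monobit test: small values suggest biased output."""
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)   # map bits to +/-1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

sample = "".join(f"{byte:08b}" for byte in os.urandom(125_000))  # 1M bits
p = monobit_p_value(sample)
print(f"p-value = {p:.4f} ({'pass' if p >= 0.01 else 'fail'} at alpha = 0.01)")
```

A production evaluation would run the full SP 800-22 battery plus the entropy-source analysis of SP 800-90B, not just this single test.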

In short, integration-focused standards ensure that quantum tech doesn’t live on an island. They allow quantum components to slot into existing systems and workflows, whether it’s a quantum chip controlled by a classical computer, or a quantum-derived key being used in an Internet protocol. This interoperability is key to actual deployment: users will adopt quantum solutions only if they can integrate with minimal disruption. Thanks to these standards, the transition can be evolutionary – adding quantum advantages to classical systems – rather than revolutionary replacement.

Use Cases Driving Quantum Standardization

The breadth of standardization activity in quantum technologies is ultimately driven by real-world use cases and requirements across various sectors. Here we highlight a few domains and how they are influencing (and benefiting from) quantum standards:

  • Telecommunications: Perhaps the most immediate driver, the telecom industry sees quantum communication (like QKD) as a way to secure network infrastructure for the long term. Telecom operators are already piloting quantum-secured links (e.g., the Orange–Adtran 400G trial in Europe mentioned above) and contributing to standards to ensure multi-vendor interoperability. Standards for quantum networks, from the ITU’s QKDN specs to ETSI’s interfaces, enable telcos to integrate quantum key exchange over existing fiber networks and even upcoming 6G architectures. The use case is clear: protecting backbone links, inter-datacenter connections, and eventually customer VPNs with quantum-enhanced security. By standardizing how quantum keys are distributed and used, telecom companies can mix-and-match QKD equipment and scale up deployment without vendor lock-in. In wireless communications, looking ahead, standard bodies like 3GPP are also starting to scope how quantum-safe cryptography will be applied in 5G/6G systems to secure everything from device authentication to Over-The-Air updates, again following global standards for algorithms and protocols.
  • Finance: Banks, stock exchanges, and the financial services sector require strong security and integrity for transactions. This sector has a keen interest in both PQC and QKD. For example, some financial institutions have tested QKD for inter-site data replication – ensuring backup data centers sync over quantum-encrypted links. The motivation is to protect sensitive financial data (trades, customer records) against future decryption threats. Standards play a role in making such implementations feasible: a bank will only deploy QKD if it can be certified secure (per standards) and if it interfaces with their existing encryption systems (via standard APIs). On the PQC side, the finance industry is often an early adopter of new cryptography (given regulatory pressures and high stakes), so they are actively involved in standards for PQC in TLS (to secure online banking) and in cryptographic hardware modules (HSMs) that need to support larger PQC keys. The collaboration of industry groups like INCITS (which mirrors ISO/IEC crypto standards in the US), EMVCo (for payment systems), etc., with NIST and ETSI ensures that financial use cases are considered in drafting quantum-safe crypto standards. The financial sector’s demand for long-term confidentiality – think of encrypted transactions that must remain secret for decades – is a key reason PQC standards are being expedited.
  • Space and Satellite Communications: Space agencies and companies are exploring quantum technology for secure space-to-ground communications and earth sensing. Satellite QKD is a prime use case: projects like the Chinese Micius satellite and Europe’s planned EuroQCI satellites aim to distribute keys over global distances. Standards development is addressing the unique challenges here, such as the need for common wavelength standards for satellite QKD beacons (so that ground stations can work with multiple satellite systems) and protocols for trusted nodes where satellites relay keys. The European Space Agency (ESA) and other national agencies are working with bodies like ETSI TC QT and the European Cooperation for Space Standardisation (ECSS) to align space-related quantum standards. Beyond QKD, quantum sensors in space (like space-based quantum gravimeters or clocks for navigation) will require standards for calibration and data formats to integrate with global systems. One concrete example is the consideration of optical clocks in GPS/GNSS: if future navigation satellites carry optical atomic clocks, there must be standardized time signals and algorithms to utilize their precision on the ground. The use case of national security overlaps heavily with space and telecom – many governments view quantum-secure communication (whether via fiber or satellite) as a way to secure military and diplomatic communications. This is driving international agreements on standards (so allied nations’ systems can interoperate securely) and fueling programs like EuroQCI, which explicitly ties into NATO and EU security infrastructure. TC QT’s mandate to support European policy objectives including the European Quantum Communication Infrastructure underscores this focus.
  • Healthcare and Data Privacy: Another use case emerging is protecting sensitive personal data – for instance, genomic data or health records – using quantum-safe methods. As noted in an ETSI report, secure communication of human genome sequences has already been demonstrated with quantum cryptography. The healthcare sector, which requires confidentiality (patient data privacy) and integrity, could leverage quantum-safe encryption for long-term protection of records. Standards will be needed for integrating quantum-safe encryption in health information systems. Moreover, quantum sensors promise advances in medical imaging (like enhanced MRI or new types of scanners). If quantum sensors are used for diagnostics, standards for their safety and accuracy would be required before regulatory approval – tying back to quantum metrology and device standards so that these innovations can be safely deployed in hospitals.
  • R&D and Academia: Finally, research and development efforts themselves drive standardization as use cases in the lab. For example, national labs and quantum startup companies are building prototype quantum computers – they benefit from standard metrics (to publish results and claim “quantum advantage” in a meaningful way) and standard file formats or programming interfaces to share algorithms and software. Collaborative R&D initiatives, like the U.S. Quantum Economic Development Consortium (QED-C) or the EU Quantum Flagship, often include working groups on standards to ensure that breakthroughs can transition out of the lab. QED-C, for instance, has worked on surveys of quantum technology terminology and advocated for standards in metrics and benchmarking. These efforts feed into bodies like IEEE and ISO. Another R&D-oriented use case is quantum benchmarking networks – researchers comparing devices across different labs. A standardized benchmark (such as the proposed IEEE P7131 metrics) lets them do so rigorously. Moreover, consider quantum random number generation research: as new QRNG methods are developed, standards (like entropy evaluation protocols) help validate their randomness for use in cryptography. The academic drive to test quantum algorithms on various hardware also benefits from a common intermediate representation (like QIR) so that code isn’t limited to one vendor – enabling broader experimentation.

Across all these sectors, a common pattern is visible: use cases push for standards, and standards enable broader use of quantum technology in those use cases. Telecom and finance demand security – so we get PQC and QKD standards. Space demands interoperability – so we get aligned specs for satellite QKD. Researchers need common tools – so we get standardized languages and benchmarks. By addressing real-world needs, the global standardization community (ETSI, ISO/IEC, ITU, IEEE, NIST, IETF, and others in concert) is turning quantum computing and communication from a collection of experimental systems into a set of reliable, interoperable technologies. This not only accelerates innovation (since companies can build to a known standard rather than reinvent the wheel), but also builds trust among users and policymakers that quantum tech can be safely integrated into critical infrastructure.

Conclusion

The landscape of quantum technology standardization in 2025 is both vibrant and rapidly evolving. What began with niche groups drafting specs for quantum key distribution has expanded into a coordinated global endeavor spanning multiple domains – from cryptography standards that will protect us from quantum hackers, to networking protocols that weave quantum links into the internet, to performance metrics that gauge the power of quantum computers, and beyond. The newly formed ETSI TC on Quantum Technologies exemplifies this broadened scope: it brings under one umbrella a range of topics (quantum communication, sensing, computing, security) that used to be addressed in isolation, recognizing that a holistic approach is needed to truly build the quantum ecosystem. By emphasizing links to policy (EuroQCI and European quantum initiatives) and collaboration with other SDOs, ETSI’s effort shows how regional and global interests intersect in standardization.

Meanwhile, international bodies like ISO/IEC JTC 3 are ensuring that duplication is minimized and best practices are shared worldwide. The inclusion of liaisons such as ETSI and ITU in the ISO committee underscores that all these organizations are communicating to align their work. The IEEE is contributing technical depth with its standards for definitions, architectures, and benchmarks, complementing the high-level frameworks from ISO/IEC and ITU. NIST has provided a focal point in cryptography by delivering PQC standards that many will adopt, while the IETF ensures our Internet protocols remain secure and compatible with those new algorithms. Even open-source consortia like the QIR Alliance play an important role in carving out de-facto standards that might later inform formal ones.

For engineers and technically curious readers, these developments mean that the quantum revolution is being built on a backbone of standards. Much like classical ICT standards enabled the digital age (imagine computing without ANSI C, or networking without IEEE 802.11 Wi-Fi standards), quantum standards are quietly enabling disparate innovations to coalesce into something greater – interoperable quantum solutions. For the general public, the existence of these standards efforts should be reassuring: as quantum tech leaves research labs, standards will help ensure it’s safe, reliable, and compatible with the systems we use every day.

In the coming years, we can expect even more areas to be standardized – perhaps quantum internet protocols, quantum cloud service interfaces, or ethical guidelines for quantum data. The work is far from done, but the foundation is being laid now. The global uptake of quantum standardization is a collaborative marathon, not a sprint, involving experts from every continent and sector. With initiatives like ETSI’s TC QT driving momentum and coordination, the outlook is that standards will keep pace with (or even guide) quantum technological progress. This balanced approach – fostering innovation while imposing order – gives us the best chance to harness quantum computing and communication for real-world benefit, whether in securing our communications, improving our sensors and clocks, or powering the next leaps in computing. As these standards mature and gain adoption, quantum technology will increasingly integrate into the fabric of modern life, much as classical tech standards did before, ushering in a new era of information technology grounded in the weird yet wondrous laws of quantum physics.

Sources:

  • ETSI Technical Committee QT establishment letter and press release (scope of work: quantum communications, networking, sensing, satellite QKD, QRNG, hardware security testing; alignment with EuroQCI).
  • ETSI QKD ISG overview (global QKD network deployments; need for standards; ETSI QKD interface specs and white papers).
  • ISO/IEC JTC 3 Quantum Technologies scope and structure (international quantum standards committee covering computing, metrology, sensors, communications, etc., with liaisons to ETSI, ITU).
  • ITU-T Y.3800-series Recommendations (framework for quantum key distribution networks and roadmap for quantum networking standards).
  • NIST Post-Quantum Cryptography project updates (publication of first PQC standards in 2024 for Kyber, Dilithium, SPHINCS+; ongoing selection of additional algorithms like HQC).
  • Quantum Insider news on the ISO QKD security standard (ISO’s first international standard for QKD security requirements, aligned with the Common Criteria approach).
  • IEEE Quantum Standards initiatives (projects P3120, P7131, P3185, etc., covering quantum computer architecture, performance benchmarking, hybrid quantum-classical systems, and more).
  • Linux Foundation QIR Alliance announcement (quantum intermediate representation for interoperability between quantum software and hardware, interfacing quantum and classical computing).
  • Adtran/Orange QKD trial press release (real-world telecom use case of a multi-vendor QKD-secured link, highlighting the role of open standards for key delivery and interoperability).
  • ETSI Quantum Safe Cryptography white paper and related content (use cases requiring long-term secrecy, and approaches combining quantum cryptography and post-quantum algorithms for future-proof security).
