
Quantum Computing and the Risk to Classical Cryptography

The recent standardization of the first three post-quantum cryptography (PQC) encryption and digital signature algorithms by the U.S. National Institute of Standards and Technology (NIST) has officially kicked off the race to PQC readiness. In its PQC press release, NIST cites predictions that within the next decade, a cryptographically relevant quantum computer (CRQC) capable of running Shor’s algorithm will pose a significant risk to classical cryptography, because it will be able to efficiently solve problems that are currently infeasible for classical computers.

Shor’s algorithm running on a CRQC will be capable of factoring large integers and computing discrete logarithms over finite fields and elliptic curves. This capability will undermine the security of widely used asymmetric cryptographic algorithms for key exchange, encryption, and digital signatures, such as ECDH, RSA, ECDSA, and EdDSA.

Classical cryptographic algorithms are currently ubiquitous: they protect the confidentiality and integrity of digital data in code, development pipelines, network protocols, and applications running on end-user systems and servers.

To address the risks posed by a CRQC, the three PQC standards approved by NIST provide quantum-resistant algorithms for encryption and digital signatures. However, because classical asymmetric algorithms are used so extensively across varied use cases and complex systems, the transition will require the right engagement of people, processes, and tooling to succeed. The scale and complexity of this transition to quantum-resistant algorithms will be several orders of magnitude greater than previous cryptographic algorithm upgrades.


In this article, we provide the background and motivation for why PQC migration is now an essential consideration for CISOs, and we demonstrate how this migration strategy dovetails with other competing priorities.

Crypto-Agility vs ‘Crypto-Fragility’

Crypto-agility is an organization’s ability to transition to new cryptographic standards and algorithms in response to new security threats.

Historical cases point to crypto-agility being an overly optimistic property for a system to possess. One of the largest cryptographic upgrades of the late 1990s and early 2000s was the replacement of the Data Encryption Standard (DES), originally developed at IBM.

DES was broken in the late 1990s, mainly through brute-force attacks on its 56-bit key. The upgrade to Triple DES (3DES) allowed certain keying configurations to remain backward-compatible with DES by reusing the same secret key for all three operations. Then, in 2001, NIST standardized the Advanced Encryption Standard (AES) and promoted it as the preferred replacement for DES.

DES was widely used in Automated Teller Machines (ATMs), often implemented in hardware. In those use cases, ‘crypto-agility’ meant physically replacing the susceptible hardware.

More recently, in 2017, commercial web browsers stopped accepting TLS certificates signed with the Secure Hash Algorithm 1 (SHA-1), and certificates were required to use the SHA-2 family as a replacement. The motivation behind this hashing algorithm upgrade was the identification in 2011 of a plausible collision attack against SHA-1. A seasoned CISO will recall the resource-intensive task of replacing these certificates throughout their organization.

The commercial web browser vendors were not crying wolf. In 2017, researchers at CWI Amsterdam and Google demonstrated a practical collision attack, known as SHAttered, by producing two different PDF files with the same SHA-1 hash.

In both instances, the DES replacement and the SHA-1 replacement, crypto-agility was not feasible in practice. With DES, labor-intensive hardware upgrades were needed in various use cases. With SHA-1, without highly mature Certificate Lifecycle Management (CLM) tooling available at the time, the task of identifying and upgrading TLS certificates was manual, slow, and resource-intensive.

Without crypto-agility, CISOs are faced with the opposing reality of ‘crypto-fragility’. Crypto-fragility is marked by two characteristics:

  1. Not having clarity about the systems which host the vulnerable cryptography (lack of measurement)
  2. Not having the people, processes, and tooling to upgrade the cryptography efficiently (lack of management)

Why PQC Migration is Inherently More Complex

There are many more algorithms to upgrade, spread across machines, applications, workloads, and cloud services in increasingly complex hybrid multi-cloud environments.

The DES upgrade to 3DES, and ultimately to AES, was the replacement of a symmetric algorithm. The SHA-1 upgrade to SHA-2 was the replacement of a hashing algorithm. With the new NIST PQC standards, not only must the asymmetric algorithm be replaced, but also auxiliary functions such as the hashing function.

Therefore, a PQC migration requires upgrading two algorithms, the main asymmetric algorithm and the hashing algorithm, compared with a single algorithm in the earlier cases: the symmetric algorithm for DES and the hashing algorithm for SHA-1.


Size Requirements

When DES was first upgraded to 3DES, the key size went from 56 bits to 112 or 168 bits, a two- or three-fold increase. This step change in key sizes made adoption in the field challenging. When SHA-1 was upgraded to SHA-2, the digest size went from 160 bits to 256 bits (for SHA-256), a 1.6 times increase.

By way of example, consider upgrading the ECDSA algorithm to ML-DSA. Replacing a 32-byte ECDSA public key requires, at minimum, a 1312-byte public key for the ML-DSA-44 parameter set. This is a 41 times increase in public key size.

Furthermore, if we chose to adopt the SLH-DSA-SHAKE-128s algorithm, although its public key size is comparable to the ECDSA key size, the minimum signature size is 7,856 bytes, a 123 times increase over a 64-byte ECDSA signature.
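The growth factors quoted above can be checked with simple arithmetic. The sketch below is purely illustrative: it assumes a 32-byte compressed ECDSA P-256 public key and 64-byte signature as the baseline, and takes the ML-DSA-44 and SLH-DSA-SHAKE-128s sizes from the published FIPS 204 and FIPS 205 parameter sets.

    # Illustrative size comparison, in bytes. ECDSA assumes P-256 with a
    # compressed public key; ML-DSA-44 and SLH-DSA-SHAKE-128s sizes are
    # the published FIPS 204 / FIPS 205 parameter set sizes.
    sizes = {
        "ECDSA (P-256)":      {"public_key": 32,   "signature": 64},
        "ML-DSA-44":          {"public_key": 1312, "signature": 2420},
        "SLH-DSA-SHAKE-128s": {"public_key": 32,   "signature": 7856},
    }

    baseline = sizes["ECDSA (P-256)"]
    for name, s in sizes.items():
        pk_ratio = s["public_key"] / baseline["public_key"]
        sig_ratio = s["signature"] / baseline["signature"]
        print(f"{name}: public key {s['public_key']} B ({pk_ratio:.0f}x), "
              f"signature {s['signature']} B ({sig_ratio:.0f}x)")

Running this prints the roughly 41 times public key growth for ML-DSA-44 and the roughly 123 times signature growth for SLH-DSA-SHAKE-128s cited above.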

Clearly the scale and complexity of a PQC migration must not be underestimated. Even less complex historical examples of cryptographic algorithm updates presented challenges. Therefore, it is a fair assumption to make that PQC migrations will pose greater difficulties than previously encountered.

If You Fail to Prepare, You Will Prepare to Fail

The key to any successful migration project is preparation. In this section, we provide insight into why early preparation is essential for a successful PQC migration.

The Importance of Mosca’s Theorem

Mosca’s Theorem provides a simple formula for determining whether quantum-resistant algorithms can be adopted in time to protect the confidentiality and integrity of your digital data.

The formula is as follows: if x + y > z, then your confidential data is at risk of exposure by a CRQC, where x is the number of years you need to keep the organization’s data safe, y is the number of years required to adopt quantum-resistant algorithms, and z is the number of years before a CRQC becomes accessible.
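As a rough illustration, the inequality can be expressed in a few lines of code. The function name and the example values of x, y, and z below are assumptions chosen purely for demonstration, not predictions.

    # A minimal sketch of Mosca's inequality: if x + y > z, data is at risk.
    def mosca_at_risk(x_shelf_life_years: float,
                      y_migration_years: float,
                      z_years_to_crqc: float) -> bool:
        """Return True if data protected today could still need secrecy
        after a CRQC is assumed to become available."""
        return x_shelf_life_years + y_migration_years > z_years_to_crqc

    # Hypothetical values: data must stay confidential for 7 years (x),
    # the migration is estimated to take 5 years (y), and a CRQC is
    # assumed to be 10 years away (z).
    print(mosca_at_risk(7, 5, 10))  # True, because 7 + 5 = 12 > 10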

Figure 1: Illustration of Mosca’s Theorem from page 21 of Mosca’s 2015 presentation: Cyber security in a Quantum World: Will We be Ready?

The following, less mathematical illustration is a more practical way to demonstrate the urgency of a PQC migration to other decision makers within your organization.

Figure 2: Illustration of Mosca’s Theorem reproduced from Figure 1 of the Global Risk Institute’s 2021 full Quantum Threat Timeline Report by Mosca and Piani.

As a CISO, you are the custodian of your organization’s data, so you have very little control over variable x. Unless you work for a leading research and development company focused on advancing quantum computing, you also have very little control over variable z. Where you do have control is variable y, the time it will take to adopt PQC for the relevant digital data assets in your organization.

Accordingly, with you as the leader of the cybersecurity team and strategy, it is in your organization’s best interest to begin planning and preparing for the migration to quantum-resistant algorithms as soon as possible. Budgets, processes, and resources will need to be in place to make this project a success.

Benefits of Starting PQC Migrations Now

Within the five functions of the NIST Cybersecurity Framework (CSF) 1.1, Identify, Protect, Detect, Respond, and Recover (IPDRR), early adoption of PQC fits into the Identify and Protect functions.

By measuring and identifying the systems vulnerable to a CRQC, you can begin to understand the rough order of magnitude of effort required to migrate to quantum-resistant algorithms. This will also enable you to prioritize the right systems and data for remediation based on Mosca’s Theorem.

Also, ensuring that protective controls are in place for your organization’s critical digital systems and assets can greatly reduce the resources and costs associated with the Detect, Respond, and Recover functions. A proper defense-in-depth strategy focuses heavily on protecting assets as the first line of defense.

To get started, crypto-agility is key, and the first step is to gain full visibility into your cryptographic landscape. Without knowing what you have, it is impossible to properly scope and plan your PQC migration strategy. An effective certificate lifecycle management (CLM) solution that enables crypto-agility is a must for visibility, automation, and control of all cryptographic assets in your PKI landscape.
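As a concrete starting point for that visibility, the following is a minimal sketch, assuming the open-source pyca/cryptography library and a hypothetical local directory of exported PEM certificates, that flags certificates whose public keys rely on quantum-vulnerable algorithms. A real inventory would draw on network scans, key stores, and CLM tooling rather than a single folder.

    # Minimal certificate inventory sketch using pyca/cryptography.
    # The "./certs" directory is a hypothetical folder of PEM certificates.
    from pathlib import Path

    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import rsa, ec, ed25519, ed448

    # Public key types whose security would be broken by Shor's algorithm on a CRQC.
    QUANTUM_VULNERABLE = (rsa.RSAPublicKey, ec.EllipticCurvePublicKey,
                          ed25519.Ed25519PublicKey, ed448.Ed448PublicKey)

    def inventory(cert_dir: str) -> None:
        """Print each certificate's subject, key type, and quantum risk."""
        for pem_file in Path(cert_dir).glob("*.pem"):
            cert = x509.load_pem_x509_certificate(pem_file.read_bytes())
            key = cert.public_key()
            at_risk = isinstance(key, QUANTUM_VULNERABLE)
            print(f"{pem_file.name}: {cert.subject.rfc4514_string()} "
                  f"key={type(key).__name__} quantum_vulnerable={at_risk}")

    inventory("./certs")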

To learn more about crypto-agility and certificate lifecycle management, request a demo of the AppViewX AVX ONE CLM solution.


