Contemporary cryptography and the attack on trust

Friday 13 February 2026
Cryptography is the quiet engineering discipline that makes modern trust possible. It underwrites everything from a WhatsApp message to a bank transfer, from a software update to a battlefield drone’s telemetry link. Yet it is not, in the popular imagination, an art of secret writing, nor even simply “encryption”. Contemporary cryptography is a set of mathematical tools for building systems in which parties can compute, communicate and authenticate under hostile conditions, while assuming that some participants, networks, devices, or even future breakthroughs will behave badly.
To understand where cryptography is now, it helps to see what it is trying to guarantee. In broad terms, modern cryptography delivers four families of assurances.
Confidentiality means that an eavesdropper cannot read your data. Integrity means that an attacker cannot alter your data without being detected. Authenticity means that you can verify who created or approved something, whether it is a message, a software update or a contract. Freshness and replay resistance mean that an attacker cannot capture yesterday’s valid message and make it appear relevant today. Behind these assurances sit concrete mechanisms: encryption, digital signatures, key exchange, message authentication codes, hash functions and the protocols that bind them together.
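To make those assurances concrete, here is a minimal sketch in Python, using only the standard library, of how a message authentication code combined with a per-message nonce can give integrity, authenticity and replay resistance between two parties who already share a key. The key handling and message format are illustrative assumptions, not a production protocol, and confidentiality (encryption) is deliberately left out.

```python
import hashlib
import hmac
import secrets

# Shared secret key; in practice established via key exchange, never hard-coded.
key = secrets.token_bytes(32)

def protect(message: bytes) -> tuple[bytes, bytes, bytes]:
    """Attach a fresh nonce and a MAC so the receiver can detect tampering and replays."""
    nonce = secrets.token_bytes(16)                        # freshness: never reused
    tag = hmac.new(key, nonce + message, hashlib.sha256).digest()
    return nonce, message, tag

def verify(nonce: bytes, message: bytes, tag: bytes, seen_nonces: set) -> bool:
    """Reject messages whose MAC fails or whose nonce has already been accepted."""
    expected = hmac.new(key, nonce + message, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):             # integrity and authenticity
        return False
    if nonce in seen_nonces:                               # replay resistance
        return False
    seen_nonces.add(nonce)
    return True

seen = set()
n, m, t = protect(b"transfer 100 EUR to account 42")
assert verify(n, m, t, seen)       # first delivery accepted
assert not verify(n, m, t, seen)   # replay of the same message rejected
```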
Two ideas dominate practical cryptography in 2026: keys and threat models.
A cryptographic key is not a password; it is a secret value that is meant to be uniformly random, long enough that guessing is infeasible, and handled by software or hardware rather than remembered by a human. Threat models are the statements, sometimes written and often merely assumed, about what the attacker can do. Can it read network traffic? Can it modify it? Can it compromise an end device? Can it store encrypted traffic for years, hoping future methods will unlock it? The same cryptographic primitive (basic foundational algorithm) can be perfectly adequate under one threat model and dangerously weak under another.
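The distinction can be sketched in a few lines: a key is sampled uniformly at random by software, while a human-chosen password is low-entropy and must be stretched through a deliberately slow derivation function before it can serve as key material. The parameters below are illustrative, not a recommendation.

```python
import hashlib
import secrets

# A cryptographic key: 256 bits drawn from the operating system's CSPRNG.
key = secrets.token_bytes(32)

# A human-chosen password needs a salt and a deliberately expensive
# key-derivation function before it can be used as key material.
password = b"correct horse battery staple"
salt = secrets.token_bytes(16)
derived_key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

print(len(key), len(derived_key))  # both 32 bytes, but with very different entropy
```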
From this perspective, the last thirty years of cryptography have been an exercise in system-building. Public key cryptography lets strangers agree secrets over an open network and authenticate one another without sharing a secret in advance. Symmetric cryptography lets those agreed secrets encrypt large volumes of data efficiently. Hash functions turn arbitrary data into compact fingerprints. Protocols like TLS (used for HTTPS) choreograph these tools into a sequence of messages which, if implemented correctly, yields secure channels at global scale.
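The first two steps of that system-building can be sketched briefly. The example below uses the third-party pyca/cryptography package (an assumption; any equivalent library would do) to show two parties agreeing a secret over an open network and deriving a symmetric key from it.

```python
# A minimal sketch of public-key agreement followed by key derivation,
# using the pyca/cryptography package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair; only the public halves cross the network.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key ...
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())
assert alice_shared == bob_shared  # ... and arrives at the same secret.

# The raw shared secret is fed through a key-derivation function to obtain
# a symmetric key for bulk encryption (the info label here is illustrative).
symmetric_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"example handshake",
).derive(alice_shared)
```

Note what the sketch omits: nothing authenticates the public keys, so an active attacker could sit in the middle. That is precisely the gap that protocols like TLS close with certificates and signatures.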
The trouble is that “implemented correctly” is not a footnote. The mathematics of a scheme can be sound, while an implementation leaks secrets through timing, power consumption, memory access patterns, error messages, random number failures, or simply a careless interface. Cryptography is therefore both an intellectual discipline and an industrial practice. It is as much about eliminating sharp edges as it is about elegant proofs.
That duality has driven a shift in how modern cryptography is designed. Classical work often centred on constructing primitives and proving them secure relative to idealised assumptions. Contemporary work still prizes proofs, but it increasingly cares about formal verification of code, constant-time implementations (so execution time does not betray secrets), safer protocol composition, and deployability under real constraints such as mobile devices, embedded controllers, satellite links or battlefield radios.
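One concrete example of such a sharp edge: comparing a received authentication tag with an ordinary equality check can leak, through response timing, how many leading bytes matched. A minimal Python sketch of the difference, assuming the tags arrive as byte strings:

```python
import hmac

def check_tag_leaky(expected: bytes, received: bytes) -> bool:
    # Naive comparison: may stop as soon as the first differing byte is found,
    # so an attacker measuring response times can recover the tag byte by byte.
    return expected == received

def check_tag_constant_time(expected: bytes, received: bytes) -> bool:
    # compare_digest takes (approximately) the same time regardless of where
    # the inputs differ, removing the timing signal.
    return hmac.compare_digest(expected, received)
```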
The field has also become more plural. For decades much of deployed public key cryptography depended on a small set of hard problems: factoring large integers (RSA), computing discrete logarithms (Diffie–Hellman), and the elliptic-curve variants of discrete logarithms (ECDH and ECDSA). These remain widely used, but the field is now consciously diversifying. The reason is not fashion; it is risk management. If a breakthrough, whether classical or quantum, weakens a dominant assumption, too much of the world breaks at once.
This brings us to the most urgent contemporary theme: the transition to post-quantum cryptography.
A sufficiently powerful quantum computer could, in principle, break the public key schemes that secure most of today’s internet. The timeline is uncertain, but the “harvest now, decrypt later” threat is not. An adversary can record encrypted traffic today and attempt to decrypt it years from now if the underlying public key method becomes vulnerable. That makes migration to quantum-resistant methods a present-day problem, not a future one.
In August 2024 the United States’ National Institute of Standards and Technology (NIST) published the first set of finalised post-quantum cryptography standards, including a lattice-based key encapsulation mechanism (ML-KEM) for establishing shared secrets and signature schemes (ML-DSA and SLH-DSA) for authentication. These are not merely academic badges. They are intended to become the default building blocks for future secure communications, software signing and identity.
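A key encapsulation mechanism has a deliberately simple interface: one side encapsulates against a public key and obtains both a ciphertext and a shared secret, and the key's owner decapsulates the ciphertext to recover the same secret. The sketch below shows that flow against a hypothetical `mlkem` module; the module and function names are placeholders, not any real library's API.

```python
# Hypothetical ML-KEM interface; names are placeholders for illustration only.
import mlkem

# The receiver publishes a public key and keeps the private key.
public_key, private_key = mlkem.generate_keypair()

# The sender needs only the public key: encapsulation yields a ciphertext
# to transmit and a shared secret to keep.
ciphertext, sender_secret = mlkem.encapsulate(public_key)

# The receiver recovers the same shared secret from the ciphertext.
receiver_secret = mlkem.decapsulate(private_key, ciphertext)
assert sender_secret == receiver_secret
```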
Migration, however, is messy. Replacing primitives is not like swapping a component in a machine. Protocols, certificate infrastructures, device constraints, compliance regimes and decades of assumptions sit on top. That is why standards bodies are drafting “hybrid” approaches, which combine classical methods with post-quantum methods so that an attacker must break both to recover a secret. The IETF’s work on hybrid key exchange for TLS 1.3 is explicitly framed as a practical route for adoption, not just as a theoretical design exercise. Similar work is underway for SSH, where drafts define post-quantum and traditional hybrid key exchanges built around ML-KEM.
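The core idea of a hybrid can be sketched in a few lines: run a classical exchange and a post-quantum KEM independently, then bind the two shared secrets together so that reproducing the session key requires breaking both. The combiner below is a simplification with placeholder inputs; the actual IETF designs feed both secrets into the protocol's own key schedule with precisely specified labels.

```python
import hashlib
import secrets

# Placeholders standing in for the two independently established secrets:
# one from a classical exchange (e.g. X25519), one from a post-quantum KEM
# (e.g. ML-KEM). In a real handshake each comes from its own protocol messages.
classical_secret = secrets.token_bytes(32)
post_quantum_secret = secrets.token_bytes(32)

# Simplified combiner: hash the concatenation together with a context label.
# An attacker must recover *both* inputs to reproduce the session key.
session_key = hashlib.sha256(
    b"example-hybrid-v1" + classical_secret + post_quantum_secret
).digest()
```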
This is the pattern of contemporary cryptography more generally. The intellectual frontier is broad, but the live issues are increasingly about deployment at scale, survivability under multiple forms of uncertainty, and the sociology of trust.
Below is a list of some of the most live issues in cryptography today, one that reflects the field’s current pressure points rather than its historical syllabus.
Post-quantum migration in real protocols: how quickly major protocols can move to NIST’s new standards, how hybrids should be specified, and how to manage backwards compatibility without creating permanent weak paths.
Crypto agility: designing systems so that primitives can be replaced without redesigning everything else, while avoiding the trap where “agile” becomes “anything goes” and security becomes untestable.
Implementation security at scale: constant-time coding, safe memory handling, side-channel resistance and supply-chain hardening, because the easiest attacks are often against code, not mathematics.
Key management as the true bottleneck: hardware security modules, secure enclaves, threshold signing, rotation policies, recovery mechanisms and human processes, because cryptography fails most often when keys are exposed, mishandled, or mis-issued.
Certificate and identity trust: the fragility of global public key infrastructure, the consequences of compromised certificate authorities, and the practical limits of revocation and transparency in a world of billions of devices.
Zero-knowledge proofs becoming engineering, not just theory: proofs that let one party demonstrate a statement is true without revealing why it is true, now used for privacy-preserving identity, verifiable computation and some blockchain systems; the live issues are efficiency, auditability, and standard patterns that do not depend on folklore.
Fully homomorphic encryption and privacy-enhancing computation: performing computation on encrypted data without decrypting it, long viewed as impractical, now moving into pilot deployments; the live questions are performance, standardisation pathways and how to express risk when the mathematics is subtle but the marketing is loud.
The tension between “privacy by design” and lawful access demands: policy pressure for exceptional access colliding with the reality that deliberate weaknesses tend to be reusable by criminals and hostile intelligence services, not merely by courts.
Cryptography for constrained devices and wartime infrastructure: lightweight primitives, resilient key establishment in intermittent networks, and secure update mechanisms for embedded controllers, sensors and drones, where compromise may be physical as well as digital.
Randomness quality: reliable generation of unpredictable numbers in cheap hardware, virtual machines and mobile devices, because weak randomness quietly destroys otherwise sound cryptography (a short sketch follows this list).
Formal methods and verified cryptography: the push to mechanically prove that implementations match their specifications, particularly for protocol stacks and libraries that the entire internet depends on.
Interactions with artificial intelligence: securing model weights and prompts, using cryptography to watermark or authenticate outputs, and exploring whether “proof of provenance” can be made robust without creating new surveillance infrastructure; much of this is immature, but it is moving quickly.
Usability and human factors: making secure defaults truly default, making dangerous choices hard to select, and designing interfaces that do not trick users into disabling security because it is inconvenient.
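On the randomness point above, the failure mode is easy to demonstrate: general-purpose random number generators are predictable by design, and only a cryptographically secure source should ever produce keys or nonces. A minimal Python illustration:

```python
import random
import secrets

# Predictable: the Mersenne Twister behind the random module can be
# reconstructed from a modest amount of observed output. Never use it for keys.
weak_nonce = random.getrandbits(128).to_bytes(16, "big")

# Unpredictable: secrets draws from the operating system's CSPRNG and is the
# right default for keys, nonces and tokens.
strong_nonce = secrets.token_bytes(16)

# Seeding shows the problem directly: two "random" streams are identical.
random.seed(1234)
a = random.getrandbits(128)
random.seed(1234)
b = random.getrandbits(128)
assert a == b  # identical output from identical internal state
```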
A comprehensive introduction to contemporary cryptography, then, is less a tour of ciphers than a tour of trade-offs. Cryptography is a discipline of precise claims. It does not promise “security” in a general sense. It promises that, under defined assumptions, an attacker with specified powers cannot achieve specified goals within feasible resources. When those assumptions shift, whether through quantum computing, new cryptanalysis, new hardware side channels, or the simple reality that engineers are fallible, the field must adapt.
For Ukraine, and for Europe more broadly, this matters in an unusually concrete way. War accelerates adversarial innovation. It pushes communications into degraded networks, increases the value of intercepted traffic, and raises the stakes of software supply chains. It also brings into view the central fact of cryptography: it is not a luxury layer atop a functioning society. It is part of the scaffolding that keeps modern institutions, markets and militaries operating when trust is under direct attack.




