Why I don't like smartcards, HSMs, YubiKeys, etc.

This is a rant about smartcards with some digression into ranting about the integrated circuit industry.

The smartcard that didn't exist

Smartcards and HSMs are essentially two “brands” for the same thing: a chip which guards access to the data stored within it, and will only allow that data to be accessed in certain ways or under certain conditions. HSMs are the “enterprise” label for such devices, whereas smartcards are essentially the same thing, only cheaper.

To begin with, the arbitrary disparity in cost between smartcards and HSMs itself seems rather dubious, given that these devices perform the same function. Perhaps HSMs tend to be FIPS validated and that cost needs to be recouped. Perhaps companies charge more for them just because they can. The truth is probably somewhere in the middle. At any rate, I will refer to all of these devices as HSMs herein.

What irks me is that the product I want to buy does not, apparently, exist. That product is this:

  • A (contact or proximity) smartcard or other compact, portable device with a common interface (i.e. USB).

  • The device is a general purpose computer which can have an arbitrary program loaded onto it targeting a common hardware ISA (e.g. ARM).

  • The volatile and non-volatile memory of the device can be accessed only in two ways:

    • It can be read and written by the program executing on the device.

    • Via a special 'write program' function, which may be triggered by the interface. The current program cannot preempt or prevent the activation of this function.

      All of volatile and nonvolatile memory is securely erased. A new program is then read from the interface, which then replaces the current program.

  • The device is fast enough, or has appropriate coprocessors, such that it can perform operations using a modern cryptographically secure hash function, a modern block or stream cipher, a modern public key signature system, and a modern key exchange system. The device has a hardware RNG. (And yes, by “modern” I mean “compliant with Kerckhoffs's principle and presently well-regarded by the global public cryptographic community” and not “a proprietary, secret cipher you won't disclose to anyone”.)

    I'm not sure how tall an order this is. Performance aside, a hash function can be turned into a stream cipher, so you only really need a hash function, public key crypto functions and a hardware RNG. Hardware implementations of e.g. the SHA functions appear to be common, and hardware RNGs also appear to be common. The major issue is then the public key primitives.

  • The physical tamperproofing you would expect of such a device.
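
As an aside on the hash-to-stream-cipher point above: a hash function can be run in a CTR-like construction, hashing key, nonce and counter to produce a keystream which is XORed with the data. A minimal sketch, not a vetted construction (a real design would use an established cipher or an approved DRBG):

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key || nonce || counter for successive counters."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: XOR the data with the hash-derived keystream.
    The operation is its own inverse, as with any stream cipher."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

As with any stream cipher, the nonce must never be reused under the same key.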

The merits of this design should be self-evident. Because replacing the program destroys all data on the device, the program on the device can effectively and totally mediate access to the information stored within. This is a general-purpose HSM. The current program could, of course, choose to overwrite itself with a new program read from the interface, if it decides it trusts that program; if this causes logistical issues (overwriting code that is currently executing), the hardware may provide a way for the program to invoke the 'write program' function, but passing a hash of the new program, whereby the new program must match the hash, and existing memory is not erased.
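
The hash-committed update path described above is simple to model. A toy sketch of the intended semantics (the Device class and its method names are hypothetical, purely for illustration):

```python
import hashlib

class Device:
    """Toy model of the proposed HSM semantics: replacing the program erases
    all memory, unless the current program pre-approved the successor by hash."""

    def __init__(self):
        self.memory = {"secret": b"key material"}
        self.program = b"current firmware"
        self.approved_hash = None  # set by the running program to trust a successor

    def approve_update(self, new_program_hash: bytes):
        # The current program commits to a successor it has decided to trust.
        self.approved_hash = new_program_hash

    def write_program(self, new_program: bytes):
        h = hashlib.sha256(new_program).digest()
        if self.approved_hash is not None and h == self.approved_hash:
            # Pre-approved successor: existing memory is preserved.
            self.program = new_program
        else:
            # Unapproved program: securely erase everything first.
            self.memory.clear()
            self.program = new_program
        self.approved_hash = None
```

The key invariant is that an attacker can always load their own program, but never alongside the previous program's data.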

Every so often, I am unable to believe that this product, apparently, does not exist. I find this so hard to believe that every few months, I end up going on some intense google-dive only to reach the same conclusion: it — incomprehensibly — doesn't.

Why on earth is there such a thing as an “OpenPGP smartcard”? Are you seriously telling me that using PGP with an HSM requires a hardware manufacturer to specifically decide to support that application?

There do appear to be general-purpose, programmable smartcards, but they are for some reason built on interpreters (JavaCard, BasicCard, MULTOS) and not a hardware ISA; I have yet to determine why. It is rather baffling given the constrained computing power these devices possess; adding an interpreter cannot be helping matters. (It's also unclear what format the program delivered to a BasicCard takes, as there's no way it's a BASIC dialect.)

For that matter, who decides to standardise on BASIC as a language for programming smartcards? Who on earth are they targeting it at? Shouldn't programming HSMs be left to people with some knowledge of cryptography? I'm not saying there are no programmers sufficiently knowledgeable and capable of competently implementing cryptographic protocols who have BASIC as their preferred programming language, but I'm also fairly sure that set of people is very small. (Though on the other hand, at least it's a memory-safe language. That's at least better than most cryptographic software. No Heartbleed on BasicCard!)

Programmability as a premium

Here's an interesting Stack Exchange question: Are there any hardware HSMs that can host/run custom applications using the HSM processor(s) within the hardened security boundary?


  The Thales nShield HSM (previously nCipher) allows for generic programming. This is a rather expensive option; it must first be enabled in the HSM (through a "feature file" which is signed by Thales and specific to the serial number of an HSM), and then the extra code can run as long as it is signed with a key known to the HSM for such usage.

So it appears that HSM (that is, the expensive "enterprise" sort of HSM) manufacturers consider programmability a "premium" feature, perhaps also a niche feature. Though making it a niche feature is a self-fulfilling prophecy; there'll only be a niche market for general-purpose HSMs if HSM manufacturers insist on treating them as a niche market.

I'm going to conjecture on basically no evidence and hypothesise that HSM/smartcard manufacturers maybe feel threatened by the idea of generality of purpose. Why sell one model of HSM that can do basically anything when you can sell ten, all with different pre-programmed functions, all at different price points based on the market?

Another motivation might be to protect security theatre features, like manufacturer-programmed unique IDs and futile attempts to prevent a card lying about them.

You'll take my NDA from my cold, dead hands

The cryptographic hardware community isn't just selling the wrong products. Its attitude to "security" seems backwards in general, reminiscent of a parallel universe in which Kerckhoffs's principle doesn't apply. Probably the most famous instance of this is NXP's MIFARE Classic smartcard, which uses a proprietary, secret cipher of NXP's own creation, blandly named "Crypto1". Or at least it was secret until someone decapped a chip, photographed the gates and OCRed them into an understanding of the algorithm, at which point they ran rings around the predictably rubbish cipher.

NXP now makes a line of smartcards with more modern cryptography, called MIFARE DESFire. Wait! Come back! The MIFARE DESFire smartcards don't just support DES — the DESFire line includes cards which support AES, too.

The fact that the DESFire line has not been renamed even though it supports a far superior cipher suggests that in the smartcard industry, advertising a cryptographic product as involving "DES" is not the customer-repellent you would expect it to be. Then again, people happily standardised on MIFARE Classic cards, so, I guess that's not so surprising. (To be fair, MIFARE DESFire actually implemented 3DES from the outset, but that just reinforces the point; the product was selling itself short from the beginning, apparently without issue.)

Hardware companies in general seem to love NDAs, seemingly out of superstition more than anything else. After so many years, AMD finally started publishing documentation for its GPUs so that Linux could implement proper support, and believe it or not, the sky didn't fall in on AMD. (Or at least, not for that reason.)

Companies like NVIDIA and Broadcom have retained their attachment to such secrecy. It rather reminds me of the aftermath of the Snowden revelations, in which heads of various US and UK intelligence agencies talked about what a disaster the revelations had been. The interesting thing to consider is that if, before the revelations, you had asked these people what would happen if the exact information Snowden published were made public, they would most likely have claimed, in evasively vague terms, that it would be an unmitigated disaster and lead to some sort of lesser armageddon. And yet here we are, and society is still functioning. (The fact that these intelligence heads continue to decry the Snowden revelations as an unmitigated disaster demonstrates their inability to maintain even the slightest amount of objectivity or perspective.)

Broadcom, for those not aware, makes, amongst other things, switch chips now often used in high-performance datacentre switches. The so-called "merchant silicon" trend is to move away from the IBM-esque approach to networking that Cisco has been presiding over for so long in favour of commodity devices with merchant silicon and some sort of standard, probably Linux-based, operating system. Importantly, the end user, and not the vendor, decides what operating system to use.

In other words, it ends up looking a bit like the PC GPU industry, where you take your pick of GPU and use some sort of SDK or standardised API to make use of it from a wholly unexceptional (x86, ARM, etc.) host system.

History, however, seems determined to repeat itself: just as companies like NVIDIA and previously AMD were averse to releasing the slightest amount of interface documentation, so too is Broadcom. Broadcom appears to have at least realised that preventing people from using its products is not in its interests, and so has released a closed-source SDK strangely called "OpenNSL".

Other entities are working on a “Switch Abstraction Interface”, which presumably intends to be an OpenGL-like effort.

It's cryptographic — quick, NDA it!

Maxim IC makes iButtons, small electronic devices which can do various things, like measure temperature or store a small amount of data or spit out a factory-programmed serial number.

Most of these iButtons have public datasheets. This one doesn't, seemingly for no reason other than that it involves cryptography. Reading the feature list, it's hard to see any reason for NDAing the datasheet. Another example.

It's like someone at these companies goes around telling people to NDA datasheets if the product does anything that sounds even vaguely cryptographic. At best, these are secure products pointlessly NDA'd; at worst, they are insecure or provide security theatre and conceal their futility behind NDAs.

Nitrokey: Light at the end of the tunnel?

The Nitrokey (previously Cryptostick) at first glance seems to be bucking the trend by providing a programmable HSM. At second glance there is much to raise eyebrows at.

The Nitrokey Storage seems to be positioned as the flagship product, but since I'm mainly interested in HSM functions not disk encryption functions, I consider the Nitrokey Pro more interesting; though for some reason only the Nitrokey Storage claims to support firmware updates.

The first peculiarity in the feature table is the claim of a "tamper resistant smart card". Well, judging by the photographs that's physically impossible, so I'm assuming they mean a tamper-resistant smartcard chip. But this raises the question of why an HSM-like device feels the need to repackage a smartcard chip. If the device is just a repackaging of a smartcard chip, it accomplishes nothing.

The feature table also lists various supported applications, demonstrating the interest of the manufacturer in programming the device for specific applications, rather than providing a platform for others to do so. (Imagine if manufacturers of USB drives made USB drives for text files and USB drives for image files and USB drives for MP3 files and so on, and the idea of selling a USB block device was alien to these people. If you wanted to store a new kind of file on a USB drive, you had to convince the manufacturer to implement support for it.) The draw of the Nitrokey then is the possibility the manufacturer merely incidentally allows alternate firmware to be flashed, rather than the manufacturer explicitly capitalising on the premise of an HSM as a general-purpose platform.

The microcontroller used in the Nitrokey Pro is an STM32F103TB. This microcontroller doesn't appear to offer as much in terms of fuse bits, etc. as the Atmel AT32UC3A3256S used for the Nitrokey Storage — see below — but apparently it is nonetheless possible to prevent readout via JTAG. So long as the device does not expose any facility to read out memory or change firmware via the USB interface, the device is secure. Theoretically, one could presumably implement a firmware update mechanism that securely erases memory or checks an update signature, but this does not appear to have been implemented for the Nitrokey Pro.

Strangely, the Bill of Materials for the Nitrokey Pro contains a SIM card socket, so presumably they have a trimmed down smartcard in there. Is the Nitrokey Pro really just a smartcard chip with a USB interface? Is the microcontroller just used to interface the smartcard to USB? Examination of the (open source) firmware seems to suggest something along these lines, although some functions, such as HOTP support, are implemented in the microcontroller.

The available information about the Nitrokey Pro then suggests a haphazard design in which some functions are performed on a microcontroller but others, for some reason, are performed by a smartcard chip. Perhaps this is due to a lack of confidence in the security of the main microcontroller? Or perhaps the smartcard chip is used as a poor man's cryptographic coprocessor, to accelerate public key operations. The latter would make more sense, as no public key cryptography appears to be used in the firmware, yet the firmware's execution environment appears to be trusted enough to implement other applications such as HOTP and a password safe. Except that the existence of gnuk (see below) demonstrates that it is absolutely feasible to perform public key cryptography on an STM32F103.
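
HOTP, one of the functions implemented on the microcontroller, is a good illustration of how little compute these applications actually need: RFC 4226 is just HMAC-SHA-1 over a counter plus a truncation step. A minimal sketch:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA-1 over the big-endian counter,
    followed by dynamic truncation to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # low nibble of last byte selects the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 4226 test secret `12345678901234567890`, counter 0 yields the well-known test vector 755224.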

Turning to the Nitrokey Storage: it promises "firmware updates and verification". The device also promises encrypted mass storage, which is apparently implemented via an onboard microSD card. I consider this a distraction for the purposes of an HSM, so I will focus on the more HSM-like side of the device.

The Nitrokey Storage appears to use an Atmel AT32UC3A3256S. This microcontroller has some fuse bits which appear to facilitate the sort of desirable security properties I described above. Rather ironic that these end up being in some arbitrary microcontroller not specifically intended for HSM purposes. It still has a smartcard chip attached to it, though.

Concerningly, the ideas page of the Nitrokey wiki describes a proposal for a secure firmware update scheme. This is in itself desirable, but it appears unclear whether this is intended to remove control of the device from the user. If a user is to retain the freedom to nominate their own firmware, how is this done, and can it also be done securely?

Nitrokey has published the results of a third-party audit of the Nitrokey Storage on their website. Their interpretation of the audit report is "great results". This seems like an odd reading of it.

From my perspective the results are not good: the report indicates that they didn't even set the security fuse bit, enabling uncontrolled access to the device's memory. This is baffling — you'd expect it to be one of the first bases to cover, and it demonstrates a disturbing lack of attention to detail in a security-sensitive product.

Still, most of the other issues raised by the hardware audit seem inapplicable to me, like complaining about the accessibility of the JTAG pins. If a device has JTAG pins, it has JTAG pins, and making them harder to get at isn't a terribly big improvement. But the presence of the JTAG pins isn't a problem anyway; the audit takes issue with the fact that it allows the device to be securely erased and then reprogrammed. But this hardly seems like a problem — and even if it is, obfuscating access to the JTAG pins doesn't make it not a problem. Certainly these proposed obfuscations might be improvements, but they are only small ones. Replacing an HSM program with a malicious program that doesn't have access to any of the existing data seems unlikely to pose much of a threat; I suppose if a user considers the spontaneous erasure of all data on the device to be a fluke and proceeds to reload it without appropriate investigation, it may pose a threat. For HSM applications where a malicious program can pose a threat even without the previously stored data, it seems more logical to authenticate the device via public key cryptography.
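
Authenticating the device cryptographically, as suggested above, is a standard challenge-response exchange. A minimal sketch, using an HMAC over a host-supplied nonce for brevity (the public-key variant the text suggests would instead have the device sign the challenge, with the host pinning the device's public key; all names here are illustrative):

```python
import hashlib
import hmac
import os

def device_respond(device_key: bytes, challenge: bytes) -> bytes:
    """Runs inside the HSM: prove possession of the key without revealing it."""
    return hmac.new(device_key, challenge, hashlib.sha256).digest()

def host_verify(expected_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Runs on the host: recompute the expected response and compare
    in constant time to avoid timing side channels."""
    expected = hmac.new(expected_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Typical flow: the host picks a fresh random challenge each time,
# so a recorded response cannot be replayed.
challenge = os.urandom(32)
```

A malicious replacement program that lacks the key cannot answer the challenge, which is precisely what makes the erase-on-reprogram property sufficient.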

The hardware audit also raises issues with a lack of physical tamper detection. Ultimately there is a question of what you consider to be the greater threat: malicious software, or physical attacks? A device providing protection only against malicious software, or additionally offering only the most minimal protection against physical attacks, can nonetheless be a highly valuable improvement in security, as malicious software will often pose a greater risk.

The firmware audit reaffirms that the Nitrokey Storage in its default configuration fundamentally fails to protect the data stored in it, again due to the failure to set the security fuse bit. Interestingly the externally attached smartcard chip increases the vulnerability here, as for some reason an administrative password needed to overwrite firmware is verified via it, making the main microcontroller vulnerable to a device impersonating the smartcard chip. At any rate, the premise of predicating the fundamental security of an HSM on a password seems extremely dubious, especially keeping in mind that it will necessarily need to be entered via the computer to which the Nitrokey Storage is attached, which may be compromised, this being the very threat an HSM guards against.

Other issues revealed by the firmware audit include the use of ECB mode in several places, as well as the incorrect use of CBC mode.
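
For those unfamiliar with why ECB use is a finding: ECB encrypts each block independently, so identical plaintext blocks yield identical ciphertext blocks, leaking structure through the ciphertext. A toy demonstration (the block transform here is a keyed-hash stand-in of my own, since the leak is a property of the mode, not of the cipher):

```python
import hashlib

BLOCK = 16  # AES block size in bytes

def toy_ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt each 16-byte block independently, ECB-style.
    Stand-in block transform: a keyed hash truncated to one block.
    Real AES-ECB exhibits exactly the same pattern leak."""
    blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
    return b"".join(hashlib.sha256(key + b).digest()[:BLOCK] for b in blocks)
```

Encrypting two identical plaintext blocks produces two identical ciphertext blocks, which is how the famous "ECB penguin" image stays recognisable after encryption.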


gnuk

gnuk is an implementation of the OpenPGP smartcard protocol for the STM32F103 series of microcontrollers. Seems neat.

Coincidentally, the Nitrokey Pro uses an STM32F103. So it's clearly possible to implement public key operations on the STM32F103, making the additional smartcard chip in the Nitrokey Pro (not to mention the Nitrokey Storage) even more puzzling.