
Reconfigurable Physical Unclonable Functions – Enabling Technology for Tamper-Resistant Storage

Klaus Kursawe∗, Ahmad-Reza Sadeghi†, Dries Schellekens‡, Boris Škorić§ and Pim Tuyls‡¶

∗ Philips Research Laboratories, Eindhoven, The Netherlands, email: [email protected]
† Horst Görtz Institute for IT Security, Ruhr-Universität Bochum, Germany, email: [email protected]
‡ Katholieke Universiteit Leuven, ESAT-SCD/COSIC, Belgium and IBBT, email: [email protected]
§ Technische Universiteit Eindhoven, The Netherlands, email: [email protected]
¶ Intrinsic-ID, Eindhoven, The Netherlands, email: [email protected]

Abstract—A PUF or Physical Unclonable Function is a function that is embodied in a physical structure consisting of many random, uncontrollable components that originate from process variations during manufacturing. Due to this random structure, a physical stimulus or challenge generates unpredictable responses. Because of their physical properties, PUFs are unclonable and very promising primitives for authentication and for the storage of cryptographic keys. Previous work on PUFs mainly considers static challenge-response PUFs. In many applications, however, a dynamic PUF would be desirable, e.g., in order to allow the key derived from the PUF to be updated. We define a new primitive, the reconfigurable PUF (rPUF), which is a PUF with a mechanism to transform it into a new PUF with a new unpredictable and uncontrollable challenge-response behavior, even if the challenge-response behavior of the original PUF is already known. We present two practical instantiations of a reconfigurable PUF. One is a new variant of the optical PUF, and the other is based on phase change memory. We also illustrate how an rPUF can be used to protect non-volatile storage against invasive physical attacks.

Index Terms—Physical unclonable function, reconfiguration, non-volatile memory, tamper resistance.

I. INTRODUCTION

Physical Unclonable Functions (PUFs) form a cryptographic primitive that has received considerable attention in recent years. The first examples were optical PUFs, introduced in [1], [2]. A physical structure containing randomly distributed light scattering particles is illuminated with a laser beam and the resulting interference pattern is detected with a camera. Optical PUFs have multiple independent challenge-response pairs, where the challenge can for instance be the angle of incidence or the wavelength of the laser beam, and the response is the recorded speckle pattern. The security of these PUFs was rigorously analyzed in [3], and it was shown in [4], [5] that the optical measurement setup can be integrated in a single, miniaturized device.

978-1-4244-4804-3/09/$25.00 © 2009 IEEE

Later, two types of PUFs embedded in silicon devices were presented: delay PUFs and memory based PUFs. Both are based on the fact that every Integrated Circuit (IC) is unique due to process variations during manufacturing. Ring oscillator PUFs [6] and arbiter PUFs [7] are of the first type and measure the difference in delay between seemingly identical circuits. In most cases the exact route through the delay circuit can be configured with a challenge. The resolution of the measurement is limited, and hence the response of these PUFs has a small size. It has been shown in [8], [9] that the behavior of some delay PUFs can be modeled accurately, and hence that their challenge-response pairs are often not independent. Memory based PUFs, on the other hand, rely on the mismatch between gates in memory structures, such as SRAM cells [10], [11], [12], cross-coupled NOR gates [13], [14] or cross-coupled latches [15]. Every memory element yields one response bit, and typically no challenge mechanism is present¹.

¹A challenge-response behavior can be mimicked by using more memory cells. A downside of this approach is that the amount of hardware resources scales exponentially with the challenge size.
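The scaling noted in the footnote above can be made concrete with a small sketch. All names here (`make_memory_puf`) are hypothetical, and a random bit array merely stands in for the device-specific power-up values of real SRAM cells:

```python
import secrets

def make_memory_puf(challenge_bits: int):
    """Emulate a challenge-response interface over a memory-based PUF by
    dedicating one SRAM-like cell per possible challenge value.

    The power-up value of each cell is fixed per device but random across
    devices; here a random bit array models it. Note the storage cost:
    2**challenge_bits cells are needed for an n-bit challenge, i.e. the
    hardware resources grow exponentially with the challenge size.
    """
    cells = [secrets.randbits(1) for _ in range(2 ** challenge_bits)]
    return lambda challenge: cells[challenge]

puf = make_memory_puf(10)   # already 1024 cells for a 10-bit challenge
r = puf(0x2A)               # response bit for challenge 42
assert r in (0, 1)
assert puf(0x2A) == r       # static PUF: same challenge, same response
```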


Protective coatings are another method to integrate a PUF with an IC [16], [5]. During manufacturing, a coating layer consisting of particles with different dielectric properties is sprayed on top of the IC. The capacitance of this coating can be measured by the IC with an array of sensors. The challenge selects a specific sensor and the response is the locally measured capacitance value. This type of PUF has a small number of challenge-response pairs, because the number of sensors is limited and every sensor yields only a few bits.

It was shown in [4], [17] that with appropriate postprocessing cryptographic keys can be derived from PUF responses. This makes PUFs a very promising technology for secure key storage in ICs in the invasive setting. In this model an attacker is allowed to break into devices with powerful tools such as a focused ion beam, a laser cutter, etc. Experiments with coating PUFs [16] in particular have demonstrated that invasive physical attacks alter the behavior of the PUF to such an extent that the stored key is damaged. Moreover, PUF based key storage is often more cost efficient than the integration of dedicated Non-Volatile Memory (NVM), such as fuses and Flash memory. This is for instance the case with SRAM based PUFs, as SRAM cells can be implemented with standard CMOS technology.

Some applications require an updatable cryptographic key. A good example is the protection of sensitive data in untrusted non-volatile storage. The confidentiality and integrity of such data can be protected with an encryption and an authentication algorithm respectively, but a source of freshness is needed to detect the replay of old versions of the data. One approach to achieve replay detection is to alter the key material of the protection scheme every time the data is modified.

A. Related Work

The PUF constructions in the literature have always considered a static challenge-response behavior: for a given challenge, the PUF should always yield the same response².

²In practice, due to measurement noise, the responses should be as close to each other as possible.

In order to achieve updatable PUF based key storage, a mechanism is needed to alter the challenge-response behavior. Lim was the first to observe that this is a desirable feature and denoted it as a reconfigurable PUF (rPUF) [7]. He also sketched an Arbiter PUF with floating gate transistors. However, it is unclear how the proposed PUF construction would be reconfigured and

whether a reconfiguration sufficiently changes the PUF's responses. Majzoobi et al. also use the term reconfigurable PUF [18] for the implementation of a static PUF in reconfigurable hardware. However, this approach does not yield a secure rPUF, because the reconfigurations of contemporary FPGAs can be undone.

B. Contributions

This paper makes two main contributions. Firstly, we present the first concrete realizations of a reconfigurable PUF. Secondly, we present an rPUF based non-volatile memory protection scheme. This scheme provides security even against invasive physical attacks.

II. RECONFIGURABLE PHYSICAL UNCLONABLE FUNCTIONS

Loosely speaking, a reconfigurable PUF is a PUF with a mechanism to change the PUF into a new one, ideally with a new unpredictable challenge-response behavior, even if one knows the challenge-response behavior of the original PUF. Additionally, the new PUF inherits all the security properties of the original one. We denote the interface to read the response r of a PUF for a given challenge c with r ← Read(c), and the additional reconfiguration interface with Reconf(). A more formal definition of these interfaces and the security properties of PUFs is given in Appendix A. It is important to note that the reconfiguration mechanism has to be uncontrollable and should not be based on updating a hidden parameter, e.g., part of the challenge or the location of the PUF structure in reconfigurable FPGA logic [18]. The PUF reconfiguration must be difficult to revert, even with invasive means.

A. Reconfigurable Optical PUFs

As a first example of a reconfigurable PUF we present the reconfigurable optical PUF. The physical structure is an object containing light scattering particles. The position and physical state (polarization) of these particles define the configuration of the structure [3].
The structure satisfies the following two conditions:
1) When the structure is irradiated with a laser beam within normal operating conditions, it does not change its internal configuration and produces a "steady" speckle pattern (see Fig. 1(a)).
2) When the structure is irradiated with a laser beam outside the normal operating conditions, it will change its internal configuration (see Fig. 1(b)).
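The Read/Reconf interface described above can be sketched as a toy software model. This is purely illustrative and not a security construction: the hidden byte string plays the role of the scatterer configuration, a hash derives a response from configuration and challenge, and reconfiguration resamples the configuration uncontrollably:

```python
import hashlib
import secrets

class ToyRPUF:
    """Toy model of the rPUF interface (illustration only).

    The hidden value _s plays the role of the physical configuration
    (scatterer positions), Read(c) derives a response from (s, c), and
    Reconf() resamples s, changing every challenge-response pair at once.
    """

    def __init__(self):
        self._s = secrets.token_bytes(32)   # hidden physical configuration

    def read(self, c: bytes) -> bytes:
        # Deterministic in (s, c); a real PUF response is additionally noisy.
        return hashlib.sha256(self._s + c).digest()

    def reconf(self) -> None:
        # Models melting and re-freezing: a fresh, uncontrollable configuration.
        self._s = secrets.token_bytes(32)

puf = ToyRPUF()
r1 = puf.read(b"challenge-1")
assert puf.read(b"challenge-1") == r1   # static between reconfigurations
puf.reconf()
assert puf.read(b"challenge-1") != r1   # new challenge-response behavior
```

Note that this model captures only the interface; the unclonability and irreversibility of a real rPUF come from its physics, not from software.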

2009 IEEE International Workshop on Hardware-Oriented Security and Trust (HOST)


Fig. 1. Example of speckle pattern: (a) before Reconf(); (b) after Reconf().

Concretely, we propose a structure that consists of a polymer containing randomly distributed light scattering particles³, as opposed to the normal glass PUF [1].

³Alternatively a phase change substance, widely used in rewritable optical discs, can be used instead of a polymer.

In terms of the definitions in Appendix A the reconfigurable optical PUF is described as follows.
• The configuration space is given by S = {0, 1}ⁿ where n = V/λ³, with V the volume of the structure and λ the wavelength of the laser. A cube of volume λ³ is usually called a voxel. A configuration string s ∈ S defines the voxels in which a scatterer is present and completely determines the speckle patterns that are generated when the structure is irradiated with a laser beam.
• The challenge space C is defined by the set of angles and locations under which the structure can be irradiated by the laser beam.
• The response space R is the set of all possible speckle patterns (see Fig. 1 for two examples).
• The set P is the set of random variables that define the responses of the optical structure to the applied challenges.
• It was shown in [3] that the angles and locations have to be sufficiently far apart to satisfy the information theoretic security condition of Def. 3.
• The fact that an optical PUF is hard to clone physically follows from the fact that speckle phenomena are very sensitive to very small variations in the locations of the scattering particles [1]. Accurate mathematical modeling of speckle phenomena is very difficult [19].
• The interface Read(c) applied to a challenge c is implemented by irradiating the PUF with the laser according to the angle and position defined by c and measuring the speckle pattern with the CMOS sensor (see Fig. 2).
• The Reconf() command is implemented by driving the laser at a higher current such that a laser beam of


Fig. 2. Schematic side view of an integrated reconfigurable optical PUF [20].

[Figure 3: resistance (Ω) axis with two logical states ("logical 0" and "logical 1"); each logical interval is subdivided into "left" and "right" sub-intervals, labeled r = 0 and r = 1 respectively.]
Fig. 3. PCM based rPUF. Limited control over the heating allows for 2 logical states and an accurate measurement gives one rPUF bit r.

higher intensity is created, which melts the polymer locally and enables the scattering particles to reposition. After a short time the laser beam is removed and the structure cools down such that the particles freeze in place.

Finally, we note that it was shown in [4] that optical PUFs (containing a laser and a structure) can be completely integrated and therefore produced very compactly. The results obtained there clearly transfer to the reconfigurable optical PUF as well. Hence we believe that a practical implementation is feasible.

B. Phase Change Memory based rPUF

Phase Change Memory (PCM) is a new type of fast non-volatile memory that has the potential to replace Flash and even DRAM. Each memory cell contains a piece of chalcogenide glass, usually a doped alloy of germanium, antimony and tellurium (GeSbTe), the same material used in rewritable optical discs. By subjecting it to a specific heating pattern, a phase change is induced: in the amorphous state the resistivity is high (logical "1" state); in the crystalline state it is low ("0"). Intermediate states (e.g., semi-amorphous and semi-crystalline) can be realized as well, allowing for more than one bit of storage per cell. The heating is regulated by passing a current through the cell. The state is read out by measuring the cell's resistance. PCM has very favorable properties: phase transition times of 5 ns have been achieved, and a PCM cell may endure around 10⁸ write cycles [21].


We observe that the control over the phase can be made less precise than the accuracy of the resistance measurement. Consider a cell whose state can be controlled just well enough to reliably realize n logical states (the resistance axis divided into n intervals), while measurements are precise enough not only to tell in which interval the resistance lies, but also where in that interval. Each logical interval is subdivided into a number of more fine-grained intervals, e.g., "left" and "right" (Fig. 3). This additional information is easy to read, but cannot be controlled during the writing process. Hence we have an uncontrolled process resulting in a long-lived random state that can be reconfigured at will; precisely what is needed for an rPUF. PCM's suitability for embedded memory [22] allows for a compact rPUF located inside an integrated circuit.

C. Protective Coating on Non-Volatile Memory

The security properties of an rPUF can also be achieved by placing a static coating PUF on top of non-volatile memory, which contains part of the PUF's challenge. The challenge c that is exposed via the external read interface is combined internally with the non-volatile reconfiguration state sReconf stored in the NVM and applied to the coating PUF: r ← Read(c||sReconf). An external reconfiguration corresponds to updating the state sReconf. The internal reconfiguration state can either be a randomly generated value or a monotonic counter. In essence this approach mimics the behavior of an rPUF by using a large number of static PUFs: after every reconfiguration, a different static PUF is selected. This comes at the expense of a large amount of resources, compared to a dedicated rPUF that can reconfigure its physical structure. In order to protect against invasive replay attacks, it is crucial that the static PUF and the NVM are tightly integrated. Attempts to access the NVM from the outside should significantly change the responses of the PUF.
As demonstrated in [16], coating PUFs satisfy this requirement.

III. PROTECTION OF NON-VOLATILE STORAGE WITH rPUFs

PUFs are a security component typically used for IC authentication/identification or secure key storage. In this work we focus on the latter application. As a case study we investigate how the non-volatile memory of a security module (e.g., a smart card or a Trusted Platform Module) can be protected by an rPUF against invasive attacks. The same protection scheme can be used when the security module lacks on-chip reprogrammable non-volatile memory and consequently has to rely on external persistent storage.

A. Reliable Key Extraction with Fuzzy Extractors

The generation of a secret key from PUF responses needs some additional processing steps. The process that turns a PUF response into a cryptographic key consists of two steps: error correction and randomness extraction. We briefly repeat the concept of a fuzzy extractor or helper data algorithm, which allows one to extract a cryptographically secure key from noisy and non-uniformly random PUF responses. For details we refer to the literature [23], [24], [4], [20], [25].

1) Enrolment: During enrolment of the PUF, the procedure Gen is carried out:

(k, w) ← Gen(r)    (1)

It takes as input a noisy PUF response r and creates a key k and helper data w. Clearly the key k has to be kept secret. The helper data w, on the other hand, is public and can be stored in non-secure NVM. In order to protect against active attackers (who may change the helper data), robust fuzzy extractors should be used [26].

2) Reconstruction: During the key-reconstruction phase, a noisy PUF response r′ is measured. Then the procedure Rep is run:

k ← Rep(r′, w)    (2)

It produces the same key k on input r′ and w, given that r′ is sufficiently close to r.

B. Persistent Storage Protection

The core idea is to protect the sensitive state information of a security module with an updatable key derived from an rPUF, as depicted in Fig. 4. Each time the state is modified, the rPUF is reconfigured and the data is re-encrypted and re-authenticated with the resulting new key. We will apply the standard techniques used for the protection of external RAM [27], which usually consist of two components: memory encryption and efficient integrity verification using a hash tree, also called a Merkle tree. Typically, the root of the authentication tree is stored inside the security module. In order to protect the integrity of the Merkle tree against invasive attacks, we propose to additionally authenticate the root with an rPUF derived key. Our proposal is depicted in Fig. 5.
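The Gen/Rep pair of Section III-A can be sketched with a toy code-offset construction. This is a minimal illustration, not a deployable fuzzy extractor: a repetition code stands in for the error-correcting code (real designs use, e.g., BCH codes, as in the cited literature), and SHA-256 stands in for a proper randomness extractor:

```python
import hashlib
import secrets

REP = 5  # repetition factor: tolerates up to 2 flipped readings per secret bit

def gen(r: list[int]) -> tuple[bytes, list[int]]:
    """Gen: derive a key k and public helper data w from a noisy response r.
    Code-offset construction: w is the XOR of r with a random codeword."""
    secret_bits = [secrets.randbits(1) for _ in range(len(r) // REP)]
    codeword = [b for b in secret_bits for _ in range(REP)]
    w = [ri ^ ci for ri, ci in zip(r, codeword)]        # public helper data
    k = hashlib.sha256(bytes(secret_bits)).digest()     # extracted key
    return k, w

def rep(r_noisy: list[int], w: list[int]) -> bytes:
    """Rep: reproduce k from a noisy re-reading r' and helper data w."""
    cw = [ri ^ wi for ri, wi in zip(r_noisy, w)]        # noisy codeword
    bits = [int(sum(cw[i * REP:(i + 1) * REP]) > REP // 2)  # majority decode
            for i in range(len(cw) // REP)]
    return hashlib.sha256(bytes(bits)).digest()

r = [secrets.randbits(1) for _ in range(40)]            # enrolment reading
k, w = gen(r)
r_noisy = r.copy()
r_noisy[3] ^= 1                                         # two measurement
r_noisy[17] ^= 1                                        # errors in r'
assert rep(r_noisy, w) == k                             # same key recovered
```

The helper data w can safely be public, since each helper bit is masked by a random codeword bit; the leakage analysis for real codes is covered in the referenced fuzzy extractor literature.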


[Figure 4: two schematics of a security module containing a microcontroller (μC), internal NVM, an authenticated encryption block (AE), a fuzzy extractor (Rep/Gen) and the rPUF. (a) Reading persistent state: Read(cT) yields rT, Rep(rT, wT) yields kT, and the stored AEkT(T) is verified and decrypted. (b) Updating persistent state: after Reconf(), Read(cT) yields a fresh r̃T, Gen produces k̃T and w̃T, and the new state is stored as AEk̃T(T̃).]
Fig. 4. Authenticated encryption of integrated non-volatile memory with an updatable rPUF derived key kT.

[Figure 5: authentication tree over encrypted objects. Objects o0, ..., o3 with nonces n0, ..., n3 are encrypted to e0, ..., e3; leaf hashes h0, ..., h3 are computed as hi = H(ni||oi), intermediate nodes h4 and h5 as hj = H(hi||hi+1), up to the root hRoot, which carries the two tags m1 and m2 with mi = HkAuth,i(hRoot).]
Fig. 5. Memory encryption with a static key kT and robust integrity verification using a Merkle tree, whose root is authenticated with two updatable keys kAuth,1 and kAuth,2.

1) Memory Encryption: The persistent state, denoted with T, is split into different logical objects oi. These objects are encrypted with a module specific, non-updatable key kT, which is derived from a static PUF. This yields the encrypted objects ei:

ei = EkT(ni, oi)  with  ni ∈R {0, 1}*    (3)

As encryption algorithm, AES can be used in CBC or counter mode. In case of CBC mode, the nonce ni represents the initialization vector (IV); for counter mode it denotes the initial counter value. The nonces should be different for every object, such that an adversary cannot identify whether certain objects are (partially) the same.
• For CBC mode, the initialization vector can be chosen randomly (i.e., ni = r) or as ni = i||ctri, with ctri a monotonic counter that is incremented on every update of oi.
• For counter mode, the initial counter value can be chosen partially random (i.e., ni = r||0) or as ni = i||ctri||0, with ctri a monotonic counter. The number of zeros depends on the length of the object.

2) Authentication Tree: The leaf nodes of the authentication tree are calculated as the hash of the nonce ni and the object oi:

hi = H(ni||oi)    (4)

The nonce ni is included in the calculation of the authentication tag, such that no information is leaked about the object. As an alternative, a dedicated authenticated encryption scheme could be used to calculate the tags:

(ei, hi) = AEkT(ni, oi)    (5)

The intermediate nodes are computed as the hash of their two child nodes: hj = H(hi||hi+1). The root of the authentication tree hRoot is authenticated with two keys kAuth,1 and kAuth,2 derived from different rPUFs:

∀i ∈ {1, 2} : mi = HkAuth,i(hRoot)    (6)
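The tree construction of equations (4) and (6) can be sketched as follows. This is an illustrative model only: HMAC-SHA256 stands in for the keyed hash Hk, random byte strings stand in for the rPUF-derived keys, and the helper function assumes a power-of-two number of objects:

```python
import hashlib
import hmac
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(nonces: list[bytes], objects: list[bytes]) -> bytes:
    """Build the authentication tree of Fig. 5: leaves hi = H(ni||oi),
    parents H(hi||hi+1), assuming a power-of-two number of objects."""
    level = [H(n + o) for n, o in zip(nonces, objects)]
    while len(level) > 1:
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Root authenticated with two rPUF-derived keys, as in eq. (6).
k_auth = [secrets.token_bytes(32), secrets.token_bytes(32)]
objs   = [b"obj0", b"obj1", b"obj2", b"obj3"]
nonces = [secrets.token_bytes(16) for _ in objs]
root   = merkle_root(nonces, objs)
m = [hmac.new(k, root, hashlib.sha256).digest() for k in k_auth]

# Any modification of an object changes the root, invalidating both tags.
tampered = merkle_root(nonces, [b"obj0", b"EVIL", b"obj2", b"obj3"])
assert tampered != root
assert hmac.compare_digest(
    m[0], hmac.new(k_auth[0], root, hashlib.sha256).digest())
```

Verifying a single object only requires recomputing its leaf and the hashes on the path to the root, which is what makes the tree efficient for large states.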

3) Reliable State Updates: The usage of two authentication keys is required to recover from accidental or malicious failures during an update of the state T. Each time an object oi changes, the following steps must be performed:
1) A copy of the path from oi to the root hRoot is temporarily stored in non-volatile memory.
2) A new nonce ñi is generated, and the corresponding ciphertext ẽi, authentication tag h̃i and the path in the authentication tree are recomputed.
3) The first rPUF is reconfigured, yielding a fresh key k̃Auth,1.
4) The new root node h̃Root is authenticated with this reconfigured key: m̃1 = Hk̃Auth,1(h̃Root).
5) The second rPUF is reconfigured, yielding a fresh key k̃Auth,2.
6) The Merkle tree root node is authenticated with this new key as well: m̃2 = Hk̃Auth,2(h̃Root).


7) The backup copy (made in step 1) is deleted.
A number of failure scenarios can be distinguished:
• If a failure occurs before step 3, the previous state T can be recovered from the copy made at the beginning.
• A failure during step 3 or 4 results in m1 and kAuth,1 being out of sync. However, the integrity of the previous state T can still be validated with m2, which has not been modified. In order to recover from the failure, the update process has to restart from step 3.
• If a failure occurs after step 4, the security module can use T̃ as state. The recovery process has to restart from step 5, because m2 is invalid.

C. Security in the Invasive Setting

The security of the state protection scheme relies on the confidentiality of the static encryption key kT and the confidentiality and integrity of the dynamic authentication keys kAuth,i. Cryptographic keys are susceptible to various kinds of hardware attacks, both when they are in use and at rest. In [16] it was demonstrated that (coating) PUFs provide a highly secure method to store secret keys at rest. PUF based key extraction provides automatic key zeroization when the device is being tampered with: attempts to discover the responses, and hence the derived keys, with (semi-)invasive physical attacks will substantially alter the challenge-response behavior. Additionally, PUF reconfigurations are uncontrollable and consequently difficult to revert. This enables the security module to detect replay attacks. An adversary might be able to read the encrypted state, including the Merkle tree, with invasive means and can try to overwrite it at a later moment in time with this old copy. However, he will not be successful, as the rPUF derived authentication keys will have been altered. It should be noted that appropriate countermeasures must be taken to protect the cryptographic keys in use.
Side-channel attacks exploit the fact that a physical implementation of a cryptographic algorithm leaks unwanted information while the key is used to process data.

D. Non-Volatile Memory Externalization

The proposed protection scheme can be used to secure the integrated persistent storage of a security module. The integration of non-volatile memory is typically done in two ways: embedded on the same chip (in a so-called System-on-Chip or SoC) or included as a separate chip inside the same package (as a so-called System-in-Package or SiP). High-end embedded devices, such as the application processor of a mobile phone, tend to use the SiP approach, because current NVM technologies, such as Flash memory, incur an additional cost (e.g., additional masks or process steps). Physical attacks on SiP devices are less difficult: if an attacker can remove the NVM chip from the package, he can read and write the content of the integrated memory through its internal communication interface. In case the NVM is embedded on the same die as the trusted module, this attack requires more powerful tools.

The security module regards the non-volatile storage as untrusted. Hence, the scheme can also be used to protect storage in an external NVM chip. In this situation an adversary has access to the external interface of the NVM chip and can use it to manipulate the externalized state information. However, the confidentiality, integrity and freshness of the state are still guaranteed when the PUFs are embedded in the security module.

E. Implementation Cost

The rPUF based protection scheme comes at a cost in chip area. Additional hardware resources are needed for the fuzzy extractor, a symmetric cipher and a hash function. In most systems, adding these functions is relatively cheap. The main price point will be the implementation cost of the actual rPUF; however, as we see several ways to implement the construct and research is still at an early stage, it is impossible to predict the practical cost right now. In addition, the protection scheme introduces an overhead in memory wear, as the sensitive state is encrypted. Whereas changing small state information normally only requires a small portion of NVM to be changed, now at least one encrypted block, as well as a number of nodes in the authentication tree, need to be updated⁴.
We believe though that the total overhead of write operations to the memory is relatively small, and standard techniques used in NVM memory controllers (such as relocation of 'hotspots') can be used to mitigate the additional effects.

⁴It should be noted though that the description of our scheme omits possible optimizations; it is possible, for example, to have an unbalanced authentication tree with dynamic items (such as counters) close to the root and in small blocks, while more static objects (such as fixed cryptographic keys) reside in bigger blocks, further from the root of the authentication tree.

IV. CONCLUSION AND FUTURE WORK

In this paper we have defined the notion of reconfigurable Physical Unclonable Functions. Loosely speaking,


a reconfigurable PUF is a PUF with a mechanism to transform it into a new PUF with a new unpredictable and uncontrollable challenge-response behavior, even if the challenge-response behavior of the original PUF is already known, while preserving all the security properties of the original one. We have presented the example of the reconfigurable optical PUF and sketched an rPUF based on Phase Change Memory. We have shown how this type of PUF can be deployed to protect non-volatile storage against invasive attacks.

Research on this new security primitive is still at an early stage. We have experimentally demonstrated the working principle of the optical rPUF, namely the redistribution of light scattering particles in a polymer material by irradiating it with a high intensity laser beam. However, more work has to be done in order to determine how many reconfigurations can be reliably performed. At the moment the PCM based rPUF remains a concept, which has not been practically verified.

ACKNOWLEDGMENTS

This work was in part supported by the IAP Program P6/26 BCRYPT of the Belgian State, by K.U.Leuven BOF funding (OT/06/04), by the FWO project G.0300.07 (Security components for trusted computer systems) and by the European Commission through the IST Programme under Contract IST-027635 OPEN TC.

REFERENCES

[1] R. S. Pappu, "Physical One-Way Functions," Ph.D. dissertation, Massachusetts Institute of Technology, March 2001.
[2] R. Pappu, B. Recht, J. Taylor, and N. Gershenfeld, "Physical One-Way Functions," Science, vol. 297, no. 5589, pp. 2026–2030, 2002.
[3] P. Tuyls, B. Škorić, S. Stallinga, A. H. M. Akkermans, and W. Ophey, "Information-Theoretic Security Analysis of Physical Uncloneable Functions," in Financial Cryptography and Data Security – FC 2005, ser. Lecture Notes in Computer Science, A. S. Patrick and M. Yung, Eds., vol. 3570.
Springer-Verlag, 2005, pp. 141–155.
[4] B. Škorić, P. Tuyls, and W. Ophey, "Robust Key Extraction from Physical Uncloneable Functions," in Applied Cryptography and Network Security – ACNS 2005, ser. Lecture Notes in Computer Science, J. Ioannidis, A. D. Keromytis, and M. Yung, Eds., vol. 3531. Springer-Verlag, 2005, pp. 407–422.
[5] B. Škorić, G.-J. Schrijen, W. Ophey, R. Wolters, N. Verhaegh, and J. van Geloven, "Experimental Hardware for Coating PUFs and Optical PUFs," in Security with Noisy Data: On Private Biometrics, Secure Key Storage and Anti-Counterfeiting. Springer-Verlag, 2007, ch. 15.


[6] B. Gassend, D. E. Clarke, M. van Dijk, and S. Devadas, "Silicon Physical Unknown Functions," in ACM Conference on Computer and Communications Security – CCS 2002, V. Atluri, Ed. ACM, 2002, pp. 148–160.
[7] D. Lim, "Extracting Secret Keys from Integrated Circuits," Master's thesis, Massachusetts Institute of Technology, May 2004.
[8] B. Gassend, D. Lim, D. Clarke, M. van Dijk, and S. Devadas, "Identification and Authentication of Integrated Circuits," Concurrency and Computation: Practice and Experience, vol. 16, no. 11, pp. 1077–1098, 2004.
[9] E. Öztürk, G. Hammouri, and B. Sunar, "Towards Robust Low Cost Authentication for Pervasive Devices," in PERCOM '08: Proceedings of the Sixth Annual IEEE International Conference on Pervasive Computing and Communications. IEEE Computer Society, 2008, pp. 170–178.
[10] J. Guajardo, S. S. Kumar, G.-J. Schrijen, and P. Tuyls, "FPGA Intrinsic PUFs and Their Use for IP Protection," in Cryptographic Hardware and Embedded Systems – CHES 2007, ser. Lecture Notes in Computer Science, P. Paillier and I. Verbauwhede, Eds., vol. 4727. Springer-Verlag, 2007, pp. 63–80.
[11] D. E. Holcomb, W. P. Burleson, and K. Fu, "Initial SRAM State as a Fingerprint and Source of True Random Numbers for RFID Tags," in Proceedings of the Conference on RFID Security, Jul. 2007.
[12] D. E. Holcomb, W. P. Burleson, and K. Fu, "Power-up SRAM State as an Identifying Fingerprint and Source of True Random Numbers," IEEE Transactions on Computers, 2009, to appear.
[13] Y. Su, J. Holleman, and B. P. Otis, "A Digital 1.6 pJ/bit Chip Identification Circuit Using Process Variations," in IEEE International Solid-State Circuits Conference (ISSCC) 2007, 2007, pp. 15–17.
[14] ——, "A Digital 1.6 pJ/bit Chip Identification Circuit Using Process Variations," IEEE Journal of Solid-State Circuits, vol. 43, no. 1, pp. 69–77, Jan. 2008.
[15] S. S. Kumar, J. Guajardo, R. Maes, G.-J. Schrijen, and P. Tuyls, "The Butterfly PUF: Protecting IP on every FPGA," in IEEE International Workshop on Hardware-Oriented Security and Trust – HOST 2008, M. Tehranipoor and J. Plusquellic, Eds. IEEE Computer Society, 2008, pp. 67–70.
[16] P. Tuyls, G.-J. Schrijen, B. Škorić, J. van Geloven, N. Verhaegh, and R. Wolters, "Read-Proof Hardware from Protective Coatings," in Cryptographic Hardware and Embedded Systems – CHES 2006, ser. Lecture Notes in Computer Science, L. Goubin and M. Matsui, Eds., vol. 4249. Springer-Verlag, 2006, pp. 369–383.
[17] G. E. Suh, C. W. O'Donnell, I. Sachdev, and S. Devadas, "Design and Implementation of the AEGIS Single-Chip Secure Processor Using Physical Random Functions," in 32nd International Symposium on Computer Architecture (ISCA 2005). IEEE Computer Society, 2005, pp. 25–36.
[18] M. Majzoobi, F. Koushanfar, and M. Potkonjak, "Techniques for Design and Implementation of Secure Reconfigurable PUFs," ACM Transactions on Reconfigurable Technology and Systems, vol. 2, no. 1, pp. 1–33, 2009.
[19] J. W. Goodman, "Statistical properties of laser speckle patterns,"


in Laser Speckle and Related Phenomena, J. W. Dainty, Ed. Springer-Verlag, 1975.
[20] P. Tuyls, B. Škorić, and T. Kevenaar, Eds., Security with Noisy Data: On Private Biometrics, Secure Key Storage and Anti-Counterfeiting. Springer-Verlag, 2007.
[21] M. Yam, "Intel to Sample Phase Change Memory This Year," 2007, http://www.dailytech.com/article.aspx?newsid=6371.
[22] Ovonyx, "Market Applications of Ovonic Unified Memory (OUM)," http://ovonyx.com/technology/market-applications-of-oum.html.
[23] J.-P. M. G. Linnartz and P. Tuyls, "New Shielding Functions to Enhance Privacy and Prevent Misuse of Biometric Templates," in Audio- and Video-Based Biometric Person Authentication – AVBPA 2003, ser. Lecture Notes in Computer Science, J. Kittler and M. S. Nixon, Eds., vol. 2688. Springer-Verlag, 2003, pp. 393–402.
[24] Y. Dodis, L. Reyzin, and A. Smith, "Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data," in Advances in Cryptology – EUROCRYPT 2004, ser. Lecture Notes in Computer Science, vol. 3027. Springer-Verlag, 2004, pp. 523–540.
[25] C. Bösch, J. Guajardo, A.-R. Sadeghi, J. Shokrollahi, and P. Tuyls, "Efficient Helper Data Key Extractor on FPGAs," in Cryptographic Hardware and Embedded Systems – CHES 2008, ser. Lecture Notes in Computer Science, E. Oswald and P. Rohatgi, Eds. Springer-Verlag, 2008.
[26] X. Boyen, "Robust and Reusable Fuzzy Extractors," in Security with Noisy Data: On Private Biometrics, Secure Key Storage and Anti-Counterfeiting. Springer-Verlag, 2007, ch. 6.
[27] G. E. Suh, D. E. Clarke, B. Gassend, M. van Dijk, and S. Devadas, "Efficient Memory Integrity Verification and Encryption for Secure Processors," in 36th Annual International Symposium on Microarchitecture. ACM/IEEE, 2003, pp. 339–350.

APPENDIX
A. Formal Definition of a Reconfigurable PUF

Let S = {0, 1}^n denote the configuration space of the physical system that constitutes the PUF, let C be the space of challenges that can be applied to the system, and let R be the response space of the system. The responses in R are observed through a noisy channel. We therefore model the mapping from challenges to responses as a family of random variables:

P = {R : S × C → R | R(s, c) is a random variable distributed according to P_{s,c}},

where P_{s,c} denotes a probability distribution on R. The responses R(·, ·) of the PUF depend on the configuration s ∈ S of the physical system, which is secret, and on the challenge c (which is public) applied to it. Note that for a fixed challenge c there are 2^n possible noisy responses R(·, c), since the number of physical systems is 2^n. The state space S describes the configurations of the random components in the PUF that determine its functional behavior; in the case of optical PUFs [3], for instance, it describes the positions of the scattering particles.

More precisely, we define PUFs as follows:

Definition 1 (PUF Type): A PUF type is a set of physical systems represented by the tuple (S, P, C, R) with state space S, function space P, challenge space C, and response space R.

Definition 2 (PUF Instantiation): An instantiation of the PUF type (S, P, C, R) is defined by (s, R(s, ·)) with secret state s ∈_R S (i.e., s drawn uniformly at random from S), where R(s, ·) is the restriction of the random variable R(·, ·) to the space C.

Next we consider the security of PUF instantiations.

Definition 3 (PUF Security): We call a PUF instantiation information-theoretically secure^5 iff the following condition holds:

I(R(C′); R(C), C, C′) ≈ 0,

where C, C′ are random variables over C, R(C), R(C′) are random variables over R, and I(·; ·) denotes the mutual information. We call it physically secure if
1) it is very hard to clone the PUF, i.e., the time complexity of making a physical or mathematical (simulation) copy is exponential in n; and
2) the PUF is tamper evident.

We recall that unclonability of a PUF means that the PUF is unclonable even for its manufacturer (manufacturer non-reproducibility): the production of the PUF is not based on a secret that the manufacturer has to hide. Tamper evidence means that when the PUF is damaged (e.g., by an invasive attack), its challenge-response behavior changes completely and in an unpredictable way.

For a given challenge c ∈ C we denote by r(s, c) the actual response of the PUF (we often write r(c) when it is clear which PUF is being considered).

Definition 4 (PUF Interfaces): An instantiation of a PUF (s, R(s, ·)) has the following interface function:

r(c) ← Read(c)

with r(c) ∈ R.
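As a minimal illustration of this model (not part of the paper), the Read interface can be mimicked in software by a toy PUF instantiation: a hidden state s stands in for the random manufacturing configuration, a hash of (s, c) stands in for the noiseless response, and independent bit flips model the noisy channel. The class ToyPUF, its parameters, and the noise model are all illustrative assumptions; a real PUF derives s from uncontrollable physical structure rather than a stored value.

```python
import hashlib
import random

class ToyPUF:
    """Toy software model of a PUF instantiation (s, R(s, .)).

    Only the *statistics* of a PUF are mimicked here: the hidden
    state s plays the role of the random configuration, the
    noiseless response is a hash of (s, c), and measurement noise
    is modelled as independent bit flips with probability p.
    """

    def __init__(self, n_bits=128, noise_p=0.05, rng=None):
        self.rng = rng or random.Random()
        self.n_bits = n_bits
        self.noise_p = noise_p
        # Secret state s in S = {0, 1}^n, fixed at "manufacturing" time.
        self.s = self.rng.getrandbits(n_bits)

    def _noiseless(self, c: bytes) -> int:
        digest = hashlib.sha256(
            self.s.to_bytes(self.n_bits // 8, "big") + c
        ).digest()
        return int.from_bytes(digest[: self.n_bits // 8], "big")

    def read(self, c: bytes) -> int:
        """r(c) <- Read(c): a noisy sample of the random variable R(s, c)."""
        r = self._noiseless(c)
        for i in range(self.n_bits):
            if self.rng.random() < self.noise_p:
                r ^= 1 << i  # flip bit i with probability noise_p
        return r

puf = ToyPUF()
r1 = puf.read(b"challenge-1")
r2 = puf.read(b"challenge-1")
# Two reads of the same challenge agree on most bits (both sample
# the same R(s, c)), so bin(r1 ^ r2).count("1") is small relative
# to n_bits, while reads of distinct challenges look independent.
```

The residual noise between repeated reads is exactly what helper-data schemes and fuzzy extractors (see [24], [25]) remove before a response can serve as a cryptographic key.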

Moreover, an instantiation of a reconfigurable PUF additionally has the following interface function:

s̃ ← Reconf()

with s̃ ∈_R S. Note that s̃ ∈_R S implies that a completely new, uncontrollable physical system (and hence a new challenge-response behavior) is generated.

^5 Analogous definitions can be given to define cryptographically secure PUFs.
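The effect of Reconf can be sketched in the same toy fashion (again an illustration, not the paper's construction): drawing a fresh state s̃ uniformly at random makes the new responses statistically independent of the old ones, even for the same challenge. The helper names response and reconf below are hypothetical, and the hash stands in for the physical challenge-response mapping.

```python
import hashlib
import secrets

def response(s: bytes, c: bytes) -> bytes:
    """Noiseless response r(s, c), modelled as a hash of state and challenge."""
    return hashlib.sha256(s + c).digest()

def reconf() -> bytes:
    """s~ <- Reconf(): draw a fresh state uniformly at random from S = {0,1}^256."""
    return secrets.token_bytes(32)

s_old = reconf()          # state before reconfiguration
s_new = reconf()          # state after reconfiguration
c = b"some-challenge"
r_old = response(s_old, c)
r_new = response(s_new, c)

# The new state induces a completely new challenge-response behaviour:
# for the same challenge c, r_old and r_new differ in roughly half of
# their bits, and knowledge of (c, r_old) reveals nothing about r_new.
diff = sum(bin(a ^ b).count("1") for a, b in zip(r_old, r_new))
print(diff, "of", 8 * len(r_old), "bits differ")
```

This independence of old and new behavior is what makes an rPUF useful for key updates: a key derived from the old state becomes unrecoverable once Reconf has been invoked.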

2009 IEEE International Workshop on Hardware-Oriented Security and Trust (HOST)

