RATS Working Group                                           H. Birkholz
Internet-Draft                                             Fraunhofer SIT
Intended status: Informational                                 D. Thaler
Expires: 25 October 2021                                        Microsoft
                                                            M. Richardson
                                                 Sandelman Software Works
                                                                 N. Smith
                                                                    Intel
                                                                   W. Pan
                                                      Huawei Technologies
                                                            23 April 2021

              Remote Attestation Procedures Architecture
                   draft-ietf-rats-architecture-12
Abstract

In network protocol exchanges it is often useful for one end of a
communication to know whether the other end is in an intended
operating state.  This document provides an architectural overview of
the entities involved that make such tests possible through the
process of generating, conveying, and evaluating evidentiary claims.
An attempt is made to provide for a model that is neutral toward
processor architectures, the content of claims, and protocols.
skipping to change at page 2, line 10
Internet-Drafts are working documents of the Internet Engineering
Task Force (IETF).  Note that other groups may also distribute
working documents as Internet-Drafts.  The list of current Internet-
Drafts is at https://datatracker.ietf.org/drafts/current/.

Internet-Drafts are draft documents valid for a maximum of six months
and may be updated, replaced, or obsoleted by other documents at any
time.  It is inappropriate to use Internet-Drafts as reference
material or to cite them other than as "work in progress."

This Internet-Draft will expire on 25 October 2021.
Copyright Notice

Copyright (c) 2021 IETF Trust and the persons identified as the
document authors.  All rights reserved.

This document is subject to BCP 78 and the IETF Trust's Legal
Provisions Relating to IETF Documents (https://trustee.ietf.org/
license-info) in effect on the date of publication of this document.
Please review these documents carefully, as they describe your rights
skipping to change at page 2, line 38
1.  Introduction
2.  Reference Use Cases
  2.1.  Network Endpoint Assessment
  2.2.  Confidential Machine Learning Model Protection
  2.3.  Confidential Data Protection
  2.4.  Critical Infrastructure Control
  2.5.  Trusted Execution Environment Provisioning
  2.6.  Hardware Watchdog
  2.7.  FIDO Biometric Authentication
3.  Architectural Overview
  3.1.  Layered Attestation Environments
  3.2.  Composite Device
  3.3.  Implementation Considerations
4.  Terminology
  4.1.  Roles
  4.2.  Artifacts
5.  Topological Patterns
  5.1.  Passport Model
  5.2.  Background-Check Model
  5.3.  Combinations
6.  Roles and Entities
7.  Trust Model
  7.1.  Relying Party
  7.2.  Attester
  7.3.  Relying Party Owner
  7.4.  Verifier
  7.5.  Endorser, Reference Value Provider, and Verifier Owner
8.  Conceptual Messages
  8.1.  Evidence
  8.2.  Endorsements
  8.3.  Reference Values
  8.4.  Attestation Results
  8.5.  Appraisal Policies
9.  Claims Encoding Formats
10. Freshness
  10.1.  Explicit Timekeeping using Synchronized Clocks
  10.2.  Implicit Timekeeping using Nonces
  10.3.  Implicit Timekeeping using Epoch IDs
  10.4.  Discussion
11. Privacy Considerations
12. Security Considerations
  12.1.  Attester and Attestation Key Protection
    12.1.1.  On-Device Attester and Key Protection
    12.1.2.  Attestation Key Provisioning Processes
  12.2.  Integrity Protection
  12.3.  Epoch ID-based Attestation
  12.4.  Trust Anchor Protection
13. IANA Considerations
14. Acknowledgments
15. Notable Contributions
16. Appendix A: Time Considerations
  16.1.  Example 1: Timestamp-based Passport Model Example
  16.2.  Example 2: Nonce-based Passport Model Example
  16.3.  Example 3: Epoch ID-based Passport Model Example
  16.4.  Example 4: Timestamp-based Background-Check Model Example
  16.5.  Example 5: Nonce-based Background-Check Model Example
17. References
  17.1.  Normative References
  17.2.  Informative References
Contributors
Authors' Addresses
1.  Introduction

The question of how one system can know that another system can be
trusted has found new interest and relevance in a world where trusted
computing elements are maturing in processor architectures.

Systems that have been attested and verified to be in a good state
(for some value of "good") can improve overall system posture.
Conversely, systems that cannot be attested and verified to be in a
skipping to change at page 7, line 11
equipment.

Relying Party:  A device or application connected to potentially
dangerous physical equipment (hazardous chemical processing, traffic
control, power grid, etc.).

2.5.  Trusted Execution Environment Provisioning
A Trusted Application Manager (TAM) server is responsible for
managing the applications running in a Trusted Execution Environment
(TEE) of a client device, as described in
[I-D.ietf-teep-architecture].  To achieve its purpose, the TAM needs
to assess the state of a TEE, or of applications in the TEE, of a
client device.  The TEE conducts Remote Attestation Procedures with
the TAM, which can then decide whether the TEE is already in
compliance with the TAM's latest policy.  If not, the TAM has to
uninstall, update, or install approved applications in the TEE to
bring it back into compliance with the TAM's policy.
Attester:  A device with a TEE capable of running trusted
applications that can be updated.

Relying Party:  A TAM.

2.6.  Hardware Watchdog

There is a class of malware that holds a device hostage and does not
allow it to reboot to prevent updates from being applied.  This can
skipping to change at page 9, line 44
illustrated in Figure 1.

An Attester creates Evidence that is conveyed to a Verifier.

A Verifier uses the Evidence, any Reference Values from Reference
Value Providers, and any Endorsements from Endorsers, by applying an
Appraisal Policy for Evidence to assess the trustworthiness of the
Attester.  This procedure is called the appraisal of Evidence.

Subsequently, the Verifier generates Attestation Results for use by
Relying Parties.

The Appraisal Policy for Evidence might be obtained from the Verifier
Owner via some protocol mechanism, or might be configured into the
Verifier by the Verifier Owner, or might be programmed into the
Verifier, or might be obtained via some other mechanism.

A Relying Party uses Attestation Results by applying its own
appraisal policy to make application-specific decisions, such as
authorization decisions.  This procedure is called the appraisal of
Attestation Results.
The Appraisal Policy for Attestation Results might be obtained from
the Relying Party Owner via some protocol mechanism, or might be
configured into the Relying Party by the Relying Party Owner, or
might be programmed into the Relying Party, or might be obtained via
some other mechanism.

See Section 8 for further discussion of the conceptual messages shown
in Figure 1.

Two Types of Environments of an Attester
As shown in Figure 2, an Attester consists of at least one Attesting
Environment and at least one Target Environment.  In some
implementations, the Attesting and Target Environments might be
combined.  Other implementations might have multiple Attesting and
Target Environments, such as in the examples described in more detail
in Section 3.1 and Section 3.2.  Other examples may exist.  All
compositions of Attesting and Target Environments discussed in this
architecture can be combined into more complex implementations.
                  .--------------------------------.
                  |                                |
                  |            Verifier            |
                  |                                |
                  '--------------------------------'
                                  ^
                                  |
skipping to change at page 11, line 46
Environments collect the values and the information to be represented
in Claims, by reading system registers and variables, calling into
subsystems, taking measurements on code, memory, or other security
related assets of the Target Environment.  Attesting Environments
then format the Claims appropriately, and typically use key material
and cryptographic functions, such as signing or cipher algorithms, to
generate Evidence.  There is no limit to or requirement on the types
of hardware or software environments that can be used to implement an
Attesting Environment, for example: Trusted Execution Environments
(TEEs), embedded Secure Elements (eSEs), Trusted Platform Modules
(TPMs) [TCGarch], or BIOS firmware.
An arbitrary execution environment may not, by default, be capable of
Claims collection for a given Target Environment.  Execution
environments that are designed specifically to be capable of Claims
collection are referred to in this document as Attesting
Environments.  For example, a TPM doesn't actively collect Claims
itself; instead, it requires another component to feed various values
to the TPM.  Thus, an Attesting Environment in such a case would be
the combination of the TPM together with whatever component is
feeding it the measurements.
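As a non-normative illustration (not part of this architecture) of
the kind of measurement feeding just described, the following Python
sketch models a TPM-style "extend" operation; the register and helper
names are hypothetical, and the signing (quoting) step that would
actually produce Evidence is omitted.

   # Illustrative sketch only: a TPM-style "extend" operation showing how
   # a separate component feeds measurements into an Attesting Environment
   # that does not collect Claims itself.
   import hashlib

   class MeasurementRegister:
       """Stand-in for a TPM Platform Configuration Register (PCR)."""
       def __init__(self):
           self.value = b"\x00" * 32

       def extend(self, measurement: bytes) -> None:
           # The new value binds the entire history of measurements so far.
           self.value = hashlib.sha256(self.value + measurement).digest()

   # The "component feeding the TPM", e.g., firmware measuring the next
   # boot stage before passing control to it.
   pcr = MeasurementRegister()
   for blob in (b"bootloader image", b"kernel image", b"boot configuration"):
       pcr.extend(hashlib.sha256(blob).digest())

   # An Attesting Environment would later sign (quote) this register value
   # as part of Evidence; that step is omitted here.
   print(pcr.value.hex())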
3.1.  Layered Attestation Environments
By definition, the Attester role generates Evidence.  An Attester may
consist of one or more nested environments (layers).  The root layer
of an Attester includes at least one root of trust.  In order to
appraise Evidence generated by an Attester, the Verifier needs to
trust the Attester's root of trust.  Trust in the Attester's root of
trust can be established in various ways as discussed in Section 7.4.

In layered attestation, a root of trust is the initial Attesting
Environment.  Claims can be collected from or about each layer.  The
corresponding Claims can be structured in a nested fashion that
reflects the nesting of the Attester's layers.  Normally, Claims are
not self-asserted; rather, a previous layer acts as the Attesting
Environment for the next layer.  Claims about a root of trust
typically are asserted by an Endorser.
The example device illustrated in Figure 3 includes (A) a BIOS stored
in read-only memory, (B) a bootloader, and (C) an operating system
kernel.
           .-------------.  Endorsement for ROM
           |  Endorser   |-----------------------.
           '-------------'                       |
                                                 v
           .-------------.  Reference      .----------.
           |  Reference  |  Values for     |          |
           |    Value    |----------------->| Verifier |
           | Provider(s) |  ROM,           |          |
           '-------------'  bootloader,    '----------'
                            and kernel          ^
                                                 |
   .------------------------------------.        |
   |                                    |        |
   |  .---------------------------.     |        |
   |  | Kernel                    |     |        |
   |  |                           |     |        | Layered
   |  |    Target                 |     |        | Evidence
   |  |    Environment            |     |        | for
   |  '---------------------------'     |        | bootloader
   |            Collect |               |        | and
   |             Claims |               |        | kernel
   |  .-----------------|---------.     |        |
   |  | Bootloader      v         |     |        |
   |  |           .-----------.   |     |        |
   |  |  Target   | Attesting |   |     |        |
   |  |Environment|Environment|------------------'
   |  |           |           |   |     |
   |  |           '-----------'   |     |
   |  |                    ^      |     |
   |  '--------------------|------'     |
   |           Collect |   |            |
   |            Claims v   | Evidence   |
   |                       | for        |
   |                       | bootloader |
   |  .---------------------------.     |
   |  | ROM                       |     |
   |  |                           |     |
   |  |        Attesting          |     |
   |  |        Environment        |     |
   |  '---------------------------'     |
   |                                    |
   '------------------------------------'

                 Figure 3: Layered Attester
The first Attesting Environment, the read-only BIOS in this example,
has to ensure the integrity of the bootloader (the first Target
Environment).  There are potentially multiple kernels to boot, and
the decision is up to the bootloader.  Only a bootloader with intact
integrity will make an appropriate decision.  Therefore, the Claims
relating to the integrity of the bootloader have to be measured
securely.  At this stage of the boot-cycle of the device, the Claims
collected typically cannot be composed into Evidence.
After the boot sequence is started, the BIOS conducts the most
important and defining feature of layered attestation, which is that
the successfully measured bootloader now becomes (or contains) an
Attesting Environment for the next layer.  This procedure in layered
attestation is sometimes called "staging".  It is important that the
bootloader not be able to alter any Claims about itself that were
collected by the BIOS.  This can be ensured by having those Claims be
either signed by the BIOS or stored in a tamper-proof manner by the
BIOS.
Continuing with this example, the bootloader's Attesting Environment
is now in charge of collecting Claims about the next Target
Environment, which in this example is the kernel to be booted.  The
final Evidence thus contains two sets of Claims: one set about the
bootloader as measured and signed by the BIOS, plus a set of Claims
about the kernel as measured and signed by the bootloader.
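The following non-normative Python sketch (not part of this
architecture) models the two sets of Claims just described; HMAC
stands in for the signatures each Attesting Environment would produce
with its attestation key, and all key and claim names are
hypothetical.

   # Illustrative sketch only: layered Evidence with one Claims set signed
   # by the BIOS and one signed by the bootloader, as in Figure 3.
   import hashlib
   import hmac
   import json

   def measure(blob: bytes) -> str:
       # A "measurement" here is simply a digest of the next layer.
       return hashlib.sha256(blob).hexdigest()

   def sign(key: bytes, claims: dict) -> dict:
       # Stand-in for signing a Claims set with an attestation key.
       payload = json.dumps(claims, sort_keys=True).encode()
       return {"claims": claims,
               "sig": hmac.new(key, payload, hashlib.sha256).hexdigest()}

   # Hypothetical per-layer keys; a real root of trust would protect the
   # BIOS key, and the bootloader key would be provisioned during staging.
   BIOS_KEY = b"bios-attestation-key"
   BOOTLOADER_KEY = b"bootloader-attestation-key"

   bootloader_image = b"bootloader code ..."
   kernel_image = b"kernel code ..."

   # Layer 1: the BIOS (initial Attesting Environment) measures and signs
   # Claims about the bootloader (its Target Environment).
   bios_signed = sign(BIOS_KEY,
                      {"bootloader.digest": measure(bootloader_image)})

   # Layer 2: the measured bootloader now acts as the Attesting Environment
   # for the kernel and signs Claims about it.
   bootloader_signed = sign(BOOTLOADER_KEY,
                            {"kernel.digest": measure(kernel_image)})

   # The final Evidence nests both Claim sets, reflecting the layering.
   evidence = {"bios_signed": bios_signed,
               "bootloader_signed": bootloader_signed}
   print(json.dumps(evidence, indent=2))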
This example could be extended further by making the kernel become
another Attesting Environment for an application as another Target
Environment.  This would result in a third set of Claims in the
Evidence pertaining to that application.
The essence of this example is a cascade of staged environments.
Each environment has the responsibility of measuring the next
environment before the next environment is started.  In general, the
number of layers may vary by device or implementation, and an
Attesting Environment might even have multiple Target Environments
that it measures, rather than only one as shown by example in
Figure 3.
3.2.  Composite Device
A composite device is an entity composed of multiple sub-entities
such that its trustworthiness has to be determined by the appraisal
of all these sub-entities.

Each sub-entity has at least one Attesting Environment collecting
Claims from at least one Target Environment; the sub-entity then
generates Evidence about its trustworthiness.  Therefore, each sub-
entity can be called an Attester.  Among all the Attesters, there may
be only some which have the ability to communicate with the Verifier
skipping to change at page 16, line 45
Attesters and conveys it to a Verifier.  Collection of Evidence from
sub-entities may itself be a form of Claims collection that results
in Evidence asserted by the lead Attester.  The lead Attester
generates Evidence about the layout of the whole composite device,
while sub-Attesters generate Evidence about their respective
(sub-)modules.

In this scenario, the trust model described in Section 7 can also be
applied to an inside Verifier.
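As a non-normative sketch of the lead Attester behavior described
above, the following Python fragment bundles Evidence gathered from
hypothetical sub-Attesters together with the lead Attester's own
layout Claims; the field names and the internal collection call are
illustrative assumptions only.

   # Illustrative sketch only: a lead Attester aggregating Evidence from
   # sub-Attesters in a composite device.
   import json

   def collect_sub_evidence(sub_attesters):
       # Gather Evidence from each sub-entity over some internal channel
       # (system bus, hypervisor call, etc.); the channel is out of scope.
       return {name: produce() for name, produce in sub_attesters.items()}

   # Hypothetical sub-Attesters returning already-signed Evidence blobs.
   sub_attesters = {
       "radio-module": lambda: {"claims": {"fw.version": "2.1"}, "sig": "..."},
       "gpu-module":   lambda: {"claims": {"fw.version": "7.4"}, "sig": "..."},
   }

   # The lead Attester adds its own Claims about the overall layout and
   # wraps everything into one piece of Evidence for the Verifier; its own
   # signature over the bundle is omitted here.
   composite_evidence = {
       "layout": {"modules": sorted(sub_attesters)},
       "sub_evidence": collect_sub_evidence(sub_attesters),
   }
   print(json.dumps(composite_evidence, indent=2))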
3.3.  Implementation Considerations
An entity can take on multiple RATS roles (e.g., Attester, Verifier,
Relying Party, etc.) at the same time.  Multiple entities can
cooperate to implement a single RATS role as well.  In essence, the
combination of roles and entities can be arbitrary.  For example, in
the composite device scenario, the entity inside the lead Attester
can also take on the role of a Verifier, and the outer entity of
Verifier can take on the role of a Relying Party.  After collecting
the Evidence of other Attesters, this inside Verifier uses
Endorsements and appraisal policies (obtained the same way as by any
skipping to change at page 24, line 8
necessarily use the Internet Protocol.  Such interactions might use a
loopback device or other IP-based communication between separate
environments, but they do not have to.  Alternative channels to
convey conceptual messages include function calls, sockets, GPIO
interfaces, local busses, or hypervisor calls.  This type of
conveyance is typically found in composite devices.  Most
importantly, these conveyance methods are out-of-scope of RATS, but
they are presumed to exist in order to convey conceptual messages
appropriately between roles.
For example, an entity that both connects to a wide-area network and
to a system bus is taking on both the Attester and Verifier roles.
As a system bus-connected entity, a Verifier consumes Evidence from
other devices connected to the system bus that implement Attester
roles. As a wide-area network connected entity, it may implement an
Attester role.
In essence, an entity that combines more than one role creates and
consumes the corresponding conceptual messages as defined in this
document.
7.  Trust Model

7.1.  Relying Party
This document covers scenarios for which a Relying Party trusts a
Verifier that can appraise the trustworthiness of information about
an Attester.  Such trust might come by the Relying Party trusting the
Verifier (or its public key) directly, or might come by trusting an
entity (e.g., a Certificate Authority) that is in the Verifier's
certificate path.  Such trust is expressed by storing one or more
"trust anchors" in a secure location known as a trust anchor store.

As defined in [RFC6024], "A trust anchor represents an authoritative
entity via a public key and associated data.  The public key is used
to verify digital signatures, and the associated data is used to
constrain the types of information for which the trust anchor is
authoritative."  The trust anchor may be a certificate or it may be a
raw public key along with additional data if necessary such as its
public key algorithm and parameters.
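As a non-normative illustration of the trust anchor store concept
above, the following Python sketch models an anchor as a public key
plus associated data constraining what it is authoritative for; the
field names and lookup behavior are illustrative assumptions, not
defined by [RFC6024] or by this architecture.

   # Illustrative sketch only: a minimal trust anchor store.
   from dataclasses import dataclass, field

   @dataclass
   class TrustAnchor:
       name: str                    # e.g., a Verifier or Endorser identity
       public_key: bytes            # raw public key or a certificate's key
       algorithm: str               # e.g., "Ed25519"
       authoritative_for: set = field(default_factory=set)  # associated data

   @dataclass
   class TrustAnchorStore:
       anchors: list

       def anchor_for(self, signer_name: str, purpose: str):
           # Return an anchor usable to verify signatures from signer_name
           # for the given purpose, if the associated data permits it.
           for anchor in self.anchors:
               if (anchor.name == signer_name
                       and purpose in anchor.authoritative_for):
                   return anchor
           return None

   store = TrustAnchorStore(anchors=[
       TrustAnchor("example-verifier", b"...raw key bytes...", "Ed25519",
                   {"attestation-results"}),
   ])
   print(store.anchor_for("example-verifier", "attestation-results") is not None)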
The Relying Party might implicitly trust a Verifier, such as in a
Verifier/Relying Party combination where the Verifier and Relying
Party roles are combined.  Or, for a stronger level of security, the
Relying Party might require that the Verifier first provide
information about itself that the Relying Party can use to assess the
trustworthiness of the Verifier before accepting its Attestation
Results.
For example, one explicit way for a Relying Party "A" to establish

skipping to change at page 25, line 48
authentication or attestation in both directions might be needed, in
which case typically one side's Evidence must be considered safe to
share with an untrusted entity, in order to bootstrap the sequence.
See Section 11 for more discussion.
7.4.  Verifier
The Verifier trusts (or more specifically, the Verifier's security
policy is written in a way that configures the Verifier to trust) a
manufacturer, or the manufacturer's hardware, so as to be able to
appraise the trustworthiness of that manufacturer's devices.  Such
trust is expressed by storing one or more trust anchors in the
Verifier's trust anchor store.

In a typical solution, a Verifier comes to trust an Attester
indirectly by having an Endorser (such as a manufacturer) vouch for
the Attester's ability to securely generate Evidence, in which case
the Endorser's key material is stored in the Verifier's trust anchor
store.
In some solutions, a Verifier might be configured to directly trust
an Attester by having the Verifier have the Attester's key material
(rather than the Endorser's) in its trust anchor store.

Such direct trust must first be established at the time of trust
anchor store configuration either by checking with an Endorser at
that time, or by conducting a security analysis of the specific
device.  Having the Attester directly in the trust anchor store
narrows the Verifier's trust to only specific devices rather than all
skipping to change at page 27, line 29
Environment(s).

2.  All unprotected Evidence that is conveyed is supplied exclusively
    by the Attesting Environment that has the key material that
    protects the conveyance channel

3.  The root of trust protects both the conveyance channel key
    material and the Attesting Environment with equivalent strength
    protections.
As illustrated in [I-D.birkholz-rats-uccs], an entity that receives
unprotected Evidence via a trusted conveyance channel always takes on
the responsibility of vouching for the Evidence's authenticity and
freshness. If protected Evidence is generated, the Attester's
Attesting Environments take on that responsibility. In cases where
unprotected Evidence is processed by a Verifier, Relying Parties have
to trust that the Verifier is capable of handling Evidence in a
manner that preserves the Evidence's authenticity and freshness.
Generating and conveying unprotected Evidence always creates
significant risk and the benefits of that approach have to be
carefully weighed against potential drawbacks.
See Section 12 for discussion on security strength.

7.5.  Endorser, Reference Value Provider, and Verifier Owner

In some scenarios, the Endorser, Reference Value Provider, and
Verifier Owner may need to trust the Verifier before giving the
Endorsement, Reference Values, or appraisal policy to it.  This can
be done similarly to how a Relying Party might establish trust in a
Verifier.
As discussed in Section 7.3, authentication or attestation in both
directions might be needed, in which case typically one side's
identity or Evidence must be considered safe to share with an
untrusted entity, in order to bootstrap the sequence.  See Section 11
for more discussion.
8.  Conceptual Messages
Figure 1 illustrates the flow of conceptual messages between
various roles. This section provides additional elaboration and
implementation considerations. It is the responsibility of protocol
specifications to define the actual data format and semantics of any
relevant conceptual messages.
8.1.  Evidence
Evidence is a set of Claims about the target environment that reveal
operational status, health, configuration or construction that have
security relevance.  Evidence is appraised by a Verifier to establish
its relevance, compliance, and timeliness.  Claims need to be
collected in a manner that is reliable.  Evidence needs to be
securely associated with the target environment so that the Verifier
cannot be tricked into accepting Claims originating from a different
environment (that may be more trustworthy).  Evidence also must be
skipping to change at page 29, line 19
compliant devices are considered authorized for some purpose.  For
example, an enterprise using remote attestation for Network Endpoint
Assessment [RFC5209] may not wish to let every healthy laptop from
the same manufacturer onto the network, but instead only want to let
devices that it legally owns onto the network.  Thus, an Endorsement
may be helpful information in authenticating information about a
device, but is not necessarily sufficient to authorize access to
resources which may need device-specific information such as a public
key for the device or component or user on the device.
8.3.  Reference Values
Reference Values used in appraisal procedures come from a Reference
Value Provider and are then used by the Verifier to compare to
Evidence.  Evidence that matches the Reference Values produces
acceptable Claims.  Additionally, appraisal policy may play a role in
determining the acceptance of Claims.
8.4. Attestation Results
Attestation Results are the input used by the Relying Party to decide
the extent to which it will trust a particular Attester, and allow it
to access some data or perform some operation.

Attestation Results may carry a boolean value indicating compliance
or non-compliance with a Verifier's appraisal policy, or may carry a
richer set of Claims about the Attester, against which the Relying
Party applies its Appraisal Policy for Attestation Results.
skipping to change at page 30, line 29
allows a Relying Party's appraisal policy to be simpler, potentially
based on standard ways of expressing the information, while still
allowing interoperability with heterogeneous devices.

Finally, whereas Evidence is signed by the device (or indirectly by a
manufacturer, if Endorsements are used), Attestation Results are
signed by a Verifier, allowing a Relying Party to only need a trust
relationship with one entity, rather than a larger set of entities,
for purposes of its appraisal policy.
8.5. Appraisal Policies
The Verifier, when appraising Evidence, or the Relying Party, when
appraising Attestation Results, checks the values of matched Claims
against constraints specified in its appraisal policy. Examples of
such constraints checking include:
* comparison for equality against a Reference Value, or
* a check for being in a range bounded by Reference Values, or
* membership in a set of Reference Values, or
* a check against values in other Claims.
Upon completing all appraisal policy constraints, the remaining
Claims are accepted as input toward determining Attestation Results,
when appraising Evidence, or as input to a Relying Party, when
appraising Attestation Results.
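The following non-normative Python sketch shows one way the
constraint checks listed above could be encoded; the rule format,
claim names, and reference data are hypothetical, since the actual
data format and semantics of any appraisal policy are implementation
specific.

   # Illustrative sketch only: evaluating appraisal policy constraints
   # against a set of Claims.
   def check(claims: dict, rule: dict) -> bool:
       value = claims.get(rule["claim"])
       kind = rule["kind"]
       if kind == "equals":         # equality against a Reference Value
           return value == rule["reference"]
       if kind == "in_range":       # range bounded by Reference Values
           low, high = rule["reference"]
           return low <= value <= high
       if kind == "member_of":      # membership in a set of Reference Values
           return value in rule["reference"]
       if kind == "matches_claim":  # check against values in other Claims
           return value == claims.get(rule["other_claim"])
       raise ValueError("unknown constraint kind: " + kind)

   policy = [
       {"claim": "bootloader.digest", "kind": "member_of",
        "reference": {"a1b2...", "c3d4..."}},
       {"claim": "firmware.version", "kind": "in_range", "reference": (7, 12)},
       {"claim": "debug.enabled", "kind": "equals", "reference": False},
   ]

   claims = {"bootloader.digest": "a1b2...",
             "firmware.version": 9,
             "debug.enabled": False}

   # Claims are accepted only if every policy constraint is satisfied.
   print("claims accepted:", all(check(claims, rule) for rule in policy))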
9.  Claims Encoding Formats
The following diagram illustrates a relationship to which remote
attestation is desired to be added:

   +-------------+               +------------+  Evaluate
   |             |-------------->|            |  request
   |  Attester   |  Access some  |  Relying   |  against
   |             |   resource    |  Party     |  security
   +-------------+               +------------+  policy
skipping to change at page 36, line 24
a Relying Party would appraise an Attestation Result for any other
purpose.

Another approach to deal with Evidence is to remove PII from the
Evidence while still being able to verify that the Attester is one of
a large set.  This approach is often called "Direct Anonymous
Attestation".  See [CCC-DeepDive] section 6.2 for more discussion.
12.  Security Considerations
This document provides an architecture for doing remote attestation.
No specific wire protocol is documented here. Without a specific
proposal to compare against, it is impossible to know if the security
threats listed below have been mitigated well. The security
considerations below should be read as being essentially requirements
against realizations of the RATS Architecture. Some threats apply to
protocols, some are against implementations (code), and some threats
are against physical infrastructure (such as factories).
12.1.  Attester and Attestation Key Protection
Implementers need to pay close attention to the protection of the
Attester and the manufacturing processes for provisioning attestation
key material.  If either of these are compromised, intended levels of
assurance for RATS are compromised because attackers can forge
Evidence or manipulate the Attesting Environment.  For example, a
Target Environment should not be able to tamper with the Attesting
Environment that measures it, by isolating the two environments from
each other in some way.
skipping to change at page 37, line 18
from the Target Environment it collects Claims about and that it
signs the resulting Claims set with an attestation key, so that the
Target Environment cannot forge Evidence about itself.  Such an
isolated environment might be provided by a process, a dedicated
chip, a TEE, a virtual machine, or another secure mode of operation.
The Attesting Environment must be protected from unauthorized
modification to ensure it behaves correctly.  Confidentiality
protection of the Attesting Environment's signing key is vital so it
cannot be misused to forge Evidence.
In many cases the user or owner of a device that includes the role of
Attester must not be able to modify or extract keys from the
Attesting Environments, to prevent creating forged Evidence.  Some
common examples include the user of a mobile phone or FIDO
authenticator.  An essential value-add provided by RATS is for the
Relying Party to be able to trust the Attester even if the user or
owner is not trusted.
Measures for a minimally protected system might include process or
application isolation provided by a high-level operating system, and
restricted access to root or system privileges.  In contrast, for
really simple single-use devices that don't use a protected mode
operating system, like a Bluetooth speaker, the only factual
isolation might be the sturdy housing of the device.
Measures for a moderately protected system could include a special
restricted operating environment, such as a TEE.  In this case, only
skipping to change at page 38, line 5
attacks, power supply and clock glitching, fault injection, and RF
and power side channel attacks.
12.1.2.  Attestation Key Provisioning Processes
Attestation key provisioning is the process that occurs in the
factory or elsewhere to establish signing key material on the device
and the validation key material off the device.  Sometimes this
procedure is referred to as personalization or customization.
12.1.2.1. Off-Device Key Generation
One way to provision key material is to first generate it external to
the device and then copy the key onto the device.  In this case,
confidentiality protection of the generator, as well as for the path
over which the key is provisioned, is necessary.  The manufacturer
needs to take care to protect corresponding key material with
measures appropriate for its value.
The degree of protection afforded to this key material can vary by
device, based upon a cost/benefit evaluation of the intended function
of the device.  The confidentiality protection is fundamentally based
upon some amount of physical protection: while encryption is often
used to provide confidentiality when a key is conveyed across a
factory, where the attestation key is created or applied, it must be
available in an unencrypted form.  The physical protection can
therefore vary from situations where the key is unencrypted only
within carefully controlled secure enclaves within silicon, to
situations where an entire facility is considered secure, by the
simple means of locked doors and limited access.
The cryptography that is used to provide confidentiality protection
of the attestation key in turn requires its own key material to be
secured.  This results in a recursive problem, as the key material
used to provision attestation keys must itself have been provisioned
securely beforehand (requiring an additional level of protection, and
so on).
This is why, in general, a combination of some physical security
measures and some cryptographic measures is used to establish
confidentiality protection.
12.1.2.2. On-Device Key Generation

When key material is generated within a device and the secret part
of it never leaves the device, then the problem may lessen.  For
public-key cryptography, it is, by definition, not necessary to
maintain confidentiality of the public key; however, integrity of
the chain of custody of the public key is necessary in order to
avoid attacks where an attacker is able to get a key they control
endorsed.

To summarize: attestation key provisioning must ensure that only
valid attestation key material is established in Attesters.
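
As a non-normative illustration, the following Python sketch (again
assuming the "cryptography" package) shows on-device generation where
only the public key is exported, bound to a pre-provisioned initial
device identity key; the name "idevid_key" is illustrative and not
defined by this architecture.

   # Non-normative sketch: generate the attestation key on the
   # device and export only the public part, signed with an initial
   # device identity key so that the Endorser can verify the chain
   # of custody (integrity, not confidentiality) before endorsing.
   from cryptography.hazmat.primitives import hashes, serialization
   from cryptography.hazmat.primitives.asymmetric import ec

   def generate_and_export(idevid_key: ec.EllipticCurvePrivateKey):
       attestation_key = ec.generate_private_key(ec.SECP256R1())
       public_der = attestation_key.public_key().public_bytes(
           serialization.Encoding.DER,
           serialization.PublicFormat.SubjectPublicKeyInfo)
       # Tampering with the exported public key on its way to the
       # Endorser would invalidate this signature.
       proof = idevid_key.sign(public_der, ec.ECDSA(hashes.SHA256()))
       return attestation_key, public_der, proof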
12.2. Integrity Protection

Any solution that conveys information used for security purposes,
whether such information is in the form of Evidence, Attestation
Results, Endorsements, or appraisal policy, must support end-to-end
integrity protection and replay attack prevention, and often also
needs to support additional security properties, including:

* end-to-end encryption,
skipping to change at page 40, line 34
victim's timeline at will.  This ability could be used by a
malicious actor (e.g., a compromised router) to mount a confusion
attack where, for example, a Verifier is tricked into accepting
Evidence coming from a past epoch as fresh, while in the meantime
the Attester has been compromised.
Reordering and dropping attacks are mitigated if the transport
provides the ability to detect reordering and drops.  However, the
delay attack described above can't be thwarted in this manner.
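
As a non-normative illustration, the following Python sketch shows
one way a Verifier might bound the effect of a delay attack when a
Verifier-provided nonce is used for freshness: Evidence is accepted
only while the nonce it carries is still outstanding and younger
than a freshness window.  The window length is an illustrative
policy choice, not a value defined by this architecture.

   # Non-normative sketch: single-use nonces with a bounded lifetime.
   import secrets
   import time

   FRESHNESS_WINDOW_SECONDS = 30.0   # illustrative policy value
   _outstanding = {}                 # nonce -> time it was issued

   def issue_nonce() -> bytes:
       nonce = secrets.token_bytes(16)
       _outstanding[nonce] = time.monotonic()
       return nonce

   def evidence_is_fresh(nonce: bytes) -> bool:
       # Removing the nonce makes it single use, which also defeats
       # simple replay; the age check bounds how long Evidence can
       # be delayed in transit before it is rejected.
       issued_at = _outstanding.pop(nonce, None)
       if issued_at is None:
           return False
       return (time.monotonic() - issued_at) <= FRESHNESS_WINDOW_SECONDS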
12.4. Trust Anchor Protection
As noted in Section 7, Verifiers and Relying Parties have trust
anchor stores that must be secured.  Specifically, a trust anchor
store must be protected against unauthorized insertion, deletion,
and modification.
If certificates are used as trust anchors, Verifiers and Relying
Parties are also responsible for validating the entire certificate
path up to the trust anchor, which includes checking for certificate
revocation. See Section 6 of [RFC5280] for details.
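
As a non-normative illustration, the following Python sketch shows
one link of such a check (assuming the "cryptography" package,
version 40 or later for verify_directly_issued_by).  It only
confirms direct issuance by a trust anchor and consults a single
CRL, whereas full RFC 5280 path validation also covers path
building, extension processing, and validity periods.

   # Non-normative sketch: is this certificate directly issued by a
   # trusted anchor, and absent from that anchor's CRL?
   from cryptography import x509

   def issued_by_anchor_and_not_revoked(
           cert: x509.Certificate,
           trust_anchor: x509.Certificate,
           crl: x509.CertificateRevocationList) -> bool:
       # Assumes the CRL itself was obtained and validated separately.
       try:
           # Checks the issuer name and the signature on the cert.
           cert.verify_directly_issued_by(trust_anchor)
       except Exception:
           return False
       revoked = crl.get_revoked_certificate_by_serial_number(
           cert.serial_number)
       return revoked is None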
13. IANA Considerations

This document does not require any actions by IANA.

14. Acknowledgments

Special thanks go to Joerg Borchert, Nancy Cam-Winget, Jessica
Fitzgerald-McKay, Diego Lopez, Laurence Lundblade, Paul Rowe, Hannes
Tschofenig, Frank Xia, and David Wooten.
skipping to change at page 41, line 23
Thomas Hardjono created initial versions of the terminology section
in collaboration with Ned Smith.  Eric Voit provided the conceptual
separation between Attestation Provision Flows and Attestation
Evidence Flows.  Monty Wiseman created the content structure of the
first three architecture drafts.  Carsten Bormann provided many of
the motivational building blocks with respect to the Internet Threat
Model.
16. Appendix A: Time Considerations

Section 10 discussed various issues and requirements around
freshness of Evidence, and summarized three approaches that might be
used by
different solutions to address them. This appendix provides more
details with examples to help illustrate potential approaches, to
inform those creating specific solutions.
The table below defines a number of relevant events, with an ID that
is used in subsequent diagrams.  The times of said events might be
defined in terms of an absolute clock time, such as the Coordinated
Universal Time timescale, or might be defined relative to some other
timestamp or timeticks counter, such as a clock resetting its epoch
each time it is powered on.

+====+============+=================================================+
| ID | Event      | Explanation of event                            |
+====+============+=================================================+
skipping to change at page 49, line 23
continued use beyond the period for which it deems the Attestation
Result to remain valid.  Thus, if the Attestation Result conveys a
validity lifetime in terms of "time(RX_v)-time(RG_v)", then the
Relying Party can check "time(OP_r)-time(ER_r) < time(RX_v)-
time(RG_v)".
17. References

17.1. Normative References

[RFC5280]  Cooper, D., Santesson, S., Farrell, S., Boeyen, S.,
           Housley, R., and W. Polk, "Internet X.509 Public Key
           Infrastructure Certificate and Certificate Revocation
           List (CRL) Profile", RFC 5280, DOI 10.17487/RFC5280,
           May 2008, <https://www.rfc-editor.org/rfc/rfc5280>.

[RFC7519]  Jones, M., Bradley, J., and N. Sakimura, "JSON Web Token
           (JWT)", RFC 7519, DOI 10.17487/RFC7519, May 2015,
           <https://www.rfc-editor.org/rfc/rfc7519>.

[RFC8392]  Jones, M., Wahlstroem, E., Erdtman, S., and H. Tschofenig,
           "CBOR Web Token (CWT)", RFC 8392, DOI 10.17487/RFC8392,
           May 2018, <https://www.rfc-editor.org/rfc/rfc8392>.
17.2. Informative References

[CCC-DeepDive]
           Confidential Computing Consortium, "Confidential Computing
           Deep Dive", n.d.,
           <https://confidentialcomputing.io/whitepaper-02-latest>.

[CTAP]     FIDO Alliance, "Client to Authenticator Protocol", n.d.,
           <https://fidoalliance.org/specs/fido-v2.0-id-20180227/
           fido-client-to-authenticator-protocol-v2.0-id-
           20180227.html>.

[I-D.birkholz-rats-tuda]
           Fuchs, A., Birkholz, H., McDonald, I. E., and C. Bormann,
           "Time-Based Uni-Directional Attestation", Work in
           Progress, Internet-Draft, draft-birkholz-rats-tuda-04, 13
           January 2021,
           <https://tools.ietf.org/html/draft-birkholz-rats-tuda-04>.

[I-D.birkholz-rats-uccs]
           Birkholz, H., O'Donoghue, J., Cam-Winget, N., and C.
           Bormann, "A CBOR Tag for Unprotected CWT Claims Sets",
           Work in Progress, Internet-Draft, draft-birkholz-rats-
           uccs-03, 8 March 2021,
           <https://tools.ietf.org/html/draft-birkholz-rats-uccs-03>.

[I-D.ietf-teep-architecture]
           Pei, M., Tschofenig, H., Thaler, D., and D. Wheeler,
           "Trusted Execution Environment Provisioning (TEEP)
           Architecture", Work in Progress, Internet-Draft, draft-
           ietf-teep-architecture-14, 22 February 2021,
           <https://tools.ietf.org/html/draft-ietf-teep-architecture-
           14>.

[I-D.tschofenig-tls-cwt]
           Tschofenig, H. and M. Brossard, "Using CBOR Web Tokens
           (CWTs) in Transport Layer Security (TLS) and Datagram
           Transport Layer Security (DTLS)", Work in Progress,
           Internet-Draft, draft-tschofenig-tls-cwt-02, 13 July 2020,
           <https://tools.ietf.org/html/draft-tschofenig-tls-cwt-02>.

[OPCUA]    OPC Foundation, "OPC Unified Architecture Specification,
           Part 2: Security Model, Release 1.03", OPC 10000-2,
           25 November 2015,
           <https://opcfoundation.org/developer-tools/
           specifications-unified-architecture/part-2-security-
           model/>.

[RFC4949]  Shirey, R., "Internet Security Glossary, Version 2",
           FYI 36, RFC 4949, DOI 10.17487/RFC4949, August 2007,
           <https://www.rfc-editor.org/rfc/rfc4949>.

[RFC5209]  Sangster, P., Khosravi, H., Mani, M., Narayan, K., and J.
           Tardo, "Network Endpoint Assessment (NEA): Overview and
           Requirements", RFC 5209, DOI 10.17487/RFC5209, June 2008,
           <https://www.rfc-editor.org/rfc/rfc5209>.

[RFC6024]  Reddy, R. and C. Wallace, "Trust Anchor Management
           Requirements", RFC 6024, DOI 10.17487/RFC6024,
           October 2010, <https://www.rfc-editor.org/rfc/rfc6024>.

[RFC8322]  Field, J., Banghart, S., and D. Waltermire, "Resource-
           Oriented Lightweight Information Exchange (ROLIE)",
           RFC 8322, DOI 10.17487/RFC8322, February 2018,
           <https://www.rfc-editor.org/rfc/rfc8322>.
[strengthoffunction]
           NIST, "Strength of Function", n.d.,
           <https://csrc.nist.gov/glossary/term/
           strength_of_function>.
[TCG-DICE] Trusted Computing Group, "DICE Certificate Profiles",
           n.d., <https://trustedcomputinggroup.org/wp-
           content/uploads/DICE-Certificate-Profiles-
           r01_3june2020-1.pdf>.