Sensitive Information in Resource Not Removed Before Reuse

Status: Draft
Abstraction: Base
Structure: Simple
Description

The product releases a resource such as memory or a file so that it can be made available for reuse, but it does not clear or "zeroize" the information contained in the resource before the product performs a critical state transition or makes the resource available for reuse by other entities.

Extended Description

When resources are released, they can be made available for reuse. For example, after memory is de-allocated, an operating system may make the memory available to another process, or disk space may be reallocated when a file is deleted. As removing information requires time and additional resources, operating systems do not usually clear the previously written information.

Even when the resource is reused by the same process, this weakness can arise when new data is not as large as the old data, which leaves portions of the old data still available. Equivalent errors can occur in other situations where the length of data is variable but the associated data structure is not. If memory is not cleared after use, the information may be read by less trustworthy parties when the memory is reallocated.

This weakness can apply in hardware, such as when a device or system switches between power, sleep, or debug states during normal operation, or when execution changes to different users or privilege levels.
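The variable-length reuse problem can be shown in a short C sketch. The helper name `reuse_buffer` is hypothetical, not from this entry: when a shorter value is copied into a reused buffer without clearing it first, the tail of the old value survives.

```c
#include <string.h>

/* Sketch (hypothetical helper): copy a new, shorter message into a reused
 * fixed-size buffer. When clear_first is 0, every byte past the new
 * terminator still holds the previous contents -- the variable-length
 * reuse problem described above. */
void reuse_buffer(char *buf, size_t len, const char *msg, int clear_first)
{
    if (clear_first)
        memset(buf, 0, len);   /* zeroize the whole buffer before reuse */
    strcpy(buf, msg);          /* caller guarantees msg fits in buf */
}
```

For example, a 16-byte buffer that once held `"secret-password"` and is reused for `"hi"` without the `memset` still contains `"ret-password"` starting at offset 3, which leaks if the full buffer is later written out.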

Common Consequences
Scope: Confidentiality

Impact: Read Application Data

Detection Methods

Manual Analysis (Effectiveness: High)
Write a known pattern into each sensitive location. Trigger the release of the resource or cause the desired state transition to occur. Read data back from the sensitive locations. If the reads are successful, and the data is the same as the pattern that was originally written, the test fails and the product needs to be fixed. Note that this test can likely be automated.
Automated Static Analysis (Effectiveness: High)
Automated static analysis, commonly referred to as Static Application Security Testing (SAST), can find some instances of this weakness by analyzing source code (or binary/compiled code) without having to execute it. Typically, this is done by building a model of data flow and control flow, then searching for potentially-vulnerable patterns that connect "sources" (origins of input) with "sinks" (destinations where the data interacts with external components, a lower layer such as the OS, etc.).
Potential Mitigations

Phases: Architecture and Design; Implementation
During critical state transitions, remove information that is not needed in the next state, or overwrite it with a fixed pattern (such as all 0's) or random data, before the transition to the next state completes.

Effectiveness: High

Phases: Architecture and Design; Implementation
When releasing, de-allocating, or deleting a resource, overwrite its data and relevant metadata with fixed patterns or random data. Be cautious about complex resource types whose underlying representation might be non-contiguous or change at a low level, such as how a file might be split into different chunks on a file system, even though "logical" file positions are contiguous at the application layer. Such resource types might require invocation of special modes or APIs to tell the underlying operating system to perform the necessary clearing, such as SDelete (Secure Delete) on Windows, although the appropriate functionality might not be available at the application layer.

Effectiveness: High
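A minimal sketch of this mitigation for heap memory, with hypothetical helper names: overwrite the buffer before releasing it. Note that a plain `memset()` immediately before `free()` may be optimized away as a dead store; writing through a volatile pointer is one portable way to keep the stores, and platform helpers such as `explicit_bzero()` or `SecureZeroMemory()` serve the same purpose where available.

```c
#include <stdlib.h>

/* Zeroize a buffer through a volatile pointer so the compiler
 * cannot eliminate the writes as dead stores. */
void secure_scrub(void *buf, size_t len)
{
    volatile unsigned char *p = buf;
    while (len--)
        *p++ = 0;
}

/* Clear sensitive data before releasing the allocation for reuse. */
void secure_free(void *buf, size_t len)
{
    if (buf != NULL) {
        secure_scrub(buf, len);
        free(buf);
    }
}
```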

Demonstrative Examples

ID: DX-147

This example shows how an attacker can take advantage of an incorrect state transition.
Suppose a device is transitioning from state A to state B. During state A, it can read certain private keys from the hidden fuses that are only accessible in state A but not in state B. The device reads the keys, performs operations using those keys, then transitions to state B, where those private keys should no longer be accessible.

Code Example (Bad, Other):

During the transition from A to B, the device does not scrub the memory.

After the transition to state B, even though the private keys are no longer accessible directly from the fuses in state B, they can be accessed indirectly by reading the memory that contains the private keys.

Code Example (Good, Other):

For the transition from state A to state B, remove information that should not be available once the transition is complete.
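In software terms, the state-transition fix can be sketched as follows (the entry's example is hardware; the names here are hypothetical): zeroize the working copy of the state-A secret before the state change takes effect.

```c
#include <string.h>

enum state { STATE_A, STATE_B };

/* Hypothetical software model of the example above: the key is usable
 * only in STATE_A; the transition routine scrubs the working copy so it
 * cannot be read back from memory once the device is in STATE_B. */
struct device {
    enum state st;
    unsigned char key[16];   /* working copy of the fuse-backed key */
};

void transition_to_b(struct device *d)
{
    volatile unsigned char *p = d->key;
    size_t i;
    for (i = 0; i < sizeof d->key; i++)
        p[i] = 0;            /* scrub STATE_A secrets first */
    d->st = STATE_B;         /* then perform the state change */
}
```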

ID: DX-148

The following code calls realloc() on a buffer containing sensitive data:

Code Example (Bad, C):
There is an attempt to scrub the sensitive data from memory, but realloc() is used, so it could return a pointer to a different part of memory. The memory that was originally allocated for cleartext_buffer could still contain an uncleared copy of the data.
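The flaw can be sketched as follows (hypothetical helper names; the entry's original listing is not reproduced here). `realloc()` is free to move the allocation; when it does, the old block is released uncleared, and a later scrub of the returned pointer never reaches the stale copy.

```c
#include <stdlib.h>
#include <string.h>

/* BAD (sketch): if realloc() moves the block, the original allocation is
 * freed without being cleared, leaving a stray copy of the secret behind.
 * Scrubbing the returned pointer later cannot reach that old block. */
char *grow_unsafe(char *cleartext_buffer, size_t new_size)
{
    return realloc(cleartext_buffer, new_size);
}

/* GOOD (sketch): copy into a fresh block, zeroize the old block through a
 * volatile pointer (so the stores are not optimized away), then free it. */
char *grow_safe(char *cleartext_buffer, size_t old_size, size_t new_size)
{
    char *fresh = malloc(new_size);
    if (fresh == NULL)
        return NULL;
    memcpy(fresh, cleartext_buffer,
           old_size < new_size ? old_size : new_size);

    volatile unsigned char *p = (volatile unsigned char *)cleartext_buffer;
    for (size_t i = 0; i < old_size; i++)
        p[i] = 0;              /* scrub before release */
    free(cleartext_buffer);
    return fresh;
}
```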
The following example code is excerpted from the AES wrapper/interface, aes0_wrapper, module of one of the AES engines (AES0) in the Hack@DAC'21 buggy OpenPiton System-on-Chip (SoC). Note that this SoC contains three distinct AES engines. Within this wrapper module, four 32-bit registers are utilized to store the message intended for encryption, referred to as p_c[i]. Using the AXI Lite interface, these registers are filled with the 128-bit message to be encrypted.

Code Example (Bad, Verilog):

```verilog
module aes0_wrapper #(...)(...);
...
always @(posedge clk_i)
    begin
        ...
        case(...)
            ...
            1:
                p_c[3] <= reglk_ctrl_i[3] ? p_c[3] : wdata[31:0];
            2:
                p_c[2] <= reglk_ctrl_i[3] ? p_c[2] : wdata[31:0];
            3:
                p_c[1] <= reglk_ctrl_i[3] ? p_c[1] : wdata[31:0];
            4:
                p_c[0] <= reglk_ctrl_i[3] ? p_c[0] : wdata[31:0];
            ...
        endcase
    end
endmodule
```
The above code snippet [REF-1402] illustrates an instance of a vulnerable implementation of the AES wrapper module. The p_c[i] registers are cleared at reset; otherwise, they either maintain their old values (if reglk_ctrl_i[3] is true) or get filled through the AXI signal wdata. Note that the p_c[i] registers can be read through the AXI Lite interface (not shown in the snippet). However, the p_c[i] registers are never cleared after their usage once the AES engine has completed the encryption of the message. In a multi-user or multi-process environment, not clearing the registers may allow an attacker process to access data left behind by a victim process, leading to data leakage or unintentional information disclosure. To fix this issue, these internal registers must be cleared in a timely manner after their usage, i.e., once the encryption process is complete. This is illustrated below by monitoring the assertion of the ciphertext-valid signal, ct_valid [REF-1403].

Code Example (Good, Verilog):

```verilog
module aes0_wrapper #(...)(...);
...
always @(posedge clk_i)
    begin
        ...
        else if(ct_valid) //encryption process complete, clear p_c[i]
            begin
                p_c[0] <= 0;
                ...
            end
        ...
    end
endmodule
```
Observed Examples
CVE-2019-3733: Cryptography library does not clear heap memory before release.
CVE-2003-0001: Ethernet NIC drivers do not pad frames with null bytes, leading to an infoleak from malformed packets.
CVE-2003-0291: Router does not clear information from DHCP packets that have been previously used.
CVE-2005-1406: Product does not fully clear memory buffers when less data is stored into the buffer than previously.
CVE-2005-1858: Product does not fully clear memory buffers when less data is stored into the buffer than previously.
CVE-2005-3180: Product does not fully clear memory buffers when less data is stored into the buffer than previously.
CVE-2005-3276: Product does not clear a data structure before writing to part of it, yielding an information leak of previously used memory.
CVE-2002-2077: Memory not properly cleared before reuse.
Applicable Platforms
Languages:
Not Language-Specific: Undetermined
Technologies:
Not Technology-Specific: Undetermined
Modes of Introduction
Implementation
Functional Areas
  1. Memory Management
  2. Networking
Affected Resources
  1. Memory
Taxonomy Mapping
  • PLOVER
  • CERT C Secure Coding
  • Software Fault Patterns
Notes
Relationship: There is a close association between Sensitive Information in Resource Not Removed Before Reuse and Improper Removal of Sensitive Information Before Storage or Transfer. The difference is partially one of perspective. Sensitive Information in Resource Not Removed Before Reuse is geared towards the final stage of the resource lifecycle, in which the resource is deleted, eliminated, expired, or otherwise released for reuse. Technically, this involves a transfer to a different control sphere, in which the original contents of the resource are no longer relevant. Improper Removal of Sensitive Information Before Storage or Transfer, however, is intended for sensitive data in resources that are intentionally shared with others, so they are still active. This distinction is useful from the perspective of the CWE research view (Research Concepts).
Maintenance: This entry needs modification to clarify the differences with Improper Removal of Sensitive Information Before Storage or Transfer. The description also combines two problems that are distinct from the CWE research perspective: the inadvertent transfer of information to another sphere, and improper initialization/shutdown. Some of the associated taxonomy mappings reflect these different uses.
Research Gap: This weakness is frequently found in network packets, but it can also exist in local memory allocation, files, etc.