Exposure of Private Personal Information to an Unauthorized Actor

Status: Incomplete
Abstraction: Base
Structure: Simple
Description

The product does not properly prevent a person's private, personal information from being accessed by actors who either (1) are not explicitly authorized to access the information or (2) do not have the implicit consent of the person about whom the information is collected.

Common Consequences
Scope: Confidentiality

Impact: Read Application Data

Detection Methods
Architecture or Design Review (Effectiveness: High)
Private personal data can enter a program in a variety of ways:
- Directly from the user in the form of a password or personal information
- Accessed from a database or other data store by the application
- Indirectly from a partner or other third party
If the data is written to an external location - such as the console, file system, or network - a privacy violation may occur.
Automated Static Analysis (Effectiveness: High)
Automated static analysis, commonly referred to as Static Application Security Testing (SAST), can find some instances of this weakness by analyzing source code (or binary/compiled code) without having to execute it. Typically, this is done by building a model of data flow and control flow, then searching for potentially vulnerable patterns that connect "sources" (origins of input) with "sinks" (destinations where the data interacts with external components, a lower layer such as the OS, etc.).
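The source-to-sink flow such tools search for can be illustrated with a minimal Java sketch. All names here (source, buildEntry, sink) are hypothetical stand-ins, not any particular SAST tool's terminology:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.UncheckedIOException;

public class TaintFlowDemo {
    // "Source": a stand-in for an origin of private input, e.g. a form
    // field carrying a phone number or password.
    static String source(String simulatedUserInput) {
        return simulatedUserInput;
    }

    // Tainted data propagates through ordinary string operations;
    // data flow analysis tracks it across such intermediate steps.
    static String buildEntry(String privateData) {
        return "record=" + privateData;
    }

    // "Sink": the data leaves the program's control by being written to
    // the filesystem. A SAST tool that connects source() to sink() along
    // this path would report a potential privacy violation.
    static void sink(String entry, String path) {
        try (FileWriter w = new FileWriter(path)) {
            w.write(entry);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        String taint = source("555-0100");   // source of private data
        sink(buildEntry(taint), "app.log");  // sink reached by tainted data
    }
}
```

The flagged pattern is the path itself, not either endpoint alone: the same sink is harmless when fed only non-private data.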
Potential Mitigations
Phase: Requirements
Identify and consult all relevant regulations for personal privacy. An organization may be required to comply with certain federal and state regulations, depending on its location, the type of business it conducts, and the nature of any private data it handles. Regulations may include the Safe Harbor Privacy Framework [REF-340], the Gramm-Leach-Bliley Act (GLBA) [REF-341], the Health Insurance Portability and Accountability Act (HIPAA) [REF-342], the General Data Protection Regulation (GDPR) [REF-1047], the California Consumer Privacy Act (CCPA) [REF-1048], and others.
Phase: Architecture and Design
Carefully evaluate how secure design may interfere with privacy, and vice versa. Security and privacy concerns often seem to compete with each other. From a security perspective, all important operations should be recorded so that any anomalous activity can later be identified. However, when private data is involved, this practice can in fact create risk.

Although there are many ways in which private data can be handled unsafely, a common risk stems from misplaced trust. Programmers often trust the operating environment in which a program runs, and therefore believe that it is acceptable to store private information on the file system, in the registry, or in other locally-controlled resources. However, even if access to certain resources is restricted, this does not guarantee that the individuals who do have access can be trusted.
Demonstrative Examples
The following code contains a logging statement that tracks the contents of records added to a database by storing them in a log file. Among other values that are stored, the getPassword() function returns the user-supplied plaintext password associated with the account.

Code Example (Bad, C#):
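The listing itself did not survive extraction; the original example is C#. The same bad pattern can be sketched in Java, with class and method names hypothetical (only getPassword() comes from the surrounding text):

```java
import java.util.logging.Logger;

public class AccountLogger {
    private static final Logger dbmsLog = Logger.getLogger("dbms");

    // Stands in for the getPassword() call described above: returns the
    // user-supplied plaintext password associated with the account.
    static String getPassword() {
        return "s3cret"; // placeholder value for illustration
    }

    // BAD: the plaintext password is concatenated into the log entry,
    // which a file handler would then write to the filesystem.
    static String buildLogEntry(String id, String type, long tstamp) {
        return id + ":" + getPassword() + ":" + type + ":" + tstamp;
    }

    public static void main(String[] args) {
        dbmsLog.info(buildLogEntry("alice", "insert", 1234567890L));
    }
}
```

A safer variant would omit the password from the entry entirely, or log only a non-reversible event marker such as "password updated".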
The code in the example above logs a plaintext password to the filesystem. Although many developers trust the filesystem as a safe storage location for data, it should not be trusted implicitly, particularly when privacy is a concern.

Example ID: DX-111

This code uses location data to determine the user's current US state.
First, the application must declare that it requires the ACCESS_FINE_LOCATION permission in the application's manifest (AndroidManifest.xml):

Code Example (Bad, XML):
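The manifest listing is missing here; the declaration the text describes is the standard Android permission element (the package name below is hypothetical):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.locationdemo">
    <!-- BAD for this use case: requests GPS-level accuracy when
         state-level accuracy would suffice -->
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
</manifest>
```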
During execution, a call to getLastLocation() will return a location based on the application's location permissions. In this case, the application has permission for the most accurate location possible:

Code Example (Bad, Java):
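The Java listing is also missing. The following plain-Java stand-in mimics the pattern the text describes; the Location class, getLastLocation(), and deriveStateFromCoords() are stubs for illustration, not the Android API. It shows why fine-grained accuracy is unnecessary for a state lookup:

```java
public class LocationDemo {
    static class Location {
        final double latitude, longitude;
        Location(double latitude, double longitude) {
            this.latitude = latitude;
            this.longitude = longitude;
        }
    }

    // Stands in for the getLastLocation() call: with ACCESS_FINE_LOCATION
    // the returned fix can be accurate to within a few meters.
    static Location getLastLocation() {
        return new Location(36.778259, -119.417931); // a point in California
    }

    // Truncating to two decimal places (roughly 1 km) approximates the
    // reduced accuracy that ACCESS_COARSE_LOCATION would provide.
    static Location coarsen(Location loc) {
        return new Location(Math.floor(loc.latitude * 100) / 100,
                            Math.floor(loc.longitude * 100) / 100);
    }

    // Hypothetical state lookup; real code would consult a geographic
    // database. State boundaries are far coarser than 1 km.
    static String deriveStateFromCoords(Location loc) {
        if (loc.latitude > 32.5 && loc.latitude < 42.0
                && loc.longitude > -124.5 && loc.longitude < -114.1) {
            return "California";
        }
        return "unknown";
    }

    public static void main(String[] args) {
        Location fine = getLastLocation();
        // Coarse accuracy yields the same state, so ACCESS_COARSE_LOCATION
        // would be sufficient for this feature.
        System.out.println(deriveStateFromCoords(fine));
        System.out.println(deriveStateFromCoords(coarsen(fine)));
    }
}
```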
While the application needs this information, it does not need to use the ACCESS_FINE_LOCATION permission, as the ACCESS_COARSE_LOCATION permission will be sufficient to identify which US state the user is in.
In 2004, an employee at AOL sold approximately 92 million private customer e-mail addresses to a spammer marketing an offshore gambling web site [REF-338]. In response to such high-profile exploits, the collection and management of private data is becoming increasingly regulated.
References
Seven Pernicious Kingdoms: A Taxonomy of Software Security Errors
Katrina Tsipenyuk, Brian Chess, and Gary McGraw
NIST Workshop on Software Security Assurance Tools, Techniques, and Metrics, NIST
07-11-2005
ID: REF-6
AOL man pleads guilty to selling 92m email addies
J. Oates
The Register
2005
ID: REF-338
Guide to Protecting the Confidentiality of Personally Identifiable Information (SP 800-122)
NIST
04-2010
ID: REF-339
Safe Harbor Privacy Framework
U.S. Department of Commerce
ID: REF-340
Financial Privacy: The Gramm-Leach-Bliley Act (GLBA)
Federal Trade Commission
ID: REF-341
Health Insurance Portability and Accountability Act (HIPAA)
U.S. Department of Health and Human Services
ID: REF-342
California SB-1386
Government of the State of California
12-02-2002
ID: REF-343
FIPS PUB 140-2: Security Requirements for Cryptographic Modules
Information Technology Laboratory, National Institute of Standards and Technology
25-05-2001
ID: REF-267
Mobile App Top 10 List
Chris Wysopal
13-12-2010
ID: REF-172
General Data Protection Regulation
Wikipedia
ID: REF-1047
California Consumer Privacy Act (CCPA)
State of California Department of Justice, Office of the Attorney General
ID: REF-1048
Applicable Platforms
Languages:
Not Language-Specific: Undetermined
Technologies:
Mobile : Undetermined
Modes of Introduction
Architecture and Design
Implementation
Operation
Alternate Terms

Privacy violation

Privacy leak

Privacy leakage

Taxonomy Mapping
  • 7 Pernicious Kingdoms
  • The CERT Oracle Secure Coding Standard for Java (2011)
Notes
Maintenance: This entry overlaps many other entries that are not organized around the kind of sensitive information that is exposed. However, because privacy is treated with such importance due to regulations and other factors, and because it may be useful for weakness-finding tools to highlight capabilities that detect personal private information instead of system information, it is not clear whether - and how - this entry should be deprecated.
Other: There are many types of sensitive information that products must protect from attackers, including system data, communications, configuration, business secrets, intellectual property, and an individual's personal (private) information. Private personal information may include a password, phone number, geographic location, personal messages, credit card number, etc. Private information must be considered whether the person is a user of the product or part of a data set that is processed by the product. An exposure of private information does not necessarily prevent the product from working properly, and in fact the exposure might be intended by the developer, e.g. as part of data sharing with other organizations. However, the exposure of personal private information can still be undesirable or explicitly prohibited by law or regulation.

Some types of private information include:
- Government identifiers, such as Social Security Numbers
- Contact information, such as home addresses and telephone numbers
- Geographic location - where the user is (or was)
- Employment history
- Financial data - such as credit card numbers, salary, bank accounts, and debts
- Pictures, video, or audio
- Behavioral patterns - such as web surfing history, when certain activities are performed, etc.
- Relationships (and types of relationships) with others - family, friends, contacts, etc.
- Communications - e-mail addresses, private messages, text messages, chat logs, etc.
- Health - medical conditions, insurance status, prescription records
- Account passwords and other credentials

Some of this information may be characterized as PII (Personally Identifiable Information), Protected Health Information (PHI), etc. Categories of private information may overlap or vary based on the intended usage or the policies and practices of a particular industry. Sometimes data that is not labeled as private can have a privacy implication in a different context.
For example, student identification numbers are usually not considered private because there is no explicit and publicly-available mapping to an individual student's personal information. However, if a school generates identification numbers based on student social security numbers, then the identification numbers should be considered private.