C_TEC Comment Letter on Proposed Rulemaking Under California Privacy Rights Act
November 8, 2021
California Privacy Protection Agency
Attn: Debra Castanon
915 Capitol Mall, Suite 350A
Sacramento, CA 95814
RE: Invitation for Preliminary Comments on Proposed Rulemaking Under the California Privacy Rights Act of 2020 (Proceeding No. 01-21)
In response to the California Privacy Protection Agency’s invitation for preliminary comments, the U.S. Chamber of Commerce Technology Engagement Center (“C_TEC” or “Chamber”) appreciates the opportunity to provide comments regarding the proposed rulemaking under the California Privacy Rights Act (“CPRA”). Although the business community believes it is imperative that Congress pass a national privacy law that protects all Americans equally, it is also important that California’s Privacy Protection Agency (“Agency” or “CPPA”) effectively implement the CPRA and create certainty for consumers and businesses.
Businesses need clarity to facilitate compliance with the regulations. Additionally, the CPPA should give companies adequate lead time to implement compliance programs and practices before rules are enforced.
The Agency should, where feasible and appropriate, work to align the requirements of CPRA with other state privacy laws to encourage better compliance and uniformity. The Chamber also encourages the Agency to facilitate permanent exemptions for employee and business-to-business information.1
In response to the Agency’s specific regulatory requests, the Chamber offers the following comments organized by question number for your consideration.
1) Processing that Presents a Significant Risk to Consumers’ Privacy or Security: Cybersecurity Audits and Risk Assessments Performed by Businesses
To promote greater uniformity nationwide and ease compliance, the Chamber suggests harmonizing approaches with those undertaken in Virginia and Colorado.
As drafted, it is unclear what would constitute a “significant risk” and therefore trigger an audit and assessment. We suggest that the Agency clarify the definition of “significant risk to consumers’ privacy or security.” To ensure that audits and assessments meaningfully enhance consumer privacy, the definition should focus on mandating audits and assessments for processing that involves a substantial and identifiable risk of harm to consumers.
For the cybersecurity auditing requirements, the regulations should follow a risk-based approach. Businesses may be required to certify that they have implemented and adhere to policies and procedures designed to secure the personal information whose dissemination would present the greatest risk to consumers’ privacy or security. Any new requirements should be consistent with California’s existing data security requirements, as established in Cal. Civ. Code § 1798.81.5. Businesses should be permitted to leverage existing industry standards certifications to make this process less onerous. These include the ISO 27000 series certifications, conformity with the NIST Cybersecurity Framework, the annual Payment Card Industry merchant certification, Service Organization Control audits by internal and third parties, and/or security programs established pursuant to consent decrees with regulators such as the FCC or FTC. Businesses should be permitted to select qualified, independent third-party auditors of their choice. Moreover, the regulations should also permit internal audits, provided that there are structures in place to ensure that any internal audit remains both thorough and independent. The option for an internal audit will be critically important for SMEs, which likely will not have the resources to bear the burden and expense of independent third-party audits.
For the risk assessment requirement, a business that has completed and submitted a risk assessment should not be required to perform additional risk assessments. Moreover, the regulations should expressly acknowledge that the scope of a risk assessment is limited to the specific processing activity or activities that trigger the requirement under the “significant risk” definition. This will focus the assessments on enhancing consumer privacy protections while balancing effective oversight by the Agency.
The CPPA will be overwhelmed if it requires the constant submission of risk assessments. Instead, the regulations should give the Agency the power to request risk assessments when they are relevant to an investigation or inquiry. These assessments should be confidential, and the rules should recognize that privileged information or trade secrets will be redacted. This will help protect company intellectual property as well as consumer personal information contained in the report. The Agency should ensure that the assessments cannot be revealed through California’s Public Records Act and should not be made public.
2) Automated Decisionmaking – The CPRA provides for regulations governing consumers’ “access and opt-out rights with respect to businesses’ use of automated decisionmaking technology.”
a. What activities should be deemed to constitute “automated decisionmaking technology” and/or “profiling.”
The use of innovative technologies, such as automated processes, benefits businesses tremendously, allowing them to increase productivity, prevent and detect fraud and identity theft, improve business processes, save costs, better allocate resources, and better use the talents of their employees. As the Agency considers what should be deemed an “automated decisionmaking technology,” C_TEC encourages the Agency to take a risk-based approach, focusing not on technologies but on the circumstances where those technologies have a significant, direct, tangible impact on either the economic or legal rights of the consumer. Any rules should not apply to inconsequential decisions made by automated decisionmaking technology.
Furthermore, C_TEC encourages the CPPA to review current state and federal regulations that already govern automated decisionmaking technologies. Deeming those already regulated industries within the scope of the CPRA could cause unnecessary duplication of rules for businesses. Moreover, we encourage the CPPA to consider other domestic and international questions this rulemaking will raise. This includes how to harmonize with any federal requirements and frameworks. It also includes the recent EU-US pledge to collaborate on a common framework for the protection of human rights in AI, made at the inaugural summit of the recently launched Trade and Technology Council. Finally, we encourage the Agency to make any regulation flexible enough to allow for future refinements.
b. When consumers should be able to access information about businesses’ use of automated decisionmaking technology and what processes consumers and businesses should follow to facilitate access.
C_TEC encourages any CPRA rulemaking to indicate that this information should be presented upfront to the consumer in a disclosure (e.g., a privacy policy) that provides the necessary information regarding the business’s use of “automated decisionmaking technology.” Furthermore, we believe consumers should be able to use the same self-service portals or other methods by which they currently exercise rights under the CCPA or other sector-specific regulations.
c. What information businesses must provide to consumers in response to access requests, including what businesses must do in order to provide “meaningful information about the logic” involved in the automated decisionmaking process.
C_TEC encourages the CPPA to leverage existing NIST principles, including the recently finalized “Four Principles of Explainable AI,” to provide guidance aligned with what businesses must do to provide “meaningful information about the logic” involved in the automated decisionmaking process. Meaningful information about the logic should be focused on high-level controls that support explainability, transparency, robustness, and trustworthy AI principles. The actual logic of the model is proprietary and should remain so.
C_TEC believes that it is essential to highlight that general access and correction rights are already provided to consumers within the CPRA and required in many other sector-specific regulations.
d. The scope of consumers’ opt-out rights with regard to automated decisionmaking, and what processes consumers and businesses should follow to facilitate opt-outs.
C_TEC encourages any substantive expansion of opt-out rights under the CPRA to be adopted by the legislature rather than through an administrative rulemaking procedure. The core of California privacy law is the opt-out right, which is clearly defined in statute and has been subject to voter approval. The ambiguous provision regarding opt-out rights and automated decisionmaking does not support the creation of new duties and rights that would further expand the newly amplified opt-out right.
If any new rules regarding opt-outs must be adopted, personal information protected under other financial privacy laws (federal or state) should continue to be excluded from the scope of this specific opt-out request. There should also be an exemption for when an opt-out may cause harm or an adverse impact to the consumer (e.g., opting out of automated packet routing could slow down internet speeds). Finally, an exemption should be put in place for when an opt-out request is not feasible (e.g., when a non-automated decision system cannot accomplish the task).
Furthermore, if the CPPA moves forward with an opt-out right tied to automated decisionmaking, we would highly encourage it to follow the approach of the General Data Protection Regulation, under which consumers may opt out of solely automated decisionmaking by requesting human review of a decision that has caused a significant, direct, and tangible impact. Allowing consumers to opt out of any automated process involving consumer data that leads to an insignificant decision (e.g., the decision to recommend one TV show over another on a streaming service) has the potential to cause disruption and inefficiencies for businesses without providing a commensurate benefit to consumers.
3) Audits Performed by the Agency
The Agency should perform an audit only where there is evidence that a business has misused personal information or violated substantive provisions of the CPRA, creating either harm or a substantial risk of harm to consumers. For example, a company that is honoring a consumer’s “Do Not Share” wishes but whose sole failure under the CPRA is not providing a “Do Not Share” button should not trigger an audit absent other negative circumstances. The rules should require a majority of Agency members to vote in favor of an audit before one can be ordered and to issue a resolution that cites the relevant evidence and defines the scope of the required audit. The scope should be limited to addressing practices directly related to the misuse of personal information that gave rise to the audit. The Agency might follow the lead of the Federal Trade Commission and require audits to be performed after the end of an enforcement action against a business.
The regulations should give a business the option to select an independent, certified auditor to perform any audits. (Regulations must also ensure the protection of businesses’ proprietary information disclosed during the audit.)
Because audits can and do result in a finding of no material deficiencies, the Agency should ensure that any audits contain robust confidentiality and proprietary safeguards so that an audit cannot be revealed to the public through California’s Public Records Act. Additionally, the data, algorithms, and other proprietary material that the Agency is authorized to review should receive similar protections.
4) Consumers’ Right to Delete, Right to Correct, and Right to Know
Responding to Requests
The Agency should provide clarification on the requirement for businesses with a physical presence to have a toll-free phone number allowing consumers to exercise their privacy rights. Some companies have a very small physical presence and funnel all users through an app or other online means, making the requirement to staff a toll-free number extremely burdensome and unnecessary.
In responding to consumer requests, businesses should not be required to take extra steps (beyond what is required today under the CCPA) to identify a consumer whose identity is unknown to the business. This would represent a disproportionate effort.
Right to Correction
The CPRA specifies that the right to correction should take into account “the nature of the personal information and the purposes of the processing of the personal information.” The right should have limited application to personal information that is necessary for the consumer to receive services (e.g., name, contact, and payment information) and to exercise rights related to the business (e.g., payment or credit history with the business). It should not apply to data points that are generated automatically through use of the business’s services and that do not impact the consumer’s rights or services (e.g., IP address, inferences, or telemetry data), nor should it apply to inferences made about the consumer or to information obtained from third parties, unless this information is necessary to provide services to the consumer.
A consumer should not be permitted to alter a contract or terms to which he or she has agreed by exercising the right to correction.
Regulations should include provisions on verification of identity similar to those of the CCPA (11 C.C.R. §§ 999.323–999.326). Businesses should be able to develop processes to prevent fraud, such as using a consumer’s precise geolocation to verify identity or staggering the timeframes in which certain data is corrected. It is essential for businesses to be able to use strong methods of authenticating consumers’ identities prior to releasing or changing personal information.
Separately, when consumers request a correction to personal information, they should be required to demonstrate that the requested change is necessary and accurate by providing documentation, such as a phone bill.
A business that receives a consumer’s request to correct information should not be required to correct information if it was not the original source of the information. For example, a business may have information in its system that was entered incorrectly by the consumer and shared by another party. A business that was not the original source of the information should be able to direct the consumer to contact the original source so that the information is corrected at its source. Otherwise, incorrect data will continue to feed back into business systems.
Right to Know
A business should be required to provide information in response to a consumer request to know if the information is readily available and in electronic format. By contrast, a business that holds information in archive systems or non-electronic formats should be able to claim that providing such information “would involve a disproportionate effort.”
5) Consumers’ Rights to Opt-Out of the Selling or Sharing of Their Personal Information and to Limit the Use and Disclosure of Their Sensitive Personal Information.
Data is vital for preventing incidents like fraud and securing networks. Personal information has been instrumental in promoting public safety (for example, in the investigation of the San Bernardino shooting), expanding consumer access to credit, and improving public health.2 The Agency’s interpretation of the CPRA should take these societally beneficial purposes into consideration when determining when an opt-out is not required.
Private and public implementations of universal opt-outs can have negative spillover effects for both individual companies and the broader internet ecosystem. Because of this, the design of these mechanisms should be developed collaboratively with input from industry and other stakeholders. Regulations must be consistent with the text of the CPRA, which clarifies that it is optional for a business to recognize a signal to opt out of the sale or sharing of personal information or to limit the use of sensitive information (§1798.135(b)(1), (3)). Other consumer notice and competition considerations contained in §1798.185(a)(19)(A) must also be reflected in the rules. Moreover, the CPRA directs the CPPA to cooperate with other states to ensure consistent application of privacy protections. § 1798.199.40(i). Colorado is also poised to begin a rulemaking on an opt-out signal, with regulatory directives to consider similar, and in some instances nearly identical, specifications to those the CPRA directs. The CPPA should work with Colorado to ensure that interoperable and aligned requirements for these signals are developed.
Any specifications that apply to global privacy controls (“GPC”) should provide businesses with sufficient flexibility to implement the technical solutions that fit their business models. Businesses use a variety of solutions today, and the Agency should avoid mandating a specific type of solution that may thwart innovation and reduce incentives to provide consumers the full range of choices in opt-out solutions. Any specifications must accurately identify which consumers are located in California so that businesses can honor the request. Any signal specifications should be limited to online data collection and should not require a company to identify unauthenticated users to ensure that they are opted out of all forms of “sale” of personal information; doing so would be inconsistent with §1798.145(j). Businesses must be able to notify consumers of the consequences of an opt-out and solicit permission to use cookies. This is consistent with the CPRA’s aims of transparency and consumer choice. Any GPC must inform users of the meaning of the “Do Not Sell” signal in California. Default choices must be avoided to prevent uninformed choice or market distortion.
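As a minimal illustration of the kind of technical flexibility at issue, the sketch below shows one way a business might detect a GPC signal server-side. It assumes the GPC proposal’s convention of a “Sec-GPC: 1” request header; the helper name and surrounding logic are hypothetical, and determining whether the requester is a California consumer would be a separate step.

```python
def honors_gpc_opt_out(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Assumes the GPC proposal's "Sec-GPC: 1" request header; HTTP header
    names are matched case-insensitively. Illustrative sketch only.
    """
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"

# Example: a request from a GPC-enabled browser would be treated as an
# opt-out of sale/sharing, subject to a separate California residency check.
print(honors_gpc_opt_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}))  # True
```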
Companies honoring opt-out signals will inevitably receive competing signals (i.e., a person opts out through a universal control but then opts in for a specific service). It will be important to provide guidance to companies, such as the illustrative precedence rule sketched below, about how to manage competing signals.
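One possible approach, offered purely as an illustration and not as a recommendation of any particular rule, is to honor the consumer’s most recent expression of preference. The record structure and resolution logic below are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ConsentSignal:
    source: str        # e.g., "universal_opt_out" or "service_opt_in" (hypothetical labels)
    opted_out: bool    # True means opted out of sale/sharing
    received_at: datetime

def effective_opt_out(signals: list[ConsentSignal]) -> bool:
    """Resolve competing signals by honoring the most recent expression.

    Illustrative only: guidance could instead provide that a specific,
    authenticated opt-in overrides an earlier global opt-out, or vice versa.
    """
    if not signals:
        return False  # no expressed preference on record
    latest = max(signals, key=lambda s: s.received_at)
    return latest.opted_out

# A global opt-out followed by a later service-specific opt-in:
history = [
    ConsentSignal("universal_opt_out", True, datetime(2021, 10, 1)),
    ConsentSignal("service_opt_in", False, datetime(2021, 11, 1)),
]
print(effective_opt_out(history))  # False: the later opt-in prevails under this rule
```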
Ample time is needed by companies to adhere to any preference signal not obtained directly. If the signal is an incoming global request from a browser, another platform, etc., businesses need IT resources to read and direct that traffic into their direct request/response systems. Time would be necessary to adjust based on whatever preference signal may be developed. In the interim, the preference signal solution should direct consumers to individual companies to handle their specific requests so that consumer needs are met.
Companies should have the ability to win back customers on an individual basis. There should be guardrails for this, but the relationship that businesses build with their customers should be preserved. For instance, a company could retain the ability to present win-back opportunities to users for some extended period of time.
7) Information to Be Provided in Response to a Consumer Request to Know (Specific Pieces of Information)
Businesses should only be required to provide identifiable personal information that is readily and reasonably available in their active production systems and does not present undue administrative cost or burden. The Agency should consider that this information may be harmful if exposed and is actively in use. Consumers should be allotted one such data request per 12 months.
Businesses could spend disproportionate effort providing personal information from unstructured environments (e.g., log files), archived, non-active, or non-production systems, and personal information that may not be identifiable on its own. The regulations should establish that IP addresses are not considered personal information if a business does not link the IP address with a specific person. Even if an IP address were considered personal information, it is not individually identifiable on its own: the business may have to tie together multiple pieces of data, systems, and vendor/partner data to attempt to properly identify the individual, which could increase privacy risks for consumers. Even if identifiable information could be provided back, it would not be digestible by the average consumer. The CCPA does not require a business to reidentify or otherwise link any data that, in the ordinary course of business, is not maintained in a manner that would be considered personally identifiable (e.g., aggregated, pseudonymized, or deidentified data).
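To illustrate the linkage problem described above (with entirely hypothetical data and identifiers): an IP address in a raw log resolves to an individual only after it is joined against other datasets.

```python
# Hypothetical datasets, for illustration only.
web_log = [{"ip": "203.0.113.7", "path": "/cart"}]       # raw server log entry
sessions = {"203.0.113.7": "account-4821"}               # IP-to-account mapping
accounts = {"account-4821": {"name": "Jane Doe"}}        # account records

for entry in web_log:
    account_id = sessions.get(entry["ip"])
    person = accounts.get(account_id) if account_id else None
    # Only after two separate joins does the bare IP address resolve to a
    # person -- the extra linkage (and added privacy risk) the letter describes.
    print(entry["ip"], "->", person["name"] if person else "not identifiable alone")
```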
8) Definitions and Categories
The CCPA and CPRA provide for various regulations to create or update definitions of important terms and categories of information or activities covered by the statute.
c. Updates, if any, to the law’s definitions of “deidentified” and/or “unique identifier.”
The Agency should align the definition of “deidentified” with the Virginia Consumer Data Protection Act’s (“VCDPA”) definition for clarity and better implementation. The Agency should remove the reference to inferring information, add a reference to devices linked to a consumer, and sharpen the distinction between “pseudonymized” and “deidentified” data by applying exceptions similar to those in the VCDPA and Colorado Privacy Act (“CPA”). This would also have the added benefit of incentivizing the use of privacy-protective technologies even where deidentification may not be feasible.
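To make the distinction concrete, the following sketch (our illustration, using a hypothetical key and data) contrasts pseudonymization, which remains re-linkable by whoever holds the key, with deidentification by aggregation, which cannot reasonably be linked back to an individual.

```python
import hashlib
import hmac

SECRET_KEY = b"example-only-key"  # hypothetical; a real key would be managed securely

def pseudonymize(email: str) -> str:
    """Pseudonymization: replace a direct identifier with a keyed hash.
    The record stays linkable to the consumer by anyone holding the key,
    which is why pseudonymized data is generally still personal information."""
    return hmac.new(SECRET_KEY, email.lower().encode(), hashlib.sha256).hexdigest()

def deidentify(ages: list[int]) -> dict:
    """Deidentification (one illustrative form): aggregate statistics that
    cannot reasonably be linked back to any individual record."""
    return {"count": len(ages), "mean_age": sum(ages) / len(ages)}

print(pseudonymize("jane@example.com"))   # stable token, re-linkable with the key
print(deidentify([34, 29, 51, 42]))       # {'count': 4, 'mean_age': 39.0}
```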
In the definition of “unique identifier,” the Agency should remove references to devices linked to a consumer and the list of example identifiers. Doing so would clarify the definition, remove circular references, and align the treatment of linked devices with the VCDPA. “Unique identifier” should not include cookies, beacons, pixel tags, mobile ad identifiers, or similar technology; those are technologies that might link to a unique identifier, but the technology or cookies themselves do not uniquely identify the individual. It would be helpful to clarify that an identifier is unique only if the persistent identifier can reasonably identify the individual without the business bearing the burden of reidentifying and linking other data to make it individually identifiable.
e. Further defining the business purposes for which businesses, service providers, and contractors may combine consumers’ personal information that was obtained from different sources.
The current list of business purposes includes auditing, ensuring security and integrity, debugging, short-term transient use including non-personalized advertising shown as part of a current interaction, performing services on behalf of the business including maintaining or servicing accounts, providing advertising and marketing services except for cross-context behavioral advertising, undertaking internal research, and undertaking activities to verify or maintain the quality or service of a service or device.
Businesses rely on these established permissible uses to help improve their products, detect and prevent fraud, protect the security of the information of their customers, and generally support their services. With that in mind, it is important to preserve the current list.
h. What definition of “specific pieces of information obtained from the consumer” the Agency should adopt.
The regulations should clarify that “specific pieces of information” does not include data stored only client-side on a user’s device or data that is not human-readable. Platforms will not have access to the former, and the latter will typically be of little practical use for individuals.
i. The changes, if any, that should be made to further define “precise geolocation.”
Industry technical standards for precision utilize decimal places of latitude and longitude coordinates rather than a radius. It would be helpful to clarify the definition of “precise geolocation” to align with industry technical standards. At minimum, the regulations should explain how a radius of 1,850 feet translates into latitude/longitude coordinates; a rough illustration of the difficulty follows.
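The back-of-the-envelope sketch below (our arithmetic, not a proposed standard) shows why the translation is not clean: one degree of latitude spans roughly 364,600 feet, so truncating coordinates to two decimal places yields a coarser resolution than 1,850 feet, while three decimal places yields a finer one.

```python
import math

FEET_PER_DEGREE_LAT = 364_600  # about 69.05 statute miles per degree of latitude

def decimal_place_resolution(places: int, lat_deg: float = 37.0) -> tuple[float, float]:
    """Feet of latitude and longitude covered by the last retained decimal
    place when coordinates are truncated to `places` decimals. Longitude
    shrinks with cos(latitude); 37 degrees N roughly approximates California."""
    step = 10 ** -places
    lat_ft = step * FEET_PER_DEGREE_LAT
    lon_ft = lat_ft * math.cos(math.radians(lat_deg))
    return lat_ft, lon_ft

print(f"1,850 ft is about {1_850 / FEET_PER_DEGREE_LAT:.4f} degrees of latitude")
for places in (2, 3):
    lat_ft, lon_ft = decimal_place_resolution(places)
    print(f"{places} decimals: ~{lat_ft:,.0f} ft (lat) x ~{lon_ft:,.0f} ft (lon)")
# 2 decimals (~3,646 ft) is coarser than the statutory 1,850-ft radius,
# while 3 decimals (~365 ft) is finer, so the radius falls between two
# whole decimal places of coordinate precision.
```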
j. The regulations, if any, that should be adopted to further define “dark patterns.”
Regulations should avoid prescribing technical specifications or image requirements for what constitutes a “dark pattern.” Any regulations in this area should also be consistent with any guidance or reports issued by the Federal Trade Commission, which is also investigating this subject, and should align with the rich body of FTC case law, which turns on whether a misrepresentation or omission is material.
The definition of “dark pattern” in the CPRA would be impossible for companies to implement. Rather than describing the elements of a dark pattern, it focuses on the effect of the interface, specifically whether it subverts or impairs people’s autonomy, decisionmaking, or choice. The current definition would have the unintended consequence of prohibiting privacy-protective default settings because they would impair choice and autonomy (e.g., where location sharing is automatically toggled off and the consumer must toggle it back on to share location data).
The use of an examples-based approach is particularly important because this is a novel area of regulation. It will be important to recognize that companies do not have existing familiarity with design-related requirements; it will therefore be critical to provide significant guidance. In particular, the Chamber requests that the Agency more specifically define the practices that constitute dark patterns. For example, this could include practices like displaying one option prominently while making another option hard to see or access. In short, the goal should be to eliminate bad practices by providing clear guidance to companies about what those practices are. Instead, the current text would have companies attempt to understand whether the design of their website or app impacts a person’s “autonomy,” a vague, if not impossible, standard to meet.
Regulations should balance clear and precise descriptions of risky practices against the risk of negative effects from overly prescriptive design requirements. The best design is context-sensitive, consistent with the wider user experience and a user’s expectations, and aware of the particular goals and intent a person may have at that point in the user journey.
Again, because this is a novel area of regulation, it will be important to continue to consult with a range of stakeholders, but particularly with designers, to understand design constraints and design best practices.
The Chamber appreciates the opportunity to provide comments on the issue areas requested above. Another area in which the Agency should consider harmonizing approaches with other states is enforcement. Virginia and Colorado provide at least a 30-day cure period for alleged violations before enforcement is undertaken. The CPRA gives the Agency discretion to provide businesses with a cure period.3 The Chamber requests that the Agency promulgate a blanket 30-day cure period to enable greater collaboration between businesses and regulators.
We look forward to working with you to ensure consumer protection and clear rules for compliance in implementing the CPRA.
Sincerely,
Jordan Crenshaw
Vice President
Chamber Technology Engagement Center