On December 24, 2024, the Oregon Attorney General published AI guidance, “What you should know about how Oregon’s laws may affect your company’s use of Artificial Intelligence,” (the “Guidance”) that clarifies how existing Oregon consumer protection, privacy and anti-discrimination laws apply to AI tools. Through various examples, the Guidance highlights key themes such as privacy, accountability and transparency, and provides insight into “core concerns,” including bias and discrimination.
Consumer Protection – Oregon’s Unlawful Trade Practices Act (“UTPA”)
The Guidance emphasizes that misrepresentations, even when they are not directly made to the consumer, may be actionable under the UTPA, and an AI developer or deployer may be “liable to downstream consumers for the harm its products cause.” The Guidance provides a non-exhaustive list of examples that may constitute violations of the UTPA, such as:
- failing to disclose any known material defect or nonconformity when delivering an AI product;
- misrepresenting that an AI product has characteristics, uses, benefits or qualities that it does not have;
- using AI to misrepresent that real estate, goods or services have certain characteristics, uses, benefits or qualities (e.g., a developer or deployer using a chatbot while falsely representing that it is human);
- using AI to make false or misleading representations about price reductions (e.g., using AI generated ads or emails indicating “limited time” or “flash sale” when a similar discount is offered year-round);
- using AI to set excessively high prices during an emergency;
- using an AI-generated voice as part of a robocall campaign to misrepresent or falsify certain information, such as the caller’s identity and the purpose of the call; and
- leveraging AI to use unconscionable tactics regarding the sale, rental or disposal of real estate, goods or services, or collecting or enforcing an obligation (e.g., knowingly taking advantage of a consumer’s ignorance or knowingly permitting a consumer to enter into a transaction that does not materially benefit them).
Data Privacy – Oregon Consumer Privacy Act (“OCPA”)
In addition, the Guidance notes that developers, suppliers and users of AI may be subject to OCPA, given that generative AI systems ingest a significant amount of words, images and other content that often consists of personal data. Key takeaways from the Guidance regarding OCPA include:
- developers that use personal data to train AI systems must clearly disclose that they do so in an accessible and clear privacy notice;
- if personal data includes any categories of sensitive data, entities must first obtain explicit consent from consumers before using the data to develop or train AI models;
- if the developer purchases or uses another company’s data for model training, the developer may be considered a “controller” under OCPA, and therefore must comply with the same standards as the company that initially collected the data;
- data suppliers and developers are prohibited from “retroactively or passively” altering privacy notices or terms of use to legitimatize the use of previously collected personal data to train AI models, and instead are required to obtain affirmative consent for any secondary or new uses of that data;
- developers and users of AI must provide a mechanism for consumers to withdraw previously given consent (and, if consent is revoked, must stop processing the data within 15 days of receiving the revocation);
- entities subject to OCPA must consider how to account for specific consumer rights when using AI models, including a consumer’s right to (1) opt-out of the use of profiling in decisions that have legal or similarly significant effects (e.g., housing, education or lending) and (2) request the deletion of their personal data; and
- in connection with OCPA’s requirement to conduct data protection assessments for certain processing activities, due to the complexity of generative AI models and proprietary data and algorithms, entities “should be aware that feeding consumer data into AI models and processing it in connection with these models likely poses heightened risks to consumers.”
Data Security – Oregon Consumer Information Protection Act
The Guidance clarifies that AI developers (as well as their data suppliers and users) that “own, license, maintain, store, manage, collect, acquire or otherwise possess” personal information also must comply with the Oregon Consumer Information Protection Act, which requires businesses to safeguard personal information and implement an information security program that meets specific requirements. The Guidance also notes that to the extent there is a security breach, AI developers, data suppliers and users may be required to notify consumers and the Oregon Attorney General.
Anti-Discrimination – Oregon Equality Act
The Guidance explains that AI systems that “utilize discretionary inputs or produce biased outcomes that harm individuals based on protected characteristics” may trigger the Oregon Equality Act. The law prohibits discrimination based on race, color, religion, sex, sexual orientation, gender identity, national origin, marital status, age or disability, including in connection with housing and public accommodations. The Guidance also includes an illustrative example of how the law applies to the use of AI. Specifically, the Guidance notes that if a rental management company uses an AI mortgage approval system that was trained on historically biased data and that consistently denies loans to qualified applicants from certain neighborhoods or ethnic backgrounds, that use may be considered a violation of the law.