Federal · In Committee

HR 8623

To require artificial intelligence chatbots to implement age verification measures and make certain disclosures, and for other purposes.

Medium Risk

May require changes to AI practices. Monitor and prepare.

TL;DR

Rep. Blake Moore (R-UT) introduced HR 8623, which would force AI chatbot operators like ChatGPT, Character.AI, and Replika to verify users' ages and disclose key information about how their bots work. The bill aims to protect minors from AI chatbot harms and ensure users know when they're talking to AI rather than a human.

How This Might Impact Your Business

Consumer-facing AI chatbot operators would need to build and deploy age verification systems before granting users access, similar to compliance burdens already faced by online alcohol and gambling sites.

Companies offering companion AI, mental health chatbots, or character roleplay bots (Character.AI, Replika, Inflection) face the highest impact since minors are core users.

Mandatory disclosures would require chatbots to clearly identify themselves as AI, not human, likely at the start of conversations and potentially at recurring intervals.

Enterprise SaaS vendors embedding chatbots (Salesforce Einstein, Intercom Fin, Zendesk AI) may need to add age gates or disclosure layers to consumer-facing deployments.

Non-compliance penalties are not yet detailed in the bill text as referred, but federal chatbot bills typically carry FTC enforcement and per-violation fines.

Bill sits in House Judiciary and Energy and Commerce committees, meaning it is early-stage with no guaranteed floor vote, but signals bipartisan momentum on AI child safety following state laws in California and Utah.

B2B-only chatbots and internal enterprise tools would likely fall outside scope, though final language could expand definitions.

What Should You Do

1. Inventory every customer-facing chatbot your company operates or licenses, and flag which ones could be accessed by users under 18.

2. Ask product and legal teams to evaluate age verification vendors (Yoti, Incode, Persona) and estimate integration costs before federal or state mandates force a rushed deployment.

3. Review your chatbot's opening messages and system prompts to confirm clear AI disclosure language is present; this is low-cost insurance regardless of whether HR 8623 passes.

4. Assign someone to track committee action in House Judiciary and Energy and Commerce, and monitor for companion Senate legislation that could accelerate timelines.

5. If you sell chatbot technology to other businesses, prepare a compliance FAQ for customers who will start asking how your product handles age verification and disclosures.
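The disclosure review in step 3 can be partly automated. The sketch below scans prompt text for AI-disclosure phrases; the phrase list is a hypothetical starting point, since the bill's required wording (if any) would come from the final text or implementing regulations.

```python
import re

# Assumption: illustrative disclosure phrases. The actual required wording
# would come from the final bill text or implementing regulations.
DISCLOSURE_PATTERNS = [
    r"\bI am an AI\b",
    r"\bartificial intelligence\b",
    r"\bnot a human\b",
    r"\bAI (assistant|chatbot)\b",
]

def has_ai_disclosure(text: str) -> bool:
    """Return True if the greeting or system prompt contains disclosure language."""
    return any(re.search(p, text, re.IGNORECASE) for p in DISCLOSURE_PATTERNS)

def audit_prompts(prompts: dict[str, str]) -> list[str]:
    """Return the names of bots whose prompts lack any disclosure language."""
    return [name for name, text in prompts.items() if not has_ai_disclosure(text)]
```

A check like this can run in CI against your prompt templates, so a missing disclosure is caught before a bot ships rather than during a compliance review.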

Who It Affects

Consumer AI · EdTech · Social Media · Mental Health Tech · Gaming · Customer Service Software

Sponsors

Rep. Blake Moore (R-UT)

Status Timeline

April 30, 2026 · In Committee

Referred to the Committee on the Judiciary, and in addition to the Committee on Energy and Commerce, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.

AI-generated analysis for informational purposes only. Not legal advice. Always consult a qualified attorney for legal guidance.
