RISE Act of 2025
The Responsible Innovation and Safe Expertise Act of 2025 (RISE Act) would create a conditional immunity from civil liability for certain artificial intelligence developers when their AI products are used by learned professionals in providing professional services.

To qualify for immunity, a developer must publicly release, and continuously maintain, a model card and a model specification before deployment. The model specification may be redacted only for trade secrets unrelated to safety, and each redaction must be accompanied by a written justification. The developer must also provide clear, conspicuous documentation to learned professionals describing the AI's known limitations, failure modes, and appropriate domains of use.

Immunity is limited to acts or omissions that do not amount to recklessness or willful misconduct. It can be lost if the developer fails to update the model card, model specification, and documentation within 30 days of deploying a new version or discovering a new material failure mode, potentially negating immunity for harms caused after the missed update.

The Act would preempt State-law claims arising from an error when the developer is immune, but would not preempt claims based on fraud, knowing misrepresentation, or conduct outside the professional use of the AI. The bill takes effect December 1, 2025, and applies to acts or omissions occurring on or after that date.