
Regulation of AI
In the United States

About this Project

 

Regulation of artificial intelligence is emerging. While it is impossible to know for sure what this regulation will ultimately look like, we can learn a lot by examining what regulators are trying to accomplish.

 

With that in mind, I studied trends in US bills up until mid-2023, including passed, pending, and failed bills. Several trends emerged:

 

➤ AI regulation in the US is on the rise, with a marked increase in 2023.

 

➤ The government sector is the most common regulatory target, especially national security. 

 

➤ Bills that target the government sector mostly recommend setting up committees and other organizational structures and writing reports and other documents. Almost no bills propose any restrictions on the public sector. 

 

➤ Outside of the government, the most targeted sectors are HR, especially anti-discrimination in hiring decisions, and the financial sector, especially anti-discrimination in insurance underwriting. Law enforcement and criminal justice are conspicuously missing from the list of targeted sectors. 

 

➤ Specific AI technologies don’t get much direct attention, but the top ones are AI in social media algorithms and facial recognition. 

 

➤ The top AI ethics themes are improving AI capabilities, fairness, and data rights. Three AI ethics themes notably absent are explainability, human autonomy, and risks related to Artificial General Intelligence. 

 

➤ Democrats exhibit greater activity in introducing bills, emphasizing general AI ethics and fairness, whereas Republicans prioritize AI capabilities and data rights.

 

➤ The US approach appears to differ from both the EU and UK approaches to AI regulation.

 

This project was supported by the Center for Governance and Markets at the University of Pittsburgh. 

 

Mishaal Najam contributed to the project as a research assistant.  
