Evaluating AI Governance

How can you tell if a company manages its AI responsibly? We analyzed public data about 250+ companies and will present our conclusions.

Time & Location

Nov 01, 2023, 12:00 PM – 1:00 PM

https://us06web.zoom.us/j/86166470274

About the event

How can you tell if a company manages its AI responsibly? We analyzed public data from 250+ companies and will present our conclusions.

➤ Our study

🌟EthicsGrade collected public data about the AI ethics activities of 250+ companies in 2022

🌟We analyzed this data (using the NIST AI RMF)

🌟We especially looked for correlations between signals of good AI governance, such as having AI ethics principles, and implementation activities
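To make the correlation step concrete, here is a minimal sketch of the kind of check described above. The company records, the signal encoding, and the implementation scores are all invented for illustration; the actual EthicsGrade dataset and NIST AI RMF scoring are not reproduced here.

```python
# Hypothetical sketch: correlate a binary governance signal with an
# implementation score. Toy data only; not the EthicsGrade dataset.
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric lists."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# Toy data: 1 = company publishes AI ethics principles, paired with a
# hypothetical implementation score (e.g. count of observed
# risk-mitigation activities for that company).
has_principles = [1, 1, 0, 1, 0, 0, 1, 0]
implementation = [2, 0, 1, 3, 2, 0, 1, 2]

r = pearson(has_principles, implementation)
print(f"correlation: {r:.2f}")  # → correlation: 0.13
```

A correlation near zero, as in this toy run, is the pattern the study reports: the presence of the signal tells you little about implementation.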

➤ Key findings

💻No evidence that AI ethics principles and voluntary commitments are correlated with actually mitigating risks

💻No evidence that any of the other governance signals we studied, including thought leadership and employing specialized personnel/teams, is correlated with implementation

💻Overall, more companies reduced the volume of their AI ethics activities during 2022 than increased it; most companies, however, stayed stable.

➤ Key insights

💡Many push for the adoption of AI ethics principles. For example, the US and Canada launched initiatives to solicit such commitments. The UK takes the view that voluntary guidelines can sometimes replace regulation.

💡Consumers, investors, and other actors need to rely on external signals. Such actors often rely on the existence of AI ethics principles and personnel.

💡Our findings raise a red flag. Such signals may be misleading.

➤ The team includes Gil Rosenthal, Tess Buckley, Dr Joshua Scarpino, Luke Patterson, and Thorin Bristow

Hope to see you there!

Tickets

  • Free participation ($0.00)
