The federal government’s approach to deploying AI systems is a defining force in shaping industry standards, academic research, and public perception of these technologies. Public sentiment toward AI remains mixed, with many Americans expressing a lack of trust in AI systems. To fully harness the benefits of AI, the public must have confidence that these systems are deployed responsibly and enhance their lives and livelihoods.
The first Trump Administration’s AI policies clearly recognized the opportunity to promote AI adoption through transparency and public trust. President Trump’s Executive Order 13859 explicitly stated that agencies must design, develop, acquire, and use “AI in a manner that fosters public trust and confidence while protecting privacy, civil rights, civil liberties, and American values.” This commitment laid the foundation for increasing government accountability in AI use.
To support continued transparency and accountability in government AI use, the Federation of American Scientists has written a letter, co-signed by 16 additional scientific and technical organizations, urging OMB to maintain its detailed guidance on AI inventories. We believe that sustained transparency is crucial to ensuring responsible AI governance, fostering public trust, and enabling industry innovation.
“The federal government has immense power to shape industry standards, academic research, and public perception of artificial intelligence,” says Daniel Correa, CEO of the Federation of American Scientists. “By continuing the work set forth by the first Trump administration in Executive Order 13960 and carried forward by the bipartisan 2023 Advancing American AI Act, OMB’s detailed use cases help us understand the depth and scope of AI systems used for government services.”
“FAS and our fellow organizations urge the administration to maintain these use case standards because these inventories provide a critical check on government AI use,” says Jedidah Isler, Ph.D., Chief Science Officer at FAS.
Read more: https://lnkd.in/ezyvsBFi