
The Urgent Race of States to Keep Pace with Unregulated AI in Daily Life

Lawmakers in at least seven states are taking big legislative swings to regulate bias in artificial intelligence.

DENVER (AP) — Despite the buzz surrounding ChatGPT, artificial intelligence has quietly permeated daily life, screening job applications and rental requests, and even influencing medical decisions.

While some AI systems have been found to exhibit bias favoring specific races, genders, or income levels, there is a notable lack of government oversight in this area.

In response to this gap in regulation, lawmakers in seven states are making significant efforts to address bias in artificial intelligence, stepping in where Congress has failed to act. These legislative proposals mark the initial stages of a long-term discussion on how to balance the advantages of this emerging technology with its well-documented risks.

Suresh Venkatasubramanian, a professor at Brown University and co-author of the White House’s Blueprint for an AI Bill of Rights, emphasized the pervasive impact of AI on individuals’ lives, highlighting the importance of addressing issues when these systems fail to function properly.

The success of these legislative endeavors hinges on policymakers navigating complex challenges while engaging with an industry valued in the hundreds of billions of dollars and advancing at an unprecedented pace.

In the past year, only a small fraction of the numerous AI-related bills introduced at the state level were enacted into law, according to BSA The Software Alliance. These bills primarily focused on regulating specific aspects of AI, with a significant emphasis on addressing issues such as deepfakes and chatbot regulations.

Separate from these efforts, seven state bills are currently under consideration to combat AI discrimination across various sectors, reflecting one of the most intricate and troubling aspects of this technology. However, experts warn that states are already lagging behind in establishing necessary safeguards.

The widespread use of algorithms in critical decision-making processes, such as hiring, remains largely obscured from public view. Despite the prevalence of these automated decision tools, a majority of Americans are unaware of their existence and potential biases, as revealed by polling from the Pew Research Center.

Instances like Amazon’s hiring algorithm, which favored male applicants because it was trained on past résumés submitted mostly by men, underscore the inherent risks of AI learning from flawed datasets. Such biases can have far-reaching consequences, as seen in discriminatory rental screening practices.

The lack of transparency and accountability in the deployment of AI-driven tools is a key focus of the proposed bills, aiming to enhance oversight and mitigate discriminatory outcomes. These legislative measures would require companies to conduct impact assessments, disclose the use of AI in decision-making, and offer opt-out mechanisms for consumers.

While industry stakeholders generally support certain aspects of these proposals, such as impact assessments, the road to enacting comprehensive AI regulations remains challenging. Several states, including Washington and California, have faced setbacks in passing AI-related legislation, signaling the complexities involved in regulating this rapidly evolving technology.

As states like California, Colorado, Rhode Island, Illinois, Connecticut, Virginia, and Vermont continue to deliberate on AI regulation, the need for robust safeguards against bias and discrimination remains a pressing concern. Experts advocate for more stringent measures, such as bias audits, to ensure accountability and transparency in AI systems.

Despite the hurdles ahead, the introduction of these bills marks a critical step towards addressing the ethical and practical implications of AI’s pervasive influence on society.


Associated Press reporter Trân Nguyễn in Sacramento, California, contributed.
