As elections loom in the age of AI, Washington officials look for answers
OLYMPIA – Gov. Jay Inslee on Tuesday signed an executive order requiring state agencies to develop guidelines for the use of artificial intelligence that mimics the human brain to create new pictures, words, sounds and videos.
Known as generative AI, this form of artificial intelligence now plays a role in nearly any space where people use smartphones or access the internet. Questions about the morality and safety of AI's place in classrooms, in courtrooms or behind the wheel of self-driving cars have been a key issue in the state's legislative session this year.
One thing remains undisputed: Artificial intelligence is growing rapidly in both technological capability and economic influence. The release of ChatGPT in 2022 quickly put artificial intelligence on the map. And a few months out from national elections, experts fear the ever-changing technology could wreak unforeseen havoc on the democratic process.
Inslee said the executive order lays out a yearlong plan for government agencies to work with WaTech, the agency that heads the state’s technology and information security services.
“It’s our duty to the public to be thorough and thoughtful in how we adopt these powerful new tools,” the governor said.
Part of the plan will be to ensure safety in “high-risk” generative artificial intelligence cases, such as technology that could impact a person’s health, safety or fundamental rights. Examples reportedly include fingerprints, DNA, employment history, health care information, voting records and more.
The executive order comes just a week after Secretary of State Steve Hobbs sounded alarm bells for voters to be wary of deepfakes – a term for false or misleading content created or altered with artificial intelligence. Deepfakes frequently take the form of a video or recording in which a person's face, body or voice has been digitally altered so that they appear to be someone else.
In elections, deepfakes are typically used maliciously or to spread false information. Hobbs' warning memo last week was spurred by fake election robocalls in New Hampshire that reportedly simulated President Joe Biden's voice. In the robocalls, the voice reportedly mimicked Biden speaking negatively about the 2024 U.S. presidential campaign.
“Hobbs warned Washington voters that deepfakes are an ongoing threat to elections and the voting public,” the memo reads.
The fake Biden voice reportedly urged New Hampshire voters to “save your vote for the November election” rather than participating in their state’s presidential primary, NBC News reported.
In 2023, Hobbs and a group of legislators passed a new law that made deepfake advertising for political campaigns illegal in Washington. Yet with the enormous amount of content and advertising on the internet, election officials fear state borders won't shield residents from false information produced elsewhere.
“The disturbing situation we’ve seen in New Hampshire’s campaign is just the tip of the iceberg for 2024,” Hobbs said. “These false messages will get more polished and harder to tell from real ones. Voters must remain vigilant and skeptical, and turn to trusted information sources to verify things that just don’t seem right.”
During this year’s short 60-day legislative session, Washington lawmakers are grappling with how to address artificial intelligence. A few of them have proposed bills to address problems they’ve noticed:
Attorney General Bob Ferguson this year has pushed for legislation that would create a state task force to monitor the ways artificial intelligence gets used in state agencies, along with the private sector.
Sen. Derek Stanford, D-Bothell, sponsored a bill that would prohibit employers from using artificial intelligence to emulate a worker’s face or voice without the worker’s consent.
On March 12, Washington state will hold its presidential primary. Primaries for statewide races will appear on the Aug. 6 ballot.
The general election this year will be held Nov. 5.