The Federal Communications Commission (FCC) announced the action on Thursday, saying it takes effect immediately.
The agency said the ruling gives states the authority to prosecute the bad actors behind these calls.
The move coincides with a rise in robocalls that imitate the voices of political candidates and celebrities.
In a statement released on Thursday, FCC chairwoman Jessica Rosenworcel stated, “Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters.”
“We’re putting the fraudsters behind these robocalls on notice.”
Should we be worried about voice-cloning attacks?
The action follows an incident last month in which New Hampshire voters received robocalls impersonating US President Joe Biden ahead of the state’s presidential primary.
The calls urged voters not to participate in the primary; an estimated 5,000 to 25,000 such calls were placed.
New Hampshire’s attorney general said the calls had been linked to two Texas-based companies and that a criminal investigation was under way.
According to the FCC, such calls can mislead consumers by imitating the voices of prominent figures or, in some cases, close relatives.
The agency added that while state attorneys general could already prosecute the companies and individuals behind these calls for offenses such as fraud, the new ruling makes the use of AI-generated voices in robocalls unlawful in itself.
The ruling “expands the legal avenues through which state law enforcement agencies can hold these perpetrators accountable under the law.”
In mid-January, attorneys general from 26 states wrote to the FCC, urging the agency to take action to limit the use of AI in marketing calls.
The effort was led by Pennsylvania Attorney General Michelle Henry. “Technology is advancing and expanding, seemingly by the minute, and we must ensure these new developments are not used to prey upon, deceive, or manipulate consumers,” Henry said.