
AI speech clone is so realistic that its makers say the ‘potential risks’ make it too dangerous to release

Experts are speechless.

Researchers at Microsoft have developed an artificially intelligent text-to-speech program that achieves a human level of believability.

It is so realistic that its creators are keeping the high-tech interface “purely a research project” and will not yet release it to the public.

Microsoft has unveiled a new text-to-speech tool so realistic that the company says it is not yet safe to put in the public’s hands. OleCNX – stock.adobe.com

VALL-E 2, as it is called, is the first AI voice program of its kind to achieve “human parity,” Microsoft announced. In other words, its output can’t be differentiated from a real person’s speech.

Until now, more rudimentary programs could be detected as AI through small nuances in their speech.

Most notably, VALL-E 2 is said to be crystal clear “even for sentences that are traditionally challenging due to their complexity or repetitive phrases,” according to a paper on the software.

High-powered AI voice cloning has reached a human level of realism. garrykillian – stock.adobe.com

It can also replicate a voice convincingly after hearing as little as three seconds of audio.

The program also “surpasses previous systems in speech robustness, naturalness, and speaker similarity,” researchers noted.

Its creators see beneficial uses for the tool both medically, such as an aid for those with aphasia or similar pathological conditions, and socially.

Specifically, researchers boast that VALL-E 2 “could be used for educational learning, entertainment, journalistic, self-authored content, accessibility features, interactive voice response systems, translation, chatbot, and so on.”

However, they are not ignorant of the potential misuse of such a high-powered tool.

“It may carry potential risks in the misuse of the model, such as spoofing voice identification or impersonating a specific speaker,” the paper warns.

For this reason, there are “no plans to incorporate VALL-E 2 into a product or expand access to the public.”

VALL-E 2, an ultra-realistic AI, can clone voices at a human level of believability. Microsoft

Voice spoofing, the use of a faked voice for phone calls and other remote interactions, is becoming a concerning issue due to the easy accessibility of AI programs. Apple recently listed it as a top concern amid a rise in phishing.

The elderly are typically targeted, but some mothers have received fake calls claiming their children had been kidnapped for ransom, mistakenly believing it was their child’s voice on the other end.

Experts such as Lisa Palmer, a strategist for the consulting firm AI Leaders, recommend that families and loved ones create closely guarded verbal passwords to use over the phone in cases of doubt.