After being scammed into thinking her daughter had been kidnapped, an Arizona woman testified before the US Senate about the dangers of artificial intelligence technology in the hands of criminals.
I'd only been thinking about the implications of faking a celebrity's voice - personalizing it like this makes me sick to my stomach. I had no idea it's already that easy. I don't think the voice would even have to be that realistic - if they're faking a life-threatening situation, my first thought isn't going to be "Hey, their voice sounded a little off". Absolutely horrifying.