said last week that it had opened an investigation into Clubhouse.

Clubhouse updated the app this month, addressing some of the privacy concerns. It did not immediately respond to a request for comment.

There are kinder ways than sharing your address book to find out whether your friends are using a new service — like asking them directly.

The security experts all agreed on one rule of thumb: Trust no one.

When you receive an email asking for your personal information, don’t click on any links; contact the sender directly to ask whether the message is legitimate. Fraudsters can easily embed malware in emails and impersonate your bank, said Adam Kujawa, a director of the security firm Malwarebytes.

When in doubt, opt out of sharing data. Businesses and banks have experimented with fraud-detection technologies that listen to your voice to verify your identity. At some point, you may even interact with customer service representatives on video calls. The most sophisticated fraudsters could eventually use the media you post online to create a deepfake, or a computer-generated video or audio clip impersonating you, Mr. Balasubramaniyan said.

While this may sound alarmist, and deepfakes are not an immediate concern, a healthy dose of skepticism will help us survive the future.

“Think about all the different ways in which you’re leaving biometric identity in your online world,” he said.


Eerie Deepfake Videos of Tom Cruise Revive Debate

To those fearful of a future in which videos of real people are indistinguishable from computer-generated forgeries, two recent developments that attracted an audience of millions might have seemed alarming.

First, a visual effects artist worked with a Tom Cruise impersonator to create startlingly accurate videos imitating the actor. The videos, created with the help of machine-learning techniques and known as deepfakes, gained millions of views on TikTok, Twitter and other social networks in late February.

Then, days later, MyHeritage, a genealogy website, offered a tool to digitally animate old photographs of loved ones, creating a short, looping video in which people can be seen moving their heads and even smiling. More than 26 million images had been animated using the tool, called Deep Nostalgia, as of Monday, the company said.

The videos renewed attention to the potential of synthetic media, which could lead to significant improvements in the advertising and entertainment industries. But the technology could also be used — and has been — to raise doubts about legitimate videos and to insert people, including children, into pornographic images.

digitally resurrected him for a video promoting gun safety legislation. The police in the Australian state of Victoria used the technology to recreate an officer who died by suicide in 2012 to deliver a message about mental health support.

And “Welcome to Chechnya,” a documentary released last year about anti-gay and lesbian purges in Chechnya, used the technology to shield the identity of at-risk Chechens.

The effects could also be used in Hollywood to better age or de-age actors, or to improve the dubbing of films and TV shows in different languages, closely aligning lip movements with the language onscreen. Executives of international companies could also be made to look more natural when addressing employees who speak different languages.

But critics fear the technology will be further abused as it improves, particularly to create pornography that places the face of one person on someone else’s body.

Nina Schick, the author of “Deepfakes: The Coming Infocalypse,” said the earliest deepfaked pornography took hours of video to produce, so celebrities were the typical targets. But as the technology becomes more advanced, less content will be needed to create the videos, putting more women and children at risk.

A tool on the messaging app Telegram that allowed users to create simulated nude images from a single uploaded photo has already been used hundreds of thousands of times, according to BuzzFeed News.

The mere existence of deepfakes makes it easier to dismiss authentic videos as fake, a phenomenon that researchers have called the “liar’s dividend.”

In Gabon, opposition leaders argued that a video of President Ali Bongo Ondimba giving a New Year’s address in 2019 was faked in an attempt to cover up health problems. Last year, a Republican candidate for a House seat in the St. Louis area claimed that the video of George Floyd’s death in police custody had been digitally staged.

As the technology advances, it will be used more broadly, according to Mr. Gregory, the artificial intelligence expert, but its effects are already pronounced.

“People are always trying to think about the perfect deepfake when that isn’t necessary for the harmful or beneficial uses,” he said.

In introducing the Deep Nostalgia tool, MyHeritage addressed the issue of consent, asking users to “please use this feature on your own historical photos and not on photos featuring living people without their permission.” Mr. Ume, who created the deepfakes of Mr. Cruise, said he had no contact with the actor or his representatives.

Of course, people who have died can’t consent to being featured in videos. And that matters if dead people — especially celebrities — can be digitally resurrected, as the artist Bob Ross was to sell Mountain Dew, or as Robert Kardashian was last year in a gift to his daughter Kim Kardashian West from her husband, Kanye West.

As in an episode of “Black Mirror,” whole aspects of our personalities could be simulated after death, trained by our voices on social media, Mr. Ajder said.

But that raises a tricky question, he said: “In what cases do we need consent of the deceased to resurrect them?”

“These questions make you feel uncomfortable, something feels a bit wrong or unsettling, but it’s difficult to know if that’s just because it’s new or if it hints at a deeper intuition about something problematic,” Mr. Ajder said.
