
Germany Looks Into Covid Deniers’ Links With Far Right

The German domestic intelligence agency is keeping close tabs on a group of coronavirus deniers, who, in their protests against restrictions and tendency to believe in conspiracy theories, have found common cause with far-right extremists.

Government officials said the movement’s close ties to extremist organizations, such as the “Reich citizens” — or “Reichsbürger,” as they are known in German, referring to a group that refuses to accept the legitimacy of the modern German state — were troubling. Many of the coronavirus deniers say they also believe in QAnon conspiracy theories, and protesters are frequently seen holding signs with anti-Semitic tropes. A number of journalists have been attacked while covering the demonstrations.

A spokesman for the Interior Ministry said in a statement, “Our basic democratic order, as well as state institutions such as parliaments and governments, have faced multiple attacks since the beginning of the measures to contain the Covid-19 pandemic.” Several regional intelligence agencies have already been observing participants in the movement, he added.

The group of deniers, which started as a fringe movement last spring, has grown into a coordinated effort that organizes mass demonstrations across Germany. The rallies occasionally turn aggressive, and many have ended in scuffles with law enforcement officers.

AfD, a German right-wing populist party, has allied itself with the protesters. The national intelligence agency’s formal observation of the deniers’ group is the first step in a procedure that could lead to it being declared anti-constitutional and ultimately banned.

A week ago, about 8,000 people in Berlin protested the passing of a law that gives the federal government power to implement tougher restrictions. Germany has seen a persistently high average number of new daily cases recently, averaging about 18,000 a day, according to a New York Times database, up from about 8,000 a day two months ago.


Extremists Find a Financial Lifeline on Twitch

Terpsichore Maras-Lindeman, a podcaster who fought to overturn the 2020 presidential election, recently railed against mask mandates to her 4,000 fans in a live broadcast and encouraged them to enter stores maskless. On another day, she grew emotional while thanking them for sending her $84,000.

Millie Weaver, a former correspondent for the conspiracy theory website Infowars, speculated on her channel that coronavirus vaccines could be used to surveil people. Later, she plugged her merchandise store, where she sells $30 “Drain the Swamp” T-shirts and hats promoting conspiracies.

And a podcaster who goes by Zak Paine or Redpill78, who pushes the baseless QAnon conspiracy theory, urged his viewers to donate to the congressional campaign of an Ohio man who has said he attended the “Stop the Steal” rally in Washington on Jan. 6.

Facebook, YouTube and other social media platforms clamped down on misinformation and hate speech ahead of the 2020 election.

Many far-right influencers have since scattered to other services, including apps like Google Podcasts, as their options for spreading falsehoods have dwindled.

Twitch became a multibillion-dollar business thanks to video gamers broadcasting their play of games like Fortnite and Call of Duty. Fans, many of whom are young men, pay the gamers by subscribing to their channels or donating money. Streamers earn even more by sending their fans to outside sites to either buy merchandise or donate money.

Now Twitch has also become a place where right-wing personalities spread election and vaccine conspiracy theories, often without playing any video games. It is part of a shift at the platform, where streamers have branched out from games into fitness, cooking, fishing and other lifestyle topics in recent years.

But unlike fringe livestreaming sites like Dlive and Trovo, which have also offered far-right personalities moneymaking opportunities, Twitch attracts far larger audiences. On average, 30 million people visit the site each day, the platform said.

Twitch has stricter rules than other social media platforms for the kinds of views that users can express. It temporarily suspended Mr. Trump’s account for “hateful conduct” last summer, months before Facebook and Twitter made similar moves. Its community guidelines prohibit hateful conduct and harassment. Ms. Clemens said Twitch was developing a misinformation policy.

This month, Twitch announced a policy that would allow it to suspend the accounts of people who committed crimes or severe offenses in real life or on other social media platforms, including violent extremism or membership in a known hate group. Twitch said it did not consider QAnon to be a hate group.

Despite all this, a Twitch channel belonging to Enrique Tarrio, the leader of the Proud Boys, a white nationalist organization, remained online until the middle of this month after The New York Times inquired about it. And the white nationalist Anthime Joseph Gionet, known as Baked Alaska, had a Twitch channel for months, even though he was arrested in January by the F.B.I. and accused of illegally storming the U.S. Capitol on Jan. 6. Twitch initially said his activities had not violated the platform’s policies, then barred him this month for hateful conduct.

Mr. Paine continues to push the QAnon conspiracy theory, which the F.B.I. has said is dangerous. Last week, he referred to a QAnon belief that people are killing children to “harvest” a chemical compound from them, then talked about a “criminal cabal” controlling the government, saying people do not understand “what plane of existence they come from.”

Mr. Paine, who is barred from Twitter and YouTube, has also asked his Twitch audience to donate to the House campaign of J.R. Majewski, an Air Force veteran in Toledo, Ohio, who attracted attention last year for painting his lawn to look like a Trump campaign banner. Mr. Majewski has used QAnon hashtags but distanced himself from the movement in an interview with his local newspaper, The Toledo Blade.

Mr. Majewski has appeared on Mr. Paine’s streams, where they vape, chat about Mr. Majewski’s campaign goals and take calls from listeners.

“He is exactly the type of person that we need to get in Washington, D.C., so that we can supplant these evil cabal criminal actors and actually run our own country,” Mr. Paine said on one stream.

Neither Mr. Paine nor Mr. Majewski responded to a request for comment.

Joan Donovan, a Harvard University researcher who studies disinformation and online extremism, said streamers who rely on their audience’s generosity to fund themselves felt pressured to continue raising the stakes.

“The incentive to lie, cheat, steal, hoax and scam is very high when the cash is easy to acquire,” she said.


Catch up: Jack Dorsey says Twitter played a role in U.S. Capitol riot.

“I don’t think anyone wants a world where you can only say things that private companies judge to be true.”

“Our mission is to organize the world’s information, and make it universally accessible and useful.”

“We believe in free debate and conversation to find the truth. At the same time, we must balance that with our desire for our service not to be used to sow confusion, division or destruction.”

“There are two faces to each of your platforms. Facebook has family and friends, neighborhood, but it is right next to the one where there is a white nationalist rally every day. YouTube is a place where people share quirky videos, but down the street, anti-vaxxers, Covid deniers, QAnon supporters and Flat Earthers are sharing videos.”

“You’ve failed to meaningfully change after your platform has played a role in fomenting insurrection, and abetting the spread of the virus and trampling American civil liberties. And while it may be true that some bad actors will shout ‘fire’ in the crowded theater by promoting harmful content, your platforms are handing them a megaphone to be heard in every theater across the country and the world. Your business model itself has become the problem.”

“How is it possible for you not to at least admit that Facebook played a central role or a leading role in facilitating the recruitment, planning and execution of the attack on the Capitol?”

“Chairman, my point is that I think that the responsibility here lies with the people who took the actions to break the law, and take and do the insurrection and secondarily, also the people who spread that content, including the president, but others as well.”

“Your platform bears some responsibility for disseminating disinformation related to the election and the ‘Stop the Steal’ movement that led to the attack on the Capitol. Just a yes or no answer.”

“Congressman, it’s a complex question. We —”

“OK, we’ll move on. Mr. Dorsey.”

“Yes, but you also have to take into consideration a broader ecosystem. It’s not just the technology platforms we use.”

“We’re all aware of big tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda by influencing a generation of children — removing, shutting down or canceling any news, books and even now, toys, that aren’t considered woke.”

“First of all, do you recognize that there is a real concern, that there’s an anti-conservative bias on Twitter’s behalf? And would you recognize that this has to stop if this is going to be, Twitter is going to be viewed by both sides as a place where everybody is going to get a fair treatment?”

“We don’t write policy according to any particular political leaning. If we find any of it, we route it out.”


Lawmakers Grill Tech C.E.O.s on Capitol Riot, Getting Few Direct Answers

WASHINGTON — Lawmakers grilled the leaders of Facebook, Google and Twitter on Thursday about the connection between online disinformation and the Jan. 6 riot at the Capitol, causing Twitter’s chief executive to publicly admit for the first time that his product had played a role in the events that left five people dead.

When a Democratic lawmaker asked the executives to answer with a “yes” or a “no” whether the platforms bore some responsibility for the misinformation that had contributed to the riot, Jack Dorsey of Twitter said “yes.” Neither Mark Zuckerberg of Facebook nor Sundar Pichai of Google would answer the question directly.

The roughly five-hour hearing before a House committee marked the first time lawmakers directly questioned the chief executives regarding social media’s role in the January riot. The tech bosses were also peppered with questions about how their companies helped spread falsehoods around Covid-19 vaccines, enable racism and hurt children’s mental health.

It was also the first time the executives had testified since President Biden’s inauguration. Tough questioning from lawmakers signaled that scrutiny of Silicon Valley’s business practices would not let up, and could even intensify, with Democrats in the White House and leading both chambers of Congress.

During the hearing, Mr. Dorsey tweeted a single question mark with a poll that had two options: “Yes” or “No.” When asked about his tweet by a lawmaker, he said “yes” was winning.

The January riot at the Capitol has made the issue of disinformation deeply personal for lawmakers. The riot was fueled by false claims from President Donald J. Trump and others that the election had been stolen, which were rampant on social media.

Some of the participants had connections to QAnon and other online conspiracy theories. And prosecutors have said that groups involved in the riot, including the Oath Keepers and the Proud Boys, coordinated some of their actions on social media.

Republican lawmakers, meanwhile, focused on the platforms’ decisions to ban Mr. Trump and his associates after the Jan. 6 riots. The bans hardened views among conservatives that the companies are left-leaning and inclined to squelch conservative voices.

“We’re all aware of Big Tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda,” said Representative Bob Latta of Ohio, the ranking Republican on the panel’s technology subcommittee.

The company leaders defended their businesses, saying they had invested heavily in hiring content moderators and in technology like artificial intelligence, used to identify and fight disinformation.

Mr. Zuckerberg argued against the notion that his company had a financial incentive to juice its users’ attention by driving them toward more extreme content. He said Facebook didn’t design “algorithms in order to just kind of try to tweak and optimize and get people to spend every last minute on our service.”

He added later in the hearing that election disinformation also spread in messaging apps, where amplification and algorithms do not aid the spread of false content. He also blamed television and other traditional media for spreading election lies.

The companies showed fissures in their views on regulation. Facebook has vocally supported internet regulations in a major advertising blitz on television and in newspapers. In the hearing, Mr. Zuckerberg suggested specific regulatory reforms to a key legal shield, known as Section 230 of the Communications Decency Act, that has helped Facebook and other Silicon Valley internet giants thrive.

The legal shield protects companies that host and moderate third-party content, and says companies like Google and Twitter are simply intermediaries of their user-generated content. Democrats have argued that with that protection, companies aren’t motivated to remove disinformation. Republicans accuse the companies of using the shield to moderate too much and to take down content that doesn’t represent their political viewpoints.

“I believe that Section 230 would benefit from thoughtful changes to make it work better for people,” Mr. Zuckerberg said in his statement.

He proposed that liability protection for companies be conditional on their ability to fight the spread of certain types of unlawful content. He said platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Reforms, he said, should be different for smaller social networks, which wouldn’t have the same resources as Facebook to meet new requirements.

Mr. Pichai and Mr. Dorsey said they supported transparency requirements in content moderation but stopped short of agreeing with Mr. Zuckerberg’s other ideas. Mr. Dorsey said that it would be very difficult to distinguish a large platform from a smaller one.

Lawmakers did not appear to be won over.

“There’s a lot of smugness among you,” said Representative Bill Johnson, a Republican of Ohio. “There’s this air of untouchable-ness in your responses to many of the tough questions that you’re being asked.”

Kate Conger and Daisuke Wakabayashi contributed reporting.


Big Tech C.E.O.s Face Lawmakers on Disinformation: Live Updates

Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing at a hearing held by the House Energy and Commerce Committee about how disinformation spreads across their platforms.

Video: Mark Zuckerberg of Facebook, Sundar Pichai of Google and Jack Dorsey of Twitter testify remotely before Congress on “misinformation and disinformation plaguing online platforms.” (Pool photo by Greg Nash)
The Capitol riots “and the movement that motivated it started and was nourished on your platforms,” Representative Mike Doyle, Democrat of Pennsylvania, told the C.E.O.s. (Energy and Commerce Committee, via YouTube)

Democratic lawmakers accused the chief executives of allowing disinformation to run rampant online, reflecting their mounting frustration about the spread of extremism, conspiracy theories and falsehoods online in the aftermath of the Jan. 6 riot at the Capitol.

Their comments opened the first hearing since President Biden’s inauguration featuring Mark Zuckerberg of Facebook, Sundar Pichai of Google and Jack Dorsey of Twitter. They were a signal that scrutiny of Silicon Valley’s business practices will not let up, and may even intensify, with Democrats in the White House and leading both chambers of Congress.

The January riot made the issue of disinformation intensely personal for many lawmakers. Some participants have been linked to online conspiracies like QAnon, which the platforms have tried to stem in recent months.

“We fled as a mob desecrated the Capitol, the House floor and our democratic process,” said Representative Mike Doyle, a Pennsylvania Democrat. “That attack and the movement that motivated it started and was nourished on your platforms.”

Lawmakers argued that the platforms also had enabled misinformation about the coronavirus pandemic.

The lawmakers’ growing frustration comes as they consider whether to more tightly regulate the business models of the platforms. Some have proposed modifying a legal shield that protects websites from lawsuits over content posted by their users, arguing that it allows the companies to get away with negligence in policing their products.

Representative Jan Schakowsky, Democrat of Illinois, said Thursday that the executives should come away understanding that “self-regulation has come to the end of its road.”

Representative Bob Latta, Republican of Ohio, accused the platforms of a “commitment to serve the radical progressive agenda.” (Energy and Commerce Committee, via YouTube)

Republican lawmakers came into the hearing steaming about the Jan. 6 Capitol riots, but their animus was focused on the decisions by the platforms to ban right-wing figures, including former President Donald J. Trump, for inciting violence.

The decisions to ban Mr. Trump, many of his associates and other conservatives, they said, amounted to liberal bias and censorship.

“We’re all aware of Big Tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda,” said Bob Latta, the ranking Republican of the House’s communications and technology subcommittee.

After the Capitol riots, Mr. Trump and some of his top aides were temporarily or indefinitely banned on major social media sites.

Mr. Latta’s comments are expected to be echoed by many Republicans in the hearing. They say the platforms have become gatekeepers of information, and they accuse the companies of trying to suppress conservative views. The claims have been consistently refuted by academics.

Mr. Latta homed in on the legal shield known as Section 230 of the Communications Decency Act and whether the big tech companies deserve the regulatory protection.

“Section 230 provides you with the liability protection for content moderation decisions made in good faith,” Mr. Latta said. But he said the companies have appeared to use their moderating powers to censor viewpoints that the companies disagree with. “I find that highly concerning.”

The chief executives of Facebook, Alphabet and Twitter are expected to face tough questions from lawmakers on both sides of the aisle. Democrats have focused on disinformation, especially in the wake of the Capitol riot. Republicans, meanwhile, have already questioned the companies about their decisions to remove conservative personalities and stories from their platforms.

New York Times reporters have covered many of the examples that could come up. Here are the facts to know about them:

After his son was stabbed to death in Israel by a member of the militant group Hamas in 2016, Stuart Force decided that Facebook was partly to blame for the death, because the algorithms that power the social network helped spread Hamas’s content. He joined relatives of other terror victims in suing the company, arguing that its algorithms aided the crimes by regularly amplifying posts that encouraged terrorist attacks. Arguments about the algorithms’ power have reverberated in Washington.

Section 230 of the Communications Decency Act has helped Facebook, YouTube, Twitter and countless other internet companies flourish. But Section 230’s liability protection also extends to fringe sites known for hosting hate speech, anti-Semitic content and racist tropes. As scrutiny of big technology companies has intensified in Washington over a wide variety of issues, including how they handle the spread of disinformation or police hate speech, Section 230 has come under new focus.

After inflaming political discourse around the globe, Facebook is trying to turn down the temperature. The social network started changing its algorithm to reduce the political content in users’ news feeds. Facebook previewed the change earlier this year when Mark Zuckerberg, the chief executive, said the company was experimenting with ways to tamp down divisive political debates among users. “One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” he said.

As the Electoral College affirmed Joseph R. Biden Jr.’s election, voter fraud misinformation subsided. But peddlers of online falsehoods ramped up lies about the Covid-19 vaccines. Rep. Marjorie Taylor Greene, a Republican of Georgia, as well as far-right websites like ZeroHedge, have begun pushing false vaccine narratives, researchers said. Their efforts have been amplified by a robust network of anti-vaccination activists like Robert F. Kennedy Jr. on platforms including Facebook, YouTube and Twitter.

In the end, two billionaires from California did what legions of politicians, prosecutors and power brokers had tried and failed to do for years: They pulled the plug on President Trump. Journalists and historians will spend years unpacking the improvisational nature of the bans, and scrutinizing why they arrived just as Mr. Trump was losing his power, and Democrats were poised to take control of Congress and the White House. The bans have also turned up the heat on a free-speech debate that has been simmering for years.

Chief executives from Google, Apple, Amazon and Facebook testifying in July. Mark Zuckerberg of Facebook has testified six times on Capitol Hill. (Pool photo by Mandel Ngan)

In the fall of 2017, when Congress called on Google, Facebook and Twitter to testify about their role in Russia’s interference with the 2016 presidential election, the companies didn’t send their chief executives — as lawmakers had requested — and instead summoned their lawyers to face the fire.

During the hearings, the politicians complained that the general counsels were answering questions about whether the companies contributed to undermining the democratic process instead of “the top people who are actually making the decisions,” as Senator Angus King, an independent from Maine, put it.

It was clear Capitol Hill wanted its pound of C.E.O. flesh and that hiding behind the lawyers was not going to work for long. That initial concern about how the chieftains of Silicon Valley would handle grilling from lawmakers is no longer a worry. After a slew of hearings in recent years, both virtual and in-person, the executives have had plenty of practice.

Since 2018, Sundar Pichai, Google’s chief executive, has testified on three different occasions. Jack Dorsey, Twitter’s chief executive, has made four appearances, and Mark Zuckerberg, Facebook’s chief, has testified six times.

And when the three men again face questioning on Thursday, they will do so now as seasoned veterans in the art of deflecting the most vicious attacks and then redirecting to their carefully practiced talking points.

In general, Mr. Pichai tends to disagree politely and briefly with the sharpest jabs from lawmakers (such as when he was asked last year why Google steals content from honest businesses) without harping on them. When a politician tries to pin him down on a specific issue, he often relies on a familiar delay tactic: My staff will get back to you.

Mr. Pichai is not a dynamic cult-of-personality tech leader like Steve Jobs or Elon Musk, but his reserved demeanor and earnestness are well suited for the congressional spotlight.

Mr. Zuckerberg has also grown more comfortable with the hearings over time and more emphatic about what the company is doing to combat misinformation. At his first appearance in 2018, Mr. Zuckerberg was contrite, apologizing for failing to protect users’ data and prevent Russian interference in elections and promising to do better.

Since then, he has pushed the message that Facebook is a platform for good, while carefully laying out the steps that the company is taking to stamp out disinformation online.

As the sessions have gone virtual during the pandemic, Mr. Dorsey’s appearances, hunched over a laptop camera, carry a just-another-guy-on-Zoom vibe when compared to the softly lit neutral backdrops for the Google and Facebook chiefs.

Mr. Dorsey tends to remain extremely calm — almost zen-like — when pressed with aggressive questions, and often engages on technical issues that rarely elicit a follow-up.


Zuckerberg, Dorsey and Pichai testify about disinformation.

The chief executives of Google, Facebook and Twitter are testifying at the House on Thursday about how disinformation spreads across their platforms, an issue that the tech companies were scrutinized for during the presidential election and after the Jan. 6 riot at the Capitol.

The hearing, held by the House Energy and Commerce Committee, is the first time that Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing before Congress during the Biden administration. President Biden has indicated that he is likely to be tough on the tech industry. That position, coupled with Democratic control of Congress, has raised liberal hopes that Washington will take steps to rein in Big Tech’s power and reach over the next few years.

The hearing is also the first opportunity since the Jan. 6 Capitol riot for lawmakers to question the three men about the role their companies played in the event. The attack has made the issue of disinformation intensely personal for the lawmakers, since those who participated in the riot have been linked to online conspiracy theories like QAnon.

Before the hearing, Democrats signaled in a memo that they were interested in questioning the executives about the Jan. 6 attacks, efforts by the right to undermine the results of the 2020 election and misinformation related to the Covid-19 pandemic.

Republicans, for their part, have criticized the platforms’ decisions to limit the spread of an October article in The New York Post about President Biden’s son Hunter.

Lawmakers have debated whether social media platforms’ business models encourage the spread of hate and disinformation by prioritizing content that will elicit user engagement, often by emphasizing salacious or divisive posts.

Some lawmakers will push for changes to Section 230 of the Communications Decency Act, a 1996 law that shields the platforms from lawsuits over their users’ posts. Lawmakers are trying to strip the protections in cases where the companies’ algorithms amplified certain illegal content. Others believe that the spread of disinformation could be stemmed with stronger antitrust laws, since the platforms are by far the major outlets for communicating publicly online.

“By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who is chairman of the committee.

The tech executives are expected to play up their efforts to limit misinformation and redirect users to more reliable sources of information. They may also entertain the possibility of more regulation, in an effort to shape increasingly likely legislative changes rather than resist them outright.


Unmasked: man behind cult set to replace QAnon

The mysterious individual behind a new and rapidly growing online disinformation network targeting followers of QAnon, the far-right cult, can be revealed as a Berlin-based artist with a history of social media manipulation, a prominent anti-racism group claims.

Since Donald Trump left the White House, QAnon’s vast online community has been in a state of flux as it comes to terms with the reality that its conspiracy theories – such as the former US president being destined to defeat a cabal of Satan-worshipping paedophiles – amount to nothing.

That may explain why significant numbers have turned to a new far-right network, found mostly on the Telegram messaging app, that is growing quickly in the UK and globally and has amassed more than one million subscribers so far this year.

Sebastian Bieniek in 2018. The campaign group Hope Not Hate says he has a track record of inventing online conspiracies. Photograph: Reza Mahmoudidschad

Called the Sabmyk Network, it is, like QAnon, a convoluted conspiracy theory that features fantastical elements and is headed by a mysterious messianic figure. Since its emergence there has been widespread speculation about who that figure might be. The person who first posted as “Q” has never been positively identified.

This week the British anti-fascist group Hope Not Hate will unmask Sabmyk’s leader, who it claims is 45-year-old German art dealer Sebastian Bieniek. It says Bieniek – who has not responded to questions from the Observer – has a history of creating online conspiracies and even wrote a book in 2011 called RealFake that detailed a campaign to deceptively promote his work.

But Hope Not Hate says the speed of Sabmyk’s growth serves as a warning of the opportunities for manipulation that exist on social media, particularly unregulated alt-tech platforms such as Telegram.

Gregory Davis of Hope Not Hate, which will publish its annual report into the far right on Monday, said: “His success in developing such a huge audience is a reminder that the QAnon template of anonymous online manipulation will continue to pose a threat in the years to come.”

Since 21 December last year, when Sabmyk was supposedly “awakened”, more than 136 channels in English, German, Japanese, Korean and Italian have sprung up, adding tens of thousands of followers on a daily basis.

Much of Sabmyk’s content is designed to appeal to QAnon followers; it features Covid mask scepticism, anti-vaccine conspiracies and false assertions that the 2020 US election was stolen from Trump.

Some is also designed to actively recruit Britons: one Sabmyk channel, the British Patriotic Party, uses the same branding as anti-Muslim group Britain First and posts about the mayor of London, Sadiq Khan.

Other channels are entitled London Post and Liverpool Times, as well as the Great Awakening UK, a reference to a well-known QAnon trope predicting a day of reckoning in which Trump would rise against his liberal enemies. Others include WWG1WGA, an acronym for the QAnon rallying call “where we go one, we go all.”

Among the clues used to identify Bieniek are posts saying that the messiah Sabmyk can be identified by specific marks on his body. One post claimed that Sabmyk would have “17 V-shaped scars” on his arm, the result of a “prophetic ceremony at the age of 24”.

What is QAnon and why is it so dangerous? – video explainer

Hope Not Hate has found a since-deleted section on Bieniek’s website recalling a 1999 art exhibit in which, aged 24, he cut V-shaped wounds into his arm for 16 days in a row.

Attempts to connect Sabmyk to Trump have been made, including a clip that splices together instances of the former president saying “17”, and a doctored image showing him with a Sabmyk pamphlet in his suit pocket.

Bieniek has created countless false identities, according to the Hope Not Hate investigation, to promote his career as an artist. The group also says his German Wikipedia page has been deleted at least four times, most recently in January.

A list of Bieniek’s accounts has been sent to platforms including Telegram with a call for them to be removed on the basis of “inauthentic and coordinated platform manipulation”. Telegram has been approached for comment.
