
Big Tech C.E.O.s Face Lawmakers on Disinformation: Live Updates

Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing at a hearing held by the House Energy and Commerce Committee about how disinformation spreads across their platforms.

Mark Zuckerberg of Facebook, Sundar Pichai of Google and Jack Dorsey of Twitter testify remotely before Congress on “misinformation and disinformation plaguing online platforms.” Credit: Pool photo by Greg Nash
The Capitol riots “and the movement that motivated it started and was nourished on your platforms,” Representative Mike Doyle, Democrat of Pennsylvania, told the C.E.O.s.

Democratic lawmakers accused the chief executives of allowing disinformation to run rampant online, reflecting their mounting frustration about the spread of extremism, conspiracy theories and falsehoods online in the aftermath of the Jan. 6 riot at the Capitol.

Their comments opened the first hearing since President Biden’s inauguration featuring Mark Zuckerberg of Facebook, Sundar Pichai of Google and Jack Dorsey of Twitter. They were a signal that scrutiny of Silicon Valley’s business practices will not let up, and may even intensify, with Democrats in the White House and leading both chambers of Congress.

The January riot made the issue of disinformation intensely personal for many lawmakers. Some participants have been linked to online conspiracies like QAnon, which the platforms have tried to stem in recent months.

“We fled as a mob desecrated the Capitol, the House floor and our democratic process,” said Representative Mike Doyle, a Pennsylvania Democrat. “That attack and the movement that motivated it started and was nourished on your platforms.”

Lawmakers argued that the platforms also had enabled misinformation about the coronavirus pandemic.

The lawmakers’ growing frustration comes as they consider whether to more tightly regulate the business models of the platforms. Some have proposed modifying a legal shield that protects websites from lawsuits over content posted by their users, arguing that it allows the companies to get away with negligence in policing their products.

Representative Jan Schakowsky, Democrat of Illinois, said Thursday that the executives should take away the message that “self-regulation has come to the end of its road.”

Representative Bob Latta, Republican of Ohio, accused the platforms of a “commitment to serve the radical progressive agenda.”

Republican lawmakers came into the hearing steaming about the Jan. 6 Capitol riots, but their animus was focused on the decisions by the platforms to ban right-wing figures, including former President Donald J. Trump, for inciting violence.

The decisions to ban Mr. Trump, many of his associates and other conservatives, they said, amounted to liberal bias and censorship.

“We’re all aware of Big Tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda,” said Bob Latta, the ranking Republican of the House’s communications and technology subcommittee.

After the Capitol riots, Mr. Trump and some of his top aides were temporarily or indefinitely banned on major social media sites.

Mr. Latta’s comments are expected to be echoed by many Republicans in the hearing. They say the platforms have become gatekeepers of information, and they accuse the companies of trying to suppress conservative views. The claims have been consistently refuted by academics.

Mr. Latta homed in on the legal shield known as Section 230 of the Communications Decency Act and whether the big tech companies deserve the regulatory protection.

“Section 230 provides you with the liability protection for content moderation decisions made in good faith,” Mr. Latta said. But he said the companies have appeared to use their moderating powers to censor viewpoints that the companies disagree with. “I find that highly concerning.”

The chief executives of Facebook, Alphabet and Twitter are expected to face tough questions from lawmakers on both sides of the aisle. Democrats have focused on disinformation, especially in the wake of the Capitol riot. Republicans, meanwhile, have already questioned the companies about their decisions to remove conservative personalities and stories from their platforms.

New York Times reporters have covered many of the examples that could come up. Here are the facts to know about them:

After his son was stabbed to death in Israel by a member of the militant group Hamas in 2016, Stuart Force decided that Facebook was partly to blame for the death, because the algorithms that power the social network helped spread Hamas’s content. He joined relatives of other terror victims in suing the company, arguing that its algorithms aided the crimes by regularly amplifying posts that encouraged terrorist attacks. Arguments about the algorithms’ power have reverberated in Washington.

Section 230 of the Communications Decency Act has helped Facebook, YouTube, Twitter and countless other internet companies flourish. But Section 230’s liability protection also extends to fringe sites known for hosting hate speech, anti-Semitic content and racist tropes. As scrutiny of big technology companies has intensified in Washington over a wide variety of issues, including how they handle the spread of disinformation or police hate speech, Section 230 has come under new focus.

After inflaming political discourse around the globe, Facebook is trying to turn down the temperature. The social network started changing its algorithm to reduce the political content in users’ news feeds. Facebook previewed the change earlier this year when Mark Zuckerberg, the chief executive, said the company was experimenting with ways to tamp down divisive political debates among users. “One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” he said.

As the Electoral College affirmed Joseph R. Biden Jr.’s election, voter fraud misinformation subsided. But peddlers of online falsehoods ramped up lies about the Covid-19 vaccines. Rep. Marjorie Taylor Greene, a Republican of Georgia, as well as far-right websites like ZeroHedge, have begun pushing false vaccine narratives, researchers said. Their efforts have been amplified by a robust network of anti-vaccination activists like Robert F. Kennedy Jr. on platforms including Facebook, YouTube and Twitter.

In the end, two billionaires from California did what legions of politicians, prosecutors and power brokers had tried and failed to do for years: They pulled the plug on President Trump. Journalists and historians will spend years unpacking the improvisational nature of the bans, and scrutinizing why they arrived just as Mr. Trump was losing his power, and Democrats were poised to take control of Congress and the White House. The bans have also turned up the heat on a free-speech debate that has been simmering for years.

Chief executives from Google, Apple, Amazon and Facebook testifying in July. Mark Zuckerberg of Facebook has testified six times on Capitol Hill.

In the fall of 2017, when Congress called on Google, Facebook and Twitter to testify about their role in Russia’s interference with the 2016 presidential election, the companies didn’t send their chief executives — as lawmakers had requested — and instead summoned their lawyers to face the fire.

During the hearings, the politicians complained that the general counsels were answering questions about whether the companies contributed to undermining the democratic process instead of “the top people who are actually making the decisions,” as Senator Angus King, an independent from Maine, put it.

It was clear Capitol Hill wanted its pound of C.E.O. flesh and that hiding behind the lawyers was not going to work for long. That initial concern about how the chieftains of Silicon Valley would handle grilling from lawmakers is no longer a worry. After a slew of hearings in recent years, both virtual and in-person, the executives have had plenty of practice.

Since 2018, Sundar Pichai, Google’s chief executive, has testified on three different occasions. Jack Dorsey, Twitter’s chief executive, has made four appearances, and Mark Zuckerberg, Facebook’s chief, has testified six times.

And when the three men again face questioning on Thursday, they will do so as seasoned veterans in the art of deflecting the most vicious attacks and redirecting to their carefully practiced talking points.

In general, Mr. Pichai tends to disagree politely and promptly with the sharpest jabs from lawmakers — such as when he was asked last year why Google steals content from honest businesses — without harping on them. When a politician tries to pin him down on a specific issue, he often relies on a familiar delaying tactic: My staff will get back to you.

Mr. Pichai is not a dynamic cult-of-personality tech leader like Steve Jobs or Elon Musk, but his reserved demeanor and earnestness are well suited for the congressional spotlight.

Mr. Zuckerberg has also grown more comfortable with the hearings over time, and more emphatic about what the company is doing to combat misinformation. At his first appearance in 2018, Mr. Zuckerberg was contrite, apologizing for failures to protect users’ data and to prevent Russian interference in elections, and promised to do better.

Since then, he has pushed the message that Facebook is a platform for good, while carefully laying out the steps that the company is taking to stamp out disinformation online.

As the sessions have gone virtual during the pandemic, Mr. Dorsey’s appearances, hunched over a laptop camera, carry a just-another-guy-on-Zoom vibe compared with the softly lit neutral backdrops of the Google and Facebook chiefs.

Mr. Dorsey tends to remain extremely calm — almost zen-like — when pressed with aggressive questions and often engages on technical issues that rarely elicit a follow-up.


Atlanta Spa Shootings Reverberate Across South Korea, Long a U.S. Ally

SEOUL—The killings happened more than 7,000 miles away. But for many South Koreans, the Atlanta-area spa shootings hit close to home. “The Victims Were Korean Mothers,” read a headline Sunday from the country’s largest newspaper.

Of the eight people who died, six were women of Asian descent—including four who have been identified as ethnic Koreans, ranging in age from 51 to 74. One was a South Korean citizen.

The rampage in Georgia has reverberated across this nation of 52 million, which in the decades since the Korean War has had a deep and enduring relationship with the U.S. The two are allies and share close cultural ties.

It can often seem like every Korean knows someone with relatives or friends living in the U.S. South Korea sends more of its children to study in America than in any other foreign country.

Lee Myung-kyu, a 55-year-old office worker, said he knows many South Korean families who have dreamed of immigrating to the U.S., hoping for a better life. His own daughter wants to go to school in America. But Mr. Lee said he now has doubts.

“I keep thinking about whether something like this could happen to her,” Mr. Lee said.


Demonstrators Call for End to Anti-Asian Violence in U.S.

Protests and vigils urging an end to violence against Asian-Americans were held around the country on Saturday.

Hundreds gathered Saturday in San Francisco’s Chinatown, calling for an end to violence against Asian-Americans. Eight people, including six women of Asian descent, were killed in a shooting spree in the Atlanta area on Tuesday.



Local police say the white man from Georgia charged with murder in the case said he was driven by what he called a sex addiction. Authorities say they are investigating whether the killings were racially motivated.

The attack has sparked fear at the same time that police and government officials in New York and other U.S. cities say hate crimes against Asian-Americans have risen since the start of the Covid-19 pandemic, which first emerged in China.

Han Ye-rim, 32, said she has long idealized the U.S. as a diverse society. But staring at a list of victims who look much like her, Ms. Han wonders how she would actually fare if she left Seoul.

“Learning about the Atlanta incident was a wake-up call to me,” Ms. Han said. “I’m realizing that I can be targeted for being different if I leave this country.”

What made the Atlanta rampage especially jarring was how good South Koreans, and Korean-Americans, had been feeling lately about their standing in the U.S.

U.S. Secretary of State Antony Blinken and Defense Secretary Lloyd Austin visited South Korea on their first foreign trip. Barely a year ago, the South Korean film “Parasite” emerged with an unprecedented Best Picture win at the Academy Awards. BTS, the Korean pop band, had recently performed at the Grammys and topped Billboard’s album charts.

People rallied on Saturday in Atlanta.


Meanwhile, South Koreans had rushed to the local box office to see the U.S. film “Minari,” which depicts a new Korean immigrant family in rural Arkansas and was itself just nominated for several Oscars.

“It’s really a weird kind of dichotomy,” said Abraham Kim, executive director for the Council of Korean Americans, a Washington-based nonprofit group, with celebrations of pop culture on the one hand and what he described as Asians “being targeted for violence on the other.”

South Korean media has given widespread coverage to the Atlanta shootings. In a Thursday editorial, Kyunghyang Shinmun, a left-leaning newspaper, called American society “defenseless to racist attacks.” Another outlet, the right-leaning Segye Ilbo, urged the U.S. to take “effective measures so that crimes against humanity do not take root.”

On Friday, President Biden, saying that the investigation is still under way, mourned the victims and declared that “hate can have no safe harbor in America.”


South Korean President Moon Jae-in has called the Atlanta killings shocking, while the country’s foreign ministry supported the U.S. government’s efforts to stand against hatred and violence. “Such a crime is unacceptable under any circumstances,” the foreign ministry said in a Saturday statement.

Walking with a friend just blocks from the U.S. Embassy in Seoul, where the American flag continues to fly at half-staff in honor of the shooting victims, Yoon Ji-a recalled living in California during her youth. Her parents had a few brushes with racism, she said. But the events in Atlanta caught her by surprise.

“It’s scary,” said Ms. Yoon, a 20-year-old college student.

There are about 1.8 million Korean-Americans, according to U.S. government figures. The biggest Korean populations are in the metropolitan areas of Los Angeles, New York and Washington, D.C., according to a Pew Research Center analysis of U.S. data. Atlanta ranks seventh-largest.

Jean Lee has two children living in the U.S., though she hadn’t learned of the Atlanta-area shootings until local media began broadcasting coverage of the weekend protests and vigils across nearly two dozen American cities. Now the 48-year-old fears her children could be targeted.

“A lot of hate speech surfaced when people began calling the coronavirus the ‘Wuhan virus’ and it’s unfortunate that this issue came to light because of the shootings,” Ms. Lee said. “It feels late for Asians who have been experiencing discrimination for so long.”

Jenna Lee, a 25-year-old online shopping-mall owner, said she lived in Atlanta for two years as a teenager. In recent days, she said, she watched “Minari,” with its tale of struggling immigrants, and it prompted her to wonder whether Asian-Americans would be forever foreign and forever invisible.

“Asians are more than just people trying to assimilate into American society,” Ms. Lee said. And in her view, she said, “the shootings show how vulnerable we are to discrimination.”

Write to Timothy W. Martin at timothy.martin@wsj.com and Dasl Yoon at dasl.yoon@wsj.com



The Strange History of Harmony Day and Australia’s Racism Discussion

The Australia Letter is a weekly newsletter from our Australia bureau. Sign up to get it by email. This week’s issue is written by Yan Zhuang, a reporter with the Australia bureau.

Two weeks ago, four local councilors of Asian heritage across Sydney received letters calling for death to “all Chinese people.” On Wednesday night, one of them proposed a motion for his council to take part in a campaign called “Racism Not Welcome.”

The motion was narrowly defeated. Some councilors said it was unnecessary because the problem didn’t exist in the local community. Others took issue with one word in the campaign’s name: Racism.

“It’s a terrible word, and I don’t want to see it in any fashion or form in our community,” said one councilor who voted against the motion.

“I don’t agree with using those particular words,” said another. “I think we should be using more encouraging words. More inclusive words. More belonging words. More words of togetherness, rather than words of separation or segregation.”


Finding a new value that everyone could gather around might also ease the concerns some Australians held about a loss of national identity as the country gradually moved from a primarily Anglo culture to a more diverse one with the arrival of new migrants, the report said. That in itself could help reduce racism.

But with one in five Chinese-Australians reporting that they had been threatened or attacked during the pandemic, and with attacks in places like the United States, where shootings in Atlanta killed eight people, six of them Asian women, critics say the focus on harmony has become a disincentive to candor and to addressing the problem.

The attitude on display at the local council goes back decades. There is much wider awareness now that racism is a significant problem, Professor Jakubowicz said, but the unwillingness to talk about race, reflected in the history of Harmony Day, continues.

“The majority position, that Australians would like not to be disturbed by reflecting on these facts, is very much there. And Harmony Day says you don’t have to be disturbed,” he said.




For Political Cartoonists, the Irony Was That Facebook Didn’t Recognize Irony

SAN FRANCISCO — Since 2013, Matt Bors has made a living as a left-leaning cartoonist on the internet. His site, The Nib, runs cartoons from him and other contributors that regularly skewer right-wing movements and conservatives with political commentary steeped in irony.

One cartoon in December took aim at the Proud Boys, a far-right extremist group. With tongue planted firmly in cheek, Mr. Bors titled it “Boys Will Be Boys” and depicted a recruitment session where new Proud Boys were trained to be “stabby guys” and to “yell slurs at teenagers” while playing video games.

Days later, Facebook sent Mr. Bors a message saying that it had removed “Boys Will Be Boys” from his Facebook page for “advocating violence” and that he was on probation for violating its content policies.

It wasn’t the first time that Facebook had dinged him. Last year, the company briefly took down another Nib cartoon — an ironic critique of former President Donald J. Trump’s pandemic response, the substance of which supported wearing masks in public — for “spreading misinformation” about the coronavirus. Instagram, which Facebook owns, removed one of his sardonic antiviolence cartoons in 2019 because, the photo-sharing app said, it promoted violence.

Facebook barred Mr. Trump from posting on its site altogether after he incited a crowd that stormed the U.S. Capitol.

At the same time, misinformation researchers said, Facebook has had trouble identifying the slipperiest and subtlest of political content: satire. While satire and irony are common in everyday speech, the company’s artificial intelligence systems — and even its human moderators — can have difficulty distinguishing them. That’s because such discourse relies on nuance, implication, exaggeration and parody to make a point.

That means Facebook has sometimes misunderstood the intent of political cartoons, leading to takedowns. The company has acknowledged that some of the cartoons it expunged — including those from Mr. Bors — were removed by mistake, and it later reinstated them.

“If social media companies are going to take on the responsibility of finally regulating incitement, conspiracies and hate speech, then they are going to have to develop some literacy around satire,” Mr. Bors, 37, said in an interview.

Conservatives, for their part, have accused Facebook and other internet platforms of suppressing only right-wing views.

In a statement, Facebook did not address whether it has trouble spotting satire. Instead, the company said it made room for satirical content — but only up to a point. Posts about hate groups and extremist content, it said, are allowed only if the posts clearly condemn or neutrally discuss them, because the risk for real-world harm is otherwise too great.

Facebook’s struggles to moderate content across its core social network, Instagram, Messenger and WhatsApp have been well documented. After Russians manipulated the platform before the 2016 presidential election by spreading inflammatory posts, the company recruited thousands of third-party moderators to prevent a recurrence. It also developed sophisticated algorithms to sift through content.

Facebook also created a process so that only verified buyers could purchase political ads, and instituted policies against hate speech to limit posts that contained anti-Semitic or white supremacist content.

Last year, Facebook said it had stopped more than 2.2 million political ad submissions that had not yet been verified and that targeted U.S. users. It also cracked down on the conspiracy group QAnon and the Proud Boys, removed vaccine misinformation, and displayed warnings on more than 150 million pieces of content viewed in the United States that third-party fact checkers debunked.

But satire kept popping up as a blind spot. In 2019 and 2020, Facebook often dealt with far-right misinformation sites that used “satire” claims to protect their presence on the platform, researchers said. For example, The Babylon Bee, a right-leaning site, frequently trafficked in misinformation under the guise of satire.

Facebook’s filters have also tripped up Mr. Hall, a political cartoonist whose independent work regularly appears in North American and European newspapers.

When Prime Minister Benjamin Netanyahu said in 2019 that he would bar two congresswomen — critics of Israel’s treatment of Palestinians — from visiting the country, Mr. Hall drew a cartoon showing a sign affixed to barbed wire that read, in German, “Jews are not welcome here.” He added a line of text addressing Mr. Netanyahu: “Hey Bibi, did you forget something?”

Mr. Hall said his intent was to draw an analogy between how Mr. Netanyahu was treating the U.S. representatives and Nazi Germany. Facebook took the cartoon down shortly after it was posted, saying it violated its standards on hate speech.

“If algorithms are making these decisions based solely upon words that pop up on a feed, then that is not a catalyst for fair or measured decisions when it comes to free speech,” Mr. Hall said.

Adam Zyglis, a nationally syndicated political cartoonist for The Buffalo News, was also caught in Facebook’s cross hairs.

Though Mr. Bors makes money from paid memberships to The Nib and book sales on his personal site, he gets most of his traffic and new readership through Facebook and Instagram.

The takedowns, which have resulted in “strikes” against his Facebook page, could upend that. If he accumulates more strikes, his page could be erased, something that Mr. Bors said would cut 60 percent of his readership.

“Removing someone from social media can end their career these days, so you need a process that distinguishes incitement of violence from a satire of these very groups doing the incitement,” he said.

Mr. Bors said he had also heard from the Proud Boys. A group of them recently organized on the messaging chat app Telegram to mass-report his critical cartoons to Facebook for violating the site’s community standards, he said.

“You just wake up and find you’re in danger of being shut down because white nationalists were triggered by your comic,” he said.

Facebook has sometimes recognized its errors and corrected them after he has made appeals, Mr. Bors said. But the back-and-forth and the potential for expulsion from the site have been frustrating and made him question his work, he said.

“Sometimes I do think about if a joke is worth it, or if it’s going to get us banned,” he said. “The problem with that is, where is the line on that kind of thinking? How will it affect my work in the long run?”

Cade Metz contributed reporting.


Tech’s Legal Shield Appears Likely to Survive as Congress Focuses on Details

WASHINGTON — Former President Donald J. Trump called multiple times for repealing the law that shields tech companies from legal responsibility over what people post. President Biden, as a candidate, said the law should be “revoked.”

But the lawmakers aiming to weaken the law have started to agree on a different approach. They are increasingly focused on eliminating protections for specific kinds of content rather than making wholesale changes to the law or eliminating it entirely.

That has still left them a question with potentially wide-ranging outcomes: What, exactly, should lawmakers cut?

One bill introduced last month would strip the protections from content the companies are paid to distribute, like ads, among other categories. A different proposal, expected to be reintroduced from the last congressional session, would allow people to sue when a platform amplified content linked to terrorism. And another that is likely to return would exempt content from the law only when a platform failed to follow a court’s order to take it down.

The tech industry has also signaled that it is open to trimming the law, an effort to shape changes companies see as increasingly likely to happen. Facebook and Google, the owner of YouTube, have said that they are willing to work with lawmakers on changing the law, and some smaller companies recently formed a lobbying group to shape any changes.

A December op-ed co-written by Bruce Reed, Mr. Biden’s deputy chief of staff, said that “platforms should be held accountable for any content that generates revenue.” The op-ed also said that while carving out specific types of content was a start, lawmakers would do well to consider giving platforms the entire liability shield only on the condition that they properly moderate content.

Supporters of Section 230 say even small changes could hurt vulnerable people. They point to the 2018 anti-trafficking bill, which sex workers say made it harder to vet potential clients online after some of the services they used closed, fearing new legal liability. Instead, sex workers have said they must now risk meeting with clients in person without using the internet to ascertain their intentions at a safe distance.

Senator Ron Wyden, the Oregon Democrat who co-wrote Section 230 while in the House, said measures meant to address disinformation on the right could be used against other political groups in the future.

“If you remember 9/11, and you had all these knee-jerk reactions to those horrible tragedies,” he said. “I think it would be a huge mistake to use the disgusting, nauseating attacks on the Capitol as a vehicle to suppress free speech.”

Industry officials say carve-outs to the law could nonetheless be extremely difficult to carry out.

“I appreciate that some policymakers are trying to be more specific about what they don’t like online,” said Kate Tummarello, the executive director of Engine, an advocacy group for small companies. “But there’s no universe in which platforms, especially small platforms, will automatically know when and where illegal speech is happening on their site.”

The issue may take center stage when the chief executives of Google, Facebook and Twitter testify late this month before the House Energy and Commerce Committee, which has been examining the future of the law.

“I think it’s going to be a huge issue,” said Representative Cathy McMorris Rodgers of Washington, the committee’s top Republican. “Section 230 is really driving it.”
