An international coalition of 35 children’s and consumer groups called on Instagram on Thursday to scrap its plans to develop a version of the popular photo-sharing app for users under age 13.
Instagram’s push for a separate children’s app comes after years of complaints from legislators and parents that the platform has been slow to identify underage users and protect them from sexual predators and bullying.
But in a letter to Mark Zuckerberg, the chief executive of Facebook — the company that owns the photo-sharing service — the nonprofit groups warned that a children’s version of Instagram would not mitigate such problems. While 10- to 12-year-olds with Instagram accounts would be unlikely to switch to a “babyish version” of the app, the groups said, it could hook even younger users on endless routines of photo-scrolling and body-image shame.
“While collecting valuable family data and cultivating a new generation of Instagram users may be good for Facebook’s bottom line,” the groups, led by the Campaign for a Commercial-Free Childhood in Boston, said in the letter to Mr. Zuckerberg, “it will likely increase the use of Instagram by young children who are particularly vulnerable to the platform’s manipulative and exploitative features.”
The coalition of nonprofit groups also includes the Africa Digital Rights’ Hub in Ghana; the Australian Council on Children and the Media; the Center for Digital Democracy in Washington; Common Sense Media in San Francisco; the Consumer Federation of America; and the 5Rights Foundation in Britain.
Stephanie Otway, a Facebook spokeswoman, said that Instagram was in the early stages of developing a service for children as part of an effort to keep those under 13 off its main platform. Although Instagram requires users to be at least 13, many younger children have lied about their age to set up accounts.
Ms. Otway said that the company would not show ads in any Instagram product developed for children younger than 13, and that it planned to consult with experts on children’s health and safety on the project. Instagram is also working on new age-verification methods to catch younger users trying to lie about their age, she said.
“The reality is that kids are online,” Ms. Otway said. “They want to connect with their family and friends, have fun and learn, and we want to help them do that in a way that is safe and age-appropriate.”
WASHINGTON — Lawmakers grilled the leaders of Facebook, Google and Twitter on Thursday about the connection between online disinformation and the Jan. 6 riot at the Capitol, causing Twitter’s chief executive to publicly admit for the first time that his product had played a role in the events that left five people dead.
When a Democratic lawmaker asked the executives to answer with a “yes” or a “no” whether the platforms bore some responsibility for the misinformation that had contributed to the riot, Jack Dorsey of Twitter said “yes.” Neither Mark Zuckerberg of Facebook nor Sundar Pichai of Google would answer the question directly.
The roughly five-hour hearing before a House committee marked the first time lawmakers directly questioned the chief executives about social media’s role in the January riot. The tech bosses were also peppered with questions about how their companies helped spread falsehoods about Covid-19 vaccines, enabled racism and hurt children’s mental health.
It was also the first time the executives had testified since President Biden’s inauguration. Tough questioning from lawmakers signaled that scrutiny of Silicon Valley’s business practices would not let up, and could even intensify, with Democrats in the White House and leading both chambers of Congress.
During the hearing, Mr. Dorsey tweeted a single question mark with a poll that had two options: “Yes” or “No.” When a lawmaker asked about the tweet, he said “yes” was winning.
The January riot at the Capitol has made the issue of disinformation deeply personal for lawmakers. The riot was fueled by false claims from President Donald J. Trump and others that the election had been stolen, which were rampant on social media.
Some of the participants had connections to QAnon and other online conspiracy theories. And prosecutors have said that groups involved in the riot, including the Oath Keepers and the Proud Boys, coordinated some of their actions on social media.
The platforms’ decisions to ban Mr. Trump and his associates after the Jan. 6 riot hardened views among conservatives that the companies are left-leaning and inclined to squelch conservative voices.
“We’re all aware of Big Tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda,” said Representative Bob Latta of Ohio, the ranking Republican on the panel’s technology subcommittee.
The company leaders defended their businesses, saying they had invested heavily in hiring content moderators and in technology like artificial intelligence to identify and fight disinformation.
Mr. Zuckerberg argued against the notion that his company had a financial incentive to juice its users’ attention by driving them toward more extreme content. He said Facebook didn’t design “algorithms in order to just kind of try to tweak and optimize and get people to spend every last minute on our service.”
He added later in the hearing that election disinformation had also spread in messaging apps, where amplification and algorithms don’t aid the spread of false content. He also blamed television and other traditional media for spreading election lies.
The companies showed fissures in their views on regulation. Facebook has vocally supported internet regulations in a major advertising blitz on television and in newspapers. In the hearing, Mr. Zuckerberg suggested specific regulatory reforms to a key legal shield, known as Section 230 of the Communications Decency Act, that has helped Facebook and other Silicon Valley internet giants thrive.
The legal shield protects companies that host and moderate third-party content, and says companies like Google and Twitter are simply intermediaries of their user-generated content. Democrats have argued that with that protection, companies aren’t motivated to remove disinformation. Republicans accuse the companies of using the shield to moderate too much and to take down content that doesn’t represent their political viewpoints.
“I believe that Section 230 would benefit from thoughtful changes to make it work better for people,” Mr. Zuckerberg said in his written statement.
He proposed that liability protection for companies be conditional on their ability to fight the spread of certain types of unlawful content. He said platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Reforms, he said, should be different for smaller social networks, which wouldn’t have the same resources as Facebook to meet new requirements.
Mr. Pichai and Mr. Dorsey said they supported transparency requirements for content moderation but stopped short of endorsing Mr. Zuckerberg’s other ideas. Mr. Dorsey said that it would be very difficult to distinguish a large platform from a smaller one.
Lawmakers did not appear to be won over.
“There’s a lot of smugness among you,” said Representative Bill Johnson, a Republican of Ohio. “There’s this air of untouchable-ness in your responses to many of the tough questions that you’re being asked.”
Kate Conger and Daisuke Wakabayashi contributed reporting.
The leaders of Google, Facebook and Twitter testified on Thursday before a House committee in their first appearances on Capitol Hill since the start of the Biden administration. As expected, sparks flew.
The hearing was centered on questions of how to regulate disinformation online, although lawmakers also voiced concerns about the public-health effects of social media and the borderline-monopolistic practices of the largest tech companies.
On the subject of disinformation, Democratic legislators scolded the executives for the role their platforms played in spreading false claims about election fraud before the Capitol riot on Jan. 6. Jack Dorsey, the chief executive of Twitter, admitted that his company had been partly responsible for helping to circulate disinformation and plans for the Capitol attack. “But you also have to take into consideration the broader ecosystem,” he added. Sundar Pichai and Mark Zuckerberg, the top executives at Google and Facebook, avoided answering the question directly.
Lawmakers on both sides of the aisle returned often to the possibility of jettisoning or overhauling Section 230 of the Communications Decency Act, a federal law that for 25 years has granted immunity to tech companies for any harm caused by speech that’s hosted on their platforms.
There are more guns than people in the United States: 393 million, to be precise, which is more than one per person and about 46 percent of all civilian-owned firearms in the world. As researchers at the Harvard T.H. Chan School of Public Health have put it, “more guns = more homicide” and “more guns = more suicide.”
But when it comes to understanding the causes of America’s political inertia on the issue, the lines of thought become a little more tangled. Some of them are easy to follow: There’s the line about the Senate, of course, which gives large states that favor gun regulation the same number of representatives as small states that don’t. There’s also the line about the National Rifle Association, which some gun control proponents have cast — arguably incorrectly — as the sine qua non of our national deadlock.
But there may be a psychological thread, too. Research has found that after a mass shooting, people who don’t own guns tend to identify the general availability of guns as the culprit. Gun owners, on the other hand, are more likely to blame other factors, such as popular culture or parenting.
Americans who support gun regulations also don’t prioritize the issue at the polls as much as Americans who oppose them, so gun rights advocates tend to win out. Or, in the words of Robert Gebelhoff of The Washington Post, “Gun reform doesn’t happen because Americans don’t want it enough.”
The chief executives of Google, Facebook and Twitter are testifying at the House on Thursday about how disinformation spreads across their platforms, an issue for which the tech companies were scrutinized during the presidential election and after the Jan. 6 riot at the Capitol.
The hearing, held by the House Energy and Commerce Committee, is the first time that Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing before Congress during the Biden administration. President Biden has indicated that he is likely to be tough on the tech industry. That position, coupled with Democratic control of Congress, has raised liberal hopes that Washington will take steps to rein in Big Tech’s power and reach over the next few years.
The hearing is also the first opportunity since the Jan. 6 Capitol riot for lawmakers to question the three men about the role their companies played in the event. The attack has made the issue of disinformation intensely personal for the lawmakers, since those who participated in the riot have been linked to online conspiracy theories like QAnon.
Before the hearing, Democrats signaled in a memo that they were interested in questioning the executives about the Jan. 6 attacks, efforts by the right to undermine the results of the 2020 election and misinformation related to the Covid-19 pandemic.
Republicans, meanwhile, were expected to press the executives on accusations of censorship, including the platforms’ suppression of an October article in The New York Post about President Biden’s son Hunter.
Lawmakers have debated whether social media platforms’ business models encourage the spread of hate and disinformation by prioritizing content that will elicit user engagement, often by emphasizing salacious or divisive posts.
Some lawmakers will push for changes to Section 230 of the Communications Decency Act, a 1996 law that shields the platforms from lawsuits over their users’ posts, seeking to strip the protections in cases where the companies’ algorithms amplify certain illegal content. Others believe that the spread of disinformation could be stemmed with stronger antitrust laws, since the platforms are by far the major outlets for communicating publicly online.
“By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who is chairman of the committee.
The tech executives are expected to play up their efforts to limit misinformation and redirect users to more reliable sources of information. They may also entertain the possibility of more regulation, in an effort to shape increasingly likely legislative changes rather than resist them outright.
Good morning and happy spring. Here’s hoping you can enjoy another Sunday spent ignoring your tax returns (or, if you’ve already done them, feeling smug about it). But first, here’s what you need to know in business and tech news for the week ahead. — Charlotte Cowles
What’s Up? (March 14 to 20)
More Time for Taxes
Good news for procrastinators like me, or anyone whose taxes were complicated by the pandemic: The Internal Revenue Service has extended the deadline to file taxes by one month, to May 17. The extra time will help people navigate new tax rules that took effect with the passage of the American Rescue Plan. The law made the first $10,200 of unemployment benefits tax-free for people who earned less than $150,000 last year, a significant benefit for many people whose jobs were disrupted. But if you’ve already filed, don’t worry — the I.R.S. said it would automatically send those refunds to people who qualify.
Well, That Was Awkward
Relations between China and the Biden administration got off to a rocky start last week at the first face-to-face meeting between diplomats. The United States set a confrontational tone on the eve of the talks by imposing sanctions on 24 Chinese officials for undermining democracy in Hong Kong. In turn, China’s top diplomat accused his American counterparts of being “condescending,” among other claims. The purpose of the three-day meeting, according to President Biden’s team, was to find common ground on climate change and on controlling the pandemic, and to address U.S. concerns about Chinese trade and military encroachments. The tension does not bode well for making headway in future negotiations.
The women suing the Walt Disney Company for what they call “rampant gender pay discrimination” have added another accusation to their list: that Disney “maintains a strict policy of pay secrecy.” A new section of the lawsuit refers to an episode in which one female Disney employee was “disciplined for disclosing her pay to co-workers.” Pay transparency is considered an important part of closing racial and gender wage gaps, and retaliation for discussing your own salary violates California law as well as the National Labor Relations Act. Disney has denied the claims and vowed to defend itself.
What’s Next? (March 21 to 27)
Coming to a Walmart Near You
Walmart is jumping on the vaccine passport bandwagon, saying it will provide standardized digital vaccination credentials to anyone who gets vaccinated at one of its stores or at Sam’s Club. The retailer will develop a health passport app that people can use to verify their status at airports, schools, sports arenas and other potentially crowded places. Walmart joins an existing push by major health centers and tech companies, including Microsoft, Oracle, Salesforce and the Mayo Clinic, and parallels a proposal from the European Union that would require vaccine verification for travel in certain areas.
How Has the Pandemic Changed Your Taxes?
Nope. The so-called economic impact payments are not treated as income. In fact, they’re technically an advance on a tax credit, known as the Recovery Rebate Credit. The payments could indirectly affect what you pay in state income taxes in a handful of states, where federal tax is deductible against state taxable income, as our colleague Ann Carrns wrote.
Mostly. Unemployment insurance is generally subject to federal as well as state income tax, though there are exceptions. (Nine states don’t impose their own income taxes, and another six exempt unemployment payments from taxation, according to the Tax Foundation.) But you won’t owe so-called payroll taxes, which pay for Social Security and Medicare. The new relief bill will make the first $10,200 of benefits tax-free if your income is less than $150,000. This applies to 2020 only. (If you’ve already filed your taxes, watch for I.R.S. guidance.) Unlike paychecks from an employer, taxes for unemployment aren’t automatically withheld. Recipients must opt in — and even when they do, federal taxes are withheld only at a flat rate of 10 percent of benefits. While the new tax break will provide a cushion, some people could still owe the I.R.S. or certain states money.
Probably not, unless you’re self-employed, an independent contractor or a gig worker. The tax law overhaul of late 2017 eliminated the home office deduction for employees from 2018 through 2025. “Employees who receive a paycheck or a W-2 exclusively from an employer are not eligible for the deduction, even if they are currently working from home,” the I.R.S. said.
Self-employed people can take paid caregiving leave if their child’s school is closed or their usual child care provider is unavailable because of the outbreak. This works similarly to the smaller sick leave credit — 67 percent of average daily earnings (for either 2020 or 2019), up to $200 a day. But the caregiving leave can be taken for 50 days.
Yes. This year, you can deduct up to $300 for charitable contributions, even if you use the standard deduction. Previously, only people who itemized could claim these deductions. Donations must be made in cash (for these purposes, this includes check, credit card or debit card), and can’t include securities, household items or other property. For 2021, the deduction limit will double to $600 for joint filers. Rules for itemizers became more generous as well. The limit on charitable donations has been suspended, so individuals can contribute up to 100 percent of their adjusted gross income, up from 60 percent. But these donations must be made to public charities in cash; the old rules apply to contributions made to donor-advised funds, for example. Both provisions are available through 2021.
Back in the Hot Seat
Chief executives from Facebook, Google and Twitter will be grilled in Congress this Thursday, this time over their failure to crack down on the spread of misinformation. Tech executives were last summoned by lawmakers in November 2020, when Mark Zuckerberg of Facebook and Jack Dorsey of Twitter faced a firestorm of questioning about content moderation, mostly regarding their attempts to prevent a wave of falsehoods about the presidential election. This time, they will be asked about coronavirus vaccine misinformation and about the election fraud conspiracy theories that continue to spread on their platforms.
Elsewhere in Washington
The two biggest names in economic policy — the Federal Reserve chair, Jerome Powell, and Treasury Secretary Janet Yellen — will make their first joint appearance this week when they testify before the House Financial Services Committee on the progress of pandemic relief efforts. The hearing comes one week after the Fed revised its economic outlook to project stronger growth and offered more reassurances that it would keep interest rates near zero for the coming years.
The Education Department jettisoned a Trump-era policy that limited debt relief for students who were defrauded by for-profit educational institutions. The newly hired Teen Vogue editor, Alexi McCammond, resigned over racist and homophobic tweets that she posted a decade ago. And retail sales dropped 3 percent in February as consumers grappled with declining stimulus effects and devastating winter storms.
MELBOURNE, Australia — Facebook has agreed to pay Rupert Murdoch’s News Corp for its journalism content in Australia, a month after the social media platform temporarily blocked news links inside the country over legislation pressing digital giants to compensate publishers.
The multiyear deal, announced on Tuesday, includes news content from major Murdoch conservative media outlets like The Australian, a national newspaper, and the news site news.com.au, as well as other metropolitan, regional and community publications.
It comes a month after Google unveiled its own three-year global agreement with News Corp to pay for the publisher’s news content, and after Facebook backed down, under heavy criticism, from its drastic step of blocking the sharing or viewing of news links in Australia.
Few details, including how much Facebook will pay News Corp for content, were released.
In a statement on Tuesday, Robert Thomson, chief executive of News Corp, said the agreement, which he called a “landmark,” would “have a material and meaningful impact on our Australian news businesses.”
Facebook had said of the draft Australian legislation, “The proposed law fundamentally misunderstands the relationship between our platform and publishers who use it to share news content.”
While the Australian government has pointed to the consolidation of digital ad spending in companies like Google and Facebook, the tech giants say that they benefit news companies by driving traffic to their sites.
Facebook has also announced preliminary pay deals with independent news organizations including Private Media, Schwartz Media and Solstice Media. But so far, it has cemented agreements only with News Corp and Seven West Media, another major conservative news company.
Sky News Australia, also owned by Mr. Murdoch, extended an existing agreement with Facebook.