Senators push Facebook exec on Instagram policies for youth

September 30, 2021 GMT
FILE - In this March 20, 2018 file photo, Facebook's head of global safety policy Antigone Davis speaks during a roundtable on cyberbullying with first lady Melania Trump, in the State Dining Room of the White House in Washington. Facing lawmakers’ outrage against Facebook over its handling of internal research on harm to teens from Instagram, Davis is telling Congress that the company is working to protect young people on its platforms, on Thursday, Sept. 30, 2021. (AP Photo/Evan Vucci, File)

WASHINGTON (AP) — Political adversaries in Congress are united in outrage at Facebook for privately compiling research indicating that its Instagram photo-sharing service appears to grievously harm some teens, especially girls, while publicly downplaying the popular platform’s negative impact.

Mounting public pressure over the revelations has prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems and, in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Those revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents.


Facebook’s head of global safety, Antigone Davis, has been summoned to testify Thursday by a Senate Commerce Committee panel digging into Instagram’s impact on young users.

She’s expected to tell the lawmakers that Facebook works to prevent children under 13 from gaining access to platforms that aren’t suitable for them, and that the company is developing features to protect young people on its platforms, using research and consultations with outside experts to make users’ experience positive. “We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Davis says in written testimony prepared for the panel. Facebook removed more than 600,000 Instagram accounts from June to August this year that didn’t meet the minimum age requirement of 13, she says.

She also says Facebook has a history of using its internal research, as well as outside experts and groups, to inform changes to its apps. The goal is to keep young people safe on the platforms and to ensure that those who aren’t old enough to use them do not.

The panel’s chairman, Sen. Richard Blumenthal, D-Conn., and its senior Republican, Sen. Marsha Blackburn of Tennessee, sit on opposite ends of the political spectrum. Blumenthal is a leading liberal, a former federal prosecutor who has pursued powerful industries over consumer protection issues and stressed civil rights. Blackburn, a solid ally of former President Donald Trump, is an outspoken conservative and abortion foe who has repeatedly accused Facebook, Google and Twitter of censoring conservative and anti-abortion viewpoints.

The Instagram revelations have brought them together to call Facebook to account.


“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Blumenthal said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Blackburn also plan to take testimony next week from a Facebook whistleblower, possibly the person who leaked the Instagram research documents to the Journal.

A preview of the grilling Davis faces came last week, when, in a separate Senate hearing, Blumenthal told another Facebook executive regarding the Instagram research, “You’ve been sent here to defend the indefensible.”

“Accountability is coming,” Blumenthal said. “And it will be bipartisan.”

Facebook has criticized the Journal story as cherry-picking from its research, though it didn’t dispute the authenticity of the documents.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and, until now, forged ahead with work on Instagram for Kids. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use the pause “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Facebook struck a similar note in July, saying it was working with parents, experts and policymakers as it introduced safety measures for teens on its main Instagram platform. The company has also worked with experts and other advisers on another product aimed at children, its Messenger Kids app, which launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Facing outrage over its handling of internal research on harm to teens from Instagram, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.

“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.

Facebook has removed more than 600,000 accounts on Instagram from June to August this year that didn’t meet the minimum age requirement of 13, Davis said.

Davis was summoned by the panel as scrutiny over how Facebook handles information that could indicate potential harm for some of its users, especially girls, while publicly downplaying the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.

For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook’s own researchers who alerted the social network giant’s executives to Instagram’s destructive potential.

Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.

“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram’s negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers’ posture toward social media generally, which splits Republicans and Democrats. Republicans accuse Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge’s ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Senators fired a barrage of criticism Thursday at a Facebook executive over the social-networking giant’s handling of internal research on how its Instagram photo-sharing platform can harm teens.

The lawmakers accused Facebook of concealing the negative findings about Instagram and demanded a commitment from the company to make changes.

During testimony before a Senate Commerce subcommittee, Antigone Davis, Facebook’s head of global safety, defended Instagram’s efforts to protect young people using its platform. She disputed the way a recent newspaper story describes what the research shows.

“We care deeply about the safety and security of the people on our platform,” Davis said. “We take the issue very seriously. ... We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”

Sen. Richard Blumenthal, D-Conn., the subcommittee chairman, wasn’t convinced.

“I don’t understand how you can deny that Instagram is exploiting young users for its own profit,” he told Davis.

The panel is examining how Facebook handled findings from its own researchers indicating potential harm to some of its young users, especially girls, even as the company publicly downplayed the negative impacts. For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research showed.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents.

Comparisons to the tobacco industry’s coverups of cigarettes’ harmful effects abounded in a session that united senators of both parties in criticism of the giant social network and Instagram, the photo-sharing juggernaut valued at around $100 billion that Facebook has owned since 2012.

Said Sen. Edward Markey, D-Mass.: “Instagram is that first childhood cigarette meant to get teens hooked early. Facebook is just like Big Tobacco, pushing a product they know is harmful to the health of young people.”

The episode is quickly burgeoning into a scandal for Facebook approaching the level of the Cambridge Analytica debacle. Revelations in 2018 that the data mining firm had gathered details on as many as 87 million Facebook users without their permission similarly led to a public-relations offensive by Facebook and congressional hearings.

“It’s abundantly clear that Facebook views the events of the last two weeks purely as a PR problem, and that the issues raised by the leaked research haven’t led to any soul-searching or commitment to change,” said Josh Golin, executive director of the children’s advocacy group Fairplay. The group, formerly known as the Campaign for a Commercial-Free Childhood, doesn’t take money from Facebook or other companies, unlike the nonprofits Facebook tends to bring in for expert advice on its products.

Facebook’s public response to the outcry over Instagram was to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

Pressed by senators, Davis wouldn’t say how long the pause would last. “I don’t have a specific date but I do have a commitment” that Facebook executives will consult with parents, policymakers and experts, she said. “We want to get this right.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal. An interview with the whistleblower is set to air on CBS’ “60 Minutes” program Sunday.

Davis, a one-time middle school teacher and aide in the Maryland attorney general’s office, insisted that the research on Instagram’s impact on young people “is not a bombshell.”

“This research is a bombshell,” Blumenthal countered. “It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children, and that it has concealed those facts and findings.”

__

The research documents released Wednesday by The Wall Street Journal: https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf

__

Ortutay reported from Oakland, California. Associated Press writer Amanda Seitz in Columbus, Ohio, contributed to this report.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Senators fired a barrage of criticism Thursday at a Facebook executive over the social-networking giant’s handling of internal research on how its Instagram photo-sharing platform can harm teens.

The lawmakers accused Facebook of concealing the negative findings about Instagram and demanded a commitment from the company to make changes.

During testimony before a Senate Commerce subcommittee, Antigone Davis, Facebook’s head of global safety, defended Instagram’s efforts to protect young people using its platform. She disputed the way a recent newspaper story describes what the research shows.

“We care deeply about the safety and security of the people on our platform,” Davis said. “We take the issue very seriously. ... We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”

Sen. Richard Blumenthal, D-Conn., the subcommittee chairman, wasn’t convinced.

“I don’t understand how you can deny that Instagram is exploiting young users for its own profit,” he told Davis.

The panel is examining Facebook’s use of information from its own researchers that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts. For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research showed.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents.

Comparisons to the tobacco industry’s coverups of cigarettes’ harmful effects abounded in a session that united senators of both parties in criticism of the giant social network and Instagram, the photo-sharing juggernaut valued at around $100 billion that Facebook has owned since 2012.

Said Sen. Edward Markey, D-Mass.: “Instagram is that first childhood cigarette meant to get teens hooked early. Facebook is just like Big Tobacco, pushing a product they know is harmful to the health of young people.”

The episode is quickly burgeoning into a scandal for Facebook approaching the level of the Cambridge Analytica debacle. Revelations in 2018 that the data mining firm had gathered details on as many as 87 million Facebook users without their permission similarly led to a public-relations offensive by Facebook and congressional hearings.

“It’s abundantly clear that Facebook views the events of the last two weeks purely as a PR problem, and that the issues raised by the leaked research haven’t led to any soul-searching or commitment to change,” said Josh Golin, executive director of the children’s online advertising group Fairplay. The group, formerly known as the Campaign for a Commercial-Free Childhood, doesn’t take money from Facebook or companies, unlike the nonprofits Facebook tends to bring in for expert advice on its products.

Facebook’s public response to the outcry over Instagram was to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

Pressed by senators, Davis wouldn’t say how long the pause would last. “I don’t have a specific date but I do have a commitment” that Facebook executives will consult with parents, policymakers and experts, she said. “We want to get this right.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal. An interview with the whistleblower is set to air on CBS’ “60 Minutes” program Sunday.

Davis, a one-time middle school teacher and aide in the Maryland attorney general’s office, insisted that the research on Instagram’s impact on young people “is not a bombshell.”

“This research is a bombshell,” Blumenthal countered. “It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children, and that it has concealed those facts and findings.”

__

The research documents released Wednesday by the Wall Street Journal: https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf

__

Ortutay reported from Oakland, California. Associated Press writer Amanda Seitz in Columbus, Ohio, contributed to this report.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Senators fired a barrage of criticism Thursday at a Facebook executive over the social-networking giant’s handling of internal research on how its Instagram photo-sharing platform can harm teens.

The lawmakers accused Facebook of concealing the negative findings about Instagram and demanded a commitment from the company to make changes.

During testimony before a Senate Commerce subcommittee, Antigone Davis, Facebook’s head of global safety, defended Instagram’s efforts to protect young people using its platform. She disputed the way a recent newspaper story describes what the research shows.

“We care deeply about the safety and security of the people on our platform,” Davis said. “We take the issue very seriously. ... We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”

Sen. Richard Blumenthal, D-Conn., the subcommittee chairman, wasn’t convinced.

“I don’t understand how you can deny that Instagram is exploiting young users for its own profit,” he told Davis.

The panel is examining Facebook’s use of information from its own researchers that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts. For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research showed.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents.

Comparisons to the tobacco industry’s coverups of cigarettes’ harmful effects abounded in a session that united senators of both parties in criticism of the giant social network and Instagram, the photo-sharing juggernaut valued at around $100 billion that Facebook has owned since 2012.

Said Sen. Edward Markey, D-Mass.: “Instagram is that first childhood cigarette meant to get teens hooked early. Facebook is just like Big Tobacco, pushing a product they know is harmful to the health of young people.”

The episode is quickly burgeoning into a scandal for Facebook approaching the level of the Cambridge Analytica debacle. Revelations in 2018 that the data mining firm had gathered details on as many as 87 million Facebook users without their permission similarly led to a public-relations offensive by Facebook and congressional hearings.

“It’s abundantly clear that Facebook views the events of the last two weeks purely as a PR problem, and that the issues raised by the leaked research haven’t led to any soul-searching or commitment to change,” said Josh Golin, executive director of the children’s online advertising group Fairplay. The group, formerly known as the Campaign for a Commercial-Free Childhood, doesn’t take money from Facebook or companies, unlike the nonprofits Facebook tends to bring in for expert advice on its products.

Facebook’s public response to the outcry over Instagram was to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

Pressed by senators, Davis wouldn’t say how long the pause would last. “I don’t have a specific date but I do have a commitment” that Facebook executives will consult with parents, policymakers and experts, she said. “We want to get this right.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal. An interview with the whistleblower is set to air on CBS’ “60 Minutes” program Sunday.

Davis, a one-time middle school teacher and aide in the Maryland attorney general’s office, insisted that the research on Instagram’s impact on young people “is not a bombshell.”

“This research is a bombshell,” Blumenthal countered. “It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children, and that it has concealed those facts and findings.”

__

The research documents released Wednesday by the Wall Street Journal: https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf

__

Ortutay reported from Oakland, California. Associated Press writer Amanda Seitz in Columbus, Ohio, contributed to this report.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Senators fired a barrage of criticism Thursday at a Facebook executive over the social-networking giant’s handling of internal research on how its Instagram photo-sharing platform can harm teens.

The lawmakers accused Facebook of concealing the negative findings about Instagram and demanded a commitment from the company to make changes.

During testimony before a Senate Commerce subcommittee, Antigone Davis, Facebook’s head of global safety, defended Instagram’s efforts to protect young people using its platform. She disputed the way a recent newspaper story describes what the research shows.

“We care deeply about the safety and security of the people on our platform,” Davis said. “We take the issue very seriously. ... We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”

Sen. Richard Blumenthal, D-Conn., the subcommittee chairman, wasn’t convinced.

“I don’t understand how you can deny that Instagram is exploiting young users for its own profit,” he told Davis.

The panel is examining Facebook’s use of information from its own researchers that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts. For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research showed.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents.

Comparisons to the tobacco industry’s coverups of cigarettes’ harmful effects abounded in a session that united senators of both parties in criticism of the giant social network and Instagram, the photo-sharing juggernaut valued at around $100 billion that Facebook has owned since 2012.

Said Sen. Edward Markey, D-Mass.: “Instagram is that first childhood cigarette meant to get teens hooked early. Facebook is just like Big Tobacco, pushing a product they know is harmful to the health of young people.”

The episode is quickly burgeoning into a scandal for Facebook approaching the level of the Cambridge Analytica debacle. Revelations in 2018 that the data mining firm had gathered details on as many as 87 million Facebook users without their permission similarly led to a public-relations offensive by Facebook and congressional hearings.

“It’s abundantly clear that Facebook views the events of the last two weeks purely as a PR problem, and that the issues raised by the leaked research haven’t led to any soul-searching or commitment to change,” said Josh Golin, executive director of the children’s online advertising group Fairplay. The group, formerly known as the Campaign for a Commercial-Free Childhood, doesn’t take money from Facebook or companies, unlike the nonprofits Facebook tends to bring in for expert advice on its products.

Facebook’s public response to the outcry over Instagram was to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

Pressed by senators, Davis wouldn’t say how long the pause would last. “I don’t have a specific date but I do have a commitment” that Facebook executives will consult with parents, policymakers and experts, she said. “We want to get this right.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal. An interview with the whistleblower is set to air on CBS’ “60 Minutes” program Sunday.

Davis, a one-time middle school teacher and aide in the Maryland attorney general’s office, insisted that the research on Instagram’s impact on young people “is not a bombshell.”

“This research is a bombshell,” Blumenthal countered. “It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children, and that it has concealed those facts and findings.”

__

The research documents released Wednesday by the Wall Street Journal: https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf

__

Ortutay reported from Oakland, California. Associated Press writer Amanda Seitz in Columbus, Ohio, contributed to this report.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

WASHINGTON (AP) — Senators fired a barrage of criticism Thursday at a Facebook executive over the social-networking giant’s handling of internal research on how its Instagram photo-sharing platform can harm teens.

The lawmakers accused Facebook of concealing the negative findings about Instagram and demanded a commitment from the company to make changes.

During testimony before a Senate Commerce subcommittee, Antigone Davis, Facebook’s head of global safety, defended Instagram’s efforts to protect young people using its platform. She disputed the way a recent newspaper story describes what the research shows.

“We care deeply about the safety and security of the people on our platform,” Davis said. “We take the issue very seriously. ... We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17.”

Sen. Richard Blumenthal, D-Conn., the subcommittee chairman, wasn’t convinced.

“I don’t understand how you can deny that Instagram is exploiting young users for its own profit,” he told Davis.

The panel is examining Facebook’s use of information from its own researchers that could indicate potential harm for some of its young users, especially girls, while it publicly downplayed the negative impacts. For some of the Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research showed.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents.

Comparisons to the tobacco industry’s coverups of cigarettes’ harmful effects abounded in a session that united senators of both parties in criticism of the giant social network and Instagram, the photo-sharing juggernaut valued at around $100 billion that Facebook has owned since 2012.

Said Sen. Edward Markey, D-Mass.: “Instagram is that first childhood cigarette meant to get teens hooked early. Facebook is just like Big Tobacco, pushing a product they know is harmful to the health of young people.”

The episode is quickly burgeoning into a scandal for Facebook approaching the level of the Cambridge Analytica debacle. Revelations in 2018 that the data mining firm had gathered details on as many as 87 million Facebook users without their permission similarly led to a public-relations offensive by Facebook and congressional hearings.

“It’s abundantly clear that Facebook views the events of the last two weeks purely as a PR problem, and that the issues raised by the leaked research haven’t led to any soul-searching or commitment to change,” said Josh Golin, executive director of the children’s online advertising group Fairplay. The group, formerly known as the Campaign for a Commercial-Free Childhood, doesn’t take money from Facebook or companies, unlike the nonprofits Facebook tends to bring in for expert advice on its products.

Facebook’s public response to the outcry over Instagram was to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out “to work with parents, experts and policymakers to demonstrate the value and need for this product.”

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

Pressed by senators, Davis wouldn’t say how long the pause would last. “I don’t have a specific date but I do have a commitment” that Facebook executives will consult with parents, policymakers and experts, she said. “We want to get this right.”

Blumenthal and Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal. An interview with the whistleblower is set to air on CBS’ “60 Minutes” program Sunday.

Davis, a one-time middle school teacher and aide in the Maryland attorney general’s office, insisted that the research on Instagram’s impact on young people “is not a bombshell.”

“This research is a bombshell,” Blumenthal countered. “It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children, and that it has concealed those facts and findings.”

__

The research documents released Wednesday by the Wall Street Journal: https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf

__

Ortutay reported from Oakland, California. Associated Press writer Amanda Seitz in Columbus, Ohio, contributed to this report.

__

Follow Marcy Gordon at https://twitter.com/mgordonap

__

This story has been corrected to show that Fairplay is a children’s online watchdog group, not a children’s online advertising group.