Social media executives will answer directly to Congress for their role in the January attack on the US Capitol. This week, Facebook's Mark Zuckerberg, Twitter's Jack Dorsey and Google's Sundar Pichai will appear before a joint hearing on Thursday at 12:00 PM Eastern Time. The hearing, organized by the House Energy and Commerce Committee's Communications and Technology Subcommittee and Consumer Protection and Commerce Subcommittee, will focus on social media's role in spreading disinformation, extremism and misinformation. The Energy and Commerce Committee previously held a parallel hearing examining the role of traditional media in promoting the same social ills.
Earlier this month, Energy and Commerce Chairman Frank Pallone Jr., along with more than 20 other Democrats, sent a letter to Zuckerberg pressing the Facebook CEO to explain why ads for tactical gear had appeared alongside posts promoting the Capitol insurrection. "Targeting advertisements in this way is dangerous and likely to induce acts of violence," the letter's authors wrote. In late January, Facebook said it would temporarily stop showing ads for weapon accessories and related equipment. While the subcommittees have signaled an interest in Facebook's advertising practices, organic content on the platform has historically presented the bigger problem.
In the uncertain period following last year's election, the pro-Trump "Stop the Steal" movement grew significantly on social media, particularly in Facebook groups. That same movement, fueled by political misinformation, drove the rioters who stormed the Capitol to block the vote count in a deadly attack on January 6. The hearing will likely delve into extremist groups organized through Facebook groups as well; the chairs of both subcommittees, who will question the tech CEOs this week, previously raised questions to Facebook regarding reports that the company knew its algorithmic group recommendations were a pathway to radicalizing users. Despite the warnings from experts,
Facebook continued to allow armed anti-government militias to organize publicly on the platform until late 2020, and despite an eventual ban, some still do. The Justice Department is reportedly considering sedition charges against members of the Oath Keepers, one of the prominent U.S. armed militia groups involved in the attack on the Capitol. Facebook has played a huge role in incubating extreme content and delivering it to the mainstream, but it isn't alone. Misinformation undermining the integrity of the US election results remains generally easy to find on YouTube and Twitter, although those social networks aren't designed to connect and mobilize people in the same way Facebook groups do.
Facebook was slow to revise its rules on extremism through 2020, moving quickly only this January, when the company banned former President Trump from the platform. Facebook's external Oversight Board is still reviewing that decision and could reverse it in the coming weeks. Over the past year, Twitter has opened up its own policy decisions, communicating changes transparently and soliciting input on ideas under consideration. Under Dorsey's guidance, the company treats its platform rules as a living document, tinkering in an effort to better shape user behavior. If Twitter's recent policy decisions amount to thinking out loud, YouTube takes the opposite approach. The company wasn't proactive in its defenses ahead of the 2020 election and rarely responds to events in real time, opting instead to wait them out.