Summary of Broken Code

A Summary of Jeff Horwitz's book

Inside Facebook and the Fight to Expose Its Harmful Secrets

GP SUMMARY



Summary of Broken Code by Jeff Horwitz: Inside Facebook and the Fight to Expose Its Harmful Secrets

By GP SUMMARY. © 2023, GP SUMMARY.

All rights reserved.

Author: GP SUMMARY

Contact: GP.SUMMARY@gmail.com

Cover, illustration: GP SUMMARY

Editing, proofreading: GP SUMMARY

Other collaborators: GP SUMMARY

NOTE TO READERS


This is an unofficial summary & analysis of Jeff Horwitz’s “Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets” designed to enrich your reading experience.

 

DISCLAIMER


The contents of this summary are not intended to replace the original book; it is meant as a supplement to enhance the reader's understanding. The contents may not be stored electronically, transferred, or kept in a database. This document may not be copied, scanned, faxed, or retained, in part or in full, without the approval of the publisher or creator.


Limit of Liability


This eBook is licensed for your personal enjoyment only. This eBook may not be resold or given away to other people. If you are reading this book and did not purchase it, or it was not purchased for your use only, then please purchase your own copy. You agree to accept all risks of using the information presented inside this book.


Copyright 2023. All rights reserved.

1

Arturo Bejar returned to Facebook's Menlo Park campus in 2019, four years after leaving, with a sense that something had gotten stuck. He noticed things that seemed off, things that suggested the company didn't care about what its users experienced. Bejar had had a charmed tech career, spending over a decade in Yahoo's security division, where he was known as the "Chief Paranoid." Mark Zuckerberg hired him as a Facebook director of engineering in 2009.



Bejar's expertise was in security, but he embraced the idea that safeguarding Facebook's users meant more than just keeping out criminals. Early in his tenure, Facebook's chief operating officer asked him to get to the bottom of skyrocketing user reports of nudity. His team sampled the reports and found they were overwhelmingly false. Instead of telling users to cut it out, the team gave them the option to report that they didn't like a photo of themselves and to describe how it made them feel, then prompted them to share that sentiment privately with the friend who had posted it. Nudity reports dropped by roughly half.



Bejar created a team called Protect and Care, a testing ground for efforts to head off bad online experiences, promote civil interactions, and help users at risk of suicide. He left the company in 2015 only because he was in the middle of a divorce and wanted to spend more time with his kids.



When Bejar returned, it was to investigate the experience of young users on Instagram. He found that everyone at Facebook was as smart, friendly, and hardworking as before, even if no one still believed social media was pure upside. The company's headquarters remained one of the world's best working environments, and it was good to be back.



Bejar noticed that Facebook had revamped its reporting system six months earlier, redesigning it with the specific goal of reducing the number of completed user reports. The approach struck him as arrogant: users would begin reporting horrible things, only to realize that Facebook wasn't interested.



Bejar found that many Facebook employees had been asking similar questions about the company's handling of social media issues. This effort, known as integrity work, required not just engineers and data scientists but intelligence analysts, economists, and anthropologists. These tech workers faced not just external adversaries but also senior executives who believed Facebook usage was an absolute good.



As scrutiny of social media increased, Facebook had accumulated an ever-expanding staff devoted to studying and addressing social media's problems. These integrity staffers became the keepers of knowledge that the outside world didn't know existed and that their bosses refused to believe.



Facebook's integrity ranks came to include researchers with PhDs in data science, behavioral economics, and machine learning. The author, meanwhile, was covering Facebook for the Wall Street Journal. He wanted to investigate how Facebook was altering human interaction; his earlier political accountability reporting had come to feel pointless, and covering Facebook felt like a capitulation, an acknowledgment that the system of information sharing and consensus building was on its last legs. Even so, it was difficult to figure out the basics of Facebook's operations, such as how its News Feed algorithm ranked content or how it generated its "People You May Know" recommendations.



The author became familiar with Facebook's mechanics and found that its automated enforcement systems were incapable of performing as billed, and that the company knew far more about the negative effects of social media usage than it acknowledged. He tried to cultivate current employees as sources and obtained stray documents indicating that Facebook's powers, and its problems, were greater than it let on.



Then Frances Haugen, a mid-level product manager on Facebook's Civic Integrity team, responded to the author's LinkedIn messages. She believed that Facebook's platforms eroded faith in public health, favored authoritarian demagoguery, and treated users as exploitable resources, and she thought she might have to play a role in making these flaws public. Her decision to do so would eventually produce tens of thousands of pages of confidential documents showing the depth and breadth of the harm being done to everyone from teenage girls to victims of Mexican cartels.



Not every insider shared Haugen's exact diagnosis of what went wrong at Facebook or her prescription for fixing it, but her account squared with the written assessments of scores of employees who never spoke publicly. In the internal documents Haugen gathered, and in hundreds more provided to the author after her departure, staffers documented the demons of Facebook's design and drew up plans to restrain them.

2

Facebook's senior Public Policy and Elections staff gathered in the conference room of their old Washington, DC, office to understand what Donald Trump's upset victory meant for the company. Elliot Schrage, Facebook's head of Public Policy and Communications, was convinced that Facebook would end up as 2016's scapegoat. The election had brought a new rage to American politics, with racist dog whistles and crude taunting of opponents becoming a regular feature of mainstream news coverage. Facebook had already faced criticism for allegedly suppressing trending news stories with a right-wing bent, for Trump's use of the platform to attack Muslim and Mexican immigrants, and for the fabricated stories that made up much of the platform's most popular news.



Katie Harbath, the head of Facebook's Elections team and a Republican, had spent the past five years trying to prove that Facebook would transform politics. She had caught the politics bug after volunteering for a Republican Senate campaign in college, joined the Republican National Committee in 2008, and worked for the National Republican Senatorial Committee during the 2010 midterms.



Harbath bought a lot of Facebook advertising as part of her job at the NRSC and regularly consulted with Adam Conner, who had founded Facebook's DC office in 2007. By 2011, with another election around the corner, Conner decided it wasn't great having Republicans like Harbath discuss advertising strategy with a Democrat like himself, and that year Harbath joined the company's DC office as one of its first Republican hires.



When the 2012 election was over, Harbath's political team hadn't won, but her corporate one had. At a time when Facebook was looking to compete with Twitter by getting into news and politics, the Obama reelection campaign's prominent use of the platform had been good for Facebook's clout. Harbath became Facebook's global emissary to the political world, traveling more than half the year to meet with major political parties in countries such as India.



Facebook's mission was compelling, and Harbath's stock proceeds covered the purchase of a two-bedroom condo in Arlington, Virginia. Her elections work was successful enough that Facebook's Partnerships team tried to subsume it; only the intervention of Joel Kaplan, the head of Facebook's Public Policy team in Washington, kept it under Harbath. Facebook published research showing it could boost election turnout on a mass scale through messages directing users to state voter registration sign-ups and digital "I Voted" stickers. Harbath wanted Facebook to create dedicated political-organizing tools and channels for elected officials to interact with constituents before the next presidential election, and Zuckerberg suggested building a team devoted to civic engagement work. Harbath and her team sponsored and broadcast every big political event they could.



By the spring of 2016, however, Harbath started to feel something was off in online politics, particularly in the Philippines, where president-elect Rodrigo Duterte had ridden a combative and sometimes underhanded brand of politics to victory. Facebook received reports of mass fake accounts, bald-faced lies on campaign-controlled pages, and coordinated threats of violence against Duterte critics.



In June 2016, the UK's referendum to leave the European Union reinforced Facebook's place in politics, but for Harbath its role no longer felt like a feel-good one. Both winning campaigns had relied heavily on Facebook to push vitriol and lies. The success of Trump's campaign in the US was even more uncomfortable, as he used Facebook and Twitter to short-circuit traditional campaign coverage. Harbath broached the topic with Adam Mosseri, then Facebook's head of News Feed, but the company chose to punt when it came to lies on its platform. Facebook had signed on as a sponsor of both the Democratic and Republican conventions and threw big parties at each.



Harbath handled the Republican convention and was horrified by the speeches from Trump's oddball celebrity acolytes and by the chants of "Lock her up," referring to Trump's opponent, Hillary Clinton. Facebook offered the Trump campaign a dedicated staffer to help target Facebook ads, address technical problems, and liaise with company leadership. Harbath turned to James Barnes, a Republican friend who worked on the company's political ad sales team, and Barnes relocated to the San Antonio offices of Giles-Parscale, the web marketing firm running Trump's digital campaign.



On October 7, 2016, leaked Access Hollywood footage showed Donald Trump boasting about his unsuccessful attempts to sleep with a married woman. Barnes left the Giles-Parscale office and never returned, having concluded that further work with the campaign would be distasteful and pointless. He flew back to Washington, staying in only loose touch with the Trump people. Just days before the election, Bloomberg Businessweek published an article containing a boast from Trump's digital team that it was running voter suppression operations on Facebook; Barnes had no idea what they were talking about.



On the evening of the election, Barnes took the results especially hard, feeling incredibly guilty about his role. The next morning, Facebook's top lobbyist, Joel Kaplan, wanted a word with Barnes, telling him that Trump's election was neither his fault nor Facebook's. That was what most Facebook executives were telling themselves: their core self-conception was that, by building a platform and connecting people, Facebook was making the world a more knowledgeable and understanding place.



Facebook's largely liberal employee base had embraced this idea over the years, but now many weren't so much questioning whether Facebook had elected Trump as asking how his victory could be squared with the company's self-image. Journalists were asking the same questions, and Facebook got walloped in election analysis pieces within 24 hours of the vote being called.



Mark Zuckerberg was angry at the implication that Facebook might have thrown the election. He believed math was on Facebook's side: misinformation accounted for just a fraction of the news viewed on Facebook, and news itself was only a fraction of what users saw. He publicly declared the possibility that the platform had swung the election "a crazy idea."



Facebook faced a significant challenge in the aftermath of the 2016 US presidential election, as Trump's campaign sought to credit the platform for its victory. The company had no data on how fake news came into existence, how it spread across the platform, or whether the Trump campaign had used it in its Facebook ads. This led to questions about the company's responsibility to prevent fake news from spreading.



An analysis by BuzzFeed News reporter Craig Silverman showed that fake news had been the most viral election-related content on Facebook during the final months of the campaign. A story falsely claiming the pope had endorsed Trump had gotten over 900,000 likes, reshares, and comments, more engagement than even the most widely shared stories from CNN, the New York Times, or the Washington Post. Interest in the term "fake news" spiked on Google the day Silverman's analysis was published, and it stayed high for years.



The finding rattled Facebook executives, who had been saying that falsehoods on both sides would naturally cancel each other out. Addressing these questions would require Facebook to study and alter its platform in ways it never had before. With Zuckerberg's approval, a team of News Feed staffers was dispatched to quantify the problem of fake news and come up with potential solutions. For privacy, the team packed up their desks and relocated to Building 20, the world's largest open-plan office, designed by Frank Gehry and completed just the year before.

3

Chris Cox, Facebook's chief product officer, was deeply concerned about the platform's potential to change the outcome of the election and undermine democracy. After the election, an engineering executive from another company offered Cox his services, stating that Facebook had clearly screwed up and that the platform was a threat to healthy public discourse. Cox had long been concerned with the platform's societal effects and its potential role in stoking divisions. Facebook began to think more seriously about what it allowed on its platform, establishing basic defenses against things like spam and bulk data theft.



Facebook's path to becoming the world's largest social media company was paved with ambition and justifiable paranoia. CEO Mark Zuckerberg viewed any growing platform that allowed people to message, share, or broadcast content as an existential threat. He declared a "lockdown" at the first sign that other social media-like products might be catching on at college campuses.



Network effects made social media a winner-take-all game, with rival platforms seen as both a threat to Facebook and a hindrance to the free flow of information. Facebook explicitly boasted that it wanted to build features that could accommodate every part of offline life. Zuckerberg had internalized the idea that there were only a finite number of different social mechanics to invent, and that the company would have to scramble to either copy or acquire each new one.



The company's motto, "Move Fast and Break Things," reflected that urgency, the constant need to worry about what was coming up in the rearview mirror.



Facebook's rapid rollout of new features meant code was often rough around the edges, and things did break. Engineers applied the company wisdom that "Done Is Better Than Perfect" and built new systems to minimize the damage of sloppy work. Performance reviews were based on the number of features employees "shipped," with bonuses and promotions doled out accordingly. Engineers and data scientists lived with perpetual uncertainty about where user data was collected and stored. The constant stream of emergencies and shoddily built features led to dark jokes that Facebook was the world's oldest startup, with sloppiness its culture's most enduring feature. In 2021, Mike Schroepfer, then chief technology officer, asked the company's engineers what their greatest frustration was.

Imprint

Publisher: BookRix GmbH & Co. KG

Publication Date: 12-04-2023
ISBN: 978-3-7554-6273-6

All Rights Reserved

Dedication:
Broken Code is a comprehensive analysis of Facebook's strategic failures in addressing its role in disinformation, political fracturing, and genocide. It reveals the company's manipulation tactics and how the platform distorted online connection, exposing deeper issues like human trafficking and drug cartels. The book emphasizes that social media's problems cannot be resolved by simply adjusting the platform.
