Book Review: ‘Broken Code,’ by Jeff Horwitz

The few inconsistencies are all the more conspicuous in a book that so frequently makes serious, sophisticated observations and arguments. Horwitz elegantly illuminates one reason Facebook’s top brass could be slow to recognize the dangerous influence of the platform’s network effects on the average user, who was more vulnerable to misinformation than the savvier, more educated executives whose own feeds showed them content created or shared by their savvy, educated friends. “Fires that broke out on Facebook’s lower floors had to become full-on infernos before its penthouse dwellers ever smelled smoke,” Horwitz writes.

It’s unclear whether Zuckerberg gave interviews specifically for this book (an endnote says only that Facebook “authorized background interviews with a range of its executives”), but he emerges through others’ impressions as a kind of remote, perpetually distracted emperor figure. In one memorable episode, a seasoned machine-learning specialist develops a novel algorithmic change that, if put into effect, will dampen the voices of hyperactive users, who are more likely to share misinformation. Zuckerberg listens for 10 minutes and orders, “Do it, but cut the weighting by 80 percent,” a directive obviously meant to render the initiative toothless.

The narrative becomes more focused once Haugen is properly introduced in the book’s final third. This section is the strongest, combining an account of Facebook’s key role in spreading post-election disinformation and polarization with the inside-baseball media drama of how publishing the “Facebook Files” articles played out for Horwitz, Haugen and other sources.

And, of course, for Facebook itself — soon to become Meta, in a rebrand announced the month after the “Facebook Files” series ran. In a disheartening but inevitable denouement, the company’s scramble to control the reputational damage it suffers from Horwitz’s reporting results in a “crackdown” on leaking and instructions for self-censorship. “The entirety of Facebook’s staff working on integrity and societal issues was now literally reporting to Marketing,” Horwitz writes.

In the book’s final pages, Horwitz examines an internal experiment in which Facebook’s own “frustrated data scientists” showed, in 2021, that the deep-seated assumption underpinning almost all the conflict in this book — that what’s good for integrity is bad for engagement — was mistaken all along. Watering down its integrity ranking weakened Facebook usage for only six months; after a year, “the integrity work began producing a modest but statistically significant usage gain.”