Stuffing Ourselves

There is a delivery service that employs around half a million people in the United States. It delivers billions of packages per year to tens of millions of addresses. It has hundreds of warehouses and sorting facilities, and its packages travel on thousands of identical trucks and vans. The delivery service has the extraordinary ability to get a box from one side of the country to another in just a few days. Some politicians call it a monopoly, but it nevertheless enjoys high approval ratings among most of the American public. The delivery service is one of the most essential features of modern life, and indeed it makes modern life possible. In the year 2020, we relied on it more than ever.

I am talking, of course, about the United States Postal Service, but you would be forgiven for assuming I was describing Amazon. It would be an overstatement to say that the two entities do the same thing, but it is quite reasonable to say that they do many of the same things. Most notably, they both ship stuff from one place to another. Despite what you may have heard to the contrary, the world has not transcended stuff. It still runs on stuff, and stuff, to use a voguish word, is non-fungible. It cannot be changed or transmogrified. It can, however, be moved around.

Amazon and the Postal Service are often said to have some sort of “relationship” with one another, but there is much disagreement about how to characterize their bond. Those who defend public services argue that the USPS is good and Amazon is bad, while champions of private business respond that the USPS is bad and Amazon is good. The former argue that the USPS is in thrall to Amazon, which provides the service with a huge amount of revenue; the latter claim that Amazon is in thrall to the Postal Service’s monopolistic delivery network. The dynamic is symbiotic, surely, but symbiosis comes in flavors. It could be parasitic. It could be mutualistic. It depends.

Essentially, Amazon is a website where people order goods. When you order stuff, Amazon has to get the stuff to your house. The company accomplishes this by loading the stuff on trucks and airplanes and shipping it to a huge warehouse near your neighborhood. Once the stuff gets to the warehouse, someone sorts through it. Someone else, perhaps several someones, is then paid to bring the stuff from the warehouse to your house. Sometimes those people are employed by a private company like UPS or FedEx, but just as often they are government employees who work for the United States Postal Service. What distinguishes the Postal Service from these other carriers, and what makes it useful for Amazon, is that it is required by law to deliver to every address in the United States. 

The growing resemblance between the two entities can sometimes foster confusion in the customers they serve. You order an item from Amazon without checking whether it will be FedEx, UPS, the USPS, or Amazon itself that ferries the package to you. Without looking at your order history, do you feel that you could catalog with confidence which delivery services have dropped which items on your porch? Could you estimate offhand whether there are more USPS trucks or Amazon vans delivering packages in your neighborhood? 

The functional overlap between Amazon and the USPS belies critical differences in how the two entities treat their workers. In recent years, the allegation that Amazon effectively requires its workers to piss in water bottles has taken on an almost talismanic centrality in any discussion of the company’s labor practices: in 2018, a former Amazon warehouse worker went viral for tweeting that when he had left the company in 2015, “there were bottles & bins full of piss at odd points, bc you were penalised for toilet breaks.” The anecdote was picked up by Rep. Alexandria Ocasio-Cortez, who asked in 2019 “why Amazon workers have to urinate in bottles & work while on food stamps.” This past March, when Rep. Mark Pocan tweeted, “Paying workers $15/hr doesn’t make you a ‘progressive workplace’ when you union-bust & make workers urinate in water bottles,” the Amazon PR team swiftly rejected the charge, only to apologize and backtrack, admitting that its initial denial had been “incorrect.” 

Amazon’s inhumane treatment of its workers is not incidental — it’s an essential feature of how the company operates. Earlier this year, a group of workers at a company warehouse in Bessemer, Alabama filed to hold a unionization vote. As the election approached, Amazon deployed a bevy of union-busting strategies that were so desperate and creative as to seem almost personal. The company offered buyouts to current employees, blanketed the warehouse with anti-union propaganda, and — in a manner reminiscent of The Italian Job — even installed a USPS mailbox on the warehouse grounds in an apparent attempt to make workers think their bosses were monitoring the mail. The warehouse workers voted against the union by a margin of more than two to one. “I work hard for my money, and I don’t want any of it going to a union that maybe can get us more pay, or maybe can get us longer breaks,” one worker explained to a reporter.

A longshot union drive at a single site in Alabama did not constitute a material threat to the profits of a $1.8 trillion global conglomerate, but it did represent a kind of ideological threat to the Amazon way. Embedded in the genetics of the company is a thirst for what in the world of logistics are called “efficiencies” — minuscule operational tweaks that can save a marginal amount of labor, resources, time, or money somewhere in the supply chain. Amazon routes its packages through a massive and sui generis network of sorting facilities to avoid paying contractors and middlemen. It uses mountains of data to prestock and presort items. It tracks successful items sold by third-party vendors, and clones those items so it can sell them through its own private-label brands. Together these efficiencies are more than the sum of their parts — in other words, they have allowed Amazon to obtain a world-swallowing competitive edge through the marginal reduction of operational outlays. If workers suffer, pass out, and die as a result of those reductions, that is, in a literal sense, the cost of doing business.

The most obvious result of Amazon’s obsessive focus on cost efficiencies is that the company became really, really, really big. Amazon employs more people than live in Raleigh, Oakland, or Miami, and the combined area of its real estate has grown more than fiftyfold over the past twenty years to at least 300 million square feet. It accounts for more than half of all digital retail sales in the United States and around ten percent of all retail sales, period. The towering corporation has cast a long material shadow through the country, and earlier this year a new book from the political journalist Alec MacGillis offered the fullest portrait of its influence to date. Fulfillment: Winning and Losing in One-Click America attempts to trace what economists might call the “negative externalities” of Amazon’s power. MacGillis follows black families priced out of gentrifying Seattle, homeless workers in Dayton riding the city’s corrugated cardboard boom, and suburban neighbors in Virginia trying to prevent the construction of an enormous Amazon data center. One of the most shocking chapters follows the company’s almost colonial invasion of the city of El Paso, Texas, where Amazon killed off local office supply businesses by bribing and bullying third-party competitors. 

MacGillis links Amazon’s rise not just to the obliteration of individual families and towns, but also to a growing divide “between a handful of winner-take-all metropolises and a large number of left-behind rivals,” a dynamic that is “making large parts of the country incomprehensible to one another.” Indeed, Amazon has become so massive and so powerful as to serve as a kind of synecdoche for the modern economy. Its gravitational pull feels less like that of an individual firm than that of an entire market. First it took on independent bookstores, then retailers like Staples and Bed Bath & Beyond, then private carriers like FedEx and UPS — and now, through the Postal Service, the government itself. 

MacGillis argues rightly that the company only got so big because it cut corners and exploited workers, but there is something else going on here, something so omnipresent that it’s hard to see. It’s the reason package volumes at FedEx, UPS, and the USPS have grown even as Amazon has eaten up more and more of the delivery market, the reason cargo freighters are getting larger and delivery trucks of all kinds are becoming more plentiful. Amazon, “the everything store,” is selling something that many people seem to want — stuff, all kinds of stuff, delivered fast and at low cost. Everyone buys stuff, and for many years they didn’t seem to care all that much about who got it to them, or how they got it to them, or who got hurt in the process.

 

“I have stated my concerns with Amazon long before the Election,” then-President Donald Trump tweeted in March 2018. “Unlike others, they pay little or no taxes to state & local governments, use our Postal System as their Delivery Boy (causing tremendous loss to the U.S.), and are putting many thousands of retailers out of business!” Trump’s specific quarrel was with the shipping agreement between the USPS and Amazon, which gave the online retailer a bulk discount on the millions of packages it shipped through the Postal Service. Amazon relied on the Postal Service for around 40 percent of all parcel deliveries that year, according to one estimate, and the Postal Service was trying to retain the company’s business with lower rates.

That same year, my father began working for the Postal Service as a mail carrier in a non-union position at an understaffed station in Virginia. He soon found that most of his job was delivering packages for Amazon. As the company’s business boomed in the mid-2010s, it inundated the Postal Service with so many packages that its mail sorters and carriers could not keep up. An average mail carrier like my father often found himself forced to lift anywhere from dozens to hundreds of heavy packages every day, sometimes returning to the station for a second delivery run if the day’s deliveries could not fit in a single truck; during the holiday season, management worked employees for 70 or 80 hours a week. The biggest client was Amazon, but it wasn’t the only source of the deluge. The Postal Service could not afford to hire additional workers to match the surge in packages — its labor force was shrinking every year — which meant fewer employees than ever were doing more work than before. 

Things were not always so bleak. For much of the twentieth century, the Postal Service was a veritable bastion of labor power, the mirror image of a union-busting conglomerate like Amazon. Postal workers were among the first federal government employees to unionize, tangling with Presidents Roosevelt and Wilson for collective bargaining rights. As the scholar and former postal worker Philip Rubio recounts in his 2010 book, There’s Always Work at the Post Office, the mid-1900s saw a long series of postal strikes and sick-outs to protest unsafe working conditions and high mail volumes. These actions secured wage and overtime concessions, which made the Postal Service an attractive employer for many black workers who found themselves unable to get jobs in the private sector. 

The growth of labor power at the service culminated in the Great Postal Strike of 1970, the largest wildcat strike in U.S. history. The New York-based action halted post office functions across the country and led President Richard Nixon to deploy more than 18,000 military service members to New York. The soldiers tried to deliver the mail themselves, but they couldn’t figure it out. Afterward, Congress reconstituted the Postal Service as an independent agency outside the Cabinet and extended collective bargaining rights to the entire workforce, which led to further gains for hundreds of thousands of postal workers. When Rubio started work at a Postal Service station in North Carolina in the 1980s, one of his coworkers told him that once he became a full-time employee, “they can’t get you out of here with a crowbar!” 

That was no longer the case by 2018. Even before the onslaught of Amazon packages, the Postal Service was on thin ice. Its struggles began in the early 2000s, when a Republican-controlled Congress mandated that the then-healthy agency pre-fund decades of retiree health benefits, which cost the service billions of dollars a year. Then the one-two punch of the internet and the Great Recession caused a precipitous drop in mail volume, as companies stopped sending out as many hard-copy advertisements and magazines. The resultant decline in revenue made it difficult, if not impossible, for the Postal Service to sustain its retirement payments; soon officials stopped making them altogether. The service entered a kind of austerity limbo, trimming costs wherever it could even as it pleaded with Congress for relief from a debt it could not pay.

With the USPS in the red, the protections its unions had obtained eroded over a period of close to two decades: more than twenty percent of USPS workers are now temporary “non-career” employees without full benefits, up from thirteen percent in 2000. Hundreds of postal stations around the country face endemic staffing shortages; the most severe cases are often found in the rural areas where people rely on the Postal Service not only for routine deliveries but also for welfare checks, access to financial services, and letters from loved ones behind bars. Amazon has further strained this labor situation, forcing workers like my father to heave dozens more packages per day than they might have done twenty years earlier. 

If Amazon and the Postal Service have a symbiotic relationship, there can be little doubt which occupies the dominant role. Amazon is one of the world’s most profitable companies; the Postal Service is a struggling public agency that is billions of dollars in debt. Amazon created half a million new jobs last year and also leased twelve new delivery jets, bringing its air fleet to 80 planes; the Postal Service has cut down its workforce by a quarter over the past decade and maintains an aging fleet of rusted mail trucks that often catch fire. Almost half of all people in the United States have subscriptions to Amazon Prime; USPS letter volumes have steadily declined since an early-2000s peak, dropping from 103 billion mailpieces to around 50 billion in twenty years. 

By the time Trump began complaining about Amazon, the company was already working behind the scenes to free itself of the Postal Service. That year, Amazon hired several thousand of its own non-union delivery drivers and contracted millions of deliveries out to at least two other low-cost couriers. By late 2019, when its skeleton workforce had reached a suitable scale, the company began to redirect many urban package deliveries away from the unionized Postal Service and toward these contractors, with whom it could command even lower prices and more flexible labor standards. The Postal Service was already giving Amazon a discount, yes, but its workforce was unionized and labor costs were high, and the demand for stuff was greater than ever.

 

This was the sorry state of affairs in the spring of 2020, at the start of the coronavirus pandemic, which dealt a body blow to the already beleaguered Postal Service: viral infections raged through the agency’s workforce, causing thousands of workers to miss their shifts even as a locked-down population ordered more and more packages over the internet. Two months into the lockdown, the Trump administration tapped GOP fundraiser Louis DeJoy for the position of Postmaster General. Almost at once, the previously unknown logistics executive announced a series of operational changes that he claimed would reduce excess overtime hours and shave labor costs. These tweaks further slowed the already halting pace of delivery, with the result that mail in places like Ohio and Michigan arrived days or weeks late.

These twin crises were only the latest chapters in a long saga of austerity, but the terror of the pandemic and DeJoy’s second-degree connections to Trump drew an unprecedented amount of attention to the plight of the agency. They also made the Postal Service a new object of political sympathy. The pandemic meant millions more mail-in ballots than in years past, and many liberals interpreted DeJoy’s meddling as an attempt to rig the election for Trump; meanwhile, an upsurge in appreciation for essential workers drew attention to the “boys in blue.” Memes and infographics spread across social media, valorizing the Postal Service first as a kind of Atlas figure for American democracy, and later as a sick patient appealing to GoFundMe to finance life-saving surgery. We all needed to chip in, setting out snacks for postal workers or helping the USPS by buying stamps and merchandise like the dog mail carrier costume that sold for $17.99. None of this did anything to help postal workers get the mail out: the Postal Service’s debt load stems from misguided Congressional management, not a decline in stamp sales. 

The same belief in consumer activism has guided efforts to boycott Amazon — most recently around the Bessemer union vote — and, notably, the union itself disavowed the boycott. But the idea that everyday consumers have the power to intervene in the struggle between Amazon and the Postal Service is misguided because the structure of the delivery industry is dictated by something grander than individual demand. We cannot hurt Amazon by boycotting it for the same reason that we cannot improve factory conditions in China by refusing to buy Nike shoes: the macroscopic muscle of global demand will always trump the marginal sales hit that might result from “voting with one’s wallet.”

Despite the gulf between them, both Amazon and the Postal Service function as part of a much larger and much more insidious global supply chain. We can imagine the entities’ two respective workforces as different sectors of the same organization, an organization responsible for bringing everything, everywhere, to everyone. They are all “Delivery Boys,” as Trump put it, running errands for a gluttonous trade ecosystem that funnels billions of manufactured items from the third world to the first. That chain encompasses overseas shipping, transnational freight, package distribution, and parcel drop-off, all of which we can categorize under the heading of “logistics.”

The global trade system seems hegemonic and all-encompassing, but in fact it is only a few decades old. Not so long ago, a company created an item in a factory and shipped it to a store; a consumer then drove up to the store, bought the item, and took it home. It was prohibitively expensive and time-consuming to ship an item all the way around the world, so a great many goods originated in the same general area where they were sold. There was no such thing as “direct-to-consumer” commerce because individual consumers would not have been willing to pay the high prices such product deliveries would have required.

Seventy years later, none of those things are true. Leapfrog advances in marine shipping capacity led to the rise of “containerization,” the practice of shipping manufactured goods in metal containers that can be transferred easily from freighters to trains to trucks; the practice eliminated the need to sort through arriving product shipments and drastically reduced the cost of international shipping, which caused a surge in maritime traffic. Meanwhile, the formation of the World Trade Organization, the subsequent accession of China to the same, and the inking of NAFTA all incentivized American companies to offshore their manufacturing, recouping the cost of global shipping with the money they saved on labor. Even as companies outsourced the manufacture of their products from the United States, they insourced the delivery of their products through the creation of proprietary “distribution centers” where goods arriving from overseas could be sorted and shipped. The advent of the personal computer and the modern internet allowed consumers to bypass the physical store and cut in on this delicate logistical dance, opening the floodgates for what we now call e-commerce. 

By now there can be no doubt that we live in a world that has been molded by logistics: a central premise of life in a developed nation like the United States is that you can order almost anything you need on the internet, whenever you need it, at low cost. More than 800 million shipping containers left a port in the year 2019, along with more than 103 billion parcels, well over a dozen for every person living on earth. By what seems to be an iron law of psychology, the infinite availability of goods has fostered even greater demand. Many people can no longer imagine living without a surfeit of packages that would once have been impossible to deliver. 

It is difficult to overstate our present reliance on the logistics industry. Around 80 percent of all retail goods travel by sea, and most of those travel in shipping containers. The number of warehouses in the United States has grown by almost 50 percent over the last decade, and the number of large warehouses has grown even faster. This year alone the United States added almost 250 million square feet of occupied warehouse space, almost one square foot for every adult in the country. E-commerce transactions now account for around ten percent of all retail sales, having more than doubled in size over the last ten years; that might not sound like a lot, but consider that another 40 percent of all sales consist of food, cars, and gasoline, all of which are tricky to send through the mail. This “industry” of delivery and storage is so entrenched that it’s hard to even think about opting out of it. Indeed, because Amazon has killed off so many local businesses, many people in rural areas now rely on e-commerce to deliver them items they can no longer go out and buy. The movement of stuff has become part and parcel, if you’ll excuse the pun, of ordinary life. 

None of this changes the fact that the present delivery regime is not a net good or even a neutral feature of the modern economy. For one thing, it is bad for the climate. The maritime shipping industry generates about three percent of all global emissions; diesel trucks account for close to another ten percent; and single-use plastic, millions of tons of which are used in parcel shipments every year, is made from either petroleum or natural gas. 

Beside its environmental costs, the system also depends on the exploitation of myriad workers, millions of them here in the United States and untold millions more overseas. Despite recent advances in shipping capacity, moving stuff around the world is still very expensive, and the companies involved in this global trade need to protect their bottom lines. Even before the advent of e-commerce, protecting the bottom line almost always meant paying manufacturing and textile workers as little as humanly possible, and conditions are scarcely better today than they were when the WTO came into existence. The average factory worker in China still makes less than three dollars an hour, while the minimum wage for a worker in Cambodia is still below $200 a month; these workers are the backbone of the clothing and electronics industries, which have been fertile growth sectors for online retailers. The rise of containerization resulted in the laying off of thousands of longshoremen who had once sorted through arriving shipments. The creation of more than 600,000 new warehouse jobs over the past decade has more than made up the difference, but it also occasioned an enormous shift from union to non-union labor. One in every hundred American workers is employed by Amazon, UPS, FedEx, or the Postal Service; meanwhile, Amazon and its Chinese counterpart Alibaba together employ more than a million people outside the U.S., a significant portion of whom work on a temporary basis. The overarching telos of this global exploitation, of course, is the sale and delivery of goods to first-world consumers.

That puts consumers in a morally compromised position. For as long as the current arrangement stays in place, manufacturing and delivery workers the world over will remain yoked together in poverty so that people in the first world can have unfettered access to whatever stuff they want. That system is so comprehensive that individuals cannot work against it by directing their money elsewhere. Although demand for stuff is the alpha and omega of the logistics economy, we cannot dismantle that economy by demanding less stuff one person at a time. 

If we wanted to change the way the logistics economy functions, we would have to attack the strange and vulnerable terrain of logistics itself — not the goods, but the network that moves them. What matters is not so much where we spend our money on stuff or even whether we spend our money on stuff, but how we undermine the total global dominance of a system premised on exploitation and pollution. In order to do so, we would need to engage in a political action that transcends the consumer logic of a boycott.

I had a friend in college who used to fantasize about a particular kind of worldwide political action: organizing workers to shut down one of the logistics chokepoints through which almost all shipping containers and parcels must pass on their way to your house and mine. We lived in Chicago at the time, less than 50 miles away from Elwood, Illinois, the site of an intermodal freight terminal that serves as a central hub for mega-companies like Amazon and Walmart. Thousands of shipping containers arrive by train every day at the so-called “inland port” and are stored for a short time in the hundreds of warehouses nearby. Workers in these facilities sort the stuff, load it onto trucks, and spirit goods away to retail stores for companies like Home Depot and Target, but also Amazon fulfillment centers, and then post offices, and then doorsteps. From above the complex resembles a vast parking lot, except instead of cars there are shipping containers full of stuff.

The enormity of a place like this, my friend liked to argue, was also its vulnerability — the bigger they are, the harder they fall. If a few hundred people managed to block the exits to that terminal, they could hold the entire national supply chain hostage, damming the flow of both e-commerce and traditional retail. Packages would cease to show up at post offices and warehouses, and then fail to show up at people’s doors. Store shelves would start to empty; prices would rise; industries would spasm.

If this sounds implausible, recall for a second the grounding of the Ever Given in the Suez Canal, which stopped transcontinental trade for days on end, precipitated price hikes, and delayed shipments for months afterward. The past year has provided myriad examples of how a snarl in the supply chain can have drastic consequences. Virus outbreaks at poultry plants, for instance, caused sudden chicken wing shortages. A brief ransomware hack of an oil pipeline sent gas prices soaring across the southeast United States. The shortage in semiconductors, meanwhile, has jacked up the price of new cars.

It’s not hard to imagine bringing about a crisis like the Ever Given grounding on purpose. All you would have to do is surround a freight terminal, jam an interstate, block the exits to a warehouse, and you would hold the reins of the modern economy. The supply shock that resulted from such a disruption would be a far more effective political cudgel than any attempt at crowdfunding or boycotting, any attempt to vote with one’s wallet. The effect of an action like the one I am describing would be the same as that of the 1970 postal wildcat strike or, to choose a more contemporary example, the oil pipeline destruction advocated by the climate saboteur Andreas Malm. By restricting the flow of stuff, we reveal the extent of our reliance on it. By revealing that reliance, we also reveal our reliance on the exploitation that enables its existence. By revealing our reliance on exploitation, we invite consideration of alternatives — and, just maybe, we force consumers to think outside the cardboard box, to step outside the economic structure that hurls iPhones and yoga mats at them from the far reaches of the world. 

Of course such an action would not cause the existing structure to collapse. It would screw over many people and inconvenience a great many more; it would be branded as terrorism, decried by most of the political spectrum, and responded to with savage force. But it would also represent a preliminary step in the long process of building a fairer and more sustainable world. To engage in such an action would be to take to its logical conclusion an essential premise of the modern world: stuff stops moving not when we close our wallets and our web browsers, but when we throw a spanner in the works.

Truth and Consequences

The best way to begin is with a gun. We don’t need to see it, but we need to know it’s there. Fortunately, it doesn’t take much to convince an American that one is nearby. Start with murky fragments of a city skyline, lights piercing the falling night. Then a disembodied voice speaks as if the story has already begun, providing basic exposition: Dallas, October, Thursday night. A blue-eyed man, hair combed over the front of his head, says, “It’s as if I was meant to be here.” His head turns slightly; he purses his lips the way one does when not quite finished speaking, but we move on. (All editors know that when an interview subject gives you a cliffhanger, you take it, but if he doesn’t, you can make one yourself by cutting a little tighter than you would have liked.) To generate an artificial pause, the film cuts to a red emergency light. The police, we now know, are involved. Already we’ve accounted for at least one gun. A new man in an orange prison jumpsuit appears. A gun and a crime, then. He tells us that he took a pistol and a shotgun, and we cut away from his face to an artist’s rendering of a revolver, spinning slightly in space. When we return to the imprisoned man, he continues narrating how he broke into a neighbor’s home and stole a car. Little has been given to us in the way of story, but much in the way of dread. One evening in Dallas, a man who is now an inmate was involved with the police and a firearm. The viewer is already racing ahead.

The opening of Errol Morris’s 1988 film The Thin Blue Line is a masterpiece, avant-garde in the old sense of the term. It blazed a trail that much of documentary filmmaking has since followed: the brooding repetitious strings of Philip Glass’s score, the unraveling of a crime, a miscarriage of justice, interviews interwoven with reenactments. Eventually we learn that Robert Wood, a Dallas police officer, was shot during a roadside vehicle stop, and that the state’s investigation had resulted in the wrongful conviction of Randall Adams. Over the course of Morris’s own investigation, he not only uncovered the wrongful charge, but extracted a confession from the murderer. When the film was snubbed for an Oscar nomination, fellow filmmakers protested, particularly when it came out that the screening committee had not even finished watching the movie. (Michael Apted, director of the acclaimed series Up, called it “one of the most outrageous things in the modern history of the Academy.”) Even better for a film’s legacy than being lauded in its time is being thought of as insufficiently rewarded. 

Three decades later, a large segment of popular prestige nonfiction, not to mention the “trashier” fare, is effectively sketches after Morris. True crime has roots that extend beyond the advent of motion pictures, but Morris elevated it, just as Truman Capote once did with In Cold Blood. It wasn’t quite journalism, but Morris’s investigation did help free someone who had nearly been executed for a crime he did not commit.  

With The Thin Blue Line, documentarians, the perpetual do-gooders of the film world, were gifted a reliable formula: pulp fiction with a hint of social justice. True crime has by now so outpaced every other documentary genre that, last year, Morris tweeted an apology: “I’m sorry for ‘The Thin Blue Line.’ You solve a murder mystery and then people think that’s all documentary should do.” 

Over the last three decades, true crime standouts have laid track for the arrival of a broader documentary surge. In the past, the genre had a ceiling of popularity and profitability. Michael Moore occupied a league of his own; Fahrenheit 9/11 remains the only documentary to have topped $100 million in domestic box office sales, a feat accomplished just shy of 800 times by fiction features. In terms of market share, the entire documentary field typically hovers under two percent of the industry’s total yearly gross. Still, in 2018, the nonfiction films Free Solo, RBG, Three Identical Strangers, and Won’t You Be My Neighbor? each grossed over $10 million. In historical terms these were massive hits, especially considering that none of them were nature documentaries. That they all came out during the same summer indicated that audiences might have, at long last, begun to turn towards documentaries.

At the following year’s Sundance Film Festival, Netflix decided to test that theory. The company paid $10 million for Knock Down the House, Rachel Lears’s film about four women running progressive campaigns in Democratic primaries. Distributors told Deadline that, according to fuzzy industry math, a purchase price that high would necessitate around $75 million in global grosses. Netflix, famously, does not play the box office game. The company was banking on its subscribers’ growing demand for documentaries. They made a good bet. Parrot Analytics, a media data company, reported that from 2019 to 2020 demand for documentaries outpaced every other genre on streaming services. Parrot also estimated that the number of available documentary series had increased by 63 percent over a two-year period, while viewership was up by 142 percent. Boom times are here.

Documentaries are too diverse to grasp at a glance, but certain patterns are developing as producers and directors chase the profits and large audiences that seemed nearly impossible only a short time ago. Now, the genre’s explosion in popularity threatens to limit popular understanding to a select few dominant forms. Outside of nature and music documentaries, which have long been successful, the tabloid is in vogue. Many of the most popular documentaries follow close on the heels of splashy events — cult investigations, suspicious murders, scams and grifts, celebrities rising and falling. These stories are often strung together with talking-head interviews whose subjects are narrators and analysts, telling us what they know, what it means, and how to feel.

As the critic Noel Murray has noted, the speed at which current events now become documentaries recalls “the network ‘special reports’ of old, when ABC, NBC, or CBS would turn over an hour of prime time to take a look at some much-talked-about story.” This trend crystallized when Netflix and Hulu nearly simultaneously released documentaries about the Fyre Festival debacle, less than two years after the last partygoers evacuated the island. (Hulu reportedly rushed to finish its version when word spread that Netflix was on the verge of releasing its own.) 

And sometimes, the documentary itself becomes the news. The popular 2015 HBO miniseries The Jinx guided the viewer along an investigation into the unsolved murder cases in which multimillionaire real estate heir Robert Durst was a longtime suspect. Recreations in the style of The Thin Blue Line run throughout the series, which at times feels like an episodic version of the film — except instead of trying to free someone from prison, the show was making a case for prosecution. Although its methods were cribbed, the series was praised for its artistry; NPR’s television critic wrote that it “felt like the birth of a new TV genre.” The Jinx hit all the right beats at the right time, which isn’t easy when you’re working from reality. In real life, we tend to lose the plot.

Durst was arrested the day before the release of the series’s bombshell finale, in which he appears to confess to the murders. Off camera, he is heard saying, “What the hell did I do? Killed them all, of course,” an instantly infamous closing line that earned the show ever more rapturous reviews and two Emmys. Esquire declared that it would “likely be remembered as one of the most jaw-dropping moments in television history.” Hyperallergic later called it “The Thin Blue Line on steroids.” The Observer declared that the filmmakers’ “quest for the truth… found it, and found it spectacularly.” According to HBO, more than a million people watched The Jinx’s finale.

That could have been the last of it, but in 2019, a new document came to light during Durst’s murder trial. The filmmakers had turned over the raw production audio, and the full transcript revealed that the smoking-gun quote did not really exist. The filmmakers had re-ordered his words and removed other sentences in between, splicing together what had potentially been two disparate thoughts. Amid twenty rambling sentences on a hot mic, Durst may have made some kind of confession, but it was nowhere near as clear as the show had made it seem. Critics viewed this as a grave transgression, and journalists saw an engineered quote meant to gin up drama and deceive viewers. (The New York Times headline read, “As Durst Murder Case Goes Forward, HBO’s Film Will Also Be on Trial.”) For filmmakers, radical audio edits like this one are routine, even though the ramifications for their subjects tend to be less severe. It was a clear breach of journalistic ethics, but none of the men behind the film were journalists. The two camps were not really speaking the same language.

 

The New York Times began the Trump era with the nearly Nixonian slogan “the truth is more important now than ever,” and journalism’s “just the facts, ma’am” ethos became an anchor for those who felt buffeted by torrents of alternative facts. Democratic politicians moved in lockstep to pivot from a near-decade of crackdowns on government leakers to support for dogged journalism. The heat of this moment was peculiar, given the frequency with which the preceding administration had used the Espionage Act to prosecute just that kind of activity. But there wasn’t time for everyone to get on the same page about what exactly journalism was or even what constituted the good or bad versions of the endeavor. The whole field had been conscripted into a fight. 

The series finale of The Jinx aired in the spring of 2015, not long before Trump descended his escalator and launched his campaign. Soon after, the American liberal media would be thrust into a years-long frenzy about the nature of, and the need to defend, truth in a supposedly post-truth era. Journalism was cast as an antidote to the tossed-off lies flooding the airwaves. Documentary, with its aura of educative power, is easily slotted into this narrative, but its practices have never aimed to produce verified facts in the same way.

In July, Variety announced its inaugural “Truth Seekers Summit,” a “first-of-its-kind summit honoring the power of storytelling and the pursuit of truth.” Co-hosted with Rolling Stone (and celebrating the simultaneous launch of a new documentary vertical for Variety and a new Rolling Stone section devoted to investigative journalism), the announcement advertised a keynote speech by Errol Morris, in addition to “panels from the documentarians behind Allen v. Farrow, Billie Eilish: The World’s a Little Blurry, Crime Scene: The Vanishing at the Cecil Hotel, I’ll Be Gone in the Dark, Rise Again: Tulsa and the Red Summer and more.” Meanwhile, Variety will introduce a new “Truth Seekers” award to honor “iconic documentarians or journalists.” Entire practices, histories, and methodologies are collapsed in that “or.” In the popular imagination, fiction is made up, and documentaries are real, and it has been a good time to buy into such a binary — the surge in documentaries coincided with a gnawing need for settled and verified truth, but documentary film can’t really sate this desire. At the form’s best and worst, its accounts are searching rather than definitive. Journalists sometimes make documentaries, but the field is larger than their work, and at the level of craft it has an entirely different set of priorities.

Writing in the Columbia Journalism Review in October 2020, Danny Funt, a senior editor at The Week, bemoaned the recent boom in documentaries that failed to prioritize “sound reporting” over entertainment. Funt is largely concerned with the celebrity documentary, and the ways in which powerful people have begun to exert control over their images; what concerns him is the turn away from objectivity, which he defines as editorial independence from the subject. “The best journalism is engrossing, but the most entertaining documentaries aren’t necessarily journalism,” he writes. Many things, entertaining or otherwise, are not journalism. (Funt reserves kind words for Serena, Ryan White’s film on the tennis player’s 2015 season, because White assured him in an interview that Williams doesn’t “give a shit what people think about her.” This makes little sense, as Ryan White is the kind of filmmaker, Funt explains, that stars ask to make films. Naturally, an early cut was screened at Williams’s agency, WME. Whether or not she personally cares how she is portrayed, she has a team that’s paid to.) Funt contends that films like The Last Dance, ESPN’s series on Michael Jordan’s final season as a Chicago Bull, or Hillary, a biographical portrait of Hillary Clinton, are too compromised by their subjects-cum-producers. They are the opposite of journalism; they are, for Funt, advertising.

Although the two are intertwined, it doesn’t quite make sense to think of documentary, especially its Trump-era surge, as an outgrowth of journalism. Most of the earliest motion pictures — depicting workers, trains, dancers, galloping horses — qualify as nonfiction. Only after it was swallowed by Vaudeville and the nickelodeons did the distinction between different kinds of cinematic images even become meaningful. Debates about truth and deception have value, but they obscure the fact that documentaries have always been more akin to essays than articles. It would be hard to hold up an essay as proof of anything at all, except perhaps consciousness. They are dramas of a mind, or often several, learning, searching, and making things cohere. Trying to relate the problems the booming documentary field faces to the supposed ethical commandments of journalism, as Funt does, obscures a bigger issue: most viewers are not taught to comprehend and evaluate documentaries on the terms by which they are constructed. Those who have never been in an editing room are often unaware of how moving images, frighteningly adept at emotional manipulation, actually function. Anyone who has made a film, borne the agony of a rough cut screening, tinkered with minor changes (or sometimes none at all), and then received confused feedback is aware of a simple truth: moving images are evidentiary, but all the evidence is circumstantial. 

In my work as a documentary editor, my greatest hurdle is merely piecing together something coherent out of a mass of recordings that do not, on their own, attest to very much at all. The first pass of an edit is often devoted to building rudimentary scenes that follow some kind of film grammar without much concern for whether they are proving a particular point. There is no real reason to believe that a shot of a woman looking to her right followed by a shot of a man crossing a road means that the woman was watching the man, except that this is the convention. It is a useful one, though. A strong enough look can direct the viewer’s attention almost anywhere. 

What is more difficult to parse — at least without access to a filmmaker’s hard drives — is how much of a documentary relies not on what you’re seeing, but on what you’re hearing. In recent years, experts warned that “deepfakes” — AI-assisted video manipulation in which somebody’s face would be convincingly grafted onto another body — could destabilize societies. Anyone with a computer and enough time could make a head of state declare war or engage in some compromising activity. As I read these concerns, I thought about how many dialogue edits in documentaries already go unnoticed.

“Frankenbiting,” as it is known among editors, has been making people say things they never said for a long time. In my experience, the process is most frequently employed to save screentime. Most people speak circuitously, filling a conversation with half-finished thoughts, gestures, and plenty of “you know?”s. In documentaries, unlike in our daily lives, dialogue is edited for concision: I remove the pauses and ums, cut unnecessarily repeated words. I make people say what they mean. This typically requires some kind of b-roll to play over the top in order to patch over the cuts. (As a rule of thumb, I tend to assume that when a documentary cuts away from a speaker on screen, there are probably some words or sentences elided.) There is no cinematic equivalent to paraphrasing, no visual cue like an ellipsis to inform the viewer that something has been left out. Direct quotes and doctored ones sound the same unless you’ve spent a great deal of time toggling between frames, listening to slightly clipped breath or the unnatural inflection of an “and” inserted where originally there was none. 

Of course, some people do just speak with strange cadences, which can make it hard to be certain whether something has really been changed. In the opening of The Thin Blue Line, the speaker says, off-screen, “This was Randall Adams.” To my ear, the slight catch in the speaker’s breath and the use of b-roll indicates that there was probably a cut between “was” and “Randall.” The cut may have removed a stutter, or, as is common, they never got the interview subject to say who she was talking about directly. “Randall Adams” may have been pulled from one sentence, “this was” from another. Most people would read this edit as harmless — just a matter of circumventing a lack of direct narration. But there are more blatantly unethical versions of dialogue editing, where meaning is not clarified but created. The recent outcry over the use of artificial intelligence to generate Anthony Bourdain’s voice in the documentary Roadrunner largely focused on whether it was ethical to deceive viewers into believing that the man had said something he had only written. It may have been a blatant example of sonic editing, but it was hardly different in kind from what documentary editors do all the time. If you record someone talking long enough, and if you are unscrupulous enough, you can make them say almost anything.

 

Vérité documentaries all have the same subjects: the kind of people who say yes to being in a documentary. The desire to be recorded, examined and reconstructed is a peculiar one, but less so if you’ve got ambition or an election to win. The politician has long been a favored subject. Politicians are powerful and cagey, prone to preening and, if they’re any good at their jobs, capable of commanding a room. Popularly known as liars, politicians relish the opportunity to cash in on the form’s presumed virtues and progressive bona fides. The pipeline now runs both ways: the Obamas have pivoted to new roles as media moguls with an eye for nonfiction, whereas Georgia’s Jon Ossoff made the leap from executive documentary producer to U.S. senator. But the most professionally advantageous position is in front of the camera. The “intimate access” that many films trumpet implies that the viewer will get to see who their subjects really are, even though most know that intimacy and honesty are not always related. What the films tend to do is fix a certain representation in public memory.

Robert Drew’s 1960 film Primary, which centered on the Wisconsin primary contest between Hubert Humphrey and John F. Kennedy, casts a long shadow over American documentary. Its rapturous reception overseas helped launch Direct Cinema, a documentary tendency born of newly available lightweight equipment that gave practitioners the ability to work without studio crews or interference. Primary’s camera crew alone (Albert Maysles, Richard Leacock, D.A. Pennebaker) would go on to become giants of 20th century American documentary. The film employed a new kind of proximity to its subjects and relied on closely observed scenes as they played out, rather than narration, title cards, or reenactments. Not long after its release, Drew and his celebrated camera crew sat for an interview about the project. There is some tension as the men try to explain exactly what they had done. Drew was adamant that they were reporters creating “a new form of journalism,” though he was the only one with a real background in reporting. When asked if they believed that the “personality of the filmmaker” must recede as much as possible to get towards some objective truth, though, Drew felt that the issue of objectivity was often misunderstood. “Hasn’t begun to be understood, in fact, in motion pictures at least,” he said. “The filmmaker’s personality is the most striking thing you ever see about the final result.” Drew’s team did not understand themselves to have created a transparent screen between the viewer and reality. They were instead revealing something about themselves — their taste, interests, and predilections — and using the footage to do so.

Primary is now remembered as a film about JFK, but at the time, Drew was a believer in the “fairness doctrine,” an FCC policy that required broadcasters to present “balanced” coverage. Consequently, he made sure that both Humphrey and Kennedy had equal time on screen. The enduring images, however, are all of the young phenom from Brookline, most especially the shot of the future president wading through a packed hall. The camera is held high so that you see the crown of his head, but the real subjects are the members of the crowd, their eyes bright with the fanatic’s zeal. Without even seeing Kennedy’s face, you get the sense that he is at least a rock star, if not a prophet. We tend to remember how Kennedy outshone Nixon in the first televised debate, but he had already done it once before, to Humphrey. Drew tried to be fair to the candidates, but cinema isn’t fair. It gravitates to stars. In the edit room, it is hard to justify cutting away from a shining face.

In the twenty-first century, the election documentary that looms over the others is Marshall Curry’s 2005 Oscar-nominated Street Fight. Curry does not hide his sympathies. The film made a celebrity out of then-Newark city council member Cory Booker, even as he failed to unseat the city’s mayor. In the beginning of the film, as an anonymous uniformed man is seen telling Booker that he is not allowed to campaign in the hallways of a public housing complex, the director remarks on how many “forces” there are against him. It is unclear whether this man has some relationship to these “forces,” but the suggestion is powerful enough. Once Curry tells you there are forces at work, you search the frame, listen intently, seeking evidence. 

Street Fight was followed by a spate of other similar documentaries, including Knock Down the House; All In: The Fight for Democracy, a history of voter suppression framed as a portrait of Stacey Abrams; and Philly D.A., an eight-part PBS series that premiered this past April, closely documenting the office and personality of progressive Philadelphia District Attorney Larry Krasner. All three seem to capitalize on a liberal appetite for uplifting political content in the Trump era. By the time Knock Down the House came out, Alexandria Ocasio-Cortez had already defeated Joe Crowley and become a star for the electoral left. The film might have been less rapturously received if she had not won, since the other three women in the project all lost their races in the 2018 election cycle. But, as Street Fight showed, losing in a nationally distributed documentary is not necessarily a bad thing if the filmmakers are on your side. Before turning to Ocasio-Cortez’s victory party, the movie returns to one of its subjects in Missouri: Cori Bush. “It was the old guard versus the progressive, new guard,” narrates a newscaster soundbite. “Many of you called for change after Ferguson. It does not look like that is happening.” Bush dabs at the edges of her eyes, her face beginning to twist. Her head drops and she rocks back and forth. It is as sympathetic a portrait of defeat as could be drawn. Two years later, and a year after the release of Knock Down the House, Bush ran again and defeated the same man who had once beaten her by nearly twenty points.

It might be easy to say that Street Fight or Knock Down the House are so in the tank for their subjects that they stray into manipulation. Each film is clearly aligned with a political and social vision that mirrors that of its subject. Even Primary falls into this trap at times. In one scene, the film cuts from Kennedy giving a rather mundane speech about the future to rapt faces in the crowd. Had the film cut from the candidate to a man checking his watch, one might not have thought Kennedy a great orator at all. What one witnesses is not exactly proof of Kennedy’s greatness, but of its effect on Drew, Leacock, Maysles, and Pennebaker.

 

With Trump at least temporarily offstage, it is unclear whether truth will maintain the same level of purchase on the public’s imagination. For a time, it was a useful byword, a way to express opposition to the administration’s mendacity. In turn, it was good business for those who touted their ability to disclose what was being hidden by all the lies. But what documentaries capture, in the indexical sense, is a pattern of thought more than the matter of reality. Thinking of them as such demands a different kind of attention, less reliance on fact-checking and more on form, argument and affect. The point and pleasure is, as Elizabeth Hardwick said, “the drama of opinion.” History is culled, constructed and put to use. But facts are offered only in the barest sense. 

The documentary exists because some people are compelled to make the misshapen mass of experience recognizable. Recognition raises too many questions: who is doing the recognizing? What do they need to know to do so? What is to be done in the aftermath of all this sight and cognition?  Invoking “ethics” is a shortcut to answering these questions, but they are more fruitfully thought of as political instead. Whether one should maintain editorial independence from a subject has everything to do with who the subject is, who the filmmakers are, and what each is trying to do. The ethical answer, the one that cocoons directors from a responsibility to the world as it is and as it could be, is that distance must always be maintained so that viewers can somehow trust that they are not being lied to. But lying is rarely the problem. Viewers tend to suffer from misunderstanding. If a Farsi speaker talks to a person who only understands English, the English speaker is not really being deceived. Bridging the gap between them is a matter of language, intention, and, if one is lucky, solidarity. 

Barbara Kopple’s landmark labor film Harlan County, U.S.A. would, by many standards, fail some tests of critical distance. Bessie Parker, one of the women involved in the strike that forms the basis of the film, said in a later interview, “I can remember hiding that mic so Barbara could get the courtroom scene because they would not allow her in.” The scene in question shows Bessie refusing to apologize for her actions during the strike. It’s a stirring moment, but with the knowledge that she intended to be recorded, it is easy to see that she is not only giving testimony; she is performing for the camera, and through it — through Kopple and her crew — she is speaking to the viewer.

Is the film showing something real, then? It is, because it is about the kind of labor militants who would do such a thing. “I knew what I was doing because that’s what I wanted to do,” Bessie tells the judge. “For once I was able to take the offensive instead of coming down here to take a step backwards to try to defend what we did. What we did was right, and we all know that.” Parker’s “we” included her fellow workers, and it dared the judge not to include himself. It probably also included the people she knew were peeking through a door to film her speech and the people who would later sit in a theater to watch her give it. The bosses were the only people she was excluding. It’s an aggressive film, and opposition to its politics could, if one so desired, be rerouted through a demand for greater authenticity, whatever that would entail. But what Parker said in that courtroom is what all documentarians tell themselves, whether they are aesthetes, entertainers, radicals, liberals, stooges, or liars. What we did was right. We all know that. We just never say what it was that we did.