Company Towns Are Still with Us

On a May morning in 1920, a train pulled into town on the Kentucky–West Virginia border. Its passengers included a small army of armed private security guards, who had been dispatched to evict the families of striking workers at a nearby coal mine. Meeting them at the station were the local police chief—a Hatfield of the infamous Hatfield-McCoy feud—and several out-of-work miners with guns.

The private dicks and the local militia produced competing court orders. The street erupted in gunfire. When the smoke cleared, ten men lay dead—including two striking miners, the town mayor, and seven of the hired guns.

The striking miners had worked for the Stone Mountain Coal Company, in mines located outside the city limits of Matewan. There, they rented homes that were owned by their employer, shopped at a general store that was owned by their employer, and paid in a company-generated form of “cash” that could only be spent at that company store. When they joined a United Mine Workers organizing drive and struck for better pay, they were fired and blacklisted.

Without a union, a workplace can be a dictatorship. But what if your boss is also your landlord, your grocer, your bank, and your local police? That kind of 24/7 employer domination used to be a common practice before the labor movement and the New Deal order brought it to an end.

Today, however, the corporate assault on unions is leading to the return of the company town. These new company towns are dominated by one large business that owes no obligation to aid in the town’s well-being—quite the contrary, in fact. As was clear in this past summer’s failed UAW organizing drive at the Nissan plant in Canton, Mississippi, the ever-present threat that factory relocation poses to a one-company town bends the local power structure to the company’s will. That’s why so many of the newer large factories—like the auto and aerospace plants that have sprung up across the South in recent decades—are located in remote rural areas. That’s also one reason why organizing campaigns in those locales face very steep odds.

ALTHOUGH NEW ENGLAND textile manufacturers experimented with company housing in the early 1800s, company towns really came into their own during the industrial revolution that followed the U.S. Civil War. They were common in industries where the work was necessarily physically remote, like coal mining and logging. The companies simply owned all of the surrounding land and built cheap housing to rent to the workers they recruited. In new industries like steel production, factories were built in areas where land was cheap, and the companies bought lots of it. By constructing housing on the extra land, the companies found a great way to extract extra profit from their worker-tenants. Besides, a privately owned town enabled companies to keep union organizers away and to spy on potential union activity.

Some of the most infamous and bloody labor battles of the 19th and early 20th centuries, like the Homestead strike and the Ludlow Massacre, involved the violent eviction of striking workers from their company-owned housing.

Life was even more miserable for workers where the company-store system prevailed. Employers would own and operate a general store to sell the basic necessities to workers, with as much as a 20 percent markup. An 1881 Pennsylvania state investigation into union-buster and future-walking-head-wound Henry Clay Frick’s Coke Company found that the company cleared $160,000 in annual profit from its company store (that would be about $3.5 million today).

Some employers paid their employees in “company scrip,” a kind of I.O.U. that could only be exchanged for goods at the company store. A worker who was lucky enough to get paid in cash could be fired if he or she were caught bargain-hunting at an independent store in a neighboring town. And pay was often so meager, and payday so delayed, that a worker might have to buy on credit, resulting in the kind of merry-go-round of debt and reduced pay envelopes that is disturbingly similar to the practices of today’s “payday loan” predators. It’s not for nothing that the refrain of the classic song “Sixteen Tons” goes, “I owe my soul to the company store.”

The rise of unions and the New Deal order didn’t put an end to company towns per se, but companies that gave in to union recognition found less reason to own worker housing. However, companies that remained non-union—particularly in the South—continued to act as landlord, thereby instilling in their workers an additional layer of fear and oppression to keep unions out. Lane Windham’s excellent new book, Knocking on Labor’s Door: Union Organizing in the 1970s and the Roots of a New Economic Divide, mentions—almost in passing—that one Amalgamated Clothing and Textile Workers Union organizing target, the textile giant Cannon Mills, continued to run a company town into the 1980s. When the company was purchased in a leveraged buyout in 1982, the new owners quickly decided to sell the 2,000 houses the company owned, giving workers 90 days to buy their homes or get out. The town—Kannapolis, North Carolina—finally incorporated and began electing its own city government after three-quarters of a century as a virtual dictatorship.

But Kannapolis’s conversion from a company town to a proper municipality only happened because an ailing firm in a globally competitive industry needed to sell off non-essential assets, and saw little need to be financially tethered to a community. The plant closed its doors for good in 2003, causing the largest mass layoff in North Carolina history.

Not all company towns were ramshackle developments. Some wealthy industrialists developed model company towns in misguided attempts at philanthropic social engineering.

George Pullman made a fortune building and leasing luxury sleeping cars to railroad companies. Pullman’s belief that the public would pay extra money for better-quality rail travel proved correct, and the Pullman Palace Car Company quickly had a monopoly in a market of its own invention. Pullman’s pressing need for new factories to meet consumer demand coincided with his growing paternalistic concern about poverty, disease, and alcoholism in the country’s industrial cities.

The town of Pullman was built on an area south of Chicago, near the Indiana border, adjacent to the Calumet River and the Illinois Central Railroad line. The company already owned some land there, and purchased more to begin construction in 1880. The housing that Pullman built was of much higher quality than what was typically found in working-class neighborhoods in industrial cities. There were green spaces and tree-lined streets. In the town center, he built a handsome and well-stocked library, a luxury hotel with the town’s only licensed bar, and a grand theater to feature “only such [plays] as he could invite his family to enjoy with utmost propriety.” Casting a shadow over the town was the towering steeple of the massive Greenstone Church.

There was no requirement that Pullman’s factory workers reside in his town, and many commuted from Chicago and neighboring villages. But by 1893, some 12,600 people—Pullman employees and their families—had chosen to live in his city. Some were supervisors and social climbers. Many more were young workers who wanted to raise their families in a new, clean environment. By the mid-1880s, the town was gaining a reputation as “the world’s healthiest city” for its low death rate.

Pullman’s undoing was his tendency to run his town like his business. As with his sleeping cars, he owned all the property and leased it to residents. The rent on his one giant church was too high for most congregations to afford, and his ill-conceived attempt to convince all of the local denominations to merge into one generic mega-church failed. His library charged a membership fee to foster his notion of personal responsibility. Workers avoided the hotel bar and the watchful eye of “off-duty” supervisors, limiting their public carousing to a neighboring village colloquially known as “bum town.”

Pullman’s business sense led him to make a confounding choice for a civic father who was trying to instill middle-class values in his city: The housing, too, was for rent only. His aim was to keep the housing attractive and in good repair, and he charged higher rents to cover the upkeep. Here, Pullman applied his usual belief that the public would pay more for higher quality, ignoring the fact that this particular public—his employees—had little choice when his was the only housing in town.

The Panic of 1893, and the severe economic downturn that followed, presented Pullman with a dilemma. His business slowed to a near halt. Any capitalist who did not also feel responsible for running a city would simply have laid off all but a skeleton crew of workers. In a more traditional company town, the laid-off workers would have been violently evicted by Pinkertons or the local police. The Pullman company reduced its workers’ hours but kept everyone employed on a reduced payroll. Crucially, however, the Pullman Land Trust did not reduce rents, plunging the town’s residents into financial crisis. Many workers fell behind on their rent, and their mounting debt to Pullman restricted their freedom to quit. The squeeze provoked a strike at the factory.

The strike was soon joined by a nationwide boycott backed by the new American Railway Union (ARU), which was led by Eugene V. Debs. Rail transportation around the country ground to a stop as members of the new industrial union refused to move trains that carried Pullman sleeping cars. The strike was violently crushed by federal troops, and its leaders were jailed. (Debs later said of the experience, “in the gleam of every bayonet and the flash of every rifle the class struggle was revealed.” He emerged from jail six months later and went on to become America’s most prominent socialist leader, calling the strike his “first practical lesson in Socialism.”)

George Pullman died in 1897, resentful of his reputation as a tyrant and of his model town’s ignominy.

Just a few years later, another bored plutocrat decided to build a model company town of his own. Friends cautioned Milton Hershey that Pullman had been a disaster for its owner. Warned that the town’s residents wouldn’t have elected George Pullman dogcatcher, Hershey responded, “I don’t like dogs that much.”

Hershey made his first fortune in caramel, and sold his confectionery for the unprecedented (for caramel, anyway) sum of $1 million in 1900. Although he retained rights to a small chocolate subsidiary, it was more of a local novelty. Prior to the advent of milk chocolate, the sweet was a luxury for the wealthy, and it would not keep well enough on long rail journeys to allow for mass production and distribution.

Then, like Pullman, Hershey became interested in solving the problems of modern industrial life. He founded the Hershey Chocolate Company to support his town—not vice versa. Hershey worked on a formula for milk chocolate that could be mass-produced, to provide his town with a sustainable industry.

Hershey broke ground in 1903, siting the town amid the dairy farms that would supply milk for his chocolate business. At the center of town was a 150-acre park, featuring a band shell, golf course, and zoo. After ten years, Hershey’s amusement park was receiving 100,000 visitors a year, making tourism a crucial second economic base for the model company town. Hershey built banks, department stores, and public schools. Unlike Pullman, Hershey made homeownership a key part of his vision and business model.

In a case of history repeating itself, Hershey was rocked by a Congress of Industrial Organizations sit-down strike during the Great Depression. In 1937, 600 workers took control of the factory for five days. Their sit-down was broken up by scabs and angry local farmers who had watched 800,000 pounds of milk spoil each day. They broke into the factory, battering and forcibly removing the strikers.

Thanks to the New Deal order, which saw an activist federal government defending the rights of workers, however, a permanent union presence was eventually established at Hershey (although the company finagled to have its favored representative, a more conservative AFL union, win a collective-bargaining agreement).

The town of Hershey, though by no means the utopia that Milton Hershey envisioned, exists today as a modestly successful tourist trap. The theme park and the still-operating chocolate factory continue to serve as a job base for locals.

COMPANY TOWNS ARE STILL with us. In the 21st century, company towns operate less like Pullman and more like Kannapolis during the years between Cannon Mills’s sale of its company housing and the final closure of the mill. The companies are no longer their employees’ landlords, but because they’re the only major employer for miles around, they still wield extraordinary power.

This past August’s NLRB election defeat for Canton, Mississippi’s Nissan workers, who sought to be represented by the United Automobile Workers (UAW), should put unions on notice that company towns are not some relic from our sepia-toned past, but an essential feature of 21st-century manufacturing employment in the United States.

In 2003, Nissan, a Japanese multinational carmaker now valued at $41 billion, opened one of its two American auto assembly plants in the tiny town of Canton. The factory employs around 6,500 workers, while the town is home to roughly 13,000 residents.

In the run-up to the union election, Nissan did what almost every employer does. It didn’t threaten to fire union activists, because that would be too obviously illegal. Instead, management merely predicted that the invisible hand of the market would force it to shut down a newly unionized factory and ship all of the jobs out of town. Thus terrorized, the entire political establishment of Canton, its churches, and the workers’ own neighbors amplified this threatening message to potential UAW supporters.

The company inundated the local airwaves with television ads in which a local pastor compared the ostensibly horrific period before Nissan arrived—when residents were “fluctuating back and forth looking for jobs”—with the good news that Nissan employees can “come through the door knowing the lights are on, the water is running.”

It actually makes sound business sense for multiple competing businesses in the same industry to be located in close physical proximity to each other. There are economies of scale that can be achieved through shared distribution channels, a major airport, a shared community of professional engineering talent, an education system designed to build the bench, and an ecosystem of parts suppliers and other complementary businesses.

It just doesn’t make business sense if you’re trying to operate on a union-free basis. The fact that Chrysler, GM, and Ford workers were friends and neighbors in Detroit and its suburbs helped organizers foster a culture of solidarity that was essential to organizing the auto industry in the 1930s and 1940s. The fact that few new auto factories, foreign or domestic, have been built anywhere near Detroit—or anywhere near each other—for more than half a century is not an accident. It’s not the result of “free trade,” of the tax-cutting “savvy” of Southern politicians, or of some inherent deficiency of the so-called Rust Belt.

It’s the product of a bloody-minded determination by “job creators” to avoid the conditions under which unions are even possible. From the overuse of “independent contractors” to sub-contracting and outsourcing, to locating new factories in small and remote geographies, corporations in America strategically structure their business to avoid the reach of NLRB-certified, enterprise-based collective bargaining.

These business practices make it clear that employers will continue to evade and sabotage any system of labor rights that is tied to individual workplaces, rather than one that applies to entire industries. We will need new labor laws and new models of worker representation to democratize our communities.

[This article originally appeared at the American Prospect.]

The West Virginia Teachers’ Strike Has Activists Asking: Should We Revive the Wildcat?

The stunning success of the recent statewide West Virginia teachers’ strike makes it one of the most inspiring worker protests of the Trump era.

The walkout over rising health insurance costs and stagnant pay began on Feb. 22 and appeared to be settled by Feb. 27 with promises from Gov. Jim Justice of a 5 percent pay raise for teachers. Union leaders initially accepted that deal in good faith, along with vague assurances that the state would work with them on a solution to escalating out-of-pocket costs for workers’ healthcare.

Dramatically, rank-and-file teachers refused to end the walkout. Every public school in the state remained closed for nine days due to the strike, until the West Virginia legislature voted to approve a 5 percent pay increase for all state workers as well as a formal labor-management committee to deal with the healthcare problem.

The entire experience leaves many labor activists asking variations of three questions: What is a wildcat strike? Was West Virginia a true wildcat? And should we have more wildcat strikes?

What is a wildcat strike?

Wildcat strikes are job actions led by rank-and-file members in defiance of official union leadership. Why would leaders try to stop a job action that members want to take? The answer, generally, is that the strike is either against the law or in violation of a contractual no-strike clause (and, often, the leaders are in some way legally compelled to discourage it). In either case, workers who strike could be fired with no legal recourse for the union to win them their jobs back. This is a peculiar feature of America’s post-World War II labor relations system.

Prior to the 1935 National Labor Relations Act (NLRA), a strike was a strike. It was not uncommon to have multiple unions vying for workplace leadership and engaging in a kind of one-upmanship of job actions. While these actions occasionally produced small gains in pay or reductions in hours, they rarely ended with union recognition—much less signed contracts.

That’s because employers didn’t have to deal with unions. They might have begrudgingly made a unilateral concession to the workers’ wage or hour demands in order to resume operations, but bosses almost never formally sat down with elected union representatives.

The NLRA changed that status quo by compelling employers to “bargain in good faith” with any group of union members that demanded it. As Charles J. Morris documents in his 2004 book, The Blue Eagle at Work: Reclaiming Democratic Rights in the American Workplace, the NLRA did not make the certification of an exclusive union representative a precondition for that duty to bargain. The framers of the NLRA wrote it for the labor movement that existed at the time: a collection of voluntary associations that made bargaining demands for their members only.

Compelled to bargain with unions, employers quickly developed a preference for dealing with only one as an exclusive representative. That way, bosses could have contractual assurance that all outstanding disputes would be settled (or at least channeled through grievance and arbitration procedures) for the life of a contract that also guaranteed no strikes (or lockouts or other forms of industrial action) would occur during that period of labor peace.

Under that framework, the wildcat became a unique kind of worker protest. The etymology of the term “wildcat” can probably be traced to the Industrial Workers of the World (IWW) and their unofficial symbol, the sabo cat.

Wildcat actions are not common and are rarely full-blown strikes. More often, they are temporary slowdowns or quick work stoppages in a smaller segment of a wider operation. They could be sparked, for example, over a sudden change in work rules or the belligerent actions of a supervisor. Usually, an official union representative rushes to the scene to attempt to settle the dispute with management and encourages the workers to return to their jobs.

Wildcats were more common in the early 1970s, during the last great strike wave in the United States. Those years saw a large number of strikes by teachers and other public-sector workers to win collective bargaining rights. Many of those strikes were technically illegal, but they were not wildcats, as they were organized and led by official union leadership that had few alternatives in the absence of formal union rights under the NLRA.

However, in that climate of greater worker protest, many private-sector workers also went on strike. Many of those strikes were wildcats sparked by out-of-control inflation and intolerable speed-ups. In a sense, workers weren’t just striking in violation of their collective bargaining agreements but against their terms.

The most famous example was the 1972 rank-and-file rebellion at the General Motors factory in Lordstown, Ohio, which has fascinated generations of labor writers. In her 1975 book All the Livelong Day: The Meaning and Demeaning of Routine Work, Barbara Garson captured this illustrative conversation between workers:

“It pays good,” said one, “but it’s driving me crazy.”

“I don’t want more money,” said another. “None of us do.”

“I do,” said his friend, “so I can quit quicker.”

“The only money I want is my union dues back – if they don’t let us out on strike soon.”

In 1972, the factory was churning out Chevy Vegas at a pace that gave each worker just 36 seconds to do a minute’s worth of work before the next car moved down the line. Workers had taken to acts of sabotage, like throwing a few loose screws in a gas tank, in hopes that the “error” would be caught by quality control and shut the line down for a few minutes of blessed relief.

While United Auto Workers (UAW) leaders prioritized wages in bargaining—they won an impressive 13 percent increase for their members in the contract that was then in effect—the workers at Lordstown wanted to slow the pace of work. They went on a wildcat strike that lasted for 22 days, until management settled a slew of grievances and agreed to rehire a number of laid-off workers in order to reduce the pace of work.

By the end of the decade, the competitive pressures of global trade put workers back on the defensive. The Lordstown plant is still in operation despite multiple threats to shutter it. In a 2010 profile, the New York Times called it one of GM’s “most productive and efficient plants,” and noted that 84 percent of the workers had recently voted to approve concessions during GM’s bankruptcy.

Those competitive pressures, combined with austerity budgets in the public sector, have severely reduced many workers’ living standards. The West Virginia strike may be a sign that these desperate times have turned many workplaces into powder kegs of simmering resentment and desperation.

Was West Virginia a true wildcat?

West Virginia schools have a peculiar framework: no contracts or formal collective bargaining, but a degree of official union recognition—including dues check-off—alongside a highly litigious tenure and grievance procedure, with statewide pay and benefits set through legislative lobbying. That environment appeared perfectly crafted to sap unions of their potential militancy, assuming the bosses understood they had to provide a minimally decent standard of pay and benefits. Instead, teachers faced some of the lowest pay rates in the nation, along with rising healthcare costs, which helped lead to their decision to walk off the job.

Because the West Virginia strike happened outside the context of formal, contract-based unionism, Lois Weiner argues in New Politics that it is inaccurate to describe the statewide walkout as a wildcat. “Confusion on nomenclature reflects how remarkable this phenomenon is: we don’t know how to name a movement of workers that is self-organized, not confined by the strictures of collective bargaining,” she writes, continuing, “There is no legally prescribed procedure for ending the strike because the vast majority of people striking aren’t union members and strikes are not legal.”

Given the frontal assault on the entire legal framework of union representation—Janus v. AFSCME being the massive tip of the gargantuan iceberg—what unionism looks like in the United States is bound to be radically altered in the coming years. Weiner does us a service by breaking the union framework down into its component parts. We need more writers doing this if we are going to have an informed debate about which parts are worth fighting to preserve, and which are overdue for replacement.

Respectfully, however, I would argue that the West Virginia strike was a wildcat. The political dynamics were essentially the same as in the ritualized contract bargaining of the post-war private sector. Union leaders were in the position of “bargaining” with the governor over a legislative fix to pay and healthcare. They took a deal that was reasonable enough in order to demonstrate their own reasonableness to the bosses.

When the rank-and-file rejected that settlement by continuing to stay off the job, the strike became a wildcat. Official union leaders continued to represent the interests of the striking workers and helped harness the continued strike into an even bigger win—all while presenting themselves to politicians as the reasonable negotiators who could help them get the teachers back to work.

That the strike happened in the first place is thanks to a good deal of self-organization among segments of the rank-and-file, aided in no small part by e-mail and social media. Because two unions—affiliates of the American Federation of Teachers and the National Education Association—vie for members across the state like pre-NLRA unions used to, this rank-and-file rebellion appears to have whipsawed the competing union leaderships into a one-upmanship over who could more effectively lead the strike and claim credit for the win.

This example does suggest one model for a new unionism, rooted in our recent past.

Should we have more wildcat strikes?

I recently wrote a piece for the Washington Post on the Janus v. AFSCME case, arguing that agency fees—which are directly challenged in the case—have historically been traded for the no-strike clause. I’ve been making variations of the same point at In These Times for over two years, but this time it’s created a bit of a stir.

Some commentators are beginning to recognize that an anti-union decision in Janus could spark constitutional and workplace chaos that could make messy protests like the West Virginia teachers’ strike a more regular occurrence.

If deprived of agency fees, some unions will probably cede exclusive representation in order to kick out the scabs, or “free riders.” And one wonders how much longer private-sector unions in right-to-work states will continue to slog through unfair NLRB elections in order to “win” the obligation to represent free riders, instead of embracing Charles J. Morris’s theory that the original 1935 process for card-check recognition of minority unions is still operational and demanding “members-only” bargaining.

That trend would inevitably lead to new worker organizations rushing to poach the unrepresented workers left behind. Some would likely compete by offering cheaper dues or by cozying up to management. Others would vie for members and shopfloor leadership by railing against disappointing deals. This will be messy. As in the pre-NLRA era, workplace competition between unions may not produce lasting union contracts.

But it will also make a guaranteed period of labor peace impossible—and that could lead to more strikes like the West Virginia wildcat. Through Janus, right-to-work and the renewed open-shop offensive, the bosses have made clear that they’re not interested in labor peace. Let’s give them what they want.

[This post originally appeared at In These Times.]