About this Issue

World War I remade the world, and Europe in particular. But its effects in the United States were also pervasive. While no combat took place on U.S. soil, the U.S. federal government and its political culture were never the same again. The United States abandoned a relatively modest foreign policy and began taking a much more active role in world affairs. Its federal government greatly expanded. Its intellectuals began increasingly to see the government as the key orchestrator of society’s resources—with ample title to arrange them for the public good.

Students today aren’t likely to know as much about World War I as they may know about World War II or the Vietnam War, but the country’s first great twentieth-century conflict paved the way for both of those later engagements, and for much federal domestic policy as well. Our lead essayist this month is Prof. John E. Moser of Ashland University; he will be joined by Prof. Justin Quinn Olmstead of the University of Central Oklahoma; Andrew Bacevich, notable author and co-founder of the Quincy Institute for Responsible Statecraft; and Prof. Saladin Ambar of Rutgers University. Comments are open through the end of the month, and we invite readers to join the discussion.

Lead Essay

The Political Legacy of World War I

In July 1918, Columbia University philosophy professor John Dewey offered an explanation for why so many Progressive intellectuals had embraced U.S. involvement in the First World War. True, the “immediate aim” of the war was a mere expression of “the will to conquer,” but to focus exclusively on this was to miss the exciting possibilities that the war offered to society. Throughout the world, the conflict had “made it customary to utilize the collective knowledge and skill of scientific experts in all lines, organizing them for community ends.” “In every warring country,” he continued, “there has been the same demand that in the time of great national stress production for profit be subordinated to production for use. Legal possession and individual property rights have had to give way before social requirements. The old conception of the absoluteness of private property has received the world over a blow from which it will never wholly recover.” Not only would the eventual defeat of German autocracy and militarism make the world “safe for democracy,” as Woodrow Wilson had put it in his April 1917 war address, but it would “initiate a new type of democracy,” in which “the supremacy of the public and social interest” would finally be established “over the private possessive interest.”[1]

World War I was arguably the most important conflict of the twentieth century, bringing down four great empires and redrawing the map of Europe. The effect on the United States was quite different; it did not alter the country’s boundaries, or change its fundamental form of government, and the number of American men who lost their lives (126,000) paled in comparison to the figures from the European belligerents (2 million Germans, 1.4 million Frenchmen, nearly a million Britons). However, the war redefined the role of the federal government. While it did not quite lead to the democratic socialism that Dewey embraced, it redefined the relationship between Washington and its citizens, and set precedents to which subsequent presidents would repeatedly refer.

To say that the United States was unprepared for war in 1917 would be a serious understatement. The U.S. Army had well below 200,000 soldiers (by contrast, the Russian Army had nearly 6 million on the eve of war; the German Army had 4.5 million, while even Bulgaria fielded 280,000 men), and no arms industry capable of producing weapons heavier than rifles and pistols. While the reforms of the Progressive Era had marginally increased the power of the federal government, most authority still resided in the states, and the economy was almost entirely market-driven.

In order to assemble an army large enough to make a difference on the battlefields of Europe, the Wilson administration employed the power of the federal government on an unprecedented level. Even before the war the president had established a nonpartisan advisory committee—the Council of National Defense—made up of business and labor leaders to oversee the process of mobilization. When, in the first weeks of the war, calls for volunteers failed to meet army quotas, Wilson pushed through Congress a Selective Service Act that instituted mass conscription. To make sure that the new army—and the soldiers and civilians of Allied countries—would be properly fed, the president persuaded Congress to pass the Food and Fuel Control Act, which authorized the administration “to requisition foods, feeds, fuels, and other supplies necessary to the support of the Army…or any other public use connected with the common defense.” The bill created a Food Administration—headed by former mining engineer Herbert Hoover—that was empowered to fix prices and even control the amount of food consumed by American civilians; soon “wheatless” and “meatless” days became regular features of American life. To organize production of arms and equipment, Wilson established the War Industries Board, under the leadership of Wall Street banker and Wilson confidant Bernard Baruch. The WIB was empowered to coordinate wartime production by channeling resources where they were most needed, even if that meant commandeering private property. Grosvenor B. Clarkson, director of the Council of National Defense, called the agency “an industrial dictatorship without parallel—a dictatorship by force of necessity and common consent which step by step at last encompassed the Nation and united it into a coordinated and mobile whole, supporting the army and navy with all the incomparable strength of the greatest industrial potentiality in the world.”[2] When certain businesses regarded as vital to national defense complained that they were having trouble securing bank loans and marketing their bonds on Wall Street, the president set up the War Finance Corporation—chaired by Wilson’s Treasury Secretary (and son-in-law) William Gibbs McAdoo—which was empowered to issue emergency loans to industries whose ordinary lines of credit were disrupted by the war.

Taken together—along with many other new federal agencies and boards—these institutions placed a vast amount of power in the hands of the government. To ensure compliance from the population, the administration employed a variety of carrots and sticks that historian Christopher Capozzola has called “coercive voluntarism.”[3] Powerful interest groups such as business and labor organizations were bought off, the former by the promise of lucrative contracts, the latter by binding government arbitration of labor disputes. An official government propaganda agency, the Committee on Public Information, enlisted speakers, writers, photographers, illustrators, and filmmakers in a campaign to convince the American people of the need to sacrifice for a just cause. However, the administration certainly did not shrink from the use of force if the situation required it. When the railroads proved incapable of meeting the increased wartime demands placed on them, Wilson simply ordered that they be nationalized, thus bringing one of the country’s largest industries under direct federal control. Meanwhile, individuals deemed insufficiently patriotic faced merciless repression. Failure to register for the draft brought a penalty of one year in prison, and the rights of conscientious objectors were seldom recognized, let alone respected. In addition, more than 2,000 Americans were prosecuted under the Espionage Act of 1917 and the Sedition Act of 1918, which threatened even the mildest critics of the war or the administration with prison terms of up to twenty years.

In short, during the year and a half in which the United States was an active belligerent in the First World War, the federal government assumed powers, and made demands upon its citizens, that were unprecedented in the nation’s history. Before the country entered the war, the government had never spent more than $746 million in a single year; in 1918 it spent $12.68 billion. Government spending as a percentage of gross national product increased from 5.9 percent in 1916 to 21.3 percent in 1918. In 1916 the top income tax rate had been 15 percent, and that applied to incomes over $2 million; by 1918 the rate had soared to 77 percent, paid on all income beyond $1 million. The public debt, which had stood at just over $1 billion in 1916, reached more than $25 billion by the end of 1918. On the eve of war 5.5 percent of the civilian workforce was employed by the federal government; in 1918 it was 6.3 percent. In 1916 there were just over 108,000 men in the U.S. Army, and roughly 60,000 in the U.S. Navy; two years later those numbers stood at 2.4 million and nearly 449,000, respectively. Freedom of speech and the press were curtailed, private property was subject to appropriation, and thanks to conscription millions of young men’s very lives were placed at the disposal of the Wilson administration.

Of course, all this was bound to provoke a reaction, and the first signs of it came in the congressional elections of 1918, which returned Republican majorities in both houses for the first time in ten years. The Republicans, it must be said, had been no less committed to war than Wilson—indeed, many GOP leaders faulted the president for waiting until 1917 to intervene—yet during the campaign they were quick to capitalize on public frustration with wartime controls. Less than a week after Election Day the German government signed an armistice, bringing the fighting in Europe to an end, and in the months that followed the myriad of wartime agencies and bureaus were gradually dismantled. Conscription ended almost immediately, and the Selective Service System was terminated early in 1919. In 1920 Ohio Senator Warren G. Harding won the presidency by promising the nation a return to “normalcy.”

However, anyone who thought that “normalcy” meant the status quo ante bellum was in for a rude shock. In his 1987 book Crisis and Leviathan, economic historian Robert Higgs discusses the so-called “ratchet effect,” in which each crisis of twentieth-century America produced a sudden and substantial growth in the size and power of government. After each crisis passed, however, the situation never returned to what it had been before.[4] The statistics on the size and scope of the federal government bear this out. Much is often made of the cuts that the Harding and Coolidge administrations made, both in spending and in taxes. However, at no point in the 1920s did the government spend less than $2.9 billion (more than three times the level of 1916); in 1924 federal spending amounted to 7.9 percent of GNP (as opposed to 5.9 percent in 1916). Even in 1929 the national debt stood at $16.9 billion, more than four times what it had been before the war. Moreover, while the Coolidge administration reduced the top tax rate to 25 percent, that rate applied to all income over $100,000; compare this to 1916, when a rate of 15 percent applied only to income over $2 million. Despite a rapid demobilization after the war, there were still 25 percent more soldiers in the U.S. Army in the mid-1920s than there had been in 1916, and nearly 60 percent more sailors in the U.S. Navy. Finally, the proportion of the civilian work force employed by the federal government in 1926 was 6.3 percent—greater even than it had been in 1918!

So what happened? Even at the time, John Dewey predicted that mobilization for war would have a lasting impact on Americans’ expectation of their government. “It is true,” he wrote in 1918, “that not every instrumentality brought into the war for the purpose of maintaining the public interest will last. Many of them will melt away when the war comes to an end.” However, once the public learned that it was possible for the government to direct the economy and society, allocating billions of dollars if necessary to solve pressing social problems, they would demand exactly this. “In this sense, no matter how many among the special agencies for public control decay with the disappearance of war stress, the movement will never go backward.”[5]

A prime example of wartime controls continuing—indeed, expanding—in the postwar era involved alcoholic beverages. The Food and Fuel Control Act of 1917 stipulated that “no foods, fruits, food materials, or feeds shall be used in the production of distilled spirits for beverage purposes,” and further banned the importation of liquor into the country. The provision was added as a sop to prohibitionists, who were well represented in Congress at the time, but it also addressed public concerns about drunkenness on the part of new recruits. At a time when the country was supposed to focus its energies single-mindedly on defeating the Kaiser, the production of intoxicating beverages seemed frivolous. (Nor did it help that the country’s most popular alcoholic beverage, beer, was almost entirely produced by German-Americans.) This provision of the law clearly paved the way for national Prohibition; indeed, in December 1917 Congress approved the Eighteenth Amendment, which banned “the manufacture, sale, or transportation of intoxicating liquors,” and sent it to the states. By January 1919 the required 36 of 48 states had ratified the amendment, and in October of that year the Volstead Act provided for its enforcement. Beginning January 17, 1920, Prohibition was the law of the land.

Prohibition cost the federal government some $300 million to enforce (in addition to billions of dollars in lost excise taxes at the state and federal levels), and for the first time brought federal law enforcement into communities across the country. Much of that enforcement was left to the Justice Department’s Bureau of Investigation, an agency created in 1908 that only came into its own during the war, when it was tasked with monitoring the activities of those deemed “pro-German”—a category which quickly expanded to include labor activists, women’s suffragists, and African-American leaders. In July 1917 the Justice Department hired a young man named John Edgar Hoover, and one of his first major projects was to oversee the registration of enemy aliens. He took to the task with such vigor and efficiency that he attracted the notice of A. Mitchell Palmer, who was appointed Attorney General in March 1919. When a wave of mail bombings struck the country in June, Palmer organized a new General Intelligence Division, with Hoover as its head. Over the next several months the young bureaucrat amassed 200,000 files on suspected radicals, then oversaw the so-called “Palmer Raids” of November 1919 and January 1920, in which thousands of people he believed dangerous to national security were rounded up for questioning; most were quickly released, but 556 were deported without trial. The “Red Scare” of 1919-1920 soon passed, but Hoover was appointed director of the Bureau of Investigation in 1924, and would preside over the agency’s reorganization as the Federal Bureau of Investigation in 1935. Hoover’s methods, first developed during the First World War and in its immediate aftermath, served as the foundation for the modern surveillance state.

The primary legacy of the war, however, was the precedent it set for the federal government not merely to regulate private business, but to manage the economy as a whole. In the years following the war Americans grew less and less certain that participation in the First World War had been worthwhile, but the example of the country’s rapid mobilization remained a model for dealing with other problems. Herbert Hoover’s tenure as Secretary of Commerce from 1921 to 1928 exemplifies this. Hoover, who had directed Wilson’s Food Administration, had also overseen U.S. food relief efforts in Europe during the immediate postwar period. At Commerce he championed the sort of government-business partnership that had been practiced on a much larger scale in Wilson’s Council of National Defense. In this way he brought about the regulation of new industries such as radio broadcasting and aviation, and established voluntary standards for items such as automobile tires and milk bottles. He also coordinated relief efforts when the Mississippi River flooded in 1927, leaving hundreds of thousands homeless. For good reason Hoover was dubbed “Master of Emergencies.”

Hoover’s record as Secretary of Commerce helped catapult him to the presidency in 1928, but his skills as an expert administrator were sorely tested by the coming of the Great Depression. His strategy was once again to rely on the voluntarism that had worked so well during the war, organizing conferences with business and labor leaders in an effort to keep unemployment down and wages high. Hoover frequently invoked military analogies in discussing his recovery efforts; fighting the Depression, he told Americans in January 1932, was “like a great war in that it is not a battle upon a single front but upon many fronts.” To this end he devoted enormous sums of money; federal spending ballooned from under $3 billion in 1928 to more than $4.5 billion in 1932. He even resurrected the old War Finance Corporation in a new form—the Reconstruction Finance Corporation—designed to bail out troubled banks, railroads, insurance companies, and other institutions whose collapse might bring down the rest of the economy.[6]

But it would be Hoover’s successor, Franklin D. Roosevelt, who would most fully apply the war analogy to the economic crisis. While campaigning for the Democratic nomination in April 1932, he reminded his radio audience that he had served as Assistant Secretary of the Navy in 1917. “The generalship of that moment,” he told them, “conceived of a whole Nation mobilized for war.” In the Depression, he continued, “the Nation faces today a more grave emergency than in 1917,” hence nothing short of national planning was required. After winning in a landslide that November, he delivered an inaugural address in which he announced the need to “move as a trained and loyal army willing to sacrifice for the good of a common discipline,” acting “with a unity of duty hitherto evoked only in time of armed strife.” If Congress refused to support his program, he would exercise “broad Executive power to wage a war against the emergency, as great as the power that would be given to me if we were in fact invaded by a foreign foe.”[7]

In 1906 the philosopher William James, a self-described pacifist, admitted that the “martial virtues…intrepidity, contempt of softness, surrender of private interest, obedience to command, must still remain the rock upon which states are built.” Even if war were to be abolished, the world would need “the moral equivalent” in order to inspire citizens to do their duty. The radical journalist Randolph Bourne, who opposed the war, put it more bluntly: “War is the health of the state.”[8] Involvement in the First World War proved to Americans the truth of these sentiments; little wonder, then, that so many presidents since Wilson have involved the country in other wars, both literal and figurative—not only against foreign enemies, but against social problems such as the Depression, poverty, drugs, crime, and, most recently, terrorism. This constant war footing, whether for a real war or any of several metaphorical ones, is the most critical legacy of World War I for the United States. World War I was horrific, but paradoxically, we have been returning to war again and again ever since.

Notes


[1] John Dewey, “What Are We Fighting For?” The Independent (July 1918): 474, 480-483.

[2] Grosvenor B. Clarkson, Industrial America in the World War: The Strategy Behind the Line, 1917-1918 (Boston: Houghton Mifflin, 1923), p. 292.

[3] Christopher Capozzola, Uncle Sam Wants You: World War I and the Making of the Modern American Citizen (New York: Oxford University Press, 2008).

[4] Robert Higgs, Crisis and Leviathan: Critical Episodes in the Growth of American Government (New York: Oxford University Press, 1987), pp. 57-74.

[5] Dewey, “What Are We Fighting For?” p. 482.

[6] John E. Moser (ed.), The Great Depression and the New Deal: Core Documents (Ashland, OH: Ashbrook Press, 2017), p. 26.

[7] Moser, Great Depression and New Deal, pp. 32, 85-86.

[8] William James, “The Moral Equivalent of War,” 1906, https://www.uky.edu/~eushe2/Pajares/moral.html, accessed 10/4/20; Randolph Bourne, “War is the Health of the State,” 1918, https://www.panarchy.org/bourne/state.1918.html, accessed 10/4/20.

Response Essays

The Importance of U.S. Economic and Foreign Policy in the First World War and Beyond

One hundred years after the guns fell silent in Europe, the First World War remains pivotal to modern history. U.S. involvement in the war certainly redefined the relationship between the U.S. government and the American people and was full of “exciting possibilities,” as John Moser notes in his essay. The phrase “exciting possibilities” is relative, but to Americans in 1917, the war was an opportunity to reform not only their society but that of the world.[1]

As the nation geared up for war, it was also gearing up for change. Women’s suffrage, Prohibition, and immigration restrictions were all changes resulting from U.S. participation in the war. President Woodrow Wilson’s worries, however, did not center on positive reform. In 1916, the president had promised a government that focused on the American people’s everyday concerns. At the war’s end, he instead faced racial strife in the form of race riots in no fewer than twenty-five cities and labor unrest gripping major industries. During the war, Wilson had guaranteed the right of labor to unionize and bargain collectively; his administration quickly abandoned this guarantee at war’s end, thanks in large part to the Red Scare that gripped the nation in 1919.

But Wilson was more worried about the negative changes that would come out of American participation. Involvement in the war meant, to Wilson, that his plans for international reform were in jeopardy. A more significant worry was that the war would forever change domestic politics in the United States. Indeed, the president feared that a move to a war footing would require Americans to shelve their liberal belief in a “free society” in order to compete with Germany and to keep Britain and France at bay when it came to the use of U.S. troops. Wilson believed “the [U.S.] Constitution would not survive” a war that required militarism and legislation such as the Espionage and Sedition Acts.[2] Patriotic campaigns pressed immigrants to assimilate quickly; propaganda targeted them specifically, as well as the American population more generally, as the country moved to a war footing to produce the number of troops needed to fight a modern, industrial war.

It is essential not to forget Wilson’s primary fear: that involvement in the war would leave his plans for international reform unrealized. Despite his fight with the Senate over ratification of the Treaty of Versailles, Wilson hoped to have the United States work as part of an international community to ensure world peace. In the lead essay, Moser gives a nod to Wilson’s desire to “make the world safe for democracy,” but only a nod. This is a mistake, because the United States’ growing participation in world affairs unmistakably begins with this phrase. The legacy of the First World War rests on America’s newfound place in the world. Wilson believed that if he had to lead the country into a war, he would attempt to use American involvement to force the other belligerent nations to create and maintain a peace based on territorial integrity, political independence, equal trade opportunities, and a limitation of armaments managed by an evolving League of Nations.[3]

The impact of America’s growing role in international affairs can be seen in the diplomacy and propaganda the belligerent nations used to entice the United States into the fighting. However, postwar foreign policy had a far more significant impact on the United States and its population. To begin with, it was during the period of U.S. neutrality that America changed from a debtor nation into the world’s leading creditor nation. By the war’s end, the United States had loaned Britain $10 billion. An all too often forgotten aspect of U.S. First World War loans was the stipulation that the money could only be spent in America.[4] In one stroke, the United States became the world’s leading exporter. As David Reynolds has noted, during the 1920s, “the world needed America more than America needed the world.”[5] This not only helped Americans prosper; it provided the United States with the leverage to become the dominant economic force in the world.

America’s newfound economic connection to the world stands in stark contrast to the idea that the United States remained on a war footing to regulate private business and manage the economy. The backlash against war profiteering bred greater public distrust of bankers, of Britain, and of the U.S. government’s decision to go to war. By the late 1920s, the American public was sending members to Congress committed to preventing any future involvement in war. Negotiated by U.S. Secretary of State Frank B. Kellogg and French Foreign Minister Aristide Briand, the Kellogg-Briand Pact of 1928 outlawed war and was eventually signed by more than fifty nations. Republican Senator Gerald P. Nye sponsored the Neutrality Act of 1935, which prohibited the sale of munitions to belligerents and the use of American ships to carry them. A second Neutrality Act, passed in early 1936, banned American loans to belligerents, and a third, in 1937, banned American citizens from traveling on ships owned by belligerent nations. Each of these acts damaged American industry’s ability to maintain its new position of economic hegemony in the world and limited President Roosevelt’s ability to steer business to American companies during the Great Depression.

America’s rise to world economic powerhouse would not only survive the financial upheaval of the 1930s but be strengthened by it. As U.S. consumption slackened in 1928 and the government turned its focus to domestic economic issues, the world financial system faltered and collapsed. The lesson was clear to the new generation of American leaders who, in 1944, began to plan for a world more tightly intertwined with the U.S. economy. It would be the post-Second World War legacy that placed the United States on a constant war footing. The First World War turned out to be, shockingly, only the prelude.

Notes


[1] Jennifer D. Keene, The United States and the First World War (Harlow: Pearson Education Limited, 2000), p. 85.

[2] Ross A. Kennedy, The Will to Believe: Woodrow Wilson, World War I, and America’s Strategy for Peace and Security (Kent: The Kent State University Press, 2009), p. 129.

[3] Thomas J. Knock, To End All Wars: Woodrow Wilson and the Quest for a New World Order (Princeton: Princeton University Press, 1992), pp. 126-7.

[4] Adam Tooze, The Deluge: The Great War, America and the Remaking of the Global Order, 1916-1931 (New York: Viking Press, 2014), p. 207.

[5] David Reynolds, The Long Shadow: The Legacies of the Great War in the Twentieth Century (New York: W.W. Norton & Company, 2014), p. 128.

World War I and the Ideology of Empire

I am inclined to agree with Professor Moser’s assessment that “World War I was arguably the most important conflict of the twentieth century.” In terms of lasting historical legacy, it may well be the century’s single most important event of any kind.

Where Moser and I differ is on how best to frame the war’s importance. I am disinclined to do so by assessing its impact on the size and reach of the United States government. Indeed, as an episode in the emergence of the American Leviathan, U.S. participation in World War I trails in significance well behind the Great Depression, World War II, the Cold War, and even the post-Cold War era.

Among other things, those later episodes lasted substantially longer than the eighteen months during which the United States was an active belligerent in 1917-1918. And whereas the passing of the “Great War” restored some semblance of prewar “normalcy” to American life, rearming for the next war against Germany ended normalcy for good. Or perhaps more accurately, it gave birth to a new normal in which American political elites came to regard global military dominion as an absolute imperative. Even today, except on the radical Left and the anti-interventionist Right, that radically revised conception of normalcy persists and prevents any serious reconsideration of basic U.S. policy.

Where’s the proof? It’s in the matchless size of the military budget, the vastness of the Pentagon’s network of foreign bases, and the design of U.S. forces as instruments of global power projection, with the actual defense of the United States and of the American people something of an afterthought. None of these can be attributed to events that occurred between April 1917 and November 1918.

What then imbues World War I with its singular importance? While there are several legitimate answers to that question, allow me to assign pride of place to the demise of race-based and ethnically defined imperialism.

Put simply, prior to 1914 the legitimacy of empires rooted in the principle of white supremacy or ethnic superiority had gone to a large extent unquestioned, especially in the West. I am not suggesting that subject peoples passively accepted subordination. They did not. But in terms of the reigning moral and civilizational calculus, the forces of the colonial order held the upper hand. More substantively, they enjoyed a clear advantage in terms of access to instruments of coercion. As Hilaire Belloc put it succinctly in 1898, “Whatever happens, we have got the Maxim gun, and they have not.”

Both the course of World War I and its outcome signaled the coming demise of racist and ethnic imperialism. That the standard bearers of western civilization blindly transformed Europe itself into a vast charnel house undermined confidence in their supposed superiority. After 1918, even in the West, it became increasingly difficult to justify the subjugation of peoples deemed inferior on the basis of race or ethnicity.

The pyrrhic Allied victory over Germany and the Ottomans did enable the British to expand their imperial holdings. But the burdens of imperial maintenance, whether in Ireland or India or Palestine, were now also increasing. Whether empire paid was becoming an open question. This was true not only for Great Britain, but also for France and the United States, even as the latter remained an empire in denial despite controlling various colonies, dominions, and protectorates.

So World War I undermined the traditional rationale for empire: superior peoples governing inferiors. Yet it also gave rise to a new rationale, with revolutionary ideology now offering a basis for exercising control, directly or indirectly, beyond the boundaries of a single nation-state. The Union of Soviet Socialist Republics, Benito Mussolini’s fascist Italy, and Hitler’s Third Reich offer prime examples. Some of these ideologically justified empires had a blessedly brief existence. Others, such as the post-1945 Soviet empire, based on Marxism-Leninism, and the post-1945 American empire, an outgrowth of liberal democratic capitalism, survived for a handful of decades.

Each of these Cold War empires could trace its lineage back to World War I. Each suffered from severe mismanagement. At the end of the 1980s, the decrepit Soviet empire collapsed with startling suddenness. In the first two decades of the present century, the American empire staggered through a period of rapid decay, although even today the cadre of imperial managers in Washington appears intent on pretending otherwise.

From an imperial perspective, the commotion created by Donald Trump’s various shenanigans has provided an excuse for elites to ignore much larger forces at play both at home and abroad. But unless I miss my guess, whoever takes the oath of office as president on January 20, 2021 will find it increasingly difficult to sustain the pretense that the United States is history’s “indispensable nation.” As with Trump’s backlog of unpaid loans and back taxes, very large imperial bills are coming due, with little to suggest that the American people are in the mood to pay them.

If elected, Joe Biden promises to “save the soul” of America. He is not promising to refurbish the American empire.

Whether yet another variant of empire—centered perhaps on corporate behemoths rather than mere states—will impart yet another twist to the narrative of imperial evolution traceable to World War I is difficult to say. What we can say with some assurance is that the emergence of Leviathan, which is Professor Moser’s chief concern, is to a very large extent a byproduct of this larger story of modern empire.

Rethinking Wilson, World War I, and Progressivism’s Martial Spirit

John E. Moser is right to remind us that the First World War fundamentally altered the relationship and expectations of the American people and their government. Progressives like John Dewey and W.E.B. Du Bois supported the war effort, engaging in a Faustian bargain that sought the social and economic benefits of total war while hoping that the costs to democracy and the nation’s character would, on balance, prove limited. Both men were proven wrong. As progressives have learned over the past one hundred years, the price of war far exceeds the imagined dividend to economic and racial justice. Du Bois, who called for African Americans to “close ranks” (The Crisis, July 1918) and support Wilson’s efforts, was rightly haunted by his support for the war. World War I began not only a “ratcheting up” of the expected commitments of the federal government, as Moser notes; it also intensified the power of the presidency and the national security state, and hastened the abandonment of racial justice at home.

Moser is correct in arguing that the Great War did not change the form of American government. But democracy is defined as much by its character and values as it is by its form. By engaging in the suppression of dissenting voices and arresting those presumed to be subversive over the course of the war, the Wilson administration departed from long-established republican values. While Wilson did not begin this practice, one as old as the Civil War – and indeed dating back as far as the administration of John Adams – World War I saw its expansion and intensification. The horrific national blemish of Japanese American internment during the Second World War was but the lamentable continuation of this tradition.

Du Bois understood that America’s intervention in the war was not simply about “making the world safe for democracy,” as Wilson proposed. It was also a continuation of “the wild quest for Imperial expansion among colored races between Germany, England and France primarily, and Belgium, Italy, Russia and Austria-Hungary in lesser degree.” (The Crisis, November 1914) America was inserting itself into a conflict premised on the global structure of white supremacy. One cannot divorce the late nineteenth century “scramble for Africa” from the rise of totalitarianism in Europe, as Hannah Arendt noted. (The Origins of Totalitarianism, 1951) That the purported democratic impulses that led America into World War I were soon followed by the humiliation of black soldiers, racial pogroms, and the resurgence of the Ku Klux Klan in the United States is instructive.

The idea that “war is the health of the state” is simply incompatible with republicanism. It’s why all American wars have been couched in terms of democratic expansion or its protection. The invocation of the racial “other” has likewise been part of the rationalization for American intervention – and the First World War was no different. The demonization of Native Americans, Mexicans, Germans, Japanese, Vietnamese, and Muslims must be incorporated into any assessment of the health of American democracy. That progressives have been party to this history is indicative of its power as a durable feature of American life. Notably, for African Americans, proofs of loyalty during these wars were never enough. As Wilson told his physician Dr. Cary Grayson, “the American negro returning from abroad would be our greatest medium in conveying bolshevism to America.”[1]

It was not only the fear of African Americans presenting an internal threat to national security that diminished the quality of American democracy and limited the reach of Wilson’s New Freedom agenda. In Franklin Roosevelt’s most important campaign speech of 1932, delivered at the Commonwealth Club in San Francisco, he lamented the interruption of progressivism’s march.

Had there been no World War, had Mr. Wilson been able to devote eight years to domestic instead of to international affairs, we might have had a wholly different situation at the present time. However, the then distant roar of European cannon, growing ever louder, forced him to abandon the study of this issue. The problem he saw so clearly is left with us as a legacy; and no one of us on either side of the political controversy can deny that it is a matter of grave concern to the government.

Of course, the problem Wilson saw was the rise of financial power and corporate combinations that threatened the fabric of American democracy. FDR saw himself as an heir to this problem. Nevertheless, he too chose to limit his New Deal programs almost exclusively to whites, while also ordering the relocation of tens of thousands of American citizens to concentration camps on the basis of their race. The project of making the world safe for democracy has proven to be contingent upon a racially exclusive definition of the term.

The shadows of Wilson and World War I remain with us. Indeed, Dr. Martin Luther King, Jr.’s Riverside Church address opposing America’s war in Vietnam was premised on a wholly new and deeply moral understanding of democracy – one that rejected racial hierarchy at home and around the world in favor of a more just economic system. As President Lyndon Johnson’s Great Society programs and antipoverty agenda were dashed, King came to know that “America would never invest the necessary funds or energies in rehabilitation of its poor so long as adventures like Vietnam continued to draw men and skills and money like some demonic, destructive suction tube.”

George W. Bush’s justification of the war in Iraq in 2003, and the subsequent “War on Terror,” were but the latest in a Wilsonian line of reasoning that has become an embedded part of the American national political character. Imagining a new set of possibilities for social, economic, and racial justice – outside of the paradigm of war – is a task well worth undertaking. That we have not been able to sustain efforts towards a more equitable society ought not to deter us. A government strong enough to wage war can be empowered to marshal its forces in a different direction. As Pope Francis has recently argued, “We can no longer think of war as a solution, because its risks will probably always be greater than its supposed benefits. In view of this, it is very difficult nowadays to invoke the rational criteria elaborated in earlier centuries to speak of a possibility of a ‘just war.’” That Woodrow Wilson and Wilsonianism are still seen as America’s moral vision to the rest of the world is an abiding legacy of the First World War. If World War I has taught us anything, it is that we must dissociate our democratic ideals from the martial spirit that continues to pervade the United States. War may be the health of the state – but a state so oriented to war can never be a democratic one.

Note

[1] Lloyd E. Ambrosius. Woodrow Wilson and American Internationalism (Cambridge: Cambridge University Press, 2017), 111.

The Conversation

Wilson Stands Tall Compared to Some

Saladin Ambar writes: “That Woodrow Wilson and Wilsonianism are still seen as America’s moral vision to the rest of the world is an abiding legacy of the First World War.”

Would that it were the case! To judge from U.S. policy since the end of the Cold War, “America’s moral vision” leans toward militarized imperialism ineptly pursued. My guess is that Wilsonianism exercises less influence on the way others view the United States today than do Clinton-era claims of America as the “indispensable nation,” the Bush Doctrine of preventive war, and all the nonsense of the present administration.

I’m not a fan of Woodrow Wilson. But recent presidents make him look pretty good by comparison.

Woodrow Wilson’s Vision—and Our Moral Leadership

Undoubtedly, Woodrow Wilson’s internationalist vision has transformed into something beyond its original purpose. But such is the tendency of all messianic visions. Whether in LBJ’s call to save the “little nation” of Vietnam from communism or in George W. Bush’s call to upend an “Axis of Evil” through war in Iraq, the moral fervor of Wilson’s call to save the world abides. That the world (and many Americans) has grown increasingly skeptical of this vision doesn’t mean Wilsonianism is dead. It just means our ability to project power through moral leadership no longer holds. I take Andrew Bacevich’s point that America’s interventions have become more inept with age, but Wilson opened that Pandora’s box, and he deserves all the credit (and much of the blame) for doing so.

Some Replies on Wilson, Race, and World War I

I am indebted to all three of my commenters for filling some gaps in my account. Of course, in a 3,000-word essay (about fifty percent longer than what Cato originally asked for) there is much that I had to leave out.

Obviously, I agree that participation in the First World War had extremely important effects beyond domestic politics. Justin Quinn Olmstead is certainly correct in reminding us of the expanded role that the United States would play in the global economy after 1918. I disagree with him, however, that Wilson was much concerned about the war’s effects on individual liberty. Wilson’s famous quote, “Once lead this people into war and they will forget there ever was such a thing as tolerance” (which Professor Olmstead does not mention), and his less famous one that “the Constitution would not survive [the war]” (which he does), might seem to support that view. However, as the late historian Thomas Fleming pointed out, the only source for these words is Frank I. Cobb, editor of the New York World and an ardent Wilson supporter. Cobb claimed that Wilson made these statements to him in the White House on the night of April 1-2, 1917, but White House logs show no record of his having been there. Perhaps I would be more willing to accept the portrayal of Wilson-the-civil-libertarian if he hadn’t so willingly signed the Espionage and Sedition Acts, or if he had spoken out even once in defense of German-Americans, labor radicals, or others targeted during the war—not to mention his silence on the lynchings of African-Americans, which surged during his presidency.

I also find it hard to accept Professor Olmstead’s statement that “involvement in the war meant his plans for international reform were not realized.” It is true that Wilson had, ever since August 1914, expressed the hope that the war would end in stalemate, giving the United States an opportunity to serve as mediator and to impose a new world order on an exhausted Europe. Indeed, he repeated this wish as late as January 1917 in his “peace without victory” speech. However, almost immediately after giving that speech the president seems to have come to the realization that his hoped-for stalemate was not going to materialize. Did this mean that he abandoned his hopes for international reform? Jane Addams certainly did not think so; when she and a group of fellow peace activists met with the president in February, he informed them that he had already decided to ask Congress for a declaration of war. “As head of a nation participating in the war,” Wilson told them, he “would have a seat at the peace table, but…if he remained the representative of a neutral country” he could do nothing more than “call through a crack in the door.” While it is true that he had to compromise repeatedly with the European Allies at the Paris Peace Conference, one gets no sense that Wilson believed that he had made some Faustian bargain by entering the war.

Andrew Bacevich likewise reminds us of the critical role that World War I played in undermining European imperialism. However, I believe he is too quick to dismiss the period 1917-18 as merely “an episode in the emergence of the American Leviathan.” He is right in mentioning (as I also did) the return to “normalcy” that followed the war—something that did not occur after the Great Depression, World War II, or any of the crises that followed. My argument is that U.S. involvement in the First World War laid the foundation for the U.S. government’s response to all of these later episodes. The rapid mobilization for war, and the vast centralized control granted to the federal government, provided a blueprint for how to address future crises. Indeed, even as early as 1921, when the country faced a sharp economic contraction, Commerce Secretary Herbert Hoover sought to employ the “associationalist” tactics that he had learned during the war—and would have undoubtedly done so had the situation not improved dramatically that summer.

Finally, I thank Saladin Ambar for noting that the “character and values” of a government are as important as its form, and that the wartime state first developed during World War I was fundamentally incompatible with democracy. I agree with everything he has to say in his perceptive essay, which makes me wish I could have devoted more attention in my essay to the subject of race.

Actions Speak Louder than Quotes

I want to begin by acknowledging Professor Moser’s breadth of knowledge of famous quotes. However, I fail to see how noting which quote I chose to include and which I chose to omit is germane to the discussion, other than to act as a red herring. In an attempt to put the discussion back on course, I submit that “Wilson’s famous quote” provides support for the position I put forth in my original response.

Because neither individuals nor history are “clean,” an individual’s actions must be judged as a whole. Wilson was a complicated man. He was neither wholly good nor wholly bad. As such, it is Wilson’s cross to bear that he lamented the loss of liberty while remaining passive as others limited civil liberties in his name. “Once lead this people into war and they will forget there ever was such a thing as tolerance” is a testament to Wilson’s ability to “read” the American people. The attacks on German-Americans and the constant bombardment from state and local officials to take action against labor radicals are but two examples of how right Wilson was in his assessment of the American people and what he called their “mob passion.” I agree with Professor Moser that the president did not protect individual liberties during the war. While it is claimed he disagreed with many of the actions taken under the Espionage Act, his mistake was failing to reverse them or to call for a stop to them.

Professor Moser seems to have missed the point entirely in his second response to my comments. Wilson did not abandon his hopes for international reform. Rather, they simply did not come to fruition, as he feared. Decisions to go to war in order to have a seat at the table notwithstanding, Wilson’s hope was for a world without war. Did he succeed by leading the United States into the war? I submit the rise of fascism and the Second World War as exhibits A and B that he did not. Additionally, basing the argument on Wilson’s belief that a seat at the table would allow him to reach his goals falls flat once the lens is expanded to encompass the Paris Peace Conference and the presence of David Lloyd George and Georges Clemenceau. Both of these men spent years pressuring Wilson to get involved in the war and then held Wilson at arm’s length due to the lack of American blood spilt on European fields. Actions, as Professor Moser has acknowledged, speak louder than quotes.