In July 1918, Columbia University philosophy professor John Dewey offered an explanation for why so many Progressive intellectuals had embraced U.S. involvement in the First World War. True, the “immediate aim” of the war was a mere expression of “the will to conquer,” but to focus exclusively on this was to miss the exciting possibilities that the war offered to society. Throughout the world, the conflict had “made it customary to utilize the collective knowledge and skill of scientific experts in all lines, organizing them for community ends.” “In every warring country,” he continued, “there has been the same demand that in the time of great national stress production for profit be subordinated to production for use. Legal possession and individual property rights have had to give way before social requirements. The old conception of the absoluteness of private property has received the world over a blow from which it will never wholly recover.” Not only would the eventual defeat of German autocracy and militarism make the world “safe for democracy,” as Woodrow Wilson had put it in his April 1917 war address, but it would “initiate a new type of democracy,” in which “the supremacy of the public and social interest” would finally be established “over the private possessive interest.”
World War I was arguably the most important conflict of the twentieth century, bringing down four great empires and redrawing the map of Europe. The effect on the United States was quite different; it neither altered the country’s boundaries nor changed its fundamental form of government, and the number of American men who lost their lives (126,000) paled in comparison to the figures from the European belligerents (2 million Germans, 1.4 million Frenchmen, nearly a million Britons). However, the war redefined the role of the federal government. While it did not quite lead to the democratic socialism that Dewey embraced, it redefined the relationship between Washington and its citizens, and set precedents to which subsequent presidents would repeatedly refer.
To say that the United States was unprepared for war in 1917 would be a serious understatement. The U.S. Army had well below 200,000 soldiers (by contrast, the Russian Army had nearly 6 million on the eve of war; the German Army had 4.5 million, while even Bulgaria fielded 280,000 men), and no arms industry capable of producing weapons heavier than rifles and pistols. While the reforms of the Progressive Era had marginally increased the power of the federal government, most authority still resided in the states, and the economy was almost entirely market-driven.
In order to assemble an army large enough to make a difference on the battlefields of Europe, the Wilson administration employed the power of the federal government on an unprecedented scale. Even before the war the president had established a nonpartisan advisory committee—the Council of National Defense—made up of business and labor leaders to oversee the process of mobilization. When, in the first weeks of the war, calls for volunteers failed to meet army quotas, Wilson pushed through Congress a Selective Service Act that instituted mass conscription. To make sure that the new army—and the soldiers and civilians of Allied countries—would be properly fed, the president persuaded Congress to pass the Food and Fuel Control Act, which authorized the administration “to requisition foods, feeds, fuels, and other supplies necessary to the support of the Army…or any other public use connected with the common defense.” The bill created a Food Administration—headed by former mining engineer Herbert Hoover—that was empowered to fix prices and even control the amount of food consumed by American civilians; soon “wheatless” and “meatless” days became regular features of American life. To organize production of arms and equipment, Wilson established the War Industries Board, under the leadership of Wall Street banker and Wilson confidant Bernard Baruch. The WIB was empowered to coordinate wartime production by channeling resources where they were most needed, even if that meant commandeering private property.
Grosvenor B. Clarkson, director of the Council of National Defense, called the agency “an industrial dictatorship without parallel—a dictatorship by force of necessity and common consent which step by step at last encompassed the Nation and united it into a coordinated and mobile whole, supporting the army and navy with all the incomparable strength of the greatest industrial potentiality in the world.” When certain businesses regarded as vital to national defense complained that they were having trouble securing bank loans and marketing their bonds on Wall Street, the president set up the War Finance Corporation—chaired by Wilson’s Treasury Secretary (and son-in-law) William Gibbs McAdoo—which was empowered to issue emergency loans to industries whose ordinary lines of credit had been disrupted by the war.
Taken together—along with many other new federal agencies and boards—these institutions placed a vast amount of power in the hands of the government. To ensure compliance from the population, the administration employed a variety of carrots and sticks that historian Christopher Capozzola has called “coercive voluntarism.” Powerful interest groups such as business and labor organizations were bought off, the former by the promise of lucrative contracts, the latter by binding government arbitration of labor disputes. An official government propaganda agency, the Committee on Public Information, enlisted speakers, writers, photographers, illustrators, and filmmakers in a campaign to convince the American people of the need to sacrifice for a just cause. However, the administration certainly did not shrink from the use of force if the situation required it. When the railroads proved incapable of meeting the increased wartime demands placed on them, Wilson simply ordered that they be nationalized, thus bringing one of the country’s largest industries under direct federal control. Meanwhile, individuals deemed insufficiently patriotic faced merciless repression. Failure to register for the draft brought a penalty of one year in prison, and the rights of conscientious objectors were seldom recognized, let alone respected. In addition, more than 2,000 Americans were prosecuted under the Espionage Act of 1917 and the Sedition Act of 1918, which threatened even the mildest critics of the war or the administration with prison terms of up to twenty years.
In short, during the year and a half in which the United States was an active belligerent in the First World War, the federal government assumed powers, and made demands upon its citizens, that were unprecedented in the nation’s history. Before the country entered the war, the government had never spent more than $746 million in a single year; in 1918 it spent $12.68 billion. Government spending as a percentage of gross national product increased from 5.9 percent in 1916 to 21.3 percent in 1918. In 1916 the top income tax rate had been 15 percent, and that applied to incomes over $2 million; by 1918 the rate had soared to 77 percent, paid on all income beyond $1 million. The public debt, which had stood at just over $1 billion in 1916, reached more than $25 billion by the end of 1918. On the eve of war 5.5 percent of the civilian workforce was employed by the federal government; in 1918 it was 6.3 percent. In 1916 there were just over 108,000 men in the U.S. Army, and roughly 60,000 in the U.S. Navy; two years later those numbers stood at 2.4 million and nearly 449,000, respectively. Freedom of speech and the press were curtailed, private property was subject to appropriation, and, thanks to conscription, the very lives of millions of young men were placed at the disposal of the Wilson administration.
Of course, all this was bound to provoke a reaction, and the first signs of it came in the congressional elections of 1918, which returned Republican majorities in both houses for the first time in ten years. The Republicans, it must be said, had been no less committed to war than Wilson—indeed, many GOP leaders faulted the president for waiting until 1917 to intervene—yet during the campaign they were quick to capitalize on public frustration with wartime controls. Less than a week after Election Day the German government signed an armistice, bringing the fighting in Europe to an end, and in the months that followed the myriad wartime agencies and bureaus were gradually dismantled. Conscription ended almost immediately, and the Selective Service System was terminated early in 1919. In 1920 Ohio Senator Warren G. Harding won the presidency by promising the nation a return to “normalcy.”
However, anyone who thought that “normalcy” meant the status quo ante bellum was in for a rude shock. In his 1987 book Crisis and Leviathan, economic historian Robert Higgs describes a “ratchet effect,” in which each crisis of twentieth-century America produced a sudden and substantial growth in the size and power of government; after each crisis passed, however, the situation never returned to what it had been before. The statistics on the size and scope of the federal government bear this out. Much is often made of the cuts in spending and taxes that the Harding and Coolidge administrations made. However, at no point in the 1920s did the government spend less than $2.9 billion (more than three times the level of 1916), and in 1924 federal spending amounted to 7.9 percent of GNP (as opposed to 5.9 percent in 1916). As late as 1929, the national debt stood at $16.9 billion, more than four times what it had been before the war. Moreover, while the Coolidge administration reduced the top tax rate to 25 percent, that rate applied to all income over $100,000; compare this to 1916, when a rate of 15 percent applied only to income over $2 million. Despite a rapid demobilization after the war, there were still 25 percent more soldiers in the U.S. Army in the mid-1920s than there had been in 1916, and nearly 60 percent more sailors in the U.S. Navy. Finally, the proportion of the civilian work force employed by the federal government in 1926 was 6.3 percent—as high as it had been at the wartime peak of 1918.
So what happened? Even at the time, John Dewey predicted that mobilization for war would have a lasting impact on Americans’ expectation of their government. “It is true,” he wrote in 1918, “that not every instrumentality brought into the war for the purpose of maintaining the public interest will last. Many of them will melt away when the war comes to an end.” However, once the public learned that it was possible for the government to direct the economy and society, allocating billions of dollars if necessary to solve pressing social problems, they would demand exactly this. “In this sense, no matter how many among the special agencies for public control decay with the disappearance of war stress, the movement will never go backward.”
A prime example of wartime controls continuing—indeed, expanding—in the postwar era involved alcoholic beverages. The Food and Fuel Control Act of 1917 stipulated that “no foods, fruits, food materials, or feeds shall be used in the production of distilled spirits for beverage purposes,” and further banned the importation of liquor into the country. The provision was added as a sop to prohibitionists, who were well represented in Congress at the time, but it also addressed public concerns about drunkenness on the part of new recruits. At a time when the country was supposed to focus its energies single-mindedly on defeating the Kaiser, intoxicating beverages seemed a frivolous indulgence. (Nor did it help that the country’s most popular alcoholic beverage, beer, was almost entirely produced by German-Americans.) This provision of the law clearly paved the way for national Prohibition; indeed, in December 1917 Congress proposed the Eighteenth Amendment, which banned “the manufacture, sale, or transportation of intoxicating liquors.” By January 1919 the required 36 of 48 states had ratified the amendment, and in October the Volstead Act provided for its enforcement. Beginning January 17, 1920, Prohibition was the law of the land.
Prohibition cost the federal government some $300 million to enforce (in addition to the billions of dollars in excise taxes lost at the state and federal level), and for the first time brought federal law enforcement into communities across the country. Much of that enforcement was left to the Justice Department’s Bureau of Investigation, an agency created in 1908 that only came into its own during the war, when it was tasked with monitoring the activities of those deemed “pro-German”—a category which quickly expanded to include labor activists, suffragists, and African-American leaders. In July 1917 the Justice Department hired a young man named John Edgar Hoover, and one of his first major projects was to oversee the registration of enemy aliens. He took to the task with such vigor and efficiency that he attracted the notice of A. Mitchell Palmer, who was appointed Attorney General in March 1919. When a wave of mail bombings struck the country in June, Palmer organized a new General Intelligence Division, with Hoover as its head. Over the next several months the young bureaucrat amassed 200,000 files on suspected radicals, then oversaw the so-called “Palmer Raids” of November 1919 and January 1920, in which thousands of people he believed were dangerous to national security were rounded up for questioning; most of them were quickly released, but 556 were deported without trial. The “Red Scare” of 1919-1920 soon passed, but Hoover was appointed director of the Bureau of Investigation in 1924, and would preside over the agency’s reorganization as the Federal Bureau of Investigation in 1935. Hoover’s methods, first developed in the First World War and in its immediate aftermath, served as the foundation for the modern surveillance state.
The primary legacy of the war, however, was the precedent that it set for the federal government not merely to regulate private business, but to manage the economy as a whole. In the years following the war Americans grew less and less certain that participation in the First World War had been worthwhile, but the example of the country’s rapid mobilization for war remained a model for dealing with other problems. Herbert Hoover’s tenure as Secretary of Commerce from 1921 to 1928 exemplifies this. Hoover, who had directed Wilson’s Food Administration, had also overseen U.S. food relief efforts in Europe during the immediate postwar period. At Commerce he championed the sort of government-business partnership that had been practiced on a much larger scale in Wilson’s Council of National Defense. In this way he brought about the regulation of new industries such as radio broadcasting and aviation, and established voluntary standards for items such as automobile tires and milk bottles. He also coordinated relief efforts when the Mississippi River flooded in 1927, leaving hundreds of thousands homeless. For good reason Hoover was dubbed “Master of Emergencies.”
Hoover’s record as Secretary of Commerce helped catapult him to the presidency in 1928, but his skills as an expert administrator were sorely tested by the coming of the Great Depression. His strategy was once again to rely on the voluntarism that had worked so well during the war, organizing conferences with business and labor leaders in an effort to keep unemployment down and wages high. Hoover frequently invoked military analogies in discussing his recovery efforts; fighting the Depression, he told Americans in January 1932, was “like a great war in that it is not a battle upon a single front but upon many fronts.” To this end he devoted enormous sums of money; federal spending ballooned from under $3 billion in 1928 to more than $4.5 billion in 1932. He even resurrected the old War Finance Corporation in a new form—the Reconstruction Finance Corporation—designed to bail out troubled banks, railroads, insurance companies, and other institutions whose collapse might bring down the rest of the economy.
But it would be Hoover’s successor, Franklin D. Roosevelt, who would most fully apply the war analogy to the economic crisis. While campaigning for the Democratic nomination in April 1932, he reminded his radio audience that he had served as Assistant Secretary of the Navy in 1917. “The generalship of that moment,” he told them, “conceived of a whole Nation mobilized for war.” In the Depression, he continued, “the Nation faces today a more grave emergency than in 1917,” hence nothing short of national planning was required. After winning in a landslide that November, he delivered an inaugural address in which he announced the need to “move as a trained and loyal army willing to sacrifice for the good of a common discipline,” acting “with a unity of duty hitherto evoked only in time of armed strife.” If Congress refused to support his program, he would exercise “broad Executive power to wage a war against the emergency, as great as the power that would be given to me if we were in fact invaded by a foreign foe.”
In 1906 the philosopher William James, a self-described pacifist, admitted that the “martial virtues…intrepidity, contempt of softness, surrender of private interest, obedience to command, must still remain the rock upon which states are built.” Even if war were to be abolished, the world would need “the moral equivalent” in order to inspire citizens to do their duty. The radical journalist Randolph Bourne, who opposed the war, put it more bluntly: “War is the health of the state.” Involvement in the First World War proved to Americans the truth of these sentiments; little wonder, then, that so many presidents since Wilson have involved the country in other wars, both literal and figurative—not only against foreign enemies, but against social problems such as the Depression, poverty, drugs, crime, and, most recently, terrorism. This constant war footing, whether for a real war or any of several metaphorical ones, is the most important legacy of World War I for the United States. The war itself was horrific; paradoxically, the country has been returning to war, in one form or another, ever since.
 John Dewey, “What Are We Fighting For?” The Independent (July 1918): 474, 480-483.
 Grosvenor B. Clarkson, Industrial America in the World War: The Strategy Behind the Line, 1917-1918 (Boston: Houghton Mifflin, 1923), p. 292.
 Christopher Capozzola, Uncle Sam Wants You: World War I and the Making of the Modern American Citizen (New York: Oxford University Press, 2008).
 Robert Higgs, Crisis and Leviathan: Critical Episodes in the Growth of American Government (New York: Oxford University Press, 1987), pp. 57-74.
 Dewey, “What Are We Fighting For?” p. 482.
 John E. Moser (ed.), The Great Depression and the New Deal: Core Documents (Ashland, OH: Ashbrook Press, 2017), p. 26.
 Moser, Great Depression and New Deal, pp. 32, 85-86.
 William James, “The Moral Equivalent of War,” 1906, https://www.uky.edu/~eushe2/Pajares/moral.html, accessed 10/4/20; Randolph Bourne, “War is the Health of the State,” 1918, https://www.panarchy.org/bourne/state.1918.html, accessed 10/4/20.