

Question:

What major war do you consider the most important in U.S. history, and why?

A) Revolutionary War B) Civil War C) WWI D) WWII E) Korean F) Vietnam G) the war on terror.


Best Answer - Chosen by Asker: Importance is a fairly broad term, given that it can be applied to different arenas that are difficult to compare. For example, the Revolutionary War was important in that it was the official beginning of the U.S., but any major event in the past is a factor in future wars, and the Revolutionary War happens to be the first on this list. (By extension, the French & Indian War of the 1750s, though fought in colonial times, could be invoked by the same logic.)

The Civil War was important because it established the U.S. as a unified nation. A couple of points worth noting: before the Civil War, the name "United States" was treated as plural, but afterward it became a singular collective noun. Also, when Abraham Lincoln gave his famous Gettysburg Address, he intentionally used the word "nation" to refer to the whole, which was a bold move for the time.

World War I was not so much important to U.S. history as it was to Europe. Still, it produced a new sense of isolationism and a national paranoia over communism. I tend to consider that minor.

World War II was definitely the most important war of 20th-century U.S. history. It turned the U.S. into the world power (along with the Soviet Union) as Europe's global status declined. The wars after that (Korea, Vietnam, Persian Gulf, Iraq) all stem from the position the U.S. has held since WWII, and were at most hiccups in the framing of the American experience.

So I'll say the Revolutionary War, Civil War, and WWII form a three-way tie. However, I'll proffer one not on the list. The War of 1812, though generally overlooked, is a very important war in U.S. history. The results of that war led to the U.S. becoming the only real power in the Western Hemisphere, as it freed the country from being caught up in European affairs. Another result was that it allowed the U.S. to expand westward to the Pacific, since the British stopped arming Native Americans in what is now the Midwest in exchange for Canada's safety from U.S. invasion. The War with Mexico is a direct result of this, and the Civil War is a child of it as well. Perhaps the most telling aspect of the War of 1812 is that before the war, an "American" was a First Nations individual, while whites were generally considered something else. After that war, the term applied to the whole of the U.S., and thus our true national identity was born.