America Changes Forever
The United States changed forever on December 7, 1941. President Franklin Roosevelt called it "a date which will live in infamy." Although it was a dark day in American military history, it forced the United States to change its political policies, and as a result the US became a world power. Prior to that day, Americans had little concern for the rest of the world. In fact, Americans were for the most part isolationists who voted in politicians intent on keeping America isolated from world affairs. These politicians, in turn, did not go against their constituents and made far-reaching promises to keep America out of any war, especially the one raging in Europe at the time.

World War I had left Americans feeling somewhat cheated, even though American intervention had helped the Allied powers win the war. For the most part, Americans only saw the lists of dead and wounded, and veterans had to fight hard for any benefit or reward that had been promised to them. Some veterans were even shot and killed in Washington, DC, when they gathered there to demand their promised benefits. This caused Americans to distance themselves from any intervention when Europe went to war again.

The events of December 7, 1941, were different. In the last war, America had not been directly attacked, but this time the country was caught by surprise, and Japan crippled much of the American Pacific Fleet at Pearl Harbor, Oahu, Hawaii. This put a different feeling about war in the hearts of Americans, and the country that had no intention of fighting a war now saw millions of men volunteering for the military to seek revenge. Thus, in a matter of minutes, America changed from an isolationist nation into a nation bent on war.

Even though Europe was not involved in the Japanese attack, the Axis powers (Japan, Germany, and Italy) had pledged to declare war on any nation that declared war on one of them. Therefore, Germany and Italy declared war on the United States shortly after America's declaration of war on the Empire of Japan.

America fought the war and came out victorious. After seeing the atrocities committed by the Axis powers, Americans vowed to intervene whenever circumstances called for measures to stop genocide and other such war crimes. America went from hiding from war to a nation intent on policing the world. The Cold War began soon after World War II ended, and America found itself fighting all over the world to stop the Communists from overtaking the free world.

What Japan did on December 7, 1941, not only changed America but inevitably changed the entire world.

Now the War on Terror is America's fight, keeping the nation on the front lines of warfare to this day.