
What were the positive effects of WWII in America?

Best Answers

World War II pulled America out of its isolationist era. When the Great Depression struck the American economy, the country focused only on its own problems and did not concern itself with foreign affairs. After the attack on Pearl Harbor on December 7, 1941, America transitioned its massive economy into a total war economy.

From "The Effect on America" by Rahma, Ruqayyah, Yuser, and Raadia: WW2 was one of the bloodiest wars known to mankind. Although millions of people were killed, positive effects still remain. WW2 opened new windows of opportunity for many, including women and minorities.

Effects of WW1 on America, Fact 30: The impact and effects of the Great War on America were extremely diverse and directly led to the period in history from 1917 to 1920 known as the First Red Scare, and to the emergence of the 1920s Ku Klux Klan.
