

The history of Germany during World War II closely parallels that of Nazi Germany under Adolf Hitler, who came to power in 1933. From that point onward, Germany pursued a policy of rearmament and confrontation with other countries. During the war, German armies occupied most of Europe; Nazi forces defeated France, took Norway, invaded Yugoslavia and Greece, and occupied much of the European portion of the Soviet Union. Germany also forged alliances with Hungary, Romania, Bulgaria, and later Finland, and found collaborators in several other nations. The German defeat at the Battle of Stalingrad in 1942–43 is widely regarded as the decisive turning point that swung the tide of the war against Germany and her Anti-Comintern allies. The Second World War culminated in Germany's unconditional surrender to the Allies, the fall of Nazi Germany, and the death of Adolf Hitler.
