English Wikipedia - The Free Encyclopedia
German nationalism
German nationalism is the idea that Germans constitute a nation, and it promotes the cultural unity of Germans. Its earliest origins lie in the Romantic nationalism that emerged during the Napoleonic Wars, when Pan-Germanism began to rise. Advocacy of a German nation became an important political force in response to the invasion of German territories by France under Napoleon. After the rise and fall of Nazi Germany, which persecuted Jews and others during World War II, German nationalism came to be widely viewed within the country as taboo. During the Cold War, however, a moderate German nationalism emerged that supported the reunification of East and West Germany, which was achieved in 1990.



© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and under the Creative Commons Attribution-ShareAlike License