Generally, no. It was a combination of things. Here are some of the major contributing factors:
- the outcome of World War I and the Treaty of Versailles. Germany lost the war, and the settlement left her in a significantly weakened position: France occupied the Ruhr, a major industrial region of western Germany; Germany lost all of her overseas colonies in Africa and Asia; and the Treaty of Versailles imposed harsh reparations and restrictions upon Germany.
- the unstable Weimar Republic. Germany went from being a monarchy to a parliamentary republic overnight, with no transition time. This first attempt at democracy was poorly planned, and the government struggled with gridlock as politicians and their parties generally refused to cooperate with one another.
- extremist politics. During the post-World War I period, extremist political parties, from the right-wing Nazis to the left-wing Communists, surged in popularity. This further exacerbated the instability of the Weimar Republic, as extremists quite literally fought pitched battles against each other in the streets of German cities.
- the Great Depression. American lenders had helped Germany recover from World War I, so when the U.S. suffered Black Tuesday in 1929 and fell into the Depression, Germany was hit very hard as well: the loans propping up the German economy were called in, and the economy was shattered.