Best Answer

Not in the way you might think. The United States did give (West) Germany a "hand up" with various kinds of aid to get back on its feet after World War II, and much of the country had been reduced to rubble by the war. Thanks mostly to the industriousness of the German people and the excellent leadership of Konrad Adenauer, the recovery was almost miraculous.

Wiki User

12y ago

Q: Did the US rebuild Germany after the war?
Related questions

How was Germany created?

Germany was re-formed after World War I, when the country had to rebuild, and it kept the name Germany.


Why did the US lend money to the USSR to help them rebuild after World War 2?

Because the USSR was an Ally that had helped defeat Germany.


Why did allied leaders insist Germany pay reparation for the war?

They wanted to weaken Germany so it could not rebuild its military.


Why did allied leaders insist that Germany pay reparations for war?

They wanted to weaken Germany so it could not rebuild its military.


What was the role of the US after the war?

The US often provided funds and materials to help rebuild what was destroyed by the war, but the US also stayed in countries to help them rebuild - possibly restructure - the governments.


Why did allied leaders insist that Germany pay for the war?

They wanted to weaken Germany so it could not rebuild its military.


How did the US treat Japan after the war?

The US helped rebuild its cities and economy.


Did US help japan in post war years?

Yes. US aid workers were sent to help Japanese civilians rebuild and replenish their supplies, and they helped for almost 4 years (1945-1949). Note: Germany was given the same treatment, except that in Germany both US civilians and soldiers helped to rebuild.


What happened to the US after it declared war on Japan?

Germany declared war on the US, and the US then declared war on Germany.


Did the US declare war on Germany in World War 2?

No; Germany declared war on the US first, and the US then declared war in return.


What day did the US declare war on Germany?

Germany declared war on the US in the morning, and the US declared war on Germany in the afternoon on December 11, 1941.