In general, post-war feelings were very positive. Most Americans believed that the war had been justified on both major fronts, Europe and the Pacific, and were happy with the outcome. There was a sense of exuberance, and for decades the US rode on high hopes for the future. The Allied forces were more than decent in their treatment of the defeated nations: the US helped Japan and Germany rebuild and heartily welcomed them back into the world culture and economy. These actions also served American interests, but few will deny that US aid was crucial in restoring the parts of the world scarred by the war.