No. Aside from a brief period of peace, which only led to an even worse conflict in World War II, and some experience in warfare as a nation, Australia gained little from World War I. Some would argue that Australia emerged as a more independent nation, but on the whole the number of lives lost during the war was not worth it.

Wiki User

12y ago
