West Prussia was a province of the German Empire until 1918, when parts of it were ceded to Poland; in 1945 the remainder was incorporated into Poland as well.
The German Empire, dominated by Prussia, was a powerful and influential state that existed from 1871 until the end of World War I. Led by the Hohenzollern dynasty, it encompassed a significant part of Central Europe and exerted considerable political and military influence over the region. Known for its disciplined military and efficient bureaucracy, it played a crucial role in shaping the geopolitical landscape of Europe during its existence.
The Roman Empire had the same.
Franco-Prussian War
After the Franco-Prussian War in 1871, Prussia was able to unite Germany, which had previously consisted of many small states, into one major empire.
No. The German Empire (led by the Prussians) was the unambiguous winner.
Otto von Bismarck
William I
It was taken by the German Empire at the end of the Franco-Prussian War.
It was in West Africa.
The empire in the west which was contemporary to the Han Dynasty was the Roman Empire.
You are assuming the West is an evil empire.
Firstly, the (second) German Empire was proclaimed in 1871 (following the Franco-Prussian war) and secondly, what are the options?