Americans don't really pay much attention to World War I. We came into the war late and lost comparatively few men. But World War I was a great tragedy for Europe, and it put an end to an optimistic era of belief in the unstoppable progress of civilization. European colonies saw their imperialist rulers tear themselves to pieces in the stupidest war in history, and knew they did not need to submit to that rule any longer. Germany, France, and England each lost a generation of young men. The English became pacifists. The French, on whose soil much of the war was fought, lost the will to fight for a generation. And the Germans, poor and suffering after the war like everyone else, but punished for starting it with crushing reparations under the Treaty of Versailles, began to accumulate the resentments that culminated in Hitler and World War II. Here's a photo, taken in Munich in 1914 as crowds celebrated the German declaration of war, in which someone later spotted Hitler: