In December 1941, in solidarity with Japan, Germany formally declared war on the United States, thereby bringing the US into the conflict in Europe. Germany was under no obligation to do this, and given that the American military went on to make life very difficult for Germany, it raises the question: why did Germany declare war on America? To find out why, watch this short and simple animated history documentary.