December 11, 1941: Germany Declares War on the US
After Pearl Harbor, war between the US and Nazi Germany was inevitable, but Hitler relieved FDR of the tricky business of turning the attention of Congress and the American public, riveted on Japan, toward Germany…