The Only British Territory Taken By The Nazis In WWII
Obtaining British territory was frightfully difficult for Germany
5 min read · Dec 29, 2021
After the Fall of France in 1940, Hitler’s Germany hoped to negotiate peace with Great Britain. The German government had long included Anglophiles, and many of its leaders, Hitler among them, did not want to come to blows with the United Kingdom…