Or is it in the sense that the only reason the Nazis became a thing is because the victorious powers ground the German people into the dirt after WW1, and Nazism was how they recovered some dignity?
Like, once again, cruel and arrogant policy, this time American, is driving people towards Nazism, and we're going to die because of it?