The lie that America is an empire-building imperialist power has been around since the founding of our nation. It's one of those situations where you help people and they turn around and stab you in the back for it (the French come to mind). I could ramble on about how many countries we've liberated, and how other nations have invaded, conquered, and destroyed countries, but Bernie over at Plank's Constant has already done it for me.
While he focuses specifically on the Muslim takeover of countries, his descriptions of what America has done for the world - and then been vilified for - are right on the money. Just think: there's a ton of Liberals in this country, right now as you read this, who think our country is the most hate-filled nation on earth, an imperialist empire bent on crushing all others for no reason at all.
... he is confused about America when he writes "that the very foundation of the americas is conquest of foreign land for white/christian domination over native dwellers." The British, Spanish, and French came to the Americas in conquest, and yes, for white/Christian domination over the natives. But America was founded on different principles, so when we fought wars, contrary to his argument, we did not impose white/Christian beliefs on the vanquished.
Japan in 1940 was 85% to 95% Shinto or Buddhist. In the decades since the American occupation, those numbers haven't changed. 64 years after surrendering to us, there is no white/Christian dominion over Japan. The culture has remained Japanese, they still write in the same script, and the only real difference is that a thousand years of belligerent dreams of world conquest have been redirected toward productive and peaceful ventures.
Read the rest