Is the US an "empire"?
  We influence world politics and world finance, and maintain over 1,000 military bases.
  We have started wars in the name of peace, trying to bring about "something better" through violence.
   
  Empires oppress, police, exert force, and look after their own interests.
  All empires have followed the same arc: they rise and fall. Is that the destiny of the US?
   
   
  Can empires ultimately do good?
  Do truly great societies want to be empires?
  Can the US bring about world peace, world democracy?
  What is the word for a great, balanced nation of peace and altruism? Perhaps that word doesn't exist. Is anarchy the only answer?