Why does everyone blame the Americans for everything that goes on in the world? They are nice, peace-loving, friendly people. I know it; I have been there, and I can say for sure that there is no place in the whole wide world as nice as the United States, and no people as friendly, sweet, fair, and conscientious as the Americans. If I had the option, I would love to move there permanently and never leave that heavenly country until the day I die. So please give them a break and stop blaming them for everything that goes wrong anywhere.