World views US 'more positively'
A poll for BBC World Service suggests people are viewing the US more positively than in recent years.
Ref. https://news.bbc.co.uk/1/hi/world/a...cas/7324337.stm
I believe the US has hit some serious bumps that have lessened its positive image in the world's eyes. I am not sure we are actually being viewed more positively; rather, less negativity is working in the US's favor right now. What helps, I think, is the assistance the US has given other countries in their times of need and in overcoming poverty.
To answer this properly, one would have to write a thesis after decades of research. Generally, I believe the influence of the US is very positive. In fact, without the US, wars would have been lost and certain countries would have lost their sovereignty. At the same time, we have badly mishandled certain specific areas.
The answer to the original question depends on where you live, your culture, your religion, and many other things. There cannot be one easy answer. If you were born in North Korea, you will hardly see the USA as anything but evil. If you're from Mexico, the USA may look like heaven on earth.
I think the USA could be a positive influence in the world if it were consistent in its dictates. For instance, you can't ask the rest of the world to do something and then exempt yourself from it.