I'm not really talking about the gradual decline of American influence; it's more that I've noticed a rapid decline over the past two years, in everything from relations with other countries to internal affairs.
I'd suggest your post is really just bias with a political slant. Not that American relations with other nations aren't going through a transition - yes, they are, and it might feel uncomfortable - but you conclude that's a bad thing. Is it really, though?
If a nation has to take a stand to put things on a more just footing, yes, it can create some turbulence. But wise leaders look to the end result they're seeking to achieve, not to temporary pats on the back from other nations that want to direct America's foreign policy to their own advantage.