Since the end of the Cold War with the Soviet Union, the United States government has held an uncontested position of world domination, a de facto one-world government under which it has subjected the planet to its dictates. This is now changing; recognizing that change is the first step toward creating a brighter future, but in order Read More…