Whether World War II made or merely marked the transition of the United States from a major world power to a superpower, the fact remains that America's role in the world around it had undergone a dramatic change. Other nations had long recognized the potential of the United States. They had seen its power exercised regularly in economics, if only sporadically in politics. But World War II, and the landscape it left behind, prompted American leaders and the Congress to conclude that they had to use the nation's strength to protect and advance its interests.