Since the end of the Cold War, the world has watched as the United States became not merely the world's only superpower but what the French began calling a "hyperpower." Now, with the United States asserting its will and power on such issues as Iraq and the war on terror while rejecting constraints that the international community tries to place on it, some suggest that the term American empire is more appropriate. If America does have an empire, it is not based on territorial expansion, as past empires were. So what is it based on? And would taking on the role of imperial hegemon be good for America and the world?