The culture wars of recent decades have largely pitted the growing secularization of America against its more traditional Christian culture. The secular left has sought to push religion out of the public square, whether in schools, on public monuments, or in public expressions such as coins, pledges, and holidays. Meanwhile, the religious right has pushed back, becoming a major political and legal force. The culture wars have played out as a series of battles in elections, legislatures, executive offices, and the courts.