Liberalism
America’s Woke Liberals and Muslim Millennials
For decades, the American left has pushed faith to the margins of political life, even as the right has embraced it in the culture wars. But post-9/11 culture wars and foreign wars are forging alliances between liberals and Muslim millennials that are bringing faith back to the center.