Do college-educated people become liberal or conservative?
When it comes to moral and religious issues, does a college education produce open-minded, tolerant citizens, or citizens who simply align with their often-liberal professors? In politics, do Americans follow their own moral compass, or are they heavily influenced by academics and popular culture? (Ref. Source 2)