The Humanities Are Dead
The line between toppling statues and burning books is razor thin.
Of course, you’d need at least a cursory education in the humanities to understand why. But if recent events have taught us one thing, it’s that American education no longer includes the humanities.
The humanities—along with anthropology, sociology and related fields—are supposed to convey an appreciation for culture. Language, art, sculpture, drama, literature, architecture, fashion, design and poetry provide ways to communicate with the past. They teach us about the range of human difference that coexists with basic human sameness. …
America’s recent college graduates appear ignorant of such niceties. They see every figure, every icon and every expression solely through a contemporary lens. They simplify everything as either progressive or evil, and then move to strike down evil.