The Truth About Being a ‘Woke’ White Person
It seems it’s once again time to revisit what the word actually means, and where it came from to begin with.
If you live on a diet of Fox, OAN, Newsmax, Rumble, or any of the (plentiful) other right-wing media outlets, you likely despise the word ‘woke’ and everything it represents. Well… everything you think it represents, that is. If pressed to define what ‘woke’ actually means, you’d almost certainly have trouble finding words that constitute a coherent definition. After all, it probably just boils down to a feeling. An instinct. “You just know it when you see it,” you might offer.
Or maybe you’d have a very specific definition of the word. Something like “liberals,” or “people with a socialist/communist agenda.” Perhaps ‘woke’ is to Republicans what MAGA is to Democrats.
To be fair, language does evolve, and usually for the greater good of society: a forward-thinking, progressive shift that seeks to be inclusive as our cultural norms change. These gestures, though seemingly difficult for some people to accept, reflect society’s impulse to do better. To destigmatize what never should’ve been stigmatized in the first place.