What do you define woke culture as? What's the difference between it and political correctness?
Woke is the catch-all term for radical far-left extremism, based around intersectional feminism, that views people not as individuals with free will, merely influenced by biology and upbringing, but as interchangeable blank slates entirely beholden to and defined by their immutable characteristics, such as race, sexuality and gender.
Woke also sees the whole world as a fight between these groups for dominance, and holds that all culture, entertainment and art are merely tools used by whichever group is currently doing best (read: straight white males) to keep everyone else down and maintain their powerbase. This is the sole reason anyone does anything.
It is a worldview defined by shallow superficiality, postmodernism and innate hatreds. The more 'oppressed' characteristics you have, the more help you need, as the more discrimination you will face.
It also leads to the idea that to defeat oppression you need to simply swap out the more dominant individuals for the more oppressed.
Since everyone is equally able to do any job, and there is no objective reality or quality to judge anything against (all views, experiences and beliefs being equally true and interchangeable), the woke seek to have as much diversity as possible in visible and influential positions, regardless of ability, qualifications or suitability, or even broad trends in any such group's interests, because they don't believe such things exist.
This is, of course, all utter bollocks, but it has social capital amongst left-wing elites, and big globalist corporations bloody love it, so it gets rammed down our throats regardless.