I work in a female-dominated profession. Nurses have a reputation for cattiness, much of it earned. Some of the events depicted in my story “Nurses Eat Their Young” (link in my profile) were based on my actual experiences as a new nurse xxx years ago.
Workplace culture is much different today. At the last two hospitals where I’ve worked, employees undergo mandatory training in sexual harassment, bullying, racial and religious diversity, LGBT awareness, dealing with anger, and so on. They are strongly encouraged to report bullying, harassment, and the like. I also teach, and I’m careful about how I interact with students, lest my verbal testing of them be interpreted as an attempt to humiliate them by making them look stupid.
Sometimes I think it can go a little too far. I’ve seen employees fired for offenses I personally thought should have been handled differently. Overall, though, it’s a good thing. Having lived through working with women who wanted to kill each other, I can say that kind of animosity creates a toxic, miserable, unproductive environment, male fantasies notwithstanding.
I realize I’m speaking about hospitals and, I assume, other large employers. The strippers at Dino’s Bar and Grill may operate under a different workplace ethos.