How Work Has Changed for Women in Corporate America Over the Last 10 Years

Ten years ago, I was feeling burned out after leaving a corporate job in the technology industry, where I had faced and witnessed bias, racism, and sexual harassment. The prevailing narrative around me about gender inequality was that women simply weren't driven or confident enough to succeed in the workplace. Against that backdrop, watching ambitious, brilliant women colleagues, particularly women of color, face such inequities created a sense of cognitive dissonance. And yet the "women need to do better" narrative persisted.




