Hey, as a millennial I've only once worked at a job where there was real honesty. I've had multiple coworkers who treated me with respect and were shocked when I got laid off over a failure at levels higher than I could control. I've also had a manager get in trouble because I answered a question at maybe the wrong time.
So given that context, are there jobs out there right now that actually want honesty from their employees, plan to be honest with their employees, act with integrity, and follow through on their promises to both their employees and their shareholders?
I'd happily switch fields if it meant getting leaders who acted with integrity and ate the cost of their own mistakes instead of laying people off and further burning out the employees who remain. Especially if it meant I could work on making things more accessible, or on more intuitive UX, etc.
Only one of the companies I've worked at ever felt like the leadership had skin in the game. Some great managers, but...a few good managers doesn't create a mentally healthy work culture.