I think empowerment of women is exactly what's happening now, with women being portrayed as human beings, not just in black and white. Men can be the anti-hero all the time and it's cool, but when women are, they're twisted or messed up or something is wrong with them. I think it's just about portraying women in the world as equals to men, and vice versa.