Going back in history, we had women winning the right to vote early on, then women entering the workforce during WWII, then the higher education of women, and then women participating more fully in the economy and in business, the professions, education, you name the subject... but the missing link has always been this: is there quality, affordable healthcare for all women, regardless of what their family situation might be?