After decades of inadequate representation in movies, the positive financial impact of women starring in films is finally apparent: a recent study shows that gender and ethnic diversity on screen leads to higher profits for Hollywood.
- Black History Lunch: One Hollywood Writers Room’s Quest to Diversify Staff Meals
- Why Women Wear White: A Brief History of Political Fashion
- Women, your inner circle may be key to gaining leadership roles
- Navy To Launch First All-Female Flyover To Honor Pioneer Pilot Rosemary Mariner
- How Close Should an Activist Icon Get to Power? An Interview with Malala Yousafzai