After decades of underrepresentation in movies, the financial benefit of women starring in films is finally apparent: a recent study shows that gender and ethnic diversity on screen leads to higher profits for Hollywood.
- Notre Dame Coach Drops the Mic on the Lack of Women Leaders in Sports
- The Women Who Contributed to Science but Were Buried in Footnotes
- A Judge Ruled Requiring Girls To Wear Skirts At School Violates The Constitution
- Pfister announces 2019 artist in residence
- Karen Uhlenbeck Is First Woman to Win Abel Prize for Mathematics