After decades of inadequate representation in movies, the positive financial impact of women starring in films is finally apparent. A recent study shows that gender and ethnic diversity on screen leads to increased profits for Hollywood.