Hollywood film directors are some of the world's most powerful storytellers, shaping the fantasies and aspirations of people around the globe. Since the 1960s, African Americans have increasingly joined their ranks, bringing fresh insights to movie characterizations, plots, and themes and depicting areas of African American culture that were previously absent from mainstream films. Today, black directors are making films in all popular genres while inventing new ones that speak directly from and to the black experience. Join us this week as we explore this topic on a deeper level and speak with some of our most respected and influential African American filmmakers, as well as some whose stances in Hollywood we may not agree with.