Mandatory Staff Picks: Celebrating the Best Black Directors in Cinema

If there’s one thing about Hollywood that has become abundantly clear, it’s that we need diversity behind the camera: filmmakers who can astutely comment on race relations in this country while giving a voice to everyday African-Americans. Unfortunately, mainstream cinema has a tendency to sideline the stories of Black communities in favor of white-centric narratives. Fortunately, prodigies have still found a way to scratch, crawl, and fight through stereotypes, bringing their unique (and needed) perspectives to the big screen. From big-budget superhero movies to honest depictions of South Central Los Angeles, the following directors and their films have revolutionized, and continue to revolutionize, how, what, and why we watch.

Photo: David Crotty (Getty Images)

Mandatory Inspire: Celebrities Who Stand Up to Help the Black Community in Wake of George Floyd

Mandatory Reads: The Best Non-Fiction Books by Black Writers
