The team behind Game of Thrones is now proposing a new show: an alternate-history series in which the South won the Civil War and perpetuated legal slavery into the present day. It might be a good show, if it consistently shows that slavery is an unforgivable evil and uses this alternate history to highlight real injustices. But do you trust them to make a socially conscious, critical analysis of contemporary America? I don’t. The people who made Game of Thrones have learned that sensational violence and sex sell. I predict many opportunities to show attractive naked black women on the auction block or serving white men while topless.
It also falls into the category of playing devil’s advocate, of “just asking questions”: rhetorical games with matters that affect people’s lives. That is inappropriate. That it’s two white guys proposing this thought exercise makes it all the more troubling.
But mostly, we have a dreadful track record of dealing with the Civil War’s legacy in movies. Go read that link; we’re subjected to the most awful romantic schlock about the Confederacy, which was apparently full of rugged, noble individuals fighting for their way of life and facing the aftermath of loss with dignified grace.
People still think Gone With the Wind was a great movie. I doubt that it was; I’ve tried to watch it multiple times at the urging of friends, and have never lasted more than 15 minutes before I’ve left the room.