Hollywood's portrayal of the South has never been sympathetic…
-
agracier-574-436194 — 9 years ago(December 13, 2016 04:13 PM)
It was a region of vast disparities between wealth and poverty. Wealthy plantation slave owners convinced dirt-poor farmers and laborers that it was in their interest to give their lives so the rich could continue to enjoy their ill-gotten wealth.
You can't have been all that smart to have fallen into that slave owners' trap -
digitaldiva — 9 years ago(January 01, 2017 04:31 PM)
Agreed, amcalabrese. The South has always been treated well in films like the original Birth of a Nation, Gone With the Wind, So Red the Rose, and many others. Only recently have filmmakers looked at the reality of slavery and presented a less romantic view of the South.