Hollywood's portrayal of the South has never been sympathetic…
digitaldiva — 9 years ago (January 01, 2017 04:31 PM)
Agreed, amcalabrese. The South has always been treated well in films like the original Birth of a Nation, Gone With the Wind, So Red the Rose, and many others. Only recently have filmmakers looked at the reality of slavery and presented a less romantic view of the South.