Originally Posted by alan
I would disagree.
Southern Democrats have historically been rather conservative. After World War II, during the civil rights movement, Democrats in the South initially continued to vote loyally with their party. After the Civil Rights Act was signed, even white voters who had become tolerant of diversity began voting against Democratic incumbents in favor of GOP candidates. Rising educational levels and growing prosperity in the South, combined with the national Democratic Party's leftward shift that began with the New Deal and a range of other socio-economic factors, led white voters to abandon the Democratic Party in large numbers and gave Republicans dominance in many Southern states. In my opinion, liberalism had everything to do with it, while race played a very minor role.