Posted

Would the world we live in today be better off if the Confederate States of America had prevailed over the United States of America?

OR

Would we be better off if many of the cultural ideas existing in the South had become mainstream instead of those in the North?

Why or why not?

I contend that the world would not be as consumed by capitalist greed if, at the very least, Southern values and ideals had been accepted as common in the United States.

  • Author

Forget slavery.

We are not talking about slavery.

The War of Northern Aggression was not fought over slavery. We fought it for our rights.

And the North was just as racist as the South.

  • Author

I implore all to present their ideas in this debate.

We might discover something interesting.

I will also contend that it's a travesty that Spanish and French cultures have largely been 'lost' in the American/British imperialist conquest of the land known as the United States.

  • Author

I can see where you are coming from.

But I still think this country would be a better place if some aspects of Southern culture had been maintained as integral parts of the American culture we have today.
