- Nov 21, 2011
Who "started" the "institution" of slavery, as far back as history goes? Who benefited the most from it among the ancients? Who really benefited the most from slavery in the US? Most slaves in the US (obviously) were black or mulatto, with some whites. Who really benefited: whites, some blacks, some Native Americans, some Jews, or all of the above? How did the concept of racism that we have now in the States evolve from the "institution" of slavery? Was the idea that Africans were not just slaves but, in their enslavers' minds, "heathens who needed to be Christianized" one reason?
I wanted to know all of this because the history of black African people in the US began in the 1600s with enslavement, indentured servitude, and the Middle Passage. I was told that slavery was legally codified for less than a century, yet it continued under law until 1865. Maybe I need to read up on state law and the Constitution some more. Anyway, did most free white people really benefit from slavery, or was it just the enslavers and plantation owners who held numerous slaves of all "races" and colors?
I have all of these questions because Hollywood, and our schools especially, portray blacks as below everyone else in education and socioeconomic standing. However, the true history of slavery has not been written. Slavery was, and always will be, evil and vile, whoever was enslaved, whether of "African" or "European" descent. If the truth is that many of the people enslaved were black, why does Hollywood show only black slaves and white masters, when history was more "racially" and "ethnically" diverse than just "black" and "white"?