Monday, October 3, 2016

Colonial Period

    The first sign of racism in America was directed toward the Native Americans, who were forced off their lands by the Englishmen. The Englishmen enslaved Native Americans and sold them for money. After realizing they could not control the Native Americans, the colonists still needed people to work their land for free, so they bought slaves cheaply from Africa. They brought them over to America and forced them to work on their land; the Africans were too far from their homes to run away. At first, slaves were treated as indentured servants, but in the late 17th century the Virginia House of Burgesses passed a series of laws that recognized slaves as property. 


http://americanhistoryrules.com/divisionandreunion/the-origins-of-slavery-in-america/