Originally Posted by WBA1955
The Indians were always the bad guys in the early Westerns; then, from the seventies on, they were the good guys and the whites were the bad guys. It's not as red and white as that. Read Empire of the Summer Moon, A Fate Worse Than Death, or A Slave of the Sioux. The atrocities carried out by the Indians made me hate them just reading about them. Babies, men and young children were killed on the spot, usually in a cruel way, in front of the mothers and wives. The women were repeatedly raped and then either killed or sent to a camp, where they became slaves and were beaten by the Indian women. Some young boys were assimilated into the tribe to become warriors.
You can see why the early settlers hated them and wanted to exterminate them all.
But the Indians were warriors defending their homeland, and they treated the enemies of the tribe, including other Indians, the same way.
I have also read Bury My Heart at Wounded Knee and The Earth Is Weeping, which give the Native American view.
It's always best to see both sides; that's what my parents taught me.