Do American schools teach Native American History lessons?
oldrooster said: They do, they teach it in history classes. There are many classes at the universities that you can take on a wide range of Native American subjects. I took one on battle strategies of chiefs. Good class.

Yeah, I had two semesters of Native American Studies — something you don't get from a high school education. Two tough semesters, but I did well enough to get on the dean's list. I still don't know everything, though. So what do you want to know about Native Americans, tateziwin?
Lexluther said: In my opinion, it isn't taught adequately (at most schools, at least) unless you take a college course in the subject. In high school classes, Native American perspectives tend to be sidelined to a great extent, and accounts of many events, like the fall of the Aztec empire or the Sioux Wars, are told with a definite white man's bias. Native prehistory is often overlooked entirely. But I think these things are in a process of change. Fifty years ago, Native American history wasn't taught at all, or was taught from the premise that Indian cultures really were inferior. So what we have now is a marked improvement, and I think it will continue to improve as time goes by.
Xen_Antares said: I agree. Usually we are taught that a bunch of naked heathens ruled the land until God Almighty sent the white man to kill them, take their land, and teach them how to build towers and such. I learned more by visiting websites for the Cherokee than from anything else.
Real Corona said: What the hell are you talking about? I've never heard that term before.
I have heard that some tribes want to be known as West Indians.
In Canada they're called First Nations.