Here’s a great look at the portrayal of Native Americans in classic Hollywood movies:

[youtube=http://www.youtube.com/watch?v=_hJFi7SRH7Q&w=475]

The video does a great job highlighting the portrayal of American Indians as violent, uncivilized, and animalistic, and the effect that has on Native American moviegoers. I did notice, though, that all the movies shown were fairly old, and that such blatantly racist portrayals would have a harder time making it to the screen today. But does that mean the problem has actually gone away, or has Hollywood just stopped portraying Indians at all, negatively or positively? Or have more subtle, insidious stereotypes slipped in to take the place of what we see here?
