
American History 101

We all know the basic tales of how America came to be. We have heard the cute little rhymes that teach us the bright side of the past. However, just because "ocean blue" rhymes with "1492" does not mean Christopher Columbus was a good man. Just because nothing rhymes with "genocide" does not mean that one of the largest and longest-running genocides in history did not take place on American soil. There are many aspects of American history we do not teach our children in school, or even bother discussing as adults. Here are a few upsetting but vitally important historical facts about America that every American should be aware of.

1. The treatment of Native Americans

     - From a young age, American children are force-fed the idea that colonists and Natives were civil toward each other and even shared bountiful feasts that are still celebrated every year on Thanksgiving. In reality, the colonists tricked Native nations into signing over North American land before sending them on fatal forced journeys west, where they were confined to small patches of land.

2. Slavery and Racism Post Civil War: 

     - We learn about slavery in school, but not to the extent we should. It is something that is skimmed, something briefly mentioned, but not something we give fair attention. We have a tendency to downplay the horror of what went on during the time of slavery: rape, murder, whipping, and many other brutal forms of torture. We also downplay the Jim Crow era and the use of Confederate statues as a way to remind African Americans that they were unwanted.

3. Truth about Founding Fathers and Freedom for "All"

       - Many people hold the Founding Fathers in high regard and want to use them as role models for how we should still run our society. However, the Founding Fathers never intended for there to be equality for all people, only for white men who owned property.


Our Mission:   
 
Here on The American Reality, we aim to explore the truth behind the concept of the American Dream. We want to expose the exclusivity of success in America and use this truth to inspire the American people to come together as an inclusive group, accepting of all who are willing to accept others.