Worth it Wednesday

America

Once again, life has taken a different course than normal. This week is based solely on what's happening in America and something I've become passionate about learning and uncovering: American history. No, not that watered-down, whitewashed half-truth shoved down our throats in school. I'm speaking about what really happened within this nation, things that still have ramifications to this very day.

Initially, I took what was told to me as truth. As a child, that's what you're taught. Why would someone knowingly deceive a child into believing something that's false? I knew that books attempted to make light of slavery, but many of the defining moments in history also portrayed Europeans as the protagonists of the story, and that's not accurate.

Reading is essential here!! I am the first person to admit that I don't do enough reading in this area, but only because I thought it was boring. Now I'm finding out that it's honestly just horrifying. As a kid, I remember we would call someone who gave you something and then wanted to take it back an "Indian giver." I continued to say it out of ignorance, but recently I asked my husband why that was even a thing. If you peek into history, you'd know that the Indians gave as a sign of friendship, only to be taken advantage of, stolen from, and killed (by disease or decision). This makes me sad. And the idea of pushing them onto reservations because their lands were stolen from them (we have to admit this, because it's tiring acting as if lands were just "discovered") is by no means compensation for what was taken. Sacred ancestral lands still have to be fought for because the government cares more about raping the land of resources than respecting it. This is just one instance of an ethnic group experiencing horrible treatment.

So here's my connection to where we currently are: we were lied to and led to believe in something that didn't exist. Once we admit this fact, we can move to correct the wrongs. Nothing signed at the beginning of America was designed to include or promote anyone other than White men. White women were able to come along because men needed them. The horrible practices instituted to sideline Black Americans since the end of slavery are no secret. If you aren't sure what I'm talking about, please go find books related to this subject (usually I'm game to throw out so many book titles, but everyone has access to Google, so if you really want to know you'll search for it). So many false narratives are believed to be true that they have tainted our current society. Acknowledgement is the only way we get better from here.

For example, the Civil War has some foundational footing in slavery (I don't care if your family fought for the South... it is what it is). Once again, reading for information without all the emotional connection would serve many people well and show them that what they've believed is wrong. I don't expect everyone to change their mind, but sometimes just encountering different information can produce a better result for future generations. Systems like redlining, and the conditions that made the Green Book necessary, were established to keep Black Americans "in their place," yet we are still expected to say the Pledge of Allegiance that speaks of equality and unity. At some point, the irony gets old. Songs created about this great land were made by White men for White men... can we just go ahead and quit skirting around this so we can heal and move on?

We shouldn't be arguing about this anymore... there's plenty of evidence, but we don't have to stay here!!! We will never move on if someone doesn't start admitting that our American history is one of murder, thievery, and lies. Books need to quit painting a picture of peaceful encounters and a pure desire to make life better for everyone. If we could just tell the truth, we could be more impactful in making the necessary changes. But we won't read. We would rather watch a news channel for information (which is biased based on political affiliation) instead of forming opinions and thoughts based on facts presented in proper context. It's all about entertainment instead of true encounters that produce internal change, which inevitably aids in creating social change. We the people can do this, but it takes each of us being honest and bringing truths to the table, not traditions.

Is there anything you've learned about history that has changed your perspective?
