I started compiling a list of things we were taught in school that are complete bullshit, and I thought it would be good to get feedback from a variety of other Americans to see if there are some I'm overlooking or was never privy to. What are some things you were taught in public school that you now know to be complete bullshit, or just societal norms that are fucky? Here are some examples I could think of. I may write a book on this; never tried that before, not sure I have one in me, I'm pretty stupid in general, but man, the more I learn the more I see how dumb the average person really is.
Incest causes babies with birth defects (not condoning it, just wondering where this idea comes from and whether it's true)
Circumcision is good
The War on Drugs is a good thing (& what was the DARE program really about?)
Everyone is special or talented
Food Pyramid
What do you got? Any fond or not-so-fond memories of your indoctrination?
The blatant brainwashing about the Holocaust that was shoved down our throats every single year from fifth grade to graduation.
I'm not saying it was a complete hoax, I'm not agreeing with /pol/ to the extremes of their beliefs, but when you step back and look at the whole thing with an open mind it starts to get a little odd. The holocaust is not the most recent nor the most significant genocide in world history, yet it is the only one you are not allowed to question and it is the only one public education focuses on. There's an entire network of propaganda built around instilling one specific view point and silencing any questions. Seems a bit off to me.