I started compiling a list of things we were taught in school that are complete bullshit, and I thought it would be good to get feedback from a variety of other Americans to see if there are some I'm overlooking or was never privy to. What are some things you were taught in public school that you now know to be complete bullshit, or just societal norms that are fucky? Here are some examples I could think of. I may write a book on this; I've never tried that before and I'm not sure I have one in me. I'm pretty stupid in general, but man, the more I learn, the more I see how dumb the average person really is.
Incest creates retarded babies (not condoning it, just wondering where this comes from and whether it's true)
Circumcision is good
The War on Drugs is a good thing (& what was the DARE program really about?)
Everyone is special or talented
What do you got? Any fond or un-fond memories of your indoctrination?