Why do people need to be taught about sex?


Why is it that humans need to be told about sex, while seemingly every other species that reproduces sexually knows how instinctively? Am I just a fucking idiot?

Edit: once again I have proven myself to be an absolute imbecile.

In: Biology

8 Answers

Anonymous

It would still come naturally even without sex ed; it did before sex ed existed. Sex ed is supposed to be (though often isn't, because it's frequently taught badly) about safety, pregnancy prevention, and making sure kids get accurate information instead of school-hallway misinformation about something with such serious consequences.
