Why do people need to be taught about sex?


Why is it that humans need to be told about sex, when seemingly every other species that reproduces sexually knows how instinctively? Am I just a fucking idiot?

Edit: Once again I have proven myself to be an absolute imbecile.

In: Biology

8 Answers

Anonymous

People don’t need to be taught about sex in order to do it; that part is instinctual. What we teach is how human reproduction works physiologically, how pregnancy happens so you can avoid it if you want to, how diseases are spread and how to avoid them, and so on.
