why do people need to be taught about sex?


Why is it that humans need to be told about sex, while seemingly every other species that reproduces sexually knows it instinctively? Am I just a fucking idiot?

Edit: once again I have proven myself to be an absolute imbecile.

In: Biology

8 Answers

Anonymous 0 Comments

Humans don’t need to be told about sex.

Humans need to be told about *safe* sex: how pregnancy happens, how STIs spread, what the social norms are, and so on.
