why do people need to be taught about sex?


Why is it that humans need to be told about sex but it seems every other species that uses sexual reproduction knows it instinctively? am i just a fucking idiot?

edit: once again i have proven myself to be an absolute imbecile.

In: Biology

8 Answers

Anonymous 0 Comments

Human society requires a lot more family planning than animal life does. People would figure sex out without formal education on the subject — at minimum, they'd discover that it's pleasurable. But in the process there would be a lot more unwanted pregnancies, which is why we teach it deliberately.
