why do people need to be taught about sex?

Why is it that humans need to be told about sex, but it seems every other species that uses sexual reproduction knows it instinctively? Am I just a fucking idiot?

edit: Once again I have proven myself to be an absolute imbecile.

In: Biology

8 Answers

Anonymous

Humans don’t *need* to be taught about sex. Put ten babies in a room and give them no outside human interaction, and once they reach puberty they’ll start figuring out sex pretty fast, because we have an instinctive drive to do it. In the animal kingdom (and humans before modern times), young creatures also have the advantage of seeing other members of their species doing it – nowadays, we tend to value privacy and don’t do that as openly in front of our kids, thankfully.

The reason we teach kids about sex isn’t about the actual mechanism of it, it’s more about all of the stuff that surrounds sex. Things like consent, safety, sexually transmitted diseases, pregnancy, all of those complex feelings that surround sex in a social context. That’s not stuff that’s instinctive, that’s stuff that humans have figured out to keep ourselves and others safer and happier. That’s what sex education is meant to be about, not just “put A in B, repeat.”
