I'm not saying this impression is entirely fair; at the very least, it doesn't have to be. But the media and the broader culture often seem to portray sex, and boys in particular, as destructive to women. It's as if the absolute minimum requirement for being kind to women is not having sex with them.