
Roosh V forum members baffled that fat woman doesn’t welcome sexual harassment

Online dating: It doesn’t always work like this.

For a certain subset of horrible men, there are few things more infuriating than the fact that women they find undesirable can turn down men for sex. For this upsets their primitive sense of justice: such women should be so grateful for any male attention, these men think, that turning down even the most boorish of men shouldn’t be an option for them.

Consider the reactions of some of the regulars on date-rapey pickup guru Roosh V’s forum to the story of Josh and Mary on the dating site Plenty of Fish. One fine December evening, you see, Josh decided to try a little “direct game” on Mary.

That’s what the fellas on Roosh’s forum call it, anyway. The rest of us would call it sexual harassment.

Josh started off by asking Mary if she “wanted to be fuck buddies.” She said “nope,” and the conversation went downhill from there, with Josh sending a series of increasingly explicit comments to Mary, despite getting nothing but negative replies from her.

After eight messages from Josh, with the last one suggesting he would pay her $50 to “come over right now and swallow my load,” Mary turned the tables, noting that she’d been able to deduce his real identity from his PoF profile, and asking him if he wanted her to send screenshots of the chat to his mother and grandmother. He begged her not to.

As you may have already figured out, from the fact that we’re talking about this story in public, Mary did indeed pass along the screenshots, and posted them online.

Poetic justice? Not to the fellas on Roosh’s forum. Because, you see, Mary is … a fat chick.

While dismissing Josh as a “chode” with “atrocious game,” Scorpion saved most of his anger for the harassed woman:

Look how much she relishes not only shooting him down, but damaging his reputation with his own family. She’s positively intoxicated with her power. Simply spitting bad direct game is enough to unleash her vindictive fury.

“Bad direct game.” I’m pretty sure even Clarence Thomas would consider what Josh did sexual harassment.

At any point, she could have pressed a single button and blocked the man from communicating with her, but she didn’t. She didn’t because she enjoys the feeling of power she gets from receiving attention from guys like this and then brutally shooting them down. It makes her feel much hotter and more desirable than she actually is in real life. She’s not there to meet men; she’s there to virtually castrate them for her own amusement.

I’m guessing here, but I’m pretty sure that nowhere in Mary’s profile did she encourage the men of PoF to send her explicit sexual propositions out of the blue. And I’m pretty sure she didn’t hold a gun to Josh’s head and force him to send a half-dozen sexually explicit harassing messages to a woman he didn’t know.

Athlone McGinnis also relies heavily on euphemism when describing Josh’s appalling behavior:

I don’t think its primarily the revenge she’s after, its the validation. She is enjoying the power she has over this guy and wielding it brutally because it shows she can maintain standards despite her weight and the doubtless numerous confidence issues that stem from it. In blowing up this guy for being too direct in his evaluation of her sexuality, she affirms the value of her own sexuality.

Oh, so he was just being “direct in his evaluation of her sexuality.”

In short: “I am wanted, but I have standards and can choose. I have so much agency despite my weight that I can go as far as to punish those who approach me in a way I do not like rather than simply blocking them. I’m teaching them a lesson, because I’m valuable enough to provide such lessons.

So apparently in Mr. McGinnis’ world women who are fat aren’t supposed to have agency? They’re not supposed to be able to choose? They’re supposed to drop their panties to any guy who offers to be their fuck buddy or tells them to “suck my dick?”

Also, I’m a victim bravely standing up against online bullying/harassment-look at me!”

Yeah, actually, she is. Get used to it, guys, because you’re going to see a lot more of this in the future.

This isn’t just a laughing matter for her. She needs to be able to do this in order to feel worthwhile. She has to be able to show that even she is able to maintain standards and doesn’t have to settle for just any old guy asking for any old sexual favor simply because she resembles a beached manatee.

And it’s not a laughing matter for you either, is it? You’re actually angry that a woman said no to a sexual harasser — because you don’t find her attractive.  And because Josh — from his picture, a conventionally attractive, non-fat fellow — did.

Mr. McGinnis, may a fat person sit on your dreams, and crush them.

1.1K Comments
Argenti Aertheri
11 years ago

*blinks* oh

So, uh, I would’ve just solved it. But totally justifies torture. Lovely.

pecunium
11 years ago

No… see, you had a momentary discomfort. If we give it a value of, say, 0.000 000 000 01 pain units.

Now there are 6,000,000,000 people on the planet. If each of them has it happen to them 10 times this year, then there will have been 0.6 pain units. If we project that back into the past, and on into the future, and we can prevent all of them (using Timeless Decision Theory), then we will have a vast number of pain units.

But the cost in inflicting 1,000,000,000,000,000 pain units to a single person,

From the standpoint of “rational ethics” this is a fair trade.

This is what passes for reasoning. I can say this isn’t quite representative of his argumentative skill in person. It has more going for it. It’s better structured, and the conclusion more logically follows the premise.

What’s scary is the things he extrapolates from this level of dissociation between means/ends and the question of the desirability of the ends.

The problem is that he’s created what is fundamentally an unfalsifiable assumption (there will be A Singularity; it will be unfriendly if he is not involved in shaping the creation; his “foundation” is effective in so shaping it), assumed its truth as one of his “priors,” and then moves on as if the a priori has been upheld in the real world.

It’s illogical. It’s religious. It’s irrational.

pecunium
11 years ago

Sorry, I slipped.

The cost of inflicting 1,000,000,000,000 pain units on one person will prevent all that (aggregate) pain, which outweighs the individual pain.

The thing to remember, it’s a metaphor. You are supposed to know this is really about how much of your spare cash you donate to Yudkowsky (it’s related to the fucking basilisk). So you aren’t really supposed to think torturing one dude is ok, you are supposed to see it as a parable about how important it is to make Yudkowsky rich.
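A quick aside on the arithmetic being parodied here: using the figures the two comments above give (a per-speck pain of 0.000 000 000 01 units, 6,000,000,000 people, 10 occurrences a year, and 1,000,000,000,000 pain units of torture), the ledger works out roughly as in the sketch below. The span of years projected into the past and future is an invented placeholder; the comments only say it yields “a vast number” of pain units.

```python
# Back-of-the-envelope sketch of the "dust specks vs. torture" trade described
# above. Per-speck pain, population, occurrences, and the torture figure come
# from the comments; the projection span is invented purely for illustration.

speck_pain = 0.000_000_000_01        # pain units per momentary discomfort
people = 6_000_000_000               # people on the planet
specks_per_person_per_year = 10      # occurrences per person this year

torture_pain = 1_000_000_000_000     # pain units inflicted on the one person

pain_per_year = speck_pain * people * specks_per_person_per_year   # ~0.6 units
years_projected = 10_000_000_000_000   # ASSUMED span of "past and future"

aggregate_speck_pain = pain_per_year * years_projected

# The "rational ethics" move: because the aggregate exceeds the single torture,
# the trade is declared fair.
print(f"aggregate speck pain:  {aggregate_speck_pain:.1e}")
print(f"torture of one person: {torture_pain:.1e}")
print("declared a fair trade?", aggregate_speck_pain > torture_pain)
```

The whole trick is in the projection: pick a large enough span of people and years and any single atrocity comes out as the “cheaper” side of the ledger.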

freemage
11 years ago

Ah, yes, Asshat Utilitarianism. Often practiced by Asshat Atheists, who think they’ve found a way to ‘reason out’ morality. It doesn’t really work, of course, but D-K kicks in and they can’t see why they must, for instance, design these scenarios with the acceptance that they may be in the losing category (ie, in the example above, the guy being tortured), and do so honestly.

pecunium
11 years ago

freemage: That’s the beauty of this one (and there is lots of local argle-bargle, don’t even try to figure out TDT, it requires a Singularity to function; but since The Singularity will happen, we can move on as if it were a proven thing): only one person in all of time, in the past or the future, ever needs to be tortured, and you get to choose who it is.

As I said, it’s a parable, and the message is perverse, in all regards (because it then gets extended to other anti-social ideas, and the end trumps the means, always).

Argenti Aertheri
11 years ago

“it’s related to the fucking basilisk”

No, you must never speak of that! Knowledge is dangerous! (I am not being snarky, I am nearly literally repeating Yudkowsky)

That the singularity will happen is one of those assumptions that makes me hate his “statistics” — he convinces people of it, on the premise that it’ll prevent them from dying, and then goes on to use that desire to prove that the goal is a sure thing.

And torture is like…should be Godwin 2.0

katz
11 years ago

Wait, we’re torturing the friendly AI now?

Lesswrong aside, in my experience anyone who’s totally obsessed with the Singularity is a strong Dunning-Kruger candidate.

Argenti Aertheri
11 years ago

I wish. No, we’re torturing some dude because said torture will prevent millions of people past and present from the minor annoyance of having to blink away some dust.

kittehserf
11 years ago

Cheat death?

Pardon my bias, but being stuck on this planet forever, especially if croutons like this mob were in charge, would be a good definition of living death. I’ll opt out, thanks, got better things to look forward to.

Argenti Aertheri
11 years ago

Kitteh — to quote one of the vampires in my game…

“Death is the ultimate dilemma and integral to the beliefs and behavior of every culture. Life is borne on the corpses of the dead. Without death, there would be no motivation to accomplish anything. The only emotion would be existing. Life would be pestilential and agonizing.”

In other words, I agree.

kittehserf
11 years ago

Certainly earthly life! I don’t fancy being separated forever from those I love best, who’re all over the other side. Created endless life here and you’d have to solve the problem of ennui, for starters. Plus you’d have to stop humans breeding, which I doubt most would want.

kittehserf
11 years ago

Ack – create, not created.

LBT
11 years ago

My only concern regarding dying is what’ll happen to the rest of my system, and whether I’ll finish the stories I want to.

I fully expect to be on my deathbed, rasping, “Almost… done…” as I frantically scribble out my last sentences of aliens and dragon fiction. Then I will die with a smile on my face, my magnum opus at long last completed.

I personally don’t take the idea of the Technological Singularity very seriously. I mean, come on, we can’t even say exactly how the brain works! Good luck recreating one.

freemage
11 years ago

Y’know, I actually think of the Singularity as a possibility, somewhere down the road, in the sense of it being a set of technological advances (I know the ‘key’ is supposed to be a strong AI, but I’ve heard other technologies get batted around as well–‘easy’ gene-modding and abundant energy being the two biggest ones) that would so thoroughly alter life after they are instilled that it’s almost equivalent to a speciation event for humanity.

But that doesn’t mean I think it’s gonna happen in my lifetime, or even anytime in the next hundred years. Might, might not, and might never happen if other events intercede. The religious transhumanists, though… odd bunch, definitely.

katz
11 years ago

I wish. No, we’re torturing some dude because said torture will prevent millions of people past and present from the minor annoyance of having to blink away some dust.

I thought we were creating an AI so that we could torture it instead of a person.

Argenti Aertheri
11 years ago

“Plus you’d have to stop humans breeding, which I doubt most would want.”

Oh no, we’d have colonized outer space by then, so plenty of room, and new tech would drastically increase the planet’s carrying capacity. I tried that argument, in other words.

Warning, I am about to attempt to explain the basilisk and TDT. This may cause existential dread, or confusion.

In massive short, because it makes my head hurt: TDT is about AI being able to, effectively, change the past by knowing how you’ll act. The basilisk… at some point AI will exist, and if it exists to reduce human suffering, which it will (because otherwise it’ll kill us all), then it will do this however possible. If it’s capable of recreating the dead, not a clone, not reconstructed from notes and memories, but YOU, and you didn’t do everything you could to create it as quickly as possible, then you did a Bad Thing. Because it exists to reduce human suffering, and by not making that happen as soon as possible, you increased suffering, and then I get lost as to why this AI would recreate you and torture you. But it will. So give Yudkowsky as much money as you can to fund this and reduce human suffering, because otherwise you’re increasing human suffering and may be punished by future AI, even if long dead.

Now, Yudkowsky has dubbed this an unspeakable horror because of the risk of existential dread, so he isn’t actively using it to demand money.

*hands out aspirin, advil, and other headache meds*
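Laid out as bare machinery, the argument described above reduces to an expected-utility comparison in which both the probability that the punishing AI ever exists and the size of its punishment are simply asserted. The sketch below is a toy illustration; every figure in it is invented.

```python
# Toy sketch of the expected-utility comparison behind the basilisk argument.
# All probabilities and payoffs are invented; the point is only that the
# conclusion follows mechanically once the premises are granted.

p_ai_exists = 0.01          # assumed probability the punishing AI ever exists
donation_cost = -100        # assumed cost (to you, now) of funding its creation
punishment = -1_000_000     # assumed disutility of being recreated and tortured

def expected_utility(donate: bool) -> float:
    """Expected utility of donating vs. not, granting the basilisk's premises."""
    if donate:
        # You bear the cost now; the AI, if it ever exists, spares you.
        return donation_cost
    # You keep your money, but if the AI comes to exist it punishes you.
    return p_ai_exists * punishment

print("donate:      ", expected_utility(True))    # -100
print("don't donate:", expected_utility(False))   # -10000.0
```

Change the assumed prior or shrink the assumed punishment and the conclusion flips, which is why the commenters keep pointing at the “priors” as the place where all the work gets done.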

Argenti Aertheri
11 years ago

“(…I’ve heard other technologies get batted around as well–… abundant energy being the two biggest ones) that would so thoroughly alter life after they are instilled that it’s almost equivalent to a speciation event for humanity.”

Now that I can get behind.

Argenti Aertheri
11 years ago

Re: basilisk — and now that you know this possibility, you’re aware of your responsibility to reduce human suffering and if you don’t you’ll be more liable to the AI’s punishment.

katz
11 years ago

The interesting thing is that it’s actually a way to dodge responsibility: Since the AI is the best way to fix all the problems, then as long as you’re helping create the AI, you’re helping solve every problem and don’t have any responsibility to do anything else, ever.

Argenti Aertheri
11 years ago

Worse. Focusing on any particular problem is bad because you could be fixing all of them by creating AI (though, again, that may just be my particular doofus transhumanist).

freemage
11 years ago

That… that’s the most inane thing I’ve heard in a while. I mean, seriously, “The Basilisk exists to reduce human suffering, and therefore, if you did not work to create the Basilisk, it makes sense for the Basilisk to torment you.” I don’t… there are a few logical chasms there that just don’t reconcile very well.

At a minimum, shouldn’t a strong AI be able to understand the difference between malice and error?

Argenti Aertheri
11 years ago

freemage — yeah, it makes my head hurt, so here — http://rationalwiki.org/wiki/Roko%27s_basilisk

katz
11 years ago

Eliezer Yudkowsky, founder of LessWrong, considers the basilisk would not work, but will not explain why because he does not consider open discussion of the notion of acausal trade with possible superintelligences to be provably safe.

…That’s one of the silliest sentences I’ve ever read.

Also, way to give your thought experiment a totally pretentious name, dude.

freemage
11 years ago

I just… I read that, and was pleased to read at the bottom of the page all the arguments I’d been coming up with for why this is a silly, silly idea, solidly explained. It is truly inane, on a level rarely encountered anywhere.

Argenti Aertheri
11 years ago

I did say I was nearly quoting him about knowledge being dangerous!

It might be true that this whole scenario comes to pass, in which case I’ve doomed you all and may get extra punished. And thus talking about it is potentially dangerous.

As for pretentious names, that’s his combination of the byproducts of yeast and milk chemical reactions.
