
Roosh V forum members baffled that fat woman doesn’t welcome sexual harassment

Online dating: It doesn’t always work like this.

For a certain subset of horrible men, there are few things more infuriating than the fact that women they find undesirable can turn down men for sex. For this upsets their primitive sense of justice: such women should be so grateful for any male attention, these men think, that turning down even the most boorish of men shouldn’t even be an option for them.

Consider the reactions of some of the regulars on date-rapey pickup guru Roosh V’s forum to the story of Josh and Mary on the dating site Plenty of Fish. One fine December evening, you see, Josh decided to try a little “direct game” on Mary.

That’s what the fellas on Roosh’s forum call it, anyway. The rest of us would call it sexual harassment.

Josh started off by asking Mary if she “wanted to be fuck buddies.” She said “nope,” and the conversation went downhill from there, with Josh sending a series of increasingly explicit comments to Mary, despite getting nothing but negative replies from her.

After eight messages from Josh, with the last one suggesting he would pay her $50 to “come over right now and swallow my load,” Mary turned the tables, noting that she’d been able to deduce his real identity from his PoF profile, and asking him if he wanted her to send screenshots of the chat to his mother and grandmother. He begged her not to.

As you may have already figured out, from the fact that we’re talking about this story in public, Mary did indeed pass along the screenshots, and posted them online.

Poetic justice? Not to the fellas on Roosh’s forum. Because, you see, Mary is … a fat chick.

While dismissing Josh as a “chode” with “atrocious game,” Scorpion saved most of his anger for the harassed woman:

Look how much she relishes not only shooting him down, but damaging his reputation with his own family. She’s positively intoxicated with her power. Simply spitting bad direct game is enough to unleash her vindictive fury.

“Bad direct game.” I’m pretty sure even Clarence Thomas would consider what Josh did sexual harassment.

At any point, she could have pressed a single button and blocked the man from communicating with her, but she didn’t. She didn’t because she enjoys the feeling of power she gets from receiving attention from guys like this and then brutally shooting them down. It makes her feel much hotter and more desirable than she actually is in real life. She’s not there to meet men; she’s there to virtually castrate them for her own amusement.

I’m guessing here, but I’m pretty sure that nowhere in Mary’s profile did she encourage the men of PoF to send her explicit sexual propositions out of the blue. And I’m pretty sure she didn’t hold a gun to Josh’s head and force him to send a half-dozen sexually explicit harassing messages to a woman he didn’t know.

Athlone McGinnis also relies heavily on euphemism when describing Josh’s appalling behavior:

I don’t think its primarily the revenge she’s after, its the validation. She is enjoying the power she has over this guy and wielding it brutally because it shows she can maintain standards despite her weight and the doubtless numerous confidence issues that stem from it. In blowing up this guy for being too direct in his evaluation of her sexuality, she affirms the value of her own sexuality.

Oh, so he was just being “direct in his evaluation of her sexuality.”

In short: “I am wanted, but I have standards and can choose. I have so much agency despite my weight that I can go as far as to punish those who approach me in a way I do not like rather than simply blocking them. I’m teaching them a lesson, because I’m valuable enough to provide such lessons.

So apparently in Mr. McGinnis’ world women who are fat aren’t supposed to have agency? They’re not supposed to be able to choose? They’re supposed to drop their panties to any guy who offers to be their fuck buddy or tells them to “suck my dick?”

Also, I’m a victim bravely standing up against online bullying/harassment-look at me!”

Yeah, actually, she is. Get used to it, guys, because you’re going to see a lot more of this in the future.

This isn’t just a laughing matter for her. She needs to be able to do this in order to feel worthwhile. She has to be able to show that even she is able to maintain standards and doesn’t have to settle for just any old guy asking for any old sexual favor simply because she resembles a beached manatee.

And it’s not a laughing matter for you either, is it? You’re actually angry that a woman said no to a sexual harasser — because you don’t find her attractive.  And because Josh — from his picture, a conventionally attractive, non-fat fellow — did.

Mr. McGinnis, may a fat person sit on your dreams, and crush them.

1.1K Comments
Argenti Aertheri
11 years ago

LBT — if you’re looking for an overly complex scientific take on Harry Potter then it’ll scratch that itch. If you can’t manage to suspend disbelief and don’t buy anything spoon-fed to you from a can of rational sauce…you’ll want to scream. It’s all purple prose about science with a veneer that it must make actual sense since the parts you understand make sense. Except it doesn’t, and is painfully purple.

As for Yudkowsky’s little cult…It works if you’re looking for something…I’m going to start That Discussion, I just know it…it works if you’re an atheist desperate for the salvation religions offer.

(Note: I realize that not all religions offer salvation, not all religious people are religious for that reason, and not all atheists want a pseudo-religion…but these ones do, for reasons pecunium explained above)

katz
11 years ago

AAAAAH HE IS THE GUY WHO WROTE HARRY POTTER AND THE METHODS OF RATIONALITY?

If I had gum, I would have swallowed it. (I haven’t read it but I’ve heard of it and most people seem to think it’s really, really good.)

katz
11 years ago

It’s all purple prose about science with a veneer that it must make actual sense since the parts you understand make sense.

In other words, just like everything he says.

kittehserf
11 years ago

Argenti – not wanting to start That Discussion either, but “yep” to your comments about where Yudkowsky and his crew seem to be coming from.

This has however proven useful, ‘cos I was just browsing about the Earthsea books and saw stuff on The Other Wind, which I’d never read (I gave up in disgust after Tehanu), where the whole Dry Land idea is shown to be a colossal stuff-up by the mages. Might be worth reading at that … I’d always thought Le Guin was just doing a poetic sort of “there’s nothing after this” thing.

Dvärghundspossen
11 years ago

I haven’t really followed this entire discussion of the basilisk and stuff, just want to add a little something about the ontological argument. It’s not quite right to say that it attempts to prove that if you can imagine something it must exist, or that it defines God into existence. It’s an attempt to prove that God’s existence is logically necessary (at least it is in its most charitable interpretation). This is why that argument is actually very difficult to wrap your head around.

I’ve had seminars for first semester philosophy students on the ontological argument, and everyone is initially like “But duh, it’s about saying that something is real because you imagine it, I could ‘prove’ that Santa Claus is real the same way”, and it really takes some effort for students to a) actually learn the argument, and b) understand what’s actually wrong with it according to Immanuel Kant (the most popular dismissal), namely that existence isn’t a predicate but belongs to a different logical category (and Kant’s counterargument against the ontological argument is actually a bit contested – there are logicians who think that existence can be used as a predicate, but that the argument is still flawed for some other (complicated) reason).

There’s a good reason why so many philosophers for hundreds of years thought that there’s probably something fishy going on here, but couldn’t quite put their finger on exactly what, and that’s not that they were stupid.

ANYWAY, even without having read through everything about this “basilisk” I doubt that an analogous argument could be made for it. The ontological argument depends on God being the ultimate/the perfect/the greatest. An AI that won’t even exist for a really long time can’t fill those shoes. (At least probably not – there’s an article from 2004 by Peter Millican according to which the only flaw in the ontological argument is that “the greatest being” in the argument may be an actual, limited being – but if Millican is right, it doesn’t have to be that basilisk thing, it could be a human being as well, so it still doesn’t prove that the basilisk will come.)

Dvärghundspossen
11 years ago

(Note: I realize that not all religions offer salvation, not all religious people are religious for that reason, and not all atheists want a pseudo-religion…but these ones do, for reasons pecunium explained above)

Yeah, there really are many atheists with a religious vein in them. For instance, I think moral realism is driven by religious-instinct-laden intuitions, and some atheists are super-dedicated to moral realism. Also, lots of atheists seem to get some kind of religious feeling out of thinking about how elements have formed in stars and are now part of our bodies. I really don’t see anything wrong with thinking about this and getting all kinds of fuzzy feelings from it (Yay for fuzzy feelings! They take us through the day!), although I don’t feel that way myself on contemplating elements. However, if you say that “the stars die SO THAT we could live” or something along these lines, I think you’re really veering into pseudo-religious territory, since SO THAT and similar phrases normally signify some kind of aim or purpose.

katz
11 years ago

Dvarg: That was a much better summary of the ontological argument, but the point is that, with the basilisk as with the God argument, you’re imagining something that supposedly must exist by logical necessity, so a) the case for the basilisk’s existence isn’t any better than that particular case for God’s existence and b) it’s a terribly ironic line of argument coming from people who have probably smugly denounced the ontological argument elsewhere.

kittehserf
11 years ago

(Yay for fuzzy feelings! They take us through the day!)

Especially when cat and dog furs lend extra fuzz power! 🙂

Dvärghundspossen
11 years ago

@Katz: I guess I didn’t quite get my point across in that long text, but… I wanted to point out that even though you can make an ontological argument for the existence of GOD which is at least good enough to make it really difficult to actually point out where the flaw lies, I can’t see how you could make an ontological argument for the existence of a certain AI that’s even half-decent.

So, it’s not just that they cling to an argument that’s AS BAD as an argument they dislike, it seems to me like any argument for the existence of an AI must be WAY WORSE than the best versions of the ontological argument for the existence of God. 🙂

pecunium
11 years ago

They do, I think, make a bit of a warped ontological argument.

One of the things Yudkowsky believes in is a “many worlds multiverse” (enter the handwavium). In his explanation, anything which could happen, will happen; so by postulating a thing, that thing becomes possible. If one works toward it, sooner or later it will become real (because some branching of quantum physics will cause a deterministic manifestation of it in the “real” world).

This, I think, is why the Singularitarians are so fervent. They know they will be saved, because their miracle AI will come. Oddly this seems to give them real internal comfort.

Me, I think (were I to believe such a thing) that knowing it can’t fail would mean I could relax, and enjoy life more; because it doesn’t matter: I will be saved in The Great Retrieval. They don’t. They dream of having the 10,000+ dollars required to have their heads frozen in liquid nitrogen the moment they die (and some want to be able to pay doctors to decapitate them just before they die so the “preservation” won’t have any delay which might hinder their eventual restoration).

When they do spend the money, they wear the proof in a very public way, and preach the joys of knowing you will be cryonically preserved. And still they seem scared. They get upset when people say “The Singularity” is a nice McGuffin for a story, but that it can’t really work out that way.

Which makes me think they don’t believe it. That they know they are whistling in the dark. Which is what makes me really upset with Yudkowsky. He’s making a lot of people’s lives less happy.

Argenti Aertheri
11 years ago

“He’s making a lot of people’s lives less happy.”

Besides my hatred of misleading statistics, THAT. My Yudcultist honestly thinks he has to get a job that pays as much as possible to aid in the quest for AI — this kid’s smart, he could get most any job he wanted, but instead of picking a field he enjoys, he’s going to pick whatever seems to pay best. And instead of giving it to organizations that do current, practical, hands on aid, it’ll go to Yudkowsky to fund…what exactly? Just what is he doing with all this money? Not program testing any degree of AI, seeing how he can’t program shit.

(FTR, MSF // Doctors without Borders is my charity of choice, because starting medical programs in war zones and famines and underdeveloped regions, working against infectious disease that’d otherwise go untreated, treating malnutrition…yeah, far more useful than dumping money into Yudkowsky’s fund for what exactly)

pecunium
11 years ago

He’s supporting the most important man in the history of mankind.

That’s the takeaway. And Yudkowsky is willing to tell this to people, face to face. It’s amazing to watch, and only a sense of politesse (and utter ignorance as to just how serious he was) kept me from laughing at him when he said it to me (in more than one form, in the course of that evening).

He’s sold them a bill of goods; first he makes them certain The Singularity will happen (but somehow it’s only once, not an infinite number of good/bad/friendly/unfriendly/neutral AIs) and that if HE isn’t on the ground floor of the philosophy of AI, in the design phase, then it WILL be unfriendly, because it won’t be “rational”.

It’s hokum. It’s internally inconsistent, in ways that make it unsavory, and I think unsalvageable.

Howard Bannister
11 years ago

Harry Potter and the Methods of Rationality… ooooh, yeah.

There’s some really good parts. There’s some really awfully terrible parts. His eight-year-olds are basically thirty, and his adults are basically clueless moral monsters. And underlying it all is that horrible ‘take the Red Pill’ mentality.

But there’s also a wonderful sense of humor…

Heh.

So, everybody’s read the short Matrix interlude piece, right?

It comes down to basically this. In the first movie, we get to the part where Morpheus pulls out the battery, explaining how the machines are using humans, and Neo shakes his head and starts explaining physics to Morpheus, telling him how that doesn’t make any sense and is kind of stupid (which it is–but I bet you-all know that, and know it wasn’t how it went in the original script).

And Morpheus asks… “Neo, where did you learn these ‘physics’?”

“In school.”

“Inside the Matrix.”

“…”

“The machines craft beautiful lies.”

“…can I have a real physics textbook?”

“Oh, Neo. The real world doesn’t run on math.”

I must have laughed for several solid minutes.

Argenti Aertheri
11 years ago

Hokum is one word. I was thinking more like con or scam.

As for why there will only be one AI, I believe the line is that it’ll prevent any others from being made. No matter what form this AI takes. The most bullshitty of the bullshit, to me anyways, is CEV.

You need to watch the ends of Doctor Who seasons 1 and 3 — tell your beloved that I said you need to see the Bad Wolf and Yana, she should know which episodes I mean. The first set is how Rose becomes a god, the second the Master tries using the same power. The difference between a caring and vengeful god could not be more clear. (Do not drink to Dalek diatribes though, that’s how I ended up puking sweettarts!)

Howard — nope, no math here, move right along, these are not the droids you are looking for.

pecunium
11 years ago

Argenti: As an argument that might be plausible, but… not in the many-worlds of Yudkowsky: That’s really more scary (to them) than anything else, if they believed it.

Here’s the thing, they want to live forever. Not just some iteration of themselves in the infinite multiverses of their fantasies, but the one who is thinking about it.

So far I can accept that. I think it’s a bit less than completely rational; but I use the word differently to their usage.

If, however, all possibilities will come to be (and the entire “branching” aspect has some ontological problems… where is the “action” which causes the branching? H. Beam Piper actually played with this in a story in the ’60s, but I digress) then “I” will live forever.

“I” will also be tortured by the basilisk. And I died in a motorcycle crash, and I settled down and married various people and…

None of it matters, of course, to me, if it’s not in this universe that it happens. They know this, and are desperate to make it be this one. Because deep down, they lack faith. And lots of people (esp. in the US) confuse outward display (in the form of donation; see “The 700 Club”) with a way to manufacture sincere belief. By accident, or design, Yudkowsky has tapped into that.

Argenti Aertheri
11 years ago

Yeah, they want all the, arg, how to word this and not start That Discussion…fuck…they want the living-forever (afterlife) of some religions, cloaked in rationality, and in a form that allows them to “buy in”.

I hate multiverse theory, the what is a decision thing bugs me a lot — we get a new universe every time I decide what the fish are getting for dinner? What the hell difference does it make if they get flakes on Friday and algae wafers on Saturday or vice versa? And more, we get a new universe every time each one of them picks which flake to go after? I spawned a hundred universes at feeding time last night then.

“And I died in a motorcycle crash”

And sulfa, and was it Ukraine? And I think there was a horse incident. And a few others.

And that’s just the dying option, don’t forget the breaking-other-bones options (there are well over a thousand combinations) and the various methods by which death can occur.

But yeah, if an unfriendly AI in any universe can torture the you in this one…we’re screwed. Add the end and specials of season 5 to your watch list — the reality bomb and the return of the Master (he really needs to stay dead already!). Email me if you want the summary, we’re not having another rot13 conversation!

Falconer
11 years ago

the return of the Master (he really needs to stay dead already!)

He’s indestructible, the whole universe knows that.

pecunium
11 years ago

No, I was pointing out that one need not do anything to get all possible results. If Roko’s basilisk can happen, it will happen; what I^ care about is that it happens to I^^. But really, it doesn’t matter. So I can ignore it and just work on being happy.

Argenti Aertheri
11 years ago

Eh, sorta. Since doing nothing is physically impossible. But no, one need not do anything specific, nor avoid anything in particular.

Other than that little bit of pedantry, yes.

katz
11 years ago

@Katz: I guess I didn’t quite get my point across in that long text, but… I wanted to point out that even though you can make an ontological argument for the existence of GOD which is at least good enough to make it really difficult to actually point out where the flaw lies, I can’t see how you could make an ontological argument for the existence of a certain AI that’s even half-decent.

So, it’s not just that they cling to an argument that’s AS BAD as an argument they dislike, it seems to me like any argument for the existence of an AI must be WAY WORSE than the best versions of the ontological argument for the existence of God. 🙂

Ah, OK. Fuck the ontological argument for being obtuse. No wonder everyone just argues against it by example. But you do agree that the argument being made for why the AI has to exist is basically a version of the ontological argument, right?

LBT
11 years ago

RE: Argenti

if you’re looking for an overly complex scientific take on Harry Potter then it’ll scratch that itch.

If I want science, then I’ll fucking read science. It’s not like there’s a lack of interesting nonfiction science lit around. (Ramachandran! Gould! Hell, whatever your feelings on Gladwell, I at least found him interesting to read.) If I want fantasy, then I’ll read goddamn fantasy.

I hate multiverse theory, the what is a decision thing bugs me a lot — we get a new universe every time I decide what the fish are getting for dinner? What the hell difference does it make if they get flakes on Friday and algae wafers on Saturday or vice versa? And more, we get a new universe every time each one of them picks which flake to go after? I spawned a hundred universes at feeding time last night then.

Infinity Smashed actually runs on this theory (there’s very little actual space travel; it’s far cheaper, faster, and easier to travel dimensions instead). And the answer is yes, though it doesn’t run on ‘decisions,’ but instead just divergent possibilities. So, in theory, there are infinite dimensions that are multiplying near constantly, and it’s a hobby for some geeks to keep track of spontaneously-appearing dimensional rifts.

So, the short version is, yes, in that world, you WOULD spawn a hundred universes feeding your fish. Now multiply that by every atom in the universes. There’s a reason even the people in Infinity Smashed see it as a THEORY, not a law; you can’t exactly TEST for infinity.

The original draft had Raige winning M.D.’s friendship by arguing with her over this and winning the argument. God, we were nerdy teenagers…

katz
11 years ago

One of the things Yudkowsky believes in is a “many worlds multiverse” (enter the handwavium). In his explanation, anything which could happen, will happen; so by postulating a thing, that thing becomes possible. If one works toward it, sooner or later it will become real (because some branching of quantum physics will cause a deterministic manifestation of it in the “real” world).

OK, what is it with “skeptics” and multiverse theory? The idea of God is completely irrational and beyond the pale, but the existence of billions of parallel universes, well, that’s just a given.

Not that there’s anything wrong with believing in a multiverse if that floats your boat, but just like the existence of God/the supernatural, it’s completely unprovable based on physical evidence. You can make theoretical arguments about why its existence seems probable and those arguments might seem compelling to you…but you could do the same thing with God. If you think that no rational person should believe in something that can’t be empirically proven, then you’re just going to have to reject multiverse theory out of hand.

Alternately, if you choose to accept multiverse theory, you will just have to accept that other people have come to the conclusion that other empirically unprovable things exist, and that they are not deluded or stupid but in fact just as rational as you.

Argenti Aertheri
11 years ago

Ramachandran! Phantoms in the Brain is one of my favorite books actually.

katz
11 years ago

Hokum is one word.

Because if it was two words, it would sound dirty.

There, I just raised the intellectual bar of this conversation.

LBT
11 years ago

RE: Argenti

Ramachandran is fantasmo. And I’m not even including the history books I’ve read.

RE: katz

There, I just raised the intellectual bar of this conversation.

You raised MY bar! HEYO! (I’m contributing to the intellectual rise too!)