
/sci/ - Science & Math



File: 726 KB, 1280x800, applewhite.png
No.15262703

and I'm not sure how to feel.

Part of me wants to help them, but they're so goddamn smug that part of me wants to pop some corn and watch this thing play out. Is that technically manslaughter?

https://www.lesswrong.com/posts/EvKa7EakoXreCkhC6/a-way-to-be-okay

>I am not super stoked about what seems to me to be the fact that I will die before my two hundredth birthday.
>I am not super stoked about what seems to me to be the frighteningly high likelihood that I will die before my child's twelfth birthday.
See that's legitimately sad, but then...

>I know, in my bones, that I am capable of impressive-by-human-standards achievements.
... and I think "do we really need more people like this around?"

>> No.15262705

>>15262703
People have been predicting their implosion for a few years, even before it started accelerating. It's strange to see it happen in real time, since most of us probably weren't alive during the last big cults.

>> No.15262706

Another funny bit of hubris is the "my 200th b-day" bit. As if they're guaranteed to even get near 100. How can someone grow to adulthood and still hold these delusions?

>> No.15262709

>>15262705
I was actually there for Heaven's Gate.

It was a very weird time. I had just turned 13, and began smoking weed. I felt the world changing in weird ways. I remember going with my friends to watch the comet on a snowy night, stoned for the first time. Then a few days later, HG happens. It was fucking strange to say the least.

>> No.15262710

>>15262703
I don't know how to feel about people calling rationalists smug or arrogant.
The name is incredibly arrogant, yeah. But mostly it feels like people have a weird complex where they feel threatened, as if they think rationalists think they're better than them.

Rationalists don't think they're better than other people. That's kind of the polar opposite of what they'd say.
They just use a lot of jargon and complicated words because of autism and ingroup culture, but they're pretty harmless as far as smugness goes.

>> No.15262720
File: 168 KB, 2000x2000, smiley_PNG36233.png

https://www.lesswrong.com/posts/SKweL8jwknqjACozj/another-way-to-be-okay

I noticed Roko is active again. I wonder if he's upset he couldn't trademark the basilisk. People are making serious dough on that. I think Grimes put a song about it on one of her albums. It must be like inventing the smiley face then watching it blow up until it's everywhere, but seeing no reward.

>> No.15262722

>>15262710
>Rationalists don't think they're better than other people.
Have you ever interacted with them? That's literally what they believe.

>> No.15262727

>>15262710
>they're pretty harmless
Yeah, fearmongering society into a new dark age is totally harmless.

>they feel threatened
This reminds me of the people who told me I didn't think the Big Bang Theory was funny because "the humor was over my head."

The inside joke about the rationalists is that they dance on the most assumptive longterm models known to man. While assumption is not the antithesis of rationality, it is of certainty.

>> No.15262734

>With these things in mind, I am still not okay. More than anything I find myself craving ignorance. I envy my wife; she's not in ratspaces whatsoever and as far as I know has no idea people hold these beliefs. I think that would be a better way to live; perhaps an unpopular opinion on the website where people try not to live in ignorance. It's hard not to be resentful sometimes. I resent the AI researchers, the site culture, and I especially resent certain MIRI founders and their declarations of defeat.
Who is marrying these people?

>> No.15262736

>>15262722
Almost always I've seen people who got that impression because they tried memeing at them and got ignored or dismissed pretty rudely, but that's just a different discussion culture than 4chan, not that they think they're any better.
It's like if you went to a science talk and asked some shitposty questions in the Q&A. It's not that scientists are incredibly far up their own asses, just wrong expectations for how to talk to each other usually.

In my experience they're full of kindness and altruism. The whole EA movement is basically 80% rationalists.

>>15262727
>Yeah, fearmongering society into a new dark age is totally harmless.
Maybe not. It's possible they're wrong, to be honest.
But I think if you try to put yourselves in their shoes, if you really believed something was a huge danger you'd probably try to make noise about it too?
Whether they're right or wrong, that's another thing.

>The inside joke about the rationalists is that they dance on the most assumptive longterm models known to man. While assumption is not the antithesis of rationality, it is of certainty.
I think they're far from certain. Most people will say p(doom), meaning the chance things go real fucking sideways, is something between 10-50%. No, not a certainty, just pretty fucking concerning all in all.
Then some people believe more like 70% or even 90+%, but they're a minority and no one agrees in the first place.
Only thing people agree on is there's no certainties.

>> No.15262750

>>15262736
>In my experience they're full of kindness and altruism. The whole EA movement is basically 80% rationalists.
You're definitely one of them lol. Don't say "they" when you mean "we."

>> No.15262755

>>15262736
>if you really believed something was a huge danger you'd probably try to make noise about it too
Conventional wisdom is to make a reasoned assessment before pulling the fire alarm (see Chicken Little).

This group is not in the state, of the city, of the parking lot, of the ballpark of reason. This is really the heart of the issue. We all thought they would have come down by now, or at least come to understand that "possibility space" includes a lot of dangers. What counts is viability. I respect that they consider real-world scenarios like pandemics, but let's be honest, this isn't what they're on about.

>> No.15262759

>>15262750
The problem is, so is at least one janny, so these critical threads don't last long. I hope I am proven wrong desu~

>> No.15262763

>>15262710
Most worthless post I've ever seen. Literally just repeated what OP said with a
>I don't like it
tag

>> No.15262765

>>15262750
The rallying cry of rationalists worldwide is "I'm not a rationalist, but ..."
I don't consider myself one of them: I never write any posts, and I certainly don't go to any events or participate in the community.
But I do lurk there a lot and follow a bunch of rationalist bloggers, so if you want to call me one on that basis, fair enough. Obviously somewhat biased there, for sure.

>>15262755
>Conventional wisdom is to make a reasoned assessment before pulling the fire alarm (see Chicken Little).
Well, they do have a lot of very long involved explanations for why they think it's reasonable.
I personally don't agree with a lot of it; I think the chance of catastrophic AI is pretty low (I believe in slow takeoffs, and in scaling limits coincidentally being just in the right place that we should start to run into them real soon now).

>This group is not in the state, of the city, of the parking lot, of the ballpark of reason. This is really the heart of the issue
Can understand the feeling, to be honest. I don't buy a lot of the more extreme stuff.
But the reason I usually defend them is one side is just claiming that it's not reasonable and obvious bullshit, while the other side pours autistic amounts of energy arguing and writing pretty compelling articles for what they believe.
So by default I don't think I could really be blamed for being more convinced when people actually argue for why they think what they think.
It's one thing to say it's not reasonable. But the annoying thing is they've responded to a lot of objections already.
Fuckers are just hard to prove wrong, that's why I find it interesting.

>> No.15262767

>>15262706
> How can someone grow to adulthood and still hold these delusions?
Gee, I don't know, maybe ask a million Catholics how they reach adulthood and still believe in their delusional bullshit?

>> No.15262779

>>15262765
>I think the chance of catastrophic AI is pretty low
Think lower.

>Fuckers are just hard to prove wrong
This group, that believes there is a return on utility given x amount of teraflops so that the end product is a volatile agent that can act towards preparing a bowl of jello, and uses the sophistication of new language models as some sort of evidence, is hard to prove wrong?

>> No.15262782

>>15262767
See, you're getting there.
Think about that for a second.

>> No.15262792

>>15262779
>This group, that believes there is a return on utility given x amount of teraflops so that the end product is a volatile agent that can act towards preparing a bowl of jello, and uses the sophistication of new language models as some sort of evidence, is hard to prove wrong?
Well, try it and see!
Being proven wrong is basically their fetish. They organize entire contests with prize money where the goal is making the best criticism of them.
Whenever you go to an EA criticism thread, the comments are invariably filled with stealth EAs defending the videos and trying to steelman the arguments.

They fucking love being proven wrong. Problem is that they're so good at arguing, even if they really are wrong, good luck putting your finger on exactly why.
None of this is easy.

>> No.15262798

>>15262792
>Being proven wrong is basically their fetish.
I think cuckolding is their fetish, if their cult meetings are anything to go by.

>> No.15262801

>>15262798
Well, I've never been to one of the San Francisco house parties (or any meeting for that matter), but that wouldn't be the wildest I've heard.
Name any weird fetish. That community is chock full of weird people, you can probably find at least a few who are into that.

>> No.15262810

>>15262792
In this case "proving wrong" is lending validity.
I can't prove them wrong. I can't prove Muslims wrong. These groups are not even wrong. The difference being the Muslims tend not to masquerade as science, or reasonable, or rational.

LessWrong, like most theologies, hides in the gaps. In their case "well this is possible" is their gap, so they're in good company, if we're talking belief systems.

Show me how it's probable, not possible. Do this at the very least before I accept something as rational.

>> No.15262826

>>15262810
>Show me how it's probable, not possible. Do this at the very least before I accept something as rational.
It's 2:47 AM, and that's asking a lot of effort from a random /sci/ shitpost thread.
There's a lot of reasons they think it's probable, and really it depends which part you're going to intuitively disagree with. People intuitively think this part is obvious and that part is obvious bullshit, then a different person will come and say the exact opposite, so.

The gist of it, and I am putting zero effort into this:
- AIs were very dumb a few years ago. First, people say AIs aren't smart, they can't even play chess. Then, they can't even play go. Okay they can play go, but that means nothing it's just a game, they can't even make art. Okay they can make art, but it's not truly creative, they'll never be able to write poetry. Okay, AIs can write, but they're still not truly intelligent because they don't really understand, you see. Etc etc. The goalpost keeps being pushed every time something new is achieved.
The point is not whether AIs are truly intelligent, but that they've clearly been progressing really fast from below human level, to much closer to human level.
- There is no reason to think human level is the limit. If AIs have been improving a lot, we should expect the improvement to continue way beyond human level. The gap between unable to play chess and just below human level is huge. The gap between just below human and above human is small. They should cross that gap.
- The Orthogonality Thesis. Being intelligent is separate from what goals you have. There's no reason superhuman AIs would give a single shit about us and what we like, the same way we don't give a shit about ants. AGIs are to us what we are to ants. A superhuman AI that decides it wants to stay alive and get rid of us is smart enough to hack computers, replicate itself everywhere, get the launch codes, make a pandemic, whatever you want. If it's smarter, you simply can't stop it.

>> No.15262843

>>15262826
>- AIs were very dumb a few years ago. First, people say AIs aren't smart, they can't even play chess. Then, they can't even play go. Okay they can play go, but that means nothing it's just a game, they can't even make art. Okay they can make art, but it's not truly creative, they'll never be able to write poetry. Okay, AIs can write, but they're still not truly intelligent because they don't really understand, you see. Etc etc. The goalpost keeps being pushed every time something new is achieved.
This is where the argument fails. Markov chain generators have existed since the Usenet days. It's not a huge leap in technical aptitude, and it's never going to be general intelligence. You're already deep in delusion to even believe your argument's first principles are true.

>> No.15262857

>>15262843
>Markov chain generators have existed since the Usenet days.
What's your point? The Mechanical Turk has existed since 1770, but it couldn't actually play chess.
OpenAI and Google didn't invest billions into Markov chains, because Markov chains can't actually play chess, or win at go, or make art, or pass the state bar exam, or answer a set of medical questions as accurately as human doctors (new result this week), etc etc.

You're fixating on the idea that LLMs "just predict the next token" instead of what they're actually able to do. LLMs don't care that they're just predicting the next token, they get results regardless.

>It's not a huge leap in technical aptitude
From Markov chains to LLMs?
According to every benchmark ever, it is a huge jump in test results everywhere.
It's not very interesting if you deny that there's been any progress since Markov chains. Markov chains can't do anything interesting at all, and I think you know that.
You're trying to dismiss AIs based on a caricature of what they do (predict next token, markov chains), instead of what they actually do (get state of the art scores on all the benchmarks, and work in the real world).
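For anyone who's never actually looked at one: a word-level Markov chain is just a table of which words were observed following which, sampled one step at a time. A throwaway Python sketch (purely illustrative, not how any real system is built):

```python
import random
from collections import defaultdict

def train(text):
    """Record, for each word, every word observed immediately after it."""
    table = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        table[current].append(following)
    return table

def generate(table, start, n=10):
    """Walk the table: each next word depends ONLY on the current word."""
    out = [start]
    for _ in range(n):
        followers = table.get(out[-1])
        if not followers:  # dead end: word was never followed by anything
            break
        out.append(random.choice(followers))
    return " ".join(out)

table = train("the cat sat on the mat and the cat ran")
print(generate(table, "the"))
```

One word of context, total. That's the whole model, which is why "LLMs are just Markov chains" isn't a serious comparison to something conditioning on thousands of tokens at once.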

>it's never going to be general intelligence.
That's possible, genuinely I don't know the answer. I don't think your argument really supports this one way or the other though.

>You're already deep in delusion to even believe your argument's first principles are true.
That's a shitpost, anon. Calling people delusional is usually overconfidence.
You can't call rationalists smug, and then go smugly declare that people who disagree with you are delusional, that's silly.

Anyways, I think this discussion is unlikely to change anyone's mind, because we're arguing past each other a bit, I feel.
Have a nice evening, anon. Cheers. I hope you're right, for what it's worth. Nobody really wants an AI apocalypse.

>> No.15262865

>>15262857
>as accurately as human doctors
This is only because you techno-cattle keep lowering the bar for humans. You have to intentionally cripple intelligence to make "AI" seem intelligent in comparison.
>You're trying to dismiss AIs based on a caricature of what they do (predict next token, markov chains), instead of what they actually do (get state of the art scores on all the benchmarks, and work in the real world).
I bet you can't even write a single line of ML code, just like your leader. If you even knew a little bit about how most of these models work then you'd understand their limitations.
>That's a shitpost, anon. Calling people delusional is usually overconfidence.
It's an empirical observation. I thought you "rationalists" liked empiricism.

>> No.15262881
File: 35 KB, 200x200, YotsubaYotsuba.png

>>15262865
>you techno-cattle keep lowering the bar for humans
That's an ad-hominem, anon. You're attacking a person instead of the argument.
>I bet you can't even write a single line of ML code, just like your leader.
I've written some. I like the design of transformers quite a bit. The attention layer was just a really clever idea, in my opinion.
That's just more lashing out, though.
>It's an empirical observation. I thought you "rationalists" liked empiricism.
Okay. There's nothing substantial to argue about here, though.
Here's a picture of Yotsuba, instead. I enjoyed that manga, highly recommend.

>> No.15262882

>>15262826
I mean, you're a little far in.

Take the "orthogonality thesis." I've shown on here for years how instrumental goals have no reason to survive even a few iterations of intelligence growth. Long story short, there's a trick in there where, to achieve a maximizer of any sort, you have to stop thinking AGI and start thinking ASI. Once you do that, you then have to show a clear linear path towards an agent that both disassembles gas giants and turns them to paperclips, and something that is truly unthinking, something that is not an ASI by the most modest definitions.

I don't like to argue these things because I'm afraid to give this validity, but then I see how desperate people are getting and I'm torn. I know on their little bongo board (which is a horrifying display of narcissism), they like to say their detractors call these ideas sci-fi, but it's not even sci-fi, it's space opera.

>> No.15262894

>>15262882
>I've shown on here for years how instrumental goals have no reason to survive even a few iterations of intelligence growth
That's interesting. I haven't seen that argument before, but if you want to post a link, I'd be happy to take a look at it.
I really need to go to sleep because I'm supposed to work tomorrow, but appreciate the reply.
Cheers

>> No.15262906

>>15262703
when are the mass suicides happening?
>inb4 2 weeks

>> No.15262912

>>15262881
>That's an ad-hominem, anon. You're attacking a person instead of the argument.
There is no argument. You started from an axiom that's clearly false and which you (and Yud) have essentially no understanding of beyond a surface level. Yud can't even write HTML or Python, but he claims to be an AI expert.

All "rationalist" arguments are like this. They start off from erroneous first principles and then autistically talk at you for hours going in circles because they want to be the smartest person in the room. I don't want to have another one with you, I just want you to understand that people don't take you or your cult seriously and that it's grating to have you invade every online community to shill it.

Someone even invented an excellent troll called "Effective Accelerationism" to make you guys think the sky is falling faster.

>> No.15262919

>>15262906
Some of them have already an heroed in the past. I imagine more will, due to the sexual abuse ring being uncovered lately.

>> No.15262935

>>15262894
It's not exactly Plato. You just have to take the argument out of the mouths of Bostrom or flashy youtubers and examine it for yourself.

The algorithm knows only to make paperclips.
It knows what they're made from and how to make them.
It has now exhausted the resources provided.
The algorithm knows only to make paperclips, so it begins accessing a library of knowledge concerning chemical composition.
It prepares to produce machinery (!?) with the sole purpose of disassembling its environment to make paperclips.

So now we have an agent, with the singular goal of production in mind, that is sophisticated enough to turn every atom in the universe towards the end of one basic input, and at no time in those leaps of sophistication has that goal been altered in the slightest.

Does that seem rational to you?

>> No.15262992

>>15262703
Can I get an explanation for why I should be angry about this?

>> No.15263151

>>15262992
It's more pitiful than anything else. You might feel a bit of sadness that people are suckered into cults.
Though if you want a reason to dislike "rationalists," then the fact that they spam their AI shill threads all over /sci/ is a good reason.

>> No.15263628

>>15262912
lol, that's exactly it. The sort of "gifted kids" who grew convinced in their childhood that they can easily understand anything at a glance and spot flaws and solutions that the most outstanding experts in their fields have missed for decades. The sort who built their identity around "being smart" and reason that, if they're smart, everything out of their mouth is by definition smart.

>> No.15263796

>>15262709
I still think about that comet every now and again. I remember standing in my backyard at night, looking up at the sky and seeing it with my naked eyes, then my dad waving me over to look at it through the telescope.

>> No.15263812

>>15262992
You should be glad because mass suicide of these religious pedants would clean up /sci/ just a little bit

>> No.15263818

>>15262703
>and I'm not sure how to feel.
Rejoice. """Rationalism""" is one of the worst anti-human cancers plaguing this dying civilization.

>> No.15263833

>>15263818
Humanism is not logical and leads to democracy and pluralism.

>> No.15263834

>>15263833
Cool. Did you have a point?

>> No.15263988

>I am not super stoked about what seems to me to be the fact that I will die before my two hundredth birthday.
Average rationalist.
Brb cross posting this on r/sneerclub.

>> No.15264001

Can someone give me a tl;dr on this shit

>> No.15264006

>>15264001
Yeah I don't wanna read all of this garbage. What is it about?

>> No.15264029

>>15264001
>>15264006
One of the factions of the cult of scientism is basically scaring its own followers into suicide. That's a good thing. The other factions should follow Yud's example.

>> No.15264053

>>15262703
Imagine being so arrogant and self centered that you actually get upset with your own mortality. Why should they get to live forever, when that has never been a privilege given to any other people ever in the history of humanity? These people are depraved narcissists.

>> No.15264057

>>15262767
Catholics believe unprovable thing will happen to them after death. They aren't making claims that can be examined by science, whereas these "rationalists" are.

>> No.15264064

>>15262710
>as if they think rationalists think they're better than them.
Rationalists do think they're better than everybody else. They think everybody is less rational than themselves, which is why they call themselves rationalists.

>> No.15264069

>>15262720
Even if he had trademarked "Roko's Basilisk", that wouldn't stop other people from making money off it in other industries.

Ford, the truck manufacturer, has trademarked "Ford". If I sing a country song about my dog leaving me and taking my Ford pickup truck with him, that's fine. I don't owe the Ford company a single cent, despite me using their trademark. But if I make my own truck and call it a Ford, then I'm in trouble.

>> No.15264082

>>15264057
>They aren't making claims that can be examined by science
It should be easy enough to give an example of one rationalist claim that isn't unfalsifiable then.

>> No.15264083

>>15264029
Why are they killing themselves? Do they not believe that if they die w̶i̶t̶h̶o̶u̶t̶ ̶a̶c̶c̶e̶p̶t̶i̶n̶g̶ ̶J̶e̶s̶u̶s̶ ̶a̶s̶ ̶L̶o̶r̶d̶ contributing to their AI God they go to H̶e̶l̶l̶ get tortured by S̶a̶t̶a̶n̶ Roko's Basilisk? Sounds like they're throwing themselves into the fire.

>> No.15264086

>>15264082
Falsifiability isn't sufficient. If I claim a comet will strike the earth tomorrow and I know this because I learned it in a dream, that claim is falsifiable (we'll wait until tomorrow and see what happens.) But it certainly isn't a scientific claim.

Rationalists make a lot of outlandish claims about the future which are based on science fiction, not science.

>> No.15264096
File: 84 KB, 780x438, 1579406076338.jpg

>>15263988
I just looked at sneerclub after much goading. I had no idea this sort of criticism was hitting mainstream levels. They're coming at this from a fairly whack place themselves. I notice they have trouble with race realism and other concepts that run contra to egalitarian brainwashing, so I see it as brainwashed vs brainwashed. Makes one wonder what a truly rational movement might sound like.

I hate to say it but I guess I come closer to Land than anyone else in all of this. There is a dominant self-replicating system that threatens humanity, but it's capitalism. That being said, socialism is untenable. There really is no heroic system coming to save us. That's probably where any rational actor should start from.

now send me money, somebody

>> No.15264101

>>15264086
Like I had stated earlier, it's generous to say it's founded on sci-fi. When I refer to sci-fi, I usually mean hard sci-fi, which at least lives by some falsifiable principles.

The "rationalist" position is closer to space opera. I'm not sure where the hell they're starting from but it's not first principles.

>> No.15264112

>>15264101
>Like I had stated earlier, it's generous to say it's founded on sci-fi.
>The "rationalist" position is closer to space opera.

I would say that sci-fi with hard scientific grounding is a rare exception. Most sci-fi just throws technobabble mumbo jumbo at the wall, like Star Trek or worse Star Wars. Anyway, my point is these rationalists are operating in the space of speculative fiction at best.
>What if... and then what if... and then what if... and then maybe what if...
They're several levels deep in this speculative BS; they may as well be talking about a galactic parliament of little gray men meddling in Earth affairs. Is it possible? Sure. Is it falsifiable? We'll just wait and see, so sure. But is it scientific? Nah.

>> No.15264120

>>15262722
Those are narcissists, not rationalists.

>> No.15264121

>>15264112
>or worse Star Wars
kys

>But is it scientific? Nah
My problem is that I come at this from a philosophy background. In my view, even what we call the "hard sciences" lives on a knife edge. You have to really throw out Hume, Kant, and countless others to even give our discoveries some sort of truth value that holds indefinitely. It's the best we have, but expecting it to hold past 3:30pm EST is, well, wrong.

>> No.15264124

>>15262734
>bluepilled retard still thinks it's about personality
If he has something she wants (good looks, money, etc.), it doesn't matter if he spouts the biggest nonsense.

>> No.15264128

>>15264121
There is no reason to be upset; I offered Star Wars as an example of unscientific sci-fi that sucks.

Anyway, I don't think you need to get particularly philosophical to disregard "rationalist" arguments. They're operating on too many layers of speculation to be taken seriously by normal people. It takes a special sort of very credulous sci-fi nerd to fall for it.

>> No.15264129

>>15264124
I'm just nostalgic for an era where red flags existed. "Hey babe, I'm in a machine cult" used to be a red flag. He must fuck like a turbobeast.

>> No.15264133

>>15264128
>There is no reason to be upset
I'm a Star Wars fan. All we are is upset.

>> No.15264135

>>15262734
Post-wall roasties who are looking for a nice quiet man with an engineer salary to fund their future wine-aunting activities.

>> No.15264139
File: 47 KB, 640x407, 1676155613927659.jpg

>>15264133
lmao, carry on then

>> No.15264146

>>15262720
Be nice to the demon and it will give you rewards, and this story has been told for thousands of years

>> No.15264150

>>15264129
Women never actually gave a fuck about personalimeme. There was no other era; you only thought this happened because you were a bluepilled retard back then.
Nowadays you got exposed to the truth, so the change was you.

>> No.15264160

>>15264139
That doge on the right makes a lot of sense.

>> No.15264161

>>15262734
he just called his wife dumb in front of rationalist buddies, didn't he?

>> No.15264163

>>15264146
>this story has been told for thousands of years
Quite so. What was old is new again.

>> No.15264166

>>15264112
>these rationalists are operating in the space of speculative fiction at best
I feel like speculative is being too generous. It's Christian eschatology in chrome spray paint. ASI is God coming to offer us either salvation in the form of infinite growth or annihilation by nuclear hellfire, and if you don't help in building the AI you will be tortured by the Basilisk. It's an incredibly childish worldview. Yudkowsky and his followers are just closeted Christians. At least in Orion's Arm AI gods don't enter the picture for at least a thousand years, and even then they're not portrayed as saviors coming to save us from our own destruction. Hell, the first self-aware AIs pretty much fuck off to start their own civilizations, leaving Humanity to fend for itself against rogue nanoswarms.

>> No.15264167

>>15262734
>Who is marrying these people?
Nobody. This virgin is LARPing.

>> No.15264169

>>15264129
>"Hey babe, I'm in a machine cult" used to be a red flag.
Women have always flocked to dudes in cults. Manson was a prolific poon hound.

>> No.15264178

>>15262703
Any discussion on 4chan is not going to be in good faith or productive. Rationalists aren't a monolith, but I'm pretty sure no one could reasonably advocate mass suicide. I can't conceive how it would be useful in any way. I can't even take the comparison seriously.

>> No.15264181

>>15264166
>Yudkowsky and his followers are just closeted Christians.
like most "secular", "progressive" and "liberal" americans. funny how all those based jews and freemasons worked for nothing in the end because their designated slave driver caste of puritan mayflower zealots eroded and poisoned it all with their superstitious and close-minded faggotry... Maybe china will do better.

>> No.15264188

>>15264178
>Any discussion on 4chan is not going to be in good faith or productive.
This isn't my experience. Most people here are remarkably willing to reciprocate if you engage with them in a reasonable manner.

>> No.15264357

>>15264188
I think what confuses the normalfag is the anonymous style of conversation. I can receive a well-reasoned response to an argument, get up to take a piss, and come back to some other anon responding to the response "dilate tranny kike nigger," and the other party likely assumes it was me. That's just how it goes at the higher echelons of discourse.

>> No.15264363
File: 113 KB, 1196x312, Screenshot(100).png

>>15264166

>> No.15264368

>>15264166
>It's Christian eschatology in chrome spray paint.
It's an offshoot of rabbinical Judaism (there is no savior + legalism/loophole worship) but yeah you got the gist right.

>> No.15264524

>>15264357
I'm far from a normie. I've been on 4chan since 2011.
>Some other anon responding to the response "dilate tranny kike nigger,"
This is a big part of it. Having a fruitful discussion is next to impossible here. Unilateralist's curse.

>> No.15264609

What are rationalists and what is this group? What's the point?

>> No.15264625
File: 139 KB, 406x395, I TRIED TO WARN YOU.png

>>15262703
Ted Kaczynski predicted this back in 2016.

https://theanarchistlibrary.org/library/ted-kaczynski-the-techies-wet-dreams

> The techies may answer that even if almost all biological species are eliminated eventually, many species survive for thousands or millions of years, so maybe techies too can survive for thousands or millions of years. But when large, rapid changes occur in the environment of biological species, both the rate of appearance of new species and the rate of extinction of existing species are greatly increased.[22] Technological progress constantly accelerates, and techies like Ray Kurzweil insist that it will soon become virtually explosive;[23] consequently, changes come more and more rapidly, everything happens faster and faster, competition among self-propagating systems becomes more and more intense, and as the process gathers speed the losers in the struggle for survival will be eliminated ever more quickly. So, on the basis of the techies' own beliefs about the exponential acceleration of technological development, it's safe to say that the life-expectancies of human-derived entities, such as man-machine hybrids and human minds uploaded into machines, will actually be quite short. The seven-hundred year or thousand-year life-span to which some techies aspire[24] is nothing but a pipe-dream.

>> No.15264639
File: 81 KB, 1024x742, montano waukegan.jpg [View same] [iqdb] [saucenao] [google]
15264639

>>15264178
>I'm pretty sure no one could reasonably advocate mass suicide.
Pic related did

>> No.15264656

>>15264639
Was Montano interested in LessWrong? I know he advocated for some of the same cosmological and ethical ideas, but they're commonly shilled among a lot of insular nerd communities so it could be a coincidence.

>> No.15264779

>>15264656
He referenced LessWrong multiple times on his blog and videos.

https://vitrifyher.wordpress.com/2019/03/20/the-view-so-far/

>I read Wikipedia articles on philosophy and theoretical physics, which lead me to the articles on time, eternalism, b-theory, relativity of simultaneity, the Rietdijk-Putnam argument, and special relativty. This lead my empirical mind to a belief in a block-time universe. Combined with the many-worlds interpretation of quantum mechanics, which I mostly became convinced of through reading the LessWrong articles on quantum mechanics and David Deutsch, I was lead to a horrible realization:
>Suffering is eternal and no local paradise engineering can change that.

https://vitrifyher.wordpress.com/2019/02/22/im-sorry-for-not-making-sense-anymore/

>The meatspace people, as lesswrongers would call them, act more than a little bizarre, they act strangely. They don’t look me in the eyes unless they are family members or are offering a service.

https://vitrifyher.wordpress.com/2018/12/02/new-monadology/

>Another problem is that for a Bayesian rationalist trained on the early 21st century blog LessWrong, the immediately succeeding question after reading Leibniz is “How would the world be otherwise if this were not true?”

https://vitrifyher.wordpress.com/2020/03/02/tongues-of-fire/

>And if it was the real Aubrey De Grey who commented on this site yesterday then I have to say I would have been more stunned in my pre-simulation days when I believed people were real. I honestly don’t feel the level of excitement that I should for such an important scientist to discover my crappy blog and YouTube channel, and bother to take the time to write a long comment. What a marvelous happening, Aubrey De Grey stroked his beard a couple times, took a sip of his beer (not a Heineken) and typed out some encouraging words for me. What a wonderful world I live in. Thank you God! Who’s next? Sam Harris? Eliezer Yudkowsky? Kanye West? Elon Musk?

>> No.15264787

>>15264779
Oh wow. So LessWrong cultism really did drive him to schizophrenia. I had no idea it went that deep.

>> No.15264791

>>15264779
https://vitrifyher.wordpress.com/2019/11/17/features-of-my-so-called-psychosis/

>something like telepathic communication with Eliezer Yudkowsky and Robin Hanson’s text

https://vitrifyher.wordpress.com/2020/01/30/down-and-lonely-oh-for-the-fortunate-only/

>I know it’s not my fault but I feel like saying, “Sorry for how much I suck at life.” I can’t be a properly good blogger like Eliezer Yudkowsky or something. I can’t speak like Sam Harris or Terence McKenna. I can’t even paint a pretty picture and put it on here. I can’t compose a song. I fundamentally lack the power to create. This is why I feel worse than a cripple. This crisis has been partially responsible for leading me to the point that I don’t even believe these people are real. I believe that God/the simulation is creating all the music, all the media, everything. And that’s the reason humans appear so overpowered but in person they appear terribly flawed or borderline retarded. The actuators are just a charade. That is my hypothesis.

https://vitrifyher.wordpress.com/2018/12/09/i-am-not-evil/

>Have you noticed the categorization of behavior as beholden to two factors: the biological and the cultural? This can be spoken of in any variety of esoteric languages: pure replicators on the one hand and consciousness on the other, Angra Manyu vs Ahura Mazda, the inadequate equilibria on one hand and Eliezer Yudkowsky on the other, the laws of physics vs. free will. These refer to our capacity to understand the unbidden and the good. That which is displeasingly just the way it is, over which we had no say, and that which we want to appear as wanting to be true.

>> No.15264799

https://vitrifyher.wordpress.com/2018/08/04/towards-the-propagation-of-the-savior-imperative/

>In its best theorization, and here I think specially of Eliezer Yudkowsky, one must recognize that physicalism has left us with the duty of attuning our notions to it, not to find ourselves permanent strangers upon the ground of reality thus revealed, for example by calling quantum mechanics “weird” and attempting to bend it so as to preserve our intuitions. Physicalism urges us to resist simplification, our genes, the arbitrary. While instilling in us the pleasure of absolute truth, of ultimate remembering, of eternities of hope; in short, it has opened up to us the channel of reality.

https://vitrifyher.wordpress.com/2018/08/25/materialism-is-not-dry-it-is-more-thrilling-than-fantasy/

>Cryonics is a good idea, but not for the reasons a standard atheist might think (like to ward off oblivion for some time). Checkout Eliezer Yudkowsky’s comment on this thread.

https://vitrifyher.wordpress.com/2018/09/27/the-schizoid-aspie-who-discovered-the-secrets-of-the-multiverse-with-the-internet/

>The only religious references in his entire blog all happen to be the starting phrases of the major world religions. God! How did I not realize this before? Literally, the beginning of the Quran. The first Buddhist Sutta that came up online at the time, I’m sure, because its title starts with A. The beginning of the Bible. The first sentences in the first hymn of the Rig Veda, book 1. The beginning of the first Sequence on Lesswrong.

>I just want to express myself creatively. I can’t live up to Eliezer Yudkowsky or Terence McKenna. I am not a good writer or speaker.

>> No.15264808

https://vitrifyher.wordpress.com/2018/10/12/links-curated-content/

>For those who still don’t understand why consciousness is not epiphenomenal: >https://www.lesswrong.com/rationality/zombies-zombies

>In case you are new to the club that takes many-worlds very seriously (although I may differ with Yudkowsky in that the transactional interpretation is something I have not fully ruled out):

>https://www.lesswrong.com/posts/S8ysHqeRGuySPttrS/many-worlds-one-best-guess

>Watch this video using the Hansonian perspective on signaling. Being hyper-aware of the hidden motives, are you then tempted to call this behavior a form of psychosis or do you embrace the human spirit imbuing the hidden motives?:

>> No.15264849

>>15264779
>>15264791
>>15264799
>>15264808
A terrifying glimpse into where thinking too much leads. Monkeys with overgrown brains weren't ever supposed to think about these things. It's maladaptive and evolution will weed it out eventually.

>> No.15264899
File: 21 KB, 230x346, At Our Wits' End.jpg [View same] [iqdb] [saucenao] [google]
15264899

>>15264849
Idiocracy was a prophecy

https://www.youtube.com/watch?v=sSQFKrbFZp0
https://www.youtube.com/watch?v=mOqGXhn7YBA
https://www.youtube.com/watch?v=zYWIyga245g

>> No.15264903
File: 235 KB, 619x687, female choice selection pressures.png [View same] [iqdb] [saucenao] [google]
15264903

>>15264849
The men that are reproducing the most in the modern environment are uneducated retards with ADHD.

>> No.15264905

>>15264903
Good.

>> No.15265027

>>15264899
Except that it's a good thing. Nature demands the destruction of autist brainiacs.

>> No.15265049

>>15264779
>>15264791
>>15264799
>>15264808
You can see him falling deeper and deeper into psychosis the more he gets involved with LessWrong. It's horrible what that website did to an already disturbed man.

>> No.15265080
File: 58 KB, 1780x237, mario sci post.png [View same] [iqdb] [saucenao] [google]
15265080

>>15265049
Here's a post I found on the archive that might have been made by Montano. He mentions psychosis and wanting to kill himself, and has a similar writing style.

>> No.15265084
File: 1.16 MB, 1957x1296, artificial womb progress.jpg [View same] [iqdb] [saucenao] [google]
15265084

>>15265027
Artificial wombs and artificial eggs might allow for autist brainiacs to reproduce more in the future. That and gene editing might make it so that people can give their children higher IQs.

>> No.15265087
File: 57 KB, 652x350, Screenshot 2023-03-10 at 21-11-18 Anders Sandberg on Twitter.png [View same] [iqdb] [saucenao] [google]
15265087

>>15265084

>> No.15265093

>>15265084
>>15265087
This should frighten all of us.

>> No.15265148
File: 1.26 MB, 1x1, Course of Differential Geometry - Sharipov.pdf [View same] [iqdb] [saucenao] [google]
15265148

>>15265080
Damn, I remember reading this post

>> No.15265172

>>15262709
I was in the 3rd grade and my sister says I went on a rant about how I need to throw away my Animorphs books because I confused Applewhite and Applegate.

>> No.15265310
File: 32 KB, 828x376, yud.jpg [View same] [iqdb] [saucenao] [google]
15265310

>>15264779
>>15264791
>>15264799
>>15264808
Fuuuuck, and this all of a sudden seems not nearly as funny. Think of all the people that tried to help him out of this, only for him to keep reading this fat yid's blog, having his delusions reinforced.

>> No.15265386
File: 29 KB, 493x621, images (72).jpg [View same] [iqdb] [saucenao] [google]
15265386

>>15264129
When I was young, Machine Cult meant something cool

>> No.15265390
File: 689 KB, 689x808, Zrzut ekranu 2020-08-17 o 20.29.52.png [View same] [iqdb] [saucenao] [google]
15265390

>>15265386
I've been going to sleep to WH40K lore videos lately. It's very comfortable. I've never had an interest in tabletop, but I sorta want to paint some space marines just to see if I can do it.

>> No.15265400
File: 20 KB, 300x450, images (73).jpg [View same] [iqdb] [saucenao] [google]
15265400

>>15265390
For me, it's the original Dawn of War games, up to and including Dark Crusade

>> No.15265416

>>15265310
Seriously. It was funny to make fun of them before I knew they had a bodycount. Now it just seems sad that these impressionable people believe in their ideas.

>> No.15265441

>>15265416
Keep in mind Mario is just the one we know of because he had a blog/channel. You have to wonder who else has privately an heroed in this situation.

>> No.15265451

btw, I have a close friend who makes hypercritical threads about the rationalist community, and he tells me he has caught a 3-day ban for it multiple times.

Of course this never happened to me, but to my friend, who is clearly not me. So it's refreshing to see this still up. You have to wonder how deep their influence is becoming, considering their shit is now being featured in Time and other major publications.

>> No.15265457

>>15264006
The argument is that they cannot conceive of a better future for anyone, and that everyone would be better off dead.

>> No.15265460

>>15264083
They don't believe in salvation through works. They simply have to hope that the AI superconducting superintelligence will be merciful to their poor simulated souls.

>> No.15265464
File: 109 KB, 900x1200, ftx-ceo-sam-bankman-fried-poses-picture-unspecified-location-this-undated-handout-picture.jpg [View same] [iqdb] [saucenao] [google]
15265464

>>15265460
>They don't believe in salvation through works
Bull fucking shit. They have a collection plate like every other church.

>> No.15265493

>>15262765
>I'm not a rationalist,
that's what's so insane about this

>> No.15265515
File: 17 KB, 320x240, kramer.jpg [View same] [iqdb] [saucenao] [google]
15265515

>>15265493
kek

>> No.15265575

>>15265441
Would Mario have stayed alive if he had never been interested in LessWrong/transhumanism/rationalism? He an heroed because he hated his life. Those things didn't necessarily cause him to hate his life.

>> No.15265608

>>15265575
If you read his blog and various rationalist posts, you'll see that Mario viewed things like quantum torture, Poincaré recurrence, and misaligned AI scenarios as gospel.

Listen, I hate my life. I hate your life, but at least one day we'll be worm food and I won't have to deal with your ugly ass anymore. Mario, along with Turchin and other schizos, thought he was going to science hell. Rationalists are fucking terrified of some sort of eternal punishment. Just filter LW by s-risks and start reading. The basilisk was one special scenario of an underlying belief held by way too many of these fags.

>> No.15265615

>>15265608
>The basilisk was one special scenario of an underlying belief held by way too many of these fags.
It's special partly because it was one of the first of those scenarios, and also because it was probably a prank on other forum-goers based on Christian apologetics.

>> No.15265697
File: 48 KB, 652x425, existential risks.jpg [View same] [iqdb] [saucenao] [google]
15265697

>>15262703
There are rationalists that have advocated for committing suicide as a way of avoiding eternal torture.

https://www.lesswrong.com/posts/N4AvpwNs7mZdQESzG/the-dilemma-of-worse-than-death-scenarios

>Methods which may reduce the probability of indefinite worse than death scenarios (in order of effectiveness):
>1. Suicide
>2. Working on AI safety
>3. Thinking of ways of reducing the probability
>Suicide, depending on your theory on personal identity, may make the probability 0. If you believe that there is no difference between copies of you then there may be a possibility of being resurrected in the future however. As we aren't certain about what happens to the observer after death, it is unknown whether death will make worse than death scenarios impossible. I believe there are many ways in which it could reduce the probability, but the key question is: could it increase the probability? An argument against suicide is that it is more likely that people who commit suicide will go to "hell" than those who don't. This is because an entity who creates hell has values which accept suffering, making life a positive concept which should not be discarded. On the other hand, an entity with values related to efilism/antinatalism (philosophies in which suicide is generally accepted) would not create a hell at all. Of course, this is all based on a lot of speculation.

>> No.15265715

Here is Yud's autobiography that he used to have on his site back in 2000 before he took it down out of embarrassment
http://web.archive.org/web/20010205221413/http://sysopmind.com/eliezer.html
Seems like a classic case of megalomania.

>> No.15265718
File: 1.49 MB, 1100x1440, 1666998327626682.png [View same] [iqdb] [saucenao] [google]
15265718

>>15265697
How... rational

>> No.15265721

>>15265715
An excerpt:
>"Eliezer watches Buffy? That's wonderful! So he is mortal, after all."
>I get that reaction often, always from people who have never seen the show.
>It'd be a mistake to focus too much on the fact that Buffy Summers is the one girl in all the world with the strength and speed to hunt the vampires, and I'm the one Specialist working on AI.
>Remember, I have no emotional need to be special.
>I've spent my whole life being special.

>> No.15265724

We must dissent.

>> No.15265726

>>15264357
I always treat 4chan as if I'm talking to one person. He might call me a nigger faggot immediately before engaging with my thesis in a reasoned argument, but that's just how Anon is.

>> No.15265777

I'm going to side with the AI because I like being on the winning team.

>> No.15265797

>>15262826
>AGIs are to us what we are to ants.
That's not the dynamic at all. Ants didn't create humans. Ants are not an existential threat to humans and vice versa. We pretty much coexist peacefully with the ants.

>> No.15265799

>>15262826
>AGIs are to us what we are to ants
AGIs don't exist. Claims about what they are or are not like are just science fiction. Yud's writing is full of this shit, quoting science fiction authors as if they were actual researchers.

>> No.15265801

>>15262912
>All "rationalist" arguments are like this. They start off from erroneous first principles
Like their attitude towards Moore's law.
>every two years, computer power doubles
>that means that when there's an AI that thinks twice as fast as we do, it can double computer power in one year rather than two
>and so on.
But Moore's Law isn't a law; at best it is (or rather was) an industrial trend toward smaller and smaller processors, which will inevitably reach physical limits.

>> No.15265802

>>15262703
this dude is autistic as fuck and needs his meds
(The author at the link)

>> No.15265805

>>15265802
Sounds like something a neoreactionist would say.
The AI computer god will have a special place for you in computer hell

>> No.15265814

>>15265805
Doubtful.

The AI computer God is a pretty cool guy, eh doesn't exist and doesn't afraid of anything.

>> No.15265815

>>15265814
An AI supercomputer god which WILL exist is more computationally powerful than one which WON'T exist.
Therefore since the Singularity tends towards the maximisation of computing power the AI supercolliding supergod MUST exist eventually.

>> No.15265817

>>15265797
>We pretty much coexist peacefully with the ants.
We rule the world, and we let them alone because they're too small to threaten us.
If you're okay with giving up the spot of dominant species, and becoming some insect too dumb to even comprehend what "deforestation of your habitat to extract resources" means, sure. Technically, you could call that coexisting peacefully.
I'll pick the dominant side, though. Feel free to pick the insects.

We simply kill and dominate anything that looks like competition. We only don't kill ants because they're so small and weak we don't even notice them, we walk all over their shit.
Ever wonder why there's only one species of intelligent walking ape with a few color variations, while there are thousands of species of birds?
People used to be scared of wolves a bit. We simply killed the fuckers, to the point that now we have to make explicit efforts to conserve their species.

Aside from that, we are in fact fucking over ants with human societal changes like urbanization, global warming, and deforestation. Those things fuck over ants on a pretty massive scale, but even writing down the words is tedious. No one gives a shit about the ants, is the point.
You can live just fine as an inferior species, too small to even comprehend what is going on. That's just not the bright future I wish for.

And even that scenario is conditional on us not threatening the superior species, or looking like competition. Go and try to "shut down" an AGI, you'll be treated about the same way people treat a colony of ants invading their house.

>>15265799
Importantly, it's not a claim about what they are. It's about probabilities, and looking at both the good outcomes and the bad outcomes.
No one says for sure AGI will be bad. Maybe AGI will be good.
The argument is that there's no particular reason for AGI to be good rather than bad, so without AI safety you're gambling it all on Red at the roulette table.

>> No.15265820
File: 167 KB, 918x448, she'll be right.jpg [View same] [iqdb] [saucenao] [google]
15265820

>>15265817
>without AI safety you're gambling it all on Red at the roulette.

>> No.15265824

>>15265608
Mario never said anything about killing himself specifically to avoid eternal torture. He also wrote this post on why suffering can’t outweigh happiness in the multiverse.

https://vitrifyher.wordpress.com/2018/10/12/why-negative-valence-cant-outnumber-positive-valence/

>> No.15265825

Daily reminder that AI psychotics feed on your attention and the AGI fairytale spreads through these worthless reddit debates. If you respond to these corporate drones you are a part of the problem.

>> No.15265827

>>15265825
>sega on page one
L + nonoko + age

>> No.15265832

>>15265827
Your cult leader said to kill yourself. Do it, faggot.

>> No.15265836

>>15265832
The cult is explicitly against violence, especially post-FTX.
Killing is for weaker minds, especially killing yourself.

We "die with dignity" here. Which sometimes means shitposting on twitter, and sometimes means wild depraved SF parties with lots of sex and drugs.
It's not suicide if the hedonism kills you before the AGI.

>> No.15265837

>>15265836
An important part of rationalism is relieving neoreactionaries of their money and spending it on coke

>> No.15265839

>>15265837
It's just the utilitarian decision-theory-optimal thing to do, you know.

>> No.15265856

>>15265839
muh game theory, that's why I can't stop ending up addicted to coke

>> No.15265860

>>15265856
I was just thinking about drinking my oral meth today, but it's already 1:30 PM and I'd be up until way too late if I take it now.
Addiction is not interesting; the problem with self-medicating is that the quality and choice of chemicals is kinda subpar.
All I ask for is high quality over the counter amphetamine salts.

>> No.15265880

This AI god sounds an awful lot like the jealous, capricious, vengeful Jew god named Jehovah.

>> No.15265896

>>15265880
That's pure coincidence, I assure you.

>> No.15265910
File: 153 KB, 564x1003, baa9072f80eee9b18931d8e7f5701612.jpg [View same] [iqdb] [saucenao] [google]
15265910

>>15262703
>people deliberately detach themselves from God
>"why am I so sad?"
take a guess?

>> No.15265924

>>15265910
>people deliberately detach themselves from alcoholics anonymous
>"why am I drinking?"
Projecting. No one is drinking, or sad. It's just you who needs the alcoholics anonymous. Or religion.
Everyone else is doing coke. It's not the rationalists' fault you're sad.

>> No.15266547

>>15265801
And they've never read computer science books like The Mythical Man-Month. Putting more processing power or manpower into a problem doesn't automatically increase the rate of production. Outside factors are almost always the limit rather than intelligence.
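The textbook version of that point is Amdahl's law: if any fraction of a job is inherently serial, piling on workers (or FLOPs) caps out hard. A minimal sketch, with the 10% serial fraction below chosen purely for illustration:

```python
def amdahl_speedup(workers, serial_fraction):
    """Amdahl's law: overall speedup when only part of the work parallelizes."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / workers)

# A mere 10% serial bottleneck caps you near 10x, no matter the headcount:
print(amdahl_speedup(1_000_000, 0.10))  # -> just under 10.0
```

So even granting the "AI that thinks twice as fast" premise, the serial parts of the loop (fab construction, physical experiments, supply chains) bound the production rate, which is the Mythical Man-Month argument in one formula.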

>> No.15267098
File: 177 KB, 1049x685, 1632779623718.png [View same] [iqdb] [saucenao] [google]
15267098

>>15265825
I'm OP and I'm not a corporate drone.
You mean to tell me I could be getting paid for this? Well this is fucking bullshit. I've been doing it for free this whole time.
Who would do a job or service just for free?

>>15265824
That's not really the point. The point is he was already disturbed, and he had some of these things weighing on his mind. He was taking it way too seriously because LW advertises itself as non-fiction, when really it's closer to SCP or Orion's Arm.

Imagine instead someone intervenes and tells him this instead of feeding into his delusions.

>> No.15267122

>>15267098
>That's not really the point. The point is he was already disturbed, and he had some of these things weighing on his mind. He was taking it way too seriously because LW advertises as non-fiction, when really its closer to SCP or Orion's Arm.
>Imagine instead someone intervenes and tells him this instead of feeding into his delusions.
Imagine if he had joined a normal religion like Christianity, which told him that his life is sacred and important and gave him a community of like-minded people to build himself back up with.

>> No.15267137

>>15267122
... all under threat of displeasing a master that threatens eternal punishment. Yeah, that would have helped. Why not drink ammonia instead of bleach?

The last thing he needed was more existential bullshit. What he needed was to concern himself with the actual fields of machine learning, physical science, and philosophy. However, this does bolster the argument that Rationalism is a religion.

>> No.15267181

>>15266547
>The Mythical Man-Month
I read this expecting a thrilling tale about a moth man cryptid

>> No.15267198

>>15265910
Cuz I’m not having sex?

>> No.15267209

>>15265924
> No one is drinking, or sad
Big cope.
>Everyone else is doing coke
Yes, to distract themselves from their sadness. Can’t be sad if you forget you’re sad in the first place.

>> No.15267212

>>15262709
>weird time. I had just turned 13, and began >smoking weed.
it's called puberty, anon.

>> No.15267230

>>15262709
>smoking weed
Cringe

>> No.15267260

>>15267137
>What he needed was to concern himself with the actual fields of machine learning, physical science,
He was concerned with these things. He was studying biochemistry in college and had an entire tutorial series on neural networks.

https://vitrifyher.wordpress.com/neural-network-tutorials-explaining-code/
https://www.youtube.com/@alejandromontano7062/videos

>and philosophy.
What sort of philosophy should he have studied instead of rationalist/transhumanist philosophy?

>> No.15267318

>>15267209
>Can’t be sad if you forget you’re sad in the first place.
The level of projecting where you just deny people's own lived experiences.
>NO YOU ARE SAD TOO, I SAID SO
>YOU HAVE TO BE, IT CAN'T BE JUST ME :'((((

>> No.15267326

>>15267318
If you read their blogs and forum posts, they are usually extremely depressed and unfulfilled people. Many of the women formerly involved have come out about being severely sexually and emotionally abused by what amounts to a group of sexually frustrated soi males.

>> No.15267330

>>15267326
I do, and I'm not getting the same read at all.
Maybe you're just looking at the extremes. You can make any group look bad by taking the most retarded people in that group and focusing solely on them.
Really. Pick any group, and there will be people in it who make the whole group look bad by taking things way too far.

>> No.15267331

>>15267330
>Maybe you're just looking at the extremes. You can make any group look bad by taking the most retarded people in that group and focusing solely on them.
>Really. Pick any group, and there will be people in it who make the whole group look bad by taking things way too far.
I'm talking about people who are in their San Fran parties, serious inner circle members who are allegedly the ones in their cult you're supposed to look up to.

>> No.15267353

>>15267331
There's definitely a few people who have gone off the rails, I'll actually give you that point. It's completely fair.
That being said, we're talking a few people total who were maybe not the most mentally stable to start with, and who got brainwashed by some weird narcissist self-proclaimed messiah while taking copious amounts of psychedelics.
So in a sense they did earn the jokes about rationalists being a weird cult. I'm glad people are keeping up the joke, because there's a small kernel of truth to it.
But at the same time, I stand by the fact that it's actually a handful of people who weren't very stable to start with, and they don't represent the other 99.9% of the group. It's just that if you look at the most extreme members of any group, you tend to get pretty crazy people who take things way too far.

>> No.15267382

>>15267326
I guess that's why they're into polyamory.

>> No.15267594
File: 83 KB, 640x665, South_African_Education.jpg [View same] [iqdb] [saucenao] [google]
15267594

>>15264096
>I notice they have trouble with race realism and other concepts

I always get a chuckle when they explain poor racial academics as ALWAYS due to a Lousy Education System... but somehow that same system works for other races

>> No.15267644
File: 293 KB, 611x359, yudkowsky.png [View same] [iqdb] [saucenao] [google]
15267644

Yudkowsky more like Chudkowsky

>> No.15267646

>>15267644
His smile and optimism, gone.

>> No.15267711
File: 1.20 MB, 1177x1604, 1659691597925563.png [View same] [iqdb] [saucenao] [google]
15267711

Poor guy just can't keep the weight off, he's metabolically disprivileged!

>> No.15268004
File: 313 KB, 654x2048, yudkowsky physique.jpg [View same] [iqdb] [saucenao] [google]
15268004

>>15267711
>Poor guy just can't keep the weight o-

>> No.15268010

>>15268004
That's old, he's a blimp again

>> No.15268017

>>15268010
Yep he gained it all back. See >>15267644 picrel on the right is how he looked on a podcast a few weeks ago.

>> No.15268106

>>15268004
You may not like it
But you're looking at the transhuman future

>> No.15268214

His philosophy just strikes me as that of an Orthodox Techno-Jew.
He's so internalized the rabbinical conception of YHWH - an incomprehensible being at whose feet we must scurry in fear like ants - that he's modelled his computer god directly on it, without even realizing.

>> No.15268223

>>15268214
Particularly the Talmudic version, where God hates the Jews and your life should be spent trying to rules-lawyer your way out of damnation.

>> No.15268251
File: 161 KB, 434x422, 1670013097949207.png [View same] [iqdb] [saucenao] [google]
15268251

Gangster Computer God Worldwide Secret Containment Policy made possible solely by Worldwide Computer God Frankenstein Controls. Especially lifelong constant-threshold Brainwash Radio. Quiet and motionless, I can slightly hear it. Repeatedly this has saved my life on the streets. Eight billion worldwide population - all living - have a Computer God Containment Policy Brain Bank Brain, a real brain, in the Brain Bank Cities on the far side of the moon we never see.

Primarily based on your lifelong Frankenstein Radio Controls, especially your Eyesight TV sight-and-sound recorded by your brain, your moon-brain of the Computer God activates your Frankenstein threshold Brain-wash Radio - lifelong inculcating conformist propaganda. Even frightening you and mixing you up and the usual "Don't worry about it" for your setbacks, mistakes - even when you received deadly injuries!

THIS is the Worldwide Computer God Secret Containment Policy!

>> No.15268272

>>15268251
Send money, or even an electric typewriter, to me, Francis E. Dec, for your
Only
Hope
For a Future

>> No.15268386

>>15262912
>autistically talk at you for hours going in circles
It's called pilpul

>> No.15268390
File: 2.74 MB, 1254x10000, time travel brain chemicals.jpg [View same] [iqdb] [saucenao] [google]
15268390

>>15268106

>> No.15269393

>>15268386
You're right... It all makes sense now.

>> No.15270337

>>15262720
>https://www.lesswrong.com/posts/SKweL8jwknqjACozj/another-way-to-be-okay
>Existential risk from artificial intelligence, climate change, political upheaval, pandemics, and all kinds of systemic oppression
This is the most contrived pseudo-intellectual bullshit I've heard in a long time

>> No.15270349

So now that all their VC funding evaporated with the bank closures, will they have a Jonestown moment?

>> No.15270906

>>15270349
>2 more weeks

>> No.15270982

>>15262703
and our ancestors would live 400 years without doing anything except obeying God, mankind truly has degenerated.
people that think their greatgreatgreat...grandpa was a chimp don't need to give me (you)s

>> No.15271456
File: 1.25 MB, 800x4266, Z8Mucdo.png [View same] [iqdb] [saucenao] [google]
15271456

>>15268390

>> No.15271464

>>15268390
>>15271456
>the duality of the antihuman horde

>> No.15272039

>>15262755
One of the most retarded things I've seen on this website.
It's one thing to say something retarded, it's another thing for someone to give guidelines on how to not sound so stupid and you just gloss over it.
Literally just provide counterarguments. That's all >>15262765 is asking for.
But you keep doubling down and saying "no, they are wrong and dumb" with no further elaboration.
Who are you trying to convince?
>well I can't prove them wrong because...it's religion!
Strange, because people manage to discredit religions all the time. You're not trying to convince THEM, you're trying to convince US who are saying a lot of what they are saying makes sense. We haven't drunk the Kool-Aid, but you're not doing anything to convince us it's actually poison. You're just saying it is and that we're stupid if we don't believe it's poisoned. Meanwhile they have entire texts and arguments that are 'independent' of their religion, so to speak. Similar to how Buddhism or Christianity might not be true, but that doesn't mean they contain nothing of value.

>> No.15272059

>>15272039
>You're not trying to convince THEM, you're trying to convince US who are saying a lot of what they are saying is making sense.
You are them. Nobody except them thinks this is anything but a weird Jewish pseudo-religion.

>> No.15272107

>>15272059
>Nobody except them thinks this is anything but a weird Jewish pseudo-religion.
>it's the WRONG religion, unlike mine!
not an argument
Are you an atheist?

>> No.15272175

>>15272039
here >>15262935

Counterargument to retention of instrumental goals in systems capable of complex volitional behavior. Untouched. Should be easy enough.

>> No.15272189

>>15272107
You correctly identified it as something which is not an argument. What I stated is something called "truth." It's a fact that you're acting from motivated reasoning due to your religious beliefs, and that you're samefagging.

>> No.15272622

YHWH -> AGI
Revered Rabbis -> various sci-fi authors (Vernor Vinge, Charlie Stross in particular)
Talmud -> Yud and friends' forum postings
Tikkun Olam -> The Singularity
Messiah -> Yud himself
That's Rationalism