[2018]DUDES: We made a website where you can look up a charity and see what % of donations it spends on admin overhead
ME: Hey that rules
DUDES: It's called effective altruism
[2025]OTHER DUDES: So those EA dudes want to pave all farmland on earth for the benefit of hypothetical robots 10,000,000,000,000 years in the future
ME: Wha
[DOES THIS STORY HAVE A MORAL? I CAN'T TELL.]
Cassandra is only carbon now
in reply to mcc • • •
margot
in reply to mcc • • •
Oblomov reshared this.
margot
in reply to margot • • •
Irenes (many)
in reply to margot • • •
Cassandra is only carbon now
in reply to Irenes (many) • • •
Giuseppe Aceto and Oblomov reshared this.
JP
in reply to Cassandra is only carbon now • • •
Irenes (many)
in reply to JP • • •
Asta [AMP]
in reply to JP • • •I suddenly see a connection to one of Zeno's Paradoxes, as well, where when you build a seemingly logical construct of reality (to reach a location, you must first reach halfway between there and your starting point) and take it to the limit (infinitely many halved distances between you and the destination) you end up with results that seem sound (the harmonic series does not converge!) and yet are easily disproven just by looking at reality.
The scientific method applied here would suggest, "oh, your model is bad". But it's clear there was no applying that here.
Cassandra is only carbon now
in reply to Asta [AMP] • • •@aud @jplebreton @emaytch @ireneista There was a great article I read a while ago about how you can understand science as the transition from rationalism to empiricism. That is, that science is the idea that you need to actually check your logic against the real world. There are many logically consistent worlds which are not ours, so it doesn't matter what you derive in your own brain if you don't have a connection out to empirical observation.
Techbros could stand to take note.
Irenes (many)
in reply to Cassandra is only carbon now • • •
Cassandra is only carbon now
in reply to Irenes (many) • • •
Irenes (many)
in reply to Irenes (many) • • •
Cassandra is only carbon now
in reply to Irenes (many) • • •
Tom Forsyth
in reply to Cassandra is only carbon now • • •
Cassandra is only carbon now
in reply to Tom Forsyth • • •
mcc
in reply to Cassandra is only carbon now • • •
mcc and Oblomov reshared this.
mcc
in reply to mcc • • •
mcc and Oblomov reshared this.
Bruce Elrick
in reply to mcc • • •This sounds like a Old Testament curse.
Howard Chu @ Symas
in reply to mcc • • •
Cassandra is only carbon now
in reply to mcc • • •Sensitive content
mcc
in reply to Cassandra is only carbon now • • •Sensitive content
George B
in reply to mcc • • •@xgranade @TomF @ireneista @emaytch
Oh, is that why they're called "7th generation"? I read that expecting it to be Dr. Bronner's until I got to that point
Neville Park
in reply to Cassandra is only carbon now • • •
Cassandra is only carbon now
in reply to Neville Park • • •
Rachel Barker
in reply to Cassandra is only carbon now • • •@xgranade @ireneista @emaytch That reminds me of something I was taught in physics, which has absolutely saved my bacon multiple times in terms of not getting drawn into weird ideologies:
If your model of the world predicts that something will be infinite, that doesn't mean that whatever it corresponds to in the real world is actually, literally infinite. Instead, it means that your model is missing something. You have extrapolated beyond the domain of applicability of your model, and something else you haven't accounted for will happen before you get there.
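A toy numeric illustration of that, with entirely hypothetical numbers (the model, the blow-up time, and the ceiling are all made up): a model that fits early data can still predict an infinity the real system never reaches, because it leaves out whatever saturates first.

# Hypothetical example: a growth model that diverges at a finite "critical
# time", versus a reality that saturates before then because of something
# the model doesn't include.
T_CRITICAL = 100.0  # the model's predicted blow-up time (arbitrary units)
CEILING = 50.0      # stand-in for whatever limit the model omits

def model(t: float) -> float:
    """Model prediction: blows up as t approaches T_CRITICAL."""
    return 1.0 / (1.0 - t / T_CRITICAL)

def reality(t: float) -> float:
    """Same early behaviour, but capped by the effect the model left out."""
    return min(model(t), CEILING) if t < T_CRITICAL else CEILING

for t in (0.0, 50.0, 90.0, 99.0, 99.9):
    print(f"t={t:>5}: model says {model(t):>7.1f}, reality stays at {reality(t):>5.1f}")

Inside its domain of applicability the model and reality agree; past it, the "infinite" prediction just marks where the model stopped describing the world.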
mcc
in reply to Rachel Barker • • •
madison taylor
in reply to Irenes (many) • • •@ireneista @emaytch slight disagree: it's a multiply by infinity error (because the future is infinite)
more specifically: it's not the first place they went wrong, but the place where they really removed all the guardrails that would mitigate any wrongness was in giving value in the future a 0% discount rate.
while people in the future are not any less morally legitimate than people today, not only is opportunity cost real, but the future is uncertain (and your plans for the future even more so! you're not building the far future, you'll be dead)
once you've made that error, basically any imaginary future is justified by throwing excessively big numbers at it, and you're just Immanentizing The Eschaton again (usually a big sign you've gone wrong).
sure, we could argue for years about Repugnant Conclusions and whether various measures of utility in utilitarianism make any sense, but giving even the most modest discount rate to The Abstract Far Future would render 99.9% of these questions moot
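A minimal numeric sketch of that point, with made-up inputs (one arbitrary unit of value per year, arbitrary horizons, standard discounting, nobody's actual model): at a 0% discount rate the total just scales with how far out you count, while any positive rate caps it at roughly 1/rate no matter how many hypothetical future people you pile on.

# Present value of a constant stream of future value under different
# discount rates, via the closed-form geometric sum:
#   PV = U * (1 - (1 + r)**-T) / r   for r > 0
#   PV = U * T                       for r = 0
# Inputs are illustrative only.

def present_value(utility_per_year: float, rate: float, years: float) -> float:
    if rate == 0.0:
        return utility_per_year * years  # grows without bound with the horizon
    return utility_per_year * (1.0 - (1.0 + rate) ** -years) / rate

for rate in (0.0, 0.001, 0.01, 0.04):
    for years in (1e3, 1e6, 1e9):
        pv = present_value(1.0, rate, years)
        print(f"rate={rate:>5.1%}  horizon={years:>13,.0f} yr  PV={pv:>17,.1f}")

For any r > 0 the sum converges to about U/r, so the "excessively big numbers" stop mattering; the infinities only survive if the discount rate is exactly zero.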
Irenes (many)
in reply to madison taylor • • •
mcc
in reply to Irenes (many) • • •
margot
in reply to mcc • • •
mcc
in reply to mcc • • •
mcc
in reply to mcc • • •
Irenes (many)
in reply to mcc • • •we've tried that line of argument and the immediate response is "but you still have a value function for them"
so we put on our mathematician hat and thought about why they aren't numbers, and this is the result
we agree with you, of course, but the goal with this line of argumentation is to reach people who are deeply mired in this belief system
mcc
in reply to Irenes (many) • • •@ireneista @tomoyo @emaytch "but you still have a value function for them"
I don't agree with that at all, but I assume you're not the person here I need to convince
madison taylor
in reply to madison taylor • • •for those who don't spend a lot of time thinking about kinda-abstract economics (an eminently rational form of ignorance - in the economics sense not the rationalist sense - and quite defensible)
a normal discount rate in economic decision-making might be something like 8%-25% for a company (depending on the stability of the firm and its future prospects)
or, more abstractly, in terms of economic value, maybe something like 4% annually (in real terms)
but hey, i'm not picky. take, like, just a 1% discount rate, just to account for the risks to your own plans going badly awry. heck, be full of hubris and take a 0.1% discount rate: suddenly it's numerically clear that the infinities of your Machine God are all total nonsense, and you can join the rest of us worrying about
$BILLIONAIRE having all the power
Luke
in reply to madison taylor • • •@tomoyo I'm shook cause I didn't realize they didn't do some sort of discount rate. Like even entry level Expected Value models account for probabilities, and everything compounds.
I never bothered to look at their calculations because I know it's so obviously wrong, but wow the amount of hubris to model like that and assume you are getting anything that "proves" anything.
Nathan Vander Wilt
in reply to margot • • •
mcc
in reply to Nathan Vander Wilt • • •@natevw @emaytch so there's a "soft problematic" version of EA where they get really, really focused on dollars that go directly to services, and this winds up over-funding things that accidentally game that number and de-funding important community work that, by the structural nature of what it does, spends a slightly higher percentage on facilities or outreach
your city gets a lot of mosquito nets but no arts funding, in blunt terms
mei | fully hingeless architecture now in production likes this.
Oblomov reshared this.
Nathan Vander Wilt
in reply to mcc • • •
Oblomov reshared this.
margot
in reply to mcc • • •
Oblomov
in reply to mcc • • •
Video Game King
in reply to mcc • • •
mcc
in reply to mcc • • •Robert Anton Wilson, 1979: Any false premise, sufficiently extended, provides a reasonable approximation of insanity
2025: Any false premise, sufficiently extended, turns out to be an already-existing thread on something called "lesswrong dot com" and it turns out a cofounder of Paypal has already given it 10 million dollars
mastodon.social/@emaytch/11546…
margot
2025-10-30 21:47:03
mei | fully hingeless architecture now in production likes this.
Oblomov reshared this.
Kevin Boyd (he/him) 🇨🇦
in reply to mcc • • •@emaytch dammit that domain is taken.
But guess what isn't?
lesswrong.ai
Lu-Tze
in reply to Kevin Boyd (he/him) 🇨🇦 • • •@kboyd
Oh boy, you're at the entrance of a glorious rabbit hole. It's got a basilisk that's going to torture us all for all eternity, several murders, Harry Potter fan fiction and disturbing ties to some of the most powerful people in the world.
It's a doozy.
@mcc @emaytch
mcc
in reply to Lu-Tze • • •
Lu-Tze
in reply to mcc • • •It's my kind of trash, I guess. Among the coalition of factions trying to fuck over the vast majority of people, LessWrong is one of the goofier nodes.
They're the modern occultist strain of Nazis, like in Indiana Jones.
mcc
in reply to Lu-Tze • • •
Lu-Tze
in reply to mcc • • •Hey, it's good to have my tastes challenged from time to time. I'm aware that sometimes I can get lost in fascination for the grotesque, and while that can be both fun and part of a useful critique of fascist movements, it's also easy to lose sight of the ways such movements hurt real people.
On the other hand, you've piqued my interest about Texan weirdos. Could you maybe suggest a good point of entry?
mcc
in reply to Lu-Tze • • •
Lu-Tze
in reply to mcc • • •
Chris Ammerman
in reply to mcc • • •A rat/rabbit hole that is full of batshit has to be a "bat hole", right?
That's for sure what I'm calling it from now on anyway!
"Sorry, I'm a bit tired today. Last night I learned about Effective Altruism and went way down a bat hole reading about it online instead of going to sleep."
mcc
in reply to mcc • • •Quotes from Jeff Miller @jmeowmeow in a followers-only discussion I wanted to foreground:
"Runaway inflation in the philosophical flattery economy."
"As wealth and power is narrowly concentrated, the reward of flattery as a practice increases. If there's competition for flattery work as verbally charming people lose other opportunities for subsistence, there's motivation to go bigger and bigger."
Oblomov and The Janx Devil reshared this.
mhoye
in reply to mcc • • •
λTotoro
in reply to mcc • • •I once saw an EA person do a presentation, and their shiny formula for the expected value of the future of humanity included terms for the star density both in the Milky Way and in the local Virgo Supercluster as a whole.
Absolute clown show of a movement.
Oblomov reshared this.
Oblomov
in reply to λTotoro • • •@lambdatotoro the fun thing is that they aren't even doing THAT right. As an example (and shameless plug), some time ago I did some back-of-the-napkin calculations to check what energy requirements would be like if we didn't stop pushing for “growth at all costs”. Billions of years? Turns out that at current growth rates we'd exhaust the entire Milky Way in a couple thousand years, even if we could convert it to energy at 100% efficiency.
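For a sense of scale, here is that kind of back-of-the-napkin check with assumed round numbers (roughly current world energy use, an assumed 2% annual growth rate, a Milky Way of about 1.5 trillion solar masses converted at 100% efficiency), not the figures from the linked post:

import math

# Assumed round numbers, not the linked post's figures.
WORLD_ENERGY_PER_YEAR = 6e20       # J/yr, roughly current global primary energy use
GROWTH_RATE = 0.02                 # assumed 2% annual growth
MILKY_WAY_MASS_KG = 1.5e12 * 2e30  # ~1.5 trillion solar masses, in kg
C = 3e8                            # speed of light, m/s

budget = MILKY_WAY_MASS_KG * C ** 2  # total mass-energy, ~2.7e59 J

# Cumulative use after N years of compound growth is a geometric sum;
# solve for the year it exceeds the galaxy's entire mass-energy.
years = math.log(budget * GROWTH_RATE / WORLD_ENERGY_PER_YEAR + 1) / math.log(1 + GROWTH_RATE)
print(f"Whole galaxy spent after about {years:,.0f} years at {GROWTH_RATE:.0%} growth")

With these assumptions it lands around four to five thousand years; a higher growth rate pulls it down to "a couple thousand". Either way it is a historical timescale, not a cosmological one.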
wok.oblomov.eu/tecnologia/nucl…
Nuclear will not save us
wok
Jason Petersen (he)
in reply to mcc • • •
mcc
in reply to Jason Petersen (he) • • •@jason Charity Navigator is generally considered to be an "effective altruist" project, and it is the website I was thinking of. I pulled 2018 out of a hat because I was like "when did I first encounter that? 2015?".
It doesn't believe in the future semi-infinite AI people thing, but it embeds some of the other bad assumptions of EA. I would say that it awards its "stars" based on over-valuing metrics that may not actually be the best way to provide well-rounded community benefit.
Jason Petersen (he)
in reply to mcc • • •
Orca 🌻 | 🎀 | 🪁 | 🏴🏳️⚧️
in reply to mcc • • •Nonprofit Explorer - ProPublica
ProPublica
mcc
in reply to Orca 🌻 | 🎀 | 🪁 | 🏴🏳️⚧️ • • •
NOS 🅰️ ®️ ✝️ U
in reply to mcc • • •
jiub
in reply to mcc • • •what i don't understand is how the fuck is roko's basilisk anything more than dumb bullshit that you talk about when you're extremely drunk or high
but supposedly intelligent men believe it's a real thing
mcc
in reply to jiub • • •
ShadSterling
in reply to mcc • • •
Eleanor Saitta
in reply to mcc • • •
mcc
in reply to Eleanor Saitta • • •
Eleanor Saitta
in reply to mcc • • •