If you aren't familiar with it, Less Wrong is a community blog started by Eliezer Yudkowsky (better known on the internet as the author of Harry Potter and the Methods of Rationality), centered on refining human rationality. The premise is that evolution has, metaphorically speaking, engineered many flaws into us -- cognitive biases -- and that by understanding how those biases work and correcting for them, we can get a clearer picture of how reality actually works.
We avoid silly caricatures of how "rationality" works, like the emotionless pseudo-rational Spock or the magically deductive Sherlock Holmes, and we aren't soulless, joyless calculating machines either. We promote a rationality that real humans can actually achieve and use to improve their lives -- my own life has gotten noticeably better since I started applying rationalist and anti-bias principles to it.
Don't expect us to support libertarianism, MRAism, and all that neo-reactionary garbage -- a lot of neo-reactionaries have latched onto LW, but the core community and its leaders disdain them. Don't expect us to blindly accept everything any particular political position espouses, either. Our morals generally land on the bleeding-heart liberal-socialist side, sometimes further left. Our economics tends toward the post-scarcity kind, since we're futurists who believe the future's gonna be a hoot.
We've been compared to a cult on several sites, but it's a pretty nice place regardless. Be warned that we're usually associated with several other ideas, like nanotech utopianism, singularitarianism, transhumanism, anti-death advocacy/cryonics, superhuman AI/AGI, and polyamory/polygamy (I'm a monogamist myself, mind you). If you're familiar with the Culture, that's basically what we're aiming for!
I'm sure I'll get snarky replies, but if you actually disagree, I'd prefer you argue your points clearly, with a minimum of rhetoric. You won't win points just by quoting one of our more ludicrous-sounding beliefs, because while we have plenty of those, most of them are "distant in inferential space" -- in other words, you need to understand some less weird-sounding things before the weird things make sense, the same way you need arithmetic to do algebra and algebra to do calculus.
If you're not put off by all this and want to explore LWian rationality further, I recommend reading the LessWrong Sequences. Books like Predictably Irrational are good too. If you want to look more broadly, check out the map of the rationalist community. And if you're in Boston or the Bay Area, CFAR is hosting rationality workshops in the coming months.