My predictions for 2018. As usual, I am calibrating my predictions by comparing the percentage of predictions I got right at each probability level versus their probability (e.g., for predictions at the 70% confidence level, perfect calibration would represent getting 7/10 of them correct). Predictions with a probability rating of less than 50% are converted...
Previous predictions/calibrations: 2017: Predictions - Calibration 2016: Predictions - Calibration Please note that my "Blackpill Timeline" is a what-if scenario, not a prediction. While percentages for existing predictions will remain fixed, I reserve the right to add more predictions during the next couple of days. Oil prices (WTI Crude) are higher than $70: 50% BTC...
In the spirit of #SkinInTheGame, Taleb’s idea that pundits should at least stake their reputations on the strength of their knowledge, last year I made some predictions about 2017. See also predictions and results for 2016. As usual, I am calibrating my predictions by comparing the percentage of predictions I got right at each probability...
In the spirit of #SkinInTheGame, Taleb's idea that pundits should at least stake their reputations on the strength of their knowledge, last year I made some predictions about what has come to be known as The Current Year. Like Scott Alexander, I am calibrating my predictions by comparing the percentage of predictions I got right...
It's just a bit ironic that the figurehead of a movement that emphasizes "politics is the mindkiller" (a figurehead best known to the general public as the author of the world's most popular Harry Potter fanfic) has lapsed into all-out Putin-Trump Derangement Syndrome and fake news generation.
This month the SF Bay Area-based rationality organization LessWrong has released the latest survey of its members, or rather of its "diaspora," since the site itself has gone mostly dormant, with many of its members now congregating at Scott Alexander's blog and Yudkowsky's various offshoots. Although some critics disparage Less Wrong as a clique...
Last month there was an interview with Eliezer Yudkowsky, the rationalist philosopher and successful Harry Potter fanfic writer who heads the world's foremost research outfit dedicated to figuring out ways in which a future runaway computer superintelligence could be made to refrain from murdering us all. It's really pretty interesting. It contains a nice explication...
I like predictions. Part of that is related to my passion for quantifying everything, but another part is philosophical, borne of my antipathy towards charlatanism (I am extremely sympathetic to N.N. Taleb on this issue). In 2005, U.C. Berkeley psychologist Philip Tetlock published a study on expert fallibility spanning 18 years, 284 experts and 82,361...
Effective altruism (EA) is the fairly simple idea that in charitable giving, as in financial investment, you should aim to put your money where it would do the most good - be it earning the highest returns, or helping the maximum number of people. It is a laudable enough goal, though the ideas behind it...
I am a blogger, thinker, and businessman in the SF Bay Area. I’m originally from Russia, spent many years in Britain, and studied at U.C. Berkeley.
One of my tenets is that ideologies tend to suck. As such, I hesitate about attaching labels to myself. That said, if it’s really necessary, I suppose “liberal-conservative neoreactionary” would be close enough.
Though I consider myself part of the Orthodox Church, my philosophy and spiritual views are more influenced by digital physics, Gnosticism, and Russian cosmism than anything specifically Judeo-Christian.