Disclaimer: I am not a reliable guide to any intellectual arena.
The following is a post from su3su2u1’s Tumblr (now defunct).

On Reliable Guides and Bayesianism

Many years ago now, I took a survey course on philosophy that was actually very good. The professor had a way of convincing us young freshmen of the validity of almost any argument presented, at least at first, and so the experience of the course was a bit unnerving. You'd find yourself walking out utterly convinced of a position that would be thoroughly demolished the very next class.
I bring this up only because this evening I had a long scotch-fueled discussion with a friend, who believes himself thoroughly Bayesian. However, never in his 'Bayesian enlightenment' has he encountered serious opposing viewpoints. I'm in no way an expert on such matters, but I have been working with machine learning for a few years now, and so I have run into day-to-day problems where Bayes is just a terrible approach.
Also, Persi Diaconis is an intellectual hero of mine (he was a magician turned mathematician. I wanted to be a magician at one point, and ended up with a physics PhD. Actually, now that I think about it, my real magic trick- after a PhD in theoretical physics I convinced an insurance company to pay me to do statistics- TA-DA), but I digress. Diaconis and Freedman had a rather famous result about fairly well behaved situations where Bayes fails. Even worse, Bayes fails without telling you it has failed- it looks like it has converged. Yes, such cases are over infinite-dimensional spaces, but a lot of interesting questions involve such spaces.
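The Diaconis-Freedman constructions live in infinite-dimensional spaces and need real machinery, but a much simpler toy (entirely hypothetical, not their construction) shows the flavor of "failing while looking converged": if the prior's support doesn't contain the truth, the posterior can pile essentially all its mass on a wrong answer with near-total confidence.

```python
import math

# Hypothetical misspecification toy: the true coin bias is 0.5, but the
# prior only supports two wrong models, p = 0.3 and p = 0.9.
hypotheses = {0.3: 0.5, 0.9: 0.5}   # prior: 50/50 over the two models
heads, tails = 500, 500             # idealized data from a fair coin

# Bayesian update, done in log space for numerical stability.
log_post = {p: math.log(w) + heads * math.log(p) + tails * math.log(1 - p)
            for p, w in hypotheses.items()}
norm = max(log_post.values())
weights = {p: math.exp(lp - norm) for p, lp in log_post.items()}
total = sum(weights.values())
posterior = {p: w / total for p, w in weights.items()}

# The posterior concentrates almost entirely on p = 0.3 -- it "converges"
# with near-certainty, yet the true value 0.5 was never in the support.
print(posterior)
```

Nothing in the update flags the problem: from inside the model, this looks exactly like successful convergence.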
But my friend, a self-declared Bayesian, had never heard of these issues. He was also somewhat surprised when I suggested that most statisticians are pragmatists who will readily switch back and forth between Bayesian and non-Bayesian methods depending on the problem. There is no big war; no sides are being drawn.
This got me thinking about how important it can be to have a guide to a new intellectual arena. If my philosophy professor had stopped after presenting only half an argument (like reading just the Either of Kierkegaard’s Either/Or), I would have left that class utterly convinced. Even worse, I could have been convinced of either side.
Big Yud, in his sequences, is clearly an advocate for lots of ideas, but I fear that he isn’t a great guide into that intellectual realm. As an autodidact, no one has ever forced him to read in depth about viewpoints (or anything really) that he isn’t interested in.
I think it's because of this eclectic, unguided collection of knowledge that I used to occasionally run into undergrads in my quantum classes that KNEW many worlds was the right quantum mechanics interpretation, even though they couldn't follow a simple scattering or decay example I tried to use to cast some doubt on the issue.
It's why I've encountered people on the internet telling me that all scientists should do Solomonoff Induction. Apparently nobody told them it's not computable, and even rather slowly asymptoting approximations are exponential (and therefore just about useless).
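A back-of-the-envelope sketch of why even truncated approximations are hopeless: a brute-force Solomonoff-style approximation has to enumerate candidate programs up to some length cutoff, and the number of candidates grows exponentially in that cutoff. This is just a counting illustration (treating every binary string as a candidate "program"), not an actual Solomonoff/AIXI implementation.

```python
def num_programs_up_to(n):
    """Count all binary strings of length 1..n -- the candidates a
    brute-force length-limited enumeration would have to consider
    (toy model: we ignore which strings are valid programs)."""
    return sum(2 ** k for k in range(1, n + 1))   # = 2**(n+1) - 2

# The enumeration blows up exponentially: before running a single
# candidate, just listing those of length <= 40 means ~2 trillion strings.
for n in (10, 20, 40):
    print(n, num_programs_up_to(n))
```

And enumeration is the easy part: each candidate must then be run, and the halting problem means you can never safely cut all of them off.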
I've even surprised sequence readers by suggesting that most scientists don't think Drexler-style nanotech is possible (i.e., that Richard Smalley won that debate) and presenting the results of an informal poll of physics grad students in my old department.
Even worse- the core of Less Wrong is supposed to be a sort of pop-Bayes interpretation. And any good pop-Bayesian should tell you that it's foolish to hold strong opinions about subjects you have basically no technical grounding in. Can't do quantum mechanics? You probably should have at most a weak preference for an interpretation of quantum. Don't know any computational complexity/information theory stuff? Maybe you shouldn't have a strong opinion about Solomonoff induction as a basis for science or AI. Don't know any physical chemistry? Maybe keep your opinions about nanotech rather weak.
That's the paradox of Less Wrong- after a core component about the importance of Bayesian updates, etc., Yudkowsky spends lots of other sequences trying to convince you to take strong opinions on technical subjects given nothing but a few dozen pages of his words.