Bayesian methods at the end

I was looking at the preface of an old statistics book and read this:

The Bayesian techniques occur at the end of each chapter; therefore they can be omitted if time does not permit their inclusion.

This approach is typical. Many textbooks present frequentist statistics with a little Bayesian statistics at the end of each section or at the end of the book.

There are a couple ways to look at that. One is simply that Bayesian methods are optional. They must not be that important or they’d get more space. The author even recommends dropping them if pressed for time.

Another way to look at this is that Bayesian statistics must be simpler than frequentist statistics since the Bayesian approach to each task requires fewer pages.


10 thoughts on “Bayesian methods at the end”

  1. Believe it or not, I think that is an improvement. I vaguely recall being taught Bayes’ theorem in one class and never hearing or seeing it again until a few years ago. I’m trying to learn it now, but coming from the old-school perspective, it is a bit of a challenge. Including it at the end of every chapter is an improvement, if professors take the time to teach it.

  2. If a course must be, say, 90% frequentist and 10% Bayesian, I don’t know whether it’s better to scatter the Bayesian material throughout the course or to concentrate it. The former may help students think about tasks: here are two approaches to point estimation, two approaches to hypothesis testing, etc. But the latter emphasizes the coherence of the Bayesian approach, all tasks being approached from the same starting point. I think I prefer the latter.

    Of course you could question the requirement of a 90-10 split, but that’s another subject.

  3. I believe you are being a bit unfair, John. I think the problem is that there are no user-friendly, introductory Bayesian books (or at least I haven’t seen any) that present stats from scratch and move into useful territory. The closest I’ve seen is Gelman and Hill’s book, but they chose to start from the familiar (frequentist glm) and move to the unfamiliar (Bayesian), I guess for pedagogical reasons.

    I have a few other Bayesian books on my bookshelf (Gelman, Carlin… and Jackman, for example) and I would hardly classify them as introductory material. I totally agree that presenting a coherent approach to stats, rather than a cookbook of recipes that do not make much sense as a whole, would be a much better way of learning the subject.

  4. Luis: I think this may be a matter of history. I don’t think it’s necessarily harder to write an introductory statistics book from a Bayesian perspective, but fewer people have tried.

    Bayesian statistics is certainly subtle, but so is frequentist statistics. If the goal is for a student to come out of a course with a level of (mis-)understanding equivalent to what they’d have after a traditional course, I think that’s doable.

  5. I’m not arguing that it’s not doable, only that I haven’t seen it done. I’m just starting to get into Bayesian applications and I have struggled to find the right book. A separate issue is software: I have called JAGS and MCMCglmm from R (I use a Mac, so WinBUGS is a hassle). It does require a different mindset and developing new habits, but overall I have found it to be an enriching experience.

    It will take me years to get there and feel at home; at the same time, most worthy objectives require similar timeframes.

  6. Classical statistics is all about summarizing the data.

    Bayesian statistics is data + prior information.

    On those grounds alone, Bayes is more complicated, and it makes sense to do classical statistics first. Not necessarily p-values etc., but estimates, s.e.’s, and confidence intervals for sure.

  7. @Luis: for beginners I agree tools are a problem. I would personally stick to analytical solutions where they exist, naive numerical approaches for 2-3 variable models, and least squares (explained from a Bayesian perspective) for more complicated problems where it’s applicable. Certainly this leaves out a lot of models, but I think that’s unavoidable for beginners. A small closed-form example is sketched below.
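    As a minimal sketch of what an analytical solution can look like, here is a Beta-Binomial update in Python (the model and code are assumed purely for illustration, not taken from the comment itself); the posterior can be written down in closed form, so no sampler is needed:

        # Beta-Binomial conjugate update: with a Beta(a, b) prior on the success
        # probability and binomial data, the posterior is Beta(a + successes,
        # b + failures) in closed form.
        from scipy import stats

        def beta_binomial_posterior(successes, failures, a=1.0, b=1.0):
            """Return the closed-form posterior for a binomial success probability."""
            return stats.beta(a + successes, b + failures)

        # Example: 7 successes in 10 trials with a uniform Beta(1, 1) prior.
        posterior = beta_binomial_posterior(7, 3)
        print("Posterior mean:", posterior.mean())                # (1 + 7) / 12 = 0.667
        print("95% credible interval:", posterior.interval(0.95))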
