I was a bit disappointed that it wasn’t about machines doing automatic analysis for us, but this article, “The vast majority of statistical analysis is not performed by statisticians”, is a bit of a wake-up call for those statisticians who haven’t realised that we need to improve the way we teach statistics and interact with non-statisticians. I don’t think we have enough statisticians in the world to do all the analysis that needs doing, so we need to focus on training scientists and others better so that we don’t leave them stuck in a culture of bad regression and t-tests in Excel.
Gianluca Baio (UCL) has a really nice introduction to INLA with a comparison to JAGS. I started using JAGS when WinBUGS/OpenBUGS was becoming too slow for the analysis I was doing, but the major paper of my thesis uses INLA for spatio-temporal analysis. I still use both programs, and when faced with a new problem I will usually start in JAGS, as it’s quite flexible in the way you set up priors. INLA has its advantages as well, one of them being that it will quite happily fit a Poisson likelihood to non-integer data.
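To make that last point concrete, here is a minimal sketch of what I mean, with made-up data (the data frame and its values are purely illustrative). JAGS’s `dpois` distribution requires integer observations, whereas `inla()` evaluates the Poisson log-likelihood as a continuous function of the response, so non-integer values go through without complaint:

```r
library(INLA)  # R-INLA is installed from r-inla.org, not CRAN

# Hypothetical data: "counts" that have been averaged over replicates,
# so they are no longer whole numbers.
df <- data.frame(y = c(1.5, 2.25, 0.75, 3.5),
                 x = c(0.1, 0.4, -0.2, 0.8))

# In JAGS, y[i] ~ dpois(lambda[i]) would throw an error for the y above;
# inla() fits the same model without the integrality restriction.
fit <- inla(y ~ x, family = "poisson", data = df)
summary(fit)
```

Whether fitting a Poisson likelihood to non-integer data is a statistically sensible thing to do is, of course, a separate question from whether the software will let you.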
There’s a neat little article on the PLoS blog about the links between art and science, and how involving art in research (beyond making prettier plots, which is really more an issue of design than art) can lead to better scientific outcomes.
Radford Neal has just announced pqR, “pretty quick R”, which is designed to make use of multiple cores wherever possible and to avoid unnecessary computation. It’s not available for Mac/Windows yet, so I won’t be able to look at it for the time being. I wonder if QUT’s HPC group would consider making it available on the supercomputer.