Book Description
Supported by a wealth of learning features, exercises, and visual elements, as well as online video tutorials and interactive simulations, this book is the first student-focused introduction to Bayesian statistics. Without sacrificing technical integrity for the sake of simplicity, the author uses accessible, student-friendly language to provide approachable instruction aimed squarely at newcomers to statistics and Bayesian methods. Through a logical structure that introduces and builds upon key concepts gradually, and slowly acclimatizes students to the R and Stan software, the book covers:
- An introduction to probability and Bayesian inference
- Understanding Bayes’ rule
- The nuts and bolts of Bayesian analytic methods
- Computational Bayes and real-world Bayesian analysis
- Regression analysis and hierarchical methods
This unique guide will help students develop the statistical confidence and skills to put the Bayesian formula into practice, from the basic concepts of statistical inference to complex applied analyses.
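As a flavour of the kind of analysis the book builds towards, here is a minimal sketch of Bayes’ rule in action: a Beta-Binomial conjugate update for a coin’s bias. This example is illustrative only (it is not taken from the book, and the function names are our own); it shows the closed-form posterior that conjugate priors make possible, a topic the book covers in its chapter on conjugate analysis.

```python
# Beta-Binomial conjugate update for a coin's bias theta.
# Prior: theta ~ Beta(a, b).  Data: k heads in n flips.
# Posterior (by Bayes' rule, in closed form): theta ~ Beta(a + k, b + n - k).

def posterior_params(a, b, k, n):
    """Return the Beta posterior parameters after observing k heads in n flips."""
    return a + k, b + (n - k)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Start from a flat Beta(1, 1) prior and observe 7 heads in 10 flips.
a_post, b_post = posterior_params(1, 1, k=7, n=10)
print(a_post, b_post)                        # 8 4
print(round(beta_mean(a_post, b_post), 3))   # 0.667
```

For models without such conjugate structure, the posterior has no closed form, which is where the book’s computational chapters (MCMC, Stan) take over.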
Chapter 1: How to best use this book
- The purpose of this book
- Who is this book for?
- Pre-requisites
- Book outline
- Route planner - suggested journeys through Bayesland
- Video
- Problem sets
- Code
- R and Stan
- Why don’t more people use Bayesian statistics?
- What are the tangible (non-academic) benefits of Bayesian statistics?

Part I: An introduction to Bayesian inference

Chapter 2: The subjective worlds of Frequentist and Bayesian statistics
- Bayes’ rule - allowing us to go from the effect back to its cause
- The purpose of statistical inference
- The world according to Frequentists
- The world according to Bayesians
- Do parameters actually exist and have a point value?
- Frequentist and Bayesian inference
- Bayesian inference via Bayes’ rule
- Implicit versus explicit subjectivity

Chapter 3: Probability - the nuts and bolts of Bayesian inference
- Probability distributions: helping us explicitly state our ignorance
- Independence
- Central Limit Theorems
- A derivation of Bayes’ rule
- The Bayesian inference process from the Bayesian formula

Part II: Understanding the Bayesian formula

Chapter 4: Likelihoods
- What is a likelihood?
- Why use ‘likelihood’ rather than ‘probability’?
- What are models and why do we need them?
- How to choose an appropriate likelihood?
- Exchangeability vs random sampling
- Maximum likelihood - a short introduction

Chapter 5: Priors
- What are priors, and what do they represent?
- The explicit subjectivity of priors
- Combining a prior and likelihood to form a posterior
- Constructing priors
- A strong model is less sensitive to prior choice

Chapter 6: The devil’s in the denominator
- An introduction to the denominator
- The difficulty with the denominator
- How to dispense with the difficulty: Bayesian computation

Chapter 7: The posterior - the goal of Bayesian inference
- Expressing parameter uncertainty in posteriors
- Bayesian statistics: updating our pre-data uncertainty
- The intuition behind Bayes’ rule for inference
- Point parameter estimates
- Intervals of uncertainty
- From posterior to predictions by sampling

Part III: Analytic Bayesian methods

Chapter 8: An introduction to distributions for the mathematically-un-inclined
- The interrelation among distributions
- Sampling distributions for likelihoods
- Prior distributions
- How to choose a likelihood
- Table of common likelihoods, their uses, and reasonable priors
- Distributions of distributions, and mixtures - link to website, and relevance

Chapter 9: Conjugate priors and their place in Bayesian analysis
- What is a conjugate prior and why are they useful?
- Gamma-Poisson example
- Normal example: giraffe height
- Table of conjugate priors
- The lessons and limits of a conjugate analysis

Chapter 10: Evaluation of model fit and hypothesis testing
- Posterior predictive checks
- Why do we call it a p value?
- Statistics measuring predictive accuracy: AIC, Deviance, WAIC and LOO-CV
- Marginal likelihoods and Bayes factors
- Choosing one model, or a number?
- Sensitivity analysis

Chapter 11: Making Bayesian analysis objective?
- The illusion of the ‘uninformative’ uniform prior
- Jeffreys’ priors
- Reference priors
- Empirical Bayes
- A move towards weakly informative priors

Part IV: A practical guide to doing real life Bayesian analysis: Computational Bayes

Chapter 12: Leaving conjugates behind: Markov chain Monte Carlo
- The difficulty with real life Bayesian inference
- Discrete approximation to continuous posteriors
- The posterior through quadrature
- Integrating using independent samples: an introduction to Monte Carlo
- Why is independent sampling easier said than done?
- Ideal sampling from a posterior using only the un-normalised posterior
- Moving from independent to dependent sampling
- What’s the catch with dependent samplers?

Chapter 13: Random Walk Metropolis
- Sustainable fishing
- Prospecting for gold
- Defining the Metropolis algorithm
- When does Metropolis work?
- Efficiency of convergence: the importance of choosing the right proposal scale
- Metropolis-Hastings
- Judging convergence
- Effective sample size revisited

Chapter 14: Gibbs sampling
- Back to prospecting for gold
- Defining the Gibbs algorithm
- Gibbs’ earth: the intuition behind the Gibbs algorithm
- The benefits and problems with Gibbs and Random Walk Metropolis
- A change of parameters to speed up exploration

Chapter 15: Hamiltonian Monte Carlo
- Hamiltonian Monte Carlo as a sledge
- NLP space
- Solving for the sledge motion over NLP space
- How to shove the sledge
- The acceptance probability of HMC
- The complete Hamiltonian Monte Carlo algorithm
- The performance of HMC versus Random Walk Metropolis and Gibbs
- Optimal step length of HMC: introducing the “No-U-Turn Sampler”

Chapter 16: Stan
- Why Stan, and how to get it
- Getting set up with Stan using RStan
- Our first words in Stan
- Essential Stan reading
- What to do when things go wrong
- How to get further help

Part V: Hierarchical models and regression

Chapter 17: Hierarchical models
- The spectrum from fully-pooled to heterogeneous
- Non-centred parameterisations in hierarchical models
- Case study: forecasting the EU referendum result
- The importance of fake data simulation for complex models

Chapter 18: Linear regression models
- Example: high school test scores in England
- Pooled model
- Interactions
- Heterogeneous coefficient model
- Hierarchical model
- Incorporating LEA-level data

Chapter 19: Generalised linear models and other animals
- Example: electoral participation in European countries
- Discrete parameter models in Stan
Trade Policy - Buyer’s Notice
- About the products:
- ● Authenticity guaranteed: this website is operated by China International Book Trading Corporation, and all books are guaranteed 100% genuine.
- ● Eco-friendly paper: most imported books are printed on eco-friendly lightweight paper, which is slightly yellow in colour and relatively light in weight.
- ● Deckle-edge editions: the page edges are intentionally left rough and uneven; these are usually hardcovers and are considered more collectible.
About returns and exchanges:
- Because pre-ordered products are special, once a purchase order has been formally placed, the buyer may not cancel all or part of the order without cause.
- Because imported books are special, if any of the following occurs, please refuse the delivery so that the courier returns the parcel:
- ● Damaged outer packaging / wrong item shipped / items missing from the shipment / damaged book exterior / incomplete accessories (e.g. CDs)
Then please call us on a working day at 400-008-1110.
- If any of the following is found after signing for the parcel, please contact customer service within 5 working days of receipt to arrange a return or exchange:
- ● Missing pages / misordered pages / misprints / loose binding
About dispatch times:
- Under normal circumstances:
- ● [In stock] Dispatched by courier from our Beijing warehouse within 48 hours of ordering.
- ● [Pre-order] / [Pre-sale] Shipped from abroad after ordering; expected arrival in roughly 5-8 weeks. The shop ships by ZTO Express by default; SF Express is available with freight payable on delivery.
- ● For customers requiring an invoice, dispatch may be delayed by a further 1-2 working days (for urgent invoice requests, call 010-68433105/3213).
- ● If other special circumstances affect dispatch times, we will post a notice on the website as soon as possible; please keep an eye out.
About delivery times:
- Imported books are handed to third-party couriers after customs clearance and warehousing, so we can only guarantee dispatch within the stated time, not an exact delivery date.
- ● Major cities: usually 2-4 days
- ● Remote areas: usually 4-7 days
About telephone enquiry hours:
- 010-68433105/3213 is staffed Monday to Friday, 8:30 am to 5:00 pm. We cannot take calls on weekends or public holidays; thank you for your understanding.
- At other times you can also reach us by email at customer@readgo.cn; messages are handled with priority on working days.
About couriers:
- ● Paid orders are delivered mainly by ZTO Express and ZJS Express; for order status enquiries, call 010-68433105/3213.
No recommendations for this book yet.