Tree-Based Methods for Statistical Learning in R (Chapman & Hall/CRC)



List price: 1136.25
Sale price: 909.00
Promotion: site-wide sale, up to 20% off
Delivery: ships from an overseas warehouse; typically arrives 3-5 weeks after payment
Author: Brandon M. Greenwell
Publisher: Chapman & Hall/CRC
Publication date: June 3, 2022
Binding: Hardcover
ISBN: 9780367532468
Pages: 400
Trim size: 234 x 156 mm (6.14 x 9.21 in)
Language: English
Book description
Tree-Based Methods for Statistical Learning in R provides a thorough introduction to both individual decision tree algorithms (Part I) and ensembles thereof (Part II). Part I of the book brings several different tree algorithms into focus, both conventional and contemporary. Building a strong foundation in how individual decision trees work helps readers understand, at a deeper level, tree-based ensembles, which lie at the cutting edge of modern statistical and machine learning methodology.

The book follows up most ideas and mathematical concepts with code-based examples in the R statistical language, with an emphasis on using as few external packages as possible. For example, readers are shown how to write their own random forest and gradient tree boosting functions using simple for loops and basic tree-fitting software (like rpart and party/partykit); illustrative sketches of these ideas appear after the feature list below. The core chapters also end with a detailed section on relevant software in both R and other open-source alternatives (e.g., Python, Spark, and Julia), with example usage on real data sets. While the book mostly uses R, it is meant to be equally accessible and useful to non-R programmers.

Readers will come away with a solid foundation in (and appreciation for) tree-based methods and how they can be used to solve the practical problems and challenges data scientists often face in applied work.

Features:
  • Thorough coverage, from the ground up, of tree-based methods (e.g., CART, conditional inference trees, bagging, boosting, and random forests).
  • A companion website containing additional supplementary material and the code to reproduce every example and figure in the book.
  • A companion R package, called treemisc, which contains several of the data sets and functions used throughout the book (e.g., an implementation of gradient tree boosting with LAD loss that shows how to perform the line search step by updating the terminal node estimates of a fitted rpart tree).
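To give a flavor of the "for loops plus rpart" style of example described above, here is a minimal sketch of bagging regression trees. This is our own illustration, not code from the book or from the treemisc package: the function name simple_bagger and its arguments are made up, and a true random forest would additionally sample a random subset of predictors at each split, which this sketch omits.

```r
# Minimal sketch of bagging regression trees with a plain for loop and rpart().
# NOTE: simple_bagger() is an illustrative name, not a function from the book or
# from treemisc. A true random forest would also sample predictors at each split.
library(rpart)

simple_bagger <- function(formula, data, ntree = 100) {
  trees <- vector("list", ntree)
  for (b in seq_len(ntree)) {
    idx <- sample(nrow(data), replace = TRUE)  # bootstrap sample of the rows
    # Grow a deep (essentially unpruned) tree on the bootstrap sample
    trees[[b]] <- rpart(formula, data = data[idx, ], cp = 0, minsplit = 2, xval = 0)
  }
  structure(list(trees = trees), class = "simple_bagger")
}

# Regression predictions: average the predictions from the individual trees
predict.simple_bagger <- function(object, newdata, ...) {
  rowMeans(sapply(object$trees, predict, newdata = newdata))
}

# Example usage on a built-in data set
set.seed(101)
fit <- simple_bagger(mpg ~ ., data = mtcars, ntree = 50)
head(predict(fit, newdata = mtcars))
```

Averaging many deep, high-variance trees grown on bootstrap samples is what stabilizes the predictions; a random forest adds per-split predictor sampling to further decorrelate the trees.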
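The treemisc feature above refers to gradient tree boosting with LAD (least absolute deviation) loss. A rough sketch of how this can be done with rpart follows, using our own hypothetical names lad_boost() and predict_lad_boost(); it is not the treemisc implementation. The pseudo-residuals for LAD loss are sign(y - f(x)), and the line-search step can be approximated by overwriting each terminal node's estimate (stored in the tree's frame$yval) with the median of the current residuals in that node.

```r
# Rough sketch of gradient tree boosting with LAD loss using rpart().
# NOTE: lad_boost() and predict_lad_boost() are hypothetical names; this is not
# the treemisc implementation.
library(rpart)

lad_boost <- function(x, y, ntree = 100, shrinkage = 0.1, maxdepth = 3) {
  dat <- data.frame(x, .y = 0)
  fhat <- rep(median(y), length(y))   # initial fit: the overall median
  trees <- vector("list", ntree)
  for (b in seq_len(ntree)) {
    dat$.y <- sign(y - fhat)          # pseudo-residuals (negative LAD gradient)
    fit <- rpart(.y ~ ., data = dat, maxdepth = maxdepth, cp = 0, xval = 0)
    # "Line search": overwrite each terminal node estimate with the median of
    # the current residuals falling in that node (stored in frame$yval)
    leaf <- fit$where                 # row of fit$frame for each observation
    med <- tapply(y - fhat, leaf, median)
    fit$frame$yval[as.integer(names(med))] <- med
    fhat <- fhat + shrinkage * predict(fit, newdata = dat)
    trees[[b]] <- fit
  }
  list(init = median(y), shrinkage = shrinkage, trees = trees)
}

predict_lad_boost <- function(model, newdata) {
  pred <- rep(model$init, nrow(newdata))
  for (fit in model$trees) {
    pred <- pred + model$shrinkage * predict(fit, newdata = newdata)
  }
  pred
}

# Example usage: training mean absolute error on a built-in data set
set.seed(101)
fit <- lad_boost(mtcars[, -1], mtcars$mpg, ntree = 50)
mean(abs(mtcars$mpg - predict_lad_boost(fit, mtcars[, -1])))
```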