The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition

    by Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome
    Series: Springer Series in Statistics

      • GET 20% OFF

      • The discount is available only to recipients of the 'Alert of Favourite Topics' newsletter.
      • Publisher's list price: EUR 80.24
      • The price is an estimate, because at the time of ordering we do not know what HUF / product-currency exchange rate will apply when the book arrives. If the forint is weaker, the price rises slightly; if it is stronger, the price falls slightly.

        Estimated price: 34 037 Ft (32 416 Ft + 5% VAT)
      • Discount: 20% (approx. 6 807 Ft off)
      • Discounted price: 27 229 Ft (25 933 Ft + 5% VAT) (the arithmetic is sketched after this list)
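    The listed figures combine a 20% discount with 5% VAT. Below is a minimal sketch of that arithmetic in Python; it is an illustration only, since the shop's exact rounding and EUR-to-HUF conversion rules are not stated, so the last forint may differ from the listed values.

        # Sketch of the discount/VAT arithmetic above (illustration only;
        # the shop's exact rounding and EUR->HUF conversion rules are not
        # stated, so results may differ from the listed figures by a forint).
        VAT_RATE = 0.05    # 5% VAT, as listed
        DISCOUNT = 0.20    # 20% newsletter discount

        gross_list_price = 34_037                                # estimated price in Ft, incl. VAT
        net_list_price = gross_list_price / (1 + VAT_RATE)       # ~32 416 Ft
        discount_amount = gross_list_price * DISCOUNT            # ~6 807 Ft
        gross_discounted = gross_list_price - discount_amount    # ~27 230 Ft (listed: 27 229 Ft)
        net_discounted = gross_discounted / (1 + VAT_RATE)       # ~25 933 Ft

        for label, value in [("net list price", net_list_price),
                             ("discount", discount_amount),
                             ("discounted price (gross)", gross_discounted),
                             ("discounted price (net)", net_discounted)]:
            # print with a space as the thousands separator, matching the page's "34 037 Ft" style
            print(f"{label}: {round(value):,} Ft".replace(",", " "))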

    Availability

    In stock at the publisher, but not at Prospero's office. Estimated delivery time: approx. 3-5 weeks.

    Why don't you give an exact delivery time?

    Delivery times are estimated from our previous experience. We can only give estimates because we order from outside Hungary, and the delivery time depends mainly on how quickly the publisher supplies the book. Deliveries are sometimes faster and sometimes slower, but we do our best to supply the book as quickly as possible.

    Short description:

    During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It is a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting (the first comprehensive treatment of this topic in any book).

    This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression and path algorithms for the lasso, non-negative matrix factorization, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.

    Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, projection pursuit and gradient boosting.

    Long description:

    This book describes the important ideas in a variety of fields such as medicine, biology, finance, and marketing in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of colour graphics. It is a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting (the first comprehensive treatment of this topic in any book).

    This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression and path algorithms for the lasso, non-negative matrix factorisation, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.

    Table of Contents:

    • Overview of Supervised Learning
    • Linear Methods for Regression
    • Linear Methods for Classification
    • Basis Expansions and Regularization
    • Kernel Smoothing Methods
    • Model Assessment and Selection
    • Model Inference and Averaging
    • Additive Models, Trees, and Related Methods
    • Boosting and Additive Trees
    • Neural Networks
    • Support Vector Machines and Flexible Discriminants
    • Prototype Methods and Nearest-Neighbors
    • Unsupervised Learning
    • Random Forests
    • Ensemble Learning
    • Undirected Graphical Models
    • High-Dimensional Problems: p ≫ N
