Machine Learning in Python (English, Paperback, Michael Bowles)

4.4 (21 Ratings & 0 Reviews)
₹597 (MRP ₹649, 8% off)
    Highlights
    • Language: English
    • Binding: Paperback
    • Publisher: Wiley
    • ISBN: 9788126555925, 8126555920
    • Edition: 2015
    • Pages: 360
    Services
    • 10 Days Replacement Policy
    • Cash on Delivery available
    Seller
    BOOKCENTRE (rated 3.7)
  • Description
    Machine Learning in Python shows you how to successfully analyze data using only two core machine learning algorithms and how to apply them using Python. By focusing on two algorithm families that effectively predict outcomes, this book is able to provide full descriptions of the mechanisms at work and the examples that illustrate the machinery with specific, hackable code. The algorithms are explained in simple terms with no complex math and applied using Python, with guidance on algorithm selection, data preparation and using the trained models in practice.
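As a taste of the first of the two algorithm families the description mentions, here is a minimal sketch of penalized (lasso) linear regression using scikit-learn. This is illustrative only, not code from the book; the data is synthetic and the parameters are made up for the example.

```python
# Illustrative sketch of penalized (lasso) linear regression with
# scikit-learn -- not code from the book; the data here is synthetic.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                  # 100 samples, 5 attributes
true_coef = np.array([1.5, 0.0, -2.0, 0.0, 0.5])
y = X @ true_coef + rng.normal(scale=0.1, size=100)

# The L1 (lasso) penalty shrinks small coefficients all the way to zero,
# giving the sparse solutions the book highlights as a selling point of
# penalized regression.
model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_)                             # sparse coefficient vector
```

Note how the coefficients that are truly zero in the generating model come out exactly zero in the fit, while the large coefficients survive (slightly shrunk toward zero by the penalty).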

    About the Author

    Michael Bowles began his career as an assistant professor at MIT and went on to found and run two Silicon Valley start-ups, both of which went public. Dr. Bowles currently teaches machine learning at Hacker Dojo (a shared workspace in Silicon Valley), consults on machine learning projects, and is involved in a number of start-ups applying machine learning in areas such as bioinformatics and high-frequency trading. His courses at Hacker Dojo are nearly always sold out and receive great feedback from participants.
    Specifications
    Book Details
    Publication Year
    • June 2015
    Table of Contents
    • Introduction
      Chapter 1: The Two Essential Algorithms for Making Predictions
      Why Are These Two Algorithms So Useful?
      What Are Penalized Regression Methods?
      What Are Ensemble Methods?
      How to Decide Which Algorithm to Use
      The Process Steps for Building a Predictive Model
      Framing a Machine Learning Problem
      Feature Extraction and Feature Engineering
      Determining Performance of a Trained Model
      Chapter Contents and Dependencies

      Chapter 2: Understand the Problem by Understanding the Data
      The Anatomy of a New Problem
      Different Types of Attributes and Labels Drive Modeling Choices
      Things to Notice about Your New Data Set
      Classification Problems - Detecting Unexploded Mines Using Sonar
      Physical Characteristics of the Rocks Versus Mines Data Set
      Statistical Summaries of the Rocks versus Mines Data Set
      Visualization of Outliers Using Quantile Plot
      Statistical Characterization of Categorical Attributes
      How to Use Python Pandas to Summarize the Rocks Versus Mines Data Set
      Visualizing Properties of the Rocks versus Mines Data Set
      Visualizing with Parallel Coordinates Plots
      Visualizing Interrelationships between Attributes and Labels
      Visualizing Attribute and Label Correlations Using a Heat Map
      Summarizing the Process for Understanding Rocks versus Mines Data Set
      Real-Valued Predictions with Factor Variables
      How Old Is Your Abalone?
      Parallel Coordinates for Regression Problems - Visualize Variable Relationships for Abalone Problem
      How to Use Correlation Heat Map for Regression - Visualize Pair-Wise Correlations for the Abalone Problem
      Real]Valued Predictions Using Real]Valued Attributes - Calculate How Your Wine Tastes
      Multiclass Classification Problem - What Type of Glass Is That?

      Chapter 3: Predictive Model Building - Balancing Performance, Complexity and Big Data
      The Basic Problem - Understanding Function Approximation
      Working with Training Data
      Assessing Performance of Predictive Models
      Factors Driving Algorithm Choices and Performance - Complexity and Data
      Contrast Between a Simple Problem and a Complex Problem
      Contrast Between a Simple Model and a Complex Model
      Factors Driving Predictive Algorithm Performance
      Choosing an Algorithm - Linear or Nonlinear?
      Measuring the Performance of Predictive Models
      Performance Measures for Different Types of Problems
      Simulating Performance of Deployed Models
      Achieving Harmony Between Model and Data
      Choosing a Model to Balance Problem Complexity, Model Complexity and Data Set Size
      Using Forward Stepwise Regression to Control Overfitting
      Evaluating and Understanding Your Predictive Model
      Control Overfitting by Penalizing Regression Coefficients - Ridge Regression

      Chapter 4: Penalized Linear Regression
      Why Penalized Linear Regression Methods Are So Useful
      Extremely Fast Coefficient Estimation
      Variable Importance Information
      Extremely Fast Evaluation When Deployed
      Reliable Performance
      Sparse Solutions
      Problem May Require Linear Model
      When to Use Ensemble Methods
      Penalized Linear Regression - Regulating Linear Regression for Optimum Performance
      Training Linear Models - Minimizing Errors and More
      Adding a Coefficient Penalty to the OLS Formulation
      Other Useful Coefficient Penalties - Manhattan and Elastic Net
      Why Lasso Penalty Leads to Sparse Coefficient Vectors
      Elastic Net Penalty Includes Both Lasso and Ridge
      Solving the Penalized Linear Regression Problem
      Understanding Least Angle Regression and Its Relationship to Forward Stepwise Regression
      How LARS Generates Hundreds of Models of Varying Complexity
      Choosing the Best Model from The Hundreds LARS Generates
      Using Glmnet: Very Fast and Very General
      Comparison of the Mechanics of Glmnet and LARS Algorithms
      Initializing and Iterating the Glmnet Algorithm
      Extensions to Linear Regression with Numeric Input
      Solving Classification Problems with Penalized Regression
      Working with Classification Problems Having More Than Two Outcomes
      Understanding Basis Expansion - Using Linear Methods on Nonlinear Problems
      Incorporating Non-Numeric Attributes into Linear Methods

      Chapter 5: Building Predictive Models Using Penalized Linear Methods
      Python Packages for Penalized Linear Regression
      Multivariable Regression - Predicting Wine Taste
      Building and Testing a Model to Predict Wine Taste
      Training on the Whole Data Set before Deployment
      Basis Expansion - Improving Performance by Creating New Variables from Old Ones
      Binary Classification - Using Penalized Linear Regression to Detect Unexploded Mines
      Build a Rocks versus Mines Classifier for Deployment
      Multiclass Classification - Classifying Crime Scene Glass Samples

      Chapter 6: Ensemble Methods
      Binary Decision Trees
      How a Binary Decision Tree Generates Predictions
      How to Train a Binary Decision Tree
      Tree Training Equals Split Point Selection
      How Split Point Selection Affects Predictions
      Algorithm for Selecting Split Points
      Multivariable Tree Training - Which Attribute to Split?
      Recursive Splitting for More Tree Depth
      Overfitting Binary Trees
      Measuring Overfit with Binary Trees
      Balancing Binary Tree Complexity for Best Performance
      Modifications for Classification and Categorical Features
      Bootstrap Aggregation - "Bagging"
      How Does the Bagging Algorithm Work?
      Bagging Performance - Bias versus Variance
      How Bagging Behaves on Multivariable Problem
      Bagging Needs Tree Depth for Performance
      Summary of Bagging
      Gradient Boosting
      Basic Principle of Gradient Boosting Algorithm
      Parameter Settings for Gradient Boosting
      How Gradient Boosting Iterates Toward a Predictive Model
      Getting the Best Performance from Gradient Boosting
      Gradient Boosting on a Multivariable Problem
      Summary for Gradient Boosting
      Random Forest
      Random Forests - Bagging Plus Random Attribute Subsets
      Random Forests Performance Drivers
      Random Forests Summary

      Chapter 7: Building Ensemble Models with Python
      Solving Regression Problems with Python Ensemble Packages
      Building a Random Forest Model to Predict Wine Taste
      Constructing a Random Forest Regressor Object
      Modeling Wine Taste with Random Forest Regressor
      Visualizing the Performance of a Random Forests Regression Model
      Using Gradient Boosting to Predict Wine Taste
      Using the Class Constructor for Gradient Boosting Regressor
      Using Gradient Boosting Regressor to Implement a Regression Model
      Assessing the Performance of a Gradient Boosting Model
      Coding Bagging to Predict Wine Taste
      Incorporating Non-Numeric Attributes in Python Ensemble Models
      Coding the Sex of Abalone for Input to Random Forest Regression in Python
      Assessing Performance and the Importance of Coded Variables
      Coding the Sex of Abalone for Gradient Boosting Regression in Python
      Assessing Performance and the Importance of Coded Variables with Gradient Boosting
      Solving Binary Classification Problems with Python Ensemble Methods
      Detecting Unexploded Mines with Python Random Forest
      Constructing a Random Forests Model to Detect Unexploded Mines
      Determining the Performance of a Random Forests Classifier
      Detecting Unexploded Mines with Python Gradient Boosting
      Determining the Performance of a Gradient Boosting Classifier
      Solving Multiclass Classification Problems with Python Ensemble Methods
      Classifying Glass with Random Forests
      Dealing with Class Imbalances
      Classifying Glass Using Gradient Boosting
      Assessing the Advantage of Using Random Forest Base Learners with Gradient Boosting
      Comparing Algorithms
      Summary
      Index
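The ensemble chapters listed above center on random forests and gradient boosting. A minimal scikit-learn sketch of both, fit to a small synthetic regression problem, gives a feel for the workflow; this is illustrative only, not the wine-taste example used in the book, and all data and parameter choices here are invented.

```python
# Illustrative sketch of the two ensemble families covered in Chapters 6-7:
# random forests and gradient boosting, via scikit-learn.
# Synthetic data -- not code or data from the book.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.uniform(size=(300, 4))                  # 4 attributes, 2 of them irrelevant
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.05, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

results = {}
for model in (RandomForestRegressor(n_estimators=200, random_state=0),
              GradientBoostingRegressor(n_estimators=200, random_state=0)):
    model.fit(X_tr, y_tr)                       # both share the same fit/predict API
    results[type(model).__name__] = mean_squared_error(y_te, model.predict(X_te))
print(results)                                  # held-out MSE for each ensemble
```

Both models expose the same `fit`/`predict` interface, so comparing algorithms (as Chapter 7 does) amounts to swapping the constructor.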

    Contributors
    Authored By
    • Michael Bowles
    Series & Set Details
    Series Name
    • MISL-WILEY
    Ratings and Reviews
    4.4 average from 21 Ratings and 0 Reviews
    • 5 stars: 15
    • 4 stars: 2
    • 3 stars: 2
    • 2 stars: 1
    • 1 star: 1