CAN YOU MAKE ANYTHING OF THIS? IF SO, LET ME KNOW.

Rashid Moin

* Co-founded and advised many high-tech businesses in the SF Bay Area, the Washington DC metro area, and Bengaluru.

* Previously CTO at a Big Data consulting firm providing solutions in Cloud, Advanced Analytics, and Data Science.

* Founder and inventor of JHOOMBOX, the first cloud-based “musical console” for generating and publishing original media content: a consumer electronics product merging Internet video delivery platforms with the Android ecosystem.

* Maker with a special interest in wearables and the new world of smart connected devices. Developed two innovative IoT solutions for different markets: LinkPlus.IO (event management) and HandsTel (child safety).

* Worked 9 years at IBM in middleware, distributed computing, and data engineering; Chief Architect at GM’s OnStar.

* Worked with Los Alamos National Laboratory (LANL) to develop TRANSIMS, a software package for modeling traffic congestion and air pollution.

* Received the “Innovator of the Week” award from the Consumer Electronics Association (CEA). [Interview](http://goo.gl/80HB5Y)

* BS in Engineering from the Indian Institute of Technology, Kanpur (IITK) and an MS in Computer Science from George Washington University (GWU).

* Specialties: Product Management, Technology Vision, Startups, Android Ecosystem, Big Data Analytics, Machine Learning, Open Source Hardware and Software.


Accomplishments

Certifications

  • Convolutional Neural Networks
  • Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
  • Structuring Machine Learning Projects
  • Neural Networks and Deep Learning

Courses

  • Convolutional Neural Networks for Visual Recognition – Stanford University School of Engineering
  • Deep Learning for Self-Driving Cars – MIT
  • Machine Learning by Andrew Ng – Stanford University
  • Practical Deep Learning for Coders – Fast AI MOOC

Honors & Awards

  • DC’s Hottest Showcasing Startup
  • Innovator of the Week
  • Service Excellence Award

Patent

  • System for and Method of Operating a Web-Enabled Content Generating Device
 


Rashid Moin

CTO, Deep Learning Computer Vision AI, Healthcare

ARTH.AI

Indian Institute of Technology, Kanpur

San Francisco Bay Area

500+ connections


HELLO FELLAS, WILL THIS WORK?

Nonparametric regression

From Wikipedia, the free encyclopedia


Nonparametric regression is a category of regression analysis in which the predictor does not take a predetermined form but is constructed according to information derived from the data. Nonparametric regression requires larger sample sizes than regression based on parametric models because the data must supply the model structure as well as the model estimates.


Gaussian process regression or Kriging

Main article: Gaussian process regression

In Gaussian process regression, also known as Kriging, a Gaussian prior is assumed for the regression curve. The errors are assumed to have a multivariate normal distribution and the regression curve is estimated by its posterior mode. The Gaussian prior may depend on unknown hyperparameters, which are usually estimated via empirical Bayes.

Smoothing splines have an interpretation as the posterior mode of a Gaussian process regression.
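The posterior-mean computation described above can be sketched directly. This is a minimal illustration assuming an RBF (squared-exponential) prior; the function name and hyperparameter values are illustrative, not from the article:

```python
import numpy as np

def gp_regression(X_train, y_train, X_test, length_scale=1.0, noise_var=0.1):
    """Posterior mean and variance of a GP with an RBF prior and Gaussian noise."""
    def rbf(A, B):
        # Pairwise squared distances between 1-D input arrays.
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-0.5 * d2 / length_scale**2)

    K = rbf(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = rbf(X_train, X_test)
    K_ss = rbf(X_test, X_test)

    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha                              # posterior mean at X_test
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)      # posterior covariance
    return mean, np.diag(cov)

# Noisy observations of a sine curve.
X = np.linspace(0, 5, 20)
y = np.sin(X) + 0.1 * np.random.default_rng(0).normal(size=20)
mean, var = gp_regression(X, y, np.array([2.5]))
```

Because the prior and likelihood are both Gaussian, the posterior mode coincides with the posterior mean here, which is why the single linear solve suffices.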

Kernel regression

Main article: Kernel regression

Figure: Example of a curve (red line) fit to a small data set (black points) with nonparametric regression using a Gaussian kernel smoother. The pink shaded area illustrates the kernel function applied to obtain an estimate of y for a given value of x. The kernel function defines the weight given to each data point in producing the estimate for a target point.

Kernel regression estimates the continuous dependent variable from a limited set of data points by convolving the data points’ locations with a kernel function—approximately speaking, the kernel function specifies how to “blur” the influence of the data points so that their values can be used to predict the value for nearby locations.
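The “blurring” described above is, in its simplest form, the Nadaraya–Watson estimator: a weighted average of the observed values, with weights decaying with distance. The helper name and bandwidth below are illustrative assumptions:

```python
import numpy as np

def nadaraya_watson(x_query, x_data, y_data, bandwidth=0.5):
    """Gaussian-kernel estimate of E[y | x]: each observation's weight
    shrinks with its distance from the query point."""
    w = np.exp(-0.5 * ((x_data - x_query) / bandwidth) ** 2)
    return np.sum(w * y_data) / np.sum(w)

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = x ** 2
est = nadaraya_watson(2.0, x, y)  # pulled slightly above 4 by the convex neighbors
```

The bandwidth plays the role of the blur radius: larger values average over more points, giving a smoother but more biased curve.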

Nonparametric multiplicative regression

Figure: Two kinds of kernels used with kernel smoothers for nonparametric regression.

Figure: Use of Gaussian kernels for nonparametric multiplicative regression with two predictors. The weights from the kernel function for each predictor are multiplied to obtain a weight for a given data point when estimating a response variable (dependent variable) at a target point in the predictor space.

Figure: Two commonly used forms of local model in nonparametric regression, contrasted with a simple linear model.

Nonparametric multiplicative regression (NPMR) is a form of nonparametric regression based on multiplicative kernel estimation. Like other regression methods, the goal is to estimate a response (dependent variable) based on one or more predictors (independent variables). NPMR can be a good choice for a regression method if the following are true:

  1. The shape of the response surface is unknown.

  2. The predictors are likely to interact in producing the response; in other words, the shape of the response to one predictor is likely to depend on other predictors.

  3. The response is either a quantitative or binary (0/1) variable.

This is a smoothing technique that can be cross-validated and applied in a predictive way.

NPMR behaves like an organism

NPMR has been useful for modeling the response of an organism to its environment. Organismal response to environment tends to be nonlinear and to involve complex interactions among predictors. NPMR automatically models those complex interactions, in much the same way that organisms integrate the numerous factors affecting their performance.[1]

A key biological feature of an NPMR model is that failure of an organism to tolerate any single dimension of the predictor space results in overall failure of the organism. For example, assume that a plant needs a certain range of moisture in a particular temperature range. If either temperature or moisture fall outside the tolerance of the organism, then the organism dies. If it is too hot, then no amount of moisture can compensate to result in survival of the plant. Mathematically this works with NPMR because the product of the weights for the target point is zero or near zero if any of the weights for individual predictors (moisture or temperature) are zero or near zero. Note further that in this simple example, the second condition listed above is probably true: the response of the plant to moisture probably depends on temperature and vice versa.

Optimizing the selection of predictors and their smoothing parameters in a multiplicative model is computationally intensive. With a large pool of predictors, the computer must search through a huge number of potential models in search for the best model. The best model has the best fit, subject to overfitting constraints or penalties (see below).[2][3]

The local model

NPMR can be applied with several different kinds of local models. By “local model” we mean the way that data points near a target point in the predictor space are combined to produce an estimate for the target point. The most common choices for the local models are the local mean estimator, a local linear estimator, or a local logistic estimator. In each case the weights can be extended multiplicatively to multiple dimensions.

In words, the estimate of the response is a local estimate (for example a local mean) of the observed values, each value weighted by its proximity to the target point in the predictor space, the weights being the product of weights for individual predictors. The model allows interactions, because weights for individual predictors are combined by multiplication rather than addition.
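A minimal sketch of the multiplicative local mean just described (the function name, bandwidths, and toy response surface are illustrative assumptions). The key line is the product over per-predictor weights, which drives the combined weight to zero whenever any single predictor is far from the target:

```python
import numpy as np

def npmr_local_mean(target, X, y, bandwidths):
    """NPMR-style local mean: Gaussian weights are computed per predictor
    and multiplied, so a point far out on ANY predictor gets ~zero weight."""
    # Per-predictor Gaussian weights, shape (n_points, n_predictors).
    w_per_pred = np.exp(-0.5 * ((X - target) / bandwidths) ** 2)
    w = np.prod(w_per_pred, axis=1)   # multiplicative combination
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(200, 2))    # two predictors, e.g. moisture and temperature
y = np.sin(X[:, 0]) * np.cos(X[:, 1])    # an interacting response surface
est = npmr_local_mean(np.array([5.0, 5.0]), X, y, np.array([1.0, 1.0]))
```

Replacing the product with a sum would turn this into an additive smoother, which cannot express the plant example above, where failure on one axis cannot be compensated by the other.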

Overfitting controls

Understanding and using these controls on overfitting is essential to effective modeling with nonparametric regression. Nonparametric regression models can become overfit either by including too many predictors or by using small smoothing parameters (also known as bandwidth or tolerance). This can make a big difference with special problems, such as small data sets or clumped distributions along predictor variables.

The methods for controlling overfitting differ between NPMR and generalized linear models (GLMs). The most popular overfitting controls for GLMs are the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) for model selection. AIC and BIC depend on the number of parameters in a model; because NPMR models do not have explicit parameters as such, these criteria are not directly applicable to NPMR. Instead, one can control overfitting by setting a minimum average neighborhood size, a minimum data:predictor ratio, and a minimum improvement required to add a predictor to a model.

Nonparametric regression models sometimes use an AIC based on the “effective number of parameters”.[4] This penalizes a measure of fit by the trace of the smoothing matrix—essentially how much each data point contributes to estimating itself, summed across all data points. If, however, you use leave-one-out cross validation in the model fitting phase, the trace of the smoothing matrix is always zero, corresponding to zero parameters for the AIC. Thus, NPMR with cross-validation in the model fitting phase already penalizes the measure of fit, such that the error rate of the training data set is expected to approximate the error rate in a validation data set. In other words, the training error rate approximates the prediction (extra-sample) error rate.
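The trace-based “effective number of parameters”, and why it vanishes under leave-one-out fitting, can be illustrated with the smoother matrix of a kernel regression (the helper name and bandwidth below are illustrative):

```python
import numpy as np

def smoother_matrix(x, bandwidth, loo=False):
    """Row i holds the normalized kernel weights used to estimate y at x[i];
    the fitted values are S @ y, and trace(S) is the effective number of
    parameters (how much each point contributes to estimating itself)."""
    W = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    if loo:
        np.fill_diagonal(W, 0.0)   # leave-one-out: a point never predicts itself
    return W / W.sum(axis=1, keepdims=True)

x = np.linspace(0, 1, 50)
S = smoother_matrix(x, bandwidth=0.1)
S_loo = smoother_matrix(x, bandwidth=0.1, loo=True)
eff_params = np.trace(S)          # > 0: each point partly estimates itself
eff_params_loo = np.trace(S_loo)  # exactly 0, matching the text above
```

Shrinking the bandwidth inflates the diagonal of S and hence the trace, which is exactly the sense in which small smoothing parameters overfit.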

Related techniques

NPMR is essentially a smoothing technique that can be cross-validated and applied in a predictive way. Many other smoothing techniques are well known, for example smoothing splines and wavelets. The optimal choice of smoothing method depends on the specific application, and nonparametric regression models generally require larger data sets than parametric ones.

Regression trees

Main article: Decision tree learning

Decision tree learning algorithms can be applied to learn to predict a dependent variable from data.[5] Although the original CART formulation applied only to predicting univariate data, the framework can be used to predict multivariate data including time series.[6]
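A single CART-style split, the building block of a regression tree, can be sketched as follows (the helper name and toy data are illustrative): the threshold is chosen to minimize the summed squared error of predicting each side by its mean.

```python
import numpy as np

def best_split(x, y):
    """One CART step: find the threshold minimizing the total squared error
    when each side of the split is predicted by its own mean."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_sse, best_threshold = np.inf, None
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_threshold = sse, (xs[i - 1] + xs[i]) / 2
    return best_threshold

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([5.0, 5.2, 4.8, 20.0, 19.5, 20.5])
threshold = best_split(x, y)  # splits between the two clusters, at 6.5
```

A full tree applies this step recursively to each side until a stopping rule (depth, leaf size) is met, yielding a piecewise-constant regression function.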


References

  1. McCune, B. (2006). “Non-parametric habitat models with automatic interactions”. Journal of Vegetation Science. 17 (6): 819–830. doi:10.1658/1100-9233(2006)17[819:NHMWAI]2.0.CO;2.

  2. Grundel, R.; Pavlovic, N. B. (2007). “Response of bird species densities to habitat structure and fire history along a Midwestern open–forest gradient”. The Condor. 109 (4): 734–749. doi:10.1650/0010-5422(2007)109[734:ROBSDT]2.0.CO;2.

  3. DeBano, S. J.; Hamm, P. B.; Jensen, A.; Rondon, S. I.; Landolt, P. J. (2010). “Spatial and temporal dynamics of potato tuberworm (Lepidoptera: Gelechiidae) in the Columbia Basin of the Pacific Northwest”. Environmental Entomology. 39 (1): 1–14. doi:10.1603/EN08270.

  4. Hastie, T.; Tibshirani, R.; Friedman, J. (2001). The Elements of Statistical Learning. New York: Springer. p. 205. ISBN 0387952845.

  5. Breiman, Leo; Friedman, J. H.; Olshen, R. A.; Stone, C. J. (1984). Classification and regression trees. Monterey, CA: Wadsworth & Brooks/Cole Advanced Books & Software. ISBN 978-0-412-04841-8.

  6. Segal, M. R. (1992). “Tree-structured methods for longitudinal data”. Journal of the American Statistical Association. 87 (418): 407–418. doi:10.2307/2290271. JSTOR 2290271.


From: Streak <notifications>
Date: Sun, Feb 25, 2018 at 11:59 PM
Subject: Someone just viewed: why IOWA? OR IS IT LOCATION IOWA? HAVE U EVER REALIZED THAT I WILL DIE THIS WAY?
To: ajayinsead03

Someone just viewed your email with the subject above.

Details
People on thread: x man, OLGA GOOGLE BLOGGER POST BY EMAIL
Device: Unknown Device
Location: Moscow, MOW