BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//pretalx//global2023.pydata.org//UBLDAX
BEGIN:VEVENT
UID:pretalx-cfp-UBLDAX@global2023.pydata.org
DTSTART:20231206T173000Z
DTEND:20231206T180000Z
DESCRIPTION:Given a test data point similar to the training points\, we sho
 uld expect a machine learning model's prediction to be accurate.\nHowe
 ver\, we have no such guarantee for a test point far away from the trai
 ning data\, and many models offer no quantification of this uncertaint
 y in their predictions.\nThese models\, including the increasingly pop
 ular neural networks\, produce a single number as the prediction for a te
 st point of interest\, making it difficult to quantify how much trust t
 he user should place in that prediction.\n\nGaussian processes (GPs) ad
 dress this concern\; for a given test point\, a GP outputs not a singl
 e number but a probability distribution representing the range the pre
 dicted value is likely to fall into.\nBy looking at the mean of this d
 istribution\, we obtain the most likely predicted value\; by inspectin
 g its variance\, we can quantify how uncertain we are about this predi
 ction.\nThis ability to produce well-calibrated uncertainty estimat
 es gives GPs an edge in high-stakes machine learning use cases such a
 s oil drilling\, drug discovery\, and product recommendation.\n\nWhil
 e GPs are widely used in academic research on Bayesian inference and a
 ctive learning\, many ML practitioners still shy away from them\, beli
 eving that a highly technical background is needed to understand and u
 se GPs.\nThis talk aims to dispel that notion and offers a friendly in
 troduction to GPs\, including their fundamentals\, how to implement th
 em in Python\, and common practices.\nData scientists and ML practiti
 oners who are interested in uncertainty quantification and probabilis
 tic ML will benefit from this talk.\nWhile most of the background know
 ledge necessary to follow the talk will be covered\, the audience sho
 uld be familiar with common ML concepts such as training data\, predi
 ctive models\, multivariate normal distributions\, etc.
DTSTAMP:20240911T061345Z
LOCATION:Machine Learning Track
SUMMARY:But what is a Gaussian process? Regression while knowing how certai
n you are - Quan Nguyen
URL:https://global2023.pydata.org/cfp/talk/UBLDAX/
END:VEVENT
END:VCALENDAR