Predictive modeling has achieved considerable success in drawing inferences from data. Such models are used extensively, including in systems for recommending items, optimizing content, delivering ads, matching applicants to jobs, and identifying health risks. However, predictive models are not well-equipped to answer questions about cause and effect, which form the basis of many practical decision-making scenarios.
For example, if a recommendation system is changed or removed, what will be the effect on total customer activity? Which strategy leads to higher engagement with a product? How can we learn generalizable insights about users from biased data (e.g. that of opt-in users)? Through practical examples, I will show the value of counterfactual reasoning and causal inference for such scenarios, by demonstrating that relying on predictive modeling based on correlations can be counterproductive. I will then present an overview of experimental and observational causal inference methods that can better inform decision-making through data and also lead to more robust and generalizable prediction models.
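As a taste of why correlational estimates can mislead, here is a minimal simulation sketch (not from the talk; the scenario and all variable names are hypothetical). A confounder, user activity level, drives both exposure to a recommendation and purchases, so the naive difference in means greatly overstates the recommendation's true effect, while a simple back-door adjustment (stratifying on the confounder) recovers it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder: overall user activity level drives both
# whether a user sees a recommendation and how much they purchase.
activity = rng.normal(size=n)
saw_rec = (activity + rng.normal(size=n) > 0).astype(float)
# True causal effect of the recommendation on purchases is +0.1.
purchases = 0.1 * saw_rec + 1.0 * activity + rng.normal(size=n)

# Naive (correlational) estimate: difference in mean purchases.
naive = purchases[saw_rec == 1].mean() - purchases[saw_rec == 0].mean()

# Adjusted estimate: stratify on binned activity and take a weighted
# average of within-stratum differences (a simple back-door adjustment).
cuts = np.quantile(activity, np.linspace(0, 1, 11)[1:-1])
bins = np.digitize(activity, cuts)
diffs, weights = [], []
for b in np.unique(bins):
    m = bins == b
    if saw_rec[m].min() == saw_rec[m].max():
        continue  # stratum lacks one of the two groups
    diffs.append(purchases[m][saw_rec[m] == 1].mean()
                 - purchases[m][saw_rec[m] == 0].mean())
    weights.append(m.sum())
adjusted = np.average(diffs, weights=weights)

print(f"naive estimate:    {naive:.2f}")     # far above the true 0.1
print(f"adjusted estimate: {adjusted:.2f}")  # much closer to 0.1
```

The naive estimate absorbs the confounder's effect and would wrongly suggest the recommendation is highly effective; stratification is only the simplest of the adjustment methods the abstract alludes to.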
Amit Sharma is a postdoctoral researcher at Microsoft Research. He uses a combination of data mining and experimental methods to understand the underlying decision processes that shape people's activities online. Applications include estimating the impact of recommender systems on purchasing decisions, measuring the influence of friends' activities seen through social network feeds, and predicting the popularity of books or music. He received his Ph.D. in computer science at Cornell University and was awarded the 2012 Yahoo! Key Scientific Challenges Award.
Data Council, PO Box 2087, Wilson, WY 83014, USA - Phone: +1 (415) 800-4938 - Email: community (at) datacouncil.ai