Neurosymbolic Programming

Swarat Chaudhuri
UT-Austin

Abstract:

I will talk about neurosymbolic programming, an emerging research area that bridges the fields of deep learning and program synthesis. Here, one considers parameterized (often differentiable) programs that use traditional programming primitives as well as calls to neural modules. Such programs are learned from a combination of data and auxiliary syntactic and semantic constraints, using a mix of symbolic search and gradient-based optimization.
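
To make the idea concrete, here is a minimal sketch (not taken from the talk) of one such parameterized differentiable program: a program template "if x[0] > theta then f1(x) else f2(x)", where the branch test involves a learned symbolic parameter theta relaxed into a soft gate, and f1, f2 are small neural modules. All names and details are illustrative assumptions, shown in PyTorch.

    import torch
    import torch.nn as nn

    class IfThenElseProgram(nn.Module):
        def __init__(self, dim):
            super().__init__()
            # Symbolic threshold parameter of the program template.
            self.theta = nn.Parameter(torch.zeros(1))
            # Two small neural modules invoked by the program.
            self.f1 = nn.Sequential(nn.Linear(dim, 8), nn.ReLU(), nn.Linear(8, 1))
            self.f2 = nn.Sequential(nn.Linear(dim, 8), nn.ReLU(), nn.Linear(8, 1))

        def forward(self, x):
            # Soft relaxation of the symbolic test "x[:, 0] > theta", so the
            # whole program is differentiable end to end.
            gate = torch.sigmoid(10.0 * (x[:, :1] - self.theta))
            return gate * self.f1(x) + (1.0 - gate) * self.f2(x)

    # Gradient-based training of the program's parameters on toy data.
    prog = IfThenElseProgram(dim=4)
    opt = torch.optim.Adam(prog.parameters(), lr=1e-2)
    x = torch.randn(256, 4)
    y = torch.where(x[:, :1] > 0.5,
                    x.sum(dim=1, keepdim=True),
                    -x.sum(dim=1, keepdim=True))
    for _ in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(prog(x), y)
        loss.backward()
        opt.step()

In a full neurosymbolic learner, a symbolic search would additionally choose the program structure itself (here fixed to a single if-then-else) from a grammar, with gradient descent fitting the continuous parameters of each candidate.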

Neurosymbolic programs possess a number of advantages over classical neural networks. The symbolic elements of such programs make them easier to interpret, debug, and factorize than neural networks; they are also a natural fit for tasks that need algorithmic reasoning. The compositionality of such programs can aid transfer across learning settings. Finally, the constraints available during the learning process can serve as a form of regularization, leading to more reliable and data-efficient learning.

In this talk, I will describe some of the forms that neurosymbolic programs can take, along with a few algorithmic approaches to learning them. I will end with a discussion of some of the open challenges in this area and ways in which PL researchers can contribute to it.

Bio:

Swarat Chaudhuri is an Associate Professor of Computer Science at UT Austin. Over the years, he has worked on many topics in PL and Formal Methods, including the foundations of model checking, concurrency, program analysis, and program synthesis. These days, he primarily works on problems in the intersection of PL and machine learning.

Talk: