Computer systems must increasingly operate in the presence of uncertainty in their execution environment: new applications and hardware platforms require our systems to model and compute with objects that are inherently uncertain or only partially observable, with behaviors known only through noisy measurements. This reality raises many new questions about how to interpret, debug, validate, verify, and optimize such systems.
As an illustrative example of such a system, I’ll present DiffTune, a technique for learning neurosymbolic performance models of modern computer processors. Processor performance models are critical for many computer systems engineering tasks; however, because our ability to introspect modern processors is limited, these models must be inferred from behavioral measurements. Our system leverages deep learning to perform differentiable surrogate optimization of a CPU simulator, yielding models that predict the performance of programs executed on modern Intel CPUs better than state-of-the-art handcrafted techniques from LLVM.
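The core idea of differentiable surrogate optimization can be illustrated with a minimal sketch. Everything below is a hypothetical toy, not DiffTune's actual code: `simulator` stands in for a non-differentiable CPU simulator (its rounding mimics discrete table lookups), a smooth surrogate is fit to the simulator's input/output behavior, and the simulator's parameter is then tuned by descending gradients through the surrogate rather than the opaque simulator itself.

```python
import random

# Hypothetical stand-in for a non-differentiable CPU simulator: predicts
# "cycles" for a program feature x under parameter theta. The rounding
# makes it non-differentiable, like discrete lookups in a real simulator.
def simulator(theta, x):
    return round(theta * x + 2.0, 1)

# 1. Collect (theta, x) -> cycles training data from the black box.
random.seed(0)
data = [(t, x, simulator(t, x))
        for t in [0.5, 1.0, 1.5, 2.0]
        for x in [1.0, 2.0, 3.0]]

# 2. Fit a differentiable surrogate g(theta, x) = a*theta*x + b by
#    gradient descent on squared error. The surrogate is smooth even
#    though the simulator is not.
a, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    ga = gb = 0.0
    for t, x, y in data:
        err = a * t * x + b - y
        ga += 2 * err * t * x
        gb += 2 * err
    a -= lr * ga / len(data)
    b -= lr * gb / len(data)

# 3. Optimize theta *through the surrogate* to match a measured target:
#    gradients flow through g even though the simulator is opaque.
target, x_obs = 5.0, 2.0  # measured cycles for a program with feature 2.0
theta = 0.0
for _ in range(2000):
    err = a * theta * x_obs + b - target
    theta -= 0.05 * 2 * err * a * x_obs

print(round(theta, 2))          # recovered simulator parameter
print(simulator(theta, x_obs))  # simulator output now matches the target
```

DiffTune applies this recipe at scale: a deep network serves as the surrogate, and the optimized parameters are fed back into the original simulator so that predictions come from the interpretable symbolic model, not the network.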
Guided by these results, I’ll demonstrate how this system exemplifies many of the challenges of engineering modern uncertain computations, and connect these challenges to my work on new program semantics, optimizations, and analyses for uncertain computations.
Michael Carbin is the Jamieson Career Development Assistant Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology.
His primary research focus is the design of programming systems that manipulate system uncertainty to deliver improved performance, energy consumption, and resilience. Uncertainty — in the form of sampled and sensed values, dynamic computation structure, and intermittently available computing — is a first-order challenge in modern computing systems.
His research on verifying the reliability of programs that execute on unreliable hardware has received best paper awards at leading programming languages conferences (OOPSLA 2013 and OOPSLA 2014) as well as a Communications of the ACM Research Highlight in 2016. He has also published work at leading programming languages and systems conferences, including PLDI, OOPSLA, ASPLOS, LICS, SOSP, ICSE, and PPoPP.