(If you are on the POPL 2020 program committee, please stop reading until after your reviews are done.)
It's been a few months since I last blogged, so I figure I should say what I have been up to lately. The answer seems to be comonads. I have two new draft papers, both of which involve comonadic type theories in a fairly critical way. (I suppose that this makes me a type-theoretic hipster, who doesn't do monads now that they're popular.)
Recovering Purity with Comonads and Capabilities
The first paper, written with Vikraman Chaudhury, has the lovely subtitle The marriage of purity and comonads.
Folk wisdom has it that once your functional language supports impure features, that's game over: there's no way to recover a sublanguage that supports reasoning as if programs are pure, since higher-order functions mean you never know whether a function variable could have an effect. This is why people have historically turned to pure languages and then isolated effects via a monadic type discipline (like Haskell) or an effect system (like Koka).
In this paper, we show that the folk wisdom jumped the gun! We start with a small impure language, and then extend it by adding a comonadic type constructor for pure terms. This lets us do the opposite of Haskell: effects are pervasive, except in the bits where we deny the capability to perform them. Interestingly, the word "capability" is critical -- our denotational semantics formalizes the notion of object capabilities, and we show that our type system doesn't just let us enforce purity, but also captures notions intermediate between purity and pervasive effects, such as capability safety.
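To give a flavour of the shape of the thing, here is a minimal Haskell sketch -- entirely hypothetical, and not the calculus or semantics from the paper. Effects are licensed by an explicit capability value, and a comonadic wrapper marks terms built without access to any capability. Haskell itself won't enforce the "no captured capabilities" restriction; that enforcement is exactly what the paper's comonadic typing rules provide.

```haskell
-- Hypothetical illustration only: Haskell cannot actually enforce the
-- "constructed with no capability in scope" restriction that the paper's
-- typing rules impose; all names here are made up for the sketch.
module PuritySketch where

-- A capability token: holding one is what licenses performing effects.
data Cap = Cap

-- Effectful computations are ordinary functions that demand the capability.
type Eff a = Cap -> IO a

printLine :: String -> Eff ()
printLine s Cap = putStrLn s

-- The purity comonad: meant to wrap only terms built without any
-- capability (or impure free variable) in scope.
newtype Pure a = Pure a

extract :: Pure a -> a
extract (Pure a) = a

duplicate :: Pure a -> Pure (Pure a)
duplicate (Pure a) = Pure (Pure a)

-- A genuinely pure term: safe to promote and to reason about equationally.
double :: Pure (Int -> Int)
double = Pure (* 2)
```

The comonad structure (extract and duplicate) is what lets pure values be used anywhere and re-promoted freely; in the paper's type system, an effectful term can never be placed under the purity constructor without the capability showing up in its typing context.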
I'm really pleased with this paper, since it pulls together ideas from people working in areas as far apart as operating systems and categorical proof theory, and does so with really elementary techniques in semantics. It builds on ideas that Martin Hofmann invented in his 2003 paper Linear Types and Non-Size-Increasing Polynomial Time Computation, which is one of the papers that had a really big impact on me as a grad student.
Seminaïve Evaluation for a Higher-Order Functional Language
The next paper was written with Michael Arntzenius.
For the last few years, we've been working on a language called Datafun, which is a higher-order version of Datalog -- a language you can think of either as a query language akin to SQL plus recursion, or as a simple bottom-up logic programming language.
We developed a really nice semantics for Datafun, but the thing about query languages is that people want them to go fast. The most important technique for making Datalog go fast is an optimization called seminaïve evaluation, but it was designed for a first-order logic programming language, and Datafun isn't one. So in this paper we worked out how seminaïve evaluation should work for higher-order languages, and the answer involves a big old bag of tricks, starting with the incremental lambda calculus, but also requiring a pretty intricate use of comonads to control how recursion is used and to propagate derivative information throughout the program. (Interestingly, Martin Hofmann also worked on using comonads to stratify recursion, in his paper A Mixed Modal/Linear Lambda Calculus With Applications To Bellantoni-Cook Safe Recursion!)
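To make the first-order starting point concrete, here is a small Haskell sketch of seminaïve fixpoint iteration for transitive closure; it illustrates the classic Datalog optimization, not the higher-order transformation in the paper, and all the names in it are invented for the example. Each round joins only the facts discovered in the previous round (the delta) against the base relation, rather than re-running the rule over the whole accumulated set.

```haskell
module SemiNaive where

import qualified Data.Set as S

type Edge = (Int, Int)

-- One rule application restricted to the frontier of new facts:
-- join the delta against the base edge relation.
step :: S.Set Edge -> S.Set Edge -> S.Set Edge
step edges delta =
  S.fromList [ (a, c) | (a, b)  <- S.toList delta
                      , (b', c) <- S.toList edges
                      , b == b' ]

-- Seminaive iteration: carry the accumulated result and the frontier of
-- newly derived facts; stop when a round produces nothing new.
transClosure :: S.Set Edge -> S.Set Edge
transClosure edges = go edges edges
  where
    go acc delta
      | S.null delta' = acc
      | otherwise     = go (S.union acc delta') delta'
      where
        delta' = step edges delta `S.difference` acc
```

Naive evaluation would redo the full join over the entire accumulated relation every round; the seminaïve version does asymptotically less work, and the point of the paper is to make this kind of delta propagation go through when the program can also contain first-class functions.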
It is super satisfying to develop optimizations that make programs asymptotically faster, and on top of that this technique feels like the tip of the iceberg -- the design space seems wide open to me, and I suspect static incrementalization techniques could end up being as big a hammer as continuation-passing style.
Oh, the first one! I've been thinking of something similar, but from a slightly different perspective. Glad to see you are working on it. Looking forward to your paper.
Pretty stoked about the first paper! This seems a more realistic path towards wide acceptance and use of capability-safe programming.