Speakers include researchers from Stanford, Berkeley, UW, MIT, and Google Brain. As a preview, here are three topics that will be covered: hardware-exploit synthesis (https://www.cs.princeton.edu/~ctrippel/#publications), end-user web programming (https://schasins.com/papers/), and verification and synthesis of OS-level code (https://unsat.cs.washington.edu/projects/).
Technical abstracts and registration are at: https://synthetic-minds.com/pages/conference/2019/#program. After the conference we’ll post the slides of the talks.
The foundations part will explain why synthesis reduces to solving an \exists\forall query. More specifically, the query asks “does there exist a program P such that for all inputs x, P(x) computes the correct output?” A solution to the query is the synthesized program P. How do we solve the query? The Z3 prover works well for boolean satisfiability (i.e., a single \exists query). To solve an \exists\forall query, one approach is to have two Z3 solvers communicate: one solver synthesizes a candidate program P’ that is correct on a sample of inputs, while the other verifies that P’ is correct on all inputs. If P’ is incorrect, a counterexample input is added to the sample of inputs. The two solvers iterate until the latter is satisfied with the correctness. This process is not unlike GANs; it is called CEGIS (counterexample-guided inductive synthesis). It was invented in 2005 and builds on the earlier CEGAR (counterexample-guided abstraction refinement) technique from the 1990s.
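The loop above can be sketched in a few lines. This is a toy illustration, not a real synthesizer: both roles are done by brute-force search over a finite domain rather than by Z3, and the program space (a single unknown constant c in x + c) is hypothetical. The structure — a synthesizer that fits the samples, a verifier that hunts for counterexamples, and a loop that feeds counterexamples back — is the CEGIS skeleton.

```python
# Toy CEGIS: synthesize a constant c so that P(x) = x + c matches the
# spec for all x in a bounded domain. A real implementation would use an
# SMT solver (e.g., Z3) for both the \exists and the \forall role.

def spec(x):
    return x + 7  # the correct behavior the synthesizer must match

DOMAIN = range(-100, 101)  # "for all inputs" restricted to a finite domain

def synthesize(samples):
    # \exists role: find a candidate c consistent with all samples so far.
    for c in range(-100, 101):
        if all(x + c == spec(x) for x in samples):
            return c
    return None

def verify(c):
    # \forall role: search for a counterexample input.
    for x in DOMAIN:
        if x + c != spec(x):
            return x  # counterexample: candidate is wrong here
    return None  # no counterexample; candidate is correct on DOMAIN

def cegis():
    samples = [0]  # start from one sample input
    while True:
        c = synthesize(samples)
        cex = verify(c)
        if cex is None:
            return c  # verifier is satisfied
        samples.append(cex)  # refine the sample with the counterexample

print(cegis())  # → 7
```

The interplay is the point: the synthesizer never sees the whole domain, only the (usually small) set of counterexamples the verifier has produced so far.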
Please comment; or email us with questions.
The researchers will talk about their peer-reviewed work in web automation, hardware security, operating system extensions, programming for non-programmers, automatic code translation, and superoptimization. Hopefully, these talks will illustrate both the power and the limitations of program synthesis. We'd love for people to extrapolate from these examples to their own domain-specific automation needs.
By touching upon foundational techniques (making imperative code functional, symbolic compilation, SMT encodings, partial evaluation), we hope the leap to "code synthesis" will seem less like magic and more like an obvious next step. In addition, open-source frameworks exist (e.g., Rosette, Sketch) that abstract away these foundations, and the program will cover them in hands-on workshops.
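To give a flavor of one of those foundations, symbolic compilation: run ordinary code on symbolic values instead of numbers, and the result is a formula you can hand to a solver. The sketch below is a hypothetical few-line version (real frameworks like Rosette do this for whole languages, with control flow); the class name `Sym` and the S-expression output format are illustrative choices, not any framework's API.

```python
# Toy symbolic compilation: the same straight-line code runs either
# concretely (on ints) or symbolically (on Sym values), in which case it
# produces an SMT-LIB-style term instead of a number.

class Sym:
    def __init__(self, expr):
        self.expr = expr  # a term, represented as an S-expression string

    def __add__(self, other):
        return Sym(f"(+ {self.expr} {lift(other)})")

    def __mul__(self, other):
        return Sym(f"(* {self.expr} {lift(other)})")

def lift(v):
    # embed concrete constants into the term language
    return v.expr if isinstance(v, Sym) else str(v)

def poly(x):
    # ordinary code: unaware of whether x is concrete or symbolic
    return x * x + x * 3 + 1

print(poly(2))               # concrete run: 11
print(poly(Sym("x")).expr)   # symbolic run: (+ (+ (* x x) (* x 3)) 1)
```

Once a program has been compiled to a term this way, asking "is there an x making the output equal to 11?" is an ordinary solver query — which is the bridge from "running code" to "synthesizing code".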
We’d love to hear application ideas from people for whom synthesis is new. Some problems are exciting to us (Synthetic Minds is working on smart contract synthesis), and we’d love for the community to brainstorm applications in their own domains.
Here's a raw example of a 32-bit adder super-optimized according to rough delay, gate complexity, and wire complexity models embedded in the 2QBF: https://share.riseup.net/#xg2ySn41zmhtHtmrlQWn5Q . According to post-PAR results from Innovus, it can actually (barely) beat DesignWare adders at some points in the trade-off space.
Short answer: to scale up hardware synthesis, it may be necessary to change the encoding rather than look for a better solver.
More details: I should first say that I have limited experience with synthesizing hardware. My lessons come from [1], where we synthesized a small Wallace tree multiplier. What that taught me is that hardware arithmetic circuits should perhaps not be formulated as a 2QBF problem, because you might need too many counterexample inputs to terminate the CEGIS loop.
Instead, I believe that circuit synthesis should use algebraic reasoning to establish correctness and combinatorial reasoning to explore the space of candidate circuits. Since one symbolic input is then sufficient to show correctness, the problem simplifies from 2QBF to SAT.
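A toy way to see the split between the two kinds of reasoning (my own illustrative sketch, not the cited papers' method): represent each candidate circuit in algebraic normal form (ANF, a XOR of AND-monomials over GF(2)), which is canonical, so checking a candidate against the spec is a single syntactic equality test — no enumeration of counterexample inputs. The combinatorial part is then just a search over circuit shapes.

```python
# ANF representation: a boolean function is a frozenset of monomials,
# each monomial a frozenset of variable names (XOR of ANDs over GF(2)).
from itertools import product

def var(name):
    return frozenset({frozenset({name})})

def xor(f, g):
    return f ^ g  # XOR = symmetric difference of monomial sets

def band(f, g):
    out = set()
    for m1 in f:
        for m2 in g:
            out ^= {m1 | m2}  # x & x = x; duplicate monomials cancel mod 2
    return frozenset(out)

a, b, c = var("a"), var("b"), var("c")
spec = xor(xor(a, b), c)  # full-adder sum bit, in canonical ANF
ops = [xor, band]

def search():
    # Combinatorial part: enumerate two-gate candidates op2(op1(x, y), z).
    # Algebraic part: correctness is one equality test on canonical forms.
    for op1, op2 in product(ops, repeat=2):
        for x, y, z in product([a, b, c], repeat=3):
            if op2(op1(x, y), z) == spec:
                return (op1.__name__, op2.__name__)
    return None

print(search())  # finds the xor-of-xor circuit for the sum bit
```

Because ANF is canonical, equivalence checking here costs one comparison per candidate, whereas a 2QBF formulation would pay for counterexample inputs on every CEGIS iteration.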
This idea is briefly explained in Sec. 4.1 of [3], a paper about a project [2] that synthesizes software expressions that look very much like hardware circuits (permutations and such).
I am happy to discuss this in person if it might be helpful to your work. Both Mangpo (the author of [2,3]) and I will be at the conference.
[1] https://ieeexplore.ieee.org/document/5227085
[2] https://github.com/mangpo/swizzle-inventor
[3] https://github.com/mangpo/mangpo.github.io/blob/master/paper...
best, Athanasios
[1] https://arxiv.org/pdf/1603.03165.pdf
[2] https://rishabhmit.bitbucket.io/papers/icse18.pdf
It uses constraint-based synthesis to search for small repairs. We also did some follow-up work on learning from other students' solutions to correct similar mistakes: https://rishabhmit.bitbucket.io/papers/sarfgen_pldi18.pdf https://rishabhmit.bitbucket.io/papers/dyn_iclr18.pdf
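The "search for small repairs" idea can be sketched in miniature — this is a hypothetical toy, not the cited system: the program is a template with two integer constants, a student's buggy instantiation fails the tests, and we enumerate constant tweaks in order of edit distance until the tests pass, returning the minimal fix.

```python
# Toy constraint/search-based repair: find the smallest change to a
# program's constants that makes all test cases pass.
from itertools import product

def make_prog(c0, c1):
    return lambda x: c0 * x + c1  # program template with two holes

tests = [(0, 1), (1, 3), (2, 5)]  # spec: f(x) = 2x + 1
buggy = (2, 0)                    # student wrote f(x) = 2x (forgot "+ 1")

def repair(params, radius=2):
    # enumerate candidate constant tweaks, smallest edit distance first
    cands = []
    for deltas in product(range(-radius, radius + 1), repeat=len(params)):
        cand = tuple(p + d for p, d in zip(params, deltas))
        cands.append((sum(abs(d) for d in deltas), cand))
    for _, cand in sorted(cands):
        f = make_prog(*cand)
        if all(f(x) == y for x, y in tests):
            return cand  # minimal repair found
    return None

print(repair(buggy))  # → (2, 1): the minimal fix restores the "+ 1"
```

A real system encodes the candidate space and the test constraints as one solver query instead of enumerating, but the objective is the same: the repair closest to what the student wrote.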