The problem of program synthesis is to automatically generate a program (or program fragment) which satisfies a given specification and/or meets certain performance requirements. This type of problem has been studied since the late 1960s, mostly in the context of discrete structures (logical frameworks, model checking, etc.). More recently, aspects of program analysis (in particular abstract interpretation) have also been considered in this context.
A novel approach in this area is to consider probabilistic programs as a kind of continuous interpolation between deterministic programs. This can be combined with quantitative aspects of program analysis, so that program synthesis becomes, essentially, a non-linear optimisation problem. Application areas include language-based security (program obfuscation, minimising information flow through covert channels, etc.), service composition (e.g. in the context of cloud computing), and also optimisation at or close to the hardware level (e.g. speculative optimisation).

Concrete research questions arising include: For what kinds of synthesis problems is a quantitative approach feasible? What is the relation to conventional, discrete program synthesis (based, e.g., on game-theoretic ideas)? Can the approach be used to improve the performance of legacy code, or to accommodate changes in the original specification? Does the approach scale? Which optimisation algorithms are best suited? How can one exploit abstraction to guide the construction of the desired program? Could one design a (possibly multistage) synthesis procedure which improves different aspects of a program's performance?
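To make the idea of synthesis as non-linear optimisation concrete, here is a minimal sketch (all program names, costs, and the objective are illustrative assumptions, not from the text above). Two semantically equivalent deterministic programs are interpolated by a probabilistic choice with parameter p; a quantitative objective trades expected running cost against the entropy of the choice, a crude stand-in for obfuscation value, and synthesis reduces to optimising over p:

```python
import math

# Hypothetical example: two deterministic programs computing the same
# function f(x) = 4*x, with different cost/observability trade-offs.
def prog_a(x):
    return x << 2            # cheap, but with a distinctive execution profile

def prog_b(x):
    return x + x + x + x     # more expensive, different profile

COST = {"a": 1.0, "b": 4.0}  # assumed per-call running costs

def objective(p, w=2.0):
    """Expected cost of running prog_a with probability p and prog_b with
    probability 1-p, minus a reward (weight w) for the entropy of the
    choice: more randomness means a harder-to-predict behaviour."""
    exp_cost = p * COST["a"] + (1 - p) * COST["b"]
    entropy = -p * math.log(p) - (1 - p) * math.log(1 - p)
    return exp_cost - w * entropy

# Synthesising the probabilistic mixture reduces to a one-dimensional
# non-linear optimisation over p (here: simple grid search on (0, 1)).
best_p = min((i / 100 for i in range(1, 100)), key=objective)
```

Because the entropy term is non-linear, the optimum lies strictly inside (0, 1): the synthesised program is genuinely probabilistic rather than a deterministic endpoint, which is exactly the "continuous interpolation" between deterministic programs described above.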