Keywords: partial evaluation, numerical programs, imperative languages
Summary: We investigate the application of partial evaluation to numerically oriented computation in scientific and engineering applications. We present our results using the Fast Fourier Transform, the N-body attraction problem, and cubic spline interpolation. The results demonstrate that existing partial evaluation technology is strong enough to improve the efficiency of a large class of numerical programs. However, using partial evaluation as a development tool in the `real world' still remains a challenging problem.
Obsolete! Superseded by D-272.
Keywords: partial evaluation, binding-time analysis, constraints, Standard ML
Summary: We present an efficient base algorithm for binding-time analysis based on constraint solving and the union-find algorithm. In practice it has been used to handle all of Standard ML except modules. We show the principles of how constraints can be used for binding-time analysis of Standard ML; in particular, we show how to binding-time analyse nested pattern matching. To the best of our knowledge no previous binding-time analysis has treated nested pattern matching.
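As a minimal sketch of the constraint-solving idea (illustrative only, not the paper's implementation): binding times can be kept as union-find classes, equality constraints merge classes in near-constant amortized time, and marking any member dynamic taints the whole class.

    parent, dynamic = {}, set()

    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            parent[v] = parent[parent[v]]    # path compression
            v = parent[v]
        return v

    def unify(a, b):                         # constraint: bt(a) = bt(b)
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            if ra in dynamic:
                dynamic.add(rb)

    def make_dynamic(v):                     # constraint: bt(v) = Dynamic
        dynamic.add(find(v))

    # e.g. for 'val y = x + 1' with x a dynamic program input:
    unify('y', 'x'); make_dynamic('x')
    assert find('y') in dynamic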
Keywords: partial evaluation, compiler generation, Standard ML
Summary: We argue that hand-writing a program generator generator has a number of advantages compared to generating a program generator generator by self-application of a partial evaluator. We show the basic principles of how to construct a program generator generator by presenting a program generator generator for a skeletal language, and we argue that it is not more difficult to use the direct approach than the indirect approach. Moreover, we report on some promising experiments made with a prototype implementation of a program generator generator for most of the Standard ML Core Language.
Keywords: partial evaluation, continuation passing style, direct style, handwritten compiler-generator, lambda-calculus
Summary: Combines the idea of hand-writing a compiler generator with cps-based specialization. The paper presents a ds- and a cps-based specializer as well as a ds- and a cps-based compiler generator. Both compiler generators are derived from the corresponding specializers. The cps-cogen is proven correct with respect to a cps-specializer, and a correctness proof for a hand-written ds-cogen is given.
Keywords: hashing, multiset discrimination, symbol tables, hygienic macro expansion, left factoring, value numbers, acyclic coarsest partitioning, sequence congruence, strength reduction, useless code
Summary: It is shown how the programming technique of multiset discrimination can be used to solve a wide variety of programming language problems in worst-case time and space that either match or are better than the expected time (under the assumption that hashing takes unit expected time) of the best previous algorithm. An algorithm is presented that solves iterated strength reduction together with useless code elimination in worst-case time linear in the maximum length of the unoptimized and optimized programs. This is three orders of magnitude better than the expected time of the previous best algorithm due to Cocke and Kennedy.
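To convey the basic technique (a simplified sketch under invented names, not the paper's algorithms): multiset discrimination partitions strings into groups of equal elements without hashing, by bucketing repeatedly on the next character and resetting only the buckets that were touched, which keeps the work linear in the total input length.

    def discriminate(strings):
        buckets = [None] * 256                  # one bucket per byte value
        def split(group, pos):
            done = [s for s in group if len(s) == pos]
            result = [done] if done else []
            touched = []
            for s in group:
                if len(s) > pos:
                    b = ord(s[pos]) % 256
                    if buckets[b] is None:
                        buckets[b] = []
                        touched.append(b)
                    buckets[b].append(s)
            parts = [buckets[b] for b in touched]
            for b in touched:                   # reset only touched buckets
                buckets[b] = None
            for part in parts:
                result.extend(split(part, pos + 1))
            return result
        return split(strings, 0)

    assert discriminate(['ab', 'ba', 'ab', 'b']) == [['ab', 'ab'], ['b'], ['ba']]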
Keywords: metacomputation, formal linguistic modeling, program composition, program inversion, program specialization, partial evaluation
Summary: We derive the basic requirements for metacomputation by a structural analysis of different model definitions using a single concept, namely formal linguistic modeling. We show that three operations must be performed effectively and efficiently by metacomputation: composition, inversion, and specialization of programs.
Keywords: Partial deduction, supercompilation, partial evaluation, unification-based information propagation, logic languages, functional languages
Summary: Partial deduction and driving are two methods used for program specialization in logic and functional languages, respectively. We argue that both techniques achieve essentially the same transformational effect by unification-based information propagation. We show their equivalence by analyzing the definition and construction principles underlying partial deduction and driving, and by giving a translation from a functional language to a definite logic language preserving certain properties. We discuss residual program generation, termination issues, and other related techniques developed for program specialization in logic and functional languages.
Keywords: program specialization, partial evaluation, self-application, interpreters
Summary: We formulate three specializer projections which enable us to generate program specializers. While the Futamura projections define the generation of compilers from interpreters, the specializer projections define the generation of specializers from interpreters. We investigate the potential use of the specializer projections and discuss three applications: generic specializers, bootstrapping of subject languages, and the generation of optimizing specializers. Specializer generation has, beyond several practical advantages, an interesting property: the correctness of the generated specializer is guaranteed by the correctness of the generic specializer and the correctness of the initial interpreter. Recent investigations confirm that the specializer projections can performed in practice using existing partial evaluators.
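For comparison, the classical Futamura projections that the specializer projections parallel can be stated as follows, where [[p]] denotes running program p and mix is a self-applicable specializer (standard notation, not specific to this paper):

    target   = [[mix]](int, source)    -- 1st projection: compilation
    compiler = [[mix]](mix, int)       -- 2nd projection: compiler generation
    cogen    = [[mix]](mix, mix)       -- 3rd projection: compiler generator generator

Per the abstract, the specializer projections play the analogous role, with the interpreter yielding specializers rather than compilers.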
Keywords: partial evaluation, program generation, self-application, specializer generation, interpreters
Summary: We propose a new method for improving the specialization of programs by inserting an interpreter between a subject program and a specializer. We demonstrate that this interpretive approach may achieve stronger specializations than using a partial evaluator alone. Furthermore, we formulate three specializer projections which enable us to generate specializers from interpreters and self-applicable program specializers.
Keywords: partial evaluation, specializer projections, program generation, supercompilation, deforestation
Summary: The paper uses a new interpretive approach -- inserting an interpreter between a source program and a program specializer -- to improve the transformation of programs and to automatically generate program transformers. As defined by the specializer projections, one may generate stand-alone transformers by self-application of a program specializer. We show that one can generate Wadler's deforestation algorithm and a version of Turchin's supercompiler this way.
Keywords: closure analysis, value flow, higher-order values, program analysis
Summary: Closure analysis is instrumental, but potentially expensive, in optimization and implementation of higher-order functional languages. We present a simplified closure analysis that yields practically satisfactory results and can be computed in (almost-)linear time.
Keywords: type inference, strictness analysis, fixed points, iterative algorithms
Summary: We present a reduction of a strictness analysis due to Amtoft to a fixed point problem. This reduction is conceptually simpler than Amtoft's on-line normalization process and leads to an exponentially faster algorithm as well as simpler proofs. More generally, it demonstrates the use of classical data flow analysis algorithms in type-based program analysis.
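As a minimal sketch of the fixed-point view (our own example, on the usual two-point strictness domain where 0 means definitely undefined and 1 means possibly defined): for f x y = if x == 0 then y else f (x-1) (x+y), the abstract equation is f#(a, b) = a & (b | f#(a, a & b)), and Kleene iteration from the bottom function solves it.

    points = [(a, b) for a in (0, 1) for b in (0, 1)]
    f = {p: 0 for p in points}                   # start from bottom: f# = 0
    changed = True
    while changed:
        changed = False
        for a, b in points:
            new = a & (b | f[(a, a & b)])
            if new != f[(a, b)]:
                f[(a, b)], changed = new, True

    assert f[(0, 1)] == 0 and f[(1, 0)] == 0     # f is strict in both arguments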
Keywords: representation analysis, polymorphism, type systems, Standard-ML
Summary: Representation analysis seeks to optimize the run-time representation of elements of data types in high-level programming languages. Using a language with explicit representation types and boxing/unboxing operations, we equationally axiomatize the set of all explicitly boxed versions, called completions. We give the equality axioms a rewriting interpretation that captures the elimination of boxing/unboxing operations without relying on a specific implementation or even semantics of the underlying language. The rewriting system produces a unique ``optimal'' completion.
Keywords: binding time analysis, polymorphism, polymorphic recursion
Summary: A polymorphic binding time analysis with polymorphic recursion is proposed as an alternative to polyvariant binding time analysis. The type system is proven to have the principal typing property, and the analysis is proven to be correct and partially complete (unfolding a definition cannot improve the results of the analysis).
Keywords: strictness analysis, chaotic fixpoint iteration, implementation
Summary: The paper describes a strictness analyser for Haskell. The analyser is based on abstract interpretation using the techniques of Burn, Hankin, Abramsky and Wadler. It uses chaotic fixpoint iteration, and the demand-driven nature of this technique allows us to use large domains, including function domains, while retaining reasonable efficiency. Furthermore, it allows us to introduce a new way of handling polymorphism by parameterizing the strictness functions with domain variables.
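The demand-driven flavour of chaotic iteration can be sketched as follows (an illustration of ours, not the analyser's code): each variable has an equation over other variables, and when a variable's value changes only its dependents are revisited, so large domains cost nothing for equations that are never disturbed.

    def solve(equations, deps, bottom):
        # equations: var -> function(env); deps: var -> variables it reads
        env = {v: bottom for v in equations}
        dependents = {v: set() for v in equations}
        for v, ds in deps.items():
            for d in ds:
                dependents[d].add(v)
        work = set(equations)
        while work:
            v = work.pop()
            new = equations[v](env)
            if new != env[v]:
                env[v] = new
                work |= dependents[v]            # revisit only the dependents
        return env

    # e.g. two mutually recursive equations on a boolean domain:
    eqs = {'f': lambda e: e['g'] & 1, 'g': lambda e: e['f'] | 1}
    assert solve(eqs, {'f': ['g'], 'g': ['f']}, bottom=0) == {'f': 1, 'g': 1}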
Keywords: abstract interpretation, flow analysis, static program analysis
Summary: This is a broad overview of abstract interpretation, to be a large chapter (around 100 pages) in the above-mentioned handbook. It consists of three main parts: an introduction with motivation and descriptions of the main methods used in the field; a mathematical development of the logical relations approach with several applications; and short descriptions of a broad spectrum of semantics-based program analyses.
Obsolete! Superseded by D-332.
Keywords: Higher-order functional languages, abstract interpretation, minimal function graphs, closure analysis.
Summary: We present a minimal function graph semantics for a higher-order functional language with applicative evaluation order. The semantics captures the intermediate calls performed during the evaluation of a program. This information may be used in abstract interpretation as a basis for proving the soundness of program analyses. An example of this is the ``closure analysis'' of partial evaluation.
Keywords: constructive type theory, lambda calculus, logic, program transformation, sorting, coding theory
Summary: A collection of articles written in honor of Professor Satoru Takasu on the occasion of his final lecture at the Research Institute of Mathematical Sciences (RIMS) in Kyoto, to celebrate the many positive effects he has had upon theoretical computer science, both in Japan and worldwide. Professor Takasu has been involved in pathbreaking research work, especially in constructive type theory and its relation to automatic program synthesis. He was one of the first, and perhaps the first, seriously to consider using the computer to derive a program automatically from a constructive proof that a problem specification is satisfiable. The editors are three of Takasu's former students. The topics include constructing programs from proofs, several aspects of logic, lambda calculus, program transformation, complexity theory, and coding theory.
Keywords: program analysis, abstract interpretation, program transformation, partial evaluation, compiler generation
Summary: Atlantique is a transatlantic research collaboration project in the area of Semantics Based Program Manipulation, with 6 European and 8 American partners. The organizing meeting was held just after POPL in Portland, Oregon, USA in 1994. This volume contains project overviews from 13 of the partners, and technical papers or abstracts that were presented at the workshop.
Keywords: abstract interpretation, functional programming, logic programming
Summary: The functional and logic programming research communities are to a significant extent solving the same types of problems, and by similar methods. The purpose of this paper is to encourage the two communities to show greater awareness of and openness to each other's methods and solutions.
Keywords: Programming Languages, Language Constructs and Features; Theory of Computation, Computation by Abstract Devices; Analysis of Algorithms and Problem Complexity
Summary: The aim of this discussion paper is to stimulate (or perhaps provoke) stronger interactions among theoreticians and practitioners interested in efficient problem solutions. We emphasize computing abilities in relation to programming and specification languages, as these provide the user's interface with the computer. Due to the breadth of the topic we mainly discuss efficiency as measured by computation time, ignoring space and other measures; and this on sequential, deterministic, individual, and fixed computers, not treating parallelism, stochastic algorithms, distributed computing, or dynamically expanding architectures.
Obsolete! Superseded by D-228.
Keywords: partial evaluation, Fortran
Summary: We developed a partial evaluator for a subset of Fortran 77. The partial evaluator is based on the off-line approach and uses a binding-time analysis prior to the specialization phase. The source language includes multi-dimensional arrays, procedures and functions, as well as global storage. The system is presented and experimental results for numerical and non-numerical applications are given.
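The flavour of off-line specialization on such programs can be sketched as follows (a minimal illustration in Python rather than Fortran, with invented names): once the binding-time analysis classifies the exponent as static, a general power routine unfolds into straight-line residual code.

    def power(x, n):            # general program: both arguments dynamic
        r = 1.0
        for _ in range(n):
            r *= x
        return r

    def power_3(x):             # plausible residual program for static n = 3
        return x * x * x        # loop fully unrolled, no control overhead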
Keywords: Prolog, abstract interpretation, flow analysis, logical relations
Summary: A framework is established for the static analysis of Prolog programs. The framework is quite general and is solidly based on a formal semantics, thus enabling rigorous correctness proofs of various analyses. The methods used include an adaptation of logical relations as applied by Nielson and Nielson.
Keywords: algorithms, automata, linear time simulation, program derivation
Summary: We extend 2-way deterministic push-down automata (2DPDAs) with a write-once-read-many (WORM) store. We show that the extended automata still admit linear time simulation by a variant of Cook's construction, and that this allows us to write a recognizer for a language conjectured not to be recognizable by an unextended 2DPDA.
Keywords: semantics, types, multiset discrimination, data structures
Summary: The semantics of translating external input in string form into program data structures is defined for hereditarily finite set, multiset, and tuple datatypes. An algorithm that makes use of multiset discrimination is shown to implement this reading problem in worst-case time linear in the length of the input string.
Keywords: Lambda calculus, control operators, Curry-Howard Isomorphism, classical proof theory
Summary: By restriction of Felleisen's control operator F we obtain an operator Delta and a fully compatible, Church-Rosser control calculus lambda_Delta enjoying a number of desirable properties. It is shown that lambda_Delta contains a strongly normalizing typed subcalculus with a reduction corresponding closely to systems of proof normalization for classical logic. The calculus is more than strong enough to express a call-by-name catch-throw programming paradigm.
Keywords: interprocedural dataflow analysis, magic-sets transformation, demand algorithms
Summary: The paper describes how algorithms for demand versions of program-analysis problems can be obtained from their exhaustive counterparts essentially for free, by applying the so-called magic-sets transformation that was developed in the logic-programming and deductive-database communities.
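The effect (though not the transformation itself, which rewrites logic programs) can be pictured with reachability: the exhaustive version computes answers for every source, while the demand version explores only the part of the problem the query touches, as the magic-sets-transformed program would. A small sketch with illustrative names:

    edges = {'a': ['b', 'c'], 'b': ['d'], 'c': [], 'd': [], 'e': ['a']}

    def reachable_from(src, seen=None):     # demand: only what the query needs
        seen = set() if seen is None else seen
        for v in edges[src]:
            if v not in seen:
                seen.add(v)
                reachable_from(v, seen)
        return seen

    assert reachable_from('a') == {'b', 'c', 'd'}   # node 'e' is never visited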
Keywords: interprocedural program slicing, system dependence graph, dynamic programming, dynamic transitive closure
Summary: The paper presents a new algorithm for slicing system dependence graphs that is asymptotically faster than the best previous algorithm. A preliminary experimental study shows that the algorithm is significantly faster in practice, as well.
Keywords: demand dataflow analysis, interprocedural dataflow analysis, interprocedural program slicing, graph reachability
Summary: The paper shows how a large class of interprocedural dataflow-analysis problems can be solved precisely in polynomial time by transforming them into a special kind of graph-reachability problem.
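A schematic sketch of the encoding (our simplification: the actual problem is reachability along realizable call/return paths, which this ignores): pair each program point with a dataflow fact, add an edge when the fact flows across a statement, and answer queries by ordinary reachability.

    from collections import deque

    # nodes are (point, fact); '0' is the always-present dummy fact
    succ = {
        ('p0', '0'): [('p1', '0'), ('p1', 'x_uninit')],  # entry gens x_uninit
        ('p1', '0'): [('p2', '0')],
        ('p1', 'x_uninit'): [],                          # 'x = 1' kills it
        ('p2', '0'): [],
    }

    def reachable(start):
        seen, todo = {start}, deque([start])
        while todo:
            n = todo.popleft()
            for m in succ.get(n, ()):
                if m not in seen:
                    seen.add(m)
                    todo.append(m)
        return seen

    # 'x possibly uninitialized at p2' iff ('p2', 'x_uninit') is reachable
    assert ('p2', 'x_uninit') not in reachable(('p0', '0'))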
Keywords: interprocedural dataflow analysis, interprocedural program slicing, graph reachability
Summary: The paper shows how a large class of interprocedural dataflow-analysis problems can be solved precisely in polynomial time by transforming them into a special kind of graph-reachability problem. Results are reported from a preliminary experimental study of C programs (for the problem of finding possibly-uninitialized-variables).
Keywords: interprocedural dataflow analysis, magic-sets transformation, interprocedural program slicing, demand algorithms
Summary: The paper describes how algorithms for demand versions of program-analysis problems can be obtained from their exhaustive counterparts essentially for free, by applying the so-called magic-sets transformation that was developed in the logic-programming and deductive-database communities. Applications to interprocedural dataflow analysis and interprocedural program slicing are described.
Keywords: interprocedural dataflow analysis, interprocedural program slicing, parallel algorithms, inherently sequential problems
Summary: This paper shows that the interprocedural slicing problem and the interprocedural gen/kill dataflow-analysis problem are log-space complete for P. These results provide evidence that there do not exist fast parallel algorithms for these problems.
Keywords: TeX, typesetting, diagrams
Summary: Discusses textual notation for categorical diagrams based on the author's experience from developing the Xy-pic package for TeX. In particular the notion of `conceptual markup' for diagrams is introduced and explained.
Obsolete! Superseded by D-237.
Keywords: TeX, typesetting, graphs, diagrams
Summary: Explains how to use the Xy-pic macro package to typeset graphs and diagrams with TeX.
Obsolete! Superseded by D-238.
Keywords: TeX, typesetting, graphs, diagrams
Summary: Reference manual for Xy-pic summarising syntax and `drawing semantics' of the capabilities in the kernel, all extensions and features, and the PostScript backend.
Summary: This paper studies composed reduction systems: a system of programs built up from the reduction relations of some reduction system, by means of parallel and sequential composition operators. The trace-based compositional semantics of composed reduction systems is considered, and a new graph representation is introduced as an alternative basis for the study of compositional semantics, refinement, true concurrency (in the case of composed rewriting systems) and program logics.
Summary: We consider operational semantics of contexts (terms with holes) in the setting of lazy functional languages, with the aim of providing a balance between operational and compositional reasoning, and a framework for semantics-based program analysis and manipulation.
Keywords: Supercompilation, deforestation, partial evaluation, GPC, positive supercompilation, pattern matching
Summary: We study four transformation methodologies which are automatic instances of Burstall and Darlington's fold/unfold framework: partial evaluation, deforestation, supercompilation, and generalized partial computation (GPC). One can classify these and other fold/unfold based transformers by how much information they maintain during transformation. We introduce the positive supercompiler, a version of deforestation including more information propagation, to study such a classification in detail. Via the study of positive supercompilation we are able to show that partial evaluation and deforestation have simple information propagation, positive supercompilation has more information propagation, and supercompilation and GPC have even more information propagation. The amount of information propagation is significant: positive supercompilation, GPC, and supercompilation can specialize a general pattern matcher to a fixed pattern so as to obtain efficient output similar to that of the Knuth-Morris-Pratt algorithm. In the case of partial evaluation and deforestation, the general matcher must be rewritten to achieve this.
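The pattern-matching claim can be pictured as follows (a hand-written sketch of the effect; the actual systems work on functional programs): a naive matcher re-scans the input after a mismatch, whereas the residual that a sufficiently strong specializer can derive for the fixed pattern 'aab' never re-examines a character.

    def naive_match(pat, s):            # general matcher, both inputs dynamic
        for i in range(len(s) - len(pat) + 1):
            if s[i:i + len(pat)] == pat:
                return i
        return -1

    def match_aab(s):                   # plausible residual for pat = 'aab'
        k = 0                           # pattern characters matched so far
        for i, c in enumerate(s):
            if k < 2:
                k = k + 1 if c == 'a' else 0
            elif c == 'b':
                return i - 2            # match found, KMP-style, no rescan
            elif c != 'a':
                k = 0                   # on 'aaa...' k stays at 2
        return -1

    assert naive_match('aab', 'aaab') == match_aab('aaab') == 1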
Keywords: Deforestation, termination, tree grammars
Summary: Wadler's deforestation algorithm removes intermediate data structures from functional programs, but is only guaranteed to terminate for treeless programs. Chin has shown how one can apply deforestation to all first-order programs: annotate non-treeless subterms and apply the extended deforestation algorithm which essentially leaves annotated subterms untransformed. We develop a new technique of putting annotations on programs. The basic idea is to compute a finite grammar which approximates the set of terms that the deforestation algorithm encounters. The technique extends Wadler's and Chin's in a certain sense.
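The transformation's effect in miniature (shown in Python for concreteness, though deforestation proper works on first-order functional programs): the composed version builds an intermediate list that is traversed once and discarded, while the deforested version fuses producer and consumer.

    def sum_of_squares(xs):              # before: intermediate structure
        squares = [x * x for x in xs]    # built, consumed, thrown away
        return sum(squares)

    def sum_of_squares_df(xs):           # after: fused into one loop
        total = 0
        for x in xs:
            total += x * x               # no intermediate list
        return total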
Keywords: Supercompilation, driving, program transformation
Summary: Turchin's supercompiler is a program transformer that includes both partial evaluation and deforestation. Although known in the West since 1979, the essence of its techniques, its more precise relations to other transformers, and the properties of the programs that it produces are only now becoming apparent in the Western functional programming community. This thesis gives a new formulation of the supercompiler in familiar terms; we study the essence of it, how it achieves its effects, and its relations to related transformers; and we develop results dealing with the problems of preserving semantics, assessing the efficiency of transformed programs, and ensuring termination.
Keywords: Modules, Signatures, Standard ML, Higher-order Functors
Summary: In this paper we present a language for programming with higher-order modules. (This is a revised and expanded version of a paper with the same title, presented at the 19th ACM Symposium on Principles of Programming Languages, Albuquerque, New Mexico, Jan. 1992.) The language, HML, is based on Standard ML in that it provides structures, signatures and functors. In HML, functors can be declared inside structures and specified inside signatures; this is not possible in Standard ML. We present an operational semantics for the static semantics of HML signature expressions with particular emphasis on the handling of sharing. As a justification for the semantics, we prove a theorem about the existence of principal signatures. This result is closely related to the existence of principal type schemes for functional programming languages with polymorphism.
Keywords: partial evaluation, theorem proving, HOL-90
Summary: Using partial evaluation it is shown how to speed up theorem proving in HOL-90. This is done by specializing a general rewriter with respect to a given set of equality theorems.
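The idea in miniature (an illustrative sketch of ours; the paper works inside HOL-90, not Python): a general rewriter re-interprets its rule list on every call, while the residual of specializing it to a fixed rule set tests the patterns directly.

    # terms are tuples headed by a constructor name; pattern variables are
    # bare strings
    def match(pat, t, env):
        if isinstance(pat, str):
            env[pat] = t
            return True
        return (isinstance(t, tuple) and len(t) == len(pat) and t[0] == pat[0]
                and all(match(p, s, env) for p, s in zip(pat[1:], t[1:])))

    def subst(rhs, env):
        return env[rhs] if isinstance(rhs, str) else \
            (rhs[0],) + tuple(subst(r, env) for r in rhs[1:])

    def rewrite(rules, t):                       # the general rewriter
        for lhs, rhs in rules:
            env = {}
            if match(lhs, t, env):
                return subst(rhs, env)
        return t

    rules = [(('add', ('Z',), 'y'), 'y')]        # rewrite add(Z, y) -> y

    def rewrite_add_Z(t):                        # plausible residual for 'rules'
        if isinstance(t, tuple) and len(t) == 3 \
                and t[0] == 'add' and t[1] == ('Z',):
            return t[2]
        return t

    assert rewrite(rules, ('add', ('Z',), ('S', ('Z',)))) == ('S', ('Z',))
    assert rewrite_add_Z(('add', ('Z',), ('S', ('Z',)))) == ('S', ('Z',))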