- D-203.

**L O Andersen**.

*Program Analysis and Specialization for the C Programming Language*.

PhD thesis. DIKU, University of Copenhagen. May, 1994. (DIKU report 94/19).

- D-197.

**R Baier, R Glück, and R Zöchling**.

*Partial Evaluation of Numerical Programs in Fortran*.

In *ACM SIGPLAN Workshop on Partial Evaluation and Semantics-Based Program Manipulation*, pp. 119-132. Volume 94/9 of Technical Report. University of Melbourne, Australia. 1994. *Keywords*: partial evaluation, numerical programs, imperative languages. *Summary*: We investigate the application of partial evaluation to numerically oriented computation in scientific and engineering applications. We present our results using the Fast Fourier Transformation, the N-body attraction problem, and cubic spline interpolation. The results demonstrate that existing partial evaluation technology is strong enough to improve the efficiency of a large class of numerical programs. However, using partial evaluation as a development tool in the 'real world' still remains a challenging problem.

- D-196.

**L Birkedal and M Welinder**.

*Binding-Time Analysis for Standard ML*.

In *(To be presented at the PEPM 94 workshop)*. June, 1994. **Obsolete!** Superseded by D-272. *Keywords*: partial evaluation, binding-time analysis, constraints, Standard ML. *Summary*: We present an efficient base algorithm for binding-time analysis based on constraint solving and the union-find algorithm. In practice it has been used to handle all of Standard ML except modules. We show the principles of how constraints can be used for binding-time analysis of Standard ML; in particular, we show how to binding-time analyse nested pattern matching. To the best of our knowledge, no previous binding-time analysis has treated nested pattern matching.

- D-199.
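As a rough illustration of the constraint-plus-union-find approach in Birkedal and Welinder's binding-time analysis (the D-196 entry above), here is a hypothetical sketch; the names, the sentinel trick, and the example program are our own, not the paper's. Equality constraints between binding-time variables are merged with union-find, and any class unified with a dynamic sentinel becomes dynamic.

```python
# Hypothetical sketch (not the paper's code): binding-time analysis reduced
# to constraint solving with union-find. Variables start out static;
# equality constraints merge classes, and any class containing the special
# DYNAMIC element makes every member dynamic.

class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

DYNAMIC = "$dynamic$"  # sentinel: anything unified with it is dynamic

def solve(constraints):
    """constraints: (a, b) pairs meaning 'a and b have equal binding time'."""
    uf = UnionFind()
    for a, b in constraints:
        uf.union(a, b)
    return uf

# Toy program: x comes from run-time input (dynamic); y = x + 1; z = 42.
uf = solve([("x", DYNAMIC), ("y", "x"), ("z", "z")])
bt = {v: "dynamic" if uf.find(v) == uf.find(DYNAMIC) else "static"
      for v in ["x", "y", "z"]}
```

The near-linear cost comes entirely from union-find; the analysis itself only emits constraints.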

**L Birkedal and M Welinder**.

*Hand-Writing Program Generator Generators*.

In *(To be presented at the PLILP 94 conference)*. Springer-Verlag. September, 1994. *Keywords*: partial evaluation, compiler generation, Standard ML. *Summary*: We argue that hand-writing a program generator generator has a number of advantages compared to generating a program generator generator by self-application of a partial evaluator. We show the basic principles of how to construct a program generator generator by presenting a program generator generator for a skeletal language, and we argue that it is not more difficult to use the direct approach than the indirect approach. Moreover, we report on some promising experiments made with a prototype implementation of a program generator generator for most of the Standard ML Core Language.

- D-271.

**A Bondorf and D Dussart**.

*Improving CPS-Based Partial Evaluation: Writing Cogen by Hand*.

In *ACM SIGPLAN Workshop on Partial Evaluation and Semantics-Based Program Manipulation*, pp. 1-9. Volume 94/9 of Technical Report. University of Melbourne, Australia. 1994. *Keywords*: partial evaluation, continuation passing style, direct style, handwritten compiler generator, lambda calculus. *Summary*: Combines the idea of hand-writing a compiler generator with CPS-based specialization. The paper presents a DS- and a CPS-based specializer as well as a DS- and a CPS-based compiler generator. Both compiler generators are derived from the corresponding specializers. The CPS cogen is proven correct with respect to a CPS specializer. A correctness proof for a hand-written DS cogen is given. *Pointers*: BibTeX.

- D-209.

**J Cai and R Paige**.

*Using Multiset Discrimination To Solve Language Processing Problems Without Hashing*.

Technical Report 94/16. DIKU, Dept. of Computer Science, U. of Copenhagen, Universitetsparken 1, DK-2100 Copenhagen, Denmark. 1994. *Keywords*: hashing, multiset discrimination, symbol tables, hygienic macro expansion, left factoring, value numbers, acyclic coarsest partitioning, sequence congruence, strength reduction, useless code. *Summary*: It is shown how the programming technique of multiset discrimination can be used to solve a wide variety of programming language problems in worst case time and space that either matches or is better than the expected time (under the assumption that hashing takes unit expected time) of the best previous algorithm. An algorithm is presented that solves iterated strength reduction together with useless code elimination in worst case time linear in the maximum length of the unoptimized and optimized programs. This is three orders of magnitude better than the expected time of the previous best algorithm due to Cocke and Kennedy.

- D-185.
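To give a flavour of the multiset discrimination of Cai and Paige (the D-209 entry above), here is a hypothetical toy sketch, not the paper's algorithm: a multiset of strings is split into classes of equal strings by bucketing on successive character positions, visiting only the buckets actually touched, so total work is linear in the total length of the strings. The dict below stands in for an alphabet-indexed array (a real implementation resets only the touched slots, avoiding hashing entirely).

```python
# Assumed simplification of multiset discrimination: partition strings into
# groups of equal strings without hashing, by refining on each character
# position in turn.

def discriminate(indices, strings, pos, out):
    """Refine the class `indices` (indices into strings) on character `pos`."""
    buckets = {}  # stands in for an array indexed by the alphabet
    done = []
    for i in indices:
        s = strings[i]
        if pos == len(s):
            done.append(i)             # string exhausted: its class is final
        else:
            buckets.setdefault(s[pos], []).append(i)
    if done:
        out.append(done)
    for group in buckets.values():
        if len(group) == 1:
            out.append(group)          # singleton: no further refinement
        else:
            discriminate(group, strings, pos + 1, out)

def classes(strings):
    out = []
    discriminate(list(range(len(strings))), strings, 0, out)
    return [[strings[i] for i in group] for group in out]

groups = classes(["ab", "ba", "ab", "a"])
```

Each input string is inspected at most once per character across all refinement levels, which is where the worst-case linear bound comes from.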

**R Glück and A V Klimov**.

*Metacomputation as a Tool for Formal Linguistic Modeling*.

In *Cybernetics and Systems '94* (R Trappl, ed.), pp. 1563-1570. Volume 2. World Scientific. Singapore. 1994. *Keywords*: metacomputation, formal linguistic modeling, program composition, program inversion, program specialization, partial evaluation. *Summary*: We derive the basic requirements for metacomputation by a structural analysis of different model definitions using a single concept, namely formal linguistic modeling. We show that three operations must be performed effectively and efficiently by metacomputation: composition, inversion, and specialization of programs.

- D-202.

**R Glück and M H Sørensen**.

*Partial deduction and driving are equivalent*.

In *Programming Languages: Implementations, Logics and Programs* (M Hermenegildo and J Penjam, eds.), pp. 165-181. Volume 844 of Lecture Notes in Computer Science. Springer-Verlag. 1994. *Keywords*: partial deduction, supercompilation, partial evaluation, unification-based information propagation, logic languages, functional languages. *Summary*: Partial deduction and driving are two methods used for program specialization in logic and functional languages, respectively. We argue that both techniques achieve essentially the same transformational effect by unification-based information propagation. We show their equivalence by analyzing the definition and construction principles underlying partial deduction and driving, and by giving a translation from a functional language to a definite logic language preserving certain properties. We discuss residual program generation, termination issues, and other related techniques developed for program specialization in logic and functional languages.

- D-218.

**R Glück**.

*On the generation of specializers*.

Journal of Functional Programming **4**(4), pp. 499-514. 1994. *Keywords*: program specialization, partial evaluation, self-application, interpreters. *Summary*: We formulate three specializer projections which enable us to generate program specializers. While the Futamura projections define the generation of compilers from interpreters, the specializer projections define the generation of specializers from interpreters. We investigate the potential use of the specializer projections and discuss three applications: generic specializers, bootstrapping of subject languages, and the generation of optimizing specializers. Specializer generation has, beyond several practical advantages, an interesting property: the correctness of the generated specializer is guaranteed by the correctness of the generic specializer and the correctness of the initial interpreter. Recent investigations confirm that the specializer projections can be performed in practice using existing partial evaluators. *Pointers*: BibTeX.

- D-186.
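For readers unfamiliar with the Futamura projections that Glück's abstract (the D-218 entry above) takes as its starting point, they can be stated as follows, writing mix for a self-applicable partial evaluator, int for an interpreter, and [[p]] for the function computed by program p:

```
\begin{align*}
\mathit{target}   &= [\![\mathit{mix}]\!](\mathit{int}, \mathit{source}) && \text{(first projection: compiling)} \\
\mathit{compiler} &= [\![\mathit{mix}]\!](\mathit{mix}, \mathit{int})    && \text{(second projection: compiler generation)} \\
\mathit{cogen}    &= [\![\mathit{mix}]\!](\mathit{mix}, \mathit{mix})    && \text{(third projection: compiler generator)}
\end{align*}
```

The specializer projections of the paper play the analogous role with specializers in place of compilers, generating a specializer by applying mix to an interpreter.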

**R Glück and J Jørgensen**.

*Generating Optimizing Specializers*.

In *IEEE International Conference on Computer Languages*, pp. 183-194. IEEE Computer Society Press. 1994. *Keywords*: partial evaluation, program generation, self-application, specializer generation, interpreters. *Summary*: We propose a new method for improving the specialization of programs by inserting an interpreter between a subject program and a specializer. We demonstrate that this interpretive approach may achieve stronger specializations than using a partial evaluator alone. Furthermore, we formulate three specializer projections which enable us to generate specializers from interpreters and self-applicable program specializers.

- D-195.

**R Glück and J Jørgensen**.

*Generating Transformers for Deforestation and Supercompilation*.

In *Static Analysis* (B Le Charlier, ed.), pp. 432-448. Volume 864 of Lecture Notes in Computer Science. Springer-Verlag. 1994. *Keywords*: partial evaluation, specializer projections, program generation, supercompilation, deforestation. *Summary*: The paper uses a new *interpretive approach* -- inserting an interpreter between a source program and a program specializer -- to improve the transformation of programs and to automatically generate program transformers. As defined by the *specializer projections*, one may generate stand-alone transformers by self-application of a program specializer. We show that one can generate Wadler's deforestation algorithm and a version of Turchin's supercompiler this way.

- D-193.

**F Henglein**.

*Simple Closure Analysis*.

March, 1992. DIKU Semantics Report D-193. *Keywords*: closure analysis, value flow, higher-order values, program analysis. *Summary*: Closure analysis is instrumental, but potentially expensive, in optimization and implementation of higher-order functional languages. We present a simplified closure analysis that yields practically satisfactory results and can be computed in (almost-)linear time.

- D-192.

**F Henglein**.

*Iterative Fixed Point Computation for Type-Based Strictness Analysis*.

March, 1994. DART Report. *Keywords*: type inference, strictness analysis, fixed points, iterative algorithms. *Summary*: We present a reduction of a strictness analysis due to Amtoft to a fixed point problem. This is a conceptually simpler reduction than Amtoft's on-line normalization process and leads to an exponentially faster algorithm as well as simpler proofs. More generally, it demonstrates the use of classical data flow analysis algorithms in type-based program analysis.

- D-179.

**F Henglein and J Jørgensen**.

*Formally optimal boxing*.

In *21st Annual ACM SIGACT-SIGPLAN Symposium on Principles of Programming Languages, Portland, Oregon*, pp. 213-226. January, 1994. *Keywords*: representation analysis, polymorphism, type systems, Standard ML. *Summary*: Representation analysis seeks to optimize the run-time representation of elements of data types in high-level programming languages. Using a language with explicit representation types and boxing/unboxing operations, we axiomatize equationally the set of all explicitly boxed versions, called *completions*. We give the equality axioms a rewriting interpretation that captures eliminating boxing/unboxing operations without relying on a specific implementation or even semantics of the underlying language. The rewriting system produces a unique "optimal" completion.

- D-198.

**F Henglein and C Mossin**.

*Polymorphic Binding-Time Analysis*.

In *Proceedings of European Symposium on Programming* (D Sannella, ed.), pp. 287-301. Volume 788 of Lecture Notes in Computer Science. Springer-Verlag. April, 1994. *Keywords*: binding time analysis, polymorphism, polymorphic recursion. *Summary*: A polymorphic binding time analysis with polymorphic recursion is proposed as an alternative to polyvariant binding time analysis. The type system is proven to have the principal typing property, and the analysis is proven to be correct and partially complete (unfolding a definition cannot improve the results of the analysis).

- D-207.

**K D Jensen, P Hjæresen, and M Rosendahl**.

*Efficient Strictness Analysis of Haskell*.

In *SAS'94*, page 15. Springer-Verlag. Lecture Notes in Computer Science. September, 1994. *Keywords*: strictness analysis, chaotic fixpoint iteration, implementation. *Summary*: The paper describes a strictness analyser for Haskell. The analyser is based on abstract interpretation using the techniques of Burn, Hankin, Abramsky and Wadler. It uses chaotic fixpoint iteration, and the demand-driven nature of this technique allows us to use large domains, including function domains, and retain reasonable efficiency. Furthermore, it allows us to introduce a new way of handling polymorphism by parameterizing the strictness functions with domain variables. *Pointers*: BibTeX.

- D-58.
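To give a flavour of the chaotic fixpoint iteration named in the abstract of Jensen, Hjæresen and Rosendahl (the D-207 entry above), here is a hypothetical sketch, not the Haskell analyser itself: unknowns range over a finite lattice, every equation is monotone, and equations are re-evaluated in arbitrary order, starting from bottom, until nothing changes. The equations below are an invented example.

```python
# Assumed simplification: chaotic iteration over the two-point lattice
# False < True. Monotonicity plus finiteness guarantee termination at the
# least fixed point regardless of evaluation order.

def fixpoint(equations):
    """equations: dict name -> monotone function from env (dict) to bool."""
    env = {name: False for name in equations}   # start everything at bottom
    changed = True
    while changed:
        changed = False
        for name, eq in equations.items():      # any order works (chaotic)
            new = eq(env)
            if new != env[name]:
                env[name] = new
                changed = True
    return env

# Mutually recursive equations; the least fixed point makes all three True.
eqs = {
    "f": lambda env: env["g"] or env["h"],
    "g": lambda env: env["f"],
    "h": lambda env: True,
}
env = fixpoint(eqs)
```

A demand-driven variant would only re-evaluate equations whose inputs changed, which is what keeps large (function) domains affordable.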

**N D Jones and F Nielson**.

*Abstract Interpretation: a Semantics-Based Tool for Program Analysis*.

In *Handbook of Logic in Computer Science*, pp. 527-629. Oxford University Press. 1994. *Keywords*: abstract interpretation, flow analysis, static program analysis. *Summary*: This is a broad overview of Abstract Interpretation, to be a large chapter (around 100 pages) in the above-mentioned handbook. It consists of three main parts: an introduction with motivation and descriptions of the main methods used in the field; a mathematical development of the logical relations approach with several applications; and short descriptions of a broad spectrum of semantics-based program analyses. *Pointers*: BibTeX. dvi.Z. ps.Z.

- D-204.

**N D Jones and M Rosendahl**.

*Higher-Order Minimal Function Graphs*.

In *ALP'94*. Springer-Verlag. Lecture Notes in Computer Science. September, 1994. 11 pages. **Obsolete!** Superseded by D-332. *Keywords*: higher-order functional languages, abstract interpretation, minimal function graphs, closure analysis. *Summary*: We present a minimal function graph semantics for a higher-order functional language with applicative evaluation order. The semantics captures the intermediate calls performed during the evaluation of a program. This information may be used in abstract interpretation as a basis for proving the soundness of program analyses. An example of this is the "closure analysis" of partial evaluation.

- D-205.

**N D Jones, M Hagiya, and M Sato**.

*Logic, Language and Computation, a Festschrift in honor of Satoru Takasu*.

Volume 792. Springer-Verlag. 1994. 269 pages. *Keywords*: constructive type theory, lambda calculus, logic, program transformation, sorting, coding theory. *Summary*: A collection of articles written in honor of Professor Satoru Takasu on the occasion of his final lecture at the Research Institute of Mathematical Sciences (RIMS) in Kyoto, to celebrate the many positive effects he has had upon theoretical computer science, both in Japan and worldwide. Professor Takasu has been involved in pathbreaking research work, especially in constructive type theory and its relation to automatic program synthesis. He was one of the first, and perhaps *the* first, seriously to consider using the computer to derive a program automatically from a constructive proof that a problem specification is satisfiable. The editors are three of Takasu's former students. The topics include constructing programs from proofs, several aspects of logic, lambda calculus, program transformation, complexity theory, and coding theory. *Pointers*: BibTeX.

- D-206.

**N D Jones and C Talcott, editors**.

*Proceedings of the Atlantique Workshop on Semantics Based Program Manipulation*.

DIKU, University of Copenhagen. Universitetsparken 1, DK-2100 København Ø. June, 1994. 207 pages, available as DIKU report 94/12 from DIKU. *Keywords*: program analysis, abstract interpretation, program transformation, partial evaluation, compiler generation. *Summary*: Atlantique is a transatlantic research collaboration project in the area of *Semantics Based Program Manipulation*, with 6 European and 8 American partners. The organizing meeting was held just after POPL in Portland, Oregon, USA in 1994. This volume contains project overviews from 13 of the partners, and technical papers or abstracts that were presented at the workshop. *Pointers*: BibTeX.

- D-227.

**N D Jones**.

*Abstract Interpretation and Partial Evaluation in Functional and Logic Programming*.

In *Logic Programming. Proceedings of the 1994 International Symposium* (M Bruynooghe, ed.), pp. 17-22. The MIT Press. 1994. *Keywords*: abstract interpretation, functional programming, logic programming. *Summary*: The functional and logic programming research communities are to a significant extent solving the same types of problems, and by similar methods. The purpose of this paper is to encourage the two communities to show greater awareness of and openness to each other's methods and solutions. *Pointers*: BibTeX. dvi.gz. ps.gz.

- D-225.

**N D Jones**.

*Program Speedups in Theory and Practice*.

In *IFIP Transactions Technology and Foundations Information Processing '94* (B Pehrson and I Simon, eds.), pp. 595-602. Volume 1. Elsevier Science B.V. August, 1994. *Keywords*: Programming Languages, Language Constructs and Features; Theory of Computation, Computation by Abstract Devices; Analysis of Algorithms and Problem Complexity. *Summary*: The aim of this discussion paper is to stimulate (or perhaps provoke) stronger interactions among theoreticians and practitioners interested in efficient problem solutions. We emphasize computing abilities in relation to *programming and specification languages*, as these provide the user's interface with the computer. Due to the breadth of the topic we mainly discuss efficiency as measured by *computation time*, ignoring space and other measures; and this on *sequential, deterministic, individual, and fixed computers*, not treating parallelism, stochastic algorithms, distributed computing, or dynamically expanding architectures. *Pointers*: BibTeX. dvi.gz. ps.gz.

- D-194.

**P Kleinrubatscher, A Kriegshaber, R Zöchling, and R Glück**.

*Fortran Program Specialization*.

In *Workshop Semantikgestützte Analyse, Entwicklung und Generierung von Programmen* (U M G Snelting, ed.), pp. 45-54. Justus-Liebig-Universität Giessen, Germany. 1994. Report No. 9402. **Obsolete!** Superseded by D-228. *Keywords*: partial evaluation, Fortran. *Summary*: We developed a partial evaluator for a subset of Fortran 77. The partial evaluator is based on the off-line approach and uses a binding-time analysis prior to the specialization phase. The source language includes multi-dimensional arrays, procedures and functions, as well as global storage. The system is presented and experimental results for numerical and non-numerical applications are given.

- D-94.

**K Marriott, H Søndergaard, and N D Jones**.

*Denotational Abstract Interpretation of Logic Programs*.

ACM Transactions on Programming Languages and Systems. 1994. Accepted. *Keywords*: Prolog, abstract interpretation, flow analysis, logical relations. *Summary*: A framework is established for the static analysis of Prolog programs. The framework is quite general and is solidly based in a formal semantics, thus enabling rigorous correctness proofs of various analyses. The methods used include an adaptation of logical relations as applied by Nielson and Nielson.

- D-208.

**T Æ Mogensen**.

*WORM-2DPDAs: An Extension to 2DPDAs that can be Simulated in Linear Time*.

Information Processing Letters. 1994. *Keywords*: algorithms, automata, linear time simulation, program derivation. *Summary*: We extend 2-way deterministic push-down automata (2DPDAs) with a write-once-read-many (WORM) store. We show that this extension allows linear time simulation by a variant of Cook's construction, and that it allows us to write a recognizer for a language conjectured not to be recognizable by an unextended 2DPDA. *Pointers*: BibTeX. dvi.Z. ps.Z.

- D-226.

**R Paige**.

*Efficient Translation of External Input in a Dynamically Typed Language*.

In *Technology and Foundations - Information Processing 94* (B Pehrson and I Simon, eds.), pp. 603-608. Volume 1 of IFIP Transactions A-51. North-Holland. September, 1994. *Keywords*: semantics, types, multiset discrimination, data structures. *Summary*: The semantics of translating external input in string form into program data structures is defined for hereditarily finite set, multiset, and tuple datatypes. An algorithm that makes use of multiset discrimination is shown to implement this reading problem in linear worst case time in the length of the input string.

- D-189.

**J Rehof and M H Sørensen**.

*The lambda_Delta calculus*.

In *Theoretical Aspects of Computer Software* (M Hagiya and J Mitchell, eds.), pp. 516-542. Volume 789 of Lecture Notes in Computer Science. Springer-Verlag. 1994. *Keywords*: lambda calculus, control operators, Curry-Howard isomorphism, classical proof theory. *Summary*: By restriction of Felleisen's control operator *F* we obtain an operator Delta and a fully compatible, Church-Rosser control calculus lambda_Delta enjoying a number of desirable properties. It is shown that lambda_Delta contains a strongly normalizing typed subcalculus with a reduction corresponding closely to systems of proof normalization for classical logic. The calculus is more than strong enough to express a call-by-name catch-throw programming paradigm.

- D-211.

**T Reps**.

*Solving demand versions of interprocedural analysis problems*.

In *Proceedings of the Fifth International Conference on Compiler Construction*, pp. 389-403. Volume 786 of Lecture Notes in Computer Science. Springer-Verlag. 1994. *Keywords*: interprocedural dataflow analysis, magic-sets transformation, demand algorithms. *Summary*: The paper describes how algorithms for demand versions of program-analysis problems can be obtained from their exhaustive counterparts essentially for free, by applying the so-called magic-sets transformation that was developed in the logic-programming and deductive-database communities.

- D-214.

**T Reps, S Horwitz, M Sagiv, and G Rosay**.

*Speeding up slicing*.

June, 1994. Submitted for publication. *Keywords*: interprocedural program slicing, system dependence graph, dynamic programming, dynamic transitive closure. *Summary*: The paper presents a new algorithm for slicing system dependence graphs that is asymptotically faster than the best previous algorithm. A preliminary experimental study shows that the algorithm is significantly faster in practice, as well.

- D-215.

**T Reps, M Sagiv, and S Horwitz**.

*Interprocedural dataflow analysis via graph reachability*.

Technical Report 94/14. DIKU, University of Copenhagen, Denmark. 1994. *Keywords*: demand dataflow analysis, interprocedural dataflow analysis, interprocedural program slicing, graph reachability. *Summary*: The paper shows how a large class of interprocedural dataflow-analysis problems can be solved precisely in polynomial time by transforming them into a special kind of graph-reachability problem.

- D-216.

**T Reps, S Horwitz, and M Sagiv**.

*Precise interprocedural dataflow analysis via graph reachability*.

July, 1994. Submitted for publication. *Keywords*: interprocedural dataflow analysis, interprocedural program slicing, graph reachability. *Summary*: The paper shows how a large class of interprocedural dataflow-analysis problems can be solved precisely in polynomial time by transforming them into a special kind of graph-reachability problem. Results are reported from a preliminary experimental study of C programs (for the problem of finding possibly-uninitialized variables).

- D-213.
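The reduction to graph reachability described in the Reps, Horwitz and Sagiv entries above can be pictured with a toy sketch. This is our own illustration, not the authors' algorithm: the node names and the example are hypothetical, and it omits the interprocedural machinery that makes the real analysis precise. Dataflow facts paired with program points become nodes of an "exploded" graph, and a fact holds at a point iff that node is reachable from the entry node.

```python
# Toy sketch: a fact such as "x may be uninitialized" holds at a program
# point iff the node (point, fact) is reachable from the entry node.
# Reachability is computed by a standard worklist traversal.

from collections import deque

def reachable(edges, start):
    """edges: dict node -> list of successor nodes."""
    seen = {start}
    work = deque([start])
    while work:
        n = work.popleft()
        for m in edges.get(n, []):
            if m not in seen:
                seen.add(m)
                work.append(m)
    return seen

# Hypothetical exploded graph: the fact "x?" (x may be uninitialized)
# flows from entry to p1, where an assignment to x kills it.
edges = {
    ("entry", "x?"): [("p1", "x?")],
    ("p1", "x?"): [],      # p1 assigns x: the fact propagates no further
}
facts = reachable(edges, ("entry", "x?"))
```

The polynomial bound in the papers comes from the exploded graph having one node per (point, fact) pair and reachability being linear in its size.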

**T Reps**.

*Demand interprocedural program analysis using logic databases*.

May, 1994. Extended version of D-211. Submitted for publication. *Keywords*: interprocedural dataflow analysis, magic-sets transformation, interprocedural program slicing, demand algorithms. *Summary*: The paper describes how algorithms for demand versions of program-analysis problems can be obtained from their exhaustive counterparts essentially for free, by applying the so-called magic-sets transformation that was developed in the logic-programming and deductive-database communities. Applications to interprocedural dataflow analysis and interprocedural program slicing are described.

- D-212.

**T Reps**.

*On the sequential nature of interprocedural program-analysis problems*.

January, 1994. Submitted for publication. *Keywords*: interprocedural dataflow analysis, interprocedural program slicing, parallel algorithms, inherently sequential problems. *Summary*: This paper shows that the interprocedural slicing problem and the interprocedural gen/kill dataflow-analysis problem are log-space complete for P. These results provide evidence that there do not exist fast parallel algorithms for these problems.

- D-210.

**K H Rose**.

*Xy-pic and Notation for Categorical Diagrams*.

Applied Categorical Structures. Invited submission to ECCT-94. *Keywords*: TeX, typesetting, diagrams. *Summary*: Discusses textual notation for categorical diagrams based on the author's experience from developing the Xy-pic package for TeX. In particular, the notion of 'conceptual markup' for diagrams is introduced and explained. *Pointers*: BibTeX. html. ps.Z. talk.html.

- D-200.

**K H Rose**.

*Xy-pic User's Guide*.

Mathematics Report 94-148. MPCE, Macquarie University. NSW 2109, Australia. June, 1994. For version 2.10+. Latest version available by anonymous ftp in `ftp.diku.dk:/diku/users/kris/TeX/xyguide.ps.Z`. **Obsolete!** Superseded by D-237. *Keywords*: TeX, typesetting, graphs, diagrams. *Summary*: Explains how to use the Xy-pic macro package to typeset graphs and diagrams with TeX.

- D-201.

**K H Rose and R R Moore**.

*Xy-pic Reference Manual*.

Mathematics Report 94-155. MPCE, Macquarie University. NSW 2109, Australia. June, 1994. For version 2.10+. Latest version available by anonymous ftp in `ftp.diku.dk:/diku/users/kris/TeX/xyrefer.ps.Z`. **Obsolete!** Superseded by D-238. *Keywords*: TeX, typesetting, graphs, diagrams. *Summary*: Reference manual for Xy-pic summarising syntax and 'drawing semantics' of the capabilities in the kernel, all extensions and features, and the PostScript backend.

- D-219.

**D Sands**.

*Composed Reduction Systems*.

In *Proceedings of the 6th Nordic Workshop on Programming Theory*, pp. 360-377. Volume NS-94-6 of BRICS Notes Series. Aarhus. October, 1994. *Summary*: This paper studies composed reduction systems: a system of programs built up from the reduction relations of some reduction system, by means of parallel and sequential composition operators. The trace-based compositional semantics of composed reduction systems is considered, and a new graph representation is introduced as an alternative basis for the study of compositional semantics, refinement, true concurrency (in the case of composed rewriting systems) and program logics.

- D-220.

**D Sands**.

*Towards Operational Semantics of Contexts in Functional Languages*.

In *Proceedings of the 6th Nordic Workshop on Programming Theory*, pp. 378-385. Volume NS-94-6 of BRICS Notes Series. Aarhus, Denmark. 1994. *Summary*: We consider operational semantics of contexts (terms with holes) in the setting of lazy functional languages, with the aim of providing a balance between operational and compositional reasoning, and a framework for semantics-based program analysis and manipulation.

- D-190.

**M H Sørensen, R Glück, and N D Jones**.

*Towards Unifying Deforestation, Supercompilation, Partial Evaluation, and Generalized Partial Computation*.

In *European Symposium on Programming* (D Sannella, ed.), pp. 485-500. Volume 788 of Lecture Notes in Computer Science. Springer-Verlag. 1994. *Keywords*: supercompilation, deforestation, partial evaluation, GPC, positive supercompilation, pattern matching. *Summary*: We study four transformation methodologies which are automatic instances of Burstall and Darlington's fold/unfold framework: *partial evaluation, deforestation, supercompilation,* and *generalized partial computation (GPC)*. One can classify these and other fold/unfold based transformers by how much information they maintain during transformation. We introduce the *positive supercompiler*, a version of deforestation including more information propagation, to study such a classification in detail. Via the study of positive supercompilation we are able to show that partial evaluation and deforestation have simple information propagation, positive supercompilation has more information propagation, and supercompilation and GPC have even more information propagation. The amount of information propagation is significant: positive supercompilation, GPC, and supercompilation can specialize a general pattern matcher to a fixed pattern so as to obtain efficient output similar to that of the Knuth-Morris-Pratt algorithm. In the case of partial evaluation and deforestation, the general matcher must be rewritten to achieve this.

- D-191.

**M H Sørensen**.

*A Grammar-Based Data-Flow Analysis to Stop Deforestation*.

In *Colloquium on Trees in Algebra and Programming* (S Tison, ed.), pp. 335-351. Volume 787 of Lecture Notes in Computer Science. Springer-Verlag. 1994. *Keywords*: deforestation, termination, tree grammars. *Summary*: Wadler's *deforestation* algorithm removes intermediate data structures from functional programs, but is only guaranteed to terminate for *treeless* programs. Chin has shown how one can apply deforestation to all first-order programs: *annotate* non-treeless subterms and apply the *extended* deforestation algorithm, which essentially leaves annotated subterms untransformed. We develop a new technique of putting annotations on programs. The basic idea is to compute a finite grammar which approximates the set of terms that the deforestation algorithm encounters. The technique extends Wadler's and Chin's in a certain sense.

- D-297.

**M H Sørensen**.

*Turchin's Supercompiler Revisited*.

Master's Thesis. Department of Computer Science, University of Copenhagen. 1994. DIKU report 94/17. *Keywords*: supercompilation, driving, program transformation. *Summary*: Turchin's supercompiler is a program transformer that includes both partial evaluation and deforestation. Although known in the West since 1979, the essence of its techniques, its more precise relations to other transformers, and the properties of the programs that it produces are only now becoming apparent in the Western functional programming community. This thesis gives a new formulation of the supercompiler in familiar terms; we study the essence of it, how it achieves its effects, and its relations to related transformers; and we develop results dealing with the problems of preserving semantics, assessing the efficiency of transformed programs, and ensuring termination.

- D-235.

**M Tofte and J-P Talpin**.

*Implementing the Call-By-Value Lambda-Calculus using a Stack of Regions*.

In *Proceedings of the 21st ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages*. ACM Press. January, 1994.

- D-273.

**M Tofte**.

*Principal Signatures for Higher-order Program Modules*.

Journal of Functional Programming **4**(3), pp. 285-335. July, 1994. *Keywords*: modules, signatures, Standard ML, higher-order functors. *Summary*: In this paper we present a language for programming with higher-order modules. (This is a revised and expanded version of a paper with the same title, presented at the 19th ACM Symposium on Principles of Programming Languages, Albuquerque, New Mexico, Jan. 1992.) The language, HML, is based on Standard ML in that it provides structures, signatures and functors. In HML, functors can be declared inside structures and specified inside signatures; this is not possible in Standard ML. We present an operational semantics for the static semantics of HML signature expressions, with particular emphasis on the handling of sharing. As a justification for the semantics, we prove a theorem about the existence of principal signatures. This result is closely related to the existence of principal type schemes for functional programming languages with polymorphism.

- D-217.

**M Welinder**.

*Towards Efficient Conversions by use of Partial Evaluation*.

In *??*. September, 1994. To be presented at HUG-94, Malta, Category B. *Keywords*: partial evaluation, theorem proving, HOL-90. *Summary*: Using partial evaluation, it is shown how to speed up theorem proving in HOL-90. This is done by specializing a general rewriter with respect to a given set of equality theorems.

Generated automatically from ftp'able BibTeX sources.

topps-www@diku.dk / October 27, 2010.