- D-354.

**S M Abramov and R Glück**.

*Semantics Modifiers: an Approach to Non-Standard Semantics of Programming Languages*.

In *Third Fuji International Symposium on Functional and Logic Programming* (M Sato and Y Toyama, eds.), pp. 247-270. World Scientific. 1998. *Keywords*: programming languages, non-standard semantics, inverse computation, logic programming, program specialization, partial evaluation, metacomputation. *Summary*: An approach for systematically modifying the semantics of programming languages by semantics modifiers is described. It allows the design of general and reusable "semantics components" without the penalty of being too inefficient. Inverse computation and neighborhood analysis are used to demonstrate the computational feasibility of the approach. Equivalence transformers are shown to fulfill the requirements of semantics modifiers. Finally, projections are given that allow the efficient implementation of non-standard interpreters and non-standard compilers using program specialization.

- D-360.

**A M Ben-Amram and H Petersen**.

*Cons-free Programs with Tree Input*.

In *Proceedings, 25th Int. Colloquium on Automata, Languages and Programming (ICALP)* (K G Larsen, S Skyum, and G Winskel, eds.), pp. 271-282. Volume 1443 of LNCS. Springer. 1998. *Keywords*: computability in subrecursive languages, complexity classes, multi-head automata. *Summary*: We investigate the expressive power of programs operating on an input which is a LISP-style tree, without the possibility to modify or create cons cells. They can thus apply car or cdr but not cons. We obtain a hierarchy of classes of sets that can be recognized by deterministic, non-deterministic and recursive programs respectively, with and without the additional operation of testing pointer equality. Several results on the relationship of the classes defined by these languages to Turing-machine based complexity classes are obtained. *Pointers*: BibTeX.

- D-351.
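An illustrative sketch of the cons-free discipline summarized in entry D-360 above (not code from the paper; the names `car`, `cdr` and `contains` are the conventional LISP-style ones, chosen here for illustration): the program may inspect its tree input with car and cdr, but never builds new cells with cons.

```python
# A "cons-free" recursive program over a LISP-style tree input.
# It destructures the tree with car/cdr but allocates no new cells.

def car(t):
    return t[0]

def cdr(t):
    return t[1]

def is_atom(t):
    return not isinstance(t, tuple)

def contains(t, atom):
    """Search the input tree for an atom, using only car/cdr."""
    if is_atom(t):
        return t == atom
    return contains(car(t), atom) or contains(cdr(t), atom)

tree = (("a", "b"), ("c", ("d", "e")))
print(contains(tree, "d"))  # True
print(contains(tree, "z"))  # False
```

Because such programs cannot allocate, their reachable configurations are bounded by the input, which is what ties them to the subrecursive complexity classes studied in the paper.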

**A M Ben-Amram**.

*Pointer Machines and Pointer Algorithms: an Annotated Bibliography*.

February, 1998. *Keywords*: pointer machine, pointer algorithm, storage modification machine, SMM, PPM, HMM, KUM, LISP machine. *Summary*: The term "pointer machine" has been used in Computer Science literature with various incompatible denotations. To help in disambiguating references, this report explains the different concepts and lists publications referring to each.

- D-362.

**A M Ben-Amram**.

*A Precise Version of the Time Hierarchy for Language `I`*.

98/20. DIKU (Department of Computer Science). University of Copenhagen. August, 1998. *Keywords*: time hierarchy, constant factor. *Summary*: This report repeats the proof of the hierarchy theorem for the language `I` by Jones (cf. item D-160 in this bibliography) in a sufficient level of detail that the constants involved can be evaluated, showing for example that there are problems decidable in time 127·a·n but not in time a·n, for every a ≥ 103. The constants are of course dependent on the programming language, but our main point is that they can be effectively calculated and are not very large.

- D-352.

**M Brandt**.

*Recursive Subtyping: Axiomatizations and Computational Interpretations*.

Technical Report. April, 1997. *Keywords*: type, recursive, subtyping, coercion, axiomatization, inference rules, coinduction, fixpoint, Amadio, Cardelli. *Summary*: We give new axiomatizations of type equality and subtyping for a type language with recursive types. The novelty of our approach is the use of the fixpoint rule, which captures a finitary form of coinduction. Our coinductive approach has several advantages over previous axiomatizations: the coinductive rule is conceptually simpler than the contraction rule, and it is self-contained and strictly syntax directed. Natural operational interpretations (coercions) exist for the coinductive rule. Efficient and natural decision algorithms are easily constructed. The recursive subtype relation may be axiomatized without type equality. We prove our axiomatizations sound and complete with respect to Amadio and Cardelli's. Due to the coinductive rule we are able to construct a coercion language which encodes all subtype derivations and has a natural operational interpretation easily related to coercions in real programming languages. We develop a coherence theory of coercions and subtype derivations which is founded on a coinductive equality relation on coercions. Furthermore, we present a denotational semantics for coercions and show coercion equality sound under this interpretation. Finally, we formulate an optimality criterion for coercions, show that optimal coercions exist for all subtype derivations, and give an explicit algorithm for finding them. This algorithm also works as a decision algorithm for the subtype relation.

- D-353.

**M Brandt and F Henglein**.

*Coinductive axiomatization of recursive type equality and subtyping*.

Fundamenta Informaticae **33**, pp. 309-338. 1998. Invited submission to a special issue featuring a selection of contributions to the 3rd Int'l Conf. on Typed Lambda Calculi and Applications (TLCA), 1997. *Keywords*: recursive type, type equality, subtyping, axiomatization, inference rule, inference system, coercion, coinduction, contractive. *Summary*: We present new sound and complete axiomatizations of type equality and subtype inequality for a first-order type language with regular recursive types. The rules are motivated by coinductive characterizations of type containment and type equality via simulation and bisimulation, respectively. The main novelty of the axiomatization is the fixpoint rule, a finitary coinduction principle. The new axiomatizations give rise to a natural operational interpretation of proofs as coercions and to efficient algorithms for constructing explicit coercions, not only deciding type equality and type containment. More generally, we show how adding the fixpoint rule makes it possible to define inductively a set that is coinductively defined as the kernel (greatest fixed point) of an inference system.

- D-354.
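A minimal sketch of the coinductive idea behind entry D-353 above (not the paper's inference system, and with invented names `bisimilar`, `con`, `args`): equality of regular recursive types can be decided by assuming each visited pair of types equal and checking that their unfoldings agree, which terminates because a regular type graph has finitely many nodes.

```python
# Coinductive equality check for regular recursive types.
# Types are graph nodes: {"con": constructor, "args": [child nodes]};
# the graphs may be cyclic (mu-types).

def bisimilar(a, b, assumed=None):
    if assumed is None:
        assumed = set()
    key = (id(a), id(b))
    if key in assumed:           # coinductive step: the pair was assumed equal
        return True
    if a["con"] != b["con"] or len(a["args"]) != len(b["args"]):
        return False
    assumed.add(key)
    return all(bisimilar(x, y, assumed)
               for x, y in zip(a["args"], b["args"]))

# Two separately built copies of  mu t. int -> t  (a cyclic graph each):
t1 = {"con": "->", "args": []}
t1["args"] = [{"con": "int", "args": []}, t1]
t2 = {"con": "->", "args": []}
t2["args"] = [{"con": "int", "args": []}, t2]
print(bisimilar(t1, t2))   # True
```

The `assumed` set plays the role of the fixpoint rule: a goal already on the path is discharged rather than re-proved, giving a finitary form of coinduction.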

**M Bruynooghe, M Leuschel, and K Sagonas**.

*A Polyvariant Binding-Time Analysis for Off-line Partial Deduction*.

In *Proceedings of the European Symposium on Programming (ESOP'98)* (C Hankin, ed.), pp. 27-41. Volume 1381 of Lecture Notes in Computer Science. Springer-Verlag. 1998. *Keywords*: partial deduction, binding-time analysis, off-line, logic programming, partial evaluation, program specialisation. *Summary*: We study the notion of binding-time analysis for logic programs. We formalise the unfolding aspect of an on-line partial deduction system as a Prolog program. Using abstract interpretation, we collect information about the run-time behaviour of the program. We use this information to make the control decisions about the unfolding at analysis time and to turn the on-line system into an off-line system. We report on some initial experiments. *Pointers*: BibTeX.

- D-395.

**M V Christiansen and P Velschow**.

*Region-based Memory Management in Java*.

Master's Thesis. DIKU, University of Copenhagen. Universitetsparken 1, DK-2100 Copenhagen, Denmark. May, 1998. *Keywords*: dynamic memory management, Java, region, regions, type, inference system, soundness, static semantics, dynamic semantics. *Summary*: We present a Java-like language in which objects are explicitly put in regions. The language has constructs for allocating, updating and deallocating regions, as well as region types for objects. For this language we present a static semantics ensuring that well-typed programs use regions safely, and we present a dynamic semantics that is intentional with respect to a region-based store. We formulate and prove a soundness theorem stating that well-typed programs do not go wrong. Finally, we develop a concrete model for implementing regions, and we compare this model to garbage collection for small examples.

- D-379.
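An illustrative sketch of the region discipline described in entry D-395 above, in Python rather than the thesis's Java-like language, and with invented names (`Region`, `alloc`, `dealloc`): every object lives in an explicitly chosen region, and deallocating the region reclaims all of its objects at once instead of tracing them with a garbage collector.

```python
# Region-based memory management, sketched: objects are placed in an
# explicit region; the whole region is freed in one operation.

class Region:
    def __init__(self):
        self.objects = []

    def alloc(self, obj):
        """Place an object in this region and return it."""
        self.objects.append(obj)
        return obj

    def dealloc(self):
        """Free every object in the region at once."""
        self.objects.clear()

r = Region()
p = r.alloc({"x": 1, "y": 2})   # allocate an object in region r
p["x"] = 10                     # update it in place
r.dealloc()                     # the whole region is freed together
print(len(r.objects))           # 0
```

The safety property the thesis proves statically is exactly what this sketch cannot check: that no object is used after its region has been deallocated.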

**O Danvy, R Glück, and P Thiemann, editors**.

*1998 Symposium on Partial Evaluation*.

Volume 30 of ACM Computing Surveys. ACM Press. September, 1998. *Pointers*: BibTeX.

- D-369.

**P Eidorff, F Henglein, C Mossin, H Niss, M H Sørensen, and M Tofte**.

*AnnoDomini: From Type Theory to Year 2000 Conversion Tool*.

ERCIM News **36**. 1998. To appear. *Keywords*: AnnoDomini, Type Theory, Year 2000 Problem, Automatic Conversion. *Summary*: AnnoDomini is a commercially available source-to-source conversion tool for making COBOL programs Year 2000 compliant. It was developed in the last two years by a group at DIKU (part of the ERCIM partner DANIT) and grew directly out of research in the theory of programming languages; it uses type-based specification, analysis, and transformation. These are combined into an integrated software reengineering tool and method for finding and fixing Year 2000 problems. AnnoDomini's primary goals have been flexibility, completeness, correctness, and a high degree of safe automation.

- D-357.

**M Elsman**.

*Polymorphic Equality - No Tags Required*.

In *Second International Workshop on Types in Compilation* (X Leroy and A Ohori, eds.), pp. 136-155. Volume 1473 of Lecture Notes in Computer Science. Springer Verlag. 1998. *Pointers*: BibTeX. dvi.gz. ps.gz.

- D-391.

**M Elsman**.

*Program Modules, Separate Compilation, and Intermodule Optimisation*.

PhD thesis. DIKU, University of Copenhagen. Universitetsparken 1, DK-2100 Copenhagen, Denmark. December, 1998. Available as Tech. Report 99/3. *Summary*: This thesis is about a framework for elaborating and interpreting module language constructs at compile time in such a way that (1) arbitrary compile-time information about declared identifiers of a module may be propagated to other modules and (2) no code is generated for module language constructs. The framework for interpreting module constructs is called static interpretation. More information about referenced identifiers than can be obtained from programmer-produced interfaces is necessary for analyses such as region inference. Further, many analyses improve significantly from the availability of analysis-specific information about referenced identifiers. Static interpretation facilitates intermodule optimisation, yet it still supports a variant of separate compilation called cut-off incremental recompilation. The thesis is divided into three parts. The first part is about the static properties of a small language called ModML, which features a small Core language and the essential constructs of the Standard ML Modules language. The second part develops the framework for static interpretation by showing how ModML can be compiled into an explicitly typed intermediate language. The third part describes the ML Kit with Regions compiler, which is based on the techniques developed in parts one and two. *Pointers*: BibTeX.

- D-388.

**A Glenstrup, H Makholm, and J P Secher**.

*C-Mix: Making Easily Maintainable C-Programs run Fast*.

In *Proceedings of the International Workshop on Software Tools for Technology Transfer* (T Margaria and B Steffen, eds.). Volume NS-98-4 of BRICS Notes Series. BRICS. Aalborg, Denmark. June, 1998. *Keywords*: partial evaluation. *Pointers*: BibTeX. pdf.gz. ps.gz.

- D-389.

**A Glenstrup, H Makholm, and J P Secher**.

*Generating Program Generators*.

ERCIM News **36**, pp. 17-18. 1998. *Keywords*: partial evaluation. *Pointers*: BibTeX.

- D-358.

**R Glück and A V Klimov**.

*On the degeneration of program generators by program composition*.

New Generation Computing **16**(1), pp. 75-95. 1998. *Keywords*: programming languages, program transformation, partial evaluation, program specialization, program composition, metacomputation. *Summary*: One of the main discoveries in the seventies was that the concept of a generating extension covers a very wide class of apparently different program generators. Program specialization, or partial evaluation, is powerful because it provides uniform techniques for the automatic implementation of generating extensions from ordinary programs. The Futamura projections stand as the cornerstone of the development of program specialization. This paper takes the idea of the Futamura projections further. Three degeneration projections are formulated which tell us how to achieve the reverse goal by program composition, namely turning a generating extension into an ordinary program. The fact that program composition can invert the effect of program specialization shows that these projections are dual in some sense. The degeneration projections complete a missing link between programs and generating extensions and allow for novel applications of program transformation. *Pointers*: BibTeX.

- D-394.
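An illustrative sketch of the direction entry D-358 above reverses, using a deliberately trivial "specializer" (plain partial application) in place of a real one; the names `mix` and `interp` are the conventional ones from the partial evaluation literature, not code from the paper.

```python
from functools import partial

def mix(p, static):
    """Toy specializer: fix p's first (static) argument."""
    return partial(p, static)

def interp(program, data):
    """Toy interpreter where "programs" are Python functions."""
    return program(data)

double = lambda n: 2 * n

# First Futamura projection: specializing the interpreter to a
# program yields a "compiled" version of that program.
target = mix(interp, double)        # target(data) == interp(double, data)
print(target(21))                   # 42

# Degeneration direction: applying a generating extension to its
# static input recovers an ordinary program again.
gen_ext = lambda static: mix(interp, static)   # generating extension of interp
ordinary = gen_ext(double)                     # back to an ordinary program
print(ordinary(21))                            # 42
```

With a real specializer, `mix` would produce genuinely optimized residual code rather than a closure; the sketch only shows the shape of the projections, not their efficiency content.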

**F Henglein and J Rehof**.

*Constraint Automata and the Complexity of Recursive Subtype Entailment*.

In *Proc. 25th Int'l Coll. on Automata, Languages and Programming (ICALP), Aalborg, Denmark*, pp. 616-627. Volume 1443 of Lecture Notes in Computer Science. Springer-Verlag. July, 1998. *Keywords*: subtyping, subtype, entailment, recursive types, constraint, computational complexity. *Summary*: We study entailment of structural and nonstructural recursive subtyping constraints. Constraints are formal inequalities between type expressions, interpreted over an ordered set of possibly infinite labeled trees. The nonstructural ordering on trees is the one introduced by Amadio and Cardelli for subtyping with recursive types. The structural ordering compares only trees with common shape. A constraint set entails an inequality if every assignment of meanings (trees) to type expressions that satisfies all the constraints also satisfies the inequality. In this paper we prove that nonstructural subtype entailment is PSPACE-hard, both for finite trees (simple types) and infinite trees (recursive types). For the structural ordering we prove that subtype entailment over infinite trees is PSPACE-complete when the order on trees is generated from a lattice of type constants. Since structural subtype entailment over finite trees has been shown to be coNP-complete, assuming coNP ≠ PSPACE these are the first complexity-theoretic separation results showing that, informally, nonstructural subtype entailment is harder than structural entailment, and recursive entailment is harder than nonrecursive entailment.

- D-372.

**N D Jones, O de Moor, and J S Royer**.

*Programs: Improvements, Complexity, and Meanings*.

98231. IBFI (Internationales Begegnungs- und Forschungszentrum für Informatik), Universität des Saarlandes, Saarbrücken. Postfach 15 11 50, D-66041 Saarbrücken, Germany. June, 1998. *Keywords*: Dagstuhl seminar report. *Summary*: Abstracts of a meeting held June 8-12, 1998 at Schloss Dagstuhl. *Pointers*: BibTeX.

- D-373.

**N D Jones**.

*ESSLLI 98 Lecture Notes: Computability and Complexity from a Programming Perspective*.

Saarbrücken, Germany. European Summer School in Logic, Language and Information (ESSLLI) Lecture Notes, August, 1998. *Keywords*: computability, complexity, programming. *Pointers*: BibTeX.

- D-349.

**M Leuschel, B Martens, and D De Schreye**.

*Some Achievements and Prospects in Partial Deduction*.

ACM Computing Surveys, **Symposium on Partial Evaluation**. 1998. To appear. *Keywords*: partial deduction, on-line partial evaluation, program specialisation, logic programming. *Pointers*: BibTeX.

- D-356.

**M Leuschel**.

*On the Power of Homeomorphic Embedding for Online Termination*.

In *Static Analysis. Proceedings* (G Levi, ed.), pp. 230-245. Volume 1503 of Lecture Notes in Computer Science. Springer-Verlag. Pisa, Italy. September, 1998. *Keywords*: Termination, Well-quasi orders, Well-founded orders, Program Analysis, Specialisation and Transformation, Partial Evaluation, Partial Deduction, Logic Programming, Functional & Logic Programming. *Summary*: Recently well-quasi orders in general, and homeomorphic embedding in particular, have gained popularity for ensuring the termination of program analysis, specialisation and transformation techniques. In this paper we investigate and clarify for the first time, both intuitively and formally, the advantages of such an approach over one using well-founded orders. Notably, we show that the homeomorphic embedding relation is strictly more powerful than a large class of involved well-founded approaches. *Pointers*: BibTeX.

- D-359.
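An illustrative sketch of the relation entry D-356 above studies (one common formulation, not necessarily the paper's): homeomorphic embedding on first-order terms, represented here as (functor, argument-list) pairs. An online termination check "whistles" when an earlier term is embedded in the current one, which is safe because embedding is a well-quasi order.

```python
# Homeomorphic embedding on ground first-order terms.
# A term is (functor, [subterms]); t1 embeds in t2 if t2 contains a
# "stretched" copy of t1.

def embeds(t1, t2):
    f1, args1 = t1
    f2, args2 = t2
    # Diving: t1 is embedded in some subterm of t2.
    if any(embeds(t1, s) for s in args2):
        return True
    # Coupling: same functor and arity, arguments embed pairwise.
    return (f1 == f2 and len(args1) == len(args2)
            and all(embeds(a, b) for a, b in zip(args1, args2)))

a = ("f", [("a", [])])
b = ("f", [("g", [("a", [])])])
print(embeds(a, b))   # True: f(a) embeds in f(g(a))
print(embeds(b, a))   # False
```

The well-quasi-order property guarantees that any infinite sequence of terms eventually contains an embedded pair, so the whistle cannot be postponed forever; this is the source of the extra power over well-founded orders that the paper makes precise.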

**M Leuschel**.

*Program Specialisation and Abstract Interpretation Reconciled*.

In *Proceedings of the Joint International Conference and Symposium on Logic Programming (JICSLP'98)* (J Jaffar, ed.). MIT Press. Manchester, UK. June, 1998. *Keywords*: program specialisation, partial evaluation, partial deduction, abstract interpretation. *Summary*: We clarify the relationship between abstract interpretation and program specialisation in the context of logic programming. We present a generic top-down abstract specialisation framework, along with a generic correctness result, into which a lot of the existing specialisation techniques can be cast. The framework also shows how these techniques can be further improved by moving to more refined abstract domains. It, however, also highlights inherent limitations shared by all these approaches. In order to overcome them, and to fully unify program specialisation with abstract interpretation, we also develop a generic combined bottom-up/top-down framework, which allows specialisation and analysis outside the reach of existing techniques. *Pointers*: BibTeX.

- D-371.

**C Mossin**.

*Higher-Order Value Flow Graphs*.

Nordic Journal of Computing **5**, pp. 214-234. 1998. *Keywords*: Program analysis, type systems, efficiency, polymorphism, recursive types, polyvariance. *Summary*: A value flow analysis based on a new concept of higher-order value flow graphs is presented. The language analysed needs to be typed, but the type system may include polymorphism, sum types, recursive types and even dynamic types. The precision of the analysis is equivalent to Sestoft's closure analysis, yet the algorithm has quadratic complexity in the size of the explicitly typed program. A proof of correctness is given. *Pointers*: BibTeX.

- D-375.

**J Rehof**.

*The Complexity of Simple Subtyping Systems*.

PhD thesis. DIKU, University of Copenhagen. 1998. Available as DIKU Rapport 98/19.

- D-348.

**K Sagonas and M Leuschel**.

*Extending Partial Deduction to Tabled Execution: Some Results and Open Issues*.

ACM Computing Surveys **30, Symposium on Partial Evaluation**(3es). 1998. *Keywords*: partial deduction, tabling, partial evaluation, program specialisation, logic programming. *Pointers*: BibTeX.

- D-367.

**M H Sørensen**.

*Normalization in Lambda-Calculus and Type Theory*.

PhD thesis. Department of Computer Science, University of Copenhagen. 1997. *Keywords*: Lambda-Calculus, Type Theory, Weak Normalization, Strong Normalization. *Summary*: Available as DIKU Rapport 97/27.

- D-366.

**M H Sørensen**.

*Convergence of Program Transformers in the Metric Space of Trees*.

In *Mathematics of Program Construction* (J Jeuring, ed.), pp. 315-337. Volume 1422 of Lecture Notes in Computer Science. Springer-Verlag. 1998. *Keywords*: Program transformation, termination, metric spaces. *Summary*: In recent years increasing consensus has emerged that program transformers, e.g., partial evaluation and unfold/fold transformations, should terminate; a compiler should stop even if it performs fancy optimizations! A number of techniques to ensure termination of program transformers have been invented, but their correctness proofs are sometimes long and involved. We present a framework for proving termination of program transformers, cast in the metric space of trees. We first introduce the notion of an abstract program transformer; a number of well-known program transformers can be viewed as instances of this notion. We then formalize what it means that an abstract program transformer terminates and give a general sufficient condition for an abstract program transformer to terminate. We also consider some specific techniques for satisfying the condition. As applications we show that termination of some well-known program transformers either follows directly from the specific techniques or is easy to establish using the general condition. Our framework facilitates simple termination proofs for program transformers. Also, since our framework is independent of the language being transformed, a single correctness proof can be given in our framework for program transformers using essentially the same technique in the context of different languages. Moreover, it is easy to extend termination proofs for program transformers to accommodate changes to these transformers. Finally, the framework may prove useful for designing new termination techniques for program transformers.

- D-368.

**M H Sørensen and P Urzyczyn**.

*Lectures on the Curry-Howard Isomorphism*.

Available as DIKU Rapport 98/14, 1998. *Keywords*: The Curry-Howard isomorphism, Logic, Lambda-Calculus.

Generated automatically from ftp'able BibTeX sources.

topps-www@diku.dk / October 27, 2010.