Algebraic data types and catamorphisms (folds) play a central role in functional programming as they allow programmers to define recursive data structures and operations on them uniformly by structural recursion. Likewise, in object-oriented (OO) programming, recursive hierarchies of object types with virtual methods play a central role for the same reason. There is a semantical correspondence between these two situations which we reveal and formalize categorically. To this end, we assume a coalgebraic model of OO programming with functional objects. The development may be helpful in deriving refactorings that turn sufficiently disciplined functional programs into OO programs of a designated shape and vice versa. (Joint work with Ondrej Rypacek)
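As an illustrative sketch of the correspondence the abstract mentions (the `Tree` type and both operations are my own assumptions, not taken from the talk), a catamorphism packages one "method" per constructor, which is exactly the shape an OO interface of virtual methods would mirror:

```haskell
-- A recursive algebraic data type and its catamorphism (fold).
data Tree = Leaf Int | Node Tree Tree

-- The fold takes one "method" per constructor, mirroring an OO
-- interface with one virtual method per case of the hierarchy.
foldTree :: (Int -> r) -> (r -> r -> r) -> Tree -> r
foldTree leaf node (Leaf n)   = leaf n
foldTree leaf node (Node l r) = node (foldTree leaf node l)
                                     (foldTree leaf node r)

-- Operations defined uniformly by structural recursion:
sumTree, depthTree :: Tree -> Int
sumTree   = foldTree id (+)
depthTree = foldTree (const 1) (\l r -> 1 + max l r)

example :: Tree
example = Node (Node (Leaf 1) (Leaf 2)) (Leaf 3)
```

In the OO reading, each constructor becomes a subclass and each fold argument a virtual method implementation, which is the refactoring direction the talk alludes to.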
Ralf Lämmel has been Professor of Computer Science at the Department of Computer Science of the University of Koblenz-Landau since July 2007. In the years 2005-2007, Ralf Lämmel was affiliated with Microsoft Corp., where he held a research and development position with a focus on XML technologies. In the years 2001-2004, Ralf Lämmel held a permanent faculty position in the Software Engineering department at the Free University of Amsterdam, and he was also affiliated with the Dutch Center for Mathematics and Computer Science (CWI), starting in 1999. Ralf Lämmel obtained his PhD in Computer Science (dissertation in programming theory) from the University of Rostock (Germany) in December 1998.
Ralf Lämmel's speciality is "software language engineering" but he is generally interested in themes that combine software engineering and programming languages. His research and teaching interests include program transformation, generative programming, programming languages, type systems, generic language technology, grammar-based methods, advanced separation of concerns, and automated software engineering.
Ralf Lämmel has published approximately 70 peer-reviewed papers on the above-mentioned subjects, and he has participated in numerous national and international collaborations and funded research projects on these subjects. In academic and industrial projects, Ralf Lämmel has designed, implemented, and deployed software development tools, migration tools and application generators. Ralf Lämmel participates in various international conferences as program committee member and as organizer. For instance, he is one of the founding fathers of the international summer school series GTTSE -- Generative and Transformational Techniques in Software Engineering.
Francesco Cesarini will be presenting the concurrent soft real-time functional programming language Erlang and its rapidly growing community. He will describe why Erlang programs are generally 4 to 10 times shorter than their counterparts in Java, C and C++, while achieving 99.999% availability. The talk will also go into why Erlang, originally invented to handle the next generation of telecom products, has been successful in a much wider range of sectors, including banking and e-commerce.
Francesco Cesarini is the founder and CTO of Erlang Training and Consulting. He has used Erlang on a daily basis for almost 15 years, having started his career as an intern at Ericsson’s computer science laboratory, the birthplace of Erlang. He moved on to Ericsson’s Erlang training and consulting arm, working on the first release of OTP and applying it to turnkey solutions and flagship telecom applications. In 1999, soon after Erlang was released as open source, he founded Erlang Training and Consulting. With offices in the UK, Sweden, Poland (and soon the US), they have become the world leaders in Erlang-based consulting, contracting, training and systems development. Francesco has worked on major Erlang-based projects both within and outside Ericsson and, in his role as CTO, is currently leading the development and consulting teams at ETC. He is also the co-author of Practical Erlang Programming, a book soon to be published.
Electronic components designed nowadays are increasingly complex. A particularly good example is the SoC (System on Chip) architecture, which allows embedding numerous heterogeneous components into a single chip.
Market pressure calls for two apparently contradictory constraints:
Thus, the resulting abstraction gap poses a problem which needs to be addressed by system design methodologies. The ForSyDe (Formal System Design) methodology tries to solve this problem by raising the abstraction level of system design.
The abstraction gap is bridged by refinement techniques. The designer creates a high-level design (specification model) which is automatically transformed to a lower-level design (implementation model) by applying transformational refinement rules.
ForSyDe's implementation is currently based upon Haskell (a purely functional programming language). Therefore, specification models are expressed in Haskell, using an EDSL (Embedded Domain-Specific Language) specifically designed for this purpose.
The talk will be divided into two main blocks:
ForSyDe's embedded compiler, also based upon Haskell, makes use of innovative compilation techniques which drastically reduce development time.
For this block, the audience is assumed to have solid prior knowledge of functional programming and to be fluent in Haskell.
The formalisation of mathematical proofs often requires the help of a computer. For instance, the four color theorem or the Kepler conjecture could not have been proved without the help of programs.
Surprisingly, the usual logical formalisms do not take this into account. They focus on deduction and neglect computation, although it has been a central concept in mathematics throughout the centuries. Deduction Modulo is a logical framework introduced by Dowek, Hardin and Kirchner that tries to reconcile these two approaches.
I will first give a general overview of deduction modulo and give hints on its expressiveness, also introducing its semantics.
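As a minimal illustration of the computational side (the `Term` type and the Peano addition rules below are my own toy example, not the speaker's), deduction modulo identifies propositions up to normalization by a rewrite system, so that, e.g., 2 + 1 and 1 + 2 are recognized as equal by computation rather than by explicit deduction steps:

```haskell
-- A toy rewrite system for Peano arithmetic, sketching the
-- computational part of deduction modulo: terms are compared
-- modulo normalization by rewriting.
data Term = Zero | S Term | Plus Term Term deriving (Eq, Show)

-- Rewrite rules: 0 + y -> y,  S x + y -> S (x + y)
normalize :: Term -> Term
normalize Zero       = Zero
normalize (S t)      = S (normalize t)
normalize (Plus x y) = case normalize x of
  Zero -> normalize y
  S x' -> S (normalize (Plus x' y))
  x'   -> Plus x' (normalize y)   -- stuck term: leave it as is

-- Congruence modulo the rewrite system.
congruent :: Term -> Term -> Bool
congruent a b = normalize a == normalize b
```

Here `congruent (Plus two one) (Plus one two)` holds because both sides normalize to the numeral 3, without any deductive content.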
I will then focus on a central point of every logical formalism: the cut elimination theorem. I will explain how to get rid of the cut rule by semantic means in the particular case of intuitionistic logic. Thanks to the constructive approach, we will be able to characterize the algorithm inside the proof, and we will show that it is a tableau procedure (extended to intuitionistic deduction modulo).
Then I will point out how syntactical methods of cut elimination (normalization) are linked with the previous methods, and how they generate a general form of V-complexes, a structure introduced by Prawitz, Takahashi and Andrews in 1968.
Olivier Hermant is a post-doctoral fellow (Promesas-CAM project) at the Universidad Complutense, Madrid.
The design and implementation of precise static analyzers for significant fragments of modern imperative languages like C, C++, Java and Python is a challenging problem. Here we consider a core imperative language that has several features found in mainstream languages, including recursive functions, run-time system and user-defined exceptions, and a realistic data and memory model. For this language we provide a concrete semantics characterizing both finite and infinite computations and a generic abstract semantics that we prove sound with respect to the concrete one. We say the abstract semantics is generic since it is designed to be completely parametric on the analysis domains: in particular, it provides support for relational domains (i.e., abstract domains that can capture the relationships between different data objects). We also sketch how the proposed methodology can be extended to accommodate a larger language that includes pointers, compound data objects and non-structured control flow mechanisms. The approach, which is based on structured, big-step operational semantics and on abstract interpretation, is modular in that the overall static analyzer is naturally partitioned into components with clearly identified responsibilities and interfaces, something that greatly simplifies both the proof of correctness and the implementation.
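A very small sketch of the "parametric on the analysis domain" idea (the sign domain and the expression language below are illustrative assumptions of mine; the talk's framework is far more general, and this toy domain is non-relational):

```haskell
-- A minimal abstract interpreter over a sign domain.
data Sign = Neg | Zero | Pos | Top deriving (Eq, Show)

-- Abstraction of integer constants.
alpha :: Integer -> Sign
alpha n | n < 0     = Neg
        | n == 0    = Zero
        | otherwise = Pos

-- Abstract addition: a sound over-approximation of concrete (+).
addS :: Sign -> Sign -> Sign
addS Zero s  = s
addS s  Zero = s
addS Pos Pos = Pos
addS Neg Neg = Neg
addS _   _   = Top   -- Pos + Neg (or anything with Top) is unknown

-- Abstract multiplication.
mulS :: Sign -> Sign -> Sign
mulS Zero _  = Zero
mulS _ Zero  = Zero
mulS Pos Pos = Pos
mulS Neg Neg = Pos
mulS Pos Neg = Neg
mulS Neg Pos = Neg
mulS _   _   = Top

data Expr = Lit Integer | Add Expr Expr | Mul Expr Expr

-- The abstract semantics mirrors the concrete one constructor by
-- constructor; swapping the domain swaps the analysis.
absEval :: Expr -> Sign
absEval (Lit n)   = alpha n
absEval (Add a b) = addS (absEval a) (absEval b)
absEval (Mul a b) = mulS (absEval a) (absEval b)
```

For example, `absEval (Mul (Lit (-2)) (Lit (-3)))` is `Pos`, while `Add (Lit 1) (Lit (-1))` loses precision and yields `Top`.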
This talk shows a method for going from the software architecture to the formal verification of a distributed Video-on-demand system.
As the motivation and target of this research, we use the VoDKA system, a distributed VoD server developed by the MADS research group at the Universidade da Coruña, using the Erlang/OTP platform. The software architecture of VoDKA is very flexible and complex, and better tools are needed in order to increase the confidence of the system architects and improve the overall system quality. We study how to use formal verification for that purpose.
Therefore, using several tools from the area of formal methods, we propose an innovative method for automatically extracting performance information about the system. As input, our method receives the system source code and the system configuration (the description of the components and how they interact). As output, it provides feedback about the system performance and architectural bottlenecks. We applied the method extensively to analyze the VoDKA system as a case study and showed how it can be reused with other tools and for other similar distributed systems.
I want to show some of my recent work, which includes attempts to formally verify distributed and fault-tolerant algorithms using first-order logic, with the help of automated theorem provers. It is very much work in progress; we have successfully handled some simple algorithms and are now trying to verify a challenging distributed and fault-tolerant leader election algorithm.
A parametric algebraic data type is a functor when we can apply a function to its data components while satisfying certain equations. We investigate whether parametric abstract data types can be functors. We provide a general definition for their map operation that need only satisfy one equation. The definability of this map depends on properties of interfaces and is a sufficient condition for functoriality. Instances of the definition for particular abstract types can then be constructed using their axiomatic semantics. The definition and the equation can be adapted to determine, necessarily and sufficiently, whether an ADT is a functor for a given implementation.
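A hedged sketch of the phenomenon (the `Set` interface and its sorted-list representation are illustrative assumptions of mine, not the talk's development): a map defined only through the interface re-establishes the representation invariant even for non-injective functions, which is where the dependence of definability on the interface shows up:

```haskell
import qualified Data.List as L

-- An abstract Set: clients see only the interface below, not the
-- sorted, duplicate-free list used as representation.
newtype Set a = Set [a]

empty :: Set a
empty = Set []

insert :: Ord a => a -> Set a -> Set a
insert x s@(Set xs)
  | x `elem` xs = s
  | otherwise   = Set (L.insert x xs)  -- keep sorted, no duplicates

toList :: Set a -> [a]
toList (Set xs) = xs

-- A map definable through the interface alone: fold over the
-- elements, re-inserting each image, so the invariant is restored
-- even when f identifies distinct elements.
mapSet :: Ord b => (a -> b) -> Set a -> Set b
mapSet f (Set xs) = foldr (insert . f) empty xs
```

Mapping the non-injective `` (`div` 2) `` over the set {1, 2, 3} yields the two-element set {0, 1}, not a three-element "list with duplicates", precisely because `mapSet` goes through `insert`.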
Pablo Nogueira was born in Pontevedra, Galicia. He obtained the degree of Licenciado en Informatica from the Facultad de Informatica de Madrid, and his PhD from the University of Nottingham, UK. His research interests lie in the area of foundations and principles of programming, and he has recently worked in the area of generic functional programming.
The design of programs as the composition of smaller ones is a standard approach to programming. In functional programming, this approach raises the necessity of creating a good amount of intermediate data structures with the aim of passing data from one function to another.
Programs so defined are easier to understand and maintain, but are less efficient due to the generation of such data structures. Under certain conditions, intermediate data structures can be eliminated by the application of a program transformation technique, known as fusion, which appropriately combines the codes of the involved functions. Fusion techniques are based on algebraic laws associated with common recursive program schemes, like catamorphism, anamorphism, or the combination of both, called hylomorphism.
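The following minimal Haskell sketch (my own example, not code from HFusion) shows an unfused and a fused version of the same function side by side: an unfold builds the list [n, n-1, .., 1], a fold multiplies it, and the hylomorphism combines the two so that the intermediate list is never constructed:

```haskell
-- Anamorphism: build a list from a seed.
unfold :: (b -> Maybe (a, b)) -> b -> [a]
unfold g b = case g b of
  Nothing      -> []
  Just (a, b') -> a : unfold g b'

downFrom :: Integer -> [Integer]
downFrom = unfold (\n -> if n == 0 then Nothing else Just (n, n - 1))

-- Compositional version: an intermediate list is built and consumed.
factorial :: Integer -> Integer
factorial = product . downFrom

-- Hylomorphism: the fold (alg) and unfold (coalg) fused into a
-- single recursion, with no intermediate list.
hylo :: (Maybe (a, c) -> c) -> (b -> Maybe (a, b)) -> b -> c
hylo alg coalg = go
  where go b = alg (fmap (\(a, b') -> (a, go b')) (coalg b))

factorial' :: Integer -> Integer
factorial' = hylo alg (\n -> if n == 0 then Nothing else Just (n, n - 1))
  where alg Nothing       = 1
        alg (Just (a, r)) = a * r
```

Fusion tools derive the second version from the first mechanically, using the algebraic laws mentioned above.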
In this talk we will present the main aspects of a tool, called HFusion, which permits the application of fusion techniques to Haskell programs.
Asymmetric encryption using paired public and private keys provides encryption for secrecy, and also signatures for authenticity. (I will review the capabilities of asymmetric encryption at the beginning of the lecture.) Widespread use of asymmetric encryption requires a public key infrastructure (PKI) to distribute public keys and associate them with information about their owners.
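As a quick refresher on the duality the lecture will review, here is textbook RSA with deliberately tiny, insecure parameters (the numbers are standard toy values, not material from the talk): the same modular-exponentiation operation gives secrecy when the public exponent encrypts, and authenticity when the private exponent signs.

```haskell
-- Textbook RSA with toy parameters; DO NOT use for real security.
p, q, n, e, d :: Integer
p = 61; q = 53
n = p * q    -- public modulus, 3233
e = 17       -- public exponent
d = 2753     -- private exponent: e * d == 1 mod (p-1)*(q-1)

-- Modular exponentiation by repeated squaring.
powMod :: Integer -> Integer -> Integer -> Integer
powMod _ 0 _ = 1
powMod b ex m
  | even ex   = (h * h) `mod` m
  | otherwise = (b * h * h) `mod` m
  where h = powMod b (ex `div` 2) m

encrypt, decrypt, sign, verify :: Integer -> Integer
encrypt msg = powMod msg e n   -- anyone can encrypt with (e, n)
decrypt c   = powMod c d n     -- only the key owner can decrypt
sign msg    = powMod msg d n   -- only the key owner can sign
verify s    = powMod s e n     -- anyone can check the signature
```

The PKI problem discussed in the talk is everything this toy omits: how the pair (e, n) is distributed and reliably associated with its owner.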
Outside of the network, people and institutions have identities, attached to identifiers such as names or numbers. The main proposals for PKI try to import identities from outside of the network, by associating public keys reliably with externally meaningful identifiers. Some proposals require a chain of trusted signatures to avoid identity theft. The Pretty Good Privacy (PGP) system uses a more elaborate web of trusted signatures to reduce vulnerability to attack on a single signer.
The end-to-end principle of networking suggests a different approach to PKI (EEPKI). Instead of importing external identities into the network, EEPKI provides a tool that allows people to establish identities. EEPKI supports the identity relation instead of identities as objects. I outline protocols for EEPKI that provide substantial value to network users without relying on any central authority. EEPKI merges nicely with other systems of identifiers, including DNS, personal identifiers, and host/directory names in NFS.
A design pattern describes a set of proven solutions for a set of recurring design problems that occur within a context. As such, reusing patterns improves both quality and time-to-market of software projects. Currently most patterns are specified in an informal fashion, which gives rise to ambiguity and limits tool support and correct usage. This seminar describes the Balanced Pattern Specification Language (BPSL), a language intended to accurately describe patterns in order to allow rigorous reasoning about them. BPSL incorporates the formal specification of both structural and behavioral aspects of patterns. Moreover, it can formalize pattern composition and pattern instances (possible implementations of a given pattern).
Keywords: Balanced Pattern Specification Language (BPSL), permanent relations, temporal relations, actions, pattern composition, instances of patterns.
I'll give an overview of (one large part of) my research. I will follow a chronological account, mostly as a convenient framework.
So, I will start with Fraenkel-Mostowski set theory and the nominal approach to abstract syntax with binding, move through nominal logic and nominal unification, then nominal rewriting, the new calculus of contexts, nominal algebra, and present and future work.
Questions will be welcome as the aim of this talk is not technical detail: it is to acquaint the audience with where and how I see my work fitting in with the current state of the art in other fields. More information can be found on www.gabbay.org.uk.
Research in formal models of confidentiality, long dominated by the concept of noninterference (Goguen and Meseguer, 1980), is currently concentrated on how to relax and work around the limitations of noninterference that have hampered its practical adoption. Noninterference prevents all flows of information from users and systems with high clearance to users and systems with lower clearance levels, while most real applications need to permit, and also control, some amount of information declassification.
Ideally a declassification property is parameterized by a policy that is explicit, program-independent and extensionally described. After reviewing some of the initial proposals, which lack most of these desirable features, we present and discuss an automata-based approach, where the declassification policy is characterized by a Mealy machine. We illustrate the property with several examples and discuss current attempts to model both the policies and their enforcement using Maude, a rewriting logic framework.
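A minimal sketch of what a Mealy-machine policy looks like (the states, inputs and outputs below are illustrative assumptions of mine, not the talk's actual policies): the machine's output on each input says whether a release of secret information is permitted at that point.

```haskell
-- Inputs the policy observes and outputs it produces.
data In  = Query | Declassify deriving (Eq, Show)
data Out = Silent | Release   deriving (Eq, Show)

-- Mealy machine: current state and input determine the output and
-- the next state. The toy policy releases the secret only once,
-- after an explicit Declassify request.
step :: Bool -> In -> (Out, Bool)
step _     Declassify = (Silent, True)    -- arm the policy
step True  Query      = (Release, False)  -- one release, then re-lock
step False Query      = (Silent, False)

-- Run the policy over a stream of inputs from an initial state.
runPolicy :: Bool -> [In] -> [Out]
runPolicy _ []     = []
runPolicy s (i:is) = let (o, s') = step s i in o : runPolicy s' is
```

The declassification property then relates program runs to the outputs of such a machine: information may flow downward only where the policy emits `Release`.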
Polytypism, or structural polymorphism, is a popular functional generic programming technique which is already supported by two language extensions for Haskell: Generic Haskell and Scrap your Boilerplate, the latter based on Strategic Programming concepts.
Polytypic functions are parametric on types. However, application is not substitution but driven by the definitional structure of the types in terms of sums of products. Polytypic functions capture families of overloaded or polymorphic functions in a single and typed ‘template’ definition. Two typical examples are equality and map, which can be defined once and for all, and can work with any monomorphic or parametrically polymorphic algebraic type defined by the programmer.
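To make "driven by the definitional structure of the types in terms of sums of products" concrete, here is a hand-rolled miniature sketch (the representation type and the list embedding are my own simplifications; Generic Haskell works on real types rather than an explicit representation value):

```haskell
-- A miniature sum-of-products universe of type representations.
data Rep = U             -- unit: a constructor with no fields
         | I Int         -- a base-type field
         | Rep :*: Rep   -- product: several fields
         | L Rep | R Rep -- sum: choice of constructor
         | Rec Rep       -- a recursive occurrence

-- Polytypic equality: defined once by structural recursion over the
-- universe, it works for every representable type.
geq :: Rep -> Rep -> Bool
geq U         U         = True
geq (I m)     (I n)     = m == n
geq (a :*: b) (c :*: d) = geq a c && geq b d
geq (L a)     (L b)     = geq a b
geq (R a)     (R b)     = geq a b
geq (Rec a)   (Rec b)   = geq a b
geq _         _         = False

-- Embedding [Int] into the universe: [] is L U, and (x:xs) is
-- R (field :*: recursive rest), i.e. a sum of products.
fromList :: [Int] -> Rep
fromList []     = L U
fromList (x:xs) = R (I x :*: Rec (fromList xs))
```

The single `geq` definition now decides equality for lists (and any other type given an embedding), which is the "once and for all" promise of polytypic equality and map.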
However, accessing the definitional structure of a type conflicts with the principle of data abstraction. Like ordinary functions, polytypic functions are sensitive to representation changes and may violate internal invariants. Possible solutions based on proposals for reconciling pattern matching and data abstraction are not satisfactory. In order to reconcile polytypism with data abstraction we need a notion of structure centred around the concept of interface.