First Joint Workshop on Ontologies, Uncertainty, and Inconsistency Handling

Dresden, June 26–29, 2018

The aim of this workshop is to foster collaboration between researchers in Germany and São Paulo working on ontology design and maintenance, uncertainty representation, and inconsistency handling. The overall goal is to develop formalisms and approaches that support the construction and maintenance of very large, comprehensive ontologies that may contain uncertain and partially contradictory knowledge. In particular, we will concentrate on extending classical ontology languages towards nonmonotonic and probabilistic logics, and consider methods for reasoning in the presence of inconsistency as well as approaches for updating, revising, and evolving ontologies.

Organizers

Franz Baader (TU Dresden, Theoretical Computer Science)

Title: Quantitative Description Logics

Abstract: Description Logics (DLs) are a well-investigated family of logic-based knowledge representation languages, which are frequently used to formalize ontologies for application domains such as biology and medicine. To define the important notions of such an application domain as formal concepts, DLs state necessary and sufficient conditions for an individual to belong to a concept. From a formal logic point of view, basic DLs are (usually decidable) fragments of first-order logic, and thus provide their users only with qualitative means for describing the relevant concepts of the application domain. To overcome this deficit, various ways of making DLs quantitative have been introduced in the literature. On the one hand, quantitative features may be employed to define non-classical variants of description logics, such as fuzzy, probabilistic, or possibilistic DLs. On the other hand, one can also use quantitative means of expressiveness for defining concepts based on classical first-order semantics. A prominent example is DLs with concrete domains, in which abstract objects can, e.g., be associated through partial functions with numbers, and numerical predicates can be used to constrain the values of these functions.

This talk will concentrate on a different way of using numerical constraints on the cardinality of sets to define concepts. The sets in question may either be the extensions of concepts or the role successors of a given individual in an interpretation. In previous approaches (number restrictions, cardinality restrictions on concepts), one could only compare the cardinalities of such sets with a fixed number. We considerably extend the expressiveness of number restrictions and cardinality restrictions on concepts by allowing constraints on sets of role successors and elements of concepts that are formulated in the quantifier-free fragment of Boolean Algebra with Presburger Arithmetic (QFBAPA), in which one can express Boolean combinations of set constraints and numerical constraints on the cardinalities of sets. We can show that the obtained DL is decidable (NExpTime-complete) and that certain fragments have a lower complexity (ExpTime).
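As an illustration (our own example, not taken from the abstract; the concrete constructor syntax in the talk may differ), a constraint in such a QFBAPA-based language could combine set operations and cardinality comparisons on role successors:

```latex
% Hypothetical successor constraint mixing Boolean set operations with
% cardinality comparisons: "has more daughters than sons, and at least
% three children in total"
\mathit{succ}\bigl(\, |\mathit{child} \cap \mathit{Female}| > |\mathit{child} \cap \mathit{Male}|
  \;\wedge\; |\mathit{child}| \geq 3 \,\bigr)
```

Classical number restrictions can only compare one such cardinality against a fixed constant, so neither the comparison between the two successor sets nor their Boolean combination is expressible there.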

Web page

Renata Wassermann (University of São Paulo, Computer Science Department)

Title: On Ontology Evolution

Abstract: In this talk, I will first summarize our work over the last 10 years on adapting AGM-style Belief Revision to deal with Description Logics. The usual AGM representation results depend on many assumptions about the underlying logic that do not hold for DLs. We have characterized contraction and revision for both belief bases and belief sets. Then I will show our Protégé plugins and experimental results with OWL.

Homepage

Participants

Glauber de Bona (University of São Paulo, Department of Mechatronics)

Title: Localising Inconsistency

Abstract: In artificial intelligence, it is important to handle and analyse inconsistency in knowledge bases. Inconsistent pieces of information suggest questions like "where is the inconsistency?", "how severe is it?", and "how can it be repaired?". In this talk, we will explore the first question, showing its relation to the others. Inconsistency localisation in knowledge bases has traditionally been reduced to finding minimally inconsistent subsets, which may reveal only the tip of the inconsistency iceberg. We will present a new concept for localising inconsistency, based on consolidation methods, together with its consequences for inconsistency measurement and its postulates. The existence of alternative ways of characterising conflicts raises the question of which would be a reasonable characterisation. More generally, we will discuss the (lack of) rationality postulates for inconsistency localisation.

Stefan Borgwardt (TU Dresden, Theoretical Computer Science)

Title: Fuzzy description logics and probabilistic databases

Abstract: I will give an overview of my PhD research on fuzzy description logics, as well as more recent work on probabilistic databases enhanced with ontologies. I will discuss how both formalisms are capable of dealing with contradictory knowledge to some extent. Fuzzy description logics based on finite lattices can be used to model information coming from several sources, and to keep track of which sources are used for inferences. Probabilistic databases attach probabilities to facts, and their semantics is based on the notion of possible worlds. By requiring inconsistent possible worlds to have probability 0, and redistributing this probability among the consistent possible worlds, we obtain a semantics that is almost surely consistent.
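The conditioning semantics mentioned above can be sketched on a toy tuple-independent database. The facts, probabilities, and the consistency constraint below are invented for illustration; real systems avoid the exponential world enumeration used here.

```python
from itertools import product

# Hypothetical tuple-independent probabilistic database:
# each fact holds independently with its own probability.
facts = {"bird(tweety)": 0.9, "penguin(tweety)": 0.4}

def inconsistent(world):
    # Assumed ontology constraint (illustrative): every penguin is a bird,
    # so a world with penguin(tweety) but not bird(tweety) is inconsistent.
    return "penguin(tweety)" in world and "bird(tweety)" not in world

def conditioned_probability(query_fact):
    total = 0.0   # probability mass of consistent worlds
    hit = 0.0     # mass of consistent worlds containing the query fact
    for bits in product([True, False], repeat=len(facts)):
        world = {f for f, b in zip(facts, bits) if b}
        p = 1.0
        for f, b in zip(facts, bits):
            p *= facts[f] if b else 1 - facts[f]
        if inconsistent(world):
            continue  # conditioning: inconsistent worlds get probability 0
        total += p
        if query_fact in world:
            hit += p
    return hit / total  # renormalise over the consistent worlds

# Conditioning raises P(bird) and lowers P(penguin) relative to the raw facts.
print(round(conditioned_probability("penguin(tweety)"), 4))  # 0.375
print(round(conditioned_probability("bird(tweety)"), 4))     # 0.9375
```

The inconsistent world (penguin but not bird) carries mass 0.04; dropping it and renormalising by the remaining 0.96 yields the conditioned probabilities.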

Homepage

Gerhard Brewka (University of Leipzig, Intelligent Systems Department)

Title: Computational Models of Argument

Abstract: Computational argumentation aims at the development of computer-based systems capable of supporting and participating in argumentative activities. To this end, one has to come up with formal models of the way we usually come to conclusions and make decisions, namely by

1. constructing arguments for and against various options,
2. establishing relationships among the arguments, most notably the attack relation, and
3. identifying interesting subsets of the arguments which represent coherent positions based on these relations.

In the talk we will highlight some of the main ideas and key techniques that have been developed in the field and show how they provide new ways of representing knowledge, handling inconsistencies, and reasoning by default. In particular, we will demonstrate how directed graphs with arbitrary edge labels, which are widely used to visualize argumentation and reasoning scenarios, can be turned into full-fledged knowledge representation formalisms with a whole range of precisely defined semantics.
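Step 3 above, identifying coherent positions from the attack relation, can be made concrete with a Dung-style abstract argumentation framework. The framework below is a made-up toy example; it computes stable extensions (conflict-free sets that attack every outside argument) by brute force:

```python
from itertools import combinations

# Toy abstract argumentation framework: arguments and an attack relation.
args = {"a", "b", "c"}
attacks = {("a", "b"), ("b", "c")}   # a attacks b, b attacks c

def conflict_free(S):
    # No argument in S attacks another argument in S.
    return not any((x, y) in attacks for x in S for y in S)

def attacks_arg(S, y):
    return any((x, y) in attacks for x in S)

def stable(S):
    # Stable extension: conflict-free and attacks every argument outside S.
    return conflict_free(S) and all(attacks_arg(S, y) for y in args - S)

extensions = [set(S) for r in range(len(args) + 1)
              for S in combinations(sorted(args), r) if stable(set(S))]
print(extensions)  # [{'a', 'c'}]
```

In the chain a → b → c, the position {a, c} is the unique stable extension: a defeats b, which in turn reinstates c.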

Homepage

Juliana Bueno-Soler (State University of Campinas, Faculty of Technology)

Title: Paraconsistent probabilistic methods for uncertain ontologies

Abstract: The Logics of Formal Inconsistency (LFIs) are a suitable tool for formalizing the paraconsistent reasoning paradigm, as they contain linguistic resources to express the notions of consistency and inconsistency within the object language: a connective ◦, reading ◦α as “α is consistent” (and its dual •, reading •α as “α is inconsistent”), realizes this intuition. The distinguishing feature of the LFIs is that the principle of explosion (PEx) is not valid in general; it is not abolished, however, but restricted to consistent sentences. Therefore, a contradictory theory may be non-trivial, unless the contradiction refers to something consistent.

I will review the main ideas of [1] concerning extensions of the notion of probability that are able to express probabilistic reasoning under contradictions. A paradigmatic case for defining a paraconsistent notion of probability is the three-valued system LFI1, which is quite close to classical logic and can naturally encode a quasi-classical notion of paraconsistent probability.
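The three-valued behaviour of LFI1 (which coincides with J3 extended by the consistency connective) can be sketched with its truth tables. The numeric labels below are our own encoding of a standard presentation; they illustrate how contradictions are tolerated until consistency is asserted:

```python
# Three truth values: 1 = true, 0.5 = both true and false (the
# inconsistent value), 0 = false. Designated (accepted) values: 1 and 0.5.
DESIGNATED = {1, 0.5}

def neg(a):          # paraconsistent negation: the middle value is its own negation
    return 1 - a

def conj(a, b):      # conjunction as minimum
    return min(a, b)

def consistent(a):   # the consistency connective: holds only for classical values
    return 1 if a in (0, 1) else 0

# A contradiction A & ~A is designated when A takes the middle value ...
print(conj(0.5, neg(0.5)) in DESIGNATED)                                    # True
# ... but {A, ~A, oA} is never jointly designated: asserting consistency
# restores explosion, exactly as the restricted PEx of the LFIs demands.
print(any({a, neg(a), consistent(a)} <= DESIGNATED for a in (0, 0.5, 1)))  # False
```

A paraconsistent probability measure over such a logic can then assign positive probability to contradictory, non-trivial theories.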

One of the most interesting uses of probability in paraconsistent logic, which I intend to survey, is to define a new form of Bayesian conditionalization. The well-known Bayes rule permits one to update probabilities as new information is acquired, and, in the paraconsistent case, even when such new information involves some degree of contradictoriness.

As a secondary topic, I intend to start a discussion on potential applications of such quasi-classical probabilities, including non-standard Bayesian networks based on LFI1 probability distributions, and to assess their applicability in causal networks.

As a third aspect, I will also briefly discuss a more philosophical question related to conditional probability. The well-known triviality results obtained by David Lewis in 1976 showed that identifying the probability of a conditional with the corresponding conditional probability trivializes the notion of a probability distribution (see [2]). I will show that Lewis's triviality results can also be recovered in the probability theory based upon LFI1 under certain conditions (namely, for consistent events), although they do not hold for uncertain events.

[1] J. Bueno-Soler and W. A. Carnielli. Paraconsistent probabilities: Consistency, contradictions and Bayes theorem. In J. Stern, editor, Special Issue Statistical Significance and the Logic of Hypothesis Testing, Entropy 18(9), 2016. Open access http://www.mdpi.com/1099-4300/18/9/325/htm

[2] P. Milne. The simplest Lewis-style triviality proof yet? Analysis, 63(4):300–303, 2003.

Walter Carnielli (State University of Campinas, Department of Philosophy)

Title: Non-standard possibilistic and necessitistic measures as foundations for artificial reasoning

Abstract: Possibility and necessity theories rival probability theory in representing uncertain knowledge, while offering a more qualitative view of uncertainty. Moreover, necessity and possibility measures constitute, respectively, lower and upper bounds for probability measures, with the advantage of avoiding the complications of the notion of probabilistic independence.

On the other hand, paraconsistent formal systems, especially the Logics of Formal Inconsistency, are capable of expressing quite precisely the circumstances of reasoning with contradictions.

The aim of this talk is to define new notions of possibility and necessity theories involving the concept of consistency (inspired by (Besnard & Lang 1994)) based on the paraconsistent and paracomplete logic Cie, connecting them to the notions of partial and conclusive evidence. This combination permits a complete treatment of contradictions, both local and global, including a gradual handling of the notion of contradiction, thus yielding a genuinely useful tool for AI and machine learning, with potential applications to logic programming via appropriate resolution rules.

I will also discuss a calculus for reasoning with conflicting evidence, LETj, as proposed by (Carnielli & Rodrigues 2016) and explore its connections with possibilistic and necessitistic measures.

Homepage

Marcelo E. Coniglio (State University of Campinas, Department of Philosophy)

Title: Handling inconsistencies with Logics of Formal Inconsistency

Abstract: The class of paraconsistent logics known as Logics of Formal Inconsistency (LFIs) was introduced by Walter Carnielli and João Marcos in 2002, as a generalization of Newton da Costa's hierarchy Cn. In the original paper "A taxonomy of C-systems" (W. Carnielli and J. Marcos, 2002), and in a second paper "Logics of Formal Inconsistency" (W. Carnielli, M. Coniglio and J. Marcos, 2007), several LFIs were defined, starting from a basic system (bC in 2002, mbC in 2007). In both papers, relevant properties and metaproperties were added gradually to the basic system by considering suitable axioms, thus defining a hierarchy or taxonomy of LFIs. The basic system mbC consists of positive classical propositional logic expanded by two connectives: a paraconsistent negation ~ and a (unary) consistency operator o, satisfying the following properties: (1) the law of excluded middle holds for the negation ~, namely (A v ~A) is a valid schema in mbC; and (2) from a contradiction (A & ~A) plus the assumption of the consistency of A (namely, oA), everything follows. That is, {A, ~A, oA} is always a trivial theory, while {A, ~A} is not necessarily trivial (since ~ is a paraconsistent negation).

In the first part of this talk, the new taxonomy of LFIs presented in Chapter 3 of the book "Paraconsistent Logic: Consistency, Contradiction and Negation" (W. Carnielli and M. Coniglio, Springer, 2016) will be analyzed. As shown in that book, none of these logics can be semantically characterized by a single finite logical matrix, although all of them are decidable.

In the second part of this talk some finite-valued LFIs will be surveyed: in particular, the well-known 3-valued paraconsistent logic J3, introduced in 1970 by I. D'Ottaviano and N. da Costa, will be analyzed. Finally, a new hierarchy of finite-valued LFIs recently introduced by M. Coniglio, F. Esteva, J. Gispert and L. Godo will be described. These logics are obtained from the finite-valued Łukasiewicz logics by taking an order filter as the set of designated values. From this, a hierarchy of ideal paraconsistent logics is obtained, generalizing the 3-valued logic J3 to p-valued LFIs, for any prime p.

Applications of LFIs to paraconsistent belief revision will be outlined.

Homepage

Andreas Ecke (TU Dresden, Theoretical Computer Science)

Title: Reasoning with Prototypes in the Description Logic ALC using Weighted Tree Automata

Abstract: Recently, we have introduced an extension of Description Logics that allows the use of prototypes to define concepts. To accomplish this, prototype distance functions (pdfs) are used, which assign a distance value to each element of an interpretation. Based on this, a new concept constructor of the form P∼n(d) for ∼ ∈ {<, ≤, >, ≥} is defined, which describes the set of all elements with a distance ∼ n according to the pdf d.

In this talk, I will show how weighted alternating parity tree automata (wapta) over the non-negative integers can be used to define pdfs, and how this allows us to use both concepts and pointed interpretations as prototypes. Furthermore, I will show that the complexity of reasoning in the Description Logic ALC stays ExpTime-complete even with the extension by prototypes defined using wapta. Finally, I will contrast this approach with similar approaches, in particular the threshold concepts presented by Oliver Fernández Gil.

Oliver Fernández Gil (TU Dresden, Theoretical Computer Science)

Title: Extending Description Logics with Threshold Concepts

Abstract: Recently, we have introduced an extension of the lightweight Description Logic EL that allows concepts to be defined in an approximate way. This extension is based on the use of concept constructors of the form C~t, where ~ is a comparison operator in {<, <=, >, >=}, t is a threshold value, and C is a classical EL concept. The semantics of these new concepts is defined using a graded membership function. In this way, one can use a concept C~t to represent an approximation of the concept C.
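To convey the idea of a threshold concept C~t, here is a deliberately simplified graded membership function: the fraction of atomic conjuncts of C that an individual satisfies. The actual graded membership function in this line of work is defined model-theoretically over EL interpretations, and all names below are invented:

```python
import operator

OPS = {"<": operator.lt, "<=": operator.le, ">": operator.gt, ">=": operator.ge}

def membership(individual_types, conjuncts):
    # Toy graded membership: fraction of satisfied atomic conjuncts.
    return sum(1 for c in conjuncts if c in individual_types) / len(conjuncts)

def in_threshold_concept(individual_types, conjuncts, op, t):
    # Membership in C~t: compare the graded degree against the threshold t.
    return OPS[op](membership(individual_types, conjuncts), t)

# An individual satisfying 2 of 3 conjuncts has degree 2/3, so it belongs
# to C>=0.6 but not to C>=0.8.
ind = {"Parent", "Employed"}
C = ["Parent", "Employed", "Homeowner"]
print(in_threshold_concept(ind, C, ">=", 0.6))  # True
print(in_threshold_concept(ind, C, ">=", 0.8))  # False
```

The threshold thus trades logical exactness for a tunable notion of "close enough" membership, which is the approximation the abstract describes.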

In a separate development (in our group), a framework to define concept similarity measures w.r.t. an EL ontology has been proposed. Moreover, it has been shown how concept similarity measures can be used to relax the set of answers of a concept query over a description logic knowledge base.

My current research project investigates the definition of, and reasoning with, approximate concepts in Description Logics. It builds on the previously described approaches and the connection between them. In this talk, I will give an overview of the project, present the preliminary results that we have obtained so far, and discuss current and future research directions.

Homepage

Marcelo Finger (University of São Paulo, Computer Science Department)

Title: Quantitative Logic Reasoning – Logic reasoning with probabilities and quantities

Abstract: We investigate the intersection of deductive reasoning with explicit quantitative capabilities. These quantitative capabilities encompass probabilistic reasoning and counting quantifiers. The need for a combined reasoning system that enables a unified way of reasoning with quantities has long been recognized in modern logic, with proposals for probabilistic logic reasoning going back to the work of Boole [1854]. Equally ubiquitous is the need to deal with cardinality restrictions on finite sets. We explore a common way to handle these deductive quantitative capabilities, involving a framework based on Linear Algebra and Linear Programming. The distinction between probabilistic and cardinality reasoning comes from the different families of algebras employed.
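The linear-programming view of probabilistic reasoning can be illustrated on Boole's classic problem: given P(A) = 0.7 and P(B) = 0.6, what values can P(A and B) take? Over the four possible worlds, the marginals determine all world probabilities from x = P(A and B), and feasibility (non-negativity) yields the tight bounds. The brute-force grid search below is only an illustration of this formulation, not an algorithm from the talk:

```python
def conjunction_bounds(pa, pb, step=0.001):
    # Possible worlds for events A, B, with probabilities parametrized by
    # x = P(A and B):  p(A,B) = x, p(A,~B) = pa - x, p(~A,B) = pb - x,
    # p(~A,~B) = 1 - pa - pb + x.  Collect every x making all four >= 0.
    feasible = []
    for i in range(int(1 / step) + 1):
        x = i * step
        worlds = [x, pa - x, pb - x, 1 - pa - pb + x]
        if all(p >= -1e-9 for p in worlds):  # small tolerance for float error
            feasible.append(x)
    return min(feasible), max(feasible)

lo, hi = conjunction_bounds(0.7, 0.6)
print(round(lo, 3), round(hi, 3))  # 0.3 0.6
```

The result matches the Fréchet bounds max(0, P(A)+P(B)-1) and min(P(A), P(B)); a linear-programming solver replaces this grid search in realistic instances with many events.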

Homepage

Gabriele Kern-Isberner (TU Dortmund, Information Engineering)

Title: Conditionals for uncertain reasoning in ontologies

Abstract: Uncertain reasoning can be done either in a quantitative (e.g., probabilistic) or in a qualitative (e.g., default-logical) way, and one may even wish to combine both ways in one ontology. While a huge amount of work has been done for propositional settings in more than 30 successful years of uncertain and nonmonotonic reasoning, dealing with uncertainty in first-order logical frameworks is still a challenge. Numerous approaches have been proposed for probabilistic reasoning, and the basic ideas of nonmonotonic reasoning have been modified and applied to first-order default logics, also resulting in a diversity of methods. In this talk, I focus on the role of conditionals, which have proved to provide useful links between quantitative and qualitative propositional approaches to uncertain reasoning, and present ideas and methods to lift conditionals to first-order reasoning. Conditionals seem particularly well suited to model uncertain GCIs and may thus play an important part in enriching ontologies with uncertainty. A crucial question is which semantics should be given to open conditionals to maintain their default characteristics while taking both subjective (uncertainty with respect to some specific individual) and objective (uncertainty with respect to a distribution over the whole population) aspects into account. Moreover, since conditional beliefs may reflect revision strategies, it is interesting to investigate what conditionals can do for ontology change.

Homepage

Patrick Koopmann (TU Dresden, Theoretical Computer Science)

Title: Probabilities, Actions and Uniform Interpolation

Abstract: The talk will give an overview of my research in the areas of ontology-based probabilistic data access, DL-based action formalisms, and uniform interpolation. Probabilistic databases make it possible to represent probabilistic uncertainties in the data. Two extensions to the standard setting of ontology-based data access over such data will be presented: the first allows continuous probability distributions of numerical values to be modelled, while the second adds a temporal dimension to the setting. DL-based action formalisms allow conditionalised changes on datasets to be modelled, and make it possible to draw inferences about the effects of a sequence of such adaptations, or to analyse invariants of programs performing them. Forgetting, or uniform interpolation, is a technique for reducing the signature of knowledge bases, for which we have developed a practical implementation. It can be used to detect and represent logical differences between versions of an ontology, and thus to analyse changes in evolving ontologies.

Homepage

Markus Krötzsch (TU Dresden, Theoretical Computer Science)

Title: Capturing context information and annotations with attributed logics

Abstract: Graph-structured data is used to represent large information collections, called knowledge graphs, in many applications. Their exact format may vary, but they often share the concept that edges can be annotated with additional information, such as validity time or provenance information. We give a formalisation of a generalised notion of annotated graphs, called multi-attributed relational structures (MARS), and introduce a matching knowledge representation formalism, multi-attributed predicate logic (MAPL). We analyse the expressive power of MAPL and suggest a simpler, rule-based fragment of MAPL that can be used for ontological reasoning.

Homepage

Carsten Lutz (University of Bremen, Theory of Artificial Intelligence)

Title: Principled Support for Pragmatic Ontology Approximation

Abstract: We study the problem of approximating an ontology O formulated in a description logic (DL) L by an ontology O' formulated in a fragment L' of L. In practice, such approximations are often constructed in an ad-hoc way that does not preserve all information from O expressible in L'. On the one hand, this is for a reason: strongest approximations, which preserve all expressible information from O, must be infinite even in simple and practically relevant cases. On the other hand, it is often only poorly understood which information has been given up. We address this problem by carefully studying the structure of (infinite) strongest approximations, concentrating on the case where L is Horn-SHIF or an expressive fragment thereof and L' is ELH_\bot. We exhibit a connection to axiomatizations of quasi-equations valid in classes of semilattices with operators, and additionally develop a direct approach that also allows us to deal with approximations of bounded role depth. A main aim of our results is to provide guidance for constructing pragmatic approximations in practice.

Homepage

Dênis D. Mauá (University of São Paulo, Computer Science Department)

Title: Enriching the Probabilistic Logic Programming Toolset: Complexity, Inference and Learning

Abstract: Probabilistic logic programming combines logic and probability theory in an intuitive and appealing framework for the construction of statistical models over arbitrary objects, attributes, and relations. The additional expressivity and flexibility provided by the approach comes at an additional cost for inference and learning with these models. In this work, we seek to identify islands of practical and/or theoretical tractability in the landscape of probabilistic logic programming by designing algorithms and analyzing the theoretical complexity in terms of language features and inference task.

Homepage

Sebastian Rudolph (TU Dresden, Artificial Intelligence)

Title: Fixed-Domain Reasoning in Description Logics

Abstract: Description logics (DLs) have evolved into a de facto standard in logic-based knowledge representation. In particular, they serve as the formal basis of the standardized and very popular web ontology language (OWL), which also comes with the advantage of readily available user-friendly modeling tools and optimized reasoning engines. In the course of the widespread adoption of OWL and DLs, situations have been observed where logically less skilled practitioners are (ab)using these formalisms as constraint languages adopting a closed-world assumption, contrary to the open-world semantics imposed by the classical definitions and the standards. To provide a clear theoretical basis and inferencing support for this often practically reasonable "off-label use", we propose an alternative formal semantics reflecting the intuitive understanding of such scenarios. To that end, we introduce the fixed-domain semantics and argue that this semantics gives rise to an interesting new inferencing task: model enumeration. We describe how the new semantics can be axiomatized in very expressive DLs. We thoroughly investigate the complexities of standard reasoning as well as query answering under the fixed-domain semantics for a wide range of DLs. Further, we present an implementation of a fixed-domain DL reasoner based on a translation into answer set programming (ASP), which is competitive with alternative approaches for standard reasoning tasks and provides the added functionality of model enumeration.

Homepage

Rafael Testa (State University of Campinas, Department of Philosophy)

Title: On Paraconsistent Belief Change

Abstract: In this talk I will present two systems of AGM-like paraconsistent Belief Revision, constructed over the LFIs (to be introduced by Coniglio in the Workshop), both advanced in [1] and further developed in [2]. Some recent results on the work in progress will be outlined, mainly regarding distinct paraconsistent operations, as well as constructive models that extend the previous results and allow direct application in weaker paraconsistent logics.

[1] Testa, R. R.; Coniglio, M. E.; Ribeiro, M. M. Paraconsistent belief revision based on a formal consistency operator. CLE e-Prints, 15(8), 2015.

[2] Testa, R. R.; Coniglio, M. E.; Ribeiro, M. M. AGM-like paraconsistent belief change. Logic Journal of the IGPL, 25(4):632–672, 2017. doi:10.1093/jigpal/jzx010

Homepage

Matthias Thimm (University of Koblenz, Institute for Web Science and Technologies)

Title: Theoretical and Practical Aspects of Inconsistency Measurement and Formal Argumentation

Abstract: In this talk, I will give an overview of my research interests, which focus on theoretical foundations and practical aspects of knowledge representation and reasoning. The central themes of my work are uncertainty and inconsistency of information, and I will give an overview of recent work in the areas of inconsistency measurement and formal argumentation. The former is concerned with the quantitative evaluation of inconsistency, and I will discuss topics related to rationality, computational complexity, and expressivity. The area of formal argumentation is concerned with computational models of rational argument, and I will discuss topics related to probabilistic reasoning and algorithmic issues.

Homepage

Anni-Yasmin Turhan (TU Dresden, Theoretical Computer Science)

Title: Reasoning in the description logic EL can be relaxed and tolerant

Abstract: In this overview talk, I consider the lightweight Description Logic EL and discuss an approach for relaxing instance queries and two approaches for handling inconsistent information in the ontology.

When querying Description Logic (DL) knowledge bases, queries can be relaxed in a controlled way by using a concept similarity measure (CSM) and a threshold: while the CSM provides the means to state which parts of the concept to relax, the threshold determines how much relaxation still yields acceptable answers.

Defeasible DLs are a nonmonotonic variant of DLs, which provide concept inclusions that can be defeated, i.e. neglected if contradictory information is present. We discuss a recent approach for computing a form of canonical models that serve as a basis for computing (defeasible) subsumption relationships.

For systems that need to be resilient against erroneous data, query answering can be performed under inconsistency-tolerant semantics. I discuss inconsistency-tolerant query answering in a temporalized setting, where queries may contain LTL operators and are answered over ontologies with time-stamped assertions.

Homepage

Participant from DFG

  • Dietrich Halm (Director of International Cooperation with Latin America)

Accommodation

For participants from outside Dresden, we have reserved a block of rooms at Motel ONE, which is reasonably priced and nicely located in the historic town center. Please book your room there using this form no later than May 15, 2018. Note that cancellation is free of charge, and you only need to give your credit card number to guarantee that the room is kept for you if you arrive later than 6pm. So there is no reason not to make the reservation as soon as you know your travel plans.

Venue

The workshop will take place at the Faculty of Computer Science of TU Dresden, Andreas-Pfitzmann-Bau (APB), Nöthnitzer Str. 46, in room 1004. Lunch and coffee breaks will take place in room 1005.

Program

Tuesday, June 26

09:30-10:30

Session 1
Opening
Carsten Lutz: Principled Support for Pragmatic Ontology Approximation

10:30-11:00 Coffee break
11:00-12:20 Session 2
Dênis D. Mauá: Enriching the Probabilistic Logic Programming Toolset: Complexity, Inference and Learning
Stefan Borgwardt: Fuzzy Description Logics and Probabilistic Databases
12:30-13:30 Lunch
13:30-15:30

Session 3
Glauber de Bona: Localising Inconsistency
Anni-Yasmin Turhan: Reasoning in the Description Logic EL can be Relaxed and Tolerant
Marcelo E. Coniglio: Handling Inconsistencies with Logics of Formal Inconsistency

15:30-16:00 Coffee break
16:00-17:20 Session 4
Gerhard Brewka: Computational Models of Argument
Matthias Thimm: Theoretical and Practical Aspects of Inconsistency Measurement and Formal Argumentation
18:00-19:30 Reception at the pond

Wednesday, June 27

09:00-10:20

Session 5
Marcelo Finger: Quantitative Logic Reasoning – Logic Reasoning with Probabilities and Quantities
Franz Baader: Quantitative Description Logics

10:20-11:00 Coffee break
11:00-12:20 Session 6
Renata Wassermann: On Ontology Evolution
Rafael Testa: On Paraconsistent Belief Change
12:30-14:00 Lunch
14:00-16:00

Session 7
Juliana Bueno-Soler: Paraconsistent Probabilistic Methods for Uncertain Ontologies
Walter Carnielli: Non-standard Possibilistic and Necessitistic Measures as Foundations for Artificial Reasoning
Oliver Fernández Gil: Extending Description Logics with Threshold Concepts

16:00-16:30 Coffee break

Thursday, June 28

09:00-10:20

Session 9
Markus Krötzsch: Capturing Context Information and Annotations with Attributed Logics
Sebastian Rudolph: Fixed-Domain Reasoning in Description Logics

10:20-11:00 Coffee break
11:00-13:00

Session 10
Gabriele Kern-Isberner: Conditionals for Uncertain Reasoning in Ontologies
Patrick Koopmann: Probabilities, Actions and Uniform Interpolation
Andreas Ecke: Reasoning with Prototypes in the Description Logic ALC using Weighted Tree Automata

13:00-14:00 Lunch
14:00-15:00 Dietrich Halm (DFG): Funding for Cooperative Research Projects
15:00-15:30 Coffee break
15:30-17:30 Discussion (in particular planning of the workshop in São Paulo)
19:30-22:00 Dinner at Restaurant Sophienkeller (map)

Friday, June 29

09:00-10:30 Discussion (small groups)
10:30-11:00 Coffee break
11:00-12:00 Summary
12:00-13:30 Lunch

Support

This workshop is funded by FAPESP and DFG under the special call FAPESP–DFG Joint Workshops 2017.

Internal

Wiki

About this page

Stefan Borgwardt
Last modified: 26.06.2018