Doctoral Theses
The list of theses is also available as a PDF document.
Generated 21 November 2024, 15:59:45.
2023
Satyadharma Tirtarasa: Context-Sensitive Description Logics in Dynamic Settings. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2023.
Abstract BibTeX Entry Publication
The role-based paradigm has been introduced for the design of adaptive and context-sensitive software systems. Naturally, a system built on top of this paradigm is expected to thrive in dynamic environments. In consequence, reasoning services over temporal aspects are essential in such a system. To represent context-dependent domains, various extensions of Description Logics (DLs) with context have been introduced and studied. We focus on the family of Contextualized Description Logics (ConDLs), which have been shown to be capable of representing role-based modelling languages while retaining decidability. However, reasoning problems over dynamic settings in these logics are rather unexplored.
@thesis{ Tirtarasa-Diss-2023, address = {Dresden, Germany}, author = {Satyadharma {Tirtarasa}}, school = {Technische Universit\"{a}t Dresden}, title = {Context-Sensitive Description Logics in Dynamic Settings}, type = {Doctoral Thesis}, year = {2023}, }
2022
Jakub Rydval: Using Model Theory to Find Decidable and Tractable Description Logics with Concrete Domains. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2022.
Abstract BibTeX Entry Publication
Concrete domains have been introduced in the area of Description Logic (DL) to enable reference to concrete objects (such as numbers) and predefined predicates on these objects (such as numerical comparisons) when defining concepts. Unfortunately, in the presence of general concept inclusions (GCIs), which are supported by all modern DL systems, adding concrete domains may easily lead to undecidability.
To regain decidability of the DL \(\mathcal{ALC}\) in the presence of GCIs, quite strong restrictions, called \(\omega\)-admissibility, were imposed on the concrete domain. On the one hand, we generalize the notion of \(\omega\)-admissibility from concrete domains with only binary predicates to concrete domains with predicates of arbitrary arity. On the other hand, we relate \(\omega\)-admissibility to well-known notions from model theory. In particular, we show that finitely bounded homogeneous structures yield \(\omega\)-admissible concrete domains. This allows us to show \(\omega\)-admissibility of concrete domains using existing results from model theory. When integrating concrete domains into lightweight DLs of the \(\mathcal{EL}\) family, achieving decidability of reasoning is not enough. One wants the resulting DL to be tractable. This can be achieved by using so-called p-admissible concrete domains and restricting the interaction between the DL and the concrete domain. We investigate p-admissibility from an algebraic point of view. Again, this yields strong algebraic tools for demonstrating p-admissibility. In particular, we obtain an expressive numerical p-admissible concrete domain based on the rational numbers. Although \(\omega\)-admissibility and p-admissibility are orthogonal conditions that are almost exclusive, our algebraic characterizations of these two properties allow us to locate an infinite class of p-admissible concrete domains whose integration into \(\mathcal{ALC}\) yields decidable DLs.
DL systems that can handle concrete domains allow their users to employ a fixed set of predicates of one or more fixed concrete domains when modelling concepts. They do not provide their users with means for defining new predicates, let alone new concrete domains. The good news is that finitely bounded homogeneous structures offer precisely that. We show that integrating concrete domains based on finitely bounded homogeneous structures into \(\mathcal{ALC}\) yields decidable DLs even if we allow predicates specified by first-order formulas. This class of structures also provides effective means for defining new \(\omega\)-admissible concrete domains with at most binary predicates. The bad news is that defining \(\omega\)-admissible concrete domains with predicates of higher arities is computationally hard. We obtain two new lower bounds for this meta-problem, but leave its decidability open. In contrast, we prove that there is no algorithm that would facilitate defining p-admissible concrete domains already for binary signatures.
@thesis{ Rydval-Diss-2022, address = {Dresden, Germany}, author = {Jakub {Rydval}}, school = {Technische Universit\"{a}t Dresden}, title = {Using Model Theory to Find Decidable and Tractable Description Logics with Concrete Domains}, type = {Doctoral Thesis}, year = {2022}, }
2020
Walter Forkel: Closed-World Semantics for Query Answering in Temporal Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2020.
Abstract BibTeX Entry PDF File Publication
Ontology-mediated query answering is a popular paradigm for enriching answers to user queries with background knowledge. For querying the absence of information, however, there exist only a few ontology-based approaches. Moreover, these proposals conflate the closed-domain and closed-world assumption, and are therefore not suited to deal with the anonymous objects that are common in ontological reasoning. Many real-world applications, like processing electronic health records (EHRs), also contain a temporal dimension and require efficient reasoning algorithms. Moreover, since medical data is not recorded on a regular basis, reasoners must deal with sparse data with potentially large temporal gaps. Our contribution consists of three main parts: Firstly, we introduce a new closed-world semantics for answering conjunctive queries with negation over ontologies formulated in the description logic ELH⊥, which is based on the minimal universal model. We propose a rewriting strategy for dealing with negated query atoms, which shows that query answering is possible in polynomial time in data complexity. Secondly, we introduce a new temporal variant of ELH⊥ that features a convexity operator. We extend this minimal-world semantics for answering metric temporal conjunctive queries with negation over the logic and obtain similar rewritability and complexity results. Thirdly, apart from the theoretical results, we evaluate the minimal-world semantics in practice by selecting patients, based on their EHRs, that match given criteria.
@thesis{ Forkel-Diss-20, address = {Dresden, Germany}, author = {Walter {Forkel}}, school = {Technische Universit\"{a}t Dresden}, title = {Closed-World Semantics for Query Answering in Temporal Description Logics}, type = {Doctoral Thesis}, year = {2020}, }
2019
Francesco Kriegel: Constructing and Extending Description Logic Ontologies using Methods of Formal Concept Analysis. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2019.
Abstract BibTeX Entry PDF File Publication Summary
Description Logic (abbrv. DL) belongs to the field of knowledge representation and reasoning. DL researchers have developed a large family of logic-based languages, so-called description logics (abbrv. DLs). These logics allow their users to explicitly represent knowledge as ontologies, which are finite sets of (human- and machine-readable) axioms, and provide them with automated inference services to derive implicit knowledge. The landscape of decidability and computational complexity of common reasoning tasks for various description logics has been explored in large parts: there is always a trade-off between expressibility and reasoning costs. It is therefore not surprising that DLs are nowadays applied in a large variety of domains: agriculture, astronomy, biology, defense, education, energy management, geography, geoscience, medicine, oceanography, and oil and gas. Furthermore, the most notable success of DLs is that these constitute the logical underpinning of the Web Ontology Language (abbrv. OWL) in the Semantic Web.
Formal Concept Analysis (abbrv. FCA) is a subfield of lattice theory that allows one to analyze data-sets that can be represented as formal contexts. Put simply, such a formal context binds a set of objects to a set of attributes by specifying which objects have which attributes. There are two major techniques that can be applied in various ways for purposes of conceptual clustering, data mining, machine learning, knowledge management, knowledge visualization, etc. On the one hand, it is possible to describe the hierarchical structure of such a data-set in the form of a formal concept lattice. On the other hand, the theory of implications (dependencies between attributes) valid in a given formal context can be axiomatized in a sound and complete manner by the so-called canonical base, which furthermore contains a minimal number of implications w.r.t. the properties of soundness and completeness.
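To make these notions concrete, consider a small, purely illustrative formal context (invented for this overview, not taken from the thesis): three objects are related to three attributes, the pair ({duck, eagle}, {can fly, lays eggs}) is one of its formal concepts, and the implication shown next to the cross table is valid in it.
\[
\begin{array}{l|ccc}
 & \text{aquatic} & \text{can fly} & \text{lays eggs} \\ \hline
\text{duck} & \times & \times & \times \\
\text{eagle} & & \times & \times \\
\text{frog} & \times & & \times
\end{array}
\qquad\qquad
\{\text{can fly}\} \rightarrow \{\text{lays eggs}\}
\]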
In spite of the different notions used in FCA and in DLs, there has been a very fruitful interaction between these two research areas. My thesis continues this line of research and, more specifically, I will describe how methods from FCA can be used to support the automatic construction and extension of DL ontologies from data.
@thesis{ Kriegel-Diss-2019, address = {Dresden, Germany}, author = {Francesco {Kriegel}}, school = {Technische Universit\"{a}t Dresden}, title = {Constructing and Extending Description Logic Ontologies using Methods of Formal Concept Analysis}, type = {Doctoral Thesis}, year = {2019}, }
Pavlos Marantidis: Quantitative Variants of Language Equations and their Applications to Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2019.
Abstract BibTeX Entry PDF File Publication
Unification in description logics (DLs) has been introduced as a novel inference service that can be used to detect redundancies in ontologies, by finding different concepts that may potentially stand for the same intuitive notion. Together with the special case of matching, they were first investigated in detail for the DL FL0, where these problems can be reduced to solving certain language equations. In this thesis, we extend this service in two directions. In order to increase the recall of this method for finding redundancies, we introduce and investigate the notion of approximate unification, which basically finds pairs of concepts that “almost” unify, in order to account for potential small modelling errors. The meaning of “almost” is formalized using distance measures between concepts. We show that approximate unification in FL0 can be reduced to approximately solving language equations, and devise algorithms for solving the latter problem for particular distance measures. Furthermore, we make a first step towards integrating background knowledge, formulated in so-called TBoxes, by investigating the special case of matching in the presence of TBoxes of different forms. We acquire a tight complexity bound for the general case, while we prove that the problem becomes easier in a restricted setting. To achieve these bounds, we take advantage of an equivalence characterization of FL0 concepts that is based on formal languages. In addition, we incorporate TBoxes in computing concept distances. Even though our results on the approximate setting cannot deal with TBoxes yet, we prepare the framework that future research can build on. Before we journey to the technical details of the above investigations, we showcase our program in the simpler setting of the equational theory ACUI, where we are able to also combine the two extensions. In the course of studying the above problems, we make heavy use of automata theory, where we also derive novel results that could be of independent interest.
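To illustrate the reduction to formal languages mentioned above (a standard normal-form argument sketched here for orientation, not a result quoted from the thesis): grouping value restrictions per concept name, an FL0 concept such as
\[
\forall r.A \;\sqcap\; \forall r.\forall s.A \;\sqcap\; \forall s.B
\;\;\equiv\;\;
\forall\{r,\,rs\}.A \;\sqcap\; \forall\{s\}.B
\]
is characterised by the finite languages \(L(A)=\{r,\,rs\}\) and \(L(B)=\{s\}\) over the alphabet of role names, and equivalence of concepts amounts to equality of these languages; replacing a concept variable by an unknown language then turns unification and matching into solving language equations.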
@thesis{ Marantidis-Diss-2019, address = {Dresden, Germany}, author = {Pavlos {Marantidis}}, school = {Technische Universit\"{a}t Dresden}, title = {Quantitative Variants of Language Equations and their Applications to Description Logics}, type = {Doctoral Thesis}, year = {2019}, }
Adrian Nuradiansyah: Reasoning in Description Logic Ontologies for Privacy Management. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2019.
Abstract BibTeX Entry PDF File Publication
As ontologies are increasingly integrated into and distributed across numerous application systems, users may access them with different privileges and for different purposes. In this situation, preserving confidential information from possible unauthorized disclosures becomes a critical requirement. For instance, in the clinical sciences, unauthorized disclosures of medical information threaten not only the system but also, most importantly, the patients' data. Motivated by this situation, this thesis first investigates a privacy problem, called the identity problem, which asks whether the identity of (anonymous) objects stored in Description Logic ontologies can be revealed. Then, we consider this problem in the context of role-based access control to ontologies and extend it to the problem asking whether the identity belongs to a set of known individuals of cardinality smaller than a given number k. If some confidential information about persons, such as their identity, their relationships or their other properties, can be deduced from an ontology, which implies that some privacy policy is not fulfilled, then one needs to repair this ontology such that the modified one complies with the policies and preserves as much information from the original ontology as possible. The repair mechanism we provide is called gentle repair and is performed via axiom weakening instead of the axiom deletion commonly used in classical approaches to ontology repair. However, policy compliance itself is not enough if a possible attacker can obtain relevant information from other sources which, together with the modified ontology, still violates the privacy policies. The safety property is proposed to alleviate this issue, and we investigate it in the context of privacy-preserving ontology publishing. Inference procedures to solve these privacy problems, together with investigations of their complexity as well as the worst-case complexity of the problems themselves, constitute the main contributions of this thesis.
@thesis{ Nuradiansyah-Diss-2019, address = {Dresden, Germany}, author = {Adrian {Nuradiansyah}}, school = {Technische Universit\"{a}t Dresden}, title = {Reasoning in Description Logic Ontologies for Privacy Management}, type = {Doctoral Thesis}, year = {2019}, }
Maximilian Pensel: A Lightweight Defeasible Description Logic in Depth: Quantification in Rational Reasoning and Beyond. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2019.
Abstract BibTeX Entry PDF File Publication
Description Logics (DLs) are increasingly successful knowledge representation formalisms, useful for any application requiring implicit derivation of knowledge from explicitly known facts. A prominent example domain benefiting from these formalisms since the 1990s is the biomedical field. This area contributes a vast amount of facts and relations between low- and high-level concepts, such as the constitution of cells or interactions between studied illnesses, their symptoms and remedies. DLs are well-suited for handling large formal knowledge repositories and computing inferable coherences throughout such data, relying on their well-founded first-order semantics. In particular, DLs of reduced expressivity have proven of tremendous worth for handling large ontologies due to their computational tractability. In spite of these assets and prevailing influence, classical DLs are not well-suited to adequately model some of the most intuitive forms of reasoning. The capability for defeasible reasoning is imperative for any field subject to incomplete knowledge and the motivation to complete it with typical expectations. When such default expectations receive contradicting evidence, a defeasible formalism is able to retract previously drawn, conflicting conclusions. Common examples include human reasoning or a default characterisation of properties in biology, such as the normal arrangement of organs in the human body. Treatment of such defeasible knowledge must be aware of exceptional cases - such as a human suffering from the congenital condition situs inversus - and therefore accommodate the ability to retract defeasible conclusions in a non-monotonic fashion. Specifically tailored non-monotonic semantics have been continuously investigated for DLs over the past 30 years. A particularly promising approach is rooted in the research by Kraus, Lehmann and Magidor on preferential (propositional) logics and Rational Closure (RC). The biggest advantages of RC are its good behaviour in terms of formal inference postulates and the efficient computation of defeasible entailments, relying on a tractable reduction to classical reasoning in the underlying formalism. A major contribution of this work is a reorganisation of the core of this reasoning method into an abstract framework formalisation. This framework is then easily instantiated to provide the reduction method for RC in DLs as well as more advanced closure operators, such as Relevant or Lexicographic Closure. In spite of their practical aptitude, we discovered that all reduction approaches fail to provide any defeasible conclusions for elements that only occur in the relational neighbourhood of the inspected elements. More explicitly, a distinguishing advantage of DLs over propositional logic is the capability to model binary relations and describe aspects of a related concept in terms of existential and universal quantification. Previous approaches to RC (and more advanced closures) are not able to derive typical behaviour for the concepts that occur within such quantification. The main contribution of this work is to introduce stronger semantics for the lightweight DL EL⊥ with the capability to infer the expected entailments, while maintaining a close relation to the reduction method. We achieve this by introducing a new kind of first-order interpretation that allocates defeasible information on its elements directly.
This allows us to compare the level of typicality of such interpretations in terms of the defeasible information satisfied at elements in the relational neighbourhood. A typicality preference relation then provides the means to single out those sets of models with maximal typicality. Based on this notion, we introduce two types of nested rational semantics, a sceptical and a selective variant, each capable of deriving the missing entailments under RC for arbitrarily nested quantified concepts. As a proof of versatility for our new semantics, we also show that the stronger Relevant Closure can be imbued with typical information in the successors of binary relations. An extensive investigation into the computational complexity of our new semantics shows that the sceptical nested variant comes at considerable additional effort, while the selective semantics stay within the complexity of classical reasoning in the underlying DL, which remains tractable in our case.
@thesis{ Pensel-Diss-2019, address = {Dresden, Germany}, author = {Maximilian {Pensel}}, school = {Technische Universit\"{a}t Dresden}, title = {A Lightweight Defeasible Description Logic in Depth: Quantification in Rational Reasoning and Beyond}, type = {Doctoral Thesis}, year = {2019}, }
2017
Stephan Böhme: Context Reasoning for Role-Based Models. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2017.
Abstract BibTeX Entry PDF File
In the modern world, software systems are literally everywhere. They must cope with very complex scenarios, including context-awareness and self-adaptability. The concept of roles provides the means to model such complex, context-dependent systems. In role-based systems, the relational and context-dependent properties of objects are transferred into the roles that the object plays in a certain context. However, even if the domain can be expressed in a well-structured and modular way, role-based models can still be hard to comprehend due to the sophisticated semantics of roles, contexts and different constraints. Hence, unintended implications or inconsistencies may be overlooked. A feasible logical formalism is required here. In this setting, Description Logics (DLs) fit very well as a starting point for further considerations since, as a decidable fragment of first-order logic, they have both an underlying formal semantics and decidable reasoning problems. DLs are a well-understood family of knowledge representation formalisms which allow one to represent application domains in a well-structured way by DL-concepts, i.e. unary predicates, and DL-roles, i.e. binary predicates. However, classical DLs lack the expressive power to formalise contextual knowledge, which is crucial for formalising role-based systems. We investigate a novel family of contextualised description logics that is capable of expressing contextual knowledge and preserves decidability even in the presence of rigid DL-roles, i.e. relational structures that are context-independent. For these contextualised description logics we thoroughly analyse the complexity of the consistency problem. Furthermore, we present a mapping algorithm that allows for an automated translation from a formal role-based model, namely a Compartment Role Object Model (CROM), into a contextualised DL ontology. We prove its semantical correctness and provide ideas on how features extending CROM can be expressed in our contextualised DLs. As a final step towards a completely automated analysis of role-based models, we investigate a practical reasoning algorithm and implement the first reasoner that can process contextual ontologies.
@thesis{ Boehme-Diss-2017, address = {Dresden, Germany}, author = {Stephan {B\"{o}hme}}, school = {Technische Universit\"{a}t Dresden}, title = {Context Reasoning for Role-Based Models}, type = {Doctoral Thesis}, year = {2017}, }
İsmail İlkan Ceylan: Query Answering in Probabilistic Data and Knowledge Bases. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2017.
Abstract BibTeX Entry PDF File Publication
Probabilistic data and knowledge bases are becoming increasingly important in academia and industry. They are continuously extended with new data, powered by modern information extraction tools that associate probabilities with knowledge base facts. The state of the art for storing and processing such data is founded on probabilistic database systems, which are widely and successfully employed. Beyond all the success stories, however, such systems still lack the fundamental machinery to convey some of the valuable knowledge hidden in them to the end user, which limits their potential applications in practice. In particular, in their classical form, such systems are typically based on strong, unrealistic limitations, such as the closed-world assumption, the closed-domain assumption, the tuple-independence assumption, and the lack of commonsense knowledge. These limitations do not only lead to unwanted consequences, but also put such systems on weak footing in important tasks, query answering being a very central one. In this thesis, we enhance probabilistic data and knowledge bases with more realistic data models, thereby allowing for better means for querying them. Building on the long endeavor of unifying logic and probability, we develop different rigorous semantics for probabilistic data and knowledge bases, analyze their computational properties, identify sources of (in)tractability, and design practical, scalable query answering algorithms whenever possible. To achieve this, the current work brings together some recent paradigms from logics, probabilistic inference, and database theory.
@thesis{ Ceylan-Diss-2017, address = {Dresden, Germany}, author = {{\.I}smail {\.I}lkan {Ceylan}}, school = {Technische Universit\"{a}t Dresden}, title = {Query Answering in Probabilistic Data and Knowledge Bases}, type = {Doctoral Thesis}, year = {2017}, }
Veronika Thost: Using Ontology-Based Data Access to Enable Context Recognition in the Presence of Incomplete Information. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2017.
Abstract BibTeX Entry Publication
Ontology-based data access (OBDA) augments classical query answering in databases by including domain knowledge provided by an ontology. An ontology captures the terminology of an application domain and describes domain knowledge in a machine-processable way. Formal ontology languages additionally provide semantics to these specifications. Systems for OBDA thus may apply logical reasoning to answer queries; they use the ontological knowledge to infer new information, which is only implicitly given in the data. Moreover, they usually employ the open-world assumption, which means that knowledge not stated explicitly in the data or inferred is neither assumed to be true nor false. Classical OBDA, however, regards the knowledge only w.r.t. a single moment, which means that information about time is not used for reasoning and hence lost; in particular, the queries generally cannot express temporal aspects. We investigate temporal query languages that allow one to access temporal data through classical ontologies. In particular, we study the computational complexity of temporal query answering regarding ontologies written in lightweight description logics, which are known to allow for efficient reasoning in the atemporal setting and are successfully applied in practice. Furthermore, we present a so-called rewritability result for ontology-based temporal query answering, which suggests ways for implementation. Our results may thus guide the choice of a query language for temporal OBDA in data-intensive applications that require fast processing, such as context recognition.
@thesis{ Thost-Diss-2017, address = {Dresden, Germany}, author = {Veronika {Thost}}, school = {Technische Universit\"{a}t Dresden}, title = {Using Ontology-Based Data Access to Enable Context Recognition in the Presence of Incomplete Information}, type = {Doctoral Thesis}, year = {2017}, }
Benjamin Zarrieß: Verification of Golog Programs over Description Logic Actions. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2017.
Abstract BibTeX Entry PDF File Publication
Golog is a powerful programming language for logic-based agents. The primitives of the language are actions whose preconditions and effects are defined in a Situation Calculus action theory using first-order logic. To describe possible courses of actions the programmer can freely combine imperative control structures with constructs for non-deterministic choice, leaving it to the system to resolve the non-determinism in a suitable manner. Golog has been successfully used for high-level decision making in the area of cognitive robotics. Obviously, it is important to verify certain properties of a Golog program before executing it on a physical robot. However, due to the high expressiveness of the language the verification problem is in general undecidable. In this thesis, we study the verification problem for Golog programs over actions defined in action languages based on Description Logics and explore the boundary between decidable and undecidable fragments.
@thesis{ Zarriess-Diss-2017, address = {Dresden, Germany}, author = {Benjamin {Zarrie{\ss}}}, school = {Technische Universit\"{a}t Dresden}, title = {Verification of Golog Programs over Description Logic Actions}, type = {Doctoral Thesis}, year = {2017}, }
2016
Andreas Ecke: Quantitative Methods for Similarity in Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2016.
Abstract BibTeX Entry PDF File
Description Logics (DLs) are a family of logic-based knowledge representation languages used to describe the knowledge of an application domain and reason about it in a formally well-defined way. They allow users to describe the important notions and classes of the knowledge domain as concepts, which formalize the necessary and sufficient conditions for individual objects to belong to that concept. A variety of different DLs exist, differing in the set of properties one can use to express concepts, the so-called concept constructors, as well as the set of axioms available to describe the relations between concepts or individuals. However, all classical DLs have in common that they can only express exact knowledge, and correspondingly only allow exact inferences. Either we can infer that some individual belongs to a concept, or we cannot; there is no in-between. In practice, though, knowledge is rarely exact. Many definitions have their exceptions or are vaguely formulated in the first place, and people might not only be interested in exact answers, but also in alternatives that are "close enough".
This thesis is aimed at tackling how to express that something is "close enough", and how to integrate this notion into the formalism of Description Logics. To this end, we will use the notion of similarity and dissimilarity measures as a way to quantify how close exactly two concepts are. We will look at how useful measures can be defined in the context of DLs, and how they can be incorporated into the formal framework in order to generalize it. In particular, we will look closer at two applications of such measures to DLs: Relaxed instance queries incorporate a similarity measure in order to not just give the exact answers to some query, but all answers that are reasonably similar. Prototypical definitions, on the other hand, use a measure of dissimilarity or distance between concepts in order to allow the definition of and reasoning with concepts that capture not just those individuals that satisfy exactly the stated properties, but also those that are "close enough".
@thesis{ Ecke-PhD-2016, address = {Dresden, Germany}, author = {Andreas {Ecke}}, school = {Technische Universit\"{a}t Dresden}, title = {Quantitative Methods for Similarity in Description Logics}, type = {Doctoral Thesis}, year = {2016}, }
2014
Stefan Borgwardt: Fuzzy Description Logics with General Concept Inclusions. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2014.
Abstract BibTeX Entry Publication
Description logics (DLs) are used to represent knowledge of an application domain and provide standard reasoning services to infer consequences of this knowledge. However, classical DLs are not suited to represent vagueness in the description of the knowledge. We consider a combination of DLs and Fuzzy Logics to address this task. In particular, we consider the t-norm-based semantics for fuzzy DLs introduced by Hajek in 2005. Since then, many tableau algorithms have been developed for reasoning in fuzzy DLs. Another popular approach is to reduce fuzzy ontologies to classical ones and use existing highly optimized classical reasoners to deal with them. However, a systematic study of the computational complexity of the different reasoning problems is so far missing from the literature on fuzzy DLs. Recently, some of the developed tableau algorithms have been shown to be incorrect in the presence of general concept inclusion axioms (GCIs). In some fuzzy DLs, reasoning with GCIs has even turned out to be undecidable. This work provides a rigorous analysis of the boundary between decidable and undecidable reasoning problems in t-norm-based fuzzy DLs, in particular for GCIs. Existing undecidability proofs are extended to cover large classes of fuzzy DLs, and decidability is shown for most of the remaining logics considered here. Additionally, the computational complexity of reasoning in fuzzy DLs with semantics based on finite lattices is analyzed. For most decidability results, tight complexity bounds can be derived.
@thesis{ Borgwardt-Diss-2014, address = {Dresden, Germany}, author = {Stefan {Borgwardt}}, school = {Technische Universit\"{a}t Dresden}, title = {Fuzzy Description Logics with General Concept Inclusions}, type = {Doctoral Thesis}, year = {2014}, }
Marcel Lippmann: Temporalised Description Logics for Monitoring Partially Observable Events. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2014.
Abstract BibTeX Entry Publication
Inevitably, it becomes more and more important to verify that the systems surrounding us have certain properties. This is indeed unavoidable for safety-critical systems such as power plants and intensive-care units. We refer to the term system in a broad sense: it may be man-made (e.g. a computer system) or natural (e.g. a patient in an intensive-care unit). Whereas in Model Checking it is assumed that one has complete knowledge about the functioning of the system, we consider an open-world scenario and assume that we can only observe the behaviour of the actual running system by sensors. Such an abstract sensor could sense e.g. the blood pressure of a patient or the air traffic observed by radar.
Then the observed data are preprocessed appropriately and stored in a fact base. Based on the data available in the fact base, situation-awareness tools are supposed to help the user to detect certain situations that require intervention by an expert. Such situations could be that the heart-rate of a patient is rather high while the blood pressure is low, or that a collision of two aeroplanes is about to happen.
Moreover, the information in the fact base can be used by monitors to verify that the system has certain properties. It is not realistic, however, to assume that the sensors always yield a complete description of the current state of the observed system. Thus, it makes sense to assume that information that is not present in the fact base is unknown rather than false. Moreover, very often one has some knowledge about the functioning of the system. This background knowledge can be used to draw conclusions about the possible future behaviour of the system. Employing description logics (DLs) is one way to deal with these requirements. In this thesis, we tackle the sketched problem in three different contexts: (i) runtime verification using a temporalised DL, (ii) temporalised query entailment, and (iii) verification in DL-based action formalisms.
@thesis{ Lippmann-Diss-2014, address = {Dresden, Germany}, author = {Marcel {Lippmann}}, school = {Technische Universit\"{a}t Dresden}, title = {Temporalised Description Logics for Monitoring Partially Observable Events}, type = {Doctoral Thesis}, year = {2014}, }
2011
Felix Distel: Learning Description Logic Knowledge Bases from Data Using Methods from Formal Concept Analysis. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2011.
Abstract BibTeX Entry PDF File Publication
Description Logics (DLs) are a class of knowledge representation formalisms that can represent terminological and assertional knowledge using a well-defined semantics. Often, knowledge engineers are experts in their own fields, but not in logics, and require assistance in the process of ontology design. This thesis presents three methods that can extract terminological knowledge from existing data and thereby assist in the design process. They are based on similar formalisms from Formal Concept Analysis (FCA), in particular the Next-Closure Algorithm and Attribute-Exploration. The first of the three methods computes terminological knowledge from the data, without any expert interaction. The two other methods use expert interaction, where a human expert can confirm each terminological axiom or refute it by providing a counterexample. These two methods differ only in the way counterexamples are provided.
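For readers unfamiliar with the Next-Closure algorithm mentioned above, the following Python sketch (an illustrative reimplementation of the textbook formulation with an invented toy context; it is not code accompanying the thesis) enumerates all closed attribute sets, i.e. the intents of the formal concepts, by repeatedly computing the lectically next closed set.

def closure(attrs, context, M):
    # attributes shared by all objects that have every attribute in `attrs`
    objs = {g for g, row in context.items() if attrs <= row}
    return set(M) if not objs else set.intersection(*(context[g] for g in objs))

def next_closure(A, M, clo):
    # lectically next closed set after A (Ganter's Next-Closure); None if A is the last one
    for i in range(len(M) - 1, -1, -1):
        m = M[i]
        if m in A:
            A = A - {m}
        else:
            B = clo(A | {m})
            if not any(x not in A and M.index(x) < i for x in B):
                return B
    return None

# toy formal context, invented for illustration
context = {"duck": {"can_fly", "lays_eggs"},
           "eagle": {"can_fly", "lays_eggs", "predator"},
           "cat": {"predator"}}
M = ["can_fly", "lays_eggs", "predator"]
clo = lambda A: closure(A, context, M)

intents, A = [], clo(set())
while A is not None:
    intents.append(A)
    A = next_closure(A, M, clo)
print(intents)  # the four intents: set(), {'predator'}, {'can_fly', 'lays_eggs'}, {'can_fly', 'lays_eggs', 'predator'}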
@thesis{ dis-diss-2011, address = {Dresden, Germany}, author = {Felix {Distel}}, school = {Technische Universit\"{a}t Dresden}, title = {Learning Description Logic Knowledge Bases from Data Using Methods from Formal Concept Analysis}, type = {Doctoral Thesis}, year = {2011}, }
2010
Martin Knechtel: Access Restrictions to and with Description Logic Web Ontologies. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2010.
Abstract BibTeX Entry PDF File Publication Slides
Access restrictions are essential in standard information systems and became an issue for ontologies in the following two aspects. Ontologies can represent explicit and implicit knowledge about an access policy. For this aspect we provided a methodology to represent and systematically complete role-based access control policies. Orthogonally, an ontology might be available for limited reading access. Independently of a specific ontology language or reasoner, we provided a lattice-based framework to assign labels to an ontology's axioms and consequences. We looked at the problems to compute and repair one or multiple consequence labels and to assign a query-based access restriction. An empirical evaluation has shown that the algorithms perform well in practical scenarios with large-scale ontologies.
@thesis{ kne-diss-2010, address = {Dresden, Germany}, author = {Martin {Knechtel}}, school = {Technische Universit\"{a}t Dresden}, title = {Access Restrictions to and with Description Logic Web Ontologies}, type = {Doctoral Thesis}, year = {2010}, }
Hongkai Liu: Computing Updates in Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2010.
Abstract BibTeX Entry PDF File Publication
Description Logics (DLs) form a family of knowledge representation formalisms which can be used to represent and reason with conceptual knowledge about a domain of interest. The knowledge represented by DLs is mainly static. In many applications, the domain knowledge is dynamic. This observation motivates the research on how to update the knowledge when changes in the application domain take place. This thesis is dedicated to the study of updating knowledge, more precisely, assertional knowledge represented in DLs. We explore whether the updated knowledge can be expressed in several standard DLs and, if so, whether it is computable and what is its size.
@thesis{ Liu-PhD-10, address = {Dresden, Germany}, author = {Hongkai {Liu}}, school = {Technische Universit\"{a}t Dresden}, title = {Computing Updates in Description Logics}, type = {Doctoral Thesis}, year = {2010}, }
2009
Rafael Peñaloza Nyssen: Axiom-Pinpointing in Description Logics and Beyond. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2009.
Abstract BibTeX Entry PDF File Publication
Building and maintaining large-scale ontologies is an error-prone task. It is thus not uncommon to find unwanted or unexpected consequences that follow implicitly from the restrictions in the ontology. To understand and correct these consequences, it is helpful to find the specific portions of the ontology that are responsible for them. Axiom-pinpointing is the task of finding minimal subontologies that entail a given consequence, also called MinAs. In this work we look at the task of computing all the MinAs by means of modified decision procedures. We first show that tableaux- and automata-based decision procedures can be transformed into pinpointing algorithms that output a (compact) representation of the set of all MinAs. We then explore the complexity of the problem.
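As a purely illustrative, brute-force rendering of the pinpointing task (not the modified decision procedures developed in the thesis), the following Python sketch enumerates all MinAs of a consequence relative to a black-box entailment test, here instantiated with a hypothetical toy entailment relation.

from itertools import combinations

def all_minas(axioms, entails, consequence):
    # enumerate all minimal subsets of `axioms` entailing `consequence`,
    # by increasing size, skipping supersets of already-found MinAs
    minas = []
    for k in range(len(axioms) + 1):
        for subset in combinations(axioms, k):
            s = set(subset)
            if any(m <= s for m in minas):
                continue  # contains a smaller MinA, hence not minimal
            if entails(s, consequence):
                minas.append(s)
    return minas

# hypothetical toy example: each axiom is named after the facts it asserts,
# and a set of axioms entails a fact c iff some axiom in it mentions c
axioms = ["asserts_p", "asserts_q", "asserts_p_and_q"]
entails = lambda s, c: any(c in ax for ax in s)
print(all_minas(axioms, entails, "p"))  # [{'asserts_p'}, {'asserts_p_and_q'}]

In practice, entails would be a call to a DL reasoner; the thesis instead modifies the decision procedures themselves so that a (compact) representation of all MinAs is produced directly.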
@thesis{ Pen-PhD-09, address = {Dresden, Germany}, author = {Rafael Pe\~{n}aloza {Nyssen}}, school = {Technische Universit\"{a}t Dresden}, title = {Axiom-Pinpointing in Description Logics and Beyond}, type = {Doctoral Thesis}, year = {2009}, }
Boontawee Suntisrivaraporn: Polynomial-Time Reasoning Support for Design and Maintenance of Large-Scale Biomedical Ontologies. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2009.
Abstract BibTeX Entry PDF File Publication
Description Logics (DLs) belong to a successful family of knowledge representation formalisms with two key assets: formally well-defined semantics, which allows one to represent knowledge in an unambiguous way, and automated reasoning, which allows one to infer implicit knowledge from that given explicitly. This thesis investigates various reasoning techniques for tractable DLs in the EL family which have been implemented in the CEL system. It suggests that the use of lightweight DLs, in which reasoning is tractable, is beneficial for ontology design and maintenance both in terms of expressivity and scalability. The claim is supported by a case study on the renowned medical ontology SNOMED CT and an extensive empirical evaluation on several large-scale biomedical ontologies.
@thesis{ Sun-PhD-09, address = {Dresden, Germany}, author = {Boontawee {Suntisrivaraporn}}, school = {Technische Universit\"{a}t Dresden}, title = {Polynomial-Time Reasoning Support for Design and Maintenance of Large-Scale Biomedical Ontologies}, type = {Doctoral Thesis}, year = {2009}, }
2008
Maja Miličić: Action, Time and Space in Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2008.
Abstract BibTeX Entry PDF File Publication
Description Logics (DLs) are a family of logic-based knowledge representation (KR) formalisms designed to represent and reason about static conceptual knowledge in a semantically well-understood way. On the other hand, standard action formalisms are KR formalisms based on classical logic designed to model and reason about dynamic systems. The largest part of the present work is dedicated to integrating DLs with action formalisms, with the main goal of obtaining decidable action formalisms with an expressiveness significantly beyond propositional. To this end, we offer DL-tailored solutions to the frame and ramification problem. The main technical result is that standard reasoning problems about actions (executability and projection), as well as the plan existence problem, are decidable if one restricts the logic for describing action pre- and post-conditions and the state of the world to decidable Description Logics. Moreover, reasoning about DL actions is reducible to standard DL reasoning problems, with the nice consequence that already available DL reasoners can be employed to provide automated support for reasoning about a dynamically changing world. A smaller part of the work is related to the design of a tableau algorithm for decidable extensions of Description Logics with concrete datatypes, most importantly those allowing one to refer to the notions of space and time, and with powerful domain constraints (general concept inclusions).
@thesis{ MajaMilicicDiss, address = {Dresden, Germany}, author = {Maja {Mili\v{c}i\'{c}}}, school = {Technische Universit\"{a}t Dresden}, title = {Action, Time and Space in Description Logics}, type = {Doctoral Thesis}, year = {2008}, }
2007
Jan Hladik: To and Fro Between Tableaus and Automata for Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2007.
Abstract BibTeX Entry Publication
Description Logics (DLs) are a family of knowledge representation languages with well-defined logic-based semantics and decidable inference problems, e.g. satisfiability. Two of the most widely used decision procedures for the satisfiability problem are tableau- and automata-based algorithms. Due to their different modes of operation, these two classes have complementary properties: tableau algorithms are well-suited for implementation and for showing PSPACE and NEXPTIME complexity results, whereas automata algorithms are particularly useful for showing EXPTIME results. Additionally, automata algorithms allow for an elegant handling of infinite structures, but they are not suited for implementation. The aim of this thesis is to analyse the reasons for these differences and to find ways of transferring properties between the two approaches in order to reconcile the positive properties of both.
In a first approach, we describe a method to translate automata into DL knowledge bases, which allows us to use tableau reasoners to decide the automata emptiness problem. Since empirical results show that this does not lead to acceptable performance in practice, we develop a way of transferring PSPACE complexity results from tableaus to automata by modifying the automata emptiness test with optimisation techniques known from tableaus, in particular by using a blocking condition. Finally, in order to transfer EXPTIME complexity results from the automata to the tableau paradigm, we develop Tableau Systems, a framework for tableau algorithms. From an algorithm formalised within this framework, it is possible to derive both an EXPTIME automata algorithm and a tableau algorithm that is usable in practice.
@thesis{ Hladik-Diss-2007, address = {Dresden, Germany}, author = {Jan {Hladik}}, school = {Technische Universit\"{a}t Dresden}, title = {To and Fro Between Tableaus and Automata for Description Logics}, type = {Doctoral Thesis}, year = {2007}, }
Barış Sertkaya: Formal Concept Analysis Methods for Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2007.
Abstract BibTeX Entry Publication
Description Logics (DLs) are a class of logic-based knowledge representation formalisms that are used to represent terminological knowledge of an application domain in a structured way. Formal Concept Analysis (FCA), on the other hand, is a field of applied mathematics that aims to formalize the notions of a concept and a conceptual hierarchy by means of mathematical tools. Although the notion of a concept as a collection of objects sharing certain properties, and the notion of a conceptual hierarchy, are fundamental to both DLs and FCA, the ways concepts are described and obtained differ significantly between these two research areas.
In the present work we show that, despite these differences, DL research can benefit from FCA research for solving problems encountered in knowledge representation by employing FCA methods. Our contributions in this direction fall mainly under two topics: 1) supporting bottom-up construction of DL knowledge bases, 2) completing DL knowledge bases. In the first one we show how the attribute exploration method can be used for computing least common subsumers w.r.t. a background knowledge base. In the second one, we develop an extension of FCA for open-world semantics of DL knowledge bases, and show how the extended attribute exploration method can be used to detect missing axioms and individuals in a DL knowledge base. Moreover we also contribute to FCA research by investigating the computational complexity of several decision and counting problems related to minimal generators of closed sets.
@thesis{ Sertkaya-Diss-2007, address = {Dresden, Germany}, author = {Bar\i{}\c{s} {Sertkaya}}, school = {Technische Universit\"{a}t Dresden}, title = {Formal Concept Analysis Methods for Description Logics}, type = {Doctoral Thesis}, year = {2007}, }
Anni-Yasmin Turhan: On the Computation of Common Subsumers in Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2007.
Abstract BibTeX Entry PDF File Publication
Description logic (DL) knowledge bases are often built by users with expertise in the application domain, but little expertise in logic. To support such users when building their knowledge bases, a number of extension methods have been proposed that provide the user with concept descriptions as a starting point for new concept definitions. The inference service central to several of these approaches is the computation of (least) common subsumers of concept descriptions.
If disjunction of concepts can be expressed in the DL under consideration, the least common subsumer (lcs) is just the disjunction of the input concepts. Such a trivial lcs is of little use as a starting point for a new concept definition to be edited by the user. To address this problem, we propose two approaches to obtaining "meaningful" common subsumers in the presence of disjunction, tailored to two different methods for extending DL knowledge bases. More precisely, we devise computation methods for the approximation-based approach and for the customization of DL knowledge bases, extend these methods to DLs with number restrictions, and discuss their efficient implementation.
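A small illustration of the issue (a standard example of the phenomenon, not taken from the thesis): in a DL with disjunction, \(\operatorname{lcs}(C,D) \equiv C \sqcup D\) is always correct but uninformative, whereas in a DL without disjunction, such as \(\mathcal{EL}\), the lcs extracts the shared structure, e.g.
\[
\operatorname{lcs}\bigl(\exists r.(A \sqcap B),\; \exists r.(A \sqcap C)\bigr) \;\equiv\; \exists r.A ,
\]
which is the kind of "meaningful" common subsumer the approaches above aim to obtain even when disjunction is available.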
@thesis{ Turhan-PhD, address = {Dresden, Germany}, author = {Anni-Yasmin {Turhan}}, school = {Technische Universit\"{a}t Dresden}, title = {On the Computation of Common Subsumers in Description Logics}, type = {Doctoral Thesis}, year = {2007}, }
2006
Sebastian-Philipp Brandt: Standard and Non-standard Reasoning in Description Logics. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2006.
Abstract BibTeX Entry PDF File Publication
The present work deals with Description Logics (DLs), a class of knowledge representation formalisms used to represent and reason about classes of individuals and relations between such classes in a formally well-defined way. We provide novel results in three main directions. (1) Tractable reasoning revisited: in the 1990s, DL research had largely answered the question of practically relevant yet tractable DL formalisms in the negative. Due to novel application domains, especially the Life Sciences, and a surprising tractability result by Baader, we have revisited this question, this time looking in a new direction: general terminologies (TBoxes) and extensions thereof defined over the DL EL and extensions thereof. As the main positive result, we devise EL++(D)-CBoxes as a tractable DL formalism with optimal expressivity in the sense that every additional standard DL constructor, every extension of the TBox formalism, or every more powerful concrete domain makes reasoning intractable. (2) Non-standard inferences for knowledge maintenance: non-standard inferences, such as matching, can support domain experts in maintaining DL knowledge bases in a structured and well-defined way. In order to extend their availability and promote their use, the present work extends the state of the art of non-standard inferences both w.r.t. theory and implementation. Our main results are implementations and performance evaluations of known matching algorithms for the DLs ALE and ALN, optimal non-deterministic polynomial time algorithms for matching under acyclic side conditions in ALN and sublanguages, and optimal algorithms for matching w.r.t. cyclic (and hybrid) EL-TBoxes. (3) Non-standard inferences over general concept inclusion (GCI) axioms: the utility of GCIs in modern DL knowledge bases and the relevance of non-standard inferences to knowledge maintenance naturally motivate the question of a tractable DL formalism in which both can be provided. As the main result, we propose hybrid EL-TBoxes as a solution to this hitherto open question.
@thesis{ Brandt-PhD-2006, address = {Dresden, Germany}, author = {Sebastian-Philipp {Brandt}}, school = {Technische Universit\"{a}t Dresden}, title = {Standard and Non-standard Reasoning in Description Logics}, type = {Doctoral Thesis}, year = {2006}, }
2002
T. Hinze: Universelle Modelle und ausgewählte Algorithmen des DNA-Computing. Doctoral Thesis, Technische Universität Dresden, Dresden, Germany, 2002.
Abstract BibTeX Entry PDF File PS File
This thesis examines the research field of DNA computing primarily from the perspective of computability theory. Universal and space-bounded universal models of DNA computing, whose DNA-based data rest on the representation of linear DNA, are investigated, classified, and applied as description systems for algorithms. With the TT6-EH system and the simulation system Sisyphus, two universal DNA-computing models are introduced whose properties are oriented towards laboratory practice. The TT6-EH system is a distributed splicing system with finitely many components, characterized by a static system architecture, a minimization of the resources involved in processing, and a low level of abstraction of the model operations. The simulation system Sisyphus takes into account side effects of the molecular-biological processes underlying the model operations. In addition, the model has the properties "restrictive" and "multiset-based". A practical laboratory verification was carried out on a problem instance of the NP-complete knapsack problem.
@phdthesis{ Hinze-TUD-02, address = {Dresden, Germany}, author = {T. {Hinze}}, school = {Technische Universit\"at Dresden}, title = {Universelle Modelle und ausgew\"ahlte Algorithmen des DNA-Computing}, type = {Doctoral Thesis}, year = {2002}, }
Carsten Lutz: The Complexity of Description Logics with Concrete Domains. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 2002.
Abstract BibTeX Entry PDF File PS File Publication
Concrete domains are an extension of Description Logics (DLs) that allows one to integrate reasoning about conceptual knowledge with reasoning about "concrete qualities" of real world entities such as their age, weight, shape, and temporal extension. In this thesis, we perform an in-depth analysis of the computational complexity of reasoning with DLs that are equipped with concrete domains.
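To give a feel for the formalism (a generic, textbook-style example rather than one quoted from the thesis): with a numerical concrete domain \(\mathcal{D}\) one can, roughly, write concepts such as
\[
\textsf{Minor} \;\equiv\; \textsf{Human} \sqcap \exists \mathit{age}.{<_{18}} ,
\]
where \(\mathit{age}\) is a concrete feature mapping individuals to numbers and \(<_{18}\) is a unary predicate of \(\mathcal{D}\); the complexity analysis that follows is parameterised by the complexity of reasoning in \(\mathcal{D}\) itself.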
The main results are that (i) reasoning with ALC(D), the basic DL ALC extended with a concrete domain D, is PSpace-complete if reasoning with D is in PSpace; (ii) for many seemingly "harmless" extensions of ALC(D), such as the extension with acyclic TBoxes, role conjunction, and inverse roles, the complexity of reasoning leaps from PSpace-completeness to NExpTime-completeness—at least for a large class of concrete domains; and (iii) although, in general, the combination of concrete domains and general TBoxes leads to undecidability of reasoning, there exists an interesting temporal concrete domain that can be combined with general TBoxes without sacrificing decidability. This concrete domain is used to devise a rather powerful interval-based temporal Description Logic.
As a side-track, we illuminate the connection between Description Logics equipped with concrete domains and DLs providing for so-called feature agreement and disagreement constructors. Indeed, this connection turns out to be quite close: in most cases, the complexity of reasoning with a DL providing for concrete domains is identical to the complexity of reasoning with the corresponding DL that offers feature (dis)agreements.
@thesis{ Lutz-PhD-2002, address = {Aachen, Germany}, author = {Carsten {Lutz}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {The Complexity of Description Logics with Concrete Domains}, type = {Doctoral Thesis}, year = {2002}, }
2001
Stephan Tobies: Complexity Results and Practical Algorithms for Logics in Knowledge Representation. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 2001.
Abstract BibTeX Entry PDF File PS File Publication
Description Logics (DLs) are used in knowledge-based systems to represent and reason about terminological knowledge of the application domain in a semantically well-defined manner. In this thesis, we establish a number of novel complexity results and give practical algorithms for expressive DLs that provide different forms of counting quantifiers.
We show that, in many cases, adding local counting in the form of qualifying number restrictions to DLs does not increase the complexity of the inference problems, even if binary coding of numbers in the input is assumed. On the other hand, we show that adding different forms of global counting restrictions to a logic may increase the complexity of the inference problems dramatically.
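To see why the coding of numbers matters (a generic illustration, not an example from the thesis): a qualifying number restriction such as
\[
\mathord{\geqslant} 1000\; \mathit{hasPart}.\mathsf{Defective}
\]
describes objects with at least 1000 parts that are defective. Under binary coding, the number 1000 contributes only about \(\log_2 1000 \approx 10\) bits to the input size, so an algorithm that naively introduces 1000 explicit successors is already exponential in the size of the input.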
We provide exact complexity results and a practical, tableau-based algorithm for the DL SHIQ, which forms the basis of the highly optimized DL system iFaCT.
Finally, we describe a tableau algorithm for the clique guarded fragment (CGF), which we hope will serve as the basis for an efficient implementation of a CGF reasoner.
@thesis{ Tobies-PhD-2001, address = {Aachen, Germany}, author = {Stephan {Tobies}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {Complexity Results and Practical Algorithms for Logics in Knowledge Representation}, type = {Doctoral Thesis}, year = {2001}, }
2000
Ralf Küsters: Non-Standard Inferences in Description Logics. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 2000.
Abstract BibTeX Entry ©Springer-Verlag Publication
Description Logics denote a family of knowledge representation formalisms that can be used to represent the terminological knowledge of an application domain in a structured and well-defined way. The standard inferences in these logics, like subsumption and instance checking, have been investigated in detail during the last fifteen years, both from a theoretical and a practical point of view. In some applications it has turned out, however, that the optimal support of building and maintaining large knowledge bases requires additional (non-standard) inferences, such as least common subsumer, most specific concept, and matching of concept descriptions. Some systems contain ad hoc implementations of such inferences, mostly without having defined them formally or having investigated their computational complexity. In this work, a formal basis for these inferences is established by providing precise definitions, complete algorithms, and first complexity results.
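A small example of our own (not from the thesis): in the DL \(\mathcal{EL}\), with \(\mathsf{Doctor}\) and \(\mathsf{Lawyer}\) unrelated concept names, the least common subsumer of \(\mathsf{Person} \sqcap \exists\mathsf{child}.\mathsf{Doctor}\) and \(\mathsf{Person} \sqcap \exists\mathsf{child}.\mathsf{Lawyer}\) is \(\mathsf{Person} \sqcap \exists\mathsf{child}.\top\), the most specific description subsuming both inputs. Matching asks, conversely, whether a pattern such as \(\mathsf{Person} \sqcap \exists\mathsf{child}.X\) can be instantiated (here with \(X \mapsto \mathsf{Doctor}\)) so that it becomes equivalent to a given description.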
@thesis{ Kuesters-Diss-2000, address = {Aachen, Germany}, author = {Ralf {K\"{u}sters}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {Non-Standard Inferences in Description Logics}, type = {Doctoral Thesis}, year = {2000}, }
Ralf Molitor: Unterstützung der Modellierung verfahrenstechnischer Prozesse durch Nicht-Standardinferenzen in Beschreibungslogiken. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 2000.
Abstract BibTeX Entry PDF File PS File Publication
In process engineering, as in many other areas, a structured representation and storage of application-specific knowledge is desirable. In a cooperation between the Lehrstuhl für Prozesstechnik (process systems engineering) and the Lehr- und Forschungsgebiet Theoretische Informatik (theoretical computer science), it had already been shown that Description Logics, owing to their high expressive power and powerful inference algorithms, are well suited for this task. For example, the inference services provided as standard by Description Logic systems allow the specialization hierarchy of a knowledge base to be computed automatically. It turned out, however, that these services do not suffice for comprehensive support in building and maintaining a knowledge base.
This thesis therefore investigates the non-standard inferences least common subsumer, most specific concept, and rewriting, which in combination support the definition of new concepts and thus the extension and maintenance of the knowledge base. The results on existence, computability, and complexity, as well as the development of complete algorithms for solving these inference problems, focus on Description Logics that are already successfully used in process engineering.
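A hedged example of the rewriting inference (ours, not from the thesis): if a TBox defines \(\mathsf{Parent} \equiv \mathsf{Person} \sqcap \exists\mathsf{child}.\top\), then the description \(\mathsf{Person} \sqcap \exists\mathsf{child}.\top \sqcap \exists\mathsf{child}.\mathsf{Doctor}\) can be rewritten into the shorter, equivalent \(\mathsf{Parent} \sqcap \exists\mathsf{child}.\mathsf{Doctor}\); such rewritings help make automatically computed concepts, for example least common subsumers, readable for the domain expert.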
@thesis{ Molitor-diss, address = {Aachen, Germany}, author = {Ralf {Molitor}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {Unterst\"{u}tzung der Modellierung verfahrenstechnischer Prozesse durch Nicht-Standardinferenzen in Beschreibungslogiken}, type = {Doctoral Thesis}, year = {2000}, }
1999
Jörn Richts: Effiziente Entscheidungsverfahren zur E-Unifikation. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 1999.
Abstract BibTeX Entry ©Shaker Publication
In automated theorem proving, E-unification is a technique for complementing generally applicable proof procedures such as resolution or completion with specialized procedures. The latter can solve only very specific problems, but do so particularly efficiently. By employing E-unification, the treatment of an equational theory E is shifted from the general proof procedure into a dedicated unification procedure.
If several equational theories E0, ..., En occur in a proof problem, the task arises of combining the individual unification procedures for these theories into a unification procedure for the union of the theories E0, ..., En. Several combination algorithms have been proposed in the literature for this purpose. However, they deal almost exclusively with the combination of E-unification procedures that compute so-called complete sets of unifiers, i.e., sets of solutions. In recent years, interest has also grown in E-unification procedures that work as pure decision procedures. Franz Baader and Klaus Schulz presented a combination algorithm that can also combine decision procedures for E-unification. This algorithm, however, is primarily of theoretical interest, since a direct implementation is unsuitable for practical problems because of the enormously large search space.
Building on the algorithm by Baader and Schulz, this thesis presents an optimized combination method for decision procedures for E-unification. The basic idea of this optimization is to restrict the search space by exchanging information between the component procedures for the theories E0, ..., En involved. In order to specify this information exchange and to prove its correctness, a dedicated formalism, so-called constraints, is introduced. In addition, it is shown for three selected theories how known decision procedures can be extended so that they are able to participate in this information exchange. Runtime experiments with the implemented algorithms show that these optimizations indeed lead to unification algorithms that are usable for problems arising in practice.
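As an illustration of our own (not from the thesis): if \(+\) is an associative-commutative symbol of one component theory and \(g\) a free function symbol of another, the mixed problem \(g(x + a) \stackrel{?}{=} g(a + b)\) decomposes into the pure problem \(x + a \stackrel{?}{=}_{\mathit{AC}} a + b\), which the AC component decides (it is solvable, for instance by \(x \mapsto b\)). Beyond such decomposition, a combination procedure must also guess how variables shared between the pure subproblems are identified and in which component theory each of them is ultimately solved; it is exactly this search space that the information exchange described above is intended to prune.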
@thesis{ Richts-Diss-1999, address = {Aachen, Germany}, author = {J\"{o}rn {Richts}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {Effiziente Entscheidungsverfahren zur E-Unifikation}, type = {Doctoral Thesis}, year = {1999}, }
Christopher B. Tresp: Beschreibungslogiken zur Behandlung von unscharfem Wissen im Kontext eines medizinischen Anwendungsszenarios. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 1999.
Abstract BibTeX Entry ©Shaker Publication
In the medical domain, the use of traditional knowledge representation systems is complicated by the fact that statements and conclusions are often not exact. Rather, they are frequently available only with a certain degree of vagueness. Nevertheless, there is often the requirement to formalize the relevant knowledge appropriately in order to develop application systems that support physicians in their training or in everyday practice.
To build a medical tutoring system in the field of radiology, this thesis therefore develops an infinitely-valued (fuzzy) Description Logic that is specifically able to represent the vague knowledge occurring in this setting.
One focus of the thesis is the theoretical foundation of this Description Logic and the reasoning procedure built on it, which provides the starting point for a computer system. The properties established here allow statements that are important for the later application, for example concerning the soundness of inference in this logic.
On the other hand, the thesis also addresses the practical questions that arise when designing a medical application system. An important point here is the processing of medical terminology, which, in the context of the tutoring system, has to be translated fully automatically into constructs of the knowledge representation language.
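A rough sketch of the underlying idea (the concept name and value are illustrative, not taken from the thesis): in a fuzzy Description Logic, concepts are interpreted as fuzzy sets, so an assertion can state, for example, that a finding belongs to \(\mathsf{EnlargedHeart}\) with degree at least \(0.7\), and under the Zadeh-style semantics commonly used in fuzzy DLs a conjunction \(C \sqcap D\) holds at an individual to the minimum of the degrees of \(C\) and \(D\).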
@thesis{ Tresp-Diss-1999, address = {Aachen, Germany}, author = {Christopher B. {Tresp}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {Beschreibungslogiken zur Behandlung von unscharfem Wissen im Kontext eines medizinischen Anwendungsszenarios}, type = {Doctoral Thesis}, year = {1999}, }
1998
Can Adam Albayrak: Die WHILE-Hierarchie für Programmschemata. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 1998.
Abstract BibTeX Entry ©Shaker Publication
This thesis considers several classes of monadic program schemata known from the literature. Monadic program schemata are uninterpreted programs and thus represent programs over potentially arbitrary computation domains with arbitrary unary base functions. In contrast to the notion of equivalence for programs, under which two programs are considered equivalent if they compute the same function, the notion of equivalence for program schemata requires that for all interpretations of the base functions the resulting programs compute the same function. For the class of Ianov schemata, a characterization of equivalent program schemata is known. Here, such a characterization is given for the class of Dijkstra schemata and the class of Kosaraju schemata. The characterization assigns regular languages, or vectors of such languages, to schemata; notably, the characterizing languages follow the inductive structure of the respective classes. Using these characterizations, the equivalence problem for program schemata of these classes can, for instance, be decided very efficiently.
Moreover, it is shown that the nesting depth of WHILE loops in Dijkstra schemata induces a strict semantic hierarchy of functions, the so-called WHILE hierarchy. This hierarchy result shows that, when arbitrary computation domains and arbitrary base functions are considered, the classical theorem of recursion theory by S.C. Kleene no longer holds, which states that a single WHILE loop suffices to program arbitrary computable functions on the natural numbers.
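To make schema equivalence concrete (an example of ours, not from the thesis): over a single unary function symbol \(f\) and predicate symbol \(p\), the schemata \(\mathbf{while}\ p(x)\ \mathbf{do}\ x := f(x)\) and \(\mathbf{if}\ p(x)\ \mathbf{then}\ x := f(x)\) are not equivalent; for instance, under an interpretation in which \(p\) is always true, the first schema diverges while the second terminates after one assignment. Equivalence of schemata requires agreement under every interpretation of \(f\) and \(p\), and Kleene's single-WHILE-loop normal form relies on coding control information into natural numbers, a device that is unavailable over arbitrary domains with only unary base functions.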
@thesis{ Albayrak-Diss-1998, address = {Aachen, Germany}, author = {Can Adam {Albayrak}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {Die WHILE-Hierarchie f\"{u}r Programmschemata}, type = {Doctoral Thesis}, year = {1998}, }
Ulrike Sattler: Terminological Knowledge Representation Systems in a Chemical Engineering Application. Doctoral Thesis, Rheinisch-Westfälische Technische Hochschule Aachen, Aachen, Germany, 1998.
Abstract BibTeX Entry PDF File PS File ©Verlag Mainz
This work is concerned with the question of how far terminological knowledge representation systems can support the development of mathematical models of chemical processes. Terminological knowledge representation systems are based on Description Logics, a highly expressive formalism with well-defined semantics, and provide powerful inference services. These system services can be used to support the structuring of the application domain, namely the organised storage of parts of process models. However, the process systems engineering application calls for extending the expressive power of existing Description Logics. These extensions are introduced and investigated with respect to the computational complexity of the corresponding inference problems.
@thesis{ Sattler-diss, address = {Aachen, Germany}, author = {Ulrike {Sattler}}, school = {Rheinisch-Westf\"{a}lische Technische Hochschule Aachen}, title = {Terminological Knowledge Representation Systems in a Chemical Engineering Application}, type = {Doctoral Thesis}, year = {1998}, }