ZIH-Colloquium
Past Colloquia
December 10, 2015, additional colloquium: Olivier Beaumont (INRIA - Bordeaux): Comparison of Static and Dynamic Strategies for Task Graph Scheduling
The goal of this talk is to provide an analysis and comparison of static and dynamic strategies for task graph scheduling on platforms consisting of heterogeneous and unrelated resources, such as GPUs and CPUs. Static scheduling strategies, which have been used for years, suffer from several weaknesses. First, it is well known that the underlying optimization problems are in general NP-complete, which limits the ability to find optimal solutions to small cases. Second, parallelism inside processing nodes makes it difficult to precisely predict the performance of both communications and computations, due to shared resources and co-scheduling effects.
Recently, to cope with these limitations, many dynamic task-graph based runtime schedulers (StarPU, StarSs, QUARK, PaRSEC) have been proposed. Dynamic schedulers base their allocation and scheduling decisions on the one hand on dynamic information, such as the set of available tasks, data location, and the state of the resources, and on the other hand on static information, such as task priorities computed from the whole task graph.
In this talk, I will consider two different linear algebra kernels, namely matrix multiplication and Cholesky factorization, to illustrate the respective advantages and limitations of static and dynamic solutions. In the case of matrix multiplication, we will more specifically study the impact of dynamism and runtime strategies on the overall communication volume. In the case of Cholesky factorization, we will concentrate on the impact of heterogeneous and unrelated resources on resource allocation.
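As a toy illustration of the dynamic approach (my own sketch, not the algorithms analyzed in the talk), a greedy runtime scheduler can assign each ready task to whichever unrelated resource would finish it earliest; the task names and cost table below are hypothetical:

```python
# Hypothetical per-task execution times on unrelated resources: the same task
# can be fast on one resource type and slow on another.
costs = {
    "gemm":  {"cpu": 8.0, "gpu": 1.0},   # GPU-friendly kernel
    "panel": {"cpu": 2.0, "gpu": 3.0},   # CPU-friendly kernel
}

def schedule(tasks, resources):
    """Greedy dynamic scheduling of independent tasks on unrelated resources:
    each task goes to the resource with the earliest finish time for it."""
    ready_at = {r: 0.0 for r in resources}   # next free time per resource
    plan = []
    for task in tasks:
        best = min(resources, key=lambda r: ready_at[r] + costs[task][r])
        start = ready_at[best]
        ready_at[best] = start + costs[task][best]
        plan.append((task, best, start))
    return plan, max(ready_at.values())

plan, makespan = schedule(["gemm", "gemm", "panel", "panel"], ["cpu", "gpu"])
print(plan)
print(makespan)  # -> 4.0: GPU takes both gemms, CPU both panels
```

A static scheduler would fix this mapping offline from the whole task graph; the greedy rule above decides per task at runtime, which is what lets it react to the actual state of the resources.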
Olivier Beaumont received his PhD from the University of Rennes in 1999 and his Habilitation à Diriger des Recherches in 2004. He has been a senior scientist at Inria since 2007 and has served as Scientific Deputy of the Inria research center in Bordeaux since 2010. His research interests include the design of parallel and distributed algorithms and of scheduling and resource allocation algorithms in the presence of heterogeneous resources.
July 23, 2015: Lucas Wetzel (MPI PKS Dresden) "Robust Self-organized Synchronization of Separate Electronic Clocks inspired by Biological Systems"
Both in biology and in engineering, the synchronization of many autonomously oscillating parts is an important concept; think of flashing fireflies, cardiac pacemaker cells, or multi-core systems. In electronic systems consisting of hundreds of autonomous cores with separate clocks, state-of-the-art synchronization techniques can become highly inefficient. Understanding synchronization strategies in biological systems can provide novel approaches for synchronization in electronic systems.
We seek biologically inspired synchronization techniques for large systems of electronic clocks to support concerted operations. Using theories of coupled oscillators that capture the effects of signal transmission delays and signal filtering, we analyze the synchronization properties of such systems. In parallel, we test our theoretical results for the frequencies of the synchronized states and the perturbation decay rate in experiments on mutually delay-coupled electronic clocks. In particular, we use networks of delay-coupled digital phase-locked loops, which are basic electronic clocking devices.
We see potential applications in multi-core and multi-processor computer architectures, network-on-chips, terahertz devices, and antenna arrays.
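A minimal numerical sketch (my illustration, not the speaker's model) shows the core effect such delay-coupled theories must capture: two identical phase oscillators, each coupled to the other's delayed phase, lock to a common frequency that is shifted away from their intrinsic frequency by the delay. All parameter values are arbitrary:

```python
import numpy as np

# Two identical phase oscillators with Kuramoto-type coupling to each
# other's delayed phase, integrated with a simple Euler scheme.
omega = 2 * np.pi          # intrinsic angular frequency (1 Hz)
K = 0.5                    # coupling strength
tau = 0.1                  # signal transmission delay [s]
dt = 1e-3                  # Euler time step [s]
steps = 50_000
d = int(round(tau / dt))   # delay measured in time steps

theta = np.zeros((steps, 2))
for t in range(1, steps):
    past = theta[max(t - 1 - d, 0)]    # partner phases one delay ago
    theta[t, 0] = theta[t-1, 0] + dt * (omega + K * np.sin(past[1] - theta[t-1, 0]))
    theta[t, 1] = theta[t-1, 1] + dt * (omega + K * np.sin(past[0] - theta[t-1, 1]))

# Locked frequency, measured over the second half of the run
half = steps // 2
Omega = (theta[-1, 0] - theta[half, 0]) / (dt * (steps - 1 - half))
# An in-phase locked state should satisfy Omega = omega - K*sin(Omega*tau)
print(Omega, omega - K * np.sin(Omega * tau))
```

The measured frequency settles onto the self-consistent solution of Omega = omega - K*sin(Omega*tau), i.e. slightly below the intrinsic 2*pi rad/s here; it is this delay-induced frequency shift (and the decay of perturbations around the locked state) that the theory and the PLL experiments compare.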
Lucas Wetzel received his diploma degree in physics from Leipzig University, Germany, in 2008 and his doctoral degree in physics from TU Dresden, Germany, in 2012. During 2006, he was a visiting student at ETH Zurich, Switzerland. In 2003, he did an internship at the Leibniz Institute of Surface Modification, studying the epitaxial growth of GaAs semiconductors. For his diploma studies, he specialized in micro- and nanosystems, stochastic processes, and nonlinear dynamics. Prof. Dr. Ulrich Behn supervised his diploma thesis on statistical properties of stochastic delay differential equations with additive and multiplicative noise. Since 2008, he has been with the Max Planck Institute for the Physics of Complex Systems, Dresden, Germany, where he is currently a research associate. His research interests relate to biological clocks, the nonlinear dynamics of coupled oscillators, and the coordination of large systems of coupled electronic clocks. Dr. Wetzel is a member of the German Physical Society (DPG).
June 25, 2015: Prof. Dr. Heidrun Schumann (Universität Rostock) "Visuelle Analyse großer Datenmengen" (Visual Analysis of Large Data Volumes) (slides)
The effective analysis of complex collections of information poses an enormous challenge for researchers and decision makers in many areas of science and industry. Visual analytics is a new research field that addresses the exploration of very large data volumes by combining analytical, visual, and interactive methods, thereby supporting the discovery of relationships as well as the generation and validation of hypotheses. The talk presents new approaches to the interactive visualization of data and structural relationships. Using the segmentation of time series as an example, it shows how different methods and parameterizations lead to different results, and how suitable focusing can highlight information of interest. Finally, a proactive approach is presented that supports the presentation and discussion of large amounts of information in a multi-display environment.
Heidrun Schumann has held the Chair of Computer Graphics at the University of Rostock since 1992. Her research covers information visualization and visual analytics, with a focus on the visual analysis of networks and of spatio-temporal multivariate data. Current research projects investigate questions of scalable visualization and the design of visual interfaces in smart environments. In addition, application-oriented solutions for the visualization of terrains and biomedical data are being developed. She is also co-author of two textbooks on visualization. Since 2014, Prof. Schumann has been a member of the Eurographics Association.
May 21, 2015, 15:00, WIL A317: additional colloquium with Morris Riedel (University of Iceland, FZ Jülich) "Selected Parallel and Scalable Methods for Scientific Big Data Analytics" (slides)
The goal of this talk is to inform participants about two concrete and widely used data analytics techniques that are suitable for analysing 'big data' in scientific and engineering applications. After a brief introduction to the general approach of using machine learning, data mining, and statistical computing in data analytics, the talk will offer details on the 'clustering' technique, which partitions datasets into previously unknown subgroups (i.e. clusters). From the broad class of available methods, we focus on the density-based spatial clustering of applications with noise (DBSCAN) algorithm, which also enables the identification of outliers or interesting anomalies. A parallel and scalable DBSCAN implementation, based on MPI/OpenMP and the hierarchical data format (HDF), will be discussed in the context of interesting scientific datasets. The second technique the talk will address is 'classification', in which groups of datasets already exist and new data is checked to determine to which existing group it belongs. As one of the best out-of-the-box methods for classification, the support vector machine (SVM) algorithm, including kernel methods, will be a focus. A parallel and scalable SVM implementation, based on MPI, will be described in detail using a couple of challenging scientific datasets and smart feature extraction methods. Both aforementioned high performance computing algorithms will be compared with solutions based on a variety of high throughput computing techniques (e.g. map-reduce, Hadoop, Spark/MLlib) and serial approaches (R, Octave, Matlab, Weka, scikit-learn, etc.).
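To make the clustering technique concrete, here is a minimal, serial DBSCAN sketch (my own illustration; the talk discusses a parallel MPI/OpenMP implementation, and production code would use a spatial index instead of a full distance matrix). Points in dense regions are grouped into clusters, while sparse points keep the noise label -1, which is how DBSCAN exposes outliers:

```python
import numpy as np

def dbscan(points, eps, min_samples):
    """Minimal DBSCAN: returns per-point labels; -1 marks noise/outliers."""
    n = len(points)
    labels = np.full(n, -1)                 # -1 = noise until claimed
    visited = np.zeros(n, dtype=bool)
    # All pairwise distances (fine for small n; not scalable as-is)
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    cluster = 0
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        if np.sum(dists[i] <= eps) < min_samples:
            continue                        # i is not a core point
        labels[i] = cluster                 # grow a new cluster from i
        queue = list(np.flatnonzero(dists[i] <= eps))
        while queue:
            j = queue.pop()
            if not visited[j]:
                visited[j] = True
                neighbours = np.flatnonzero(dists[j] <= eps)
                if len(neighbours) >= min_samples:   # j is core: keep expanding
                    queue.extend(neighbours)
            if labels[j] == -1:
                labels[j] = cluster         # core or border point joins cluster
        cluster += 1
    return labels

# Two dense blobs plus sparse background noise (synthetic, arbitrary parameters)
rng = np.random.default_rng(0)
data = np.vstack([rng.normal((0, 0), 0.3, (100, 2)),
                  rng.normal((10, 10), 0.3, (100, 2)),
                  rng.uniform(-20, 30, (10, 2))])
labels = dbscan(data, eps=1.0, min_samples=5)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(n_clusters)
```

Note that the number of clusters is not specified in advance; it emerges from the density parameters `eps` and `min_samples`, which is exactly the "previously unknown subgroups" property described above.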
Dr.-Ing. Morris Riedel is an adjunct associate professor at the School of Engineering and Natural Sciences of the University of Iceland. He received his PhD from the Karlsruhe Institute of Technology (KIT) and began working on parallel and distributed systems in the field of scientific visualization and computational steering of e-science applications on large-scale HPC resources. He has held various positions at the Jülich Supercomputing Centre in Germany, where he is also head of a scientific research group focused on "High Productivity Data Processing" within the Federated Systems and Data division. Lectures he has given at universities such as the University of Iceland, the University of Applied Sciences of Cologne, and RWTH Aachen University include 'High Performance Computing & Big Data', 'Statistical Data Mining', 'Handling of Large Datasets', and 'Scientific and Grid Computing'. His current research focuses on high-productivity processing of big data in the context of scientific computing applications.
April 9, 2015: Chadlia Jerad (École Nationale des Sciences de l'Informatique and École Nationale d'Ingénieurs de Tunis, Tunisia) "Model Checking Architectural Descriptions: Software and Hardware" (slides)
The increasing complexity of both software and hardware systems leads to the need for techniques to formally check design properties. This is especially important for safety-critical systems. Architectural descriptions allow designers to focus on high-level aspects of a system by leaving aside details or, in better terms, by abstracting these details. It is precisely this abstraction that makes these models suitable for formal verification using model checking techniques. The presentation will give an overview of architecture description languages (mainly software, but also hardware), as well as model checking techniques. A particular focus will be placed on the use of Maude, an executable specification language based on rewriting logic, and its LTL model checker for describing and verifying properties.
Chadlia Jerad received her electrical engineering degree and her Master's degree from the National Engineering School of Tunis (ENIT), Tunisia. For her PhD, she worked on rewriting-logic-based formal verification of software architecture description languages. In 2009, she joined the National School of Computer Science, Tunisia, as an associate professor, where she now coordinates the "Embedded Systems and Software" specialization. She joined the OASIS Laboratory at ENIT in 2012. Her research interests include formal verification, architecture description (software and hardware), and MPSoC.
March 26, 2015: Benjamin Friedrich (MPI-PKS Dresden): The art of cell locomotion: swimming, steering, synchronization
We study how single cells swim and respond to sensory cues. Sperm cells can steer their swimming path in response to chemical cues released by the egg. I present a theory explaining how a simple feedback mechanism aligns the helical swimming paths of these cells with the direction of a concentration gradient. In the second part, I will show how purely mechanical forces stabilize the swimming gait of a swimming alga (Chlamydomonas). This alga swims like a breaststroke swimmer, with two flagella that must keep a common rhythm. These theories of biological motility control are confirmed by experiments performed by our experimental collaboration partners. We developed novel image and data analysis algorithms, inspired by theory, to analyze these experiments.
Benjamin Friedrich is a theoretical physicist who is curious about how biological systems work. He studied mathematics in Leipzig and Cambridge before moving to biological physics. For his PhD with Frank Jülicher on "How sperm find the egg", he was awarded the Otto Hahn Medal of the Max Planck Society. After a postdoc with Sam Safran in Israel, he is now an independent researcher at the Max Planck Institute for the Physics of Complex Systems.
February 17, 2015, 14:00, WIL A317, additional colloquium: Ana Lucia Varbanescu (University of Amsterdam) "The Landscape of Large Scale Graph Processing: a View from Amsterdam" (slides)
Mainly due to social and professional networks, graphs have regained a lot of popularity in the past five years. With their irregular structure and unpredictable performance, they are a nightmare for parallel systems and a serious challenge for distributed systems. In this talk, we sketch the landscape of large-scale graph processing platforms as it stands today, pointing out its peaks and valleys. We provide examples of interesting platforms, discuss their performance and programmability, and make recommendations for their usability. We give an overview of the challenges of benchmarking large-scale graph processing platforms and propose a systematic way to tackle them. We conclude by identifying gaps in the landscape that lead to promising new directions of research.
Ana Lucia Varbanescu is an assistant professor at the University of Amsterdam, The Netherlands. She obtained her PhD in 2010 from Delft University of Technology and continued as a postdoc researcher at VU University Amsterdam and Delft University of Technology. In 2013, she received the MacGillavry fellowship from the University of Amsterdam. From 2007 to 2014, she was a visiting researcher at IBM T.J. Watson (USA), NVIDIA (USA), BSC (Spain), and Imperial College London (UK). Her research interests include HPC, parallel programming, performance analysis and prediction, and big data analytics.
December 18, 2014: Michael Kuhn (BIOTEC, TU Dresden) "Using computational biology to uncover the evolution of tissues and their function"
Jellyfish, insects, mammals: all animals consist of different tissues and organs. On the molecular level, tissues differ in the abundance of proteins. Each protein is encoded in a gene and has certain functions, e.g. signal transmission, structural support, or the catalysis of chemical reactions. The set of proteins that occur in a tissue gives the tissue its identity. Thanks to technical advances in recent years, there has been a steady increase in the number of datasets that measure the expression of genes in multiple tissues. I will present a method to compare these datasets across very different species, and will show some conclusions about the evolution of genes and tissues.
Michael Kuhn is a postdoc at the BIOTEC (TU Dresden) in the labs of Andreas Beyer (now in Cologne) and of Tony Hyman at the Max Planck Institute of Molecular Cell Biology and Genetics. He is currently working on tracing the molecular evolution of centrosome components and their function, and on integrating data from tissue-specific gene expression screens across many species. After undergraduate studies in computer science at Caltech, he obtained a PhD in the lab of Peer Bork at the EMBL Heidelberg.
October 23, 2014: Sandra Gesing (University of Notre Dame, Indiana, USA) "Distributed Job, Workflow and Data Management - Science Gateways as Solution to Rule Them All" (slides)
The next era of science gateways has begun. By definition, science gateways provide a single point of entry to distributed job, workflow, and/or data management across organizational boundaries in an intuitive manner. For the first time in the history of such solutions, providers of HPC, grid, and cloud infrastructures report that more of their resources are accessed via science gateways than via the command line. Additionally, novel developments in web-based technologies and agile web frameworks support developers in efficiently creating web-based science gateways. The talk will give an overview of complete science gateway frameworks as well as APIs and their features. It will suggest criteria to help developers decide which technology is suitable for a given community, taking into account the community's preferred tools, methods, and available data.
Prof. Dr. Sandra Gesing is a research assistant professor at the University of Notre Dame. She received her PhD in computer science from the University of Tübingen and her German diploma in computer science through extramural studies at the FernUniversität Hagen. Prior to her position at Notre Dame, she worked as a research associate at the University of Edinburgh, and she has many years of experience in industry. Her research interests include science gateways, especially for bioinformatics applications, grid and cloud computing, and parallel programming.
July 24, 2014: Jens Krüger (Center for Bioinformatics Tübingen) "Scientific Gateways for High Performance Resources"
Current research in the natural sciences often relies on computer-aided data analysis and simulations; hence the availability of compute resources and storage is essential. The presentation will cover the full range from scientific motivation and problem description to the computational resource requirements and scaling behavior of individual applications. The real-life example illustrating these aspects will be molecular dynamics simulations of ion channels, highlighting the computational challenges. Special emphasis will be put on the development of science gateways that facilitate the use of high-performance resources for natural scientists. Experiences with using such gateways in university teaching will also be presented.
Dr. Jens Krüger received his diploma and PhD in chemistry from the University of Paderborn. After several years in the biophotonics department of National Yang-Ming University, Taipei, he moved back to Germany. After a brief return to Paderborn, he is now at the Eberhard Karls University of Tübingen, working on his habilitation in bioinformatics. His research interests cover simulations of membrane proteins and ion channels as well as high performance computing and science gateways.
June 12, 2014: Andreas Both (Unister GmbH) "Interdisziplinäre Herausforderungen bei der Entwicklung von Big-Data-Anwendungen" (Interdisciplinary Challenges in Developing Big Data Applications) (slides)
Big data is on everyone's lips. The rough classification of the resulting challenges in the form of the three Vs (volume, variety, velocity) is widely known; at the same time, a holistic view is rare. The talk presents observations from handling big data in industrial applications, focusing in particular on the problems on the way towards smart data and search-driven Web 3.0 applications. The goal is to derive the challenges for industry and academia, based on experience from developing data-driven Internet applications and from research projects in a big data context (e.g. FP7 GeoKnow or VISEA), and in particular to discuss their interdisciplinary character.
Dr. Andreas Both is Head of Research and Development at UNISTER GmbH. He received his doctorate from the Martin-Luther-Universität Halle-Wittenberg in the field of software engineering. At Unister, his work focuses on innovative Internet technologies and applications, in particular data-driven applications, information retrieval, knowledge management, data analysis, and search functionality of all kinds.
May 22, 2014: Chris Schläger (Amazon) "Developing the OS for the world's largest cloud"
Amazon Web Services (AWS) is the leading provider of cloud services. Amazon Linux is the operating system that forms, on the one hand, the basis for the central services that power AWS; on the other hand, it is the OS that we provide to our internal and external EC2 customers for their instances. The presentation will cover some of the challenges we encounter in developing and operating an OS infrastructure at the scale of AWS.
Chris Schläger is a Managing Director of the Amazon Development Center (Germany) and manages Amazon's worldwide kernel and operating systems teams. Prior to Amazon, he ran the AMD Operating System Research Center (OSRC). Prior to AMD, he was VP of Linux Distributions at Novell and SUSE.
April 24, 2014: Johanna Vompras (Universität Bielefeld) "Hochschulweites Forschungsdatenmanagement der Universität Bielefeld" (University-wide Research Data Management at Bielefeld University) (slides)
In recent years, Bielefeld University, with the participation of the University Library, has established pioneering services for research data management: for planning and continuously documenting the handling of research data in line with current funding requirements, researchers have access to a tool for creating and managing data management plans. For researchers who cannot fall back on adequate services for publishing their data, the university's institutional repository has been extended by a research data type. Publication takes personal and corporate interests into account and is carried out under binding licence terms. DOI registration via DataCite ensures persistent referencing. At the same time, data from disciplinary infrastructures are automatically aggregated and contextualized with their associated publications and projects. The talk presents both the strategic and organizational measures for building a research data infrastructure and the experiences of users from the university's profile areas with the services already established. Further information: data.uni-bielefeld.de
Johanna Vompras studied "Naturwissenschaftliche Informatik" (informatics in the natural sciences), with an emphasis on data mining, pattern recognition, and database systems, at the Faculty of Technology of Bielefeld University. In her current position at Bielefeld University Library, she coordinates, within the project Informium, the pilots that both aim at implementing discipline-specific requirements and provide impulses for the ongoing development of research data services at the university.
March 27, 2014: Pavel Tomancak (MPI-CBG Dresden) "Open Source Tools for Biological Image Analysis"
I will discuss Open Source tools designed to facilitate modern biological research. The application examples will include image stitching, processing of multi angle microscopy data, deconvolution, registration of serial section microscopy data and visualisation of terabyte sized datasets. All these tools are available and actively developed under the Fiji (Fiji Is Just ImageJ) platform for Open Source biological image analysis. I will discuss our vision of how the productive collaboration between biology and computer science research communities could work in the exciting and dynamically growing area of biological imaging.
Pavel Tomancak is a research group leader at the Max Planck Institute of Molecular Cell Biology and Genetics, Dresden. He did his PhD at the EMBL Heidelberg in the laboratory of Dr. Anne Ephrussi, finishing in 1999. From 2000 to 2004, he was a postdoc at the Department of Molecular and Cell Biology of the University of California, Berkeley, in the laboratory of Dr. Gerald M. Rubin.
March 12, 2014, 15:00, WIL C207: additional colloquium with Hartmut Petzold (Deutsches Museum, Munich) "Nikolaus Joachim Lehmann, Pionier und Missionar des Computers in der DDR" (Nikolaus Joachim Lehmann, Pioneer and Missionary of the Computer in the GDR)
Nikolaus Joachim Lehmann (1921-1998), a long-serving professor at the TH/TU Dresden, was among the first on the European continent to recognize the significance of the electronic computer and to plan and organize the development of a machine of his own. From the outset he integrated the project into teaching at the TH, and he also campaigned in industry and with the political leadership for the production, application, and dissemination of the computer in the GDR.
The talk presents the guiding principles followed in a comprehensive and as yet unpublished account of N. J. Lehmann's scientific activities. To illustrate the complexity of the changing historical situations, some aspects of his education and of his efforts to spread the computer up to the early 1960s are examined in more detail.
Hartmut Petzold, born in 1944, studied electrical engineering and general history in Berlin (West). After working as an engineer and as a research associate at the Institute for the History of Science and Technology at TU Berlin, he was curator for mathematical instruments and computer science at the Deutsches Museum in Munich from 1988 to 2009.
January 23, 2014: Yury Oleynik (TU München) "Automatic Characterization of Performance Dynamics with Periscope" (slides)
Scientific applications performing complex numeric simulations may run for very long times. The performance of such applications is subject to significant runtime dynamics and may degrade over time, resulting in bottlenecks. In this scenario, performance analysis tools that can support application developers in pinpointing such degradations become crucially important. However, the tools themselves face scalability challenges along the time dimension, in particular in the size of the collected measurements and in their visualization and analysis.
This talk focuses on the challenge of analysis and, more specifically, on the automatic characterization of temporal performance variability. A new analysis strategy extending the performance analysis tool Periscope is presented. It adds temporal resolution to the existing analysis capabilities and automatically searches for dynamic properties of performance bottlenecks. The strategy utilizes signal processing algorithms to detect intervals of significant performance variability and performs both quantitative and qualitative characterization of degradation trends. The result of this analysis is a list of detected performance properties detailing the location, severity, and a high-level description of the temporal performance dynamics.
Yury Oleynik is a scientist at the Chair of Computer Architecture at Technische Universität München. His research lies in the field of performance analysis tools for high performance computing. He graduated from Technische Universität München with a Master of Science degree in Computational Science and Engineering and from Bauman Moscow State Technical University with a diploma in biomedical engineering.
Past years
History 2013
History 2012
History 2011
History 2010
History 2009
History 2008
History 1998 - 2007