Stochastic Modeling and Simulation
Upon completing the module, students master the basics of stochastic modeling and simulation. We first discuss discrete-time models, followed by two classic examples, and then continuous-time models.
Contents
- Conditional probabilities, normal distributions, and scale-free distributions
- Markov chains and their matrix representation, mixing times, and Perron-Frobenius theory
- Applications of Markov chains, such as the PageRank algorithm
- Monte Carlo methods: convergence, law of large numbers, variance reduction, importance sampling, Markov chain Monte Carlo using Metropolis-Hastings and Gibbs samplers
- Random processes and Brownian motion: properties in 2, 3, and more dimensions, connection to the diffusion equation, Lévy processes and anomalous diffusion
- Stochastic differential equations (SDEs): nonlinear transformations of Brownian motion (Itô calculus), the Ornstein-Uhlenbeck process and other solvable equations
- Examples from population dynamics, genetics, protein kinetics, etc.
- Numerical simulation of SDEs: strong and weak error, Euler-Maruyama scheme, Milstein scheme
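As a flavour of the matrix view of Markov chains listed above, the following minimal sketch (illustrative only, not official course material; the transition probabilities are made up) computes the stationary distribution of a small chain by power iteration, the same idea that underlies PageRank:

    import numpy as np

    # Row-stochastic transition matrix of a 3-state Markov chain:
    # P[i, j] = probability of moving from state i to state j.
    P = np.array([[0.9, 0.1, 0.0],
                  [0.2, 0.7, 0.1],
                  [0.0, 0.3, 0.7]])

    pi = np.ones(3) / 3              # start from the uniform distribution
    for _ in range(1000):            # power iteration: pi <- pi P
        pi = pi @ P
    print(pi)                        # approximate stationary distribution

For an irreducible, aperiodic chain such as this one, Perron-Frobenius theory guarantees that the iteration converges to the unique stationary distribution.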
Topic Prerequisites
Working knowledge of computer programming in any language (e.g., Matlab, Python, Java), basic knowledge of classical physics, and solid undergraduate knowledge of calculus, probability, and statistics.
Program / Module
M.Sc. Computational Modeling and Simulation
Module: CMS-COR-SAP - Stochastics and Probability
Format
2 SWS lecture, 1 SWS exercise, 1 SWS tutorial, self-study
5 credits
Registration to the course
For students of the Master program "Computational Modeling and Simulation": via CampusNet SELMA
Teachers
Lecture: Dr. Abhishek Behera, Mr. Serhii Yaskovets & Prof. Ivo F. Sbalzarini
Exercises: Mr. Mohammad-Hadi Salehi & Dr. Nandu Gopan
Instruction language: ENGLISH
Script
Lecture notes are available as PDF here.
Suggested Literature
Feller: An Introduction to Probability Theory and Its Applications, Wiley & Sons, 1957.
Robert & Casella: Monte Carlo Statistical Methods, Springer, 2004.
Winter Term 2024/25
First Lecture on 21/10/2024
First Tutorial on 24/10/2024
Lecture: 4 DS (13:00-14:30), Mondays at HSZ/403
Exercises / Tutorials: 4 DS (13:00-14:30), Thursdays at FOE/0244
LECTURES AND EXERCISES WILL BE HELD IN PERSON FOR THE WHOLE SEMESTER, SUPPORTED BY ONLINE VIDEO RECORDINGS
- Link to the videos in OPAL: https://bildungsportal.sachsen.de/opal/auth/RepositoryEntry/32365445134
The exam in winter 2023/24 will be in-person and written:
Exam Location: ZEU/250/Z
Exam Date and Time: Monday, 5 Feb 2024, 07:30-09:00
Please be present at the examination hall at least 15 minutes in advance (i.e., by 07:15) so that the exam instructions can be read out.
At the exam, the following may be used:
- 4 A4 sheets (8 pages if printed double-sided) of personal summary notes. We recommend writing the summary by hand, but it may also be machine-written; in the latter case, the font size must be 8 points or larger throughout.
- A standard pocket calculator (devices with network or Bluetooth access, as well as devices capable of storing and displaying documents, are not allowed)
Items not adhering to these guidelines will be confiscated in their entirety at the beginning of the exam.
Please urgently observe and adhere to the following:
- Proper registration to the exam in SELMA is mandatory. Registration to the course does NOT imply registration to the exam. You can de-register from the exam without giving a reason up to 1 day BEFORE the exam.
Registration to the exam:
- For students of the Master program Computational Modeling and Simulation: via CampusNet SELMA
- For students of other degree programs: via your respective examination office
- For ERASMUS and exchange students: via the Computer Science examination office
- Please bring an official photo ID to the exam and be ready to show it to the examiner in order to identify yourself.
Students not adhering to all of the above will be excluded from the exam.
Grade scale:
All exams are graded in absolute terms w.r.t. the following pre-defined grade scale that remains constant over the years:
- The top grade of 1.0 is reached with 80% of the maximum possible points
- Half of that, i.e., 40% of the maximum possible points, are required to pass
- Below 40%, or no-show, is a fail.
Between the top grade and the passing threshold, the grading scale is linear. In the end, grades are rounded to the nearest allowed grade according to the exam regulations: 1.0, 1.3, 1.7, 2.0, 2.3, 2.7, 3.0, 3.3, 3.7, 4.0, 5.0. The grades 0.7, 4.3, and 4.7 are not allowed. Any grade above 4.1 is a fail (see exam regulations).

The maximum number of points that can be reached in the exam is given by the number of minutes the exam lasts (i.e., a 90-minute exam yields a maximum of 90 points). Points are distributed among the exam questions to reflect the number of minutes a good student would need to solve the problem. This provides some guidance for your time management in the exam.

In order to reduce the risk of correction mistakes, all exams are checked by at least two independent, qualified assessors (typically professors or teachers with officially conferred examination rights). The exam review session (see below) is for you to come and look at your exam paper and report any correction mistakes you find.
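As a worked illustration of this scale (a sketch of the rules stated above, not an official conversion tool; in particular, the tie-breaking towards the better grade is an assumption):

    ALLOWED = [1.0, 1.3, 1.7, 2.0, 2.3, 2.7, 3.0, 3.3, 3.7, 4.0]

    def grade(points, max_points):
        pct = 100.0 * points / max_points
        if pct < 40.0:
            return 5.0                                   # below the passing threshold: fail
        if pct >= 80.0:
            return 1.0                                   # top grade
        raw = 4.0 - 3.0 * (pct - 40.0) / 40.0            # linear between 4.0 (at 40%) and 1.0 (at 80%)
        return min(ALLOWED, key=lambda g: abs(g - raw))  # round to the nearest allowed grade

    print(grade(54, 90))   # 60% of the points -> raw grade 2.5 -> rounded to 2.3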
Exam Review winter term 2022/23
TBA
Lecture Schedule
- Oct 09, 2023: Lecture 0 - Introduction to the Course and Organization
- Oct 16, 2023: Lecture 1 - Probability refresher, conditional probabilities, Bayes' rule, random variables, discrete and continuous probability distributions, scale-free distributions
- Oct 23, 2023: Lecture 2 - Transformation of random variables, pseudo- and quasi-random numbers, low discrepancy sequences, transformation algorithms: inversion, Box-Muller, accept-reject method, composition-rejection method
- Oct 30, 2023: Lecture 3 - Discrete-time stochastic processes, discrete Markov chains and their matrix representation
- Nov 06, 2023: Lecture 4 - Law of large numbers, Monte Carlo methods, example: MC integration, importance sampling
- Nov 13, 2023: Lecture 5 - Monitoring variance, variance reduction
- Nov 20, 2023: Lecture 6 - Rao-Blackwell, Markov Chain Monte Carlo (MCMC), detailed balance, convergence criteria, acceleration methods
- Nov 27, 2023: Lecture 7 - Classic MCMC samplers 1: Gibbs sampling
- Dec 04, 2023: Lecture 8 - Classic MCMC samplers 2: Metropolis-Hastings, convergence diagnostics, stopping conditions
- Dec 11, 2023: Lecture 9 - Random Walks, Brownian motion in 1,2,3,n-dim, connection to diffusion, continuum limit of random walks
- Jan 8, 2024: Lecture 10 - Monte-Carlo optimization: stochastic gradient descent, simulated annealing, evolution strategies, CMA-ES
- Jan 15, 2024: Lecture 11 - Stochastic calculus, Itô calculus, Ornstein-Uhlenbeck process (analytical)
- Jan 22, 2024: Lecture 12 - Numerical methods for SDEs: Euler-Maruyama, Milstein, strong and weak convergence (see the sketch after this list)
- Jan 29, 2024: Lecture 13 - Master equation, Fokker-Planck, Kolmogorov forward, Example: chemical kinetics
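As a minimal illustration of the numerical scheme from Lecture 12 (a sketch with made-up parameters, not the course's reference implementation), the Euler-Maruyama discretization of the Ornstein-Uhlenbeck SDE dX = -theta X dt + sigma dW reads:

    import numpy as np

    rng = np.random.default_rng(0)
    theta, sigma = 1.0, 0.5          # illustrative drift and noise parameters
    T, n = 10.0, 1000                # time horizon and number of steps
    dt = T / n

    x = np.empty(n + 1)
    x[0] = 1.0                       # initial condition X(0)
    for k in range(n):
        dW = rng.normal(0.0, np.sqrt(dt))                   # Brownian increment ~ N(0, dt)
        x[k + 1] = x[k] - theta * x[k] * dt + sigma * dW    # Euler-Maruyama update
    print(x[-1])                     # endpoint of one sample path

Since the noise here is additive, the Milstein correction term vanishes and the Milstein scheme coincides with Euler-Maruyama for this equation.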