Thesis Topics
The chair of Privacy and Security is constantly offering new topics for student theses. Most topics can be adapted, no matter whether you are looking for a Bachelor's or Master's thesis, a Diplom-/Belegarbeit, or a research-related module.
In the following, you can find a list of potential supervisors at our chair together with a link to the topics currently offered by each person:
- Stefan Köpsell
- Diversity of virtualisation solutions
In the area of modern telecommunication networks (5G etc.), virtualisation and software components are increasingly being used. Therefore, there are now a number of (open source) projects that provide the necessary basic infrastructure, such as:
https://www.onap.org/
https://www.o-ran.org/software
https://osm.etsi.org/
https://gitlab.com/sylva-projects/sylva
https://www.starlingx.io/
etc.
The aim of the work is to give an overview of existing projects and their technological basis and architecture. Based on this, it should be analysed from a security perspective to what extent the different projects can be seen as solutions providing diversity or to what extent they form a monoculture (for example, all projects being based on Linux). These analyses should be used to assess which components are particularly worthwhile targets from an attacker's point of view, i.e. which attacks or manipulations would affect the largest possible number of virtualisation solutions.
- Enhancing Secure Key Exchange based on Physical Layer Security
The task is to extend an existing prototype for secure key exchange based on physical layer security. Physical layer security means that the secret key is derived from the physical characteristics of the wireless channel between Alice and Bob. Therefore, the channel is measured, and the measurements are processed (quantisation and error correction) to derive the final key.
The existing prototype only deals with a semi-static setting, e.g. Alice has a static position and only Bob is moving (realised by a LEGO robot). The first extension would therefore be to allow a fully dynamic setting, i.e. both Alice and Bob are moving. The second enhancement is related to the currently used quantisation and error correction algorithms. Based on previous work and knowledge from the literature, more efficient algorithms should be implemented to improve the overall efficiency of the key generation. The final result should be evaluated with respect to the achieved key generation rate.
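To make the evaluation target concrete, the following minimal sketch (not the existing prototype; all parameters are illustrative assumptions) simulates reciprocal channel measurements for Alice and Bob, quantises them with a simple mean threshold, and reports the bit disagreement rate that reconciliation would later have to correct:

import numpy as np

rng = np.random.default_rng(0)

n = 1000                                   # number of channel probes (assumed)
channel = rng.normal(0.0, 1.0, n)          # common reciprocal channel gain
alice = channel + rng.normal(0.0, 0.1, n)  # Alice's noisy measurement
bob = channel + rng.normal(0.0, 0.1, n)    # Bob's noisy measurement

def quantise(x):
    # 1-bit quantisation against the mean of the measurement block
    return (x > x.mean()).astype(np.uint8)

key_alice, key_bob = quantise(alice), quantise(bob)
disagreement = np.mean(key_alice != key_bob)
print(f"bit disagreement rate before reconciliation: {disagreement:.3f}")
print("raw key generation rate: 1 bit per channel probe (before reconciliation)")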
- Evaluation of Key Exchange based on ML-based Physical Layer Security
The task is to implement and evaluate a known ML-based mechanism for secure key exchange based on physical layer security. Physical layer security means that the secret key is derived from the physical characteristics of the wireless channel between Alice and Bob. Therefore, the channel is measured, and the measurements are fed to an ML-based algorithm, which outputs the key stream.
There currently exist a concept and a Python-based implementation of this ML-based key generation algorithm. The task would be to implement this algorithm in a real-world prototype allowing some real-world evaluation. The foundation will be Coral.AI development boards (https://coral.ai/products/dev-board/). They are fed with the wireless measurements and should generate the derived key bits. Before the ML algorithm can actually generate the key stream, it needs to be trained on real-world measurements. Therefore, part of the task is to collect these measurements and train the ML algorithm offline. Afterwards, the trained model should be executed on the Coral.AI board to derive the actual key stream. The overall performance in terms of computational overhead and key generation rate should be evaluated.
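Purely as an illustration of the offline-training and on-device-execution workflow (the model architecture, training data, and file name below are placeholders, not the existing concept or implementation), a Keras model could be trained on collected measurements and then converted to a fully int8-quantised TensorFlow Lite model, which is the format required before compiling a model for the Coral Edge TPU:

import numpy as np
import tensorflow as tf

# Placeholder training data: channel measurements mapped to key bits.
x_train = np.random.rand(1024, 64).astype(np.float32)
y_train = np.random.randint(0, 2, size=(1024, 16)).astype(np.float32)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(16, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x_train, y_train, epochs=5, verbose=0)   # offline training step

def representative_dataset():
    # A small set of real measurements is needed to calibrate int8 quantisation.
    for sample in x_train[:100]:
        yield [sample.reshape(1, -1)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("keygen_int8.tflite", "wb") as f:
    f.write(converter.convert())
# The resulting model would then be compiled with the Edge TPU compiler
# ("edgetpu_compiler keygen_int8.tflite") and executed on the Coral board.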
- Improving resilience of 5G by reducing complexity
5G networks are complex distributed systems involving many services (usually executed as virtualised network functions). This complexity increases the risk of failures or successful attacks and in general has a negative impact on resilience.
The task would be to define the minimal necessary set of 5G components as well as the minimal set of interfaces and functions these components need to provide in order to implement a given 5G service, e.g. transmitting SMS or allowing data or voice communication. Based on this minimal set of functionality, enhancements to increase the overall security and resilience should be proposed. The related design proposal should be evaluated based on an existing Open Source 5G testbed.
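One possible way to organise such an analysis, sketched here with assumed and deliberately simplified dependencies between network functions (not a normative 3GPP mapping), is to model the dependency graph and compute the transitive closure of functions required for a given service:

# Assumed entry points and dependencies; both dictionaries are illustrative only.
SERVICE_ENTRYPOINTS = {
    "sms": {"AMF", "SMSF"},
    "data": {"AMF", "SMF", "UPF"},
}
DEPENDS_ON = {
    "AMF": {"AUSF", "UDM", "NRF"},
    "SMF": {"UDM", "NRF"},
    "SMSF": {"UDM", "NRF"},
    "AUSF": {"UDM"},
    "UPF": set(), "UDM": set(), "NRF": set(),
}

def minimal_set(service: str) -> set[str]:
    # Transitive closure of the network functions needed for one service.
    needed, todo = set(), list(SERVICE_ENTRYPOINTS[service])
    while todo:
        nf = todo.pop()
        if nf not in needed:
            needed.add(nf)
            todo.extend(DEPENDS_ON.get(nf, ()))
    return needed

print("SMS:", sorted(minimal_set("sms")))
print("Data:", sorted(minimal_set("data")))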
- Improving resilience of 5G by trustworthy virtualisation
5G networks are distributed systems consisting of a set of virtualised network functions. The task is to enhance the overall security and resilience of such a 5G system based on the features a trustworthy virtualisation environment might provide. These features are, for example: strict separation, monitoring, resource usage limitation, migration and restart, remote attestation, etc.
The proposed design should be evaluated with the help of an existing Open Source 5G testbed.
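As a rough illustration of the kind of features such an environment might expose, assuming a container-based virtualisation layer and the docker Python SDK (the image name and the limits are placeholders), resource usage limitation, automatic restart, and basic monitoring could be prototyped as follows:

import docker

client = docker.from_env()

vnf = client.containers.run(
    "open5gs/amf:latest",                 # hypothetical VNF image name
    detach=True,
    mem_limit="512m",                     # strict resource usage limitation
    nano_cpus=500_000_000,                # limit the VNF to half a CPU core
    restart_policy={"Name": "on-failure", "MaximumRetryCount": 3},
)

stats = vnf.stats(stream=False)           # one-shot monitoring sample
print(stats["memory_stats"].get("usage"), "bytes of memory in use")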
- Confidential Computing for 5G
5G networks are distributed systems consisting of a set of virtualised network functions (VNFs). In principle, these VNFs could be executed in a public cloud. This would introduce IT security risks in the case of malicious cloud operators. Therefore, the task is to design a solution based on confidential computing. A possible foundation could be frameworks like Gramine (https://gramineproject.io/) or SCONE, which allow components to be executed within hardware-based trusted execution environments. The proposed solution should be evaluated with the help of an existing Open Source 5G testbed.
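A very simplified sketch of the underlying trust model (the evidence fields and the verification function are placeholders; in practice they would be provided by the attestation machinery of the chosen framework rather than hand-written): secrets such as subscriber keys are released to a VNF only after remote attestation succeeds:

from dataclasses import dataclass

@dataclass
class AttestationEvidence:
    # Hypothetical, simplified fields of an attestation quote/report.
    enclave_measurement: str
    signer_identity: str

EXPECTED_MEASUREMENT = "a3f9..."   # reference value of the audited VNF build (placeholder)

def verify_attestation_evidence(evidence: AttestationEvidence) -> bool:
    # Placeholder check: accept only the expected enclave measurement.
    return evidence.enclave_measurement == EXPECTED_MEASUREMENT

def provision_secret(evidence: AttestationEvidence, secret: bytes):
    # Release the secret to the enclave only if attestation succeeds;
    # in reality the secret would be sent over an attested TLS channel.
    if not verify_attestation_evidence(evidence):
        return None                # untrusted environment: withhold the secret
    return secret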
- Sebastian Rehms
- please contact by e-mail
- Elke Franz
- please contact by e-mail
- Ghazal Bagheri
- Entropy Estimation
Due to the broadcast nature of the wireless medium, communication between terminals in a wireless environment is often susceptible to eavesdropping, message modification, and node impersonation. Secret keys have thus been used to protect the confidentiality, integrity, and authenticity of messages and nodes. Physical layer security, as an alternative to traditional cryptographic schemes, has attracted the attention of many researchers.
Channel reciprocity based key generation (CRKG) methods exploit reciprocal channel characteristics as a source of randomness, allowing the two legitimate communication partners to locally generate secret keys at their terminals. In this setting, the two legitimate partners and the eavesdropper in a communication system obtain different observations of a shared random source. These observations, obtained from the estimated channel, are assumed to be reciprocal, independent and identically distributed (i.i.d.), and of high entropy. Under the assumption of an authenticated and noiseless public channel, the legitimate parties can derive a shared secret key by exchanging messages while the observations remain unknown to the eavesdropper.
In particular, channel-based SKG typically consists of four main operations: randomness sharing, advantage distillation, information reconciliation, and privacy amplification. Randomness sharing is an indispensable part of the key generation process. In this step, the commonly continuous measured channel profiles are quantized into bit vectors to obtain initial preliminary key material. This step plays a crucial role in meeting the requirements the CRKG assumptions place on the generated keys, such as being independent and identically distributed (i.i.d.) and of high entropy. Information reconciliation is then used to correct key mismatches through the exchange of messages (e.g., parity bits) for error detection or correction, and privacy amplification is used to eliminate the information that may have been leaked to the eavesdropper (Eve) during the message exchange.
During the whole procedure of secret key generation, measuring the randomness of the common source, the mutual information between the two legitimate partners, and the information leakage to the eavesdropper is crucial for designing a secure key generation scheme. Hence, finding a solution to the problem of estimating the entropy is an interesting issue for many CRKG experts. Current approaches to estimating the mutual information between the legitimate partners or the information leakage to the eavesdropper are based on machine learning methods such as KNN-based density estimators, neural network estimators, and non-parametric methods in which a kernel function is used for the estimator.
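As a small illustration of such estimators, assuming simulated channel measurements (a sketch only, not a proposed solution), scikit-learn's KNN-based mutual information estimator and a simple plug-in entropy estimate of the quantized key bits could be used as follows:

import numpy as np
from scipy.stats import entropy
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
channel = rng.normal(size=5000)                              # shared random source
alice = channel + rng.normal(scale=0.2, size=channel.size)   # Alice's observation
bob = channel + rng.normal(scale=0.2, size=channel.size)     # Bob's observation

# KNN-based estimate of the mutual information I(Alice; Bob), in nats.
mi = mutual_info_regression(alice.reshape(-1, 1), bob, n_neighbors=5)[0]
print(f"estimated mutual information: {mi:.3f} nats")

# Plug-in (histogram) entropy of Alice's 1-bit quantized key material.
bits = (alice > alice.mean()).astype(int)
_, counts = np.unique(bits, return_counts=True)
print(f"estimated key-bit entropy: {entropy(counts, base=2):.3f} bits per bit")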
Our main goal is to survey the existing methods and identify state-of-the-art approaches with respect to our intended problem, which is generating secret keys in IoT scenarios. Hence, our aim is to focus on low-complexity approaches with high accuracy, so that we can find a solution suited to the scheme and improve its performance.
- Privacy Amplification
Approaches toward wireless security in IoT tend to shift from traditional cryptographic approaches to physical layer security (PLS). For cryptographic methods that attempt to achieve security at the higher layers of the protocol stack, reaching the goal of low computational cost while maintaining a high level of security is challenging. PLS approaches provide new techniques which offer lightweight solutions for security problems by exploiting the features of wireless propagation. In this sense, channel reciprocity based key generation (CRKG) methods, which address the security goals of wireless communication, have been an interesting research area in recent years.
The procedure of key establishment takes four main steps: randomness sharing, advantage distillation, information reconciliation, and privacy amplification. Randomness sharing is an indispensable part of the key generation process. In the advantage distillation step, the commonly continuous measured channel profiles are quantized into bit vectors to obtain initial preliminary key material. Then, in the information reconciliation step, information is exchanged over public discussion to obtain identical fixed-length key strings, called the reconciled key, by correcting errors within the raw key string. Once information reconciliation has been performed, privacy amplification cannot be avoided. In practice, the randomness of the generated key should be extensively tested, which is generally time-consuming.
Privacy amplification is an essential step for removing the redundancy of the key, for which the amount of information leaked to potential eavesdroppers is often difficult to determine. Privacy amplification has become indispensable for guaranteeing the security of the generated secret key. A common method of privacy amplification is to compress a key string U into a shorter key U′ using a universal class of hash functions, thereby removing the information leaked to Eve. In this way, the secure key is obtained. The hash function is usually chosen as a Toeplitz matrix whose elements are 0 or 1. Since the reconciled key is usually very long, the Toeplitz matrix becomes very large. Due to the large matrix, the implementation of privacy amplification consumes a considerable amount of storage resources and slows down the computation.
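For illustration, a minimal sketch of Toeplitz-based privacy amplification under these assumptions (random test data, illustrative key lengths, multiplication over GF(2)) could look as follows; note that the m x n Toeplitz matrix is fully described by m + n - 1 random bits:

import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(2)

n = 1024   # length of the reconciled key (assumed)
m = 256    # target key length after removing leaked information (assumed)

reconciled_key = rng.integers(0, 2, size=n)

col = rng.integers(0, 2, size=m)   # first column of the Toeplitz matrix
row = rng.integers(0, 2, size=n)   # first row of the Toeplitz matrix
row[0] = col[0]
T = toeplitz(col, row)             # m x n binary Toeplitz matrix

secure_key = (T @ reconciled_key) % 2   # compression U -> U' over GF(2)
print(f"{secure_key.size} secure key bits, e.g. {secure_key[:16]}")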
Our main goal is to carry out a comprehensive study of the existing methods of privacy amplification in secret key generation and to find a method with high efficiency in terms of complexity, preferably one that can be executed as a real-time procedure.
- Quantization
Approaches toward wireless security in IoT tend to shift from traditional cryptographic approaches to physical layer security (PLS). For cryptographic methods that attempt to achieve security at the higher layers of the protocol stack, reaching the goal of low computational cost while maintaining a high level of security is challenging. PLS approaches provide new techniques which offer lightweight solutions for security problems by exploiting the features of wireless propagation. In this sense, channel reciprocity based key generation (CRKG) methods, which address the security goals of wireless communication, have been an interesting research area in recent years.
The main idea of CRKG is to derive symmetric keys from reciprocal measurements of the wireless channel in the presence of an attacker. For this purpose, the two legitimate partners and the eavesdropper in a communication system obtain different observations of a shared random source. These observations, obtained from channel estimation, are assumed to be reciprocal, independent and identically distributed (i.i.d.), and of high entropy. Under the assumption of an authenticated and noiseless public channel, the legitimate parties can derive a shared secret key by exchanging messages while the observations remain unknown to the eavesdropper. In general, there are two approaches to implementing CRKG: based on the received signal strength indicator (RSSI) or on channel state information (CSI). Unlike RSSI, which uses a single value to characterise the transmission, CSI contains more valuable information about the transmission channel; therefore, our aim is to use the CSI approach for the CRKG implementation.
The procedure of key establishment takes four main steps: randomness sharing, advantage distillation, information reconciliation, and privacy amplification. Randomness sharing is an indispensable part of the key generation process. In this step, the commonly continuous measured channel profiles are quantized into bit vectors to obtain initial preliminary key material. This step plays a crucial role in meeting the requirements the CRKG assumptions place on the generated keys, such as being independent and identically distributed (i.i.d.) and of high entropy. In this regard, the aim of quantization is to preserve randomness and entropy in order to compensate for their reduction in the subsequent CRKG steps, which is caused by information being revealed during message exchange. By extracting channel properties from the measured data, quantization facilitates the generation of independently distributed secret keys.
A number of algorithms have been introduced for the quantization phase, which can be divided into lossless and lossy schemes. The lossy schemes apply a guard interval [q−, q+], where q+ and q− denote the upper and lower threshold, respectively; samples falling within this interval are dropped during quantization. Compared to lossless algorithms, which feed all available information about the channel into the quantization procedure, lossy schemes rely only on strong channel characteristics. Hence, finding robust parameters for the measurement data in this interval will affect the final performance of the scheme.
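For illustration, a minimal sketch of such a lossy, guard-interval quantizer on assumed CSI magnitude samples (thresholds, data, and the dropping rule are illustrative; in practice both parties would also agree on the indices of the dropped samples):

import numpy as np

def guard_interval_quantise(csi, alpha=0.5):
    # Returns (bits, kept_indices) for one block of CSI magnitude samples.
    mean, std = csi.mean(), csi.std()
    q_upper = mean + alpha * std          # upper threshold q+
    q_lower = mean - alpha * std          # lower threshold q-
    kept = np.flatnonzero((csi > q_upper) | (csi < q_lower))
    bits = (csi[kept] > q_upper).astype(np.uint8)
    return bits, kept

rng = np.random.default_rng(3)
csi = np.abs(rng.normal(size=200) + 1j * rng.normal(size=200))   # simulated CSI magnitudes
bits, kept = guard_interval_quantise(csi)
print(f"kept {kept.size} of {csi.size} samples -> {bits.size} key bits")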
Our goal is to improve a quantization scheme with the following aims:
· Fulfilling the necessary prerequisites for use in CRKG methods
· Performing quantization on the CSI
· Analyzing and evaluating the common performance metrics in this field (such as maximizing the entropy of the quantizer output between the legitimate partners and the eavesdropper, minimizing quantization delays while maintaining low computational complexity, etc.) compared to other existing methods.