Student Area


This is the right page for you if you are interested in or enrolled in one of my current or upcoming courses.

It is also the right page if you are looking for a bachelor's or master's project.

For completeness' sake, here is also a list of past (and potentially future) courses, mainly for the master's programme.

Office hours are before/after the course, or by arrangement.


Current/upcoming courses

If you think that one of the past courses should actually be listed here, let me know!

Lecture: Linear Algebra [WS20]


Mondays, 8:15-9:45
Tuesdays, 10:15-11:45


Due to Corona, we will flip the classroom: students first read the lecture notes and watch the videos made by Tim Netzer. I will then answer questions on all aspects of linear algebra on Mondays, while Tim Netzer will answer questions on Tuesdays.

Exercises: Linear Algebra [WS20]


Mondays, 12:15-13:45


An opportunity to get your hands on all the wonderful and strange objects from the lecture, play with them, and understand them.

Bachelor's projects

If you are interested in one of the proposed projects, contact me before the 'Seminar with Bachelorarbeit' begins.
If the project you are interested in is not listed below but should be, come for a chat well before the 'Seminar with Bachelorarbeit' begins.


Convolutional/shift-invariant dictionary learning
Compare convolutional dictionary learning to unconstrained dictionary learning for sparse approximation and signal restoration on image or audio data,
detailed project description to come.

Initialisation strategies for dictionary learning
Compare graph clustering to random initialisation for dictionary learning,
detailed project description.

In Progress:

Masked Hard Thresholding Pursuit for dictionary based inpainting
Modify the Hard Thresholding Pursuit algorithm for data with missing entries and use it for inpainting,
detailed project description.
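To give a flavour of the in-progress project above, here is a hypothetical sketch (not the project's actual specification) of Hard Thresholding Pursuit restricted to the observed entries of a signal, which is the basic idea behind dictionary based inpainting:

```python
import numpy as np

def masked_htp(y, mask, D, s, iters=20):
    """Hard Thresholding Pursuit on the observed entries only (illustrative sketch).

    y    : signal with missing entries (values outside the mask are ignored)
    mask : boolean array, True where y is observed
    D    : dictionary, columns are the atoms
    s    : target sparsity level
    """
    Dm, ym = D[mask], y[mask]                    # restrict dictionary and signal to observed rows
    x = np.zeros(D.shape[1])
    for _ in range(iters):
        grad = Dm.T @ (ym - Dm @ x)              # gradient of the masked residual
        support = np.argsort(np.abs(x + grad))[-s:]   # keep the s largest entries
        x = np.zeros(D.shape[1])
        # least squares fit on the selected atoms, using observed entries only
        x[support], *_ = np.linalg.lstsq(Dm[:, support], ym, rcond=None)
    return D @ x                                 # full reconstruction fills in the missing entries
```

The only change to plain HTP is that the residual and the least squares fit are computed on the observed rows, while the final synthesis step D @ x uses the full dictionary and so fills the gaps.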


Inference acceleration for neural networks using external hardware, L. Gutbrunner, Bachelor thesis, University of Innsbruck and Besi Austria GmbH (co-supervised with M. Sandbichler), 2019. [thesis]

Free the bird - Inpainting, K. Nössing, Bachelor thesis, University of Innsbruck, 2019. [description].

Integrating low-rank components into weighted K-SVD for dictionary based inpainting, M. Tiefenthaler, Bachelor thesis, University of Innsbruck, 2018. [thesis] [description].

Hard Thresholding Pursuit for Sparse Approximation, E. Höck, Bachelor thesis, University of Innsbruck, 2016. [thesis] [description].

Master's projects

I offer master's projects in the areas of signal processing, sparse approximation, dictionary learning and machine learning.
Recommended prerequisites are a good knowledge of linear algebra, together with probability theory and/or functional analysis.

The goal is to contribute to an active area of research, so the project will be tailored to currently open questions and your personal interests. If you are super-strong and mega-motivated, there is also the possibility of doing a more substantial, paid master's thesis within the START project 'Optimisation Principles, Models & Algorithms for Dictionary Learning', which can continue on to a PhD.

If you are interested, have a look at the research or publications page and contact me for a first chat.

Past Courses

The lectures and seminars in particular can become future courses if enough people are motivated.

Signal Processing and Learning [SS19]


Wednesdays 10:15-12:00
Fridays 08:15-10:00


The course is divided into two parts. I will teach sparsity based signal processing and dictionary learning, while Markus Haltmeier will teach inverse problems (regularisation) and deep learning.

Seminar: Channel Coding [SS18]


Thursdays 12:15-13:45


We will teach each other the basics of channel coding and decoding until we are ready to understand a bit about turbo codes, LDPC codes and other codes used nowadays. The most important by-product of this course: presentation skills!

Exercises: Stochastics 2 [WS17/18]


Wednesdays 10:15-11:45


To digest the theory from the lecture, we will calculate and sweat through several examples and verify the theory in concrete cases. Whenever possible we will also have a look at applications of the theoretical results.

VU: Selected Statistical Methods [WS16/17]


Mondays 10:15-12:45
Wednesdays 10:15-12:45


The course is invented on the fly together with Tobias Hell and will probably include linear regression, classification, principal component analysis, clustering and a bit of measure concentration.

VU: Time Frequency Analysis, Wavelets and Signal Processing [SS16]


Wednesdays 12:15-13:45
Thursdays 10:15-11:45


I will teach the first part of the course, covering time-frequency analysis;
the second part, on wavelets, is taught by Markus Haltmeier. With signal processing always in the back of our heads, in the first part we will have a look at:

  • Basic Fourier Analysis
  • Uncertainty Principles
  • Short Time Fourier Transform
  • Quadratic Time Frequency Representations
  • Gabor Frames

and in the second part you will learn about:

  • Continuous Wavelet Transform
  • Wavelet Frames
  • Orthogonal and Bi-orthogonal Wavelet Bases
  • Function Approximation
  • Optimal Statistical Estimation

Seminar: Randomness, matrices and random matrices [SS15]


Thursdays 12:15-13:45


We will start with some historic papers covering concentration of measure inequalities, have a look at the workhorse of the data scientist, also known as the Johnson-Lindenstrauss lemma, and then progress to non-asymptotic results on random matrices, as well as a very useful paper about the conditioning of random sub-matrices. Depending on the participants' interests, we will then dive further into one of the topics or its applications, or play around with Matlab.
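To see the Johnson-Lindenstrauss lemma in action, here is a small hypothetical demo (written in Python rather than Matlab, with made-up dimensions): a scaled random Gaussian projection approximately preserves pairwise distances between points.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 100, 10_000, 500               # number of points, ambient dimension, target dimension

X = rng.standard_normal((n, d))          # n points in dimension d

# Random Gaussian projection, scaled so that squared distances
# are preserved in expectation
P = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ P

# Compare one pairwise distance before and after projection
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
ratio = proj / orig                      # close to 1 with high probability
```

The lemma guarantees that for a target dimension k on the order of log(n)/eps^2, all pairwise distances are preserved up to a factor 1 ± eps simultaneously, with high probability.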

A proposed collection of papers spanning half a century:

  • Concentration of measure inequalities
    G. Bennett, Probability inequalities for the sum of independent random variables, 1962.
    W. Hoeffding, Probability inequalities for sums of bounded random variables, 1963.
  • Concentration of chaos variables
    D.L. Hanson, F.T. Wright, A bound on tail probabilities for quadratic forms in independent random variables, 1971.
    D. Hsu, S.M. Kakade, T. Zhang, A tail inequality for quadratic forms of sub-Gaussian random vectors, 2011.
    M. Rudelson, R. Vershynin, Hanson-Wright inequality and sub-Gaussian concentration, 2013.
  • Johnson-Lindenstrauss Lemma (1984)
    S. Dasgupta, A. Gupta, An elementary proof of a theorem of Johnson and Lindenstrauss, 2002(1999).
    D. Achlioptas, Database-friendly random projections, 2001.
    R. Baraniuk, M. Davenport, R. DeVore, and M. Wakin. The Johnson-Lindenstrauss lemma meets compressed sensing, 2006.
    F. Krahmer, R. Ward, New and improved Johnson-Lindenstrauss embeddings via the restricted isometry property, 2011.
  • Concentration of measure for matrices
    R. Ahlswede, A. Winter, Strong converse for identification via quantum channels, 2001.
    R.I. Oliveira, Sums of random Hermitian matrices and an inequality by Rudelson, 2010.
    J. Tropp, User-friendly tail bounds for sums of random matrices, 2010.
  • Miscellaneous papers I consider interesting
    G.W. Stewart, Perturbation theory for the singular value decomposition, 1990.
    J. Tropp, On the conditioning of random sub-dictionaries, 2008.