Laboratory of Artificial Intelligence

The website of the Laboratory of Artificial Intelligence is currently being migrated.
Below you will find a selection of publications from recent years.

A full list of all publications is currently available here.

News

December 2025: We are happy to announce the release of the new source code library VAMM! The models and algorithms of VAMM can be used for (1) highly efficient density estimation on large-scale, high-dimensional data and (2) clustering of very large-scale, high-dimensional data. The unique feature implemented by VAMM is sublinear runtime scaling: a linear increase of the model size results in only a sublinear increase of the runtime. See the following (reverse-chronological) list of papers that led to the new library: Salwig, Kahlke, et al., arXiv 2025; Hirschberger et al., TPAMI 2022; Lücke & Forster, PRLetters 2019; Forster & Lücke, AISTATS 2018. See here for more information. The paper arXiv:2501.12299 (news item below) is the main reference for VAMM; a small illustrative sketch of the underlying idea follows at the end of this news section.

December 2025: We have uploaded a new version of the paper "Sublinear Variational Optimization of Gaussian Mixture Models with Millions to Billions of Parameters", Salwig, Kahlke, Hirschberger, Forster, Lücke, to arXiv.

November 2025: Sebastian Salwig successfully defended his doctoral thesis. Congratulations!

October 2025: Till Kahlke has joined the new AI Lab at Innsbruck University as Doctoral Researcher. Welcome!

October 2025: Yidi Ke has joined the new AI Lab at Innsbruck University as Doctoral Researcher. Welcome!

September 2025: Esther Gezzele-Lechner has joined the new AI Lab as administrator. Welcome!

September 2025: Simon Außerlechner has joined the new AI Lab at Innsbruck University. Welcome!

September 2025: Start of the new AI Lab at the Department of Computer Science, Innsbruck University.
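
The sublinear scaling mentioned in the VAMM news item above rests on truncated variational E-steps: each data point evaluates and updates only a small set of candidate clusters instead of all C clusters, so the cost per EM iteration grows with the candidate-set size rather than with the full model size (Forster & Lücke, AISTATS 2018; Hirschberger et al., TPAMI 2022). The following minimal Python sketch illustrates this idea for the isotropic, hard-assignment (k-means-like) special case. It is an illustration under simplifying assumptions, not the VAMM API: the function name truncated_vem is hypothetical, and the candidate-set refresh is simplified to random resampling where the papers use a cluster-neighborhood search.

import numpy as np

def truncated_vem(X, C, C_prime=5, iters=20, seed=0):
    """Fit C isotropic cluster centers to X with truncated variational
    E-steps: each point considers only its C_prime candidate clusters,
    so one iteration costs O(N * C_prime) instead of O(N * C)."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    mu = X[rng.choice(N, size=C, replace=False)].copy()  # init means from data
    K = rng.integers(0, C, size=(N, C_prime))            # per-point candidate sets
    for _ in range(iters):
        # Truncated E-step: distances to the candidate clusters only
        dist = ((X[:, None, :] - mu[K]) ** 2).sum(axis=-1)  # shape (N, C_prime)
        winners = K[np.arange(N), dist.argmin(axis=1)]      # hard assignments
        # M-step: update each used cluster mean from its assigned points
        for c in np.unique(winners):
            mu[c] = X[winners == c].mean(axis=0)
        # Refresh candidates: keep each point's winner, resample the rest
        # (the papers use a more effective cluster-neighborhood search here)
        K = rng.integers(0, C, size=(N, C_prime))
        K[:, 0] = winners
    return mu, winners

An iteration touches only N * C_prime distances regardless of C, which is why increasing the number of clusters C leaves the per-iteration runtime essentially unchanged; the papers listed above combine this truncation with additional data structures to obtain the sublinear scaling of the full algorithm.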

The following is a selection of papers from recent years:


H. Mousavi, J. Lücke (2025).
Linear and Nonlinear Generative Models for 'Zero-Shot' Image Denoising in the Limit of Few Photons.
Journal of Mathematical Imaging and Vision 67(3):1-17. (online access, bibtex)

S. Salwig*, J. Drefs* and J. Lücke (2024).
Zero-shot denoising of microscopy images recorded at high-resolution limits.
PLOS Computational Biology 20(6):e1012192. (online access, bibtex)
*joint first authorship.

D. Velychko, S. Damm, Z. Dai, A. Fischer and J. Lücke (2024).
Learning Sparse Codes with Entropy-Based ELBOs.
Int. Conf. on Artificial Intelligence and Statistics (AISTATS), 2089-2097, 2024. (online access, bibtex)

H. Mousavi, J. Drefs, F. Hirschberger, J. Lücke (2023).
Generic Unsupervised Optimization for a Latent Variable Model with Exponential Family Observables.
Journal of Machine Learning Research 24(285):1-59. (online access, bibtex)

S. Damm*, D. Forster, D. Velychko, Z. Dai, A. Fischer and J. Lücke* (2023).
The ELBO of Variational Autoencoders Converges to a Sum of Entropies.
Int. Conf. on Artificial Intelligence and Statistics (AISTATS), 3931-3960, 2023. (online access, bibtex)
*joint main contributions.

J. Drefs*, E. Guiraud*, F. Panagiotou, J. Lücke (2023).
Direct Evolutionary Optimization of Variational Autoencoders With Binary Latents.
European Conference on Machine Learning, 357-372. (pdf, bibtex)
*joint first authorship.

F. Hirschberger*, D. Forster* and J. Lücke (2022).
A Variational EM Acceleration for Efficient Clustering at Very Large Scales.
IEEE Transactions on Pattern Analysis and Machine Intelligence 44(12):9787-9801. (online access, bibtex)
*joint first authorship.

J. Drefs, E. Guiraud and J. Lücke (2022).
Evolutionary Variational Optimization of Generative Models.
Journal of Machine Learning Research 23(21):1-51. (online access, bibtex)

Selected other papers:

J. Lücke and D. Forster (2019).
k-means as a variational EM approximation of Gaussian mixture models.
Pattern Recognition Letters 125:349-356. (online access, bibtex, arXiv)

F. Hutter*, J. Lücke*, L. Schmidt-Thieme* (2015).
Beyond manual tuning of hyperparameters.
KI - Künstliche Intelligenz 29(4):329-337.
*alphabetical order

Z. Dai and J. Lücke (2014).
Autonomous Document Cleaning – A Generative Approach to Reconstruct Strongly Corrupted Scanned Texts.
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10):1950-1962. (online access, bibtex)

A.-S. Sheikh, J. A. Shelton, J. Lücke (2014).
A Truncated EM Approach for Spike-and-Slab Sparse Coding.
Journal of Machine Learning Research 15:2653-2687. (online access, bibtex)

M. Henniges, R.E. Turner, M. Sahani, J. Eggert, J. Lücke (2014).
Efficient occlusive components analysis.
Journal of Machine Learning Research 15(1):2689-2722.

J. Lücke, J. Eggert (2010).
Expectation truncation and the benefits of preselection in training generative models.
Journal of Machine Learning Research 11:2855-2900.

J. Lücke, C. von der Malsburg (2004).
Rapid processing and unsupervised learning in a model of the cortical macrocolumn.
Neural Computation 16(3):501-533.
