Portfolio item number 1
Short description of portfolio item number 1
Portfolio item number 2
Short description of portfolio item number 2
Published in Journal 1, 2009
This paper is about the number 1. The number 2 is left for future work.
Recommended citation: Your Name, You. (2009). "Paper Title Number 1." Journal 1. 1(1). http://academicpages.github.io/files/paper1.pdf
Published in Journal 1, 2010
This paper is about the number 2. The number 3 is left for future work.
Recommended citation: Your Name, You. (2010). "Paper Title Number 2." Journal 1. 1(2). http://academicpages.github.io/files/paper2.pdf
Published in Journal 1, 2015
This paper is about the number 3. The number 4 is left for future work.
Recommended citation: Your Name, You. (2015). "Paper Title Number 3." Journal 1. 1(3). http://academicpages.github.io/files/paper3.pdf
Published:
During MLSS at Skoltech, I gave a short practical tutorial on MCMC for the audience of Mark Girolami's wonderful talk on probabilistic numerics. There are a notebook and a video.
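As a flavor of what such a practical MCMC tutorial covers, here is a minimal random-walk Metropolis-Hastings sampler. This is a generic textbook sketch, not material from the tutorial notebook itself; the target (a standard normal, given by its log-density up to a constant) and all parameter names are illustrative.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings for a 1-D unnormalized log-density."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log-space for numerical stability.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

# Target: standard normal, log p(x) = -x^2 / 2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x ** 2, x0=0.0, n_samples=20000)
burned = samples[5000:]  # discard burn-in before estimating moments
```

After discarding burn-in, the empirical mean and standard deviation of `burned` should be close to 0 and 1, the moments of the target.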
Published:
I presented our (Evgenii Egorov, Kirill Neklyudov, Ruslan Kostoev, and Evgeny Burnaev) work on greedy semi-parametric variational inference. We propose a version of the Frank-Wolfe algorithm with entropy regularization in the density space. There are slides and the full text on arXiv, as well as the published version.
Published:
We had a meeting between machine-learning people and physicists at the wonderful workshop Physics Inspired Machine Learning. I presented our (Kirill Neklyudov, Evgenii Egorov, Dmitry Vetrov) paper on the fusion of MCMC and GANs. There are a video (in Russian) and rather long slides. If you would like to grasp the main idea at a glance, I suggest our NeurIPS poster.
Published:
During MIDL 2020, Anna Kuzina and I presented our work on transfer learning for 3D MRI segmentation. How can one train a deep, large 3D UNet-like model with only 5 or 10 training images? This is a common question for medical data. We address it by imposing a prior distribution over the convolutional kernels via a generative model (a VAE)!
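The core idea, that a prior over parameters rescues learning when data is scarce, can be illustrated with a deliberately simplified stand-in: a Gaussian prior (ridge regression) instead of the paper's VAE prior over convolutional kernels, and a linear model instead of a 3D UNet. Everything below is a generic illustration under those assumptions, not the method from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 5                       # many parameters, very few training examples
w_true = rng.standard_normal(d)
X = rng.standard_normal((n, d))
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Maximum likelihood: with n << d the problem is underdetermined;
# lstsq returns the minimum-norm interpolating solution.
w_mle = np.linalg.lstsq(X, y, rcond=None)[0]

# MAP with a Gaussian prior N(0, I / alpha) on the weights,
# i.e. ridge regression: the prior constrains the solution.
alpha = 1.0
w_map = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)
```

Since the ridge objective penalizes the weight norm, the MAP solution is never larger in norm than the minimum-norm interpolator, which is exactly the regularizing effect a parameter prior provides in the tiny-data regime.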
Undergraduate course, HSE, Computer Science, 2020
I deliver practical seminars on Bayesian machine learning that are complementary to the lectures of Prof. Vetrov. You can find all the materials in this repo.
Graduate course, Skoltech, CDISE, 9999
I developed materials on Bayesian Machine Learning during the autumns of 2018, 2019, and 2020. Please visit the following repo. The content mixes an essential introduction with particular attention to exponential families and modern methods for deep learning.