Sure, here is a summary of the paper "Score-based generative models learn manifold-like structures with constrained mixing" by Li Kevin Wenliang and Ben Moran:
Summary: Score-based generative models (SBMs) can learn a data distribution supported on a lower-dimensional manifold by learning a vector field whose tangential component mixes samples within the manifold while its normal component denoises samples by projecting them back toward the manifold from off-manifold directions.
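This tangential/normal decomposition can be made concrete on a toy example. The sketch below is illustrative and not code from the paper: it assumes a 2-D density concentrated near the unit circle and uses a small-noise analytic approximation to the noised score in place of a trained score network; `toy_score` and `decompose_score` are hypothetical names introduced here.

```python
# Minimal sketch (not from the paper): split a score field into an
# off-manifold (normal) denoising component and an on-manifold
# (tangential) mixing component, for a toy 2-D density concentrated
# near the unit circle.  Assumption: for small sigma the noised density
# is roughly p_sigma(x) ~ exp(-(||x|| - 1)^2 / (2 * sigma^2)), so the
# score can be written analytically instead of using a trained network.
import numpy as np

def toy_score(x, sigma=0.1):
    """Approximate score (grad log p_sigma) of the noised circle density."""
    r = np.linalg.norm(x)
    return -(r - 1.0) / sigma**2 * (x / r)

def decompose_score(x, sigma=0.1):
    """Project the score at x onto the normal and tangential directions."""
    s = toy_score(x, sigma)
    normal_dir = x / np.linalg.norm(x)                        # off-manifold direction
    tangent_dir = np.array([-normal_dir[1], normal_dir[0]])   # along the circle
    s_normal = (s @ normal_dir) * normal_dir                  # denoising component
    s_tangent = (s @ tangent_dir) * tangent_dir               # mixing component
    return s_normal, s_tangent

x = np.array([1.2, 0.3])            # a point slightly off the manifold
s_n, s_t = decompose_score(x)
print("normal (denoising) component:", s_n)
print("tangential (mixing) component:", s_t)
```

For this rotationally symmetric toy the tangential component is essentially zero, since the density is uniform along the circle; in the paper's setting the tangential part is what mixes samples along the learned manifold.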
Key insights and lessons learned:
- The learned vector field decomposes into a tangential component that mixes samples along the manifold and a normal component that denoises samples by pulling them back toward it.
- The local dimensionality of the learned manifold-like structure increases as the noise level decreases (see the sketch after this list for one way to probe this numerically).
- The subspace spanned by the local features overlaps with that of an effective density function.
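One way to probe the second and third insights numerically is to inspect the eigenvalues of the score's Jacobian at a point: strongly contracting directions behave like off-manifold (denoising) directions, while weakly affected directions span the local feature subspace. The sketch below is an assumption-laden illustration, not the authors' analysis; it reuses an analytic toy score for a density near the unit circle in place of a trained network, and `local_dimensionality` with its threshold are hypothetical choices made here.

```python
# Minimal sketch (assumptions, not the paper's procedure): estimate the
# local on-manifold dimensionality at a point by counting how many
# directions the score Jacobian leaves weakly contracted.
import numpy as np

def toy_score(x, sigma):
    """Approximate score of a toy density concentrated near the unit circle."""
    r = np.linalg.norm(x)
    return -(r - 1.0) / sigma**2 * (x / r)

def score_jacobian(x, sigma, eps=1e-5):
    """Central-difference Jacobian of the toy score at x."""
    d = x.size
    jac = np.zeros((d, d))
    for i in range(d):
        dx = np.zeros(d)
        dx[i] = eps
        jac[:, i] = (toy_score(x + dx, sigma) - toy_score(x - dx, sigma)) / (2 * eps)
    return jac

def local_dimensionality(x, sigma, threshold=0.5):
    """Count directions the score leaves weakly contracted (on-manifold)."""
    # Scaling by sigma^2 puts off-manifold eigenvalues near -1,
    # so the threshold separates denoising from mixing directions.
    eigvals = np.linalg.eigvalsh(sigma**2 * score_jacobian(x, sigma))
    return int(np.sum(np.abs(eigvals) < threshold))

x = np.array([1.0, 0.0])            # a point on the circle
for sigma in (1.0, 0.3, 0.1):
    print(f"sigma={sigma}: local dimensionality = {local_dimensionality(x, sigma)}")
```

On this symmetric toy the estimate stays at 1 for every noise level, because the circle's dimensionality is fixed; the point of the sketch is only the measurement idea, whereas the paper reports the analogous count growing as the noise level decreases.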
Questions for the authors:
- How can SBMs be used to learn more complex data distributions, such as those with multiple modes?
- How can SBMs be made more robust to noise?
- How can SBMs be used to generate more realistic samples?
Related topics or future research directions:
- Adversarial score matching
- Energy-based models
- Variational inference