# Kyurae Kim's Awesome Papers
The papers that made me stay awake all night long. Let me know if you have anything interesting to share!
## Interests
- High-performance computing
- Probabilistic machine learning
- Bayesian statistics
- Bayesian inference
- Bayesian optimization
- Heterogeneous, specialized hardware
- Image processing
- Signal processing
## Awesome Papers
- Blei, David M., Andrew Y. Ng, and Michael I. Jordan. "Latent Dirichlet Allocation." Journal of Machine Learning Research 3 (2003): 993-1022.
- Neal, Radford M. "Bayesian Learning for Neural Networks." Vol. 118. Springer Science & Business Media, 2012.
- Chaney, Allison, et al. "Detecting and Characterizing Events." Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2016.
- Regier, Jeffrey, et al. "Approximate inference for constructing astronomical catalogs from images." The Annals of Applied Statistics 13.3 (2019): 1884-1926.
- Shanbhag, Naresh R., et al. "Shannon-inspired Statistical Computing for the Nanoscale Era." Proceedings of the IEEE 107.1 (2019): 90-107.
- Ungar, David, and Sam S. Adams. "Harnessing Emergence for Manycore Programming: Early Experience Integrating Ensembles, Adverbs, and Object-based Inheritance." Proceedings of the ACM International Conference Companion on Object-Oriented Programming Systems, Languages, and Applications (OOPSLA), 2010.
- Thompson, Neil, and Svenja Spanuth. "The Decline of Computers as a General Purpose Technology: Why Deep Learning and the End of Moore’s Law Are Fragmenting Computing." Available at SSRN 3287769 (2018).
- Hammernik, Kerstin, et al. "Learning a Variational Network for Reconstruction of Accelerated MRI Data." Magnetic Resonance in Medicine 79.6 (2018): 3055-3071.
  - Previous works
    - Chen, Yunjin, Wei Yu, and Thomas Pock. "On learning optimized reaction-diffusion processes for effective image restoration." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015.
- Fuchs, Adi, and David Wentzlaff. "The Accelerator Wall: Limits of Chip Specialization." Proceedings of the IEEE International Symposium on High-Performance Computer Architecture (HPCA'19).
- Ulyanov, Dmitry, Andrea Vedaldi, and Victor Lempitsky. "Deep Image Prior." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR'18).
  - Follow-up works
    - Zezhou Cheng, Matheus Gadelha, Subhransu Maji, and Daniel Sheldon. "A Bayesian Perspective on the Deep Image Prior." Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR'19).
- de Fine Licht, Johannes, et al. "Transformations of high-level synthesis codes for high-performance computing." IEEE Transactions on Parallel and Distributed Systems 32.5 (2020): 1014-1029.
- Kendall, Alex, and Yarin Gal. "What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?" Advances in Neural Information Processing Systems (NIPS), 2017.
- Boyd, Stephen, et al. "Distributed optimization and statistical learning via the alternating direction method of multipliers." Foundations and Trends® in Machine Learning 3.1 (2011): 1-122.
- Pearce, Tim, et al. "Uncertainty in Neural Networks: Bayesian Ensembling." Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS'20), 2020.
- Qiang Liu and Dilin Wang. "Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm." Advances in Neural Information Processing Systems (NIPS'16), 2016.
  - Interactive demo. Select SVGD for the algorithm. (A minimal sketch of the SVGD update appears at the end of this list.)
  - Follow-up works
    - Han, Jun, and Qiang Liu. "Stein Variational Gradient Descent Without Gradient." Proceedings of the International Conference on Machine Learning (ICML'18), PMLR 80:1900-1908, 2018.
    - Han, Jun, and Qiang Liu. "Stein Variational Adaptive Importance Sampling." Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI'17), 2017.
- Wilson Ye Chen, Alessandro Barp, François-Xavier Briol, Jackson Gorham, Mark Girolami, Lester Mackey, and Chris Oates. "Stein Point Markov Chain Monte Carlo." Proceedings of the International Conference on Machine Learning (ICML'19), PMLR 97:1011-1021, 2019.
- Jordan, Michael I. "Dynamical, Symplectic and Stochastic Perspectives on Gradient-Based Optimization." University of California, Berkeley (2018).
- Kruskal, Clyde P., and Alan Weiss. "Allocating Independent Subtasks on Parallel Processors." IEEE Transactions on Software Engineering 10, 1001-1016, 1985.
  - Follow-up works
    - Bast, Hannah. Ph.D. Thesis, 2000.
- Solnik, Benjamin, et al. "Bayesian Optimization for a Better Dessert." (2017).
- Dai, Z., Yu, H., Low, B. K. H., and Jaillet, P. "Bayesian Optimization Meets Bayesian Optimal Stopping." Proceedings of the 36th International Conference on Machine Learning (ICML'19), PMLR 97:1496-1506, 2019.
- Hartwig Anzt, Terry Cojean, Chen Yen-Chen, Jack Dongarra, Goran Flegar, Pratik Nayak, Stanimire Tomov, Yuhsiang M. Tsai, and Weichung Wang. "Load-balancing Sparse Matrix Vector Product Kernels on GPUs." ACM Transactions on Parallel Computing 7, 1, Article 2 (March 2020).
- Kathleen E. Hamilton, Catherine D. Schuman, Steven R. Young, Ryan S. Bennink, Neena Imam, and Travis S. Humble. "Accelerating Scientific Computing in the Post-Moore’s Era." ACM Transactions on Parallel Computing 7, 1, Article 6 (March 2020).
- Mikkola, Petrus, et al. "Projective Preferential Bayesian Optimization." Proceedings of the International Conference on Machine Learning (ICML'20), 2020.
- Slaughter, Elliott, et al. "Task Bench: A Parameterized Benchmark for Evaluating Parallel Runtime Performance." Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis (SC'20), 2020.
- Geoffrey Roeder, Yuhuai Wu, and David K. Duvenaud. "Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference." Advances in Neural Information Processing Systems 30 (NeurIPS'17), 2017.
- Vaden Masrani, Tuan Anh Le, and Frank Wood. "The Thermodynamic Variational Objective." Advances in Neural Information Processing Systems 32 (NeurIPS'19), 2019.
  - Follow-up works
    - Rob Brekelmans, Vaden Masrani, Frank Wood, Greg Ver Steeg, and Aram Galstyan. "All in the Exponential Family: Bregman Duality in Thermodynamic Variational Inference." Proceedings of the International Conference on Machine Learning (ICML'20), PMLR 119:1111-1122, 2020.
    - Vu Nguyen, et al. "Gaussian Process Bandit Optimization of the Thermodynamic Variational Objective." Advances in Neural Information Processing Systems 33 (NeurIPS'20), 2020.
- Tijana Radivojević and Elena Akhmatskaya. "Modified Hamiltonian Monte Carlo for Bayesian inference." Statistics and Computing 30, 377–404, 2020.
- Gilboa, Guy, Nir Sochen, and Yehoshua Y. Zeevi. "Image enhancement and denoising by complex diffusion processes." IEEE Transactions on Pattern Analysis and Machine Intelligence 26.8 (2004): 1020-1036.
- Tzu-Mao Li, Jaakko Lehtinen, Ravi Ramamoorthi, Wenzel Jakob, and Frédo Durand. "Anisotropic Gaussian Mutations for Metropolis Light Transport through Hessian-Hamiltonian Dynamics." ACM Transactions on Graphics 34(6) (Proceedings of ACM SIGGRAPH Asia 2015).
- Eric Brochu, Tyson Brochu, and Nando de Freitas. "A Bayesian interactive optimization approach to procedural animation design." Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation (SCA'10), 103-112, 2010.
- Fearnhead, Paul, and Benjamin M. Taylor. "An adaptive sequential Monte Carlo sampler." Bayesian Analysis 8.2 (2013): 411-438.
- Akash Kumar Dhaka, et al. "Robust, Accurate Stochastic Optimization for Variational Inference." Advances in Neural Information Processing Systems 33 (NeurIPS'20), 2020.
- Yu, Yongjian, and Scott T. Acton. "Speckle reducing anisotropic diffusion." IEEE Transactions on Image Processing 11.11 (2002): 1260-1270.
  - Follow-up works
    - Aja-Fernández, Santiago, and Carlos Alberola-López. "On the estimation of the coefficient of variation for anisotropic diffusion speckle filtering." IEEE Transactions on Image Processing 15.9 (2006): 2694-2701.
    - Krissian, Karl, et al. "Oriented speckle reducing anisotropic diffusion." IEEE Transactions on Image Processing 16.5 (2007): 1412-1424.
- Shan, Tie-Jun, and T. Kailath. "Adaptive beamforming for coherent signals and interference." IEEE Transactions on Acoustics, Speech, and Signal Processing 33.3 (1985).
- Yuko Ishiwaka, Xiao S. Zeng, Michael Lee Eastman, Sho Kakazu, Sarah Gross, Ryosuke Mizutani, and Masaki Nakada. "Foids: bio-inspired fish simulation for generating synthetic datasets." ACM Transactions on Graphics 40, 6, Article 207 (December 2021).
- Surjanovic, Nikola, et al. "Parallel Tempering With a Variational Reference." Advances in Neural Information Processing Systems 35 (2022): 565-577.
- Andrieu, Christophe, and Arnaud Doucet. "Joint Bayesian model selection and estimation of noisy sinusoids via reversible jump MCMC." IEEE Transactions on Signal Processing 47.10 (1999): 2667-2676.
- Mishchenko, Konstantin, Ahmed Khaled, and Peter Richtárik. "Random reshuffling: Simple analysis with vast improvements." Advances in Neural Information Processing Systems 33 (2020): 17309-17320.
- Jacob, Pierre E., John O’Leary, and Yves F. Atchadé. "Unbiased Markov chain Monte Carlo methods with couplings." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 82.3 (2020).
- De Bortoli, Valentin, et al. "Diffusion Schrödinger bridge with applications to score-based generative modeling." Advances in Neural Information Processing Systems 34 (2021): 17695-17709.
- Giordano, Ryan, Tamara Broderick, and Michael I. Jordan. "Covariances, robustness and variational Bayes." Journal of Machine Learning Research 19.51 (2018).
- Doucet, Arnaud, Will Grathwohl, Alexander G. Matthews, and Heiko Strathmann. "Score-based diffusion meets annealed importance sampling." Advances in Neural Information Processing Systems 35 (2022): 21482-21494.
- Heng, Jeremy, Adrian N. Bishop, George Deligiannidis, and Arnaud Doucet. "Controlled sequential Monte Carlo." Annals of Statistics 48.5 (2020).
- Kobak, Dmitry, Jonathan Lomond, and Benoit Sanchez. "The optimal ridge penalty for real-world high-dimensional data can be zero or negative due to the implicit ridge regularization." Journal of Machine Learning Research 21.1 (2020): 6863-6878.
- Bernton, Espen, et al. "Schrödinger Bridge Samplers." arXiv preprint arXiv:1912.13170 (2019).
- Bardenet, Rémi, Arnaud Doucet, and Chris Holmes. "On Markov chain Monte Carlo methods for tall data." Journal of Machine Learning Research 18.47 (2017).
- Karagiannis, Georgios, and Christophe Andrieu. "Annealed importance sampling reversible jump MCMC algorithms." Journal of Computational and Graphical Statistics 22.3 (2013): 623-648.
- Aubry, Mathieu, Sylvain Paris, Samuel W. Hasinoff, Jan Kautz, and Frédo Durand. "Fast local Laplacian filters: Theory and applications." ACM Transactions on Graphics (TOG) 33.5 (2014): 1-14.
- Dieuleveut, A., G. Fort, E. Moulines, and H.-T. Wai. "Stochastic Approximation Beyond Gradient for Signal Processing and Machine Learning." IEEE Transactions on Signal Processing 71 (2023): 3117-3148.
- Kunstner, Frederik, Raunak Kumar, and Mark Schmidt. "Homeomorphic-invariance of EM: Non-asymptotic convergence in KL divergence for exponential families via mirror descent." International Conference on Artificial Intelligence and Statistics. PMLR, 2021.
- Altschuler, Jason M., and Pablo A. Parrilo. "Acceleration by stepsize hedging II: Silver stepsize schedule for smooth convex optimization." arXiv preprint arXiv:2309.16530 (2023).
- Biron-Lattes, Miguel, Nikola Surjanovic, Saifuddin Syed, Trevor Campbell, and Alexandre Bouchard-Côté. "autoMALA: Locally adaptive Metropolis-adjusted Langevin algorithm." International Conference on Artificial Intelligence and Statistics. PMLR, 2024.
- Taylor, Adrien, and Francis Bach. "Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions." Conference on Learning Theory. PMLR, 2019.
- Durmus, Alain, Szymon Majewski, and Błażej Miasojedow. "Analysis of Langevin Monte Carlo via convex optimization." Journal of Machine Learning Research 20.73 (2019): 1-46.
- Lacoste-Julien, Simon, Ferenc Huszár, and Zoubin Ghahramani. "Approximate inference for the loss-calibrated Bayesian." Proceedings of the International Conference on Artificial Intelligence and Statistics. PMLR, 2011.
- Akyildiz, Ö. Deniz, Francesca Romana Crucinio, Mark Girolami, Tim Johnston, and Sotirios Sabanis. "Interacting particle Langevin algorithm for maximum marginal likelihood estimation." arXiv preprint arXiv:2303.13429 (2023).
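The Liu and Wang (2016) entry above links an interactive demo of Stein variational gradient descent; for readers who prefer code, here is a minimal NumPy sketch of a single SVGD update, referenced from that entry. It assumes an RBF kernel with a simplified median-type bandwidth heuristic and a toy Gaussian target; the helper names (`rbf_kernel`, `svgd_step`) and the step-size choice are illustrative, not the authors' reference implementation.

```python
import numpy as np

def rbf_kernel(x, h=None):
    """RBF kernel matrix K[i, j] = k(x_i, x_j) and its gradient w.r.t. the first argument."""
    diffs = x[:, None, :] - x[None, :, :]           # (n, n, d), diffs[i, j] = x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)          # (n, n) pairwise squared distances
    if h is None:
        # Median-type bandwidth heuristic (a simplification of the one used in the paper).
        h = np.median(sq_dists) / np.log(x.shape[0] + 1.0)
    K = np.exp(-sq_dists / h)
    grad_K = -2.0 / h * diffs * K[..., None]        # grad_K[i, j] = d k(x_i, x_j) / d x_i
    return K, grad_K

def svgd_step(x, grad_log_p, step_size=1e-1):
    """One SVGD update: phi(x_i) = mean_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    K, grad_K = rbf_kernel(x)
    phi = (K @ grad_log_p(x) + grad_K.sum(axis=0)) / x.shape[0]
    return x + step_size * phi

# Toy usage: transport particles toward a standard Gaussian target, where grad log p(x) = -x.
rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, size=(100, 2))
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)
```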