As you will very quickly see, I am interested in many different topics. If there is a term common to all of them, it is quantum. Beyond that, I have worked on very fundamental concepts, such as causality (in fact, the main topic of my Ph.D. thesis is quantum networks) or the vacuum, on experiment-motivated fields such as thermodynamics, and on more applied topics such as machine learning. I am also genuinely interested in classical machine learning. In fact, a small group of friends and I even entered a Kaggle competition, and scored in the top 20%! After this, my interest moved closer to unsupervised learning and probabilistic graphical models.
Accelerating the training of single-layer binary neural networks using the HHL quantum algorithm
S. Lopez Alarcon, C. Merkel, M. Hoffnagle, S. Ly, A. Pozas-Kerstjens
Defence against adversarial attacks using classical and quantum-enhanced Boltzmann machines
A. Kehoe, P. Wittek, Y. Xue, A. Pozas-Kerstjens
Efficient training of energy-based models via spin-glass control
A. Pozas-Kerstjens, G. Muñoz-Gil, M. Á. García-March, A. Acín, M. Lewenstein, P. R. Grzybowski
Our work Efficient training of energy-based models via spin-glass control appears featured as Editor's Recommendation in Machine Learning: Science and Technology. You can see it on the paper's website: https://doi.org/10.1088/1361-6633/ac41bb
My colleague Peter Wittek was invited to the Toronto Deep Learning Series to talk about our work Bayesian deep learning on a quantum computer. You can check out his talk here. He also reviewed the paper on the AI Socratic Circles blog. This was later turned into a KDnuggets blog story, which earned a Gold medal in July 2019.
The two works Harvesting correlations from the quantum vacuum and Entanglement harvesting from the electromagnetic vacuum with hydrogenlike atoms appear featured in Revista Española de Física 31-1, 40-41 (2017).
Lately I have been teaching introductory workshops on quantum computing using Qiskit. The material employed in those workshops can be found in my teaching repository.
ebm-torch is a collection of code I have developed for playing around with Boltzmann machines in PyTorch. It includes an easy way of defining graphical models, samplers, and optimizers for learning probability distributions from data samples.
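To give a flavor of what training a Boltzmann machine involves, here is a minimal, self-contained sketch of a restricted Boltzmann machine trained with one-step contrastive divergence (CD-1). This is a generic illustration in plain Python, not the ebm-torch API; all class and function names are hypothetical.

```python
# Minimal restricted Boltzmann machine with CD-1 training.
# Generic illustration only -- NOT the ebm-torch API.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.lr = lr
        # Small random weights; zero biases.
        self.W = [[random.gauss(0, 0.1) for _ in range(n_hidden)]
                  for _ in range(n_visible)]
        self.b = [0.0] * n_visible  # visible biases
        self.c = [0.0] * n_hidden   # hidden biases

    def hidden_probs(self, v):
        return [sigmoid(self.c[j] + sum(v[i] * self.W[i][j]
                for i in range(len(v)))) for j in range(len(self.c))]

    def visible_probs(self, h):
        return [sigmoid(self.b[i] + sum(h[j] * self.W[i][j]
                for j in range(len(h)))) for i in range(len(self.b))]

    @staticmethod
    def sample(probs):
        return [1 if random.random() < p else 0 for p in probs]

    def cd1_step(self, v0):
        # Positive phase: hidden activations driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = self.sample(ph0)
        # Negative phase: one Gibbs step away from the data.
        v1 = self.sample(self.visible_probs(h0))
        ph1 = self.hidden_probs(v1)
        # Approximate gradient ascent on the log-likelihood.
        for i in range(len(self.b)):
            for j in range(len(self.c)):
                self.W[i][j] += self.lr * (v0[i] * ph0[j] - v1[i] * ph1[j])
            self.b[i] += self.lr * (v0[i] - v1[i])
        for j in range(len(self.c)):
            self.c[j] += self.lr * (ph0[j] - ph1[j])

# Train on a toy dataset of two repeated binary patterns.
data = [[1, 1, 0, 0], [0, 0, 1, 1]] * 50
rbm = RBM(n_visible=4, n_hidden=2)
for v in data:
    rbm.cd1_step(v)
```

In a PyTorch version one would replace the nested loops with tensor operations, which is essentially the design point of libraries like ebm-torch.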
For the past few years, we have been running at ICFO a reading group focused on classical and quantum machine learning. The topics covered in each session, summaries and comments on papers, and even exercises with solutions are stored in the qml-rg repository.
I taught an introductory bootcamp on Python and data analysis as part of BIST's Master of Multidisciplinary Research in Experimental Sciences. The bootcamp's content can be found in this repository.