About

I am Hsin-Yuan Huang (pronounced as “Shin Yuan Huan”, 黃信元), a Ph.D. student at Caltech advised by John Preskill and Thomas Vidick. I also go by the name Robert.

Research Interests:

I work on the interplay between quantum physics and computer science (information theory, machine learning, complexity theory). Some problems I think about:

  • How can we learn and make predictions in a quantum-mechanical world?
  • Could quantum machines learn faster and predict more accurately?
  • How can machine learning advance quantum technology and fundamental physics?
  • What are the fundamental limits in learning with classical and quantum machines?
[Figure: cartoon depiction of intelligence]

Publications:

Google Scholar provides a full list, which can be sorted chronologically or by citation count.

» Machine learning, quantum physics, and quantum advantage

Can classical machine learning models solve challenging problems in physics that classical algorithms without the ability to learn from data cannot? By solve, we mean providing solutions in polynomial time. We prove the affirmative in the following paper. See my 13 tweets for a summary.

  • H.-Y. Huang, R. Kueng, G. Torlai, V. V. Albert, J. Preskill. Provably efficient machine learning for quantum many-body problems. QIP 2021, arXiv (2021).

But why could machine learning algorithms be significantly more efficient than non-machine-learning algorithms? The main idea is that generalizing from training data can be much easier than computing answers from the ground up. In the following work with Google Quantum AI, we study how this fact impacts quantum advantage in machine learning.

Furthermore, we show that for a wide range of learning problems in quantum physics, there is a fundamental limit to quantum advantage in prediction performance. We also identify learning problems with exponential quantum advantage without relying on any conjecture. The results are presented in the following work.

An invited perspective article that summarizes these ideas in the context of quantum chemistry and presents future directions:

  • J. R. McClean, N. C. Rubin, J. Lee, M. P. Harrigan, T. E. O’Brien, R. Babbush, W. J. Huggins, H.-Y. Huang. What the foundations of quantum computer science teach us about chemistry. arXiv (2021).

» Classical shadows of quantum states

A classical shadow is an efficient representation of a quantum many-body system that can be easily constructed in experiments. Storing a full description of a quantum system requires an amount of classical memory that grows exponentially with the number of qubits, but a classical shadow requires only a linear (or polynomial) amount. See the wiki page, a nice tutorial in PennyLane, or a great talk by John Preskill for more details.
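
To make this concrete, here is a minimal sketch of the random Pauli-measurement version of the protocol, simulated with NumPy on a small state vector. It is only an illustration, not code from the paper or from PennyLane, and the function names classical_shadow and estimate_expectation are just placeholders.

```python
import numpy as np

# Basis-rotation unitaries that map the X, Y, Z eigenbases to the computational basis.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # measure in the X basis
HSdg = H @ np.array([[1, 0], [0, -1j]])               # measure in the Y basis (S^dagger, then H)
I2 = np.eye(2)
ROTATIONS = [H, HSdg, I2]                             # index 0: X, 1: Y, 2: Z

def classical_shadow(state, n_qubits, n_snapshots, rng):
    """Collect snapshots of a pure state (given as a state vector) from
    random single-qubit Pauli-basis measurements."""
    snapshots = []
    for _ in range(n_snapshots):
        bases = rng.integers(0, 3, size=n_qubits)     # random Pauli basis for each qubit
        U = ROTATIONS[bases[0]]
        for b in bases[1:]:                           # full rotation U = U_1 x ... x U_n
            U = np.kron(U, ROTATIONS[b])
        rotated = U @ state
        probs = np.abs(rotated) ** 2
        outcome = rng.choice(len(probs), p=probs / probs.sum())
        bits = [(outcome >> (n_qubits - 1 - q)) & 1 for q in range(n_qubits)]
        # Invert the measurement channel qubit by qubit: 3 * U^dag |b><b| U - I.
        local = []
        for q in range(n_qubits):
            ket = np.zeros(2)
            ket[bits[q]] = 1.0
            Uq = ROTATIONS[bases[q]]
            local.append(3 * Uq.conj().T @ np.outer(ket, ket) @ Uq - I2)
        snapshots.append(local)
    return snapshots

def estimate_expectation(snapshots, single_qubit_ops):
    """Estimate Tr(rho O) for a tensor-product observable O = o_1 x ... x o_n
    by averaging the product of single-qubit traces over all snapshots."""
    estimates = []
    for local in snapshots:
        value = 1.0
        for o, s in zip(single_qubit_ops, local):
            value *= np.trace(o @ s).real
        estimates.append(value)
    return float(np.mean(estimates))

# Usage: estimate <Z x Z> = 1 for the Bell state (|00> + |11>) / sqrt(2).
rng = np.random.default_rng(0)
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
shots = classical_shadow(bell, n_qubits=2, n_snapshots=5000, rng=rng)
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
print(estimate_expectation(shots, [Z, Z]))            # approximately 1
```

In this sketch, each snapshot stores only one 2x2 matrix per qubit, which is where the linear memory scaling in the number of qubits comes from.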

We proposed the idea of classical shadows in the following Nature Physics article.

We wrote a follow-up work that makes classical shadows even more efficient. The key idea is to derandomize the randomized measurement construction in the original classical shadow protocol.

The following work demonstrates that classical shadows can be easily applied to existing experimental platforms.

  • A. Elben, R. Kueng, H.-Y. Huang, R. van Bijnen, C. Kokail, M. Dalmonte, P. Calabrese, B. Kraus, J. Preskill, P. Zoller, B. Vermersch. Mixed-state entanglement from local randomized measurements. Physical Review Letters, arXiv (2020).

» Many-body quantum chaos

What are some salient features of chaotic many-body systems? The following works with condensed matter physicists make progress in answering this question. In particular, we show that most wavefunctions can individually generate a state design by performing measurements on a subsystem. A state design is an ensemble of quantum states that mimics states distributed uniformly in the Hilbert space.
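
More concretely, a state $k$-design can be defined as an ensemble $\{(p_i, |\psi_i\rangle)\}$ satisfying

$$\sum_i p_i \,\big(|\psi_i\rangle\langle\psi_i|\big)^{\otimes k} \;=\; \int_{\mathrm{Haar}} \big(|\psi\rangle\langle\psi|\big)^{\otimes k}\, d\psi,$$

i.e., the ensemble reproduces the first $k$ moments of states drawn uniformly (Haar-randomly) from the Hilbert space. (This is the standard definition, stated in generic notation rather than the notation of the papers below.)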

  • J. Cotler†, D. Mark†, H.-Y. Huang† (co-first authors), F. Hernandez, J. Choi, A. L. Shaw, M. Endres, S. Choi. Emergent quantum state designs from individual many-body wavefunctions. arXiv (2021).

We experimentally verify this observation and utilize the concept to perform benchmarking in a Rydberg-atom system.

  • J. Choi, A. Shaw, I. Madjarov, X. Xie, J. Covey, J. Cotler, D. Mark, H.-Y. Huang, A. Kale, H. Pichler, F. Brandao, S. Choi, M. Endres. Emergent randomness and benchmarking from many-body quantum chaos. arXiv (2021).

» Quantum simulation and quantum algorithms

Quantum simulation is an important application of quantum computers. In the first work, we provide a tight analysis of one of the simplest yet most powerful simulation methods, Trotterization (the product-formula approach), for simulating interacting electrons.
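
As a quick reminder of the idea (standard notation, not taken from the paper): for a Hamiltonian written as a sum of terms, $H = \sum_\gamma H_\gamma$, the first-order product formula approximates the time evolution by

$$e^{-iHt} \approx \Big(\prod_\gamma e^{-i H_\gamma t/r}\Big)^{r},$$

which becomes exact as the number of Trotter steps $r$ grows; the central question in the analysis is how large $r$ must be to reach a given accuracy.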

  • Y. Su, H.-Y. Huang, E. Campbell. Nearly-tight Trotterization of interacting electrons. Quantum, QIP 2021, arXiv (2020).

Randomness has recently been used to speed up quantum simulation algorithms. We provide a comprehensive analysis using random matrix theory, a fascinating area of mathematics to which Joel Tropp and others have contributed.

  • C.-F. Chen†, H.-Y. Huang† (co-first authors), R. Kueng, J. Tropp. Concentration for random product formulas. TQC 2021, arXiv (2020).

In this work, we look at how near-term quantum computers could be used to solve linear systems of equations. The method we propose comes with rigorous guarantees, but the usefulness of quantum computers for real-world problems of this kind remains inconclusive.

  • H.-Y. Huang, K. Bharti, P. Rebentrost. Near-term quantum algorithms for linear systems of equations. arXiv (2019).

» Deep learning for question answering

During my undergraduate studies, I spent a summer at Microsoft Research designing deep learning models for question answering (also known as machine comprehension). We were at the top of the leaderboard back in the day.

  • H.-Y. Huang, C. Zhu, Y. Shen, W. Chen. FusionNet: fusing via fully-aware attention with application to machine comprehension. ICLR 2018, arXiv (2017).

A year later, I did a follow-up work at the Allen Institute for AI that considers question answering in a conversational setting.

  • H.-Y. Huang, E. Choi, W. Yih. FlowQA: grasping flow in history for conversational machine comprehension. ICLR 2019, arXiv (2018).

» Classical machine learning

In my first year of undergraduate study, I started doing research in machine learning with Chih-Jen Lin. The following works marked the beginning of my research career. The first studies recommendation systems using matrix factorization; the second studies a basic autoML task: choosing between linear and kernel classifiers.

  • H.-F. Yu, H.-Y. Huang, I. S. Dhillon, C.-J. Lin. A unified algorithm for one-class structured matrix factorization with side information. AAAI 2017, Paper link.

  • H.-Y. Huang, C.-J. Lin. Linear and kernel classification: When to use which? SDM 2016, Paper link.

» Bioinformatics

During high school (2013-2014), I analyzed protein-protein interaction networks and developed models for the evolution of protein networks.

  • C.-Y. Chen, A. Ho, H.-Y. Huang, H.-F. Juan and H.-C. Huang. Dissecting the human protein-protein interaction network via phylogenetic decomposition. Scientific Reports.