
## Tensor Networks for Machine Learning (Note venue)

- Miles Stoudenmire (Flatiron Institute)
- Thursday 26 September 2019, 13:45-15:00
- NOTE venue Watson Building, Lecture Theatre C (G24).
If you have a question about this talk, please contact Mike Gunn.

Tensor networks are a technique originating in quantum many-body physics, where they underpin powerful and interesting computational methods. They can be viewed as a way of storing exponentially large quantum wavefunctions using far fewer parameters. Tensor networks are valued for the speed with which they can be optimized by adaptive algorithms, and for their high degree of interpretability. Taking a more general view, a tensor network is simply a kind of function approximator, which means it can be applied to many interesting problems. I will discuss frameworks and tensor network algorithms for both supervised and unsupervised learning tasks, and recent progress in this area. One exciting and realistic possibility is theoretically predicting the outcome of training a tensor network model from summary properties of the data. Another interesting idea is to use quantum circuits identical to tensor networks for machine learning on quantum hardware, allowing model designs and training methods to be transferred across these platforms. I will conclude by discussing some future directions where tensor networks could have a significant impact.

This talk is part of the Theoretical Physics Seminars series.
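To make the compression claim concrete, here is a minimal sketch (not from the talk; sizes and names are illustrative) of a matrix product state, the simplest tensor network. It stores a vector with 2^N components as a chain of N small cores and shows the parameter count is far smaller than the full vector:

```python
import numpy as np

# Hypothetical sizes: N "sites", bond dimension D.
N, D = 10, 4
rng = np.random.default_rng(0)

# One core per site, shape (left bond, physical=2, right bond);
# the boundary cores have outer bond dimension 1.
cores = [rng.normal(size=(1 if i == 0 else D, 2, 1 if i == N - 1 else D))
         for i in range(N)]

# Contract the chain back into the full 2**N-component vector.
full = cores[0]
for core in cores[1:]:
    full = np.tensordot(full, core, axes=([-1], [0]))
full = full.reshape(-1)  # shape (2**N,)

mps_params = sum(c.size for c in cores)
print(full.shape, mps_params, 2**N)  # the MPS uses far fewer parameters
```

The same chain structure underlies the supervised-learning setups mentioned in the abstract, where data are mapped to product states and the MPS plays the role of the weight tensor.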