Michael Murray
Mathematics, UCLA
Verified email at math.ucla.edu
Title
Cited by
Year
Activation function design for deep networks: linearity and effective initialisation
M Murray, V Abrol, J Tanner
Applied and Computational Harmonic Analysis 59, 117-154, 2022
22 · 2022
Representation Learning for High-Dimensional Data Collection under Local Differential Privacy
A Mansbridge, G Barbour, D Piras, M Murray, C Frye, I Feige, D Barber
arXiv preprint arXiv:2010.12464, 2020
7* · 2020
Characterizing the Spectrum of the NTK via a Power Series Expansion
M Murray, H Jin, B Bowman, G Montufar
International Conference on Learning Representations (ICLR) 2023, 2022
6 · 2022
Training shallow ReLU networks on noisy data using hinge loss: when do we overfit and is it benign?
E George, M Murray, W Swartworth, D Needell
Thirty-seventh Conference on Neural Information Processing Systems (NeurIPS …, 2023
2 · 2023
Mildly overparameterized ReLU networks have a favorable loss landscape
K Karhadkar, M Murray, H Tseran, G Montúfar
arXiv preprint arXiv:2305.19510, 2023
2 · 2023
Towards an understanding of CNNs: analysing the recovery of activation pathways via Deep Convolutional Sparse Coding
M Murray, J Tanner
2018 IEEE Data Science Workshop, 2018
1* · 2018
Benign overfitting in leaky ReLU networks with moderate input dimension
K Karhadkar, E George, M Murray, G Montúfar, D Needell
arXiv preprint arXiv:2403.06903, 2024
2024
Encoder blind combinatorial compressed sensing
M Murray, J Tanner
IEEE Transactions on Information Theory, 2022
2022
From matrix factorisation to signal propagation in deep learning: algorithms and guarantees
M Murray
University of Oxford, 2021
2021
Articles 1–9