**Assistant Professor**

Department of Electrical and Computer Engineering

Coordinated Science Laboratory

University of Illinois at Urbana-Champaign

Contact - CV - Google Scholar - dblp - github - LinkedIn

My research is in optimization and machine learning, and their applications in power and energy systems. Specifically: Nonconvex optimization (low-rank matrix optimization); Convex optimization (semidefinite programming); Large-scale linear systems (Krylov subspaces); Exploiting domain-specific structure (small treewidth, quadratic nonconvexity, monotonicity) to solve problems in power and energy with *provable guarantees* on quality, speed, and safety.

I am advising PhD students Gavin (Jialun) Zhang, Hong-Ming Chiu, Iven Guzel, and June Hou.

**January 2023.** I will serve as an Area Chair for ICML 2023.

**November 2022.** Existing techniques for certifying the robustness of neural networks to adversarial perturbations have been observed to face a "convex relaxation barrier". Our preprint overcomes this barrier using a nonconvex relaxation of the ReLU, based on a low-rank restriction of a semidefinite programming (SDP) relaxation.

**October 2022.** Our preprint shows that simple alternating minimization is enough to provably solve the dictionary learning problem.

**August 2022.** Our preprint proposes a preconditioned variant of SGD that maintains all of its favorable properties for practical huge-scale optimization while also making it immune to the effects of ill-conditioning.

**August 2022.** I am co-organizing the C3.ai Digital Transformation Institute Colloquium Series for Fall 2022 alongside Gauri Joshi of CMU.

**July 2022.** My preprint proves, in the unconstrained setting, that a constant-factor overparameterization is all that is needed for nonconvex low-rank matrix optimization to always succeed, as if it were convex.

**June 2022.** In low-rank matrix optimization, if a locally optimal solution is rank deficient, then it is provably globally optimal. In our new preprint, we show that a simple preconditioner is all it takes to maintain the rapid exponential convergence rate of gradient descent when converging towards a rank-deficient local optimum.

**April 2022.** I will serve as an Area Chair for NeurIPS 2022.

**March 2022.** Zico Kolter and I received funding from the C3.ai Digital Transformation Institute to study provably robust AI methods for cybersecurity tasks on critical infrastructure.

**September 2021.** Paper accepted at __NeurIPS 2021__: Preconditioned Gradient Descent for Over-parameterized Nonconvex Matrix Factorization.

**April 2021.** I will serve as an Area Chair for NeurIPS 2021.

**April 2021.** New preprint gives sharp global guarantees for nonconvex low-rank matrix recovery in which the search rank is allowed to exceed the true rank: Sharp Global Guarantees for Nonconvex Low-Rank Matrix Recovery in the Overparameterized Regime.

**January 2021.** Delighted and honored to receive the NSF CAREER Award.

**September 2020.** Two papers accepted at __NeurIPS 2020__: **(Spotlight)** How Many Samples is a Good Initial Point Worth in Low-rank Matrix Recovery? and **(Poster)** On the Tightness of Semidefinite Relaxations for Certifying Robustness to Adversarial Examples.

**June 2020.** New preprint shows that the SDP relaxation of ReLU networks is generally tight for a single hidden layer and generally loose for multiple layers: On the Tightness of Semidefinite Relaxations for Certifying Robustness to Adversarial Examples.

**June 2020.** New preprint quantifies the value of a good initial point for nonconvex matrix recovery: How Many Samples is a Good Initial Point Worth?

**April 2020.** Paper on solving sparse semidefinite programs in near-linear time to appear in __Mathematical Programming__: Sparse Semidefinite Programs with Guaranteed Near-Linear Time Complexity via Dualized Clique Tree Conversion.

**November 2019.** Paper on optimizing the relative timing of traffic signals to appear in the __IEEE Transactions on Control of Network Systems__: Large-Scale Traffic Signal Offset Optimization.

**June 2019.** Paper on the Restricted Isometry Property (RIP) for the nonconvex matrix recovery problem to appear in the __Journal of Machine Learning Research (JMLR)__: Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery.

**May 2019.** I will be joining ECE Illinois (UIUC) as an Assistant Professor starting August 2019.

**May 2019.** Paper on the nonconvex power system state estimation problem to appear in a special issue of __IEEE Transactions on Control of Network Systems__: Spurious Local Minima in Power System State Estimation.

**January 2019.** New preprint proves that the (2,1/2)-Restricted Isometry Property (RIP) is both necessary and sufficient for the rank-1 nonconvex matrix recovery problem to contain no spurious local minima: Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery.

**December 2018.** Presented two papers at __NeurIPS 2018__: **(Spotlight)** How Much Restricted Isometry is Needed In Nonconvex Matrix Recovery? and **(Poster)** A Theory on the Absence of Spurious Solutions for Nonconvex and Nonsmooth Optimization. (Of 4856 total submissions, 1011 were accepted, including 30 orals and 168 spotlights.)

**October 2018.** Paper on accelerating ADMM using Krylov subspaces appeared in the __SIAM Journal on Optimization__: GMRES-Accelerated ADMM for Quadratic Objectives.
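The preconditioning idea behind the August and June 2022 items can be illustrated with a toy sketch. This is *not* the papers' exact algorithm or analysis: it is a minimal numpy demonstration of gradient descent on the Burer-Monteiro objective f(X) = ‖XXᵀ − M‖²_F, right-preconditioned by (XᵀX + λI)⁻¹, where the problem size, ill-conditioned spectrum, initialization, step size, and damping λ are all illustrative assumptions.

```python
# Illustrative sketch only: preconditioned gradient descent for the
# Burer-Monteiro problem f(X) = ||X X' - M||_F^2. The right-preconditioner
# (X'X + lam*I)^{-1} equalizes progress across directions, so local
# convergence is insensitive to the conditioning of the target M.
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 3
d = np.array([100.0, 1.0, 0.01])              # spectrum with condition number 1e4
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
M = (Q * d) @ Q.T                             # ill-conditioned rank-r PSD target

def gradient(X):
    # Gradient of ||X X' - M||_F^2 with respect to X.
    return 4.0 * (X @ X.T - M) @ X

# Start near the ground-truth factor Q diag(sqrt(d)), then iterate.
X = (Q * np.sqrt(d)) + 1e-3 * rng.standard_normal((n, r))
lam, step = 1e-8, 0.1
for _ in range(200):
    P = np.linalg.inv(X.T @ X + lam * np.eye(r))
    X -= step * gradient(X) @ P               # preconditioned step

err = np.linalg.norm(X @ X.T - M)             # recovery error, driven to ~0
```

With the preconditioner removed (P = I), the same loop would need a step size and iteration count that degrade with the condition number of M; the preconditioned iteration converges at a rate that does not.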

**Improved Global Guarantees for the Nonconvex Burer--Monteiro Factorization via Rank Overparameterization**

R.Y. Zhang - Preprint, Jul 2022. [arxiv]

**Overcoming the Convex Relaxation Barrier for Neural Network Verification via Nonconvex Low-Rank Semidefinite Relaxations**

H.-M. Chiu, R.Y. Zhang - Preprint, Nov 2022. [arxiv]

**Accelerating SGD for Highly Ill-Conditioned Huge-Scale Online Matrix Completion**

G. Zhang, H.-M. Chiu, R.Y. Zhang - NeurIPS 2022, Advances in Neural Information Processing Systems. [arxiv]

**How Many Samples is a Good Initial Point Worth in Low-rank Matrix Recovery?** *Selected for Spotlight (one of 280/9454 submissions)*

G. Zhang, R.Y. Zhang - NeurIPS 2020, Advances in Neural Information Processing Systems. [arxiv]

**Sparse Semidefinite Programs with Guaranteed Near-Linear Time Complexity via Dualized Clique Tree Conversion**

R.Y. Zhang, J. Lavaei - *Mathematical Programming*, 188.1 (2021): pp. 351-393. [doi] [arxiv]

**GMRES-Accelerated ADMM for Quadratic Objectives**

R.Y. Zhang, J.K. White - *SIAM Journal on Optimization*, 28.4 (2018): pp. 3025-3056. [doi] [arxiv]

**Accelerating SGD for Highly Ill-Conditioned Huge-Scale Online Matrix Completion**

G. Zhang, H.-M. Chiu, R.Y. Zhang - NeurIPS 2022, Advances in Neural Information Processing Systems. [arxiv]

**Overcoming the Convex Relaxation Barrier for Neural Network Verification via Nonconvex Low-Rank Semidefinite Relaxations**

H.-M. Chiu, R.Y. Zhang - Preprint, Nov 2022. [arxiv]

**Improved Global Guarantees for the Nonconvex Burer--Monteiro Factorization via Rank Overparameterization**

R.Y. Zhang - Preprint, Jul 2022. [arxiv]

**Simple Alternating Minimization Provably Solves Complete Dictionary Learning**

G. Liang, G. Zhang, S. Fattahi, R.Y. Zhang - Preprint, Oct 2022. [arxiv]

**Preconditioned Gradient Descent for Overparameterized Nonconvex Burer-Monteiro Factorization with Global Optimality Certification**

G. Zhang, S. Fattahi, R.Y. Zhang - Preprint, Jun 2022. [arxiv]

**Sparse Semidefinite Programs with Guaranteed Near-Linear Time Complexity via Dualized Clique Tree Conversion**

R.Y. Zhang, J. Lavaei - *Mathematical Programming*, 188.1 (2021): pp. 351-393. [doi] [arxiv]

**Preconditioned Gradient Descent for Over-Parameterized Nonconvex Matrix Factorization**

G. Zhang, S. Fattahi, R.Y. Zhang - NeurIPS 2021, Advances in Neural Information Processing Systems. [permalink]

**Uniqueness of Power Flow Solutions Using Monotonicity and Network Topology**

S.-W. Park, R.Y. Zhang, J. Lavaei, R. Baldick - *IEEE Transactions on Control of Network Systems*, 8.1 (2020): pp. 319-330. [doi]

**Sharp Global Guarantees for Nonconvex Low-Rank Matrix Recovery in the Overparameterized Regime**

R.Y. Zhang - Preprint, Apr 2021. [arxiv]

**How Many Samples is a Good Initial Point Worth in Low-rank Matrix Recovery?** *Selected for Spotlight (one of 280/9454 submissions)*

G. Zhang, R.Y. Zhang - NeurIPS 2020, Advances in Neural Information Processing Systems. [arxiv]

**On the Tightness of Semidefinite Relaxations for Certifying Robustness to Adversarial Examples**

R.Y. Zhang - NeurIPS 2020, Advances in Neural Information Processing Systems. [arxiv]

**Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery**

R.Y. Zhang, S. Sojoudi, J. Lavaei - *Journal of Machine Learning Research*, 20.114 (2019): pp. 1-34. [permalink] [arxiv]

**Spurious Local Minima in Power System State Estimation** *Special Issue on Analysis, Control and Optimization of Energy System Networks*

R.Y. Zhang, J. Lavaei, R. Baldick - *IEEE Transactions on Control of Network Systems*. [pdf]

**Large-Scale Traffic Signal Offset Optimization**

Y. Ouyang, R.Y. Zhang, J. Lavaei, P. Varaiya - *IEEE Transactions on Control of Network Systems*. [pdf]

**Conic Optimization With Applications to Machine Learning and Energy Systems**

R.Y. Zhang, C. Josz, S. Sojoudi - *Annual Reviews in Control*, 47 (2019): pp. 323-340. [doi] [arxiv]

**Monotonicity Between Phase Angles and Power Flow and Its Implications for the Uniqueness of Solutions**

S.W. Park, R.Y. Zhang, J. Lavaei, R. Baldick - HICSS 52, Hawaii International Conference on System Sciences.

**GMRES-Accelerated ADMM for Quadratic Objectives**

R.Y. Zhang, J.K. White - *SIAM Journal on Optimization*, 28.4 (2018): pp. 3025-3056. [doi] [arxiv]

**How Much Restricted Isometry is Needed In Nonconvex Matrix Recovery?** *Selected for Spotlight (one of 168/4856 submissions)*

R.Y. Zhang, C. Josz, S. Sojoudi, J. Lavaei - NeurIPS 2018, Advances in Neural Information Processing Systems. [arxiv]

**Large-Scale Sparse Inverse Covariance Estimation via Thresholding and Max-Det Matrix Completion**

R.Y. Zhang, S. Fattahi, S. Sojoudi - ICML 2018, International Conference on Machine Learning. [permalink] [arxiv] [slides]

**A Theory on the Absence of Spurious Optimality**

C. Josz, Y. Ouyang, R.Y. Zhang, J. Lavaei, S. Sojoudi - NeurIPS 2018, Advances in Neural Information Processing Systems. [arxiv]

**Sparse Semidefinite Programs with Near-Linear Time Complexity**

R.Y. Zhang, J. Lavaei - CDC 2018, 57th IEEE Conference on Decision and Control. [arxiv]

**Efficient Algorithm for Large-and-Sparse LMI Feasibility Problems**

R.Y. Zhang, J. Lavaei - CDC 2018, 57th IEEE Conference on Decision and Control. [pdf]

**Conic Approximation with Provable Guarantee for Traffic Signal Offset Optimization**

Y. Ouyang, R.Y. Zhang, J. Lavaei, P. Varaiya - CDC 2018, 57th IEEE Conference on Decision and Control. [pdf]

**Sparse Inverse Covariance Estimation for Chordal Structures**

S. Fattahi, R.Y. Zhang, S. Sojoudi - ECC 2018, European Control Conference. [arxiv]

**Conic Optimization Theory: Convexification Techniques and Numerical Algorithms**

R.Y. Zhang\*, C. Josz\*, S. Sojoudi - ACC 2018, American Control Conference. [doi] [arxiv]

**Spurious Critical Points in Power System State Estimation**

R.Y. Zhang, J. Lavaei, R. Baldick - HICSS 51, Hawaii International Conference on System Sciences. [doi] [pdf]

**Modified Interior-Point Method for Large-and-Sparse Low-Rank Semidefinite Programs**

R.Y. Zhang, J. Lavaei - CDC 2017, 56th IEEE Conference on Decision and Control. [doi] [arxiv]

**Robust Stability Analysis for Large-Scale Power Systems**

R.Y. Zhang - Ph.D. thesis, MIT Department of Electrical Engineering & Computer Science, 2016. [permalink] [pdf]

**Certifying Microgrid Stability Under Large-Signal Intermittency**

R.Y. Zhang, J. Elizondo, J.L. Kirtley, J.K. White - COMPEL 2016, Seventeenth IEEE Workshop on Control and Modeling for Power Electronics. [doi]

**Small-Signal Stability Verification Issues for Transmission Systems with Distributed Renewables**

R.Y. Zhang, J. Elizondo, J.L. Kirtley, J.K. White - PESGM 2016, IEEE Power & Energy Society General Meeting. [doi] [pdf]

**Inertial and Frequency Response from Microgrids with Induction Motors**

J. Elizondo, R.Y. Zhang, P.-H. Huang, J.K. White, J.L. Kirtley - COMPEL 2016, Seventeenth IEEE Workshop on Control and Modeling for Power Electronics. [doi]

**Parameter Insensitivity in ADMM-Preconditioned Solution of Saddle-Point Problems**

R.Y. Zhang, J.K. White - Tech report, Feb 2016. [arxiv]

**Toeplitz-Plus-Hankel Matrix Recovery for Green’s Function Computations on General Substrates**

R.Y. Zhang, J.K. White - *Proceedings of the IEEE*, 103.11 (2015): pp. 1970-1984. [doi] [pdf]

**Design of Resonance Damping via Control Synthesis**

R.Y. Zhang, A.-T. Avestruz, J.K. White, S.B. Leeb - COMPEL 2015, Sixteenth IEEE Workshop on Control and Modeling for Power Electronics. [doi] [pdf]

**Robust Small Signal Stability for Microgrids under Uncertainty**

J. Elizondo, R.Y. Zhang, J.K. White, J.L. Kirtley - PEDG 2015, 6th International Symposium on Power Electronics for Distributed Generation Systems. [doi] [pdf]

**An energy-based method for the assessment of battery and ultracapacitor in pulse load applications** *Outstanding Presentation Award (Poster)*

Y. He, R.Y. Zhang, J.G. Kassakian - APEC 2015, IEEE Applied Power Electronics Conference and Exposition. [doi] [pdf]

**Fast simulation of complicated 3D structures above lossy magnetic media**

R.Y. Zhang, J.K. White, J.G. Kassakian - *IEEE Transactions on Magnetics*, 50.10 (2014): 7027416. [doi] [pdf]

**Analytical model for effects of twisting on litz-wire losses**

C.R. Sullivan, R.Y. Zhang - COMPEL 2014, Fifteenth IEEE Workshop on Control and Modeling for Power Electronics. [doi] [pdf]

**Characterization of realistic litz wires using fast simulations** *Outstanding Presentation Award (Oral)*

R.Y. Zhang, C.R. Sullivan, J.K. White, J.G. Kassakian - APEC 2014, IEEE Applied Power Electronics Conference and Exposition. [doi] [pdf] [slides]

**Simplified design method for litz wire**

C.R. Sullivan, R.Y. Zhang - APEC 2014, IEEE Applied Power Electronics Conference and Exposition. [doi] [pdf]

**A Generalized Approach to Planar Induction Heating Magnetics**

R.Y. Zhang - S.M. thesis, MIT Department of Electrical Engineering & Computer Science, 2012. [permalink] [pdf]

**The Future of the Electric Grid -- An Interdisciplinary MIT study**

J.G. Kassakian, R. Schmalensee et al. - Technical report, MIT Energy Initiative, 2011. [permalink]

My last name 张/張 (Zhāng) is pronounced "Dj-uh-ng", but I usually go by the anglicized "Z-ang". People often confuse me with Dr. Richard Zhang, or Dr. Richard Zhang, or Prof. Richard Zhang, which is why I usually give my middle initial when stating my name. But even then, my name still collides with Dr. Richard Y. Zhang and Dr. Richard Y. Zhang.

I am originally from New Zealand. I was in a post-rock band in college, see Two Weeks and Life without Light.

© 2018-2022 Richard Y. Zhang.