Authors: Ye Wang, Meng Li, Xinyang Yi, Zhao Song, Michael Orshansky
Keywords:
Abstract: Model order reduction that exploits the spectral properties of the admittance matrix, known as the graph Laplacian, to control approximation accuracy is a promising new class of approaches to power grid analysis. In this paper, we introduce a method that allows a dramatic increase in the resulting sparsity and can handle large dense input graphs. The method is based on the observation that information about the realistic ranges of port currents can be used to significantly improve sparsity. In practice, currents cannot vary unboundedly, and estimates of peak currents are often available early in the design cycle. However, existing methods, including the sampling-based sparsification approach [11], do not utilize this information. We propose a novel framework, Sparsification by L1 regularization on Laplacians (SparseLL), to exploit the current range information to achieve a higher degree of sparsity and better approximation quality. By formulating a sparsity-inducing optimization problem, we leverage recent progress in stochastic approximation and develop a stochastic gradient descent algorithm as an efficient solution. Using established benchmarks for experiments, we demonstrate that SparseLL achieves up to 10X edge sparsity improvement compared to sparsification assuming the full range of currents, with an accuracy improvement as well. The running time of our algorithm also scales quite favorably due to its low complexity and fast convergence, which leads us to believe that it is highly suitable for large-scale problems.
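The core idea — shrinking Laplacian edge weights with an L1 penalty while preserving the grid's response only over the currents that can actually occur — can be illustrated on a toy graph. The sketch below is not the paper's SparseLL algorithm; it is a minimal stand-in using a complete 5-node graph, a hypothetical bounded current range confined to two port nodes, and proximal stochastic gradient descent (soft-thresholding for the L1 term, nonnegativity for edge weights). All sizes, step sizes, and the penalty weight are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: complete graph on 5 nodes (10 edges), unit edge weights.
n = 5
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
m = len(edges)

# Node-edge incidence matrix B, so that L(w) = B @ diag(w) @ B.T.
B = np.zeros((n, m))
for e, (i, j) in enumerate(edges):
    B[i, e] = 1.0
    B[j, e] = -1.0

w0 = np.ones(m)            # original edge weights
L = B @ np.diag(w0) @ B.T  # original graph Laplacian

# Hypothetical port-current samples: bounded current enters node 0 and
# leaves node 1 -- a stand-in for the realistic current-range
# information (peak-current estimates) that the paper exploits.
def sample_current():
    c = rng.uniform(-1.0, 1.0)
    b = np.zeros(n)
    b[0], b[1] = c, -c     # balanced current (entries sum to zero)
    return b

# Proximal SGD on  f(w) = E_b ||L(w) b - L b||^2 + lam * ||w||_1,  w >= 0.
lam, eta = 0.05, 0.02
w = w0.copy()
for step in range(5000):
    b = sample_current()
    r = B @ (w * (B.T @ b)) - L @ b     # residual L(w) b - L b
    grad = 2.0 * (B.T @ r) * (B.T @ b)  # gradient of the squared residual
    # Proximal step: gradient descent, then soft-threshold at eta*lam
    # and clip at zero (edge weights cannot be negative).
    w = np.maximum(w - eta * grad - eta * lam, 0.0)

print("nonzero edges:", int((w > 1e-6).sum()), "of", m)
```

Edges that never carry the sampled port currents receive no data gradient, so the L1 shrinkage drives their weights to exactly zero — the mechanism by which restricting the current range buys extra sparsity — while the surviving edges keep the Laplacian's action on the feasible currents nearly unchanged.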