Journal Articles
3 articles found
1. Limit Cycle Identification in Nonlinear Polynomial Systems
Authors: Shuqi Zhang, Haotian Liu, Kim Batselier, Ngai Wong. Applied Mathematics, 2013, No. 9, pp. 19-26 (8 pages).
Abstract: We present a novel formulation, based on the latest advancement in polynomial system solving via linear algebra, for identifying limit cycles in general n-dimensional autonomous nonlinear polynomial systems. The condition for the existence of an algebraic limit cycle is first set up and cast into a Macaulay matrix format, whereby polynomials are regarded as coefficient vectors of monomials. This results in a system of polynomial equations whose roots are solved through the null space of another Macaulay matrix. This two-level Macaulay matrix approach relies solely on linear algebra and eigenvalue computation with robust numerical implementation. A state immersion technique further enlarges the scope to cover non-polynomial (including exponential and logarithmic) limit cycles. Application examples are given to demonstrate the efficacy of the proposed framework.
Keywords: Limit Cycle Identification; Polynomial Representation; Roots Finding; Macaulay Matrix; Immersion
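To make the "polynomials as coefficient vectors, roots from a null space" idea concrete, here is a minimal numerical sketch, not the authors' code, applied to a toy bivariate system. The system f1 = x^2 + y^2 - 2, f2 = x - y, the degree-2 Macaulay matrix, the tolerance, and all variable names are illustrative assumptions; only the generic Macaulay/eigenvalue mechanism is shown.

```python
# Toy illustration of root finding via a Macaulay matrix null space.
# Monomial basis (degree <= 2): [1, x, y, x^2, x*y, y^2].
import numpy as np

# Rows are f1, f2, x*f2, y*f2 written as coefficient vectors in that basis.
M = np.array([
    [-2.0, 0.0,  0.0, 1.0,  0.0, 1.0],   # f1 = x^2 + y^2 - 2
    [ 0.0, 1.0, -1.0, 0.0,  0.0, 0.0],   # f2 = x - y
    [ 0.0, 0.0,  0.0, 1.0, -1.0, 0.0],   # x * f2
    [ 0.0, 0.0,  0.0, 0.0,  1.0, -1.0],  # y * f2
])

# Numerical right null space via SVD; here its dimension equals the number of roots.
_, s, Vt = np.linalg.svd(M)
Z = Vt[np.sum(s > 1e-10):].T             # 6 x 2 null-space basis

# Shift invariance: the entries of a root's monomial vector at {1, x, y},
# multiplied by x, equal its entries at {x, x^2, x*y}.  This turns root
# finding into an ordinary eigenvalue problem on the null-space basis.
low  = [0, 1, 2]                          # monomials of degree <= 1
high = [1, 3, 4]                          # the same monomials times x
A = np.linalg.pinv(Z[low]) @ Z[high]
eigvals, W = np.linalg.eig(A)

V = np.real_if_close(Z @ W)               # columns ~ monomial vectors of the roots
roots = [(v[1] / v[0], v[2] / v[0]) for v in V.T]
print(np.sort(np.real(eigvals)))          # x-coordinates: approx [-1, 1]
print(roots)                              # approx (1, 1) and (-1, -1)
```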
2. MERACLE: Constructive Layer-Wise Conversion of a Tensor Train into a MERA
Authors: Kim Batselier, Andrzej Cichocki, Ngai Wong. Communications on Applied Mathematics and Computation, 2021, No. 2, pp. 257-279 (23 pages).
Abstract: In this article, two new algorithms are presented that convert a given data tensor train into either a Tucker decomposition with orthogonal matrix factors or a multi-scale entanglement renormalization ansatz (MERA). The Tucker core tensor is never explicitly computed but is stored as a tensor train instead, resulting in algorithms that are efficient in both computation and storage. Both the multilinear Tucker ranks and the MERA ranks are determined automatically by the algorithms for a given upper bound on the relative approximation error. In addition, an iterative algorithm with low computational complexity, based on solving an orthogonal Procrustes problem, is proposed for the first time to retrieve optimal rank-lowering disentangler tensors, which are a crucial component in the construction of a low-rank MERA. Numerical experiments demonstrate the effectiveness of the proposed algorithms together with the potential storage benefit of a low-rank MERA over a tensor train.
Keywords: Tensors; Tensor train; Tucker decomposition; HOSVD; MERA; Disentangler
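The disentangler update highlighted in the abstract reduces to an orthogonal Procrustes problem, whose closed-form SVD solution is shown in the short sketch below. This is only the generic Procrustes building block under assumed toy data, not the paper's layer-wise conversion algorithm; the function name procrustes and the random test matrices are ours.

```python
# Orthogonal Procrustes: find orthogonal Q minimizing ||Q @ A - B||_F.
# In a MERA, such a Q reshaped into a 4-way tensor would play the role of
# a disentangler acting on two neighbouring indices.
import numpy as np

def procrustes(A, B):
    """Return the orthogonal matrix Q minimizing ||Q @ A - B||_F."""
    U, _, Vt = np.linalg.svd(B @ A.T)
    return U @ Vt

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 10))
Q_true = np.linalg.qr(rng.standard_normal((4, 4)))[0]   # a known orthogonal map
B = Q_true @ A
Q = procrustes(A, B)
print(np.allclose(Q, Q_true))             # True: the orthogonal map is recovered
print(np.linalg.norm(Q @ A - B))          # ~0
```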
3. Hilbert-Schmidt-Hankel norm model reduction for matrix second-order linear systems
Authors: Qing Wang, Tongke Zhong, Ngai Wong, Qingyang Wang. Control Theory and Applications (English Edition) (EI), 2011, No. 4, pp. 571-578 (8 pages).
Abstract: This paper considers the optimal model reduction problem for matrix second-order linear systems in the sense of the Hilbert-Schmidt-Hankel norm, with the reduced-order systems preserving the structure of the original systems. Expressions for the error function and its gradient are derived. Two numerical examples are given to illustrate the presented model reduction technique.
Keywords: Model reduction; Matrix second-order linear system; Hilbert-Schmidt-Hankel norm; Gradient
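As background for the norm used in this entry, the sketch below evaluates the Hilbert-Schmidt-Hankel norm, taken here to be sqrt(trace(PQ)) with P and Q the controllability and observability Gramians, for a small mass-spring-damper system written in matrix second-order form M q'' + D q' + K q = B2 u, y = C2 q. The 2-DOF data is an illustrative assumption and this is not the paper's gradient-based reduction algorithm; in the paper's setting the same norm is applied to the error (difference) system while the reduced model is constrained to remain second-order.

```python
# Hilbert-Schmidt-Hankel norm of a toy second-order system via its
# standard first-order realization and Gramian Lyapunov equations.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Second-order data: mass, damping, stiffness, input, output (illustrative).
M  = np.eye(2)
D  = 0.4 * np.eye(2)
K  = np.array([[2.0, -1.0], [-1.0, 2.0]])
B2 = np.array([[1.0], [0.0]])
C2 = np.array([[0.0, 1.0]])

# Equivalent first-order realization with state x = [q; q'].
n = M.shape[0]
Minv = np.linalg.inv(M)
A = np.block([[np.zeros((n, n)), np.eye(n)], [-Minv @ K, -Minv @ D]])
B = np.vstack([np.zeros((n, 1)), Minv @ B2])
C = np.hstack([C2, np.zeros((1, n))])

# Controllability and observability Gramians:  A P + P A^T = -B B^T,  A^T Q + Q A = -C^T C.
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

hsv = np.sqrt(np.abs(np.linalg.eigvals(P @ Q)))   # Hankel singular values
hsh_norm = np.sqrt(np.trace(P @ Q))               # Hilbert-Schmidt-Hankel norm
print(np.sort(hsv)[::-1], hsh_norm)
```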