Hello friends! Welcome to my page! I am Senwei Liang. I joined Lawrence Berkeley National Laboratory as a postdoc in August 2022 under the supervision of Prof. Chao Yang and Prof. Lin Lin. Before that, I obtained my PhD from Purdue University, supervised by Prof. Haizhao Yang, and received my MSc degree from the National University of Singapore and my BSc degree from Sun Yat-sen University.
Research Interests: Scientific machine learning and AI for Science.
(2023) Probing reaction channels via reinforcement learning
We propose an RL-based method to generate important configurations that connect reactant and product states along chemical reaction paths. These configurations can be effectively employed in a neural-network PDE solver to obtain a solution of the backward Kolmogorov equation, even when the dimension of the problem is very high.
S. Liang, A. N. Singh, Y. Zhu, D. T. Limmer, C. Yang [PDF].
(2023) Stationary Density Estimation of Itô Diffusions Using Deep Learning
We propose a deep learning scheme to estimate the stationary density of an Itô diffusion from a discrete time series that approximates solutions of the underlying stochastic differential equation. We establish the convergence of the proposed scheme.
Y. Gu, J. Harlim, S. Liang, H. Yang, SIAM Journal on Numerical Analysis (Corresp. author) [PDF].
(2022) Accelerating numerical solvers for large-scale simulation of dynamical system via NeurVec
We propose a data-driven corrector method that allows the use of large step sizes while compensating for the integration error to maintain high accuracy.
Z. Huang, S. Liang, H. Zhang, H. Yang, L. Lin, submitted (Joint first) [PDF, Code].
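A minimal sketch of the idea, not the paper's implementation: a coarse forward-Euler step is augmented with a small network that learns to compensate the local integration error. The toy dynamics and the corrector architecture below are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical toy dynamics: a harmonic oscillator dx/dt = f(x), x in R^2.
def f(x):
    return torch.stack([x[:, 1], -x[:, 0]], dim=1)

# Small network intended to absorb the error of a coarse step (assumed architecture).
corrector = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

def coarse_step(x, h):
    # Forward Euler with a large step size h plus a learned correction.
    return x + h * f(x) + corrector(x)

# Training would fit `corrector` so that coarse_step over one large step h
# matches a fine, accurate integrator evaluated over the same interval.
```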
(2022) Finite Expression Method for Solving High-Dimensional Partial Differential Equations
We introduce a new methodology that seeks an approximate PDE solution in the space of functions with finitely many analytic expressions; hence, this methodology is named the finite expression method (FEX). A heavily simplified illustration appears below.
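The sketch below is not the actual FEX search, which also selects the operators themselves: it fixes one candidate expression, here a*sin(b*x) with trainable a and b, and fits it by minimizing a PDE residual. The 1D Poisson problem and the candidate form are assumptions for illustration.

```python
import math
import torch

# Hypothetical 1D problem: u''(x) = -sin(x) on [0, pi], u(0) = u(pi) = 0.
# Candidate expression with finitely many operators: u(x) = a * sin(b * x).
a = torch.tensor(0.5, requires_grad=True)
b = torch.tensor(1.5, requires_grad=True)
opt = torch.optim.Adam([a, b], lr=1e-2)

for _ in range(2000):
    x = torch.rand(128, 1) * math.pi
    x.requires_grad_(True)
    u = a * torch.sin(b * x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    residual = ((d2u + torch.sin(x)) ** 2).mean()   # interior PDE residual
    xb = torch.tensor([[0.0], [math.pi]])           # boundary penalty
    boundary = ((a * torch.sin(b * xb)) ** 2).mean()
    loss = residual + boundary
    opt.zero_grad()
    loss.backward()
    opt.step()
# (a, b) should converge near (1, 1), recovering the exact solution u(x) = sin(x).
```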
(2022) Quantifying spatial homogeneity of urban road networks via graph neural networks
We leverage the power of graph neural networks to model the road network system and use its predictability to quantify spatial homogeneity. The proposed measurement is shown to be a nonlinear integration of multiple geometric properties, and we demonstrate its connection with road irregularity and socioeconomic status indicators.
J. Xue, N. Jiang, S. Liang, Q. Pang, T. Yabe, S. Ukkusuri, J. Ma, Nature Machine Intelligence, 4, 246–257 (2022) [PDF, Code].
(2021) Stiffness-aware neural network for learning Hamiltonian systems
We propose the stiffness-aware neural network (SANN), a new method for learning stiff Hamiltonian dynamical systems from data. SANN identifies and splits the training data into stiff and non-stiff portions based on a stiffness-aware index, a metric that quantifies the stiffness of the dynamical system.
S. Liang, Z. Huang, H. Zhang, ICLR 2022 [PDF].
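A minimal sketch of the data-splitting step, with a simple proxy score standing in for the paper's stiffness-aware index:

```python
import numpy as np

def split_by_stiffness(x, dxdt, threshold=10.0):
    # x: states (N, d); dxdt: time derivatives (N, d).
    # The score below is only a hypothetical proxy for the paper's
    # stiffness-aware index; the threshold is likewise an assumption.
    score = np.linalg.norm(dxdt, axis=1) / (np.linalg.norm(x, axis=1) + 1e-8)
    stiff = score > threshold
    return (x[stiff], dxdt[stiff]), (x[~stiff], dxdt[~stiff])

# The two portions would then be handled separately during training.
```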
(2021) Solving PDEs on Unknown Manifolds with Machine Learning
We propose a mesh-free computational framework and machine learning theory for solving elliptic PDEs on unknown manifolds, identified with point clouds, based on diffusion maps (DM) and deep learning.
S. Liang, S. Jiang, J. Harlim, H. Yang, Submitted [PDF].
(2021) Reproducing Activation Function for Deep Learning
We propose reproducing activation functions, which employ several basic functions and their learnable linear combinations to construct neuron-wise, data-driven activation functions.
S. Liang, L. Lyu, C. Wang, H. Yang, Submitted [PDF].
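A minimal sketch, assuming a small illustrative basis (sin, ReLU, x, x^2); the actual choice and parameterization of the basic functions follow the paper:

```python
import torch
import torch.nn as nn

class ReproducingActivation(nn.Module):
    # Neuron-wise learnable combination of a few basic functions.
    # The basis below and the initialization scale are assumptions.
    def __init__(self, width):
        super().__init__()
        self.coef = nn.Parameter(0.1 * torch.randn(width, 4))  # one weight per neuron per basis function

    def forward(self, x):
        basis = torch.stack([torch.sin(x), torch.relu(x), x, x ** 2], dim=-1)
        return (basis * self.coef).sum(dim=-1)

net = nn.Sequential(nn.Linear(3, 32), ReproducingActivation(32), nn.Linear(32, 1))
```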
(2021) Efficient Attention Network: Accelerate Attention by Searching Where to Plug
To improve the efficiency of existing attention modules, we share a single attention module within the backbone and use reinforcement learning to search for where to connect the shared module.
Z. Huang, S. Liang, M. Liang, W. He, H. Yang, Submitted, (Joint first) [PDF, Code].
(2021) Blending Pruning Criteria for Convolutional Neural Networks
We propose a novel framework that integrates existing filter pruning criteria by exploring their diversity. The proposed framework contains two stages: Criteria Clustering and Filters Importance Calibration.
W. He, Z. Huang, M. Liang, S. Liang, H. Yang, ICANN 2021 [PDF].
(2019) Machine learning for prediction with missing dynamics
We propose a framework that reformulates the prediction problem as a supervised learning problem: approximating a map that takes the memories of the resolved and identifiable unresolved variables to the missing components in the resolved dynamics. A sketch of this reformulation follows the citation below.
J. Harlim, S. Jiang, S. Liang, H. Yang, J. Comput. Phys. (Alphabetical order) [PDF].
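A minimal sketch of the supervised reformulation, with placeholder data: the features are a short memory of the resolved variables, and the target is the missing component at the current step. The memory length, model, and data below are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

T, d, m = 5000, 2, 10          # trajectory length, dimension, memory length (assumed)
x = np.random.randn(T, d)      # placeholder resolved trajectory
g = np.random.randn(T, d)      # placeholder missing components to be learned

# Each input is the last m resolved states; each target is the current missing term.
X = np.stack([x[t - m:t].ravel() for t in range(m, T)])
y = g[m:]
model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=200).fit(X, y)
```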
(2019) Instance Enhancement Batch Normalization: An Adaptive Regulator of Batch Noise
We point out that the self-attention mechanism can help regulate batch noise by enhancing instance-specific information, and we propose a normalization layer that recalibrates the information of each channel via a simple linear transformation.
S. Liang, Z. Huang, M. Liang, H. Yang, AAAI-2020 [PDF, Code].
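A minimal sketch, assuming the recalibration multiplies the batch-normalized feature by a sigmoid gate computed from a per-channel linear transform of the instance statistics:

```python
import torch
import torch.nn as nn

class IEBN(nn.Module):
    # Sketch: BatchNorm followed by an instance-specific per-channel gate.
    # The placement of the statistics and the initialization are assumptions.
    def __init__(self, channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels)
        self.scale = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.bias = nn.Parameter(-torch.ones(1, channels, 1, 1))

    def forward(self, x):
        y = self.bn(x)
        inst_mean = y.mean(dim=(2, 3), keepdim=True)   # per-instance, per-channel statistic
        gate = torch.sigmoid(self.scale * inst_mean + self.bias)
        return y * gate
```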
(2018) Drop-activation: Implicit parameter reduction and harmonic regularization
We propose a regularization method that randomly drops nonlinear activation functions during training by replacing them with the identity function.
S. Liang, Y. Khoo, H. Yang, Communications on Applied Mathematics and Computation [PDF, Code].
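A minimal sketch with ReLU as the nonlinearity; the keep probability and the deterministic test-time blend are assumptions for illustration:

```python
import torch
import torch.nn as nn

class DropActivation(nn.Module):
    # During training, each entry keeps the nonlinearity with probability p
    # and falls back to the identity otherwise; at test time, the two are blended.
    def __init__(self, p=0.95):
        super().__init__()
        self.p = p

    def forward(self, x):
        if self.training:
            mask = torch.bernoulli(torch.full_like(x, self.p))
            return mask * torch.relu(x) + (1 - mask) * x
        return self.p * torch.relu(x) + (1 - self.p) * x
```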