Funding: This work was supported by the National Natural Science Foundation of China (22125902, 22109150, 22279126, U2032202, and 21975243), the DNL Cooperation Fund, CAS (DNL202020), the National Key R&D Program of China (No. 2022YFA1504101), and the Anhui Provincial Natural Science Foundation (2108085QB65).
Abstract: Sodium metal batteries (SMBs) are emerging as viable alternatives to lithium-ion systems owing to their superior energy density and sodium's relative abundance. However, SMBs face significant impediments, particularly the exceedingly high negative-to-positive capacity ratios (N/P ratios) that severely limit energy density and hinder practical application. Herein, a novel nucleophilic Na₃P interphase on aluminum foil is designed to significantly lower the nucleation energy barrier for sodium deposition, markedly reducing the nucleation overpotential and effectively mitigating dendritic growth at a high sodium deposition capacity of 5 mA h cm⁻². The interphase enables stable cycling in anode-less SMB configurations with a low N/P ratio of 1.4 and a high cathode mass loading of 11.5 mg cm⁻², retaining 92.4% of capacity after 500 cycles even at a 1 C rate. This work represents a promising step toward high-energy-density, anode-less SMBs, offering a potential solution to the longstanding issues of cycle stability and energy efficiency.
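The headline figures above can be related by a short calculation: the N/P ratio is the areal capacity of the negative electrode divided by that of the positive electrode, so the anode capacity implied by the stated loading of 11.5 mg cm⁻² and N/P ratio of 1.4 follows once a cathode specific capacity is assumed. A minimal sketch in Python; the value of 120 mA h g⁻¹ below is an illustrative assumption for the cathode specific capacity, not a figure taken from the abstract, and the function name is ours:

```python
def anode_areal_capacity(np_ratio, cathode_loading_mg_cm2, cathode_specific_mah_g):
    """Areal anode capacity (mA h cm^-2) implied by an N/P ratio.

    N/P ratio = areal capacity of negative electrode / areal capacity
    of positive electrode. The cathode's areal capacity is its mass
    loading times its specific capacity.
    """
    # Cathode areal capacity: (mg/cm^2) * (mA h/g) / 1000 -> mA h/cm^2
    positive = cathode_loading_mg_cm2 * cathode_specific_mah_g / 1000.0
    return np_ratio * positive

# With the abstract's loading (11.5 mg cm^-2) and N/P = 1.4, and an
# ASSUMED cathode specific capacity of 120 mA h g^-1:
# cathode areal capacity = 1.38 mA h cm^-2, anode = ~1.93 mA h cm^-2.
print(anode_areal_capacity(1.4, 11.5, 120.0))
```

This illustrates why low N/P ratios matter: the closer the ratio is to 1, the less excess sodium the cell carries, and the higher the cell-level energy density.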
Abstract: Overfitting frequently occurs in deep learning. In this paper, we propose a novel regularization method called drop-activation to reduce overfitting and improve generalization. The key idea is to drop nonlinear activation functions by setting them to identity functions randomly during training. During testing, we use a deterministic network with a new activation function that encodes the average effect of randomly dropping activations. Our theoretical analyses support the regularization effect of drop-activation as implicit parameter reduction and verify that it can be used together with batch normalization (Ioffe and Szegedy, in Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, arXiv:1502.03167, 2015). Experimental results on CIFAR10, CIFAR100, SVHN, EMNIST, and ImageNet show that drop-activation generally improves the performance of popular neural network architectures on the image classification task. Furthermore, as a regularizer, drop-activation can be used in harmony with standard training and regularization techniques such as batch normalization and AutoAugment (Cubuk et al., in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 113-123, 2019). The code is available at https://github.com/LeungSamWai/Drop-Activation.
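The train/test rule described above can be sketched in a few lines of plain Python, taking ReLU as the nonlinearity. During training each unit keeps its ReLU with probability p and becomes the identity otherwise; at test time the activation is replaced by its expectation, p·ReLU(x) + (1 − p)·x, which for ReLU amounts to a leaky-ReLU-like function with negative slope 1 − p. The function names and the per-element Bernoulli sampling are our illustrative assumptions, not the repository's API:

```python
import random

def drop_activation_train(x, p=0.95):
    """Training-time drop-activation over a list of pre-activations.

    Each element independently keeps its ReLU with probability p;
    otherwise the nonlinearity is dropped (identity).
    """
    return [max(xi, 0.0) if random.random() < p else xi for xi in x]

def drop_activation_test(x, p=0.95):
    """Deterministic test-time activation.

    The expectation of the random training activation:
    p * ReLU(x) + (1 - p) * x. For negative inputs this passes
    a fraction (1 - p) of the signal through, like a leaky ReLU.
    """
    return [p * max(xi, 0.0) + (1 - p) * xi for xi in x]
```

Setting p = 1 recovers a plain ReLU network and p = 0 a purely linear one, which is one way to see the "implicit parameter reduction" reading: randomly linearizing units interpolates the network toward a lower-complexity linear model.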
Funding: The research of the second author was partially supported by US NSF grant DMS-1945029. The research of the third author was supported in part by US NSF DMS-1719699 and the NSF TRIPODS program CCF-1704833.
Abstract: We prove a theorem concerning the approximation of generalized bandlimited multivariate functions by deep ReLU networks, for which the curse of dimensionality is overcome. Our theorem is based on a result by Maurey and on the ability of deep ReLU networks to approximate Chebyshev polynomials and analytic functions efficiently.