Journal Articles
42,680 articles found
1. Ostensibly perpetual optical data storage in glass with ultra-high stability and tailored photoluminescence (Cited by: 4)
Authors: Zhuo Wang, Bo Zhang, Dezhi Tan, Jianrong Qiu. Opto-Electronic Advances (SCIE, EI, CAS, CSCD), 2023, No. 1, pp. 1-8.
Long-term optical data storage (ODS) technology is essential to break the bottleneck of high energy consumption for information storage in the current era of big data. Here, ODS with an ultralong lifetime of 2×10^7 years is attained with single ultrafast laser pulse induced reduction of Eu^3+ ions and tailoring of optical properties inside Eu-doped aluminosilicate glasses. We demonstrate that the induced local modifications in the glass can stand against temperatures of up to 970 K and strong ultraviolet light irradiation with a power density of 100 kW/cm^2. Furthermore, the active Eu^2+ ions exhibit strong and broadband emission with a full width at half maximum reaching 190 nm, and the photoluminescence (PL) is flexibly tunable in the whole visible region by regulating the alkaline earth metal ions in the glasses. The developed technology and materials will be of great significance in photonic applications such as long-term ODS.
Keywords: ultrafast laser; photoluminescence tailoring; ultralong lifetime; optical data storage
2. Data Secure Storage Mechanism for IIoT Based on Blockchain (Cited by: 1)
Authors: Jin Wang, Guoshu Huang, R. Simon Sherratt, Ding Huang, Jia Ni. Computers, Materials & Continua (SCIE, EI), 2024, No. 3, pp. 4029-4048.
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate the inherent defects of the IIoT. In the traditional blockchain, data is stored in a Merkle tree. As data continues to grow, the scale of the proofs used to validate it grows as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store big data generated by IIoT. PVC improves the efficiency of traditional VC in the commitment and opening processes. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. This mechanism can greatly reduce communication loss and maximize the rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
Keywords: blockchain; IIoT; data storage; cryptographic commitment
3. A phenology-based vegetation index for improving ratoon rice mapping using harmonized Landsat and Sentinel-2 data (Cited by: 1)
Authors: Yunping Chen, Jie Hu, Zhiwen Cai, Jingya Yang, Wei Zhou, Qiong Hu, Cong Wang, Liangzhi You, Baodong Xu. Journal of Integrative Agriculture (SCIE, CAS, CSCD), 2024, No. 4, pp. 1164-1178.
Ratoon rice, which refers to a second harvest of rice obtained from the regenerated tillers originating from the stubble of the first harvested crop, plays an important role in both food security and agroecology while requiring minimal agricultural inputs. However, accurately identifying ratoon rice crops is challenging due to the similarity of its spectral features with other rice cropping systems (e.g., double rice). Moreover, images with a high spatiotemporal resolution are essential, since ratoon rice is generally cultivated in fragmented croplands within regions that frequently exhibit cloudy and rainy weather. In this study, taking Qichun County in Hubei Province, China as an example, we developed a new phenology-based ratoon rice vegetation index (PRVI) for ratoon rice mapping at a 30 m spatial resolution using a robust time series generated from Harmonized Landsat and Sentinel-2 (HLS) images. The PRVI, which incorporates the red, near-infrared, and shortwave infrared 1 bands, was developed based on the analysis of spectro-phenological separability and feature selection. Based on actual field samples, the performance of the PRVI for ratoon rice mapping was carefully evaluated by comparing it to several vegetation indices, including the normalized difference vegetation index (NDVI), enhanced vegetation index (EVI), and land surface water index (LSWI). The results suggested that the PRVI could sufficiently capture the specific characteristics of ratoon rice, leading to a favorable separability between ratoon rice and other land cover types. Furthermore, the PRVI showed the best performance for identifying ratoon rice in the phenological phases characterized by grain filling and harvesting to tillering of the ratoon crop (GHS-TS2), indicating that only a few images are required to obtain an accurate ratoon rice map. Finally, the PRVI performed better than NDVI, EVI, LSWI, and their combination at the GHS-TS2 stages, with producer's and user's accuracies of 92.22% and 89.30%, respectively. These results demonstrate that the proposed PRVI based on HLS data can effectively identify ratoon rice in fragmented croplands at crucial phenological stages, which is promising for identifying the earliest timing of ratoon rice planting and can provide a fundamental dataset for crop management activities. [An illustrative code sketch follows this entry.]
Keywords: ratoon rice; phenology-based ratoon rice vegetation index (PRVI); phenological phase; feature selection; Harmonized Landsat Sentinel-2 data
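The abstract names the bands PRVI incorporates but not its formula. The sketch below computes the comparison indices it cites (NDVI, EVI, LSWI) from HLS-style surface reflectance, plus a purely hypothetical red/NIR/SWIR1 combination as a placeholder for PRVI; the placeholder is an assumption, not the published index.

```python
# Baseline indices from the abstract (NDVI, EVI, LSWI) and a hypothetical
# PRVI-style red/NIR/SWIR1 combination (NOT the published formula).
import numpy as np

def ndvi(red, nir):
    return (nir - red) / (nir + red + 1e-9)

def evi(blue, red, nir):
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def lswi(nir, swir1):
    return (nir - swir1) / (nir + swir1 + 1e-9)

def prvi_placeholder(red, nir, swir1):
    # Hypothetical stand-in emphasizing NIR against red and SWIR1,
    # the three bands the abstract says PRVI uses.
    return (nir - red - swir1) / (nir + red + swir1 + 1e-9)

# Example with synthetic reflectance values (0-1).
blue, red, nir, swir1 = 0.04, 0.05, 0.35, 0.15
print(ndvi(red, nir), evi(blue, red, nir), lswi(nir, swir1),
      prvi_placeholder(red, nir, swir1))
```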
4. Detection Capability Evaluation of Lunar Mineralogical Spectrometer: Results from Ground Experimental Data
Authors: Fang Gao, Bin Liu, Xin Ren, Da-Wei Liu, Chun-Lai Li. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2023, No. 12, pp. 164-174.
The Chang'E-6 mission will, for the first time, land on the far side of the Moon and bring lunar samples back. As a hyperspectral imager aboard the Chang'E-6 lander, the Lunar Mineralogical Spectrometer (LMS) will achieve the goal of spectral detection and mineral composition analysis in the sampling area, and the LMS data will also be compared with laboratory measurements of the returned samples. Visible and near-infrared hyperspectral remote sensing is an effective tool for lunar mineral identification and quantification, and a ground validation experiment can be used to evaluate the detection ability of the LMS. According to the modal abundances of lunar minerals and glasses in Apollo samples, binary, ternary, and seven-component mixed samples were prepared. The samples were ground and stirred until homogeneous at about 200 mesh (median particle size about 75 μm) to simulate the soil state of the lunar surface. Under laboratory ambient conditions, the 480–3200 nm spectral data of the samples were acquired using the Engineering Qualification Model (EQM) of the Chang'E-5 LMS, whose performance is consistent with the flight model of the Chang'E-6 LMS. By fitting the EQM spectral data of the mixed samples with the Modified Gaussian Model, the following conclusions can be drawn: subtle spectral changes of mixed samples can be detected, and the modal abundances of low-Ca pyroxene, high-Ca pyroxene, and plagioclase can be derived from spectral parameters such as the absorption position, depth, or width of the mixed samples, with correlation coefficients R^2 better than 82%, indicating that the LMS has good quantitative detection capability. [An illustrative code sketch follows this entry.]
Keywords: methods: data analysis; techniques: spectroscopic; Moon
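A minimal sketch of Gaussian absorption-band fitting in the spirit of the Modified Gaussian Model used above, assuming a linear continuum and a single absorption band; the wavelength range, noise level, and parameterization are illustrative, not the LMS processing pipeline.

```python
# Fit a linear continuum plus one Gaussian absorption to a synthetic spectrum.
import numpy as np
from scipy.optimize import curve_fit

def band_model(wl, c0, c1, strength, center, width):
    continuum = c0 + c1 * wl
    absorption = strength * np.exp(-0.5 * ((wl - center) / width) ** 2)
    return continuum + absorption          # strength is negative for an absorption

wl = np.linspace(800, 1200, 200)           # wavelength in nm
true = band_model(wl, -0.2, 1e-4, -0.15, 1000.0, 60.0)
spectrum = true + np.random.normal(0, 0.005, wl.size)   # synthetic log-reflectance

p0 = [-0.2, 1e-4, -0.1, 990.0, 50.0]       # initial guess
popt, _ = curve_fit(band_model, wl, spectrum, p0=p0)
print("fitted band center (nm):", popt[3], "band depth:", -popt[2])
```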
5. A Comparative Study of Chinese and UK Scientific Data Repository Platforms Based on re3data
Authors: Yuan Ye, Chen Yuanyuan. Digital Library Forum (数字图书馆论坛), 2024, No. 2, pp. 13-23.
Using re3data as the data source, 406 scientific data repositories from China and the United Kingdom were selected as research objects. Their development was compared across five aspects and eleven indicators, covering distribution characteristics, responsibility types, repository licensing, technical standards, and quality standards. Based on this analysis, suggestions are offered for the sustainable development of data repositories in China: broadly connect heterogeneous institutions at home and abroad, promote exchange and cooperation across multiple disciplines, effectively expand repository licensing permissions and types, optimize the current application of technical standards, and improve the flexibility of metadata use.
Keywords: scientific data; data repository platform; re3data; China; United Kingdom
6. Data Component: An Innovative Framework for Information Value Metrics in the Digital Economy
Authors: Tao Xiaoming, Wang Yu, Peng Jieyang, Zhao Yuelin, Wang Yue, Wang Youzheng, Hu Chengsheng, Lu Zhipeng. China Communications (SCIE, CSCD), 2024, No. 5, pp. 17-35.
The increasing dependence on data highlights the need for a detailed understanding of its behavior, encompassing the challenges involved in processing and evaluating it. However, current research lacks a comprehensive structure for measuring the worth of data elements, hindering effective navigation of the changing digital environment. This paper aims to fill this research gap by introducing the innovative concept of "data components." It proposes a graph-theoretic representation model that presents a clear mathematical definition and demonstrates the superiority of data components over traditional processing methods. Additionally, the paper introduces an information measurement model that provides a way to calculate the information entropy of data components and establish their increased informational value. The paper also assesses the value of information, suggesting a pricing mechanism based on its significance. In conclusion, this paper establishes a robust framework for understanding and quantifying the value of implicit information in data, laying the groundwork for future research and practical applications. [An illustrative code sketch follows this entry.]
Keywords: data component; data element; data governance; data science; information theory
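A minimal sketch of the entropy side of the framework, assuming a data component can be treated as a column of discrete values; the paper's graph-theoretic model and pricing mechanism are not reproduced here.

```python
# Shannon entropy of a discrete "data component" (a column of categorical values).
import math
from collections import Counter

def shannon_entropy(values):
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

component = ["A", "B", "A", "C", "B", "A", "D", "A"]
print("entropy (bits):", shannon_entropy(component))
```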
7. Traffic Flow Prediction with Heterogeneous Spatiotemporal Data Based on a Hybrid Deep Learning Model Using Attention-Mechanism
Authors: Jing-Doo Wang, Chayadi Oktomy Noto Susanto. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 8, pp. 1711-1728.
A significant obstacle in intelligent transportation systems (ITS) is the capacity to predict traffic flow. Recent advancements in deep neural networks have enabled the development of models that represent traffic flow accurately. However, accurately predicting traffic flow at the individual road level is extremely difficult due to the complex interplay of spatial and temporal factors. This paper proposes a technique for predicting short-term traffic flow using an architecture that combines convolutional bidirectional long short-term memory (Conv-BiLSTM) with attention mechanisms. Prior studies neglected to include data on factors such as holidays, weather conditions, and vehicle types, which are interconnected and significantly impact the accuracy of forecast outcomes. In addition, this research incorporates recurring monthly periodic pattern data, which significantly enhances forecast accuracy. The experimental findings demonstrate a performance improvement of 21.68% when incorporating the vehicle type feature. [An illustrative code sketch follows this entry.]
Keywords: traffic flow prediction; spatiotemporal data; heterogeneous data; Conv-BiLSTM; data-centric; intra-data
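A hedged sketch of a Conv-BiLSTM regressor with a simple attention pooling, of the general kind the abstract describes; the layer sizes, feature set, and attention form are assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class ConvBiLSTMAttention(nn.Module):
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.conv = nn.Conv1d(n_features, 32, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(32, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)   # attention scores over time steps
        self.out = nn.Linear(2 * hidden, 1)     # next-step traffic flow

    def forward(self, x):                       # x: (batch, time, features)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        h, _ = self.lstm(h)                     # (batch, time, 2*hidden)
        w = torch.softmax(self.score(h), dim=1) # attention weights
        context = (w * h).sum(dim=1)
        return self.out(context).squeeze(-1)

model = ConvBiLSTMAttention(n_features=6)       # e.g., flow, weather, holiday, vehicle type...
x = torch.randn(8, 24, 6)                       # 8 samples, 24 time steps
print(model(x).shape)                           # torch.Size([8])
```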
8. Redundant Data Detection and Deletion to Meet Privacy Protection Requirements in Blockchain-Based Edge Computing Environment
Authors: Zhang Lejun, Peng Minghui, Su Shen, Wang Weizheng, Jin Zilong, Su Yansen, Chen Huiling, Guo Ran, Sergey Gataullin. China Communications (SCIE, CSCD), 2024, No. 3, pp. 149-159.
With the rapid development of information technology, IoT devices play a huge role in physiological health data detection. The exponential growth of medical data requires us to reasonably allocate storage space between cloud servers and edge nodes. Because the storage capacity of edge nodes close to users is limited, hotspot data should be stored in edge nodes as much as possible to ensure response timeliness and access hit rate. However, current schemes cannot guarantee that every sub-message in a complete data item stored by an edge node meets the requirements of hot data. How to detect and delete redundant data in edge nodes while protecting user privacy and dynamic data integrity has therefore become a challenging problem. This paper proposes a redundant data detection method that meets privacy protection requirements: by scanning the ciphertext, it determines whether each sub-message of the data in an edge node meets the hot-data requirements. It has the same effect as a zero-knowledge proof and does not reveal user privacy. In addition, for redundant sub-data that does not meet the hot-data requirements, this paper proposes a redundant data deletion scheme that preserves dynamic data integrity, using a Content Extraction Signature (CES) to generate the signature of the remaining hot data after the redundant data is deleted. The feasibility of the scheme is demonstrated through security and efficiency analyses.
Keywords: blockchain; data integrity; edge computing; privacy protection; redundant data
9. Big Data Access Control Mechanism Based on Two-Layer Permission Decision Structure
Authors: Aodi Liu, Na Wang, Xuehui Du, Dibin Shan, Xiangyu Wu, Wenjuan Wang. Computers, Materials & Continua (SCIE, EI), 2024, No. 4, pp. 1705-1726.
Big data resources are characterized by large scale, wide sources, and strong dynamics. Existing access control mechanisms based on manual policy formulation by security experts suffer from drawbacks such as low policy management efficiency and difficulty in accurately describing the access control policy. To overcome these problems, this paper proposes a big data access control mechanism based on a two-layer permission decision structure. This mechanism extends the attribute-based access control (ABAC) model by introducing business attributes as business constraints between entities. The proposed mechanism implements a two-layer permission decision structure composed of the inherent attributes of access control entities and the business attributes, which constitute, respectively, the general permission decision algorithm based on logical calculation and the business permission decision algorithm based on a bi-directional long short-term memory (BiLSTM) neural network. The general permission decision algorithm implements accurate policy decisions, while the business permission decision algorithm implements fuzzy decisions based on the business constraints. The BiLSTM neural network calculates the similarity of the business attributes to realize intelligent, adaptive, and efficient access control permission decisions. Through the two-layer permission decision structure, the complex and diverse access control management requirements of big data can be satisfied while considering both the security and the availability of resources. Experimental results show that the proposed mechanism is effective and reliable and can efficiently support the secure sharing of big data resources. [An illustrative code sketch follows this entry.]
Keywords: big data; access control; data security; BiLSTM
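A hedged sketch of the fuzzy "business permission decision" idea: encode two business-attribute token sequences with a BiLSTM and compare them by cosine similarity; the vocabulary, layer sizes, and decision threshold are illustrative assumptions, not the paper's trained model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttributeEncoder(nn.Module):
    def __init__(self, vocab_size=1000, embed=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed)
        self.lstm = nn.LSTM(embed, hidden, batch_first=True, bidirectional=True)

    def forward(self, tokens):                  # tokens: (batch, seq_len) of ids
        h, _ = self.lstm(self.embed(tokens))
        return h.mean(dim=1)                    # (batch, 2*hidden) sequence embedding

encoder = AttributeEncoder()
subject_attr = torch.randint(0, 1000, (1, 12))  # tokenized business attributes (toy)
resource_attr = torch.randint(0, 1000, (1, 12))
sim = F.cosine_similarity(encoder(subject_attr), encoder(resource_attr)).item()
permit = sim > 0.8                              # fuzzy decision on an assumed threshold
print(f"similarity={sim:.3f}, permit={permit}")
```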
10. Defect Detection Model Using Time Series Data Augmentation and Transformation
Authors: Gyu-Il Kim, Hyun Yoo, Han-Jin Cho, Kyungyong Chung. Computers, Materials & Continua (SCIE, EI), 2024, No. 2, pp. 1713-1730.
Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Therefore, methodologies for data augmentation and for converting time series data into images for analysis have been studied. This paper proposes a fault detection model that uses time series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. Data augmentation is performed by adding Gaussian noise, with the noise level set to 0.002 to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time series data into images; this enables the identification of patterns in time series data and assists in capturing their sequential dependencies. For anomaly detection, the PatchCore model is applied and shows excellent performance, and the detected anomaly areas are represented as heat maps. By applying an anomaly map to the original image, it is possible to capture the areas where anomalies occur. The performance evaluation shows that both the F1-score and accuracy are high when time series data are converted to images. Additionally, when the data are processed as images rather than as time series, there is a significant reduction in both the size of the data and the training time. The proposed method can provide an important springboard for research on anomaly detection using time series data and helps address problems such as analyzing complex patterns in data in a lightweight manner. [An illustrative code sketch follows this entry.]
Keywords: defect detection; time series; deep learning; data augmentation; data transformation
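A sketch of the two preprocessing steps named above, Gaussian-noise augmentation at level 0.002 and Markov Transition Field encoding, assuming the pyts library for the MTF; the image size, bin count, and synthetic signal are assumptions, not the paper's settings.

```python
import numpy as np
from pyts.image import MarkovTransitionField

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 256))            # synthetic sensor signal
augmented = series + rng.normal(0.0, 0.002, series.shape)  # noise level from the abstract

mtf = MarkovTransitionField(image_size=64, n_bins=8)
image = mtf.fit_transform(augmented.reshape(1, -1))[0]     # (64, 64) image for a vision model
print(image.shape)
```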
11. Assimilation of GOES-R Geostationary Lightning Mapper Flash Extent Density Data in GSI 3DVar, EnKF, and Hybrid En3DVar for the Analysis and Short-Term Forecast of a Supercell Storm Case
Authors: Rong Kong, Ming Xue, Edward R. Mansell, Chengsi Liu, Alexandre O. Fierro. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, No. 2, pp. 263-277.
Capabilities to assimilate Geostationary Operational Environmental Satellite R-series (GOES-R) Geostationary Lightning Mapper (GLM) flash extent density (FED) data within the operational Gridpoint Statistical Interpolation ensemble Kalman filter (GSI-EnKF) framework were previously developed and tested with a mesoscale convective system (MCS) case. In this study, such capabilities are further developed to assimilate GOES GLM FED data within the GSI ensemble-variational (EnVar) hybrid data assimilation (DA) framework. The results of assimilating the GLM FED data using 3DVar and pure En3DVar (PEn3DVar, using 100% ensemble covariance and no static covariance) are compared with those of EnKF/DfEnKF for a supercell storm case. The focus of this study is to validate the correctness and evaluate the performance of the new implementation rather than to compare the performance of FED DA among different DA schemes; only the results of 3DVar and PEn3DVar are examined and compared with EnKF/DfEnKF. Assimilation of a single FED observation shows that the magnitude and horizontal extent of the analysis increments from PEn3DVar are generally larger than those from EnKF, which is mainly caused by the different localization strategies used in EnKF/DfEnKF and PEn3DVar as well as by the integration limits of the graupel mass in the observation operator. Overall, the forecast performance of PEn3DVar is comparable to EnKF/DfEnKF, suggesting a correct implementation. [An illustrative code sketch follows this entry.]
Keywords: GOES-R; lightning; data assimilation; EnKF; EnVar
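A toy stochastic EnKF analysis step for a single observation, illustrating the ensemble update that EnKF/DfEnKF and the hybrid schemes build on; the ensemble size, observation error, and linear observation operator are assumptions, not the GSI system or the FED observation operator.

```python
import numpy as np

rng = np.random.default_rng(1)
n_state, n_ens = 5, 40
X = rng.normal(0.0, 1.0, (n_state, n_ens))       # prior ensemble (state x members)
H = np.zeros((1, n_state)); H[0, 2] = 1.0         # observe the 3rd state variable
y, obs_err = 1.5, 0.2                             # observation and its error std

Xm = X.mean(axis=1, keepdims=True)
Xp = X - Xm                                       # ensemble perturbations
HXp = H @ Xp
P_hh = (HXp @ HXp.T) / (n_ens - 1) + obs_err**2   # innovation covariance
P_xh = (Xp @ HXp.T) / (n_ens - 1)                 # state-observation covariance
K = P_xh / P_hh                                   # Kalman gain (scalar observation)

# Stochastic EnKF: perturb the observation for each member.
y_pert = y + rng.normal(0.0, obs_err, (1, n_ens))
X_analysis = X + K @ (y_pert - H @ X)
print("prior mean:", X[2].mean(), "analysis mean:", X_analysis[2].mean())
```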
12. Reliable Data Collection Model and Transmission Framework in Large-Scale Wireless Medical Sensor Networks
Authors: Haosong Gou, Gaoyi Zhang, Renê Ripardo Calixto, Senthil Kumar Jagatheesaperumal, Victor Hugo C. de Albuquerque. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 7, pp. 1077-1102.
Large-scale wireless sensor networks (WSNs) play a critical role in monitoring dangerous scenarios and responding to medical emergencies. However, the inherent instability and error-prone nature of wireless links present significant challenges, necessitating efficient data collection and reliable transmission services. This paper addresses the limitations of existing data transmission and recovery protocols by proposing a systematic end-to-end design tailored for medical event-driven, cluster-based, large-scale WSNs. The primary goal is to enhance the reliability of data collection and transmission services, ensuring a comprehensive and practical approach. Our approach focuses on refining the hop-count-based routing scheme to achieve fairness in forwarding reliability. Additionally, it emphasizes reliable data collection within clusters and establishes robust data transmission over multiple hops. These systematic improvements are designed to optimize the overall performance of the WSN in real-world scenarios. Simulation results of the proposed protocol validate its exceptional performance compared to other prominent data transmission schemes. The evaluation spans varying sensor densities, wireless channel conditions, and packet transmission rates, showcasing the protocol's superiority in ensuring reliable and efficient data transfer. Our systematic end-to-end design successfully addresses the challenges posed by the instability of wireless links in large-scale WSNs. By prioritizing fairness, reliability, and efficiency, the proposed protocol demonstrates its efficacy in enhancing data collection and transmission services, thereby offering a valuable contribution to the field of medical event-driven WSNs.
Keywords: wireless sensor networks; reliable data transmission; medical emergencies; cluster; data collection; routing scheme
13. Research on Interpolation Method for Missing Electricity Consumption Data
Authors: Junde Chen, Jiajia Yuan, Weirong Chen, Adnan Zeb, Md Suzauddola, Yaser A. Nanehkaran. Computers, Materials & Continua (SCIE, EI), 2024, No. 2, pp. 2575-2591.
Missing values are one of the main factors that cause dirty data. Without high-quality data, there will be no reliable analysis results or precise decision-making. Therefore, the data warehouse needs to integrate high-quality data consistently. In the power system, the electricity consumption data of some large users cannot be collected normally, resulting in missing data, which affects the calculation of power supply and eventually leads to a large error in the daily power line loss rate. For the problem of missing electricity consumption data, this study proposes a group method of data handling (GMDH)-based data interpolation method for distribution power networks and applies it in the analysis of actually collected electricity data. First, the dependent and independent variables are defined from the original data, and the upper and lower limits of missing values are determined according to prior knowledge or existing data information. All missing data are randomly interpolated within the upper and lower limits. Then, the GMDH network is established to obtain the optimal complexity model, which is used to predict the missing data and replace the last imputed electricity consumption values. Finally, this process is repeated iteratively until the missing values no longer change. Under a relatively small noise level (α=0.25), the proposed approach achieves a maximum error of no more than 0.605%. Experimental findings demonstrate the efficacy and feasibility of the proposed approach, which realizes the transformation from incomplete data to complete data. This data interpolation approach also provides a strong basis for electricity theft diagnosis and metering fault analysis in electricity enterprises. [An illustrative code sketch follows this entry.]
Keywords: data interpolation; GMDH; electricity consumption data; distribution system
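A sketch of the iterative imputation loop described above, assuming a gradient-boosting regressor as a stand-in for the GMDH network; the bounds, convergence tolerance, and synthetic data are illustrative.

```python
# Iterative imputation: random start within bounds, fit, re-predict, repeat.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0, 1, (n, 3))                             # e.g., temperature, weekday, lagged load
y = 10 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 0.3, n)    # synthetic daily consumption

missing = rng.choice(n, size=30, replace=False)
observed_mask = ~np.isin(np.arange(n), missing)
lower, upper = y[observed_mask].min(), y[observed_mask].max()

y_obs = y.copy()
y_obs[missing] = rng.uniform(lower, upper, missing.size)  # random start within limits

for _ in range(20):
    model = GradientBoostingRegressor().fit(X, y_obs)
    new_vals = model.predict(X[missing])
    if np.max(np.abs(new_vals - y_obs[missing])) < 1e-3:  # values stopped changing
        break
    y_obs[missing] = new_vals

print("mean absolute imputation error:", np.mean(np.abs(y_obs[missing] - y[missing])))
```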
14. Big data-driven water research towards metaverse
Authors: Minori Uchimiya. Water Science and Engineering (EI, CAS, CSCD), 2024, No. 2, pp. 101-107.
Although big data on water quality parameters is publicly available, virtual simulation has not yet been adequately adopted in environmental chemistry research. A digital twin differs from conventional geospatial modeling approaches and is particularly useful when a systematic laboratory or field experiment is not realistic (e.g., climate impact and water-related environmental catastrophe) or is difficult to design and monitor in real time (e.g., pollutant and nutrient cycles in estuaries, soils, and sediments). Data-driven water research could enable early-warning and disaster-readiness simulations for diverse environmental scenarios, including drinking water contamination.
Keywords: data mining; omics; remote sensing; sensor; chemoinformatics
15. A Hierarchical Method for Locating the Interferometric Fringes of Celestial Sources in the Visibility Data
Authors: Rong Ma, Ruiqing Yan, Hanshuai Cui, Xiaochun Cheng, Jixia Li, Fengquan Wu, Zongyao Yin, Hao Wang, Wenyi Zeng, Xianchuan Yu. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2024, No. 3, pp. 110-128.
In source detection for the Tianlai project, accurately locating the interferometric fringes in visibility data strongly influences downstream tasks such as physical parameter estimation and weak source exploration. Considering that traditional locating methods are time-consuming and supervised methods require a great quantity of expensive labeled data, in this paper we first investigate the characteristics of interferometric fringes in simulated and real scenarios separately, and integrate an almost parameter-free unsupervised clustering method with a seed filling or eraser algorithm to propose a hierarchical, plug-and-play method that improves location accuracy. We then apply our method to locate the interferometric fringes of single and multiple sources in simulation data, and subsequently to real data taken from the Tianlai radio telescope array. Finally, we compare it with state-of-the-art unsupervised methods. The results show that our method is robust in different scenarios and can effectively improve location measurement accuracy. [An illustrative code sketch follows this entry.]
Keywords: methods: data analysis; techniques: image processing; techniques: interferometric
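A hedged sketch of the unsupervised-clustering idea: threshold a synthetic visibility amplitude map and cluster the bright pixels with DBSCAN to locate candidate fringe regions; the threshold and DBSCAN parameters are assumptions and this is not the paper's hierarchical pipeline.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(2)
amp = rng.normal(0, 1, (128, 128))            # noise background (frequency x time)
amp[40:44, :] += 6.0                          # synthetic fringe track

rows, cols = np.where(amp > 3.0)              # keep only strong pixels
points = np.column_stack([rows, cols])
labels = DBSCAN(eps=2.0, min_samples=10).fit_predict(points)

for lab in set(labels) - {-1}:                # -1 is DBSCAN's noise label
    cluster = points[labels == lab]
    print(f"fringe candidate {lab}: rows {cluster[:, 0].min()}-{cluster[:, 0].max()}, "
          f"{len(cluster)} pixels")
```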
16. Enhanced prediction of anisotropic deformation behavior using machine learning with data augmentation
Authors: Sujeong Byun, Jinyeong Yu, Seho Cheon, Seong Ho Lee, Sung Hyuk Park, Taekyung Lee. Journal of Magnesium and Alloys (SCIE, EI, CAS, CSCD), 2024, No. 1, pp. 186-196.
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely the GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys. [An illustrative code sketch follows this entry.]
Keywords: plastic anisotropy; compression; annealing; machine learning; data augmentation
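A hedged sketch of a GRU flow-curve regressor: a condition vector repeated along the strain axis is mapped to stress at each strain increment. The GAN-based augmentation and hyperparameter tuning from the abstract are omitted, and all sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class FlowCurveGRU(nn.Module):
    def __init__(self, n_condition, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_condition + 1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, strain, condition):            # strain: (B, T), condition: (B, C)
        cond = condition.unsqueeze(1).expand(-1, strain.size(1), -1)
        x = torch.cat([strain.unsqueeze(-1), cond], dim=-1)
        h, _ = self.gru(x)
        return self.head(h).squeeze(-1)               # predicted stress: (B, T)

model = FlowCurveGRU(n_condition=3)                   # e.g., annealing temperature, time, loading direction
strain = torch.linspace(0, 0.15, 50).repeat(4, 1)     # 4 samples, 50 strain steps
condition = torch.randn(4, 3)
print(model(strain, condition).shape)                 # torch.Size([4, 50])
```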
17. An Efficient Modelling of Oversampling with Optimal Deep Learning Enabled Anomaly Detection in Streaming Data
Authors: R. Rajakumar, S. Sathiya Devi. China Communications (SCIE, CSCD), 2024, No. 5, pp. 249-260.
Recently, anomaly detection (AD) in streaming data has gained significant attention among research communities due to its applicability in finance, business, healthcare, education, etc. Recent developments in deep learning (DL) models have proven helpful in the detection and classification of anomalies. This article designs an oversampling with optimal deep learning-based streaming data classification (OS-ODLSDC) model. The aim of the OS-ODLSDC model is to recognize and classify the presence of anomalies in streaming data. The proposed OS-ODLSDC model initially undergoes a preprocessing step. Since streaming data is unbalanced, the support vector machine-Synthetic Minority Over-sampling Technique (SVM-SMOTE) is applied for the oversampling process. Besides, the OS-ODLSDC model employs bidirectional long short-term memory (BiLSTM) for AD and classification. Finally, the root mean square propagation (RMSProp) optimizer is applied for optimal hyperparameter tuning of the BiLSTM model. To ensure the promising performance of the OS-ODLSDC model, a wide-ranging experimental analysis is performed using three benchmark datasets: CICIDS 2018, KDD-Cup 1999, and NSL-KDD. [An illustrative code sketch follows this entry.]
Keywords: anomaly detection; deep learning; hyperparameter optimization; oversampling; SMOTE; streaming data
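A sketch of the class-balancing step, assuming imbalanced-learn's SVMSMOTE; the dataset shape and imbalance ratio are synthetic, and the BiLSTM stage is only indicated in a closing comment.

```python
import numpy as np
from collections import Counter
from imblearn.over_sampling import SVMSMOTE

rng = np.random.default_rng(3)
X_normal = rng.normal(0, 1, (950, 10))
X_anom = rng.normal(3, 1, (50, 10))              # rare anomaly class
X = np.vstack([X_normal, X_anom])
y = np.array([0] * 950 + [1] * 50)

X_res, y_res = SVMSMOTE(random_state=0).fit_resample(X, y)
print("before:", Counter(y), "after:", Counter(y_res))
# X_res / y_res would then be windowed into sequences for the BiLSTM detector.
```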
18. Data-driven modeling on anisotropic mechanical behavior of brain tissue with internal pressure
Authors: Zhiyuan Tang, Yu Wang, Khalil I. Elkhodary, Zefeng Yu, Shan Tang, Dan Peng. Defence Technology (SCIE, EI, CAS, CSCD), 2024, No. 3, pp. 55-65.
Brain tissue is one of the softest parts of the human body, composed of white matter and grey matter. The mechanical behavior of brain tissue plays an essential role in regulating brain morphology and brain function. Besides, traumatic brain injury (TBI) and various brain diseases are also greatly influenced by the brain's mechanical properties. Whether white matter or grey matter, brain tissue contains multiscale structures composed of neurons, glial cells, fibers, blood vessels, etc., each with different mechanical properties. As such, brain tissue exhibits complex mechanical behavior, usually with strong nonlinearity, heterogeneity, and directional dependence. Building a constitutive law for multiscale brain tissue using traditional function-based approaches can be very challenging. Instead, this paper proposes a data-driven approach to establish the desired mechanical model of brain tissue. We focus on blood vessels with internal pressure embedded in a white or grey matter matrix material to demonstrate our approach. The matrix is described by an isotropic or anisotropic nonlinear elastic model. A representative unit cell (RUC) with blood vessels is built and used to generate stress-strain data under different internal blood pressures and various proportional displacement loading paths. The generated stress-strain data are then used to train a mechanical law using artificial neural networks to predict the macroscopic mechanical response of brain tissue under different internal pressures. Finally, the trained material model is implemented into finite element software to predict the mechanical behavior of a whole brain under intracranial pressure and distributed body forces. Compared with a direct numerical simulation that employs a reference material model, our proposed approach greatly reduces the computational cost and improves modeling efficiency. The predictions made by our trained model demonstrate sufficient accuracy. Specifically, we find that the level of internal blood pressure can greatly influence stress distribution and determine the possible related damage behaviors. [An illustrative code sketch follows this entry.]
Keywords: data driven; constitutive law; anisotropy; brain tissue; internal pressure
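A hedged sketch of the surrogate constitutive model: a small network mapping strain components plus internal pressure to stress components. The random tensors below only stand in for RUC-generated stress-strain data and are not the paper's dataset.

```python
import torch
import torch.nn as nn

surrogate = nn.Sequential(            # inputs: 6 strain components + 1 internal pressure
    nn.Linear(7, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 6),                 # outputs: 6 stress components
)

# Placeholder training data standing in for RUC simulation output.
inputs = torch.randn(1024, 7)
targets = torch.randn(1024, 6)

opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(surrogate(inputs), targets)
    loss.backward()
    opt.step()
print("final training loss:", loss.item())
```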
19. Uncertainties of ENSO-related Regional Hadley Circulation Anomalies within Eight Reanalysis Datasets
Authors: Yadi Li, Xichen Li, Juan Feng, Yi Zhou, Wenzhu Wang, Yurong Hou. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, No. 1, pp. 115-140.
El Niño-Southern Oscillation (ENSO), the leading mode of global interannual variability, usually intensifies the Hadley Circulation (HC) while constraining its meridional extension, leading to an equatorward movement of the jet system. Previous studies have investigated the response of the HC to ENSO events using different reanalysis datasets and evaluated their capability in capturing the main features of ENSO-associated HC anomalies. However, these studies mainly focused on the global HC, represented by a zonal-mean mass stream function (MSF). Comparatively fewer studies have evaluated HC responses from a regional perspective, partly due to the prerequisite of the Stokes MSF, which prevents us from integrating a regional HC. In this study, we adopt a recently developed technique to construct the three-dimensional structure of the HC and evaluate the capability of eight state-of-the-art reanalyses in reproducing the regional HC response to ENSO events. Results show that all eight reanalyses reproduce the spatial structure of the HC responses well, with an intensified HC around the central-eastern Pacific but weakened circulations around the Indo-Pacific warm pool and the tropical Atlantic. The spatial correlation coefficient of the three-dimensional HC anomalies among the different datasets is always larger than 0.93. However, these datasets may not capture the amplitudes of the HC responses well. This uncertainty is especially large for ENSO-associated, equatorially asymmetric HC anomalies, with the maximum amplitude in the Climate Forecast System Reanalysis (CFSR) being about 2.7 times the minimum value in the Twentieth Century Reanalysis (20CR). One should therefore be careful when using reanalysis data to evaluate the intensity of ENSO-associated HC anomalies. [An illustrative code sketch follows this entry.]
Keywords: regional Hadley circulation; ENSO; atmosphere-ocean interaction; reanalysis data
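For contrast with the regional, three-dimensional reconstruction discussed above, a sketch of the conventional zonal-mean mass stream function, psi(lat, p) = (2*pi*a*cos(lat)/g) * integral of [v] dp from the top down, computed here from a synthetic meridional-wind field rather than reanalysis data.

```python
import numpy as np

a, g = 6.371e6, 9.81                                  # Earth radius (m), gravity (m/s^2)
lat = np.deg2rad(np.linspace(-90, 90, 73))
plev = np.linspace(10, 1000, 30) * 100.0              # pressure levels in Pa, top to bottom
v_zonal_mean = np.cos(lat)[None, :] * np.sin(np.pi * plev / plev[-1])[:, None]  # [v](p, lat), synthetic

# Cumulative pressure integration from the model top downward (trapezoidal rule).
dp = np.diff(plev)
integ = np.zeros_like(v_zonal_mean)
for k in range(1, plev.size):
    integ[k] = integ[k - 1] + 0.5 * (v_zonal_mean[k] + v_zonal_mean[k - 1]) * dp[k - 1]

psi = (2 * np.pi * a * np.cos(lat))[None, :] / g * integ   # stream function in kg/s
print("max |psi| (10^10 kg/s):", np.abs(psi).max() / 1e10)
```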
20. Battery pack capacity estimation for electric vehicles based on enhanced machine learning and field data
Authors: Qingguang Qi, Wenxue Liu, Zhongwei Deng, Jinwen Li, Ziyou Song, Xiaosong Hu. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024, No. 5, pp. 605-618.
Accurate capacity estimation is of great importance for the reliable state monitoring, timely maintenance, and second-life utilization of lithium-ion batteries. Despite numerous works on battery capacity estimation using laboratory datasets, most are applied to battery cells and lack satisfactory fidelity when extended to real-world electric vehicle (EV) battery packs. The challenges intensify for large-sized EV battery packs, where unpredictable operating profiles and low-quality data acquisition hinder precise capacity estimation. To fill the gap, this study introduces a novel data-driven battery pack capacity estimation method grounded in field data. The proposed approach begins by determining the labeled capacity through an innovative combination of the inverse ampere-hour integral, open-circuit-voltage-based, and resistance-based correction methods. Then, multiple health features are extracted from incremental capacity curves, voltage curves, equivalent circuit model parameters, and operating temperature to thoroughly characterize battery aging behavior. A feature selection procedure is performed to determine the optimal feature set based on the Pearson correlation coefficient. Moreover, a convolutional neural network and a bidirectional gated recurrent unit, enhanced by an attention mechanism, are employed to estimate the battery pack capacity in real-world EV applications. Finally, the proposed method is validated with a field dataset from two EVs covering approximately 35,000 kilometers. The results demonstrate that the proposed method achieves better estimation performance, with an error of less than 1.1%, compared to existing methods. This work shows great potential for accurate capacity estimation of large-sized EV battery packs based on field data, providing significant insights into reliable labeled capacity calculation, effective feature extraction, and machine learning-enabled health diagnosis. [An illustrative code sketch follows this entry.]
Keywords: electric vehicle; lithium-ion battery pack; capacity estimation; machine learning; field data
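A sketch of the Pearson-correlation feature-selection step: rank candidate health features by the absolute correlation of each with the labeled capacity and keep those above a threshold. The feature names, synthetic data, and threshold are illustrative assumptions, not the paper's feature set.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
capacity = np.linspace(100, 88, n) + rng.normal(0, 0.3, n)      # labeled capacity (Ah), fading over time
features = {
    "ic_peak_height": capacity * 0.02 + rng.normal(0, 0.1, n),  # incremental-capacity peak (strongly related)
    "charge_voltage_slope": -capacity * 0.01 + rng.normal(0, 0.1, n),
    "mean_temperature": rng.normal(25, 3, n),                   # weakly related
}

def pearson(x, y):
    x, y = x - x.mean(), y - y.mean()
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

selected = []
for name, values in features.items():
    r = pearson(values, capacity)
    print(f"{name}: r = {r:+.3f}")
    if abs(r) > 0.5:                    # selection threshold is an assumption
        selected.append(name)
print("selected features:", selected)
```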