The problems in equipment fault detection include data dimension explosion, computational complexity, low detection accuracy, etc. To solve these problems, a device anomaly detection algorithm based on an enhanced long short-term memory (LSTM) is proposed. The algorithm first reduces the dimensionality of the device sensor data by principal component analysis (PCA), extracting the strongly correlated variables among the multidimensional sensor data with the lowest possible information loss, and then uses an enhanced stacked LSTM to predict the extracted temporal data, thus improving the accuracy of anomaly detection. To improve the efficiency of anomaly detection, a genetic algorithm (GA) is used to adjust the magnitude of the enhancements made to the LSTM model. Validation on actual data from pumps shows that the algorithm significantly improves the recall rate and detection speed of device anomaly detection, achieving a recall of 97.07%, which indicates that the algorithm is effective and efficient for device anomaly detection in an actual production environment.
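The overall shape of this pipeline (dimensionality reduction, one-step prediction, residual thresholding) can be sketched minimally. This is an illustration under stated assumptions, not the paper's method: the stacked LSTM is replaced by a plain least-squares autoregressive predictor, and the 3-sigma threshold is an assumed choice.

```python
import numpy as np

def pca_reduce(X, k):
    """Project multivariate sensor data onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data gives the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def detect_anomalies(series, order=3, z=3.0):
    """Flag points whose one-step prediction residual exceeds z sigma.
    (Stand-in predictor: ordinary least-squares AR(order), not the paper's LSTM.)"""
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return np.where(np.abs(resid) > z * resid.std())[0] + order

rng = np.random.default_rng(0)
t = np.arange(500)
base = np.sin(2 * np.pi * t / 50)
# three correlated sensor channels plus noise
X = np.column_stack([base, 0.8 * base, 1.2 * base]) + 0.01 * rng.standard_normal((500, 3))
pc1 = pca_reduce(X, 1)[:, 0]
pc1[300] += 5.0                      # inject a fault-like spike
print(detect_anomalies(pc1))
```

The injected spike at index 300 is flagged because its prediction residual dwarfs the noise floor of the first principal component.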
Short-term (up to 30 days) predictions of Earth Rotation Parameters (ERPs) such as Polar Motion (PM: PMX and PMY) play an essential role in real-time applications related to high-precision reference frame conversion. Currently, the least squares (LS) + auto-regressive (AR) hybrid method is one of the main techniques of PM prediction, and the weighted LS + AR hybrid method performs well for PM short-term prediction. However, the covariance information of the LS fitting residuals deserves further exploration in the AR model. In this study, we derive a modified stochastic model for the LS + AR hybrid method, namely the weighted LS + weighted AR hybrid method. Using the PM data products of IERS EOP 14 C04, the numerical results indicate that for PM short-term forecasting, the proposed weighted LS + weighted AR hybrid method shows an advantage over both the LS + AR hybrid method and the weighted LS + AR hybrid method. Compared to the mean absolute errors (MAEs) of PMX/PMY short-term prediction of the LS + AR hybrid method and the weighted LS + AR hybrid method, the weighted LS + weighted AR hybrid method shows average improvements of 6.61%/12.08% and 0.24%/11.65%, respectively. Moreover, judging by the slopes of linear regression lines fitted to the errors of each method, the prediction error of the proposed method grows more slowly than that of the other two methods.
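The LS + AR hybrid scheme described above can be sketched in a toy form: fit a deterministic model by weighted least squares, then extrapolate the residuals with an AR model. Everything specific here is assumed for illustration: the design matrix carries only a bias, a trend, and one annual term (real PM models also include the Chandler wobble), the weights are set to ones, and the AR coefficients come from ordinary least squares rather than the paper's weighted derivation.

```python
import numpy as np

def design(t, period=365.0):
    # bias + linear trend + one annual periodic term (a toy polar-motion model)
    return np.column_stack([np.ones_like(t), t,
                            np.cos(2 * np.pi * t / period),
                            np.sin(2 * np.pi * t / period)])

def weighted_ls(t, y, w):
    """Weighted LS fit; w are the observation weights."""
    A = design(t)
    AtW = A.T * w                       # equivalent to A.T @ diag(w)
    coef = np.linalg.solve(AtW @ A, AtW @ y)
    return coef, y - A @ coef

def ar_extrapolate(resid, p=4, steps=30):
    """Fit AR(p) to the LS residuals by least squares and extrapolate."""
    X = np.column_stack([resid[i:len(resid) - p + i] for i in range(p)])
    phi, *_ = np.linalg.lstsq(X, resid[p:], rcond=None)
    hist = list(resid[-p:])
    for _ in range(steps):
        hist.append(float(np.dot(phi, hist[-p:])))
    return np.array(hist[p:])

rng = np.random.default_rng(1)
t = np.arange(1000.0)
y = 0.1 + 1e-3 * t + 0.3 * np.cos(2 * np.pi * t / 365) + 0.01 * rng.standard_normal(1000)
coef, resid = weighted_ls(t[:970], y[:970], np.ones(970))   # equal weights here
pred = design(t[970:]) @ coef + ar_extrapolate(resid, steps=30)
mae = np.abs(pred - y[970:]).mean()
print(round(mae, 4))
```

On this synthetic series the 30-day MAE stays close to the noise level, since the deterministic part is captured by the LS fit and the residuals carry little structure.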
El Niño-Southern Oscillation (ENSO) is the strongest interannual climate mode influencing the coupled ocean-atmosphere system in the tropical Pacific, and numerous dynamical and statistical models have been developed to simulate and predict it. In some simplified coupled ocean-atmosphere models, the relationship between sea surface temperature (SST) anomalies and wind stress (τ) anomalies can be constructed by statistical methods, such as singular value decomposition (SVD). In recent years, applications of artificial intelligence (AI) to climate modeling have shown promising prospects, and the integration of AI-based models with dynamical models is an active area of research. This study constructs U-Net models to represent the relationship between SST anomalies and τ anomalies in the tropical Pacific; the U-Net-derived τ model, denoted as τUNet, is then used to replace the original SVD-based τ model of an intermediate coupled model (ICM), forming a new AI-integrated ICM, referred to as ICM-UNet. The simulation results obtained from ICM-UNet demonstrate its ability to represent the spatiotemporal variability of oceanic and atmospheric anomaly fields in the equatorial Pacific. In an ocean-only case study, the τUNet-derived wind stress anomaly fields are used to force the ocean component of the ICM, and the results also indicate reasonable simulations of typical ENSO events. These results demonstrate the feasibility of integrating an AI-derived model with a physics-based dynamical model for ENSO modeling studies. Furthermore, the successful integration of the dynamical ocean model with the AI-based atmospheric wind model provides a novel approach to ocean-atmosphere interaction modeling studies.
Long-wavelength (>500 km) magnetic anomalies originating in the lithosphere were first found in satellite magnetic surveys. Compared to the striking magnetic anomalies elsewhere in the world, the long-wavelength magnetic anomalies in China and surrounding regions are relatively weak. Specialized research on each of these anomalies has been quite inadequate; their geological origins remain unclear, in particular their connection to tectonic activity in China and the surrounding regions. We focus on six magnetic high anomalies, over the (1) Tarim Basin, (2) Sichuan Basin, (3) Great Xing’an Range, (4) Barmer Basin, (5) Central Myanmar Basin, and (6) Sunda and Banda Arcs, and a striking magnetic low anomaly along the southern part of the Himalayan-Tibetan Plateau. We have analyzed their geological origins by reviewing related research and by detailed comparison with geological results. The tectonic backgrounds of these anomalies fall into two cases: either ancient basin basement or subduction-collision zone. However, the geological origins of large-scale regional magnetic anomalies are always subject to dispute, mainly because of the limited surface exposure of sources, later tectonic destruction, and the superposition of multi-phase events.
Based on the understanding that the seismic fault system is a nonlinear complex system, Rundle (1995) introduced the nonlinear threshold systems used in meteorology to analyze the ocean-atmosphere interface and the El Niño-Southern Oscillation into the study of seismic activity changes, and then proposed the PI method (Rundle et al., 2000a, b). Wu et al. (2011) modified the Pattern Informatics method into MPI to extract ionospheric anomalies using data from the DEMETER satellite, which is suitable for 1–3 month short-term prediction.
Accurate load forecasting forms a crucial foundation for implementing household demand response plans and optimizing load scheduling. When dealing with short-term load data characterized by substantial fluctuations, a single prediction model can hardly capture temporal features effectively, resulting in diminished prediction accuracy. In this study, a hybrid deep learning framework that integrates an attention mechanism, a convolutional neural network (CNN), improved chaotic particle swarm optimization (ICPSO), and long short-term memory (LSTM) is proposed for short-term household load forecasting. First, the CNN model is employed to extract features from the original data, enhancing the quality of the data features. Subsequently, the moving average method is used for data preprocessing, followed by the application of the LSTM network to predict the processed data. Moreover, the ICPSO algorithm is introduced to optimize the parameters of the LSTM, boosting the model's running speed and accuracy. Finally, the attention mechanism is employed to optimize the output of the LSTM, effectively addressing the information loss induced in the LSTM by lengthy sequences and further elevating prediction accuracy. The numerical analysis verifies the accuracy and effectiveness of the proposed hybrid model: it explores data features adeptly, achieving superior prediction accuracy compared with other forecasting methods for household loads exhibiting significant fluctuations across different seasons.
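The moving-average preprocessing step mentioned above is simple to show concretely. The window size is an assumption for illustration (the abstract does not state one); edges are handled with partial windows so the series length is preserved.

```python
import numpy as np

def moving_average(x, window=5):
    """Smooth a load series with a centered moving average.
    Edges use partial windows so the output has the same length as the input."""
    kernel = np.ones(window)
    num = np.convolve(x, kernel, mode='same')
    den = np.convolve(np.ones_like(x), kernel, mode='same')  # effective window size at each point
    return num / den

load = np.array([2.1, 2.3, 9.0, 2.2, 2.4, 2.0, 2.5])  # one spiky household reading (kW)
print(np.round(moving_average(load, 3), 3))
```

The spike at index 2 is spread over its neighbors, which damps fluctuations before the series is handed to the predictor.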
In this paper, we propose a novel anomaly detection method for data centers based on a combination of graph structure and an abnormal attention mechanism. The method leverages sensor monitoring data from target power substations to construct multidimensional time series. These time series are subsequently transformed into graph structures, and the corresponding adjacency matrices are obtained. By incorporating the adjacency matrices and additional weights associated with the graph structure, an aggregation matrix is derived. The aggregation matrix is then fed into a pre-trained graph convolutional neural network (GCN) to extract graph structure features. Both the multidimensional time series segments and the graph structure features are then input into a pre-trained anomaly detection model, yielding detection results that help identify abnormal data. The anomaly detection model consists of a multi-level encoder-decoder module, wherein each level includes a transformer encoder and decoder based on correlation differences. The attention module in the encoding layer adopts an abnormal attention module with a dual-branch structure. Experimental results demonstrate that our proposed method significantly improves the accuracy and stability of anomaly detection.
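The step from multidimensional time series to an aggregation matrix can be sketched as follows. The abstract does not specify how the adjacency matrix or the extra weights are built, so this sketch assumes two common choices: edges from thresholded pairwise correlation, and the symmetrically normalized matrix D^(-1/2)(A+I)D^(-1/2) typically fed to a GCN layer. The threshold and self-loop weight are illustrative.

```python
import numpy as np

def aggregation_matrix(X, threshold=0.5, self_weight=1.0):
    """Build a graph over sensors from pairwise correlation of their series,
    then form the normalized aggregation matrix D^-1/2 (A + wI) D^-1/2.
    `threshold` and `self_weight` are assumed, illustrative parameters."""
    C = np.corrcoef(X.T)                        # sensors as nodes
    A = (np.abs(C) >= threshold).astype(float)  # thresholded correlation -> adjacency
    np.fill_diagonal(A, 0.0)
    A_hat = A + self_weight * np.eye(len(A))    # add weighted self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

rng = np.random.default_rng(2)
base = rng.standard_normal(200)
# three strongly correlated sensors plus one independent sensor
X = np.column_stack([base + 0.1 * rng.standard_normal(200) for _ in range(3)]
                    + [rng.standard_normal(200)])
M = aggregation_matrix(X)
print(M.shape)
```

The independent sensor ends up connected only to itself, while the correlated trio forms a clique; the normalization keeps the matrix symmetric and well scaled for message passing.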
Understanding the topographic patterns of the seafloor is a very important part of understanding our planet. Although the science involved in bathymetric surveying has advanced much over the decades, less than 20% of the seafloor has been precisely modeled to date, and there is an urgent need to improve the accuracy and reduce the uncertainty of underwater survey data. In this study, we introduce a pretrained visual geometry group network (VGGNet) method based on deep learning. To apply this method, we input gravity anomaly data derived from ship measurements and satellite altimetry into the model and correct the latter, which has larger spatial coverage, based on the former, which is considered the true value and is more accurate. After obtaining the corrected high-precision gravity model, it is inverted to the corresponding bathymetric model by applying the gravity-depth correlation. We choose four data pairs collected from different environments, i.e., the Southern Ocean, Pacific Ocean, Atlantic Ocean, and Caribbean Sea, to evaluate the topographic correction results of the model. The experiments show that the coefficient of determination (R²) reaches 0.834 among the results of the four experimental groups, signifying a high correlation. The standard deviation and normalized root mean square error are also evaluated, and their accuracy improved by up to 24.2% compared with similar research done in recent years. The evaluation of the R² values at different water depths shows that our model can achieve results above 0.90 at certain depths and can also significantly improve on previous research at mid-water depths. Finally, the bathymetry corrected by our model shows an accuracy improvement of more than 21% within 1% of the total water depth, which is sufficient to prove that the VGGNet-based method is able to perform gravity-bathymetry correction and achieve outstanding results.
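The final inversion step relies on a gravity-depth correlation, and the R² score used for evaluation is straightforward to reproduce. The sketch below is a deliberately crude stand-in: practical gravity-to-bathymetry inversion is done in the spectral domain with an admittance function, whereas here a pointwise linear fit on synthetic values merely illustrates the correlation-then-invert idea and the R² metric.

```python
import numpy as np

def fit_gravity_depth(gravity, depth):
    """Fit a local linear gravity-depth relation depth ~ a*g + b
    (a crude stand-in for the spectral admittance used in practice)."""
    A = np.column_stack([gravity, np.ones_like(gravity)])
    (a, b), *_ = np.linalg.lstsq(A, depth, rcond=None)
    return a, b

def r_squared(y, y_hat):
    """Coefficient of determination, as used to evaluate the corrections."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(3)
g_ship = rng.uniform(-40, 40, 300)                            # synthetic shipborne gravity anomalies (mGal)
d_ship = -4000 + 25 * g_ship + 50 * rng.standard_normal(300)  # synthetic depths (m)
a, b = fit_gravity_depth(g_ship, d_ship)
d_pred = a * g_ship + b
print(round(r_squared(d_ship, d_pred), 3))
```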
Accurate forecasting of time series is crucial across various domains. Many prediction tasks rely on effectively segmenting, matching, and aligning time series data. For instance, even for time series with the same granularity, segmenting them into events of different granularity can effectively mitigate the impact of varying time scales on prediction accuracy. However, these events of varying granularity frequently intersect with each other and may have unequal durations; even minor differences can result in significant errors when matching time series with future trends. Besides, directly using matched but unaligned events as state vectors in machine learning-based prediction models can lead to insufficient prediction accuracy. Therefore, this paper proposes a short-term forecasting method for time series based on multi-granularity events, MGE-SP (multi-granularity event-based short-term prediction). First, a methodological framework for MGE-SP is established to guide the implementation steps. The framework consists of three key steps: multi-granularity event matching based on the LTF (latest time first) strategy, multi-granularity event alignment using piecewise aggregate approximation based on the compression ratio, and a short-term prediction model based on XGBoost. Data from a nationwide online car-hailing service in China ensures the method's reliability. The average RMSE (root mean square error) and MAE (mean absolute error) of the proposed method are 3.204 and 2.360, lower than the respective values of 4.056 and 3.101 obtained using the ARIMA (autoregressive integrated moving average) method, and the values of 4.278 and 2.994 obtained using the k-means-SVR (support vector regression) method. Another experiment was conducted on stock data from a public data set. The proposed method achieved an average RMSE and MAE of 0.836 and 0.696, lower than the respective values of 1.019 and 0.844 obtained using the ARIMA method, and the values of 1.350 and 1.172 obtained using the k-means-SVR method.
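The alignment step uses piecewise aggregate approximation (PAA) driven by a compression ratio. A minimal PAA sketch follows; the exact segmentation scheme used in MGE-SP is not given in the abstract, so equal-width segments are assumed here.

```python
import numpy as np

def paa(series, ratio):
    """Piecewise aggregate approximation: compress a series to
    ceil(len(series) * ratio) segment means, with ratio in (0, 1]."""
    n = len(series)
    m = max(1, int(np.ceil(n * ratio)))
    # assign each point to one of m roughly equal-width segments
    edges = np.linspace(0, n, m + 1).astype(int)
    return np.array([series[edges[i]:edges[i + 1]].mean() for i in range(m)])

x = np.array([1.0, 1.0, 2.0, 2.0, 8.0, 8.0, 3.0, 3.0])
print(paa(x, 0.5))   # -> [1. 2. 8. 3.]
```

With ratio 0.5 each pair of points collapses to its mean, so events of unequal length can be mapped onto state vectors of a fixed, comparable size.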
BACKGROUND: Dental anomalies are variations from the established general anatomy and morphology of the tooth resulting from disturbances during tooth formation. They can be developmental, congenital, or acquired, and may be localized to a single tooth or involve systemic conditions. AIM: To evaluate the prevalence of dental anomalies in patients who report to the Komfo Anokye Teaching Hospital (KATH) dental clinics. METHOD: A descriptive cross-sectional design was used with a sample size of 92 patients aged 18 or older, obtained through convenience sampling. Data analysis was performed using SPSS version 26.0. RESULTS: The study included 92 patients aged 18 to 72 years, 47.8% male and 52.2% female. Dental anomalies were observed in 51.1% of participants, with a higher prevalence in females (55.3%). The most common anomalies were diastema (48.3%), impacted teeth (22.0%), dilaceration (11.9%), and peg-shaped lateral teeth (6.8%). CONCLUSION: This study highlights the importance of conducting thorough dental examinations to identify and address dental anomalies, which may have implications for treatment. Early detection and correction of these anomalies are crucial to prevent future complications.
Recently, anomaly detection (AD) in streaming data has gained significant attention among research communities due to its applicability in finance, business, healthcare, education, etc. Recent developments in deep learning (DL) models have proved helpful in the detection and classification of anomalies. This article designs an oversampling with optimal deep learning-based streaming data classification (OS-ODLSDC) model. The aim of the OS-ODLSDC model is to recognize and classify the presence of anomalies in streaming data. The proposed OS-ODLSDC model initially undergoes a preprocessing step. Since streaming data is unbalanced, the support vector machine (SVM)-based Synthetic Minority Over-sampling Technique (SVM-SMOTE) is applied for oversampling. Besides, the OS-ODLSDC model employs bidirectional long short-term memory (BiLSTM) for AD and classification. Finally, the root mean square propagation (RMSProp) optimizer is applied for optimal hyperparameter tuning of the BiLSTM model. To ensure the promising performance of the OS-ODLSDC model, a wide-ranging experimental analysis is performed using three benchmark datasets: CICIDS 2018, KDD-Cup 1999, and NSL-KDD.
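The oversampling idea at the heart of SMOTE-style methods is interpolation between minority samples. The sketch below is a simplified stand-in, not SVM-SMOTE itself: SVM-SMOTE additionally restricts seed points to the borderline region found by an SVM, whereas here every minority point may serve as a seed.

```python
import numpy as np

def smote_like(X_min, n_new, k=3, rng=None):
    """Generate synthetic minority samples by interpolating each seed with one
    of its k nearest minority neighbors (a simplified stand-in for SVM-SMOTE)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]          # skip the seed point itself
        j = rng.choice(nbrs)
        lam = rng.random()                     # interpolation coefficient in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

rng = np.random.default_rng(4)
X_majority = rng.standard_normal((200, 5))        # e.g., normal traffic records
X_minority = rng.standard_normal((20, 5)) + 3.0   # rare anomaly records
X_new = smote_like(X_minority, 180, rng=rng)
print(len(X_minority) + len(X_new), len(X_majority))
```

After oversampling, both classes have 200 samples, so the downstream classifier (a BiLSTM in the paper) no longer sees a 10:1 imbalance.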
Accurate origin–destination (OD) demand prediction is crucial for the efficient operation and management of urban rail transit (URT) systems, particularly during a pandemic. However, this task faces several limitations, including real-time availability, sparsity, and high-dimensionality issues, as well as the impact of the pandemic. Consequently, this study proposes a unified framework called the physics-guided adaptive graph spatial–temporal attention network (PAG-STAN) for metro OD demand prediction under pandemic conditions. Specifically, PAG-STAN introduces a real-time OD estimation module to estimate complete real-time OD demand matrices. Subsequently, a novel dynamic OD demand matrix compression module is proposed to generate dense real-time OD demand matrices. Thereafter, PAG-STAN leverages various heterogeneous data to learn the evolutionary trend of future OD ridership during the pandemic. Finally, a masked physics-guided loss function (MPG-loss function) incorporates the physical relationship between OD demand and inbound flow into the loss function to enhance model interpretability. PAG-STAN demonstrated favorable performance on two real-world metro OD demand datasets under pandemic and conventional scenarios, highlighting its robustness and sensitivity for metro OD demand prediction. A series of ablation studies verified the indispensability of each module in PAG-STAN.
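The sparsity and high-dimensionality problems motivate the matrix compression module. One plausible reading of "compressing a sparse OD matrix into a dense one" is dropping empty rows and columns while keeping index maps for lossless expansion; the sketch below illustrates that reading only, and PAG-STAN's actual dynamic compression is more involved.

```python
import numpy as np

def compress_od(od):
    """Drop all-zero rows/columns of a sparse OD matrix, keeping index maps
    so the dense matrix can be expanded back losslessly."""
    rows = np.flatnonzero(od.sum(axis=1))
    cols = np.flatnonzero(od.sum(axis=0))
    return od[np.ix_(rows, cols)], rows, cols

def expand_od(dense, rows, cols, shape):
    """Inverse of compress_od: scatter the dense block back to full size."""
    full = np.zeros(shape)
    full[np.ix_(rows, cols)] = dense
    return full

# toy 5-station network with only three active OD pairs
od = np.zeros((5, 5))
od[0, 2], od[3, 2], od[3, 4] = 12, 7, 3
dense, r, c = compress_od(od)
print(dense.shape, np.array_equal(expand_od(dense, r, c, od.shape), od))
```

The 5x5 matrix shrinks to 2x2 while remaining recoverable, which is the property a learning module needs to work on dense inputs without losing station identities.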
Network anomaly detection plays a vital role in safeguarding network security. However, the existing network anomaly detection task is typically based on a one-class, zero-positive scenario. This approach is susceptible to overfitting during training due to discrepancies in data distribution between the training set and the test set, a phenomenon known as prediction drift. Additionally, the rarity of anomalous data, often masked by normal data, further complicates network anomaly detection. To address these challenges, we propose the PUNet network, which combines the strengths of traditional machine learning and deep learning techniques for anomaly detection. Specifically, PUNet employs a reconstruction-based autoencoder to pre-train on normal data, enabling the network to capture potential features and correlations within the data. Subsequently, PUNet integrates a sampling algorithm to construct a pseudo-label candidate set among the outliers based on the reconstruction loss of the samples. This approach effectively mitigates the prediction drift problem by incorporating abnormal samples. Furthermore, PUNet utilizes the CatBoost classifier for anomaly detection to tackle potential data imbalance within the candidate set. Extensive experimental evaluations demonstrate that PUNet effectively resolves the prediction drift and data imbalance problems, significantly outperforming competing methods.
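The pseudo-labeling step, selecting the samples with the largest reconstruction loss as anomaly candidates, can be shown with a linear stand-in for the autoencoder. Here PCA components play the role of the learned encoder/decoder; the top-fraction cutoff is an assumed sampling rule, not PUNet's actual algorithm.

```python
import numpy as np

def fit_linear_ae(X_normal, k=2):
    """'Train' a linear autoencoder on normal data: the top-k PCA components
    act as the encoder/decoder (a linear stand-in for a learned autoencoder)."""
    mu = X_normal.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_normal - mu, full_matrices=False)
    return mu, Vt[:k]

def reconstruction_error(X, mu, V):
    Z = (X - mu) @ V.T                  # encode
    X_hat = Z @ V + mu                  # decode
    return np.linalg.norm(X - X_hat, axis=1)

def pseudo_label_candidates(errors, top_frac=0.05):
    """Mark the samples with the largest reconstruction loss as pseudo-anomalies."""
    n = max(1, int(len(errors) * top_frac))
    return np.argsort(errors)[-n:]

rng = np.random.default_rng(5)
W = rng.standard_normal((2, 6))
X_normal = rng.standard_normal((300, 2)) @ W          # normal data on a 2-D subspace
X_test = np.vstack([rng.standard_normal((95, 2)) @ W,
                    rng.standard_normal((5, 6)) * 4])  # 5 off-subspace outliers
mu, V = fit_linear_ae(X_normal)
err = reconstruction_error(X_test, mu, V)
cand = pseudo_label_candidates(err, 0.05)
print(sorted(cand))   # indices of the injected outliers
```

The candidate set then provides labeled "positives" that a supervised classifier such as CatBoost can train against, which is how PUNet counters prediction drift.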
BACKGROUND: Previous studies have reported that low hematocrit levels indicate poor survival in patients with ovarian cancer and cervical cancer; however, the prognostic value of hematocrit for colorectal cancer (CRC) patients has not been determined, and the prognostic value of red blood cell distribution width (RDW) for CRC patients remains controversial. AIM: To investigate the impact of RDW and hematocrit on the short-term outcomes and long-term prognosis of CRC patients who underwent radical surgery. METHODS: Patients who were diagnosed with CRC and underwent radical CRC resection between January 2011 and January 2020 at a single clinical center were included. The short-term outcomes, overall survival (OS), and disease-free survival (DFS) were compared among the different groups. Cox analysis was also conducted to identify independent risk factors for OS and DFS. RESULTS: A total of 4258 CRC patients who underwent radical surgery were included in our study; 1573 patients were in the lower RDW group and 2685 in the higher RDW group, while 2166 and 2092 patients were in the higher and lower hematocrit groups, respectively. Patients in the higher RDW group had more intraoperative blood loss (P<0.01) and more overall complications (P<0.01) than those in the lower RDW group. Similarly, patients in the lower hematocrit group had more intraoperative blood loss (P=0.012), longer hospital stay (P=0.016), and more overall complications (P<0.01) than those in the higher hematocrit group. The higher RDW group had worse OS and DFS than the lower RDW group for tumor node metastasis (TNM) stage I (OS, P<0.05; DFS, P=0.001) and stage II (OS, P=0.004; DFS, P=0.01); the lower hematocrit group had worse OS and DFS than the higher hematocrit group for TNM stage II (OS, P<0.05; DFS, P=0.001) and stage III (OS, P=0.001; DFS, P=0.001). Preoperative hematocrit was an independent risk factor for OS [P=0.017, hazard ratio (HR)=1.256, 95% confidence interval (CI): 1.041-1.515] and DFS (P=0.035, HR=1.194, 95% CI: 1.013-1.408). CONCLUSION: A higher preoperative RDW and a lower hematocrit were associated with more postoperative complications. However, only hematocrit was an independent risk factor for OS and DFS in CRC patients who underwent radical surgery, while RDW was not.
In the Industrial Internet of Things (IIoT), sensors generate time series data that reflect the working state. When systems are attacked, timely identification of outliers in the time series is critical to ensure security. Although many anomaly detection methods have been proposed, the temporal correlation of the time series from the same sensor and the state (spatial) correlation between different sensors are rarely considered simultaneously. Owing to the superior capability of the Transformer in learning time series features, this paper proposes a time series anomaly detection method based on a spatial-temporal network and an improved Transformer. Additionally, methods based on graph neural networks typically include a graph structure learning module and an anomaly detection module, which are interdependent. In the initial phase of training, however, since neither module has reached an optimal state, their performance may influence each other; this makes it hard for end-to-end training to effectively direct the learning trajectory of each module. The interdependence between the modules, coupled with the initial instability, may prevent the model from finding the optimal solution during training, resulting in unsatisfactory results. We therefore introduce an adaptive graph structure learning method to obtain the optimal model parameters and graph structure. Experiments on two publicly available datasets demonstrate that the proposed method attains better anomaly detection results than other methods.
The management of network intelligence in Beyond 5G (B5G) networks encompasses the complex challenges of scalability, dynamicity, interoperability, privacy, and security. These are essential steps towards realizing truly ubiquitous Artificial Intelligence (AI)-based analytics, empowering seamless integration across the entire Continuum (Edge, Fog, Core, Cloud). This paper introduces a Federated Network Intelligence Orchestration approach aimed at scalable and automated Federated Learning (FL)-based anomaly detection in B5G networks. By leveraging a horizontal federated learning approach based on the FedAvg aggregation algorithm, which employs a deep autoencoder model trained on non-anomalous traffic samples to recognize normal behavior, the system orchestrates network intelligence to detect and prevent cyber-attacks. Integrated into a B5G Zero-touch Service Management (ZSM)-aligned security framework, the proposal utilizes multi-domain and multi-tenant orchestration to automate and scale the deployment of FL agents and AI-based anomaly detectors, enhancing reaction capabilities against cyber-attacks. The proposed FL architecture can be dynamically deployed across the B5G Continuum, utilizing a hierarchy of Network Intelligence orchestrators for real-time anomaly and security threat handling. The implementation includes FL enforcement operations for interoperability and extensibility, enabling dynamic deployment, configuration, and reconfiguration on demand. Performance validation was conducted through dynamic orchestration, FL, and real-time anomaly detection processes in a practical test environment. Analysis of key performance metrics on the 5G-NIDD dataset demonstrates the system's capability for automatic and near real-time handling of anomalies and attacks, including real-time network monitoring and countermeasure implementation for mitigation.
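The FedAvg aggregation rule named above is a dataset-size-weighted average of client model parameters. A minimal sketch, with toy two-tensor "models" standing in for the deep autoencoder weights:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: average each parameter tensor across clients,
    weighted by local dataset size (McMahan et al.'s rule)."""
    total = sum(client_sizes)
    agg = []
    for tensors in zip(*client_weights):    # iterate layer by layer
        agg.append(sum(w * (n / total) for w, n in zip(tensors, client_sizes)))
    return agg

# two toy FL clients, each holding two parameter tensors of an "autoencoder"
c1 = [np.ones((2, 2)), np.zeros(3)]
c2 = [3 * np.ones((2, 2)), np.ones(3)]
global_w = fedavg([c1, c2], client_sizes=[100, 300])
print(global_w[0][0, 0], global_w[1][0])   # -> 2.5 0.75
```

Because client 2 holds three times the data, the global model sits three quarters of the way toward its parameters; the orchestrators described in the paper repeat this round across domains and tenants.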
Coronary artery anomaly is known as one of the causes of angina pectoris and sudden death, and is an important clinical entity that cannot be overlooked. The incidence of coronary artery anomalies is as low as 1%-2% of the general population, even when the various types are combined. Coronary anomalies are practically challenging when the left and right coronary ostia are not found near their normal positions during catheter coronary angiography. If there is atherosclerotic stenosis of an anomalous coronary artery and percutaneous coronary intervention (PCI) is required, the fit of the guiding catheter at the ostium and adequate back-up force of the guiding catheter become issues. The risk level of PCI itself should also be considered on a case-by-case basis. In this case, emission computed tomography in an R-1 subtype single coronary artery proved that ischemia occurred in an area where the coronary artery was not visible to the naked eye. Meticulous follow-up is crucial, because sudden death may occur with single coronary arteries. Preventing atherosclerosis with full effort is also important, as the authors indicated admirably.
In the IoT (Internet of Things) domain, the increased use of encryption protocols such as SSL/TLS, VPN (Virtual Private Network), and Tor has led to a rise in attacks leveraging encrypted traffic. While research on anomaly detection using AI (Artificial Intelligence) is actively progressing, the encrypted nature of the data poses challenges for labeling, resulting in data imbalance and feature extraction biased toward specific nodes. This study proposes a reconstruction-error-based anomaly detection method using an autoencoder (AE) that utilizes packet metadata excluding specific node information. The proposed method omits node-biased packet metadata such as IP and port, and trains the detection model using only normal data, leveraging a small amount of packet metadata. This makes it well suited for direct application in IoT environments due to its low resource consumption. In experiments comparing feature extraction methods for AE-based anomaly detection, we found that using flow-based features significantly improves accuracy, precision, F1 score, and AUC (Area Under the Receiver Operating Characteristic Curve) compared with packet-based features. Additionally, for flow-based features, the proposed method showed a 30.17% increase in F1 score and improved false positive rates compared with Isolation Forest and One-Class SVM. Furthermore, the proposed method demonstrated a 32.43% higher AUC when using packet features and a 111.39% higher AUC when using flow features, compared with previously proposed oversampling methods. This study highlights the impact of feature extraction methods on attack detection in imbalanced, encrypted traffic environments and emphasizes that the one-class method using an AE is more effective for attack detection and reducing false positives than traditional oversampling methods.
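The flow-based feature extraction that the study favors can be sketched as aggregating per-packet metadata into per-flow statistics. The specific features below (packet count, byte total, mean size, duration) are assumed examples of size/timing features; in the spirit of the paper, IP and port serve only to group packets into flows and do not appear as features themselves.

```python
import numpy as np
from collections import defaultdict

def flow_features(packets):
    """Aggregate per-packet metadata into per-flow features built only from
    size/timing fields. The flow id (derived elsewhere from IP/port) is used
    for grouping, never as a feature, mirroring the node-metadata exclusion."""
    flows = defaultdict(list)
    for pkt in packets:
        flows[pkt["flow_id"]].append(pkt)
    feats = {}
    for fid, pkts in flows.items():
        sizes = np.array([p["size"] for p in pkts], float)
        times = np.array([p["ts"] for p in pkts], float)
        feats[fid] = {
            "n_packets": len(pkts),
            "bytes": sizes.sum(),
            "mean_size": sizes.mean(),
            "duration": times.max() - times.min(),
        }
    return feats

packets = [
    {"flow_id": "a", "ts": 0.00, "size": 60},
    {"flow_id": "a", "ts": 0.05, "size": 1500},
    {"flow_id": "b", "ts": 0.01, "size": 90},
]
print(flow_features(packets)["a"])
```

Feature vectors built this way are what the autoencoder is trained on with normal traffic only, so an attack flow shows up as a high reconstruction error rather than requiring labeled examples.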
Predictive maintenance has emerged as an effective tool for curbing maintenance costs, yet prevailing research predominantly concentrates on the abnormal phases. Within the ostensibly stable healthy phase, the reliance on anomaly detection to preempt equipment malfunctions faces the challenge of discerning sudden anomalies. To address this challenge, this paper proposes a dual-task learning approach for bearing anomaly detection and state evaluation of safe regions. The proposed method transforms the execution of the two tasks into an optimization of the hypersphere center. By leveraging the monotonicity and distinguishability pertinent to the tasks as the foundation for optimization, it reconstructs the SVDD model to ensure equilibrium in the model's performance across the two tasks. Subsequent experiments verify the proposed method's effectiveness, which is interpreted from the perspectives of parameter adjustment and enveloping trade-offs. The experimental results also reveal two deficiencies, in anomaly detection accuracy and state evaluation metrics; their theoretical analysis suggests focusing on feature extraction and data collection to achieve improvements. The proposed method lays the foundation for realizing predictive maintenance in the healthy stage by improving condition awareness in safe regions.
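The hypersphere idea behind SVDD supports both tasks at once: distance from the center gives a continuous state score inside the safe region, and crossing the radius flags an anomaly. The sketch below is a crude stand-in, with the center fixed at the mean and the radius at a distance quantile, whereas the paper's reconstructed SVDD optimizes the center jointly for both tasks.

```python
import numpy as np

def fit_hypersphere(X_healthy, quantile=0.95):
    """Fit a hypersphere to healthy-phase features: center at the mean,
    radius at a distance quantile (a stand-in for the optimized SVDD center)."""
    c = X_healthy.mean(axis=0)
    d = np.linalg.norm(X_healthy - c, axis=1)
    return c, np.quantile(d, quantile)

def evaluate(X, c, r):
    """Distance ratio d/r: values below 1 measure the margin left inside the
    safe region (state evaluation); values above 1 are flagged as anomalies."""
    return np.linalg.norm(X - c, axis=1) / r

rng = np.random.default_rng(6)
healthy = rng.standard_normal((500, 4))             # healthy bearing features
c, r = fit_hypersphere(healthy)
probe = np.vstack([rng.standard_normal((5, 4)),      # still healthy
                   rng.standard_normal((5, 4)) + 6])  # degraded
scores = evaluate(probe, c, r)
print(np.round(scores, 2))
```

A single fitted object thus serves the dual tasks: the score's magnitude tracks degradation monotonically within the envelope, and the radius supplies the detection threshold.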
BACKGROUND: Hepatectomy is the first choice for treating liver cancer. However, inflammatory factors, released in response to pain stimulation, may suppress perioperative immune function and affect the prognosis of patients undergoing hepatectomies. AIM: To determine the short-term efficacy of microwave ablation in the treatment of liver cancer and its effect on immune function. METHODS: Clinical data from patients with liver cancer admitted to Suzhou Ninth People's Hospital from January 2020 to December 2023 were retrospectively analyzed. Thirty-five patients underwent laparoscopic hepatectomy for liver cancer (liver cancer resection group) and 35 patients underwent medical image-guided microwave ablation (liver cancer ablation group). The short-term efficacy, complications, liver function, and immune function indices before and after treatment were compared between the two groups. RESULTS: One month after treatment, in the liver cancer resection group, 19 patients experienced complete remission (CR), 8 partial remission (PR), 6 stable disease (SD), and 2 disease progression (PD). In the liver cancer ablation group, 21 patients experienced CR, 9 PR, 3 SD, and 2 PD. No significant differences in efficacy or complications were detected between the liver cancer ablation and liver cancer resection groups (P>0.05). After treatment, total bilirubin (41.24±7.35 vs 49.18±8.64 μmol/L, P<0.001), alanine aminotransferase (30.85±6.23 vs 42.32±7.56 U/L, P<0.001), CD4+ (43.95±5.72 vs 35.27±5.56, P<0.001), CD8+ (20.38±3.91 vs 22.75±4.62, P<0.001), and CD4+/CD8+ (2.16±0.39 vs 1.55±0.32, P<0.001) differed significantly between the liver cancer ablation and liver cancer resection groups. CONCLUSION: The short-term efficacy and safety of microwave ablation and laparoscopic surgery for the treatment of liver cancer are similar, but liver function recovers more quickly after microwave ablation, and microwave ablation may enhance immune function.
Funding: National Key R&D Program of China (No. 2020YFB1707700).
Abstract: The problems in equipment fault detection include data dimension explosion, computational complexity, low detection accuracy, etc. To solve these problems, a device anomaly detection algorithm based on an enhanced long short-term memory (LSTM) network is proposed. The algorithm first reduces the dimensionality of the device sensor data by principal component analysis (PCA), extracting the strongly correlated variables among the multidimensional sensor data with the lowest possible information loss, and then uses an enhanced stacked LSTM to predict the extracted temporal data, thus improving the accuracy of anomaly detection. To improve the efficiency of the anomaly detection, a genetic algorithm (GA) is used to adjust the magnitude of the enhancements made to the LSTM model. Validation on actual data from pumps shows that the algorithm significantly improves the recall rate and the detection speed of device anomaly detection, with a recall rate of 97.07%, which indicates that the algorithm is effective and efficient for device anomaly detection in an actual production environment.
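The GA tuning step can be sketched as a simple search over a scalar "enhancement magnitude." The fitness function below is a made-up quadratic proxy (optimum at 0.7) standing in for the real objective, which would be the detector's validation recall; population size, mutation scale, and the averaging crossover are likewise assumptions, not the paper's settings.

```python
import random

def fitness(magnitude):
    # Hypothetical proxy for validation recall; peaks at magnitude = 0.7.
    return -(magnitude - 0.7) ** 2

def genetic_search(pop_size=20, generations=30, mutation_scale=0.1, seed=42):
    rng = random.Random(seed)
    population = [rng.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2                         # crossover: average
            child += rng.gauss(0.0, mutation_scale)     # mutation
            children.append(min(max(child, 0.0), 1.0))  # clamp to [0, 1]
        population = parents + children
    return max(population, key=fitness)

best = genetic_search()
print(best)  # converges toward the fitness optimum
```

Keeping the elite parents unchanged each generation guarantees the best fitness never decreases, which is why even this tiny loop reliably homes in on the optimum.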
Funding: Supported by the National Natural Science Foundation of China (No. 42004016); the Hubei Natural Science Fund, China (No. 2020CFB329); the Hunan Natural Science Fund, China (Nos. 2023JJ60559 and 2023JJ60560); and the State Key Laboratory of Geodesy and Earth's Dynamics self-deployment project, China (No. S21L6101).
Abstract: Short-term (up to 30 days) predictions of Earth Rotation Parameters (ERPs) such as Polar Motion (PM: PMX and PMY) play an essential role in real-time applications related to high-precision reference frame conversion. Currently, the least squares (LS) + auto-regressive (AR) hybrid method is one of the main techniques of PM prediction, and the weighted LS+AR hybrid method performs well for PM short-term prediction. However, the corresponding covariance information of the LS fitting residuals deserves further exploration in the AR model. In this study, we have derived a modified stochastic model for the LS+AR hybrid method, namely the weighted LS + weighted AR hybrid method. Using the PM data products of IERS EOP 14 C04, the numerical results indicate that for PM short-term forecasting, the proposed weighted LS + weighted AR hybrid method shows an advantage over both the LS+AR hybrid method and the weighted LS+AR hybrid method. Compared to the mean absolute errors (MAEs) of PMX/PMY short-term prediction of the LS+AR hybrid method and the weighted LS+AR hybrid method, the weighted LS + weighted AR hybrid method shows average improvements of 6.61%/12.08% and 0.24%/11.65%, respectively. Besides, for the slopes of the linear regression lines fitted to the errors of each method, the prediction error of the proposed method grows more slowly than that of the other two methods.
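The LS+AR idea can be sketched in its simplest unweighted form: fit a linear trend by least squares, model the fitting residuals with a first-order autoregression, and forecast as trend extrapolation plus the AR residual. The weighted variants in the paper would replace the plain sums below with weighted sums; the AR(1) order and the toy data are assumptions for illustration.

```python
def ls_line(t, y):
    """Closed-form least-squares fit y ≈ a + b*t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    b = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y)) / \
        sum((ti - mt) ** 2 for ti in t)
    return my - b * mt, b

def ar1_coef(r):
    """Lag-1 autoregression coefficient of a (roughly zero-mean) residual series."""
    num = sum(r[i] * r[i - 1] for i in range(1, len(r)))
    den = sum(x * x for x in r[:-1])
    return num / den if den else 0.0

def predict(t, y, steps):
    a, b = ls_line(t, y)
    resid = [yi - (a + b * ti) for ti, yi in zip(t, y)]
    phi = ar1_coef(resid)
    out, r = [], resid[-1]
    for k in range(1, steps + 1):
        r *= phi                              # AR(1) residual forecast
        out.append(a + b * (t[-1] + k) + r)   # trend + residual correction
    return out

# Toy series: linear trend plus alternating noise.
t = list(range(10))
y = [2.0 + 0.5 * ti + (0.2 if ti % 2 == 0 else -0.2) for ti in t]
preds = predict(t, y, 3)
print(preds)
```

The residual term decays geometrically with the forecast horizon, so long-horizon predictions fall back to the LS trend, mirroring how the hybrid method behaves.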
Funding: Supported by the National Natural Science Foundation of China (NSFC, Grant No. 42030410); Laoshan Laboratory (No. LSKJ202202402); the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDB40000000); and the Startup Foundation for Introducing Talent of NUIST.
Abstract: El Niño-Southern Oscillation (ENSO) is the strongest interannual climate mode influencing the coupled ocean-atmosphere system in the tropical Pacific, and numerous dynamical and statistical models have been developed to simulate and predict it. In some simplified coupled ocean-atmosphere models, the relationship between sea surface temperature (SST) anomalies and wind stress (τ) anomalies can be constructed by statistical methods, such as singular value decomposition (SVD). In recent years, the applications of artificial intelligence (AI) to climate modeling have shown promising prospects, and the integration of AI-based models with dynamical models is an active area of research. This study constructs U-Net models to represent the relationship between SST anomalies and τ anomalies in the tropical Pacific; the U-Net-derived τ model, denoted as τUNet, is then used to replace the original SVD-based τ model of an intermediate coupled model (ICM), forming a new AI-integrated ICM, referred to as ICM-UNet. The simulation results obtained from ICM-UNet demonstrate its ability to represent the spatiotemporal variability of oceanic and atmospheric anomaly fields in the equatorial Pacific. In the ocean-only case study, the τUNet-derived wind stress anomaly fields are used to force the ocean component of the ICM, the results of which also indicate reasonable simulations of typical ENSO events. These results demonstrate the feasibility of integrating an AI-derived model with a physics-based dynamical model for ENSO modeling studies. Furthermore, the successful integration of the dynamical ocean model with the AI-based atmospheric wind model provides a novel approach to ocean-atmosphere interaction modeling studies.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 42004051, 42274214, and 41904134).
Abstract: Long-wavelength (>500 km) magnetic anomalies originating in the lithosphere were first found in satellite magnetic surveys. Compared to the striking magnetic anomalies around the world, the long-wavelength magnetic anomalies in China and surrounding regions are relatively weak. Specialized research on each of these anomalies has been quite inadequate; their geological origins remain unclear, in particular their connection to tectonic activity in the Chinese and surrounding regions. We focus on six magnetic high anomalies over the (1) Tarim Basin, (2) Sichuan Basin, (3) Great Xing'an Range, (4) Barmer Basin, (5) Central Myanmar Basin, and (6) Sunda and Banda Arcs, and a striking magnetic low anomaly along the southern part of the Himalayan-Tibetan Plateau. We have analyzed their geological origins by reviewing related research and by detailed comparison with geological results. The tectonic backgrounds for these anomalies belong to two cases: either ancient basin basement or subduction-collision zone. However, the geological origins of large-scale regional magnetic anomalies are always subject to dispute, mainly because of the limited surface exposure of sources, later tectonic destruction, and the superposition of multi-phase events.
Funding: Supported by the Joint Funds of the National Natural Science Foundation of China (Grant No. U2039207).
Abstract: Based on the understanding that the seismic fault system is a nonlinear complex system, Rundle (1995) introduced the nonlinear threshold systems used in meteorology to analyze the ocean-atmosphere interface and the El Niño Southern Oscillation into the study of seismic activity changes, and then proposed the PI method (Rundle et al., 2000a, b). Wu et al. (2011) modified the Pattern Informatics method, named MPI, to extract ionospheric anomalies using data from the DEMETER satellite, which is suitable for short-term (1–3 month) prediction.
Funding: Supported by the Shanghai Rising-Star Program (No. 22QA1403900); the National Natural Science Foundation of China (No. 71804106); and the Noncarbon Energy Conversion and Utilization Institute under the Shanghai Class IV Peak Disciplinary Development Program.
Abstract: Accurate load forecasting forms a crucial foundation for implementing household demand response plans and optimizing load scheduling. When dealing with short-term load data characterized by substantial fluctuations, a single prediction model can hardly capture temporal features effectively, resulting in diminished prediction accuracy. In this study, a hybrid deep learning framework that integrates an attention mechanism, a convolutional neural network (CNN), improved chaotic particle swarm optimization (ICPSO), and long short-term memory (LSTM) is proposed for short-term household load forecasting. Firstly, the CNN model is employed to extract features from the original data, enhancing the quality of data features. Subsequently, the moving average method is used for data preprocessing, followed by the application of the LSTM network to predict the processed data. Moreover, the ICPSO algorithm is introduced to optimize the parameters of the LSTM, aimed at boosting the model's running speed and accuracy. Finally, the attention mechanism is employed to optimize the output value of the LSTM, effectively addressing the information loss in the LSTM induced by lengthy sequences and further elevating prediction accuracy. The numerical analysis verifies the accuracy and effectiveness of the proposed hybrid model. It can explore data features adeptly, achieving superior prediction accuracy compared to other forecasting methods for household loads exhibiting significant fluctuations across different seasons.
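The moving-average preprocessing step mentioned above can be sketched in a few lines. A trailing window is used here so the smoothed series stays causal (no future samples leak into the forecast input); the window width of 3 is an assumption, not the paper's setting.

```python
def moving_average(series, w=3):
    """Trailing moving average; the first w-1 points use a growing window."""
    out = []
    for i in range(len(series)):
        window = series[max(0, i - w + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

# Toy fluctuating load series, smoothed before feeding the LSTM.
load = [1.0, 5.0, 3.0, 7.0, 5.0, 9.0]
print(moving_average(load))  # → [1.0, 3.0, 3.0, 5.0, 5.0, 7.0]
```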
Funding: Supported by the Science and Technology Project of China Southern Power Grid Company, Ltd. (031200KK52200003) and the National Natural Science Foundation of China (Grant Nos. 62371253 and 52278119).
Abstract: In this paper, we propose a novel anomaly detection method for data centers based on a combination of graph structure and an abnormal attention mechanism. The method leverages the sensor monitoring data from target power substations to construct multidimensional time series. These time series are subsequently transformed into graph structures, and the corresponding adjacency matrices are obtained. By incorporating the adjacency matrices and additional weights associated with the graph structure, an aggregation matrix is derived. The aggregation matrix is then fed into a pre-trained graph convolutional neural network (GCN) to extract graph structure features. Moreover, both the multidimensional time series segments and the graph structure features are input into a pre-trained anomaly detection model, producing the corresponding anomaly detection results that help identify abnormal data. The anomaly detection model consists of a multi-level encoder-decoder module, wherein each level includes a transformer encoder and decoder based on correlation differences. The attention module in the encoding layer adopts an abnormal attention module with a dual-branch structure. Experimental results demonstrate that our proposed method significantly improves the accuracy and stability of anomaly detection.
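The aggregation-matrix step can be illustrated with the standard symmetric GCN normalization, A_hat = D^(-1/2)(A + I)D^(-1/2). The paper's additional graph-structure weights are not specified here, so this common form is an assumption standing in for them.

```python
def aggregation_matrix(adj):
    """Symmetric-normalized aggregation matrix D^(-1/2) (A + I) D^(-1/2)."""
    n = len(adj)
    # Add self-loops: A + I.
    a = [[adj[i][j] + (1.0 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    deg = [sum(row) for row in a]          # degree of each node
    inv_sqrt = [d ** -0.5 for d in deg]    # D^(-1/2) diagonal
    return [[inv_sqrt[i] * a[i][j] * inv_sqrt[j] for j in range(n)]
            for i in range(n)]

# Two sensors connected by one edge.
adj = [[0.0, 1.0], [1.0, 0.0]]
agg = aggregation_matrix(adj)
print([[round(v, 2) for v in row] for row in agg])  # → [[0.5, 0.5], [0.5, 0.5]]
```

The self-loops keep each node's own signal in the aggregation, and the symmetric scaling prevents high-degree sensors from dominating the GCN features.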
Funding: The National Key R&D Program of China under contract Nos. 2022YFC3003800, 2020YFC1521700, and 2020YFC1521705; the National Natural Science Foundation of China under contract No. 41830540; the Open Fund of the East China Coastal Field Scientific Observation and Research Station of the Ministry of Natural Resources under contract No. OR-SECCZ2022104; the Deep Blue Project of Shanghai Jiao Tong University under contract No. SL2020ZD204; the Special Funding Project for the Basic Scientific Research Operation Expenses of the Central Government-Level Research Institutes of Public Interest of China under contract No. SZ2102; and the Zhejiang Provincial Project under contract No. 330000210130313013006.
Abstract: Understanding the topographic patterns of the seafloor is a very important part of understanding our planet. Although the science involved in bathymetric surveying has advanced much over the decades, less than 20% of the seafloor has been precisely modeled to date, and there is an urgent need to improve the accuracy and reduce the uncertainty of underwater survey data. In this study, we introduce a pre-trained visual geometry group network (VGGNet) method based on deep learning. To apply this method, we input gravity anomaly data derived from ship measurements and satellite altimetry into the model and correct the latter, which has larger spatial coverage, based on the former, which is considered the true value and is more accurate. After obtaining the corrected high-precision gravity model, it is inverted to the corresponding bathymetric model by applying the gravity-depth correlation. We choose four data pairs collected from different environments, i.e., the Southern Ocean, Pacific Ocean, Atlantic Ocean, and Caribbean Sea, to evaluate the topographic correction results of the model. The experiments show that the coefficient of determination (R²) reaches 0.834 among the results of the four experimental groups, signifying a high correlation. The standard deviation and normalized root mean square error are also evaluated, and their accuracy improved by up to 24.2% compared with similar research done in recent years. The evaluation of the R² values at different water depths shows that our model can achieve results above 0.90 at certain water depths and can also significantly improve results at mid-water depths when compared to previous research. Finally, the bathymetry corrected by our model shows an accuracy improvement of more than 21% within 1% of the total water depths, which is sufficient to prove that the VGGNet-based method is able to perform gravity-bathymetry correction and achieve outstanding results.
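The R² metric used above to score predicted bathymetry against ship-measured truth is straightforward to compute; the depth values below are made-up illustrations, not data from the study.

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Hypothetical depths in meters (negative = below sea level).
depths_true = [-4200.0, -4150.0, -3900.0, -4050.0]
depths_pred = [-4190.0, -4160.0, -3920.0, -4040.0]
print(round(r_squared(depths_true, depths_pred), 3))  # → 0.987
```

An R² of 1 means the corrected model reproduces the ship soundings exactly; the paper's reported 0.834 across four regions indicates the correction explains most of the depth variance.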
Funding: Funded by the Fujian Province Science and Technology Plan, China (Grant No. 2019H0017).
Abstract: Accurate forecasting of time series is crucial across various domains. Many prediction tasks rely on effectively segmenting, matching, and aligning time series data. For instance, even for time series with the same granularity, segmenting them into events of different granularity can effectively mitigate the impact of varying time scales on prediction accuracy. However, these events of varying granularity frequently intersect with each other and may have unequal durations; even minor differences can result in significant errors when matching time series with future trends. Besides, directly using matched but unaligned events as state vectors in machine learning-based prediction models can lead to insufficient prediction accuracy. Therefore, this paper proposes a short-term forecasting method for time series based on multi-granularity events, MGE-SP (multi-granularity event-based short-term prediction). First, a methodological framework for MGE-SP is established to guide the implementation steps. The framework consists of three key steps: multi-granularity event matching based on the LTF (latest time first) strategy, multi-granularity event alignment using a piecewise aggregate approximation based on the compression ratio, and a short-term prediction model based on XGBoost. Data from a nationwide online car-hailing service in China ensures the method's reliability. The average RMSE (root mean square error) and MAE (mean absolute error) of the proposed method are 3.204 and 2.360, lower than the respective values of 4.056 and 3.101 obtained using the ARIMA (autoregressive integrated moving average) method, as well as the values of 4.278 and 2.994 obtained using the k-means-SVR (support vector regression) method. Another experiment is conducted on stock data from a public data set. The proposed method achieved an average RMSE and MAE of 0.836 and 0.696, lower than the respective values of 1.019 and 0.844 obtained using the ARIMA method, as well as the values of 1.350 and 1.172 obtained using the k-means-SVR method.
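The alignment step above uses piecewise aggregate approximation (PAA), which compresses a series of length n down to m segment means so events of unequal duration become comparable vectors. This sketch uses simple integer segment boundaries; the paper's compression-ratio-driven choice of m is assumed, not reproduced.

```python
def paa(series, m):
    """Piecewise aggregate approximation: reduce len(series) points to m means."""
    n = len(series)
    out = []
    for k in range(m):
        lo = k * n // m          # segment start (integer split)
        hi = (k + 1) * n // m    # segment end
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

# Toy demand series compressed from 6 points to 3 segment means.
demand = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0]
print(paa(demand, 3))  # → [3.0, 7.0, 11.0]
```

After PAA, two events of different lengths reduced to the same m can be fed to the predictor as equal-length state vectors.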
Abstract: BACKGROUND: Dental anomalies are variations from the well-established general anatomy and morphology of the tooth that result from disturbances during tooth formation. They can be developmental, congenital, or acquired and may be localized to a single tooth or involve systemic conditions. AIM: To evaluate the prevalence of dental anomalies in patients who report to the Komfo Anokye Teaching Hospital (KATH) dental clinics. METHOD: A descriptive cross-sectional design was used with a sample size of 92 patients aged 18 or older, obtained through convenience sampling. Data analysis was performed using SPSS version 26.0. RESULTS: The study included 92 patients aged 18 to 72 years, 47.8% male and 52.2% female. Dental anomalies were observed in 51.1% of participants, with a higher prevalence in females (55.3%). The most common anomalies were diastema (48.3%), impacted teeth (22.0%), dilaceration (11.9%), and peg-shaped lateral teeth (6.8%). CONCLUSION: This study highlights the importance of conducting thorough dental examinations to identify and address dental anomalies, which may have implications for treatment. Early detection and correction of these anomalies are crucial to prevent future complications.
Abstract: Recently, anomaly detection (AD) in streaming data has gained significant attention among research communities due to its applicability in finance, business, healthcare, education, etc. Recent developments in deep learning (DL) models have proved helpful in the detection and classification of anomalies. This article designs an oversampling with optimal deep learning-based streaming data classification (OS-ODLSDC) model. The aim of the OS-ODLSDC model is to recognize and classify the presence of anomalies in streaming data. The proposed OS-ODLSDC model initially undergoes a preprocessing step. Since streaming data is unbalanced, the support vector machine (SVM)-Synthetic Minority Over-sampling Technique (SVM-SMOTE) is applied for the oversampling process. Besides, the OS-ODLSDC model employs bidirectional long short-term memory (BiLSTM) for AD and classification. Finally, the root mean square propagation (RMSProp) optimizer is applied for optimal hyperparameter tuning of the BiLSTM model. To ensure the promising performance of the OS-ODLSDC model, a wide-ranging experimental analysis is performed using three benchmark datasets: CICIDS 2018, KDD-Cup 1999, and NSL-KDD.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 72288101, 72201029, and 72322022).
Abstract: Accurate origin–destination (OD) demand prediction is crucial for the efficient operation and management of urban rail transit (URT) systems, particularly during a pandemic. However, this task faces several limitations, including real-time availability, sparsity, and high-dimensionality issues, as well as the impact of the pandemic. Consequently, this study proposes a unified framework called the physics-guided adaptive graph spatial–temporal attention network (PAG-STAN) for metro OD demand prediction under pandemic conditions. Specifically, PAG-STAN introduces a real-time OD estimation module to estimate real-time complete OD demand matrices. Subsequently, a novel dynamic OD demand matrix compression module is proposed to generate dense real-time OD demand matrices. Thereafter, PAG-STAN leverages various heterogeneous data to learn the evolutionary trend of future OD ridership during the pandemic. Finally, a masked physics-guided loss function (MPG-loss function) incorporates the physical relationship between OD demand and inbound flow into the loss function to enhance model interpretability. PAG-STAN demonstrated favorable performance on two real-world metro OD demand datasets under the pandemic and conventional scenarios, highlighting its robustness and sensitivity for metro OD demand prediction. A series of ablation studies were conducted to verify the indispensability of each module in PAG-STAN.
Abstract: Network anomaly detection plays a vital role in safeguarding network security. However, the existing network anomaly detection task is typically based on the one-class zero-positive scenario. This approach is susceptible to overfitting during the training process due to discrepancies in data distribution between the training set and the test set, a phenomenon known as prediction drift. Additionally, the rarity of anomaly data, often masked by normal data, further complicates network anomaly detection. To address these challenges, we propose the PUNet network, which ingeniously combines the strengths of traditional machine learning and deep learning techniques for anomaly detection. Specifically, PUNet employs a reconstruction-based autoencoder to pre-train on normal data, enabling the network to capture potential features and correlations within the data. Subsequently, PUNet integrates a sampling algorithm to construct a pseudo-label candidate set among the outliers based on the reconstruction loss of the samples. This approach effectively mitigates the prediction drift problem by incorporating abnormal samples. Furthermore, PUNet utilizes the CatBoost classifier for anomaly detection to tackle potential data imbalance issues within the candidate set. Extensive experimental evaluations demonstrate that PUNet effectively resolves the prediction drift and data imbalance problems, significantly outperforming competing methods.
Funding: The study was approved by the ethics committee of the First Affiliated Hospital of Chongqing Medical University (2022-K205) and was conducted in accordance with the World Medical Association Declaration of Helsinki.
Abstract: BACKGROUND: Previous studies have reported that low hematocrit levels indicate poor survival in patients with ovarian cancer and cervical cancer; the prognostic value of hematocrit for colorectal cancer (CRC) patients has not been determined. The prognostic value of red blood cell distribution width (RDW) for CRC patients is also controversial. AIM: To investigate the impact of RDW and hematocrit on the short-term outcomes and long-term prognosis of CRC patients who underwent radical surgery. METHODS: Patients who were diagnosed with CRC and underwent radical CRC resection between January 2011 and January 2020 at a single clinical center were included. The short-term outcomes, overall survival (OS), and disease-free survival (DFS) were compared among the different groups. Cox analysis was also conducted to identify independent risk factors for OS and DFS. RESULTS: A total of 4258 CRC patients who underwent radical surgery were included in our study; 1573 patients were in the lower RDW group and 2685 patients were in the higher RDW group, while 2166 and 2092 patients were in the higher and lower hematocrit groups, respectively. Patients in the higher RDW group had more intraoperative blood loss (P<0.01) and more overall complications (P<0.01) than those in the lower RDW group. Similarly, patients in the lower hematocrit group had more intraoperative blood loss (P=0.012), longer hospital stays (P=0.016), and more overall complications (P<0.01) than those in the higher hematocrit group. The higher RDW group had worse OS and DFS than the lower RDW group for tumor node metastasis (TNM) stage I (OS, P<0.05; DFS, P=0.001) and stage II (OS, P=0.004; DFS, P=0.01); the lower hematocrit group had worse OS and DFS for TNM stage II (OS, P<0.05; DFS, P=0.001) and stage III (OS, P=0.001; DFS, P=0.001) than the higher hematocrit group. Preoperative hematocrit was an independent risk factor for OS [P=0.017, hazard ratio (HR)=1.256, 95% confidence interval (CI): 1.041-1.515] and DFS (P=0.035, HR=1.194, 95% CI: 1.013-1.408). CONCLUSION: A higher preoperative RDW and a lower hematocrit were associated with more postoperative complications. However, only hematocrit was an independent risk factor for OS and DFS in CRC patients who underwent radical surgery, while RDW was not.
Funding: This work is partly supported by the National Key Research and Development Program of China (Grant No. 2020YFB1805403); the National Natural Science Foundation of China (Grant No. 62032002); and the 111 Project (Grant No. B21049).
Abstract: In the Industrial Internet of Things (IIoT), sensors generate time series data to reflect the working state. When systems are attacked, timely identification of outliers in the time series is critical to ensure security. Although many anomaly detection methods have been proposed, the temporal correlation of the time series from the same sensor and the state (spatial) correlation between different sensors are rarely considered simultaneously in these methods. Owing to the superior capability of the Transformer in learning time series features, this paper proposes a time series anomaly detection method based on a spatial-temporal network and an improved Transformer. Additionally, methods based on graph neural networks typically include a graph structure learning module and an anomaly detection module, which are interdependent. However, in the initial phase of training, since neither module has reached an optimal state, their performance may influence each other; this makes it hard for an end-to-end training approach to effectively direct the learning trajectory of each module. This interdependence, coupled with the initial instability, may prevent the model from finding the optimal solution during training, resulting in unsatisfactory results. We therefore introduce an adaptive graph structure learning method to obtain the optimal model parameters and graph structure. Experiments on two publicly available datasets demonstrate that the proposed method attains better anomaly detection results than other methods.
Funding: Supported by grants PID2020-112675RBC44 (ONOFRE-3), funded by MCIN/AEI/10.13039/501100011033; Horizon Project RIGOUROUS, funded by the European Commission (GA: 101095933); and TSI-063000-2021-{36,44,45,62} (Cerberus), funded by MAETD's 2021 UNICO I+D Program.
Abstract: The management of network intelligence in Beyond 5G (B5G) networks encompasses the complex challenges of scalability, dynamicity, interoperability, privacy, and security. These are essential steps towards achieving the realization of truly ubiquitous Artificial Intelligence (AI)-based analytics, empowering seamless integration across the entire Continuum (Edge, Fog, Core, Cloud). This paper introduces a Federated Network Intelligence Orchestration approach aimed at scalable and automated Federated Learning (FL)-based anomaly detection in B5G networks. By leveraging a horizontal federated learning approach based on the FedAvg aggregation algorithm, which employs a deep autoencoder model trained on non-anomalous traffic samples to recognize normal behavior, the system orchestrates network intelligence to detect and prevent cyber-attacks. Integrated into a B5G Zero-touch Service Management (ZSM)-aligned security framework, the proposal utilizes multi-domain and multi-tenant orchestration to automate and scale the deployment of FL agents and AI-based anomaly detectors, enhancing reaction capabilities against cyber-attacks. The proposed FL architecture can be dynamically deployed across the B5G Continuum, utilizing a hierarchy of Network Intelligence orchestrators for real-time anomaly and security threat handling. The implementation includes FL enforcement operations for interoperability and extensibility, enabling dynamic deployment, configuration, and reconfiguration on demand. Performance validation of the proposed solution was conducted through dynamic orchestration, FL, and real-time anomaly detection processes in a practical test environment. Analysis of key performance metrics, leveraging the 5G-NIDD dataset, demonstrates the system's capability for automatic and near real-time handling of anomalies and attacks, including real-time network monitoring and countermeasure implementation for mitigation.
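The FedAvg aggregation at the heart of the approach above is simple to sketch: the server averages the parameters reported by the clients, weighted by their local sample counts. Models are represented here as flat lists of floats; the client weights and sizes are made-up illustrations.

```python
def fed_avg(client_weights, client_sizes):
    """Weighted average of client parameter vectors (FedAvg server step)."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two edge domains with different amounts of local (normal-only) traffic.
w_a, n_a = [0.2, 0.4], 100
w_b, n_b = [0.6, 0.0], 300
global_model = fed_avg([w_a, w_b], [n_a, n_b])
print(global_model)
```

Only parameter vectors cross domain boundaries, never raw traffic, which is what makes the scheme attractive for the multi-domain, privacy-sensitive B5G setting described above.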
Abstract: Coronary artery anomaly is known as one of the causes of angina pectoris and sudden death and is an important clinical entity that cannot be overlooked. The incidence of coronary artery anomalies is as low as 1%-2% of the general population, even when the various types are combined. Coronary anomalies are practically challenging when the left and right coronary ostia are not found around their normal positions during coronary angiography with a catheter. If there is atherosclerotic stenosis of a coronary artery with an anomaly and percutaneous coronary intervention (PCI) is required, the suitability of the guiding catheter at the entrance and adequate back-up force of the guiding catheter are issues. The level of PCI risk itself should also be considered on a case-by-case basis. In this case, emission computed tomography in an R-1 subtype single coronary artery proved that ischemia occurred in an area where the coronary artery was not visible to the naked eye. Meticulous follow-up would be crucial, because sudden death may occur in single coronary arteries. Making every effort to prevent atherosclerosis is also important, as the authors indicated admirably.
Funding: Supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2023-00235509, Development of Security Monitoring Technology Based Network Behavior against Encrypted Cyber Threats in ICT Convergence Environment).
Abstract: In the IoT (Internet of Things) domain, the increased use of encryption protocols such as SSL/TLS, VPN (Virtual Private Network), and Tor has led to a rise in attacks leveraging encrypted traffic. While research on anomaly detection using AI (Artificial Intelligence) is actively progressing, the encrypted nature of the data poses challenges for labeling, resulting in data imbalance and biased feature extraction toward specific nodes. This study proposes a reconstruction error-based anomaly detection method using an autoencoder (AE) that utilizes packet metadata, excluding specific node information. The proposed method omits biased packet metadata such as IP and port and trains the detection model using only normal data, leveraging a small amount of packet metadata. This makes it well suited for direct application in IoT environments due to its low resource consumption. In experiments comparing feature extraction methods for AE-based anomaly detection, we found that using flow-based features significantly improves accuracy, precision, F1 score, and AUC (Area Under the Receiver Operating Characteristic Curve) compared to packet-based features. Additionally, for flow-based features, the proposed method showed a 30.17% increase in F1 score and improved false positive rates compared to Isolation Forest and One-Class SVM. Furthermore, the proposed method demonstrated a 32.43% higher AUC when using packet features and a 111.39% higher AUC when using flow features, compared to previously proposed oversampling methods. This study highlights the impact of feature extraction methods on attack detection in imbalanced, encrypted traffic environments and emphasizes that the one-class method using AE is more effective for attack detection and reducing false positives than traditional oversampling methods.
Funding: Supported by the Sichuan Provincial Key Research and Development Program of China (Grant No. 2023YFG0351) and the National Natural Science Foundation of China (Grant No. 61833002).
Abstract: Predictive maintenance has emerged as an effective tool for curbing maintenance costs, yet prevailing research concentrates predominantly on the abnormal phases. Within the ostensibly stable healthy phase, relying on anomaly detection to preempt equipment malfunctions faces the challenge of discerning sudden anomalies. To address this challenge, this paper proposes a dual-task learning approach for bearing anomaly detection and state evaluation of safe regions. The proposed method transforms the execution of the two tasks into an optimization problem over the hypersphere center. Using the monotonicity and distinguishability pertinent to the tasks as the basis for optimization, it reconstructs the SVDD (Support Vector Data Description) model to balance the model's performance across the two tasks. Subsequent experiments verify the proposed method's effectiveness, which is interpreted from the perspectives of parameter adjustment and enveloping trade-offs. The experimental results also reveal two deficiencies, in anomaly detection accuracy and in state evaluation metrics; their theoretical analysis suggests that improvements should focus on feature extraction and data collection. The proposed method lays the foundation for realizing predictive maintenance in the healthy stage by improving condition awareness in safe regions.
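The hypersphere intuition behind SVDD-style data description can be sketched briefly: describe the healthy data by a center and radius, flag points outside the envelope as anomalies, and reuse the distance to the center as a monotone health index for state evaluation. This is only an unweighted illustration (center taken as the healthy-sample mean), not the paper's reconstructed SVDD, and the bearing features are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical healthy-phase bearing features (e.g. RMS, kurtosis),
# clustered around a nominal operating point.
X_healthy = rng.normal(loc=[1.0, 3.0], scale=0.1, size=(300, 2))

# Simplest data-description boundary: a hypersphere whose center is the
# mean of healthy samples (full SVDD instead optimizes the center with
# slack variables; the mean is the unweighted special case).
center = X_healthy.mean(axis=0)
dist = np.linalg.norm(X_healthy - center, axis=1)
radius = np.quantile(dist, 0.99)   # envelope of the safe region

def health_index(x):
    """Monotone state score: < 1 inside the safe region, > 1 outside."""
    return float(np.linalg.norm(x - center) / radius)

def is_anomaly(x):
    # Task 1 (detection) falls out of task 2 (state evaluation):
    # a sample is anomalous once its health index crosses 1.
    return health_index(x) > 1.0

print(is_anomaly(np.array([1.02, 3.01])))   # near the nominal point
print(is_anomaly(np.array([2.00, 4.50])))   # far outside the envelope
```

Because the same distance serves both tasks, the score degrades monotonically as the equipment drifts away from the safe region, which is the property the dual-task formulation exploits.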
Abstract: BACKGROUND: Hepatectomy is the first choice for treating liver cancer. However, inflammatory factors released in response to pain stimulation may suppress perioperative immune function and affect the prognosis of patients undergoing hepatectomy. AIM: To determine the short-term efficacy of microwave ablation in the treatment of liver cancer and its effect on immune function. METHODS: Clinical data from patients with liver cancer admitted to Suzhou Ninth People's Hospital from January 2020 to December 2023 were retrospectively analyzed. Thirty-five patients underwent laparoscopic hepatectomy for liver cancer (liver cancer resection group) and 35 patients underwent medical image-guided microwave ablation (liver cancer ablation group). The short-term efficacy, complications, liver function, and immune function indices before and after treatment were compared between the two groups. RESULTS: One month after treatment, the liver cancer resection group had 19 patients with complete remission (CR), 8 with partial remission (PR), 6 with stable disease (SD), and 2 with disease progression (PD). In the liver cancer ablation group, 21 patients experienced CR, 9 PR, 3 SD, and 2 PD. No significant differences in efficacy or complications were detected between the liver cancer ablation and liver cancer resection groups (P > 0.05). After treatment, total bilirubin (41.24 ± 7.35 vs 49.18 ± 8.64 μmol/L, P < 0.001), alanine aminotransferase (30.85 ± 6.23 vs 42.32 ± 7.56 U/L, P < 0.001), CD4+ (43.95 ± 5.72 vs 35.27 ± 5.56, P < 0.001), CD8+ (20.38 ± 3.91 vs 22.75 ± 4.62, P < 0.001), and CD4+/CD8+ (2.16 ± 0.39 vs 1.55 ± 0.32, P < 0.001) differed significantly between the liver cancer ablation and liver cancer resection groups. CONCLUSION: The short-term efficacy and safety of microwave ablation and laparoscopic surgery for the treatment of liver cancer are similar, but liver function recovers more quickly after microwave ablation, and microwave ablation may enhance immune function.