Fault diagnosis is important for maintaining the safety and effectiveness of chemical processes. Considering the multivariate, nonlinear, and dynamic characteristics of chemical processes, many time-series-based data-driven fault diagnosis methods have been developed in recent years. However, existing methods suffer from the long-term dependency problem and are difficult to train because of their sequential training scheme. To overcome these problems, a novel fault diagnosis method for chemical processes based on time series and hierarchical multihead self-attention (HMSAN) is proposed. First, a sliding-window strategy is adopted to construct the normalized time-series dataset. Second, the HMSAN is developed to extract time-relevant features from the time-series process data. It improves the basic self-attention model in both width and depth. With the multihead structure, the HMSAN can attend to different aspects of the complicated chemical process and obtain global dynamic features. However, the multiple parallel heads produce redundant information, which does not improve diagnosis performance. With the hierarchical structure, the redundant information is reduced and deep local time-related features are further extracted. Besides, a novel many-to-one training strategy is introduced for HMSAN to simplify the training procedure and capture long-term dependency. Finally, the effectiveness of the proposed method is demonstrated on two chemical cases. The experimental results show that the proposed method achieves excellent performance on time-series industrial data and outperforms state-of-the-art approaches.
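The sliding-window preprocessing step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the window length, stride, and z-score normalization are assumptions.

```python
import numpy as np

def sliding_windows(data, window, step=1):
    """Z-score normalize each process variable, then slice overlapping windows."""
    mu = data.mean(axis=0)
    sigma = data.std(axis=0) + 1e-8          # avoid division by zero
    norm = (data - mu) / sigma
    n = (len(norm) - window) // step + 1
    return np.stack([norm[i * step : i * step + window] for i in range(n)])

X = np.random.default_rng(0).normal(size=(100, 5))  # 100 samples, 5 process variables
W = sliding_windows(X, window=20, step=5)
print(W.shape)  # → (17, 20, 5)
```

Each resulting window is one training example for the attention model, so the dataset shape is (windows, time steps, variables).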
Frequent missing values in radar-derived time-series tracks of aerial targets (RTT-AT) pose significant challenges for subsequent data-driven tasks. However, the majority of imputation research focuses on random missing (RM) patterns, which differ significantly from the missing patterns common in RTT-AT; methods designed for RM may degrade or fail when applied to RTT-AT imputation. Conventional autoregressive deep learning methods are prone to error accumulation and loss of long-term dependency. In this paper, a non-autoregressive imputation model is proposed that addresses missing-value imputation for two common missing patterns in RTT-AT. The model consists of two probabilistic sparse diagonal masking self-attention (PSDMSA) units and a weight fusion unit. It learns missing values by combining the representations output by the two units, aiming to minimize the difference between the imputed and actual values. The PSDMSA units effectively capture temporal dependencies and attribute correlations between time steps, improving imputation quality. The weight fusion unit automatically updates the weights of the output representations from the two units to obtain a more accurate final representation. The experimental results indicate that, across varying missing rates in the two missing patterns, the model consistently outperforms other methods in imputation performance and exhibits a low frequency of deviations in estimates for specific missing entries. Compared with the state-of-the-art autoregressive deep learning imputation model Bidirectional Recurrent Imputation for Time Series (BRITS), the proposed model reduces mean absolute error (MAE) by 31%-50%. Additionally, the model trains 4 to 8 times faster than both BRITS and a standard Transformer on the same dataset. Finally, ablation experiments demonstrate that the PSDMSA units, the weight fusion unit, the cascade network design, and the imputation loss each enhance imputation performance, confirming the efficacy of the design.
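The diagonal-masking idea behind the PSDMSA units can be illustrated with plain self-attention: masking the diagonal forces each time step to be reconstructed from the other steps rather than from itself. The sketch below omits the probabilistic sparsity and learned projections of the actual model.

```python
import numpy as np

def diagonal_masked_attention(X):
    """Self-attention over time steps where no step may attend to itself."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                 # X serves as query/key/value for brevity
    np.fill_diagonal(scores, -np.inf)             # mask the diagonal
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)
    return A @ X, A

track = np.random.default_rng(1).normal(size=(6, 4))  # 6 time steps, 4 track attributes
out, attn = diagonal_masked_attention(track)
print(np.diag(attn))  # all zeros: no step attends to itself
```

Because the attention is computed over the whole sequence at once, imputation is non-autoregressive, which is what avoids the error accumulation of recurrent imputers.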
Objective: The purpose of this study was to determine the effectiveness of brisk walking as an intervention for self-care agency and care dependency in patients with a permanent colorectal cancer stoma. Method: This study adopted a quasi-experimental research design, specifically a non-equivalent control group pre-test and post-test design. Using the Exercise of Self-Care Agency Scale (ESCA) and the Care Dependency Scale (CDS), a survey was administered to 64 patients from a hospital in Shandong Province. The statistical methods used to analyze the data included frequency, mean, standard deviation (SD), independent t-tests, P-value calculation, and dependent t-tests. Result: After two months of a brisk walking exercise program, participants in the experimental group had a higher level of self-care agency than before the experiment (P<0.05), and their level of care dependency was significantly reduced (P<0.05). Participants in the control group also showed higher levels of self-care agency (P<0.05) and lower levels of care dependency (P<0.05) after two months compared with their baseline levels. Conclusion: The brisk walking program had a positive impact on patients' self-care agency and reduced their care dependency.
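The dependent (paired) t-test used for the pre/post comparison can be computed directly. The scores below are hypothetical illustrations, not the study's ESCA data.

```python
import numpy as np

def paired_t(pre, post):
    """Dependent (paired) t-test statistic for pre/post scores on the same subjects."""
    d = post - pre
    n = len(d)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))   # mean difference over its standard error
    return t, n - 1                                # statistic and degrees of freedom

pre  = np.array([95, 102, 88, 110, 97, 105, 92, 99])    # hypothetical ESCA scores, baseline
post = np.array([104, 110, 95, 118, 103, 112, 99, 108]) # after two months
t, df = paired_t(pre, post)
print(round(t, 2), df)  # → 20.33 7
```

A large positive t with df = n - 1 corresponds to the reported significant increase (P < 0.05) in self-care agency.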
BACKGROUND The literature has discussed the relationship between environmental factors and depressive disorders; however, the results are inconsistent across studies and regions, as are the interaction effects between environmental factors. We hypothesized that meteorological factors and ambient air pollution individually affect, and interact to affect, depressive disorder morbidity. AIM To investigate the effects of meteorological factors and air pollution on depressive disorders, including their lagged effects and interactions. METHODS The samples were obtained from a Class 3 hospital in Harbin, China. Daily hospital admission data for depressive disorders from January 1, 2015 to December 31, 2022 were obtained. Meteorological and air pollution data were collected for the same period. Generalized additive models with quasi-Poisson regression were used for time-series modeling to measure the non-linear and delayed effects of environmental factors. We further incorporated each pair of environmental factors into a bivariate response surface model to examine interaction effects on hospital admissions for depressive disorders. RESULTS Data for 2922 days were included in the study, with no missing values. The total number of depressive admissions was 83905. Medium to high correlations existed between environmental factors. Air temperature (AT) and wind speed (WS) significantly affected the number of admissions for depression. An extremely low temperature (-29.0°C) at lag 0 caused a 53% [relative risk (RR) = 1.53, 95% confidence interval (CI): 1.23-1.89] increase in daily hospital admissions relative to the median temperature. An extremely low WS (0.4 m/s) at lag 7 increased the number of admissions by 58% (RR = 1.58, 95% CI: 1.07-2.31). In contrast, atmospheric pressure and relative humidity had smaller effects. Among the six air pollutants considered in the time-series model, nitrogen dioxide (NO₂) was the only pollutant that showed significant effects under non-cumulative, cumulative, immediate, and lagged conditions. The cumulative effect of NO₂ at lag 7 was 0.47% (RR = 1.0047, 95% CI: 1.0024-1.0071). Interaction effects were found between AT and the five air pollutants, between atmospheric temperature and the four air pollutants, and between WS and sulfur dioxide. CONCLUSION Meteorological factors and the air pollutant NO₂ affect daily hospital admissions for depressive disorders, and interactions exist between meteorological factors and ambient air pollution.
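Effects in such time-series regressions are estimated on the log scale and reported as relative risks with Wald confidence intervals, the form in which RR = 1.53 (95% CI: 1.23-1.89) is given above. The sketch below shows that conversion; the standard error is an illustrative value, not one from the study.

```python
import numpy as np

def relative_risk_ci(beta, se, z=1.959963984540054):
    """Relative risk and 95% Wald CI from a log-scale regression coefficient.
    z is the 97.5th percentile of the standard normal distribution."""
    rr = np.exp(beta)
    lo, hi = np.exp(beta - z * se), np.exp(beta + z * se)
    return rr, (lo, hi)

# Illustrative: a coefficient whose exponential is 1.53, with an assumed SE of 0.11.
rr, (lo, hi) = relative_risk_ci(beta=np.log(1.53), se=0.11)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

The same transformation applied to a small cumulative coefficient yields near-unity RRs such as the reported 1.0047 for NO₂.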
Marxist political economy provides a perspective for grasping the root cause of the China-US trade war. The international relations of production, which stem from the international division of labor, shape the distribution of international economic interests and the political status of countries. Traditionally, developing countries have been relegated to the "periphery" of the international division of labor. In the new global value chain, developing countries have remained in a subordinate position characterized by "technological-market" dependence. To achieve the goal of building a strong modern nation, China must escape this "technological-market" dependence. Yet China's efforts and achievements in escaping dependent development are deemed a threat to US vested interests in international markets. To preserve the economic foundation of its hegemony, the US has resorted to a trade war to contain China's development.
Degradation and overstress failures occur in many electronic systems in which the operational loads and environmental conditions are complex. Their dependency, called the dependent competing failure process (DCFP), has been widely studied. An electronic system may experience mutual effects of degradation and shocks, which are considered interdependent: both the degradation process and the shock process lower the system's failure threshold and produce a cumulative effect. Ultimately, the competition between hard and soft failures causes system failure. Based on failure mechanism accumulation theory, this paper constructs a shock-degradation acceleration model and a threshold descent model, and a system reliability model is established from these two models. The mutual DCFP effect of electronic system interaction is decomposed into physical failure correlations, including acceleration, accumulation, and competition. As a case study, the reliability of an electronic system in an aeronautical system is analyzed with the proposed method. The proposed method is based on physics-of-failure evaluation and can provide an important reference for the quantitative evaluation and design improvement of newly designed systems when data are scarce.
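The competing-failure logic described above can be illustrated with a small Monte Carlo sketch: degradation accumulates over time, random shocks both add damage and lower the hard-failure threshold, and the system fails when either the soft or the hard condition is met. All rates and thresholds below are invented for illustration, not the paper's aeronautical case parameters.

```python
import numpy as np

def dcfp_reliability(t, n=100_000, seed=0):
    """Monte Carlo survival probability under a dependent competing failure process."""
    rng = np.random.default_rng(seed)
    wear = 0.8 * t + rng.normal(0, 0.5, n)      # soft-failure degradation path
    shocks = rng.poisson(0.3 * t, n)            # number of shocks arriving by time t
    shock_damage = shocks * 1.2                 # shocks accelerate degradation
    threshold = 20.0 - 0.6 * shocks             # shocks also lower the hard threshold
    soft_fail = wear + shock_damage > 15.0      # cumulative soft failure
    hard_fail = wear > threshold                # overstress hard failure
    return 1.0 - np.mean(soft_fail | hard_fail) # survive only if neither occurs

print(dcfp_reliability(t=5.0), dcfp_reliability(t=15.0))  # reliability decreases with time
```

The dependency is visible in the code: the same shock count appears in both the damage term and the threshold term, which is what makes the two failure processes correlated rather than independent.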
Multivariate time-series forecasting (MTSF) plays an important role in diverse real-world applications. To achieve better accuracy in MTSF, the time-series patterns within each variable and the interrelationship patterns between variables should be considered together. Recently, graph neural networks (GNNs) have gained much attention because they can learn both kinds of patterns using a graph. For accurate forecasting with a GNN, a well-defined graph is required. However, existing GNNs have limitations in reflecting the spectral similarity and time delay between nodes, and they consider all nodes with the same weight when constructing the graph. In this paper, we propose a novel graph construction method that addresses these limitations. We first calculate a Fourier-transform-based spectral similarity and then update this similarity to reflect the time delay. We then weight each node according to its number of edge connections to obtain the final graph and use it to train the GNN model. Through experiments on various datasets, we demonstrate that the proposed method enhances the performance of GNN-based MTSF models, and the proposed forecasting model achieves up to an 18.1% improvement in predictive performance over the state-of-the-art model.
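The core of the spectral-similarity step can be sketched as follows: compare variables by the cosine similarity of their Fourier amplitude spectra and keep each node's strongest edges. The time-delay adjustment and degree-based node weighting of the actual method are omitted; the top-k rule is an assumption for illustration.

```python
import numpy as np

def spectral_similarity_graph(X, k=2):
    """Adjacency matrix from Fourier-amplitude similarity between variables."""
    spec = np.abs(np.fft.rfft(X, axis=0))                      # amplitude spectrum per variable
    spec /= np.linalg.norm(spec, axis=0, keepdims=True) + 1e-8
    sim = spec.T @ spec                                        # cosine similarity of spectra
    np.fill_diagonal(sim, -np.inf)                             # no self-loops
    A = np.zeros_like(sim)
    for i in range(sim.shape[0]):                              # keep top-k edges per node
        A[i, np.argsort(sim[i])[-k:]] = 1.0
    return A

t = np.linspace(0, 10, 200)
X = np.stack([np.sin(2*np.pi*t), np.sin(2*np.pi*t + 0.5),      # two 1 Hz signals
              np.sin(7*np.pi*t), np.sin(7*np.pi*t + 0.3)], axis=1)  # two 3.5 Hz signals
A = spectral_similarity_graph(X, k=1)
print(A)  # signals sharing a frequency link to each other
```

Because the amplitude spectrum discards phase, two phase-shifted copies of the same signal are recognized as similar, which is exactly the case a raw-correlation graph with time delay would miss.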
This study investigates the phenomenon of technological gadget usage among pre-university students, including the time spent using gadgets as well as their purpose and influence. A descriptive research design was adopted. 131 pre-university students were randomly selected to answer a structured questionnaire; they were informed two weeks in advance to keep track of the time they spent on technological devices before answering. Findings showed that 99.2% of the respondents owned at least two technological gadgets, and all respondents owned a smartphone. The two gadgets on which respondents spent at least 4 h a day were smartphones (65.6%) and computers/laptops (21.4%), indicating that smartphones are the most commonly used and owned gadgets among the respondents. The majority of respondents were moderately nomophobic and moderately dependent on smartphones (70.2% and 66.4%, respectively). Correlation analysis demonstrated that the total time spent on gadgets in a day has a significant positive correlation with gadget dependency and with the total number of gadgets owned. Meanwhile, logistic regression was conducted to estimate the probability of nomophobia and dependency from the total time spent and the total number of technological gadgets. The findings demonstrated that as the total time spent using technological gadgets increases, there is a greater probability that respondents develop nomophobia and dependency. This indicates that nomophobia and dependency on technological gadgets can be used to predict lifestyle profiles. The use of technological gadgets can bring both benefit and harm to users, so users have to remain rational in order to derive maximum benefit from them.
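A logistic regression of the kind used above can be fitted with plain gradient descent. The data below are synthetic illustrations of the two predictors (daily hours and gadget count), not the study's survey responses.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Gradient-descent logistic regression; returns [intercept, coefficients...]."""
    Xb = np.hstack([np.ones((len(X), 1)), X])        # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))            # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)            # average log-loss gradient
    return w

rng = np.random.default_rng(2)
hours = rng.uniform(0, 10, 300)                      # hypothetical daily gadget hours
count = rng.integers(1, 6, 300)                      # hypothetical gadgets owned
logit = -4 + 0.6 * hours + 0.3 * count               # assumed: more use, higher risk
y = (rng.uniform(size=300) < 1 / (1 + np.exp(-logit))).astype(float)
w = fit_logistic(np.column_stack([hours, count]), y)
print(w[1] > 0)  # time spent carries a positive coefficient, mirroring the finding
```

A positive coefficient on time spent is what the reported result ("greater probability of nomophobia as time increases") corresponds to in model terms.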
With the improvement of equipment reliability, human factors have become the most uncertain part of a system. The Standardized Plant Analysis of Risk-Human Reliability Analysis (SPAR-H) method is a reliable method in the field of human reliability analysis (HRA) for evaluating human reliability and assessing risk in large complex systems. However, the classical SPAR-H method does not consider the dependencies among performance shaping factors (PSFs), which may cause overestimation or underestimation of the actual risk. To address this issue, this paper proposes a new method for handling dependencies among PSFs in SPAR-H based on the Pearson correlation coefficient. First, the dependence between each pair of PSFs is measured by the Pearson correlation coefficient. Second, the weights of the PSFs are obtained by considering the total degree of dependence. Finally, the PSF multipliers are modified based on the weights of the corresponding PSFs and then used in the calculation of human error probability (HEP). A case study illustrates the procedure and effectiveness of the proposed method.
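The idea of discounting correlated PSF multipliers can be sketched as below. The specific weighting formula here (weights from total absolute correlation, multipliers raised to those weights) is an invented illustration of the general approach, not the paper's scheme.

```python
import numpy as np

def dependence_adjusted_hep(multipliers, psf_samples, nominal_hep=0.01):
    """Discount each PSF multiplier by how strongly it correlates with the others,
    so correlated PSFs are not double-counted in the HEP product."""
    corr = np.corrcoef(psf_samples, rowvar=False)   # Pearson correlations between PSFs
    dep = np.abs(corr).sum(axis=0) - 1              # total dependence per PSF (excl. self)
    weights = 1.0 / (1.0 + dep)                     # higher dependence, lower weight
    adjusted = multipliers ** weights               # shrink each multiplier toward 1
    return nominal_hep * np.prod(adjusted)

rng = np.random.default_rng(3)
base = rng.normal(size=(50, 1))
psfs = np.hstack([base + 0.1 * rng.normal(size=(50, 1)),  # two strongly related PSFs
                  base + 0.1 * rng.normal(size=(50, 1)),
                  rng.normal(size=(50, 1))])              # one independent PSF
hep = dependence_adjusted_hep(np.array([2.0, 2.0, 5.0]), psfs)
naive = 0.01 * 2.0 * 2.0 * 5.0
print(hep < naive)  # dependence-aware HEP is below the naive product
```

This reproduces the qualitative point of the method: treating dependent PSFs as independent inflates the multiplied-out HEP.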
In the past two decades, because of the significant increase in the availability of differential interferometry from synthetic aperture radar and GPS data, spaceborne geodesy has been widely employed to determine the co-seismic displacement fields of earthquakes. On April 18, 2021, a moderate earthquake (Mw 5.8) occurred east of Bandar Ganaveh, southern Iran, followed by intensive seismic activity and aftershocks of various magnitudes. We use two-pass D-InSAR and Small Baseline Inversion techniques via the LiCSBAS suite to study the co-seismic displacement, monitor the four-month post-seismic deformation of the Bandar Ganaveh earthquake, and constrain the fault geometry of the co-seismic faulting mechanism during the seismic sequence. Analyses show that the co- and post-seismic deformation are distributed at relatively shallow depths along NW-SE-striking, NE-dipping complex reverse/thrust fault branches of the Zagros Mountain Front Fault, complying with the main trend of the Zagros structures. Average cumulative displacements of -137.5 and +113.3 mm/yr were obtained in the SW and NE blocks of the Mountain Front Fault, respectively. The maximum uplift obtained is approximately consistent with the overall orogen-normal shortening component of the Arabian-Eurasian convergence in the Zagros region. No surface ruptures were associated with the seismic source; we therefore propose a shallow blind thrust/reverse fault (depth ~10 km) connected to the deeper basal decollement fault within a complex tectonic zone, emphasizing thin-skinned tectonics.
Electric kickboard vehicles have been popularized and promoted primarily because of their clean and efficient features, and they are gradually growing in popularity in tourist and education-centric localities. With their widespread arrival, deploying a customer rental service is essential. Due to its free-floating nature, the shared electric kickboard is a common and practical means of transportation. Relocation plans for shared electric kickboards are required to increase the quality of service, and forecasting demand for their use in a specific region is crucial. Predicting demand accurately with little data is troublesome, as extensive data is necessary to train machine learning algorithms for effective prediction. Data generation is a method for expanding the amount of data available for training. In this work, we propose a model that takes time-series customer demand data for electric kickboards as input, pre-processes it, and generates synthetic data according to the original data distribution using generative adversarial networks (GANs). The electric kickboard mobility demand prediction error was reduced when we combined synthetic data with the original data. We propose Tabular-GAN-Modified-WGAN-GP for generating synthetic data for better prediction results. We modified the Wasserstein GAN with gradient penalty (WGAN-GP) by using the RMSprop optimizer and then employed Spectral Normalization (SN) to improve training stability and achieve faster convergence. Finally, we applied a regression-based blending ensemble technique to improve demand prediction performance. We used various evaluation criteria and visual representations to compare the proposed model's performance, and the synthetic data generated by the suggested GAN model is also evaluated. The TGAN-Modified-WGAN-GP model mitigates the overfitting and mode collapse problems and converges faster than previous GAN models for synthetic data creation. The presented model's performance is compared with existing ensemble and baseline models. The experimental findings imply that combining synthetic and actual data can significantly reduce prediction error, achieving a mean absolute percentage error (MAPE) of 4.476, and increase prediction accuracy.
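The regression-based blending step and the MAPE metric mentioned above can be sketched as follows. The base-model predictions are synthetic stand-ins, and least-squares blending on a holdout set is an assumed, simple variant of the ensemble.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, the metric reported in the abstract."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def blend(preds, y_holdout):
    """Fit least-squares blending weights for base-model holdout predictions."""
    w, *_ = np.linalg.lstsq(preds, y_holdout, rcond=None)
    return w

rng = np.random.default_rng(4)
y = rng.uniform(50, 150, 200)                          # hypothetical holdout demand
preds = np.column_stack([y + rng.normal(0, 10, 200),   # two imperfect base forecasters
                         y + rng.normal(0, 15, 200)])
w = blend(preds, y)
blended = preds @ w
print(round(mape(y, blended), 2))
```

Because the base models' errors are partly independent, the blended forecast typically attains a lower MAPE than the weaker base model alone.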
Read-write dependency is an important factor restricting software efficiency. Timing Speculative (TS) is a processing architecture aimed at improving the energy efficiency of microprocessors. The timing error rate, influenced by read-write dependencies, bottlenecks voltage down-scaling and thus the energy efficiency of TS processors. We propose a method called Read-Write Dependency Aware Register Allocation, based on the Read-Write Dependency aware Interference Graph (RWDIG) concept. Registers are reallocated to loosen read-write dependencies, resulting in a reduction of timing errors. The traditional no-operation (Nop) padding method is also redesigned to increase the dependency distance to above 2. We analyzed the register dependencies and maximized the average distance of read-write dependencies. Experimental results showed that all remaining read-write dependencies, and the associated timing errors, can be removed by Nop padding. An energy saving of approximately 7% was achieved.
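The Nop-padding idea can be shown on a toy instruction stream: insert Nops until every read occurs at least a minimum distance after the write that produced its value. Instructions are modelled as (destination, sources) pairs; this is an illustration of the padding pass, not the paper's compiler implementation.

```python
def pad_nops(instructions, min_distance=2):
    """Insert Nops so each register read is >= min_distance after its producing write."""
    out = []
    last_write = {}                       # register -> index of its last write in `out`
    for dest, srcs in instructions:
        while any(len(out) - last_write[r] < min_distance
                  for r in srcs if r in last_write):
            out.append(("nop", ()))       # pad until all read-write distances are safe
        last_write[dest] = len(out)
        out.append((dest, srcs))
    return out

# A chain of back-to-back dependencies: r1 -> r2 -> r3.
prog = [("r1", ()), ("r2", ("r1",)), ("r3", ("r2",))]
padded = pad_nops(prog)
print(sum(1 for d, _ in padded if d == "nop"))  # → 2
```

Each inserted Nop costs a cycle, which is why the register reallocation is used first to loosen dependencies and padding only removes the ones that remain.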
Smart contracts running on public blockchains are permissionless and decentralized, attracting both developers and malicious participants. Ethereum, the world's largest decentralized application platform, on which more than 40 million smart contracts are running, is frequently challenged by smart contract vulnerabilities. Worse, given the homogeneity of a wide range of smart contracts and the increase in inter-contract dependencies, a vulnerability in one smart contract can affect a large number of other contracts in Ethereum. However, little is known about how vulnerable contracts affect other on-chain contracts and which contracts can be affected. We therefore first present the contract dependency graph (CDG) for vulnerability analysis of Ethereum smart contracts, where the CDG characterizes inter-contract dependencies formed by DELEGATECALL-type internal transactions in Ethereum. Then, three generic definitions of security violations against the CDG are given for finding the potential victim contracts affected by different types of vulnerable contracts. Further, we construct the CDG from 195,247 smart contracts active in the latest blocks of Ethereum and verify the above security violations against the CDG by detecting three representative known vulnerabilities. Compared with previous large-scale vulnerability analyses, our analysis scheme marks the potential victim contracts that can be affected by different types of vulnerable contracts and identifies their possible risks based on the type of security violation that actually occurs. The analysis results show that the proportion of potential victim contracts reaches 14.7%, far more than that of the corresponding vulnerable contracts (less than 0.02%) in the CDG.
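The victim-marking idea can be sketched as reachability over the dependency edges: since DELEGATECALL executes the callee's code in the caller's storage context, every contract that transitively delegates into a vulnerable contract is a potential victim. The labels below are toy names, not real Ethereum addresses, and the three violation types of the paper are collapsed into simple reachability.

```python
def potential_victims(delegatecalls, vulnerable):
    """Mark contracts that transitively DELEGATECALL into a vulnerable contract."""
    callers = {}                                  # reverse adjacency: callee -> callers
    for caller, callee in delegatecalls:
        callers.setdefault(callee, set()).add(caller)
    victims, stack = set(), list(vulnerable)
    while stack:                                  # walk callers transitively (DFS)
        c = stack.pop()
        for caller in callers.get(c, ()):
            if caller not in victims:
                victims.add(caller)
                stack.append(caller)
    return victims

edges = [("A", "Lib"), ("B", "Lib"), ("C", "B"), ("D", "Safe")]
print(sorted(potential_victims(edges, {"Lib"})))  # → ['A', 'B', 'C']
```

This also shows why the victim proportion (14.7%) can dwarf the vulnerable proportion (<0.02%): one widely delegated-to library contaminates its whole caller tree.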
We propose a model of edge-coupled interdependent networks with directed dependency links (EINDDLs) and develop a theoretical analysis framework for this model based on the self-consistent probabilities method. The phase transition behaviors and parameter thresholds of this model under random attacks are analyzed theoretically on both random regular (RR) networks and Erdős-Rényi (ER) networks, and computer simulations are performed to verify the results. In this EINDDL model, a fraction β of the connectivity links within network B depends on network A, and a fraction (1-β) of the connectivity links within network A depends on network B. It is found that, upon randomly removing a fraction (1-p) of the connectivity links in network A at the initial state, network A exhibits different types of phase transitions (first order, second order, and hybrid). Network B is rarely affected by cascading failure when β is small, and network B gradually shifts from a first-order to a second-order phase transition as β increases. We present the critical values of β for the phase-change processes of networks A and B, and give the critical values of p and β for network B at the point of collapse. Furthermore, a cascading prevention strategy is proposed. The findings are of great significance for understanding the robustness of EINDDLs.
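The self-consistent probabilities machinery can be illustrated on the simplest case, ordinary percolation on a single ER network, where the surviving fraction S satisfies the fixed point S = p(1 - e^(-⟨k⟩S)). The paper extends this kind of fixed-point system to two edge-coupled networks with directed dependency links; the single-network version below is only the building block.

```python
import numpy as np

def giant_component_fraction(mean_degree, p, iters=200):
    """Iterate the self-consistent equation S = p * (1 - exp(-<k> * S))."""
    S = 1.0
    for _ in range(iters):                      # fixed-point iteration from S = 1
        S = p * (1.0 - np.exp(-mean_degree * S))
    return S

# Above the threshold p_c = 1/<k> a giant component survives; below it, S -> 0.
print(round(giant_component_fraction(4.0, 0.9), 3),
      round(giant_component_fraction(4.0, 0.2), 3))
```

Tracking how such fixed points appear or vanish as p and β vary is what distinguishes the first-order, second-order, and hybrid transitions reported above.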
Based on the force-heat equivalence energy density principle, a theoretical model for magnetic metallic materials is developed that characterizes the temperature-dependent magnetic anisotropy energy by considering the equivalence between magnetic anisotropy energy and heat energy; the relationship between the magnetic anisotropy constant and saturation magnetization is then considered. Finally, we formulate a temperature-dependent model for saturation magnetization, revealing the inherent relationship between temperature and saturation magnetization. The model predicts the saturation magnetization of nine different magnetic metallic materials at different temperatures, exhibiting satisfactory agreement with experimental data. Additionally, the experimental data used as reference points are at or near room temperature, making the model considerably more accessible than other phenomenological theoretical models, which require data at 0 K. The index included in the model is set to a constant value equal to 10/3 for materials other than Fe, Co, and Ni. For the transition metals (Fe, Co, and Ni in this paper), the index is 6 in the range from 0 K to 0.65T_(cr) (where T_(cr) is the critical temperature) and 3 in the range from 0.65T_(cr) to T_(cr), unlike other models whose adjustable parameters vary with each material. In addition, the model provides a new way to design and evaluate magnetic metallic materials with superior magnetic properties over a wide range of temperatures.
In this paper, we study systems of conservation laws in one space dimension. We prove that for classical solutions in Sobolev spaces H^s, with s > 3/2, the data-to-solution map is not uniformly continuous. Our results apply to all nonlinear scalar conservation laws and to nonlinear hyperbolic systems of two equations.
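For concreteness, the setting and the claimed failure of uniform continuity can be stated in the scalar case as follows; the precise formulation in the paper may differ in detail.

```latex
% Scalar conservation law in one space dimension:
u_t + f(u)_x = 0,\qquad u(0,\cdot)=u_0\in H^s(\mathbb{R}),\quad s>\tfrac{3}{2}.
% Non-uniform continuity of the data-to-solution map u_0 \mapsto u(t):
% there exist bounded sequences of data (u_0^n), (v_0^n) with
\|u_0^n-v_0^n\|_{H^s}\to 0
\quad\text{but}\quad
\liminf_{n\to\infty}\;\sup_{t\in[0,T]}\|u^n(t)-v^n(t)\|_{H^s}>0.
```

That is, the solution map is continuous but not uniformly so on bounded sets of H^s data.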
In recent years, skeleton-based action recognition has made great achievements in computer vision. A graph convolutional network (GCN) is effective for action recognition, modelling the human skeleton as a spatio-temporal graph. Most GCNs define the graph topology by the physical relations of the human joints. However, this predefined graph ignores the spatial relationships between non-adjacent joint pairs in special actions and the behavior dependence between joint pairs, resulting in a low recognition rate for specific actions with implicit correlations between joint pairs. In addition, existing methods ignore the trend correlation between adjacent frames within an action and context clues, leading to erroneous recognition of actions with similar poses. Therefore, this study proposes a learnable GCN based on behavior dependence, which accounts for implicit joint correlations by constructing a dynamic learnable graph that extracts the specific behavior dependence of joint pairs. Using the weight relationships between joint pairs, an adaptive model is constructed. The study also designs a self-attention module to obtain the inter-frame topological relationships of joint pairs for exploring the context of actions. Combining the shared topology and the multi-head self-attention map, the module obtains a context-based clue topology to update the dynamic graph convolution, achieving accurate recognition of different actions with similar poses. Detailed experiments on public datasets demonstrate that the proposed method achieves better results and higher-quality representations of actions under various evaluation protocols than state-of-the-art methods.
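The learnable-graph idea can be sketched as one graph-convolution layer in which a fixed physical adjacency is summed with a learnable adjacency that may link non-adjacent joints. The shapes, normalization, and ReLU are conventional choices, not the paper's exact architecture.

```python
import numpy as np

def gcn_layer(X, A_phys, A_learn, W):
    """One GCN layer over skeleton joints with a learnable adjacency term."""
    A = A_phys + A_learn                          # physical + learned joint links
    A = A + np.eye(len(A))                        # add self-loops
    D = np.diag(1.0 / np.sqrt(A.sum(axis=1)))     # symmetric degree normalization
    return np.maximum(D @ A @ D @ X @ W, 0.0)     # ReLU(normalized propagation)

J, C_in, C_out = 5, 3, 8                          # joints, input/output channels
rng = np.random.default_rng(5)
X = rng.normal(size=(J, C_in))                    # per-joint input features
A_phys = np.zeros((J, J))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:     # a simple chain skeleton
    A_phys[i, j] = A_phys[j, i] = 1
A_learn = np.zeros((J, J))
A_learn[0, 4] = A_learn[4, 0] = 0.5               # learned long-range joint link
out = gcn_layer(X, A_phys, A_learn, rng.normal(size=(C_in, C_out)))
print(out.shape)  # → (5, 8)
```

In training, A_learn would be a parameter updated by backpropagation, which is what lets the model discover implicit joint correlations (e.g. hand and head) that the fixed skeleton graph omits.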
Because of the structural dependencies among concurrent events in a knowledge graph and the substantial sequential correlation information carried by temporally adjacent events, we propose an Independent Recurrent Temporal Graph Convolution Networks (IndRT-GCNets) framework to efficiently and accurately capture event attribute information. The framework models knowledge graph sequences to learn the evolutionary representations of entities and relations within each period. First, the temporal graph convolution module in the evolutionary representation unit captures the structural dependency relationships within the knowledge graph in each period. Meanwhile, to achieve better event representations and establish effective correlations, an independent recurrent neural network is employed for auto-regressive modeling. Furthermore, the static attributes of entities in entity-relation events are constrained and merged using a static graph constraint to obtain optimal entity representations. Finally, the evolution of the entity and relation representations is used to predict events in the subsequent step. On multiple real-world datasets, including Freebase13 (FB13), Freebase15k (FB15K), WordNet11 (WN11), WordNet18 (WN18), FB15K-237, WN18RR, YAGO3-10, and Nell-995, multiple evaluation metrics show that the proposed IndRT-GCNets framework outperforms most existing models on knowledge reasoning tasks, validating its effectiveness and robustness.
This study presents the design of a modified attribute control chart based on a double sampling (DS) np chart applied in combination with generalized multiple dependent state (GMDS) sampling to monitor the mean life of a product, based on a time-truncated life test employing the Weibull distribution. The control chart supports examination of mean lifespan variation for a particular product during manufacturing. Three control limit levels are used: the warning control limit, the inner control limit, and the outer control limit; together, they enhance the capability for variation detection. A genetic algorithm can be used for optimization during the in-control process, whereby the optimal parameters can be established for the proposed control chart. Control chart performance is assessed using the average run length, while the influence of the model parameters on the control chart solution is assessed via sensitivity analysis based on an orthogonal experimental design with multiple linear regression. A comparative study based on the out-of-control average run length showed that the developed control chart offers greater sensitivity in detecting process shifts while using smaller samples on average than existing control charts. Finally, to exhibit the utility of the developed control chart, this paper presents an application using simulated data with parameters drawn from a real data set.
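The average run length (ARL) criterion used above can be illustrated by Monte Carlo on a much simpler chart: a single-limit np chart that signals when the nonconforming count in a sample exceeds the upper control limit. All parameters are illustrative; the paper's DS-np chart with GMDS sampling and Weibull truncated life tests is considerably more elaborate.

```python
import numpy as np

def average_run_length(n, p, ucl, sims=2000, max_run=2000, seed=6):
    """Monte Carlo ARL of a single-limit np chart: periods until the first signal."""
    rng = np.random.default_rng(seed)
    counts = rng.binomial(n, p, size=(sims, max_run))   # defect counts per period
    signal = counts > ucl
    first = np.where(signal.any(axis=1), signal.argmax(axis=1) + 1, max_run)
    return first.mean()

arl_in = average_run_length(n=50, p=0.02, ucl=5)    # in-control: signals are rare
arl_out = average_run_length(n=50, p=0.06, ucl=5)   # shifted process: signals are quick
print(arl_in > arl_out)  # → True
```

A well-designed chart maximizes the in-control ARL (few false alarms) while minimizing the out-of-control ARL (fast shift detection), which is exactly the trade-off the genetic-algorithm optimization targets.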
Thucydides asserts that the occupation of Decelea by the Spartans in 413 BC made the grain supply for Athens costly by forcing the transport from land onto the sea.This calls into question the well-established consens...Thucydides asserts that the occupation of Decelea by the Spartans in 413 BC made the grain supply for Athens costly by forcing the transport from land onto the sea.This calls into question the well-established consensus that sea transport was far cheaper than land transport.This paper contends that the cost of protecting supply lines-specifically the expenses associated with the warships which escorted the supply ships-rendered the grain transported on the new route exceptionally costly.In this paper,the benefits and drawbacks of a maritime economy,including transaction costs,trade dependencies,and the capabilities of warships and supply ships are discussed.展开更多
Funding: Supported by the National Natural Science Foundation of China (62073140, 62073141) and the Shanghai Rising-Star Program (21QA1401800).
Abstract: Fault diagnosis is important for maintaining the safety and effectiveness of chemical processes. Considering the multivariate, nonlinear, and dynamic characteristics of chemical processes, many time-series-based data-driven fault diagnosis methods have been developed in recent years. However, the existing methods suffer from the long-term dependency problem and are difficult to train because of their sequential training scheme. To overcome these problems, a novel fault diagnosis method based on time series and a hierarchical multihead self-attention network (HMSAN) is proposed for chemical processes. First, a sliding-window strategy is adopted to construct the normalized time-series dataset. Second, the HMSAN is developed to extract time-relevant features from the time-series process data. It improves on the basic self-attention model in both width and depth. With the multihead structure, the HMSAN can attend to different aspects of the complicated chemical process and obtain global dynamic features. However, the multiple parallel heads produce redundant information, which does not improve diagnosis performance. With the hierarchical structure, the redundant information is reduced and deep local time-related features are further extracted. Besides, a novel many-to-one training strategy is introduced for the HMSAN to simplify the training procedure and capture the long-term dependency. Finally, the effectiveness of the proposed method is demonstrated on two chemical cases. The experimental results show that the proposed method achieves strong performance on time-series industrial data and outperforms state-of-the-art approaches.
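The sliding-window construction can be sketched in a few lines. The window length, stride, and per-variable z-score normalization below are illustrative assumptions; the abstract does not specify the exact parameters used.

```python
import numpy as np

def make_windows(data, window, stride=1):
    """Slice a multivariate series (T x D) into overlapping windows,
    z-score normalized per variable (a common preprocessing choice;
    the paper's exact normalization is not specified)."""
    mu, sigma = data.mean(axis=0), data.std(axis=0) + 1e-8
    normed = (data - mu) / sigma
    T = len(normed)
    idx = range(0, T - window + 1, stride)
    return np.stack([normed[i:i + window] for i in idx])

# 500 time steps, 8 process variables, windows of 32 steps
series = np.random.default_rng(0).normal(size=(500, 8))
windows = make_windows(series, window=32, stride=4)
print(windows.shape)  # (118, 32, 8)
```

Each window then serves as one training sample, and in a many-to-one scheme only the label of the final step would be predicted for the whole window.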
Funding: Supported by the Graduate Funded Project (No. JY2022A017).
Abstract: The frequent missing values in radar-derived time-series tracks of aerial targets (RTT-AT) pose significant challenges for subsequent data-driven tasks. However, most imputation research focuses on random missing (RM) values, which differ significantly from the common missing patterns of RTT-AT, so methods designed for RM may degrade or fail when applied to RTT-AT imputation. Conventional autoregressive deep learning methods are prone to error accumulation and loss of long-term dependencies. In this paper, a non-autoregressive imputation model is proposed that addresses missing-value imputation for the two common missing patterns in RTT-AT. The model consists of two probabilistic sparse diagonal masking self-attention (PSDMSA) units and a weight fusion unit. It learns missing values by combining the representations output by the two units, aiming to minimize the difference between the imputed values and their actual values. The PSDMSA units effectively capture temporal dependencies and attribute correlations between time steps, improving imputation quality. The weight fusion unit automatically updates the weights of the output representations from the two units to obtain a more accurate final representation. The experimental results indicate that, across varying missing rates in the two missing patterns, the model consistently outperforms other methods in imputation performance and exhibits a low frequency of deviations in estimates for specific missing entries. Compared with the state-of-the-art autoregressive deep learning imputation model Bidirectional Recurrent Imputation for Time Series (BRITS), the proposed model reduces mean absolute error (MAE) by 31% to 50%. Additionally, it trains 4 to 8 times faster than both BRITS and a standard Transformer model on the same dataset. Finally, ablation experiments demonstrate that the PSDMSA units, the weight fusion unit, the cascade network design, and the imputation loss each enhance imputation performance, confirming the efficacy of the design.
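The diagonal-masking idea at the core of the PSDMSA unit can be illustrated with a single-head numpy sketch: masking the diagonal of the attention-score matrix forces every time step to be reconstructed from the other steps, which is what makes self-attention usable for imputation. The probabilistic sparsification part of PSDMSA is omitted, and the untrained identity projections are simplifying assumptions.

```python
import numpy as np

def diag_masked_attention(x):
    """Self-attention in which each time step is barred from attending
    to itself (diagonal masking), so a missing value must be
    reconstructed from the other steps. A simplified single-head
    sketch; PSDMSA additionally applies probabilistic sparsification."""
    d = x.shape[-1]
    q, k, v = x, x, x                    # untrained projections, for illustration
    scores = q @ k.T / np.sqrt(d)
    np.fill_diagonal(scores, -np.inf)    # mask the diagonal
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v, w

x = np.random.default_rng(1).normal(size=(6, 4))  # 6 steps, 4 attributes
out, attn = diag_masked_attention(x)
print(np.allclose(np.diag(attn), 0.0))  # True: no step attends to itself
```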
Abstract: Objective: To determine the effectiveness of brisk walking as an intervention for self-care agency and care dependency in patients with a permanent colorectal cancer stoma. Method: This study adopted a quasi-experimental design, specifically a non-equivalent control group pre-test and post-test design. Using the Exercise of Self-Care Agency Scale (ESCA) and the Care Dependency Scale (CDS), a survey was administered to 64 patients from a hospital in Shandong Province. Data were analyzed using frequencies, means, standard deviations (SD), independent t-tests, P-value calculation, and dependent t-tests. Result: After two months of the brisk walking exercise program, participants in the experimental group had a higher level of self-care agency than before the experiment (P<0.05), and their level of care dependency was significantly reduced (P<0.05). Participants in the control group also showed higher self-care agency (P<0.05) and lower care dependency (P<0.05) after two months compared with their baseline levels. Conclusion: The brisk walking program had a positive impact on patients' self-care agency and reduced their care dependency.
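The between-group comparison rests on an independent-samples t-test. A minimal sketch, using hypothetical scores (not the study's data) and a normal approximation for the p-value, which is adequate only at moderate-to-large degrees of freedom:

```python
import math

def independent_t_test(a, b):
    """Two-sample (independent) t statistic with pooled variance and a
    normal-approximation p-value; exact p-values use the t
    distribution. The data passed below are illustrative."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))  # normal approx
    return t, p

post_exp = [118, 122, 120, 125, 119, 123]   # hypothetical post-test ESCA scores
post_ctl = [110, 112, 109, 114, 111, 113]
t, p = independent_t_test(post_exp, post_ctl)
print(t > 0 and p < 0.05)
```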
Funding: This study was reviewed and approved by the Ethics Committee of The First Psychiatric Hospital of Harbin.
Abstract: BACKGROUND: The literature has discussed the relationship between environmental factors and depressive disorders; however, the results are inconsistent across studies and regions, as are the interaction effects between environmental factors. We hypothesized that meteorological factors and ambient air pollution affect depressive disorder morbidity both individually and through interactions. AIM: To investigate the effects of meteorological factors and air pollution on depressive disorders, including their lagged effects and interactions. METHODS: The samples were obtained from a class 3 hospital in Harbin, China. Daily hospital admission data for depressive disorders from January 1, 2015 to December 31, 2022 were obtained, along with meteorological and air pollution data for the same period. Generalized additive models with quasi-Poisson regression were used for time-series modeling to measure the nonlinear and delayed effects of environmental factors. Each pair of environmental factors was then incorporated into a bivariate response surface model to examine interaction effects on hospital admissions for depressive disorders. RESULTS: Data for 2,922 days were included, with no missing values, covering 83,905 depressive admissions in total. Medium to high correlations existed between environmental factors. Air temperature (AT) and wind speed (WS) significantly affected the number of admissions for depression. An extremely low temperature (-29.0°C) at lag 0 caused a 53% [relative risk (RR) = 1.53, 95% confidence interval (CI): 1.23-1.89] increase in daily hospital admissions relative to the median temperature. An extremely low WS (0.4 m/s) at lag 7 increased admissions by 58% (RR = 1.58, 95% CI: 1.07-2.31). In contrast, atmospheric pressure and relative humidity had smaller effects. Among the six air pollutants considered in the time-series model, nitrogen dioxide (NO₂) was the only pollutant showing significant effects under non-cumulative, cumulative, immediate, and lagged conditions. The cumulative effect of NO₂ at lag 7 was 0.47% (RR = 1.0047, 95% CI: 1.0024-1.0071). Interaction effects were found between AT and the five air pollutants, between atmospheric pressure and the four air pollutants, and between WS and sulfur dioxide. CONCLUSION: Meteorological factors and the air pollutant NO₂ affect daily hospital admissions for depressive disorders, and interactions exist between meteorological factors and ambient air pollution.
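Effect sizes of this kind are reported as relative risks with 95% confidence intervals, obtained by exponentiating a log-scale regression coefficient and its confidence bounds. A sketch with hypothetical coefficient values, chosen only so the output lands near the scale of the RRs reported above:

```python
import math

def rr_with_ci(beta, se, z=1.96):
    """Relative risk and 95% CI from a log-scale GLM coefficient.
    beta and se here are hypothetical, not the study's estimates."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

rr, lo, hi = rr_with_ci(beta=0.425, se=0.11)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```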
Funding: Supported by the Major Research Program on Philosophy and Social Sciences of the Jiangsu Education Department (The Education of Marxism International View in Colleges and Universities for a New Era, No. 2022SJZDSZ001) and the Green Research Program of Nanjing University of Aeronautics and Astronautics (China-US Science and Technology Competition from the Perspective of Marxism, No. 1023-YAH21032).
Abstract: Marxist political economy provides a perspective for grasping the root cause of the China-US trade war. The international relations of production, which stem from the international division of labor, shape the distribution of international economic interests and the political status of countries. Traditionally, developing countries have been relegated to the "periphery" of the international division of labor. In the new global value chain, developing countries have remained in a subordinate position characterized by "technological-market" dependence. To achieve the goal of building a strong modern nation, China must escape this "technological-market" dependence. Yet China's efforts and achievements in escaping dependent development are deemed a threat to US vested interests in international markets. To preserve the economic foundation of its hegemony, the US has resorted to a trade war to contain China's development.
Funding: Supported by the National Natural Science Foundation of China (61503014, 62073009).
Abstract: Degradation and overstress failures occur in many electronic systems operating under complex loads and environmental conditions. Their dependency, known as the dependent competing failure process (DCFP), has been widely studied. An electronic system may experience mutual effects of degradation and shocks, so the two are considered interdependent: both the degradation process and the shock process lower the system's failure threshold and produce a cumulative effect, and ultimately the competition between hard and soft failure causes system failure. Based on failure mechanism accumulation theory, this paper constructs a shock-degradation acceleration model and a threshold descent model, and establishes a system reliability model from the two. The mutual DCFP effect of electronic system interaction is decomposed into physical correlations of failure, namely acceleration, accumulation, and competition. As a case study, the reliability of an electronic system in an aeronautical application is analyzed with the proposed method. Because the method is based on physics-of-failure evaluation, it can provide an important reference for quantitative evaluation and design improvement of newly designed systems when data are scarce.
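The competing-failure mechanics described above can be sketched as a Monte Carlo simulation: gradual degradation plus random shocks, where a large shock causes hard (overstress) failure and every survived shock both adds damage and lowers the soft-failure threshold. All rates and thresholds here are illustrative assumptions, not the paper's model parameters.

```python
import numpy as np

def dcfp_failure_time(rng, horizon=100):
    """Monte Carlo sketch of a dependent competing failure process:
    degradation and shocks compete, and shocks also accelerate the
    soft failure by lowering the threshold (threshold descent)."""
    wear, threshold = 0.0, 10.0
    for t in range(1, horizon + 1):
        wear += 0.05 + rng.normal(0, 0.01)     # gradual degradation
        if rng.random() < 0.03:                # a random shock arrives
            shock = rng.uniform(0.5, 2.0)
            if shock > 1.8:                    # hard failure: overstress
                return t
            wear += shock                      # cumulative damage
            threshold -= 0.2                   # threshold descent
        if wear >= threshold:                  # soft failure
            return t
    return horizon

rng = np.random.default_rng(2)
times = [dcfp_failure_time(rng) for _ in range(2000)]
print(0 < sum(times) / len(times) <= 100)
```

Repeating the simulation yields an empirical failure-time distribution, from which a reliability curve can be estimated.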
Funding: Supported by the Energy Cloud R&D Program (grant number: 2019M3F2A1073184) through the National Research Foundation of Korea (NRF), funded by the Ministry of Science and ICT.
Abstract: Multivariate time-series forecasting (MTSF) plays an important role in diverse real-world applications. To achieve better accuracy in MTSF, time-series patterns within each variable and interrelationship patterns between variables should be considered together. Recently, graph neural networks (GNNs) have gained much attention because they can learn both kinds of patterns through a graph. Accurate forecasting with a GNN requires a well-defined graph; however, existing GNNs have limitations in reflecting spectral similarity and time delay between nodes, and they weight all nodes equally when constructing the graph. In this paper, we propose a novel graph construction method that resolves these limitations. We first calculate a Fourier-transform-based spectral similarity and then update this similarity to reflect the time delay. We then weight each node according to its number of edge connections to obtain the final graph, which is used to train the GNN model. Through experiments on various datasets, we demonstrate that the proposed method enhances the performance of GNN-based MTSF models, and the proposed forecasting model achieves up to an 18.1% improvement in predictive performance over the state-of-the-art model.
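The two ingredients of the graph construction, spectral similarity and time delay, can both be computed with the FFT. A sketch under stated assumptions (cosine similarity of amplitude spectra, delay from the peak of the circular cross-correlation); the paper's exact similarity-update rule is not reproduced.

```python
import numpy as np

def spectral_similarity_and_lag(a, b):
    """Cosine similarity of Fourier amplitude spectra, plus the time
    delay that maximizes the circular cross-correlation (computed via
    FFT). A sketch of the graph-construction idea described above."""
    fa, fb = np.abs(np.fft.rfft(a)), np.abs(np.fft.rfft(b))
    sim = fa @ fb / (np.linalg.norm(fa) * np.linalg.norm(fb) + 1e-12)
    # circular cross-correlation via the frequency domain
    xcorr = np.fft.irfft(np.conj(np.fft.rfft(a)) * np.fft.rfft(b), n=len(a))
    lag = int(np.argmax(xcorr))
    return sim, lag

rng = np.random.default_rng(4)
a = rng.normal(size=256)
b = np.roll(a, 5)                 # b is a delayed copy of a (delay = 5)
sim, lag = spectral_similarity_and_lag(a, b)
print(round(sim, 6), lag)         # a pure delay leaves the spectrum unchanged
```

Pairwise values of this kind would populate the adjacency matrix, after which nodes can be re-weighted by their edge counts as the abstract describes.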
Funding: Supported by the Universiti Kebangsaan Malaysia, Geran Galakan Penyelidikan, GGP-2020-040.
Abstract: This study investigates technological gadget usage among pre-university students, including the time spent on gadgets as well as their purpose and influence. A descriptive research design was adopted. 131 pre-university students were randomly selected to answer a structured questionnaire; they were informed two weeks in advance to keep track of the time they spent on technological devices before answering. Findings showed that 99.2% of the respondents owned at least two technological gadgets, and all respondents owned a smartphone. The two gadgets on which respondents spent at least 4 h a day were smartphones (65.6%) and computers/laptops (21.4%), indicating that smartphones are the most commonly used and owned devices among the respondents. The majority of the respondents were moderately nomophobic and moderately dependent on smartphones (70.2% and 66.4%, respectively). Correlation analysis demonstrated that total time spent on gadgets per day has a significant positive correlation with gadget dependency and with the total number of gadgets owned. Logistic regression was then used to estimate the probability of nomophobia and dependency from total time spent and the total number of technological gadgets: as total time spent on gadgets increases, respondents are more likely to develop nomophobia and dependency. This indicates that nomophobia and gadget dependency can be used to predict lifestyle profiles. The use of technological gadgets can bring both benefit and harm to its users; in light of this, users must remain rational in order to derive maximum benefit.
Funding: Supported by the Shanghai Rising-Star Program (Grant No. 21QA1403400), the Shanghai Sailing Program (Grant No. 20YF1414800), and the Shanghai Key Laboratory of Power Station Automation Technology (Grant No. 13DZ2273800).
Abstract: With the improvement of equipment reliability, human factors have become the most uncertain part of the system. The Standardized Plant Analysis of Risk-Human Reliability Analysis (SPAR-H) method is a well-established method in the field of human reliability analysis (HRA) for evaluating human reliability and assessing risk in large, complex systems. However, the classical SPAR-H method does not consider the dependencies among performance shaping factors (PSFs), which may cause overestimation or underestimation of the actual risk. To address this issue, this paper proposes a new method for handling the dependencies among PSFs in SPAR-H based on the Pearson correlation coefficient. First, the dependence between every two PSFs is measured by the Pearson correlation coefficient. Second, the weights of the PSFs are obtained by considering the total degree of dependence. Finally, the PSFs' multipliers are modified based on the weights of the corresponding PSFs and then used in calculating the human error probability (HEP). A case study illustrates the procedure and effectiveness of the proposed method.
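One plausible reading of this procedure is sketched below: estimate pairwise PSF dependence from Pearson correlations of historical ratings, derive a weight for each PSF from its total dependence, and apply the weights as exponents on the multipliers before forming the SPAR-H product. The ratings, multipliers, and the specific weight formula are all hypothetical; the paper's exact scheme is not reproduced.

```python
import numpy as np

# Hypothetical history of PSF ratings across past tasks (rows = tasks,
# columns = 4 of the 8 SPAR-H PSFs), used only to estimate dependence.
ratings = np.array([[1, 2, 2, 1],
                    [2, 3, 3, 2],
                    [1, 1, 2, 1],
                    [3, 3, 4, 2],
                    [2, 2, 3, 3]], dtype=float)

multipliers = np.array([2.0, 5.0, 2.0, 1.0])  # illustrative PSF multipliers
nhep = 0.01                                   # nominal HEP (illustrative)

corr = np.abs(np.corrcoef(ratings, rowvar=False))   # pairwise Pearson dependence
dependence = corr.sum(axis=0) - 1.0                 # total dependence per PSF
weights = 1.0 - dependence / dependence.sum()       # down-weight dependent PSFs
weights /= weights.sum() / len(weights)             # keep the mean weight at 1

hep = nhep * np.prod(multipliers ** weights)        # weighted SPAR-H product
print(round(float(hep), 5))
```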
Abstract: In the past two decades, owing to the significant increase in the availability of differential interferometry from synthetic aperture radar and GPS data, spaceborne geodesy has been widely employed to determine the co-seismic displacement fields of earthquakes. On April 18, 2021, a moderate earthquake (Mw 5.8) occurred east of Bandar Ganaveh, southern Iran, followed by intensive seismic activity and aftershocks of various magnitudes. We use two-pass D-InSAR and Small Baseline Inversion techniques via the LiCSBAS suite to study the co-seismic displacement and monitor the four-month post-seismic deformation of the Bandar Ganaveh earthquake, as well as to constrain the fault geometry of the co-seismic faulting mechanism during the seismic sequence. Analyses show that the co- and post-seismic deformation are distributed at relatively shallow depths along NW-SE-striking, NE-dipping complex reverse/thrust fault branches of the Zagros Mountain Front Fault, complying with the main trend of the Zagros structures. Average cumulative displacements of -137.5 and +113.3 mm/yr were obtained in the SW and NE blocks of the Mountain Front Fault, respectively. The maximum uplift is approximately consistent with the overall orogen-normal shortening component of the Arabian-Eurasian convergence in the Zagros region. No surface ruptures were associated with the seismic source; therefore, we propose a shallow blind thrust/reverse fault (depth ~10 km) connected to the deeper basal decollement fault within a complex tectonic zone, emphasizing thin-skinned tectonics.
Funding: This work was supported by a Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea Government (MOTIE) (P0016977, The Establishment Project of Industry-University Fusion District).
Abstract: The penetration rate of electric kickboards has increased, and they have been popularized and promoted primarily because of their clean and efficient features. Electric kickboards are gradually growing in popularity in tourist and education-centric localities, and deploying customer rental services is becoming essential. Due to its free-floating nature, the shared electric kickboard is a common and practical means of transportation. Relocation plans for shared electric kickboards are required to increase the quality of service, so forecasting demand for their use in a specific region is crucial. However, predicting demand accurately with little data is troublesome, because extensive data are necessary to train machine learning algorithms for effective prediction. Data generation is a method for expanding the amount of data available for training. In this work, we propose a model that takes time-series customer demand data for electric kickboards as input, pre-processes it, and generates synthetic data matching the original data distribution using generative adversarial networks (GAN). The electric kickboard mobility demand prediction error was reduced when we combined synthetic data with the original data. We propose Tabular-GAN-Modified-WGAN-GP for generating synthetic data for better prediction results. We modify the Wasserstein GAN with gradient penalty (WGAN-GP) to use the RMSprop optimizer and then employ spectral normalization (SN) to improve training stability and speed convergence. Finally, we apply a regression-based blending ensemble technique to further improve demand prediction. We use various evaluation criteria and visual representations to compare the proposed model's performance, and the synthetic data generated by the suggested GAN model are also evaluated. The TGAN-Modified-WGAN-GP model mitigates the overfitting and mode collapse problems and converges faster than previous GAN models for synthetic data creation. The model's performance is compared with existing ensemble and baseline models. The experimental findings imply that combining synthetic and actual data can significantly reduce prediction error, achieving a mean absolute percentage error (MAPE) of 4.476 and increasing prediction accuracy.
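The gradient-penalty term being modified here can be shown in closed form for a linear critic, where the input gradient is constant and no autodiff is needed. A minimal sketch under that assumption; in the paper's model the critic is a neural network and the gradient comes from automatic differentiation.

```python
import numpy as np

def gradient_penalty(w, real, fake, rng, lam=10.0):
    """WGAN-GP penalty for a linear critic f(x) = w @ x. For a linear
    critic, the gradient with respect to the input is w at every
    point, so the penalty has a closed form. Illustration only."""
    eps = rng.uniform(size=(len(real), 1))
    interp = eps * real + (1 - eps) * fake    # points where grads are evaluated
    grads = np.broadcast_to(w, interp.shape)  # d f / d x = w for linear f
    norms = np.linalg.norm(grads, axis=1)
    return lam * np.mean((norms - 1.0) ** 2)  # push gradient norms toward 1

rng = np.random.default_rng(3)
real = rng.normal(0.0, 1.0, size=(64, 8))
fake = rng.normal(0.5, 1.2, size=(64, 8))
w = rng.normal(size=8)
print(gradient_penalty(w, real, fake, rng) >= 0.0)
```

A critic whose gradient norm is already 1 everywhere (here, a unit-norm w) incurs zero penalty, which is the 1-Lipschitz condition the term enforces.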
Funding: This work was supported by the Project of the Hunan Social Science Achievement Evaluation Committee (XSP20YBZ090, Sheng Xiao, 2020).
Abstract: Read-write dependency is an important factor restricting software efficiency. Timing Speculative (TS) is a processor architecture aimed at improving the energy efficiency of microprocessors. The timing error rate, influenced by read-write dependencies, bottlenecks voltage down-scaling and thus the energy efficiency of TS processors. We propose a method called Read-Write Dependency Aware Register Allocation, based on the concept of the Read-Write Dependency aware Interference Graph (RWDIG). Registers are reallocated to loosen read-write dependencies, thereby reducing timing errors. The traditional no-operation (Nop) padding method is also redesigned to increase the dependency distance to above 2. We analyze register dependencies and maximize the average distance of read-write dependencies. Experimental results show that all short-distance read-write dependencies can be removed by Nop padding, along with the associated timing errors, achieving an energy saving of approximately 7%.
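The Nop-padding idea, keeping every read at a distance greater than 2 from the write it depends on, can be sketched over a toy instruction list. The (destination, sources) encoding is hypothetical, not the paper's representation.

```python
MIN_DISTANCE = 3  # required gap so every read-after-write distance is > 2

def pad_nops(program):
    """Insert Nops so that no register is read within MIN_DISTANCE
    instructions of the write that produced it. Instructions are
    (dest, [source registers]) pairs, a hypothetical encoding."""
    last_write = {}   # register -> index of its most recent write
    out = []
    for dst, srcs in program:
        for r in srcs:                        # check read-after-write gaps
            if r in last_write:
                gap = len(out) - last_write[r]
                while gap < MIN_DISTANCE:
                    out.append(("nop", []))   # pad until the gap is safe
                    gap += 1
        out.append((dst, srcs))
        if dst != "nop":
            last_write[dst] = len(out) - 1
    return out

prog = [("r1", []), ("r2", ["r1"]), ("r3", ["r2", "r1"])]
padded = pad_nops(prog)
print(sum(1 for d, _ in padded if d == "nop"))  # 4 Nops inserted
```

In the paper's scheme, register reallocation first loosens as many dependencies as possible, so that Nop padding is the fallback rather than the primary tool.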
Funding: Supported by the Key R&D Programs of Zhejiang Province under Grant No. 2022C01018 and the Natural Science Foundation of Zhejiang Province under Grant No. LQ20F020019.
Abstract: Smart contracts running on public blockchains are permissionless and decentralized, attracting both developers and malicious participants. Ethereum, the world's largest decentralized application platform, on which more than 40 million smart contracts are running, is frequently challenged by smart contract vulnerabilities. Worse still, because a wide range of smart contracts are homogeneous and inter-contract dependencies are increasing, a vulnerability in one smart contract can affect a large number of other contracts in Ethereum. However, little is known about how vulnerable contracts affect other on-chain contracts and which contracts can be affected. We therefore first present the contract dependency graph (CDG) to perform vulnerability analysis for Ethereum smart contracts, where the CDG characterizes inter-contract dependencies formed by DELEGATECALL-type internal transactions in Ethereum. Then, three generic definitions of security violations against the CDG are given to find the potential victim contracts affected by different types of vulnerable contracts. Further, we construct the CDG from 195,247 smart contracts active in the latest blocks of Ethereum and verify the above security violations by detecting three representative known vulnerabilities. Compared with previous large-scale vulnerability analyses, our scheme marks potential victim contracts that can be affected by different types of vulnerable contracts and identifies their possible risks based on the type of security violation that actually occurs. The analysis results show that the proportion of potential victim contracts reaches 14.7%, far more than that of the corresponding vulnerable contracts (less than 0.02%) in the CDG.
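Finding potential victims in a CDG reduces to a reverse reachability query: every contract that transitively delegates into a vulnerable contract is a candidate victim. A sketch with hypothetical addresses; the paper's three security-violation definitions are more specific than plain reachability.

```python
from collections import deque

# Hypothetical CDG: an edge u -> v means u performs a DELEGATECALL
# into v, so a vulnerability in v can affect u.
cdg = {
    "0xA": ["0xLib"],    # 0xA delegatecalls library 0xLib
    "0xB": ["0xLib"],
    "0xC": ["0xB"],      # 0xC depends on 0xB, hence transitively on 0xLib
    "0xLib": [],
}

def potential_victims(cdg, vulnerable):
    """All contracts that (transitively) delegate into `vulnerable`:
    a BFS over the reversed dependency edges."""
    reverse = {u: [] for u in cdg}
    for u, deps in cdg.items():
        for v in deps:
            reverse[v].append(u)
    victims, queue = set(), deque([vulnerable])
    while queue:
        node = queue.popleft()
        for caller in reverse[node]:
            if caller not in victims:
                victims.add(caller)
                queue.append(caller)
    return victims

print(sorted(potential_victims(cdg, "0xLib")))  # ['0xA', '0xB', '0xC']
```

The large victim-to-vulnerable ratio reported above (14.7% vs. under 0.02%) is exactly this transitive amplification at Ethereum scale.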
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61973118, 51741902, 11761033, 12075088, and 11835003), a project of the Jiangxi Province Department of Science and Technology (Grant Nos. 20212BBE51010 and 20182BCB22009), and the Natural Science Foundation of Zhejiang Province (Grant No. Y22F035316).
Abstract: We propose a model of edge-coupled interdependent networks with directed dependency links (EINDDLs) and develop a theoretical analysis framework for this model based on the self-consistent probabilities method. The phase transition behaviors and parameter thresholds of this model under random attacks are analyzed theoretically on both random regular (RR) networks and Erdős-Rényi (ER) networks, and computer simulations are performed to verify the results. In this EINDDL model, a fraction β of connectivity links within network B depends on network A, and a fraction (1-β) of connectivity links within network A depends on network B. It is found that, upon randomly removing a fraction (1-p) of connectivity links in network A at the initial state, network A exhibits different types of phase transitions (first order, second order, and hybrid). Network B is rarely affected by cascading failure when β is small, and network B gradually shifts from a first-order to a second-order phase transition as β increases. We present the critical values of β for the phase-change processes of networks A and B, and give the critical values of p and β for network B at the critical point of collapse. Furthermore, a cascading prevention strategy is proposed. The findings are of great significance for understanding the robustness of EINDDLs.
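The self-consistent probabilities method solves fixed-point equations for the surviving fraction of each layer. As a single-layer illustration: the giant-component fraction S of an ER network with mean degree k, when a fraction p of nodes survives a random attack, satisfies S = p(1 − e^(−kS)). The paper couples equations of this kind across the two interdependent layers; the sketch below only shows the fixed-point machinery.

```python
import math

def giant_component(p, k, iters=500):
    """Fixed-point iteration of S = p * (1 - exp(-k*S)), the
    self-consistency equation for the giant component of an
    Erdős-Rényi network with mean degree k after a random attack
    leaves a fraction p of nodes. Single-layer illustration only."""
    s = 1.0
    for _ in range(iters):
        s = p * (1.0 - math.exp(-k * s))
    return s

# Above the percolation threshold (k*p > 1) a giant component survives;
# below it (k*p < 1) the iteration collapses to S = 0.
print(round(giant_component(1.0, 4.0), 3), giant_component(0.2, 4.0) < 1e-6)
```

Whether the coupled system's solution vanishes continuously or jumps to zero as p decreases is exactly the second-order vs. first-order distinction analyzed in the paper.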
Funding: Project supported by the Natural Science Foundation of Chongqing (Grant No. CSTB2022NSCQ-MSX0391).
Abstract: Based on the force-heat equivalence energy density principle, a theoretical model for magnetic metallic materials is developed that characterizes the temperature-dependent magnetic anisotropy energy by considering the equivalence between magnetic anisotropy energy and heat energy; the relationship between the magnetic anisotropy constant and saturation magnetization is then considered. Finally, we formulate a temperature-dependent model for saturation magnetization, revealing the inherent relationship between temperature and saturation magnetization. Our model predicts the saturation magnetization of nine different magnetic metallic materials at different temperatures, exhibiting satisfactory agreement with experimental data. Moreover, the experimental data used as reference points are at or near room temperature, making the model considerably more accessible than phenomenological models that require data at 0 K. The index included in our model is set to a constant value, equal to 10/3 for materials other than Fe, Co, and Ni. For the transition metals (Fe, Co, and Ni in this paper), the index is 6 in the range of 0 K to 0.65T_cr (where T_cr is the critical temperature) and 3 in the range of 0.65T_cr to T_cr, unlike other models whose adjustable parameters vary from material to material. In addition, our model provides a new way to design and evaluate magnetic metallic materials with superior magnetic properties over a wide range of temperatures.
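The piecewise index rule stated above translates directly into code. The Curie temperatures in the example calls are approximate literature values used only for illustration; the functional form of the full magnetization model is not reproduced here.

```python
# Index-selection rule from the abstract: 10/3 for most magnetic
# metals; for the transition metals Fe, Co, and Ni the index is 6
# below 0.65*T_cr and 3 between 0.65*T_cr and T_cr.
TRANSITION_METALS = {"Fe", "Co", "Ni"}

def model_index(material, t, t_cr):
    """Return the model's temperature-dependent index for a material
    at temperature t (same units as the critical temperature t_cr)."""
    if material not in TRANSITION_METALS:
        return 10.0 / 3.0
    return 6.0 if t < 0.65 * t_cr else 3.0

# Approximate Curie temperatures (K): Fe ~1043, Ni ~631; Gd ~293.
print(model_index("Gd", 300, 293),
      model_index("Fe", 300, 1043),
      model_index("Ni", 600, 631))
```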
Abstract: In this paper, we study systems of conservation laws in one space dimension. We prove that for classical solutions in Sobolev spaces H^s, with s > 3/2, the data-to-solution map is not uniformly continuous. Our results apply to all nonlinear scalar conservation laws and to nonlinear hyperbolic systems of two equations.
Funding: Supported in part by the 2023 Key Supported Project of the 14th Five-Year Plan for Education and Science in Hunan Province, No. ND230795.
Abstract: In recent years, skeleton-based action recognition has made great achievements in computer vision. A graph convolutional network (GCN), which models the human skeleton as a spatio-temporal graph, is effective for action recognition. Most GCNs define the graph topology by the physical relations of the human joints. However, this predefined graph ignores the spatial relationships between non-adjacent joint pairs in special actions and the behavior dependence between joint pairs, resulting in a low recognition rate for specific actions with implicit correlations between joint pairs. In addition, existing methods ignore the trend correlation between adjacent frames within an action as well as context clues, leading to erroneous recognition of actions with similar poses. Therefore, this study proposes a learnable GCN based on behavior dependence that accounts for implicit joint correlations by constructing a dynamic learnable graph that extracts the specific behavior dependence of joint pairs, and an adaptive model is built from the weight relationships between joint pairs. The study also designs a self-attention module to obtain the inter-frame topological relationships of joint pairs for exploring the context of actions. Combining the shared topology and the multi-head self-attention map, the module obtains a context-based clue topology to update the dynamic graph convolution, achieving accurate recognition of different actions with similar poses. Detailed experiments on public datasets demonstrate that the proposed method achieves better results and higher-quality action representations under various evaluation protocols compared with state-of-the-art methods.
Funding: Supported by the National Natural Science Foundation of China (62062062), hosted by Gulila Altenbek.
Abstract: Due to the structural dependencies among concurrent events in the knowledge graph and the substantial amount of sequential correlation information carried by temporally adjacent events, we propose an Independent Recurrent Temporal Graph Convolution Networks (IndRT-GCNets) framework to efficiently and accurately capture event attribute information. The framework models knowledge graph sequences to learn the evolutionary representations of entities and relations within each period. First, by utilizing the temporal graph convolution module in the evolutionary representation unit, the framework captures the structural dependency relationships within the knowledge graph in each period. Meanwhile, to achieve better event representation and establish effective correlations, an independent recurrent neural network is employed to implement auto-regressive modeling. Furthermore, static attributes of entities in entity-relation events are constrained and merged using a static graph constraint to obtain optimal entity representations. Finally, the evolution of entity and relation representations is utilized to predict events in the next step. On multiple real-world datasets, including Freebase13 (FB13), Freebase15k (FB15K), WordNet11 (WN11), WordNet18 (WN18), FB15K-237, WN18RR, YAGO3-10, and Nell-995, results across multiple evaluation indicators show that the proposed IndRT-GCNets framework outperforms most existing models on knowledge reasoning tasks, validating its effectiveness and robustness.
Funding: Supported by the Science, Research and Innovation Promotion Funding (TSRI) (Grant No. FRB660012/0168), managed under Rajamangala University of Technology Thanyaburi (FRB66E0646O.4).
Abstract: This study presents the design of a modified attributed control chart based on a double sampling (DS) np chart applied in combination with generalized multiple dependent state (GMDS) sampling to monitor the mean life of a product, based on a time-truncated life test employing the Weibull distribution. The developed control chart supports the examination of mean lifespan variation for a particular product during manufacturing. Three control limit levels are used: the warning control limit, the inner control limit, and the outer control limit; together, they enhance the capability for variation detection. A genetic algorithm can be used for optimization during the in-control process, whereby the optimal parameters can be established for the proposed control chart. The control chart's performance is assessed using the average run length, while the influence of the model parameters on the control chart solution is assessed via sensitivity analysis based on an orthogonal experimental design with multiple linear regression. A comparative study based on the out-of-control average run length showed that the developed control chart offers greater sensitivity in detecting process shifts while using smaller samples on average than existing control charts. Finally, to exhibit the utility of the developed control chart, this paper presents its application using simulated data with parameters drawn from a real data set.
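The average run length (ARL) metric can be illustrated on a basic np chart, where ARL = 1/P(signal) and the signal probability comes from the binomial distribution. The DS/GMDS chart in the study has a more involved run-length expression; the limits and process parameters below are illustrative assumptions.

```python
from math import comb

def np_chart_arl(n, p, lcl, ucl):
    """Average run length of a basic np chart: 1 / P(signal), where a
    signal is a nonconforming count outside [lcl, ucl]. Single-stage
    sketch only; the DS/GMDS chart uses a more involved recursion."""
    pmf = [comb(n, d) * p**d * (1 - p)**(n - d) for d in range(n + 1)]
    p_in = sum(pmf[d] for d in range(lcl, ucl + 1))
    return 1.0 / (1.0 - p_in)

arl_in = np_chart_arl(n=50, p=0.02, lcl=0, ucl=5)    # in-control process
arl_out = np_chart_arl(n=50, p=0.08, lcl=0, ucl=5)   # shifted process
print(arl_in > arl_out)  # shifts should be signaled much sooner
```

A good design makes the in-control ARL large (few false alarms) and the out-of-control ARL small (fast shift detection), which is the trade-off the genetic algorithm optimizes.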
Abstract: Thucydides asserts that the occupation of Decelea by the Spartans in 413 BC made the grain supply for Athens costly by forcing transport from land onto the sea. This calls into question the well-established consensus that sea transport was far cheaper than land transport. This paper contends that the cost of protecting supply lines, specifically the expenses associated with the warships that escorted the supply ships, rendered the grain transported on the new route exceptionally costly. The paper also discusses the benefits and drawbacks of a maritime economy, including transaction costs, trade dependencies, and the capabilities of warships and supply ships.