N6-methyladenosine (m6A) is an important RNA methylation modification involved in regulating diverse biological processes across multiple species. Hence, the identification of m6A modification sites provides valuable insight into the biological mechanisms of complex diseases at the post-transcriptional level. Although a variety of identification algorithms have been proposed recently, most of them capture the features of m6A modification sites by focusing on the sequential dependencies of nucleotides at different positions in RNA sequences, while ignoring the structural dependencies of nucleotides in their three-dimensional structures. To overcome this issue, we propose a cross-species end-to-end deep learning model, namely CR-NSSD, which conducts a cross-domain representation learning process integrating nucleotide structural and sequential dependencies for RNA m6A site identification. Specifically, CR-NSSD first obtains the pre-coded representations of RNA sequences by incorporating position information into single-nucleotide states with chaos game representation theory. It then constructs a cross-domain reconstruction encoder to learn the sequential and structural dependencies between nucleotides. By minimizing the reconstruction and binary cross-entropy losses, CR-NSSD is trained to complete the task of m6A site identification. Extensive experiments have demonstrated the promising performance of CR-NSSD in comparison with several state-of-the-art m6A identification algorithms. Moreover, the results of cross-species prediction indicate that integrating sequential and structural dependencies allows CR-NSSD to capture general features of m6A modification sites across different species, thus improving the accuracy of cross-species identification.
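As a concrete illustration of the chaos game representation (CGR) pre-coding step described above, the Python sketch below maps each nucleotide to a corner of the unit square and moves halfway toward it, so every position acquires coordinates that depend on its whole prefix. The corner assignment and the standalone form are assumptions; the paper's exact pre-coding scheme may differ.

    import numpy as np

    # Assumed corner assignment for the four RNA bases on the unit square.
    CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "U": (1.0, 0.0)}

    def cgr_encode(seq):
        """Return one (x, y) CGR point per nucleotide of an RNA sequence."""
        points = np.empty((len(seq), 2))
        pos = np.array([0.5, 0.5])                 # start at the square's centre
        for i, base in enumerate(seq):
            # Move halfway toward the corner of the current base; the resulting
            # point depends on the full prefix, embedding position information.
            pos = (pos + np.asarray(CORNERS[base])) / 2.0
            points[i] = pos
        return points

    print(cgr_encode("GGACU"))                     # five points, one per nucleotide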
Thucydides asserts that the occupation of Decelea by the Spartans in 413 BC made the grain supply for Athens costly by forcing its transport from land onto the sea. This calls into question the well-established consensus that sea transport was far cheaper than land transport. This paper contends that the cost of protecting supply lines, specifically the expenses associated with the warships that escorted the supply ships, rendered the grain transported on the new route exceptionally costly. The paper discusses the benefits and drawbacks of a maritime economy, including transaction costs, trade dependencies, and the capabilities of warships and supply ships.
This study addresses whether gold exhibits the function of a hedge or safe haven, as often claimed in academia. It contributes to the existing literature by (i) revisiting this question for the principal stock markets in the Middle East and North Africa (MENA) region and (ii) using the copula-quantile-on-quantile and conditional value at risk methods to detail the risks facing market participants under various gold and stock market scenarios (i.e., bear, normal, bull). The results provide strong evidence of quantile dependence between gold and stock returns. Positive correlations are found between MENA gold and stock markets when both are bullish. Conversely, when stock returns are bearish, gold markets show negative correlations with MENA stock markets. The risk spillover from gold to stock markets intensified during the global financial and European crises. Given this risk spillover, investors in MENA markets should be careful when considering gold as a safe haven because its effectiveness as a hedge is not the same in all MENA stock markets. Investors and portfolio managers should rebalance their portfolio compositions under various gold and stock market conditions. Overall, such precise insights into the heterogeneous linkages and spillovers between gold and MENA stock returns provide potential input for developing effective hedging strategies and optimal portfolio allocations.
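The sketch below, on synthetic returns, illustrates the conditional value at risk idea in its simplest empirical form: the stock market's value at risk is re-estimated on the days when gold sits in its own lower tail. The copula-quantile-on-quantile machinery used in the study is considerably richer than this.

    import numpy as np

    rng = np.random.default_rng(0)
    gold = 0.01 * rng.standard_normal(5000)                 # synthetic gold returns
    stock = 0.4 * gold + 0.015 * rng.standard_normal(5000)  # correlated stock returns

    q = 0.05
    var_gold = np.quantile(gold, q)            # 5% value at risk of gold
    distressed = stock[gold <= var_gold]       # stock returns when gold is bearish
    covar = np.quantile(distressed, q)         # CoVaR: stock VaR given gold distress
    print(f"VaR(stock) = {np.quantile(stock, q):.4f}, CoVaR = {covar:.4f}")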
In this paper, the definition of approximate XFDs based on value equality is proposed. Two metrics, support and strength, are presented for measuring the degree of an approximate XFD. A basic algorithm is designed for extracting a minimal set of approximate XFDs, and two optimized strategies are then proposed to improve its performance. Finally, experimental results show that the optimized algorithms are correct and effective.
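One plausible pairwise reading of the two metrics, sketched below for flat tuples: support is the fraction of tuple pairs that agree on the left-hand side, and strength is the fraction of those that also agree on the right-hand side. The paper's definitions operate on XML paths and may differ in detail.

    from itertools import combinations

    def support_and_strength(rows, lhs, rhs):
        pairs = list(combinations(rows, 2))
        agree_lhs = [(r, s) for r, s in pairs if all(r[a] == s[a] for a in lhs)]
        if not agree_lhs:
            return 0.0, 0.0
        agree_both = [(r, s) for r, s in agree_lhs
                      if all(r[b] == s[b] for b in rhs)]
        support = len(agree_lhs) / len(pairs)        # how often the LHS matches at all
        strength = len(agree_both) / len(agree_lhs)  # how often a match implies the RHS
        return support, strength

    rows = [{"city": "Oslo", "zip": "0150"},
            {"city": "Oslo", "zip": "0150"},
            {"city": "Oslo", "zip": "0151"}]
    print(support_and_strength(rows, ["city"], ["zip"]))   # (1.0, 0.333...)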
Today, the quantity of data continues to increase; furthermore, the data are heterogeneous, come from multiple sources (structured, semi-structured, and unstructured), and vary in quality. It is therefore very common to manipulate data without knowledge of their structure and semantics. In fact, the metadata may be insufficient or totally absent, and data anomalies may be due to the poverty, or even the absence, of their semantic descriptions. In this paper, we propose an approach to better understand the semantics and the structure of the data. Our approach helps to automatically correct both intra-column and inter-column anomalies. We aim to improve the quality of data by processing null values and the semantic dependencies between columns.
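A minimal sketch of one repair this kind of approach enables: infer a deterministic dependency between two columns from the non-null rows, then use it to fill null values in the dependent column. The column names and the single-table setting are illustrative assumptions.

    def fill_nulls(rows, src, dst):
        mapping = {}
        for r in rows:                        # learn src -> dst from complete rows
            if r[src] is not None and r[dst] is not None:
                mapping.setdefault(r[src], r[dst])
        for r in rows:                        # repair nulls via the inferred dependency
            if r[dst] is None and r[src] in mapping:
                r[dst] = mapping[r[src]]
        return rows

    rows = [{"zip": "75001", "city": "Paris"},
            {"zip": "75001", "city": None}]
    print(fill_nulls(rows, "zip", "city"))    # the null city becomes "Paris"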
Based on an analysis of existing complicated functional dependency constraints, we derive the conditions for defining functional dependencies in XML and then introduce the concept of node value equality. A new path language and a new definition of functional dependencies in XML (XFDs) are proposed. XFDs include relative XFDs and absolute XFDs, of which absolute keys and relative keys are particular cases. We focus on the logical implication and closure problems and propose a group of inference rules. Finally, proofs of correctness and completeness are given. XFDs are powerful for expressing the functional dependencies in XML that cause data redundancy, and they have a complete axiom system.
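For intuition about the implication and closure problems, the sketch below computes an attribute closure for flat Armstrong-style FDs; the paper's XFDs additionally carry path expressions and node-value equality, which this sketch omits.

    def closure(attrs, fds):
        """fds: iterable of (lhs, rhs) attribute sets; returns the closure of attrs."""
        result = set(attrs)
        changed = True
        while changed:                         # apply FDs until a fixpoint is reached
            changed = False
            for lhs, rhs in fds:
                if lhs <= result and not rhs <= result:
                    result |= rhs
                    changed = True
        return result

    fds = [({"A"}, {"B"}), ({"B"}, {"C"})]
    print(closure({"A"}, fds))                 # {'A', 'B', 'C'} by transitivity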
The theory of rough sets, proposed by Zdzislaw Pawlak in 1982, is a model of approximate reasoning. In applications, rough set methodology focuses on the approximate representation of knowledge derivable from data. It leads to significant results in many areas including, for example, finance, industry, multimedia, medicine, and, most recently, bioinformatics.
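The core construction is compact enough to state in code: given a partition of the universe into indiscernibility classes, the lower approximation collects the classes wholly inside a target set and the upper approximation the classes that merely touch it.

    def approximations(partition, target):
        lower = {x for c in partition if c <= target for x in c}   # certainly in target
        upper = {x for c in partition if c & target for x in c}    # possibly in target
        return lower, upper

    blocks = [{1, 2}, {3, 4}, {5}]          # indiscernibility classes
    X = {1, 2, 3}
    print(approximations(blocks, X))        # ({1, 2}, {1, 2, 3, 4})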
In the field of data-driven bearing fault diagnosis, the convolutional neural network (CNN) has been widely researched and applied due to its superior feature extraction and classification ability. However, the convolutional operation processes only a local neighborhood at a time and thus lacks the ability to capture long-range dependencies. Building an efficient learning method for long-range dependencies is therefore crucial to comprehending and expressing signal features, considering that the vibration signals obtained in a real industrial environment always have strong instability, periodicity, and temporal correlation. This paper introduces the nonlocal mean to the CNN and presents a 1D nonlocal block (1D-NLB) to extract long-range dependencies. The 1D-NLB computes the response at a position as a weighted average of the features at all positions. Based on it, we propose a nonlocal 1D convolutional neural network (NL-1DCNN) aimed at rolling bearing fault diagnosis. Furthermore, the 1D-NLB can simply be plugged into most existing deep learning architectures to improve their fault diagnosis ability. Under multiple noise conditions, the 1D-NLB improves the performance of the CNN on a wheelset bearing data set from high-speed trains and the Case Western Reserve University bearing data set. The experimental results show that the NL-1DCNN exhibits superior results compared with six state-of-the-art fault diagnosis methods.
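The numpy sketch below shows the non-local operation in its embedded-Gaussian form: the response at every position is a softmax-weighted average of the features at all positions, added back residually. The published 1D-NLB also includes learned 1x1 convolutions, which are omitted here.

    import numpy as np

    def nonlocal_1d(x):
        """x: (length, channels) feature map from a 1D CNN stage."""
        affinity = x @ x.T                                 # pairwise similarity, (L, L)
        affinity -= affinity.max(axis=1, keepdims=True)    # numerical stability
        weights = np.exp(affinity)
        weights /= weights.sum(axis=1, keepdims=True)      # softmax over all positions
        y = weights @ x                                    # weighted average of all features
        return x + y                                       # residual connection

    feat = np.random.default_rng(0).standard_normal((128, 16))
    print(nonlocal_1d(feat).shape)                         # (128, 16)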
This study examines the time and regime dependencies of sensitive areas identified by the conditional nonlinear optimal perturbation (CNOP) method for forecasts of two typhoons. Typhoon Meari (2004) was weakly nonlinear and is herein referred to as the linear case, while Typhoon Matsa (2005) was strongly nonlinear and is herein referred to as the nonlinear case. In the linear case, the sensitive areas identified for specific forecast times when the initial time was fixed resembled those identified for other forecast times. Targeted observations deployed to improve a forecast at one time would thus also benefit forecasts at other times. In the nonlinear case, the similarities among the sensitive areas identified for different forecast times were more limited. The deployment of targeted observations in the nonlinear case would therefore need to be adapted to achieve large improvements for different targeted forecasts. For both cases, the closer the forecast times, the higher the similarity of the sensitive areas. When the forecast time was fixed, the sensitive areas in the linear case diverged continuously from the verification area as the forecast period lengthened, while those in the nonlinear case were always located around the initial cyclones. The deployment of targeted observations to improve a specific forecast thus depends strongly on the time of deployment. An examination of the efficiency gained by reducing initial errors within the identified sensitive areas confirmed these results. In general, the greatest improvement in a forecast at a given time was obtained by identifying the sensitive areas for the corresponding forecast time period.
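A toy CNOP computation on a two-variable quadratic model, assuming scipy is available: the initial perturbation that maximizes nonlinear forecast-error growth is sought under a norm constraint on the initial error. The study applies this optimization to a full numerical weather prediction model with a verification-area norm.

    import numpy as np
    from scipy.optimize import minimize

    def M(x):                                  # toy nonlinear forecast model
        return np.array([x[0] + 0.1 * x[0] * x[1], x[1] - 0.05 * x[0] ** 2])

    x0 = np.array([1.0, 0.5])                  # basic initial state
    eps = 0.1                                  # bound on the initial-error norm

    def neg_growth(d):                         # negative nonlinear growth, to minimize
        return -np.linalg.norm(M(x0 + d) - M(x0)) ** 2

    res = minimize(neg_growth, np.array([eps, 0.0]),
                   constraints={"type": "ineq", "fun": lambda d: eps**2 - d @ d})
    print("CNOP:", res.x)                      # conditional nonlinear optimal perturbation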
This paper summarizes the main instrumental and methodological points of the tidal research performed in the framework of the National Scientific Research Fund Project K101603. Since the project is still running, the tidal analysis results published here are only preliminary. Unmodelled tidal effects have been highlighted in some recent absolute gravity measurements carried out in the Pannonian basin, resulting in a periodic modulation exceeding the typical standard deviations (±1 µGal) of the drop sets. Since the most dominant source of the daily gravity variation is the bulk tidal effect, the goal of the project is to check its location dependency at the µGal level. Unfortunately, Hungary has had no dedicated instrumentation, so an effort was made to make the available LaCoste-Romberg spring G meters capable of continuous recording. The GWR SG025 operated at the Conrad Observatory, Austria, was used as a reference instrument, and in the meantime a Scintrex CG-5 also became available. Eventually, six instruments at five different locations were operated for 3-9 months, mainly in a co-located configuration. Although many experiments (moving-mass calibrations) were done to determine the scale factors and scale functions of the instruments, the direct comparison of the tidal parameters obtained from the observations is still questionable. Therefore, the ratio of the delta factors of the O1 and M2 tidal constituents was investigated, supposing that M2 is much more influenced by the ocean loading effect than O1. The slight detected increase of δ(O1)/δ(M2) (≈0.2%) toward the east does not contradict theory. This result has to be validated in the near future by analyzing available ocean loading models.
With the growing popularity of data-intensive services on the Internet, the traditional process-centric model for business processes faces challenges due to its inability to describe data semantics and dependencies, resulting in inflexibility in the design and implementation of processes. This paper proposes a novel data-aware business process model that is able to describe both explicit control flow and implicit data flow. A data model with dependencies formulated in Linear-time Temporal Logic (LTL) is presented, and their satisfiability is validated by an automaton-based model checking algorithm. Data dependencies are fully considered in the modeling phase, which helps to improve the efficiency and reliability of programming during the development phase. Finally, a prototype system for data-aware workflows based on jBPM is designed using this model and has been deployed to the Beijing Kingfore heating management system to validate the flexibility, efficacy, and convenience of our approach for massive coding and large-scale system management in practice.
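Two LTL patterns evaluated over finite execution traces convey the flavor of the data dependencies the model formulates; the sketch below is a hedged stand-in for the paper's automaton-based satisfiability check.

    def always(pred, trace):
        return all(pred(state) for state in trace)      # LTL "globally" on a finite trace

    def eventually(pred, trace):
        return any(pred(state) for state in trace)      # LTL "finally" on a finite trace

    # Illustrative dependency: an order must eventually be priced.
    trace = [{"priced": False}, {"priced": True}, {"priced": True}]
    print(eventually(lambda s: s["priced"], trace))     # True
    print(always(lambda s: s["priced"], trace))         # False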
Temperature and doping dependencies of the transport properties of GaN have been calculated using an ensemble Monte Carlo simulation. We consider polar optical phonon, acoustic phonon, piezoelectric, and intervalley scattering, together with the charged impurity scattering model of Ridley; furthermore, a nonparabolic three-valley model is used. Our simulation results show that the electron velocity in GaN is less sensitive to changes in temperature than that in GaAs. It is also found that GaN exhibits a high peak drift velocity at room temperature, 2.8 × 10^(5) m/s, at a doping concentration of 1 × 10^(20) m^(-3), and the electron drift velocity relaxes to a saturation value of 1.3 × 10^(5) m/s, which is much larger than that of GaAs. The weakening of the phonon emission rate at low temperature explains the extremely high low-field mobility. Our results suggest that the transport characteristics of GaN are superior to those of GaAs over a wide range of temperatures, from 100 K to 700 K, and doping concentrations, up to 1 × 10^(25) m^(-3).
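A heavily simplified, single-valley ensemble Monte Carlo sketch in the Drude spirit: carriers accelerate freely in the field for exponentially distributed flight times and lose their drift at each scattering event. The effective mass and constant scattering rate are illustrative assumptions; the simulation above uses three nonparabolic valleys and several energy-dependent scattering mechanisms.

    import numpy as np

    rng = np.random.default_rng(1)
    q = 1.602e-19                          # electron charge, C
    m = 0.2 * 9.109e-31                    # assumed effective mass, kg
    E = 1e6                                # applied field, V/m
    rate = 1e13                            # assumed constant scattering rate, 1/s
    n_particles, n_flights = 2000, 400

    distance = np.zeros(n_particles)
    time = np.zeros(n_particles)
    for _ in range(n_flights):
        dt = rng.exponential(1.0 / rate, n_particles)   # free-flight durations
        distance += 0.5 * q * E / m * dt**2             # each flight starts with zero drift
        time += dt                                      # drift resets at each scattering
    print(f"mean drift velocity ~ {(distance / time).mean():.3e} m/s")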
Effort estimation plays a crucial role in software development projects, aiding in resource allocation, project planning, and risk management. Traditional estimation techniques often struggle to provide accurate estimates due to the complex nature of software projects. In recent years, machine learning approaches have shown promise in improving the accuracy of effort estimation models. This study proposes a hybrid model that combines Long Short-Term Memory (LSTM) and Random Forest (RF) algorithms to enhance software effort estimation, taking advantage of the strengths of both algorithms. To evaluate its performance, an extensive set of software development projects is used as the experimental dataset. The experimental results demonstrate that the proposed hybrid model outperforms traditional estimation techniques in terms of accuracy and reliability. The integration of LSTM and RF enables the model to efficiently capture the temporal dependencies and non-linear interactions in software development data. The hybrid model enhances estimation accuracy, enabling project managers and stakeholders to make more precise predictions of the effort needed for upcoming software projects.
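A minimal sketch of the hybrid idea, assuming torch and scikit-learn are available: an LSTM summarizes each project's activity sequence into a fixed-length vector, and a random forest regresses effort on that summary. The data, shapes, and two-stage (rather than jointly trained) setup are illustrative assumptions, not the paper's exact architecture.

    import numpy as np
    import torch
    import torch.nn as nn
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    seqs = torch.randn(200, 12, 5)             # 200 projects, 12 time steps, 5 metrics
    effort = rng.uniform(50, 500, size=200)    # synthetic effort in person-hours

    lstm = nn.LSTM(input_size=5, hidden_size=16, batch_first=True)
    with torch.no_grad():                      # untrained here; train end-to-end in practice
        _, (h_n, _) = lstm(seqs)
    features = h_n.squeeze(0).numpy()          # (200, 16) temporal summaries

    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(features, effort)
    print("in-sample R^2:", rf.score(features, effort))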
BACKGROUND: Eosinophilic gastroenteritis (EGE) is a chronic recurrent disease with abnormal eosinophilic infiltration in the gastrointestinal tract. Glucocorticoids remain the most common treatment; however, disease relapse and glucocorticoid dependence remain notable problems. To date, few studies have illuminated the prognosis of EGE and the risk factors for disease relapse. AIM: To describe the clinical characteristics of EGE and possible predictive factors for disease relapse based on long-term follow-up. METHODS: This was a retrospective cohort study of 55 patients diagnosed with EGE admitted to one medical center between 2013 and 2022. Clinical records were collected and analyzed. Kaplan-Meier curves and log-rank tests were conducted to reveal the risk factors for long-term relapse-free survival (RFS). RESULTS: EGE showed a median onset age of 38 years and a slight female predominance (56.4%). The main clinical symptoms were abdominal pain (89.1%), diarrhea (61.8%), nausea (52.7%), distension (49.1%), and vomiting (47.3%). Forty-three (78.2%) patients received glucocorticoid treatment; compared with patients without glucocorticoid treatment, they were more likely to have elevated serum immunoglobulin E (IgE) (86.8% vs 50.0%, P = 0.022) and descending duodenal involvement (62.8% vs 27.3%, P = 0.046) at diagnosis. With a median follow-up of 67 mo, all patients survived, and 56.4% had at least one relapse. Six baseline variables might be associated with the overall RFS rate: age at diagnosis < 40 years [hazard ratio (HR) 2.0408, 95% confidence interval (CI): 1.0082-4.1312, P = 0.044], body mass index (BMI) > 24 kg/m^(2) (HR 0.3922, 95% CI: 0.1916-0.8027, P = 0.014), disease duration from symptom onset to diagnosis > 3.5 mo (HR 2.4725, 95% CI: 1.220-5.0110, P = 0.011), vomiting (HR 3.1259, 95% CI: 1.5246-6.4093, P = 0.001), total serum IgE > 300 KU/L at diagnosis (HR 0.2773, 95% CI: 0.1204-0.6384, P = 0.022), and glucocorticoid treatment (HR 6.1434, 95% CI: 2.8446-13.2676, P = 0.003). CONCLUSION: In patients with EGE, younger onset age, longer disease course, vomiting, and glucocorticoid treatment were risk factors for disease relapse, whereas higher BMI and total IgE level at baseline were protective.
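For readers reproducing this kind of survival analysis, the sketch below shows the Kaplan-Meier and log-rank steps, assuming the lifelines package is available; the durations and relapse flags are synthetic stand-ins, not the cohort data.

    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(0)
    months = rng.exponential(60, 55)           # follow-up durations (synthetic)
    relapsed = rng.random(55) < 0.56           # ~56.4% relapse, as reported above
    steroid = rng.random(55) < 0.78            # ~78.2% glucocorticoid treatment

    kmf = KaplanMeierFitter()
    kmf.fit(months, event_observed=relapsed)   # overall relapse-free survival curve
    print("median RFS:", kmf.median_survival_time_)

    res = logrank_test(months[steroid], months[~steroid],
                       event_observed_A=relapsed[steroid],
                       event_observed_B=relapsed[~steroid])
    print("log-rank p =", res.p_value)         # steroid vs non-steroid groups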
In recent years, skeleton-based action recognition has made great achievements in computer vision. A graph convolutional network (GCN), which models the human skeleton as a spatio-temporal graph, is effective for action recognition. Most GCNs define the graph topology by the physical relations of the human joints. However, this predefined graph ignores the spatial relationships between non-adjacent joint pairs in special actions and the behavior dependence between joint pairs, resulting in a low recognition rate for specific actions with implicit correlations between joint pairs. In addition, existing methods ignore the trend correlation between adjacent frames within an action as well as context clues, leading to erroneous recognition of actions with similar poses. Therefore, this study proposes a learnable GCN based on behavior dependence, which accounts for implicit joint correlations by constructing a dynamic learnable graph that extracts the specific behavior dependence of joint pairs. An adaptive model is constructed using the weight relationships between the joint pairs. The study also designs a self-attention module to obtain the inter-frame topological relationships of joint pairs for exploring the context of actions. Combining the shared topology and the multi-head self-attention map, the module obtains a context-based clue topology to update the dynamic graph convolution, achieving accurate recognition of different actions with similar poses. Detailed experiments on public datasets demonstrate that the proposed method achieves better results and a higher-quality representation of actions under various evaluation protocols compared to state-of-the-art methods.
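The numpy sketch below shows one adaptive graph-convolution step in the spirit described above: a fixed skeleton adjacency plus a learnable offset lets non-adjacent joint pairs interact. The full model adds behavior-dependence extraction and the multi-head self-attention module, which are omitted here.

    import numpy as np

    def adaptive_gcn_layer(X, A_phys, A_learn, W):
        """X: (joints, in_ch); A_*: (joints, joints); W: (in_ch, out_ch)."""
        A = A_phys + A_learn                          # learned edges refine the skeleton graph
        D = np.diag(1.0 / np.maximum(A.sum(axis=1), 1e-6))
        return np.maximum(D @ A @ X @ W, 0.0)         # normalized propagation + ReLU

    rng = np.random.default_rng(0)
    J, C_in, C_out = 25, 3, 16                        # joints, input/output channels
    X = rng.standard_normal((J, C_in))                # 3D joint coordinates, one frame
    A_phys = np.eye(J)                                # placeholder skeleton adjacency
    A_learn = 0.01 * rng.standard_normal((J, J))      # updated by backprop in practice
    W = rng.standard_normal((C_in, C_out))
    print(adaptive_gcn_layer(X, A_phys, A_learn, W).shape)   # (25, 16)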
With the improvement of equipment reliability, human factors have become the most uncertain part of a system. The Standardized Plant Analysis of Risk-Human Reliability Analysis (SPAR-H) method is a reliable method in the field of human reliability analysis (HRA) for evaluating human reliability and assessing risk in large complex systems. However, the classical SPAR-H method does not consider the dependencies among performance shaping factors (PSFs), which may cause overestimation or underestimation of the risk of the actual situation. To address this issue, this paper proposes a new method, based on the Pearson correlation coefficient, to deal with the dependencies among PSFs in SPAR-H. First, the dependence between every two PSFs is measured by the Pearson correlation coefficient. Second, the weights of the PSFs are obtained by considering their total dependence degrees. Finally, the PSFs' multipliers are modified based on the weights of the corresponding PSFs and then used in the calculation of the human error probability (HEP). A case study is used to illustrate the procedure and effectiveness of the proposed method.
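The three steps translate directly into code. In the hedged sketch below, PSF ratings are assumed to be observed as numeric series; the rule turning dependence degrees into weights and the way weights temper the nominal multipliers are illustrative stand-ins for the paper's exact formulas.

    import numpy as np

    rng = np.random.default_rng(0)
    psf_data = rng.standard_normal((100, 3))      # 100 task records, 3 PSFs (synthetic)
    psf_data[:, 1] += 0.8 * psf_data[:, 0]        # make PSF 1 depend on PSF 0
    multipliers = np.array([2.0, 5.0, 1.0])       # nominal multipliers (illustrative)

    R = np.corrcoef(psf_data, rowvar=False)       # step 1: Pearson correlations
    dependence = np.abs(R).sum(axis=1) - 1.0      # total dependence of each PSF
    weights = 1.0 / (1.0 + dependence)            # step 2: assumed weighting rule
    adjusted = multipliers ** weights             # step 3: assumed multiplier modification

    nhep = 0.01                                   # nominal HEP for the task type
    hep = nhep * adjusted.prod()                  # SPAR-H style composite
    print(weights.round(3), f"HEP = {hep:.5f}")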
Matching dependencies (MDs) are used to declaratively specify the identification (or matching) of certain attribute values in pairs of database tuples when some similarity conditions on other values are satisfied. Their enforcement can be seen as a natural generalization of entity resolution. In what we call the pure case of MD enforcement, an arbitrary value from the underlying data domain can be used for the value in common that results from a matching; however, the overall number of changes to attribute values is expected to be kept to a minimum. We investigate this case in terms of the semantics and properties of data cleaning through the enforcement of MDs. We characterize the intended clean instances, and also the clean answers to queries, as those that are invariant under the cleaning process. The complexity of computing clean instances and clean query answering is investigated. Tractable and intractable cases depending on the MDs are identified and characterized.
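A toy enforcement step for one MD, (address similar) implies (phone matched): tuples with similar addresses have their phone values identified, chasing until a fixpoint. In the pure case any domain value may serve as the common value; the lexicographically smaller one is kept here purely for determinism.

    import difflib

    def similar(a, b, threshold=0.8):
        return difflib.SequenceMatcher(None, a, b).ratio() >= threshold

    def enforce_md(tuples, sim_attr, match_attr):
        changed = True
        while changed:                             # chase until a fixpoint
            changed = False
            for i in range(len(tuples)):
                for j in range(i + 1, len(tuples)):
                    t, s = tuples[i], tuples[j]
                    if similar(t[sim_attr], s[sim_attr]) and t[match_attr] != s[match_attr]:
                        t[match_attr] = s[match_attr] = min(t[match_attr], s[match_attr])
                        changed = True
        return tuples

    data = [{"addr": "12 Main Street", "phone": "555-0101"},
            {"addr": "12 Main St.", "phone": "555-0199"}]
    print(enforce_md(data, "addr", "phone"))       # both phones become "555-0101"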
Conditional functional dependencies (CFDs) are important techniques for data consistency. However, CFDs are limited in their ability to 1) provide reasonable values for consistency repairing and 2) detect potential errors. This paper presents context-aware conditional functional dependencies (CCFDs), which help provide reasonable values and detect potential errors. In particular, we focus on automatically discovering minimal CCFDs. We present context relativity to measure the relationship between CFDs. The overlap of related CFDs can provide reasonable values, which results in more accurate consistency repairing, and some related CFDs are combined into CCFDs. Moreover, we prove that discovering minimal CCFDs is NP-complete, and we design both a precise method and a heuristic method. We also present the dominating value to facilitate the process in both methods. Additionally, since the context relativity of the CFDs affects the cleaning results, we give an approximate threshold of context relativity, suggested according to the data distribution. The repairing results are shown to be more accurate, as evidenced by our empirical evaluation.
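For contrast with plain CFDs, the sketch below detects violations of one CFD, ([country = "UK"], zip determines city): within the UK context, equal zip codes must determine equal cities. Context relativity and CCFD discovery build on top of this basic check and are not modeled here.

    def cfd_violations(rows, cond, lhs, rhs):
        scoped = [r for r in rows if all(r[a] == v for a, v in cond.items())]
        seen, bad = {}, []
        for r in scoped:
            key = tuple(r[a] for a in lhs)
            val = tuple(r[b] for b in rhs)
            if key in seen and seen[key] != val:
                bad.append(r)                     # same LHS, conflicting RHS
            seen.setdefault(key, val)
        return bad

    rows = [{"country": "UK", "zip": "EH1", "city": "Edinburgh"},
            {"country": "UK", "zip": "EH1", "city": "Glasgow"}]
    print(cfd_violations(rows, {"country": "UK"}, ["zip"], ["city"]))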
Artificially controlling solid-state precipitation in aluminum (Al) alloys is an efficient way to achieve well-performing properties, and microalloying is the most frequently adopted strategy for this purpose. This paper reviews recent advances in length-scale-dependent scandium (Sc) microalloying effects in Al-Cu model alloys. In coarse-grained Al-Cu alloys, Sc-aided Cu/Sc/vacancy complexes that act as heterogeneous nuclei, together with Sc segregation at the θ′-Al_(2)Cu/matrix interface that reduces the interfacial energy, contribute significantly to θ′ precipitation. With grain refinement to the fine/ultrafine-grained scale, the strongly bonded Cu/Sc/vacancy complexes inhibit Cu and vacancy diffusion toward grain boundaries, promoting the desired intragranular θ′ precipitation. At the nanocrystalline scale, the high applied strain, which produces high-density vacancies, results in the formation of a large quantity of (Cu, Sc, vacancy)-rich atomic complexes with high thermal stability, markedly improving the strength/ductility synergy and preventing intractable low-temperature precipitation. This review recommends the use of microalloying technology to modify precipitation behavior toward a better combination of mechanical properties and thermal stability in Al alloys.
Based on the force-heat equivalence energy density principle, a theoretical model for magnetic metallic materials is developed that characterizes the temperature-dependent magnetic anisotropy energy by considering the equivalence between magnetic anisotropy energy and heat energy; the relationship between the magnetic anisotropy constant and the saturation magnetization is then considered. Finally, we formulate a temperature-dependent model for the saturation magnetization, revealing the inherent relationship between temperature and saturation magnetization. Our model predicts the saturation magnetization of nine different magnetic metallic materials at different temperatures, exhibiting satisfactory agreement with experimental data. Additionally, the experimental data used as reference points are at or near room temperature and are considerably more accessible than the 0 K data required by other phenomenological theoretical models. The index included in our model is set to a constant value, equal to 10/3 for materials other than Fe, Co, and Ni. For the transition metals (Fe, Co, and Ni in this paper), the index is 6 in the range of 0 K to 0.65 T_(cr) (where T_(cr) is the critical temperature) and 3 in the range of 0.65 T_(cr) to T_(cr), unlike other models whose adjustable parameters vary with each material. In addition, our model provides a new way to design and evaluate magnetic metallic materials with superior magnetic properties over a wide range of temperatures.