Air traffic complexity is a critical indicator of air traffic operations and plays an important role in air traffic management (ATM) tasks such as airspace reconfiguration, air traffic flow management, and the allocation of air traffic controllers (ATCos). Recently, many machine learning techniques have been used to evaluate air traffic complexity by constructing a mapping from complexity-related factors to air traffic complexity labels. However, the low quality of complexity labels, known as label noise, has often been neglected and causes unsatisfactory performance in air traffic complexity evaluation. This paper addresses label noise in air traffic complexity samples and proposes a confident learning and XGBoost-based approach to evaluate air traffic complexity under label noise. The confident learning process filters out noisy samples with various label probability distributions, and XGBoost is used to train a robust, high-performance air traffic complexity evaluation model on datasets with different ratios of label noise removed. Experiments on a real dataset from the Guangzhou airspace sector in China show that an appropriate label noise removal strategy combined with XGBoost effectively mitigates the label noise problem and achieves better performance in air traffic complexity evaluation.
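The confident-learning filtering step described above can be sketched as follows — a minimal illustration with toy data. The real approach uses out-of-sample predicted probabilities (e.g. from cross-validation) and a library such as cleanlab; the function names and data here are ours, not the paper's.

```python
def class_thresholds(probs, labels, n_classes):
    """t_j = mean predicted probability of class j over samples labeled j."""
    sums, counts = [0.0] * n_classes, [0] * n_classes
    for p, y in zip(probs, labels):
        sums[y] += p[y]
        counts[y] += 1
    return [s / c if c else 1.0 for s, c in zip(sums, counts)]

def flag_noisy(probs, labels, n_classes):
    """Flag a sample as label noise when some other class clears its
    confident threshold while the given label does not."""
    t = class_thresholds(probs, labels, n_classes)
    noisy = []
    for i, (p, y) in enumerate(zip(probs, labels)):
        confident = [j for j in range(n_classes) if p[j] >= t[j]]
        if confident and y not in confident:
            noisy.append(i)
    return noisy

# Toy case with 3 complexity levels; sample 2 carries a noisy label.
probs = [
    [0.90, 0.05, 0.05],
    [0.10, 0.80, 0.10],
    [0.95, 0.03, 0.02],   # labeled class 2, but the model is sure it is 0
    [0.05, 0.15, 0.80],
]
labels = [0, 1, 2, 2]
print(flag_noisy(probs, labels, 3))  # [2]
```

The samples surviving the filter would then be passed to an ordinary XGBoost classifier for training.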
We analyze the correlation properties of the Erdős–Rényi random graph (RG) and the Barabási–Albert scale-free network (SF) under an attack-and-repair strategy with detrended fluctuation analysis (DFA). The maximum degree k_max, representing the local property of the system, shows similar scaling behaviors for random graphs and scale-free networks: the fluctuations are quite random at short time scales but display strong anticorrelation at longer time scales under the same system size N and different repair probabilities p_re. The average degree ⟨k⟩, revealing the statistical property of the system, exhibits completely different scaling behaviors for random graphs and scale-free networks: random graphs display long-range power-law correlations, whereas scale-free networks are uncorrelated at short time scales but anticorrelated at longer time scales, with the anticorrelation becoming stronger as p_re increases.
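The DFA procedure referred to above can be sketched as follows, applied here to synthetic white noise (for which the expected exponent is α ≈ 0.5) rather than to network degree series; the helper names are ours.

```python
import math
import random

def dfa_alpha(series, scales):
    """DFA-1: compute the fluctuation function F(n) over window sizes in
    `scales` and return the scaling exponent alpha from a log-log fit."""
    mean = sum(series) / len(series)
    profile, s = [], 0.0
    for x in series:                      # integrated (cumulative) profile
        s += x - mean
        profile.append(s)
    fluct = []
    for n in scales:
        n_win = len(profile) // n
        sq_sum = 0.0
        for w in range(n_win):            # linear detrend in each window
            seg = profile[w * n:(w + 1) * n]
            tm, ym = (n - 1) / 2, sum(seg) / n
            num = sum((t - tm) * (y - ym) for t, y in enumerate(seg))
            den = sum((t - tm) ** 2 for t in range(n))
            b = num / den
            a = ym - b * tm
            sq_sum += sum((y - (a + b * t)) ** 2 for t, y in enumerate(seg))
        fluct.append(math.sqrt(sq_sum / (n_win * n)))
    lx = [math.log(n) for n in scales]
    ly = [math.log(f) for f in fluct]
    xm, ym = sum(lx) / len(lx), sum(ly) / len(ly)
    return (sum((a - xm) * (b - ym) for a, b in zip(lx, ly))
            / sum((a - xm) ** 2 for a in lx))

random.seed(0)
white = [random.gauss(0, 1) for _ in range(4096)]
alpha = dfa_alpha(white, [8, 16, 32, 64, 128])
print(round(alpha, 2))  # ~0.5 for uncorrelated white noise
```

An exponent α < 0.5 on a degree time series would indicate the anticorrelation the abstract describes.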
With the development of automation in smart grids, network reconfiguration is becoming a feasible approach for improving the operation of distribution systems. A novel reconfiguration strategy is presented to obtain the optimal configuration, improving the economy of the system and identifying its important nodes. In this strategy, the objectives are to increase the node importance degree and to decrease the active power loss, subject to operational constraints. A compound objective function with weight coefficients is formulated to balance the conflicting objectives. A novel quantum particle swarm optimization based on hierarchical encoding of loop switches is then employed to solve the compound objective reconfiguration problem. Its main contribution is the hierarchical encoding scheme, which generates swarm particles representing only radially connected solutions. Because every candidate solution is feasible, search efficiency improves dramatically during optimization, without tedious topology verification. To validate the proposed strategy, simulations are carried out on test systems, and the results are compared with other techniques to evaluate the performance of the proposed method.
With the development of reverse engineering and static analysis techniques, software protection technology has received universal emphasis, so it is important to study how to quantitatively evaluate the security of protected software. However, some researchers evaluate the security of proposed protection techniques directly with traditional complexity metrics, which is not sufficient. To better reflect security through software complexity, a multi-factor complexity metric based on the control flow graph (CFG) is proposed, and the corresponding calculation procedures are presented in detail. Moreover, complexity density models are constructed to indicate the strength of software in resisting reverse engineering and code analysis. Instance analysis shows that the proposed method is simple and practical, and can more objectively reflect software security from the perspective of complexity.
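As a minimal illustration of a CFG-based complexity metric in the spirit of the above — the weighted combination, its factor names, and the weights below are ours for illustration, not the paper's actual metric:

```python
def cyclomatic_complexity(n_nodes, n_edges, n_components=1):
    """McCabe's V(G) = E - N + 2P for a control flow graph."""
    return n_edges - n_nodes + 2 * n_components

def multi_factor_complexity(n_nodes, n_edges, coupling, loc,
                            weights=(0.5, 0.3, 0.2)):
    """Hypothetical weighted combination of V(G), coupling (fan-in +
    fan-out), and size in lines of code; weights are illustrative only."""
    vg = cyclomatic_complexity(n_nodes, n_edges)
    return weights[0] * vg + weights[1] * coupling + weights[2] * loc

print(cyclomatic_complexity(7, 8))  # 3 for a CFG with 7 nodes and 8 edges
print(multi_factor_complexity(7, 8, coupling=4, loc=30))  # 8.7
```

A "complexity density" in the paper's sense would then normalize such a score, e.g. per unit of code size.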
When faults occur at weak nodes in a concentrated solar power-photovoltaic (CSP-PV) hybrid power generation system, the transient voltage cannot be restored by relying only on the reactive power regulation capability of the system itself. We propose a dynamic reactive power planning method suitable for CSP-PV hybrid power generation systems. The method determines the installation nodes of dynamic reactive power compensation devices and their compensation capacities based on the system's own reactive power adjustment capability. Critical fault nodes are identified by a transient voltage stability recovery index, yielding an initial set of weak nodes; on this basis, a sensitivity index determines the installation nodes of the compensation devices. A dynamic reactive power planning optimization model is established that minimizes the investment cost of the compensation devices while improving the transient voltage stability of the system. Furthermore, the compensation capacity at each node is optimized by particle swarm optimization based on differential evolution (DE-PSO). Simulation results on an example system show that, compared with a method that optimizes only the installation locations of the compensation devices, the proposed method improves the transient voltage stability of the system at the same reactive power compensation cost.
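A toy sketch of the DE-PSO idea — standard PSO velocity/position updates plus a DE/rand/1 trial move accepted greedily — applied to a simple test function rather than the reactive power planning model; all parameters and names are illustrative:

```python
import random

def de_pso(obj, dim, n_particles=20, iters=200, seed=1):
    """Minimize obj over R^dim with a PSO/DE hybrid (illustrative only)."""
    rnd = random.Random(seed)
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [obj(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2, F = 0.7, 1.5, 1.5, 0.5
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):          # PSO move
                vel[i][d] = (w * vel[i][d]
                             + c1 * rnd.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rnd.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            a, b, c = rnd.sample(range(n_particles), 3)   # DE/rand/1 move
            trial = [pos[a][d] + F * (pos[b][d] - pos[c][d]) for d in range(dim)]
            if obj(trial) < obj(pos[i]):  # greedy acceptance
                pos[i] = trial
            v = obj(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

sphere = lambda x: sum(xi * xi for xi in x)   # toy objective, optimum 0
best, val = de_pso(sphere, dim=3)
print(round(val, 4))
```

In the planning problem, the decision variables would instead be the compensation capacities at the chosen nodes, with the objective combining investment cost and a transient voltage stability index.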
Background To report the quality control methods and baseline reproducibility data for the ultrasound measurements of carotid artery intima-media thickness (IMT) in the project Establishment of an Integrated System for Coronary Heart Disease Prevention and Treatment. Methods Standard ultrasound scanning and measurement protocols were established by the study group. All sonographers and readers were trained by the carotid ultrasound core lab, and all digital ultrasound images were read centrally. Ten subjects were scanned twice (at a 1-week interval) by 2 sonographers independently, and the images were read by a single reader to evaluate sonographer variability. Twenty subjects' images were read twice (at a 1-week interval) by a single reader to assess reader variability and the reproducibility of IMT measured at different carotid segments. Results The intraclass correlation (ICC) of intra-sonographer, inter-sonographer and intra-reader mean IMT measurements was 0.99, 0.98 and 0.97 respectively; for max IMT, it was 0.97, 0.99 and 0.95 respectively. Among the different carotid segments and sites, the ICC for mean IMT measurements of the common carotid artery (CCA), carotid artery bulb (Bulb), internal carotid artery (ICA), overall near wall and overall far wall was 0.97, 0.99, 0.89, 0.93 and 0.98 respectively. Conclusion The reproducibility of IMT measurements according to our protocol is acceptable, although reproducibility is better for mean IMT than max IMT, for CCA and Bulb IMT than ICA IMT, and for far wall IMT than near wall IMT.
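A simplified sketch of an intraclass correlation computation for repeated measurements — here a one-way random-effects ICC(1,1); the abstract does not specify which ICC model the study used, and the data below are invented:

```python
def icc_1_1(ratings):
    """One-way random-effects ICC(1,1) for k repeated measurements per
    subject, from the ANOVA between/within mean squares."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    means = [sum(r) / k for r in ratings]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2
              for r, m in zip(ratings, means) for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented test-retest IMT values (mm): two scans per subject
data = [(0.62, 0.63), (0.71, 0.70), (0.55, 0.56), (0.80, 0.79), (0.66, 0.66)]
print(round(icc_1_1(data), 2))  # close to 1 for highly consistent scans
```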
This paper proposes a metric for measuring the complexity of business organizations and business software. The metric is based on subjective estimates of the complexity of elements from a part of the structure of a business organization or business software relative to the other elements of the observed part. Estimates are made on a measurement scale for comparing the complexity of elements, and a final conclusion on each element's complexity relative to the others is reached through the Analytic Hierarchy Process (AHP). Defined in this manner, the metric provides a unified measure of the complexity of elements of business organizations and business software, enabling their comparison. The paper also presents a short overview of existing metrics for measuring the complexity of business organizations and business software.
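The AHP step the metric relies on can be sketched as follows: priority weights are derived from a pairwise complexity-comparison matrix via power iteration, with Saaty's standard consistency check. The matrix values below are invented for illustration.

```python
SAATY_RI = {3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random consistency indices

def ahp_weights(M, iters=100):
    """Priority weights = principal eigenvector of the pairwise
    comparison matrix M, via power iteration with L1 normalization."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

def consistency_ratio(M, w):
    """CR = CI / RI with CI = (lambda_max - n) / (n - 1);
    CR < 0.1 is the usual acceptance threshold for the judgments."""
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    return ((lam - n) / (n - 1)) / SAATY_RI[n]

# Element A judged 3x as complex as B and 5x as complex as C.
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(M)
print([round(x, 2) for x in w])        # A receives the largest weight
print(consistency_ratio(M, w) < 0.1)   # True: the judgments are consistent
```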
On the basis of software testing tools we developed for programming languages, we first present a new block-based control flow graph model. Using the notion of block, we extend the traditional program-based software test data adequacy measurement criteria and empirically analyze the subsumption relations among these criteria. We then define four block-based test complexity metrics: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we present a Kiviat diagram that makes software quality visible.
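As a minimal illustration of a block-based adequacy criterion in the spirit of the above — this is plain block coverage with invented block names, not the paper's J-complexity measures:

```python
def block_coverage(all_blocks, executed_blocks):
    """Block-coverage adequacy: the fraction of a program's basic
    blocks exercised by a test suite."""
    covered = set(executed_blocks) & set(all_blocks)
    return len(covered) / len(all_blocks)

# A function with an if/else: the suite never drives the false branch.
print(block_coverage({"entry", "if_true", "if_false", "exit"},
                     {"entry", "if_true", "exit"}))  # 0.75
```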
In March 2017, Narendra Modi led his Bharatiya Janata Party (BJP) to victory in state elections, including the assembly election in Uttar Pradesh (a state in northern India), known as the weather vane of Indian elections. Back in 2014, the BJP had already won a simple majority in the Lok Sabha (the lower house of parliament), bringing an end to more than 30 years of coalition government. Now the BJP exhibits a stronger one-party presence at both the federal and local levels, with no counterbalance from the Indian National Congress, local parties, or left-wing parties now or in the foreseeable future. This increases the likelihood of Modi's re-election as prime minister in 2019. Indian political development is thus characterized by complexity, accidental factors, and an intrinsic logic that will exert great influence on the future of India.
In this paper, we develop a novel alternating linearization method for solving convex minimization problems whose objective function is the sum of two separable functions. The motivation is to extend the recent work of Goldfarb et al. (2013) to more general convex minimization. In the proposed method, both the separable objective functions and the auxiliary penalty terms are linearized. Provided that the separable objective functions belong to C^{1,1}(R^n), we prove the O(1/ε) arithmetical complexity of the new method. Some preliminary numerical simulations involving image processing and compressive sensing are also conducted.
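As the simplest member of the linearization family discussed above, here is a proximal-gradient (ISTA) sketch for min 0.5‖Ax−b‖² + λ‖x‖₁, which linearizes the smooth term at each iterate; the paper's method additionally alternates and linearizes both terms. The data below are a toy compressive-sensing-style example of our own.

```python
def soft_threshold(v, t):
    """Prox of t*||.||_1, applied elementwise."""
    return [x - t if x > t else (x + t if x < -t else 0.0) for x in v]

def ista(A, b, lam, step, iters=1000):
    """Proximal gradient for min 0.5||Ax-b||^2 + lam*||x||_1: linearize
    the smooth term, handle the l1 term exactly by its prox."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        r = [ax - bi for ax, bi in zip(Ax, b)]
        grad = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]  # A^T r
        x = soft_threshold([xj - step * gj for xj, gj in zip(x, grad)],
                           step * lam)
    return x

# The l1 penalty drives x2, x3 to zero and shrinks x1 slightly below 2.
A = [[1.0, 0.0, 0.5],
     [0.0, 1.0, 0.5]]
b = [2.0, 0.0]
x = ista(A, b, lam=0.1, step=0.3)
print([round(v, 1) for v in x])  # [1.9, 0.0, 0.0]
```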
Path length calculation is a frequent requirement in studies related to graph-theoretic problems such as genetics. The standard method of calculating the average path length (APL) of a graph requires traversing all nodes in the graph repeatedly, which is computationally expensive for graphs containing a large number of nodes. We propose a novel method to calculate the APL of graphs commonly required in the study of genetics. The proposed method is computationally less expensive and less time-consuming than the standard method.
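The standard BFS-based APL computation that the proposed method accelerates can be sketched as:

```python
from collections import deque

def average_path_length(adj):
    """Baseline APL: BFS from every node and average all finite
    pairwise distances (O(n*(n+m)) for n nodes and m edges)."""
    n = len(adj)
    total, pairs = 0, 0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for v, d in dist.items():
            if v != s:
                total += d
                pairs += 1
    return total / pairs

# Path graph 0-1-2-3: 12 ordered pairs, total distance 20, APL = 20/12
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(round(average_path_length(adj), 3))  # 1.667
```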
Funding: This work was supported by the National Natural Science Foundation of China (No. 61903187) and the Nanjing University of Aeronautics and Astronautics Graduate Innovation Base (Laboratory) Open Fund (No. kfjj20190732).
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 70271067 and 70401020) and the Science Foundation of the Ministry of Education of China (Grant No. 03113).
Funding: Supported by the National Natural Science Foundation of China (Project 61102039) and the National Hi-tech Research and Development Plan of China (Project 2014AA052600).
Funding: Key Project of the National Eleventh Five-Year Research Program of China (No. 2006BAD10A07).
Funding: Science and Technology Project of State Grid Corporation of China (No. SGGSKY00FJJS1800140).
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11301055 and 11401315), the Natural Science Foundation of Jiangsu Province (Grant No. BK2009397), and the Fundamental Research Funds for the Central Universities (Grant No. ZYGX2013J103).