This work introduces a modification to the Heisenberg Uncertainty Principle (HUP) by incorporating quantum complexity, including potential nonlinear effects. Our theoretical framework extends the traditional HUP to consider the complexity of quantum states, offering a more nuanced understanding of measurement precision. By adding a complexity term to the uncertainty relation, we explore nonlinear modifications such as polynomial, exponential, and logarithmic functions. Rigorous mathematical derivations demonstrate the consistency of the modified principle with classical quantum mechanics and quantum information theory. We investigate the implications of this modified HUP for various aspects of quantum mechanics, including quantum metrology, quantum algorithms, quantum error correction, and quantum chaos. Additionally, we propose experimental protocols to test the validity of the modified HUP, evaluating their feasibility with current and near-term quantum technologies. This work highlights the importance of quantum complexity in quantum mechanics and provides a refined perspective on the interplay between complexity, entanglement, and uncertainty in quantum systems. The modified HUP has the potential to stimulate interdisciplinary research at the intersection of quantum physics, information theory, and complexity theory, with significant implications for the development of quantum technologies and the understanding of the quantum-to-classical transition.
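The abstract does not state the explicit form of the complexity-modified relation; a minimal illustrative form consistent with the description (a complexity term added to the uncertainty relation, with polynomial, exponential, or logarithmic dependence) would be

$\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl(1 + \alpha\, f(\mathcal{C})\Bigr), \qquad f(\mathcal{C}) \in \{\mathcal{C}^{\,n},\ e^{\beta\mathcal{C}},\ \ln(1+\mathcal{C})\},$

where $\mathcal{C}$ is a measure of the complexity of the quantum state and $\alpha$, $\beta$, $n$ are phenomenological parameters. These symbols are placeholders rather than the paper's notation, and the standard HUP is recovered when the complexity term vanishes.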
Elementary information theory is used to model cybersecurity complexity, where the model assumes that security risk management is a binomial stochastic process. Complexity is shown to increase exponentially with the number of vulnerabilities in combination with security risk management entropy. However, vulnerabilities can be either local or non-local, where the former is confined to networked elements and the latter results from interactions between elements. Furthermore, interactions involve multiple methods of communication, where each method can contain vulnerabilities specific to that method. Importantly, the number of possible interactions scales quadratically with the number of elements in standard network topologies. Minimizing these interactions can significantly reduce the number of vulnerabilities and the accompanying complexity. Two network configurations that yield sub-quadratic and linear scaling relations are presented.
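The quadratic interaction scaling, and the gain from topologies that limit interactions, can be illustrated with a short sketch (the topology names and functions below are illustrative, not taken from the paper):

# Count pairwise interaction channels for two network topologies.
def full_mesh_interactions(n):
    # every element can interact with every other one: n(n-1)/2, i.e. O(n^2)
    return n * (n - 1) // 2

def hub_and_spoke_interactions(n):
    # all element-to-element traffic is mediated by a single hub: n - 1 links, i.e. O(n)
    return n - 1

for n in (10, 100, 1000):
    print(n, full_mesh_interactions(n), hub_and_spoke_interactions(n))

For 1000 elements this gives 499,500 possible pairwise interactions in a full mesh versus 999 in a hub-and-spoke arrangement, which is the kind of reduction the sub-quadratic and linear configurations aim for.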
Continuous-flow microchannels are widely employed for synthesizing various materials, including nanoparticles, polymers, and metal-organic frameworks (MOFs), to name a few. Microsystem technology allows precise control over reaction parameters, resulting in purer, more uniform, and structurally stable products due to more effective mass transfer manipulation. However, continuous-flow synthesis processes may be accompanied by the emergence of spatial convective structures initiating convective flows. On the one hand, convection can accelerate reactions by intensifying mass transfer. On the other hand, it may lead to non-uniformity in the final product or defects, especially in MOF microcrystal synthesis. The ability to distinguish regions of convective and diffusive mass transfer may be the key to performing higher-quality reactions and obtaining purer products. In this study, we investigate, for the first time, the possibility of using an information complexity measure as a criterion for assessing the intensity of mass transfer in microchannels, considering both spatial and temporal non-uniformities of the liquid distributions resulting from convection formation. We calculate the complexity using a shearlet transform based on a local approach. In contrast to existing methods for calculating complexity, the shearlet-transform-based approach provides a more detailed representation of local heterogeneities. Our analysis involves experimental images illustrating the mixing process of two non-reactive liquids in a Y-type continuous-flow microchannel under conditions of double-diffusive convection formation. The obtained complexity fields characterize the mixing process and structure formation, revealing variations in mass transfer intensity along the microchannel. We compare the results with cases of liquid mixing via a purely diffusive mechanism. The analysis reveals that the complexity measure is sensitive to variations in the type of mass transfer, establishing its feasibility as an indirect criterion for assessing mass transfer intensity. The method presented can extend beyond flow analysis, finding application in the control of microstructures of various materials (porosity, for instance) or surface defects in metals, optical systems, and other materials that hold significant relevance in materials science and engineering.
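The shearlet-based local complexity measure itself is not specified in the abstract; as a simplified stand-in (not the authors' method), a local complexity field over a mixing image can be sketched as a windowed Shannon entropy of intensities, where the window size and binning below are arbitrary choices:

import numpy as np

def local_entropy_field(img, win=16, bins=32):
    # img: 2-D array of intensities in [0, 1]; returns one entropy value per
    # non-overlapping win x win window (higher entropy ~ stronger local heterogeneity).
    h, w = img.shape
    out = np.zeros((h // win, w // win))
    for i in range(h // win):
        for j in range(w // win):
            patch = img[i * win:(i + 1) * win, j * win:(j + 1) * win]
            p, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
            p = p[p > 0] / p.sum()
            out[i, j] = -np.sum(p * np.log2(p))
    return out

# Example on a synthetic image standing in for a microchannel snapshot.
rng = np.random.default_rng(0)
img = np.clip(rng.normal(0.5, 0.15, size=(128, 256)), 0.0, 1.0)
print(local_entropy_field(img).shape)  # (8, 16)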
Task-based Language Teaching (TBLT) research has provided ample evidence that cognitive complexity is an important aspect of task design that influences learners' performance in terms of fluency, accuracy, and syntactic complexity. Despite the substantial number of empirical investigations into task complexity in journal articles, storyline complexity, one of its features, is scarcely investigated. Previous research has mainly focused on the impact of storyline complexity on learners' oral performance, but the impact on learners' written performance is less investigated. Thus, this study investigates the effects of the narrative complexity of the storyline on senior high school students' written performance, as displayed by its complexity, fluency, and accuracy. The present study has important pedagogical implications: task design and assessment should distinguish between different types of narrative tasks, for example, tasks with a single or a dual storyline. Results on task complexity may contribute to informing the pedagogical choices made by teachers when prioritizing work with a specific linguistic dimension.
Living objects have complex internal and external interactions. The complexity is regulated and controlled by homeostasis, which is the balance of multiple opposing influences. The environmental effects finally guide the self-organized structure. Living systems are open, dynamic structures performing random, stationary, stochastic, self-organizing processes. The self-organizing procedure is defined by the spatial-temporal fractal structure, which is self-similar both in space and time. The system's complexity appears in its energetics, which strives for the most efficient use of the available energies; for that, it organizes various well-connected networks. The controller of environmental relations is Darwinian selection on a long time scale. The energetics optimize the healthy processes, tuned to the highest efficacy and minimal loss (minimization of entropy production). The organism is built up by morphogenetic rules and develops various networks from the genetic level to the whole organism. The networks have intensive crosstalk and form a balance in the Nash equilibrium, which is the homeostatic state in healthy conditions. Homeostasis may be described as a Nash equilibrium, which ensures energy distribution in a "democratic" way regarding the functions of the parts in the complete system. Cancer radically changes the network system in the organism. Cancer is a network disease. Deviation from healthy networking appears at every level, from the genetic (molecular) level to cells, tissues, organs, and the organism. The strong proliferation of malignant tissue is the origin of most of the life-threatening processes. The weak side of cancer development is the change of complex information networking in the system, which makes it vulnerable to immune attacks. Cancer cells are masters of adaptation and evade immune surveillance. This hiding process can be broken by nonionizing electromagnetic radiation, for which the malignant structure has no adaptation strategy. Our objective is to review the different sides of living complexity and use the knowledge to fight against cancer.
The security of Federated Learning (FL) / Distributed Machine Learning (DML) is gravely threatened by data poisoning attacks, which destroy the usability of the model by contaminating training samples; such attacks are therefore called causative availability indiscriminate attacks. Facing the problem that existing data sanitization methods are hard to apply to real-time applications due to their tedious process and heavy computations, we propose a new supervised batch detection method for poison, which can quickly sanitize the training dataset before local model training. We design a training dataset generation method that helps to enhance accuracy and uses data complexity features to train a detection model, which is then used in an efficient batch hierarchical detection process. Our model stockpiles knowledge about poison, which can be expanded by retraining to adapt to new attacks. Being neither attack-specific nor scenario-specific, our method is applicable to FL/DML or other online or offline scenarios.
The rhetorical structure of abstracts has been a widely discussed topic, as it can greatly enhance the abstract writing skills of second-language writers. This study aims to provide guidance on the syntactic features that L2 learners can employ, as well as suggest which features they should focus on in English academic writing. To achieve this, all samples were analyzed for rhetorical moves using Hyland's five-move rhetorical model. Additionally, all sentences were evaluated for syntactic complexity, considering measures of global, clausal, and phrasal complexity. The findings reveal that expert writers exhibit a more balanced use of syntactic complexity across moves, effectively fulfilling the rhetorical objectives of abstracts. On the other hand, MA students tend to rely excessively on embedded structures and dependent clauses in an attempt to increase complexity. The implications of these findings for academic writing research, pedagogy, and assessment are thoroughly discussed.
This study examines the role of the syntactic complexity of a text in the reading comprehension skills of students. Utilizing a qualitative research method, this paper used structured interview questions as the main data-gathering instrument. English language teachers from Coral na Munti National High School were selected as the respondents of the study. The findings suggest that the syntactic complexity of a text affects the reading comprehension of students. Students found it challenging to understand the message that the author conveyed if he or she used a large number of phrases and clauses in one sentence. Furthermore, the complex-sentence syntactic structure was deemed the most challenging for students to understand. To overcome these challenges in comprehending text, various reading intervention programs were utilized by teachers. These interventions include focused or targeted instruction and the implementation of Project Dear, suggested by the Department of Education. These programs were proven to help students improve their comprehension as well as their knowledge of the syntactic structure of sentences. This study underscores the importance of selecting appropriate reading materials and implementing suitable reading intervention programs to enhance students' comprehension skills.
The complexity and applicability of three related car-following models are investigated: the optimal velocity model (OVM), the generalized force model (GFM), and the full velocity difference model (FVDM). The vehicle trajectory data used are collected from digital pictures obtained from a 30-storey building near the I-80 freeway. Three different calibration methods are used to estimate the model parameters and to study the relationships between model complexity and applicability from overall, inter-driver, and intra-driver analyses. Results of the three methods for the OVM, GFM, and FVDM show that complexity and applicability are not consistent and that complicated models are not always superior to simple ones in modeling car-following. The findings of this study can provide useful information for car-following behavior modeling.
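For reference, the three models in their commonly cited forms (notation here is generic, not necessarily the paper's): the OVM is

$\dfrac{dv_n(t)}{dt} = \kappa\bigl[V(\Delta x_n(t)) - v_n(t)\bigr],$

the GFM adds a relative-velocity term that acts only when the follower is closing in on the leader,

$\dfrac{dv_n(t)}{dt} = \kappa\bigl[V(\Delta x_n(t)) - v_n(t)\bigr] + \lambda\,\Theta(-\Delta v_n(t))\,\Delta v_n(t),$

and the FVDM keeps that term for all speed differences,

$\dfrac{dv_n(t)}{dt} = \kappa\bigl[V(\Delta x_n(t)) - v_n(t)\bigr] + \lambda\,\Delta v_n(t),$

where $\Delta x_n$ is the spacing to the leading vehicle, $\Delta v_n$ is the leader's speed minus the follower's, $V(\cdot)$ is the optimal velocity function, $\Theta$ is the Heaviside step function, and $\kappa$, $\lambda$ are the sensitivity parameters that calibration must estimate.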
By the method of gradient pattern analysis, twenty plots were set at altitudes of 700-2600 m with an interval of 100 m on the northern slope of the Changbai Mountain. The dissimilarity of the respective sub-plots within the same community was measured, and the complexity of plant communities at different altitudes was analyzed. The results from binary data of tree species in the canopy layer indicated that the sub-plots in the communities, except the subalpine Betula ermanii forest, showed comparatively high dissimilarity in species composition. In particular, the dissimilarity index (0.7) of the broadleaved/Korean pine forest at low altitudes was obviously higher than that of other communities. The differences between communities of dark coniferous forest are not obvious. Comparatively, the dissimilarity among sub-plots of the communities at an altitude of 1400 m was slightly higher than that of other communities, which reflected the complexity of tree species composition of transitional communities. For the subalpine Betula ermanii forest, tree species composition was simple and showed a high similarity between sub-plots. The results derived from binary data of shrubs showed that the dissimilarity index of shrub species in the broadleaved/Korean pine forest at low altitudes was higher than that in other communities, but the divergence tendency was not as obvious as that of tree species. The dissimilarities derived from binary data of herbs and of all plant species at different altitudes showed closely similar tendencies, and the differences in herb and all plant species between sub-plots were the greatest for the communities of the broadleaved/Korean pine forest and the alpine tundra zone.
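The abstract does not name the dissimilarity index applied to the binary (presence/absence) data; a common choice for such data, given here only for orientation, is the Sørensen dissimilarity

$D = 1 - \dfrac{2a}{2a + b + c},$

where $a$ is the number of species shared by two sub-plots and $b$ and $c$ are the numbers of species occurring in only one of them.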
Aim To present a quantitative method for structural complexity analysis and evaluation of information systems. Methods Based on Petri net modeling and analysis techniques, and with the aid of mathematical tools in general net theory (GNT), a quantitative method for structure description and analysis of information systems was introduced. Results The structural complexity index and two related factors, i.e. the element complexity factor and the connection complexity factor, were defined, and the relations between them and the parameters of the Petri-net-based model of the system were derived. An application example is presented. Conclusion The proposed method provides a theoretical basis for quantitative analysis and evaluation of structural complexity and can be applied in the general planning and design processes of information systems.
To solve reasoning problems in the extended fuzzy description logic with qualifying number restriction (EFALCQ), EFALCQ is discretely simulated by the description logic with qualifying number restriction (ALCQ), and ALCQ reasoning results are reused to prove the complexity of EFALCQ reasoning problems. An ALCQ simulation method for EFALCQ consistency is proposed. This method reduces EFALCQ satisfiability to EFALCQ consistency and uses EFALCQ satisfiability to discretely simulate the EFALCQ sat-domain. It is proved that the reasoning complexity of EFALCQ satisfiability, consistency, and sat-domain is PSPACE-complete.
Based on the iterative bit-filling procedure, a computationally efficient bit and power allocation algorithm is presented. The algorithm improves the conventional bit-filling algorithms by maintaining only a subset of subcarriers for computation in each iteration, which reduces the complexity without any performance degradation. Moreover, a modified algorithm with even lower complexity is developed, and equal power allocation is introduced as an initial allocation to accelerate its convergence. Simulation results show that the modified algorithm achieves a considerable complexity reduction while causing only a minor drop in performance.
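For context, the conventional greedy bit-filling baseline that such algorithms refine can be sketched as follows; the power model and parameter names are illustrative assumptions, not the paper's formulation:

import numpy as np

def greedy_bit_filling(gains, total_bits, gamma=1.0):
    # Assign total_bits one bit at a time to the subcarrier whose next bit costs
    # the least additional power, assuming P(b, k) = gamma * (2**b - 1) / gains[k].
    bits = np.zeros(len(gains), dtype=int)
    power = np.zeros(len(gains))
    for _ in range(total_bits):
        delta = gamma * (2.0 ** (bits + 1) - 1) / gains - power  # incremental power per subcarrier
        k = int(np.argmin(delta))                                # cheapest next bit
        bits[k] += 1
        power[k] += delta[k]
    return bits, power

bits, power = greedy_bit_filling(np.array([4.0, 2.0, 1.0, 0.5]), total_bits=8)
print(bits, round(power.sum(), 3))

The complexity reduction described in the abstract comes from restricting this per-iteration search to a maintained subset of subcarriers rather than scanning all of them every time.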
Love is an eternal subject with many references in many novels, but Langston Hughes approaches it in a most simplified manner to portray the complex feeling of the protagonists. He achieves this inward complexity through carefully-treated outward simplicity. The paper discusses this art of writing in Early Autumn from such aspects as the dramatic point of view, well-designed setting, careful presentation and effective rhetorical devices.
As one figurative form of language, metaphor plays the most complicated role in making language colorful and vivid. Demonstrating the types and the features of metaphor, this article focuses on the point that metaphor is a complex language phenomenon heavily loaded with the factor of culture.
My investigation will serve two purposes. First, I shall investigate the function of the subclauses in the corpus in relation to their complexity, and I shall establish whether there is a correlation between sentence length and sentence complexity. Second, I shall analyse the complexity of the subclauses collected from the two sections and compare the results from these sections, focusing on finite subclauses and non-finite subclauses. I hope to be able to point out some differences in style between the news and sports sections concerning the use of subordinate clauses in various syntactic functions in order to examine how the choice of linguistic structures differs in different sections of The Times.
For many continuous bio-medical signals with both strong nonlinearity and non-stationarity, two criteria were proposed for their complexity estimation: (1) only a short data set is needed for robust estimation; (2) no over-coarse graining preprocessing, such as transforming the original signal into a binary time series, is needed. The C0 complexity measure proposed by us previously is one such measure. However, it lacks a solid mathematical foundation and thus its use is limited. A modified version of this measure is proposed, and some important properties are proved rigorously. According to these properties, this measure can be considered an index of the randomness of a time series in some senses, and thus also a quantitative index of complexity under the meaning of randomness-finding complexity. Compared with other similar measures, this measure seems more suitable for estimating a large quantity of complexity measures for a given task, such as studying the dynamic variation of such measures in sliding windows of a long process, owing to its fast estimation speed.
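The modified measure itself is not reproduced in the abstract; one common formulation of a C0-style complexity (separate the dominant spectral components as the "regular" part and take the energy fraction of the residual) can be sketched as follows, where the threshold r and the normalization are assumptions rather than the paper's definition:

import numpy as np

def c0_style_complexity(x, r=5.0):
    # Energy share of the "irregular" residual left after removing the
    # spectral components whose power exceeds r times the mean power.
    x = np.asarray(x, dtype=float)
    X = np.fft.fft(x)
    power = np.abs(X) ** 2
    regular = np.fft.ifft(np.where(power > r * power.mean(), X, 0.0)).real
    residual = x - regular
    return float(np.sum(residual ** 2) / np.sum(x ** 2))

t = np.linspace(0.0, 1.0, 1024, endpoint=False)
print(c0_style_complexity(np.sin(2 * np.pi * 10 * t)))                       # close to 0: regular signal
print(c0_style_complexity(np.random.default_rng(1).standard_normal(1024)))   # close to 1: random signal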
In millimeter wave (mmWave) multiple-input multiple-output (MIMO) systems, hybrid precoding has been widely used to overcome the severe propagation loss. In order to improve the spectrum efficiency with low complexity, we propose a joint hybrid precoding algorithm for single-user mmWave MIMO systems in this paper. By using the concept of an equivalent channel, the proposed algorithm skillfully utilizes the idea of alternating optimization to complete the design of the RF precoder and combiner. Then, the baseband precoder and combiner are computed by calculating the singular value decomposition of the equivalent channel. Simulation results demonstrate that the proposed algorithm can achieve satisfactory performance with quite low complexity. Moreover, we investigate the effects of quantization on the analog components and find that the proposed scheme is effective even with coarse quantization.
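The baseband step described in the abstract (a singular value decomposition of the equivalent channel seen through the RF stages) can be sketched as below; matrix names, dimensions, and the power normalization are illustrative, and the alternating-optimization design of the RF precoder and combiner is omitted:

import numpy as np

def baseband_from_equivalent_channel(H, F_rf, W_rf, n_streams):
    # H: channel (Nr x Nt); F_rf: RF precoder (Nt x Nrf); W_rf: RF combiner (Nr x Nrf).
    # The baseband precoder/combiner are taken from the SVD of W_rf^H H F_rf.
    H_eq = W_rf.conj().T @ H @ F_rf
    U, s, Vh = np.linalg.svd(H_eq)
    F_bb = Vh.conj().T[:, :n_streams]   # right singular vectors -> baseband precoder
    W_bb = U[:, :n_streams]             # left singular vectors  -> baseband combiner
    F_bb *= np.sqrt(n_streams) / np.linalg.norm(F_rf @ F_bb, 'fro')  # hybrid power constraint
    return F_bb, W_bb

rng = np.random.default_rng(0)
Nr, Nt, Nrf, Ns = 16, 64, 4, 2
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
F_rf = np.exp(1j * rng.uniform(0, 2 * np.pi, (Nt, Nrf))) / np.sqrt(Nt)  # constant-modulus phases
W_rf = np.exp(1j * rng.uniform(0, 2 * np.pi, (Nr, Nrf))) / np.sqrt(Nr)
F_bb, W_bb = baseband_from_equivalent_channel(H, F_rf, W_rf, Ns)
print(F_bb.shape, W_bb.shape)  # (4, 2) (4, 2)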
In this paper, based on the theoretical framework "Evolutionary Algorithms + Program Structures = Automatic Programming", some results on the complexity of automatic programming for function modeling are given, which show that the complexity of automatic programming is an exponential function of the problem dimension N, the size of the operator set |F|, and the height of the program parse tree H. Following these results, the difficulties of automatic programming are discussed. Some function models discovered automatically from databases by the evolutionary modeling method are given as well.
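A rough counting argument illustrates the blow-up (this is only an illustration consistent with the claim, not the paper's derivation, and it assumes binary operators): a perfect binary parse tree of height $H$ has $2^{H}-1$ internal nodes, each labeled with one of the $|F|$ operators, and $2^{H}$ leaves, each labeled with one of roughly $N$ terminals, so the program search space already contains on the order of

$|F|^{\,2^{H}-1}\, N^{\,2^{H}}$

candidate programs, a quantity that grows rapidly with $N$ and $|F|$ and doubly exponentially with the tree height $H$.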
Exploring the physical mechanisms of complex systems and making effective use of them are the keys to dealing with the complexity of the world. The emergence of big data and the enhancement of computing power, in conjunction with the improvement of optimization algorithms, are leading to the development of artificial intelligence (AI) driven by deep learning. However, deep learning fails to reveal the underlying logic and physical connotations of the problems being solved. Mesoscience provides a concept to understand the mechanism of the spatiotemporal multiscale structure of complex systems, and its capability for analyzing complex problems has been validated in different fields. This paper proposes a research paradigm for AI, which introduces the analytical principles of mesoscience into the design of deep learning models. This is done to address the fundamental problem of deep learning models detaching the physical prototype from the problem being solved; the purpose is to promote the sustainable development of AI.