This study, through a re-conceptualization of sociological complexity theory's epistemological sources, specifically in Edgar Morin's formulation, sheds light on the theoretical models as well as the empirical methodologies of sociological analysis of today's complex, interconnected, diverse and globalized society and its global disorder. Complexity theory leads to a shift in perspective and a transformation of the epistemological status of the social sciences, with an in-depth intervention of disorder, contingency, the case, the singular, and the non-repeatable in sociological analysis. The notion of dialogic interplay is placed at the paradigm level and stands at the heart of the concepts, analyzing the social system as an auto-eco-organizer. Similarly, the notion of 'emergence' at the macro and micro levels imposes itself as complex, logically requiring that simple, linear thinking and models of explanation be overcome in favor of a perspective of organizational recursivity, in which the product retroacts on and transforms its producer, conceiving a circularity of co-production between individuals and society through interactions. Translating this epistemology and sociological complexity theory into the setting of empirical methodology, the complex sociological approach is phenomenon-, event/information- and crisis-centered, privileging observation, participation-intervention, and 'live inquiry'. The open, in-depth and possibly non-directive interview is part of the clinical sociological methodology, raising the question of the observer-phenomenon-observed relation.
This paper presents a qualitative study investigating the dynamics of second language (L2) learning strategies under the guidance of complexity theory. A group of Chinese undergraduate students studying at an international university in Thailand were selected as the research participants. Research instruments included interviews, observations, records of participants' online chat and posts, and a research journal. The findings indicate that the changes in the participants' strategies for learning English exhibit typical features of a complex system. The study provides implications for probing into the nature of L2 strategies and for applying complexity theory to future research on L2 strategies.
Dominant technology formation is the key for the high-tech industry to "cross the chasm" and gain an established foothold in the market (and hence disrupt the regime). Therefore, a stimulus-response model is proposed to investigate dominant technology by exploring its formation process and mechanism. Specifically, based on complex adaptive system theory and the basic stimulus-response model, we use a combination of agent-based modeling and system dynamics modeling to capture the interactions between dominant technology and the socio-technical landscape. The results indicate the following: (i) The dynamic interaction is "stimulus-reaction-selection", which promotes the dominant technology's formation. (ii) The dominant technology's formation can be described as a dynamic process in which the adaptation intensity of technology standards increases continuously until one becomes the leading technology under the dual action of internal and external mechanisms. (iii) The dominant technology's formation in the high-tech industry is influenced by learning ability, the number of adopting users, and adaptability. A "critical scale" of learning ability exists that promotes the formation of the leading technology; a large number of adopting users can promote the dominant technology's formation by influencing both the adaptive response of technology standards to the socio-technical landscape and the landscape's selection of technology standards. There is a minimum threshold and a maximum threshold for the role of adaptability in the dominant technology's formation. (iv) The socio-technical landscape can promote the shaping of the leading technology in the high-tech industry, and different elements have different effects. This study advances research on the formation mechanism of dominant technology in the high-tech industry, presents new perspectives and methods for researchers, and provides essential guidance for managers in formulating technology strategies.
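The stimulus-response dynamic described above can be illustrated with a minimal toy model (a sketch for intuition only, not the authors' actual agent-based/system-dynamics model; all parameters are hypothetical): agents repeatedly choose between two technologies whose utility mixes an intrinsic quality "stimulus" with a network-effect "response" to the current adopter share, so that a small quality edge plus adoption feedback produces lock-in of a dominant design.

```python
import random

def simulate_adoption(n_agents=200, steps=50, net_effect=0.6, seed=42):
    """Toy stimulus-response adoption model: every step, each agent
    re-evaluates technologies A and B, whose utility mixes an intrinsic
    quality signal (the 'stimulus') with the current adopter share
    (the population's 'response')."""
    rng = random.Random(seed)
    quality = {"A": 0.55, "B": 0.45}   # hypothetical intrinsic qualities
    # start from an even split of adopters
    choice = ["A"] * (n_agents // 2) + ["B"] * (n_agents - n_agents // 2)
    for _ in range(steps):
        share_a = choice.count("A") / n_agents
        share = {"A": share_a, "B": 1.0 - share_a}

        def utility(t):
            return (1 - net_effect) * quality[t] + net_effect * share[t]

        for i in range(n_agents):
            if rng.random() < 0.9:     # agents mostly follow utility...
                choice[i] = max("AB", key=utility)
            # ...otherwise they keep their current technology
    return choice.count("A") / n_agents

print(simulate_adoption())             # technology A locks in as dominant
```

With the quality edge removed, initial adopter share alone decides which technology locks in, which is the classic path-dependence of dominant-design formation.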
The electromagnetic force, strong nuclear force, weak nuclear force, and gravitational force are the four fundamental forces of nature. The Standard Model (SM) succeeded in combining the first three forces to describe the most basic building blocks of matter and how they govern the universe. Despite the model's great success in resolving many issues in particle physics, it still has several setbacks and limitations. The model fails to incorporate the fourth force, gravity, and it implies that all fermions and bosons are massless, contrary to experimental facts. In addition, the model addresses neither the 95% of the universe's energy attributed to Dark Matter (DM) and Dark Energy (DE) nor the universe's expansion. The Complex Field Theory (CFT) identifies DM and DE as complex fields of complex masses and charges that encompass the whole universe and pervade all matter. This presumption resolves the issue of the failure to detect DM and DE over the last five decades. The theory also presents a model for the universe's expansion and presumes that every material object carries a fraction of this complex field proportional to its mass. These premises explain the physical nature of the gravitational force and its complex field and pave the way for gravity to enter the SM. On the other hand, to solve the issue of massless bosons and fermions in the SM, the Higgs mechanism introduces a purely abstract theoretical model of four unimaginable potentials to generate fictitious bosons as mass donors to fermions and to the W± and Z bosons. The CFT in this paper introduces, for the first time, a physical explanation of the mystery of the mass formation of particles rather than Higgs' purely mathematical derivations. The analyses lead to uncovering the mystery of electron-positron production near heavy nuclei and never in a vacuum. In addition, CFT puts a constraint on Einstein's mass-energy equation: energy can never be converted to mass without the presence of dense dark matter, and the conversion cannot occur in a vacuum. Furthermore, CFT provides different perspectives on, and resolutions of, real-world physics concepts such as the nuclear force, the Casimir force, the Lamb shift, and the anomalous magnetic moment, to be published elsewhere.
This essay presents a reflection on the main implications of Complexity Theory for science in general, redefining and dispelling myths of traditional science, and for Sociology in particular, suggesting a redefinition of Parsons' classic concept of the Social System, articulated around the property of self-maintenance of order rather than around its possible discontinuity and instability. It argues that Complexity Theory has established the limits of Classic Science, leading to a more realistic awareness of the working and evolution mechanisms of Natural and Social Systems and showing the limits of our capacity to predict and control events. Dissipative structures have shown the creative role of time. Instability, emergence, surprise and unpredictability are the rule rather than the exception when systems move away from equilibrium (entropy), even if these processes are generated by a system's deterministic working mechanisms. Therefore, we have come to realize how constructive the contribution of Complexity is with regard to the long-standing problem of the relationship between order and disorder. Today, the terms of this relationship have been re-specified in a new configuration of inter-relationship, according to a unicum which finds its synthesis in the concepts of self-organization and deterministic chaos. From this perspective, as Prigogine suggested, studies on Complex Systems are heading toward a historical, biological conception of Physics, and a new alliance between natural systems and living, social systems. Non-linearity, far-from-equilibrium self-organization, emergence and surprise meet at all levels, as this paper attempts to highlight. In Sociology, insights from Complexity Theory have contributed to a new way of thinking about social systems, by re-addressing some fundamental issues starting from the concepts of social system, emergence and change. The current conception of social systems as complex dynamical systems is supported by a profitable use of non-linear models (in particular, the Logistic map) in the study of social processes.
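The Logistic map mentioned above is easy to experiment with; a minimal sketch (parameter values chosen purely for illustration) shows the shift from stable order to deterministic chaos as the control parameter r grows:

```python
def logistic_map(r, x0, n):
    """Iterate x_{t+1} = r * x_t * (1 - x_t)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# r = 2.8: the trajectory settles on the fixed point 1 - 1/r (order)
stable = logistic_map(2.8, 0.2, 200)
print(round(stable[-1], 4))           # → 0.6429

# r = 4.0: deterministic chaos, bounded and aperiodic yet generated by
# the same deterministic rule
chaotic = logistic_map(4.0, 0.2, 200)
print(min(chaotic) >= 0.0 and max(chaotic) <= 1.0)   # → True
```

The same deterministic equation thus produces either predictable equilibrium or disorder depending on a single parameter, which is precisely the order/disorder unicum the essay discusses.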
Based on forbidden patterns in symbolic dynamics, symbolic subsequences are classified and the relations between forbidden patterns, correlation dimensions and complexity measures are studied. A complexity measure approach is proposed in order to separate deterministic (usually chaotic) series from random ones and to measure the complexities of different dynamic systems. The complexity is related to the correlation dimensions, and the algorithm is simple and suitable for time series with noise. In this paper, the complexity measure method is used to study the dynamic systems of the Logistic map and the Hénon map with multiple parameters.
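The forbidden-pattern idea can be made concrete with ordinal patterns (a sketch of the general technique, not the paper's exact algorithm): a fully random series realises all six length-3 order patterns, while the Logistic map at r = 4 never produces a strictly decreasing triple, because x_{t+1} < x_t requires x_t > 3/4, which forces x_{t+1} < 3/4 and hence x_{t+2} > x_{t+1}.

```python
import random

def logistic(r, x0, n):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def ordinal_patterns(series, m=3):
    """Set of order patterns realised by overlapping length-m windows:
    each pattern is the argsort of the window (indices in ascending
    value order)."""
    found = set()
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        found.add(tuple(sorted(range(m), key=window.__getitem__)))
    return found

chaotic = ordinal_patterns(logistic(4.0, 0.2, 5000))
rng = random.Random(0)
noise = ordinal_patterns([rng.random() for _ in range(5001)])

print(len(noise))                # → 6: random data realises every pattern
print((2, 1, 0) in chaotic)      # → False: the decreasing triple is forbidden
```

Counting missing (forbidden) patterns in this way is one simple, noise-tolerant statistic for separating deterministic series from random ones.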
On the basis of complex network theory, the issues of key nodes in Wireless Sensor Networks (WSN) are discussed. A model expression of sub-network faults in WSN is given first; subsequently, the concepts of average path length and clustering coefficient are introduced. Based on these two concepts, a novel attribute description of key nodes related to sub-networks is proposed. Moreover, in terms of node deployment density and transmission range, the concepts of single-point key nodes and generalized key nodes of WSN are defined, and their decision theorems are investigated.
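The two indicators can be computed directly; here is a minimal pure-Python sketch on a hypothetical five-node sensor topology (the adjacency and values are illustrative, not from the paper):

```python
from collections import deque
from itertools import combinations

def shortest_paths(adj, src):
    """BFS distances from src in an undirected graph (adjacency dict)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def average_path_length(adj):
    nodes = list(adj)
    total, pairs = 0, 0
    for s in nodes:
        d = shortest_paths(adj, s)
        for t in nodes:
            if t != s:
                total += d[t]
                pairs += 1
    return total / pairs

def clustering_coefficient(adj, u):
    """Fraction of neighbour pairs of u that are themselves connected."""
    nbrs = adj[u]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

# hypothetical 5-node sensor topology
adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {2, 3, 5}, 5: {4}}
print(average_path_length(adj))          # → 1.5
print(clustering_coefficient(adj, 2))    # → 0.666...
```

A node whose removal sharply raises the average path length of its sub-network, while its neighbours are poorly interconnected (low clustering), is a natural candidate key node in the sense described above.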
This study employs Q methodology to explore the developmental routines of oral English ability for 12 English-major students in China, inspired by Complex Dynamic Systems Theory (CDST). The data analysis suggests the following findings: (1) two developmental patterns emerge, gradual improvement and a strong phase shift, influenced by internal and external factors through interactions among different subsystems; (2) guided by CDST, the study confirms the importance of self-organization and initial conditions reported in previous studies. In light of these findings, it is highly suggested that teachers form a holistic view of students' oral English development, with attention to its non-linear characteristics and individual differences.
While conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory, digital forensics examiners rarely, if ever, do so. In this paper, we review some of the quantitative tools and techniques which are available for use in digital forensic investigations, including Bayesian networks, complexity theory, information theory and probability theory, and indicate how they may be used to obtain likelihood ratios or odds ratios for the relative plausibility of alternative explanations for the creation of the recovered digital evidence. The potential benefits of such quantitative measures for modern digital forensics are also outlined.
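A likelihood ratio of the kind described is straightforward to compute once the conditional probabilities have been estimated; the figures below are purely hypothetical:

```python
def likelihood_ratio(p_e_given_h1, p_e_given_h2):
    """LR = P(E | H1) / P(E | H2): how much more probable the recovered
    evidence E is under hypothesis H1 than under the alternative H2."""
    return p_e_given_h1 / p_e_given_h2

# hypothetical figures for a browser artefact found on a device
p_artifact_if_user_visited = 0.95   # H1: the user deliberately visited the site
p_artifact_if_malware = 0.05        # H2: malware generated the traffic
lr = likelihood_ratio(p_artifact_if_user_visited, p_artifact_if_malware)

# Bayes' rule in odds form: posterior odds = prior odds x LR
prior_odds = 0.5                    # hypothetical prior odds for H1 : H2
print(round(lr, 6), round(prior_odds * lr, 6))   # → 19.0 9.5
```

An LR above 1 supports H1, below 1 supports H2, and 1 is neutral, which gives examiners a scale directly comparable to the one used in conventional forensic disciplines.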
Verification in quantum computation is crucial since quantum systems are extremely vulnerable to the environment. However, directly verifying the output of a quantum computation is difficult, since efficiently simulating a large-scale quantum computation on a classical computer is generally believed to be impossible. To overcome this difficulty, we propose a self-testing system for quantum computations, which can be used to verify whether a quantum computation has been performed correctly. Our basic idea is to use some extra ancilla qubits to test the output of the computation. We design two kinds of permutation circuits into the original quantum circuit: one is applied to the ancilla qubits, whose output indicates the testing information; the other is applied to all qubits (including the ancilla qubits) and aims to uniformly permute the positions of all qubits. We show that both permutation circuits are easy to implement. In this way, we prove that any quantum computation has an efficient self-testing system. Finally, we also discuss the relation between our self-testing system and interactive proof systems, and show that the two systems are equivalent if the verifier is allowed some quantum capacity.
Quantum information processing and communication (QIPC) is an area of science with two main goals. On one side, it tries to explore the (still not well known) potential of quantum phenomena for (efficient and reliable) information processing and (efficient, reliable and secure) communication. On the other side, it tries to use quantum information storing, processing and transmitting paradigms, principles, laws, limitations, concepts, models and tools to gain deeper insights into the phenomena of the quantum world and to find efficient ways to describe and handle/simulate various complex physical phenomena. To do so, QIPC has to use the concepts, models, theories, methods and tools of both physics and informatics. The main role of physics in this is to discover primitive physical phenomena that can be used to design and maintain complex and reliable information storing, processing and transmitting systems. The main role of informatics is, on one side, to explore, from the information processing and communication point of view, the limitations and potential of prospective quantum information processing and communication technology, and to prepare information processing methods that could utilise that potential. On the other side, the main role of informatics is to guide and support, with theoretical tools and outcomes, physics-oriented research in QIPC. This paper describes and analyses a variety of ways in which informatics contributes, and should or could contribute, to the development of QIPC; see also Gruska (1999, 2006, 2008).
The number of available control sources is a limiting factor in many network control tasks. A lack of input sources can result in compromised controllability and/or sub-optimal network performance, as noted in engineering applications such as smart grids. The mechanism can be explained by a linear time-invariant model, where structural controllability sets a lower bound on the number of required sources. Inspired by the ubiquity of time-varying topologies in the real world, we propose a strategy of spatiotemporal input control to overcome the source-related limit by exploiting temporal variation of the network topology. We theoretically prove that under this regime, the required number of sources can always be reduced to 2. It is further shown that the cost of control depends on two hyperparameters, the numbers of sources and intervals, in a trade-off fashion. As a demonstration, we achieve controllability over a complex network resembling the nervous system of Caenorhabditis elegans using as few as 6% of the sources predicted by a static control model. This example underlines the potential of utilizing topological variation in complex network control problems.
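The static lower bound mentioned above comes from the classical Kalman controllability rank condition, rank[B, AB, ..., A^(n-1)B] = n. A small sketch (toy matrices, not the paper's model) shows why a single source suffices for a chain but fails for a symmetric star, the kind of limit the spatiotemporal strategy is designed to overcome:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def rank(M, eps=1e-9):
    """Matrix rank via Gaussian elimination with partial pivoting."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = max(range(r, rows), key=lambda i: abs(M[i][c]), default=None)
        if pivot is None or abs(M[pivot][c]) < eps:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

def controllable(A, B):
    """Kalman test: rank [B, AB, ..., A^(n-1)B] == n."""
    n = len(A)
    blocks, P = [], B
    for _ in range(n):
        blocks.append(P)
        P = matmul(A, P)
    C = [[blk[i][j] for blk in blocks for j in range(len(B[0]))]
         for i in range(n)]
    return rank(C) == n

# 3-node chain 1 -> 2 -> 3, driven at node 1: fully controllable
A_chain = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
# star: node 1 drives leaves 2 and 3 symmetrically: NOT controllable,
# because the two leaves can never be steered independently
A_star = [[0, 0, 0], [1, 0, 0], [1, 0, 0]]
B = [[1], [0], [0]]
print(controllable(A_chain, B), controllable(A_star, B))  # → True False
```

The star would need a second input to break the symmetry; letting the topology itself vary over time is the alternative route the paper exploits.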
Cascading failure is a potential threat in power systems with the large-scale development of wind power, especially for the large-scale grid-connected and long-distance transmission wind power bases in China. This paper introduces complex network theory (CNT) for cascading failure analysis considering wind farm integration. A cascading failure power flow analysis model for complex power networks is established with improved network topology principles and methods. The network load and boundary conditions are determined to reflect the operational states of power systems. Three typical network evaluation indicators are used to evaluate the topological characteristics of the power network before and after malfunction: connectivity level, global effective performance, and percentage of load loss (PLL). The impacts of node removal, grid current tolerance capability, instantaneous wind power penetration, and wind farm coupling points on the power grid are analyzed based on the IEEE 30-bus system. Through the simulation analysis, the occurrence mechanism and main influencing factors of cascading failure are determined. Finally, corresponding defense strategies are proposed to reduce the hazards of cascading failure in power systems.
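"Global effective performance" is commonly computed as the network's global efficiency, the average reciprocal geodesic distance over node pairs; a pure-Python sketch on a toy five-bus topology (illustrative, not the IEEE 30-bus data) shows how removing a cut vertex degrades it:

```python
from collections import deque

def efficiency(adj):
    """Global efficiency: average of 1/d(i, j) over distinct node pairs,
    counting disconnected pairs as 0 (one common 'global effective
    performance' indicator)."""
    nodes = list(adj)
    total = 0.0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:                      # BFS from s
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for t, d in dist.items() if t != s)
    n = len(nodes)
    return total / (n * (n - 1))

def remove_node(adj, x):
    return {u: {v for v in nbrs if v != x} for u, nbrs in adj.items() if u != x}

# toy 5-bus topology; bus 3 is a cut vertex tying the two halves together
grid = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4}}
e0 = efficiency(grid)
e1 = efficiency(remove_node(grid, 3))
print(round(e0, 3), round(e1, 3))     # → 0.717 0.333
```

Tracking this indicator while nodes are removed one after another is the basic mechanic of a topology-based cascading failure simulation.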
Background: Urban green infrastructure (GI) networks play a significant role in ensuring regional ecological security; however, they are highly vulnerable to the influence of urban development, and the optimization of GI networks for better connectivity and resilience under different development scenarios has become a practical problem that urgently needs to be solved. Taking Harbin, a megacity in Northeast China, as the case study, we set five simulation scenarios by adjusting the economic growth rate and extracted the GI network in multiple scenarios by integrating the minimal cumulative resistance model and the gravity model. The low-degree-first (LDF) strategy from complex network theory was introduced to optimize the GI network, and the optimization effect was verified by robustness analysis. Results: The results showed that in the 5% economic growth scenario, the GI network structure was more complex and the connectivity of the network was better, while in the other scenarios, the network structure gradually degraded with economic growth. After optimization by the LDF strategy, the average degree of the GI network in the multiple scenarios increased from 2.368, 2.651, 2.189, 1.972, and 1.847 to 2.783, 3.125, 2.643, 2.414, and 2.322, respectively, and the connectivity and resilience of the GI network structure were significantly enhanced in all scenarios. Conclusions: Economic growth did not necessarily lead to degradation of the GI network; there was still room for economic development in the study area, but it was limited under existing GI conditions, and the LDF strategy was an effective method for optimizing the GI network. The research results provide a new perspective for the study of GI network protection under urban economic growth and serve as a methodological reference for urban GI network optimization.
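A minimal sketch of the low-degree-first idea (one plausible reading for illustration; the paper's exact procedure may differ): repeatedly connect the two lowest-degree, non-adjacent nodes and watch the average degree rise.

```python
def average_degree(adj):
    return sum(len(nbrs) for nbrs in adj.values()) / len(adj)

def low_degree_first(adj, n_new_edges):
    """LDF sketch: each round, connect the two lowest-degree nodes that
    are not yet adjacent (new corridors reinforce the weakest patches)."""
    adj = {u: set(nbrs) for u, nbrs in adj.items()}   # work on a copy
    for _ in range(n_new_edges):
        order = sorted(adj, key=lambda u: len(adj[u]))
        added = False
        for i, u in enumerate(order):
            for v in order[i + 1:]:
                if v not in adj[u]:
                    adj[u].add(v)
                    adj[v].add(u)
                    added = True
                    break
            if added:
                break
    return adj

# hypothetical corridor network: a chain of 6 habitat patches
gi = {i: {i - 1, i + 1} & set(range(6)) for i in range(6)}
print(round(average_degree(gi), 3))    # → 1.667
opt = low_degree_first(gi, 2)
print(round(average_degree(opt), 3))   # → 2.333
```

Prioritising low-degree nodes removes single points of failure first, which is why robustness (the size of the surviving connected component under node removal) improves faster than under random edge addition.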
Unlike conventional forensics, digital forensics does not at present generally quantify the results of its investigations. It is suggested that digital forensics should aim to catch up with other forensic disciplines by using Bayesian and other numerical methodologies to quantify the results of its investigations. Assessing the plausibility of alternative hypotheses (or propositions, or claims) which explain how recovered digital evidence came to exist on a device could assist both the prosecution and the defence in criminal proceedings: helping the prosecution to decide whether to proceed to trial and helping defence lawyers to advise a defendant how to plead. This paper reviews some numerical approaches to the goal of quantifying the relative weights of individual items of digital evidence and the plausibility of hypotheses based on that evidence. The potential advantages of enabling the construction of cost-effective digital forensic triage schemas are also outlined.
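In odds form, Bayes' rule makes the proposed quantification concrete: posterior odds = prior odds × the product of the likelihood ratios of the (assumed independent) evidence items. The sketch below ranks hypothetical devices for triage; every number is invented for illustration:

```python
def posterior_probability(prior_odds, likelihood_ratios):
    """Bayes in odds form: posterior odds = prior odds x prod(LR_i) for
    independent items of evidence; then convert odds to a probability."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# hypothetical triage: rank seized devices by the plausibility of the
# 'deliberate download' hypothesis, given per-item likelihood ratios
devices = {
    "laptop":  [12.0, 3.0, 0.8],   # three artefacts, LRs assumed known
    "phone":   [1.5, 0.9],
    "usb_key": [30.0],
}
ranked = sorted(devices,
                key=lambda d: posterior_probability(1.0, devices[d]),
                reverse=True)
print(ranked)                      # → ['usb_key', 'laptop', 'phone']
```

Examining devices in this order front-loads the items most likely to matter, which is the cost-effectiveness argument behind triage schemas.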
One of the most critical issues in the evaluation of power systems is the identification of critical buses. For this purpose, this paper proposes a new methodology that evaluates the substitution of the power flow technique by the geodesic vulnerability index to identify critical nodes in power systems. Both methods are applied comparatively to demonstrate the scope of the proposed approach. The applicability of the methodology is illustrated using the IEEE 118-bus test system as a case study. To identify the critical components, a node is initially disconnected, and the performance of the resulting topology is evaluated under simulations of multiple cascading faults. Cascading events are simulated by randomly removing assets from a system that continually changes its structure with the elimination of each component. Thus, the classification of the critical nodes is determined by evaluating the resulting performance of 118 different topologies and calculating the damaged area for each of the disintegration curves of the cascading failures. In summary, the feasibility and suitability of complex network theory for identifying critical nodes in power systems are justified.
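One common formulation of a geodesic vulnerability index scores a component by the relative drop in geodesic (shortest-path) efficiency that its removal causes; the sketch below (a toy six-bus graph, not the IEEE 118-bus system) ranks nodes this way:

```python
from collections import deque

def geodesic_efficiency(adj):
    """Sum of 1/d(i, j) over connected ordered pairs, with d the
    geodesic (shortest-path) distance."""
    total = 0.0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:                      # BFS from s
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for t, d in dist.items() if t != s)
    return total

def vulnerability_ranking(adj):
    """Rank nodes by the efficiency drop their removal causes:
    v(x) = 1 - E(G - x) / E(G)."""
    base = geodesic_efficiency(adj)
    scores = {}
    for x in adj:
        damaged = {u: {v for v in nbrs if v != x}
                   for u, nbrs in adj.items() if u != x}
        scores[x] = 1.0 - geodesic_efficiency(damaged) / base
    return sorted(scores, key=scores.get, reverse=True)

# toy 6-bus system: bus 3 bridges the triangle and the tail, so it
# should rank as the most critical node
buses = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4, 6}, 6: {5}}
print(vulnerability_ranking(buses)[0])   # → 3
```

Unlike a power flow solution, this purely topological score needs no electrical parameters, which is exactly the substitution the paper evaluates.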
Vehicle information on high-speed trains can not only determine whether the various parts of the train are working normally, but also predict the train's future operating status. How to obtain valuable information from massive vehicle data is a key challenge. First, we divide the vehicle data of a high-speed train into 13 subsystem datasets according to the functions of the collection components. Then, based on gray theory and the Granger causality test, we propose the Gray-Granger Causality (GGC) model, which can construct a vehicle information network on the basis of the correlations between the collection components. By using complex network theory to mine the vehicle information network and its subsystem networks, we find that they have the characteristics of a scale-free network. In addition, the vehicle information network is weak against attacks, but the subsystem networks are closely connected and strong against attacks.
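The scale-free property and attack fragility reported above can be reproduced in miniature with a preferential-attachment network (a generic sketch, not the GGC model): hubs concentrate connectivity, so deleting the highest-degree nodes fragments the network far more than removing the same number of random nodes would.

```python
import random
from collections import deque

def ba_network(n, m, seed=7):
    """Preferential attachment (Barabasi-Albert style): each new node
    attaches to m existing nodes chosen proportionally to degree."""
    rng = random.Random(seed)
    adj = {u: {v for v in range(m + 1) if v != u} for u in range(m + 1)}
    stubs = [u for u in adj for _ in adj[u]]       # degree-weighted pool
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(stubs))
        adj[new] = set(targets)
        for t in targets:
            adj[t].add(new)
            stubs.extend([t, new])
    return adj

def giant_component(adj):
    """Size of the largest connected component."""
    seen, best = set(), 0
    for s in adj:
        if s in seen:
            continue
        comp, q = {s}, deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in comp:
                    comp.add(v)
                    q.append(v)
        seen |= comp
        best = max(best, len(comp))
    return best

net = ba_network(300, 2)
# targeted attack: delete the 15 highest-degree hubs
hubs = sorted(net, key=lambda u: len(net[u]), reverse=True)[:15]
attacked = {u: net[u] - set(hubs) for u in net if u not in hubs}
print(giant_component(net), giant_component(attacked))
```

The heavy-tailed degree distribution is what makes such a network robust to random failures yet weak against deliberate attacks on its hubs, the asymmetry the abstract describes.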
The bond, vibration and microwave dielectric characteristics of Zn_(1-x)(Li_(0.5)Bi_(0.5))_(x)WO_(4) (x=0-0.12) ceramics were investigated by XRD refinement, Raman and FT-IR spectroscopy, and complex bond valence theory. The results showed that proper substitution of (Li_(0.5)Bi_(0.5))^(2+) can improve the sintering characteristics and microwave dielectric properties of ZnWO_(4). The increase of the Q×f value was mainly attributed to the dense and uniform microstructure, and the subsequent decrease resulted from the deterioration of structural stability and relative density. According to the complex bond valence theory, the chemical bond characteristics of ZnWO_(4) and Bi_(2)WO_(6) played an important role in the dielectric properties of the samples. Additionally, the sample (x=0.02) sintered at 900℃ showed satisfactory properties: ε_(r)=15.332, Q×f=35,762 GHz, and τ_(f)=-65 ppm/℃, making it a potential candidate material for LTCC applications.
Funding (dominant technology formation study): supported by the Shanghai Philosophy and Social Science Foundation (2022ECK004) and the Shanghai Soft Science Research Project (23692123400).
文摘Dominant technology formation is the key for the hightech industry to“cross the chasm”and gain an established foothold in the market(and hence disrupt the regime).Therefore,a stimulus-response model is proposed to investigate the dominant technology by exploring its formation process and mechanism.Specifically,based on complex adaptive system theory and the basic stimulus-response model,we use a combination of agent-based modeling and system dynamics modeling to capture the interactions between dominant technology and the socio-technical landscape.The results indicate the following:(i)The dynamic interaction is“stimulus-reaction-selection”,which promotes the dominant technology’s formation.(ii)The dominant technology’s formation can be described as a dynamic process in which the adaptation intensity of technology standards increases continuously until it becomes the leading technology under the dual action of internal and external mechanisms.(iii)The dominant technology’s formation in the high-tech industry is influenced by learning ability,the number of adopting users and adaptability.Therein,a“critical scale”of learning ability exists to promote the formation of leading technology:a large number of adopting users can promote the dominant technology’s formation by influencing the adaptive response of technology standards to the socio-technical landscape and the choice of technology standards by the socio-technical landscape.There is a minimum threshold and a maximum threshold for the role of adaptability in the dominant technology’s formation.(iv)The socio-technical landscape can promote the leading technology’s shaping in the high-tech industry,and different elements have different effects.This study promotes research on the formation mechanism of dominant technology in the high-tech industry,presents new perspectives and methods for researchers,and provides essential enlightenment for managers to formulate technology strategies.
Abstract: The electromagnetic force, strong nuclear force, weak nuclear force, and gravitational force are the four fundamental forces of nature. The Standard Model (SM) succeeded in combining the first three forces to describe the most basic building blocks of matter and govern the universe. Despite the model's great success in resolving many issues in particle physics, it still has several setbacks and limitations. The model fails to incorporate the fourth force, gravity. It implies that all fermions and bosons are massless, contrary to experimental facts. In addition, the model addresses neither the 95% of the universe's energy attributed to Dark Matter (DM) and Dark Energy (DE) nor the universe's expansion. The Complex Field Theory (CFT) identifies DM and DE as complex fields of complex masses and charges that encompass the whole universe and pervade all matter. This presumption would resolve the failure to detect DM and DE over the last five decades. The theory also presents a model for the universe's expansion and presumes that every material object carries a fraction of this complex field proportional to its mass. These premises explain the physical nature of the gravitational force and its complex field and pave the way for gravity to enter the SM. On the other hand, to solve the issue of massless bosons and fermions in the SM, the Higgs mechanism introduces a purely abstract theoretical model of four potentials to generate fictitious bosons as mass donors to fermions and to the W± and Z bosons. The CFT in this paper introduces, for the first time, a physical explanation of the mystery of particle mass formation rather than Higgs' purely mathematical derivations. The analysis leads to an explanation of why electron-positron production occurs near heavy nuclei and never in a vacuum. In addition, it places a constraint on Einstein's mass-energy equation: energy can never be converted to mass without the presence of dense dark matter, and the conversion cannot occur in a vacuum. Furthermore, CFT provides different perspectives on, and resolutions of, real-world physics concepts such as the nuclear force, the Casimir force, the Lamb shift, and the anomalous magnetic moment, to be published elsewhere.
Abstract: This essay presents a reflection on the main implications of Complexity Theory for science in general, redefining and dispelling myths of traditional science, and for Sociology in particular, suggesting a redefinition of Parsons' classic concept of the Social System, articulated around the property of self-maintenance of order rather than around its possible discontinuity and instability. It argues that Complexity Theory has established the limits of Classic Science, leading to a more realistic awareness of the working and evolution mechanisms of Natural and Social Systems and showing the limits of our capacity to predict and control events. Dissipative structures have shown the creative role of time. Instability, emergence, surprise, and unpredictability are the rule rather than the exception when systems move away from equilibrium (entropy), even if these processes are generated by a system's deterministic working mechanisms. Therefore, we have come to realize how constructive the contribution of Complexity is with regard to the long-lasting problem of the relationship between order and disorder. Today, the terms of this relationship have been re-specified as an inter-relationship, a unicum that finds its synthesis in the concepts of self-organization and deterministic chaos. From this perspective, as Prigogine suggested, studies on Complex Systems are heading toward a historical, biological conception of Physics, and a new alliance between natural systems and living, social systems. Non-linearity, far-from-equilibrium self-organization, emergence, and surprise meet at all levels, as this paper attempts to highlight. In Sociology, insights from Complexity Theory have contributed to a new way of thinking about social systems, re-addressing some fundamental issues starting from the concepts of social system, emergence, and change.
The current conception of social systems as complex dynamical systems is supported by the profitable use of non-linear models (in particular, the logistic map) in the study of social processes.
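The logistic map mentioned above is compact enough to state in a few lines: x_{n+1} = r·x_n·(1 − x_n), whose behavior shifts from a stable fixed point (order) to deterministic chaos as the parameter r grows. A minimal sketch; the parameter values below are illustrative, not taken from any cited study:

```python
# Iterate the logistic map x_{n+1} = r * x_n * (1 - x_n), the non-linear
# model the abstract mentions in connection with social processes.

def logistic_trajectory(r, x0, n):
    """Return the first n iterates of the logistic map from x0."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# r = 2.5: the orbit settles to the stable fixed point 1 - 1/r = 0.6 (order).
stable = logistic_trajectory(2.5, 0.2, 200)
# r = 4.0: the orbit never settles (deterministic chaos from a deterministic rule).
chaotic = logistic_trajectory(4.0, 0.2, 200)
```

The same deterministic rule thus produces both regular and effectively unpredictable behavior, which is precisely the order/disorder point the abstract makes.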
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 10871168).
Abstract: Based on forbidden patterns in symbolic dynamics, symbolic subsequences are classified and the relations between forbidden patterns, correlation dimensions, and complexity measures are studied. A complexity measure approach is proposed in order to separate deterministic (usually chaotic) series from random ones and to measure the complexities of different dynamical systems. The complexity is related to the correlation dimensions, and the algorithm is simple and suitable for time series with noise. In this paper, the complexity measure method is used to study the dynamics of the Logistic map and the Hénon map with multiple parameters.
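The forbidden-pattern idea can be sketched as counting which ordinal (rank) patterns of a fixed order never occur in a series: deterministic chaotic series exhibit forbidden patterns, while sufficiently long random series realize all of them. A minimal sketch, not the paper's exact measure, using order-3 patterns and the Logistic map at r = 4:

```python
from math import factorial

def ordinal_patterns(series, order=3):
    """Set of ordinal (rank) patterns of the given order occurring in the series."""
    seen = set()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Rank pattern: indices of the window sorted by value, ascending.
        seen.add(tuple(sorted(range(order), key=lambda k: window[k])))
    return seen

def forbidden_count(series, order=3):
    """Number of possible ordinal patterns that never occur ("forbidden patterns")."""
    return factorial(order) - len(ordinal_patterns(series, order))

# Logistic map at r = 4: deterministic chaos. Three successive decreases
# (pattern (2, 1, 0)) can be shown analytically never to occur.
xs = [0.1]
for _ in range(2000):
    xs.append(4 * xs[-1] * (1 - xs[-1]))
print(forbidden_count(xs))  # one forbidden pattern of order 3
```

A genuinely random series of the same length would, with overwhelming probability, realize all six order-3 patterns, so a positive forbidden count separates deterministic from random series as the abstract describes.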
Funding: Supported by the National High Technology Research and Development Program of China (No. 2008AA01A201) and the National Natural Science Foundation of China (No. 60503015).
Abstract: On the basis of complex network theory, the issue of key nodes in Wireless Sensor Networks (WSN) is discussed. A model expression of sub-network fault in WSN is given first; subsequently, the concepts of average path length and clustering coefficient are introduced. Based on these two concepts, a novel attribute description of key nodes related to sub-networks is proposed. Moreover, in terms of node deployment density and transmission range, the concepts of single-point key nodes and generalized key nodes of WSN are defined, and their decision theorems are investigated.
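The two concepts this description relies on can be computed directly with breadth-first search: average path length is the mean shortest-path distance over connected node pairs, and a node's clustering coefficient is the fraction of its neighbour pairs that are themselves linked. A stdlib sketch on a hypothetical four-node topology (not an actual WSN deployment):

```python
from collections import deque

def average_path_length(adj):
    """Mean shortest-path length over all connected ordered pairs (BFS per node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t, d in dist.items():
            if t != s:
                total += d
                pairs += 1
    return total / pairs

def clustering_coefficient(adj, node):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2 * links / (k * (k - 1))

# Toy topology: a triangle {0, 1, 2} with a pendant node 3 attached to 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(average_path_length(adj))       # 16 hops / 12 ordered pairs
print(clustering_coefficient(adj, 2))  # 1 of 3 neighbour pairs linked
```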
Abstract: This study employs Q methodology to explore the developmental routines of oral English ability for 12 English-major students in China, inspired by Complex and Dynamic Systems Theory (CDST). The data analysis suggests the following findings: (1) two developmental patterns emerge, gradual improvement and strong phase shift, influenced by internal and external factors through interactions among different subsystems; (2) guided by CDST, the study confirms the importance of self-organization and initial conditions noted in previous studies. In light of these findings, it is suggested that teachers form a holistic view of students' oral English development that takes its non-linear character and individual differences into account.
Abstract: While conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory, digital forensics examiners rarely if ever do so. In this paper, we review some of the quantitative tools and techniques available for use in digital forensic investigations, including Bayesian networks, complexity theory, information theory, and probability theory, and indicate how they may be used to obtain likelihood ratios or odds ratios for the relative plausibility of alternative explanations for the creation of the recovered digital evidence. The potential benefits of such quantitative measures for modern digital forensics are also outlined.
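The likelihood-ratio machinery the abstract refers to reduces to two lines of arithmetic: LR = P(E | H1) / P(E | H2), and posterior odds = prior odds × LR. A minimal sketch with purely hypothetical probabilities (the hypotheses and numbers are illustrative, not drawn from any case):

```python
def likelihood_ratio(p_e_given_h1, p_e_given_h2):
    """LR = P(E | H1) / P(E | H2): how strongly evidence E favours H1 over H2."""
    return p_e_given_h1 / p_e_given_h2

def posterior_odds(prior_odds, lr):
    """Bayes' theorem in odds form: posterior odds = prior odds x likelihood ratio."""
    return prior_odds * lr

# Hypothetical example: a file's timestamp pattern is far more probable if the
# suspect copied the file (H1) than if malware planted it (H2).
lr = likelihood_ratio(0.8, 0.05)    # LR = 16: evidence favours H1
print(posterior_odds(1.0, lr))      # from even prior odds: 16.0
```

An LR above 1 shifts the odds toward H1; forensic practice then maps the magnitude onto a verbal scale (e.g. "moderate" vs "strong" support), which is exactly the kind of quantification the paper argues digital forensics lacks.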
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61372076, 61971348, and 62001351), the Foundation of Shaanxi Key Laboratory of Information Communication Network and Security (Grant No. ICNS201802), the Natural Science Basic Research Program of Shaanxi, China (Grant No. 2021JM-142), and the Key Research and Development Program of Shaanxi Province, China (Grant No. 2019ZDLGY09-02).
Abstract: Verification of quantum computations is crucial since quantum systems are extremely vulnerable to the environment. However, directly verifying the output of a quantum computation is difficult, since efficiently simulating a large-scale quantum computation on a classical computer is generally believed to be impossible. To overcome this difficulty, we propose a self-testing system for quantum computations, which can be used to verify whether a quantum computation has been performed correctly. Our basic idea is to use some extra ancilla qubits to test the output of the computation. We design two kinds of permutation circuits into the original quantum circuit: one is applied on the ancilla qubits, whose output indicates the testing information; the other is applied on all qubits (including ancilla qubits), aiming to uniformly permute the positions of all qubits. We show that both permutation circuits are easy to achieve. In this way, we prove that any quantum computation has an efficient self-testing system. Finally, we discuss the relation between our self-testing system and interactive proof systems, and show that the two systems are equivalent if the verifier is allowed some quantum capability.
Funding: Support from grant MSM00211622419 is acknowledged.
Abstract: Quantum information processing and communication (QIPC) is an area of science that has two main goals. On one side, it tries to explore the (still not well known) potential of quantum phenomena for (efficient and reliable) information processing and (efficient, reliable and secure) communication. On the other side, it tries to use quantum information storing, processing and transmitting paradigms, principles, laws, limitations, concepts, models and tools to get deeper insights into the phenomena of the quantum world and to find efficient ways to describe and handle/simulate various complex physical phenomena. In order to do that, QIPC has to use the concepts, models, theories, methods and tools of both physics and informatics. The main role of physics here is to discover primitive physical phenomena that can be used to design and maintain complex and reliable information storing, processing and transmitting systems. The main role of informatics is, on one side, to explore, from the information processing and communication point of view, the limitations and potential of quantum information processing and communication technology, and to prepare information processing methods that could utilise that potential. On the other side, the main role of informatics is to guide and support, with theoretical tools and outcomes, physics-oriented research in QIPC. This paper describes and analyses a variety of ways in which informatics contributes, and should/could contribute, to the development of QIPC; see also Gruska (1999, 2006, 2008).
Funding: Partially supported by the National Key R&D Program of China (2020AAA0105200, 2018AAA01012600), the National Natural Science Foundation of China (61876215), the Beijing Academy of Artificial Intelligence (BAAI), and in part by the Science and Technology Major Project of Guangzhou (202007030006) and Pengcheng Laboratory; partially funded by the Ministry of Education, Singapore, under contract RG19/20; and partly supported by the Future Resilient Systems Project (FRS-II) at the Singapore-ETH Centre (SEC), funded by the National Research Foundation of Singapore (NRF).
Abstract: The number of available control sources is a limiting factor in many network control tasks. A lack of input sources can result in compromised controllability and/or sub-optimal network performance, as noted in engineering applications such as smart grids. The mechanism can be explained by a linear time-invariant model, where structural controllability sets a lower bound on the number of required sources. Inspired by the ubiquity of time-varying topologies in the real world, we propose the strategy of spatiotemporal input control to overcome the source-related limit by exploiting temporal variation of the network topology. We theoretically prove that under this regime, the required number of sources can always be reduced to 2. It is further shown that the cost of control depends on two hyperparameters, the number of sources and the number of intervals, in a trade-off fashion. As a demonstration, we achieve controllability over a complex network resembling the nervous system of Caenorhabditis elegans using as few as 6% of the sources predicted by a static control model. This example underlines the potential of utilizing topological variation in complex network control problems.
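The static lower bound the abstract invokes comes from structural controllability: for a directed network, the minimum number of driver nodes equals N minus the size of a maximum matching (with a floor of 1). A stdlib sketch of this static baseline, which the paper's spatiotemporal scheme improves on; the example graphs are hypothetical:

```python
def max_matching(n, edges):
    """Maximum matching of the bipartite out-copy -> in-copy graph (Kuhn's algorithm)."""
    succ = {u: [] for u in range(n)}
    for u, v in edges:
        succ[u].append(v)
    match = {}  # in-copy -> matched out-copy

    def augment(u, seen):
        for v in succ[u]:
            if v in seen:
                continue
            seen.add(v)
            # v is free, or its current partner can be re-routed elsewhere.
            if v not in match or augment(match[v], seen):
                match[v] = u
                return True
        return False

    return sum(augment(u, set()) for u in range(n))

def min_driver_nodes(n, edges):
    """Static structural-controllability bound: N_D = max(N - |M*|, 1)."""
    return max(n - max_matching(n, edges), 1)

# A directed chain 0 -> 1 -> 2 -> 3 needs a single driver node (its head).
print(min_driver_nodes(4, [(0, 1), (1, 2), (2, 3)]))  # 1
# A star 0 -> {1, 2, 3}: only one matching edge, so N_D = 4 - 1 = 3.
print(min_driver_nodes(4, [(0, 1), (0, 2), (0, 3)]))  # 3
```

The star case illustrates why source count can be the bottleneck in a static topology, and hence why exploiting temporal variation to push the requirement down to 2 is significant.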
Funding: This work was financially supported by a grant from the National Basic Research Program of China (973 Program) (No. 2012CB215204) and the Key Project of the CAS Knowledge Innovation Program "Research and demonstration of the coordinated control system based on multi-complementary energy storage" (No. KGCX2-EW-330).
Abstract: Cascading failure is a potential threat in power systems with the large-scale development of wind power, especially for the large-scale grid-connected and long-distance transmission wind power bases in China. This paper introduces complex network theory (CNT) for cascading failure analysis considering wind farm integration. A cascading failure power flow analysis model for complex power networks is established with improved network topology principles and methods. The network load and boundary conditions are determined to reflect the operational states of power systems. Three typical network evaluation indicators are used to evaluate the topological characteristics of the power network before and after a malfunction: connectivity level, global effective performance, and percentage of load loss (PLL). The impacts of node removal, grid current tolerance capability, instantaneous wind power penetration, and wind farm coupling points on the power grid are analyzed based on the IEEE 30-bus system. Through the simulation analysis, the occurrence mechanism and main influencing factors of cascading failure are determined. Finally, corresponding defense strategies are proposed to reduce the hazards of cascading failure in power systems.
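Of the three indicators, global effective performance (often called global efficiency) can be sketched directly: it averages the reciprocal shortest-path lengths over all node pairs, so disconnected pairs contribute zero and any fragmentation lowers the score. A minimal stdlib illustration, assuming the standard E = (1/(N(N−1))) Σ 1/d(i, j) formulation; the toy adjacency dicts are hypothetical, not the IEEE 30-bus system:

```python
from collections import deque

def global_efficiency(adj):
    """E = (1 / N(N-1)) * sum of 1/d(i, j) over ordered pairs; unreachable pairs add 0."""
    n = len(adj)
    eff = 0.0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:                      # BFS shortest hop-distances from s
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        eff += sum(1 / d for t, d in dist.items() if t != s)
    return eff / (n * (n - 1))

# A 5-node path, and the same grid after the bridge node 2 fails.
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
damaged = {0: {1}, 1: {0}, 3: {4}, 4: {3}}
print(global_efficiency(path), global_efficiency(damaged))  # ~0.642 vs ~0.333
```

Tracking this quantity before and after each failure step is one way to quantify how a cascade degrades the network, alongside connectivity level and PLL.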
Funding: Supported by the Fundamental Research Funds for the Central Universities, Northeast Forestry University (2572018CP06, 2572017CA12).
Abstract: Background: Urban green infrastructure (GI) networks play a significant role in ensuring regional ecological security; however, they are highly vulnerable to the influence of urban development, and the optimization of GI networks for better connectivity and resilience under different development scenarios has become a practical problem that urgently needs to be solved. Taking Harbin, a megacity in Northeast China, as the case study, we set five simulation scenarios by adjusting the economic growth rate and extracted the GI network in multiple scenarios by integrating the minimal cumulative resistance model and the gravity model. The low-degree-first (LDF) strategy of complex network theory was introduced to optimize the GI network, and the optimization effect was verified by robustness analysis. Results: The results showed that in the 5% economic growth scenario, the GI network structure was more complex and the connectivity of the network was better, while in the other scenarios, the network structure gradually degraded with economic growth. After optimization by the LDF strategy, the average degree of the GI network in the multiple scenarios increased from 2.368, 2.651, 2.189, 1.972, and 1.847 to 2.783, 3.125, 2.643, 2.414, and 2.322, respectively, and GI network connectivity and resilience were significantly enhanced in all scenarios. Conclusions: Economic growth did not necessarily lead to degradation of the GI network; there was still room for economic development in the study area, though limited under existing GI conditions, and the LDF strategy was an effective method of optimizing the GI network. The results provide a new perspective for the study of GI network protection under urban economic growth and serve as a methodological reference for urban GI network optimization.
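A low-degree-first strategy can be sketched greedily: repeatedly add an edge between the two lowest-degree nodes that are not yet adjacent, which raises the average degree while reinforcing the network's weakest members first. A minimal sketch on a hypothetical star network, assuming this greedy edge-addition reading of the LDF strategy rather than the paper's exact procedure:

```python
def average_degree(adj):
    """Average degree 2E/N of an undirected graph given as {node: set(neighbours)}."""
    return sum(len(v) for v in adj.values()) / len(adj)

def low_degree_first(adj, new_edges):
    """LDF sketch: repeatedly link the two lowest-degree, not-yet-adjacent nodes."""
    for _ in range(new_edges):
        order = sorted(adj, key=lambda n: len(adj[n]))  # lowest degree first
        done = False
        for i in range(len(order)):
            for j in range(i + 1, len(order)):
                u, v = order[i], order[j]
                if v not in adj[u]:
                    adj[u].add(v)
                    adj[v].add(u)
                    done = True
                    break
            if done:
                break
    return adj

# A star network: the hub has degree 4, every leaf degree 1, so LDF links leaves.
adj = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
print(average_degree(adj))           # 1.6 before optimization
adj = low_degree_first(adj, 2)
print(average_degree(adj))           # 2.4 after adding two leaf-leaf edges
```

Because new links bypass the hub, removing the hub no longer isolates every leaf, which is the robustness gain the abstract's robustness analysis verifies.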
Abstract: Unlike conventional forensics, digital forensics does not at present generally quantify the results of its investigations. It is suggested that digital forensics should aim to catch up with other forensic disciplines by using Bayesian and other numerical methodologies to quantify the results of its investigations. Assessing the plausibility of alternative hypotheses (or propositions, or claims) that explain how recovered digital evidence came to exist on a device could assist both the prosecution and the defence in criminal proceedings: helping the prosecution decide whether to proceed to trial and helping defence lawyers advise a defendant how to plead. This paper reviews some numerical approaches to the goal of quantifying the relative weights of individual items of digital evidence and the plausibility of hypotheses based on that evidence. The potential advantages for the construction of cost-effective digital forensic triage schemas are also outlined.
Funding: Supported by TECNM-Mexico (No. 6520.18-P) and the Ministry of Economy and Competitiveness, Spain (No. ENE2016-77172-R).
Abstract: One of the most critical issues in the evaluation of power systems is the identification of critical buses. For this purpose, this paper proposes a new methodology that evaluates the substitution of the power flow technique by the geodesic vulnerability index to identify critical nodes in power systems. Both methods are applied comparatively to demonstrate the scope of the proposed approach. The applicability of the methodology is illustrated using the IEEE 118-bus test system as a case study. To identify the critical components, a node is initially disconnected, and the performance of the resulting topology is evaluated against simulations of multiple cascading faults. Cascading events are simulated by randomly removing assets from a system that continually changes its structure with the elimination of each component. Thus, the classification of the critical nodes is determined by evaluating the resulting performance of 118 different topologies and calculating the damaged area for each of the disintegration curves of cascading failures. In summary, the feasibility and suitability of complex network theory for identifying critical nodes in power systems are demonstrated.
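One common way to operationalize the node-disconnection step described above is to rank each node by the drop in geodesic efficiency (the sum of reciprocal shortest-path distances) its removal causes, in the spirit of the geodesic vulnerability index. A stdlib sketch on a hypothetical 5-node graph, assuming this efficiency-drop formulation rather than the paper's exact index:

```python
from collections import deque

def efficiency(adj, removed=frozenset()):
    """Sum of 1/d(i, j) over ordered pairs, ignoring removed nodes (BFS distances)."""
    total = 0.0
    for s in adj:
        if s in removed:
            continue
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in removed and v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1 / d for t, d in dist.items() if t != s)
    return total

def rank_critical_nodes(adj):
    """Rank nodes by vulnerability-style impact: 1 - E(without node) / E(base)."""
    base = efficiency(adj)
    impact = {n: 1 - efficiency(adj, frozenset({n})) / base for n in adj}
    return sorted(impact, key=impact.get, reverse=True)

# Two triangles joined only at node 2: the bridge node should rank most critical.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3, 4}, 3: {2, 4}, 4: {2, 3}}
print(rank_critical_nodes(adj)[0])  # 2
```

Unlike a power flow study, this ranking needs only the topology, which is what makes the complex-network substitution attractive for fast screening.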
Funding: Supported by the Graduate Innovation Project of Beijing Jiaotong University (No. 2020YJS098).
Abstract: Vehicle information on high-speed trains can not only determine whether the various parts of the train are working normally but also predict the train's future operating status. How to obtain valuable information from massive vehicle data is a difficult problem. First, we divide the vehicle data of a high-speed train into 13 subsystem datasets according to the functions of the collection components. Then, based on gray theory and the Granger causality test, we propose the Gray-Granger Causality (GGC) model, which can construct a vehicle information network on the basis of the correlations between the collection components. By using complex network theory to mine the vehicle information network and its subsystem networks, we find that they have the characteristics of a scale-free network. In addition, the vehicle information network is weak against attacks, whereas the subsystem networks are closely connected and robust against attacks.
Funding: This work was supported by the National Natural Science Foundation of China under Grant Nos. 62071106, U1809215, and 61771104, and by Key R&D Projects in Sichuan Province under Grant No. 21ZDYF3857.
Abstract: The bond, vibration, and microwave dielectric characteristics of Zn_(1-x)(Li_(0.5)Bi_(0.5))_(x)WO_(4) (x = 0-0.12) ceramics were investigated by XRD refinement, Raman and FT-IR spectroscopy, and complex bond valence theory. The results showed that proper substitution of (Li_(0.5)Bi_(0.5))^(2+) can improve the sintering characteristics and microwave dielectric properties of ZnWO_(4). The increase of the Q×f value was mainly attributed to a dense and uniform microstructure, and the subsequent decrease resulted from the deterioration of structural stability and relative density. According to complex bond valence theory, the chemical bond characteristics of ZnWO_(4) and Bi_(2)WO_(6) played an important role in the dielectric properties of the samples. Additionally, the sample (x = 0.02) sintered at 900 °C showed satisfactory properties: ε_(r) = 15.332, Q×f = 35,762 GHz, and τ_(f) = -65 ppm/°C, making it a potential candidate material for LTCC applications.