Dominant technology formation is the key for the high-tech industry to "cross the chasm" and gain an established foothold in the market (and hence disrupt the regime). Therefore, a stimulus-response model is proposed to investigate dominant technology by exploring its formation process and mechanism. Specifically, based on complex adaptive system theory and the basic stimulus-response model, we use a combination of agent-based modeling and system dynamics modeling to capture the interactions between dominant technology and the socio-technical landscape. The results indicate the following: (i) The dynamic interaction is "stimulus-reaction-selection", which promotes the dominant technology's formation. (ii) The dominant technology's formation can be described as a dynamic process in which the adaptation intensity of technology standards increases continuously until one becomes the leading technology under the dual action of internal and external mechanisms. (iii) The dominant technology's formation in the high-tech industry is influenced by learning ability, the number of adopting users, and adaptability. A "critical scale" of learning ability exists that promotes the formation of leading technology; a large number of adopting users can promote the dominant technology's formation by influencing both the adaptive response of technology standards to the socio-technical landscape and the landscape's selection among technology standards; and the role of adaptability has both a minimum and a maximum threshold. (iv) The socio-technical landscape can promote the shaping of leading technology in the high-tech industry, and its different elements have different effects. This study advances research on the formation mechanism of dominant technology in the high-tech industry, presents new perspectives and methods for researchers, and offers essential guidance for managers formulating technology strategies.
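The combined agent-based and stimulus-response dynamics described above can be hinted at with a toy diffusion model. This is purely our illustration, not the paper's model: agent count, learning rate, and the stimulus formula are all hypothetical choices.

```python
import random

def simulate_adoption(n_agents=200, steps=60, learning=0.05, seed=3):
    """Minimal stimulus-response sketch (illustrative, not the paper's model):
    each period, every non-adopter adopts with a probability combining an
    internal learning rate and an external stimulus proportional to the
    current adopter share. Returns the adopter count per period."""
    random.seed(seed)
    adopters = 1
    history = [adopters]
    for _ in range(steps):
        share = adopters / n_agents
        p = learning + 0.5 * share  # internal + external stimulus
        new = sum(1 for _ in range(n_agents - adopters) if random.random() < p)
        adopters += new
        history.append(adopters)
    return history

h = simulate_adoption()  # adoption grows toward saturation over time
```

With these parameters the adopter count rises monotonically toward the whole population, the S-shaped trajectory typical of dominant-technology formation.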
The electromagnetic force, strong nuclear force, weak nuclear force, and gravitational force are the four fundamental forces of nature. The Standard Model (SM) succeeded in combining the first three forces to describe the most basic building blocks of matter and govern the universe. Despite the model's great success in resolving many issues in particle physics, it still has several setbacks and limitations. The model fails to incorporate the fourth force, gravity. It implies that all fermions and bosons are massless, contrary to experimental facts. In addition, the model addresses neither the 95% of the universe's energy attributed to Dark Matter (DM) and Dark Energy (DE) nor the universe's expansion. The Complex Field Theory (CFT) identifies DM and DE as complex fields of complex masses and charges that encompass the whole universe and pervade all matter. This presumption resolves the issue of the failure to detect DM and DE over the last five decades. The theory also presents a model for the universe's expansion and presumes that every material object carries a fraction of this complex field proportional to its mass. These premises explain the physical nature of the gravitational force and its complex field and pave the way for gravity to enter the SM. On the other hand, to solve the issue of massless bosons and fermions in the SM, the Higgs mechanism introduces a pure and abstract theoretical model of four unimaginable potentials to generate fictitious bosons as mass donors to fermions and to the W± and Z bosons. The CFT in this paper introduces, for the first time, a physical explanation of the mystery of particle mass formation rather than Higgs' purely mathematical derivations. The analyses uncover the mystery of electron-positron production near heavy nuclei and never in a vacuum. In addition, they place a constraint on Einstein's mass-energy equation: energy can never be converted to mass without the presence of dense dark matter, so the conversion cannot hold in a vacuum. Furthermore, CFT provides different perspectives on, and resolutions of, real-world physics concepts such as the nuclear force, the Casimir force, the Lamb shift, and the anomalous magnetic moment, to be published elsewhere.
This study, through a re-conceptualization of sociological complexity theory's epistemological sources, specifically in Edgar Morin's formulation, sheds light on the theoretical models as well as the empirical methodologies of sociological analysis of today's complex, interconnected, diverse and globalized society and global disorder. Complexity theory leads to a shift in perspective and a transformation of the epistemological status of the social sciences, with an in-depth intervention of disorder, contingency, chance, the singular, and the non-repeatable in sociological analysis. The notion of dialogic interplay is placed at the paradigm level and stands out at the heart of the concepts, analyzing the social system as auto-eco-organizer. Similarly, the notion of 'emergence' at macro-micro levels imposes itself as complex, logically requiring that simple, linear thinking and models of explanation be overcome in favor of the perspective of organizational recursivity, in which the product retroacts by transforming the one producing it, conceiving a circularity of co-production between individuals and society through interactions. Translating this epistemology and sociological complexity theory into the empirical methodological setting, the complex sociological approach is phenomenon-, event/information- and crisis-centered, privileging observation, participation-intervention, and 'live inquiry'. The open, in-depth and possibly non-directive interview is part of clinical sociological methodology, raising the question of the observer-phenomenon-observed relation.
This essay presents a reflection on the main implications of Complexity Theory for science in general, redefining and dispelling myths of traditional science, and for Sociology in particular, suggesting a redefinition of Parsons' classic concept of the Social System, articulated around the property of self-maintenance of order rather than around its possible discontinuity and instability. It argues that Complexity Theory has established the limits of Classic Science, leading to a more realistic awareness of the working and evolution mechanisms of Natural and Social Systems and showing the limits of our capacity to predict and control events. Dissipative structures have shown the creative role of time. Instability, emergence, surprise and unpredictability are the rule rather than the exception when systems move away from equilibrium (entropy), even if these processes are generated from a system's deterministic working mechanisms. Therefore, we have come to realize how constructive the contribution of Complexity is with regard to the long-standing problem of the relationship between order and disorder. Today, the terms of this relationship have been re-specified in a new configuration of inter-relationship, according to a unicum which finds its synthesis in the concepts of self-organization and deterministic chaos. From this perspective, as Prigogine suggested, studies on Complex Systems are heading toward a historical, biological conception of Physics, and a new alliance between natural systems and living, social systems. Non-linearity, far-from-equilibrium self-organization, emergence and surprise meet at all levels, as this paper attempts to highlight.
In Sociology, insights of Complexity Theory have contributed to a new way of thinking about social systems by re-addressing some fundamental issues, starting from the concepts of social system, emergence and change. The current conception of social systems as complex dynamical systems is supported by a profitable use of non-linear models (in particular, the logistic map) in the study of social processes.
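The logistic map invoked above as a non-linear model is simple to iterate. A minimal sketch (parameter and seed values are illustrative):

```python
# Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t).
def logistic_orbit(r, x0, n):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# r = 2.5 settles to the fixed point 1 - 1/r = 0.6;
# r = 4.0 behaves chaotically while staying inside [0, 1].
stable = logistic_orbit(2.5, 0.2, 100)
chaotic = logistic_orbit(4.0, 0.2, 100)
```

The qualitative change of regime as r varies is what makes the map a compact illustration of order-to-chaos transitions in social processes.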
A new method is proposed to transform the time series gained from a dynamic system into a symbolic series which extracts both overall and local information of the time series. Based on the transformation, two measures are defined to characterize the complexity of the symbolic series. The measures reflect the sensitive dependence of chaotic systems on initial conditions and the randomness of a time series, and thus can distinguish periodic or completely random series from chaotic time series even when the time series are not long. Finally, the logistic map and the two-parameter Hénon map are studied, and the results are satisfactory.
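The abstract does not spell out the paper's transformation or measures, so the sketch below is only a hedged illustration of the general idea: coarse-grain a series into binary symbols at its median and use the Shannon entropy of short symbol words as one simple complexity measure. Threshold choice, word length, and map parameters are our own illustrative assumptions.

```python
import math

def symbolize(series):
    """Map a real-valued series to a binary string by thresholding at the
    median (a common coarse-graining; the paper's own transform differs)."""
    med = sorted(series)[len(series) // 2]
    return ''.join('1' if x > med else '0' for x in series)

def word_entropy(symbols, L=3):
    """Shannon entropy (bits) of the length-L word distribution."""
    words = [symbols[i:i + L] for i in range(len(symbols) - L + 1)]
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def logistic(r, x0, n):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Chaotic (r = 4.0) vs. period-2 (r = 3.2) logistic-map series:
chaotic = word_entropy(symbolize(logistic(4.0, 0.3, 500)))
periodic = word_entropy(symbolize(logistic(3.2, 0.3, 500)))
```

Even for these fairly short series the chaotic orbit yields a markedly higher word entropy than the periodic one, mirroring the discrimination ability the abstract claims.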
On the basis of complex network theory, the issues of key nodes in Wireless Sensor Networks (WSN) are discussed. A model expression of sub-network fault in WSN is given first; subsequently, the concepts of average path length and clustering coefficient are introduced. Based on these two concepts, a novel attribute description of key nodes related to sub-networks is proposed. Moreover, in terms of node deployment density and transmission range, the concepts of single-point key nodes and generalized key nodes of WSN are defined, and their decision theorems are investigated.
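The two network indicators named above are easy to compute from an adjacency structure. A self-contained sketch (the toy topology is our illustrative example, not a real sensor deployment):

```python
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all connected node pairs (BFS per node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t, d in dist.items():
            if t != s:
                total += d
                pairs += 1
    return total / pairs

def clustering_coefficient(adj, v):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = list(adj[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2 * links / (k * (k - 1))

# Toy 4-node sensor topology: triangle 0-1-2 plus a pendant node 3 on 0.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
```

Here node 0 has clustering coefficient 1/3 (only one of its three neighbour pairs is linked) and the network's average path length is 4/3, the kind of per-node attributes a key-node criterion can build on.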
This study employs Q methodology to explore the developmental routes of oral English ability for 12 English-major students in China, inspired by Complex and Dynamic Systems Theory (CDST). The data analysis suggests the following findings: (1) two developmental patterns emerge, gradual improvement and strong phase shift, influenced by internal and external factors through interactions among different subsystems; (2) guided by CDST, the study confirms the importance of self-organization and initial conditions noted in previous studies. Based on these findings, teachers are strongly encouraged to form a holistic view of students' oral English development that takes into account its non-linear character and individual differences.
This paper presents a qualitative study investigating the dynamics of second language (L2) learning strategies under the guidance of complexity theory. A group of Chinese undergraduate students studying at an international university in Thailand were selected as the research participants. Research instruments include interviews, observations, records of participants' online chats and posts, and a research journal. The findings indicate that the changes in the participants' strategies for learning English exhibit typical features of a complex system. The study provides implications for probing the nature of L2 strategy and for applying complexity theory to future research on L2 strategies.
Complexity is one of the leading features of modern control systems. It arises from the complex properties of controlled plants and the varied requirements placed on controller designs. In dealing with control problems involving complexity, a lot of papers have been published over the past two decades reporting fruitful results from different theoretical backgrounds and with different methodologies, such as differential-geometry-based design methods, hybrid system theory, switching control, and neural-network-based intelligent control.
Based on forbidden patterns in symbolic dynamics, symbolic subsequences are classified and the relations between forbidden patterns, correlation dimensions and complexity measures are studied. A complexity measure approach is proposed in order to separate deterministic (usually chaotic) series from random ones and to measure the complexities of different dynamic systems. The complexity is related to the correlation dimensions, and the algorithm is simple and suitable for time series with noise. In the paper, the complexity measure method is used to study the dynamic systems of the logistic map and the multi-parameter Hénon map.
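As an illustration of forbidden patterns (a sketch of the concept, not the paper's own algorithm), the snippet below counts the six length-3 ordinal patterns of a series. For the fully chaotic logistic map, the strictly decreasing triple x_t > x_{t+1} > x_{t+2} provably never occurs (a decrease requires x > 3/4, whose image lies below 3/4), whereas white noise exhibits all six patterns.

```python
from itertools import permutations
import random

def ordinal_patterns(series, m=3):
    """Count order patterns of m consecutive values (Bandt-Pompe style).
    A pattern absent from a long deterministic series is 'forbidden'."""
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        # indices of the window sorted by value, smallest first
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        counts[pattern] += 1
    return counts

def logistic(x0, n, r=4.0):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

chaotic = ordinal_patterns(logistic(0.3, 2000))
random.seed(1)
noise = ordinal_patterns([random.random() for _ in range(2000)])
# chaotic[(2, 1, 0)] stays at zero (forbidden), while every pattern
# occurs in the noise series.
```

The presence of a forbidden pattern is exactly the deterministic signature the complexity-measure approach exploits to separate chaos from randomness.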
The molecular structures of fifteen possible 2-thioxanthine (2TX) complexes with one Hg^2+ and two Cl^- ions were fully optimized using the density functional theory B3PW91/6-311++G^** method. The effective pseudo-potential LANL2DZ basis set was used for the metal Hg^2+ ion. Vibrational analysis was also carried out at the same level. The bond lengths, bond angles, zero-point energies, Gibbs free energies, thermodynamic energies and relative energies of all the complexes were obtained. NBO analysis of natural charges and second-order perturbation energies was carried out for the three stable complexes, and the IR spectra of two complexes were assigned against experimental data. The results show that 2-thioxanthine complexes with one Hg^2+ and two Cl^- ions are formed, and the complexes arising from the thione tautomer are more stable than those from the thiol ones. The order of the three complexes with the lowest relative energies is 2TX(1,3,7)-Hg^2+-2, 2TX(1,3,7)-Hg^2+-1 and 2TX(1,3,9)-Hg^2+. The calculated IR spectra of the two complexes agreed with the experimental data.
Recent years have witnessed a rapid growth of interest in the dynamic behavior of replenishment rules underlying the bullwhip effect. We show that the bullwhip effect and the butterfly effect share the same self-oscillation amplifying mechanism: the supplier's ordering decisions amplify, through self-oscillation, the perturbations introduced by errors in processing retailers' demand information. Owing to this nonlinear mechanism, the sensitivity of the system to initial values exhibits an explicit self-similar structure. In this paper, the causal process of the bullwhip effect is described as an internal nonlinear mechanism, and the complexity of the bullwhip effect is studied for the order-up-to policy under demand signal processing. The methodology is based on fractal and chaos theory and allows important insights to be gained into the complex behavior of the bullwhip effect.
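The order-up-to policy with demand signal processing can be sketched in a few lines. This is a simplified illustration with exponential-smoothing forecasts; the smoothing constant, lead time, and demand distribution are our illustrative assumptions, not the paper's parameters. The ratio of order variance to demand variance exceeding 1 is the bullwhip amplification.

```python
import random

def simulate_order_up_to(demands, alpha=0.3, lead_time=2):
    """Order-up-to policy: each period the retailer orders the observed
    demand plus the change in the base-stock level S_t = (L+1) * forecast,
    with the forecast updated by exponential smoothing."""
    forecast = demands[0]
    orders = []
    for d in demands:
        prev_target = forecast * (lead_time + 1)
        forecast = alpha * d + (1 - alpha) * forecast
        target = forecast * (lead_time + 1)
        orders.append(max(0.0, d + target - prev_target))
    return orders

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

random.seed(7)
demand = [100 + random.gauss(0, 10) for _ in range(2000)]
orders = simulate_order_up_to(demand)
amplification = variance(orders) / variance(demand)  # > 1: bullwhip
```

Even with i.i.d. demand, the forecast-update term systematically inflates order variability, which is the amplification mechanism the abstract analyzes.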
Cascading failure is a potential threat in power systems with the large-scale development of wind power, especially for the large grid-connected, long-distance transmission wind power bases in China. This paper introduces complex network theory (CNT) into cascading failure analysis considering wind farm integration. A cascading failure power flow analysis model for complex power networks is established with improved network topology principles and methods. The network load and boundary conditions are determined to reflect the operational states of power systems. Three typical network evaluation indicators are used to evaluate the topological characteristics of the power network before and after malfunction: connectivity level, global effective performance, and percentage of load loss (PLL). The impacts of node removal, grid current tolerance capability, instantaneous wind power penetration, and wind farm coupling points on the power grid are analyzed based on the IEEE 30-bus system. Through the simulation analysis, the occurrence mechanism and main influence factors of cascading failure are determined. Finally, corresponding defense strategies are proposed to reduce the hazards of cascading failure in power systems.
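Global effective performance (graph efficiency) before and after a node removal can be sketched on a toy network. This is a simplified, unweighted illustration on a hypothetical 5-node ring with a chord, not the IEEE 30-bus model or the paper's power-flow-aware indicator.

```python
from collections import deque

def global_efficiency(adj):
    """Average of 1/d(i, j) over ordered node pairs; disconnected pairs
    contribute 0. One topology indicator for tracking degradation."""
    n = len(adj)
    total = 0.0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for t, d in dist.items() if t != s)
    return total / (n * (n - 1))

def remove_node(adj, x):
    """Network after a node failure: drop the node and its incident links."""
    return {u: {v for v in nbrs if v != x} for u, nbrs in adj.items() if u != x}

# Toy 5-bus ring with a chord; node 0 is the best-connected "hub".
adj = {0: {1, 2, 4}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {0, 3}}
before = global_efficiency(adj)
after = global_efficiency(remove_node(adj, 0))  # efficiency drops
```

Comparing the indicator across successive removals is the basic loop behind the node-removal impact analysis described above.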
Because of poor ground conditions, stoping methods of underhand headings with cemented fill were used in the Jinchuan No. 2 mine, and a paste fill system was set up. In order to evaluate the reliability of the new system, investigations and trial runs were carried out. More than 20 modifications or improvements related to the paste preparation subsystem, the pump and pipe subsystem, and the auto-control and monitoring subsystem were completed. The reliability of the paste fill system was analyzed using the theory of large complex systems, and the analysis is useful for reliability studies of paste fill systems.
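For a system whose subsystems must all function, such as the preparation, pump-and-pipe, and control/monitoring subsystems above, the simplest large-system reliability model is the series rule. The subsystem reliabilities below are purely hypothetical placeholders, not values from the study.

```python
def series_reliability(rels):
    """Reliability of subsystems in series: the system works only if every
    subsystem works, so R_sys is the product of the subsystem reliabilities."""
    r = 1.0
    for x in rels:
        r *= x
    return r

# Hypothetical reliabilities for the paste preparation, pump-and-pipe,
# and auto-control/monitoring subsystems:
r_sys = series_reliability([0.95, 0.90, 0.98])
```

The product form makes the weakest subsystem dominate overall reliability, which is why targeted improvements to individual subsystems, as reported above, raise system reliability.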
While conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory, digital forensics examiners rarely, if ever, do so. In this paper, we review some of the quantitative tools and techniques available for use in digital forensic investigations, including Bayesian networks, complexity theory, information theory and probability theory, and indicate how they may be used to obtain likelihood ratios or odds ratios for the relative plausibility of alternative explanations for the creation of the recovered digital evidence. The potential benefits of such quantitative measures for modern digital forensics are also outlined.
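The likelihood-ratio reasoning mentioned above reduces to two one-line formulas. The probabilities in the example are purely illustrative, not drawn from any real case.

```python
def likelihood_ratio(p_e_given_h1, p_e_given_h2):
    """LR = P(evidence | hypothesis 1) / P(evidence | hypothesis 2).
    Values above 1 favour hypothesis 1."""
    return p_e_given_h1 / p_e_given_h2

def posterior_odds(prior_odds, lr):
    """Bayes' rule in odds form: posterior odds = prior odds * LR."""
    return prior_odds * lr

# Hypothetical example: a timestamp pattern seen in 80% of deliberate
# downloads (H1) but in only 5% of malware-initiated ones (H2).
lr = likelihood_ratio(0.80, 0.05)   # about 16
odds = posterior_odds(1.0, lr)      # even prior, so posterior odds track LR
```

Reporting the LR alone, and leaving the prior odds to the court, is the division of labour the forensic-statistics literature recommends.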
Background: Urban green infrastructure (GI) networks play a significant role in ensuring regional ecological security; however, they are highly vulnerable to the influence of urban development, and optimizing GI networks for better connectivity and resilience under different development scenarios has become a practical problem that urgently needs to be solved. Taking Harbin, a megacity in Northeast China, as the case study, we set five simulation scenarios by adjusting the economic growth rate and extracted the GI network in each scenario by integrating the minimal cumulative resistance model and the gravity model. The low-degree-first (LDF) strategy from complex network theory was introduced to optimize the GI network, and the optimization effect was verified by robustness analysis. Results: The results showed that in the 5% economic growth scenario, the GI network structure was more complex and network connectivity was better, while in the other scenarios the network structure gradually degraded with economic growth. After optimization by the LDF strategy, the average degree of the GI network in the five scenarios increased from 2.368, 2.651, 2.189, 1.972, and 1.847 to 2.783, 3.125, 2.643, 2.414, and 2.322, respectively, and the connectivity and resilience of the GI network structure were significantly enhanced in all scenarios. Conclusions: Economic growth did not necessarily lead to degradation of the GI network; there was still room for economic development in the study area, although it was limited under existing GI conditions, and the LDF strategy was an effective method for optimizing the GI network. The research results provide a new perspective for the study of GI network protection under urban economic growth and serve as a methodological reference for urban GI network optimization.
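The average-degree indicator and a bare-bones low-degree-first heuristic can be sketched as follows. This is a simplified illustration: the study's full LDF strategy presumably also weighs ecological resistance and corridor feasibility, and the toy graph and edge budget below are our own assumptions.

```python
def average_degree(adj):
    """Mean node degree 2|E|/|V|, the indicator reported in the study."""
    return sum(len(nbrs) for nbrs in adj.values()) / len(adj)

def low_degree_first(adj, budget):
    """LDF-style sketch: repeatedly link the two lowest-degree nodes that
    are not yet connected, spending `budget` new edges."""
    adj = {u: set(nbrs) for u, nbrs in adj.items()}  # work on a copy
    for _ in range(budget):
        order = sorted(adj, key=lambda u: len(adj[u]))
        done = False
        for i, u in enumerate(order):
            for v in order[i + 1:]:
                if v not in adj[u]:
                    adj[u].add(v)
                    adj[v].add(u)
                    done = True
                    break
            if done:
                break
    return adj

# A small corridor network: path 0-1-2-3 with a spur 4 attached to node 1.
g = {0: {1}, 1: {0, 2, 4}, 2: {1, 3}, 3: {2}, 4: {1}}
g2 = low_degree_first(g, 2)  # average degree rises from 1.6 to 2.4
```

Prioritizing low-degree nodes lifts the network's minimum degree first, which is what makes the optimized network more robust to the random removal of peripheral patches.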
We briefly survey a number of important recent achievements in Theoretical Computer Science (TCS), especially Computational Complexity Theory. We discuss the PCP Theorem and its implications for the inapproximability of combinatorial optimization problems; space-bounded computation, especially the deterministic logspace algorithm for the undirected graph connectivity problem; the deterministic polynomial-time primality test; lattice complexity and worst-case to average-case reductions; pseudorandomness and extractor constructions; and Valiant's new theory of holographic algorithms and reductions.
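The deterministic primality result mentioned above is the AKS algorithm, whose full implementation is intricate. As a quick hedged illustration of deterministic primality testing in practice, the Miller-Rabin test below uses a fixed witness set that is known to be correct for all n below roughly 3.3e24, and hence deterministic for 64-bit inputs (this is a well-known substitute, not the AKS algorithm itself).

```python
_WITNESSES = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

def is_prime(n):
    """Miller-Rabin with a fixed witness set, provably correct for all
    n < 3.3e24, so effectively deterministic for 64-bit integers."""
    if n < 2:
        return False
    for p in _WITNESSES:
        if n % p == 0:
            return n == p
    # write n - 1 as 2^s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in _WITNESSES:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # a witnesses compositeness
    return True
```

Unlike AKS, whose running time is polynomial in the bit length unconditionally, this variant trades generality for speed within a bounded input range.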
Verification of quantum computations is crucial, since quantum systems are extremely vulnerable to the environment. However, directly verifying the output of a quantum computation is difficult, because efficiently simulating a large-scale quantum computation on a classical computer is generally believed to be impossible. To overcome this difficulty, we propose a self-testing system for quantum computations, which can verify by itself whether a quantum computation has been performed correctly. Our basic idea is to use some extra ancilla qubits to test the output of the computation. We design two kinds of permutation circuits into the original quantum circuit: one is applied to the ancilla qubits, whose output carries the testing information; the other is applied to all qubits (including the ancilla qubits) and aims to uniformly permute the positions of all qubits. We show that both permutation circuits are easy to implement. In this way, we prove that any quantum computation has an efficient self-testing system. Finally, we discuss the relation between our self-testing system and interactive proof systems, and show that the two systems are equivalent if the verifier is allowed some quantum capability.
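The basic operation the scheme relies on, permuting the positions of qubits, can be simulated classically on a state vector. The sketch below (our illustration, with qubit 0 taken as the most significant bit) is only the relabeling step, not the paper's full self-testing protocol.

```python
def permute_qubits(state, perm):
    """Classically simulate a qubit-position permutation of an n-qubit state
    vector: qubit q of the output takes the value of qubit perm[q] of the
    input. `state` is a length-2^n amplitude list, qubit 0 = most significant."""
    n = len(perm)
    out = [0j] * len(state)
    for idx, amp in enumerate(state):
        bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
        new_bits = [bits[perm[q]] for q in range(n)]
        new_idx = 0
        for b in new_bits:
            new_idx = (new_idx << 1) | b
        out[new_idx] += amp
    return out

# Swapping the two qubits of |01> yields |10>.
ket01 = [0, 1, 0, 0]
ket10 = permute_qubits(ket01, [1, 0])
```

Because a position permutation only relabels basis states, it is unitary and cheap, which is what makes the uniform-permutation circuit in the scheme easy to realize.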
Quantum information processing and communication (QIPC) is an area of science with two main goals. On one side, it tries to explore the (still not well known) potential of quantum phenomena for (efficient and reliable) information processing and (efficient, reliable and secure) communication. On the other side, it tries to use quantum information storing, processing and transmitting paradigms, principles, laws, limitations, concepts, models and tools to gain deeper insights into the phenomena of the quantum world and to find efficient ways to describe and handle/simulate various complex physical phenomena. To do that, QIPC has to use concepts, models, theories, methods and tools of both physics and informatics. The main role of physics here is to discover primitive physical phenomena that can be used to design and maintain complex and reliable information storing, processing and transmitting systems. The main role of informatics is, on one side, to explore, from the information processing and communication point of view, the limitations and potential of prospective quantum information processing and communication technology, and to prepare information processing methods that could utilise that potential. On the other side, the role of informatics is to guide and support, with theoretical tools and outcomes, physics-oriented research in QIPC. This paper describes and analyses a variety of ways in which informatics contributes, and should or could contribute, to the development of QIPC (see also Gruska, 1999, 2006, 2008).
In this paper, single machine scheduling problems with variable processing times are studied. The criteria considered include minimizing the schedule length of all jobs, the flow time, and the number of tardy jobs. The complexity of the problem is determined.
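One of the criteria named above, minimizing the number of tardy jobs, is solvable in polynomial time in the classical fixed-processing-time setting by the Moore-Hodgson algorithm (the paper's variable-processing-time variants are generally harder; this sketch covers only the classical case with illustrative job data).

```python
import heapq

def min_tardy_jobs(jobs):
    """Moore-Hodgson: minimise the number of tardy jobs on one machine with
    fixed processing times. `jobs` is a list of (processing_time, due_date);
    returns the maximum number of jobs that can finish on time."""
    on_time = []  # max-heap of processing times (stored negated)
    t = 0
    for p, d in sorted(jobs, key=lambda j: j[1]):  # earliest-due-date order
        heapq.heappush(on_time, -p)
        t += p
        if t > d:  # deadline missed: drop the longest job scheduled so far
            t += heapq.heappop(on_time)
    return len(on_time)

# Four jobs (processing_time, due_date): three can finish on time.
jobs = [(2, 3), (3, 5), (1, 6), (4, 7)]
```

Scheduling in earliest-due-date order and evicting the longest job whenever a deadline is missed is a classic exchange argument; the evicted jobs are exactly the tardy ones.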
Funding (dominant technology study): supported by the Shanghai Philosophy and Social Science Foundation (2022ECK004) and the Shanghai Soft Science Research Project (23692123400).
文摘Dominant technology formation is the key for the hightech industry to“cross the chasm”and gain an established foothold in the market(and hence disrupt the regime).Therefore,a stimulus-response model is proposed to investigate the dominant technology by exploring its formation process and mechanism.Specifically,based on complex adaptive system theory and the basic stimulus-response model,we use a combination of agent-based modeling and system dynamics modeling to capture the interactions between dominant technology and the socio-technical landscape.The results indicate the following:(i)The dynamic interaction is“stimulus-reaction-selection”,which promotes the dominant technology’s formation.(ii)The dominant technology’s formation can be described as a dynamic process in which the adaptation intensity of technology standards increases continuously until it becomes the leading technology under the dual action of internal and external mechanisms.(iii)The dominant technology’s formation in the high-tech industry is influenced by learning ability,the number of adopting users and adaptability.Therein,a“critical scale”of learning ability exists to promote the formation of leading technology:a large number of adopting users can promote the dominant technology’s formation by influencing the adaptive response of technology standards to the socio-technical landscape and the choice of technology standards by the socio-technical landscape.There is a minimum threshold and a maximum threshold for the role of adaptability in the dominant technology’s formation.(iv)The socio-technical landscape can promote the leading technology’s shaping in the high-tech industry,and different elements have different effects.This study promotes research on the formation mechanism of dominant technology in the high-tech industry,presents new perspectives and methods for researchers,and provides essential enlightenment for managers to formulate technology strategies.
文摘The electromagnetic force, strong nuclear force, weak nuclear force, and gravitational force are the four fundamental forces of nature. The Standard Model (SM) succeeded in combining the first three forces to describe the most basic building blocks of matter and govern the universe. Despite the model’s great success in resolving many issues in particle physics but still has several setbacks and limitations. The model failed to incorporate the fourth force of gravity. It infers that all fermions and bosons are massless contrary to experimental facts. In addition, the model addresses neither the 95% of the universe’s energy of Dark Matter (DM) and Dark Energy (DE) nor the universe’s expansion. The Complex Field Theory (CFT) identifies DM and DE as complex fields of complex masses and charges that encompasses the whole universe, and pervade all matter. This presumption resolves the issue of failing to detect DM and DE for the last five decades. The theory also presents a model for the universe’s expansion and presumes that every material object carries a fraction of this complex field proportional to its mass. These premises clearly explain the physical nature of the gravitational force and its complex field and pave the way for gravity into the SM. On the other hand, to solve the issue of massless bosons and fermions in the SM, Higgs mechanism introduces a pure and abstractive theoretical model of unimaginable four potentials to generate fictitious bosons as mass donors to fermions and W± and Z bosons. The CFT in this paper introduces, for the first time, a physical explanation to the mystery of the mass formation of particles rather than Higgs’ pure mathematical derivations. The analyses lead to uncovering the mystery of electron-positron production near heavy nuclei and never in a vacuum. 
In addition, it puts a constraint on Einstein’s mass-energy equation that energy can never be converted to mass without the presence of dense dark matter and cannot be true in a vacuum. Furthermore, CFT provides different perspectives and resolves real-world physics concepts such as the nuclear force, Casimir force, Lamb’s shift, and the anomalous magnetic moment to be published elsewhere.
文摘This study,through a re-conceptualization of sociological complexity theory’s epistemological sources,specifically in Edgar Morin’s formulation,sheds light on the theoretical models as well as empirical methodologies of sociological analysis of today’s complex,interconnected,diverse and globalized society and global disorder.Complexity theory leads to a shift in perspective and a transformation of the epistemological status of social sciences with an in-depth intervention of disorder,contingency,case,singular,and non-repeatable in the sociological analysis.The notion of dialogic interplay is placed at the paradigm level and stands out at the heart of the concepts,analyzing the social system as auto-eco-organizer.Similarly,the notion of‘emergence’at macro-micro levels imposes itself as complex,logically requiring overcoming simple,linear thinking and model of explanation to adopt the perspective of organizational rotativity in which the product retroacts by transforming the one producing it,by conceiving a circularity of co-production between individuals and society through interactions.Declining epistemology and sociological complexity theory in the empirical methodology setting,the complex sociological approach is phenomenon-,event/information-and crisis-centered,privileging observation,participation-intervention,and‘live inquiry’.The open,in-depth and possibly non-directive interview is part of clinical sociological methodology,raising the question of the observer-phenomenon-observed relation.
Abstract: This essay presents a reflection on the main implications of complexity theory for science in general, redefining and dispelling myths of traditional science, and for Sociology in particular, suggesting a redefinition of Parsons' classic concept of the social system, articulated around its possible discontinuity and instability rather than around the property of self-maintenance of order. It argues that complexity theory has established the limits of classical science, leading to a more realistic awareness of the working and evolution mechanisms of natural and social systems and showing the limits of our capacity to predict and control events. Dissipative structures have shown the creative role of time. Instability, emergence, surprise and unpredictability are the rule rather than the exception when systems move away from equilibrium (entropy), even if these processes are generated from a system's deterministic working mechanisms. Therefore, we have come to realize how constructive the contribution of complexity is with regard to the long-standing problem of the relationship between order and disorder. Today, the terms of this relationship have been re-specified in a new configuration of inter-relationship, according to a unicum which finds its synthesis in the concepts of self-organization and deterministic chaos. From this perspective, as Prigogine suggested, studies on complex systems are heading toward a historical, biological conception of physics, and a new alliance between natural systems and living, social systems. Non-linearity, far-from-equilibrium self-organization, emergence and surprise meet at all levels, as this paper attempts to highlight. In Sociology, insights from complexity theory have contributed to a new way of thinking about social systems, by re-addressing some fundamental issues starting from the concepts of social system, emergence and change. The current conception of social systems as complex dynamical systems is supported by the profitable use of non-linear models (in particular, the logistic map) in the study of social processes.
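The logistic map mentioned above is a one-line recurrence whose behavior switches from orderly to chaotic as its parameter grows. A minimal sketch (the parameter values are illustrative, not taken from the abstract):

```python
# Iterate the logistic map x_{n+1} = r * x_n * (1 - x_n). At r = 2.5 the
# orbit settles to a fixed point; at r = 4.0 it stays aperiodic (chaotic).

def logistic_orbit(r, x0, n):
    """Return the first n iterates of the logistic map."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

stable = logistic_orbit(2.5, 0.2, 500)
chaotic = logistic_orbit(4.0, 0.2, 500)

# The r = 2.5 orbit converges to the fixed point 1 - 1/r = 0.6.
print(round(stable[-1], 6))   # → 0.6
# The r = 4.0 orbit stays within [0, 1] without settling down.
print(0.0 <= min(chaotic) <= max(chaotic) <= 1.0)   # → True
```

The same qualitative contrast (small parameter change, drastically different long-run behavior) is what makes the map attractive as a model of social processes.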
Funding: Supported by the Scientific Research Fund of Zhejiang Provincial Education Department of China (Grant No. 20070814) and the National Natural Science Foundation of China (Grant No. 10871168).
Abstract: A new method is proposed to transform the time series obtained from a dynamic system into a symbolic series that extracts both the overall and the local information of the time series. Based on the transformation, two measures are defined to characterize the complexity of the symbolic series. The measures reflect the sensitive dependence of chaotic systems on initial conditions and the randomness of a time series, and can thus distinguish periodic or completely random series from chaotic time series even when the time series are not long. Finally, the logistic map and the two-parameter Henon map are studied, and the results are satisfactory.
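The paper's specific transformation and measures are not reproduced here; as a minimal illustration of the general idea, the sketch below symbolizes a series by a median threshold and scores the result with the Shannon entropy of length-3 symbol words (both choices are assumptions for illustration, not the authors' method):

```python
import math

def symbolize(series):
    """Map each value to '1' if above the series median, else '0'."""
    s = sorted(series)
    median = s[len(s) // 2]
    return "".join("1" if x > median else "0" for x in series)

def word_entropy(symbols, k=3):
    """Shannon entropy (bits) of the distribution of length-k symbol words."""
    counts = {}
    for i in range(len(symbols) - k + 1):
        w = symbols[i:i + k]
        counts[w] = counts.get(w, 0) + 1
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Periodic series: highly repetitive symbols, low word entropy.
periodic = [0.1, 0.9] * 100
# Chaotic series from the logistic map at r = 4: higher word entropy.
x, chaotic = 0.3, []
for _ in range(200):
    x = 4.0 * x * (1.0 - x)
    chaotic.append(x)

print(word_entropy(symbolize(periodic)) < word_entropy(symbolize(chaotic)))   # → True
```

Even this crude symbolization already separates the periodic series from the chaotic one, which is the kind of discrimination the abstract's measures are designed to make precise.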
Funding: Supported by the National High Technology Research and Development Program of China (No. 2008AA01A201) and the National Natural Science Foundation of China (No. 60503015).
Abstract: On the basis of complex network theory, the issue of key nodes in Wireless Sensor Networks (WSN) is discussed. A model expression of sub-network faults in WSN is given first; subsequently, the concepts of average path length and clustering coefficient are introduced. Based on these two concepts, a novel attribute description of key nodes related to sub-networks is proposed. Moreover, in terms of node deployment density and transmission range, the concepts of single-point key nodes and generalized key nodes of a WSN are defined, and their decision theorems are investigated.
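The two concepts can be computed directly on a graph representation of a sensor network. A minimal sketch, assuming the `networkx` library and an invented toy topology rather than the paper's WSN model:

```python
import networkx as nx

# Toy sensor topology (nodes = sensors, edges = radio links): two triangles
# joined by the bridge edge 2-3. Illustrative only.
G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)])

# Average path length: mean shortest-path distance over all node pairs.
apl = nx.average_shortest_path_length(G)
# Clustering coefficient: how close each node's neighbourhood is to a clique.
cc = nx.average_clustering(G)

print(round(apl, 3), round(cc, 3))   # → 1.8 0.778
```

Nodes 2 and 3 here are the kind of "key nodes" the abstract describes: removing either disconnects a whole sub-network.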
Abstract: This study employs Q methodology to explore the developmental routes of oral English ability of 12 English-major students in China, inspired by Complex and Dynamic Systems Theory (CDST). The data analysis suggests the following findings: (1) two developmental patterns emerge, gradual improvement and a strong phase shift, influenced by internal and external factors through interactions among different subsystems; (2) guided by CDST, the study confirms the importance of self-organization and initial conditions reported in previous studies. Based on these findings, teachers are strongly advised to form a holistic view of students' oral English development that takes account of its non-linear character and individual differences.
Abstract: This paper presents a qualitative study investigating the dynamics of second language (L2) learning strategies under the guidance of complexity theory. A group of Chinese undergraduate students studying at an international university in Thailand were selected as the research participants. The research instruments included interviews, observations, records of the participants' online chats and posts, and a research journal. The findings indicate that the changes in the participants' strategies for learning English exhibit typical features of a complex system. The study provides implications for probing the nature of L2 strategies and for applying complexity theory to future research on L2 strategies.
Abstract: Complexity is one of the leading features of modern control systems. It is caused by the complex properties of controlled plants and the varied requirements in controller design. In dealing with control problems involving complexity, over the past two decades many papers have been published reporting fruitful results from different theoretical backgrounds and with different methodologies, such as differential-geometry-based design methods, hybrid system theory, switching control, and neural-network-based intelligent control.
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 10871168).
Abstract: Based on forbidden patterns in symbolic dynamics, symbolic subsequences are classified, and the relations between forbidden patterns, correlation dimensions and complexity measures are studied. A complexity-measure approach is proposed in order to separate deterministic (usually chaotic) series from random ones and to measure the complexities of different dynamic systems. The complexity is related to the correlation dimensions, and the algorithm is simple and suitable for time series with noise. In the paper, the complexity measure method is used to study the dynamics of the logistic map and the Henon map with multiple parameters.
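The idea of forbidden patterns can be illustrated with ordinal (rank) patterns, a common symbolization in this literature (a sketch of the general idea, not the paper's classification). For the logistic map at r = 4, the strictly decreasing pattern of length 3 never occurs, because a decreasing step requires x > 3/4 while every image 4x(1-x) of such a point lies below 3/4; a random series, by contrast, realizes all six patterns:

```python
def ordinal_patterns(series, m=3):
    """Return the set of length-m ordinal (rank) patterns occurring in series."""
    found = set()
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        # argsort of the window: indices ordered by increasing value
        found.add(tuple(sorted(range(m), key=lambda j: window[j])))
    return found

# Orbit of the logistic map at r = 4 (deterministic, chaotic).
x, orbit = 0.3, []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(x)

patterns = ordinal_patterns(orbit)
# The strictly decreasing pattern x_t > x_{t+1} > x_{t+2} is forbidden:
print((2, 1, 0) in patterns)   # → False
print(len(patterns))           # → 5 (of 6 possible length-3 patterns)
```

The presence of a forbidden pattern is exactly the kind of fingerprint that separates deterministic series from random ones, for which no pattern is forbidden.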
Funding: Supported by the National Natural Science Foundation of China (No. 21643014) and the Special Natural Science Foundation of the Science and Technology Bureau of the Xi'an City Government (No. 2016CXWL02).
Abstract: The molecular structures of fifteen possible complexes of 2-thioxanthine (2TX) with one Hg²⁺ and two Cl⁻ ions were fully optimized using the density functional theory B3PW91/6-311++G** method. The effective pseudopotential LANL2DZ basis set was used for the Hg²⁺ ion. Vibrational analysis was also carried out at the same level. The bond lengths, bond angles, zero-point energies, Gibbs free energies, thermodynamic energies and relative energies of all the complexes were obtained. NBO analysis of the natural charges and the second-order perturbation energies was carried out for the three stable complexes, and the IR spectra of two complexes were assigned against the experimental data. The results show that 2-thioxanthine complexes with one Hg²⁺ and two Cl⁻ ions are formed, and that the complexes derived from the thione tautomer are more stable than those from the thiol tautomers. The order of the three complexes with the lowest relative energies is 2TX(1,3,7)-Hg²⁺-2, 2TX(1,3,7)-Hg²⁺-1 and 2TX(1,3,9)-Hg²⁺. The calculated IR spectra of the two complexes agreed with the experimental data.
Abstract: Recent years have witnessed rapidly growing interest in the dynamic behavior of replenishment rules and the bullwhip effect. We show that the bullwhip effect and the butterfly effect share the same self-oscillation amplifying mechanism: the supplier's ordering decisions amplify the perturbations introduced by errors in the processing of retailers' demand information. Owing to this nonlinear mechanism, the sensitivity of the system to its initial values exhibits an explicitly self-similar structure. In this paper, the causal process of the bullwhip effect is described in terms of this internal nonlinear mechanism, and the complexity of the bullwhip effect is studied for an order-up-to policy under demand signal processing. The methodology is based on fractal and chaos theory and yields important insights into the complex behavior of the bullwhip effect.
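An order-up-to policy with demand-signal processing can be sketched as follows (a minimal illustration with exponentially smoothed forecasts and i.i.d. demand; all parameter values are assumptions, not taken from the paper). The variance of the orders exceeds the variance of the demand, which is the bullwhip amplification:

```python
import random

random.seed(42)

alpha, lead_time = 0.3, 2     # smoothing constant, replenishment lead time
forecast = 100.0              # initial demand forecast
orders, demands = [], []

for _ in range(5000):
    demand = random.gauss(100.0, 10.0)            # i.i.d. customer demand
    new_forecast = alpha * demand + (1 - alpha) * forecast
    # Order-up-to level S_t = (L + 1) * forecast; the order covers demand
    # plus the change in the base-stock level.
    order = demand + (lead_time + 1) * (new_forecast - forecast)
    forecast = new_forecast
    orders.append(order)
    demands.append(demand)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

ratio = variance(orders) / variance(demands)      # bullwhip ratio
print(ratio > 1.0)   # → True
```

The amplification arises because each order reacts both to current demand and to the forecast update it triggers, so demand noise is passed upstream with a gain greater than one.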
Funding: This work was financially supported by a grant from the National Basic Research Program of China (973 Program) (No. 2012CB215204) and the Key Project of the CAS Knowledge Innovation Program "Research and demonstration of the coordinated control system based on multi-complementary energy storage" (No. KGCX2-EW-330).
Abstract: Cascading failure is a potential threat in power systems with the large-scale development of wind power, especially for the large-scale grid-connected, long-distance-transmission wind power bases in China. This paper introduces complex network theory (CNT) for cascading failure analysis considering wind farm integration. A cascading failure power flow analysis model for complex power networks is established with improved network topology principles and methods. The network load and boundary conditions are determined to reflect the operational states of power systems. Three typical network evaluation indicators are used to evaluate the topological characteristics of the power network before and after a malfunction: connectivity level, global effective performance, and percentage of load loss (PLL). The impacts of node removal, grid current tolerance capability, instantaneous wind power penetration, and wind farm coupling points on the power grid are analyzed on the IEEE 30-bus system. Through simulation analysis, the occurrence mechanism and main influencing factors of cascading failure are determined. Finally, corresponding defense strategies are proposed to reduce the hazards of cascading failure in power systems.
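Global effective performance is commonly computed as the network's global efficiency, the mean of inverse shortest-path distances over node pairs. A minimal sketch of measuring its drop after a node failure, assuming the `networkx` library and a toy star topology rather than the IEEE 30-bus case:

```python
import networkx as nx

# Toy grid topology: a hub (node 0) feeding five peripheral nodes.
G = nx.star_graph(5)

before = nx.global_efficiency(G)

# Simulate failure (removal) of the hub and recompute.
H = G.copy()
H.remove_node(0)
after = nx.global_efficiency(H)

print(before > after)   # → True (removing the hub disconnects the periphery)
```

Repeating this over candidate node removals is one simple way to rank nodes by how much their failure degrades the network, which is the spirit of the indicator-based analysis in the abstract.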
Abstract: Because of poor ground conditions, stoping methods of underhand headings with cemented fill were used in the Jinchuan No. 2 mine, and a paste fill system was set up. In order to evaluate the reliability of the new system, investigations and trial running were carried out. More than 20 items of modification or improvement related to the paste preparation subsystem, the pump and pipe subsystem, and the auto-controlling and monitoring subsystem were finished. The reliability of the paste fill system was analyzed using the theory of large complex systems, and the analysis is useful for reliability studies of paste fill systems.
Abstract: While conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory, digital forensics examiners rarely, if ever, do so. In this paper, we review some of the quantitative tools and techniques which are available for use in digital forensic investigations, including Bayesian networks, complexity theory, information theory and probability theory, and indicate how they may be used to obtain likelihood ratios or odds ratios for the relative plausibility of alternative explanations for the creation of recovered digital evidence. The potential benefits of such quantitative measures for modern digital forensics are also outlined.
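The likelihood-ratio reasoning works the same way as in conventional forensics: posterior odds = likelihood ratio × prior odds. A minimal numeric sketch (the hypotheses and probabilities below are invented for illustration, not drawn from the paper):

```python
# Competing explanations for a recovered digital artefact:
#   H1: the user deliberately downloaded the file
#   H2: the file arrived without user action (e.g. malware)
# Invented evidence probabilities, for illustration only.
p_evidence_given_h1 = 0.80   # P(E | H1)
p_evidence_given_h2 = 0.05   # P(E | H2)

likelihood_ratio = p_evidence_given_h1 / p_evidence_given_h2

prior_odds = 1.0             # H1 and H2 initially equally plausible
posterior_odds = likelihood_ratio * prior_odds

# Convert posterior odds back to a probability for H1.
p_h1 = posterior_odds / (1.0 + posterior_odds)
print(round(likelihood_ratio, 1), round(p_h1, 3))   # → 16.0 0.941
```

Reporting the likelihood ratio rather than a bare conclusion lets the examiner separate the strength of the evidence from the prior plausibility of the competing explanations.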
Funding: Supported by the Fundamental Research Funds for the Central Universities, Northeast Forestry University (2572018CP06, 2572017CA12).
Abstract: Background: Urban green infrastructure (GI) networks play a significant role in ensuring regional ecological security; however, they are highly vulnerable to the influence of urban development, and the optimization of GI networks for better connectivity and resilience under different development scenarios has become a practical problem that urgently needs to be solved. Taking Harbin, a megacity in Northeast China, as the case study, we set five simulation scenarios by adjusting the economic growth rate and extracted the GI network in each scenario by integrating the minimal cumulative resistance model and the gravity model. The low-degree-first (LDF) strategy from complex network theory was introduced to optimize the GI network, and the optimization effect was verified by robustness analysis. Results: The results showed that in the 5% economic growth scenario, the GI network structure was more complex and the connectivity of the network was better, while in the other scenarios the network structure gradually degraded with economic growth. After optimization by the LDF strategy, the average degree of the GI network in the five scenarios increased from 2.368, 2.651, 2.189, 1.972, and 1.847 to 2.783, 3.125, 2.643, 2.414, and 2.322, respectively, and the connectivity and resilience of the GI network structure were significantly enhanced in all scenarios. Conclusions: Economic growth did not necessarily lead to degradation of the GI network; there was still room for economic development in the study area, but it was limited under the existing GI conditions, and the LDF strategy was an effective method of optimizing the GI network. The research results provide a new perspective for the study of GI network protection under urban economic growth and serve as a methodological reference for urban GI network optimization.
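A low-degree-first optimization can be sketched as repeatedly linking the lowest-degree pair of non-adjacent nodes until an average-degree target is met (a minimal illustration of the strategy's general idea, assuming the `networkx` library; the graph and target are invented, not Harbin's GI network):

```python
import networkx as nx

def low_degree_first(G, target_avg_degree):
    """Add edges between the lowest-degree non-adjacent node pairs until
    the average degree reaches the target."""
    G = G.copy()

    def avg_degree(g):
        return 2 * g.number_of_edges() / g.number_of_nodes()

    while avg_degree(G) < target_avg_degree:
        # Candidate endpoints sorted by current degree, lowest first.
        nodes = sorted(G.nodes, key=G.degree)
        added = False
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                if not G.has_edge(u, v):
                    G.add_edge(u, v)
                    added = True
                    break
            if added:
                break
        if not added:        # graph is complete; nothing left to add
            break
    return G

# Invented sparse network standing in for a GI corridor graph.
G = nx.path_graph(10)                    # average degree 1.8
H = low_degree_first(G, 2.4)
print(2 * H.number_of_edges() / 10)      # → 2.4
```

Prioritizing low-degree nodes spends each new corridor where the network is weakest, which is why the strategy raises resilience faster than adding edges at random.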
Abstract: We briefly survey a number of important recent achievements in Theoretical Computer Science (TCS), especially computational complexity theory. We discuss the PCP theorem and its implications for the inapproximability of combinatorial optimization problems; space-bounded computation, especially the deterministic logspace algorithm for the undirected graph connectivity problem; the deterministic polynomial-time primality test; lattice complexity and worst-case to average-case reductions; pseudorandomness and extractor constructions; and Valiant's new theory of holographic algorithms and reductions.
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61372076, 61971348, and 62001351), the Foundation of the Shaanxi Key Laboratory of Information Communication Network and Security (Grant No. ICNS201802), the Natural Science Basic Research Program of Shaanxi, China (Grant No. 2021JM-142), and the Key Research and Development Program of Shaanxi Province, China (Grant No. 2019ZDLGY09-02).
Abstract: Verification in quantum computation is crucial, since quantum systems are extremely vulnerable to the environment. However, directly verifying the output of a quantum computation is difficult, because efficiently simulating a large-scale quantum computation on a classical computer is generally thought to be impossible. To overcome this difficulty, we propose a self-testing system for quantum computations, which can be used to verify whether a quantum computation has been performed correctly. Our basic idea is to use some extra ancilla qubits to test the output of the computation. We design two kinds of permutation circuits into the original quantum circuit: one is applied to the ancilla qubits, whose output indicates the testing information; the other is applied to all qubits (including the ancilla qubits) and aims to uniformly permute the positions of all qubits. We show that both permutation circuits are easy to implement. In this way, we prove that any quantum computation has an efficient self-testing system. Finally, we discuss the relation between our self-testing system and interactive proof systems, and show that the two systems are equivalent if the verifier is allowed some quantum capacity.
Funding: Support from grant MSM00211622419 is acknowledged.
Abstract: Quantum information processing and communication (QIPC) is an area of science with two main goals. On one side, it tries to explore the (still not well known) potential of quantum phenomena for (efficient and reliable) information processing and (efficient, reliable and secure) communication. On the other side, it tries to use quantum information storing, processing and transmitting paradigms, principles, laws, limitations, concepts, models and tools to get deeper insights into the phenomena of the quantum world and to find efficient ways to describe and handle/simulate various complex physical phenomena. In order to do that, QIPC has to use the concepts, models, theories, methods and tools of both physics and informatics. The main role of physics in this is to discover primitive physical phenomena that can be used to design and maintain complex and reliable information storing, processing and transmitting systems. The main role of informatics is, on one side, to explore, from the information processing and communication point of view, the limitations and potential of prospective quantum information processing and communication technology, and to prepare information processing methods that could utilize the potential of such technologies. On the other side, the main role of informatics is to guide and support, with theoretical tools and outcomes, physics-oriented research in QIPC. The paper describes and analyses a variety of ways in which informatics contributes and should/could contribute to the development of QIPC; see also Gruska (1999, 2006, 2008).
Abstract: In this paper, single-machine scheduling problems with variable processing times are raised. The criteria considered include minimizing the schedule length of all jobs, the flow time, and the number of tardy jobs. The complexity of the problem is determined.
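For the classical fixed-processing-time special case, minimizing the number of tardy jobs on a single machine is solved exactly by the Moore-Hodgson algorithm, sketched below as background (the paper's variable-processing-time setting is not reproduced here, and the job data are invented):

```python
def moore_hodgson(jobs):
    """Moore-Hodgson: minimize the number of tardy jobs on one machine.
    jobs = list of (processing_time, due_date); returns the on-time count."""
    on_time = []          # jobs currently scheduled on time, as (p, d)
    total = 0             # completion time of the on-time sequence
    for p, d in sorted(jobs, key=lambda j: j[1]):   # earliest due date first
        on_time.append((p, d))
        total += p
        if total > d:     # latest job is tardy: drop the longest job so far
            longest = max(on_time, key=lambda j: j[0])
            on_time.remove(longest)
            total -= longest[0]
    return len(on_time)

# Invented instance: (processing_time, due_date) pairs.
jobs = [(2, 6), (3, 7), (2, 8), (4, 9), (3, 11)]
print(moore_hodgson(jobs))   # → 4 (one job must be tardy)
```

When processing times become variable, as in the abstract, this greedy exchange argument no longer applies directly, which is what makes the complexity question non-trivial.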