Task-based Language Teaching (TBLT) research has provided ample evidence that cognitive complexity is an important aspect of task design that influences learners' performance in terms of fluency, accuracy, and syntactic complexity. Despite the substantial number of empirical investigations into task complexity in journal articles, one of its features, storyline complexity, has scarcely been investigated. Previous research focused mainly on the impact of storyline complexity on learners' oral performance; its impact on written performance is less well studied. This study therefore investigates the effects of the narrative complexity of the storyline on senior high school students' written performance, as reflected in its complexity, fluency, and accuracy. The study has important pedagogical implications: task design and assessment should distinguish between different types of narrative tasks, for example, tasks with a single versus a dual storyline. Results on task complexity may help inform teachers' pedagogical choices when prioritizing work on a specific linguistic dimension.
The security of Federated Learning (FL) / Distributed Machine Learning (DML) is gravely threatened by data poisoning attacks, which destroy the usability of the model by contaminating training samples; such attacks are therefore called causative availability indiscriminate attacks. Because existing data sanitization methods are hard to apply to real-time applications owing to their tedious processes and heavy computation, we propose a new supervised batch detection method for poison that can rapidly sanitize the training dataset before local model training. We design a training dataset generation method that helps to enhance accuracy, and we use data complexity features to train a detection model, which is then used in an efficient batch hierarchical detection process. Our model stockpiles knowledge about poison and can be expanded by retraining to adapt to new attacks. Being neither attack-specific nor scenario-specific, our method is applicable to FL/DML and to other online or offline scenarios.
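The batch detection pipeline above is only summarized; as a loose, hypothetical illustration of the general idea (screening a labeled batch for poisoned points before local training), the sketch below flags label-flipped samples by their distance to the centroid of their claimed class. The function names and the threshold rule are invented for illustration; the authors' actual detector is trained on data complexity features.

```python
import math

def centroid(points):
    """Mean vector of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def flag_poison(samples, labels, threshold=2.0):
    """Flag samples lying unusually far from their own label's centroid.

    Label-flipping poison tends to place a point near the *other*
    class's cluster, so its distance to its claimed centroid is large.
    Returns a list of flagged sample indices.
    """
    by_label = {}
    for x, y in zip(samples, labels):
        by_label.setdefault(y, []).append(x)
    cents = {y: centroid(pts) for y, pts in by_label.items()}

    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    # per-label mean distance sets the scale for the cutoff
    scale = {y: sum(dist(x, cents[y]) for x in pts) / len(pts)
             for y, pts in by_label.items()}
    flagged = []
    for i, (x, y) in enumerate(zip(samples, labels)):
        if dist(x, cents[y]) > threshold * max(scale[y], 1e-9):
            flagged.append(i)
    return flagged
```

A real detector would of course be retrained as new attacks appear, as the abstract describes; this sketch has no learned state at all.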
Continuous-flow microchannels are widely employed for synthesizing various materials, including nanoparticles, polymers, and metal-organic frameworks (MOFs), to name a few. Microsystem technology allows precise control over reaction parameters, resulting in purer, more uniform, and structurally stable products owing to more effective manipulation of mass transfer. However, continuous-flow synthesis may be accompanied by the emergence of spatial convective structures initiating convective flows. On the one hand, convection can accelerate reactions by intensifying mass transfer. On the other hand, it may lead to non-uniformity or defects in the final product, especially in MOF microcrystal synthesis. The ability to distinguish regions of convective and diffusive mass transfer may be the key to performing higher-quality reactions and obtaining purer products. In this study, we investigate, for the first time, the possibility of using an information complexity measure as a criterion for assessing the intensity of mass transfer in microchannels, considering both spatial and temporal non-uniformities of the liquid distributions that result from convection. We calculate the complexity using a shearlet transform based on a local approach; in contrast to existing methods, this approach provides a more detailed representation of local heterogeneities. Our analysis involves experimental images of the mixing of two non-reactive liquids in a Y-type continuous-flow microchannel under conditions of double-diffusive convection. The obtained complexity fields characterize the mixing process and structure formation, revealing variations in mass transfer intensity along the microchannel. We compare the results with cases of liquid mixing via a purely diffusive mechanism. The analysis reveals that the complexity measure is sensitive to variations in the type of mass transfer, establishing its feasibility as an indirect criterion for assessing mass transfer intensity. The presented method can extend beyond flow analysis, finding application in the control of the microstructure of various materials (porosity, for instance) or of surface defects in metals, optical systems, and other materials of significant relevance in materials science and engineering.
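The shearlet-transform complexity measure itself is beyond a short sketch, but the idea of a local complexity field over a mixing image can be illustrated with a much simpler stand-in: sliding-window Shannon entropy of pixel intensities. This is an assumed analogue, not the authors' method; low values indicate well-mixed (uniform) regions, high values structured, heterogeneous ones.

```python
import math

def local_entropy_field(img, win=3, bins=8):
    """Shannon entropy of intensity values in a sliding window.

    `img` is a 2-D list of floats in [0, 1].  High local entropy marks
    heterogeneous (structured) regions; low entropy marks uniform ones.
    """
    h, w = len(img), len(img[0])
    r = win // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            counts = [0] * bins
            n = 0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < h and 0 <= jj < w:
                        b = min(int(img[ii][jj] * bins), bins - 1)
                        counts[b] += 1
                        n += 1
            # entropy of the windowed intensity histogram
            out[i][j] = -sum((c / n) * math.log2(c / n)
                             for c in counts if c)
    return out
```

A perfectly mixed (constant) field gives zero entropy everywhere, while a strongly structured field gives values near 1 bit, which is the qualitative contrast the complexity fields in the study exploit.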
Living objects have complex internal and external interactions. This complexity is regulated and controlled by homeostasis, the balance of multiple opposing influences, while environmental effects ultimately guide the self-organized structure. Living systems are open, dynamic structures performing random, stationary, stochastic, self-organizing processes. The self-organizing process is defined by a spatial-temporal fractal structure that is self-similar in both space and time. The system's complexity appears in its energetics, which strives for the most efficient use of the available energy; to that end, it organizes various well-connected networks. The controller of environmental relations is Darwinian selection on a long time scale. The energetics optimize healthy processes tuned to the highest efficacy and minimal loss (minimization of entropy production). The organism is built up by morphogenetic rules and develops various networks from the genetic level up to the whole organism. The networks engage in intensive crosstalk and form a balance in a Nash equilibrium, which is the homeostatic state under healthy conditions. Homeostasis may thus be described as a Nash equilibrium that ensures energy distribution in a "democratic" way with regard to the functions of the parts within the complete system. Cancer radically changes the network system in the organism: cancer is a network disease. Deviation from healthy networking appears at every level, from the genetic (molecular) level to cells, tissues, organs, and organisms. The strong proliferation of malignant tissue is the origin of most life-threatening processes. The weak side of cancer development is its change of the complex information networking in the system, which leaves it vulnerable to immune attacks. Cancer cells are masters of adaptation and evade immune surveillance. This hiding process can be broken by nonionizing electromagnetic radiation, for which the malignant structure has no adaptation strategy. Our objective is to review the different sides of living complexity and to use this knowledge to fight cancer.
This study examines the role of the syntactic complexity of texts in students' reading comprehension skills. Utilizing a qualitative research method, it used structured interview questions as the main data-gathering instrument. English language teachers from Coral na Munti National High School were selected as the respondents. The findings suggest that the syntactic complexity of a text affects students' reading comprehension: students found it challenging to understand the author's message when a sentence contained a large number of phrases and clauses, and the complex sentence structure was deemed the most challenging for students to understand. To overcome these challenges, teachers employed various reading intervention programs, including focused or targeted instruction and the implementation of Project Dear, as suggested by the Department of Education. These programs were found to help students improve their comprehension as well as their knowledge of the syntactic structure of sentences. The study underscores the importance of selecting appropriate reading materials and implementing suitable reading intervention programs to enhance students' comprehension skills.
The rhetorical structure of abstracts has been a widely discussed topic, as mastering it can greatly enhance the abstract writing skills of second-language writers. This study aims to provide guidance on the syntactic features that L2 learners can employ, and to suggest which features they should focus on in English academic writing. To achieve this, all samples were analyzed for rhetorical moves using Hyland's five-move model, and all sentences were evaluated for syntactic complexity using measures of global, clausal, and phrasal complexity. The findings reveal that expert writers exhibit a more balanced use of syntactic complexity across moves, effectively fulfilling the rhetorical objectives of abstracts, whereas MA students tend to rely excessively on embedded structures and dependent clauses in an attempt to increase complexity. The implications of these findings for academic writing research, pedagogy, and assessment are discussed.
Globally, economies have become complex, and new technologies have transformed and facilitated their modernization. In the literature, the economic complexity approach has become a popular tool in development and innovation studies in economic geography. Researchers have found that green technology and eco-innovation approaches should be used to decisively reduce the effects of carbon emissions on the environment. However, debates about the impact of economic complexity on the environment remain unsettled, since some emerging production technologies have far-reaching pollution effects. This study explored the impact of economic complexity on environmental sustainability in Turkey using novel Fourier-based approaches, namely the Fourier Augmented Dickey-Fuller (FADF) and Fourier Autoregressive-Distributed Lag (FARDL) models. The Fourier-based approaches indicated that all variables (economic complexity index (ECI), GDP, energy consumption, and CO2 emissions (CO2E)) are cointegrated in the long run. Additionally, the FARDL model implied that (i) in the long run, the effects of ECI (a proxy for economic complexity), GDP (a proxy for economic growth), and energy consumption on CO2E (a proxy for environmental quality) are significant; (ii) economic complexity decreases environmental degradation in Turkey; and (iii) economic growth and energy consumption negatively affect environmental quality. The results also showed that economic complexity could be used as a policy tool to tackle environmental degradation. The findings further revealed that the fossil fuel-based economy will continue to expand and undermine Turkey's efforts to meet its net-zero emission target by 2053. Therefore, policy-makers should take action and establish diversified economic, environmental, and energy strategies. For policy insights, the Turkish government can use a combination of tax exemptions and technical support systems to support knowledge creation and the diffusion of environmentally friendly technologies, and can also impose strict environmental regulations during the knowledge development phases.
Aim To present a quantitative method for structural complexity analysis and evaluation of information systems. Methods Based on Petri net modeling and analysis techniques, and with the aid of mathematical tools from general net theory (GNT), a quantitative method for structure description and analysis of information systems was introduced. Results The structural complexity index and two related factors, the element complexity factor and the connection complexity factor, were defined, and the relations between them and the parameters of the Petri net based model of the system were derived. An application example is presented. Conclusion The proposed method provides a theoretical basis for quantitative analysis and evaluation of structural complexity, and can be applied in the general planning and design of information systems.
This study proposes a novel fractional discrete-time macroeconomic system with incommensurate order. The dynamical behavior of the proposed macroeconomic model is investigated analytically and numerically. In particular, the stability of the zero equilibrium point is investigated to demonstrate that the discrete macroeconomic system exhibits chaotic behavior. Using bifurcation diagrams, phase attractors, the maximum Lyapunov exponent, and the 0-1 test, we verify that chaos exists in the new model with incommensurate fractional orders. Additionally, a complexity analysis is carried out utilizing approximate entropy (ApEn) and C0 complexity to confirm the presence of chaos. Finally, the main findings of the study are presented using numerical simulations.
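The 0-1 test mentioned above has a standard formulation (Gottwald-Melbourne): project the series onto rotating coordinates p, q and correlate the mean-square displacement with time; K near 1 indicates chaos, K near 0 regular dynamics. Below is a minimal sketch using the logistic map as a stand-in for the macroeconomic system, with a single fixed frequency c rather than the usual median over many c values.

```python
import math

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def zero_one_test(x, c=1.7):
    """Gottwald-Melbourne 0-1 test: K near 1 -> chaos, near 0 -> regular."""
    N = len(x)
    p = q = 0.0
    ps, qs = [], []
    for j, v in enumerate(x, start=1):
        p += v * math.cos(j * c)
        q += v * math.sin(j * c)
        ps.append(p)
        qs.append(q)
    ncut = N // 10  # displacements only up to N/10 lags, as is customary
    ns = list(range(1, ncut + 1))
    M = []
    for n in ns:
        s = sum((ps[j + n] - ps[j]) ** 2 + (qs[j + n] - qs[j]) ** 2
                for j in range(N - n))
        M.append(s / (N - n))
    # bounded M (regular) decorrelates from n; linear growth (chaos) -> K ~ 1
    return corr(ns, M)
```

For a chaotic series the displacement grows diffusively and K approaches 1; for a periodic series it stays bounded and K stays near 0.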
This paper proposes the alternating direction method of multipliers-based infinity-norm (ADMIN) with threshold (ADMIN-T) and with percentage (ADMIN-P) detection algorithms, which make full use of the distribution of the signal-to-interference-plus-noise ratio (SINR) for an uplink massive MIMO system. The ADMIN-T and ADMIN-P detection algorithms are improved versions of the ADMIN detection algorithm, in which an appropriate SINR threshold (ADMIN-T) or a certain percentage (ADMIN-P) is designed to reduce the overall computational complexity. The detected symbols are divided into two parts, either by an SINR threshold based on the cumulative distribution function (CDF) of the SINR or by a percentage. The symbols in the higher-SINR part are detected by MMSE, and their interference is then cancelled by successive interference cancellation (SIC). Afterwards, the remaining low-SINR symbols are iteratively detected by ADMIN. Simulation results show that the ADMIN-T and ADMIN-P detection algorithms provide a significant performance gain compared with several recently proposed detection algorithms, while their computational complexity is significantly reduced. Furthermore, with the same number of transmit and receive antennas, the proposed algorithms achieve higher performance than in the asymmetric antenna case.
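The MMSE stage used for the high-SINR symbols is standard linear detection. A minimal real-valued BPSK sketch (not the authors' full ADMIN-T/ADMIN-P pipeline, and without the SIC and ADMIN stages) is:

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def solve(A, b):
    """Solve A z = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[piv] = M[piv], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    z = [0.0] * n
    for k in range(n - 1, -1, -1):
        z[k] = (M[k][n] - sum(M[k][c] * z[c]
                              for c in range(k + 1, n))) / M[k][k]
    return z

def mmse_detect(H, y, sigma2):
    """BPSK MMSE detection: sign of (H^T H + sigma2*I)^{-1} H^T y."""
    Ht = transpose(H)
    G = matmul(Ht, H)
    for i in range(len(G)):
        G[i][i] += sigma2  # noise regularization distinguishes MMSE from ZF
    rhs = [sum(h * yi for h, yi in zip(row, y)) for row in Ht]
    z = solve(G, rhs)
    return [1 if v >= 0 else -1 for v in z]
```

In the paper's scheme this detector would handle only the symbols above the SINR cutoff, with SIC and iterative ADMIN applied to the rest.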
Traffic signs are used by all countries for healthier traffic flow and to protect drivers and pedestrians. Consequently, traffic signs are of great importance in every civilized country, which has led researchers to focus on their automatic detection. Detecting traffic signs is challenging because they may be in the dark, far away, or partially occluded, and may be affected by lighting or by the presence of similar objects. An innovative detection method for red and blue signs in color images is proposed to resolve these issues, with the aim of devising an efficient, robust, and accurate approach. To attain this, the approach first applies a new formula, inspired by existing work, that enhances the image using the red and green channels instead of the blue; the enhanced image is segmented using a threshold calculated from the correlational property of the image. Next, a new set of features, motivated by existing features, is proposed: texture and color features are extracted from the Red, Green, and Blue (RGB), Hue, Saturation, and Value (HSV), and YCbCr color models and then fused. The feature set is then fed to different classification frameworks, among which the quadratic support vector machine (SVM) outperformed the others with an accuracy of 98.5%. The proposed method is tested on German Traffic Sign Detection Benchmark (GTSDB) images, and the results compare favorably with preceding work.
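The paper's enhancement formula is not reproduced in the abstract; the sketch below uses a hypothetical normalized red-emphasis measure followed by a fixed threshold, just to illustrate the enhance-then-segment pattern. The formula and the threshold are assumptions, not the published ones.

```python
def red_enhance(pixel):
    """Normalized red emphasis of an (R, G, B) pixel.

    Hypothetical illustrative formula: how much red exceeds both other
    channels, scaled by total brightness, clipped at zero.
    """
    r, g, b = pixel
    s = r + g + b
    if s == 0:
        return 0.0
    return max(0.0, min(r - g, r - b) / s)

def segment_red(img, thresh=0.2):
    """Binary mask: 1 where the red-emphasis score reaches the threshold."""
    return [[1 if red_enhance(p) >= thresh else 0 for p in row]
            for row in img]
```

A saturated red pixel scores high, while gray and blue pixels score zero, so thresholding isolates candidate red-sign regions.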
Task-Based Language Teaching has witnessed growing interest in the impact of task complexity on second language (L2) learners' linguistic performance. Research on task complexity focuses on the influence of different task features on learners' production in terms of linguistic complexity, accuracy, fluency, and lexis. Within this field, a line of investigation that has attracted much attention is the influence of the resource-directing and resource-dispersing features of cognitive task complexity. This review supports a better understanding of task complexity and allows some preliminary pedagogical implications to be drawn that may be useful for task-based syllabus design.
Several possible definitions of local injectivity for a homomorphism of an oriented graph G to an oriented graph H are considered. In each case, we determine the complexity of deciding whether there exists such a homomorphism when G is given and H is a fixed tournament on three or fewer vertices. Each possible definition leads to a locally-injective oriented colouring problem. A dichotomy theorem is proved in each case.
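For small instances the decision problem can be checked by brute force. The sketch below tests one of the possible definitions (injectivity on in-neighbourhoods) for a homomorphism of an oriented graph G to a fixed tournament H; for the directed 3-cycle as H, a vertex's in-neighbours must all receive distinct colours.

```python
from itertools import product

def has_li_homomorphism(G_vertices, G_arcs, H_vertices, H_arcs):
    """Brute-force: does an H-colouring of G exist that is injective on
    in-neighbourhoods?  (One of several local-injectivity variants.)"""
    H_set = set(H_arcs)
    in_nbrs = {v: [u for (u, w) in G_arcs if w == v] for v in G_vertices}
    for f in product(H_vertices, repeat=len(G_vertices)):
        fmap = dict(zip(G_vertices, f))
        # every arc of G must map to an arc of H
        if any((fmap[u], fmap[v]) not in H_set for (u, v) in G_arcs):
            continue
        # in-neighbours of each vertex must get pairwise distinct images
        if all(len({fmap[u] for u in in_nbrs[v]}) == len(in_nbrs[v])
               for v in G_vertices):
            return True
    return False
```

With H the directed 3-cycle, a directed path admits such a colouring, but a vertex with two in-neighbours does not, since each H-vertex has only one in-neighbour.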
This study, through a re-conceptualization of the epistemological sources of sociological complexity theory, specifically in Edgar Morin's formulation, sheds light on theoretical models as well as empirical methodologies for the sociological analysis of today's complex, interconnected, diverse, and globalized society and global disorder. Complexity theory leads to a shift in perspective and a transformation of the epistemological status of the social sciences, with an in-depth intervention of disorder, contingency, the singular case, and the non-repeatable into sociological analysis. The notion of dialogic interplay is placed at the paradigm level and stands at the heart of the concepts, analyzing the social system as auto-eco-organizer. Similarly, the notion of 'emergence' at the macro and micro levels imposes itself as complex, logically requiring that simple, linear thinking and models of explanation be overcome in favor of the perspective of organizational recursion, in which the product retroacts on and transforms its producer, conceiving a circularity of co-production between individuals and society through interactions. Translating this epistemology and sociological complexity theory into an empirical methodological setting, the complex sociological approach is phenomenon-, event/information-, and crisis-centered, privileging observation, participation-intervention, and 'live inquiry'. The open, in-depth, and possibly non-directive interview is part of the clinical sociological methodology, raising the question of the observer-phenomenon-observed relation.
The vapor film collapse that occurs during quenching is complicated and affects heat treatment quality and distortion. To incorporate it into the MBD (Model-Based Development) technology required today, it is necessary to predict heat treatment quality by CAE (Computer-Aided Engineering), shortening the product development period, and to calculate the vapor film collapse in a simple and practical time in order to improve product performance. In the past, however, formulating the vapor film collapse in simulation required a very large amount of CFD (computational fluid dynamics) computation, which posed problems in terms of computer resources and the modeling of vapor film collapse. In addition, the phenomenon behaves in a complex way under iterative processing, which further complicates the calculation. In this study, the vapor film collapse phenomenon is visualized simply using a self-organized cellular automaton simulation that includes the phenomena of 'vapor film thickness and its fluctuation', 'flow disturbance', 'surface steps of the workpiece', and 'decrease of cooling due to the r shape of the surface'. The average cooling state and the repeated fluctuations of the cooling state were reproduced by this method.
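As a toy illustration of the cellular-automaton idea (not the authors' rule set), the sketch below lets collapse propagate along a 1-D film only where the local film thickness is below a critical value, so thick-film cells arrest the collapse front:

```python
def ca_step(film, thickness, t_crit=0.3):
    """One step of a toy CA for vapor-film collapse.

    film: list of 1 (film intact) / 0 (collapsed) cells.
    An intact cell collapses when an adjacent cell has already collapsed
    and its own film thickness is below t_crit.  Illustrative sketch only;
    the published model also includes fluctuation, flow disturbance,
    surface steps, and surface-shape effects.
    """
    n = len(film)
    new = film[:]
    for i in range(n):
        if film[i] == 1 and thickness[i] < t_crit:
            left = film[i - 1] if i > 0 else 1
            right = film[i + 1] if i < n - 1 else 1
            if left == 0 or right == 0:
                new[i] = 0
    return new
```

Iterating the step shows the qualitative behavior: collapse spreads through thin-film regions in one step per cell and then halts at thick-film cells.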
For many continuous bio-medical signals with both strong nonlinearity and non-stationarity, two criteria were proposed for their complexity estimation: (1) only a short data set should be needed for robust estimation; and (2) no over-coarse graining preprocessing, such as transforming the original signal into a binary time series, should be required. The C0 complexity measure we proposed previously is one such measure; however, it lacked a solid mathematical foundation, which limited its use. A modified version of this measure is proposed here, and some of its important properties are proved rigorously. According to these properties, the measure can be considered an index of the randomness of a time series in some sense, and thus also a quantitative index of complexity in the sense of randomness-finding complexity. Compared with similar measures, it is more suitable for estimating a large number of complexity values for a given task, such as studying the dynamic variation of such measures in sliding windows over a long process, owing to its fast estimation speed.
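One common formulation of the C0 complexity is: take the spectrum of the series, keep only components whose power exceeds r times the mean power as the 'regular' part, and report the relative power of the residual. A minimal sketch with a naive DFT follows; the exact normalization conventions vary across papers, so treat the details as one possible variant.

```python
import cmath

def c0_complexity(x, r=1.0):
    """C0 complexity of a real series.

    Spectral components with power above r times the mean power form the
    'regular' part; C0 is the power fraction of the irregular residual.
    Naive O(N^2) DFT for self-containment; use an FFT in practice.
    """
    N = len(x)
    X = [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
             for n in range(N)) for k in range(N)]
    mean_p = sum(abs(v) ** 2 for v in X) / N
    Xr = [v if abs(v) ** 2 > r * mean_p else 0 for v in X]
    xr = [sum(Xr[k] * cmath.exp(2j * cmath.pi * k * n / N)
              for k in range(N)) / N for n in range(N)]
    num = sum(abs(a - b) ** 2 for a, b in zip(x, xr))
    den = sum(abs(a) ** 2 for a in x)
    return num / den
```

A pure sinusoid is entirely 'regular' and scores near 0, while broadband noise leaves a substantial residual, matching the measure's role as a randomness index.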
Using gradient pattern analysis, twenty plots were set at altitudes of 700-2600 m at intervals of 100 m on the northern slope of Changbai Mountain. The dissimilarity of sub-plots within the same community was measured, and the complexity of plant communities at different altitudes was analyzed. The results from binary data of tree species in the canopy indicated that the sub-plots in the communities, except the subalpine Betula ermanii forest, showed comparatively high dissimilarity in species composition. In particular, the dissimilarity index (0.7) of the broadleaved/Korean pine forest at low altitudes was obviously higher than that of other communities. The differences between communities belonging to dark coniferous forest are not obvious; comparatively, the dissimilarity of sub-plots of the communities at an altitude of 1400 m was slightly higher than that of other communities, reflecting the complexity of tree species composition in transitional communities. For the subalpine Betula ermanii forest, tree species composition was simple and showed high similarity between sub-plots. The results derived from binary data of shrubs showed that the dissimilarity index of shrub species in the broadleaved/Korean pine forest at low altitudes was higher than in other communities, but the divergence tendency was not as obvious as that of tree species. The dissimilarities derived from binary data of herbs and of all plant species at different altitudes showed closely similar tendencies, and the differences in herb and all plant species between sub-plots were greatest for the broadleaved/Korean pine forest and alpine tundra communities.
The complexity and applicability of three representative car-following models are investigated: the optimal velocity model (OVM), the generalized force model (GFM), and the full velocity difference model (FVDM). The vehicle trajectory data used were collected from digital pictures taken from a 30-storey building near the I-80 freeway. Three different calibration methods are used to estimate the model parameters and to study the relationships between model complexity and applicability through overall, inter-driver, and intra-driver analyses. Results for the OVM, GFM, and FVDM under the three methods show that complexity and applicability are not consistent, and that complicated models are not always superior to simple ones in modeling car-following. The findings of this study provide useful information for car-following behavior modeling.
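The OVM referred to above updates each follower's speed toward an optimal velocity V of its headway, dv/dt = a(V(h) - v). A minimal Euler-integration sketch with an illustrative (not calibrated) OV function:

```python
import math

def ovm_step(positions, velocities, a=1.0, dt=0.1, v_lead=0.5):
    """One Euler step of the optimal velocity model for a platoon.

    Car 0 is the leader, driven at constant speed v_lead; each follower
    accelerates toward V(headway) with sensitivity a.
    V(h) = tanh(h - 2) + tanh(2) is an illustrative OV function, not a
    calibrated one from the study.
    """
    def V(h):
        return math.tanh(h - 2.0) + math.tanh(2.0)

    n = len(positions)
    new_p, new_v = positions[:], velocities[:]
    new_p[0] += v_lead * dt  # leader moves at constant speed
    for i in range(1, n):
        h = positions[i - 1] - positions[i]        # headway to car ahead
        new_v[i] = velocities[i] + a * (V(h) - velocities[i]) * dt
        new_p[i] = positions[i] + velocities[i] * dt
    return new_p, new_v
```

Iterated long enough, a single follower relaxes to the leader's speed at the equilibrium headway h* satisfying V(h*) = v_lead (about 1.498 for these parameters).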
The linear complexity and the k-error linear complexity of a stream cipher are two important measures of the randomness of keystreams. For 2^n-periodic binary sequences with linear complexity 2^n - 1 and k = 2, 3, the number of sequences with a given k-error linear complexity and the expected k-error linear complexity are provided. Moreover, the proportion of sequences whose k-error linear complexity exceeds the expected value is analyzed.
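The linear complexity of a binary sequence is computed by the Berlekamp-Massey algorithm, which finds the length of the shortest LFSR generating the sequence (the k-error variant then minimizes this over all ways of flipping at most k bits):

```python
def linear_complexity(s):
    """Berlekamp-Massey over GF(2): length of the shortest LFSR
    generating the binary sequence s (list of 0/1)."""
    n = len(s)
    c = [0] * n  # current connection polynomial
    b = [0] * n  # previous connection polynomial
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # discrepancy between s[i] and the LFSR's prediction
        d = s[i]
        for j in range(1, L + 1):
            d ^= c[j] & s[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m):
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L
```

The canonical sanity checks: a constant sequence has complexity 1, an alternating one 2, and a sequence of zeros ending in a single 1 attains the maximum.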
A large unified hybrid network model with variable speed growth (LUHNM-VSG) is proposed as the third model of the unified hybrid network theoretical framework (UHNTF). A hybrid growth ratio vg of the deterministic linking number to the random linking number and a variable speed growth index a are introduced, and their main effects on the topological transition features of the LUHNM-VSG are revealed. For comparison with the other models, we construct a network complexity pyramid with seven levels, in which, from the bottom level-1 to the top level-7 of the pyramid, simplicity-universality increases while complexity-diversity decreases. The transition relations between the levels depend on the matching of the four hybrid ratios (dr, fd, gr, vg); thus most network models can be investigated in a unified way via these four ratios. The LUHNM-VSG, as level-1 of the pyramid, is much closer to describing real-world networks and has potential applications.
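The hybrid deterministic/random linking idea can be caricatured in a few lines: each new node attaches either to the current highest-degree node (deterministic) or to a uniformly random node, with the mix controlled by a probability standing in for the ratio vg. This is a toy sketch, not the LUHNM-VSG itself:

```python
import random

def grow_hybrid_network(steps, p_det=0.5, seed=0):
    """Toy hybrid growth: each new node links to one existing node,
    deterministically (current highest-degree node, lowest id on ties)
    with probability p_det, uniformly at random otherwise.
    Returns the edge list.  Illustrative sketch of a deterministic/random
    linking mix, not the published model."""
    rng = random.Random(seed)
    degree = {0: 1, 1: 1}
    edges = [(0, 1)]  # seed network: one edge
    for new in range(2, steps + 2):
        if rng.random() < p_det:
            target = max(degree, key=lambda v: (degree[v], -v))
        else:
            target = rng.choice(list(degree))
        edges.append((target, new))
        degree[target] += 1
        degree[new] = 1
    return edges
```

At p_det = 1 the model degenerates to a star (all links to one hub); at p_det = 0 it is pure random attachment, illustrating how a single ratio moves the topology between regimes.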
Funding (FL/DML poisoning study): supported in part by the "Pioneer" and "Leading Goose" R&D Program of Zhejiang (Grant No. 2022C03174); the National Natural Science Foundation of China (No. 92067103); the Key Research and Development Program of Shaanxi, China (No. 2021ZDLGY06-02); the Natural Science Foundation of Shaanxi Province (No. 2019ZDLGY12-02); the Shaanxi Innovation Team Project (No. 2018TD-007); the Xi'an Science and Technology Innovation Plan (No. 201809168CX9JC10); the Fundamental Research Funds for the Central Universities (No. YJS2212); and the National 111 Program of China (B16037).
Funding (microchannel mass transfer study): supported by the Ministry of Science and Higher Education of Russia (Theme No. 368121031700169-1 of ICMM UrB RAS).
Abstract: Continuous-flow microchannels are widely employed for synthesizing various materials, including nanoparticles, polymers, and metal-organic frameworks (MOFs), to name a few. Microsystem technology allows precise control over reaction parameters, resulting in purer, more uniform, and structurally stable products due to more effective manipulation of mass transfer. However, continuous-flow synthesis processes may be accompanied by the emergence of spatial convective structures that initiate convective flows. On the one hand, convection can accelerate reactions by intensifying mass transfer. On the other hand, it may lead to non-uniformity in the final product or to defects, especially in MOF microcrystal synthesis. The ability to distinguish regions of convective and diffusive mass transfer may be the key to performing higher-quality reactions and obtaining purer products. In this study, we investigate, for the first time, the possibility of using an information complexity measure as a criterion for assessing the intensity of mass transfer in microchannels, considering both the spatial and the temporal non-uniformities of the liquid distributions that result from convection formation. We calculate the complexity using a shearlet transform based on a local approach. In contrast to existing methods for calculating complexity, the shearlet-transform-based approach provides a more detailed representation of local heterogeneities. Our analysis involves experimental images illustrating the mixing of two non-reactive liquids in a Y-type continuous-flow microchannel under conditions of double-diffusive convection. The obtained complexity fields characterize the mixing process and structure formation, revealing variations in mass transfer intensity along the microchannel. We compare the results with cases of liquid mixing via a purely diffusive mechanism. The analysis revealed that the complexity measure is sensitive to variations in the type of mass transfer, establishing its feasibility as an indirect criterion for assessing mass transfer intensity. The method can extend beyond flow analysis, finding application in controlling the microstructure of various materials (porosity, for instance) or surface defects in metals, optical systems, and other materials of significant relevance in materials science and engineering.
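The abstract computes its complexity fields with a shearlet transform, which needs a dedicated library. As a rough, hypothetical stand-in, the sketch below scores local heterogeneity in a mixing image with patch-wise Shannon entropy of intensity histograms; the function name, patch size, and bin count are all illustrative choices, not the paper's method.

```python
import numpy as np

def local_complexity(field, patch=8, bins=16):
    """Patch-wise Shannon entropy of intensity histograms, a simplified
    proxy for a local information complexity field over an image."""
    h, w = field.shape
    out = np.zeros((h // patch, w // patch))
    for i in range(h // patch):
        for j in range(w // patch):
            block = field[i*patch:(i+1)*patch, j*patch:(j+1)*patch]
            hist, _ = np.histogram(block, bins=bins, range=(0.0, 1.0))
            p = hist / hist.sum()
            p = p[p > 0]
            out[i, j] = -(p * np.log2(p)).sum()  # entropy in bits
    return out

# A uniform (well-mixed) field has low complexity; a speckled
# (convectively non-uniform) field has high complexity.
uniform = np.full((32, 32), 0.5)
speckled = np.random.default_rng(0).random((32, 32))
```

A real shearlet-based field would additionally resolve the orientation of the structures, which a plain histogram cannot.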
Abstract: Living objects have complex internal and external interactions. This complexity is regulated and controlled by homeostasis, the balance of multiple opposing influences. Environmental effects ultimately guide the self-organized structure. Living systems are open, dynamic structures performing random, stationary, stochastic, self-organizing processes. The self-organizing procedure is defined by a spatial-temporal fractal structure that is self-similar both in space and time. The system's complexity appears in its energetics, which strives for the most efficient use of the available energies; for that, it organizes various well-connected networks. The controller of environmental relations is Darwinian selection on a long time scale. The energetics optimize the healthy processes, tuned to the highest efficacy and minimal loss (minimization of entropy production). The organism is built up by morphogenetic rules and develops various networks from the genetic level to the whole organism. The networks have intensive crosstalk and form a balance in the Nash equilibrium, which is the homeostatic state in healthy conditions. Homeostasis may be described as a Nash equilibrium that ensures energy distribution in a "democratic" way with regard to the functions of the parts in the complete system. Cancer radically changes the network system in the organism: cancer is a network disease. Deviation from healthy networking appears at every level, from the genetic (molecular) level to cells, tissues, organs, and organisms. The strong proliferation of malignant tissue is the origin of most of the life-threatening processes. The weak side of cancer development is the change of complex information networking in the system, which is vulnerable to immune attacks. Cancer cells are masters of adaptation and evade immune surveillance, but this hiding process can be broken by nonionizing electromagnetic radiation, for which the malignant structure has no adaptation strategy. Our objective is to review the different sides of living complexity and to use that knowledge to fight against cancer.
Abstract: This study examines the role of the syntactic complexity of a text in the reading comprehension skills of students. Utilizing a qualitative research method, this paper used structured interview questions as the main data-gathering instrument. English language teachers from Coral na Munti National High School were selected as the respondents of the study. The findings suggest that the syntactic complexity of a text affects students' reading comprehension. Students found it challenging to understand the message the author conveyed when a large number of phrases and clauses were used in one sentence. Furthermore, the complex-sentence syntactic structure was deemed the most challenging for students to understand. To overcome these challenges in comprehending text, various reading intervention programs were utilized by teachers. These interventions include focused or targeted instruction and the implementation of Project Dear, suggested by the Department of Education. These programs were proven to help students improve their comprehension as well as their knowledge of the syntactic structure of sentences. This study underscores the importance of selecting appropriate reading materials and implementing suitable reading intervention programs to enhance students' comprehension skills.
Abstract: The rhetorical structure of abstracts has been a widely discussed topic, as understanding it can greatly enhance the abstract-writing skills of second-language writers. This study aims to provide guidance on the syntactic features that L2 learners can employ, as well as to suggest which features they should focus on in English academic writing. To achieve this, all samples were analyzed for rhetorical moves using Hyland's five-move model. Additionally, all sentences were evaluated for syntactic complexity, considering measures of global, clausal, and phrasal complexity. The findings reveal that expert writers exhibit a more balanced use of syntactic complexity across moves, effectively fulfilling the rhetorical objectives of abstracts. MA students, on the other hand, tend to rely excessively on embedded structures and dependent clauses in an attempt to increase complexity. The implications of these findings for academic writing research, pedagogy, and assessment are thoroughly discussed.
Abstract: Globally, economies have become complex, and new technologies have transformed and facilitated their modernization. In the previous literature, the economic complexity approach has become one of the popular tools in the development and innovation studies of economic geography. Researchers have found that green technology and eco-innovation approaches should be used to decisively reduce the effects of carbon emissions on the environment. However, debates about the impact of economic complexity on the environment remain unsettled, since some emerging production technologies have far-reaching pollution effects. This study explored the impacts of economic complexity on environmental sustainability in Turkey using novel Fourier-based approaches, namely the Fourier Augmented Dickey-Fuller (FADF) and Fourier Autoregressive-Distributed Lag (FARDL) models. The Fourier-based approaches indicated that all variables (the economic complexity index (ECI), GDP, energy consumption, and CO_(2) emissions (CO_(2)E)) are cointegrated in the long run. Additionally, the FARDL model implied that (i) in the long run, the effects of ECI (as a proxy for economic complexity), GDP (as a proxy for economic growth), and energy consumption on CO_(2)E (as a proxy for environmental quality) are important; (ii) economic complexity decreases environmental degradation in Turkey; and (iii) economic growth and energy consumption negatively affect environmental quality. The results also showed that economic complexity could be used as a policy tool to tackle environmental degradation. The findings further revealed that the fossil-fuel-based economy will continue to expand and undermine Turkey's efforts to meet its net-zero emission target by 2053. Therefore, policy-makers should take action and establish diversified economic, environmental, and energy strategies. For policy insights, the Turkish government can use a combination of tax exemptions and technical support systems to support knowledge creation and the diffusion of environmentally friendly technologies. The government can also impose strict environmental regulations during the knowledge development phases.
Abstract: Aim: To present a quantitative method for the structural complexity analysis and evaluation of information systems. Methods: Based on Petri net modeling and analysis techniques, and with the aid of mathematical tools from general net theory (GNT), a quantitative method for the structural description and analysis of information systems was introduced. Results: The structural complexity index and two related factors, i.e., the element complexity factor and the connection complexity factor, were defined, and the relations between them and the parameters of the Petri-net-based model of the system were derived. An application example is presented. Conclusion: The proposed method provides a theoretical basis for quantitative analysis and evaluation of structural complexity and can be applied in the general planning and design processes of information systems.
Abstract: This study proposes a novel fractional discrete-time macroeconomic system with incommensurate order. The dynamical behavior of the proposed macroeconomic model is investigated analytically and numerically. In particular, the stability of the zero equilibrium point is investigated to demonstrate that the discrete macroeconomic system exhibits chaotic behavior. Using bifurcation diagrams, phase attractors, the maximum Lyapunov exponent, and the 0-1 test, we verified that chaos exists in the new model with incommensurate fractional orders. Additionally, a complexity analysis is carried out utilizing approximate entropy (ApEn) and C_(0) complexity to confirm that chaos exists. Finally, the main findings of this study are presented using numerical simulations.
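Approximate entropy, one of the two complexity measures named in the abstract above, can be sketched directly from its standard definition. The parameter choices m = 2 and r = 0.2·std are common defaults, not values taken from the paper, and the logistic-map demo series is illustrative.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r): larger values indicate a more
    irregular (e.g. chaotic) time series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])
        # Chebyshev distance between every pair of length-mm templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).sum(axis=1) / n
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

# A chaotic logistic-map orbit is more irregular than a pure sine wave.
t = np.arange(300)
periodic = np.sin(2 * np.pi * t / 25)
x, orbit = 0.4, []
for _ in range(300):
    x = 3.99 * x * (1 - x)
    orbit.append(x)
chaotic = np.array(orbit)
```

The O(n^2) pairwise distance matrix keeps the sketch short; production implementations usually stream the template comparisons instead.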
Funding: This work was supported in part by the National Natural Science Foundation of China (NSFC) under grant numbers 61671047, 61775015, and U2006217.
Abstract: This paper proposes the alternating direction method of multipliers-based infinity-norm (ADMIN) with threshold (ADMIN-T) and with percentage (ADMIN-P) detection algorithms, which make full use of the distribution of the signal-to-interference-plus-noise ratio (SINR) in an uplink massive MIMO system. The ADMIN-T and ADMIN-P detection algorithms are improved versions of the ADMIN detection algorithm, in which an appropriate SINR threshold (in ADMIN-T) and a certain percentage (in ADMIN-P) are designed to reduce the overall computational complexity. The detected symbols are divided into two parts, by an SINR threshold based on the cumulative distribution function (CDF) of the SINR and by a percentage, respectively. The symbols in the higher-SINR part are detected by MMSE, and the interference of these symbols is then cancelled by successive interference cancellation (SIC). Afterwards, the remaining symbols with low SINR are iteratively detected by ADMIN. The simulation results show that the ADMIN-T and ADMIN-P detection algorithms provide a significant performance gain compared with some recently proposed detection algorithms, while their computational complexity is significantly reduced. Furthermore, for an equal total number of antennas, the proposed algorithms achieve higher performance with symmetric transceiver antennas than with asymmetric ones.
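The threshold-then-cancel pipeline described above can be illustrated in a few lines. The sketch below is a deliberately simplified assumption: it splits streams by post-MMSE SINR percentile, detects the high-SINR half with MMSE, cancels them, and then detects the remainder again with MMSE rather than with the paper's ADMIN iterations; `qpsk_slice` and all parameters are illustrative.

```python
import numpy as np

def qpsk_slice(z):
    # Hard decision to the nearest unit-energy QPSK constellation point
    return (np.sign(z.real) + 1j * np.sign(z.imag)) / np.sqrt(2)

def split_detect(H, y, sigma2, pct=50):
    """Detect high-SINR streams by MMSE, cancel them (SIC), then detect
    the remaining low-SINR streams (here again by MMSE, not ADMIN)."""
    Nt = H.shape[1]
    A = H.conj().T @ H / sigma2 + np.eye(Nt)
    sinr = 1.0 / np.real(np.diag(np.linalg.inv(A))) - 1.0  # post-MMSE SINR
    good = sinr >= np.percentile(sinr, 100 - pct)
    x_hat = np.zeros(Nt, dtype=complex)
    # Stage 1: MMSE on the full system, keep only high-SINR decisions
    W = np.linalg.solve(H.conj().T @ H + sigma2 * np.eye(Nt), H.conj().T)
    x_hat[good] = qpsk_slice((W @ y)[good])
    # Stage 2: cancel detected symbols, then detect the rest
    y_res = y - H[:, good] @ x_hat[good]
    Hr = H[:, ~good]
    Wr = np.linalg.solve(Hr.conj().T @ Hr + sigma2 * np.eye(Hr.shape[1]),
                         Hr.conj().T)
    x_hat[~good] = qpsk_slice(Wr @ y_res)
    return x_hat
```

The complexity saving in the real algorithms comes from running the expensive iterative detector only on the reduced low-SINR subsystem, which the second stage mirrors.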
基金supported in part by the Basic Science Research Program through the National Research Foundation of Korea(NRF)funded by the Ministry of Education under Grant NRF-2019R1A2C1006159 and Grant NRF-2021R1A6A1A03039493in part by the 2022 Yeungnam University Research Grant.
Abstract: Globally, traffic signs are used by all countries to promote safe traffic flow and to protect drivers and pedestrians. Consequently, traffic signs are of great importance in every civilized country, which has led researchers to focus on their automatic detection. Detecting traffic signs is challenging because they may be in the dark, far away, partially occluded, or affected by lighting or the presence of similar objects. An innovative detection method for red and blue traffic signs in color images is proposed to resolve these issues, with the aim of devising an efficient, robust, and accurate approach. To attain this, the approach first presents a new formula, inspired by existing work, to enhance the image using the red and green channels instead of blue; the result is segmented using a threshold calculated from the correlational property of the image. Next, a new set of features is proposed, motivated by existing features: texture and color features are extracted from the Red, Green, and Blue (RGB); Hue, Saturation, and Value (HSV); and YCbCr color models of the images and then fused. Finally, the feature set is fed to different classification frameworks, among which the quadratic support vector machine (SVM) outperformed the others with an accuracy of 98.5%. The proposed method is tested on German Traffic Sign Detection Benchmark (GTSDB) images, and the results are satisfactory when compared to preceding work.
Abstract: Task-Based Language Teaching has witnessed growing interest in the impact of task complexity on second language (L2) learners' linguistic performance. Research on task complexity focuses on the influence of different task features on language learners' production in terms of linguistic complexity, accuracy, fluency, and lexis. Within this field, a line of investigation that has attracted much attention is the influence of the resource-directing and resource-dispersing features of cognitive task complexity. This review is helpful for a better understanding of task complexity and allows us to draw some preliminary pedagogical implications that may be useful for task-based syllabus design.
Abstract: Several possible definitions of local injectivity for a homomorphism of an oriented graph G to an oriented graph H are considered. In each case, we determine the complexity of deciding whether such a homomorphism exists when G is given and H is a fixed tournament on three or fewer vertices. Each possible definition leads to a locally injective oriented colouring problem, and a dichotomy theorem is proved in each case.
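For small instances, the existence question behind each such colouring problem can be checked by brute force. The sketch below tests one hypothetical choice among the several notions of local injectivity the abstract mentions, namely injectivity on in-neighbourhoods; the arc-list encoding and function name are illustrative.

```python
from itertools import product

def has_li_hom(G_arcs, n_G, H_arcs, n_H):
    """Brute-force search for a homomorphism phi of oriented graph G to H
    (every arc (u, v) of G maps to an arc (phi(u), phi(v)) of H) that is
    injective on the in-neighbourhood of every vertex of G."""
    H = set(H_arcs)
    in_nbrs = {v: [u for (u, w) in G_arcs if w == v] for v in range(n_G)}
    for phi in product(range(n_H), repeat=n_G):
        if any((phi[u], phi[v]) not in H for (u, v) in G_arcs):
            continue  # not a homomorphism
        if all(len({phi[u] for u in in_nbrs[v]}) == len(in_nbrs[v])
               for v in range(n_G)):
            return True  # injective on every in-neighbourhood
    return False
```

The n_H^n_G enumeration is exponential in G; the point of the paper's dichotomy is precisely to classify which fixed targets H admit a polynomial-time decision instead.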
Abstract: This study, through a re-conceptualization of the epistemological sources of sociological complexity theory, specifically in Edgar Morin's formulation, sheds light on the theoretical models and empirical methodologies of sociological analysis of today's complex, interconnected, diverse, and globalized society and global disorder. Complexity theory leads to a shift in perspective and a transformation of the epistemological status of the social sciences, with an in-depth intervention of disorder, contingency, chance, the singular, and the non-repeatable in sociological analysis. The notion of dialogic interplay is placed at the paradigm level and stands at the heart of the concepts, analyzing the social system as auto-eco-organizing. Similarly, the notion of 'emergence' at the macro and micro levels imposes itself as complex, logically requiring the overcoming of simple, linear thinking and models of explanation in order to adopt the perspective of organizational recursivity, in which the product retroacts on and transforms its producer, conceiving a circularity of co-production between individuals and society through interactions. Translating this epistemology and sociological complexity theory into an empirical methodological setting, the complex sociological approach is phenomenon-, event/information-, and crisis-centered, privileging observation, participation-intervention, and 'live inquiry'. The open, in-depth, and possibly non-directive interview is part of the clinical sociological methodology, raising the question of the observer-phenomenon-observed relation.
Abstract: The vapor film collapse that occurs in the quenching process is complicated and affects both the quality of heat treatment and the resulting distortion. In order to incorporate quenching into the MBD (Model-Based Development) technology required these days, it is necessary to predict heat treatment quality by CAE (Computer-Aided Engineering), shorten the product development period, and calculate the vapor film collapse simply and in a practical time so as to improve product performance. In the past, however, formulating the vapor film collapse in a simulation required a very large amount of computation with CFD (computational fluid dynamics), which posed problems both in computer resources and in modeling the vapor film collapse itself. In addition, the phenomenon behaves in a complex way under iterative processing, which further complicates the calculation. In this study, the vapor film collapse phenomenon is easily visualized using a self-organized cellular automaton simulation that includes the phenomena of vapor film thickness and its fluctuation, flow disturbance, the surface step of the workpiece, and the decrease in cooling due to the rounded (R) shape of the surface. The average cooling state and the repeated fluctuations of the cooling state were reproduced by this method.
Abstract: For many continuous bio-medical signals with both strong nonlinearity and non-stationarity, two criteria were proposed for their complexity estimation: (1) only a short data set is needed for robust estimation; (2) no over-coarse graining preprocessing, such as transforming the original signal into a binary time series, is required. The C0 complexity measure we proposed previously is one such measure. However, it lacks a solid mathematical foundation, which has limited its use. A modified version of this measure is proposed here, and some important properties are proved rigorously. According to these properties, the measure can be considered an index of the randomness of a time series in some sense, and thus also a quantitative index of complexity in the sense of randomness-finding complexity. Compared with other similar measures, this measure seems more suitable for estimating a large number of complexity values for a given task, such as studying the dynamic variation of the measure in sliding windows over a long process, owing to its fast estimation speed.
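The idea behind a C0-type measure can be sketched from its usual spectral form: remove the "regular" part of the signal (the FFT components whose power exceeds r times the mean power) and report the energy fraction of what remains. The threshold parameter and normalization below are illustrative, not the modified version the abstract proposes.

```python
import numpy as np

def c0_complexity(x, r=1.0):
    """C0-style complexity: energy fraction of the 'irregular' residual
    after deleting spectral components above r times the mean power."""
    x = np.asarray(x, dtype=float)
    X = np.fft.fft(x)
    power = np.abs(X) ** 2
    keep = power > r * power.mean()          # strong components = regular part
    regular = np.fft.ifft(np.where(keep, X, 0)).real
    irregular = x - regular
    return np.sum(irregular ** 2) / np.sum(x ** 2)
```

A pure tone concentrates its energy in a few strong bins, so almost nothing is left after removing them; broadband noise spreads energy across bins and retains a large residual.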
基金supported by the Chinese Academy of Science(grand KZCX2-406)founded by Chinese Science of Academy undred People’Project.
Abstract: By the method of gradient pattern analysis, twenty plots were set at altitudes of 700-2600 m, at intervals of 100 m, on the northern slope of the Changbai Mountain. The dissimilarity of the sub-plots within the same community was measured, and the complexity of plant communities at different altitudes was analyzed. The results from the binary data of canopy tree species indicated that the sub-plots in the communities, except the subalpine Betula ermanii forest, showed comparatively high dissimilarity in species composition. In particular, the dissimilarity index (0.7) of the broadleaved/Korean pine forest at low altitudes was obviously higher than that of other communities. The differences between communities belonging to dark coniferous forest are not obvious. Comparatively, the dissimilarity between sub-plots of the communities at an altitude of 1400 m was slightly higher than that of other communities, reflecting the complexity of the tree species composition of transitional communities. For the subalpine Betula ermanii forest, the tree species composition was simple and showed high similarity between sub-plots. The results derived from the binary data of shrubs showed that the dissimilarity index of shrub species in the broadleaved/Korean pine forest at low altitudes was higher than in other communities, but the divergence tendency was not as obvious as that of tree species. The dissimilarities derived from the binary data of herbs and of all plant species at different altitudes showed closely similar tendencies, and the differences in herb and all plant species between sub-plots were greatest for the communities of the broadleaved/Korean pine forest and the alpine tundra zone.
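The abstract does not name the exact dissimilarity index computed from the binary presence/absence data; a common choice for such data is the Sørensen index, sketched below (the species names in the usage example are made up).

```python
def sorensen_dissimilarity(plot_a, plot_b):
    """Sorensen dissimilarity between two presence/absence species lists:
    0 means identical composition, 1 means no shared species."""
    a, b = set(plot_a), set(plot_b)
    if not a and not b:
        return 0.0  # two empty plots are trivially identical
    return 1.0 - 2.0 * len(a & b) / (len(a) + len(b))
```

A community-level value like the 0.7 reported for the broadleaved/Korean pine forest would then be an average of such pairwise values over all sub-plot pairs.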
Funding: The National Basic Research Program of China (973 Program) (No. 2012CB725402) and the National Natural Science Foundation of China (No. 51478113).
Abstract: The complexity and applicability of three related car-following models are investigated: the optimal velocity model (OVM), the generalized force model (GFM), and the full velocity difference model (FVDM). The vehicle trajectory data used were collected from digital pictures taken from a 30-storey building near the I-80 freeway. Three different calibration methods are used to estimate the model parameters and to study the relationship between model complexity and applicability from overall, inter-driver, and intra-driver analyses. The results of the three methods for the OVM, GFM, and FVDM show that complexity and applicability are not consistent, and that complicated models are not always superior to simple ones in modeling car-following. The findings of this study can provide useful information for car-following behavior modeling.
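The three models form a nested family: the FVDM adds a velocity-difference term to the OVM (with the GFM in between). A minimal sketch with a Bando-type optimal velocity function follows; all parameter values are illustrative placeholders, not calibrated values from the study.

```python
import numpy as np

def optimal_velocity(gap, v_max=33.0, h_c=25.0, k=0.13):
    # Bando-type optimal velocity as a function of the headway (gap), m/s
    return 0.5 * v_max * (np.tanh(k * (gap - h_c)) + np.tanh(k * h_c))

def fvdm_accel(gap, dv, v, alpha=0.6, lam=0.5):
    """FVDM acceleration: relax toward the optimal velocity for the
    current gap, plus a reaction to the speed difference dv (leader
    minus follower). Setting lam = 0 recovers the OVM."""
    return alpha * (optimal_velocity(gap) - v) + lam * dv
```

The extra `lam * dv` term is exactly the kind of added complexity whose payoff (or lack of it) the calibration study measures.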
基金the National Natural Science Foundation of China (No.60373092).
Abstract: The linear complexity and the k-error linear complexity of a stream cipher are two important standards for measuring the randomness of keystreams. For 2^n-periodic binary sequences with linear complexity 2^n-1 and k = 2, 3, the number of sequences with a given k-error linear complexity and the expected k-error linear complexity are provided. Moreover, the proportion of sequences whose k-error linear complexity is larger than the expected value is analyzed.
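For 2^n-periodic binary sequences, the ordinary (0-error) linear complexity itself can be computed with the Games-Chan algorithm in time linear in the period length, a convenient baseline when studying the k-error variant. The sketch below takes one full period as a 0/1 list.

```python
def games_chan(period):
    """Games-Chan algorithm: linear complexity of a binary sequence whose
    period length is a power of two, given one full period."""
    s = list(period)
    assert len(s) & (len(s) - 1) == 0, "period length must be a power of 2"
    c = 0
    while len(s) > 1:
        half = len(s) // 2
        left, right = s[:half], s[half:]
        if left == right:       # complexity is confined to the first half
            s = left
        else:                   # add half, continue on left XOR right
            c += half
            s = [l ^ r for l, r in zip(left, right)]
    return c + s[0]
```

The k-error variant studied in the abstract asks how far this value can drop when up to k bits of the period are flipped, which is what makes the counting results nontrivial.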
基金Supported by National Natural Science Foundation of China under Grant Nos. 70431002, 10647001, and 60874087
Abstract: A large unified hybrid network model with variable speed growth (LUHNM-VSG) is proposed as the third model of the unified hybrid network theoretical framework (UHNTF). A hybrid growth ratio vg of the deterministic linking number to the random linking number and a variable speed growth index a are introduced. The main effects of vg and a on the topological transition features of the LUHNM-VSG are revealed. For comparison with the other models, we construct a type of network complexity pyramid with seven levels, in which, from the bottom level-1 to the top level-7 of the pyramid, simplicity-universality increases while complexity-diversity decreases. The transition relations between the levels depend on the matching of the four hybrid ratios (dr, fd, gr, vg); thus, most network models can be investigated in a unified way via these four ratios. The LUHNM-VSG, as level-1 of the pyramid, is much better and closer to the description of real-world networks and has potential applications.