The computational techniques are a set of novel problem-solving methodologies that have attracted wider attention for their excellent performance.The handling strategies of real-world problems are artificial neural ne...The computational techniques are a set of novel problem-solving methodologies that have attracted wider attention for their excellent performance.The handling strategies of real-world problems are artificial neural networks(ANN),evolutionary computing(EC),and many more.An estimated fifty thousand to ninety thousand new leishmaniasis cases occur annually,with only 25%to 45%reported to the World Health Organization(WHO).It remains one of the top parasitic diseases with outbreak and mortality potential.In 2020,more than ninety percent of new cases reported to World Health Organization(WHO)occurred in ten countries:Brazil,China,Ethiopia,Eritrea,India,Kenya,Somalia,South Sudan,Sudan,and Yemen.The transmission of visceral leishmaniasis is studied dynamically and numerically.The study included positivity,boundedness,equilibria,reproduction number,and local stability of the model in the dynamical analysis.Some detailed methods like Runge Kutta and Euler depend on time steps and violate the physical relevance of the disease.They produce negative and unbounded results,so in disease dynamics,such developments have no biological significance;in other words,these results are meaningless.But the implicit nonstandard finite difference method does not depend on time step,positive,bounded,dynamic and consistent.All the computational techniques and their results were compared using computer simulations.展开更多
Edge computing paradigm for 5G architecture has been considered as one of the most effective ways to realize low latency and highly reliable communication,which brings computing tasks and network resources to the edge...Edge computing paradigm for 5G architecture has been considered as one of the most effective ways to realize low latency and highly reliable communication,which brings computing tasks and network resources to the edge of network.The deployment of edge computing nodes is a key factor affecting the service performance of edge computing systems.In this paper,we propose a method for deploying edge computing nodes based on user location.Through the combination of Simulation of Urban Mobility(SUMO)and Network Simulator-3(NS-3),a simulation platform is built to generate data of hotspot areas in Io T scenario.By effectively using the data generated by the communication between users in Io T scenario,the location area of the user terminal can be obtained.On this basis,the deployment problem is expressed as a mixed integer linear problem,which can be solved by Simulated Annealing(SA)method.The analysis of the results shows that,compared with the traditional method,the proposed method has faster convergence speed and better performance.展开更多
Although AI and quantum computing (QC) are fast emerging as key enablers of the future Internet, experts believe they pose an existential threat to humanity. Responding to the frenzied release of ChatGPT/GPT-4, thousa...Although AI and quantum computing (QC) are fast emerging as key enablers of the future Internet, experts believe they pose an existential threat to humanity. Responding to the frenzied release of ChatGPT/GPT-4, thousands of alarmed tech leaders recently signed an open letter to pause AI research to prepare for the catastrophic threats to humanity from uncontrolled AGI (Artificial General Intelligence). Perceived as an “epistemological nightmare”, AGI is believed to be on the anvil with GPT-5. Two computing rules appear responsible for these risks. 1) Mandatory third-party permissions that allow computers to run applications at the expense of introducing vulnerabilities. 2) The Halting Problem of Turing-complete AI programming languages potentially renders AGI unstoppable. The double whammy of these inherent weaknesses remains invincible under the legacy systems. A recent cybersecurity breakthrough shows that banning all permissions reduces the computer attack surface to zero, delivering a new zero vulnerability computing (ZVC) paradigm. Deploying ZVC and blockchain, this paper formulates and supports a hypothesis: “Safe, secure, ethical, controllable AGI/QC is possible by conquering the two unassailable rules of computability.” Pursued by a European consortium, testing/proving the proposed hypothesis will have a groundbreaking impact on the future digital infrastructure when AGI/QC starts powering the 75 billion internet devices by 2025.展开更多
Cloud Computing Assisted Instruction shows incomparable advantages over the traditional language teaching, but meanwhile, it exists some major problems, for instance, the information technology is omnipotent, informat...Cloud Computing Assisted Instruction shows incomparable advantages over the traditional language teaching, but meanwhile, it exists some major problems, for instance, the information technology is omnipotent, information input is too excessive and teachers' role is considerably weakened. This article attempts to analyze the problems and promote language teaching reform base on Cloud Computing Assisted Instruction.展开更多
Molecular programming is applied to minimum spanning problem whose solution requires encoding of real values in DNA strands. A new encoding scheme is proposed for real values that is biologically plausible and has a f...Molecular programming is applied to minimum spanning problem whose solution requires encoding of real values in DNA strands. A new encoding scheme is proposed for real values that is biologically plausible and has a fixed code length. According to the characteristics of the problem, a DNA algorithm solving the minimum spanning tree problem is given. The effectiveness of the proposed method is verified by simulation. The advantages and disadvantages of this algorithm are discussed.展开更多
Acoustic propagation problems in the sheared mean flow are numerically investigated using different acoustic propagation equations , including linearized Euler equations ( LEE ) and acoustic perturbation equations ( A...Acoustic propagation problems in the sheared mean flow are numerically investigated using different acoustic propagation equations , including linearized Euler equations ( LEE ) and acoustic perturbation equations ( APE ) .The resulted acoustic pressure is compared for the cases of uniform mean flow and sheared mean flow using both APE and LEE.Numerical results show that interactions between acoustics and mean flow should be properly considered to better understand noise propagation problems , and the suitable option of the different acoustic equations is indicated by the present comparisons.Moreover , the ability of APE to predict acoustic propagation is validated.APE can replace LEE when the 3-D flow-induced noise problem is solved , thus computational cost can decrease.展开更多
The biggest bottleneck in DNA computing is exponential explosion, in which the DNA molecules used as data in information processing grow exponentially with an increase of problem size. To overcome this bottleneck and ...The biggest bottleneck in DNA computing is exponential explosion, in which the DNA molecules used as data in information processing grow exponentially with an increase of problem size. To overcome this bottleneck and improve the processing speed, we propose a DNA computing model to solve the graph vertex coloring problem. The main points of the model are as follows: The exponential explosion prob- lem is solved by dividing subgraphs, reducing the vertex colors without losing the solutions, and ordering the vertices in subgraphs; and the bio-operation times are reduced considerably by a designed parallel polymerase chain reaction (PCR) technology that dramatically improves the processing speed. In this arti- cle, a 3-colorable graph with 61 vertices is used to illustrate the capability of the DNA computing model. The experiment showed that not only are all the solutions of the graph found, but also more than 99% of false solutions are deleted when the initial solution space is constructed. The powerful computational capability of the model was based on specific reactions among the large number of nanoscale oligonu- cleotide strands. All these tiny strands are operated by DNA self-assembly and parallel PCR. After thou- sands of accurate PCR operations, the solutions were found by recognizing, splicing, and assembling. We also prove that the searching capability of this model is up to 0(3^59). By means of an exhaustive search, it would take more than 896 000 years for an electronic computer (5 x 10^14 s-1) to achieve this enormous task. This searching capability is the largest among both the electronic and non-electronic computers that have been developed since the DNA computing model was proposed by Adleman's research group in 2002 (with a searching capability of 0(2^20)).展开更多
To solve job shop scheduling problem, a new approach-DNA computing is used in solving job shop scheduling problem. The approach using DNA computing to solve job shop scheduling is divided into three stands. Finally, o...To solve job shop scheduling problem, a new approach-DNA computing is used in solving job shop scheduling problem. The approach using DNA computing to solve job shop scheduling is divided into three stands. Finally, optimum solutions are obtained by sequencing A small job shop scheduling problem is solved in DNA computing, and the "operations" of the computation were performed with standard protocols, as ligation, synthesis, electrophoresis etc. This work represents further evidence for the ability of DNA computing to solve NP-complete search problems.展开更多
In this paper, the sticker based DNA computing was used for solving the independent set problem. At first, solution space was constructed by using appropriate DNA memory complexes. We defined a new operation called “...In this paper, the sticker based DNA computing was used for solving the independent set problem. At first, solution space was constructed by using appropriate DNA memory complexes. We defined a new operation called “divide” and applied it in construction of solution space. Then, by application of a sticker based parallel algorithm using biological operations, independent set problem was resolved in polynomial time.展开更多
The surface-based DNA computing is one of the methods of DNA computing which uses DNA strands immobilized on a solid surface. In this paper, we applied surface-based DNA computing for solving the dominating set proble...The surface-based DNA computing is one of the methods of DNA computing which uses DNA strands immobilized on a solid surface. In this paper, we applied surface-based DNA computing for solving the dominating set problem. At first step, surface-based DNA solution space was constructed by using appropriate DNA strands. Then, by application of a DNA parallel algorithm, dominating set problem was resolved in polynomial time.展开更多
The main goal of this paper is to compute the Figure-eight solutions for the planar Newtonian 3-body problem with equal masses by finding the critical points of the functional associated with the motion equations of 3...The main goal of this paper is to compute the Figure-eight solutions for the planar Newtonian 3-body problem with equal masses by finding the critical points of the functional associated with the motion equations of 3-body in plane R2. The algorithm adopted here is the steepest descent method, which is simple but very valid for our problem.展开更多
Research in spreadsheet management proved that the overuse of slow thinking, rather than fast thinking, is the primary source of erroneous end-user computing. However, we found that the reality is not that simple. To ...Research in spreadsheet management proved that the overuse of slow thinking, rather than fast thinking, is the primary source of erroneous end-user computing. However, we found that the reality is not that simple. To view end-user computing in its full complexity, we launched a project to investigate end-user education, training, support, activities, and computer problem solving. In this project we also set up the base and mathability-extended typology of computer problem solving approaches, where quantitative values are assigned to the different problem solving methods and activities. In this paper we present the results of our analyses of teaching materials collected in different languages from all over the world and our findings considering the different problem solving approaches, set in the frame of different thinking modes, the characteristics of expert teachers, and the meaning system model of teaching approaches. Based on our research, we argue that the proportions of fast and slow thinking and most importantly their manifestation are responsible for erroneous end-user activities. Applying the five-point mathability scale of computer problem solving, we recognized slow thinking activities on both tails and one fast thinking approach between them. The low mathability slow thinking activities, where surface navigation and language details are focused on, are widely accepted in end-user computing. The high mathability slow thinking problem solving activities, where the utilization of concept based approaches and schema construction take place, is hardly detectable in end-user activities. Instead of building up knowledge which requires slow thinking and then using the tools with fast thinking, end-users use up their slow thinking in aimless wandering in huge programs, making wrong decisions based on their untrained, clueless intuition, and distributing erroneous end-user documents. We also found that the dominance of low mathability slow thinking activities has its roots in the education system and through this we point out that we are in great need of expert teachers and institutions and their widely accepted approaches and methods.展开更多
To programming one Child's Behavior Problems Computer Screening System (CBPCSS), a series of software for analyzing child behavior individual and group sample. According to the worldfamous American ACHENBACH child...To programming one Child's Behavior Problems Computer Screening System (CBPCSS), a series of software for analyzing child behavior individual and group sample. According to the worldfamous American ACHENBACH child's behavior checklist, to ensure the screening quality and compatibility of cultures, we revised and standardized the norms of different ages in primary schools and nursery schools in various cities based on the principle of cluster stratified sampling. Then we designed CBPCSS carefully. The system can reliably and rapidly screen an individual child behavior and output the behavior factor curve (appearing in front of the profile). With CBPCSS we can observe the child behavior clearly. It takes twenty times shorter than that of manual screening. On the other hand, CBPCSS has a function of group analysis. The clinical practice proved that CBPCSS could substitute for manual screening completely. It is a powerful tool for social, scientific and pediatric medical workers.展开更多
We present an efficient deep learning method called coupled deep neural networks(CDNNs) for coupling of the Stokes and Darcy–Forchheimer problems. Our method compiles the interface conditions of the coupled problems ...We present an efficient deep learning method called coupled deep neural networks(CDNNs) for coupling of the Stokes and Darcy–Forchheimer problems. Our method compiles the interface conditions of the coupled problems into the networks properly and can be served as an efficient alternative to the complex coupled problems. To impose energy conservation constraints, the CDNNs utilize simple fully connected layers and a custom loss function to perform the model training process as well as the physical property of the exact solution. The approach can be beneficial for the following reasons: Firstly, we sample randomly and only input spatial coordinates without being restricted by the nature of samples.Secondly, our method is meshfree, which makes it more efficient than the traditional methods. Finally, the method is parallel and can solve multiple variables independently at the same time. We present the theoretical results to guarantee the convergence of the loss function and the convergence of the neural networks to the exact solution. Some numerical experiments are performed and discussed to demonstrate performance of the proposed method.展开更多
In this paper, we study a new approach for solving linear fractional programming problem (LFP) by converting it into a single Linear Programming (LP) Problem, which can be solved by using any type of linear fractional...In this paper, we study a new approach for solving linear fractional programming problem (LFP) by converting it into a single Linear Programming (LP) Problem, which can be solved by using any type of linear fractional programming technique. In the objective function of an LFP, if βis negative, the available methods are failed to solve, while our proposed method is capable of solving such problems. In the present paper, we propose a new method and develop FORTRAN programs to solve the problem. The optimal LFP solution procedure is illustrated with numerical examples and also by a computer program. We also compare our method with other available methods for solving LFP problems. Our proposed method of linear fractional programming (LFP) problem is very simple and easy to understand and apply.展开更多
The method of boundary layer with multiple scales and computer algebra were applied to study the asymptotic behavior of solution of boundary value problems for a class of system of nonlinear differential equations . T...The method of boundary layer with multiple scales and computer algebra were applied to study the asymptotic behavior of solution of boundary value problems for a class of system of nonlinear differential equations . The asymptotic expansions of solution were constructed. The remainders were estimated. And an example was analysed. It provides a new foreground for the application of the method of boundary layer with multiple scales .展开更多
The complexity of current software tools increases with the complexity of problem solving tasks they are designed to assist and are mainly dedicated to computer educated people. On the other hand current computer tech...The complexity of current software tools increases with the complexity of problem solving tasks they are designed to assist and are mainly dedicated to computer educated people. On the other hand current computer technology is deeply involved in people’s everyday life. This gap deepens and stresses software technology and computer education. The purpose of this paper is to discuss the feasibility of a new computer based problem solving methodology based on software tools that can be manipulated through the use of natural language. By computational emancipation the natural language becomes a family of non-ambiguous languages. This means that every problem solver uses a non-ambiguous natural language, termed here as Domain Algorithmic Language, DAL. Here we show how to develop software tools dedicated to the problem domain and illustrate the methodology we propose with the software tools required by teaching high school algebra.展开更多
The Hamiltonian cycle problem(HCP),which is an NP-complete problem,consists of having a graph G with n nodes and m edges and finding the path that connects each node exactly once.In this paper we compare some algorith...The Hamiltonian cycle problem(HCP),which is an NP-complete problem,consists of having a graph G with n nodes and m edges and finding the path that connects each node exactly once.In this paper we compare some algorithms to solve a Hamiltonian cycle problem,using different models of computations and especially the probabilistic and quantum ones.Starting from the classical probabilistic approach of random walks,we take a step to the quantum direction by involving an ad hoc designed Quantum Turing Machine(QTM),which can be a useful conceptual project tool for quantum algorithms.Introducing several constraints to the graphs,our analysis leads to not-exponential speedup improvements to the best-known algorithms.In particular,the results are based on bounded degree graphs(graphs with nodes having a maximum number of edges)and graphs with the right limited number of nodes and edges to allow them to outperform the other algorithms.展开更多
文摘The computational techniques are a set of novel problem-solving methodologies that have attracted wider attention for their excellent performance.The handling strategies of real-world problems are artificial neural networks(ANN),evolutionary computing(EC),and many more.An estimated fifty thousand to ninety thousand new leishmaniasis cases occur annually,with only 25%to 45%reported to the World Health Organization(WHO).It remains one of the top parasitic diseases with outbreak and mortality potential.In 2020,more than ninety percent of new cases reported to World Health Organization(WHO)occurred in ten countries:Brazil,China,Ethiopia,Eritrea,India,Kenya,Somalia,South Sudan,Sudan,and Yemen.The transmission of visceral leishmaniasis is studied dynamically and numerically.The study included positivity,boundedness,equilibria,reproduction number,and local stability of the model in the dynamical analysis.Some detailed methods like Runge Kutta and Euler depend on time steps and violate the physical relevance of the disease.They produce negative and unbounded results,so in disease dynamics,such developments have no biological significance;in other words,these results are meaningless.But the implicit nonstandard finite difference method does not depend on time step,positive,bounded,dynamic and consistent.All the computational techniques and their results were compared using computer simulations.
基金supported in part by the Beijing Natural Science Foundation under Grant L201011in part by the National Natural Science Foundation of China(U2001213 and 61971191)in part by National Key Research and Development Project(2020YFB1807204)。
文摘Edge computing paradigm for 5G architecture has been considered as one of the most effective ways to realize low latency and highly reliable communication,which brings computing tasks and network resources to the edge of network.The deployment of edge computing nodes is a key factor affecting the service performance of edge computing systems.In this paper,we propose a method for deploying edge computing nodes based on user location.Through the combination of Simulation of Urban Mobility(SUMO)and Network Simulator-3(NS-3),a simulation platform is built to generate data of hotspot areas in Io T scenario.By effectively using the data generated by the communication between users in Io T scenario,the location area of the user terminal can be obtained.On this basis,the deployment problem is expressed as a mixed integer linear problem,which can be solved by Simulated Annealing(SA)method.The analysis of the results shows that,compared with the traditional method,the proposed method has faster convergence speed and better performance.
文摘Although AI and quantum computing (QC) are fast emerging as key enablers of the future Internet, experts believe they pose an existential threat to humanity. Responding to the frenzied release of ChatGPT/GPT-4, thousands of alarmed tech leaders recently signed an open letter to pause AI research to prepare for the catastrophic threats to humanity from uncontrolled AGI (Artificial General Intelligence). Perceived as an “epistemological nightmare”, AGI is believed to be on the anvil with GPT-5. Two computing rules appear responsible for these risks. 1) Mandatory third-party permissions that allow computers to run applications at the expense of introducing vulnerabilities. 2) The Halting Problem of Turing-complete AI programming languages potentially renders AGI unstoppable. The double whammy of these inherent weaknesses remains invincible under the legacy systems. A recent cybersecurity breakthrough shows that banning all permissions reduces the computer attack surface to zero, delivering a new zero vulnerability computing (ZVC) paradigm. Deploying ZVC and blockchain, this paper formulates and supports a hypothesis: “Safe, secure, ethical, controllable AGI/QC is possible by conquering the two unassailable rules of computability.” Pursued by a European consortium, testing/proving the proposed hypothesis will have a groundbreaking impact on the future digital infrastructure when AGI/QC starts powering the 75 billion internet devices by 2025.
文摘Cloud Computing Assisted Instruction shows incomparable advantages over the traditional language teaching, but meanwhile, it exists some major problems, for instance, the information technology is omnipotent, information input is too excessive and teachers' role is considerably weakened. This article attempts to analyze the problems and promote language teaching reform base on Cloud Computing Assisted Instruction.
文摘Molecular programming is applied to minimum spanning problem whose solution requires encoding of real values in DNA strands. A new encoding scheme is proposed for real values that is biologically plausible and has a fixed code length. According to the characteristics of the problem, a DNA algorithm solving the minimum spanning tree problem is given. The effectiveness of the proposed method is verified by simulation. The advantages and disadvantages of this algorithm are discussed.
基金Supported by the National Natural Science Foundation of China(10902050)the China Postdoctoral Science Foundation Funded Project(20100481138)the Aeronautical Science Foundation of China(20101452017)
文摘Acoustic propagation problems in the sheared mean flow are numerically investigated using different acoustic propagation equations , including linearized Euler equations ( LEE ) and acoustic perturbation equations ( APE ) .The resulted acoustic pressure is compared for the cases of uniform mean flow and sheared mean flow using both APE and LEE.Numerical results show that interactions between acoustics and mean flow should be properly considered to better understand noise propagation problems , and the suitable option of the different acoustic equations is indicated by the present comparisons.Moreover , the ability of APE to predict acoustic propagation is validated.APE can replace LEE when the 3-D flow-induced noise problem is solved , thus computational cost can decrease.
基金The authors are grateful for the support from the National Natural Science Foundation of China (61632002, 61379059, and 61572046).
文摘The biggest bottleneck in DNA computing is exponential explosion, in which the DNA molecules used as data in information processing grow exponentially with an increase of problem size. To overcome this bottleneck and improve the processing speed, we propose a DNA computing model to solve the graph vertex coloring problem. The main points of the model are as follows: The exponential explosion prob- lem is solved by dividing subgraphs, reducing the vertex colors without losing the solutions, and ordering the vertices in subgraphs; and the bio-operation times are reduced considerably by a designed parallel polymerase chain reaction (PCR) technology that dramatically improves the processing speed. In this arti- cle, a 3-colorable graph with 61 vertices is used to illustrate the capability of the DNA computing model. The experiment showed that not only are all the solutions of the graph found, but also more than 99% of false solutions are deleted when the initial solution space is constructed. The powerful computational capability of the model was based on specific reactions among the large number of nanoscale oligonu- cleotide strands. All these tiny strands are operated by DNA self-assembly and parallel PCR. After thou- sands of accurate PCR operations, the solutions were found by recognizing, splicing, and assembling. We also prove that the searching capability of this model is up to 0(3^59). By means of an exhaustive search, it would take more than 896 000 years for an electronic computer (5 x 10^14 s-1) to achieve this enormous task. This searching capability is the largest among both the electronic and non-electronic computers that have been developed since the DNA computing model was proposed by Adleman's research group in 2002 (with a searching capability of 0(2^20)).
基金This Project was supported by the National Nature Science Foundation (60274026 ,30570431) China Postdoctoral Sci-ence Foundation Natural +1 种基金Science Foundation of Educational Government of Anhui Province of China Excellent Youth Scienceand Technology Foundation of Anhui Province of China (06042088) and Doctoral Foundation of Anhui University of Scienceand Technology
文摘To solve job shop scheduling problem, a new approach-DNA computing is used in solving job shop scheduling problem. The approach using DNA computing to solve job shop scheduling is divided into three stands. Finally, optimum solutions are obtained by sequencing A small job shop scheduling problem is solved in DNA computing, and the "operations" of the computation were performed with standard protocols, as ligation, synthesis, electrophoresis etc. This work represents further evidence for the ability of DNA computing to solve NP-complete search problems.
文摘In this paper, the sticker based DNA computing was used for solving the independent set problem. At first, solution space was constructed by using appropriate DNA memory complexes. We defined a new operation called “divide” and applied it in construction of solution space. Then, by application of a sticker based parallel algorithm using biological operations, independent set problem was resolved in polynomial time.
文摘The surface-based DNA computing is one of the methods of DNA computing which uses DNA strands immobilized on a solid surface. In this paper, we applied surface-based DNA computing for solving the dominating set problem. At first step, surface-based DNA solution space was constructed by using appropriate DNA strands. Then, by application of a DNA parallel algorithm, dominating set problem was resolved in polynomial time.
文摘The main goal of this paper is to compute the Figure-eight solutions for the planar Newtonian 3-body problem with equal masses by finding the critical points of the functional associated with the motion equations of 3-body in plane R2. The algorithm adopted here is the steepest descent method, which is simple but very valid for our problem.
文摘Research in spreadsheet management proved that the overuse of slow thinking, rather than fast thinking, is the primary source of erroneous end-user computing. However, we found that the reality is not that simple. To view end-user computing in its full complexity, we launched a project to investigate end-user education, training, support, activities, and computer problem solving. In this project we also set up the base and mathability-extended typology of computer problem solving approaches, where quantitative values are assigned to the different problem solving methods and activities. In this paper we present the results of our analyses of teaching materials collected in different languages from all over the world and our findings considering the different problem solving approaches, set in the frame of different thinking modes, the characteristics of expert teachers, and the meaning system model of teaching approaches. Based on our research, we argue that the proportions of fast and slow thinking and most importantly their manifestation are responsible for erroneous end-user activities. Applying the five-point mathability scale of computer problem solving, we recognized slow thinking activities on both tails and one fast thinking approach between them. The low mathability slow thinking activities, where surface navigation and language details are focused on, are widely accepted in end-user computing. The high mathability slow thinking problem solving activities, where the utilization of concept based approaches and schema construction take place, is hardly detectable in end-user activities. Instead of building up knowledge which requires slow thinking and then using the tools with fast thinking, end-users use up their slow thinking in aimless wandering in huge programs, making wrong decisions based on their untrained, clueless intuition, and distributing erroneous end-user documents. We also found that the dominance of low mathability slow thinking activities has its roots in the education system and through this we point out that we are in great need of expert teachers and institutions and their widely accepted approaches and methods.
文摘To programming one Child's Behavior Problems Computer Screening System (CBPCSS), a series of software for analyzing child behavior individual and group sample. According to the worldfamous American ACHENBACH child's behavior checklist, to ensure the screening quality and compatibility of cultures, we revised and standardized the norms of different ages in primary schools and nursery schools in various cities based on the principle of cluster stratified sampling. Then we designed CBPCSS carefully. The system can reliably and rapidly screen an individual child behavior and output the behavior factor curve (appearing in front of the profile). With CBPCSS we can observe the child behavior clearly. It takes twenty times shorter than that of manual screening. On the other hand, CBPCSS has a function of group analysis. The clinical practice proved that CBPCSS could substitute for manual screening completely. It is a powerful tool for social, scientific and pediatric medical workers.
基金Project supported in part by the National Natural Science Foundation of China (Grant No.11771259)the Special Support Program to Develop Innovative Talents in the Region of Shaanxi Province+1 种基金the Innovation Team on Computationally Efficient Numerical Methods Based on New Energy Problems in Shaanxi Provincethe Innovative Team Project of Shaanxi Provincial Department of Education (Grant No.21JP013)。
文摘We present an efficient deep learning method called coupled deep neural networks(CDNNs) for coupling of the Stokes and Darcy–Forchheimer problems. Our method compiles the interface conditions of the coupled problems into the networks properly and can be served as an efficient alternative to the complex coupled problems. To impose energy conservation constraints, the CDNNs utilize simple fully connected layers and a custom loss function to perform the model training process as well as the physical property of the exact solution. The approach can be beneficial for the following reasons: Firstly, we sample randomly and only input spatial coordinates without being restricted by the nature of samples.Secondly, our method is meshfree, which makes it more efficient than the traditional methods. Finally, the method is parallel and can solve multiple variables independently at the same time. We present the theoretical results to guarantee the convergence of the loss function and the convergence of the neural networks to the exact solution. Some numerical experiments are performed and discussed to demonstrate performance of the proposed method.
文摘In this paper, we study a new approach for solving linear fractional programming problem (LFP) by converting it into a single Linear Programming (LP) Problem, which can be solved by using any type of linear fractional programming technique. In the objective function of an LFP, if βis negative, the available methods are failed to solve, while our proposed method is capable of solving such problems. In the present paper, we propose a new method and develop FORTRAN programs to solve the problem. The optimal LFP solution procedure is illustrated with numerical examples and also by a computer program. We also compare our method with other available methods for solving LFP problems. Our proposed method of linear fractional programming (LFP) problem is very simple and easy to understand and apply.
文摘The method of boundary layer with multiple scales and computer algebra were applied to study the asymptotic behavior of solution of boundary value problems for a class of system of nonlinear differential equations . The asymptotic expansions of solution were constructed. The remainders were estimated. And an example was analysed. It provides a new foreground for the application of the method of boundary layer with multiple scales .
文摘The complexity of current software tools increases with the complexity of problem solving tasks they are designed to assist and are mainly dedicated to computer educated people. On the other hand current computer technology is deeply involved in people’s everyday life. This gap deepens and stresses software technology and computer education. The purpose of this paper is to discuss the feasibility of a new computer based problem solving methodology based on software tools that can be manipulated through the use of natural language. By computational emancipation the natural language becomes a family of non-ambiguous languages. This means that every problem solver uses a non-ambiguous natural language, termed here as Domain Algorithmic Language, DAL. Here we show how to develop software tools dedicated to the problem domain and illustrate the methodology we propose with the software tools required by teaching high school algebra.
基金the project PNRR-HPC,Big Data and Quantum Computing–CN1 Spoke 10,CUP I53C22000690001.
文摘The Hamiltonian cycle problem(HCP),which is an NP-complete problem,consists of having a graph G with n nodes and m edges and finding the path that connects each node exactly once.In this paper we compare some algorithms to solve a Hamiltonian cycle problem,using different models of computations and especially the probabilistic and quantum ones.Starting from the classical probabilistic approach of random walks,we take a step to the quantum direction by involving an ad hoc designed Quantum Turing Machine(QTM),which can be a useful conceptual project tool for quantum algorithms.Introducing several constraints to the graphs,our analysis leads to not-exponential speedup improvements to the best-known algorithms.In particular,the results are based on bounded degree graphs(graphs with nodes having a maximum number of edges)and graphs with the right limited number of nodes and edges to allow them to outperform the other algorithms.