Abstract: Traditional Numerical Reservoir Simulation has been contributing to the oil and gas industry for decades. The current state of this technology is the result of decades of research and development by a large number of engineers and scientists. Starting in the late 1960s and early 1970s, advances in computer hardware, along with the development and adaptation of clever algorithms, produced a paradigm shift in reservoir studies, moving them from simplified analogs and analytical solution methods to more mathematically robust computational and numerical solution models.
Abstract: Machine intelligence is the intelligence exhibited by an artificial system; it is usually achieved with conventional computer intelligence. Rough sets and information granules, applied to uncertainty management in soft computing and granular computing, are widely used in many fields, such as protein sequence analysis and bio-basis determination, TSM, and Web service classification.
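For readers unfamiliar with the rough-set machinery this abstract invokes, the following is a minimal sketch of lower and upper approximations over a toy decision table; the universe, attributes, and target concept are invented for illustration and do not come from the paper.

```python
# Minimal rough-set sketch: lower/upper approximations of a target concept.
# The toy decision table below is an assumption for illustration only.

def indiscernibility_classes(universe, attrs):
    """Group objects that are indiscernible on the chosen attributes."""
    classes = {}
    for obj, values in universe.items():
        key = tuple(values[a] for a in attrs)
        classes.setdefault(key, set()).add(obj)
    return list(classes.values())

def approximations(universe, attrs, target):
    """Return the rough-set lower and upper approximations of target."""
    lower, upper = set(), set()
    for cls in indiscernibility_classes(universe, attrs):
        if cls <= target:      # class lies entirely inside the concept
            lower |= cls
        if cls & target:       # class overlaps the concept
            upper |= cls
    return lower, upper

universe = {
    "x1": {"temp": "high", "ph": "low"},
    "x2": {"temp": "high", "ph": "low"},
    "x3": {"temp": "low",  "ph": "high"},
}
lower, upper = approximations(universe, ["temp", "ph"], {"x1", "x3"})
print(lower, upper)   # boundary region = upper - lower
```

Here x1 and x2 are indiscernible, so x1 cannot be placed in the lower approximation of the concept {x1, x3}: it sits in the boundary region, which is exactly the uncertainty that rough sets make explicit.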
Abstract: Artificial intelligence (AI) based technology, machine learning, and cognitive systems have played a very active role in society's economic and technological transformation. For industrial value chains and international businesses, this means that structural change is necessary, since these machines can learn and apply new information in making forecasts, in processing, and in interacting with people. Artificial intelligence is a science that uses sufficiently powerful techniques, strategies, and mathematical modelling to tackle complex real-world problems. Because of its inevitable progress into the future, there have been considerable safety and ethical concerns. Creating an environment that is AI-friendly for people, and vice versa, might be a way for humans and machines to discover a common set of values. In this context, the goal of this study is to investigate the emerging trends of AI (the benefits it brings to society), the moral challenges that come from ethical algorithms and learned or pre-set ideals, and the ethical issues and malpractices of AI and AI security. This paper addresses the consequences of AI for investors and financial services. The article examines the challenges and possible alternatives for resolving potential unethical practices in finance, and proposes the necessity of new AI governance mechanisms to protect the efficiency of the capital markets, as well as the role of financial authorities in regulating and monitoring the huge expansion of AI in finance.
Funding: The studies were supported by the National Major Scientific and Technological Special Project for "Development and comprehensive verification of complete products of open high-end CNC system, servo device and motor" (2012ZX04001012).
Abstract: Building cyber-physical system (CPS) models of machine tools is a key technology for intelligent manufacturing. The massive electronic data from a computer numerical control (CNC) system during the work processes of a CNC machine tool is the main source of the big data on which a CPS model is established. In this work-process model, a method based on the instruction domain is applied to analyze the electronic big data, and a quantitative description of the numerical control (NC) processes is built according to the G code of the processes. Utilizing the instruction domain, a work-process CPS model is established on the basis of the accurate, real-time mapping of the manufacturing tasks, resources, and status of the CNC machine tool. Using such models, case studies are conducted on intelligent-machining applications, such as the optimization of NC processing parameters and the health assurance of CNC machine tools.
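As a concrete illustration of the instruction-domain idea, the sketch below indexes time-stamped sensor samples by the G-code instruction that was executing when each sample was recorded; the data layout and field names are assumptions for illustration, not the paper's actual CNC data format.

```python
# Hedged sketch of instruction-domain indexing: time-series samples are
# keyed by the G-code instruction active when they were recorded.

from dataclasses import dataclass, field

@dataclass
class InstructionDomain:
    line_no: int                                   # G-code line index
    gcode: str                                     # the instruction text
    samples: list = field(default_factory=list)    # (time, sensor value)

def build_instruction_domains(gcode_lines, timed_samples):
    """Map (t, active_line, value) samples onto per-instruction domains."""
    domains = {i: InstructionDomain(i, g) for i, g in enumerate(gcode_lines)}
    for t, active_line, value in timed_samples:
        domains[active_line].samples.append((t, value))
    return domains

program = ["G00 X0 Y0", "G01 X10 F200", "G01 Y10 F200"]
samples = [(0.0, 0, 1.2), (0.1, 1, 3.4), (0.2, 1, 3.6), (0.3, 2, 2.9)]
for dom in build_instruction_domains(program, samples).values():
    print(dom.line_no, dom.gcode, dom.samples)   # e.g. spindle current
```

Grouping data per instruction rather than per time window is what lets statistics such as process-parameter optima or health indicators be compared across repeated runs of the same NC program.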
Abstract: Due to the emergence of large numbers of counterfeit notes and incomplete coins in the slot machines of self-service buses, and to improve the automation of the intelligent slot machine, a detection system for counterfeit notes and coins was designed for the first time on the basis of multi-sensor testing technology, with a programmable logic controller (PLC) as the core of the whole system; both the PLC hardware design and the software design were completed. The system was tested in many groups of experiments. The results show that the system has a reliable recognition rate, good flexibility, and stability, reaching an accuracy of 97%.
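The abstract does not give the detection logic, so the following is only a plausible sketch of the kind of multi-sensor accept/reject decision such a system might run (written in Python rather than PLC ladder logic); all sensor names and threshold bands are invented.

```python
# Illustrative multi-sensor check: accept a coin or note only if every
# sensor reading falls inside its valid band. Thresholds are invented.

def accept_item(readings, thresholds):
    """Reject on any out-of-band reading; accept otherwise."""
    for sensor, value in readings.items():
        lo, hi = thresholds[sensor]
        if not (lo <= value <= hi):
            return False           # one failed check -> reject
    return True

thresholds = {"diameter_mm": (24.9, 25.3),    # genuine coin geometry
              "mass_g": (6.0, 6.2),           # genuine coin mass
              "inductance_uH": (11.0, 12.5)}  # metal-composition signature

print(accept_item({"diameter_mm": 25.1, "mass_g": 6.1,
                   "inductance_uH": 11.8}, thresholds))   # True
print(accept_item({"diameter_mm": 25.1, "mass_g": 5.2,
                   "inductance_uH": 11.8}, thresholds))   # False
```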
Abstract: The rise of big data has led to new demands for machine learning (ML) systems to learn complex models, with millions to billions of parameters, that promise adequate capacity to digest massive datasets and offer powerful predictive analytics (such as high-dimensional latent features, intermediate representations, and decision functions) thereupon. In order to run ML algorithms at such scales, on a distributed cluster with tens to thousands of machines, significant engineering efforts are often required, and one might fairly ask whether such engineering truly falls within the domain of ML research. Taking the view that "big" ML systems can benefit greatly from ML-rooted statistical and algorithmic insights, and that ML researchers should therefore not shy away from such systems design, we discuss a series of principles and strategies distilled from our recent efforts on industrial-scale ML solutions. These principles and strategies span a continuum from application, to engineering, and to theoretical research and development of big ML systems and architectures, with the goal of understanding how to make them efficient, generally applicable, and supported with convergence and scaling guarantees. They concern four key questions that traditionally receive little attention in ML research: How can an ML program be distributed over a cluster? How can ML computation be bridged with inter-machine communication? How can such communication be performed? What should be communicated between machines? By exposing underlying statistical and algorithmic characteristics unique to ML programs but not typically seen in traditional computer programs, and by dissecting successful cases to reveal how we have harnessed these principles to design and develop both high-performance distributed ML software and general-purpose ML frameworks, we present opportunities for ML researchers and practitioners to further shape and enlarge the area that lies between ML and systems.
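The first two questions (how to distribute an ML program, and how to bridge computation with communication) are commonly answered with a data-parallel parameter-server pattern; the sketch below simulates that pattern in a single process under invented data, and is not code from the authors' systems.

```python
# Single-process simulation of the data-parallel parameter-server pattern:
# workers compute gradients on data shards, a "server" aggregates and
# updates the parameters. Model (linear least squares) and data invented.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
w_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=1000)

shards = np.array_split(np.arange(1000), 4)   # one shard per "worker"
w = np.zeros(5)                               # parameter-server state

for step in range(200):
    # each worker: local gradient on its shard of the data
    grads = [X[s].T @ (X[s] @ w - y[s]) / len(s) for s in shards]
    # server: aggregate worker gradients and apply the update
    w -= 0.1 * np.mean(grads, axis=0)

print(np.round(w, 2))                         # recovers w_true closely
```

In a real cluster the gradient exchange is where the paper's remaining two questions bite: how and what to communicate (full gradients, compressed updates, or stale values under a relaxed consistency model).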
Abstract: A design concept for an intelligent digital welding machine with self-learning and self-regulation functions is proposed, and the overall design scheme of its software and hardware is provided. A parameter self-learning algorithm based on large-step calibration and partial Newton interpolation is introduced. Furthermore, experimental verification was carried out with different welding technologies. The results show that the weld bead is perfect; good welding quality and stability are obtained, and intelligent regulation is realized through parameter self-learning.
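The abstract names partial Newton interpolation as the core of the parameter self-learning algorithm; the sketch below shows standard Newton divided-difference interpolation over a handful of calibration points, with the current/feed-speed numbers invented for illustration.

```python
# Newton divided-difference interpolation: fit calibration points, then
# interpolate an operating set-point. Calibration values are invented.

def newton_coefficients(xs, ys):
    """In-place divided-difference coefficients for the Newton form."""
    coef = list(ys)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(xs, coef, x):
    """Evaluate the Newton-form polynomial at x (Horner-like scheme)."""
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result

# e.g. wire-feed speed calibrated at a few welding-current set-points
xs = [80.0, 120.0, 160.0, 200.0]   # welding current (A)
ys = [2.1, 3.4, 4.9, 6.8]          # measured feed speed (m/min)
coef = newton_coefficients(xs, ys)
print(round(newton_eval(xs, coef, 140.0), 3))   # interpolated set-point
```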
Abstract: After analyzing the structure and characteristics of the hybrid intelligent diagnosis system for CNC machine tools (CNC-HIDS), we describe the intelligent hybrid mechanism of the CNC-HIDS and present the evaluation and a running instance of the system. Through trials and validation, satisfactory results are attained.
Abstract: The key techniques of modular design of heavy duty NC machine tools are described. A module definition model for the modular design and manufacturing of heavy duty NC machine tools is built, and the essential composition of the module definition model (MDM) is discussed in detail. It is composed of two models: the part definition model (PDM) and the module assembly model (MAM). The PDM and MAM are built and their structures are given. Using object-oriented knowledge representation and based on these models, an intelligent support system of modular design for heavy duty NC machine tools is developed and implemented. This system has been put to practical use at Wuhan Heavy Duty Machine Tool Works.
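A minimal object-oriented sketch of the two-part MDM (a PDM for parts and a MAM composing parts into modules) is given below; the class fields and the example module are assumptions for illustration, not the paper's actual knowledge representation.

```python
# Illustrative sketch: MDM = part definition model + module assembly model,
# expressed with object-oriented representation. Fields are assumptions.

from dataclasses import dataclass, field

@dataclass
class PartDefinition:                  # PDM: one machined part
    name: str
    material: str
    features: list                     # e.g. geometric features

@dataclass
class ModuleAssembly:                  # MAM: parts composed into a module
    name: str
    parts: list = field(default_factory=list)
    interfaces: list = field(default_factory=list)   # mating interfaces

    def add_part(self, part: PartDefinition):
        self.parts.append(part)

column = ModuleAssembly("column", interfaces=["bed_mount", "spindle_rail"])
column.add_part(PartDefinition("column_casting", "cast iron", ["guideway"]))
print(column)
```

Separating part knowledge (PDM) from assembly knowledge (MAM) is what lets a modular-design support system recombine proven modules across machine-tool variants.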
Abstract: Bipolar disorder presents significant challenges in clinical management, characterized by recurrent episodes of depression and mania often accompanied by impairment in functioning. This study investigates the efficacy of pharmacological interventions and rehabilitation strategies to improve patient outcomes and quality of life. Utilizing a randomized controlled trial with multiple treatment arms, participants will receive pharmacotherapy, polypharmacotherapy, rehabilitation interventions, or combination treatments. Outcome measures will be assessed using standardized scales, including the Hamilton Depression Scale, the Yale-Brown Obsessive Compulsive Scale (Y-BOCS), and the Mania Scale. Preliminary data suggest improvements in symptom severity and functional outcomes with combination treatments. This research aims to inform clinical practice, guide treatment decisions, and ultimately enhance the quality of care for individuals living with bipolar disorder. Findings will be disseminated through peer-reviewed journals and scientific conferences to advance knowledge in this field.
Funding: Project (12ZT14) supported by the Natural Science Foundation of Shanghai Municipal Education Commission, China.
Abstract: To address the problems of applying advanced digital manufacturing technology in practice, knowledge engineering technology was introduced into computer numerical control (CNC) programming. The knowledge acquisition, knowledge representation, and reasoning used in CNC programming were researched. The functional architecture of a CNC programming system for impeller parts based on knowledge-based engineering (KBE) was constructed, as was the structural model of the general knowledge-based system (KBS). The KBS of the CNC programming system was established by synthesizing database technology and knowledge-base theory. In the context of corporate needs, and based on the knowledge-driven manufacturing platform (i.e., UG CAD/CAM), VC++ 6.0, and UG/Open, the KBS and UG CAD/CAM were integrated seamlessly, and an intelligent CNC programming KBE system for impeller parts was developed by integrating KBE with the UG CAD/CAM system. A method to establish standard process templates was proposed, so as to develop an intelligent CNC programming system in which CNC machining processes and process parameters are standardized by using this KBE system. For impeller part processing, the method applied in the development of the prototype system is shown to be viable, feasible, and practical.
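As an illustration of how a KBS rule might drive standard process-template selection, the sketch below fires the first matching rule over part attributes; the rules, attributes, and template names are invented and do not come from the paper or from the UG/Open API.

```python
# Illustrative rule-based template selection for a CNC-programming KBS.
# Rules, part attributes, and template names are assumptions only.

RULES = [
    # (condition on part attributes, recommended process template)
    (lambda p: p["type"] == "impeller" and p["blade_count"] > 12,
     "5-axis_flank_milling_template"),
    (lambda p: p["type"] == "impeller",
     "5-axis_point_milling_template"),
]

def select_template(part):
    """Fire the first matching rule; fall back to manual planning."""
    for condition, template in RULES:
        if condition(part):
            return template
    return "manual_process_planning"

print(select_template({"type": "impeller", "blade_count": 17}))
```

Encoding process know-how as explicit, inspectable rules is what allows machining processes and parameters to be standardized instead of re-derived by each programmer.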
Abstract: The Severe Acute Respiratory Syndrome CoronaVirus 2 (SARS-CoV-2) virus spread the novel CoronaVirus-19 (nCoV-19) pandemic, resulting in millions of fatalities globally. Recent research demonstrated that the Protein-Protein Interaction (PPI) between SARS-CoV-2 and human proteins is accountable for viral pathogenesis. However, many of these PPIs are poorly understood and unexplored, necessitating a more in-depth investigation to find latent yet critical interactions. This article elucidates the host-viral PPI through Machine Learning (ML) lenses and validates the biological significance of the findings using web-based tools. ML classifiers are designed based on comprehensive datasets with five sequence-based features of human proteins, namely Amino Acid Composition, Pseudo Amino Acid Composition, Conjoint Triad, Dipeptide Composition, and Normalized Auto Correlation. A majority-voting-rule-based ensemble method composed of the Random Forest Model (RFM), AdaBoost, and the Bagging technique is proposed that delivers encouraging statistical performance compared to the other models employed in this work. The proposed ensemble model predicted a total of 111 possible SARS-CoV-2 human target proteins with a high likelihood factor (≥70%), validated by utilizing Gene Ontology (GO) and KEGG pathway enrichment analysis. Consequently, this research can aid in a deeper understanding of the molecular mechanisms underlying viral pathogenesis and provide clues for developing more efficient anti-COVID medications.
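A minimal scikit-learn sketch of the majority-voting ensemble described above (random forest + AdaBoost + bagging) follows; a synthetic feature matrix stands in for the sequence-derived protein features (AAC, PseAAC, CT, DC, NAC), so the numbers it prints are illustrative only.

```python
# Hedged sketch of the hard-voting ensemble: RF + AdaBoost + Bagging.
# Synthetic data replaces the paper's sequence-based protein features.

from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20, random_state=1)
# y = 1 would mean "interacts with SARS-CoV-2", 0 otherwise (placeholder)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100)),
                ("ada", AdaBoostClassifier(n_estimators=100)),
                ("bag", BaggingClassifier(n_estimators=100))],
    voting="hard")                    # majority rule over the three models

print(cross_val_score(ensemble, X, y, cv=5).mean())
```

Switching `voting="hard"` to `"soft"` averages class probabilities instead, which is the natural way to obtain a likelihood-style score such as the ≥70% cutoff the abstract reports.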
Abstract: Self-awareness, or self-consciousness, refers to reflective recognition of the existence of a "subjective self". Every person has a subjective self from which the person observes and interacts with the world. In this article, we argue that self-consciousness is an enigmatic phenomenon unique to human intelligence. Unlike many other intelligent and conscious capabilities, self-consciousness cannot be achieved in electronic computers and robots. Self-consciousness is an odd point of human intelligence and a singularity of artificial intelligence (AI). Man-made intelligence realized through software is not capable of self-consciousness; therefore, robots will never become a newly created species. Because of this lack of self-awareness, AI software such as Watson, AlphaZero, ChatGPT, and PaLM will remain a tool of humans and will not dominate human society, no matter how smart it is. This singularity of AI makes us rethink humbly what future AI will be like, what kind of robots we are going to deal with, and the blessing and threat of AI to humanity.
Funding: The Ministry of Education-China Mobile Research Foundation Project of China (MCM20180703) and the National Key Research and Development Program of China (2020YFB1711100) provided financial support.
Funding: This work was supported by the Six Talent Peaks Project in Jiangsu Province (No. XYDXX-012), the Natural Science Foundation of China (No. 62002045), the China Postdoctoral Science Foundation (No. 2021M690565), and the Fundamental Research Funds for the Cornell University (No. N2117002).
Abstract: The ongoing expansion of the Industrial Internet of Things (IIoT) is enabling the possibility of effective Industry 4.0, where massive sensing devices in heterogeneous environments are connected through dedicated communication protocols. This brings forth new methods and models for fusing the information yielded by the various industrial plant elements, and it generates emerging security challenges that must be faced while providing ad-hoc functions for scheduling and guaranteeing network operations. Recently, the rapid development of Software-Defined Networking (SDN) and Artificial Intelligence (AI) technologies has made feasible the design and control of scalable and secure IIoT networks. This paper studies how AI and SDN technologies, combined, can be leveraged to improve the security and functionality of these IIoT networks. After surveying the state-of-the-art research efforts on the subject, the paper introduces a candidate architecture for an AI-enabled Software-Defined IIoT Network (AI-SDIN) that divides traditional industrial networks into three functional layers. With this aim in mind, key technologies (Blockchain-based Data Sharing, Intelligent Wireless Data Sensing, Edge Intelligence, Time-Sensitive Networks, Integrating SDN & TSN, Distributed AI) and improved applications based on the AI-SDIN are also discussed. Further, the paper highlights new opportunities and potential research challenges in the control and automation of IIoT networks.
Abstract: Machine intelligence is increasingly entering roles that were until recently dominated by human intelligence. As humans now depend upon machines to perform various tasks and operations, there appears to be a risk that humans are losing the skills needed to produce competitively advantageous decisions. This research therefore explores the emerging area of human versus machine decision-making. An illustrative engineering case involving a joint machine and human decision-making system is presented to demonstrate how the outcome was not satisfactorily managed for all the parties involved. This is accompanied by a novel framework and research agenda that highlight areas of concern for engineering managers. We suggest that the speed at which engineering managers are encountering new human-machine interactions points to an urgent need to develop a robust body of knowledge that provides sound guidance for situations where human and machine decisions conflict. Human-machine systems are becoming pervasive, yet this research has revealed that current technological approaches are not adequate. The engineering insights and multi-criteria decision-making tool from this research significantly advance our understanding of this important area.
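The abstract mentions a multi-criteria decision-making tool without specifying it; the sketch below shows a generic weighted-sum MCDM step of the kind such a tool could build on, with the alternatives, criteria, weights, and scores all invented for illustration.

```python
# Generic weighted-sum MCDM sketch for a human-vs-machine conflict:
# score each resolution alternative against weighted criteria.

import numpy as np

alternatives = ["defer_to_machine", "defer_to_human", "escalate_review"]
criteria = ["safety", "speed", "accountability"]
weights = np.array([0.5, 0.2, 0.3])          # criterion weights, sum to 1

# rows: alternatives, cols: criteria, scores normalized to [0, 1]
scores = np.array([[0.6, 0.9, 0.3],
                   [0.8, 0.4, 0.9],
                   [0.9, 0.2, 0.8]])

totals = scores @ weights                    # weighted-sum aggregation
best = alternatives[int(np.argmax(totals))]
print(dict(zip(alternatives, np.round(totals, 3))), "->", best)
```

Making the weights explicit is the point: when human and machine decisions conflict, the trade-off between, say, safety and speed is surfaced for the engineering manager rather than buried in either agent.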
Funding: Supported by the National Research Foundation of Korea (NRF) (NRF-2018R1D1A3B07044041 and NRF-2020R1A2C1101258) and by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) Support Program (IITP-2023-2020-0-01846); the work was conducted during the research year of Chungbuk National University in 2023.
Abstract: This article describes a novel approach for enhancing three-dimensional (3D) point cloud reconstruction for light field microscopy (LFM) using a U-Net-architecture-based fully convolutional neural network (CNN). Since the directional view of the LFM is limited, noise and artifacts make it difficult to reconstruct the exact shape of 3D point clouds, and existing methods suffer from these problems due to self-occlusion of the model. This manuscript proposes a deep fusion learning (DL) method that combines a 3D CNN with a U-Net-based model as a feature extractor. The sub-aperture images obtained from the light field microscope are aligned to form a light field data cube for preprocessing. Multi-stream 3D CNNs and a U-Net architecture are applied to obtain the depth feature from the directional sub-aperture LF data cube. For enhancement of the depth map, dual-iteration weighted median filtering (WMF) is used to reduce surface noise and improve the accuracy of the reconstruction. Generating a 3D point cloud involves combining two key elements: the enhanced depth map and the central view of the light field image. The proposed method is validated using the synthesized Heidelberg Collaboratory for Image Processing (HCI) dataset and real-world LFM datasets, and the results are compared with different state-of-the-art methods. The structural similarity index (SSIM) gains for boxes, cotton, pillow, and pens are 0.9760, 0.9806, 0.9940, and 0.9907, respectively. Moreover, the discrete entropy (DE) values for LFM depth maps exhibited better performance than other existing methods.
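For concreteness, here is a minimal sketch of one weighted-median-filtering pass over a depth map (applied twice below, echoing the dual iteration the abstract mentions); the 3x3 window and the weight mask are assumptions for illustration, not the paper's configuration.

```python
# Minimal weighted median filter over a depth map: each pixel becomes
# the weighted median of its 3x3 neighborhood. Weights are assumptions.

import numpy as np

def weighted_median_filter(depth, weights):
    """One WMF pass; weights is a 3x3 array of positive weights."""
    h, w = depth.shape
    padded = np.pad(depth, 1, mode="edge")
    out = np.empty_like(depth)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 3, j:j + 3].ravel()
            order = np.argsort(window)
            cum = np.cumsum(weights.ravel()[order])
            # weighted median: first value whose cumulative weight
            # reaches half the total weight
            k = np.searchsorted(cum, cum[-1] / 2.0)
            out[i, j] = window[order][k]
    return out

depth = np.random.default_rng(0).normal(5.0, 0.5, size=(8, 8))
weights = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
smoothed = weighted_median_filter(
    weighted_median_filter(depth, weights), weights)   # dual iteration
print(smoothed.round(2))
```

Unlike a mean filter, the (weighted) median suppresses speckle-like surface noise without smearing depth discontinuities, which is why it suits depth-map cleanup before point-cloud generation.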
Abstract: A great discovery made by H. von Foerster, P. M. Mora and L. W. Amiot was published in a 1960 issue of "Science". The authors showed that existing data for calculating the Earth's population in the new era (from 1 to 1958) could be described with remarkably high accuracy by a hyperbolic function with the point of singularity on 13 November 2026. Thus, an empirical regularity of the rise of the human population was established, marked by explosive demographic growth in the 20th century, when during only one century the population almost quadrupled: from 1.656 billion in 1900 to 6.144 billion in 2000. Nowadays, the world population has already exceeded 7.8 billion people. Immediately after 1960, an active search began for phenomenological models to explain the mechanism of the hyperbolic population growth and the subsequent demographic transition designed to stabilize the population. A significant role in explaining the mechanism of the hyperbolic growth of the world population was played by S. Kuznets (1960) and E. Boserup (1965), who found that the rate of technological progress historically increased in proportion to the Earth's population. This meant that population growth raised the level of life-supporting technologies, and the latter in turn enlarged the carrying capacity of the Earth, making it possible for the world population to expand. Proceeding from the information imperative, we have developed a model of the demographic dynamics for the 21st century for the first time. The model shows that with the development and spread of Intelligent Machines (IM), the world population, after reaching a certain maximum, will then irreversibly decline. Human depopulation will largely touch upon the most developed countries, where IM are used intensively nowadays. Until a certain moment in time, this depopulation in developed countries will be compensated by the explosive growth of the population in African countries located south of the Sahara. Calculations in our model reveal that the peak of the human population, 8.52 billion people, will be reached in 2050; it will then irreversibly go down to 7.9 billion people by 2100 if developed countries do not take timely, effective measures to overcome the process of information depopulation.
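For reference, the hyperbolic law of von Foerster, Mora and Amiot that the abstract refers to is usually written as below; the functional form is the standard one, and the singularity date is the one quoted in the abstract.

```latex
% Hyperbolic ("doomsday") growth law: the population diverges at t_0.
\[
  N(t) \;=\; \frac{C}{t_0 - t},
  \qquad t_0 \approx 2026.9 \ \text{(13 November 2026)} .
\]
% Since the remaining time to the singularity is C/N(t), each doubling
% of the population halves that remaining time, which is what makes the
% 20th-century growth described above look explosive.
```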
Abstract: The aim of this study was to develop an adequate mathematical model for long-term forecasting of technological progress and economic growth in the digital age (2020-2050). In addition, the task was to develop a model for forecast calculations of labour productivity in the symbiosis of "man + intelligent machine", where an intelligent machine (IM) is understood as a computer or robot equipped with elements of artificial intelligence (AI), as well as in the digital economy as a whole. In the course of the study, it was shown that the Schumpeter-Kondratiev innovation and cycle theory on the formation of long waves (LW) of economic development, influenced by a powerful cluster of economic technologies engendered by industrial revolutions, is most appropriate for long-term forecasting of technological progress and economic growth. The Solow neoclassical model of economic growth, synchronized with LW, gives the opportunity to forecast the economic dynamics of technologically advanced countries with greater precision up to 30 years, a horizon that corresponds to the duration of an LW. In the information and digital age, the key role among the main factors of growth (capital, labour, and technological progress) is played by the latter. The authors have developed an information model which allows for forecasting technological progress based on the growth rates of endogenous technological information in the economy. The main regimes of producing technological information, corresponding to the eras of the information and digital economies, are given in the article, as well as the Lagrangians that engender them. The model is verified on the example of the 5th information LW for the US economy (1982-2018), and it has shown highly accurate approximation for both technological progress and economic growth. A number of new results were obtained using the developed information models for forecasting technological progress. The forecast trajectory of economic growth of developed countries (on the example of the USA) on the upward stage of the 6th LW (2018-2042), engendered by the digital technologies of the 4th Industrial Revolution, is given. It is also demonstrated that the symbiosis of human and intelligent machine is the driving force in the digital economy, where the human plays the leading role, organizing effective and efficient mutual work. The authors suggest a mathematical model for calculating labour productivity in the digital economy, where the symbiosis of "human + IM" is widely used. The calculations carried out with the help of the model show that: 1) the symbiosis of "human + IM" from the very beginning makes it possible to realize the potential for increasing work performance in the economy with the help of digital technologies; 2) the highest labour productivity is achieved in the symbiosis of "human + IM" where human labour prevails, and the lowest labour productivity is seen where the largest part of the work is performed by IM; 3) developed countries may achieve labour productivity growth of 3% per year by the mid-2020s, which has every chance of being sustained up to the 2040s.
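For reference, the Solow framework the forecast rests on can be summarized in textbook form as follows; this is a standard restatement under a Cobb-Douglas assumption, not the authors' exact specification.

```latex
% Solow growth with Cobb-Douglas technology: output Y from capital K,
% labour L, and technology level A (the dominant digital-age factor).
\[
  Y(t) = A(t)\,K(t)^{\alpha}\,L(t)^{1-\alpha}, \qquad 0 < \alpha < 1 ,
\]
% so labour productivity is y = Y/L = A\,k^{\alpha} with k = K/L, and
\[
  \frac{\dot{y}}{y} \;=\; \frac{\dot{A}}{A} + \alpha\,\frac{\dot{k}}{k} ,
\]
% i.e. productivity growth decomposes into technological progress plus
% capital deepening; the 3% per year cited above is a value of this
% growth rate.
```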
Funding: Supported by the Key Science and Technology Project of Wuhan (No. 20106062327) and the Self-determined and Innovative Research Funds of WUT (No. 2010-YB-20).
Abstract: Object mathematical models from traditional control theory cannot solve the problem of forecasting the chemical components of sintered ore. In order to control the complicated chemical components in the manufacturing process of sintered ore, some key techniques for intelligent forecasting of the chemical components of sintered ore are studied in this paper. A new intelligent forecasting system based on SVM is proposed and realized. The results show that the accuracy of the predicted value of every component is more than 90%. The system has been applied in related companies for more than one year and has shown satisfactory results.
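The abstract does not detail the SVM configuration, so the following is a minimal scikit-learn sketch of an SVM-based component forecaster in the same spirit; the input features (raw-mix proportions) and the target component (TFe content) are invented for illustration.

```python
# Hedged sketch of an SVM regression forecaster for one ore component.
# Features, target, and the synthetic relation below are assumptions.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 6))            # e.g. raw-material mix ratios
y = 55 + 4 * X[:, 0] - 3 * X[:, 1] + 0.2 * rng.normal(size=200)  # % TFe

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:160], y[:160])               # train on historical batches

pred = model.predict(X[160:])             # forecast recent batches
within = np.mean(np.abs(pred - y[160:]) / y[160:] <= 0.10)
print(f"share of predictions within 10% of actual: {within:.2%}")
```

In a deployed system, one such regressor per chemical component, retrained as new batch assays arrive, is a straightforward way to track the >90% accuracy figure the abstract reports.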