From 26 to 27 October 2017, the Centre for Cross-Border Commercial Law in Asia of Singapore Management University (SMU) Law School held an international conference entitled "Future of Law Conference: The Internet of Things, Smart Contracts and Intelligent Machines" in Singapore. The conference brought together leading thinkers from academia and practice in the field of information technology law to discuss the legal and regulatory implications of recent technological developments. Associate Professor ZHANG Jiyu and Associate Professor DING Xiaodong of the Law and Technology Institute of Renmin Law School were invited to attend the conference.
Research purposes: In this study, an intelligent bionic robotic horse was introduced into equestrian teaching for teenagers and compared with the traditional teaching mode using real horses. The research aims to explore the effectiveness of using an intelligent bionic robotic horse in equestrian teaching for teenagers, to promote the further development of equestrian teaching for teenagers in China, and to promote the introduction of new technology into equestrian teaching in the internet age. Research methods: The methods used were the literature method, mathematical statistics, interviews with the equestrian coaches who participated in the experiment, and the experimental method. The intelligent bionic robotic horse used in this research is the GETTAEN intelligent bionic robotic horse produced by Joy Game Technology Co., Ltd. The bionic robotic horse is equipped with Internet technology, its courses are supervised and produced by senior coaches of the China Equestrian Team, and it offers multiple operation modes. Forty amateur students at the Beijing Chaoyang Park Youth Equestrian Center were selected as experimental subjects, each spending 40 h learning to ride. The 20 students in the experimental group received 20 h of bionic robotic horse courses and 20 h of real horse courses; the 20 students in the control group were taught in the traditional mode with 40 h of real horse courses. Results: (1) In the horseback physical fitness test, the average value of the control group was 101.9 s and that of the experimental group was 325.6 s; an independent-sample t-test showed a significant difference (p < 0.05), with the experimental group performing better than the control group. (2) In the horseback balance test, the average value of the control group was 3.75 and that of the experimental group was 7.1; an independent-sample t-test again showed a significant difference (p < 0.05) in favour of the experimental group. (3) Interviews with the participating coaches indicated that the bionic robotic horse can speed up learning progress and consolidate technique, especially when teaching amateurs, but that for the time being it cannot meet training targets for actual horse-control ability and route-reading ability, and the experience is not realistic enough for senior students. Conclusion: Combining real horses with the intelligent bionic robotic horse can improve teaching effectiveness and promote students' adaptation to horseback and technical mastery. For the time being, however, it is only suitable for students with a weak or zero foundation; the capability of the intelligent bionic robotic horse needs to be strengthened, and technological innovation is needed to adapt it to all kinds of students.
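The group comparisons reported above rely on independent-sample t-tests at p < 0.05. As a minimal illustration of that analysis only, the sketch below runs such a test on hypothetical per-student fitness scores; the individual scores are not given in the abstract, so the arrays here are invented placeholders.

```python
# Hypothetical illustration of the independent-sample t-test used in the study.
# The per-student scores below are invented placeholders, not the study's data.
from scipy import stats

control_scores = [95.2, 104.8, 99.1, 110.3, 98.7, 103.5]          # horseback fitness, seconds
experimental_scores = [310.4, 332.9, 318.6, 340.2, 322.8, 329.7]

t_stat, p_value = stats.ttest_ind(experimental_scores, control_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Significant difference between groups at the 0.05 level")
```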
Bipolar disorder presents significant challenges in clinical management, characterized by recurrent episodes of depression and mania often accompanied by impairment in functioning. This study investigates the efficacy of pharmacological interventions and rehabilitation strategies to improve patient outcomes and quality of life. Utilizing a randomized controlled trial with multiple treatment arms, participants will receive pharmacotherapy, polypharmacotherapy, rehabilitation interventions, or combination treatments. Outcome measures will be assessed using standardized scales, including the Hamilton Depression Scale, the Yale-Brown Obsessive Compulsive Scale (Y-BOCS), and the Mania Scale. Preliminary data suggest improvements in symptom severity and functional outcomes with combination treatments. This research aims to inform clinical practice, guide treatment decisions, and ultimately enhance the quality of care for individuals living with bipolar disorder. Findings will be disseminated through peer-reviewed journals and scientific conferences to advance knowledge in this field.
Due to the appearance of large numbers of counterfeit notes and incomplete coins in the slot machines of self-service buses, and in order to improve the automation of the intelligent slot machine, a system for detecting counterfeit notes and coins was designed for the first time on the basis of multi-sensor testing technology, with a programmable logic controller (PLC) as the core of the whole system; both the PLC hardware design and the software design were completed. The system was tested in many groups of experiments. The results show that the system has a reliable recognition rate, good flexibility, and good stability, reaching an accuracy of 97%.
To address the problems encountered by advanced digital manufacturing technology in practical application, knowledge engineering technology was introduced into computer numerical control (CNC) programming. The knowledge acquisition, knowledge representation, and reasoning used in CNC programming were researched. The functional architecture of a CNC programming system for impeller parts based on knowledge-based engineering (KBE) was constructed, together with the structural model of a general knowledge-based system (KBS). The KBS of the CNC programming system was established by combining database technology with knowledge-base theory. In the context of corporate needs, and based on the knowledge-driven manufacturing platform (i.e., UG CAD/CAM), VC++6.0, and UG/Open, the KBS and UG CAD/CAM were integrated seamlessly, and an intelligent CNC programming KBE system for impeller parts was developed by integrating KBE with the UG CAD/CAM system. A method to establish standard process templates was proposed, so that an intelligent CNC programming system could be developed in which CNC machining processes and process parameters are standardized using this KBE system. For impeller part processing, the method applied in the development of the prototype system proved to be viable, feasible, and practical.
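To make the idea of a standardized, knowledge-based process template more concrete, here is a minimal, purely illustrative sketch of how machining process knowledge might be stored and queried; the feature names, tools, parameters, and rule structure are hypothetical and are not taken from the system described above.

```python
# Hypothetical sketch of a knowledge-base lookup for standardized process templates.
# Feature names, tools, and cutting parameters below are invented for illustration.
from dataclasses import dataclass

@dataclass
class ProcessTemplate:
    operation: str
    tool: str
    spindle_speed_rpm: int
    feed_mm_per_min: float

# Knowledge base: machining feature -> ordered list of standard process steps.
KNOWLEDGE_BASE = {
    "impeller_blade_surface": [
        ProcessTemplate("rough_milling", "ball_end_D10", 3500, 800.0),
        ProcessTemplate("semi_finish_milling", "ball_end_D6", 6000, 1200.0),
        ProcessTemplate("finish_milling", "ball_end_D4", 9000, 1500.0),
    ],
    "hub_surface": [
        ProcessTemplate("rough_milling", "flat_end_D12", 3000, 700.0),
        ProcessTemplate("finish_milling", "ball_end_D8", 7000, 1300.0),
    ],
}

def plan_feature(feature: str) -> list:
    """Return the standard process steps for a machining feature, if known."""
    return KNOWLEDGE_BASE.get(feature, [])

for step in plan_feature("impeller_blade_surface"):
    print(step)
```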
A design concept for an intelligent digital welding machine with self-learning and self-regulation functions was proposed, and the overall hardware and software design scheme was provided. A parameter self-learning algorithm based on large-step calibration and partial Newton interpolation was introduced. Furthermore, experimental verification was carried out with different welding technologies. The results show that the weld bead is perfect; good welding quality and stability are therefore obtained, and intelligent regulation is realized through parameter self-learning.
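The abstract names partial Newton interpolation as part of the self-learning scheme but gives no details. As a generic illustration of Newton's divided-difference interpolation (not the machine's actual calibration code), the following sketch interpolates a welding parameter between a few hypothetical calibration points.

```python
# Generic Newton divided-difference interpolation, shown with hypothetical
# calibration points (wire feed speed -> welding voltage); not the actual
# algorithm or data of the welding machine described above.
def newton_coefficients(xs, ys):
    """Return divided-difference coefficients of Newton's interpolating polynomial."""
    coeffs = list(ys)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - j])
    return coeffs

def newton_eval(xs, coeffs, x):
    """Evaluate the Newton-form polynomial at x using Horner-like nesting."""
    result = coeffs[-1]
    for i in range(len(coeffs) - 2, -1, -1):
        result = result * (x - xs[i]) + coeffs[i]
    return result

# Hypothetical large-step calibration points.
wire_speed = [2.0, 4.0, 6.0, 8.0]        # m/min
voltage    = [17.5, 19.8, 22.4, 25.3]    # V

c = newton_coefficients(wire_speed, voltage)
print(f"Interpolated voltage at 5.0 m/min: {newton_eval(wire_speed, c, 5.0):.2f} V")
```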
The object mathematical models of traditional control theory cannot solve the problem of forecasting the chemical components of sintered ore. In order to control the complicated chemical components in the sintered ore manufacturing process, some key techniques for intelligent forecasting of the chemical components of sintered ore are studied in this paper, and a new intelligent forecasting system based on support vector machines (SVM) is proposed and realized. The results show that the forecasting accuracy for every component is more than 90%. The system has been applied in related companies for more than one year and has shown satisfactory results.
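The abstract does not detail the SVM model itself, so the following is only a generic sketch of how an SVM-based forecaster for one chemical component could be set up with scikit-learn; the feature names and synthetic data are hypothetical placeholders, not plant data.

```python
# Generic sketch of an SVM-based forecaster for one chemical component of
# sintered ore; features and data are synthetic placeholders, not plant data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical process features: raw-material ratios, ignition temperature, etc.
X = rng.normal(size=(500, 6))
# Hypothetical target: an iron-content percentage with nonlinear dependence plus noise.
y = 55 + 2.0 * np.tanh(X[:, 0]) - 1.5 * X[:, 1] ** 2 + rng.normal(0, 0.3, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_train, y_train)

# "Hit rate" style accuracy: share of predictions within a tolerance band.
pred = model.predict(X_test)
hit_rate = np.mean(np.abs(pred - y_test) <= 0.5)
print(f"Share of predictions within ±0.5%: {hit_rate:.2%}")
```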
After analyzing the structure and characteristics of the hybrid intelligent diagnosis system for CNC machine tools (CNC-HIDS), we describe the intelligent hybrid mechanism of the CNC-HIDS and present the evaluation and a running instance of the system. Through trial and validation, satisfactory results were attained.
Self-awareness, or self-consciousness, refers to the reflective recognition of the existence of the "subjective-self". Every person has a subjective-self from which the person observes and interacts with the world. In this article, we argue that self-consciousness is an enigmatic phenomenon unique to human intelligence. Unlike many other intelligent and conscious capabilities, self-consciousness cannot be achieved in electronic computers and robots. Self-consciousness is an odd point of human intelligence and a singularity of artificial intelligence (AI). Man-made intelligence implemented in software is not capable of self-consciousness; therefore, robots will never become a newly created species. Because of this lack of self-awareness, AI software such as Watson, Alpha-zero, ChatGPT, and PaLM will remain a tool of humans and will not dominate human society, no matter how smart it is. This singularity of AI makes us rethink, humbly, what future AI will be like, what kind of robots we are going to deal with, and the blessing and threat of AI to humanity.
The ongoing expansion of the Industrial Internet of Things (IIoT) is enabling the possibility of an effective Industry 4.0, where massive numbers of sensing devices in heterogeneous environments are connected through dedicated communication protocols. This brings forth new methods and models to fuse the information yielded by the various industrial plant elements, and generates emerging security challenges that must be faced while providing ad hoc functions for scheduling and guaranteeing network operations. Recently, the rapid development of Software-Defined Networking (SDN) and Artificial Intelligence (AI) technologies has made the design and control of scalable and secure IIoT networks feasible. This paper studies how AI and SDN technologies, combined, can be leveraged to improve the security and functionality of such IIoT networks. After surveying the state-of-the-art research efforts on the subject, the paper introduces a candidate architecture for an AI-enabled Software-Defined IIoT Network (AI-SDIN) that divides traditional industrial networks into three functional layers. With this aim in mind, key technologies (Blockchain-based Data Sharing, Intelligent Wireless Data Sensing, Edge Intelligence, Time-Sensitive Networks, the integration of SDN and TSN, and Distributed AI) and improved applications based on the AI-SDIN are also discussed. Further, the paper highlights new opportunities and potential research challenges in the control and automation of IIoT networks.
This article describes a novel approach for enhancing three-dimensional (3D) point cloud reconstruction for light field microscopy (LFM) using a U-Net-based fully convolutional neural network (CNN). Since the directional view of the LFM is limited, noise and artifacts make it difficult to reconstruct the exact shape of 3D point clouds, and existing methods suffer from these problems due to self-occlusion of the model. This manuscript proposes a deep fusion learning (DL) method that combines a 3D CNN with a U-Net-based model as a feature extractor. The sub-aperture images obtained from the light field microscope are aligned to form a light field data cube for preprocessing. Multi-stream 3D CNNs and a U-Net architecture are applied to obtain depth features from the directional sub-aperture LF data cube. To enhance the depth map, dual-iteration weighted median filtering (WMF) is used to reduce surface noise and improve the accuracy of the reconstruction. Generating the 3D point cloud then combines two key elements: the enhanced depth map and the central view of the light field image. The proposed method is validated using the synthesized Heidelberg Collaboratory for Image Processing (HCI) dataset and real-world LFM datasets, and the results are compared with different state-of-the-art methods. The structural similarity index (SSIM) gains for the boxes, cotton, pillow, and pens scenes are 0.9760, 0.9806, 0.9940, and 0.9907, respectively. Moreover, the discrete entropy (DE) values for the LFM depth maps show better performance than other existing methods.
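The weighted median filtering step mentioned above is not specified in the abstract, so the sketch below only illustrates the general idea of a guided weighted median filter on a depth map; the Gaussian intensity weighting, the single-pass structure, and the synthetic data are assumptions for illustration, not the paper's dual-iteration filter.

```python
# Generic, simplified weighted median filter for a depth map; weights come from
# a Gaussian on guidance-image intensity differences. This illustrates the WMF
# idea only, not the dual-iteration filter used in the paper.
import numpy as np

def weighted_median_filter(depth, guide, radius=2, sigma=10.0):
    """Apply one pass of a guided weighted median filter to a 2D depth map."""
    h, w = depth.shape
    out = np.empty_like(depth, dtype=float)
    pad_d = np.pad(depth.astype(float), radius, mode="edge")
    pad_g = np.pad(guide.astype(float), radius, mode="edge")
    for y in range(h):
        for x in range(w):
            win_d = pad_d[y:y + 2 * radius + 1, x:x + 2 * radius + 1].ravel()
            win_g = pad_g[y:y + 2 * radius + 1, x:x + 2 * radius + 1].ravel()
            # Weight each neighbor by guidance-intensity similarity to the center pixel.
            weights = np.exp(-((win_g - guide[y, x]) ** 2) / (2 * sigma ** 2))
            order = np.argsort(win_d)
            cum = np.cumsum(weights[order])
            # Weighted median: first value where cumulative weight passes half the total.
            idx = np.searchsorted(cum, cum[-1] / 2.0)
            out[y, x] = win_d[order[idx]]
    return out

# Tiny synthetic example: noisy depth ramp guided by a central-view intensity ramp.
rng = np.random.default_rng(1)
depth = np.tile(np.linspace(0, 1, 32), (32, 1)) + rng.normal(0, 0.05, (32, 32))
guide = np.tile(np.linspace(0, 255, 32), (32, 1))
smoothed = weighted_median_filter(depth, guide)
print(smoothed.shape)
```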
Advances in intelligent shield machines reflect an evolving trend from traditional tunnel boring machines (TBMs) to tunnel boring robots (TBRs). This shift aims to address the challenges that the conventional shield machine industry faces from the construction environment and from manual operation. This study presents a systematic review of intelligent shield machine technology, with particular emphasis on smart operation. First, the definition, meaning, contents, and development modes of intelligent shield machines are proposed. The development status of the intelligent shield machine and its smart operation are then presented. After analyzing the operation process of the shield machine, an autonomous operation framework considering both the stand-alone and fleet levels is proposed, and challenges and recommendations for achieving autonomous operation are given. This study offers insights into the essence and developmental framework of intelligent shield machines in order to propel the advancement of this technology.
A great discovery made by H. von Foerster, P. M. Mora and L. W. Amiot was published in a 1960 issue of "Science". The authors showed that the existing data for the Earth's population in the new era (from AD 1 to 1958) could be described with remarkably high accuracy by a hyperbolic function with a point of singularity on 13 November 2026. Thus, an empirical regularity of the growth of the human population was established, marked by the explosive demographic growth of the 20th century, during which the population almost quadrupled in a single century: from 1.656 billion in 1900 to 6.144 billion in 2000. Nowadays, the world population has already exceeded 7.8 billion people. Immediately after 1960, an active search began for phenomenological models to explain the mechanism of hyperbolic population growth and the subsequent demographic transition designed to stabilize the population. A significant role in explaining the mechanism of the hyperbolic growth of the world population was played by S. Kuznets (1960) and E. Boserup (1965), who found that the rate of technological progress historically increased in proportion to the Earth's population. This meant that population growth raised the level of life-supporting technologies, and the latter in turn enlarged the carrying capacity of the Earth, making it possible for the world population to expand. Proceeding from the information imperative, we have developed, for the first time, a model of demographic dynamics for the 21st century. The model shows that, with the development and spread of Intelligent Machines (IM), the world population will reach a certain maximum and then irreversibly decline. Human depopulation will largely affect the most developed countries, where IM are used intensively today. Until a certain moment in time, this depopulation in developed countries will be compensated by the explosive growth of population in the African countries located south of the Sahara. Calculations in our model reveal that the peak human population of 8.52 billion people will be reached in 2050, after which it will irreversibly decline to 7.9 billion people by 2100 if developed countries do not take timely and effective measures to overcome the process of information depopulation.
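For reference, the hyperbolic growth law referred to above is usually written in the generic form below; the constant C and the exact exponent fitted by von Foerster et al. are not restated in the abstract, so this is the commonly cited simplified form rather than the authors' fitted equation.

```latex
% Simplified hyperbolic population law (generic form); t_s is the singularity
% date, which von Foerster et al. estimated as 13 November 2026.
\[
  N(t) \approx \frac{C}{t_s - t}, \qquad t < t_s ,
\]
% so that the growth rate is quadratic in the population itself:
\[
  \frac{dN}{dt} = \frac{C}{(t_s - t)^2} = \frac{N^2}{C}.
\]
```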
Building cyber-physical system (CPS) models of machine tools is a key technology for intelligent manufacturing. The massive electronic data generated by a computer numerical control (CNC) system during the work processes of a CNC machine tool is the main source of the big data on which a CPS model is established. In this work-process model, a method based on the instruction domain is applied to analyze the electronic big data, and a quantitative description of the numerical control (NC) processes is built according to the G code of the processes. Utilizing the instruction domain, a work-process CPS model is established on the basis of the accurate, real-time mapping of the manufacturing tasks, resources, and status of the CNC machine tool. Using such models, case studies are conducted on intelligent-machining applications, such as the optimization of NC processing parameters and the health assurance of CNC machine tools.
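The abstract does not spell out how the instruction domain is constructed from G code, so the sketch below only illustrates the general idea of indexing process data by NC instruction: it groups sampled spindle-load values under the G-code block that was executing. The G-code snippet, field names, and grouping scheme are illustrative assumptions, not the paper's method.

```python
# Illustrative only: group sampled machine data by the executing G-code block,
# a simplified take on "instruction domain" indexing. The G-code and the
# sampled spindle-load values are invented placeholders.
from collections import defaultdict

gcode_program = [
    "N10 G00 X0 Y0 Z5",
    "N20 G01 Z-2 F100",
    "N30 G01 X50 F300",
    "N40 G02 X70 Y20 R20",
]

# Hypothetical data samples: (block number currently executing, spindle load %).
samples = [("N10", 5.2), ("N20", 31.0), ("N20", 33.5),
           ("N30", 48.1), ("N30", 47.6), ("N40", 52.9)]

# Build the instruction-indexed view: block -> list of spindle-load samples.
instruction_domain = defaultdict(list)
for block, load in samples:
    instruction_domain[block].append(load)

for line in gcode_program:
    block = line.split()[0]
    loads = instruction_domain.get(block, [])
    avg = sum(loads) / len(loads) if loads else float("nan")
    print(f"{line:28s} mean spindle load: {avg:.1f}%")
```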
The rise of big data has led to new demands for machine learning (ML) systems to learn complex models, with millions to billions of parameters, that promise adequate capacity to digest massive datasets and offer powerful predictive analytics (such as high-dimensional latent features, intermediate representations, and decision functions) thereupon. In order to run ML algorithms at such scales, on a distributed cluster with tens to thousands of machines, it is often the case that significant engineering efforts are required, and one might fairly ask whether such engineering truly falls within the domain of ML research. Taking the view that "big" ML systems can benefit greatly from ML-rooted statistical and algorithmic insights, and that ML researchers should therefore not shy away from such systems design, we discuss a series of principles and strategies distilled from our recent efforts on industrial-scale ML solutions. These principles and strategies span a continuum from application, to engineering, and to theoretical research and development of big ML systems and architectures, with the goal of understanding how to make them efficient, generally applicable, and supported with convergence and scaling guarantees. They concern four key questions that traditionally receive little attention in ML research: How can an ML program be distributed over a cluster? How can ML computation be bridged with inter-machine communication? How can such communication be performed? What should be communicated between machines? By exposing underlying statistical and algorithmic characteristics unique to ML programs but not typically seen in traditional computer programs, and by dissecting successful cases to reveal how we have harnessed these principles to design and develop both high-performance distributed ML software as well as general-purpose ML frameworks, we present opportunities for ML researchers and practitioners to further shape and enlarge the area that lies between ML and systems.
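The four questions above center on how model state and updates move between machines. The toy sketch below illustrates one common answer, a parameter-server pattern in which workers compute gradients on data shards and a central server aggregates them and updates the shared parameters; it is a didactic single-process simulation written for this summary, not code from the authors' systems.

```python
# Didactic parameter-server-style SGD for linear regression; workers compute
# gradients on their data shards, the "server" averages them and updates the
# shared parameters. Single-process simulation, not the authors' framework.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0, 0.5])
X = rng.normal(size=(1200, 3))
y = X @ true_w + rng.normal(0, 0.1, 1200)

def worker_gradient(w, X_shard, y_shard):
    """Gradient of mean-squared error on one worker's data shard."""
    residual = X_shard @ w - y_shard
    return 2.0 * X_shard.T @ residual / len(y_shard)

# Partition the data across 4 simulated workers.
shards = list(zip(np.array_split(X, 4), np.array_split(y, 4)))

w = np.zeros(3)          # parameters held by the server
lr = 0.05
for step in range(200):
    grads = [worker_gradient(w, Xs, ys) for Xs, ys in shards]  # "pull + compute"
    w -= lr * np.mean(grads, axis=0)                           # "push + aggregate"

print("Estimated parameters:", np.round(w, 3))
```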
Traditional Numerical Reservoir Simulation has been contributing to the oil and gas industry for decades. The current state of this technology is the result of decades of research and development by a large number of engineers and scientists. Starting in the late 1960s and early 1970s, advances in computer hardware, along with the development and adaptation of clever algorithms, resulted in a paradigm shift in reservoir studies, moving them from simplified analogs and analytical solution methods to more mathematically robust computational and numerical solution models.
Machine intelligence is the intelligence exhibited by artificial systems, and it is usually realized by ordinary computers. Rough sets and information granules are widely used in uncertainty management, soft computing, and granular computing, in many fields such as protein sequence analysis and biobasis determination, TSM, and Web service classification.
The aim of this study was to develop an adequate mathematical model for the long-term forecasting of technological progress and economic growth in the digital age (2020-2050). In addition, the task was to develop a model for forecasting labour productivity in the symbiosis of "man + intelligent machine", where an intelligent machine (IM) is understood as a computer or robot equipped with elements of artificial intelligence (AI), as well as in the digital economy as a whole. In the course of the study, it was shown that the Schumpeter-Kondratiev theory of innovation and cycles, concerning the formation of long waves (LW) of economic development under the influence of powerful clusters of economic technologies engendered by industrial revolutions, is the most appropriate basis for a long-term forecast of technological progress and economic growth. The Solow neoclassical model of economic growth, synchronized with the LW, makes it possible to forecast the economic dynamics of technologically advanced countries with good precision up to 30 years ahead, a period that correlates with the duration of an LW. In the information and digital age, the key role among the main factors of growth (capital, labour and technological progress) is played by the latter. The authors have developed an information model which allows for forecasting technological progress based on the growth rates of endogenous technological information in the economy. The main regimes of producing technological information, corresponding to the eras of the information and digital economies, are given in the article, as well as the Lagrangians that engender them. The model is verified on the example of the 5th information LW for the US economy (1982-2018), and it provides a highly accurate approximation of both technological progress and economic growth. A number of new results were obtained using the developed information models for forecasting technological progress. The forecast trajectory of the economic growth of developed countries (on the example of the USA) on the upward stage of the 6th LW (2018-2042), engendered by the digital technologies of the 4th Industrial Revolution, is given. It is also demonstrated that the symbiosis of human and intelligent machine (IM) is the driving force of the digital economy, with the human playing the leading role in organizing effective and efficient joint work. The authors suggest a mathematical model for calculating labour productivity in the digital economy, where the symbiosis of "human + IM" is widely used. The calculations carried out with the help of the model show that: 1) the symbiosis of "human + IM" makes it possible, from the very beginning, to realize the potential for increasing work performance in the economy with the help of digital technologies; 2) the highest labour productivity is achieved in the "human + IM" symbiosis in which human labour prevails, while the lowest labour productivity is seen where the largest part of the work is performed by the IM; and 3) developed countries may achieve labour productivity growth of 3% per year by the mid-2020s, which has every chance of being sustained into the 2040s.
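For readers unfamiliar with the growth framework invoked above, the Solow model referred to here is typically written as follows; this is the textbook form with labour-augmenting technological progress, not the authors' specific calibration or their labour-productivity model.

```latex
% Textbook Solow growth model with labour-augmenting technological progress
% (generic form; not the authors' calibrated equations).
\[
  Y(t) = K(t)^{\alpha}\,\bigl(A(t)\,L(t)\bigr)^{1-\alpha}, \qquad 0 < \alpha < 1,
\]
\[
  \dot{K}(t) = s\,Y(t) - \delta\,K(t),
\]
% On the balanced growth path, output per worker Y/L grows at the rate of
% technological progress g, where A(t) = A_0 e^{g t}.
```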
The famous claim that we use only about 10% of our brain capacity has recently been challenged; researchers argue that we are likely to use the whole brain. Evidence and results from studies and experiments on memory in the field of neuroscience lead to the conclusion that if the remaining 90% of the brain were not used, many neural pathways would degenerate. What is memory? How does the brain function? What is the limit of memory capacity? This article provides a model built upon the physiological and neurological characteristics of the human brain, which can give some theoretical support and scientific explanation for certain phenomena. It may not only be theoretically significant for neuroscience, but also practically useful in bridging the gap between natural and machine intelligence.