Background knowledge is important for data mining, especially in complicated situations. Ontological engineering is the successor of knowledge engineering. Sharable knowledge bases built on ontologies can be used to provide background knowledge to direct the process of data mining. This paper gives a general introduction to the method and presents a practical analysis example using SVM (support vector machine) as the classifier. The Gene Ontology and its accompanying annotations compose a large knowledge base, on which much research has been carried out. A microarray dataset is the output of a DNA chip. With the help of the Gene Ontology, we present a more elaborate analysis of microarray data than previous researchers. The method can also be used in other fields with similar scenarios.
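To make the idea concrete, the following is a minimal sketch, not the paper's actual pipeline: genes are grouped by an assumed Gene Ontology term mapping, the per-term aggregates serve as background-knowledge features, and an SVM (scikit-learn) classifies the samples. The toy expression matrix, labels, and gene-to-term mapping are all invented for illustration.

```python
# Minimal sketch: GO-term grouping of microarray features before SVM classification.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_genes, n_terms = 60, 200, 20
X = rng.normal(size=(n_samples, n_genes))        # toy expression matrix
y = rng.integers(0, 2, size=n_samples)           # toy class labels

# Hypothetical GO annotation: each gene belongs to one GO term; aggregate
# expression per term as background-knowledge features.
go_term_of_gene = rng.integers(0, n_terms, size=n_genes)
X_go = np.stack([X[:, go_term_of_gene == t].mean(axis=1) for t in range(n_terms)], axis=1)

clf = SVC(kernel="rbf", C=1.0)
print(cross_val_score(clf, X_go, y, cv=5).mean())
```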
By analyzing the principle of data sharing in database systems, this paper discusses the principle and method for integrating and sharing GIS data through a data engine, introduces a way to achieve high integration and sharing of GIS data on the basis of VCT in VC++, and provides a method for uniting VCT with an RDBMS in order to implement a spatial database with an object-oriented data model.
In view of the problems of inconsistent data semantics, inconsistent data formats, and difficult data quality assurance between the railway engineering design phase and the construction and operation phases, as well as the difficulty in fully realizing the value of design results, this paper proposes a design and implementation scheme for a railway engineering collaborative design platform. The platform mainly includes functional modules such as metadata management, design collaboration, design delivery management, a model component library, model rendering services, and Building Information Modeling (BIM) application services. On this basis, research is conducted on multi-disciplinary parameterized collaborative design technology for railway engineering, infrastructure data management and delivery technology, and design multi-source data fusion and application technology. The platform is compared with other railway design software to further validate its advantages and advanced features. It has been widely applied in multiple railway construction projects, greatly improving design and project management efficiency.
Reliability evaluation for aircraft engines is difficult because of the scarcity of failure data, but aircraft engine data are available from a variety of sources. Data fusion maximizes the amount of valuable information extracted from disparate data sources to obtain comprehensive reliability knowledge. Considering degradation failure and catastrophic failure simultaneously, which are competing risks that both affect reliability, a reliability evaluation model based on data fusion for aircraft engines is developed. Given these characteristics of the proposed model, reliability evaluation is more feasible than evaluation using failure data alone, and more accurate than evaluation considering a single failure mode. An example shows the effectiveness of the proposed model.
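As a rough illustration of the competing-risks idea only, the sketch below combines a catastrophic failure mode and a degradation failure mode into one reliability curve by multiplying their survival probabilities; the Weibull parameters, drift model, and threshold are assumed placeholders, not the paper's fused model.

```python
# Illustrative competing-risks combination of two failure modes (assumed parameters).
import numpy as np
from scipy import stats

t = np.linspace(0.0, 5000.0, 200)                 # operating hours (assumed range)

# Catastrophic failures: Weibull reliability, as might be fitted from sparse failure data.
beta, eta = 1.8, 6000.0
R_cat = np.exp(-(t / eta) ** beta)

# Degradation failures: a drifting performance parameter crosses a threshold.
mu_rate, sigma_rate, threshold = 0.01, 0.002, 40.0
R_deg = 1.0 - stats.norm.cdf((mu_rate * t - threshold) / (sigma_rate * t + 1e-9))

# Competing risks: the engine survives only if neither failure mode has occurred.
R_sys = R_cat * R_deg
print(R_sys[::50])
```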
The challenge of transitioning from temporary humanitarian settlements to more sustainable human settlements stems from a significant increase in the number of forcibly displaced people over recent decades, difficulties in providing social services that meet the required standards, and the prolongation of emergencies. Despite this challenging context, short-term considerations continue to guide planning and management rather than more integrated, longer-term perspectives, thus preventing viable, sustainable development. Over the years, the design of humanitarian settlements has not been adapted to local contexts and perspectives, nor to the dynamics of urbanization, population growth, and data. In addition, the current approach to temporary settlement harms the environment and can strain limited resources. Inefficient land use and ad hoc development models have compounded difficulties and generated new challenges. As a result, living conditions in settlements have deteriorated over recent decades and continue to pose new challenges. The stakes are such that major shortcomings have emerged along the way, leading to disruption and budget overruns in a context marked by a steady decline in funding. Some attempts have been made to shift towards more sustainable approaches, but these have mainly focused on vague, sector-oriented themes, failing to take a systematic, integrated view. This study contributes to addressing these shortcomings by designing a model-driven solution that emphasizes an integrated system conceptualized as a system of systems. The paper proposes a new methodology for designing an integrated and sustainable human settlement model, based on Model-Based Systems Engineering and the Systems Modeling Language, to provide valuable insights toward sustainable solutions for displaced populations, aligning with the United Nations 2030 Agenda for Sustainable Development.
A data identifier (DID) is an essential tag or label in all kinds of databases—particularly those related to integrated computational materials engineering (ICME), inheritable integrated intelligent manufacturing (I3M), and the Industrial Internet of Things. With the guidance and rapid acceleration of the development of advanced materials, as envisioned by official documents worldwide, more investigations are required to construct relevant numerical standards for materials informatics. This work proposes a universal DID format consisting of a set of build chains, which aligns with the classical form of identifier in both international and national standards, such as ISO/IEC 29168-1:2000, GB/T 27766-2011, GA/T 543.2-2011, GM/T 0006-2012, GJB 7365-2011, SL 325-2014, SL 607-201&, WS 363.2-2011, and QX/T 39-2005. Each build chain is made up of capital letters and numbers, with no symbols. Moreover, the total length of each build chain is not restricted, which follows the formation of the Universal Coded Character Set in the international standard ISO/IEC 10646. Based on these rules, the proposed DID is flexible and convenient to extend and share in and between various cloud-based platforms. Accordingly, classical two-dimensional (2D) codes, including the Hanxin Code, Lots Perception Matrix (LP) Code, Quick Response (QR) code, Grid Matrix (GM) code, and Data Matrix (DM) code, can be constructed and precisely recognized and/or decoded by either smartphones or specific machines. By utilizing these 2D codes as the fingerprints of a set of data linked with cloud-based platforms, progress and updates in the composition-processing-structure-property-performance workflow can be tracked spontaneously, paving a path to accelerate the discovery and manufacture of advanced materials and enhance research productivity, performance, and collaboration.
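A minimal sketch of how the stated build-chain rules (capital letters and digits only, no symbols, unrestricted length) could be checked and assembled is given below; the example chain contents and the plain concatenation used to join them are assumptions for illustration, not part of the proposed standard.

```python
# Sketch: validate DID build chains and assemble them into one identifier string.
import re

CHAIN_PATTERN = re.compile(r"^[A-Z0-9]+$")

def is_valid_chain(chain: str) -> bool:
    """A build chain may contain only capital letters and numbers, any length."""
    return bool(CHAIN_PATTERN.fullmatch(chain))

def build_did(chains: list[str]) -> str:
    """Concatenate validated build chains into one identifier (joining rule assumed)."""
    for c in chains:
        if not is_valid_chain(c):
            raise ValueError(f"invalid build chain: {c!r}")
    return "".join(chains)

# Hypothetical chains for material class, processing route, and batch number.
print(build_did(["MAT01", "PROC7HEAT", "B20230415"]))
```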
The processing of measured data plays an important role in reverse engineering. Based on grey system theory, we first propose some methods for processing measured data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. The grey generation and GM(1,1) are used to create new data for these blanks. For the uneven data sequence created by measuring error, mean generation is used to smooth it, and then stepwise and smooth generations are used to improve the data sequence.
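The sketch below shows a standard GM(1,1) fit-and-forecast step of the kind described, used to generate replacement values for blanks left after filtering; the toy measured sequence and the NumPy-based implementation are assumptions, not the authors' code.

```python
# Minimal GM(1,1) sketch: fit the grey model to a short sequence and forecast ahead.
import numpy as np

def gm11_forecast(x0: np.ndarray, steps: int) -> np.ndarray:
    """Fit GM(1,1) to sequence x0 and forecast the next `steps` values."""
    n = len(x0)
    x1 = np.cumsum(x0)                              # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])                   # mean generation of consecutive x1 values
    B = np.column_stack((-z1, np.ones(n - 1)))
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]     # developing coefficient a, grey input b
    k = np.arange(n, n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a
    return x1_hat - x1_prev                         # restore to original-scale values

measured = np.array([2.87, 2.91, 2.96, 3.02, 3.07])  # toy measured values
print(gm11_forecast(measured, steps=2))              # values to fill two missing points
```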
The nature of measured data varies among different disciplines of the geosciences. In rock engineering, the features of the data play a leading role in determining the feasible methods for its proper manipulation. The present study focuses on resolving one of the major deficiencies of conventional neural networks (NNs) in dealing with rock engineering data. Since the samples are obtained from hundreds of meters below the surface with the utmost difficulty, the number of samples is always limited. Meanwhile, the experimental analysis of these samples may result in many repetitive values and zeros. Conventional neural networks are incapable of building robust models in the presence of such data. In addition, these networks depend strongly on the initial weight and bias values for making reliable predictions. With this in mind, the current research introduces a novel kind of neural network processing framework for geological data that does not suffer from the limitations of conventional NNs. The introduced single-data-based feature engineering network extracts all the information wrapped in every single data point without being affected by the other points. This method, being completely different from conventional NNs, rearranges all the basic elements of the neuron model into a new structure; its mathematical calculations were therefore performed from the very beginning. Moreover, the corresponding programming codes were developed in MATLAB and Python, since they could not be found in any common programming software at the time. This new kind of network was first evaluated through computer-based simulations of rock cracks in the 3DEC environment. After the model's reliability was confirmed, it was adopted in two case studies for estimating the tensile strength and shear strength of real rock samples, respectively. These samples were coal core samples from the southern Qinshui Basin of China and gas hydrate-bearing sediment (GHBS) samples from the Nankai Trough of Japan. The coal samples used in the experiments underwent nuclear magnetic resonance (NMR) measurements and scanning electron microscopy (SEM) imaging to investigate their original micro- and macro-fractures. Once these experiments were completed, the rock mechanical properties, including tensile strength, were measured using a rock mechanical test system, whereas the shear strength of the GHBS samples was acquired through triaxial and direct shear tests. According to the obtained results, the new network structure outperformed conventional neural networks in both the simulation-based and case-study estimations of tensile and shear strength. Even though the proposed approach originally aimed at resolving the issue of a limited dataset, its unique properties can also be applied to larger datasets from other subsurface measurements.
The knowledge representation mode and inference control strategy were analyzed according to the particularities of air-conditioning cooling/heating source selection. The construction idea and working procedure for the knowledge base and inference engine were proposed, and the realization technique in the C language was discussed. An intelligent decision support system (IDSS) model based on this knowledge representation and inference mechanism was developed by domain engineers. The model was verified to have a small kernel and powerful capability in list processing and data driving, and it was successfully used in the design of a cooling/heating source system for a large office building.
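As an illustration of the knowledge-base / inference-engine split (shown here in Python rather than C), the minimal forward-chaining sketch below fires every rule whose condition matches the fact base; the facts, conditions, and recommendations are invented placeholders, not the IDSS knowledge base.

```python
# Sketch: separate fact base, rule base, and a trivial forward-chaining inference engine.
facts = {"building_type": "office", "cooling_load_kw": 1200, "gas_available": True}

rules = [
    (lambda f: f["cooling_load_kw"] > 1000 and f["gas_available"],
     "consider gas-fired absorption chiller"),
    (lambda f: f["cooling_load_kw"] <= 1000,
     "consider electric screw chiller"),
]

# Inference engine: collect the advice of every rule whose condition holds.
conclusions = [advice for condition, advice in rules if condition(facts)]
print(conclusions)
```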
A comprehensive safety evaluation system taking the most influential factors into account has been developed to evaluate the reliability of hydraulic metal structures. Applying AI and database techniques, the idea of a one-machine, three-base system is proposed. The framework of the three-base system has been designed and the structural framework constructed in turn. A practical example is given to illustrate the process of using the system, which can also be used for comparison and analysis purposes. The key technology of the system is its ability to reorganize and improve the expert system's knowledge base through the expert system itself. The system uses computer-based inference, making safety evaluation conclusions more reasonable and applicable to the actual situation. The system is not only advanced, but also feasible, reliable, artificially intelligent, and able to grow continuously.
The 3D spatial data model and simulation are the core of 3D GIS and can be adopted in different domains. A data model based on the Quasi Tri-Prism Volume (QTPV) has been proposed. The QTPV definition and its special cases are discussed. Using the QTPV and its special cases, irregular natural geological bodies and regular subsurface engineering can be described efficiently. The proposed model is composed of five primitives and six objects. Data structures and the topological relationships of the five primitives and the three objects describing stratigraphy are designed in detail. Schemes are designed for QTPV modelling of stratigraphy and subsurface engineering according to the modelling data. The model manipulation method of cutting a QTPV by an arbitrary plane is discussed. Using the VC++ 6.0 programming language integrated with an SQL database and the OpenGL graphics library under the Windows environment, a system prototype, 3DGeoMV, has been developed. The experimental results show that the QTPV model is feasible and efficient for modelling subsurface engineering.
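The following is a minimal data-structure sketch of a quasi tri-prism volume as two triangular facets on adjacent stratum interfaces; the class and field names are assumptions for illustration and do not reproduce the paper's five primitives and six objects.

```python
# Sketch: a QTPV-like volume element built from two triangles between stratum surfaces.
from dataclasses import dataclass

@dataclass
class Vertex:
    x: float
    y: float
    z: float

@dataclass
class Triangle:
    v1: Vertex
    v2: Vertex
    v3: Vertex

@dataclass
class QTPV:
    top: Triangle        # triangle on the upper stratum interface
    bottom: Triangle     # corresponding triangle on the lower interface
    stratum_id: int      # which geological body the volume belongs to

    def is_degenerate(self) -> bool:
        """A special case: the prism collapses when top and bottom coincide."""
        return self.top == self.bottom

# One QTPV between two stratum surfaces (coordinates are illustrative).
q = QTPV(
    top=Triangle(Vertex(0, 0, 10), Vertex(1, 0, 10), Vertex(0, 1, 11)),
    bottom=Triangle(Vertex(0, 0, 2), Vertex(1, 0, 3), Vertex(0, 1, 2)),
    stratum_id=1,
)
print(q.is_degenerate())
```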
This article describes an Internet-based laboratory (NETLAB) developed at Zhejiang University for electrical engineering education. A key feature of the project is the use of real experimental systems rather than simulation or virtual reality. NETLAB provides remote access to a wide variety of experiments, including not only basic electrical and electronic experiments but also many innovative control experiments. Students can effectively use the laboratory at any time and from anywhere. NETLAB has been in operation since July 2003.
A sufficient sample size of monitoring data is a key factor in describing aircraft engine state. Generative adversarial nets (GAN) can be used to expand the sample size based on existing state monitoring information. In this paper, a GAN model is introduced to design an algorithm for generating monitoring data for aircraft engines. The feasibility of the method is illustrated by an example. The experimental results demonstrate that the probability density distribution of the generated data after a large number of training iterations is consistent with that of the monitoring data, and that the generated monitoring data of the aircraft engine lie in a reasonable range. The method can effectively solve the problem of inaccurate performance degradation evaluation caused by the small amount of aero-engine condition monitoring data.
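A compact sketch of the general approach, a small GAN trained on a one-dimensional toy monitoring parameter, is shown below (PyTorch); the network sizes, the Gaussian toy data, and the training schedule are assumptions, not the paper's configuration.

```python
# Minimal GAN sketch for generating 1-D engine-monitoring samples (toy setup).
import torch
from torch import nn

real_data = 0.8 + 0.05 * torch.randn(1024, 1)      # toy monitoring parameter samples

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = real_data[torch.randint(0, 1024, (64,))]
    fake = G(torch.randn(64, 8))

    # Discriminator step: separate real monitoring data from generated data.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: make generated samples look real to the discriminator.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(5, 8)).detach().squeeze())      # five generated samples
```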
With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that in structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are, for the most part, based on past experience, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments in the field of seismic design and hinting at the specific considerations required for nonstructural components.
The key to developing 3-D GISs is the study of 3-D data models and data structures. Some data models and data structures have been presented by scholars. Because of the complexity of 3-D spatial phenomena, there is no perfect data structure that can describe all spatial entities; every data structure has its own advantages and disadvantages, and it is difficult to design a single data structure that meets different needs. An important subject in 3-D data modelling is developing a data model that integrates vector and raster data structures. A special 3-D spatial data model based on the distribution features of spatial entities should be designed. We took geological exploration engineering as the research background and designed an integrated data model whose data structures integrate vector and raster data by adopting object-oriented techniques. The research achievements are presented in this paper.
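As a sketch of the integrated vector-raster idea, the object below carries both a vector boundary and a raster attribute grid inside one object-oriented entity; the class and field names are assumptions made for illustration, not the paper's model.

```python
# Sketch: one entity holding a vector boundary plus a raster attribute block.
from dataclasses import dataclass
import numpy as np

@dataclass
class GeologicalBody:
    name: str
    boundary: list[tuple[float, float, float]]   # vector: 3-D boundary vertices
    grade_grid: np.ndarray                       # raster: sampled attribute values

    def mean_grade(self) -> float:
        """Query the raster side of the object without touching the geometry."""
        return float(self.grade_grid.mean())

ore = GeologicalBody(
    name="orebody-1",
    boundary=[(0, 0, -50), (10, 0, -52), (10, 8, -55), (0, 8, -53)],
    grade_grid=np.array([[1.2, 1.4], [1.1, 1.6]]),
)
print(ore.mean_grade())
```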
With deepening research on knowledge engineering and the widespread application of CAD technology, joining knowledge engineering with CAD has become a focus of advanced manufacturing. An intelligent approach is presented for configuring the typical structural components of radar. Case-based reasoning, rule-based reasoning, geometric constraint solving, and domain ontology are merged into a compound knowledge model. The main framework and workflow of the radar typical structural component design system are illustrated. Experiments show this approach is efficient and effective.
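The sketch below illustrates only the case-based reasoning step of such a compound model: retrieving the stored component case most similar to a new requirement vector; the case attributes and similarity weights are invented for illustration.

```python
# Sketch: weighted nearest-case retrieval for a new design requirement.
import numpy as np

cases = {                       # hypothetical past designs: (mass_kg, span_mm, load_kN)
    "cabinet-A": np.array([120.0, 800.0, 3.0]),
    "cabinet-B": np.array([200.0, 1200.0, 5.0]),
    "pedestal-C": np.array([450.0, 900.0, 12.0]),
}
weights = np.array([0.2, 0.3, 0.5])             # attribute importance (assumed)
query = np.array([180.0, 1100.0, 4.5])          # new design requirement

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Higher when the weighted relative differences between cases are smaller."""
    return 1.0 / (1.0 + float(np.sum(weights * np.abs(a - b) / (np.abs(b) + 1e-9))))

best = max(cases, key=lambda k: similarity(cases[k], query))
print("most similar past design:", best)
```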
Knowledge-Based Engineering (KBE) is introduced into ship structural design in this paper. From the KBE implementation, design solutions for both the Rules Design Method (RDM) and the Interpolation Design Method (IDM) are generated, and the corresponding Finite Element (FE) models are built. Topological design of the longitudinal structures is studied, where a Gaussian Process (GP) is employed to build the surrogate model for FE analysis. Multi-objective optimization methods based on the Pareto front are used to reduce the design tank weight and outer surface area simultaneously. Additionally, an enhanced Level Set Method (LSM) employing an implicit algorithm is applied to the topological design of a typical bracket plate, which is used extensively in ship structures. Two different sets of boundary conditions are considered. The proposed methods show satisfactory efficiency and accuracy.
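As a rough sketch of the surrogate-plus-Pareto workflow, the code below fits Gaussian Process surrogates (scikit-learn) to a handful of stand-in "FE analysis" samples and then keeps the non-dominated candidate designs for two objectives; the toy objective functions, design variables, and sample sizes are assumptions, not the ship-design model.

```python
# Sketch: GP surrogates for two objectives, then a Pareto (non-dominated) filter.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
X_train = rng.uniform(0.0, 1.0, size=(30, 2))                     # two normalized design variables
weight = 5 + 3 * X_train[:, 0] + rng.normal(0, 0.05, 30)          # stand-in for FE tank weight
area = 4 + 2 * (1 - X_train[:, 0]) + X_train[:, 1] + rng.normal(0, 0.05, 30)  # stand-in surface area

gp_weight = GaussianProcessRegressor().fit(X_train, weight)
gp_area = GaussianProcessRegressor().fit(X_train, area)

candidates = rng.uniform(0.0, 1.0, size=(500, 2))
f = np.column_stack((gp_weight.predict(candidates), gp_area.predict(candidates)))

# Keep designs that no other design beats on both objectives simultaneously.
pareto = [i for i in range(len(f))
          if not any(np.all(f[j] <= f[i]) and np.any(f[j] < f[i]) for j in range(len(f)))]
print(len(pareto), "Pareto-optimal candidate designs")
```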
Since the end of the previous decade, hypertext techniques have been applied in many areas. A hypertext data model with version control, applied to a digital delivery system for engineering documents named the Optical Disk based Electronic Archives Management System (ODEAMS), is presented first; it has successfully solved some problems in engineering data management. The paper then describes some details of implementing the hypertext network in ODEAMS, after introducing the requirements and characteristics of engineering data management.
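A minimal sketch of hypertext nodes and links with simple version control, in the spirit of the described model, is given below; the field names and the linear version list are assumptions, not ODEAMS internals.

```python
# Sketch: hypertext nodes with linear version histories and version-pinned links.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    versions: list[str] = field(default_factory=list)   # document content per revision

    def add_version(self, content: str) -> int:
        """Append a new revision and return its version number."""
        self.versions.append(content)
        return len(self.versions) - 1

@dataclass
class Link:
    source: str          # node_id of the anchor document
    target: str          # node_id of the referenced document
    source_version: int  # pin the link to a specific revision of the source

drawing = Node("DWG-001")
v0 = drawing.add_version("initial release of the assembly drawing")
spec = Node("SPEC-014")
spec.add_version("material specification text")
link = Link(source="DWG-001", target="SPEC-014", source_version=v0)
print(link)
```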