Whole-process project cost management based on building information modeling (BIM) is a new management method that aims to comprehensively optimize and improve project cost management through the application of BIM technology. This paper summarizes and analyzes whole-process project cost management based on BIM, with the goal of exploring its application and development prospects in the construction industry. First, the paper introduces the role and advantages of BIM technology in engineering cost management, including information integration, data sharing, and collaborative work. Second, it analyzes the key technologies and methods of whole-process BIM-based project cost management, including model construction, data management, and cost control. The paper also discusses the challenges and limitations of whole-process BIM project cost management, such as inconsistent technical standards, personnel training, and the need for a change in mindset. Finally, it summarizes the advantages and development prospects of whole-process BIM-based project cost management and puts forward directions and suggestions for future research. This research can serve as a reference for construction cost management and promote innovation and development in the construction industry.
Pipeline integrity is a cornerstone of the operation of many industrial systems, and maintaining it is essential for preventing the economic losses and ecological damage caused by oil and gas leaks. Based on integrity-management data published by the US Pipeline and Hazardous Materials Safety Administration, this study applied k-means clustering and data envelopment analysis (DEA) to explore the characteristics of pipeline-integrity management and to evaluate its efficiency. The k-means clustering algorithm was found to be scientifically valid for classifying pipeline companies as low-, medium-, or high-difficulty according to their integrity-management requirements. Regardless of a company's classification, equipment failure was the main cause of pipeline failure, and in-line inspection corrosion and dent tools were the two most-used inspection tools. Among the types of repair, 180-day condition repairs were a key concern for pipeline companies. The DEA results indicate that only three of the 34 companies were DEA-effective. To improve the effectiveness of pipeline-integrity management, we propose targeted directions and scales of improvement for the non-DEA-effective companies. (Funded by the National Natural Science Foundation of China, Grant No. 71871018.)
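As a rough illustration of the clustering step, the sketch below groups companies into three difficulty classes with k-means; the feature set and values are hypothetical stand-ins for the PHMSA integrity-management variables, and scikit-learn is assumed to be available.

```python
# A minimal sketch of the k-means classification step, assuming scikit-learn.
# Feature names and sample values are hypothetical placeholders, not the
# study's actual PHMSA variables.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-company features: pipeline mileage, inspection count,
# and number of 180-day condition repairs.
companies = np.array([
    [1200.0,  45, 12],
    [8300.0, 310, 96],
    [450.0,   18,  3],
    [5600.0, 220, 71],
    [950.0,   39,  9],
    [7100.0, 280, 88],
])

# Standardize so mileage does not dominate the Euclidean distance.
X = StandardScaler().fit_transform(companies)

# k = 3 mirrors the low-/medium-/high-difficulty split in the study.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# Order clusters by mean mileage so the labels are reproducible.
order = np.argsort([companies[km.labels_ == c, 0].mean() for c in range(3)])
names = {c: lbl for c, lbl in zip(order, ["low", "medium", "high"])}
for i, c in enumerate(km.labels_):
    print(f"company {i}: {names[c]}-difficulty")
```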
This paper describes how database information and electronic 3D models are integrated to produce power plant designs more efficiently and accurately. Engineering CAD/CAE systems have evolved from strictly 3D modeling tools into spatial data management tools. The paper describes how process data, commodities, and location data are disseminated to the various project team members through a central integrated database. The database and 3D model also provide a cache of information that is valuable to the constructor and to operations and maintenance personnel.
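The central-database idea can be pictured with a minimal sketch: process, commodity, and location data keyed to a single component tag so every discipline queries one source of truth. The schema and values below are invented placeholders, not the system described in the paper.

```python
# A minimal sketch of a central integrated database keyed by component tag.
# Schema and values are hypothetical illustrations only.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE components (
        tag TEXT PRIMARY KEY,     -- links a 3D model object to its data
        line_pressure_bar REAL,   -- process data
        commodity_code TEXT,      -- commodity (pipe spec, material class)
        x REAL, y REAL, z REAL    -- location data from the 3D model
    )""")
db.execute("INSERT INTO components VALUES ('P-101-A', 16.5, 'CS-150', 12.0, 4.5, 7.2)")

# Any project team member (or O&M personnel later) reads the same record.
tag, pressure, commodity, *xyz = db.execute(
    "SELECT * FROM components WHERE tag = 'P-101-A'").fetchone()
print(tag, pressure, commodity, xyz)
```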
Data integrity is a critical component of data lifecycle management, and its importance grows in a complex and dynamic landscape. Actions such as unauthorized access, unauthorized modification, data manipulation, audit tampering, data backdating, data falsification, phishing, and spoofing are no longer the work of rogue individuals alone; they are also carried out systematically by organizations and states. Data security therefore requires strong data integrity measures and associated technical controls. Without a properly customized framework in place, organizations face a high risk of financial, reputational, and revenue losses, bankruptcy, and legal penalties, which we discuss throughout this paper. We also explore improved and innovative techniques in product development to better address the challenges and requirements of data security and integrity.
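One common technical control against unauthorized modification and backdating is a keyed integrity tag. The sketch below, using only the Python standard library, is a generic illustration of that idea rather than the specific framework the paper proposes.

```python
# A minimal sketch of one integrity control: detecting unauthorized
# modification with a keyed hash (HMAC-SHA256). Generic illustration only.
import hmac
import hashlib

SECRET_KEY = b"rotate-me-via-a-key-management-service"  # hypothetical key

def seal(record: bytes) -> str:
    """Compute an integrity tag to store alongside the record."""
    return hmac.new(SECRET_KEY, record, hashlib.sha256).hexdigest()

def verify(record: bytes, tag: str) -> bool:
    """Constant-time check that the record was not altered or backdated."""
    return hmac.compare_digest(seal(record), tag)

row = b"batch=42;result=PASS;signed_at=2024-01-15T09:00:00Z"
tag = seal(row)
assert verify(row, tag)
assert not verify(row.replace(b"PASS", b"FAIL"), tag)  # tampering detected
```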
Using spatial data integration and database technology, and by analyzing and integrating the assessment results of all development zones in Hunan Province from different periods, this paper constructs a database and management system for the assessment results of land-use intensity in development zones. This forms a "one map" of Hunan development zones and realizes the integrated management and application of assessment results for all development zones in Hunan at the provincial level and above, at any time. Practice has shown that the system performs well and holds promise for the land-management work of land administration departments and development zones.
This paper introduces the main contents of PDM implementation and proposes a phased, cyclically optimized PDM implementation methodology, based on a study of enterprises' data-management requirements and a summary of existing methods. The implementation contents include document management, product structure and configuration management, product classification and coding management, workflow management, and software encapsulation. The methodology divides PDM implementation into five phases: preparation, software selection and training, implementation planning, concrete implementation, and evaluation. By cycling through the last two phases, a successful PDM implementation can be achieved. This methodology has been applied in the CIMS projects of two companies and has proved practical.
With the advent of Industry 4.0, more and more investment casting enterprises are implementing production manufacturing systems, especially in the last two years. This paper summarizes three new common requirements for digital management in precision casting enterprises and puts forward three corresponding techniques: production-process tracking card technology based on the main-sub card mode, barcode-based processing of workshop production-process data, and equipment data integration technology. The paper then discusses the principle, application, and effect of these technologies in detail, providing a reference for enterprises moving toward digital and intelligent casting. (Supported by the National Science & Technology Key Projects of Numerical Control (2012ZX04012-011) and the National High-tech R&D Program (863 Program) (2013031003).)
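A minimal sketch of the main-sub card idea is shown below: one main tracking card per batch, sub-cards linked to it, and each barcode scan appending a timestamped operation. All identifiers and station names are invented for illustration.

```python
# A minimal sketch of barcode-driven process tracking in a main-sub card
# mode. Card IDs, stations, and the data layout are hypothetical.
from datetime import datetime, timezone
from typing import Optional

cards = {}  # barcode -> card record

def open_card(barcode: str, parent: Optional[str] = None) -> None:
    """Create a tracking card; a sub card points at its main card."""
    cards[barcode] = {"parent": parent, "ops": []}

def scan(barcode: str, station: str) -> None:
    """A workshop scan records which station handled the card, and when."""
    cards[barcode]["ops"].append((station, datetime.now(timezone.utc)))

open_card("MAIN-0001")                        # main card: one casting batch
open_card("SUB-0001-01", parent="MAIN-0001")  # sub card: one wax tree/mold
scan("SUB-0001-01", "shell-building")
scan("SUB-0001-01", "dewaxing")
print([op for op, _ in cards["SUB-0001-01"]["ops"]])  # scanned stations in order
```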
An integrated intelligent management approach is presented to help organizations manage the many heterogeneous resources in their information systems. A general architecture for managing information system reliability is proposed, and the architecture is described from two aspects: a process model and a hierarchical model. Data mining techniques are used for data analysis; by improving data mining on the critical processes, a data analysis system applicable to real-time analysis is developed. The framework of integrated management for information system reliability based on real-time data mining is illustrated, and the development of integrated, intelligent management of information systems is discussed.
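Since the abstract does not spell out the improved data-mining algorithm, the sketch below stands in with a generic real-time detection rule, a rolling z-score over a monitored metric, to show what analysis on streaming process data can look like.

```python
# A minimal sketch of real-time analysis over a monitored metric, assuming a
# rolling z-score as the detection rule; this is a generic stand-in, not the
# paper's "improved data mining" algorithm.
from collections import deque
import math

class StreamMonitor:
    """Flags readings that deviate sharply from a sliding window."""
    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, x: float) -> bool:
        anomalous = False
        if len(self.buf) >= 10:  # wait for a minimal baseline
            mean = sum(self.buf) / len(self.buf)
            var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
            std = math.sqrt(var) or 1e-9  # guard against a flat signal
            anomalous = abs(x - mean) / std > self.threshold
        self.buf.append(x)
        return anomalous

mon = StreamMonitor()
readings = [1.0] * 60 + [9.5]        # a sudden spike on a flat signal
flags = [mon.observe(r) for r in readings]
print(flags[-1])                     # True: the spike is flagged
```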
In a networked manufacturing environment, product data integration is the foundation of collaborative manufacturing among enterprises, so it is very important to build an effective integration mechanism for sharing product data across geographically dispersed, heterogeneous system platforms. To integrate an enterprise-level product data management (PDM) system with a shop-floor-level distributed numerical control (DNC) system, a Web-services-based integration approach is presented, facilitating seamless sharing of product data between PDM and DNC systems in a networked manufacturing environment. To verify the validity of the proposed approach, an example integration was implemented with the Java 2 Software Development Kit and the Java Web Services Developer Pack in a Windows 2000 Professional environment. (Supported by the National Natural Science Foundation of China (Grant No. 60574054), the Program for New Century Excellent Talents in University (2006), and the Shanghai Science and Technology Climb Action Foundation (Grant No. 06DZ11202).)
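The integration pattern can be sketched as a PDM-side service that a DNC client calls over the network. The original work used Java and the SOAP-era Java Web Services Developer Pack; the plain HTTP/XML fragment below merely illustrates the idea, and all part numbers and fields are hypothetical.

```python
# A minimal sketch of the PDM-to-DNC service pattern: the PDM side exposes
# product data that a shop-floor DNC client can fetch. Plain HTTP/XML is
# substituted for the original SOAP stack; names are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

PDM_DB = {
    "PN-1001": "<part id='PN-1001'><rev>C</rev><nc-program>O1001</nc-program></part>",
}

class ProductDataService(BaseHTTPRequestHandler):
    def do_GET(self):
        part = self.path.strip("/")  # e.g. GET /PN-1001
        body = PDM_DB.get(part)
        self.send_response(200 if body else 404)
        self.send_header("Content-Type", "application/xml")
        self.end_headers()
        self.wfile.write((body or "<error>unknown part</error>").encode())

if __name__ == "__main__":
    # A DNC client would now GET http://localhost:8080/PN-1001
    HTTPServer(("localhost", 8080), ProductDataService).serve_forever()
```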
Grid computing is concerned with the sharing and coordinated use of diverse resources in distributed virtual organizations, which introduces various challenging security issues. Among these, trusting the resources to be shared and coordinated in a dynamic, multi-institutional virtual-organization environment is particularly challenging. In this paper, an approach to trust assessment and trust-degree calculation using subjective logic is suggested, so that a data grid or computational grid user can be allocated a reliable, trusted resource that maintains data integrity with fast response and accurate results. The suggested approach is illustrated with an example scenario, and the simulation results show an increase in the resource utilization of a trusted resource compared with a resource that is not trusted.
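As a concrete anchor for the trust-degree calculation, the sketch below uses Jøsang's standard subjective-logic mapping from positive/negative evidence to an opinion (b, d, u) with expected trust E = b + a·u; the paper's exact formulation may differ, and the resource histories are invented.

```python
# A minimal sketch of trust-degree calculation with subjective logic, using
# the standard evidence-to-opinion mapping (b + d + u = 1). The paper's
# exact formulation may differ; resource names and histories are invented.
def opinion(r: int, s: int, base_rate: float = 0.5):
    """Map r positive and s negative interactions to (belief, disbelief,
    uncertainty) and an expected trust value E = b + a * u."""
    k = r + s + 2.0                  # non-informative prior weight W = 2
    b, d, u = r / k, s / k, 2.0 / k
    return b, d, u, b + base_rate * u

# Allocate the grid resource with the highest expected trust.
history = {"nodeA": (48, 2), "nodeB": (10, 10), "nodeC": (3, 0)}
scores = {n: opinion(r, s)[3] for n, (r, s) in history.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))  # nodeA 0.942
```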
We advance here a novel methodology for robust, intelligent biometric information management, with inferences and predictions made using randomness and complexity concepts. Intelligence refers to learning, adaptation, and functionality; robustness refers to the ability to handle incomplete and/or corrupt adversarial information on one side, and image and/or device variability on the other. The proposed methodology is model-free and non-parametric. It draws support from discriminative methods using likelihood ratios to link biometrics and forensics at the conceptual level. At the modeling and implementation level, it further links the Bayesian framework, statistical learning theory (SLT) using transduction and semi-supervised learning, and information theory (IT) using mutual information. The key concepts supporting the proposed methodology are: (a) local estimation to facilitate learning and prediction using both labeled and unlabeled data; (b) similarity metrics based on regularity of patterns, randomness deficiency, and Kolmogorov complexity (similar to MDL), using strangeness/typicality and ranking p-values; and (c) the Cover-Hart theorem on the asymptotic performance of k-nearest neighbors approaching the optimal Bayes error. Several topics in biometric inference and prediction are described here, related to (1) multi-level and multi-layer data fusion, including quality and multi-modal biometrics; (2) score normalization and revision theory; (3) face selection and tracking; and (4) identity management. These use an integrated approach that includes transduction and boosting for ranking and sequential fusion/aggregation, respectively, on one side, and active learning and change/outlier/intrusion detection realized using information gain and martingales, respectively, on the other. The proposed methodology can be mapped to additional types of information beyond biometrics.
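The strangeness/p-value machinery mentioned in (b) can be made concrete with a small conformal-prediction-style sketch: strangeness as the ratio of same-class to other-class nearest-neighbor distances, and the p-value as the rank of a provisional label's strangeness among all examples. The data below are toy values.

```python
# A minimal sketch of strangeness and ranking p-values in the conformal-
# prediction style referenced above. Toy 1-D data; k = 1 neighbors.
import numpy as np

def strangeness(X, y, i, k=1):
    """alpha_i: sum of k nearest same-class distances over k nearest
    other-class distances; larger means more 'strange' for its label."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf                    # exclude the point itself
    same = np.sort(d[y == y[i]])[:k]
    other = np.sort(d[y != y[i]])[:k]
    return same.sum() / other.sum()

X = np.array([[0.0], [0.1], [0.2], [1.0], [1.1], [0.15]])
y = np.array([0, 0, 0, 1, 1, 0])     # last point provisionally labeled 0

alphas = np.array([strangeness(X, y, i) for i in range(len(X))])
# p-value of the provisional label: fraction at least as strange.
p = np.mean(alphas >= alphas[-1])
print(round(p, 2))                   # high p-value: label 0 is plausible
```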
To meet the informatization requirements of training, a flexible, general-purpose data protocol and an information relay system combining real and virtual elements were designed. Based on a layered, middleware-oriented design, the system implements external information reception, internal information forwarding, communication protocol parsing and conversion, information interfacing and joint debugging, data file recording, configuration management, and traffic statistics. Adopting the MVC software architecture pattern, with Visual Studio 2015 as the development tool and an Oracle database as the underlying support, the system comprises modules for receiving real-equipment and virtual-target information, establishing data mappings, reading plan data from the database, receiving internal tracks over the basic communication layer, and sending data packets. Practical application shows that the system masks the differences among external equipment systems, simulation systems, and network protocols, provides interconnection and interoperability, adapts to combined real-virtual training tasks, and improves support capability. The system has good adaptability and extensibility and can meet the application and development needs of systems under informatized conditions.
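The protocol parsing-and-conversion layer at the heart of such a relay system can be sketched as unpack, validate, and re-encode. The actual system was built with Visual Studio 2015 over Oracle; the Python fragment below only illustrates the pattern, and the frame layout (0xEB90 header, little-endian fields) is an invented placeholder.

```python
# A minimal sketch of protocol parsing and conversion in a relay layer.
# The frame layout (0xEB90 header, little-endian fields) is invented.
import struct

def parse_external(frame: bytes) -> dict:
    """Unpack a hypothetical external track frame into a neutral record."""
    header, track_id, lon, lat, alt = struct.unpack("<HI3f", frame)
    if header != 0xEB90:
        raise ValueError("bad frame header")
    return {"track_id": track_id, "lon": lon, "lat": lat, "alt": alt}

def to_internal(rec: dict) -> bytes:
    """Re-encode the neutral record in the internal network's format."""
    return b"TRK|%d|%.5f|%.5f|%.1f" % (
        rec["track_id"], rec["lon"], rec["lat"], rec["alt"])

frame = struct.pack("<HI3f", 0xEB90, 7, 112.97, 28.19, 950.0)
print(to_internal(parse_external(frame)))  # converted internal-format packet
```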