Discussing organizational data management implies, almost automatically, the concept of data warehousing as one of the most important components of a decision support system (DSS): it supports integrated information management by aggregating all data formats and provisioning external systems with consistent data content and flows. It works together with the metadata concept, one of the simplest ways of integrating software and database systems. Since organizational data management uses the metadata channel to create a bi-directional flow, correctly managed metadata can save organizations both time and resources. This paper focuses on the theoretical aspects of the two concepts, together with a brief overview of a proposed design model for an organizational management tool.
As the informatization of Resources & Environment Remote Sensing geological surveys deepens, several potential problems and deficiencies have emerged: (1) the lack of a uniformly planned running environment; (2) inconsistent methods of data integration; and (3) drawbacks of the differing ways in which data integration is performed. This paper addresses these problems through overall planning and design, constructing a unified running environment, consistent data-integration methods, and a coherent system structure in order to advance this informatization.
The brokering approach can be used successfully to overcome the crucial problem of searching through the enormous amounts of data (raw and/or processed) produced and stored in different information systems. In this paper, the authors describe the Data Management System (DMS) developed by INGV (Istituto Nazionale di Geofisica e Vulcanologia) to support the GEOSS (Global Earth Observation System of Systems) brokering system adopted for the ARCA (Arctic Present Climate Change and Past Extreme Events) project. The DMS includes heterogeneous data that contribute to the ARCA objective (www.arcaproject.it), which focuses on multi-parametric and multi-disciplinary studies of the mechanism(s) behind the release of large volumes of cold, fresh water from the melting of ice caps. The DMS is accessible directly at www.arca.rm.ingv.it, or through the IADC (Italian Arctic Data Center) at http://arcticnode.dta.cnr.it/iadc/gi-portal/index.jsp, which interoperates with the GEOSS brokering system (http://www.geoportal.org), making the search for a specific data set and its URL easy and fast.
A product data management system for a manufacturing enterprise must ensure that the proper product data can be communicated to the right people at the right time. This paper describes a system analysis paradigm for data analysis in product data management (PDM) development. Three aspects of the paradigm, i.e., function, structure, and behavior, are represented. The use of the paradigm explains why so many kinds of objects are necessary in a commercial database matrix and what models are available for developing a PDM application. As a further result, a number of models are derived from the analysis under the product data system paradigm to model product data and PDM database definitions.
Since the end of the previous decade, hypertext techniques have been applied in many areas. A hypertext data model with version control, applied to a digital delivery system for engineering documents named the Optical Disk based Electronic Archives Management System (ODEAMS), is presented first; it has successfully solved several problems in engineering data management. The paper then describes details of implementing the hypertext network in ODEAMS, after introducing the requirements and characteristics of engineering data management.
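As a rough illustration of the kind of versioned hypertext model the ODEAMS abstract refers to, the Python sketch below keeps each engineering document as a node with an ordered list of versions and pins links to specific versions; the class and field names are hypothetical, not taken from the paper.

```python
# Minimal sketch of a versioned hypertext model: documents are nodes, each
# node keeps an ordered list of versions, and links connect specific
# versions of nodes. All names here are illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Version:
    number: int
    content: str          # e.g. a reference to an archived engineering document


@dataclass
class Node:
    name: str
    versions: List[Version] = field(default_factory=list)

    def add_version(self, content: str) -> Version:
        v = Version(number=len(self.versions) + 1, content=content)
        self.versions.append(v)
        return v


@dataclass
class Link:
    source: Node
    target: Node
    source_version: int   # pinning links to versions keeps references in
    target_version: int   # older document revisions consistent


# Usage: two documents, a revision of one, and a version-pinned link.
spec = Node("pump-spec")
drawing = Node("pump-drawing")
spec.add_version("spec rev A")
drawing.add_version("drawing rev A")
spec.add_version("spec rev B")
link = Link(spec, drawing, source_version=2, target_version=1)
```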
In this paper, an open architecture and its implementation for product data management are presented. The system architecture, the product data definition model, and a set of components of the architecture are discussed in detail. In particular, the principles and some mechanisms of one of the components, the object-oriented database management system GH-EODB, are discussed in greater detail. The architecture is extensible and adaptable.
Digital broadcasting is a novel paradigm for next-generation broadcasting. Its goal is to provide not only better picture quality but also a variety of services that are impossible in traditional over-the-air broadcasting. One important factor in this new broadcasting environment is interoperability among broadcasting applications, since the environment is distributed. Broadcasting metadata therefore becomes increasingly important, and one of the metadata standards for digital broadcasting is TV-Anytime metadata. TV-Anytime metadata is defined using XML Schema, so its instances are XML data. To achieve interoperability, a standard query language is also required, and XQuery is a natural choice. There has been some research on handling broadcasting metadata; in our previous study, we proposed a method for efficiently managing broadcasting metadata on the service-provider side. However, the environment of a Set-Top Box for digital broadcasting is constrained, with low-cost, low-specification hardware, so general metadata-management approaches cannot be applied to the Set-Top Box without modification. This paper proposes a method for efficiently managing broadcasting metadata on the Set-Top Box, together with a prototype metadata management system for evaluating the method. The system consists of a storage engine that stores the metadata and an XQuery engine that searches the stored metadata, and it uses a special index for storing and searching. The two engines are designed independently of the hardware platform, so they can be used in any low-cost application that manages broadcasting metadata.
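To make the storage-plus-query split concrete, here is a small Python sketch that builds a toy keyword index over simplified TV-Anytime-style programme metadata and answers a search; the element names and index structure are simplified assumptions, not the actual TV-Anytime schema or the paper's special index.

```python
# Toy example: index simplified programme metadata by title keyword and look
# programmes up by keyword, in the spirit of a storage engine plus a query
# engine sharing an index. The XML layout below is an invented simplification.
import xml.etree.ElementTree as ET
from collections import defaultdict

SAMPLE = """
<ProgramInformationTable>
  <ProgramInformation programId="crid://example/1">
    <Title>Evening News</Title><Genre>News</Genre>
  </ProgramInformation>
  <ProgramInformation programId="crid://example/2">
    <Title>Morning News Briefing</Title><Genre>News</Genre>
  </ProgramInformation>
</ProgramInformationTable>
"""

def build_index(xml_text):
    """Map lower-cased title words to programme ids (a toy inverted index)."""
    index = defaultdict(set)
    for prog in ET.fromstring(xml_text).iter("ProgramInformation"):
        pid = prog.get("programId")
        for word in prog.findtext("Title", "").lower().split():
            index[word].add(pid)
    return index

index = build_index(SAMPLE)
print(sorted(index["news"]))   # both programmes match the keyword "news"
```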
A computerized platform for multi-channel physiological signals has been developed in our lab to greatly improve the recording and review of the output of the polygraph system. The platform mainly consists of a Pentium III PC and a high-speed A/D converter, and is supported by Visual Basic 6.0 and Microsoft Access 2000. The platform provides powerful functions for data acquisition, real-time waveform display, and review. It has proved its reliability and flexibility in practical animal experiments. In addition, its modular program design provides interfaces for further data processing and analysis.
The next generation of high-power lasers enables repetition of experiments at orders of magnitude higher frequency than what was possible using the prior generation. Facilities requiring human intervention between laser repetitions need to adapt in order to keep pace with the new laser technology. A distributed networked control system can enable laboratory-wide automation and feedback control loops. These higher-repetition-rate experiments will create enormous quantities of data. A consistent approach to managing data can increase data accessibility, reduce repetitive data-software development and mitigate poorly organized metadata. An opportunity arises to share knowledge of improvements to control and data infrastructure currently being undertaken. We compare platforms and approaches to state-of-the-art control systems and data management at high-power laser facilities, and we illustrate these topics with case studies from our community.
Forensic investigations, especially those related to missing persons and unidentified remains, produce different types of data that must be managed and understood. The data collected and produced are extensive and originate from various sources: the police, non-governmental organizations (NGOs), medical examiner offices, specialised forensic teams, family members, and others. Some examples of information include, but are not limited to, the investigative background information, excavation data of burial sites, antemortem data on missing persons, and postmortem data on the remains of unidentified individuals. These complex data must be stored in a secured place, analysed, compared, shared, and then reported to the investigative actors and the public, especially the families of missing persons, who should be kept informed of the investigation. Therefore, a data management system with the capability of performing the tasks relevant to the goals of the investigation and the identification of an individual, while respecting the deceased and their families, is critical for standardising investigations. Data management is crucial to assure the quality of investigative processes, and it must be recognised as a holistic integrated system. The aim of this article is to discuss some of the most important components of an effective forensic data management system. The discussion is enriched by examples, challenges, and lessons learned from the erratic development and launching of databases for missing and unidentified persons in Brazil. The main objective of this article is to bring attention to the urgent need for an effective and integrated system in Brazil.
Using spatial data integration and database technology, and by analyzing and integrating the assessment results for all development zones in Hunan Province at different times, this paper constructs a database and management system for the assessment results of land-use intensity in development zones. This forms "one map" of Hunan development zones and realizes integrated management and application of the assessment results for all provincial-level and higher development zones in Hunan at any point in time. Practice has shown that the system works well and has promising prospects in land management for land administration departments and development zones.
Management of poultry farms in China mostly relies on manual labor. A large amount of valuable production data is either recorded incompletely or kept only as paper documents, making data retrieval, processing, and analysis very difficult. An integrated cloud-based data management system (CDMS) was proposed in this study, in which asynchronous data transmission, a distributed file system, and wireless network technology were used for information collection, management, and sharing in large-scale egg production. The cloud-based platform can provide information technology infrastructure for different farms, and the CDMS can allocate computing resources and storage space on demand. Real-time data acquisition software was developed that allows farm management staff to submit reports through a website or smartphone, enabling digitization of production data. The use of asynchronous transfer in the system avoids potential data loss during transmission between the farms and the remote cloud data center. All valid historical data of the poultry farms can be stored in the remote cloud data center, which eliminates the need for large server clusters on the farms. Users with proper identification can access the online data portal of the system through a browser or an app from anywhere in the world.
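The store-and-forward idea behind asynchronous transfer can be sketched briefly: a report is persisted locally first and removed only once the cloud endpoint acknowledges it, so a dropped connection cannot lose data. The endpoint URL and record fields in this Python sketch are invented for illustration and do not describe the CDMS implementation.

```python
# Sketch of store-and-forward upload: reports are appended to a local outbox
# before any network call, and a line is discarded only after the (hypothetical)
# cloud endpoint accepts it, so transmission failures leave the data queued.
import json, os, urllib.request

OUTBOX = "outbox.jsonl"
ENDPOINT = "https://cloud.example.com/api/reports"   # hypothetical endpoint

def queue_report(record: dict) -> None:
    """Persist the report locally before attempting any upload."""
    with open(OUTBOX, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def flush_outbox() -> None:
    """Upload queued reports; keep whatever the server has not acknowledged."""
    if not os.path.exists(OUTBOX):
        return
    with open(OUTBOX, encoding="utf-8") as f:
        pending = [line for line in f if line.strip()]
    remaining = []
    for line in pending:
        req = urllib.request.Request(ENDPOINT, data=line.encode("utf-8"),
                                     headers={"Content-Type": "application/json"})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                if resp.status != 200:
                    remaining.append(line)    # server refused: retry later
        except OSError:
            remaining.append(line)            # network down: keep everything left
    with open(OUTBOX, "w", encoding="utf-8") as f:
        f.writelines(remaining)

queue_report({"farm": "farm-01", "date": "2020-01-01", "eggs": 12500})
flush_outbox()
```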
Schema incompatibility is a major challenge for a federated database system that shares data among multiple heterogeneous and autonomous databases. This paper presents a mapping approach based on an import schema, an export schema, and domain conversion functions, through which schema incompatibility problems such as naming conflicts, domain incompatibility, and entity definition incompatibility can be resolved effectively. The implementation techniques are also discussed.
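A toy Python sketch of the import/export-schema mapping idea follows: a component database's export-schema record is translated into the federation's import schema by renaming conflicting attributes and applying domain conversion functions. The attribute names and the unit conversion are invented examples, not the paper's mappings.

```python
# Toy mapping from one component database's export schema into a federation's
# import schema: renames resolve naming conflicts, conversion functions
# resolve domain incompatibility. All names and units are illustrative.
def inches_to_mm(value: float) -> float:
    return value * 25.4

# federation attribute -> (attribute in this database's export schema, converter)
SUPPLIER_A_MAPPING = {
    "part_id":   ("partNo",    str),            # naming conflict resolved by rename
    "length_mm": ("length_in", inches_to_mm),   # domain incompatibility resolved
}

def import_record(local_record: dict, mapping: dict) -> dict:
    """Translate one export-schema record into the federation's import schema."""
    return {fed_attr: convert(local_record[local_attr])
            for fed_attr, (local_attr, convert) in mapping.items()}

exported = {"partNo": "A-1042", "length_in": 3.5}
print(import_record(exported, SUPPLIER_A_MAPPING))
# {'part_id': 'A-1042', 'length_mm': 88.9}
```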
This article presents the design and implementation of a highly secure and reliable database system for a resident records management system using blockchain technology. Blockchain provides a highly secure and reliable data-access environment: several data fragments are packed into one block, and all blocks are connected to form a chain of blocks. In our prototype, each resident event, such as birth, relocation, or employment, is assigned to a data fragment, and a fixed number of fragments, say 20, are packed into each block. We also developed a web application interface so that users do not need to install any application on their PC or smartphone. The prototype development demonstrated that blockchain technology can be applied to a large-scale data management system with highly secure and reliable features.
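The packing-and-chaining scheme can be illustrated with a short Python sketch: events become fragments, fragments are grouped twenty to a block (the size mentioned in the abstract), and each block records the hash of its predecessor so any tampering breaks the chain. This is a generic illustration of the idea, not the authors' implementation.

```python
# Sketch of fragment packing and block chaining: fixed-size groups of event
# fragments become blocks, and each block carries the hash of the previous
# block, so altering any earlier fragment invalidates every later link.
import hashlib, json

FRAGMENTS_PER_BLOCK = 20

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def build_chain(fragments: list) -> list:
    chain, prev = [], "0" * 64                      # genesis predecessor
    for i in range(0, len(fragments), FRAGMENTS_PER_BLOCK):
        block = {"prev_hash": prev,
                 "fragments": fragments[i:i + FRAGMENTS_PER_BLOCK]}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain: list) -> bool:
    """Recompute each block's hash and compare it with the successor's link."""
    return all(later["prev_hash"] == block_hash(earlier)
               for earlier, later in zip(chain, chain[1:]))

events = [{"resident": f"r{i}", "event": "birth"} for i in range(45)]
chain = build_chain(events)                          # 45 fragments -> 3 blocks
assert verify(chain)
```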
Taking into account the current status of Antarctic data management and the characteristics of the polar science data resulting from the Chinese Antarctic and Arctic Research Expeditions, the Chinese Polar Science Database System (CPSDS) was designed and established in 2002. The infrastructure, technical standards, and data-sharing mechanism of this system are reviewed in this article, and the development of Chinese polar data management is summarized. As metadata is a powerful and useful tool for managing and disseminating scientific data, it is also used as the "search engine" of the CPSDS. Finally, trends in data management and sharing are discussed.
The product family (PF) is the most important part of a product platform. A new method is proposed for mining PFs from multi-space product data in a PLM database. The product structure tree (PST) and bill of materials (BOM) are used as the data sources, and a PF is obtained by mining the physical space, logic space, and attribute space of the product data. In this work, the PLM database is first described in terms of its data organization, data structure, and data characteristics. The PF mining method then introduces the sequence alignment techniques used in bio-informatics and mainly comprises data pre-processing, regularization, the mining algorithm, and cluster analysis. Finally, the feasibility and effectiveness of the proposed method are verified by a case study of high- and medium-pressure valves, demonstrating a feasible way to obtain PFs from a PLM database.
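The borrowed bio-informatics technique can be illustrated with a minimal Python sketch that computes a Needleman-Wunsch-style global alignment score between two products' BOM part-code sequences; a high score marks the pair as candidates for the same family. The scoring values and part codes are illustrative assumptions, not the paper's parameters.

```python
# Global alignment score between two sequences of part codes (Needleman-Wunsch
# dynamic programming). Pairs of products with high scores can then be grouped
# by cluster analysis into candidate product families.
def alignment_score(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        score[i][0] = i * gap                      # gaps along the first column
    for j in range(1, cols):
        score[0][j] = j * gap                      # gaps along the first row
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]

valve_a = ["body", "bonnet", "stem", "disc", "seat"]
valve_b = ["body", "bonnet", "stem", "ball", "seat"]
print(alignment_score(valve_a, valve_b))   # high score -> candidates for one family
```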
Widely used in clinical research, the database is a new type of data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. We then present the workflow for establishing databases, inputting data, verifying data, and managing databases. By discussing the application of databases in clinical research, we illustrate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advances in databases for clinical research.
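One common form of data verification in clinical data management is double data entry, where two independently keyed copies of the same form are compared field by field; the short Python sketch below illustrates that check with invented field names, as an example of the verification step rather than a method from the article.

```python
# Double-data-entry check: compare two independently keyed copies of the same
# case report form and list every field on which they disagree, so the
# discrepancies can be resolved against the source documents.
def compare_entries(first: dict, second: dict) -> list:
    """Return (field, value_1, value_2) for every field the two entries disagree on."""
    discrepancies = []
    for field in sorted(set(first) | set(second)):
        v1, v2 = first.get(field), second.get(field)
        if v1 != v2:
            discrepancies.append((field, v1, v2))
    return discrepancies

entry_1 = {"subject_id": "001", "sbp": 128, "dbp": 84, "visit": "baseline"}
entry_2 = {"subject_id": "001", "sbp": 128, "dbp": 48, "visit": "baseline"}
print(compare_entries(entry_1, entry_2))   # [('dbp', 84, 48)] flagged for review
```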