Abstract: Most semi-structured data exhibit a certain degree of structural regularity. Once stored as structured data in a relational database (RDB), they can be managed effectively by a database management system (DBMS). Some semi-structured data, however, are difficult to transform because of their irregular structures. We design an efficient algorithm and data structure for ensuring lossless transformation. We put forward an approach to schema extraction through data mining, in which different kinds of elements are transformed separately and a lossless mapping from semi-structured data to structured data can be achieved.
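As a rough illustration of such a lossless mapping (a sketch, not the authors' algorithm), the following Python fragment shreds a nested record into generic rows that keep each element's parent and sibling order, so the original nesting can be rebuilt; the row layout and field names are assumptions.

```python
# Hypothetical sketch: losslessly flatten semi-structured (nested) data into
# relational-style rows (node_id, parent_id, ordinal, key, value) so that the
# original nesting can be reconstructed. Not the paper's algorithm.

def shred(obj, rows=None, parent=None, ordinal=0, key=None):
    """Depth-first traversal that emits one row per element."""
    if rows is None:
        rows = []
    node_id = len(rows)
    is_leaf = not isinstance(obj, (dict, list))
    rows.append({
        "node_id": node_id,
        "parent_id": parent,
        "ordinal": ordinal,          # preserves sibling order
        "key": key,                  # tag/attribute name, None for list items
        "value": obj if is_leaf else None,
    })
    children = obj.items() if isinstance(obj, dict) else (
        enumerate(obj) if isinstance(obj, list) else [])
    for i, (k, child) in enumerate(children):
        shred(child, rows, parent=node_id, ordinal=i,
              key=k if isinstance(obj, dict) else None)
    return rows

# Example: an irregular record that a fixed relational schema would not fit.
record = {"paper": {"title": "Schema extraction", "authors": ["A", "B"]}}
for row in shred(record):
    print(row)
```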
Funding: Supported by the Ocean University of China Research Initiation Grant and the National 908 Project "Marine Information Exchange and Integration Technology Based on XML" (No. 908-03-01-07).
Abstract: In order to archive, quality-control, and disseminate a large variety of marine data in a marine data exchange platform, a marine XML has been developed to encapsulate marine data, providing an efficient means to store, transfer, and display them. This paper first presents the details of the main marine XML elements and then gives an example showing how to transform CTD-observed data into the Marine XML format, illustrating the XML encapsulation process for marine observational data.
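To make the encapsulation step concrete, here is a minimal Python sketch that wraps one CTD cast in an XML envelope; the element names (MarineData, CTDCast, Record, and so on) are illustrative assumptions, not the actual Marine XML schema.

```python
# Illustrative only: wrap CTD observations (depth, temperature, salinity)
# in a simple XML envelope. Element and attribute names are assumed.
import xml.etree.ElementTree as ET

cast = {"station": "ST01", "time": "2004-06-01T08:00:00Z",
        "records": [(0.0, 18.2, 31.5), (10.0, 17.6, 31.8), (20.0, 16.9, 32.0)]}

root = ET.Element("MarineData")
ctd = ET.SubElement(root, "CTDCast", station=cast["station"], time=cast["time"])
for depth, temp, sal in cast["records"]:
    rec = ET.SubElement(ctd, "Record")
    ET.SubElement(rec, "Depth", unit="m").text = str(depth)
    ET.SubElement(rec, "Temperature", unit="degC").text = str(temp)
    ET.SubElement(rec, "Salinity", unit="PSU").text = str(sal)

print(ET.tostring(root, encoding="unicode"))
```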
Funding: Supported by the National Natural Science Foundation of China (60373072).
Abstract: In existing web services-based workflows, data exchange across the web services is centralized: the workflow engine mediates at each step of the application sequence. However, many grid applications, especially data-intensive scientific applications, require exchanging large amounts of data across grid services, and having a central workflow engine relay the data between the services would become a bottleneck in these cases. This paper proposes a data exchange model for individual grid workflows and for multi-workflow composition, respectively. The model enables direct communication of large amounts of data between two grid services. To enable data exchange among multiple workflows, a bridge data service is used.
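The core idea, letting the engine pass only a lightweight reference while the services move the bulk data directly, can be sketched as follows; all class and method names are hypothetical stand-ins for grid services.

```python
# Hypothetical sketch of engine-mediated control flow with direct data flow:
# the engine passes only a lightweight reference, and the consumer pulls the
# bulk data straight from the producer (or a bridge data service).
from dataclasses import dataclass

@dataclass
class DataRef:
    """Lightweight handle exchanged via the workflow engine."""
    service: "ProducerService"
    dataset_id: str

class ProducerService:
    def __init__(self):
        self._store = {}
    def produce(self) -> DataRef:
        self._store["d1"] = bytes(10_000_000)   # large payload stays local
        return DataRef(self, "d1")
    def fetch(self, dataset_id: str) -> bytes:  # direct service-to-service pull
        return self._store[dataset_id]

class ConsumerService:
    def consume(self, ref: DataRef) -> int:
        data = ref.service.fetch(ref.dataset_id)  # bypasses the engine
        return len(data)

class WorkflowEngine:
    """Coordinates the steps but never touches the bulk data itself."""
    def run(self):
        ref = ProducerService().produce()
        return ConsumerService().consume(ref)

print(WorkflowEngine().run())  # 10000000
```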
Abstract: A semi-structured data extraction method is proposed to obtain the useful information embedded in a group of relevant web pages and store it with the OEM (Object Exchange Model). A data mining method is then adopted to discover the schema knowledge implicit in the semi-structured data. This knowledge helps users understand the information structure of the web more deeply and thoroughly, and it also provides an effective schema for querying web information.
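A toy sketch of the idea, assuming a made-up tuple layout for OEM objects: extracted objects are stored as labeled parent/child records, and counting root-to-leaf label paths yields a crude discovered schema (this is not the paper's mining method).

```python
# Toy sketch: OEM-like objects as (object_id, label, parent_id, value) tuples,
# with a frequency count over root-to-leaf label paths as a crude
# "discovered schema". Illustrative only.
from collections import Counter

# value is None for complex (non-atomic) objects
objects = [
    (1, "page",    None, None),
    (2, "product", 1,    None),
    (3, "name",    2,    "Laptop"),
    (4, "price",   2,    "999"),
    (5, "product", 1,    None),
    (6, "name",    5,    "Phone"),
]

parents = {oid: (label, pid) for oid, label, pid, _ in objects}

def label_path(oid):
    parts = []
    while oid is not None:
        label, pid = parents[oid]
        parts.append(label)
        oid = pid
    return ".".join(reversed(parts))

schema = Counter(label_path(oid) for oid, _, _, value in objects if value is not None)
print(schema)   # e.g. Counter({'page.product.name': 2, 'page.product.price': 1})
```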
Funding: Supported by the Commission of the Basic Research Science and Technology for National Defence (No. B192001C001).
Abstract: In this paper, in order to implement the sharing and exchange of ship product data, a new kind of global function model is established. By researching the development and application trends of ship STEP (Standard for the Exchange of Product Model Data) standards, the AIM (application interpreted model) of AP216 is developed and improved as an example, in view of the characteristics and engineering practice of the ship industry in our country. Under the global function model, data exchange interfaces based on STEP are formed in ship CAD/CAM by all function modules and shared databases. The sharing and exchange of all information and data among different computer application systems are thereby addressed across the design, manufacture, and whole life cycle of ship products. This research lays a foundation for the informatization of the ship industry.
Funding: Supported by the National Natural Science Foundation of China (U1534201), the open project of the Science and Technology on Communication Networks Laboratory, and the National Key Research and Development Program of China (2016QY01W0200).
Abstract: The high-pressure hydrogenation heat exchanger is an important piece of equipment in a refinery, but it is exposed to the problem of leakage caused by ammonium salt corrosion. It is therefore very important to evaluate the operating status of the hydrogenation heat exchanger. To improve on the traditional method for evaluating the operating status of hydrogenation heat exchangers, this paper proposes a new evaluation method based on big data. To address the noisy data common in industry, an automated noisy interval detection algorithm is proposed. To deal with the problem that the sensor parameters are voluminous and include unrelated dimensions, a key parameter detection algorithm based on the Pearson correlation coefficient is proposed. Finally, this paper presents a PCA (Principal Component Analysis)-based health scoring algorithm for the overall system to assist site operators in assessing the health of hydrogenation heat exchangers. Evaluating the operating status of the hydrorefining heat exchange equipment with big data technology will help operators grasp the status of the industrial system more accurately and has positive guiding significance for early failure warning.
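A minimal numerical sketch of the last two steps, with made-up data and thresholds rather than the paper's: Pearson correlation against a target variable selects key parameters, and a PCA reconstruction error serves as a simple health score.

```python
# Hedged sketch: Pearson-based key-parameter selection and a simple
# PCA reconstruction-error "health score". Data and thresholds are made up.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                    # 500 samples, 8 sensor channels
target = 0.7 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.2, size=500)  # made-up condition indicator

# 1) Key parameters: channels whose |Pearson r| with the target is large.
r = np.array([np.corrcoef(X[:, j], target)[0, 1] for j in range(X.shape[1])])
key = np.where(np.abs(r) > 0.3)[0]

# 2) PCA on the key parameters; score = reconstruction error of a sample.
Xk = X[:, key]
mu = Xk.mean(axis=0)
U, S, Vt = np.linalg.svd(Xk - mu, full_matrices=False)
W = Vt[:1]                                       # keep 1 principal component here

def health_score(sample):
    """Lower reconstruction error -> closer to normal operating behaviour."""
    z = (sample - mu) @ W.T
    recon = z @ W + mu
    return float(np.linalg.norm(sample - recon))

print("key channels:", key, "score:", round(health_score(Xk[0]), 3))
```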
Funding: Supported by the National Key R&D Program of China (Grant No. 2019YFC1709803) and the National Natural Science Foundation of China (Grant No. 81873183).
Abstract: Objectives: The aim of this study was to investigate and develop a data storage and exchange format for the process of automatic systematic reviews (ASR) of traditional Chinese medicine (TCM). Methods: A lightweight and commonly used data format, namely JavaScript Object Notation (JSON), was introduced in this study. We designed a fully described data structure to collect TCM clinical trial information based on the JSON syntax. Results: A smart and powerful data format, JSON-ASR, was developed. JSON-ASR uses a plain-text data format in the form of key/value pairs and consists of six sections and more than 80 preset pairs. JSON-ASR adopts extensible structured arrays to support multi-group and multi-outcome situations. Conclusion: JSON-ASR is lightweight, flexible, and highly scalable, making it suitable for the complex data of clinical evidence.
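The fragment below is only a guess at what such a record might look like; the section and key names are invented for illustration and are not the JSON-ASR specification, but it shows how extensible arrays can carry multiple groups and outcomes.

```python
# Illustrative stand-in for a JSON-ASR-style record; all keys are hypothetical.
import json

trial = {
    "study_info":   {"id": "TCM-2020-001", "design": "RCT"},
    "participants": {"sample_size": 120},
    "interventions": [                      # extensible array: multiple groups
        {"group": "treatment", "herbal_formula": "Formula A", "n": 60},
        {"group": "control",   "intervention": "placebo",     "n": 60},
    ],
    "outcomes": [                           # extensible array: multiple outcomes
        {"name": "effective_rate", "type": "dichotomous",
         "events": [48, 31], "totals": [60, 60]},
        {"name": "symptom_score",  "type": "continuous",
         "means": [2.1, 3.4], "sds": [0.8, 0.9], "totals": [60, 60]},
    ],
    "risk_of_bias": {"random_sequence": "low"},
    "provenance":   {"source": "example only"},
}

print(json.dumps(trial, indent=2)[:300])
```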
Abstract: On November 4th, AQSIQ (General Administration of Quality Supervision, Inspection and Quarantine of the People's Republic of China), SAC (Standardization Administration of China), the National Audit Office of China (CNAO), and the Ministry of Finance of China jointly held a press conference in Beijing on the national standard Information Technology--Data Interface of Accounting Software (GB/T 19581-2004). The standard was approved and issued on September 20, 2004 by AQSIQ and SAC, and it would come into effect nationwide on January 1st, 2005. Pu Changcheng, Vice Director of AQSIQ, Shi Aizhong, Vice Director of CNAO, Li Zhonghai, a member of the Party Group of AQSIQ and Director of SAC, and other leaders of the departments concerned, such as the Ministry of Finance and the National Telegraphy Office, attended the press conference and made speeches. They fully affirmed the important significance of, and the achievements in, the standardization of electronic government business, and they also set out new demands for future work.
Abstract: 1.1. Development of international data exchange standards in the securities field. The securities market involves a large number of participants, such as investors, securities companies, exchanges, clearing corporations, and so on. Business among the participants is completed via data exchange. Therefore, data exchange protocols serve as an important factor in determining and promoting the safe and rapid development of the securities market.
Abstract: Contemporary mainstream big data governance platforms are built atop big data ecosystem components, offering a one-stop development, analysis, and governance platform for the collection, transmission, storage, cleansing, transformation, querying and analysis, data development, publishing and subscription, sharing and exchange, management, and servicing of massive data. These platforms serve various roles with internal and external data needs. However, in the era of big data, the rapid update and iteration of big data technologies, the diversification of data businesses, and the exponential growth of data present more challenges and uncertainties for the construction of big data governance platforms. This paper discusses how to effectively build a data governance platform under the big data system from the perspectives of functional architecture, logical architecture, data architecture, and functional design.
Funding: Supported by the National High Technology Research and Development Program of China (863 Program) (No. AA420060).
Abstract: In network-supported collaborative design, data processing plays a vital role. Much effort has been spent in this area, and many kinds of approaches have been proposed. Building on the relevant work, this paper presents an extensible markup language (XML)-based strategy for several important data processing problems in network-supported collaborative design, such as representing STEP (Standard for the Exchange of Product Model Data) with XML for product information expression and managing XML documents with a relational database. The paper gives a detailed exposition of how to establish the mapping between the XML structure and the relational database structure and how XML-QL queries can be translated into structured query language (SQL) queries. Finally, the structure of an XML-based data processing system is presented.
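A toy sketch of both mappings, using an assumed edge-table layout: XML elements are shredded into a relational table, and a simple path query (standing in for an XML-QL pattern) is rewritten as SQL self-joins.

```python
# Toy sketch: store XML in a relational edge table and answer a path query
# with SQL self-joins. Table and column names are assumptions for illustration.
import sqlite3
import xml.etree.ElementTree as ET
from itertools import count

doc = ET.fromstring(
    "<parts><part><name>bolt</name><price>0.1</price></part>"
    "<part><name>nut</name><price>0.05</price></part></parts>")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node(id INTEGER PRIMARY KEY, parent INTEGER, tag TEXT, text TEXT)")

_ids = count(1)

def load(elem, parent=None):
    """Shred one element into a row and recurse over its children."""
    nid = next(_ids)
    conn.execute("INSERT INTO node VALUES (?, ?, ?, ?)",
                 (nid, parent, elem.tag, (elem.text or "").strip()))
    for child in elem:
        load(child, nid)

load(doc)

# Path query /parts/part/name, roughly what an XML-QL pattern would ask for,
# rewritten as self-joins over the edge table:
sql = """
SELECT n3.text FROM node n1
JOIN node n2 ON n2.parent = n1.id AND n2.tag = 'part'
JOIN node n3 ON n3.parent = n2.id AND n3.tag = 'name'
WHERE n1.tag = 'parts'
"""
print([row[0] for row in conn.execute(sql)])   # ['bolt', 'nut']
```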
Funding: Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2022R1I1A3063257), and by an Electronics and Telecommunications Research Institute (ETRI) grant funded by the Korean Government [22ZR1300, Research on Intelligent Cyber Security and Trust Infra].
Abstract: These days, data is regarded as a valuable asset in the era of the data economy, which demands a trading platform for buying and selling data. However, online data trading poses challenges in terms of security and fairness because the seller and the buyer may not fully trust each other. Therefore, in this paper, a blockchain-based secure and fair data trading system is proposed by taking advantage of smart contracts and matchmaking encryption. The proposed system enables bilateral authorization, where data trading between a seller and a buyer is accomplished only if the policies they require of each other are satisfied simultaneously. This is achieved by exploiting the security features of matchmaking encryption. To guarantee non-repudiation and fairness between trading parties, the proposed system leverages a smart contract to ensure that the parties honestly carry out the data trading protocol. However, for the efficiency of on-chain processes, the smart contract in the proposed system does not include complex cryptographic operations. Instead, these operations are carried out by off-chain parties and their results are used as input for the on-chain procedure. The system also uses an arbitration protocol to resolve disputes based on the trading proof recorded on the blockchain. The performance of the protocol is evaluated in terms of off-chain computation overhead and on-chain gas consumption. The experimental results demonstrate that the proposed protocols enable the implementation of a cost-effective data trading system.
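To illustrate the off-chain/on-chain split, here is a heavily simplified Python sketch: matchmaking encryption, signatures, and payment are abstracted away, the "smart contract" is just an in-memory ledger, and all names are invented; it only shows where policy matching, proof generation, and on-chain recording would sit.

```python
# Greatly simplified sketch of the off-chain/on-chain division of work.
# Matchmaking encryption, signatures, and payments are abstracted; the
# "smart contract" is an in-memory ledger. All names are hypothetical.
import hashlib, json

def matches(seller_policy, buyer_attrs, buyer_policy, seller_attrs):
    """Stand-in for bilateral authorization: trade proceeds only if both
    parties' policies are satisfied simultaneously."""
    return seller_policy.issubset(buyer_attrs) and buyer_policy.issubset(seller_attrs)

def off_chain_trade(data: bytes, seller_policy, buyer_attrs, buyer_policy, seller_attrs):
    """Heavy checks and data delivery stay off-chain; only a proof is returned."""
    if not matches(seller_policy, buyer_attrs, buyer_policy, seller_attrs):
        raise PermissionError("policies not mutually satisfied")
    return {"data_hash": hashlib.sha256(data).hexdigest(), "status": "delivered"}

ledger = {}   # toy stand-in for smart-contract state on the blockchain

def on_chain_record(trade_id: str, proof: dict):
    """Cheap on-chain step: store only a digest of the trading proof."""
    ledger[trade_id] = hashlib.sha256(
        json.dumps(proof, sort_keys=True).encode()).hexdigest()

proof = off_chain_trade(b"dataset bytes",
                        {"verified_buyer"}, {"verified_buyer"},
                        {"licensed_seller"}, {"licensed_seller"})
on_chain_record("trade-001", proof)
print(ledger)
```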
Funding: This work was supported by the National Natural Science Foundation of China (No. 61802214), the Natural Science Foundation of Shandong Province (Nos. ZR2019BF009, ZR2018LF007, ZR2017MF0, ZR2016YL011), the Shandong Provincial Key Research and Development Program of China (2018GGX101005, 2017CXGC0701, 2016GGX109001), the Project of the Shandong Province Higher Educational Science and Technology Program (No. J17KA049), and the Global Infrastructure Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (NRF-2018K1A3A1A20026485).
Abstract: Multi-factor authentication (MFA) was proposed by Pointcheval et al. [Pointcheval and Zimmer (2008)] to improve the security of single-factor (and two-factor) authentication. As the backbone of multi-factor authentication, biometric data are widely used. In particular, how to keep biometric data private at the password database without impairing efficiency is still an open question: exploiting vulnerabilities of encryption (or hash) algorithms, an attacker can still launch offline brute-force attacks on encrypted (or hashed) biometric data. To address the potential risk of biometric disclosure at the password database, in this paper we propose a novel efficient and secure MFA key exchange protocol (denoted MFAKE) that leverages the Pythia PRF service and a password-to-random (PTR) protocol. Armed with the PTR protocol, a master password pwd can be translated by the user into independent pseudorandom passwords (rwd) for each user account with the help of a device (e.g., a smartphone). Meanwhile, using the Pythia PRF service, the password database can avoid leakage of the local user's password and biometric data. This is the first paper to achieve the password and biometric hardening services simultaneously using the PTR protocol and Pythia PRF.
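The sketch below only illustrates the flow that the PTR step provides, deriving independent-looking per-account passwords rwd from one master password pwd. It uses plain HMAC in place of the Pythia oblivious PRF, so unlike the real protocol the PRF inputs are not blinded, and every name in it is an assumption.

```python
# Simplified, NON-oblivious illustration of password-to-random derivation:
# rwd = PRF(master_password, account). In the real MFAKE construction the PRF
# is evaluated via the Pythia service on blinded inputs so that neither the
# device nor the service learns the master password; HMAC here is only a
# stand-in to show the data flow.
import hmac, hashlib, base64

def derive_rwd(master_password: str, account: str, device_key: bytes) -> str:
    # device_key models the device-held secret that makes each user's mapping distinct
    k = hmac.new(device_key, master_password.encode(), hashlib.sha256).digest()
    rwd = hmac.new(k, account.encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(rwd).decode()[:24]

device_key = b"per-device secret"
pwd = "correct horse battery staple"
print(derive_rwd(pwd, "mail.example.com", device_key))
print(derive_rwd(pwd, "bank.example.com", device_key))   # independent-looking value
```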