As the risks associated with air turbulence are intensified by climate change and the growth of the aviation industry, it has become imperative to monitor and mitigate these threats to ensure civil aviation safety. The eddy dissipation rate (EDR) has been established as the standard metric for quantifying turbulence in civil aviation. This study explores a universally applicable symbolic classification approach based on genetic programming to detect turbulence anomalies using quick access recorder (QAR) data, treating the detection of atmospheric turbulence as an anomaly detection problem. Comparative evaluations demonstrate that this approach performs on par with direct EDR calculation methods in identifying turbulence events. Moreover, comparisons with alternative machine learning techniques indicate that the proposed technique is the strongest of the methods evaluated. In summary, symbolic classification via genetic programming enables accurate turbulence detection from QAR data, comparable to that of established EDR approaches and surpassing that achieved with the machine learning algorithms tested. This finding highlights the potential of integrating symbolic classifiers into turbulence monitoring systems to enhance civil aviation safety amidst rising environmental and operational hazards.
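The core of the symbolic classification idea above is that a candidate classifier is just an expression tree over flight features, and genetic programming searches the space of such trees. The following minimal sketch (the feature names, candidate expression, and threshold are all invented for illustration, not the paper's evolved model) shows only how one candidate tree is evaluated and thresholded into a turbulence flag:

```python
# Hedged sketch: evaluating one symbolic-classifier candidate over QAR-like
# features. All names below are hypothetical illustrations.

def evaluate(node, features):
    """Recursively evaluate an expression tree given a feature dict."""
    if isinstance(node, (int, float)):
        return node
    if isinstance(node, str):                       # terminal: feature lookup
        return features[node]
    op, *args = node                                # internal node: operator
    vals = [evaluate(a, features) for a in args]
    if op == "add": return vals[0] + vals[1]
    if op == "sub": return vals[0] - vals[1]
    if op == "mul": return vals[0] * vals[1]
    if op == "abs": return abs(vals[0])
    raise ValueError(f"unknown operator {op}")

def classify(tree, features, threshold=0.0):
    """Flag a sample as turbulence when the tree output exceeds the threshold."""
    return evaluate(tree, features) > threshold

# Hypothetical candidate: |vertical_accel| - 0.3 * airspeed_variation
tree = ("sub", ("abs", "vert_accel"), ("mul", 0.3, "airspeed_var"))
calm  = {"vert_accel": 0.05, "airspeed_var": 1.0}
rough = {"vert_accel": 1.20, "airspeed_var": 1.0}
print(classify(tree, calm), classify(tree, rough))  # False True
```

A genetic-programming loop would mutate and recombine such trees, scoring each one against labeled QAR segments.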
This article studies the fault recorder in power systems and introduces the Comtrade format. It uses C++ programming to read recorded fault data and adopts Fourier analysis and the symmetrical component method to filter the signals and extract the fundamental waves. Finally, the effectiveness of the data processing method introduced in this paper is verified with CAAP software.
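The two signal-processing steps the abstract names can be sketched compactly (in Python rather than the article's C++, with invented sample data): a single-bin DFT extracts the fundamental-frequency phasor from one cycle of samples, and the symmetrical-component transform splits three phase phasors into zero-, positive- and negative-sequence components.

```python
import cmath
import math

def fundamental_phasor(samples):
    """Fundamental-frequency phasor from one full cycle of N samples."""
    n = len(samples)
    acc = sum(s * cmath.exp(-2j * math.pi * k / n) for k, s in enumerate(samples))
    return 2 * acc / n   # peak-amplitude convention

def symmetrical_components(pa, pb, pc):
    """Zero, positive and negative sequence components of phasors A, B, C."""
    a = cmath.exp(2j * math.pi / 3)            # 120-degree rotation operator
    zero = (pa + pb + pc) / 3
    pos  = (pa + a * pb + a * a * pc) / 3
    neg  = (pa + a * a * pb + a * pc) / 3
    return zero, pos, neg

# Balanced three-phase test set: only the positive sequence should survive.
n = 64
def wave(shift):
    return [math.cos(2 * math.pi * k / n + shift) for k in range(n)]

pa = fundamental_phasor(wave(0.0))
pb = fundamental_phasor(wave(-2 * math.pi / 3))
pc = fundamental_phasor(wave(+2 * math.pi / 3))
z, p, ng = symmetrical_components(pa, pb, pc)
print(round(abs(p), 3), round(abs(z), 3), round(abs(ng), 3))  # 1.0 0.0 0.0
```

On recorded fault data, an unbalance such as a single-phase-to-ground fault would show up as nonzero zero- and negative-sequence magnitudes.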
In order to improve the accuracy and completeness of mining data records from the web, the concepts of isomorphic page and directory page and three algorithms are proposed. An isomorphic web page is a set of web pages that have a uniform structure, differing only in their main information. A web page that contains many links to isomorphic web pages is called a directory page. Algorithm 1 finds directory pages within a website using an adjacent-link similarity analysis method: it first sorts the links, then counts the links in each directory; if the count exceeds a given threshold, it finds the similar sub-page links in the directory and outputs the results. A judgment function for isomorphic web pages is also proposed. Algorithm 2 mines data records from an isomorphic page using a noise-information filter, based on the fact that the noise information is identical across two isomorphic pages and only the main information differs. Algorithm 3 mines data records from an entire website using spider technology. Experiments show that the proposed algorithms mine data records more completely than existing algorithms, and that mining data records from isomorphic pages is an efficient method.
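The noise-filtering idea behind Algorithm 2 can be sketched in a few lines (the page contents below are invented): since boilerplate coincides across two isomorphic pages, the main information is whatever does not appear in the other page.

```python
# Hedged sketch of the isomorphic-page noise filter; real pages would be
# compared at the DOM/template level, not as raw lines.

def extract_main_info(page_a, page_b):
    """Return the lines unique to each page; shared lines are treated as noise."""
    noise = set(page_a) & set(page_b)
    return ([line for line in page_a if line not in noise],
            [line for line in page_b if line not in noise])

page1 = ["<header>Site</header>", "Title: Widget A", "Price: 9.99", "<footer>c</footer>"]
page2 = ["<header>Site</header>", "Title: Widget B", "Price: 19.99", "<footer>c</footer>"]
main1, main2 = extract_main_info(page1, page2)
print(main1)  # ['Title: Widget A', 'Price: 9.99']
```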
Since the British National Archives put forward the concept of digital continuity in 2007, several developed countries have worked out digital continuity action plans. However, technologies for guaranteeing digital continuity are still lacking. This paper first analyzes the requirements of the digital continuity guarantee for electronic records based on data quality theory, and points out the necessity of a data quality guarantee for electronic records. We then reformulate the digital continuity guarantee of electronic records as ensuring their consistency, completeness and timeliness, and construct the first technology framework for the digital continuity guarantee of electronic records. Finally, temporal functional dependency technology is used to build the first integrated method to ensure the consistency, completeness and timeliness of electronic records.
Regional healthcare platforms collect clinical data from hospitals in specific areas for the purpose of healthcare management. It is a common requirement to reuse these data for clinical research. However, we face challenges such as inconsistent terminology in electronic health records (EHR) and the complexities of data quality and data formats on regional healthcare platforms. In this paper, we propose a methodology and process for constructing large-scale cohorts, which form the basis of causality and comparative-effectiveness relationships in epidemiology. We first constructed a Chinese terminology knowledge graph to deal with the diversity of vocabularies on the regional platform. Second, we built special disease case repositories (e.g., a heart failure repository) that use the graph to search for related patients and to normalize the data. Based on the requirements of a clinical study that aimed to explore the effect of statin use on 180-day readmission in patients with heart failure, we built a large-scale retrospective cohort of 29,647 heart failure cases from the heart failure repository. After propensity score matching, a study group (n=6,346) and a control group (n=6,346) with parallel clinical characteristics were acquired. Logistic regression analysis showed that taking statins was negatively correlated with 180-day readmission in heart failure patients. This paper presents the workflow and an application example of big data mining based on regional EHR data.
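The matching step that produced the equal-sized study and control groups is 1:1 propensity score matching. A minimal sketch of that step (the scores, identifiers, and caliper below are invented, not the paper's cohort): each treated case is greedily matched to the nearest unmatched control within a caliper on the propensity score.

```python
# Hedged sketch of greedy 1:1 nearest-neighbour propensity-score matching.

def match_one_to_one(treated, controls, caliper=0.05):
    """Pair each treated id with the closest unused control id within the caliper."""
    pairs, used = [], set()
    for tid, ps in sorted(treated.items(), key=lambda kv: kv[1]):
        best, best_d = None, caliper
        for cid, cps in controls.items():
            d = abs(ps - cps)
            if cid not in used and d <= best_d:
                best, best_d = cid, d
        if best is not None:
            used.add(best)
            pairs.append((tid, best))
    return pairs

treated  = {"t1": 0.30, "t2": 0.70}       # hypothetical propensity scores
controls = {"c1": 0.32, "c2": 0.69, "c3": 0.10}
print(match_one_to_one(treated, controls))  # [('t1', 'c1'), ('t2', 'c2')]
```

In practice the propensity scores themselves come from a logistic regression of treatment on baseline covariates; here they are given directly.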
With the rapid development of information technology, the electronification of medical records has gradually become a trend. In China, the population base is huge and medical institutions are numerous, a reality that drives the conversion of paper medical records to electronic medical records. Electronic medical records are the basis for establishing a smart hospital and an important guarantee for achieving medical intelligence, and the massive amount of electronic medical record data is also an important data set for research in the medical field. However, electronic medical records contain a large amount of private patient information, which must be desensitized before the records can be used as open resources. To solve these problems, this paper proposes data masking for Chinese electronic medical records with named entity recognition. First, the text is vectorized to satisfy the required format of the model input. Second, because input sentences vary in length and the relationship between sentences in context is not negligible, a neural network model for named entity recognition based on bidirectional long short-term memory (BiLSTM) with conditional random fields (CRF) is constructed. Finally, the data masking operation is performed based on the named entity recognition results, mainly using regular-expression filtering encryption and principal component analysis (PCA) word vector compression and replacement. In addition, comparison experiments with the hidden Markov model (HMM), LSTM-CRF, and BiLSTM models are conducted. The experimental results show that the proposed method achieves 92.72% Accuracy, 92.30% Recall, and 92.51% F1_score, higher than the other models.
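The regular-expression filtering part of the masking step can be sketched as follows (the patterns and placeholder tokens are illustrative assumptions, not the paper's rule set): identifiers matched by the patterns are replaced with fixed placeholders before records are released.

```python
import re

# Hedged sketch of regex-based masking; real rule sets would cover many more
# identifier types and validate formats (e.g. ID-number check digits).
PATTERNS = [
    (re.compile(r"\b\d{18}\b"), "[ID]"),       # toy pattern: 18-digit national ID
    (re.compile(r"\b1\d{10}\b"), "[PHONE]"),   # toy pattern: 11-digit mobile number
]

def mask(text):
    """Replace every matched identifier with its placeholder, in pattern order."""
    for pat, repl in PATTERNS:
        text = pat.sub(repl, text)
    return text

record = "Patient ID 110101199003074518, phone 13812345678, admitted 2023-01-02."
print(mask(record))
# Patient ID [ID], phone [PHONE], admitted 2023-01-02.
```

The NER model handles the entities regexes cannot reach (names, addresses); regex filtering catches the rigidly formatted identifiers first.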
In order to settle the problem of workflow data consistency in a distributed environment, an invalidation strategy based on a timely-updated record list is put forward. The strategy improves the classical invalidation strategy with a method for updating the record list and a recovery mechanism for update messages. When the request cycle of a replica is too long, the strategy updates the record list to pause the sending of update messages to it; when the long-cycle replica is requested again, the recovery mechanism resumes the update messages. This strategy not only ensures the consistency of workflow data but also reduces unnecessary network traffic. Theoretical comparison with common strategies shows that this strategy's unnecessary network traffic is lower and more stable, and simulation results validate this conclusion.
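The pause/resume mechanism can be sketched as follows (class name, cutoff, and timestamps are invented): replicas whose last request is older than a cutoff are dropped from the update targets so no invalidation messages are sent to them, and a new request reactivates them.

```python
# Hedged sketch of the record-list pause/resume idea; real timestamps would
# come from the clock, and messages from the workflow engine.

class UpdateList:
    def __init__(self, cutoff=60.0):
        self.cutoff = cutoff
        self.last_request = {}              # replica -> last request time

    def on_request(self, replica, now):
        self.last_request[replica] = now    # (re)activate the replica

    def targets(self, now):
        """Replicas that should still receive update messages."""
        return [r for r, t in self.last_request.items()
                if now - t <= self.cutoff]

ul = UpdateList(cutoff=60)
ul.on_request("siteA", now=0)
ul.on_request("siteB", now=0)
print(ul.targets(now=30))    # ['siteA', 'siteB']
print(ul.targets(now=100))   # []  -- both paused after a long idle cycle
ul.on_request("siteA", now=100)
print(ul.targets(now=101))   # ['siteA']  -- resumed on a new request
```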
The calibration of paleoclimate proxies is one of the key problems in the present study of paleoclimate. Historical documentary records of climate are suitable for calibrating the dating and the climatic implication of proxy data in a climatological sense. A test calibration correcting the Delingha tree-ring precipitation series against Chinese historical documentary records shows that among the 44 extreme dry cases in 1401–1950 AD, 42 cases (or 95.5%) are believable; thus the long Delingha tree-ring precipitation series is highly reliable. Another test, validating the monsoon-intensity proxy data based on the Zhanjiang Huguangyan sediments against historical records, indicates that the years of winter monsoon intensity designated by the Lake Maar Ti content series are entirely opposite to the years of harsh winters depicted in historical documents for 800–900 AD. As a result, serious doubt is raised about the climatic implication of this paleo-monsoon proxy series.
With the advancements of artificial intelligence, blockchain, cloud computing, and big data, there is a need for secure, decentralized medical record storage and retrieval systems. While cloud storage solves the storage problem, it is challenging to realize secure sharing of records over the network. The Medi-block record in the healthcare system has brought a new digitalization method for patients' medical records. This technology provides a symmetrical process between the hospital and doctors when patients urgently need to go to a different or nearby hospital. It makes electronic medical records available with the correct authentication and restricts access to medical data retrieval. The Medi-block record is a consumer-centered healthcare data system that brings reliable and transparent datasets for medical records. This study presents an extensive review of proposed solutions that aim to protect the privacy and integrity of medical data by securing data sharing for Medi-block records. It also provides a comprehensive investigation of recent advances in methods of securing data sharing, such as Blockchain technology, Access Control, Privacy-Preserving, Proxy Re-Encryption, and Service-On-Chain approaches. Finally, we highlight the open issues and identify the challenges regarding secure data sharing for Medi-block records in healthcare systems.
In data management system software, records of different lengths often need to be stored in an array, and the number of records typically grows as the software is used. A universal data structure is presented in this design, providing a unified interface for dynamically storing records of different lengths, so that developers can call the unified interface directly for data storage and thereby simplify the design of the data management system.
In the field of electronic record management, especially in the current big data environment, data continuity has become a new topic that is as important as security and needs to be studied. This paper decomposes the data continuity guarantee for electronic records into a set of data protection requirements consisting of data relevance, traceability and comprehensibility, and proposes using associated data technology to provide an integrated guarantee mechanism that meets these three requirements.
Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical reports ensures that all information is available in a single place. However, it is difficult to store and manage large amounts of data: dedicated servers and a data center are needed, and self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative, with the advantage that the data can be retrieved anywhere, anytime, from any device connected to the Internet. Doctors can therefore easily access a patient's medical history and diagnose diseases according to the context, which also helps them prescribe the correct medicine in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns, since such servers are often only semi-trusted; sensitive medical information must therefore be protected. Open access to records, and modifications performed on the information in those records, may even cause patient fatalities. Patient-centric health-record security is a major concern. End-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method that combines the Advanced Encryption Standard with the elliptic curve Diffie-Hellman method, designed to increase the efficiency of medical record security for users. Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic curve and secret-sharing methods. This study aims to provide a high level of security for patient health records.
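The hybrid pattern described above, where a Diffie-Hellman agreement yields a shared secret that is hashed into a symmetric key, can be illustrated with a deliberately toy sketch. This uses classic finite-field Diffie-Hellman (not elliptic-curve) over a small prime, with XOR standing in for AES; the parameters and keys are invented and NOT secure:

```python
import hashlib

# Toy parameters (assumptions for illustration only): P = 2^64 - 59 is prime.
P, G = 0xFFFFFFFFFFFFFFC5, 5

def shared_key(my_secret, their_public):
    """Hash the DH shared secret into a 32-byte symmetric key."""
    s = pow(their_public, my_secret, P)
    return hashlib.sha256(s.to_bytes(8, "big")).digest()

def xor_cipher(key, data):
    """Stand-in for AES in this sketch: XOR with the repeated key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

alice_sk, bob_sk = 123456789, 987654321
alice_pk, bob_pk = pow(G, alice_sk, P), pow(G, bob_sk, P)
k1 = shared_key(alice_sk, bob_pk)   # Alice's view of the shared key
k2 = shared_key(bob_sk, alice_pk)   # Bob's view; g^(ab) = g^(ba) mod P
ct = xor_cipher(k1, b"blood type O+, allergy: none")
print(k1 == k2, xor_cipher(k2, ct))
# True b'blood type O+, allergy: none'
```

A real deployment would use an elliptic-curve group and authenticated AES (e.g. AES-GCM) from a vetted cryptography library; the sketch only shows the key-agreement-then-encrypt structure.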
This paper presents a rule merging and simplifying method and an improved deviation analysis algorithm. Fuzzy equivalence theory avoids the rigid either/or judgment of traditional equivalence theory. During a data cleaning task, some rules stand in inclusion relations with each other: the equivalence degree of an included rule is smaller than that of the including rule, so a rule merging and simplifying method is introduced to reduce the total computing time. This inclusion relation also affects the deviation of the fuzzy equivalence degree, so an improved deviation analysis algorithm that omits the influence of the included rules' equivalence degrees is also presented. Normally, duplicate records are logged in a file and users have to check and verify them one by one, which is time-consuming; the proposed algorithm saves users' labor during duplicate-record checking. Finally, an experiment is presented that demonstrates the feasibility of the rules.
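The fuzzy-equivalence idea, a graded similarity score in place of a rigid equal/not-equal test, can be sketched as follows (the fields, weights, and threshold are invented, and `difflib` stands in for whatever similarity measure the rules actually use):

```python
from difflib import SequenceMatcher

# Hedged sketch of fuzzy duplicate-record detection.

def similarity(a, b):
    """Graded string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def fuzzy_equal(rec1, rec2, weights, threshold=0.85):
    """Two records are fuzzy-equivalent when the weighted field similarity
    reaches the threshold, instead of requiring exact equality."""
    score = sum(w * similarity(rec1[f], rec2[f]) for f, w in weights.items())
    return score >= threshold

weights = {"name": 0.6, "city": 0.4}      # hypothetical rule weights
r1 = {"name": "Zhang Wei",  "city": "Beijing"}
r2 = {"name": "Zhang  Wei", "city": "beijing"}   # near-duplicate
r3 = {"name": "Li Na",      "city": "Shanghai"}
print(fuzzy_equal(r1, r2, weights), fuzzy_equal(r1, r3, weights))  # True False
```

Rule merging would then discard an included rule whose field set is covered by a stronger rule, so each record pair is scored only once.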
Routinely collected health data play an increasingly important role in pharmacoepidemiologic research. The RECORD statement, an extension of the STROBE statement, is currently the main guideline for research using routinely collected health data, but it is not fully applicable to pharmacoepidemiologic studies. The RECORD-PE (RECORD for Pharmacoepidemiology) statement builds on the RECORD statement and extends it with 15 items specific to pharmacoepidemiologic research. This article provides a detailed interpretation of RECORD-PE together with corresponding examples, to help Chinese researchers better understand and apply the RECORD-PE statement.
On November 13, 2016, an MW7.8 earthquake struck Kaikoura in the South Island of New Zealand. By means of back-projection of array recordings, ASTF analysis of global seismic recordings, and joint inversion of global seismic data and co-seismic InSAR data, we investigated the complexity of the earthquake source. The results show that the 2016 MW7.8 Kaikoura earthquake ruptured for about 100 s unilaterally from south to northeast (~N28°–33°E), producing a rupture area about 160 km long and about 50 km wide and releasing a scalar moment of 1.01×10^21 N·m. In particular, the rupture area consisted of two slip asperities: one close to the initial rupture point with a maximal slip value of ~6.9 m, the other far away in the northeast with a maximal slip value of ~9.3 m. The first asperity slipped for about 65 s, and the second one started 40 s after the first had initiated; the two slipped simultaneously for about 25 s. Furthermore, the first had a nearly pure thrust slip while the second had both thrust and strike slip. Interestingly, the rupture velocity was not constant, and the whole process may be divided into 5 stages in which the velocities were estimated to be 1.4 km/s, 0 km/s, 2.1 km/s, 0 km/s and 1.1 km/s, respectively. The high-frequency sources were distributed nearly along the lower edge of the rupture area, the high-frequency radiation mainly occurred at the launching of the asperities, and it seems that no high-frequency energy was radiated when the rupture was about to stop.
Recent studies have demonstrated the importance of LUCC (land use and cover change) in climate and ecosystem simulation, but the results can only be determined precisely if a high-resolution underlying land cover map is used. While satellite-based efforts have provided a good baseline for present land cover, the next advancement in LUCC research requires reconstructions of historical LUCC, especially spatially explicit historical datasets. Unlike other similar studies, this study is based on the analysis of historical land-use patterns in the traditional cultivated region of China. Setting aside less important factors, altitude, slope and population patterns are selected as the major drivers of reclamation in ancient China and used to design the HCGM (Historical Cropland Gridding Model, at a 60 km×60 km resolution), an empirical model for allocating historical cropland inventory data spatially to grid cells in each political unit. We then use this model to reconstruct the cropland distribution of the study area in 1820 and verify the result against prefectural cropland data of 1820 drawn from historical documents. The statistical analysis shows that the model can efficiently simulate the patterns of cropland distribution in the traditional cultivated region in the historical period.
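The allocation step at the heart of a model like HCGM can be sketched simply (the cell weights and inventory total below are invented, and the real model derives its weights from altitude, slope and population rather than taking them as given): a political unit's recorded cropland total is distributed over its grid cells in proportion to a suitability weight.

```python
# Hedged sketch of proportional allocation of an inventory total to grid cells.

def allocate(total_cropland, cells):
    """Split an inventory total across cells proportional to their weights."""
    wsum = sum(c["weight"] for c in cells)
    return {c["id"]: total_cropland * c["weight"] / wsum for c in cells}

cells = [
    {"id": "g1", "weight": 0.9},   # low altitude, gentle slope, populated
    {"id": "g2", "weight": 0.6},
    {"id": "g3", "weight": 0.1},   # high, steep, sparsely populated
]
alloc = allocate(1600.0, cells)    # hypothetical unit total, e.g. 1600 km^2
print({k: round(v, 6) for k, v in alloc.items()})
# {'g1': 900.0, 'g2': 600.0, 'g3': 100.0}
```

Verification then compares such gridded totals, re-aggregated by prefecture, against the documentary prefectural cropland figures.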
Background: The impact of sleep disorders on active-duty soldiers' medical readiness has not previously been quantified. Patient data generated at military treatment facilities can be accessed to create research reports and thus can be used to estimate the prevalence of sleep disturbances and the role of sleep in the overall health of service members. The current study aimed to quantify sleep-related health issues and their impact on health and nondeployability through the analysis of U.S. military healthcare records from fiscal year 2018 (FY2018). Methods: Medical diagnosis information and deployability profiles (e-Profiles) were queried for all active-duty U.S. Army patients with a concurrent sleep disorder diagnosis receiving medical care within FY2018. Nondeployability was predicted from the medical reasons for having an e-Profile (categorized as sleep, behavioral health, musculoskeletal, cardiometabolic, injury, or accident) using binomial logistic regression. Sleep e-Profiles were investigated as a moderator between the other e-Profile categories and nondeployability. Results: Of 582,031 soldiers, 48.4% (n=281,738) had a sleep-related diagnosis in their healthcare records, 9.7% (n=56,247) had e-Profiles, and 1.9% (n=10,885) had a sleep e-Profile. Soldiers with sleep e-Profiles were more likely to have had a motor vehicle accident (pOR (prevalence odds ratio)=4.7, 95% CI 2.63–8.39, P≤0.001) or a work/duty-related injury (pOR=1.6, 95% CI 1.32–1.94, P≤0.001). The likelihood of nondeployability was greater in soldiers with both a sleep e-Profile and a musculoskeletal e-Profile (pOR=4.25, 95% CI 3.75–4.81, P≤0.001) or a work/duty-related injury (pOR=2.62, 95% CI 1.63–4.21, P≤0.001). Conclusion: Nearly half of soldiers had a sleep disorder or sleep-related medical diagnosis in 2018, but their sleep problems are largely not profiled as limitations to medical readiness. Musculoskeletal issues and physical injury predict nondeployability, and nondeployability is more likely in soldiers who have sleep e-Profiles in addition to these issues. Addressing sleep problems may prevent accidents and injuries that could render a soldier nondeployable.
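The pOR values reported above come from a 2x2 cross-tabulation. A minimal sketch of how a prevalence odds ratio and its Wald 95% CI are computed (the counts below are invented, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """pOR and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    por = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(pOR)
    lo = math.exp(math.log(por) - z * se)
    hi = math.exp(math.log(por) + z * se)
    return por, lo, hi

# Hypothetical counts: 40/100 exposed vs 20/100 unexposed had the outcome.
por, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(round(por, 2), round(lo, 2), round(hi, 2))  # 2.67 1.42 5.02
```

A binomial logistic regression generalizes this: each exponentiated coefficient is an (adjusted) odds ratio of the same form.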
In the digital era, the electronic medical record (EMR) has become a major way for hospitals to store patients' medical data. The traditional centralized medical system and semi-trusted cloud storage make it difficult to achieve a dynamic balance between privacy protection and data sharing. The storage capacity of a blockchain is limited, and single-blockchain schemes have poor scalability and low throughput. To address these issues, we propose a secure and efficient medical data storage and sharing scheme based on a double blockchain. In our scheme, the original EMR is encrypted and stored in the cloud. The storage blockchain stores the index of the complete EMR, and the shared blockchain stores the index of the shared part of the EMR. Users with different attributes can make requests to different blockchains to share different parts according to their own permissions. Experiments show that cloud storage combined with blockchain not only solves the problem of the blockchain's limited storage capacity but also greatly reduces the risk of leakage of the original EMR. Content Extraction Signature (CES) combined with the double-blockchain technology realizes the separation of the private part and the shared part of the original EMR. Symmetric encryption combined with Ciphertext-Policy Attribute-Based Encryption (CP-ABE) not only ensures the safe storage of data in the cloud but also achieves consistency and convenience of data updates, avoiding redundant backups. Security analysis and performance analysis verify the feasibility and effectiveness of our scheme.
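The storage layout, with ciphertext in the cloud and only indexes on the two chains, can be sketched as follows (the dict/list structures, field names, and record contents are stand-ins for the cloud service, the two blockchains, and an encrypted EMR):

```python
import hashlib
import json

cloud = {}                                  # stand-in for cloud object storage
storage_chain, shared_chain = [], []        # stand-ins for the two blockchains

def put_record(patient_id, full_emr, shared_fields):
    """Store the (here unencrypted, in reality encrypted) EMR in the cloud and
    record its index on the storage chain and a shared-part index on the
    shared chain."""
    blob = json.dumps(full_emr, sort_keys=True).encode()
    addr = hashlib.sha256(blob).hexdigest()         # content-addressed index
    cloud[addr] = blob                              # ciphertext would go here
    storage_chain.append({"patient": patient_id, "addr": addr})
    shared_chain.append({"patient": patient_id, "addr": addr,
                         "shared_fields": sorted(shared_fields)})
    return addr

addr = put_record("p001",
                  {"diagnosis": "HF", "notes": "private", "meds": ["statin"]},
                  shared_fields={"diagnosis", "meds"})
print(addr == storage_chain[-1]["addr"], shared_chain[-1]["shared_fields"])
# True ['diagnosis', 'meds']
```

A requester with sharing-level attributes would resolve only the shared-chain entry and be able to decrypt only the listed fields; the CP-ABE and CES machinery that enforces this is omitted from the sketch.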
Funding: supported by the Meteorological Soft Science Project (Grant No. 2023ZZXM29), the Natural Science Fund Project of Tianjin, China (Grant No. 21JCYBJC00740), and the Key Research and Development–Social Development Program of Jiangsu Province, China (Grant No. BE2021685).
Funding: This work is supported by the NSFC (Nos. 61772280, 61772454), the Changzhou Sci&Tech Program (No. CJ20179027), and the PAPD fund from NUIST. Prof. Jin Wang is the corresponding author.
Funding: Supported by the National Major Scientific and Technological Special Project for "Significant New Drugs Development" (No. 2018ZX09201008) and the Special Fund Project for Information Development from the Shanghai Municipal Commission of Economy and Information (No. 201701013).
Funding: This research was supported by the National Natural Science Foundation of China under Grant No. 42050102 and the Postgraduate Education Reform Project of Jiangsu Province under Grant No. SJCX22_0343. This research was also supported by the Dou Wanchun Expert Workstation of Yunnan Province (No. 202205AF150013).
Funding: National Basic Research Program of China (973 Program) (2005CD312904)
Abstract: To address the problem of workflow data consistency in a distributed environment, an invalidation strategy based on a timely updated record list is put forward. The strategy improves the classical invalidation strategy by combining a method for updating the record list with a recovery mechanism for update messages. When the request cycle of a duplication grows too long, the record list is updated so that update messages to that duplication are paused; when the long-cycle duplication is requested again, the recovery mechanism resumes the update messages. This strategy not only ensures the consistency of workflow data but also reduces unnecessary network traffic. A theoretical comparison with common strategies shows that the unnecessary network traffic of this strategy is lower and more stable, and the simulation results validate this conclusion.
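The pause-and-recover behavior of the record list can be sketched as below. The class name, the `max_idle` threshold, and the method interface are illustrative assumptions about the strategy, not the paper's actual design:

```python
class UpdateRecordList:
    """Record-list manager: a replica whose duplication-request cycle has
    grown longer than `max_idle` is paused (no more update messages are
    sent to it); when such a replica requests the record again, the
    recovery step resumes it and signals that the latest version must be
    resent first."""

    def __init__(self, max_idle):
        self.max_idle = max_idle
        self.last_request = {}  # replica -> time of its last duplication request
        self.active = set()     # replicas currently receiving update messages

    def on_update(self, now):
        """Master copy changed: prune long-idle replicas, return the rest."""
        for r in list(self.active):
            if now - self.last_request[r] > self.max_idle:
                self.active.discard(r)      # pause update messages
        return sorted(self.active)          # replicas to notify

    def on_request(self, replica, now):
        """A replica requests the record; recover it if it was paused."""
        recovered = replica in self.last_request and replica not in self.active
        self.last_request[replica] = now
        self.active.add(replica)
        return recovered                    # True -> resend latest version first

m = UpdateRecordList(max_idle=10)
m.on_request("A", now=0)
m.on_request("B", now=0)
print(m.on_update(now=5))   # both recently active: ['A', 'B']
```

Pruning idle replicas is what removes the unnecessary network traffic: no invalidation messages are sent to replicas that have stopped requesting the record.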
Funding: Supported in part by the National Natural Science Foundation of China (41075055)
Abstract: The calibration of paleoclimate proxies is one of the key problems in current paleoclimate research. Historical documentary records of climate are suitable for calibrating both the dating and the climatic implication of proxy data in a climatological sense. A test calibration correcting the Delingha tree-ring precipitation series against Chinese historical documentary records shows that, among the 44 extreme dry cases in 1401-1950 AD, 42 cases (95.5%) are believable; thus the long Delingha tree-ring precipitation series is highly reliable. Another test, validating the monsoon-intensity proxy data based on the Zhanjiang Huguangyan sediments against historical records, indicates that the years of winter monsoon intensity designated by the maar lake Ti-content series are entirely opposite to the years of harsh winters depicted in historical documents for 800-900 AD. As a result, serious doubt is raised about the climatic implication of this paleo-monsoon proxy series.
Abstract: With the advancements in the era of artificial intelligence, blockchain, cloud computing, and big data, there is a need for secure, decentralized medical record storage and retrieval systems. While cloud storage solves storage issues, it is challenging to realize secure sharing of records over the network. The Medi-block record in the healthcare system has brought a new digitalization method for patients' medical records. This technology provides a symmetrical process between the hospital and doctors when patients urgently need to go to a different or nearby hospital. It enables electronic medical records to be available with the correct authentication and restricts access to medical data retrieval. The Medi-block record is a consumer-centered healthcare data system that brings reliable and transparent datasets for the medical record. This study presents an extensive review of proposed solutions aiming to protect the privacy and integrity of medical data by securing data sharing for Medi-block records. It also proposes a comprehensive investigation of recent advances in different methods of securing data sharing, such as Blockchain technology, Access Control, Privacy-Preserving, Proxy Re-Encryption, and Service-On-Chain approaches. Finally, we highlight the open issues and identify the challenges regarding secure data sharing for Medi-block records in healthcare systems.
Abstract: In data management software, records of different lengths often need to be stored in an array, and the number of records typically grows as the software is used. A universal data structure is presented in this design: it provides a unified interface for dynamically storing records of different lengths, so that developers can call the unified interface directly for data storage, simplifying the design of data management systems.
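One common way to realize such a structure is a single growing buffer plus an offset table, so callers see one interface regardless of record length. The sketch below is an illustrative interpretation, not the design described in the abstract:

```python
class RecordStore:
    """Variable-length records in one growing buffer.

    An offset table maps record index -> (offset, length), so append and
    get work the same way for any record size, and the store grows as
    records are added."""

    def __init__(self):
        self._buf = bytearray()
        self._index = []            # (offset, length) per record

    def append(self, record: bytes) -> int:
        """Store a record of any length; return its index."""
        self._index.append((len(self._buf), len(record)))
        self._buf.extend(record)
        return len(self._index) - 1

    def get(self, i: int) -> bytes:
        off, n = self._index[i]
        return bytes(self._buf[off:off + n])

    def __len__(self):
        return len(self._index)

store = RecordStore()
store.append(b"short")
store.append(b"a much longer record payload")
print(store.get(0), len(store))
```

Because records are appended contiguously, the buffer never fragments; deletion or in-place resizing would require compaction, which this sketch omits.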
Funding: This work is supported by the NSFC (61772280), the national training programs of innovation and entrepreneurship for undergraduates (Nos. 201910300123Y, 202010300200), and the PAPD fund from NUIST.
Abstract: In the field of electronic record management, especially in the current big data environment, data continuity has become a new topic that is as important as security and needs to be studied. This paper decomposes the data continuity guarantee for electronic records into a set of data protection requirements consisting of data relevance, traceability, and comprehensibility, and proposes using associated data technology to provide an integrated guarantee mechanism that meets these three requirements.
Abstract: Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical reports ensures that all information is available in a single place. However, it is difficult to store and manage large amounts of data. Dedicated servers and a data center are needed to store and manage patient data, but self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative. The advantage of storing data in a cloud is that it can be retrieved anywhere and anytime using any device connected to the Internet. Therefore, doctors can easily access the medical history of a patient and diagnose diseases according to the context, which also helps prescribe the correct medicine to a patient in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns. These servers are often semi-trusted, so sensitive medical information must be protected. Open access to records and modifications performed on the information in those records may even cause patient fatalities. Patient-centric health-record security is a major concern. End-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method combining the advanced encryption standard and the elliptic curve Diffie-Hellman method, designed to increase the efficiency of medical record security for users. Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic curve and secret-sharing methods. This study aims to provide a high level of security for patient health records.
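The key flow of such a scheme (a Diffie-Hellman agreement deriving a key for a symmetric cipher) can be sketched with the standard library only. Note the stand-ins: a small integer group replaces the elliptic curve and a SHA-256 counter keystream replaces AES, so this is a toy illustration of the pipeline, not the paper's AES/ECDH implementation:

```python
import hashlib
import secrets

# Toy group: P = 2^127 - 1 is a Mersenne prime -- fine for illustration,
# far too weak for real security. Real code would use ECDH + AES,
# e.g. via the `cryptography` package.
P = 2**127 - 1
G = 3

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Symmetric XOR cipher driven by a SHA-256 counter keystream."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# Each party picks a secret exponent and publishes G^x mod P
a = secrets.randbelow(P - 2) + 1
b = secrets.randbelow(P - 2) + 1
A, B = pow(G, a, P), pow(G, b, P)
shared = pow(B, a, P)                  # equals pow(A, b, P)
key = hashlib.sha256(shared.to_bytes(16, "big")).digest()

record = b"patient 0047: dx hypertension"
ciphertext = keystream_xor(key, record)
assert keystream_xor(key, ciphertext) == record  # round trip recovers the record
```

The point of the construction is that the record is encrypted end to end before it ever reaches the semi-trusted server: the cloud sees only `ciphertext`, while both parties derive `key` without transmitting it.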
Abstract: This paper presents a rule merging and simplifying method and an improved deviation-analysis algorithm. Fuzzy equivalence theory avoids the rigid either-or judgment of traditional equivalence theory. During a data cleaning task, some rules stand in included/including relations with each other. The equivalence degree of the included rule is smaller than that of the including rule, so a rule merging and simplifying method is introduced to reduce the total computing time. Such relations also affect the deviation of the fuzzy equivalence degree, so an improved deviation-analysis algorithm that omits the influence of the included rules' equivalence degrees is also presented. Normally the duplicate records are logged in a file, and users have to check and verify them one by one, which is time-consuming; the proposed algorithm saves users' labor during duplicate record checking. Finally, an experiment demonstrating the practicality of the rule merging method is presented.
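The merging step can be sketched as follows, under one plausible reading of the included/including relation: rules are modeled as sets of matching conditions, and a rule strictly included in another is dropped because the including rule already covers it. The representation and the direction of the pruning are assumptions for illustration, not the paper's definition:

```python
def merge_rules(rules):
    """Keep only 'including' rules: any rule whose condition set is
    strictly included in another rule's set is dropped, since the
    including rule already covers it."""
    return [r for r in rules if not any(r < other for other in rules)]

rules = [frozenset({"name"}), frozenset({"name", "addr"}), frozenset({"phone"})]
print(merge_rules(rules))  # the {"name"} rule is merged into {"name", "addr"}
```

Pruning included rules before computing equivalence degrees is what reduces the total computing time: fewer rules are evaluated per record pair, and the weaker degrees of the included rules no longer distort the deviation analysis.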
Abstract: Routinely collected health data play an increasingly important role in pharmacoepidemiologic research. The RECORD statement, an extension of the STROBE statement, is currently the main reporting guideline for studies using routinely collected health data, but it is not fully applicable to pharmacoepidemiologic research. The RECORD-PE (RECORD for Pharmacoepidemiology) statement extends the RECORD statement with 15 items specific to pharmacoepidemiologic research. This article provides a detailed interpretation of RECORD-PE, together with corresponding examples, to help Chinese researchers better understand and apply the RECORD-PE statement.
Funding: Supported by the NSFC project (41474046) and the DQJB project (DQJB16B05) of the Institute of Geophysics, CEA
Abstract: On November 13, 2016, an MW7.8 earthquake struck Kaikoura in the South Island of New Zealand. By means of back-projection of array recordings, ASTF analysis of global seismic recordings, and joint inversion of global seismic data and co-seismic InSAR data, we investigated the complexity of the earthquake source. The results show that the 2016 MW7.8 Kaikoura earthquake ruptured for about 100 s unilaterally from south to northeast (~N28°-33°E), producing a rupture area about 160 km long and about 50 km wide and releasing a scalar moment of 1.01×10^21 N·m. In particular, the rupture area consisted of two slip asperities: one close to the initial rupture point with a maximal slip value of ~6.9 m, and the other far away to the northeast with a maximal slip value of ~9.3 m. The first asperity slipped for about 65 s, and the second one started 40 s after the first had initiated; the two slipped simultaneously for about 25 s. Furthermore, the first had a nearly pure thrust slip while the second had both thrust and strike slip. Interestingly, the rupture velocity was not constant, and the whole process may be divided into 5 stages in which the velocities were estimated to be 1.4 km/s, 0 km/s, 2.1 km/s, 0 km/s, and 1.1 km/s, respectively. The high-frequency sources were distributed nearly along the lower edge of the rupture area, the high-frequency radiation mainly occurred at the launching of the asperities, and it seems that no high-frequency energy was radiated when the rupture was about to stop.
Funding: National Natural Science Foundation of China, No. 40471007; Innovation Knowledge Project of CAS, No. KZCX2-YW-315
Abstract: Recent studies have demonstrated the importance of LUCC change for climate and ecosystem simulation, but the result can only be determined precisely if a high-resolution underlying land cover map is used. While satellite-based efforts have provided a good baseline for present land cover, the next advance required in LUCC-change research is the reconstruction of historical LUCC change, especially spatially explicit historical datasets. Differing from other similar studies, this study is based on an analysis of historical land use patterns in the traditional cultivated region of China. Leaving aside less important factors, altitude, slope, and population patterns are selected as the major drivers of reclamation in ancient China and used to design the HCGM (Historical Cropland Gridding Model, at a 60 km × 60 km resolution), an empirical model for spatially allocating the historical cropland inventory data to grid cells in each political unit. We then use this model to reconstruct the cropland distribution of the study area in 1820 and verify the result against prefectural cropland data of 1820 drawn from historical documents. The statistical analysis shows that the model can efficiently simulate the patterns of cropland distribution in the traditional cultivated region in the historical period.
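The allocation idea behind such a gridding model can be sketched as below: each political unit's cropland inventory is distributed to its grid cells in proportion to a suitability weight built from altitude, slope, and population density. The weight formula and all thresholds here are illustrative assumptions, not HCGM's calibrated coefficients:

```python
def allocate_cropland(total_area, cells):
    """Split a unit's cropland inventory across grid cells in proportion
    to a simple suitability weight (illustrative, not HCGM's formula)."""
    def weight(c):
        alt = max(0.0, 1 - c["altitude_m"] / 3000)   # lower terrain is more arable
        slope = max(0.0, 1 - c["slope_deg"] / 25)    # flatter terrain is more arable
        return alt * slope * c["pop_density"]        # more people, more reclamation
    w = [weight(c) for c in cells]
    total_w = sum(w)
    return [total_area * wi / total_w for wi in w]

cells = [
    {"altitude_m": 200,  "slope_deg": 2,  "pop_density": 80},
    {"altitude_m": 1500, "slope_deg": 10, "pop_density": 20},
    {"altitude_m": 2800, "slope_deg": 20, "pop_density": 5},
]
shares = allocate_cropland(6000.0, cells)   # e.g. 6000 km^2 of inventoried cropland
print([round(s, 1) for s in shares])
```

Because the weights only redistribute the inventoried total, the gridded map stays consistent with the per-unit historical records while adding spatial detail.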
Funding: The Department of Defense Military Operational Medicine Research Program (MOMRP) supported this study.
Abstract: Background: The impact of sleep disorders on active-duty soldiers' medical readiness is not currently quantified. Patient data generated at military treatment facilities can be accessed to create research reports and thus can be used to estimate the prevalence of sleep disturbances and the role of sleep in the overall health of service members. The current study aimed to quantify sleep-related health issues and their impact on health and nondeployability through the analysis of U.S. military healthcare records from fiscal year 2018 (FY2018). Methods: Medical diagnosis information and deployability profiles (e-Profiles) were queried for all active-duty U.S. Army patients with a concurrent sleep disorder diagnosis receiving medical care within FY2018. Nondeployability was predicted from medical reasons for having an e-Profile (categorized as sleep, behavioral health, musculoskeletal, cardiometabolic, injury, or accident) using binomial logistic regression. Sleep e-Profiles were investigated as a moderator between other e-Profile categories and nondeployability. Results: Of 582,031 soldiers, 48.4% (n=281,738) had a sleep-related diagnosis in their healthcare records, 9.7% (n=56,247) had e-Profiles, and 1.9% (n=10,885) had a sleep e-Profile. Soldiers with sleep e-Profiles were more likely to have had a motor vehicle accident (pOR (prevalence odds ratio)=4.7, 95% CI 2.63-8.39, P≤0.001) or a work/duty-related injury (pOR=1.6, 95% CI 1.32-1.94, P≤0.001). The likelihood of nondeployability was greater in soldiers with a sleep e-Profile and a musculoskeletal e-Profile (pOR=4.25, 95% CI 3.75-4.81, P≤0.001) or a work/duty-related injury (pOR=2.62, 95% CI 1.63-4.21, P≤0.001). Conclusion: Nearly half of soldiers had a sleep disorder or sleep-related medical diagnosis in 2018, but their sleep problems are largely not profiled as limitations to medical readiness. Musculoskeletal issues and physical injury predict nondeployability, and nondeployability is more likely to occur in soldiers who have sleep e-Profiles in addition to these issues. Addressing sleep problems may prevent accidents and injuries that could render a soldier nondeployable.
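A prevalence odds ratio of the kind reported above comes from a 2×2 table, with a Wald-type 95% CI on the log scale (Woolf's method). The counts below are illustrative only, not the study's data:

```python
import math

def prevalence_odds_ratio(a, b, c, d):
    """pOR and Woolf 95% CI from a 2x2 table:
       a = exposed cases,   b = exposed non-cases
       c = unexposed cases, d = unexposed non-cases"""
    por = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(pOR)
    lo = math.exp(math.log(por) - 1.96 * se)
    hi = math.exp(math.log(por) + 1.96 * se)
    return por, lo, hi

# Illustrative counts only -- not the study's data
por, lo, hi = prevalence_odds_ratio(30, 70, 40, 360)
print(round(por, 2), round(lo, 2), round(hi, 2))
```

A CI that excludes 1 corresponds to the P ≤ 0.001 associations reported, though the exact P values come from the logistic regression model rather than the table directly.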
Funding: Supported by the Natural Science Foundation of Heilongjiang Province of China under Grant No. LC2016024, the Natural Science Foundation of the Jiangsu Higher Education Institutions under Grant No. 17KJB520044, and the Six Talent Peaks Project in Jiangsu Province, No. XYDXX-108.
Abstract: In the digital era, the electronic medical record (EMR) has become a major way for hospitals to store patients' medical data. The traditional centralized medical system and semi-trusted cloud storage make it difficult to achieve a dynamic balance between privacy protection and data sharing. The storage capacity of a blockchain is limited, and single-blockchain schemes have poor scalability and low throughput. To address these issues, we propose a secure and efficient medical data storage and sharing scheme based on a double blockchain. In our scheme, we encrypt the original EMR and store it in the cloud. The storage blockchain stores the index of the complete EMR, and the shared blockchain stores the index of the shared part of the EMR. Users with different attributes can make requests to different blockchains to share different parts according to their own permissions. Experiments showed that cloud storage combined with blockchain not only solves the problem of the blockchain's limited storage capacity but also greatly reduces the risk of leakage of the original EMR. Content Extraction Signature (CES) combined with the double-blockchain technology realizes the separation of the private part and the shared part of the original EMR. Symmetric encryption combined with Ciphertext-Policy Attribute-Based Encryption (CP-ABE) not only ensures the safe storage of data in the cloud but also achieves consistency and convenience of data updates, avoiding redundant backups. Security analysis and performance analysis verify the feasibility and effectiveness of our scheme.
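The division of labor described above — ciphertext in the cloud, only indexes on the two chains — can be sketched as follows. The `Chain` class, the index format, and the digest-as-index choice are illustrative assumptions; encryption, CES, and CP-ABE are omitted:

```python
import hashlib
import json

class Chain:
    """Minimal append-only hash chain, a stand-in for each blockchain."""
    def __init__(self):
        self.blocks = [{"prev": "0" * 64, "data": "genesis"}]

    def append(self, data):
        prev = hashlib.sha256(
            json.dumps(self.blocks[-1], sort_keys=True).encode()
        ).hexdigest()
        self.blocks.append({"prev": prev, "data": data})

cloud = {}                                  # ciphertext store (stand-in for the cloud)
storage_chain, shared_chain = Chain(), Chain()

def store_emr(emr_id, full_ct, shared_ct):
    """Cloud keeps the ciphertexts; each chain keeps only its index digest."""
    cloud[emr_id] = full_ct
    storage_chain.append({"emr": emr_id,
                          "digest": hashlib.sha256(full_ct).hexdigest()})
    shared_chain.append({"emr": emr_id,
                         "digest": hashlib.sha256(shared_ct).hexdigest()})

store_emr("emr-001", b"<full encrypted record>", b"<shared extract>")
print(len(storage_chain.blocks), len(shared_chain.blocks), "emr-001" in cloud)
```

Keeping only digests on-chain is what sidesteps the blockchain's storage limit: the chains stay small and tamper-evident, while the bulky ciphertext lives in the cloud and can be verified against its on-chain digest on retrieval.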