The high energy cosmic-radiation detection (HERD) facility is planned for launch in 2027 and is scheduled to be installed on the China Space Station. It serves as a dark matter particle detector, a cosmic ray instrument, and an observatory for high-energy gamma rays. A transition radiation detector placed on one of its lateral sides serves a dual purpose: (i) calibrating HERD's electromagnetic calorimeter in the TeV energy range, and (ii) serving as an independent detector for high-energy gamma rays. In this paper, the prototype readout electronics design of the transition radiation detector is presented, which aims to accurately measure the charge of the anodes using the SAMPA application-specific integrated circuit. The electronic performance of the prototype system is evaluated in terms of noise, linearity, and resolution. With the presented design, each electronic channel achieves a dynamic range of 0–100 fC, an RMS noise level not exceeding 0.15 fC, and an integral nonlinearity below 0.2%. To further verify the readout electronics performance, a joint test with the detector was carried out; the results show that the prototype system satisfies the requirements of the detector's scientific goals.
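As an illustration of how bench metrics of this kind are typically derived, the sketch below computes a channel's RMS noise and integral nonlinearity from a hypothetical charge-injection scan. The array names, the synthetic calibration data, and the 0–100 fC full scale are assumptions for illustration, not the authors' actual analysis code.

    import numpy as np

    # Hypothetical calibration data: injected charge (fC) and the mean ADC
    # response of one channel, plus repeated pedestal samples for the noise.
    injected_fc = np.linspace(0.0, 100.0, 21)             # charge-injection points
    adc_mean = 40.0 * injected_fc + 12.0 + np.random.normal(0, 3, injected_fc.size)
    pedestal_adc = np.random.normal(12.0, 6.0, 5000)      # repeated zero-input reads

    # Straight-line fit over the full dynamic range.
    gain, offset = np.polyfit(injected_fc, adc_mean, 1)   # ADC counts per fC, baseline

    # RMS noise in ADC counts, converted to input-referred charge (fC).
    rms_noise_fc = pedestal_adc.std() / gain

    # Integral nonlinearity: worst-case deviation from the fit line,
    # expressed as a percentage of the full scale.
    full_scale_fc = 100.0
    residual_fc = (adc_mean - (gain * injected_fc + offset)) / gain
    inl_percent = 100.0 * np.max(np.abs(residual_fc)) / full_scale_fc

    print(f"gain = {gain:.2f} ADC/fC, RMS noise = {rms_noise_fc:.3f} fC, INL = {inl_percent:.2f}%")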
LiB (lithium-ion battery) has become a serious concern for energy management systems, especially in Japan, where the debate over nuclear power plants is active. Long-term use of LiB, including its reuse, is expected; however, no method to ensure LiB life has been developed, so users are forced to accept the uncertainty of LiB life. This study therefore proposes an evaluation method for LiB life based on degradation experimental data. The method has three elements: defining indexes, preparing a degradation-speed database from the experimental results, and setting up the use patterns of LiB. So that it can be used under non-experimental conditions, the degradation-speed database covers all conditions by interpolating the experimental results. Finally, the evaluation model was verified by comparing model estimates with experimental measurements.
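The core of such an evaluation can be pictured as a table lookup with interpolation over storage conditions, integrated over a use pattern. The sketch below is a minimal illustration of that idea; the grid values, the capacity-loss rates, and the (months, temperature, SOC) use-pattern format are invented for the example and do not reproduce the study's database.

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Hypothetical degradation-speed table: capacity loss (% per month) measured
    # on a grid of storage temperatures (deg C) and states of charge (%).
    temps = np.array([0.0, 25.0, 45.0])
    socs = np.array([20.0, 50.0, 80.0, 100.0])
    loss_rate = np.array([            # rows: temperature, cols: SOC
        [0.05, 0.08, 0.12, 0.18],
        [0.10, 0.15, 0.25, 0.40],
        [0.30, 0.45, 0.70, 1.10],
    ])

    # The interpolator fills in non-experimental conditions between grid points.
    rate_at = RegularGridInterpolator((temps, socs), loss_rate)

    def capacity_after(use_pattern):
        """use_pattern: list of (months, temperature, soc) segments."""
        capacity = 100.0
        for months, temp, soc in use_pattern:
            capacity -= months * float(rate_at([[temp, soc]])[0])
        return capacity

    # Example use pattern: one year stored warm at high SOC, then two years moderate.
    print(capacity_after([(12, 35.0, 90.0), (24, 25.0, 50.0)]))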
Since the British National Archives put forward the concept of digital continuity in 2007, several developed countries have worked out digital continuity action plans. However, technologies for guaranteeing digital continuity are still lacking. This paper first analyzes the requirements of digital continuity guarantee for electronic records based on data quality theory and points out the necessity of data quality guarantee for electronic records. It then converts the digital continuity guarantee of electronic records into ensuring their consistency, completeness, and timeliness, and constructs the first technology framework of digital continuity guarantee for electronic records. Finally, temporal functional dependency techniques are used to build the first integration method to ensure the consistency, completeness, and timeliness of electronic records.
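To make the three requirements concrete, the following is a minimal sketch of what consistency, completeness, and timeliness checks over a record's metadata could look like; the field names, the registry structure, and the 30-day capture threshold are illustrative assumptions, not the integration method or framework proposed in the paper.

    from datetime import datetime, timedelta

    REQUIRED_FIELDS = ["record_id", "creator", "created_at", "checksum"]  # assumed schema

    def check_record(record, registry, max_capture_delay=timedelta(days=30)):
        """Return a list of data-quality issues for one electronic record."""
        issues = []
        # Completeness: every mandatory metadata field must be present and non-empty.
        for field in REQUIRED_FIELDS:
            if not record.get(field):
                issues.append(f"missing field: {field}")
        # Consistency: the same record_id must not map to different checksums.
        known = registry.get(record.get("record_id"))
        if known is not None and known != record.get("checksum"):
            issues.append("checksum conflicts with previously registered copy")
        # Timeliness: the record must be captured within the allowed delay.
        created = record.get("created_at")
        captured = record.get("captured_at", datetime.now())
        if created and captured - created > max_capture_delay:
            issues.append("captured too long after creation")
        return issues

    registry = {"R-001": "abc123"}
    record = {"record_id": "R-001", "creator": "archives office",
              "created_at": datetime(2024, 1, 5), "captured_at": datetime(2024, 3, 20),
              "checksum": "abc123"}
    print(check_record(record, registry))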
Regional healthcare platforms collect clinical data from hospitals in specific areas for the purpose of healthcare management. It is a common requirement to reuse these data for clinical research. However, challenges such as inconsistent terminology in electronic health records (EHR) and the complexity of data quality and data formats on regional healthcare platforms must be addressed. In this paper, we propose a methodology and process for constructing large-scale cohorts, which form the basis of causality and comparative-effectiveness studies in epidemiology. We first constructed a Chinese terminology knowledge graph to deal with the diversity of vocabularies on the regional platform. Second, we built special disease case repositories (i.e., a heart failure repository) that use the graph to search for related patients and to normalize the data. Based on the requirements of a clinical study that aimed to explore the effect of statin use on 180-day readmission in patients with heart failure, we built a large-scale retrospective cohort of 29,647 heart failure cases from the heart failure repository. After propensity score matching, a study group (n = 6346) and a control group (n = 6346) with comparable clinical characteristics were obtained. Logistic regression analysis showed that taking statins was negatively correlated with 180-day readmission in heart failure patients. This paper presents the workflow and an application example of big data mining based on regional EHR data.
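The matching and outcome-modeling steps described above can be compressed into a short sketch on synthetic data. The column names, the 1:1 nearest-neighbour matching with replacement, and the two-covariate propensity model are assumptions made for brevity, not the study's actual pipeline.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    n = 4000
    # Synthetic heart-failure cohort: baseline covariates, statin exposure, readmission.
    df = pd.DataFrame({
        "age": rng.normal(70, 10, n),
        "ef": rng.normal(40, 8, n),                      # ejection fraction
        "statin": rng.integers(0, 2, n),
        "readmit_180d": rng.integers(0, 2, n),
    })

    # 1) Propensity score: probability of statin use given baseline covariates.
    X = df[["age", "ef"]]
    ps_model = LogisticRegression(max_iter=1000).fit(X, df["statin"])
    df["ps"] = ps_model.predict_proba(X)[:, 1]

    # 2) 1:1 nearest-neighbour matching (with replacement, for brevity) on the score.
    treated = df[df["statin"] == 1]
    control = df[df["statin"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # 3) Outcome model on the matched cohort: statin use vs. 180-day readmission.
    outcome = LogisticRegression(max_iter=1000).fit(matched[["statin"]], matched["readmit_180d"])
    print("log-odds of readmission for statin use:", outcome.coef_[0][0])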
Background: Given the importance of customers as the most valuable assets of organizations, customer retention is an essential, basic requirement for any organization, and banks are no exception. The competitive environment in which different banks provide electronic banking services increases the necessity of customer retention. Methods: Building on existing information technologies that allow data to be collected from organizations' databases, data mining provides a powerful tool for extracting knowledge from huge amounts of data. In this research, the decision tree technique was applied to build a model incorporating this knowledge. Results: The results characterize the customers who churned. Conclusions: Bank managers can use the decision tree results to identify likely churners in the future and should prepare retention strategies for customers whose characteristics increasingly resemble those of churners.
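A decision-tree churn model of this kind can be sketched in a few lines; the tiny synthetic customer table and feature names below stand in for the bank's database and are not the study's data.

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Synthetic stand-in for a bank's customer table.
    data = pd.DataFrame({
        "months_active": [3, 48, 60, 5, 24, 72, 8, 36, 2, 55],
        "monthly_logins": [1, 20, 25, 2, 10, 30, 1, 14, 0, 22],
        "num_products": [1, 3, 4, 1, 2, 4, 1, 2, 1, 3],
        "churned": [1, 0, 0, 1, 0, 0, 1, 0, 1, 0],
    })

    X = data.drop(columns="churned")
    y = data["churned"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

    tree = DecisionTreeClassifier(max_depth=3, random_state=1).fit(X_train, y_train)
    print("hold-out accuracy:", tree.score(X_test, y_test))
    # The readable rules are what managers would inspect to profile likely churners.
    print(export_text(tree, feature_names=list(X.columns)))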
In the electron beam selective melting (EBSM) process, the quality of each deposited melt track has an effect on the properties of the manufactured component. However, the formation of the melt track is governed by various physical phenomena and influenced by various process parameters, and the correlation of these parameters is complicated and difficult to establish experimentally. The mesoscopic modeling technique was recently introduced as a means of simulating the electron beam (EB) melting process and revealing the formation mechanisms of specific melt track morphologies. However, the correlation between the process parameters and the melt track features has not yet been quantitatively understood. This paper investigates the morphological features of the melt track from the results of mesoscopic simulation, while introducing key descriptive indexes such as melt track width and height in order to numerically assess the deposition quality. The effects of various processing parameters are also quantitatively investigated, and the correlation between the processing conditions and the melt track features is thereby derived. Finally, a simulation-driven optimization framework consisting of mesoscopic modeling and data mining is proposed, and its potential and limitations are discussed.
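Descriptive indexes like track width and height can be extracted from a simulated surface profile with a few array operations. The synthetic Gaussian cross-section and the 2 µm width threshold below are assumptions for illustration only, not the paper's extraction procedure.

    import numpy as np

    # Synthetic cross-sectional height profile of one melt track (um),
    # standing in for a slice of the mesoscopic simulation result.
    y = np.linspace(-200, 200, 401)                       # lateral position (um)
    substrate = 0.0
    profile = 45.0 * np.exp(-(y / 80.0) ** 2) - 5.0       # bump above a slight depression

    # Track height: peak elevation above the substrate plane.
    track_height = profile.max() - substrate

    # Track width: lateral extent where the surface rises above a small threshold.
    threshold = 2.0                                        # um above the substrate
    above = y[profile > substrate + threshold]
    track_width = above.max() - above.min() if above.size else 0.0

    print(f"height = {track_height:.1f} um, width = {track_width:.1f} um")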
Because radiation belt electrons can pose a potential threat to the safety of satellites orbiting in space, it is of great importance to develop a reliable model that can predict the highly dynamic variations in outer radiation belt electron fluxes. In the present study, we develop a forecast model of radiation belt electron fluxes based on the data assimilation method, in terms of Van Allen Probe measurements combined with three-dimensional radiation belt numerical simulations. Our forecast model can cover the entire outer radiation belt with a high temporal resolution (1 hour) and a spatial resolution of 0.25 L over a wide range of both electron energy (0.1–5.0 MeV) and pitch angle (5°–90°). On the basis of this model, we forecast hourly electron fluxes for the next 1, 2, and 3 days during an intense geomagnetic storm and evaluate the corresponding prediction performance. Our model can reasonably predict the stormtime evolution of radiation belt electrons with high prediction efficiency (up to ~0.8–1). The best prediction performance is found for ~0.3–3 MeV electrons at L = ~3.25–4.5, which extends to higher L and lower energies with increasing pitch angle. Our results demonstrate that the forecast model developed can be a powerful tool to predict the spatiotemporal changes in outer radiation belt electron fluxes, and the model has both scientific significance and practical implications.
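The essence of a data-assimilation forecast step can be illustrated with a scalar, Kalman-filter-style update that blends the model state with an observation before propagating forward. The damped persistence model, the error growth rate, and the toy observation schedule below are illustrative assumptions and bear no relation to the 3-D radiation belt code used in the study.

    import numpy as np

    def assimilate(model_flux, obs_flux, model_var, obs_var):
        """Blend model state and observation, weighted by their uncertainties."""
        gain = model_var / (model_var + obs_var)          # Kalman gain
        analysis = model_flux + gain * (obs_flux - model_flux)
        analysis_var = (1.0 - gain) * model_var
        return analysis, analysis_var

    # Toy hourly cycle: propagate log10 electron flux with a damped persistence
    # model, and assimilate an observation whenever the "satellite" provides one.
    state, var = 4.0, 0.5                                  # log10 flux and its variance
    observations = {3: 4.6, 7: 4.2}                        # hour -> observed log10 flux
    for hour in range(1, 10):
        state = 0.98 * state                               # crude loss/transport model
        var += 0.05                                        # model error grows each hour
        if hour in observations:
            state, var = assimilate(state, observations[hour], var, obs_var=0.1)
        print(f"hour {hour}: log10 flux = {state:.2f}")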
To study the influencing factors of traffic violations, this study investigated the effects of vehicle attribution, day of week, time of day, location of traffic violations, and weather on traffic violations based on the electronic enforcement data and historical weather data obtained in Shangyu, China. Ten categories of traffic violations were determined from the raw data. Then, chi-square tests were used to analyze the relationship between traffic violations and the potential risk factors. Multinomial logistic regression analyses were conducted to further estimate the effects of different risk factors on the likelihood of the occurrence of traffic violations. By analyzing the results of chi-square tests via SPSS, the five factors above were all determined as significant factors associated with traffic violations. The results of the multinomial logistic regression revealed the significant effects of the five factors on the likelihood of the occurrence of corresponding traffic violations. The conclusions are of great significance for the development of effective traffic intervention measures to reduce traffic violations and the improvement of road traffic safety.
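The analysis was run in SPSS; purely as an illustration of the two statistical steps (a chi-square test of independence followed by a multinomial logistic regression), the sketch below applies them to synthetic records. The factor levels and variable names are placeholders, not the categories coded from the Shangyu data set.

    import numpy as np
    import pandas as pd
    from scipy.stats import chi2_contingency
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 2000
    # Synthetic enforcement records.
    df = pd.DataFrame({
        "violation": rng.choice(["speeding", "signal", "parking"], n),
        "weather": rng.choice(["clear", "rain"], n),
        "day_of_week": rng.integers(0, 7, n),
        "hour": rng.integers(0, 24, n),
    })

    # Chi-square test of independence between weather and violation category.
    table = pd.crosstab(df["weather"], df["violation"])
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

    # Multinomial logistic regression of violation category on the risk factors
    # (lbfgs fits a multinomial model for the three classes).
    X = pd.get_dummies(df[["weather"]]).join(df[["day_of_week", "hour"]])
    model = LogisticRegression(max_iter=2000).fit(X, df["violation"])
    print(dict(zip(model.classes_, model.coef_[:, 0].round(3))))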
In this work, an old scanning electron microscope (SEM) is refurbished to enhance its image processing capability. How to digitally sample and process an analog image is also presented. An NI PCI-6259 multiple input/output data acquisition (DAQ) board is used to acquire signals originally sent to an analog display and convert them into a digital image. Two output channels are used for raster scanning of the horizontal and vertical axes of the image buffer, while one input channel is used to read the brightness signals at the various coordinate points. A synchronous method is used to maximize the DAQ speed. Finally, the digitally buffered images are read out for display and saved to a hard drive. The hardware and software designs are explained in detail and can serve as a good example of fast synchronous DAQ, advanced virtual instrument design, and structured driver programming with LabVIEW.
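Conceptually, the digitization amounts to driving two analog outputs as a raster while reading one brightness input in lockstep, then folding the sample stream back into a 2-D image. The sketch below (in Python rather than the LabVIEW used in the work) shows only the scan-pattern and image-assembly logic, with the actual DAQ calls abstracted away; the 512×512 size, voltage range, and placeholder signal are assumptions.

    import numpy as np

    WIDTH, HEIGHT = 512, 512

    def raster_voltages(width, height, v_range=(-5.0, 5.0)):
        """X/Y deflection voltages visiting every pixel in row-major order."""
        vx = np.linspace(*v_range, width)
        vy = np.linspace(*v_range, height)
        xx, yy = np.meshgrid(vx, vy)                  # row-major scan pattern
        return xx.ravel(), yy.ravel()

    def assemble_image(brightness_samples, width, height):
        """Fold the synchronously read brightness stream back into a 2-D image."""
        return np.asarray(brightness_samples, dtype=float).reshape(height, width)

    # Stand-in for the analog input stream read in lockstep with the scan outputs.
    x_out, y_out = raster_voltages(WIDTH, HEIGHT)
    fake_brightness = np.hypot(x_out, y_out)          # placeholder signal
    image = assemble_image(fake_brightness, WIDTH, HEIGHT)
    print(image.shape, image.min(), image.max())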
With the rapid development of information technology, the electronification of medical records has gradually become a trend. In China, the population base is huge and the supporting medical institutions are numerous, and this reality drives the conversion of paper medical records to electronic medical records. Electronic medical records are the basis for establishing a smart hospital and an important guarantee for achieving medical intelligence, and the massive amount of electronic medical record data is also an important data set for research in the medical field. However, electronic medical records contain a large amount of private patient information, which must be desensitized before they are used as open resources. To solve these problems, data masking for Chinese electronic medical records with named entity recognition is proposed in this paper. First, the text is vectorized to satisfy the required format of the model input. Second, because input sentences vary in length and the relationship between sentences in context is not negligible, a neural network model for named entity recognition based on bidirectional long short-term memory (BiLSTM) with conditional random fields (CRF) is constructed. Finally, the data masking operation is performed based on the named entity recognition results, mainly using regular-expression filtering and encryption together with principal component analysis (PCA) word vector compression and replacement. In addition, comparison experiments with the hidden Markov model (HMM), the LSTM-CRF model, and the BiLSTM model are conducted. The experimental results show that the method used in this paper achieves 92.72% accuracy, 92.30% recall, and a 92.51% F1 score, which is higher than that of the other models.
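The final masking stage can be pictured as regular-expression filtering plus replacement of the recognized entity spans. The patterns, the placeholder policy, and the assumed (start, end, label) span format of the NER output below are illustrative simplifications, not the paper's exact rules (which also involve encryption and PCA-compressed word vectors).

    import re

    # Illustrative patterns for identifiers commonly found in Chinese EMR text.
    PATTERNS = {
        "PHONE": re.compile(r"1\d{10}"),                 # 11-digit mobile number
        "ID_CARD": re.compile(r"\d{17}[\dXx]"),          # 18-digit resident ID
    }

    def mask_text(text, ner_spans):
        """Mask regex hits and NER-detected entities with typed placeholders."""
        # Replace NER spans first (from right to left so offsets stay valid).
        for start, end, label in sorted(ner_spans, key=lambda s: s[0], reverse=True):
            text = text[:start] + f"[{label}]" + text[end:]
        # Then sweep the remaining text with the regular expressions.
        for label, pattern in PATTERNS.items():
            text = pattern.sub(f"[{label}]", text)
        return text

    record = "Patient Zhang San, phone 13812345678, admitted with heart failure."
    spans = [(8, 17, "NAME")]                            # span produced by the NER model
    print(mask_text(record, spans))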
Readout electronics has been developed for a prototype spectrometer for in situ measurement of low-energy ions of 30 eV/e–20 keV/e in the solar wind plasma. A low-noise preamplifier/discriminator (A111F) is employed for each channel to process the signal from the micro-channel plate (MCP) detectors. A high-voltage (HV) supply solution based on an HV module and an HV optocoupler is adopted to generate a fast sweeping HV and a fixed HV. Because of the limited telemetry bandwidth in space communication, an algorithm is implemented in an FPGA (field-programmable gate array) to compress the raw data. Test results show that the electronics achieves a 1 MHz event rate and a large input dynamic range of 95 pC. A slew rate of 0.8 V/μs and an integral nonlinearity of 0.7 LSB for the sweeping HV, and a precision of less than 0.8% for the fixed HV, are obtained. A vacuum beam test shows that an energy resolution of 12 ± 0.7% full width at half maximum (FWHM) is achieved, and noise counts are less than 10 per second, indicating that the performance meets the physical requirements.
This paper proposes a useful web-based system for the management and sharing of electron probe micro-analysis (EPMA) data in geology. A new web-based architecture that integrates the management and sharing functions is developed and implemented. Earth scientists can utilize this system to not only manage their data, but also easily communicate and share it with other researchers. Data query methods provide the core functionality of the proposed management and sharing modules. The modules in this system have been developed using cloud GIS technologies, which help achieve real-time spatial area retrieval on a map. The system has been tested by approximately 263 users at Jilin University and Beijing SHRIMP Center. A survey was conducted among these users to estimate the usability of the primary functions of the system, and the assessment result is summarized and presented.
With the rapid development of China's reform and opening up and the socialist market economy, the development of Internet technology has promoted the prosperity of e-commerce and further promoted the rapid development of China's economy. Data mining is an advanced technology with important implications for e-commerce data processing. Through a summary of data mining technology, this article puts forward applications of data mining technology in electronic commerce, in order to better promote the development of electronic commerce.
In the field of electronic record management, especially in the current big data environment, data continuity has become a new topic that is as important as security and needs to be studied. This paper decomposes the data continuity guarantee of electronic records into a set of data protection requirements consisting of data relevance, traceability, and comprehensibility, and proposes to use associated data technology to provide an integrated guarantee mechanism to meet the above three requirements.
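One way to picture "associated data" is to publish a record and its context as linked-data triples, so that relevance, traceability, and comprehensibility become explicit, machine-readable links. The vocabulary and resource names below are invented for illustration and are not the mechanism defined in the paper.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/records#")        # illustrative vocabulary

    g = Graph()
    g.bind("ex", EX)

    # The record itself, the case it belongs to, and an earlier version:
    # relevance, traceability and comprehensibility expressed as explicit links.
    g.add((EX.record42, RDF.type, EX.ElectronicRecord))
    g.add((EX.record42, EX.relatesTo, EX.contractCase17))         # data relevance
    g.add((EX.record42, EX.derivedFrom, EX.record42_draft))       # traceability
    g.add((EX.record42, EX.describedBy,
           Literal("Signed purchase contract, 2023 revision")))   # comprehensibility

    print(g.serialize(format="turtle"))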
With the advent of the era of big data, the development of science and technology has produced a large number of electronic archives. How to guarantee the credential characteristics of electronic archives in the big data environment has attracted wide attention in the academic community. Provenance is an important technical means of guaranteeing the certification of electronic archives. In this paper, knowledge graph technology is used to provide concept provenance of electronic archives in the big data environment. This not only enriches provenance methods but also helps guarantee the certification of electronic archives in the big data environment.
B2C e-commerce activities involve vast amounts of customer data. Introducing data mining technology into electronic commerce provides a large amount of valuable business information to e-commerce enterprises and enhances their core competitiveness. Starting from the data mining model, this paper analyzes the steps of data mining and puts forward applications of data mining technology in B2C e-commerce.
Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical reports ensures that all information is available in a single place. However, it is difficult to store and manage large amounts of data. Dedicated servers and a data center are needed to store and manage patient data, but self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative, and its advantage is that the data can be retrieved anywhere and anytime using any device connected to the Internet. Therefore, doctors can easily access the medical history of a patient and diagnose diseases according to the context. It also helps prescribe the correct medicine to a patient in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns. These servers are often semi-trusted, so sensitive medical information must be protected. Open access to records, and modifications performed on the information in those records, may even cause patient fatalities. Patient-centric health-record security is therefore a major concern. End-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method that combines the advanced encryption standard with the elliptic curve Diffie-Hellman method, designed to increase the efficiency of medical record security for users. Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic curve and secret-sharing methods. This study aims to provide a high level of security for patient health records.
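A minimal sketch of the general combination described here (elliptic curve Diffie-Hellman to agree on a key, then AES to encrypt the record before it is outsourced) is shown below using the Python cryptography package. The curve, the HKDF parameters, and the use of AES-GCM are reasonable defaults chosen for the sketch, not necessarily the exact construction in the paper.

    import os
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Each party holds an EC key pair; exchanging public keys yields a shared secret.
    patient_key = ec.generate_private_key(ec.SECP256R1())
    hospital_key = ec.generate_private_key(ec.SECP256R1())
    shared_secret = patient_key.exchange(ec.ECDH(), hospital_key.public_key())

    # Derive a 256-bit AES key from the ECDH shared secret.
    aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"ehr-file-encryption").derive(shared_secret)

    # Encrypt the medical record before it leaves for the cloud server.
    record = b"Patient 1017: echocardiogram report, EF 38%, started on statin."
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, record, None)

    # The hospital derives the same key from its private key and the patient's public key.
    same_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"ehr-file-encryption").derive(
        hospital_key.exchange(ec.ECDH(), patient_key.public_key()))
    print(AESGCM(same_key).decrypt(nonce, ciphertext, None))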
Charge transport characterization of single-molecule junctions is essential for fundamental research in single-molecule physical chemistry and for the development of single-molecule electronic devices and circuits. Among single-molecule conductance characterization techniques, the single-molecule break junction technique is widely used in dozens of research laboratories worldwide and can generate a large amount of experimental data from thousands of individual measurement cycles. However, data interpretation is a challenging task for researchers with different research backgrounds, and differing data analysis approaches sometimes lead to misunderstanding of the measurement data and even to reproducibility issues. It is thus necessary to develop a user-friendly, all-in-one data analysis tool that automates the basic data analysis in a standard and widely accepted way. In this work, we present the XMe Code (Xiamen Molecular Electronics Code), an intelligent all-in-one data analysis tool for the comprehensive analysis of single-molecule break junction data. XMe Code provides end-to-end data analysis that takes in the original experimental data and returns electronic characteristics and even charge transport mechanisms. We believe that XMe Code will promote the transparency of data analysis in single-molecule electronics and collaborations among scientists with different research backgrounds.
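One basic step that tools of this kind automate is accumulating one-dimensional log-conductance histograms over thousands of breaking traces, where a molecular plateau appears as a peak on the tunneling background. The synthetic traces, the plateau near 10^-4 G0, and the binning below are illustrative stand-ins for experimental data, not output of XMe Code.

    import numpy as np

    rng = np.random.default_rng(7)

    def fake_trace(n_points=800):
        """Synthetic breaking trace: tunneling decay plus a molecular plateau near 1e-4 G0."""
        g = 10 ** np.linspace(0, -6, n_points) * rng.lognormal(0, 0.1, n_points)
        g[300:420] = 1e-4 * rng.lognormal(0, 0.15, 120)
        return g

    traces = [fake_trace() for _ in range(2000)]

    # 1-D histogram of log10(G/G0) accumulated over all traces: the molecular
    # conductance shows up as a peak on top of the smooth tunneling background.
    all_logg = np.log10(np.concatenate(traces))
    counts, edges = np.histogram(all_logg, bins=np.linspace(-6.5, 0.5, 141))
    mask = edges[:-1] < -2                               # look below 1e-2 G0
    peak_bin = counts[mask].argmax()
    print("most probable log10(G/G0) below 1e-2:", edges[:-1][mask][peak_bin])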
Four rice samples of the long grain type were tested using an electronic nose (Cyranose-320). Samples of 5 g of each variety of rice were placed individually in vials and analyzed with the electronic nose unit, which consists of 32 polymer sensors. The Cyranose-320 was able to differentiate between varieties of rice. The chemical composition of the rice odors used to differentiate rice samples needs to be investigated. The optimum parameter settings should be considered during the Cyranose-320 training process, especially for multiple samples, as they are helpful for obtaining an accurate training model and improving identification capability. Further, it is necessary to investigate E-nose sensor selection for obtaining better classification accuracy. A reduced number of sensors could potentially shorten the data processing time and could be used to establish an application procedure and reduce the cost of a specific electronic nose. Further research is needed to develop analytical procedures that adapt the Cyranose-320 as a tool for testing rice quality.
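The sensor-selection idea mentioned above can be sketched as follows: screen the 32-channel responses for the most discriminative sensors and compare classification accuracy with the full and reduced sets. The synthetic responses, the univariate F-test selection, and the SVM classifier are assumptions for illustration, not the Cyranose-320 training workflow itself.

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)
    n_samples, n_sensors = 120, 32
    # Synthetic sensor-array responses for 4 rice varieties; a few sensors carry signal.
    y = np.repeat(np.arange(4), n_samples // 4)
    X = rng.normal(0, 1, (n_samples, n_sensors))
    X[:, :6] += y[:, None] * 0.8                          # informative sensors 0-5

    # Keep only the k most discriminative sensors, then classify.
    for k in (32, 8):
        clf = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=k), SVC())
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{k:2d} sensors: mean CV accuracy = {acc:.2f}")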
Funding (HERD transition radiation detector readout paper): supported by the National Natural Science Foundation of China (Nos. 12375193, 11975292, 11875304), the CAS "Light of West China" Program, the Scientific Instrument Developing Project of the Chinese Academy of Sciences (No. GJJSTD20210009), and the CAS Pioneer Hundred Talent Program.
Funding (digital continuity guarantee paper): this work is supported by the NSFC (Nos. 61772280, 61772454), the Changzhou Sci&Tech Program (No. CJ20179027), and the PAPD fund from NUIST. Prof. Jin Wang is the corresponding author.
Funding (regional EHR cohort construction paper): supported by the National Major Scientific and Technological Special Project for "Significant New Drugs Development" (No. 2018ZX09201008) and the Special Fund Project for Information Development from the Shanghai Municipal Commission of Economy and Information (No. 201701013).
Funding (radiation belt electron flux forecast paper): supported by the National Natural Science Foundation of China (Grant Nos. 42025404, 42188101, and 42241143), the National Key R&D Program of China (Grant Nos. 2022YFF0503700 and 2022YFF0503900), the B-type Strategic Priority Program of the Chinese Academy of Sciences (Grant No. XDB41000000), and the Fundamental Research Funds for the Central Universities (Grant No. 2042022kf1012).
Funding (traffic violation analysis paper): the National Key Research and Development Program of China (No. 2019YFB1600200).
Funding (electronic medical record data masking paper): this research was supported by the National Natural Science Foundation of China under Grant No. 42050102 and the Postgraduate Education Reform Project of Jiangsu Province under Grant No. SJCX22_0343; it was also supported by the Dou Wanchun Expert Workstation of Yunnan Province (No. 202205AF150013).
Funding (solar wind ion spectrometer readout paper): supported by the National Key Scientific Instrument and Equipment Development Projects of the National Natural Science Foundation of China (No. 41327802) and the Fundamental Research Funds for the Central Universities (No. WK2030040066).
Funding (EPMA data management and sharing system paper): the National Major Scientific Instruments and Equipment Development Special Funds, China (No. 2016YFF0103303), and the National Science and Technology Support Program, China (No. 2014BAK02B03).
Funding (data continuity of electronic records paper): this work is supported by the NSFC (No. 61772280), the national training programs of innovation and entrepreneurship for undergraduates (Nos. 201910300123Y, 202010300200), and the PAPD fund from NUIST.
Funding (electronic archives provenance paper): this work is supported by the NSFC (Grant No. 61772280), the National Training Programs of Innovation and Entrepreneurship for Undergraduates (Grant Nos. 201910300123Y, 201810300165), and the PAPD Fund from NUIST.
Funding (XMe Code single-molecule data analysis paper): supported by the National Natural Science Foundation of China (22325303, 21973079, 22032004), the National Key R&D Program of China (2017YFA0204902), the Fundamental Research Funds for the Central Universities in China (Xiamen University, 20720190002), IRTSTFJ, the National Science Foundation of Fujian Province (2018J06004), and the Beijing National Laboratory for Molecular Sciences (BNLMS202005).
Funding (electronic nose rice testing paper): support from the Doctoral Fund of the Ministry of Education of China (No. 20070224003), an overseas research project of the Heilongjiang Province Education Agency, China (No. 1151HZ01), and a research project of the Heilongjiang Province Education Agency, China (No. 10531002).