Funding: supported by the National Natural Science Foundation of China (61307121), the ABRP of Datong (2017127), and the Ph.D.'s Initiated Research Projects of Datong University (2013-B-17, 2015-B-05).
Abstract: A great challenge faced by wireless sensor networks (WSNs) is reducing the energy consumption of sensor nodes. Fortunately, data gathering via random sensing can save sensor-node energy. Nevertheless, its randomness and density usually lead to difficult implementations, high computational complexity, and large storage requirements in practical settings, so deterministic sparse sensing matrices are desirable in some situations. However, it is difficult to guarantee the performance of a deterministic sensing matrix by the acknowledged metrics. In this paper, we construct a class of deterministic sparse sensing matrices satisfying the statistical restricted isometry property (StRIP) via regular low-density parity-check (RLDPC) matrices. The key idea of our construction is to achieve small mutual coherence by confining the column weights of the RLDPC matrices such that the StRIP is satisfied. Besides, we prove that the constructed sensing matrices require the same order of measurements as dense measurement matrices. We also propose a data gathering method based on the RLDPC matrix. Experimental results verify that the constructed sensing matrices achieve better reconstruction performance than the Gaussian, Bernoulli, and CSLDPC matrices, and that data gathering via the RLDPC matrix reduces the energy consumption of WSNs.
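The key step above — confining column weights so the mutual coherence of a sparse binary sensing matrix stays small — can be sketched as follows. This is a simplified stand-in, not the paper's exact RLDPC construction (which also regularizes row weights); all dimensions and the column weight are illustrative:

```python
import numpy as np

def sparse_binary_sensing_matrix(m, n, col_weight, seed=0):
    """Build an m x n sparse binary matrix with exactly `col_weight`
    ones per column, then normalize columns to unit norm.  A simplified
    stand-in for an RLDPC-based sensing matrix."""
    rng = np.random.default_rng(seed)
    A = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=col_weight, replace=False)
        A[rows, j] = 1.0
    return A / np.sqrt(col_weight)

def mutual_coherence(A):
    """Largest absolute inner product between distinct unit-norm columns."""
    G = A.T @ A
    np.fill_diagonal(G, 0.0)
    return np.abs(G).max()

A = sparse_binary_sensing_matrix(m=64, n=256, col_weight=4)
mu = mutual_coherence(A)
```

Smaller column weight makes the matrix sparser (and cheaper to apply in a WSN), but the weight must be large enough that coherence stays small and the StRIP bound holds.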
Funding: Project 40301042 supported by the Natural Science Foundation of China.
Abstract: This paper discusses the application of spatialization technology to metadata quality checking and updating. A new spatialization-based method is proposed for checking and updating metadata, overcoming the deficiency of text-based methods by exploiting the powerful spatial query and analysis functions provided by GIS software. The method employs spatialization to transform metadata into a coordinate space and uses the spatial analysis functions of GIS to check and update spatial metadata in a visual environment. The basic principle and technical flow of the method are explained in detail, and an implementation using the ArcMap GIS software is illustrated with a metadata set of digital raster maps. The results show that the new method, supported by the interaction of graphics and text, is much more intuitive and convenient than ordinary text-based methods and can fully utilize GIS spatial query and analysis functions with greater accuracy and efficiency.
Funding: supported by the Ministry of Education under the Basic Science Research Program, Grant No. NRF-2013R1A1A2061478.
Abstract: This paper describes a data transmission method using a cyclic redundancy check (CRC) and inaudible frequencies. The proposed method uses inaudible high frequencies from 18 kHz to 22 kHz generated by the inner speaker of smart devices. The performance of the proposed method is evaluated through data transmission tests between a smart book and a smartphone. The test results confirm that the proposed method can send 32 bits of data in an average of 235 ms, that the transmission success rate reaches 99.47%, and that the error detection rate of the cyclic redundancy check is 0.53%.
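The CRC framing step can be sketched as follows. The abstract does not state which CRC polynomial the authors use; CRC-32 via `zlib` is purely illustrative, as is the 4-byte payload standing in for the 32-bit data word:

```python
import zlib

def frame_with_crc(payload: bytes) -> bytes:
    """Append a CRC-32 checksum so the receiver can detect corruption
    introduced by the acoustic channel."""
    crc = zlib.crc32(payload).to_bytes(4, "big")
    return payload + crc

def check_crc(frame: bytes) -> bool:
    """Return True iff the trailing CRC matches the payload."""
    payload, crc = frame[:-4], frame[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == crc

frame = frame_with_crc(b"\x12\x34\x56\x78")          # 32-bit payload
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]     # single bit flip
```

A receiver that hears a corrupted frame (e.g. the single-bit flip above) rejects it, which is the mechanism behind the 0.53% error-detection figure.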
Funding: Outstanding Youth Foundation of the Hunan Provincial Department of Education (Grant No. 22B0911).
Abstract: In this paper, we introduce the censored composite conditional quantile coefficient (cCCQC) to rank the relative importance of each predictor in high-dimensional censored regression. The cCCQC takes advantage of all useful information across quantiles and can effectively detect nonlinear effects, including interactions and heterogeneity. Furthermore, the proposed screening method based on the cCCQC is robust to the existence of outliers and enjoys the sure screening property. Simulation results demonstrate that the proposed method performs competitively on survival datasets with high-dimensional predictors, particularly when the variables are highly correlated.
Abstract: The traditional printing checking method usually uses printing control strips, but the results are not satisfactory in repeatability and stability. In this paper, image-based methods for checking printing quality are taken as the research object. Building on traditional printing quality checking methods, and combining digital image processing methods and theory with printing theory in the new domain of image quality checking, a printing quality checking system based on image processing is constructed, and the theoretical design and model of this system are expounded. This is an application of machine vision. The system uses a high-resolution industrial CCD (charge-coupled device) color camera; it displays real-time photographs on the monitor, feeds the video signal to an image gathering card, and then transmits the image data through the computer's PCI bus to memory, while the system simultaneously carries out processing and data analysis. The method is validated by experiments, which mainly concern the data conversion of images and the ink limit display of printing.
Funding: supported by the Natural Science Foundation of Hunan Province (2018JJ2282).
Abstract: Reliability enhancement testing (RET) is an accelerated testing method that hastens the performance degradation process to surface inherent defects of design and manufacture. An important hypothesis is that the degradation mechanism under RET is the same as that under normal stress conditions. To check the consistency of the two mechanisms, we conduct two enhancement tests with a missile servo system as the object of study, and preprocess the two sets of test data to establish accelerated degradation models with respect to the temperature change rate, which is assumed to be the main applied stress on the servo system during natural storage. Based on the accelerated degradation models and the natural storage profile of the servo system, we provide and demonstrate a procedure for checking the consistency of the two mechanisms by examining the correlation and difference of the two sets of degradation data. The results indicate that the two degradation mechanisms are significantly consistent with each other.
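The correlation half of such a consistency check can be sketched as follows. This is a crude stand-in for the paper's procedure (which also tests the difference of the two data sets); the synthetic degradation paths and the 0.9 threshold are illustrative assumptions:

```python
import numpy as np

def mechanism_consistency(deg_a, deg_b, rho_min=0.9):
    """Correlation check between two degradation paths sampled at the
    same inspection times: strong positive correlation is evidence that
    the same underlying mechanism drives both.  `rho_min` is an
    illustrative acceptance threshold, not the paper's criterion."""
    rho = np.corrcoef(deg_a, deg_b)[0, 1]
    return rho, rho >= rho_min

# Synthetic degradation data: a shared linear trend plus small,
# path-specific fluctuations.
t = np.linspace(0.0, 10.0, 50)
path_a = 0.8 * t + 0.05 * np.sin(t)
path_b = 0.8 * t + 0.05 * np.cos(t)
rho, consistent = mechanism_consistency(path_a, path_b)
```

In practice one would run this on the preprocessed RET and normal-condition degradation increments rather than raw readings.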
Abstract: Based on left-truncated and right-censored dependent data, estimators of the higher derivatives of the density function and the hazard rate function are given by the kernel smoothing method. When the observed data exhibit α-mixing dependence, local properties including strong consistency and the law of the iterated logarithm are presented. Moreover, when the mode estimator is defined as the random variable that maximizes the kernel density estimator, the asymptotic normality of the mode estimator is established.
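The mode estimator named above — the maximizer of the kernel density estimate — can be sketched for the plain uncensored i.i.d. case (the article's setting adds truncation, censoring, and mixing dependence; the bandwidth and grid size here are illustrative):

```python
import numpy as np

def kernel_density(x_grid, data, h):
    """Gaussian-kernel density estimate evaluated on a grid."""
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def mode_estimator(data, h=0.3, grid_size=512):
    """Mode estimate: the grid point maximizing the kernel density."""
    grid = np.linspace(data.min(), data.max(), grid_size)
    return grid[np.argmax(kernel_density(grid, data, h))]

rng = np.random.default_rng(1)
sample = rng.normal(loc=2.0, scale=1.0, size=2000)
m_hat = mode_estimator(sample)   # should land near the true mode, 2.0
```

The asymptotic normality result concerns exactly this maximizer: as the sample grows and the bandwidth shrinks at the right rate, `m_hat` concentrates around the true mode.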
Funding: supported by the National Natural Science Foundation of China (10571008), the Natural Science Foundation of Henan (092300410149), and the Core Teacher Foundation of Henan (2006141).
Abstract: In this article, a partially linear single-index model for longitudinal data is investigated. Generalized penalized spline least squares estimates of the unknown parameters are suggested. All parameters can be estimated simultaneously by the proposed method while the features of longitudinal data are taken into account. The existence, strong consistency, and asymptotic normality of the estimators are proved under suitable conditions. A simulation study is conducted to investigate the finite-sample performance of the proposed method. Our approach can also be used to study the pure single-index model for longitudinal data.
Funding: supported by the National Natural Science Foundation of China (60673115), the National Basic Research Program of China (973 Program) (2002CB312001), and the Open Foundation of the State Key Laboratory of Software Engineering (SKLSE05-13).
Abstract: A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram, as the object model, is employed to describe the object structure of a Web application design and can be translated into the behavior model. A key problem of model-checking-based test generation for a Web application is how to construct a set of trap properties intended to cause violations when model checking the behavior model, so that the output counterexamples can be used to construct the test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.
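The trap-property idea can be sketched on a toy navigation model. A trap property for edge coverage asserts "edge (u, v) is never taken"; any path through that edge violates it, and the model checker's counterexample is the test sequence. Here a BFS path search stands in for the model checker, and the page names are invented for illustration:

```python
from collections import deque

# Toy navigation model of a Web application: pages and link transitions.
NAV = {
    "home":    ["login", "search"],
    "login":   ["account"],
    "search":  ["results"],
    "results": ["home"],
    "account": ["home"],
}

def path_to_edge(start, edge):
    """Shortest path from `start` through `edge` -- the counterexample a
    model checker would return for the trap property 'edge never taken'."""
    u, v = edge
    q, seen = deque([[start]]), {start}
    while q:
        path = q.popleft()
        if path[-1] == u:
            return path + [v]
        for nxt in NAV.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                q.append(path + [nxt])
    return None   # edge unreachable from start

def edge_coverage_tests(start="home"):
    """One test sequence per transition, per the edge-coverage criterion."""
    edges = [(u, v) for u, vs in NAV.items() for v in vs]
    return {e: path_to_edge(start, e) for e in edges}

tests = edge_coverage_tests()
```

Node coverage works the same way with trap properties of the form "page p is never reached".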
Funding: This paper is financially supported by the National Important Mining Zone Database (No. 200210000004) and Prediction and Assessment of Mineral Resources and Social Service (No. 1212010331402).
Abstract: Geo-data is a foundation for the prediction and assessment of ore resources, so managing and making full use of those data, including the geography, geology, mineral deposits, aeromagnetics, gravity, geochemistry, and remote sensing databases, is very significant. We developed the national important mining zone database (NIMZDB) to manage 14 national important mining zone databases to support a new round of ore deposit prediction. We found that attention should be paid to the following issues: ① data accuracy: integrity, logical consistency, and attribute, spatial, and temporal accuracy; ② management of both attribute and spatial data in the same system; ③ transforming data between MapGIS and ArcGIS; ④ data sharing and security; ⑤ data searches that can query both attribute and spatial data. The accuracy of input data is guaranteed, and searching, analyzing, and translating data between MapGIS and ArcGIS have been made convenient through the development of a data checking module and a data managing module based on MapGIS and ArcGIS. Using ArcSDE, data sharing is based on a client/server system, and attribute and spatial data are also managed in the same system.
Funding: supported by a project of the National Natural Science Foundation of China (No. 41772346).
Abstract: This study introduces the site selection and data processing of GNSS receiver calibration networks. According to the design requirements and relevant specifications, the authors investigate the observation conditions of the potential sites and collect experimental GNSS observation data. TEQC is used to evaluate the data availability rate and the multipath effects of the observation data to determine the appropriate site. After the construction and measurement of the calibration network, the baseline processing of the medium- and long-baseline network is conducted with GAMIT. The accuracy indexes, including the NRMS, the differences between repeated baselines, and the closures of independent observation loops, all meet the specified criteria.
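One of the accuracy indexes named above, the closure of an independent observation loop, is the residual left when baseline vectors are summed around a closed loop. A minimal sketch with hypothetical baseline components (the real acceptance limit scales with baseline length per the specification, so the fixed tolerance here is illustrative):

```python
import numpy as np

def loop_closure(baselines, tol=0.02):
    """Sum baseline vectors (dX, dY, dZ in metres) around a loop; the
    norm of the sum is the closure error, compared against `tol`."""
    w = np.sum(np.asarray(baselines), axis=0)
    err = float(np.linalg.norm(w))
    return err, err <= tol

# Hypothetical GAMIT baseline solutions forming one independent loop.
loop = [
    (1520.431,  -310.208,  95.114),
    (-780.115,  1102.664, -40.552),
    (-740.312,  -792.451, -54.559),
]
closure, ok = loop_closure(loop)   # a few millimetres of misclosure
```

Repeated-baseline differences are checked analogously, by differencing two solutions of the same baseline instead of summing around a loop.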
Funding: supported by the research fund of Chungnam National University.
Abstract: Blockchain is a technology that provides security features usable for more than just cryptocurrencies. Blockchain achieves security by saving the information of one block in the next block: changing the information of one block requires changing all subsequent blocks for that change to take effect, which makes such an attack infeasible. However, the structure of blockchain leaves the last block always vulnerable to attacks, given that its information is not yet saved in any other block. This allows a malicious node to change the information of the last block, generate a new block, and broadcast it to the network. Since the nodes always follow the longest-chain-wins rule, the malicious node will win because it has the longest chain in the network. This paper suggests a solution to this issue by making the nodes send consistency check messages before broadcasting a block. If the nodes successfully verify that the node that generated a new block has not tampered with the blockchain, then that block is broadcast. The simulation results show that the suggested protocol provides better security than the regular blockchain.
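The consistency check a node could run before accepting or broadcasting a block amounts to verifying the hash chain. A minimal sketch (the paper's protocol exchanges such checks as messages between nodes; the block field names here are illustrative):

```python
import hashlib
import json

def block_hash(block):
    """Hash of a block's contents, via order-stable JSON serialization."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def chain_is_consistent(chain):
    """Every block must store the hash of its predecessor; any tampering
    with an earlier block breaks the link to its successor."""
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev_hash"] != block_hash(prev):
            return False
    return True

genesis = {"index": 0, "prev_hash": "0" * 64, "data": "genesis"}
b1 = {"index": 1, "prev_hash": block_hash(genesis), "data": "tx-a"}
b2 = {"index": 2, "prev_hash": block_hash(b1), "data": "tx-b"}
chain = [genesis, b1, b2]

# A tampered middle block is detected, because b2 no longer links to it.
tampered = [genesis, {**b1, "data": "tx-forged"}, b2]
```

The last block has no successor to anchor it, which is exactly the window the paper's pre-broadcast consistency check is designed to close.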
Abstract: The parameter estimation and the coefficient of contamination for regression models with repeated measures are studied when the response variables are contaminated by another random variable sequence. Under suitable conditions, it is proved that the estimators established in this paper are strongly consistent.
Abstract: A new method for constructing contours from complicated terrain elevation grids containing invalid data is put forward. By using this method, the topological consistency of contours in groups can be maintained effectively, and the contours can be drawn smoothly based on boundary pre-searching and local correction. An experimental example demonstrates that the contours constructed by this method are of good quality.