Funding: The Science and Technology of Henan Province under contract No. 212102310029; the National Natural Science Foundation Cultivation Project of Xuchang University under contract No. 2022GJPY007; the Educational Teaching Research and Practice Project of Xuchang University under contract No. XCU2021-YB-024.
Abstract: This study analyzes the signal quality and the accuracy of BeiDou 3rd generation Satellite Navigation System (BDS3) Precise Point Positioning (PPP) in the Arctic Ocean. The assessment of BDS3 signal quality covers signal-to-noise ratio (SNR), multipath (MP), dilution of precision (DOP), and the code-minus-carrier combination (CC). The results show that 5 to 13 satellites were visible at any time in the Arctic Ocean as of September 2018, which is sufficient for positioning. In the mid-latitude oceanic region and in the Arctic Ocean, the SNR is 25–52 dB-Hz and the MP ranges from −2 m to 2 m. As the latitude increases, the DOP values show large variation, which may be related to the distribution of BDS satellites. The CC values of the B1I and B1C signals range from −5 m to 5 m in the mid-latitude sea area and the Arctic Ocean, indicating that the effect of pseudorange noise is small. Moreover, because an external precise reference for GNSS positioning is difficult to obtain in the Arctic Ocean, it is hard to evaluate the accuracy of the positioning results directly. An improved isotropy-based protection level method based on Receiver Autonomous Integrity Monitoring (RAIM) is therefore proposed, which adopts a median filter to smooth gross errors in order to assess the precision and reliability of PPP in the Arctic Ocean. First, the improved algorithm is verified with data from the International GNSS Service (IGS) station TIXI. Then the accuracy of BDS3 PPP in the Arctic Ocean is evaluated with the improved algorithm, which shows that the kinematic PPP accuracy reaches the decimeter level in both the horizontal and vertical directions and meets the precision requirements of maritime navigation.
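For illustration, the code-minus-carrier (CC) combination and the median-filter screening of gross errors mentioned above can be sketched as follows. This is a minimal Python sketch under stated assumptions, not the authors' implementation: the function names, window length, and MAD-based outlier test are chosen here for illustration, and the paper's protection-level computation itself is not reproduced.

```python
import numpy as np

def code_minus_carrier(pseudorange_m, carrier_phase_m):
    """Code-minus-carrier (CC) combination for one signal.

    Both inputs are in metres (carrier phase already scaled by its
    wavelength). Removing the mean strips the constant ambiguity and
    most of the slowly varying ionospheric term, so the residual mainly
    reflects pseudorange noise and multipath.
    """
    cc = np.asarray(pseudorange_m, dtype=float) - np.asarray(carrier_phase_m, dtype=float)
    return cc - np.nanmean(cc)

def median_smooth(series, window=11, k=3.0):
    """Replace gross errors in a position/error series with a running
    median (a rough stand-in for the median filtering step of the
    improved protection-level method; settings are illustrative)."""
    x = np.asarray(series, dtype=float)
    out = x.copy()
    half = window // 2
    for i in range(len(x)):
        win = x[max(0, i - half):i + half + 1]
        med = np.nanmedian(win)
        mad = np.nanmedian(np.abs(win - med)) + 1e-9
        if abs(x[i] - med) > k * 1.4826 * mad:  # robust outlier test
            out[i] = med
    return out
```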
Funding: Support of the European Commission ETER Project (No. 934533-2017-AO8-CH) and the H2020 RISIS 2 project (No. 824091).
Abstract:
Purpose: This paper describes the definition of data quality procedures for knowledge organizations such as Higher Education Institutions. The main purpose is to present the flexible approach developed for monitoring the data quality of the European Tertiary Education Register (ETER) database, illustrating how it works and highlighting the main challenges that still have to be faced in this domain.
Design/methodology/approach: The proposed data quality methodology is based on two kinds of checks, one to assess the consistency of cross-sectional data and the other to evaluate the stability of multiannual data. The methodology has an operational and empirical orientation: the proposed checks do not assume any theoretical distribution for determining the threshold parameters that identify potential outliers, inconsistencies, and errors in the data.
Findings: We show that the proposed cross-sectional and multiannual checks are helpful for identifying outliers and extreme observations and for detecting ontological inconsistencies not described in the available metadata. For this reason, they may be a useful complement to the processing of the available information.
Research limitations: The coverage of the study is limited to European Higher Education Institutions. The cross-sectional and multiannual checks are not yet completely integrated.
Practical implications: Considering the quality of the available data and information is important for data quality-aware empirical investigations, highlighting problems and areas where investment is needed to improve the coverage and interoperability of data in future data collection initiatives.
Originality/value: The data-driven quality checks proposed in this paper may serve as a reference for building and monitoring the data quality of new databases, or of existing databases for other countries or systems characterized by high heterogeneity and complexity of the units of analysis, without relying on pre-specified theoretical distributions.
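As a rough illustration of what such distribution-free checks might look like, the sketch below flags cross-sectional outliers with empirical quantile bounds and multiannual instability with a year-over-year relative-change threshold. It is a hypothetical Python/pandas sketch, not the ETER implementation; the column names, quantiles, and threshold are assumptions.

```python
import pandas as pd

def cross_sectional_check(df, num, den, lo_q=0.01, hi_q=0.99):
    """Flag units whose ratio num/den lies outside empirical quantile
    bounds computed from the data themselves (no distributional
    assumption)."""
    ratio = df[num] / df[den]
    lo, hi = ratio.quantile(lo_q), ratio.quantile(hi_q)
    return df.assign(ratio=ratio, flagged=(ratio < lo) | (ratio > hi))

def multiannual_check(df, id_col, year_col, var, max_rel_change=0.5):
    """Flag year-over-year changes of a variable larger than a chosen
    relative-change threshold."""
    df = df.sort_values([id_col, year_col]).copy()
    df["rel_change"] = df.groupby(id_col)[var].pct_change().abs()
    df["flagged"] = df["rel_change"] > max_rel_change
    return df
```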
Funding: Supported by the China National Special Fund for Earthquake Scientific Research (201508003, 201508009).
Abstract: TEQC is used to check the observation quality of 173 GPS campaign stations in Northeast and North China. Each station was occupied for 4 days. The quality of the 692 data files is analyzed using the ratio of actual observations to possible observations, MP1, MP2, and the ratio of observations to cycle slips. The causes of multipath and cycle slips can be inferred from photos taken in the field. The results show that obstruction by trees and buildings/structures and interference from high-voltage power lines near the stations are the main causes. As an example, the horizontal velocity field of a small area for the period 2011–2013 is examined, where the magnitudes and directions of four stations' rates clearly differ from those of the other stations. It appears that errors caused by a poor observing environment cannot be mitigated through post-processing. These conclusions can therefore inform the establishment of GNSS stations, field measurement, data processing, and the formulation of standards in the future.
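The screening of files by these indicators could be expressed roughly as below. This is a hypothetical Python sketch assuming the completeness ratio, MP1, MP2, and observations-per-slip values have already been extracted from the TEQC summary reports; the field names and thresholds are illustrative, not the criteria used in the study.

```python
from dataclasses import dataclass

@dataclass
class StationQC:
    """Per-file quality summary, assumed already extracted from the
    TEQC report (field names here are illustrative)."""
    station: str
    obs_have: int        # observations actually recorded
    obs_possible: int    # observations theoretically possible
    mp1_m: float         # MP1 multipath RMS (m)
    mp2_m: float         # MP2 multipath RMS (m)
    obs_per_slip: float  # observations per cycle slip

def passes_thresholds(qc: StationQC,
                      min_completeness: float = 0.85,
                      max_mp_m: float = 0.5,
                      min_obs_per_slip: float = 200.0) -> bool:
    """Simple screening rule with example thresholds."""
    completeness = qc.obs_have / qc.obs_possible
    return (completeness >= min_completeness
            and max(qc.mp1_m, qc.mp2_m) <= max_mp_m
            and qc.obs_per_slip >= min_obs_per_slip)
```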
Funding: Project 40301042 supported by the Natural Science Foundation of China.
Abstract: In this paper the application of spatialization technology to metadata quality checking and updating is discussed. A new spatialization-based method is proposed for checking and updating metadata, overcoming the deficiencies of text-based methods by exploiting the powerful spatial query and analysis functions provided by GIS software. The method employs spatialization to transform metadata into a coordinate space and uses the spatial analysis functions of GIS to check and update spatial metadata in a visual environment. The basic principle and technical flow of the method are explained in detail, and an implementation example using the ArcMap GIS software is illustrated with a metadata set of digital raster maps. The results show that the new method, supported by the interaction of graphics and text, is much more intuitive and convenient than the ordinary text-based method and can make full use of GIS spatial query and analysis functions with greater accuracy and efficiency.
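A minimal sketch of the spatialization idea, under stated assumptions, is given below. It uses the open-source shapely library as a stand-in for the ArcMap workflow described in the paper; the metadata extent fields, the index-sheet polygon, and the overlap check are all hypothetical choices for illustration.

```python
from shapely.geometry import box

def extent_to_polygon(record):
    """'Spatialize' a metadata record: turn its bounding coordinates
    into a polygon (field names are hypothetical)."""
    return box(record["west"], record["south"], record["east"], record["north"])

def check_against_sheet(record, sheet_polygon):
    """Flag metadata whose declared extent does not fall inside the map
    sheet it claims to describe; such mismatches could then be inspected
    and corrected interactively in a GIS environment."""
    poly = extent_to_polygon(record)
    return {
        "id": record["id"],
        "inside_sheet": sheet_polygon.contains(poly),
        "overlap_ratio": poly.intersection(sheet_polygon).area / poly.area,
    }

# Example: a raster map sheet covering 110-113 deg E, 30-33 deg N
sheet = box(110.0, 30.0, 113.0, 33.0)
rec = {"id": "DRG-001", "west": 110.5, "south": 30.5, "east": 112.5, "north": 32.5}
print(check_against_sheet(rec, sheet))
```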