Funding: Supported by the National Natural Science Foundation of China (NSFC) (62102232, 62122042, 61971269) and the Natural Science Foundation of Shandong Province (ZR2021QF064).
Abstract: As a combination of edge computing and artificial intelligence, edge intelligence has become a promising technique that provides its users with fast, precise, and customized services. In edge intelligence, when learning agents are deployed on the edge side, data aggregation from the end side to the designated edge devices is an important research topic. Considering the varying importance of end devices, this paper studies the weighted data aggregation problem in a single-hop end-to-edge communication network. First, to ensure that all end devices with various weights are treated fairly in data aggregation, a distributed end-to-edge cooperative scheme is proposed. Then, to handle the massive contention on the wireless channel caused by the end devices, a multi-armed bandit (MAB) algorithm is designed to help the end devices find their most appropriate update rates. Unlike traditional data aggregation works, incorporating the MAB gives our algorithm higher efficiency in data aggregation. Through a theoretical analysis, we show that the efficiency of our algorithm is asymptotically optimal. Comparative experiments with previous works are also conducted to demonstrate the strength of our algorithm.
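To make the MAB step concrete, the sketch below shows how an end device could use a standard UCB1 bandit to pick an update rate from a small candidate set. The candidate rates, the reward model (successful aggregation without channel contention), and all names are illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal UCB1 sketch for choosing an end device's update rate.
# Hypothetical setup: each arm is a candidate update rate; the reward
# is assumed to be observed after each aggregation round.
import math
import random

rates = [0.1, 0.2, 0.5, 1.0]          # candidate update rates (arms)
counts = [0] * len(rates)             # times each arm was played
values = [0.0] * len(rates)           # running mean reward per arm

def reward(rate):
    # Stand-in for the observed aggregation success: higher rates send
    # more updates but raise contention, so success decays with rate.
    return 1.0 if random.random() < 4.0 * rate * math.exp(-2.0 * rate) else 0.0

for t in range(1, 10001):
    if t <= len(rates):               # play every arm once first
        arm = t - 1
    else:                             # UCB1 index: mean + exploration bonus
        arm = max(range(len(rates)),
                  key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))
    r = reward(rates[arm])
    counts[arm] += 1
    values[arm] += (r - values[arm]) / counts[arm]   # incremental mean update

print(rates[max(range(len(rates)), key=lambda a: counts[a])])  # most-played rate
```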
Funding: The authors thank the International Center for Global Earth Models (ICGEM) for the height anomaly and gravity anomaly data and the Bureau Gravimetrique International (BGI) for the free-air gravity anomaly data from the World Gravity Map project (WGM2012). The authors are grateful to the Główny Urząd Geodezji i Kartografii of Poland for the height anomaly data of the quasi-geoid PL-geoid2021.
Abstract: At present, one of the methods used to determine the height of points on the Earth's surface is Global Navigation Satellite System (GNSS) leveling. Orthometric or normal heights can be determined by this method only if a geoid or quasi-geoid height model is available. This paper proposes a methodology for the local correction of the heights of high-order global geoid models such as EGM08, EIGEN-6C4, GECO, and XGM2019e_2159. The methodology was tested in different areas of the research field, covering various relief forms. The dependence of the corrected height accuracy on the input data was analyzed, and the correction was also conducted for model heights in three tidal systems: "tide free", "mean tide", and "zero tide". The results show that the heights of the EIGEN-6C4 model can be corrected to an accuracy of up to 1 cm for flat and foothill terrains over areas of 1°×1°, 2°×2°, and 3°×3°; the EGM08 model gives an almost identical result. The EIGEN-6C4 model is best suited for mountainous relief, providing an accuracy of 1.5 cm over a 1°×1° area. The height correction accuracy of the GECO and XGM2019e_2159 models is somewhat poorer and exhibits numerical fluctuation.
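As an illustration of the kind of local correction involved, the sketch below fits a simple corrector surface to the differences between GNSS/leveling heights and a global model's geoid heights at a few benchmarks. The bilinear surface, the benchmark coordinates, and the numbers are assumptions for illustration; the paper's methodology is more elaborate.

```python
# Sketch of a local corrector surface for a global geoid model, assuming
# a small set of GNSS/leveling benchmarks. A surface
# d(phi, lam) = a0 + a1*phi + a2*lam is fitted by least squares to the
# differences (GNSS/leveling-derived minus model geoid height).
import numpy as np

phi  = np.array([49.1, 49.4, 49.8, 50.2, 50.5])       # latitude, deg
lam  = np.array([23.6, 24.1, 23.9, 24.4, 24.0])       # longitude, deg
diff = np.array([0.031, 0.028, 0.035, 0.025, 0.030])  # observed minus model, m

A = np.column_stack([np.ones_like(phi), phi, lam])    # design matrix
coef, *_ = np.linalg.lstsq(A, diff, rcond=None)       # a0, a1, a2

def corrected_geoid_height(N_model, p, l):
    # Add the locally fitted correction to the global model's value.
    return N_model + coef @ np.array([1.0, p, l])

print(corrected_geoid_height(30.215, 49.9, 24.0))
```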
Abstract: Weighted fusion algorithms, which can be applied in the area of multi-sensor data fusion, are developed based on the weighted least squares method. A weighted fusion algorithm that establishes the relationship between the weight coefficients and the measurement noise is proposed, taking the correlation of the measurement noise into account. A simplified weighted fusion algorithm is then deduced under the assumption that the measurement noise is uncorrelated. In addition, an algorithm is presented that adjusts the weight coefficients in the simplified algorithm by estimating the measurement noise from the measurements. Simulations and experiments show that the precision of a multi-sensor system based on these algorithms is better than that of systems based on other algorithms.
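For the uncorrelated-noise case, the simplified weighted fusion reduces to inverse-variance weighting, which is the weighted-least-squares optimum for independent sensors; a minimal sketch follows. The sensor readings and variances are placeholder values, and the paper additionally estimates the noise statistics from the measurements themselves.

```python
# Inverse-variance weighted fusion of several sensors observing one
# quantity, assuming known, uncorrelated measurement noise variances.
import numpy as np

z      = np.array([10.2, 9.8, 10.5])     # sensor measurements
sigma2 = np.array([0.04, 0.09, 0.25])    # measurement noise variances

w = (1.0 / sigma2) / np.sum(1.0 / sigma2)   # normalized inverse-variance weights
x_hat = np.sum(w * z)                        # fused estimate
var_hat = 1.0 / np.sum(1.0 / sigma2)         # variance of the fused estimate

print(x_hat, var_hat)                        # fused value is more precise
```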
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 11474236, 81671674, and 11775184) and the Science and Technology Project of Fujian Province, China (Grant No. 2016Y0078).
Abstract: The ill-posed inverse problem in quantitative susceptibility mapping (QSM) is usually solved with a regularized optimization solver, which is time consuming given the three-dimensional volume data. In clinical diagnosis, however, it is necessary to reconstruct a susceptibility map efficiently with an appropriate method. Here, a modified QSM reconstruction method called weighted total variation using split Bregman (WTVSB) is proposed. It reconstructs the susceptibility map with fast computation and effective artifact suppression by incorporating noise-suppressed data weighting into the split Bregman iteration. The noise-suppressed data weighting is determined from the Laplacian of the calculated local field, which prevents noise and errors in the field maps from spreading into the susceptibility inversion. The split Bregman iteration accelerates the solution of the L1-regularized reconstruction model by using a preconditioned conjugate gradient solver. In experiments, the proposed method is compared with truncated k-space division (TKD), morphology-enabled dipole inversion (MEDI), and total variation using split Bregman (TVSB) on numerical simulation, phantom, and in vivo human brain data, evaluated by root mean square error and mean structural similarity. Experimental results demonstrate that the proposed method achieves a better balance between accuracy and efficiency of QSM reconstruction than conventional methods, thus facilitating clinical applications of QSM.
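To show the structure of the split Bregman iteration the abstract relies on, namely alternating a quadratic subproblem with a soft-shrinkage step and a Bregman update, here is a one-dimensional toy for an L1 (total variation) regularized denoising problem. It is a sketch under assumed parameters, not the paper's 3-D QSM solver or its preconditioned conjugate gradient step.

```python
# 1-D split Bregman for min_u  mu/2 * ||u - f||^2 + |D u|_1,
# illustrating the quadratic-solve / shrinkage / Bregman-update loop.
import numpy as np

n = 100
rng = np.random.default_rng(0)
truth = np.repeat([0.0, 1.0, 0.3], [30, 40, 30])   # piecewise-constant signal
f = truth + 0.05 * rng.standard_normal(n)          # noisy observation

D = np.diff(np.eye(n), axis=0)                     # forward-difference operator
mu, lam = 50.0, 1.0                                # assumed parameters
u, d, b = f.copy(), np.zeros(n - 1), np.zeros(n - 1)
A = mu * np.eye(n) + lam * D.T @ D                 # fixed quadratic system

for _ in range(50):
    u = np.linalg.solve(A, mu * f + lam * D.T @ (d - b))   # quadratic subproblem
    t = D @ u + b
    d = np.sign(t) * np.maximum(np.abs(t) - 1.0 / lam, 0)  # soft shrinkage
    b = t - d                                              # Bregman update

print(np.round(u[::20], 3))                        # edges preserved, noise reduced
```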
Abstract: The veracity of land evaluation is tightly related to reasonable weights for the land evaluation factors. By mapping qualitative linguistic words into finely variable cloud drops, translating uncertain factor conditions into quantitative values through uncertainty inference based on the cloud model, and then integrating correlation analysis, a new way of determining the weights of land evaluation factors is proposed. It can overcome the limitations of conventional approaches.
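The mapping from a linguistic word to cloud drops is commonly realized with a normal cloud generator, sketched below under assumed numerical characteristics (Ex, En, He); the specific values and the helper name cloud_drops are illustrative, not taken from the paper.

```python
# Normal cloud generator sketch: a linguistic term is described by
# expectation Ex, entropy En, and hyper-entropy He, and realized as
# cloud drops (x_i, mu_i) where mu_i is the certainty degree.
import numpy as np

def cloud_drops(Ex, En, He, n=1000, seed=0):
    rng = np.random.default_rng(seed)
    En_i = rng.normal(En, He, n)                    # per-drop entropy sample
    x = rng.normal(Ex, np.abs(En_i))                # drop positions
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_i ** 2))   # certainty degrees
    return x, mu

# e.g. a hypothetical term "moderately suitable" on a 0-100 scale
x, mu = cloud_drops(Ex=70.0, En=5.0, He=0.5)
print(x[:3], mu[:3])
```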
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11401048, 11301037, 11571051 and 11201174) and the Natural Science Foundation for Young Scientists of Jilin Province of China (Grant Nos. 20150520055JH and 20150520054JH).
Abstract: This paper proposes a new weighted quantile regression model for longitudinal data, with weights chosen by empirical likelihood (EL). The approach efficiently incorporates the information from the conditional quantile restrictions to account for within-subject correlations. The resulting estimator is computationally simple and performs well under modest or high within-subject correlation. The efficiency gain is quantified theoretically and illustrated via simulation and a real data application.
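The estimating structure is a weighted check-function loss, minimized below with a generic optimizer. The uniform placeholder weights stand in for the EL-chosen weights, and the simulated data are assumptions for illustration only.

```python
# Weighted quantile regression sketch: minimize
#   sum_i w_i * rho_tau(y_i - x_i' beta),  rho_tau(r) = r * (tau - 1{r < 0}).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 200
x = np.column_stack([np.ones(n), rng.uniform(0, 2, n)])   # intercept + covariate
y = x @ np.array([1.0, 2.0]) + rng.standard_normal(n)
w = np.ones(n)          # placeholder: EL-chosen weights would replace these
tau = 0.5               # median regression

def loss(beta):
    r = y - x @ beta
    return np.sum(w * r * (tau - (r < 0)))   # weighted check-function loss

beta_hat = minimize(loss, x0=np.zeros(2), method="Nelder-Mead").x
print(beta_hat)          # close to the true (1.0, 2.0)
```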
Funding: Partially supported by the National Natural Science Foundation of China (Nos. 61402089, 61472069, and 61501101), the Fundamental Research Funds for the Central Universities (Nos. N161904001, N161602003, and N150408001), the Natural Science Foundation of Liaoning Province (No. 2015020553), the China Postdoctoral Science Foundation (No. 2016M591447), and the Postdoctoral Science Foundation of Northeastern University (No. 20160203).
Abstract: The Extreme Learning Machine (ELM) and its variants are effective in many machine learning applications, such as Imbalanced Learning (IL) or Big Data (BD) learning. However, they are unable to solve both imbalanced and large-volume data learning problems at once. This study addresses the IL problem in BD applications. The Distributed and Weighted ELM (DW-ELM) algorithm, based on the MapReduce framework, is proposed. To confirm the feasibility of parallel computation, it is first shown that the matrix multiplication operators involved are decomposable. Then, to further improve computational efficiency, an Improved DW-ELM algorithm (IDW-ELM) is developed that uses only one MapReduce job. The proposed DW-ELM and IDW-ELM algorithms are finally validated through experiments.
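A single-machine sketch of the weighted ELM solution underlying DW-ELM is given below: the output weights solve a weighted ridge system beta = (I/C + H^T W H)^{-1} H^T W T, and the products H^T W H and H^T W T are exactly the decomposable quantities a MapReduce job would accumulate across data partitions. The data, class-balancing weights, and sizes are illustrative assumptions.

```python
# Weighted ELM sketch: random hidden layer, then a weighted ridge solve.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 5))                        # inputs
y = (X[:, 0] + 0.2 * rng.standard_normal(300) > 1.0).astype(float)
T = np.column_stack([1 - y, y])                          # one-hot targets

L, C = 50, 10.0                                          # hidden nodes, reg. constant
Wih = rng.standard_normal((5, L))
bias = rng.standard_normal(L)
H = np.tanh(X @ Wih + bias)                              # hidden-layer output matrix

# Per-sample weights that balance the minority class (one common choice).
w = np.where(y == 1, len(y) / (2 * y.sum()), len(y) / (2 * (len(y) - y.sum())))

HtWH = H.T @ (w[:, None] * H)    # decomposable: sums of per-partition products
HtWT = H.T @ (w[:, None] * T)    # likewise accumulable by a MapReduce job
beta = np.linalg.solve(np.eye(L) / C + HtWH, HtWT)       # output weights

pred = (H @ beta).argmax(axis=1)
print((pred == y).mean())                                # training accuracy
```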
Abstract: In the analysis of correlated data, it is ideal to capture the true dependence structure to increase the efficiency of estimation. However, for multivariate survival data, this is extremely
Funding: Funded by the Major National Scientific Research Plan (2013CB733305, 2012CB957703) and the National Natural Science Foundation of China (41174066, 41131067, 41374087, 41431070).
Abstract: The Gravity Recovery and Climate Experiment (GRACE) mission can significantly improve our knowledge of the temporal variability of the Earth's gravity field. We obtained monthly gravity field solutions based on the variational equations approach from GPS-derived positions of the GRACE satellites and K-band range-rate measurements. The impact of different fixed data weighting ratios on temporal gravity field recovery when combining the two types of data was investigated in order to derive the best combined solution. The monthly gravity field solutions obtained through the above procedure are named the Institute of Geodesy and Geophysics (IGG) temporal gravity field models. The IGG models were compared with the GRACE Release 05 (RL05) products in the following aspects: (i) the trend of the mass anomaly in China and its nearby regions within 2005-2010; (ii) the root mean squares of the global mass anomaly during 2005-2010; and (iii) time-series changes in the mean water storage in the Amazon Basin and the Sahara Desert between 2005 and 2010. The results showed that the IGG solutions were consistent with the GRACE RL05 products in aspects (i)-(iii). The annual amplitude of mean water storage change in the Amazon Basin was 14.7 ± 1.2 cm for IGG, 17.1 ± 1.3 cm for the Center for Space Research (CSR), 16.4 ± 0.9 cm for the GeoForschungsZentrum (GFZ), and 16.9 ± 1.2 cm for the Jet Propulsion Laboratory (JPL), in terms of equivalent water height (EWH). The root mean squares of the mean mass anomaly in the Sahara were 1.2 cm, 0.9 cm, 0.9 cm, and 1.2 cm for the IGG, CSR, GFZ, and JPL temporal gravity field models, respectively. The comparison suggests that the IGG temporal gravity field solutions are at the same accuracy level as the latest solutions published by CSR, GFZ, and JPL.
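The weighting-ratio question can be illustrated at the normal-equation level: two observation types are merged as N = N1 + k·N2 with a fixed ratio k. The toy below uses random stand-ins for the real partial-derivative matrices and picks k from assumed noise variances; it shows only the mechanics of combining two data types, not the actual GRACE processing.

```python
# Toy combination of two observation types at the normal-equation level.
import numpy as np

rng = np.random.default_rng(2)
x_true = rng.standard_normal(10)                   # stand-in "field" parameters

A1 = rng.standard_normal((100, 10))                # GPS-position-type partials
A2 = rng.standard_normal((200, 10))                # range-rate-type partials
l1 = A1 @ x_true + 0.10 * rng.standard_normal(100) # noisier observation type
l2 = A2 @ x_true + 0.01 * rng.standard_normal(200) # more precise observation type

k = (0.10 / 0.01) ** 2                             # weight ratio from variance ratio
N = A1.T @ A1 + k * A2.T @ A2                      # combined normal matrix
u = A1.T @ l1 + k * A2.T @ l2                      # combined right-hand side
x_hat = np.linalg.solve(N, u)

print(np.abs(x_hat - x_true).max())                # small recovery error
```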
Funding: Supported by the National Postdoctoral Foundation (No. 20090451251) and the Shaanxi Industry Surmount Foundation (2009K08-31) of China.
Abstract: For slowly changing, range-dependent non-homogeneous environments, a new statistical space-time adaptive processing algorithm is proposed, which uses statistical methods such as the Bayes or likelihood criterion to estimate an approximate covariance matrix under non-homogeneous conditions. According to the statistical characteristics of the space-time snapshot data, by defining the aggregate snapshot data and the corresponding events, the conditional probability that a space-time snapshot is effective training data is derived, and from it the weighting coefficients for the weighting method are obtained. Theoretical analysis indicates that the Bayes and likelihood criteria for covariance matrix estimation are more reasonable than methods that estimate the covariance matrix from the training data after simply excluding detected outliers. Simulations confirm that the proposed algorithms estimate the covariance accurately under non-homogeneous conditions and have favorable characteristics.
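The core idea, letting every snapshot contribute in proportion to how likely it is to be effective training data instead of hard-excluding outliers, can be sketched as a weighted sample covariance. The weight function below is a stand-in heuristic, not the paper's Bayes/likelihood derivation, and the data are synthetic.

```python
# Weighted covariance estimate for non-homogeneous STAP training data.
import numpy as np

rng = np.random.default_rng(3)
N, K = 8, 64                                       # DOFs, training snapshots
X = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)

R0 = X @ X.conj().T / K                            # initial sample covariance
Ri = np.linalg.inv(R0)
# Whitened power of each snapshot, x_k^H R0^{-1} x_k (mean ~ N if homogeneous).
q = np.real(np.einsum('ik,ij,jk->k', X.conj(), Ri, X))
w = np.exp(-0.5 * np.abs(q - N))                   # downweight atypical snapshots

R = (X * w) @ X.conj().T / w.sum()                 # weighted covariance estimate
print(np.allclose(R, R.conj().T))                  # Hermitian, as required
```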
Abstract: We thank all the discussants for their interesting and stimulating contributions. They have touched on various aspects that were not considered in the original articles.
Abstract: The survival analysis literature has always lagged behind the categorical data literature in developing methods to analyze clustered or multivariate data. While estimators based on
Funding: Supported by the National Natural Science Foundation of China (Nos. 60725415, 60971066) and the National High-Tech Programs of China (Nos. 2009AA01Z258, 2009AA01Z260).
Abstract: An improved low-distortion sigma-delta ADC (analog-to-digital converter) for wireless local area network standards is presented. A feed-forward MASH 2-2 multi-bit cascaded sigma-delta ADC is adopted; this work achieves much better performance than previously reported ADCs by adding a feedback factor in the second stage to improve the in-band SNDR (signal-to-noise-and-distortion ratio) and by using 4-bit ADCs in both stages to minimize the quantization noise. Data weighted averaging (DWA) is used to decrease the mismatch noise introduced by the 4-bit DACs, which improves the SFDR (spurious-free dynamic range) of the ADC. The modulator has been implemented in a 0.18 μm CMOS process and operates from a single 1.8 V supply voltage. Experimental results show that, for a 1.25 MHz, -6 dBFS input signal at a 160 MHz sampling frequency, the improved ADC with all non-idealities considered achieves a peak SNDR of 80.9 dB and an SFDR of 87 dB, with an effective number of bits of 13.15.
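The DWA mismatch-shaping step can be illustrated behaviorally: unit DAC elements are selected starting from a pointer that advances by the previous code, so every element is used equally often on average and element mismatch error is first-order noise shaped. The element count and input codes below are illustrative.

```python
# Behavioral sketch of data weighted averaging (DWA) element selection
# for a multi-bit DAC with a rotating pointer.
def dwa_select(codes, n_elements=16):
    pointer = 0
    for code in codes:                 # code = number of unit elements to enable
        sel = [(pointer + i) % n_elements for i in range(code)]
        pointer = (pointer + code) % n_elements   # rotate past the used elements
        yield sel

# Successive codes walk around the element array instead of reusing
# the same low-index elements, which shapes the mismatch error.
for sel in dwa_select([5, 9, 3, 12], n_elements=16):
    print(sel)
```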