Journal Articles: 13 articles found
1. Distributed Weighted Data Aggregation Algorithm in End-to-Edge Communication Networks Based on Multi-armed Bandit (cited: 1)
Authors: Yifei ZOU, Senmao QI, Cong'an XU, Dongxiao YU. 《计算机科学》 (Computer Science), CSCD, PKU Core, 2023, Issue 2, pp. 13-22.
As a combination of edge computing and artificial intelligence, edge intelligence has become a promising technique that provides its users with fast, precise, and customized services. In edge intelligence, when learning agents are deployed on the edge side, the data aggregation from the end side to the designated edge devices is an important research topic. Considering the varying importance of end devices, this paper studies the weighted data aggregation problem in a single-hop end-to-edge communication network. First, to ensure that all end devices with various weights are treated fairly in data aggregation, a distributed end-to-edge cooperative scheme is proposed. Then, to handle the massive contention on the wireless channel caused by end devices, a multi-armed bandit (MAB) algorithm is designed to help the end devices find their most appropriate update rates. Unlike traditional data aggregation works, incorporating the MAB gives our algorithm higher efficiency in data aggregation. A theoretical analysis shows that the efficiency of our algorithm is asymptotically optimal. Comparative experiments with previous works are also conducted to show the strength of our algorithm.
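The abstract does not reproduce the authors' algorithm; as an illustration only, a minimal epsilon-greedy multi-armed bandit sketch shows how an end device could learn an update rate from bandit feedback. The candidate rates, the toy channel model, and all function names below are invented for this sketch:

```python
import random

def epsilon_greedy_rate_selection(rates, reward_fn, rounds=2000, eps=0.1, seed=0):
    """Learn the best update rate from bandit feedback (epsilon-greedy)."""
    rng = random.Random(seed)
    counts = [0] * len(rates)
    values = [0.0] * len(rates)          # running mean reward per arm
    for _ in range(rounds):
        if rng.random() < eps:
            arm = rng.randrange(len(rates))                        # explore
        else:
            arm = max(range(len(rates)), key=lambda i: values[i])  # exploit
        reward = reward_fn(rates[arm], rng)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]        # incremental mean
    return rates[max(range(len(rates)), key=lambda i: values[i])]

# Toy channel model: success probability peaks at a moderate update rate,
# since updating too often causes collisions and too rarely wastes slots.
def toy_reward(rate, rng):
    p_success = rate * max(0.0, 1.0 - rate)   # maximized at rate = 0.5
    return 1.0 if rng.random() < p_success else 0.0

best = epsilon_greedy_rate_selection([0.1, 0.3, 0.5, 0.7, 0.9], toy_reward)
```

The paper's MAB is tailored to channel contention and weighted fairness; this sketch only shows the explore/exploit loop that such an algorithm builds on.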
Keywords: weighted data aggregation; end-to-edge communication; multi-armed bandit; edge intelligence
2. Methodology for local correction of the heights of global geoid models to improve the accuracy of GNSS leveling
Authors: Stepan Savchuk, Alina Fedorchuk. Geodesy and Geodynamics, EI CSCD, 2024, Issue 1, pp. 42-49.
At present, one of the methods used to determine the height of points on the Earth's surface is Global Navigation Satellite System (GNSS) leveling. It is possible to determine the orthometric or normal height by this method only if a geoid or quasi-geoid height model is available. This paper proposes a methodology for local correction of the heights of high-order global geoid models such as EGM08, EIGEN-6C4, GECO, and XGM2019e_2159. The methodology was tested in different areas of the research field, covering various relief forms. The dependence of the corrected height accuracy on the input data was analyzed, and the correction was also conducted for model heights in three tidal systems: "tide free", "mean tide", and "zero tide". The results show that the heights of the EIGEN-6C4 model can be corrected with an accuracy of up to 1 cm for flat and foothill terrains over areas of 1°×1°, 2°×2°, and 3°×3°; the EGM08 model gives an almost identical result. The EIGEN-6C4 model is best suited for mountainous relief, providing an accuracy of 1.5 cm on a 1°×1° area. The height correction accuracy of the GECO and XGM2019e_2159 models is slightly poorer, showing some fuzziness in the numerical fluctuations.
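As background for the method above: GNSS leveling rests on the relation H = h - N between the ellipsoidal height h, the normal or orthometric height H, and the geoid height N. A toy sketch of a constant local offset correction fitted at control benchmarks follows; it is not the paper's actual least-squares methodology, and the data and names are invented:

```python
def locally_correct_geoid(model_N, control):
    """control: list of (h_ellipsoidal, H_known, N_model) at benchmarks.
    Fit a constant local offset dN = mean(N_model - (h - H)) over the
    benchmarks and subtract it from the model geoid heights."""
    offsets = [N - (h - H) for (h, H, N) in control]
    dN = sum(offsets) / len(offsets)
    return [N - dN for N in model_N]

def normal_height(h, N_corrected):
    """GNSS leveling: height above the (quasi-)geoid from the ellipsoidal height."""
    return h - N_corrected
```

A real correction surface would typically be a low-order polynomial or kriged field rather than a constant, but the benchmark-misfit idea is the same.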
Keywords: GNSS leveling; global geoid model; gravity anomaly; weight data; correcting data
3. ADAPTIVE FUSION ALGORITHMS BASED ON WEIGHTED LEAST SQUARE METHOD (cited: 9)
Authors: SONG Kaichen, NIE Xili. Chinese Journal of Mechanical Engineering, SCIE EI CAS CSCD, 2006, Issue 3, pp. 451-454.
Weighted fusion algorithms, which can be applied in the area of multi-sensor data fusion, are developed based on the weighted least squares method. A weighted fusion algorithm is proposed in which the relationship between the weight coefficients and the measurement noise is established, taking the correlation of the measurement noise into account. A simplified weighted fusion algorithm is then deduced under the assumption that the measurement noise is uncorrelated. In addition, an algorithm is presented that adjusts the weight coefficients in the simplified algorithm by estimating the measurement noise from the measurements themselves. Simulation and experiment show that the precision of a multi-sensor system based on these algorithms is better than that of systems based on other algorithms.
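For the uncorrelated-noise case described above, weighted least squares fusion of scalar measurements of one quantity reduces to inverse-variance weighting. A minimal sketch (illustrative; the paper's full correlated-noise formulation is not reproduced):

```python
def fuse_uncorrelated(measurements, variances):
    """Inverse-variance weighted fusion of scalar sensor readings:
    x_hat = sum(x_i / s_i^2) / sum(1 / s_i^2), with fused variance
    1 / sum(1 / s_i^2) (always <= the smallest single-sensor variance)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * x for w, x in zip(weights, measurements)) / total
    return estimate, 1.0 / total
```

Note how a noisier sensor automatically receives a smaller weight, which is the behavior the adaptive algorithm in the paper maintains online as it re-estimates the noise.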
Keywords: weighted least squares method; data fusion; measurement noise; correlation
4. Weighted total variation using split Bregman fast quantitative susceptibility mapping reconstruction method (cited: 1)
Authors: Lin Chen, Zhi-Wei Zheng, Li-Jun Bao, Jin-Sheng Fang, Tian-He Yang, Shu-Hui Cai, Cong-Bo Cai. Chinese Physics B, SCIE EI CAS CSCD, 2018, Issue 8, pp. 645-654.
An ill-posed inverse problem in quantitative susceptibility mapping (QSM) is usually solved using a regularization and optimization solver, which is time-consuming given the three-dimensional volume data. In clinical diagnosis, however, it is necessary to reconstruct a susceptibility map efficiently with an appropriate method. Here, a modified QSM reconstruction method called weighted total variation using split Bregman (WTVSB) is proposed. It reconstructs the susceptibility map with fast computational speed and effective artifact suppression by incorporating noise-suppressed data weighting into the split Bregman iteration. The noise-suppressed data weighting is determined using the Laplacian of the calculated local field, which prevents noise and errors in the field maps from spreading into the susceptibility inversion. The split Bregman iteration accelerates the solution of the L1-regularized reconstruction model by utilizing a preconditioned conjugate gradient solver. In experiments, the proposed method is compared with truncated k-space division (TKD), morphology-enabled dipole inversion (MEDI), and total variation using split Bregman (TVSB) on numerical simulation, phantom, and in vivo human brain data, evaluated by root mean square error and mean structural similarity. The results demonstrate that the proposed method achieves a better balance between accuracy and efficiency of QSM reconstruction than conventional methods, facilitating clinical applications of QSM.
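To illustrate the split Bregman machinery referenced above on a toy problem, here is 1-D weighted-TV denoising with a Gauss-Seidel inner solver, not the paper's 3-D QSM model with its preconditioned conjugate gradient solver; the parameter values are invented. The two alternating closed-form steps (a linear u-solve and a per-edge soft-threshold) are the essence of the split Bregman approach:

```python
def shrink(x, t):
    """Soft-thresholding: closed-form minimizer of t*|d| + (d - x)^2 / 2."""
    return max(abs(x) - t, 0.0) * (1.0 if x >= 0 else -1.0)

def wtv_denoise_1d(f, weights, lam=1.0, mu=1.0, iters=100, sweeps=10):
    """min_u 0.5*||u - f||^2 + lam * sum_i weights[i] * |u[i+1] - u[i]|
    via split Bregman with auxiliary d ~ Du and Bregman variable b."""
    n = len(f)
    u = list(f)
    d = [0.0] * (n - 1)
    b = [0.0] * (n - 1)
    for _ in range(iters):
        # u-update: Gauss-Seidel sweeps on (I + mu * D^T D) u = f + mu * D^T (d - b)
        v = [d[i] - b[i] for i in range(n - 1)]
        rhs = list(f)
        for i in range(n):
            if i > 0:
                rhs[i] += mu * v[i - 1]
            if i < n - 1:
                rhs[i] -= mu * v[i]
        for _ in range(sweeps):
            for i in range(n):
                acc, deg = rhs[i], 0
                if i > 0:
                    acc += mu * u[i - 1]; deg += 1
                if i < n - 1:
                    acc += mu * u[i + 1]; deg += 1
                u[i] = acc / (1.0 + mu * deg)
        # d-update: per-edge *weighted* soft-threshold, then Bregman update
        for i in range(n - 1):
            du = u[i + 1] - u[i]
            d[i] = shrink(du + b[i], weights[i] * lam / mu)
            b[i] += du - d[i]
    return u
```

The per-edge weights are where a noise-suppressed data weighting scheme would enter: edges in unreliable regions get larger weights and are smoothed harder.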
Keywords: quantitative susceptibility mapping; ill-posed inverse problem; noise-suppressed data weighting; split Bregman iteration
5. Mining Weights of Land Evaluation Factors Based on Cloud Model and Correlation Analysis (cited: 17)
Authors: HU Shiyuan, LI Deren, LIU Yaolin, LI Deyi. Geo-Spatial Information Science, 2007, Issue 3, pp. 218-222.
The veracity of land evaluation is tightly related to reasonable weights for the land evaluation factors. By mapping qualitative linguistic words into finely changeable cloud drops, translating the uncertain factor conditions into quantitative values with uncertain inference based on the cloud model, and then integrating correlation analysis, a new way of determining the weights of land evaluation factors is proposed. It may overcome the limitations of conventional approaches.
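The cloud model referenced above characterizes a qualitative concept by three numbers: expectation Ex, entropy En, and hyper-entropy He. A sketch of the standard forward normal cloud generator, which turns such a concept into "cloud drops" with membership degrees, follows; this is only the generator, not the paper's full weight-mining procedure, and the numeric parameters are invented:

```python
import math
import random

def normal_cloud_drops(Ex, En, He, n, seed=0):
    """Forward normal cloud generator: concept (Ex, En, He) -> n drops
    (x, membership). He > 0 makes the per-drop entropy itself random,
    which is how the model expresses second-order uncertainty."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        En_p = abs(rng.gauss(En, He))      # per-drop entropy sample
        x = rng.gauss(Ex, En_p)            # drop position
        if En_p > 0:
            mu = math.exp(-(x - Ex) ** 2 / (2 * En_p ** 2))
        else:
            mu = 1.0                       # degenerate drop sits exactly at Ex
        drops.append((x, mu))
    return drops
```

A linguistic grade such as "highly suitable" could, for example, be encoded as Ex = 0.9 with small En and He, and the generated drops then feed the uncertain inference step.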
Keywords: cloud models; correlation analysis; land evaluation factor; weight; data mining
6. Weighted quantile regression for longitudinal data using empirical likelihood (cited: 1)
Authors: YUAN XiaoHui, LIN Nan, DONG XiaoGang, LIU TianQing. Science China Mathematics, SCIE CSCD, 2017, Issue 1, pp. 147-164.
This paper proposes a new weighted quantile regression model for longitudinal data with weights chosen by empirical likelihood (EL). This approach efficiently incorporates the information from the conditional quantile restrictions to account for within-subject correlations. The resulting estimator is computationally simple and performs well under modest or high within-subject correlation. The efficiency gain is quantified theoretically and illustrated via simulation and a real data application.
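Quantile regression minimizes the check loss ρ_τ(u) = u(τ - I(u < 0)). In the simplest intercept-only case, the weighted check-loss objective is minimized by a weighted sample quantile, which the following small sketch computes; the paper's EL-based choice of weights and full regression setting are not reproduced here:

```python
def check_loss(u, tau):
    """Quantile-regression check loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def weighted_quantile(y, w, tau):
    """Minimizer of sum_i w[i] * rho_tau(y[i] - q) over q:
    the smallest y value at which the cumulative weight reaches tau."""
    pairs = sorted(zip(y, w))
    total = sum(w)
    acc = 0.0
    for yi, wi in pairs:
        acc += wi
        if acc >= tau * total:
            return yi
    return pairs[-1][0]
```

Up-weighting an observation drags the fitted quantile toward it, which is how observation weights let the estimator exploit within-subject correlation structure.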
Keywords: empirical likelihood; estimating equation; influence function; longitudinal data; weighted quantile regression
7. Distributed and Weighted Extreme Learning Machine for Imbalanced Big Data Learning (cited: 10)
Authors: Zhiqiong Wang, Junchang Xin, Hongxu Yang, Shuo Tian, Ge Yu, Chenren Xu, Yudong Yao. Tsinghua Science and Technology, SCIE EI CAS CSCD, 2017, Issue 2, pp. 160-173.
The Extreme Learning Machine (ELM) and its variants are effective in many machine learning applications such as Imbalanced Learning (IL) or Big Data (BD) learning. However, they are unable to solve imbalanced and large-volume data learning problems simultaneously. This study addresses the IL problem in BD applications. The Distributed and Weighted ELM (DW-ELM) algorithm, based on the MapReduce framework, is proposed. To confirm the feasibility of parallel computation, it is first shown that the matrix multiplication operators involved are decomposable. Then, to further improve computational efficiency, an Improved DW-ELM (IDW-ELM) algorithm is developed that uses only one MapReduce job. The successful operation of the proposed DW-ELM and IDW-ELM algorithms is validated through experiments.
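The decomposability claim above, that the matrix products needed for a weighted ELM solution (terms of the form HᵀWH and HᵀWt with diagonal W) are sums of per-row contributions, is what makes MapReduce partitioning possible. A schematic map/reduce pair in plain Python, not the authors' Hadoop implementation:

```python
def local_sums(H_part, w_part, t_part):
    """Map step: one data partition emits its partial H^T W H and H^T W t,
    accumulated row by row (w is the per-sample weight)."""
    L = len(H_part[0])
    A = [[0.0] * L for _ in range(L)]
    c = [0.0] * L
    for row, w, t in zip(H_part, w_part, t_part):
        for i in range(L):
            c[i] += w * row[i] * t
            for j in range(L):
                A[i][j] += w * row[i] * row[j]
    return A, c

def reduce_sums(parts):
    """Reduce step: element-wise addition of the partitions' partial sums."""
    A, c = [row[:] for row in parts[0][0]], list(parts[0][1])
    for A2, c2 in parts[1:]:
        for i in range(len(c)):
            c[i] += c2[i]
            for j in range(len(c)):
                A[i][j] += A2[i][j]
    return A, c
```

Because addition is associative and commutative, any partitioning of the rows yields the same reduced sums as a single pass over the whole dataset, which is exactly the property a combiner/reducer exploits.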
Keywords: weighted Extreme Learning Machine (ELM); imbalanced big data; MapReduce framework; user-defined counter
8. Discussion of Fan et al.'s paper "Gaining efficiency via weighted estimators for multivariate failure time data" (cited: 1)
Authors: Annie QU (Department of Statistics, University of Illinois at Urbana-Champaign, IL 61820, USA), Lan XUE (Department of Statistics, Oregon State University, Corvallis, OR 97331-4606, USA). Science China Mathematics, SCIE, 2009, Issue 6, pp. 1134-1136.
In the analysis of correlated data, it is ideal to capture the true dependence structure to increase efficiency of the estimation. However, for multivariate survival data, this is extremely
Keywords: weighted estimators; multivariate failure time data
9. Monthly gravity field recovery from GRACE orbits and K-band measurements using variational equations approach (cited: 1)
Authors: Wang Changqing, Xu Houze, Zhong Min, Feng Wei. Geodesy and Geodynamics, 2015, Issue 4, pp. 253-260.
The Gravity Recovery and Climate Experiment (GRACE) mission can significantly improve our knowledge of the temporal variability of the Earth's gravity field. We obtained monthly gravity field solutions based on the variational equations approach from GPS-derived positions of the GRACE satellites and K-band range-rate measurements. The impact of different fixed data weighting ratios in temporal gravity field recovery when combining the two types of data was investigated in order to derive the best combined solution. The monthly gravity field solutions obtained through this procedure are named the Institute of Geodesy and Geophysics (IGG) temporal gravity field models. The IGG models were compared with the GRACE Release 05 (RL05) products in the following aspects: (i) the trend of the mass anomaly in China and nearby regions within 2005-2010; (ii) the root mean square of the global mass anomaly during 2005-2010; and (iii) time-series changes in the mean water storage in the Amazon Basin and the Sahara Desert between 2005 and 2010. The results show that the IGG solutions are almost consistent with the GRACE RL05 products in aspects (i)-(iii). The annual amplitudes of mean water storage in the Amazon Basin were 14.7 ± 1.2 cm for IGG, 17.1 ± 1.3 cm for the Center for Space Research (CSR), 16.4 ± 0.9 cm for the GeoForschungsZentrum (GFZ), and 16.9 ± 1.2 cm for the Jet Propulsion Laboratory (JPL) in terms of equivalent water height (EWH). The root mean squares of the mean mass anomaly in the Sahara were 1.2 cm, 0.9 cm, 0.9 cm, and 1.2 cm for the IGG, CSR, GFZ, and JPL models, respectively. The comparison suggests that the IGG temporal gravity field solutions are at the same accuracy level as the latest solutions published by CSR, GFZ, and JPL.
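The "fixed data weighting ratio" above refers to combining the normal equations of the two observation types (GPS positions and K-band range rates) with a relative weight before solving. A toy sketch of that combination step on a 2x2 system follows; the numbers are invented and the real solutions involve many thousands of spherical-harmonic parameters:

```python
def combine_normals(N1, b1, N2, b2, ratio):
    """Combine two least-squares systems by adding their normal equations,
    scaling the second data type by a relative weight ratio:
    N = N1 + ratio * N2,  b = b1 + ratio * b2."""
    n = len(b1)
    N = [[N1[i][j] + ratio * N2[i][j] for j in range(n)] for i in range(n)]
    b = [b1[i] + ratio * b2[i] for i in range(n)]
    return N, b

def solve2(N, b):
    """Cramer's rule for a 2x2 system (sufficient for this toy example)."""
    det = N[0][0] * N[1][1] - N[0][1] * N[1][0]
    return [(b[0] * N[1][1] - b[1] * N[0][1]) / det,
            (N[0][0] * b[1] - N[1][0] * b[0]) / det]
```

Scanning `ratio` over a grid and inspecting the quality of each combined solution is one simple way a "best" fixed weighting ratio could be selected.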
Keywords: Gravity Recovery and Climate Experiment (GRACE); temporal gravity field; variational equations approach; water storage changes; equivalent water height (EWH); data weight ratio; geoid height per degree; IGG temporal gravity model
10. STATISTICAL SPACE-TIME ADAPTIVE PROCESSING ALGORITHM
Authors: Yang Jie. Journal of Electronics (China), 2010, Issue 3, pp. 412-419.
For a slowly changing environment with range-dependent non-homogeneity, a new statistical space-time adaptive processing algorithm is proposed, which uses statistical methods such as the Bayes or likelihood criterion to estimate the approximate covariance matrix under non-homogeneous conditions. According to the statistical characteristics of the space-time snapshot data, by defining the aggregate snapshot data and corresponding events, the conditional probability that a space-time snapshot is effective training data is given, from which the weighting coefficients are obtained. Theoretical analysis indicates that the Bayes and likelihood criteria for covariance matrix estimation are more reasonable than methods that estimate the covariance matrix from training data simply by excluding detected outliers. Simulations confirm that the proposed algorithms estimate the covariance accurately under non-homogeneous conditions and have favorable characteristics.
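The weighting idea above, down-weighting snapshots that are unlikely to be effective training data instead of discarding them outright, can be sketched as a weighted sample covariance. This sketch uses real-valued vectors for brevity (actual STAP snapshots are complex-valued, and the weights in the paper come from Bayes/likelihood probabilities, which are not modeled here):

```python
def weighted_covariance(snapshots, weights):
    """Covariance estimate as a weighted, normalized sum of snapshot
    outer products: R = sum_k a_k x_k x_k^T / sum_k a_k."""
    n = len(snapshots[0])
    total = sum(weights)
    R = [[0.0] * n for _ in range(n)]
    for x, a in zip(snapshots, weights):
        for i in range(n):
            for j in range(n):
                R[i][j] += a * x[i] * x[j] / total
    return R
```

Setting a snapshot's weight to zero recovers the hard outlier-exclusion approach the paper argues against; fractional weights let partially contaminated snapshots still contribute.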
Keywords: space-time adaptive processing (STAP); non-homogeneous condition; Bayes and likelihood criterion; data weighting
11. Rejoinder for "Gaining efficiency via weighted estimators for multivariate failure time data"
Authors: FAN JianQing, ZHOU Yong, CAI JianWen, CHEN Min. Science China Mathematics, SCIE, 2009, Issue 6, pp. 1137-1138.
We thank all the discussants for their interesting and stimulating contributions. They have touched on various aspects that were not considered in the original article.
Keywords: weighted estimators; multivariate failure time data
12. Discussion on "Gaining Efficiency via Weight Estimators for Multivariate Failure Time Data" by Fan, Zhou and Chen
Authors: Anthony KUK. Science China Mathematics, SCIE, 2009, Issue 6, pp. 1129-1130.
The survival analysis literature has always lagged behind the categorical data literature in developing methods to analyze clustered or multivariate data. While estimators based on
Keywords: weighted estimators; multivariate failure time data
13. Improved low-distortion sigma-delta ADC with DWA for WLAN standards
Authors: 李迪, 杨银堂, 朱樟明, 石立春, 吴笑峰, 王江安. Journal of Semiconductors, EI CAS CSCD, PKU Core, 2010, Issue 2, pp. 82-87.
An improved low-distortion sigma-delta ADC (analog-to-digital converter) for wireless local area network standards is presented. A feed-forward MASH 2-2 multi-bit cascaded sigma-delta ADC is adopted; this work achieves much better performance than previously reported ADCs by adding a feedback factor in the second stage to improve the in-band SNDR (signal-to-noise-and-distortion ratio) and by using 4-bit ADCs in both stages to minimize the quantization noise. Data weighted averaging is used to decrease the mismatch noise induced by the 4-bit DACs, which improves the SFDR (spurious-free dynamic range) of the ADC. The modulator is implemented in a 0.18 μm CMOS process and operates from a single 1.8 V supply. Experimental results show that for a 1.25 MHz, -6 dBFS input signal at a 160 MHz sampling frequency, the improved ADC with all non-idealities considered achieves a peak SNDR of 80.9 dB and an SFDR of 87 dB, with an effective number of bits of 13.15.
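Data weighted averaging, used above to suppress DAC mismatch, cycles a pointer through the DAC unit elements so that every element is used equally often over time, first-order shaping the mismatch error. A behavioral sketch that tracks only element selection and usage counts, with no circuit modeling:

```python
def dwa_select(codes, n_elements):
    """Data Weighted Averaging element selection: for each input code,
    enable `code` unit elements starting at a rotating pointer, then
    advance the pointer past them (mod n_elements)."""
    ptr = 0
    usage = [0] * n_elements
    selections = []
    for code in codes:
        sel = [(ptr + k) % n_elements for k in range(code)]
        for e in sel:
            usage[e] += 1
        selections.append(sel)
        ptr = (ptr + code) % n_elements
    return selections, usage
```

Because consecutive codes pick up where the last one left off, any fixed element mismatch is averaged across the elements rather than appearing as a signal-correlated (harmonic) error, which is what improves the SFDR.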
Keywords: WLAN; low distortion; sigma-delta ADC; data weighted averaging