Journal Articles
11 articles found
1. Production of and Reflections on the Road Linear Referencing Dataset of Shaanxi Province
Authors: 杨杨, 李东辉, 杜安丽. 《测绘技术装备》, 2019, Issue 2, pp. 29-31 (3 pages)
Exploiting the linear-measure operations supported by linearly referenced data, a road linear referencing dataset for Shaanxi Province was produced on the basis of the road data in the 2018 "Tianditu·Shaanxi" (天地图·陕西) electronic map, combined with specialized data from the transportation authorities and field-collected expressway milepost (stake number) data. Storing and producing the Tianditu road data in the form of a linear referencing dataset enables milepost-based positioning services on the road data, supports functions such as road location and lookup, and provides fast, accurate services to transportation, emergency-management, and other relevant departments.
Keywords: linear dataset; Tianditu (天地图); positioning service
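The milepost-based positioning service described above amounts to interpolating a coordinate along a calibrated road centerline. Below is a minimal sketch of that lookup, assuming a hypothetical polyline with cumulative stake measures; the function and data layout are illustrative, not the dataset's actual schema or service interface.

```python
# Minimal sketch of milepost-based positioning on a linear-referenced road
# (hypothetical data layout; not the dataset's actual schema).
from bisect import bisect_right

def locate_by_milepost(vertices, measures, milepost_km):
    """Interpolate the (x, y) position of a milepost along a road polyline.

    vertices    -- list of (x, y) polyline vertices
    measures    -- cumulative stake measure (km) at each vertex, ascending
    milepost_km -- the stake/milepost value to locate
    """
    if not (measures[0] <= milepost_km <= measures[-1]):
        raise ValueError("milepost outside the calibrated range")
    i = bisect_right(measures, milepost_km) - 1
    i = min(i, len(measures) - 2)                 # clamp to the last segment
    t = (milepost_km - measures[i]) / (measures[i + 1] - measures[i])
    x = vertices[i][0] + t * (vertices[i + 1][0] - vertices[i][0])
    y = vertices[i][1] + t * (vertices[i + 1][1] - vertices[i][1])
    return x, y

# Example: a 3-vertex road calibrated at 0, 1.2 and 2.5 km
print(locate_by_milepost([(0, 0), (1000, 500), (2100, 900)], [0.0, 1.2, 2.5], 1.8))
```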
2. Kernel-Distance-Based LLE Dimensionality Reduction and Its Application to Outlier Clustering (Cited by: 5)
Authors: 徐雪松, 张宏, 刘凤玉. 《仪器仪表学报》 (EI, CAS, CSCD, 北大核心), 2008, Issue 9, pp. 1996-2000 (5 pages)
Locally linear embedding (LLE) is a manifold dimensionality-reduction method. For high-dimensional sparse data spaces, where LLE is ill-suited to sparse sampling and the Euclidean distance formula, an extension of the algorithm is studied: a kernel function is introduced and the samples are mapped into a high-dimensional feature space. The kernel mapping improves the spatial distribution of the samples, and with a suitable choice of the number of nearest neighbors the improved LLE method yields good results. For the low-dimensional dataset recovered from the high-dimensional samples, the outlier-data hypothesis proposed in this paper, combined with the outlier clustering method given here, is used to judge whether the recovered low-dimensional data are outliers. Simulation results show that the method can effectively detect outliers in high-dimensional datasets; at the same time, the algorithm has the advantages of simple parameter estimation and low sensitivity to its parameters, offering a new machine-learning route to the outlier-detection problem.
Keywords: kernel function; dimensionality reduction; linear dataset; outlier data; clustering
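The core modification described in this abstract is to replace the Euclidean distance with a kernel-induced distance when selecting LLE neighbors. Below is a minimal sketch of that step, assuming an RBF kernel and illustrative parameter values; the paper's exact kernel and settings are not reproduced here.

```python
# Sketch of a kernel-induced distance for LLE neighbour selection
# (illustrative kernel and parameters only).
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_distance(K):
    # d_K(i, j)^2 = K(i,i) - 2 K(i,j) + K(j,j), i.e. distance in feature space
    diag = np.diag(K)
    d2 = diag[:, None] - 2 * K + diag[None, :]
    return np.sqrt(np.maximum(d2, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                  # sparse, high-dimensional toy sample
D = kernel_distance(rbf_kernel(X))
k = 8
neighbours = np.argsort(D, axis=1)[:, 1:k + 1]  # k nearest neighbours per point
print(neighbours.shape)                         # (100, 8); feed these into an LLE solver
```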
3. An Overview of Kernel Methods and the Selection of Kernel Parameters (Cited by: 1)
Authors: 邱潇钰, 张化祥. 《信息技术与信息化》, 2007, Issue 6, pp. 63-65 (3 pages)
This paper introduces the advantage of support vector machines in classifying nonlinear datasets, discusses kernel-based methods, and examines what the kernel approach essentially does. Figures produced with MATLAB show that the choice of kernel parameters plays an important role in kernel-based classification. The effectiveness of existing approaches to selecting kernel parameters is summarized and compared, and the strengths and weaknesses of each selection approach are analyzed.
Keywords: support vector machine; linear dataset; kernel-based methods; kernel parameters
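One common way to choose an RBF kernel parameter is cross-validated grid search; the sketch below shows only that single strategy on a toy nonlinear dataset with scikit-learn, and is not a reproduction of the selection methods the paper compares.

```python
# One common kernel-parameter selection strategy: cross-validated grid search
# over the RBF gamma (and C) of an SVM, on a toy nonlinear dataset.
from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)   # nonlinear toy data
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"gamma": [0.01, 0.1, 1, 10], "C": [0.1, 1, 10]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```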
4. Fault Analysis of Processing Machinery Based on the Support Vector Machine (SVM) Algorithm (Cited by: 2)
Author: 梁毅峰. 《辽东学院学报(自然科学版)》 (CAS), 2021, Issue 3, pp. 153-157 (5 pages)
Addressing the complexity and huge volume of machining-equipment fault datasets in industry, the classical support vector machine (SVM) algorithm is deeply optimized: a normalized separating hyperplane is introduced to partition the dataset and the optimal solution is derived; a Gaussian radial basis function is chosen as the model's kernel to improve generalization; and fault-signal features are extracted based on sample-entropy ordering, so that nonlinear datasets can be partitioned accurately even in non-separable spaces. Simulation results and case studies show that the optimized SVM algorithm has stronger dataset-classification and fault analysis/detection performance, with good results in practical applications.
Keywords: support vector machine (SVM); machine learning; optimal solution; linear dataset
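As a rough illustration of the pipeline this abstract outlines, the sketch below extracts a sample-entropy feature from toy signal windows and classifies them with a Gaussian-RBF SVM. The window length, the embedding parameters m and r, the toy labels, and the simplified sample-entropy routine are all assumptions, not the paper's implementation.

```python
# Sketch: sample-entropy features + Gaussian-RBF SVM for fault classification
# (toy data; m, r, window size and SVM settings are illustrative assumptions).
import numpy as np
from sklearn.svm import SVC

def sample_entropy(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    r *= x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        return np.sum(d <= r) - len(templ)        # matched pairs, excluding self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(1)
signals = rng.normal(size=(40, 500))              # 40 toy vibration windows
labels = rng.integers(0, 2, size=40)              # 0 = normal, 1 = faulty (toy labels)
features = np.array([[sample_entropy(s)] for s in signals])
clf = SVC(kernel="rbf", gamma="scale").fit(features, labels)
print(clf.predict(features[:5]))
```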
5. Evaluating the impact of spatio-temporal scale on CPUE standardization (Cited by: 2)
Authors: 田思泉, 韩婵, 陈勇, 陈新军. Chinese Journal of Oceanology and Limnology (SCIE, CAS, CSCD), 2013, Issue 5, pp. 935-948 (14 pages)
This study focused on the quantitative evaluation of the impact of the spatio-temporal scale used in data collection and grouping on the standardization of CPUE (catch per unit effort). We used the Chinese squid-jigging fishery in the northwestern Pacific Ocean as an example to evaluate 24 scenarios at different spatio-temporal scales, with a combination of four levels of temporal scale (weekly, biweekly, monthly, and bimonthly) and six levels of spatial scale (longitude × latitude: 0.5°×0.5°, 0.5°×1°, 0.5°×2°, 1°×0.5°, 1°×1°, and 1°×2°). We applied generalized additive models and generalized linear models to analyze the 24 scenarios for CPUE standardization, and then the differences in the standardized CPUE among these scenarios were quantified. This study shows that combinations of different spatial and temporal scales could have different impacts on the standardization of CPUE. However, at a fine temporal scale (weekly), different spatial scales yielded similar results for standardized CPUE. The choice of spatio-temporal scale used in data collection and analysis may create added uncertainty in fisheries stock assessment and management. To identify a cost-effective spatio-temporal scale for data collection, we recommend a similar study be undertaken to facilitate the design of effective monitoring programs.
Keywords: spatio-temporal scale; CPUE standardization; generalized additive model; generalized linear model; Ommastrephes bartramii; northwestern Pacific Ocean
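Below is a minimal sketch of GLM-based CPUE standardization at one candidate spatio-temporal scale (1°×1°, monthly), using toy records and a log-transformed Gaussian model in statsmodels. The column names, grid choice, and model form are illustrative assumptions, not the paper's exact specification.

```python
# Minimal sketch of GLM-based CPUE standardization at a chosen spatio-temporal scale
# (toy data; the 1°x1°/monthly grouping and the log-Gaussian model are assumptions).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "lon": rng.uniform(150, 160, n),
    "lat": rng.uniform(38, 44, n),
    "month": rng.integers(6, 12, n),
    "cpue": rng.lognormal(mean=1.0, sigma=0.5, size=n),
})
# Group records onto a 1° x 1°, monthly grid (one candidate scale among the 24 scenarios)
df["cell_lon"] = np.floor(df["lon"]).astype(int)
df["cell_lat"] = np.floor(df["lat"]).astype(int)
model = smf.glm("np.log(cpue) ~ C(month) + C(cell_lon) + C(cell_lat)", data=df).fit()
# Back-transformed temporal effects give the standardized CPUE index
print(model.params.head())
```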
6. Advanced Irrigation Engineering: Precision and Precise (Cited by: 3)
Authors: Terry A. Howell, Steven R. Evett, Susan A. O'Shaughnessy, Paul D. Colaizzi, Prasanna H. Gowda. Journal of Agricultural Science and Technology (A), 2012, Issue 1, pp. 1-9 (9 pages)
Irrigation advances in precision irrigation (PI) or site specific irrigation (SSI) have been considerable in research; however, commercialization lags. SSI/PI has applications when soil texture variability affects soil water holding capacity or when crop yield or biotic stresses (insects or diseases) are spatially variable. SSI/PI uses variable rate application technologies, mainly with center-pivot, lateral-move, or linear irrigation machines, to match crop needs or soil water holding constraints. Variable rate applications are achieved by variable nozzle flow rates, pulsing nozzle flows, or multiple nozzles on separate submains. Newer center pivot and linear machines are controlled by on-board microprocessor systems that can be integrated with supervisory control and data acquisition controllers for both communication and control of the variable rate application for specific sets of nozzles or individual nozzles for management zones. Communication for center pivot or linear controllers typically uses radio telemetry, wireless internet links, or cellular telephones. Precision irrigation has limited utility without precise irrigation scheduling (temporally and spatially). Plant or soil sensors are used to initiate or complete an irrigation event. Automated weather stations provide site information for determining the irrigation requirement using crop models or simpler reference evapotranspiration (ET) data to be used with crop coefficients. Remote sensing is being used to measure crop water status or crop development from spectral reflectance. Near-surface remote sensing with sensors mounted on moving irrigation systems provides critical spatial integration from point weather networks and feedback on crop ET and irrigation controls in advanced automated systems for SSI/PI.
Keywords: irrigation application technology; center pivot sprinkler systems; precision agriculture; precision irrigation; site specific irrigation; irrigation scheduling; soil and crop sensors
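The scheduling step mentioned in the abstract, combining reference ET from a weather station with crop coefficients, can be sketched as an FAO-56-style daily water balance. The coefficients and depletion threshold below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of crop-coefficient scheduling (FAO-56 style ETc = Kc * ETo);
# the depletion threshold and coefficients are illustrative assumptions.
def daily_soil_water_balance(depletion_mm, eto_mm, kc, rain_mm=0.0,
                             allowable_depletion_mm=30.0):
    """Advance the root-zone depletion by one day and flag whether to irrigate."""
    etc = kc * eto_mm                       # crop evapotranspiration for the day
    depletion_mm = max(depletion_mm + etc - rain_mm, 0.0)
    irrigate = depletion_mm >= allowable_depletion_mm
    return depletion_mm, etc, irrigate

d = 10.0
for day, (eto, kc) in enumerate([(6.2, 0.9), (7.0, 0.95), (6.8, 1.0), (7.4, 1.05)], 1):
    d, etc, flag = daily_soil_water_balance(d, eto, kc)
    print(f"day {day}: ETc={etc:.1f} mm, depletion={d:.1f} mm, irrigate={flag}")
```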
7. Geostrophic meridional transport in tropical Northwest Pacific based on Argo profiles (Cited by: 4)
Authors: 张志春, 袁东亮, Peter C. CHU. Chinese Journal of Oceanology and Limnology (SCIE, CAS, CSCD), 2013, Issue 3, pp. 656-664 (9 pages)
Absolute geostrophic currents in the North Pacific Ocean were calculated using the P-vector method from newly gridded Argo profiling float data collected during 2004-2009. The meridional volume transport of the geostrophic currents differed significantly from the classical Sverdrup balance, with differences of 10×10⁶–20×10⁶ m³/s in the interior tropical Northwest Pacific Ocean. Analyses showed that errors of wind stress estimation could not explain all of the differences. The largest differences were found in the areas immediately north and south of the bifurcation latitude of the North Equatorial Current west of the dateline, and in the recirculation area of the Kuroshio and its extension, where nonlinear eddy activities were robust. Comparison of the geostrophic meridional transport and the wind-driven Sverdrup meridional transport in a high-resolution OFES simulation showed that nonlinear effects of the ocean circulation were the most likely reason for the differences. It is therefore suggested that the linear, steady wind-driven dynamics of the Sverdrup theory cannot completely explain the meridional transport of the interior circulation of the tropical Northwest Pacific Ocean.
Keywords: Sverdrup theory; absolute geostrophic current; P-vector
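For reference, the classical Sverdrup balance the abstract compares against gives the depth-integrated meridional transport as V = curl_z(τ)/(ρβ). The sketch below integrates that relation across a zonal band using a typical trade-wind stress curl; the numbers are toy values, not the paper's data.

```python
# Worked sketch of the Sverdrup meridional transport, V = curl_z(tau) / (rho * beta),
# integrated across a zonal band (toy wind-stress curl; not the paper's data).
import numpy as np

rho = 1025.0                      # seawater density, kg/m^3
omega = 7.292e-5                  # Earth's rotation rate, s^-1
R = 6.371e6                       # Earth radius, m
lat = 15.0                        # latitude, degrees N
beta = 2 * omega * np.cos(np.radians(lat)) / R      # df/dy, m^-1 s^-1

curl_tau = -1.0e-7                # wind-stress curl, N/m^3 (typical trade-wind value)
V = curl_tau / (rho * beta)       # depth-integrated meridional transport, m^2/s

width_deg = 30.0                  # zonal extent of the interior band
width_m = width_deg * 111e3 * np.cos(np.radians(lat))
transport_sv = V * width_m / 1e6  # total transport in Sverdrups (10^6 m^3/s)
print(f"beta = {beta:.2e} 1/(m s), Sverdrup transport ≈ {transport_sv:.1f} Sv")
```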
8. Effect of Recycled Coarse Aggregate on Concrete Compressive Strength (Cited by: 7)
Authors: 汪振双, 王立久, 崔正龙, 周梅. Transactions of Tianjin University (EI, CAS), 2011, Issue 3, pp. 229-234 (6 pages)
The effect of recycled coarse aggregate on concrete compressive strength was investigated based on the concrete skeleton theory. For this purpose, 30 mix proportions of concrete with target cube compressive strengths ranging from 20 to 60 MPa were cast with normal coarse aggregate and with recycled coarse aggregate from parent concretes of different strengths. Results of the 28-day tests show that the strength of the different types of recycled aggregate clearly affects the concrete strength. The coarse aggregate added to the mortar matrix plays a skeleton role and improves its compressive strength. This skeleton effect increases with the strength of the coarse aggregate: normal coarse aggregate gives the highest concrete strength, whereas the lowest strength occurs with the weak recycled coarse aggregate. There is a linear relationship between the concrete strength and the corresponding mortar matrix strength. A coarse aggregate skeleton formula is established, and values from experimental tests match the derived expressions.
Keywords: recycled coarse aggregate; compressive strength; concrete skeleton model; skeleton formula; crushing index
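The reported linear relationship between concrete strength and mortar matrix strength would be calibrated by a simple least-squares fit; here is a sketch on placeholder strength values, which are not the paper's 28-day measurements.

```python
# Sketch: fitting a linear relation f_concrete = a * f_mortar + b by least squares
# (placeholder strength values, not the paper's test data).
import numpy as np

f_mortar = np.array([25.0, 32.0, 40.0, 48.0, 55.0])     # mortar matrix strength, MPa
f_concrete = np.array([28.0, 36.0, 43.0, 52.0, 60.0])   # concrete cube strength, MPa

a, b = np.polyfit(f_mortar, f_concrete, 1)               # slope, intercept
pred = a * f_mortar + b
r2 = 1 - np.sum((f_concrete - pred) ** 2) / np.sum((f_concrete - f_concrete.mean()) ** 2)
print(f"f_concrete ≈ {a:.2f} * f_mortar + {b:.2f}  (R² = {r2:.3f})")
```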
9. Corrected-loss estimation for Error-in-Variable partially linear model (Cited by: 3)
Authors: JIN Jiao, TONG XingWei. Science China Mathematics (SCIE, CSCD), 2015, Issue 5, pp. 1101-1114 (14 pages)
We consider an Error-in-Variable partially linear model in which the covariates of the linear part are measured with error that follows a normal distribution with a known covariance matrix. We propose a corrected-loss estimator of the covariate effect. The proposed estimator is asymptotically normal. Simulation studies show that the proposed method performs well with finite samples, and the method is applied to a real data set.
Keywords: partially linear model; Error-in-Variable; robust analysis
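To convey the idea of correcting for measurement error with a known covariance, the sketch below uses the simplest fully linear special case, where subtracting nΣ from the naive Gram matrix removes the attenuation bias. This classical moment correction is only an illustration of the de-attenuation idea, not the paper's corrected-loss estimator for the partially linear model.

```python
# Sketch of the measurement-error correction idea in the plain linear special case:
# E[W'W] = X'X + n*Sigma, so (W'W - n*Sigma)^{-1} W'y undoes the attenuation.
import numpy as np

rng = np.random.default_rng(3)
n, p = 2000, 3
beta_true = np.array([1.0, -2.0, 0.5])
Sigma = 0.25 * np.eye(p)                                       # known error covariance

X = rng.normal(size=(n, p))                                    # true covariates (unobserved)
W = X + rng.multivariate_normal(np.zeros(p), Sigma, size=n)    # error-prone covariates
y = X @ beta_true + rng.normal(scale=0.5, size=n)

beta_naive = np.linalg.solve(W.T @ W, W.T @ y)                 # attenuated toward zero
beta_corrected = np.linalg.solve(W.T @ W - n * Sigma, W.T @ y) # moment-corrected
print("naive:    ", np.round(beta_naive, 2))
print("corrected:", np.round(beta_corrected, 2))
```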
10. Investigate the nonuniformity of low-energy electron beam with large cross-sections (Cited by: 1)
Authors: REN Jie, HUANG JianMing, ZHANG YuTian, LI DeMing, ZHU NanKang. Science China (Technological Sciences) (SCIE, EI, CAS), 2012, Issue 4, pp. 997-1000 (4 pages)
Over the past decades, low-energy electron accelerators have been used worldwide for surface curing and sterilization. Beam nonuniformity is an important parameter of a low-energy electron beam with a large cross-section. A simple and accurate system for measuring the nonuniformity of such beams was developed. The main idea is to measure the nonuniformity by using a linear actuator to drive two scanning wires through the beam cross-section at a fixed speed. The beam distribution is obtained by collecting the wire current signals with a USB data acquisition (DAQ) card and passing them to DAQ software on a laptop. This device is very convenient for performance testing of a new accelerator at the manufacturer's site. The distribution of the homemade low-voltage electron accelerator EBS-300-50 was measured and evaluated.
Keywords: low-energy electron beam; large cross-sections; electron beam industry accelerator; beam nonuniformity measurement
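Once the scanned wire-current profile is collected, a nonuniformity figure can be computed from it. The sketch below uses a (max − min)/(max + min) metric on a synthetic profile; both the metric and the profile are assumptions, not the paper's definitions or DAQ details.

```python
# Sketch: estimating beam nonuniformity from a scanned wire-current profile
# (synthetic profile; the figure-of-merit definition is an assumption).
import numpy as np

position_mm = np.linspace(0, 400, 201)                     # scan positions across the beam
current_uA = 50 * (1 - 0.04 * np.cos(2 * np.pi * position_mm / 400)) \
             + np.random.default_rng(4).normal(0, 0.3, position_mm.size)

# Restrict to the nominal beam window before computing the figure of merit
window = (position_mm >= 20) & (position_mm <= 380)
i = current_uA[window]
nonuniformity = (i.max() - i.min()) / (i.max() + i.min())
print(f"nonuniformity ≈ {100 * nonuniformity:.1f} %")
```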
11. An improved HASM method for dealing with large spatial data sets (Cited by: 2)
Authors: Na ZHAO, Tianxiang YUE, Chuanfa CHEN, Miaomiao ZHAO, Zhengping DU. Science China Earth Sciences (SCIE, EI, CAS, CSCD), 2018, Issue 8, pp. 1078-1087 (10 pages)
Surface modeling with very large data sets is challenging. An efficient method for modeling massive data sets using the high accuracy surface modeling method (HASM) is proposed, and HASM_Big is developed to handle very large data sets. A large data set is defined here as a large spatial domain with high resolution, leading to a linear equation with matrix dimensions of hundreds of thousands. An augmented system approach is employed to solve the equality-constrained least squares problem (LSE) produced in HASM_Big, and a block row action method is applied to solve the corresponding very large matrix equations. A matrix partitioning method is used to avoid information redundancy among the blocks and thereby accelerate the model. Experiments including numerical tests and real-world applications are used to compare the performance of HASM_Big with its previous version, HASM. Results show that the memory storage and computing speed of HASM_Big are better than those of HASM. It is found that the computational cost of HASM_Big is linearly scalable, even with massive data sets. In conclusion, HASM_Big provides a powerful tool for surface modeling, especially when there are millions or more computing grid cells.
Keywords: surface modeling; HASM; large spatial data
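The equality-constrained least-squares (LSE) problem mentioned in the abstract can be written as a single augmented (KKT) linear system; a small dense sketch of that formulation follows, with random placeholder matrices. The block partitioning and row-action solver that HASM_Big uses for very large grids are not reproduced here.

```python
# Sketch of the equality-constrained least-squares problem
#   min ||A x - b||^2  subject to  C x = d
# solved through one standard augmented (KKT) system; small random placeholders only.
import numpy as np

rng = np.random.default_rng(5)
m, n, p = 50, 20, 5
A, b = rng.normal(size=(m, n)), rng.normal(size=m)
C, d = rng.normal(size=(p, n)), rng.normal(size=p)

# Augmented system: [[A^T A, C^T], [C, 0]] [x; lambda] = [A^T b; d]
K = np.block([[A.T @ A, C.T],
              [C, np.zeros((p, p))]])
rhs = np.concatenate([A.T @ b, d])
x = np.linalg.solve(K, rhs)[:n]

print("constraint residual:", np.linalg.norm(C @ x - d))    # ~1e-14
print("objective:", np.linalg.norm(A @ x - b))
```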