Frequency modulated continuous wave(FMCW)radar is an advantageous sensor scheme for target estimation and environmental perception.However,existing algorithms based on discrete Fourier transform(DFT),multiple signal c...Frequency modulated continuous wave(FMCW)radar is an advantageous sensor scheme for target estimation and environmental perception.However,existing algorithms based on discrete Fourier transform(DFT),multiple signal classification(MUSIC)and compressed sensing,etc.,cannot achieve both low complexity and high resolution simultaneously.This paper proposes an efficient 2-D MUSIC algorithm for super-resolution target estimation/tracking based on FMCW radar.Firstly,we enhance the efficiency of 2-D MUSIC azimuth-range spectrum estimation by incorporating 2-D DFT and multi-level resolution searching strategy.Secondly,we apply the gradient descent method to tightly integrate the spatial continuity of object motion into spectrum estimation when processing multi-epoch radar data,which improves the efficiency of continuous target tracking.These two approaches have improved the algorithm efficiency by nearly 2-4 orders of magnitude without losing accuracy and resolution.Simulation experiments are conducted to validate the effectiveness of the algorithm in both single-epoch estimation and multi-epoch tracking scenarios.展开更多
Differential spatial modulation(DSM)is a multiple-input multiple-output(MIMO)transmission scheme.It has attracted extensive research interest due to its ability to transmit additional data without increasing any radio...Differential spatial modulation(DSM)is a multiple-input multiple-output(MIMO)transmission scheme.It has attracted extensive research interest due to its ability to transmit additional data without increasing any radio frequency chain.In this paper,DSM is investigated using two mapping algorithms:Look-Up Table Order(LUTO)and Permutation Method(PM).Then,the bit error rate(BER)performance and complexity of the two mapping algorithms in various antennas and modulation methods are verified by simulation experiments.The results show that PM has a lower BER than the LUTO mapping algorithm,and the latter has lower complexity than the former.展开更多
A full-polarimetric super-resolution algorithm with spatial smoothing processing is presented for one-dimensional(1-D)radar imaging.The coherence between scattering centers is minimized by using spatial smoothing pr...A full-polarimetric super-resolution algorithm with spatial smoothing processing is presented for one-dimensional(1-D)radar imaging.The coherence between scattering centers is minimized by using spatial smoothing processing(SSP).Then the range and polarimetric scattering matrix of the scattering centers are estimated.The impact of different lengths of the smoothing window on the imaging quality is mainly analyzed with different signal-to-noise ratios(SNR).Simulation and experimental results show that an improved radar super-resolution range profile and more precise estimation can be obtained by adjusting the length of the smoothing window under different SNR conditions.展开更多
Passive millimeter wave (PMMW) images inherently have the problem of poor resolution owing to limited aperture dimension. Thus, efficient post-processing is necessary to achieve resolution improvement. An adaptive p...Passive millimeter wave (PMMW) images inherently have the problem of poor resolution owing to limited aperture dimension. Thus, efficient post-processing is necessary to achieve resolution improvement. An adaptive projected Landweber (APL) super-resolution algorithm using a spectral correction procedure, which attempts to combine the strong points of all of the projected Landweber (PL) iteration and the adaptive relaxation parameter adjustment and the spectral correction method, is proposed. In the algorithm, the PL iterations are implemented as the main image restoration scheme and a spectral correction method is included in which the calculated spectrum within the passband is replaced by the known low frequency component. Then, the algorithm updates the relaxation parameter adaptively at each iteration. A qualitative evaluation of this algorithm is performed with simulated data as well as actual radiometer image captured by 91.5 GHz mechanically scanned radiometer. From experiments, it is found that the super-resolution algorithm obtains better results and enhances the resolution and has lower mean square error (MSE). These constraints and adaptive character and spectral correction procedures speed up the convergence of the Landweber algorithm and reduce the ringing effects that are caused by regularizing the image restoration problem.展开更多
The traditional deterministic analysis for tunnel face stability neglects the uncertainties of geotechnical parameters,while the simplified reliability analysis which models the potential uncertainties by means of ran...The traditional deterministic analysis for tunnel face stability neglects the uncertainties of geotechnical parameters,while the simplified reliability analysis which models the potential uncertainties by means of random variables usually fails to account for soil spatial variability.To overcome these limitations,this study proposes an efficient framework for conducting reliability analysis and reliability-based design(RBD)of tunnel face stability in spatially variable soil strata.The three-dimensional(3D)rotational failure mechanism of the tunnel face is extended to account for the soil spatial variability,and a probabilistic framework is established by coupling the extended mechanism with the improved Hasofer-Lind-Rackwits-Fiessler recursive algorithm(iHLRF)as well as its inverse analysis formulation.The proposed framework allows for rapid and precise reliability analysis and RBD of tunnel face stability.To demonstrate the feasibility and efficacy of the proposed framework,an illustrative case of tunnelling in frictional soils is presented,where the soil's cohesion and friction angle are modelled as two anisotropic cross-correlated lognormal random fields.The results show that the proposed method can accurately estimate the failure probability(or reliability index)regarding the tunnel face stability and can efficiently determine the required supporting pressure for a target reliability index with soil spatial variability being taken into account.Furthermore,this study reveals the impact of various factors on the support pressure,including coefficient of variation,cross-correlation between cohesion and friction angle,as well as autocorrelation distance of spatially variable soil strata.The results also demonstrate the feasibility of using the forward and/or inverse first-order reliability method(FORM)in high-dimensional stochastic problems.It is hoped that this study may provide a practical and reliable framework for determining the stability of tunnels in complex soil strata.展开更多
To improve the performance of the traditional map matching algorithms in freeway traffic state monitoring systems using the low logging frequency GPS (global positioning system) probe data, a map matching algorithm ...To improve the performance of the traditional map matching algorithms in freeway traffic state monitoring systems using the low logging frequency GPS (global positioning system) probe data, a map matching algorithm based on the Oracle spatial data model is proposed. The algorithm uses the Oracle road network data model to analyze the spatial relationships between massive GPS positioning points and freeway networks, builds an N-shortest path algorithm to find reasonable candidate routes between GPS positioning points efficiently, and uses the fuzzy logic inference system to determine the final matched traveling route. According to the implementation with field data from Los Angeles, the computation speed of the algorithm is about 135 GPS positioning points per second and the accuracy is 98.9%. The results demonstrate the effectiveness and accuracy of the proposed algorithm for mapping massive GPS positioning data onto freeway networks with complex geometric characteristics.展开更多
Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recogni...Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recognition, image processing, and etc. We combine sampling technique with DBSCAN algorithm to cluster large spatial databases, and two sampling based DBSCAN (SDBSCAN) algorithms are developed. One algorithm introduces sampling technique inside DBSCAN, and the other uses sampling procedure outside DBSCAN. Experimental results demonstrate that our algorithms are effective and efficient in clustering large scale spatial databases.展开更多
Human beings’ intellection is the characteristic of a distinct hierarchy and can be taken to construct a heuristic in the shortest path algorithms.It is detailed in this paper how to utilize the hierarchical reasonin...Human beings’ intellection is the characteristic of a distinct hierarchy and can be taken to construct a heuristic in the shortest path algorithms.It is detailed in this paper how to utilize the hierarchical reasoning on the basis of greedy and directional strategy to establish a spatial heuristic,so as to improve running efficiency and suitability of shortest path algorithm for traffic network.The authors divide urban traffic network into three hierarchies and set forward a new node hierarchy division rule to avoid the unreliable solution of shortest path.It is argued that the shortest path,no matter distance shortest or time shortest,is usually not the favorite of drivers in practice.Some factors difficult to expect or quantify influence the drivers’ choice greatly.It makes the drivers prefer choosing a less shortest,but more reliable or flexible path to travel on.The presented optimum path algorithm,in addition to the improvement of the running efficiency of shortest path algorithms up to several times,reduces the emergence of those factors,conforms to the intellection characteristic of human beings,and is more easily accepted by drivers.Moreover,it does not require the completeness of networks in the lowest hierarchy and the applicability and fault tolerance of the algorithm have improved.The experiment result shows the advantages of the presented algorithm.The authors argued that the algorithm has great potential application for navigation systems of large_scale traffic networks.展开更多
Based on the mechanism of imagery, a novel method called the delaminating combining template method, used for the problem of super-resolution reconstruction from image sequence, is described in this paper. The combini...Based on the mechanism of imagery, a novel method called the delaminating combining template method, used for the problem of super-resolution reconstruction from image sequence, is described in this paper. The combining template method contains two steps: a delaminating strategy and a combining template algorithm. The delaminating strategy divides the original problem into several sub-problems; each of them is only eonnected to one degrading factor. The combining template algorithm is suggested to resolve each sub-problem. In addition, to verify the valid of the method, a new index called oriental entropy is presented. The results from the theoretical analysis and experiments illustrate that this method to be promising and efficient.展开更多
To develop a better approach for spatial evaluation of drinking water quality, an intelligent evaluation method integrating a geographical information system(GIS) and an ant colony clustering algorithm(ACCA) was used....To develop a better approach for spatial evaluation of drinking water quality, an intelligent evaluation method integrating a geographical information system(GIS) and an ant colony clustering algorithm(ACCA) was used. Drinking water samples from 29 wells in Zhenping County, China, were collected and analyzed. 35 parameters on water quality were selected, such as chloride concentration, sulphate concentration, total hardness, nitrate concentration, fluoride concentration, turbidity, pH, chromium concentration, COD, bacterium amount, total coliforms and color. The best spatial interpolation methods for the 35 parameters were found and selected from all types of interpolation methods in GIS environment according to the minimum cross-validation errors. The ACCA was improved through three strategies, namely mixed distance function, average similitude degree and probability conversion functions. Then, the ACCA was carried out to obtain different water quality grades in the GIS environment. In the end, the result from the ACCA was compared with those from the competitive Hopfield neural network(CHNN) to validate the feasibility and effectiveness of the ACCA according to three evaluation indexes, which are stochastic sampling method, pixel amount and convergence speed. It is shown that the spatial water quality grades obtained from the ACCA were more effective, accurate and intelligent than those obtained from the CHNN.展开更多
Considering the characteristics of spatial straightness error, this paper puts forward a kind of evaluation method of spatial straightness error using Geometric Approximation Searching Algorithm (GASA). According to t...Considering the characteristics of spatial straightness error, this paper puts forward a kind of evaluation method of spatial straightness error using Geometric Approximation Searching Algorithm (GASA). According to the minimum condition principle of form error evaluation, the mathematic model and optimization objective of the GASA are given. The algorithm avoids the optimization and linearization, and can be fulfilled in three steps. First construct two parallel quadrates based on the preset two reference points of the spatial line respectively;second construct centerlines by connecting one quadrate each vertices to another quadrate each vertices;after that, calculate the distances between measured points and the constructed centerlines. The minimum zone straightness error is obtained by repeating comparing and reconstructing quadrates. The principle and steps of the algorithm to evaluate spatial straightness error is described in detail, and the mathematical formula and program flowchart are given also. Results show that this algorithm can evaluate spatial straightness error more effectively and exactly.展开更多
A new method of super-resolution image reconstruction is proposed, which uses a three-step-training error backpropagation neural network (BPNN) to realize the super-resolution reconstruction (SRR) of satellite ima...A new method of super-resolution image reconstruction is proposed, which uses a three-step-training error backpropagation neural network (BPNN) to realize the super-resolution reconstruction (SRR) of satellite image. The method is based on BPNN. First, three groups learning samples with different resolutions are obtained according to image observation model, and then vector mappings are respectively used to those three group learning samples to speed up the convergence of BPNN, at last, three times consecutive training are carried on the BPNN. Training samples used in each step are of higher resolution than those used in the previous steps, so the increasing weights store a great amount of information for SRR, and network performance and generalization ability are improved greatly. Simulation and generalization tests are carried on the well-trained three-step-training NN respectively, and the reconstruction results with higher resolution images verify the effectiveness and validity of this method.展开更多
Image scanning microscopy based on pixel reassignment can improve the confocal resolution limit without losing the image signal-to-noise ratio(SNR)greatly[C.J.R.Sheppard,"Super resolution in confocal imaging,&quo...Image scanning microscopy based on pixel reassignment can improve the confocal resolution limit without losing the image signal-to-noise ratio(SNR)greatly[C.J.R.Sheppard,"Super resolution in confocal imaging,"Optik 80(2)53-54(1988).C.B.Miller,E.Jorg,"Image scanning microscopy,"Phys.Reu.Lett.104(19)198101(2010).C.J.R.Sheppard,s.B.Mehta,R Heintzmann,"Superresolution by image scanning microscopy using pixel reassignment,"Opt.Lett.38(15)28892892(2013)].Here,we use a tailor-made optical fiber and 19 avalanche pho-todiodes(APDs)as parallel detectors to upgrade our existing confocal microscopy,termed as parallel-detection super resolution(PDSR)microscopy.In order to obtain the correct shift value,we use the normalized 2D cross correlation to calculate the shifting value of each image.We characterized our system performance by imaging fuorescence beads and applied this system to observing the 3D structure of biological specimen.展开更多
A general regression neural network model,combined with an interative algorithm(GRNNI)using sparsely distributed samples and auxiliary environmental variables was proposed to predict both spatial distribution and vari...A general regression neural network model,combined with an interative algorithm(GRNNI)using sparsely distributed samples and auxiliary environmental variables was proposed to predict both spatial distribution and variability of soil organic matter(SOM)in a bamboo forest.The auxiliary environmental variables were:elevation,slope,mean annual temperature,mean annual precipitation,and normalized difference vegetation index.The prediction accuracy of this model was assessed via three accuracy indices,mean error(ME),mean absolute error(MAE),and root mean squared error(RMSE)for validation in sampling sites.Both the prediction accuracy and reliability of this model were compared to those of regression kriging(RK)and ordinary kriging(OK).The results show that the prediction accuracy of the GRNNI model was higher than that of both RK and OK.The three accuracy indices(ME,MAE,and RMSE)of the GRNNI model were lower than those of RK and OK.Relative improvements of RMSE of the GRNNI model compared with RK and OK were 13.6%and 17.5%,respectively.In addition,a more realistic spatial pattern of SOM was produced by the model because the GRNNI model was more suitable than multiple linear regression to capture the nonlinear relationship between SOM and the auxiliary environmental variables.Therefore,the GRNNI model can improve both prediction accuracy and reliability for determining spatial distribution and variability of SOM.展开更多
With the rapid growth of spatial data,POI(Point of Interest)is becoming ever more intensive,and the text description of each spatial point is also gradually increasing.The traditional query method can only address the...With the rapid growth of spatial data,POI(Point of Interest)is becoming ever more intensive,and the text description of each spatial point is also gradually increasing.The traditional query method can only address the problem that the text description is less and single keyword query.In view of this situation,the paper proposes an approximate matching algorithm to support spatial multi-keyword.The fuzzy matching algorithm is integrated into this algorithm,which not only supports multiple POI queries,but also supports fault tolerance of the query keywords.The simulation results demonstrate that the proposed algorithm can improve the accuracy and efficiency of query.展开更多
This paper describes the nearest neighbor (NN) search algorithm on the GBD(generalized BD) tree. The GBD tree is a spatial data structure suitable for two-or three-dimensional data and has good performance characteris...This paper describes the nearest neighbor (NN) search algorithm on the GBD(generalized BD) tree. The GBD tree is a spatial data structure suitable for two-or three-dimensional data and has good performance characteristics with respect to the dynamic data environment. On GIS and CAD systems, the R-tree and its successors have been used. In addition, the NN search algorithm is also proposed in an attempt to obtain good performance from the R-tree. On the other hand, the GBD tree is superior to the R-tree with respect to exact match retrieval, because the GBD tree has auxiliary data that uniquely determines the position of the object in the structure. The proposed NN search algorithm depends on the property of the GBD tree described above. The NN search algorithm on the GBD tree was studied and the performance thereof was evaluated through experiments.展开更多
Rapid development of deepfake technology led to the spread of forged audios and videos across network platforms,presenting risks for numerous countries,societies,and individuals,and posing a serious threat to cyberspa...Rapid development of deepfake technology led to the spread of forged audios and videos across network platforms,presenting risks for numerous countries,societies,and individuals,and posing a serious threat to cyberspace security.To address the problem of insufficient extraction of spatial features and the fact that temporal features are not considered in the deepfake video detection,we propose a detection method based on improved CapsNet and temporal–spatial features(iCapsNet–TSF).First,the dynamic routing algorithm of CapsNet is improved using weight initialization and updating.Then,the optical flow algorithm is used to extract interframe temporal features of the videos to form a dataset of temporal–spatial features.Finally,the iCapsNet model is employed to fully learn the temporal–spatial features of facial videos,and the results are fused.Experimental results show that the detection accuracy of iCapsNet–TSF reaches 94.07%,98.83%,and 98.50%on the Celeb-DF,FaceSwap,and Deepfakes datasets,respectively,displaying a better performance than most existing mainstream algorithms.The iCapsNet–TSF method combines the capsule network and the optical flow algorithm,providing a novel strategy for the deepfake detection,which is of great significance to the prevention of deepfake attacks and the preservation of cyberspace security.展开更多
The majority of spatial data reveal some degree of spatial dependence. The term “spatial dependence” refers to the tendency for phenomena to be more similar when they occur close together than when they occur far ap...The majority of spatial data reveal some degree of spatial dependence. The term “spatial dependence” refers to the tendency for phenomena to be more similar when they occur close together than when they occur far apart in space. This property is ignored in machine learning (ML) for spatial domains of application. Most classical machine learning algorithms are generally inappropriate unless modified in some way to account for it. In this study, we proposed an approach that aimed to improve a ML model to detect the dependence without incorporating any spatial features in the learning process. To detect this dependence while also improving performance, a hybrid model was used based on two representative algorithms. In addition, cross-validation method was used to make the model stable. Furthermore, global moran’s I and local moran were used to capture the spatial dependence in the residuals. The results show that the HM has significant with a R2 of 99.91% performance compared to RBFNN and RF that have 74.22% and 82.26% as R2 respectively. With lower errors, the HM was able to achieve an average test error of 0.033% and a positive global moran’s of 0.12. We concluded that as the R2 value increases, the models become weaker in terms of capturing the dependence.展开更多
基金funded by the National Natural Science Foundation of China,grant number 42074176,U1939204。
文摘Frequency modulated continuous wave(FMCW)radar is an advantageous sensor scheme for target estimation and environmental perception.However,existing algorithms based on discrete Fourier transform(DFT),multiple signal classification(MUSIC)and compressed sensing,etc.,cannot achieve both low complexity and high resolution simultaneously.This paper proposes an efficient 2-D MUSIC algorithm for super-resolution target estimation/tracking based on FMCW radar.Firstly,we enhance the efficiency of 2-D MUSIC azimuth-range spectrum estimation by incorporating 2-D DFT and multi-level resolution searching strategy.Secondly,we apply the gradient descent method to tightly integrate the spatial continuity of object motion into spectrum estimation when processing multi-epoch radar data,which improves the efficiency of continuous target tracking.These two approaches have improved the algorithm efficiency by nearly 2-4 orders of magnitude without losing accuracy and resolution.Simulation experiments are conducted to validate the effectiveness of the algorithm in both single-epoch estimation and multi-epoch tracking scenarios.
基金supported by the National Natural Science Foundation of China(NSFC)under Grant No.62061024the Project of Gansu Province Science and Technology Department under Grant No.22ZD6GA055.
文摘Differential spatial modulation(DSM)is a multiple-input multiple-output(MIMO)transmission scheme.It has attracted extensive research interest due to its ability to transmit additional data without increasing any radio frequency chain.In this paper,DSM is investigated using two mapping algorithms:Look-Up Table Order(LUTO)and Permutation Method(PM).Then,the bit error rate(BER)performance and complexity of the two mapping algorithms in various antennas and modulation methods are verified by simulation experiments.The results show that PM has a lower BER than the LUTO mapping algorithm,and the latter has lower complexity than the former.
基金Supported by the National Naturral Science Foundation of China(61301191)
文摘A full-polarimetric super-resolution algorithm with spatial smoothing processing is presented for one-dimensional(1-D)radar imaging.The coherence between scattering centers is minimized by using spatial smoothing processing(SSP).Then the range and polarimetric scattering matrix of the scattering centers are estimated.The impact of different lengths of the smoothing window on the imaging quality is mainly analyzed with different signal-to-noise ratios(SNR).Simulation and experimental results show that an improved radar super-resolution range profile and more precise estimation can be obtained by adjusting the length of the smoothing window under different SNR conditions.
基金the National Natural Science Foundation of China (60632020).
文摘Passive millimeter wave (PMMW) images inherently have the problem of poor resolution owing to limited aperture dimension. Thus, efficient post-processing is necessary to achieve resolution improvement. An adaptive projected Landweber (APL) super-resolution algorithm using a spectral correction procedure, which attempts to combine the strong points of all of the projected Landweber (PL) iteration and the adaptive relaxation parameter adjustment and the spectral correction method, is proposed. In the algorithm, the PL iterations are implemented as the main image restoration scheme and a spectral correction method is included in which the calculated spectrum within the passband is replaced by the known low frequency component. Then, the algorithm updates the relaxation parameter adaptively at each iteration. A qualitative evaluation of this algorithm is performed with simulated data as well as actual radiometer image captured by 91.5 GHz mechanically scanned radiometer. From experiments, it is found that the super-resolution algorithm obtains better results and enhances the resolution and has lower mean square error (MSE). These constraints and adaptive character and spectral correction procedures speed up the convergence of the Landweber algorithm and reduce the ringing effects that are caused by regularizing the image restoration problem.
基金supported by the National Natural Science Foundation of China(Grant No.U22A20594)the Fundamental Research Funds for the Central Universities(Grant No.B230205028)the Postgraduate Research&Practice Innovation Program of Jiangsu Province(Grant No.KYCX23_0694).
文摘The traditional deterministic analysis for tunnel face stability neglects the uncertainties of geotechnical parameters,while the simplified reliability analysis which models the potential uncertainties by means of random variables usually fails to account for soil spatial variability.To overcome these limitations,this study proposes an efficient framework for conducting reliability analysis and reliability-based design(RBD)of tunnel face stability in spatially variable soil strata.The three-dimensional(3D)rotational failure mechanism of the tunnel face is extended to account for the soil spatial variability,and a probabilistic framework is established by coupling the extended mechanism with the improved Hasofer-Lind-Rackwits-Fiessler recursive algorithm(iHLRF)as well as its inverse analysis formulation.The proposed framework allows for rapid and precise reliability analysis and RBD of tunnel face stability.To demonstrate the feasibility and efficacy of the proposed framework,an illustrative case of tunnelling in frictional soils is presented,where the soil's cohesion and friction angle are modelled as two anisotropic cross-correlated lognormal random fields.The results show that the proposed method can accurately estimate the failure probability(or reliability index)regarding the tunnel face stability and can efficiently determine the required supporting pressure for a target reliability index with soil spatial variability being taken into account.Furthermore,this study reveals the impact of various factors on the support pressure,including coefficient of variation,cross-correlation between cohesion and friction angle,as well as autocorrelation distance of spatially variable soil strata.The results also demonstrate the feasibility of using the forward and/or inverse first-order reliability method(FORM)in high-dimensional stochastic problems.It is hoped that this study may provide a practical and reliable framework for determining the stability of tunnels in complex soil strata.
文摘To improve the performance of the traditional map matching algorithms in freeway traffic state monitoring systems using the low logging frequency GPS (global positioning system) probe data, a map matching algorithm based on the Oracle spatial data model is proposed. The algorithm uses the Oracle road network data model to analyze the spatial relationships between massive GPS positioning points and freeway networks, builds an N-shortest path algorithm to find reasonable candidate routes between GPS positioning points efficiently, and uses the fuzzy logic inference system to determine the final matched traveling route. According to the implementation with field data from Los Angeles, the computation speed of the algorithm is about 135 GPS positioning points per second and the accuracy is 98.9%. The results demonstrate the effectiveness and accuracy of the proposed algorithm for mapping massive GPS positioning data onto freeway networks with complex geometric characteristics.
基金Supported by the Open Researches Fund Program of L IESMARS(WKL(0 0 ) 0 30 2 )
文摘Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recognition, image processing, and etc. We combine sampling technique with DBSCAN algorithm to cluster large spatial databases, and two sampling based DBSCAN (SDBSCAN) algorithms are developed. One algorithm introduces sampling technique inside DBSCAN, and the other uses sampling procedure outside DBSCAN. Experimental results demonstrate that our algorithms are effective and efficient in clustering large scale spatial databases.
文摘Human beings’ intellection is the characteristic of a distinct hierarchy and can be taken to construct a heuristic in the shortest path algorithms.It is detailed in this paper how to utilize the hierarchical reasoning on the basis of greedy and directional strategy to establish a spatial heuristic,so as to improve running efficiency and suitability of shortest path algorithm for traffic network.The authors divide urban traffic network into three hierarchies and set forward a new node hierarchy division rule to avoid the unreliable solution of shortest path.It is argued that the shortest path,no matter distance shortest or time shortest,is usually not the favorite of drivers in practice.Some factors difficult to expect or quantify influence the drivers’ choice greatly.It makes the drivers prefer choosing a less shortest,but more reliable or flexible path to travel on.The presented optimum path algorithm,in addition to the improvement of the running efficiency of shortest path algorithms up to several times,reduces the emergence of those factors,conforms to the intellection characteristic of human beings,and is more easily accepted by drivers.Moreover,it does not require the completeness of networks in the lowest hierarchy and the applicability and fault tolerance of the algorithm have improved.The experiment result shows the advantages of the presented algorithm.The authors argued that the algorithm has great potential application for navigation systems of large_scale traffic networks.
文摘Based on the mechanism of imagery, a novel method called the delaminating combining template method, used for the problem of super-resolution reconstruction from image sequence, is described in this paper. The combining template method contains two steps: a delaminating strategy and a combining template algorithm. The delaminating strategy divides the original problem into several sub-problems; each of them is only eonnected to one degrading factor. The combining template algorithm is suggested to resolve each sub-problem. In addition, to verify the valid of the method, a new index called oriental entropy is presented. The results from the theoretical analysis and experiments illustrate that this method to be promising and efficient.
基金Projects(41161020,41261026) supported by the National Natural Science Foundation of ChinaProject(BQD2012013) supported by the Research starting Funds for Imported Talents,Ningxia University,China+1 种基金Project(ZR1209) supported by the Natural Science Funds,Ningxia University,ChinaProject(NGY2013005) supported by the Key Science Project of Colleges and Universities in Ningxia,China
文摘To develop a better approach for spatial evaluation of drinking water quality, an intelligent evaluation method integrating a geographical information system(GIS) and an ant colony clustering algorithm(ACCA) was used. Drinking water samples from 29 wells in Zhenping County, China, were collected and analyzed. 35 parameters on water quality were selected, such as chloride concentration, sulphate concentration, total hardness, nitrate concentration, fluoride concentration, turbidity, pH, chromium concentration, COD, bacterium amount, total coliforms and color. The best spatial interpolation methods for the 35 parameters were found and selected from all types of interpolation methods in GIS environment according to the minimum cross-validation errors. The ACCA was improved through three strategies, namely mixed distance function, average similitude degree and probability conversion functions. Then, the ACCA was carried out to obtain different water quality grades in the GIS environment. In the end, the result from the ACCA was compared with those from the competitive Hopfield neural network(CHNN) to validate the feasibility and effectiveness of the ACCA according to three evaluation indexes, which are stochastic sampling method, pixel amount and convergence speed. It is shown that the spatial water quality grades obtained from the ACCA were more effective, accurate and intelligent than those obtained from the CHNN.
文摘Considering the characteristics of spatial straightness error, this paper puts forward a kind of evaluation method of spatial straightness error using Geometric Approximation Searching Algorithm (GASA). According to the minimum condition principle of form error evaluation, the mathematic model and optimization objective of the GASA are given. The algorithm avoids the optimization and linearization, and can be fulfilled in three steps. First construct two parallel quadrates based on the preset two reference points of the spatial line respectively;second construct centerlines by connecting one quadrate each vertices to another quadrate each vertices;after that, calculate the distances between measured points and the constructed centerlines. The minimum zone straightness error is obtained by repeating comparing and reconstructing quadrates. The principle and steps of the algorithm to evaluate spatial straightness error is described in detail, and the mathematical formula and program flowchart are given also. Results show that this algorithm can evaluate spatial straightness error more effectively and exactly.
文摘A new method of super-resolution image reconstruction is proposed, which uses a three-step-training error backpropagation neural network (BPNN) to realize the super-resolution reconstruction (SRR) of satellite image. The method is based on BPNN. First, three groups learning samples with different resolutions are obtained according to image observation model, and then vector mappings are respectively used to those three group learning samples to speed up the convergence of BPNN, at last, three times consecutive training are carried on the BPNN. Training samples used in each step are of higher resolution than those used in the previous steps, so the increasing weights store a great amount of information for SRR, and network performance and generalization ability are improved greatly. Simulation and generalization tests are carried on the well-trained three-step-training NN respectively, and the reconstruction results with higher resolution images verify the effectiveness and validity of this method.
基金sponsored by National Natural Science Foundation of China(61827825 and 61735017)Fundamental Research Funds for the Central Universities(2019XZZX003-06)+1 种基金Natural Science Foundation of Zhejiang province(LR16F050001)Zhejiang Lab(2018EB0ZX01).
文摘Image scanning microscopy based on pixel reassignment can improve the confocal resolution limit without losing the image signal-to-noise ratio(SNR)greatly[C.J.R.Sheppard,"Super resolution in confocal imaging,"Optik 80(2)53-54(1988).C.B.Miller,E.Jorg,"Image scanning microscopy,"Phys.Reu.Lett.104(19)198101(2010).C.J.R.Sheppard,s.B.Mehta,R Heintzmann,"Superresolution by image scanning microscopy using pixel reassignment,"Opt.Lett.38(15)28892892(2013)].Here,we use a tailor-made optical fiber and 19 avalanche pho-todiodes(APDs)as parallel detectors to upgrade our existing confocal microscopy,termed as parallel-detection super resolution(PDSR)microscopy.In order to obtain the correct shift value,we use the normalized 2D cross correlation to calculate the shifting value of each image.We characterized our system performance by imaging fuorescence beads and applied this system to observing the 3D structure of biological specimen.
基金The article is supported by National Key Research and Development Projects of P.R.China(No.2018YFD0600100).
文摘A general regression neural network model,combined with an interative algorithm(GRNNI)using sparsely distributed samples and auxiliary environmental variables was proposed to predict both spatial distribution and variability of soil organic matter(SOM)in a bamboo forest.The auxiliary environmental variables were:elevation,slope,mean annual temperature,mean annual precipitation,and normalized difference vegetation index.The prediction accuracy of this model was assessed via three accuracy indices,mean error(ME),mean absolute error(MAE),and root mean squared error(RMSE)for validation in sampling sites.Both the prediction accuracy and reliability of this model were compared to those of regression kriging(RK)and ordinary kriging(OK).The results show that the prediction accuracy of the GRNNI model was higher than that of both RK and OK.The three accuracy indices(ME,MAE,and RMSE)of the GRNNI model were lower than those of RK and OK.Relative improvements of RMSE of the GRNNI model compared with RK and OK were 13.6%and 17.5%,respectively.In addition,a more realistic spatial pattern of SOM was produced by the model because the GRNNI model was more suitable than multiple linear regression to capture the nonlinear relationship between SOM and the auxiliary environmental variables.Therefore,the GRNNI model can improve both prediction accuracy and reliability for determining spatial distribution and variability of SOM.
文摘With the rapid growth of spatial data,POI(Point of Interest)is becoming ever more intensive,and the text description of each spatial point is also gradually increasing.The traditional query method can only address the problem that the text description is less and single keyword query.In view of this situation,the paper proposes an approximate matching algorithm to support spatial multi-keyword.The fuzzy matching algorithm is integrated into this algorithm,which not only supports multiple POI queries,but also supports fault tolerance of the query keywords.The simulation results demonstrate that the proposed algorithm can improve the accuracy and efficiency of query.
文摘This paper describes the nearest neighbor (NN) search algorithm on the GBD(generalized BD) tree. The GBD tree is a spatial data structure suitable for two-or three-dimensional data and has good performance characteristics with respect to the dynamic data environment. On GIS and CAD systems, the R-tree and its successors have been used. In addition, the NN search algorithm is also proposed in an attempt to obtain good performance from the R-tree. On the other hand, the GBD tree is superior to the R-tree with respect to exact match retrieval, because the GBD tree has auxiliary data that uniquely determines the position of the object in the structure. The proposed NN search algorithm depends on the property of the GBD tree described above. The NN search algorithm on the GBD tree was studied and the performance thereof was evaluated through experiments.
基金supported by the Fundamental Research Funds for the Central Universities under Grant 2020JKF101the Research Funds of Sugon under Grant 2022KY001.
文摘Rapid development of deepfake technology led to the spread of forged audios and videos across network platforms,presenting risks for numerous countries,societies,and individuals,and posing a serious threat to cyberspace security.To address the problem of insufficient extraction of spatial features and the fact that temporal features are not considered in the deepfake video detection,we propose a detection method based on improved CapsNet and temporal–spatial features(iCapsNet–TSF).First,the dynamic routing algorithm of CapsNet is improved using weight initialization and updating.Then,the optical flow algorithm is used to extract interframe temporal features of the videos to form a dataset of temporal–spatial features.Finally,the iCapsNet model is employed to fully learn the temporal–spatial features of facial videos,and the results are fused.Experimental results show that the detection accuracy of iCapsNet–TSF reaches 94.07%,98.83%,and 98.50%on the Celeb-DF,FaceSwap,and Deepfakes datasets,respectively,displaying a better performance than most existing mainstream algorithms.The iCapsNet–TSF method combines the capsule network and the optical flow algorithm,providing a novel strategy for the deepfake detection,which is of great significance to the prevention of deepfake attacks and the preservation of cyberspace security.
文摘The majority of spatial data reveal some degree of spatial dependence. The term “spatial dependence” refers to the tendency for phenomena to be more similar when they occur close together than when they occur far apart in space. This property is ignored in machine learning (ML) for spatial domains of application. Most classical machine learning algorithms are generally inappropriate unless modified in some way to account for it. In this study, we proposed an approach that aimed to improve a ML model to detect the dependence without incorporating any spatial features in the learning process. To detect this dependence while also improving performance, a hybrid model was used based on two representative algorithms. In addition, cross-validation method was used to make the model stable. Furthermore, global moran’s I and local moran were used to capture the spatial dependence in the residuals. The results show that the HM has significant with a R2 of 99.91% performance compared to RBFNN and RF that have 74.22% and 82.26% as R2 respectively. With lower errors, the HM was able to achieve an average test error of 0.033% and a positive global moran’s of 0.12. We concluded that as the R2 value increases, the models become weaker in terms of capturing the dependence.