Funding: Project (09JJ1008) supported by the Hunan Provincial Science Foundation of China
Abstract: Based on image theory, analytical solutions for tunneling-induced ground displacement were derived in conjunction with the nonuniform convergence model. The reasonable value of Poisson's ratio in the analytical solution was discussed. A ground settlement width parameter, which reflects the ground condition, was introduced to modify the analytical solutions proposed above, and new analytical solutions were presented. To evaluate the validity of the present solutions using the nonuniform convergence model, the results were compared with observed values from four engineering projects, comprising 38 measured ground settlement data points. The agreement shows that the present solutions using the nonuniform convergence model are effective for evaluating tunneling-induced ground displacements.
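The abstract's settlement width parameter plays the role of the trough width in the classical Gaussian (Peck) settlement curve. The sketch below is not the paper's image-theory solution; it is a minimal illustration of how a trough width parameter i shapes a surface settlement profile, with the maximum settlement s_max and i as assumed inputs.

```python
import math

def peck_settlement(x, s_max, i):
    """Surface settlement at horizontal offset x from the tunnel axis,
    using the classical Gaussian (Peck) trough; i is the settlement
    trough width parameter that reflects the ground condition."""
    return s_max * math.exp(-x * x / (2.0 * i * i))

# Illustrative values: 30 mm maximum settlement, 10 m trough width
print(peck_settlement(0.0, 30.0, 10.0))   # settlement at the tunnel axis
print(peck_settlement(10.0, 30.0, 10.0))  # one trough-width away (smaller)
```

A larger i spreads the same volume loss over a wider, shallower trough, which is how the width parameter lets the solution adapt to different ground conditions.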
Funding: Under the auspices of the National Natural Science Foundation of China (No. 40171015)
Abstract: TOPMODEL, a semi-distributed hydrological model, has been widely used. In the model's simulation process, a Digital Elevation Model (DEM) provides input data such as the topographic index and the distance to the drainage outlet; the DEM therefore plays an important role in TOPMODEL. This study examines the impact of DEM uncertainty on the simulation results of TOPMODEL, evaluating the effects both quantitatively and qualitatively. Firstly, DEM uncertainty was simulated using the Monte Carlo method, and for every DEM realization the topographic index and the distance to the drainage outlet were extracted. Secondly, the obtained topographic index and distance to the drainage outlet were input to TOPMODEL to simulate seven rainstorm-flood events, and four evaluation indices were recorded: the Nash and Sutcliffe efficiency criterion (EFF), the sum of squared residuals over all time steps (SSE), the sum of squared log residuals over all time steps (SLE), and the sum of absolute errors over all time steps (SAE). Thirdly, these four indices were analyzed statistically (minimum, maximum, range, standard deviation, and mean), and the effect of DEM uncertainty on TOPMODEL was quantitatively assessed. Finally, the hydrographs simulated by TOPMODEL using the original DEM and the DEM realizations were qualitatively compared for each flood event. Results show that the effect of DEM uncertainty on TOPMODEL is minor and can be ignored in the model's application. This can be explained by two observations: 1) TOPMODEL is not sensitive to the distributions of the topographic index and the distance to the drainage outlet; 2) these distributions are only slightly affected by DEM uncertainty.
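The four evaluation indices named in the abstract have standard definitions; a minimal sketch of computing them from observed and simulated discharge series (variable names are assumptions, and SLE requires strictly positive flows) might look like this:

```python
import math

def evaluation_indices(obs, sim):
    """Goodness-of-fit indices for a simulated vs. observed discharge
    series: Nash-Sutcliffe efficiency (EFF), sum of squared residuals
    (SSE), sum of squared log residuals (SLE), and sum of absolute
    errors (SAE), each accumulated over all time steps."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    # Log residuals emphasise low-flow fit; flows must be > 0 here
    sle = sum((math.log(o) - math.log(s)) ** 2 for o, s in zip(obs, sim))
    sae = sum(abs(o - s) for o, s in zip(obs, sim))
    eff = 1.0 - sse / sum((o - mean_obs) ** 2 for o in obs)
    return {"EFF": eff, "SSE": sse, "SLE": sle, "SAE": sae}

# A perfect simulation gives EFF = 1 and zero error sums
print(evaluation_indices([1.2, 3.4, 2.5], [1.2, 3.4, 2.5]))
```

EFF equals 1 for a perfect fit and decreases (possibly below zero) as the simulation degrades, while the three error sums all increase from zero, which is why the study tracks them together.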
Abstract: The idea of a positional inverted index is exploited for indexing a graph database. The main idea is to use hash tables to prune a considerable portion of the graph database that cannot contain the answer set. These tables are implemented using column-based techniques and are used to store the graphs of the database, frequent subgraphs, and the neighborhoods of nodes. For exact checking of the remaining graphs, a vertex invariant is used for the isomorphism test, which can be implemented in parallel. The evaluation results indicate that the proposed method outperforms existing methods.
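The pruning step described here follows the usual filter-and-verify pattern: a database graph can contain the query only if it contains every indexed feature of the query. The sketch below is a simplified illustration under assumed data shapes (graphs as sets of labeled edges), not the paper's column-based positional index:

```python
from collections import defaultdict

def build_index(graphs):
    """Inverted index: map each feature (here, a labeled edge) to the
    set of graph ids in the database that contain it."""
    index = defaultdict(set)
    for gid, edges in graphs.items():
        for edge in edges:
            index[edge].add(gid)
    return index

def candidates(index, query_edges, all_ids):
    """Filtering step: intersect the posting lists of every query
    feature, pruning graphs that cannot contain the answer."""
    result = set(all_ids)
    for edge in query_edges:
        result &= index.get(edge, set())
    return result

db = {1: {("A", "B"), ("B", "C")}, 2: {("A", "B")}, 3: {("C", "D")}}
idx = build_index(db)
print(candidates(idx, {("A", "B"), ("B", "C")}, db.keys()))
```

Only the surviving candidates then go through the expensive isomorphism test (the paper's vertex-invariant check), which is what makes the hash-table filter worthwhile.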
Funding: The Project "The Basic Research on Internet of Things Architecture" supported by the National Key Basic Research Program of China (No. 2011CB302704); the National Natural Science Foundation of China (No. 60802034); the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20070013026); the Beijing Nova Program (No. 2008B50); and the "New Generation Broadband Wireless Mobile Communication Network" Key Projects for Science and Technology Development (No. 2011ZX03002-002-01)
Abstract: Sensors are ubiquitous in the Internet of Things for measuring and collecting data. Analyzing the data derived from sensors is an essential task that can reveal useful latent information beyond the raw measurements. Since the Internet of Things contains many sorts of sensors, the measurement data they collect are of multiple types, sometimes containing temporal series information. Dealing with the different sorts of data separately would miss useful information. This paper proposes a method to discover the correlation in multi-faceted data, which contains many types of data with temporal information; the method can deal with multi-faceted data simultaneously. We transform high-dimensional multi-faceted data into lower-dimensional data modeled as a multivariate Gaussian Graphical Model, then mine the correlation in the multi-faceted data by discovering the structure of that model. We verify our method on a real data set, and the experiment demonstrates that the proposed method can correctly find the correlation among multi-faceted measurement data.
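In a Gaussian Graphical Model, two variables are conditionally independent given all the others exactly when the corresponding entry of the precision (inverse covariance) matrix is zero, so structure discovery amounts to finding the nonzero pattern of that matrix. A minimal sketch under assumed synthetic data (the paper's actual estimator and dimensionality-reduction step are not specified here):

```python
import numpy as np

def ggm_structure(data, threshold=0.1):
    """Estimate the conditional-dependence graph of a multivariate
    Gaussian: invert the sample covariance, convert to partial
    correlations, and keep entries above a magnitude threshold."""
    cov = np.cov(data, rowvar=False)
    precision = np.linalg.inv(cov)
    d = np.sqrt(np.diag(precision))
    partial = -precision / np.outer(d, d)   # partial correlations
    np.fill_diagonal(partial, 1.0)
    return np.abs(partial) > threshold

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x + 0.1 * rng.normal(size=500)   # strongly tied to x
z = rng.normal(size=500)             # independent of both
structure = ggm_structure(np.column_stack([x, y, z]))
print(structure[0, 1], structure[0, 2])  # x-y edge present; x-z edge typically absent
```

Thresholding the empirical precision matrix is the simplest approach; sparse estimators such as the graphical lasso are the more robust choice in high dimensions, but the recovered nonzero pattern is read the same way.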
Funding: Project (No. 02DZ15001) supported by the Shanghai Science and Technology Development Funds, China
Abstract: Rectification of airborne linear images is an indispensable preprocessing step. This paper presents a two-step rectification algorithm in detail. The first step establishes the direct georeference position model using data provided by the Positioning and Orientation System (POS) and obtains the mathematical relationships between image points and ground reference points. The second step applies a polynomial distortion model and bilinear interpolation to produce the final, precisely rectified images; in this step, a reference image is required and some ground control points (GCPs) are selected. Experiments showed that the final rectified images are satisfactory and that the two-step rectification algorithm is very effective.
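Bilinear interpolation, the resampling step in the second stage, weights the four pixels surrounding a fractional position by the opposite sub-pixel areas. A minimal sketch (operating on a plain nested list rather than the paper's image format, which is an assumption):

```python
def bilinear(img, x, y):
    """Bilinear resampling of img at fractional pixel position (x, y):
    blend the four neighbouring pixels by their opposite-area weights."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    return (img[y0][x0] * (1 - dx) * (1 - dy)
            + img[y0][x0 + 1] * dx * (1 - dy)
            + img[y0 + 1][x0] * (1 - dx) * dy
            + img[y0 + 1][x0 + 1] * dx * dy)

grid = [[0, 10], [20, 30]]
print(bilinear(grid, 0.5, 0.5))  # -> 15.0, the mean of the four pixels
```

During rectification, each output pixel is mapped back through the georeference and distortion models to a (generally fractional) source position, and this interpolation supplies its value.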
Abstract: Efficient data management is a key issue for large-scale distributed environments such as the data cloud, and it can be addressed by replicating the data. Data replication reduces service time and the delay in availability, increases availability, and optimizes the load distribution in the system. However, replication also increases resource and energy use, since copies of the data must be stored. We propose a replication manager that decreases the cost of resource use, energy use, and delay in the system, while increasing system availability. To this end, the proposed replication manager, called the locality replication manager (LRM), relies on two algorithms that exploit the physical adjacency of blocks. A set of simulations shows that LRM is a suitable option for distributed systems, as it uses less energy and fewer resources, optimizes load distribution, and offers higher availability and lower delay.
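The abstract does not spell out LRM's two algorithms, but the core idea of exploiting physical block adjacency can be illustrated with a hypothetical greedy placement: consecutive blocks share a primary node (so a sequential read touches few machines), and each block is also copied to the next node for availability. All names and the grouping rule below are assumptions for illustration:

```python
def place_replicas(blocks, nodes, replicas=2):
    """Greedy sketch of adjacency-aware replication: consecutive block
    ids share a primary node, and each block gets `replicas` copies on
    successive nodes for availability."""
    placement = {}
    # Ceiling division: how many consecutive blocks go on one node
    group = len(blocks) // len(nodes) + (len(blocks) % len(nodes) > 0)
    for i, block in enumerate(blocks):
        primary = i // group
        placement[block] = [nodes[(primary + r) % len(nodes)]
                            for r in range(replicas)]
    return placement

print(place_replicas(["b0", "b1", "b2", "b3"], ["n0", "n1"]))
```

Keeping adjacent blocks together reduces the number of machines (and thus the energy and transfer cost) involved in serving a contiguous read, which is the locality effect the simulations in the paper measure.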
Funding: Supported by the National High-tech R&D Program of China (Grant No. 2009AA12200101)
Abstract: With increasing urbanization and agricultural expansion, large tracts of wetlands have been disturbed or converted to other uses. Protecting wetlands requires accurate distribution maps; however, because of the great diversity of wetlands and the difficulty of field work, wetland mapping on a large spatial scale is very hard. Until recently, only a few high-resolution global wetland distribution datasets had been developed for wetland protection and restoration. In this paper, we used hydrologic and climatic variables in combination with Compound Topographic Index (CTI) data to model the average annual water table depth on 30 arc-second grids over the continental areas of the world except Antarctica. The water table depth was modeled without considering the influence of anthropogenic activities. We adopted a relationship between potential wetland distribution and water table depth to develop a global wetland suitability distribution dataset. The modeling results showed that the total global wetland area reaches 3.316 × 10^7 km^2. Remote-sensing-based validation against a compilation of wetland areas from multiple sources indicates that the overall accuracy of our product is 83.7%. This result can serve as the basis for mapping the actual global wetland distribution. Because the modeling did not account for the impact of anthropogenic water management, such as irrigation and reservoir construction, over suitable wetland areas, our result represents an upper bound on wetland area compared with some other global wetland datasets. Our method requires relatively few datasets and achieves higher accuracy than a recently developed global wetland dataset.
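The final two steps of the workflow, turning a water table depth grid into a wetland suitability mask and scoring it against a reference map, can be sketched as below. The 0.25 m depth threshold is an illustrative assumption, not the relationship calibrated in the paper:

```python
def wetland_mask(water_table_depth, threshold=0.25):
    """Flag grid cells whose mean annual water table is at or shallower
    than `threshold` metres below the surface as potential wetland."""
    return [[d <= threshold for d in row] for row in water_table_depth]

def overall_accuracy(predicted, reference):
    """Fraction of cells where the predicted mask agrees with a
    reference (e.g. remote-sensing-derived) wetland map."""
    total = hits = 0
    for prow, rrow in zip(predicted, reference):
        for p, r in zip(prow, rrow):
            total += 1
            hits += (p == r)
    return hits / total

depth = [[0.1, 0.5], [0.2, 2.0]]          # metres below the surface
ref = [[True, False], [True, False]]      # reference wetland map
pred = wetland_mask(depth)
print(overall_accuracy(pred, ref))  # -> 1.0 on this toy grid
```

The paper's reported 83.7% is exactly this kind of overall cell-agreement score, computed against wetland areas compiled from multiple remote-sensing sources.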