Area Sampling Frames (ASFs) are the basis of many statistical programs around the world. To improve the accuracy, objectivity and efficiency of crop survey estimates, an automated stratification method based on geospatial crop planting frequency and cultivation data is proposed. This paper investigates using 2008-2013 geospatial corn, soybean and wheat planting frequency data layers to create three corresponding single-crop-specific and one multi-crop-specific South Dakota (SD), U.S. ASF stratifications. Corn, soybeans and wheat are three major crops in South Dakota. The crop-specific ASF stratifications are developed from crop frequency statistics derived at the primary sampling unit (PSU) level from the Crop Frequency Data Layers. The SD corn, soybean and wheat mean planting frequency strata of the single-crop stratifications are substratified by percent cultivation based on the 2013 Cultivation Layer. The three newly derived ASF stratifications provide more crop-specific information than the current National Agricultural Statistics Service (NASS) ASF, which is based on percent cultivation alone. Further, a multi-crop stratification is developed from the individual corn, soybean and wheat planting frequency data layers. All four crop-frequency-based ASF stratifications consistently predict corn, soybean and wheat planting patterns well, as verified by 2014 Farm Service Agency (FSA) Common Land Unit (CLU) and 578 administrative data. This demonstrates that the new stratifications based on crop planting frequency and cultivation are crop-type independent and applicable to all major crops. Further, these results indicate that the new crop-specific ASF stratifications have great potential to improve ASF accuracy, efficiency and crop estimates.
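The stratification logic described above can be sketched as a small rule: each PSU is placed in a frequency stratum from its mean planting frequency, then substratified by percent cultivation. The stratum names and boundary values below are illustrative assumptions, not NASS's actual definitions.

```python
# Hypothetical PSU-level stratification sketch. A PSU record carries a mean
# planting frequency (years planted out of 2008-2013) and a percent-cultivation
# value from the Cultivation Layer; thresholds here are invented for illustration.

def stratify_psu(mean_freq, pct_cultivated):
    """Assign a PSU to a crop-frequency stratum, substratified by cultivation."""
    if mean_freq >= 4.0:
        freq_stratum = "high-frequency"
    elif mean_freq >= 2.0:
        freq_stratum = "medium-frequency"
    elif mean_freq > 0.0:
        freq_stratum = "low-frequency"
    else:
        return "non-crop"   # never planted to this crop in 2008-2013
    # Substratify by percent cultivation (2013 Cultivation Layer analogue).
    if pct_cultivated >= 75:
        sub = "75+"
    elif pct_cultivated >= 50:
        sub = "50-74"
    else:
        sub = "<50"
    return f"{freq_stratum}/{sub}"
```

A multi-crop variant would apply the same rule per crop layer and combine the results into a composite stratum label.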
Under the scenario of dense targets in clutter, a multi-layer optimal data correlation algorithm is proposed. The algorithm eliminates a large number of false location points from the assignment process by rough correlation before the correlation cost is calculated, so it avoids the target state estimation and correlation-cost computation for the false correlation sets. Meanwhile, with these points eliminated in the rough correlation, the disturbance from false correlations in the assignment process is reduced, so the data correlation accuracy improves correspondingly. Complexity analyses of the new multi-layer optimal algorithm and the traditional optimal assignment algorithm are given. Simulation results show that the new algorithm is feasible and effective.
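The layered idea can be sketched as follows: a cheap "rough correlation" gate discards implausible track-measurement pairs before any expensive cost computation, and only the surviving pairs enter the assignment step. The gate threshold and a greedy pass standing in for the optimal assignment solver are assumptions for illustration.

```python
import math

# Sketch of rough gating before assignment. Track and measurement states are
# 2-D positions; the gate value and the greedy assignment are illustrative
# stand-ins for the paper's actual rough correlation and optimal assignment.

def rough_gate(tracks, measurements, gate):
    """Keep only (track, measurement) index pairs within the coarse distance gate."""
    pairs = []
    for i, t in enumerate(tracks):
        for j, m in enumerate(measurements):
            if math.dist(t, m) <= gate:
                pairs.append((i, j))
    return pairs

def assign(tracks, measurements, gate):
    """Greedy nearest-pair assignment over the gated candidate set only."""
    pairs = rough_gate(tracks, measurements, gate)
    pairs.sort(key=lambda p: math.dist(tracks[p[0]], measurements[p[1]]))
    used_t, used_m, result = set(), set(), {}
    for i, j in pairs:
        if i not in used_t and j not in used_m:
            result[i] = j
            used_t.add(i)
            used_m.add(j)
    return result
```

Because far-away clutter points never enter the candidate set, no cost is ever computed for them, which is the source of the complexity savings the abstract describes.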
A new design solution of the data access layer for N-tier architecture is presented. It addresses problems such as low development efficiency and difficulties in porting, updating and reuse. The solution utilizes the reflection technology of .NET together with design patterns. A typical application demonstrates that the new data access layer performs better than the current N-tier architecture; more importantly, it can be reused effectively.
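The reflection idea can be illustrated in miniature: one generic mapper discovers an entity's fields at runtime instead of hand-writing per-table persistence code. This sketch uses Python's reflection (the paper uses .NET's), and the entity class and table name are hypothetical.

```python
# Reflection-based data access sketch: build_insert works for ANY entity class
# without per-class SQL, by discovering attributes at runtime with vars().

class Product:
    """Hypothetical entity used only to demonstrate the generic mapper."""
    def __init__(self, id=0, name="", price=0.0):
        self.id, self.name, self.price = id, name, price

def build_insert(entity, table):
    """Build a parameterized INSERT statement from the entity's attributes."""
    fields = sorted(vars(entity))                       # runtime field discovery
    cols = ", ".join(fields)
    params = ", ".join(f":{f}" for f in fields)
    sql = f"INSERT INTO {table} ({cols}) VALUES ({params})"
    return sql, {f: getattr(entity, f) for f in fields}
```

Adding a new entity type requires no change to the data access layer, which is the reuse property the abstract claims for the .NET solution.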
A new B-spline surface reconstruction method from layer data, based on a deformable model, is presented. An initial deformable surface, represented as a closed cylinder, is first given. The surface is subject to internal forces describing its implicit smoothness property and external forces attracting it toward the layer data points. The finite element method is then adopted to solve the energy minimization problem, which results in a bicubic closed B-spline surface with C^2 continuity. The proposed method can provide a smooth and accurate surface model directly from the layer data, without the need to fit cross-sectional curves and make them compatible. The feasibility of the proposed method is verified by experimental results.
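The balance of forces the abstract describes can be shown on a toy 1-D analogue: internal (smoothness) forces pull each point toward the average of its neighbours, external (data-attraction) forces pull it toward the data, and simple gradient descent stands in for the paper's finite-element B-spline solve. All names and weights are assumptions.

```python
# Toy deformable-model sketch: a polyline with fixed endpoints relaxes toward
# target data points under competing smoothness and attraction forces.

def deform(points, targets, alpha=0.5, beta=0.5, steps=200, lr=0.1):
    """Iteratively move interior points toward targets while staying smooth."""
    pts = list(points)
    for _ in range(steps):
        new = pts[:]
        for i in range(1, len(pts) - 1):
            smooth = (pts[i - 1] + pts[i + 1]) / 2 - pts[i]   # internal force
            attract = targets[i] - pts[i]                      # external force
            new[i] = pts[i] + lr * (alpha * smooth + beta * attract)
        pts = new
    return pts
```

At equilibrium each interior point sits between its data target and the average of its neighbours, so the result tracks the data but remains smoother than the data itself; the paper obtains the same trade-off in 2-D for a closed bicubic B-spline surface.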
As an industry-accepted storage scheme, hafnium oxide (HfO_x) based resistive random access memory (RRAM) should further improve its thermal stability and data retention for practical applications. We therefore fabricated RRAMs with an HfO_x/ZnO double layer as the storage medium to study their thermal stability and data retention. The HfO_x/ZnO double layer is capable of reversible bipolar switching under an ultralow switching current (< 3 μA), with Schottky-emission-dominated conduction in the high resistance state and Poole–Frenkel-emission-governed conduction in the low resistance state. Compared with the drastically increased switching current at 120 °C for the single HfO_x layer RRAM, the HfO_x/ZnO double layer exhibits excellent thermal stability and maintains negligible fluctuations in switching current at high temperatures (up to 180 °C), which might be attributed to an increased Schottky barrier height suppressing the current at high temperatures. Additionally, the HfO_x/ZnO double layer exhibits 10-year data retention at 85 °C, which is helpful for practical RRAM applications.
In data centers, transmission control protocol (TCP) incast causes catastrophic goodput degradation for applications with a many-to-one traffic pattern. In this paper, we intend to tame incast at the receiver-side application. Towards this goal, we first develop an analytical model that formulates the incast probability as a function of connection variables and network environment settings. We combine the model with optimization theory and derive some insights into minimizing the incast probability by tuning connection variables related to applications. Then, guided by the analytical results, we propose an adaptive application-layer solution to TCP incast. The solution equally allocates advertised windows to concurrent connections, and dynamically adapts the number of concurrent connections to the varying conditions. Simulation results show that our solution consistently eludes incast and achieves high goodput in various scenarios, including ones with multiple bottleneck links and background TCP traffic.
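The equal-allocation step can be sketched directly: a fixed advertised-window budget is split evenly across concurrent connections, and concurrency is capped so each connection keeps at least one maximum segment size (MSS). The function and parameter names are assumptions; the paper's adaptive policy additionally tunes the connection count over time.

```python
# Receiver-side window allocation sketch: divide a total advertised-window
# budget equally, rounding each share down to whole MSS-sized segments.

def allocate_windows(total_window_bytes, n_connections, mss=1460):
    """Return (per-connection advertised window, admitted connection count)."""
    max_conns = max(1, total_window_bytes // mss)       # keep >= 1 MSS each
    admitted = min(n_connections, max_conns)
    per_conn = (total_window_bytes // admitted // mss) * mss  # whole segments
    return per_conn, admitted
```

Capping the number of admitted connections is what prevents the per-connection window from collapsing below one MSS, the regime in which incast-induced timeouts become likely.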
Much data, such as geometric image data and drawings, have graph structures; such data are called graph structured data. In order to manage such graph structured data efficiently, we need to analyze and abstract their graph structures. The purpose of this paper is to find knowledge representations which indicate plural abstractions of graph structured data. Firstly, we introduce a term graph as a graph pattern having structural variables, and a substitution over term graphs, which is a graph rewriting system. Next, for a graph G, we define a multiple layer (g, (θ_1, …, θ_k)) of G as a pair of a term graph g and a list of k substitutions θ_1, …, θ_k such that G can be obtained from g by applying the substitutions θ_1, …, θ_k to g. In the same way, for a set S of graphs, we define a multiple layer for S as a pair (D, Θ) of a set D of term graphs and a list Θ of substitutions. Secondly, for a graph G and a set S of graphs, we present effective algorithms for extracting minimal multiple layers of G and S, which give us stratifying abstractions of G and S, respectively. Finally, we report experimental results obtained by applying our algorithms to both artificial data and drawings of power plants, which are real-world data.
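A drastically simplified stand-in for a substitution step: a pattern graph is an edge list containing variable node names, and a substitution maps each variable to a concrete node. Real term-graph substitution splices whole subgraphs in place of variables; this rename-only version, with its edge-list representation and string variables, is purely illustrative.

```python
# Minimal substitution sketch over an edge-list graph pattern. Variables are
# node names that appear as keys of the substitution theta.

def apply_substitution(pattern_edges, theta):
    """Replace variable node names in the edge list using substitution theta."""
    return [(theta.get(u, u), theta.get(v, v)) for u, v in pattern_edges]
```

Applying a list of such substitutions in sequence recovers a concrete graph from a term graph, which is the direction used in the paper's definition of a multiple layer; the extraction algorithms run the harder inverse direction.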
In recent years, anisotropy has become a hot topic in the field of electromagnetics. Currently, inversion technologies for transient electromagnetic sounding data are mainly based on the case of an isotropic medium. However, the actual underground electrical structure tends to be complicated and anisotropic, and isotropic inversion often does not lead to good results for field transient electromagnetic sounding data. We have developed an algorithm for calculating the transient electromagnetic response in a layered medium with azimuthal anisotropy. An Occam inversion algorithm has also been implemented to invert the transient electromagnetic data induced by a grounded horizontal electric dipole in a layered medium with azimuthal anisotropy. Synthetic examples demonstrate the stability and validity of the inversion algorithm. Experimental results show that the choice of data used for inversion has a great influence on the inversion results.
The European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-40) and the National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis data were compared with Antarctic station observations, including surface-layer and upper-layer atmospheric observations, on intraseasonal and interannual timescales. At the interannual timescale, atmospheric pressure at different height levels in the ERA-40 data is in better agreement with observed pressure than in the NCEP–NCAR reanalysis data. ERA-40 also outperforms NCEP–NCAR in atmospheric temperature, except in the surface layer, where its biases are somewhat larger. The wind velocity fields in both datasets do not agree well with surface- and upper-layer atmospheric observations. At intraseasonal timescales, both datasets capture the observed intraseasonal variability in pressure and temperature during austral winter.
The strong nonlinearity of boundary layer parameterizations in atmospheric and oceanic models can make it difficult for tangent linear models to approximate nonlinear perturbations as the time integration grows longer. Consequently, the related 4-D variational data assimilation problems can be difficult to solve. A modified tangent linear model is built on the Mellor–Yamada turbulent closure (level 2.5) for 4-D variational data assimilation. For oceanic mixed layer model settings, the modified tangent linear model produces better finite-amplitude nonlinear perturbations than the full and simplified tangent linear models when the integration time is longer than one day. The corresponding variational data assimilation based on the adjoint of the modified tangent linear model also performs better than that based on the adjoints of the full and simplified tangent linear models.
Baddeleyite is an important mineral geochronometer. It is valued in U-Pb (ID-TIMS) geochronology more highly than zircon because of its magmatic origin, whereas zircon can be metamorphic or hydrothermal, or occur as xenocrysts. Detailed mineralogical (BSE, KL, etc.) research on baddeleyite started in the Fennoscandian Shield in the 1990s. The mineral was first extracted from the Paleozoic Kovdor deposit, the second-biggest baddeleyite deposit in the world after Phalaborwa (2.1 Ga), South Africa, and was successfully introduced into U-Pb systematics. This study provides new U-Pb and LA-ICP-MS data on Archean Ti-Mgt and BIF deposits, Paleoproterozoic layered PGE intrusions with Pt-Pd and Cu-Ni reefs, and Paleozoic complex deposits (baddeleyite, apatite, foscorite ores, etc.) in the NE Fennoscandian Shield. Data on REE concentrations in baddeleyite and the closure temperature of its U-Pb systematics are also provided. It is shown that baddeleyite plays an important role in the geological history of the Earth, in particular in the break-up of supercontinents.
In order to set up a conceptual data model that reflects the real world as accurately as possible, this paper first reviews and analyzes the disadvantages of previous conceptual data models used by traditional GIS in simulating geographic space, gives a new explanation of geographic space and analyzes its various essential characteristics. Finally, this paper proposes several detailed key points for designing a new type of GIS data model and gives a simple holistic GIS data model.
Hydrological and marine seismic data, collected in the Gulf of Cadiz (in July 1999, 2000, 2001 and 2002, and in April 2000 and 2001, respectively), are analysed to reveal the various structures of Mediterranean Water (MW). Both the hydrological and seismic data clearly identify the MW undercurrents on the Iberian slope, detached MW eddies (meddies and a cyclone) and smaller fragments of MW (filaments and small eddies). Seismic reflectivity, and synthetic reflectivity computed from hydrology, indicate that strong acoustic reflectors, associated with 8-64 m thick homogeneous water layers, are found above and below meddies and filaments and around the MW undercurrents, but mostly in the lower part of cyclones and below submesoscale eddies. Reflectors are also observed in the near-surface layers, where thermohaline contrasts are quite pronounced. The successful use of seismic data to locate submesoscale MW structures, superior to that of hydrology, is related to its improved horizontal resolution.
Research related to wireless sensor networks primarily concentrates on routing, location services, data aggregation and energy calculation methods. Due to the heterogeneity of sensor networks using the web architecture, a cross-layer mechanism can be implemented for integrating multiple resources. A framework for the Sensor Web using cross-layer scheduling mechanisms in the grid environment is proposed in this paper. Resource discovery and energy-efficient data aggregation schemes are used to improve effective utilization in the Sensor Web. To collaborate with a multiple-resource environment, the grid computing concept is integrated with the Sensor Web. Resource discovery and scheduling schemes in the grid architecture are organized using the medium access control protocol. The cross-layer metrics proposed are Memory Awareness, Task Awareness and Energy Awareness. Based on these metrics, the parameters Node Waiting Status, Used CPU Status, Average System Utilization, Average Utilization per Cluster, Cluster Usage per Hour and Node Energy Status are determined for the integrated heterogeneous WSN with the Sensor Web in the grid environment. Comparative analysis shows that the sensor grid architecture with a middleware framework has better resource awareness than the normal sensor network architectures.
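Two of the parameters named above, Average System Utilization and Average Utilization per Cluster, can be computed from per-node CPU-usage samples grouped by cluster. The data layout (a dict of cluster id to a list of utilization fractions) is a hypothetical representation for illustration.

```python
# Cross-layer utilization metrics sketch: system-wide and per-cluster averages
# from per-node CPU-usage samples, grouped by cluster id.

def utilization_stats(samples):
    """samples: {cluster_id: [node_cpu_fractions]} -> (system_avg, per_cluster)."""
    per_cluster = {c: sum(v) / len(v) for c, v in samples.items()}
    all_vals = [x for v in samples.values() for x in v]
    return sum(all_vals) / len(all_vals), per_cluster
```

Note that the system average weights every node sample equally, so clusters with more nodes contribute more; averaging the per-cluster means instead would weight clusters equally, a choice a scheduler would need to make explicitly.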