Abstract: Six national-scale, or near national-scale, geochemical data sets for soils or stream sediments exist for the United States. The earliest of these, here termed the 'Shacklette' data set, was generated by a U.S. Geological Survey (USGS) project conducted from 1961 to 1975. This project used soil collected from a depth of about 20 cm as the sampling medium at 1323 sites throughout the conterminous U.S. The National Uranium Resource Evaluation Hydrogeochemical and Stream Sediment Reconnaissance (NURE-HSSR) Program of the U.S. Department of Energy was conducted from 1975 to 1984 and collected either stream sediments, lake sediments, or soils at more than 378,000 sites in both the conterminous U.S. and Alaska. The sampled area represented about 65% of the nation. The Natural Resources Conservation Service (NRCS), from 1978 to 1982, collected samples from multiple soil horizons at sites within the major crop-growing regions of the conterminous U.S. This data set contains analyses of more than 3000 samples. The National Geochemical Survey, a USGS project conducted from 1997 to 2009, used a subset of the NURE-HSSR archival samples as its starting point and then collected primarily stream sediments, with occasional soils, in the parts of the U.S. not covered by the NURE-HSSR Program. This data set contains chemical analyses for more than 70,000 samples. The USGS, in collaboration with the Mexican Geological Survey and the Geological Survey of Canada, initiated soil sampling for the North American Soil Geochemical Landscapes Project in 2007. Sampling of three horizons or depths at more than 4800 sites in the U.S. was completed in 2010, and chemical analyses are currently ongoing. The NRCS initiated a project in the 1990s to analyze the various soil horizons from selected pedons throughout the U.S. This data set currently contains data from more than 1400 sites. This paper (1) discusses each data set in terms of its purpose, sample collection protocols, and analytical methods; and (2) evaluates each data set in terms of its appropriateness as a national-scale geochemical database and its usefulness for national-scale geochemical mapping.
Funding: Supported by the National Natural Science Foundation of China (70961005), the 211 Project for the Postgraduate Student Program of Inner Mongolia University, and the Natural Science Foundation of Inner Mongolia (2010Zd34, 2011MS1002).
Abstract: Conventional data envelopment analysis (DEA) measures the relative efficiencies of a set of decision making units with exact values of inputs and outputs. In real-world problems, however, inputs and outputs typically have some level of fuzziness. To analyze a decision making unit (DMU) with fuzzy input/output data, previous studies provided the fuzzy DEA model and proposed an associated evaluation approach. Nonetheless, numerous deficiencies remain, involving the α-cut approaches, the types of fuzzy numbers, and the ranking techniques; moreover, a fuzzy sample DMU still cannot be evaluated with the fuzzy DEA model. Therefore, this paper proposes a fuzzy DEA model based on a sample decision making unit (FSDEA). Five evaluation approaches and the related algorithm and ranking methods are provided to test the fuzzy sample DMU of the FSDEA model. A numerical experiment is used to demonstrate the approach and compare its results with those obtained using alternative approaches.
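The crisp efficiency scores that the fuzzy model generalizes come from the conventional input-oriented CCR linear program. A minimal sketch follows, using hypothetical single-input/single-output data and assuming scipy is available; it illustrates conventional DEA only, not the paper's fuzzy extension.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR scores. X: (n, m) inputs, Y: (n, s) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Variables are [u (output weights), v (input weights)].
        c = np.concatenate([-Y[o], np.zeros(m)])            # maximize u . y_o
        A_eq = np.concatenate([np.zeros(s), X[o]])[None, :] # v . x_o = 1
        A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        scores.append(-res.fun)
    return scores

X = np.array([[2.0], [4.0], [3.0]])   # one input per DMU (hypothetical)
Y = np.array([[1.0], [2.0], [3.0]])   # one output per DMU
eff = ccr_efficiency(X, Y)
print(eff)                            # approximately [0.5, 0.5, 1.0]
```

Only the third DMU attains a score of 1 and is efficient; fuzzifying X and Y, e.g. via α-cuts, turns each score into an interval.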
Funding: Supported by the Innovative Program of the Chinese Academy of Sciences (No. KGCY-SYW-407-02) and the Grand International Cooperation Foundation of the Shanghai Science and Technology Commission (No. 052207046).
Abstract: Application-specific data processing units (DPUs) are commonly adopted for operational control and data processing in space missions. To overcome the limitations of traditional radiation-hardened or fully commercial design approaches, a reconfigurable system-on-chip (RSoC) solution based on a state-of-the-art FPGA is introduced. The flexibility and reliability of this approach are outlined, and the requirements for an enhanced RSoC design with in-flight reconfigurability for space applications are presented. The design has been demonstrated as an on-board computer prototype, providing an in-flight reconfigurable DPU design approach using integrated hardwired processors.
Funding: Supported by the High Technology Research and Development Programme of China.
Abstract: This paper describes the function, structure, and working status of the data buffer unit (DBU), one of the most important functional units of ITM-1. It also discusses the DBU's support for the multiprocessor system and the Prolog language.
Abstract: PL/SQL is the most common language for ORACLE database applications. It allows the developer to create stored program units (procedures, functions, and packages) to improve software reusability and to hide the complexity of a specific operation behind a name; it also acts as an interface between the SQL database and DEVELOPER. It is therefore important to test these modules, which consist of procedures and functions. In this paper, a new genetic algorithm (GA) is used as a search technique to find the test data required, according to branch criteria, to test stored PL/SQL program units. The experimental results show that full coverage was not achieved: the test target in some branches was not reached, and the coverage percentage was 98%. A problem arises when the target branch depends on data retrieved from tables; in this case, the GA is unable to generate test cases for that branch.
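Search-based test-data generation of this kind steers inputs using a branch-distance fitness function. The paper uses a GA; the sketch below uses the same fitness idea with a simpler deterministic hill-climb, and the target predicate (x * 2 = 100) is a hypothetical stand-in for a branch condition in a stored PL/SQL unit.

```python
def branch_distance(x):
    """Distance to satisfying the hypothetical branch predicate 'x * 2 = 100'."""
    return abs(x * 2 - 100)

def search_test_input(start=0, max_steps=10_000):
    """Hill-climb an integer input until the target branch is covered."""
    x = start
    for _ in range(max_steps):
        if branch_distance(x) == 0:
            return x                  # branch covered; x is the test datum
        # Step toward whichever neighbor reduces the branch distance.
        x = x + 1 if branch_distance(x + 1) < branch_distance(x) else x - 1
    return None                       # target branch not reached

print(search_test_input())            # finds x = 50
```

When the predicate compares against values fetched from tables at run time, the fitness landscape gives the search no usable gradient, which is the failure mode the abstract reports.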
Abstract: With the development of oilfield exploration and production, research on continental oil and gas reservoirs has gradually been refined, and offshore exploration targets have entered a stage focused on small sand bodies, small fault blocks, complex structures, low permeability, and various heterogeneous geological bodies. Marine oil and gas development will therefore inevitably enter a complicated-reservoir stage, and the corresponding assessment technologies, engineering measures, and exploration methods must be designed carefully. Studying the hydraulic flow units of low-permeability offshore reservoirs has practical significance for assessing connectivity and the distribution of remaining oil. An integrated method combining data mining and flow unit identification was applied to flow unit prediction in a low-permeability reservoir, and the predicted results were compared with those of a mature commercial system for verification. This strategy increases accuracy by selecting the best prediction result, and a good computing system can provide more accurate geological information for reservoir characterization.
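Hydraulic flow units are commonly delineated with the flow zone indicator (FZI); the sketch below uses the standard formulation from core permeability and porosity, which is illustrative and not necessarily the exact identification step used in this paper.

```python
import math

def flow_zone_indicator(k_md, phi):
    """Standard FZI from permeability k_md (mD) and fractional porosity phi."""
    rqi = 0.0314 * math.sqrt(k_md / phi)   # reservoir quality index (microns)
    phi_z = phi / (1.0 - phi)              # normalized porosity index
    return rqi / phi_z

# Samples with similar FZI values are grouped into the same flow unit.
fzi = flow_zone_indicator(100.0, 0.20)
print(round(fzi, 3))
```

In a data-mining workflow, FZI (or log-FZI) values from cored wells become the training labels that a classifier then predicts in uncored wells.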
Abstract: Compaction correction is a key part of paleo-geomorphic recovery methods, yet the influence of lithology on porosity evolution is not usually taken into account. Present methods merely classify the lithologies as sandstone and mudstone and undertake separate porosity-depth compaction modeling. However, using just two lithologies is an oversimplification that cannot represent the compaction history, and the precision of the compaction recovery in such schemes is inadequate. To improve that precision, a depth compaction model has been proposed that involves both porosity and clay content. A clastic lithological compaction unit classification method, based on clay content, has been designed to identify lithological boundaries and establish sets of compaction units. On the basis of this classification, two compaction recovery methods that integrate well and seismic data are employed to extrapolate well-based compaction information outward along seismic lines and recover the paleo-topography of the clastic strata in the region. The examples presented here show that a better understanding of paleo-geomorphology can be gained by applying the proposed compaction recovery technology.
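A porosity-depth model that blends sandstone and mudstone end members by clay content can be sketched as follows. The exponential (Athy-type) form and the end-member parameters are illustrative assumptions, not the paper's calibrated values.

```python
import math

# Hypothetical end members: (surface porosity, decay constant in 1/m).
PHI0_SAND, C_SAND = 0.45, 0.00027
PHI0_MUD,  C_MUD  = 0.60, 0.00051

def porosity(z_m, clay):
    """Athy-type porosity at depth z_m, interpolated linearly by clay fraction."""
    phi0 = (1 - clay) * PHI0_SAND + clay * PHI0_MUD
    c    = (1 - clay) * C_SAND   + clay * C_MUD
    return phi0 * math.exp(-c * z_m)

def compaction_unit(clay):
    """Classify a clastic compaction unit by clay content (illustrative bins)."""
    return "sandstone" if clay < 0.25 else "silty" if clay < 0.5 else "mudstone"

print(round(porosity(1000.0, 0.5), 3), compaction_unit(0.5))
```

Assigning each interval to a compaction unit by clay content, instead of a binary sand/mud split, is what lets the decompaction step honor intermediate lithologies.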
Funding: Supported by the Ministry of Higher Education, Malaysia, under a Hadiah Latihan Persekutuan scholarship, Grant No. KPT.B.600-19/3-791206065445.
Abstract: High population density leads to crowded cities. The future city is envisaged to encompass a large-scale network with diverse applications and a massive number of interconnected heterogeneous wireless-enabled devices. Hence, green technology elements are crucial for designing sustainable and future-proof network architectures; they address the spectrum scarcity, high latency, interference, energy efficiency, and scalability problems that occur in dense and heterogeneous wireless networks, especially in the home area network (HAN). Radio-over-fiber (ROF) is a candidate technology for providing a global view of a HAN's activities, which can be leveraged to allocate orthogonal channels to wireless-enabled HAN devices under a clustered frequency-reuse approach. Our proposed network architecture focuses on enhancing network throughput and reducing average network communication latency by introducing a data aggregation unit (DAU). The performance evaluation shows that, with the DAU, the average network communication latency is reduced significantly while the network throughput is enhanced, compared with the existing ROF architecture without the DAU.
Abstract: The reliability assessment of unit-system structures at two levels is the most important part of multi-level reliability synthesis for complex systems. Introducing information theory into system reliability assessment, and using the additivity of information quantity together with the principle of information-quantity equivalence, an entropy method of data-information conversion is presented for systems consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information-quantity equivalence. General models for entropy-method synthesis assessment of approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. Applications of the entropy method are discussed through practical examples. Compared with traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.
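The additivity of information quantity that the method relies on is the standard Shannon property H(X, Y) = H(X) + H(Y) for independent sources, illustrated below; this is a general identity, not the paper's specific conversion formulae.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

px = [0.5, 0.5]
py = [0.25, 0.75]
pxy = [a * b for a in px for b in py]   # joint distribution of independent X, Y

# Additivity: the joint entropy equals the sum of the marginal entropies.
print(entropy(pxy), entropy(px) + entropy(py))
```

It is this additivity that permits test information gathered at the unit level to be converted and accumulated into an equivalent quantity at the system level.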
Abstract: In this paper, we illustrate the use and power of hidden Markov models in analyzing multivariate data over time. The data used in this study were obtained from the Organization for Economic Co-operation and Development (OECD.Stat database: https://stats.oecd.org/) and comprise monthly data on the employment rate of males and females in Canada and the United States (aged 15 years and over; seasonally adjusted, from January 1995 to July 2018). Two different underlying patterns of trends in employment over the 23-year observation period were uncovered.
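The workhorse of HMM analysis is the forward algorithm, which scores an observation sequence under a fitted model. The sketch below handles the discrete-observation case for brevity; the study itself works with continuous employment rates, for which a Gaussian-emission HMM (e.g. via a library such as hmmlearn) would be fitted instead.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.
    pi: (K,) initial state probs, A: (K, K) transitions, B: (K, M) emissions."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate states, weight by emission
        s = alpha.sum()
        loglik += np.log(s)
        alpha = alpha / s               # rescale to avoid numerical underflow
    return loglik

# Sanity check: a single state emitting 0/1 with p = 0.5 gives 3 * log(0.5).
pi = np.array([1.0]); A = np.array([[1.0]]); B = np.array([[0.5, 0.5]])
ll = forward_loglik([0, 1, 0], pi, A, B)
print(ll)
```

Fitting then alternates forward-backward passes inside Baum-Welch, and the number of hidden states (two, in the study's finding) is what encodes the distinct employment regimes.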
Abstract: This paper investigates autonomic cloud data center networks, a solution to the management and cost issues of increasingly complex computing environments that must meet users' growing demands. Virtualized cloud networking provides a plethora of rich online applications, including self-configuration, self-healing, self-optimization, and self-protection. We draw on intelligent agents and multi-agent systems with respect to the system model, strategy, and autonomic cloud computing, covering the development and implementation of independent computing systems. Combining this architecture with the autonomous unit, we propose MCDN (Model of Autonomic Cloud Data Center Networks). This model can define intelligent states, elaborate the composition structure, and specify the complete life cycle. Finally, the proposed public infrastructure can be provided with autonomous units in the supported interaction model.
Abstract: Considering unit start-up and network constraints and the concept of an optimization period, an optimization model, a typical multi-constraint knapsack problem, is established in this paper to solve the selection-optimization problem of units to be started during the power system restoration period; the objective of the model is to maximize total power generation capability. A relative-effectiveness assessment based on an improved data envelopment analysis is adopted to select the initial units to be started, and genetic algorithms are employed to solve the knapsack problem and determine the most reasonable units to start at the current time. Finally, simulation results on the IEEE 39-bus system show that the proposed model is feasible and effective.
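Stripped of network constraints, the unit-selection step is a knapsack: maximize generation capability subject to a resource limit. A single-constraint dynamic-programming sketch with hypothetical unit data follows; the paper's multi-constraint version is solved with a GA instead.

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack: best total value achievable within the capacity."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):  # reverse scan: each item used once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Hypothetical units: generation capability (MW) vs. cranking power needed (MW),
# limited by the cranking power available early in restoration.
capability = [60, 100, 120]
cranking = [10, 20, 30]
result = knapsack(capability, cranking, 50)
print(result)   # 220: starting units 2 and 3 maximizes capability
```

With several simultaneous resource limits the DP table grows multiplicatively, which is why a GA over selection bit-strings becomes the practical solver.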