Significant amounts of free amino acids exist in commercially sold vegetables and fruits. Despite this fact, little information is available about the free amino acid contents of foods. To make use of information on free amino acids in food, we carried out experiments to quantitate free amino acids by derivatization with NBD-F (4-fluoro-7-nitrobenzofurazan) and analysis on reversed-phase UHPLC (ultra-high pressure liquid chromatography) equipped with an ultraviolet-visible detector. Almost all of the food extracts contained free amino acids, including GABA (γ-aminobutyrate). The contents of free amino acids vary considerably among vegetables and fruits. The principal free amino acids found in vegetables and fruits were asparagine, glutamine, arginine and GABA, which are involved in important metabolic pathways in humans. About 140 species of vegetables and fruits were included in the database. All of the plants and fruits we examined exhibited significant amounts of free amino acids, results that are clearly distinct from databases obtained from acid-hydrolyzed food samples. Since glutamate and GABA act as excitatory and inhibitory neurotransmitters in the CNS, respectively, the free amino acids in the vegetables and fruits that we eat daily should be an important source for cellular metabolic activities.
The development and application of coated cemented carbide made in China are presented. Three aspects of the coated carbide tool's performance are studied: cutting forces, surface finish and tool life. Furthermore, speed-correcting coefficients of the tool are given. On the basis of this work, a database for coated carbide tools has been built on a microcomputer. It consists of five modules: an essential database, tool comparison and inquiry, recommended cutting regimes, an experimental curve base and an expert system for tool selection.
Various code development platforms, such as the ATHENA Framework [1] of the ATLAS [2] experiment, encounter lengthy compilation/linking times. To improve this situation, the IRIS Development Platform was built as a software development framework acting as compiler, cross-project linker and data fetcher, which allows hot-swaps in order to compare various versions of software under test. The flexibility fostered by IRIS allowed modular exchange of software libraries among developers, making it a powerful development tool. The IRIS platform used ROOT ntuples [3] as input data; however, a new data model is sought, in line with the facilities offered by IRIS. A schematic of a possible new data structure, implemented as a user-defined object-oriented database, is presented.
The composition of base oils affects the performance of lubricants made from them. This paper proposes a hybrid model based on the gradient-boosted decision tree (GBDT) to analyze the effect of different ratios of the KN4010, PAO40 and PriEco3000 components in a composite base oil system on the performance of lubricants. The study was conducted under small laboratory sample conditions, and a data expansion method using the Gaussian copula function was proposed to improve the prediction ability of the hybrid model. The study also compared four optimization algorithms, the slime mould algorithm (SMA), genetic algorithm (GA), whale optimization algorithm (WOA) and seagull optimization algorithm (SOA), for predicting the kinematic viscosity at 40 °C, kinematic viscosity at 100 °C, viscosity index and oxidation induction time of the lubricant. The results showed that the Gaussian copula data expansion method improved the prediction ability of the hybrid model in the case of small samples. The SOA-GBDT hybrid model had the fastest convergence speed and the best prediction performance, with determination coefficients (R²) for the four lubricant indicators reaching 0.98, 0.99, 0.96 and 0.96, respectively. Thus, this model can significantly reduce the prediction error and has good prediction ability.
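As a rough illustration of the Gaussian copula data-expansion idea mentioned above, the sketch below fits a Gaussian copula to a small table of samples and draws additional synthetic rows. It is not the authors' code; the toy data, the column layout and the helper name `copula_expand` are assumptions made for the demonstration.

```python
# Illustrative sketch (not the paper's code): Gaussian-copula data expansion for a
# small tabular sample, in the spirit of the abstract above.
import numpy as np
from scipy import stats

def copula_expand(X, n_new, rng=None):
    """Fit a Gaussian copula to rows of X (n_samples x n_features) and draw n_new rows."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # 1. Map each column to standard-normal scores via its empirical CDF (ranks).
    ranks = np.argsort(np.argsort(X, axis=0), axis=0) + 1          # ranks 1..n per column
    u = ranks / (n + 1.0)                                          # keep away from 0 and 1
    z = stats.norm.ppf(u)
    # 2. Estimate the copula correlation in the latent normal space.
    corr = np.corrcoef(z, rowvar=False)
    # 3. Sample correlated latent normals and map back through each column's
    #    empirical quantile function.
    z_new = rng.multivariate_normal(np.zeros(d), corr, size=n_new)
    u_new = stats.norm.cdf(z_new)
    X_new = np.empty((n_new, d))
    for j in range(d):
        X_new[:, j] = np.quantile(X[:, j], u_new[:, j])
    return X_new

# Toy usage: 10 measured blends (3 mixing ratios + one property), expanded to 100 rows.
X_small = np.random.default_rng(0).uniform(size=(10, 4))
X_big = copula_expand(X_small, n_new=100, rng=1)
print(X_big.shape)   # (100, 4)
```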
In this paper, sampled-data based average-consensus control is considered for networks consisting of continuous-time first-order integrator agents in a noisy distributed communication environment. The impact of the sampling size and the number of network nodes on system performance is analyzed. The control input of each agent can only use information measured at the sampling instants from its neighborhood rather than the complete continuous process, and the measurements of its neighbors' states are corrupted by random noises. Using probability limit theory and the properties of the graph Laplacian matrix, it is shown that for a connected network, the static mean square error between the individual states and the average of the initial states of all agents can be made arbitrarily small, provided the sampling size is sufficiently small. Furthermore, by properly choosing the consensus gains, almost sure consensus can be achieved. It is worth pointing out that an uncertainty principle for Gaussian networks is obtained, which implies that in the case of white Gaussian noises, no matter what the sampling size is, the product of the steady-state and transient performance indices is always equal to or larger than a constant depending on the noise intensity, network topology and the number of network nodes.
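A minimal simulation sketch of this sampled-data consensus setting is given below, assuming a ring topology, a fixed noise level and a decaying gain sequence chosen only for the demo; it illustrates the noisy sampled-data update, not the paper's algorithm or analysis.

```python
# Toy simulation of sampled-data average consensus under measurement noise.
# Ring graph, noise level and gain sequence are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, steps, sigma = 6, 5000, 0.1                   # agents, sampling instants, measurement-noise std
A = np.zeros((n, n))
for i in range(n):                               # undirected ring graph
    A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1.0
deg = A.sum(axis=1)

x = rng.uniform(-5.0, 5.0, n)                    # initial states of the integrator agents
avg0 = x.mean()
for k in range(steps):
    a_k = 0.3 / (k + 1) ** 0.75                  # decaying gain: sum diverges, sum of squares converges
    y = x + sigma * rng.standard_normal(n)       # noisy sampled measurements of neighbours' states
    u = -(deg * x - A @ y)                       # u_i = sum_j a_ij * (y_j - x_i)
    x = x + a_k * u                              # state update over one sampling period

print(f"average of initial states: {avg0:.3f}")
print("final states:", np.round(x, 3))
```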
On one hand, the diversity of activities and, on the other hand, the conflicts between beneficiaries necessitate the efficient management and supervision of coastal areas. Accordingly, the monitoring and evaluation of such areas can be considered a critical factor in national development and the stewardship of resources. In this regard, remote sensing technologies combined with the analytical operations of geographic information systems (GIS) are remarkably advantageous. Iran's south-eastern Makran coasts are geopolitically and economically important owing to their strategic characteristics, but they have been neglected, and their development and transit infrastructure fall significantly short of international standards. Therefore, in this paper, with regard to the importance of developing the Makran coasts, a Multi-Criteria Decision Analysis (MCDA) method was applied to identify and prioritize the intended criteria and parameters of zoning, in order to establish new maritime zones. The major scope of this study is to employ satellite data, remote sensing methods and regional statistics obtained from the Jask synoptic station, to investigate the region's status in terms of topography, rainfall rate and temperature changes so as to reach comprehensive monitoring and zoning of the coastline, and to provide a pervasive local database via GIS and MCDA, which will be used to develop the coastal regions. In this article, while explaining the steps of coastal monitoring, its main objectives are also explained and the necessary procedures are presented. Then, the general steps of marine climate identification and the study of marine parameters are stated, and the final achievements of the coastal monitoring process are determined. Since this article focuses on the monitoring of the Makran coasts, the working method in the region is described and its specific differences and complexities are discussed in detail. The impact of such projects on future research results is also discussed.
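For the zoning step, a weighted-overlay MCDA score over gridded criteria can be sketched as follows; the criterion layers, weights and suitability classes are hypothetical placeholders, not the study's actual GIS workflow.

```python
# Illustrative weighted-overlay sketch of an MCDA zoning score on raster criteria.
# All layers, weights and class breaks are assumed toy values.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)                      # toy raster grid covering the coastal strip
criteria = {                            # each layer already normalised to 0..1 (1 = most suitable)
    "slope":            rng.random(shape),
    "rainfall":         rng.random(shape),
    "temperature":      rng.random(shape),
    "distance_to_port": rng.random(shape),
}
weights = {"slope": 0.4, "rainfall": 0.2, "temperature": 0.1, "distance_to_port": 0.3}

suitability = sum(weights[name] * layer for name, layer in criteria.items())
zones = np.digitize(suitability, bins=[0.4, 0.6, 0.8])   # 0 = low .. 3 = high priority
print("cells per zone:", np.bincount(zones.ravel(), minlength=4))
```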
1 Introduction Information technology has been playing an ever-increasing role in geoscience. Sophisticated database platforms are essential for the storage, analysis and exchange of geological Big Data (Feblowitz, 2013; Zhang et al., 2016; Teng et al., 2016; Tian and Li, 2018). The United States has built an information-sharing platform for state-owned scientific data as a national strategy.
This paper first puts forward a case-based system framework based on data mining techniques. Then the paper examines the possibility of using neural networks as a method of retrieval in such a case-based system. For this system we propose data mining algorithms to discover case knowledge, together with other supporting algorithms.
Since web-based GIS processes large volumes of spatial geographic information on the Internet, we should try to improve the efficiency of spatial data query processing and transmission. This paper presents two efficient methods for this purpose: a division transmission method and a progressive transmission method. In the division transmission method, a map is divided into several parts, called "tiles", and only the tiles requested by a client are transmitted. In the progressive transmission method, a map is split into several phase views based on the significance of vertices, and the server produces a target object and then transmits it progressively when this spatial object is requested by a client. To realize these methods, the "tile division" and "priority order estimation" algorithms and the corresponding data transmission strategies are proposed in this paper. Compared with such traditional methods as "map total transmission" and "layer transmission", the web-based GIS data transmission proposed in this paper increases data transmission efficiency by a great margin.
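A schematic sketch of the tile-division idea is shown below: the map extent is cut into fixed-size tiles and only the tiles intersecting the client's requested window are sent. It is illustrative only; the tile size and the simple bounding-box model are assumptions, not the paper's implementation.

```python
# Sketch of "tile division": split a map extent into tiles and serve only the
# tiles that intersect the client's requested window.
from dataclasses import dataclass

@dataclass(frozen=True)
class Tile:
    row: int
    col: int
    bbox: tuple  # (xmin, ymin, xmax, ymax)

def divide_into_tiles(extent, tile_size):
    """Split a map extent (xmin, ymin, xmax, ymax) into fixed-size tiles."""
    xmin, ymin, xmax, ymax = extent
    tiles, row, y = [], 0, ymin
    while y < ymax:
        col, x = 0, xmin
        while x < xmax:
            tiles.append(Tile(row, col, (x, y, min(x + tile_size, xmax), min(y + tile_size, ymax))))
            x += tile_size
            col += 1
        y += tile_size
        row += 1
    return tiles

def tiles_for_window(tiles, window):
    """Select only the tiles intersecting the client's requested window."""
    wxmin, wymin, wxmax, wymax = window
    return [t for t in tiles
            if not (t.bbox[2] <= wxmin or t.bbox[0] >= wxmax or
                    t.bbox[3] <= wymin or t.bbox[1] >= wymax)]

tiles = divide_into_tiles((0, 0, 1000, 1000), tile_size=250)
print(len(tiles), "tiles total;", len(tiles_for_window(tiles, (100, 100, 400, 300))), "sent for the request")
```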
An 8×10 GHz receiver optical sub-assembly (ROSA) consisting of an 8-channel arrayed waveguide grating (AWG) and an 8-channel PIN photodetector (PD) array is designed and fabricated based on silica hybrid integration technology. Multimode output waveguides in the silica AWG with a 2% refractive index difference are used to obtain flat-top spectra. The output waveguide facet is polished to a 45° bevel to change the light propagation direction into the mesa-type PIN PD, which simplifies the packaging process. The experimental results show that the single-channel 1 dB bandwidth of the AWG ranges from 2.12 nm to 3.06 nm, the ROSA responsivity ranges from 0.097 A/W to 0.158 A/W, and the 3 dB bandwidth is up to 11 GHz. The device is promising for application in eight-lane WDM transmission systems for data center interconnection.
Sensor nodes in a wireless sensor network (WSN) are typically powered by batteries, so their energy is constrained. Our design goal is to efficiently utilize the energy of each sensor node to extend its lifetime, so as to prolong the lifetime of the whole WSN. In this paper, we propose a path-based data aggregation scheme (PBDAS) for grid-based wireless sensor networks. In order to extend the lifetime of a WSN, we construct a grid infrastructure by partitioning the whole sensor field into a grid of cells. Each cell has a head responsible for aggregating its own data with the data sensed by the other nodes in the same cell and then transmitting the result. In order to efficiently and rapidly transmit the data to the base station (BS), we link the cell heads to form a chain. Each cell head on the chain takes turns becoming the chain leader responsible for transmitting data to the BS. Aggregated data moves from head to head along the chain, and finally the chain leader transmits it to the BS. In PBDAS, only the cell heads need to transmit data toward the BS; therefore, the number of data transmissions to the BS decreases substantially. Besides, the cell heads and chain leader are designated in turn according to their energy levels so that the energy depletion of nodes is evenly distributed. Simulation results show that the proposed PBDAS extends the lifetime of sensor nodes, and hence of the whole network.
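The cell-head and chain-leader idea can be pictured with the short sketch below, which groups nodes into grid cells, elects the highest-energy node of each cell as head and lets the highest-energy head act as chain leader. The node positions, energies and the plain-sum aggregation are assumptions for illustration, not the exact PBDAS protocol.

```python
# Simplified sketch of path-based aggregation over a grid of cells.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, grid, field = 100, 4, 100.0                       # nodes, 4x4 cells, 100x100 m field
pos = rng.uniform(0, field, (n_nodes, 2))
energy = rng.uniform(0.5, 1.0, n_nodes)
reading = rng.normal(25.0, 2.0, n_nodes)                   # e.g. temperature samples

cell_of = (pos // (field / grid)).astype(int)              # (col, row) of each node's cell
cells = {}
for i in range(n_nodes):
    cells.setdefault(tuple(cell_of[i]), []).append(i)

# 1. Each cell head aggregates the data sensed inside its own cell.
heads, cell_aggregates = [], []
for members in cells.values():
    head = max(members, key=lambda i: energy[i])           # highest residual energy becomes head
    heads.append(head)
    cell_aggregates.append(reading[members].sum())

# 2. Heads form a chain; the aggregate is passed head-to-head to the chain leader.
leader = max(heads, key=lambda i: energy[i])               # leader role rotates by energy level
total_at_leader = sum(cell_aggregates)                     # only the leader transmits to the BS
print(f"{len(heads)} cell heads, leader node {leader}, value sent to BS: {total_at_leader:.1f}")
```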
Over the last few years, the Internet of Things (IoT) has become an omnipresent term. The IoT expands the existing common concepts of anytime and anyplace to connectivity for anything. The proliferation of IoT offers opportunities but may also bear risks. A hitherto neglected aspect is the possible increase in power consumption, as smart devices in IoT applications are expected to be reachable by other devices at all times. This implies that a device consumes electrical energy even when it is not in use for its primary function. Many research communities have started addressing the storage capabilities, such as the cache memory, of smart devices using the concept of Named Data Networking (NDN) to achieve a more energy-efficient communication model. In NDN, memory or buffer overflow is a common challenge, especially when the internal memory of a node exceeds its limit: data with the highest degree of freshness may not be accommodated, and the entire scenario behaves like a traditional network. In such cases, data caching is not performed by intermediate nodes to guarantee the highest degree of freshness. Given the periodic updates sent by data producers, data consumers are strongly expected to get up-to-date information at the least energy cost. Consequently, there is a challenge in maintaining the tradeoff between freshness and energy consumption during publisher-subscriber interaction. In our work, we propose an architecture that overcomes the cache strategy issue with a smart caching algorithm for improved memory management and data freshness. The smart caching strategy updates the data at precise intervals while taking garbage data into consideration. It is also observed from the experiments that data redundancy can be reduced by ignoring/dropping data packets carrying information that is of no interest to other participating nodes in the network, ultimately optimizing the tradeoff between freshness and the energy required.
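A minimal sketch of a freshness-aware cache in this spirit is given below; the freshness period, the eviction rule and the notion of "interesting" names are illustrative assumptions rather than the proposed smart caching algorithm itself.

```python
# Minimal sketch of a freshness-aware cache (illustrative only, not the paper's algorithm).
import time

class FreshnessCache:
    def __init__(self, capacity, freshness_period, interesting):
        self.capacity = capacity
        self.freshness = freshness_period       # seconds an item is considered fresh
        self.interesting = interesting          # names other nodes have expressed interest in
        self.store = {}                         # name -> (value, stored_at)

    def _purge_stale(self, now):
        # Garbage data: anything whose freshness period has expired is dropped first.
        for name in [n for n, (_, t) in self.store.items() if now - t > self.freshness]:
            del self.store[name]

    def put(self, name, value, now=None):
        now = time.time() if now is None else now
        if name not in self.interesting:        # avoid redundant caching of unwanted data
            return False
        self._purge_stale(now)
        if len(self.store) >= self.capacity:    # still full: evict the least fresh entry
            oldest = min(self.store, key=lambda n: self.store[n][1])
            del self.store[oldest]
        self.store[name] = (value, now)
        return True

cache = FreshnessCache(capacity=2, freshness_period=5.0, interesting={"/room1/temp", "/room2/temp"})
cache.put("/room1/temp", 21.5, now=0.0)
cache.put("/hallway/humidity", 40, now=1.0)     # ignored: no consumer interest
cache.put("/room2/temp", 22.0, now=7.0)         # the room1 entry is stale by now and purged
print(sorted(cache.store))                      # ['/room2/temp']
```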
Cloud storage is widely used by large companies to store vast amounts of data and files, offering flexibility, financial savings, and security. However, information shoplifting poses significant threats, potentially leading to poor performance and privacy breaches. Blockchain-based cognitive computing can help protect and maintain information security and privacy in cloud platforms, ensuring businesses can focus on business development. To ensure data security in cloud platforms, this research proposes a blockchain-based Hybridized Data Driven Cognitive Computing (HD2C) model. The proposed HD2C framework addresses breaches of the privacy information of mixed participants of the Internet of Things (IoT) in the cloud. HD2C is developed by combining Federated Learning (FL) with a Blockchain consensus algorithm to connect smart contracts with Proof of Authority. The "data island" problem can be solved by FL's emphasis on privacy and lightning-fast processing, while Blockchain provides a decentralized incentive structure that is impervious to poisoning. FL with Blockchain allows quick consensus through smart member selection and verification. The HD2C paradigm significantly improves the computational processing efficiency of intelligent manufacturing. Extensive analysis results derived from IIoT datasets confirm the superiority of HD2C. When compared to other consensus algorithms, the foundational cost of the Blockchain PoA is significant. The accuracy and memory utilization evaluation results predict the total benefits of the system. In comparison to the values 0.004 and 0.04, the value of 0.4 achieves good accuracy. According to the experimental results, the number of transactions per second has minimal impact on memory requirements. The findings of this study resulted in the development of a brand-new IIoT framework based on blockchain technology.
The traditional threat score based on fixed thresholds for precipitation verification is sensitive to intensity forecast bias. In this study, the neighborhood precipitation threat score is modified by defining the thresholds in terms of the percentiles of the overall precipitation instead of fixed threshold values. The impact of intensity forecast bias on the calculated threat score is thereby reduced. The method is tested with forecasts of a tropical storm that re-intensified after making landfall and caused heavy flooding. The forecasts are produced with and without radar data assimilation. The forecast with assimilation of both radial velocity and reflectivity produces precipitation patterns that better match observations but have a large positive intensity bias. When fixed thresholds are used, the neighborhood threat scores fail to yield high scores for forecasts that have a good pattern match with observations, due to the large intensity bias. In contrast, the percentile-based neighborhood method yields the highest score for the forecast with the best pattern match and the smallest position error. The percentile-based method also yields scores that are more consistent with object-based verification, which is less sensitive to intensity bias, demonstrating the potential value of percentile-based verification.
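The effect of percentile-based thresholds can be illustrated with the short calculation below, which compares a fixed-threshold threat score with a percentile-based one on a synthetic forecast carrying a twofold intensity bias. The neighborhood smoothing is omitted for brevity, and the synthetic fields are assumptions, not the study's data.

```python
# Fixed-threshold vs percentile-based threat score (CSI) on a biased synthetic forecast.
import numpy as np

def threat_score(fcst, obs, fcst_thresh, obs_thresh):
    f, o = fcst >= fcst_thresh, obs >= obs_thresh
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    return hits / float(hits + misses + false_alarms)

rng = np.random.default_rng(0)
obs = rng.gamma(shape=0.8, scale=10.0, size=10000)       # synthetic precipitation field (mm)
fcst = 2.0 * obs                                         # same pattern, large positive intensity bias

fixed = 25.0                                             # fixed heavy-rain threshold (mm)
ts_fixed = threat_score(fcst, obs, fixed, fixed)

pct = 95                                                 # use the 95th percentile of each field instead
ts_pct = threat_score(fcst, obs, np.percentile(fcst, pct), np.percentile(obs, pct))

# The percentile-based score rewards the matching pattern despite the intensity bias.
print(f"fixed-threshold TS: {ts_fixed:.2f}, percentile-based TS: {ts_pct:.2f}")
```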
The Moon-based Ultraviolet Telescope (MUVT) is one of the payloads on the Chang'e-3 (CE-3) lunar lander. Because of the advantages of having no atmospheric disturbances and the slow rotation of the Moon, we can make long-term continuous observations of a series of important celestial objects in the near-ultraviolet band (245-340 nm) and perform a sky survey of selected areas, which cannot be completed on Earth. We can find characteristic changes in celestial brightness with time by analyzing image data from the MUVT, and deduce the radiation mechanism and physical properties of these celestial objects after comparing them with a physical model. In order to explain the scientific purposes of the MUVT, this article analyzes the preprocessing of MUVT image data and makes a preliminary evaluation of data quality. The results demonstrate that the methods used for data collection and preprocessing are effective, and that the Level 2A and 2B image data satisfy the requirements of follow-up scientific research.
Recently, the use of mobile communication devices such as smartphones and cellular phones in field data collection has been increasing, owing to the emergence of embedded Global Positioning System (GPS) receivers and Wi-Fi Internet access. Accurate, timely and handy field data collection is required for disaster management and emergency quick response. In this article, we introduce a web-based GIS system that collects field data sent from personal mobile phones through a Post Office Protocol (POP3) mail server. The main objective of this work is to demonstrate a real-time field data collection method to students using their mobile phones, so that field data can be collected in a timely and handy manner, for either individual or group surveys at local or global scales.
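A schematic sketch of pulling mailed field reports from a POP3 server is shown below, assuming a hypothetical "lat,lon,comment" format in the subject line; the server address and credentials are placeholders, and this is not the system described in the article.

```python
# Sketch: collect field reports mailed from a phone via a POP3 mailbox.
import poplib
from email.parser import BytesParser
from email.policy import default

def fetch_field_reports(host, user, password):
    reports = []
    conn = poplib.POP3_SSL(host)
    conn.user(user)
    conn.pass_(password)
    n_messages = len(conn.list()[1])
    for i in range(1, n_messages + 1):
        raw = b"\n".join(conn.retr(i)[1])
        msg = BytesParser(policy=default).parsebytes(raw)
        try:
            # Assumed report format in the subject line: "lat,lon,comment"
            lat, lon, comment = msg["Subject"].split(",", 2)
            reports.append({"lat": float(lat), "lon": float(lon), "comment": comment.strip()})
        except (AttributeError, ValueError):
            continue                      # skip mails that do not follow the report format
    conn.quit()
    return reports

# Example usage (placeholder host and credentials):
# reports = fetch_field_reports("pop.example.org", "survey", "secret")
# for r in reports:
#     print(r["lat"], r["lon"], r["comment"])
```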
Building detection in very high resolution (VHR) images is crucial for mapping and analysing urban environments. Since buildings are elevated objects, elevation data need to be integrated with images for reliable detection. This process requires two critical steps: optical-elevation data co-registration and aboveground elevation calculation. These two steps are still challenging to some extent. Therefore, this paper introduces optical-elevation data co-registration and normalization techniques for generating a dataset that facilitates elevation-based building detection. To achieve accurate co-registration, a dense set of stereo-based elevations is generated and co-registered to the relevant image based on their corresponding image locations. To normalize these co-registered elevations, the bare-earth elevations are detected based on classification information of some terrain-level features after the image co-registration is achieved. The developed method was executed and validated. After implementation, an overall detection quality of 80% was achieved, with 94% correct detection. Together, the developed techniques successfully facilitate the incorporation of stereo-based elevations for detecting buildings in VHR remote sensing images.
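The normalization step can be pictured with the toy sketch below, which subtracts a crude bare-earth estimate from the co-registered elevations and keeps pixels rising well above the ground as building candidates. The minimum-filter ground estimate and the 2.5 m height cut are assumptions, not the paper's classification-based procedure.

```python
# Toy sketch of aboveground (normalized) elevation and an elevation-based building mask.
import numpy as np
from scipy.ndimage import minimum_filter

rng = np.random.default_rng(0)
dsm = 100.0 + rng.normal(0, 0.2, (200, 200))        # gently varying surface elevations (m)
dsm[60:90, 80:130] += 8.0                           # a synthetic 8 m tall building block

dtm = minimum_filter(dsm, size=51)                  # crude bare-earth estimate per neighbourhood
ndsm = dsm - dtm                                    # normalized (aboveground) elevation
building_mask = ndsm > 2.5                          # elevated-object candidates

print("candidate building pixels:", int(building_mask.sum()))   # ~30*50 = 1500
```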
This paper is concerned with the synchronization of delayed neural networks via sampled-data control. A new technique, namely the free-matrix-based time-dependent discontinuous Lyapunov functional approach, is adopted in constructing the Lyapunov functional, which takes advantage of the sampling characteristic of the sawtooth input delay. Based on this discontinuous Lyapunov functional, some less conservative synchronization criteria are established to ensure that the slave system is synchronized with the master system. The desired sampled-data controller can be obtained through the use of the linear matrix inequality (LMI) technique. Finally, two numerical examples are provided to demonstrate the effectiveness and the improvements of the proposed methods.
Painting is done according to the artist's style. The most representative elements of the style are the texture and shape of the brush strokes. Computer simulations allow an artist's painting to be reproduced by taking such strokes and pasting them onto the image; this is called stroke-based rendering. The quality of the result depends on the number and quality of the strokes, since strokes are what build up the image. It is not easy to render using a large amount of stroke information, as there is a limit to how many strokes can be scanned. In this work, we produce rendering results using mass data, generating large numbers of strokes by expanding existing strokes through warping. Through this, we obtain results of higher quality than those of previous studies. Finally, we also examine the correlation between the amount of data and the quality of the results.
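A rough sketch of expanding a few scanned strokes into mass data by warping is given below; the random affine warps and the toy stroke image stand in for whatever warping model was actually used in the work above.

```python
# Sketch: expand one scanned stroke into many variants by random affine warping.
import numpy as np
from scipy.ndimage import affine_transform

def random_warp(stroke, rng):
    """Apply a random rotation/scale/shear around the stroke's centre (illustrative)."""
    angle = rng.uniform(-0.5, 0.5)                       # radians
    scale = rng.uniform(0.8, 1.2)
    shear = rng.uniform(-0.2, 0.2)
    c, s = np.cos(angle), np.sin(angle)
    m = np.array([[c, -s], [s, c]]) @ np.array([[scale, shear], [0.0, scale]])
    centre = np.array(stroke.shape) / 2.0
    offset = centre - m @ centre                         # keep the warp centred
    return affine_transform(stroke, m, offset=offset, order=1, mode="constant")

rng = np.random.default_rng(0)
stroke = np.zeros((64, 64))
stroke[28:36, 8:56] = 1.0                                # toy horizontal brush stroke
expanded = [random_warp(stroke, rng) for _ in range(200)]  # 200 warped variants from one scan
print(len(expanded), expanded[0].shape)
```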
A novel optimal tracking control method is proposed in this paper for a class of discrete-time systems with actuator saturation and unknown dynamics. The scheme is based on an iterative adaptive dynamic programming (ADP) algorithm. In order to implement the control scheme, a data-based identifier is first constructed for the unknown system dynamics. By introducing an M-network, an explicit formulation of the steady-state control is obtained. In order to eliminate the effect of actuator saturation, a nonquadratic performance functional is introduced, and an iterative ADP algorithm is then established, together with a convergence analysis, to obtain the optimal tracking control solution. To realize the optimal control method, neural networks are used to build the data-based identifier, compute the performance index function, approximate the optimal control policy and solve for the steady-state control, respectively. Simulation examples are provided to verify the effectiveness of the presented optimal tracking control scheme.