Over the last few years, the Internet of Things (IoT) has become an omnipresent term. The IoT extends the familiar concepts of anytime and anyplace connectivity to connectivity for anything. The proliferation of IoT offers opportunities but may also bear risks. A hitherto neglected aspect is the possible increase in power consumption, as smart devices in IoT applications are expected to be reachable by other devices at all times. This implies that a device consumes electrical energy even when it is not in use for its primary function. Many research communities have started to address the storage capability (cache memory) of smart devices using Named Data Networking (NDN) to achieve a more energy-efficient communication model. In NDN, memory or buffer overflow is a common challenge: when a node's internal memory reaches its limit, data with the highest degree of freshness may not be accommodated, and the whole scenario behaves like a traditional network. In such cases, intermediate nodes do not perform data caching that guarantees the highest degree of freshness. For the periodic updates sent by data producers, it is strongly required that data consumers receive up-to-date information at the least energy cost. Consequently, maintaining the tradeoff between freshness and energy consumption during publisher-subscriber interaction is a challenge. In this work, we propose an architecture that overcomes the caching-strategy issue with a Smart Caching Algorithm that improves memory management and data freshness. The smart caching strategy updates the data at precise intervals while taking garbage data into consideration. Experiments also show that data redundancy can easily be avoided by ignoring or dropping data packets carrying information that is of no interest to the other participating nodes in the network, ultimately optimizing the tradeoff between freshness and the energy required.
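As an illustration of the caching idea described above, the sketch below (Python) shows how an NDN node might periodically sweep out stale "garbage" entries, drop data packets no consumer has expressed interest in, and evict the entry closest to expiry when the content store is full. The class name, refresh interval and eviction rule are our assumptions, not the paper's Smart Caching Algorithm.

```python
import time

class FreshnessCache:
    """Minimal sketch of a freshness-aware content store (names are hypothetical)."""

    def __init__(self, capacity, refresh_interval):
        self.capacity = capacity                  # max number of cached data packets
        self.refresh_interval = refresh_interval  # seconds between garbage sweeps
        self.store = {}                           # name -> (data, freshness_deadline)
        self.pending_interests = set()            # names some consumer has asked for
        self.last_sweep = time.monotonic()

    def on_interest(self, name):
        self.pending_interests.add(name)
        entry = self.store.get(name)
        if entry and entry[1] > time.monotonic():
            return entry[0]                       # cache hit with fresh data
        return None                               # miss: forward the interest upstream

    def on_data(self, name, data, freshness_period):
        now = time.monotonic()
        if now - self.last_sweep >= self.refresh_interval:
            self._sweep(now)                      # periodic garbage collection
        if name not in self.pending_interests:
            return False                          # no one cares: drop to save memory/energy
        if len(self.store) >= self.capacity:
            # evict the entry closest to expiry instead of overflowing the buffer
            stalest = min(self.store, key=lambda n: self.store[n][1])
            del self.store[stalest]
        self.store[name] = (data, now + freshness_period)
        return True

    def _sweep(self, now):
        self.store = {n: v for n, v in self.store.items() if v[1] > now}
        self.last_sweep = now
```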
The composition of base oils affects the performance of lubricants made from them. This paper proposes a hybrid model based on gradient-boosted decision trees (GBDT) to analyze the effect of different ratios of the KN4010, PAO40, and PriEco3000 components in a composite base oil system on the performance of lubricants. The study was conducted under small laboratory sample conditions, and a data expansion method using the Gaussian copula function was proposed to improve the prediction ability of the hybrid model. The study also compared four optimization algorithms, the slime mould algorithm (SMA), genetic algorithm (GA), whale optimization algorithm (WOA), and seagull optimization algorithm (SOA), for predicting the kinematic viscosity at 40 ℃, the kinematic viscosity at 100 ℃, the viscosity index, and the oxidation induction time of the lubricant. The results showed that the Gaussian copula data expansion method improved the prediction ability of the hybrid model in the case of small samples. The SOA-GBDT hybrid model had the fastest convergence speed and the best prediction effect, with determination coefficients (R²) for the four lubricant indicators reaching 0.98, 0.99, 0.96 and 0.96, respectively. Thus, this model can significantly reduce the prediction error and has good prediction ability.
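The data-expansion step can be illustrated with a minimal Gaussian-copula sampler followed by a GBDT fit, sketched below in Python with NumPy, SciPy and scikit-learn. The placeholder data, column layout and default GBDT settings are assumptions, and the SOA hyperparameter search used in the paper is omitted.

```python
import numpy as np
from scipy import stats
from sklearn.ensemble import GradientBoostingRegressor

def gaussian_copula_expand(X, n_new, seed=0):
    """Sample synthetic rows whose rank dependence mimics X (illustrative only)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # map each column to normal scores through its empirical CDF
    u = (stats.rankdata(X, axis=0) - 0.5) / n
    z = stats.norm.ppf(u)
    # estimate the Gaussian copula correlation and draw new latent points
    corr = np.corrcoef(z, rowvar=False)
    z_new = rng.multivariate_normal(np.zeros(d), corr, size=n_new)
    u_new = stats.norm.cdf(z_new)
    # push the uniforms back through each column's empirical quantile function
    return np.column_stack([np.quantile(X[:, j], u_new[:, j]) for j in range(d)])

# columns: KN4010, PAO40, PriEco3000 ratios plus one target (e.g. viscosity index)
samples = np.random.default_rng(1).random((20, 4))       # placeholder lab data
expanded = gaussian_copula_expand(samples, n_new=200)
X_aug, y_aug = expanded[:, :3], expanded[:, 3]
model = GradientBoostingRegressor().fit(X_aug, y_aug)    # GBDT on the expanded data
```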
Traditional oil-based drilling muds (OBMs) have a relatively high solid content, which is detrimental to penetration rate increase and reservoir protection. To solve this problem, an organoclay-free OBM system was studied: the synthesis methods and functioning mechanisms of the key additives were introduced, and the performance of the system was evaluated. The rheology modifier was prepared by reacting a dimer fatty acid with diethanolamine, the primary emulsifier was made by oxidation and addition reactions of fatty acids, the secondary emulsifier was made by amidation of a fatty acid, and the fluid loss additive, a water-soluble acrylic resin, was synthesized by introducing acrylic acid into styrene/butyl acrylate polymerization. The rheology modifier could enhance the attraction between droplets and particles in the emulsion via intermolecular hydrogen bonding and improve the shear stress by forming a three-dimensional network structure in the emulsion. Laboratory results show that the organoclay-free OBM could tolerate temperatures up to 220 ℃ and that its HTHP filtration is less than 5 mL. Compared with traditional OBMs, the organoclay-free OBM has low plastic viscosity, high shear stress, a high ratio of dynamic shear force to plastic viscosity and high permeability recovery, which are beneficial to penetration rate increase, hole cleaning and reservoir protection.
Recently, the use of mobile communication devices such as smartphones and cellular phones in field data collection has been increasing due to the emergence of embedded Global Positioning System (GPS) receivers and Wi-Fi Internet access. Accurate, timely and handy field data collection is required for disaster management and emergency quick responses. In this article, we introduce a web-based GIS system that collects field data from personal mobile phones through a Post Office Protocol (POP3) mail server. The main objective of this work is to demonstrate to students a real-time field data collection method that uses their mobile phones to collect field data in a timely and handy manner, for either individual or group surveys in local or global scale research.
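A minimal sketch of the collection path is given below, assuming a hypothetical POP3 account and a simple "LAT:... LON:..." line in the message body; the actual server, credentials and message format of the described system will differ.

```python
import poplib
import re
from email import message_from_bytes

# Hypothetical account details; the real server, login and message format will differ.
HOST, USER, PASSWORD = "pop.example.org", "fieldsurvey", "secret"
COORD = re.compile(r"LAT[:=]\s*(-?\d+\.\d+)\s+LON[:=]\s*(-?\d+\.\d+)")

def collect_field_reports():
    """Pull mails sent from surveyors' phones and extract GPS fixes from the body."""
    box = poplib.POP3_SSL(HOST)
    box.user(USER)
    box.pass_(PASSWORD)
    records = []
    n_messages = len(box.list()[1])
    for i in range(1, n_messages + 1):
        raw = b"\n".join(box.retr(i)[1])            # retr returns (resp, lines, octets)
        msg = message_from_bytes(raw)
        body = msg.get_payload(decode=True) or b""  # None for multipart messages
        match = COORD.search(body.decode("utf-8", errors="ignore"))
        if match:
            lat, lon = map(float, match.groups())
            records.append({"subject": msg["Subject"], "lat": lat, "lon": lon})
    box.quit()
    return records                                   # ready to plot on the web GIS
```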
This paper proposes a useful web-based system for the management and sharing of electron probe micro-analysis (EPMA) data in geology. A new web-based architecture that integrates the management and sharing functions is developed and implemented. Earth scientists can utilize this system not only to manage their data, but also to easily communicate and share it with other researchers. Data query methods provide the core functionality of the proposed management and sharing modules. The modules in this system have been developed using cloud GIS technologies, which help achieve real-time spatial area retrieval on a map. The system has been tested by approximately 263 users at Jilin University and the Beijing SHRIMP Center. A survey was conducted among these users to estimate the usability of the primary functions of the system, and the assessment result is summarized and presented.
Building detection in very high resolution (VHR) images is crucial for mapping and analysing urban environments. Since buildings are elevated objects, elevation data need to be integrated with images for reliable detection. This process requires two critical steps: optical-elevation data co-registration and aboveground elevation calculation. These two steps are still challenging to some extent. Therefore, this paper introduces optical-elevation data co-registration and normalization techniques for generating a dataset that facilitates elevation-based building detection. For accurate co-registration, a dense set of stereo-based elevations is generated and co-registered to the relevant image based on their corresponding image locations. To normalize these co-registered elevations, the bare-earth elevations are detected from classification information of some terrain-level features after the image co-registration is achieved. The developed method was implemented and validated: an overall detection quality of 80% was achieved, with 94% correct detection. Together, the developed techniques successfully facilitate the incorporation of stereo-based elevations for detecting buildings in VHR remote sensing images.
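The normalization step can be sketched as follows in Python: a bare-earth surface is interpolated from the terrain-level samples only and subtracted from the full co-registered surface to obtain aboveground elevations. The interpolation method and the 2.5 m building threshold in the comment are assumptions, not values from the paper.

```python
import numpy as np
from scipy.interpolate import griddata

def aboveground_elevation(points_xy, elevations, ground_mask, grid_x, grid_y):
    """Normalize co-registered elevations by subtracting an interpolated bare-earth surface.

    points_xy   : (N, 2) image-space coordinates of the stereo-based elevations
    elevations  : (N,) co-registered elevation values
    ground_mask : (N,) True where classification marks a terrain-level feature
    grid_x/y    : 2-D pixel grids on which the building analysis is done
    """
    # bare-earth surface interpolated only from terrain-level samples
    dtm = griddata(points_xy[ground_mask], elevations[ground_mask],
                   (grid_x, grid_y), method="linear")
    # full surface from all co-registered elevations
    dsm = griddata(points_xy, elevations, (grid_x, grid_y), method="linear")
    return dsm - dtm                     # aboveground (normalized) elevation

# candidate building pixels: more than ~2.5 m above bare earth (threshold assumed)
# building_mask = aboveground_elevation(...) > 2.5
```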
Cloud storage is widely used by large companies to store vast amounts of data and files, offering flexibility, financial savings, and security. However, information shoplifting poses significant threats, potentially leading to poor performance and privacy breaches. Blockchain-based cognitive computing can help protect and maintain information security and privacy in cloud platforms, ensuring that businesses can focus on business development. To ensure data security in cloud platforms, this research proposes a blockchain-based Hybridized Data Driven Cognitive Computing (HD2C) model. In particular, the proposed HD2C framework addresses breaches of the private information of the mixed Internet of Things (IoT) participants in the cloud. HD2C is developed by combining Federated Learning (FL) with a Blockchain consensus algorithm that connects smart contracts with Proof of Authority (PoA). The "Data Island" problem can be solved by FL's emphasis on privacy and lightning-fast processing, while Blockchain provides a decentralized incentive structure that is impervious to poisoning. FL with Blockchain allows quick consensus through smart member selection and verification. The HD2C paradigm significantly improves the computational processing efficiency of intelligent manufacturing. Extensive analysis results derived from IIoT datasets confirm the superiority of HD2C. Compared with other consensus algorithms, the foundational cost of Blockchain PoA is significant. The accuracy and memory utilization evaluation results indicate the overall benefits of the system. Compared with the values 0.004 and 0.04, the value 0.4 achieves good accuracy. According to the experimental results, the number of transactions per second has minimal impact on memory requirements. The findings of this study resulted in the development of a brand-new IIoT framework based on blockchain technology.
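A heavily simplified sketch of the FL side of such a design is given below: clients that pass a (hypothetical) selection score contribute a sample-size-weighted model average, as in FedAvg. The scoring rule, data structures and the omitted PoA/smart-contract logic are assumptions, not the HD2C implementation.

```python
import numpy as np

def select_members(clients, min_score=0.5):
    """'Smart member selection': keep only clients whose reported score passes a threshold.
    The scoring rule is an assumption; the paper's PoA/contract logic is not reproduced."""
    return [c for c in clients if c["score"] >= min_score]

def federated_average(clients):
    """Weight each selected client's model update by its local sample count (FedAvg)."""
    total = sum(c["n_samples"] for c in clients)
    return sum(c["n_samples"] / total * c["weights"] for c in clients)

clients = [
    {"id": "plant-a", "score": 0.9, "n_samples": 1200, "weights": np.ones(4) * 0.2},
    {"id": "plant-b", "score": 0.3, "n_samples": 800,  "weights": np.ones(4) * 0.9},
    {"id": "plant-c", "score": 0.8, "n_samples": 500,  "weights": np.ones(4) * 0.4},
]
global_weights = federated_average(select_members(clients))
# In a blockchain-backed setup, the resulting update would additionally be committed
# to the PoA chain by an authorized validator before the next round (simplified here).
```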
In studies of HIV, interval-censored data occur naturally. The HIV infection time is not usually known exactly; it is known only that it occurred before the survey, within some time interval, or has not occurred by the time of the survey. Infections are often clustered within geographical areas such as enumerator areas (EAs), thus inducing unobserved frailty. In this paper we consider an approach for estimating parameters when the infection time is unknown and assumed to be correlated within an EA, where the dependency is modeled through frailties, assuming a normal distribution for the frailties and a Weibull distribution for the baseline hazard. The data came from a household-based population survey that used a multi-stage stratified sample design to randomly select 23,275 interviewed individuals from 10,584 households, of whom 15,851 interviewed individuals were further tested for HIV (crude prevalence = 9.1%). A further test conducted among those who tested HIV positive found 181 (12.5%) recently infected. Results show a high degree of heterogeneity in the HIV distribution between EAs, translating to a modest correlation of 0.198. Intervention strategies should target geographical areas that contribute disproportionately to the HIV epidemic. Further research is needed to identify such hot spot areas and to understand what factors make these areas prone to HIV.
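A sketch of such a shared-frailty formulation, written in our own notation rather than the paper's, is:

```latex
% Shared normal frailty u_i for individuals j in enumerator area i,
% Weibull baseline hazard, covariates x_{ij}.
\begin{align*}
  h_{ij}(t \mid u_i) &= \lambda \rho t^{\rho-1}
      \exp\!\left(x_{ij}^{\top}\beta + u_i\right),
      \qquad u_i \sim N(0, \sigma^2),\\
  S_{ij}(t \mid u_i) &= \exp\!\left\{-\lambda t^{\rho}
      \exp\!\left(x_{ij}^{\top}\beta + u_i\right)\right\},\\
  L_i &= \int \prod_{j}
      \left[S_{ij}(L_{ij}\mid u) - S_{ij}(R_{ij}\mid u)\right]
      \phi(u; 0, \sigma^2)\,du.
\end{align*}
```

Here $(L_{ij}, R_{ij}]$ is the interval known to contain the infection time and $\phi(\cdot;0,\sigma^2)$ is the normal density; the modest within-EA correlation reported in the abstract (0.198) derives from the estimated frailty variance.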
This paper presents the semantic analysis of queries written in natural language (French) and addressed to object-oriented databases. The studied queries include one or two nominal groups (NGs) articulated around a verb. An NG consists of one or several keywords (application-dependent nouns or values). Simple semantic filters are defined for identifying these keywords, whose semantic value can be: class, simple attribute, composed attribute, key value or non-key value. Coherence rules and coherence constraints are introduced to check the validity of the co-occurrence of two consecutive nouns in a complex NG. If a query consists of a single NG, no further analysis is required. Otherwise, if a query contains two valid NGs, the semantic coherence of the verb and of both NGs attached to it is studied.
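A toy illustration of the filtering and coherence-rule idea follows, with a hypothetical miniature schema and pair-based rules; the paper's French lexicon, filters and constraints are considerably richer.

```python
# Hypothetical toy schema and filters; the paper's application lexicon differs.
SCHEMA = {
    "class":     {"client", "commande", "produit"},
    "attribute": {"nom", "prix", "adresse"},
    "key_value": {"C1024"},                     # known object keys
}

def semantic_value(word):
    """Tag a keyword of a nominal group with its semantic value."""
    for tag, words in SCHEMA.items():
        if word in words:
            return tag
    return "value"                              # fall back: non-key value

# Coherence rule: within a nominal group, which tag may directly follow which.
COHERENT_PAIRS = {("class", "attribute"), ("attribute", "value"),
                  ("class", "key_value")}

def nominal_group_is_coherent(keywords):
    tags = [semantic_value(w) for w in keywords]
    return all((a, b) in COHERENT_PAIRS for a, b in zip(tags, tags[1:]))

print(nominal_group_is_coherent(["client", "nom"]))   # True: class then attribute
print(nominal_group_is_coherent(["nom", "client"]))   # False: violates the rule
```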
1 Introduction Information technology has been playing an ever-increasing role in geoscience. Sophisticated database platforms are essential for the storage, analysis and exchange of geological Big Data (Feblowitz, 2013; Zhang et al., 2016; Teng et al., 2016; Tian and Li, 2018). The United States has built an information-sharing platform for state-owned scientific data as a national strategy.
Humic acids (HAs) are widely used as filtrate and viscosity reducers in drilling fluids. However, their practical utility is limited by poor salt resistance and poor high-temperature stability. High-temperature coal pitch (CP) is a by-product of coal pyrolysis above 650 ℃. Its molecular structure is characterized by a dense arrangement of aromatic hydrocarbons and alkyl substituents. This structure gives it distinctive chemical properties and excellent drilling performance, surpassing traditional humic acids in drilling operations. Potassium humate is prepared from CP (CP-HA-K) by thermal catalysis. A new type of high-quality, temperature-resistant humic acid viscosity reducer (Graft CP-HA-K polymer) is synthesized with CP-HA-K, hydrolyzed polyacrylonitrile sodium salt (Na-HPAN), urea, formaldehyde, phenol and acrylamide (AAM) as raw materials. The experimental results demonstrate that the most favorable conditions for the catalytic preparation of CP-HA-K are 1 wt% catalyst dosage, 30 wt% KOH dosage, a reaction temperature of 250 ℃, and a reaction time of 2 h, resulting in a maximum CP-HA-K yield of 39.58%. The temperature resistance of the Graft CP-HA-K polymer is measured to be 177.39 ℃, which is 55.39 ℃ higher than that of commercial HA-K. This is due to the abundance of amide, hydroxyl, and amine functional groups in the Graft CP-HA-K polymer, which increase the length of the carbon chains and enhance the electrostatic repulsion on the surface of solid particles. After being aged at 120 ℃ for a specified duration, the Graft CP-HA-K polymer demonstrates a significantly higher viscosity reduction (42.12%) than commercial HA-K (C-HA-K). Furthermore, the Graft CP-HA-K polymer can tolerate a high salt concentration of 8000 mg/L, measured after the addition of the optimum amount of 3 wt% Graft CP-HA-K polymer. The action mechanism of the Graft CP-HA-K polymer in high-temperature drilling fluid is that it increases the repulsive force between solid particles and disrupts bentonite's reticulated structure. Overall, this research provides novel insights into the synthesis of artificial humic acid materials and the development of temperature-resistant viscosity reducers, offering a new avenue for the utilization of CP resources.
The present study aimed to investigate the durability and microstructure evolution of road base materials (RBM) prepared from red mud and flue gas desulfurization fly ash. The durability testing showed that the strength of RBM with blast furnace slag additions of 1 wt%, 3 wt% and 5 wt% reached 3.81, 4.87, and 5.84 MPa after 5 freezing–thawing (F–T) cycles and 5.21, 5.75, and 6.98 MPa after 20 wetting–drying (W–D) cycles, respectively. The results also indicated that hydration products were continuously formed even during W–D and F–T exposures, resulting in an increase in the strength and durability of RBM. The observed increase in macropores (>1 μm) after F–T and W–D exposures suggested that the mechanism of RBM deterioration is pore enlargement due to cracks that develop inside the matrix. Moreover, the F–T exposure showed a greater negative effect on the durability of RBM than the W–D exposure. The leaching tests showed that sodium and heavy metals were solidified below the minimum requirement, which indicates that these wastes are suitable for use as a natural material replacement in road base construction.
With the development of cloud computing, mutual understandability among distributed data access controls has become an important issue in the security field of cloud computing. To ensure security, confidentiality and fine-grained data access control in the Cloud Data Storage (CDS) environment, we propose a Multi-Agent System (MAS) architecture. This architecture consists of two agents: the Cloud Service Provider Agent (CSPA) and the Cloud Data Confidentiality Agent (CDConA). CSPA provides a graphical interface to the cloud user that facilitates access to the services offered by the system. CDConA provides each cloud user with the definition and enforcement of an expressive and flexible access structure as a logic formula over cloud data file attributes. This new access control is named Formula-Based Cloud Data Access Control (FCDAC). Our proposed MAS-based FCDAC consists of four layers, the interface layer, the existing access control layer, the proposed FCDAC layer and the CDS layer, as well as four types of entities: the Cloud Service Provider (CSP), cloud users, the knowledge base and confidentiality policy roles. FCDAC is an access policy determined by our MAS architecture, not by the CSPs. A prototype of our proposed FCDAC scheme is implemented using the Java Agent Development Framework Security (JADE-S). Our results, in the practical scenario defined formally in this paper, show the Round Trip Time (RTT) for an agent to travel in our system, measured by the time required for an agent to travel around different numbers of cloud users before and after implementing FCDAC.
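The core of formula-based access control can be sketched as a recursive evaluation of a logic formula over the attributes a user holds. The tuple-based formula syntax below is our own illustration; it does not reproduce the FCDAC policy language or the JADE-S agents.

```python
# Hypothetical formula syntax: nested (op, operands...) tuples over file attributes.

def satisfies(formula, user_attrs):
    """Evaluate a logic formula over the attributes a cloud user holds."""
    op, *operands = formula
    if op == "ATTR":
        return operands[0] in user_attrs
    if op == "AND":
        return all(satisfies(f, user_attrs) for f in operands)
    if op == "OR":
        return any(satisfies(f, user_attrs) for f in operands)
    if op == "NOT":
        return not satisfies(operands[0], user_attrs)
    raise ValueError(f"unknown operator {op!r}")

# file policy: (finance AND manager) OR auditor
policy = ("OR",
          ("AND", ("ATTR", "dept:finance"), ("ATTR", "role:manager")),
          ("ATTR", "role:auditor"))

print(satisfies(policy, {"dept:finance", "role:manager"}))   # True
print(satisfies(policy, {"dept:hr", "role:staff"}))          # False
```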
With the increasing number of quantitative models available to forecast the volatility of crude oil prices, the assessment of the relative performance of competing models becomes a critical task. Our survey of the literature revealed that most studies tend to use several performance criteria to evaluate the performance of competing forecasting models; however, models are compared to each other using a single criterion at a time, which often leads to different rankings for different criteria, a situation where one cannot make an informed decision as to which model performs best when taking all criteria into account. In order to overcome this methodological problem, Xu and Ouenniche [1] proposed a multidimensional framework based on an input-oriented radial super-efficiency Data Envelopment Analysis (DEA) model to rank competing forecasting models of crude oil price volatility. However, their approach suffers from a number of issues. In this paper, we overcome these issues by proposing an alternative framework.
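For reference, the input-oriented radial super-efficiency DEA model (Andersen–Petersen type) that underlies such ranking frameworks can be written as below, with forecasting models as DMUs, "smaller-is-better" criteria as inputs and "larger-is-better" criteria as outputs; the specific modifications made by [1] or by the alternative framework are not restated here.

```latex
% Super-efficiency of model o: model o is excluded from its own reference set.
\begin{align*}
  \theta_o^{\mathrm{super}} = \min_{\theta,\,\lambda} \;\; & \theta \\
  \text{s.t.}\;\;
    & \sum_{j \ne o} \lambda_j x_{ij} \le \theta\, x_{io},
      && i = 1,\dots,m \quad \text{(inputs)}\\
    & \sum_{j \ne o} \lambda_j y_{rj} \ge y_{ro},
      && r = 1,\dots,s \quad \text{(outputs)}\\
    & \lambda_j \ge 0, && j \ne o.
\end{align*}
```

Because model o is excluded from its own reference set, efficient models can obtain scores above one, which permits a complete ranking rather than a tie at the efficiency frontier.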
Outlier detection is an important task in data mining. In fact, it is difficult to find the clustering centers in some sophisticated multidimensional datasets and to measure the deviation degree of each potential outlier. In this work, an effective outlier detection method based on multi-dimensional clustering and local density (ODBMCLD) is proposed. ODBMCLD first identifies the center objects from the local density peaks of the data objects and clusters the whole dataset based on these center objects. Then, outlier objects belonging to different clusters are marked as candidates of abnormal data. Finally, the top N points among these abnormal candidates are chosen as the final anomaly objects with high outlier factors. The feasibility and effectiveness of the method are verified by experiments.
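The density-peak step can be sketched as follows: for every point, a local density ρ (number of neighbours within a cutoff distance) and a distance δ to the nearest denser point are computed, and points with low ρ but large δ are reported as outliers. The cutoff, the combined outlier factor and the toy data are our assumptions, not the ODBMCLD parameters.

```python
import numpy as np
from scipy.spatial.distance import cdist

def density_peak_outliers(X, cutoff, top_n):
    """Sketch of the density-peak idea behind ODBMCLD (the outlier factor is our own choice).

    rho_i   : number of neighbours within the cutoff distance (local density)
    delta_i : distance to the nearest point of higher density
    Points with low rho and large delta are flagged as outliers.
    """
    d = cdist(X, X)
    rho = (d < cutoff).sum(axis=1) - 1                   # exclude the point itself
    delta = np.empty(len(X))
    for i in range(len(X)):
        higher = np.where(rho > rho[i])[0]
        delta[i] = d[i, higher].min() if len(higher) else d[i].max()
    outlier_factor = delta / (rho + 1)                   # simple combined score
    return np.argsort(outlier_factor)[::-1][:top_n]      # indices of the top-N outliers

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), [[8.0, 8.0], [-7.0, 6.0]]])
print(density_peak_outliers(X, cutoff=1.0, top_n=2))     # should flag the two far points
```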
In petroleum engineering, real-time lithology identification is very important for reservoir evaluation, drilling decisions and petroleum geological exploration. A lithology identification method while drilling based on machine learning and mud logging data is studied in this paper. This method can effectively utilize downhole parameters collected in real time during drilling to identify lithology in real time and provide a reference for the optimization of drilling parameters. Given the imbalance of lithology samples, the synthetic minority over-sampling technique (SMOTE) and Tomek links were used to balance the sample numbers of the five lithologies. Meanwhile, this paper introduces the Tent map, random opposition-based learning and dynamic perceived probability into the original crow search algorithm (CSA), establishing an improved crow search algorithm (ICSA). ICSA is then used to optimize the hyperparameter combinations of the random forest (RF), extremely random trees (ET), extreme gradient boosting (XGB), and light gradient boosting machine (LGBM) models. In addition, this study combines the recognition advantages of the four models: the accuracy of lithology identification by the weighted average probability model reaches 0.877. This study thus realizes a high-precision real-time lithology identification method that can provide a lithology reference for the drilling process.
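A condensed sketch of the balancing and ensembling steps is shown below, assuming imbalanced-learn and scikit-learn and substituting scikit-learn's gradient boosting for XGB/LGBM; the ICSA hyperparameter search is replaced by default settings, and the ensemble weights are placeholders rather than the paper's values.

```python
import numpy as np
from imblearn.combine import SMOTETomek
from sklearn.datasets import make_classification
from sklearn.ensemble import (ExtraTreesClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import train_test_split

# Placeholder stand-in for mud-logging features and five lithology classes.
X, y = make_classification(n_samples=1500, n_features=10, n_informative=6,
                           n_classes=5, weights=[0.5, 0.2, 0.15, 0.1, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Balance the five lithology classes with SMOTE over-sampling + Tomek-link cleaning.
X_bal, y_bal = SMOTETomek(random_state=0).fit_resample(X_tr, y_tr)

models = [RandomForestClassifier(random_state=0),
          ExtraTreesClassifier(random_state=0),
          GradientBoostingClassifier(random_state=0)]
weights = np.array([0.4, 0.3, 0.3])          # assumed weights; the paper tunes these

probas = [m.fit(X_bal, y_bal).predict_proba(X_te) for m in models]
avg_proba = np.tensordot(weights, np.stack(probas), axes=1)   # weighted average of probabilities
y_pred = avg_proba.argmax(axis=1)
print("accuracy:", (y_pred == y_te).mean())
```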
In this paper, we propose a rule management system for data cleaning that is based on knowledge. This system combines features of both rule-based systems and rule-based data cleaning frameworks. The important advantages of our system are threefold. First, it proposes a strong and unified rule form, based on first-order structures, that permits the representation and management of all types of rules and of their quality via some characteristics. Second, it increases the quality of rules, which conditions the quality of data cleaning. Third, it uses an appropriate knowledge acquisition process, which is the weakest task in current rule- and knowledge-based systems. As several research works have shown that data cleaning is driven by domain knowledge rather than by data, we have identified and analyzed the properties that distinguish knowledge and rules from data in order to better determine the main components of the proposed system. To illustrate our system, we also present a first experiment with a case study in the health sector, where we demonstrate how the system is useful for the improvement of data quality. The autonomy, extensibility and platform independence of the proposed rule management system facilitate its incorporation into any system that is interested in data quality management.
In a typical oil-based mud environment, the borehole fluid and mud cake are highly resistive and will not permit any significant current flow from the tool to the formation. In order to overcome the high insulation effect of the medium, the measurement current must be injected at a relatively high frequency, since most of the conduction is due to capacitive coupling. In this paper, an OBIT numerical model based on the four-terminal method was established to study the tool responses during measurement. The influences of the tool parameters, such as the area and spacing of the current-injector electrodes, the injection frequency, the spacing of the button sensors, the standoff and the electrical properties of the borehole fluid, on the tool responses were investigated, and the tool optimization was discussed.