Funding: Supported by the National Natural Science Foundation of China (No. 51278474), the Special Research Project of Earthquake Engineering (No. 201108003), and the International Science and Technology Cooperation Program of China (No. 2011DFA21460).
Abstract: The Tohoku megathrust earthquake, which occurred on March 11, 2011, with an epicenter 70 km east of Tohoku, Japan, resulted in an estimated tens of billions of dollars in damage and a death toll of more than 15,000 lives, yet few studies have documented its key spatio-temporal seismogenic characteristics, specifically the temporal decay of aftershock activity, the number of strong aftershocks (magnitude ≥ 7.0), the magnitude of the largest aftershock, and the area of possible aftershocks. The forecasted results of this study are based on the Gutenberg-Richter relation, Båth's law, Omori's law, and Wells' rupture-scale relation, utilizing the magnitudes and statistical parameters of earthquakes in the USA and China (the Landers, Northridge, Hector Mine, San Simeon, and Wenchuan earthquakes). The number of strong aftershocks, the parameters of the Gutenberg-Richter relation, and the modified form of Omori's law are confirmed against the aftershock sequence data from the Mw 9.0 Tohoku earthquake. Moreover, for a large earthquake, the seismogenic structure could be a fault, a fault system, or an intersection of several faults. The seismogenic structure of this event suggests that it occurred on a thrust fault near the Japan Trench within the overriding plate and subsequently triggered three or more active faults that produced large aftershocks.
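The statistical relations named in this abstract can be sketched numerically. The following is a minimal illustration of the standard textbook forms, not the paper's actual implementation; all parameter values in the comments are hypothetical.

```python
import numpy as np

def gr_count(m, a, b):
    """Gutenberg-Richter relation: expected number of events with
    magnitude >= m, where log10 N(m) = a - b*m."""
    return 10.0 ** (a - b * np.asarray(m, dtype=float))

def omori_rate(t, K, c, p):
    """Modified Omori (Omori-Utsu) law: aftershock rate n(t) = K / (t + c)^p,
    with t the elapsed time since the mainshock."""
    return K / (np.asarray(t, dtype=float) + c) ** p

def b_value_mle(mags, m_c):
    """Maximum-likelihood b-value estimate (Aki, 1965):
    b = log10(e) / (mean(M) - Mc), using only events with M >= Mc."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

def bath_largest_aftershock(m_mainshock, d=1.2):
    """Bath's law: the largest aftershock is typically about d ~ 1.2
    magnitude units smaller than the mainshock."""
    return m_mainshock - d
```

For example, applying Båth's law with the canonical d = 1.2 to a Mw 9.0 mainshock gives an expected largest aftershock of about Mw 7.8.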
Funding: The authors would like to acknowledge the Petroleum Equipment Reliability and Materials Laboratory of the University of Boumerdes for their assistance throughout this study.
Abstract: The continuous evaluation of the measured Stand Pipe Pressure (SPP) against a modeled SPP value in real time enables the automatic detection of undesirable drilling events such as drill string washouts and mud pump failures. Numerous theoretical and experimental studies have been established to calculate the friction pressure losses using different rheological models, based on an extension of pipe flow correlations to an annular geometry. However, it is not feasible to employ these models for real-time applications, since they are limited to certain conditions and intervals of application and require input parameters that might not be available in real time on every rig. In this study, the Group Method of Data Handling (GMDH) is applied to develop a trustworthy model that can predict the SPP in real time as a function of mud flow, well depth, RPM, and the Fann VG viscometer readings at 600 and 300 rpm. To accomplish the modeling task, 3351 data points were collected from two wells in Algerian fields. Graphical and statistical assessment criteria showed that the model predictions are in excellent agreement with the experimental data, with a coefficient of determination of 0.9666 and an average percent relative error of less than 2.401%. Furthermore, another dataset (1594 data points) from well-3 was employed to validate the developed correlation for SPP. The obtained results confirmed that the proposed GMDH-SPP model can be applied in real time to estimate the SPP with high accuracy. Moreover, the proposed GMDH correlation was found to follow the physically expected trends with respect to the employed input parameters. Lastly, the findings of this study can aid in the early detection of downhole problems such as drill string washout, pump failure, and bit balling.
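A minimal sketch of the assessment criteria mentioned above (coefficient of determination and average absolute percent relative error), alongside the standard Ivakhnenko quadratic polynomial that forms the basic GMDH neuron. This is the generic GMDH base function fitted by least squares, not the paper's exact network; the test data is illustrative.

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def aapre(y_true, y_pred):
    """Average absolute percent relative error, in %."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def fit_gmdh_neuron(x1, x2, y):
    """Least-squares fit of the standard GMDH (Ivakhnenko) quadratic neuron:
    y = a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def gmdh_neuron_predict(coef, x1, x2):
    """Evaluate a fitted GMDH quadratic neuron on new inputs."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    return X @ coef
```

In a full GMDH network, such two-input neurons are fitted for every input pair, the best-performing ones are kept by an external selection criterion, and their outputs feed the next layer until performance stops improving.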
Abstract: Currently, weight reduction in the automotive industry is an important means of reducing air pollution. In this context, the purpose of the present paper is to analyze a real case study by comparing the environmental impacts of a conventional steel bumper and a polyester prototype. In the first part of this work, a door-to-door life-cycle assessment (LCA) methodology was used to study the component manufacturing phase. The SimaPro 7.1 software was used to evaluate the impacts of both bumpers on the environment and on health. The second part is devoted to the analysis of dust from the polyester workshop. The obtained results allowed us to show the company that its substitution of steel by polyester is advantageous for certain impacts, including climate change, but that, given the working conditions in the polyester workshop, there may unfortunately be a transfer of impact, since it results in a health risk (irritation, cancer) for the workers. LCA has proven to be a very useful tool for validating a redesigned automotive component from an environmental point of view. From this case study, several recommendations were made for the company to design environmentally friendly components, and ecodesign should be introduced into the company's procedures.
Funding: The authors would like to express their special gratitude to the Directorate General for Scientific Research and Technological Development (DGRSDT) for supporting this work under subvention number C0662300 and grant number 167/PNE.
Abstract: Purpose - The study of skyline queries has received considerable attention from database researchers since the end of the 2000s. Skyline queries are an appropriate tool to help users make intelligent decisions over multidimensional data when different, and often contradictory, criteria are to be taken into account. Based on the concept of Pareto dominance, the skyline process extracts the most interesting objects (those not dominated in the Pareto sense) from a set of data. Skyline computation methods often lead to a set of large size, which is less informative for end users and not easy to exploit. The purpose of this paper is to tackle this problem, known as the large-size skyline problem, and to propose a solution that deals with it by applying an appropriate refining process.
Design/methodology/approach - The skyline refinement problem is formalized in the fuzzy formal concept analysis setting. Then, an ideal fuzzy formal concept is computed in the sense of some particular defined criteria. By leveraging the elements of this ideal concept, the size of the computed skyline can be reduced.
Findings - An appropriate and rational solution is discussed for the problem of interest. A tool, named SkyRef, is developed, and rich experiments are conducted with it on both synthetic and real datasets.
Research limitations/implications - The authors have conducted experiments on synthetic and some real datasets to show the effectiveness of the proposed approaches. However, thorough experiments on large-scale real datasets are highly desirable to show the behavior of the tool with respect to performance and execution time.
Practical implications - The developed tool, SkyRef, can serve many application domains that require decision-making or personalized recommendation and where the size of the skyline has to be reduced. In particular, SkyRef can be used in several real-world applications in areas such as economics, security, medicine, and services.
Social implications - This work can be exploited in all domains that require decision-making, such as hotel finders, restaurant recommenders, recruitment of candidates, etc.
Originality/value - This study bridges two research fields: artificial intelligence (formal concept analysis) and databases (skyline queries). The key elements of the solution proposed for the skyline refinement problem are borrowed from fuzzy formal concept analysis, which makes the solution clearer and, semantically speaking, more rational. On the other hand, this study opens the door to using formal concept analysis and its extensions to solve other issues related to skyline queries, such as relaxation.
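The Pareto-dominance skyline at the heart of this abstract can be sketched in a few lines. This is the naive O(n^2) baseline (assuming lower values are preferred on every criterion), not the authors' fuzzy-FCA refinement method; the hotel data is hypothetical.

```python
def dominates(a, b):
    """a Pareto-dominates b if a is at least as good on every criterion
    and strictly better on at least one (lower is better here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    """Naive O(n^2) skyline: keep every point that no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical hotel-finder example: tuples of (price, distance_km).
hotels = [(100, 2.0), (80, 1.0), (120, 0.5), (90, 3.0)]
print(skyline(hotels))  # -> [(80, 1.0), (120, 0.5)]
```

Note that (100, 2.0) and (90, 3.0) are dominated by (80, 1.0), while the two surviving points trade price against distance; on high-dimensional data this surviving set can grow very large, which is exactly the problem the paper's refinement process targets.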