Funding: Supported by the Key Research and Development Project of Hubei Province (Nos. 2020BAB114 and 2023BAB094).
Abstract: In the manufacturing industry, reasonable scheduling can greatly improve production efficiency, while excessive resource consumption highlights the growing significance of energy conservation in production. This paper studies the energy-efficient distributed heterogeneous permutation flowshop scheduling problem with variable processing speed (DHPFSP-VPS), considering both minimum makespan and total energy consumption (TEC) as objectives. A discrete multi-objective squirrel search algorithm (DMSSA) is proposed to solve the DHPFSP-VPS. DMSSA makes four improvements to the squirrel search algorithm. Firstly, in terms of the population initialization strategy, four hybrid initialization methods targeting different objectives are proposed to enhance the quality of initial solutions. Secondly, the population hierarchy system and position updating methods of the squirrel search algorithm are enhanced, making it more suitable for discrete scheduling problems. Additionally, regarding the search strategy, six local searches are designed based on problem characteristics to enhance the search capability. Moreover, a dynamic predator strategy based on Q-learning is devised to effectively balance DMSSA's capability for global exploration and local exploitation. Finally, two speed-control energy-efficient strategies are designed to reduce TEC. Extensive comparative experiments are conducted to validate the effectiveness of the proposed strategies. Comparisons of DMSSA with other algorithms demonstrate its superior performance and its potential for efficiently solving the DHPFSP-VPS problem.
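For readers unfamiliar with the Q-learning component, the sketch below shows the standard tabular update that such a dynamic predator strategy could rely on to decide between global exploration and local exploitation; the states, actions, reward signal, and hyperparameters are illustrative assumptions, not the authors' exact design.

```python
import random

# Minimal sketch of tabular Q-learning for a predator-style exploration/exploitation switch.
# States, actions, reward, and hyperparameters are assumed for illustration only.
STATES = ["stagnant", "improving"]        # assumed: recent progress of the search
ACTIONS = ["explore", "exploit"]          # assumed: predator-driven global move vs. local search
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2     # learning rate, discount factor, epsilon-greedy rate

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def choose_action(state: str) -> str:
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state: str, action: str, reward: float, next_state: str) -> None:
    """Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# Example: reward an action that improved makespan/TEC in the last generation.
update("stagnant", "explore", reward=1.0, next_state="improving")
print(choose_action("stagnant"), Q)
```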
Funding: Supported by the National Natural Science Foundation of China (81130024) and the National Key Technology R&D Program of the Ministry of Science and Technology of China during the 12th Five-Year Plan (2012BAI01B06).
Abstract: Dear Editor, A few studies have focused on exploring APOE gene-related effects on cognitive functions and brain activities in healthy populations. Bondi et al. found that ε4 carriers perform significantly worse on the California Verbal Learning Test than non-carriers in non-demented old subjects (mean age, 72 years)[1]. But the results are not entirely consistent. For example, Scarmeas et al. found no effect of the ε4 allele on neuropsychological performance[2] in young adults, and Jochemsen et al. found that the ε4 allele is associated with age-related cognitive decline[3]. Furthermore, reports of protective and negative effects of the ε2 allele on cognition are inconsistent[4,5]. APOE ε2 is thought to be a protective allele for AD in the elderly population due to its role in the superior cognitive performance of ε2 carriers compared to ε3 or ε4 carriers[5]. However, the ε2 allele has also been found to have a negative effect on AD pathology[4].
Funding: Financially supported by the Ministry of Science and Technology (863 Program, 2006AA09A103-4), the National Natural Science Foundation of China (11232012), and the Chinese Academy of Sciences (CAS) Knowledge Innovation Program (KJCXYW-L02).
Abstract: In offshore engineering design, an adequately accurate estimation of marine environmental parameters, in particular the extreme wind speed of tropical cyclones (TCs) with different return periods, is of considerable significance for guaranteeing safety over the projected operating life. Based on 71 years (1945-2015) of TC data for the Northwest Pacific (NWP) from the Joint Typhoon Warning Center (JTWC) of the US, a notable growth of TC intensity is observed in the context of climate change. This implies that a traditional stationary model might be incapable of predicting parameters for extreme events. Therefore, a non-stationary model is proposed in this study to estimate extreme wind speed in the South China Sea (SCS) and the NWP. We find that the extreme wind speeds of different return periods exhibit an evident enhancement trend; for instance, the extreme wind speeds with different return periods given by the non-stationary model are 4.1%-4.4% higher than the stationary ones in the SCS. Also, the spatial distribution of extreme wind speed in the NWP has been examined with the same methodology by dividing the western sea areas of the NWP (0°-45°N, 105°E-130°E), where oil and gas resources are abundant, into 45 subareas of 5° × 5°. Remarkable spatial inhomogeneity in the extreme wind speed is seen in this area: the extreme wind speed with a 50-year return period in the subarea (15°N-20°N, 115°E-120°E) of the Zhongsha and Dongsha Islands is 73.8 m/s, while that in the Yellow Sea subarea (30°N-35°N, 120°E-125°E) is only 47.1 m/s. As a result, the present study demonstrates that non-stationary and inhomogeneous effects should be taken into consideration in the estimation of extreme wind speed.
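The abstract does not spell out the model form, but a common way to build a non-stationary extreme value model is to let the location parameter of a generalized extreme value (GEV) distribution drift with time; the sketch below is an assumed illustration, not necessarily the authors' formulation, showing how the return level then becomes time-dependent.

```latex
% Illustrative non-stationary GEV with a linear trend in the location parameter (assumed form):
G(v; t) = \exp\left\{ -\left[ 1 + \xi\,\frac{v - \mu(t)}{\sigma} \right]^{-1/\xi} \right\},
\qquad \mu(t) = \mu_0 + \mu_1 t .

% Setting G(v_T; t) = 1 - 1/T gives a return level that depends on the year t:
v_T(t) = \mu(t) + \frac{\sigma}{\xi}\left\{ \left[ -\ln\!\left( 1 - \frac{1}{T} \right) \right]^{-\xi} - 1 \right\}.
```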
Funding: This research was funded by Prince Sattam bin Abdulaziz University (Project Number PSAU/2023/01/25387).
Abstract: The research aims to improve the performance of image recognition methods based on a description in the form of a set of keypoint descriptors. The main focus is on increasing the speed of establishing the relevance of object and etalon descriptions while maintaining the required level of classification efficiency. The class to be recognized is represented by an infinite set of images obtained from the etalon by applying arbitrary geometric transformations. It is proposed to reduce the descriptions in the etalon database by selecting the most significant descriptor components according to an information content criterion. The informativeness of an etalon descriptor is estimated by the difference of the closest distances to its own and to other descriptions. The developed method determines the relevance of the full description of the recognized object with the reduced descriptions of the etalons. Several practical models of the classifier with different options for establishing the correspondence between object descriptors and etalons are considered. The results of experimental modeling of the proposed methods on a database of museum jewelry images are presented. The test sample is formed as a set of images from the etalon database and from outside the database, with geometric transformations of scale and rotation applied in the field of view. The practical problem of determining the threshold for the number of votes on which a classification decision is based has been researched. Modeling has revealed the practical possibility of reducing descriptions tenfold with full preservation of classification accuracy. Reducing the descriptions by twenty times in the experiment leads to slightly decreased accuracy. The speed of the analysis increases in proportion to the degree of reduction. The use of reduction by the informativeness criterion confirmed the possibility of obtaining the most significant subset of features for classification, which guarantees a decent level of accuracy.
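A minimal sketch of the informativeness criterion described above, under the assumption that Euclidean distance is used and that informativeness is the nearest-other-etalon distance minus the nearest-own-description distance; the paper may define the criterion differently.

```python
import numpy as np

def informativeness(d: np.ndarray, own_rest: np.ndarray, other_desc: np.ndarray) -> float:
    """
    Closest distance from descriptor d to descriptors of other etalons minus the closest
    distance to the remaining descriptors of its own etalon. Larger values mean d
    separates its etalon better. Distance choice and sign convention are assumptions.
    """
    d_own = np.min(np.linalg.norm(own_rest - d, axis=1))
    d_other = np.min(np.linalg.norm(other_desc - d, axis=1))
    return float(d_other - d_own)

def reduce_description(etalon: np.ndarray, others: list, keep_ratio: float = 0.1) -> np.ndarray:
    """Keep only the most informative share of an etalon's descriptors (e.g. the top 10%)."""
    other_desc = np.vstack(others)
    scores = np.array([
        informativeness(d, np.delete(etalon, i, axis=0), other_desc)
        for i, d in enumerate(etalon)
    ])
    k = max(1, int(keep_ratio * len(etalon)))
    return etalon[np.argsort(scores)[::-1][:k]]
```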
Abstract: Cognitive impairment is a common clinical manifestation of multiple sclerosis, but its pathophysiology is not completely understood. White and grey matter injury, together with synaptic dysfunction, do play a role. The measurement of biomarkers in the cerebrospinal fluid and the study of their association with cognitive impairment may provide interesting in vivo evidence of the biological mechanisms underlying multiple sclerosis-related cognitive impairment. So far, only a few studies on this topic have been published, giving interesting results that deserve further investigation. Cerebrospinal fluid biomarkers of different pathophysiological mechanisms seem to reflect different neuropsychological patterns of cognitive deficits in multiple sclerosis. The aim of this review is to discuss the studies that have correlated cerebrospinal fluid markers of immune, glial, and neuronal pathology with cognitive impairment in multiple sclerosis. Although preliminary, these findings suggest that cerebrospinal fluid biomarkers show some correlation with cognitive performance in multiple sclerosis, thus providing interesting insights into the mechanisms underlying the involvement of specific cognitive domains.
Abstract: The results of the development of a new high-speed method for image classification using a structural approach are presented. The method is based on a system of hierarchical features built on the bitwise data distribution of the set of descriptors in an image description. The article also proposes the use of a spatial data processing apparatus, which simplifies and accelerates the classification process. Experiments have shown that the time needed to calculate the relevance of two descriptions from their distributions is about 1000 times less than that of the traditional voting procedure, in which the sets of descriptors are compared directly. The introduction of the system of hierarchical features allows the calculation time to be reduced by a further factor of 2-3 while ensuring high classification efficiency. The noise immunity of the method to additive noise has been studied experimentally. According to the results, the marginal degree of the feature hierarchy for reliable classification with a noise standard deviation below 30 is the 8-bit distribution. Computing costs increase proportionally with decreasing bit depth of the distribution. The method can be used for applied tasks where object identification time is critical.
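The following sketch illustrates one plausible reading of the bitwise-distribution idea: an image description (a set of binary descriptors) is collapsed into a per-bit frequency vector, optionally re-quantized to a chosen bit depth, so two images are compared by a single vector distance rather than an all-pairs descriptor vote. The quantization scheme and the L1 distance are assumptions, not the authors' exact procedure.

```python
import numpy as np

def bit_distribution(descriptors: np.ndarray) -> np.ndarray:
    """Fraction of descriptors with a 1 in each bit position (one binary descriptor per row)."""
    return descriptors.mean(axis=0)

def quantize(dist: np.ndarray, bits: int = 8) -> np.ndarray:
    """Assumed 'hierarchy of features': re-quantize the distribution to a coarser bit depth."""
    levels = 2 ** bits - 1
    return np.round(dist * levels).astype(np.uint16)

def relevance(dist_a: np.ndarray, dist_b: np.ndarray) -> float:
    """Compare two descriptions through their distributions; L1 distance is an assumption."""
    return float(np.abs(dist_a.astype(float) - dist_b.astype(float)).sum())

# Usage with assumed 512-bit binary descriptors (random data for illustration only).
obj = np.random.randint(0, 2, size=(300, 512))
etalon = np.random.randint(0, 2, size=(280, 512))
print(relevance(quantize(bit_distribution(obj)), quantize(bit_distribution(etalon))))
```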
Funding: The authors received specific funding for this research (Project Number IF-PSAU-2021/01/18487).
Abstract: The problem of image recognition in computer vision systems is studied. The results of the development of efficient classification methods, with processing speed as the figure of merit, based on the analysis of the segment representation of the structural description in the form of a set of descriptors, are provided. We propose three versions of the classifier according to the following principles: “object-etalon”, “object descriptor-etalon”, and “vector description of the object-etalon”, which differ in the level of integration of the analyzed data. Options are implemented for constructing clusters over the whole set of descriptions in the etalon database and separately for each of the etalons, as well as an optimal method for comparing the sets of segment centers of the etalons and the object. An experimental evaluation of the created classifiers in terms of productivity, processing time, and classification quality has been carried out. The proposed methods classify the set of etalons without error. We draw conclusions about the efficiency of classification approaches based on segment centers. The image processing time of the developed methods is hundreds of times less than that of the traditional one, without reducing the accuracy.
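As a rough illustration of classification by segment centers: each etalon's descriptor set is summarized by a few cluster centers, and an object's descriptors vote for the etalon owning the closest center. The use of k-means and the nearest-center voting rule below are assumptions, not necessarily the authors' exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

def etalon_centers(descriptors: np.ndarray, n_segments: int = 16) -> np.ndarray:
    """Summarize one etalon's descriptor set by a small number of cluster (segment) centers."""
    return KMeans(n_clusters=n_segments, n_init=10).fit(descriptors).cluster_centers_

def classify(obj_desc: np.ndarray, centers_per_etalon: list) -> int:
    """Each object descriptor votes for the etalon owning the closest segment center."""
    votes = np.zeros(len(centers_per_etalon))
    for d in obj_desc:
        dists = [np.min(np.linalg.norm(centers - d, axis=1)) for centers in centers_per_etalon]
        votes[int(np.argmin(dists))] += 1
    return int(np.argmax(votes))
```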
Abstract: In textile finishing, stenters always draw considerable attention to newer inventions that boost production through maximum utilization of energy. Prior to the main drying or heat-setting chambers, intermediate heating of a cylindrical system, especially by steam, directly benefits moisture evaporation, processing speed, fabric quality, and so on. Based on actual operational data, this study reveals the outcomes of a pre-heating module installed within a stenter. After employing the pre-heating system on knit fabrics of different structures and compositions, a 23% - 61% moisture reduction was found, and the speed of processing fabrics increased simultaneously by 17% - 30% without any compromise on fabric quality. Moreover, no less than 8.21% savings in annual electricity consumption were observed.
Funding: Item sponsored by the National Basic Research Program of China (2011CB606303) and the Constructed Project for Key Laboratory of Beijing of China.
Abstract: The hot deformation behavior of a new type of M3:2 high speed steel with niobium addition, made by spray forming, was investigated based on compression tests in the temperature range of 950-1150 ℃ and at strain rates of 0.001-10 s^(-1). A comprehensive constitutive equation was obtained, which could be used to predict the flow stress at different strains. A processing map was developed on the basis of the flow stress data using the principles of the dynamic material model. The results showed that the flow curves were in fair agreement with the dynamic recrystallization model. The flow stresses calculated by the comprehensive constitutive equation agreed well with the test data at low strain rates (≤1 s^(-1)). The material constant (α), stress exponent (n), and hot deformation activation energy (Q_(HW)) of the new steel were 0.00615 MPa^(-1), 4.81, and 546 kJ·mol^(-1), respectively. Analysis of the processing map together with observation of the microstructures revealed that hot working of the steel can be carried out safely in the domain (T = 1050-1150 ℃, ε = 0.01-0.1 s^(-1)) with a peak efficiency of power dissipation (η) of about 33%. Cracking was expected in two domains, at either lower temperatures (<1000 ℃) or low strain rates (0.001 s^(-1)), with different cracking mechanisms. Flow localization occurred when the strain rate exceeded 1 s^(-1) at all testing temperatures.
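The abstract reports α, n, and Q_(HW) but not the functional form of the constitutive equation; in hot-deformation studies these constants conventionally enter the Arrhenius-type hyperbolic sine law with the Zener-Hollomon parameter, reconstructed below as an assumption rather than the paper's exact equation.

```latex
% Assumed standard hyperbolic sine Arrhenius law and Zener-Hollomon parameter in which the
% reported constants conventionally appear (the paper's exact equation may differ):
\dot{\varepsilon} = A\,[\sinh(\alpha\sigma)]^{\,n}\exp\!\left(-\frac{Q_{\mathrm{HW}}}{RT}\right),
\qquad
Z = \dot{\varepsilon}\,\exp\!\left(\frac{Q_{\mathrm{HW}}}{RT}\right) = A\,[\sinh(\alpha\sigma)]^{\,n},

% with \alpha = 0.00615\ \mathrm{MPa^{-1}},\ n = 4.81,\ Q_{\mathrm{HW}} = 546\ \mathrm{kJ\,mol^{-1}},
% R = 8.314\ \mathrm{J\,mol^{-1}\,K^{-1}},\ \sigma the flow stress and \dot{\varepsilon} the strain rate.
```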
Abstract: The aim of this project was to develop non-contact fiber optic based displacement sensors to operate in the harsh environment of a "light gas gun" (LGG), which can "fire" small particles at velocities ranging from 1 km/s to 8.4 km/s. The LGG is used extensively in aerospace research to analyze the effects of high speed impacts on materials. Ideally, the measurement should be made close to the center of the impact to minimize corruption of the data by edge effects, and the sensor should survive the impact. We chose to develop a non-contact "pseudo" confocal intensity sensor, which demonstrated resolution comparable with conventional polyvinylidene fluoride (PVDF) sensors combined with high survivability and low cost. A second sensor was developed based on "fiber Bragg gratings" (FBG) to enable a more detailed analysis of the effects of the impact; although it requires contact with the target, the low weight and very small contact area of the FBG had minimal effect on the dynamics of the target. The FBG was mounted either on the surface of the target or tangentially between the target and a fixed location. The output signals from the FBG were interrogated in time by a new method. Measurements were made on carbon fiber composite plates in the LGG and in low velocity impact tests. The particle momentum for the low velocity impact tests was chosen to be similar to that of the particles used in the LGG.
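For background on how an FBG converts target strain into an optical signal, the standard textbook relations are sketched below; these are general relations, not details taken from this paper.

```latex
% Standard fiber Bragg grating relations, given only as background (not taken from the paper):
\lambda_B = 2\,n_{\mathrm{eff}}\,\Lambda,
\qquad
\frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\varepsilon + (\alpha_\Lambda + \alpha_n)\,\Delta T,

% where n_eff is the effective refractive index, \Lambda the grating period, p_e the effective
% photo-elastic coefficient (about 0.22 for silica fiber), \varepsilon the axial strain
% transferred from the target, and the last term the thermal response.
```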
Funding: Parts of this work were funded by the Natural Sciences and Engineering Research Council of Canada (NSERC agreement 347825-06), ConocoPhillips (agreement 4204638), Alberta Innovates Energy and Environment Solutions (AERI agreement 1711), the Schulich School of Engineering at the University of Calgary, and Servipetrol Ltd. Porosities and permeabilities from Nikanassin drill cuttings were determined by Nisael Solano of the University of Calgary using Darcylog equipment provided by Mr. Roland Lenormand of Cydarex in Paris, France. The 3D hydraulic fracturing simulation was performed using GOHFER, contributed to the GFREE research program by R.D. Barree of B&A and Kevin Svatek of Core Lab.
Abstract: Shale gas reservoirs are found all over the world. Their worldwide endowment is estimated at 10,000 tcf by the GFREE team in the Schulich School of Engineering at the University of Calgary. The shale gas work and production initiated successfully in the United States and extended to Canada will have application, with modifications, in several other countries in the future. The ‘modifications’ qualifier is important, as each shale gas reservoir should be considered a research project in itself to avoid fiascos and major financial losses. Shale gas reservoirs are best represented by at least quadruple porosity models. Some of the production obtained from shale reservoirs is dominated by diffusion flow. The approximate boundary between viscous and diffusion-like flow is estimated with the Knudsen number. Viscous flow is present, for example, when the architecture of the rock is dominated by mega, macro, meso, and sometimes micro pore throats. Diffusion flow, on the other hand, is observed at the nano pore throat level. The process speed concept has been used successfully in conventional reservoirs for several decades. However, the concept discussed in this paper for tight gas and shale gas reservoirs, supported by core data, has been developed only recently and permits differentiating between viscous and diffusion dominated flow. This is valuable, for example, in those cases where the formation to be developed is composed of alternating stacked layers of tight sands and shales, or where there are lateral variations due to facies changes. An approach to developing the concept of a super-giant shale gas reservoir is presented, as well as a description of GFREE, a successful research program for tight formations. The paper closes with examples of detailed original gas-in-place (OGIP) calculations for 3 North American shale gas reservoirs, including free gas in natural fractures and in the porous network within the organic matter, gas in the non-organic matter, adsorbed gas, and estimates of free gas within fractures created during hydraulic fracturing jobs. The examples show that the amount of free gas in shale reservoirs, as a percent of the total OGIP, is probably larger than previously considered in the literature.
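A minimal sketch of the Knudsen-number screen mentioned above; the regime boundaries are the commonly cited ones rather than thresholds taken from the paper, and the numerical example uses purely illustrative values.

```python
def knudsen_number(mean_free_path_nm: float, pore_throat_diameter_nm: float) -> float:
    """Kn = gas mean free path / characteristic pore-throat size."""
    return mean_free_path_nm / pore_throat_diameter_nm

def flow_regime(kn: float) -> str:
    """Commonly cited Knudsen-number boundaries (the paper's exact thresholds may differ)."""
    if kn < 0.001:
        return "continuum (viscous, Darcy) flow"
    if kn < 0.1:
        return "slip flow"
    if kn < 10.0:
        return "transition flow"
    return "free-molecular (diffusion-dominated) flow"

# Purely illustrative values, not data from the paper: a nano-scale pore throat of a few
# nanometres with a mean free path of similar order puts the flow well outside the
# viscous (Darcy) regime.
kn = knudsen_number(mean_free_path_nm=5.0, pore_throat_diameter_nm=2.0)
print(kn, flow_regime(kn))
```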