Funding: Supported by the Russian Science Foundation (Grant No. 22-11-00273).
Abstract: The paper presents the results of comprehensive studies of the filtration and capacitance properties of highly porous reservoir rocks from the aquifer of an underground gas storage facility. The geomechanical part of the research comprised studying the dependence of rock permeability on the stress-strain state in the vicinity of the wells, and physical modeling of a method for increasing the permeability of the near-wellbore zone, the method of directional unloading of the reservoir. The digital part of the research comprised computed tomography (CT)-based computer analysis of the internal structure, pore space characteristics, and filtration properties before and after the tests. The physical modeling of deformation and filtration processes showed that, before fracturing, rock permeability depends only weakly on the stress-strain state, and this influence is reversible. However, when the downhole pressure reaches 7-8 MPa, macrocracks begin to grow in the rock, accompanied by an irreversible increase in permeability. Porosity, geodesic tortuosity and permeability values were obtained from the digital studies and numerical modeling. A weak transverse anisotropy of the filtration properties of the rocks was detected. Analysis of the pore size distribution, pressure field and flow velocities showed high homogeneity and connectivity of the rock pore space. No pronounced changes in pore space characteristics or pore-scale permeability were observed after non-uniform triaxial loading of the rocks. A geometrical analysis of the pore space identified the reasons for the weak permeability anisotropy. The filtration-capacitance properties obtained from the digital analysis agreed very well with the results of field and laboratory measurements. The physical modeling confirmed the efficiency of the directional unloading method for the reservoir under study, and the parameters required for its application were calculated: bottomhole geometry, stage of operation, stresses and pressure drawdown.
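For readers unfamiliar with the quantities involved, the short Python sketch below shows how porosity is read from a segmented (binary) CT volume and how permeability follows from Darcy's law for a steady single-phase flow test. The synthetic volume and all numbers are hypothetical stand-ins, not the study's data or workflow.

```python
import numpy as np

def porosity(binary_volume: np.ndarray) -> float:
    """Fraction of pore voxels (True = pore) in a segmented CT volume."""
    return float(binary_volume.mean())

def darcy_permeability(q_m3_s: float, mu_pa_s: float, length_m: float,
                       area_m2: float, dp_pa: float) -> float:
    """Darcy's law: k = Q * mu * L / (A * dP), returned in m^2."""
    return q_m3_s * mu_pa_s * length_m / (area_m2 * dp_pa)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    volume = rng.random((100, 100, 100)) < 0.3   # stand-in for a segmented scan
    print(f"porosity = {porosity(volume):.3f}")

    # Hypothetical core-flooding numbers, not values from the paper.
    k = darcy_permeability(q_m3_s=1e-8, mu_pa_s=1e-3,
                           length_m=0.05, area_m2=np.pi * 0.015**2, dp_pa=2e4)
    print(f"permeability = {k:.3e} m^2 ({k / 9.869e-13 * 1000:.1f} mD)")
```

The same Darcy relation is what a digital (pore-scale) flow simulation ultimately reports once the pressure field and fluxes have been computed on the CT-derived pore geometry.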
Abstract: Due to their unique structural features, electrospun membranes have gained considerable attention for applications in which the quality of depth filtration is a dominant performance factor. To elucidate depth filtration phenomena, it is important to quantify the intrinsic structural properties independently of the dynamics of the transport media. Several methods have been proposed for the structural characterization of such membranes; however, they do not meet the requirements for quantifying the intrinsic structural properties relevant to depth filtration. This may be due to the complex interplay of transport-media dynamics and structural elements in the depth filtration process. In addition, the varied morphological architectures of electrospun membranes are an obstacle to precise quantification. This paper seeks to quantify the structural characteristics of electrospun membranes by introducing a robust image analysis technique and exploiting it to evaluate the permeation-filtration mechanism. To this end, a nanostructured fibrous network was simulated as an ideal membrane using adaptive local criteria in the image analysis. The reliability of the proposed approach was validated by measuring and comparing structural characteristics under different morphological conditions. The results agreed well with empirical observations of ideal membrane structures. This approach, combined with optimization of the electrospinning parameters, may pave the way for producing optimal membrane structures that boost the performance of electrospun membranes in end-use applications.
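As an illustration of the kind of adaptive local image analysis the abstract describes, the following Python sketch binarizes a fibrous-mat micrograph with a local threshold (scikit-image) and extracts two simple structural descriptors. The file name, parameter values and descriptors are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np
from skimage import filters, io, morphology

# 'sem_image.tif' is a hypothetical SEM micrograph of an electrospun mat.
image = io.imread("sem_image.tif", as_gray=True)

# Adaptive (local) threshold: the criterion varies across the image, which
# helps with the uneven illumination typical of SEM depth-of-field effects.
local_thresh = filters.threshold_local(image, block_size=51, method="gaussian")
fibers = image > local_thresh                      # True = fiber, False = pore
fibers = morphology.remove_small_objects(fibers, min_size=64)

surface_porosity = 1.0 - fibers.mean()             # pore area fraction
print(f"surface porosity = {surface_porosity:.3f}")

# Crude fiber-width proxy: fiber-phase area divided by skeleton length
# approximates the mean fiber width (assumes roughly non-overlapping fibers).
skeleton = morphology.skeletonize(fibers)
if skeleton.sum() > 0:
    mean_width_px = fibers.sum() / skeleton.sum()
    print(f"mean fiber width = {mean_width_px:.1f} px")
```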
Abstract: INTRODUCTION The hepatitis A virus specific immunoglobulin M (IgM) antibody is a specific serological marker for early diagnosis of hepatitis A. At present, the methods used at home and abroad for detecting anti-HAV IgM are RIA, ELISA and SPHIA. The dot immunogold combination assay, developed since 1989, is a new technique for simple and rapid immunological detection that uses red colloidal gold particles to label the antibodies as the indicator, and the
Abstract: AIM To detect hepatitis A virus-specific immunoglobulin M (IgM) antibody rapidly. METHODS Colloidal gold with an average diameter of 15 nm was prepared by controlled reduction of a boiling solution of 0.2 g/L chloroauric acid with 10 g/L sodium citrate and labeled with anti-HAV IgG as a gold probe. A dot immunogold filtration assay (DIGFA) was developed by coating anti-human μ chain on a nitrocellulose membrane (NCM) to capture the anti-HAV IgM in serum, then using cultured hepatitis A antigen as a 'bridge' connecting the anti-HAV IgM in the sample to the anti-HAV IgG-labeled colloidal gold. If anti-HAV IgM was present in the sample, the gold probes concentrated on the NCM, which appeared as a pink dot. RESULTS A total of 264 serum samples were tested in parallel with both DIGFA and ELISA in a blinded manner. Of these, 88 were positive and 146 were negative by both methods. The sensitivity and specificity of DIGFA were 86.27% and 90.12%, respectively. Fifteen negative and 15 positive serum samples were each tested three times; the results were identical. CONCLUSION DIGFA is a simple, rapid, sensitive, specific and reliable method that requires no expensive equipment and is not affected by rheumatoid factor (RF) in serum. It is suitable for basic medical laboratories and can be applied to the diagnosis and epidemiological surveillance of hepatitis A. It has broad application prospects. INTRODUCTION Hepatitis A virus-specific immunoglobulin M antibody (anti-HAV IgM) is the specific serological marker for the early diagnosis of acute hepatitis A. It can be detected by radioimmunoassay (RIA), enzyme-linked immunosorbent assay (ELISA), the solid phase hemagglutination inhibition test (SPHIA) and other methods. At present, the double sandwich ELISA is in widespread use[1]. However, it takes more time to complete and the procedure is complicated. The need for a simple, rapid, instrument-free test is evident in many basic units, where laboratory facilities and trained personnel are limited. In 1989, Chun developed such a rapid test, DIGFA[2]. It has been used to detect HCG, C-reactive protein, immunoglobulin G antibody and others[3,4]. We developed a DIGFA for the detection of anti-HAV IgM. The evaluation of this test is presented below.
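The sensitivity and specificity quoted above follow from a standard 2x2 comparison against ELISA as the reference; the sketch below shows the arithmetic. The discordant counts (14 and 16) are back-calculated from the reported totals and rates for illustration, not taken from the paper's raw table.

```python
# Sensitivity and specificity from a 2x2 comparison table.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

if __name__ == "__main__":
    # 88 concordant positives, 146 concordant negatives, 264 samples in total;
    # the 14 false negatives and 16 false positives are back-calculated.
    tp, fn, tn, fp = 88, 14, 146, 16
    print(f"sensitivity = {sensitivity(tp, fn):.2%}")   # 86.27%
    print(f"specificity = {specificity(tn, fp):.2%}")   # 90.12%
```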
Funding: Taif University Researchers Supporting Project number (TURSP-2020/98).
Abstract: The World Health Organization (WHO) describes dengue as a serious illness that affects almost half of the world's population and has no specific treatment. Early and accurate detection of its spread in affected regions can save precious lives. Despite the severity of the disease, few notable works use sentiment analysis to mine accurate insights from social media text streams. Moreover, the massive data explosion of recent years has made it difficult to store and process such large amounts of data, since reliable mechanisms to gather the data and suitable techniques to extract meaningful insights from it are required. This study proposes a sentiment analysis polarity approach for collecting data and extracting relevant information about dengue via Apache Hadoop. The method consists of two main parts: the first collects data from social media using Apache Flume, while the second queries and extracts relevant information via a hybrid filtration-polarity algorithm using Apache Hive. To cope with the noisy and unstructured nature of the data, the information extraction process includes pre- and post-filtration phases. Only by integrating Flume and Hive with filtration and polarity analysis can a reliable sentiment analysis technique be offered for collecting and processing large-scale data from social networks. We show how the Apache Hadoop ecosystem (Flume and Hive) can provide sentiment analysis capability by storing and processing large amounts of data. An important finding of this paper is that efficient sentiment analysis applications for disease detection can be made more reliable by using Hadoop ecosystem components than by relying on ordinary single machines.
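To make the filtration-plus-polarity idea concrete, here is a minimal Python sketch of a pre-filtration step followed by a naive lexicon-based polarity score. The keyword lists and scoring rule are hypothetical stand-ins, not the paper's hybrid filtration-polarity algorithm or its Flume/Hive implementation.

```python
import re

DENGUE_KEYWORDS = {"dengue", "fever", "aedes", "mosquito"}
POSITIVE_WORDS = {"recovered", "better", "safe", "cured"}
NEGATIVE_WORDS = {"outbreak", "died", "severe", "hospitalized", "sick"}

def pre_filter(text: str) -> str | None:
    """Strip URLs, mentions and '#' marks; keep only dengue-related posts."""
    cleaned = re.sub(r"(https?://\S+)|(@\w+)|#", " ", text.lower())
    tokens = set(re.findall(r"[a-z]+", cleaned))
    return " ".join(sorted(tokens)) if tokens & DENGUE_KEYWORDS else None

def polarity(text: str) -> str:
    """Naive lexicon polarity: positive, negative or neutral."""
    tokens = set(text.split())
    score = len(tokens & POSITIVE_WORDS) - len(tokens & NEGATIVE_WORDS)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

if __name__ == "__main__":
    posts = [
        "Dengue outbreak in our area, two neighbours hospitalized #dengue",
        "Finally recovered from dengue fever, feeling better!",
        "Great match last night!",   # filtered out: not dengue-related
    ]
    for post in posts:
        kept = pre_filter(post)
        if kept is not None:
            print(polarity(kept), "<-", post)
```

In the paper's setting the same two stages would run at scale, with Flume streaming the posts into HDFS and Hive queries applying the filtration and polarity logic.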
Abstract: An economy consists of the economic system of a country or other unit of human society. It includes labor, capital, natural resources, production, trade, distribution and consumption of goods and services in the area where the society is active. These factors give context and content and determine the conditions and parameters under which the economy operates. When data mining techniques are used to identify dimensionless groups (DGs) in the technical literature, one may encounter errors, faults or omissions concerning both the form and the content of such groups. In the present study, a methodological framework has been developed in the form of a logical flow chart, comprising 11 activity stages and seven decision nodes, to acquire, process, store and retrieve knowledge for the reconstruction and identification of these groups. Case-Based Reasoning (CBR), specially modified to meet the needs of this work, has been used to trace causality paths by similarity and to suggest corrections. Two case examples are presented to demonstrate the functionality of the proposed methodology. Dimensionless groups are used in engineering but can also be used in economic science. Through this analysis, the scale-up of industrial processes from laboratory to pilot and then to factory scale can be calculated. The study of dimensionless groups also makes it straightforward to quantify the economies of scale embedded in the production process. Synergy savings and target economies give rise to economies of scale in a production process and reduce the cost of production per unit of output as production increases. Dimensionless groups can serve as a quantitative and measurable indicator for calculating and predicting economies of scale in an industrial unit. The same applies to an economic unit that provides services, that is, intangible products.
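As a concrete engineering-side illustration of how a dimensionless group drives scale-up from laboratory to plant, the sketch below keeps the Reynolds number constant across two pipe sizes to fix the required flow velocity. The numbers are hypothetical and the example is not taken from the paper.

```python
# Dimensionless-group-based scale-up: hold Re = rho * v * D / mu constant.

def reynolds(rho: float, velocity: float, diameter: float, mu: float) -> float:
    """Re = rho * v * D / mu (SI units, dimensionless result)."""
    return rho * velocity * diameter / mu

def scaled_velocity(re_target: float, rho: float, diameter: float, mu: float) -> float:
    """Velocity needed to reach a target Re at a new characteristic length."""
    return re_target * mu / (rho * diameter)

if __name__ == "__main__":
    rho, mu = 1000.0, 1e-3                                        # water, ~20 degC
    re_lab = reynolds(rho, velocity=1.0, diameter=0.02, mu=mu)    # lab rig, 2 cm pipe
    v_plant = scaled_velocity(re_lab, rho, diameter=0.20, mu=mu)  # plant, 20 cm pipe
    print(f"Re (lab)  = {re_lab:,.0f}")
    print(f"v (plant) = {v_plant:.2f} m/s to keep the same Re")
```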
Abstract: Because of the correlation between images, the performance of standard ICA is unsatisfactory for the blind source separation (BSS) of images. A sub-band ICA method with a selection criterion is therefore proposed for this problem. First, the sub-bands of the new method are formed from wavelet packet (WP) coefficients. Second, the selection criterion combines mutual information (MI), kurtosis and sparsity. A single sub-band, or a group of sub-bands, selected by the new method is a more suitable input for the ICA algorithm than the mixed images themselves. The new method has been applied successfully to the BSS of partially dependent and highly dependent images. The separation experiments show that the new method is more accurate and robust.
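A minimal sketch of the sub-band idea is given below, using kurtosis alone as the selection criterion (the paper's criterion also combines MI and sparsity): the mixtures are decomposed with 2-D wavelet packets, the most non-Gaussian sub-band is selected, an unmixing matrix is learned on it with FastICA and then applied to the full-resolution mixtures. Function and parameter names are illustrative assumptions.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def subband_ica(mixtures: np.ndarray, wavelet: str = "db2", level: int = 2) -> np.ndarray:
    """mixtures: (n_mixtures, H, W) -> estimated sources of the same shape."""
    n, h, w = mixtures.shape

    # Wavelet-packet sub-bands of each mixture, indexed by node path.
    packets = [pywt.WaveletPacket2D(m, wavelet=wavelet, maxlevel=level) for m in mixtures]
    paths = [node.path for node in packets[0].get_level(level)]

    # Pick the sub-band whose coefficients are, on average, the most non-Gaussian.
    def mean_kurt(path: str) -> float:
        return float(np.mean([kurtosis(p[path].data.ravel()) for p in packets]))
    best = max(paths, key=mean_kurt)

    # Learn the unmixing matrix on the selected sub-band only.
    X_sub = np.stack([p[best].data.ravel() for p in packets], axis=1)
    ica = FastICA(n_components=n, random_state=0)
    ica.fit(X_sub)
    W = ica.components_                        # (n, n) unmixing matrix

    # Apply the learned unmixing to the full-resolution mixtures.
    X_full = mixtures.reshape(n, -1).T         # (pixels, n_mixtures)
    S = (X_full - X_full.mean(axis=0)) @ W.T
    return S.T.reshape(n, h, w)
```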
Abstract: In this paper, we present a novel and efficient scheme for detecting the P300 component of the event-related potential in the Brain Computer Interface (BCI) speller paradigm that requires significantly fewer EEG channels and uses a minimal subset of effective features. Removing unnecessary channels and reducing the feature dimension lower the cost and shorten the processing time, and thus improve the BCI implementation. The idea is to employ a suitable method to optimize the number of channels and feature vectors while maintaining high classification accuracy. Optimal channel selection was based on both discriminative criteria and forward-backward investigation. In addition, we obtained a minimal subset of effective features by choosing the discriminant coefficients of the wavelet decomposition. Our algorithm was tested on dataset II of the 2005 BCI competition. We achieved 92% accuracy using a simple LDA classifier, compared with the second-best result in BCI 2005, 90.5%, obtained with an SVM classifier that required more computation, and with the highest accuracy in BCI 2005, 96.5%, which used an SVM and many more channels and required excessive calculation. We also applied the proposed scheme to Hoffmann's dataset to evaluate the effectiveness of channel reduction and achieved acceptable results.
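The following Python sketch shows the general shape of such a pipeline: wavelet-decomposition coefficients of a few selected channels serve as features, classified with a plain LDA. The synthetic data, channel count and feature choice are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def epoch_features(epoch: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """epoch: (n_channels, n_samples) -> concatenated approximation coefficients."""
    feats = []
    for channel in epoch:
        coeffs = pywt.wavedec(channel, wavelet, level=level)
        feats.append(coeffs[0])          # keep the coarse (approximation) band
    return np.concatenate(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in data: 200 epochs, 4 selected channels, 240 samples each.
    epochs = rng.standard_normal((200, 4, 240))
    labels = rng.integers(0, 2, size=200)            # 1 = target (P300 expected)
    epochs[labels == 1, :, 100:140] += 0.5           # crude simulated P300 bump

    X = np.array([epoch_features(e) for e in epochs])
    clf = LinearDiscriminantAnalysis().fit(X[:150], labels[:150])
    print(f"hold-out accuracy = {clf.score(X[150:], labels[150:]):.2f}")
```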
Abstract: OBJECTIVE To evaluate, by systematic analysis, the clinical efficacy of double filtration plasmapheresis (DFPP) in the treatment of hyperlipidemic acute pancreatitis (HLAP). METHODS Controlled clinical trials (CCTs) and randomized controlled trials (RCTs) of DFPP for HLAP, published from database inception to September 2022, were retrieved from Chinese- and English-language databases including PubMed, EMbase, the Cochrane Library, Web of Science, Wanfang Data, the VIP Chinese Science and Technology Journal Database and CNKI. The control group received drug therapy, while the DFPP group received DFPP in addition to drug therapy. The primary outcomes were length of hospital stay, length of intensive care unit (ICU) stay and overall mortality; the secondary outcomes were triglycerides (TG), serum amylase and C-reactive protein (CRP). Data were collected by two researchers, the quality of the literature was assessed according to the Cochrane Handbook 5.1, and the meta-analysis was performed with RevMan 5.3; the stability of the meta-analysis results was examined by sensitivity analysis, and publication bias was assessed with funnel plots. RESULTS Sixteen Chinese- and English-language studies involving 835 patients were finally included, 450 in the DFPP group and 385 in the control group. The meta-analysis showed that, compared with the control group, the DFPP group had a shorter hospital stay [mean difference (MD) = -5.28, 95% confidence interval (95%CI) -7.14 to -3.15, P < 0.00001] and a shorter ICU stay (MD = -3.90, 95%CI -5.71 to -2.05, P < 0.0001), with significantly lower TG (MD = -10.75, 95%CI -15.23 to -6.27, P < 0.00001), serum amylase (MD = -219.01, 95%CI -320.05 to -117.96, P < 0.0001) and CRP (MD = -34.84, 95%CI -59.11 to -10.57, P = 0.005). However, the difference in overall mortality between the two groups was not statistically significant [relative risk (RR) = 0.77, 95%CI 0.20 to 3.03, P = 0.71]. Funnel-plot analysis of TG, the only outcome reported in more than 10 studies, indicated that the included literature may carry some publication bias. CONCLUSION Adding DFPP to drug therapy improves the clinical effect of HLAP treatment and improves prognosis.
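For readers unfamiliar with the pooled mean differences (MD) and 95% confidence intervals reported above, the sketch below shows fixed-effect inverse-variance pooling of study-level mean differences. The per-study numbers are hypothetical and are not the trials included in this review, which was analyzed with RevMan 5.3.

```python
import numpy as np

def pool_fixed_effect(md: np.ndarray, ci_low: np.ndarray, ci_high: np.ndarray):
    """Inverse-variance pooling of mean differences given their 95% CIs."""
    se = (ci_high - ci_low) / (2 * 1.96)       # standard error from the CI width
    w = 1.0 / se**2                            # inverse-variance weights
    pooled = np.sum(w * md) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

if __name__ == "__main__":
    # Three hypothetical studies of hospital stay (days), DFPP minus control.
    md = np.array([-4.0, -6.0, -5.5])
    lo = np.array([-7.0, -9.5, -8.0])
    hi = np.array([-1.0, -2.5, -3.0])
    est, l, h = pool_fixed_effect(md, lo, hi)
    print(f"pooled MD = {est:.2f} (95% CI {l:.2f} to {h:.2f})")
```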