The paper identifies chaos, turbulence and fractals of quantum and macro gravity and studies their behavior, properties and applications based on the grand unified theory (GUT) and qualitative mathematics and modeling. Applications include devising electromagnetic engines, a tornado aborter and terminator, and technologies for electromagnetic treatment of genetic diseases such as cancer, systemic lupus erythematosus, diabetes and mental disorders without harm to normal cells and without side effects. A typhoon in the Western Pacific, which is turbulence, is impossible to terminate and impractical to deflect, but its prediction can be improved because it follows the Northern Pacific Wind Cycle (the Southern Pacific Wind Cycle in the Southern Hemisphere) and is affected by the temperature variation over the Philippine Deep and around Mayon Volcano. The electromagnetic engine uses clean, inexhaustible, free dark matter, specifically the energy of magnetic flux, in place of conventional fuel, e.g., fossil, nuclear and geothermal. The tornado aborter and terminator utilize the gravitational flux of the Earth, its vortex flux of superstrings, as a cosmological vortex which is turbulence. The technologies for electromagnetic treatment of genetic diseases utilize electromagnetic waves based on resonance. All of them are GUT technologies because they are applications of GUT. Except for the magnetic train, which is in operation, the rest are still at the conceptual and research-and-development (R&D) phase, but the theory is complete and the strategies for R&D are laid down in detail in the cited original papers.
Abstract: To determine the prevalence of metabolic syndrome (MetS) in Malaysian type 2 diabetic patients using the WHO, NCEP ATP III, IDF and the new Harmonized definitions, and the concordance between these definitions. This study involved 313 patients diagnosed with type 2 diabetes mellitus (T2DM) at two Malaysian tertiary hospitals. Socio-demographic data were assessed using a pre-tested interviewer-administered structured questionnaire. Anthropometric measurements were carried out according to standard protocols. Clinical and laboratory characteristics were examined. Kappa (k) statistics were used to assess the agreement between the four MetS definitions. The overall prevalence rates of MetS (95% CI) were 95.8% (93.6-98.1), 96.1% (94.0-98.3), 84.8% (80.8-88.9) and 97.7% (96.1-99.4) according to the WHO, NCEP ATP III, IDF and the Harmonized definitions, respectively. The Kappa statistics demonstrated slight to substantial agreement between the definitions (k = 0.179-0.875); the strongest pairwise agreement was k = 0.875, with the highest specificity (100%) in identifying MetS. In conclusion, the new Harmonized criteria established the highest prevalence of MetS among the four definitions applied. There was a very good concordance between the WHO and NCEP ATP III criteria. The extremely high prevalence of MetS observed in type 2 diabetic patients indicates an impending pandemic of CVD risk in Malaysia. Aggressive treatment of MetS components is required to reduce cardiovascular risk in T2DM.
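For readers unfamiliar with the agreement statistic used in the abstract above, Cohen's kappa for two paired binary classifications can be computed as in the sketch below (the data are made-up toy values, not the study's):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired binary ratings (0/1 lists)."""
    n = len(a)
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    p_a1 = sum(a) / n  # proportion classified positive by definition A
    p_b1 = sum(b) / n  # proportion classified positive by definition B
    # chance agreement: both positive or both negative by chance
    p_chance = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (p_observed - p_chance) / (1 - p_chance)

# toy example: two definitions agreeing on 9 of 10 patients
who  = [1, 1, 1, 1, 1, 0, 0, 0, 1, 1]
ncep = [1, 1, 1, 1, 1, 0, 0, 1, 1, 1]
print(round(cohens_kappa(who, ncep), 3))  # 0.737
```

Values near 1 indicate near-perfect concordance between two MetS definitions; values near 0 indicate agreement no better than chance.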
Abstract: In Advances in Pure Mathematics (www.scirp.org/journal/apm), Vol. 1, No. 4 (July 2011), pp. 136-154, the mathematical structure of the much discussed problem of probability known as the Monty Hall problem was mapped in detail. It is styled here as Monty Hall 1.0. The proposed analysis was then generalized to related cases involving any number of doors (d), cars (c), and opened doors (o) (Monty Hall 2.0) and 1 specific case involving more than 1 picked door (p) (Monty Hall 3.0). In cognitive terms, this analysis was interpreted in function of the presumed digital nature of rational thought and language. In the present paper, Monty Hall 1.0 and 2.0 are briefly reviewed (§§2-3). Additional generalizations of the problem are then presented in §§4-7. They concern expansions of the problem to the following items: (1) to any number of picked doors, with p denoting the number of doors initially picked and q the number of doors picked when switching doors after doors have been opened to reveal goats (Monty Hall 3.0; see §4); (3) to the precise conditions under which one's chances increase or decrease in instances of Monty Hall 3.0 (Monty Hall 3.2; see §6); and (4) to any number of switches of doors (s) (Monty Hall 4.0; see §7). The aforementioned article in APM, Vol. 1, No. 4 may serve as a useful introduction to the analysis of the higher variations of the Monty Hall problem offered in the present article. The body of the article is by Leo Depuydt. An appendix by Richard D. Gill (see §8) provides additional context by building a bridge to modern probability theory in its conventional notation and by pointing to the benefits of certain interesting and relevant tools of computation now available on the Internet. The cognitive component of the earlier investigation is extended in §9 by reflections on the foundations of mathematics.
It will be proposed, in the footsteps of George Boole, that the phenomenon of mathematics needs to be defined in empirical terms as something that happens to the brain or something that the brain does. It is generally assumed that mathematics is a property of nature or reality or whatever one may call it. There is not the slightest intention in this paper to falsify this assumption because it cannot be falsified, just as it cannot be empirically or positively proven. But there is no way that this assumption can be a factual observation. It can be no more than an altogether reasonable, yet fully secondary, inference derived mainly from the fact that mathematics appears to work, even if some may deem the fact of this match to constitute proof. On the deepest empirical level, mathematics can only be directly observed and therefore directly analyzed as an activity of the brain. The study of mathematics therefore becomes an essential part of the study of cognition and human intelligence. The reflections on mathematics as a phenomenon offered in the present article will serve as a prelude to planned articles on how to redefine the foundations of probability as one type of mathematics in a cognitive fashion and on how exactly Boole's theory of probability subsumes, supersedes, and completes classical probability theory. §§2-7 combined, on the one hand, and §9, on the other hand, are both self-sufficient units and can be read independently of one another. The ultimate design of the larger project of which this paper is part remains the increase of digitalization of the analysis of rational thought and language, that is, of (rational, not emotional) human intelligence. To reach out to other disciplines, an effort is made to describe the mathematics more explicitly than is usual.
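The switching advantage in the generalized game (Monty Hall 2.0, with d doors, c cars and o goat-doors opened by a host who knows where the cars are) can be checked numerically. Under the usual assumptions, elementary conditioning gives P(win by staying) = c/d and P(win by switching to one random remaining door) = c(d-1)/(d(d-1-o)); the simulation sketch below is an illustrative check, not the article's own analysis:

```python
import random

def monty_hall(d=3, c=1, o=1, trials=100_000, seed=0):
    """Estimate P(win) for stay vs. switch with d doors, c cars,
    and o goat-doors opened by a host who knows where the cars are."""
    rng = random.Random(seed)
    stay = switch = 0
    for _ in range(trials):
        cars = set(rng.sample(range(d), c))
        pick = rng.randrange(d)
        # host opens o goat doors, never the picked door
        goats = [i for i in range(d) if i != pick and i not in cars]
        opened = set(rng.sample(goats, o))
        stay += pick in cars
        # switch to a uniformly random unopened, unpicked door
        new = rng.choice([i for i in range(d)
                          if i != pick and i not in opened])
        switch += new in cars
    return stay / trials, switch / trials

p_stay, p_switch = monty_hall()
print(p_stay, p_switch)  # close to 1/3 and 2/3 for the classic game
```

Running with, e.g., d=5, c=2, o=1 lets one compare the estimate against the closed-form c(d-1)/(d(d-1-o)) for the larger variants discussed in §§2-3.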
Abstract: Partial formalization, which involves the development of deductive connections among statements, can be used to examine assumptions, definitions and related methodologies that are used in science. This approach has been applied to the study of nucleic acids recovered from natural microbial assemblages (NMA) by the use of bulk extraction. Six pools of bulk-extractable nucleic acids (BENA) are suggested to be present in a NMA: (pool 1) inactive microbes (abiotic-limited); (pool 2) inactive microbes (abiotic permissive, biotic-limited); (pool 3) dormant microbes (abiotic permissive, biotic-limited, but can become biotic permissive); (pool 4) in situ active microbes (the microbial community); (pool 5) viruses (virocells/virions/cryptic viral genomes); and (pool 6) extracellular nucleic acids including extracellular DNA (eDNA). Definitions for cells, the microbial community (in situ active cells), the rare biosphere, dormant cells (the microbial seed bank), viruses (virocells/virions/cryptic viral genomes), and diversity are presented, together with methodology suggested to allow their study. The word diversity will require at least 4 definitions, each involving a different methodology. These suggested definitions and methodologies should make it possible to make further advances in bulk extraction-based molecular microbial ecology.
Abstract: The microstructured polymer optical fibre (mPOF) inscribed long period grating (LPG) offers a wide field of application in the strain-sensor arena within the material's elastic limit. A flexible, innovative macro fibre composite (MFC) actuator generates electromechanical force under a DC driving voltage. We propose a novel method for blue-shifting the Bragg wavelength through stretch tuning of the mPOF LPG in the axial direction under a DC voltage applied to an MFC attached to the LPG. The induced change in the grating period of the mPOF LPG alters the effective refractive index and causes a blue shift of the grating wavelength. The shift is governed by the electromechanically generated strain transferred from the flexible MFC to the mPOF LPG, and such devices have potential applications in strain sensing.
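As a rough first-order illustration of strain tuning (not the paper's model): the shift of a fibre-grating resonance under axial strain ε is commonly approximated as Δλ ≈ λ0·(1 − p_e)·ε, so a negative (compressive) strain gives a blue shift. The effective coefficient k_eff = 1 − p_e below is an assumed, illustrative value, not a measured property of mPOF:

```python
def wavelength_shift(lam0_nm, strain, k_eff=0.8):
    """First-order resonance shift of a fibre grating under axial strain.
    lam0_nm: unstrained resonance wavelength (nm)
    strain:  axial strain (dimensionless; negative = compression)
    k_eff:   effective strain sensitivity (1 - p_e); assumed value
    """
    return lam0_nm * k_eff * strain

# e.g. -500 microstrain blue-shifts a 1550 nm resonance by ~0.62 nm
shift = wavelength_shift(1550.0, -500e-6)
print(round(shift, 3))  # -0.62
```

The sign convention makes the blue shift described in the abstract appear directly as a negative Δλ.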
Funding: Supported by the Guangdong Esophageal Cancer Institute Science and Technology Program, No. M202013, and the Guangdong Medical Research Foundation, No. A2021369.
Abstract: BACKGROUND: Endoscopic submucosal dissection (ESD) and surgical resection are the standard of care for cT1N0M0 esophageal cancer (EC), whereas definitive chemoradiotherapy (d-CRT) is a treatment option. Nevertheless, the comparative efficiency and safety of ESD, surgery and d-CRT for cT1N0M0 EC remain unclear. AIM: To compare the efficiency and safety of ESD, surgery and d-CRT for cT1N0M0 EC. METHODS: We retrospectively analyzed the hospitalization data of a total of 472 consecutive patients with cT1N0M0 EC treated at Sun Yat-sen University Cancer Center between 2017 and 2019 and followed up until October 30th, 2022. We analyzed demographic, medical-record, histopathologic, imaging, endoscopic and follow-up data. The Kaplan-Meier method and Cox proportional hazards modeling were used to analyze differences in survival outcome by treatment. Inverse probability of treatment weighting (IPTW) was used to minimize potential confounding factors. RESULTS: We retrospectively analyzed patients who underwent ESD (n = 99), surgery (n = 220) or d-CRT (n = 16) at the Sun Yat-sen University Cancer Center from 2017 to 2019. The median follow-up time for the ESD, surgery and d-CRT groups was 42.0 mo (95%CI: 35.0-60.2), 45.0 mo (95%CI: 34.0-61.75) and 32.5 mo (95%CI: 28.3-40.0), respectively. After adjusting for background factors using IPTW, the highest 3-year overall survival (OS) and 3-year recurrence-free survival (RFS) rates were observed in the ESD group (3-year OS: 99.7%, 94.7% and 79.1%; 3-year RFS: 98.3%, 87.4% and 79.1% in the ESD, surgery and d-CRT groups, respectively). There was no difference in the occurrence of severe complications between the three groups (P ≥ 0.05). Multivariate analysis showed that treatment method, histology and depth of infiltration were independently associated with OS and RFS. CONCLUSION: For cT1N0M0 EC, ESD had better long-term survival and lower hospitalization costs than d-CRT and surgery, with a similar rate of severe complications.
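The IPTW adjustment mentioned in METHODS reweights each patient by the inverse of the estimated probability of the treatment they actually received, balancing baseline covariates across groups. A minimal sketch for the binary-treatment case (the study compares three groups; the propensity scores here are made-up inputs that would in practice come from, e.g., a logistic regression on baseline covariates):

```python
def iptw_weights(treated, propensity):
    """Inverse probability of treatment weights for a binary treatment.
    treated:    list of 0/1 treatment indicators
    propensity: estimated P(treated = 1 | covariates) per patient
    """
    return [1 / p if t == 1 else 1 / (1 - p)
            for t, p in zip(treated, propensity)]

# toy example: two treated and two control patients
w = iptw_weights([1, 1, 0, 0], [0.8, 0.5, 0.5, 0.2])
print([round(x, 2) for x in w])  # [1.25, 2.0, 2.0, 1.25]
```

A treated patient who was unlikely to be treated (low propensity) gets a large weight, so the weighted sample mimics a randomized comparison; weighted Kaplan-Meier and Cox models are then fit on these weights.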
Abstract: We exploit the theory of reproducing kernels to deduce a matrix inequality for the inverse of the restriction of a positive definite Hermitian matrix.
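One classical inequality of this kind, offered here only as a plausible reading of the abstract (a Schur-complement fact, not necessarily the paper's exact result): for a positive definite Hermitian matrix A and an index set S, the restriction of A^(-1) to S dominates the inverse of the restriction A_SS in the Loewner order. A quick numerical check on a random real symmetric positive definite matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, S = 5, [0, 2, 3]                  # restrict to index set S
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)          # positive definite (real symmetric)

lhs = np.linalg.inv(A)[np.ix_(S, S)]   # restriction of the inverse
rhs = np.linalg.inv(A[np.ix_(S, S)])   # inverse of the restriction
eigs = np.linalg.eigvalsh(lhs - rhs)
print(np.all(eigs >= -1e-12))          # True: lhs - rhs is PSD
```

The inequality follows because (A^(-1))_SS equals the inverse of the Schur complement A_SS − A_SC A_CC^(-1) A_CS, which is dominated by A_SS, and matrix inversion reverses the Loewner order on positive definite matrices.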
Abstract: The grand unified theory (GUT) originated in mathematics with this question: why are there long-standing unsolved problems in mathematics, e.g., Fermat's conjecture (also known as Fermat's last theorem (FLT))? The answer came quickly: its underlying fields, foundations and the real number system, are defective. In particular, formal logic is inapplicable to mathematics (the language of science) and the real number system is inconsistent. Critique-rectification of these fields was undertaken, leading to a new mathematical methodology and the consistent new real number system that provides the main mathematics of GUT. A similar question was posed in physics: why are there long-standing problems, e.g., the gravitational n-body and turbulence problems? The answer: the present methodology, quantitative modeling, is inadequate, and the remedy is a new methodology, qualitative mathematics and modeling, that solved these problems and provided the initial formulation of GUT. This paper presents the basic logic of GUT and its fundamental concepts, particularly the superstring or fundamental building block of matter.
Abstract: Under moderate combinations of spatial and temporal scales, a crack can creep a few nanometers over several months or extend 10 km in a few seconds. Although the crack tip has no actual mass, it can reach a high-energy state by activating the surrounding material. Depending on the direction of material damage, decreases and increases of the activated mass can occur before or after a scale transition. The segmentation threshold of each scale range is assumed to be related to the product of the square of the crack-tip velocity a^2 and the activated mass density M: W = M_(↓↑)a_(↑↓)^2 and D = M^(↓↑)a_(↑↓)^2. W and D are called the direct-absorption and self-dissipation energy densities, respectively. As the subscript/superscript notation indicates, the activated mass densities M_(↓↑) and M^(↓↑) vary oppositely to the crack-tip velocity a, and may either increase or decrease. The complementary effects of a^2 and M imply the physical processes of expansion and/or contraction commonly used in cosmological modeling. The activated mass density has the same interpretation when applied to scale-sensitive crack-tip behavior. The multiscale segmentation may consist of ... pico, nano, micro and macro ... scales. Thus, figuratively speaking, the material damage process can be modeled by the non-uniform global and local energy transfer during crack growth. Material damage by fatigue crack growth is used to illustrate the large-to-small and slow-to-fast scale/time order, and the cold→hot and order→disorder transitions of thermodynamics. This process is exactly opposite to the arrow of cosmic evolution, which follows small→large and fast→slow, whereas thermodynamics, conversely, follows hot→cold and disorder→order. To represent the non-uniformity of crack-like defects caused by damage initiation, a new model called crack tip mechanics (CTM) is proposed. Its scope involves modeling interface cracks between rows of atoms or bifurcated notches in a continuum. If needed, the size and time ranges can cover pico to macro scales and beyond. Although fatigue cracks are used to illustrate the basic principles of CTM, the expansion and contraction associated with direct absorption and self-dissipation in the cosmological context can describe the behavior of the activated mass around the crack, which can be regarded as energy sinks or sources. Singularities are used to capture the source/sink character of the energy; physically both are part of an interface, while mathematically they are part of a line of discontinuity. The change of energy from one form to another depends on the damage time arrow of energy absorption or dissipation, which involves the combined application of scale segmentation and singularity strength. The time degradation of the material constituents is derived from the specified design life, so that the material response matches the time history of the loading rate. A pico/nano/micro/macro cracking model of a 2024-T3 aluminum panel is used to show where the life of a structural component can be increased. The time degradation of a pico/nano/micro/macro/structural system can be described by nine scale-transition physical parameters: three for the nano/micro range (μ*_(na/mi), σ*_(na/mi), d*_(na/mi)), three for the micro/macro range (μ*_(mi/ma), σ*_(mi/ma), d*_(mi/ma)), and three for the pico/nano range (μ*_(pi/na), σ*_(pi/na), d*_(pi/na)). The subscripts pi, na, mi, ma and struc denote pico, nano, micro, macro and structural, respectively. Once the scale-sensitive parameters of two connected scales are known, the time-dependent local physical parameters at the lower scale complete the analytical-continuum formalism, even though they need not be known experimentally. More specifically, with λ singularity strengths of 1.25/1.00/0.75/0.50 for pico→nano→micro→macro, respectively, the transition characteristics of pico, nano, micro and macro cracks are determined from the specified life expectancy of the time arrow. An additional singularity of strength 0.25 can be used for structural elements. In retrospect, λ = 0.5 corresponds to the stress components in fracture mechanics varying inversely with r^(0.5), where r is the distance from the macrocrack tip. Micro, nano and pico cracks are assigned singularities of r^(-0.75), r^(-1.0) and r^(-1.25), respectively. The arrow time (in years) depends on the definition of the problem. A critical component of equipment can be designed to operate with a 1.5±/2.5±/3.5±/5.5± life distribution over the pico/nano/micro/macro scales and a total life of 13± years (a), where the superscript ± indicates more or less than the actual operating time. Progressive damage is assumed to occur in the pico→nano→micro→macro direction. The same scheme is applied to the fatigue damage of a 2024-T3 aluminum panel with a total life of 20 years, distributing its life over the pico, nano, micro, macro and structural scales as 1.5±/2.5±/3.5±/5.5±/7.0±. Such an assignment only satisfies the matching of the energy used to damage the internal material structure within each scale range, and thus enforces an accurate time-dependent material-degradation process over the span of the total life.
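The scale-specific stress singularities quoted above (near-tip stress varying as r^(-λ), with λ = 1.25/1.00/0.75/0.50 at the pico/nano/micro/macro scales) can be illustrated numerically. The amplitude K below is an arbitrary placeholder, not a stress intensity factor from the paper:

```python
# singularity strengths per scale, as given in the abstract
LAMBDA = {"pico": 1.25, "nano": 1.00, "micro": 0.75, "macro": 0.50}

def stress_scaling(r, scale, K=1.0):
    """Near-tip stress ~ K * r**(-lambda) for the given scale;
    r is the distance from the crack tip (same units at each scale)."""
    return K * r ** (-LAMBDA[scale])

# approaching the tip (r: 0.1 -> 0.01) amplifies stress more at finer scales
for s in ("macro", "micro", "nano", "pico"):
    ratio = stress_scaling(0.01, s) / stress_scaling(0.1, s)
    print(s, round(ratio, 2))
```

A tenfold decrease in r multiplies the stress by 10^λ, so the pico-scale field (λ = 1.25) intensifies far faster than the classical macro-scale inverse-square-root field (λ = 0.5).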
Funding: Supported by the National Natural Science Foundation of China (No. 41176064) and the NSFC-Shandong Joint Fund for Marine Science Research Centers (No. U1406403).
Abstract: In order to examine the seasonal and spatial distributions of benthic animals in the intertidal mudflat of the southern Yellow River Delta, field investigations were carried out in 2007 and 2008 and multiple methods were applied. Results showed that the biomass of macrobenthos ranged from 0.75 to 1151.00 g wet m^(-2) and averaged 156.31 g wet m^(-2), of which Mactra veneriformis accounted for 75.6%-93.4% of the total macrobenthic biomass. More than 90% of macrobenthos inhabited the zone between the middle and low tide lines, and biomass was higher in early summer and lower in winter. Statistical analysis showed that: 1) M. veneriformis growth was primarily favored at higher temperature and lower salinity; 2) after long-term interaction, benthic bivalve grazers led to patchy distributions of chlorophyll a (Chl a); 3) macrobenthic biomass was positively related to Chl a when the Chl a concentration was low, but negatively related when it was high; and 4) the biomass of benthic bivalves peaked in sediment with a median grain size of about 0.55 mm, but decreased gradually in coarser or finer sediments. Secondary productivity ranged from 0.37 to 283.68 g m^(-2) yr^(-1) and averaged 47.88 g m^(-2) yr^(-1), of which 69.7% was contributed by M. veneriformis. It was estimated that primary production was transformed to secondary production at a rate of approximately 6.87%, which implies local sustainability of high bivalve production.
Funding: Supported in part by the National Institute on Minority Health and Health Disparities of the National Institutes of Health, Award No. G12MD007597
Abstract: AIM: To study the accuracy of a high definition (HD) colonoscope with narrow band imaging (NBI) vs a standard white light colonoscope without NBI (ST) in predicting the histology of colon polyps, particularly those < 1 cm. METHODS: A total of 147 African American patients referred to Howard University Hospital for screening, diagnostic or follow-up colonoscopy during a 12-mo period in 2012 were prospectively recruited. Some patients had multiple polyps, giving a total of 179 polyps. The colonoscopies were performed by 3 experienced endoscopists, who determined the size of each polyp being removed and stated whether it was hyperplastic or adenomatous, using either standard colonoscopes or high definition colonoscopes with NBI. The histopathologic diagnosis was reported by pathologists as part of routine care. RESULTS: Of the participants, 55 (37%) were male and the median (interquartile range) age was 56 (19-80). Demographic characteristics, clinical characteristics and past medical history did not differ significantly between the two instrument groups, and the two methods detected similar numbers of polyps. With the ST scope, 89% of polyps were < 1 cm vs 87% with the HD scope (P = 0.7). The ST scope had a positive predictive value (PPV) and positive likelihood ratio (PLR) of 86% and 4.0 for adenoma, compared with 74% and 2.6 for the HD scope. There was a trend toward higher sensitivity for the HD scope (68%) compared with the ST scope (53%), with almost the same specificity. The ST scope had a PPV and PLR of 38% and 1.8 for hyperplastic polyps (HPP), compared with 42% and 2.2 for the HD scope. The sensitivity and specificity of the two instruments for HPP diagnosis were similar. CONCLUSION: Our results indicated that the HD scope was more sensitive in diagnosing adenoma than the ST scope. Clinical diagnosis of HPP with either scope is less accurate than that of adenoma, and colonoscopic diagnosis does not yet fully match the pathologic diagnosis of colon polyps.
However, with advances in both imaging and training, it may become possible to increase the sensitivity and specificity of the scopes, eliminating the time and cost of immunohistochemistry/pathology and thereby saving money.
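The accuracy measures quoted in this abstract (sensitivity, specificity, PPV, PLR) all derive from a standard 2x2 confusion matrix; a minimal sketch with hypothetical counts (the numbers below are illustrative, not the study's raw data):

```python
# PPV and positive likelihood ratio from a hypothetical 2x2 confusion matrix.
tp, fn = 53, 47   # adenomas correctly called / missed (hypothetical counts)
fp, tn = 13, 87   # non-adenomas wrongly called adenoma / correctly ruled out

sensitivity = tp / (tp + fn)           # fraction of adenomas detected
specificity = tn / (tn + fp)           # fraction of non-adenomas ruled out
ppv = tp / (tp + fp)                   # positive predictive value
plr = sensitivity / (1 - specificity)  # positive likelihood ratio

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"PPV={ppv:.2f} PLR={plr:.1f}")
```

Note that, unlike sensitivity and PLR, the PPV depends on the prevalence of adenoma among the removed polyps, which is one reason the two scopes can rank differently on PPV than on sensitivity.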
Abstract: The paper identifies chaos, turbulence and fractals of quantum and macro gravity and studies their behavior, properties and applications based on the grand unified theory (GUT) and qualitative mathematics and modeling. Applications include devising electromagnetic engines, a tornado aborter and terminator, and technologies for electromagnetic treatment of genetic diseases such as cancer, systemic lupus erythematosus, diabetes and mental disorders, without harm to normal cells and without side effects. A typhoon in the Western Pacific, being turbulence, is impossible to terminate and impractical to deflect, but its prediction can be improved because it follows the Northern Pacific Wind Cycle (the Southern Pacific Wind Cycle in the Southern Hemisphere) and is affected by the temperature variation over the Philippine Deep and around Mayon Volcano. The electromagnetic engine uses clean, inexhaustible, free dark matter (specifically, the energy of magnetic flux) in place of conventional fuel, e.g., fossil, nuclear and geothermal. The tornado aborter and terminator utilize the gravitational flux of the Earth, its vortex flux of superstrings, as a cosmological vortex, which is turbulence. The technologies for electromagnetic treatment of genetic diseases use electromagnetic waves based on resonance. All of them are GUT technologies because they are applications of GUT. Except for the magnetic train, which is in operation, the rest remain at the conceptual and research and development (R&D) phase, but the theory is complete and the strategies for R&D are laid down in detail in the cited original papers.