In this study, ozone gas was applied to samples of durum wheat stored in four experimental groups (durum wheat without any treatment for comparison, durum wheat treated with ozone, purified durum wheat, and purified durum wheat treated with ozone). Two groups were treated with ozone gas at a concentration of 3 ppm for 1 hour. The groups were then placed in air-tight glass jars and stored for 6 months at temperatures varying between 24.7°C and 34.8°C. Microbiological properties (total bacterial count, yeasts/molds, and coliforms) and physical properties (moisture, color, and ash) were evaluated. Ozone application caused a statistically significant reduction in the numbers of bacteria, yeasts, molds, and coliforms. Ozone application, the washing process, and storage temperature were the major factors affecting microbial counts. No significant differences were found in the moisture and ash contents of the samples after ozone treatment. Color measurements showed that the color values of the wheat samples were affected by ozone treatment, storage, and washing.
A graph has the unique path property UPPn if there is a unique path of length n between any ordered pair of nodes. This paper revisits Royle and MacKay's technique for constructing orderly algorithms, which we wish to use to enumerate all UPP2 graphs of the small orders 3^2 and 4^2. A first attempt using the directed graph formalism directly yields an inefficient algorithm. We therefore introduce a generalised problem and derive algebraic and combinatorial structures with the appropriate properties, which enable us to design an orderly algorithm that determines all UPP2 graphs of order 3^2 quickly. We hope to determine the UPP2 graphs of order 4^2 in the near future.
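The UPPn condition has a crisp matrix characterization: if A is the adjacency matrix of a directed graph, the graph is UPPn exactly when every entry of A^n equals 1. A minimal sketch in Python, using the de Bruijn graph B(3, 2) as a classical UPP2 graph of order 3^2 (the function names are ours):

```python
def de_bruijn_adjacency(k):
    # Nodes of B(k, 2) are ordered pairs (a, b) over k symbols, encoded a*k + b.
    # There is an edge (a, b) -> (b, c) for every symbol c.
    n = k * k
    A = [[0] * n for _ in range(n)]
    for a in range(k):
        for b in range(k):
            for c in range(k):
                A[a * k + b][b * k + c] = 1
    return A

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][t] * Y[t][j] for t in range(n)) for j in range(n)]
            for i in range(n)]

def has_upp(A, n):
    # UPP_n holds iff A^n is the all-ones matrix: exactly one walk of
    # length n between every ordered pair of nodes.
    P = A
    for _ in range(n - 1):
        P = mat_mul(P, A)
    return all(x == 1 for row in P for x in row)

print(has_upp(de_bruijn_adjacency(3), 2))  # True: B(3,2) is UPP2 of order 9
```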
The disintegration of networks is a widely researched topic with significant applications in fields such as counterterrorism and infectious disease control. While traditional approaches to network disintegration involve identifying critical sets of nodes or edges, limited research has been carried out on edge-based disintegration strategies. We propose a novel algorithm, a rank aggregation elite enumeration algorithm based on edge-coupled networks (RAEEC), which aims to disintegrate edge-coupled networks by finding important sets of edges while balancing effectiveness and efficiency. Our algorithm is based on a two-layer edge-coupled network model with one-to-one links, and utilizes three advanced edge importance metrics to rank the edges separately. A comprehensive ranking of edges is then obtained using a rank aggregation approach proposed in this study. The top few edges from the ranking set obtained by RAEEC are used to generate an enumeration set, which is iteratively updated to identify the set of elite attack edges. We conduct extensive experiments on synthetic networks to evaluate the performance of the proposed method, and the results indicate that RAEEC achieves a satisfactory balance between efficiency and effectiveness. Our approach represents a significant contribution to the field of network disintegration, particularly for edge-based strategies.
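The abstract does not spell out the aggregation scheme, so as a hedged illustration of the general idea of combining several edge-importance rankings into one consensus order, here is a simple Borda-count aggregation (the edge names and the three rankings are invented):

```python
from collections import defaultdict

def borda_aggregate(rankings):
    """Combine several rankings of the same edges into one consensus order.
    Each ranking is a list of edges, best first; an edge scores
    (len - position) points per ranking (a plain Borda count)."""
    score = defaultdict(int)
    for ranking in rankings:
        m = len(ranking)
        for pos, edge in enumerate(ranking):
            score[edge] += m - pos
    return sorted(score, key=lambda e: -score[e])

# Three hypothetical importance metrics rank four edges differently:
r1 = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]
r2 = [("b", "c"), ("a", "b"), ("d", "a"), ("c", "d")]
r3 = [("a", "b"), ("c", "d"), ("b", "c"), ("d", "a")]
print(borda_aggregate([r1, r2, r3])[0])  # ('a', 'b') tops the consensus
```

In RAEEC the consensus ranking would then seed the enumeration set that is iteratively refined into the elite attack edges.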
With the development of 5th-generation mobile communication (5G) networks and artificial intelligence (AI) technologies, the use of the Internet of Things (IoT) has expanded throughout industry. Although IoT networks have improved industrial productivity and convenience, they are highly dependent on nonstandard protocol stacks and open-source, poorly validated software, resulting in several security vulnerabilities. However, conventional AI-based software vulnerability discovery technologies cannot be applied to IoT because they require excessive memory and computing power. This study developed a technique for optimizing training data size so as to detect software vulnerabilities rapidly while maintaining learning accuracy. Experimental results using a software vulnerability classification dataset showed that the optimized data sizes did not degrade the learning performance of the models. Moreover, the minimal data size required to train a model without performance degradation could be determined in advance. For example, the random forest model saved 85.18% of memory and improved latency by 97.82% while maintaining a learning accuracy similar to that achieved with 100% of the data, despite using only 1%.
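The core idea of training-data-size optimization can be sketched independently of the paper's models: sweep increasing training fractions and stop at the smallest one whose test accuracy stays within a tolerance of the full-data accuracy. The sketch below uses a toy nearest-centroid learner on synthetic data purely as a stand-in (the paper uses models such as random forests on a real vulnerability dataset):

```python
import random

def centroid_classifier(train):
    # Tiny stand-in learner: classify a point by the nearer class centroid.
    c0 = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
    c1 = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)
    return lambda x: 0 if abs(x - c0) <= abs(x - c1) else 1

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

def minimal_fraction(train0, train1, test, tolerance=0.01):
    """Smallest training fraction whose test accuracy stays within
    `tolerance` of the accuracy obtained with all of the training data."""
    full = accuracy(centroid_classifier(train0 + train1), test)
    for frac in (0.01, 0.05, 0.10, 0.25, 0.50, 1.00):
        n = max(2, int(len(train0) * frac))
        sub = train0[:n] + train1[:n]          # stratified subsample
        acc = accuracy(centroid_classifier(sub), test)
        if acc >= full - tolerance:
            return frac, acc
    return 1.0, full

random.seed(1)
# Synthetic, well-separated two-class data: a stand-in for a real
# vulnerability-classification dataset.
train0 = [(random.gauss(0, 1), 0) for _ in range(400)]
train1 = [(random.gauss(4, 1), 1) for _ in range(400)]
test = ([(random.gauss(0, 1), 0) for _ in range(100)] +
        [(random.gauss(4, 1), 1) for _ in range(100)])

frac, acc = minimal_fraction(train0, train1, test)
print(f"smallest adequate fraction: {frac:.0%} (accuracy {acc:.3f})")
```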
The boom of coding languages in the 1950s revolutionized how our digital world was constructed and accessed. The languages invented then, including Fortran, are still in use today due to their versatility and their role in underpinning a large majority of the older portions of our digital world and its applications. Fortran, or Formula Translation, was a programming language implemented by IBM that simplified the mechanics of coding and improved the efficacy of language syntax. Fortran marked the beginning of a new era of efficient programming by reducing the number of statements needed to operate a machine several-fold. Since then, dozens more languages have come into regular practice and have been increasingly diversified over the years. Some modern languages include Python, Java, JavaScript, C, C++, and PHP. These languages significantly improved efficiency and also have a broad range of uses. Python is mainly used for website/software development, data analysis, task automation, image processing, and graphic design applications. On the other hand, Java is primarily used as a client-side programming language. Expanding the set of coding languages increased accessibility, but also exposed applications to pertinent security issues, which have varied in prevalence by language. Previous research has narrowed its focus to individual languages, failing to evaluate their security comparatively. This research paper investigates the severity and frequency of coding vulnerabilities across different languages and contextualizes their uses in a systematic literature review.
The subcarrier allocation problem in cognitive radio (CR) networks with multi-user orthogonal frequency-division multiplexing (OFDM) and distributed antennas is analyzed and modeled for the flat fading channel and the frequency selective channel, where the constraint on the secondary user (SU) to protect the primary user (PU) is that the total throughput of each PU must stay above a given threshold, rather than an "interference temperature". According to the features of the different channel types, optimal subcarrier allocation schemes are proposed to pursue efficiency (i.e., maximal throughput), using the branch and bound algorithm and the 0-1 implicit enumeration algorithm. Furthermore, considering the tradeoff between efficiency and fairness, optimal subcarrier allocation schemes with fairness are proposed for the different fading channels, using the pegging algorithm. Extensive simulation results illustrate the significant performance improvement of the proposed subcarrier allocation schemes over existing ones in different scenarios.
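The 0-1 implicit enumeration mentioned above is a branch-and-bound search over binary decision variables: fix each variable to 1 or 0 in turn, and prune any branch whose optimistic bound cannot beat the incumbent. A hedged, self-contained sketch on a toy knapsack-style allocation (not the paper's actual throughput model):

```python
def implicit_enumeration(values, weights, capacity):
    """0-1 implicit enumeration (branch and bound): branch on each binary
    variable, pruning branches whose optimistic bound (current value plus
    the sum of all remaining values) cannot beat the incumbent."""
    n = len(values)
    best = {"value": 0, "choice": [0] * n}

    def search(i, value, weight, choice):
        if weight > capacity:
            return                      # infeasible branch
        if value > best["value"]:
            best["value"], best["choice"] = value, choice[:]
        if i == n or value + sum(values[i:]) <= best["value"]:
            return                      # exhausted or bound-pruned
        choice[i] = 1
        search(i + 1, value + values[i], weight + weights[i], choice)
        choice[i] = 0
        search(i + 1, value, weight, choice)

    search(0, 0, 0, [0] * n)
    return best["value"], best["choice"]

# Toy allocation: 5 subcarriers with per-carrier throughputs and "costs".
print(implicit_enumeration([6, 5, 4, 3, 2], [4, 3, 3, 2, 1], 7))
# → (11, [1, 1, 0, 0, 0])
```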
The Ludox-QPS method is a newly developed technique, which combines Ludox HS 40 density centrifugation and quantitative protargol staining, to enumerate marine ciliates with good taxonomic resolution. We tested the method for simultaneous enumeration of diatoms, protozoa and meiobenthos and compared its extraction efficiency for meiobenthos with that of the routine Ludox-TM centrifugation and a modified protocol using Ludox HS 40. We conducted the evaluation with a sample size of 8.3 ml each from sandy, muddy-sand and muddy sediments collected from the intertidal area of the Yellow Sea in summer 2006 and spring 2007. The Ludox-QPS method not only produced high extraction efficiencies of 97±1.3% for diatoms and 97.6±0.8% for ciliates, indicating a reliable enumeration of eukaryotic microbenthos, but also produced excellent extraction efficiencies of, on average, 97.3% for total meiobenthos, 97.9% for nematodes and 97.8% for copepods from sands, muddy sands and mud. By contrast, the routine Ludox-TM centrifugation recovered only about 74% of total meiobenthos abundance with one extraction cycle, and the modified Ludox HS 40 centrifugation yielded on average 93% of total meiobenthos: 89.4±2.0% from sands, 93±4.1% from muddy sands and 97.1±3.0% from mud. Apart from sediment type, sample volume was another important factor affecting the extraction efficiency for meiobenthos: the extraction rate increased to about 96.4% when the same modified Ludox centrifugation was used on a 4 ml sediment sample. Besides its excellent extraction efficiency, the Ludox-QPS method obtained higher abundances of meiobenthos, in particular nematodes, than the routine Ludox centrifugation, which frequently resulted in an uncertain loss of small meiobenthos during the sieving process. Statistical analyses demonstrated that there were no significant differences between the meiobenthos communities revealed by the Ludox-QPS method and the modified Ludox HS 40 centrifugation, confirming the efficiency of the Ludox-QPS method for simultaneous enumeration of diatoms, protozoa and meiobenthos. Moreover, the comparatively high taxonomic resolution of the method, especially for diatoms and ciliates, makes it feasible to investigate microbial ecology at the community level.
AIM: To establish a scoring system for predicting the incidence of postoperative complications and mortality in general surgery, based on the physiological and operative severity score for the enumeration of mortality and morbidity (POSSUM), and to evaluate its efficacy. METHODS: Eighty-four patients with postoperative complications or death and 172 patients without postoperative complications, who underwent surgery in our department during the previous 2 years, were retrospectively analyzed by logistic regression. Fifteen indexes were investigated, including age, cardiovascular function, respiratory function, blood test results, endocrine function, central nervous system function, hepatic function, renal function, nutritional status, extent of operative trauma, and course of anesthesia. A modified POSSUM (M-POSSUM) was developed from the significant risk factors and its efficacy evaluated. RESULTS: The significant risk factors were age, cardiovascular function, respiratory function, hepatic function, renal function, blood test results, endocrine function, nutritional status, duration of operation, intraoperative blood loss, and course of anesthesia; all were included in the scoring system. There were significant differences in scores between patients with and without postoperative complications, between patients who died and those who survived with complications, and between patients who died and those who survived without complications. Receiver operating characteristic curves showed that the M-POSSUM accurately predicted postoperative complications and mortality. CONCLUSION: M-POSSUM correlates well with postoperative complications and mortality, and is more accurate than POSSUM.
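For context, the original POSSUM system (Copeland et al., 1991) converts a physiological score (PS) and an operative severity score (OS) into predicted risks through two logistic equations; M-POSSUM refits the factor set and weights. A sketch using the widely cited original coefficients (the refitted M-POSSUM coefficients are not given in this abstract, so this is illustrative only):

```python
import math

def possum_risk(physiology_score, operative_score, morbidity=True):
    """Widely cited POSSUM logistic equations:
      morbidity:  ln(R/(1-R)) = -5.91 + 0.16*PS + 0.19*OS
      mortality:  ln(R/(1-R)) = -7.04 + 0.13*PS + 0.16*OS
    Returns the predicted risk R in [0, 1]."""
    if morbidity:
        logit = -5.91 + 0.16 * physiology_score + 0.19 * operative_score
    else:
        logit = -7.04 + 0.13 * physiology_score + 0.16 * operative_score
    return 1.0 / (1.0 + math.exp(-logit))

# A hypothetical patient with PS = 20 and OS = 15:
print(round(possum_risk(20, 15), 3))                    # predicted morbidity risk
print(round(possum_risk(20, 15, morbidity=False), 3))   # predicted mortality risk
```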
Hilbert's 15th problem required understanding Schubert's book. In that book, reduction to degenerate cases was one of the main methods of enumeration. We found that nonstandard analysis is a suitable tool for making rigorous those of Schubert's proofs that used the degeneration method but are obviously not rigorous as stated. In this paper, we give a rigorous proof of Example 4 in Chapter 1, §4 of Schubert's book, following his idea. This shows that Schubert's intuitive idea is correct, but that considerable work is needed to make it rigorous.
In this paper, we give rigorous justification of the ideas put forward in §20, Chapter 4 of Schubert's book, a section that deals with the enumeration of conics in space. In that section, Schubert introduced two degenerate conditions about conics, i.e., the double line and the pair of intersecting lines. Using these two degenerate conditions, he obtained all relations regarding the following three conditions: conics whose planes pass through a given point, conics intersecting a given line, and conics tangent to a given plane. We use the language of blow-ups to treat the two degenerate conditions rigorously and prove all formulas about degenerate conditions stemming from Schubert's idea.
The concept of graphlike manifolds was presented in [1], and the problem of counting the homeomorphism classes of graphlike manifolds has been studied in a series of articles, e.g., [1~8]. In this paper we suggest an approach based on graph colouring, Abelian groups, and combinatorial enumeration methods.
Hilbert's 15th problem required an understanding of Schubert's book [1], both its methods and its results. In this paper, following his idea, we prove that the formulas in §6, §7, and §10, about the incidence of points, lines and planes, are all correct. As an application, we prove formulas 8 and 9 in §12, which are used frequently in his book.
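As a flavour of the kind of incidence computation Schubert's calculus makes routine (a classical textbook example, not one of the formulas verified in the paper): the number of lines in P^3 meeting four general lines follows from multiplying Schubert classes in the Chow ring of the Grassmannian G(1, 3):

```latex
\sigma_1^2 = \sigma_2 + \sigma_{1,1}, \qquad
\sigma_1^4 = (\sigma_2 + \sigma_{1,1})^2
           = \sigma_2^2 + 2\,\sigma_2\sigma_{1,1} + \sigma_{1,1}^2
           = 1 + 0 + 1 = 2,
```

so exactly two lines meet four general lines in P^3.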
Let Γm,n^* denote the set of all m × n strongly connected bipartite tournaments, and let a(m, n) be the maximal integer k such that every m × n bipartite tournament contains at least one k × k transitive bipartite subtournament. Let t(m, n, k, l) = max{t(Tm,n, k, l) : Tm,n ∈ Γm,n^*}, where t(Tm,n, k, l) is the number of k × l (k ≥ 2, l ≥ 2) transitive bipartite subtournaments contained in Tm,n ∈ Γm,n^*. We obtain a graph-theoretic method for solving some integer programs, investigate upper bounds of a(m, n), and determine t(m, n, k, l).
Multiple-Input Multiple-Output (MIMO) techniques are promising for wireless communication systems because of their high spectral efficiency. The Sphere Detector (SD) is favoured in MIMO detection for achieving Maximum-Likelihood (ML) performance. In this paper, we propose a new SD method for MIMO-Orthogonal Frequency Division Multiplexing (OFDM) systems based on IEEE 802.11n, which uses Singular Value Decomposition (SVD) in the complex domain to reduce computational complexity. Furthermore, a new Schnorr-Euchner (SE) enumeration algorithm is discussed in detail. Computer simulation results show that the computational complexity and the number of visited nodes can be reduced significantly compared with conventional SD detectors, with the same Bit Error Rate (BER) performance.
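The Schnorr-Euchner refinement orders the candidate symbols at each level of the search tree by their distance from the local unconstrained estimate, so the first leaf reached already gives a tight radius and most later branches are pruned. A minimal real-valued sketch on a 2x2 toy system with BPSK-like symbols (not the paper's IEEE 802.11n SVD formulation):

```python
import itertools

def sphere_detect(R, y, symbols):
    """Depth-first sphere detection on an upper-triangular system R x ≈ y.
    At each level, candidates are visited in Schnorr-Euchner order
    (closest to the local zero-forcing estimate first)."""
    n = len(y)
    best = {"dist": float("inf"), "x": None}

    def search(level, x, partial):
        if partial >= best["dist"]:
            return                              # prune this branch
        if level < 0:
            best["dist"], best["x"] = partial, x[:]
            return
        # Unconstrained estimate at this level given the choices below it.
        acc = sum(R[level][j] * x[j] for j in range(level + 1, n))
        est = (y[level] - acc) / R[level][level]
        for s in sorted(symbols, key=lambda t: abs(t - est)):  # SE order
            x[level] = s
            inc = (y[level] - acc - R[level][level] * s) ** 2
            search(level - 1, x, partial + inc)
        x[level] = None

    search(n - 1, [None] * n, 0.0)
    return best["x"]

# 2x2 toy system, symbols {-1, +1}:
R = [[2.0, 0.5], [0.0, 1.5]]
y = [1.8, -1.2]
x_sd = sphere_detect(R, y, (-1, 1))
# Exhaustive ML search must agree with the sphere detector:
x_ml = min(itertools.product((-1, 1), repeat=2),
           key=lambda x: sum((y[i] - sum(R[i][j] * x[j] for j in range(2))) ** 2
                             for i in range(2)))
print(x_sd, list(x_ml))  # [1, -1] [1, -1]
```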
The number of configurations, c(n, m), of a single chain of length n attached to a flat surface with m monomers contacting the surface is exactly enumerated, and a function of c(n, m) in terms of m and n is obtained. From this function, a scaling law for the mean energy of the chain is derived, and we estimate the critical point εc = 0.276 and the crossover exponent φ = 0.5. The free energy difference between a tethered chain and a free chain in dilute solution is also studied, which shows that the critical adsorption point is about 0.272 for an infinitely long chain with φ = 0.5.
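Exact enumeration of c(n, m) is a depth-first count of self-avoiding walks tethered to the wall. A minimal sketch on the square-lattice half-plane (the paper's lattice and chain lengths may differ; enumeration cost grows exponentially in n):

```python
from collections import Counter

def tethered_walks(n):
    """Exactly enumerate self-avoiding walks of n steps on the square
    lattice, tethered at the origin and confined to the half-plane y >= 0.
    Returns Counter c[m] = number of walks with m monomers on the surface
    (the walk has n+1 monomers, including the tethered one at y = 0)."""
    counts = Counter()

    def extend(path, visited):
        if len(path) == n + 1:
            counts[sum(1 for (_, y) in path if y == 0)] += 1
            return
        x, y = path[-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (x + dx, y + dy)
            if nxt[1] >= 0 and nxt not in visited:
                visited.add(nxt)
                path.append(nxt)
                extend(path, visited)
                path.pop()
                visited.remove(nxt)

    extend([(0, 0)], {(0, 0)})
    return counts

c = tethered_walks(3)
print(dict(c), sum(c.values()))
```

From such tables of c(n, m) one forms the partition function Σ_m c(n, m) exp(m ε) and extracts the mean energy and adsorption behaviour.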
The conformational entropy S and free energy F were calculated by exact enumeration of polysilane chains of up to 23 segments with excluded volume (EV) and long-range van der Waals (VW) interactions. A nonlinear relation between S_EV+VW and chain length n was found, although S_EV varies linearly with n. We found that the second-order transition temperature of a polysilane chain with VW interaction increases with chain length, while that of a polysilane chain without VW interaction is chain-length independent. Moreover, the free energies F_EV+VW and F_EV are both linearly related to n, and F_EV+VW < F_EV at all temperatures.
By using Pólya's theorem, the generating functions for the configurations, and for the chiral and achiral configurations, of saturated polybasic alcohols containing an adamantane skeleton were obtained. In order to calculate the generating functions of compounds substituted under a limiting condition, an exclusion method is used and the cycle index is modified.
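Pólya's theorem counts inequivalent substitution patterns by substituting into the cycle index of the skeleton's symmetry group. The adamantane group is more involved, so as a hedged minimal illustration of the same mechanism, here is cycle-index counting for a cyclic group C_n:

```python
from math import gcd

def cycle_index_count(n, k):
    """Pólya counting for k-colorings of n sites under the cyclic group C_n:
    substitute k into the cycle index Z(C_n) = (1/n) * sum_d phi(d) a_d^(n/d),
    computed here directly as (1/n) * sum over rotations r of k^gcd(r, n),
    since rotation by r has gcd(r, n) cycles."""
    return sum(k ** gcd(r, n) for r in range(n)) // n

print(cycle_index_count(3, 2))  # 4 distinct 2-colorings of a 3-cycle
print(cycle_index_count(6, 3))  # 130 distinct 3-colorings of a 6-cycle
```

For adamantane one would substitute into the cycle index of its full substitution-symmetry group instead; the paper's modification of the cycle index handles the restricted-substitution case.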
Large high-dimensional data have posed great challenges to existing algorithms for frequent itemset mining. To solve the problem, a hybrid method is proposed, consisting of a novel row enumeration algorithm and a column enumeration algorithm. The intention of the hybrid method is to decompose the mining task into two subtasks and then choose appropriate algorithms to solve each of them. The novel algorithm, Inter-transaction, is based on the characteristic that there are few common items between or among long transactions. In addition, an optimization technique is adopted to improve the performance of the intersection of bit-vectors. Experiments on synthetic data show that the method achieves high performance on large high-dimensional data.
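The bit-vector intersection mentioned above is easy to sketch: give each item a transaction bit-vector, and the support of an itemset is the popcount of the AND of its members' vectors. A hedged column-enumeration sketch (not the paper's Inter-transaction algorithm; the data are invented):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Column-enumeration sketch: each item maps to a transaction bit-vector
    (one Python int), and an itemset's support is the popcount of the AND
    of its members' vectors."""
    items = sorted({i for t in transactions for i in t})
    bits = {i: 0 for i in items}
    for row, t in enumerate(transactions):
        for i in t:
            bits[i] |= 1 << row
    result = {}
    for size in range(1, len(items) + 1):
        found = False
        for combo in combinations(items, size):
            mask = -1
            for i in combo:
                mask &= bits[i]             # bit-vector intersection
            support = bin(mask & ((1 << len(transactions)) - 1)).count("1")
            if support >= min_support:
                result[combo] = support
                found = True
        if not found:
            break   # no frequent itemset of this size, so none larger either
    return result

data = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
print(frequent_itemsets(data, 3))
```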
This paper presents techniques for implicit traversal and state verification of sequential finite state machines (FSMs), based on state collapsing of the state transition graph (STG). The problems of state design are described, and heuristic knowledge is proposed to achieve high state-enumeration coverage.
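Implicit traversal manipulates whole sets of states symbolically (e.g., with BDDs); the explicit traversal it improves upon is a plain breadth-first enumeration of reachable states, which is easy to sketch and already defines the coverage figure mentioned above (the example FSM is invented):

```python
from collections import deque

def reachable_states(transition, start, inputs):
    """Explicit state enumeration of an FSM by breadth-first traversal:
    from the reset state, apply every input symbol and collect the set
    of reachable states."""
    seen = {start}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        for a in inputs:
            nxt = transition(s, a)
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Toy counter FSM over states 0..7 that saturates at 5: only 0..5 reachable.
trans = lambda s, a: min(s + a, 5)
states = reachable_states(trans, 0, (0, 1))
print(sorted(states))  # [0, 1, 2, 3, 4, 5]
```

Implicit techniques avoid this state-by-state expansion, which is what makes them scale to FSMs whose reachable sets are far too large to enumerate explicitly.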
文摘In this study, ozone gas was applied to samples of durum wheat stored in four experimental groups (durum wheat without any treatment for comparison, durum wheat treated with ozone, purified durum wheat, and purified durum wheat treated with ozone). Two groups were treated with ozone gas at 3 ppm concentration for 1 hour. Groups were then placed in air-tight glass jars and stored for 6 months at variable temperatures between 24.7°C to 34.8°C. Microbiological (total count bacteria, yeast/molds and coliform) and physical properties (moisture, color and ash) evaluated. Ozone application statistically caused a significant reduction in the numbers of bacteria, yeast, molds and coliforms. Ozone application, washing process and storage temperature are the major factors affecting the microbial counts. No significant differences were determined in moisture and ash contents of samples after ozone treatment. The color measurement results showed that color values of wheat samples were affected by ozone treatment, storage and washing.
基金supported in part by Project P15691 from the Austrian Federal FWF,the national science finding body,as well as by several ongoing grants from Stadt Linz,Land Obersterreich and the Austrian Federal BKA.Kunst
文摘A graph has the unique path property UPPn if there is a unique path of length n between any ordered pair of nodes. This paper reiterates Royle and MacKay's technique for constructing orderly algorithms. We wish to use this technique to enumerate all UPP2 graphs of small orders 3^2 and 4^2. We attempt to use the direct graph formalism and find that the algorithm is inefficient. We introduce a generalised problem and derive algebraic and combinatoric structures with appropriate structure. Then we are able to design an orderly algorithm to determine all UPP2 graphs of order 3^2, which runs fast enough. We hope to be able to determine the UPP2 graphs of order 4^2 in the near future.
基金supported by the National Natural Science Foundation of China (Grant Nos. 61877046, 12271419, and 62106186)the Natural Science Basic Research Program of Shaanxi (Program No. 2022JQ-620)the Fundamental Research Funds for the Central Universities (Grant Nos. XJS220709, JB210701, and QTZX23002)。
文摘The disintegration of networks is a widely researched topic with significant applications in fields such as counterterrorism and infectious disease control. While the traditional approaches for achieving network disintegration involve identifying critical sets of nodes or edges, limited research has been carried out on edge-based disintegration strategies. We propose a novel algorithm, i.e., a rank aggregation elite enumeration algorithm based on edge-coupled networks(RAEEC),which aims to implement tiling for edge-coupled networks by finding important sets of edges in the network while balancing effectiveness and efficiency. Our algorithm is based on a two-layer edge-coupled network model with one-to-one links, and utilizes three advanced edge importance metrics to rank the edges separately. A comprehensive ranking of edges is obtained using a rank aggregation approach proposed in this study. The top few edges from the ranking set obtained by RAEEC are then used to generate an enumeration set, which is continuously iteratively updated to identify the set of elite attack edges.We conduct extensive experiments on synthetic networks to evaluate the performance of our proposed method, and the results indicate that RAEEC achieves a satisfactory balance between efficiency and effectiveness. Our approach represents a significant contribution to the field of network disintegration, particularly for edge-based strategies.
基金supported by a National Research Foundation of Korea (NRF)grant funded by the Ministry of Science and ICT (MSIT) (No.2020R1F1A1061107)the Korea Institute for Advancement of Technology (KIAT)grant funded by the Korean Government (MOTIE) (P0008703,The Competency Development Program for Industry Specialists)the MSIT under the ICAN (ICT Challenge and Advanced Network of HRD)program (No.IITP-2022-RS-2022-00156310)supervised by the Institute of Information&Communication Technology Planning and Evaluation (IITP).
文摘With the development of the 5th generation of mobile communi-cation(5G)networks and artificial intelligence(AI)technologies,the use of the Internet of Things(IoT)has expanded throughout industry.Although IoT networks have improved industrial productivity and convenience,they are highly dependent on nonstandard protocol stacks and open-source-based,poorly validated software,resulting in several security vulnerabilities.How-ever,conventional AI-based software vulnerability discovery technologies cannot be applied to IoT because they require excessive memory and com-puting power.This study developed a technique for optimizing training data size to detect software vulnerabilities rapidly while maintaining learning accuracy.Experimental results using a software vulnerability classification dataset showed that different optimal data sizes did not affect the learning performance of the learning models.Moreover,the minimal data size required to train a model without performance degradation could be determined in advance.For example,the random forest model saved 85.18%of memory and improved latency by 97.82%while maintaining a learning accuracy similar to that achieved when using 100%of data,despite using only 1%.
文摘The boom of coding languages in the 1950s revolutionized how our digital world was construed and accessed. The languages invented then, including Fortran, are still in use today due to their versatility and ability to underpin a large majority of the older portions of our digital world and applications. Fortran, or Formula Translation, was a programming language implemented by IBM that shortened the apparatus of coding and the efficacy of the language syntax. Fortran marked the beginning of a new era of efficient programming by reducing the number of statements needed to operate a machine several-fold. Since then, dozens more languages have come into regular practice and have been increasingly diversified over the years. Some modern languages include Python, Java, JavaScript, C, C++, and PHP. These languages significantly improved efficiency and also have a broad range of uses. Python is mainly used for website/software development, data analysis, task automation, image processing, and graphic design applications. On the other hand, Java is primarily used as a client-side programming language. Expanding the coding languages allowed for increasing accessibility but also opened up applications to pertinent security issues. These security issues have varied by prevalence and language. Previous research has narrowed its focus on individual languages, failing to evaluate the security. This research paper investigates the severity and frequency of coding vulnerabilities comparatively across different languages and contextualizes their uses in a systematic literature review.
基金The National Natural Science Foundation of China(No.60832009)Beijing Municipal Natural Science Foundation(No.4102044)National Major Science & Technology Project(No.2009ZX03003-003-01)
文摘The subcarrier allocation problem in cognitive radio(CR)networks with multi-user orthogonal frequency-division multiplexing(OFDM)and distributed antenna is analyzed and modeled for the flat fading channel and the frequency selective channel,where the constraint on the secondary user(SU)to protect the primary user(PU)is that the total throughput of each PU must be above the given threshold instead of the "interference temperature".According to the features of different types of channels,the optimal subcarrier allocation schemes are proposed to pursue efficiency(or maximal throughput),using the branch and bound algorithm and the 0-1 implicit enumeration algorithm.Furthermore,considering the tradeoff between efficiency and fairness,the optimal subcarrier allocation schemes with fairness are proposed in different fading channels,using the pegging algorithm.Extensive simulation results illustrate the significant performance improvement of the proposed subcarrier allocation schemes compared with the existing ones in different scenarios.
基金Supported by the Knowledge Innovation Program of Chinese Academy of Sciences (No.KZCX2-YW-417)the National Natural Science Foundation of China (No.40576072+1 种基金40706047)the "100 Talents Project" of the Chinese Academy of Sciences
文摘The Ludox-QPS method is a newly developed technique,which combines the Ludox HS 40 density centrifugation and quantitative protargol stain,to enumerate marine ciliates with good taxonomic resolution.We tested the method for simultaneous enumeration of diatoms,protozoa and meiobenthos and compared its extraction efficiency for meiobenthos with that of the routine Ludox-TM centrifugation and a modified protocol using Ludox HS 40.We conducted the evaluation with a sample size of 8.3 ml each from sandy,muddy-sand and muddy sediments collected from the intertidal area of the Yellow Sea in summer 2006 and spring 2007.The Ludox-QPS method not only produced high extraction efficiencies of 97±1.3% for diatoms and 97.6±0.8% for ciliates,indicating a reliable enumeration for eukaryotic microbenthos,but also produced excellent extraction efficiencies of on average 97.3% for total meiobenthos,97.9% for nematodes and 97.8% for copepods from sands,muddy sands and mud.By contrast,the routine Ludox-TM centrifugation obtained only about 74% of total meiobenthos abundance with one extraction cycle,and the modified Ludox HS 40 centrifugation yielded on average 93% of total meiobenthos:89.4±2.0% from sands,93±4.1% from muddy sands and 97.1±3.0% from mud.Apart from the sediment type,sample volume was another important factor affecting the extraction efficiency for meiobenthos.The extraction rate was increased to about 96.4% when using the same modified Ludox centrifugation for a 4 ml sediment sample.Besides the excellent extraction efficiency,the Ludox-QPS method obtained higher abundances of meiobenthos,in particular nematodes,than the routine Ludox centrifugation,which frequently resulted in an uncertain loss of small meiobenthos during the sieving process.Statistical analyses demonstrated that there were no significant differences between the meiobenthos communities revealed by the Ludox-QPS method and the modified Ludox HS 40 centrifugation,showing the high efficiency of the 
Ludox-QPS method for simultaneous enumeration of diatom,protozoa and meiobenthos.Moreover,the comparatively high taxonomic resolution of the method,especially for diatoms and ciliates,makes it feasible to investigate microbial ecology at community level.
文摘AIM: To establish a scoring system for predicting the incidence of postoperative complications and mortality in general surgery based on the physiological and operative severity score for the enumeration of mortality and morbidity (POSSUM), and to evaluate its efficacy. METHODS: Eighty-four patients with postoperative complications or death and 172 patients without postoperative complications, who underwent surgery in our department during the previous 2 years, were retrospectively analyzed by logistic regression. Fifteen indexes were investigated including age, cardiovascular function, respiratory function, blood test results, endocrine function, central nervous system function, hepatic function, renal function, nutritional status, extent of operative trauma, and course of anesthesia. Modified POSSUM (M-POSSUM) was developed using significant risk factors with its efficacy evaluated. RESULTS: The significant risk factors were found to be age, cardiovascular function, respiratory function, hepatic function, renal function, blood test results, endocrine function, nutritional status, duration of operation, intraoperative blood loss, and course of anesthesia. These factors were all included in the scoring system. There were significant differences in the scores between the patients with and without postoperative complications, between the patients died and survived with complications, and between the patients died and survived without complications. The receiver operating characteristic curves showed that the M-POSSUM could accurately predict postoperative complications and mortality.CONCLUSION: M-POSSUM correlates well with postoperative complications and mortality, and is more accurate than POSSUM.
Abstract: Hilbert Problem 15 required understanding Schubert's book. In that book, reduction to degenerate cases was one of the main methods of enumeration. We found that nonstandard analysis is a suitable tool for making rigorous those of Schubert's proofs that use the degeneration method but are obviously not rigorous. In this paper, we give a rigorous proof of Example 4 in Chapter 1, §4 of Schubert's book, following his idea. This shows that Schubert's intuitive idea is correct, but that making it rigorous requires considerable work.
Funding: partially supported by the National Center for Mathematics and Interdisciplinary Sciences, CAS.
Abstract: In this paper, we give a rigorous justification of the ideas put forward in §20, Chapter 4 of Schubert's book, a section that deals with the enumeration of conics in space. In that section, Schubert introduced two degenerate conditions on conics, namely the double line and the pair of intersecting lines. Using these two degenerate conditions, he obtained all relations among the following three conditions: conics whose planes pass through a given point, conics intersecting a given line, and conics tangent to a given plane. We use the language of blow-ups to treat the two degenerate conditions rigorously and prove all the formulas about degenerate conditions stemming from Schubert's idea.
Abstract: The concept of graphlike manifolds was presented in [1], and the problem of counting the homeomorphism classes of graphlike manifolds has been studied in a series of articles, e.g., [1-8]. In this paper we suggest an approach based on graph colouring, Abelian groups and the combinatorial enumeration method.
Funding: partially supported by the National Center for Mathematics and Interdisciplinary Sciences, CAS.
Abstract: Hilbert Problem 15 required an understanding of Schubert's book [1], both its methods and its results. In this paper, following his idea, we prove that the formulas in §6, §7 and §10, concerning the incidence of points, lines and planes, are all correct. As an application, we prove Formulas 8 and 9 in §12, which are used frequently in his book.
Abstract: Let Γm,n^* denote the set of all m × n strongly connected bipartite tournaments, and let a(m, n) be the maximal integer k such that every m × n bipartite tournament contains at least one k × k transitive bipartite subtournament. Let t(m, n, k, l) = max{t(Tm,n, k, l) : Tm,n ∈ Γm,n^*}, where t(Tm,n, k, l) is the number of k × l (k ≥ 2, l ≥ 2) transitive bipartite subtournaments contained in Tm,n ∈ Γm,n^*. We obtain a graph-theoretic method for solving some integer programs, investigate upper bounds on a(m, n), and determine t(m, n, k, l).
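The quantity t(Tm,n, k, l) can be illustrated by brute force in the smallest case k = l = 2. The sketch below assumes the standard fact that a 2 × 2 bipartite subtournament is transitive exactly when its four oriented edges do not form a directed 4-cycle; the encoding of the tournament is illustrative.

```python
from itertools import combinations, product

# orient[(u, v)] is True when the edge between left vertex u and right
# vertex v is directed u -> v in the bipartite tournament.
def count_2x2_transitive(m, n, orient):
    count = 0
    for a, b in combinations(range(m), 2):
        for x, y in combinations(range(n), 2):
            # The only non-transitive 2 x 2 patterns are the directed
            # 4-cycles a->x->b->y->a and a->y->b->x->a.
            cyc1 = orient[(a, x)] and not orient[(b, x)] and orient[(b, y)] and not orient[(a, y)]
            cyc2 = orient[(a, y)] and not orient[(b, y)] and orient[(b, x)] and not orient[(a, x)]
            if not (cyc1 or cyc2):
                count += 1
    return count

# Example: orient every edge left -> right, so no 4-cycle can occur and
# every one of the C(3,2) * C(3,2) = 9 subtournaments is transitive.
orient = {(u, v): True for u, v in product(range(3), range(3))}
print(count_2x2_transitive(3, 3, orient))
```

Exhaustive counting like this is only feasible for tiny m and n; the abstract's contribution is precisely the closed-form bounds that replace it.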
Abstract: Multiple-Input Multiple-Output (MIMO) techniques are promising for wireless communication systems because of their high spectral efficiency. The Sphere Detector (SD) is favoured in MIMO detection for achieving Maximum-Likelihood (ML) performance. In this paper, we propose a new SD method for MIMO-Orthogonal Frequency Division Multiplexing (OFDM) systems based on IEEE 802.11n, which uses Singular Value Decomposition (SVD) in the complex domain to reduce computational complexity. Furthermore, a new Schnorr-Euchner (SE) enumeration algorithm is discussed in detail. Computer simulation results show that the computational complexity and the number of visited nodes are reduced significantly compared with conventional SD detectors, at the same Bit Error Rate (BER) performance.
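A minimal real-valued sphere detector with Schnorr-Euchner ordering can illustrate why SE enumeration reduces the number of visited nodes. The sketch assumes the channel has already been QR-decomposed so that R is upper triangular and y = Rs + noise; the constellation and dimensions are toy values, not the paper's complex-domain SVD formulation.

```python
import math

def sphere_detect(R, y, constellation):
    """Depth-first ML search over constellation^n minimizing ||y - R s||^2."""
    n = len(y)
    best = [math.inf, None]  # [best metric so far, best symbol vector]
    s = [0.0] * n

    def search(level, metric):
        if metric >= best[0]:
            return  # prune: partial distance already exceeds the best leaf
        if level < 0:
            best[0], best[1] = metric, s.copy()
            return
        # Interference-cancelled residual and unconstrained estimate.
        resid = y[level] - sum(R[level][j] * s[j] for j in range(level + 1, n))
        center = resid / R[level][level]
        # Schnorr-Euchner ordering: visit symbols closest to the estimate
        # first, so the first leaf is the Babai point and pruning bites early.
        for sym in sorted(constellation, key=lambda c: abs(c - center)):
            s[level] = sym
            search(level - 1, metric + (resid - R[level][level] * sym) ** 2)

    search(n - 1, 0.0)
    return best[1], best[0]

R = [[2.0, 0.5], [0.0, 1.5]]
y = [2.0 * 1 + 0.5 * (-1), 1.5 * (-1)]  # noiseless y = R s for s = [1, -1]
s_hat, metric = sphere_detect(R, y, [-3, -1, 1, 3])
print(s_hat)  # recovers [1, -1]
```

Because the first leaf reached under SE ordering is already a good candidate, the pruning threshold tightens immediately, which is the mechanism behind the node-count reduction the abstract reports.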
Funding: supported by the National Natural Science Foundation of China (No. 20874088).
Abstract: The number of configurations, c(n, m), of a single chain of length n attached to a flat surface with m monomers contacting the surface is enumerated exactly. A function of c(n, m) in terms of m and n is obtained. From this function, a scaling law for the mean energy of the chain is derived, and we estimate the critical point εc = 0.276 and the crossover exponent φ = 0.5. The free-energy difference between a tethered chain and a free chain in dilute solution is also studied, which shows that the critical adsorption point is about 0.272 for an infinitely long chain with φ = 0.5.
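The exact-enumeration idea can be sketched on a toy model: count self-avoiding walks of n steps on the square half-lattice (y ≥ 0), tethered at the origin, tallied by the number m of monomers touching the surface y = 0. The lattice, chain lengths and tabulation below are illustrative, not the paper's model.

```python
from collections import Counter

STEPS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def enumerate_c(n):
    """Return Counter mapping m (surface contacts) -> number of n-step walks."""
    counts = Counter()

    def grow(pos, visited, steps_left, contacts):
        if steps_left == 0:
            counts[contacts] += 1
            return
        x, y = pos
        for dx, dy in STEPS:
            nxt = (x + dx, y + dy)
            # Stay in the half-space y >= 0 and keep the walk self-avoiding.
            if nxt[1] >= 0 and nxt not in visited:
                grow(nxt, visited | {nxt}, steps_left - 1,
                     contacts + (nxt[1] == 0))

    grow((0, 0), {(0, 0)}, n, 1)  # the tethered monomer itself is a contact
    return counts

print(dict(enumerate_c(1)))  # 3 one-step walks: two along the wall, one up
```

From such a table c(n, m), the partition function Z(ε) = Σm c(n, m) e^(εm) and hence the mean contact energy follow directly, which is the route to the scaling estimates quoted in the abstract.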
Abstract: The conformational entropy S and free energy F were calculated by exact enumeration of polysilane chains of up to 23 segments, with excluded volume (EV) and long-range van der Waals (VW) interactions. A nonlinear relation between S_EV+VW and chain length n was found, although S_EV varies linearly with n. We found that the second-order transition temperature of a polysilane chain with VW interaction increases with chain length, while that of a polysilane chain without VW interaction is chain-length independent. Moreover, the free energies F_EV+VW and F_EV are both linearly related to n, with F_EV+VW < F_EV at all temperatures.
Abstract: By using Polya's theorem, the generating functions for the configurations, chiral configurations and achiral configurations of saturated polybasic alcohols containing an adamantane skeleton were obtained. In order to calculate the generating functions of compounds substituted under a restricted condition, an exclusion method is used and the cycle index is modified.
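The enumeration behind Polya's theorem can be illustrated on a generic example (not the adamantane cycle index from the paper): counting vertex colourings of a square distinct up to rotation by averaging, over the rotation group, the number of colourings each rotation fixes, i.e., Burnside's lemma.

```python
from itertools import product

# The four rotations of the square, as permutations of vertex indices 0..3.
ROTATIONS = [(0, 1, 2, 3), (1, 2, 3, 0), (2, 3, 0, 1), (3, 0, 1, 2)]

def count_orbits(num_colors=2):
    """Burnside count: average number of colourings fixed by each rotation."""
    total = 0
    for perm in ROTATIONS:
        # A colouring is fixed by `perm` iff it is constant on each cycle
        # of the permutation.
        fixed = sum(
            1 for col in product(range(num_colors), repeat=4)
            if all(col[i] == col[perm[i]] for i in range(4))
        )
        total += fixed
    return total // len(ROTATIONS)

print(count_orbits(2))  # 6 distinct 2-colourings of the square's vertices
```

Polya's theorem packages the same average into the cycle index (a1^4 + a2^2 + 2a4)/4, and the paper's modification of the cycle index corresponds to restricting which colourings are admitted into this sum.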
Funding: The work was supported in part by the Research Fund for the Doctoral Program of Higher Education of China (No. 20060255006).
Abstract: Large high-dimensional data have posed great challenges to existing algorithms for frequent-itemset mining. To solve the problem, a hybrid method is proposed, consisting of a novel row-enumeration algorithm and a column-enumeration algorithm. The intention of the hybrid method is to decompose the mining task into two subtasks and then choose an appropriate algorithm for each. The novel algorithm, Inter-transaction, is based on the observation that long transactions share few common items. In addition, an optimization technique is adopted to improve the performance of bit-vector intersection. Experiments on synthetic data show that our method achieves high performance on large high-dimensional data.
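The bit-vector intersection at the core of row enumeration can be sketched with machine integers, where each set bit of a transaction's vector marks an item it contains. The transaction names and data below are illustrative.

```python
# Each transaction is an int bitmask: bit i set means item i is present.
transactions = {
    "t1": 0b10110,  # items {1, 2, 4}
    "t2": 0b10100,  # items {2, 4}
    "t3": 0b00110,  # items {1, 2}
}

def common_items(rows):
    """Intersect the bit-vectors of a row set; one AND per row."""
    bits = ~0
    for r in rows:
        bits &= transactions[r]
    return bits

def popcount(x):
    """Number of set bits, i.e., the size of the shared itemset."""
    return bin(x).count("1")

shared = common_items(["t1", "t2", "t3"])
print(popcount(shared))  # only item 2 is common to all three rows
```

Row enumeration explores row sets and keeps those whose intersection is non-empty; since each intersection is a handful of word-wide AND operations, the bit-vector representation is what makes this tractable for high-dimensional data.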
Funding: Supported by the National Natural Science Foundation of China.
Abstract: This paper presents techniques for implicit traversal and state verification of sequential finite state machines (FSMs), based on state collapsing of the state transition graph (STG). Problems in state design are described. To achieve high state-enumeration coverage, heuristic knowledge is employed.
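The baseline that implicit traversal improves upon is explicit state enumeration: breadth-first reachability over the STG from the reset state. The tiny machine below is invented for illustration; a real implicit traversal would represent state sets symbolically (e.g., with BDDs) rather than one state at a time.

```python
from collections import deque

# state -> set of successor states, over all input values.
TRANSITIONS = {
    "S0": {"S1", "S2"},
    "S1": {"S0", "S3"},
    "S2": {"S3"},
    "S3": {"S0"},
}

def reachable(reset):
    """Explicitly enumerate all states reachable from the reset state."""
    seen, frontier = {reset}, deque([reset])
    while frontier:
        s = frontier.popleft()
        for t in TRANSITIONS.get(s, ()):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

print(len(reachable("S0")))  # all 4 states of the toy machine are reachable
```

State collapsing shrinks the STG before such a traversal, and the heuristic knowledge mentioned in the abstract guides which states to merge so that coverage stays high while the explored graph stays small.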