Abstract: In this paper, the Laplace decomposition method (LDM) and the Padé approximant are employed to find approximate solutions for the Whitham-Broer-Kaup shallow water model, the coupled nonlinear reaction-diffusion equations, and the Hirota-Satsuma coupled KdV system. In addition, the results obtained from the LDM and the Padé approximant are compared with the corresponding exact analytical solutions.
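As a generic orientation (a sketch of the standard technique, not this paper's specific derivation), the LDM applies the Laplace transform in time, expands the solution and the nonlinear term in an Adomian-style series, and inverts term by term; the truncated series is then replaced by a Padé approximant to enlarge the region of convergence. For a model problem u_t + R(u) + N(u) = g with u(x,0) = f(x):

```latex
% Laplace decomposition sketch for u_t + R(u) + N(u) = g, u(x,0) = f(x),
% where R is linear, N is nonlinear, and \mathcal{L} is the Laplace transform in t.
\[
  \mathcal{L}\{u\}(x,s) \;=\; \frac{f(x)}{s} + \frac{1}{s}\,\mathcal{L}\{g\}
      \;-\; \frac{1}{s}\,\mathcal{L}\{R(u) + N(u)\}
\]
% Writing u = \sum_{n\ge 0} u_n and N(u) = \sum_{n\ge 0} A_n (Adomian polynomials)
% gives the term-by-term recursion
\[
  u_0 = \mathcal{L}^{-1}\!\Bigl[\frac{f(x)}{s} + \frac{1}{s}\,\mathcal{L}\{g\}\Bigr],
  \qquad
  u_{n+1} = -\,\mathcal{L}^{-1}\!\Bigl[\frac{1}{s}\,\mathcal{L}\{R(u_n) + A_n\}\Bigr].
\]
% The [L/M] Pad\'e approximant in t of the truncated sum \sum_{n=0}^{k} u_n
% is then used in place of the raw partial sum.
```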
Abstract: Let p be a prime. For any finite p-group G, the deep transfers T_(H,G'): H/H' → G'/G'' from the maximal subgroups H of index (G:H) = p in G to the derived subgroup G' are introduced as an innovative tool for identifying G uniquely by means of the family of kernels ϰ_d(G) = (ker(T_(H,G')))_((G:H)=p). For all finite 3-groups G of coclass cc(G) = 1, the family ϰ_d(G) is determined explicitly. The results are applied to the Galois groups G = Gal(F_3^(∞)/F) of the Hilbert 3-class towers of all real quadratic fields F = Q(√d) with fundamental discriminant d > 1, 3-class group Cl_3(F) ≅ C_3 × C_3, and total 3-principalization in each of their four unramified cyclic cubic extensions E/F. A systematic statistical evaluation is given for the complete range 1 < d < 10^7, and a few exceptional cases are pointed out for 1 < d < 10^8.
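For context, the deep transfer above is an instance of the general Artin transfer with source H and target K = G' (this is the standard textbook definition, not a restatement of the paper's results):

```latex
% Artin transfer from a group H to a subgroup K of finite index n,
% with left transversal h_1, ..., h_n of K in H: for h in H write
% h h_i = h_{\sigma(i)} k_i with k_i in K and a permutation \sigma; then
\[
  T_{H,K}\colon\; H/H' \longrightarrow K/K',
  \qquad
  h\,H' \;\longmapsto\; \Bigl(\prod_{i=1}^{n} k_i\Bigr) K' .
\]
% The deep transfer of the abstract takes K = G' (so the target is G'/G''),
% whereas the classical transfer kernel type uses the transfers
% T_{G,H}: G/G' -> H/H' from G to its maximal subgroups H.
```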
Abstract: Partial formalization, which involves the development of deductive connections among statements, can be used to examine assumptions, definitions, and related methodologies that are used in science. This approach has been applied to the study of nucleic acids recovered from natural microbial assemblages (NMA) by the use of bulk extraction. Six pools of bulk-extractable nucleic acids (BENA) are suggested to be present in an NMA: (pool 1) inactive microbes (abiotic-limited); (pool 2) inactive microbes (abiotic-permissive, biotic-limited); (pool 3) dormant microbes (abiotic-permissive, biotic-limited, but can become biotic-permissive); (pool 4) in situ active microbes (the microbial community); (pool 5) viruses (virocells/virions/cryptic viral genomes); and (pool 6) extracellular nucleic acids, including extracellular DNA (eDNA). Definitions for cells, the microbial community (in situ active cells), the rare biosphere, dormant cells (the microbial seed bank), viruses (virocells/virions/cryptic viral genomes), and diversity are presented, together with the methodology suggested to allow their study. The word diversity will require at least four definitions, each involving a different methodology. These suggested definitions and methodologies should make it possible to make further advances in bulk extraction-based molecular microbial ecology.
Abstract: The design of this paper is to present the first installment of a complete and final theory of rational human intelligence. The theory is mathematical in the strictest possible sense. The mathematics involved is strictly digital—not quantitative in the manner that what is usually thought of as mathematics is quantitative. It is anticipated at this time that the exclusively digital nature of rational human intelligence exhibits four flavors of digitality, apparently no more, and that each flavor will require a lengthy study in its own right.
Abstract: The 3D reconstruction pipeline uses the Bundle Adjustment algorithm to refine camera and point parameters. Bundle Adjustment is a compute-intensive algorithm, and many researchers have improved its performance by implementing it on GPUs. In the previous research work, “Improving Accuracy and Computational Burden of Bundle Adjustment Algorithm using GPUs,” the authors first demonstrated an algorithmic performance improvement, reducing the mean square error by adding a radial distortion parameter and using explicitly computed analytical derivatives, and then reduced the computational burden of the Bundle Adjustment algorithm using GPUs. With the naïve CUDA implementation, a speedup of 10× was achieved for the largest dataset of 13,678 cameras, 4,455,747 points, and 28,975,571 projections. In this paper, we present the optimization of the Bundle Adjustment CUDA code on GPUs to achieve a higher speedup. We propose a new data memory layout for the parameters in the Bundle Adjustment algorithm, resulting in contiguous memory access. We demonstrate that it improves memory throughput on the GPUs, thereby improving overall performance. We also demonstrate an increase in the computational throughput of the algorithm by optimizing the CUDA kernels to utilize GPU resources effectively. A comparative performance study of explicitly computing an algorithm parameter versus using the Jacobians instead is presented. In the previous work, the Bundle Adjustment algorithm failed to converge for certain datasets because several block matrices of the cameras in the augmented normal equation were rank-deficient. In this work, we identify the cameras that cause rank-deficient matrices and preprocess the datasets to ensure convergence of the BA algorithm. Our optimized CUDA implementation achieves convergence of the Bundle Adjustment algorithm in around 22 seconds for the largest dataset, compared to 654 seconds for the sequential implementation, a speedup of 30×. The optimized CUDA implementation presented in this paper achieves a 3× speedup for the largest dataset compared to the previous naïve CUDA implementation.
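As a rough illustration of the contiguous-layout idea (not the paper's actual CUDA code), the sketch below contrasts an interleaved array-of-structures camera block with a structure-of-arrays layout in NumPy; on a GPU, the latter lets consecutive threads read consecutive addresses (coalesced access). The 9-parameter camera model and all names are assumptions for illustration only.

```python
import numpy as np

# Hypothetical 9-parameter camera block (3 rotation, 3 translation,
# focal length, 2 radial distortion terms); names are illustrative.
PARAM_NAMES = ["rx", "ry", "rz", "tx", "ty", "tz", "f", "k1", "k2"]
n_cameras = 4
rng = np.random.default_rng(0)

# Array-of-structures layout: all 9 parameters of one camera sit together,
# so reading the same parameter across many cameras is a strided access.
aos = rng.standard_normal((n_cameras, len(PARAM_NAMES)))

# Structure-of-arrays layout: one contiguous array per parameter, so the
# same read becomes a contiguous scan (coalesced when consecutive GPU
# threads handle consecutive cameras).
soa = {name: np.ascontiguousarray(aos[:, i])
       for i, name in enumerate(PARAM_NAMES)}

# Both layouts hold identical values; only the memory ordering differs.
assert np.allclose(soa["f"], aos[:, 6])
```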
Abstract: This research aimed to clarify the role of by-product materials, such as cement kiln dust (CKD) with silica fume (SF), as partial replacements by weight of cement in concrete manufacturing, and their effect on different characteristics of concrete. Concrete test specimens were mixed with 0%, 5%, 10%, 15%, 20%, and 25% CKD together with 15% SF as partial replacement by weight of cement (CEM I-52.5N). Fresh concrete properties were evaluated by the slump (workability) test, while hardened concrete properties were evaluated by compressive, splitting tensile, and flexural strength tests at ages of 7, 28, and 56 days; bond strength, modulus of elasticity, and chemical composition (measured by X-ray fluorescence) were evaluated at an age of 28 days. The test results revealed that increasing the CKD content, with a fixed amount of SF, as partial replacement by weight of cement leads to a gradual decrease in fresh concrete workability. In concrete mixtures, 20% CKD in the presence of 15% SF as partial replacement by weight of cement are the optimum ratios, which can be used without any negative effect on the mechanical properties (compressive, indirect tensile, flexural, and bond strength) at all concrete ages. Also, the modulus of elasticity and bond strength increased by 8.81% and 0.69%, respectively, at the age of 28 days compared with the control mixture.
Abstract: In this paper, numerical solutions of the nonlinear partial differential equations known as the Benjamin-Bona-Mahony (BBM) and Cahn-Hilliard equations are presented using the Adomian Decomposition Method (ADM) and the Variational Iteration Method (VIM). The results reveal that the two methods are very effective, simple, and very close to the exact solution.
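For orientation (a generic sketch, not the paper's specific derivation), VIM builds successive approximations through a correction functional with a Lagrange multiplier; for the BBM equation u_t + u_x + u u_x − u_{xxt} = 0, a standard choice for a first-order-in-time operator is λ(τ) = −1, with restricted variations on the remaining terms:

```latex
% Generic VIM correction functional for the BBM equation
% u_t + u_x + u u_x - u_{xxt} = 0, with restricted variation \tilde{u}_n
% and Lagrange multiplier \lambda(\tau) = -1.
\[
  u_{n+1}(x,t) \;=\; u_n(x,t)
  \;-\; \int_0^{t} \left[
      \frac{\partial u_n}{\partial \tau}
      + \frac{\partial \tilde{u}_n}{\partial x}
      + \tilde{u}_n \frac{\partial \tilde{u}_n}{\partial x}
      - \frac{\partial^3 \tilde{u}_n}{\partial x^2\,\partial \tau}
    \right] d\tau ,
  \qquad u_0(x,t) = u(x,0).
\]
```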
Funding: This work was funded by the American University in Cairo research grants (Project number SSE-MENG-M.M.-FY18-FY19-FY20-RG(1-18)–2017-Nov-11-17-52-02).
Abstract: Biogas is a renewable and clean energy source that plays an important role in the current environment of low-carbon transition. If the high-content CO2 in biogas can be separated, transformed, and utilized, it not only realizes high-value utilization of biogas but also promotes carbon reduction in the biogas field. To improve the combustion stability of biogas, an inhomogeneous, partially premixed stratified (IPPS) combustion model was adopted in this study. The thermal flame structure and stability were investigated for a wide range of mixture inhomogeneities, turbulence levels, CO2 concentrations, air-to-fuel velocity ratios, and combustion energies in a concentric flow slot burner (CFSB). A fine-wire thermocouple was used to resolve the thermal flame structure. The flame size was reduced by increasing the CO2 concentration, and the flames became lighter blue. The flame temperature also decreased with increasing CO2 concentration. Flame stability was reduced by increasing the CO2 concentration. However, at a certain level of mixture inhomogeneity, the concentration of CO2 in the IPPS mode did not affect the stability. Accordingly, the IPPS mode of combustion should be suitable for the combustion and stabilization of biogas. This should support the design of highly stabilized biogas turbulent flames independent of CO2 concentration. The data show that the lower stability conditions are partially due to the change in fuel combustion energy, which is characterized by the Wobbe index (WI). In addition, at a certain level of mixture inhomogeneity, the effect of the WI on flame stability becomes dominant.
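For reference, the Wobbe index mentioned above relates a fuel's heating value to its density relative to air; the illustrative numbers below for a 60/40 CH4/CO2 biogas are approximate textbook values, not measurements from this study:

```latex
% Wobbe index: higher heating value over the square root of specific gravity
% (fuel density relative to air).
\[
  \mathrm{WI} \;=\; \frac{\mathrm{HHV}}{\sqrt{\rho_{\mathrm{fuel}}/\rho_{\mathrm{air}}}}
\]
% Illustrative, approximate values for a 60% CH4 / 40% CO2 biogas:
%   HHV ~ 0.6 * 39.8 ~ 23.9 MJ/m^3
%   SG  ~ (0.6*16 + 0.4*44)/28.97 ~ 0.94
%   WI  ~ 23.9 / sqrt(0.94) ~ 24.6 MJ/m^3   (pure CH4: roughly 53 MJ/m^3),
% so CO2 dilution lowers WI and hence the heat release per unit volume of mixture.
```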
Abstract: A photovoltaic (PV) string with multiple modules with bypass diodes, frequently deployed on a variety of autonomous PV systems, may present multiple power peaks under uneven shading. For optimal solar harvesting, there is a need for a control scheme to force the PV string to operate at the global maximum power point (GMPP). While many tracking methods have been proposed in the literature, they are usually complex and do not fully take advantage of the available characteristics of the PV array. This work highlights how the voltage at the operating point and the forward voltage of the bypass diode are used to design a global maximum power point tracking (GMPPT) algorithm with a very limited global search phase, called Fast GMPPT. This algorithm successfully tracks the GMPP between 94% and 98% of the time under a theoretical evaluation. It is then compared against Perturb and Observe, Deterministic Particle Swarm Optimization, and Grey Wolf Optimization under a sequence of irradiance steps as well as a power-versus-voltage characteristic profile that mimics the electrical characteristics of a PV string under varying partial shading conditions. Overall, the simulation with the sequence of irradiance steps shows that while Fast GMPPT does not have the best convergence time, it has an excellent convergence rate and causes the least power loss during the global search phase. Experimental tests under varying partial shading conditions show that while the GMPPT proposal is simple and lightweight, it performs well under a wide range of dynamically varying partial shading conditions and achieves the best energy efficiency (94.74%) of the four tested algorithms.
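As a rough sketch of the general idea (under partial shading, local power peaks cluster near voltages set by how many modules are producing, offset by bypass-diode drops), the hypothetical code below scans a handful of candidate voltages and then refines around the best one with a short perturb-and-observe loop. The module count, voltage constants, and the toy pv_power() curve are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Assumed string/module constants (illustrative, not from the paper).
N_MODULES = 6        # number of series modules in the string
V_MP_MODULE = 30.0   # per-module maximum-power voltage [V]
V_DIODE = 0.7        # bypass-diode forward drop [V]

def pv_power(v, v_oc=200.0):
    """Toy two-peak P-V curve standing in for a partially shaded string."""
    if v <= 0.0 or v >= v_oc:
        return 0.0
    # Two humps around plausible local maxima; purely illustrative numbers.
    return (900.0 * np.exp(-((v - 85.0) / 25.0) ** 2)
            + 1200.0 * np.exp(-((v - 170.0) / 20.0) ** 2))

def fast_gmppt_sketch(step=1.0, refine_iters=20):
    # Limited global search: candidate voltages assuming k modules produce
    # and the remaining (N_MODULES - k) are bypassed, each bypassed module
    # costing one diode forward drop.
    candidates = [k * V_MP_MODULE - (N_MODULES - k) * V_DIODE
                  for k in range(1, N_MODULES + 1)]
    v = max(candidates, key=pv_power)

    # Short perturb-and-observe refinement around the best candidate.
    p, direction = pv_power(v), 1.0
    for _ in range(refine_iters):
        v_next = v + direction * step
        p_next = pv_power(v_next)
        if p_next < p:
            direction = -direction   # power fell: reverse the perturbation
        v, p = v_next, p_next
    return v

print(f"operating voltage after refinement: {fast_gmppt_sketch():.1f} V")
```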
Abstract: BACKGROUND: The success of liver resection relies on the ability of the remnant liver to regenerate. Most of the knowledge regarding the pathophysiological basis of liver regeneration comes from rodent studies, and data on humans are scarce. Additionally, there is limited knowledge about the preoperative factors that influence postoperative regeneration. AIM: To quantify postoperative remnant liver volume with the latest volumetric software and investigate perioperative factors that affect posthepatectomy liver regeneration. METHODS: A total of 268 patients who received partial hepatectomy were enrolled. Patients were grouped into right hepatectomy/trisegmentectomy (RH/Tri), left hepatectomy (LH), segmentectomy (Seg), and subsegmentectomy/nonanatomical hepatectomy (Sub/Non) groups. The regeneration index (RI) and late regeneration rate were defined as (postoperative liver volume)/[total functional liver volume (TFLV)] × 100 and (RI at 6 months − RI at 3 months)/RI at 6 months, respectively. The lower 25th percentile of RI and the higher 25th percentile of late regeneration rate in each group were defined as “low regeneration” and “delayed regeneration”, respectively. “Restoration to the original size” was defined as regeneration of the liver volume to more than 90% of the TFLV at 12 months postsurgery. RESULTS: The numbers of patients in the RH/Tri, LH, Seg, and Sub/Non groups were 41, 53, 99, and 75, respectively. The RI plateaued at 3 months in the LH, Seg, and Sub/Non groups, whereas the RI increased until 12 months in the RH/Tri group. According to our multivariate analysis, the preoperative albumin-bilirubin (ALBI) score was an independent factor for low regeneration at 3 months [odds ratio (OR) (95%CI) = 2.80 (1.17-6.69), P = 0.02; per 1.0 up] and 12 months [OR = 2.27 (1.01-5.09), P = 0.04; per 1.0 up]. Multivariate analysis revealed that only the liver resection percentage [OR = 1.03 (1.00-1.05), P = 0.04] was associated with delayed regeneration. Furthermore, multivariate analysis demonstrated that the preoperative ALBI score [OR = 2.63 (1.00-1.05), P = 0.02; per 1.0 up] and liver resection percentage [OR = 1.02 (1.00-1.05), P = 0.04; per 1.0 up] were independent risk factors associated with volume restoration failure. CONCLUSION: Liver regeneration posthepatectomy was determined by the resection percentage and the preoperative ALBI score. This knowledge helps surgeons decide the timing and type of rehepatectomy for recurrent cases.
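Restating the two study metrics from the Methods in equation form (the same definitions as above, set out for readability):

```latex
% Regeneration index (RI), as a percentage of total functional liver volume (TFLV):
\[
  \mathrm{RI} \;=\; \frac{\text{postoperative liver volume}}{\mathrm{TFLV}} \times 100
\]
% Late regeneration rate, comparing RI at 6 and 3 months after surgery:
\[
  \text{late regeneration rate} \;=\;
  \frac{\mathrm{RI}_{6\,\mathrm{mo}} - \mathrm{RI}_{3\,\mathrm{mo}}}{\mathrm{RI}_{6\,\mathrm{mo}}}
\]
```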
Abstract: The main purpose of this paper is to obtain inference for the parameters of a heterogeneous population represented by a finite mixture of two Pareto (MTP) distributions of the second kind. Constant-partially accelerated life tests are applied based on progressively type-II censored samples. The maximum likelihood estimates (MLEs) of the considered parameters are obtained by solving the likelihood equations of the model parameters numerically. The Bayes estimators are obtained using a Markov chain Monte Carlo algorithm under the balanced squared error loss function. Based on Monte Carlo simulation, the Bayes estimators are compared with their corresponding maximum likelihood estimators. The two-sample prediction technique is considered to derive Bayesian prediction bounds for future order statistics based on progressively type-II censored informative samples obtained from constant-partially accelerated life testing models. The informative and future samples are assumed to be obtained from the same population. The coverage probabilities and average interval lengths of the confidence intervals are computed via Monte Carlo simulation to investigate the performance of the prediction intervals. Analysis of a simulated data set is also presented for illustrative purposes. Finally, comparisons are made between the Bayesian and maximum likelihood estimators via a Monte Carlo simulation study.
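For readers unfamiliar with the base model, one common parameterization of a two-component mixture of Pareto distributions of the second kind (Lomax) is sketched below; the paper's exact parameterization, acceleration factor, and censoring scheme are not reproduced here:

```latex
% Two-component Pareto (second kind / Lomax) mixture density with mixing
% proportion p; shape \alpha_j > 0 and scale \beta_j > 0 for each component.
\[
  f(x) \;=\; p\,\frac{\alpha_1}{\beta_1}\Bigl(1 + \frac{x}{\beta_1}\Bigr)^{-(\alpha_1+1)}
        \;+\; (1-p)\,\frac{\alpha_2}{\beta_2}\Bigl(1 + \frac{x}{\beta_2}\Bigr)^{-(\alpha_2+1)},
  \qquad x > 0,\; 0 < p < 1 .
\]
```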
Abstract: Aims: This study aims at designing and implementing a syllabus-oriented question-bank system that is capable of producing paper-based exams with multiple forms along with answer keys. The developed software tool is named Χ(Chi)-Pro Milestone and supports four types of questions, namely: multiple-choice, true/false, short-answer, and free-response essay questions. The study is motivated by the fact that the number of students in schools and universities is continuously growing at high, non-linear, and uncontrolled rates. This growth, however, is not accompanied by an equivalent growth of educational resources (mainly instructors, classrooms, and labs). A direct result of this situation is having a relatively large number of students in each classroom. It is observed that providing and using online examining systems could be intractable and expensive. As an alternative, paper-based exams can be used. One main issue is that manually produced paper-based exams are of low quality because of some human factors such as instability and a relatively narrow range of topics [1]. Further, it is observed that instructors usually need to spend a lot of time and energy in composing paper-based exams with multiple forms. Therefore, the use of computers for the automatic production of paper-based exams from question banks is becoming more and more important. Methodology: The design and evaluation of X-Pro Milestone are done by considering a basic set of design principles that are based on a list of identified functional and non-functional requirements. Deriving those requirements is made possible by developing X-Pro Milestone using the iterative and incremental model from the software engineering domain. Results: We demonstrate that X-Pro Milestone has a number of excellent characteristics compared to the exam-preparation and question-bank tools available in the market. Some of these characteristics are: ease of use and operation, a user-friendly interface and good usability, high security and protection of the question-bank items, high stability, and reliability. Further, X-Pro Milestone makes initiating, maintaining, and archiving question banks and produced exams possible. Putting X-Pro Milestone into real use has shown that it is easy to learn and use effectively. We demonstrate that X-Pro Milestone is a cost-effective alternative to online examining systems, with more and richer features and with low infrastructure requirements.