Objective: To analyze the impact of an integrated extended care model on improving the quality of life of elderly patients with Type 2 Diabetes Mellitus (T2DM). Methods: A total of 176 patients admitted to the hospital from March 2015 to February 2018 were selected and randomly assigned to an observation group and a control group, with 88 patients each. The control group received conventional nursing interventions, and the observation group received an integrated extended-care model. The level of glycemic control, quality of life, and daily medication adherence were compared between the two groups. Results: The observation group showed significant improvement in glycemic control: its fasting blood glucose, 2-hour postprandial blood glucose, and glycated hemoglobin levels were significantly lower than those in the control group (P < 0.05). The quality of life of the patients in the observation group was higher than that of the control group (P < 0.05). The observation group had a higher compliance score (95.48 ± 7.45) than the control group (81.31 ± 8.72) (t = 8.909, P < 0.05). Conclusion: The integrated extended care model allows patients to receive comprehensive, individualized nursing services after discharge, which improves the effect of drug therapy and the quality of life of patients.
We work within a Winterberg framework where space, i.e., the vacuum, consists of a two-component superfluid/supersolid made up of a vast assembly (sea) of positive and negative mass Planck particles, called planckions. These material particles interact indirectly and have very strong restoring forces keeping them a finite distance apart from each other within their respective species. Because of their mass-compensating effect, the vacuum appears massless, chargeless, without pressure, net energy density, or entropy. In addition, we consider two varying-G models, where G is Newton's constant and G<sup>-1</sup> increases with an increase in cosmological time. We argue that there are at least two competing models for the quantum vacuum within such a framework. The first follows a strict extension of Winterberg's model. This leads to nonsensical results if G increases going back in cosmological time, as the length scale inherent in such a model will not scale properly. The second model introduces a different length scale, which does scale properly, but keeps the mass of the Planck particle at ± the Planck mass. Moreover, we establish a connection between ordinary matter, dark matter, and dark energy, where all three mass densities within the Friedmann equation must be interpreted as residual vacuum energies, which only surface once aggregate matter has formed, at relatively low CMB temperatures. The symmetry of the vacuum will be shown to be broken, because of the different scaling laws, beginning with the formation of elementary particles. Much like waves on an ocean where positive and negative planckion mass densities effectively cancel each other out and form a zero vacuum energy density/zero vacuum pressure surface, these positive mass densities are very small perturbations (anomalies) about the mean. This greatly alleviates, i.e., minimizes, the cosmological constant problem, a long-standing problem associated with the vacuum.
BACKGROUND: Stroke has become one of the most serious life-threatening diseases due to its high morbidity, disability, recurrence, and mortality rates. AIM: To explore the intervention effect of a multi-disciplinary treatment (MDT) extended nursing model on the negative emotions and quality of life of young post-stroke patients. METHODS: A total of 60 young stroke patients who were hospitalized in the neurology department of our hospital from January 2020 to December 2021 were selected and randomly divided into a control group and an experimental group, with 30 patients in each group. The control group used the conventional care model and the experimental group used the MDT extended nursing model. After the in-hospital and 3-mo post-discharge interventions, the differences in negative emotion and quality of life scores between the two groups were evaluated and analyzed at admission, at discharge, and after discharge. RESULTS: There were no statistically significant differences in the negative emotion scores between the two groups at admission, while within each group the differences in negative emotion scores between admission and discharge, between discharge and post-discharge, and between admission and post-discharge were statistically significant. In addition, the between-group differences in negative emotion scores were statistically significant both at discharge and after discharge. There was no statistically significant difference in quality of life scores between the two groups at admission, and within each group the differences in quality of life scores between admission and discharge, between discharge and post-discharge, and between admission and post-discharge were statistically significant. CONCLUSION: The MDT extended nursing model can reduce patients' negative emotions and improve their quality of life. Therefore, it can be applied in future clinical practice and is worthy of promotion.
Exploring the role of entanglement in quantum nonequilibrium dynamics is important for understanding the mechanism of thermalization in an isolated system. We study the relaxation dynamics in a one-dimensional extended Bose–Hubbard model after a global interaction quench by considering several observables: the local boson numbers, the nonlocal entanglement entropy, and the momentum distribution functions. We calculate the thermalization fidelity for different quench parameters and different sizes of subsystems, and the results show that the degree of thermalization is affected by the distance from the integrable point and the size of the subsystem. We employ the Pearson coefficient as the measurement of the correlation between the entanglement entropy and thermalization fidelity, and a strong correlation is demonstrated for the quenched system.
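As a concrete illustration of the correlation measure mentioned above, here is a minimal Python sketch of the Pearson coefficient applied to two observable series; the sample entropy and fidelity values below are hypothetical, not data from the study.

```python
# Pearson correlation coefficient between two observable series,
# e.g. entanglement entropy S and thermalization fidelity F.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical entropy S and fidelity F values at several quench strengths
S = [0.8, 1.1, 1.5, 1.9, 2.4]
F = [0.55, 0.62, 0.74, 0.81, 0.93]
print(round(pearson(S, F), 3))  # strong positive correlation, close to 1
```

A value near +1 or -1 indicates a strong linear relationship between the two series, which is the sense in which the abstract reports a "strong correlation".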
Adaptive fractional polynomial modeling of general correlated outcomes is formulated to address nonlinearity in means, variances/dispersions, and correlations. Means and variances/dispersions are modeled using generalized linear models in fixed effects/coefficients. Correlations are modeled using random effects/coefficients. Nonlinearity is addressed using power transforms of primary (untransformed) predictors. Parameter estimation is based on extended linear mixed modeling generalizing both generalized estimating equations and linear mixed modeling. Models are evaluated using likelihood cross-validation (LCV) scores and are generated adaptively using a heuristic search controlled by LCV scores. Cases covered include linear, Poisson, logistic, exponential, and discrete regression of correlated continuous, count/rate, dichotomous, positive continuous, and discrete numeric outcomes treated as normally, Poisson, Bernoulli, exponentially, and discrete numerically distributed, respectively. Example analyses are also generated for these five cases to compare adaptive random effects/coefficients modeling of correlated outcomes to previously developed adaptive modeling based on directly specified covariance structures. Adaptive random effects/coefficients modeling substantially outperforms direct covariance modeling in the linear, exponential, and discrete regression example analyses. It generates equivalent results in the logistic regression example analyses and it is substantially outperformed in the Poisson regression case. Random effects/coefficients modeling of correlated outcomes can provide substantial improvements in model selection compared to directly specified covariance modeling. However, directly specified covariance modeling can generate competitive or substantially better results in some cases while usually requiring less computation time.
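To illustrate the power-transform idea behind fractional polynomial modeling, here is a hedged Python sketch that selects among candidate fractional powers of a predictor by least-squares fit quality; the authors' actual heuristic search instead scores full candidate models with likelihood cross-validation (LCV), and the data below are hypothetical.

```python
# Try fractional power transforms x**p of a primary predictor and keep the
# power giving the best simple linear fit (a stand-in for LCV-based scoring).
def sse_linear_fit(x, y):
    """Sum of squared errors of the ordinary least-squares line y = a0 + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    a0 = my - b * mx
    return sum((c - (a0 + b * a)) ** 2 for a, c in zip(x, y))

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 3.9, 9.2, 15.8, 25.3]   # hypothetical, roughly quadratic in x
powers = [-1.0, -0.5, 0.5, 1.0, 2.0]  # typical fractional-polynomial candidates
best = min(powers, key=lambda p: sse_linear_fit([x ** p for x in xs], ys))
print(best)  # prints 2.0: the quadratic transform fits this data best
```

The real method searches over transforms of several predictors at once and evaluates each candidate with LCV rather than in-sample error, but the selection loop has this shape.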
With the increasing popularity and complexity of Web applications and the emergence of their new characteristics, the testing and maintenance of large, complex Web applications are becoming more complex and difficult. Web applications generally contain many pages and are used by enormous numbers of users. Statistical testing is an effective way of ensuring their quality. Web usage can be accurately described by a Markov chain, which has been proved to be an ideal model for software statistical testing. The results of unit testing can be utilized in the later stages, which is an important strategy for bottom-up integration testing; the other improvement of the extended Markov chain model (EMM) is to present the error-type vector, which is treated as a part of the page node. This paper also proposes an algorithm for generating test cases of usage paths. Finally, optional usage reliability evaluation methods and an incremental usability regression testing model for testing and evaluation are presented. Keywords: statistical testing; evaluation for Web usability; extended Markov chain model (EMM); Web log mining; reliability evaluation. CLC number: TP311.5. Foundation item: Supported by the National Defence Research Project (No. 41315.9.2) and the National Science and Technology Plan (2001BA102A04-02-03). Biography: MAO Cheng-ying (1978-), male, Ph.D. candidate; research directions: software testing, advanced database system, component technology, and data mining.
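The usage-path generation step can be sketched as a random walk over a Markov usage model; this is an assumed minimal structure, not the paper's EMM implementation, and the page names and transition probabilities are hypothetical (in practice they would be mined from Web logs).

```python
# Generate a usage-path test case from a Markov usage model: states are
# pages, transition probabilities come from Web-log mining.
import random

def generate_path(transitions, start, end, rng):
    """Random walk from `start` until `end` following transition probabilities."""
    path = [start]
    state = start
    while state != end:
        pages, probs = zip(*transitions[state].items())
        state = rng.choices(pages, weights=probs)[0]
        path.append(state)
    return path

# Hypothetical usage model: each page maps to {next page: probability}
model = {
    "Home":   {"Search": 0.7, "Exit": 0.3},
    "Search": {"Detail": 0.6, "Home": 0.2, "Exit": 0.2},
    "Detail": {"Search": 0.5, "Exit": 0.5},
}
rng = random.Random(42)
case = generate_path(model, "Home", "Exit", rng)
print(case)  # one statistically sampled usage path
```

Sampling many such paths yields a test suite whose page-visit frequencies match observed usage, which is the core idea of statistical Web testing.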
Accurate identification of influential nodes facilitates the control of rumor propagation and interrupts the spread of computer viruses. Many classical approaches have been proposed by researchers regarding different aspects. To explore the impact of location information in depth, this paper proposes an improved global structure model to characterize the influence of nodes. The method considers both the node's self-information and the role of the location information of neighboring nodes. First, the degree centrality of each node is calculated; then the degree value of each node is used to represent self-influence, and the degree values of the neighbor-layer nodes are divided by a power of the path length, a path attenuation used to represent global influence. Finally, an extended improved global structure model that considers the nearest-neighbor information after combining self-influence and global influence is proposed to identify influential nodes. In this paper, the propagation process of a real network is obtained by simulation with the SIR model, and the effectiveness of the proposed method is verified in terms of both discrimination and accuracy. The experimental results show that the proposed method is more accurate in identifying influential nodes than other comparative methods on multiple networks.
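A hedged sketch of the scoring scheme described above (the paper's exact weighting and layer handling may differ): each node's score combines its own degree (self-influence) with neighbor-layer degrees attenuated by a power of the shortest-path length (global influence).

```python
# Score nodes by degree plus path-attenuated neighbor-layer degrees.
from collections import deque

def influence(adj, node, max_layer=3, alpha=2.0):
    """Self-influence (degree) plus neighbor degrees damped by dist**alpha."""
    deg = {v: len(adj[v]) for v in adj}
    # BFS to find the shortest-path layer of every reachable node
    dist = {node: 0}
    q = deque([node])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    score = deg[node]  # self-influence
    for v, d in dist.items():
        if 1 <= d <= max_layer:
            score += deg[v] / d ** alpha  # path attenuation
    return score

# Toy graph: a hub (node 0) connected to 1, 2, 3; node 3 connected to 4
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}
scores = {v: influence(adj, v) for v in adj}
best = max(scores, key=scores.get)
print(best)  # the hub node ranks highest
```

On this toy graph the hub both has the highest degree and sits closest to all other nodes, so both terms favor it; the interesting cases in the paper are networks where the two terms disagree.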
Phosphorus is one of the most important nutrients required to support various kinds of biodegradation processes. As this particular nutrient is not included in the activated sludge model no. 1 (ASM1), this study extended this model in order to determine the fate of phosphorus during the biodegradation processes. When some of the kinetics parameters are modified using observed data from the restoration project of the Xuxi River in Wuxi City, China, from August 25 to 31 in 2009, the extended model shows excellent results. In order to obtain optimum values of coefficients of nitrogen and phosphorus, the mass fraction method was used to ensure that the final results were reasonable and practically relevant. The temporal distribution of the data calculated with the extended ASM1 approximates that of the observed data.
In this paper the extended modelling method with serial sands is used in an experimental study of the erosion patterns at the discharge outlet of a beachside Hua-Neng power plant. The theoretical basis for the extended modelling method with serial sands is systematically presented, and the method has been successfully employed in sediment experiments for coastal works. According to the Froude law, the model is designed as a normal one with a movable bed, with the geometric scale lambda(L) = lambda(H) = 15, and three sediment grain-size scales are chosen, i.e., lambda(d1) = 0.207, lambda(d2) = 0.393, and lambda(d3) = 0.656. The median particle diameter of the prototype beach sand is d(50p) = 0.059 mm and the discharged water flow of the power plant is 21.7 m<sup>3</sup>/s. Three types of natural sea sand have been chosen as the serial modelling sands to extend the simulation of the prototype, thus replacing the conventional test in which artificial lightweight sands are used. As a result, this method not only reduces the cost significantly, but is also an advanced technique that is easy to use. Upon a series of tests, satisfactory results have been obtained.
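Under standard Froude similarity (implied by the abstract but not spelled out there), velocity scales as the square root of the length scale and discharge as the length scale to the 2.5 power, so the quoted prototype flow of 21.7 m<sup>3</sup>/s maps to a small model discharge:

```python
# Froude-law scale relations for a normal (undistorted) model with
# lambda_L = lambda_H = 15, applied to the stated prototype discharge.
import math

lam_L = 15.0                 # geometric (length) scale, from the abstract
lam_V = math.sqrt(lam_L)     # velocity scale under Froude similarity
lam_Q = lam_L ** 2.5         # discharge scale = area scale * velocity scale
Q_prototype = 21.7           # prototype discharge, m^3/s (from the abstract)
Q_model = Q_prototype / lam_Q
print(round(lam_V, 3), round(Q_model, 4))  # ~3.873 and ~0.0249 m^3/s
```

The derived model discharge is an illustration of the scaling arithmetic only; the abstract does not report the model-side flow value.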
In the framework of nonperturbative quantum field theory, the critical phenomena of the one-dimensional extended Hubbard model (EHM) at half-filling are discussed from weak to intermediate interactions. After the EHM is mapped into two decoupled sine-Gordon models, the ground-state phase diagram of the system is derived in an explicit way. It is confirmed that coexisting phases appear in different interaction regimes, which cannot be found by conventional theoretical methods. The diagram shows that there are seven different phase regions in the ground state, which seems not to be the same as in previous discussions, especially the boundary between the phase separation and condensed phase regions. The phase transition properties of the model between the various phase regions are studied in detail.
We systematically study an extended Bose-Hubbard model on the triangular lattice by means of a mean-field method based on the Gutzwiller ansatz. Pair hopping terms are explicitly included and a three-body constraint is applied. The zero-temperature phase diagram and a variety of quantum phase transitions are investigated in great detail. In particular, we show the existence and the stability of the pair supersolid phase.
In this study, the performance of the extended shallow water model (ESWM) in evaluating the flow regime of turbidity currents entering the Dez Reservoir was investigated. The continuity equations for fluid and particles and the Navier-Stokes equations govern the entire flow of turbidity currents. The shallow water equations governing the flow of the depositing phase of turbidity currents are derived from these equations. A case study was conducted on the flow regime of turbidity currents entering the Dez Reservoir in Iran from January 2002 to July 2003. Facing a serious sedimentation problem, the dead storage of the Dez Reservoir will be full in the coming 10 years, and the inflowing water in the hydropower conduit system is now becoming turbid. Based on the values of the dimensionless friction number (N<sub>f</sub> ≤ 1) and the dimensionless entrainment number (N<sub>E</sub> ≤ 1) of turbidity currents, and the coefficient of determination between the observed and predicted deposit depths (R<sup>2</sup> = 0.86) for the flow regime of negligible friction and negligible entrainment (NFNE), the flow regime of turbidity currents coming into the Dez Reservoir is considered to be NFNE. The results suggest that the ESWM is an appropriate approach for evaluating the flow regime of turbidity currents in dam reservoirs where the characteristics of turbidity currents, such as the deposit depth, must be evaluated.
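For reference, the coefficient of determination quoted above (R<sup>2</sup> = 0.86 between observed and predicted deposit depths) is computed as follows; this is a generic sketch with hypothetical deposit-depth values, not the study's data.

```python
# Coefficient of determination R^2 = 1 - SS_res / SS_tot between an
# observed series and a model's predictions.
def r_squared(observed, predicted):
    m = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - m) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Hypothetical observed vs. predicted deposit depths (m)
obs = [0.10, 0.22, 0.35, 0.41, 0.58]
pred = [0.12, 0.20, 0.33, 0.45, 0.55]
print(round(r_squared(obs, pred), 3))
```

R<sup>2</sup> near 1 means the model explains most of the variance in the observations; the study's value of 0.86 is what supports classifying the flow regime as NFNE.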
By using the bosonization and renormalization group methods, we have studied the low-energy physical properties of the one-dimensional extended Hubbard model. The formation of charge and spin gaps is investigated at the half-filled electron band. An analytical expression for the charge gap in terms of the Coulomb repulsive interaction strength U and the nearest-neighbour interaction parameter V is obtained.
We propose a generalization of Einstein's special theory of relativity (STR). In our model, we use the (1 + 4)-dimensional space G, which is the extension of the (1 + 3)-dimensional Minkowski space M. As a fifth additional coordinate, the interval S is used. This value is constant under the usual Lorentz transformations in M, but it changes when the transformations in the extended space G are used. We call this model the Extended space model (ESM). From a physical point of view, our expansion means that processes in which the rest mass of the particles changes are now acceptable. In the ESM, gravity and electromagnetism are combined in one field. In the ESM, a photon can have a nonzero mass, and this mass can be either positive or negative. It is also possible, within the framework of the ESM, to establish a connection between the mass of a particle and its size.
We present a new interpretation of the Higgs field as a composite particle made up of a positive and a negative mass Planck particle. According to the Winterberg hypothesis, space, i.e., the vacuum, consists of both positive and negative physical massive particles, which he called planckions, interacting through strong superfluid forces. In our composite model for the Higgs boson, there is an intrinsic length scale associated with the vacuum, different from the one introduced by Winterberg, where, when the vacuum is in a perfectly balanced state, the number density of positive Planck particles equals the number density of negative Planck particles. Due to the mass-compensating effect, the vacuum thus appears massless, chargeless, without pressure, energy density, or entropy. However, a situation can arise where there is an effective mass density imbalance due to the two species of Planck particle not matching in terms of populations within their respective excited energy states. This does not require the physical addition or removal of either positive or negative Planck particles within a given region of space, as originally thought. Ordinary matter, dark matter, and dark energy can thus be given a new interpretation as residual vacuum energies within the context of a greater vacuum, where the populations of the positive and negative energy states exactly balance. In the present epoch, it is estimated that the dark energy number density imbalance amounts to, , per cubic meter, when cosmic distance scales in excess of 100 Mpc are considered. Compared to a strictly balanced vacuum, where we estimate that the positive, and the negative, Planck number density is of the order of 7.85E54 particles per cubic meter, the above is a very small perturbation. This slight imbalance, we argue, would dramatically alleviate, if not altogether eliminate, the long-standing cosmological constant problem.
A self-consistent fluid model is developed to investigate the radial distributions of dusty plasma parameters in a DC glow discharge, in which the extended fluid approach of plasma particles and the transport equations of dust particles are coupled. The electrical interaction between charged dust particles is considered in the model. The time evolution of the radial distributions of dust density, plasma density, the radial component of the electric field, and the forces acting on dust particles when the dust density tends to be stable are obtained and analyzed under different discharge currents and dust particle radii. It is shown that the dust density structure is determined mainly by the radial electrostatic force, the thermophoretic force, and the ion drag force in the discharge tube, and that both the discharge current and the dust particle radius have an obvious effect on the transport processes of dust particles. The dust particles gather in the central region of the discharge tube for low discharge current and small dust radius; dust voids then form and become wider as the discharge current and dust radius increase. The plasma parameters in the dust-gathering region are obviously affected by the dust particles due to the charging processes of electrons and ions at the dust surface.
We have shown that the classic works of Modigliani and Miller, Black and Scholes, Merton, Black and Cox, and Leland, which form the foundation of modern asset pricing theory, are wrong due to a misinterpretation of no arbitrage as the martingale no-arbitrage principle. This error explains the appearance of the geometric Brownian motion model (GBM) for describing the firm value and other long-term assets, which treats the firm and its assets as self-financing portfolios with symmetric return distributions. This contradicts the empirical observations that returns on firms, stocks, and bonds are skewed. On the other hand, settings of the asset valuation problem that take into account the default line and business securing expenses (BSEs) generate skewed return distributions for the firm and its securities. The Extended Merton model (EMM), taking into account BSEs and the default line, shows that the no-arbitrage principle should be understood as non-martingale no arbitrage, where for sufficiently long periods both the predictable part of returns and the mean of the stochastic part of returns become negative, and the value of the return deficit depends on time and the states of the firm and market. The EMM findings explain the problems with the S&P 500 VIX, the strange behavior of the variance and skewness of stock returns before and after the crisis of 1987, etc.
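To make the criticized GBM assumption concrete, here is an illustrative Python sketch simulating geometric Brownian motion for a firm value; its log-returns are normally distributed and hence symmetric, which is exactly the property the abstract argues is contradicted by skewed empirical returns. All parameter values are hypothetical.

```python
# Geometric Brownian motion: dV = mu*V*dt + sigma*V*dW, simulated via the
# exact log-normal update V_{t+dt} = V_t * exp((mu - sigma^2/2)dt + sigma*sqrt(dt)*Z).
import math
import random

def gbm_path(v0, mu, sigma, dt, steps, rng):
    """Simulate one GBM path of `steps` increments starting at v0."""
    v = v0
    path = [v0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        v *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
        path.append(v)
    return path

rng = random.Random(1)
# Hypothetical parameters: one year of daily steps
path = gbm_path(100.0, mu=0.05, sigma=0.2, dt=1 / 252, steps=252, rng=rng)
log_returns = [math.log(b / a) for a, b in zip(path, path[1:])]
print(len(log_returns), path[0])
```

Because each log-return is a draw from the same normal distribution, their population skewness is zero by construction; the EMM's default line and BSEs are what break this symmetry in the paper's framework.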
The bogie is a pivotal system and plays a critical part in the safety and reliability management of high-speed rail. However, the available bogie system reliability analysis methods lack consideration of multi-state characteristics, and the common multi-state reliability analysis methods, posing an NP-hard problem, lead to low efficiency. In order to overcome these drawbacks, this paper proposes a novel multi-state rail train bogie system reliability analysis approach based on the extended d-MC model. Three different function interactions within the bogie system are considered to build the multi-state bogie system flow network model. Meanwhile, an extended d-MC model is established to remove unnecessary d-MC candidates and duplicates, which greatly enhances the computing efficiency. The bogie system reliability is calculated, and examples are provided. Numerical experiments are carried out for different operational conditions of the bogie system and are used to test the practicability of the method proposed in this article; it is found that this method outperforms a newly developed method in solving multi-state reliability problems.
After probability theory, fuzzy set theory, and evidence theory, rough set theory is a new mathematical tool for dealing with vague, imprecise, inconsistent, and uncertain knowledge. In recent years, research on and applications of rough set theory have attracted more and more researchers' attention, and it has become one of the hot issues in the artificial intelligence field. In this paper, the basic concepts, operations, and characteristics of rough set theory are introduced first, and then the extensions of the rough set model, the situation of their applications, some application software, and the key problems in applied research on rough set theory are presented.
Shift-share analysis has been confirmed to be a useful approach in the study of regional economics, and many kinds of extended shift-share models have been advanced and put into practice in economic studies, but few have hitherto been introduced and applied to tourism research in China. Moreover, understanding the spatially competitive relationship is of paramount importance for marketers, developers, and planners involved in tourism strategy development. Based on international tourism receipts from 1995 to 2004, this study aims at probing into the spatial competitiveness of international tourism in Jiangsu Province in comparison with its neighbors by applying a spatially extended shift-share model and a modified dynamic shift-share model. The empirical results illustrate that exceptional years may exist in the application of dynamic shift-share models. To solve this issue, modifications to the dynamic shift-share model are put forward. The analytical results are not only presented but also explained by comparing the background conditions of tourism development between Jiangsu and its key competitors. The conclusion can be drawn that the growth of international tourism receipts in Jiangsu is mainly attributable to the national component and the competitive component, and that Zhejiang was the most important rival to Jiangsu during the period of 1995-2004. In order to upgrade its tourism competitiveness, it is indispensable for Jiangsu to adopt proper positioning, promotion, and marketing strategies and to cooperate and integrate with its main rivals.
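The classic shift-share decomposition underlying this kind of analysis can be sketched as follows for a single sector, where the industry-mix term drops out; the spatially extended model replaces the national reference rate with neighbor-based rates, which is omitted here, and the figures are hypothetical rather than Jiangsu's actual receipts.

```python
# Single-sector shift-share: regional growth = national component
# (growth at the nation's rate) + competitive component (the shift).
def shift_share(region_t0, region_t1, nation_t0, nation_t1):
    g_nation = (nation_t1 - nation_t0) / nation_t0
    g_region = (region_t1 - region_t0) / region_t0
    national = region_t0 * g_nation                   # national component
    competitive = region_t0 * (g_region - g_nation)   # competitive shift
    return national, competitive

# Hypothetical tourism receipts (arbitrary units) at two time points
ns, cp = shift_share(region_t0=50.0, region_t1=80.0,
                     nation_t0=1000.0, nation_t1=1400.0)
print(round(ns, 2), round(cp, 2))  # the two components sum to total growth
assert abs((ns + cp) - 30.0) < 1e-9
```

A positive competitive component, as in this toy example, is the signature of a region outgrowing the national trend, which is how the study attributes Jiangsu's growth to the national and competitive components.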
基金Funding: Supported by the Joint Guidance Project of the Qiqihar Science and Technology Plan in 2020, No. LHYD-202054.
文摘Abstract: BACKGROUND Stroke has become one of the most serious life-threatening diseases due to its high morbidity, disability, recurrence, and mortality rates. AIM To explore the intervention effect of a multi-disciplinary treatment (MDT) extended nursing model on the negative emotions and quality of life of young post-stroke patients. METHODS A total of 60 young stroke patients who were hospitalized in the neurology department of our hospital from January 2020 to December 2021 were selected and randomly divided into a control group and an experimental group, with 30 patients in each group. The control group used the conventional care model and the experimental group used the MDT extended nursing model. After the in-hospital and 3-mo post-discharge interventions, the differences in negative emotion and quality of life scores between the two groups were evaluated and analyzed at admission, at discharge, and after discharge, respectively. RESULTS There were no statistically significant differences in negative emotion scores between the two groups at admission, while the within-group differences in negative emotion scores between admission and discharge, between discharge and post-discharge, and between admission and post-discharge were statistically significant. In addition, the between-group differences in negative emotion scores at discharge and after discharge were both statistically significant. There was no statistically significant difference in quality of life scores between the two groups at admission, and the within-group differences in quality of life scores between admission and discharge, between discharge and post-discharge, and between admission and post-discharge were statistically significant. CONCLUSION The MDT extended nursing model can reduce patients' negative emotions and improve their quality of life. Therefore, it can be applied in future clinical practice and is worthy of promotion.
基金Funding: Supported by the National Natural Science Foundation of China (Grant No. 11147110) and the Natural Science Youth Foundation of Shanxi Province, China (Grant No. 2011021003).
文摘Abstract: Exploring the role of entanglement in quantum nonequilibrium dynamics is important for understanding the mechanism of thermalization in an isolated system. We study the relaxation dynamics in a one-dimensional extended Bose–Hubbard model after a global interaction quench by considering several observables: the local boson numbers, the nonlocal entanglement entropy, and the momentum distribution functions. We calculate the thermalization fidelity for different quench parameters and different sizes of subsystems, and the results show that the degree of thermalization is affected by the distance from the integrable point and by the size of the subsystem. We employ the Pearson coefficient as the measure of the correlation between the entanglement entropy and the thermalization fidelity, and a strong correlation is demonstrated for the quenched system.
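The Pearson coefficient mentioned above is straightforward to compute for any pair of equal-length series. The sketch below uses invented numbers standing in for entanglement entropy and thermalization fidelity, purely to illustrate the measure; none of the values come from the paper.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

# Invented series sampled over a range of quench parameters (illustrative only)
entropy  = [0.2, 0.5, 0.9, 1.4, 1.8]
fidelity = [0.30, 0.55, 0.80, 0.92, 0.97]
r = pearson(entropy, fidelity)   # close to +1 indicates strong correlation
```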
文摘Adaptive fractional polynomial modeling of general correlated outcomes is formulated to address nonlinearity in means, variances/dispersions, and correlations. Means and variances/dispersions are modeled using generalized linear models in fixed effects/coefficients. Correlations are modeled using random effects/coefficients. Nonlinearity is addressed using power transforms of primary (untransformed) predictors. Parameter estimation is based on extended linear mixed modeling generalizing both generalized estimating equations and linear mixed modeling. Models are evaluated using likelihood cross-validation (LCV) scores and are generated adaptively using a heuristic search controlled by LCV scores. Cases covered include linear, Poisson, logistic, exponential, and discrete regression of correlated continuous, count/rate, dichotomous, positive continuous, and discrete numeric outcomes treated as normally, Poisson, Bernoulli, exponentially, and discrete numerically distributed, respectively. Example analyses are also generated for these five cases to compare adaptive random effects/coefficients modeling of correlated outcomes to previously developed adaptive modeling based on directly specified covariance structures. Adaptive random effects/coefficients modeling substantially outperforms direct covariance modeling in the linear, exponential, and discrete regression example analyses. It generates equivalent results in the logistic regression example analyses and it is substantially outperformed in the Poisson regression case. Random effects/coefficients modeling of correlated outcomes can provide substantial improvements in model selection compared to directly specified covariance modeling. However, directly specified covariance modeling can generate competitive or substantially better results in some cases while usually requiring less computation time.
文摘Abstract: With the increasing popularity and complexity of Web applications and the emergence of their new characteristics, the testing and maintenance of large, complex Web applications are becoming more difficult. Web applications generally contain many pages and are used by enormous numbers of users. Statistical testing is an effective way of ensuring their quality. Web usage can be accurately described by a Markov chain, which has been proved to be an ideal model for software statistical testing. The results of unit testing can be utilized in the later stages, which is an important strategy for bottom-up integration testing; the other improvement of the extended Markov chain model (EMM) is the error-type vector, which is treated as part of the page node. This paper also proposes an algorithm for generating test cases of usage paths. Finally, optional usage reliability evaluation methods and an incremental usability regression testing model for testing and evaluation are presented. (Supported by the National Defence Research Project (No. 41315.9.2) and the National Science and Technology Plan (2001BA102A04-02-03).)
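An algorithm for generating usage-path test cases from a Markov usage model can be sketched as a random walk over the page-transition probabilities. The page names and probabilities below are invented for illustration; a real usage model would be estimated from Web logs, as the abstract describes.

```python
import random

# Hypothetical page-level usage model: transition probabilities would be
# estimated from Web logs; these page names and numbers are invented.
usage_model = {
    "home":    [("search", 0.6), ("login", 0.4)],
    "search":  [("results", 1.0)],
    "login":   [("home", 0.5), ("search", 0.5)],
    "results": [("home", 0.3), ("EXIT", 0.7)],
}

def generate_usage_path(model, start="home", max_len=20, rng=random):
    """Random walk over the usage Markov chain, yielding one test case."""
    path, state = [start], start
    while state != "EXIT" and len(path) < max_len:
        r, acc = rng.random(), 0.0
        for nxt, p in model[state]:
            acc += p
            if r <= acc:
                break
        state = nxt          # falls back to the last branch on rounding gaps
        path.append(state)
    return path

random.seed(0)
case = generate_usage_path(usage_model)  # one usage path, e.g. home -> ... -> EXIT
```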
基金Funding: Supported by the National Natural Science Foundation of China (Grant No. 11975307).
文摘Abstract: Accurate identification of influential nodes facilitates the control of rumor propagation and interrupts the spread of computer viruses. Many classical approaches have been proposed by researchers regarding different aspects. To explore the impact of location information in depth, this paper proposes an improved global structure model to characterize the influence of nodes. The method considers both a node's self-information and the role of the location information of neighboring nodes. First, the degree centrality of each node is calculated; the degree value of each node is used to represent self-influence, and the degree values of the nodes in each neighbor layer, divided by a power of the path length (path attenuation), are used to represent global influence. Finally, an extended improved global structure model that considers nearest-neighbor information, combining self-influence and global influence, is proposed to identify influential nodes. In this paper, the propagation process on real networks is obtained by simulation with the SIR model, and the effectiveness of the proposed method is verified in terms of both discrimination and accuracy. The experimental results show that the proposed method identifies influential nodes more accurately than the comparative methods across multiple networks.
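A minimal sketch of the degree-plus-path-attenuation idea described above: each node's score is its own degree (self-influence) plus the degrees of nodes in successive neighbor layers, damped by a power of the path length (global influence). The attenuation exponent, layer cutoff, and toy graph are illustrative assumptions, not the paper's exact formulation.

```python
def influence_scores(adj, attenuation=2.0, max_layer=3):
    """Score = own degree + sum over neighbor layers of deg(w) / dist**attenuation.
    adj maps each node to its list of neighbors (undirected graph)."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    scores = {}
    for src in adj:
        score, seen, frontier = float(deg[src]), {src}, [src]
        for dist in range(1, max_layer + 1):
            nxt = []
            for u in frontier:
                for w in adj[u]:
                    if w not in seen:          # count each node once, at its BFS layer
                        seen.add(w)
                        nxt.append(w)
                        score += deg[w] / dist**attenuation
            frontier = nxt
        scores[src] = score
    return scores

# Small illustrative graph: a hub (node 0) with three leaves and a short tail.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}
scores = influence_scores(adj)
best = max(scores, key=scores.get)   # the hub should rank highest
```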
文摘Abstract: Phosphorus is one of the most important nutrients required to support various kinds of biodegradation processes. As this particular nutrient is not included in Activated Sludge Model No. 1 (ASM1), this study extended the model in order to determine the fate of phosphorus during biodegradation. When some of the kinetic parameters are modified using data observed during the restoration project of the Xuxi River in Wuxi City, China, from August 25 to 31, 2009, the extended model shows excellent results. In order to obtain optimum values of the nitrogen and phosphorus coefficients, the mass fraction method was used to ensure that the final results were reasonable and practically relevant. The temporal distribution of the data calculated with the extended ASM1 approximates that of the observed data.
文摘Abstract: In this paper, the extended modelling method with serial sands is used in an experimental study of the erosion patterns at the discharge outlet of a beachfront Hua-Neng power plant. The theoretical basis of the extended modelling method with serial sands is systematically presented, and the method has been successfully employed in sediment experiments for coastal works. According to the Froude law, the model is designed as a normal one with a movable bed, with geometric scale lambda(L) = lambda(H) = 15, and three scales of sediment grain size are chosen: lambda(d1) = 0.207, lambda(d2) = 0.393, and lambda(d3) = 0.656. The median particle diameter of the prototype beach sand is d(50p) = 0.059 mm and the discharged water flow of the power plant is 21.7 m<sup>3</sup>/s. Three types of natural sea sand have been chosen as the serial modelling sands to extend the simulation of the prototype, replacing the conventional test in which artificial lightweight sands are used. As a result, this method not only reduces the cost significantly but is also an advanced technique that is easy to use. After a series of tests, satisfactory results have been obtained.
文摘Abstract: In the framework of nonperturbative quantum field theory, the critical phenomena of the one-dimensional extended Hubbard model (EHM) at half-filling are discussed from weak to intermediate interactions. After the EHM is mapped into two decoupled sine-Gordon models, the ground-state phase diagram of the system is derived in an explicit way. It is confirmed that coexisting phases appear in different interaction regimes, which cannot be found by conventional theoretical methods. The diagram shows that there are seven different phase regions in the ground state, which seems not to be the same as in previous discussions, especially the boundary between the phase separation and condensed phase regions. The phase transition properties of the model between the various phase regions are studied in detail.
基金Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11175018 and 11247251).
文摘Abstract: We systematically study an extended Bose-Hubbard model on the triangular lattice by means of a mean-field method based on the Gutzwiller ansatz. Pair hopping terms are explicitly included and a three-body constraint is applied. The zero-temperature phase diagram and a variety of quantum phase transitions are investigated in great detail. In particular, we show the existence and stability of the pair supersolid phase.
文摘Abstract: In this study, the performance of the extended shallow water model (ESWM) in evaluating the flow regime of turbidity currents entering the Dez Reservoir was investigated. The continuity equations for fluid and particles and the Navier-Stokes equations govern the entire flow of turbidity currents. The shallow water equations governing the flow of the depositing phase of turbidity currents are derived from these equations. A case study was conducted on the flow regime of turbidity currents entering the Dez Reservoir in Iran from January 2002 to July 2003. Facing a serious sedimentation problem, the dead storage of the Dez Reservoir will be full within the coming 10 years, and the inflowing water in the hydropower conduit system is now becoming turbid. Based on the values of the dimensionless friction number (Nf ≤ 1) and dimensionless entrainment number (NE ≤ 1) of the turbidity currents, and the coefficient of determination between the observed and predicted deposit depths (R<sup>2</sup> = 0.86) for the flow regime of negligible friction and negligible entrainment (NFNE), the flow regime of turbidity currents entering the Dez Reservoir is considered to be NFNE. The results suggest that the ESWM is an appropriate approach for evaluating the flow regime of turbidity currents in dam reservoirs, where characteristics of the turbidity current such as the deposit depth must be evaluated.
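The coefficient of determination used above to compare observed and predicted deposit depths can be computed as follows; the depth values are invented for illustration and are not the Dez Reservoir data.

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative deposit depths (m), not measured data.
obs  = [0.8, 1.1, 1.6, 2.0, 2.4]
pred = [0.9, 1.0, 1.5, 2.1, 2.3]
r2 = r_squared(obs, pred)   # close to 1 means a good model fit
```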
基金Funding: The National Natural Science Foundation of China and the Research Fund for the Doctoral Program of Higher Education of China.
文摘Abstract: Using the bosonization and renormalization group methods, we study the low-energy physical properties of the one-dimensional extended Hubbard model. The formation of charge and spin gaps is investigated at the half-filled electron band. An analytical expression for the charge gap is obtained in terms of the Coulomb repulsion strength U and the nearest-neighbour interaction parameter V.
文摘Abstract: We propose a generalization of Einstein’s special theory of relativity (STR). In our model, we use the (1 + 4)-dimensional space G, which is an extension of the (1 + 3)-dimensional Minkowski space M. The interval S is used as the fifth additional coordinate. This quantity is invariant under the usual Lorentz transformations in M, but it changes under transformations in the extended space G. We call this model the Extended Space Model (ESM). From a physical point of view, our extension means that processes in which the rest mass of a particle changes are now admissible. In the ESM, gravity and electromagnetism are combined in one field. In the ESM, a photon can have nonzero mass, and this mass can be either positive or negative. Within the frame of the ESM it is also possible to establish a connection between the mass of a particle and its size.
文摘Abstract: We present a new interpretation of the Higgs field as a composite particle made up of a positive and a negative mass Planck particle. According to the Winterberg hypothesis, space, i.e., the vacuum, consists of both positive and negative physical massive particles, which he called planckions, interacting through strong superfluid forces. In our composite model for the Higgs boson, there is an intrinsic length scale associated with the vacuum, different from the one introduced by Winterberg, where, when the vacuum is in a perfectly balanced state, the number density of positive Planck particles equals the number density of negative Planck particles. Due to the mass-compensating effect, the vacuum thus appears massless and chargeless, without pressure, energy density, or entropy. However, a situation can arise where there is an effective mass density imbalance due to the two species of Planck particle not matching in terms of populations within their respective excited energy states. This does not require the physical addition or removal of either positive or negative Planck particles within a given region of space, as originally thought. Ordinary matter, dark matter, and dark energy can thus be given a new interpretation as residual vacuum energies within the context of a greater vacuum, where the populations of the positive and negative energy states exactly balance. In the present epoch, it is estimated that the dark energy number density imbalance amounts to … per cubic meter, when cosmic distance scales in excess of 100 Mpc are considered. Compared to a strictly balanced vacuum, where we estimate that the positive and the negative Planck number densities are each of the order of 7.85E54 particles per cubic meter, the above is a very small perturbation. This slight imbalance, we argue, would dramatically alleviate, if not altogether eliminate, the long-standing cosmological constant problem.
基金Funding: Supported by the Stable-Support Scientific Project of the China Research Institute of Radiowave Propagation (No. 132101W07), the National Natural Science Foundation of China (No. 12105251), and the National Key Laboratory Foundation of Electromagnetic Environment (Nos. A382101001, A382101002, and A152101731-C02).
文摘Abstract: A self-consistent fluid model is developed to investigate the radial distributions of dusty plasma parameters in a DC glow discharge, in which the extended fluid approach for the plasma particles and the transport equations for the dust particles are coupled. The electrical interaction between charged dust particles is considered in the model. The time evolution of the radial distributions of dust density, plasma density, the radial component of the electric field, and the forces acting on dust particles once the dust density becomes stable are obtained and analyzed under different discharge currents and dust particle radii. It is shown that the dust density structure is determined mainly by the radial electrostatic force, the thermophoretic force, and the ion drag force in the discharge tube, and that both the discharge current and the dust particle radius have an obvious effect on the transport processes of dust particles. The dust particles gather in the central region of the discharge tube for low discharge current and small dust radius; dust voids then form and widen as the discharge current and dust radius increase. The plasma parameters in the dust-gathering region are obviously affected by the dust particles due to the charging of the dust surface by electrons and ions.
文摘Abstract: We have shown that the classic works of Modigliani and Miller, Black and Scholes, Merton, Black and Cox, and Leland, which form the foundation of modern asset pricing theory, are wrong due to a misinterpretation of no arbitrage as the martingale no-arbitrage principle. This error explains the appearance of the geometric Brownian motion model (GBM) for describing the firm value and other long-term assets, which treats the firm and its assets as self-financing portfolios with symmetric return distributions. It contradicts the empirical observation that returns on firms, stocks, and bonds are skewed. On the other hand, settings of the asset valuation problem that take into account the default line and business securing expenses (BSEs) generate skewed return distributions for the firm and its securities. The Extended Merton Model (EMM), taking into account BSEs and the default line, shows that the no-arbitrage principle should be understood as non-martingale no arbitrage: for sufficiently long periods both the predictable part of returns and the mean of the stochastic part of returns become negative, and the size of the return deficit depends on time and on the states of the firm and the market. The EMM findings explain the problems with the S&P 500 VIX, the strange behavior of the variance and skewness of stock returns before and after the crisis of 1987, etc.
基金Funding: Funded by the Hunan Science and Technology ‘Lotus Bud’ Talent Support Program (Grant No. 2022TJ-XH-009).
文摘Abstract: The bogie is a pivotal system and plays a critical part in the safety and reliability management of high-speed rail. However, the available bogie system reliability analysis methods lack consideration of multi-state characteristics, and the common multi-state reliability analysis methods, being NP-hard, suffer from low efficiency. In order to overcome these drawbacks, this paper proposes a novel multi-state rail train bogie system reliability analysis approach based on an extended d-MC model. Three different function interactions within the bogie system are considered in building the multi-state bogie system flow network model. Meanwhile, an extended d-MC model is established to remove unnecessary d-MC candidates and duplicates, which greatly enhances computing efficiency. The bogie system reliability is calculated, and examples are provided. Numerical experiments are carried out for different operational conditions of the bogie system and are used to test the practicability of the proposed method; it is found that this method outperforms a newly developed method in solving multi-state reliability problems.
文摘Abstract: After probability theory, fuzzy set theory, and evidence theory, rough set theory is a new mathematical tool for dealing with vague, imprecise, inconsistent, and uncertain knowledge. In recent years, research on and applications of rough set theory have attracted more and more attention, and it is one of the hot issues in the artificial intelligence field. In this paper, the basic concepts, operations, and characteristics of rough set theory are introduced first; then the extensions of the rough set model, the state of their applications, some application software, and the key problems in applied research on rough set theory are presented.
基金Funding: Under the auspices of the National Natural Science Foundation of China (No. 40371030).
文摘Abstract: Shift-share analysis has been confirmed as a useful approach in the study of regional economics, and many kinds of extended shift-share models have been advanced and put into practice in economic studies, but few have hitherto been introduced and applied to tourism research in China. Moreover, understanding spatially competitive relationships is of paramount importance for marketers, developers, and planners involved in tourism strategy development. Based on international tourism receipts from 1995 to 2004, this study probes the spatial competitiveness of international tourism in Jiangsu Province in comparison with its neighbors by applying a spatially extended shift-share model and a modified dynamic shift-share model. The empirical results illustrate that exceptional years may occur in the application of dynamic shift-share models; to address this issue, modifications to the dynamic shift-share model are put forward. The analytical results are not only presented but also explained through a comparison of the background conditions of tourism development between Jiangsu and its key competitors. It is concluded that the growth of international tourism receipts in Jiangsu is mainly attributable to the national component and the competitive component, and that Zhejiang was the most important rival to Jiangsu during the period 1995-2004. In order to upgrade its tourism competitiveness, it is indispensable for Jiangsu to adopt proper positioning, promotion, and marketing strategies and to cooperate and integrate with its main rivals.
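For a single indicator such as tourism receipts, the classic shift-share decomposition splits a region's growth into a national component (what the region would have gained at the national growth rate) and a competitive (residual) component. The sketch below uses invented receipt figures, not the Jiangsu data, and omits the industry-mix term since only one sector is considered.

```python
def shift_share(region_start, region_end, nation_start, nation_end):
    """Two-part shift-share for a single sector:
    national component + competitive component = total regional change."""
    g_nat = nation_end / nation_start - 1.0          # national growth rate
    total = region_end - region_start                 # region's actual change
    national = region_start * g_nat                   # share due to national growth
    competitive = total - national                    # residual competitive shift
    return national, competitive

# Illustrative receipts (USD millions): region grows from 120 to 210
# while the nation grows from 8000 to 12000 (a 50% national growth rate).
nat, comp = shift_share(120.0, 210.0, 8000.0, 12000.0)
```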