In 2023, the majority of the Earth witnessed its warmest boreal summer and autumn since 1850. Whether 2023 will indeed turn out to be the warmest year on record, and what caused the astonishingly large margin of warming, has become one of the hottest topics in the scientific community and is closely connected to the future development of human society. We analyzed the monthly varying global mean surface temperature (GMST) in 2023 and found that the globe, the land, and the oceans in 2023 all exhibit extraordinary warming, distinct from any previous year in recorded history. Based on the GMST statistical ensemble prediction model developed at the Institute of Atmospheric Physics, the GMST in 2023 is predicted to be 1.41°C ± 0.07°C, which will certainly surpass that of 2016 as the warmest year since 1850 and is approaching the 1.5°C global warming threshold. Compared to 2022, the GMST in 2023 will increase by 0.24°C, with 88% of the increment contributed by annual variability, mostly associated with El Niño. Moreover, the multidecadal variability related to the Atlantic Multidecadal Oscillation (AMO) in 2023 also provided an important warming background for the GMST rise. As a result, the GMST in 2023 is projected to be 1.15°C ± 0.07°C, with only a 0.02°C increment, if the effects of natural variability, including El Niño and the AMO, are eliminated and only the global warming trend is considered.
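The attribution arithmetic above can be sketched in a few lines. The numbers are taken from the abstract; the decomposition itself is illustrative arithmetic, not the Institute's actual statistical model:

```python
# Sketch of the GMST increment decomposition reported above.
# All figures come from the abstract; the split is illustrative only.
gmst_2023 = 1.41                 # predicted 2023 anomaly (deg C)
gmst_2022 = gmst_2023 - 0.24     # 2022 anomaly implied by the 0.24 C increment
increment = gmst_2023 - gmst_2022

annual_variability = 0.88 * increment   # share attributed to annual variability (El Nino)
other_components = increment - annual_variability

trend_only_2023 = 1.15           # anomaly with natural variability removed
natural_contribution = gmst_2023 - trend_only_2023   # El Nino + AMO share

print(f"2023 increment over 2022: {increment:.2f} C")
print(f"  of which annual variability: {annual_variability:.3f} C")
print(f"natural-variability share of the 2023 anomaly: {natural_contribution:.2f} C")
```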
The purpose of this research work is to investigate numerical solutions of the fractional dengue transmission model (FDTM) in the presence of Wolbachia using the stochastic-based Levenberg-Marquardt neural network (LM-NN) technique. The FDTM consists of 12 compartments. The human population is divided into four compartments: susceptible humans (S_(h)), exposed humans (E_(h)), infectious humans (I_(h)), and recovered humans (R_(h)). The Wolbachia-infected and Wolbachia-uninfected mosquito populations are each divided into four compartments: aquatic (eggs, larvae, pupae), susceptible, exposed, and infectious. We investigated three cases of the vertical transmission probability (η): when only Wolbachia-free mosquitoes persist (η = 0.6), when both types of mosquitoes persist (η = 0.8), and when only Wolbachia-carrying mosquitoes persist (η = 1). The objective of this study is to assess the effectiveness of Wolbachia in reducing dengue, presenting numerical results from the stochastic LM-NN approach with 10 hidden layers of neurons for three cases of the fractional-order derivative (α = 0.4, 0.6, 0.8). The LM-NN approach uses a training, validation, and testing procedure to minimize the mean square error (MSE) against a reference dataset obtained by solving the model with the Adams-Bashforth-Moulton (ABM) method; the data are split 80% for training, 10% for validation, and 10% for testing. A comprehensive investigation of the competence, precision, capacity, and efficiency of the suggested LM-NN approach is provided through MSE values, state-transition findings, and regression analysis. The effectiveness of the LM-NN approach for solving the FDTM is demonstrated by the overlap of its findings with trustworthy measures, achieving a precision of up to 10^(-4).
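As a rough sketch of the data handling described above (an illustrative reimplementation, not the authors' code), the 80/10/10 split and the MSE objective look like:

```python
import random

# Illustrative sketch: split a reference dataset 80/10/10 into
# train/validation/test sets, as described for the LM-NN setup,
# and compute the mean square error (MSE) used as the objective.
def split_80_10_10(data, seed=0):
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)
    n_train, n_val = int(0.8 * len(data)), int(0.1 * len(data))
    train = [data[i] for i in idx[:n_train]]
    val = [data[i] for i in idx[n_train:n_train + n_val]]
    test = [data[i] for i in idx[n_train + n_val:]]
    return train, val, test

def mse(pred, ref):
    return sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref)

data = list(range(100))
train, val, test = split_80_10_10(data)
print(len(train), len(val), len(test))   # 80 10 10
print(mse([1.0, 2.0], [1.01, 1.98]))
```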
In this paper, the magnetocaloric effect in La0.5Sm0.2Sr0.3Mn1-xFexO3 compounds with x = 0 (LSSMO) and x = 0.05 (LSSMFO) was simulated using mean-field model theory. A strong consistency was observed between the theoretical and experimental curves of magnetization and magnetic entropy change, −ΔSM(T). Based on the mean-field-generated −ΔSM(T), the substantial temperature-averaged entropy change (TEC) values reinforce the suitability of these materials for magnetic refrigeration technology, with TEC(10) values of 1 and 0.57 J∙kg−1∙K−1 under a 1 T applied magnetic field.
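The TEC figure of merit mentioned above has a standard definition: TEC(ΔT) is the largest ΔT-wide average of −ΔS_M(T) over all window midpoints. A hedged numerical sketch, using a synthetic entropy-change curve rather than the LSSMO/LSSMFO data:

```python
import math

# Hedged sketch of the temperature-averaged entropy change TEC(dT):
# the best dT-wide average of -dS_M(T) over the window midpoint.
# The Gaussian-shaped curve below is synthetic, not measured data.
def tec(temps, dS, dT_lift):
    # temps: evenly spaced temperatures (K); dS: -dS_M at those temperatures
    step = temps[1] - temps[0]
    half = int(round(dT_lift / (2 * step)))
    best = 0.0
    for mid in range(half, len(temps) - half):
        window = dS[mid - half:mid + half + 1]
        best = max(best, sum(window) / len(window))
    return best

temps = [250 + i for i in range(101)]                       # 250..350 K, 1 K steps
peak = [math.exp(-((t - 300) / 15.0) ** 2) for t in temps]  # synthetic -dS_M(T)
print(round(tec(temps, peak, 10), 3))
```

TEC(10) is necessarily at or below the peak value of −ΔS_M(T), since it averages over a 10 K window.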
We present a formalism of charge self-consistent dynamical mean field theory (DMFT) in combination with density functional theory (DFT) within the linear combination of numerical atomic orbitals (LCNAO) framework. We implemented the charge self-consistent DFT+DMFT formalism by interfacing a full-potential all-electron DFT code with three hybridization-expansion-based continuous-time quantum Monte Carlo impurity solvers. Benchmarks on several 3d, 4f, and 5f strongly correlated electron systems validated our formalism and implementation. Furthermore, within the LCNAO framework, our formalism is general and the code architecture is extensible, so it can work as a bridge merging different LCNAO DFT packages and impurity solvers to perform charge self-consistent DFT+DMFT calculations.
The realization of 100% polarized topological Weyl fermions in half-metallic ferromagnets is of particular importance for fundamental research and spintronic applications. Here, we theoretically investigate the electronic and topological properties of the zinc-blende compound VAs, which has been deemed a half-metallic ferromagnet with dynamic correlations. Based on the combination of density functional theory and dynamical mean field theory, we uncover that the half-metallic ferromagnet VAs exhibits attractive Weyl semimetallic behavior very close to the Fermi level in the DFT+U regime for effective U values ranging from 1.5 eV to 2.5 eV. Meanwhile, we also investigate the magnetization-dependent topological properties; the results show that changing the magnetization direction only slightly affects the positions of the Weyl points, which is attributed to the weak spin-orbit coupling effects. The topological surface states of VAs projected on semi-infinite (001) and (111) surfaces are investigated, and the Fermi arcs of all Weyl points are clearly visible on the projected Fermi surfaces. Our findings suggest that VAs is a fully spin-polarized Weyl semimetal with many-body correlation effects for effective U values ranging from 1.5 eV to 2.5 eV.
The Mean First-Passage Time (MFPT) and Stochastic Resonance (SR) of a stochastic tumor-immune model with noise perturbation are discussed in this paper. First, considering environmental perturbation, Gaussian white noise and Gaussian colored noise are introduced into a tumor growth model under immune surveillance. The long-time evolution of the tumor, characterized by the Stationary Probability Density (SPD) and the MFPT, is then obtained theoretically on the basis of the Approximated Fokker-Planck Equation (AFPE). Here the recurrence of the tumor from the extinction state to the tumor-present state is of particular concern. A more efficient Back-Propagation Neural Network (BPNN) algorithm is utilized to verify the theoretical SPD and MFPT. In the presence of a weak signal, the functional relationship between the Signal-to-Noise Ratio (SNR), the noise intensities, and the correlation time is also studied. Numerical results show that both multiplicative Gaussian colored noise and additive Gaussian white noise can promote the extinction of tumors; the multiplicative Gaussian colored noise can lead to a resonance-like peak on MFPT curves, while increasing the intensity of the additive Gaussian white noise results in a minimum of the MFPT. In addition, the correlation times are negatively correlated with the MFPT. As for the SNR, we find that the intensities of both the Gaussian white noise and the Gaussian colored noise, as well as their correlation intensity, can induce SR. In particular, the SNR increases monotonically with the correlation time in the case of Gaussian white noise. Finally, the optimal parameters of the BPNN structure are analyzed for the MFPT from three aspects: the penalty factors, the number of neural network layers, and the number of nodes in each layer.
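A minimal Monte Carlo illustration of an MFPT estimate, using a generic double-well potential under Gaussian white noise rather than the paper's tumor-immune model (the potential, noise intensity, and exit threshold are all assumptions for illustration):

```python
import math, random

# Monte Carlo sketch of a Mean First-Passage Time (MFPT) estimate for a
# particle in the double-well potential U(x) = x^4/4 - x^2/2 driven by
# Gaussian white noise -- a generic stand-in, not the tumor-immune model.
def mfpt(D=0.2, x0=-1.0, x_exit=1.0, dt=5e-3, n_paths=200, seed=1):
    rng = random.Random(seed)
    times = []
    for _ in range(n_paths):
        x, t = x0, 0.0
        while x < x_exit:
            drift = x - x ** 3                         # -U'(x)
            x += drift * dt + math.sqrt(2 * D * dt) * rng.gauss(0.0, 1.0)
            t += dt
        times.append(t)
    return sum(times) / len(times)

print(f"estimated MFPT from x=-1 to x=+1: {mfpt():.2f}")
```

Increasing the noise intensity D shortens the estimated MFPT, mirroring the noise-promoted transitions discussed in the abstract.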
In this paper, we consider the equality problem of weighted Bajraktarević means with weighted quasi-arithmetic means. Using the method of substituting functions, we first transform the equality problem into an equivalent functional equation, and then obtain necessary and sufficient conditions for the equality.
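For readers unfamiliar with these mean families, the standard two-variable definitions can be sketched as follows (notation is generic and may differ from the paper's):

```python
import math

# Standard (hedged) definitions: a weighted quasi-arithmetic mean with
# generator f, and a weighted Bajraktarevic mean with generator f and
# weight function p.
def quasi_arithmetic(x, y, w, f, f_inv):
    return f_inv(w * f(x) + (1 - w) * f(y))

def bajraktarevic(x, y, t, s, f, f_inv, p):
    num = t * p(x) * f(x) + s * p(y) * f(y)
    den = t * p(x) + s * p(y)
    return f_inv(num / den)

# With f = ln, the quasi-arithmetic mean is the weighted geometric mean;
# with a constant weight function p, the Bajraktarevic mean reduces to it.
g = quasi_arithmetic(4.0, 9.0, 0.5, math.log, math.exp)
b = bajraktarevic(4.0, 9.0, 0.5, 0.5, math.log, math.exp, lambda _: 1.0)
print(g, b)   # both approximately 6.0, i.e. sqrt(4 * 9)
```

The equality problem studied in the paper asks exactly when these two families coincide for given generators and weights.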
Lexical meaning mainly includes rational meaning, grammatical meaning, and coloring meaning. Mastering the coloring meaning of vocabulary is of great significance for foreign students to use Chinese vocabulary correctly. This study examines the psychological mechanism by which Chinese second-language learners master the coloring meaning of words, analyzes the psychological characteristics of students acquiring it from the perspectives of second-language learning theory and cognitive theory, establishes a cognitive schema for coloring-meaning learning, and proposes corresponding learning models and teaching strategies.
Objective: To explore the meaning of care experienced by people with blindness in hospitals. Methods: Interpretive phenomenology along with the six-step method of van Manen was used to conduct the study. Using purposeful sampling, 15 people with legal blindness were interviewed. Thematic analysis was used to isolate the meaning of care. Results: Five themes emerged: (a) nurses in the eyes of patients with blindness; (b) negligence in the caring moments; (c) being cared for in ambiguity; (d) uncoordinated care; and (e) psychological discomfort. These themes were condensed into an overarching theme titled "marginalized patients inside the stereotypical healthcare system." Conclusions: The lived experiences of patients with blindness revealed that hospitals provide stereotypic or inappropriate care for this minority group in society. Health professionals, particularly nurses, should be trained to provide person-centered and coordinated care for patients with blindness.
Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data, i.e., data that sum to a constant, like 100%. The linear regression model is the most commonly used statistical technique for identifying hidden relationships between underlying random variables of interest, with applications such as future prediction and partial-effects analysis of independent variables; when estimating its parameters, maximum likelihood estimation (MLE) is the method of choice. However, data quality is a significant challenge in machine learning, and many datasets contain missing observations, which can make data recovery costly and time-consuming. To address this issue, the expectation-maximization (EM) algorithm has been suggested for situations involving missing data. The EM algorithm iteratively finds maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models that depend on unobserved variables or data. Using the current estimate as input, the expectation (E) step constructs the expected log-likelihood function; the maximization (M) step then finds the parameters that maximize the expected log-likelihood determined in the E step. This study examined how well the EM algorithm performs on a synthetic compositional dataset with missing observations, using both robust least squares and ordinary least squares regression. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
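The Aitchison distance used as a comparison criterion can be computed via the centred log-ratio (clr) transform; a minimal sketch with invented compositions (parts must be strictly positive):

```python
import math

# Hedged sketch: the Aitchison distance between two compositions,
# computed via the centred log-ratio (clr) transform.
def clr(x):
    gmean = math.exp(sum(math.log(v) for v in x) / len(x))
    return [math.log(v / gmean) for v in x]

def aitchison_distance(x, y):
    cx, cy = clr(x), clr(y)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(cx, cy)))

a = [0.2, 0.3, 0.5]     # invented compositions summing to 1
b = [0.25, 0.25, 0.5]
print(round(aitchison_distance(a, b), 4))
```

A useful sanity check is scale invariance: multiplying a composition by a constant leaves the Aitchison distance unchanged, which is exactly why it suits closed (constant-sum) data.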
This study looks at the interaction between the processes of intersemiosis and resemiosis in multimodality. The importance of both phases is widely acknowledged as part of the meaning-making process, but many practical studies focus on the first rather than the second. In particular, this study looks at two groups of images about gender relations in Saudi Arabia following the post-2017 reforms of the male guardianship laws. One group is mostly made up of photographs, and the second of cartoons and posters. One important finding is that the latter tend to be less ambiguous in their semiotic structure than the former. In particular, there are instances in the first group where a standard study of intersemiosis indicates low modality but the image may be seen as inherently plausible by many observers. This suggests that while resemiosis can be applied to a single image, it may be more appropriate as a tool when applied to an overall news article or set of images. In the same way that not all individual semiotic modes are complementary in how they build meaning, different images may be supportive, contradictory, or unclear when studied in isolation.
This research examines President Xi's 2021 New Year speech, with research questions centering on its abundant interpersonal meanings. Through qualitative content analysis, the research finds that it is typical for the Chinese president to frequently use judgment and appreciation resources in reviewing the past year. Even in the face of the pandemic and natural disasters, the overall emotions of the speech remain positive, which corresponds to the forward-looking character of New Year speeches. Future research could investigate, through a comparative lens, how COVID-19 impacts the ideologies conveyed in political leaders' speeches, and how Appraisal Theory can be used critically, systematically, and comprehensively to produce understandings that help dismantle stereotypes and discrimination hidden in reports about COVID-19.
According to the latest version (version 2.0) of the China global Merged Surface Temperature (CMST2.0) dataset, the global mean surface temperature (GMST) in the first half of 2023 reached its third warmest value since instrumental observation began, only slightly lower than the values recorded in 2016 and 2020, and historically record-breaking GMST emerged from May to July 2023. Further analysis also indicates that if the surface temperature in the last five months of 2023 approaches the average level of the past five years, the annual average surface temperature anomaly in 2023, of approximately 1.26°C, will break the previous record of approximately 1.25°C set in 2016 (both values relative to the global pre-industrial period, i.e., the average from 1850 to 1900). With El Niño triggering a record-breaking hottest July, a record-breaking average annual temperature will most likely become a reality in 2023.
In this study, we aim to assess dynamical downscaling simulations by utilizing novel bias-corrected global climate model (GCM) data to drive a regional climate model (RCM) over the Asia-western North Pacific region. Three simulations were conducted with a 25-km grid spacing for the period 1980-2014. The first simulation (WRF_ERA5) was driven by the European Centre for Medium-Range Weather Forecasts Reanalysis 5 (ERA5) dataset and served as the validation dataset. The original GCM dataset (MPI-ESM1-2-HR model) was used to drive the second simulation (WRF_GCM), while the third simulation (WRF_GCMbc) was driven by the bias-corrected GCM dataset. The bias-corrected GCM data have an ERA5-based mean and interannual variance, and long-term trends derived from the ensemble mean of 18 CMIP6 models. Results demonstrate that the WRF_GCMbc significantly reduced the root-mean-square errors (RMSEs) of the climatological mean of downscaled variables, including temperature, precipitation, snow, wind, relative humidity, and planetary boundary layer height, by 50%-90% compared to the WRF_GCM. Similarly, the RMSEs of interannual-to-interdecadal variances of downscaled variables were reduced by 30%-60%. Furthermore, the WRF_GCMbc better captured the annual cycle of the monsoon circulation and the intraseasonal and day-to-day variabilities. The leading empirical orthogonal function (EOF) shows a monopole precipitation mode in the WRF_GCM; in contrast, the WRF_GCMbc successfully reproduced the observed tri-pole mode of summer precipitation over eastern China. This improvement can be attributed to a better-simulated location of the western North Pacific subtropical high in the WRF_GCMbc after GCM bias correction.
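The skill measure quoted above reduces to simple arithmetic; a sketch with made-up values standing in for the WRF_ERA5, WRF_GCM, and WRF_GCMbc fields:

```python
import math

# Sketch of the skill measure used above: RMSE of a downscaled field
# against the validation run, and the percentage reduction after bias
# correction. All numbers below are invented for illustration.
def rmse(sim, ref):
    return math.sqrt(sum((s - r) ** 2 for s, r in zip(sim, ref)) / len(ref))

ref       = [10.0, 12.0, 11.0, 13.0]   # stand-in for WRF_ERA5 climatology
raw_gcm   = [12.0, 14.5, 13.0, 15.5]   # WRF_GCM-like field, large bias
bias_corr = [10.3, 12.2, 11.1, 13.4]   # WRF_GCMbc-like field, small bias

reduction = 100 * (1 - rmse(bias_corr, ref) / rmse(raw_gcm, ref))
print(f"RMSE reduction: {reduction:.0f}%")
```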
The northern Andaman Sea off Myanmar is one of the relatively highly productive regions of the Indian Ocean. The abundance, biomass, and species composition of mesozooplankton and their relationships with environmental variables in the epipelagic zone (~200 m) were studied for the first time during the Sino-Myanmar joint cruise (February 2020). The mean abundance and biomass of mesozooplankton were (1916.7 ± 1192.9) ind./m3 and (17.8 ± 7.9) mg/m3, respectively. A total of 213 species (taxa) were identified from all samples. The omnivorous cyclopoids Oncaea venusta and Oithona spp. were the top two dominant taxa. Three mesozooplankton communities were determined via cluster analysis: the open ocean in the Andaman Sea and the Bay of Bengal (Group A), the transition zone across the Preparis Channel (Group B), and nearshore water off the Ayeyarwady Delta and along the Tanintharyi Coast (Group C). Variation partitioning analysis revealed that the interaction of physical and biological factors explained 98.8% of the spatial variation in the mesozooplankton community, and redundancy analysis revealed that column mean chlorophyll a concentration (CMCHLA) was the most important explanatory variable (43.1%). Abundance and biomass were significantly higher in Group C, as were CMCHLA and column mean temperature (CMT), in contrast to salinity; CMT was the dominant factor. Significant spatial variations among taxa were controlled by CMCHLA, salinity, and temperature. This study suggests that mesozooplankton spatial variation was mainly regulated by physical processes through their effects on CMCHLA, with the physical processes in turn affected by differences in heat loss, freshwater influx, eddies, and depth.
Clinical practice guidelines drive clinical practice, and clinicians rely on them when trying to answer their most common questions. One of the most important position papers in the field of gastro-esophageal reflux disease (GERD) is the one produced by the Lyon Consensus, of which an updated second version has recently been released. Mean nocturnal baseline impedance (MNBI) was proposed by the first Consensus to act as supportive evidence for GERD diagnosis. Originally a cut-off of 2292 Ohms was proposed, a value revised in the second edition: the updated Consensus recommends that an MNBI < 1500 Ohms strongly suggests GERD, while a value > 2500 Ohms can be used to refute GERD. The proposed cut-offs move in the correct direction by lowering the original cut-off; nevertheless, they arise from a study of normal subjects, where cut-offs were derived as the mean value ± 2SD, and not from symptomatic patients. However, data exist showing that even symptomatic patients with inconclusive disease or reflux hypersensitivity (RH) show lower MNBI values than normal subjects or patients with functional heartburn (FH). Moreover, according to the data, MNBI, even among symptomatic patients, is affected by age and body mass index. Various studies have also proposed different cut-offs, derived using receiver operating characteristic curve analysis, that are even lower than the one proposed. Finally, no guidance is given for patients undergoing on-proton-pump-inhibitor pH-impedance studies, even though new and extremely important data now exist. Therefore, even if MNBI is an extremely important tool when approaching patients with reflux symptoms and can distinguish conclusive GERD from RH or FH, its values should be interpreted with caution.
In Unsupervised Domain Adaptation (UDA) for person re-identification (re-ID), the primary challenge is reducing the distribution discrepancy between the source and target domains. This can be achieved by implicitly or explicitly constructing an appropriate intermediate domain to enhance recognition capability on the target domain. Implicit construction is difficult due to the absence of intermediate-state supervision, making smooth knowledge transfer from the source to the target domain a challenge. To explicitly construct the most suitable intermediate domain for the model to gradually adapt to the change in feature distribution from the source to the target domain, we propose the Minimal Transfer Cost Framework (MTCF). MTCF considers all scenarios of the intermediate domain during the transfer process, ensuring smoother and more efficient domain alignment. Our framework mainly includes three modules: the Intermediate Domain Generator (IDG), the Cross-domain Feature Constraint Module (CFCM), and the Residual Channel Space Module (RCSM). First, the IDG module is introduced to generate all possible intermediate domains, ensuring a smooth transition of knowledge from the source to the target domain. To reduce the cross-domain feature distribution discrepancy, we propose the CFCM module, which quantifies the difficulty of knowledge transfer and ensures the diversity of intermediate-domain features and their semantic relevance, achieving alignment between the source and target domains by incorporating mutual information and maximum mean discrepancy. We also design the RCSM, which utilizes an attention mechanism to enhance the model's focus on personnel features in low-resolution images, improving the accuracy and efficiency of person re-ID. Our proposed method outperforms existing technologies in all common UDA re-ID tasks and improves the Mean Average Precision (mAP) by 2.3% in the Market-to-Duke task compared to state-of-the-art (SOTA) methods.
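The maximum mean discrepancy term mentioned for the CFCM module has a standard kernel-based estimator; the following is a generic sketch (biased estimator, RBF kernel, invented sample points), not the paper's implementation:

```python
import math

# Hedged sketch of the (biased) maximum mean discrepancy (MMD^2) with an
# RBF kernel -- a generic estimator of the discrepancy between two sample
# sets, standing in for one of the CFCM alignment terms.
def rbf(x, y, sigma=1.0):
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    kxx = sum(rbf(a, b, sigma) for a in X for b in X) / len(X) ** 2
    kyy = sum(rbf(a, b, sigma) for a in Y for b in Y) / len(Y) ** 2
    kxy = sum(rbf(a, b, sigma) for a in X for b in Y) / (len(X) * len(Y))
    return kxx + kyy - 2 * kxy

src = [[0.0, 0.0], [0.1, -0.1], [-0.1, 0.1]]   # invented source features
tgt = [[1.0, 1.0], [0.9, 1.1], [1.1, 0.9]]     # invented target features
print(round(mmd2(src, tgt), 4))   # large when the distributions differ
print(round(mmd2(src, src), 4))   # 0.0 for identical samples
```

Minimizing such a term pulls the two feature distributions together, which is the role alignment losses play in UDA pipelines.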
In this paper,by choosing some appropriate test functions,we prove the Weyl’s lemma for triharmonic functions based on the new type of mean value formulas.
This paper presents an investigation of the effect of JPEG compression on the similarity between the target image and the background, where the similarity is further used to determine the degree of clutter in the image. Four new clutter metrics based on image quality assessment are introduced, among which the Haar wavelet-based perceptual similarity index, known as HaarPSI, provides the best target acquisition prediction results. It is shown that the similarity between the target and the background at the boundary between visually lossless and visually lossy compression does not change significantly compared to the case when an uncompressed image is used. In future work, through subjective tests, it is necessary to check whether this presence of compression at the threshold of just-noticeable differences will affect human target acquisition performance. Similarity values are compared with the results of subjective tests on the well-known Search_2 target database, where the degree of agreement between objective and subjective scores, measured through linear correlation, reached a value of 90%.
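The 90% agreement figure refers to linear (Pearson) correlation between objective and subjective scores; a minimal sketch with invented score pairs:

```python
import math

# Sketch of the agreement measure used above: Pearson linear correlation
# between objective similarity scores and subjective test scores.
# The paired values below are invented for illustration.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

objective  = [0.91, 0.84, 0.77, 0.95, 0.62]   # e.g. HaarPSI-style scores
subjective = [0.88, 0.80, 0.74, 0.97, 0.66]   # e.g. subjective-test scores
print(f"r = {pearson(objective, subjective):.2f}")
```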
Funding (2023 GMST analysis): supported by the Key Research Program of Frontier Sciences, Chinese Academy of Sciences (Grant No. ZDBS-LY-DQC010) and the National Natural Science Foundation of China (Grant No. 42175045).
Funding (VAs Weyl semimetal study): supported by the National Natural Science Foundation of China (Grant Nos. 12204074, 12222402, 92365101, and 12347101) and the Natural Science Foundation of Chongqing (Grant No. CSTB2023NSCQ-JQX0024).
Funding: the National Natural Science Foundation of China (Nos. 12272283 and 12172266).
Abstract: The Mean First-Passage Time (MFPT) and Stochastic Resonance (SR) of a stochastic tumor-immune model with noise perturbation are discussed in this paper. First, to represent environmental perturbation, Gaussian white noise and Gaussian colored noise are introduced into a tumor growth model under immune surveillance. The long-time evolution of the tumor, characterized by the Stationary Probability Density (SPD) and the MFPT, is then obtained theoretically on the basis of the Approximated Fokker-Planck Equation (AFPE); the recurrence of the tumor from the extinction state to the tumor-present state is of particular concern here. A more efficient Back-Propagation Neural Network (BPNN) algorithm is utilized to verify the theoretical SPD and MFPT. In the presence of a weak signal, the functional relationship between the Signal-to-Noise Ratio (SNR), the noise intensities, and the correlation time is also studied. Numerical results show that both multiplicative Gaussian colored noise and additive Gaussian white noise can promote the extinction of the tumor; the multiplicative Gaussian colored noise can lead to a resonance-like peak in the MFPT curves, while increasing the intensity of the additive Gaussian white noise yields a minimum of the MFPT. In addition, the correlation time is negatively correlated with the MFPT. As for the SNR, the intensities of both the Gaussian white noise and the Gaussian colored noise, as well as their correlation intensity, can induce SR. In particular, in the case of Gaussian white noise, the SNR increases monotonically with the correlation time. Finally, the optimal parameters of the BPNN structure are analyzed for the MFPT from three aspects: the penalty factors, the number of neural network layers, and the number of nodes in each layer.
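The recurrence time studied above is, in simulation terms, a first-passage time. As a toy illustration (not the paper's tumor-immune model), the sketch below estimates an MFPT for an overdamped particle escaping a hypothetical double-well potential under Gaussian white noise, using Euler-Maruyama paths; the drift f(x) = x - x³ and all parameters are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def mfpt(x0=-1.0, barrier=0.0, D=0.25, dt=1e-3, n_paths=200, t_max=50.0):
    """Average time for Euler-Maruyama paths started at x0 to cross `barrier`.

    Toy double-well drift f(x) = x - x**3 (NOT the paper's model); D is the
    white-noise intensity; paths are capped at t_max to bound the runtime.
    """
    times = []
    for _ in range(n_paths):
        x, t = x0, 0.0
        while x < barrier and t < t_max:
            x += (x - x**3) * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
            t += dt
        times.append(t)
    return float(np.mean(times))
```

Consistent with the noise-promoted transitions described above, increasing the noise intensity D shortens the average escape time.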
Abstract: In this paper, we consider the problem of the equality of weighted Bajraktarević means with weighted quasi-arithmetic means. Using the method of substituting for functions, we first transform the equality problem into an equivalent functional equation, and then obtain necessary and sufficient conditions for the equality to hold.
Abstract: Lexical meaning mainly includes rational meaning, grammatical meaning, and coloring meaning. Mastering the coloring meaning of vocabulary is of great significance for foreign students seeking to use Chinese vocabulary correctly. This study examines the psychological mechanism by which learners of Chinese as a second language master the coloring meaning of words, analyzes the psychological characteristics of this process from the perspectives of second language acquisition theory and cognitive theory, establishes a cognitive schema for learning coloring meaning, and proposes corresponding learning models and teaching strategies.
基金supported by Ardabil University of Medical Sciences(No.9319.1393-11-21)。
Abstract: Objective: To explore the meaning of care experienced by people with blindness in hospitals. Methods: Interpretive phenomenology, along with van Manen's six-step method, was used to conduct the study. Using purposeful sampling, 15 people with legal blindness were interviewed, and thematic analysis was used to isolate the meaning of care. Results: Five themes emerged: (a) nurses in the eyes of patients with blindness; (b) negligence in the caring moments; (c) being cared for in ambiguity; (d) uncoordinated care; and (e) psychological discomfort. These themes were condensed into an overarching theme titled "marginalized patients inside the stereotypical healthcare system." Conclusions: The lived experiences of patients with blindness revealed that hospitals provide stereotypic or inappropriate care for this minority group in society. Health professionals, particularly nurses, should be trained to provide person-centered and coordinated care for patients with blindness.
Abstract: Compositional data, such as relative information, is a crucial aspect of machine learning and related fields. It is typically recorded as closed data, i.e., data that sums to a constant such as 100%. The linear regression model is the most widely used statistical technique for identifying relationships between underlying random variables of interest, and maximum likelihood estimation (MLE) is the method of choice for estimating its parameters, which are useful for tasks such as prediction and the analysis of the partial effects of independent variables. However, data quality is a significant challenge in machine learning, and many datasets contain missing observations whose recovery can be costly and time-consuming. To address this issue, the expectation-maximization (EM) algorithm has been suggested for situations involving missing data. The EM algorithm iteratively finds maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models that depend on unobserved variables or data. Using the current parameter estimates, the expectation (E) step constructs the expected log-likelihood; the maximization (M) step then finds the parameters that maximize the expected log-likelihood determined in the E step. This study evaluated how well the EM algorithm performs on a simulated compositional dataset with missing observations, using both ordinary least squares and robust least squares regression. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) imputation and mean imputation, in terms of Aitchison distances and covariance.
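The E/M alternation described above can be sketched minimally for a bivariate normal with one partially missing column (synthetic data, not the study's code): the E-step fills each missing value with its conditional mean given the observed column, and the M-step re-estimates the mean and covariance from the completed data. The conditional-variance correction to the covariance update is omitted here for brevity.

```python
import numpy as np

# Synthetic bivariate normal sample with ~20% of column 2 missing at random
rng = np.random.default_rng(0)
n = 400
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=n)
miss = rng.random(n) < 0.2
X[miss, 1] = np.nan

obs = ~miss
mu = np.array([X[:, 0].mean(), X[obs, 1].mean()])
Sigma = np.cov(X[obs].T)          # initialize from complete cases

for _ in range(20):
    # E-step: replace missing x2 with E[x2 | x1] under current parameters
    X[miss, 1] = mu[1] + Sigma[1, 0] / Sigma[0, 0] * (X[miss, 0] - mu[0])
    # M-step: re-estimate mean and covariance from the completed data
    mu = X.mean(axis=0)
    Sigma = np.cov(X.T)
```

After convergence, the completed dataset has no missing values and the estimated covariance stays close to the generating covariance.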
Abstract: This study looks at the interaction between the processes of intersemiosis and resemiosis in multimodality. The importance of both phases as part of the meaning-making process is widely acknowledged, but many practical studies focus on the first rather than the second. In particular, this study looks at two groups of images about gender relations in Saudi Arabia following the post-2017 reforms of the male guardianship laws: one group mostly made up of photographs, and a second group of cartoons and posters. One important finding is that the latter tend to be less ambiguous in their semiotic structure than the former. In particular, there are instances in the first group where a standard study of intersemiosis indicates low modality but the image may be seen as inherently plausible by many observers. This suggests that while resemiosis can be applied to a single image, it may be more appropriate as a tool when applied to an overall news article or set of images. In the same way that not all individual semiotic modes are complementary in how they build meaning, different images may be supportive, contradictory, or unclear when studied in isolation.
Funding: supported by the 2021 major project of the Center for Language Education and Cooperation, "Research on the Construction and Promotion of the International Chinese Education Standard System" (21YH04A).
Abstract: This research examines President Xi's 2021 New Year speech, with research questions centering on its abundant interpersonal meanings. Through qualitative content analysis, the research finds that it is typical for the Chinese president to frequently use judgment and appreciation resources when reviewing the past year. Even in the face of the pandemic and natural disasters, the overall emotions of the speech remain positive, which corresponds to the forward-looking character of a New Year speech. The study's significance is manifold: future research can take a comparative lens to investigate how COVID-19 shapes the ideologies conveyed in political leaders' speeches, and can apply Appraisal Theory critically, systematically, and comprehensively to produce understandings that help dismantle the stereotypes and discrimination hidden in reports about COVID-19.
Funding: support from the National Natural Science Foundation of China (Grant Nos. 41975105 and 42375022).
Abstract: According to the latest version (version 2.0) of the China global Merged Surface Temperature (CMST2.0) dataset, the global mean surface temperature (GMST) in the first half of 2023 reached its third-warmest value since instrumental observation began, only slightly lower than the values recorded in 2016 and 2020, and record-breaking GMST values emerged from May to July 2023. Further analysis also indicates that, if the surface temperature in the last five months of 2023 approaches the average level of the past five years, the annual average surface temperature anomaly in 2023 of approximately 1.26°C will break the previous record of approximately 1.25°C set in 2016 (both values relative to the global pre-industrial period, i.e., the 1850–1900 average). With El Niño triggering a record-breaking hottest July, a record-breaking annual average temperature will most likely become a reality in 2023.
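The projection logic described above — completing the observed part of the year with a recent-years average, then taking the annual mean — can be sketched as follows; the monthly values here are placeholders, not CMST2.0 data.

```python
# Hypothetical monthly anomalies (°C, relative to 1850-1900) for Jan-Jul 2023,
# and a hypothetical 5-year mean for Aug-Dec; neither set is real CMST2.0 data.
observed_jan_jul = [1.10, 1.15, 1.30, 1.25, 1.40, 1.45, 1.50]
clim_aug_dec = [1.20, 1.18, 1.22, 1.25, 1.28]

# Annual anomaly under the assumption that the remaining five months
# match the recent climatology
annual_anomaly = (sum(observed_jan_jul) + sum(clim_aug_dec)) / 12
```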
Funding: supported jointly by the National Natural Science Foundation of China (Grant No. 42075170), the National Key Research and Development Program of China (2022YFF0802503), the Jiangsu Collaborative Innovation Center for Climate Change, a Chinese University Direct Grant (Grant No. 4053331), and the National Key Scientific and Technological Infrastructure project "Earth System Numerical Simulator Facility" (EarthLab).
Abstract: In this study, we assess dynamical downscaling simulations that use novel bias-corrected global climate model (GCM) data to drive a regional climate model (RCM) over the Asia-western North Pacific region. Three simulations were conducted with a 25-km grid spacing for the period 1980–2014. The first simulation (WRF_ERA5) was driven by the European Centre for Medium-Range Weather Forecasts Reanalysis 5 (ERA5) dataset and served as the validation dataset. The original GCM dataset (from the MPI-ESM1-2-HR model) was used to drive the second simulation (WRF_GCM), while the third simulation (WRF_GCMbc) was driven by the bias-corrected GCM dataset. The bias-corrected GCM data have an ERA5-based mean and interannual variance, with long-term trends derived from the ensemble mean of 18 CMIP6 models. Results demonstrate that WRF_GCMbc significantly reduced the root-mean-square errors (RMSEs) of the climatological means of the downscaled variables, including temperature, precipitation, snow, wind, relative humidity, and planetary boundary layer height, by 50%–90% compared to WRF_GCM. Similarly, the RMSEs of the interannual-to-interdecadal variances of the downscaled variables were reduced by 30%–60%. Furthermore, WRF_GCMbc better captured the annual cycle of the monsoon circulation as well as intraseasonal and day-to-day variability. The leading empirical orthogonal function (EOF) shows a monopole precipitation mode in WRF_GCM; in contrast, WRF_GCMbc successfully reproduced the observed tri-pole mode of summer precipitation over eastern China. This improvement can be attributed to a better-simulated location of the western North Pacific subtropical high in WRF_GCMbc after the GCM bias correction.
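The "ERA5-based mean and interannual variance" adjustment named above can be illustrated with a simple mean/variance rescaling. This is a hedged sketch of that one ingredient only, not the study's full bias-correction scheme (the CMIP6-ensemble trend replacement is omitted), and the function name is an assumption.

```python
import numpy as np

def bias_correct(gcm, ref):
    """Rescale a GCM series so its mean and variance match a reference series.

    Standardize the GCM series, then map it onto the reference
    (ERA5-like) mean and standard deviation.
    """
    gcm, ref = np.asarray(gcm, float), np.asarray(ref, float)
    standardized = (gcm - gcm.mean()) / gcm.std()
    return ref.mean() + standardized * ref.std()
```

By construction, the corrected series reproduces the reference mean and standard deviation exactly while keeping the GCM's own (standardized) temporal sequence.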
Funding: the Scientific Research Fund of the Second Institute of Oceanography, Ministry of Natural Resources, under contract No. JG2210; the Global Change and Air-Sea Interaction II Program under contract No. GASI-01-EIND-STwin; and the National Natural Science Foundation of China under contract Nos. 42176148 and 42176039.
Abstract: The northern Andaman Sea off Myanmar is one of the relatively highly productive regions of the Indian Ocean. The abundance, biomass, and species composition of mesozooplankton, and their relationships with environmental variables in the epipelagic zone (~200 m), were studied for the first time during the Sino-Myanmar joint cruise (February 2020). The mean abundance and biomass of mesozooplankton were (1916.7 ± 1192.9) ind./m³ and (17.8 ± 7.9) mg/m³, respectively. A total of 213 species (taxa) were identified from all samples. The omnivorous cyclopoids Oncaea venusta and Oithona spp. were the two most dominant taxa. Three mesozooplankton communities were identified via cluster analysis: the open ocean in the Andaman Sea and the Bay of Bengal (Group A), the transition zone across the Preparis Channel (Group B), and nearshore waters off the Ayeyarwady Delta and along the Tanintharyi Coast (Group C). Variation partitioning analysis revealed that the interaction of physical and biological factors explained 98.8% of the spatial variation of the mesozooplankton community, and redundancy analysis revealed that the column mean chlorophyll a concentration (CMCHLA) was the most important explanatory variable (43.1%). Abundance and biomass were significantly higher in Group C, as were CMCHLA and column mean temperature (CMT), in contrast to salinity; CMT was the dominant factor. Significant spatial variations among taxa were controlled by CMCHLA, salinity, and temperature. This study suggests that mesozooplankton spatial variation is mainly regulated by physical processes through their effects on CMCHLA, the physical processes themselves being shaped by differences in heat loss, freshwater influx, eddies, and depth.
Abstract: Clinical practice guidelines drive clinical practice, and clinicians rely on them when trying to answer their most common questions. One of the most important position papers in the field of gastro-esophageal reflux disease (GERD) is the one produced by the Lyon Consensus, of which an updated second version has recently been released. Mean nocturnal baseline impedance (MNBI) was proposed by the first Consensus as supportive evidence for a GERD diagnosis. A cut-off of 2292 Ohms was originally proposed, a value revised in the second edition: the updated Consensus recommends that an MNBI < 1500 Ohms strongly suggests GERD, while a value > 2500 Ohms can be used to refute GERD. The proposed cut-offs move in the correct direction by lowering the original threshold; nevertheless, they arise from a study of normal subjects in which cut-offs were derived as the mean value ± 2 SD, not from symptomatic patients. However, data exist showing that even symptomatic patients with inconclusive disease or reflux hypersensitivity (RH) have lower MNBI values than normal subjects or patients with functional heartburn (FH). Moreover, according to the data, MNBI, even among symptomatic patients, is affected by age and body mass index. Various studies have also proposed different cut-offs, some lower than the one recommended, by using receiver operating characteristic curve analysis. Finally, no guidance is given for patients undergoing on-proton-pump-inhibitor pH-impedance studies, even though new and extremely important data now exist. Therefore, even if MNBI is an extremely important tool when approaching patients with reflux symptoms, one that could distinguish conclusive GERD from RH or FH, its values should be interpreted with caution.
Abstract: In Unsupervised Domain Adaptation (UDA) for person re-identification (re-ID), the primary challenge is reducing the distribution discrepancy between the source and target domains. This can be achieved by implicitly or explicitly constructing an appropriate intermediate domain to enhance recognition capability on the target domain. Implicit construction is difficult due to the absence of intermediate-state supervision, making smooth knowledge transfer from the source to the target domain a challenge. To explicitly construct the intermediate domain most suitable for the model to gradually adapt to the change in feature distribution from the source to the target domain, we propose the Minimal Transfer Cost Framework (MTCF). MTCF considers all scenarios of the intermediate domain during the transfer process, ensuring smoother and more efficient domain alignment. Our framework mainly includes three modules: the Intermediate Domain Generator (IDG), the Cross-domain Feature Constraint Module (CFCM), and the Residual Channel Space Module (RCSM). First, the IDG module is introduced to generate all possible intermediate domains, ensuring a smooth transition of knowledge from the source to the target domain. To reduce the cross-domain feature distribution discrepancy, we propose the CFCM module, which quantifies the difficulty of knowledge transfer and ensures the diversity of intermediate-domain features and their semantic relevance, achieving alignment between the source and target domains by incorporating mutual information and the maximum mean discrepancy. We also design the RCSM, which uses an attention mechanism to enhance the model's focus on person features in low-resolution images, improving the accuracy and efficiency of person re-ID. Our proposed method outperforms existing techniques on all common UDA re-ID tasks and improves the mean Average Precision (mAP) by 2.3% on the Market-to-Duke task compared to state-of-the-art (SOTA) methods.
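One ingredient the abstract names for source-target alignment is the maximum mean discrepancy (MMD). The sketch below shows a biased squared-MMD estimate with a Gaussian kernel on feature vectors; it illustrates the general statistic, not the CFCM module itself, and the kernel bandwidth is an assumed parameter.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel matrix between rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    """Biased estimate of the squared maximum mean discrepancy."""
    return (gaussian_kernel(x, x, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean()
            - 2.0 * gaussian_kernel(x, y, sigma).mean())
```

Features drawn from the same distribution give an MMD near zero, while a shifted distribution gives a clearly larger value, which is what makes the statistic usable as an alignment loss.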
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 11801006 and 12071489).
Abstract: In this paper, by choosing appropriate test functions, we prove Weyl's lemma for triharmonic functions based on a new type of mean value formula.
Abstract: This paper investigates the effect of JPEG compression on the similarity between a target image and its background, where the similarity is further used to determine the degree of clutter in the image. Four new clutter metrics based on image quality assessment are introduced, among which the Haar wavelet-based perceptual similarity index, known as HaarPSI, provides the best target acquisition prediction results. It is shown that the similarity between the target and the background at the boundary between visually lossless and visually lossy compression does not change significantly compared to the case when an uncompressed image is used. In future work, subjective tests will be needed to check whether this presence of compression at the threshold of just-noticeable differences affects human target acquisition performance. Similarity values are compared with the results of subjective tests on the well-known Search_2 target database, where the degree of agreement between objective and subjective scores, measured through linear correlation, reached 90%.
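The "linear correlation" agreement measure quoted above is the Pearson coefficient. A minimal sketch follows, with hypothetical score arrays standing in for the actual Search_2 results:

```python
import numpy as np

def pearson(x, y):
    """Pearson linear correlation between two equal-length score arrays."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical objective metric scores vs. subjective observer ratings
objective = [0.62, 0.71, 0.55, 0.80, 0.66]
subjective = [3.1, 3.6, 2.7, 4.2, 3.2]
r = pearson(objective, subjective)
```

A value of r near 1 means the objective metric ranks images nearly the same way observers do, which is the sense in which the paper reports 90% agreement.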