In order to describe the characteristics of dynamic traffic flow and improve the robustness of its multiple applications, a dynamic traffic temporal-spatial model (DTTS) is established. Taking temporal correlation, spatial correlation and historical correlation into consideration, a basic DTTS model is built, and a three-stage approach is put forward for the simplification and calibration of the basic DTTS model. Through pre-selection of critical sections and critical times, the first stage reduces the number of variables in the basic DTTS model. In the second stage, the variable coefficients are calibrated based on simplification of the basic model and stepwise regression analysis. For dynamic noise estimation, the third stage summarizes the characteristics of the noise and presents an extreme learning machine. A case study based on a real-world road network in Beijing, China is carried out to test the efficiency and applicability of the proposed DTTS model and the three-stage approach.
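As a concrete illustration of the extreme learning machine used for noise estimation in the third stage, the sketch below trains the standard ELM: hidden-layer weights are drawn at random and only the output weights are solved, in closed form, by least squares. The hidden-layer size, tanh activation and array shapes are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def elm_fit(X, y, n_hidden=64, seed=0):
    """Train a basic extreme learning machine: random hidden layer,
    output weights solved in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights via least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```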
In the field of target recognition based on temporal-spatial information fusion, evidence theory has received extensive attention. To achieve accurate and efficient target recognition with evidence theory, an adaptive temporal-spatial information fusion model is proposed. Firstly, an adaptive evaluation and correction mechanism is constructed from the evidence distance and Deng entropy, which realizes credibility discrimination and adaptive correction of the spatial evidence. Secondly, a credibility decay operator is introduced to obtain the dynamic credibility of temporal evidence. Finally, the sequential combination of temporal-spatial evidence is achieved by Shafer's discount criterion and Dempster's combination rule. The simulation results show that the proposed method not only accounts for the dynamic and sequential characteristics of temporal-spatial evidence combination, but also has a strong capability for processing conflicting information, providing a new reference for the field of temporal-spatial information fusion.
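Two standard building blocks named in the abstract, Dempster's combination rule and Deng entropy, can be sketched as follows. The mass-function encoding and the example numbers are illustrative assumptions; the paper's full adaptive correction and credibility decay mechanism is not reproduced here.

```python
from itertools import product
from math import log2

def dempster(m1, m2):
    """Dempster's combination rule for two mass functions.
    Focal elements are frozensets of hypotheses mapped to masses."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb                   # mass falling on the empty set
    # renormalize by 1 - K (assumes the evidences are not in total conflict)
    return {a: p / (1.0 - conflict) for a, p in combined.items()}

def deng_entropy(m):
    """Deng entropy of a mass function; reduces to Shannon entropy
    when all focal elements are singletons."""
    return -sum(p * log2(p / (2 ** len(a) - 1)) for a, p in m.items() if p > 0)

# Two sensors reporting on targets {T1, T2} (illustrative numbers):
m1 = {frozenset({"T1"}): 0.7, frozenset({"T1", "T2"}): 0.3}
m2 = {frozenset({"T1"}): 0.6, frozenset({"T2"}): 0.4}
print(dempster(m1, m2), deng_entropy(m1))
```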
[Objective] The aim was to study a temporal-spatial distribution model of cold chain logistics for vegetables. [Method] Broccoli was taken as an example. Specifically, a temporal-spatial distribution model of cold chain logistics for broccoli was proposed from the standpoints of costs and benefits under changes of time and space, and a comprehensive evaluation was made of timeliness, efficiency, risks, added value of products and satisfaction of information in cold-chain logistics. [Result] The efficiency of cold chain logistics for vegetables can be greatly improved by the temporal-spatial distribution model of cold chain logistics. [Conclusion] Costs and benefits of vegetables in temporal-spatial distribution could be evaluated by the model.
Production safety accidents are often caused by the interaction of multiple organizations and the coupling of multiple factors, so the causes of an accident typically involve several organizations. To prevent and curb multi-organization production safety accidents, a method for multi-organization accident analysis is constructed based on the Systems-Theory Accident Modeling and Process (STAMP) model and the 24Model, and the Qingdao petroleum explosion accident is analyzed as a case study. The results show that the STAMP-24Model can analyze the causes of accidents involving multiple organizations by organization and by level, effectively, comprehensively and in detail, and can explore the interactions among the organizations. Dynamic evolution analysis of the accident yields the coupling relationships among the organizations' unsafe actions, the resulting accident failure chains and the control failure paths, thereby providing ideas and a reference for preventing multi-organization accidents.
Precipitous Arctic sea-ice decline and the corresponding increase in Arctic open-water areas in summer months give more space for sea-ice growth in the subsequent cold seasons. Compared to the decline of the entire Arctic multiyear sea ice, changes in newly formed sea ice carry more thermodynamic and dynamic information on Arctic atmosphere–ocean–ice interaction and northern mid–high latitude atmospheric teleconnections. Here, we use a large multimodel ensemble from phase 6 of the Coupled Model Intercomparison Project (CMIP6) to investigate future changes in wintertime newly formed Arctic sea ice. The commonly used model-democracy approach, which gives equal weight to each model, essentially assumes that all models are independent and equally plausible, which contradicts the fact that there are large interdependencies in the ensemble and discrepancies in the models' performance in reproducing observations. Therefore, instead of using the arithmetic mean of well-performing models or of all available models for projections, as in previous studies, we employ a newly developed model weighting scheme that weights all models in the ensemble according to their performance and independence, to provide more reliable projections. Model democracy leads to evident bias and large intermodel spread in CMIP6 projections of newly formed Arctic sea ice. However, we show that both the bias and the intermodel spread can be effectively reduced by the weighting scheme. Projections from the weighted models indicate that wintertime newly formed Arctic sea ice is likely to increase dramatically until the middle of this century regardless of the emissions scenario. Thereafter, it may decrease (or remain stable) if the Arctic warming crosses a threshold (or is extensively constrained).
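The abstract does not give the weighting formula. A commonly used performance-and-independence scheme of this kind (ClimWIP-style, offered here purely as an assumed illustration) weights each model by its distance to observations and divides by a term that grows with its similarity to the other models:

```python
import numpy as np

def performance_independence_weights(D, S, sigma_d, sigma_s):
    """Weight models by skill (D[i]: distance of model i to observations)
    and penalize interdependence (S[i, j]: distance between models i, j).
    This is the widely used ClimWIP-style form; the paper's exact scheme
    may differ."""
    n = len(D)
    skill = np.exp(-(np.asarray(D) / sigma_d) ** 2)
    # similarity to every *other* model inflates the denominator
    dependence = 1.0 + np.array([
        sum(np.exp(-(S[i, j] / sigma_s) ** 2) for j in range(n) if j != i)
        for i in range(n)
    ])
    w = skill / dependence
    return w / w.sum()   # normalize so the weights sum to 1
```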
Understanding the anisotropic creep behavior of shale under direct shearing is a challenging issue. In this context, we conducted shear-creep and steady-creep tests on shale with five bedding orientations (i.e. 0°, 30°, 45°, 60°, and 90°) under multiple levels of direct shear for the first time. The results show that the anisotropic creep of shale exhibits significant stress-dependent behavior. Under low shear stress, the creep compliance of shale increases linearly with the logarithm of time at all bedding orientations, and the increase depends on the bedding orientation and creep time. Under high shear stress, the creep compliance of shale is minimal when the bedding orientation is 0°, and the steady-creep rate increases significantly as the bedding orientation increases through 30°, 45°, 60°, and 90°. The stress-strain values corresponding to the inception of the accelerated creep stage first increase and then decrease with bedding orientation. A semilogarithmic model that reflects the stress dependence of the steady-creep rate while accounting for hardening and damage processes is proposed. The model minimizes the deviation of the calculated steady-state creep rate from the observed value and reveals how the bedding orientation influences the steady-creep rate. The applicability of five classical empirical creep models is quantitatively evaluated, showing that the logarithmic model can well explain the experimental creep strain and creep rate and can accurately predict long-term shear creep deformation. Based on an improved logarithmic model, the variations of the creep parameters with shear stress and bedding orientation are discussed. With the above findings, a mathematical method for constructing an anisotropic shear creep model of shale is proposed, which characterizes the nonlinear dependence of the anisotropic shear creep behavior of shale on the bedding orientation.
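For the logarithmic creep law highlighted in the evaluation, a minimal fitting sketch is shown below: it fits J(t) = a + b ln t to compliance data by least squares. The data values are illustrative assumptions, and this is the basic logarithmic form, not the paper's improved model with hardening and damage terms.

```python
import numpy as np

def fit_log_creep(t, J):
    """Least-squares fit of the basic logarithmic creep law
    J(t) = a + b * ln(t) to measured creep compliance."""
    A = np.column_stack([np.ones_like(t), np.log(t)])
    (a, b), *_ = np.linalg.lstsq(A, J, rcond=None)
    return a, b

t = np.array([1.0, 10.0, 100.0, 1000.0])   # time, hours (illustrative)
J = np.array([1.2, 1.5, 1.8, 2.1])         # compliance, 1/GPa (illustrative)
print(fit_log_creep(t, J))                 # slope b is the log-creep rate
```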
Since the 1950s, when the Turing Test was introduced, there has been notable progress in machine language intelligence. Language modeling, crucial for AI development, has evolved from statistical to neural models over the last two decades. Recently, transformer-based Pre-trained Language Models (PLM) have excelled in Natural Language Processing (NLP) tasks by leveraging large-scale training corpora. Increasing the scale of these models enhances performance significantly, introducing abilities, such as in-context learning, that smaller models lack. The advancement in Large Language Models, exemplified by the development of ChatGPT, has made significant impacts both academically and industrially, capturing widespread societal interest. This survey provides an overview of the development and prospects from Large Language Models (LLM) to Large Multimodal Models (LMM). It first discusses the contributions and technological advancements of LLMs in the field of natural language processing, especially in text generation and language understanding. It then turns to LMMs, which integrate various data modalities such as text, images, and sound, demonstrating advanced capabilities in understanding and generating cross-modal content and paving new pathways for the adaptability and flexibility of AI systems. Finally, the survey highlights the prospects of LMMs in terms of technological development and application potential, while also pointing out challenges in data integration and cross-modal understanding accuracy, providing a comprehensive perspective on the latest developments in this field.
BACKGROUND Colorectal cancer (CRC) is a serious threat worldwide. Although early screening is suggested to be the most effective method to prevent and control CRC, the current situation of early screening for CRC is still not optimistic. In China, the incidence of CRC in the Yangtze River Delta region is increasing dramatically, but few studies have been conducted. Therefore, it is necessary to develop a simple and efficient early screening model for CRC. AIM To develop and validate an early-screening nomogram model to identify individuals at high risk of CRC. METHODS Data of 64448 participants obtained from Ningbo Hospital, China between 2014 and 2017 were retrospectively analyzed. Of this cohort, 530 individuals were excluded due to missing or incorrect data. Of the remaining 63918, 7607 (11.9%) individuals were considered to be at high risk for CRC, and 56311 (88.1%) were not. The participants were randomly allocated to a training set (44743) or a validation set (19175). The discriminatory ability, predictive accuracy, and clinical utility of the model were evaluated by constructing and analyzing receiver operating characteristic (ROC) curves and calibration curves and by decision curve analysis. Finally, the model was validated internally using a bootstrap resampling technique. RESULTS Seven variables, including demographic, lifestyle, and family history information, were examined. Multifactorial logistic regression analysis revealed that age [odds ratio (OR): 1.03, 95% confidence interval (CI): 1.02-1.03, P<0.001], body mass index (BMI) (OR: 1.07, 95%CI: 1.06-1.08, P<0.001), waist circumference (WC) (OR: 1.03, 95%CI: 1.02-1.03, P<0.001), lifestyle (OR: 0.45, 95%CI: 0.42-0.48, P<0.001), and family history (OR: 4.28, 95%CI: 4.04-4.54, P<0.001) were the most significant predictors of high CRC risk. A healthy lifestyle was a protective factor, whereas family history was the most significant risk factor. The area under the curve was 0.734 (95%CI: 0.723-0.745) for the validation set ROC curve and 0.735 (95%CI: 0.728-0.742) for the training set ROC curve. The calibration curve demonstrated a high correlation between the CRC high-risk population predicted by the nomogram model and the actual CRC high-risk population. CONCLUSION The early-screening nomogram model for CRC prediction in high-risk populations developed in this study, based on age, BMI, WC, lifestyle, and family history, exhibited high accuracy.
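A hedged sketch of how the reported odds ratios translate into a logistic risk score is given below. The intercept is hypothetical (it is not reported in the abstract), and coding lifestyle and family history as 0/1 indicators is an assumption, so the sketch shows only the general mechanism behind a nomogram of this kind.

```python
import math

# ln(OR) coefficients taken from the abstract; units per predictor apply
COEFS = {
    "age": math.log(1.03),                # per year
    "bmi": math.log(1.07),                # per kg/m^2
    "waist": math.log(1.03),              # per cm
    "healthy_lifestyle": math.log(0.45),  # protective indicator (assumed 0/1)
    "family_history": math.log(4.28),     # risk indicator (assumed 0/1)
}
INTERCEPT = -8.0  # hypothetical; a usable model needs the fitted intercept

def crc_risk(age, bmi, waist, healthy_lifestyle, family_history):
    """Logistic-regression probability, p = 1 / (1 + exp(-eta))."""
    eta = (INTERCEPT
           + COEFS["age"] * age
           + COEFS["bmi"] * bmi
           + COEFS["waist"] * waist
           + COEFS["healthy_lifestyle"] * healthy_lifestyle
           + COEFS["family_history"] * family_history)
    return 1.0 / (1.0 + math.exp(-eta))

print(crc_risk(age=60, bmi=26, waist=90, healthy_lifestyle=0, family_history=1))
```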
Flow unit (FU) rock typing is a common technique for characterizing reservoir flow behavior, producing reliable porosity and permeability estimates even in complex geological settings. However, the lateral extrapolation of FU away from the well into the whole reservoir grid is commonly a difficult task, and using seismic data as a constraint is rarely studied. This paper proposes a workflow to generate numerous possible 3D volumes of flow units, porosity and permeability below the seismic resolution limit while respecting the available seismic data at larger scales. The methodology is applied to the Mero Field, a Brazilian presalt carbonate reservoir located in the Santos Basin, which presents a complex and heterogeneous geological setting with different sedimentological processes and a varied diagenetic history. We generated metric flow units from conventional core analysis and transposed them to the well log data. Then, given a Markov chain Monte Carlo algorithm, the seismic data and the well log statistics, we simulated acoustic impedance, decametric flow unit (DFU), metric flow unit (MFU), porosity and permeability volumes at the metric scale. The aim is to estimate the minimum number of MFU able to produce realistic porosity and permeability scenarios without losing the lateral seismic control. In other words, every simulated porosity and permeability volume produces a synthetic seismic that matches the real seismic of the area, even at the metric scale. The achieved 3D results represent high-resolution fluid flow reservoir modelling that retains the lateral control of the seismic throughout the process and can be directly incorporated in the dynamic characterization workflow.
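The abstract names a Markov chain Monte Carlo algorithm without detailing it. The sketch below shows a generic random-walk Metropolis step of the kind such geostatistical workflows build on, where the log-posterior would encode the mismatch between synthetic and real seismic; the proposal form and all names are assumptions, not the paper's sampler.

```python
import numpy as np

def metropolis(log_post, x0, step, n_iter, seed=0):
    """Random-walk Metropolis: propose a Gaussian perturbation of the model
    vector, accept with probability min(1, posterior ratio). log_post(x)
    returns the log-posterior, e.g. -0.5 * (seismic misfit) in a
    least-squares sense plus a log-prior."""
    rng = np.random.default_rng(seed)
    x, lp = np.asarray(x0, dtype=float).copy(), log_post(x0)
    samples = []
    for _ in range(n_iter):
        prop = x + step * rng.standard_normal(x.shape)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)
```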
Artificial intelligence (AI) models have significantly impacted various areas of the atmospheric sciences, reshaping our approach to climate-related challenges. Amid this AI-driven transformation, the foundational role of physics in climate science has occasionally been overlooked. Our perspective suggests that the future of climate modeling involves a synergistic partnership between AI and physics, rather than an "either/or" scenario. Scrutinizing controversies around current physical inconsistencies in large AI models, we stress the critical need for detailed dynamic diagnostics and physical constraints. Furthermore, we provide illustrative examples to guide future assessments and constraints for AI models. Regarding AI integration with numerical models, we argue that offline AI parameterization schemes may fall short of achieving global optimality, emphasizing the importance of constructing online schemes. Additionally, we highlight the significance of fostering a community culture and propose the OCR (Open, Comparable, Reproducible) principles. Through a better community culture and a deep integration of physics and AI, we contend that developing a learnable climate model, balancing AI and physics, is an achievable goal.
Short-term (up to 30 days) predictions of Earth Rotation Parameters (ERPs) such as Polar Motion (PM: PMX and PMY) play an essential role in real-time applications related to high-precision reference frame conversion. Currently, the least squares (LS) + auto-regressive (AR) hybrid method is one of the main techniques for PM prediction, and the weighted LS + AR hybrid method performs well for PM short-term prediction. However, the covariance information of the LS fitting residuals deserves further exploration in the AR model. In this study, we derive a modified stochastic model for the LS + AR hybrid method, namely the weighted LS + weighted AR hybrid method. Using the PM data products of IERS EOP 14 C04, the numerical results indicate that, for PM short-term forecasting, the proposed weighted LS + weighted AR hybrid method shows an advantage over both the LS + AR hybrid method and the weighted LS + AR hybrid method. Compared to the mean absolute errors (MAEs) of PMX/PMY short-term prediction of the LS + AR hybrid method and the weighted LS + AR hybrid method, the weighted LS + weighted AR hybrid method shows average improvements of 6.61%/12.08% and 0.24%/11.65%, respectively. Moreover, judging by the slopes of the linear regression lines fitted to the errors of each method, the prediction error of the proposed method grows more slowly than that of the other two methods.
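A heavily reduced sketch of the LS + AR idea follows: a weighted LS fit of bias, trend and an annual harmonic, then a weighted AR(1) on the residuals to extrapolate. The real method uses more harmonics (e.g. the Chandler term), a full AR(p) model and the derived residual covariance; everything below is an illustrative assumption.

```python
import numpy as np

def weighted_ls_ar1(t, y, w, t_pred):
    """Toy weighted LS + weighted AR(1) forecast.
    t, y, w: past epochs (days), PM series, observation weights.
    t_pred: future epochs, assumed to continue directly after t."""
    def design(tt):
        return np.column_stack([np.ones_like(tt), tt,
                                np.sin(2 * np.pi * tt / 365.25),
                                np.cos(2 * np.pi * tt / 365.25)])
    A, W = design(t), np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)   # weighted normal equations
    r = y - A @ beta                                   # LS fitting residuals
    # weighted AR(1) coefficient on the residuals
    phi = np.sum(w[1:] * r[1:] * r[:-1]) / np.sum(w[1:] * r[:-1] ** 2)
    steps = np.arange(1, len(t_pred) + 1)
    return design(t_pred) @ beta + r[-1] * phi ** steps  # LS extrapolation + damped AR tail
```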
The Lunar Environment heliospheric X-ray Imager (LEXI) and the Solar wind−Magnetosphere−Ionosphere Link Explorer (SMILE) will observe the magnetosheath and its boundary motion in soft X-rays to understand magnetopause reconnection modes under various solar wind conditions after their respective launches in 2024 and 2025. Magnetosheath conditions, namely plasma density, velocity, and temperature, are key parameters for predicting and analyzing soft X-ray images from the LEXI and SMILE missions. We developed a user-friendly magnetosheath model that parameterizes number density, velocity, temperature, and magnetic field by utilizing a global magnetohydrodynamics (MHD) model as well as pre-existing gas-dynamic and analytic models. Using this parameterized magnetosheath model, scientists can easily reconstruct expected soft X-ray images and use them in the analysis of observed LEXI and SMILE images without running complicated global magnetosphere models. First, we created an MHD-based magnetosheath model by running a total of 14 OpenGGCM global MHD simulations under 7 solar wind densities (1, 5, 10, 15, 20, 25, and 30 cm^(−3)) and 2 interplanetary magnetic field Bz components (±4 nT), and then parameterized the results for new magnetosheath conditions. We compared the magnetosheath model results with THEMIS statistical data and found good agreement, with a weighted Pearson correlation coefficient greater than 0.77, especially for plasma density and plasma velocity. Second, we compiled a suite of magnetosheath models incorporating the previous magnetosheath models (gas-dynamic, analytic) and conducted two case studies to test performance. The MHD-based model was comparable to or better than the previous models while providing self-consistency among the magnetosheath parameters. Third, we constructed a tool to calculate a soft X-ray image from any given vantage point, which can support the planning and data analysis of the aforementioned LEXI and SMILE missions. A release of the code has been uploaded to a GitHub repository.
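The weighted Pearson correlation coefficient used in the THEMIS comparison can be computed as below; the choice of weights (e.g. per-bin sample counts) is an assumption, since the abstract does not state it.

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Weighted Pearson correlation between model values x and observed
    values y, with nonnegative weights w (e.g. bin sample counts)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    w = np.asarray(w, float) / np.sum(w)          # normalize the weights
    mx, my = np.sum(w * x), np.sum(w * y)         # weighted means
    cov = np.sum(w * (x - mx) * (y - my))         # weighted covariance
    sx = np.sqrt(np.sum(w * (x - mx) ** 2))
    sy = np.sqrt(np.sum(w * (y - my) ** 2))
    return cov / (sx * sy)
```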
To accurately estimate winter wheat yields and analyze the uncertainty in crop model data assimilation, winter wheat yield estimates were obtained by assimilating measured or remotely sensed leaf area index (LAI) values. The performances of the calibrated crop environment resource synthesis for wheat (CERES-Wheat) model for two different assimilation scenarios were compared by employing ensemble Kalman filter (EnKF)-based strategies. The uncertainty factors of the crop model data assimilation were analyzed by considering the observation errors, assimilation stages and temporal-spatial scales. Overall, the results indicated a better yield estimate performance when the EnKF-based strategy was used to comprehensively consider several factors in the initial conditions and observations. When using this strategy, an adjusted coefficient of determination (R2) of 0.84, a root mean square error (RMSE) of 323 kg ha-1, and a relative error (RE) of 4.15% were obtained at the field plot scale, and an R2 of 0.81, an RMSE of 362 kg ha-1, and an RE of 4.52% were obtained at a pixel scale of 30 m × 30 m. With increasing observation errors, the accuracy of the yield estimates decreased markedly, but an acceptable estimate was still obtained when the observation errors were within 20%. Winter wheat yield estimates could be improved significantly by assimilating observations from the middle to the end of the crop growing season. With decreasing assimilation frequency and pixel resolution, the accuracy of the crop yield estimates decreased; however, the computation time also decreased. It is important to choose reasonable temporal-spatial scales and assimilation stages to trade off accuracy against computation time, especially in operational systems used for regional crop yield estimates.
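For reference, one EnKF analysis step in the standard perturbed-observations form is sketched below; assimilating LAI amounts to choosing the observation operator H accordingly. The array shapes and the stochastic-observation variant are assumptions, not the paper's exact implementation.

```python
import numpy as np

def enkf_update(X, y_obs, H, R, seed=0):
    """One EnKF analysis step (perturbed-observations form).
    X: (n_state, n_ens) forecast ensemble; y_obs: (n_obs,) observations;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs-error cov."""
    rng = np.random.default_rng(seed)
    n_obs, n_ens = H.shape[0], X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)         # ensemble perturbations
    P = Xp @ Xp.T / (n_ens - 1)                    # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    Y = y_obs[:, None] + rng.multivariate_normal(
        np.zeros(n_obs), R, size=n_ens).T          # perturbed observations
    return X + K @ (Y - H @ X)                     # analysis ensemble
```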
Neurodegenerative diseases (NDs) are a group of debilitating neurological disorders that primarily affect elderly populations and include Alzheimer's disease (AD), Parkinson's disease (PD), Huntington's disease (HD), and amyotrophic lateral sclerosis (ALS). Currently, no therapies are available that can delay, stop, or reverse the pathological progression of NDs in clinical settings. As the population ages, NDs are imposing a huge burden on public health systems and affected families. Animal models are important tools for preclinical investigations to understand disease pathogenesis and test potential treatments. While numerous rodent models of NDs have been developed to enhance our understanding of disease mechanisms, the limited success in translating findings from animal models to clinical practice suggests that there is still a need to bridge this translation gap. Old World nonhuman primates (NHPs), such as rhesus, cynomolgus, and vervet monkeys, are phylogenetically, physiologically, biochemically, and behaviorally most relevant to humans. This is particularly evident in the similarity of the structure and function of their central nervous systems, rendering such species uniquely valuable for neuroscience research. Recently, the development of several genetically modified NHP models of NDs has successfully recapitulated key pathologies and revealed novel mechanisms. This review focuses on the efficacy of NHPs in modeling NDs and the novel pathological insights gained, as well as the challenges associated with the generation of such models and the complexities involved in their subsequent analysis.
Parkinson's disease is characterized by the loss of dopaminergic neurons in the substantia nigra pars compacta, and although restoring striatal dopamine levels may improve symptoms, no treatment can cure or reverse the disease itself. Stem cell therapy has a regenerative effect and is being actively studied as a candidate for the treatment of Parkinson's disease. Mesenchymal stem cells are considered a promising option due to fewer ethical concerns, a lower risk of immune rejection, and a lower risk of teratogenicity. We performed a meta-analysis to evaluate the therapeutic effects of mesenchymal stem cells and their derivatives on motor function, memory, and preservation of dopaminergic neurons in Parkinson's disease animal models. We searched bibliographic databases (PubMed/MEDLINE, Embase, CENTRAL, Scopus, and Web of Science) to identify articles and included only peer-reviewed in vivo interventional animal studies published in any language through June 28, 2023. The study utilized a random-effect model to estimate the 95% confidence intervals (CI) of the standardized mean differences (SMD) between the treatment and control groups. We used the Systematic Review Centre for Laboratory Animal Experimentation's risk of bias tool and the Collaborative Approach to Meta-Analysis and Review of Animal Studies checklist for study quality assessment. A total of 33 studies with data from 840 Parkinson's disease model animals were included in the meta-analysis. Treatment with mesenchymal stem cells significantly improved motor function as assessed by the amphetamine-induced rotational test. Among the stem cell types, the bone marrow MSCs with neurotrophic factor group showed the largest effect size (SMD [95% CI] = -6.21 [-9.50 to -2.93], P = 0.0001, I^(2) = 0.0%). The stem cell treatment group had significantly more tyrosine hydroxylase positive dopaminergic neurons in the striatum (SMD [95% CI] = 1.04 [0.59 to 1.49], P = 0.0001, I^(2) = 65.1%) and substantia nigra (SMD [95% CI] = 1.38 [0.89 to 1.87], P = 0.0001, I^(2) = 75.3%), indicating a protective effect on dopaminergic neurons. Subgroup analysis of the amphetamine-induced rotation test showed a significant reduction only for the intracranial-striatum route (SMD [95% CI] = -2.59 [-3.25 to -1.94], P = 0.0001, I^(2) = 74.4%). The memory test showed significant improvement only for the intravenous route (SMD [95% CI] = 4.80 [1.84 to 7.76], P = 0.027, I^(2) = 79.6%). Mesenchymal stem cells have been shown to positively impact motor function and memory function and to protect dopaminergic neurons in preclinical models of Parkinson's disease. Further research is required to determine the optimal stem cell types, modifications, transplanted cell numbers, and delivery methods for these protocols.
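Random-effect pooling of this kind is conventionally implemented with the DerSimonian-Laird estimator; a sketch follows, where `effects` would be the per-study SMDs and `variances` their sampling variances. The choice of estimator is an assumption, since the abstract does not name one.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird): estimate the
    between-study variance tau^2, then inverse-variance weight."""
    e, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                    # fixed-effect weights
    mu_fe = np.sum(w * e) / np.sum(w)              # fixed-effect pooled mean
    Q = np.sum(w * (e - mu_fe) ** 2)               # Cochran's Q
    df = len(e) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                  # between-study variance
    w_re = 1.0 / (v + tau2)                        # random-effects weights
    mu = np.sum(w_re * e) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0   # heterogeneity, %
    return mu, (mu - 1.96 * se, mu + 1.96 * se), tau2, i2
```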
Parameterization is a critical step in modelling ecosystem dynamics. However, assigning parameter values can be a technical challenge for structurally complex natural plant communities, and uncertainties in model simulations often arise from inappropriate model parameterization. Here we compared five methods for defining community-level specific leaf area (SLA) and leaf C:N across nine contrasting forest sites along the North-South Transect of Eastern China: the biomass-weighted average for the entire plant community (AP_BW) and four simplified selective-sampling alternatives (the biomass-weighted average over five dominant tree species [5DT_BW], the basal-area-weighted average over five dominant tree species [5DT_AW], the biomass-weighted average over all tree species [AT_BW], and the basal-area-weighted average over all tree species [AT_AW]). We found that the default values for SLA and leaf C:N embedded in Biome-BGC v4.2 were higher than those produced by the five computational methods across the nine sites, with deviations ranging from 28.0 to 73.3%. In addition, there were only slight deviations (<10%) between the NPP predicted from whole-plant-community sampling (AP_BW) and from the four simplified selective sampling methods, and no significant difference between the predictions of AT_BW and AP_BW except at the Shennongjia site. The findings in this study highlight the critical importance of computational strategies for community-level parameterization in ecosystem process modelling and will support the choice of parameterization methods.
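All five methods reduce to the same weighted average with different trait and weight inputs (species biomass for the _BW variants, basal area for the _AW variants, over either the dominant species or all species). A minimal sketch with illustrative numbers:

```python
def weighted_community_trait(trait, weight):
    """Community-level trait (e.g. SLA or leaf C:N) as a weighted average;
    weights are species biomass (_BW methods) or basal area (_AW methods)."""
    total = sum(weight)
    return sum(t * w for t, w in zip(trait, weight)) / total

# Illustrative: three species' SLA (m2/kg) weighted by biomass share
print(weighted_community_trait([12.0, 18.5, 9.3], [0.5, 0.3, 0.2]))
```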