To solve the problem of delayed update of spectrum information (SI) in database-assisted dynamic spectrum management (DB-DSM), this paper studies a novel dynamic SI update scheme for DB-DSM. Firstly, a dynamic SI update mechanism based on spectrum opportunity incentives is established, in which spectrum users are encouraged to actively assist the database in updating SI in real time. Secondly, the information update contribution (IUC) of a spectrum opportunity is defined to describe both the cost heterogeneous spectrum users incur when accessing the opportunity and the profit the database obtains from spectrum allocation through SI updates. The process in which the database sets the IUC of spectrum opportunities and spectrum users select among them is mapped to a Hotelling model. Thirdly, the determination of the IUC of spectrum opportunities is further modelled as a Stackelberg game by establishing multiple virtual spectrum resource providers (VSRPs) in the database, and it is proved that the game in which the VSRPs determine the IUC admits a Nash equilibrium. Finally, an IUC determination algorithm based on a genetic algorithm is designed to reach the optimal IUC. Theoretical analysis and simulation results show that the proposed method quickly finds the optimal IUC and ensures that the spectrum resource provider obtains the optimal profit from SI updates.
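The abstract does not give the genetic-algorithm details, so the following is only a minimal illustrative sketch of how a GA could search for a profit-maximizing IUC in a two-provider Hotelling-style setting; the profit function `vsrp_profit`, the best-response iteration, and all parameter values are assumptions, not the authors' formulation.

```python
import random

def vsrp_profit(iuc, rival_iuc, transport_cost=1.0, demand=1.0):
    """Hypothetical Hotelling-style profit for one VSRP (illustrative only):
    market share shrinks as its IUC exceeds the rival's, scaled by a user 'transport' cost."""
    share = min(max(0.5 + (rival_iuc - iuc) / (2 * transport_cost), 0.0), 1.0)
    return iuc * demand * share

def ga_best_response(rival_iuc, pop_size=30, generations=50, lo=0.0, hi=2.0, mut_sigma=0.05):
    """Toy GA searching one VSRP's best-response IUC against a fixed rival IUC."""
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda x: vsrp_profit(x, rival_iuc), reverse=True)
        parents = ranked[:pop_size // 2]                   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                          # arithmetic crossover
            child += random.gauss(0.0, mut_sigma)          # Gaussian mutation
            children.append(min(max(child, lo), hi))
        pop = parents + children
    return max(pop, key=lambda x: vsrp_profit(x, rival_iuc))

# Iterated best responses as a crude stand-in for solving the game between two VSRPs.
iuc_a, iuc_b = 1.0, 1.0
for _ in range(20):
    iuc_a = ga_best_response(iuc_b)
    iuc_b = ga_best_response(iuc_a)
print(iuc_a, iuc_b)
```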
In evolutionary games, most studies on finite populations have focused on a single updating mechanism. However, given differences in individual cognition, individuals may change their strategies according to different updating mechanisms. For this reason, we consider two different aspiration-driven updating mechanisms in structured populations: satisfied-stay unsatisfied shift (SSUS) and satisfied-cooperate unsatisfied defect (SCUD). To simulate the players' learning process, this paper improves the particle swarm optimization algorithm and uses it to simulate the players' strategy selection, i.e., the population particle swarm optimization (PPSO) algorithm. We find that in the prisoner's dilemma, the conditions under which SSUS facilitates the evolution of cooperation do not enable cooperation to emerge, whereas the conditions under which SCUD promotes the evolution of cooperation do. In addition, the invasion of SCUD individuals helps promote cooperation among SSUS individuals. Simulations with the PPSO algorithm show that the theoretical approximation is consistent with the trend of the simulation results.
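As an illustration only (not the paper's PPSO implementation), the sketch below shows the basic logic of an aspiration-driven SSUS step on a lattice: a player keeps its strategy when its accumulated payoff meets its aspiration level and otherwise switches. The payoff matrix, aspiration level, and lattice layout are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 20                      # lattice side length (assumed)
ASPIRATION = 2.0            # aspiration level (assumed)
# Prisoner's dilemma payoffs for (my strategy, neighbour strategy); 1 = cooperate, 0 = defect.
PAYOFF = {(1, 1): 3.0, (1, 0): 0.0, (0, 1): 5.0, (0, 0): 1.0}

strategies = rng.integers(0, 2, size=(L, L))

def site_payoff(s, i, j):
    """Total payoff of site (i, j) against its four von Neumann neighbours (periodic lattice)."""
    total = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        total += PAYOFF[(s[i, j], s[(i + di) % L, (j + dj) % L])]
    return total

def ssus_step(s):
    """Satisfied-stay unsatisfied-shift: players below their aspiration flip their strategy."""
    payoffs = np.array([[site_payoff(s, i, j) for j in range(L)] for i in range(L)])
    unsatisfied = payoffs < ASPIRATION
    s = s.copy()
    s[unsatisfied] = 1 - s[unsatisfied]
    return s

for _ in range(200):
    strategies = ssus_step(strategies)
print("cooperation level:", strategies.mean())
```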
A fluid-structure interaction approach is proposed in this paper based on Non-Ordinary State-Based Peridynamics (NOSB-PD) and Updated Lagrangian Particle Hydrodynamics (ULPH) to simulate fluid-structure interaction problems involving large geometric deformation and material failure and to solve the fluid-structure interaction problem of Newtonian fluids. In the coupled framework, the NOSB-PD theory describes the deformation and fracture of the solid structure, while ULPH is applied to describe the flow of Newtonian fluids owing to its advantages in computational accuracy. The framework exploits the strengths of NOSB-PD for discontinuous problems and of ULPH for fluid problems, with good computational stability and robustness. A fluid-structure coupling algorithm using pressure as the transmission medium is established to handle the fluid-structure interface. The dynamic model of the solid structure and the PD-ULPH fluid-structure interaction model involving large deformation are verified by numerical simulations, and the results agree with the analytical solution, available experimental data, and other numerical results, demonstrating the accuracy and effectiveness of the proposed method for fluid-structure interaction problems. The fluid-structure interaction model based on ULPH and NOSB-PD established in this paper provides a new idea for the numerical solution of fluid-structure interaction and a promising approach for engineering design and experimental prediction.
Network updates have become increasingly prevalent since the broad adoption of software-defined networks (SDNs) in data centers. Modern TCP designs, including the cutting-edge TCP variants DCTCP, CUBIC, and BBR, however, are not resilient to network updates that provoke flow rerouting. In this paper, we first demonstrate that popular TCP implementations perform inadequately in the presence of frequent and inconsistent network updates, because such updates cause out-of-order packets and packet drops induced by transitory congestion, leading to serious performance deterioration. We investigate the causes and propose a network update-friendly TCP (NUFTCP), an extension of the DCTCP variant, as a solution. Simulations are used to assess the proposed NUFTCP. Our findings reveal that NUFTCP manages the out-of-order packets and packet drops triggered by network updates more effectively, and it outperforms DCTCP considerably.
Natural convection is a heat transfer mechanism driven by temperature or density differences, leading to fluid motion without external influence. It occurs in various natural and engineering phenomena, influencing heat transfer, climate, and fluid mixing in industrial processes. This work uses the Updated Lagrangian Particle Hydrodynamics (ULPH) theory to address natural convection problems. The Navier-Stokes equations are discretized using second-order nonlocal differential operators, allowing a direct solution of the Laplace operator for temperature in the energy equation. Various numerical simulations, including natural convection in square cavities and between two concentric cylinders, were conducted to validate the reliability of the model. The results demonstrate that the proposed model exhibits excellent accuracy and performance, providing a promising and effective numerical approach for natural convection problems.
Julie: What are you looking at, Sam?
Sam: Oh, hi, Julie. I'm looking at Fairview City's weekly snowfall update.
Julie: But it's only Monday.
Sam: I know. The update is for last week's snowfall.
Julie: I see. It's for the second week of this month, then.
Sam: That's right. The dates are from December 8 to December 14.
Somatic symptom disorder (SSD) is a new diagnosis introduced into the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), which is expected to solve the diagnostic difficulties of patients with medically unexplained symptoms. Building on previous work, this review aims to comprehensively synthesise updated evidence related to SSD from recent English-language publications and, more extensively, from data published in Chinese-language journals. The scoping review update was based on an earlier scoping review and included Chinese-language publication data from China National Knowledge Internet (CNKI), WANFANG and WEIPU between January 2013 and May 2022, and data from PubMed, PsycINFO, and the Cochrane Library between June 2020 and May 2022. Initially, 2984 articles were identified, of which 63 full texts were included for analysis. In China, SSD is mainly applied in scientific research, but it also shows good predictive validity and clinical application potential. The mean frequency of SSD was 4.5% in the general population, 25.2% in the primary care population and 33.5% in diverse specialised care settings. Biological factors, such as brain region changes and heart rate variability, are associated with the onset of SSD. Psychological impairment related to somatic symptoms is the best predictor of prognosis. While adolescent SSD is significantly associated with family function, SSD overall is associated with increased dysfunction of cognition and emotion, decreased quality of life, and high comorbidity with anxiety and depressive disorders. Further research is needed on suicide risk and cultural and gender-related issues. By updating the data from Chinese-language studies, our research enriches the evidence-based findings related to the topics addressed in the text sections of the SSD chapter of DSM-5. However, research gaps remain regarding SSD reliability, population-based prevalence, suicide risk, and cultural and gender-related issues.
GTD (Gestational Trophoblastic Disease) is a pathology that encompasses benign and malignant clinical forms, affects women of childbearing age, has a variable incidence, and is more frequent in developing or underdeveloped countries, where it collides with the economic barrier. The frequent absence of clear protocols and guidelines for the correct diagnosis of the pathology results in inadequate classification, imprecise treatment and failed post-therapeutic observation, increasing the risk of relapse, morbidity and mortality. The present study aims to point out updated national and international practice protocols for the diagnosis of GTD through an integrative review. Seven articles were selected, and it was observed that the main international reference centers agree with the management suggested by the IFGO (International Federation of Gynecology and Obstetrics), the conduct in Hydatidiform Mole (HM) being: evacuation by suction and curettage under ultrasound guidance, followed by hCG monitoring every 1 - 2 weeks until normalized (usually one month for Partial Hydatidiform Mole, six months for Complete Hydatidiform Mole and one year for Gestational Trophoblastic Neoplasia). Unfortunately, regarding the diagnosis of HM, the guidelines of some countries show the absence of, or difficulty of access to, the karyotype and p57/ploidy tests or pelvic ultrasound accompanying the uterine curettage, contrary to what is proposed by the IFGO guideline. Establishing and complying with consistent guidelines can improve patient care, with early diagnosis of the pathology and its complications, reducing the rate of recurrence, morbidity and mortality, especially in less developed countries.
The interactions between players of the prisoner's dilemma game are inferred using observed game data. All participants play the game with their counterparts and gain corresponding rewards during each round, and the strategies of each player are updated asynchronously during the game. Two inference methods for the interactions between players are derived with the naive mean-field (nMF) approximation and maximum log-likelihood estimation (MLE), respectively. Both methods are tested numerically for fully connected asymmetric Sherrington-Kirkpatrick models, varying the data length, asymmetry degree, payoff, and system noise (coupling strength). We find that the mean square reconstruction error of the MLE method is inversely proportional to the data length and is typically half that of nMF (benefiting from the extra information in the update times). Both methods are robust to the asymmetry degree but work better for large payoffs. Compared with MLE, nMF is more sensitive to the strength of the couplings and prefers weak couplings.
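The abstract does not spell out the likelihood, so the sketch below assumes a synchronously updated kinetic-Ising-style (logistic) strategy-update model, for which the MLE gradient has a simple moment-matching form. It is purely an illustration of this kind of coupling inference, not the authors' method.

```python
import numpy as np

def mle_couplings(spins, lr=0.1, epochs=500):
    """Gradient-ascent MLE for couplings J under an assumed kinetic-Ising update:
    P(s_i(t+1) = +1 | s(t)) = 1 / (1 + exp(-2 * sum_j J_ij s_j(t))).
    `spins` has shape (T, N) with entries +/-1."""
    T, N = spins.shape
    prev, nxt = spins[:-1], spins[1:]
    J = np.zeros((N, N))
    for _ in range(epochs):
        fields = prev @ J.T                                  # h_i(t), shape (T-1, N)
        grad = (nxt.T @ prev - np.tanh(fields).T @ prev) / (T - 1)
        J += lr * grad                                       # moment matching: <s_i' s_j> - <tanh(h_i) s_j>
    return J

# Tiny synthetic test with random asymmetric couplings (illustrative only).
rng = np.random.default_rng(1)
N, T = 10, 5000
J_true = rng.normal(0.0, 0.3 / np.sqrt(N), size=(N, N))
s = np.empty((T, N))
s[0] = rng.choice([-1.0, 1.0], size=N)
for t in range(T - 1):
    p_up = 1.0 / (1.0 + np.exp(-2.0 * (J_true @ s[t])))
    s[t + 1] = np.where(rng.random(N) < p_up, 1.0, -1.0)
J_hat = mle_couplings(s)
print("reconstruction MSE:", np.mean((J_hat - J_true) ** 2))
```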
To achieve high availability of health data in erasure-coded cloud storage systems, the data update performance of erasure coding should be continuously optimized. However, data update performance is often bottlenecked by the constrained cross-rack bandwidth. Various techniques have been proposed in the literature to improve network bandwidth efficiency, including delta transmission, relay, and batch update. These techniques were largely proposed individually; in this work, we seek to use them jointly. To mitigate the cross-rack update traffic, we propose DXR-DU, which builds on four valuable techniques: (i) delta transmission, (ii) XOR-based data update, (iii) relay, and (iv) batch update. Meanwhile, we offer two selective update approaches: 1) data-delta-based update, and 2) parity-delta-based update. The proposed DXR-DU is evaluated via trace-driven local testbed experiments. Comprehensive experiments show that DXR-DU can significantly improve data update throughput while mitigating the cross-rack update traffic.
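To make the delta-transmission idea concrete, here is a minimal sketch (not DXR-DU itself) of an XOR-based parity update: instead of shipping the full new block across racks, only the data delta is sent and XORed into the parity. Block sizes and helper names are assumptions; a Reed-Solomon code would scale the delta by a coefficient before applying it, which is omitted here.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

def update_with_delta(old_data: bytes, new_data: bytes, parity: bytes) -> tuple[bytes, bytes]:
    """XOR-based data update: the data delta is the only thing that must cross racks.
    For an XOR parity, new_parity = parity XOR (old_data XOR new_data)."""
    delta = xor_bytes(old_data, new_data)     # computed locally at the data node
    new_parity = xor_bytes(parity, delta)     # applied remotely at the parity node
    return new_data, new_parity

# Tiny self-check with a 2-data-block + 1-parity-block stripe (illustrative only).
d0, d1 = b"\x01\x02\x03\x04", b"\x10\x20\x30\x40"
p = xor_bytes(d0, d1)
d0_new = b"\x05\x06\x07\x08"
d0, p = update_with_delta(d0, d0_new, p)
assert p == xor_bytes(d0, d1)                 # parity still consistent after the delta update
```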
With the expansion of network services, large-scale networks have progressively become common. The network status changes rapidly in response to customer needs and configuration changes, so network configuration changes are also very frequent. Regardless of the changes, however, the network must maintain correctness conditions, such as isolating tenants from each other or guaranteeing essential services, so once changes occur it is necessary to verify the changed network. For the verification of large-scale network configuration changes, many current verifiers show poor efficiency. To solve the problem of repeated global verifications caused by frequent updates of local configurations in large networks, we present a fast configuration update verification tool, FastCUV, for distributed control planes. FastCUV aims to enhance the efficiency of distributed control plane verification for medium and large networks while ensuring correctness. This paper presents a method to determine the network range affected by a configuration change, together with a flow model and graph structure that facilitate the design of the verification algorithms and speed up verification. Our scheme verifies the affected network area by obtaining the change of the Forwarding Information Base (FIB) before and after the update. FastCUV supports rich network attributes while achieving high efficiency and correctness. Experimental verification and result analysis show that our method outperforms the state-of-the-art method to a certain extent.
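As an illustration of the before/after FIB comparison (the data structures below are assumptions, not FastCUV's internals), a verifier can diff the two FIB snapshots and re-verify only the devices and prefixes whose forwarding entries actually changed.

```python
from typing import Dict, Set, Tuple

# FIB snapshot per device: prefix -> next hop (assumed representation).
Fib = Dict[str, Dict[str, str]]

def affected_range(before: Fib, after: Fib) -> Set[Tuple[str, str]]:
    """Return (device, prefix) pairs whose forwarding entry was added, removed,
    or changed between two FIB snapshots; only this range needs re-verification."""
    changed = set()
    for device in before.keys() | after.keys():
        old = before.get(device, {})
        new = after.get(device, {})
        for prefix in old.keys() | new.keys():
            if old.get(prefix) != new.get(prefix):
                changed.add((device, prefix))
    return changed

before = {"r1": {"10.0.0.0/24": "r2", "10.0.1.0/24": "r3"}, "r2": {"10.0.1.0/24": "r3"}}
after  = {"r1": {"10.0.0.0/24": "r2", "10.0.1.0/24": "r2"}, "r2": {"10.0.1.0/24": "r3"}}
print(affected_range(before, after))   # {('r1', '10.0.1.0/24')}
```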
Multiple failure modes tend to be identified in the reliability analysis of a redundant truss structure, and the identification process requires updating the model to identify the next potential failure members. Herein we intend to update the finite element model automatically during the identification of failure modes and then perform the system reliability analysis efficiently. This study presents a framework, implemented through the joint simulation of MATLAB and APDL, that consists of three parts for the system reliability analysis of truss structures: the reliability index of a single member, the identification of dominant failure modes, and system-level reliability analysis. Firstly, the RSM (response surface method) is combined with a constrained optimization model to calculate the reliability indices of members. Then the β-unzipping method is adopted to identify the dominant failure modes, and the system function in MATLAB, together with the EKILL command in APDL, is used to automatically update the finite element model and realize load redistribution. Besides, the differential equivalence recursion algorithm is performed to approximate the reliability indices of failure modes efficiently and accurately. Eventually, the PNET (probabilistic network evaluation technique) is used to calculate the joint failure probability as well as the system reliability index. Two illustrative examples demonstrate the accuracy and efficiency of the proposed system reliability analysis framework through comparison with corresponding references.
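Purely to illustrate the PNET step (the correlation data and the demarcation threshold are assumed, and this is not the authors' code): the dominant failure modes can be grouped by pairwise correlation, each group represented by its most probable mode, and the representative probabilities combined as if independent.

```python
import numpy as np

def pnet_system_pf(pf, rho, rho0=0.8):
    """PNET: group failure modes whose correlation with a representative mode exceeds rho0;
    representatives are then treated as statistically independent.
    pf  : failure probabilities of the dominant modes, shape (m,)
    rho : pairwise correlation coefficients between mode safety margins, shape (m, m)"""
    pf = np.asarray(pf, dtype=float)
    order = np.argsort(pf)[::-1]          # most probable modes first
    assigned = np.zeros(len(pf), dtype=bool)
    reps = []
    for i in order:
        if assigned[i]:
            continue
        reps.append(i)                    # mode i becomes a representative
        assigned |= rho[i] >= rho0        # highly correlated modes join its group
    return 1.0 - np.prod(1.0 - pf[reps])  # representatives combined as independent

pf = [1e-3, 8e-4, 5e-4]
rho = np.array([[1.0, 0.9, 0.2],
                [0.9, 1.0, 0.3],
                [0.2, 0.3, 1.0]])
print(pnet_system_pf(pf, rho))            # modes 1 and 2 share a representative here
```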
In the traditional incremental analysis update (IAU) process, all analysis increments are treated as constant forcing in a model's prognostic equations over a certain time window. This approach effectively reduces the high-frequency oscillations introduced by data assimilation. However, as different scales of increments have unique evolutionary speeds and life histories in a numerical model, the traditional IAU scheme cannot fully meet the requirements of short-term forecasting for the damping of high-frequency noise and may even cause systematic drifts. Therefore, a multi-scale IAU scheme is proposed in this paper. Analysis increments are divided into different scale parts using a spatial filtering technique. For each scale of increment, the optimal relaxation time in the IAU scheme is determined by the skill of the forecasting results. Finally, the different scales of analysis increments are added to the model integration during their optimal relaxation times. The multi-scale IAU scheme can effectively reduce the noise and further improve the balance between large-scale and small-scale increments in the model initialization stage. To evaluate its performance, several numerical experiments were conducted to simulate the path and intensity of Typhoon Mangkhut (2018) and showed that: (1) the multi-scale IAU scheme had an obvious effect on noise control at the initial stage of data assimilation; (2) the optimal relaxation times for large-scale and small-scale increments were estimated as 6 h and 3 h, respectively; (3) the forecast performance of the multi-scale IAU scheme in the prediction of Typhoon Mangkhut (2018) was better than that of the traditional IAU scheme. The results demonstrate the superiority of the multi-scale IAU scheme.
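The following sketch shows only the basic idea of applying scale-separated increments as constant forcing over different relaxation windows. The toy one-variable "model", the crude large-scale/residual split, and everything except the 6 h / 3 h windows quoted from the abstract are assumptions for illustration.

```python
import numpy as np

def model_tendency(x):
    """Toy prognostic tendency standing in for the forecast model (assumed)."""
    return -0.05 * x

def multiscale_iau_forcing(t_hours, increments, windows):
    """Sum of constant IAU forcings: each scale's increment is spread uniformly
    over its own relaxation window (increment / window while t < window)."""
    forcing = 0.0
    for delta, tau in zip(increments, windows):
        if t_hours < tau:
            forcing += delta / tau
    return forcing

# Scale separation of a (scalar) analysis increment into a large-scale part and a residual.
total_increment = 1.2
large_scale = 0.8 * total_increment          # e.g. obtained by spatial low-pass filtering
small_scale = total_increment - large_scale  # residual small-scale part
increments, windows = (large_scale, small_scale), (6.0, 3.0)   # hours, per the abstract

dt, x = 0.25, 0.0                            # 15-minute steps
for step in range(48):                       # integrate 12 hours
    t = step * dt
    x += dt * (model_tendency(x) + multiscale_iau_forcing(t, increments, windows))
print("state after 12 h:", x)
```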
This paper presents a new finite element model updating method for estimating structural parameters and detecting structural damage location and severity based on the structural responses (output-only data). The method uses the sensitivity relation of transmissibility data through a least-squares algorithm and an appropriate normalization of the extracted equations. The proposed transmissibility-based sensitivity equation produces a larger number of equations than sensitivity equations based on the frequency response function (FRF), which allows the structural parameters to be estimated with higher accuracy. The abilities of the proposed method are assessed using numerical data from a two-story, two-bay frame model and a plate structure model. In the different damage cases evaluated, the number, location, and stiffness reduction of the damaged elements and the severity of the simulated damage are accurately identified. The reliability and stability of the presented method against measurement and modeling errors are examined using error-contaminated data. The parameter estimation results prove the method's capabilities as an accurate model updating algorithm.
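A minimal sketch of one sensitivity-based least-squares update step, assuming a sensitivity matrix S (derivatives of the measured quantities with respect to the parameters) is already available; the row normalization and the example numbers are assumptions, not the paper's formulation.

```python
import numpy as np

def sensitivity_update(S, residual, normalize=True):
    """One least-squares model-updating step: solve S * d_theta ≈ residual,
    where residual = measured - predicted responses (e.g. transmissibility data)."""
    S = np.asarray(S, dtype=float)
    r = np.asarray(residual, dtype=float)
    if normalize:                                  # row-normalize to balance equation scales
        scale = np.linalg.norm(S, axis=1, keepdims=True)
        scale[scale == 0.0] = 1.0
        S, r = S / scale, r / scale.ravel()
    d_theta, *_ = np.linalg.lstsq(S, r, rcond=None)
    return d_theta

# Illustrative over-determined system: many response equations, few stiffness parameters.
rng = np.random.default_rng(2)
S = rng.normal(size=(40, 3))                       # 40 equations, 3 unknown parameter changes
true_dtheta = np.array([-0.10, 0.00, -0.25])       # e.g. 10% and 25% stiffness reductions
residual = S @ true_dtheta + 0.01 * rng.normal(size=40)
print(sensitivity_update(S, residual))             # close to true_dtheta
```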
The scale and complexity of big data are growing continuously, posing severe challenges to traditional data processing methods, especially in the field of clustering analysis. To address this issue, this paper introduces a new method named Big Data Tensor Multi-Cluster Distributed Incremental Update (BDTMCDIncreUpdate), which combines distributed computing, storage technology, and incremental update techniques to provide an efficient and effective means of clustering analysis. Firstly, the original dataset is divided into multiple sub-blocks, and distributed computing resources are utilized to process the sub-blocks in parallel, enhancing efficiency. Then, initial clustering is performed on each sub-block using tensor-based multi-clustering techniques to obtain preliminary results. When new data arrive, incremental update technology is employed to update the core tensor and factor matrix, ensuring that the clustering model can adapt to changes in the data. Finally, by combining the updated core tensor and factor matrix with historical computational results, refined clustering results are obtained, achieving real-time adaptation to dynamic data. In experimental simulations on the Aminer dataset, the BDTMCDIncreUpdate method demonstrated outstanding performance in terms of accuracy (ACC) and normalized mutual information (NMI), achieving an accuracy of 90% and an NMI score of 0.85, outperforming existing methods such as TClusInitUpdate and TKLClusUpdate in most scenarios. The BDTMCDIncreUpdate method therefore offers an innovative solution for big data analysis, integrating distributed computing, incremental updates, and tensor-based multi-clustering. It improves efficiency and scalability in processing large-scale, high-dimensional datasets, its effectiveness and accuracy have been validated through experiments, and it shows great potential in real-world applications where dynamic data growth is common.
With the development of big data and social computing, large-scale group decision making (LGDM) is now merging with social networks. Using social network analysis (SNA), this study proposes an LGDM consensus model that considers the trust relationships among decision makers (DMs). In the consensus measurement process, the social network is constructed according to the social relationships among DMs, and the Louvain method is introduced to partition the network into subgroups. The weights of each decision maker and each subgroup are computed from comprehensive network weights and trust weights. In the consensus improvement process, a feedback mechanism with four identification rules and two direction rules is designed to guide the improvement. Based on the trust relationships among DMs, the preferences are modified and the corresponding social network is updated to accelerate the consensus. Compared with previous research, the proposed model not only allows the subgroups to be reconstructed and updated during the adjustment process but also improves the accuracy of the adjustment through the feedback mechanism. Finally, an example analysis is conducted to verify the effectiveness and flexibility of the proposed method, and comparison with previous studies highlights its superiority in solving the LGDM problem.
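As a sketch only (the trust graph, the weighting rule, and the consensus measure below are assumptions, not the paper's model), recent NetworkX versions (2.8 and later) can partition a trust network into subgroups with the Louvain method, after which a trust-weighted group preference and a simple consensus level can be computed.

```python
import networkx as nx
import numpy as np

# Assumed trust network: nodes are DMs, edge weights are pairwise trust degrees.
G = nx.Graph()
G.add_weighted_edges_from([
    ("dm1", "dm2", 0.9), ("dm2", "dm3", 0.8), ("dm1", "dm3", 0.7),
    ("dm4", "dm5", 0.9), ("dm5", "dm6", 0.8), ("dm3", "dm4", 0.2),
])

# Subgroup detection with the Louvain method.
subgroups = nx.community.louvain_communities(G, weight="weight", seed=0)

# Assumed individual preference values and a simple trust-based weight:
# normalized weighted degree, i.e. how much trust each DM receives from the others.
preferences = {"dm1": 0.6, "dm2": 0.7, "dm3": 0.65, "dm4": 0.2, "dm5": 0.3, "dm6": 0.25}
strength = dict(G.degree(weight="weight"))
total = sum(strength.values())
weights = {dm: s / total for dm, s in strength.items()}

group_pref = sum(weights[dm] * preferences[dm] for dm in G)
consensus = 1.0 - np.mean([abs(preferences[dm] - group_pref) for dm in G])
print("subgroups:", [sorted(c) for c in subgroups])
print("group preference:", round(group_pref, 3), "consensus level:", round(consensus, 3))
```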
Interval model updating (IMU) methods have been widely used in uncertain model updating due to their low requirements on sample data. However, the surrogate model in IMU methods mostly adopts a one-time construction method, which makes the accuracy of the surrogate model highly dependent on the experience of the users and affects the accuracy of IMU methods. Therefore, an improved IMU method via adaptive Kriging models is proposed. This method transforms the objective function of the IMU problem into two deterministic global optimization problems, about the upper bound and the interval diameter, through universal grey numbers. These optimization problems are addressed through the adaptive Kriging models and the particle swarm optimization (PSO) method to quantify the uncertain parameters, and the IMU is thereby accomplished. During the construction of these adaptive Kriging models, the sample space is gridded according to sensitivity information, and local sampling is then performed in key subspaces based on the maximum mean square error (MMSE) criterion. The interval division coefficient and random sampling coefficient are adaptively adjusted without human interference until the model meets the accuracy requirements. The effectiveness of the proposed method is demonstrated by a numerical example of a three-degree-of-freedom mass-spring system and an experimental example of a butted cylindrical shell. The results show that the updated interval model is in good agreement with the experimental results.
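To illustrate only the adaptive part (this is not the paper's scheme; the test function, stopping rule, and library choice are assumptions), a Kriging/Gaussian-process surrogate can be refined by repeatedly adding the candidate point with the largest predictive variance, which is the spirit of an MMSE-style enrichment criterion.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    """Stand-in for the true (expensive) structural response (assumed)."""
    return np.sin(3.0 * x) + 0.5 * x

rng = np.random.default_rng(3)
X = rng.uniform(0.0, 3.0, size=(4, 1))          # small initial design
y = expensive_model(X).ravel()
candidates = np.linspace(0.0, 3.0, 200).reshape(-1, 1)

for _ in range(10):                             # adaptive enrichment loop
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
    gp.fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    if std.max() < 1e-2:                        # crude accuracy stopping rule (assumed)
        break
    x_new = candidates[[np.argmax(std)]]        # MMSE-style pick: largest predictive variance
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_model(x_new).ravel())

print("surrogate built with", len(X), "samples")
```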
Delamination is a prevalent type of damage in composite laminate structures. Its accumulation degrades structural performance and threatens the safety and integrity of aircraft. This study presents a method for the quantitative identification of delamination in composite materials, leveraging distributed optical fiber sensors and a model updating approach. Initially, a numerical analysis is performed to establish a parameterized finite element model of the composite plate, and this model is then used to generate a database of strain responses corresponding to damage of varying sizes and locations. A radial basis function neural network surrogate model is then constructed from the numerical simulation results and the strain responses captured by the distributed fiber optic sensors. Finally, a multi-island genetic algorithm is employed for global optimization to identify the size and location of the damage. The efficacy of the proposed method is validated through numerical examples and experimental studies examining the correlations between damage location, damage size, and strain responses. The findings confirm that the model updating technique, in conjunction with distributed fiber optic sensors, can precisely identify delamination in composite structures.
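Below is a minimal sketch of the surrogate-plus-optimizer inverse step, with an RBF interpolator standing in for the RBF neural network and SciPy's differential evolution standing in for the multi-island genetic algorithm; the synthetic strain model and all numbers are assumptions for illustration only.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

def fem_strains(params):
    """Stand-in for the parameterized FE model: strain at 5 sensor points as a
    function of (damage location, damage size); purely synthetic."""
    loc, size = params
    x = np.linspace(0.0, 1.0, 5)                       # assumed sensor positions
    return 1.0 + size * np.exp(-((x - loc) ** 2) / 0.02)

# Database of strain responses for varying damage locations/sizes (training set).
rng = np.random.default_rng(4)
train_params = rng.uniform([0.1, 0.05], [0.9, 0.5], size=(200, 2))
train_strains = np.array([fem_strains(p) for p in train_params])
surrogate = RBFInterpolator(train_params, train_strains)  # RBF surrogate of the FE model

# "Measured" strains from an assumed true damage state, with a little noise.
true_params = np.array([0.63, 0.30])
measured = fem_strains(true_params) + 0.002 * rng.normal(size=5)

def mismatch(params):
    """Objective: discrepancy between surrogate prediction and measured strains."""
    return float(np.sum((surrogate(params[None, :])[0] - measured) ** 2))

result = differential_evolution(mismatch, bounds=[(0.1, 0.9), (0.05, 0.5)], seed=0)
print("identified (location, size):", result.x)        # close to true_params
```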