Because of the restricted satellite payloads in LEO mega-constellation networks (LMCNs), remote sensing image analysis, online learning, and other big data services require onboard distributed processing (OBDP). In existing technologies, the efficiency of big data applications (BDAs) in distributed systems hinges on stable, low-latency links between worker nodes. However, LMCNs, with their highly dynamic nodes and long-distance links, cannot provide these conditions, which makes the performance of OBDP difficult to measure directly. To bridge this gap, a multidimensional simulation platform is indispensable: one that can simulate the network environment of LMCNs and run BDAs within it for performance testing. Using STK's APIs and a parallel computing framework, we achieve real-time simulation of thousands of satellite nodes, which are mapped to application nodes through software-defined networking (SDN) and container technologies. We elaborate the architecture and mechanisms of the simulation platform, and take Starlink and Hadoop as realistic examples for simulation. The results indicate that LMCNs exhibit dynamic end-to-end latency that fluctuates periodically with the constellation's movement. Compared with ground data center networks (GDCNs), LMCNs degrade computing and storage job throughput, which can be alleviated by using erasure codes and scheduling data flows among worker nodes.
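The periodic end-to-end latency fluctuation reported above is dominated by the changing propagation distance along inter-satellite routes. The following Python sketch shows how propagation latency can be estimated from per-hop distances; the node positions and the route are hypothetical illustrations, not data from the paper, whose simulator obtains positions from STK at each time step.

```python
# Hypothetical illustration: propagation latency of a multi-hop inter-satellite route.
# Positions (km, Earth-centered coordinates) and the route are made up for demonstration.
import math

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def distance_km(a, b):
    """Euclidean distance between two 3-D positions given in km."""
    return math.dist(a, b)

def route_latency_ms(route_positions):
    """Sum per-hop propagation delays (distance / c) over a route, in milliseconds."""
    total_s = 0.0
    for src, dst in zip(route_positions, route_positions[1:]):
        total_s += distance_km(src, dst) / C_KM_PER_S
    return total_s * 1000.0

# Example: ground gateway -> satellite A -> satellite B -> ground gateway (illustrative numbers)
route = [(6371.0, 0.0, 0.0), (6921.0, 300.0, 0.0), (6921.0, 2500.0, 800.0), (6371.0, 2700.0, 900.0)]
print(f"one-way propagation latency ~= {route_latency_ms(route):.2f} ms")
```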
Background: Missing data occur frequently in clinical studies. With the development of precision medicine, there is increased interest in N-of-1 trials, and Bayesian models are one of the main statistical methods for analyzing their data. This simulation study aimed to compare two statistical methods for handling missing values of quantitative data in Bayesian N-of-1 trials. Methods: Simulated N-of-1 trial data with different coefficients of autocorrelation, effect sizes, and missing ratios were generated with the SAS 9.1 system. The missing values were filled by mean filling and by regression filling, respectively, under the different coefficients of autocorrelation, effect sizes, and missing ratios, using SPSS 25.0. Bayesian models were built to estimate the posterior means with WinBUGS 14. Results: When the missing ratio is relatively small, e.g. 5%, missing values have relatively little effect on the results. Therapeutic effects may be underestimated when the coefficient of autocorrelation increases and no filling is used. However, they may be overestimated when mean or regression filling is used, and the results after mean filling are closer to the actual effect than those after regression filling. With a moderate missing ratio, the estimated effect after mean filling is closer to the actual effect than that after regression filling. When the missing ratio is large (20%), missing data can lead to a significant underestimate of the effect; in this case, the estimated effect after regression filling is closer to the actual effect than that after mean filling. Conclusion: Missing data can affect the therapeutic effects estimated with Bayesian models in N-of-1 trials. The present study suggests that mean filling can be used when the missing ratio is ≤10%; otherwise, regression filling may be preferable.
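As an illustration of the two filling strategies compared above, the Python sketch below imputes missing values of a quantitative outcome by the series mean and by a simple linear regression on measurement time. It is a generic example with synthetic data, not the paper's SAS/SPSS/WinBUGS workflow.

```python
# Generic illustration of mean filling vs. regression filling for a quantitative series.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(20, dtype=float)                 # measurement occasions
y = 10 + 0.3 * t + rng.normal(0, 1, t.size)    # synthetic outcome with a trend
y[[3, 8, 15]] = np.nan                         # introduce missing values

observed = ~np.isnan(y)

# Mean filling: replace every missing value with the mean of the observed values.
y_mean_filled = np.where(observed, y, np.nanmean(y))

# Regression filling: regress y on t using the observed pairs, then predict the gaps.
slope, intercept = np.polyfit(t[observed], y[observed], deg=1)
y_reg_filled = np.where(observed, y, slope * t + intercept)

print("mean filling:      ", np.round(y_mean_filled[[3, 8, 15]], 2))
print("regression filling:", np.round(y_reg_filled[[3, 8, 15]], 2))
```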
An attempt has been made to develop a distributed software infrastructure model for onboard data fusion system simulation, which is also applicable to netted radar systems, onboard distributed detection systems, and advanced C3I systems. Two architectures are provided and verified: one is based on the pure TCP/IP protocol and the client/server model and is implemented with Winsock; the other is based on CORBA (Common Object Request Broker Architecture). Both models improve the performance of the data fusion simulation system in terms of reliability, flexibility, and scalability. Their study is a valuable exploration of incorporating distributed computation concepts into radar system simulation techniques.
The MM5 and its four-dimensional variational data assimilation (4D-Var) system are used in this paper. Based on the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis data, the authors generate an optimal initial condition for a typhoon using the bogus data assimilation (BDA) scheme. BDA is able to recover many of the structural features of typhoons, including a warm-core vortex, the correct center position, and the strong circulation. As a result of BDA with a bogus surface low, dramatic improvement is achieved in the 72-h prediction of Typhoon Herb. Across several cases, initialization by BDA effectively generates a coherent inner structure of the typhoon, which is lacking in the original analysis field, so the intensity forecast is improved greatly. Some improvements are made in the track forecast, but more work still needs to be done.
Background: The universal occurrence of randomly distributed dark holes (i.e., data pits appearing within tree crowns) in LiDAR-derived canopy height models (CHMs) negatively affects the accuracy of extracted forest inventory parameters. Methods: We develop an algorithm based on cloth simulation for constructing a pit-free CHM. Results: The proposed algorithm effectively fills data pits of various sizes while preserving canopy details. Our pit-free CHMs derived from point clouds with different proportions of data pits are markedly better than those constructed using other algorithms, as evidenced by the lowest average root mean square error (0.4981 m) between the reference CHMs and the constructed pit-free CHMs. Moreover, our pit-free CHMs show the best overall performance in maximum tree height estimation (average bias = 0.9674 m). Conclusion: The proposed algorithm can be adopted when working with LiDAR data of different quality and shows high potential for forestry applications.
This paper describes the historical simulations produced by the Chinese Academy of Meteorological Sciences (CAMS) climate system model (CAMS-CSM), which contribute to phase 6 of the Coupled Model Intercomparison Project (CMIP6). The model description, experiment design, and model outputs are presented. Three historical ensemble members are produced by CAMS-CSM: two start from different initial conditions, and one excludes stratospheric aerosol to isolate the effect of volcanic eruptions. The outputs of the historical experiments are validated against observational data. The model reproduces the climatological mean states and seasonal cycles of the major climate system quantities, including surface air temperature, precipitation, and the equatorial thermocline. The long-term trends of air temperature and precipitation are also reasonably captured by CAMS-CSM. Some biases remain in the model and need further improvement. This paper helps users better understand the performance and the datasets of CAMS-CSM.
Basins in many parts of the world are ungauged or poorly gauged, and in some cases existing measurement networks are declining. The purpose of this study was to examine the utility of reanalysis and global precipitation datasets for river discharge simulation in a data-scarce basin. The White Volta basin of Ghana, an international river basin, was selected as the study basin. NCEP1, NCEP2, ERA-Interim, and GPCP datasets were compared with corresponding observed precipitation data. Annual variations were not reproduced by NCEP1, NCEP2, or ERA-Interim. However, GPCP data, which are based on satellite and observed data, had good seasonal accuracy and reproduced annual variations well. The five datasets were then used as input to a hydrologic model consisting of HYMOD, a water balance model, and WTM, a river model; the hydrologic model was calibrated for each dataset by a global optimization method, and river discharge was simulated. The results were evaluated by the root mean square error, relative error, and water balance error. The combination of GPCP precipitation and ERA-Interim evaporation data was the best in most evaluations; the relative errors in the calibration and validation periods were 43.1% and 46.6%, respectively. Moreover, the results for GPCP precipitation and ERA-Interim evaporation were better than those for the combination of observed precipitation and ERA-Interim evaporation. In conclusion, GPCP precipitation data and ERA-Interim evaporation data are very useful for water balance analysis in a data-scarce basin.
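For reference, the three evaluation measures named above can be computed from simulated and observed discharge series as in the Python sketch below. The abstract does not give the study's exact definitions of relative error and water balance error, so common textbook formulations are assumed here, and the discharge values are illustrative.

```python
# Common formulations of the evaluation metrics mentioned in the abstract.
# The study's exact definitions may differ; these are standard assumed forms.
import numpy as np

def rmse(sim, obs):
    """Root mean square error between simulated and observed discharge."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def relative_error(sim, obs):
    """Mean absolute error relative to the mean observed discharge (assumed form), in %."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.mean(np.abs(sim - obs)) / np.mean(obs) * 100.0)

def water_balance_error(sim, obs):
    """Relative difference between total simulated and total observed volume, in %."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float((np.sum(sim) - np.sum(obs)) / np.sum(obs) * 100.0)

obs = [12.0, 30.5, 55.0, 40.2, 18.9]   # illustrative daily discharge, m^3/s
sim = [10.5, 33.0, 50.1, 44.0, 20.3]
print(rmse(sim, obs), relative_error(sim, obs), water_balance_error(sim, obs))
```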
As castings become more complicated and the demands on the precision of numerical simulation increase, the data produced by casting numerical simulation become massive. On a general personal computer, such massive numerical data may exceed the available memory, causing rendering to fail. Based on the out-of-core technique, this paper proposes a method that effectively utilizes external storage and dramatically reduces memory usage, thereby solving the problem of insufficient memory for rendering massive data on general personal computers. Based on this method, a new post-processor is developed. It can illustrate the filling and solidification processes of a casting, as well as thermal stress, and it provides fast interaction with the simulation results. Theoretical analysis and several practical examples show that the memory usage and loading time of the post-processor are independent of the size of the relevant files and depend only on the proportion of cells on the surface. Meanwhile, the speed of rendering and of fetching values at the mouse position is appreciable, and the requirements of real-time operation and interactivity are satisfied.
Recent advances in deep learning have opened new possibilities for fluid flow simulation in petroleum reservoirs. However, the predominant approach in existing research is to train neural networks on high-fidelity numerical simulation data. This presents a significant challenge because the only source of authentic wellbore production data for training is sparse. In response, this work introduces a novel architecture called the physics-informed neural network based on domain decomposition (PINN-DD), which aims to effectively utilize the sparse production data of wells for reservoir simulation in large-scale systems. To harness the strength of physics-informed neural networks (PINNs) on small spatial-temporal domains while addressing the challenges of large-scale systems with sparse labeled data, the computational domain is divided into two distinct sub-domains: a well-containing sub-domain and a well-free sub-domain. The two sub-domains and their interface are rigorously constrained by the governing equations, data matching, and boundary conditions. The accuracy of the proposed method is evaluated on two problems, and its performance is compared against state-of-the-art PINNs as a benchmark. The results demonstrate the superiority of PINN-DD in handling large-scale reservoir simulation with limited data and show its potential to outperform conventional PINNs in such scenarios.
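A domain-decomposed PINN of this kind is typically trained by minimizing a composite loss over both sub-domains and their interface. The form below is an assumed illustration consistent with the constraints named in the abstract (governing equations, data matching, boundary conditions, interface); the paper's exact objective and weights are not given there.

```latex
% Assumed form of a domain-decomposition PINN loss (illustrative, not the paper's exact objective):
% \Omega_1 = well-containing sub-domain, \Omega_2 = well-free sub-domain, \Gamma = interface.
\mathcal{L}(\theta_1,\theta_2)
  = \lambda_{\mathrm{pde}}\sum_{k=1}^{2}\big\|\mathcal{N}[u_{\theta_k}]\big\|^{2}_{\Omega_k}
  + \lambda_{\mathrm{data}}\big\|u_{\theta_1}-u_{\mathrm{well}}\big\|^{2}_{\mathcal{D}}
  + \lambda_{\mathrm{bc}}\sum_{k=1}^{2}\big\|\mathcal{B}[u_{\theta_k}]\big\|^{2}_{\partial\Omega_k}
  + \lambda_{\Gamma}\big\|u_{\theta_1}-u_{\theta_2}\big\|^{2}_{\Gamma}
```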
The monopile is the most common foundation for supporting offshore wind turbines. In the marine environment, local scour due to combined currents and waves is a significant issue that must be considered in the design of wind turbine foundations. In this paper, a full-scale numerical model was developed and validated against field data from Rudong, China. Scour development around monopiles was investigated, and the effects of waves and of the Reynolds number Re were analyzed. Several formulas from the literature for predicting the scour depth were evaluated. It is found that waves can accelerate scour development even when the KC number is small (0.78 < KC < 1.57). Formulas obtained from small-scale model tests may be unsafe or wasteful when applied in practical design because of the scale effect. A new equation for predicting the scour depth based on the average pile Reynolds number (Rea) is proposed and validated with field data. The equilibrium scour depth predicted by the proposed equation is evaluated and compared with those from nine equations in the literature. The values predicted by the proposed equation and by the S/M (Sheppard/Melville) equation are shown to be closer to the field data.
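The two dimensionless groups referred to above have standard definitions, sketched in Python below. The abstract does not state how Rea is averaged in the paper, so a simple time-average of the instantaneous pile Reynolds number over one wave period is assumed, and the pile and flow values are illustrative only.

```python
# Standard definitions of the Keulegan-Carpenter number and pile Reynolds number.
# The averaged Reynolds number Rea is illustrated as an assumed time-average of
# U(t)*D/nu over one wave period; the paper's actual definition may differ.
import numpy as np

NU_SEAWATER = 1.05e-6   # kinematic viscosity of seawater, m^2/s (typical value)

def kc_number(u_wave_max, wave_period, pile_diameter):
    """Keulegan-Carpenter number: KC = U_m * T / D."""
    return u_wave_max * wave_period / pile_diameter

def average_pile_reynolds(u_series, pile_diameter, nu=NU_SEAWATER):
    """Assumed averaged pile Reynolds number over a velocity time series."""
    return float(np.mean(np.abs(u_series)) * pile_diameter / nu)

# Illustrative numbers for a full-scale monopile under combined current and waves
D, T, Um, Uc = 5.0, 8.0, 0.6, 1.0        # diameter (m), wave period (s), wave and current velocities (m/s)
t = np.linspace(0.0, T, 200, endpoint=False)
u = Uc + Um * np.sin(2 * np.pi * t / T)  # combined velocity at the pile
print(f"KC = {kc_number(Um, T, D):.2f}, Rea ~= {average_pile_reynolds(u, D):.2e}")
```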
Typhoon Rananim (0414) has been simulated using the non-hydrostatic Advanced Regional Prediction System (ARPS) from the Center for Analysis and Prediction of Storms (CAPS). The prediction of Rananim is generally improved in ARPS when the new-generation CINRAD Doppler radar data are used. Numerical experiments with and without the radar data show that assimilating the radar radial velocity data into the ARPS initial fields changes the wind field in the middle and upper troposphere: fine-scale characteristics of the tropical cyclone (TC) are introduced into the initial wind, the x component of the wind speed south of the TC is increased, and so is the y component west of it. These changes lead to improved forecasting of the TC track after landfall. The fields of water vapor mixing ratio, temperature, cloud water mixing ratio, and rainwater mixing ratio are also improved by using the radar reflectivity data, and the model's initial response to the introduction of hydrometeors is increased. It is shown that horizontal model resolution has a significant impact on intensity forecasts, greatly improving the forecasting of TC rainfall, especially heavy rainstorms, as well as their distribution and variation with time.
In the last decade, building energy simulation (BES) became a central component in the design and optimization of building energy systems. For each building location, BES requires one year of hourly weather data. Most buildings are designed to last 50+ years; consequently, the building design phase should include BES with future weather files that account for climate change. This paper presents a comparative study of two methods for producing future-climate hourly data files for BES: morphing and the typical meteorological year of future climate (F-TMY). The study uses data from a high-resolution (9 km) regional climate atmospheric model simulation of Iberia spanning 10 years of historical and future hourly data. The two methods are compared by analyzing anomalies in air temperature and the impact on BES predictions of annual and peak energy consumption for space heating, cooling, and ventilation in four buildings. Additionally, a sensitivity analysis of the morphing method is performed. The analysis shows that F-TMY is representative of the multi-year simulation for BES applications. A high-quality morphed TMY weather file performs similarly to F-TMY (average difference: 8% versus 7%). Morphing based on different baseline climates, low grid resolution, and/or outdated climate projections leads to BES average differences of 16%-20%.
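For context, morphing adjusts a present-day hourly weather series with monthly climate-change signals by shifting and stretching. The Python sketch below follows the commonly cited shift-and-stretch form for dry-bulb temperature; the study's exact implementation, variable set, and change signals are not given in the abstract, so the function and example values are assumptions.

```python
# Shift-and-stretch morphing of hourly dry-bulb temperature (commonly attributed to
# Belcher et al.); monthly change signals here are illustrative and would normally
# come from a climate projection. A sketch, not the paper's implementation.
import numpy as np

def morph_temperature(t_hourly, month_of_hour, delta_t_month, stretch_month):
    """
    t_hourly       : present-day hourly temperatures (deg C)
    month_of_hour  : month index (1-12) for every hour
    delta_t_month  : projected change in monthly mean temperature (deg C), per month
    stretch_month  : scaling of the within-month variability (dimensionless), per month
    Returns the morphed series: t + dT_m + alpha_m * (t - <t>_m).
    """
    t_hourly = np.asarray(t_hourly, float)
    month_of_hour = np.asarray(month_of_hour)
    morphed = np.array(t_hourly, copy=True)
    for m in range(1, 13):
        idx = month_of_hour == m
        if not np.any(idx):
            continue
        t_mean = t_hourly[idx].mean()
        morphed[idx] = t_hourly[idx] + delta_t_month[m] + stretch_month[m] * (t_hourly[idx] - t_mean)
    return morphed

# Tiny illustrative example: two January hours and two July hours
t = [2.0, 6.0, 24.0, 30.0]
months = [1, 1, 7, 7]
dT = {1: 1.5, 7: 2.5}            # assumed monthly mean warming, deg C
alpha = {1: 1.05, 7: 1.10}       # assumed variability stretch
print(morph_temperature(t, months, dT, alpha))
```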
Dynamic numerical simulation of water conditions is useful for reservoir management. In remote semi-arid areas, however, the meteorological and hydrological time-series data needed for computation are not measured frequently and must be obtained from other information. This paper presents a case study of data generation for computing the thermal conditions in the Joumine Reservoir, Tunisia. Data from the Wind Finder web site and the daily sunshine duration at the nearest weather stations were used to generate cloud cover and solar radiation data based on meteorological correlations obtained in Japan, which lies at the same latitude as Tunisia. A time series of inflow water temperature was estimated from air temperature using a numerical filter expressed as a linear second-order differential equation. A numerical simulation using a vertical 2-D (two-dimensional) turbulent flow model for a stratified water body, driven by the generated data, successfully reproduced the seasonal thermal conditions in the reservoir, which were monitored with a thermistor chain.
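The inflow-temperature estimation mentioned above can be pictured as passing air temperature through a second-order lag filter. The discretization, coefficients, and data in the Python sketch below are assumptions for illustration only; the abstract does not specify the filter's coefficients or its exact form beyond being a linear second-order differential equation.

```python
# Illustrative second-order lag filter  T_w'' + a*T_w' + b*T_w = b*T_a,  discretized
# with a simple explicit scheme. Coefficients and data are assumed, not the paper's.
import numpy as np

def second_order_filter(t_air, dt, a, b, tw0=None):
    """Estimate water temperature by filtering air temperature with a 2nd-order lag."""
    t_air = np.asarray(t_air, float)
    tw = np.empty_like(t_air)
    tw[0] = t_air[0] if tw0 is None else tw0
    v = 0.0  # dT_w/dt
    for i in range(1, t_air.size):
        acc = b * (t_air[i - 1] - tw[i - 1]) - a * v   # T_w'' from the ODE
        v += acc * dt
        tw[i] = tw[i - 1] + v * dt
    return tw

days = np.arange(365.0)
t_air = 18.0 + 10.0 * np.sin(2 * np.pi * (days - 100) / 365.0)   # synthetic air temperature
t_water = second_order_filter(t_air, dt=1.0, a=0.4, b=0.02)      # assumed coefficients (1/day, 1/day^2)
print(f"air range {t_air.min():.1f}..{t_air.max():.1f}, filtered range {t_water.min():.1f}..{t_water.max():.1f}")
```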
The progress of safety technologies, based on continuous advances in vehicle crashworthiness, restraint systems, and active safety functions, has made traffic safer than ever before. The latest developments, moving from Advanced Driver Assistance Systems (ADAS) to Automated Driving (AD), lead to more and more complex real-world situations that must be handled, from standard driving tasks up to critical situations that may cause a collision. Therefore, throughout the development process of such systems, it has become common to use simulation technologies to assess them in advance. To obtain results from the simulation, input data are required, typically from various sources, so that the requirements can be covered; this raises the challenge of coping with different input sources. To address this problem, two main kinds of input data are generally needed: (1) the descriptive parameters covering all boundary conditions, the so-called parameter space; and (2) the system specifications for estimation. The quality of the results correlates strongly with the quality of the inputs. For ADAS systems and AD functions, the second kind of input data is very well known; the major challenges relate to the first kind. The paper therefore describes a way to create input data that cover all descriptive parameters needed, from normal driving up to the collision, by combining accident analysis and real-world road traffic observations. The method aims to be applicable to different data sources and different countries.
The field of fluid simulation is developing rapidly, and data-driven methods provide many frameworks and techniques for it. This paper presents a survey of data-driven methods used in fluid simulation in computer graphics in recent years. First, we give a brief introduction to physics-based fluid simulation methods according to their spatial discretization, including Lagrangian, Eulerian, and hybrid methods. The characteristics of these underlying structures and their inherent connections with data-driven methodologies are then analyzed. Subsequently, we review studies covering a wide range of applications, including data-driven solvers, detail enhancement, animation synthesis, fluid control, and differentiable simulation. Finally, we discuss related issues and potential directions in data-driven fluid simulation. We conclude that fluid simulation combined with data-driven methods has advantages over traditional methods under the same parameters, such as higher simulation efficiency, richer details, and different pattern styles; data-driven fluid simulation is thus feasible and has broad prospects.
Recently, haze in China has become more and more serious, but it is very difficult to model and control. Here, a data-driven model is introduced for the simulation and monitoring of China's haze. First, a multi-dimensional evaluation system is built to evaluate government performance on haze control. Second, a data-driven model is employed to reveal the operating mechanism of China's haze, described as a multi-input, multi-output system. Third, a prototype system is set up to verify the proposed scheme, and the result provides a graphical tool for monitoring different haze control strategies.
The techniques of hardware-in-the-loop simulation (HLS) of a homing anti-tank missile based on a personal computer (PC) are discussed. The PC and the MCS-96 chip controller employ A/D and D/A boards (with photoelectric isolation) to transfer measurement and control information about the homing head, gyro, and rudder, and use a digital handshaking board to establish a correct communication protocol. To satisfy the real-time requirement of HLS, this paper first simplifies the aerodynamic data file reasonably, then builds the PC software in the C language; the controller program is written in the PL/M language. The PC-based HLS is run with the same 10 ms sampling period as that of the YH-F1, and the experimental results are identical to those of the digital simulation of the homing anti-tank guided missile.
Based on MATRIXx, a universal real-time visual distributed simulation system is developed. The system can receive different input data from the network or a local terminal. Application models in the simulation modules automatically obtain such data for analysis and calculation, and then produce real-time simulation control information. This paper also designs the relevant simulation components for handling the input and output data, which guarantees the real-time performance and universality of the data transmission. Results from the experimental system show that the real-time performance of the simulation is excellent.