Approximate dynamic programming (ADP) is a general and effective approach for solving optimal control and estimation problems by adapting to uncertain and nonconvex environments over time.
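A minimal sketch of one common instance of ADP, fitted value iteration, may make the idea concrete. The toy 1-D regulation problem below (dynamics s' = s + u, quadratic stage cost, polynomial features) is an assumption for illustration only, not a method from the abstract.

```python
import numpy as np

def features(s):
    # quadratic polynomial features for the value-function approximation
    return np.stack([np.ones_like(s), s, s ** 2], axis=-1)

def fitted_value_iteration(gamma=0.9, iters=200):
    states = np.linspace(-1.0, 1.0, 41)
    actions = np.linspace(-0.5, 0.5, 21)
    w = np.zeros(3)  # weights of the approximate value function
    for _ in range(iters):
        # Bellman backup over a grid of states and candidate actions
        s_next = np.clip(states[:, None] + actions[None, :], -1.0, 1.0)
        q = states[:, None] ** 2 + actions[None, :] ** 2 \
            + gamma * (features(s_next) @ w)
        targets = q.min(axis=1)
        # project the backed-up values onto the feature space (least squares)
        w, *_ = np.linalg.lstsq(features(states), targets, rcond=None)
    return w, states

w, states = fitted_value_iteration()
```

The least-squares projection is what makes this "approximate": the value function is never stored exactly, only its weights.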
There is a common difficulty in elastic-plastic impact codes such as EPIC [2,3] and NONSAP [4]: most of these codes use simple linear shape functions, usually taken from static problems, to represent the displacement components. In such a finite element formulation, the stress components are constant within each element and discontinuous across neighbouring elements, so the basis for applying the virtual work principle to such elements is unreliable. In this paper, we introduce a new method, the compatible stress iterative method, to eliminate this difficulty. The calculated examples show that dynamic finite element analysis of high-velocity impact using the new method is valid and stable, and that the element stiffness can be somewhat reduced.
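The discontinuity problem can be seen in a few lines. Below, linear 1-D elements produce a piecewise-constant stress, and a plain nodal-averaging step recovers a continuous field; this is a generic stress-recovery technique shown for illustration, not necessarily the paper's compatible stress iterative method, and the displacement field u = x^2 is an assumption.

```python
import numpy as np

nodes = np.linspace(0.0, 1.0, 6)
u = nodes ** 2                               # assumed nodal displacements
elem_stress = np.diff(u) / np.diff(nodes)    # constant per linear element
# average the element values meeting at each node -> continuous nodal field
nodal = np.zeros_like(nodes)
counts = np.zeros_like(nodes)
for e, s in enumerate(elem_stress):
    for n in (e, e + 1):
        nodal[n] += s
        counts[n] += 1
nodal /= counts
```

For this quadratic displacement field the interior nodal averages happen to reproduce the exact stress du/dx = 2x, while the raw element stresses jump at every element boundary.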
This study presents a methodology to evaluate and prevent security vulnerabilities in web applications. The analysis process is based on techniques and tools that support both white-box and black-box security assessments, so that the security validation of a web application can be carried out in an agile and precise way. The objective of the methodology is to exploit the synergies between semi-automatic static and dynamic security analysis tools and manual checks. Each phase of the methodology is supported by security analysis tools of different degrees of coverage, and the results generated in one phase feed the following phases to produce an optimized global security analysis result. The methodology can be used as part of more general methodologies that do not cover how to use static and dynamic analysis tools in the implementation and testing phases of a Secure Software Development Life Cycle (SSDLC). A practical application of the methodology to a real web application demonstrates its effectiveness, yielding a better-optimized vulnerability detection result against the true- and false-positive metrics. Dynamic analysis with manual checking is used to audit the results: 24.6 per cent of the security vulnerabilities reported by the static analysis were checked. This phase is important because each reported vulnerability can be confirmed as a true or false positive by a second, dynamic tool, and it allows studying which vulnerabilities can be directly exploited externally. Dynamic analysis finds six additional critical vulnerabilities. Access control analysis finds five further important vulnerabilities, such as Insufficiently Protected Passwords, and Weak Password Policy and Excessive Authentication Attacks, two vulnerabilities that permit brute-force attacks.
Dynamic data driven simulation (DDDS) is proposed to improve a model by incorporating real data from the operational system into the model. Instead of a single static input, multiple possible sets of inputs are fed into the model, and the computational errors are corrected using statistical approaches. DDDS involves a variety of aspects, including uncertainty modeling, measurement evaluation, coupling of the system model and the measurement model, computational complexity, and performance. The authors set up a DDDS architecture for the wildfire spread model DEVS-FIRE, based on the discrete event system specification (DEVS) formalism. The experimental results show that the framework can track a dynamically changing fire front from fire sensor data, and thus provides more accurate predictions.
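The core DDDS idea, running an ensemble of candidate inputs and statistically reweighting them against sensor data, can be sketched on a toy problem. The 1-D "fire front" with an uncertain spread speed below is an illustrative assumption, not the DEVS-FIRE model.

```python
import numpy as np

rng = np.random.default_rng(3)

true_speed = 1.3                             # unknown to the simulation
speeds = np.linspace(0.5, 2.0, 16)           # candidate input sets
weights = np.full(len(speeds), 1.0 / len(speeds))
for k in range(1, 11):                       # ten sensor updates
    obs = true_speed * k + rng.normal(0.0, 0.2)  # noisy front position
    pred = speeds * k                             # each input's prediction
    # Bayesian-style reweighting by the observation likelihood
    like = np.exp(-0.5 * ((obs - pred) / 0.2) ** 2)
    weights = weights * like
    weights /= weights.sum()
estimate = float(np.sum(weights * speeds))   # data-corrected speed estimate
```

After a few updates the weights concentrate on the inputs consistent with the sensors, which is how the simulation "adapts" to real data instead of running open-loop from one static input.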
Background Disturbed circadian rhythm is a potential cause of delirium and is linked to disorganisation of circadian rhythmicity. Dynamic light (DL) could reset the circadian rhythm by activating the suprachiasmatic nucleus and thereby prevent delirium. Evidence regarding the effects of light therapy has predominantly focused on psychiatric disorders and circadian rhythm sleep disorders. In this study, we investigated the effect of DL on total hospital length of stay (LOS) and the occurrence of delirium in patients admitted to the Coronary Care Unit (CCU). Methods This was a retrospective cohort study. Patients older than 18 years who were hospitalized longer than 12 h at the CCU and had a total hospital LOS of at least 24 h were included. Patients were assigned to a room with DL (n = 369) or regular lighting conditions (n = 379). DL was administered at the CCU by two ceiling-mounted light panels delivering light with a colour temperature between 2700 and 6500 degrees Kelvin. Reported outcome data were total hospital LOS, delirium incidence, consultation of a geriatrician, and the amount of prescribed antipsychotics. Results Between May 2015 and May 2016, data from 748 patients were collected. Baseline characteristics, including risk factors provoking delirium, were equal in both groups. Median total hospital LOS was 100.5 (70.8-186.0) h in the DL group and 101.0 (73.0-176.4) h in the control group (P = 0.935). The incidence of delirium in the DL and control groups was 5.4% (20/369) and 5.0% (19/379), respectively (P = 0.802). No significant differences between the DL and control groups were observed in secondary endpoints. Subgroup analyses based on age and CCU LOS also showed no differences.
Conclusion Our study suggests that exposure to DL as an early single approach does not reduce total hospital LOS or the incidence of delirium. When delirium was diagnosed, it was associated with poor hospital outcome.
Flight delay prediction remains an important research topic owing to the dynamic nature of flight operations and the numerous delay factors. A dynamic data-driven application system in the control area can provide a solution to this problem. However, to apply the approach, a state-space flight delay model must be established to represent the relationships among system states, as well as between system states and input/output variables. Based on an analysis of the delay event sequence in a single flight, a state-space mixture model is established and the input variables of the model are studied. A case study is carried out on historical flight delay data. In addition, the genetic expectation-maximization (EM) algorithm is used to obtain globally optimal estimates of the parameters in the mixture model, and the results fit the historical data. Finally, the model is validated with Kolmogorov-Smirnov tests. Results show that the model fits the data with reasonable goodness, and that the search performance of the traditional EM algorithm can be improved by using the genetic algorithm.
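The parameter-estimation step can be illustrated with plain EM for a two-component Gaussian mixture on synthetic "delay" data. The data, the component count, and the crude initialisation are assumptions for the sketch; the paper's genetic EM additionally runs a global search over candidate solutions to escape local optima, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(10, 2, 300), rng.normal(40, 5, 200)])

def em_gaussian_mixture(x, iters=100):
    mu = np.array([x.min(), x.max()])        # crude initialisation
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

pi, mu, sigma = em_gaussian_mixture(data)
```

On well-separated components this recovers the generating means closely; the motivation for a genetic wrapper is precisely that plain EM can stall in a local optimum when the initialisation is poor.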
Computer systems have emerged over many years and now occupy an important place in our daily lives. Growing needs and the ever-increasing use of computer systems have made application development more and more complicated. The complexity of these applications poses problems such as reuse, installation, administration, and evolution. The development of applications is tied to the evolution of development paradigms and approaches. This paper presents the different development approaches and paradigms, starting with the procedural approach and moving through the object-oriented and component-based approaches up to the service-oriented approach. For each approach, we identify its advantages and limitations.
A high-frequency radar system has been deployed in Galway Bay, a semi-enclosed bay on the west coast of Ireland. The system provides surface currents with fine spatial resolution every hour. Prior to its use for model validation, the accuracy of the radar data was verified through comparison with measurements from acoustic Doppler current profilers (ADCPs); a good correlation was found between the time series of surface current speeds and directions obtained from the radar and the ADCPs. Since Galway Bay lies on the coast of the Atlantic Ocean, it is subject to relatively windy conditions, and surface currents are therefore strongly wind-driven. With a view to assimilating the radar data for forecasting purposes, a three-dimensional numerical model of Galway Bay, the Environmental Fluid Dynamics Code (EFDC), was developed based on a terrain-following vertical (sigma) coordinate system. This study shows that the performance and accuracy of the numerical model, particularly with regard to tide- and wind-induced surface currents, are sensitive to the vertical layer structure. Results of five models with different layer structures are presented and compared with radar measurements. A variable vertical structure with thin layers at the bottom and the surface and thicker layers in the middle of the water column was found to be the optimal layer structure for reproducing tide- and wind-induced surface currents. This structure ensures that wind shear can propagate properly from the surface layer to the sub-surface layers, so that wind forcing is not overdamped by tidal forcing. The vertical layer structure affects not only the velocities at the surface layer but also the velocities further down the water column.
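A variable vertical structure of the kind described, thin layers at the surface and bottom with thicker layers mid-column, can be generated in a few lines. The sine-based profile below is an assumption for illustration, not the layer distribution actually used in the EFDC model.

```python
import numpy as np

def sigma_layer_thicknesses(n_layers=10):
    # thicknesses peak mid-column and taper toward surface and bottom;
    # returned as fractions of total depth, summing to one (sigma layers)
    raw = 1.0 + np.sin(np.linspace(0.0, np.pi, n_layers))
    return raw / raw.sum()

dz = sigma_layer_thicknesses()
```

Thin surface layers let wind shear enter the water column gradually, which is the physical motivation the study gives for this structure.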
Signal processing in phase space, based on nonlinear dynamics theory, is a new method for underwater acoustic signal processing. A key problem when analyzing an actual acoustic signal in phase space is how to reduce the noise and lower the embedding dimension. In this paper, the local-geometric-projection method is applied to extract low-dimensional components from various target-radiated noises, and the derived phase portraits clearly show low-dimensional attractors. Furthermore, the attractor dimension and the cross-prediction error are used for classification. We conclude that combining these features, which represent the geometric and dynamical properties respectively, is effective for target classification.
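The standard first step of such phase-space analysis, delay-coordinate embedding of a scalar series, can be sketched as follows. The test signal and the delay are assumptions; the paper's local-geometric-projection noise reduction is not shown.

```python
import numpy as np

def delay_embed(x, dim, tau):
    # stack delayed copies of the series into phase-space vectors
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

t = np.linspace(0.0, 20 * np.pi, 2000)
signal = np.sin(t)
emb = delay_embed(signal, dim=2, tau=25)   # each row is (x(t), x(t + tau))
```

For a clean sine the embedded trajectory traces a closed curve, the low-dimensional attractor; noise reduction and dimension estimates then operate on this reconstructed trajectory.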
This paper compares the quality and execution times of several algorithms for scheduling service-based workflow applications with changeable service availability and parameters. A workflow is defined as a directed acyclic graph with nodes corresponding to tasks and edges to dependencies between tasks. For each task, one of several available services must be chosen and scheduled to minimize the workflow execution time while keeping the cost of services within the budget. During the execution of a workflow, some services may become unavailable, new ones may appear, and costs and execution times may change with a certain probability; rescheduling is then needed to obtain a better schedule. A solution is proposed showing how integer linear programming can be used to obtain optimal solutions for smaller problems or suboptimal solutions for larger ones. It is compared side-by-side with GAIN, divide-and-conquer, and genetic algorithms for various probabilities of service unavailability or change in service parameters. The algorithms are implemented and subsequently tested in a real BeesyCluster environment.
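The underlying selection problem, one service per task, minimize time subject to a cost budget, is small enough to enumerate on a toy instance. The paper formulates it as an integer linear program; the brute-force sketch below, over an assumed instance of three sequential tasks with two (time, cost) options each, only illustrates the same objective and constraint.

```python
from itertools import product

options = {
    "t1": [(5, 10), (3, 20)],
    "t2": [(4, 5), (2, 15)],
    "t3": [(6, 8), (4, 12)],
}
budget = 40

def best_assignment(options, budget):
    tasks = list(options)
    best = None
    for choice in product(*(options[t] for t in tasks)):
        time = sum(t for t, _ in choice)   # sequential tasks: makespan = sum
        cost = sum(c for _, c in choice)
        # keep the fastest schedule that stays within the cost budget
        if cost <= budget and (best is None or time < best[0]):
            best = (time, cost, dict(zip(tasks, choice)))
    return best

time, cost, plan = best_assignment(options, budget)
```

Note that the globally fastest choice (3 + 2 + 4 = 9 time units) costs 47 and violates the budget, so the optimum trades some speed for feasibility, exactly the tension the ILP resolves at scale.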
The state of the art and future prospects of applying computational fluid dynamics to engineering purposes are described. 2D and 3D simulations are presented for flow about a pair of bridges, flow about a cylinder in waves, flow about an airplane and a ship, flow past a sphere, a two-layer flow, and flow in a wall boundary layer. The choice of grid system and of turbulence model is discussed.
The Particle Filter (PF) is a data assimilation method for recursive state estimation that does not depend on the assumption of Gaussian noise and can be applied to various systems, even those with non-linear dynamics and non-Gaussian noise. However, when applied to dynamic systems, the PF suffers from particle degeneracy, sample impoverishment, and high computational complexity. Rapidly developing sensing technologies now provide real-time big traffic data from the system under study like never before; moreover, some sensors can even receive control commands to adjust their monitoring parameters. To address these problems, a bidirectional dynamic data-driven improvement framework for the PF (B3DPF) is proposed. B3DPF enhances the feedback between the simulation model and the big traffic data collected by the sensors: the execution strategies of B3DPF (sensor data management, the parameters used in weight computation, and resampling) can be optimized based on the simulation results, and the types and dimensions of the traffic data injected into B3DPF can be adjusted dynamically. The first experiment indicates that B3DPF overcomes the particle degeneracy and sample impoverishment problems and estimates the state accurately at a faster speed than the normal PF; more importantly, the new method has higher accuracy for multidimensional random systems. In the remaining experiments, the proposed framework is applied to estimate the traffic state on a real road network and obtains satisfactory results. Further experiments can be designed to validate the universal properties of B3DPF.
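The baseline PF that B3DPF builds on, including the resampling step that fights degeneracy, can be sketched on a toy 1-D random-walk state with noisy observations. This is a plain bootstrap particle filter under assumed dynamics and noise, not the paper's B3DPF feedback framework.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter(obs, n=500, q=1.0, r=1.0):
    particles = rng.normal(0.0, 1.0, n)
    estimates = []
    for y in obs:
        # propagate each particle through the assumed random-walk dynamics
        particles = particles + rng.normal(0.0, q, n)
        # weight by the observation likelihood, then normalise
        w = np.exp(-0.5 * ((y - particles) / r) ** 2)
        w /= w.sum()
        # resample in proportion to the weights to fight degeneracy
        particles = particles[rng.choice(n, size=n, p=w)]
        estimates.append(particles.mean())
    return np.array(estimates)

truth = np.cumsum(rng.normal(0.0, 1.0, 50))   # simulated true trajectory
obs = truth + rng.normal(0.0, 1.0, 50)        # noisy sensor readings
est = particle_filter(obs)
```

B3DPF's contribution is to tune exactly the knobs visible here (which sensor data enter, how weights are computed, when resampling fires) from feedback between the simulation and the data stream.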
Funding (flight delay prediction study): Supported by the High Technology Research and Development Programme of China (2006AA12A106).
Funding (Galway Bay radar study): Supported by the China Scholarship Council (Grant No. 2011671057), the European Regional Development Fund (ERDF) through the Atlantic Area Transnational Programme (INTERREG IV), and the National University of Ireland.
Funding (workflow scheduling study): Project partially supported by the Polish National Science Center (No. DEC-2012/07/B/ST6/01516).
Funding (particle filter study): Supported by the State Basic Scientific Research of National Defense (No. c0420110005), the 13th Five-Year Key Basic Research Project (No. JCKY2016206B001), and the Six Talent Peaks Project in Jiangsu Province (No. XXRJ-004).