Appropriate maintenance technologies that facilitate model consistency in distributed simulation systems are relevant but generally unavailable. To resolve this problem, we analyze the main factors that cause model inconsistency. The analysis methods used for traditional distributed simulations are mostly empirical and qualitative, and disregard the dynamic characteristics of factor evolution during model operation. Furthermore, distributed simulation applications (DSAs) are rapidly evolving in terms of large-scale, distributed, service-oriented, compositional, and dynamic features. Such developments make it difficult to apply traditional analysis methods to DSAs when analyzing factorial effects on simulation models. To solve these problems, we construct a dynamic evolution mechanism of model consistency, called the connected model hyper-digraph (CMH). CMH is developed using formal methods that accurately specify the evolutionary processes and activities of models (i.e., self-evolution, interoperability, compositionality, and authenticity). We also develop an algorithm of model consistency evolution (AMCE) based on CMH to quantitatively and dynamically evaluate influencing factors. Experimental results demonstrate that non-combination (33.7% on average) is the most influential factor, non-single-directed understanding (26.6%) is the second most influential, and non-double-directed understanding (5.0%) is the least influential. Unlike previous analysis methods, AMCE provides good feasibility and effectiveness. This research can serve as guidance for designers of consistency maintenance technologies toward achieving a high level of consistency in future DSAs. (Funding: National Natural Science Foundation of China, No. 61272336.)
An energy-dissipation based viscoplastic consistency model is presented to describe the performance of concrete under dynamic loading. The development of plasticity starts from thermodynamic hypotheses so that the model has a sound theoretical background. Independent hardening, softening, and the rate dependence of concrete are described separately for tension and compression. A modified implicit backward Euler integration scheme is adopted for the numerical computation. Static and dynamic behavior of the material is illustrated with numerical examples at the material-point and structural levels, and compared with existing experimental data. The results validate the effectiveness of the model. (Funding: National Natural Science Foundation of China, No. 90510018.)
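The implicit backward Euler update mentioned in the abstract can be illustrated with a one-dimensional sketch. This is not the paper's actual model (which is thermodynamically derived and treats tension and compression separately); it is a minimal Perzyna-style viscoplastic return mapping with linear hardening, and all material parameters (`E`, `sigma_y`, `H`, `eta`) are illustrative defaults, not values from the paper.

```python
import math

def backward_euler_step(eps, eps_p, alpha, E=200e3, sigma_y=250.0,
                        H=10e3, eta=0.0, dt=1e-3):
    """One backward Euler step of 1D viscoplasticity with linear hardening.

    eps: total strain; eps_p: plastic strain; alpha: hardening variable.
    eta is the viscosity; eta = 0 recovers the rate-independent limit.
    """
    sigma_trial = E * (eps - eps_p)                    # elastic predictor
    f_trial = abs(sigma_trial) - (sigma_y + H * alpha)
    if f_trial <= 0.0:
        return sigma_trial, eps_p, alpha               # purely elastic step
    # Plastic corrector: with linear hardening the backward Euler
    # equations admit a closed-form viscoplastic multiplier.
    dgamma = f_trial / (E + H + eta / dt)
    sign = math.copysign(1.0, sigma_trial)
    sigma = sigma_trial - E * dgamma * sign
    return sigma, eps_p + dgamma * sign, alpha + dgamma
```

In the rate-independent limit the updated stress lands exactly on the yield surface; a positive `eta` produces an overstress that grows as `dt` shrinks relative to `eta`, which is the rate effect a viscoplastic model of this family captures.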
It is vital that a well-defined conceptual model can be realized by a macro-model (e.g., a Continuous System Simulation (CSS) model) or a micro-model (e.g., an Agent-Based model or Discrete Event Simulation model) and still produce mutually consistent results. The Full Potential CSS concept provides the rules under which the results from macro-modelling become fully consistent with those from micro-modelling. This paper focuses on the simulation language StochSD (Stochastic System Dynamics), an extension of classical Continuous System Simulation that implements the Full Potential CSS concept. Thus, in addition to modelling and simulating continuous flows between compartments represented by "real" numbers, it can also handle transitions of discrete entities by integer numbers, enabling combined models to be constructed in a straightforward way. However, transition events of discrete entities (e.g., arrivals, accidents, deaths) usually happen irregularly over time, so stochasticity often plays a crucial role in their modelling. Therefore, StochSD contains powerful random functions to model uncertainties of different kinds, together with devices to collect statistics during a simulation or from multiple replications of the same stochastic model. Tools for sensitivity analysis, optimisation and statistical analysis are also included. In particular, StochSD includes features for stochastic modelling, post-analysis of multiple simulations, and presentation of the results in statistical form. In addition to making StochSD a Full Potential CSS language, a second purpose is to provide an open-source package intended for small and middle-sized models in education, self-studies and research. To make StochSD and its philosophy easy to comprehend and use, it is based on the System Dynamics approach, where a system is described in terms of stocks and flows. StochSD is available for Windows, macOS and Linux. On the StochSD homepage, there is extensive material for a course in Modelling and Simulation in the form of PowerPoint lectures and laboratory exercises.
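The coexistence of continuous flows and discrete stochastic transitions can be sketched with a single-stock decay model. This is an illustration of the general idea only, not StochSD's implementation; the function name and parameters are invented for the example. The continuous run integrates dP/dt = -r·P with Euler steps, while the discrete run draws a whole number of exit events per step (each entity leaves with probability 1 - exp(-r·Δt)), so the stock remains an integer throughout.

```python
import math
import random

def run_stock(pop0=1000, rate=0.1, dt=0.1, t_end=10.0,
              discrete=False, seed=1):
    """Simulate one stock with an outflow proportional to its level.

    discrete=False: continuous Euler integration (real-valued stock).
    discrete=True:  integer transition events drawn per step, so the
                    stock only changes by whole entities.
    """
    rng = random.Random(seed)
    p = pop0 if discrete else float(pop0)
    steps = round(t_end / dt)
    p_leave = 1.0 - math.exp(-rate * dt)   # per-entity exit probability
    for _ in range(steps):
        if discrete:
            exits = sum(rng.random() < p_leave for _ in range(p))
            p -= exits
        else:
            p -= rate * p * dt
    return p
```

A single discrete replication scatters around the continuous trajectory; averaging many seeded replications recovers the deterministic mean, which is the macro/micro consistency the Full Potential CSS concept formalizes.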
In this paper, an exponential inequality for the maximal partial sums of negatively superadditive-dependent (NSD, in short) random variables is established. Using this exponential inequality, we present some general results on the complete convergence for arrays of rowwise NSD random variables, which improve or generalize the corresponding ones of Wang et al. [28] and Chen et al. [2]. In addition, some sufficient conditions for proving complete convergence are provided. As an application of the complete convergence results, we further investigate the complete consistency and convergence rate of the estimator in a nonparametric regression model based on NSD errors. (Funding: National Natural Science Foundation of China (11501004, 11501005, 11526033, 11671012); Natural Science Foundation of Anhui Province (1508085J06, 1608085QA02); Key Projects for Academic Talent of Anhui Province (gxbj ZD2016005); Research Teaching Model Curriculum of Anhui University (xjyjkc1407).)
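The abstract does not reproduce the inequality itself. For orientation only, maximal exponential inequalities for partial sums are typically of a Bernstein-type form such as the following generic illustration (the constants and conditions here are not the paper's actual result; assume |X_i| ≤ M almost surely with mean zero):

```latex
P\!\left(\max_{1\le k\le n}\left|\sum_{i=1}^{k} X_i\right| \ge \varepsilon\right)
\;\le\; 2\exp\!\left(-\frac{\varepsilon^{2}}
{2\sum_{i=1}^{n}\operatorname{E}X_i^{2}+\tfrac{2}{3}M\varepsilon}\right).
```

For negatively dependent sequences such as NSD variables, comparisons of moment generating functions against the independent case are the usual route to bounds of this shape; the paper itself gives the precise NSD statement.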
Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the memory access event ordering and hence are hardware-centric. This paper presents a framework of memory consistency models that describes memory consistency at the behavior level. Based on the understanding that the behavior of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism that orders the execution of operations from different processors. The synchronization order of an execution under a given consistency model is also defined. The synchronization order, together with the program order, determines the behavior of an execution. This paper also presents criteria for correct programs and correct implementations of consistency models. Regarding an implementation of a consistency model as a set of memory event ordering constraints, this paper provides a method to prove the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.
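The idea that an execution's behavior is fixed by the order of conflicting accesses can be made concrete with the classic store-buffering litmus test. The sketch below is an illustration written for this listing (the names are invented): it enumerates every interleaving of two two-instruction threads that respects program order, i.e. every sequentially consistent execution, and shows that the outcome r1 = r2 = 0 never arises under SC, although weaker models permit it.

```python
from itertools import combinations

# Thread programs: ('store', var, value) or ('load', var, register)
T1 = [('store', 'x', 1), ('load', 'y', 'r1')]
T2 = [('store', 'y', 1), ('load', 'x', 'r2')]

def interleavings(a, b):
    """All merges of a and b preserving each thread's program order."""
    n = len(a) + len(b)
    for pos in combinations(range(n), len(a)):
        pos, ai, bi = set(pos), iter(a), iter(b)
        yield [next(ai) if i in pos else next(bi) for i in range(n)]

def execute(seq):
    """Run one total order of operations against a single shared memory."""
    mem, regs = {'x': 0, 'y': 0}, {}
    for op, var, x in seq:
        if op == 'store':
            mem[var] = x
        else:
            regs[x] = mem[var]
    return regs['r1'], regs['r2']

sc_outcomes = {execute(s) for s in interleavings(T1, T2)}
```

Observing `(0, 0)` on real hardware is evidence of store buffering, i.e. of a model weaker than sequential consistency; it is exactly this kind of ordering constraint that the paper's framework states at the behavior level rather than in hardware terms.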
Multithreading is the developing trend of high-performance processors, and the memory consistency model is essential to the correctness, performance and complexity of a multithreaded processor. A chip multithreaded consistency model adapted to multithreaded processors is proposed in this paper. The restriction imposed on memory event ordering by chip multithreaded consistency is presented and formalized. Using the idea of the critical cycle introduced by Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the correct-execution criterion of the sequential consistency model. The chip multithreaded consistency model provides a way of achieving higher performance than the sequential consistency model, and ensures software compatibility in that the execution result on a multithreaded processor is the same as on a uniprocessor. The implementation strategy of the chip multithreaded consistency model in the Godson-2 SMT processor is also proposed: Godson-2 supports the model correctly through an exception scheme based on the sequential memory access queue of each thread. (Funding: National High Technology Development 863 Program of China (2007AA01Z114, 2006AA010201); National Natural Science Foundation of China (60703017, 60736012, 60325205, 60673146, 60603049); National Grand Fundamental Research 973 Program of China (2005CB321601, 2005CB321603); Beijing Natural Science Foundation (4072024).)
For the linear model y_i = x_iθ + e_i, i = 1, 2, …, let the error sequence {e_i}_{i≥1} be i.i.d. random variables with unknown density f(x). In this paper, a nonparametric estimation method based on the residuals is proposed for estimating f(x), and the consistency of the estimators is obtained. (Funding: National Natural Science Foundation of China, Grant 18971061.)
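The residual-based approach can be sketched end to end: estimate θ by least squares, form the residuals ê_i = y_i − x_i·θ̂, and apply a kernel density estimate to them. This is a generic illustration of this estimator type, not the paper's exact construction or its consistency argument; the Gaussian kernel, the Silverman bandwidth rule, and the simulation parameters are all choices made for the example.

```python
import math
import random

def residual_density(xs, ys, grid):
    """Estimate the error density f from residuals of y_i = x_i*theta + e_i."""
    # Closed-form least-squares slope for the no-intercept linear model.
    theta = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    resid = [y - theta * x for x, y in zip(xs, ys)]
    n = len(resid)
    mean = sum(resid) / n
    std = math.sqrt(sum((r - mean) ** 2 for r in resid) / n)
    h = 1.06 * std * n ** -0.2        # Silverman's rule-of-thumb bandwidth
    def kde(t):                       # Gaussian-kernel density estimate
        return sum(math.exp(-0.5 * ((t - r) / h) ** 2)
                   for r in resid) / (n * h * math.sqrt(2 * math.pi))
    return theta, [kde(t) for t in grid]

# Simulated data: theta = 2, standard normal errors.
rng = random.Random(0)
xs = [rng.uniform(0, 10) for _ in range(2000)]
ys = [2.0 * x + rng.gauss(0, 1) for x in xs]
theta_hat, (f0,) = residual_density(xs, ys, [0.0])
```

Because θ̂ is consistent, the residuals converge to the true errors, which is why a density estimate built on residuals can inherit the consistency of one built on the (unobservable) errors themselves.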
Checking whether implementations conform to requirement models is challenging. Most existing techniques for consistency checking focus either on requirement models only (e.g., requirements consistency checking) or on the implementations only (e.g., code-based testing). In this paper we propose an approach that checks the behavioral consistency of implementations against requirement models directly, to overcome these limitations. Our approach extracts two behavioral models, represented by Labelled Transition Systems (LTS), from the requirement models and the implementations respectively, and checks the behavioral consistency between these two models based on the behavioral simulation relation of LTS. The checking results of our approach provide evidence for behavioral inconsistency as well as inconsistency localization. A research prototype called BCCH and a case study are presented to give an initial validation of this approach. (Funding: National Natural Science Foundation of China (91118003, 61003071); Fundamental Research Funds for the Central Universities (3101046, 201121102020006); Special Funds for Shenzhen Strategic New Industry Development (JCYJ20120616135936123).)
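A simulation-relation check of the kind described can be sketched as a greatest-fixpoint computation over state pairs: start from all pairs and repeatedly delete any pair (p, q) in which some transition of p cannot be matched by q. The LTSs, state names, and vending-machine requirement below are invented toy examples for illustration, not the paper's BCCH tool.

```python
def simulated_by(impl, spec, i0, s0):
    """True iff spec can simulate impl starting from (i0, s0).

    impl, spec: dicts mapping every state to a set of (action, successor).
    """
    rel = {(p, q) for p in impl for q in spec}
    changed = True
    while changed:
        changed = False
        for p, q in list(rel):
            # Every move of p must be matched by an equally-labelled
            # move of q that stays inside the relation.
            ok = all(any(a == b and (p2, q2) in rel for b, q2 in spec[q])
                     for a, p2 in impl[p])
            if not ok:
                rel.discard((p, q))
                changed = True
    return (i0, s0) in rel

# Requirement: insert a coin, then dispense coffee.
spec = {'s0': {('coin', 's1')}, 's1': {('coffee', 's2')}, 's2': set()}
good = {'i0': {('coin', 'i1')}, 'i1': {('coffee', 'i2')}, 'i2': set()}
bad  = {'i0': {('coin', 'i1')},                      # also offers tea,
        'i1': {('coffee', 'i2'), ('tea', 'i2')},     # not allowed by spec
        'i2': set()}
```

When the check fails, the pairs deleted first point at the offending states and actions, which is the kind of inconsistency-localization evidence the paper's approach reports.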
A steady increase in consumer demands, and severe constraints from both a somewhat damaged environment and newly installed government policies, require today's product design and development to be faster and more efficient than ever before, yet to use even fewer resources. New holistic approaches, such as total product life cycle modeling, which embraces all aspects of a product's life cycle, are current attempts to solve these problems. Within the field of product design and modeling, feature technology has proved to be one very promising solution component. Owing to the tremendous increase in information technology, the transfer from low-level data processing towards knowledge modeling and information processing is about to bring a change in almost every computerized application. From this viewpoint, current problems of both feature frameworks and feature systems are analyzed with respect to static and dynamic consistency breakdowns. The analysis ranges from the early stages of designing (feature) concepts to final system implementation and application. For the first time, an integrated view is given on approaches, solutions and practical experience with feature concepts and structures, providing both a feature framework and its implementation with sufficient system architecture and computational power to master a fair number of known consistency breakdowns, while providing robust contexts for feature semantics and integrated models. Within today's heavy use of information technology these are prerequisites if the full potential of feature technology is to be successfully translated into practice.
There is a growing trend of applying machine learning methods to medical datasets in order to predict patients' future status. Although some of these methods achieve high performance, challenges still exist in comparing and evaluating different models through their interpretable information. Such analytics can help clinicians improve evidence-based medical decision making. In this work, we develop a visual analytics system that compares multiple models' prediction criteria and evaluates their consistency. With our system, users can generate knowledge on different models' inner criteria and on how confidently we can rely on each model's prediction for a certain patient. Through a case study of a publicly available clinical dataset, we demonstrate the effectiveness of our visual analytics system in assisting clinicians and researchers in comparing and quantitatively evaluating different machine learning methods. (Funding: U.S. National Science Foundation grant IIS-1741536, and a 2019 Seed Fund Award from CITRIS and the Banatao Institute at the University of California.)
Today, data is flowing into various organizations at an unprecedented scale. The ability to scale out for processing an enhanced workload has become an important factor for the proliferation and popularization of database systems. Big data applications demand and consequently lead to the development of diverse large-scale data management systems in different organizations, ranging from traditional database vendors to newly emerging Internet-based enterprises. In this survey, we investigate, characterize, and analyze the large-scale data management systems in depth and develop comprehensive taxonomies for various critical aspects covering the data model, the system architecture, and the consistency model. We map the prevailing highly scalable data management systems to the proposed taxonomies, not only to classify the common techniques but also to provide a basis for analyzing current system scalability limitations. To overcome these limitations, we predict and highlight the possible principles that future efforts need to undertake for the next generation of large-scale data management systems.