Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the memory access event ordering and hence are hardware-centric. This paper presents a framework of memory consistency models which describes the memory consistency model at the behavior level. Based on the understanding that the behavior of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism which orders the execution of operations from different processors. The synchronization order of an execution under a given consistency model is also defined. The synchronization order, together with the program order, determines the behavior of an execution. This paper also presents criteria for correct programs and for correct implementations of consistency models. Regarding an implementation of a consistency model as a set of memory event ordering constraints, this paper provides a method to prove the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.
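To make the behavioral view concrete, here is a minimal Python sketch (an editor-added illustration, not the paper's formalism): it enumerates all interleavings of two toy processors that respect program order, i.e., all sequentially consistent executions, and shows that the outcome of an execution is fixed once the execution order of the conflicting accesses is fixed.

```python
from itertools import permutations

# Two toy "processors", each a list of (action, var) operations in program order:
#   P0: write x=1, then read y ;  P1: write y=1, then read x
P0 = [("w", "x"), ("r", "y")]
P1 = [("w", "y"), ("r", "x")]

def outcomes_under_sc(p0, p1):
    """Enumerate all interleavings that respect each processor's program order
    (i.e., all sequentially consistent executions) and collect the read results."""
    results = set()
    ops = [(0, i) for i in range(len(p0))] + [(1, i) for i in range(len(p1))]
    for order in permutations(ops):
        # keep only interleavings in which each processor's ops appear in program order
        if any(order.index((p, i)) > order.index((p, i + 1))
               for p, prog in ((0, p0), (1, p1)) for i in range(len(prog) - 1)):
            continue
        mem, reads = {"x": 0, "y": 0}, []
        for p, i in order:
            kind, var = (p0, p1)[p][i]
            if kind == "w":
                mem[var] = 1
            else:
                reads.append((p, var, mem[var]))
        results.add(tuple(sorted(reads)))
    return results

for res in sorted(outcomes_under_sc(P0, P1)):
    print(res)
# Under sequential consistency no execution yields r(y)=0 on P0 *and* r(x)=0 on P1;
# a weaker model that reorders each store/load pair would admit that extra behavior.
```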
The multithreaded technique is the developing trend of high-performance processors, and the memory consistency model is essential to the correctness, performance and complexity of a multithreaded processor. A chip multithreaded consistency model adapted to multithreaded processors is proposed in this paper. The restriction imposed on memory event ordering by chip multithreaded consistency is presented and formalized. With the idea of the critical cycle introduced by Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the correct-execution criterion of the sequential consistency model. The chip multithreaded consistency model provides a way of achieving higher performance than the sequential consistency model and ensures software compatibility in that the execution result on a multithreaded processor is the same as the execution result on a uniprocessor. The implementation strategy of the chip multithreaded consistency model in the Godson-2 SMT processor is also proposed. The Godson-2 SMT processor supports the chip multithreaded consistency model correctly through an exception scheme based on the sequential memory access queue of each thread.
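The critical-cycle idea can be pictured as a cycle check over program-order edges and the execution order of conflicting accesses. The toy Python example below is an editor-added illustration under that standard reading (not the Godson-2 mechanism or the paper's proof): it flags a non-sequentially-consistent execution by finding such a cycle.

```python
# An execution is sequentially consistent when the union of program-order edges and
# the chosen execution order of conflicting (same-address) accesses is acyclic.

def has_cycle(nodes, edges):
    """Detect a cycle in a directed graph given as a set of (u, v) edges."""
    adj = {n: [] for n in nodes}
    for u, v in edges:
        adj[u].append(v)
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in nodes}

    def dfs(u):
        color[u] = GRAY
        for v in adj[u]:
            if color[v] == GRAY or (color[v] == WHITE and dfs(v)):
                return True
        color[u] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in nodes)

# Operations: thread 0 does W(x) then R(y); thread 1 does W(y) then R(x).
nodes = {"W0x", "R0y", "W1y", "R1x"}
program_order = {("W0x", "R0y"), ("W1y", "R1x")}
# One candidate ordering of the conflicting accesses: R0y before W1y and R1x before
# W0x -- the problematic "both reads return 0" execution.
conflict_order = {("R0y", "W1y"), ("R1x", "W0x")}

print(has_cycle(nodes, program_order | conflict_order))  # True: not sequentially consistent
```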
An energy-dissipation based viscoplastic consistency model is presented to describe the performance of concrete under dynamic loading. The development of the plasticity formulation starts from thermodynamic hypotheses so that the model has a sound theoretical background. Independent hardening and softening and the rate dependence of concrete are described separately for tension and compression. A modified implicit backward Euler integration scheme is adopted for the numerical computation. The static and dynamic behavior of the material is illustrated with numerical examples at the material point level and the structural level, and compared with existing experimental data. The results validate the effectiveness of the model.
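As a generic illustration of the implicit backward Euler scheme mentioned above (not the paper's concrete constitutive equations), the following sketch advances a scalar stiff evolution equation with a Newton iteration at each step; the right-hand side and all parameters are placeholders.

```python
def backward_euler_step(f, dfdu, u_n, dt, tol=1e-10, max_iter=50):
    """Solve u_{n+1} = u_n + dt * f(u_{n+1}) for u_{n+1} by Newton's method."""
    u = u_n  # initial guess: previous value
    for _ in range(max_iter):
        residual = u - u_n - dt * f(u)
        if abs(residual) < tol:
            break
        jacobian = 1.0 - dt * dfdu(u)
        u -= residual / jacobian
    return u

# Example: a stiff relaxation equation du/dt = -k * (u - u_eq); the implicit scheme
# remains stable for large time steps (placeholder parameters).
k, u_eq = 100.0, 1.0
f = lambda u: -k * (u - u_eq)
dfdu = lambda u: -k

u, dt = 0.0, 0.1
for step in range(10):
    u = backward_euler_step(f, dfdu, u, dt)
print(f"u after 10 steps: {u:.6f}")  # approaches u_eq = 1.0
```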
Aiming to identify policy topics and their evolutionary logic that enhance the digital and green development (dual development) of traditional manufacturing enterprises, address weaknesses in current policies, and provide resources for refining dual development policies, a total of 15954 dual development-related policies issued by national and various departmental authorities in China from January 2000 to August 2023 were analyzed. Based on topic modeling techniques and the policy modeling consistency (PMC) framework, the evolution of policy topics was visualized, and a dynamic assessment of the policies was conducted. The results show that the digital and green development policy framework is progressively refined, and the governance philosophy shifts from a "regulatory government" paradigm to a "service-oriented government". The support pattern evolves from "dispersed matching" to "integrated symbiosis". However, there are still significant deficiencies in departmental cooperation, balanced measures, coordinated links, and multi-stakeholder participation. Future policy improvements should, therefore, focus on guiding multi-stakeholder participation, enhancing public demand orientation, and addressing the entire value chain. These steps aim to create an open and shared digital industry ecosystem to promote the coordinated dual development of traditional manufacturing enterprises.
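A hedged sketch of the kind of topic-modeling step such a study might use, with scikit-learn's LDA on a few placeholder policy snippets; the actual corpus of 15954 policies, the preprocessing, and the PMC scoring are not reproduced here.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

policies = [  # placeholder policy snippets, not the study's documents
    "support digital transformation of manufacturing enterprises",
    "green low-carbon manufacturing and energy efficiency standards",
    "industrial internet platforms and digital service ecosystems",
    "carbon emission reduction subsidies for traditional industry",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(policies)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {top}")   # top terms per discovered policy topic
```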
To gain a better understanding of texture evolution during the rolling process of AZ31 alloy, a polycrystalline plasticity model was implemented in the explicit FE package ABAQUS/Explicit by writing a user subroutine VUMAT. For each individual grain in the polycrystalline aggregate, a rate-dependent model was adopted to calculate the plastic shear strain increment, in combination with the Voce hardening law to describe the hardening response; the lattice reorientations caused by slip and twinning were calculated separately because of their different mechanisms. The elasto-plastic self-consistent (EPSC) model was employed to relate the response of an individual grain to the response of the polycrystalline aggregate. Rolling processes of AZ31 sheet and as-cast AZ31 alloy were simulated respectively. The predicted texture distributions are in qualitative agreement with experimental results.
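The two constitutive ingredients named in the abstract can be written compactly. The sketch below uses the commonly cited extended Voce form and a power-law slip rate with placeholder parameters; it is not the calibrated AZ31 model nor the EPSC/VUMAT implementation.

```python
import numpy as np

def voce_crss(gamma_acc, tau0=30.0, tau1=20.0, theta0=400.0, theta1=5.0):
    """Extended Voce law: critical resolved shear stress vs. accumulated shear."""
    return tau0 + (tau1 + theta1 * gamma_acc) * (1.0 - np.exp(-theta0 * gamma_acc / tau1))

def slip_rate(tau, crss, gamma_dot0=1e-3, m=0.05):
    """Rate-dependent (power-law) shear rate for resolved shear stress tau."""
    return gamma_dot0 * np.sign(tau) * np.abs(tau / crss) ** (1.0 / m)

gamma_acc = np.linspace(0.0, 0.5, 6)
print(np.round(voce_crss(gamma_acc), 2))          # samples of the hardening curve
print(f"{slip_rate(40.0, voce_crss(0.1)):.3e}")   # shear rate at tau = 40 (placeholder units)
```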
It is vital that a well-defined conceptual model can be realized by a macro-model (e.g., a Continuous System Simulation (CSS) model) or a micro-model (e.g., an Agent-Based model or Discrete Event Simulation model) and still produce mutually consistent results. The Full Potential CSS concept provides the rules so that the results from macro-modelling become fully consistent with those from micro-modelling. This paper focuses on the simulation language StochSD (Stochastic System Dynamics), which is an extension of classical Continuous System Simulation that implements the Full Potential CSS concept. Thus, in addition to modelling and simulating continuous flows between compartments represented by "real" numbers, it can also handle transitions of discrete entities by integer numbers, enabling combined models to be constructed in a straightforward way. However, transition events of discrete entities (e.g., arrivals, accidents, deaths) usually happen irregularly over time, so stochasticity often plays a crucial role in their modelling. Therefore, StochSD contains powerful random functions to model uncertainties of different kinds, together with devices to collect statistics during a simulation or from multiple replications of the same stochastic model. Also, tools for sensitivity analysis, optimisation and statistical analysis are included. In particular, StochSD includes features for stochastic modelling, post-analysis of multiple simulations, and presentation of the results in statistical form. In addition to making StochSD a Full Potential CSS language, a second purpose is to provide an open-source package intended for small and middle-sized models in education, self-studies and research. To make StochSD and its philosophy easy to comprehend and use, it is based on the System Dynamics approach, where a system is described in terms of stocks and flows. StochSD is available for Windows, macOS and Linux. On the StochSD homepage, there is extensive material for a course in Modelling and Simulation in the form of PowerPoint lectures and laboratory exercises.
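The combination of continuous flows and integer-valued stochastic transitions described above can be illustrated in a few lines of plain Python; this is only a conceptual sketch with made-up parameters, not StochSD code.

```python
import numpy as np

# One stock updated by a continuous inflow plus integer-valued, Poisson-distributed
# discrete departures in each time step.
rng = np.random.default_rng(1)

dt, steps = 0.1, 100
inflow_rate = 5.0        # continuous inflow (units per time)
departure_rate = 4.0     # expected discrete departures per unit time
stock = 20.0

for _ in range(steps):
    stock += inflow_rate * dt                  # continuous flow
    stock -= rng.poisson(departure_rate * dt)  # stochastic integer transition
    stock = max(stock, 0.0)

print(f"stock after {steps * dt:.1f} time units: {stock:.2f}")
```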
In this paper, an exponential inequality for the maximal partial sums of negatively superadditive-dependent (NSD, for short) random variables is established. Using this exponential inequality, we present some general results on the complete convergence for arrays of rowwise NSD random variables, which improve or generalize the corresponding results of Wang et al. [28] and Chen et al. [2]. In addition, some sufficient conditions for proving complete convergence are provided. As an application of the complete convergence results, we further investigate the complete consistency and the convergence rate of the estimator in a nonparametric regression model based on NSD errors.
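For reference, complete convergence (in the Hsu-Robbins sense used in such results) of a sequence {X_n} to X means:

```latex
% Complete convergence of $\{X_n\}$ to $X$: for every $\varepsilon > 0$,
\[
  \sum_{n=1}^{\infty} P\bigl(|X_n - X| > \varepsilon\bigr) < \infty .
\]
% By the Borel--Cantelli lemma, complete convergence implies $X_n \to X$ almost surely.
```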
Complex processes often work in multiple operation regions, so it is critical to develop effective monitoring approaches to ensure the safety of chemical processes. In this work, a discriminant local consistency Gaussian mixture model (DLCGMM) is proposed for multimode process monitoring by integrating LCGMM with modified local Fisher discriminant analysis (MLFDA). Different from Fisher discriminant analysis (FDA), which aims to discover the globally optimal discriminant directions, MLFDA is capable of uncovering the multimodality and local structure of the data by exploiting the posterior probabilities of observations within clusters calculated from the results of LCGMM. This may enable MLFDA to capture more meaningful discriminant information hidden in high-dimensional multimode observations than FDA. Contrary to most existing multimode process monitoring approaches, DLCGMM performs LCGMM and MLFDA iteratively, and the optimal subspaces with multi-Gaussianity and the optimal discriminant projection vectors are achieved simultaneously in a framework combining supervised and unsupervised learning. Furthermore, monitoring statistics are established on each cluster that represents a specific operation condition, and two global Bayesian-inference-based fault monitoring indexes are established by combining the monitoring results of all clusters. The efficiency and effectiveness of the proposed method are evaluated on UCI datasets, a simulated multimode model and the Tennessee Eastman benchmark process.
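A hedged sketch of only the per-cluster monitoring idea: fit a Gaussian mixture over multimode data, compute a Mahalanobis-type statistic per component, and combine the statistics with the posterior probabilities as a Bayesian-style global index. The full DLCGMM/MLFDA iteration is not reproduced, and the data are synthetic placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train = np.vstack([rng.normal(0, 1, (200, 4)), rng.normal(5, 1, (200, 4))])  # two modes

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(train)

def global_index(x):
    """Posterior-weighted Mahalanobis distance of sample x to each mixture component."""
    post = gmm.predict_proba(x.reshape(1, -1))[0]
    t2 = np.array([
        (x - mu) @ np.linalg.inv(cov) @ (x - mu)
        for mu, cov in zip(gmm.means_, gmm.covariances_)
    ])
    return float(post @ t2)

normal_sample = np.full(4, 0.2)
faulty_sample = np.full(4, 2.5)       # lies between the two operating modes
print(global_index(normal_sample))    # small: consistent with a known mode
print(global_index(faulty_sample))    # noticeably larger: candidate fault
```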
Appropriate maintenance technologies that facilitate model consistency in distributed simulation systems are relevant but generally unavailable. To resolve this problem, we analyze the main factors that cause model inconsistency. The analysis methods used for traditional distributed simulations are mostly empirical and qualitative, and disregard the dynamic characteristics of factor evolution during model operation. Furthermore, distributed simulation applications (DSAs) are rapidly evolving in terms of large-scale, distributed, service-oriented, compositional, and dynamic features. Such developments make it difficult to use traditional analysis methods in DSAs for analyzing factorial effects on simulation models. To solve these problems, we construct a dynamic evolution mechanism of model consistency, called the connected model hyper-digraph (CMH). CMH is developed using formal methods that accurately specify the evolutional processes and activities of models (i.e., self-evolution, interoperability, compositionality, and authenticity). We also develop an algorithm of model consistency evolution (AMCE) based on CMH to quantitatively and dynamically evaluate influencing factors. Experimental results demonstrate that non-combination (33.7% on average) is the most influential factor, non-single-directed understanding (26.6%) is the second most influential, and non-double-directed understanding (5.0%) is the least influential. Unlike previous analysis methods, AMCE provides good feasibility and effectiveness. This research can serve as guidance for designers of consistency maintenance technologies toward achieving a high level of consistency in future DSAs.
For the linear model y_i = x_i θ + e_i, i = 1, 2, …, let the error sequence {e_i}_{i=1}^∞ be iid r.v.'s with unknown density f(x). In this paper, a nonparametric estimation method based on the residuals is proposed for estimating f(x), and the consistency of the estimators is obtained.
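A minimal sketch of the residual-based idea on simulated data (not the paper's specific estimator or its consistency proof): estimate θ by least squares, form residuals as proxies for the unobserved errors, and apply a kernel density estimator to them.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)
n = 500
x = rng.uniform(0, 10, n)
theta_true = 2.0
e = rng.laplace(0.0, 1.0, n)          # error density unknown to the estimator
y = x * theta_true + e

theta_hat = np.sum(x * y) / np.sum(x * x)   # least-squares estimate of theta
residuals = y - x * theta_hat               # proxies for the unobserved errors

f_hat = gaussian_kde(residuals)             # nonparametric estimate of f
grid = np.linspace(-4, 4, 5)
print(np.round(f_hat(grid), 3))             # estimated density at a few points
print(f"theta_hat = {theta_hat:.3f}")
```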
Checking whether implementations conform to their requirement models is challenging. Most existing techniques for consistency checking focus either on the requirement models (e.g., requirements consistency checking) or on the implementations (e.g., code-based testing) only. In this paper we propose an approach to checking the behavioral consistency of implementations against requirement models directly, to overcome these limitations. Our approach extracts two behavioral models represented by Labelled Transition Systems (LTS) from the requirement models and the implementations respectively, and checks the behavioral consistency between these two models based on the behavioral simulation relation of LTS. The checking results of our approach provide evidence of behavioral inconsistency as well as inconsistency localization. A research prototype called BCCH and a case study are presented to give initial validation of this approach.
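The simulation-relation check at the core of such an approach can be sketched as a greatest-fixpoint computation over state pairs. The toy example below is an editor-added illustration of that standard algorithm, not the BCCH prototype; an LTS is given as a dict mapping a state to a set of (label, successor) pairs.

```python
def simulates(spec, impl, spec_init, impl_init):
    """Return True if every step of `impl` (from impl_init) can be matched
    label-for-label by `spec` (from spec_init), i.e. spec simulates impl."""
    # Start from the full relation and iteratively remove pairs that violate the
    # simulation condition (greatest fixpoint computation).
    relation = {(i, s) for i in impl for s in spec}
    changed = True
    while changed:
        changed = False
        for (i, s) in set(relation):
            for label, i_next in impl[i]:
                if not any(label == l2 and (i_next, s_next) in relation
                           for l2, s_next in spec[s]):
                    relation.discard((i, s))
                    changed = True
                    break
    return (impl_init, spec_init) in relation

# Requirement model: after "request", either "grant" or "deny" is allowed.
spec = {"S0": {("request", "S1")}, "S1": {("grant", "S0"), ("deny", "S0")}}
# Implementation that always grants: behaviorally consistent with the requirement.
impl_ok = {"I0": {("request", "I1")}, "I1": {("grant", "I0")}}
# Implementation that emits an unspecified "log" action: inconsistent.
impl_bad = {"I0": {("request", "I1")}, "I1": {("log", "I0")}}

print(simulates(spec, impl_ok, "S0", "I0"))   # True
print(simulates(spec, impl_bad, "S0", "I0"))  # False
```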
As two popularly used variable selection methods, the Dantzig selector and the LASSO have been proved asymptotically equivalent in some scenarios. However, this is not the case in general for linear models, as disclosed in Gai, Zhu and Lin's paper in 2013. In this paper, it is further shown that in general the asymptotic equivalence does not hold either for a general single-index model with random design of predictors. To achieve this goal, the authors systematically investigate necessary and sufficient conditions for consistent model selection by the Dantzig selector. An adaptive Dantzig selector is also recommended for the cases where those conditions are not satisfied. Also, different from existing methods for linear models, no distributional assumption on the error term is needed, with the trade-off that a more stringent condition on the predictor vector is assumed. A small-scale simulation is conducted to examine the performance of the Dantzig selector and the adaptive Dantzig selector.
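For readers who want to experiment, the textbook linear-programming form of the Dantzig selector (minimize ||b||_1 subject to ||X'(y - Xb)||_inf <= delta) can be solved with scipy and contrasted with the LASSO on toy data. This sketch covers neither the adaptive variant nor the single-index setting studied in the paper, and the penalty levels are placeholders.

```python
import numpy as np
from scipy.optimize import linprog
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 100, 10, 3
X = rng.normal(size=(n, p))
beta_true = np.zeros(p); beta_true[:s] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.5 * rng.normal(size=n)

def dantzig_selector(X, y, delta):
    n, p = X.shape
    G, Xty = X.T @ X, X.T @ y
    # Variables z = [beta (p), t (p)]; minimize sum(t) subject to |beta_j| <= t_j
    # and -delta <= X'(y - X beta) <= delta.
    c = np.concatenate([np.zeros(p), np.ones(p)])
    I = np.eye(p)
    A_ub = np.block([[ I, -I],
                     [-I, -I],
                     [ G, np.zeros((p, p))],
                     [-G, np.zeros((p, p))]])
    b_ub = np.concatenate([np.zeros(p), np.zeros(p), Xty + delta, -Xty + delta])
    bounds = [(None, None)] * p + [(0, None)] * p
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]

delta = 0.5 * np.sqrt(2 * n * np.log(p))        # rough sigma*sqrt(2 n log p) choice (placeholder)
beta_dantzig = dantzig_selector(X, y, delta)
beta_lasso = Lasso(alpha=0.1).fit(X, y).coef_   # placeholder penalty level

print("Dantzig support:", np.where(np.abs(beta_dantzig) > 1e-6)[0])
print("LASSO   support:", np.where(np.abs(beta_lasso) > 1e-6)[0])
```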
Today, data is flowing into various organizations at an unprecedented scale. The ability to scale out to process an increased workload has become an important factor in the proliferation and popularization of database systems. Big data applications demand, and consequently lead to, the development of diverse large-scale data management systems in different organizations, ranging from traditional database vendors to newly emerging Internet-based enterprises. In this survey, we investigate, characterize, and analyze large-scale data management systems in depth and develop comprehensive taxonomies for various critical aspects covering the data model, the system architecture, and the consistency model. We map the prevailing highly scalable data management systems to the proposed taxonomies, not only to classify the common techniques but also to provide a basis for analyzing current system scalability limitations. To overcome these limitations, we predict and highlight the possible principles that future efforts need to follow for the next generation of large-scale data management systems.
In the statistics and machine learning communities, the last fifteen years have witnessed a surge of high-dimensional models backed by penalized methods and other state-of-the-art variable selection techniques. The high-dimensional models we refer to differ from conventional models in that the number of all parameters p and the number of significant parameters s are both allowed to grow with the sample size T. When field-specific knowledge is preliminary, and in view of the recent and potential affluence of data from genetics, finance and online social networks, etc., such (s, T, p)-triply diverging models enjoy ultimate flexibility in terms of modeling, and they can be used as a data-guided first step of investigation. However, model selection consistency and other theoretical properties were previously addressed only for independent data, leaving time series largely uncovered. On a simple linear regression model endowed with a weakly dependent sequence, this paper applies a penalized least squares (PLS) approach. Under regularity conditions, we show sign consistency, derive a finite-sample bound that holds with high probability for the estimation error, and prove that the PLS estimate is consistent in the L_2 norm with rate (s log s / T)^{1/2}.
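A toy illustration of one concrete instance of the setting (not the paper's estimator, conditions or proof): L1-penalized least squares applied to a regression whose errors form a weakly dependent AR(1) sequence, followed by a check of sign recovery and the L2 error. All dimensions and constants are placeholders.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
T, p, s = 400, 50, 5
beta_true = np.zeros(p); beta_true[:s] = [2.0, -1.5, 1.0, -1.0, 0.5]

X = rng.normal(size=(T, p))
e = np.zeros(T)
for t in range(1, T):                       # AR(1) errors: geometrically decaying dependence
    e[t] = 0.5 * e[t - 1] + rng.normal(scale=0.5)
y = X @ beta_true + e

lam = 2.0 * np.sqrt(np.log(p) / T)          # rate-inspired penalty level (placeholder constant)
beta_hat = Lasso(alpha=lam).fit(X, y).coef_

sign_consistent = np.array_equal(np.sign(beta_hat), np.sign(beta_true))
l2_error = np.linalg.norm(beta_hat - beta_true)
print(f"sign recovery: {sign_consistent}, L2 error: {l2_error:.3f}")
```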
A steady increase in consumer demands, and severe constraints from both a somewhat damaged environment and newly installed government policies, require today's product design and development to be faster and more efficient than ever before, yet to utilize even fewer resources. New holistic approaches, such as total product life cycle modeling, which embraces all aspects of a product's life cycle, are current attempts to solve these problems. Within the field of product design and modeling, feature technology has proved to be one very promising solution component. Owing to the tremendous advances in information technology, the transfer from low-level data processing towards knowledge modeling and information processing is about to bring a change in almost every computerized application. From this viewpoint, current problems of both feature frameworks and feature systems are analyzed with respect to static and dynamic consistency breakdowns. The analysis ranges from the early stages of designing (feature) concepts to final system implementation and application. For the first time, an integrated view is given of approaches, solutions and practical experience with feature concepts and structures, providing both a feature framework and its implementation with sufficient system architecture and computational power to master a fair number of known consistency breakdowns, while providing robust contexts for feature semantics and integrated models. Within today's heavy use of information technology these are prerequisites if the full potential of feature technology is to be successfully translated into practice.
There is a growing trend of applying machine learning methods to medical datasets in order to predict patients' future status. Although some of these methods achieve high performance, challenges still exist in comparing and evaluating different models through their interpretable information. Such analytics can help clinicians improve evidence-based medical decision making. In this work, we develop a visual analytics system that compares multiple models' prediction criteria and evaluates their consistency. With our system, users can gain knowledge of different models' inner criteria and of how confidently each model's prediction can be relied on for a certain patient. Through a case study of a publicly available clinical dataset, we demonstrate the effectiveness of our visual analytics system in assisting clinicians and researchers in comparing and quantitatively evaluating different machine learning methods.
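Without the visual front end, the kind of model-consistency comparison described above can be sketched as follows: train two classifiers on a synthetic "clinical" dataset, measure per-patient prediction agreement, and inspect each model's confidence where they disagree. All data and model choices here are placeholders, not the paper's case study.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=10, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

pred_lr, pred_rf = lr.predict(X_te), rf.predict(X_te)
prob_lr, prob_rf = lr.predict_proba(X_te)[:, 1], rf.predict_proba(X_te)[:, 1]

agreement = np.mean(pred_lr == pred_rf)
disagree = np.where(pred_lr != pred_rf)[0]
print(f"prediction agreement: {agreement:.2%} ({len(disagree)} patients disagree)")
for i in disagree[:3]:   # a few "patients" where the models' criteria diverge
    print(f"patient {i}: LR prob={prob_lr[i]:.2f}, RF prob={prob_rf[i]:.2f}, true={y_te[i]}")
```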