Software Development Life Cycle (SDLC) is one of the major ingredients for developing efficient software systems within a given time frame and at low cost. From the literature, it is evident that various kinds of process models are used by the software industry for the development of small, medium and long-term software projects, but many of them do not cover risk management. It is quite obvious that improper selection of the software development process model leads to failure of the software product, as development is a time-bound activity. In the present work, a new software development process model is proposed which covers risks at any stage of the development of the software product. The model, named the Hemant-Vipin (HV) process model, may be helpful to the software industry for developing efficient software products and delivering them to the client on time. The efficiency of the HV process model is assessed against various factors such as requirement clarity, user feedback, change agility, predictability, risk identification, practical implementation, customer satisfaction, incremental development, use of ready-made components, quick design, and resource organization, and a case study shows that the presented approach covers more of these parameters than the existing process models.
This research recognizes the limitations and challenges of adapting and applying Process Mining as a powerful tool and technique in a hypothetical Software Architecture (SA) Evaluation Framework characterized by lightweightness. Process mining deals with the large-scale complexity of security and performance analysis, which are the goals of SA evaluation frameworks. As a result of these conjectures, all Process Mining research in the realm of SA is thoroughly reviewed, and nine challenges for Process Mining adaptation are recognized. Process mining is embedded in the framework, and to boost the quality of the SA model for further analysis, the framework nominates the architectural discovery algorithms Flower, Alpha, Integer Linear Programming (ILP), Heuristic, and Inductive and compares them against twelve quality criteria. Finally, testing the framework on three case studies confirms the feasibility of applying process mining to architectural evaluation. The SA model is extracted by the best model discovery algorithm, which is selected by intensive benchmarking in this research. The research presents case studies of SA in service-oriented, Pipe and Filter, and component-based styles, modeled and simulated with Hierarchical Colored Petri Net techniques based on the cases' documentation. Process mining within this framework deals with the system's log files obtained from SA simulation. Applying process mining is challenging, especially for an SA evaluation framework, as it has not been done before. The research recognizes the problems of adapting process mining to a hypothetical lightweight SA evaluation framework and addresses these problems during solution development.
The production and energy coupling system mainly represents energy flow, material flow, information flow, and their coupling interactions. Through modeling and simulation of this system, the performance of energy flow in the process industry can be analyzed and optimized. To study this system, the component-based hybrid Petri net methodology (CpnHPN) is proposed, synthesizing a number of extended Petri net methods and using the concepts of energy place, material place, and information place. Through the interface place in CpnHPN, component-based encapsulation is established, which enables the production and energy coupling system to be built, analyzed, and optimized in a multi-level framework. Considering block-based and concise simulation for hybrid systems, the CpnHPN model is simulated with Simulink/Stateflow. To illustrate the use of the proposed methodology, an application of CpnHPN to the energy optimization of a chlorine balance system is provided.
Behaviour detection models based on automata have been studied widely. By adding an edge ε, local automata are combined into global automata to describe and detect software behaviour. However, these methods introduce nondeterminacy, leading to models that are imprecise or inefficient. We present a model of software Behaviour Detection based on Process Algebra and system calls (BDPA). In this model, a system call is mapped into an action, and a function is mapped into a process. We construct a process expression for each function to describe its behaviour. Without constructing automata or introducing nondeterminacy, we use algebraic properties and algorithms to obtain a global process expression by combining the process expressions derived from each function. Behaviour detection rules and methods based on BDPA are determined by equivalence theory. Experiments demonstrate that the BDPA model has better precision and efficiency than traditional methods.
We propose a software reliability growth model with testing-effort based on a continuous-state space stochastic process, such as a lognormal process, and conduct a goodness-of-fit evaluation. We also discuss a parameter estimation method for our model. We then derive several software reliability assessment measures from the probability distribution of its solution process, and compare our model with existing continuous-state space software reliability growth models in terms of the mean square error and Akaike's information criterion using actual fault count data.
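As a rough illustration of a continuous-state reliability process, the sketch below simulates a multiplicative-noise Itô process with the Euler-Maruyama scheme. The drift and diffusion parameters are illustrative assumptions, not the parameterization proposed in the paper.

```python
import math
import random

def euler_maruyama(mu, sigma, s0, t_max, n_steps, seed=0):
    """Simulate dS = mu*S dt + sigma*S dW with the Euler-Maruyama
    scheme; returns the sampled path as a list of floats."""
    rng = random.Random(seed)
    dt = t_max / n_steps
    path = [s0]
    s = s0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        s = s + mu * s * dt + sigma * s * dw
        path.append(s)
    return path

# illustrative parameters only
path = euler_maruyama(mu=0.1, sigma=0.2, s0=1.0, t_max=1.0, n_steps=1000)
print(len(path), path[0])
```

Assessment measures in the continuous-state setting are then derived from the distribution of such a solution process rather than from a discrete fault count.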
The main objective of this paper is to analyze the representativeness of the SPEM (Software Process Engineering Metamodel Specification) and BPMN (Business Process Modeling Notation) standards in the context of software process modeling. To perform this analysis, a standard structure for defining a software process, based on a process ontology, was adopted. Then, the SPEM and BPMN notations and their semantically corresponding elements in the default process were identified. This mapping also includes components of the CMMI-DEV (Capability Maturity Model Integration for Development) and MR-MPS (Reference Model for Software Process Improvement) quality models, which was necessary to support the evaluation of the mapping through a case study that models the best practices of these quality models. Finally, we analyzed these standards against specific characteristics considered necessary to model and represent software processes.
Several software reliability growth models (SRGMs) have been developed to monitor reliability growth during the testing phase of software development. Most existing research in the literature assumes that a similar testing effort is required for each debugging effort. In practice, however, different types of faults may require different amounts of testing effort for their detection and removal. Consequently, faults are classified into three categories on the basis of severity: simple, hard and complex. This categorization may be extended to r types of faults on the basis of severity. Although some existing research has incorporated the concept that the fault removal rate (FRR) differs for different types of faults, it assumes that the FRR remains constant over the whole testing period. On the contrary, it has been observed that as testing progresses, the FRR changes due to changing testing strategy, skill, environment and personnel resources. In this paper, a general discrete SRGM is proposed for errors of different severity in software systems using the change-point concept. The models are then formulated for two particular environments and validated on two real-life data sets. The results show a better fit and wider applicability of the proposed models to different types of failure datasets.
The curse of dimensionality refers to the problem of increased sparsity and computational complexity when dealing with high-dimensional data. In recent years, the types and variables of industrial data have increased significantly, making data-driven models more challenging to develop. To address this problem, data augmentation technology has been introduced as an effective tool to solve the sparsity problem of high-dimensional industrial data. This paper systematically explores and discusses the necessity, feasibility, and effectiveness of augmented industrial data-driven modeling in the context of the curse of dimensionality and virtual big data. Then, the process of data augmentation modeling is analyzed, and the concept of data boosting augmentation is proposed. Data boosting augmentation involves designing the reliability weight and actual-virtual weight functions, and developing a double-weighted partial least squares model to optimize the three stages of data generation, data fusion, and modeling. This approach significantly improves the interpretability, effectiveness, and practicality of data augmentation in industrial modeling. Finally, the proposed method is verified using practical examples of fault diagnosis systems and virtual measurement systems in industry. The results demonstrate the effectiveness of the proposed approach in improving the accuracy and robustness of data-driven models, making them more suitable for real-world industrial applications.
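A minimal sketch of the virtual-sample idea, assuming simple Gaussian jittering and an illustrative reliability-weight function; both are stand-ins of my own, since the paper's data boosting augmentation uses specifically designed weight functions and a double-weighted partial least squares model.

```python
import random

def augment(samples, n_virtual, noise_std, seed=0):
    """Generate virtual samples by jittering real ones with Gaussian
    noise; each virtual sample carries a reliability weight that decays
    with the amount of noise actually added (an assumed weight form)."""
    rng = random.Random(seed)
    virtual = []
    for _ in range(n_virtual):
        base = rng.choice(samples)
        noise = [rng.gauss(0.0, noise_std) for _ in base]
        sample = [x + e for x, e in zip(base, noise)]
        # down-weight samples that were perturbed more strongly
        weight = 1.0 / (1.0 + sum(e * e for e in noise))
        virtual.append((sample, weight))
    return virtual

real = [[1.0, 2.0, 3.0], [2.0, 1.5, 0.5]]   # toy process measurements
virt = augment(real, n_virtual=5, noise_std=0.1)
print(len(virt))
```

A downstream regression would then fit on the union of real and virtual samples, using the weights to keep unreliable virtual points from dominating.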
Based on the characteristics of the Team Software Process (TSP), a hierarchy-based model combining a discrete event model with a system dynamics model is adopted. This model represents the TSP at two levels: the inner level embodies the continuity of the software process, while the outer level embodies the software development process by phases. The structure and principle of the model are explained in detail, and a formal description of the model is offered. Finally, an example is presented to demonstrate the simulation process and results. This model can simulate the team software process from various angles, and supervise and predict the software process. It can also make the management of software development more scientific and improve software quality.
The current status of China's software industry is introduced, including the great potential opportunities and the many problems on the path to success. The main measures for keeping abreast of the needs of the times and integrating with the global IT industry are discussed.
Due to the randomness and time dependence of the factors affecting software reliability, most software reliability models are treated as stochastic processes, and the non-homogeneous Poisson process (NHPP) is the most widely used. However, the failure behavior of software does not follow the NHPP in a statistically rigorous manner, and a purely random method might not be enough to describe software failure behavior. To solve these problems, this paper proposes a new integrated approach that combines stochastic processes and grey system theory to describe the failure behavior of software. A grey NHPP software reliability model is put forward in discrete form, and a grey-based approach for estimating software reliability under the NHPP is formulated as a nonlinear multi-objective programming problem. Finally, four grey NHPP software reliability models are applied to four real datasets, and the dynamic R-square and predictive relative error are calculated. Compared with the original single NHPP software reliability model, modeling with the integrated approach achieves higher prediction accuracy for software reliability. Therefore, grey uncertain information is present in NHPP software reliability models, and exploiting this latent grey uncertain information may lead to more accurate software reliability estimation.
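The grey-system ingredient can be illustrated with the classical GM(1,1) model, which accumulates the raw series, fits the whitened equation by least squares, and forecasts ahead. This is a standard textbook sketch on a toy series, not the paper's grey NHPP formulation.

```python
import math

def gm11_forecast(x0, steps=1):
    """Fit the classical GM(1,1) grey model to a short positive series
    and forecast `steps` values ahead; returns the list of forecasts."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # 1-AGO series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]    # background values
    y = x0[1:]
    m = n - 1
    # least squares for x0(k) = -a*z1(k) + b via the 2x2 normal equations
    s_zz = sum(z * z for z in z1)
    s_z = sum(z1)
    s_zy = sum(z * v for z, v in zip(z1, y))
    s_y = sum(y)
    det = s_zz * m - s_z * s_z
    a = (s_z * s_y - m * s_zy) / det      # development coefficient
    b = (s_zz * s_y - s_z * s_zy) / det   # grey input

    def x1_hat(k):
        # time-response function of the accumulated series
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(steps)]

series = [10.0, 12.0, 14.5, 17.3, 20.9]   # toy, roughly exponential growth
fc = gm11_forecast(series, steps=2)
print([round(v, 2) for v in fc])
```

For a growing series, the fitted development coefficient is negative and the forecasts continue the exponential trend, which is the grey counterpart of the latent trend information the paper exploits.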
The Goel-Okumoto software reliability model, also known as the exponential non-homogeneous Poisson process model, is one of the earliest software reliability models to be proposed. From the literature, it is evident that most studies of the Goel-Okumoto software reliability model concern parameter estimation using the MLE method and model fit. It is widely known that predictive analysis is very useful for modifying, debugging and deciding when to terminate the software development testing process. However, there is a conspicuous absence of literature on both classical and Bayesian predictive analyses for the model. This paper presents some results on predictive analyses for the Goel-Okumoto software reliability model. Driven by the requirement for highly reliable software used in computers embedded in automotive, mechanical and safety control systems, industrial and quality process control, real-time sensor networks, aircraft, and nuclear reactors, among others, we address four issues in single-sample prediction closely associated with the software development process. We adopt Bayesian methods based on non-informative priors to develop explicit solutions to these problems. An example with real data in the form of times between software failures is used to illustrate the developed methodologies.
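For reference, the Goel-Okumoto mean value function is m(t) = a(1 − e^(−bt)), and for an NHPP the probability of no failure in (t, t+x] given testing up to t is exp(−[m(t+x) − m(t)]). A small sketch with purely illustrative parameter values:

```python
import math

def go_mean(t, a, b):
    """Goel-Okumoto mean value function: expected faults detected by time t,
    with a = initial fault content and b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def go_reliability(x, t, a, b):
    """NHPP conditional reliability R(x|t) = exp(-[m(t+x) - m(t)]):
    probability of observing no failure in (t, t+x]."""
    return math.exp(-(go_mean(t + x, a, b) - go_mean(t, a, b)))

a, b = 100.0, 0.05                     # illustrative parameters
print(round(go_mean(50.0, a, b), 2))   # → 91.79 faults expected by t = 50
print(round(go_reliability(10.0, 50.0, a, b), 4))
```

The predictive analyses in the paper place priors on a and b; the point estimates above simply show how the fitted model converts into reliability statements.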
Using Michael Porter's "diamond model" and based on regional development characteristics, we analyze the competitiveness of the livestock product processing industry cluster in Inner Mongolia from six aspects: factor conditions; demand conditions; corporate strategy, structure and competition; related and supporting industries; government; and opportunities. We put forward the following recommendations for improving the competitiveness of this cluster: (i) the government should increase capital input, focus on supporting the livestock product processing industry, and give play to the guidance and aggregation effects of financial funds; (ii) enterprises should vigorously develop leading enterprises and give full play to their cluster effect.
The Goel-Okumoto software reliability model is one of the earliest attempts to use a non-homogeneous Poisson process to model failure times observed during a software test interval. The model is known as the exponential NHPP model, as it describes an exponential software failure curve. Parameter estimation, model fit and predictive analyses based on one sample have been conducted on the Goel-Okumoto software reliability model; however, predictive analyses based on two samples have not. In two-sample prediction, the parameters and characteristics of the first sample are used to analyze and make predictions for the second sample, which helps save time and resources during the software development process. This paper presents some results on predictive analyses for the Goel-Okumoto software reliability model based on two samples. We address three issues in two-sample prediction closely associated with the software development testing process. Bayesian methods based on non-informative priors are adopted to develop solutions to these issues. The developed methodologies are illustrated with two sets of software failure data simulated from the Goel-Okumoto software reliability model.
Due to the high cost of fixing failures, safety concerns, and legal liabilities, organizations need to produce highly reliable software. Software reliability growth models (SRGMs) have been developed to track and measure the growth of reliability. Most proposed SRGMs treat the event of software fault detection in the testing and operational phases as a counting process. Moreover, if the software system is large, the number of faults detected during the testing phase becomes large, and the change in the number of faults detected and removed through debugging becomes sufficiently small compared with the initial fault content at the beginning of the testing phase. In such a situation, the software fault detection process can be modeled as a stochastic process with a continuous state space. Recently, artificial neural networks (ANNs) have been applied to software reliability growth prediction. In this paper, we propose an ANN-based software reliability growth model built on an Itô-type stochastic differential equation. The model has been validated, evaluated and compared with an existing NHPP model by applying it to actual failure/fault removal data sets from real software development projects. The proposed model, integrating the concept of a stochastic differential equation, performs comparatively better than the existing NHPP-based model.
Testing effort (TE) and imperfect debugging (ID) in the reliability modeling process may further improve the fitting and prediction results of software reliability growth models (SRGMs). To describe the S-shaped varying trend of the TE increasing rate more accurately, two S-shaped testing-effort functions (TEFs) are first proposed: the delayed S-shaped TEF (DS-TEF) and the inflected S-shaped TEF (IS-TEF). These two TEFs are then incorporated into various types (exponential-type, delayed S-shaped and inflected S-shaped) of non-homogeneous Poisson process (NHPP) SRGMs with two forms of ID, yielding a series of new NHPP SRGMs that consider S-shaped TEFs as well as ID. Finally, these new SRGMs and several comparison NHPP SRGMs are applied to four real failure data sets to investigate their fitting and prediction power. The experimental results show that: (i) the proposed IS-TEF is more suitable and flexible for describing the consumption of TE than previous TEFs; (ii) incorporating TEFs into the inflected S-shaped NHPP SRGM may be more effective and appropriate than doing so for the exponential-type and delayed S-shaped NHPP SRGMs; (iii) the inflected S-shaped NHPP SRGM considering both the IS-TEF and ID yields the most accurate fitting and prediction results among the compared NHPP SRGMs.
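The two TEF shapes can be written down directly. The forms below are the commonly used delayed S-shaped and inflected S-shaped functions; the parameter names α, β, ψ and their values are assumptions, since the abstract does not give the exact parameterization.

```python
import math

def ds_tef(t, alpha, beta):
    """Delayed S-shaped TEF: cumulative testing effort consumed by time t,
    saturating at the total effort alpha."""
    return alpha * (1.0 - (1.0 + beta * t) * math.exp(-beta * t))

def is_tef(t, alpha, beta, psi):
    """Inflected S-shaped TEF with inflection factor psi; larger psi
    delays the steep part of the curve."""
    return alpha * (1.0 - math.exp(-beta * t)) / (1.0 + psi * math.exp(-beta * t))

# both curves start at 0 and approach alpha as t grows
alpha, beta, psi = 1000.0, 0.1, 2.0
for t in (0.0, 10.0, 30.0, 100.0):
    print(t, round(ds_tef(t, alpha, beta), 1), round(is_tef(t, alpha, beta, psi), 1))
```

In the SRGMs of the paper, such a TEF replaces calendar time in the mean value function, so fault detection is driven by effort consumed rather than elapsed time.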
Owing to its low cost, short process and low energy consumption, semi-solid processing (SSP) of aluminum (Al) and magnesium (Mg) alloys has been considered a competitive approach to fabricating complicated components with excellent performance. Over the past decade, significant progress has been achieved in China in deeply understanding the SSP process and the microstructure and performance of the fabricated components. This paper starts with a retrospective overview of some common slurry preparation methods, followed by a presentation of the performance and underlying mechanisms of SSP-fabricated alloys. Then, the mainstream opinions on the microstructure evolution and rheological flow behavior of semi-solid slurry are discussed. Subsequently, the general situation and some recent examples of industrial applications of SSP are presented. Finally, special attention is paid to unresolved issues and future directions in SSP of Al and Mg alloys in China.
High-moisture extrusion technology should be considered one of the best choices for producing plant-based meat substitutes with the rich fibrous structure of real animal meat products. Unfortunately, the extrusion process has been seen as a "black box" with limited information about what occurs inside, posing serious obstacles to developing meat substitutes. This study designed a high-moisture extrusion process and developed 10 new plant-based meat substitutes comparable in fibrous structure to real animal meat. The study used the Feature-Augmented Principal Component Analysis (FA-PCA) method to visualize and understand the whole extrusion process in three ways, systematically and accurately. It established six sets of mathematical models of the high-moisture extrusion process based on 8000 pieces of data covering five types of parameters. The FA-PCA method improved the R² values significantly compared with the PCA method. Way 3 was the best for predicting product quality (Z), demonstrating that the gradual molecular conformational changes (Y^(n')) were critical in controlling the final quality of the plant-based meat substitutes. Moreover, the first visualization platform software for the high-moisture extrusion process has been established to open up the "black box" by incorporating virtual simulation technology. Through the software, practical tasks such as equipment installation, parameter adjustment, equipment disassembly, and data prediction can be easily carried out.
In order to effectively analyse the multivariate time series data of a complex process, a generic reconstruction technology based on the reduction theory of rough sets was proposed. Firstly, the phase space of the multivariate time series was reconstructed using a classical reconstruction technology. Then, the original decision table of rough set theory was set up according to the embedding dimensions and time delays of the original reconstructed phase space, and rough set reduction was used to delete the redundant dimensions and irrelevant variables and to reconstruct the generic phase space. Finally, the input vectors for the prediction of the multivariate time series were extracted according to the generic reconstruction results to identify the parameters of the prediction model. Verification results show that the developed reconstruction method leads to better generalization ability for the prediction model and is feasible and worthwhile for application.
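The classical first step, time-delay embedding of a scalar series, can be sketched as follows. Extending this to multivariate series and pruning redundant dimensions with rough-set reduction is the paper's contribution and is not shown here.

```python
def embed(series, dim, delay):
    """Time-delay embedding of a scalar series: each reconstructed state
    is (x[t], x[t-delay], ..., x[t-(dim-1)*delay]); returns the list of
    state vectors, one per admissible time index t."""
    start = (dim - 1) * delay
    return [[series[t - j * delay] for j in range(dim)]
            for t in range(start, len(series))]

series = list(range(10))          # toy scalar time series 0..9
states = embed(series, dim=3, delay=2)
print(states[0])                  # → [4, 2, 0]
print(len(states))                # → 6
```

Each state vector then becomes one candidate input row; rough set reduction would decide which of these delayed coordinates (and which variables, in the multivariate case) are actually needed by the predictor.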
A mathematical model of filling a disk-shaped mold cavity in the steady state was studied, and the mathematical model under a vibration field was developed from the steady-state model. From the steady-state model of filling a disk-shaped mold cavity, the filling time and the distributions of the velocity and pressure fields were obtained. The analytical results from the rheological model were compared with numerical simulation results obtained using Moldflow software for the powder injection molding filling process. The comparison shows that it is unreasonable to neglect the influence of temperature when calculating the pressure variation with time at the cavity gate, whereas it can be neglected in other situations, such as calculating the distribution of the velocity fields. This provides a theoretical reference for establishing correct models in both the steady state and under a vibration force field in the future.
文摘Behaviour detection models based on automata have been studied widely. By add- ing edge ε, the local automata are combined into global automata to describe and detect soft- ware behaviour. However, these methods in- troduce nondeterminacy, leading to models that are imprecise or inefficient. We present a model of software Behaviour Detection based on Process Algebra and system call (BDPA). In this model, a system call is mapped into an action, and a function is mapped into a process We construct a process expression for each function to describe its behaviour. Without con- strutting automata or introducing nondeter- minacy, we use algebraic properties and algo- rithms to obtain a global process expression by combining the process expressions derived from each function. Behaviour detection rules and methods based on BDPA are determined by equivalence theory. Experiments demon- strate that the BDPA model has better preci- sion and efficiency than traditional methods.
Abstract: We propose a software reliability growth model with testing-effort based on a continuous-state-space stochastic process, such as a lognormal process, and conduct its goodness-of-fit evaluation. We also discuss a parameter estimation method for our model. We then derive several software reliability assessment measures from the probability distribution of its solution process, and compare our model with existing continuous-state-space software reliability growth models in terms of the mean square error and Akaike's information criterion using actual fault count data.
Funding: The authors would like to thank CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico, the Brazilian National Council for Scientific and Technological Development) for financial support through the DTI grant of the MCT/CNPq/FNDCT No. 19/2009 announcement for the development of this work
Abstract: The main objective of this paper is to analyze the representativeness of the SPEM (Software Process Engineering Metamodel Specification) and BPMN (Business Process Modeling Notation) standards in the context of software process modeling. To perform this analysis, a standard structure for defining a software process, based upon a process ontology, was adopted. Then, the SPEM and BPMN notations and their semantically corresponding elements in the default process were identified. This mapping also includes components of the CMMI-DEV (Capability Maturity Model Integration for Development) and MR-MPS (Reference Model for Software Process Improvement) quality models, which was necessary to support evaluation of the mapping through a case study that models the best practices of these quality models. Finally, we analyzed these standards against specific characteristics considered necessary to model and represent software processes.
Abstract: Several software reliability growth models (SRGMs) have been developed to monitor reliability growth during the testing phase of software development. Most existing research in the literature assumes that a similar testing effort is required for each debugging effort. In practice, however, different types of faults may require different amounts of testing effort for their detection and removal. Consequently, faults are classified into three categories on the basis of severity: simple, hard and complex. This categorization may be extended to r types of faults on the basis of severity. Although some existing research has incorporated the concept that the fault removal rate (FRR) differs across fault types, it assumes that the FRR remains constant over the whole testing period. On the contrary, it has been observed that as testing progresses, the FRR changes due to changes in testing strategy, skill, environment and personnel resources. In this paper, a general discrete SRGM is proposed for errors of different severity in software systems using the change-point concept. The models are then formulated for two particular environments and validated on two real-life data sets. The results show better fit and wider applicability of the proposed models to different types of failure datasets.
Funding: Supported in part by the National Natural Science Foundation of China (NSFC) (92167106, 61833014) and the Key Research and Development Program of Zhejiang Province (2022C01206).
Abstract: The curse of dimensionality refers to the problem of increased sparsity and computational complexity when dealing with high-dimensional data. In recent years, the types and variables of industrial data have increased significantly, making data-driven models more challenging to develop. To address this problem, data augmentation technology has been introduced as an effective tool to solve the sparsity problem of high-dimensional industrial data. This paper systematically explores and discusses the necessity, feasibility, and effectiveness of augmented industrial data-driven modeling in the context of the curse of dimensionality and virtual big data. Then, the process of data augmentation modeling is analyzed, and the concept of data boosting augmentation is proposed. Data boosting augmentation involves designing the reliability weight and actual-virtual weight functions, and developing a double-weighted partial least squares model to optimize the three stages of data generation, data fusion, and modeling. This approach significantly improves the interpretability, effectiveness, and practicality of data augmentation in industrial modeling. Finally, the proposed method is verified using practical examples of fault diagnosis systems and virtual measurement systems in industry. The results demonstrate the effectiveness of the proposed approach in improving the accuracy and robustness of data-driven models, making them more suitable for real-world industrial applications.
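As a loose illustration of the actual-virtual weighting idea only (not the paper's double-weighted partial least squares model), the sketch below generates virtual samples by convex interpolation between real samples, tags each sample with an actual-vs-virtual weight, and fits a weighted least-squares line. The function names, the interpolation scheme, and the down-weighting of virtual samples are all assumptions made for illustration.

```python
import random

def augment(xs, ys, n_virtual, w_virtual=0.5):
    """Create virtual samples by convex interpolation between random pairs
    of real samples; real samples get weight 1.0, virtual ones w_virtual."""
    data = [(x, y, 1.0) for x, y in zip(xs, ys)]
    for _ in range(n_virtual):
        (x1, y1), (x2, y2) = random.sample(list(zip(xs, ys)), 2)
        lam = random.random()
        data.append((lam * x1 + (1 - lam) * x2,
                     lam * y1 + (1 - lam) * y2,
                     w_virtual))
    return data

def weighted_fit(data):
    """Weighted least-squares line y = m*x + c over (x, y, weight) triples."""
    sw = sum(w for _, _, w in data)
    mx = sum(w * x for x, _, w in data) / sw
    my = sum(w * y for _, y, w in data) / sw
    num = sum(w * (x - mx) * (y - my) for x, y, w in data)
    den = sum(w * (x - mx) ** 2 for x, _, w in data)
    m = num / den
    return m, my - m * mx
```

Because interpolated points of an exactly linear relation stay on the line, the weighted fit recovers the true slope and intercept regardless of how many virtual samples are added.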
Funding: Supported by the National Defense Basic Research Foundation (K1503063165)
Abstract: Based on the characteristics of the Team Software Process (TSP), a hierarchy-based model is adopted that combines a discrete-event model with a system dynamics model. This model represents the TSP on two levels: the inner level embodies the continuity of the software process, while the outer level embodies the phased software development process. The structure and principles of the model are explained in detail, and a formal description of the model is offered. Finally, an example is presented to demonstrate the simulation process and results. This model can simulate the team software process from various angles, and can supervise and predict the software process. It can also make the management of software development more scientific and improve software quality.
Abstract: The current status of China's software industry is introduced, including the great potential opportunities and the many problems on the path to success. The main measures for keeping abreast of the needs of the times and integrating with the global IT industry are discussed.
Funding: Supported by the National Natural Science Foundation of China (71671090), the Fundamental Research Funds for the Central Universities (NP2020022), and the Qinglan Project for Excellent Young and Middle-Aged Academic Leaders in Jiangsu Province.
Abstract: Due to the randomness and time dependence of the factors affecting software reliability, most software reliability models are treated as stochastic processes, and the non-homogeneous Poisson process (NHPP) is the most widely used. However, the failure behavior of software does not follow the NHPP in a statistically rigorous manner, and a purely random method may not be enough to describe software failure behavior. To solve these problems, this paper proposes a new integrated approach that combines stochastic processes and grey system theory to describe the failure behavior of software. A grey NHPP software reliability model is put forward in discrete form, and a grey-based approach for estimating software reliability under the NHPP is formulated as a nonlinear multi-objective programming problem. Finally, four grey NHPP software reliability models are applied to four real datasets, and the dynamic R-square and predictive relative error are calculated. Compared with the original single NHPP software reliability model, modeling using the integrated approach achieves higher prediction accuracy for software reliability. Therefore, NHPP software reliability models carry grey uncertain information, and exploiting this latent grey uncertain information can lead to more accurate software reliability estimation.
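The grey-system half of such an integrated approach is typically built on the classical GM(1,1) model. The sketch below is a standard GM(1,1) forecaster, offered as background on the grey machinery rather than as the paper's grey NHPP formulation: accumulate the series (1-AGO), fit the whitened equation dx/dt + a*x = b by least squares over background values, then difference back (IAGO) to get fitted and forecast values.

```python
import math

def gm11(series, horizon=1):
    """GM(1,1) grey forecast of a positive, non-degenerate series.
    Returns fitted values for the observed points plus `horizon` forecasts."""
    n = len(series)
    # 1-AGO: cumulative sums
    x1 = [sum(series[:i + 1]) for i in range(n)]
    # background values z(k) = mean of consecutive AGO points
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    # least squares for a, b in x0(k) = -a*z(k) + b
    m = n - 1
    sz, szz = sum(z), sum(v * v for v in z)
    sy = sum(series[1:])
    szy = sum(zk * yk for zk, yk in zip(z, series[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det  # assumes a != 0 (series not constant)

    def x1_hat(k):  # continuous-time solution of the whitened equation
        return (series[0] - b / a) * math.exp(-a * k) + b / a

    preds = [x1_hat(k) - x1_hat(k - 1) for k in range(1, n + horizon)]
    return [series[0]] + preds
```

On a roughly geometric series of cumulative fault counts, the fitted values track the observations closely and the one-step forecast continues the trend.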
Abstract: The Goel-Okumoto software reliability model, also known as the exponential non-homogeneous Poisson process model, is one of the earliest software reliability models to be proposed. From the literature, it is evident that most studies of the Goel-Okumoto software reliability model concern parameter estimation using the MLE method and model fit. It is widely known that predictive analysis is very useful for modifying, debugging and determining when to terminate the software development testing process. However, there is a conspicuous absence of literature on both classical and Bayesian predictive analyses for the model. This paper presents some results on predictive analyses for the Goel-Okumoto software reliability model. Driven by the requirement of highly reliable software used in computers embedded in automotive, mechanical and safety control systems, industrial and quality process control, real-time sensor networks, aircraft, and nuclear reactors, among others, we address four issues in single-sample prediction associated closely with the software development process. We have adopted Bayesian methods based on non-informative priors to develop explicit solutions to these problems. An example with real data in the form of times between software failures is used to illustrate the developed methodologies.
Funding: Supported by the National Natural Science Foundation (70963014, 71210107012)
Abstract: Using Michael Porter's "diamond model" and based on regional development characteristics, we analyze the competitiveness of the livestock-product processing industry cluster in Inner Mongolia from six aspects: factor conditions; demand conditions; corporate strategy, structure and competition; related and supporting industries; government; and opportunities. We put forward the following recommendations for improving the competitiveness of this cluster: (i) the government should increase capital input, focus on supporting the processing industry for livestock products, and give play to the guidance and aggregation effects of financial funds; (ii) enterprises should vigorously develop leading enterprises and give full play to the cluster effect of the leading enterprises.
Abstract: The Goel-Okumoto software reliability model is one of the earliest attempts to use a non-homogeneous Poisson process to model failure times observed during the software testing interval. The model is known as the exponential NHPP model, as it describes an exponential software failure curve. Parameter estimation, model fit and predictive analyses based on one sample have been conducted on the Goel-Okumoto software reliability model, but predictive analyses based on two samples have not. In two-sample prediction, the parameters and characteristics of the first sample are used to analyze and make predictions for the second sample. This helps to save time and resources during the software development process. This paper presents some results on predictive analyses for the Goel-Okumoto software reliability model based on two samples. We address three issues in two-sample prediction associated closely with the software development testing process. Bayesian methods based on non-informative priors have been adopted to develop solutions to these issues. The developed methodologies are illustrated with two sets of software failure data simulated from the Goel-Okumoto software reliability model.
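For reference, the Goel-Okumoto mean value function is m(t) = a * (1 - e^(-b*t)). The sketch below illustrates the classical (non-Bayesian) maximum-likelihood estimation of a and b from failure epochs observed over [0, T], using the standard result that a = n / (1 - e^(-b*T)) once b solves the score equation; it is background only, and the Bayesian two-sample machinery of the paper is not reproduced here.

```python
import math

def go_mean(t, a, b):
    """Goel-Okumoto mean value function m(t) = a * (1 - exp(-b*t))."""
    return a * (1.0 - math.exp(-b * t))

def go_mle(times, T):
    """MLE of (a, b) from failure epochs `times` observed on [0, T].
    The score equation n/b - sum(t_i) - n*T*exp(-b*T)/(1 - exp(-b*T)) = 0
    is strictly decreasing in b, so it is solved by bisection; then
    a = n / (1 - exp(-b*T))."""
    n, s = len(times), sum(times)

    def g(b):
        e = math.exp(-b * T)
        return n / b - s - n * T * e / (1.0 - e)

    lo, hi = 1e-8, 10.0
    if g(lo) <= 0.0:
        raise ValueError("no finite MLE: data show no reliability growth")
    while g(hi) > 0.0:  # widen the bracket until g changes sign
        hi *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    b = 0.5 * (lo + hi)
    a = n / (1.0 - math.exp(-b * T))
    return a, b
```

By construction the fitted mean at the end of observation equals the observed failure count, m(T) = n, which is a quick sanity check on any implementation.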
Abstract: Due to the high cost of fixing failures, safety concerns, and legal liabilities, organizations need to produce software that is highly reliable. Software reliability growth models have been developed to track and measure the growth of reliability. Most proposed software reliability growth models treat the event of software fault detection in the testing and operational phases as a counting process. Moreover, if the software system is large, the number of faults detected during the testing phase becomes large, and the change in the number of faults detected and removed through debugging becomes sufficiently small compared with the initial fault content at the beginning of the testing phase. In such a situation, the software fault detection process can be modeled as a stochastic process with a continuous state space. Recently, artificial neural networks (ANNs) have been applied to software reliability growth prediction. In this paper, we propose an ANN-based software reliability growth model based on an Itô-type stochastic differential equation. The model has been validated, evaluated and compared with an existing NHPP model by applying it to actual failure/fault-removal data sets cited from real software development projects. The proposed model, integrating the concept of a stochastic differential equation, performs comparatively better than the existing NHPP-based model.
基金supported by the Pre-research Foundation of CPLA General Equipment Department
Abstract: Accounting for testing effort (TE) and imperfect debugging (ID) in the reliability modeling process may further improve the fitting and prediction results of software reliability growth models (SRGMs). To describe the S-shaped varying trend of the TE increasing rate more accurately, two S-shaped testing-effort functions (TEFs), the delayed S-shaped TEF (DS-TEF) and the inflected S-shaped TEF (IS-TEF), are first proposed. These two TEFs are then incorporated into various types (exponential-type, delayed S-shaped and inflected S-shaped) of non-homogeneous Poisson process (NHPP) SRGMs with two forms of ID, to obtain a series of new NHPP SRGMs which consider S-shaped TEFs as well as ID. Finally, these new SRGMs and several comparison NHPP SRGMs are applied to four real failure datasets to investigate their fitting and prediction power. The experimental results show that: (i) the proposed IS-TEF is more suitable and flexible for describing the consumption of TE than previous TEFs; (ii) incorporating TEFs into the inflected S-shaped NHPP SRGM may be more effective and appropriate than doing so for the exponential-type and delayed S-shaped NHPP SRGMs; (iii) the inflected S-shaped NHPP SRGM considering both the IS-TEF and ID yields more accurate fitting and prediction results than the other comparison NHPP SRGMs.
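The two TEF shapes named above are commonly written as W(t) = alpha * (1 - (1 + beta*t) * e^(-beta*t)) for the delayed S-shaped form and W(t) = alpha * (1 - e^(-beta*t)) / (1 + lambda * e^(-beta*t)) for the inflected S-shaped form; whether the paper uses exactly these parameterizations is an assumption. A minimal sketch of both cumulative-effort curves:

```python
import math

def delayed_s_tef(t, alpha, beta):
    """Delayed S-shaped cumulative testing-effort function:
    W(t) = alpha * (1 - (1 + beta*t) * exp(-beta*t))."""
    return alpha * (1.0 - (1.0 + beta * t) * math.exp(-beta * t))

def inflected_s_tef(t, alpha, beta, lam):
    """Inflected S-shaped cumulative testing-effort function:
    W(t) = alpha * (1 - exp(-beta*t)) / (1 + lam * exp(-beta*t))."""
    e = math.exp(-beta * t)
    return alpha * (1.0 - e) / (1.0 + lam * e)
```

Both curves start at zero, saturate at the total effort alpha, and grow slowly at first and faster later, which is the S-shape the abstract refers to: for the delayed form the inflection sits at t = 1/beta, so the effort consumed in the second interval of length 1/beta exceeds that consumed in the first.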
基金financial supports from the Shenzhen Science and Technology Innovation Commission, China (Nos. KQTD20170328154443162, JCYJ20180305123432756)。
Abstract: Owing to its low cost, short process and low energy consumption, semi-solid processing (SSP) of aluminum (Al) and magnesium (Mg) alloys has been considered a competitive approach to fabricate complicated components with excellent performance. Over the past decade, significant progress has been achieved in China in deeply understanding the SSP process and the microstructure and performance of the fabricated components. This paper starts with a retrospective overview of some common slurry preparation methods, followed by the performance and underlying mechanisms of SSP-fabricated alloys. Then, the mainstream opinions on the microstructure evolution and rheological flow behavior of semi-solid slurry are discussed. Subsequently, the general situation and some recent examples of industrial applications of SSP are presented. Finally, special attention is paid to unresolved issues and future directions in SSP of Al and Mg alloys in China.
基金supported by the National Natural Science Foundation of China(31901608)the National Key Research and Development Plan of China(2021YFC2101402)the Science and Technology Innovation Project of Chinese Academy of Agricultural Sciences(CAAS-ASTIP-2022-IFST)。
Abstract: High-moisture extrusion technology should be considered one of the best choices for producing plant-based meat substitutes with the rich fibrous structure offered by real animal meat products. Unfortunately, the extrusion process has been seen as a "black box" with limited information about what occurs inside, causing serious obstacles in developing meat substitutes. This study designed a high-moisture extrusion process and developed 10 new plant-based meat substitutes comparable in fibrous structure to real animal meat. The study used the Feature-Augmented Principal Component Analysis (FA-PCA) method to visualize and understand the whole extrusion process in three ways, systematically and accurately. It established six sets of mathematical models of the high-moisture extrusion process based on 8000 pieces of data covering five types of parameters. The FA-PCA method improved the R² values significantly compared with the PCA method. Way 3 was the best at predicting product quality (Z), demonstrating that the gradual molecular conformational changes (Y^(n')) were critical in controlling the final quality of the plant-based meat substitutes. Moreover, the first visualization platform software for the high-moisture extrusion process has been established to clearly show the "black box" by incorporating virtual simulation technology. Through the software, practical work such as equipment installation, parameter adjustment, equipment disassembly, and data prediction can be easily achieved.
基金Project(61025015) supported by the National Natural Science Funds for Distinguished Young Scholars of ChinaProject(21106036) supported by the National Natural Science Foundation of China+2 种基金Project(200805331103) supported by Research Fund for the Doctoral Program of Higher Education of ChinaProject(NCET-08-0576) supported by Program for New Century Excellent Talents in Universities of ChinaProject(11B038) supported by Scientific Research Fund for the Excellent Youth Scholars of Hunan Provincial Education Department,China
Abstract: In order to effectively analyze multivariate time series data from complex processes, a generic reconstruction technology based on the reduction theory of rough sets is proposed. First, the phase space of the multivariate time series is reconstructed by a classical reconstruction technique. Then, the original decision table of rough set theory is set up according to the embedding dimensions and time delays of the original reconstructed phase space, and rough set reduction is used to delete redundant dimensions and irrelevant variables and to reconstruct the generic phase space. Finally, the input vectors for the prediction of the multivariate time series are extracted according to the generic reconstruction results to identify the parameters of the prediction model. Verification results show that the developed reconstruction method leads to better generalization ability for the prediction model and is feasible and worthwhile for application.
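The "classical reconstruction technique" in the first step is usually delay-coordinate embedding. A minimal sketch for a single scalar series is shown below; the paper's rough-set reduction step, which prunes redundant dimensions and irrelevant variables in the multivariate case, is not reproduced here.

```python
def delay_embed(series, dim, tau):
    """Classical delay-coordinate reconstruction: map a scalar series x
    to the vectors [x(t), x(t - tau), ..., x(t - (dim-1)*tau)] for every
    t at which all dim delayed samples exist."""
    n = len(series)
    start = (dim - 1) * tau  # earliest index with a full delay vector
    return [[series[t - j * tau] for j in range(dim)]
            for t in range(start, n)]
```

For a multivariate series, the same construction is applied per variable and the per-variable vectors are concatenated, which is exactly where a reduction step becomes useful to discard redundant embedding dimensions.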
基金Project(10672197) supported by the National Natural Science Foundation of ChinaProject(07JJ1001) supported by the Natural Science Foundation of Hunan Province for Distinguished Young Scholars,China
Abstract: A mathematical model of filling a disk-shaped mold cavity in the steady state was studied, and a model under a vibration field was developed from the steady-state model. From the steady-state model, the filling time, the distribution of the velocity field, and the pressure field were obtained. The analysis results from the rheological analytic model were compared with numerical simulation results obtained using Moldflow software for the powder injection molding filling process. The comparison shows that it is unreasonable to neglect the influence of temperature when calculating the pressure change over time at the cavity gate, while it can be neglected in other situations such as calculating the distribution of the velocity fields. This provides a theoretical reference for establishing correct models in both the steady state and under a vibration force field in the future.