With the extensive use of workflow, analysis methods to verify the correctness of workflows are becoming increasingly important. In this paper, we exploit a Petri-net-based verification method for workflow process models, which verifies the workflow and finds potential errors in the process design. Additionally, an efficient verification algorithm is given.
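To make this kind of check concrete, the sketch below is a minimal Petri net reachability search in Python; the toy net, the `enabled`/`fire` helpers, and the deadlock test are hypothetical simplifications for illustration, not the algorithm from the paper.

```python
from collections import deque

# Hypothetical toy encoding: a transition maps input places to output places.
# Markings are tuples of token counts, indexed by place.
PLACES = ["start", "a", "b", "end"]
TRANSITIONS = {
    "t1": ({"start": 1}, {"a": 1, "b": 1}),   # AND-split
    "t2": ({"a": 1, "b": 1}, {"end": 1}),     # AND-join
}

def enabled(marking, pre):
    return all(marking[PLACES.index(p)] >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = list(marking)
    for p, n in pre.items():
        m[PLACES.index(p)] -= n
    for p, n in post.items():
        m[PLACES.index(p)] += n
    return tuple(m)

def deadlocks(initial, final):
    """BFS over reachable markings; report markings with no enabled
    transition that differ from the desired final marking."""
    seen, queue, bad = {initial}, deque([initial]), []
    while queue:
        m = queue.popleft()
        succ = [fire(m, pre, post) for pre, post in TRANSITIONS.values()
                if enabled(m, pre)]
        if not succ and m != final:
            bad.append(m)
        for s in succ:
            if s not in seen:
                seen.add(s)
                queue.append(s)
    return bad

print(deadlocks((1, 0, 0, 0), (0, 0, 0, 1)))  # [] -> no deadlock in this net
```

An ill-formed net (for example, an AND-split joined by an XOR-join) would leave a token stranded and show up here as a non-final dead marking.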
Classical management accounting (MA) focuses on decision facilitating and decision influencing (Demski & Feltham, 1976). Focusing on the facilitating perspective, MA has to provide information to managers, and depending on the problem complexity, managers and accountants have to solve problems in a dyadic way. A dual process model, the heuristic-systematic model (HSM), expands this so-called manager-accountant dyad and shows different cases of actual human information processing: managers and accountants process either systematically or heuristically. So far, many concepts have been designed in relation to the normative concept of the economic rationality principle. Consequently, recent research only uses systematic information processing, based on the principle of the economic man. In this paper, a decision-behavior-oriented approach tries to describe actual decision makers such as managers and accountants and shows new possibilities within MA. Therefore, the potential of heuristic information processing is analyzed, based on the phenomenon of ecological rationality as one form of bounded rationality. Different cognitive heuristics in business economics are identified and analyzed, and the outstanding performance of heuristics compared with more complex calculations is shown. So far, however, these findings have been limited to marketing and investments. Significant research is needed regarding the conditions for application and the success factors of heuristics in business economics, and new empirical findings have to be explicitly transferred to MA.
The reservoir volumetric approach represents a widely accepted, but flawed, method of petroleum play resource calculation. In this paper, we propose a combination of techniques that can improve the applicability and quality of the resource estimation. These techniques include: 1) the use of the Multivariate Discovery Process model (MDP) to derive unbiased distribution parameters of reservoir volumetric variables and to reveal correlations among the variables; 2) the use of the Geo-anchored method to estimate simultaneously the number of oil and gas pools in the same play; and 3) the cross-validation of assessment results from different methods. These techniques are illustrated using an example of crude oil and natural gas resource assessment of the Sverdrup Basin, Canadian Archipelago. The example shows that when direct volumetric measurements of the untested prospects are not available, the MDP model can help derive unbiased estimates of the distribution parameters by using information from the discovered oil and gas accumulations. It also shows that estimating the number of oil and gas accumulations and the associated size ranges from a discovery process model provides an alternative and efficient approach when inadequate geological data hinder the estimation. Cross-examination of assessment results derived using different methods allows one to focus on and analyze the causes of the major differences, thus providing a more reliable assessment outcome.
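For intuition about how pool-size distributions and pool counts combine into a play total, here is a minimal Monte Carlo sketch in Python; the lognormal and Poisson parameter values are invented for illustration and are not the MDP or Geo-anchored estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (invented) parameters: pool sizes ~ lognormal, pool count ~ Poisson.
MU, SIGMA = 2.0, 1.2      # log-scale mean/std of individual pool size
MEAN_POOLS = 15           # expected number of undiscovered pools in the play

def simulate_play(n_trials=100_000):
    totals = np.empty(n_trials)
    for i in range(n_trials):
        n = rng.poisson(MEAN_POOLS)                  # uncertain pool count
        sizes = rng.lognormal(MU, SIGMA, size=n)     # individual pool sizes
        totals[i] = sizes.sum()
    return totals

totals = simulate_play()
# Industry-style percentiles: P90 (90% chance of exceeding) down to P10.
p90, p50, p10 = np.percentile(totals, [10, 50, 90])
print(f"P90={p90:.1f}  P50={p50:.1f}  P10={p10:.1f}  mean={totals.mean():.1f}")
```

Cross-validation in the sense of the paper would compare percentiles like these, produced independently by different assessment methods, and investigate any large gaps.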
Low pressure chemical vapor deposition (LPCVD) is one of the most important processes in semiconductor manufacturing. However, the spatial distribution of the internal temperature and the extremely small number of samples make it hard to build a good-quality model of this batch process. Besides, due to the properties of this process, the reliability of the model must be taken into consideration when optimizing the manipulated variables (MVs). In this work, an optimal design strategy based on a self-learning Gaussian process model (GPM) is proposed to control this kind of spatial batch process. The GPM is utilized as the internal model to predict the thicknesses of thin films on all spatially distributed wafers using the limited data. Unlike conventional model-based design, the uncertainties of the predictions provided by the GPM are taken into consideration to guide the optimal design of the manipulated variables, so that the design can be more prudent. Besides, the GPM is also actively enhanced using as little data as possible, based on the predictive uncertainties. The effectiveness of the proposed strategy is successfully demonstrated on an LPCVD process.
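A minimal sketch of the two uses of predictive uncertainty described above, using scikit-learn's GaussianProcessRegressor on synthetic data; the one-dimensional process, the target value, and the penalty weight are invented stand-ins, not the LPCVD model from the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Synthetic stand-in for the batch process: film thickness vs. one MV.
def process(u):
    return np.sin(3 * u) + 0.5 * u

U = rng.uniform(0, 2, size=6).reshape(-1, 1)          # very few samples
y = process(U).ravel() + 0.05 * rng.standard_normal(6)

gpm = GaussianProcessRegressor(RBF(0.5) + WhiteKernel(0.01), normalize_y=True)
gpm.fit(U, y)

grid = np.linspace(0, 2, 201).reshape(-1, 1)
mean, std = gpm.predict(grid, return_std=True)        # prediction + uncertainty

target = 1.0                                          # desired film thickness
# 1) Prudent design: penalize candidate MVs where the model is uncertain.
score = (mean - target) ** 2 + 1.0 * std ** 2
u_design = grid[np.argmin(score)]
# 2) Active self-learning: run the next experiment where uncertainty is largest.
u_next = grid[np.argmax(std)]
print(f"prudent MV = {u_design[0]:.3f}, next experiment at {u_next[0]:.3f}")
```

The penalty term keeps the chosen MV inside the region the model actually knows, which is the "prudent" behavior the abstract refers to.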
Building owners, designers and constructors are seeing a rapid increase in the number of sustainably designed high performance buildings. These buildings provide numerous benefits to owners and occupants, including improved indoor air quality, energy efficiency, and environmental site standards, and ultimately enhance productivity for the building occupants. As the demand for higher building energy efficiency and environmental standards increases, the application of a set of process models will support consistency and optimization during the design process. Systems engineering process models have proven effective in taking an integrated and comprehensive view of a system while allowing for clear stakeholder engagement, requirements definition, life cycle analysis, technology insertion, and validation and verification. This paper overlays systems engineering on the sustainable design process by providing a framework for applying the Waterfall, Vee, and Spiral process models to high performance buildings. Each process model is mapped to the sustainable design process and evaluated for its applicability to projects and building types. Adaptations of the models are provided as Green Building Process Models.
To equip data-driven dynamic chemical process models with strong interpretability, we develop a light attention–convolution–gate recurrent unit (LACG) architecture with three sub-modules (a basic module, a brand-new light attention module, and a residue module) that are specially designed to learn the general dynamic behavior, transient disturbances, and other input factors of chemical processes, respectively. Combined with a hyperparameter optimization framework, Optuna, the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process. The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models, including the feedforward neural network, the convolutional neural network, long short-term memory (LSTM), and attention-LSTM. Moreover, compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1, the LACG parameters are demonstrated to be interpretable, and more details on the variable interactions can be observed from the model parameters than with the traditional interpretable model attention-LSTM. This contribution enriches interpretable machine learning knowledge and provides a reliable method with high accuracy for actual chemical process modeling, paving a route to intelligent manufacturing.
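The following PyTorch sketch shows the general shape of a convolution-GRU-attention pipeline of this kind; the layer sizes, the attention form, and the module wiring are guesses for illustration, not the published LACG architecture.

```python
import torch
import torch.nn as nn

class ConvGRUAttention(nn.Module):
    """Illustrative conv -> GRU -> attention regressor for multivariate
    process time series shaped (batch, time, features). Not the published LACG."""
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.conv = nn.Conv1d(n_features, hidden, kernel_size=3, padding=1)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)   # lightweight additive attention score
        self.head = nn.Linear(hidden, 1)   # predict one variable, e.g. flowrate

    def forward(self, x):                  # x: (batch, time, features)
        z = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local dynamics
        h, _ = self.gru(torch.relu(z))                    # temporal dynamics
        w = torch.softmax(self.attn(h), dim=1)            # weights over time
        context = (w * h).sum(dim=1)                      # attended summary
        return self.head(context).squeeze(-1), w.squeeze(-1)

model = ConvGRUAttention(n_features=5)
y_hat, weights = model(torch.randn(8, 60, 5))
print(y_hat.shape, weights.shape)  # torch.Size([8]) torch.Size([8, 60])
```

Returning the attention weights alongside the prediction is one simple route to the kind of interpretability the abstract describes: the weights show which time steps (e.g., transient disturbances) drove each prediction.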
The Software Development Life Cycle (SDLC) is one of the major ingredients for the development of efficient software systems within a time frame and with low cost involvement. From the literature, it is evident that various kinds of process models are used by the software industry for the development of small, medium and long-term software projects, but many of them do not cover risk management. It is quite obvious that improper selection of the software development process model leads to failure of the software product, as development is a time-bound activity. In the present work, a new software development process model is proposed which covers the risks at any stage of the development of the software product. The model is named the Hemant-Vipin (HV) process model and may be helpful for the software industry for the development of efficient software products and timely delivery to the client. The efficiency of the HV process model is assessed against various factors such as requirement clarity, user feedback, change agility, predictability, risk identification, practical implementation, customer satisfaction, incremental development, use of ready-made components, quick design, and resource organization; a case study shows that the presented approach covers more of these parameters than the existing process models.
Aims: Recent mechanistic explanations for community assembly focus on the debates surrounding niche-based deterministic and dispersal-based stochastic models. This body of work has emphasized the importance of both habitat filtering and dispersal limitation, and many of these works have used the assumption of species spatial independence to simplify the complexity of spatial modeling in natural communities under dispersal limitation and/or habitat filtering. One potential drawback of this simplification is that it does not consider species interactions and how they may influence the spatial distribution of species and the phylogenetic and functional diversity. Here, we assess the validity of the assumption of species spatial independence using data from a subtropical forest plot in southeastern China. Methods: We use the four most commonly employed spatial statistical models: the homogeneous Poisson process (pure random effect), the heterogeneous Poisson process (effect of habitat heterogeneity), the homogeneous Thomas process (dispersal limitation alone), and the heterogeneous Thomas process (joint effect of habitat heterogeneity and dispersal limitation). These models are used to investigate the contribution of different mechanisms in shaping the species, phylogenetic and functional structures of communities. Important findings: Our evidence from species, phylogenetic and functional diversity demonstrates that the habitat-filtering and/or dispersal-based models perform well and the assumption of species spatial independence is relatively valid at larger scales (50 × 50 m). Conversely, at local scales (10 × 10 and 20 × 20 m), the models often fail to predict the species, phylogenetic and functional diversity, suggesting that the assumption of species spatial independence is invalid and that biotic interactions are increasingly important at these spatial scales.
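For intuition, a homogeneous Thomas process can be simulated in a few lines: parents are drawn from a Poisson process, and offspring are scattered around each parent with Gaussian displacements. The intensity values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def thomas_process(kappa, mu, sigma, size=100.0):
    """Homogeneous Thomas process on a size x size window:
    parent intensity kappa, mean offspring per parent mu,
    Gaussian offspring dispersal sd sigma."""
    n_parents = rng.poisson(kappa * size**2)
    parents = rng.uniform(0, size, (n_parents, 2))
    points = []
    for p in parents:
        n_kids = rng.poisson(mu)
        points.append(p + sigma * rng.standard_normal((n_kids, 2)))
    pts = np.vstack(points)
    inside = np.all((pts >= 0) & (pts <= size), axis=1)  # clip to window
    return pts[inside]

pts = thomas_process(kappa=0.005, mu=8, sigma=3.0)  # clustered point pattern
print(len(pts), "points; clustering mimics dispersal limitation")
```

The heterogeneous variants replace the constant parent intensity with a function of habitat covariates, which is how habitat filtering enters the model.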
With the development of automation and informatization in the steelmaking industry, the human brain gradually fails to cope with the increasing amount of data generated during the steelmaking process. Machine learning technology provides a new method, beyond production experience and metallurgical principles, for dealing with large amounts of data. The application of machine learning in the steelmaking process has become a research hotspot in recent years. This paper provides an overview of the applications of machine learning to steelmaking process modeling, covering hot metal pretreatment, primary steelmaking, secondary refining, and some other aspects. The three most frequently used machine learning algorithms in steelmaking process modeling are the artificial neural network, the support vector machine, and case-based reasoning, with proportions of 56%, 14%, and 10%, respectively. Data collected in steelmaking plants are frequently faulty; thus, data processing, especially data cleaning, is crucially important to the performance of machine learning models. The detection of variable importance can be used to optimize the process parameters and guide production. Machine learning is used in hot metal pretreatment modeling mainly for endpoint S content prediction. The prediction of endpoint element compositions and process parameters is widely investigated in primary steelmaking. Machine learning is used in secondary refining modeling mainly for ladle furnace, Ruhrstahl–Heraeus, vacuum degassing, argon oxygen decarburization, and vacuum oxygen decarburization processes. Further development of machine learning in steelmaking process modeling can be realized through additional efforts in the construction of data platforms, the industrial transfer of research achievements to practical steelmaking processes, and the improvement of the universality of machine learning models.
With regard to assembly line cost control at the Dechang (HK) company, the cost control of the motor housing process deserves particular attention. Because the supply quantity per machine is large while the unit price of a motor housing is small, cost control of the automatic production line through modeling is significant. It is found that the relevant equipment control involves the shaft and the crank linkage, which also need to be controlled in detail. Only by pursuing such benefits can the main problem of a high-cost process be fundamentally resolved.
The successful execution and management of Offshore Software Maintenance Outsourcing (OSMO) can be very beneficial for both OSMO vendors and OSMO clients. Although a lot of research on software outsourcing is going on, most of the existing literature on offshore outsourcing deals with the outsourcing of software development only. Several frameworks have been developed to guide software system managers concerning offshore software outsourcing. However, none of these studies delivered comprehensive guidelines for managing the whole OSMO process. There is a considerable lack of research on managing OSMO from a vendor's perspective. Therefore, to find the best practices for managing an OSMO process, it is necessary to further investigate such complex and multifaceted phenomena from the vendor's perspective. This study validated the preliminary OSMO process model via a case study research approach. The results showed that the OSMO process model is applicable in an industrial setting with few changes. The industrial data collected during the case study enabled this paper to extend the preliminary OSMO process model. The refined version of the OSMO process model has four major phases: (i) Project Assessment, (ii) SLA, (iii) Execution, and (iv) Risk.
Cities are facing the challenge of a sharp rise in population and consequently need to be equipped with the latest smart services to provide the comforts of life to their residents. Smart integrated solutions are also needed to deal with the social and environmental challenges caused by increasing urbanization. Currently, the development of an integrated network of smart services within a city faces barriers including less efficient collection and sharing of data, along with inadequate collaboration of software and hardware. Aiming to resolve these issues, this paper recommends a solution for synchronous functionality in the smart services' integration process through a modeling technique. Using this integration modeling solution, the service participants, processes and tasks of smart services are first identified, and then standard illustrations are developed for a better understanding of the integrated service group environment. Business Process Model and Notation (BPMN) based models are developed and discussed for a devised case study, i.e., remote healthcare from a smart home, to test and experiment with the approach. The research concludes with the application of the integration process model to the required data sharing among different service groups. The outcomes of the modeling are better understanding and the attainment of maximum automation, which can be referenced and replicated.
To investigate the process through which information technology (IT) impacts firm competitiveness, an integrated process model of IT impacts on firm competitiveness is put forward based on the process-oriented view, the resource-based view and the complementary resource view. It comprises an IT conversion process, an information system (IS) adoption process, an IS use process and a competition process. The application capability of IT plays the critical role, determining the efficiency and effectiveness of the aforementioned four processes. The process model of IT impacts on firm competitiveness can also be used to explain why, under what situations and how IT can generate positive organizational outcomes, and it provides theoretical bases for further empirical study.
Workflow management is an important aspect of CSCW at present. The elementary knowledge of workflow processes is introduced, a Petri-net-based process modeling methodology and its basic definitions are provided, and the analysis and verification of the structural and behavioral correctness of workflow processes are discussed. Finally, an algorithm for the verification of process definitions is proposed.
To achieve an on-demand and dynamic composition model of inter-organizational business processes, a new approach for business process modeling and verification is introduced using the pi-calculus theory. A new business process model which is multi-role, multi-dimensional, integrated and dynamic is proposed, relying on inter-organizational collaboration. Compatible with the traditional linear sequence model, the new model is an M × N multi-dimensional mesh, and it provides horizontal and vertical formal descriptions of the collaborative business process model. Finally, the pi-calculus theory is utilized to verify the deadlocks, livelocks and synchronization of the example models. The result shows that the proposed approach is efficient and applicable in inter-organizational business process modeling.
There are numerous application areas for computing the similarity between process models, including finding similar models in a repository, controlling redundancy among process models, and finding corresponding activities between a pair of process models. The similarity between two process models is computed based on the similarity of their labels, structures, and execution behaviors. Several attempts have been made to develop similarity techniques for activity labels as well as for execution behavior. A notable problem, however, is that two process models can also be similar when there is a structural variation between them; yet neither a benchmark dataset for structural similarity between process models nor an effective technique to compute it exists. To that end, we have developed a large collection of process models in which structural changes are handcrafted while preserving the semantics of the models. Furthermore, we have used a machine-learning-based approach to compute the similarity between a pair of process models having structural and label differences. Finally, we have evaluated the proposed approach using our generated collection of process models.
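As a flavor of a feature-based approach to this problem, the sketch below turns a pair of small process graphs into label-similarity and structure-similarity features that could feed a learned similarity model; the graph encoding and the two features are invented for illustration, not the paper's feature set.

```python
from difflib import SequenceMatcher

# A process model reduced to activity labels plus directed edges between them.
model_a = {"labels": ["receive order", "check stock", "ship goods"],
           "edges": {("receive order", "check stock"),
                     ("check stock", "ship goods")}}
model_b = {"labels": ["order received", "check stock", "ship goods"],
           "edges": {("order received", "check stock"),
                     ("check stock", "ship goods")}}

def label_similarity(a, b):
    """Mean best string similarity between the two label sets."""
    scores = [max(SequenceMatcher(None, la, lb).ratio() for lb in b["labels"])
              for la in a["labels"]]
    return sum(scores) / len(scores)

def structural_similarity(a, b):
    """Jaccard overlap of directed edge sets (control-flow structure)."""
    return len(a["edges"] & b["edges"]) / len(a["edges"] | b["edges"])

features = [label_similarity(model_a, model_b),
            structural_similarity(model_a, model_b)]
print(features)  # feature vector a classifier/regressor could be trained on
```

A trained model over such features can learn that a high label score with a low structural score signals exactly the structural-variation case the abstract highlights.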
To solve the problem of risk identification and quantitative assessment for human-computer interaction (HCI) in complex avionics systems, an HCI safety analysis framework based on system-theoretic process analysis (STPA) and the cognitive reliability and error analysis method (CREAM) is proposed. STPA-CREAM can identify unsafe control actions and find the causal paths during the interaction between avionics systems and the pilot automatically, with the help of formal verification tools. The common performance conditions (CPC) of avionics systems in the aviation environment are established, and a quantitative analysis of human failure is carried out. Taking the head-up display (HUD) system interaction process as an example, a case analysis is carried out, and the layered safety control structure and formal model of the HUD interaction process are established. For the interactive behavior "Pilots approaching with HUD", four unsafe control actions and 35 causal scenarios are identified, and the impact of common performance conditions at different levels on the pilot decision model is analyzed. The results show that the HUD's HCI level gradually improves as the scores of the CPC increase, and that the quality of crew member cooperation and the time sufficiency of the task are key to its HCI. The case analysis shows that STPA-CREAM can quantitatively assess the hazards in HCI and identify the key factors that impact safety.
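To show what a CREAM-style quantification step can look like, the sketch below scales a nominal failure probability by one multiplier per common performance condition, in the spirit of the extended CREAM method; the CPC names and multiplier values are invented placeholders, not the paper's calibrated numbers.

```python
# Extended-CREAM-style adjustment: a nominal cognitive failure probability
# is scaled by one multiplier per common performance condition (CPC).
# All numbers below are illustrative placeholders.
NOMINAL_CFP = 0.01  # nominal failure probability for the cognitive activity

cpc_multipliers = {
    "adequacy of organisation":   1.0,  # adequate -> neutral
    "working conditions":         1.0,  # compatible -> neutral
    "crew collaboration quality": 0.5,  # very efficient -> improves reliability
    "available time":             5.0,  # temporarily inadequate -> degrades it
    "adequacy of training":       0.8,  # adequate, high experience
}

cfp = NOMINAL_CFP
for condition, weight in cpc_multipliers.items():
    cfp *= weight

print(f"adjusted CFP = {cfp:.4f}")  # 0.01 * 0.5 * 5.0 * 0.8 = 0.02
```

This mirrors the abstract's finding qualitatively: improving cooperation quality and time sufficiency (multipliers below 1) dominates the adjusted failure probability.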
This study takes the virtual business society environment (VBSE) practical training course as a case study and applies the theoretical framework of the context, input, process, product (CIPP) model to construct an evaluation indicator system for the application of civic and political education in professional practice courses. The context evaluation is measured by the support for integrating civic and political education into the VBSE practical training course, teachers' cognition, and the integration of course objectives; the input evaluation is measured by the matching degree of teachers' civic and political competence and the matching degree of teaching resources; the process evaluation is measured by the degree of implementation of civic and political teaching and the degree of students' acceptance; and the product evaluation is measured by the degree of impact of civic and political teaching.
Currently, most public higher learning institutions in Tanzania rely on traditional in-class examinations, requiring students to register and present identification documents for examination eligibility verification. This system, however, is prone to impersonation due to security vulnerabilities in the current student verification system. These vulnerabilities include weak authentication, lack of encryption, and inadequate anti-counterfeiting measures. Additionally, advanced printing technologies and online marketplaces make it easy to create convincing fake identity documents. The Improved Mechanism for Detecting Impersonations (IMDIs) system detects impersonations in in-class exams by integrating QR codes and dynamic question generation based on student profiles. It consists of a mobile verification app, built with Flutter and communicating via RESTful APIs, and a web system, developed with Laravel using HTML, CSS, and JavaScript. The two components communicate through APIs, with MySQL managing the database, and the mobile app and web server interact to ensure efficient verification and security during examinations. The implemented IMDIs system was validated by a mobile application which integrates a QR code scanner for capturing codes embedded in student identity cards and linking them to a dynamic question generation model. The QG model uses natural language processing (NLP) algorithms and question generation (QG) techniques to create dynamic profile questions. Results show that the IMDIs system could generate four challenging profile-based questions within two seconds, allowing the verification of 200 students in 33 minutes by one operator. The IMDIs system also tracks exam-eligible students, aiding in exam attendance, and integrates with a Short Message Service (SMS) to report impersonation incidents to a dedicated security officer in real time. The IMDIs system was tested and found to be 98% secure and 100% convenient, with a 0% false rejection rate and a 2% false acceptance rate, demonstrating its security, reliability, and high performance.
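One common way to make QR-coded identity data hard to counterfeit is to sign the payload before encoding it; the sketch below uses Python's standard hmac module as a minimal illustration of that idea and is not the IMDIs implementation (the secret, field names, and format are placeholders).

```python
import hmac, hashlib, json

SECRET = b"server-side-secret"  # placeholder; kept on the verification server

def make_qr_payload(student_id, reg_number):
    """Build a signed payload that can be encoded into a QR code."""
    body = json.dumps({"sid": student_id, "reg": reg_number}, sort_keys=True)
    tag = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}|{tag}"

def verify_qr_payload(payload):
    """Recompute the tag; reject forged or tampered QR codes."""
    body, _, tag = payload.rpartition("|")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

qr = make_qr_payload("T123", "2024-001")
print(verify_qr_payload(qr))                        # True: genuine card
print(verify_qr_payload(qr.replace("T123", "T999")))  # False: tampered identity
```

A forged ID printed with an unsigned or re-signed QR code fails verification unless the attacker also holds the server-side secret, which directly addresses the weak-authentication and anti-counterfeiting vulnerabilities listed above.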
Many applications of principal component analysis (PCA) can be found in dimensionality reduction, but the linear PCA method is not well suited to nonlinear chemical processes. A new PCA method based on an improved input training neural network (IT-NN) is proposed for nonlinear system modelling in this paper. A momentum factor and an adaptive learning rate are introduced into the learning algorithm to improve the training speed of the IT-NN. In contrast to the auto-associative neural network (ANN), the IT-NN has fewer hidden layers and higher training speed. The effectiveness is illustrated through an experimental comparison of the IT-NN with linear PCA and the ANN. Moreover, the IT-NN is combined with an RBF neural network (RBF-NN) to model the yields of ethylene and propylene in the naphtha pyrolysis system. The illustrative example and practical application show that the IT-NN combined with the RBF-NN is an effective method for nonlinear chemical process modelling.
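The core idea of an input-training network, latent inputs optimized jointly with the decoder weights, can be sketched in a few lines of PyTorch; the layer sizes, optimizer, and momentum value below are illustrative choices, not the improved IT-NN from the paper.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(200, 2) @ torch.randn(2, 6)   # toy low-rank process data

n_samples, n_vars, n_latent = X.shape[0], X.shape[1], 2
decoder = nn.Sequential(nn.Linear(n_latent, 8), nn.Tanh(), nn.Linear(8, n_vars))
codes = nn.Parameter(torch.zeros(n_samples, n_latent))  # trainable network inputs

# The momentum here echoes the momentum factor mentioned in the abstract.
opt = torch.optim.SGD([{"params": decoder.parameters()},
                       {"params": [codes]}], lr=0.01, momentum=0.9)

for epoch in range(500):
    opt.zero_grad()
    loss = ((decoder(codes) - X) ** 2).mean()  # reconstruct data from codes
    loss.backward()
    opt.step()

# `codes` now plays the role of nonlinear principal components of X.
print(loss.item(), codes.shape)  # reconstruction error, (200, 2) latent scores
```

Compared with an auto-associative network, there is no encoder half at all: the latent scores are free parameters, which is what keeps the network shallow and fast to train.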