Funding: This paper is supported by Research Grant Number PP-FTSM-2022.
Abstract: This research recognizes the limitations and challenges of adapting and applying process mining, a powerful tool and technique, in a hypothetical Software Architecture (SA) evaluation framework with lightweight features. Process mining deals with the large-scale complexity of security and performance analysis, which are the goals of SA evaluation frameworks. Motivated by these conjectures, all process mining research in the realm of SA is thoroughly reviewed, and nine challenges for process mining adaptation are identified. Process mining is embedded in the framework, and to boost the quality of the SA model for further analysis, the framework nominates the architectural discovery algorithms Flower, Alpha, Integer Linear Programming (ILP), Heuristic, and Inductive and compares them against twelve quality criteria. Testing the framework on three case studies confirms the feasibility of applying process mining to architectural evaluation. The SA model is extracted by the best discovery algorithm, selected through intensive benchmarking in this research. The case studies cover SA in service-oriented, Pipe and Filter, and component-based styles, modeled and simulated with Hierarchical Colored Petri Net techniques based on the cases' documentation. Process mining within this framework operates on the system's log files obtained from the SA simulation. Applying process mining in an SA evaluation framework is challenging, as it has not been done before. The research identifies the problems of adapting process mining to a hypothetical lightweight SA evaluation framework and addresses them during solution development.
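To make the benchmarking step concrete, the sketch below shows how such a comparison of discovery algorithms might look in Python with the pm4py library. It is a minimal illustration and not the paper's framework: the log filename is hypothetical, the algorithm selection is a subset of those named in the abstract, and the two quality measures (fitness and precision) stand in for the twelve criteria used in the research.

```python
# Minimal sketch: benchmark several process-discovery algorithms on one event log.
# Assumes pm4py is installed and "sa_simulation.xes" is a log exported from the
# CPN-based SA simulation (filename hypothetical).
import pm4py

log = pm4py.read_xes("sa_simulation.xes")

# Candidate discovery algorithms (the paper additionally benchmarks Flower and ILP miners).
miners = {
    "alpha": pm4py.discover_petri_net_alpha,
    "heuristics": pm4py.discover_petri_net_heuristics,
    "inductive": pm4py.discover_petri_net_inductive,
}

for name, discover in miners.items():
    net, im, fm = discover(log)  # Petri net plus initial and final markings
    fitness = pm4py.fitness_token_based_replay(log, net, im, fm)["log_fitness"]
    precision = pm4py.precision_token_based_replay(log, net, im, fm)
    print(f"{name:<10} fitness={fitness:.3f} precision={precision:.3f}")
```

In a benchmarking setup like the one described, each algorithm's scores across all quality criteria would then be compared to select the discovery algorithm used to extract the SA model.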
Abstract: The reservoir volumetric approach represents a widely accepted but flawed method of petroleum play resource calculation. In this paper, we propose a combination of techniques that can improve the applicability and quality of the resource estimation. These techniques include: 1) the use of the Multivariate Discovery Process (MDP) model to derive unbiased distribution parameters of reservoir volumetric variables and to reveal correlations among the variables; 2) the use of the Geo-anchored method to estimate simultaneously the number of oil and gas pools in the same play; and 3) the cross-validation of assessment results from different methods. These techniques are illustrated with an example of crude oil and natural gas resource assessment of the Sverdrup Basin, Canadian Archipelago. The example shows that, when direct volumetric measurements of the untested prospects are not available, the MDP model can help derive unbiased estimates of the distribution parameters by using information from the discovered oil and gas accumulations. It also shows that estimating the number of oil and gas accumulations and their size ranges with a discovery process model provides an alternative and efficient approach when inadequate geological data hinder the estimation. Cross-examination of assessment results derived using different methods allows one to focus on and analyze the causes of the major differences, thus providing a more reliable assessment outcome.
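As a rough illustration of the volumetric side of the problem, the sketch below draws correlated lognormal reservoir parameters and propagates them through a schematic in-place volume product via Monte Carlo. It is a generic textbook-style example rather than the MDP or Geo-anchored method, and every distribution parameter, correlation, and variable choice in it is hypothetical.

```python
# Monte Carlo sketch of the reservoir volumetric approach with correlated inputs.
# All distribution parameters and correlations below are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Log-space means and standard deviations for area (km^2), net pay (m),
# porosity (fraction), hydrocarbon saturation (fraction).
mu = np.log([12.0, 20.0, 0.15, 0.65])
sigma = np.array([0.60, 0.45, 0.20, 0.10])

# Correlation matrix, e.g. larger pools tending to have thicker net pay.
corr = np.array([
    [1.0, 0.4, 0.0, 0.0],
    [0.4, 1.0, 0.2, 0.0],
    [0.0, 0.2, 1.0, 0.3],
    [0.0, 0.0, 0.3, 1.0],
])
cov = np.outer(sigma, sigma) * corr

# Correlated lognormal draws: exponentiate a multivariate normal in log space.
area, pay, phi, sh = np.exp(rng.multivariate_normal(mu, cov, size=n)).T

# In-place volume ~ area * net pay * porosity * saturation; units are left schematic,
# and a real assessment would add unit constants and a formation volume factor.
volume = area * pay * phi * sh

# Petroleum convention: P90 is the value exceeded with 90% probability (10th percentile).
p90, p50, p10 = np.percentile(volume, [10, 50, 90])
print(f"P90={p90:.2f}  P50={p50:.2f}  P10={p10:.2f}  mean={volume.mean():.2f}")
```

Deriving the log-space parameters and the correlation matrix from discovered accumulations, rather than assuming them as done here, is precisely the role the abstract assigns to the MDP model.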
Abstract: A Bayesian approach using Markov chain Monte Carlo algorithms has been developed to analyze Smith's discretized version of the discovery process model. It avoids the problems of the maximum likelihood method by effectively combining the information from the prior distribution with that from the discovery sequence through posterior probabilities. All statistical inferences about the parameters of the model and the total resources can be quantified by drawing samples directly from the joint posterior distribution. In addition, the statistical errors of the samples can be easily assessed, and the convergence properties can be monitored during the sampling. Because the information contained in a discovery sequence is not enough to estimate all parameters, especially the number of fields, geologically justified prior information is crucial to the estimation. The Bayesian approach allows the analyst to specify subjective estimates of the required parameters, and the degree of uncertainty about those estimates, in a clearly identified fashion throughout the analysis. As an example, this approach is applied to the same North Sea data on which Smith demonstrated his maximum likelihood method. For this case, the Bayesian approach substantially improves on the overly pessimistic results and downward bias of the maximum likelihood procedure.
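The sketch below gives a toy Metropolis-Hastings sampler in this spirit: fields fall into a few discrete size classes, the discovery sequence is modeled as size-biased sampling without replacement, and the sampler explores the posterior over the unknown class counts under Poisson priors. The size classes, priors, and discovery sequence are hypothetical, and the model is a simplification rather than Smith's exact formulation or the authors' implementation.

```python
# Toy Bayesian MCMC for a discretized discovery process model.
# Size classes, observed sequence, and priors are all hypothetical.
import math
import random

random.seed(1)

sizes = [5.0, 20.0, 80.0]          # representative field size per class
observed = [2, 2, 1, 2, 0, 1, 0]   # discovery sequence as class indices
prior_mean = [15.0, 8.0, 3.0]      # Poisson prior means for total count per class

d = [observed.count(k) for k in range(len(sizes))]  # discoveries per class

def log_lik(N):
    """Log-likelihood of the discovery sequence under size-biased sampling
    without replacement, given total class counts N."""
    remaining = list(N)
    ll = 0.0
    for c in observed:
        total = sum(r * s for r, s in zip(remaining, sizes))
        ll += math.log(remaining[c] * sizes[c] / total)
        remaining[c] -= 1
    return ll

def log_prior(N):
    # Independent Poisson priors on the class counts.
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for n, lam in zip(N, prior_mean))

def log_post(N):
    return log_lik(N) + log_prior(N)

# Metropolis-Hastings: propose +/-1 on one class count, respecting N_k >= d_k.
N = [max(dk, round(lam)) for dk, lam in zip(d, prior_mean)]
samples = []
for it in range(20000):
    k = random.randrange(len(sizes))
    prop = list(N)
    prop[k] += random.choice([-1, 1])
    if prop[k] >= d[k]:
        if math.log(random.random()) < log_post(prop) - log_post(N):
            N = prop
    if it >= 5000:                 # discard burn-in
        undiscovered = sum((n - dk) * s for n, dk, s in zip(N, d, sizes))
        samples.append(undiscovered)

samples.sort()
print("posterior mean undiscovered volume:", sum(samples) / len(samples))
print("90% interval:", samples[int(0.05 * len(samples))],
      samples[int(0.95 * len(samples))])
```

Because every posterior summary here is computed directly from the drawn samples, quantities such as credible intervals for the undiscovered volume, or the sensitivity to the Poisson prior means, come out of the same run, which is the practical advantage over a point estimate from maximum likelihood.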
Abstract: This summary paper will discuss the concept of forensic evidence and evidence collection methods. Emphasis will be placed on the techniques used to collect forensically sound digital evidence, as an introduction to digital forensics. The discussion will then identify and categorize the different types of digital forensic evidence and set out a clear procedure for collecting forensically sound digital evidence. The paper will further seek to create awareness and to promote the idea that competent computer forensics collection practice is important for admissibility in court.
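As a small technical aside on forensic soundness, one routine step when acquiring digital evidence is recording a cryptographic hash of the image so later copies can be verified against it. The sketch below shows such a check in Python with hashlib; the filenames are hypothetical, and real acquisitions rely on dedicated imaging tools and documented chain-of-custody procedures.

```python
# Sketch: verify that a working copy of an evidence image matches the original
# acquisition hash. Filenames are hypothetical; real workflows use dedicated
# imaging tools and log hashes in the chain-of-custody record.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large disk images do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

acquisition_hash = sha256_of("evidence/disk_image.dd")   # recorded at seizure time
working_copy_hash = sha256_of("analysis/disk_image.dd")  # copy used for examination

print("match" if acquisition_hash == working_copy_hash else "MISMATCH - do not proceed")
```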