Model checking is an automated formal verification method for determining whether epistemic multi-agent systems adhere to property specifications. Although there is an extensive literature on qualitative properties such as safety and liveness, quantitative and uncertain property verification for these systems is still lacking. In uncertain environments, agents must make judicious decisions based on subjective epistemic states. To verify epistemic and measurable properties in multi-agent systems, this paper extends fuzzy computation tree logic by introducing epistemic modalities, proposing a new Fuzzy Computation Tree Logic of Knowledge (FCTLK). We represent fuzzy multi-agent systems as distributed knowledge bases with fuzzy epistemic interpreted systems. In addition, we provide a transformation algorithm from fuzzy epistemic interpreted systems to fuzzy Kripke structures, as well as transformation rules from FCTLK formulas to Fuzzy Computation Tree Logic (FCTL) formulas. Accordingly, we transform the FCTLK model checking problem into FCTL model checking. This enables the verification of FCTLK formulas using the fuzzy model checking algorithm of FCTL without additional computational overhead. Finally, we present correctness proofs and complexity analyses of the proposed algorithms, and we illustrate the practical application of our approach through an example of a train control system.
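Fuzzy model checking over a fuzzy Kripke structure can be sketched in a few lines. The following is an illustrative fragment, not the paper's algorithm: it evaluates the truth degree of the temporal operator EX over an invented three-state structure, assuming the common possibilistic (max-min) semantics in which degrees replace Boolean truth values.

```python
# A minimal sketch of fuzzy model checking on a fuzzy Kripke structure,
# using the common max-min (possibilistic) semantics for the EX operator.
# The structure and truth degrees below are invented for illustration.

def fuzzy_ex(R, phi):
    """Degree of 'EX phi' at each state: the max over successors of
    min(transition degree, truth degree of phi at the successor)."""
    n = len(phi)
    return [max(min(R[s][t], phi[t]) for t in range(n)) for s in range(n)]

# 3-state example: R[s][t] is the fuzzy transition degree from s to t.
R = [
    [0.0, 0.9, 0.3],
    [0.5, 0.0, 0.8],
    [1.0, 0.0, 0.0],
]
phi = [0.2, 0.7, 1.0]          # fuzzy truth degrees of phi per state

print(fuzzy_ex(R, phi))        # → [0.7, 0.8, 0.2]
```

Other temporal operators (EU, EG) follow by iterating this one-step rule to a fixed point, which is why a transformation to fuzzy Kripke structures suffices for FCTL checking.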
In reliability engineering, observations of the variables of interest are often limited due to cost or schedule constraints. Consequently, epistemic uncertainty, which derives from a lack of knowledge and information, has a vital influence on reliability evaluation. Belief reliability is a new reliability metric that takes the impact of epistemic uncertainty into consideration, and the belief reliability distribution is fundamental to its application. This paper develops a new method, called the graduation formula, to construct a belief reliability distribution from limited observations. The method constructs the distribution by determining the belief degrees corresponding to the observations. Because the graduation formula is a set of transcendental equations whose analytical solution is difficult to determine, an algorithm is designed to solve it numerically. The method and the algorithm are illustrated by two numerical examples that show their efficiency and potential applications.
Fault-tolerant technology has greatly improved the reliability of modern systems on one hand and made their failure mechanisms more complex on the other. Dynamic failure behavior, diverse failure distributions, and epistemic uncertainty are all present in these systems, which significantly increases the challenge of assessing their reliability. This paper presents a novel reliability analysis framework for complex systems in which the failure rates of components are expressed as interval numbers. Specifically, it uses a dynamic fault tree (DFT) to model dynamic fault behaviors and copes with epistemic uncertainty using Dempster-Shafer (D-S) theory and interval numbers. Furthermore, an approach is presented to convert a DFT into a dynamic evidential network (DEN) to calculate the reliability parameters. Additionally, a sorting method based on the possibility degree is proposed to rank the importance of components represented by interval numbers and thus identify the most critical components, providing guidance for system design, maintenance planning, and fault diagnosis. Finally, a numerical example illustrates the availability and efficiency of the proposed method.
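Two of the ingredients above, interval-valued reliability propagation and possibility-degree ranking of intervals, are easy to illustrate. This sketch is not the paper's DEN algorithm: it propagates invented interval reliabilities through simple series/parallel compositions and ranks intervals with one commonly used possibility-degree formula.

```python
# Illustrative only (not the paper's DFT/DEN method): interval reliability
# for series/parallel compositions, and one common possibility-degree
# formula for comparing interval numbers. All numbers are invented.

def series(a, b):
    """Interval reliability of two independent components in series."""
    return (a[0] * b[0], a[1] * b[1])

def parallel(a, b):
    """Interval reliability of two independent components in parallel."""
    return (1 - (1 - a[0]) * (1 - b[0]), 1 - (1 - a[1]) * (1 - b[1]))

def possibility_degree(a, b):
    """Possibility that interval a ranks at least as high as interval b."""
    width = (a[1] - a[0]) + (b[1] - b[0])
    if width == 0:
        return 1.0 if a[0] >= b[0] else 0.0
    return min(1.0, max(0.0, (a[1] - b[0]) / width))

r1, r2 = (0.90, 0.95), (0.80, 0.92)
print(series(r1, r2))               # ≈ (0.72, 0.874)
print(parallel(r1, r2))             # ≈ (0.98, 0.996)
print(possibility_degree(r1, r2))   # ≈ 0.88: r1 very likely more reliable
```

Complementarity holds by construction: the degree of r1 over r2 and of r2 over r1 sum to one, which is what makes the measure usable for a total importance ranking.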
Prior research on the resilience of critical infrastructure usually utilizes a network model to characterize the structure of the components so that a quantitative representation of resilience can be obtained. In particular, network component importance is used to express a component's significance in shaping the resilience of the whole system. Due to the intrinsic complexity of the problem, idealized assumptions are often imposed on the resilience-optimization problem to find partial solutions. This paper exploits the dynamic aspect of system resilience, i.e., the scheduling problem of link recovery in the post-disruption phase. The aim is to analyze the recovery strategy of the system under more practical assumptions, especially inhomogeneous time costs among links. In view of this, the presented work translates the resilience-maximizing recovery plan into dynamic decision-making over runtime recovery options. A heuristic scheme is devised to treat the core problem of link selection in an online fashion. Through Monte Carlo simulation, the link recovery order produced by the proposed scheme demonstrates excellent resilience performance as well as robustness to uncertainty arising from limited epistemic knowledge.
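The paper's heuristic is not reproduced here, but the general shape of such an online link-selection rule can be sketched: at each step, repair the link with the best performance gain per unit repair time. The network, gains, and repair times below are invented; a faithful version would recompute each gain as the restored topology changes.

```python
# A hedged sketch of a greedy recovery-scheduling heuristic (not the
# paper's scheme): repair next whichever link buys the most system
# performance per unit repair time. All link data are invented.

def greedy_recovery_order(links):
    """links: {name: (performance_gain, repair_time)} -> repair order."""
    order = []
    remaining = dict(links)
    while remaining:
        # pick the link with the highest gain-to-time ratio
        best = max(remaining, key=lambda k: remaining[k][0] / remaining[k][1])
        order.append(best)
        del remaining[best]
    return order

links = {"a-b": (10.0, 2.0), "b-c": (4.0, 1.0), "c-d": (9.0, 4.0)}
print(greedy_recovery_order(links))   # → ['a-b', 'b-c', 'c-d']
```

The ratio rule maximizes the area under the performance-versus-time curve for independent links, which is exactly the resilience integral such schedules try to maximize.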
Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern probabilistic methods can yield better quantitative risk assessment. Most researchers face the problem of minimal field data on the probability and frequency of each failure. Because of this limitation in available epistemic knowledge, it is important to make epistemic estimations by applying Bayesian theory to identify plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study of a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probabilities of the other hazard-promoting factors from a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use a Bayesian network over the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event from its evidence in the epistemic estimation format by combining the probabilities of the hazard-promoting factors with Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
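The SLIM step can be illustrated with the method's standard calibration law, log10(HEP) = a * SLI + b, where the Success Likelihood Index is a weighted sum of performance-shaping-factor ratings and the constants a, b are fixed by two anchor tasks with known human-error probabilities. The weights, ratings, and anchors below are invented for illustration, not taken from the case study.

```python
import math

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings."""
    return sum(w * r for w, r in zip(weights, ratings))

def slim_hep(sli_value, a, b):
    """Standard SLIM calibration law: log10(HEP) = a * SLI + b."""
    return 10 ** (a * sli_value + b)

def calibrate(sli1, hep1, sli2, hep2):
    """Solve a, b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

weights = [0.4, 0.3, 0.3]               # PSF weights (sum to 1), invented
ratings = [7.0, 5.0, 8.0]               # PSF ratings on a 1-9 scale, invented
a, b = calibrate(9.0, 1e-4, 1.0, 1e-1)  # anchors: best and worst tasks
hep = slim_hep(sli(weights, ratings), a, b)
print(hep)                              # human-error probability estimate
```

The resulting HEP then enters the event tree like any other branch probability, alongside the Bayesian-network estimates for the non-human factors.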
In order to provide scientists with a computational methodology and computational tools to program their epistemic processes in scientific discovery, we are establishing a novel programming paradigm, named 'Epistemic Programming', which regards conditionals as the subject of computing, takes primary epistemic operations as the basic operations of computing, and regards epistemic processes as the subject of programming. This paper presents our fundamental observations and assumptions on scientific discovery processes and their automation; research problems on modeling, automating, and programming epistemic processes; and an outline of our research project on Epistemic Programming.
Caisson breakwaters are mainly constructed in deep waters to protect an area against waves. These breakwaters are conventionally designed based on the concept of the safety factor. However, the wave loads and the resistance of the structures have epistemic or aleatory uncertainties, and sliding failure is one of the most important failure modes of caisson breakwaters. In most previous studies, uncertainties such as wave and wave period variation were ignored for assessment purposes. Therefore, in this study, Bayesian reliability analysis is implemented to assess the probability of sliding failure of the Tombak port breakwater in the Persian Gulf. The means and standard deviations were taken as random variables to capture the previously dismissed uncertainties. For this purpose, the first-order reliability method (FORM) and the first principal curvature correction in FORM are used to calculate the reliability index. The performance of these methods is verified by importance sampling through Monte Carlo simulation (MCS). In addition, the reliability index sensitivities of each random variable are calculated to evaluate the importance of the different random variables in the caisson sliding calculation. The results show that the reliability index is most sensitive to the coefficient of friction, wave height, and caisson weight (or concrete density). The sensitivity of the failure probability to each of the random variables and their uncertainties is calculated by the derivative method. Finally, Bayesian regression with non-informative priors is implemented to predict the statistical properties of breakwater sliding, which are compared to Goda's formulation used in breakwater design standards. The analysis shows that the model posterior for the sliding of a caisson breakwater has a mean of 0.039 and a standard deviation of 0.022. A normal quantile analysis and residual analysis are also performed to evaluate the correctness of the model responses.
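The Monte Carlo side of such a reliability analysis is straightforward to sketch. The limit state below is a generic sliding criterion, g = friction * (weight - uplift) - wave force, and every distribution and moment is invented; this is a crude-MCS illustration, not the Tombak model or its FORM/importance-sampling machinery.

```python
import random
from statistics import NormalDist

random.seed(0)  # reproducible estimate

def sliding_failure_prob(n=100_000):
    """Crude Monte Carlo for a sliding limit state g = mu*(W - U) - P.
    Failure when g < 0. All distributions are invented for illustration."""
    fails = 0
    for _ in range(n):
        mu = random.gauss(0.6, 0.06)     # friction coefficient
        W = random.gauss(5000.0, 250.0)  # caisson weight (kN)
        U = random.gauss(800.0, 80.0)    # uplift force (kN)
        P = random.gauss(1800.0, 300.0)  # horizontal wave force (kN)
        if mu * (W - U) - P < 0:
            fails += 1
    pf = fails / n
    beta = -NormalDist().inv_cdf(pf)     # equivalent reliability index
    return pf, beta

pf, beta = sliding_failure_prob()
print(pf, beta)
```

FORM would instead locate the most probable failure point on g = 0 and read beta off directly; crude MCS is the benchmark such approximations are checked against, as the abstract's importance-sampling verification does.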
This article summarizes the collaboration between two historians of medicine on Sino-European medical exchanges. Gianna Pomata researches the history of medicine in early modern Europe, and Marta Hanson researches the history of medicine in early modern China. The following covers the concept of epistemic genres, which Pomata first developed out of her research on the history of the genres historia, observationes, recipes, medical cases, and the commentary in Europe. She connected these genres variously to empiricism, erudition, scientific observation, norm-making, and recording practice. The paper then evaluates how Pomata and Hanson used epistemic genres as a method for cross-cultural research on 17th-18th-century Sino-European medical exchanges; Pomata subsequently wrote a comparative history of the medical case in Europe and China. The article concludes with how Hanson applied the distinction of epistemic genres to analyze the history of Chinese medicine from a new perspective.
As recent developments in autism research offer alternative explanations to the mainstream options, it can now be argued that the so-called cognitive deficits in the social domain associated with autism have been mischaracterized or, at least, oversimplified. We will use predictive models within a 4E (i.e., embodied, embedded, enactive and extended) conception of cognition to address the question of cognitive impairment in psychiatry and autism. Such models force us to reassess what "cognitive deficit" means by integrating the environment not only in its usual (evo-developmental) sense, but by understanding all cognitive performances as embedded in environments (or fields of affordances) that shape and sustain them. By adopting a predictive 4E perspective, we aim to show that the "cognitive deficits" associated with autism are in fact mismatches between environmental resources and the particular form of neurological functioning of autistic people (neurodiversity), brought about by the fact that the cultural niches that set up the relevant fields of affordances are structured by and for neurotypicals. This mismatch leads to epistemic injustices, both testimonial and hermeneutic, that feed back into research on autism and clinical approaches, thereby making the "deficits" appear to be based on individual shortcomings. In this context, autism interventions should partly focus on the development of social policies aimed at modifying those aspects of cultural niches that make environments unsuitable for the full development of all individuals.
Satisficing control remains an important concept in decision making. In this paper, a new epistemic utility satisficing control theory is proposed for a new model of complex CMMO (constrained multi-objective multi-degree-of-freedom optimization) systems. In addition, an epistemic utility function is developed and used to adjust the feasible region of soft constraints. The theory proved in this paper indicates that the utility function not only expresses the subjectivity of the original satisfactory-degree function, but also takes the cost of searching for a solution into account. Thus, the satisfactory-degree function can be adjusted and its rationality validated. This theory contributes an analytical method for the inverse satisfactory optimization problem. The findings indicate that the theory has good convergence and yields the outcomes desired for satisfactory-degree functions.
This paper takes an integrative approach to the communication and comprehension of humor from the perspectives of the humorist's manipulation and the recipient's vigilance, informed by relevance theory. It is proposed that, in order to communicate humor, the humorist manipulates the recipient's expectation of relevance in the setup and in the punchline in two different but related ways: misleading and guiding. It is also proposed that, in order to comprehend and appreciate humor, the recipient exercises vigilance against his/her own shallow processing in the setup and exercises vigilance for special cognitive effects in the punchline. On this approach, humorous communication and comprehension is viewed as an interaction between manipulation and epistemic vigilance. Strategies of manipulation and vigilance are described, and some essential issues arising from the relevance-theoretic approach to humor are reconsidered, with some implications drawn. This paper contributes to enhancing the explanatory power of relevance theory for the communication and comprehension of humor.
Previous studies interpreting the meanings of the sentence-final particle "LE" have displayed two tendencies: they are either excessively complicated or excessively general. Since some scholars established a theoretical foundation of the propositional, epistemic, and dialogic domains for the sentence-final particle "LE", the nature and orientation of its semantic property have become clearer. However, the current "Three Domains" research model also has defects. In the first place, this model defines the meanings of the sentence-final particle "LE" as the "emergence of new propositional content", "emergence of new epistemic content", and "emergence of new dialogic content". But this definition is excessively abstract and broad: since many sentences that do not end with the particle "LE" can also express the three meanings, it fails to explain the difference between sentences ending with "LE" and those without it. Secondly, the model fails to explore the nature and generation mechanism of the relevant meanings of the particle "LE". This study attempts to find a practical solution to these defects.
In this paper, a data-driven prognostic model capable of dealing with different sources of uncertainty is proposed. The main novelty is the application of a mathematical framework, namely the Random Fuzzy Variable (RFV) approach, for the representation and propagation of the different uncertainty sources affecting Prognostic Health Management (PHM) applications: measurement, future, and model uncertainty. In this way, it is possible to deal not only with measurement noise and model-parameter uncertainty due to the stochastic nature of the degradation process, but also with systematic effects, such as systematic errors in the measurement process, incomplete knowledge of the degradation process, and subjective beliefs about model parameters. Furthermore, the low analytical complexity of the employed prognostic model allows the measurement and parameter uncertainty to be easily propagated into the remaining-useful-life (RUL) forecast without extensive Monte Carlo loops, so that only modest computational power is required. The model has been applied to two real application cases with highly accurate output, resulting in a potentially effective tool for predictive maintenance in different industrial sectors.
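The full RFV machinery is beyond a short example, but the way set-valued uncertainty propagates into an RUL forecast can be sketched with the simplest possible model: linear degradation at a rate known only to within an interval. Everything below (the model, the threshold, the rate bounds) is invented for illustration and is not the paper's method.

```python
# Hedged sketch (not the RFV approach): interval RUL from a linear
# degradation model with an interval-valued degradation rate.

def rul_interval(current_level, threshold, rate_lo, rate_hi):
    """RUL bounds when degradation grows linearly at an uncertain rate.
    The fastest plausible rate gives the shortest remaining life."""
    margin = threshold - current_level
    return (margin / rate_hi, margin / rate_lo)

print(rul_interval(current_level=3.0, threshold=10.0,
                   rate_lo=0.5, rate_hi=1.0))   # → (7.0, 14.0)
```

An RFV would additionally grade the interior of this interval with a membership function, separating random scatter from the systematic-effect bounds; the interval above corresponds to its outermost (zero-level) cut.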
In light of postcolonial theory, this thesis analyzes the marginal plight of the lower class and of the women in Tar Baby from the perspective of Gayatri C. Spivak's notion of epistemic violence. Against the background of epistemic violence, the resistance strategy in Tar Baby is highlighted in order to interpret the thought on resistance displayed by Toni Morrison in the novel. Toni Morrison expresses the appeal of an active strategy for resisting the epistemic violence of cultural hegemony and white-dominated society.
In multiagent systems, agents usually do not have complete information about the whole system, which makes the analysis of such systems hard. The incompleteness of information is normally modelled by means of accessibility relations, and the schedulers consistent with such relations are called uniform. In this paper, we consider probabilistic multiagent systems with accessibility relations and focus on the model checking problem with respect to probabilistic epistemic temporal logic, which can specify both temporal and epistemic properties. The problem is undecidable in general; we show that it becomes decidable when restricted to memoryless uniform schedulers. We then present two algorithms for this case: one reduces the model checking problem to a mixed integer non-linear programming (MINLP) problem, which can then be solved by Satisfiability Modulo Theories (SMT) solvers, and the other is an approximate algorithm based on the upper confidence bounds applied to trees (UCT) algorithm, which can return a result whenever queried. These algorithms have been implemented in an existing model checker and validated experimentally. The experimental results show the efficiency and extensibility of these algorithms, and the UCT-based algorithm outperforms the MINLP-based one in most cases.
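At the heart of UCT is the UCB1 selection rule, which balances exploiting a child node's observed mean reward against exploring under-visited children. The statistics below are invented; this shows only the selection step, not the full search or its use inside a model checker.

```python
import math

def ucb1_select(stats, c=math.sqrt(2)):
    """UCB1 rule used by UCT: pick the child maximizing
    mean reward + c * sqrt(ln(total visits) / child visits)."""
    total = sum(n for _, n in stats.values())
    def score(key):
        reward_sum, visits = stats[key]
        if visits == 0:
            return float("inf")   # always try unvisited children first
        return reward_sum / visits + c * math.sqrt(math.log(total) / visits)
    return max(stats, key=score)

# child -> (cumulative reward, visit count); values invented for illustration
stats = {"s1": (6.0, 10), "s2": (4.0, 5), "s3": (0.0, 0)}
print(ucb1_select(stats))   # → 's3': unvisited children are explored first
```

The exploration constant c trades off the two terms; the anytime character noted in the abstract comes from being able to stop the selection/simulation loop after any number of iterations and return the current best estimate.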
In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist in the selection of appropriate reliability metrics for modeling the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed: evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability), and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric able to account for the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it may lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for selecting the appropriate reliability metric.
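The duality axiom, that a metric should assign failure and success degrees summing to one, can be checked concretely. Possibility measures only guarantee max(Pos(A), Pos(not A)) = 1, so they can violate duality, as the invented two-outcome example below shows; the distribution values are illustrative only.

```python
# Checking the duality axiom on a toy possibility measure. A metric m
# satisfies duality when m(failure) + m(success) = 1; possibility and
# necessity measures generally do not. Numbers are invented.

def possibility(dist, event):
    """Possibility of an event = sup of the distribution over it."""
    return max(dist[x] for x in event)

def necessity(dist, event, frame):
    """Necessity is the dual: 1 - possibility of the complement."""
    return 1.0 - possibility(dist, frame - event)

frame = {"fail", "work"}
dist = {"fail": 0.4, "work": 1.0}   # normalized possibility distribution

pos_sum = possibility(dist, {"fail"}) + possibility(dist, {"work"})
nec_sum = necessity(dist, {"fail"}, frame) + necessity(dist, {"work"}, frame)
print(pos_sum, nec_sum)   # ≈ 1.4 and 0.6: neither sum equals 1
```

Uncertainty-theoretic belief reliability is built so that the two degrees always sum to exactly one, which is why the review singles out duality as a discriminating property.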
A theory of open logic is developed. It can be used to describe the growth and modification of knowledge and to express the evolution of a hypothesis. Some concepts, such as new premise, rejection by facts, reconstruction of a hypothesis, and epistemic process, are defined; their properties are studied and the related theorems are proved. The concept of the limit of an epistemic process is further defined, and every empirical model of a specific problem is proved to be the limit of an epistemic process. As an application of the theory, a model theory of Reiter's default reasoning is given using the concepts of open logic.
Fragility curves are commonly used in civil engineering to assess the vulnerability of structures to earthquakes. The probability of failure associated with a prescribed criterion (e.g., the maximal inter-storey drift of a building exceeding a certain threshold) is represented as a function of the intensity of the earthquake ground motion (e.g., peak ground acceleration or spectral acceleration). The classical approach relies on assuming a lognormal shape for the fragility curves; it is thus parametric. In this paper, we introduce two non-parametric approaches to establish the fragility curves without this assumption, namely binned Monte Carlo simulation and kernel density estimation. As an illustration, we compute the fragility curves for a three-storey steel frame using a large number of synthetic ground motions. The curves obtained with the non-parametric approaches are compared with the corresponding curves based on the lognormal assumption. A similar comparison is presented for a case where only a limited number of recorded ground motions is available. It is found that the accuracy of the lognormal curves depends on the ground motion intensity measure, the failure criterion and, most importantly, the method employed for estimating the parameters of the lognormal shape.
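The binned Monte Carlo approach is simple to sketch: group (intensity, failure) samples into intensity bins and take the empirical failure fraction in each bin as the fragility estimate there. This is not the paper's three-storey case study; the synthetic data are generated from an assumed lognormal fragility (median 0.5 g, dispersion 0.4) so the estimate can be sanity-checked against a known ground truth.

```python
import math
import random
from statistics import NormalDist

random.seed(1)  # reproducible synthetic data

def binned_fragility(samples, edges):
    """samples: list of (im, failed); edges: ascending bin edges.
    Returns (bin center, empirical failure probability) per non-empty bin."""
    curve = []
    for lo, hi in zip(edges, edges[1:]):
        in_bin = [failed for im, failed in samples if lo <= im < hi]
        if in_bin:
            curve.append(((lo + hi) / 2, sum(in_bin) / len(in_bin)))
    return curve

def simulate(n=20_000):
    """Stand-in for structural analyses: draw failures from an assumed
    lognormal fragility with median 0.5 and dispersion 0.4 (invented)."""
    nd = NormalDist()
    out = []
    for _ in range(n):
        im = random.uniform(0.05, 1.5)               # intensity measure
        p_fail = nd.cdf(math.log(im / 0.5) / 0.4)    # true fragility at im
        out.append((im, random.random() < p_fail))
    return out

curve = binned_fragility(simulate(), [0.1 * k for k in range(16)])
print(curve)   # rises from ~0 at low intensity toward ~1 at high intensity
```

Kernel density estimation replaces the hard bins with smooth weights, removing the bin-width choice; both avoid the lognormal shape assumption the abstract questions.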
Uncertainty design can take aleatory and epistemic uncertainty into account in optimization processes. Both aleatory and epistemic uncertainty can be expressed uniformly in evidence theory, which is used here to describe the uncertainty. Uncertainty transfer and response analysis with evidence theory for structural optimal design are introduced, and the principle of response evaluation is set up. Finally, the cantilever beam in a test system is optimized using the introduced optimization process, and the results are assessed by the evaluation principle. The optimal process is validated by the optimization of the beam.
Funding (FCTLK model checking article): the work is partially supported by the Natural Science Foundation of Ningxia (Grant No. AAC03300), the National Natural Science Foundation of China (Grant No. 61962001), and the Graduate Innovation Project of North Minzu University (Grant No. YCX23152).
Funding (belief reliability article): the National Natural Science Foundation of China (Grant Nos. 61573043 and 71671009).
Funding (dynamic fault tree article): the National Natural Science Foundation of China (71461021), the Natural Science Foundation of Jiangxi Province (20151BAB207044), the China Postdoctoral Science Foundation (2015M580568), and the Postdoctoral Science Foundation of Jiangxi Province (2014KY36).
Funding (infrastructure resilience article): supported by the National Natural Science Foundation of China (51479158) and the Fundamental Research Funds for the Central Universities (WUT: 2018III061GX).
Funding: Supported in part by the Ministry of Education, Culture, Sports, Science and Technology of Japan under a Grant-in-Aid for Explor
Abstract: In order to provide scientists with a computational methodology and computational tools to program their epistemic processes in scientific discovery, we are establishing a novel programming paradigm, named 'Epistemic Programming', which regards conditionals as the subject of computing, takes primary epistemic operations as basic operations of computing, and regards epistemic processes as the subject of programming. This paper presents our fundamental observations and assumptions about scientific discovery processes and their automation, research problems in modeling, automating, and programming epistemic processes, and an outline of our research project on Epistemic Programming.
Abstract: Caisson breakwaters are mainly constructed in deep water to protect an area against waves. These breakwaters are conventionally designed based on the concept of the safety factor. However, the wave loads and the resistance of the structures have epistemic or aleatory uncertainties. Furthermore, sliding failure is one of the most important failure modes of caisson breakwaters. In most previous studies, uncertainties such as wave and wave-period variation were ignored for assessment purposes. Therefore, in this study, Bayesian reliability analysis is implemented to assess the probability of sliding failure of the Tombak port breakwater in the Persian Gulf. The means and standard deviations were taken as random variables to account for the dismissed uncertainties. For this purpose, the first-order reliability method (FORM) and the first principal curvature correction in FORM are used to calculate the reliability index. The performance of these methods is verified by importance sampling through Monte Carlo simulation (MCS). In addition, the reliability index sensitivities of each random variable are calculated to evaluate the importance of the different random variables in calculating caisson sliding. The results show that the reliability index is most sensitive to the coefficient of friction, wave height, and caisson weight (or concrete density). The sensitivity of the failure probability to each of the random variables and their uncertainties is calculated by the derivative method. Finally, Bayesian regression with non-informative priors is implemented to predict the statistical properties of breakwater sliding, which are compared to Goda's formulation used in breakwater design standards. The analysis shows that the model posterior for the sliding of a caisson breakwater has a mean of 0.039 and a standard deviation of 0.022. A normal quantile analysis and a residual analysis are also performed to evaluate the correctness of the model responses.
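The FORM reliability index mentioned above is easiest to see in the special case of a linear limit state g = R − S with independent normal resistance R and load S, where the Hasofer-Lind index has a closed form and FORM is exact. The numbers below are hypothetical, not the Tombak breakwater data, and the general (nonlinear, correlated) case requires the iterative FORM algorithm the paper actually uses.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """Hasofer-Lind reliability index for the linear limit state
    g = R - S with independent normal resistance R and load S;
    returns (beta, failure probability Phi(-beta))."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)
    return beta, phi(-beta)

# Hypothetical sliding margin: friction resistance vs. wave load, both normal.
beta, pf = form_linear(mu_r=10.0, sig_r=2.0, mu_s=6.0, sig_s=1.5)
print(round(beta, 3), f"{pf:.4f}")  # beta = 4 / 2.5 = 1.6
```

Sensitivity of beta to each random variable (as in the abstract's sensitivity study) follows from differentiating this expression with respect to the means and standard deviations.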
Funding: Max Planck Institute for the History of Science, Berlin, Germany.
Abstract: This article summarizes the collaboration between two historians of medicine on Sino-European medical exchanges. Gianna Pomata researches the history of medicine in early modern Europe, and Marta Hanson researches the history of medicine in early modern China. The following covers the concept of epistemic genres, which Pomata first developed out of her research on the history of the genres historia, observationes, recipes, medical cases, and the commentary in Europe. She connected these genres variously to empiricism, erudition, scientific observation, norm-making, and recording practice. The paper then evaluates how Pomata and Hanson used epistemic genres as a method for cross-cultural research on 17th-18th-century Sino-European medical exchanges. Pomata then wrote a comparative history of the medical case in Europe and China. The article concludes with how Hanson applied the distinction of epistemic genres to analyze the history of Chinese medicine from a new perspective.
Abstract: As recent developments in autism research offer alternatives to the mainstream explanations, it can now be argued that the so-called cognitive deficits in the social domain associated with autism have been mischaracterized or, at least, oversimplified. We use predictive models within a 4E (i.e., embodied, embedded, enactive and extended) conception of cognition to address the question of cognitive impairment in psychiatry and autism. Such models force us to reassess what "cognitive deficit" means by integrating the environment not only in its usual (evo-developmental) sense, but by understanding all cognitive performances as embedded in environments (or fields of affordances) that shape and sustain them. By adopting a predictive 4E perspective, we aim to show that the "cognitive deficits" associated with autism are in fact mismatches between environmental resources and the particular form of neurological functioning of autistic people (neurodiversity), brought about by the fact that the cultural niches that set up the relevant fields of affordances are structured by and for neurotypicals. This mismatch leads to epistemic injustices, both testimonial and hermeneutic, that feed back into autism research and clinical approaches, thereby making the "deficits" appear to rest on individual shortcomings. In this context, autism interventions should partly focus on the development of social policies aimed at modifying those aspects of cultural niches that make environments unsuitable for the full development of all individuals.
Abstract: Satisficing control remains an important concept in decision making. In this paper, a new epistemic-utility satisficing control theory is proposed for a new model of a complex CMMO (constrained multi-objective multi-degree-of-freedom optimization) system. In addition, an epistemic utility function is developed and used to adjust the feasible region of soft constraints. The theory proved in this paper indicates that the utility function not only expresses the subjectivity of the original satisfactory-degree function but also takes the cost of searching for a solution into account. Thus, the satisfactory-degree function can be adjusted and its rationality validated. This theory contributes an analytical method for the inverse satisfactory optimization problem. The findings indicate that the theory has good convergence and yields the outcomes desired for satisfactory-degree functions.
Abstract: This paper takes an integrative approach to the communication and comprehension of humor from the perspectives of the humorist's manipulation and the recipient's vigilance, informed by relevance theory. It is proposed that, in order to communicate humor, the humorist manipulates the recipient's expectation of relevance in the setup and in the punchline in two different but related ways: misleading and guiding. It is also proposed that, in order to comprehend and appreciate humor, the recipient exercises vigilance against his or her own shallow processing in the setup and exercises vigilance for special cognitive effects in the punchline. On this approach, humorous communication and comprehension are viewed as an interaction between manipulation and epistemic vigilance. Strategies of manipulation and vigilance are described, and some essential issues arising from the relevance-theoretic approach to humor are reconsidered, with some implications drawn. This paper contributes to enhancing the explanatory power of relevance theory for the communication and comprehension of humor.
Abstract: Previous studies interpreting the meanings of the sentence-final particle “LE” have displayed two tendencies: they are either excessively complicated or excessively general. Since some scholars established a theoretical foundation of the propositional, epistemic, and dialogic domains for the sentence-final particle “LE”, the nature and orientation of its semantic properties have become clearer. However, the current “Three Domains” research model also has defects. In the first place, the model defines the meanings of the sentence-final particle “LE” as “emergence of new propositional content”, “emergence of new epistemic content”, and “emergence of new dialogic content”, but this definition is excessively abstract and broad. Because many sentences not concluded with the particle “LE” can also express the three meanings, it fails to explain the difference between sentences ending with the particle “LE” and those without it. Secondly, the model fails to explore the nature and generation mechanism of the relevant meanings of the particle “LE”. This study attempts to find a practical solution to these defects.
Abstract: In this paper, a data-driven prognostic model capable of dealing with different sources of uncertainty is proposed. The main novelty is the application of a mathematical framework, namely a Random Fuzzy Variable (RFV) approach, for the representation and propagation of the different uncertainty sources affecting Prognostic Health Management (PHM) applications: measurement, future, and model uncertainty. In this way, it is possible to deal not only with measurement noise and model-parameter uncertainty due to the stochastic nature of the degradation process, but also with systematic effects, such as systematic errors in the measurement process, incomplete knowledge of the degradation process, and subjective beliefs about model parameters. Furthermore, the low analytical complexity of the employed prognostic model allows the measurement and parameter uncertainty to be easily propagated into the RUL forecast, with no need for extensive Monte Carlo loops, so that only modest computation power is required. The model has been applied to two real application cases, showing high output accuracy, and is thus a potentially effective tool for predictive maintenance in different industrial sectors.
Abstract: In light of postcolonial theory, this thesis analyzes the marginal plight of the lower class and of the women in Tar Baby from the perspective of Gayatri C. Spivak's notion of epistemic violence. Under the influence of epistemic violence, the resistance strategy in Tar Baby is highlighted in order to interpret the thought on resistance displayed by Toni Morrison in the novel. Toni Morrison expresses the appeal of an active strategy to resist the epistemic violence of cultural hegemony and the white-dominated society.
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61836005, and the Australian Research Council under Grant Nos. DP220102059 and DP180100691.
Abstract: In multiagent systems, agents usually do not have complete information about the whole system, which makes the analysis of such systems hard. The incompleteness of information is normally modelled by means of accessibility relations, and the schedulers consistent with such relations are called uniform. In this paper, we consider probabilistic multiagent systems with accessibility relations and focus on the model checking problem with respect to probabilistic epistemic temporal logic, which can specify both temporal and epistemic properties. The problem is undecidable in general; we show that it becomes decidable when restricted to memoryless uniform schedulers. We then present two algorithms for this case: one reduces the model checking problem to a mixed integer non-linear programming (MINLP) problem, which can then be solved by Satisfiability Modulo Theories (SMT) solvers; the other is an approximate algorithm based on the upper confidence bounds applied to trees (UCT) algorithm, which can return a result whenever queried. These algorithms have been implemented in an existing model checker and validated experimentally. The experimental results show the efficiency and extensibility of the algorithms, with the UCT-based algorithm outperforming the MINLP-based one in most cases.
Funding: Supported by the National Natural Science Foundation of China (No. 61573043).
Abstract: In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist in selecting appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed: evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability), and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that accounts for the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it may lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for selecting an appropriate reliability metric.
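The duality axiom discussed above can be made concrete with evidence theory. Belief alone is sub-additive (Bel(A) + Bel(not A) can fall below 1 whenever mass rests on ignorance), whereas the belief/plausibility pair satisfies Bel(A) + Pl(not A) = 1. The basic probability assignment below is hypothetical and chosen only to exhibit this behaviour.

```python
def bel_pl(bpa, event):
    """Belief and plausibility of an event under a basic probability
    assignment (bpa: mapping frozenset -> mass). Bel sums masses of focal
    elements inside the event; Pl sums masses of those intersecting it."""
    bel = sum(m for fs, m in bpa.items() if fs <= event)
    pl = sum(m for fs, m in bpa.items() if fs & event)
    return bel, pl

# Hypothetical BPA on states {work, fail}, with 0.3 mass on total ignorance.
frame = frozenset({"work", "fail"})
bpa = {frozenset({"work"}): 0.6,
       frozenset({"fail"}): 0.1,
       frame: 0.3}

bel_w, pl_w = bel_pl(bpa, frozenset({"work"}))
bel_f, pl_f = bel_pl(bpa, frozenset({"fail"}))
print(round(bel_w + bel_f, 10))  # Bel alone is sub-additive: less than 1
print(round(bel_w + pl_f, 10))   # Bel(A) + Pl(not A) = 1: the duality axiom
```

This is why a metric built on Bel alone can report both "reliable" and "unreliable" degrees that fail to sum to one, the paradox the review warns about.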
Funding: Project supported by the High Technology Research and Development Program of China.
Abstract: A theory of open logic is developed. It can be used to describe the growth and modification of knowledge and to express the evolution of a hypothesis. Concepts such as new premise, rejection by facts, reconstruction of a hypothesis, and epistemic process are defined; their properties are studied and the related theorems are proved. The concept of the limit of an epistemic process is further defined, and every empirical model of a specific problem is proved to be the limit of an epistemic process. As an application of the theory, a model theory of Reiter's default reasoning is given using the concepts of open logic.
Abstract: Fragility curves are commonly used in civil engineering to assess the vulnerability of structures to earthquakes. The probability of failure associated with a prescribed criterion (e.g., the maximal inter-storey drift of a building exceeding a certain threshold) is represented as a function of the intensity of the earthquake ground motion (e.g., peak ground acceleration or spectral acceleration). The classical approach relies on assuming a lognormal shape for the fragility curves and is thus parametric. In this paper, we introduce two non-parametric approaches to establish fragility curves without this assumption, namely binned Monte Carlo simulation and kernel density estimation. As an illustration, we compute the fragility curves for a three-storey steel frame using a large number of synthetic ground motions. The curves obtained with the non-parametric approaches are compared with the corresponding curves based on the lognormal assumption. A similar comparison is presented for the case where only a limited number of recorded ground motions is available. It is found that the accuracy of the lognormal curves depends on the ground motion intensity measure, the failure criterion, and, most importantly, the method employed for estimating the parameters of the lognormal shape.
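The binned Monte Carlo approach mentioned above is the simpler of the two non-parametric methods: group (intensity, failed?) samples into intensity bins and take the empirical failure fraction per bin. The sketch below uses synthetic data drawn from a known lognormal fragility (theta and beta are hypothetical), so the binned estimate can be compared against the true curve; the structural model itself is not simulated.

```python
import random, math

def binned_fragility(samples, edges):
    """Non-parametric fragility by binned Monte Carlo: for each intensity
    bin [edges[k], edges[k+1]), return the empirical fraction of failures."""
    counts = [0] * (len(edges) - 1)
    fails = [0] * (len(edges) - 1)
    for im, failed in samples:
        for k in range(len(edges) - 1):
            if edges[k] <= im < edges[k + 1]:
                counts[k] += 1
                fails[k] += failed
                break
    return [f / c if c else float("nan") for f, c in zip(fails, counts)]

# Synthetic data from a known lognormal fragility P(fail|im) =
# Phi(ln(im/theta)/beta) with theta = 0.5 g, beta = 0.4 (hypothetical).
random.seed(1)
theta, beta = 0.5, 0.4
samples = []
for _ in range(20000):
    im = random.uniform(0.1, 1.0)
    p_true = 0.5 * (1 + math.erf(math.log(im / theta) / (beta * math.sqrt(2))))
    samples.append((im, int(random.random() < p_true)))

curve = binned_fragility(samples, edges=[0.1, 0.4, 0.7, 1.0])
print([round(p, 2) for p in curve])  # increases with intensity, as expected
```

Kernel density estimation replaces the hard bins with a smoothing kernel over intensity, avoiding the bin-width choice at the cost of a bandwidth choice.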
Abstract: Uncertainty-based design can take account of aleatory and epistemic uncertainty in optimization processes. Aleatory and epistemic uncertainty can both be expressed uniformly in evidence theory, which is used here to describe the uncertainty. Uncertainty transfer and response analysis with evidence theory for structural optimal design are introduced, and the principle of response evaluation is set up. Finally, the cantilever beam in a test system is optimized by the introduced optimization process, and the results are assessed by the evaluation principle. The optimization process is validated by the optimization of the beam.
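The evidence-theory uncertainty transfer described above can be sketched for a cantilever beam: each focal element (an interval of the uncertain load, with a mass) is pushed through the response function, and the masses of elements that certainly or possibly satisfy the limit bound the failure/safety measure from both sides. The beam dimensions, stiffness, focal intervals, and deflection threshold below are all hypothetical, and the response is assumed monotone in the load so interval endpoints suffice.

```python
def belief_plausibility(focal, response, threshold):
    """Propagate evidence-theory focal elements (interval, mass) through a
    monotone increasing response function and bound P(response <= threshold):
    Bel counts elements that certainly satisfy the limit, Pl those that
    possibly do, so Bel <= P <= Pl."""
    bel = pl = 0.0
    for (lo, hi), mass in focal:
        if response(hi) <= threshold:   # whole interval safe -> Belief
            bel += mass
        if response(lo) <= threshold:   # some point safe -> Plausibility
            pl += mass
    return bel, pl

# Hypothetical cantilever tip deflection, linear in the tip load P (N):
# delta = P * L^3 / (3 * E * I).
L_beam, E, I = 2.0, 2.1e11, 4.0e-6
deflection = lambda P: P * L_beam**3 / (3 * E * I)

# Expert-assigned focal intervals for the load with their masses (sum to 1).
focal = [((1000.0, 2000.0), 0.5),
         ((1500.0, 3000.0), 0.3),
         ((2500.0, 4000.0), 0.2)]

bel, pl = belief_plausibility(focal, deflection, threshold=0.01)
print(round(bel, 2), round(pl, 2))  # deflection limit met with Bel 0.8, Pl 1.0
```

In an optimization loop, the design variables (here fixed as L_beam, E, I) would be varied and the [Bel, Pl] bounds re-evaluated at each candidate design, which is the response-evaluation principle the abstract refers to.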