Probability calculus and statistics permeate nearly every discipline and professional sector, yet no theory underpinning this widespread field has so far reached universal consensus. The interpretations of probability present irreconcilable traits, so the concept of probability remains substantially unclear. <strong>Purpose of this work: </strong>The present paper intends to demonstrate how the different models of probability constitute the surface problem, which conceals another hidden and more fundamental question. <strong>Method:</strong> We show that authors not only disagree about the concept of probability <em>P</em> but also have different ideas about the precise object qualified by <em>P</em>, which logically has priority. It is clear that the element <em>X</em> measured by <em>P</em>(<em>X</em>) influences its meaning. As a consequence of these conflicting opinions, theorists tend toward a compromise: they use the outcome or result of an experiment as the argument <em>X</em> of <em>P</em>(<em>X</em>) and represent <em>X</em> as a subset of the event space. This paper suggests replacing the outcome-subset with the event-triad <strong>E</strong>, which provides comprehensive mathematical support. <strong>Results:</strong> The last section shows how the triadic model is formally consistent with the conventional theories and can integrate the conflicting views on probability. This unifying result can help mathematicians move beyond the present theoretical deadlock. In summary, this paper advocates a more explicit notation system for probability and points out how probability can be ambiguous without rigorous specification of the sample space and of the experiment in general.
In the evaluation of some simulation systems, only small data samples can be obtained because of practical constraints. To address the evaluation problem posed by small-sample data, an interval estimation approach with an improved grey confidence degree is proposed. On the basis of the definition of grey distance, three definitions of the grey weight of each sample element in the grey estimated value are put forward, and the improved grey confidence degree is then designed. In accordance with the new concept, the grey interval estimation for small-sample data is deduced. Furthermore, the bootstrap method is applied to obtain a more accurate grey confidence interval: through bootstrap resampling, numerous small samples with their corresponding confidence intervals can be generated, and the final confidence interval is calculated from the union of these grey confidence intervals. Finally, a simulation system is evaluated using the proposed method. The simulation results show that a reasonable confidence interval is acquired, which demonstrates the feasibility and effectiveness of the proposed method.
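The bootstrap step underlying the final-interval construction can be sketched as follows. This is a plain percentile bootstrap on a hypothetical small sample, not the paper's grey-distance weighting; the sample values, resample count and confidence level are illustrative assumptions.

```python
import random

def bootstrap_ci(sample, n_resamples=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for the mean of a small sample."""
    rng = random.Random(seed)
    n = len(sample)
    # Resample with replacement, record each resample's mean, then sort.
    means = sorted(
        sum(rng.choice(sample) for _ in range(n)) / n
        for _ in range(n_resamples)
    )
    lo = means[int(alpha / 2 * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

data = [9.8, 10.1, 9.9, 10.4, 10.0]      # a hypothetical "small sample"
lo, hi = bootstrap_ci(data)
print(lo <= sum(data) / len(data) <= hi)  # the sample mean falls inside the interval
```

In the paper's scheme many such intervals, one per resampled small sample, would then be merged into the final grey confidence interval.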
Relations between the statistical residence time series and effective shooting are analyzed according to the properties of the random residence time of maneuvering targets crossing the shot area within a given time. An estimation method for kill probability is proposed, which computes the probability of the number of residence times that satisfy the effective-shooting condition within the given time. Some expressions for the kill probability, together with their approximate formulae, are derived when the distribution of the residence time series is known. Theoretical analysis and simulation results show that this method is suitable for evaluating the hit capability of a fire system against maneuvering targets under random shooting.
To capture the complex logical relations and multiple dynamic gates present in a system, its failure probability model is established on the basis of a dynamic fault tree. A multi-state dynamic fault tree can be transformed into a Markov chain with continuous parameters. The state-transfer diagram can be decomposed into several state-transfer chains, and failure probability models can be derived from the lengths of these chains. The failure probability of the dynamic fault tree analysis (DFTA) is then obtained by summing each chain's probability. This failure probability calculation for DFTA based on the continuous-parameter Markov chain is proposed and proved. In a worked example, the analytic method is compared with the conventional methods, which must solve differential equations. The results show that the analytic method can easily be applied in engineering.
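As a rough illustration of evaluating a chain's failure probability analytically rather than by solving differential equations, the sketch below evaluates the closed-form CDF of the time to traverse one state-transfer chain with exponential transition rates (a hypoexponential distribution) and cross-checks it by direct simulation. The rates, time horizon and chain length are hypothetical, and the formula assumes distinct rates; this is not the paper's full multi-state DFTA construction.

```python
import math
import random

def chain_failure_prob(rates, t):
    """Probability that a single state-transfer chain with the given
    exponential transition rates has been fully traversed (system failed)
    by time t: closed-form hypoexponential CDF, assuming distinct rates."""
    total = 0.0
    for i, li in enumerate(rates):
        coeff = 1.0
        for j, lj in enumerate(rates):
            if i != j:
                coeff *= lj / (lj - li)
        total += coeff * (1.0 - math.exp(-li * t))
    return total

# Cross-check the analytic value against direct simulation of the chain.
rng = random.Random(0)
rates, t, trials = [0.5, 1.5], 2.0, 50000
hits = sum(rng.expovariate(rates[0]) + rng.expovariate(rates[1]) <= t
           for _ in range(trials))
analytic = chain_failure_prob(rates, t)
ok = abs(hits / trials - analytic) < 0.015
print(ok)
```

Summing such per-chain probabilities over all chains of the decomposed state-transfer diagram is the step the abstract describes.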
The purpose of the present study is to investigate the presence of multi-fractal behaviours in traffic time series not only by statistical approaches but also by geometrical approaches. The pointwise Hölder exponent of a function is calculated by developing an algorithm for the numerical evaluation of the Hölder exponent of a time series. Traffic time series observed on the Beijing Yuquanying highway are analysed. The results from all these methods indicate that the traffic data exhibit multi-fractal behaviour.
While conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory, digital forensics examiners rarely, if ever, do so. In this paper, we review some of the quantitative tools and techniques which are available for use in digital forensic investigations, including Bayesian networks, complexity theory, information theory and probability theory, and indicate how they may be used to obtain likelihood ratios or odds ratios for the relative plausibility of alternative explanations for the creation of the recovered digital evidence. The potential benefits of such quantitative measures for modern digital forensics are also outlined.
Correspondence imaging is a new modality of ghost imaging, which can retrieve a positive/negative image by simple conditional averaging of the reference frames that correspond to relatively large/small values of the total intensity measured at the bucket detector. Here we propose and experimentally demonstrate a more rigorous and general approach in which a ghost image is retrieved by calculating a Pearson correlation coefficient between the bucket detector intensity and the brightness at each pixel of the reference frames in turn. Furthermore, we provide a theoretical statistical interpretation of these two imaging phenomena, and explain how the error depends on the sample size and what kind of distribution the error obeys. According to our analysis, the image signal-to-noise ratio can be greatly improved and the sampling number reduced by means of our new method.
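The per-pixel Pearson-correlation retrieval described above can be sketched in a toy numerical experiment. Everything here is simulated and assumed: uniform random reference patterns stand in for speckle frames, a 16-pixel binary mask stands in for the object, and the bucket signal is the masked total intensity. The point is only that correlating the bucket signal with each pixel's brightness recovers the object.

```python
import random

random.seed(0)
n_frames, n_pix = 4000, 16
obj = [1.0 if i in (5, 6, 9, 10) else 0.0 for i in range(n_pix)]  # toy object mask

frames = [[random.random() for _ in range(n_pix)] for _ in range(n_frames)]
bucket = [sum(f[i] * obj[i] for i in range(n_pix)) for f in frames]  # bucket detector

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Correlate the bucket signal with the brightness at each pixel in turn.
image = [pearson([f[i] for f in frames], bucket) for i in range(n_pix)]
on = sum(image[i] for i in range(n_pix) if obj[i]) / 4
off = sum(image[i] for i in range(n_pix) if not obj[i]) / 12
print(on > off)  # object pixels correlate more strongly with the bucket signal
```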
We study the stochastic motion of a Brownian particle driven by a constant force over a static periodic potential. We show that both the effective diffusion and the effective drag coefficient are mathematically well-defined and we derive analytic expressions for these two quantities. We then investigate the asymptotic behaviors of the effective diffusion and the effective drag coefficient, respectively, for small driving force and for large driving force. In the case of small driving force, the effective diffusion is reduced from its Brownian value by a factor that increases exponentially with the amplitude of the potential. The effective drag coefficient is increased by approximately the same factor. As a result, the Einstein relation between the diffusion coefficient and the drag coefficient is approximately valid when the driving force is small. For moderately large driving force, both the effective diffusion and the effective drag coefficient are increased from their Brownian values, and the Einstein relation breaks down. In the limit of very large driving force, both the effective diffusion and the effective drag coefficient converge to their Brownian values and the Einstein relation is once again valid.
Owing to the bias between the predetermined planting location and the actual mine position, the slant range of a submarine-launched mobile mine (SLMM) appears randomly scattered. A normal distribution model of the slant range was proposed using the distribution theory of multivariate random variables, a simplified model based on key parameters was presented, and the laws governing how the slant-range distribution parameters, such as the mean and variance, are affected by the key parameters were given. These conclusions ensure that the slant range of an SLMM can be controlled when laying mines and provide a basis for tactical decision-making.
On the basis of the entropy of incomplete statistics (IS) and the joint probability factorization condition, two controversial problems existing in IS are investigated: one is what expression of the internal energy is reasonable for a composite system, and the other is whether the traditional zeroth law of thermodynamics is suitable for IS. Some new equivalent expressions of the internal energy of a composite system are derived through accurate mathematical calculation. Moreover, a self-consistent calculation is used to show that the zeroth law of thermodynamics is also suitable for IS, but that it cannot be proven theoretically. Finally, it is pointed out that the generalized zeroth law of thermodynamics for incomplete nonextensive statistics is unnecessary and that the nonextensive assumptions for the composite internal energy lead to mathematical contradiction.
There is a current debate about the extent to which Academic Freedom should be permitted in our universities. On the one hand, we have traditionalists who maintain that Academic Freedom should be unrestricted: people who have the appropriate qualifications and accomplishments should be allowed to develop theories about how the world is, or ought to be, as they see fit. On the other hand, we have post-traditional philosophers who argue against this degree of Academic Freedom. I consider a conservative version of post-traditional philosophy that permits restrictions on Academic Freedom only if the following conditions are met. Condition 1: the dissemination of the results of a given research project R must cause significant harm to some people, especially to people from oppressed groups. Condition 2: Condition 1 must possess strong empirical support. This version also accepts the following assumptions: (1) there is a world of objective facts that is, in principle, discoverable; (2) rational means are the means of discovering it; and (3) rational means require strong empirical support. I define strong empirical support for a hypothesis h on evidence e in probabilistic terms, as a ratio of posterior to prior probabilities substantially exceeding 1. I then argue in favour of a research policy that accepts unrestricted Academic Freedom. My argument is that a formal and general quandary arises within the standard theory of probability when we apply this account of empirical support to a set of possible causal hypotheses framed in such a way that the "reverse probabilities" pr(e|h) are 1. I consider various possible ways to escape this quandary, none of which are without difficulties, concluding that a research policy allowing for unrestricted Academic Freedom is probably the best that we can hope for.
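A minimal numerical illustration of the quandary, with made-up priors: when every rival hypothesis is framed so that pr(e|h) = 1, the posterior-to-prior ratio pr(h|e)/pr(h) = pr(e|h)/pr(e) collapses to the same value for all of them, so no hypothesis receives more support than any other; if the hypotheses are exhaustive the ratio is exactly 1, below the "substantially exceeding 1" threshold.

```python
def support_ratio(likelihood, pr_e):
    """Posterior-to-prior ratio: pr(h|e)/pr(h) = pr(e|h)/pr(e) by Bayes' theorem."""
    return likelihood / pr_e

# Hypothetical exhaustive causal hypotheses, each framed to entail the evidence e.
priors = {"h1": 0.5, "h2": 0.3, "h3": 0.2}
likelihoods = {h: 1.0 for h in priors}                  # pr(e|h) = 1 for all h
pr_e = sum(priors[h] * likelihoods[h] for h in priors)  # total probability of e
ratios = {h: support_ratio(likelihoods[h], pr_e) for h in priors}
print(ratios)  # every hypothesis gets the same ratio — no "strong support" for any
```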
In this work, we present a representation of non-relativistic quantum theory based on the foundations of paraconsistent annotated logic (PAL), a propositional and evidential logic with an associated lattice FOUR. We use the PAL version with annotation of two values (PAL2v), named paraquantum logic (PQL), in which the evidence signals are normalized values and the intensities of the inconsistencies are represented by degrees of contradiction. Quantum mechanics is represented through mapping onto interlaced bilattices, where this logical formalization allows annotation of two values in the format of degrees of evidence of probability. The Bernoulli probability distribution is used to establish probabilistic logical states that identify the superposition of states and quantum entanglement, and the corresponding equations determine the state vectors located inside the interlaced bilattice. In the proposed probabilistic paraquantum logic model (pPQL model), we introduce the operation of logical conflation into the interlaced bilattice. We verify that in the pPQL model the operation of logical conflation provides a suitable model for various phenomena of quantum mechanics, mainly quantum entanglement. The results obtained from the entanglement equations demonstrate the formalization and completeness of paraquantum logic, which allows for interpretations of related phenomena of quantum mechanics, including the EPR paradox and wave-particle duality.
The persistence exponent θ for the simple diffusion equation ∂φ/∂<em>t</em> = ∇²φ, with a random Gaussian initial condition, has been calculated exactly using a method known as selective averaging. The probability that the value of the field φ at a specified spatial coordinate remains positive throughout a certain time <em>t</em> behaves as <em>t</em><sup>−θ</sup> for asymptotically large time <em>t</em>. The value of θ, calculated here for any integer dimension <em>d</em>, takes a closed-form value below a critical dimension and is 1 otherwise. This exact theoretical result is being reported possibly for the first time and is not in agreement with the accepted values of θ for the corresponding dimensions.
How to efficiently measure the distance between two basic probability assignments (BPAs) is an open issue. In this paper, a new method to measure the distance between two BPAs is proposed, based on two existing measures of evidence distance. The newly proposed method is comprehensive and generalized. Numerical examples are used to illustrate the effectiveness of the proposed method.
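One of the existing evidence-distance measures that work of this kind typically builds on is the Jousselme distance, which weights the difference between two BPAs by the Jaccard similarity of their focal elements. The sketch below is a plain implementation of that classical measure over a three-element frame of discernment, not the paper's new combined metric; the frame and mass values are made up.

```python
from itertools import combinations

def powerset(theta):
    """All non-empty subsets of the frame of discernment."""
    s = list(theta)
    return [frozenset(c) for r in range(1, len(s) + 1)
            for c in combinations(s, r)]

def jousselme(m1, m2, theta):
    """Jousselme distance d = sqrt(0.5 * (m1-m2)^T D (m1-m2)),
    where D(A, B) = |A ∩ B| / |A ∪ B| (Jaccard similarity)."""
    focal = powerset(theta)
    diff = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in focal]
    acc = 0.0
    for i, A in enumerate(focal):
        for j, B in enumerate(focal):
            acc += diff[i] * diff[j] * len(A & B) / len(A | B)
    return (0.5 * acc) ** 0.5

theta = {"a", "b", "c"}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m3 = {frozenset({"c"}): 1.0}
print(jousselme(m1, m2, theta))  # identical BPAs → 0.0
print(jousselme(m1, m3, theta))  # conflicting BPAs → ≈ 0.938
```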
We propose a new theory of probability based on the general principle of the statistical stabilization of relative frequencies. According to this principle, it is possible to consider statistical stabilization not only with respect to the standard real topology on the field of rational numbers Q but also with respect to an arbitrary topology on Q. The case of p-adic (and, more generally, non-Archimedean) topologies is the most important. Our frequency theory of probability is a fruitful extension of the frequency theory of R. von Mises [18]. It is well known that the axiomatic theory of Kolmogorov uses the frequency theory as one of its foundations, and the new general frequency theory can be considered as the base for a general axiomatic theory of probability (Kolmogorov's theory is the particular case corresponding to the real topology of statistical stabilization on Q). The situation in the theory of probability thus becomes similar to that in modern geometry: the Kolmogorov axiomatics (like the Euclidean) is only one of the possibilities, and a great number of different non-Kolmogorov theories of probability can be generated. Applications to p-adic quantum mechanics and field theory are considered.
Markov chains are extensively used in modeling different aspects of engineering and scientific systems, such as performance of algorithms and reliability of systems. Different techniques have been developed for analyzing Markovian models, for example, Markov chain Monte Carlo based simulation, Markov analyzers, and more recently probabilistic model checking. However, these techniques either do not guarantee accurate analysis or are not scalable. Higher-order-logic theorem proving is a formal method that can overcome the above-mentioned limitations, but it is not yet mature enough to handle all sorts of Markovian models. In this paper, we propose a formalization of discrete-time Markov chains (DTMCs) that facilitates formal reasoning about time-homogeneous finite-state discrete-time Markov chains. In particular, we provide formal verification of some of their important properties, such as joint probabilities, the Chapman-Kolmogorov equation and the reversibility property, using higher-order logic. To demonstrate the usefulness of our work, we analyze two applications: a simplified binary communication channel and the Automatic Mail Quality Measurement protocol.
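The Chapman-Kolmogorov equation mentioned above can be checked numerically on a small example. The transition matrix below is a made-up two-state, binary-channel-style DTMC, not the formally verified model from the paper; the check confirms P^(m+n) = P^m · P^n.

```python
def mat_mul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(p, n):
    """n-step transition matrix P^n (n >= 0)."""
    result = [[1.0 if i == j else 0.0 for j in range(len(p))]
              for i in range(len(p))]
    for _ in range(n):
        result = mat_mul(result, p)
    return result

# Hypothetical two-state channel: rows are states, entries are transition probs.
P = [[0.9, 0.1],
     [0.2, 0.8]]

# Chapman-Kolmogorov: P^(m+n) = P^m · P^n, here with m = 2, n = 3.
lhs = mat_pow(P, 5)
rhs = mat_mul(mat_pow(P, 2), mat_pow(P, 3))
ok = all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
print(ok)
```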
A table of 95% confidence limits on the probabilities of randomly downloading relatively small numbers of illegal images or sensitive documents amongst a relatively large number of other images or documents has been computed. It is anticipated that these data will assist prosecution officials in deciding whether there is a reasonable likelihood of a successful criminal prosecution when the inadvertent-download defence is employed in cases of possession of child pornography, terrorist material or espionage-related documents. The same data can also be used by defence counsel to assess the strength of the prosecution's case.
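Tables of this kind can be generated with exact binomial (Clopper-Pearson) confidence limits. The sketch below computes two-sided 95% limits for an assumed case of k "suspect" files among n downloads by bisecting on the binomial CDF; the counts are illustrative, and this is the standard construction rather than the authors' published table.

```python
def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), accumulated term by term."""
    term = (1 - p) ** n
    total = term
    for i in range(1, k + 1):
        term *= (n - i + 1) / i * p / (1 - p)
        total += term
    return total

def clopper_pearson(k, n, conf=0.95):
    """Exact (Clopper-Pearson) two-sided confidence limits for a binomial
    proportion, found by bisection on the binomial CDF in p."""
    alpha = 1 - conf

    def solve(f):
        # f is True for small p, False for large p; find the crossover.
        lo, hi = 0.0, 1.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if f(mid):
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    lower = 0.0 if k == 0 else solve(
        lambda p: binom_cdf(k - 1, n, p) > 1 - alpha / 2)
    upper = 1.0 if k == n else solve(
        lambda p: binom_cdf(k, n, p) > alpha / 2)
    return lower, upper

print(clopper_pearson(2, 100))  # e.g. 2 suspect files in 100 downloads
```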
Unlike conventional forensics, digital forensics does not at present generally quantify the results of its investigations. It is suggested that digital forensics should aim to catch up with other forensic disciplines by using Bayesian and other numerical methodologies to quantify its investigations' results. Assessing the plausibility of alternative hypotheses (or propositions, or claims) which explain how recovered digital evidence came to exist on a device could assist both the prosecution and the defence sides in criminal proceedings: helping the prosecution to decide whether to proceed to trial and helping defence lawyers to advise a defendant how to plead. This paper reviews some numerical approaches to the goal of quantifying the relative weights of individual items of digital evidence and the plausibility of hypotheses based on that evidence. The potential advantages for enabling the construction of cost-effective digital forensic triage schemas are also outlined.
Probabilistic techniques are widely used in the analysis of algorithms to estimate the computational complexity of an algorithm or a computational problem. Traditionally, such analyses are performed using paper-and-pencil proofs, and the results are sometimes validated using simulation techniques. These techniques are informal and may thus result in an inaccurate analysis. In this paper, we propose a formal technique for analyzing the expected time complexity of algorithms using higher-order-logic theorem proving. The approach calls for mathematically modeling the algorithm, along with its inputs, using indicator random variables in higher-order logic. This model is then used to formally reason about the expected time complexity of the underlying algorithm in a theorem prover. The paper includes the higher-order-logic formalization of indicator random variables, which are fundamental to the proposed infrastructure. To illustrate the practical effectiveness and utilization of the proposed infrastructure, the paper also includes the analysis of algorithms for three well-known problems: the hat-check problem, the birthday paradox and the hiring problem.
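For the hat-check problem, the indicator-random-variable argument gives E[matches] = n · (1/n) = 1 for any n: letting X_i = 1 when person i gets their own hat back, each E[X_i] = 1/n, and linearity of expectation sums the indicators. The Monte Carlo sketch below (informal validation of exactly the kind the paper argues should be replaced by formal proof) checks this; the group size, trial count and seed are arbitrary.

```python
import random

def hat_check_expected_matches(n, trials=20000, seed=42):
    """Monte Carlo estimate of the expected number of people who get
    their own hat back when n hats are returned in random order."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        hats = list(range(n))
        rng.shuffle(hats)
        # Sum of indicators: X_i = 1 iff person i receives hat i.
        total += sum(1 for i, h in enumerate(hats) if i == h)
    return total / trials

# Linearity of expectation predicts exactly 1, independent of n.
est = hat_check_expected_matches(10)
print(abs(est - 1.0) < 0.05)
```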
Funding (kill probability estimation for maneuvering targets): National Defense Funds (9140C300602080C30); Natural Science Foundation of Shanxi Province, China (2008011011).
Funding (correspondence/ghost imaging): National Key Scientific Instrument and Equipment Development Project of China (2013YQ030595); National High Technology Research and Development Program of China (2013AA122902).
Funding (SLMM slant range model): Science Research Development Foundation of Dalian Naval Academy (2009032).
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 11005041), the Natural Science Foundation of Fujian Province, China (Grant No. 2010J05007), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China (Grant No. 2010-1561), the Basic Science Research Foundation, China (Grant No. JB-SJ1005), and the Science Research Fund of Huaqiao University, China (including the support for Huang (Grant No. 11BS207))
Abstract: On the basis of the entropy of incomplete statistics (IS) and the joint probability factorization condition, two controversial problems in IS are investigated: what expression of the internal energy is reasonable for a composite system, and whether the traditional zeroth law of thermodynamics is suitable for IS. Some new equivalent expressions for the internal energy of a composite system are derived through exact mathematical calculation. Moreover, a self-consistent calculation is used to show that the zeroth law of thermodynamics is also suitable for IS, although it cannot be proven theoretically. Finally, it is pointed out that the generalized zeroth law of thermodynamics for incomplete nonextensive statistics is unnecessary, and that the nonextensive assumptions for the composite internal energy lead to a mathematical contradiction.
Abstract: There is a current debate about the extent to which Academic Freedom should be permitted in our universities. On the one hand, we have traditionalists who maintain that Academic Freedom should be unrestricted: people who have the appropriate qualifications and accomplishments should be allowed to develop theories about how the world is, or ought to be, as they see fit. On the other hand, we have post-traditional philosophers who argue against this degree of Academic Freedom. I consider a conservative version of post-traditional philosophy that permits restrictions on Academic Freedom only if the following conditions are met. Condition 1: the dissemination of the results of a given research project R must cause significant harm to some people, especially to people from oppressed groups. Condition 2: Condition 1 must possess strong empirical support. This version also accepts the following assumptions: (1) there is a world of objective facts that is, in principle, discoverable; (2) rational means are the means of discovering it; and (3) rational means require strong empirical support. I define strong empirical support for a hypothesis h on evidence e in probabilistic terms, as a ratio of posterior to prior probabilities substantially exceeding 1. I then argue in favour of a research policy that accepts unrestricted Academic Freedom. My argument is that a formal and general quandary arises within the standard theory of probability when we apply this account of empirical support to a set of possible causal hypotheses framed in such a way that the "reverse probabilities" pr(e/h) are 1. I consider various possible ways to escape this quandary, none of which is without difficulties, concluding that a research policy allowing for unrestricted Academic Freedom is probably the best that we can hope for.
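The ratio-based definition of empirical support can be made concrete with a small calculation (an illustrative sketch; all the probabilities below are invented). By Bayes' theorem, pr(h/e)/pr(h) = pr(e/h)/pr(e), so when the reverse probability pr(e/h) equals 1 for every hypothesis in the set, each hypothesis receives the identical support ratio 1/pr(e), whatever its prior:

```python
def support_ratio(prior_h, likelihood_e_given_h, prob_e):
    """Ratio of posterior to prior probability: pr(h/e)/pr(h) = pr(e/h)/pr(e)."""
    posterior = likelihood_e_given_h * prior_h / prob_e
    return posterior / prior_h

# With the "reverse probability" pr(e/h) equal to 1, hypotheses with very
# different priors get exactly the same support ratio 1/pr(e).
r_low_prior = support_ratio(0.1, 1.0, 0.25)
r_high_prior = support_ratio(0.6, 1.0, 0.25)
```

This is the setup behind the quandary: the evidence "strongly supports" every such hypothesis equally, so support alone cannot discriminate among them.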
Abstract: In this work, we present a representation of non-relativistic quantum theory based on the foundations of paraconsistent annotated logic (PAL), a propositional and evidential logic with an associated lattice FOUR. We use the PAL version with annotation of two values (PAL2v), named paraquantum logic (PQL), in which the evidence signals are normalized values and the intensities of the inconsistencies are represented by degrees of contradiction. Quantum mechanics is represented through mapping onto interlaced bilattices, and this logical formalization allows annotation of two values in the format of degrees of evidence of probability. The Bernoulli probability distribution is used to establish probabilistic logical states that identify the superposition of states and quantum entanglement, and to determine the state vectors located inside the interlaced bilattice. In the proposed logical probabilistic paraquantum logic model (pPQL Model), we introduce the operation of logical conflation into the interlaced bilattice. We verify that in the pPQL Model the operation of logical conflation provides a suitable model for various phenomena of quantum mechanics, mainly quantum entanglement. The results obtained from the entanglement equations demonstrate the formalization and completeness of paraquantum logic, which allows for interpretations of related phenomena of quantum mechanics, including the EPR paradox and wave-particle theory.
Abstract: The persistence exponent <em>θ</em> for the simple diffusion equation ∂<em>φ</em>/∂<em>t</em> = ∇<sup>2</sup><em>φ</em>, with a random Gaussian initial condition, has been calculated exactly using a method known as selective averaging. The probability that the value of the field <em>φ</em> at a specified spatial coordinate remains positive throughout a certain time <em>t</em> behaves as <em>t</em><sup>−<em>θ</em></sup> for asymptotically large time <em>t</em>. The value of <em>θ</em>, calculated here for any integer dimension <em>d</em>, is <img src="Edit_bc64e52a-d6d0-4b63-8ef3-aa0f9d3c39cc.png" alt="" /> for <img src="Edit_becf7ae7-0ae4-43a6-9a41-017f25747517.png" alt="" /> and 1 otherwise. This exact theoretical result is being reported possibly for the first time and is not in agreement with the accepted values <img src="Edit_fbefbfcf-d76b-4eeb-a5f5-d8afda4a1a0c.png" alt="" /> for <img src="Edit_ec927d57-c273-40dd-8126-706443b57534.png" alt="" /> respectively.
Funding: Supported by the National High Technology Research and Development Program of China (863 Program) (2013AA013801), the National Natural Science Foundation of China (61174022, 61573290), the open funding project of the State Key Laboratory of Virtual Reality Technology and Systems, Beihang University (BUAA-VR-14KF-02), the General Research Program of Natural Science of the Sichuan Provincial Department of Education (14ZB0322), and the Self-financing Program of the State Ethnic Affairs Commission of China (14SCZ014)
Abstract: How to efficiently measure the distance between two basic probability assignments (BPAs) is an open issue. In this paper, a new method to measure the distance between two BPAs is proposed, based on two existing measures of evidence distance. The proposed method is comprehensive and generalized. Numerical examples are used to illustrate its effectiveness.
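The abstract does not spell out the new measure's formula, but one widely used evidence distance of the kind it builds on is the Jousselme distance, which weights mass differences by the Jaccard similarity of focal elements. A minimal sketch under that reading (BPAs represented as dicts from frozensets to masses; the example BPAs are invented):

```python
from math import sqrt

def jousselme_distance(m1, m2):
    """Jousselme distance between two BPAs given as {frozenset: mass} dicts:
    d = sqrt(0.5 * (m1 - m2)^T D (m1 - m2)), with D[A][B] = |A & B| / |A | B|."""
    focal = sorted(set(m1) | set(m2), key=lambda s: (len(s), sorted(s)))
    diff = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in focal]
    total = 0.0
    for i, A in enumerate(focal):
        for j, B in enumerate(focal):
            jac = len(A & B) / len(A | B) if (A | B) else 1.0
            total += diff[i] * jac * diff[j]
    return sqrt(0.5 * total)

# Invented example BPAs over the frame of discernment {a, b}
m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
m2 = {frozenset({'b'}): 1.0}
```

The normalization by 0.5 keeps the distance in [0, 1]: identical BPAs are at distance 0, and totally conflicting categorical BPAs are at distance 1.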
Abstract: We propose a new theory of probability based on the general principle of the statistical stabilization of relative frequencies. According to this principle, it is possible to consider statistical stabilization not only with respect to the standard real topology on the field of rational numbers Q but also with respect to an arbitrary topology on Q. The case of p-adic (and, more generally, non-Archimedean) topologies is the most important. Our frequency theory of probability is a fruitful extension of the frequency theory of R. von Mises [18]. It is well known that Kolmogorov's axiomatic theory uses the frequency theory as one of its foundations, and the new general frequency theory can be considered as the base for a general axiomatic theory of probability (Kolmogorov's theory is the particular case corresponding to the real topology of statistical stabilization on Q). The situation in the theory of probability thus becomes similar to that in modern geometry: the Kolmogorov axiomatics (like the Euclidean) is only one of the possibilities, and we have generated a great number of different non-Kolmogorov theories of probability. Applications to p-adic quantum mechanics and field theory are considered.
Abstract: Markov chains are extensively used to model different aspects of engineering and scientific systems, such as the performance of algorithms and the reliability of systems. Different techniques have been developed for analyzing Markovian models, for example, Markov Chain Monte Carlo based simulation, Markov analyzers, and, more recently, probabilistic model checking. However, these techniques either do not guarantee accurate analysis or are not scalable. Higher-order-logic theorem proving is a formal method that can overcome the above-mentioned limitations, but it is not yet mature enough to handle all sorts of Markovian models. In this paper, we propose a formalization of Discrete-Time Markov Chains (DTMCs) that facilitates formal reasoning about time-homogeneous finite-state discrete-time Markov chains. In particular, we formally verify some of their important properties, such as joint probabilities, the Chapman-Kolmogorov equation, and the reversibility property, using higher-order logic. To demonstrate the usefulness of our work, we analyze two applications: a simplified binary communication channel and the Automatic Mail Quality Measurement protocol.
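One of the verified properties, the Chapman-Kolmogorov equation, states that for a time-homogeneous DTMC the (m+n)-step transition matrix is the product of the m-step and n-step matrices. A quick numerical check on a hypothetical two-state channel matrix (the entries are invented; this sketch is numerical, not the paper's higher-order-logic proof):

```python
import numpy as np

# Hypothetical transition matrix of a two-state binary channel
# (rows: current state, columns: next state; each row sums to 1)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def n_step(P, n):
    """n-step transition matrix of a time-homogeneous DTMC."""
    return np.linalg.matrix_power(P, n)

# Chapman-Kolmogorov: the (m+n)-step matrix factors through any intermediate time
lhs = n_step(P, 5)
rhs = n_step(P, 2) @ n_step(P, 3)
```

Each row of the n-step matrix remains a probability distribution, so row sums stay equal to 1 for every n.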
Abstract: A table of 95% confidence limits on the probabilities of randomly downloading relatively small numbers of illegal images or sensitive documents amongst a relatively large number of other images or documents has been computed. It is anticipated that these data will assist prosecution officials in deciding whether or not there is a reasonable likelihood of a successful criminal prosecution when the inadvertent-download defence is employed in cases of possession of child pornography, terrorist material or espionage-related documents. The same data can also be used by defence counsel to assess the strength of the prosecution's case.
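The abstract does not spell out the underlying probability model, but a natural reading is a hypergeometric one: the chance that a uniformly random selection of n files from a collection of N, of which K are illegal, contains at least k illegal files. A sketch under that assumption (all parameter values below are invented):

```python
from math import comb

def p_at_least_k(N, K, n, k):
    """P(a uniformly random download of n files from N files, K of which
    are illegal, contains at least k illegal files) -- a hypergeometric tail."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Invented scenario: 10 illegal files among 1000, 50 files downloaded at random
p1 = p_at_least_k(1000, 10, 50, 1)   # at least one illegal file
p2 = p_at_least_k(1000, 10, 50, 2)   # at least two
```

The smaller this tail probability, the weaker the inadvertent-download explanation; confidence limits of the kind tabulated in the paper would then bound such probabilities given sampled counts.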
Abstract: Unlike conventional forensics, digital forensics does not at present generally quantify the results of its investigations. It is suggested that digital forensics should aim to catch up with other forensic disciplines by using Bayesian and other numerical methodologies to quantify the results of its investigations. Assessing the plausibility of alternative hypotheses (or propositions, or claims) that explain how recovered digital evidence came to exist on a device could assist both the prosecution and the defence in criminal proceedings: helping the prosecution to decide whether to proceed to trial and helping defence lawyers to advise a defendant how to plead. This paper reviews some numerical approaches to quantifying the relative weights of individual items of digital evidence and the plausibility of hypotheses based on that evidence. The potential advantages for constructing cost-effective digital forensic triage schemas are also outlined.
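A standard Bayesian device for weighing items of evidence is the likelihood ratio: assuming the evidence items are conditionally independent, the posterior odds of a hypothesis equal the prior odds multiplied by the product of the items' likelihood ratios. A minimal sketch of this bookkeeping (the odds and ratios below are invented for illustration):

```python
def update_odds(prior_odds, likelihood_ratios):
    """Posterior odds = prior odds x product of likelihood ratios,
    assuming the evidence items are conditionally independent."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_prob(odds):
    """Convert odds in favour of a hypothesis into a probability."""
    return odds / (1.0 + odds)

# Invented figures: even prior odds, then three evidence items each with a
# likelihood ratio favouring the hypothesis
posterior = update_odds(1.0, [4.0, 2.0, 1.5])
```

Working in odds makes each item's contribution a single multiplicative weight, which is exactly the kind of per-item quantification the paper argues digital forensics currently lacks.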
Abstract: Probabilistic techniques are widely used in the analysis of algorithms to estimate the computational complexity of an algorithm or a computational problem. Traditionally, such analyses are performed using paper-and-pencil proofs, and the results are sometimes validated using simulation techniques. These techniques are informal and may thus yield inaccurate analyses. In this paper, we propose a formal technique for analyzing the expected time complexity of algorithms using higher-order-logic theorem proving. The approach calls for mathematically modeling the algorithm, along with its inputs, in higher-order logic using indicator random variables. This model is then used to formally reason about the expected time complexity of the underlying algorithm in a theorem prover. The paper includes the higher-order-logic formalization of indicator random variables, which are fundamental to the proposed infrastructure. To illustrate the practical effectiveness and utility of the proposed infrastructure, the paper also includes the analysis of algorithms for three well-known problems: the hat-check problem, the birthday paradox and the hiring problem.
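The hat-check analysis via indicator random variables rests on linearity of expectation: with X_i = 1 when person i receives their own hat, E[X_i] = 1/n, so the expected number of matches is n · (1/n) = 1 for any n, even though the indicators are not independent. A Monte-Carlo sketch confirming this (the trial count and seed are arbitrary; this is a numerical check, not the paper's formal proof):

```python
import random

def mean_fixed_points(n, trials=20000, seed=1):
    """Monte-Carlo estimate of the expected number of people who get their
    own hat back when n hats are returned uniformly at random."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)                      # a uniformly random permutation
        total += sum(i == p for i, p in enumerate(perm))
    return total / trials
```

The estimate hovers near 1 regardless of n, matching the indicator-variable derivation.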