There is no specific and standard definition of what free, fair and credible democratic elections mean under international law. International law, by implication, only lays down a guide to the qualities of what a free and fair democratic election should and should not be. Both emerging and established democracies deviate frequently from the ideals of a free, fair and credible election. Confidence in the electoral process has therefore become a key concern for political scientists and electoral administrators, prompting this critical review. This article is mainly theoretical in perspective, using primary and secondary data in its context. Findings indicate allegations of administrative restrictions being selectively applied to losers, coupled with election rigging by winners. This confirms that the quest to measure and determine the credibility of an electoral outcome, or the "freeness and fairness" of an electoral process, needs a collaborative approach. A model is used to explain the complexity of defining free and fair elections, while emphasis is placed on aligning domestic law with international law.
Media for disseminating information are no longer limited to newspapers, television and radio but now extend to online news. The number of online news readers is rapidly increasing along with the popularity of the Internet. However, only sources that emphasize the element of credibility will be accepted by readers. This study was conducted to determine the credibility of online news and to identify the relationship between online news credibility and youth acceptance. Using a quantitative approach, a total of 400 youths were selected to participate in the study out of a population of 68,000 in Selangor. The data were analyzed using the Statistical Package for the Social Sciences (SPSS) version 21.0. This study shows that young women were the group most commonly seeking and reading online information. The findings also suggest that online news in Malaysia is less credible. Although online news is less credible, the results show that youth read it because the contents are current and up to date. Further, the result of Pearson's correlation test shows that the more credible elements are available in online news, the more that medium is accepted by youth.
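As a concrete illustration of the Pearson correlation test the abstract refers to, the coefficient can be computed directly from paired scores. The credibility and acceptance values below are hypothetical, not data from the study:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-respondent scores: perceived credibility vs. acceptance
credibility = [2.1, 3.4, 3.0, 4.2, 4.8, 2.7]
acceptance = [2.4, 3.1, 3.3, 4.0, 4.9, 2.5]
r = pearson_r(credibility, acceptance)  # strong positive correlation
```

A positive r close to 1 corresponds to the study's finding that more credible elements go with higher acceptance.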
With the development of technology, the connected vehicle has been upgraded from a traditional transport vehicle to an information terminal and an energy storage terminal. The data of intelligent connected vehicles (ICVs) is the key to maximizing their efficiency. However, in the context of increasingly strict global data security supervision and compliance, vehicle data sharing faces numerous problems: complex types of connected vehicle data, poor data collaboration between the IT (information technology) and OT (operation technology) domains, differing data format standards, a lack of shared trust sources, difficulty in ensuring the quality of shared data, a lack of data control rights, and difficulty in defining data ownership; as a result, data islands are widespread. This study proposes FADSF (Fuzzy Anonymous Data Share Frame), an automobile data sharing scheme based on blockchain. The data holder publishes the shared data information and forms the corresponding label storage on the blockchain. The data demander browses the data directory information to select, purchase and verify data assets. Meanwhile, this paper designs a data structure, the Data Discrimination Bloom Filter (DDBF), for recording complaints about illegal data. When the number of data complaints reaches a threshold, the audit traceability contract is triggered to punish the illegal data publisher, aiming to improve data quality and maintain a good data sharing ecology. The scheme is tested on Ethereum to demonstrate its feasibility, efficiency and security.
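The internals of the DDBF are specific to the paper, but the structure it builds on, the Bloom filter, can be sketched. Below is a minimal classic Bloom filter (not the authors' DDBF) used to record complained-about asset identifiers; the asset IDs are hypothetical:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: probabilistic set membership with no false negatives."""

    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = [False] * num_bits

    def _positions(self, item):
        # Derive k independent positions by salting one strong hash
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # True may be a false positive; False is always correct
        return all(self.bits[pos] for pos in self._positions(item))

# Record a complaint against a (hypothetical) data-asset ID
complaints = BloomFilter()
complaints.add("asset-42")
```

A filter like this lets a contract check "has this asset been complained about?" in constant space, at the cost of a tunable false-positive rate.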
This study focuses on the relationship between China's accession to the World Trade Organisation (WTO) and its reform of state-owned enterprises (SOEs), and suggests that the major incentive for the Chinese government to join the WTO was to promote economic reforms by overcoming domestic obstacles. After other options such as decentralisation, legalisation and privatisation failed to enhance the viability of SOEs, the Chinese government began to rely on international institutions to enhance its credibility and harden the budget constraints on SOEs. The WTO is one of the most important international organisations and has binding force on its member states. China's participation in the WTO would effectively harden budget constraints on its SOEs and improve their efficiency by introducing competition into the domestic market. Historical data support our argument and indicate that China has effectively enhanced the credibility of government commitments and promoted the reform of its SOEs since its accession to the WTO.
How to evaluate the firing precision of weapon equipment effectively and at low cost is one of the core problems in improving the test level of a weapon system. This paper proposes a new method, based on Bayesian theory, to evaluate the firing precision of the MLRS (multiple launch rocket system) while accounting for the credibility of the simulation system. First, a comprehensive index system for the credibility of the MLRS firing-precision simulation system is constructed using the group analytic hierarchy process. A modified method for determining the comprehensive weight of each index is established to improve the rationality of the index weight coefficients. The Bayesian posterior estimation formula for firing precision incorporating prior information is derived in the form of a mixed prior distribution, and the rationality of the prior information used in the estimation model is discussed quantitatively. Different evaluation methods are compared in simulation tests to validate the effectiveness of the proposed method. The experimental results show that the effectiveness of the firing-precision estimation method is improved by more than 25%.
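The paper's mixed-prior posterior formula is not reproduced in the abstract, but the general mechanics of a mixture-prior Bayesian update can be sketched for a normal mean with known noise: each prior component is updated conjugately, and its mixing weight is rescaled by how well it predicted the data. The two-component structure and all numbers below are illustrative assumptions, not the paper's model:

```python
import math

def normal_pdf(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def posterior_mixture(x, sigma, components):
    """Conjugate update of a normal mixture prior on a mean, given one
    observation x with known noise sd sigma.
    components: list of (weight, prior_mean, prior_sd)."""
    updated = []
    for w, m, s in components:
        # Standard normal-normal conjugate update for this component
        post_var = 1.0 / (1.0 / s**2 + 1.0 / sigma**2)
        post_mean = post_var * (m / s**2 + x / sigma**2)
        # Component weight is rescaled by its marginal likelihood of x
        marg = normal_pdf(x, m, math.sqrt(s**2 + sigma**2))
        updated.append((w * marg, post_mean, math.sqrt(post_var)))
    total = sum(w for w, _, _ in updated)
    return [(w / total, m, s) for w, m, s in updated]

# Hypothetical: weight 0.6 on a simulation-based prior, 0.4 on a field-test prior
post = posterior_mixture(x=1.0, sigma=0.5,
                         components=[(0.6, 0.8, 0.3), (0.4, 2.0, 0.5)])
```

The credibility assessment in the paper plays the role of the prior mixing weight here: a more credible simulation system earns its prior component more influence on the posterior.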
The credible sets for system parameters and transfer functions are discussed in this paper. Derivations of the credible sets via Bayesian statistical methods are given for various cases, and the credible sets of transfer functions are constructed on the basis of the parameter credible sets. In the small-sample case, the credible set containing the parameters has minimal volume for a given credible probability. In the large-sample case, it is proved that if the critical boundary is chosen properly, the probability that the parameters lie in the credible set tends to one as the number of data points tends to infinity. The credible set for a transfer function is an envelope on the complex plane. Under some assumptions, the envelope tends to the curve of the true transfer function.
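For a unimodal, symmetric posterior such as a normal, the minimal-volume credible set mentioned above reduces to the familiar symmetric interval about the posterior mean. A quick Monte Carlo coverage check, purely illustrative:

```python
import random

def hpd_interval(mean, sd, z=1.96):
    """Minimal-length (highest-posterior-density) 95% credible interval
    for a normal posterior: symmetric about the mean."""
    return (mean - z * sd, mean + z * sd)

# Monte Carlo check: ~95% of posterior draws fall inside the interval
random.seed(0)
lo, hi = hpd_interval(0.0, 1.0)
draws = [random.gauss(0.0, 1.0) for _ in range(10_000)]
coverage = sum(lo < d < hi for d in draws) / len(draws)
```

For asymmetric posteriors the minimal-volume set is no longer symmetric, which is why the general small-sample construction in the paper requires more care.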
The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. Introducing a conditional probability that satisfies the requirement that the background be no larger than the observed number of events is also discussed as a way to construct the Bayesian credible interval. A Fortran routine, BPOCI, has been developed to implement the calculation.
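The basic construction (flat prior on the signal, known background, no systematic uncertainties) can be sketched numerically. This is an illustrative grid calculation, not the BPOCI routine, and the example counts are arbitrary:

```python
import math

def poisson_signal_posterior(n, b, s_max=30.0, steps=3000):
    """Posterior density of signal s >= 0 for observed count n and known
    background b, under a flat prior: p(s|n) proportional to (s+b)^n exp(-(s+b))."""
    ds = s_max / steps
    grid = [i * ds for i in range(steps + 1)]
    dens = [(s + b) ** n * math.exp(-(s + b)) for s in grid]
    norm = sum(dens) * ds
    return grid, [d / norm for d in dens]

def central_interval(grid, dens, cred=0.95):
    """Equal-tail credible interval read off a gridded density."""
    ds = grid[1] - grid[0]
    cdf, acc = [], 0.0
    for d in dens:
        acc += d * ds
        cdf.append(acc)
    lo = next(s for s, c in zip(grid, cdf) if c >= (1 - cred) / 2)
    hi = next(s for s, c in zip(grid, cdf) if c >= 1 - (1 - cred) / 2)
    return lo, hi

# Example: 5 events observed over an expected background of 1
grid, dens = poisson_signal_posterior(n=5, b=1.0)
lo, hi = central_interval(grid, dens)
```

Truncating the prior at s = 0 is what makes the interval behave sensibly when the observed count is small relative to the background.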
In this study, a blockchain-based federated learning system using an enhanced weighted mean vector optimization algorithm, known as EINFO, is proposed. The proposed EINFO addresses the limitations of federated averaging during the global update and model training when data is unevenly distributed among devices and the number of data samples varies. Using a well-defined structure and updating the vector positions by local searching, vector combining, and updating rules, the EINFO algorithm optimizes the shared model parameters. To increase the exploration and exploitation capabilities and improve the model convergence rate, new vectors are generated through the use of a weighted mean vector based on the inverse square law. To choose validators and miners and to propagate new blocks, a delegated proof of stake based on the reliability of blockchain nodes is suggested. Federated learning is integrated into the blockchain to protect nodes from both external and internal threats. Extensive simulations are run to determine how well the suggested system performs relative to current models in the literature. The simulation results show that the proposed system outperforms existing schemes in terms of accuracy, sensitivity and specificity.
Government credibility is an important asset of contemporary national governance, an important criterion for evaluating government legitimacy, and a key factor in measuring the effectiveness of governance. In recent years, research on government credibility has mostly focused on theories and mechanisms, with little empirical work on the topic. This article applies variable selection models from the field of social statistics to the issue of government credibility, in order to study it empirically and explore its core influencing factors from a statistical perspective. Specifically, this article uses four regression-based methods and three random-forest-based methods to study the influencing factors of government credibility in the provinces of China, and compares the performance of these seven variable selection methods along different dimensions. The results show that the variable selection methods differ in simplicity, accuracy, and variable importance ranking, and therefore carry different weight in the study of government credibility. This study provides a methodological reference for variable selection models in social science research, and also offers a multidimensional comparative perspective for analyzing the influencing factors of government credibility.
It is quite common that the theoretical model used in Kalman filtering is not sufficiently accurate for practical applications, because the covariances of the noises are not exactly known. Our previous work reveals that in such scenarios the filter-calculated mean square error (FMSE) and the true mean square error (TMSE) become inconsistent, whereas FMSE and TMSE are consistent in a Kalman filter with an accurate model. This can lead to low credibility of the state estimation, regardless of whether Kalman filters or adaptive Kalman filters are used. It is therefore important to study this inconsistency in order to understand the quantitative influence of inaccurate models. To this end, the concept of credibility is adopted to discuss the inconsistency problem in this paper. To quantify the degree of credibility, a trust factor is constructed based on the FMSE and the TMSE. However, the trust factor cannot be computed directly, since the TMSE is unavailable in practical applications. Based on the definition of the trust factor, its estimation is reduced to online estimation of the TMSE. More importantly, a necessary and sufficient condition is found, which turns out to be the basis for better design of high-performance Kalman filters. Accordingly, beyond trust factor estimation with the Sage-Husa technique (TFE-SHT), three novel trust factor estimation methods are proposed: a direct numerical solving method (TFE-DNS), the particle swarm optimization method (PSO) and an expectation maximization-particle swarm optimization method (EM-PSO). The analysis and simulation results both show that the proposed TFE-DNS is better than TFE-SHT in the case of a single unknown noise covariance, while the proposed EM-PSO performs better than EM and PSO alone at estimating the credibility degree and the state when both noise covariances must be estimated online.
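The FMSE/TMSE inconsistency is easy to reproduce with a scalar filter: give the filter a measurement-noise covariance smaller than the truth, and the error variance it reports (FMSE) understates the error actually incurred (TMSE). The sketch below is illustrative only and is not any of the TFE methods proposed in the paper:

```python
import random

def kalman_1d(zs, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter for x_k = x_{k-1} + w, z_k = x_k + v.
    Returns estimates and the filter-calculated error variances (FMSE)."""
    x, p, xs, ps = x0, p0, [], []
    for z in zs:
        p = p + q            # predict
        k = p / (p + r)      # gain
        x = x + k * (z - x)  # update
        p = (1 - k) * p
        xs.append(x)
        ps.append(p)
    return xs, ps

random.seed(1)
q_true, r_true, r_wrong = 0.01, 1.0, 0.1  # filter assumes too little meas. noise
sq_err, fmse = [], []
for _ in range(300):  # Monte Carlo trials to estimate the true MSE
    x_true, states, zs = 0.0, [], []
    for _ in range(50):
        x_true += random.gauss(0.0, q_true ** 0.5)
        states.append(x_true)
        zs.append(x_true + random.gauss(0.0, r_true ** 0.5))
    xs, ps = kalman_1d(zs, q_true, r_wrong)
    sq_err.append((xs[-1] - states[-1]) ** 2)
    fmse.append(ps[-1])
tmse = sum(sq_err) / len(sq_err)
# With the mismatched r, the reported FMSE understates the empirical TMSE
```

A trust factor built from FMSE and TMSE would flag exactly this gap; the paper's contribution is estimating the TMSE online, since the true states are not available in practice.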
The growth of the internet and technology has had a significant effect on social interactions. False information has become an important research topic due to the massive amount of misinformed content on social networks. It is very easy for any user to spread misinformation through these media, so misinformation is a problem for professionals, organizations, and societies. Hence, it is essential to assess the credibility and validity of news articles being shared on social media. The core challenge is to distinguish between accurate and false information. Recent studies focus on news article content, such as titles and descriptions, which has limited their achievements. However, there are two ordinarily agreed-upon features of misinformation: first, the title and text of an article, and second, user engagement. For the news context, we extracted different user engagements with articles, for example, tweets (read-only), retweets, likes, and shares. We calculate user credibility and combine it with the article content and the user's context. After combining both features, we used three natural language processing (NLP) feature extraction techniques: Term Frequency-Inverse Document Frequency (TF-IDF), Count-Vectorizer (CV), and Hashing-Vectorizer (HV). Then, we applied different machine learning classifiers to label articles as real or fake: a Support Vector Machine (SVM), Naive Bayes (NB), Random Forest (RF), Decision Tree (DT), Gradient Boosting (GB), and K-Nearest Neighbors (KNN). The proposed method has been tested on a real-world dataset, "fakenewsnet", whose repository we refined according to our required features. The dataset contains more than 23,000 articles with millions of user engagements. The highest accuracy score is 93.4%, achieved using count-vector features and a random forest classifier. Our findings confirm that the proposed classifier can effectively classify misinformation in social networks.
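Of the three feature extraction techniques listed, TF-IDF is the easiest to show from first principles: a term is weighted up when it is frequent in a document but rare across the corpus. The study presumably used library vectorizers, so the toy re-implementation and corpus below are illustrative only:

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF vectors for a list of tokenized documents (lists of strings)."""
    n = len(docs)
    df = Counter()                      # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    idf = {t: math.log(n / df[t]) for t in df}
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (tf[t] / len(doc)) * idf[t] for t in tf})
    return vectors

# Hypothetical mini-corpus of article titles
docs = [["fake", "news", "alert"],
        ["news", "update", "today"],
        ["fake", "claim", "today"]]
vecs = tf_idf(docs)
# "news" appears in 2 of 3 docs, so it is weighted below the rarer "alert"
```

Vectors like these, concatenated with user-credibility features, are what the classifiers in the study consume.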
In the field of target recognition based on temporal-spatial information fusion, evidence theory has received extensive attention. To achieve accurate and efficient target recognition with evidence theory, an adaptive temporal-spatial information fusion model is proposed. First, an adaptive evaluation correction mechanism is constructed from the evidence distance and Deng entropy, which realizes credibility discrimination and adaptive correction of the spatial evidence. Second, a credibility decay operator is introduced to obtain the dynamic credibility of temporal evidence. Finally, the sequential combination of temporal-spatial evidence is achieved with Shafer's discount criterion and Dempster's combination rule. The simulation results show that the proposed method not only accounts for the dynamic and sequential characteristics of combining temporal-spatial evidence, but also has a strong capability for processing conflicting information, providing a new reference for the field of temporal-spatial information fusion.
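Dempster's combination rule and Shafer's discounting, the two standard operations this model builds on, can be sketched for mass functions over a small frame of discernment. The frame {A, B} and the reliability factor below are illustrative:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses of intersecting focal elements,
    then renormalize by the non-conflicting mass."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def discount(m, alpha, frame):
    """Shafer's discounting: scale masses by reliability alpha and move
    the remaining 1 - alpha to the whole frame (ignorance)."""
    out = {k: alpha * v for k, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.8, B: 0.2}   # spatial evidence source 1
m2 = {A: 0.6, B: 0.4}   # spatial evidence source 2
fused = dempster_combine(m1, m2)
```

The credibility decay operator in the paper would feed a time-dependent alpha into `discount` before each sequential combination, so stale temporal evidence contributes mostly ignorance.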
Ever more uncertain factors and increasingly complex operation modes in power systems place higher requirements on online transient stability assessment methods. Traditional model-driven methods have clear physical mechanisms and reliable evaluation results but a time-consuming calculation process, while data-driven methods have strong fitting ability and fast calculation speed but evaluation results that lack interpretation. Combining the two kinds of methods is therefore a future development trend for transient stability assessment. In this paper, the rate of change of kinetic energy method is used to calculate transient stability in the model-driven stage, and a support vector machine and an extreme learning machine with different internal principles are used to predict transient stability in the data-driven stage. To quantify the credibility level of the data-driven methods, a credibility index of the output results is proposed, and a switching function controlling whether the rate of change of kinetic energy method is activated is established based on this index. Thus, a new parallel integrated model-driven and data-driven online transient stability assessment method is proposed. The accuracy, efficiency, and adaptability of the proposed method are verified by numerical examples.
The thorium molten salt reactor–liquid fuel (TMSR-LF1) has inherent safety features: both the probability of an accident and its consequences are much lower for the TMSR-LF1 than for traditional reactors. Based on accident analysis, the maximum credible accident and the radioactive source terms of the TMSR-LF1 were first estimated. Then, the total effective dose of the maximum credible accident was calculated. The calculations show that the cover gas flow rate significantly affects the radiation consequences of the maximum credible accident as it changes from 0 to 10 L/min. If no cover gas is flowing, a site-area emergency would be required within 50–73 m of the reactor. With cover gas flowing, only the two lower emergency classes, abnormal notification and alert, would be required within 50 m.
Estimation of seismic hazard for the fast-developing coastal area of Pakistan is carried out using deterministic and probabilistic approaches. On the basis of seismotectonics and geology, eleven faults in five seismic provinces are recognized as potential hazard sources, and the maximum magnitude potential of each source is calculated. Peak ground acceleration (PGA) values at seven coastal cities due to the maximum credible earthquake on the relevant source are also obtained. The cities of Gwadar and Ormara, with acceleration values of 0.21g and 0.25g respectively, fall in the high seismic risk area; the cities of Turbat and Karachi lie in the low seismic risk area, with acceleration values of less than 0.1g. Probabilistic PGA maps with a contour interval of 0.05g for 50- and 100-year return periods with 90% probability of non-exceedance are also compiled.
A new type of vehicle routing problem (VRP), the multiple vehicle routing problem integrated with reverse logistics (MVRPRL), is studied. In this problem each client may require delivery, pickup, or both, and the demands are uncertain: the deliveries of every client are expressed as triangular fuzzy numbers. To describe the MVRPRL, a multi-objective fuzzy programming model based on credibility measure theory is constructed. A simulation-based tabu search algorithm combining inter-route and intra-route neighborhoods with embedded restarts is then designed to solve it. Computational results show that the tabu search algorithm is superior to sweep algorithms and that, compared with handling each on separate routes, combining pickups with deliveries reduces transportation costs by 43%.
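In credibility theory, a triangular fuzzy demand (a, b, c) has closed-form credibility and expected-value formulas, which is what makes a fuzzy programming model of this kind tractable inside a search algorithm. The demand numbers below are hypothetical:

```python
def credibility_leq(x, a, b, c):
    """Cr{xi <= x} for a triangular fuzzy variable (a, b, c):
    the average of the possibility and necessity measures."""
    if x <= a:
        return 0.0
    if x <= b:
        return (x - a) / (2.0 * (b - a))
    if x <= c:
        return (x + c - 2.0 * b) / (2.0 * (c - b))
    return 1.0

def expected_value(a, b, c):
    """Credibilistic expected value of a triangular fuzzy variable."""
    return (a + 2.0 * b + c) / 4.0

# Hypothetical fuzzy delivery demand: at least 8, most likely 10, at most 14
cr_at_mode = credibility_leq(10, 8, 10, 14)  # 0.5 at the most likely value
ev = expected_value(8, 10, 14)
```

A chance-constrained route is then feasible when Cr{demand on the route <= vehicle capacity} meets a chosen confidence level, which the simulation-based tabu search can evaluate cheaply with formulas like these.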
A security architecture using a secret key algorithm and a vertical authentication mode is proposed. Security protocols are established in the chip of a smart key at the network client or mobile phone, and a key exchange protocol is established in the chip of encryption cards at the network key management center. A combined-key real-time generation algorithm is used to solve the key update and management problems. Online and offline authentication and encrypted document transmission protocols are adopted to achieve credible connections between users. Accordingly, a security layer is set up over the Internet that provides convenient encryption to each network user and builds a credible and secure network system.
Assessing geographic variations in health events is one of the major tasks in spatial epidemiologic studies. Geographic variation in a health event can be estimated using the neighborhood-level variance derived from a generalized linear mixed model or a Bayesian spatial hierarchical model. Two novel heterogeneity measures, the median odds ratio and the interquartile odds ratio, have been developed to quantify the magnitude of geographic variation and facilitate data interpretation. However, the statistical significance of geographic heterogeneity measures was inaccurately estimated in previous epidemiologic studies that reported two-sided 95% confidence intervals based on the standard error of the variance, or 95% credible intervals spanning the 2.5th to 97.5th percentiles of the Bayesian posterior distribution. Given the mathematical algorithms of the heterogeneity measures, the statistical significance of geographic variation should be evaluated using a one-tailed P value. Therefore, previous studies using two-tailed 95% confidence intervals based on the standard error of the variance may have underestimated the geographic variation in the events of interest, and those using 95% Bayesian credible intervals may need to re-evaluate the geographic variation of their study outcomes.
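The median odds ratio has a closed form in the neighborhood-level variance, which also shows why its null value is 1 (reached exactly when the variance is 0) and hence why a one-sided test is the natural choice:

```python
import math

Z75 = 0.6744897501960817  # 75th percentile of the standard normal distribution

def median_odds_ratio(area_variance):
    """MOR = exp(sqrt(2 * sigma^2) * Phi^{-1}(0.75)), where sigma^2 is the
    neighborhood-level variance on the log-odds scale. MOR = 1 means no
    geographic variation; it can never fall below 1."""
    return math.exp(math.sqrt(2.0 * area_variance) * Z75)

print(median_odds_ratio(0.0))           # 1.0 (no variation)
print(round(median_odds_ratio(0.25), 3))  # 1.611
```

Because the variance is bounded below by zero, the MOR's sampling distribution is one-sided around its null value, which is the article's argument for a one-tailed P value.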
The velocity of money is an important instrument for measuring the monetary target and the quality of monetary policy. Trends in money velocity, mainly in the short term, have a paramount effect in determining trends in real money growth. This study investigates the main determinants of money velocity in Ethiopia using time series data for the period 1974/75 to 2015/16. A regression with Bayesian estimation and the nonparametric Locally Weighted Scatterplot Smoothing (LOWESS) method were used to analyze the data. Variables such as credit, the real interest rate, the real exchange rate and real per capita income were included as potential determinants of money velocity. The nonparametric LOWESS results show an upward trend in the velocity of money since 2002 and a downward trend before 2002, indicating the existence of prudent monetary policy in Ethiopia after 2002. The results also show a positive effect of the real exchange rate and credit, whereas income per capita and the real interest rate have a negative effect on the velocity of money in Ethiopia. Hence, this paper recommends that policies encouraging sustainable economic growth and an increase in the interest rate would be beneficial in reducing the velocity of money.
CFD verification and validation (V&V) are fundamental activities in the credibility analysis of aerodynamic simulations. These activities generate a large number of data resources, and how to efficiently manage and utilize these treasures is the key problem for a benchmark database. In this paper an operable design for an open benchmark database is studied and proposed, with emphasis on administration, availability, data reliability, unified data standards and an open system architecture. The purpose is to provide a paradigm of an aerodynamic open benchmark database for CFD V&V, and to overcome some universal obstacles in current aerodynamic databases, such as a lack of coordination, continuity and necessary communication. In addition, some recent efforts in credibility analysis for aerodynamic simulations in China are briefly introduced.
Funding (FADSF data sharing study): This work was financially supported by the National Key Research and Development Program of China (2022YFB3103200).
Funding: National Natural Science Foundation of China (Grant Nos. 11972193 and 92266201).
Abstract: How to effectively evaluate the firing precision of weapon equipment at low cost is one of the core problems in improving the test level of a weapon system. A new method to evaluate the firing precision of the MLRS (multiple launch rocket system), considering the credibility of the simulation system and based on Bayesian theory, is proposed in this paper. First, a comprehensive index system for the credibility of the MLRS firing-precision simulation system is constructed in combination with the group analytic hierarchy process. A modified method for determining the comprehensive weights of the indices is established to improve the rationality of the index weight coefficients. The Bayesian posterior estimation formula of firing precision considering prior information is derived in the form of a mixed prior distribution, and the rationality of the prior information used in the estimation model is discussed quantitatively. Through simulation tests, different evaluation methods are compared to validate the effectiveness of the proposed method. Finally, the experimental results show that the effectiveness of the firing-precision estimation method is improved by more than 25%.
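The abstract's mixed-prior Bayesian update can be illustrated in its simplest conjugate form: a normal mean (standing in for a precision parameter) with a two-component mixture prior, where one component encodes simulation-derived prior knowledge and the other is diffuse. All numbers and weights below are illustrative, not taken from the paper:

```python
import math

def normal_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def mixture_posterior_mean(data, prior, noise_var):
    """Posterior mean of a normal mean under a mixture prior.

    prior: list of (weight, prior_mean, prior_var) components.
    noise_var: known observation-noise variance.
    Each component gets a standard conjugate update; component weights are
    re-weighted by the marginal likelihood of the sample mean.
    """
    n = len(data)
    xbar = sum(data) / n
    comps = []
    for w, m, v in prior:
        # Marginal likelihood of xbar under this component: N(m, v + noise_var/n).
        ml = normal_pdf(xbar, m, v + noise_var / n)
        # Conjugate normal-normal update for this component.
        post_var = 1.0 / (1.0 / v + n / noise_var)
        post_mean = post_var * (m / v + n * xbar / noise_var)
        comps.append((w * ml, post_mean))
    total = sum(wl for wl, _ in comps)
    return sum(wl * pm for wl, pm in comps) / total

# Illustrative: a simulation-informed component (weight 0.7) plus a diffuse one.
prior = [(0.7, 1.0, 0.04), (0.3, 0.0, 1.0)]
data = [1.05, 0.98, 1.10, 1.02]
print(round(mixture_posterior_mean(data, prior, noise_var=0.09), 3))
```

The paper's credibility-weighted construction would set the mixture weights from the simulation-system credibility index rather than fixing them by hand as done here.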
Abstract: The credible sets for system parameters and transfer functions are discussed in this paper. The derivations of the credible sets via Bayesian statistical methods are given for various cases, and the credible sets of transfer functions are constructed on the basis of the parameter credible sets. In the small-sample case, the credible set containing the parameters has minimal volume for a given credible probability. In the large-sample case, it is proved that if the critical boundary is chosen properly, then the probability with which the parameters lie in the credible set tends to one as the number of data tends to infinity. The credible set for a transfer function is an envelope on the complex plane. Under some assumptions, the envelope tends to the curve of the true transfer function.
Abstract: The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. The use of a conditional probability satisfying the requirement that the background not exceed the observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
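The basic construction (before conditioning or systematics) can be sketched numerically. Under a flat prior on the signal s ≥ 0 and a known background b, the posterior for s given n observed events is proportional to e^{-(s+b)}(s+b)^n; an equal-tailed credible interval then follows from the posterior CDF. This toy grid integration is a common textbook approach and does not reproduce BPOCI itself:

```python
import math

def poisson_pmf(n, lam):
    return math.exp(-lam) * lam ** n / math.factorial(n)

def signal_credible_interval(n_obs, bkg, cl=0.90, s_max=30.0, steps=30000):
    """Equal-tailed Bayesian credible interval for a Poisson signal s with
    known background bkg, flat prior on s >= 0 (illustrative construction)."""
    ds = s_max / steps
    grid = [i * ds for i in range(steps + 1)]
    # Unnormalized posterior: likelihood of n_obs at total mean s + bkg.
    post = [poisson_pmf(n_obs, s + bkg) for s in grid]
    norm = sum(post) * ds
    cdf, acc = [], 0.0
    for p in post:
        acc += p * ds / norm
        cdf.append(acc)
    lo_q, hi_q = (1 - cl) / 2, 1 - (1 - cl) / 2
    lo = next(grid[i] for i, c in enumerate(cdf) if c >= lo_q)
    hi = next(grid[i] for i, c in enumerate(cdf) if c >= hi_q)
    return lo, hi

lo, hi = signal_credible_interval(n_obs=5, bkg=1.2, cl=0.90)
print(f"90% credible interval for s: [{lo:.2f}, {hi:.2f}]")
```

The conditioning discussed in the abstract would further restrict the likelihood to outcomes with background not exceeding the observed count, reshaping this posterior.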
Abstract: In this study, a blockchain-based federated learning system using an enhanced weighted-mean-vector optimization algorithm, known as EINFO, is proposed. The proposed EINFO addresses the limitations of federated averaging during the global update and model training when data are unevenly distributed among devices and the number of data samples varies. Using a well-defined structure and updating the vector positions by local searching, vector combining and updating rules, the EINFO algorithm optimizes the shared model parameters. To increase the exploration and exploitation capabilities, the model convergence rate is improved and new vectors are generated through a weighted mean vector based on the inverse-square law. To choose validators and miners and to propagate new blocks, a delegated proof of stake based on the reliability of blockchain nodes is suggested. Federated learning is integrated into the blockchain to protect nodes from both external and internal threats. Extensive simulations are run to determine how well the suggested system performs relative to current models in the literature. The simulation results show that the proposed system outperforms existing schemes in terms of accuracy, sensitivity and specificity.
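The reliability-based delegated proof of stake mentioned above amounts to choosing validators with probability proportional to a per-node reliability score. The abstract gives no formula, so the following is only a generic weighted-sampling sketch (function and node names are illustrative):

```python
import random

def select_validators(nodes, k, seed=None):
    """Weighted sampling without replacement by reliability score: an
    illustrative stand-in for reliability-based delegated proof of stake."""
    rng = random.Random(seed)
    pool = dict(nodes)  # node -> reliability score
    chosen = []
    for _ in range(min(k, len(pool))):
        total = sum(pool.values())
        r = rng.uniform(0, total)
        acc = 0.0
        for node, score in pool.items():
            acc += score
            if r <= acc:
                chosen.append(node)
                del pool[node]  # no replacement: each node selected at most once
                break
    return chosen

nodes = {"n1": 0.9, "n2": 0.7, "n3": 0.4, "n4": 0.2}
print(select_validators(nodes, k=2, seed=42))
```

In a full system the scores would be updated from observed node behavior (e.g., complaint counts or block validity), which this sketch leaves out.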
Abstract: Government credibility is an important asset of contemporary national governance, an important criterion for evaluating government legitimacy, and a key factor in measuring the effectiveness of government governance. In recent years, research on government credibility has mostly focused on theories and mechanisms, with little empirical work on the topic. This article applies variable selection models from the field of social statistics to the issue of government credibility, in order to study it empirically and to explore its core influencing factors from a statistical perspective. Specifically, this article uses four regression-based methods and three random-forest-based methods to study the influencing factors of government credibility across the provinces of China, and compares the performance of these seven variable selection methods along different dimensions. The results show that the methods differ in simplicity, accuracy and variable-importance ranking, and thus assign different importance to factors in the study of government credibility. This study provides a methodological reference for variable selection models in social science research, and also offers a multidimensional comparative perspective for analyzing the influencing factors of government credibility.
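As one concrete instance of the regression-based family of variable selection methods mentioned above, a simple correlation-screening step (in the spirit of sure independence screening) ranks predictors by their marginal association with the response. The abstract's seven methods are not reproduced here; this is only a minimal illustration:

```python
import math

def correlation_screening(X, y, top_k):
    """Rank predictors by absolute Pearson correlation with the response and
    keep the top_k column indices (a simple regression-based selection step)."""
    n = len(y)
    ybar = sum(y) / n
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        xbar = sum(col) / n
        cov = sum((a - xbar) * (b - ybar) for a, b in zip(col, y))
        vx = sum((a - xbar) ** 2 for a in col)
        vy = sum((b - ybar) ** 2 for b in y)
        r = cov / math.sqrt(vx * vy) if vx > 0 and vy > 0 else 0.0
        scores.append((abs(r), j))
    return [j for _, j in sorted(scores, reverse=True)[:top_k]]

# Toy data: the response tracks feature 0; feature 1 behaves like noise.
X = [[1, 5], [2, 3], [3, 6], [4, 2], [5, 7]]
y = [1.1, 2.0, 2.9, 4.2, 5.0]
print(correlation_screening(X, y, top_k=1))  # [0]
```

Penalized regressions (e.g., the lasso) and random-forest importances refine this idea by accounting for joint effects among predictors, which marginal screening ignores.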
Funding: supported by the National Natural Science Foundation of China (62033010) and the Aeronautical Science Foundation of China (2019460T5001).
Abstract: Quite often the theoretical model used in Kalman filtering is not sufficiently accurate for practical applications, because the covariances of the noises are not exactly known. Our previous work reveals that in such scenarios the filter-calculated mean square error (FMSE) and the true mean square error (TMSE) become inconsistent, whereas FMSE and TMSE are consistent in a Kalman filter with an accurate model. This can lead to low credibility of the state estimate, regardless of whether a Kalman filter or an adaptive Kalman filter is used. It is therefore important to study this inconsistency, since it is vital to understand the quantitative influence induced by inaccurate models. To this end, the concept of credibility is adopted to discuss the inconsistency problem in this paper. To formulate the degree of credibility, a trust factor is constructed from the FMSE and the TMSE. However, the trust factor cannot be computed directly, since the TMSE is unavailable in practical applications. Based on the definition of the trust factor, its estimation is reduced to online estimation of the TMSE. More importantly, a necessary and sufficient condition is found, which turns out to be the basis for better design of high-performance Kalman filters. Accordingly, beyond trust factor estimation with the Sage-Husa technique (TFE-SHT), three novel trust factor estimation methods are proposed: a direct numerical solving method (TFE-DNS), a particle swarm optimization method (PSO) and an expectation maximization-particle swarm optimization method (EM-PSO). The analysis and simulation results both show that the proposed TFE-DNS is better than TFE-SHT when a single noise covariance is unknown, while the proposed EM-PSO completely outperforms EM and PSO in estimating the credibility degree and the state when both noise covariances must be estimated online.
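The FMSE/TMSE inconsistency is easy to reproduce in a one-dimensional random-walk example. The paper's exact trust-factor formula is not given in the abstract, so the ratio used below is only one natural, assumed formulation (1 when the two errors agree, smaller as they diverge):

```python
import random

def kalman_1d(zs, q_assumed, r_assumed, x0=0.0, p0=1.0):
    """1-D random-walk Kalman filter; returns estimates and the filter's own
    error variances (whose average plays the role of the FMSE)."""
    x, p = x0, p0
    xs, ps = [], []
    for z in zs:
        p = p + q_assumed         # predict
        k = p / (p + r_assumed)   # gain
        x = x + k * (z - x)       # update
        p = (1 - k) * p
        xs.append(x)
        ps.append(p)
    return xs, ps

def trust_factor(fmse, tmse):
    """Assumed illustrative trust factor: min/max ratio of FMSE and TMSE."""
    return min(fmse, tmse) / max(fmse, tmse)

# Simulate a random walk with known true noise covariances.
rng = random.Random(0)
q_true, r_true = 0.01, 1.0
truth, zs, x = [], [], 0.0
for _ in range(2000):
    x += rng.gauss(0, q_true ** 0.5)
    truth.append(x)
    zs.append(x + rng.gauss(0, r_true ** 0.5))

def empirical_mse(est):
    return sum((a - b) ** 2 for a, b in zip(est, truth)) / len(truth)

xs_ok, ps_ok = kalman_1d(zs, q_assumed=q_true, r_assumed=r_true)   # accurate model
xs_bad, ps_bad = kalman_1d(zs, q_assumed=q_true, r_assumed=4.0)    # wrong R
tf_ok = trust_factor(sum(ps_ok) / len(ps_ok), empirical_mse(xs_ok))
tf_bad = trust_factor(sum(ps_bad) / len(ps_bad), empirical_mse(xs_bad))
print(tf_ok > tf_bad)  # the accurate model earns the higher trust factor
```

In practice the TMSE is unknown, which is exactly why the paper develops online estimators (TFE-SHT, TFE-DNS, EM-PSO) for it.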
Abstract: The growth of the Internet and technology has had a significant effect on social interactions. False information has become an important research topic due to the massive amount of misinformed content on social networks. It is very easy for any user to spread misinformation through the media, making misinformation a problem for professionals, organizers and societies. Hence, it is essential to assess the credibility and validity of news articles being shared on social media. The core challenge is to distinguish accurate from false information. Recent studies focus on news article content, such as titles and descriptions, which limits their achievements. There are, however, two commonly agreed-upon classes of features for misinformation: first, the title and text of an article, and second, user engagement. For the news context, we extracted different user engagements with articles, for example, tweets (read-only), retweets, likes and shares. We calculate user credibility and combine it with the article content and the user's context. After combining both feature sets, we used three natural language processing (NLP) feature extraction techniques: Term Frequency-Inverse Document Frequency (TF-IDF), Count-Vectorizer (CV) and Hashing-Vectorizer (HV). We then applied different machine learning classifiers to label misinformation as real or fake: Support Vector Machine (SVM), Naive Bayes (NB), Random Forest (RF), Decision Tree (DT), Gradient Boosting (GB) and K-Nearest Neighbors (KNN). The proposed method has been tested on a real-world dataset, FakeNewsNet, whose repository we refined according to our required features. The dataset contains over 23,000 articles with millions of user engagements. The highest accuracy score is 93.4%, achieved using count-vector features and a random forest classifier. Our findings confirm that the proposed classifier can effectively classify misinformation in social networks.
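The TF-IDF step named in the abstract weights each term by its frequency in a document, discounted by how many documents contain it. Pipelines like this one typically use a library vectorizer (e.g., scikit-learn's); the pure-Python toy below, with smoothed IDF, is only illustrative:

```python
import math
from collections import Counter

def tfidf(docs):
    """Toy TF-IDF: term frequency times smoothed inverse document frequency."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()  # document frequency of each term
    for tokens in tokenized:
        df.update(set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vectors.append({
            term: (count / len(tokens)) * (math.log((1 + n) / (1 + df[term])) + 1)
            for term, count in tf.items()
        })
    return vectors

docs = ["breaking news shocking claim", "official report confirms claim"]
vecs = tfidf(docs)
# "claim" appears in both documents, so its weight is discounted relative to
# terms unique to one document, such as "breaking".
print(vecs[0]["breaking"] > vecs[0]["claim"])  # True
```

The resulting sparse vectors are what the downstream classifiers (SVM, NB, RF, etc.) consume.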
Funding: the National Natural Science Foundation of China (No. 61976080), the Key Project on Research and Practice of Henan University Graduate Education and Teaching Reform (YJSJG2023XJ006), the Key Research and Development Projects of Henan Province (231111212500), and the Henan University Graduate Education Innovation and Quality Improvement Program (SYLKC2023016).
Abstract: In the field of target recognition based on temporal-spatial information fusion, evidence theory has received extensive attention. To achieve accurate and efficient target recognition with evidence theory, an adaptive temporal-spatial information fusion model is proposed. First, an adaptive evaluation and correction mechanism is constructed from the evidence distance and Deng entropy, which realizes credibility discrimination and adaptive correction of spatial evidence. Second, a credibility decay operator is introduced to obtain the dynamic credibility of temporal evidence. Finally, the sequential combination of temporal-spatial evidence is achieved by Shafer's discount criterion and Dempster's combination rule. The simulation results show that the proposed method not only accounts for the dynamic and sequential characteristics of combining temporal-spatial evidence but also has a strong capability for processing conflicting information, providing a new reference for the field of temporal-spatial information fusion.
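Dempster's combination rule, the final step named above, multiplies the masses of every pair of focal elements, keeps the products whose sets intersect, and renormalizes by the non-conflicting mass. A minimal sketch (the sensor masses are illustrative):

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions whose focal elements are frozensets."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Renormalize by the non-conflicting mass (1 - K).
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

A, B = frozenset({"A"}), frozenset({"B"})
AB = frozenset({"A", "B"})
m1 = {A: 0.6, AB: 0.4}   # sensor 1: mostly supports target A
m2 = {B: 0.3, AB: 0.7}   # sensor 2: some support for target B
fused = dempster_combine(m1, m2)
print(round(fused[A], 3))  # 0.512
```

Shafer's discounting, also cited in the abstract, would first scale each source's masses by its credibility (moving the remainder to the full frame) before this combination is applied, which is how the model's dynamic credibility enters the fusion.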
Funding: funded by the Science and Technology Project of State Grid Shanxi Electric Power Co., Ltd. (Project No. 520530200013).
Abstract: Ever more uncertain factors and ever more complex operation modes in power systems place higher requirements on online transient stability assessment methods. Traditional model-driven methods have clear physical mechanisms and reliable evaluation results, but their calculation process is time-consuming; data-driven methods have strong fitting ability and fast calculation speed, but their evaluation results lack interpretability. Combining these two kinds of methods is therefore a future development trend for transient stability assessment. In this paper, the rate of change of kinetic energy method is used to calculate transient stability in the model-driven stage, while a support vector machine and an extreme learning machine, which have different internal principles, are respectively used to predict transient stability in the data-driven stage. To quantify the credibility of the data-driven methods, a credibility index for their output results is proposed, and a switching function that controls whether the rate of change of kinetic energy method is activated is established based on this index. A new parallel integrated model-driven and data-driven online transient stability assessment method is thus obtained. The accuracy, efficiency and adaptability of the proposed method are verified by numerical examples.
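The switching logic described above reduces to: accept the fast data-driven prediction when its credibility index clears a threshold, otherwise activate the slower model-driven calculation. The abstract does not define the index, so the stand-in functions and threshold below are purely illustrative:

```python
def assess(sample, data_driven, model_driven, credibility, threshold=0.8):
    """Parallel-integration sketch: use the data-driven prediction when its
    credibility index is high enough; otherwise fall back to the model-driven
    method. Index, threshold and stand-in models are illustrative."""
    pred = data_driven(sample)
    if credibility(sample) >= threshold:
        return pred
    return model_driven(sample)

# Toy stand-ins for the two stages (True = transiently stable):
stable_md = lambda s: s["margin"] > 0        # model-driven: slow but reliable
stable_dd = lambda s: s["score"] > 0.5       # data-driven: fast prediction
cred = lambda s: abs(s["score"] - 0.5) * 2   # low credibility near the boundary

print(assess({"score": 0.95, "margin": 1.0}, stable_dd, stable_md, cred))   # True
print(assess({"score": 0.55, "margin": -0.2}, stable_dd, stable_md, cred))  # False
```

The design point is that the expensive model-driven computation is triggered only for the small fraction of cases the data-driven model is unsure about.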
Abstract: The thorium molten salt reactor-liquid fuel (TMSR-LF1) has inherent safety features; both the likelihood of accidents and their consequences are much lower for the TMSR-LF1 than for traditional reactors. Based on accident analysis, the maximum credible accident and the radioactive source terms of the TMSR-LF1 were first estimated, and the total effective dose of the maximum credible accident was then calculated. The calculations show that the cover-gas flow rate significantly affects the radiation consequences of the maximum credible accident as it varies from 0 to 10 L/min. If no cover gas is flowing, a site-area emergency would be required within a range of 50-73 m from the reactor. With cover gas flowing, only the two lower emergency classes, abnormal notification and alert, would be required within a range of 50 m.
Abstract: Estimation of the seismic hazard for the fast-developing coastal area of Pakistan is carried out using deterministic and probabilistic approaches. On the basis of seismotectonics and geology, eleven faults in five seismic provinces are recognized as potential hazard sources, and the maximum magnitude potential of each source is calculated. Peak ground acceleration (PGA) values at seven coastal cities due to the maximum credible earthquake on the relevant source are also obtained. The cities of Gwadar and Ormara, with acceleration values of 0.21g and 0.25g respectively, fall in the high-seismic-risk area; the cities of Turbat and Karachi lie in the low-seismic-risk area, with acceleration values of less than 0.1g. Probabilistic PGA maps with a contour interval of 0.05g for 50- and 100-year return periods with 90% probability of non-exceedance are also compiled.
Funding: the National Natural Science Foundation of China (No. 70772059) and the Youth Science and Technology Innovation Foundation of Nanjing Agriculture University (No. KJ06029).
Abstract: A new type of vehicle routing problem (VRP), the multiple vehicle routing problem integrated with reverse logistics (MVRPRL), is studied. In this problem, each client may require delivery, pickup, or both, and the demands are uncertain. The deliveries of every client, as uncertain parameters, are expressed as triangular fuzzy numbers. To describe the MVRPRL, a multi-objective fuzzy programming model based on credibility measure theory is constructed. A simulation-based tabu search algorithm combining inter-route and intra-route neighborhoods with embedded restarts is then designed to solve it. Computational results show that the tabu search algorithm developed is superior to sweep algorithms and that, compared with handling each demand on a separate route, transportation costs can be reduced by 43% by combining pickups with deliveries.
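For a triangular fuzzy demand ξ = (a, b, c), the credibility measure used in such models is the average of the possibility and necessity measures, and the credibility-based expected value is (a + 2b + c)/4. These are standard formulas from credibility theory; the demand numbers below are illustrative:

```python
def credibility_leq(x, a, b, c):
    """Cr{xi <= x} for a triangular fuzzy variable xi = (a, b, c):
    the average of the possibility and necessity of the event xi <= x."""
    if x <= a:
        return 0.0
    if x <= b:
        return (x - a) / (2 * (b - a))
    if x <= c:
        return (x + c - 2 * b) / (2 * (c - b))
    return 1.0

def expected_value(a, b, c):
    """Credibility-based expected value of a triangular fuzzy variable."""
    return (a + 2 * b + c) / 4

# Illustrative fuzzy demand: at least 8, most likely 10, at most 15 units.
print(credibility_leq(10, 8, 10, 15))  # 0.5 at the most likely value
print(expected_value(8, 10, 15))       # 10.75
```

In the simulation-based tabu search, such credibility evaluations are what score candidate routes against chance constraints on the uncertain demands.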
Abstract: A security architecture using a secret-key algorithm and a vertical authentication mode is proposed. Security protocols are established in the chip of a smart key at the network client or mobile phone, and a key exchange protocol is established in the chip of encryption cards at the network key management center. A combined-key real-time generation algorithm is used to solve the key update and management problems. Online or offline authentication and encrypted document transmission protocols are adopted to achieve credible connections between users. Accordingly, a security layer is set up over the Internet, which provides convenient encryption capability to each network user and builds a credible and secure network system.
Abstract: Assessing geographic variation in health events is one of the major tasks in spatial epidemiologic studies. Geographic variation in a health event can be estimated from the neighborhood-level variance derived from a generalized linear mixed model or a Bayesian spatial hierarchical model. Two novel heterogeneity measures, the median odds ratio and the interquartile odds ratio, have been developed to quantify the magnitude of geographic variation and facilitate data interpretation. However, the statistical significance of these geographic heterogeneity measures was inaccurately estimated in previous epidemiologic studies, which reported two-sided 95% confidence intervals based on the standard error of the variance, or 95% credible intervals spanning the 2.5th to 97.5th percentiles of the Bayesian posterior distribution. Given the mathematical form of the heterogeneity measures, the statistical significance of geographic variation should be evaluated using a one-tailed P value. Previous studies using two-tailed 95% confidence intervals based on the standard error of the variance may therefore have underestimated the geographic variation in the events of interest, and those using 95% Bayesian credible intervals may need to re-evaluate the geographic variation of their study outcomes.
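The median odds ratio referenced above has a closed form, MOR = exp(√(2σ²) · z₀.₇₅), where σ² is the area-level variance on the log-odds scale and z₀.₇₅ ≈ 0.6745 is the 75th percentile of the standard normal. The one-tailed test below is a generic z-test sketch consistent with the abstract's recommendation, not the authors' exact procedure; the variance values are illustrative:

```python
import math
from statistics import NormalDist

def median_odds_ratio(area_variance):
    """MOR = exp(sqrt(2 * sigma^2) * z_0.75): the median of the odds ratios
    between two randomly chosen areas, higher- vs lower-risk."""
    z75 = NormalDist().inv_cdf(0.75)  # ~0.6745
    return math.exp(math.sqrt(2 * area_variance) * z75)

def one_tailed_significant(variance, se, alpha=0.05):
    """One-tailed z-test of variance > 0 (variances cannot be negative,
    which is why a two-sided interval is the wrong tool here)."""
    p_one_tailed = 1 - NormalDist().cdf(variance / se)
    return p_one_tailed < alpha

var = 0.25  # illustrative neighborhood-level variance on the log-odds scale
print(round(median_odds_ratio(var), 3))  # ~1.611
print(one_tailed_significant(variance=var, se=0.10))
```

Because MOR(0) = 1 and the variance is bounded below by zero, all sampling evidence against "no geographic variation" lies in one tail, which is the abstract's central point.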
Abstract: The velocity of money is an important instrument used to set monetary targets and gauge the quality of monetary policy. Trends in money velocity, particularly in the short term, have a paramount effect in determining trends in real money growth. This study investigates the main determinants of money velocity in Ethiopia using time series data for the period 1974/75 to 2015/16. Regression with Bayesian estimation and the nonparametric locally weighted scatterplot smoothing (LOWESS) method were used to analyze the data. Variables such as credit, the real interest rate, the real exchange rate and real per capita income were included as potential determinants of money velocity. The nonparametric LOWESS results show an upward trend in the velocity of money since 2002 and a downward trend before 2002, indicating the existence of prudent monetary policy in Ethiopia after 2002. The results also show positive effects of the real exchange rate and credit, whereas per capita income and the real interest rate have negative effects on the velocity of money in Ethiopia. Hence, this paper recommends that policies encouraging sustainable economic growth and an increase in the interest rate would be beneficial in reducing the velocity of money.
Abstract: CFD verification and validation (V&V) are fundamental activities in credibility analysis for aerodynamic simulations, and they generate a large volume of data resources. How to efficiently manage and utilize these resources is the key problem for a benchmark database. In this paper an operable design for an open benchmark database is studied and proposed, with emphasis on administration, availability, data reliability, unified data standards and an open system architecture. The purpose is to provide a paradigm of an open aerodynamic benchmark database for CFD V&V and to overcome some universal obstacles in current aerodynamic databases, such as lack of coordination, continuity and necessary communication. In addition, some recent efforts in credibility analysis for aerodynamic simulations in China are briefly introduced.