Ordering-based search methods have efficiency advantages over graph-based search methods for structure learning of Bayesian networks. With the aim of further increasing the accuracy of ordering-based search methods, we first propose to enlarge the search space, which facilitates escaping from local optima. We present our search operators with majorizations, which are easy to implement. Experiments show that the proposed algorithm obtains significantly more accurate results. To address the loss of efficiency caused by the enlarged search space, we then propose to add path priors as constraints to the swap process. We analyze the coefficient that may influence the performance of the proposed algorithm; experiments show that the constraints greatly improve efficiency while having little effect on accuracy. The final experiments show that, compared with other competitive methods, the proposed algorithm finds better solutions while maintaining high efficiency on both synthetic and real data sets.
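As a hedged illustration of the ordering-based idea (not the paper's exact operators), the sketch below hill-climbs over variable orderings with adjacent-swap moves; `score_fn` is a placeholder for a decomposable network score such as BDeu:

```python
def score_order(order, score_fn):
    """Score an ordering by summing per-node scores, where each node may
    only take parents that precede it in the ordering. `score_fn` is a
    placeholder for a decomposable network score (e.g. BDeu)."""
    return sum(score_fn(node, order[:i]) for i, node in enumerate(order))

def swap_search(order, score_fn, max_iters=100):
    """Greedy hill climbing over orderings using adjacent-swap moves."""
    order = list(order)
    best = score_order(order, score_fn)
    for _ in range(max_iters):
        improved = False
        for i in range(len(order) - 1):
            order[i], order[i + 1] = order[i + 1], order[i]  # try a swap
            s = score_order(order, score_fn)
            if s > best:
                best, improved = s, True  # keep the improving swap
            else:
                order[i], order[i + 1] = order[i + 1], order[i]  # undo
        if not improved:
            break  # local optimum under the swap neighborhood
    return order, best
```

The swap neighborhood is what the constraints in the abstract would prune: a path prior can veto swaps that contradict known ancestral relations before the score is evaluated.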
AIM To develop a framework to incorporate background domain knowledge into classification rule learning for knowledge discovery in biomedicine.
METHODS Bayesian rule learning (BRL) is a rule-based classifier that uses a greedy best-first search over a space of Bayesian belief networks (BNs) to find the optimal BN to explain the input dataset, and then infers classification rules from this BN. BRL uses a Bayesian score to evaluate the quality of BNs. In this paper, we extend the Bayesian score to include informative structure priors, which encode our prior domain knowledge about the dataset. We call this extension of BRL BRL_p. The structure prior has a λ hyperparameter that allows the user to tune the degree to which prior knowledge is incorporated in the model learning process. We studied the effect of λ on model learning using a simulated dataset and a real-world lung cancer prognostic biomarker dataset, measuring the degree of incorporation of our specified prior knowledge. We also monitored its effect on model predictive performance. Finally, we compared BRL_p to other state-of-the-art classifiers commonly used in biomedicine.
RESULTS We evaluated the degree of incorporation of prior knowledge into BRL_p on simulated data by measuring the graph edit distance between the true data-generating model and the model learned by BRL_p. We specified the true model using informative structure priors. We observed that increasing the value of λ increased the influence of the specified structure priors on model learning; a large value of λ caused BRL_p to return the true model. This also led to a gain in predictive performance measured by area under the receiver operating characteristic curve (AUC). We then obtained a publicly available real-world lung cancer prognostic biomarker dataset and specified a known biomarker from the literature [the epidermal growth factor receptor (EGFR) gene]. We again observed that larger values of λ led to an increased incorporation of EGFR into the final BRL_p model. This relevant background knowledge also led to a gain in AUC.
CONCLUSION BRL_p enables tunable structure priors to be incorporated during Bayesian classification rule learning, integrating data and knowledge as demonstrated on lung cancer biomarker data.
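The λ-weighted structure prior can be illustrated with a minimal sketch; the log-linear form below is an assumption for illustration, not the exact BRL_p prior from the paper:

```python
def log_structure_prior(edges, prior_edges, lam):
    """Log of an assumed log-linear structure prior: each network edge that
    matches the user's prior knowledge earns a reward of lam. `edges` and
    `prior_edges` are sets of (parent, child) pairs. Illustrative only;
    the exact BRL_p prior form is defined in the paper."""
    return lam * len(edges & prior_edges)  # lam = 0 ignores the prior

def posterior_score(data_log_score, edges, prior_edges, lam):
    """Combine a placeholder data log-score with the log structure prior."""
    return data_log_score + log_structure_prior(edges, prior_edges, lam)
```

Under this form, λ = 0 recovers plain data-driven scoring, and increasing λ increasingly favors structures containing the specified edges, matching the tuning behavior described in the abstract.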
The delayed S-shaped software reliability growth model (SRGM) is one of the non-homogeneous Poisson process (NHPP) models proposed for software reliability assessment. The model is distinctive because its mean value function reflects the delay in failure reporting: there is a delay between failure detection and reporting. The model captures error detection, isolation, and removal processes, and is thus appropriate for software reliability analysis. Predictive analysis in software testing is useful for modifying, debugging, and deciding when to terminate software development testing. However, Bayesian predictive analyses of the delayed S-shaped model have not been extensively explored. This paper uses the delayed S-shaped SRGM to address four issues in one-sample prediction associated with the software development testing process. A Bayesian approach based on non-informative priors is used to derive explicit solutions for the four issues, and the developed methodologies are illustrated using real data.
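For reference, the commonly used form of the delayed S-shaped mean value function is m(t) = a(1 − (1 + bt)e^{−bt}), with a the expected total number of faults and b the fault detection rate; it and its intensity can be sketched as:

```python
import math

def mean_value(t, a, b):
    """Mean value function of the delayed S-shaped SRGM:
    m(t) = a * (1 - (1 + b*t) * exp(-b*t)).
    a: expected total number of faults; b: fault detection rate."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

def intensity(t, a, b):
    """Failure intensity lambda(t) = dm/dt = a * b**2 * t * exp(-b*t),
    which rises then falls, producing the S-shaped cumulative curve."""
    return a * b * b * t * math.exp(-b * t)
```

Note that m(0) = 0 and m(t) → a as t grows, consistent with all a faults eventually being reported.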
The original intention of visual question answering (VQA) models is to infer the answer from the information in the visual image relevant to the question text, but many VQA models yield answers that are biased by prior knowledge, especially language priors. This paper proposes a mitigation model, called language priors mitigation-VQA (LPM-VQA), for the language priors problem in VQA models, which divides language priors into positive and negative language priors. Different network branches capture and process the different priors to mitigate their effects. A dynamically changing language prior feedback objective function is designed using the intermediate results of some modules in the VQA model. The weight of the loss value for each answer is set dynamically according to the strength of its language priors, balancing its proportion in the total VQA loss and further mitigating language priors. The model does not depend on the baseline VQA architecture and can be configured like a plug-in to improve performance over most existing VQA models. Experimental results show that the proposed model is general and effective, achieving state-of-the-art accuracy on the VQA-CP v2 dataset.
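One simple way to realize the per-answer loss reweighting described above is sketched below; the 1/(1 + strength) weighting is an illustrative choice, not the paper's exact schedule:

```python
def weighted_vqa_loss(answer_log_probs, prior_strength):
    """Hedged sketch: down-weight each answer's negative log-likelihood by
    the strength of its language prior, so strongly prior-driven answers
    contribute less to the total loss. Both arguments map answer -> value;
    the weight w = 1 / (1 + strength) is illustrative."""
    loss = 0.0
    for ans, logp in answer_log_probs.items():
        w = 1.0 / (1.0 + prior_strength.get(ans, 0.0))
        loss += -w * logp
    return loss
```

Answers with no measured prior strength keep weight 1, so the scheme reduces to the ordinary loss when no priors are detected.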
Based on the concept of admissibility in statistics, a definition of generalized admissibility of Bayes estimates under inaccurate priors is first given for application in surveying adjustment. According to this definition, the generalized admissibility of the normal linear Bayes estimate with inaccurate prior information that contains deviations or model errors is studied, as well as how to eliminate the effect of the model error on the Bayes estimate in surveying adjustment. The results show that if the prior information is inaccurate, that is, it contains model error, generalized admissibility can determine whether the Bayes estimate can be accepted. For the linear normal case, the Bayes estimate can be made generalized admissible by assigning a smaller prior weight when the prior information is inaccurate. Finally, an example is given.
The smoothness prior approach for spectral smoothing is investigated using Fourier frequency filter analysis. We show that the regularization parameter in penalized least squares continuously controls the bandwidth of the implied low-pass filter. Moreover, because the approach interpolates missing values automatically and smoothly, a spectral baseline correction algorithm based on it is proposed. The algorithm comprises spectral peak detection and baseline estimation. First, the spectral peak regions are detected and identified from the second derivatives. Then, the generalized smoothness prior approach, combined with the identification information, estimates the baseline in the peak regions. Results on both simulated and real spectra show that the method yields accurate baseline-corrected signals.
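The penalized least squares smoother behind the smoothness prior approach (often called a Whittaker smoother) solves z = argmin ‖y − z‖² + λ‖Dz‖², i.e. (I + λDᵀD)z = y. A pure-Python sketch with a second-difference penalty and a small dense solver (both illustrative choices; real spectra call for banded solvers):

```python
def second_diff_matrix(n):
    """(n-2) x n second-difference operator D with rows [1, -2, 1]."""
    D = [[0.0] * n for _ in range(n - 2)]
    for i in range(n - 2):
        D[i][i], D[i][i + 1], D[i][i + 2] = 1.0, -2.0, 1.0
    return D

def whittaker_smooth(y, lam):
    """Penalized least squares smoother: solves (I + lam * D^T D) z = y.
    Larger lam lowers the cutoff of the implied low-pass filter."""
    n = len(y)
    D = second_diff_matrix(n)
    # Build A = I + lam * D^T D as a dense matrix (fine for small n)
    A = [[(1.0 if i == j else 0.0) for j in range(n)] for i in range(n)]
    for i in range(n):
        for j in range(n):
            A[i][j] += lam * sum(row[i] * row[j] for row in D)
    # Solve A z = y by Gaussian elimination with partial pivoting
    M = [A[i][:] + [float(y[i])] for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    z = [0.0] * n
    for r in range(n - 1, -1, -1):
        z[r] = (M[r][n] - sum(M[r][k] * z[k] for k in range(r + 1, n))) / M[r][r]
    return z
```

Because the second differences of a straight line are zero, linear trends pass through unchanged at any λ, which is the property that lets the smoother interpolate gaps gracefully.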
The isothermal transformation (TTT) behavior of low carbon steels with two Si contents (0.50 wt pct and 1.35 wt pct) was investigated with and without prior deformation. The results show that Si and prior deformation of the austenite have significant effects on the ferrite and bainite transformations. The addition of Si refines the ferrite grains and accelerates the polygonal ferrite transformation and the formation of M/A constituents, leading to improved strength. The ferrite grains formed under prior deformation of the austenite are more homogeneous and refined. However, the influence of deformation on the tensile strength of both steels depends on the isothermal temperature. Thermodynamic calculation indicates that Si and prior deformation reduce the incubation time of both ferrite and bainite transformation, but the effect is weakened as the isothermal temperature decreases.
Focusing on the degradation of foggy images, a restoration approach from a single image based on the spatial correlation of the dark channel prior is proposed. First, the transmission of each pixel is estimated from the spatial correlation of the dark channel prior. Second, a degradation model is used to restore the foggy image. Third, the final recovered image, with enhanced contrast, is obtained by a post-processing technique based on just-noticeable difference. Experimental results demonstrate that the information in a foggy image can be recovered well by the proposed method, even for scenes with abrupt depth changes.
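The standard dark channel prior transmission estimate is t = 1 − ω · dark(I/A). A minimal sketch follows; the patch size, ω = 0.95, and the list-of-tuples image layout are illustrative choices, and the paper's spatial-correlation refinement is not reproduced here:

```python
def dark_channel(img, patch=3):
    """Dark channel: per-pixel min over RGB, then min over a local patch.
    `img` is a list of rows of (r, g, b) tuples with values in [0, 1]."""
    h, w = len(img), len(img[0])
    per_pixel = [[min(img[i][j]) for j in range(w)] for i in range(h)]
    r = patch // 2
    dark = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            dark[i][j] = min(per_pixel[y][x]
                             for y in range(max(0, i - r), min(h, i + r + 1))
                             for x in range(max(0, j - r), min(w, j + r + 1)))
    return dark

def transmission(img, airlight, omega=0.95, patch=3):
    """Transmission estimate t = 1 - omega * dark_channel(I / A).
    `airlight` is the (r, g, b) atmospheric light; omega < 1 keeps a
    trace of haze so distant objects still look natural."""
    h, w = len(img), len(img[0])
    norm = [[tuple(c / a for c, a in zip(img[i][j], airlight))
             for j in range(w)] for i in range(h)]
    dark = dark_channel(norm, patch)
    return [[1.0 - omega * dark[i][j] for j in range(w)] for i in range(h)]
```

Haze-free regions have a dark channel near zero and thus transmission near one, while fully hazy regions (pixels at the airlight color) get the minimum transmission 1 − ω.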
Yin [1] developed a new Bayesian measure of evidence for testing a point null hypothesis that agrees with the frequentist p-value, thereby solving Lindley's paradox. Yin and Li [2] extended the methodology of Yin [1] to the Behrens-Fisher problem by assigning Jeffreys' independent prior to the nuisance parameters. In this paper, we show both analytically and through simulation studies that the methodology of Yin [1] simultaneously solves the Behrens-Fisher problem and Lindley's paradox when a Gamma prior is assigned to the nuisance parameters.
Blurred image restoration can dramatically highlight image details and enhance global contrast, which benefits the visual effect in practical applications. This paper builds on the dark channel prior principle and targets the degradation of blurred images for which prior information is absent. Several improvements are made to the estimation of the transmission map of blurred images. Since the dark channel prior principle can effectively restore a blurred image only at the cost of a large amount of computation, total variation (TV) and image morphology transforms (specifically the top-hat and bottom-hat transforms) are introduced into the improved method. Compared with the original transmission map estimation methods, the proposed method is both simple and accurate. The estimated transmission map, together with the estimated atmospheric component, restores the image. Simulation results show that the method suppresses the ill-posed problem during image restoration while greatly improving image quality and definition.
In many practical image segmentation problems, employing prior information can greatly improve segmentation results. This paper continues the study of one kind of prior information, called the prior distribution. In this setting there is no exact template of the object; only several samples are given. The proposed method, called the parametric distribution prior model, extends our previous model by adding a training procedure to learn the prior distribution of the objects. The paper then establishes the energy function of the active contour model (ACM) taking this parametric form of the prior distribution into account. Therefore, during segmentation, the template can update itself while the contour evolves. Experiments are performed on an airplane data set. The results demonstrate that, with the prior distribution information, both the segmentation quality and speed can be improved effectively.
The likelihood function plays a central role in statistical analysis in relation to information, from both frequentist and Bayesian perspectives. Several new large-sample properties of the likelihood in relation to information are developed here. The Arrow-Pratt absolute risk aversion measure is shown to be related to the Cramér-Rao information bound. The derivative of the log-likelihood function provides a measure of information-related stability for the Bayesian posterior density. In addition, information-similar prior densities can be defined, reflecting the central role of the likelihood in the Bayes learning paradigm.
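For reference, the standard quantities the abstract connects are the following (textbook definitions, not the paper's derivations):

```latex
% Fisher information of theta, with log-likelihood \ell(\theta) = \log L(\theta; X)
I(\theta) = \mathbb{E}\!\left[\bigl(\ell'(\theta)\bigr)^{2}\right]
          = -\,\mathbb{E}\!\left[\ell''(\theta)\right]

% Cramer-Rao bound for an unbiased estimator \hat\theta from n i.i.d. observations
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{n\,I(\theta)}

% Arrow-Pratt absolute risk aversion of a utility function u
A(x) = -\,\frac{u''(x)}{u'(x)}
```

The formal parallel is visible already in the definitions: applying the Arrow-Pratt ratio −f''/f' to a log-density involves the same curvature quantity ℓ'' that defines the Fisher information.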
A low carbon steel was used to determine the critical strain εc for completion of deformation enhanced ferrite transformation (DEFT) through a series of hot compression tests. In addition, the influence of prior austenite grain size (PAGS) on the critical strain was systematically investigated. Experimental results showed that the critical strain is affected by the PAGS: when the γ→α transformation completes, the smaller the PAGS, the smaller the critical strain. The ferrite grains obtained through DEFT can be refined to about 3 μm when DEFT is completed.
When the one-dimensional luminance scalar of every pixel of a monochrome image is replaced by a multi-dimensional color vector, the process is called colorization. However, colorization is under-constrained, so prior knowledge must be supplied to the monochrome image. Colorization using an optimization algorithm is effective for this problem, but it cannot handle some images well without repeated experiments to confirm the placement of scribbles. In this paper, a colorization algorithm is proposed that automatically generates the prior knowledge. The idea is that, first, the prior knowledge is condensed into points that are automatically extracted by a downsampling and upsampling method. These prior-knowledge points are then classified and assigned corresponding colors. Finally, the color image is obtained from the colored prior-knowledge points. Experiments demonstrate that the proposal not only effectively generates the prior knowledge but also colorizes the monochrome image according to user requirements.
A general method for assessing the local influence of minor perturbations of the prior in Bayesian analysis is developed in this paper. Using some elementary ideas from differential geometry, we provide a unified approach for handling a variety of problems of local prior influence. As applications, we discuss the local influence of small perturbations of the normal-gamma prior density in the linear model and investigate local prior influence from the predictive view.
In this paper, a novel magnetic resonance (MR) reconstruction framework that combines image-wise and patch-wise sparse priors is proposed. First, a truncated beta-Bernoulli process is employed to enforce sparsity on overlapping image patches, emphasizing local structures. Owing to its properties, the beta-Bernoulli process can adaptively infer the sparsity (number of non-zero coefficients) of each patch, an appropriate dictionary, and the noise variance simultaneously, which are prerequisites for iterative image reconstruction. Second, a generalized Gaussian distribution (GGD) prior is introduced to impose image-wise sparsity on the wavelet coefficients, which can then be estimated by a threshold denoising algorithm. Finally, the MR image is reconstructed from the patch-wise estimate, the image-wise estimate, and the under-sampled k-space data with least squares data fitting. Experimental results demonstrate that the proposed approach exhibits excellent reconstruction performance. Moreover, if the image is full of similar low-dimensional structures, the proposed algorithm improves the peak signal-to-noise ratio (PSNR) dramatically, by 7~9 dB, compared with other state-of-the-art compressive sampling methods.
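The threshold denoising step for the wavelet coefficients can be illustrated with the standard soft-thresholding operator; this is a common realization of such a step, and the paper's GGD-derived threshold may differ:

```python
def soft_threshold(x, t):
    """Soft-thresholding operator used in wavelet-domain denoising:
    shrinks a coefficient toward zero by t and zeroes those below t,
    which enforces sparsity on the wavelet representation."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def denoise_coeffs(coeffs, t):
    """Apply soft thresholding to a list of wavelet coefficients."""
    return [soft_threshold(c, t) for c in coeffs]
```

Small (noise-dominated) coefficients are removed entirely while large (structure-carrying) ones survive with a bias of t, which is why thresholding acts as a sparsity-inducing estimator.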
This study was conducted to examine the effect of feed restriction prior to slaughter on the carcass weight of male broiler chicks from 32 to 40 days of age. A total of 180 (pure line) male broiler chicks were taken randomly, labeled, and divided into six groups. At 32 days of age, the groups were placed on the experimental feeding program. Group A was fed ad libitum (control), while groups B and C were fed 120 and 60 gm/bird/day, respectively, for eight days. Groups D and E were fed 120 and 60 gm/bird/day, respectively, for four days, followed by zero feeding for an additional 4 days. Group F was deprived of feed during the whole experimental period (8 days). The experimental diet was formulated to be approximately iso-caloric and iso-nitrogenous, containing sorghum, groundnut cake, broiler concentrate, calcium, salt, lysine, methionine, and premix. The parameters measured were live body weight, feed intake, mortality, carcass, and non-carcass values. The effect of the feed restriction program was not significant during the period from 32 to 34 days of age for final live body weight, carcass weight, and dressing percentage, but net weight (gain or loss) was affected by the feed restriction program, with a significant difference (P < 0.01) between experimental groups. From 32 to 36 days of age, male broilers subjected to feed restriction regimes showed the lowest final live body weight, net weight (gain or loss), and carcass weight, and the differences between experimental groups in dressing percentage were significant (P < 0.05) over this period. For the periods from 32 to 38 days and from 32 to 40 days of age, all parameters were significantly affected by the feed restriction program. It was concluded that the carcass weight of broiler chickens can be controlled using different feed restriction programs according to the needs of the market and the producer's situation, with special consideration to the economic return.
Due to frequent occlusion, cluttering, and low-contrast edges, gray-intensity-based active contour models often fail to segment meaningful objects. Prior shape information is usually utilized to segment desirable objects. A parametric shape prior model is proposed. First, principal component analysis is employed to train the object shape, and a transformation is added to the shape representation. Then the energy function is constructed by combining the shape prior energy, the gray intensity energy, and the shape constraint energy of the kernel density function. The object boundary extraction process is thus converted into solving for the parameters of the object shape. In addition, two new shape prior energy functions are defined for cases where desirable objects are occluded by other objects or some parts of them are missing. Finally, an alternating descent iteration scheme is proposed for the numerical implementation. Experiments on synthetic and real images demonstrate the robustness and accuracy of the proposed method.
Funding: Supported by the National Natural Science Foundation of China (61573285) and the Doctoral Foundation of China (2013ZC53037).
Funding: Supported by the National Institute of General Medical Sciences of the National Institutes of Health, No. R01GM100387.
Funding: Supported by the National Basic Research Program of China (61178072).
Funding: The authors thank the Baoshan Iron and Steel Group for financial support.
Funding: Supported by the Twelfth Five-Year Civil Aerospace Technologies Pre-Research Program (D040201).
Abstract: Yin [1] developed a new Bayesian measure of evidence for testing a point null hypothesis that agrees with the frequentist p-value, thereby solving Lindley's paradox. Yin and Li [2] extended the methodology of Yin [1] to the Behrens-Fisher problem by assigning Jeffreys' independent prior to the nuisance parameters. In this paper, we show both analytically and through simulation studies that the methodology of Yin [1] simultaneously solves the Behrens-Fisher problem and Lindley's paradox when a Gamma prior is assigned to the nuisance parameters.
Funding: supported by the National Natural Science Foundation of China (61301095), the Chinese University Scientific Fund (HEUCF130807), and the Chinese Defense Advanced Research Program of Science and Technology (10J3.1.6).
Abstract: Blurred-image restoration can dramatically highlight image details and enhance global contrast, which benefits the visual effect in practical applications. This paper builds on the dark channel prior principle and addresses blurred-image degradation when prior information is absent. Several improvements are made to the estimation of the transmission map of blurred images. Since the dark channel prior can effectively restore a blurred image only at the cost of a large amount of computation, total variation (TV) and image morphology transforms (specifically the top-hat and bottom-hat transforms) are introduced into the improved method. Compared with the original transmission-map estimation methods, the proposed method is both simple and accurate. The estimated transmission map, together with the other terms of the degradation model, restores the image. Simulation results show that this method inhibits the ill-posed problem during image restoration while greatly improving image quality and definition.
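The morphology transforms mentioned above have compact definitions; this sketch uses SciPy's grey-scale morphology, an assumption about tooling since the paper does not specify an implementation:

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def top_hat(img, size=9):
    """White top-hat: image minus its grey opening, keeping small bright details."""
    return img - grey_opening(img, size=size)

def bottom_hat(img, size=9):
    """Black bottom-hat: grey closing minus the image, keeping small dark details."""
    return grey_closing(img, size=size) - img
```

Both transforms isolate structures smaller than the structuring element, which is what makes them cheap substitutes for heavier per-pixel computation in transmission-map refinement.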
Funding: supported by the National Key R&D Program of China (2018YFC0309400) and the National Natural Science Foundation of China (61871188).
Abstract: In many practical image segmentation problems, employing prior information can greatly improve segmentation results. This paper continues the study of one kind of prior information, the prior distribution. In this setting there is no exact template of the object; only several samples are given. The proposed method, called the parametric distribution prior model, extends our previous model by adding a training procedure that learns the prior distribution of the objects. The paper then establishes the energy function of the active contour model (ACM) taking this parametric form of the prior distribution into account. During segmentation, the template can therefore update itself while the contour evolves. Experiments on the airplane data set demonstrate that, with prior-distribution information, both segmentation quality and speed are improved effectively.
Abstract: The likelihood function plays a central role in statistical analysis in relation to information, from both frequentist and Bayesian perspectives. Several new large-sample properties of the likelihood in relation to information are developed here. The Arrow-Pratt absolute risk aversion measure is shown to be related to the Cramér-Rao information bound. The derivative of the log-likelihood function provides a measure of information-related stability for the Bayesian posterior density. As well, information-similar prior densities can be defined, reflecting the central role of the likelihood in the Bayes learning paradigm.
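For reference, the standard definitions these results build on (textbook material, not specific to this paper) are the score, the Fisher information, and the Cramér-Rao bound for an unbiased estimator:

```latex
U(\theta) = \frac{\partial}{\partial\theta}\,\ell(\theta; x), \qquad
I(\theta) = \mathbb{E}\bigl[U(\theta)^2\bigr]
          = -\,\mathbb{E}\!\left[\frac{\partial^2}{\partial\theta^2}\,\ell(\theta; x)\right], \qquad
\operatorname{Var}\bigl(\hat{\theta}\bigr) \ge \frac{1}{I(\theta)},
```

where \(\ell(\theta; x) = \log L(\theta; x)\) is the log-likelihood.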
Funding: this work was financially supported by the National Science and Technology Ministry through the research project "Advanced industrialization technique of manufacture for carbon steel of 500 MPa grade" (No. 2001AA332020).
Abstract: A low-carbon steel was used to determine the critical strain εc for completion of deformation-enhanced ferrite transformation (DEFT) through a series of hot compression tests. In addition, the influence of the prior austenite grain size (PAGS) on the critical strain was systematically investigated. Experimental results showed that the critical strain is affected by the PAGS: for the γ→α transformation to complete, the smaller the PAGS, the smaller the critical strain. The ferrite grains obtained through DEFT can be refined to about 3 μm when DEFT is completed.
Abstract: When the one-dimensional luminance scalar of every pixel of a monochrome image is replaced by a multi-dimensional color vector, the process is called colorization. Colorization is, however, under-constrained, so prior knowledge must be supplied for the monochrome image. Colorization by optimization is an effective algorithm for this problem, but it cannot handle some images well without repeated experiments to confirm the placement of the scribbles. In this paper, a colorization algorithm that automatically generates the prior knowledge is proposed. The idea is that, first, the prior knowledge is crystallized into prior points that are extracted automatically by a downsampling-and-upsampling method. These prior points are then classified and assigned the corresponding colors. Finally, the color image is obtained from the colored prior points. Experiments demonstrate that the proposal not only generates the prior knowledge effectively but also colorizes the monochrome image according to the user's requirements.
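A minimal sketch of how the downsample-upsample extraction of prior points might work; the paper gives no implementation details, so the nearest-neighbour resampling, the tolerance test, and all names here are assumptions:

```python
import numpy as np

def prior_points(gray, factor=4, tol=2.0):
    """Keep pixels whose value survives a down/up-sampling round trip;
    these act as automatically generated 'scribble' points."""
    h, w = gray.shape
    small = gray[::factor, ::factor]                                  # decimate
    up = np.repeat(np.repeat(small, factor, axis=0), factor, axis=1)[:h, :w]
    stable = np.abs(gray.astype(float) - up.astype(float)) <= tol
    ys, xs = np.nonzero(stable)
    return list(zip(ys.tolist(), xs.tolist()))
```

Pixels in smooth regions survive the round trip and become color seeds, while pixels near edges are rejected, which mirrors where hand-placed scribbles would normally go.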
Abstract: A general method for assessing the local influence of minor perturbations of the prior in Bayesian analysis is developed in this paper. Using some elementary ideas from differential geometry, we provide a unified approach for handling a variety of problems of local prior influence. As applications, we discuss the local influence of small perturbations of the normal-gamma prior density in the linear model and investigate local prior influence from the predictive view.
Funding: supported by the National Natural Science Foundation of China (Nos. 30900328, 61172179), the Fundamental Research Funds for the Central Universities (No. 2011121051), and the Natural Science Foundation of Fujian Province of China (No. 2012J05160).
Abstract: In this paper, a novel magnetic resonance (MR) reconstruction framework combining image-wise and patch-wise sparse priors is proposed. First, a truncated beta-Bernoulli process is employed to enforce sparsity on overlapping image patches, emphasizing local structures. Owing to its properties, the beta-Bernoulli process can adaptively infer the sparsity (number of non-zero coefficients) of each patch, an appropriate dictionary, and the noise variance simultaneously, all of which are prerequisites for iterative image reconstruction. Second, a generalized Gaussian distribution (GGD) prior is introduced to impose image-wise sparsity on the wavelet coefficients, which can then be estimated by a threshold denoising algorithm. Finally, the MR image is reconstructed from the patch-wise estimate, the image-wise estimate, and the under-sampled k-space data by least-squares data fitting. Experimental results demonstrate that the proposed approach exhibits excellent reconstruction performance. Moreover, for images full of similar low-dimensional structures, the proposed algorithm improves the peak signal-to-noise ratio (PSNR) by 7-9 dB compared with other state-of-the-art compressive sampling methods.
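The image-wise threshold-denoising step reduces, in its simplest form, to soft-thresholding the wavelet coefficients; the operator below is the standard one, with the threshold value and its link to the GGD parameters left abstract:

```python
import numpy as np

def soft_threshold(coeffs, tau):
    """Shrink coefficients toward zero by tau; the MAP estimate under a
    Laplacian-type prior (GGD shape parameter 1) with Gaussian noise."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - tau, 0.0)
```

Applied to the wavelet coefficients of the current iterate, this enforces the image-wise sparsity while the beta-Bernoulli model handles the patch-wise side.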
Abstract: The present study investigated the effect of feed restriction prior to slaughter on the carcass weight of male broiler chicks from 32 to 40 days of age. A total of 180 pure-line male broiler chicks were taken randomly, labeled, and divided into six groups. At 32 days of age, the experimental groups were put on the experimental feeding program. Group A was fed ad libitum (control), while groups B and C were fed 120 and 60 g/bird/day, respectively, for eight days. Groups D and E were fed 120 and 60 g/bird/day, respectively, for four days, followed by zero feeding for an additional four days. Group F was deprived of feed during the whole experimental period (8 days). The experimental diet was formulated to be approximately isocaloric and isonitrogenous, containing sorghum, groundnut cake, broiler concentrate, calcium, salt, lysine, methionine, and premix. The parameters recorded were live body weight, feed intake, mortality, and carcass and non-carcass values. From 32 to 34 days of age, the feed restriction program had no significant effect on final live body weight, carcass weight, or dressing percentage, but net weight (gain or loss) was affected and showed a significant difference (P < 0.01) between experimental groups. From 32 to 36 days of age, male broilers subjected to feed restriction regimes showed the lowest values for final live body weight, net weight (gain or loss), and carcass weight, and the differences in dressing percentage between experimental groups were significant (P < 0.05) over this period. For the periods from 32 to 38 days and from 32 to 40 days of age, all parameters were significantly affected by the feed restriction program.
It was concluded that the carcass weight of broiler chickens can be controlled using different feed restriction programs according to the needs of the market and the producer's situation, with special consideration of the economic return.
基金supported by the National Natural Science Foundation of China(6137214261571005U1401252)
Abstract: Due to frequent occlusion, clutter, and low-contrast edges, gray-intensity-based active contour models often fail to segment meaningful objects. Prior shape information is usually utilized to segment the desired objects. A parametric shape prior model is proposed. First, principal component analysis is employed to train the object shape, and a transformation is added to the shape representation. Then the energy function is constructed as a combination of the shape prior energy, the gray-intensity energy, and the shape-constraint energy of the kernel density function. The object-boundary extraction process is thus converted into solving for the parameters of the object shape. Besides, two new shape prior energy functions are defined for cases where the desired objects are occluded by other objects or parts of them are missing. Finally, an alternating descent iteration scheme is proposed for the numerical implementation. Experiments on synthetic and real images demonstrate the robustness and accuracy of the proposed method.