The method of regularization-factor selection determines the stability and accuracy of the regularization method. A formula for the regularization factor was proposed by analyzing the relationship between the improved SVD algorithm and the regularization method. Both the improved SVD algorithm and the regularization method can adapt to low signal-to-noise ratio (SNR): the regularization method performs better when the SNR is below 30, while the improved SVD performs better when the SNR is above 30. With the regularization factor proposed in this paper, the regularization method is well suited to low-SNR (SNR > 5) NMR logging. Numerical simulations and real NMR data processing indicated that the improved SVD algorithm and the regularization method adapt to low SNR and greatly reduce the amount of computation. These algorithms can be applied in NMR logging.
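The interplay between SVD-based inversion and Tikhonov regularization described in this abstract can be sketched on a generic ill-posed linear system. The following is a minimal illustration, not the paper's improved SVD algorithm or its regularization-factor formula: the Hilbert matrix stands in for the ill-conditioned NMR inversion kernel, and the regularization factor is chosen by hand.

```python
import numpy as np

def tikhonov_svd(A, b, lam):
    """Solve min ||Ax-b||^2 + lam^2*||x||^2 via the SVD filter factors s/(s^2+lam^2)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt.T @ ((s / (s**2 + lam**2)) * (U.T @ b))

n = 20
# Hilbert matrix: a classic severely ill-conditioned kernel (cond ~ 1e18 at n = 20)
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = 1.0 / (1.0 + np.arange(n))**2        # smooth, decaying "true" model
rng = np.random.default_rng(0)
b = A @ x_true + 1e-4 * rng.standard_normal(n)  # noisy data

x_naive = np.linalg.solve(A, b)           # unregularized: noise amplified enormously
x_reg = tikhonov_svd(A, b, lam=1e-4)      # small singular values damped, stable

err_naive = np.linalg.norm(x_naive - x_true) / np.linalg.norm(x_true)
err_reg = np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true)
```

Without the filter factors, components of the noise aligned with the smallest singular values are divided by numbers near machine precision, which is why the naive solution is useless at these noise levels.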
A two-level Bregmanized method with graph regularized sparse coding (TBGSC) is presented for image interpolation. The outer-level Bregman iterative procedure enforces the observation-data constraints, while the inner-level Bregmanized method performs dictionary updating and sparse representation of small overlapping image patches. The introduced graph regularized sparse coding constraint captures local image features effectively and consequently enables accurate reconstruction from highly undersampled partial data. Furthermore, the modified sparse coding and simple dictionary updating applied in the inner minimization make the proposed algorithm converge within a relatively small number of iterations. Experimental results demonstrate that the proposed algorithm effectively reconstructs images and outperforms current state-of-the-art approaches in terms of visual comparisons and quantitative measures.
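The two-level structure, an outer Bregman loop that enforces the data constraint by adding the residual back, and an inner loop that solves a sparse-coding subproblem, can be sketched on a plain l1 problem. This is a hedged sketch, not TBGSC itself: the inner solver here is ISTA for an l1-regularized least-squares subproblem rather than graph-regularized dictionary learning, and all sizes and parameters are illustrative.

```python
import numpy as np

def shrink(v, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, x0, n_iter=300):
    """Inner solver: proximal gradient (ISTA) for min lam*||x||_1 + 0.5*||Ax-b||^2."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = x0.copy()
    for _ in range(n_iter):
        x = shrink(x + A.T @ (b - A @ x) / L, lam / L)
    return x

def bregman_l1(A, b, lam, n_outer=12):
    """Outer Bregman loop: add the residual back so Ax -> b while ||x||_1 stays small."""
    x = np.zeros(A.shape[1])
    bk = b.copy()
    for _ in range(n_outer):
        x = ista(A, bk, lam, x)
        bk = bk + (b - A @ x)              # enforce the observation-data constraint
    return x

rng = np.random.default_rng(1)
m, n, k = 60, 150, 6
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
b = A @ x_true                             # exact undersampled observations

x_hat = bregman_l1(A, b, lam=0.05)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

The outer iterations remove the bias that the l1 penalty introduces, which is the same role the outer Bregman level plays in TBGSC for the observed pixels.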
This paper presents a new method to estimate the height of the atmospheric boundary layer (ABL) using COSMIC radio occultation bending angle (BA) data. Using the numerical differentiation method combined with the regularization technique, the first derivative of the BA profile is retrieved, and the height at which this derivative attains its global minimum is defined as the ABL height. To reflect the reliability of the estimated ABL heights, a sharpness parameter is introduced, based on the relative minimum of the BA derivative. The method is then applied to four months of COSMIC BA data (January, April, July, and October 2008), and the estimated ABL heights are compared with two kinds of ABL heights from COSMIC products and with heights determined by the finite difference method on the refractivity data. For sharp ABL tops (large sharpness parameters), there is little difference between the ABL heights determined by the different methods, i.e., the uncertainties are small; for non-sharp ABL tops (small sharpness parameters), large differences exist between the ABL heights obtained by the different methods, implying large uncertainties for all of them. In addition, the new method can detect thin ABLs and provide a reference ABL height in cases eliminated by other methods. Thus, applying the numerical differentiation method combined with the regularization technique to COSMIC BA data is an appropriate choice with further application value.
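The core numerical step, differentiating a noisy profile via regularization and locating the global minimum of the derivative, can be sketched as follows. This is an illustrative reconstruction on a synthetic profile, not the COSMIC processing chain: the derivative is obtained by solving a Tikhonov-regularized least-squares problem in which the unknown is the derivative itself and the forward operator is numerical integration; the profile shape, noise level, and regularization parameter are all assumptions.

```python
import numpy as np

# Synthetic "bending angle" profile: smooth background plus a sharp drop at 2 km,
# mimicking a sharp ABL top (shape and levels are hypothetical).
n = 200
z = np.linspace(0.0, 5.0, n)               # height, km
dz = z[1] - z[0]
f_clean = 2.0 - 0.2 * z - 0.5 * np.tanh((z - 2.0) / 0.1)
rng = np.random.default_rng(2)
f = f_clean + 0.05 * rng.standard_normal(n)

# Regularized differentiation: find g ~ f' minimizing
#   ||K g - (f - f[0])||^2 + lam * ||L g||^2,
# where K integrates g numerically and L penalizes the curvature of g.
K = dz * np.tril(np.ones((n, n)))          # rectangle-rule integration operator
L = np.diff(np.eye(n), n=2, axis=0)        # second-difference matrix
lam = 0.1
g = np.linalg.solve(K.T @ K + lam * (L.T @ L), K.T @ (f - f[0]))

abl_height = z[np.argmin(g)]               # ABL height: global minimum of f'
```

A direct finite difference of `f` amplifies the noise by a factor of roughly 1/dz, whereas posing differentiation as a regularized inverse problem keeps the sharp drop while suppressing spurious minima.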
The main subject of this paper is the theory of financial statement valuations observed in its historical development. More specifically, the research is concerned with theoretical concepts developed by the Italian doctrine in a very specific age, namely between the 19th and the 20th centuries, which was in fact devoid of any accounting regulation. The paper analyzes in particular the shift from the exchange value rule to the historical cost method and tries to explain the reasons for this development. In the second half of the 19th century, some of the best Italian scholars, faced with the need to properly develop the problem of accounting valuations, thought it appropriate to rely on concepts belonging to related sciences, such as economics and real estate appraisal, borrowing wholesale the theory of value from the former and the theory of valuations from the latter. During that age, everything hinged on the concept of exchange value. At the dawn of the 20th century, the Italian accounting doctrine began to address a subject crucial to financial statement theory: the informative purposes underlying the financial statements. At the same time, a first principle took shape, which might be called the "finalistic principle of value". It is still the basis of the theory of financial accounting measurements, under which different valuation criteria must be applied for different informative purposes. Thus, an alternative criterion to exchange value made its appearance on the scene of accounting valuations: the historical cost.
The introduction of the historical cost criterion and, above all, the abandonment of the notion of "economic cost" in favor of that of "manufacturing cost" allowed Italian accounting to free itself from the theories of economics and real estate appraisal, thus becoming independent with regard to financial statement valuations.
The paper presents a scheme for optimizing the cooling process of a gas turbine blade, with the temperature on the outer surface of the blade taken as the optimization criterion. An inverse problem of stationary heat conduction is solved in which, besides the optimization criterion, the heat transfer coefficient and the temperature distribution on the outer surface of the blade are known, and the values sought are the heat transfer coefficients and surface temperatures of the cooling channels. The problem was solved by the boundary element method using an SVD algorithm and Tikhonov regularization. The cooling channel temperatures and heat transfer coefficients obtained from the inverse problem were oscillatory in nature. Since such a solution is nonphysical, the heat transfer coefficients on the cooling channel surfaces were averaged. The problem was then solved as a direct problem, with the averaged heat transfer coefficients on the cooling channel surfaces and the known distribution on the outer surface of the blade. The temperature distribution obtained from this direct problem with the averaged heat transfer coefficients was compared against the optimization criterion. The results obtained using the SVD algorithm gave a temperature distribution on the external wall of the blade closer to the optimization criterion.
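The phenomenon described above, an unfiltered inverse solution that oscillates and a filtered one that does not, can be sketched with a truncated SVD on a generic smoothing kernel. This is not the boundary element formulation of the paper (which instead averaged the oscillatory coefficients afterwards); the Gaussian kernel, truncation level, and noise level are illustrative assumptions.

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Truncated-SVD solution: keep only the k largest singular components."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

n = 60
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
# Gaussian smoothing kernel: a generic stand-in for the strongly smoothing
# forward operator of an inverse heat conduction problem.
A = dx * np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.1**2))
h_true = np.sin(np.pi * x)                       # smooth "true" boundary coefficient
rng = np.random.default_rng(3)
b = A @ h_true + 1e-4 * rng.standard_normal(n)   # noisy "measured" temperatures

U, s, Vt = np.linalg.svd(A, full_matrices=False)
h_all = Vt.T @ ((U.T @ b) / np.clip(s, 1e-300, None))  # all components: oscillatory
h_cut = tsvd_solve(A, b, k=5)                          # filtered: smooth and stable

err_all = np.linalg.norm(h_all - h_true) / np.linalg.norm(h_true)
err_cut = np.linalg.norm(h_cut - h_true) / np.linalg.norm(h_true)
```

The singular values of a smoothing kernel decay very fast, so the components beyond the first few carry essentially only noise; dropping them is what removes the nonphysical oscillations.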
Data analysis and automatic processing are often interpreted as knowledge acquisition. In many cases it is necessary to classify data or find regularities in them. Regularities found by intelligent data-analysis applications are mostly represented as IF-THEN rules, which are used to solve tasks such as prediction, classification, and pattern recognition. Using different approaches (clustering algorithms, neural network methods, and fuzzy rule processing methods), we can extract rules that characterize the data in an understandable language. This allows interpreting the data, finding relationships in them, and extracting new rules that characterize them. Knowledge acquisition in this paper is defined as the process of extracting knowledge from numerical data in the form of rules. Rule extraction in this context is based on the K-means and fuzzy C-means clustering methods: with K-means, rules are derived from trained neural networks, while fuzzy C-means is used in a fuzzy-rule-based design method. The rule extraction methodology is demonstrated on samples from Fisher's Iris flower data set, and the effectiveness of the extracted rules is evaluated. The clustering and rule extraction methodology can be widely used in evaluating and analyzing various economic and financial processes.
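The basic idea, clustering numerical data and reading IF-THEN rules off the cluster centroids, can be sketched with a small hand-rolled K-means. The two-feature synthetic data below merely mimics the separation of Iris setosa from the other species by petal size; it is not the Fisher data set, and the threshold rule form is an illustrative choice.

```python
import numpy as np

def kmeans(X, k=2, n_iter=30, seed=0):
    """Plain K-means: alternate nearest-centroid assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

# Hypothetical two-feature data mimicking petal length/width of two well-separated
# species (NOT the actual Fisher Iris measurements).
rng = np.random.default_rng(4)
X = np.vstack([
    rng.normal([1.5, 0.3], 0.2, size=(60, 2)),   # "setosa-like" cluster
    rng.normal([4.5, 1.5], 0.4, size=(60, 2)),   # "non-setosa-like" cluster
])
y_true = np.repeat([0, 1], 60)

centers, labels = kmeans(X, k=2)

# Read an IF-THEN rule off the centroids: threshold feature 0 halfway between them,
# i.e. "IF petal_length > threshold THEN class 1 ELSE class 0".
threshold = centers[:, 0].mean()
rule_pred = (X[:, 0] > threshold).astype(int)
accuracy = (rule_pred == y_true).mean()
```

The rule is human-readable precisely because it discards everything about the clusters except one threshold, which is the trade-off rule extraction makes in exchange for interpretability.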
Funding: the National Natural Science Foundation of China (Nos. 61362001, 61102043, 61262084, 20132BAB211030, 20122BAB211015) and the Basic Research Program of Shenzhen (No. JC201104220219A).
Funding: supported by the National Natural Science Foundation of China (Grant No. 41475021).