Journal Articles
4 articles found
1. Objective Model Selection in Physics: Exploring the Finite Information Quantity Approach
Author: Boris Menin. Journal of Applied Mathematics and Physics, 2024, No. 5, pp. 1848-1889 (42 pages)
Traditional methods for selecting models in experimental data analysis are susceptible to researcher bias, hindering exploration of alternative explanations and potentially leading to overfitting. The Finite Information Quantity (FIQ) approach offers a novel solution by acknowledging the inherent limitations in information processing capacity of physical systems. This framework facilitates the development of objective criteria for model selection (comparative uncertainty) and paves the way for a more comprehensive understanding of phenomena through exploring diverse explanations. This work presents a detailed comparison of the FIQ approach with ten established model selection methods, highlighting the advantages and limitations of each. We demonstrate the potential of FIQ to enhance the objectivity and robustness of scientific inquiry through three practical examples: selecting appropriate models for measuring fundamental constants, sound velocity, and underwater electrical discharges. Further research is warranted to explore the full applicability of FIQ across various scientific disciplines.
Keywords: Comparative Uncertainty, Finite Information Quantity, Model Formulation, Measurement Accuracy Limit, Objective Model Selection
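The abstract does not reproduce the FIQ formulas, but one of the "established model selection methods" it is compared against can be illustrated concretely. Below is a minimal sketch of Akaike's Information Criterion (AIC) for least-squares fits, a standard objective criterion; the Gaussian-residual form and all variable names are illustrative, not taken from the paper:

```python
import numpy as np

def aic_gaussian(residuals, n_params):
    """AIC for a least-squares fit with Gaussian errors:
    AIC = n*ln(RSS/n) + 2k, with constant terms dropped.
    Lower AIC means a better accuracy/complexity trade-off."""
    n = len(residuals)
    rss = float(np.sum(np.square(residuals)))
    return n * np.log(rss / n) + 2 * n_params

# Compare a linear and a quadratic polynomial fit to the same data.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.standard_normal(50)  # truly linear signal

for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    print(degree, aic_gaussian(residuals, degree + 1))
```

The extra parameter of the quadratic model incurs a +2 penalty that its marginal reduction in residual sum of squares rarely offsets on genuinely linear data, which is the bias-against-overfitting behavior such criteria are designed to provide.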
2. Retrieval of Single-Doppler Radar Wind with a Variational Assimilation Method - Part I: Objective Selection of Functional Weighting Factors (Cited: 5)
Authors: Wei Ming, Dang Renqing, Ge Wenzhong. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 1998, No. 4, pp. 123-138 (16 pages)
In variational problems, the selection of functional weighting factors (FWF) is a key issue in many related studies. To overcome the arbitrariness and subjectivity of the empirical selection methods now in wide use, this paper puts forward an objective method for optimally selecting FWF. The study focuses on the optimal selection of weighting factors in the variational retrieval of the single-Doppler radar wind field with simple adjoint models. Weighting factors in the sense of minimum variance are computed using matrix theory and the finite-difference method for partial differential equations. Experiments show that the result is more objective than factors obtained with the empirical method.
Keywords: Variational method, Weighting factor, Minimum variance, Objective selection
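The paper's adjoint formulation is not given in the abstract. As a rough, generic illustration of the minimum-variance idea behind objective weighting, the sketch below assigns each penalty term of a variational cost function a weight inversely proportional to its residual variance, so noisier constraints contribute less; the functions and symbols are illustrative and not the paper's method:

```python
import numpy as np

def inverse_variance_weights(residual_sets):
    """Weight each penalty term by the reciprocal of its residual
    variance (minimum-variance weighting in its simplest form),
    then normalize so the weights sum to one."""
    variances = np.array([np.var(r) for r in residual_sets])
    w = 1.0 / variances
    return w / w.sum()

def weighted_cost(residual_sets, weights):
    """Total variational cost J = sum_i w_i * ||r_i||^2."""
    return sum(w * float(np.sum(np.square(r)))
               for w, r in zip(weights, residual_sets))

# Two constraint terms with different noise levels.
rng = np.random.default_rng(1)
r1 = 0.1 * rng.standard_normal(100)  # accurate constraint
r2 = 1.0 * rng.standard_normal(100)  # noisy constraint
w = inverse_variance_weights([r1, r2])
J = weighted_cost([r1, r2], w)
```

Here `w[0] > w[1]`: the accurate constraint dominates the cost, which is the objective (data-driven) behavior the paper contrasts with empirically chosen factors.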
3. On the Existentialist Philosophy in 《生活下降者》
Author: Tan Fangli. 《江西科技学院学报》, 2009, No. 1, pp. 82-84 (3 pages)
Oe Kenzaburo is the second Japanese writer, after Kawabata Yasunari, to win the Nobel Prize in Literature. Traditional Eastern literary heritage and contemporary Western anti-traditional literary currents are skillfully fused in his work; in particular, the Western existentialist theory he encountered early in his career profoundly influenced his writing, shaping his distinctive creative style and techniques. Drawing on Sartre's philosophical and literary views, this paper examines the existentialist philosophy in Oe Kenzaburo's short story 《生活下降者》.
Keywords: 《生活下降者》, Existentialism, Subjectivity and objectivity, Theory of free choice
4. Efficient Leave-One-Out Strategy for Supervised Feature Selection (Cited: 2)
Authors: Dingcheng Feng, Feng Chen, Wenli Xu. Tsinghua Science and Technology (SCIE, EI, CAS), 2013, No. 6, pp. 629-635 (7 pages)
Feature selection is a key task in statistical pattern recognition. Most feature selection algorithms have been proposed based on specific objective functions which are usually intuitively reasonable but can sometimes be far from the more basic objectives of feature selection. This paper describes how to select features such that the basic objectives, e.g., classification or clustering accuracies, can be optimized in a more direct way. The analysis requires that the contribution of each feature to the evaluation metrics can be quantitatively described by some score function. Motivated by the conditional independence structure in probabilistic distributions, the analysis uses a leave-one-out feature selection algorithm which provides an approximate solution. The leave-one-out algorithm improves the conventional greedy backward elimination algorithm by preserving more interactions among features in the selection process, so that the various feature selection objectives can be optimized in a unified way. Experiments on six real-world datasets with different feature evaluation metrics have shown that this algorithm outperforms popular feature selection algorithms in most situations.
Keywords: Leave-one-out, Feature selection objectives, Evaluation metrics
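The abstract describes the algorithm only in outline. A minimal sketch of the leave-one-out idea follows: each feature is scored by how much the evaluation metric drops when it alone is removed from the full feature set, so every feature is judged in the context of all the others rather than against an already-pruned set as in greedy backward elimination. The score function and toy metric below are illustrative stand-ins, not the paper's:

```python
import numpy as np

def leave_one_out_scores(X, y, metric):
    """Score feature j as metric(all features) - metric(all but j).
    Features whose removal hurts the metric most score highest;
    judging each feature against the full set preserves feature
    interactions that greedy backward elimination can destroy."""
    full = metric(X, y)
    scores = []
    for j in range(X.shape[1]):
        reduced = np.delete(X, j, axis=1)
        scores.append(full - metric(reduced, y))
    return np.array(scores)

def select_top_k(X, y, metric, k):
    """Keep the k features with the largest leave-one-out scores."""
    scores = leave_one_out_scores(X, y, metric)
    return np.argsort(scores)[::-1][:k]

# Toy metric: absolute correlation of the best single column with y.
def toy_metric(X, y):
    return max(abs(np.corrcoef(X[:, j], y)[0, 1])
               for j in range(X.shape[1]))

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 5))
y = X[:, 2] + 0.1 * rng.standard_normal(200)  # only feature 2 is informative
top = select_top_k(X, y, toy_metric, 1)
```

Removing the informative column collapses the toy metric while removing any noise column leaves it unchanged, so the informative feature is the one selected.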