Abstract
By analyzing the Shannon measure, this paper shows that 'division' is the basis and the fundamental method for obtaining information. Deriving the continuous source entropy from the discrete source entropy by means of a continuous division eliminates the contradiction of log Δ → ∞. A discussion of fuzzy division yields the information objectivity degree and the information loss degree of a division, which quantitatively settles the problem of correctly choosing a division. To remedy the defects of the Shannon information source and its division, the universal information source and the universal division are proposed, so that the syntactic information and the semantic recognition information output by a source can be cognized; it is further proved that Shannon information is only an approximation of the latter.
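For context, a minimal sketch of the standard quantization argument behind the log Δ → ∞ divergence mentioned above; the symbols H_Δ (entropy of the source quantized into cells of width Δ) and h(X) (differential entropy) are illustrative conventions, not notation taken from the paper itself. Dividing the real line into intervals of width Δ and applying the discrete entropy to the cell probabilities p(x_i)Δ gives

H_\Delta \;=\; -\sum_i p(x_i)\,\Delta\,\log\!\big(p(x_i)\,\Delta\big)
\;=\; -\sum_i p(x_i)\,\Delta\,\log p(x_i) \;-\; \Big(\sum_i p(x_i)\,\Delta\Big)\log\Delta ,

so that, as \Delta \to 0,

H_\Delta + \log\Delta \;\longrightarrow\; h(X) \;=\; -\int p(x)\,\log p(x)\,dx ,
\qquad\text{hence}\qquad H_\Delta \;\approx\; h(X) - \log\Delta \;\to\; \infty .

The divergent term -log Δ is the contradiction the paper's continuous-division construction is stated to eliminate.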
Source
Journal of Xi'an Jiaotong University (西安交通大学学报), 1990, No. 2, pp. 85-92 (8 pages). Indexed in EI, CAS, CSCD, and the Peking University Chinese Core Journals list (北大核心).
Keywords
information source; division; semantic information; cognitive process; recognition