Abstract
As conventional communication systems based on classical information theory closely approach the Shannon capacity, semantic communication is emerging as a key enabling technology for further improving communication performance. However, it remains unsettled how to represent semantic information and how to characterise the theoretical limits of semantic-oriented compression and transmission. In this paper, we consider a semantic source characterised by a set of correlated random variables whose joint probability distribution can be described by a Bayesian network. We give the information-theoretic limit on the lossless compression of the semantic source and introduce a low-complexity encoding method that exploits the conditional independence structure. We further characterise the limits on lossy compression of the semantic source and derive upper and lower bounds on the rate-distortion function. We also investigate the lossy compression of the semantic source with side information at both the encoder and decoder, and obtain the corresponding rate-distortion function. We prove that the optimal code for the semantic source is the combination of the optimal codes for each conditionally independent set given the side information.
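To make the lossless limit concrete, the sketch below uses a toy two-node Bayesian network X → Y with arbitrarily chosen distributions (the numbers and variable names are illustrative, not taken from the paper). It checks the chain-rule factorisation H(X, Y) = H(X) + H(Y | X), which is the basic fact behind encoding each variable conditioned on its parents rather than coding the joint distribution directly.

```python
import numpy as np

# Toy semantic source: two-node Bayesian network X -> Y.
# The distributions below are arbitrary, for illustration only.
p_x = np.array([0.7, 0.3])              # P(X)
p_y_given_x = np.array([[0.9, 0.1],     # P(Y | X = 0)
                        [0.2, 0.8]])    # P(Y | X = 1)

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution P(X, Y) = P(X) P(Y | X).
p_xy = p_x[:, None] * p_y_given_x

# Lossless compression limit: the joint entropy H(X, Y).
h_joint = entropy(p_xy.ravel())

# Factorised form suggested by the network structure:
# H(X, Y) = H(X) + H(Y | X), so X and Y-given-X can be coded separately.
h_x = entropy(p_x)
h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in p_y_given_x]))

print(h_joint, h_x + h_y_given_x)  # the two values coincide
```

For a larger network the same chain rule gives H(X_1, ..., X_n) = sum over i of H(X_i | Pa(X_i)), which is what allows a per-variable, parent-conditioned code to reach the joint-entropy limit at low complexity.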
Funding
This work was supported in part by the NSFC under Grants No. 62293481 and No. 62201505, and in part by the SUTD-ZJU IDEA Grant (SUTD-ZJU (VP) 202102).