A Gaussian channel with additive interference that is causally known to the transmitter is called a Dirty-Tape Channel (DTC). In this paper, we consider a state-dependent dirty-tape Gaussian relay channel with orthogonal channels from the source to the relay and from the source and relay to the destination. The orthogonal channels are corrupted by two independent additive interferences causally known to both the source and the relay. Lower and upper bounds on the channel capacity are established. The lower bound is obtained by employing superposition coding at the source, Partial Decode-and-Forward (PDF) relaying at the relay, and a strategy similar to that used by Shannon at both the source and the relay. The explicit capacity is characterised when the power of the relay is sufficiently large. Finally, several numerical examples are provided to illustrate the impact of the additive interferences and the role of the relay in information transmission and in removing the interference.
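The natural benchmark for the bounds mentioned in this abstract is the interference-free point-to-point Gaussian capacity, C = (1/2) log2(1 + SNR) bits per channel use; the paper's bounds are compared against benchmarks of this form, and they meet when the relay power is sufficiently large. A minimal sketch of this benchmark (the function name is ours, not the paper's):

```python
import math

def awgn_capacity(snr):
    """Interference-free Gaussian channel capacity in bits per channel use:
    C = (1/2) * log2(1 + SNR), where SNR is the signal-to-noise power ratio."""
    return 0.5 * math.log2(1 + snr)

print(awgn_capacity(3.0))  # SNR = 3 gives 0.5 * log2(4) = 1.0 bit/channel use
```

With only causal (rather than non-causal) knowledge of the interference, the achievable rate is in general strictly below this interference-free benchmark.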
The causal states of computational mechanics define the minimal sufficient memory for a given discrete stationary stochastic process. Their entropy is an important complexity measure called statistical complexity (or true measure complexity). They induce the ε-machine, which is a hidden Markov model (HMM) generating the process. It is not the minimal such HMM, however, even though generative HMMs also have a natural predictive interpretation. This paper gives a mathematical proof of the idea that the ε-machine is the minimal HMM satisfying an additional (partial) determinism condition. The minimal internal-state entropy of a generative HMM is, in analogy to statistical complexity, called generative complexity. This paper also shows that generative complexity depends on the process in a well-behaved way: as a function of the process, it is lower semi-continuous (w.r.t. the weak-* topology), concave, and behaves well under ergodic decomposition of the process.
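Statistical complexity is the Shannon entropy of the stationary distribution over causal states. A small sketch for the Golden Mean process (binary sequences with no two consecutive 1s), a standard textbook example of computational mechanics and not one taken from this paper; its ε-machine has two causal states, A (may emit 0 or 1) and B (must emit 0):

```python
import numpy as np

# Labeled transition matrices for the Golden Mean process ε-machine:
# T0[i, j] = Pr(go to state j and emit 0 | state i), similarly T1 for symbol 1.
T0 = np.array([[0.5, 0.0],    # A --0--> A with prob. 1/2
               [1.0, 0.0]])   # B --0--> A with prob. 1
T1 = np.array([[0.0, 0.5],    # A --1--> B with prob. 1/2
               [0.0, 0.0]])   # B never emits 1

T = T0 + T1                   # state-to-state transition matrix (row-stochastic)

# Stationary distribution pi: the left eigenvector of T for eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()            # normalizes to pi = (2/3, 1/3)

# Statistical complexity = entropy of the causal-state distribution, in bits.
C_mu = -sum(p * np.log2(p) for p in pi if p > 0)
print(round(C_mu, 4))         # ~0.9183 bits
```

Here the statistical complexity H(2/3, 1/3) ≈ 0.918 bits quantifies the memory the ε-machine stores about the past; the paper's generative complexity is instead the minimum of this state entropy over all generative HMMs for the process.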
Funding: supported by the Fundamental Research Funds for the Central Universities under Grants No. 2013B08214 and No. 2009B32114; the National Natural Science Foundation of China under Grants No. 61271232, No. 60972045, and No. 61071089; the Open Research Fund of National Mobile Communications Research Laboratory, Southeast University under Grant No. 2012D05; and the University Postgraduate Research and Innovation Project in Jiangsu Province under Grant No. CXZZ11_0395.