Z-channel (information theory)


The Z-channel transmits each 0 bit of a message correctly, but transmits each 1 bit correctly only with probability 1 − p, due to noise across the transmission medium.

In coding theory and information theory, a Z-channel or binary asymmetric channel is a communications channel used to model the behaviour of some data storage systems.

Definition

A Z-channel is a channel with binary input and binary output, where each 0 bit is transmitted correctly, but each 1 bit has probability p of being transmitted incorrectly as a 0, and probability 1 − p of being transmitted correctly as a 1. In other words, if X and Y are the random variables describing the probability distributions of the input and the output of the channel, respectively, then the crossovers of the channel are characterized by the conditional probabilities:

Pr[Y = 0 | X = 0] = 1    Pr[Y = 1 | X = 0] = 0
Pr[Y = 0 | X = 1] = p    Pr[Y = 1 | X = 1] = 1 − p
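As a concrete illustration, these transition probabilities can be simulated directly; this is a minimal sketch, with `z_channel` an illustrative name, assuming a message is given as a list of 0s and 1s:

```python
import random

def z_channel(bits, p, rng=random.random):
    # Each 0 passes through unchanged; each 1 is flipped to 0 with probability p.
    return [0 if b == 1 and rng() < p else b for b in bits]

random.seed(0)
sent = [1, 0, 1, 1, 0, 1]
received = z_channel(sent, 0.3)
# Every 0 in `sent` is still 0 in `received`; some 1s may have become 0s.
assert all(r == 0 for s, r in zip(sent, received) if s == 0)
```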

Capacity

The channel capacity cap(Z) of the Z-channel Z with the crossover 1 → 0 probability p, when the input random variable X is distributed according to the Bernoulli distribution with probability α for the occurrence of 0, is given by the following equation:

cap(Z) = H(1/(1 + 2^s(p))) − s(p)/(1 + 2^s(p)) = log2(1 + 2^(−s(p))) = log2(1 + (1 − p) p^(p/(1 − p)))

where s(p) = H(p)/(1 − p) and H(·) is the binary entropy function.
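The three closed forms of the capacity agree, which can be checked numerically; a minimal sketch with illustrative names `H` and `cap_z`:

```python
from math import log2

def H(q):
    # Binary entropy function H(q).
    return -q * log2(q) - (1 - q) * log2(1 - q)

def cap_z(p):
    # Z-channel capacity via the closed form log2(1 + 2^(-s(p))).
    s = H(p) / (1 - p)
    return log2(1 + 2 ** (-s))

# The three expressions for cap(Z) agree, e.g. at p = 0.15:
p = 0.15
s = H(p) / (1 - p)
expr1 = H(1 / (1 + 2 ** s)) - s / (1 + 2 ** s)
expr3 = log2(1 + (1 - p) * p ** (p / (1 - p)))
assert abs(cap_z(p) - expr1) < 1e-12
assert abs(cap_z(p) - expr3) < 1e-12
```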

This capacity is obtained when the input variable X has a Bernoulli distribution with probability α of taking the value 0 and 1 − α of taking the value 1, where:

α = 1 − 1/[(1 − p)(1 + 2^(H(p)/(1 − p)))].
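To see that this α attains the capacity, one can compare the mutual information I(X;Y) = H(Y) − H(Y|X) at this input distribution with the closed-form capacity; a sketch with illustrative names, using Pr[Y = 1] = (1 − α)(1 − p):

```python
from math import log2

def H(q):
    # Binary entropy, extended by continuity to the endpoints.
    return -q * log2(q) - (1 - q) * log2(1 - q) if 0 < q < 1 else 0.0

def mutual_info(alpha, p):
    # I(X;Y) = H(Y) - H(Y|X) for Pr[X=0] = alpha:
    # Pr[Y=1] = (1 - alpha)(1 - p) and H(Y|X) = (1 - alpha) H(p).
    return H((1 - alpha) * (1 - p)) - (1 - alpha) * H(p)

p = 0.2
alpha = 1 - 1 / ((1 - p) * (1 + 2 ** (H(p) / (1 - p))))
cap = log2(1 + (1 - p) * p ** (p / (1 - p)))
assert abs(mutual_info(alpha, p) - cap) < 1e-12
# Moving alpha in either direction can only decrease I(X;Y).
assert mutual_info(alpha + 1e-3, p) < cap
assert mutual_info(alpha - 1e-3, p) < cap
```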

For small p, the capacity is approximated by

cap(Z) ≈ 1 − 0.5 H(p)

as compared to the capacity 1 − H(p) of the binary symmetric channel with crossover probability p.

For any p > 0, α > 0.5 (i.e. more 0s should be transmitted than 1s), because transmitting a 1 introduces noise. As p → 1, the limiting value of α is 1 − 1/e.
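Both the small-p approximation and the limiting behaviour of α can be checked numerically; a minimal sketch with illustrative names:

```python
from math import e, log2

def H(q):
    return -q * log2(q) - (1 - q) * log2(1 - q)

def cap_z(p):
    return log2(1 + (1 - p) * p ** (p / (1 - p)))

def alpha_opt(p):
    # Capacity-achieving probability of transmitting a 0.
    return 1 - 1 / ((1 - p) * (1 + 2 ** (H(p) / (1 - p))))

# Small-p approximation: cap(Z) is close to 1 - 0.5 H(p).
assert abs(cap_z(0.01) - (1 - 0.5 * H(0.01))) < 1e-3
# The optimal input always favours 0s: alpha > 0.5 for every p > 0.
assert all(alpha_opt(p) > 0.5 for p in (0.1, 0.5, 0.9))
# As p -> 1, alpha approaches 1 - 1/e.
assert abs(alpha_opt(0.999) - (1 - 1 / e)) < 1e-2
```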

Bounds on the size of an asymmetric-error-correcting code

Define the following distance function d_A(x, y) on the words x, y ∈ {0,1}^n of length n transmitted via a Z-channel:

d_A(x, y) = max{ |{i : x_i = 0, y_i = 1}|, |{i : x_i = 1, y_i = 0}| }.
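This distance can be implemented directly; a minimal sketch, with `d_A` an illustrative name and words given as sequences of 0s and 1s:

```python
def d_A(x, y):
    # Asymmetric distance: the larger of the number of 0->1 and 1->0 disagreements.
    n01 = sum(1 for a, b in zip(x, y) if a == 0 and b == 1)
    n10 = sum(1 for a, b in zip(x, y) if a == 1 and b == 0)
    return max(n01, n10)

assert d_A([1, 1, 0, 0], [0, 0, 0, 0]) == 2  # two 1->0 disagreements
assert d_A([1, 0, 1], [0, 1, 1]) == 1        # one disagreement of each kind
```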

Define the sphere V_t(x) of radius t around a word x ∈ {0,1}^n of length n as the set of all the words at distance t or less from x; in other words,

V_t(x) = { y ∈ {0,1}^n : d_A(x, y) ≤ t }.

A code C of length n is said to be t-asymmetric-error-correcting if for any two distinct codewords c ≠ c′ ∈ {0,1}^n, one has V_t(c) ∩ V_t(c′) = ∅. Denote by M(n, t) the maximum number of codewords in a t-asymmetric-error-correcting code of length n.
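The sphere-disjointness condition can be checked by brute force for small codes; a sketch with illustrative names, assuming codewords are equal-length tuples of 0s and 1s:

```python
from itertools import combinations, product

def d_A(x, y):
    n01 = sum(1 for a, b in zip(x, y) if a == 0 and b == 1)
    n10 = sum(1 for a, b in zip(x, y) if a == 1 and b == 0)
    return max(n01, n10)

def sphere(x, t):
    # V_t(x): all words of the same length within asymmetric distance t of x.
    return {y for y in product((0, 1), repeat=len(x)) if d_A(x, y) <= t}

def is_t_aec(code, t):
    # t-asymmetric-error-correcting: spheres around distinct codewords are disjoint.
    return all(sphere(c1, t).isdisjoint(sphere(c2, t))
               for c1, c2 in combinations(code, 2))

# {000, 111} corrects one asymmetric error; {000, 011} does not.
assert is_t_aec([(0, 0, 0), (1, 1, 1)], 1)
assert not is_t_aec([(0, 0, 0), (0, 1, 1)], 1)
```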

The Varshamov bound. For n≥1 and t≥1,

M(n, t) ≥ 2^(n+1) / Σ_{j=0}^{t} [ (⌊n/2⌋ choose j) + (⌈n/2⌉ choose j) ].
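The bound can be evaluated directly; a minimal sketch, with `varshamov_lower_bound` an illustrative name:

```python
from math import comb

def varshamov_lower_bound(n, t):
    # M(n, t) >= 2^(n+1) / sum_{j=0}^{t} (C(floor(n/2), j) + C(ceil(n/2), j))
    denom = sum(comb(n // 2, j) + comb((n + 1) // 2, j) for j in range(t + 1))
    return 2 ** (n + 1) / denom

# For n = 7, t = 1: 2^8 / (2 + 7) = 256/9, so M(7, 1) >= 29.
assert varshamov_lower_bound(7, 1) == 256 / 9
```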

The constant-weight code bound. For n > 2t ≥ 2, let the sequence B_0, B_1, ..., B_(n−2t−1) be defined as

B_0 = 2,    B_i = min_{0 ≤ j < i} { B_j + A(n + t + i − j − 1, 2t + 2, t + i) } for i > 0,

where A(n, d, w) denotes the maximum number of codewords in a binary code of length n, minimum Hamming distance d, and constant weight w. Then

M(n, t) ≤ B_(n−2t−1).
