
A Brief Look at Turbo Decoding

2023-01-18 23:46 · Author: 乐吧的数学

This article, together with the accompanying video series, gives a basic introduction to Turbo codes.

The main prerequisites are familiarity with the basic encoding process of convolutional codes and with the BCJR decoding algorithm for convolutional codes.

A Turbo code is built from several convolutional codes connected in parallel. The name "Turbo" comes from the decoding process, so strictly speaking, "the Turbo decoding algorithm for parallel concatenated convolutional codes" would be the more accurate name.
Note: Turbo codes can also be built from serially concatenated convolutional codes, or from parallel/serial concatenation of several different convolutional codes. This article focuses on Turbo codes formed by connecting several identical convolutional codes in parallel. For ease of exposition and compact formulas, we use the convolutional code below as the running example.


Viewed in isolation, each constituent convolutional code can be decoded with the BCJR algorithm. Suppose we first run BCJR decoding on the first constituent code. From the earlier articles and videos, the posterior probability of each transmitted bit can be computed as:

$$
\begin{aligned}
P(X_t=x \mid r) &= \sum_{(p,q)\in S_x} P(\psi_t=p,\psi_{t+1}=q \mid r) \\
&\propto \sum_{(p,q)\in S_x} \alpha_t(p)\,\gamma_t(p,q)\,\beta_{t+1}(q)
\end{aligned} \tag{1}
$$
where:

$$
\begin{aligned}
\gamma_t(p,q) &= p(\psi_{t+1}=q,\, r_t \mid \psi_t=p) \\
&= p(r_t \mid \psi_{t+1}=q,\, \psi_t=p)\, p(\psi_{t+1}=q \mid \psi_t=p) \\
&= p(r_t \mid a_t)\, p(x_t=x)
\end{aligned} \tag{2}
$$


and:

$$
\begin{aligned}
\alpha_{t+1}(q) &= \sum_p \alpha_t(p)\,\gamma_t(p,q) \\
\beta_t(p) &= \sum_q \gamma_t(p,q)\,\beta_{t+1}(q)
\end{aligned} \tag{3}
$$
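To make the recursions concrete, here is a minimal numpy sketch of (3). It assumes the branch metrics $\gamma_t(p,q)$ of (2) are already stored in an array `gamma[t, p, q]`, with zeros for branches that do not exist in the trellis; the all-zero start/end states and the per-step normalisation (added only to avoid numerical underflow) are assumptions, not part of the formulas above.

```python
import numpy as np

def forward_backward(gamma):
    """alpha/beta recursions of equation (3).

    gamma : array of shape (T, S, S); gamma[t, p, q] is the branch metric
            gamma_t(p, q) from (2), zero for branches that do not exist.
    Returns (alpha, beta), each of shape (T + 1, S).
    """
    T, S, _ = gamma.shape
    alpha = np.zeros((T + 1, S))
    beta = np.zeros((T + 1, S))
    alpha[0, 0] = 1.0      # assume the encoder starts in state 0 ...
    beta[T, 0] = 1.0       # ... and is terminated back to state 0
    for t in range(T):     # forward:  alpha_{t+1}(q) = sum_p alpha_t(p) gamma_t(p, q)
        alpha[t + 1] = alpha[t] @ gamma[t]
        alpha[t + 1] /= alpha[t + 1].sum()   # normalise; posteriors are only defined up to a constant
    for t in range(T - 1, -1, -1):           # backward: beta_t(p) = sum_q gamma_t(p, q) beta_{t+1}(q)
        beta[t] = gamma[t] @ beta[t + 1]
        beta[t] /= beta[t].sum()
    return alpha, beta
```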

It is then natural to ask: can the posterior probabilities obtained from this convolutional code be used to make the probability computation for the second convolutional code more reliable? And once the second code's posteriors are available, can they in turn be fed back to improve the first code's computation? Iterating back and forth in this way resembles the feedback loop of a turbocharger, hence the name Turbo decoding.

We now examine exactly which probability information from one constituent decoder should be handed to the other.

The most intuitive choice is to treat the first code's posterior probabilities as the probabilities of the transmitted bits, and substitute them wherever the second code's decoder needs a bit probability.

That is, in the second code's decoder we would set the prior $p(x_t=x)$ in formula (2) equal to the first code's posterior:
$$
p(x_t=x) = P(X_t=x \mid r) \tag{4}
$$

Let us see what goes wrong with this choice. The prior enters the computation through $\gamma$, so we expand formula (2) further. Since the constituent code is systematic — the information bit appears unchanged in the encoded stream — we have:
$$
\begin{aligned}
\gamma_t(p,q) &= p(\psi_{t+1}=q,\, r_t \mid \psi_t=p) \\
&= p(r_t \mid \psi_{t+1}=q,\, \psi_t=p)\, p(\psi_{t+1}=q \mid \psi_t=p) \\
&= p(r_t^{(0)}, r_t^{(1)} \mid a_t^{(0)}, a_t^{(1)})\, p(x_t=x)
\end{aligned} \tag{5}
$$

Because the code is systematic, $a_t^{(0)}$ is exactly $x_t$, so (5) can be rewritten as:
$$
\begin{aligned}
\gamma_t(p,q) &= p(r_t^{(0)}, r_t^{(1)} \mid x_t, a_t^{(1)})\, p(x_t=x) \\
&= p(r_t^{(0)} \mid x_t)\, p(r_t^{(1)} \mid a_t^{(1)})\, p(x_t=x)
\end{aligned} \tag{6}
$$

Substituting (6) into (1) gives:
$$
\begin{aligned}
P(X_t=x \mid r) &\propto \sum_{(p,q)\in S_x} \alpha_t(p)\,\gamma_t(p,q)\,\beta_{t+1}(q) \\
&\propto \sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(0)} \mid x_t)\, p(r_t^{(1)} \mid a_t^{(1)})\, p(x_t=x)\, \beta_{t+1}(q) \\
&\propto p(r_t^{(0)} \mid x_t)\, p(x_t=x) \sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(1)} \mid a_t^{(1)})\, \beta_{t+1}(q)
\end{aligned} \tag{7}
$$

Denote the three factors in the last line of (7) by:

$$
\begin{aligned}
P_{s,t}(x) &= p(r_t^{(0)} \mid x_t) \\
P_{p,t}(x) &= p(x_t=x) \\
P_{e,t}(x) &= \sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(1)} \mid a_t^{(1)})\, \beta_{t+1}(q)
\end{aligned}
$$
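As a concrete illustration of how the branch metric (6) — the product of the three factors above — might be assembled in code, here is a hedged sketch. The array layout, the name `build_gamma`, and the `branches` table (which lists, for each input bit $x$, the set of trellis branches $S_x$ it drives) are assumptions made for illustration; numpy (`np`) is reused from the earlier sketch.

```python
def build_gamma(sys_like, parity_like, prior, branches):
    """Branch metrics of equation (6): gamma_t(p, q) = P_s * (parity term) * P_p.

    sys_like    : sys_like[t, x]       = p(r_t^(0) | x_t = x)   -> P_{s,t}(x)
    parity_like : parity_like[t, p, q] = p(r_t^(1) | a_t^(1))   for branch (p, q)
    prior       : prior[t, x]          = p(x_t = x)             -> P_{p,t}(x)
    branches    : branches[x] = S_x, list of trellis branches (p, q) driven by bit x
    Returns gamma of shape (T, S, S), zero on non-existent branches.
    """
    T, S = parity_like.shape[0], parity_like.shape[1]
    gamma = np.zeros((T, S, S))
    for x in (0, 1):
        for (p, q) in branches[x]:
            # same three-factor product for every time instant t (vectorised over t)
            gamma[:, p, q] = sys_like[:, x] * parity_like[:, p, q] * prior[:, x]
    return gamma
```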


If the posterior computed by the first convolutional code is substituted as the prior of the second convolutional code, we get:

$$
\begin{aligned}
P(X_t=x \mid r) &\propto p(r_t^{(0)} \mid x_t)\, p(x_t=x) \sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(2)} \mid a_t^{(2)})\, \beta_{t+1}(q) \\
&\propto p(r_t^{(0)} \mid x_t)\; p(r_t^{(0)} \mid x_t)\, p(x_t=x) \sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(1)} \mid a_t^{(1)})\, \beta_{t+1}(q)\; \sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(2)} \mid a_t^{(2)})\, \beta_{t+1}(q)
\end{aligned} \tag{8}
$$

As we can see, $p(r_t^{(0)} \mid x_t)$ now appears twice, and the prior $p(x_t=x)$ is carried along again as well, so the same information is counted repeatedly. The reference texts appear to advise against this: instead of passing the full posterior, only the summation term in (7) — the probability information the first code contributes through its own parity — is handed to the second convolutional code, i.e. we pass:

$$
p(x_t=x) = \sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(1)} \mid a_t^{(1)})\, \beta_{t+1}(q) \tag{9}
$$
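In code, (9) is just a sum of $\alpha_t(p)\, p(r_t^{(1)} \mid a_t^{(1)})\, \beta_{t+1}(q)$ over the branch set $S_x$. The sketch below reuses the array conventions of the earlier sketches; the names and the final per-row normalisation are assumptions.

```python
def extrinsic(alpha, beta, parity_like, branches):
    """Extrinsic probabilities of equation (9), one row per time instant.

    alpha, beta : outputs of forward_backward(), shape (T + 1, S)
    parity_like : parity_like[t, p, q] = p(r_t^(1) | a_t^(1)) for branch (p, q)
    branches    : branches[x] = S_x
    Returns P_e of shape (T, 2); each row is normalised so it sums to 1.
    """
    T = parity_like.shape[0]
    P_e = np.zeros((T, 2))
    for t in range(T):
        for x in (0, 1):
            P_e[t, x] = sum(alpha[t, p] * parity_like[t, p, q] * beta[t + 1, q]
                            for (p, q) in branches[x])
        P_e[t] /= P_e[t].sum()   # keep it a probability distribution over x
    return P_e
```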

The overall flow of Turbo decoding is therefore roughly:

1) For the first convolutional code, compute the $\alpha, \beta$ probabilities.
2) Using formula (9), compute $P_{e,t}(x)=\sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(1)} \mid a_t^{(1)})\, \beta_{t+1}(q)$.
3) Pass $P_{e,t}(x)$ to the second convolutional code as its prior $p(x_t=x)$.
4) For the second convolutional code, compute the $\alpha, \beta$ probabilities.
5) Using formula (9) (with the second code's parity stream), compute $P_{e,t}(x)=\sum_{(p,q)\in S_x} \alpha_t(p)\, p(r_t^{(2)} \mid a_t^{(2)})\, \beta_{t+1}(q)$.
6) Pass $P_{e,t}(x)$ back to the first convolutional code as $p(x_t=x)$, return to step 1, and continue until done.


The detailed algorithm is as follows:

$j\in\{1,2\}$ indexes the constituent convolutional code.

$l$ indexes the iteration.

$P_{e,t}^{(l,j)}$: the extrinsic information computed by code $j$ in iteration $l$, to be passed on.

$P^{(l,j)}$: the prior probability used by code $j$ in iteration $l$.

$M$: the maximum number of iterations.

-----------------------------Algorithm------begin

Initialization: set $P^{(0,1)}(x_t=x)=P^{(0)}(x_t=x)$ (the initial prior fed to the first convolutional code, normally the uniform distribution).

Iteration: loop over $l=1,2,\dots,M$
   1. Feed $P^{(l-1,1)}(x_t=x)$ to the first convolutional code as the prior $P(x_t=x)$ and compute
      * all $\alpha, \beta$

      * the extrinsic information $P_{e,t}^{(l,1)}(x_t=x)$

   2. Set $P^{(l,2)}(x_t=x) = \Pi\!\left[P_{e,t}^{(l,1)}(x_t=x)\right]$, where $\Pi[\cdot]$ denotes interleaving.

   3. Feed $P^{(l,2)}(x_t=x)$ to the second convolutional code as the prior $P(x_t=x)$ and compute
      * all $\alpha, \beta$

   4. If this is not the last iteration
      * compute $P_{e,t}^{(l,2)}(x_t=x)$

      * set $P^{(l,1)}(x_t=x) = \Pi^{-1}\!\left[P_{e,t}^{(l,2)}(x_t=x)\right]$, where $\Pi^{-1}[\cdot]$ denotes deinterleaving.

   5. Otherwise, if this is the last iteration
      * using $P^{(l,2)}(x_t=x)$ as the prior $P(x_t=x)$, compute $P(x_t=x\mid r)$

      * deinterleave: $P(x_t=x\mid r) = \Pi^{-1}\!\left[P(x_t=x\mid r)\right]$
   

-----------------------------Algorithm------end
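Putting the pieces together, here is a minimal sketch of the loop above, reusing the `build_gamma`, `forward_backward` and `extrinsic` sketches from earlier. The interleaver $\Pi$ is represented by a permutation array `perm`, and the final posterior is formed as the product $P_{s,t}\,P_{p,t}\,P_{e,t}$ from (7) and then deinterleaved. All names and array layouts are assumptions; this only illustrates the flow of the algorithm and is not a reference implementation.

```python
def turbo_decode(sys_like, par1_like, par2_like, perm, branches, max_iter=8):
    """Iterative Turbo decoding loop (sketch).

    sys_like  : sys_like[t, x]     = p(r_t^(0) | x_t = x), natural (non-interleaved) order
    par1_like : par1_like[t, p, q] = p(r_t^(1) | a_t^(1)), parity of code 1, natural order
    par2_like : par2_like[t, p, q] = p(r_t^(2) | a_t^(2)), parity of code 2, interleaved order
    perm      : interleaver permutation, so that Pi[v] = v[perm]
    branches  : branches[x] = S_x, as in the earlier sketches
    Returns the de-interleaved posteriors P(x_t = x | r), shape (T, 2).
    """
    T = sys_like.shape[0]
    prior1 = np.full((T, 2), 0.5)            # P^(0,1): uniform initial prior
    sys2 = sys_like[perm]                    # systematic likelihoods as seen by decoder 2
    for it in range(1, max_iter + 1):        # l = 1, 2, ..., M
        # decoder 1: alpha/beta pass and extrinsic information P_e^(l,1)
        g1 = build_gamma(sys_like, par1_like, prior1, branches)
        a1, b1 = forward_backward(g1)
        e1 = extrinsic(a1, b1, par1_like, branches)
        prior2 = e1[perm]                    # P^(l,2) = Pi[ P_e^(l,1) ]
        # decoder 2: alpha/beta pass and extrinsic information P_e^(l,2)
        g2 = build_gamma(sys2, par2_like, prior2, branches)
        a2, b2 = forward_backward(g2)
        e2 = extrinsic(a2, b2, par2_like, branches)
        if it < max_iter:
            prior1 = np.empty_like(e2)
            prior1[perm] = e2                # P^(l,1) = Pi^{-1}[ P_e^(l,2) ]
        else:
            post = sys2 * prior2 * e2        # P(x_t = x | r) proportional to P_s * P_p * P_e, cf. (7)
            post /= post.sum(axis=1, keepdims=True)
            deint = np.empty_like(post)
            deint[perm] = post               # Pi^{-1}[ P(x_t = x | r) ]
            return deint
```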



Let us walk through the decoding process with a concrete example.


First, the initialization. The only prior we have is that the bit values 0 and 1 are equally likely, so:
$$
P^{(0,1)}(x_t=0)=P^{(0)}(x_t=0)=\frac{1}{2} \\
P^{(0,1)}(x_t=1)=P^{(0)}(x_t=1)=\frac{1}{2}
$$

For time instants 0 through 9, we compute the $\gamma$ probabilities with formula (6):
$$
\begin{aligned}
\gamma_0(p=0,q=0) &= p(r_0^{(0)} \mid x_0=0)\, p(r_0^{(1)} \mid a_0^{(1)}=-1)\, p(x_0=0) \\
\gamma_0(p=0,q=2) &= p(r_0^{(0)} \mid x_0=1)\, p(r_0^{(1)} \mid a_0^{(1)}=+1)\, p(x_0=1) \\
&\;\;\vdots \\
\gamma_0(p=3,q=1) &= p(r_0^{(0)} \mid x_0=0)\, p(r_0^{(1)} \mid a_0^{(1)}=-1)\, p(x_0=0) \\
\gamma_0(p=3,q=3) &= p(r_0^{(0)} \mid x_0=1)\, p(r_0^{(1)} \mid a_0^{(1)}=+1)\, p(x_0=1)
\end{aligned}
$$


Now compute all the $\alpha$ values.
Initialize $\alpha$ at time 0:
$$
\alpha_0(0) = 1 \\
\alpha_0(1) = 0 \\
\alpha_0(2) = 0 \\
\alpha_0(3) = 0
$$
$\alpha$ at time 1:

$$
\begin{aligned}
\alpha_1(0) &= \sum_{p\in\{0,1\}} \alpha_0(p)\,\gamma_0(p,0) = \alpha_0(0)\,\gamma_0(0,0)+\alpha_0(1)\,\gamma_0(1,0) \\
\alpha_1(1) &= \sum_{p\in\{2,3\}} \alpha_0(p)\,\gamma_0(p,1) = \alpha_0(2)\,\gamma_0(2,1)+\alpha_0(3)\,\gamma_0(3,1) \\
\alpha_1(2) &= \sum_{p\in\{0,1\}} \alpha_0(p)\,\gamma_0(p,2) = \alpha_0(0)\,\gamma_0(0,2)+\alpha_0(1)\,\gamma_0(1,2) \\
\alpha_1(3) &= \sum_{p\in\{2,3\}} \alpha_0(p)\,\gamma_0(p,3) = \alpha_0(2)\,\gamma_0(2,3)+\alpha_0(3)\,\gamma_0(3,3)
\end{aligned}
$$



Proceeding in the same way, we finally compute $\alpha$ at time 9:

$$
\begin{aligned}
\alpha_9(0) &= \sum_{p\in\{0,1\}} \alpha_8(p)\,\gamma_8(p,0) = \alpha_8(0)\,\gamma_8(0,0)+\alpha_8(1)\,\gamma_8(1,0) \\
\alpha_9(1) &= \sum_{p\in\{2,3\}} \alpha_8(p)\,\gamma_8(p,1) = \alpha_8(2)\,\gamma_8(2,1)+\alpha_8(3)\,\gamma_8(3,1) \\
\alpha_9(2) &= \sum_{p\in\{0,1\}} \alpha_8(p)\,\gamma_8(p,2) = \alpha_8(0)\,\gamma_8(0,2)+\alpha_8(1)\,\gamma_8(1,2) \\
\alpha_9(3) &= \sum_{p\in\{2,3\}} \alpha_8(p)\,\gamma_8(p,3) = \alpha_8(2)\,\gamma_8(2,3)+\alpha_8(3)\,\gamma_8(3,3)
\end{aligned}
$$

Next compute the $\beta$ probabilities, working backwards from time 10 down to time 1:

Initialize $\beta$ at time 10:

$$
\beta_{10}(0) = 1 \\
\beta_{10}(1) = 0 \\
\beta_{10}(2) = 0 \\
\beta_{10}(3) = 0
$$


Compute $\beta$ at time 9:

$$
\begin{aligned}
\beta_9(0) &= \sum_{q\in\{0,2\}} \gamma_9(0,q)\,\beta_{10}(q) = \gamma_9(0,0)\,\beta_{10}(0) + \gamma_9(0,2)\,\beta_{10}(2) \\
\beta_9(1) &= \sum_{q\in\{0,2\}} \gamma_9(1,q)\,\beta_{10}(q) = \gamma_9(1,0)\,\beta_{10}(0) + \gamma_9(1,2)\,\beta_{10}(2) \\
\beta_9(2) &= \sum_{q\in\{1,3\}} \gamma_9(2,q)\,\beta_{10}(q) = \gamma_9(2,1)\,\beta_{10}(1) + \gamma_9(2,3)\,\beta_{10}(3) \\
\beta_9(3) &= \sum_{q\in\{1,3\}} \gamma_9(3,q)\,\beta_{10}(q) = \gamma_9(3,1)\,\beta_{10}(1) + \gamma_9(3,3)\,\beta_{10}(3)
\end{aligned}
$$

Continuing likewise, compute $\beta$ at time 1:

$$
\begin{aligned}
\beta_1(0) &= \sum_{q\in\{0,2\}} \gamma_1(0,q)\,\beta_2(q) = \gamma_1(0,0)\,\beta_2(0) + \gamma_1(0,2)\,\beta_2(2) \\
\beta_1(1) &= \sum_{q\in\{0,2\}} \gamma_1(1,q)\,\beta_2(q) = \gamma_1(1,0)\,\beta_2(0) + \gamma_1(1,2)\,\beta_2(2) \\
\beta_1(2) &= \sum_{q\in\{1,3\}} \gamma_1(2,q)\,\beta_2(q) = \gamma_1(2,1)\,\beta_2(1) + \gamma_1(2,3)\,\beta_2(3) \\
\beta_1(3) &= \sum_{q\in\{1,3\}} \gamma_1(3,q)\,\beta_2(q) = \gamma_1(3,1)\,\beta_2(1) + \gamma_1(3,3)\,\beta_2(3)
\end{aligned}
$$


At this point we can compute the extrinsic probabilities (the extrinsic information):



$$
\begin{aligned}
P_{e,0}(0) &= \sum_{(p,q)\in S_0} \alpha_0(p)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(q) \\
&= \alpha_0(0)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(0) + \alpha_0(1)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(2) \\
&\qquad + \alpha_0(2)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(1) + \alpha_0(3)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(3) \\
P_{e,0}(1) &= \sum_{(p,q)\in S_1} \alpha_0(p)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(q) \\
&= \alpha_0(0)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(2) + \alpha_0(1)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(0) \\
&\qquad + \alpha_0(2)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(3) + \alpha_0(3)\, p(r_0^{(1)} \mid a_0^{(1)})\, \beta_1(1)
\end{aligned}
$$
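Reading the branch sets $S_0$ and $S_1$ off the two expressions above, the same computation can be exercised with the earlier sketches. The numbers below are random placeholders that only show the call pattern, not actual channel values:

```python
# Branch sets S_0 and S_1 read off from the two expressions above:
branches = {
    0: [(0, 0), (1, 2), (2, 1), (3, 3)],   # S_0: branches driven by input bit 0
    1: [(0, 2), (1, 0), (2, 3), (3, 1)],   # S_1: branches driven by input bit 1
}

# Placeholder likelihoods for the 10-step, 4-state example (random stand-ins;
# a real decoder would compute these from the received samples r).
rng = np.random.default_rng(0)
sys_like = rng.random((10, 2))             # p(r_t^(0) | x_t = x)
parity_like = rng.random((10, 4, 4))       # p(r_t^(1) | a_t^(1)) per branch
prior = np.full((10, 2), 0.5)              # uniform initial prior P^(0,1)

gamma = build_gamma(sys_like, parity_like, prior, branches)
alpha, beta = forward_backward(gamma)
P_e = extrinsic(alpha, beta, parity_like, branches)
print(P_e[0])   # [P_{e,0}(0), P_{e,0}(1)]: the two sums written out above
```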



We now assign the extrinsic information just computed as the prior for the second convolutional code (interleaving, as in step 2 of the algorithm, is omitted here for brevity):
$$
P^{(1,2)}(x_t=0)=P_{e,t}(0) \\
P^{(1,2)}(x_t=1)=P_{e,t}(1)
$$

Then, with the same steps, compute $\gamma, \alpha, \beta$ for the second code, obtain its extrinsic information $P_{e,t}(x)$, pass it back as the prior of the first convolutional code in the next round, and continue the process described above.

