
A Grouped Markov Superposition Coding Method Based on Double Recursion

A technology of superposition coding and double recursion, applied to error correction/detection using block codes, to coding, and to error-detection coding using multiple parity bits; it addresses the high decoding complexity and high decoding delay of the non-recursive block Markov superposition coding method.

Active Publication Date: 2021-01-26
JINAN UNIVERSITY

AI Technical Summary

Problems solved by technology

However, the non-recursive block Markov superposition coding method has the following problem: when repetition codes and parity-check codes are used as the basic code, a large coding memory length m is required to effectively approach the channel capacity, and the larger the memory length m, the larger the required decoding delay d and the higher the corresponding decoding complexity and decoding delay.
Therefore, the non-recursive block Markov superposition coding method cannot be used in communication and storage systems that require extremely low latency and extremely low computational complexity.


Examples


Embodiment 1

[0043] Set m_1 = m_2 = 1 and refer to Figure 1; the corresponding coding diagram is shown in Figure 3. Referring to Figure 3, a binary information sequence u of length K = kL = 1250 × 343 is divided into L = 343 equal-length groups u = (u^(0), u^(1), …, u^(342)), each group having length k = 1250. The basic code encoder ENC uses a repetition code with code length n = 2 and information bit length k = 1. In this example, two random interleavers are used. The symbol-wise superposition operator S uses bit-wise addition over the binary field (XOR). The termination length T is set equal to the decoding delay d, i.e., T = d. Referring to Figure 1, the encoding method includes the following steps (an illustrative sketch of this setup is given after Step 1 below):

[0044] Step 1. Divide the information sequence u into 343 equal-length groups u = (u^(0), u^(1), …, u^(342)), each of length 1250. For t = -1, the length-2500 sequences w_1^(t) and w_2^(t) are initialized as all-zero sequences, that is, for t = -1 there is ...
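As referenced above, the following Python sketch illustrates the Embodiment 1 parameter setup and Step 1. It is a minimal sketch under stated assumptions, not the patent's reference implementation: the decoding delay value, the random seeds, and the bit-wise application of the [2,1] repetition code to a whole 1250-bit group (which makes every coded block and feedback sequence 2500 bits long, consistent with the length-2500 sequences in Step 1) are assumptions, and the message below is only a random placeholder.

import numpy as np

# --- Embodiment 1 parameters (illustrative variable names) ---
k, L = 1250, 343            # group length and number of groups, K = k*L = 428750
n_rep = 2                   # basic code: repetition code, code length 2, info length 1
m1 = m2 = 1                 # both recursion memories equal 1
d = 8                       # decoding delay; the excerpt does not give its value (assumed)
T = d                       # termination length set equal to the decoding delay
block_len = n_rep * k       # each coded block / feedback sequence has length 2500

rng = np.random.default_rng(seed=0)

# Two independent random interleavers of length 2500 (random permutations)
pi1 = rng.permutation(block_len)
pi2 = rng.permutation(block_len)

def enc_repetition(u_group: np.ndarray) -> np.ndarray:
    """Basic-code encoder ENC: the [2,1] repetition code applied bit-wise to a group."""
    return np.repeat(u_group.astype(np.uint8), n_rep)

def superpose(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Symbol-wise superposition S: bit-wise addition over the binary field (XOR)."""
    return a ^ b

# --- Step 1: partition the message and zero-initialize the feedback sequences ---
u = rng.integers(0, 2, size=k * L, dtype=np.uint8)    # placeholder binary message
groups = u.reshape(L, k)                              # u^(0), u^(1), ..., u^(342)

# For t = -1, the length-2500 sequences w1^(t) and w2^(t) are all-zero
w1 = {-1: np.zeros(block_len, dtype=np.uint8)}
w2 = {-1: np.zeros(block_len, dtype=np.uint8)}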



Abstract

The invention discloses a double-recursion-based grouped Markov superposition coding method. An error correction code C[n,k] with code length n and information bit length k is used as the basic code, and a message sequence u of length K = kL is encoded into a codeword c of length N = n(L+T). The encoding method comprises the following steps: first, the length-kL information sequence u is divided into L equal-length groups u = (u^(0), u^(1), …, u^(L-1)), each of length k; for times t = -1, -2, …, -m_1, the length-n sequence w_1^(t) is set to an all-zero sequence; for times t = -1, -2, …, -m_2, the length-n sequence w_2^(t) is set to an all-zero sequence; then, at times t = 0, 1, …, L-1, a sequence of length k is sent to the encoder ENC of the basic code C[n,k] to obtain a sequence of length n, and the t-th subsequence c^(t) of the codeword c is computed by combining it with the feedback. The present invention has the advantages of simple encoding, low decoding complexity, and the ability to approach channel capacity. Compared with the traditional block Markov superposition coding method, the present invention has a lower decoding error floor and lower decoding complexity.
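Read as pseudocode, the abstract above describes a double-recursive encoder. The Python sketch below follows those steps with a toy repetition basic code. Because this excerpt does not spell out exactly how the feedback is combined into c^(t) or what is transmitted during the T termination blocks, the particular recursion and the all-zero tail blocks used here are illustrative assumptions, not the patent's definition; the function and parameter names are likewise invented for the example.

import numpy as np

def repetition_encode(u_group: np.ndarray, n: int = 2) -> np.ndarray:
    """Toy basic code applied bit-wise: an [n,1] repetition code."""
    return np.repeat(u_group.astype(np.uint8), n)

def gmsc_encode(u, k, L, T, m1, m2, pi1, pi2, n=2):
    """Encode a length-K = k*L message into a codeword of length n*k*(L+T)."""
    groups = u.reshape(L, k)
    block_len = n * k

    # w1^(t) and w2^(t) initialized to all-zero sequences for the m1 / m2
    # time instants before t = 0
    w1 = [np.zeros(block_len, dtype=np.uint8) for _ in range(m1)]
    w2 = [np.zeros(block_len, dtype=np.uint8) for _ in range(m2)]

    codeword = []
    for t in range(L + T):
        # Termination: all-zero information blocks for t >= L (an assumption)
        u_t = groups[t] if t < L else np.zeros(k, dtype=np.uint8)
        v_t = repetition_encode(u_t, n)              # output of the basic code ENC
        # Symbol-wise superposition over GF(2) of the fresh block with the
        # interleaved feedback from both recursions (illustrative combination)
        c_t = v_t ^ w1[-1][pi1] ^ w2[-1][pi2]
        codeword.append(c_t)
        # Double recursion: both feedback registers are updated from quantities
        # computed at time t (again, an illustrative choice)
        w1 = w1[1:] + [c_t]
        w2 = w2[1:] + [v_t ^ w2[-1][pi2]]
    return np.concatenate(codeword)

# Usage with small toy parameters (not the Embodiment 1 values)
k, L, T, m1, m2, n = 4, 6, 2, 1, 1, 2
rng = np.random.default_rng(seed=1)
pi1, pi2 = rng.permutation(n * k), rng.permutation(n * k)
u = rng.integers(0, 2, size=k * L, dtype=np.uint8)
c = gmsc_encode(u, k, L, T, m1, m2, pi1, pi2, n)
print(c.shape)    # (64,), i.e. n*k*(L+T) = 2*4*(6+2)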

Description

Technical field

[0001] The invention relates to the technical fields of digital communication and digital storage, and in particular to a double-recursion-based grouped Markov superposition coding method.

Background technique

[0002] Data in communication and storage systems are affected by noise and suffer errors, so that the data cannot be received or restored correctly. With the increasing demand for personal data and storage, data reliability in communication and storage systems has drawn more and more attention. In order to achieve efficient and reliable data transmission and data storage, it is necessary to design a channel code that can approach the channel capacity and has efficient encoding and decoding algorithms. Since Shannon proposed the famous channel coding theorem in 1948, researchers have devoted themselves to the research and design of good codes that approach the channel capacity. In 1993, Berrou et al. proposed the Turbo code, whic...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H03M13/11, H03M13/29
CPC: H03M13/1125, H03M13/1194, H03M13/2972
Inventors: 赵山程, 马啸, 黄勤, 白宝明
Owner: JINAN UNIVERSITY