
Course slides for "Fundamentals of Multimedia Technology" (English lecture notes): Chapter 7 Lossless Compression Algorithms


Chapter 7 Lossless Compression Algorithms

7.1 Introduction
7.2 Basics of Information Theory
7.3 Run-Length Coding
7.4 Variable-Length Coding (VLC)
7.5 Dictionary-based Coding
7.6 Arithmetic Coding
7.7 Lossless Image Compression
7.8 Further Exploration

7.1 Introduction

• Compression: the process of coding that will effectively reduce the total number of bits needed to represent certain information.

Fig. 7.1: A General Data Compression Scheme (input data → encoder (compression) → storage or networks → decoder (decompression) → output data).

Introduction (cont'd)

• If the compression and decompression processes induce no information loss, then the compression scheme is lossless; otherwise, it is lossy.

• Compression ratio:

$$\text{compression ratio} = \frac{B_0}{B_1} \qquad (7.1)$$

$B_0$ – number of bits before compression
$B_1$ – number of bits after compression
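As a quick worked instance of Eq. (7.1), a minimal sketch in Python (the function name and the sample sizes are ours, not from the slides):

```python
def compression_ratio(b0: int, b1: int) -> float:
    """Eq. (7.1): b0 = number of bits before compression, b1 = after."""
    return b0 / b1

# e.g., 1,000,000 bits compressed down to 250,000 bits:
print(compression_ratio(1_000_000, 250_000))  # 4.0, i.e., 4:1
```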

7.2 Basics of Information Theory

• The entropy η of an information source with alphabet $S = \{s_1, s_2, \ldots, s_n\}$ is:

$$\eta = H(S) = \sum_{i=1}^{n} p_i \log_2 \frac{1}{p_i} \qquad (7.2)$$

$$= -\sum_{i=1}^{n} p_i \log_2 p_i \qquad (7.3)$$

$p_i$ – probability that symbol $s_i$ will occur in S.
$\log_2 \frac{1}{p_i}$ – indicates the amount of information (self-information as defined by Shannon) contained in $s_i$, which corresponds to the number of bits needed to encode $s_i$.
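As a minimal sketch, Eq. (7.3) can be computed directly in Python (the function name `entropy` is ours):

```python
from math import log2

def entropy(probabilities):
    """Eq. (7.3): H(S) = -sum_i p_i * log2(p_i), in bits per symbol."""
    # Terms with p_i = 0 are skipped, since p * log2(p) -> 0 as p -> 0.
    return -sum(p * log2(p) for p in probabilities if p > 0)
```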

Distribution of Gray-Level Intensities

Fig. 7.2: Histograms for Two Gray-level Images. (Both plot $p_i$ against gray level $i \in [0, 255]$: (a) is uniform at $p_i = 1/256$; (b) has only two levels, with probabilities 2/3 and 1/3.)

• Fig. 7.2(a) shows the histogram of an image with a uniform distribution of gray-level intensities, i.e., $\forall i \; p_i = 1/256$. Hence, the entropy of this image is:

$$\log_2 256 = 8 \qquad (7.4)$$
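Continuing the sketch above, the two histograms of Fig. 7.2 give:

```python
print(entropy([1/256] * 256))  # Fig. 7.2(a): uniform -> 8.0 bits/symbol
print(entropy([2/3, 1/3]))     # Fig. 7.2(b): two levels -> ~0.92 bits/symbol
```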

Entropy and Code Length

• As can be seen in Eq. (7.3), the entropy η is a weighted sum of the terms $\log_2 \frac{1}{p_i}$; hence it represents the average amount of information contained per symbol in the source S.

• The entropy η specifies the lower bound for the average number of bits needed to code each symbol in S, i.e.,

$$\eta \leq \bar{l} \qquad (7.5)$$

$\bar{l}$ – the average length (measured in bits) of the codewords produced by the encoder.

7.3 Run-Length Coding

• Memoryless source: an information source whose symbols are independently distributed. Namely, the value of the current symbol does not depend on the values of the previously appeared symbols.

• Instead of assuming a memoryless source, Run-Length Coding (RLC) exploits memory present in the information source.

• Rationale for RLC: if the information source has the property that symbols tend to form continuous groups, then such a symbol and the length of its group can be coded together (see the sketch below).
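A minimal sketch of this idea (the function names are ours; a real RLC format would also fix how the (symbol, run-length) pairs are packed into bits):

```python
from itertools import groupby

def rle_encode(data):
    """Replace each run of identical symbols with a (symbol, run_length) pair."""
    return [(symbol, len(list(run))) for symbol, run in groupby(data)]

def rle_decode(pairs):
    """Expand (symbol, run_length) pairs back into the original sequence."""
    return [symbol for symbol, length in pairs for _ in range(length)]

print(rle_encode("aaaabbc"))                        # [('a', 4), ('b', 2), ('c', 1)]
print("".join(rle_decode(rle_encode("aaaabbc"))))   # 'aaaabbc'
```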

7.4 Variable-Length Coding (VLC)

Shannon-Fano Algorithm — a top-down approach:

1. Sort the symbols according to the frequency count of their occurrences.
2. Recursively divide the symbols into two parts, each with approximately the same total count, until all parts contain only one symbol.

(A code sketch of this procedure follows Table 7.1 below.)

An example: coding of "HELLO". Frequency count of the symbols in "HELLO":

Symbol:  H  E  L  O
Count:   1  1  2  1

Fig. 7.3: Coding Tree for HELLO by Shannon-Fano. (The splits, with branches labeled 0 and 1: (a) {L}:(2) vs. {H,E,O}:(3); (b) then {H}:(1) vs. {E,O}:(2); (c) finally {E}:(1) vs. {O}:(1).)

Table 7.1: Result of Performing Shannon-Fano on HELLO

Symbol   Count   log2(1/p_i)   Code   # of bits used
L        2       1.32          0      2
H        1       2.32          10     2
E        1       2.32          110    3
O        1       2.32          111    3

TOTAL number of bits: 10
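A minimal Python sketch of the top-down procedure above (the function names and the tie-breaking rule for near-equal splits are ours; other tie-breaks yield different but equally valid codes):

```python
from collections import Counter

def shannon_fano(freqs):
    """Top-down Shannon-Fano: sort symbols by count, then recursively
    split the list into two parts with (nearly) equal total counts.
    freqs: iterable of (symbol, count) pairs. Returns {symbol: code}."""
    codes = {}

    def divide(items, prefix):
        if len(items) == 1:
            codes[items[0][0]] = prefix or "0"
            return
        total = sum(c for _, c in items)
        # Pick the earliest cut minimizing |left total - right total|.
        running, best_cut, best_diff = 0, 1, total
        for i in range(1, len(items)):
            running += items[i - 1][1]
            diff = abs(total - 2 * running)
            if diff < best_diff:
                best_cut, best_diff = i, diff
        divide(items[:best_cut], prefix + "0")  # upper part: append bit 0
        divide(items[best_cut:], prefix + "1")  # lower part: append bit 1

    divide(sorted(freqs, key=lambda sc: -sc[1]), "")
    return codes

codes = shannon_fano(Counter("HELLO").items())
print(codes)  # {'L': '0', 'H': '10', 'E': '110', 'O': '111'}, as in Table 7.1
```

Note that the resulting average codeword length is $\bar{l} = 10/5 = 2$ bits per symbol, slightly above the source entropy $\eta \approx 1.92$ bits, consistent with Eq. (7.5).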
