Title: Data Compression Techniques
Authors: Thakur, Vikram
Ghrera, Satya Prakash [Guided by]
Keywords: Data compression
Data compression model
Shannon-Fano coding
Issue Date: 2015
Publisher: Jaypee University of Information Technology, Solan, H.P.
Abstract: Adaptive Huffman coding is one of many coding techniques used to compress data in a multitude of real-world applications; GZip, 7-Zip, and WinRAR are only a handful of the utilities that rely on such data compression algorithms. Data compression algorithms can be divided into two categories: lossy compression algorithms and lossless compression algorithms. Adaptive Huffman coding, which is derived from Huffman coding, falls into the latter category; Huffman coding is also used extensively in image formats such as JPEG (Joint Photographic Experts Group). Straight (static) Huffman compression has a few shortcomings. First, the Huffman tree must be sent at the beginning of the compressed file, or the decompressor will not be able to decode the data; this adds overhead. Second, static Huffman compression builds its codes from the statistics of the whole file, so if one section of the input uses a character more heavily, the codes do not adjust within that section. Finally, the whole file is sometimes not available for counting symbol frequencies at all, as with live (streaming) data.
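The "whole-file statistics" step the abstract refers to can be illustrated with a minimal static Huffman sketch (the function name, data layout, and example string below are illustrative, not taken from the report): the entire input must be scanned and its frequency table merged into a tree before a single symbol can be encoded, which is exactly what adaptive Huffman coding avoids.

```python
import heapq
from collections import Counter

def huffman_codes(data: str) -> dict[str, str]:
    """Build a static Huffman code table from whole-input symbol counts."""
    freq = Counter(data)              # requires scanning the entire input first
    if len(freq) == 1:                # degenerate case: one distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, unique tiebreaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lowest-frequency subtrees, prefixing
        # their partial codes with 0 and 1 respectively.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
```

Because the codes depend on the counts of the finished input, a decoder needs the same table (in practice, a serialized tree prepended to the compressed stream), which is the transmission overhead the abstract describes.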
Appears in Collections:B.Tech. Project Reports

Files in This Item:
File: Data Compression Techniques.pdf (1.09 MB, Adobe PDF)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.