Please use this identifier to cite or link to this item:
http://ir.juit.ac.in:8080/jspui/jspui/handle/123456789/5881
Title: Data Compression Techniques
Authors: Thakur, Vikram; Ghrera, Satya Prakash [Guided by]
Keywords: Data compression; Data compression model; Algorithms; Shannon-Fano coding
Issue Date: 2015
Publisher: Jaypee University of Information Technology, Solan, H.P.
Abstract: Adaptive Huffman coding is one of many coding techniques used to compress data in a multitude of real-world applications; GZip, 7-Zip, and WinRAR are only a handful of the utilities that rely on such data compression algorithms. Data compression algorithms can be divided into two categories: (1) lossy compression algorithms and (2) lossless compression algorithms. Adaptive Huffman, which is derived from Huffman coding, falls into the latter category and is used extensively in image formats such as JPEG (Joint Photographic Experts Group). Straight Huffman compression has a few shortcomings. First, the Huffman tree must be sent at the beginning of the compressed file, or the decompressor will not be able to decode the data; this adds overhead. Second, Huffman compression uses the statistics of the whole file, so if one section uses a character more heavily, the code will not adjust within that section. Finally, the whole file is sometimes not available for gathering the counts in advance (as with live data streams).
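The static-Huffman workflow the abstract contrasts with the adaptive variant (count symbol frequencies over the whole input first, then build the code tree) can be sketched as follows. This is an illustrative helper, not code from the report; the function name `huffman_codes` and the greedy heap-based construction are standard textbook choices, assumed here for clarity:

```python
import heapq
from collections import Counter

def huffman_codes(data):
    # Count frequencies over the WHOLE input -- this is exactly the
    # pass an adaptive Huffman coder avoids, and the reason the static
    # scheme must also ship the resulting tree to the decompressor.
    freq = Counter(data)
    # Degenerate case: a single distinct symbol gets a one-bit code.
    if len(freq) == 1:
        return {next(iter(freq)): "0"}
    # Min-heap of (frequency, tiebreaker, node); leaves are raw symbols,
    # internal nodes are (left, right) pairs. The integer tiebreaker
    # keeps heapq from ever comparing the nodes themselves.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least-frequent subtrees.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    _, _, tree = heap[0]
    # Walk the tree: a left edge appends "0", a right edge appends "1".
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(tree, "")
    return codes

codes = huffman_codes("abracadabra")
```

Because "a" dominates the sample string, it receives the shortest codeword, and no codeword is a prefix of another; an adaptive coder would instead update these counts (and rebalance the tree) after every symbol, so neither the frequency pass nor the transmitted tree is needed.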
URI: | http://ir.juit.ac.in:8080/jspui//xmlui/handle/123456789/5881 |
Appears in Collections: | B.Tech. Project Reports |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Data Compression Techniques.pdf | | 1.09 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.