Google Video Series Deciphers Compression Algorithms for Developers
Source: Todd R. Weiss
Google's video series for developers explains the theory and use of compression algorithms to help them learn how to build smaller, better apps and Websites.
Google is looking to help application and Website developers shrink their content so that more users will be able to access games and Websites using mobile devices and over slower-than-desired Internet connections.
That's the idea behind Google's new developer-targeted "Compressor Head" video series, which details the evolution and use of compression algorithms that can squish content to reduce users' data usage and increase performance. The series, available through the Google Developers YouTube channel, was unveiled by Colt McAnlis, a Google developer advocate, in a May 20 post on the Google Developers Blog.
"The next five billion humans who come online will be doing so from parts of the world where connectivity is costly and slow," wrote McAnlis. "With the average Website approaching 2 megabytes in size and the average Android game approaching 125 megabytes, users in these markets will have to make a tough choice between content and cost. Compression algorithms, which address this issue, will become critically important over the next decade."
The video series includes three episodes, the first of which details Variable Length Codes, which have been at the heart of data compression algorithms since the early 1950s, according to McAnlis. Episode 1 also explores the creation of information theory and how it spawned the concept of variable length codes.
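The idea is simple to demonstrate: assign short bit patterns to frequent symbols and longer ones to rare symbols, so the total bit count shrinks. The Python sketch below (an illustration, not code from the series) builds a Huffman code, the classic variable length code from 1952, and compares its output size against plain fixed-width bytes:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code: frequent symbols get shorter bit strings."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreaker, {symbol: code-so-far}).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        # Prefix the two cheapest subtrees' codes with 0 and 1 and merge.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (n1 + n2, count, merged))
        count += 1
    return heap[0][2]

text = "compression algorithms compress"
code = huffman_code(text)
encoded = "".join(code[ch] for ch in text)
print(f"fixed-width: {len(text) * 8} bits, Huffman: {len(encoded)} bits")
```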
Episode 2 details the LZ Compression Family, which in the world of compression reigns supreme as the most important algorithm family, according to McAnlis. "Born in the late 1970s, the Lempel-Ziv algorithms have become the most dominant dictionary encoding schemes in compression. This episode explains why these algorithms are so dominant."
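Dictionary encoding replaces repeated runs of data with short references to earlier occurrences. As a rough illustration of the mechanics (a toy sketch, not Lempel and Ziv's full 1977 formulation), the following Python encoder emits (offset, length, next-byte) triples that point back into a sliding window:

```python
def lz77_encode(data, window=255):
    """Toy LZ77: emit (offset, length, next_byte) triples by finding the
    longest match for the upcoming bytes inside a sliding window."""
    i, out = 0, []
    while i < len(data):
        best_off, best_len = 0, 0
        start = max(0, i - window)
        for off in range(1, i - start + 1):
            length = 0
            # Matches may run past position i (overlapping copies are legal).
            while i + length < len(data) and data[i + length - off] == data[i + length]:
                length += 1
            if length > best_len:
                best_off, best_len = off, length
        nxt = data[i + best_len] if i + best_len < len(data) else None
        out.append((best_off, best_len, nxt))
        i += best_len + 1
    return out

def lz77_decode(triples):
    out = bytearray()
    for off, length, nxt in triples:
        for _ in range(length):
            out.append(out[-off])  # copy from already-decoded output
        if nxt is not None:
            out.append(nxt)
    return bytes(out)

data = b"abracadabra abracadabra"
assert lz77_decode(lz77_encode(data)) == data
```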
The third episode in the video series describes Markov Chain Compression, which is "at the cutting edge of compression algorithms," wrote McAnlis. "These algorithms adopt an artificial intelligence approach to compression by allowing the encoder and decoder to 'predict' what data is coming next. In this episode, you'll learn how these magical algorithms compress data, and why some think that they are the future of compression."
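The prediction idea can be made concrete without a full arithmetic coder: a context model assigns the next symbol a probability given the previous one, and an ideal entropy coder would charge about -log2(p) bits for a symbol predicted with probability p. This hypothetical Python sketch measures that ideal cost with an adaptive order-1 Markov model; the function name and the Laplace smoothing are illustrative choices, not anything from the series:

```python
import math
from collections import defaultdict

def predicted_cost_bits(data):
    """Adaptive order-1 Markov model: predict each byte from the previous
    byte; an ideal entropy coder would spend -log2(p) bits per byte."""
    counts = defaultdict(lambda: defaultdict(int))  # context -> symbol -> count
    totals = defaultdict(int)
    context, bits = None, 0.0
    for sym in data:
        # Laplace-smoothed probability of `sym` given `context`.
        p = (counts[context][sym] + 1) / (totals[context] + 256)
        bits += -math.log2(p)
        counts[context][sym] += 1  # update the model after coding; the
        totals[context] += 1       # decoder can mirror this, so no table is sent
        context = sym
    return bits

data = b"the quick brown fox jumps over the lazy dog " * 20
print(f"{len(data) * 8} raw bits vs ~{predicted_cost_bits(data):.0f} predicted bits")
```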
The topic of compression and how it can make content more accessible is key for developers as they create their future products, wrote McAnlis. "Most developers are content to let compression be someone else's problem. But the truth is that these algorithms sit in the intersection of optimization, information theory, and pragmatism. These videos will take us through the history of information theory, explain why compression matters, and show how different algorithm families approach this challenge."
Data compression and its impact on users has been on Google's radar for quite some time.
In January 2014, Google announced new Chrome browsers for Android and iOS devices that included data compression services aimed at helping users reduce their data usage by up to 50 percent, according to an earlier eWEEK report.
In March 2013, Google released the Zopfli Compression Algorithm, an open-source, general-purpose data compression library that can make files 3 to 8 percent smaller than those run through the existing zlib library, helping to speed data transfer. Zopfli gets its name from a traditional Swiss braided bread (Zopf). Zopfli is an implementation of the Deflate compression algorithm that creates a smaller output size compared with previous techniques. Written in C and released under an Apache Software Foundation 2.0 open-source license, it is a compression-only library whose bit stream is compatible with the compression used in gzip, Zip, PNG, HTTP requests and others.
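That bit-stream compatibility is easy to picture: a Deflate decoder does not care which encoder produced the stream, so output from Zopfli's slow, exhaustive search decodes with the same routine as output from zlib's fast levels. The sketch below uses only Python's standard zlib module to illustrate the principle; Zopfli itself is a separate C library and is not invoked here:

```python
import zlib

data = b"Deflate is Deflate: the decoder does not care who encoded it. " * 100

# zlib's fast and thorough settings produce different sizes, but every one
# is a valid Deflate stream; Zopfli spends far more CPU searching for a
# still-smaller stream that the very same zlib.decompress() call can read.
for level in (1, 6, 9):
    blob = zlib.compress(data, level)
    assert zlib.decompress(blob) == data
    print(f"level {level}: {len(blob)} bytes")
```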