Saarland University professor receives top research award for improved image compression
Source: Saarland University


Joachim Weickert's method to improve data compression techniques draws on a process observed in nature and follows the same rules as those governing the propagation of heat. For the five-year period 2017 to 2022, Professor Weickert will receive up to €2.46 million in EU funding to support his research. Credit: Oliver Dietze

According to the statistics portal statista.com, the amount of digital data created worldwide in 2015 was about 8.5 billion terabytes. By 2020, the volume of data created annually will have increased almost five-fold to the gigantic figure of 40 billion terabytes (equivalent to 40,000 exabytes or 40 zettabytes). A large portion of this digital information arises from online video services and from images of ever higher resolution.

While it is true that the capacity of modern digital storage media is also growing, it simply cannot keep pace with the rising flood of digital data. The data therefore needs to be compressed using a special set of computational rules ('data compression algorithms') so that it can be stored more efficiently. Common compression methods currently in use, such as JPEG for photos or MPEG-4 for videos, reduce the data volume by storing visually less relevant information less precisely or by simply eliminating it. Up until now this has been done in the frequency domain. Just as audio signals can be considered to be composed of different frequencies, optical images can be treated as two-dimensional signals that can be decomposed into individual frequencies. If the compression process is successful, the observer should barely be able to distinguish the compressed image from the original. However, these techniques have their limits. Put simply, if too much information from the original is discarded and the image is compressed too strongly, the difference between the original and the compressed image becomes noticeable: the resulting image suffers from unpleasant artefacts and the loss of information is readily apparent to the observer.
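As a rough illustration of how such frequency-based compression works, the following Python sketch transforms an image into the frequency domain with a discrete cosine transform, discards the weakest coefficients, and transforms back. It is a toy example for intuition only, not the actual JPEG pipeline (which works on 8×8 blocks with quantisation tables and entropy coding); the array `img` and the `keep_ratio` parameter are placeholders chosen for this sketch.

```python
import numpy as np
from scipy.fft import dctn, idctn


def compress_frequency_domain(img, keep_ratio=0.05):
    """Toy frequency-domain compression: keep only the strongest DCT coefficients."""
    coeffs = dctn(img, norm='ortho')               # 2D discrete cosine transform
    threshold = np.quantile(np.abs(coeffs), 1.0 - keep_ratio)
    coeffs[np.abs(coeffs) < threshold] = 0.0       # drop visually less relevant detail
    return idctn(coeffs, norm='ortho')             # reconstruct the (lossy) image


# Placeholder usage: a real photograph would compress far better than random noise.
img = np.random.rand(256, 256)
reconstructed = compress_frequency_domain(img, keep_ratio=0.05)
```

The harder the coefficients are cut (smaller `keep_ratio`), the more visible the artefacts in the reconstruction become, which is exactly the limitation described above.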

Joachim Weickert, Professor of Mathematics and Informatics at Saarland University, is taking a quite different approach to image compression. He and his research group in Saarbrücken are working on a method that only needs to store a few particularly important pixels but is still able to reconstruct a high-quality image. Using only a very small fraction of the pixels in the original image, their technique can achieve excellent compression ratios. Weickert has been able to show that at high compression rates techniques of this type have the potential to do better than the established standard methods such as JPEG.

The technique makes use of a phenomenon that Weickert has copied from nature. Using variations of the heat equation, which scientists use to describe and compute how the distribution of heat in materials changes with time, Weickert is able to reconstruct images with high precision. The underlying idea is not difficult to explain: 'The stored pixels can be thought of as tiny air conditioners. The brighter a pixel, the higher its temperature. Just as heat diffuses through space, so too does pixel information, spreading into those neighbouring areas where nothing was stored,' explains Joachim Weickert. Starting from just a few stored pixels, the missing pixels can be reconstructed to yield a surprisingly good visual image.

Over the last seven years, Joachim Weickert has been able to lay the foundations of this research field using the money awarded with his Leibniz Prize. The ERC Advanced Grant has been awarded to Weickert in recognition of the success of this earlier research work. The award of up to €2.46m over a five-year period follows on seamlessly from the Leibniz Prize and affords Weickert the opportunity to continue his research into these promising ideas so that they can be incorporated into future coding standards. Before that can happen, however, there are a number of very difficult problems to solve.
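The 'air conditioner' picture can be turned into a minimal sketch in Python with NumPy: the stored pixels are held fixed while repeated heat-equation steps spread their values into the unstored areas. This uses only the simplest (homogeneous) form of diffusion with wrap-around borders, a strong simplification of the more sophisticated diffusion models Weickert's group works with; the arrays `stored` and `mask` are hypothetical inputs.

```python
import numpy as np


def diffusion_inpaint(stored, mask, iterations=5000):
    """Reconstruct an image from a few stored pixels by homogeneous diffusion.

    stored : 2D array holding known grey values at the stored positions (0 elsewhere)
    mask   : boolean 2D array, True where a pixel was stored
    """
    u = stored.astype(float).copy()
    for _ in range(iterations):
        # Explicit heat-equation step: each pixel moves towards the mean of its
        # four neighbours (borders wrap around for simplicity).
        neighbours = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                             np.roll(u, 1, 1) + np.roll(u, -1, 1))
        # The stored pixels act like tiny air conditioners: their values stay fixed
        # while their information diffuses into the areas where nothing was stored.
        u = np.where(mask, stored, neighbours)
    return u
```

Run to (approximate) steady state, this fills every unstored pixel with a smooth interpolation of the stored ones, which is why a surprisingly small fraction of pixels can already yield a usable reconstruction.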

'One challenge consists of deciding precisely which pixels to select so as to get the best result,' says Weickert, explaining one of the avenues to be explored in the coming years. 'Intuitively it seems pretty clear that the pixels to be stored should be chosen close to edges, that is, at locations where things change a lot, such as at the edge of a face that is contrasted against a background. That helps us to improve the reconstruction of the original.' But which pixels are they exactly? That is something the researchers know only in a few rather simple special cases. If they could find the answer to that question, it would represent a huge step forward in the quality of the image reconstruction process.
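A crude way to act on the 'close to edges' intuition is sketched below: rank pixels by the magnitude of a discrete Laplacian (a simple edge indicator) and store the top few per cent. This is only an illustrative heuristic, not the optimal selection the researchers are searching for; the function name and the `fraction` parameter are invented for this sketch.

```python
import numpy as np


def select_pixels_near_edges(img, fraction=0.05):
    """Toy heuristic: mark for storage the pixels where the image changes most."""
    # Discrete Laplacian magnitude as a rough measure of local change.
    lap = np.abs(
        np.roll(img, 1, 0) + np.roll(img, -1, 0) +
        np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img
    )
    k = int(fraction * img.size)
    flat_idx = np.argpartition(lap.ravel(), -k)[-k:]   # indices of the k largest values
    mask = np.zeros(img.shape, dtype=bool)
    mask.ravel()[flat_idx] = True
    return mask
```

Such a heuristic already performs noticeably better than storing pixels at random, but it is far from the truly optimal pixel set, which is exactly the open problem Weickert describes.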

This is where mathematics comes into play. To illustrate the complexity of this task, Joachim Weickert offers the following example: 'Playing the lottery is a pretty hopeless undertaking. The probability of predicting the right six numbers from the 49 numbers in the German lottery is very low. It's about 1 in 14 million. If I now have an image made up of 8,000,000 pixels, and I wanted to choose the best 400,000 pixels so that I could produce an ideal reconstruction of the original picture, the probability of making the right choice is now one in a number that contains 689,710 digits!' Put another way, without the help of good mathematical theories it is practically impossible to determine the ideal combination of pixels for storage that will enable the 'least lossy' reconstruction of the original.
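The numbers in this comparison are easy to check with a few lines of Python: the lottery odds are the binomial coefficient C(49, 6), and the number of ways of picking 400,000 pixels out of 8,000,000 is C(8,000,000, 400,000), whose digit count can be obtained from the log-gamma function without ever writing out the enormous integer.

```python
import math

# Lottery: choosing the right 6 numbers out of 49.
print(math.comb(49, 6))                # 13983816, i.e. roughly 1 in 14 million

# Pixel selection: choosing 400,000 pixels out of 8,000,000.
# The binomial coefficient itself is far too large to print, but its number
# of decimal digits follows from the log-gamma function.
n, k = 8_000_000, 400_000
log10_combinations = (math.lgamma(n + 1) - math.lgamma(k + 1)
                      - math.lgamma(n - k + 1)) / math.log(10)
print(int(log10_combinations) + 1)     # 689710 digits, as quoted in the text
```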

Another problem concerns the large amount of computing power required. Computer simulations of the heat equation require a great deal more computing power than frequency-based methods such as JPEG. This is an area where the researchers need to make significant progress by developing new and considerably more efficient algorithms that are able to make full use of the options offered by modern graphics processors.

Joachim Weickert has an ambitious goal: 'In five years' time, we want to be able to use our methods to compress 4K videos and then decompress the resulting file in real time. To do that, our algorithms need to get faster by a factor of about 100.' But he is optimistic: 'Ten years ago, well-respected colleagues were telling me: "Forget it, you'll never do it." Then we were awarded the Leibniz Prize. And now we've received the ERC Grant.' And that's a measure of research success that's hard to argue with.

