Fast JPEG 2000 decoder and its use in medical imaging.

IEEE Trans Inf Technol Biomed

Image Computing Systems Laboratory, Department of Bioengineering, University of Washington, Seattle, WA 98195-2500, USA.

Published: September 2003

Over the last decade, picture archiving and communication systems (PACS) have been accepted by an increasing number of clinical organizations. Today, PACS is considered an essential image management and productivity enhancement tool. Image compression could further increase the attractiveness of PACS by reducing the time and cost of image transmission and storage, as long as 1) image quality is not degraded and 2) compression and decompression can be done quickly and inexpensively. JPEG 2000 is a new image compression standard designed to provide improved image quality compared to JPEG, at the expense of increased computation. The decompression time typically has a direct impact on the overall response time from when the radiologist or referring clinician requests images to when they are displayed. In this paper, we present a fast JPEG 2000 decoder running on a low-cost programmable processor. It can decode a losslessly compressed 2048 x 2048 CR image in 1.51 s. With such a decoder, performing JPEG 2000 decompression at the PACS display workstation right before images are displayed becomes viable. A response time of 2 s can be met with an effective transmission throughput between the central short-term archive and the workstation of 4.48 Mb/s for CT studies and 20.2 Mb/s for CR studies. We have found that JPEG 2000 decompression at the workstation is advantageous in that the desired response time can be achieved over slower communication channels than would be needed for transmission of uncompressed images.
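
The response-time figures above can be made concrete with a simple transmit-then-decode model. In the sketch below, the 1.51-s decode time and the 20.2-Mb/s CR link rate are taken from the abstract, while the 16-bit pixel storage and the 2.5:1 lossless compression ratio are illustrative assumptions; any overlap of transmission with decompression is ignored.

```python
# Serial "transmit, then decode at the workstation" model (a sketch).
# Decode time (1.51 s) and CR link rate (20.2 Mb/s) are from the abstract;
# the pixel depth and the 2.5:1 lossless ratio are illustrative assumptions.

def response_time_s(compressed_bits: float, throughput_bps: float,
                    decode_s: float) -> float:
    """Transmission time over the archive-to-workstation link plus the
    JPEG 2000 decode time at the display workstation."""
    return compressed_bits / throughput_bps + decode_s

uncompressed_bits = 2048 * 2048 * 16        # CR image, assumed 16-bit storage
compressed_bits = uncompressed_bits / 2.5   # assumed lossless ratio
decode_s = 1.51                             # decode time reported above
link_bps = 20.2e6                           # CR throughput reported above

print(f"{response_time_s(compressed_bits, link_bps, decode_s):.2f} s")  # ~2.84 s
```

Under these assumptions the serial total is roughly 2.8 s; a higher compression ratio, or overlapping transmission with decompression, moves it toward the 2-s target discussed in the paper.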


DOI: http://dx.doi.org/10.1109/titb.2003.813789

Publication Analysis

Top Keywords

jpeg 2000: 20
response time: 12
fast jpeg: 8
2000 decoder: 8
image compression: 8
image quality: 8
2000 decompression: 8
image: 7
0: 5
time: 5

Similar Publications

State-of-the-Art Trends in Data Compression: COMPROMISE Case Study.

Entropy (Basel)

November 2024

Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, SI-2000 Maribor, Slovenia.

After a boom that coincided with the advent of the internet, digital cameras, digital video and audio storage and playback devices, the research on data compression has rested on its laurels for a quarter of a century. Domain-dependent lossy algorithms of the time, such as JPEG, AVC, MP3 and others, achieved remarkable compression ratios and encoding and decoding speeds with acceptable data quality, which has kept them in common use to this day. However, recent computing paradigms such as cloud computing, edge computing, the Internet of Things (IoT), and digital preservation have gradually posed new challenges, and, as a consequence, development trends in data compression are focusing on concepts that were not previously in the spotlight.


A case study on entropy-aware block-based linear transforms for lossless image compression.

Sci Rep

November 2024

Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000, Maribor, Slovenia.

Data compression algorithms tend to reduce information entropy, which is crucial, especially in the case of images, as they are data intensive. In this regard, lossless image data compression is especially challenging. Many popular lossless compression methods incorporate predictions and various types of pixel transformations, in order to reduce the information entropy of an image.
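
The effect of prediction on entropy is easy to demonstrate. The toy sketch below compares the zeroth-order Shannon entropy of raw pixel values with that of left-neighbor prediction residuals on a synthetic ramp image; it illustrates the general idea only, not the block-based linear transforms studied in the paper.

```python
import numpy as np

def shannon_entropy(values: np.ndarray) -> float:
    """Zeroth-order Shannon entropy in bits per symbol."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Toy 8-bit image: a smooth horizontal ramp with a little noise.
rng = np.random.default_rng(0)
img = (np.tile(np.arange(256, dtype=np.int32), (64, 1))
       + rng.integers(-2, 3, size=(64, 256))).clip(0, 255)

# Left-neighbor prediction: residual = pixel minus previous pixel in the row.
residual = np.diff(img, axis=1)

print(f"raw pixels:        {shannon_entropy(img.ravel()):.2f} bits/symbol")
print(f"prediction errors: {shannon_entropy(residual.ravel()):.2f} bits/symbol")
```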


This paper provides a comprehensive study on features and performance of different ways to incorporate neural networks into lifting-based wavelet-like transforms, within the context of fully scalable and accessible image compression. Specifically, we explore different arrangements of lifting steps, as well as various network architectures for learned lifting operators. Moreover, we examine the impact of the number of learned lifting steps, the number of channels, the number of layers and the support of kernels in each learned lifting operator.
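
For context on the lifting steps mentioned above, the sketch below implements one level of the conventional 1-D integer 5/3 lifting transform, the fixed predict/update pair used for lossless JPEG 2000; in a learned variant those fixed operators are replaced by small networks. This is generic background with periodic boundary handling for brevity, not one of the architectures examined in the paper.

```python
import numpy as np

def lifting_53_forward(x: np.ndarray):
    """One level of the 1-D integer 5/3 lifting transform; len(x) must be even.
    Uses periodic (wraparound) boundaries for brevity."""
    even, odd = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    # Predict: detail = odd sample minus the average of its even neighbors.
    # (A learned lifting scheme would replace this fixed predictor with a network.)
    d = odd - np.floor((even + np.roll(even, -1)) / 2).astype(np.int64)
    # Update: smooth the even samples using the detail coefficients.
    s = even + np.floor((d + np.roll(d, 1) + 2) / 4).astype(np.int64)
    return s, d

def lifting_53_inverse(s: np.ndarray, d: np.ndarray) -> np.ndarray:
    """Undo the lifting steps in reverse order; reconstruction is exact."""
    even = s - np.floor((d + np.roll(d, 1) + 2) / 4).astype(np.int64)
    odd = d + np.floor((even + np.roll(even, -1)) / 2).astype(np.int64)
    x = np.empty(2 * even.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([10, 12, 15, 13, 9, 8, 11, 14])
s, d = lifting_53_forward(x)
assert np.array_equal(lifting_53_inverse(s, d), x)  # lifting is exactly invertible
```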


New higher-count-rate, integrating, large-area X-ray detectors with framing rates as high as 17,400 images per second are beginning to become available. These will soon be used for specialized MX experiments but will require optimal lossy compression algorithms to enable systems to keep up with data throughput. Some information may be lost.
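
A back-of-envelope data-rate estimate shows why lossy compression becomes unavoidable at such framing rates. Only the 17,400 frames-per-second figure comes from the abstract; the sensor geometry and bit depth in the sketch below are hypothetical.

```python
# Raw data rate for a fast framing detector (back-of-envelope).
# Only the frame rate is taken from the abstract; the sensor size and
# bit depth are hypothetical, chosen purely for illustration.
frames_per_s = 17_400
pixels = 2048 * 2048          # assumed 4-Mpixel sensor
bytes_per_pixel = 2           # assumed 16-bit integrating readout

raw_rate_gb_s = frames_per_s * pixels * bytes_per_pixel / 1e9
print(f"uncompressed: {raw_rate_gb_s:.0f} GB/s")      # ~146 GB/s

for ratio in (10, 50, 100):   # even modest lossy ratios change feasibility
    print(f"  at {ratio}:1 -> {raw_rate_gb_s / ratio:.1f} GB/s")
```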

Article Synopsis
  • This study introduces a new method for compressing dense light field images captured by Plenoptic 2.0 cameras, using advanced statistical models like the 5-D Epanechnikov Kernel (a brief 1-D sketch of this kernel follows after the synopsis).
  • To address limitations in traditional modeling techniques, the researchers propose a novel 5-D Epanechnikov Mixture-of-Experts approach that uses Gaussian Initialization, which performs better than existing models like 5-D Gaussian Mixture Regression.
  • Experimental results show that this new compression method produces higher quality rendered images than High Efficiency Video Coding (HEVC) and JPEG 2000, especially at low bit depths below 0.06 bits per pixel (bpp).
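
As background on the kernel named above, the sketch below shows the plain 1-D Epanechnikov kernel and a simple kernel-weighted average built from it; this is background only, not the paper's 5-D mixture-of-experts model.

```python
import numpy as np

def epanechnikov(u: np.ndarray) -> np.ndarray:
    """1-D Epanechnikov kernel: K(u) = 0.75 * (1 - u^2) for |u| <= 1, else 0."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def kernel_smooth(x_query: float, x: np.ndarray, y: np.ndarray,
                  bandwidth: float) -> float:
    """Kernel-weighted (Nadaraya-Watson) average, a simple use of the kernel."""
    w = epanechnikov((x_query - x) / bandwidth)
    return float((w * y).sum() / w.sum())

# Toy data: noisy samples of a smooth curve.
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(1).normal(size=x.size)
print(kernel_smooth(0.25, x, y, bandwidth=0.1))  # close to sin(pi/2) = 1
```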
