Wavelet image coding with context-based zerotree quantization framework

Citation
K. Yang et al., Wavelet image coding with context-based zerotree quantization framework, IEICE T INF, E83D(2), 2000, pp. 211-222
Citation count
11
Subject Categories
Information Technology & Communication Systems
Journal title
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS
ISSN journal
0916-8532
Volume
E83D
Issue
2
Year of publication
2000
Pages
211 - 222
Database
ISI
SICI code
0916-8532(200002)E83D:2<211:WICWCZ>2.0.ZU;2-O
Abstract
We introduce a new wavelet image coding framework using context-based zerotree quantization, in which a unique and efficient method for optimizing zerotree quantization is proposed. Because of the localization properties of wavelets, when a wavelet coefficient is to be quantized, the best quantizer is one designed to match the statistics of the wavelet coefficients in its neighborhood; that is, the quantizer should be adaptive in both the space and frequency domains. Previous image coders tended to design quantizers at the band or class level, which limited their performance because the localization properties of wavelets are difficult to exploit at that granularity. In contrast, we propose to track these localization properties by combining tree-structured wavelet representations with adaptive models that vary spatially according to the local statistics. In this paper, we describe the proposed coding algorithm, in which the spatially varying models are estimated from the quantized causal neighborhoods and the zerotree pruning is based on a Lagrangian cost that can be evaluated from the statistics near the tree. In this way, the optimization of zerotree quantization is no longer a joint optimization problem, as it is in SFQ. Simulation results demonstrate that the coding performance is competitive with, and sometimes superior to, the best zerotree-based coding results reported for SFQ.
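The pruning idea in the abstract can be sketched as a bottom-up decision per tree node: keep coding the subtree, or declare it a zerotree, whichever has the smaller Lagrangian cost D + λR. The sketch below is a hypothetical simplification, not the paper's algorithm: the `Node` class, the fixed `rate_per_coeff` and `zerotree_rate` estimates, and the assumption that coded coefficients incur negligible distortion are all illustrative choices; the paper instead derives rates and distortions from context models estimated on quantized causal neighborhoods.

```python
# Hedged sketch of Lagrangian zerotree pruning (hypothetical rate model).
# Each node holds one wavelet coefficient; zeroing a subtree costs its
# energy (squared coefficients) in distortion but almost no rate.

class Node:
    def __init__(self, coeff, children=None):
        self.coeff = coeff
        self.children = children or []

def lagrangian_prune(node, lam, rate_per_coeff=8.0, zerotree_rate=1.0):
    """Return (pruned, cost, subtree_energy), deciding bottom-up whether
    declaring this subtree a zerotree beats coding it coefficient by
    coefficient (children already priced at their own optimal cost)."""
    energy = node.coeff ** 2          # distortion if this subtree is zeroed
    child_cost = 0.0
    for c in node.children:
        _, cc, ce = lagrangian_prune(c, lam, rate_per_coeff, zerotree_rate)
        child_cost += cc
        energy += ce
    # Coding this node: assumed ~zero distortion, fixed rate, plus the
    # children's already-optimized Lagrangian costs.
    code_cost = lam * rate_per_coeff + child_cost
    # Zerotree: full subtree energy as distortion, one cheap symbol as rate.
    zero_cost = energy + lam * zerotree_rate
    if zero_cost <= code_cost:
        return True, zero_cost, energy
    return False, code_cost, energy
```

Because each node's decision uses only its own subtree (and, in the paper, statistics from the causal neighborhood), the pruning is a single bottom-up pass rather than the joint optimization required in SFQ.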