Texas Tech University

Computer Vision & Image Analysis Laboratory

Projects

Compression & Quantization

Over the years, we have conducted research in the field of compression and quantization, with a particular focus on image compression. The following are highlights of our results:

  • Backward Coding of Wavelet Trees (BCWT)
  • Fast Enhanced LBG (FELBG)
  • Enhanced SPIHT (E-SPIHT)
  • Hybrid Vector Scalar Quantization (HVSQ)
  • Hybrid Multi-scaled Vector Quantization (HMVQ)

Segmentation of Cervical Cancer Lesions

Our current research is on automated classification of pre-cancerous biomarkers in cervical cancer using digital images of the cervix. Cervical cancer is the second most common form of cancer in women worldwide, and the Pap smear, the current standard in cervical cancer screening, is known to have very low sensitivity. However, optical tests such as cervicography and colposcopy, which allow physicians to examine the cervix and obtain an image of it, are gaining prominence. Our work in this area concerns the investigation and implementation of image processing algorithms for automatic segmentation, detection, and classification of pathologically significant markers such as acetowhite change, mosaicism, and punctation.

Superresolution on Digital Image Sequence

This project involves image deblurring and superresolution on digital image sequences, using matched filtering, image registration, point-spread-function estimation, Wiener filtering, and deconvolution techniques. Another aspect is to establish a client-server framework for remote viewing of very high resolution images, following the Internet Imaging Protocol.
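As an illustration of the restoration step, the following is a minimal sketch of Wiener-filter deconvolution, assuming a grayscale float image and an already-estimated point-spread function; the noise-to-signal ratio nsr is a hypothetical tuning constant, not a value from this project.

    import numpy as np

    def wiener_deconvolve(blurred, psf, nsr=0.01):
        # Zero-pad the PSF to the image size and move to the frequency domain.
        H = np.fft.fft2(psf, s=blurred.shape)
        G = np.fft.fft2(blurred)
        # Wiener filter: conj(H) / (|H|^2 + NSR); nsr is an assumed constant.
        W = np.conj(H) / (np.abs(H) ** 2 + nsr)
        return np.real(np.fft.ifft2(G * W))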

Automatic Glaucoma Detection

Early detection of structural damage to the optic nerve head (ONH) is critical in the diagnosis of glaucoma, because such glaucomatous damage precedes clinically identifiable visual loss. Early detection of glaucoma can prevent progression of the disease and the consequent loss of vision. The gold standards of glaucoma detection include the visual field test, intraocular pressure monitoring, and stereo fundus photography. Stereo fundus photography is routinely used to detect subtle changes in the ONH. However, clinical evaluation of stereo fundus photographs suffers from inter- and intra-subject variability. Even sophisticated optical instruments such as the Heidelberg Retina Tomograph (HRT) have not been found to detect glaucoma any earlier than visual field loss.

We have developed a fully automated algorithm for segmentation of optic cup and disc contours from the corresponding stereo disparity information. The system takes only a stereo pair of fundus images as input and automatically generates cup and disc contours, along with a three-dimensional visualization of the optic nerve head in true color. Because this technique does not involve human intervention, it eliminates the subjective variability encountered in currently used clinical methods and provides ophthalmologists with a cost-effective, quantitative method for detecting ONH structural damage, enabling early detection of glaucoma.
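For illustration, the following is a minimal sketch of the stereo disparity step, using OpenCV's off-the-shelf block matcher as a stand-in for our own disparity computation; the file names are hypothetical.

    import cv2
    import numpy as np

    left = cv2.imread("fundus_left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("fundus_right.png", cv2.IMREAD_GRAYSCALE)

    # Block matching yields a disparity map; numDisparities must be a
    # multiple of 16, and blockSize is an odd window size.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0

    # Depth variation across the optic nerve head appears as disparity
    # changes, from which cup and disc contours can then be traced.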

Error Resilient Transmission of Embedded Wavelet Coded Images

We consider the reliable transmission of compressed images across noisy transmission channels and networks prone to packet losses. The problem is viewed from a source coding perspective, where the main constraints are maximum coding efficiency and minimum redundancy. Error-resilient mechanisms are incorporated into wavelet-based image encoding schemes to obtain robust data streams that withstand significant bit errors and packet losses.

The basic design starts with the well-known SPIHT image compression algorithm and makes it resilient to network packet losses by modifying the encoding structure and partitioning the coefficients. In its basic form, the robust encoder requires no additional redundancy and relies on an edge-directed interpolation method to conceal lost data. Coding improvements of 0.5-1.5 dB in SNR over other similar methods have been observed. Wavelet-based image encoding schemes involve quantization of the image wavelet coefficients. Redundant sets of wavelet coefficients generated by an overcomplete discrete wavelet transform are used as side information to improve error resilience at high packet loss rates. Unlike other error protection schemes, the use of redundant wavelet sets not only improves the robustness of the encoding scheme but also improves image quality.
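As a sketch of the redundant-coefficient idea, the overcomplete (stationary) wavelet transform below keeps every subband at full image size, unlike the critically sampled DWT. This example assumes the PyWavelets package and uses a random array as a stand-in image; it illustrates the transform only, not our encoder.

    import numpy as np
    import pywt

    image = np.random.rand(256, 256)   # stand-in for a grayscale image

    # The stationary wavelet transform is overcomplete: no downsampling,
    # so each level produces redundant, full-size coefficient sets that
    # can serve as side information for concealing lost packets.
    coeffs = pywt.swt2(image, wavelet="bior4.4", level=2)

    for level, (approx, (horiz, vert, diag)) in enumerate(coeffs):
        print(level, approx.shape, horiz.shape)   # all subbands stay 256x256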

The error-resilient encoding scheme is extended to encompass a wide range of images, including color images. Efficient implementations are also developed to account for the different situations arising from various types of errors and losses in transmission channels. The error-resilient mechanisms that have been implemented require minimal redundancy and maintain high source coding efficiency at the selected image quality. The implementations are flexible and can easily be adapted to any type of transmission channel with different network parameters.

Blind Equalization of an HF Channel

Wireless channels often distort a signal beyond the point of reliable demodulation. To counter this problem, receivers incorporate an equalizer whose objective is to remove the distortion introduced by the transmission medium. Typically, the equalizer requires knowledge of the transmitted signal to adapt adequately, which introduces overhead into the signal and effectively reduces the data throughput. Blind equalizers do not require this prior knowledge, allowing the removal of training information that the receiver already knows a priori.

Most research on blind equalizers uses simulated signals and test channels, and little to no attention is paid to real-world transmissions when analyzing the algorithms. While modeling is a crucial step in the development of an efficient algorithm, it is not the only step: accurate models accounting for all the disturbances and noise of the real world are too complex to be efficient. The high frequency (HF) channel is one such case; it is particularly difficult to equalize blindly because of its noisy environment and long path delays. This work investigates the use of blind equalizers for HF signals and focuses much of its analysis on real-world HF transmissions.
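For reference, the following is a minimal sketch of one classic blind-equalization update, the constant modulus algorithm (CMA), which adapts an FIR equalizer without any training sequence; it is a generic illustration, not the channel-specific method developed in this work.

    import numpy as np

    def cma_equalize(received, num_taps=11, mu=1e-3, radius=1.0):
        w = np.zeros(num_taps, dtype=complex)
        w[num_taps // 2] = 1.0                  # center-spike initialization
        out = np.zeros(len(received), dtype=complex)
        for n in range(num_taps, len(received)):
            x = received[n - num_taps:n][::-1]  # most recent sample first
            y = np.dot(w, x)
            out[n] = y
            # CMA error: penalize deviation of |y|^2 from the target modulus.
            e = y * (np.abs(y) ** 2 - radius)
            w -= mu * e * np.conj(x)
        return out, w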

PCI RIO IF Transceiver Configured with the LabVIEW FPGA Module

The RIO IF transceiver combines dedicated digital transmit and receive path signal processing hardware with reconfigurable I/O technology, an internal PLL, and two input and two output channels. The behavior of the device can be customized by programming the FPGA in a high-level graphical programming language, LabVIEW. We intend to pursue vigorous research into the inner workings of this transceiver board, and particularly its applications in the field of communications.

A Time Division−Multiplexing System for Multiple 1394 Buses

We are developing a time division-multiplexing system for multiple 1394 buses that interfaces with a two-way fiber optic connection. The system is backward compatible with all previous versions of the standard by using a 1394b-compliant PHY to convert earlier signaling to 1394b signaling. Digital filtering is used to detect the tone signaling specified by the IEEE 1394b-2002 standard, and external clock and data recovery is required to extract the data before it is multiplexed.
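As an illustration of the tone-detection step, the following is a generic sketch of single-frequency detection with the Goertzel algorithm (a one-bin DFT); the actual tone frequencies and sampling parameters are defined by the IEEE 1394b-2002 standard and are not reproduced here.

    import math

    def goertzel_power(samples, sample_rate, tone_freq):
        # Evaluate the DFT bin closest to tone_freq over one sample block.
        k = round(len(samples) * tone_freq / sample_rate)
        omega = 2.0 * math.pi * k / len(samples)
        coeff = 2.0 * math.cos(omega)
        s_prev = s_prev2 = 0.0
        for x in samples:
            s = x + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        # Tone power at the selected bin; threshold this to detect a tone.
        return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2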