THESIS
2000
[16], 96, 7 leaves : ill. ; 30 cm
Abstract
Neuromorphic vision sensors are biologically inspired vision sensors. In these sensors, image acquisition and image processing are integrated into a single parallel system to avoid the computation bottleneck faced by other digital image processing systems, such as those based on charge-coupled devices (CCDs). Among the visual processing tasks implemented with neuromorphic vision sensors, motion perception, or image velocity estimation, is one of the most fundamental. Different approaches to image velocity estimation have been proposed. However, these methods suffer from a lack of robustness (Gradient Method) or a loss of information due to quantization (Correlation Method).
In this research, a more robust indicator of the image, its phase, is used. By applying a spatial Gabor-type filter together with an integrated differentiation and division circuit, the local image velocity is extracted in continuous time. This scheme avoids the robustness problem through the band-pass characteristic of the Gabor-type filter, while its continuous-time signal processing eliminates the loss of information due to quantization.
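The differentiation and division operation described above corresponds to the standard phase-gradient relation for image velocity, v = -(dphi/dt)/(dphi/dx), where phi is the phase of the Gabor-type filter output. The following Python sketch illustrates that relation numerically on a pair of 1-D frames; the kernel width, carrier frequency, sampling steps, and the helper names (gabor_kernel, phase_velocity) are illustrative assumptions, not the circuit described in the thesis.

    # Minimal numerical sketch of phase-based velocity estimation, assuming the
    # phase-gradient relation v = -(dphi/dt)/(dphi/dx). Parameters are illustrative.
    import numpy as np

    def gabor_kernel(x, sigma=2.0, k0=1.0):
        """Complex spatial Gabor-type kernel: Gaussian envelope times a carrier."""
        return np.exp(-x**2 / (2 * sigma**2)) * np.exp(1j * k0 * x)

    def phase_velocity(frame_prev, frame_next, dt, dx=1.0, sigma=2.0, k0=1.0):
        """Estimate local image velocity from the phase of Gabor filter responses."""
        taps = np.arange(-10, 11) * dx
        g = gabor_kernel(taps, sigma, k0)
        # Spatial Gabor filtering of two consecutive 1-D frames.
        r_prev = np.convolve(frame_prev, g, mode="same")
        r_next = np.convolve(frame_next, g, mode="same")
        phi_prev = np.unwrap(np.angle(r_prev))
        phi_next = np.unwrap(np.angle(r_next))
        # Finite differences stand in for the chip's continuous-time differentiation.
        dphi_dt = (phi_next - phi_prev) / dt
        dphi_dx = np.gradient(phi_next, dx)
        # Division stage: v = -phi_t / phi_x, with a small epsilon to avoid
        # dividing by a vanishing spatial phase gradient.
        return -dphi_dt / (dphi_dx + 1e-9)

    # Example: a sinusoidal pattern shifted by 2 pixels between frames sampled
    # dt = 0.01 s apart should yield roughly 200 pixels/s.
    x = np.arange(64)
    frame0 = np.sin(2 * np.pi * x / 16)
    frame1 = np.sin(2 * np.pi * (x - 2) / 16)
    v = phase_velocity(frame0, frame1, dt=0.01)
    print(np.median(v[10:-10]))  # interior estimate, away from border effects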
A one-dimensional version of this velocity sensor has been fabricated in the AMI 1.5 µm n-well process via MOSIS. The chip contains 18 pixels in an area of 1564.8 µm × 1676 µm. Each pixel generates two outputs: the local image velocity and the local motion direction. Under the best biasing conditions, the chip reports a response that is linear in the input image velocity up to 300 pixels/s. Similar responses are obtained when the intensity of the moving stimulus is reduced to 32% of the original intensity. For a slower target image velocity of 100 pixels/s, the intensity can be reduced to as little as 1% of the original. The chip requires differential 2.5 V supplies, and the static power consumption is 23.8 mW.