Processing hyperspectral or multispectral image data

Robles-Kelly, Antonio and Jordan, J 2019, Processing hyperspectral or multispectral image data, US20170053178A1.


Title Processing hyperspectral or multispectral image data
Creator(s) Robles-Kelly, Antonio
Jordan, J
Date 2019
Patent no. US20170053178A1
Patent owner NICTA
Summary The disclosure concerns processing hyperspectral or multispectral images. Image data comprises a sampled image spectrum represented by first values for each pixel location, each representative of an intensity associated with a wavelength index. A processor determines, for each pixel location, second values based on a measure of similarity between pixel locations with respect to the first values, such that two pixel locations that are similar with respect to the first values are also similar with respect to the second values. The processor then stores, for each pixel location, the determined one or more second values on a data store. This makes the image data suitable for applications such as clustering or display, while pixels that are similar in the input image remain similar in the output data; the structure between the pixels of the input image is thereby preserved.
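The summary above describes mapping each pixel's spectrum (first values) to a smaller set of second values while preserving inter-pixel similarity. The patent does not fix a particular similarity measure or mapping; the sketch below is a hypothetical illustration using Euclidean similarity between spectra and a PCA-style projection, which approximately preserves pairwise distances. The function name `compute_second_values` and the synthetic cube are assumptions for demonstration only.

```python
import numpy as np

def compute_second_values(cube, n_out=3):
    """Map each pixel's sampled spectrum (first values) to n_out second
    values so that pixels with similar spectra remain similar.
    Hypothetical sketch: a PCA projection is used here, which preserves
    Euclidean similarity between spectra in the least-squares sense."""
    h, w, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)  # one row of first values per pixel
    X = X - X.mean(axis=0)                     # center the spectra
    # SVD yields the principal directions; project onto the top n_out of them.
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    second = X @ vt[:n_out].T                  # second values for each pixel
    return second.reshape(h, w, n_out)         # store alongside pixel locations

# Usage: a synthetic 4x4 cube with 8 wavelength indices per pixel.
rng = np.random.default_rng(0)
cube = rng.random((4, 4, 8))
out = compute_second_values(cube)
```

Because the projection is linear, two pixel locations with identical first values always receive identical second values, and nearby spectra map to nearby outputs, matching the structure-preservation property described in the summary.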
Language eng
Indigenous content off
HERDC Research category I.1 Patents

Unless expressly stated otherwise, the copyright for items in DRO is owned by the author, with all rights reserved.

Created: Fri, 28 Jun 2019, 14:59:42 EST
