The tarball contains:

ncdump - netCDF-4 executable, 32-bit Linux; displays the contents of
netCDF files (and some HDF5 files); use

    ./ncdump -h filename

Original NPOESS files we got from the GRAVITE system at NOAA. They look
like some MODIS data files converted to the NPOESS format:

    SVM07_ter_d20101206_t2009584_e2011083_b0000-1_c20101206231443705497_grav_dev.h5
    GMODO_ter_d20101206_t2009584_e2011083_b00000_c20101206225316640547_grav_dev.h5

NPOESS XML file we got from Richard:

    D34862-03_NPOESS-CDFCB-X-Vol-III_D_VIIRS-M7-SDR-PP.xml

Augmented HDF5 files and the corresponding outputs of the
"ncdump -h filename" commands:

    SVM07_ter_augmented-without-XML.h5     augmented-without-XML.out
    SVM07_ter_augmented-with-ds.h5         augmented-with-ds.out
    SVM07_ter_augmented-with-ds-attr.h5    augmented-with-ds-attr.out

Here is a brief description of the augmentation process:

1) Try

       ncdump -h SVM07_ter_d20101206_t2009584_e2011083_b0000-1_c20101206231443705497_grav_dev.h5

   and you will get an error:

       ./ncdump: SVM07_ter_d20101206_t2009584_e2011083_b0000-1_c20101206231443705497_grav_dev.h5: NetCDF: Bad type ID

2) Open the file with HDFView, delete the /Data_Products group, and save
   it as SVM07_ter_augmented-without-XML.h5.

3) Try

       ncdump -h SVM07_ter_augmented-without-XML.h5

   and you should see the output saved in the augmented-without-XML.out
   file. Since the HDF5 file doesn't have information, for example, about
   the dimensions of the Radiance dataset, ncdump displays Radiance as:

       group: VIIRS-M7-SDR_All {
         dimensions:
           ....
           phony_dim_3 = 768 ;
           phony_dim_4 = 3200 ;
           ....
         variables:
           ...
           float Radiance(phony_dim_3, phony_dim_4) ;
           ...

4) We wrote a program that adds the dimension information found in the
   XML file to the HDF5 file (see the result in the file
   SVM07_ter_augmented-with-ds.h5). Here is an excerpt from the XML and
   the corresponding ncdump view of the augmented data:

   XML:

       Radiance
       AlongTrack
       ...
       CrossTrack
       ...
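As a rough sketch of how a program might pull those dimension names out of the XML: the real NPOESS schema is elided in the excerpt above, so the element nesting and tag names below (`Field`, `Dimension`, `Name`) are hypothetical stand-ins, and only Python's standard library is used.

```python
# Hedged sketch: the real NPOESS XML layout is elided in the excerpt
# above, so this assumes a hypothetical <Field>/<Dimension>/<Name>
# nesting with the same dimension names as in the real file.
import xml.etree.ElementTree as ET

SAMPLE = """\
<Field>
  <Name>Radiance</Name>
  <Dimension><Name>AlongTrack</Name></Dimension>
  <Dimension><Name>CrossTrack</Name></Dimension>
</Field>
"""

def dimension_names(field_xml):
    """Return the ordered dimension names declared for a dataset field."""
    root = ET.fromstring(field_xml)
    return [d.findtext("Name") for d in root.iter("Dimension")]

print(dimension_names(SAMPLE))   # ['AlongTrack', 'CrossTrack']
```

The augmentation program would then attach these names to the dataset's dimensions (e.g., as HDF5 dimension scales), which is what lets ncdump replace phony_dim_3/phony_dim_4 with AlongTrack/CrossTrack.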
   ncdump output (compare with the ncdump output in 3):

       group: VIIRS-M7-SDR_All {
         dimensions:
           AlongTrack = 768 ;
           CrossTrack = 3200 ;
           ...
         variables:
           ...
           float Radiance(AlongTrack, CrossTrack) ;
           ...

5) The XML file contains information about the raw data of the Radiance
   dataset, as shown in the following excerpt from the XML file:

       Radiance
       .....
       Calibrated Top of Atmosphere (TOA) Radiance for each VIIRS pixel
       0
       1
       RadianceFactors
       W/(m^2 μm sr)
       unsigned 16-bit integer
       NA_UINT16_FILL
       65535
       ....

   Our program creates attributes on the Radiance dataset, using the XML
   node's name for the HDF5 attribute name and the XML node's value for
   the attribute value; for example, the augmented HDF5 file will have
   attributes on the Radiance dataset such as Description, DatumOffset,
   and FillValue. Not all nodes are mapped to attributes; for example,
   the XML node DataType is skipped, since HDF5 already has this
   information as part of the "Radiance" dataset definition.

   If we now use ncdump on the augmented file, we see the output saved
   in the augmented-with-ds-attr.out file. Here is the corresponding
   excerpt from the ncdump output:

       float Radiance(AlongTrack, CrossTrack) ;
         string Radiance:Description = "Calibrated Top of Atmosphere (TOA) Radiance for each VIIRS pixel" ;
         string Radiance:DIMENSION_LABELS = "AlongTrack", "CrossTrack" ;
         Radiance:DatumOffset = 0 ;
         Radiance:Scaled = 1 ;
         string Radiance:MeasurementUnits = "W/(m^2 μm sr)" ;

6) One step that we didn't finish in our example program was bringing
   the Height, Longitude, and Latitude datasets from the GEO*.h5 file
   into the augmented file.

I would be glad to walk you through those steps/examples if you have any
questions. I am working on a formal design document and hope to share it
as soon as I have it.

Our major problem now is that netCDF-4 cannot open an HDF5 file that
contains datasets with region or object references. That is why we need
to delete the /Data_Products group.
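The node-to-attribute mapping rule in step 5 can be sketched as follows. Again, the real schema is elided, so the `<Datum>` wrapper and tag names below are hypothetical, and the final HDF5 write is only indicated in a comment; everything shown runs with the standard library alone.

```python
# Hedged sketch of the step-5 mapping: each child node of a (hypothetical)
# XML <Datum> element becomes an HDF5 attribute on the Radiance dataset,
# except nodes such as DataType, whose information HDF5 already carries
# in the dataset definition.
import xml.etree.ElementTree as ET

SKIP = {"DataType"}   # redundant with the HDF5 dataset definition

SAMPLE = """\
<Datum>
  <Description>Calibrated Top of Atmosphere (TOA) Radiance for each VIIRS pixel</Description>
  <DatumOffset>0</DatumOffset>
  <Scaled>1</Scaled>
  <MeasurementUnits>W/(m^2 um sr)</MeasurementUnits>
  <DataType>unsigned 16-bit integer</DataType>
</Datum>
"""

def xml_to_attributes(datum_xml):
    """Map XML child nodes to an attribute-name -> value dict."""
    return {child.tag: child.text
            for child in ET.fromstring(datum_xml)
            if child.tag not in SKIP}

attrs = xml_to_attributes(SAMPLE)
# Writing these into the augmented file would then be a loop over attrs,
# e.g. with h5py: for k, v in attrs.items(): dset.attrs[k] = v
print(sorted(attrs))   # ['DatumOffset', 'Description', 'MeasurementUnits', 'Scaled']
```

Note that DataType is filtered out, matching the behavior described in step 5; a fuller version would also convert numeric values such as DatumOffset to the appropriate HDF5 attribute type rather than storing them as strings.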
There are also some weird failures in ncdump (for example, I had to
delete all the data under the /Data_Products group in order to use
ncdump on the augmented file, not just the datasets with references).

Thank you!
Elena