Commit Graph

17 Commits

Author SHA1 Message Date
Allen Byrne
e5759186e8 [svn-r24746] Merge trunk revision 24744 from cmake branch; includes
HDFFV-8505: UD filter changes to remove filters in h5repack.

Tested: local linux
2014-02-27 14:16:04 -05:00
Allen Byrne
ea30f2156b [svn-r24680] Remove acknowledgment file from install.
Remove obsolete CPack.cmake file.
Merge h5repack and h5mkgrp test folder changes from trunk.

Tested: local linux
2014-02-03 16:59:26 -05:00
Allen Byrne
deddc7f955 [svn-r24205] Merge HDFFV-8513/8522 from trunk (via cmake branch), h5repack UD plugins.
Also warning session fixes.

Tested: CMake local linux
2013-09-26 17:10:54 -05:00
Allen Byrne
ad7d03103d [svn-r24108] Remove file from the future. 2013-09-06 12:57:57 -05:00
Allen Byrne
35b22a13a8 [svn-r24106] Add help text tests for h5dump and h5repack to the linux scripts. HDFFV-8498 merge from trunk.
Tested: local linux - cmake and autotools
2013-09-06 12:28:26 -05:00
Allen Byrne
8194f34ef7 [svn-r24072] Merge trunk cmake changes to 1.8 branch.
Also add default switch blocks to h5import.
Merge h5dump any_path option from trunk.

Tested: local linux
2013-08-26 10:35:15 -05:00
Jonathan Kim
06e9ab4420 [svn-r22814] Purpose:
HDFFV-8012 - h5repack changes max dims and causes a failure if only "-f none" is used without changing the layout for a chunked dataset when a chunk dim is bigger than a dataset dim

Description:
  "h5repack -f <obj>:NONE <file.h5> out.h5" command failed if source file contains chunked dataset and a chunk dim is bigger than a dataset dim. 
    Another issue is that the command changed max dims if chunk dim is smaller than the dataset dim. 
    These issue occurred when dataset size is smaller than 64k (compact size limit)
    Fixed them.
  Merged from HDF5 trunk r22805

Tested: 
    jam (linux32-LE), koala (linux64-LE), ostrich (linuxppc64-BE), tejeda (mac32-LE), linew (solaris-BE),  Windows (32-LE cmake), cmake (jam)
2012-09-26 13:10:01 -05:00
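A minimal sketch in C of the clamp this fix implies, assuming the HDF5 C API; the helper name and structure are illustrative, not the actual h5repack internals:

    #include "hdf5.h"

    /* Hypothetical helper: a chunk dim may not exceed the corresponding
     * dim of a fixed-size dataset, so clamp the requested chunk shape
     * before creating the output dataset. */
    static void clamp_chunk_dims(hsize_t *chunk_dims, const hsize_t *dset_dims, int rank)
    {
        for (int i = 0; i < rank; i++)
            if (chunk_dims[i] > dset_dims[i])
                chunk_dims[i] = dset_dims[i];
    }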
Jonathan Kim
f6fded4394 [svn-r22281] Purpose:
Fix for HDFFV-7993 - h5repack fails with error "chunk size must be <= maximum dimension size for fixed-sized dimensions"

Description:
  Fixed a failure when changing the chunk size of a specified chunked dataset with unlimited max dims.
  Also handled converting such a dataset to contiguous and compact layouts.
  Test cases were added and tagged with jira#.
  Merged from HDF5 trunk r22277.

Tested:
  jam (linux32-LE), koala (linux64-LE), ostrich (linuxppc64-BE), tejeda (mac32-LE), Windows (32-LE cmake), Cmake (jam)
2012-04-11 17:31:31 -05:00
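For illustration only, a sketch under the HDF5 C API of why the unlimited-max-dims case is legal once fixed; all names here are hypothetical:

    #include "hdf5.h"

    static void rechunk_example(void)
    {
        hsize_t dims[1]    = {100};
        hsize_t maxdims[1] = {H5S_UNLIMITED};   /* unlimited max dims */
        hid_t   space = H5Screate_simple(1, dims, maxdims);
        hid_t   dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        hsize_t chunk[1] = {256};      /* e.g. requested with -l CHUNK=256 */
        H5Pset_chunk(dcpl, 1, chunk);  /* legal: no fixed bound to violate,
                                          even though 256 > dims[0] */
        H5Pclose(dcpl);
        H5Sclose(space);
    }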
Jonathan Kim
3f1489ee7b [svn-r21400] Purpose:
HDFFV-5932 - h5repack breaks files with dimension scales

Description:
    - Fixed h5repack to update the values of references (object and region)
      in attributes, covering 1) plain references, 2) ARRAY of references,
      3) VLEN of references, and 4) COMPOUND of references.
    - Merged from HDF5 trunk 21393, 21382, 21386, 21389. (with support from Peter)

Tested:
 jam (linux32-LE), koala (linux64-LE), heiwa (linuxppc64-BE), tejeda (mac32-LE)
2011-09-20 11:34:01 -05:00
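A minimal sketch, assuming the HDF5 C API, of the recursive type inspection such a fix requires; the helper is illustrative, not the actual h5repack code:

    #include "hdf5.h"

    /* Does a datatype contain references anywhere: directly, or nested
     * inside an ARRAY, VLEN, or COMPOUND type? */
    static int contains_refs(hid_t t)
    {
        switch (H5Tget_class(t)) {
        case H5T_REFERENCE:
            return 1;
        case H5T_ARRAY:
        case H5T_VLEN: {
            hid_t super = H5Tget_super(t);   /* element/base type */
            int   found = contains_refs(super);
            H5Tclose(super);
            return found;
        }
        case H5T_COMPOUND: {
            int n = H5Tget_nmembers(t);
            for (int i = 0; i < n; i++) {
                hid_t m     = H5Tget_member_type(t, (unsigned)i);
                int   found = contains_refs(m);
                H5Tclose(m);
                if (found)
                    return 1;
            }
            return 0;
        }
        default:
            return 0;
        }
    }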
Jonathan Kim
0069b4b190 [svn-r21151] Description:
Merged from HDF5 trunk r21105

  Fixed two bugs:
    - h5repack: h5repack failed to copy a dataset if the layout is changed from
      chunked with unlimited dims to contiguous. (PC -- 2011/07/15)
    - h5diff: the "--delta" option considers two NaNs of the same type to be
      different, which is wrong based on http://www.hdfgroup.org/HDF5/doc/RM/Tools.html#Tools-Diff. (PC -- 2011/07/15)

Tested:
  jam (linux32-LE), koala (linux64-LE), heiwa (linuxppc64-BE), tejeda (mac32-LE), linew (solaris-BE)
2011-07-26 10:20:51 -05:00
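A minimal sketch of the NaN handling the h5diff fix implies; the helper is illustrative, not the actual h5diff code:

    #include <math.h>

    /* Under --delta, two NaNs of the same type should compare as equal. */
    static int values_differ(double a, double b, double delta)
    {
        if (isnan(a) && isnan(b))
            return 0;                 /* both NaN: not a difference */
        return fabs(a - b) > delta;
    }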
Jonathan Kim
f4af852764 [svn-r19391] Purpose:
Fix for Bug1896 h5repack - changing layout to COMPACT does not work

Description:
Merged from hdf5 trunk (r19389)
Make h5repack able to convert a layout to COMPACT for small datasets by default.  Also add verification of layout changes to our test script.

Tested:
 jam, amani
2010-09-15 14:00:44 -05:00
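A hedged sketch of the default described above, assuming the HDF5 C API and the 64k compact size limit mentioned in the HDFFV-8012 entry; the helper is illustrative:

    #include "hdf5.h"

    /* Small datasets fit in the object header, so COMPACT is a sensible
     * default; anything over the compact limit keeps another layout. */
    static void choose_layout(hid_t dcpl, hsize_t nelmts, size_t type_size)
    {
        if (nelmts * type_size <= 64 * 1024)   /* 64k compact size limit */
            H5Pset_layout(dcpl, H5D_COMPACT);
        else
            H5Pset_layout(dcpl, H5D_CONTIGUOUS);
    }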
Jonathan Kim
25fb348729 [svn-r18455] Purpose:
Fix for bug1726 - NPOESS: h5repack loses attributes for datasets of
    type H5T_REFERENCE.

Description:
    Merged from hdf5 trunk rXXX

Tested:
    jam
2010-03-25 12:46:21 -05:00
Jonathan Kim
8438c335ed [svn-r18427] Purpose:
Fix for bug1814 - NPOESS: h5repack doesn't handle references to groups
    as elements of a dataset

Description:
    Merged from hdf5 trunk r18425.
    Also handles object references to named datatypes.
    Added test cases.

Tested:
    jam
2010-03-18 17:31:00 -05:00
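A minimal sketch of the case this fix covers, using the deprecated 1.8-era reference API; the path and helper are hypothetical:

    #include "hdf5.h"

    /* An object reference stored as a dataset element may point at a
     * group or a named datatype, not only at another dataset. */
    static void inspect_ref(hid_t file_id)
    {
        hobj_ref_t ref;
        H5Rcreate(&ref, file_id, "/some_group", H5R_OBJECT, -1);
        H5G_obj_t kind = H5Rget_obj_type1(file_id, H5R_OBJECT, &ref);
        /* kind is H5G_GROUP here; H5G_TYPE for a named datatype */
        (void)kind;
    }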
Neil Fortner
4446e5da71 [svn-r16802] Purpose: Fix bug 1516
Description:
h5repack previously would not take named datatypes into consideration when copying
datasets and attributes.  This would cause extra anonymous datatypes in the target file
at best, and cause errors halfway through the repacking at worst.  h5repack should now
always handle named datatypes correctly.  Named datatypes are also now converted to the
native type when -n is given.

Tested: jam, linew, smirom (h5committest)
2009-04-20 11:48:14 -05:00
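A hedged sketch, assuming the HDF5 C API, of the committed-datatype check this fix implies; the surrounding logic is illustrative:

    #include "hdf5.h"

    /* Detect a named (committed) datatype so the copy can recreate it as
     * a committed type in the output file instead of an anonymous one. */
    static void copy_dataset_type(hid_t dset_id)
    {
        hid_t dtype = H5Dget_type(dset_id);
        if (H5Tcommitted(dtype) > 0) {
            /* reuse or recreate the committed type in the destination */
        } else {
            /* transient type: copy directly */
        }
        H5Tclose(dtype);
    }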
Pedro Vicente Nunes
22d6e96014 [svn-r16641] merge from trunk revs 16614, 16629
1.	#1501  (B1) tools bug if dataset is larger than H5TOOLS_BUFSIZE limit. 
ISSUE : the tools use the following formula to read by hyperslabs: hyperslab_size[i] = MIN( dim_size[i], H5TOOLS_BUFSIZE / datum_size ), where H5TOOLS_BUFSIZE is a constant defined as 1024K. This is OK as long as the datum_size does not exceed 1024K; otherwise we have a hyperslab size of 0 (since 1024K / (greater than 1024K) = 0). This affects h5dump, h5repack, and h5diff.
SOLUTION: add a check for a 0 size and define as 1 if so. 
TEST FOR H5DUMP: Defined a case in the h5dump test generator program of such a type (an array type of doubles with a large array dimension, which was the case the user reported). Since the written file committed in svn would be around 1024K, opted for not writing the data (the part of the code where the hyperslab is defined is executed, since h5dump always reads the files). Defined a macro WRITE_ARRAY to enable such writing if needed. Added a run on the h5dump shell script. Added 2 new files to svn: tools/testfiles/tarray8.ddl, tools/testfiles/tarray8.h5. NOTE: while doing this I thought of adding this dataset case to an existing file, but that would add the large array output to those files (the ddls). The issue is that the file list is increasing.
TEST FOR H5DIFF: for h5diff the check for reading by hyperslabs is H5TOOLS_MALLOCSIZE (128 * H5TOOLS_BUFSIZE), or 128 MB. This makes it impossible to add such a file to svn, so used the same method as h5dump (only write the dataset if WRITE_ARRAY is defined). As opposed to h5dump, the hyperslab code is NOT executed when the dataset is empty (the dataset is not read). Added the new dataset to existing files and the shell run (tools/h5diff/testfiles/h5diff_dset1.h5 and tools/h5diff/testfiles/h5diff_dset2.h5, with output in tools/h5diff/testfiles/h5diff_80.txt).
TEST FOR H5REPACK: similar issue as h5diff with the difference that the hyperslab code is run. Added a run to the shell script (with a filter, otherwise the code uses H5Ocopy). 
FURTHER ISSUES: the type in question ("double") has different output across platforms (e.g. on liberty a garbage number is printed at some array locations).
SOLUTION: defined an "int" type for this test. However, the printing of such an array has bogus output on at least one platform (FreeBSD), so eliminated the test run altogether and filed a bug report on this.
2009-04-01 10:25:43 -05:00
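A minimal sketch in C of the formula and the zero-size guard described above; H5TOOLS_BUFSIZE mirrors the 1024K constant named in the entry, and the helper itself is illustrative:

    #include "hdf5.h"

    #define H5TOOLS_BUFSIZE (1024 * 1024)   /* the tools' 1024K read buffer */

    static hsize_t hyperslab_size(hsize_t dim_size, size_t datum_size)
    {
        hsize_t n = H5TOOLS_BUFSIZE / datum_size;  /* elements per buffer */
        if (n == 0)
            n = 1;   /* datum larger than the buffer: read one element at a time */
        return (dim_size < n) ? dim_size : n;      /* MIN(dim_size, n) */
    }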
Quincey Koziol
fc61c3a81a [svn-r16402] Description:
Bring r16401 back from trunk:

	Correct an error introduced in r16353 with the layout version, and add a test so it gets caught earlier.

Tested on:
	FreeBSD/32 6.3 (duty)
	Too minor to require h5committest
2009-02-02 20:43:58 -05:00
Pedro Vicente Nunes
bb206d572c [svn-r15750] move h5repack test files to /tools/h5repack/testfiles
tested: linux
2008-10-01 15:04:33 -05:00