
Typical usage and code examples of the Java HDF5Constants class


This article collects typical usage examples of ncsa.hdf.hdf5lib.HDF5Constants in Java. If you are wondering how the Java HDF5Constants class is used in practice, how to call it, or what real code using it looks like, the hand-picked class examples below should help.

The HDF5Constants class belongs to the ncsa.hdf.hdf5lib package. A total of 25 code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
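
Before diving into the examples, here is a minimal, self-contained sketch of the typical pattern in which HDF5Constants is used with the ncsa.hdf.hdf5lib bindings: constant values such as H5F_ACC_RDONLY, H5P_DEFAULT, H5S_ALL and H5T_NATIVE_DOUBLE are passed to the low-level H5 calls. This sketch is not taken from any of the projects below; the file name and dataset path are placeholders.

import ncsa.hdf.hdf5lib.H5;
import ncsa.hdf.hdf5lib.HDF5Constants;
import ncsa.hdf.hdf5lib.exceptions.HDF5Exception;

public class Hdf5ReadSketch {
    public static void main(String[] args) throws HDF5Exception {
        // Open the file read-only with the default file-access property list.
        int fileId = H5.H5Fopen("example.h5", HDF5Constants.H5F_ACC_RDONLY, HDF5Constants.H5P_DEFAULT);
        int dataSetId = H5.H5Dopen(fileId, "/my/dataset", HDF5Constants.H5P_DEFAULT);
        try {
            // Query the dataspace for the number of elements in the (1-D) dataset.
            int spaceId = H5.H5Dget_space(dataSetId);
            long[] dims = new long[1];
            long[] maxDims = new long[1];
            H5.H5Sget_simple_extent_dims(spaceId, dims, maxDims);
            H5.H5Sclose(spaceId);

            // Read everything as doubles: H5S_ALL selects the whole dataspace,
            // H5P_DEFAULT uses the default transfer property list.
            double[] values = new double[(int) dims[0]];
            H5.H5Dread_double(dataSetId, HDF5Constants.H5T_NATIVE_DOUBLE,
                    HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL,
                    HDF5Constants.H5P_DEFAULT, values);
        } finally {
            H5.H5Dclose(dataSetId);
            H5.H5Fclose(fileId);
        }
    }
}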

Example 1: readExchangeSymbolQuotes

Upvotes: 3

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
private void readExchangeSymbolQuotes(String exchange, String symbol) throws EodDataSinkException {

  openQuoteDataset(exchange, symbol);

  try {
    // Query the dataset's dataspace for its current extent.
    int fileDataspaceHandle = H5.H5Dget_space(quoteDatasetHandle);
    long dimensions[] = new long[1];
    long maxDimensions[] = new long[1];
    @SuppressWarnings("unused")
    int status = H5.H5Sget_simple_extent_dims(fileDataspaceHandle, dimensions, maxDimensions);

    // Read the whole dataset into a raw byte buffer sized from the quote datatype.
    final byte[] readBuffer = new byte[Hdf5QuoteDatatype.QUOTE_DATATYPE_SIZE * (int) dimensions[0]];

    H5.H5Dread(quoteDatasetHandle,
               HDF5Constants.H5T_NATIVE_INT,
               HDF5Constants.H5S_ALL,
               HDF5Constants.H5S_ALL,
               HDF5Constants.H5P_DEFAULT,
               readBuffer);
  }
  catch (HDF5Exception e) {
    throw new EodDataSinkException();
  }
}
 

Developer: jsr38 | Project: ds3 | Lines: 27 | Source: HdfObjectEodDataSink.java

Example 2: readStringArray

Upvotes: 3

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Retrieves a string array given its full-path within the HDF5 file.
 * @param fullPath the target data-set name.
 * @throws HDF5LibException if the operation failed, for example if there is no such data-set
 * @return never {@code null}.
 */
public String[] readStringArray(final String fullPath) {
    return readDataset(fullPath, (dataSetId, typeId, dimensions) -> {
        if (dimensions.length != 1) {
            throw new HDF5LibException(
                    String.format("expected 1-D array for data-set '%s' in '%s' but it is %d-D", fullPath, file, dimensions.length));
        }
        final boolean isVariableString = H5.H5Tis_variable_str(typeId);

        final String[] result = new String[(int) dimensions[0]];

        final int code;
        if (isVariableString) {
            code = H5.H5DreadVL(dataSetId, typeId, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT, result);
        }
        else {
            code = H5.H5Dread_string(dataSetId, typeId, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT, result);
        }
        if (code < 0) {
            throw new HDF5LibException(String.format("getting strings from data-set '%s' in file '%s' resulted in code: %d", fullPath, file, code));
        }
        return result;
    });
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 30 | Source: HDF5File.java

Example 3: readDouble

Upvotes: 3

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Reads a double value from a particular position in the underlying HDF5 file.
 *
 * @param fullPath the path to the double value in the HDF5 file.
 * @return the stored value.
 * @throws IllegalArgumentException if {@code fullPath} is {@code null}.
 * @throws HDF5LibException if {@code fullPath} does not exist, contains the wrong data type (non-double) or
 *    is an array with more than one value, or is multidimensional.
 */
public double readDouble(final String fullPath) {
    return readDataset(fullPath, (dataSetId, typeId, dimensions) -> {
        if (dimensions.length != 1) {
            throw new HDF5LibException(
                    String.format("expected 1-D array for data-set '%s' in '%s' but it is %d-D", fullPath, file, dimensions.length));
        }
        if (dimensions[0] != 1) {
            throw new HDF5LibException(
                    String.format("expected single value array for data-set '%s' in '%s' but it has %d values", fullPath, file, dimensions[0]));
        }
        final double[] values = new double[1];
        final int code = H5.H5Dread_double(dataSetId, typeId, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT, values);
        if (code < 0) {
            throw new HDF5LibException(String.format("getting a double from data-set '%s' in file '%s' resulted in code: %d", fullPath, file, code));
        }
        return values[0];
    });
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 28 | Source: HDF5File.java

Example 4: readDoubleMatrix

Upvotes: 3

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Reads a double matrix from a particular position in the underlying HDF5 file.
 *
 * @param fullPath the path.
 * @return never {@code null}; non-existing data or the wrong type in the HDF5
 * file will result in a {@link HDF5LibException} instead.
 * @throws IllegalArgumentException if {@code fullPath} is {@code null}.
 * @throws HDF5LibException if {@code fullPath} does not exist, contains the wrong data type (non-double) or
 *    its dimension is not 2.
 */
public double[][] readDoubleMatrix(final String fullPath) {
    return readDataset(fullPath, (dataSetId, typeId, dimensions) -> {
        if (dimensions.length != 2) {
            throw new HDF5LibException(
                    String.format("expected 2D double matrix for data-set '%s' in '%s' but it is %d-D", fullPath, file, dimensions.length));
        }
        final int rows = (int) dimensions[0];
        final int columns = (int) dimensions[1];
        final int size = rows * columns;
        final double[] values = new double[size];
        final int code = H5.H5Dread_double(dataSetId, typeId, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT, values);
        if (code < 0) {
            throw new HDF5LibException(String.format("getting double matrix from data-set '%s' in file '%s' resulted in code: %d", fullPath, file, code));
        }
        final double[][] result = new double[rows][columns];
        for (int i = 0; i < result.length; i++) {
            System.arraycopy(values, i * columns, result[i], 0, columns);
        }
        return result;
    });
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 32 | Source: HDF5File.java

Example 5: open

Upvotes: 3

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Opens a HDF5 file given its access {@link OpenMode mode}.
 */
private static int open(final File file, final OpenMode mode) {
    final int fileId;
    try {
        if (mode == OpenMode.CREATE) {
            file.delete();
            file.createNewFile();
        }
        fileId = H5.H5Fopen(file.getAbsolutePath(), mode.getFlags(), HDF5Constants.H5P_DEFAULT);
    } catch (final HDF5LibraryException | IOException e) {
        throw new HDF5LibException(
                String.format("exception when opening '%s' with %s mode: %s",file.getAbsolutePath(), mode, e.getMessage()), e);
    }
    if (fileId < 0) {
        throw new HDF5LibException(
                String.format("failure when opening '%s' for read-only access; negative fileId: %d",file.getAbsolutePath(),fileId)
        );
    }
    return fileId;
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 23 | Source: HDF5File.java
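
The OpenMode type used by open is not shown in this excerpt. As a rough sketch of what such an enum could look like, the modes would map to HDF5Constants file-access flags (the getFlags accessor comes from the call above; the specific flag assignments are assumptions):

// Hypothetical sketch of an OpenMode enum mapping modes to HDF5Constants access flags.
public enum OpenMode {
    READ_ONLY(HDF5Constants.H5F_ACC_RDONLY),
    READ_WRITE(HDF5Constants.H5F_ACC_RDWR),
    CREATE(HDF5Constants.H5F_ACC_TRUNC);

    private final int flags;

    OpenMode(final int flags) {
        this.flags = flags;
    }

    public int getFlags() {
        return flags;
    }
}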

Example 6: makeDoubleMatrix

Upvotes: 3

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Creates or overwrites a double 2D matrix at a particular position in the underlying HDF5 file.
 *
 * @param fullPath the path where to place the double matrix in the HDF5 file.
 * @return true iff a new data-set had to be created (none existed for that path).
 * @throws IllegalArgumentException if {@code fullPath} is {@code null} or is not a valid data-type name.
 * @throws HDF5LibException if {@code fullPath} does not exist, contains the wrong data type (non-double) or
 *    is not a 2D array or is too small to contain the new value.
 */
public boolean makeDoubleMatrix(final String fullPath, final double[][] value) {
    Utils.nonNull(value, "the value provided cannot be null");
    if (value.length == 0) {
        throw new IllegalArgumentException("the value provided must have some elements");
    }
    final int columnCount = Utils.nonNull(value[0], "the input value array cannot contain nulls: 0").length;
    if (columnCount == 0) {
        throw new IllegalArgumentException("the value provided must have some elements");
    }
    for (int i = 1; i < value.length; i++) {
        if (Utils.nonNull(value[i], "some row data is null: " + i).length != columnCount) {
            throw new IllegalArgumentException("some rows in the input value matrix has different number of elements");
        }
    }
    final long[] dimensions = new long[] { value.length, columnCount };
    return makeDataset(fullPath, basicTypeCopyIdSupplier(HDF5Constants.H5T_INTEL_F64), dimensions, value);
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 27 | Source: HDF5File.java
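
The basicTypeCopyIdSupplier helper used here and in examples 24 and 25 is not part of the excerpt. A plausible sketch, assuming it simply copies the given predefined type id so that makeDataset (example 7, below) can close the copy afterwards:

// Hypothetical helper: returns a supplier that copies the given HDF5 type id,
// so the caller owns (and must close) the returned type handle.
private IntSupplier basicTypeCopyIdSupplier(final int typeId) {
    return () -> {
        try {
            return H5.H5Tcopy(typeId);
        } catch (final HDF5LibraryException e) {
            throw new HDF5LibException(
                    String.format("could not copy HDF5 type id %d: %s", typeId, e.getMessage()), e);
        }
    };
}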

Example 7: makeDataset

Upvotes: 3

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * General dataset making recipe.
 * @param fullPath the dataset full path.
 * @param typeIdSupplier type id supplier lambda.
 * @param dimensions array with the dimensions of the data.
 * @param data the data. It must be an array of the appropriate type given the type that is
 *             going to be returned by the {@code typeIdSupplier}.
 * @return true iff the data-set needed to be created (it did not exist previously). It will
 * return false if the data-set existed even if it was modified in the process.
 */
private boolean makeDataset(final String fullPath, final IntSupplier typeIdSupplier, final long[] dimensions, final Object data) {
    checkCanWrite();
    int typeCopyId = -1;
    try {
        typeCopyId = typeIdSupplier.getAsInt();
        final Pair<String, String> pathAndName = splitPathInParentAndName(fullPath);
        final String groupPath = pathAndName.getLeft();
        final String dataSetName = pathAndName.getRight();
        makeGroup(groupPath);
        final int childType = findOutGroupChildType(groupPath, dataSetName, fullPath);
        if (childType == HDF5Constants.H5G_UNKNOWN) {
            createDataset(fullPath, typeCopyId, dimensions);
            writeDataset(fullPath, typeCopyId, data);
            return true;
        } else if (childType == HDF5Constants.H5G_DATASET) {
            writeDataset(fullPath, typeCopyId, data);
            return false;
        } else {
            throw new HDF5LibException(String.format("problem trying to write dataset %s in file %s: there is a collision with a non-dataset object", fullPath, file));
        }
    } finally {
        if (typeCopyId != -1) { try { H5.H5Tclose(typeCopyId); } catch (final HDF5Exception ex ){} }
    }
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 35 | Source: HDF5File.java

Example 8: Hdf5AedatFileInputReader

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Constructor.
 * @param f: The Hdf5 file to be read.
 * @param chip
 * @throws IOException
 */
public Hdf5AedatFileInputReader(File f, AEChip chip) throws IOException  {
    fileName = f.getName();
    this.chip = chip;
    
    // An event packet is a row in the dataset. A row consists of 3 columns.
    // It's ordered like this: timestamp_system | packet header | dvs data.
    eventPktDimsM[0] = 1;
    eventPktDimsM[1] = 3;
    
    eventPktData = new String[(int)(eventPktDimsM[0] * eventPktDimsM[1])]; 
    // Create the memory datatype.
    memtype = H5.H5Tvlen_create(HDF5Constants.H5T_NATIVE_UCHAR);       
    eventCamPktSpace_id = H5.H5Screate_simple(2, eventPktDimsM, null); 
}
 

Developer: SensorsINI | Project: jaer | Lines: 21 | Source: Hdf5AedatFileInputReader.java

Example 9: getChunkDims

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 *
 * @return the chunk's dimensions.
 */
public long[] getChunkDims() {
    long[] retDims = {0, 0};
    int dcpl = -1;
    int chunknDims = 2;
    try {            
        dcpl = H5.H5Dget_create_plist(dataset_id);
        if( HDF5Constants.H5D_CHUNKED == H5.H5Pget_layout(dcpl)) {
            H5.H5Pget_chunk(dcpl, chunknDims, retDims);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }  
    return retDims;
}
 

Developer: SensorsINI | Project: jaer | Lines: 19 | Source: Hdf5AedatFileInputReader.java

Example 10: readRowData

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Reads one row of data from the file. Every row in the ddd-17
 * dataset consists of 3 columns: timestamp, header, data.
 * @param rowNum the number of the row that will be read.
 * @return the data in row rowNum.
 */
public String[] readRowData(int rowNum) {
           
    // Select the row data to read.      
    long[] offset = {0, 0};
    long[] count = {1, 3};
    long[] stride = {1, 1};        
    long[] block = {1, 1};   

    offset[0] = rowNum;
    try { 
        if (wholespace_id >= 0) {
    
            H5.H5Sselect_hyperslab(wholespace_id, HDF5Constants.H5S_SELECT_SET, offset, stride, count, block);

            /*
             * Define and select the second part of the hyperslab selection,
             * which is subtracted from the first selection by the use of
             * H5S_SELECT_NOTB
             */
             // H5.H5Sselect_hyperslab (wholespace_id, HDF5Constants.H5S_SELECT_NOTB, offset, stride, count,
             //               block);    
             
            H5.H5DreadVL(dataset_id, memtype,
                eventCamPktSpace_id, wholespace_id,
                HDF5Constants.H5P_DEFAULT, eventPktData);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    
    return eventPktData;
}
 

Developer: SensorsINI | Project: jaer | Lines: 39 | Source: Hdf5AedatFileInputReader.java
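
A short usage sketch for this reader (illustrative only: it assumes an already-constructed reader whose dataset has been opened with openDataset, and relies on the column order described in example 8):

// Illustrative only: read row 0 and unpack its three columns
// (system timestamp, packet header, DVS data).
String[] row = reader.readRowData(0);
String systemTimestamp = row[0];
String packetHeader = row[1];
String dvsData = row[2];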

Example 11: createExchangeDataset

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
private synchronized void createExchangeDataset(long dimension)
		throws HDF5Exception, EodDataSinkException {

	if (!isOpen) {
		throw new EodDataSinkException("HDF5 File data sink closed!");
	}

	long dimensions[] = { dimension };
	long maxDimensions[] = { HDF5Constants.H5S_UNLIMITED };
	int exchangeDataspaceHandle = H5.H5Screate_simple(
			EXCHANGE_DATASET_RANK, dimensions, maxDimensions);

	Hdf5ExchangeDatatype exchangeDatatype = new Hdf5ExchangeDatatype();
	exchangeDatatypeHandle = exchangeDatatype.getFileDatatypeHandle();

	int createProperties = H5.H5Pcreate(HDF5Constants.H5P_DATASET_CREATE);
	@SuppressWarnings("unused")
	int status = H5.H5Pset_chunk(createProperties, EXCHANGE_DATASET_RANK,
			dimensions);

	if ((fileHandle >= 0) && (exchangeDataspaceHandle >= 0)
			&& (exchangeDatatypeHandle >= 0)) {

		exchangeDatasetHandle = H5.H5Dcreate(fileHandle,
				EXCHANGE_DATASET_NAME, exchangeDatatypeHandle,
				exchangeDataspaceHandle, HDF5Constants.H5P_DEFAULT,
				createProperties, HDF5Constants.H5P_DEFAULT);
	} else {
		throw new EodDataSinkException(
				"Failed to create exchange dataset from scratch.");
	}

	logger.info("Sucessfully created new exchange dataset.");

}
 

Developer: jsr38 | Project: ds3 | Lines: 36 | Source: Hdf5EodDataSink.java

Example 12: readExchanges

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
public synchronized Set<String> readExchanges() throws EodDataSinkException {

		if (!isOpen) {
			throw new EodDataSinkException("HDF5 File data sink closed!");
		}

		String[] exchanges = null;

		try {
			rootGroupHandle = H5.H5Gopen(fileHandle, "/",
					HDF5Constants.H5P_DEFAULT);

			final H5L_iterate_cb iter_cb = new H5L_iter_callbackT();
			final opdata od = new opdata();
			@SuppressWarnings("unused")
			int status = H5.H5Literate(rootGroupHandle,
					HDF5Constants.H5_INDEX_NAME, HDF5Constants.H5_ITER_NATIVE,
					0L, iter_cb, od);
			exchanges = ((H5L_iter_callbackT) iter_cb).getSymbols();

		} catch (HDF5LibraryException lex) {
			StringBuffer messageBuffer = new StringBuffer();
			messageBuffer.append("Failed to iterate over exchanges!");
			EodDataSinkException e = new EodDataSinkException(
					messageBuffer.toString());
			e.initCause(lex);
			throw e;
		}

		return new HashSet<String>(Arrays.asList(exchanges));
	}
 

Developer: jsr38 | Project: ds3 | Lines: 32 | Source: Hdf5EodDataSink.java

Example 13: createExchangeDataset

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
private void createExchangeDataset(long dimension) throws HDF5Exception, EodDataSinkException {

  long dimensions[] = { dimension };
  long maxDimensions[] = { HDF5Constants.H5S_UNLIMITED };
  int exchangeDataspaceHandle = H5.H5Screate_simple(EXCHANGE_DATASET_RANK, dimensions, maxDimensions);

  Hdf5ExchangeDatatype exchangeDatatype = new Hdf5ExchangeDatatype();
  exchangeDatatypeHandle = exchangeDatatype.getFileDatatypeHandle();

  int createProperties = H5.H5Pcreate(HDF5Constants.H5P_DATASET_CREATE);
  @SuppressWarnings("unused")
  int status = H5.H5Pset_chunk(createProperties, EXCHANGE_DATASET_RANK, dimensions);

  if ((fileHandle >= 0)
      && (exchangeDataspaceHandle >= 0)
      && (exchangeDatatypeHandle >= 0)) {

    exchangeDatasetHandle = H5.H5Dcreate(fileHandle,
                                         EXCHANGE_DATASET_NAME,
                                         exchangeDatatypeHandle,
                                         exchangeDataspaceHandle,
                                         HDF5Constants.H5P_DEFAULT,
                                         createProperties,
                                         HDF5Constants.H5P_DEFAULT);
  }
  else {
    throw new EodDataSinkException("Failed to create exchange dataset from scratch.");
  }

  logger.info("Sucessfully created new exchange dataset.");

}
 

Developer: jsr38 | Project: ds3 | Lines: 33 | Source: HdfObjectEodDataSink.java

Example 14: flush

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Flush this file reader (through the HDF5 API) to disk rather than waiting for the OS to handle it.
 */
public void flush() {
    if (isClosed()) {
        return;
    }
    try {
        H5.H5Fflush(fileId, HDF5Constants.H5F_SCOPE_GLOBAL);
    } catch (final HDF5LibraryException e) {
        throw new HDF5LibException(
                String.format("failure when flushing '%s': %s",file.getAbsolutePath(),e.getMessage()),e);
    }
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 15 | Source: HDF5File.java

Example 15: readDoubleArray

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Reads a double array from a particular position in the underlying HDF5 file.
 *
 * @param fullPath the path.
 * @return never {@code null}; non-existing data or the wrong type in the HDF5
 * file will result in a {@link HDF5LibException} instead.
 * @throws IllegalArgumentException if {@code fullPath} is {@code null}.
 * @throws HDF5LibException if {@code fullPath} does not exist, contains the wrong data type (non-double) or
 *    is multidimensional.
 */
public double[] readDoubleArray(final String fullPath) {
    return readDataset(fullPath, (dataSetId, typeId, dimensions) -> {
        if (dimensions.length != 1) {
            throw new HDF5LibException(
                    String.format("expected 1-D array for data-set '%s' in '%s' but it is %d-D", fullPath, file, dimensions.length));
        }
        final double[] result = new double[(int) dimensions[0]];
        final int code = H5.H5Dread_double(dataSetId, typeId, HDF5Constants.H5S_ALL, HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT, result);
        if (code < 0) {
            throw new HDF5LibException(String.format("getting doubles from data-set '%s' in file '%s' resulted in code: %d", fullPath, file, code));
        }
        return result;
    });
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 25 | Source: HDF5File.java

Example 16: openDataset

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Opens a HDF5 dataSet.
 *
 * @param fullPath dataset full path.
 * @return the dataSet id.
 * @throws HDF5LibraryException if there was some error thrown by the HDF5 library.
 * @throws HDF5LibException if the HDF5 library returned an invalid dataSet id indicating some kind of issue.
 */
private int openDataset(final String fullPath) throws HDF5LibraryException {
    final int dataSetId = H5.H5Dopen(fileId, fullPath, HDF5Constants.H5P_DEFAULT);
    if (dataSetId <= 0) {
        throw new HDF5LibException(
                String.format("opening string data-set '%s' in file '%s' failed with code: %d", fullPath, file, dataSetId));
    }
    return dataSetId;
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 17 | Source: HDF5File.java

Example 17: findOutGroupChildType

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Returns the type of the named child of a group.
 * <p>
 * Type constants are listed in {@link HDF5Constants}, e.g.:
 * <ul>
 *     <li>{@link HDF5Constants#H5G_GROUP H5G_GROUP} indicates that the child is another group</li>
 *     <li>{@link HDF5Constants#H5G_DATASET H5G_DATASET} indicates that the child is a dataset, and so on</li>
 * </ul>
 *
 * </p>
 * {@link HDF5Constants#H5G_UNKNOWN H5G_UNKNOWN} indicates that the child is not present.
 *
 * @param groupId the parent group id. It must be open.
 * @param name the name of the target child.
 * @param fullPath full path reported in exceptions when there is an issue.
 * @return {@link HDF5Constants#H5G_UNKNOWN H5G_UNKNOWN} if there is no such child node; otherwise a valid type constant.
 * @throws HDF5LibraryException if any is thrown by the HDF5 library.
 */
private int findOutGroupChildType(final int groupId, final String name, final String fullPath) throws HDF5LibraryException {

    // Using a single-position array to return a value is a bit inefficient, but that is how the API works:
    final long[] numObjsResult = new long[1];
    H5G_info_t result = H5.H5Gget_info(groupId);
    numObjsResult[0] = result.nlinks;

    final int childCount = (int) numObjsResult[0];
    if (childCount == 0) { // this is no premature optimization: get_obj_info_all really cannot handle length 0 arrays.
        return HDF5Constants.H5G_UNKNOWN;
    } else {
        final String[] childNames = new String[childCount];
        final int[] childTypes = new int[childCount];
        final int[] lTypes = new int[childCount];
        final long[] childRefs = new long[childCount];

        // Example call in HDF docs (https://www.hdfgroup.org/HDF5/examples/api18-java.html .... H5_Ex_G_Iterate.java Line 71):
        //  H5.H5Gget_obj_info_all(file_id, DATASETNAME, oname, otype, ltype, orefs, HDF5Constants.H5_INDEX_NAME);
        if (H5.H5Gget_obj_info_all(groupId, ".", childNames, childTypes, lTypes, childRefs, HDF5Constants.H5_INDEX_NAME) < 0) {
            throw new HDF5LibException(String.format("problem trying to find a group (%s) in file %s", fullPath, file));
        }
        final int childIndex = ArrayUtils.indexOf(childNames, name);
        if (childIndex == -1) {
            return HDF5Constants.H5G_UNKNOWN;
        } else {
            return childTypes[childIndex];
        }
    }
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 48 | Source: HDF5File.java

Example 18: openDataset

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Open the dataset in the file.
 * @param datasetPath: the path of the dataset that will be opened.
 * @return true if the dataset was opened successfully, otherwise false.
 */
public boolean openDataset(String datasetPath) {
    boolean retVal = false;

    // Open file using the default properties.
    try {
        file_id = H5.H5Fopen(fileName, HDF5Constants.H5F_ACC_RDWR, HDF5Constants.H5P_DEFAULT);
        
        // Open dataset using the default properties.
        if (file_id >= 0) {
            int count = (int) H5.H5Gn_members(file_id, "/dvs");
            String[] oname = new String[count];
            int[] otype = new int[count];
            int[] ltype = new int[count];
            long[] orefs = new long[count];
            H5.H5Gget_obj_info_all(file_id, "/dvs", oname, otype, ltype, orefs, HDF5Constants.H5_INDEX_NAME);

            // Get type of the object and display its name and type.
            for (int indx = 0; indx < otype.length; indx++) {
                switch (H5O_type.get(otype[indx])) {
                case H5O_TYPE_GROUP:
                    System.out.println("  Group: " + oname[indx]);
                    break;
                case H5O_TYPE_DATASET:
                    System.out.println("  Dataset: " + oname[indx]);
                    break;
                case H5O_TYPE_NAMED_DATATYPE:
                    System.out.println("  Datatype: " + oname[indx]);
                    break;
                default:
                    System.out.println("  Unknown: " + oname[indx]);
                }
            }
            dataset_id = H5.H5Dopen(file_id, datasetPath, HDF5Constants.H5P_DEFAULT);  
            retVal = true;
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return retVal;
}
 

Developer: SensorsINI | Project: jaer | Lines: 46 | Source: Hdf5AedatFileInputReader.java
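
Neither the constructor in example 8 nor openDataset above releases the handles it opens. A minimal cleanup sketch for such a reader (this method is not part of the excerpt; the field names simply mirror the ones used above) might look like:

// Hypothetical cleanup, releasing the handles opened in examples 8 and 18.
public void close() {
    try {
        if (dataset_id >= 0) {
            H5.H5Dclose(dataset_id);
        }
        if (eventCamPktSpace_id >= 0) {
            H5.H5Sclose(eventCamPktSpace_id);
        }
        if (memtype >= 0) {
            H5.H5Tclose(memtype);
        }
        if (file_id >= 0) {
            H5.H5Fclose(file_id);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}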

Example 19: callback

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
public int callback(int group, String name, H5L_info_t info,
		H5L_iterate_t op_data) {

	H5O_info_t infobuf;
	int return_val = 0;
	opdata od = (opdata) op_data; // Type conversion
	int spaces = 2 * (od.recurs + 1); // Number of white spaces to prepend to output.

	// Get type of the object and display its name and type.
	// The name of the object is passed to this function by the Library.
	try {
		infobuf = H5.H5Oget_info_by_name(group, name,
				HDF5Constants.H5P_DEFAULT);

		for (int i = 0; i < spaces; i++)
			System.out.print(" "); // Format output.
		switch (H5O.H5O_type.get(infobuf.type)) {
		case H5O_TYPE_GROUP:
			System.out.println("Group: " + name + " { ");
			// Check group address against linked list of operator
			// data structures. We will always run the check, as the
			// reference count cannot be relied upon if there are
			// symbolic links, and H5Oget_info_by_name always follows
			// symbolic links. Alternatively we could use H5Lget_info
			// and never recurse on groups discovered by symbolic
			// links, however it could still fail if an object's
			// reference count was manually manipulated with
			// H5Odecr_refcount.
			if (group_check(od, infobuf.addr)) {
				for (int i = 0; i < spaces; i++)
					System.out.print(" ");
				System.out.println("  Warning: Loop detected!");
			} else {
				// Initialize new object of type opdata and begin
				// recursive iteration on the discovered
				// group. The new opdata is given a pointer to the
				// current one.
				symbols.add(name);
				opdata nextod = new opdata();
				nextod.recurs = od.recurs + 1;
				nextod.prev = od;
				nextod.addr = infobuf.addr;
				// H5L_iterate_cb iter_cb2 = new H5L_iter_callbackT();
				// return_val = H5.H5Literate_by_name (group, name,
				// HDF5Constants.H5_INDEX_NAME,
				// HDF5Constants.H5_ITER_NATIVE, 0L, iter_cb2, nextod,
				// HDF5Constants.H5P_DEFAULT);
			}
			for (int i = 0; i < spaces; i++)
				System.out.print(" ");
			System.out.println("}");
			break;
		case H5O_TYPE_DATASET:
			System.out.println("Dataset: " + name);
			break;
		case H5O_TYPE_NAMED_DATATYPE:
			System.out.println("Datatype: " + name);
			break;
		default:
			System.out.println("Unknown: " + name);
		}
	} catch (Exception e) {
		logger.error(e);
	}

	return return_val;
}
 

Developer: jsr38 | Project: ds3 | Lines: 70 | Source: Hdf5EodDataSink.java

Example 20: readExchangeSymbols

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
public synchronized String[] readExchangeSymbols(String exchange)
		throws EodDataSinkException {

	if (!isOpen) {
		throw new EodDataSinkException("HDF5 File data sink closed!");
	}

	Integer exchangeGroupHandle = (Integer) exchangeGroupHandleMap
			.get(exchange);

	String[] symbols = null;

	try {
		if (exchangeGroupHandle == null) {
			exchangeGroupHandle = H5.H5Gopen(fileHandle, exchange,
					HDF5Constants.H5P_DEFAULT);
			exchangeGroupHandleMap.put(exchange, exchangeGroupHandle);
		}

		final H5L_iterate_cb iter_cb = new H5L_iter_callbackT();
		final opdata od = new opdata();
		@SuppressWarnings("unused")
		int status = H5.H5Literate(exchangeGroupHandle,
				HDF5Constants.H5_INDEX_NAME, HDF5Constants.H5_ITER_NATIVE,
				0L, iter_cb, od);
		symbols = ((H5L_iter_callbackT) iter_cb).getSymbols();
		if (symbols == null || symbols.length <= 0) {
			throw new EodDataSinkException(
					"Couldn't find any symbols for this exchange.");
		}
	} catch (HDF5LibraryException lex) {
		StringBuffer messageBuffer = new StringBuffer();
		messageBuffer.append("Failed to iterate over exchanges  ");
		messageBuffer.append(exchange);
		messageBuffer.append(" ]");
		EodDataSinkException e = new EodDataSinkException(
				messageBuffer.toString());
		e.initCause(lex);
		throw e;
	}

	return symbols;
}
 

Developer: jsr38 | Project: ds3 | Lines: 44 | Source: Hdf5EodDataSink.java

Example 21: createQuoteDataset

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
private void createQuoteDataset(long dimension, int locationHandle) throws HDF5Exception, EodDataSinkException {

  long dimensions[] = { dimension };
  long maxDimensions[] = { HDF5Constants.H5S_UNLIMITED };
  
  int quoteDataspaceHandle = H5.H5Screate_simple(QUOTE_DATASET_RANK, dimensions, maxDimensions);
  quoteFileDatatypeHandle = Hdf5QuoteDatatype.getFileDatatypeHandle();
  quoteMemoryDatatypeHandle = Hdf5QuoteDatatype.getMemoryDatatypeHandle();

  int createProperties = H5.H5Pcreate(HDF5Constants.H5P_DATASET_CREATE);
  @SuppressWarnings("unused")
  int status = H5.H5Pset_chunk(createProperties, QUOTE_DATASET_RANK, QUOTEDATASET_CHUNK_DIMENSIONS);

  if ((fileHandle >= 0)
      && (quoteDataspaceHandle >= 0)
      && (quoteFileDatatypeHandle >= 0)) {

    try {
      quoteDatasetHandle = H5.H5Dcreate(locationHandle,
                                        QUOTE_DATASET_NAME,
                                        quoteFileDatatypeHandle,
                                        quoteDataspaceHandle,
                                        HDF5Constants.H5P_DEFAULT,
                                        createProperties,
                                        HDF5Constants.H5P_DEFAULT);
    }
    catch (HDF5Exception e) {
      throw e;
    }
    finally {
      H5.H5Sclose(quoteDataspaceHandle);
    }
    
  }
  else {
    throw new EodDataSinkException("Failed to create exchange dataset from scratch.");
  }

  logger.info("Sucessfully created new quote dataset.");


}
 

Developer: jsr38 | Project: ds3 | Lines: 43 | Source: HdfObjectEodDataSink.java

Example 22: Hdf5ExchangeDatatype

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
Hdf5ExchangeDatatype() throws EodDataSinkException {
  
  try {

    codeDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_C_S1);
    H5.H5Tset_size(codeDatatypeHandle, EXCHANGE_DATATYPE_SIZE_CODE);

    nameDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_C_S1);
    H5.H5Tset_size(nameDatatypeHandle, EXCHANGE_DATATYPE_SIZE_NAME);
  
    countryDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_C_S1);
    H5.H5Tset_size(countryDatatypeHandle, EXCHANGE_DATATYPE_SIZE_COUNTRY);

    currencyDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_C_S1);
    H5.H5Tset_size(currencyDatatypeHandle, EXCHANGE_DATATYPE_SIZE_CURRENCY);

    suffixDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_C_S1);
    H5.H5Tset_size(suffixDatatypeHandle, EXCHANGE_DATATYPE_SIZE_SUFFIX);

    timezoneDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_C_S1);
    H5.H5Tset_size(timezoneDatatypeHandle, EXCHANGE_DATATYPE_SIZE_TIMEZONE);

    isIntradayDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_STD_I32LE);

    //intradayStartDateDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_STD_U64BE);
    intradayStartDateDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_UNIX_D64LE);

    hasIntradayProductDatatypeHandle = H5.H5Tcopy(HDF5Constants.H5T_STD_I32LE);

    exchangeDatatypeHandle = H5.H5Tcreate(HDF5Constants.H5T_COMPOUND, EXCHANGE_DATATYPE_SIZE);

    if (exchangeDatatypeHandle >= 0) {
    
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_CODE, EXCHANGE_DATATYPE_OFFSET_CODE, codeDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_NAME, EXCHANGE_DATATYPE_OFFSET_NAME, nameDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_COUNTRY, EXCHANGE_DATATYPE_OFFSET_COUNTRY, countryDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_CURRENCY, EXCHANGE_DATATYPE_OFFSET_CURRENCY, currencyDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_SUFFIX, EXCHANGE_DATATYPE_OFFSET_SUFFIX, suffixDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_TIMEZONE, EXCHANGE_DATATYPE_OFFSET_TIMEZONE, timezoneDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_ISINTRADAY, EXCHANGE_DATATYPE_OFFSET_ISINTRADAY, isIntradayDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_INTRADAYSTARTDATE, EXCHANGE_DATATYPE_OFFSET_INTRADAYSTARTDATE, intradayStartDateDatatypeHandle);
      H5.H5Tinsert(exchangeDatatypeHandle, EXCHANGE_DATATYPE_NAME_HASINTRADAYPRODUCT, EXCHANGE_DATATYPE_OFFSET_HASINTRADAYPRODUCT, hasIntradayProductDatatypeHandle);

    }
    else {
      throw new EodDataSinkException("Unable to create exchange datatype.");
    }

  }
  catch (HDF5LibraryException ex) {
    logger.error(ex);
    throw new EodDataSinkException("Unable to create exchange datatype.");
  }



}
 

Developer: jsr38 | Project: ds3 | Lines: 58 | Source: Hdf5ExchangeDatatype.java

Example 23: createExtensibleArray

Upvotes: 2

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
protected H5ScalarDS createExtensibleArray(String name, Group parent, Datatype type,
                                           String TITLE, String LAYOUT, String UNITS,
                                           long... dims)
    throws Exception
{
    long[] maxdims = dims.clone();
    maxdims[0] = H5F_UNLIMITED;
    long[] chunks = dims.clone();

    /* avoid too small chunks */
    chunks[0] = 1;
    if (ArrayUtil.product(chunks) == 0)
        throw new RuntimeException("Empty chunks: " + xJoined(chunks));

    while (ArrayUtil.product(chunks) < 1024)
        chunks[0] *= 2;

    /* do not write any data in the beginning */
    dims[0] = 0;

    /* Create dataspace */
    int filespace_id = H5.H5Screate_simple(dims.length, dims, maxdims);

    /* Create the dataset creation property list, add the shuffle filter
     * and the gzip compression filter. The order in which the filters
     * are added here is significant — we will see much greater results
     * when the shuffle is applied first. The order in which the filters
     * are added to the property list is the order in which they will be
     * invoked when writing data. */
    int dcpl_id = H5.H5Pcreate(HDF5Constants.H5P_DATASET_CREATE);
    H5.H5Pset_shuffle(dcpl_id);
    H5.H5Pset_deflate(dcpl_id, compression_level);
    H5.H5Pset_chunk(dcpl_id, dims.length, chunks);

    /* Create the dataset */
    final String path = parent.getFullName() + "/" + name;
    H5.H5Dcreate(this.output.getFID(), path,
                 type.toNative(), filespace_id,
                 HDF5Constants.H5P_DEFAULT, dcpl_id, HDF5Constants.H5P_DEFAULT);
    Dataset ds = new H5ScalarDS(this.output, path, "/");
    ds.init();

    log.info("Created {} with dims=[{}] size=[{}] chunks=[{}]",
             name, xJoined(dims), xJoined(maxdims), xJoined(chunks));

    setAttribute(ds, "TITLE", TITLE);
    setAttribute(ds, "LAYOUT", LAYOUT);
    setAttribute(ds, "UNITS", UNITS);

    return (H5ScalarDS) ds;
}
 

Developer: neurord | Project: stochdiff | Lines: 52 | Source: ResultWriterHDF5.java
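
Because the dataset above is created with zero rows and an unlimited first dimension, data is appended later by extending the dataset and writing into a hyperslab. The following is only a hedged sketch of that append step using low-level H5 calls; it assumes the binding exposes H5Dset_extent and H5Dwrite_double, that the dataset holds doubles, and all handle and method names are illustrative rather than taken from the project.

// Illustrative sketch: append one row of doubles to a 2-D dataset whose
// first dimension is unlimited (as created above).
static void appendRow(int datasetId, long currentRows, double[] row) throws Exception {
    // Grow the first (unlimited) dimension by one row.
    long[] newSize = { currentRows + 1, row.length };
    H5.H5Dset_extent(datasetId, newSize);

    // Select the newly added row in the file dataspace.
    int fileSpace = H5.H5Dget_space(datasetId);
    long[] offset = { currentRows, 0 };
    long[] count  = { 1, row.length };
    long[] stride = { 1, 1 };
    long[] block  = { 1, 1 };
    H5.H5Sselect_hyperslab(fileSpace, HDF5Constants.H5S_SELECT_SET, offset, stride, count, block);

    // Memory dataspace describing the single row being written.
    int memSpace = H5.H5Screate_simple(2, count, null);
    H5.H5Dwrite_double(datasetId, HDF5Constants.H5T_NATIVE_DOUBLE, memSpace, fileSpace,
                       HDF5Constants.H5P_DEFAULT, row);

    H5.H5Sclose(memSpace);
    H5.H5Sclose(fileSpace);
}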

Example 24: makeDouble

Upvotes: 1

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Creates or overwrites a double value at a particular position in the underlying HDF5 file.
 *
 * @param fullPath the path where to place the double value in the HDF5 file.
 * @return true iff the new data-set had to be created (none existed for that path).
 * @throws IllegalArgumentException if {@code fullPath} is {@code null} or is not a valid data-type name.
 * @throws HDF5LibException if {@code fullPath} does not exist, contains the wrong data type (non-double) or
 *    is an array with more than one value, or is multidimensional.
 */
public boolean makeDouble(final String fullPath, final double value) {
    return makeDataset(fullPath, basicTypeCopyIdSupplier(HDF5Constants.H5T_INTEL_F64), SCALAR_VALUE_DIMENSIONS, new double[] {value});
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 13 | Source: HDF5File.java

Example 25: makeDoubleArray

Upvotes: 1

import ncsa.hdf.hdf5lib.HDF5Constants; // import the required package/class
/**
 * Creates or overwrites a double array at a particular position in the underlying HDF5 file.
 *
 * @param fullPath the path where to place the double array in the HDF5 file.
 * @return true iff the new data-set had to be created (none existed for that path).
 * @throws IllegalArgumentException if {@code fullPath} is {@code null} or is not a valid data-type name.
 * @throws HDF5LibException if {@code fullPath} does not exist, contains the wrong data type (non-double) or
 *    is not a 1D array or is too small to contain the new value.
 */
public boolean makeDoubleArray(final String fullPath, final double[] value) {
    Utils.nonNull(value);
    final long[] dimensions = new long[] { value.length };
    return makeDataset(fullPath, basicTypeCopyIdSupplier(HDF5Constants.H5T_INTEL_F64), dimensions,  value);
}
 

Developer: broadinstitute | Project: hdf5-java-bindings | Lines: 15 | Source: HDF5File.java

