
Typical usage and code examples of the Java OrcStruct class


This article compiles typical usage examples of the Java class org.apache.hadoop.hive.ql.io.orc.OrcStruct. If you are wondering what OrcStruct is for, how to use it, or where to find examples of it in real code, the curated class code examples below should help.

The OrcStruct class belongs to the org.apache.hadoop.hive.ql.io.orc package. 28 code examples of the class are shown below, sorted by popularity by default. You can vote up the examples you find useful; your ratings help the system recommend better Java code examples.
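Most of the read-side examples below share the same `hasNext()`/`next()` accumulation loop over a record reader. As a dependency-free sketch of that pattern, here is a minimal plain-Java version using a hypothetical `RowReader` interface in place of Hive's `RecordReader` (the real Hive API additionally takes a reusable row object in `next`):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ReadLoopSketch {
    // Hypothetical stand-in for org.apache.hadoop.hive.ql.io.orc.RecordReader.
    interface RowReader<T> {
        boolean hasNext();
        T next();
    }

    // Drain a reader into a list, mirroring getORCRecords in Example 1 below.
    static <T> List<T> readAll(RowReader<T> reader) {
        List<T> rows = new ArrayList<>();
        while (reader.hasNext()) {
            rows.add(reader.next());
        }
        return rows;
    }

    public static void main(String[] args) {
        Iterator<String> it = List.of("row1", "row2").iterator();
        RowReader<String> reader = new RowReader<>() {
            public boolean hasNext() { return it.hasNext(); }
            public String next() { return it.next(); }
        };
        System.out.println(readAll(reader)); // prints [row1, row2]
    }
}
```

In the real examples, `T` is `OrcStruct` and the reader comes from `OrcFile.createReader(...).rows()`.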

Example 1: getORCRecords

Votes: 3

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
public List<OrcStruct> getORCRecords(String storeBaseDir, String tableName) throws IOException {
  List<OrcStruct> orcRecords = new ArrayList<>();
  FileSystem fs = FileSystem.get(conf);
  Path storeBasePath = new Path(fs.getHomeDirectory(), storeBaseDir);
  Path tablePath = new Path(storeBasePath, tableName);
  if (fs.exists(tablePath)) {
    RemoteIterator<LocatedFileStatus> fileIterator = fs.listFiles(tablePath, false);
    while (fileIterator.hasNext()) {
      LocatedFileStatus next = fileIterator.next();
      final org.apache.hadoop.hive.ql.io.orc.Reader reader =
          OrcFile.createReader(next.getPath(), OrcFile.readerOptions(conf));
      RecordReader rows = reader.rows();
      try {
        while (rows.hasNext()) {
          orcRecords.add((OrcStruct) rows.next(null));
        }
      } finally {
        rows.close(); // release the reader's resources
      }
      System.out.println("File name is " + next.getPath());
    }
  }
  return orcRecords;
}
 

Developer: ampool, Project: monarch, Lines: 26, Source: HDFSQuasiService.java

Example 2: advanceNextPosition

Votes: 3

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public boolean advanceNextPosition()
{
    try {
        if (closed || !recordReader.hasNext()) {
            close();
            return false;
        }

        row = (OrcStruct) recordReader.next(row);

        // reset loaded flags
        // partition keys are already loaded, but everything else is not
        System.arraycopy(isPartitionColumn, 0, loaded, 0, isPartitionColumn.length);

        return true;
    }
    catch (IOException | RuntimeException e) {
        closeWithSuppression(e);
        throw new PrestoException(HIVE_CURSOR_ERROR, e);
    }
}
 

Developer: y-lan, Project: presto, Lines: 23, Source: OrcHiveRecordCursor.java

Example 3: getStorerecord

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
private StoreRecord getStorerecord(OrcStruct orcStruct) {
  final FTableOrcStruct fTableOrcStruct = new FTableOrcStruct(orcStruct);
  StoreRecord storeRecord = new StoreRecord(fTableOrcStruct.getNumFields());
  final List<ColumnConverterDescriptor> columnConverters =
      converterDescriptor.getColumnConverters();
  for (int i = 0; i < fTableOrcStruct.getNumFields(); i++) {
    storeRecord.addValue(columnConverters.get(i).getReadable(fTableOrcStruct.getFieldValue(i)));
  }
  return storeRecord;
}
 

Developer: ampool, Project: monarch, Lines: 11, Source: TierStoreORCReader.java

Example 4: getObjectInspector

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
private ObjectInspector getObjectInspector() {
  final List<ColumnConverterDescriptor> columnConverters =
      converterDescriptor.getColumnConverters();
  List<String> names = new ArrayList<>();
  List<TypeInfo> typeInfos = new ArrayList<>();
  for (int i = 0; i < columnConverters.size(); i++) {
    final String columnName = columnConverters.get(i).getColumnName();
    names.add(i, columnName);
    final TypeInfo typeInfo = (TypeInfo) columnConverters.get(i).getTypeDescriptor();
    typeInfos.add(i, typeInfo);
  }
  TypeInfo rowTypeInfo = TypeInfoFactory.getStructTypeInfo(names, typeInfos);
  ObjectInspector inspector = OrcStruct.createObjectInspector(rowTypeInfo);
  return inspector;
}
 

Developer: ampool, Project: monarch, Lines: 16, Source: OrcWriterWrapper.java

Example 5: verifyORCRecords

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
private void verifyORCRecords(final String tablename, List<Row> rows) throws IOException {
  int expectedrecordsInStore = 0;
  List<OrcStruct> orcRecords = hdfsQuasiService.getORCRecords(baseDir, tablename);
  if (expected == 0) {
    expectedrecordsInStore = NUM_OF_ROWS;
  }
  assertEquals(expectedrecordsInStore, orcRecords.size());
  if (expectedrecordsInStore != 0) {
    assertEquals(rows.size(), orcRecords.size());
    for (int i = 0; i < rows.size(); i++) {
      assertTrue(isEqual(rows.get(i), orcRecords.get(i)));
    }
  }
}
 

Developer: ampool, Project: monarch, Lines: 15, Source: ArchiveTableDUnitTest.java

Example 6: CorcRecordReader

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
CorcRecordReader(StructTypeInfo typeInfo, RecordReader<NullWritable, OrcStruct> reader, ConverterFactory factory,
    Filter filter) {
  this.typeInfo = typeInfo;
  this.reader = reader;
  this.factory = factory;
  this.filter = filter;
  transactional = AcidRecordReader.class.isAssignableFrom(reader.getClass());
  if (transactional) {
    transactionalReader = (AcidRecordReader<NullWritable, OrcStruct>) reader;
  } else {
    transactionalReader = null;
  }
}
 

Developer: HotelsDotCom, Project: corc, Lines: 14, Source: CorcRecordReader.java

Example 7: getRecordReader

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public RecordReader<NullWritable, Corc> getRecordReader(InputSplit inputSplit, JobConf conf, Reporter reporter)
    throws IOException {
  StructTypeInfo typeInfo = getSchemaTypeInfo(conf);
  LOG.info("Conf StructTypeInfo: {}", typeInfo);
  if (typeInfo == null) {
    typeInfo = readStructTypeInfoFromSplit(inputSplit, conf);
    LOG.info("File StructTypeInfo: {}", typeInfo);
  }
  setReadColumns(conf, typeInfo);
  RecordReader<NullWritable, OrcStruct> reader = orcInputFormat.getRecordReader(inputSplit, conf, reporter);
  return new CorcRecordReader(typeInfo, reader, getConverterFactory(conf), getFilter(conf, typeInfo));
}
 

Developer: HotelsDotCom, Project: corc, Lines: 14, Source: CorcInputFormat.java

Example 8: toWritableObjectInternal

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
protected Object toWritableObjectInternal(Object value) throws UnexpectedTypeException {
  @SuppressWarnings("unchecked")
  List<Object> list = (List<Object>) value;
  OrcStruct result = (OrcStruct) inspector.create();
  result.setNumFields(list.size());
  int i = 0;
  for (StructField field : inspector.getAllStructFieldRefs()) {
    inspector.setStructFieldData(result, field, converters.get(i).toWritableObject(list.get(i)));
    i++;
  }
  return result;
}
 

Developer: HotelsDotCom, Project: corc, Lines: 14, Source: DefaultConverterFactory.java

Example 9: toJavaObjectInternal

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
protected Object toJavaObjectInternal(Object value) throws UnexpectedTypeException {
  OrcStruct struct = (OrcStruct) value;
  List<Object> result = new ArrayList<>(struct.getNumFields());
  int i = 0;
  for (StructField field : inspector.getAllStructFieldRefs()) {
    result.add(converters.get(i).toJavaObject(inspector.getStructFieldData(struct, field)));
    i++;
  }
  return result;
}
 

Developer: HotelsDotCom, Project: corc, Lines: 12, Source: DefaultConverterFactory.java

Example 10: getJavaObject

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public Object getJavaObject(OrcStruct struct) throws UnexpectedTypeException {
  Object writable = inspector.getStructFieldData(struct, structField);
  try {
    return converter.toJavaObject(writable);
  } catch (UnexpectedTypeException e) {
    throw new UnexpectedTypeException(writable, structField.getFieldName(), e);
  }
}
 

Developer: HotelsDotCom, Project: corc, Lines: 10, Source: ValueMarshallerImpl.java

Example 11: setWritableObject

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public void setWritableObject(OrcStruct struct, Object javaObject) throws UnexpectedTypeException {
  Object writable;
  try {
    writable = converter.toWritableObject(javaObject);
  } catch (UnexpectedTypeException e) {
    throw new UnexpectedTypeException(javaObject, structField.getFieldName(), e);
  }
  inspector.setStructFieldData(struct, structField, writable);
}
 

Developer: HotelsDotCom, Project: corc, Lines: 11, Source: ValueMarshallerImpl.java

Example 12: Corc

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
public Corc(StructTypeInfo typeInfo, ConverterFactory factory) {
  LOG.debug("TypeInfo: {}", typeInfo);
  inspector = (SettableStructObjectInspector) OrcStruct.createObjectInspector(typeInfo);
  struct = (OrcStruct) inspector.create();
  this.factory = factory;
  recordIdentifier = new RecordIdentifier();
}
 

Developer: HotelsDotCom, Project: corc, Lines: 8, Source: Corc.java

Example 13: toJava

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void toJava() throws UnexpectedTypeException {
  StructTypeInfo nested = new StructTypeInfoBuilder().add("char1", TypeInfoFactory.getCharTypeInfo(1)).build();
  TypeInfo typeInfo = new StructTypeInfoBuilder()
      .add("char1", TypeInfoFactory.getCharTypeInfo(1))
      .add("struct_char1", nested)
      .build();

  SettableStructObjectInspector inspector = (SettableStructObjectInspector) OrcStruct.createObjectInspector(typeInfo);
  Object struct = inspector.create();
  inspector.setStructFieldData(struct, inspector.getStructFieldRef("char1"),
      new HiveCharWritable(new HiveChar("a", -1)));

  SettableStructObjectInspector nestedInspector = (SettableStructObjectInspector) OrcStruct
      .createObjectInspector(nested);
  Object nestedStruct = nestedInspector.create();
  nestedInspector.setStructFieldData(nestedStruct, nestedInspector.getStructFieldRef("char1"),
      new HiveCharWritable(new HiveChar("b", -1)));
  inspector.setStructFieldData(struct, inspector.getStructFieldRef("struct_char1"), nestedStruct);

  List<Object> list = new ArrayList<>();
  list.add(new HiveChar("a", -1));
  list.add(Arrays.asList(new HiveChar("b", -1)));

  Converter converter = factory.newConverter(inspector);

  Object convertedList = converter.toJavaObject(struct);
  assertThat(convertedList, is((Object) list));

  Object convertedStruct = converter.toWritableObject(list);
  assertThat(convertedStruct, is(struct));
}
 

Developer: HotelsDotCom, Project: corc, Lines: 33, Source: DefaultConverterFactoryTest.java

Example 14: readerCreateKey

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void readerCreateKey() {
  @SuppressWarnings("unchecked")
  RecordReader<NullWritable, OrcStruct> recordReader = mock(RecordReader.class);
  CorcRecordReader reader = new CorcRecordReader(typeInfo, recordReader, factory, Filter.ACCEPT);

  reader.createKey();
  verify(recordReader).createKey();
}
 

Developer: HotelsDotCom, Project: corc, Lines: 10, Source: CorcRecordReaderTest.java

Example 15: readerCreateValue

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void readerCreateValue() {
  @SuppressWarnings("unchecked")
  RecordReader<NullWritable, OrcStruct> recordReader = mock(RecordReader.class);
  CorcRecordReader reader = new CorcRecordReader(typeInfo, recordReader, factory, Filter.ACCEPT);

  Corc corc = reader.createValue();
  verify(recordReader, never()).createValue();

  assertThat(corc.getInspector().getTypeName(), is("struct<a:string>"));

  Object create = ((SettableStructObjectInspector) OrcStruct.createObjectInspector(typeInfo)).create();
  assertThat(corc.getOrcStruct(), is(create));
}
 

Developer: HotelsDotCom, Project: corc, Lines: 15, Source: CorcRecordReaderTest.java

Example 16: readerGetPos

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void readerGetPos() throws IOException {
  @SuppressWarnings("unchecked")
  RecordReader<NullWritable, OrcStruct> recordReader = mock(RecordReader.class);
  CorcRecordReader reader = new CorcRecordReader(typeInfo, recordReader, factory, Filter.ACCEPT);

  reader.getPos();
  verify(recordReader).getPos();
}
 

Developer: HotelsDotCom, Project: corc, Lines: 10, Source: CorcRecordReaderTest.java

Example 17: readerGetProgress

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void readerGetProgress() throws IOException {
  @SuppressWarnings("unchecked")
  RecordReader<NullWritable, OrcStruct> recordReader = mock(RecordReader.class);
  CorcRecordReader reader = new CorcRecordReader(typeInfo, recordReader, factory, Filter.ACCEPT);

  reader.getProgress();
  verify(recordReader).getProgress();
}
 

Developer: HotelsDotCom, Project: corc, Lines: 10, Source: CorcRecordReaderTest.java

Example 18: readerClose

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void readerClose() throws IOException {
  @SuppressWarnings("unchecked")
  RecordReader<NullWritable, OrcStruct> recordReader = mock(RecordReader.class);
  CorcRecordReader reader = new CorcRecordReader(typeInfo, recordReader, factory, Filter.ACCEPT);

  reader.close();
  verify(recordReader).close();
}
 

Developer: HotelsDotCom, Project: corc, Lines: 10, Source: CorcRecordReaderTest.java

Example 19: set

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void set() throws UnexpectedTypeException, IOException {
  when(converter.toWritableObject(VALUE)).thenReturn(new Text(VALUE));

  corc.set("a", VALUE);

  SettableStructObjectInspector inspector = corc.getInspector();
  OrcStruct struct = corc.getOrcStruct();
  StructField structField = inspector.getStructFieldRef("a");
  Object data = inspector.getStructFieldData(struct, structField);

  assertThat(data, is((Object) new Text(VALUE)));
}
 

Developer: HotelsDotCom, Project: corc, Lines: 14, Source: CorcTest.java

Example 20: get

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Test
public void get() throws IOException, UnexpectedTypeException {
  when(converter.toJavaObject(new Text(VALUE))).thenReturn(VALUE);

  SettableStructObjectInspector inspector = corc.getInspector();
  OrcStruct struct = corc.getOrcStruct();
  StructField structField = inspector.getStructFieldRef("a");
  inspector.setStructFieldData(struct, structField, new Text(VALUE));

  assertThat(corc.get("a"), is((Object) VALUE));
  // repeat is same
  assertThat(corc.get("a"), is((Object) VALUE));
}
 

Developer: HotelsDotCom, Project: corc, Lines: 14, Source: CorcTest.java

Example 21: setLocation

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public void setLocation(String location, Job job) throws IOException {
    Properties p = UDFContext.getUDFContext().getUDFProperties(this.getClass());
    if (!UDFContext.getUDFContext().isFrontend()) {
        typeInfo = (TypeInfo)ObjectSerializer.deserialize(p.getProperty(signature + SchemaSignatureSuffix));
    } else if (typeInfo == null) {
        typeInfo = getTypeInfo(location, job);
    }
    if (typeInfo != null && oi == null) {
        oi = OrcStruct.createObjectInspector(typeInfo);
    }
    if (!UDFContext.getUDFContext().isFrontend()) {
        if (p.getProperty(signature + RequiredColumnsSuffix) != null) {
            mRequiredColumns = (boolean[]) ObjectSerializer.deserialize(p
                    .getProperty(signature + RequiredColumnsSuffix));
            job.getConfiguration().setBoolean(ColumnProjectionUtils.READ_ALL_COLUMNS, false);
            job.getConfiguration().set(ColumnProjectionUtils.READ_COLUMN_IDS_CONF_STR,
                    getReqiredColumnIdString(mRequiredColumns));
            if (p.getProperty(signature + SearchArgsSuffix) != null) {
                // Bug in setSearchArgument which always expects READ_COLUMN_NAMES_CONF_STR to be set
                job.getConfiguration().set(ColumnProjectionUtils.READ_COLUMN_NAMES_CONF_STR,
                        getReqiredColumnNamesString(getSchema(location, job), mRequiredColumns));
            }
        } else if (p.getProperty(signature + SearchArgsSuffix) != null) {
            // Bug in setSearchArgument which always expects READ_COLUMN_NAMES_CONF_STR to be set
            job.getConfiguration().set(ColumnProjectionUtils.READ_COLUMN_NAMES_CONF_STR,
                    getReqiredColumnNamesString(getSchema(location, job)));
        }
        if (p.getProperty(signature + SearchArgsSuffix) != null) {
            job.getConfiguration().set(SARG_PUSHDOWN, p.getProperty(signature + SearchArgsSuffix));
        }

    }
    FileInputFormat.setInputPaths(job, location);
}
 

Developer: sigmoidanalytics, Project: spork, Lines: 36, Source: OrcStorage.java

Example 22: getReadableComplexType

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@SuppressWarnings("unchecked")
private static Object getReadableComplexType(Object wObject, DataType type) {
  if (wObject == null)
    return null;
  Object resultObject = null;
  switch (type.getCategory()) {
    case Basic:
      resultObject = ConverterUtils.ORCReaderFunctions.get(type.toString()).apply(wObject);
      break;
    case Struct:
      final FTableOrcStruct fTableOrcStruct = new FTableOrcStruct((OrcStruct) wObject);
      StructType structObjectType = (StructType) type;
      Object[] valueArray = new Object[fTableOrcStruct.getNumFields()];
      for (int i = 0; i < valueArray.length; i++) {
        // recursively convert each field value
        valueArray[i] = getReadableComplexType(fTableOrcStruct.getFieldValue(i),
            structObjectType.getColumnTypes()[i]);
      }
      resultObject = valueArray;
      break;
    case List:
      List list = (List) wObject;
      ListType listObjectType = (ListType) type;
      List outList = new ArrayList();
      for (int i = 0; i < list.size(); i++) {
        outList.add(i, getReadableComplexType(list.get(i), listObjectType.getTypeOfElement()));
      }
      resultObject = outList;
      break;
    case Union:
      final UnionType unionObjectType = (UnionType) type;
      FTableOrcUnion unionIn = new FTableOrcUnion(wObject);
      byte tag = unionIn.getTag();
      Object[] unionOut = new Object[2];
      unionOut[0] = tag;
      unionOut[1] =
          getReadableComplexType(unionIn.getObject(), unionObjectType.getColumnTypes()[tag]);
      resultObject = unionOut;
      break;
    case Map:
      Map iMap = (Map) wObject;
      MapType mapObjectType = (MapType) type;
      Map outMap = new HashMap();
      iMap.forEach((K, V) -> {
        outMap.put(getReadableComplexType(K, mapObjectType.getTypeOfKey()),
            getReadableComplexType(V, mapObjectType.getTypeOfValue()));
      });
      resultObject = outMap;
      break;
    default:
      break;
  }
  return resultObject;
}
 

Developer: ampool, Project: monarch, Lines: 54, Source: ORCColumnConverterDescriptor.java
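The recursion in Example 22 (a switch on the type category that descends into nested containers) can be illustrated without any Hive types. A simplified, dependency-free sketch that recursively unwraps nested Lists and Maps and applies a leaf converter to basic values (the names here are hypothetical, not the project's API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class RecursiveConvertSketch {
    // Recursively apply a leaf converter to nested List/Map containers,
    // mirroring the category switch in getReadableComplexType.
    static Object convert(Object value, Function<Object, Object> leaf) {
        if (value == null) return null;
        if (value instanceof List<?> list) {
            List<Object> out = new ArrayList<>();
            for (Object e : list) out.add(convert(e, leaf));
            return out;
        }
        if (value instanceof Map<?, ?> map) {
            Map<Object, Object> out = new HashMap<>();
            map.forEach((k, v) -> out.put(convert(k, leaf), convert(v, leaf)));
            return out;
        }
        return leaf.apply(value); // Basic category: delegate to the per-type reader
    }

    public static void main(String[] args) {
        Object nested = List.of(Map.of("n", 1), Map.of("n", 2));
        // Leaf converter: tag Integers, as a stand-in for Writable unwrapping.
        System.out.println(convert(nested, o -> o instanceof Integer i ? "i" + i : o));
    }
}
```

In the real code, the leaf case corresponds to `ConverterUtils.ORCReaderFunctions`, and Struct/Union cases carry extra schema information alongside the value.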

Example 23: getJavaObject

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public Object getJavaObject(OrcStruct struct) throws UnexpectedTypeException {
  return null;
}
 

Developer: HotelsDotCom, Project: corc, Lines: 5, Source: ValueMarshaller.java

Example 24: getWritableObject

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public Object getWritableObject(OrcStruct struct) {
  return null;
}
 

Developer: HotelsDotCom, Project: corc, Lines: 5, Source: ValueMarshaller.java

Example 25: setWritableObject

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public void setWritableObject(OrcStruct struct, Object javaObject) throws UnexpectedTypeException {
  // do nothing
}
 

Developer: HotelsDotCom, Project: corc, Lines: 5, Source: ValueMarshaller.java

Example 26: getWritableObject

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@Override
public Object getWritableObject(OrcStruct struct) {
  return inspector.getStructFieldData(struct, structField);
}
 

Developer: HotelsDotCom, Project: corc, Lines: 5, Source: ValueMarshallerImpl.java

Example 27: getOrcStruct

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
public OrcStruct getOrcStruct() {
  return struct;
}
 

Developer: HotelsDotCom, Project: corc, Lines: 4, Source: Corc.java

Example 28: compareData

Votes: 2

import org.apache.hadoop.hive.ql.io.orc.OrcStruct; // import the required package/class
@SuppressWarnings("rawtypes")
private void compareData(Object expected, Object actual) {
    if (expected instanceof Text) {
        assertEquals(String.class, actual.getClass());
        assertEquals(expected.toString(), actual);
    } else if (expected instanceof ShortWritable) {
        assertEquals(Integer.class, actual.getClass());
        assertEquals((int)((ShortWritable) expected).get(), actual);
    } else if (expected instanceof IntWritable) {
        assertEquals(Integer.class, actual.getClass());
        assertEquals(((IntWritable) expected).get(), actual);
    } else if (expected instanceof LongWritable) {
        assertEquals(Long.class, actual.getClass());
        assertEquals(((LongWritable) expected).get(), actual);
    } else if (expected instanceof FloatWritable) {
        assertEquals(Float.class, actual.getClass());
        assertEquals(((FloatWritable) expected).get(), actual);
    } else if (expected instanceof HiveDecimalWritable) {
        assertEquals(BigDecimal.class, actual.getClass());
        assertEquals(((HiveDecimalWritable) expected).toString(), actual.toString());
    } else if (expected instanceof DoubleWritable) {
        assertEquals(Double.class, actual.getClass());
        assertEquals(((DoubleWritable) expected).get(), actual);
    } else if (expected instanceof BooleanWritable) {
        assertEquals(Boolean.class, actual.getClass());
        assertEquals(((BooleanWritable) expected).get(), actual);
    } else if (expected instanceof TimestampWritable) {
        assertEquals(DateTime.class, actual.getClass());
        assertEquals(((TimestampWritable) expected).getTimestamp().getTime(),
                ((DateTime) actual).getMillis());
    } else if (expected instanceof BytesWritable) {
        assertEquals(DataByteArray.class, actual.getClass());
        BytesWritable bw = (BytesWritable) expected;
        assertEquals(new DataByteArray(bw.getBytes(), 0, bw.getLength()), actual);
    } else if (expected instanceof ByteWritable) {
        assertEquals(Integer.class, actual.getClass());
        assertEquals((int) ((ByteWritable) expected).get(), actual);
    } else if (expected instanceof OrcStruct) {
        assertEquals(BinSedesTuple.class, actual.getClass());
        // TODO: compare actual values. No getters in OrcStruct
    } else if (expected instanceof ArrayList) {
        assertEquals(DefaultDataBag.class, actual.getClass());
        // TODO: compare actual values. No getters in OrcStruct
    } else if (expected instanceof HashMap) {
        assertEquals(HashMap.class, actual.getClass());
        assertEquals(((HashMap) expected).size(), ((HashMap) actual).size());
        // TODO: compare actual values. No getters in OrcStruct
    } else if (expected == null) {
        assertEquals(expected, actual);
    } else {
        Assert.fail("Unknown object type: " + expected.getClass().getName());
    }
}
 

Developer: sigmoidanalytics, Project: spork, Lines: 54, Source: TestOrcStorage.java
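The Writable-to-Java comparisons in Example 28 amount to an instanceof dispatch that also widens small numeric types (short and byte become Integer). A dependency-free sketch of that dispatch, using hypothetical wrapper records in place of Hadoop's Writable classes:

```java
public class WritableUnwrapSketch {
    // Hypothetical wrapper records standing in for Hadoop's ShortWritable/IntWritable.
    record ShortBox(short value) {}
    record IntBox(int value) {}

    // Mirror compareData's widening rules: short widens to Integer, int stays Integer.
    static Object toJava(Object writable) {
        if (writable == null) return null;
        if (writable instanceof ShortBox s) return (int) s.value();
        if (writable instanceof IntBox i) return i.value();
        throw new IllegalArgumentException("Unknown object type: " + writable.getClass().getName());
    }

    public static void main(String[] args) {
        System.out.println(toJava(new ShortBox((short) 7))); // prints 7
        System.out.println(toJava(new IntBox(42)));          // prints 42
    }
}
```

The final `throw` plays the role of the `Assert.fail("Unknown object type: ...")` fallback in the original test.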

