
Typical usage and code examples of the Java EncodedSeeker class


This article collects typical usage and code examples of the Java class org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker. If you are wondering what the EncodedSeeker class does and how to use it, the curated class code examples below may help.

The EncodedSeeker class belongs to the package org.apache.hadoop.hbase.io.encoding.DataBlockEncoder. In total, 14 code examples of the EncodedSeeker class are shown below, sorted by popularity by default.

Example 1: testSeekWithRandomData

Votes: 3

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekWithRandomData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  ByteArrayOutputStream baosInMemory = new ByteArrayOutputStream();
  DataOutputStream userDataStream = new DataOutputStream(baosInMemory);
  int batchId = numBatchesWritten++;
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE)
                      .build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  generateRandomTestData(kvset, batchId, includesTag, encoder, blkEncodingCtx, userDataStream);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = baosInMemory.toByteArray();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  verifySeeking(seeker, readBuffer, batchId);
}
 

Developer: fengchen8086, Project: ditb, Lines: 23, Source: TestPrefixTreeEncoding.java
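Every example in this article hands the seeker a ByteBuffer that skips the data-block-encoding ID header at the front of the on-disk bytes. The key detail is that ByteBuffer.wrap(array, offset, length) sets position = offset and limit = offset + length without copying. Here is a minimal, self-contained sketch of that pattern; WrapHeaderDemo, skipEncodingId, and the local ID_SIZE constant are illustrative names (ID_SIZE mirrors DataBlockEncoding.ID_SIZE, which is the size of a 2-byte short in HBase), not HBase API:

```java
import java.nio.ByteBuffer;

public class WrapHeaderDemo {
    // Mirrors DataBlockEncoding.ID_SIZE: the encoding ID is a 2-byte short.
    static final int ID_SIZE = 2;

    // Wrap the on-disk bytes so the buffer starts past the encoding ID.
    // No bytes are copied; position = ID_SIZE, limit = array length.
    static ByteBuffer skipEncodingId(byte[] onDiskBytes) {
        return ByteBuffer.wrap(onDiskBytes, ID_SIZE, onDiskBytes.length - ID_SIZE);
    }

    public static void main(String[] args) {
        byte[] block = {0, 6, 10, 20, 30}; // 2-byte encoding ID + 3 payload bytes
        ByteBuffer readBuffer = skipEncodingId(block);
        System.out.println(readBuffer.remaining()); // prints 3
        System.out.println(readBuffer.get());       // prints 10 (first payload byte)
    }
}
```

This is why the tests pass `onDiskBytes.length - DataBlockEncoding.ID_SIZE` as the length: the seeker should never see the ID header, only the encoded payload.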

Example 2: testSeekWithFixedData

Votes: 3

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekWithFixedData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE)
                      .build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  ByteArrayOutputStream baosInMemory = new ByteArrayOutputStream();
  DataOutputStream userDataStream = new DataOutputStream(baosInMemory);
  generateFixedTestData(kvset, batchId, includesTag, encoder, blkEncodingCtx, userDataStream);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = baosInMemory.toByteArray();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  verifySeeking(seeker, readBuffer, batchId);
}
 

Developer: fengchen8086, Project: ditb, Lines: 23, Source: TestPrefixTreeEncoding.java

Example 3: testSeekWithRandomData

Votes: 3

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekWithRandomData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  ByteBuffer dataBuffer = generateRandomTestData(kvset, batchId, includesTag);
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE)
                      .build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  encoder.encodeKeyValues(dataBuffer, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  verifySeeking(seeker, readBuffer, batchId);
}
 

Developer: tenggyut, Project: HIndex, Lines: 22, Source: TestPrefixTreeEncoding.java

Example 4: testSeekWithFixedData

Votes: 3

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekWithFixedData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  ByteBuffer dataBuffer = generateFixedTestData(kvset, batchId, includesTag);
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE)
                      .build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  encoder.encodeKeyValues(dataBuffer, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  verifySeeking(seeker, readBuffer, batchId);
}
 

Developer: tenggyut, Project: HIndex, Lines: 22, Source: TestPrefixTreeEncoding.java

Example 5: testScanWithRandomData

Votes: 3

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testScanWithRandomData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  ByteBuffer dataBuffer = generateRandomTestData(kvset, numBatchesWritten++);
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      Algorithm.NONE, DataBlockEncoding.PREFIX_TREE, new byte[0]);
  encoder.encodeKeyValues(dataBuffer, false, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR, false);
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes,
      DataBlockEncoding.ID_SIZE, onDiskBytes.length
          - DataBlockEncoding.ID_SIZE);
  seeker.setCurrentBuffer(readBuffer);
  KeyValue previousKV = null;
  do{
    KeyValue currentKV = seeker.getKeyValue();
    if (previousKV != null && KeyValue.COMPARATOR.compare(currentKV, previousKV) < 0) {
      dumpInputKVSet();
      fail("Current kv " + currentKV + " is smaller than previous keyvalue "
          + previousKV);
    }
    previousKV = currentKV;
  } while (seeker.next());
}
 

Developer: cloud-software-foundation, Project: c5, Lines: 25, Source: TestPrefixTreeEncoding.java

Example 6: testSeekWithFixedData

Votes: 3

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekWithFixedData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  ByteBuffer dataBuffer = generateFixedTestData(kvset, batchId);
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      Algorithm.NONE, DataBlockEncoding.PREFIX_TREE, new byte[0]);
  encoder.encodeKeyValues(dataBuffer, false, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      false);
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes,
      DataBlockEncoding.ID_SIZE, onDiskBytes.length
          - DataBlockEncoding.ID_SIZE);
  verifySeeking(seeker, readBuffer, batchId);
}
 

Developer: cloud-software-foundation, Project: c5, Lines: 17, Source: TestPrefixTreeEncoding.java

Example 7: testScanWithRandomData

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testScanWithRandomData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  ByteArrayOutputStream baosInMemory = new ByteArrayOutputStream();
  DataOutputStream userDataStream = new DataOutputStream(baosInMemory);
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE)
                      .build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  generateRandomTestData(kvset, numBatchesWritten++, includesTag, encoder, blkEncodingCtx,
      userDataStream);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = baosInMemory.toByteArray();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  seeker.setCurrentBuffer(readBuffer);
  Cell previousKV = null;
  do {
    Cell currentKV = seeker.getKeyValue();
    System.out.println(currentKV);
    if (previousKV != null && KeyValue.COMPARATOR.compare(currentKV, previousKV) < 0) {
      dumpInputKVSet();
      fail("Current kv " + currentKV + " is smaller than previous keyvalue " + previousKV);
    }
    if (!includesTag) {
      assertFalse(currentKV.getTagsLength() > 0);
    } else {
      Assert.assertTrue(currentKV.getTagsLength() > 0);
    }
    previousKV = currentKV;
  } while (seeker.next());
}
 

Developer: fengchen8086, Project: ditb, Lines: 38, Source: TestPrefixTreeEncoding.java

Example 8: verifySeeking

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
private void verifySeeking(EncodedSeeker encodeSeeker,
    ByteBuffer encodedData, int batchId) {
  List<KeyValue> kvList = new ArrayList<KeyValue>();
  for (int i = 0; i < NUM_ROWS_PER_BATCH; ++i) {
    kvList.clear();
    encodeSeeker.setCurrentBuffer(encodedData);
    KeyValue firstOnRow = KeyValueUtil.createFirstOnRow(getRowKey(batchId, i));
    encodeSeeker.seekToKeyInBlock(
        new KeyValue.KeyOnlyKeyValue(firstOnRow.getBuffer(), firstOnRow.getKeyOffset(),
            firstOnRow.getKeyLength()), false);
    boolean hasMoreOfEncodeScanner = encodeSeeker.next();
    CollectionBackedScanner collectionScanner = new CollectionBackedScanner(
        this.kvset);
    boolean hasMoreOfCollectionScanner = collectionScanner.seek(firstOnRow);
    if (hasMoreOfEncodeScanner != hasMoreOfCollectionScanner) {
      dumpInputKVSet();
      fail("Get error result after seeking " + firstOnRow);
    }
    if (hasMoreOfEncodeScanner) {
      if (KeyValue.COMPARATOR.compare(encodeSeeker.getKeyValue(),
          collectionScanner.peek()) != 0) {
        dumpInputKVSet();
        fail("Expected " + collectionScanner.peek() + " actual "
            + encodeSeeker.getKeyValue() + ", after seeking " + firstOnRow);
      }
    }
  }
}
 

Developer: fengchen8086, Project: ditb, Lines: 29, Source: TestPrefixTreeEncoding.java
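The verifySeeking helper above is a differential test: the same seek is run against the EncodedSeeker and against a plain CollectionBackedScanner built from the input KeyValue set, and any divergence in either the "has more" result or the current cell fails the test. The shape of that cross-check can be sketched without HBase, using TreeSet.ceiling as the trusted reference and a linear scan as the implementation under test; all names here (DifferentialSeekDemo, referenceSeek, linearSeek) are illustrative, not HBase API:

```java
import java.util.Arrays;
import java.util.List;
import java.util.TreeSet;

public class DifferentialSeekDemo {
    // Reference implementation: TreeSet.ceiling returns the first element
    // >= key, analogous to CollectionBackedScanner.seek(firstOnRow).
    static Integer referenceSeek(TreeSet<Integer> sorted, int key) {
        return sorted.ceiling(key);
    }

    // Implementation under test: a linear scan over a sorted list, playing
    // the role of EncodedSeeker.seekToKeyInBlock + getKeyValue.
    static Integer linearSeek(List<Integer> sorted, int key) {
        for (int v : sorted) {
            if (v >= key) return v;
        }
        return null; // no more entries, like next() returning false
    }

    public static void main(String[] args) {
        List<Integer> data = Arrays.asList(10, 20, 30, 40);
        TreeSet<Integer> reference = new TreeSet<>(data);
        // Probe every key in range; both seekers must agree everywhere.
        for (int key = 0; key <= 50; key++) {
            Integer expected = referenceSeek(reference, key);
            Integer actual = linearSeek(data, key);
            if (expected == null ? actual != null : !expected.equals(actual)) {
                throw new AssertionError("seekers diverged at key " + key);
            }
        }
        System.out.println("seekers agree");
    }
}
```

The HBase test does the same thing with row keys instead of integers: probing the first key of each row and failing with a dump of the input set on the first disagreement.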

Example 9: testScanWithRandomData

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testScanWithRandomData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  ByteBuffer dataBuffer = generateRandomTestData(kvset, numBatchesWritten++, includesTag);
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE)
                      .build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  encoder.encodeKeyValues(dataBuffer, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  seeker.setCurrentBuffer(readBuffer);
  KeyValue previousKV = null;
  do {
    KeyValue currentKV = seeker.getKeyValue();
    System.out.println(currentKV);
    if (previousKV != null && KeyValue.COMPARATOR.compare(currentKV, previousKV) < 0) {
      dumpInputKVSet();
      fail("Current kv " + currentKV + " is smaller than previous keyvalue " + previousKV);
    }
    if (!includesTag) {
      assertFalse(currentKV.getTagsLength() > 0);
    } else {
      Assert.assertTrue(currentKV.getTagsLength() > 0);
    }
    previousKV = currentKV;
  } while (seeker.next());
}
 

Developer: tenggyut, Project: HIndex, Lines: 36, Source: TestPrefixTreeEncoding.java

Example 10: verifySeeking

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
private void verifySeeking(EncodedSeeker encodeSeeker,
    ByteBuffer encodedData, int batchId) {
  List<KeyValue> kvList = new ArrayList<KeyValue>();
  for (int i = 0; i < NUM_ROWS_PER_BATCH; ++i) {
    kvList.clear();
    encodeSeeker.setCurrentBuffer(encodedData);
    KeyValue firstOnRow = KeyValue.createFirstOnRow(getRowKey(batchId, i));
    encodeSeeker.seekToKeyInBlock(firstOnRow.getBuffer(),
        firstOnRow.getKeyOffset(), firstOnRow.getKeyLength(), false);
    boolean hasMoreOfEncodeScanner = encodeSeeker.next();
    CollectionBackedScanner collectionScanner = new CollectionBackedScanner(
        this.kvset);
    boolean hasMoreOfCollectionScanner = collectionScanner.seek(firstOnRow);
    if (hasMoreOfEncodeScanner != hasMoreOfCollectionScanner) {
      dumpInputKVSet();
      fail("Get error result after seeking " + firstOnRow);
    }
    if (hasMoreOfEncodeScanner) {
      if (KeyValue.COMPARATOR.compare(encodeSeeker.getKeyValue(),
          collectionScanner.peek()) != 0) {
        dumpInputKVSet();
        fail("Expected " + collectionScanner.peek() + " actual "
            + encodeSeeker.getKeyValue() + ", after seeking " + firstOnRow);
      }
    }
  }
}
 

Developer: tenggyut, Project: HIndex, Lines: 28, Source: TestPrefixTreeEncoding.java

Example 11: testSeekWithRandomData

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekWithRandomData() throws Exception {
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  ByteBuffer dataBuffer = generateRandomTestData(kvset, batchId);
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      Algorithm.NONE, DataBlockEncoding.PREFIX_TREE, new byte[0]);
  encoder.encodeKeyValues(dataBuffer, false, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR, false);
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes,
      DataBlockEncoding.ID_SIZE, onDiskBytes.length
          - DataBlockEncoding.ID_SIZE);
  verifySeeking(seeker, readBuffer, batchId);
}
 

Developer: cloud-software-foundation, Project: c5, Lines: 16, Source: TestPrefixTreeEncoding.java

Example 12: testSeekBeforeWithFixedData

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekBeforeWithFixedData() throws Exception {
  formatRowNum = true;
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE).build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  ByteArrayOutputStream baosInMemory = new ByteArrayOutputStream();
  DataOutputStream userDataStream = new DataOutputStream(baosInMemory);
  generateFixedTestData(kvset, batchId, false, includesTag, encoder, blkEncodingCtx,
      userDataStream);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = baosInMemory.toByteArray();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  seeker.setCurrentBuffer(readBuffer);

  // Seek before the first keyvalue;
  KeyValue seekKey = KeyValueUtil.createFirstDeleteFamilyOnRow(getRowKey(batchId, 0), CF_BYTES);
  seeker.seekToKeyInBlock(
      new KeyValue.KeyOnlyKeyValue(seekKey.getBuffer(), seekKey.getKeyOffset(), seekKey
          .getKeyLength()), true);
  assertEquals(null, seeker.getKeyValue());

  // Seek before the middle keyvalue;
  seekKey = KeyValueUtil.createFirstDeleteFamilyOnRow(getRowKey(batchId, NUM_ROWS_PER_BATCH / 3),
      CF_BYTES);
  seeker.seekToKeyInBlock(
      new KeyValue.KeyOnlyKeyValue(seekKey.getBuffer(), seekKey.getKeyOffset(), seekKey
          .getKeyLength()), true);
  assertNotNull(seeker.getKeyValue());
  assertArrayEquals(getRowKey(batchId, NUM_ROWS_PER_BATCH / 3 - 1), seeker.getKeyValue().getRow());

  // Seek before the last keyvalue;
  seekKey = KeyValueUtil.createFirstDeleteFamilyOnRow(Bytes.toBytes("zzzz"), CF_BYTES);
  seeker.seekToKeyInBlock(
      new KeyValue.KeyOnlyKeyValue(seekKey.getBuffer(), seekKey.getKeyOffset(), seekKey
          .getKeyLength()), true);
  assertNotNull(seeker.getKeyValue());
  assertArrayEquals(getRowKey(batchId, NUM_ROWS_PER_BATCH - 1), seeker.getKeyValue().getRow());
}
 

Developer: fengchen8086, Project: ditb, Lines: 48, Source: TestPrefixTreeEncoding.java

Example 13: testSeekBeforeWithFixedData

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekBeforeWithFixedData() throws Exception {
  formatRowNum = true;
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  ByteBuffer dataBuffer = generateFixedTestData(kvset, batchId, false, includesTag);
  HFileContext meta = new HFileContextBuilder()
                      .withHBaseCheckSum(false)
                      .withIncludesMvcc(false)
                      .withIncludesTags(includesTag)
                      .withCompression(Algorithm.NONE).build();
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      DataBlockEncoding.PREFIX_TREE, new byte[0], meta);
  encoder.encodeKeyValues(dataBuffer, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR,
      encoder.newDataBlockDecodingContext(meta));
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes, DataBlockEncoding.ID_SIZE,
      onDiskBytes.length - DataBlockEncoding.ID_SIZE);
  seeker.setCurrentBuffer(readBuffer);

  // Seek before the first keyvalue;
  KeyValue seekKey = KeyValue.createFirstDeleteFamilyOnRow(getRowKey(batchId, 0), CF_BYTES);
  seeker.seekToKeyInBlock(seekKey.getBuffer(), seekKey.getKeyOffset(), seekKey.getKeyLength(),
      true);
  assertEquals(null, seeker.getKeyValue());

  // Seek before the middle keyvalue;
  seekKey = KeyValue.createFirstDeleteFamilyOnRow(getRowKey(batchId, NUM_ROWS_PER_BATCH / 3),
      CF_BYTES);
  seeker.seekToKeyInBlock(seekKey.getBuffer(), seekKey.getKeyOffset(), seekKey.getKeyLength(),
      true);
  assertNotNull(seeker.getKeyValue());
  assertArrayEquals(getRowKey(batchId, NUM_ROWS_PER_BATCH / 3 - 1), seeker.getKeyValue().getRow());

  // Seek before the last keyvalue;
  seekKey = KeyValue.createFirstDeleteFamilyOnRow(Bytes.toBytes("zzzz"), CF_BYTES);
  seeker.seekToKeyInBlock(seekKey.getBuffer(), seekKey.getKeyOffset(), seekKey.getKeyLength(),
      true);
  assertNotNull(seeker.getKeyValue());
  assertArrayEquals(getRowKey(batchId, NUM_ROWS_PER_BATCH - 1), seeker.getKeyValue().getRow());
}
 

Developer: tenggyut, Project: HIndex, Lines: 43, Source: TestPrefixTreeEncoding.java

Example 14: testSeekBeforeWithFixedData

Votes: 2

import org.apache.hadoop.hbase.io.encoding.DataBlockEncoder.EncodedSeeker; // import the required package/class
@Test
public void testSeekBeforeWithFixedData() throws Exception {
  formatRowNum = true;
  PrefixTreeCodec encoder = new PrefixTreeCodec();
  int batchId = numBatchesWritten++;
  ByteBuffer dataBuffer = generateFixedTestData(kvset, batchId, false);
  HFileBlockEncodingContext blkEncodingCtx = new HFileBlockDefaultEncodingContext(
      Algorithm.NONE, DataBlockEncoding.PREFIX_TREE, new byte[0]);
  encoder.encodeKeyValues(dataBuffer, false, blkEncodingCtx);
  EncodedSeeker seeker = encoder.createSeeker(KeyValue.COMPARATOR, false);
  byte[] onDiskBytes = blkEncodingCtx.getOnDiskBytesWithHeader();
  ByteBuffer readBuffer = ByteBuffer.wrap(onDiskBytes,
      DataBlockEncoding.ID_SIZE, onDiskBytes.length
          - DataBlockEncoding.ID_SIZE);
  seeker.setCurrentBuffer(readBuffer);

  // Seek before the first keyvalue;
  KeyValue seekKey = KeyValue.createFirstDeleteFamilyOnRow(
      getRowKey(batchId, 0), CF_BYTES);
  seeker.seekToKeyInBlock(seekKey.getBuffer(), seekKey.getKeyOffset(),
      seekKey.getKeyLength(), true);
  assertEquals(null, seeker.getKeyValue());

  // Seek before the middle keyvalue;
  seekKey = KeyValue.createFirstDeleteFamilyOnRow(
      getRowKey(batchId, NUM_ROWS_PER_BATCH / 3), CF_BYTES);
  seeker.seekToKeyInBlock(seekKey.getBuffer(), seekKey.getKeyOffset(),
      seekKey.getKeyLength(), true);
  assertNotNull(seeker.getKeyValue());
  assertArrayEquals(getRowKey(batchId, NUM_ROWS_PER_BATCH / 3 - 1), seeker
      .getKeyValue().getRow());

  // Seek before the last keyvalue;
  seekKey = KeyValue.createFirstDeleteFamilyOnRow(Bytes.toBytes("zzzz"),
      CF_BYTES);
  seeker.seekToKeyInBlock(seekKey.getBuffer(), seekKey.getKeyOffset(),
      seekKey.getKeyLength(), true);
  assertNotNull(seeker.getKeyValue());
  assertArrayEquals(getRowKey(batchId, NUM_ROWS_PER_BATCH - 1), seeker
      .getKeyValue().getRow());
}
 

Developer: cloud-software-foundation, Project: c5, Lines: 42, Source: TestPrefixTreeEncoding.java

