Typical Usage and Code Examples of the Java BigQueryException Class


This article collects typical usage and code examples of the Java class com.google.cloud.bigquery.BigQueryException. If you have been wondering what BigQueryException is for, how to use it, or where to find real-world examples of it, the curated class examples below should help.

The BigQueryException class belongs to the com.google.cloud.bigquery package. Ten code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your votes help the system recommend better Java code examples.
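Before the examples, here is a minimal sketch of the pattern they all share: call the client, catch BigQueryException, and inspect its HTTP status code and reason string. The dataset name and the handling logic are placeholders, not part of the original examples.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;

public class BigQueryExceptionBasics {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    try {
      bigquery.getDataset("my_dataset_name"); // placeholder dataset name
    } catch (BigQueryException e) {
      // getCode() is the HTTP status code (e.g. 400, 403, 500, 503);
      // getReason() is BigQuery's machine-readable reason string
      // (e.g. "invalid", "quotaExceeded", "rateLimitExceeded").
      System.err.printf("BigQuery call failed: code=%d reason=%s%n",
          e.getCode(), e.getReason());
    }
  }
}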

Example 1: createDataset

Votes: 3

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
/**
 * Example of creating a dataset.
 */
// [TARGET create(DatasetInfo, DatasetOption...)]
// [VARIABLE "my_dataset_name"]
public Dataset createDataset(String datasetName) {
  // [START createDataset]
  Dataset dataset = null;
  DatasetInfo datasetInfo = DatasetInfo.newBuilder(datasetName).build();
  try {
    dataset = bigquery.create(datasetInfo); // the dataset was created
  } catch (BigQueryException e) {
    // the dataset was not created (e.g. it already exists or access was denied)
  }
  // [END createDataset]
  return dataset;
}
 

Author: michael-hll,
Project: BigQueryStudy,
Lines: 19,
Source: BigQuerySnippets.java

Example 2: createJob

Votes: 3

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
/**
 * Example of creating a query job.
 */
// [TARGET create(JobInfo, JobOption...)]
// [VARIABLE "SELECT field FROM my_dataset_name.my_table_name"]
public Job createJob(String query) {
  // [START createJob]
  Job job = null;
  JobConfiguration jobConfiguration = QueryJobConfiguration.of(query);
  JobInfo jobInfo = JobInfo.of(jobConfiguration);
  try {
    job = bigquery.create(jobInfo);
  } catch (BigQueryException e) {
    // the job was not created
  }
  // [END createJob]
  return job;
}
 

Author: michael-hll,
Project: BigQueryStudy,
Lines: 19,
Source: BigQuerySnippets.java
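As a usage note: the returned Job can be polled to completion. A minimal sketch, assuming the standard google-cloud-bigquery client where Job.waitFor() is available:

// usage sketch for the snippet above; createJob is the method from Example 2
Job job = createJob("SELECT field FROM my_dataset_name.my_table_name");
if (job != null) {
  try {
    Job completed = job.waitFor(); // blocks until the job finishes or fails
    if (completed != null && completed.getStatus().getError() == null) {
      // the query job succeeded
    }
  } catch (InterruptedException e) {
    Thread.currentThread().interrupt();
  }
}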

Example 3: isBatchSizeError

Votes: 3

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
/**
 * @param exception the {@link BigQueryException} to check.
 * @return true if this error is an error that can be fixed by retrying with a smaller batch
 *         size, or false otherwise.
 */
private static boolean isBatchSizeError(BigQueryException exception) {
  if (exception.getCode() == BAD_REQUEST_CODE
      && exception.getError() == null
      && exception.getReason() == null) {
    /*
     * A 400 with no error or reason represents a request larger than 10MB. This is not
     * documented, but is mentioned briefly under "Error codes" here:
     * https://cloud.google.com/bigquery/quota-policy
     * (by decreasing the batch size we can eventually expect to end up with a request under 10MB)
     */
    return true;
  } else if (exception.getCode() == BAD_REQUEST_CODE
             && INVALID_REASON.equals(exception.getReason())) {
    /*
     * This is the error that the documentation claims Google will return if a request
     * exceeds 10MB, if that actually ever happens.
     * todo: distinguish this from other "invalid" errors (like an invalid table schema).
     */
    return true;
  }
  return false;
}
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 28,
Source: TableWriter.java
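To see why this check matters, here is a hypothetical sketch (not the actual TableWriter code) of a write loop that halves its batch size whenever isBatchSizeError() reports a too-large request. writeBatch() is an assumed helper that ultimately calls bigquery.insertAll(...):

// hypothetical caller; writeBatch() and the halving policy are illustrative
private void writeAll(List<InsertAllRequest.RowToInsert> rows) {
  int batchSize = rows.size();
  int offset = 0;
  while (offset < rows.size()) {
    List<InsertAllRequest.RowToInsert> batch =
        rows.subList(offset, Math.min(offset + batchSize, rows.size()));
    try {
      writeBatch(batch);    // assumed helper wrapping bigquery.insertAll(...)
      offset += batch.size();
    } catch (BigQueryException e) {
      if (isBatchSizeError(e) && batchSize > 1) {
        batchSize /= 2;     // shrink and retry the same slice
      } else {
        throw e;
      }
    }
  }
}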

Example 4: attemptSchemaUpdate

Votes: 2

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
private void attemptSchemaUpdate(PartitionedTableId tableId, String topic) {
  try {
    schemaManager.updateSchema(tableId.getBaseTableId(), topic);
  } catch (BigQueryException exception) {
    throw new BigQueryConnectException(
        "Failed to update table schema for: " + tableId.getBaseTableId(), exception);
  }
}
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 8,
Source: AdaptiveBigQueryWriter.java

Example 5: testBigQuery5XXRetry

Votes: 2

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
@Test
public void testBigQuery5XXRetry() {
  final String topic = "test_topic";
  final String dataset = "scratch";

  Map<String, String> properties = propertiesFactory.getProperties();
  properties.put(BigQuerySinkTaskConfig.BIGQUERY_RETRY_CONFIG, "3");
  properties.put(BigQuerySinkTaskConfig.BIGQUERY_RETRY_WAIT_CONFIG, "2000");
  properties.put(BigQuerySinkConfig.TOPICS_CONFIG, topic);
  properties.put(BigQuerySinkConfig.DATASETS_CONFIG, String.format(".*=%s", dataset));

  BigQuery bigQuery = mock(BigQuery.class);

  InsertAllResponse insertAllResponse = mock(InsertAllResponse.class);
  when(bigQuery.insertAll(anyObject()))
      .thenThrow(new BigQueryException(500, "mock 500"))
      .thenThrow(new BigQueryException(502, "mock 502"))
      .thenThrow(new BigQueryException(503, "mock 503"))
      .thenReturn(insertAllResponse);
  when(insertAllResponse.hasErrors()).thenReturn(false);

  SinkTaskContext sinkTaskContext = mock(SinkTaskContext.class);

  BigQuerySinkTask testTask = new BigQuerySinkTask(bigQuery, null);
  testTask.initialize(sinkTaskContext);
  testTask.start(properties);
  testTask.put(Collections.singletonList(spoofSinkRecord(topic)));
  testTask.flush(Collections.emptyMap());

  verify(bigQuery, times(4)).insertAll(anyObject());
}
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 32,
Source: BigQuerySinkTaskTest.java

Example 6: testBigQuery403Retry

Votes: 2

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
@Test
public void testBigQuery403Retry() {
  final String topic = "test_topic";
  final String dataset = "scratch";

  Map<String, String> properties = propertiesFactory.getProperties();
  properties.put(BigQuerySinkTaskConfig.BIGQUERY_RETRY_CONFIG, "2");
  properties.put(BigQuerySinkTaskConfig.BIGQUERY_RETRY_WAIT_CONFIG, "2000");
  properties.put(BigQuerySinkConfig.TOPICS_CONFIG, topic);
  properties.put(BigQuerySinkConfig.DATASETS_CONFIG, String.format(".*=%s", dataset));

  BigQuery bigQuery = mock(BigQuery.class);

  InsertAllResponse insertAllResponse = mock(InsertAllResponse.class);
  BigQueryError quotaExceededError = new BigQueryError("quotaExceeded", null, null);
  BigQueryError rateLimitExceededError = new BigQueryError("rateLimitExceeded", null, null);
  when(bigQuery.insertAll(anyObject()))
      .thenThrow(new BigQueryException(403, "mock quota exceeded", quotaExceededError))
      .thenThrow(new BigQueryException(403, "mock rate limit exceeded", rateLimitExceededError))
      .thenReturn(insertAllResponse);
  when(insertAllResponse.hasErrors()).thenReturn(false);

  SinkTaskContext sinkTaskContext = mock(SinkTaskContext.class);

  BigQuerySinkTask testTask = new BigQuerySinkTask(bigQuery, null);
  testTask.initialize(sinkTaskContext);
  testTask.start(properties);
  testTask.put(Collections.singletonList(spoofSinkRecord(topic)));
  testTask.flush(Collections.emptyMap());

  verify(bigQuery, times(3)).insertAll(anyObject());
}
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 33,
Source: BigQuerySinkTaskTest.java

Example 7: testBigQueryRetryExceeded

Votes: 2

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
@Test(expected = BigQueryConnectException.class)
public void testBigQueryRetryExceeded() {
  final String topic = "test_topic";
  final String dataset = "scratch";

  Map<String, String> properties = propertiesFactory.getProperties();
  properties.put(BigQuerySinkTaskConfig.BIGQUERY_RETRY_CONFIG, "1");
  properties.put(BigQuerySinkTaskConfig.BIGQUERY_RETRY_WAIT_CONFIG, "2000");
  properties.put(BigQuerySinkConfig.TOPICS_CONFIG, topic);
  properties.put(BigQuerySinkConfig.DATASETS_CONFIG, String.format(".*=%s", dataset));

  BigQuery bigQuery = mock(BigQuery.class);

  InsertAllResponse insertAllResponse = mock(InsertAllResponse.class);
  BigQueryError quotaExceededError = new BigQueryError("quotaExceeded", null, null);
  when(bigQuery.insertAll(anyObject()))
    .thenThrow(new BigQueryException(403, "mock quota exceeded", quotaExceededError));
  when(insertAllResponse.hasErrors()).thenReturn(false);

  SinkTaskContext sinkTaskContext = mock(SinkTaskContext.class);

  BigQuerySinkTask testTask = new BigQuerySinkTask(bigQuery, null);
  testTask.initialize(sinkTaskContext);
  testTask.start(properties);
  testTask.put(Collections.singletonList(spoofSinkRecord(topic)));
  testTask.flush(Collections.emptyMap());
}
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 28,
Source: BigQuerySinkTaskTest.java

Example 8: isTableMissingSchema

Votes: 2

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
private boolean isTableMissingSchema(BigQueryException e) {
  // If a table is missing a schema, it will raise a BigQueryException that the input is invalid
  // For more information about BigQueryExceptions, see: https://cloud.google.com/bigquery/troubleshooting-errors
  return e.getReason() != null && e.getReason().equalsIgnoreCase("invalid");
}
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 6,
Source: AdaptiveBigQueryWriter.java
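Combined with attemptSchemaUpdate() from Example 4, this predicate enables an "update the schema, then retry" flow. A hedged sketch of how a writer might tie the two together; the single-retry wiring here is illustrative, not the exact AdaptiveBigQueryWriter code:

// illustrative only: retry the write once after updating the schema
try {
  performWriteRequest(tableId, rows, topic);
} catch (BigQueryException e) {
  if (isTableMissingSchema(e)) {
    attemptSchemaUpdate(tableId, topic); // from Example 4
    performWriteRequest(tableId, rows, topic);
  } else {
    throw e;
  }
}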

Example 9: writeRows

Votes: 2

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
/**
 * Write the given rows to the given table, retrying transient backend errors and
 * quota/rate-limit errors up to the configured number of retries.
 * @param table The BigQuery table to write the rows to.
 * @param rows The rows to write.
 * @param topic The Kafka topic that the row data came from.
 * @throws InterruptedException if interrupted.
 */
public void writeRows(PartitionedTableId table,
                      List<InsertAllRequest.RowToInsert> rows,
                      String topic)
    throws BigQueryConnectException, BigQueryException, InterruptedException {
  logger.debug("writing {} row{} to table {}", rows.size(), rows.size() != 1 ? "s" : "", table);

  Exception mostRecentException = null;

  int retryCount = 0;
  do {
    if (retryCount > 0) {
      waitRandomTime();
    }
    try {
      performWriteRequest(table, rows, topic);
      return;
    } catch (BigQueryException err) {
      mostRecentException = err;
      if (err.getCode() == INTERNAL_SERVICE_ERROR
          || err.getCode() == SERVICE_UNAVAILABLE
          || err.getCode() == BAD_GATEWAY) {
        // backend error: https://cloud.google.com/bigquery/troubleshooting-errors
        /* for BAD_GATEWAY: https://cloud.google.com/storage/docs/json_api/v1/status-codes
           todo: this page may be inaccurate for BigQuery, but the message we are getting
           suggests an internal backend error that we should retry, so let's take it at
           face value. */
        logger.warn("BQ backend error: {}, attempting retry", err.getCode());
        retryCount++;
      } else if (err.getCode() == FORBIDDEN
                 && err.getError() != null
                 && QUOTA_EXCEEDED_REASON.equals(err.getReason())) {
        // quota exceeded error
        logger.warn("Quota exceeded for table {}, attempting retry", table);
        retryCount++;
      } else if (err.getCode() == FORBIDDEN
                 && err.getError() != null
                 && RATE_LIMIT_EXCEEDED_REASON.equals(err.getReason())) {
        // rate limit exceeded error
        logger.warn("Rate limit exceeded for table {}, attempting retry", table);
        retryCount++;
      } else {
        throw err;
      }
    }
  } while (retryCount <= retries);
  throw new BigQueryConnectException(
      String.format("Exceeded configured %d attempts for write request", retries),
      mostRecentException);
}
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 56,
Source: BigQueryWriter.java
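The waitRandomTime() helper is referenced but not shown above. A plausible sketch is a fixed wait plus random jitter; this is an assumption about its shape, and the real implementation may differ:

// assumed shape of the backoff helper; retryWaitMs mirrors BIGQUERY_RETRY_WAIT_CONFIG
private void waitRandomTime() throws InterruptedException {
  long jitterMs = java.util.concurrent.ThreadLocalRandom.current().nextLong(1000);
  Thread.sleep(retryWaitMs + jitterMs); // base wait plus up to 1s of jitter
}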

Example 10: performWriteRequest

Votes: 1

import com.google.cloud.bigquery.BigQueryException; // import the required package/class
/**
 * Handle the actual transmission of the write request to BigQuery, including any exceptions or
 * errors that happen as a result.
 * @param tableId The PartitionedTableId.
 * @param rows The rows to write.
 * @param topic The Kafka topic that the row data came from.
 */
protected abstract void performWriteRequest(PartitionedTableId tableId,
                                            List<InsertAllRequest.RowToInsert> rows,
                                            String topic)
    throws BigQueryException, BigQueryConnectException;
 

Author: wepay,
Project: kafka-connect-bigquery,
Lines: 12,
Source: BigQueryWriter.java
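A minimal concrete override might build an InsertAllRequest, call insertAll, and surface per-row errors. This sketch assumes PartitionedTableId exposes getFullTableId() and that BigQueryConnectException accepts a message string; it is illustrative, not the actual subclass code:

// illustrative subclass body; error handling is deliberately minimal
@Override
protected void performWriteRequest(PartitionedTableId tableId,
                                    List<InsertAllRequest.RowToInsert> rows,
                                    String topic) {
  InsertAllRequest request =
      InsertAllRequest.newBuilder(tableId.getFullTableId(), rows).build();
  InsertAllResponse response = bigQuery.insertAll(request);
  if (response.hasErrors()) {
    throw new BigQueryConnectException(
        "insertAll reported row errors: " + response.getInsertErrors());
  }
}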

