
Typical Usage and Code Examples of the Java JoinOperatorBase Class


This article collects typical usage examples of the Java class org.apache.flink.api.common.operators.base.JoinOperatorBase. If you are wondering what JoinOperatorBase is for, how to use it, or what real-world usage looks like, the curated code examples below should help.

The JoinOperatorBase class belongs to the org.apache.flink.api.common.operators.base package. A total of 29 code examples of the class are shown below, sorted by popularity by default.
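Before diving into the examples, here is a minimal, self-contained sketch of the usage pattern that recurs throughout them: a JoinOperatorBase.JoinHint is passed to the DataSet join API to suggest an execution strategy to the Flink optimizer (for example OPTIMIZER_CHOOSES, REPARTITION_SORT_MERGE, REPARTITION_HASH_SECOND or BROADCAST_HASH_SECOND, all of which appear below). The class name JoinHintSketch and the toy tuple data are hypothetical and not taken from any of the quoted projects.

import org.apache.flink.api.common.operators.base.JoinOperatorBase;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;

public class JoinHintSketch {

	public static void main(String[] args) throws Exception {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

		// Two small, hypothetical data sets keyed by their first field
		DataSet<Tuple2<Integer, String>> left = env.fromElements(
				Tuple2.of(1, "a"), Tuple2.of(2, "b"));
		DataSet<Tuple2<Integer, String>> right = env.fromElements(
				Tuple2.of(1, "x"), Tuple2.of(3, "y"));

		// The JoinHint is the second argument of join(); it only hints at the
		// execution strategy, the join semantics stay the same.
		left.join(right, JoinOperatorBase.JoinHint.REPARTITION_SORT_MERGE)
				.where(0)
				.equalTo(0)
				.print();
	}
}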

Example 1: executeTaskWithGenerator (3 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
private void executeTaskWithGenerator(
		JoinFunction<Tuple2<Integer, Integer>, Tuple2<Integer, Integer>, Tuple2<Integer, Integer>> joiner,
		int keys, int vals, int msecsTillCanceling, int maxTimeTillCanceled) throws Exception {
	UniformIntTupleGenerator g = new UniformIntTupleGenerator(keys, vals, false);
	ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
	DataSet<Tuple2<Integer, Integer>> input1 = env.createInput(new UniformIntTupleGeneratorInputFormat(keys, vals));
	DataSet<Tuple2<Integer, Integer>> input2 = env.createInput(new UniformIntTupleGeneratorInputFormat(keys, vals));

	input1.join(input2, JoinOperatorBase.JoinHint.REPARTITION_SORT_MERGE)
			.where(0)
			.equalTo(0)
			.with(joiner)
			.output(new DiscardingOutputFormat<Tuple2<Integer, Integer>>());

	env.setParallelism(parallelism);

	runAndCancelJob(env.createProgramPlan(), msecsTillCanceling, maxTimeTillCanceled);
}
 

Developer: axbaretto, Project: flink, Lines: 19, Source: JoinCancelingITCase.java

Example 2: ExpandEmbeddings (3 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * New Expand One Operator
 *
 * @param input the embedding which should be expanded
 * @param candidateEdges candidate edges along which we expand
 * @param expandColumn specifies the input column that represents the vertex from which we expand
 * @param lowerBound specifies the minimum hops we want to expand
 * @param upperBound specifies the maximum hops we want to expand
 * @param direction direction of the expansion {@link ExpandDirection}
 * @param distinctVertexColumns indices of distinct input vertex columns
 * @param distinctEdgeColumns indices of distinct input edge columns
 * @param closingColumn defines the column which should be equal with the paths end
 * @param joinHint join strategy
 */
public ExpandEmbeddings(DataSet<Embedding> input, DataSet<Embedding> candidateEdges,
  int expandColumn, int lowerBound, int upperBound, ExpandDirection direction,
  List<Integer> distinctVertexColumns, List<Integer> distinctEdgeColumns, int closingColumn,
  JoinOperatorBase.JoinHint joinHint) {

  this.input = input;
  this.candidateEdges = candidateEdges;
  this.expandColumn = expandColumn;
  this.lowerBound = lowerBound;
  this.upperBound = upperBound;
  this.direction = direction;
  this.distinctVertexColumns = distinctVertexColumns;
  this.distinctEdgeColumns = distinctEdgeColumns;
  this.closingColumn = closingColumn;
  this.joinHint = joinHint;
  this.setName("ExpandEmbeddings");
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 32, Source: ExpandEmbeddings.java

Example 3: JoinEmbeddings (3 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Instantiates a new join operator.
 *
 * @param left embeddings of the left side of the join
 * @param right embeddings of the right side of the join
 * @param rightColumns number of columns in the right side of the join
 * @param leftJoinColumns specifies the join columns of the left side
 * @param rightJoinColumns specifies the join columns of the right side
 * @param distinctVertexColumnsLeft distinct vertex columns of the left embedding
 * @param distinctVertexColumnsRight distinct vertex columns of the right embedding
 * @param distinctEdgeColumnsLeft distinct edge columns of the left embedding
 * @param distinctEdgeColumnsRight distinct edge columns of the right embedding
 * @param joinHint join strategy
 */
public JoinEmbeddings(DataSet<Embedding> left, DataSet<Embedding> right,
  int rightColumns,
  List<Integer> leftJoinColumns, List<Integer> rightJoinColumns,
  List<Integer> distinctVertexColumnsLeft, List<Integer> distinctVertexColumnsRight,
  List<Integer> distinctEdgeColumnsLeft, List<Integer> distinctEdgeColumnsRight,
  JoinOperatorBase.JoinHint joinHint) {
  this.left                       = left;
  this.right                      = right;
  this.rightColumns               = rightColumns;
  this.leftJoinColumns            = leftJoinColumns;
  this.rightJoinColumns           = rightJoinColumns;
  this.distinctVertexColumnsLeft  = distinctVertexColumnsLeft;
  this.distinctVertexColumnsRight = distinctVertexColumnsRight;
  this.distinctEdgeColumnsLeft    = distinctEdgeColumnsLeft;
  this.distinctEdgeColumnsRight   = distinctEdgeColumnsRight;
  this.joinHint                   = joinHint;
  this.setName("JoinEmbeddings");
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 33, Source: JoinEmbeddings.java

Example 4: ValueJoin (3 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * New value equi join operator
 *
 * @param left left hand side data set
 * @param right right hand side data set
 * @param leftJoinProperties join criteria
 * @param rightJoinProperties join criteria
 * @param rightColumns size of the right embedding
 */
public ValueJoin(DataSet<Embedding> left, DataSet<Embedding> right,
  List<Integer> leftJoinProperties, List<Integer> rightJoinProperties,
  int rightColumns) {

  this(
    left, right,
    leftJoinProperties,
    rightJoinProperties,
    rightColumns,
    Collections.emptyList(),
    Collections.emptyList(),
    Collections.emptyList(),
    Collections.emptyList(),
    JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES
    );
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 26, Source: ValueJoin.java

Example 5: executeTask (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
private void executeTask(JoinFunction<Tuple2<Integer, Integer>, Tuple2<Integer, Integer>, Tuple2<Integer, Integer>> joiner, boolean slow, int parallelism) throws Exception {
	ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
	DataSet<Tuple2<Integer, Integer>> input1 = env.createInput(new InfiniteIntegerTupleInputFormat(slow));
	DataSet<Tuple2<Integer, Integer>> input2 = env.createInput(new InfiniteIntegerTupleInputFormat(slow));

	input1.join(input2, JoinOperatorBase.JoinHint.REPARTITION_SORT_MERGE)
			.where(0)
			.equalTo(0)
			.with(joiner)
			.output(new DiscardingOutputFormat<Tuple2<Integer, Integer>>());

	env.setParallelism(parallelism);

	runAndCancelJob(env.createProgramPlan(), 5 * 1000, 10 * 1000);
}
 

Developer: axbaretto, Project: flink, Lines: 16, Source: JoinCancelingITCase.java

Example 6: build (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
@SuppressWarnings("unchecked")
public JoinOperatorBase<?, ?, OUT, ?> build() {
	JoinOperatorBase<?, ?, OUT, ?> operator;
	if (joinType.isOuter()) {
		operator = new OuterJoinOperatorBase<>(
				udf,
				new BinaryOperatorInformation(input1Type, input2Type, resultType),
				this.keys1.computeLogicalKeyPositions(),
				this.keys2.computeLogicalKeyPositions(),
				this.name,
				getOuterJoinType());
	} else {
		operator = new InnerJoinOperatorBase<>(
				udf,
				new BinaryOperatorInformation(input1Type, input2Type, resultType),
				this.keys1.computeLogicalKeyPositions(),
				this.keys2.computeLogicalKeyPositions(),
				this.name);
	}

	operator.setFirstInput(input1);
	operator.setSecondInput(input2);
	operator.setParallelism(parallelism);
	operator.setCustomPartitioner(partitioner);
	operator.setJoinHint(joinHint);
	return operator;
}
 

Developer: axbaretto, Project: flink, Lines: 28, Source: JoinOperator.java

Example 7: SetPairForLoopTraverser (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Creates a new distributed traverser.
 *
 * @param traversalCode describes the traversal through the graph
 * @param vertexCount   number of query vertices
 * @param edgeCount     number of query edges
 * @param keyClazz      needed for embedding initialization
 */
public SetPairForLoopTraverser(TraversalCode traversalCode,
  int vertexCount, int edgeCount, Class<K> keyClazz) {
  this(traversalCode, MatchStrategy.ISOMORPHISM,
    vertexCount, edgeCount,
    keyClazz,
    JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES,
    JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES,
    null, null); // debug mappings
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 18, Source: SetPairForLoopTraverser.java

Example 8: SetPairTraverser (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Creates a new distributed traverser.
 *
 * @param traversalCode          describes the graph traversal
 * @param matchStrategy          matching strategy for vertices and edges
 * @param vertexCount            number of query vertices
 * @param edgeCount              number of query edges
 * @param keyClazz               key type for embedding initialization
 * @param edgeStepJoinStrategy   Join strategy for edge extension
 * @param vertexStepJoinStrategy Join strategy for vertex extension
 * @param vertexMapping          used for debug
 * @param edgeMapping            used for debug
 */
SetPairTraverser(TraversalCode traversalCode, MatchStrategy matchStrategy, int vertexCount,
  int edgeCount, Class<K> keyClazz, JoinOperatorBase.JoinHint edgeStepJoinStrategy,
  JoinOperatorBase.JoinHint vertexStepJoinStrategy,
  DataSet<Tuple2<K, PropertyValue>> vertexMapping,
  DataSet<Tuple2<K, PropertyValue>> edgeMapping) {
  super(traversalCode, matchStrategy, vertexCount, edgeCount, keyClazz,
    vertexMapping, edgeMapping);

  Objects.requireNonNull(edgeStepJoinStrategy);
  Objects.requireNonNull(vertexStepJoinStrategy);

  this.edgeStepJoinStrategy = edgeStepJoinStrategy;
  this.vertexStepJoinStrategy = vertexStepJoinStrategy;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 28, Source: SetPairTraverser.java

Example 9: SetPairBulkTraverser (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Creates a new distributed traverser.
 *
 * @param traversalCode describes the traversal through the graph
 * @param vertexCount   number of query vertices
 * @param edgeCount     number of query edges
 * @param keyClazz      needed for embedding initialization
 */
public SetPairBulkTraverser(TraversalCode traversalCode,
  int vertexCount, int edgeCount, Class<K> keyClazz) {
  this(traversalCode, MatchStrategy.ISOMORPHISM,
    vertexCount, edgeCount,
    keyClazz,
    JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES,
    JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES,
    null, null); // debug mappings
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 18, Source: SetPairBulkTraverser.java

Example 10: ExplorativePatternMatching (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Create new operator instance
 *
 * @param query                   GDL query graph
 * @param attachData              true, if original data shall be attached
 *                                to the result
 * @param matchStrategy           match strategy for vertex and edge mappings
 * @param traverserStrategy       iteration strategy for distributed traversal
 * @param traverser               Traverser used for the query graph
 * @param edgeStepJoinStrategy    Join strategy for edge extension
 * @param vertexStepJoinStrategy  Join strategy for vertex extension
 */
private ExplorativePatternMatching(String query, boolean attachData,
  MatchStrategy matchStrategy,
  TraverserStrategy traverserStrategy,
  Traverser traverser,
  JoinOperatorBase.JoinHint edgeStepJoinStrategy,
  JoinOperatorBase.JoinHint vertexStepJoinStrategy) {
  super(query, attachData, LOG);
  this.matchStrategy          = matchStrategy;
  this.traverserStrategy = traverserStrategy;
  this.traverser = traverser;
  this.traverser.setQueryHandler(getQueryHandler());
  this.edgeStepJoinStrategy   = edgeStepJoinStrategy;
  this.vertexStepJoinStrategy = vertexStepJoinStrategy;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 27, Source: ExplorativePatternMatching.java

Example 11: execute (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
@Override
public DataSet<Embedding> execute() {
  ExpandEmbeddings op = new ExpandEmbeddingsBulk(
    getLeftChild().execute(), getRightChild().execute(),
    expandColumn, lowerBound, upperBound, expandDirection,
    getDistinctVertexColumns(getLeftChild().getEmbeddingMetaData()),
    getDistinctEdgeColumns(getLeftChild().getEmbeddingMetaData()),
    closingColumn, JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES);
  op.setName(toString());
  return op.evaluate();
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 12, Source: ExpandEmbeddingsNode.java

Example 12: JoinEmbeddingsNode (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Creates a new node.
 *
 * @param leftChild left input plan node
 * @param rightChild right input plan node
 * @param joinVariables query variables to join the inputs on
 * @param vertexStrategy morphism setting for vertices
 * @param edgeStrategy morphism setting for edges
 * @param joinHint Join hint for the Flink optimizer
 */
public JoinEmbeddingsNode(PlanNode leftChild, PlanNode rightChild,
  List<String> joinVariables,
  MatchStrategy vertexStrategy, MatchStrategy edgeStrategy,
  JoinOperatorBase.JoinHint joinHint) {
  super(leftChild, rightChild);
  this.joinVariables = joinVariables;
  this.vertexStrategy = vertexStrategy;
  this.edgeStrategy = edgeStrategy;
  this.joinHint = joinHint;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 21, Source: JoinEmbeddingsNode.java

Example 13: ValueJoinNode (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Creates a new node.
 *
 * @param leftChild left input plan node
 * @param rightChild right input plan node
 * @param leftJoinProperties properties used for joining on the left side
 * @param rightJoinProperties properties used for joining on the right side
 * @param vertexStrategy morphism setting for vertices
 * @param edgeStrategy morphism setting for edges
 * @param joinHint Join hint for the Flink optimizer
 */
public ValueJoinNode(PlanNode leftChild, PlanNode rightChild,
  List<Pair<String, String>> leftJoinProperties, List<Pair<String, String>> rightJoinProperties,
  MatchStrategy vertexStrategy, MatchStrategy edgeStrategy,
  JoinOperatorBase.JoinHint joinHint) {
  super(leftChild, rightChild);
  this.leftJoinProperties = leftJoinProperties;
  this.rightJoinProperties = rightJoinProperties;
  this.vertexStrategy = vertexStrategy;
  this.edgeStrategy = edgeStrategy;
  this.joinHint = joinHint;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 23, Source: ValueJoinNode.java

Example 14: testJoin (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
@Test
public void testJoin() throws Exception {
  Embedding l = new Embedding();
  l.add(v0, PropertyValue.create("Foobar"));
  l.add(e0, PropertyValue.create(42));
  l.add(v1);
  DataSet<Embedding> left = getExecutionEnvironment().fromElements(l);

  Embedding r1 = new Embedding();
  r1.add(v2, PropertyValue.create("Foobar"));
  r1.add(e1, PropertyValue.create(21));
  r1.add(v3);
  Embedding r2 = new Embedding();
  r2.add(v2, PropertyValue.create("Baz"));
  r2.add(e1, PropertyValue.create(42));
  r2.add(v3);

  DataSet<Embedding> right = getExecutionEnvironment().fromElements(r1,r2);

  List<Integer> emptyList = Lists.newArrayListWithCapacity(0);

  PhysicalOperator join = new ValueJoin(left, right,
    Lists.newArrayList(0), Lists.newArrayList(0),
    3,
    emptyList, emptyList, emptyList, emptyList,
    JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES
  );

  DataSet<Embedding> result = join.evaluate();
  assertEquals(1, result.count());
  assertEveryEmbedding(result, embedding ->
    embedding.getProperties().equals(Lists.newArrayList(
      PropertyValue.create("Foobar"),
      PropertyValue.create(42),
      PropertyValue.create("Foobar"),
      PropertyValue.create(21)
      )
    ));
  assertEmbeddingExists(result, v0,e0,v1,v2,e1,v3);
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 41, Source: ValueJoinTest.java

Example 15: buildRecommendationSets (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Build recommendation sets from recommendation pairs.
 *
 * TODO Optimize this method for performance (currently bottleneck for evaluations)
 *
 * @param env Execution environment
 * @param recommendations Data set of recommendation pairs
 * @param topK Number of top recommendations included in set.
 * @param cpiExpr Complete complex CPI score (optional)
 * @param articleStatsFilename Path to article stats data set (if complex CPI enabled)
 * @return Recommendation set with top-k recommendations
 * @throws Exception If complex CPI failed
 */
public static DataSet<RecommendationSet> buildRecommendationSets(ExecutionEnvironment env,
                                                                 DataSet<Recommendation> recommendations,
                                                                 int topK, String cpiExpr, String articleStatsFilename) throws Exception {

    // Compute complex CPI with expression
    if(cpiExpr != null && articleStatsFilename != null) {
        // TODO redirects?
        DataSet<ArticleStatsTuple> stats = ArticleStats.getArticleStatsFromFile(env, articleStatsFilename);

        // Total articles
        long count = stats.count();

        // TODO JoinHint? Currently using left hybrid build second
        recommendations = recommendations
                .leftOuterJoin(stats, JoinOperatorBase.JoinHint.BROADCAST_HASH_SECOND)
                .where(Recommendation.RECOMMENDATION_TITLE_KEY)
                .equalTo(ArticleStatsTuple.ARTICLE_NAME_KEY)
                .with(new ComputeComplexCPI(count, cpiExpr));
    }

    DataSet<RecommendationSet> recommendationSets = recommendations
            .groupBy(Recommendation.SOURCE_TITLE_KEY) // Using HashPartition Sort on [0; ASC] TODO Maybe use reduce()
            .reduceGroup(new RecommendationSetBuilder(topK));

    return recommendationSets;
}
 

Developer: wikimedia, Project: citolytics, Lines: 40, Source: WikiSimReader.java

Example 16: JoinProjectionTest (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
@Test
public void JoinProjectionTest() {
	try {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
		DataSet<Tuple5<Integer, Long, String, Long, Integer>> tupleDs = env.fromCollection(emptyTupleData, tupleTypeInfo);

		tupleDs.join(tupleDs).where(0).equalTo(0).projectFirst(2, 3).projectSecond(1, 4).types(String.class, Long.class, Long.class, Integer.class).print();

		Plan plan = env.createProgramPlan();

		GenericDataSinkBase<?> sink = plan.getDataSinks().iterator().next();
		JoinOperatorBase<?, ?, ?, ?> projectJoinOperator = ((JoinOperatorBase<?, ?, ?, ?>) sink.getInput());

		DualInputSemanticProperties props = projectJoinOperator.getSemanticProperties();

		assertTrue(props.getForwardedField1(2).size() == 1);
		assertTrue(props.getForwardedField1(3).size() == 1);
		assertTrue(props.getForwardedField2(1).size() == 1);
		assertTrue(props.getForwardedField2(4).size() == 1);
		assertTrue(props.getForwardedField1(2).contains(0));
		assertTrue(props.getForwardedField1(3).contains(1));
		assertTrue(props.getForwardedField2(1).contains(2));
		assertTrue(props.getForwardedField2(4).contains(3));
	} catch (Exception e) {
		System.err.println(e.getMessage());
		e.printStackTrace();
		fail("Exception in test: " + e.getMessage());
	}
}
 

Developer: citlab, Project: vs.msc.ws14, Lines: 30, Source: SemanticPropertiesProjectionTest.java

Example 17: runInternal (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
@Override
public DataSet<Result<K>> runInternal(Graph<K, VV, EV> input)
		throws Exception {
	// u, v, bitmask where u < v
	DataSet<Tuple3<K, K, ByteValue>> filteredByID = input
		.getEdges()
		.map(new OrderByID<>())
			.setParallelism(parallelism)
			.name("Order by ID")
		.groupBy(0, 1)
		.reduceGroup(new ReduceBitmask<>())
			.setParallelism(parallelism)
			.name("Flatten by ID");

	// u, v, (deg(u), deg(v))
	DataSet<Edge<K, Tuple3<EV, Degrees, Degrees>>> pairDegrees = input
		.run(new EdgeDegreesPair<K, VV, EV>()
			.setParallelism(parallelism));

	// u, v, bitmask where deg(u) < deg(v) or (deg(u) == deg(v) and u < v)
	DataSet<Tuple3<K, K, ByteValue>> filteredByDegree = pairDegrees
		.map(new OrderByDegree<>())
			.setParallelism(parallelism)
			.name("Order by degree")
		.groupBy(0, 1)
		.reduceGroup(new ReduceBitmask<>())
			.setParallelism(parallelism)
			.name("Flatten by degree");

	// u, v, w, bitmask where (u, v) and (u, w) are edges in graph
	DataSet<Tuple4<K, K, K, ByteValue>> triplets = filteredByDegree
		.groupBy(0)
		.sortGroup(1, Order.ASCENDING)
		.reduceGroup(new GenerateTriplets<>())
			.name("Generate triplets");

	// u, v, w, bitmask where (u, v), (u, w), and (v, w) are edges in graph
	DataSet<Result<K>> triangles = triplets
		.join(filteredByID, JoinOperatorBase.JoinHint.REPARTITION_HASH_SECOND)
		.where(1, 2)
		.equalTo(0, 1)
		.with(new ProjectTriangles<>())
			.name("Triangle listing");

	if (permuteResults) {
		triangles = triangles
			.flatMap(new PermuteResult<>())
				.name("Permute triangle vertices");
	} else if (sortTriangleVertices.get()) {
		triangles = triangles
			.map(new SortTriangleVertices<>())
				.name("Sort triangle vertices");
	}

	return triangles;
}
 

Developer: axbaretto, Project: flink, Lines: 57, Source: TriangleListing.java

Example 18: runInternal (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
@Override
public DataSet<Result<K>> runInternal(Graph<K, VV, EV> input)
		throws Exception {
	// u, v where u < v
	DataSet<Tuple2<K, K>> filteredByID = input
		.getEdges()
		.flatMap(new FilterByID<>())
			.setParallelism(parallelism)
			.name("Filter by ID");

	// u, v, (edge value, deg(u), deg(v))
	DataSet<Edge<K, Tuple3<EV, LongValue, LongValue>>> pairDegree = input
		.run(new EdgeDegreePair<K, VV, EV>()
			.setParallelism(parallelism));

	// u, v where deg(u) < deg(v) or (deg(u) == deg(v) and u < v)
	DataSet<Tuple2<K, K>> filteredByDegree = pairDegree
		.flatMap(new FilterByDegree<>())
			.setParallelism(parallelism)
			.name("Filter by degree");

	// u, v, w where (u, v) and (u, w) are edges in graph, v < w
	DataSet<Tuple3<K, K, K>> triplets = filteredByDegree
		.groupBy(0)
		.sortGroup(1, Order.ASCENDING)
		.reduceGroup(new GenerateTriplets<>())
			.name("Generate triplets");

	// u, v, w where (u, v), (u, w), and (v, w) are edges in graph, v < w
	DataSet<Result<K>> triangles = triplets
		.join(filteredByID, JoinOperatorBase.JoinHint.REPARTITION_HASH_SECOND)
		.where(1, 2)
		.equalTo(0, 1)
		.with(new ProjectTriangles<>())
			.name("Triangle listing");

	if (permuteResults) {
		triangles = triangles
			.flatMap(new PermuteResult<>())
				.name("Permute triangle vertices");
	} else if (sortTriangleVertices.get()) {
		triangles = triangles
			.map(new SortTriangleVertices<>())
				.name("Sort triangle vertices");
	}

	return triangles;
}
 

Developer: axbaretto, Project: flink, Lines: 49, Source: TriangleListing.java

Example 19: getEdgeStepJoinStrategy (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
private JoinOperatorBase.JoinHint getEdgeStepJoinStrategy() {
  return edgeStepJoinStrategy;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 4, Source: SetPairTraverser.java

Example 20: getVertexStepJoinStrategy (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
private JoinOperatorBase.JoinHint getVertexStepJoinStrategy() {
  return vertexStepJoinStrategy;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 4, Source: SetPairTraverser.java

Example 21: getEdgeStepJoinStrategy (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
JoinOperatorBase.JoinHint getEdgeStepJoinStrategy() {
  return edgeStepJoinStrategy;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 4, Source: TripleTraverser.java

Example 22: getJoinNode (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
private static final MatchNode getJoinNode() {
	return new MatchNode(new JoinOperatorBase<String, String, String, FlatJoinFunction<String, String, String>>(new DummyFlatJoinFunction<String>(), new BinaryOperatorInformation<String, String, String>(BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO), new int[] {1}, new int[] {2}, "join op"));
}
 

Developer: citlab, Project: vs.msc.ws14, Lines: 4, Source: FeedbackPropertiesMatchTest.java

Example 23: testTupleBaseJoiner (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
@Test
public void testTupleBaseJoiner(){
	final FlatJoinFunction<Tuple3<String, Double, Integer>, Tuple2<Integer, String>, Tuple2<Double, String>> joiner =
				new FlatJoinFunction<Tuple3<String, Double, Integer>, Tuple2<Integer, String>, Tuple2<Double, String>>()
	{
		@Override
		public void join(Tuple3<String, Double, Integer> first, Tuple2<Integer, String> second, Collector<Tuple2<Double, String>> out) {
			Tuple3<String, Double, Integer> fst = (Tuple3<String, Double, Integer>)first;
			Tuple2<Integer, String> snd = (Tuple2<Integer, String>)second;

			assertEquals(fst.f0, snd.f1);
			assertEquals(fst.f2, snd.f0);

			out.collect(new Tuple2<Double, String>(fst.f1, snd.f0.toString()));
		}
	};

	final TupleTypeInfo<Tuple3<String, Double, Integer>> leftTypeInfo = TupleTypeInfo.getBasicTupleTypeInfo
			(String.class, Double.class, Integer.class);
	final TupleTypeInfo<Tuple2<Integer, String>> rightTypeInfo = TupleTypeInfo.getBasicTupleTypeInfo(Integer.class,
			String.class);
	final TupleTypeInfo<Tuple2<Double, String>> outTypeInfo = TupleTypeInfo.getBasicTupleTypeInfo(Double.class,
			String.class);

	final int[] leftKeys = new int[]{0,2};
	final int[] rightKeys = new int[]{1,0};

	final String taskName = "Collection based tuple joiner";

	final BinaryOperatorInformation<Tuple3<String, Double, Integer>, Tuple2<Integer, String>, Tuple2<Double,
			String>> binaryOpInfo = new BinaryOperatorInformation<Tuple3<String, Double, Integer>, Tuple2<Integer,
			String>, Tuple2<Double, String>>(leftTypeInfo, rightTypeInfo, outTypeInfo);

	final JoinOperatorBase<Tuple3<String, Double, Integer>, Tuple2<Integer,
			String>, Tuple2<Double, String>, FlatJoinFunction<Tuple3<String, Double, Integer>, Tuple2<Integer,
			String>, Tuple2<Double, String>>> base = new JoinOperatorBase<Tuple3<String, Double, Integer>,
			Tuple2<Integer, String>, Tuple2<Double, String>, FlatJoinFunction<Tuple3<String, Double, Integer>,
			Tuple2<Integer, String>, Tuple2<Double, String>>>(joiner, binaryOpInfo, leftKeys, rightKeys, taskName);

	final List<Tuple3<String, Double, Integer> > inputData1 = new ArrayList<Tuple3<String, Double,
			Integer>>(Arrays.asList(
			new Tuple3<String, Double, Integer>("foo", 42.0, 1),
			new Tuple3<String,Double, Integer>("bar", 1.0, 2),
			new Tuple3<String, Double, Integer>("bar", 2.0, 3),
			new Tuple3<String, Double, Integer>("foobar", 3.0, 4),
			new Tuple3<String, Double, Integer>("bar", 3.0, 3)
	));

	final List<Tuple2<Integer, String>> inputData2 = new ArrayList<Tuple2<Integer, String>>(Arrays.asList(
			new Tuple2<Integer, String>(3, "bar"),
			new Tuple2<Integer, String>(4, "foobar"),
			new Tuple2<Integer, String>(2, "foo")
	));
	final Set<Tuple2<Double, String>> expected = new HashSet<Tuple2<Double, String>>(Arrays.asList(
			new Tuple2<Double, String>(2.0, "3"),
			new Tuple2<Double, String>(3.0, "3"),
			new Tuple2<Double, String>(3.0, "4")
	));

	try {
		List<Tuple2<Double, String>> resultSafe = base.executeOnCollections(inputData1, inputData2, new RuntimeUDFContext("op", 1, 0, null), true);
		List<Tuple2<Double, String>> resultRegular = base.executeOnCollections(inputData1, inputData2, new RuntimeUDFContext("op", 1, 0, null), false);

		assertEquals(expected, new HashSet<Tuple2<Double, String>>(resultSafe));
		assertEquals(expected, new HashSet<Tuple2<Double, String>>(resultRegular));
	}
	catch (Exception e) {
		e.printStackTrace();
		fail(e.getMessage());
	}
}
 

Developer: citlab, Project: vs.msc.ws14, Lines: 72, Source: JoinOperatorBaseTest.java

Example 24: getContractClassShouldReturnMatchForMatchStub (2 votes)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Test {@link OperatorUtil#getContractClass(Class)}
 */
@Test
public void getContractClassShouldReturnMatchForMatchStub() {
	final Class<?> result = OperatorUtil.getContractClass(Matcher.class);
	assertEquals(JoinOperatorBase.class, result);
}
 

Developer: citlab, Project: vs.msc.ws14, Lines: 9, Source: OperatorUtilTest.java

Example 25: TripleTraverser (1 vote)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Creates a new distributed traverser.
 *
 * @param traversalCode        describes the graph traversal
 * @param matchStrategy        matching strategy for vertices and edges
 * @param vertexCount          number of query vertices
 * @param edgeCount            number of query edges
 * @param keyClazz             key type for embedding initialization
 * @param edgeStepJoinStrategy Join strategy for edge extension
 * @param vertexMapping        used for debug
 * @param edgeMapping          used for debug
 */
TripleTraverser(TraversalCode traversalCode, MatchStrategy matchStrategy, int vertexCount,
  int edgeCount, Class<K> keyClazz,
  JoinOperatorBase.JoinHint edgeStepJoinStrategy,
  DataSet<Tuple2<K, PropertyValue>> vertexMapping,
  DataSet<Tuple2<K, PropertyValue>> edgeMapping) {
  super(traversalCode, matchStrategy, vertexCount, edgeCount, keyClazz, vertexMapping,
    edgeMapping);

  Objects.requireNonNull(edgeStepJoinStrategy);
  this.edgeStepJoinStrategy = edgeStepJoinStrategy;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 24, Source: TripleTraverser.java

Example 26: TripleForLoopTraverser (1 vote)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Creates a new distributed traverser.
 *
 * @param traversalCode describes the graph traversal
 * @param vertexCount   number of query vertices
 * @param edgeCount     number of query edges
 * @param keyClazz      key type for embedding initialization
 */
public TripleForLoopTraverser(TraversalCode traversalCode,
  int vertexCount, int edgeCount, Class<K> keyClazz) {
  this(traversalCode, MatchStrategy.ISOMORPHISM,
    vertexCount, edgeCount, keyClazz,
    JoinOperatorBase.JoinHint.OPTIMIZER_CHOOSES, null, null);
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 15, Source: TripleForLoopTraverser.java

Example 27: ExpandEmbeddingsBulk (1 vote)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * New Expand One Operator
 *
 * @param input the embedding which should be expanded
 * @param candidateEdges candidate edges along which we expand
 * @param expandColumn specifies the input column that represents the vertex from which we expand
 * @param lowerBound specifies the minimum hops we want to expand
 * @param upperBound specifies the maximum hops we want to expand
 * @param direction direction of the expansion {@link ExpandDirection}
 * @param distinctVertexColumns indices of distinct input vertex columns
 * @param distinctEdgeColumns indices of distinct input edge columns
 * @param closingColumn defines the column which should be equal with the paths end
 * @param joinHint join strategy
 */
public ExpandEmbeddingsBulk(DataSet<Embedding> input, DataSet<Embedding> candidateEdges,
  int expandColumn, int lowerBound, int upperBound, ExpandDirection direction,
  List<Integer> distinctVertexColumns, List<Integer> distinctEdgeColumns, int closingColumn,
  JoinOperatorBase.JoinHint joinHint) {

  super(input, candidateEdges, expandColumn, lowerBound, upperBound, direction,
    distinctVertexColumns, distinctEdgeColumns, closingColumn, joinHint);
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 23, Source: ExpandEmbeddingsBulk.java

Example 28: ExpandEmbeddingsForLoop (1 vote)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * New Expand One Operator
 *
 * @param input the embedding which should be expanded
 * @param candidateEdges candidate edges along which we expand
 * @param expandColumn specifies the input column that represents the vertex from which we expand
 * @param lowerBound specifies the minimum hops we want to expand
 * @param upperBound specifies the maximum hops we want to expand
 * @param direction direction of the expansion {@link ExpandDirection}
 * @param distinctVertexColumns indices of distinct input vertex columns
 * @param distinctEdgeColumns indices of distinct input edge columns
 * @param closingColumn defines the column which should be equal with the paths end
 * @param joinHint join strategy
 */
public ExpandEmbeddingsForLoop(DataSet<Embedding> input, DataSet<Embedding> candidateEdges,
  int expandColumn, int lowerBound, int upperBound, ExpandDirection direction,
  List<Integer> distinctVertexColumns, List<Integer> distinctEdgeColumns, int closingColumn,
  JoinOperatorBase.JoinHint joinHint) {

  super(input, candidateEdges, expandColumn, lowerBound, upperBound, direction,
    distinctVertexColumns, distinctEdgeColumns, closingColumn, joinHint);
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 23, Source: ExpandEmbeddingsForLoop.java

Example 29: setEdgeStepJoinStrategy (1 vote)

import org.apache.flink.api.common.operators.base.JoinOperatorBase; // import the required package/class
/**
 * Sets the join strategy for joining edges during traversal.
 *
 * @param edgeStepJoinStrategy join strategy
 * @return modified builder
 */
public Builder setEdgeStepJoinStrategy(
  JoinOperatorBase.JoinHint edgeStepJoinStrategy) {
  this.edgeStepJoinStrategy = edgeStepJoinStrategy;
  return this;
}
 

Developer: dbs-leipzig, Project: gradoop, Lines: 12, Source: ExplorativePatternMatching.java

