new branch for merging lsm and fullstack

git-svn-id: https://hyracks.googlecode.com/svn/branches/fullstack_lsm_staging@3014 123451ca-8445-de46-9d55-352943316053

  … 1792 more files in changeset.
merged hyracks_asterix_stabilization r1634:1651

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_lsm_tree@1657 123451ca-8445-de46-9d55-352943316053

  … 149 more files in changeset.
merged hyracks_asterix_stabilization r1440:1453

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_lsm_tree@1488 123451ca-8445-de46-9d55-352943316053

  … 456 more files in changeset.
Integrated more native MR operators and a Shuffle Connector

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_dev_next@1244 123451ca-8445-de46-9d55-352943316053

    +268 -0  ./hyracks/dataflow/hadoop/mapreduce/HadoopHelper.java
    +33  -0  ./hyracks/dataflow/hadoop/mapreduce/HadoopTools.java
    +23  -0  ./hyracks/dataflow/hadoop/mapreduce/IInputSplitProvider.java
    +109 -0  ./hyracks/dataflow/hadoop/mapreduce/KVIterator.java
    +51  -0  ./hyracks/dataflow/hadoop/mapreduce/MarshalledWritable.java
    +187 -0  ./hyracks/dataflow/hadoop/mapreduce/ReduceWriter.java
    +178 -0  ./hyracks/dataflow/hadoop/mapreduce/ShuffleFrameReader.java
  … 18 more files in changeset.
merge hyracks_dev_next r847:977

git-svn-id: https://hyracks.googlecode.com/svn/branches/aggregators_dev_next@978 123451ca-8445-de46-9d55-352943316053

  … 148 more files in changeset.
Added getFieldCount() call to RecordDescriptor

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_dev_next@881 123451ca-8445-de46-9d55-352943316053

  … 13 more files in changeset.
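The accessor added in the commit above can be sketched as follows. This is a minimal stand-in, not the actual Hyracks RecordDescriptor: the point is that the field count falls out of the per-field serializer/deserializer array the descriptor already holds, so callers no longer reach into that array to learn a record's arity.

```java
// Minimal stand-in for a record descriptor (illustrative, not the real class).
public class RecordDescriptorSketch {
    private final Object[] fieldSerdes; // one serde per field (stand-in type)

    public RecordDescriptorSketch(Object... fieldSerdes) {
        this.fieldSerdes = fieldSerdes;
    }

    // The accessor added in this change: the record's arity is simply
    // the length of the serde array.
    public int getFieldCount() {
        return fieldSerdes.length;
    }

    public static void main(String[] args) {
        RecordDescriptorSketch desc =
                new RecordDescriptorSketch("int", "string", "double");
        System.out.println(desc.getFieldCount()); // 3
    }
}
```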
Cleaned up the createPushRuntime API

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_dev_next@729 123451ca-8445-de46-9d55-352943316053

  … 39 more files in changeset.
Fixed protocol to call fail() on writers when source operators encounter an error.

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_dev_next@550 123451ca-8445-de46-9d55-352943316053

  … 10 more files in changeset.
Added fail() call to IFrameWriter

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_dev_next@549 123451ca-8445-de46-9d55-352943316053

  … 53 more files in changeset.
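The failure protocol described in the two commits above can be sketched as follows. The interface here is a minimal stand-in, not the actual Hyracks IFrameWriter: a source operator that hits an error must call fail() on its downstream writer before close(), so the failure propagates down the pipeline instead of looking like a clean end-of-data.

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class FailProtocolSketch {
    // Stand-in for the writer interface; fail() is the call added in this change.
    interface FrameWriterSketch {
        void open();
        void nextFrame(ByteBuffer frame) throws Exception;
        void fail();
        void close();
    }

    // The protocol: on error, call fail() on the writer, then still close() it.
    static void runSource(Iterable<ByteBuffer> frames, FrameWriterSketch writer) {
        writer.open();
        try {
            for (ByteBuffer frame : frames) {
                writer.nextFrame(frame);
            }
        } catch (Exception e) {
            writer.fail(); // tell the consumer this stream is aborted
        } finally {
            writer.close(); // close() is always called, failed or not
        }
    }

    public static void main(String[] args) {
        StringBuilder calls = new StringBuilder();
        FrameWriterSketch writer = new FrameWriterSketch() {
            public void open() { calls.append("open "); }
            public void nextFrame(ByteBuffer f) throws Exception {
                throw new Exception("simulated source error");
            }
            public void fail() { calls.append("fail "); }
            public void close() { calls.append("close"); }
        };
        runSource(Arrays.asList(ByteBuffer.allocate(4)), writer);
        System.out.println(calls); // open fail close
    }
}
```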
Merged -r 438:524 from trunk into branch

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_indexes@525 123451ca-8445-de46-9d55-352943316053

  … 96 more files in changeset.
Merged r490:491 from trunk

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_scheduling@493 123451ca-8445-de46-9d55-352943316053

  … 54 more files in changeset.
Refactored cluster controller

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_scheduling@492 123451ca-8445-de46-9d55-352943316053

  … 28 more files in changeset.
1) Made changes to pom.xml for hadoopcompatapp so that the working directory is appropriately set for the CC and NCs. 2) Refactored code.

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_hadoop_compat_changes@479 123451ca-8445-de46-9d55-352943316053

  … 5 more files in changeset.
refactored code in HadoopWriteOperatorDescriptor

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_hadoop_compat_changes@475 123451ca-8445-de46-9d55-352943316053

Refactored code in the compatibility layer to support submission of jobs against existing applications, and made minor changes in the Hadoop operators

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_hadoop_compat_changes@460 123451ca-8445-de46-9d55-352943316053

  … 4 more files in changeset.
Partial commit. Code compiles, but not complete

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_scheduling@383 123451ca-8445-de46-9d55-352943316053

  … 209 more files in changeset.
Added new scheduler. Temporarily disabled build of compat layer pending some changes

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_scheduling@299 123451ca-8445-de46-9d55-352943316053

  … 125 more files in changeset.
Merged r289:290 from the hyracks_io_management branch

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@291 123451ca-8445-de46-9d55-352943316053

  … 185 more files in changeset.
Merged 249:266 from trunk

git-svn-id: https://hyracks.googlecode.com/svn/branches/hyracks_storage_cleanup@267 123451ca-8445-de46-9d55-352943316053

  … 1 more file in changeset.
Refactored WritableComparingComparatorFactory to RawComparingComparatorFactory

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@257 123451ca-8445-de46-9d55-352943316053

Merged online_aggregation @186:220

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@221 123451ca-8445-de46-9d55-352943316053

    +62 -0  ./hyracks/dataflow/hadoop/data/HadoopNewPartitionerTuplePartitionComputerFactory.java
  … 33 more files in changeset.
Modified HadoopWriterOperatorDescriptor. The operator previously created sequence files by opening an FSDataOutputStream directly. Although this produces a correct sequence file, it is better to obtain a SequenceFile writer from the outputFormat and let it do the writing.

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@202 123451ca-8445-de46-9d55-352943316053

External Hadoop clients like Pig and Hive use intermediate output formats. Earlier, the write operator opened an FSDataOutputStream and pushed in the bytes. This works fine for HDFS, but not with custom output formats that write to some local storage rather than HDFS. To be compatible with such custom formats, we must get the writer from the format itself. This check-in ensures that the writer writes in the manner defined by the custom output format.

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@191 123451ca-8445-de46-9d55-352943316053

  … 1 more file in changeset.
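The design choice behind the two commits above can be sketched as follows. The types here are stand-ins, not the real Hadoop OutputFormat/RecordWriter API: instead of opening a raw stream and pushing bytes, the write operator asks the job's output format for its record writer, so a custom format that persists somewhere other than HDFS still behaves correctly.

```java
import java.util.ArrayList;
import java.util.List;

public class OutputFormatSketch {
    interface RecordWriterSketch<K, V> {
        void write(K key, V value);
        void close();
    }

    // Stand-in for an OutputFormat: the format, not the operator,
    // decides where and how records are persisted.
    interface OutputFormatStandIn<K, V> {
        RecordWriterSketch<K, V> getRecordWriter();
    }

    // A "custom" format that writes to an in-memory sink rather than HDFS;
    // a raw output-stream path in the operator would bypass it entirely.
    static class InMemoryFormat implements OutputFormatStandIn<String, Integer> {
        final List<String> sink = new ArrayList<>();
        public RecordWriterSketch<String, Integer> getRecordWriter() {
            return new RecordWriterSketch<String, Integer>() {
                public void write(String k, Integer v) { sink.add(k + "=" + v); }
                public void close() { }
            };
        }
    }

    public static void main(String[] args) {
        InMemoryFormat format = new InMemoryFormat();
        RecordWriterSketch<String, Integer> writer = format.getRecordWriter();
        writer.write("a", 1);
        writer.write("b", 2);
        writer.close();
        System.out.println(format.sink); // [a=1, b=2]
    }
}
```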
Fixed issue related to initialization of JobConf in HadoopReducer

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@190 123451ca-8445-de46-9d55-352943316053

Changed WritableComparingComparatorFactory to use RawComparator. Changed FrameTupleAppender to accept IFrameTupleAccessor.

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@189 123451ca-8445-de46-9d55-352943316053

  … 1 more file in changeset.
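The point of moving to a RawComparator-style interface, as in the commit above, is to compare records in their serialized form instead of deserializing both sides first. A minimal sketch (illustrative, not the actual Hyracks factory), using the byte-range compare signature and big-endian nonnegative ints, for which unsigned byte order agrees with numeric order:

```java
public class RawCompareSketch {
    // Compare two serialized values directly in their byte arrays,
    // RawComparator-style: no deserialization, no object allocation.
    // Correct for nonnegative ints serialized big-endian.
    static int compareRaw(byte[] b1, int s1, int l1, byte[] b2, int s2, int l2) {
        int n = Math.min(l1, l2);
        for (int i = 0; i < n; i++) {
            int x = b1[s1 + i] & 0xff; // compare as unsigned bytes
            int y = b2[s2 + i] & 0xff;
            if (x != y) return x - y;
        }
        return l1 - l2;
    }

    public static void main(String[] args) {
        byte[] a = {0, 0, 1, 0}; // 256, big-endian
        byte[] b = {0, 0, 0, 9}; // 9, big-endian
        System.out.println(compareRaw(a, 0, 4, b, 0, 4) > 0); // true: 256 > 9
    }
}
```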
Fixed issue related to initialization of the JobConf instance before calling configure(); used ReflectionUtils to create instances of the mapper and reducer classes.

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@188 123451ca-8445-de46-9d55-352943316053
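The pattern adopted in the commit above can be sketched as follows. This is a stand-in for Hadoop's ReflectionUtils.newInstance, not the real API: instantiate the mapper/reducer via its no-arg constructor, then hand it the job configuration before any processing call, so configure() always sees an initialized configuration.

```java
import java.lang.reflect.Constructor;

public class ReflectionSketch {
    // Stand-in for Hadoop's configure(JobConf) hook.
    interface ConfigurableSketch {
        void configure(String conf);
    }

    // Create an instance reflectively, then configure it before use —
    // the order that the fix above enforces.
    static <T> T newInstance(Class<T> clazz, String conf) throws Exception {
        Constructor<T> ctor = clazz.getDeclaredConstructor();
        ctor.setAccessible(true);
        T obj = ctor.newInstance();
        if (obj instanceof ConfigurableSketch) {
            ((ConfigurableSketch) obj).configure(conf);
        }
        return obj;
    }

    public static class DemoMapper implements ConfigurableSketch {
        String conf;
        public void configure(String conf) { this.conf = conf; }
    }

    public static void main(String[] args) throws Exception {
        DemoMapper m = newInstance(DemoMapper.class, "job.conf");
        System.out.println(m.conf); // job.conf
    }
}
```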

Made changes to support the org.apache.hadoop.mapreduce library in addition to the org.apache.hadoop.mapred library. The new library is used in the Hadoop client community, notably in Pig and Mahout. To be compatible with Hadoop, this change is mandatory.

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@187 123451ca-8445-de46-9d55-352943316053

Hadoop operators currently do not support 'org.apache.hadoop.mapreduce.*' types and hence cannot run MR jobs referencing those types. In order to be compatible, we need to support them. This change adds support for the mapreduce libraries; the changes are spread across all Hadoop operators. The compatibility layer also changes in order to support the mapreduce package.

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@184 123451ca-8445-de46-9d55-352943316053
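Supporting both packages, as the two commits above describe, typically means the operator dispatches on which API flavour the job was written against. A minimal stand-in sketch of that dispatch (hypothetical names, not the actual Hyracks code):

```java
public class DualApiSketch {
    // Stand-ins for the two mapper flavours: the old mapred-style
    // interface and the new mapreduce-style base class.
    interface OldStyleMapper { String map(String record); }
    static abstract class NewStyleMapper { abstract String map(String record); }

    // The operator accepts either flavour and dispatches accordingly,
    // so jobs written against org.apache.hadoop.mapreduce.* still run.
    static String runMapper(Object mapper, String record) {
        if (mapper instanceof NewStyleMapper) {
            return ((NewStyleMapper) mapper).map(record);
        }
        return ((OldStyleMapper) mapper).map(record);
    }

    public static void main(String[] args) {
        OldStyleMapper oldM = r -> r.toUpperCase();
        NewStyleMapper newM = new NewStyleMapper() {
            String map(String r) { return r + "!"; }
        };
        System.out.println(runMapper(oldM, "a") + " " + runMapper(newM, "b"));
    }
}
```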

Modified the HadoopMapper operator to work in two modes: 1) SelfReadMode: the mapper reads its input directly from HDFS instead of receiving it from a separate read operator. 2) DependentMode: the mapper is no longer a source operator, but requires input to be fed by some other operator (e.g. a reducer in the case of chained MR jobs). For operators A and B that connect using a one-to-one connector, A and B can be fused together to form a single operator. The above change makes HadoopReadOperator redundant. It is not being deleted here, as it is a useful operator for reading from HDFS and could be used in other scenarios.

Modified AbstractHadoopReadOperator to take the input arity as a constructor argument. The input arity was earlier assumed to be 1 for Map and Reduce, but is 0 for Map in SelfReadMode.

Modified Reducer to pass the inputArity to the base class constructor

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@176 123451ca-8445-de46-9d55-352943316053
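The arity rule described in the commits above can be sketched as follows (illustrative names, not the actual operator): in SelfReadMode the mapper is a source with input arity 0 and reads HDFS splits itself; in DependentMode it has input arity 1 and is fed by an upstream operator.

```java
public class MapperModeSketch {
    // The two modes the HadoopMapper operator supports after this change.
    enum Mode { SELF_READ, DEPENDENT }

    // A source operator consumes no inputs; a dependent one consumes one.
    static int inputArity(Mode mode) {
        return mode == Mode.SELF_READ ? 0 : 1;
    }

    public static void main(String[] args) {
        System.out.println(inputArity(Mode.SELF_READ) + " "
                + inputArity(Mode.DEPENDENT)); // 0 1
    }
}
```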

removed log messages

git-svn-id: https://hyracks.googlecode.com/svn/trunk/hyracks@165 123451ca-8445-de46-9d55-352943316053