Add ScoobiWritable to Job JAR for inputs.
DLists that span multiple MSCRs are persisted as Sequence Files of
ScoobiWritable objects. The class definitions for these objects are
generated dynamically and must be added to a MapReduce job's "user" JAR.
Fix a bug where the class files for ScoobiWritables were added to the
JAR only when the DList was an output of a MapReduce job, not when it
was an input.
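
For context, the kind of class being generated is a Hadoop Writable that wraps a single DList element type. The hand-written sketch below is an assumption for illustration only; the real classes are emitted dynamically, one per value type, and their names and wire format are Scoobi internals:

    // Illustrative stand-in for a generated ScoobiWritable; NOT the real
    // generated code. Assumes Hadoop's org.apache.hadoop.io.Writable API.
    import java.io.{DataInput, DataOutput}
    import org.apache.hadoop.io.Writable

    // Wraps one (Int, String) DList element so it can be written to and
    // read back from a SequenceFile between MSCRs.
    class ScoobiWritableIntString extends Writable {
      var value: (Int, String) = (0, "")
      def write(out: DataOutput): Unit = { out.writeInt(value._1); out.writeUTF(value._2) }
      def readFields(in: DataInput): Unit = { value = (in.readInt(), in.readUTF()) }
    }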
blever committed Mar 13, 2012
1 parent bd863b2 commit a130816
Showing 1 changed file with 7 additions and 1 deletion.
src/main/scala/com/nicta/scoobi/impl/exec/MapReduceJob.scala
@@ -146,7 +146,13 @@ class MapReduceJob(stepId: Int) {
      *   - generate runtime class (ScoobiWritable) for each input value type and add to JAR (any
      *     mapper for a given input channel can be used as they all have the same input type */
     val inputChannels: List[((DataSource[_,_,_], MSet[TaggedMapper[_,_,_]]), Int)] = mappers.toList.zipWithIndex
-    inputChannels.foreach { case ((source, ms), ix) => ChannelInputFormat.addInputChannel(job, ix, source) }
+    inputChannels.foreach { case ((source, ms), ix) =>
+      ChannelInputFormat.addInputChannel(job, ix, source)
+      source match {
+        case bs@BridgeStore(_) => jar.addRuntimeClass(bs.rtClass.getOrElse(sys.error("Run-time class should be set.")))
+        case _ => {}
+      }
+    }
 
     DistCache.pushObject(
       job.getConfiguration,
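
To make the change's intent concrete, here is a minimal, self-contained model of the fixed loop. All types are simplified stand-ins (the real BridgeStore, DataSource, and JAR builder in Scoobi have different shapes); only the pattern match mirrors the diff above:

    // Simplified stand-ins; NOT the real Scoobi types.
    sealed trait DataSource
    case class BridgeStore(rtClass: Option[Class[_]]) extends DataSource // intermediate DList store
    case class TextSource(path: String) extends DataSource               // ordinary input, no generated class

    class JarBuilder {
      def addRuntimeClass(c: Class[_]): Unit = println(s"adding ${c.getName} to the job's user JAR")
    }

    object AddInputRuntimeClasses {
      // Mirrors the fix: for bridge-store inputs, ship the dynamically
      // generated ScoobiWritable class in the user JAR; other input
      // sources need nothing extra.
      def apply(jar: JarBuilder, inputs: List[DataSource]): Unit =
        inputs.foreach {
          case BridgeStore(rt) => jar.addRuntimeClass(rt.getOrElse(sys.error("Run-time class should be set.")))
          case _               => ()
        }

      def main(args: Array[String]): Unit =
        apply(new JarBuilder, List(BridgeStore(Some(classOf[String])), TextSource("input.txt")))
    }

Before this change the registration happened only on the output side, so a job whose input channel read an intermediate SequenceFile could fail to load the generated ScoobiWritable class at runtime.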
