Details

    • Type: Bug
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Labels: None
    • Attachments: 0
    • Comments: 0

      Description

      The Hadoop configuration file is not rich enough to hold all of the information we need: site URLs, implementation class names (per site), working directory path, class URIs for each site, etc.

      How can we make this as easy and as unified as possible? Can we create our own XML file and parse it at the start of the job? If so, how do we pass the parsed config to the Mapper and Reducer tasks? Serialize it? Copy it to HDFS and have each task re-parse it? Is there an easier way?
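      One possible approach to the "serialize it" option: flatten the parsed config into a single string and store it under one key in the job's Configuration, which the framework already ships to every task; each Mapper/Reducer then decodes it in setup(). The sketch below is self-contained Java (no Hadoop dependency) showing just the round-trip; the key name "myapp.site.config" and the conf.set/conf.get calls mentioned in comments are illustrative assumptions, not part of the original report.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Base64;
import java.util.Properties;

public class ConfigPassing {
    // Serialize the parsed site config to one Base64 string, suitable for
    // storing under a single key in the job Configuration, e.g.
    // conf.set("myapp.site.config", encoded)  -- hypothetical key name.
    static String encode(Properties props) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        props.store(out, null);
        return Base64.getEncoder().encodeToString(out.toByteArray());
    }

    // In a Mapper/Reducer setup() method, recover the config from the
    // string the framework already distributed to the task, e.g.
    // decode(conf.get("myapp.site.config")).
    static Properties decode(String encoded) throws IOException {
        Properties props = new Properties();
        props.load(new ByteArrayInputStream(Base64.getDecoder().decode(encoded)));
        return props;
    }

    public static void main(String[] args) throws IOException {
        Properties site = new Properties();
        site.setProperty("site.url", "http://example.org/sparql");
        site.setProperty("site.impl", "org.example.SiteHarvester");

        String wire = encode(site);          // would be stored in the job Configuration
        Properties restored = decode(wire);  // would run inside each task
        System.out.println(restored.getProperty("site.url"));
    }
}
```

      This avoids a separate HDFS round-trip and re-parse per task, at the cost of keeping the whole config small enough to live comfortably in the job Configuration; for larger per-site class URIs or files, the HDFS/distributed-cache route may still be preferable.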


      People

      • Assignee: Unassigned
      • Reporter: j2blake Jim Blake
      • Votes: 0
      • Watchers: 1


      Time Tracking

      • Original Estimate: 3h
      • Remaining Estimate: 3h
      • Time Spent: Not Specified