Refactor to centralize Job Specification properties for reusability across ingestion jobs
Currently, some of the job specification properties are defined separately in each batch ingestion implementation. For example, the Hadoop job runner defines the job spec constants in its own implementation:
public static final String SEGMENT_GENERATION_JOB_SPEC = "segmentGenerationJobSpec";
// Field names in job spec's executionFrameworkSpec/extraConfigs section
private static final String DEPS_JAR_DIR_FIELD = "dependencyJarDir";
private static final String STAGING_DIR_FIELD = "stagingDir";
Similarly, the Spark ingestion job runner defines the same constants in its own class:
private static final Logger LOGGER = LoggerFactory.getLogger(SparkSegmentGenerationJobRunner.class);
private static final String DEPS_JAR_DIR = "dependencyJarDir";
private static final String STAGING_DIR = "stagingDir";
It would be better to centralize these properties in one common place and reuse them, so the implementations cannot deviate from each other in the future.
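One possible shape for this is a shared constants holder in the common batch ingestion module. This is a minimal sketch; the class name and package are illustrative assumptions, not existing Pinot API, and only the constant names quoted above are taken from the current code:

// Hypothetical shared constants holder; class name and package are illustrative.
package org.apache.pinot.plugin.ingestion.batch.common;

/**
 * Centralized job specification property names shared by the batch
 * ingestion job runners (standalone, Hadoop, Spark).
 */
public final class SegmentGenerationJobSpecConstants {

  private SegmentGenerationJobSpecConstants() {
    // Constants holder, not meant to be instantiated.
  }

  // Key under which the serialized job spec is passed to the framework-specific job.
  public static final String SEGMENT_GENERATION_JOB_SPEC = "segmentGenerationJobSpec";

  // Field names in the job spec's executionFrameworkSpec/extraConfigs section.
  public static final String DEPS_JAR_DIR_FIELD = "dependencyJarDir";
  public static final String STAGING_DIR_FIELD = "stagingDir";
}

The Hadoop and Spark runners could then reference, for example, SegmentGenerationJobSpecConstants.STAGING_DIR_FIELD instead of redeclaring the string literal, so a rename or a newly added property only needs to be made in one place.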