Cloned from launchpad blueprint https://blueprints.launchpad.net/sahara/+spec/edp-spark-job-type.
Spark EDP was initially implemented using the Java job type. However, it would be better to support a dedicated Spark job type for several reasons:
* the semantics are slightly different: Spark requires a "main" application jar, and supporting libs are optional (the Java job type uses all libs)
* the Spark job type may someday support Python apps
* the possible config set for Spark will differ from Java's (although both use edp.java.main_class)
* the Spark and Java job types may diverge further in the future
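To illustrate the semantic difference in the first bullet, the sketch below contrasts a hypothetical job-execution payload for the two job types. The field names other than edp.java.main_class (e.g. "mains", "libs", the jar names) are illustrative assumptions for this blueprint, not a confirmed Sahara API:

```python
# Hypothetical payload sketch; only "edp.java.main_class" is taken from
# the blueprint text, everything else is an assumed shape.

java_job_execution = {
    "job_type": "Java",
    "configs": {"edp.java.main_class": "org.example.WordCount"},
    # For the Java job type, all attached libs are used (placed on the
    # classpath); there is no distinguished "main" jar.
    "libs": ["wordcount.jar", "helper.jar"],
}

spark_job_execution = {
    "job_type": "Spark",
    "configs": {"edp.java.main_class": "org.example.SparkWordCount"},
    # Spark requires exactly one "main" application jar...
    "mains": ["spark-wordcount.jar"],
    # ...while supporting libs are optional and may be empty.
    "libs": [],
}
```

The shared edp.java.main_class config is the overlap noted above; the split between a required main jar and optional libs is what a dedicated Spark job type would make explicit.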
This will require support in sahara-api and the dashboard. TBD whether there is a client impact or data model impact.
(spec in progress)
Specification URL (additional information):
This bug has been closed as a part of the RHEL-OSP 6 general availability release. For details, see https://rhn.redhat.com/errata/rhel7-rhos-6-errata.html