Apache Spark hidden REST API
I'm using the --driver-java-options flag in my spark-submit command to pass the log4j properties, invoking spark-submit like this:
/opt/spark-1.6.2-bin-hadoop2.6/bin/spark-submit \
  --driver-java-options "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties" \
  --class Test testing.jar
How do I pass --driver-java-options when submitting a job through curl (Apache Spark's hidden REST API)?
I tried this:
curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "spark.driver.extraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled" : "true",
    "spark.eventLog.dir" : "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://host-ip:7077"
  }
}'
The job was submitted successfully and a response came back, but with an unknown field:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160810210057-0091",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160810210057-0091",
  "success" : true,
  "unknownFields" : [ "spark.driver.extraJavaOptions" ]
}
"unknownFields" : [ "spark.driver.extraJavaOptions" ]
I also tried driverExtraJavaOptions, as follows:
curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "driverExtraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled" : "true",
    "spark.eventLog.dir" : "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://host-ip:7077"
  }
}'
But I got a similar response:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160810211432-0094",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160810211432-0094",
  "success" : true,
  "unknownFields" : [ "driverExtraJavaOptions" ]
}
Why is this? I looked through spark-submit.scala and referred to the Spark REST API.
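One thing worth trying (an assumption on my part, not verified against 1.6.2): since the server echoes any unrecognized top-level field back in "unknownFields", spark.* configuration keys may only be honored inside the sparkProperties object, where the other spark.* settings already live. A sketch of that variant, with a local JSON sanity check before POSTing:

```shell
# Sketch (assumption): nest spark.driver.extraJavaOptions under
# sparkProperties instead of placing it at the top level, since the
# REST server reports unknown top-level fields and ignores them.
PAYLOAD='{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : { "SPARK_ENV_LOADED" : "1" },
  "mainClass" : "Test",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.app.name" : "Test",
    "spark.driver.extraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://host-ip:7077"
  }
}'

# Validate locally: parse the JSON and confirm the option landed
# inside sparkProperties.
echo "$PAYLOAD" | python3 -c 'import json, sys; d = json.load(sys.stdin); print("spark.driver.extraJavaOptions" in d["sparkProperties"])'
# → True

# Then submit as before (host-ip as in the original commands):
# curl -X POST http://host-ip:6066/v1/submissions/create \
#   --header "Content-Type:application/json;charset=UTF-8" --data "$PAYLOAD"
```

If the server accepts this, the response should no longer list the key under "unknownFields".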