I have set:

spark.executor.extraJavaOptions to -XX:+UseG1GC
spark.storage.memoryFraction to 0.3

I can see both of them on the Environment page. But on the Executors page the storage memory is still allocated at the default amount, and I don't see the Java options being used when the executors start. I set these in my application using SparkConf.set():

Map<String, String> params = new HashMap<>();
params.put("spark.kryoserializer.buffer", "24m");
params.put("spark.kryoserializer.buffer.max", "1g");
params.put("spark.kryo.registrationRequired", "true");
params.put("spark.speculation", "false");
params.put("spark.rdd.compress", "true");
params.put("spark.storage.memoryFraction","0.3");
params.put("spark.executor.extraJavaOptions","-XX:+UseG1GC -XX:+G1SummarizeConcMark -XX:InitiatingHeapOccupancyPercent=35 -XX:MaxGCPauseMillis=400 -XX:ConcGCThreads=20");
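One way to check whether extra JVM options actually reached a process (on an executor, this would be done from inside a task and written to the executor log) is to read the running JVM's input arguments. A minimal standalone sketch, independent of Spark:

```java
import java.lang.management.ManagementFactory;
import java.util.List;

public class PrintJvmArgs {
    public static void main(String[] args) {
        // The flags this JVM was actually launched with; on an executor,
        // anything from spark.executor.extraJavaOptions should appear here.
        List<String> jvmArgs = ManagementFactory.getRuntimeMXBean().getInputArguments();
        System.out.println("jvm args: " + jvmArgs);
    }
}
```

If the G1 flags are missing from this list on the executors, the options never made it past the driver-side configuration.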
SparkConf sparkConf = new SparkConf().setAppName("Test");
// Copy every key/value pair from each parameter map into the SparkConf.
Arrays.asList(
        getSparkParams(),
        ElasticSearchProvider.getSparkParams()
).forEach(settings -> settings.forEach(sparkConf::set));
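Note that driving the copy through `map(...).count()` is fragile: since Java 9, `count()` may compute the size without running the `map` lambda at all (the pipeline has no size-changing stages), so the `sparkConf.set` side effects can be silently skipped. Using `forEach` guarantees every entry is visited. A self-contained sketch of the same merge pattern, with plain maps standing in for `getSparkParams()` and `ElasticSearchProvider.getSparkParams()` (both hypothetical here):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class MergeParams {
    public static void main(String[] args) {
        // Stand-ins for the two parameter sources in the question.
        Map<String, String> appParams = new HashMap<>();
        appParams.put("spark.rdd.compress", "true");
        Map<String, String> esParams = new HashMap<>();
        esParams.put("spark.kryoserializer.buffer", "24m");

        // forEach visits every entry, unlike a map(...).count() pipeline
        // whose side effects may be elided by the stream implementation.
        Map<String, String> conf = new HashMap<>();
        Arrays.asList(appParams, esParams)
              .forEach(settings -> settings.forEach(conf::put));

        System.out.println(conf.size()); // prints 2
    }
}
```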