Does anyone know how I can use
%{{service_name}}.%{{environment_name}} in a template file? Every time I try to use it, Terraform crashes with "%!s()".
I managed to escape the '%' with '%%', but the double curly braces are still a problem.
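For illustration, a minimal sketch of what the attempt described above might look like as an inline Terraform template (the data source name and the inline form are assumptions; the asker's actual setup is not shown):

data "template_file" "name_template_attempt" {
  # Hypothetical reconstruction of the string the question is about:
  # '%%' is the escape the asker used for '%'; the '{{...}}' double braces
  # are the part they could not get through the template engine.
  template = "NAME_TEMPLATE=%%{{service_name}}.%%{{environment_name}}"
}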
Answer 0 (score: 0)
I use Terraform 0.11 templates to generate a docker-compose.yml for Rancher 1. This is fairly outdated and specific, but maybe someone will benefit from it.
So I want to get
NAME_TEMPLATE=%{{service_name}}.%{{environment_name}}
into my docker-compose.yml.
First, I got the template engine working with the YAML, since there are other variables to fill in as well:
Terraform template:
data "template_file" "route53updater" {
template = "${file("${local.templates_path}/docker-compose.yml")}"
vars {
aws_access_key = "${var.aws_access_key}"
aws_secret_key = "${var.aws_secret_key}"
aws_domain = "${var.instance_tld}"
aws_zone_id = "${var.aws_zone_id}"
aws_region = "${var.aws_region}"
}
}
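For completeness, the data source above references input variables and a local path that have to exist elsewhere in the configuration; a minimal sketch of those declarations (the templates directory location is an assumption):

# Input variables used by the template and the rancher_stack resource below.
variable "aws_access_key" {}
variable "aws_secret_key" {}
variable "instance_tld" {}
variable "aws_zone_id" {}
variable "aws_region" {}
variable "rancher_environment" {}

locals {
  # Assumed location of the docker-compose.yml and rancher-compose.yml templates.
  templates_path = "${path.module}/templates"
}

The docker-compose.yml template that gets rendered looks like this: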
version: '2'
services:
  route53:
    mem_limit: 134217728
    image: rancher/external-dns:v0.7.12
    environment:
      AWS_ACCESS_KEY: ${aws_access_key}
      ROUTE53_ZONE_ID: ${aws_zone_id}
      AWS_REGION: ${aws_region}
      AWS_SECRET_KEY: ${aws_secret_key}
      ROOT_DOMAIN: ${aws_domain}
      TTL: '60'
      NAME_TEMPLATE: "NAME_TEMPLATE_VALUE"
    expose:
      - '1000'
    labels:
      io.rancher.container.agent.role: external-dns
      io.rancher.container.create_agent: 'true'
After rendering, this becomes:
version: '2'
services:
  route53:
    mem_limit: 134217728
    image: rancher/external-dns:v0.7.12
    environment:
      AWS_ACCESS_KEY: SECRET
      AWS_REGION: eu-central-1
      AWS_SECRET_KEY: SECRET
      ROOT_DOMAIN: domain.com
      ROUTE53_ZONE_ID: SECRET
      NAME_TEMPLATE: "NAME_TEMPLATE_VALUE"
      TTL: '60'
    expose:
      - '1000'
    labels:
      io.rancher.container.agent.role: external-dns
      io.rancher.container.create_agent: 'true'
Then I use replace to edit the rendered template content inline, like this:
resource "rancher_stack" "route53" {
name = "route53"
environment_id = "${var.rancher_environment}"
scope = "system"
start_on_create = true
docker_compose = "${replace("${data.template_file.route53updater.rendered}", "NAME_TEMPLATE_VALUE", "%{{`{{service_name}}`}}.%{{`{{environment_name}}`}}")}"
rancher_compose = "${file("${local.templates_path}/rancher-compose.yml")}"
}
So it replaces NAME_TEMPLATE_VALUE in the rendered docker-compose.yml with %{{`{{service_name}}`}}.%{{`{{environment_name}}`}}, and the problematic string never has to go through the template engine itself.
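If you want to inspect the result before Rancher consumes it, one option (not part of the original answer; the output name is made up) is to expose the patched compose file as a Terraform output:

output "route53_docker_compose_debug" {
  # Illustrative only: shows the rendered docker-compose.yml after the
  # NAME_TEMPLATE_VALUE placeholder has been swapped for the real template string.
  value = "${replace("${data.template_file.route53updater.rendered}", "NAME_TEMPLATE_VALUE", "%{{`{{service_name}}`}}.%{{`{{environment_name}}`}}")}"
}

Running terraform output route53_docker_compose_debug then prints the compose file exactly as it will be handed to the rancher_stack resource.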