I'm new to Spark, and I just can't figure out how to write the output below to a text or target file using Spark for Java. I have a list into which I write data parsed from the processed file.
Parsing the text file:
List<Pair<Long, Pair<String, String>>> parsedLog = new ArrayList<>();
Then I add entries to the list:
while ((line = reader.readLine()) != null) {
    String parsedLine = line.trim();
    if (!parsedLine.isEmpty()) {
        if (!isOperatorHelpToken) {
            emptyLinesCount = 0;
            Pattern pattern = Pattern.compile(": *");
            Matcher matcher = pattern.matcher(parsedLine);
            if (matcher.find()) {
                String parsedName = parsedLine.substring(0, matcher.start());
                String parsedValue = parsedLine.substring(matcher.end());
                propertyValue = parsedValue;
                propertyName = parsedName;
                for (int i = 0; i < purge.length; ++i) {
                    propertyName = propertyName.replace(purge[i], "");
                    propertyValue = propertyValue.replace(purge[i], "");
                }
I tried writing to the file as follows:
if (line.equals(operatorHelpEndMark)) {
    propertyValue = String.join("\30", operatorHelpText);
    parsedLog.add(new Pair<Long, Pair<String, String>>(
            alertCounter, new Pair<String, String>(propertyName, propertyValue)));
}
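For context, here is a minimal sketch of how the finished list could end up in a text file. In an actual Spark job you would typically do something like `sparkContext.parallelize(lines).saveAsTextFile(outputDir)`; the sketch below uses plain Java IO instead so it runs without a Spark cluster, and it substitutes `AbstractMap.SimpleEntry` for the `Pair` class, since the original `Pair` type isn't shown. The tab-separated line format is my assumption, not something from the question.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class ParsedLogWriter {

    // Format one parsed entry as a tab-separated line
    // (hypothetical format: counter, name, value).
    static String toLine(long counter, String name, String value) {
        return counter + "\t" + name + "\t" + value;
    }

    public static void main(String[] args) throws IOException {
        // SimpleEntry stands in for the Pair class used in the question.
        List<SimpleEntry<Long, SimpleEntry<String, String>>> parsedLog = new ArrayList<>();
        parsedLog.add(new SimpleEntry<>(1L, new SimpleEntry<>("name", "value")));

        // Flatten each entry into one line of text.
        List<String> lines = parsedLog.stream()
                .map(e -> toLine(e.getKey(), e.getValue().getKey(), e.getValue().getValue()))
                .collect(Collectors.toList());

        // Plain Java IO; in a Spark job the equivalent would be roughly:
        //   sparkContext.parallelize(lines).saveAsTextFile("output-dir");
        Path out = Files.createTempFile("parsedLog", ".txt");
        Files.write(out, lines);
        System.out.println(Files.readAllLines(out));
    }
}
```

Whether plain IO or Spark's `saveAsTextFile` is appropriate depends on whether the list lives on the driver (plain IO is fine) or needs to be written out as a distributed RDD.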