Why does Socket.connect take a SocketAddress instead of an InetSocketAddress?

Time: 2018-07-30 10:56:26

Tags: java sockets

Why does Socket.connect take a SocketAddress instead of an InetSocketAddress?
I read the source code of Socket.connect, and it simply casts the SocketAddress argument to an InetSocketAddress, throwing an IllegalArgumentException if the argument cannot be cast to InetSocketAddress.
I am quite confused by this.
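For context, here is a minimal sketch of the check being described, paraphrased from java.net.Socket.connect(SocketAddress, int); the wrapper class and the printed message are illustrative only, and the exact JDK wording varies by version:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.SocketAddress;

public class ConnectCheckSketch {

    // Simplified paraphrase of the type check inside Socket.connect:
    // the parameter is declared as the abstract SocketAddress, but the
    // current implementation only accepts the InetSocketAddress subclass.
    static void connect(SocketAddress endpoint, int timeout) throws IOException {
        if (!(endpoint instanceof InetSocketAddress)) {
            throw new IllegalArgumentException("Unsupported address type");
        }
        InetSocketAddress epoint = (InetSocketAddress) endpoint;
        // ... the real method goes on to resolve epoint and open the connection
        System.out.println("would connect to " + epoint);
    }

    public static void main(String[] args) throws IOException {
        connect(new InetSocketAddress("example.com", 80), 1000);
    }
}
```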

1 answer:

Answer 0 (score: 2):

The basic idea is presumably that a future version of the Socket class could support other kinds of connections.
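As a later illustration of that design choice (not from the Socket class itself, but from the channel API): JDK 16 added UnixDomainSocketAddress, which is a SocketAddress but not an InetSocketAddress, and SocketChannel.connect(SocketAddress) accepts it without any change to the method signature. A minimal sketch, assuming JDK 16+ and a hypothetical socket path:

```java
import java.io.IOException;
import java.net.StandardProtocolFamily;
import java.net.UnixDomainSocketAddress;
import java.nio.channels.SocketChannel;

public class UnixSocketSketch {
    public static void main(String[] args) throws IOException {
        // UnixDomainSocketAddress extends SocketAddress but is NOT an
        // InetSocketAddress, so an API declared against the abstract
        // SocketAddress type can accept it without changing its signature.
        UnixDomainSocketAddress addr =
                UnixDomainSocketAddress.of("/tmp/example.sock"); // hypothetical path

        try (SocketChannel channel = SocketChannel.open(StandardProtocolFamily.UNIX)) {
            channel.connect(addr); // connect(SocketAddress) works unchanged
        }
    }
}
```

Declaring the parameter as the abstract SocketAddress keeps the API open to new address families; an implementation that only understands one family can still reject the rest at runtime, which is exactly the IllegalArgumentException the question describes.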