ADAL .NET to ADFS

Date: 2016-09-27 16:09:57

Tags: .net adfs adal

I am trying to accomplish what this library was designed for: allow my .NET code (currently a test console application) to retrieve a token from ADFS for the current user, then use that token to authenticate to and call a REST Web API endpoint. This article calls it the basic flow supported by ADAL. My organization runs ADFS on Windows Server 2012 SP2. Both the console application and ADFS are on-premises.

Can anyone point me to a complete example of how to set this up and use it? Code snippets are fine, but there is a lot of unexplained context around them - for instance, what is the authentication context endpoint? What do I need to request from my system administrators? They gave me an STS endpoint, but that does not seem to work. I suspect I need more than just the server URL - perhaps a query string or some path suffix - but I cannot find anything that explains it. Virtually all the samples and examples I have found involve registering a client application with Azure, and I am not using Azure.
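For context, what the administrators would need to provide is an ADFS-side registration that matches the client ID, redirect URI, and resource identifier the application uses. On Windows Server 2012 R2 that is typically done with the ADFS PowerShell cmdlets; the following is only an illustrative sketch (every name, ID, and URI is a placeholder mirroring the values in this question, and the exact rules depend on the environment):

```powershell
# Register the console app as an OAuth 2.0 public client in ADFS.
Add-AdfsClient -Name "TodoListClient" `
    -ClientId "a8cb2a71-da38-4cf4-9023-7799d00e09f6" `
    -RedirectUri "http://TodoListClient"

# Register the Web API as a relying party; its identifier is what ADAL
# passes as the "resource" parameter when acquiring a token.
Add-AdfsRelyingPartyTrust -Name "Service1" `
    -Identifier "http://myservices/service1" `
    -IssuanceAuthorizationRules '@RuleTemplate = "AllowAllAuthzRule" => issue(Type = "http://schemas.microsoft.com/authorization/claims/permit", Value = "true");'
```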

My code currently looks like this:


    AuthenticationContext authenticationContext = new AuthenticationContext("https://sts.myorg.co.za");
    var token = authenticationContext.AcquireToken("http://myservices/service1", "a8cb2a71-da38-4cf4-9023-7799d00e09f6", new Uri("http://TodoListClient"));

But the call to AcquireToken fails with an exception.

Any help or pointers appreciated!

Thanks, Peter

1 answer:

Answer 0: (score: 0)

You should pass validateAuthority = false to ADAL's AuthenticationContext constructor. ADFS is not an Azure AD authority, so ADAL's instance-discovery validation of the authority URL will fail unless it is disabled.
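A minimal sketch of that suggestion applied to the question's snippet, assuming ADAL .NET v2.x (where the synchronous AcquireToken overload shown in the question exists) and that the authority should end in /adfs - both assumptions to verify against your environment:

```csharp
using System;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

class Program
{
    static void Main()
    {
        // ADFS cannot be validated via Azure AD instance discovery, so
        // authority validation is turned off with the second constructor argument.
        var authenticationContext = new AuthenticationContext(
            "https://sts.myorg.co.za/adfs", validateAuthority: false);

        // Resource, client ID, and redirect URI exactly as registered in ADFS
        // (values taken from the question's code).
        var token = authenticationContext.AcquireToken(
            "http://myservices/service1",
            "a8cb2a71-da38-4cf4-9023-7799d00e09f6",
            new Uri("http://TodoListClient"));

        Console.WriteLine(token.AccessToken);
    }
}
```

If the authority, client ID, or redirect URI do not match the ADFS registration exactly, AcquireToken will still throw, so check those values with your administrators first.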