Using Scala reflection to create an object of its declared class

Date: 2016-12-17 06:44:54

Tags: java scala apache-spark-sql

I'm new to Scala, and I'm having trouble writing a spark-sql application that dynamically loads a user class and maps an RDD onto it.

The problem is casting the object to its declared class, because cls.type resolves to java.lang.Class[_], which is not what I expect. At runtime, the following exception is thrown:

    rdd.map(line => {
        val cls = Class.forName("UserClass")
        val constructor = cls.getConstructor(classOf[String], classOf[String])
        Tuple1(constructor.newInstance(line._1, line._2)).asInstanceOf[cls.type]
    }).toDF()

By the way, I'm using Scala 2.10 and Spark 1.6.1. Any suggestions and comments would be greatly appreciated! Thanks!

1 answer:

Answer 0: (score: 1)

I don't have a real solution, but I can point out a few things you are doing wrong.

You wrap the object in a Tuple1 and then try to cast the tuple to another type, rather than the object itself.
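To see why this matters, here is a minimal sketch (not from your code) showing that a cast applies to the wrapper tuple, not to the element inside it:

```scala
object TupleCastDemo {
  def main(args: Array[String]): Unit = {
    val t = Tuple1("hello")
    // Casting the Tuple1 itself to String fails at runtime:
    // the runtime class of t is scala.Tuple1, not java.lang.String.
    val castSucceeded =
      try { t.asInstanceOf[String]; true }
      catch { case _: ClassCastException => false }
    println(castSucceeded)  // prints false
    // Casting the element works, because t._1 really is a String.
    println(t._1.asInstanceOf[String])  // prints hello
  }
}
```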

cls.type is not the type represented by the Class cls. It is the type of the variable cls, which in this case happens to be java.lang.Class[_].
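A small illustration of this point, using java.lang.String instead of your UserClass so it runs standalone: the variable cls has static type Class[_], so cls.type is the singleton type of that variable, while the object the constructor creates has its own runtime class:

```scala
object ClsTypeDemo {
  def main(args: Array[String]): Unit = {
    // cls is a value of type java.lang.Class[_];
    // cls.type refers to the type of this *variable*, not to java.lang.String.
    val cls: Class[_] = Class.forName("java.lang.String")
    val instance = cls.getConstructor(classOf[String]).newInstance("hello")
    // The runtime class of the created object is still the loaded class.
    println(instance.getClass.getName)  // prints java.lang.String
  }
}
```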

Casting is mostly a compile-time construct, so you can only cast to types that are known at compile time. You said you are loading the classes dynamically, so I assume they are not known to the compiler.
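One common workaround (a sketch under my own assumptions, not part of the original answer): if every dynamically loaded user class implements a trait that *is* known at compile time, you can cast the reflectively created instance to that trait. The names UserRecord, ExampleRecord, and load below are all hypothetical:

```scala
// A compile-time-known contract that all user classes implement.
trait UserRecord extends Serializable

// Hypothetical user class standing in for the dynamically loaded "UserClass".
class ExampleRecord(val a: String, val b: String) extends UserRecord

object ReflectiveLoad {
  // Load a class by name, invoke its (String, String) constructor,
  // and cast the result to the known trait instead of cls.type.
  def load(name: String, a: String, b: String): UserRecord = {
    val cls = Class.forName(name)
    val ctor = cls.getConstructor(classOf[String], classOf[String])
    ctor.newInstance(a, b).asInstanceOf[UserRecord]
  }

  def main(args: Array[String]): Unit = {
    val rec = load(classOf[ExampleRecord].getName, "x", "y")
    println(rec.getClass.getSimpleName)  // prints ExampleRecord
  }
}
```

Inside an rdd.map this still won't give Spark a typed DataFrame for free, but it at least makes the cast well-defined at compile time.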