How do I get the entityManager in a BaseDao in Karaf 4.0.5?

Asked: 2016-04-29 06:46:19

Tags: java jpa-2.0 entitymanager apache-karaf

I am using Karaf 4.0.5 with a JPA 2.0 provider (version 4.2.15), and I want to obtain an EntityManager in my BaseDao class. If I inject the EntityManager in my service:

<bean id="subscriberService" class="domain.payment.impl.subscriber.SubscriberServiceImpl"
      scope="singleton" init-method="init">
    <tx:transaction method="*" />
</bean>

and in that class declare:

@PersistenceContext(unitName="payment")
private EntityManager entityManager;

then the EntityManager is injected correctly.
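For the annotation-driven injection above to work under Karaf 4 at all, the Blueprint file has to have Aries JPA annotation processing switched on. A minimal sketch of such a header follows; the exact namespace versions are an assumption on my part and should be checked against the installed Aries features:

```xml
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
           xmlns:tx="http://aries.apache.org/xmlns/transactions/v1.2.0"
           xmlns:jpa="http://aries.apache.org/xmlns/jpa/v2.0.0">

    <!-- tell Aries JPA to scan the Blueprint beans below for
         @PersistenceContext / @PersistenceUnit annotations -->
    <jpa:enable/>

    <!-- bean definitions go here -->

</blueprint>
```

Only beans declared in (and instantiated by) this Blueprint container get their annotations processed; a plain `new BaseJpaDao(...)` created in application code is invisible to the container, which is consistent with the null field described below.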

But if I try the same thing in another class:

public class BaseJpaDao<E> implements BaseDao<E>{
    protected Class<?> entityClass;

    @PersistenceContext(unitName="payment")
    private EntityManager entityManager;

    public BaseJpaDao(Class<?> entityClass) {
        this.entityClass = entityClass;
    }

    @Override
    public E persist(E e) {
        entityManager.persist(e);
        return e;
    }
}

then my entityManager is null.

I tried declaring it as a bean:

<bean id="baseDao" class="domain.payment.impl.base.BaseJpaDao"
      scope="singleton" init-method="init">
    <tx:transaction method="*" />
</bean>

but it did not help.

In a Spring project this works fine, but in OSGi I am running into a lot of problems.

Is it really the case that I can only obtain the entityManager from a service?

1 Answer:

Answer 0 (score: 1)

Have you checked the logs? BaseJpaDao does not appear to have a public no-arg constructor, so there should be an error in karaf.log saying that the BaseJpaDao bean could not be created...
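If the missing no-arg constructor is indeed the problem, one way out (a sketch, not tested against the poster's project) is to keep the existing BaseJpaDao(Class<?>) constructor and let Blueprint supply the argument; the standard Blueprint type converter turns a class-name string into a Class object. The entity class name below is a made-up placeholder:

```xml
<bean id="subscriberDao" class="domain.payment.impl.base.BaseJpaDao" scope="singleton">
    <!-- Blueprint passes this value to the BaseJpaDao(Class<?>) constructor;
         "domain.payment.Subscriber" is a hypothetical entity class -->
    <argument value="domain.payment.Subscriber"/>
    <tx:transaction method="*" />
</bean>
```

Alternatively, give BaseJpaDao (or a concrete subclass) a public no-arg constructor so the container can instantiate it without arguments; either way, the bean must be created by the Blueprint container for the @PersistenceContext annotation to be honored.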