Hibernate Criteria: ordering by the number of child records

Date: 2015-08-06 12:02:39

Tags: java hibernate

I have two classes, News and Comment, with a one-to-many association between them. I am using Hibernate Criteria to fetch News from the database, and I would like the news sorted by the number of its comments.

Criteria criteria = session.createCriteria(News.class, "n");
criteria.createAlias("n.comments", "comments");
criteria.setProjection(Projections.projectionList()
    .add(Projections.groupProperty("comments.id"))
    .add(Projections.count("comments.id").as("numberOfComments")));
criteria.addOrder(Order.desc("numberOfComments"));
List<News> news = criteria.list();

With the code above I don't get a list of News; instead I get a list of rows, each holding two Longs. How can I get a sorted list of News objects?
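For reference (this is not part of the original post): in HQL the same requirement is often met by grouping on the entity and ordering by count(). The query string below is a sketch that assumes the entity and property names from the question (News, comments):

```java
// Sketch of an HQL alternative (assumed names: News entity, comments collection).
// Grouping by the entity and ordering by count(c) yields News objects already
// sorted by their number of comments, with no projection rows to unpack.
public class HqlSketch {
    public static final String HQL =
            "select n from News n "
          + "left join n.comments c "
          + "group by n "
          + "order by count(c) desc";

    public static void main(String[] args) {
        System.out.println(HQL);
    }
}
```

With a Session at hand this would run as session.createQuery(HQL).list(), though that call is not shown in the original post.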

1 Answer:

Answer 0 (score: 2)

I found the answer to my question here: Hibernate Criteria API - how to order by collection size?

I added a new Hibernate Order implementation:

import org.hibernate.Criteria;
import org.hibernate.HibernateException;
import org.hibernate.criterion.CriteriaQuery;
import org.hibernate.criterion.Order;
import org.hibernate.persister.collection.QueryableCollection;
import org.hibernate.persister.entity.Loadable;
import org.hibernate.sql.ConditionFragment;

// Order implementation that sorts by the size of a mapped collection.
public class SizeOrder extends Order {

    protected String propertyName;
    protected boolean ascending;

    protected SizeOrder(String propertyName, boolean ascending) {
        super(propertyName, ascending);
        this.propertyName = propertyName;
        this.ascending = ascending;
    }

    @Override
    public String toSqlString(Criteria criteria, CriteriaQuery criteriaQuery) throws HibernateException {
        // Resolve the collection role, e.g. "com.example.News.comments".
        String role = criteriaQuery.getEntityName(criteria, propertyName)
                + '.' + criteriaQuery.getPropertyName(propertyName);
        QueryableCollection cp = (QueryableCollection)
                criteriaQuery.getFactory().getCollectionPersister(role);

        // Foreign-key columns of the collection table and primary-key columns
        // of the owning entity, used to correlate the subquery with the row.
        String[] fk = cp.getKeyColumnNames();
        String[] pk = ((Loadable) cp.getOwnerEntityPersister())
                .getIdentifierColumnNames();

        // Emits "(select count(*) from <collection table> where <fk = owner pk>) asc|desc".
        return " (select count(*) from " + cp.getTableName() + " where "
                + new ConditionFragment()
                        .setTableAlias(criteriaQuery.getSQLAlias(criteria, propertyName))
                        .setCondition(pk, fk)
                        .toFragmentString()
                + ") "
                + (ascending ? "asc" : "desc");
    }

    public static SizeOrder asc(String propertyName) {
        return new SizeOrder(propertyName, true);
    }

    public static SizeOrder desc(String propertyName) {
        return new SizeOrder(propertyName, false);
    }
}

Then I applied it to my Criteria:

criteria.addOrder(SizeOrder.desc("n.comments"));

Now everything works fine. Thanks a lot, everyone :)
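The ordering SizeOrder produces is, in effect, a descending sort by collection size. As a plain-Java illustration of that contract (the News record here is a stand-in for the poster's entity, not their actual mapping):

```java
import java.util.Comparator;
import java.util.List;

// In-memory illustration of "order by number of comments, descending".
public class SizeOrderDemo {
    // Stand-in for the News entity: a title plus its comments.
    record News(String title, List<String> comments) {}

    static List<News> byCommentCountDesc(List<News> news) {
        return news.stream()
                .sorted(Comparator.comparingInt((News n) -> n.comments().size()).reversed())
                .toList();
    }

    public static void main(String[] args) {
        List<News> sorted = byCommentCountDesc(List.of(
                new News("a", List.of("c1")),
                new News("b", List.of("c1", "c2", "c3")),
                new News("c", List.of("c1", "c2"))));
        // Prints titles in order: b, c, a (3, 2, and 1 comments).
        sorted.forEach(n -> System.out.println(n.title()));
    }
}
```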