Scraped results won't update

Asked: 2017-03-09 21:55:56

Tags: ruby-on-rails ruby nokogiri whenever

I'm scheduling the scrape through the Whenever gem, but my scraped results don't seem to update at all.

I think this is because the earlier results are already saved, so it only ever shows those, but I'm not sure.

Controller:

class EntriesController < ApplicationController

  def index
    @entries = Entry.all
  end

  def scrape
    RedditScrapper.scrape

    respond_to do |format|
      format.html { redirect_to entries_url, notice: 'Entries were successfully scraped.' }
      # NOTE: entriesArray is not defined in this action, so a JSON
      # request would raise a NameError here.
      format.json { entriesArray.to_json }
    end
  end

end

lib/reddit_scrapper.rb:

require 'open-uri'

module RedditScrapper
  def self.scrape
    doc = Nokogiri::HTML(open("https://www.reddit.com/"))

    entries = doc.css('.entry')
    entriesArray = []
    entries.each do |entry|
      title = entry.css('p.title > a').text
      link = entry.css('p.title > a')[0]['href']
      entriesArray << Entry.new({ title: title, link: link })
    end

    if entriesArray.map(&:valid?)
      entriesArray.map(&:save!)
    end
  end
end
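
One thing worth noting in this module: entriesArray.map(&:valid?) returns an array (e.g. [true, false]), and any array is truthy in Ruby, so the save! branch runs even when some records are invalid. A minimal sketch of the intended guard, using Enumerable#all?:

if entriesArray.all?(&:valid?)
  entriesArray.each(&:save!)
end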

config/schedule.rb:

RAILS_ROOT = File.expand_path(File.dirname(__FILE__) + '/')

every 2.minutes do 
  runner "RedditScrapper.scrape", :environment => "development"
end
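
As a sanity check: Whenever only reads schedule.rb when the schedule is written out to the crontab, so it is worth confirming the cron entry actually exists before debugging anything else:

whenever --update-crontab   # write/refresh the cron entries generated from config/schedule.rb
crontab -l                  # confirm the runner job is present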

Model:

class Entry < ApplicationRecord

end

Routes:

Rails.application.routes.draw do
    #root 'entry#scrape_reddit'
    root 'entries#index'
    resources :entries
    #get '/new_entries', to: 'entries#scrape', as: 'scrape'
end

View index.html.erb:

<h1>Reddit's Front Page</h1>
<% @entries.order("created_at DESC").limit(10).each do |entry| %>
  <h3><%= entry.title %></h3>
  <p><%= entry.link %></p>
<% end %>

1 Answer:

Answer 0 (score: 0):

Just use Entry.create! to create the entries:

module RedditScrapper
  def self.scrape
    doc = Nokogiri::HTML(open("https://www.reddit.com/"))

    entries = doc.css('.entry')
    entriesArray = []
    entries.each do |entry|
      title = entry.css('p.title > a').text
      link = entry.css('p.title > a')[0]['href']
      Entry.create!(title: title, link: link)
    end
  end
end
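
If repeated runs should not insert the same link twice, find_or_create_by (standard ActiveRecord) is a possible variant of the loop body, sketched here:

entries.each do |entry|
  title = entry.css('p.title > a').text
  link  = entry.css('p.title > a')[0]['href']
  # Only creates a row when no Entry with this link exists yet.
  Entry.find_or_create_by(link: link) do |e|
    e.title = title
  end
end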

To get the 10 latest entries:

# controller
def index
  @entries = Entry.order("created_at DESC").limit(10)
end
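
The same query could also live on the model as a scope (a sketch; the name recent is introduced here for illustration):

class Entry < ApplicationRecord
  # Hypothetical scope so the controller can say Entry.recent.
  scope :recent, -> { order(created_at: :desc).limit(10) }
end

The index action then becomes @entries = Entry.recent.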

View:

<% @entries.each do |entry| %>
  <h3><%= entry.title %></h3>
  <p><%= entry.link %></p>
<% end %>

But I also think you need to change the order in which the items parsed from Reddit are saved: the newest items are parsed first, so they are also added to the database first and end up with the oldest created_at. You need to change this in your RedditScrapper.

Reverse the entries: instead of

entries.each do |entry|

use

entries.reverse.each do |entry|

That way the iteration starts from the end of the list, and the newest items are added last.
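
Putting both suggestions together, the scraper would look roughly like this (a sketch combining the create! version above with the reversed iteration):

require 'open-uri'

module RedditScrapper
  def self.scrape
    doc = Nokogiri::HTML(open("https://www.reddit.com/"))

    # Walk the parsed entries from last to first so the newest
    # Reddit items are saved last and get the latest created_at.
    doc.css('.entry').reverse.each do |entry|
      title = entry.css('p.title > a').text
      link  = entry.css('p.title > a')[0]['href']
      Entry.create!(title: title, link: link)
    end
  end
end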