Delayed Job in a Ruby on Rails crawler: cannot create a job for a record before it is persisted

Asked: 2013-08-22 04:47:46

Tags: ruby-on-rails ruby-on-rails-3 mongoid delayed-job

I am building a Ruby on Rails application with MongoDB. One of my model classes is:

require 'open-uri'

class StockPrice
  include Mongoid::Document
  include Mongoid::Timestamps

  field :stock_name, type: String
  field :price, type: Float

  index({ stock_name: 1, updated_at: -1 })

  def self.price(stock_name)
    g = StockPrice.new
    g.crawl(stock_name)
    where(stock_name: stock_name).desc(:updated_at).first.price
  end

  # Search for the stock on screener.in and get its last price.
  def crawl(stock_name)
    agent = Mechanize.new
    agent.get('http://www.screener.in/')
    form = agent.page.forms[0]
    form['q'] = stock_name
    button = form.button_with(value: 'Search Company')
    result_page = agent.submit(form, button)
    doc = Nokogiri::HTML(open(result_page.uri.to_s))
    row_data = doc.css('.table.draggable.table-striped.table-hover tr.strong td').map do |tdata|
      tdata.text
    end
    StockPrice.create(stock_name: stock_name, price: row_data[2])
  end
  handle_asynchronously :crawl
end

Now, in my application, I want to run the crawl function as a delayed job so that it runs in the background. For this I am using the delayed_job_mongoid gem. But when I use handle_asynchronously, I get this error: "Cannot create a job for a record before it has been persisted."

1 answer:

Answer 0 (score: 0)

You stated the reason yourself: the record must already be persisted in the database before delayed_job can create a job for it.
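Why persistence matters can be shown with a small pure-Ruby sketch. This is illustrative only, not delayed_job's actual code: the `Record` and `JobQueue` classes are hypothetical, but they mimic how a job backend stores a reference to a record (class name plus id) rather than the record's data, so an unsaved record with no id cannot be enqueued.

```ruby
# Illustrative sketch only (not delayed_job internals). A queued job stores
# "class + id + method" so a worker can reload the record later; without an
# id there is nothing to reload, so enqueueing must fail.
class Record
  attr_reader :id

  def save
    @id = 42 # pretend the database assigned a primary key
    true
  end

  def persisted?
    !@id.nil?
  end
end

class JobQueue
  def self.enqueue(record, method_name)
    unless record.persisted?
      raise 'Cannot create a job for a record before it has been persisted'
    end
    { class: record.class.name, id: record.id, method: method_name }
  end
end

r = Record.new
begin
  JobQueue.enqueue(r, :crawl) # fails: r has no id yet
rescue => e
  puts e.message
end

r.save
puts JobQueue.enqueue(r, :crawl)[:id] # => 42, works once persisted
```

This is exactly the situation in the question: `handle_asynchronously :crawl` turns `g.crawl(...)` into an enqueue, but `g = StockPrice.new` was never saved.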

Solution

Instead, do something like this:

def self.price(stock_name)
  crawl(stock_name)
  where(stock_name: stock_name).desc(:updated_at).first.price
end

def self.crawl(stock_name)
  s = StockPrice.new(stock_name: stock_name)
  agent = Mechanize.new
  agent.get('http://www.screener.in/')
  form = agent.page.forms[0]
  form['q'] = stock_name
  button = form.button_with(value: 'Search Company')
  result_page = agent.submit(form, button)
  doc = Nokogiri::HTML(open(result_page.uri.to_s))
  row_data = doc.css('.table.draggable.table-striped.table-hover tr.strong td').map do |tdata|
    tdata.text
  end
  s.price = row_data[2]
  s.save
end
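One caveat in both versions: `row_data[2]` is scraped text, while `price` is a Float field. A small hedged helper for normalizing the text before assignment; `parse_price` is a hypothetical name, and the assumed input format (comma thousands separators) may not match the real screener.in markup.

```ruby
# Hypothetical helper: turns scraped price text such as "1,234.50" into a
# Float. The comma-separator format is an assumption about the site's markup.
def parse_price(text)
  cleaned = text.to_s.strip.gsub(',', '')
  Float(cleaned) # raises ArgumentError on junk instead of silently returning 0.0
end

puts parse_price('1,234.50') # => 1234.5
```

Using `Float()` rather than `to_f` makes a scrape that picked up the wrong cell fail loudly instead of saving a bogus price.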

This also makes more sense because, in the real world, crawl should technically be a class method rather than an instance method.
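Note that the class-method version can still run in the background: delayed_job provides a `delay` proxy, so `StockPrice.delay.crawl(stock_name)` enqueues the call, and a class constant serializes fine where an unsaved record does not. Below is a minimal pure-Ruby sketch of that proxy pattern; the names (`DelayProxy`, `FakeQueue`, `Greeter`) are hypothetical and this is not delayed_job's implementation, which serializes jobs to a datastore.

```ruby
# Illustrative "delay" proxy: captures receiver, method name, and arguments
# now, and performs the call later, as a background worker would.
class DelayProxy
  def initialize(target, queue)
    @target = target
    @queue = queue
  end

  # Any method called on the proxy becomes a queued job instead of running now.
  def method_missing(name, *args)
    @queue << [@target, name, args]
    nil
  end
end

class FakeQueue
  def initialize
    @jobs = []
  end

  def <<(job)
    @jobs << job
  end

  # A real worker would do this in a separate process.
  def work_off
    @jobs.shift(@jobs.size).map { |target, name, args| target.public_send(name, *args) }
  end
end

QUEUE = FakeQueue.new

class Greeter
  def self.delay
    DelayProxy.new(self, QUEUE)
  end

  def self.greet(name)
    "hello #{name}"
  end
end

Greeter.delay.greet('world') # enqueued, nothing runs yet
puts QUEUE.work_off.first    # => hello world
```

The same shape applied to the answer's code would be `StockPrice.delay.crawl(stock_name)` inside `self.price`, keeping the crawl off the request thread.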