I'm trying to download all the full-res images from a site by finding the image links, visiting them, and downloading the full image.
I have managed to make it kinda work: I can fetch all the links and download images hosted on i.imgur. However, I want to make it work with more sites and with regular imgur albums, and also without wget (which I'm using now, as shown below).
This is the code I'm currently playing around with (don't judge, it's only test code):
require 'mechanize'
require 'uri'
def get_images()
  crawler = Mechanize.new
  img_links = crawler.get("http://www.reddit.com/r/climbing/new/?count=25&after=t3_39qccc").links_with(href: %r{i.imgur})
  return img_links
end
def download_images()
  img_links = get_images()
  crawler = Mechanize.new
  clean_links = []
  img_links.each do |link|
    current_link = link.uri.to_s
    unless current_link.include?("domain")
      unless clean_links.include?(current_link)
        clean_links << current_link
      end
    end
  end

  p clean_links

  clean_links.each do |link|
    system("wget -P ./images -A jpeg,jpg,bmp,gif,png #{link}")
  end
end
download_images()
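To drop the wget dependency, the download step can be done in plain Ruby. Here's a minimal sketch using the stdlib's open-uri (Ruby 2.5+ for `URI.open`); the `download_image` and `filename_for` helper names and the `./images` directory are my own choices, not from the code above, and this assumes the links are direct image URLs:

```ruby
require 'open-uri'
require 'fileutils'

# Derive a local filename from the image URL,
# e.g. "http://i.imgur.com/abc123.jpg" -> "abc123.jpg".
def filename_for(url)
  File.basename(URI.parse(url).path)
end

# Fetch one image over HTTP and write its bytes under dir,
# replacing the `system("wget ...")` call.
def download_image(url, dir = "./images")
  FileUtils.mkdir_p(dir)
  File.open(File.join(dir, filename_for(url)), "wb") do |f|
    URI.open(url) { |remote| f.write(remote.read) }
  end
end
```

Since the crawler already uses Mechanize, another option is to reuse the same agent: for binary responses `crawler.get(url)` returns a `Mechanize::File`/`Mechanize::Image`, which responds to `save`.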
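For regular imgur albums, one rough approach is to fetch the album page and pull out the direct i.imgur.com image URLs from its HTML. The sketch below does this with a simple regex scan over the raw HTML (the `album_image_urls` name is mine, the regex is an assumption about how the URLs appear in the markup, and a real HTML parser such as Nokogiri would be more robust, especially for script-rendered pages):

```ruby
# Extract direct i.imgur.com image URLs from raw album-page HTML.
# A fragile sketch: it only finds URLs that appear literally in the
# markup and match the extensions listed here.
def album_image_urls(html)
  html.scan(%r{https?://i\.imgur\.com/[A-Za-z0-9]+\.(?:jpe?g|png|gif)}).uniq
end
```

The resulting URLs can then be fed to the same per-image download step used for single i.imgur links.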