Bulk updating gem source
With no parameters, `bundle update` will ignore any previously installed gems and resolve all dependencies again based on the latest versions of all gems available in the sources.
Consider the following Gemfile. Running a full update re-resolves and reinstalls everything:

```
Fetching gem metadata from https://rubygems.org/.........
Installing builder 2.1.2
Installing abstract 1.0.0
Installing rack 1.2.8
Using bundler 1.7.6
Installing rake 10.4.0
Installing polyglot 0.3.5
Installing mime-types 1.25.1
Installing i18n 0.4.2
Installing mini_portile 0.6.1
Installing tzinfo 0.3.42
Installing rack-mount 0.6.14
Installing rack-test 0.5.7
Installing treetop 1.4.15
Installing thor 0.14.6
Installing activesupport 3.0.0
Installing erubis 2.6.6
Installing activemodel 3.0.0
Installing arel 0.4.0
Installing mail 2.2.20
Installing activeresource 3.0.0
Installing actionpack 3.0.0
Installing activerecord 3.0.0
Installing actionmailer 3.0.0
Installing railties 3.0.0
Installing rails 3.0.0
Installing nokogiri 1.6.5
Bundle complete! Use `bundle show [gemname]` to see where a bundled gem is installed.
```

Keep in mind that this process can result in a significantly different set of the 25 gems, based on the requirements of new gems that the gem authors released since the last time you ran `bundle update`:

```
Fetching source index for https://rubygems.org/
Installing daemons (1.1.0)
Installing eventmachine (0.12.10) with native extensions
Installing open4 (1.0.1)
Installing (0.4.7) with native extensions
Installing rack (1.2.1)
Installing rack-perftools_profiler (0.0.2)
Installing thin (1.2.7) with native extensions
Using bundler (1.0.0.rc.3)
```

The `--conservative` option will also prevent shared dependencies from being updated.
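The Gemfile referenced above was lost from the text; a plausible reconstruction, consistent with the install output (the gem names and versions here are assumptions inferred from that output, not the original file), would be:

```ruby
# Hypothetical Gemfile inferred from the install output above.
source "https://rubygems.org"

gem "rails", "3.0.0"
gem "nokogiri"
```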
All of my websites currently run on a VPS provided by Host Ican. I recently discovered a little quirk involving VPS servers and RubyGems.
Rails has to talk to the database over TCP four times for each record, and it has to build two different SQL statements each time.
Luckily, this last part is quick thanks to some work by Tenderlove on Adequate Record Pro, where the query construction is cached. Our import task looks like this:

```ruby
task import_products2: :environment do
  require 'csv'
  CSV.foreach("#{Rails.root}/tmp/products.csv") do |p|
    product = Product.new(sku: p[0], name: p[1], origin: p[2], msrp_cents: p[3])
    product.save_for_bulk_import!
  end
end
```

The way we have our code now, each of the 50,000 products gets its own database transaction.
This is how Rails handles saving a model by default…
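To see why that matters, here is a standalone sketch (the `FakeConnection` class is invented for illustration and is not Rails code): saving each record in its own transaction issues one BEGIN/COMMIT round trip per record, while a single wrapping transaction issues just one pair:

```ruby
# Minimal stand-in for a database connection that just records statements;
# invented for this demo to count round trips, not part of ActiveRecord.
class FakeConnection
  attr_reader :statements

  def initialize
    @statements = []
  end

  def execute(sql)
    @statements << sql
  end

  def transaction
    execute("BEGIN")
    yield
    execute("COMMIT")
  end
end

# Default ActiveRecord-style behavior: one transaction per record.
per_record = FakeConnection.new
3.times { |i| per_record.transaction { per_record.execute("INSERT #{i}") } }

# Bulk-friendly behavior: one transaction around all the records.
bulk = FakeConnection.new
bulk.transaction do
  3.times { |i| bulk.execute("INSERT #{i}") }
end

puts per_record.statements.count("BEGIN")  # => 3
puts bulk.statements.count("BEGIN")        # => 1
```

In Rails, the equivalent move is wrapping the whole import loop in a single `ActiveRecord::Base.transaction` block.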
All in all, we were able to get 50,000 records into the database from 59 seconds all the way down to 4.6 seconds, around 12 times faster.
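The speedup arithmetic from those timings:

```ruby
# 59 s before, 4.6 s after the bulk-import changes.
speedup = 59.0 / 4.6
puts speedup.round(1)  # => 12.8
```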
The way to avoid this problem is to update RubyGems… Unfortunately, this was not in my local repository.
```
Bulk updating Gem source index for:
Successfully installed rails-2.1.0
Successfully installed rake-0.8.1
Successfully installed activesupport-2.1.0
Successfully installed activerecord-2.1.0
Successfully installed actionpack-2.1.0
Successfully installed actionmailer-2.1.0
Successfully installed activeresource-2.1.0
Installing ri documentation for rake-0.8.1...
Installing RDoc documentation for activesupport-2.1.0...
Installing RDoc documentation for activerecord-2.1.0...
Installing RDoc documentation for actionpack-2.1.0...
Installing RDoc documentation for actionmailer-2.1.0...
Installing RDoc documentation for activeresource-2.1.0...
```
Gem requirements as defined in the Gemfile will still be the first determining factor for what versions are available.
Given the gem requirements defined in the Gemfile, the patch-level options produce results like the following:

```
#  Command Line                    Result
------------------------------------------------------------
1  bundle update --patch           'foo 1.4.5', 'bar 2.1.1'
2  bundle update --patch foo       'foo 1.4.5', 'bar 2.1.1'
3  bundle update --minor           'foo 1.5.1', 'bar 3.0.0'
4  bundle update --minor --strict  'foo 1.5.0', 'bar 2.1.1'
5  bundle update --patch --strict  'foo 1.4.4', 'bar 2.0.4'
```

In case 1, bar is upgraded to 2.1.1, a minor version increase, because the dependency from foo 1.4.5 required it.
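The version filtering that `--patch` and `--minor` imply can be sketched with Ruby's built-in `Gem::Version` (this illustrates only the selection rule, not Bundler's actual resolver, which also weighs dependencies between gems, as case 1 shows; the candidate list is taken from the foo example above):

```ruby
require "rubygems"  # Gem::Version ships with Ruby

# Candidate versions of foo from the example above.
candidates = %w[1.4.3 1.4.4 1.4.5 1.5.0 1.5.1].map { |v| Gem::Version.new(v) }
current = Gem::Version.new("1.4.3")
major, minor = current.segments

# --patch: only versions sharing the current major.minor prefix are eligible.
patch_level = candidates.select { |v| v.segments[0, 2] == [major, minor] }
puts patch_level.max  # => 1.4.5

# --minor: only versions sharing the current major version are eligible.
minor_level = candidates.select { |v| v.segments.first == major }
puts minor_level.max  # => 1.5.1
```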