Wednesday 12 February 2014

Using Comfortable Mexican Sofa's fixtures on Heroku

Hi all,
If you have been using Comfortable Mexican Sofa you might have been wondering "how do I back up the CMS data?"
The CMS team added fixture capabilities, but these features don't really work on Heroku:
the CMS writes all the content to local files, and even uploaded files are downloaded locally. The problem is that on Heroku, "local" means files that are deleted as soon as the task is over.
My way of working around it was to upload all the fixture files to S3, and to pull them back from S3 when restoring the content.
In my case, we have one Heroku app for staging and one for production; with fixtures I can move the CMS content from one app to the other.

I've created two rake tasks: one for exporting, and one for importing.



require 'open-uri'   # needed so open(url) can download the files from S3
require 'fileutils'

namespace :fixtures do
  def prepare_folder!(path)
    FileUtils.rm_rf(path)
    FileUtils.mkdir_p(path)
  end

  desc "Export CMS fixtures and upload them to S3"
  task :export => :environment do
    # create the fixture files locally
    from  = ENV['FROM'] || "appname"
    to    = Date.today.strftime("%Y%m%d")
    ComfortableMexicanSofa::Fixture::Exporter.new(from, to).export!

    # create a connection
    connection = Fog::Storage.new({
      :provider                 => 'AWS',
      :aws_access_key_id        => GLOBAL['s3_key'],
      :aws_secret_access_key    => GLOBAL['s3_secret']
    })

    directory = connection.directories.get("fixtures")

    # go over all the generated fixture files and upload each one
    Dir.glob("db/cms_fixtures/#{to}/**/*.*") do |local_file|
      directory.files.create(
        :key    => local_file,
        :body   => File.open(local_file),
        :public => true
      )
    end
  end

  desc "Download CMS fixtures from S3 and import them"
  task :import => :environment do
    to    = ENV['TO'] || "appname"
    from  = Date.today.strftime("%Y%m%d")
    path  = "db/cms_fixtures/#{from}/"

    prepare_folder!(path)

    # create a connection
    connection = Fog::Storage.new({
      :provider                 => 'AWS',
      :aws_access_key_id        => GLOBAL['s3_key'],
      :aws_secret_access_key    => GLOBAL['s3_secret']
    })

    directory = connection.directories.get("fixtures", prefix: "db/cms_fixtures/#{from}/")

    directory.files.each do |file|
      url = file.public_url
      folder = url.match("db/cms_fixtures/#{from}/.+/")[0]
      prepare_folder!(folder) unless Dir.exist?(folder)
      open(url.match("db/cms_fixtures/#{from}/.*")[0], 'wb') do |f|
        open(url) { |src| f.write(src.read) }
      end
      puts "Copied #{url}"
    end

    ComfortableMexicanSofa::Fixture::Importer.new(from, to, :force).import!
  end
end
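
A note on the GLOBAL hash used in both tasks: it's just my own settings object. On Heroku you could, for instance, pull the S3 credentials from config vars instead; here's a minimal sketch of that idea (the S3_KEY/S3_SECRET variable names are my assumption, set yours with heroku config:set):

```ruby
# A possible stand-in for the GLOBAL settings hash used in the rake tasks:
# read the S3 credentials from environment variables (Heroku config vars).
# The variable names S3_KEY / S3_SECRET are hypothetical -- use your own.
GLOBAL = {
  's3_key'    => ENV.fetch('S3_KEY', 'dummy-key'),
  's3_secret' => ENV.fetch('S3_SECRET', 'dummy-secret')
}
```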

For this to work, the "TO" or "FROM" parameter must be the name (identifier) of the site in the CMS, and you will need to create an S3 bucket named "fixtures", along with S3 credentials.
This code creates a "directory" under the fixtures bucket named after the current date and stores all the data there (there are no real directories in S3; the file names are just directory-like, e.g. "folder/file.txt").
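
To make the "directory" point concrete, here's a small sketch (the key below is a made-up example) of how a flat S3 key splits into the folder-like part that the import task recreates locally:

```ruby
# S3 keys are flat strings; the "folders" are just everything before the last slash.
key = "db/cms_fixtures/20140212/index/content.html"   # hypothetical example key

folder = File.dirname(key)   # the local directory the import task must create
date   = key.split("/")[2]   # the dated "directory" the export task wrote into

puts folder   # db/cms_fixtures/20140212/index
puts date     # 20140212
```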

1 comment:

  1. Thanks! FWIW, here's some code that imports/exports all CMS sites to S3
    https://gist.github.com/Onumis/288baf157a0dee905d2b