
add Google Cloud Storage / nearline to storages #22

Open
bmaeser opened this issue Mar 13, 2015 · 9 comments

@bmaeser

bmaeser commented Mar 13, 2015

In March 2015 Google introduced the Cloud Storage Nearline platform. Using this service as a backup storage would be a great improvement to this project.
Regards, Bernhard

http://googlecloudplatform.blogspot.co.at/2015/03/introducing-Google-Cloud-Storage-Nearline-near-online-data-at-an-offline-price.html

@tombruijn
Member

I hadn't heard of a new Google cloud storage offering, but sure, it would make a nice option to save backups there as well.

Currently the team is not working to expand the current feature set. We're busy with other things, but we're also looking into whether we can improve the overall code base in a new version.

We're currently accepting community PRs for new features, so if you're able and have the time to add this, go ahead! Otherwise you'll have to wait until someone has time to pick this up.

@bmaeser
Author

bmaeser commented Sep 7, 2015

I would love to implement this myself; unfortunately, I don't speak any Ruby.

But thanks for putting it on the roadmap.

@bmaeser bmaeser closed this as completed Sep 7, 2015
@bmaeser bmaeser reopened this Sep 7, 2015
@Bogdaan

Bogdaan commented Oct 1, 2016

I'm working on this PR (at the moment, they have the best offer for backup storage).

@pduersteler

@Bogdaan Any information about the current state? Anything we can help with?

@Bogdaan

Bogdaan commented Nov 10, 2016

@pduersteler
For performance and security reasons I use FUSE (https://cloud.google.com/storage/docs/gcs-fuse) with Backup's local storage. FUSE makes more API calls, but it works faster than a Ruby implementation.
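A rough sketch of that setup, assuming the bucket is already mounted with gcsfuse (e.g. `gcsfuse my-backup-bucket /mnt/gcs-backups`; the bucket name, mount point, and retention count are placeholders, not values from this thread):

```ruby
# Backup model fragment: store to a gcsfuse mount point with the Local storage.
# Assumes the GCS bucket was mounted beforehand, e.g.:
#   gcsfuse my-backup-bucket /mnt/gcs-backups
store_with Local do |local|
  local.path = '/mnt/gcs-backups/backups'  # placeholder path on the mount
  local.keep = 5                           # placeholder retention count
end
```

With this approach Backup only ever writes to the local filesystem, and gcsfuse handles the uploads to GCS behind the scenes.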

@lauramorillo

I'd love to help with this! So @Bogdaan let me know if I can help!

@rpassis

rpassis commented Jan 1, 2017

The Google Cloud Storage API is interoperable with S3, so basically all you need to do is pass the correct endpoint to fog, along with the correct access and secret keys for Google, and everything will work exactly the same.

Steps to make it work:

  1. On your Google Cloud account, go to Storage -> Settings -> Interoperability and create a new Access/Secret pair.

  2. The S3 storage documentation describes how to pass other options to fog, including a different endpoint: http://backup.github.io/backup/v4/storage-s3/

You can use one of the XML endpoints from this link https://cloud.google.com/storage/docs/request-endpoints (either HTTPS or HTTP should work).
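Put together, the steps above might look roughly like this in a Backup model (the key, secret, bucket, and path are placeholders; the endpoint is GCS's XML interoperability endpoint):

```ruby
# Backup model fragment: store to GCS through its S3-interoperable XML API.
# The access/secret pair comes from Storage -> Settings -> Interoperability
# in the Google Cloud console.
store_with S3 do |s3|
  s3.access_key_id     = 'GOOG...'           # placeholder interoperability key
  s3.secret_access_key = 'XXX'               # placeholder secret
  s3.bucket            = 'my-backup-bucket'  # placeholder bucket name
  s3.path              = 'backups'           # placeholder path prefix
  s3.fog_options       = {
    endpoint: 'https://storage.googleapis.com'
  }
end
```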

I hope it helps.
Rog

@lauramorillo

I have created a new PR that implements the connection with fog to use GCS as storage.

backup/backup#827

I have seen that fog provides the XML API (using the Interoperability credentials that @rpassis mentioned) and also the JSON API, but the JSON API is only supported in a version that is not included in fog yet, since that version drops support for Ruby versions below 2.0.0, which fog still supports.

According to fog/fog#3872 (comment) they are planning the change, so hopefully it will happen soon and we will be able to use the JSON API.

@bdossantos

Hi!

I'm trying to use GCS via the S3 interoperable API, but I get the following error:

[...]
[2017/02/23 13:49:15][info] Packaging Complete!
[2017/02/23 13:49:15][info] Cleaning up the temporary files... 
[2017/02/23 13:49:15][info] Storage::S3 Started...
[2017/02/23 13:49:15][info] Storing 'backup-bucket/web04/archives_full/2017.02.23.13.47.10/archives_full.tar-aaa'...
[2017/02/23 13:49:15][info]   Initiate Multipart 'backup-bucket/web04/archives_full/2017.02.23.13.47.10/archives_full.tar-aaa'
[2017/02/23 13:49:15][info] CloudIO::Error: Retry #1 of 10
[2017/02/23 13:49:15][info]   Operation: POST 'backup-bucket/web04/archives_full/2017.02.23.13.47.10/archives_full.tar-aaa' (Initiate)
[2017/02/23 13:49:15][info] --- Wrapped Exception ---
[2017/02/23 13:49:15][info] TypeError: no implicit conversion of Array into String
[...]

My configuration:

  store_with S3 do |s3|
    s3.access_key_id     = 'XXX'
    s3.secret_access_key = 'XXX'
    s3.bucket            = 'backup-bucket'
    s3.path              = 'web04'
    s3.region            = 'EU',
    s3.fog_options       = {
      endpoint: 'https://storage.googleapis.com',
    }
  end

My environment:

backup 4.3.0 : ruby 2.3.1p112 (2016-04-26 revision 54768) [x86_64-linux]

Do I need to pass extra options?

Thank you in advance.

cc @rpassis


7 participants