
Adding Support for Docker Library #70

Open
ruffsl opened this issue Sep 30, 2017 · 2 comments


ruffsl commented Sep 30, 2017

This issue is tracking support for releasing Dockerfiles into the Docker Library with superflore.
I'm not yet sure how this should be approached, but here is my current vision of the result:

  1. Upon triggering, the dockerfile auto generation scripts are run
  2. If a file level change is detected, the diff is staged and committed
  3. Then the manifest for the docker library is updated and committed as well
  4. Afterward, an auto PR with the last two commits is submitted to the Dockerfile repo
  5. The PR is then manually reviewed and merged
  6. The reviewer can then copy and paste the manifest to PR towards the Docker Hub Official Library

So given this goal, I think some GitHub API interfacing will be required, for example to submit a PR to an upstream repo without push access. I see some of this capability already in bloom, so perhaps there is a way we could share more infrastructure between bloom and superflore.

Another issue is that I'm not sure how I should call my generation scripts. Should I call them as functions from Python, since my auto-generated code is already Python 3, or should I use subprocess to avoid roping in a host of separate dependencies? I suppose the scripts could also be executed in a container?

Lastly, I'm debating whether this whole PR process for Docker images would be better serviced from a CI job. I have just about all the steps working in Travis already, and I'm wondering whether keeping this agnostic of bloom would let other images that are not entirely ROS-related piggyback on the same CI PR process.

ping: @tfoote @mikaelarguedas @allenh1 @nuclearsandwich

@nuclearsandwich (Contributor) commented:

> Then the manifest for the docker library is updated and committed as well

As a bit of pre-emptive advice: git-commit-tree and git-update-ref plumbing commands are, in my experience, more reliable for committing generated contents than the regular workflow commands since they work by taking a content snapshot directly rather than simulating a human workflow.
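A minimal sketch of that plumbing sequence, composed here as argv lists, with `{tree}` and `{commit}` standing in for the SHAs printed by each preceding step (the runner that substitutes and executes them is left out):

```python
def plumbing_commit(message, parent="HEAD", ref="refs/heads/generated"):
    """Return the git plumbing steps that turn the current index into a
    commit directly, instead of simulating a human add/commit workflow."""
    return [
        # Snapshot the index as a tree object; prints the tree's SHA.
        ["git", "write-tree"],
        # Create a commit object wrapping that tree, with one parent.
        ["git", "commit-tree", "{tree}", "-p", parent, "-m", message],
        # Point the target ref at the new commit.
        ["git", "update-ref", ref, "{commit}"],
    ]
```

Because `write-tree` and `commit-tree` take a content snapshot directly, the result does not depend on working-tree state the way `git add`/`git commit` does.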

> I see some of this capability already in bloom, so perhaps there is a way we could share more infrastructure between bloom and superflore.

I think the easiest thing to do at the moment is to pull in functionality that you need from where it already exists. I'm neutral-to-positive on adding bloom as a dependency to superflore to accomplish this. I prefer it over copying functionality or, at least in the near term, over trying to factor just the GitHub bits into a separate library used by both. If bloom and superflore start sharing more code down the line, that seems a positive to me.

> Lastly, I'm debating if this whole PR process for docker images would be better serviced from a CI job. I have just about all the steps already working in travis already,

I think in a lot of ways we should treat CI as the new cron with the added bonus of slightly more democratic observability and invocation. If it can work in a Travis context it can likely also run in a container on a dedicated CI system should the need arise.

> I'm wondering if keeping this agnostic of bloom, so that other images that are not entirely ros related could piggyback on the same CI PR process. [...] Another issue is that I'm not sure how I should call my generation scripts. Should I call them functionally from python, as my auto generated code is already python3, or should I use subprocess to avoid roping in a host of separate dependencies? I suppose the scripts could be executed in a container?

For infrastructure work I think it's valuable to solve the specific problem in front of you and then factor out more generic tools when the need for sharing arises. In both cases I'd suggest doing the simplest thing first then revisiting the interface when you have a use case in mind.

allenh1 (Contributor) commented Oct 31, 2017

> I see some of this capability already in bloom, so perhaps there is a way we could share more infrastructure between bloom and superflore.

@ruffsl Sorry to get back to you so late on this! This is a good idea, and is a thing I've been putting off for a little while (see #67 ).

In the interim, I've got my own way of doing it that works decently well.

> For infrastructure work I think it's valuable to solve the specific problem in front of you and then factor out more generic tools when the need for sharing arises. In both cases I'd suggest doing the simplest thing first then revisiting the interface when you have a use case in mind.

@nuclearsandwich I strongly agree with this.

@ruffsl After #76 was merged into master, you need to do the following:

  1. Supply an entry point for the generator in setup.py.
  2. Create a generator folder to house your logic.
  3. Write a generate_pkg function like this one.
  4. In the main function referenced by your entry point, simply call superflore.generate_installers with your function.
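Steps 1–3 above might look roughly like this. The entry-point path and the `generate_pkg` signature are illustrative guesses, not the actual superflore API introduced by #76:

```python
# setup.py excerpt (step 1) -- register a console entry point
# (the name and module path below are hypothetical):
#
#     entry_points={
#         'console_scripts': [
#             'superflore-gen-dockerfiles = superflore.generators.docker.run:main',
#         ],
#     }

# In the new generator folder (steps 2-3): a generate_pkg stub that
# emits Dockerfile text for a single package.
def generate_pkg(pkg_name, distro):
    """Render a Dockerfile for one package (illustrative signature)."""
    dep = 'ros-%s-%s' % (distro, pkg_name.replace('_', '-'))
    return ('FROM ros:%s\n'
            'RUN apt-get update && apt-get install -y %s\n') % (distro, dep)

# Step 4: the main() referenced by the entry point would then hand
# generate_pkg to superflore.generate_installers.
```

The exact hook signature should be taken from the generator interface merged in #76 rather than from this sketch.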

Let me know if there's anything else you need from me.
