
Support LimitGroups and MachineLimit Houdini Deadline #38

Merged

Conversation

MustafaJafar
Contributor

Changelog Description

Support LimitGroups and MachineLimit Houdini Deadline

Snippets

Houdini Deadline Settings publisher UI

Submitted Job info

MachineLimit LimitGroups

Testing notes:

  1. Launch Houdini via the Launcher.
  2. Publish Render and Cache to the farm via the publisher while setting limit groups and machine limit.
  3. Ensure the settings are propagated to the Deadline job.

@MustafaJafar MustafaJafar added type: enhancement Improvement of existing functionality or minor addition sponsored This is directly sponsored by a client or community member labels Oct 2, 2024
@MustafaJafar MustafaJafar self-assigned this Oct 2, 2024
@MustafaJafar MustafaJafar linked an issue Oct 2, 2024 that may be closed by this pull request
@MustafaJafar
Contributor Author

one question:

  • Should we strip spaces from the limit groups input? e.g. limit1, limit2 -> limit1,limit2. To be honest, I don't know whether users are aware that the spaces are propagated to Deadline as-is 😅
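
For illustration, a minimal sketch of how that normalization could look (the function name is hypothetical, not part of this PR):

    def sanitize_limit_groups(value: str) -> str:
        """Strip whitespace and drop empty entries from a comma-separated string.

        e.g. "limit1, limit2" -> "limit1,limit2"
        """
        return ",".join(
            group.strip() for group in value.split(",") if group.strip()
        )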

@BigRoy
Contributor

BigRoy commented Oct 2, 2024

We should really make these enums - let me make a PR to follow up on this soon.

@BigRoy
Contributor

BigRoy commented Oct 3, 2024

Also, we should really not be doing this per submitter plug-in, because we'd have to duplicate all of this across every plug-in and it would become a maintenance hell.

@MustafaJafar
Contributor Author

Also, we should really not be doing this per submitter plug-in, because we'd have to duplicate all of this across every plug-in and it would become a maintenance hell.

How do you imagine it?
I think we could have it in a dedicated plugin, similar to the collect pools plugin, but then users wouldn't be able to specify limits for export jobs and render jobs separately.

@BigRoy
Contributor

BigRoy commented Oct 3, 2024

I think we could have it in a dedicated plugin, similar to the collect pools plugin, but then users wouldn't be able to specify limits for export jobs and render jobs separately.

Yes - preferably we bring it down to the basics where it all shares code. Either we make this part of a parent plug-in that the submitters inherit from, so they share the logic regardless, or we need some other type of filtering to define what the default values and the settings are, which may require e.g. profiles filtering.

But we could also just have a "base" implementation for render jobs and a separate one for export jobs. Those would then still be shared across ALL hosts, instead of making this feature Houdini-only as it is now.

@MustafaJafar
Contributor Author

But we could also just have a "base" implementation for render jobs and a separate one for export jobs. Those would then still be shared across ALL hosts, instead of making this feature Houdini-only as it is now.

I'll try again and report back.

@BigRoy
Contributor

BigRoy commented Oct 3, 2024

Ignore enums

Also, ignore my enum statement - I mean, it'd be nice for the "Limit Groups", but not crucial I think. I originally thought the "machine limit" was the "Allow/Deny list" of machines, instead of the maximum number of machines allowed to render the job simultaneously.

For context, I've explained more on how we've exposed that here: https://community.ynput.io/t/deadline-pools-and-groups-dropdown-menu/1770/2?u=bigroy

It also has a link to a code comparison which may help to just 'copy' some of that logic if needed.

Change wording on "Machine Limit" (because this is not it)

I'd change the wording on that, because Machine Limit inside Deadline is much more than that - it is also the allow/deny list of workers (which is what we use a lot, whereas I've never used the limit you're setting here).

For context, this seems to be the setting you're exposing:

[screenshot: the Machine Limit setting in Deadline]

But it's part of a machine limit tab that offers much more:

[screenshot: the full Machine Limit tab in Deadline]

Hence I'd try to stay closer to the Deadline wording: the setting you've exposed now is the number of workers that can render this job simultaneously - not "machine limits" overall.
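
For reference, a hedged illustration of how these settings map to Deadline Job Info keys in a submission payload (key names as in the Deadline manual job submission docs, to the best of my knowledge; the values are made up):

    job_info = {
        # Maximum number of workers that may work on this job at the same time.
        # This is the single value exposed by this PR.
        "MachineLimit": 5,
        # Limit groups the job must acquire before it can render.
        "LimitGroups": "limit1,limit2",
        # The allow/deny list lives in the same "Machine Limit" tab in Deadline,
        # but it is a separate setting from the machine limit value above.
        "Whitelist": "worker-01,worker-02",
    }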

@MustafaJafar MustafaJafar requested a review from antirotor October 3, 2024 19:57
Member

@moonyuet moonyuet left a comment


Looks good to me.

@MustafaJafar
Contributor Author

MustafaJafar commented Oct 11, 2024

I've also discovered that the AYON publish job doesn't get the LimitGroups and MachineLimit.
The way we handle this for other attributes like primaryPool and secondaryPool is to collect them globally and let both the Houdini job and the AYON publish job inherit the specified values.
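
A hedged sketch of that inheritance idea applied to the new keys, assuming both jobs are built as plain job-info dictionaries (the helper name and structure are hypothetical, not the current implementation):

    def inherit_job_limits(render_job_info: dict, publish_job_info: dict) -> dict:
        """Copy limit-related keys from the render job onto the publish job."""
        for key in ("LimitGroups", "MachineLimit"):
            if key in render_job_info:
                publish_job_info.setdefault(key, render_job_info[key])
        return publish_job_info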

Currently, I'm discussing different approaches with @BigRoy.

@BigRoy
Contributor

BigRoy commented Oct 11, 2024

Currently, I'm discussing different approaches with @BigRoy.

Just copy-pasting some of the stuff that came up in the discussion as potential approaches:

Mustafa: Global Collect Plugin

We can have a global collect plugin that adds these publish attributes and then reuse them in the Houdini submitter, which saves us from adding them in each Houdini submitter plugin.
But there are some caveats:

  1. We don't have consistent render target names across different repos, so it may be hard to add a condition that shows the export publish attributes based on the instance data.
  2. We still need to add code to the Houdini submitters so that we can update the job info properly.

Hmm yes - I've thought about that too.
I agree that targeting the right products may be very tricky.
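
For illustration only, a rough sketch of what such a global collector could look like, assuming ayon_core's AYONPyblishPluginMixin, TextDef and NumberDef exist under these names and that farm instances can be targeted by family - none of this is the PR's actual implementation:

    import pyblish.api

    from ayon_core.lib import NumberDef, TextDef  # assumed import path
    from ayon_core.pipeline.publish import AYONPyblishPluginMixin  # assumed


    class CollectDeadlineJobLimits(pyblish.api.InstancePlugin,
                                   AYONPyblishPluginMixin):
        """Expose LimitGroups/MachineLimit once for all farm instances."""

        order = pyblish.api.CollectorOrder + 0.49
        label = "Collect Deadline Job Limits"
        families = ["render.farm"]  # hypothetical filter, see caveat 1 above

        @classmethod
        def get_attribute_defs(cls):
            return [
                TextDef(
                    "limit_groups",
                    label="Limit Groups",
                    default="",
                    tooltip="Comma separated Deadline limit group names",
                ),
                NumberDef(
                    "machine_limit",
                    label="Machine Limit",
                    default=0,
                    decimals=0,
                ),
            ]

        def process(self, instance):
            attr_values = self.get_attr_values_from_data(instance.data)
            instance.data["limitGroups"] = attr_values.get("limit_groups", "")
            instance.data["machineLimit"] = attr_values.get("machine_limit", 0)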

Roy: Inheritable base class

Another approach would be making an "inheritable interface" that we can put on each submitter, e.g. inheriting from a DeadlineJobOptions class which defines the attribute defs and has a get_deadline_job_info_options method that returns the relevant key/values for those attributes, ready to be added to the payload.
The nice thing is that we're then still able to set settings for those "configurable options" per plug-in in the settings as usual.
Plus it'd also work for the PublishJob plugins too.
However, I'm not sure I like it completely.

Roy: Attributes per instance (new functionality)

Another way is to use the "attributes per instance" PR and some other mechanic to decide whether to display the attributes for an instance, e.g. a Creator adding some transient data like supports_farm=True that we then just check for.
But I'm not totally convinced that's the best way either.
Plus this last approach would also struggle a bit with supporting multiple separate jobs (export job, render job and publish job settings).


Anyway, the best solution would be ONE place where we define the logic once and then distribute it to the different plug-ins/instances/integrations.

So preferably we have something separate:

from typing import Any, Dict, List


class ConfigurableDeadlineOptions:
    """Inherit a pyblish plug-in from this to support customizable Deadline Job Options.

    Attribute definitions are created for each job label in the `jobs` class
    attribute so that one plug-in can support multiple job configurations.

    Then, for the plug-in, also expose settings on the server using a
    `ConfigurableDeadlineOptionsModel`. That allows a studio admin to configure
    all the default values and also which attributes are 'user-editable'
    as publisher attributes.
    """
    jobs: List[str] = ["render"]

    settings: Dict[str, Any] = {}

    def get_configured_job_info_attrs(self, data: Dict[str, Any], job: str) -> Dict[str, Any]:
        """Return the Job Info configuration for the named job."""
        # `get_attr_values` is assumed to come from the publish plug-in mixin.
        attr_values = self.get_attr_values(data)
        # Pseudocode: poor man's filtering of the attribute def values for the
        # job. This should also end up using the 'defaults' from `settings`
        # that were set to be non-user-overridable (and hence may have no
        # attribute defs).
        return {key: value for key, value in attr_values.items()
                if key.startswith(job)}

    def get_instance_attr_defs(self):
        attr_defs = []
        for job in self.jobs:
            attr_defs.extend(self._get_attr_defs_for_job(job))
        return attr_defs

    def _get_attr_defs_for_job(self, job: str):
        # Placeholder: build the attribute definitions (limit groups,
        # machine limit, ...) for a single named job.
        return []

Here's a quick pseudocode sketch we could inherit from.
The docstring describes how we could also have a base settings model class to "share" the settings model between the plug-ins, so we don't need to duplicate it every time.
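
And a hypothetical submitter consuming it; the plug-in name, the build_job_info method and the prefixed attribute keys are illustrative only:

    class SubmitHoudiniRenderDeadline(ConfigurableDeadlineOptions):
        jobs = ["export", "render"]

        def build_job_info(self, instance, job):
            job_info = {}
            # Merge the configured / user-edited values for this job
            # (e.g. keys prefixed with "render" or "export") into the
            # Deadline payload for that job.
            job_info.update(
                self.get_configured_job_info_attrs(instance.data, job)
            )
            return job_info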

@moonyuet moonyuet self-requested a review October 14, 2024 13:45
Member

@moonyuet moonyuet left a comment


Tested it one more time.
Looks good.

@MustafaJafar
Contributor Author

Merging this PR; I'll create an issue to track BigRoy's suggestions.

@MustafaJafar MustafaJafar merged commit 321d404 into develop Oct 14, 2024
1 check passed

Successfully merging this pull request may close: AY-6727_Houdini expose Deadline options