
PatchTSTModel, PatchTSTConfig, & Trainer #1047

Open
Datakunskap opened this issue Nov 21, 2024 · 3 comments · May be fixed by huggingface/optimum#2101 or #1048
Labels
enhancement New feature or request

Comments

@Datakunskap

Feature request

Implementation of:
PatchTSTModel, PatchTSTConfig, & Trainer

Source:
https://github.com/huggingface/transformers/blob/v4.46.3/src/transformers/models/patchtst/modeling_patchtst.py#L1142

Motivation

https://github.com/huggingface/transformers

Your contribution

Submitting a PR

@xenova
Collaborator

xenova commented Nov 21, 2024

Opened PR here. Let me know if that works for you 👍

@Datakunskap
Author

@xenova You're amazing! We were about to switch over to Python. You saved our TypeScript devs. Thank you!

We have one more request to bring it on par with the Python transformers library. Would it be possible to add the following models as well?

Kindest regards,
Datakunskap

@xenova
Collaborator

xenova commented Nov 23, 2024

Sure thing! I've added them here.

Example usage for PatchTSMixerForPrediction can be found here.

Note that for the config, you can simply do:

import { AutoConfig } from '@huggingface/transformers';

const config = await AutoConfig.from_pretrained('hf-internal-testing/tiny-random-PatchTSTModel'); 
console.log(config);
