
Writing to text #191

Open
serget2 opened this issue Dec 15, 2024 · 9 comments
Labels: enhancement (New feature or request)

Comments


serget2 commented Dec 15, 2024

Hi, it's me again.

I tried to make an archiver for image prompts.
image

I gave it these rules to follow:
"Only add new prompts.
Clearly mark the end of the prompts with an empty line, followed
by a single line of 100 underscores, then another empty line.
Do not overwrite or delete the previous prompts, only add to the list.
If the prompt is new, add it after the previous prompt and end mark, in the same txt file.
If the prompt already exists, do not add it again."

But it keeps overwriting the prompt that it had, instead of adding it to the next free line.
Am I doing something wrong? Did I forget a rule? Or is it not meant to work this way?

shhlife (Collaborator) commented Dec 15, 2024

I think you would need to feed the text file into the agent as well - right now it doesn't know what the text file contains. Can you try loading the text file & adding it to the agent in the input_string section?

This may also be something where just pure Python would work best for taking a text file, breaking it up into sections, and only inserting the new prompt if it doesn't already exist... for deterministic stuff like that, LLMs may not be the best tool. I really gotta get on that CodeExecution node. :)
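For illustration, here is a minimal sketch of that deterministic approach in plain Python, outside of any node. It follows the separator format from the rules quoted above (blank line, 100 underscores, blank line); the file name and the example prompt are placeholders.

```python
# Illustrative sketch only -- not a Griptape node.
# Appends a prompt to an archive file using the format from the issue:
# prompt, blank line, a line of 100 underscores, blank line.
from pathlib import Path

SEPARATOR_LINE = "_" * 100

def append_if_new(archive_path: str, prompt: str) -> bool:
    """Append `prompt` unless it is already in the archive. Returns True if added."""
    path = Path(archive_path)
    existing = path.read_text(encoding="utf-8") if path.exists() else ""

    # Split on the underscore line and compare stripped sections,
    # so surrounding whitespace doesn't hide a duplicate.
    sections = [s.strip() for s in existing.split(SEPARATOR_LINE)]
    if prompt.strip() in sections:
        return False

    # Append-only open mode: previous prompts are never overwritten.
    with path.open("a", encoding="utf-8") as f:
        f.write(prompt.strip() + "\n\n" + SEPARATOR_LINE + "\n\n")
    return True

# Example (placeholder prompt):
# append_if_new("griptape_output_1.txt", "a lighthouse at dusk, volumetric fog")
```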

serget2 (Author) commented Dec 15, 2024

I loaded the txt into the agent. Now I have to wait until my prompts are done; I'll let you know as soon as I queue new ones.

serget2 (Author) commented Dec 15, 2024

If you are going to make a code execution node, a .csv file would be handy as an output as well.

serget2 (Author) commented Dec 16, 2024

It works; it's logging all the prompts one after another.
fluxturbo_8_1 0_747
griptape_output_1.txt

serget2 (Author) commented Dec 18, 2024

Still a small problem, though: when I hit the limit where the LLM runs out of quota (Gemini 1.5), it overwrites my prompt file with a blank file containing only the error "429 Resource has been exhausted (e.g. check quota).". Luckily I backed most of them up, and I can do that before I hit the limit, but letting it run overnight unsupervised made me lose a few prompts. Glad I still had the old setup running, so they are still in another txt file.

I am going to try it now with my local LLama_3.2_vision, which doesn't require a quota, to see if it understands what the agents need to do and keeps working without overwriting. (I would not mind it giving me the error, but it shouldn't erase all that previous work.)
Is there a way to add the prompts directly into the PNG files?
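As a side note, if the archive were written from a plain Python step rather than the agent's file output, a small guard like the sketch below would keep a quota error from replacing the file: open append-only and skip anything that looks like an API error. The error check is an assumption based on the 429 message quoted above.

```python
# Illustrative safeguard sketch -- not part of Griptape.
def safe_append(archive_path: str, agent_output: str) -> bool:
    """Append agent output to the archive unless it looks like an error message."""
    text = agent_output.strip()
    # Heuristic based on the error quoted in this thread; adjust for the
    # provider errors you actually see.
    if not text or text.startswith("429") or "Resource has been exhausted" in text:
        return False
    # Append-only, so previous prompts can never be wiped out.
    with open(archive_path, "a", encoding="utf-8") as f:
        f.write(text + "\n")
    return True
```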

shhlife (Collaborator) commented Dec 20, 2024

To write the metadata to a PNG? Possibly! @griptapeOsipa, would your metadata tool be able to do that? Write the prompt that generated it into an image?

@griptapeOsipa

*Can* and *effectively* might be different answers here. It can most definitely write text into PNG image data. That said, the current version can only write to existing fields, not create new ones, and fields seem to have built-in character limits that are not all consistent. A long prompt stored in the wrong place might get truncated. But the best thing to do is try!
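For reference, writing a prompt into a PNG text chunk can also be done with plain Pillow, independent of the metadata tool discussed here; the key name "prompt" below is just an example.

```python
# Illustrative sketch using plain Pillow, independent of the metadata tool above.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def embed_prompt(src_path: str, dst_path: str, prompt: str) -> None:
    """Save a copy of the PNG with the prompt stored in a text chunk named 'prompt'."""
    img = Image.open(src_path)
    meta = PngInfo()
    meta.add_text("prompt", prompt)  # key name is just an example
    img.save(dst_path, pnginfo=meta)

def read_prompt(path: str) -> str | None:
    """Read the 'prompt' text chunk back, if present."""
    return Image.open(path).text.get("prompt")
```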

shhlife (Collaborator) commented Jan 3, 2025

Okay, so just to clarify what we want to do here... are you wanting a node to add a line to a CSV?

shhlife (Collaborator) commented Jan 3, 2025

btw - with the new python run nodes, you can do something like this:

image

where you pass it JSON text with "name" and "prompt" as the two fields, and it'll save it to a file.

This particular script will overwrite an existing row with the same name, or create a new row with a new name.

Here's the workflow:
save_to_csv.json
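For readers without the attachment, here is a rough sketch of the upsert-by-name behavior described above; the actual logic lives in save_to_csv.json, and the file name prompts.csv and the example values are placeholders.

```python
# Illustrative sketch of the upsert-by-name behavior described above;
# the actual workflow is the attached save_to_csv.json.
import csv
import json
from pathlib import Path

def upsert_prompt(json_text: str, csv_path: str = "prompts.csv") -> None:
    """Insert or update a CSV row keyed by "name" from JSON like
    {"name": "...", "prompt": "..."}."""
    record = json.loads(json_text)
    path = Path(csv_path)

    # Load existing rows into a dict keyed by name.
    rows: dict[str, str] = {}
    if path.exists():
        with path.open(newline="", encoding="utf-8") as f:
            rows = {row["name"]: row["prompt"] for row in csv.DictReader(f)}

    # Overwrite an existing row with the same name, or add a new one.
    rows[record["name"]] = record["prompt"]

    with path.open("w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "prompt"])
        writer.writeheader()
        for name, prompt in rows.items():
            writer.writerow({"name": name, "prompt": prompt})

# Example (placeholder values):
# upsert_prompt('{"name": "castle_01", "prompt": "a castle on a floating island"}')
```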

shhlife self-assigned this on Jan 3, 2025
shhlife added the enhancement (New feature or request) label on Jan 3, 2025