
Commit 0732672
describe TGD vs SGD and TextLoss
mertyg committed Jun 12, 2024
1 parent 4305fe3 commit 0732672
Showing 1 changed file with 2 additions and 2 deletions.
README.md: 4 changes (2 additions & 2 deletions)
@@ -36,13 +36,13 @@ Initial `punchline` from the model:
Not bad, but maybe GPT-4o can do better! Let's optimize the punchline using TextGrad. In this case, `punchline` would be the variable we want to optimize and improve.
```python
-# Step 2: Define the loss function and the optimizer, just like in PyTorch!
+# Step 2: Define the loss function and the optimizer, just like in PyTorch! Here, we don't have SGD; instead, we have TGD (Textual Gradient Descent), which works with "textual gradients". TextLoss is a loss function specified in natural language that describes how we want to evaluate the punchline.
loss_fn = tg.TextLoss("We want to have a super smart and funny punchline. Is the current one concise and addictive? Is the punch fun, makes sense, and subtle enough?")
optimizer = tg.TGD(parameters=[punchline])
```

```python
-# Step 3: Do the loss computation, backward pass, and update the punchline.
+# Step 3: Do the loss computation, backward pass, and update the punchline. Exact same syntax as PyTorch!
loss = loss_fn(punchline)
loss.backward()
optimizer.step()
```
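
For context, the `punchline` variable optimized above is created in an earlier Step 1 of the README that this diff does not touch. Below is a minimal sketch of what that setup could look like; the engine name and the initial punchline string are assumptions for illustration, using TextGrad's `tg.set_backward_engine` and `tg.Variable`.

```python
import textgrad as tg

# Assumed setup (not part of this commit): choose the LLM engine that will
# produce the textual gradients during the backward pass.
tg.set_backward_engine("gpt-4o")  # engine choice is an assumption

# Step 1 (sketch): the variable to optimize. requires_grad=True marks it as
# trainable, and role_description tells TextGrad what this piece of text is.
punchline = tg.Variable(
    "An initial punchline generated by the model.",  # hypothetical value
    requires_grad=True,
    role_description="a concise, funny punchline for the joke",
)
```

After `optimizer.step()` in Step 3, the revised text should be available on the variable itself, e.g. via `punchline.value`.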
