
Merge pull request #947 from NeuromatchAcademy/release-v1.0.1
Update notebook-pr.yaml
iamzoltan authored Apr 9, 2024
2 parents bb1f25c + 2b49fb2 commit 5a094fa
Showing 4 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/notebook-pr.yaml
@@ -86,7 +86,7 @@ jobs:
python ci/verify_exercises.py $nbs --c "$COMMIT_MESSAGE"
python ci/make_pr_comment.py $nbs --branch $branch --o comment.txt
- # This package is outdated and no longer maintained
+ # This package is outdated and no longer maintained.
# - name: Add PR comment
# if: "!contains(env.COMMIT_MESSAGE, 'skip ci')"
# uses: machine-learning-apps/[email protected]
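The commented-out "Add PR comment" step above relied on the unmaintained machine-learning-apps/pr-comment action. One possible replacement, sketched here only as an assumption (the environment variable names are illustrative and not part of this commit), is to post the generated comment.txt directly through the GitHub REST API:

# Hypothetical replacement for the deprecated pr-comment action: post the
# comment.txt produced by ci/make_pr_comment.py via the GitHub REST API.
# OWNER_REPO (e.g. "owner/repo"), PR_NUMBER, and GITHUB_TOKEN are assumed
# to be supplied by the workflow environment; they are illustrative names.
import os
import requests  # assumes the requests package is installed in the runner

owner_repo = os.environ["OWNER_REPO"]
pr_number = os.environ["PR_NUMBER"]
token = os.environ["GITHUB_TOKEN"]

with open("comment.txt") as f:
    body = f.read()

resp = requests.post(
    f"https://api.github.com/repos/{owner_repo}/issues/{pr_number}/comments",
    headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    },
    json={"body": body},
)
resp.raise_for_status()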
@@ -301,7 +301,7 @@
"\n",
"In classical transformer systems, a core principle is encoding and decoding. We can encode an input sequence as a vector (that implicitly codes what we just read). And we can then take this vector and decode it, e.g., as a new sentence. So a sequence-to-sequence (e.g., sentence translation) system may read a sentence (made out of words embedded in a relevant space) and encode it as an overall vector. It then takes the resulting encoding of the sentence and decodes it into a translated sentence.\n",
"\n",
"In modern transformer systems, such as GPT, all words are used parallelly. In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)."
"In modern transformer systems, such as GPT, all words are used in parallel. In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)."
]
},
{
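To make the encode/decode idea from the notebook text concrete, here is a minimal sketch of a sequence-to-sequence model, assuming PyTorch (the toy vocabulary, sizes, and random token ids are illustrative assumptions, not taken from the notebooks):

# Minimal encoder-decoder sketch: encode a source "sentence" into vectors,
# then decode a target sequence against that encoding. All sizes are toy
# assumptions for illustration.
import torch
import torch.nn as nn

VOCAB, D_MODEL = 100, 32                  # assumed toy vocabulary and width
embed = nn.Embedding(VOCAB, D_MODEL)
model = nn.Transformer(d_model=D_MODEL, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)
to_logits = nn.Linear(D_MODEL, VOCAB)

src = torch.randint(0, VOCAB, (1, 7))     # source sentence: 7 token ids
tgt = torch.randint(0, VOCAB, (1, 5))     # partial translation: 5 token ids

# The encoder turns src into an internal representation; the decoder
# cross-attends to that encoding while processing tgt.
out = model(embed(src), embed(tgt))
print(to_logits(out).shape)               # torch.Size([1, 5, 100])

Each decoder position yields logits over the vocabulary, from which the next translated word can be chosen.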
@@ -301,7 +301,7 @@
"\n",
"In classical transformer systems, a core principle is encoding and decoding. We can encode an input sequence as a vector (that implicitly codes what we just read). And we can then take this vector and decode it, e.g., as a new sentence. So a sequence-to-sequence (e.g., sentence translation) system may read a sentence (made out of words embedded in a relevant space) and encode it as an overall vector. It then takes the resulting encoding of the sentence and decodes it into a translated sentence.\n",
"\n",
"In modern transformer systems, such as GPT, all words are used parallelly. In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)."
"In modern transformer systems, such as GPT, all words are used in parallel. In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)."
]
},
{
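The "all words are used in parallel" point can likewise be sketched in a few lines, again assuming PyTorch with illustrative toy values: a single masked self-attention step updates every position of the sequence at once, rather than reading word by word.

# Minimal sketch (illustrative, not from the notebooks) of parallel token
# processing: self-attention scores all position pairs in one batched
# matrix computation instead of stepping through the sequence.
import torch
import torch.nn.functional as F

seq_len, d = 6, 8                         # assumed toy sequence length and width
x = torch.randn(seq_len, d)               # one embedded "sentence"

scores = x @ x.T / d**0.5                 # (seq_len, seq_len): all pairs at once
causal = torch.tril(torch.ones(seq_len, seq_len)).bool()
scores = scores.masked_fill(~causal, float("-inf"))  # GPT-style causal mask
out = F.softmax(scores, dim=-1) @ x       # every position updated in parallel
print(out.shape)                          # torch.Size([6, 8])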
@@ -301,7 +301,7 @@
"\n",
"In classical transformer systems, a core principle is encoding and decoding. We can encode an input sequence as a vector (that implicitly codes what we just read). And we can then take this vector and decode it, e.g., as a new sentence. So a sequence-to-sequence (e.g., sentence translation) system may read a sentence (made out of words embedded in a relevant space) and encode it as an overall vector. It then takes the resulting encoding of the sentence and decodes it into a translated sentence.\n",
"\n",
"In modern transformer systems, such as GPT, all words are used parallelly. In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)."
"In modern transformer systems, such as GPT, all words are used in parallel. In that sense, the transformers generalize the encoding/decoding idea. Examples of this strategy include all the modern large language models (such as GPT)."
]
},
{
