
docs: Update links and clean up documentation #577

Merged 1 commit on Sep 27, 2024
2 changes: 1 addition & 1 deletion docs/build/glossary/terms/masked-language-modeling.md
@@ -6,7 +6,7 @@ category: AIML

Masked Language Modeling, or “MLM,” is a pre-training technique used in natural language processing (NLP) to enable a model to predict masked tokens within an input sequence. It is an approach that helps AI models learn a deep understanding of language context and structure without requiring labeled data, making MLM NLP an unsupervised learning method.

-Unlike traditional language models that predict the next token in a sequence, MLM can utilize both the previous and subsequent tokens to predict a masked token. As a result, the model is able to better understand the [context](https://docs.pieces.app/build/glossary/terms/Pieces%20Specific/context) surrounding each word.
+Unlike traditional language models that predict the next token in a sequence, MLM can utilize both the previous and subsequent tokens to predict a masked token. As a result, the model is able to better understand the [context](https://docs.pieces.app/build/glossary/terms/context) surrounding each word.

## Benefits of Masked Language Modeling
- **Deep contextual understanding** - MLM trains using both left and right contexts of a word, leading to a nuanced understanding of language structure and usage, superior to traditional unidirectional models.
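To make the masked-language-modeling passage above concrete, here is a minimal sketch of masked-token prediction in TypeScript. It assumes the `@xenova/transformers` package (Transformers.js) and the `Xenova/bert-base-uncased` checkpoint, neither of which is part of the docs changed in this PR:

```typescript
// Minimal sketch of masked-token prediction, assuming the
// @xenova/transformers package (Transformers.js) and the
// 'Xenova/bert-base-uncased' checkpoint are available.
import { pipeline } from '@xenova/transformers';

async function main(): Promise<void> {
  // BERT-style models are pre-trained with MLM, so the fill-mask
  // pipeline can score candidate tokens for the [MASK] position
  // using context on both sides of the gap.
  const unmasker = await pipeline('fill-mask', 'Xenova/bert-base-uncased');

  const predictions = await unmasker('The capital of France is [MASK].');

  // Each entry carries a candidate token and its score.
  console.log(predictions);
}

main().catch(console.error);
```

The model ranks candidates for the `[MASK]` position using tokens on both sides of it, which is the bidirectional behavior the glossary entry describes.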
2 changes: 1 addition & 1 deletion docs/build/reference/typescript/models/05-Activity.md
@@ -5,7 +5,7 @@ title: Activity | TypeScript SDK

# Activity

-consider a rename to Event? That being said if we go with event we need to think about a word to pre/post fix event because it is likely to be a reserved word. additional documentation: https://www.notion.so/getpieces/Activity-4da8de193733441f85f87b510235fb74 Notes: - user/asset/format are all optional, NOT required that one of these are present. - if mechanism == internal we will not display to the user. Thoughts around additional properties. - hmm dismissed array here, that is an array of strings, where the string is the userId that dismissed this notification? or we could potentially do it based on the os_ID. -
+consider a rename to Event? That being said if we go with event we need to think about a word to pre/post fix event because it is likely to be a reserved word. Notes: - user/asset/format are all optional, NOT required that one of these are present. - if mechanism == internal we will not display to the user. Thoughts around additional properties. - hmm dismissed array here, that is an array of strings, where the string is the userId that dismissed this notification? or we could potentially do it based on the os_ID. -

## Properties

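The note in the hunk above loosely describes the shape of the Activity model. As a purely hypothetical sketch of those constraints (not the actual Pieces TypeScript SDK definition, and not part of this PR), they might read like this:

```typescript
// Hypothetical sketch of the constraints described in the Activity notes;
// names and types are illustrative only, not the actual Pieces SDK model.
interface ActivitySketch {
  // user/asset/format are all optional; it is NOT required that any
  // one of them is present.
  user?: string;
  asset?: string;
  format?: string;

  // The notes only name 'internal'; other values are left unspecified here.
  // When mechanism === 'internal', the activity is not shown to the user.
  mechanism: string;

  // IDs of the users (or possibly OS-level IDs) that dismissed this
  // notification.
  dismissed?: string[];
}

// The "do not display if mechanism == internal" rule from the notes.
function isVisibleToUser(activity: ActivitySketch): boolean {
  return activity.mechanism !== 'internal';
}
```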
2 changes: 0 additions & 2 deletions docs/product-highlights-and-benefits/live-context.mdx
@@ -45,8 +45,6 @@ You also can clear Workstream Pattern Engine data at any time, from a specific t

Once you've enabled the Workstream Pattern Engine, go about your usual work for a few minutes, and then head to the Copilot Chats view in the Pieces for Developers Desktop App and select “New Chat.” In the “Set Context” section, tap the option labeled “Live Context.” Like before, you can leverage any of our available models, on-device or cloud, to engage with Live Context in the Pieces Copilot, and you can use it across your toolchain to carry on conversations in your favorite IDE and browser.

-<Video type={'gif'} src={'/wpe/using-live-context.gif'} alt={'WPE Using Live Context'}/>

:::info

The Workstream Pattern Engine must be turned on to use Live Context. We’ve made this super easy to do from wherever you’re getting started.