
Commit

Changed path for artifacts
Pete Rodriguez committed Oct 7, 2024
1 parent cb3ace0 commit bc48e52
Showing 3 changed files with 7 additions and 7 deletions.
2 changes: 1 addition & 1 deletion 068-AzureOpenAIApps/Coach/Solution-04.md
@@ -4,7 +4,7 @@

## Notes & Guidance

The students will use the Azure CLI to upload the submission documents located in `challenge-artifacts/contoso-education/submissions`:<br>
The students will use the Azure CLI to upload the submission documents located in `artifacts/contoso-education/submissions`:<br>
Example: <br>
`az storage blob upload-batch --account-name contosopeterod1storage -d submissions -s .`

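When verifying many student environments, the one-line example above can be scripted. A minimal Python sketch that assembles the same `az storage blob upload-batch` invocation (the account name is the coach's example above; actually executing the command via `subprocess` is left commented out, since it requires the Azure CLI and an active login):

```python
import shlex

def build_upload_batch_cmd(account: str, container: str, source: str) -> list[str]:
    """Assemble the `az storage blob upload-batch` invocation as an argv list."""
    return [
        "az", "storage", "blob", "upload-batch",
        "--account-name", account,       # coach's example account; use your own
        "--destination", container,
        "--source", source,
    ]

cmd = build_upload_batch_cmd(
    "contosopeterod1storage",
    "submissions",
    "artifacts/contoso-education/submissions",
)
print(shlex.join(cmd))
# To execute for real: subprocess.run(cmd, check=True)
```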
8 changes: 4 additions & 4 deletions 068-AzureOpenAIApps/Student/Challenge-01.md
@@ -23,9 +23,9 @@ This approach enhances the efficiency and effectiveness of various applications,

Contoso Yachts is a 40-person organization that specializes in booking tours in Contoso Islands.

There are documents (from the **ContosoAIAppsBackend/challenge-artifacts/documents/contoso-islands** folder in your Resources) that needs to be uploaded to the **government** container in the Azure Blob Storage account.
There are documents (from the **artifacts/documents/contoso-islands** folder in your Resources) that need to be uploaded to the **government** container in the Azure Blob Storage account.

There are also some JSON documents (from the **ContosoAIAppsBackend/challenge-artifacts/cosmos-db/contoso-yachts** that needs to be uploaded to the corresponding Azure **yachts** Cosmos DB containers respectively.
There are also some JSON documents (from the **artifacts/cosmos-db/contoso-yachts** folder) that need to be uploaded to the corresponding Azure **yachts** Cosmos DB containers.

You can use the **az storage blob upload** command examples below to upload the documents to Azure Blob Storage.

@@ -65,7 +65,7 @@ az login --use-device-code
az login --service-principal -u <app-id> -p <password-or-cert> --tenant <tenant>

# navigate to document directory
cd ContosoAIAppsBackend/challenge-artifacts/documents/contoso-islands
cd artifacts/documents/contoso-islands

````

@@ -95,7 +95,7 @@ az storage blob upload-batch --account-name contosopeterod1storage -d government

### Uploading Documents to Azure Cosmos DB

The contents of the Yacht details are stored in the directory **Challenge-00/ContosoAIAppsBackend/challenge-artifacts/cosmos-db/contoso-yachts**
The yacht detail documents are stored in the directory **artifacts/cosmos-db/contoso-yachts**

Manually copy and paste the contents of each JSON file in this location into the REST client in **rest-api-yachts-management.http**, then send each document via the REST API to Cosmos DB. There are five JSON files.
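Before pasting, it can help to read and validate all five documents in one pass. A small sketch that maps each JSON file name in the folder to its parsed body (the bodies are what you paste into `rest-api-yachts-management.http`; no Cosmos DB endpoint is assumed here, and the demo folder below is a throwaway so the sketch is self-contained):

```python
import json
import tempfile
from pathlib import Path

def load_yacht_documents(folder):
    """Map each *.json file name in the given folder to its parsed contents.

    Raises json.JSONDecodeError early if any file is malformed, before you
    start pasting documents into the REST client.
    """
    return {
        path.name: json.loads(path.read_text())
        for path in sorted(Path(folder).glob("*.json"))
    }

# Demo against a temporary folder (the real one is artifacts/cosmos-db/contoso-yachts):
with tempfile.TemporaryDirectory() as tmp:
    sample = Path(tmp) / "yacht-100.json"
    sample.write_text(json.dumps({"id": "100", "name": "Contoso Explorer"}))
    print(load_yacht_documents(tmp))
```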

4 changes: 2 additions & 2 deletions 068-AzureOpenAIApps/Student/Challenge-03.md
@@ -18,7 +18,7 @@ Your goal is to design and create a pipeline that can process all the historical

You can use any programming language and Azure services of your choice to implement the solution. Remember to follow best practices for coding and architecture design.

There are 20 sample documents in the **Student/Resources/Challenge-00/ContosoAIAppsBackend/challenge-artifacts/contoso-education** folder:
There are 20 sample documents in the **artifacts/contoso-education** folder:

- F01-Civics-Geography and Climate
- F02-Civics-Tourism and Economy
@@ -104,7 +104,7 @@ The first 3 extractor models are straightforward. However, in the 4th document type
}
]
````
After training your models, you can test the form processing pipeline by uploading the files located locally in `ContosoAIAppsBackend/challenge-artifacts/contoso-education/submissions` to the `submissions` container in your storage account. Refer back to CH0 for uploading local files into your storage account. This will trigger Azure Functions, which have been created for you in the backend. Azure Functions will classify, extract, and store the results in CosmosDB.
After training your models, you can test the form processing pipeline by uploading the files located in `artifacts/contoso-education/submissions` to the `submissions` container in your storage account. Refer back to CH0 for uploading local files into your storage account. This triggers the Azure Functions that have been created for you in the backend, which classify the documents, extract the fields, and store the results in Cosmos DB.
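As a quick sanity check before uploading, you can group the submission file names by their leading form code. The `F0x` prefix convention follows the sample-document names listed above; this is only an illustration for eyeballing your test set, not the classifier the pre-built Azure Functions use:

```python
from collections import defaultdict

def group_by_form_type(filenames):
    """Group submission file names by their leading form code (e.g. 'F01')."""
    groups = defaultdict(list)
    for name in filenames:
        code = name.split("-", 1)[0]   # text before the first hyphen
        groups[code].append(name)
    return dict(groups)

samples = [
    "F01-Civics-Geography and Climate.pdf",   # hypothetical file extensions
    "F02-Civics-Tourism and Economy.pdf",
]
print(group_by_form_type(samples))
```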

Your solution should:

