This guide will walk you through the procedure to manually create the Personalized Offers Solution. You can get an overview of the project here and see some details on how to work with the deployed solution here.
The necessary materials are included in the src folder in this repository.
The steps described later in this guide require the following prerequisites:
- An Azure subscription with login credentials
- Sufficient quota to create:
- 1 Data Lake Store
- 4 Stream Analytics jobs with a total of 43 Streaming Units
- 1 Event Hub with 20 Throughput Units, 16 partitions, and 4 Consumer Groups
- 1 DocumentDB database with 6 collections (3 of them partitioned), each provisioned with 10,000 RUs and 10 GB of storage
Ensure your subscription has enough Data Lake Store and Streaming Unit quota available before provisioning. Consider deleting any unused Data Lake Stores from your subscription, or contact Azure Support if you need the limits increased.
The architecture diagram shows the Azure services deployed by the Personalized Offers Solution on the Azure AI platform, and how they connect to each other in the end-to-end solution.
The following are the steps to deploy the end-to-end solution.
This tutorial will refer to files available in the Manual Deployment Guide section of the Cortana Intelligence Personalized Offers Git repository. You can download all of these files at once by clicking the "Clone or download" button.
You can download or view individual files by navigating through the repository folders. If you choose this option, be sure to download the "raw" version of each file by clicking the filename to view it, then clicking "Download". You will also find a settings.txt file in the src folder that can be used to keep track of settings you will need for configuring the Azure Functions. The names provided in the settings.txt file correspond to the names of the settings, and the entries you add will be the values.
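Since settings.txt will accumulate a setting name and a value on each line, a small helper can read those pairs back later (for example, when filling in the Azure Function application settings). This is only a sketch: it assumes one `name value` pair per line, which may differ from the exact layout of the provided file.

```python
def parse_settings(text):
    """Parse settings lines of the form 'name value' into a dict.

    A sketch only: assumes one whitespace-separated name/value pair
    per line; the provided settings.txt may use a different layout.
    """
    settings = {}
    for line in text.splitlines():
        parts = line.strip().split(None, 1)  # split on first whitespace only
        if len(parts) == 2:
            settings[parts[0]] = parts[1]
    return settings

sample = "resourceGroupName myuniquestr\nregion South Central US"
print(parse_settings(sample)["region"])  # values may contain spaces
```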
You will need a unique string to identify your deployment because some Azure services such as Azure Storage require a unique name for each instance. We suggest you use only letters and numbers in this string. The length of your unique string should not be greater than 9 characters.
We suggest you use "[UI]poffer[N]", where [UI] are your initials, N is a random integer that you choose, and all characters are lowercase. Open your settings.txt file and write down your unique string.
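The naming convention above can be sketched as a small helper that builds a candidate string and checks the constraints (lowercase letters and digits only, at most 9 characters). The initials and digit below are illustrative, not values from the guide.

```python
import random
import string

def make_unique_string(initials, n=None):
    """Build a candidate unique string of the form [initials]poffer[N]
    and verify the constraints stated in the guide."""
    if n is None:
        n = random.randint(0, 9)
    s = (initials + "poffer" + str(n)).lower()
    # Azure Storage account names allow only lowercase letters and digits.
    assert all(c in string.ascii_lowercase + string.digits for c in s)
    assert len(s) <= 9, "unique string must be at most 9 characters"
    return s

# Hypothetical initials "jd" and digit 7, for illustration only:
print(make_unique_string("jd", 7))
```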
- Log into the Azure Management Portal.
- Click the Resource groups button, and then click the + Add button to add a resource group.
- Enter your unique string for the resource group and choose your subscription.
- For Resource group location, choose one of the following, as these are the locations that offer all of the Azure services used in this guide (with the exception of Azure Data Factory, which need not be in the same region):
- South Central US
- West Europe
- Southeast Asia
- Click Create
Please open your settings.txt file and save the information in the form of the following table, replacing the content in [] with the actual values.
Azure Resource Group | |
---|---|
resourceGroupName | [unique] |
region | [region] |
In this tutorial, all resources will be created in the resource group you just created. You can easily access these resources from the resource group overview page, which can be accessed as follows:
- Log into the Azure Management Portal.
- If you pinned the resource group when creating it, you can open it from the Dashboard to see all of the associated resources. Otherwise:
- Click the Resource groups button.
- Choose the subscription your resource group resides in.
- Search for (or directly select) your resource group from the list of resource groups.
Note that you may need to close the resource description page to add new resources.
In the following steps, if any entry or item is not mentioned in the instructions, please leave it as the default value.
In this section we will go through the steps necessary to create the storage account and a blob associated with it, and upload some files to the blob. Along the way we will note down the values that we will need later in our settings.txt file.
- Go to the Azure Portal and navigate to the resource group you just created.
- In the Overview panel, click + Add to add a new resource. Enter Storage account and hit "Enter" to search.
- Click on Storage account - blob, file, table, queue offered by Microsoft (in the "Storage" category).
- Click Create at the bottom of the description panel.
- In the Azure Storage Account panel:
- Enter your unique string for "Name".
- Make sure the selected resource group is the one you just created. If not, choose the resource group you created for this solution.
- Click the Create button at the bottom.
- Go back to your resource group overview and wait until the storage account is deployed. To check the deployment status, refresh the page or the list of resources in the resource group as needed.
These are the steps to get the access key that will be used in later steps.
- Click the created storage account. In the new panel, click on Access keys.
- In the new panel, copy the values of Key and Connection string under key1 (use the 'Click to copy' icon next to each value) and paste them into your settings.txt file as detailed below.
Azure Storage Account | |
---|---|
storageAccountName | [unique string] |
storageAccountKey | [Key] |
storageAccountConnectionString | [Connection string] |
These are the steps to create the Blob storage.
- After getting the primary key, click on Overview on the left to return to the main panel for the Storage Account.
- Click on Blobs in the central area of the main panel.
- Click on + Container at the top of the new panel.
- Enter [unique string]blob for "Name".
- Select Blob for "Public access level".
- Click OK at the bottom of the panel.
- Click on the blob that you just created.
At this time, if you haven't already, make sure to download the following files from the src directory in this repository:
- OfferPriority.txt
- offers.txt
- OfferThreshold.txt
- products.txt
- redisSeed.txt
- users.txt
To upload these files:
- Click Upload at the top of the panel.
- In the panel that opens, click the folder icon to the right of the "Files" field.
- Navigate to where you saved the files on your computer.
- Hold the Control key on your keyboard and single-click each file to select all of the files at once.
- Click the Open button at the bottom of the window.
- Click Upload at the bottom of the panel.
- Click the x at the top right of the panel for uploading files to dismiss it.
- You should now see a list of files in the blob container.
- Click the Properties button on the left-hand menu bar of your blob container panel.
- Click the Copy icon to the right of the URL field in the properties panel that opened.
- Paste this value into your settings.txt as shown in the table below.
Azure Storage Blob Container Files | |
---|---|
userFile | [Blob Container URL]/users.txt |
offerFile | [Blob Container URL]/offers.txt |
productFile | [Blob Container URL]/products.txt |
referenceCollectionFile1 | [Blob Container URL]/OfferPriority.txt |
referenceCollectionFile2 | [Blob Container URL]/OfferThreshold.txt |
redisCacheSeedFile | [Blob Container URL]/redisSeed.txt |
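The six settings in the table above are all the container URL joined with a file name, so they can be generated rather than typed one by one. A minimal sketch, using a hypothetical storage account and container name in place of your actual Blob Container URL:

```python
files = ["users.txt", "offers.txt", "products.txt",
         "OfferPriority.txt", "OfferThreshold.txt", "redisSeed.txt"]

def blob_file_urls(container_url, names):
    """Join the container URL copied from the portal with each file name."""
    base = container_url.rstrip("/")  # tolerate a trailing slash
    return {name: f"{base}/{name}" for name in names}

# Hypothetical container URL for illustration only:
urls = blob_file_urls("https://myuniquestr.blob.core.windows.net/myuniquestrblob",
                      files)
print(urls["users.txt"])
```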
The model used in this guide is based on the Personalized Offers Solution How To Guide from the Azure AI Gallery. The experiment used to train the model can be found here.
- Go to the Azure Portal and navigate to the resource group you created.
- In the Overview panel, click + Add to add a new resource. Enter Machine Learning Studio Workspace and hit "Enter" to search.
- Click on Machine Learning Studio Workspace offered by Microsoft in the Analytics category.
- Click the Create button at the bottom of the description panel.
- In the Machine Learning Studio workspace panel:
- Enter your unique string for "Workspace name".
- Choose Use existing for "Resource group" and select the resource group you created earlier.
- Choose Use existing for "Storage account" and select the storage account you created earlier.
- Choose Create new for "Web service plan".
- Click on Web service plan pricing tier, choose S1 Standard and click Select at the bottom.
- Click Create at the bottom.
- Go to the Personalized Offers Solution How To Guide in the Azure AI Gallery.
- Click the Open in Studio button on the right. Log in if needed.
- Choose the region and workspace. For the region, you should choose the region that your resource group resides in. For the workspace, choose the workspace you just created.
- Wait until the experiment is copied.
- Click Run at the bottom of the page. It takes around three minutes to run the experiment.
- Click Deploy Web Service at the bottom of the page, then click Deploy Web Service [Classic] to publish the web service. This will lead you to the web service page. This page can also be found by clicking the Web services button on the left-hand menu bar in your workspace.
- Copy the API key and save it in your settings.txt as per the table given below.
- Click the REQUEST/RESPONSE link under the API HELP PAGE section. On the help page that opens, copy the Request URI under the Request section, and save it in your settings.txt as per the table given below.
Machine Learning Web Service | |
---|---|
mlPublishedExperimentEndpoint | [Request URI] |
mlPublishedExperimentKey | [API key] |
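To see how the two values above are used, here is a sketch of building (but not sending) a Request/Response call to the published web service. ML Studio (classic) endpoints take a JSON body with `Inputs` and `GlobalParameters` and a Bearer token in the Authorization header; the endpoint URL, API key, and input column shown here are hypothetical placeholders, and the real input schema comes from your experiment's web service input.

```python
import json
import urllib.request

def build_scoring_request(endpoint, api_key, inputs):
    """Construct (but do not send) a Request/Response call to an
    ML Studio (classic) web service."""
    body = json.dumps({"Inputs": inputs, "GlobalParameters": {}}).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,  # the API key from settings.txt
    }
    return urllib.request.Request(endpoint, data=body, headers=headers)

# All values below are illustrative placeholders:
req = build_scoring_request(
    "https://example.azureml.net/execute?api-version=2.0",
    "YOUR_API_KEY",
    {"input1": {"ColumnNames": ["userId"], "Values": [["user001"]]}},
)
print(req.get_header("Authorization"))
```

Sending the request (for example with `urllib.request.urlopen(req)`) is left out here, since it requires a live deployment.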
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Event Hubs and hit "Enter" to search.
- Click on Event Hubs offered by Microsoft in the "Analytics" category.
- Click Create at the bottom of the description panel.
- In the new panel for creating a namespace:
- enter your unique string for "Name".
- Choose the Standard pricing tier.
- Choose your subscription, resource group, and the location that your resource group resides in.
- Slide the Throughput Units slider to 20.
- Click the Create button at the bottom of the panel.
- Return to your resource group's overview page. When it has finished deploying, click on the resource of type "Event Hub".
- Click Shared access policies in the left-hand menu bar.
- In the new panel, click RootManageSharedAccessKey.
- Copy the Primary key using the copy button to the right of the field, and add it to your settings.txt file.
- Copy the Connection string–primary key using the copy button to the right of the field, and add it to your settings.txt file.
- Click the x in the top right to dismiss this panel.
Azure Event Hub | |
---|---|
serviceBusNamespace | [unique string] |
eventHubSharedAccessPolicyKeyName | RootManageSharedAccessKey |
eventHubSharedAccessPolicyKey | [Primary key] |
eventHubConnectionString | [Connection string–primary key] |
eventHubName | personalizedofferseh |
eventHubGroup1 | clickactivityaggcg |
eventHubGroup2 | clickactivitydbcg |
eventHubGroup3 | clickactivitydlcg |
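The Connection string–primary key copied above is a semicolon-separated list of `Key=Value` pairs, and the settings table entries can be cross-checked by splitting it. A sketch, using an illustrative connection string rather than a real one:

```python
def parse_connection_string(cs):
    """Split an Azure 'Key=Value;...' connection string into a dict.

    partition("=") splits on the first '=' only, so base64 keys that
    end in '=' are preserved intact.
    """
    parts = {}
    for segment in cs.strip().rstrip(";").split(";"):
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

# Illustrative value only; yours comes from the portal:
sample = ("Endpoint=sb://mynamespace.servicebus.windows.net/;"
          "SharedAccessKeyName=RootManageSharedAccessKey;"
          "SharedAccessKey=abc123=")
print(parse_connection_string(sample)["SharedAccessKeyName"])
```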
- Click Overview on the left-hand menu bar, and then on the + Event Hub button to add an event hub.
- In the new panel:
- Enter personalizedofferseh for "Name".
- Enter 16 for "Partition Count".
- Enter 1 for "Message Retention".
- Click Create at the bottom.
- Click on the Event Hubs option in the left-hand menu bar.
- Click on the event hub named personalizedofferseh created through the previous steps. In the new panel:
- Click + Consumer group at the top of the panel
- Enter clickactivityaggcg for the 'name' field.
- Click Create at the bottom.
- Repeat the previous steps (+ Consumer group, enter the name, Create) two more times to create the following consumer groups:
- clickactivitydbcg
- clickactivitydlcg
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Azure Cosmos DB and hit "Enter" to search.
- Click on Azure Cosmos DB offered by Microsoft in the "Storage" category.
- Click Create at the bottom of the description panel.
- In the Azure Cosmos DB panel:
- Enter your unique string for "ID".
- Choose SQL for "API".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Click Create at the bottom.
- Once your Cosmos DB is deployed, navigate back to the resource you have just created, then:
- Click on Keys on the left.
- Select Read-write Keys at the top of the new panel.
- Use the copy button to the right of the fields URI, PRIMARY KEY, and PRIMARY CONNECTION STRING, and add their values to the settings.txt file as follows:
Azure Cosmos DB | |
---|---|
docDbUri | [URI] |
docDbKey | [PRIMARY KEY] |
docDbConnectionString | [PRIMARY CONNECTION STRING] (remove the ";" at the end) |
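The note in the table about removing the trailing ";" can be automated with a one-line helper, so the value pasted into settings.txt is already clean. The connection string below is illustrative only:

```python
def clean_cosmos_connection_string(raw):
    """Drop the trailing ';' that the portal includes at the end of the
    PRIMARY CONNECTION STRING, as the settings table requires.

    rstrip(";") removes only trailing semicolons, so the '=' padding
    at the end of the base64 AccountKey is untouched.
    """
    return raw.rstrip(";")

# Illustrative value only:
raw = "AccountEndpoint=https://myacct.documents.azure.com:443/;AccountKey=abc==;"
print(clean_cosmos_connection_string(raw))
```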
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Data Lake Storage and hit "Enter" to search.
- Click on Data Lake Storage Gen1 offered by Microsoft in the "Storage" category.
- Click Create at the bottom of the description panel.
- In the Data Lake Storage panel:
- Enter your unique string for "Name".
- Choose your subscription and resource group.
- Click Create at the bottom.
- Add your unique string to your settings.txt file.
Azure Data Lake Store | |
---|---|
adlStoreAccount | [unique string] |
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Redis Cache and hit "Enter" to search.
- Click on Redis Cache offered by Microsoft in the "Databases" category.
- Click Create at the bottom of the description panel.
- In the Redis Cache panel:
- Enter your unique string for "DNS name".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Choose Standard C2 (2.5 GB Cache, Replication) for "Pricing tier". If only the recommended pricing tiers are shown, click "See additional options" below them. Then click Apply at the bottom.
- Click Create at the bottom.
- When it has finished deploying, navigate back to the resource you have just created, then:
- Click on Access keys on the left.
- Use the copy button to the right of the field Primary, and add the value to the settings.txt file.
Azure Redis Cache | |
---|---|
redisCacheName | [unique string] |
redisCacheKey | [Primary] |
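Clients typically combine the two values above into a single configuration string of the form `host:6380,password=...,ssl=True` (the StackExchange.Redis style used by .NET clients; field order and extra options may vary by client). A sketch with hypothetical values:

```python
def redis_configuration(cache_name, primary_key):
    """Assemble a StackExchange.Redis-style configuration string from the
    DNS name and Primary key recorded in settings.txt. Port 6380 is the
    SSL endpoint of Azure Redis Cache."""
    host = f"{cache_name}.redis.cache.windows.net"
    return f"{host}:6380,password={primary_key},ssl=True,abortConnect=False"

# Hypothetical cache name and key, for illustration only:
print(redis_configuration("myuniquestr", "PRIMARY_KEY"))
```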
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type App Service Plan and hit "Enter" to search.
- Click on App Service Plan offered by Microsoft.
- Click Create at the bottom of the description panel.
- In the New App Service Plan panel:
- Enter your unique string for "App Service plan".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Choose Windows for "Operating System".
- Click on Pricing tier, choose S3 Standard and click Select at the bottom.
- Click Create at the bottom.
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Function App and hit "Enter" to search.
- Click on Function App offered by Microsoft in the "Web" category.
- Click Create at the bottom of the description panel.
- In the Function App panel:
- Enter your unique string for "App name".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Choose App Service Plan for "Hosting Plan".
- Click on App Service plan/Location, then click on the plan you created previously.
- Choose Use existing for "Storage", then enter your unique string.
- Click Create at the bottom.
- When it has finished deploying, navigate back to the resource you have just created, then:
- Click on Platform features from the menu across the top.
- Click on Application settings.
- Choose 64-bit for "Platform".
- Choose On for "Always On".
- Add all key-value pairs you have been storing in your settings.txt under the "Application settings" section (starting with storageAccountName).
- Click on Save at the top of the page.
- Close the Application settings tab by clicking on x.
- From the "Platform features" panel opened from the steps above, click on Advanced tools(Kudu).
- In the new window that opens, click on Debug console menu at the top, then click on CMD.
- Click on the site folder in the items table.
- Click on the wwwroot folder.
- Find the functions.zip file you downloaded from the src folder of this repository, unzip it locally, and transfer all of its contents by dragging them onto the wwwroot page you just opened.
- Close the page and return to the Function Apps screen.
- Refresh the list of functions by clicking the "Refresh" icon to the right of your Function Apps instance, and check that the functions you just uploaded appear under the expandable Functions menu.
- Select SeedDocumentDb from the list of functions.
- Click Run at the top of the page and wait for the function to complete. You may check the execution status by clicking on Logs at the bottom of the page.
- Repeat for the function SeedRedisCache.
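As an alternative to typing each application setting by hand in the steps above, the portal's "Advanced edit" view of Application settings accepts a JSON array of `name`/`value`/`slotSetting` objects. A sketch for producing that JSON from the pairs recorded in settings.txt (the setting shown is one of the real names from this guide, with a placeholder value):

```python
import json

def to_advanced_edit(settings):
    """Convert name/value pairs into the JSON array accepted by the
    Application settings 'Advanced edit' view in the Azure portal."""
    return json.dumps(
        [{"name": k, "value": v, "slotSetting": False}
         for k, v in settings.items()],
        indent=2,
    )

print(to_advanced_edit({"storageAccountName": "myuniquestr"}))
```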
For this solution we will create 4 separate Stream Analytics jobs, so that the performance of each query can be monitored and scaled independently.
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Stream Analytics job and hit "Enter" to search.
- Click on Stream Analytics job offered by Microsoft in the "Internet of Things" category.
- Click Create at the bottom of the description panel.
- In the New Stream Analytics job panel:
- Enter productViewsJob for "Job name".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Click Create at the bottom.
- When it has finished deploying, navigate back to the resource you have just created, then:
- Click on Inputs on the left.
- Click on + Add stream input at the top, then click on Event Hub.
- In the panel that opens:
- Enter ClickActivity for "Input alias".
- Select your unique string for "Event Hub namespace".
- Click on Use existing under "Event Hub name", then select personalizedofferseh.
- Select RootManageSharedAccessKey for "Event Hub policy name".
- Enter clickactivityaggcg for "Event Hub consumer group".
- Click Save at the bottom.
- Click on Functions on the left.
- Click on + Add at the top, then click on Javascript UDF.
- In the panel that opens:
- Enter productViewsJson for "Function alias".
- Copy the contents of the file ProductViewsUDF.txt you downloaded from the src directory in this repository, and replace the current function definition displayed on the right side of the screen.
- Click Save at the bottom.
- Click on Outputs on the left.
- Click on + Add at the top, then click on Cosmos DB.
- In the panel that opens:
- Enter ProductViews for "Output alias".
- Select your unique string for "Account id".
- Click on Use existing under "Database", then select personalizedOffers.
- Enter productViewsCollection for "Collection name pattern".
- Enter id for "Document id".
- Click Save at the bottom.
- Click on Query on the left.
- In the panel that opens:
- Copy the contents of the ProductViewsQuery.txt file you downloaded from the src directory in this repository, and replace the current query displayed on the right side of the screen.
- Click Save at the top, then confirm by clicking on Yes.
- Click on Scale on the left.
- In the panel that opens:
- Slide the "Streaming units" slider to 12.
- Click Save at the top, then confirm by clicking on Yes.
- Click on Overview on the left.
- Click Start at the top.
- In the panel that opens, click Start at the bottom to start the stream job.
- If the stream job fails to start, open Overview, click Stop at the top, and then try restarting the job.
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Stream Analytics job and hit "Enter" to search.
- Click on Stream Analytics job offered by Microsoft in the "Internet of Things" category.
- Click Create at the bottom of the description panel.
- In the New Stream Analytics job panel:
- Enter offerViewsJob for "Job name".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Click Create at the bottom.
- When it has finished deploying, navigate back to the resource you have just created, then:
- Click on Inputs on the left.
- Click on + Add stream input at the top, then click on Event Hub.
- In the panel that opens:
- Enter clickactivitydbcg for "Input alias".
- Select your unique string for "Event Hub namespace".
- Click on Use existing under "Event Hub name", then select personalizedofferseh.
- Select RootManageSharedAccessKey for "Event Hub policy name".
- Enter clickactivitydbcg for "Event Hub consumer group".
- Click Save at the bottom.
- Click on Functions on the left.
- Click on + Add at the top, then click on Javascript UDF.
- In the panel that opens:
- Enter offerViewsJson for "Function alias".
- Copy the contents of the file OfferViewsUDF.txt you downloaded from the src directory in this repository, and replace the current function definition displayed on the right side of the screen.
- Click Save at the bottom.
- Click on Outputs on the left.
- Click on + Add at the top, then click on Cosmos DB.
- In the panel that opens:
- Enter OfferViews for "Output alias".
- Select your unique string for "Account id".
- Click on Use existing under "Database", then select personalizedOffers.
- Enter offerViewsCollection for "Collection name pattern".
- Enter id for "Document id".
- Click Save at the bottom.
- Click on Query on the left.
- In the panel that opens:
- Copy the contents of the OfferViewsQuery.txt file you downloaded from the src directory in this repository, and replace the current query displayed on the right side of the screen.
- Click Save at the top, then confirm by clicking on Yes.
- Click on Scale on the left.
- In the panel that opens:
- Slide the "Streaming units" slider to 18.
- Click Save at the top, then confirm by clicking on Yes.
- Click on Overview on the left.
- Click Start at the top.
- In the panel that opens, click Start at the bottom to start the stream job.
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Stream Analytics job and hit "Enter" to search.
- Click on Stream Analytics job offered by Microsoft in the "Internet of Things" category.
- Click Create at the bottom of the description panel.
- In the New Stream Analytics job panel:
- Enter clickCountsJob for "Job name".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Click Create at the bottom.
- When it has finished deploying, navigate back to the resource you have just created, then:
- Click on Inputs on the left.
- Click on + Add stream input at the top, then click on Event Hub.
- In the panel that opens:
- Enter clickactivitydbcg for "Input alias".
- Select your unique string for "Event Hub namespace".
- Click on Use existing under "Event Hub name", then select personalizedofferseh.
- Select RootManageSharedAccessKey for "Event Hub policy name".
- Enter clickactivitydbcg for "Event Hub consumer group".
- Click Save at the bottom.
- Click on Outputs on the left.
- Click on + Add at the top, then click on Cosmos DB.
- In the panel that opens:
- Enter ProductCounts for "Output alias".
- Select your unique string for "Account id".
- Click on Use existing under "Database", then select personalizedOffers.
- Enter productCollection for "Collection name pattern".
- Enter id for "Document id".
- Click Save at the bottom.
- Click on + Add at the top, then click on Cosmos DB.
- In the panel that opens:
- Enter UserCounts for "Output alias".
- Select your unique string for "Account id".
- Click on Use existing under "Database", then select personalizedOffers.
- Enter userCollection for "Collection name pattern".
- Enter id for "Document id".
- Click Save at the bottom.
- Click on Query on the left.
- In the panel that opens:
- Copy the contents of the ClickCountsQuery.txt file you downloaded from the src directory in this repository, and replace the current query displayed on the right side of the screen.
- Click Save at the top, then confirm by clicking on Yes.
- Click on Scale on the left.
- In the panel that opens:
- Slide the "Streaming units" slider to 12.
- Click Save at the top, then confirm by clicking on Yes.
- Click on Overview on the left.
- Click Start at the top.
- In the panel that opens, click Start at the bottom to start the stream job.
- Go to the Azure Portal and navigate to your resource group.
- In the Overview panel, click + Add to add a new resource. Type Stream Analytics job and hit "Enter" to search.
- Click on Stream Analytics job offered by Microsoft in the "Internet of Things" category.
- Click Create at the bottom of the description panel.
- In the New Stream Analytics job panel:
- Enter rawDataJob for "Job name".
- Choose your subscription, resource group, and the location that your resource group resides in.
- Click Create at the bottom.
- Navigate back to the resource you have just created, then:
- Click on Inputs on the left.
- Click on + Add stream input at the top, then click on Event Hub.
- In the panel that opens:
- Enter clickactivitydlcg for "Input alias".
- Select your unique string for "Event Hub namespace".
- Click on Use existing under "Event Hub name", then select personalizedofferseh.
- Select RootManageSharedAccessKey for "Event Hub policy name".
- Enter clickactivitydlcg for "Event Hub consumer group".
- Click Save at the bottom.
- Click on Outputs on the left.
- Click on + Add at the top, then click on Data Lake Store.
- In the panel that opens:
- Enter clickstreamdl for "Output alias".
- Select your unique string for "Account name".
- Enter personalizedoffers/clickstream/{date} for "Path prefix pattern".
- Click on Authorize.
- Click Save at the bottom.
- Click on + Add at the top, then click on Data Lake Store.
- In the panel that opens:
- Enter offersdl for "Output alias".
- Select your unique string for "Account name".
- Enter personalizedoffers/offers/{date} for "Path prefix pattern".
- Click on Authorize.
- Click Save at the bottom.
- Click on Query on the left.
- In the panel that opens:
- Copy the contents of the RawDataQuery.txt file you downloaded from the src directory in this repository, and replace the current query displayed on the right side of the screen.
- Click Save at the top, then confirm by clicking on Yes.
- Click on Overview on the left.
- Click Start at the top.
- In the panel that opens, click Start at the bottom to start the stream job.
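The `{date}` token in the two "Path prefix pattern" values above is expanded by Stream Analytics at write time, by default as `YYYY/MM/DD`, so output lands in one folder per day. A sketch of the expansion (the date used is illustrative):

```python
from datetime import date

def data_lake_path(prefix_pattern, d):
    """Show how Stream Analytics expands the {date} token in a
    'Path prefix pattern', assuming the default YYYY/MM/DD date format."""
    return prefix_pattern.replace("{date}", d.strftime("%Y/%m/%d"))

print(data_lake_path("personalizedoffers/clickstream/{date}", date(2017, 6, 15)))
```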
- Go to the Azure Portal and navigate to your resource group.
- Select the resource with type "App Service".
- Click on Functions on the left, then enable all functions.
- Repeat the following steps for the functions RedisProductTrigger and UpdateTopUsersCache:
- Click on the function name on the left.
- Click Run at the top of the page, and wait for the function to complete. You may check the execution status by clicking on Logs at the bottom of the page.
- Finally, start the simulation by running the UserSimulationStartup function:
- Click on the function name on the left.
- Click Run at the top of the page, and wait for the function to complete. You may check the execution status by clicking on Logs at the bottom of the page.
Visit here for the post-deployment instructions. The page provides information on monitoring, scaling and visualizing the output of the deployed solution. Details for stopping the solution can also be found on the same page.