refactor: automate adding tons of codeblock helpers in website samples (#1253)

* delete Python imports and helpers in markdown samples & modify doctest to auto-add them

* fix ) in markdown & format

* fix mergePyCodeDir error

* fix announcement color

* test letting job fail on errors

* fix pipeline

* fix pipeline

* fix SynapseE2e job

* fixing pipeline

* fix python version

* fix publish-artifacts bug caused by sphinx version
serena-ruan authored Nov 12, 2021
1 parent c0b516b commit b2751eb
Showing 33 changed files with 271 additions and 2,299 deletions.
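The heart of this change is moving the repeated hidden setup block out of every markdown sample and into the test tooling, which injects it automatically. A minimal sketch of that idea, assuming hypothetical names (`HELPER_CODE`, `extract_samples`, `build_test_source` are illustrative, not the repository's actual doctest implementation):

```python
import re

# Shared setup that used to be pasted (inside an HTML comment) into every
# website sample; after this commit it lives in one place in the tooling.
HELPER_CODE = """\
import pyspark
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
         .getOrCreate())
"""

FENCE = re.compile(r"```python\n(.*?)```", re.DOTALL)

def extract_samples(markdown: str):
    """Pull every fenced python block out of a markdown document."""
    return [m.group(1) for m in FENCE.finditer(markdown)]

def build_test_source(markdown: str) -> str:
    """Prepend the shared helpers once, then append each sample block,
    so the markdown files no longer need their own setup comment."""
    return HELPER_CODE + "\n".join(extract_samples(markdown))
```

With this, deleting the per-file helper comments (as the diffs below do) loses nothing: the test harness supplies the same setup to every extracted block.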
2 changes: 1 addition & 1 deletion environment.yaml
@@ -13,7 +13,7 @@ dependencies:
- r-devtools
- pip:
- wheel
-  - sphinx
+  - sphinx==4.2.0
- sphinx_rtd_theme
- coverage
- pytest
17 changes: 15 additions & 2 deletions pipeline.yaml
@@ -56,6 +56,7 @@ jobs:
azureSubscription: 'MMLSpark Build'
keyVaultName: mmlspark-keys
- bash: |
+    set -e
source activate synapseml
sbt packagePython
sbt publishBlob publishDocs publishR publishPython
@@ -71,7 +72,9 @@ jobs:
PGP-PRIVATE: $(pgp-private)
PGP-PUBLIC: $(pgp-public)
PGP-PW: $(pgp-pw)
-  - bash: sbt publishBadges
+  - bash: |
+      set -e
+      sbt publishBadges
condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
displayName: Publish Badges
env:
@@ -98,6 +101,7 @@ jobs:
azureSubscription: 'MMLSpark Build'
keyVaultName: mmlspark-keys
- bash: |
+    set -e
source activate synapseml
sbt packagePython
sbt publishBlob
@@ -139,8 +143,9 @@ jobs:
azureSubscription: 'MMLSpark Build'
keyVaultName: mmlspark-keys
- bash: |
+    set -e
source activate synapseml
-    jupyter nbconvert --to script ./notebooks/*.ipynb*
+    jupyter nbconvert --to script ./notebooks/features/*/*.ipynb*
sbt packagePython
sbt publishBlob
displayName: Publish Blob Artifacts
@@ -246,6 +251,7 @@ jobs:
echo '##vso[task.setvariable variable=tag]'$(git tag -l --points-at HEAD)
displayName: 'Get Git Tag'
- bash: |
+    set -e
wget https://github.com/git-chglog/git-chglog/releases/download/0.8.0/git-chglog_linux_amd64
chmod +x git-chglog_linux_amd64
./git-chglog_linux_amd64 -o CHANGELOG.md $TAG
@@ -274,6 +280,7 @@ jobs:
azureSubscription: 'MMLSpark Build'
keyVaultName: mmlspark-keys
- bash: |
+    set -e
source activate synapseml
sbt publishPypi
condition: startsWith(variables['tag'], 'v')
@@ -327,6 +334,7 @@ jobs:
keyVaultName: mmlspark-keys
condition: succeededOrFailed()
- bash: |
+    set -e
curl -s https://codecov.io/bash > .codecov
chmod +x .codecov
echo "Starting Codecov Upload"
@@ -377,6 +385,7 @@ jobs:
keyVaultName: mmlspark-keys
condition: succeededOrFailed()
- bash: |
+    set -e
curl -s https://codecov.io/bash > .codecov
chmod +x .codecov
echo "Starting Codecov Upload"
@@ -424,6 +433,7 @@ jobs:
keyVaultName: mmlspark-keys
condition: succeededOrFailed()
- bash: |
+    set -e
curl -s https://codecov.io/bash > .codecov
chmod +x .codecov
echo "Starting Codecov Upload"
@@ -460,12 +470,14 @@ jobs:
source activate synapseml
sbt convertNotebooks
- bash: |
+    set -e
yarn install
cd website
yarn
yarn build
displayName: 'yarn install and build'
- bash: |
+    set -e
git config --global user.name "${GH_NAME}"
git config --global user.email "${GH_EMAIL}"
git checkout -b main
@@ -594,6 +606,7 @@ jobs:
keyVaultName: mmlspark-keys
condition: succeededOrFailed()
- bash: |
+    set -e
curl -s https://codecov.io/bash > .codecov
chmod +x .codecov
echo "Starting Codecov Upload"
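Most of the pipeline fixes above come down to adding `set -e` to multi-line bash steps: without it, a command failing mid-step is swallowed, the step exits 0, and CI reports green. A quick way to see the difference (a sketch run through Python's subprocess, not part of the pipeline itself):

```python
import subprocess

def run_step(script: str) -> subprocess.CompletedProcess:
    """Run a multi-line shell step roughly the way a CI agent would."""
    return subprocess.run(["bash", "-c", script], capture_output=True, text=True)

# `false` fails, but the step keeps going and exits 0 -- the job shows green.
without = run_step("false\necho published")

# With `set -e` the step aborts at the first failure and exits non-zero.
with_e = run_step("set -e\nfalse\necho published")

print(without.returncode, with_e.returncode)  # prints "0 1"
```

This is why the commit log above includes a "test letting job fail on errors" step: the publish commands could previously fail without failing the job.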
2 changes: 1 addition & 1 deletion project/CodegenPlugin.scala
@@ -224,7 +224,7 @@ object CodegenPlugin extends AutoPlugin {
artifactPath.in(packageBin).in(Compile).value.getParentFile
},
mergePyCodeDir := {
-      join(baseDirectory.value.getParent, "target", "scala-2.12", "sbt-1.0", "generated")
+      join(baseDirectory.value.getParent, "target", "scala-2.12", "generated")
},
codegenDir := {
join(targetDir.value, "generated")
68 changes: 6 additions & 62 deletions website/docs/documentation/estimators/_LightGBM.md
@@ -2,26 +2,8 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import DocTable from "@theme/DocumentationTable";

<!--
```python
import pyspark
import os
import json
from IPython.display import display
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
.config("spark.jars.packages", "com.microsoft.azure:synapseml:0.9.2")
.config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
.getOrCreate())
def getSecret(secretName):
get_secret_cmd = 'az keyvault secret show --vault-name mmlspark-build-keys --name {}'.format(secretName)
value = json.loads(os.popen(get_secret_cmd).read())["value"]
return value
import synapse.ml
```
-->



## LightGBMClassifier

@@ -87,27 +69,8 @@ values={[
]}>
<TabItem value="py">

<!--
```python
import pyspark
import os
import json
from IPython.display import display
from pyspark.sql.functions import *
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
.config("spark.jars.packages", "com.microsoft.azure:synapseml:0.9.2")
.config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
.getOrCreate())
def getSecret(secretName):
get_secret_cmd = 'az keyvault secret show --vault-name mmlspark-build-keys --name {}'.format(secretName)
value = json.loads(os.popen(get_secret_cmd).read())["value"]
return value
import synapse.ml
```
-->



<!--pytest-codeblocks:cont-->

@@ -159,27 +122,8 @@ values={[
]}>
<TabItem value="py">

<!--
```python
import pyspark
import os
import json
from IPython.display import display
from pyspark.sql.functions import *
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
.config("spark.jars.packages", "com.microsoft.azure:synapseml:0.9.2")
.config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
.getOrCreate())
def getSecret(secretName):
get_secret_cmd = 'az keyvault secret show --vault-name mmlspark-build-keys --name {}'.format(secretName)
value = json.loads(os.popen(get_secret_cmd).read())["value"]
return value
import synapse.ml
```
-->



<!--pytest-codeblocks:cont-->

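The `<!--pytest-codeblocks:cont-->` markers kept in these files tell the test runner to execute the next fenced block in the same session as the previous one, which is what lets the auto-injected setup carry over between samples. A rough sketch of that grouping logic (illustrative only, not the library's actual implementation):

```python
import re

# A fenced python block, optionally preceded by the `cont` marker comment.
BLOCK = re.compile(r"(<!--pytest-codeblocks:cont-->\s*)?```python\n(.*?)```", re.DOTALL)

def collect_sessions(markdown: str):
    """Group fenced python blocks into sessions: a block preceded by the
    `cont` marker is appended to the previous session instead of
    starting a fresh one."""
    sessions = []
    for m in BLOCK.finditer(markdown):
        cont, code = m.group(1), m.group(2)
        if cont and sessions:
            sessions[-1] += code
        else:
            sessions.append(code)
    return sessions
```

Each session would then run as one script, so names defined in an earlier block (like the injected `spark`) stay visible in the continued blocks.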
45 changes: 4 additions & 41 deletions website/docs/documentation/estimators/_VW.md
@@ -2,26 +2,8 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import DocTable from "@theme/DocumentationTable";

<!--
```python
import pyspark
import os
import json
from IPython.display import display
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
.config("spark.jars.packages", "com.microsoft.azure:synapseml:0.9.2")
.config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
.getOrCreate())
def getSecret(secretName):
get_secret_cmd = 'az keyvault secret show --vault-name mmlspark-build-keys --name {}'.format(secretName)
value = json.loads(os.popen(get_secret_cmd).read())["value"]
return value
import synapse.ml
```
-->



## VowpalWabbitRegressor

@@ -84,27 +66,8 @@ values={[
]}>
<TabItem value="py">

<!--
```python
import pyspark
import os
import json
from IPython.display import display
from pyspark.sql.functions import *
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
.config("spark.jars.packages", "com.microsoft.azure:synapseml:0.9.2")
.config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
.getOrCreate())
def getSecret(secretName):
get_secret_cmd = 'az keyvault secret show --vault-name mmlspark-build-keys --name {}'.format(secretName)
value = json.loads(os.popen(get_secret_cmd).read())["value"]
return value
import synapse.ml
```
-->



<!--pytest-codeblocks:cont-->

41 changes: 0 additions & 41 deletions website/docs/documentation/estimators/core/_AutoML.md
@@ -2,26 +2,6 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import DocTable from "@theme/DocumentationTable";

<!--
```python
import pyspark
import os
import json
from IPython.display import display
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
.config("spark.jars.packages", "com.microsoft.azure:synapseml:0.9.2")
.config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
.getOrCreate())
def getSecret(secretName):
get_secret_cmd = 'az keyvault secret show --vault-name mmlspark-build-keys --name {}'.format(secretName)
value = json.loads(os.popen(get_secret_cmd).read())["value"]
return value
import synapse.ml
```
-->

## AutoML

@@ -143,27 +123,6 @@ values={[
]}>
<TabItem value="py">

<!--
```python
import pyspark
import os
import json
from IPython.display import display
from pyspark.sql.functions import *
spark = (pyspark.sql.SparkSession.builder.appName("MyApp")
.config("spark.jars.packages", "com.microsoft.azure:synapseml:0.9.2")
.config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
.getOrCreate())
def getSecret(secretName):
get_secret_cmd = 'az keyvault secret show --vault-name mmlspark-build-keys --name {}'.format(secretName)
value = json.loads(os.popen(get_secret_cmd).read())["value"]
return value
import synapse.ml
```
-->

<!--pytest-codeblocks:cont-->

