Commit 12b0881

chore: add a deployment example

1 parent 350d67c · commit 12b0881

9 files changed: +308 additions, −9 deletions

.gitignore

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
-keen_**
+automl-model

 # application data
 # -------------------------------------
```

conda.yaml

Lines changed: 9 additions & 0 deletions (new file)

```yaml
# conda.yaml
name: automl-env
dependencies:
  - python=3.8
  - pip
  - pip:
      - onnxruntime
      - numpy
      - pandas
```

docs/AZURE_ACCOUNT_SETUP.md

Lines changed: 55 additions & 8 deletions

````diff
@@ -134,9 +134,20 @@ az provider register --namespace Microsoft.Compute
 az provider register --namespace Microsoft.Storage
 az provider register --namespace Microsoft.KeyVault
 az provider register --namespace Microsoft.ContainerInstance
+az provider register --namespace Microsoft.ContainerRegistry
+az provider register --namespace Microsoft.ContainerService
+
+# Additional providers for real-time endpoints
+az provider register --namespace Microsoft.Web
+az provider register --namespace Microsoft.Network
+az provider register --namespace Microsoft.Authorization
+az provider register --namespace Microsoft.Insights
+az provider register --namespace Microsoft.OperationalInsights
+az provider register --namespace Microsoft.Resources

 # Check registration status
-az provider list --query "[?namespace=='Microsoft.MachineLearningServices' || namespace=='Microsoft.Compute' || namespace=='Microsoft.Storage'].{Namespace:namespace, State:registrationState}" --output table
+az provider list --query "[?namespace=='Microsoft.MachineLearningServices' || namespace=='Microsoft.Compute' || namespace=='Microsoft.Storage' || namespace=='Microsoft.ContainerService' || namespace=='Microsoft.Web' || namespace=='Microsoft.Network' || namespace=='Microsoft.Authorization' || namespace=='Microsoft.Insights' || namespace=='Microsoft.OperationalInsights' || namespace=='Microsoft.Resources'].{Namespace:namespace, State:registrationState}" --output table
 ```

 **Note**: Registration can take 5-10 minutes. Wait for all to show "Registered" before proceeding.
````

````diff
@@ -188,12 +199,23 @@ az vm list-usage --location eastus2 --query "[?contains(name.value, 'Standard')]

 ### Common Quota Requirements for AutoML

-| VM Family                  | Recommended Quota | Use Case               |
-| -------------------------- | ----------------- | ---------------------- |
-| Standard DSv3 Family vCPUs | 2-10              | General AutoML jobs    |
-| Standard Dv3 Family vCPUs  | 16-32             | CPU-intensive training |
-| Standard FSv2 Family vCPUs | 16-32             | Fast compute jobs      |
-| Total Regional vCPUs       | 50-100            | Overall regional limit |
+| VM Family                       | Recommended Quota | Use Case                |
+| ------------------------------- | ----------------- | ----------------------- |
+| Standard DSv3 Family vCPUs      | 2-10              | General AutoML jobs     |
+| **Standard DASv4 Family vCPUs** | **4-10**          | **Real-time endpoints** |
+| Standard Dv3 Family vCPUs       | 16-32             | CPU-intensive training  |
+| Standard FSv2 Family vCPUs      | 16-32             | Fast compute jobs       |
+| Total Regional vCPUs            | 50-100            | Overall regional limit  |
+
+### Check Endpoint VM Quotas
+
+```console
+# Check quota for endpoint VMs (different from training compute)
+az vm list-usage --location eastus2 --query "[?contains(name.value, 'DASv4')]" --output table
+
+# Common endpoint VM families
+az vm list-usage --location eastus2 --query "[?contains(name.value, 'DASv4') || contains(name.value, 'DSv3') || contains(name.value, 'Dv3')]" --output table
+```

 ### Request Quota Increase in Azure Portal
````

````diff
@@ -325,6 +347,31 @@ az login
 az account set --subscription "your-subscription-name"
 ```

+### Issue 5: Troubleshooting Endpoint Creation
+
+If endpoint creation fails with "invalid request" errors, try these solutions:
+
+```console
+# 1. Use same VM family as training cluster
+az ml online-endpoint create \
+  --name titanic-test \
+  --auth-mode key \
+  --resource-group automl-resources \
+  --workspace-name automl-workspace
+
+# 2. Deploy with DSv3 family (same as training)
+az ml online-deployment create \
+  --name blue \
+  --endpoint-name titanic-test \
+  --model azureml:your-model-name@latest \
+  --instance-type Standard_D2s_v3 \
+  --instance-count 1 \
+  --resource-group automl-resources \
+  --workspace-name automl-workspace
+
+# 3. Alternative: Use Azure ML Studio interface
+# Go to ml.azure.com → Models → Deploy → Real-time endpoint
+```
+
 ## 💰 Cost Management
````

````diff
@@ -349,7 +396,7 @@ az ml compute update \
   --idle-time-before-scale-down 300 \
   --resource-group automl-resources \
   --workspace-name automl-workspace
 ```

 ## ✅ Validation Checklist
````
docs/AZURE_DEPLOYMENT.md

Lines changed: 100 additions & 0 deletions (new file)

````markdown
# Azure AutoML Real-Time Deployment

Microsoft's recommended approach for deploying AutoML models to production as real-time endpoints is to download your AutoML model and deploy it using Azure Functions.

## Prerequisites

- Azure CLI installed and logged in
- Azure Functions Core Tools installed
- Python 3.8+ installed locally
- Access to the Azure ML workspace containing your model

```console
# Install Azure Functions Core Tools (macOS)
brew tap azure/functions
brew install azure-functions-core-tools@4

# Verify installation
func --version
```

## Deployment

### Step 1: Download Your AutoML Model

```console
# Download your trained AutoML model
az ml model download \
  --name titanic-survival \
  --version 1 \
  --resource-group ubc-cdl10 \
  --workspace-name UBC-CDL10 \
  --download-path ./titanic-survival-app/automl-model/
```

### Step 2: Create Azure Function App

```console
# Create a new storage account (name must be globally unique, 3-24 chars, lowercase/numbers only)
az storage account create \
  --name ubccdl10titanic \
  --resource-group ubc-cdl10 \
  --location eastus2 \
  --sku Standard_LRS

# Create Function App
az functionapp create \
  --resource-group ubc-cdl10 \
  --consumption-plan-location eastus2 \
  --runtime python \
  --runtime-version 3.8 \
  --functions-version 4 \
  --name titanic-survival-app \
  --storage-account ubccdl10titanic
```

### Step 3: Create Function Code

- [__init__.py](../titanic-survival-app/__init__.py)
- [requirements.txt](../titanic-survival-app/requirements.txt)
- [function.json](../titanic-survival-app/function.json)

### Step 4: Deploy Function

```console
cd titanic-survival-app
func azure functionapp publish titanic-survival-app
```

### Step 5: Test Your Deployment

```console
# Test the function
curl -X POST https://titanic-survival-app.azurewebsites.net/api/predict \
  -H "Content-Type: application/json" \
  -d '{"data": [[3, 1, 22, 1, 0, 7.25, 2]]}'
```

Expected response:

```json
{
  "prediction": [0]
}
```

## 🔧 Troubleshooting

### Check Function Logs

```console
# Check function logs
az functionapp log tail --name titanic-survival-app --resource-group ubc-cdl10
```

### Common Issues

1. **Model Loading Errors**: Ensure model.pkl is in the same directory as `__init__.py`
2. **Dependency Issues**: Pin all package versions in requirements.txt
3. **Memory Issues**: Use a Premium plan for large models
4. **JSON Issues**: Ensure the input data format matches the training data
````
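The curl test in Step 5 can also be driven from Python; a minimal sketch of building the same request with only the standard library (the endpoint URL and feature row come from the example above, and the actual POST is left commented out since it needs the deployed endpoint):

```python
import json
import urllib.request

def build_request(features, url="https://titanic-survival-app.azurewebsites.net/api/predict"):
    """Build an HTTP request matching the curl example: a JSON body of
    the form {"data": [[...features...]]} sent as a POST."""
    body = json.dumps({"data": [features]}).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})

# Feature order follows the example request:
# pclass, sex, age, sibsp, parch, fare, embarked (already label-encoded)
req = build_request([3, 1, 22, 1, 0, 7.25, 2])
# with urllib.request.urlopen(req) as resp:   # requires the live endpoint
#     print(json.load(resp))
```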

requirements/local.txt

Lines changed: 1 addition & 0 deletions

```diff
@@ -7,6 +7,7 @@

 -r base.txt

+azure.functions==1.23.0

 # Code linters, formatters, and security scanners
 # ------------
```

titanic-survival-app/README.md

Lines changed: 46 additions & 0 deletions (new file)

````markdown
# Azure function deployment resources

This directory contains the deployment resources for a machine learning model created using Azure AutoML (Automated Machine Learning).
See [Azure AutoML Real-Time Deployment](../docs/AZURE_DEPLOYMENT.md).

## Overview

These files enable deployment of an AutoML-trained model as a serverless Azure Function for real-time inference. The model was trained using Azure's AutoML service and is now packaged for production deployment.

## Files

- **`__init__.py`** - Main Azure Function handler that loads the AutoML model and processes prediction requests
- **`requirements.txt`** - Python dependencies required for the function runtime
- **`function.json`** - Azure Functions configuration defining the HTTP trigger and bindings
- **`automl-model/`** - Directory containing the downloaded AutoML model artifacts

## Usage

This function is designed to be deployed to Azure Functions and provides a REST API endpoint for real-time model predictions.

### API Endpoint

```bash
curl -X POST https://titanic-survival-app.azurewebsites.net/api/predict \
  -H "Content-Type: application/json" \
  -d '{"data": [[3, 1, 22, 1, 0, 7.25, 2]]}'
```

### Response

```json
{
  "prediction": [0]
}
```

## Deployment

For complete deployment instructions, see the [Azure Deployment Guide](../docs/AZURE_DEPLOYMENT.md).

## Model Information

- **Source**: Azure AutoML
- **Task**: Binary Classification (Titanic Survival Prediction)
- **Input Features**: Passenger class, sex, age, siblings/spouses, parents/children, fare, embarked port
- **Output**: Survival prediction (0 = did not survive, 1 = survived)
````
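The encoded example row `[3, 1, 22, 1, 0, 7.25, 2]` can be derived from readable passenger attributes. A small sketch, assuming typical label encodings for `sex` and `embarked` (the exact encodings used in training are an assumption here, not taken from the model, and must be verified against the training pipeline):

```python
# Assumed label encodings -- verify against the training pipeline before use.
SEX = {"female": 0, "male": 1}
EMBARKED = {"C": 0, "Q": 1, "S": 2}

def encode_passenger(pclass, sex, age, sibsp, parch, fare, embarked):
    """Map readable passenger attributes to the model's numeric feature row."""
    return [pclass, SEX[sex], age, sibsp, parch, fare, EMBARKED[embarked]]

row = encode_passenger(3, "male", 22, 1, 0, 7.25, "S")
print(row)  # [3, 1, 22, 1, 0, 7.25, 2]
```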

titanic-survival-app/__init__.py

Lines changed: 70 additions & 0 deletions (new file)

```python
"""
https://github.com/FullStackWithLawrence/azureml-example
./titanic-survival-app/__init__.py

Azure Function to serve predictions from a pre-trained AutoML model.

EXECUTION ENVIRONMENT:
- This code runs on Azure Functions cloud servers (NOT on your local computer)
- Deployed via 'func azure functionapp publish' command from your local machine
- Accessible globally via HTTPS endpoint: https://titanic-survival-app.azurewebsites.net

DEPLOYMENT FLOW:
1. Model downloaded locally: 'az ml model download' → ./automl-model/
   az ml model download \
     --name titanic-survival \
     --version 1 \
     --resource-group ubc-cdl10 \
     --workspace-name UBC-CDL10 \
     --download-path path/to/this/repo/titanic-survival-app/automl-model/

2. Function code created locally in ./titanic-survival-app/__init__.py (this file)

3. Entire titanic-survival-app/ folder uploaded to Azure via 'func azure functionapp publish'
   cd titanic-survival-app
   func azure functionapp publish titanic-survival-app

4. Azure Functions runtime loads and executes this code on their servers

MODEL LOADING:
- The model.pkl file is loaded ONCE when the Azure Function starts up (cold start)
- Model file exists on Azure servers because it was packaged and uploaded with this code
- Path './automl-model/model.pkl' is relative to this file's location on Azure servers

RUNTIME BEHAVIOR:
- When HTTP requests arrive at the Azure endpoint, Azure executes the main() function
- Input data comes from HTTP POST requests with a JSON payload
- Predictions are computed using the pre-loaded model on Azure servers
- Results are returned as JSON HTTP responses

This function expects input data in JSON format and returns predictions in JSON format.
"""

import json
import os
import pickle

import azure.functions as func
import pandas as pd


# Load model once at startup
model_path = os.path.join(os.path.dirname(__file__), "automl-model", "model.pkl")
with open(model_path, "rb") as f:
    model = pickle.load(f)  # nosec B301


def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        # Get input data
        req_body = req.get_json()
        input_data = pd.DataFrame(req_body["data"])

        # Make prediction
        prediction = model.predict(input_data)

        # Return result
        return func.HttpResponse(json.dumps({"prediction": prediction.tolist()}), mimetype="application/json")
    # pylint: disable=W0718
    except Exception as e:
        return func.HttpResponse(json.dumps({"error": str(e)}), status_code=400)
```
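The handler's core path (JSON in → DataFrame → predict → JSON out) can be smoke-tested locally without `model.pkl` or the Functions runtime. A sketch with a stand-in model (`StubModel` and `predict_json` are illustrative names, not part of the deployed code):

```python
import json

import pandas as pd

class StubModel:
    """Stand-in for the pickled AutoML model: always predicts 0."""
    def predict(self, df):
        return [0] * len(df)

def predict_json(model, body: str) -> str:
    # Mirrors main(): parse the JSON body, build a DataFrame, predict, serialize.
    df = pd.DataFrame(json.loads(body)["data"])
    return json.dumps({"prediction": list(model.predict(df))})

print(predict_json(StubModel(), '{"data": [[3, 1, 22, 1, 0, 7.25, 2]]}'))
# {"prediction": [0]}
```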

titanic-survival-app/function.json

Lines changed: 17 additions & 0 deletions (new file)

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
```
titanic-survival-app/requirements.txt

Lines changed: 9 additions & 0 deletions (new file)

```text
# titanic-survival-app/requirements.txt
# This file contains the Python package dependencies for the Azure Function App
# that serves the Titanic dataset model.
azure-functions
pandas
numpy
scikit-learn
```
