briandenicola/datafactory-batchjob-demo

Overview

This repository demonstrates how to use Azure Data Factory with AKS and KEDA to run batch jobs in Azure.

Architecture

[Architecture diagram]

Deployment

Prerequisites

  • A Linux machine, Windows Subsystem for Linux, or Docker Desktop for Windows
  • The Azure CLI and an Azure subscription
  • Terraform 0.12 or greater
  • kubectl
  • Helm
  • A virtual network with two subnets defined - one for private endpoints and one for Kubernetes (a creation sketch follows this list)
  • DNS zones for the storage private endpoints - reference
  • An Azure Container Registry
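
The network and DNS prerequisites can be created with the Azure CLI. A minimal sketch, assuming hypothetical names (batch-rg, batch-vnet) and address ranges - adjust for your environment:

  # Virtual network with one subnet for private endpoints and one for Kubernetes
  az network vnet create -g batch-rg -n batch-vnet --address-prefixes 10.10.0.0/16 \
      --subnet-name private-endpoints --subnet-prefixes 10.10.1.0/24
  az network vnet subnet create -g batch-rg --vnet-name batch-vnet -n kubernetes \
      --address-prefixes 10.10.2.0/23
  # Private DNS zones for blob and queue storage private endpoints, linked back to the vnet
  az network private-dns zone create -g batch-rg -n privatelink.blob.core.windows.net
  az network private-dns zone create -g batch-rg -n privatelink.queue.core.windows.net
  az network private-dns link vnet create -g batch-rg -z privatelink.blob.core.windows.net \
      -n blob-link -v batch-vnet -e false
  az network private-dns link vnet create -g batch-rg -z privatelink.queue.core.windows.net \
      -n queue-link -v batch-vnet -e false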

Infrastructure

  1. az extension add --name aks-preview
  2. az extension update --name aks-preview
  3. az login
  4. az feature register --namespace "Microsoft.ContainerService" --name "AKS-AzureKeyVaultSecretsProvider"
  5. az feature register --namespace "Microsoft.ContainerService" --name "EnablePodIdentityPreview"
  6. az feature register --namespace "Microsoft.ContainerService" --name "AKS-OpenServiceMesh"
  7. az feature register --namespace "Microsoft.ContainerService" --name "DisableLocalAccountsPreview"
  8. az feature list -o table --query "[?contains(name, 'Microsoft.ContainerService')].{Name:name,State:properties.state}"
  9. Wait until the above features show as Registered (see the polling sketch after this list).
  10. Update uat.tfvars with values for your environment
  11. az provider register --namespace Microsoft.ContainerService
  12. cd infrastructure
  13. terraform init -backend=true -backend-config="access_key=${access_key}" -backend-config="key=uat.terraform.tfstate"
  14. terraform plan -out="uat.plan" -var "resource_group_name=DevSub_K8S_RG" -var-file="uat.tfvars"
  15. terraform apply -auto-approve "uat.plan"
  16. ./aks-keda-install.sh $SUBSCRIPTION_ID $RG $CLUSTER_NAME $KEDA_IDENTITY $BATCH_IDENTITY
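
The wait in step 9 can be scripted. A small polling sketch (the feature list mirrors steps 4-7; the 30-second interval is an arbitrary choice):

  for feature in AKS-AzureKeyVaultSecretsProvider EnablePodIdentityPreview AKS-OpenServiceMesh DisableLocalAccountsPreview; do
      # Block until this preview feature reports Registered
      while [ "$(az feature show --namespace Microsoft.ContainerService --name ${feature} --query properties.state -o tsv)" != "Registered" ]; do
          sleep 30
      done
  done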

Source

Build

  1. cd source
  2. az login
  3. az acr login -n ${ACR_NAME}
  4. docker build -f DOCKERFILE -t ${ACR_NAME}.azurecr.io/queue-processor:${BUILD_ID} .
  5. docker push ${ACR_NAME}.azurecr.io/queue-processor:${BUILD_ID}
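
As a usage sketch, the variables above might be set as follows (the registry name is hypothetical, and tagging with the short git hash is just one convention):

  ACR_NAME=batchdemoacr
  BUILD_ID=$(git rev-parse --short HEAD)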

Deploy

  1. cd chart
  2. Update values.yaml with the image repository and tag pushed above (or override at install time; see the sketch after this list)
  3. helm upgrade -i batchdemo .
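
A sketch of overriding the image coordinates at install time instead of editing values.yaml, assuming the chart exposes image.repository and image.tag keys (check values.yaml for the actual key names):

  helm upgrade -i batchdemo . \
      --set image.repository=${ACR_NAME}.azurecr.io/queue-processor \
      --set image.tag=${BUILD_ID}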

Data Factory Pipeline

TBD

Validation

TBD

Backlog

  • Update README with additional details
