Add this skill:

```bash
npx mdskills install sickn33/azure-ai-ml-py
```

Comprehensive Azure ML SDK reference with excellent coverage of core operations and code examples.
---
name: azure-ai-ml-py
description: |
  Azure Machine Learning SDK v2 for Python. Use for ML workspaces, jobs, models, datasets, compute, and pipelines.
  Triggers: "azure-ai-ml", "MLClient", "workspace", "model registry", "training jobs", "datasets".
package: azure-ai-ml
---

# Azure Machine Learning SDK v2 for Python

Client library for managing Azure ML resources: workspaces, jobs, models, data, and compute.

## Installation

```bash
pip install azure-ai-ml
```

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_ML_WORKSPACE_NAME=<your-workspace-name>
```

## Authentication

```python
import os

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"],
    resource_group_name=os.environ["AZURE_RESOURCE_GROUP"],
    workspace_name=os.environ["AZURE_ML_WORKSPACE_NAME"],
)
```

### From Config File

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Uses config.json in the current directory or a parent directory
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
```

## Workspace Management

### Create Workspace

```python
from azure.ai.ml.entities import Workspace

ws = Workspace(
    name="my-workspace",
    location="eastus",
    display_name="My Workspace",
    description="ML workspace for experiments",
    tags={"purpose": "demo"},
)

ml_client.workspaces.begin_create(ws).result()
```

### List Workspaces

```python
for ws in ml_client.workspaces.list():
    print(f"{ws.name}: {ws.location}")
```

## Data Assets

### Register Data

```python
from azure.ai.ml.entities import Data
from azure.ai.ml.constants import AssetTypes

# Register a single file as a data asset
my_data = Data(
    name="my-dataset",
    version="1",
    path="azureml://datastores/workspaceblobstore/paths/data/train.csv",
    type=AssetTypes.URI_FILE,
    description="Training data",
)

ml_client.data.create_or_update(my_data)
```

### Register Folder

```python
my_data = Data(
    name="my-folder-dataset",
    version="1",
    path="azureml://datastores/workspaceblobstore/paths/data/",
    type=AssetTypes.URI_FOLDER,
)

ml_client.data.create_or_update(my_data)
```

## Model Registry

### Register Model

```python
from azure.ai.ml.entities import Model
from azure.ai.ml.constants import AssetTypes

model = Model(
    name="my-model",
    version="1",
    path="./model/",
    type=AssetTypes.CUSTOM_MODEL,
    description="My trained model",
)

ml_client.models.create_or_update(model)
```

### List Models

```python
for model in ml_client.models.list(name="my-model"):
    print(f"{model.name} v{model.version}")
```

## Compute

### Create Compute Cluster

```python
from azure.ai.ml.entities import AmlCompute

cluster = AmlCompute(
    name="cpu-cluster",
    type="amlcompute",
    size="Standard_DS3_v2",
    min_instances=0,
    max_instances=4,
    idle_time_before_scale_down=120,
)

ml_client.compute.begin_create_or_update(cluster).result()
```

### List Compute

```python
for compute in ml_client.compute.list():
    print(f"{compute.name}: {compute.type}")
```

## Jobs

### Command Job

```python
from azure.ai.ml import command, Input

job = command(
    code="./src",
    command="python train.py --data ${{inputs.data}} --lr ${{inputs.learning_rate}}",
    inputs={
        "data": Input(type="uri_folder", path="azureml:my-dataset:1"),
        "learning_rate": 0.01,
    },
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",
    display_name="training-job",
)

returned_job = ml_client.jobs.create_or_update(job)
print(f"Job URL: {returned_job.studio_url}")
```

### Monitor Job

```python
ml_client.jobs.stream(returned_job.name)
```

## Pipelines

```python
from azure.ai.ml import dsl, Input

# prep_component and train_component are assumed to be components that were
# previously loaded (load_component) or fetched (ml_client.components.get)

@dsl.pipeline(
    compute="cpu-cluster",
    description="Training pipeline",
)
def training_pipeline(data_input):
    prep_step = prep_component(data=data_input)
    train_step = train_component(
        data=prep_step.outputs.output_data,
        learning_rate=0.01,
    )
    return {"model": train_step.outputs.model}

pipeline = training_pipeline(
    data_input=Input(type="uri_folder", path="azureml:my-dataset:1")
)

pipeline_job = ml_client.jobs.create_or_update(pipeline)
```

## Environments

### Create Custom Environment

```python
from azure.ai.ml.entities import Environment

env = Environment(
    name="my-env",
    version="1",
    image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
    conda_file="./environment.yml",
)

ml_client.environments.create_or_update(env)
```

## Datastores

### List Datastores

```python
for ds in ml_client.datastores.list():
    print(f"{ds.name}: {ds.type}")
```

### Get Default Datastore

```python
default_ds = ml_client.datastores.get_default()
print(f"Default: {default_ds.name}")
```

## MLClient Operations

| Property | Operations |
|----------|------------|
| `workspaces` | create, get, list, delete |
| `jobs` | create_or_update, get, list, stream, cancel |
| `models` | create_or_update, get, list, archive |
| `data` | create_or_update, get, list |
| `compute` | begin_create_or_update, get, list, delete |
| `environments` | create_or_update, get, list |
| `datastores` | create_or_update, get, list, get_default |
| `components` | create_or_update, get, list |

## Best Practices

1. **Use versioning** for data, models, and environments
2. **Configure idle scale-down** to reduce compute costs
3. **Use environments** for reproducible training
4. **Stream job logs** to monitor progress
5. **Register models** after successful training jobs
6. **Use pipelines** for multi-step workflows
7. **Tag resources** for organization and cost tracking
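The `${{inputs.<name>}}` placeholders in a command job's `command` string are resolved by the Azure ML service at submission/run time, not by Python. As a purely local illustration of that substitution idea, here is a small sketch; the `resolve_command` helper is our own, not part of the SDK:

```python
import re

def resolve_command(template: str, inputs: dict) -> str:
    """Expand ${{inputs.<name>}} placeholders in a command template.

    Illustration only: the real resolution happens service-side, and
    data inputs are replaced with mount/download paths on the compute.
    """
    return re.sub(
        r"\$\{\{\s*inputs\.(\w+)\s*\}\}",
        lambda m: str(inputs[m.group(1)]),
        template,
    )

cmd = resolve_command(
    "python train.py --data ${{inputs.data}} --lr ${{inputs.learning_rate}}",
    {"data": "/mnt/azureml/data", "learning_rate": 0.01},
)
print(cmd)  # python train.py --data /mnt/azureml/data --lr 0.01
```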
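Since `MLClient` reads the three environment variables listed earlier, it can help to fail fast on missing configuration instead of hitting an opaque authentication error later. A minimal stdlib sketch; the `check_env` helper is our own, not an SDK function:

```python
import os

REQUIRED_VARS = (
    "AZURE_SUBSCRIPTION_ID",
    "AZURE_RESOURCE_GROUP",
    "AZURE_ML_WORKSPACE_NAME",
)

def check_env(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example with a deliberately incomplete environment:
missing = check_env({"AZURE_SUBSCRIPTION_ID": "00000000-0000-0000-0000-000000000000"})
print(missing)  # ['AZURE_RESOURCE_GROUP', 'AZURE_ML_WORKSPACE_NAME']
```

Calling `check_env()` with no argument inspects the real `os.environ`, so a script can raise early if the returned list is non-empty.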