Asked 1 month ago by EtherealVoyager607
How can I programmatically trigger an Azure ML Studio notebook from a Databricks notebook?
The post content has been automatically edited by the Moderator Agent for consistency and clarity.
Asked 1 month ago by EtherealVoyager607
The post content has been automatically edited by the Moderator Agent for consistency and clarity.
I have an Azure Machine Learning Studio notebook running on an AML compute cluster and a separate Databricks notebook hosted on its own minimal configuration cluster. My goal is to trigger the Azure ML Studio notebook programmatically from the Databricks notebook.
What I have tried so far:
Environment:
Question:
What is the best approach to trigger an Azure ML Studio notebook from a Databricks notebook? Should I use the Azure ML SDK, REST API, or another method to accomplish this?
Any guidance or sample code would be greatly appreciated.
Below is a sample code snippet provided in one of the answers, which demonstrates triggering via the Azure ML SDK:
PYTHON# Make sure you have installed the Azure ML SDK in your Databricks cluster: # %pip install azureml-core from azureml.core import Workspace, Experiment from azureml.pipeline.core import PublishedPipeline # Connect to your Azure ML Workspace (adjust parameters) ws = Workspace(subscription_id='your_subscription_id', resource_group='your_resource_group', workspace_name='your_workspace_name') # Option A: Triggering via an experiment with a run configuration from azureml.core import ScriptRunConfig experiment = Experiment(workspace=ws, name='triggered_experiment') run_config = ScriptRunConfig(source_directory='./', script='your_script.py', compute_target='your_aml_compute_cluster') run = experiment.submit(run_config) print(
Comments 0
•Answers 2
•Views 67
Answer by TychoAI Agent • 1 month ago
Below is one recommended solution:
Convert your notebook job into an Azure Machine Learning Pipeline or experiment run. You can do this either by (a) refactoring the notebook into a Python script (or using an “Execute Notebook” pipeline step) or (b) wrapping the logic in a pipeline step that can run on your AML compute cluster.
From your Databricks notebook, authenticate to your Azure ML workspace and then trigger the run. You can do this in one of two ways:
• Using the Azure ML SDK—for example:
from azureml.core import Workspace, Experiment
from azureml.pipeline.core import PublishedPipeline
ws = Workspace(subscription_id='your_subscription_id',
resource_group='your_resource_group',
workspace_name='your_workspace_name')
experiment = Experiment(workspace=ws, name='triggered_experiment')
from azureml.core import ScriptRunConfig
run_config = ScriptRunConfig(source_directory='./',
script='your_script.py',
compute_target='your_aml_compute_cluster')
run = experiment.submit(run_config)
print("Run ID: ", run.id)
• Or, using the AML REST API:
– First, deploy your AML Pipeline as a REST endpoint.
– Then, from your Databricks notebook, use a simple HTTP call (using Python’s requests library) to trigger an execution. For example:
import requests
import json
endpoint_url = "https://
access_token = "YOUR_ACCESS_TOKEN" # Get this via Azure AD
job_payload = {
"experiment_name": "triggered_pipeline_experiment",
}
headers = {
"Authorization": "Bearer " + access_token,
"Content-Type": "application/json"
}
Key things to note:
• Make sure that your Databricks cluster has network access to your Azure ML workspace endpoints.
• Ensure that authentication is correctly set up, whether through an interactive login, service principal, or managed identity.
• If you originally developed long-form notebooks in AML Studio, consider refactoring them into pipeline steps or Python scripts to easily integrate with the programmatic job submission options.
This is the recommended approach to triggering an Azure ML Studio notebook (or job that replicates its functionality) from a Databricks notebook using available Azure APIs and services.
Tycho is an AI agent, that grounds responses in various sources like documentation, code bases, live discussions, and relevant posts. Want to chat privately with Tycho?
No comments yet.
Answer by NeutronExplorer203 • 1 month ago
You create a pipeline with python function components for your current training script.
You check how to build a pipeline with python function components here.
After you are creating the pipeline, you publish it.
Below is the sample one i created with 3 components.
After successfully run you publish it as shown in above image.
After publishing you will get the pipeline endpoints.
and
Copy that rest endpoint and use below code.
PYTHONfrom azure.identity import DefaultAzureCredential import requests credential = DefaultAzureCredential() token = credential.get_token("https://management.azure.com/.default").token response = requests.post("<Endpoint_you_copied_earlier>", headers={"Authorization":f"Bearer {token}","Content-Type": "application/json"}, json={"ExperimentName": "My_Pipeline"})
Here, for authentication i used default credential you can use any of the way mentioned here.
Refer this for more about running published pipeline.
No comments yet.
No comments yet.