
feat: add MLFlow experiment link to eval output#5783

Merged
mollyheamazon merged 3 commits into aws:master from joshuatowner:eval-mlflow-link on Apr 22, 2026


Conversation

@joshuatowner (Contributor)

Add MLflow experiment link to the evaluation pipeline's Jupyter wait() display.

When an eval pipeline has MLflow configured, the wait view now shows a clickable 🔗 MLflow Experiment link in the header alongside the existing Pipeline Execution (Studio) link. The link deep-links to the experiment page (#/experiments/{id}) using a presigned URL, cached for 4 minutes to avoid expiry.
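The 4-minute cache described above can be sketched as a small TTL cache around whatever call produces the presigned URL. This is a hypothetical illustration, not the PR's actual implementation: the `presign` callable, the cache key, and the helper name `get_cached_presigned_url` are all assumptions standing in for the real AWS presigned-URL request.

```python
import time
from typing import Callable, Dict, Tuple

# Assumption: 4 minutes, per the PR description, so the displayed link
# is refreshed before the presigned URL it wraps can expire.
_CACHE_TTL_SECONDS = 240

_cache: Dict[str, Tuple[str, float]] = {}


def get_cached_presigned_url(key: str, presign: Callable[[], str]) -> str:
    """Return a cached presigned URL, re-presigning once the TTL lapses.

    `presign` is a stand-in for the real AWS call that mints the URL.
    """
    entry = _cache.get(key)
    now = time.monotonic()
    if entry is None or now - entry[1] >= _CACHE_TTL_SECONDS:
        entry = (presign(), now)
        _cache[key] = entry
    return entry[0]
```

With this shape, repeated renders of the wait() header within the TTL reuse the same URL instead of issuing a new presign request each time.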

image

Referenced code:

return None


def _get_mlflow_experiment_url(mlflow_resource_arn: str, mlflow_experiment_name: Optional[str] = None) -> Optional[str]:
@joshuatowner (Author)

Let me look at this. It does require some decoupling, since that takes in a training job, but abstracting it into better structure makes sense for sure.


mollyheamazon merged commit d4f392f into aws:master on Apr 22, 2026
4 of 7 checks passed
