Conversation

@riley-harper (Contributor) commented:

Closes #227 (Set Spark Job Descriptions in the UI).

This adds an hlink.linking.util.set_job_description() function which you can use as a context manager to update Spark's job description, a thread-local value that is displayed in the Spark UI. I chose a context manager because SparkContext.setJobDescription() sets the description for all following jobs until you call setJobDescription(None); wrapping it in a context manager is a simple way to make sure we always reset it.
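For illustration, here is a minimal sketch of how such a context manager can be built with contextlib, assuming it takes a SparkSession and a description string (the exact signature of hlink.linking.util.set_job_description() may differ). SparkContext.setJobDescription() is the underlying PySpark call.

```python
from contextlib import contextmanager

@contextmanager
def set_job_description(spark, description):
    """Temporarily set the job description shown in the Spark UI.

    Minimal sketch only; the real hlink implementation may differ.
    """
    spark.sparkContext.setJobDescription(description)
    try:
        yield
    finally:
        # Reset so that later jobs don't silently inherit this description.
        spark.sparkContext.setJobDescription(None)
```

Usage then looks roughly like this (the description string and variable names are placeholders, not taken from hlink):

```python
with set_job_description(spark, "matching: scoring potential matches"):
    potential_matches.write.mode("overwrite").parquet(output_path)
```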

I also removed some verbose debug logging and added more useful log messages in a few spots.

@riley-harper riley-harper requested a review from ccdavis November 21, 2025 18:22
@ccdavis (Contributor) left a comment:

The context manager approach is really nice. Besides making running jobs easier to interpret, I think it also makes the code where it's used more readable. I'd like to see how an hlink log and the Spark UI look with this change. Looks good.

@riley-harper riley-harper merged commit 8237724 into main Nov 21, 2025
6 checks passed
@riley-harper riley-harper deleted the spark-job-descriptions branch November 21, 2025 20:36