Coding Challenge #3
Stuck Jobs Finder — Detect Hanging CronJobs and Kubernetes Jobs
The Problem
You get paged about failed data syncs or missing reports… again.
Turns out, a Kubernetes Job or CronJob has been stuck for hours — running forever or failing silently.
You want a quick way to identify Jobs and CronJobs that are hanging, failed, or have exceeded their expected duration.
Your Mission
Write a Python script using the Kubernetes Python client that:
✅ Accepts a namespace (default: default)
✅ Lists all Jobs and CronJobs in the namespace
✅ For each Job or CronJob:
Checks the status conditions
Calculates how long the job has been running
If it’s running for more than a threshold (e.g., 30 minutes), flags it as "Stuck"
If it’s failed, reports the reason and number of retries
✅ Prints a concise summary of:
Job/CronJob name
Type: Job or CronJob
Status: Running / Failed / Succeeded
Duration / Retry count
Reason (if any)
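Here is one possible starting point, a minimal sketch rather than a full solution: it covers only the Job half of the mission and assumes your kubeconfig (or in-cluster config) grants read access to Jobs. The helper name job_status() is illustrative, not part of the client library.

```python
import argparse
from datetime import datetime, timezone

from kubernetes import client, config


def job_status(job):
    """Classify a V1Job as Succeeded / Failed / Running from its status block."""
    if job.status.succeeded:
        return "Succeeded"
    for cond in job.status.conditions or []:
        if cond.type == "Failed" and cond.status == "True":
            return "Failed"
    return "Running"


def check_jobs(namespace, threshold_seconds):
    batch_v1 = client.BatchV1Api()
    now = datetime.now(timezone.utc)

    for job in batch_v1.list_namespaced_job(namespace).items:
        state = job_status(job)
        start = job.status.start_time          # None if the Job never started
        end = job.status.completion_time or now
        duration = (end - start).total_seconds() if start else 0

        print(f"[Job] {job.metadata.name}")
        print(f"[Status] {state}")
        print(f"[Duration] {int(duration // 60)} minutes")

        if state == "Running" and duration > threshold_seconds:
            print("[Note] Possibly stuck (exceeded threshold)")
        elif state == "Failed":
            reason = next((c.reason for c in job.status.conditions or []
                           if c.type == "Failed"), "Unknown")
            print(f"[Retries] {job.status.failed or 0}")
            print(f"[Reason] {reason}")
        print()


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Find stuck or failed Jobs")
    parser.add_argument("--namespace", default="default")
    parser.add_argument("--threshold", type=int, default=1800,
                        help="seconds a Job may run before it is flagged as stuck")
    args = parser.parse_args()

    config.load_kube_config()   # inside a Pod, use config.load_incluster_config()
    check_jobs(args.namespace, args.threshold)
```

The CronJob half, the bonus flags, and nicer reporting are left to you.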
🧪 Example Output
$ python3 stuck_jobs_finder.py --namespace batch --threshold 1800
[Job] sync-users-job-32452
[Status] Running
[Duration] 43 minutes
[Note] Possibly stuck — exceeded threshold
[CronJob] nightly-backup
[Last Job] nightly-backup-23948
[Status] Failed
[Retries] 3
[Reason] Pod Evicted
[Job] report-generator-job-81230
[Status] Succeeded
[Duration] 12 minutes
📦 Bonus
Add --all-namespaces support
Export output to CSV or JSON
Add a --threshold flag to customize the stuck time in seconds
Highlight stuck/failed jobs in red, succeeded in green
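If you try the bonus items, here is a hedged sketch of two of them, JSON export and red/green highlighting. The results list and its field names are assumptions about how your main script collects data, not anything the challenge mandates. For --all-namespaces, recent client versions also expose list_job_for_all_namespaces() and list_cron_job_for_all_namespaces().

```python
import json

RED, GREEN, RESET = "\033[31m", "\033[32m", "\033[0m"


def export_json(results, path="stuck_jobs_report.json"):
    """Write collected summaries (a list of dicts) to a JSON file."""
    with open(path, "w") as fh:
        json.dump(results, fh, indent=2, default=str)


def colorize(status):
    """Red for failed/stuck entries, green for succeeded ones."""
    if status in ("Failed", "Stuck"):
        return f"{RED}{status}{RESET}"
    if status == "Succeeded":
        return f"{GREEN}{status}{RESET}"
    return status
```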
📚 Tips
Use client.BatchV1Api().list_namespaced_job() to list Jobs
Check status.conditions, status.start_time, status.failed, and status.succeeded
For CronJobs, check status.last_schedule_time and the linked Jobs
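To make the CronJob tip concrete, here is a minimal sketch. It assumes a cluster and client recent enough that CronJobs live under batch/v1 (BatchV1Api); older versions expose them via BatchV1beta1Api instead. It reads last_schedule_time and links child Jobs back to their CronJob through ownerReferences.

```python
from datetime import datetime, timezone

from kubernetes import client, config


def check_cronjobs(namespace="default"):
    batch_v1 = client.BatchV1Api()
    now = datetime.now(timezone.utc)
    jobs = batch_v1.list_namespaced_job(namespace).items

    for cj in batch_v1.list_namespaced_cron_job(namespace).items:
        print(f"[CronJob] {cj.metadata.name}")
        last = cj.status.last_schedule_time
        if last:
            minutes_ago = int((now - last).total_seconds() // 60)
            print(f"[Last scheduled] {minutes_ago} minutes ago")

        # Child Jobs carry an ownerReference that points back at the CronJob.
        children = [
            j for j in jobs
            if any(o.kind == "CronJob" and o.name == cj.metadata.name
                   for o in j.metadata.owner_references or [])
        ]
        children.sort(key=lambda j: j.metadata.creation_timestamp)
        if children:
            latest = children[-1]
            print(f"[Last Job] {latest.metadata.name} "
                  f"(failed pods: {latest.status.failed or 0})")
        print()


if __name__ == "__main__":
    config.load_kube_config()
    check_cronjobs()
```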
Docs: Kubernetes Python Client (https://github.com/kubernetes-client/python)
Why This Is Useful
Prevents silent job failures
Helps teams maintain job reliability
Great addition to DevOps observability scripts
Can be run as a daily cronjob to notify of stuck/failed jobs
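If you do run it on a schedule, a few extra lines can turn the summary into a notification. A hedged sketch using a Slack incoming webhook; the SLACK_WEBHOOK_URL environment variable and the notify() helper are assumptions for illustration, not part of the challenge.

```python
import json
import os
import urllib.request


def notify(summary_lines):
    """POST a plain-text summary to a Slack incoming webhook, if configured."""
    url = os.environ.get("SLACK_WEBHOOK_URL")   # assumed env var, set it yourself
    if not url or not summary_lines:
        return
    body = json.dumps({"text": "\n".join(summary_lines)}).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)
```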
Follow me on X @sharonsahadevan and connect on LinkedIn @sharonsahadevan for more real-world DevOps content.
Kubenatives Newsletter is a reader-supported publication. To get new challenges and deep-dive Kubernetes content, consider subscribing.


