Dutech’s Job
Senior Databricks Administrator (AWS, Spark & Data Platform)
Austin, TX
Date Posted: 4/2/2026 2:48:14 PM
Job Number: DTS1017187681
Job Type: W2 or C2C
Skills: Databricks, AWS, Apache Spark, S3, RBAC, IAM, SCIM, Terraform, CI/CD, Data Governance, Unity Catalog, MLflow, Data Lake, Lakehouse
Job Description
We are seeking an experienced Senior Databricks Administrator to manage and optimize our Databricks platform in an AWS cloud environment. The ideal candidate will have strong expertise in Databricks workspace administration, Apache Spark, cloud integrations, and data platform governance.
This role focuses on ensuring platform performance, security, scalability, and cost optimization for enterprise data and AI workloads.
Key Responsibilities:
- Administer and manage Databricks workspaces in AWS cloud environments
- Configure and optimize Databricks clusters, job scheduling, and workspace settings
- Manage user access, roles, and permissions using IAM, SCIM, and RBAC
- Implement and enforce cluster policies and governance standards
- Monitor platform health, performance, and availability
- Optimize Apache Spark workloads for performance and scalability
- Integrate Databricks with cloud storage services (e.g., Amazon S3)
- Manage and support Databricks SQL, notebooks, and job orchestration
- Ensure data security, encryption, and compliance requirements are met
- Automate infrastructure and workflows using Terraform, scripting, and CI/CD pipelines
- Support AI/ML workloads using Databricks Machine Learning and MLflow (as needed)
Required Qualifications:
- 8+ years of experience in Databricks administration in AWS environments
- Strong expertise in Databricks cluster configuration and workspace management
- Hands-on experience with Apache Spark (performance tuning & troubleshooting)
- Experience managing IAM, SCIM, and role-based access control (RBAC)
- Experience integrating Databricks with cloud storage (S3 or similar)
- Strong understanding of data security, encryption, and compliance
- Experience with DevOps tools (Terraform, CI/CD pipelines, scripting)
Preferred Qualifications:
- Experience in enterprise or government environments
- Familiarity with Databricks Unity Catalog for governance
- Experience with cost optimization strategies for Databricks workloads
- Knowledge of data lake/lakehouse architectures
- Experience supporting AI/ML workloads (MLflow)
- Programming knowledge in Python, SQL, or Scala