Sr. Databricks Architect
Remote
Job Id: 145838
Job Category:
Job Location: Remote
Security Clearance: Public Trust or Uncleared
Business Unit: Zachary Piper
Division: Zachary Piper Solutions
Position Owner: Cameron Bagwell
*Applicants should be able to obtain and maintain a US Public Trust Clearance*
Zachary Piper Solutions is seeking a Sr. Databricks Architect to support a longstanding Department of the Treasury program. This is a fully remote position requiring a strong data engineering background to support the design and delivery of secure, monitored, cloud-hosted environments. The ideal candidate will have hands-on experience with data science and analytics, extensive use of AI/ML and LLM models, AI/MLOps, DevSecOps, and GenAI tools and technologies, along with expertise in deploying solutions to high-stakes production environments.
Responsibilities:
- Design and implement event-driven architecture to support scalable, resilient, and dynamic GenAI applications.
- Refactor and rehost existing applications onto cloud platforms, optimizing for performance and scalability.
- Collaborate on the development of Platform Architecture, ensuring seamless integration and efficient operation of cloud services.
- Develop and optimize Databricks Solution Architectures to handle large-scale data processing and analytics workloads.
- Engineer robust Bedrock Solution Architectures to support infrastructure provisioning and management at scale.
- Define and execute Deployment Architectures that allow for automated and reliable deployments of applications and services.
- Establish and enforce Security Architectures to protect data and maintain compliance with industry and regulatory standards.
- Proactively identify and address technical issues and challenges to improve the efficiency and reliability of cloud-based systems.
- Document architectural decisions and create best practice guidelines for cloud development and deployment.
- Coach and provide direction to junior engineers in Databricks and data engineering.
- Lead efforts to integrate various data sources, ensuring seamless data flow and transformation. Develop strategies for data ingestion, cleansing, normalization, and integration.
- Leverage experience with MLFlow and general CI/CD principles for automating model deployment.
- Collaborate with customers and key stakeholders as a subject-matter expert (SME) to develop solutions that align with long-term roadmaps and strategy.
- Execute Databricks upgrades, particularly in the federal space.
Qualifications:
- 7+ years of relevant experience and a Bachelor's degree
- Proven experience in cloud engineering and architecture, with a focus on event-driven solutions
- Proficiency with Databricks and understanding of big data ecosystems
- Experience with Infrastructure as Code (IaC) using tools like Terraform, AWS CloudFormation, etc.
- Strong understanding of deployment pipelines and automated testing frameworks
- Clear understanding of microservices architecture and containerization technologies (e.g., Docker, Kubernetes)
Compensation:
- $120,000 - $155,000 *depending on experience*
- Health, Dental, Vision
- PTO, 401K, Paid Holidays, Sick Leave (where required by law)
This job posting opens on 7/16/25 and will remain open for at least 30 days from the posting date