
Databricks Platform Architect - Unity Catalog Enablement at Infinitive

Infinitive - McLean, VA

Job Description

About Infinitive

Infinitive is a data & AI consultancy that enables global brands to deliver results through insights, innovation, and efficiency. We possess deep industry and technology expertise to drive and sustain adoption of new capabilities. We match our people and personalities to our clients' culture while bringing the right mix of talent and skills to enable a high return on investment. Infinitive has been named one of the Best Small Firms to Work For by Consulting Magazine eight times and has also been recognized as a Washington Post Top Workplace, a Washington Business Journal Best Place to Work, and a Virginia Business Best Place to Work.

Role Overview

This architect will define and shape a unified platform service that enables scalable, governed, and cost-efficient data access across the bank. The ideal candidate will influence enterprise design standards and technical adoption by making Databricks Unity Catalog the effortless, observable, and default foundation for data integration, governance, and analytics across all business domains.

Key Responsibilities

Platform Vision & Architecture
- Define and champion the end-to-end architecture for the bank's Databricks-based data platform, ensuring scalability, security, cost efficiency, and ease of adoption.
- Design a self-service platform layer that leverages Databricks Unity Catalog to deliver seamless data discovery, access, and observability across all environments.
- Establish architectural patterns and reference implementations that encourage enterprise-wide reuse and standardization.

Unity Catalog Strategy & Enablement
- Lead the design and implementation of Databricks Unity Catalog as the central governance plane, defining catalog hierarchies, fine-grained access controls, and cross-environment lineage (a brief sketch of these concepts follows this section).
- Evaluate and implement metadata, RBAC/ABAC, and data masking capabilities to meet regulatory and compliance requirements (e.g., GLBA, GDPR, HIPAA).
- Define the template architecture that allows Unity Catalog to operate as a scalable and cost-effective shared service across lines of business.
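
To make the governance responsibilities above concrete, here is a minimal sketch of how a catalog hierarchy, group-based grants, and column masking can be expressed through Unity Catalog SQL issued from PySpark. It assumes a Databricks cluster with Unity Catalog enabled and an active SparkSession named `spark`; the catalog, schema, table, and group names (finance_prod, customer, accounts, analysts, pii_readers) are hypothetical, not part of the posting.

```python
# Sketch: Unity Catalog hierarchy, grants, and column masking (hypothetical names).

# 1. Catalog hierarchy: catalog -> schema -> table (three-level namespace).
spark.sql("CREATE CATALOG IF NOT EXISTS finance_prod")
spark.sql("CREATE SCHEMA IF NOT EXISTS finance_prod.customer")
spark.sql("""
    CREATE TABLE IF NOT EXISTS finance_prod.customer.accounts (
        account_id BIGINT,
        ssn STRING,
        balance DECIMAL(18, 2)
    )
""")

# 2. Fine-grained, group-based access control at each level of the hierarchy.
spark.sql("GRANT USE CATALOG ON CATALOG finance_prod TO `analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA finance_prod.customer TO `analysts`")
spark.sql("GRANT SELECT ON TABLE finance_prod.customer.accounts TO `analysts`")

# 3. Column masking: only members of a privileged group see raw SSNs.
spark.sql("""
    CREATE FUNCTION IF NOT EXISTS finance_prod.customer.mask_ssn(ssn STRING)
    RETURNS STRING
    RETURN CASE
        WHEN is_account_group_member('pii_readers') THEN ssn
        ELSE '***-**-****'
    END
""")
spark.sql("""
    ALTER TABLE finance_prod.customer.accounts
    ALTER COLUMN ssn SET MASK finance_prod.customer.mask_ssn
""")
```

Granting USE CATALOG and USE SCHEMA at the container levels, with SELECT only at the table level, mirrors the hierarchical, least-privilege access model described above.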

Scalability, Cost, and Observability
- Engineer platform capabilities that provide deep visibility into compute, storage, and catalog operations through integrated observability, monitoring, and FinOps practices.
- Develop resource optimization strategies to balance performance and cost while maintaining compliance and SLAs.
- Establish metrics, dashboards, and alerts to ensure the platform scales predictably under enterprise workloads.

API and Integration Design
- Architect streamlined RESTful/GraphQL APIs for secure, governed data access and metadata integration.
- Ensure interoperability with enterprise systems, APIs, and external data consumers using modern, consistent, and documented integration patterns.

Data Modeling & Pipeline Strategy
- Guide teams in building Lakehouse-aligned data models that maximize reuse and governance.
- Oversee design of ETL/ELT architectures (Spark, PySpark, SQL) that integrate seamlessly with Unity Catalog for lineage and access tracking (a minimal pipeline sketch follows the posting details below).

Collaboration & Influence
- Partner with engineering, data science, and risk teams to align platform design with business outcomes and regulatory expectations.
- Influence architecture steering committees and platform engineering groups to adopt the Databricks foundation as a managed, enterprise-wide service.
- Promote a culture of easy adoption through clear design patterns, documentation, and working sessions.

Technical Leadership & Mentorship
- Mentor engineers and architects on Databricks Unity Catalog and best practices for cost, scale, and observability.
- Contribute to internal architecture communities and upskill teams across multiple domains.

Required Skills & Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience: 8 years in data architecture or platform engineering, including experience designing enterprise-scale distributed data environments.
- Databricks Expertise: Deep hands-on knowledge of Databricks, Delta Lake, Apache Spark, and Lakehouse principles.
- Unity Catalog Mastery: Demonstrated success architecting and operationalizing Databricks Unity Catalog for enterprise governance, metadata management, and access control.
- Programming & Data: Advanced proficiency in Python (PySpark) and SQL; experience with cloud data platforms (AWS, Azure, or GCP).
- API Engineering: Strong background in API architecture (REST, GraphQL, OpenAPI) and applying best-in-class security and observability.
- Governance Knowledge: Expert-level understanding of data governance frameworks, data quality management, and regulatory compliance.
- Soft Skills: Outstanding communication and influence skills, with the ability to advocate for design principles across executive, technical, and risk audiences.

Preferred Qualifications
- Experience deploying Databricks and cloud infrastructure using Terraform or IaC frameworks.
- Familiarity with MLflow and model governance integration.
- Relevant certifications (Databricks Certified Data Engineer, AWS/Azure/GCP Architect).
- Experience with real-time data streaming technologies (Kafka, Structured Streaming).

Required Experience: Senior IC
Key Skills: APIs, Pegasystems, Spring, SOAP, .NET, Hybris, Solution Architecture, Service-Oriented Architecture, Adobe Experience Manager, J2EE, Java, Oracle
Employment Type: Full-Time
Experience: years
Vacancy: 1
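
To illustrate the ETL/ELT and lineage responsibilities described above, the following is a minimal sketch of a PySpark step that curates a Delta table inside a Unity Catalog namespace, so that table-level lineage and access control are handled by the catalog. It assumes a Databricks cluster with Unity Catalog enabled and an active SparkSession `spark`; the three-level table names (finance_prod.customer.accounts_raw and accounts_curated) are hypothetical.

```python
# Sketch of an ELT step that lands curated data in Unity Catalog (hypothetical names).
from pyspark.sql import functions as F

# Read a raw Delta table that already lives in the governed namespace.
raw = spark.read.table("finance_prod.customer.accounts_raw")

# Apply lightweight, auditable transformations (typing, audit column, deduplication).
curated = (
    raw
    .withColumn("balance", F.col("balance").cast("decimal(18,2)"))
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["account_id"])
)

# Write back as a managed Delta table under the same catalog and schema.
# Because the write targets a three-level Unity Catalog name, lineage between
# accounts_raw and accounts_curated is captured by the catalog rather than by
# custom tracking code.
(
    curated.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("finance_prod.customer.accounts_curated")
)
```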

Resume Suggestions

Highlight relevant experience and skills that match the job requirements to demonstrate your qualifications.

Quantify your achievements with specific metrics and results whenever possible to show impact.

Emphasize your proficiency in relevant technologies and tools mentioned in the job description.

Showcase your communication and collaboration skills through examples of successful projects and teamwork.
