Data Engineer Sr. Consultant at Visa
Visa
Bellevue, WA
Information Technology
Posted 0 days ago
Job Description
Visa is building a next-generation Agentic AI and Data Platform to enable intelligent automation, adaptive workflows, and data-driven decision-making across our global post-purchase ecosystem. As a Data Engineer, you will lead the architecture, modernization, and scaling of Visa's enterprise data infrastructure, bridging the existing SQL Server-based data warehouse with the new Hadoop and Databricks data lake platform. You will ensure seamless data flow, optimized performance, and integration of data services that support AI, GenAI, and Agentic systems. This is a hands-on leadership role requiring deep data engineering expertise and the ability to work across both legacy modernization and new platform innovation. You will collaborate closely with the Principal Agentic Engineer, ML teams, and business stakeholders to enable AI-driven insights and intelligent data orchestration.

Essential Functions:

Data Platform Modernization
Lead the transition from the SQL Server data warehouse to the next-generation Hadoop/Databricks platform, ensuring performance, reliability, and minimal business disruption. Architect hybrid data pipelines that bridge legacy and modern ecosystems, enabling near real-time data access for analytics and AI applications (a hedged sketch of this pattern follows below). Optimize existing SQL Server models (facts, dimensions, indexes, stored procedures) and design modern equivalents in Hadoop and Spark environments. Define long-term migration strategy, data partitioning, and retention policies aligned with Visa's data governance standards.

Data Architecture & Engineering
Design and implement scalable, distributed data pipelines using Spark, Kafka, Airflow, and Delta Lake. Build robust ETL/ELT frameworks to process transactional, behavioral, and unstructured data at scale. Partner with the Agentic AI team to power RAG (Retrieval-Augmented Generation) pipelines, vector database integrations, and LLM data provisioning. Lead proof-of-concept (POC) initiatives to evaluate and integrate new data engineering technologies.

Operational Excellence
Provide guidance to junior engineers to maintain existing SQL Server-based data warehouses, including database performance tuning, replication, encryption, and high-availability (HA) solutions. Implement best practices in T-SQL development, schema design, and stored procedure optimization. Perform proactive performance analysis, troubleshooting, and resolution of production issues in SQL Server and Hadoop clusters. Collaborate with DBA and application teams to ensure uptime within SLAs and compliance with Visa's data policies.

Security, Governance & Compliance
Define and enforce standards for data quality, lineage, and governance across both legacy and modern platforms. Ensure full compliance with Visa's data privacy, encryption, and access control frameworks. Partner with InfoSec to embed data security principles into every phase of data lifecycle management.

Leadership & Collaboration
Mentor and coach data engineers, fostering an environment of technical excellence and innovation. Collaborate cross-functionally with product, ML, and platform engineering teams on architecture decisions and roadmap execution. Serve as a bridge between legacy enterprise data teams and next-gen AI platform engineers, ensuring knowledge continuity and execution speed.

This is a hybrid position.
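For context on the "hybrid data pipelines" responsibility referenced above, work of this kind commonly takes the shape of an incremental load from SQL Server into Delta Lake. The following is a minimal, hedged sketch only, assuming a Databricks/PySpark environment with the delta-spark package available; the table names, columns, paths, and credentials are illustrative placeholders and do not come from the posting.

from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Sketch of a hybrid SQL Server -> Delta Lake incremental load.
# All identifiers below (sales.orders, order_id, updated_at, paths, credentials)
# are assumptions for illustration, not details from the job description.
spark = SparkSession.builder.appName("sqlserver-to-delta-bridge").getOrCreate()

JDBC_URL = "jdbc:sqlserver://legacy-dw.example.com:1433;databaseName=SalesDW"  # placeholder
DELTA_PATH = "/mnt/lake/bronze/orders"   # placeholder lake location
WATERMARK = "2024-01-01 00:00:00"        # in practice, restored from pipeline state

# 1. Pull only rows changed since the last run from the legacy warehouse.
incremental = (
    spark.read.format("jdbc")
    .option("url", JDBC_URL)
    .option("query", f"SELECT * FROM sales.orders WHERE updated_at > '{WATERMARK}'")
    .option("user", "etl_user")          # placeholder credentials
    .option("password", "***")
    .load()
)

# 2. Upsert into the Delta table so the lake mirrors the warehouse without duplicates.
if DeltaTable.isDeltaTable(spark, DELTA_PATH):
    (
        DeltaTable.forPath(spark, DELTA_PATH).alias("t")
        .merge(incremental.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    incremental.write.format("delta").mode("overwrite").save(DELTA_PATH)

In practice, an orchestrator such as Airflow would own the watermark state and scheduling; the sketch only illustrates the Spark-side shape of a legacy-to-lake bridge of the kind the role describes.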
The expectation for in-office days will be confirmed by your hiring manager. This role does not offer relocation or immigration support now or in the future.

Qualifications:

Basic Qualifications:
8 or more years of relevant work experience with a Bachelor's Degree, or at least 5 years of experience with an Advanced Degree (e.g., Master's, MBA, JD, MD), or 2 years of work experience with a PhD, OR 11 years of relevant work experience.

Preferred Qualifications:
9 or more years of relevant work experience with a Bachelor's Degree, or 7 or more years of relevant experience with an Advanced Degree (e.g., Master's, MBA, JD, MD), or 3 or more years of experience with a PhD.
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Proven success supporting and modernizing SQL Server-based data warehouses in high-SLA environments.
Production-level experience architecting Hadoop, Spark, and Databricks data pipelines.
Expertise in ETL/ELT frameworks, data modeling, and schema design for analytical and operational use cases.
Strong programming proficiency in Python, Java, or Scala.
Hands-on experience with AWS or Azure (Glue, Synapse, Redshift, Delta Lake, S3).
Familiarity with Kafka, Airflow, Kubernetes, and containerized data services.
Understanding of RAG pipelines, vector databases, and AI data flows.
Experience designing data pipelines and APIs compatible with Model Context Protocol (MCP)-based agent frameworks, enabling seamless integration between AI agents, data services, and enterprise APIs.
Strong SQL optimization, debugging, and production troubleshooting experience.
Experience leading data warehouse migration or modernization from SQL Server to Hadoop/Spark ecosystems.
Deep understanding of data lineage, metadata management, and governance frameworks (e.g., Atlas, Great Expectations).
Familiarity with LangChain, LangGraph, and MCP for integrating AI agents with data systems.
Strong ability to balance innovation and stability across coexisting legacy and modern data architectures.
Proven track record of mentoring engineers and collaborating across global teams.

Leadership Attributes:
Go-Getter: Executes decisively and thrives in complex hybrid data environments.
Builder: Hands-on architect who delivers scalable, production-grade data systems.
Hustler: Operates with urgency and accountability across multiple technology stacks.
Entrepreneurial: Drives innovation in data architecture and modernization.
True North: Leads with integrity and alignment to Visa's mission and long-term data strategy.
Lead by Example: Sets high standards of excellence and transparency.
Execute with Excellence: Ensures reliability, performance, and operational maturity across all data systems.

Tech Stack Snapshot:
Languages: Python, Java, Scala, SQL, T-SQL
Data Systems: SQL Server, Hadoop, Spark, Databricks, Delta Lake, Snowflake
Pipelines: Kafka, Airflow, Glue, Azure Data Factory
AI Integration: LangChain, LangGraph
Cloud Platforms: AWS, Azure
Infra: Docker, Kubernetes, Terraform, Jenkins
Governance: Great Expectations, Atlas, Data Privacy & Encryption Frameworks

Additional Information:
Work Hours: Varies upon the needs of the department.
Travel Requirements: This position requires travel 5-10% of the time.
Mental/Physical Requirements: This position will be performed in an office setting. The position will require the incumbent to sit and stand at a desk, communicate in person and by telephone, and frequently operate standard office equipment such as telephones and computers.

Visa is an EEO Employer.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law. Visa will consider for employment qualified applicants with criminal histories in a manner consistent with applicable local law, including the requirements of Article 49 of the San Francisco Police Code.

U.S. APPLICANTS ONLY: The estimated salary range for a new hire into this position is $152,200.00 to $220,850.00 USD per year, which may include potential sales incentive payments (if applicable). Salary may vary depending on job-related factors, which may include knowledge, skills, and experience. In addition, this position may be eligible for bonus and equity. Visa has a comprehensive benefits package for which this position may be eligible that includes Medical, Dental, Vision, 401(k), FSA/HSA, Life Insurance, Paid Time Off, and a Wellness Program.

Remote Work: No
Employment Type: Full-time

Key Skills: Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Vacancy: 1
Resume Suggestions
Highlight relevant experience and skills that match the job requirements to demonstrate your qualifications.
Quantify your achievements with specific metrics and results whenever possible to show impact.
Emphasize your proficiency in relevant technologies and tools mentioned in the job description.
Showcase your communication and collaboration skills through examples of successful projects and teamwork.