Data Engineer (San Francisco) at Zigma LLC
Zigma LLC
San Francisco, CA
Information Technology
Posted today
Job Description
Company Description
Zigma LLC is a women-owned technology consulting and IT services start-up specializing in Big Data engineering, cloud data modernization, cloud architecture, and advanced analytics. Our mission is to empower organizations through secure, scalable, and high-performance digital ecosystems while maintaining a strong commitment to cybersecurity and compliance. We work with clients across various industries, including healthcare, telecom, and financial services, ranging from local businesses to enterprise-level corporations. Dedicated to fostering inclusion and women's leadership, we strive to deliver innovative solutions that drive operational efficiency and digital transformation. Zigma LLC combines technical expertise with a passion for empowering the next generation of women entrepreneurs.

Data Engineer (Mid-Level) | Hybrid | C2C | Healthcare
Locations: East Bay Area, CA | Greater Los Angeles Area, CA | Oregon's Willamette Valley, OR | Greater Atlanta Area, GA
Employment Type: C2C
Work Authorization: US Citizens, Green Card, H4/L2/Any EAD, OPT/CPT candidates
Work Arrangement: Hybrid
Openings: 3 per location
Experience: 7–12 years
Contract: Long-term (12+ months, performance-based)
Preferred Education/Certification: B.S./M.S. in an engineering discipline such as Computer Science or Data Engineering, or relevant skills and certifications

Join a leading healthcare analytics team as a Data Engineer! Work on Azure Cloud, Databricks, and modern data pipelines to drive insights from complex healthcare datasets. This is a hybrid role with opportunities to collaborate across multiple locations.

Key Responsibilities:
- Design, build, and maintain ETL/ELT ingestion pipelines on Azure Cloud
- Collaborate with data scientists and analysts to ensure data quality, governance, and availability
- Implement batch and streaming data processing workflows
- Optimize data workflows and pipelines for performance and scalability
- Work with HIPAA-compliant healthcare data

Technical Skills & Tools:
- Programming & Scripting: Python, SQL, Scala/Java
- Data Processing Frameworks: Apache Spark, Kafka, Airflow/Prefect
- Databases: Relational (PostgreSQL, MySQL, SQL Server), NoSQL (MongoDB, Cassandra), Data Warehouses (Snowflake, Redshift)
- Data Formats: CSV, JSON, Parquet, Avro, ORC
- Version Control & DevOps: Git, Azure DevOps, CI/CD pipelines
- Cloud & Containerization: Azure Cloud, Docker, Kubernetes, Terraform

Core Skills:
- ETL/ELT ingestion pipeline design
- Batch & streaming data processing
- Data modeling (star/snowflake schema)
- Performance optimization & scalability
- Data governance and security

Must-Have:
- 7–12 years in Data Engineering
- Hands-on Azure Cloud and Databricks experience
- M.S. in Data Science or relevant certifications (Databricks/Data Science)
Resume Suggestions
Highlight relevant experience and skills that match the job requirements to demonstrate your qualifications.
Quantify your achievements with specific metrics and results whenever possible to show impact.
Emphasize your proficiency in relevant technologies and tools mentioned in the job description.
Showcase your communication and collaboration skills through examples of successful projects and teamwork.