Elite Technical is seeking an experienced Senior Data Engineer to join our customer's technology platform team. This role is a critical contributor in designing, developing, and optimizing cloud-based data solutions using Snowflake. You'll leverage advanced Snowflake capabilities, build modern data pipelines, and enable scalable analytics and reporting for enterprise healthcare operations. The ideal candidate will demonstrate deep Snowflake and SQL expertise, hands-on experience with Snowpark (Python), and a strong foundation in data architecture, governance, and automation.
Key Responsibilities
- Design, develop, and optimize data pipelines and transformations within Snowflake using SQL and Snowpark (Python).
- Build and maintain Streams, Tasks, Materialized Views, and Dashboards to enable real-time and scheduled data operations.
- Develop and automate CI/CD pipelines for Snowflake deployments (Jenkins).
- Collaborate with data architects, analysts, and cloud engineers to design scalable and efficient data models.
- Implement data quality, lineage, and governance frameworks aligned with enterprise standards and compliance (e.g., HIPAA, PHI/PII).
- Monitor data pipelines for performance, reliability, and cost efficiency; proactively optimize workloads and resource utilization.
- Integrate Snowflake with dbt and Kafka for end-to-end orchestration and streaming workflows.
- Conduct root cause analysis and troubleshooting for complex data and performance issues in production.
- Collaborate across technology and business teams to translate complex data needs into elegant, maintainable solutions.
This position is (1) a contract-to-permanent opportunity (sponsorship not permitted), (2) hybrid, onsite in the Columbia, MD office 2x per month, and (3) requires an onsite/F2F final interview round in Columbia, MD. Please do not apply if you are unable to meet these requirements.
Required Qualifications
- 5+ years of experience in data engineering or an equivalent field.
- 3+ years hands-on experience with Snowflake Data Cloud, including:
(1) Streams, Tasks, Dashboards, and Materialized Views
(2) Performance tuning, resource monitors, and warehouse optimization
- Strong proficiency in SQL (complex queries, stored procedures, optimization).
- Proficiency in Python, with demonstrated experience using Snowpark for data transformations.
- Experience building CI/CD pipelines for Snowflake using modern DevOps tooling.
- Solid understanding of data modeling methodologies (Kimball, Data Vault, or 3NF).
- Experience with data governance, lineage, and metadata tools (Collibra, Alation, or Azure Purview).
- Strong troubleshooting, analytical, and communication skills with the ability to engage both technical and business audiences.
________________________________________
Preferred Qualifications
- Experience with dbt or Kafka for orchestration and streaming.
- Exposure to data quality frameworks such as Great Expectations or Monte Carlo.
- Understanding of real-time and batch data ingestion architectures.
- Snowflake Certification (SnowPro Core or Advanced).
- Prior experience in healthcare, insurance, or other regulated data environments (HIGHLY PREFERRED).
Hybrid/Columbia, MD
1
Monday, March 9, 2026
Contract
12 months T2P
Wednesday, February 11, 2026
Know someone who would be a good fit? We pay for referrals!