Elite Technical is seeking an experienced Senior Data Engineer to join our customer's technology platform team. This role is a critical contributor to designing, developing, and optimizing cloud-based data solutions using Snowflake. You'll leverage advanced Snowflake capabilities, build modern data pipelines, and enable scalable analytics and reporting for enterprise healthcare operations. The ideal candidate will demonstrate deep Snowflake and SQL expertise, hands-on experience with Snowpark (Python), and a strong foundation in data architecture, governance, and automation.
Key Responsibilities
- Design, develop, and optimize data pipelines and transformations within Snowflake using SQL and Snowpark (Python).
- Build and maintain Streams, Tasks, Materialized Views, and Dashboards to enable real-time and scheduled data operations.
- Develop and automate CI/CD pipelines for Snowflake deployments (Jenkins).
- Collaborate with data architects, analysts, and cloud engineers to design scalable and efficient data models.
- Implement data quality, lineage, and governance frameworks aligned with enterprise standards and compliance (e.g., HIPAA, PHI/PII).
- Monitor data pipelines for performance, reliability, and cost efficiency; proactively optimize workloads and resource utilization.
- Integrate Snowflake with dbt and Kafka for end-to-end orchestration and streaming workflows.
- Conduct root cause analysis and troubleshooting for complex data and performance issues in production.
- Collaborate across technology and business teams to translate complex data needs into elegant, maintainable solutions.
This position is a contract-to-permanent opportunity and is hybrid (onsite in Columbia, MD, twice per month). The final interview round will be a face-to-face interview in Columbia, MD. Please contact Elite Technical for additional information.
Required Qualifications
- 5+ years of experience in data engineering or equivalent field.
- 3+ years hands-on experience with Snowflake Data Cloud, including:
  - Streams, Tasks, Dashboards, and Materialized Views
  - Performance tuning, resource monitors, and warehouse optimization
- Strong proficiency in SQL (complex queries, stored procedures, optimization).
- Proficiency in Python, with demonstrated experience using Snowpark for data transformations.
- Experience building CI/CD pipelines for Snowflake using modern DevOps tooling.
- Solid understanding of data modeling methodologies (Kimball, Data Vault, or 3NF).
- Experience with data governance, lineage, and metadata tools (Collibra, Alation, or Azure Purview).
- Strong troubleshooting, analytical, and communication skills with the ability to engage both technical and business audiences.
Preferred Qualifications
- Experience with dbt or Kafka for orchestration and streaming.
- Exposure to data quality frameworks such as Great Expectations or Monte Carlo.
- Understanding of real-time and batch data ingestion architectures.
- Snowflake Certification (SnowPro Core or Advanced).
- Prior experience in healthcare, insurance, or other regulated data environments (HIGHLY PREFERRED).
Hybrid/Columbia, MD
1
Monday, November 24, 2025
Contract
12 months T2P
Monday, November 3, 2025
Know someone who would be a good fit? We pay for referrals!