Snowflake Data Engineer

Published: 23 April 2026

Position: Snowflake Data Engineer
Location: India

Required Skills & Experience
Snowflake Platform
• 3+ years hands-on Snowflake development
• Snowflake architecture: warehouses, databases, schemas, stages, file formats, storage integrations
• Snowflake Streams, Tasks, Pipes — design and production use
• Performance tuning: clustering, materialized views, warehouse sizing, query profiling
• Snowflake RBAC, column masking, row-level security
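To illustrate the Streams and Tasks pattern listed above, here is a minimal sketch of the DDL a change-capture pipeline might use. All object names (`RAW.ORDERS`, `TRANSFORM_WH`, etc.) are hypothetical examples, not part of this role's actual environment:

```python
# Sketch of a Snowflake Streams + Tasks change-capture setup.
# Object names below are illustrative placeholders only.

def stream_and_task_ddl(source_table: str, target_table: str,
                        stream_name: str, task_name: str,
                        warehouse: str, schedule_minutes: int) -> list:
    """Render DDL for a stream on a source table, plus a scheduled task
    that moves captured changes into a target table."""
    return [
        # A stream records inserts/updates/deletes on the source table.
        f"CREATE OR REPLACE STREAM {stream_name} ON TABLE {source_table};",
        # The task polls on a schedule but only runs when the stream
        # actually has data, avoiding wasted warehouse credits.
        f"""CREATE OR REPLACE TASK {task_name}
  WAREHOUSE = {warehouse}
  SCHEDULE = '{schedule_minutes} MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('{stream_name}')
AS
  INSERT INTO {target_table} SELECT * FROM {stream_name};""",
        # Tasks are created suspended and must be resumed explicitly.
        f"ALTER TASK {task_name} RESUME;",
    ]

for stmt in stream_and_task_ddl("RAW.ORDERS", "CURATED.ORDERS",
                                "RAW.ORDERS_STREAM",
                                "CURATED.LOAD_ORDERS_TASK",
                                "TRANSFORM_WH", 15):
    print(stmt)
```

Gating the task on `SYSTEM$STREAM_HAS_DATA` is the usual way to keep a scheduled task from spinning up a warehouse when there is nothing to process.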

Ingestion & Orchestration
• Azure Data Factory — pipelines, linked services, triggers, monitoring
• File-based ingestion: CSV, JSON, Parquet, Avro from Azure Blob / ADLS or equivalent
• REST API ingestion with auth patterns (OAuth2, API key) and pagination
• SFTP automation and file handling pipelines
• Push/pull ingestion pattern design and implementation
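The REST ingestion bullet above typically combines pagination with retry logic. A minimal sketch of that shape, where `fetch_page` stands in for a real HTTP call (e.g. `requests.get` with an OAuth2 bearer token) and the assumed response shape `{"items": [...], "next_offset": ...}` is hypothetical:

```python
import time
from typing import Callable, Iterator

def paginate(fetch_page: Callable[[dict], dict],
             page_size: int = 100,
             max_retries: int = 3) -> Iterator[dict]:
    """Offset-based pagination with simple retry. `fetch_page` takes
    request params and returns a parsed response assumed to look like
    {"items": [...], "next_offset": int or None} — an illustrative
    shape, not any specific vendor's API."""
    offset = 0
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page({"offset": offset, "limit": page_size})
                break
            except IOError:
                if attempt == max_retries - 1:
                    raise  # give up after the last retry
                time.sleep(2 ** attempt)  # exponential backoff
        yield from page["items"]
        if page.get("next_offset") is None:
            return
        offset = page["next_offset"]
```

In a real pipeline the yielded records would be batched to files and staged for `COPY INTO`; here the generator just streams rows to the caller.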

SQL & Development
• Expert-level SQL: complex joins, window functions, CTEs, query optimization
• Python for pipeline scripting, API integration, and data manipulation (pandas, requests, SQLAlchemy)
• dbt or equivalent transformation framework — model development, testing, documentation
• Version control with Git; experience in code review workflows

Data Engineering Practices
• Incremental load patterns: watermark, CDC, merge/upsert logic
• Error handling, retry logic, and pipeline alerting
• Data quality checks embedded within pipelines
• Cost management and monitoring on cloud data platforms
• Experience with Agile/Scrum delivery environments
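The watermark and merge/upsert pattern above can be sketched as a small SQL renderer. The table and column names are illustrative assumptions, and the `'1970-01-01'` default watermark is a placeholder for an empty target:

```python
def merge_upsert_sql(target: str, staging: str, keys: list,
                     columns: list, watermark_col: str) -> str:
    """Render a Snowflake MERGE implementing watermark-based incremental
    upsert: only staged rows newer than the target's high-water mark are
    merged. Names passed in are examples, not a fixed schema."""
    on = " AND ".join(f"t.{k} = s.{k}" for k in keys)
    sets = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join(keys + columns)
    vals = ", ".join(f"s.{c}" for c in keys + columns)
    return (
        f"MERGE INTO {target} t USING (\n"
        f"  SELECT * FROM {staging}\n"
        # The watermark subquery skips rows already loaded.
        f"  WHERE {watermark_col} > "
        f"(SELECT COALESCE(MAX({watermark_col}), '1970-01-01') "
        f"FROM {target})\n"
        f") s ON {on}\n"
        f"WHEN MATCHED THEN UPDATE SET {sets}\n"
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals});"
    )

print(merge_upsert_sql("CURATED.ORDERS", "STG.ORDERS",
                       ["ORDER_ID"], ["STATUS", "AMOUNT"], "UPDATED_AT"))
```

Generating the MERGE from key/column metadata like this is one way such logic ends up parameterized per table inside an orchestrated pipeline.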

Qualifications
Required
• Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent practical experience.
• 3–5 years of demonstrated data engineering experience with a focus on Snowflake as the target platform.
• Proven delivery of production-grade ingestion pipelines across multiple source patterns in a team environment.

Preferred
• SnowPro Core Certification (or equivalent Snowflake-issued credential).
• Experience working in offshore or near-shore delivery models with onshore stakeholders across time zones.
• Prior exposure to enterprise data platforms in regulated industries (financial services, oil & gas, public sector, or similar).
• Familiarity with Azure ecosystem services: ADLS Gen2, Azure Key Vault, Azure Monitor, Event Hub.

Nice to Have
• Experience with Snowpark (Python or Java in-Snowflake compute) for advanced transformation workloads.
• Familiarity with Snowflake Data Sharing or Marketplace for cross-organization data exchange.
• Exposure to real-time or streaming ingestion patterns (Kafka, Azure Event Hub, Kinesis) that land into Snowflake.
• Experience with data catalog or metadata tools (Collibra, Alation, Purview) for documenting Snowflake objects.
• Familiarity with Terraform or other IaC tooling for Snowflake object provisioning.
