Data Engineer
- Rosh HaAyin
- Training
- Full-time position

Responsibilities:
- Design, build, and maintain scalable, robust ETL/ELT pipelines.
- Ingest data from various sources (APIs, databases, flat files, cloud buckets).
- Automate workflows for batch and/or streaming pipelines (e.g., using Airflow, GCP services).
- Design and implement efficient data models in BigQuery and Snowflake.
- Organize data for analytics teams in cloud warehouses (BigQuery, Snowflake).
- Implement best practices for partitioning, clustering, and materialized views.
- Manage and optimize data infrastructure (cloud resources, storage, compute).
- Ensure scalability, security, and compliance in data platforms.
- Monitor data integrity, consistency, and accuracy.
- Implement validation, monitoring, and alerting for pipeline health and data accuracy.
- Maintain documentation and data catalogs.
- Troubleshoot failures or performance bottlenecks.
- Work closely with data analysts, managers, and developers.
- Translate business requirements into technical solutions.
- Support self-service analytics and create reusable datasets.
Requirements:
- 2+ years of experience as a Data Engineer or similar role.
- Strong SQL and Python skills for data manipulation and pipeline logic.
- Experience with Airflow for orchestration and Docker/Kubernetes for deployment.
- Hands-on experience with cloud data platforms (GCP, AWS) and warehouses like BigQuery or Snowflake.
- Knowledge of data modeling, optimization, and performance tuning.
- Familiarity with DAX and BI tools like Power BI or Looker.
- Experience with Kafka or Pub/Sub for real-time data ingestion - an advantage.
- Familiarity with dbt for modular, testable SQL transformations - an advantage.
- Knowledge of Docker, Kubernetes, and cloud-native tools in GCP - an advantage.
- Experience with Firebase Analytics and Unity Analytics, particularly their data structures - an advantage.