Data Engineer
- Tel Aviv
- Training
- Full-time
- Own and continuously improve Minute Media's end-to-end data infrastructure and pipelines, with a focus on advertising technology and monetisation data; integrate data from multiple ad platforms and internal systems into our central BigQuery data warehouse.
- Collaborate with cross-functional stakeholders (Finance, AdOps, Product, Editorial/Content) to understand their data needs and deliver solutions that enable data-driven decision-making while reducing their dependency on the data team.
- Design, build, and manage foundational data architectures and tables in BigQuery with a holistic, company-wide perspective - ensuring a unified, scalable data schema that supports all business intelligence needs, keeps key metrics consistent and reliable across Looker dashboards, maintains high data quality, and minimises long-term maintenance complexity.
- Develop, maintain, and optimise ETL/ELT processes (using Rivery and custom solutions) to ensure timely and accurate ingestion of programmatic and direct advertising revenue data (e.g. ad server reports, SSP/DSP outputs) into a single source of truth (SSOT) dataset.
- Ensure data quality and accuracy by implementing robust, automated reconciliation processes; compare and validate revenue data across different sources (ad platforms, finance records, etc.) to identify and resolve discrepancies, as illustrated in the sketch after this list.
- Monitor the health and performance of data pipelines and infrastructure, troubleshoot issues proactively, and implement improvements and automation that enhance the stability, efficiency, and scalability of our data platform while minimising the costs of large-scale data processing.
- Stay up to date with industry best practices in data engineering, ad tech analytics, and AI/ML applications, and proactively recommend new tools or approaches to keep our data ecosystem robust and innovative.
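As a hedged illustration of the reconciliation work described above, here is a minimal, self-contained Python sketch that flags days on which two revenue sources disagree beyond a tolerance. The source names, figures, and 1% threshold are hypothetical placeholders, not a description of Minute Media's actual process.

```python
from datetime import date

# Hypothetical daily revenue figures from two sources that should agree:
# an ad platform's reporting API and the finance ledger.
ad_platform = {date(2024, 1, 1): 1204.50, date(2024, 1, 2): 998.10}
finance = {date(2024, 1, 1): 1204.50, date(2024, 1, 2): 1012.40}

TOLERANCE = 0.01  # relative discrepancy (1%) before a day is flagged

def reconcile(a: dict, b: dict, tolerance: float) -> list:
    """Return the days on which the two sources disagree beyond tolerance."""
    discrepancies = []
    for day in sorted(set(a) | set(b)):
        x, y = a.get(day, 0.0), b.get(day, 0.0)
        baseline = max(abs(x), abs(y)) or 1.0  # avoid division by zero
        if abs(x - y) / baseline > tolerance:
            discrepancies.append((day, x, y))
    return discrepancies

for day, x, y in reconcile(ad_platform, finance, TOLERANCE):
    print(f"{day}: ad platform reported {x:.2f}, finance recorded {y:.2f}")
```

In practice such a check would run inside the pipeline orchestrator and compare warehouse tables rather than in-memory dictionaries, but the core logic - align the sources on a common key, then flag relative discrepancies above a threshold - is the same.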
- 4+ years of experience in data engineering, BI development, or a similar role, with a track record of building and managing data pipelines and warehouses (preferably in a digital media, online advertising, or tech environment).
- Strong proficiency in SQL and experience working with large datasets in cloud data warehouse platforms (especially Google BigQuery), along with a solid understanding of data modelling principles and query optimisation for performance (see the BigQuery sketch after this list).
- Hands-on experience with ETL/ELT tools and data pipeline orchestration (for example, Rivery, Airflow, or similar), including building data flows that ingest and transform data from various sources into centralised repositories.
- Experience with BI and data visualisation platforms to support self-service analytics and reporting for end users.
- Effective communication and collaboration skills - able to work closely with both technical teams and non-technical stakeholders (Finance, AdOps, Product managers, etc.), translating business needs into data solutions and explaining data insights in plain language.
- Familiarity with digital advertising technology and monetisation metrics - including working with data from ad servers, programmatic platforms, or other ad tech systems, and understanding key concepts like impressions, CPM, fill rate, clicks, and revenue attribution.
- Demonstrated ability to ensure data quality and integrity, including experience implementing data validation, auditing, and reconciliation practices to maintain trust in the numbers used by stakeholders.
- Proficiency in a programming language for data (e.g. Python) and familiarity with Google Cloud Platform services beyond BigQuery (such as Cloud Functions or Composer) that are used for data engineering tasks.
- Strong analytical and problem-solving skills, with high attention to detail and a proactive approach to identifying and fixing data issues or process gaps.
- Experience in the online publishing or digital media industry, particularly involving advertising revenue streams or content analytics - an advantage.
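To make the BigQuery modelling and query-optimisation expectations above concrete, the sketch below (referenced in the SQL requirement) creates a date-partitioned, clustered revenue table and queries it with the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical and the snippet assumes configured GCP credentials; it is an illustrative sketch, not Minute Media's schema.

```python
from google.cloud import bigquery

# Hypothetical example; assumes GCP credentials and an existing dataset.
client = bigquery.Client()

# Partitioning by date and clustering by platform keeps scans (and cost)
# proportional to the slice of data a dashboard query actually needs.
ddl = """
CREATE TABLE IF NOT EXISTS `my_project.ad_data.ad_revenue` (
  event_date  DATE NOT NULL,
  platform    STRING,   -- e.g. an SSP or ad server name
  impressions INT64,
  revenue_usd NUMERIC
)
PARTITION BY event_date
CLUSTER BY platform
"""
client.query(ddl).result()

# A dashboard query that filters on the partition column only scans
# the matching partitions instead of the full table.
query = """
SELECT platform, SUM(revenue_usd) AS revenue
FROM `my_project.ad_data.ad_revenue`
WHERE event_date BETWEEN @start AND @end
GROUP BY platform
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ]
    ),
)
for row in job.result():
    print(row.platform, row.revenue)
```

Partitioning on the date column and clustering on the platform column means dashboard queries that filter by date scan only the relevant partitions, which is how BigQuery processing costs are kept proportional to the data actually read.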