Your opportunity
We are looking for a highly skilled and motivated Lead Data Engineer to join our Data Platform Team. In this role, you will design, build, and maintain scalable, reliable data pipelines and platforms, working with technologies such as Snowflake, Airflow, and dbt to enable efficient processing and analysis of large volumes of data.

Join our talented Data Platform Team and contribute to the development of a cutting-edge data infrastructure that enables powerful data analytics and insights for our customers. 

Apply now and be part of New Relic's mission to deliver exceptional digital experiences.

What you'll do
  • Design and develop robust and scalable data pipelines to support data integration using Kafka, Fivetran, Snowflake, Airflow, and dbt.
  • Help lead the implementation and maintenance of data platform solutions, ensuring data integrity, performance, and security.
  • Collaborate with cross-functional teams including data scientists, analysts, and software engineers to understand data requirements and deliver high-quality solutions.
  • Evaluate and implement best practices for data modeling, ETL processes, and data quality assurance.
  • Optimize and tune data processing workflows and SQL queries for improved performance and efficiency.
  • Provide technical leadership and mentorship to junior data engineers, guiding them in implementing best practices and delivering high-quality solutions.
  • Stay up to date with industry trends and advancements in data engineering, continuously improving the team's technical knowledge and skill set.
  • Collaborate with infrastructure and operations teams to ensure reliable and scalable data storage, processing, and monitoring solutions.

This role requires
  • Bachelor's degree in Computer Science, Engineering, or a related field. Advanced degree preferred.
  • Proven experience (5+ years) in data engineering, designing and implementing data pipelines, and building data infrastructure.
  • Strong expertise in working with Snowflake, Airflow, and dbt, including data modeling, ETL, and data quality assurance.
  • Proficiency in SQL and experience with optimizing and tuning queries for performance.
  • Solid understanding of data warehousing concepts, dimensional modeling, and data integration techniques.
  • Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data technologies.
  • Strong programming skills in Python or other scripting languages for data manipulation and automation.
  • Excellent problem-solving and troubleshooting abilities with a keen attention to detail.
  • Strong communication skills with the ability to effectively collaborate with cross-functional teams and stakeholders.
  • Leadership experience, mentoring junior team members, and guiding technical projects.
Preferred qualifications
  • Experience with streaming data processing frameworks (e.g., Apache Kafka, Apache Flink).
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes).
  • Knowledge of distributed computing frameworks (e.g., Spark, Hadoop).
  • Experience with data governance, data security, and compliance practices.
  • Understanding of DevOps principles and experience with CI/CD pipelines.


Bonus points if you have
  • Data Observability experience
Is this a remote job?
Hybrid (remote with required office time)

New Relic helps engineers and developers do their best work every day — using data, not opinions — at every stage of the software lifecycle.

Apply Now