
Senior Data Engineer

AeroFarms · Newark, NJ

Engineering · Food & Agriculture · Posted 3 weeks ago

As a Certified B Corporation, AeroFarms is a mission-driven company with global headquarters in Newark, NJ, championing indoor vertical farming and fundamentally transforming agriculture. Recognized by Fast Company as one of the World’s Most Innovative Companies and by Inc.com as one of the Top 25 Disruptive Companies in the World, AeroFarms is scaling to meet the demand for our fresh, locally grown produce that is setting new culinary standards, and we need someone special who can bring their experience in data engineering to help us grow further. Candidates must be aligned with our mission and passionate about making a difference.

We have:

  • An incredible ‘change-the-world’ company with the eyes of the world focused on our success.
  • A team of motivated, intellectually curious individuals to support you.
  • Backing from some seriously impressive firms, including Goldman Sachs, Prudential, leading VCs, and strategic partners with a view on global expansion.

Job Description

We are looking for a highly motivated Senior Data Engineer who will be responsible for accessing, moving, processing, modeling, and managing large data sets in a fault-tolerant, scalable, and performant manner. The ideal candidate will have a team-first mentality and be a creative problem solver. This is a fantastic opportunity to engage in a positive, cutting-edge, and creative work environment that offers excellent benefits and rewards.

Responsibilities:

  • Develop a data quality framework to ensure delivery of high-quality data and analyses to stakeholders.
  • Develop and improve the current data architecture, data quality, monitoring and data availability.
  • Develop and manage stable, scalable data pipelines that cleanse, structure, and integrate disparate big data sets into a readable, accessible format for end-user analysis and targeting, using stream and batch processing architectures.
  • Collaborate with Data Scientists to implement advanced analytics algorithms that exploit our rich data sets for statistical analysis, prediction, clustering, Machine Learning (ML), and modeling.
  • Develop and support continuous integration build and deployment processes using Jenkins, Docker, Git, etc.
  • Define and implement monitoring and alerting policies for data solutions.
  • Apply business understanding and technology know-how to cutting-edge Data Science problems.
  • Play a leading role in the architecture design and implementation of next-generation BI solutions.
  • Effectively communicate with various teams and stakeholders, escalate technical and managerial issues at the right time and resolve conflicts.
  • Peer-review work, and actively mentor more junior members of the team, improving their skills, their knowledge of our systems, and their ability to get things done.

You have:

  • Bachelor’s Degree in Computer Science, Engineering, Mathematics, Physics, or IT related field required.
  • Master’s Degree in Computer Science or IT related field preferred.
  • 5+ years of experience with detailed knowledge of data lake, data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools.
  • 5+ years of hands-on experience in using advanced SQL queries (analytical functions), experience in writing and optimizing highly efficient SQL queries.
  • 2+ years of programming experience using JavaScript.
  • Experience with time series database technologies like TimescaleDB preferred.
  • Experience with analytical tools like Tableau preferred.
  • Experience working with AWS technologies, including EMR, S3, and RDS.
  • Experience with other data lake/data warehouse technologies, including Snowflake or similar solutions built around Hive/Spark.
  • Proven track record of delivering big data solutions, both batch and real-time.
  • Ability to design, develop and automate scalable ETL and reporting solutions that transform data into accurate and actionable business information.
  • Comfortable working with business customers to gather requirements and gain a deep understanding of varied datasets.
  • Experienced in testing and monitoring data for anomalies and rectifying them.
  • Knowledge of software coding practices across the development lifecycle, including Agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.
  • Familiar with build and deployment tools (e.g., Jenkins).
  • Some experience with Machine Learning (ML), AI, and model deployment preferred. Familiarity with other technologies such as Kafka, microservices, React, and GraphQL also preferred.

Connect with your next key hire on Tech Jobs for Good.
