Senior Data Engineer
Zipline · South San Francisco, CA
ABOUT YOU AND THE ROLE
Zipline’s Data team powers data-driven decision making across the entire organization. To execute on our mission, we need to build a solid data foundation and ensure that every area of the business has access to highly reliable data.
We are hiring a talented and experienced Senior Data Engineer to join our small but growing Data team and play a critical role in designing and executing a robust, forward-looking data strategy for the company. Our team owns the data pipelines and tools that provide secure, reliable, and accessible data, enabling team members to derive actionable insights. Doing our job well means enabling the entire organization to make more informed decisions, innovate faster, and serve our customers better.
In this role, you will work directly with our Data, Engineering, Operations, Go-to-Market, and Finance teams to support the organization's data processing and analytics needs. You will be the internal expert on all things data engineering, empowering your peers with your expertise so that together, we can build a world-class data culture. This is a unique opportunity to directly influence not only our data systems, but also our drones and global operations. The ideal candidate will help us design systems that support the company’s needs today and many years into the future.
WHAT YOU'LL DO
- Help architect, build, maintain, and scale our data pipelines that bring together data from various internal and external systems into our data lake/warehouse.
- Partner with internal stakeholders to understand analysis needs and consumption patterns.
- Use tools like dbt to develop data models and schemas in our data warehouse that enable performant, intuitive analysis.
- Partner with upstream engineering teams to enhance data logging patterns and best practices.
- Lead architectural decisions and plan for the company’s data needs as we scale.
- Drive and evangelize data engineering best practices for data processing, modeling, and lake/warehouse development.
- Serve as a thought leader and mentor by sharing your knowledge across the company and leveling up our collective technical expertise.
- Advise engineers and other cross-functional partners on how to most efficiently use the data tools at Zipline.
WHAT YOU'LL BRING
- Advanced knowledge of Python & SQL
- Experience building data pipelines & ETL/ELT processes
- Experience with Snowflake or other cloud data warehouses (e.g., Redshift, BigQuery)
- Experience with tools commonly used in data orchestration and processing such as Airflow, dbt, and Spark
- Familiarity with BI tools such as Sisense, Looker, or Tableau
- Experience in schema design and dimensional data modeling, ideally using tools like dbt
- Experience working with continuous integration and continuous deployment processes and tools
- Ability to communicate effectively with stakeholders to define requirements and timelines
- Passion and excitement to serve as a technical mentor and thought leader
- Passion and excitement to build a strong data foundation for the company; there are many critical decisions on our roadmap, and we need someone knowledgeable and eager to lead the way.
- Nice to have: Experience working with async messaging services or distributed streaming platforms such as Google Pub/Sub or Apache Kafka