What's the opportunity?
The Data Infrastructure team builds the distributed systems and tools that support Intercom and empower people with information. As the company grows, so do the volume and velocity of our data, and so does the appetite for increasingly sophisticated and specialized, often AI-assisted, data solutions.
Our team builds, maintains, evolves, and extends the data platform, enabling our partners to self-serve by creating their own end-to-end data workflows, from ingestion through data transformation and experiment evaluation to usage analysis and predictive modeling. We provide a solid data foundation that supports highly impactful business and product-focused projects.
We’re looking for a Senior Data Infrastructure Engineer to join us and collaborate on large-scale data infrastructure initiatives: someone who is passionate about building solid foundations for delivering high-quality data to our consumers.
What will I be doing?
- Evolve the Data Platform by designing and building the next generation of the stack.
- Develop, run, and support our batch and real-time data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, and Tableau, all in AWS.
- Collaborate with product managers, data engineers, analysts, and data scientists to develop tooling and infrastructure that supports their needs.
- Develop automation and tooling to support the creation and discovery of high-quality analytics data in an environment where dozens of changes can be shipped daily.
- Implement systems to monitor our infrastructure and to detect and surface data quality issues.
Recent projects the team has delivered:
- Refactoring our MySQL ingestion pipeline for lower latency and 10x scalability.
- Migrating from Redshift to Snowflake.
- Building a unified local analytics development environment for Airflow and dbt.
- Building our next-generation company metrics framework, adding anomaly detection and alerting, and enabling easier discovery and consumption.
About you
- You have 5+ years of full-time, professional work experience in the data space using Python and SQL.
- You have solid experience building and running data pipelines for large and complex datasets, including handling dependencies.
- You have hands-on cloud provider experience (preferably AWS) including service integrations and automation via CLI and APIs.
- You have a solid understanding of data security practices and are passionate about privacy.
- You can demonstrate the significant impact your work has had, both on the technology side and on the teams you’ve been part of.
- You have a great sense of what should be worked on next and know how to break big ambiguous problems into small workable chunks.
- You love helping people grow and recognise where your mentorship might be more valuable than your direct technical contributions on a project.
- You care about your craft.
In addition, it would be a bonus if you have:
- Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would be helpful.
- Experience with or understanding of tools and technologies in the modern data stack (Snowflake, dbt).
- Industry awareness of up-and-coming technologies and vendors.
Benefits
We are a well-treated bunch, with awesome benefits! If there’s something important to you that’s not on this list, talk to us!
- Competitive salary and equity in a fast-growing start-up
- We serve lunch every weekday and keep the kitchen fully stocked with a variety of snacks
- Regular compensation reviews - we reward great work!
- Pension scheme & match up to 4%
- Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents
- Open vacation policy and flexible holidays so you can take time off when you need it
- Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones
- If you’re cycling, we’ve got you covered on the Cycle-to-Work Scheme, with secure bike storage too
- MacBooks are our standard, but we also offer Windows for certain roles when needed.