We are looking for an experienced and innovative Data Architect to join our team at Navan. As a Data Architect, you will drive the design and implementation of scalable, efficient, and high-performance data architectures that support data-driven decision-making. You will collaborate closely with teams across the company, including data engineering, analytics, and product engineering, to ensure seamless integration across systems and data platforms.
What You'll Do:
- Lead the ideation, design, and hands-on development of a high-performance data architecture to support the growing needs of the business.
- Architect scalable, flexible, and efficient data products, leveraging Snowflake as the central data warehouse alongside cloud technologies, to keep data accessible, secure, and easy to work with.
- Work closely with data analysts, data scientists, and product engineering teams to create and optimize data models, ensuring seamless data access and high-quality insights.
- Build and implement robust tests, monitoring, and data validation strategies to ensure data quality, reliability, and consistency across all layers of the data stack (see the validation sketch after this list).
- Develop and maintain our "single source of truth" for key business metrics, ensuring alignment across teams for all decision-making.
- Establish and promote best practices for data engineering and modeling across the organization, mentoring junior team members and fostering a culture of excellence.
- Drive improvements in data performance, efficiency, and usability, with a focus on reducing friction for end-users and enabling self-service analytics.
- Focus on ensuring high data quality, performance, and scalability for large volumes of structured and unstructured data.
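
As a rough illustration of the kind of data-validation work described above, here is a minimal sketch of automated quality checks on a warehouse table. The table name, column name, and connection are illustrative placeholders (the example uses SQLite as a stand-in for a warehouse connection such as the Snowflake Python connector); in practice these checks would more likely live in dbt tests or a monitoring framework.

```python
# Minimal sketch of a data-quality check: verify a warehouse table has no
# null keys and no duplicate keys before downstream models consume it.
# Table/column names and the connection object are illustrative placeholders.

from typing import Any


def validate_table(conn: Any, table: str, key_column: str) -> list[str]:
    """Run basic not-null and uniqueness checks; return a list of failures."""
    failures = []
    cur = conn.cursor()

    # Check 1: the key column must never be null.
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
    null_count = cur.fetchone()[0]
    if null_count:
        failures.append(f"{table}.{key_column}: {null_count} null values")

    # Check 2: the key column must be unique.
    cur.execute(f"SELECT COUNT(*) - COUNT(DISTINCT {key_column}) FROM {table}")
    dup_count = cur.fetchone()[0]
    if dup_count:
        failures.append(f"{table}.{key_column}: {dup_count} duplicate keys")

    return failures


if __name__ == "__main__":
    import sqlite3  # stand-in for a real warehouse connection

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE bookings (booking_id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO bookings VALUES (?, ?)", [(1, 10.0), (2, 20.0), (2, 5.0)]
    )
    print(validate_table(conn, "bookings", "booking_id"))
    # -> ['bookings.booking_id: 1 duplicate keys']
```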
What We're Looking For:
- 10+ years of experience in data engineering, with demonstrated leadership and mentorship experience, including the ability to guide and develop junior engineers and analysts.
- 3+ years of experience building and maintaining ELT pipelines in a cloud environment (preferably Snowflake).
- Strong expertise in dbt Core, AWS, Airflow, SQL, and Python.
- Expert in managing Airflow for data orchestration, and skilled in CI/CD for automated testing and deployment (see the orchestration sketch after this list).
- Solid understanding of data modeling, cloud-based data architectures, and performance optimization.
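
To illustrate the Airflow-plus-dbt orchestration mentioned above, here is a minimal sketch of a daily DAG that builds dbt Core models and then runs dbt tests. The DAG id, schedule, and project path are assumptions for illustration, not a prescribed setup, and the sketch assumes a recent Airflow 2.x installation.

```python
# Minimal sketch of Airflow orchestrating dbt Core: a daily DAG that runs
# dbt models, then dbt tests. DAG id, schedule, and project path are
# illustrative assumptions.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the warehouse models with dbt Core.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Run dbt's data tests so bad data fails the pipeline, not the dashboards.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```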