Algolia was built to help users deliver an intuitive search-as-you-type experience on their websites and mobile apps. We provide a search API used by thousands of customers in more than 100 countries. Billions of search queries are answered every month thanks to the code we push into production every day.
Join the AI Platform: Building core components to speed up AI delivery
The AI Platform is dedicated to enabling AI product delivery by providing other teams with turnkey tools, frameworks, and features, so they can focus on their core business instead of redundant work outside their expertise. The AI Platform covers two areas: helping teams quickly design new models (AI development) and generating and serving predictions in production (AI productionization).
We’re looking for problem solvers with an entrepreneurial mindset—people who focus on outcomes and use data to drive decisions. If you're passionate about building software for other developers and applying AI thoughtfully to achieve measurable results for customers, we'd love to hear from you!
The team is composed of engineers, most of whom are fully remote, with a range of skill sets and backgrounds. Your experience, knowledge, and perspective will add to this diversity and help the team deliver products that make a difference.
YOU WILL:
- Be a key contributor to the design and development of the AI Platform
- Collaborate with a team spanning a variety of roles, from Site Reliability Engineers to Machine Learning specialists, with a strong focus on Data Engineering
- Be responsible for the quality and soundness of our data pipelines
YOU MIGHT BE A FIT IF YOU HAVE:
- Experience designing and operating data engineering pipelines in production
- Experience working with large datasets and high-traffic systems
- Experience building and maintaining API services
- A rigorous approach to code quality, automated testing, and other engineering best practices
- Experience using one of the major cloud providers (GCP, AWS, or Azure)
- Experience using data engineering tools (e.g., Airflow or BigQuery)
- Excellent spoken and written English skills
NICE TO HAVE:
- Experience operating AI models in production environments
- Experience in Go or Python
- Experience in Kubernetes
- An affinity for data-driven decision making and for exploring datasets with SQL