Why You Want This Position
Enverus is the leading energy SaaS company delivering highly technical insights and predictive/prescriptive analytics that empower customers to make decisions that increase profitability. Enverus’ innovative technologies drive production and investment strategies; enable best practices for energy and commodity trading and risk management; and reduce costs through automated processes across critical business functions. Enverus is a strategic partner to more than 6,000 customers in 50 countries.
Would you like to work in a unique data engineering environment built with modern platforms and data technologies? As a Data Engineer, you will implement features and enhancements of various sizes and complexities, influencing decisions at every step of the development cycle, from prototype to deployment. Join us on our journey as a member of Enverus’ data engineering team, working with crew members based in the United States and Canada.
You would join the Oil & Gas Production Data Engineering Team, which collects, transforms, and delivers presentation-ready data to Enverus Prism, our premier energy analytics platform.
Enverus has a dynamic hub for developing software in Brno, Czech Republic, and you can learn more about our team, company culture, and benefits here.
What You Will Do
- Participate in all stages of the Software Development Life Cycle (SDLC), from research, planning, design, development, and testing through to deployment
- Integrate backend data storage, including relational databases and AWS S3 cloud storage (see the sketch after this list)
- Provide solutions to complex business problems beyond simple CRUD operations
- Develop reusable, maintainable, efficient, and cost-effective production-ready code
- Review and enforce code quality and standards
- Write unit, integration, and end-to-end tests
- Support and monitor infrastructure, applications, databases, and related services
- Balance working independently with collaborating in a team
- Learn new technologies as needed to serve business growth
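To give a flavor of the storage-integration work mentioned above, here is a minimal Python sketch that pulls a CSV extract from S3 and appends it to a relational table. It assumes boto3, pandas, and SQLAlchemy are available; the bucket, object key, table name, and connection string are hypothetical placeholders, not details of the actual Enverus pipeline.

```python
# Minimal sketch: read a CSV object from S3 and append it to a relational table.
# All names below (bucket, key, table, connection string) are hypothetical.
import io

import boto3
import pandas as pd
from sqlalchemy import create_engine


def load_extract_to_db(bucket: str, key: str, table: str, db_url: str) -> int:
    """Download a CSV extract from S3 and append its rows to a database table."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    df = pd.read_csv(io.BytesIO(body))

    engine = create_engine(db_url)
    with engine.begin() as conn:  # transactional: commits on success, rolls back on error
        df.to_sql(table, conn, if_exists="append", index=False)
    return len(df)


if __name__ == "__main__":
    rows = load_extract_to_db(
        bucket="example-production-data",   # hypothetical bucket
        key="daily/wells_2024-01-01.csv",   # hypothetical key
        table="well_production_staging",    # hypothetical table
        db_url="oracle+oracledb://user:pass@host:1521/?service_name=XE",  # hypothetical DSN
    )
    print(f"Loaded {rows} rows")
```

In practice a load like this would be parameterized, tested, and orchestrated rather than run ad hoc, which is where the CI/CD and orchestration requirements below come in.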
What You Should Have
- Bachelor’s degree in computer science or a related field
- Minimum 2 years of software development experience
- Experience with Python data transformations
- Experience with C#
- Experience with Structured Query Language (SQL)
- Experience with modern CI/CD workflows and Infrastructure as Code (IaC)
- Familiarity with Databricks, Snowflake, or equivalent
- Knowledge of big data technologies and ETL processes
- Good command of Git or similar source-control tools
- Familiarity with Airflow, Prefect, or any orchestration service (a minimal sketch follows this list)
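As a minimal illustration of the Python-transformation and orchestration items above, the sketch below wires a toy extract-transform-load flow into an Airflow DAG using the TaskFlow API. The DAG id, schedule, and sample records are illustrative assumptions, not a description of Enverus’ actual pipelines.

```python
# Minimal sketch: a toy ETL flow orchestrated as an Airflow DAG (TaskFlow API).
# The DAG id, schedule, and sample data are illustrative only.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def production_volumes_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling raw records from a source system.
        return [
            {"well_id": "W-1", "oil_bbl": 120.0},
            {"well_id": "W-1", "oil_bbl": 80.0},
            {"well_id": "W-2", "oil_bbl": 95.5},
        ]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Simple transformation: aggregate reported volumes per well.
        df = pd.DataFrame(records)
        return df.groupby("well_id", as_index=False)["oil_bbl"].sum().to_dict("records")

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for writing presentation-ready data to the warehouse.
        print(f"Would load {len(rows)} aggregated rows")

    load(transform(extract()))


production_volumes_pipeline()
```

The same shape could be expressed as a Prefect flow; the point is simply that each step is a small, testable Python function that the scheduler can retry and monitor independently.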
Our Tech Stack
AWS, Terraform, Databricks, Python, Spark, Oracle, Airflow, GitHub, Prefect, C#