This job listing expired on Jul 20, 2022

We are looking to expand our Analytics team with a data engineer who can help create data-driven solutions for the Illuvium DAO. This includes shaping vast amounts of data so our data analysts and data scientists can create insights for the Illuvium DAO, helping to build internal and external data products, and setting up vital communications with cutting-edge AI infrastructure to play our games.

About Illuvium

Illuvium Labs is an independent game development studio based in Sydney, Australia. We develop blockchain-based games for the Illuvium DAO. We have developed a strong culture of independence within our team, preferring candidates who can articulate their own vision and goals. We operate almost entirely remotely, so each team member designs their own hours and work schedule. In the end, all that matters is the delivered product. We hire based on people’s ability to adapt and change quickly, valuing underlying core abilities above specific skill sets.

Responsibilities

  • Proactively create data pipelines to assist the Analytics team in creating AI and providing DAO insights
  • Continuously, robustly and coherently feed the infrastructure that powers our data-driven products
  • Manage dataflows and data warehousing for the entire Illuvium DAO
  • Build and launch new data extraction, transformation and loading processes in production
  • Handle data-related ad hoc requests from the broader Illuvium team

Skills & Qualifications

  • High proficiency with Python
  • Experience delivering end-to-end (e2e) data-driven products
  • The ability to work independently while proactively engaging with the Analytics team

Preferred Qualifications

  • Experience with Splunk is a big plus
  • Experience with Big Data on Google Cloud Platform or AWS
  • Experience with cloud engineering tools (AWS, CloudFormation, Terraform)
  • Understanding of data modelling in both SQL- and NoSQL-based databases
  • At least 5 years of experience with databases, ETL, streaming, and data processing frameworks such as Kafka and Airflow

Location

All of our jobs are 100% remote, and we are looking for the best talent globally!