Square Enix has an internal cloud-based platform, which provides our Analytic & Insight team and other groups across the business with a single data lake pooling game telemetry, sales and marketing data, web analytics and other information. The Senior Data Engineer leads the team responsible for maintaining and further developing this platform.

Duties include

  • optimising, refining and enhancing the data acquisition pipeline;
  • working with client teams to ensure robust capture of high-quality data;
  • supporting data analysts and other users of the data via training and technical assistance;
  • providing staff development for others in the Data Engineering team.

This position requires a driven and talented person who can help the team progress.

Requirements

Essential

  • A high level of professional experience with cloud-based data engineering platforms, particularly Google Cloud Platform (Dataflow, BigQuery, Pub/Sub, GCS).
  • Expertise with the lambda architecture and other approaches to capturing and processing data at scale to provide real-time analytics.
  • Comfortable working with large data sets.
  • Expert SQL skills.
  • Excellent problem solving & analytical skills.
  • Excellent programming skills in Java (Java 8 preferred) and Python; other languages an advantage.
  • Experience modelling ETLs using Apache Beam (see the sketch after this list).
  • Experience writing near-real-time ETLs.
  • Experience with multiple build tools, preferably Gradle.
  • Familiarity with macOS or Linux environments (shell scripting, basic system administration, etc.).
  • Experience managing a code base and using source control/collaboration tools such as GitHub, Bitbucket or GitLab.
  • Familiarity with collaboration and communication tools such as JIRA, Confluence and Slack.
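
As an illustration of the Apache Beam and near-real-time ETL points above, here is a minimal sketch of a streaming pipeline in the Beam Python SDK that reads JSON telemetry from Pub/Sub and appends it to BigQuery; the project, topic and table names are hypothetical placeholders, not actual Square Enix resources.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # Streaming mode so the pipeline consumes Pub/Sub continuously.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as p:
            (
                p
                # Hypothetical telemetry topic; replace with a real resource path.
                | "ReadTelemetry" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/game-telemetry")
                # Pub/Sub messages arrive as bytes; decode each into a dict.
                | "ParseJson" >> beam.Map(json.loads)
                # Hypothetical destination table with a matching schema.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.telemetry_events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
            )


    if __name__ == "__main__":
        run()

The same pipeline runs locally with the DirectRunner, or on Dataflow by passing --runner=DataflowRunner along with the usual project, region and staging options; in lambda-architecture terms, this would be the speed layer feeding real-time analytics.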

Desirable

  • BSc or higher degree in Computer Science, a STEM subject or a similar field of study.
  • Experience with a variety of systems with aggregation frameworks, such as MongoDB and Elasticsearch.
  • Experience with DAG-based workflow management systems, ideally Airflow (see the sketch after this list).
  • Interest in statistical methodologies and models.
  • Experience with Hadoop technologies.
  • Experience writing ETLs in Spark.
  • Knowledge of functional programming languages such as Scala and Kotlin.
  • Knowledge of data protection laws and best practices.
  • Experience working in a data-protection-regulated environment.
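
To make the workflow-management point concrete, here is a minimal sketch of an Airflow DAG (assuming Airflow 2.x import paths); the DAG, owner and task names are hypothetical stand-ins for a real pipeline.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {
        "owner": "data-engineering",  # hypothetical team name
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
    }

    # Hypothetical daily pipeline: extract telemetry, then load it downstream.
    with DAG(
        dag_id="daily_telemetry_load",
        default_args=default_args,
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo 'extract step'")
        load = BashOperator(task_id="load", bash_command="echo 'load step'")

        # The >> operator declares the dependency edge of the DAG.
        extract >> load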