Senior Data Engineer, Scrum Master, Tech Lead

Published: 3 April 2023

Location: UK, Europe, or offshore (India, Pakistan, etc.); remote working
Duration: 6-month+ contract

Role Summary
Our client, an integrated energy company, requires a senior data engineer to work within the data and analytics vertical of their Customers & Markets portfolio. The client is extracting data from core systems into an AWS data lake and using it to improve business decisions and margins. They need a senior data engineer with AWS experience, good business knowledge to understand and solve business issues using the AWS data lake, and Scrum Master skills to drive the agile team.

Role Responsibilities
  • Scope, review, and deliver low-level data requirements for digital and data initiatives across bp analytical platforms, including AWS, working closely with stakeholders
  • Work with business stakeholders to understand the business issues they are trying to resolve and help identify ways of solving them
  • Act as Scrum Master and lead the agile squad
  • Provide data strategy and guidance adhering to bp standards during the development phase and plan for the long term
  • Undertake database design & modelling
  • Create a data migration strategy
  • Manage and track delivery and KPIs within Azure DevOps (ADO)
  • Ensure data models are fit for purpose and data standards are implemented where available
  • Ensure end-to-end integration and data flow are optimal and ongoing maintenance is straightforward
  • Provide guidance on database security and auditing standards
  • Provide query analysis and optimisation recommendations where needed
  • Manage stakeholder relationships and expectations

Required Skills
  • 6+ years of data engineering experience
  • Good knowledge of AWS cloud data migration and ETL strategies
  • Scrum Master experience
  • Knowledge of database security and auditing standards
  • Postgres (AWS Aurora) RDBMS
  • Analytical mindset – data-driven, with the ability to problem-solve and refine engineering requirements
  • Self-motivated and self-starting – able to work with little supervision, pick up unfamiliar concepts quickly, and be confident in their judgement
  • SQL skills for pipeline development and analysis
  • Python for pipeline development and analysis
  • Experience in a big data engineering environment delivering data pipelines, and of ETL concepts
  • Exposure to iterative delivery methods
  • Degree in mathematics, computer science, or a related numerate discipline
  • Experience of continuous integration/continuous deployment and development techniques
  • Experience with toolsets for agile delivery management and DevOps, for example Azure DevOps (ADO) or JIRA

We are unable to provide sponsorship for this opportunity.

Apply Now