Data & Analytics Analyst

National Audit Office
  • Graduate Programme / Job Rotation Programme (19 to 24 months)
  • Newcastle upon Tyne (United Kingdom)
  • Published on 9 September 2021
Job description

The National Audit Office (NAO) supports Parliament in holding the government to account for the way public services are delivered. Our primary role is to scrutinise public spending for Parliament. We do this by certifying over 350 government accounts and producing around 65 Value for Money (VFM) reports each year.

In this role we will expect you to spend most of your time building, optimising and maintaining data pipelines in support of our financial audit work. We have recently built a new platform on Azure and are looking for someone with experience of working with cloud technologies to support and develop complex ETL data pipelines in the platform using Python, R and SQL, as well as the available cloud-native services.
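
For illustration only, a pipeline step of the kind described above might look like the following PySpark sketch; the storage paths, container names and columns are all hypothetical:

    # Minimal PySpark ETL sketch: read raw transactions from a data lake,
    # standardise them, and write a curated table. All paths and column
    # names here are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("audit-etl").getOrCreate()

    raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/transactions/")

    curated = (
        raw.dropDuplicates(["transaction_id"])
           .withColumn("amount_gbp", F.col("amount").cast("decimal(18,2)"))
           .filter(F.col("posting_date").isNotNull())
    )

    curated.write.mode("overwrite").parquet(
        "abfss://curated@example.dfs.core.windows.net/transactions/"
    )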

You will complement our application developers with expertise in data warehousing, data modelling, optimised querying, and building and utilising data APIs.
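
By way of illustration, querying a warehouse table from Python might look like the sketch below; the server, database and dim_account table are placeholders:

    # Minimal sketch of a parameterised query against an Azure SQL
    # warehouse using pyodbc. The server, database and table names
    # are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=tcp:example.database.windows.net,1433;"
        "Database=warehouse;"
        "Authentication=ActiveDirectoryInteractive;"
    )
    cursor = conn.cursor()
    cursor.execute(
        "SELECT account_id, account_name FROM dim_account WHERE audit_year = ?",
        2021,
    )
    for row in cursor.fetchall():
        print(row.account_id, row.account_name)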

You will join our Data Analytics Research Team, working with a team of data analysts and data scientists to develop and deliver business-critical analytical tools. You will also be responsible for promoting best practices in effective data engineering. This is not a support role, but there will be an element of support as part of working in a business-facing software development team.

Our roles are based at either our London office or our Newcastle office. We are following Public Health England's advice and taking all necessary steps to minimise the risks to our colleagues, their families and the wider public. Our systems are designed to be operated remotely and the majority of our staff are working from home at present. The NAO supports flexible working and is happy to discuss this with you at application stage.

Your key responsibilities:

  • Build and maintain complex data pipelines in the Microsoft Azure ecosystem
  • Monitor and manage business-critical ETL processes
  • In exceptional circumstances, support end users in understanding exceptions and communicating requirements for input data
  • Carry out occasional ad hoc data processing and extraction tasks
  • Act as the subject-matter expert (SME) on all data-related matters, helping the team use data and its tooling effectively
  • Liaise with the wider IT team to ensure compliance with data governance and data security requirements


What we are looking for

  • UK nationals; nationals of Commonwealth countries who have the right to work in the UK; and nationals of the EU, EEA or Switzerland with (or eligible for) status under the European Union Settlement Scheme (EUSS)
  • Demonstrable commercial experience managing complex data pipelines and using Apache Spark (Datasets, DataFrames and SQL APIs; Scala not essential; see the sketch after this list) is required. Commercial software development experience is a plus but not essential; the same applies to knowledge of accounting and auditing.
  • A strong academic background would be considered a strength but is not a requirement.
  • Ability to design, build and manage data pipelines encompassing data transformation, data models, schemas, metadata and workload management, and to work with both IT and the business to integrate analytics and data science output into business processes and workflows
  • Experience working with Microsoft Azure, including most of: Azure SQL, Data Lake, Azure Databricks and Data Factory (experience with other cloud platforms will be considered)
  • Experience working with Apache Spark
  • A good understanding of ETL and Data Warehousing concepts.
  • Use of SQL, R and/or Python
  • Excellent communication skills.
  • Passion for promoting and maintaining best practices
  • Ability to work co-operatively and collaboratively as part of a team
  • Drive and determination to overcome obstacles in order to achieve goals
  • Commitment to personal development and keeping technical skills up to date.
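
For illustration, here is a minimal sketch of the Spark DataFrame and SQL APIs mentioned above, expressing the same aggregation both ways; the sample data is invented:

    # The same aggregation via the DataFrame API and the SQL API.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("spark-apis").getOrCreate()

    df = spark.createDataFrame(
        [("DfE", 120.0), ("DfE", 80.0), ("DHSC", 200.0)],
        ["department", "spend_m"],
    )

    # DataFrame API
    df.groupBy("department").agg(F.sum("spend_m").alias("total_spend_m")).show()

    # SQL API over the same data
    df.createOrReplaceTempView("spending")
    spark.sql(
        "SELECT department, SUM(spend_m) AS total_spend_m "
        "FROM spending GROUP BY department"
    ).show()
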
Any evidence of the following would strengthen your application:

  • Development with Python, R, Scala or other programming languages
  • Basic experience working with popular data discovery, analytics and BI software tools such as Tableau.
  • Agile methodologies and DevOps
  • Cognitive Services, Power Apps, Logic Apps
  • Terraform or other infrastructure-as-code tools (a brief sketch follows this list)
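
As a flavour of infrastructure as code, here is a minimal sketch using Pulumi's Python SDK (shown instead of Terraform's HCL purely to keep the examples in one language); all resource names are hypothetical:

    # Minimal infrastructure-as-code sketch using Pulumi's Python SDK.
    # Terraform itself uses HCL; Pulumi is shown here only to keep the
    # examples in one language. All resource names are hypothetical.
    import pulumi
    from pulumi_azure_native import resources, storage

    rg = resources.ResourceGroup("analytics-rg")

    account = storage.StorageAccount(
        "analyticsdata",
        resource_group_name=rg.name,
        sku=storage.SkuArgs(name=storage.SkuName.STANDARD_LRS),
        kind=storage.Kind.STORAGE_V2,
    )

    pulumi.export("storage_account_name", account.name)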


Qualifications

  • 2:1 degree (or better) in any subject
  • A minimum of 120 UCAS points (or 300 based on the pre-September 2017 UCAS tariff point system) or equivalent, not including general studies. If you have 104 UCAS points (or 260 based on the pre-September 2017 UCAS tariff point system), you may still be eligible to apply, as your application will be reviewed in the context of your socio-economic background.