
Data Developer - DataBricks

  • Location: London
  • Sector: Investments
  • Job type: Temporary
  • Salary: Negotiable
  • Contact: Matthew Thresher
  • Contact email: Matthew.Thresher@oliverjames.com
  • Job ref: JOB-012024-235967_1706009639
  • Published: 6 months ago
  • Duration: 6 Months
  • Expiry date: 2024-02-22
  • Start date: ASAP

The Data Engineering team are driven by delivering value to the business and seek to blend strong technical capability with an acute business focus.

We work within an agile delivery framework, supporting different workloads using both Scrum and Kanban, and focus all our development efforts on clearly identified business goals.

Responsibilities:

  • This role will promote the available data and analytics capabilities and expertise to business unit leaders and educate them in leveraging these capabilities.
  • Collaborate across departments: The newly hired data engineer will need strong collaboration skills to work with varied stakeholders within the organization.
  • Build data pipelines: Managed data pipelines consist of a series of stages through which data flows; these pipelines must be created, maintained, and optimized as workloads move from development to production for specific use cases (see the illustrative sketch after this list).
  • Drive automation: The data engineer will be responsible for using innovative and modern tools, techniques, and architectures to partially or completely automate the most common, repeatable, and tedious data preparation and integration tasks, minimizing manual, error-prone processes and improving productivity.
  • Educate and train.
  • Ensure compliance and governance during data use: The data engineer will ensure that data users and consumers use the data provisioned to them responsibly, supported by data governance and compliance initiatives.
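By way of illustration only (not part of the job specification), the sketch below shows what one such staged pipeline might look like in PySpark on a Databricks-style environment. The source and target paths, column names, and stage boundaries are hypothetical assumptions, not details taken from this role.

    # Minimal, illustrative pipeline sketch; paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Stage 1: ingest raw CSV files from a hypothetical landing path.
    raw = (
        spark.read
        .option("header", "true")
        .csv("/mnt/raw/trades")
    )

    # Stage 2: clean and standardise the data before it moves downstream.
    clean = (
        raw.dropDuplicates()
        .withColumn("trade_date", F.to_date("trade_date"))
        .filter(F.col("amount").isNotNull())
    )

    # Stage 3: write the curated output as a Delta table for consumers.
    (
        clean.write
        .format("delta")
        .mode("overwrite")
        .save("/mnt/curated/trades")
    )

In practice each stage would be maintained and optimized as a separate, monitored step as the workload moves from development to production.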

Skills:

  • At least three years of work experience in data solutions design and development, including data warehousing, ETL/ELT, and data integration.
  • Delta Lakehouse architecture and associated technologies such as Databricks, Azure Data Lake, and Azure Data Factory.
  • T-SQL, SparkSQL and PySpark.
  • Good understanding of data modelling techniques including conceptual, logical, and physical models, and Kimball and 3NF data structures.
  • Experience working with popular data discovery, analytics, and BI software tools like Power BI, Tableau, Qlik and others for semantic-layer-based data discovery.
  • SQL Server BI stack (SQL Server, SSIS).
  • Good experience of working in an agile delivery framework using Azure DevOps.
  • Good awareness of data governance and appropriate management of risks associated with data.

