
Big Data Engineer (Python, SQL)

Touchpoint Resource
Published
10 March 2020
Location
London, United Kingdom
Employer Type
Technology Company
Industry
IT & Telecoms
Base Salary
up to £80k
Work Hours
Office hours

Description

Big Data Engineer required to design, build and deliver highly complex data solutions for high-profile customers. You will have the opportunity to work with terabytes, and in some cases petabytes, of seriously interesting data sets, contributing to the build of data pipelines and transforming data into visualisation-friendly formats.

With fluency in Big Data a must, we are looking for Data Engineers who can demonstrate experience with modern programming languages such as Python and/or Java, and who have a genuine passion for operating at the forefront of data and machine learning technology. In addition to building large-scale data pipelines, the successful Data Engineer can also expect to participate in developing the company's next generation of machine learning products.

Day to day, your responsibilities will include:

  • Work with the most innovative and scalable data processing technologies
  • Build innovative state-of-the-art solutions
  • Work closely with industry leading partners across Cloud, Data and Visualisation technologies
  • Collaborate in an agile and dynamic environment with a talented team

Essential requirements:

  • BSc or MSc in a computing-related subject
  • Demonstrable programming skills in either Python or Java
  • Minimum of 2 years' experience working with Big Data solutions
  • Strong ETL skills, for example with Hadoop, Spark, Flink or Beam

The successful Big Data Engineer can expect to achieve a competitive salary of up to £80k, plus a benefits package including:

  • 25 days holiday
  • Pension
  • Choice of hardware
  • Regular monthly socials and off-site events
  • Exposure to industry experts through mentoring, conferencing and networking events
  • Freedom to explore, discover and share the latest tools and technologies

Skills

- Experience building big data solutions
- Strong ETL skills, using Hadoop-based technologies such as Spark, and data pipeline frameworks (Beam / Flink)
- Programming skills in SQL, Java and Python
- Working in distributed, cloud environments (GCP / AWS)
- Agile / Scrum

Education Requirements

- Degree in a computing-related subject required as a minimum

Qualifications

BSc or MSc in a computing-related subject

Experience Requirements

- Minimum of 2 years' experience building big data solutions
- Experience with the latest cloud technologies on Google Cloud Platform or AWS is ideal
- Superb communicator, both written and verbal

Responsibilities

- Take ownership of building complex data pipelines
- Craft compelling data visualisations to be fed into state-of-the-art machine learning products
- Strive to keep abreast of new and emerging technologies relating to data, machine learning and AI
