
Description
We are actively seeking a Big Data Engineering Lead to join a high-profile Data / Machine Learning consultancy based in a convenient London location. The company partners with leading vendors in the cloud data space to offer its global customers unrivalled data solutions.
The role will see you get involved in significant data engineering and architectural projects, where you will build scalable data pipelines and shape complex data into a visualisation-friendly form, enabling a seamless transition into machine learning models.
As a Big Data Engineering Lead you will:
- Play a pivotal role in building and leading the Data Engineering team
- Work with the latest, most innovative and scalable data processing technologies on the market
- Create, innovate and deliver complex data solutions as part of a multi-faceted team
- Partner with the greatest minds in the cloud-based data space
To be considered, we are looking for a Big Data Engineering Lead who has superb communication skills, enjoys collaborating with peers at all levels and is keen to join a company operating at the forefront of data-driven technology.
Specifically, we require applicants to possess:
- BSc or MSc degree in Computer Science or a related field
- Demonstrable experience of building big data solutions
- Excellent knowledge of ETL tools, Spark, and data pipeline frameworks (e.g. Beam, Flink)
- Competence in programming and architecture, using technologies such as Java, Python and SQL
- Ability to architect cloud-based infrastructure, with strong troubleshooting skills
- Any team lead or previous 'Engineering Lead' experience would be ideal; however, our client is happy to consider exceptional candidates looking to take the next step in their career
- Extra points available for candidates with broad data science, machine learning, visualisation and data mining knowledge
The successful Big Data Engineering Lead can expect a competitive salary of up to £100k, plus a benefits package including:
- 25 days holiday
- Pension
- Choice of hardware
- Regular monthly socials and off-site events
- Exposure to industry experts through mentoring, conferencing and networking events
- Freedom to explore, discover and share the latest tools and technologies
Skills
- Experience building big data solutions
- Strong ETL skills using Hadoop-based technologies such as Spark, and data pipeline frameworks (Beam / Flink)
- Programming skills in SQL, Java and Python
- Working in distributed, cloud environments (GCP / AWS)
- Agile / Scrum
Education Requirements
Degree-level education as a minimum in a technology-focussed discipline.
Experience Requirements
- A minimum of 5 years working with big data solutions and modern ETL technologies, ideally in a cloud-based environment (AWS / GCP)
- Excellent command of the English language, both written and verbal
Responsibilities
The role will see you get involved in significant data engineering and architectural projects, where you will build scalable data pipelines and shape complex data into a visualisation-friendly form, enabling a seamless transition into machine learning models.