Job Details

 

Data DevOps Engineer - DevOps, Big Data - Permanent - Gloucestershire

Location: Gloucestershire/Bristol, UK (full-time onsite)
Salary: £65K - £95K per annum, negotiable DOE
Benefits: Flexible working hours, career opportunities, private medical, excellent pension, and social benefits

Active DV Clearance is highly desirable. Please note that candidates will need to be eligible to undergo DV Clearance.

The Client: Curo are collaborating with a global edge-to-cloud company advancing the way people live and work. They help companies connect, protect, analyse, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world.

The Candidate: We are looking for a bright, driven, customer-focused professional to join our client's Hybrid Cloud Delivery team and work alongside Enterprise Data Engineering Consultants to accelerate and drive data engineering opportunities.

This is a fantastic opportunity for a dynamic individual with big ambitions: an established technologist with both outstanding technical ability and a consultative mindset. It would suit an open-minded, personable self-starter who relishes the fluidity and collaborative nature of consultancy.

The Role: This role sits within our client's Advisory and Professional Services delivery team, who provide thought-leadership, industry know-how and technical excellence to consultative engagements, helping customers reap maximum business benefit from their technical investments and leveraging best-in-class vendor and partner technologies to create relevant, effective, business-valued technical solutions.
The Data DevOps Engineer role centres on the detailed development and implementation of scalable, clustered Big Data solutions, with a specific focus on automated dynamic scaling and self-healing systems.
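As a concrete illustration of the automated dynamic scaling the role focuses on, a Kubernetes HorizontalPodAutoscaler manifest is one common building block. This is a minimal sketch only; the workload name and thresholds below are hypothetical, not taken from the client's environment:

```yaml
# Hypothetical example: autoscale a data-processing deployment on CPU load.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: data-pipeline-hpa        # hypothetical name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: data-pipeline          # hypothetical target deployment
  minReplicas: 2                 # keep at least two replicas for resilience
  maxReplicas: 10                # cap cluster growth
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70 # scale out above 70% average CPU
```

Combined with Kubernetes' built-in restart of failed pods, this kind of declarative policy is what gives a clustered platform its self-healing, dynamically scaling character.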

Duties:

  • Participating in the full life cycle of data solution development, from requirements engineering through to continuous optimisation engineering and all the typical activities in between
  • Providing technical thought-leadership and advisory on technologies and processes at the core of the data domain, as well as data domain adjacent technologies
  • Engaging and collaborating with both internal and external teams, and being a confident participant as well as a leader
  • Assisting with solution improvement activities driven either by the project or service

Essential Requirements:

  • Excellent knowledge of Linux operating system administration and implementation
  • Broad understanding of the containerisation domain and adjacent technologies/services, such as Docker, OpenShift, Kubernetes, etc.
  • Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc.
  • Monitoring utilising products such as: Prometheus, Grafana, ELK, filebeat etc.
  • Observability - SRE
  • Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem
  • Edge technologies, e.g. NGINX, HAProxy, etc.
  • Excellent knowledge of YAML or similar languages
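The Infrastructure as Code and YAML requirements above go hand in hand. As a flavour of the kind of artefact involved, here is a minimal Ansible playbook; the inventory group name is hypothetical, and NGINX is used simply because it appears in the edge-technology requirement:

```yaml
# Illustrative Ansible playbook: install and start an NGINX edge proxy.
- name: Configure edge proxy nodes
  hosts: edge_nodes              # hypothetical inventory group
  become: true                   # escalate privileges for package/service tasks
  tasks:
    - name: Install NGINX
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure NGINX is running and enabled at boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

Playbooks like this, versioned in Git and run from a CI/CD system such as Jenkins or Concourse, are the day-to-day currency of the IaC workflow the role describes.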

Desirable Requirements:

  • Jupyter Hub Awareness
  • Minio or similar S3 storage technology
  • Trino/Presto
  • RabbitMQ or other common queue technology, e.g. ActiveMQ
  • NiFi
  • Rego
  • Familiarity with code development and shell scripting in Python, Bash, etc.

To apply for this Data DevOps Engineer permanent job, please click the button below and submit your latest CV.

Curo Services endeavours to respond to all applications, however this may not always be possible during periods of high volume. Thank you for your patience.

Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.


Posted Date: 22 Apr 2024 Reference: JSRL6903 Company: Curo Services Contact: Applications