Data engineer – AWS cloud technologies (Open to Remote)

  • Professionals and Technicians
  • Full time
  • 4 months
  • Valencia

Job information

Job description

This is Ekkiden

Ambition and innovation are our driving force. We aim high, act quickly, and acquire the wherewithal we need to achieve our objectives. Our organisation is shaped by the requirements of both our clients and our teams. Becoming a European benchmark requires hard work, quality and respect. We pave our own path, with international teams committed to sustainable growth. We help clients innovate and we make sure we question ourselves at all times. At Ekkiden everything is possible. Our managers are trained to bring out the best in every team member. If you are daring, creative and have great people skills, if you want to live an extraordinary human experience, and if you never need reminding why you get up in the morning, it's time we met.

Role: Data Engineer – AWS Cloud Technologies

This role is open to remote work. Applications from women are particularly encouraged!

Within an international environment, we are looking for a Data Engineer well versed in AWS cloud technologies to support existing data warehouse customers, and to take a senior lead role on new projects implementing data solutions in AWS.


Participate in the project specification and software design phases

Design and implement AWS architectures, solutions, services and tools

Design native cloud application architectures or optimize applications for AWS (S3, SQS, Lambda, Comprehend, Transcribe, Fargate, Aurora, API Gateway, CloudFormation and SAM to build data flow pipelines)

Collaborate within a project team to solve complex problems

Ensure the delivered solution meets the technical specifications and design requirements

Be responsible for meeting development deadlines

Perform other duties as required
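As a purely illustrative sketch of the kind of serverless data-flow step listed above, the following shows an SQS-triggered Lambda handler that normalises incoming JSON messages. The event shape follows the standard SQS Lambda trigger; the field names (`id`, `text`) and the downstream destination are assumptions, and a real deployment would use boto3 (e.g. `s3.put_object`) to persist the results:

```python
import json

def handler(event, context=None):
    """Sketch of an SQS-triggered Lambda stage in a data flow pipeline.

    Each SQS record body is assumed to carry a JSON document; the handler
    flattens it into a normalised record that a later stage (an S3 put,
    Kinesis, etc.) could persist. Field names are illustrative only.
    """
    results = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])  # SQS delivers the message body as a string
        results.append({
            "id": payload["id"],
            "text": payload.get("text", "").strip().lower(),
            "source_queue": record.get("eventSourceARN", "unknown"),
        })
    # In a real pipeline, this is where boto3 would ship `results`
    # downstream, e.g. s3.put_object(...) or a Firehose put.
    return {"batch_size": len(results), "records": results}
```

Invoked with a sample SQS event, the handler returns the cleaned batch ready for the next pipeline stage.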

What we are looking for

Bachelors or higher degree in Computer Science or a related discipline

5+ years of experience with data warehousing methodologies and modelling techniques

Minimum 2 years of experience working with Massively Parallel Processing (MPP) analytical data stores such as Teradata

Minimum 3 years of experience in creating master data datasets

Minimum 2 years of experience in Cloud technologies such as:

AWS – S3, Glacier, EC2, Lambda, SQS, Redshift, Athena, EMR, AWS Glue, AWS Lake Formation, Kinesis, AWS Batch

Azure – Blob Storage, Cool Blob Storage, Virtual Machines, Functions, SQL Database, SQL Data Warehouse, Data Factory, Databricks, Cosmos DB

Experience in handling semi-structured data (JSON, XML) using the VARIANT attribute in Snowflake

General understanding of the Snowflake architecture

Good knowledge of ELT concepts and ETL tools such as Informatica

Experience with Hadoop, Hive, HBase and Spark is a plus

Experience working with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline and CodeDeploy

Experience with containers (Docker)

Strong Python programming skills

Strong scripting skills: Bash Shell and PowerShell

Fluent in English. Spanish is a plus

What we offer

Participate in the creation of a European technology and organisational consulting group

A unique, innovative and dynamic working environment

A career path tailored to your personality, both in terms of job position and location

The opportunity to express yourself, assert your opinion and make an impact within our organisation and/or further afield

You say it, you own it. Once your idea has been given the go-ahead, it is up to you to make it happen

A demanding and innovative training program delivered by our greatest experts

Age is just a number; what really matters are achievements. We have a fast-track promotion system for employees who deliver

Our working conditions are consistent with our values and ambitions
