ETL Data Engineer


External

22/02/2018


  • Description

    For one of our end clients in Arnhem (TenneT), we are looking for an ETL Data Engineer.





    Start date: as soon as possible

    End date: 31-12-2018

    Average hours per week: 40



    The client:

    This client is one of Europe's top five electricity transmission operators. Its focus is on developing a Northwest European energy market, integrating renewable energy, and improving safety performance.



    Position information:

    The Digital Transformation Corporate (DTC) programme implements the TenneT strategy on advancing the use of data and analytics to the benefit of internal operations and society. This will be realized, among other things, by introducing data governance processes and implementing the TenneT Data Platform (TDP). The TDP implements solutions for data integration and big data use cases.



    This implementation is done by a dedicated DevOps team within the IMC department. Apart from realizing use cases, the team is involved in implementing and extending the data platform and the big data capabilities following an agile approach.



    What you will do:

    What will your activities be as an ETL Developer in the TenneT Data Platform DevOps team?



    You support innovative business use cases by transforming business ideas in the field of data integration and big data into IT solutions. These IT solutions can be either first implementations of complex data transformation solutions or governed implementations within the systematic TenneT IT landscape. You will work in a multidisciplinary team in close cooperation with other IT experts and business representatives.



    Your main activities are:

    Designing, building, testing, maintaining, and documenting data integration and big data solutions in an innovative environment using mainly open source big data products;



    Transforming business ideas into IT solutions in close cooperation with your team members and business and IT stakeholders;



    Participating in and contributing to a DevOps/Agile team.



    What you bring:

    You have a completed Bachelor's or Master's degree from a technical IT faculty;

    Demonstrable experience with the Apache Hadoop ecosystem, for example Spark and R. Experience with Cloudera is a plus;

    Demonstrable experience with open source big data tooling, such as Flink, Kafka, Kylo/NiFi, the ELK stack (Elastic Stack), and NoSQL solutions like Cassandra;

    At least two years of development experience in at least one of Java, Scala, or Python;

    Demonstrable experience with deploying and maintaining a big data landscape on Apache Mesos;

    Demonstrable experience implementing big data solutions on AWS, Azure, or Google Cloud;

    Knowledge of big data patterns and data lake concepts;

    Proven experience in agile working environments;

    Strong communication and cooperation skills with different stakeholders in both the business departments and IT;

    Fluency in English. Fluency in Dutch and German is a plus;

    A strong team player with a "getting it done" mentality;

    Willingness to work from the Arnhem office and to travel to the German offices occasionally.


  • Assignment category

    ICT (applications and software)

  • Province

    Gelderland
