Senior Big Data Architect

Company: HAYS
Province: Madrid
Location: Madrid, central Madrid
Description: Be a key driver on BASF's path to digitalisation by supporting existing products and initiatives, as well as innovating additional digital solutions that support BASF's global businesses.

Our team “Big Data Solutions” leverages the latest on-premises and cloud-based technologies to develop and operate data-based solutions for the whole BASF group. An overarching big data platform, the BASF Enterprise Data Lake, thereby builds the technological foundation for implementing a broad range of traditional and advanced analytics use cases and enables various user groups to use the platform in a self-service manner.

Responsibilities:

In the Product Team Big Data Solutions, you will take on an active role in the specification, design, implementation and support of complex software and application components. In your role as a big data architect, you will focus on data integration topics.

o You will take an active part in processing incoming project demands by creating cost estimations, concepts and descriptions.
o You will be responsible for creating and managing the technical solution design based on customer requirements throughout the project life cycle.
o You will coordinate, monitor and advise the development team during the whole development lifecycle, leveraging state-of-the-art methods and tools.
o Ensuring proper testing and quality standards by implementing quality assurance measures completes your job profile.

Qualifications:

– Hands-on experience with:

– Design & implementation of ETL / ELT data pipelines
– Collection, ingestion & processing of batch and streaming data
– Publish/subscribe concept (a minimal sketch follows this list)
– Database replication
– Familiar with Data Warehouse and Data Lake architecture concepts
– Database Management & Design (SQL & NoSQL), preferably with:

– SAP HANA Database
– Microsoft SQL Database
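
The publish/subscribe concept listed above can be illustrated with a short Python sketch. This is a minimal example, assuming the kafka-python client, a broker at localhost:9092 and a hypothetical topic name; none of these details come from the posting, and an Azure Event Hubs endpoint could stand in for the broker.

```python
# Minimal publish/subscribe sketch using the kafka-python client.
# Broker address and topic name are illustrative assumptions.
import json

from kafka import KafkaConsumer, KafkaProducer

BROKER = "localhost:9092"   # assumed local broker; replace with your Kafka / Event Hubs endpoint
TOPIC = "sensor-events"     # hypothetical topic name

# Publisher: serialise a record as JSON and send it to the topic.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"sensor_id": 42, "temperature": 21.5})
producer.flush()

# Subscriber: read records from the same topic and deserialise them.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,  # stop iterating after 10 s of inactivity so the sketch terminates
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)
```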

– Azure Cloud / AWS Cloud experience with the following services:

– Azure EventHub and/or Apache Kafka
– Azure Data Factory and/or AWS Glue
– Azure Data Lake and/or AWS S3
– Azure Databricks and/or Azure HDInsight / AWS EMR
– Azure SQL & Synapse and/or AWS Redshift

– Experience with relational data integration from ERP & CRM systems (preferably SAP-based)
– Integration of IoT & time series data
– Integration of multidimensional data from a Data Warehouse

– Coding Skills & Frameworks:

– Python (must)
– SQL (must)
– Apache Spark (PySpark, SparkSQL) (must) (a minimal sketch follows this list)
– PowerShell
– Any object-oriented programming language (C#, Java, ...)
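
As a rough illustration of the ETL pipelines and Spark skills listed above, the sketch below builds a small batch ETL job with PySpark and Spark SQL; the input path, column names and output location are assumptions made for the example only.

```python
# Minimal batch ETL sketch with PySpark and Spark SQL.
# Paths, column names and the target layout are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files from an assumed data lake landing zone.
raw = spark.read.option("header", True).csv("/landing/erp/orders/")

# Transform: type the amount column and drop incomplete rows.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["order_date", "amount"])
)

# Aggregate per day with Spark SQL.
orders.createOrReplaceTempView("orders")
daily = spark.sql(
    "SELECT order_date, SUM(amount) AS total_amount "
    "FROM orders GROUP BY order_date"
)

# Load: write the curated result as Parquet, partitioned by date.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/curated/orders_daily/")
```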

– Methodological Skills:

– Agile project management methods such as Scrum / Kanban
– Familiar with the DevOps approach
– Working with Azure DevOps (VSTS) or the Atlassian stack (Jira, Bitbucket, etc.)
– Familiar with Git workflow & CI/CD

– Studies: business informatics, computer science or a comparable technical education
– Several years of professional experience in the IT environment, as well as good knowledge of project management with regard to processes, methods and tools
– Practical hands-on experience with the technologies and concepts mentioned above
– Good communication and interpersonal skills in an international team, fluent English, good negotiation skills, and good planning, organization and follow-through abilities, combined with a highly developed awareness of the organization
– Innovation-oriented, analytical and strategic thinking
Technologies: ETL, SQL, NoSQL, SAP HANA, Azure, AWS
Contract Type: Permanent
Salary: Not specified
Experience: 1 year
Functions: ICT Architect – Big Data

