We are looking for a Data Engineer to work on the implementation of an entirely new BI architecture. You'll be part of the Data & Analytics center of excellence in Randstad's global Digital Factory. It's a role in a highly dynamic international environment: you will lead our operating companies in implementing a fully cloud-based BI architecture that serves as the blueprint for the entire Randstad Group, which spans 38 countries.
As a Data Engineer you have an overview of the data landscape. You build data pipelines that extract data from internal and external sources and make it available in the country-specific data lakes for various data consumers (BI tools, AI projects, data scientists, etc.). To ingest and transform data we have an in-house developed framework based on the latest cloud technologies. Our tech stack is GCP (Google Cloud Platform) BigQuery, extended with tools such as Composer/Airflow, Dataflow/Apache Beam, GitLab, Spinnaker and SonarQube, among others.
Ingesting data from sources into the data lake is only part of your job. Together with your team you are also responsible for transforming the data through different stages (i.e. from landing zone to data warehouse and data mart). This means that designing, standardizing and reviewing data (warehouse) models, both technical and logical, is also part of your job. Because of the central position of the data engineer, you will play a crucial role in translating business requirements into data products such as pipelines and data marts. Additionally, you manage the required data quality standards (availability, integrity, confidentiality, timeliness, frequency and completeness). At Randstad we process a lot of personal data, so we expect our data engineers to act as custodians, designing solutions with the highest security standards in mind.
- develop next-generation scalable, flexible, and high-performance data pipelines
- optimize and expand data warehouses, through integration of new data
- build data marts and data models to support Data Science and other internal customers
- act as an intermediary on problems, communicating with both technical and non-technical audiences
- help define quality standards for the team and share your knowledge
- collaborate closely with our local data teams to continue shaping the technical vision and recast problems and challenges into innovative solutions
- work closely with the business functions to identify the problem statement and work towards answering it from a data perspective
- give advice on potential areas where data streams can be optimised
- experiment with available tools and advise on new tools in order to determine optimal solutions given the requirements
- a bachelor's or master's degree in informatics/computer science
- at least 2-3 years of working experience with ETL (preferably ELT)
- at least 2-3 years of working experience with database design, data modelling and normalisation techniques
- expert knowledge of SQL
- experience with cloud infrastructure and a cloud-based way of working; knowledge of GCP and technologies such as BigQuery, Dataflow/Apache Beam and Composer/Airflow is a (big) plus
- some experience with architecture in complex IT landscapes
- experience with multiple programming languages
- experience with scalable data platforms, security, authorisation and authentication.
- experience with common devops and CI/CD practices to guarantee the quality of our products
- be comfortable with an agile way of working
- we encourage experiments and exploration, which does mean you should be able to adapt to a variety of challenges and like quick feedback loops
- have a keen eye for details
- drives results in a politically dynamic environment and is not afraid to speak up and take responsibility
- a strong communicator, both verbally and in writing, fluent in English
- a structured, analytical approach to problem solving
- forward thinking and persistent in striving for quality
- mentors junior engineers to become independent members of the team
- If you really want to impress us, you can do so with experience in: API management, a good understanding of analytics and machine learning concepts, Python, and data governance.
- Duration: 6 months with an option to extend
- Working from home until government measures regarding Corona allow working in the office again
- Location: Diemen (near Amsterdam)
- Open to freelancers or as a permanent position
You will play a pivotal role in the digital transformation of the world's leading HR company. Join Randstad as a DATA ENGINEER.
Randstad was founded 60 years ago and became one of the first platform companies in the world by creating a marketplace for the supply and demand of labour. Back in those early days, everything happened offline in our branches. Today, however, Randstad's business models are primarily online and consume and generate a wealth of data. We understand the value of this data and are busy implementing our new data strategy so we can unlock it at scale. Would you like to join us on this journey and become part of the largest HR service provider in the world, building on our ambition to touch the working lives of 500 million people, while working with the most cutting-edge tools and supported by a great team? Check out the DATA ENGINEER role below and apply!
Inclusivity and diversity
Naturally, this vacancy is open to everyone who recognises themselves in it. We believe that diverse teams are essential for us as a learning organisation that wants to stay at the forefront of the world of work, because it is precisely the differences between people that drive growth: of colleagues, clients, candidates, and thereby of Yacht. Do you have a unique talent? We would love to meet you.