Our client is initiating a group-wide digital transformation programme and requires a Data Architect to help realise this vision. A minimum of 6-10 years of relevant experience is expected, with the following specific experience deemed ideal:
- Hands-on design, development, implementation and testing of Big Data solutions from scratch in cloud environments (AWS/Azure/GCP)
- Specific and significant experience with Azure Data Lake, or AWS HBase / DynamoDB, or MongoDB / Redis / Couchbase / Hadoop / Hortonworks / Cloudera
- Experience designing for ODS and analytics/insight workloads
- Proven ability to plan and migrate legacy SQL stored procedures, bespoke scripts and processes to a Big Data platform
- Organising and developing transformations on unstructured data from myriad sources, types and formats, with delivery through REST APIs
- Expertise in MapReduce, JSON, Sharding, Scaling, Entity models, Open APIs, API management
- A structured and analytical (but not theoretical) approach to problem-solving.
- Exposure to Kafka on AWS or similar is advantageous
- Personal traits: a "finisher"; a quick learner, focused but adaptable, who thrives in fast-moving, challenging environments and has good communication skills.
The right candidate is expected to come at least partly from a blue-chip background and/or, ideally, from technology-first companies such as social media, telcos, or challenger banks.
By applying now, you confirm that you have read and accepted our GDPR Privacy Notice for Job Applicants.