Big Data Architect
Posted 2 hours 33 minutes ago by Vallum
Role: Big Data Architect (GCP, Spark & Scala)
Location: 2/3 days Onsite in London
Full Time opportunity
Job Description
You will be responsible for translating client requirements into design, and for architecting and implementing GCP Cloud-based big data solutions for clients. Your role will focus on delivering high-quality solutions by independently driving design discussions across Data Ingestion, Transformation & Consumption, Data Storage and Computation Frameworks, Performance Optimization, Infrastructure, Automation & Cloud Computing, and Data Governance & Security. The role requires a hands-on technologist with expertise in Big Data solution architecture and a strong programming background in Java, Scala, or Python.
Your Impact:
- Provide technical leadership and hands-on implementation role in the areas of data engineering including data ingestion, data access, modelling, data processing, visualization, design, and implementation.
- Lead a team to deliver high-quality big data solutions on GCP Cloud. Manage functional and non-functional scope and quality.
- Help establish standard data practices such as governance, and address non-functional concerns such as data security, privacy, and quality.
- Manage and provide technical leadership to data programme implementations, based on requirements, using agile methodologies.
- Participate in workshops with clients and align client stakeholders to optimal solutions.
- Provide consulting, thought leadership, and mentorship, drawing on strong soft skills.
- Manage people, and contribute to hiring and capability building.
Qualifications
Your Skills & Experience:
- 8+ years of overall IT experience, with 3+ years in data-related technologies and 1+ years of expertise in data-related GCP Cloud services, having delivered at least one project as an architect.
- Knowledge of Big Data architecture patterns and experience delivering end-to-end Big Data solutions on GCP Cloud (mandatory).
- Expert in programming languages such as Java or Scala; Python is a plus.
- Expert in at least one distributed data processing framework, such as Spark (Core, Streaming, SQL), Storm, or Flink.
- Expert in the Hadoop ecosystem on a GCP cloud distribution, with hands-on experience in one or more big data ingestion tools (Sqoop, Flume, NiFi, etc) and distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc); familiarity with traditional tools such as Informatica or Talend is a plus.
- Should have worked with NoSQL solutions such as MongoDB, Cassandra, or HBase, or cloud-based NoSQL offerings such as DynamoDB or Bigtable.
- Good exposure to development with CI/CD pipelines. Knowledge of containerization, orchestration, and Kubernetes Engine would be an added advantage.
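As a flavour of the day-to-day work the skills above imply, the sketch below shows a minimal Spark batch job in Scala: ingest raw data, apply a transformation, and write out a consumable dataset. It is purely illustrative; the bucket paths, column names, and dataset are hypothetical, and a real engagement on GCP would typically run this on Dataproc with client-specific sources such as Pub/Sub or Bigtable.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventAggregationJob {
  def main(args: Array[String]): Unit = {
    // Local master for illustration; on GCP this would be submitted to a Dataproc cluster.
    val spark = SparkSession.builder()
      .appName("event-aggregation-sketch")
      .master("local[*]")
      .getOrCreate()

    // Hypothetical GCS input path; Parquet is a common landing format for ingested data.
    val events = spark.read.parquet("gs://example-bucket/raw/events/")

    // Transformation step: daily event counts per user (assumed columns: userId, eventTime).
    val daily = events
      .withColumn("eventDate", to_date(col("eventTime")))
      .groupBy(col("userId"), col("eventDate"))
      .agg(count("*").as("eventCount"))

    // Consumption layer: partitioned output for downstream querying (e.g. via BigQuery external tables).
    daily.write
      .mode("overwrite")
      .partitionBy("eventDate")
      .parquet("gs://example-bucket/curated/daily_event_counts/")

    spark.stop()
  }
}
```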
Set Yourself Apart With:
- Certification in the GCP cloud platform or in big data technologies.
- Strong analytical and problem-solving skills.
- Excellent understanding of data technologies landscape/ecosystem.