
Data Platform Engineer (GCP, Data DevOps, Databricks)

Posted 2 hours 38 minutes ago by Red - The Global SAP Solutions Provider

Job type: Contract
Rate: Not Specified
Sector: Other
Location: Not Specified, Portugal
Job Description

Role: Data Platform Engineer

Location: Remote

Start date: ASAP

Duration: 12 months


Must-Have Skills:

  • GCP Networking & Infrastructure: Deep experience in GCP cloud computing, networking, and secure infrastructure design.
  • Google Cloud Services: Expertise in Google Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage.
  • Kubernetes: Advanced knowledge in container orchestration.
  • Programming: Proficiency in Python, PySpark, and SQL (see the illustrative sketch after this list).
  • Terraform: Experience in Infrastructure as Code (IaC) using Terraform.
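
For illustration only, below is a minimal sketch of the kind of Python/PySpark/SQL work these skills imply. The bucket paths, column names, and aggregation are hypothetical placeholders (and reading gs:// paths assumes a configured GCS connector); none of it describes this role's actual codebase.

  # Minimal PySpark job: read raw JSON events from GCS, aggregate, write Parquet.
  # All names here are hypothetical placeholders.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("example-daily-counts").getOrCreate()

  # Read raw events from a (hypothetical) GCS bucket.
  events = spark.read.json("gs://example-raw-bucket/events/")

  # Basic cleansing and aggregation with the DataFrame API.
  daily_counts = (
      events
      .filter(F.col("event_type").isNotNull())
      .withColumn("event_date", F.to_date("event_timestamp"))
      .groupBy("event_date", "event_type")
      .agg(F.count("*").alias("event_count"))
  )

  # The same aggregation expressed in SQL (either result could be written out).
  events.createOrReplaceTempView("events")
  daily_counts_sql = spark.sql("""
      SELECT to_date(event_timestamp) AS event_date,
             event_type,
             COUNT(*) AS event_count
      FROM events
      WHERE event_type IS NOT NULL
      GROUP BY 1, 2
  """)

  # Write the curated output back to GCS as date-partitioned Parquet.
  daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
      "gs://example-curated-bucket/daily_counts/"
  )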

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines in PySpark on GCP.
  • Implement and manage cloud infrastructure using Terraform.
  • Ensure data quality, governance, and security across the platform.
  • Work with Databricks and Google Cloud Pub/Sub for data ingestion, processing, and storage (a minimal ingestion sketch follows this list).
  • Manage Kubernetes clusters for application deployment and data orchestration.
  • Contribute to hybrid/multi-cloud architecture if needed.
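
As a rough illustration of the Pub/Sub ingestion responsibility above, the sketch below pulls and acknowledges messages with the google-cloud-pubsub client. The project ID, subscription ID, and downstream handling are assumptions made for the example, not details of this engagement.

  # Minimal Pub/Sub consumer sketch; IDs are hypothetical placeholders.
  import json
  from concurrent.futures import TimeoutError

  from google.cloud import pubsub_v1

  PROJECT_ID = "example-project"        # hypothetical
  SUBSCRIPTION_ID = "raw-events-sub"    # hypothetical

  subscriber = pubsub_v1.SubscriberClient()
  subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

  def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
      """Decode an incoming event and acknowledge it once handled."""
      event = json.loads(message.data.decode("utf-8"))
      # A real pipeline would land the event in GCS / Databricks / Dataflow
      # rather than printing it.
      print(f"received event: {event.get('event_type')}")
      message.ack()

  streaming_pull = subscriber.subscribe(subscription_path, callback=handle_message)

  with subscriber:
      try:
          # Pull for a bounded interval; a long-running service would block indefinitely.
          streaming_pull.result(timeout=60)
      except TimeoutError:
          streaming_pull.cancel()
          streaming_pull.result()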

Additional Skills:

  • Experience with Azure and hybrid cloud environments is a plus.
