Job Description:
-A minimum of 2.5 years of hands-on data engineering experience is a must.
-An advanced degree in Computer Science, Data Engineering, Data Analytics, or an equivalent discipline.
-Strong programming experience with Python and SQL, including building ETL processes and efficiently querying big datasets.
-Prior experience with CI/CD using CircleCI, Terraform, and GitHub version control is a must.
-Solid understanding of Data Collection, Data Ingestion, Data Integration, and Data Operations & Maintenance.
-Experience working with front-end technologies such as React, Angular, Flask, JavaScript, HTML, and CSS is a plus.
-Domain knowledge of News, Media & Communications is an added advantage.
Responsibilities:
-Design, implement, and maintain scalable, reliable GCP cloud-based data infrastructure and BI/ELT processes.
-Collaborate with cross-functional teams to define data architecture and data warehouse requirements.
-Automate data pipelines, ETL/ELT processes, and data integration workflows using infrastructure-as-code (IaC) and configuration management tools.
-Deliver production-quality, modular, reproducible code in Python and SQL.
-Implement continuous integration and continuous deployment (CI/CD) pipelines for data pipelines and data-related applications.
-Keep abreast of the latest developments in the field through continuous learning and proactively champion promising new methods relevant to the problems at hand.
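To make the "production quality, modular, reproducible code" expectation concrete, here is a minimal illustrative sketch (not part of the role description; the record shape and function names are hypothetical) of the kind of pure, unit-testable transform step a pipeline like this tends to favor:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ArticleRecord:
    # Hypothetical record shape for a news-domain pipeline.
    title: str
    views: int

def clean_titles(records):
    """Pure, deterministic transform: normalize titles and drop empty rows.

    Because it has no side effects, it is idempotent on reruns and easy
    to unit-test in CI (e.g. in a CircleCI pipeline).
    """
    return [
        ArticleRecord(title=r.title.strip().lower(), views=r.views)
        for r in records
        if r.title.strip()
    ]

# Usage: blank-title rows are dropped, titles are normalized.
raw = [ArticleRecord("  Breaking News ", 120), ArticleRecord("   ", 5)]
cleaned = clean_titles(raw)  # → [ArticleRecord(title='breaking news', views=120)]
```

Keeping transforms pure like this is what makes them "reproducible": the same input always yields the same output, regardless of when or where the orchestrator (e.g. Airflow) runs the task.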
Tech Stack:
● Google Cloud Platform (IaaS)
● BigQuery, GCS, Python, SQL
● Composer / Airflow (orchestration)
● Terraform, CircleCI, CI/CD, GitHub, Jira, Confluence
Location : Bangalore, India
Apply link : Click Here