In this role, you’ll play a key part in developing and maintaining data infrastructure that supports critical business insights and reporting.
Key Responsibilities:
- Design and implement data ingestion pipelines and workflows using GCP tools (BigQuery, Cloud Composer, Dataform, and more).
- Collaborate with cross-functional teams to translate business requirements into technical solutions.
- Develop, configure, test, and document new and existing data processes.
- Write and optimize SQL queries to extract, transform, and validate large data sets.
- Troubleshoot and resolve production issues related to data ingestion and pipelines.
- Create and execute testing and implementation plans, including unit, system, and user acceptance testing.
- Contribute to process improvement by identifying opportunities for greater efficiency and automation.
- Participate in code reviews and ensure adherence to security and development standards.
- Join an on-call rotation to support production environments (typically one week every 4–6 weeks).