About the Role:
We are looking for a highly skilled Data Engineer with strong expertise in Google Cloud Platform (GCP) services, DBT, SQL/Python, and GitLab. The ideal candidate will be responsible for building scalable data pipelines, enabling analytics, and ensuring efficient data integration across platforms.

Key Responsibilities:
- Design, build, and optimize scalable data pipelines and ETL processes on GCP.
- Develop and manage transformation workflows using DBT with SQL and Python.
- Ensure best practices in data modeling, governance, and performance optimization.
- Implement CI/CD pipelines and version control using GitLab.
- Collaborate with cross-functional teams, including Data Scientists, Analysts, and Product Managers.
- Monitor, troubleshoot, and enhance data workflows for accuracy and reliability.
- Ensure adherence to security, compliance, and data quality standards.

Required Skills & Qualifications:
- 5-7 years of proven experience as a Data Engineer.
- Strong proficiency in GCP services (BigQuery, Cloud Storage, Pub/Sub, Dataflow, Composer, etc.).
- Hands-on experience with DBT (Data Build Tool) for data transformation and orchestration.
- Advanced SQL skills with the ability to write optimized queries for large-scale datasets.
- Proficiency in Python for data manipulation and automation.
- Experience with GitLab (version control, CI/CD, workflow management).
- Solid understanding of data modeling, data warehousing, and ELT/ETL frameworks.
- Strong analytical, problem-solving, and communication skills.