GCP Data Architect – Analytics & BI Platform (Bangalore)
Job Title: GCP Data Architect – Analytics & BI Platform
Location: Bangalore, Karnataka, India (Onsite)
Employment Type: Full-Time
Salary: 30,00,000 / year
Job Summary:
We are seeking an experienced GCP Data Architect with deep expertise in Google Cloud Platform (GCP) and a strong foundation in BI architecture, data layering, and performance optimization. This role will be responsible for designing and delivering scalable, governed, and cost-efficient enterprise data platforms leveraging BigQuery, Looker, and the GCP AI & Data ecosystem.
The ideal candidate will play a strategic role in data modernization, advanced analytics enablement, and AI adoption, while ensuring robust data governance, security (including row-level access controls), and best-in-class BI performance optimization.
Role Summary:
As a GCP Data Architect, you will define and implement end-to-end cloud-native data and BI architectures, spanning raw, curated, and mart layers. You will collaborate closely with data engineering, analytics, and business teams to enable trusted, high-performing, and scalable analytics solutions that support enterprise decision-making and AI-driven insights.
Key Responsibilities:
Architecture & Platform Design
- Design and implement cloud-native data and BI architectures on GCP, aligned with enterprise standards and best practices.
- Define and manage data layers (raw, curated, semantic/mart layers) to support scalable analytics and reporting.
- Architect modern BI and analytics platforms using BigQuery and Looker.
- Design optimized data models and marts for analytics and self-service BI use cases.
Performance & Cost Optimization
- Apply BigQuery optimization techniques (partitioning, clustering, query optimization, cost controls).
- Lead mart-layer optimization to improve query performance, usability, and cost efficiency.
- Drive Looker / BI tool optimization, including LookML design, caching strategies, derived tables, and performance tuning.
- Establish best practices for BI architecture, semantic modeling, and governed data consumption.
Governance, Security & Collaboration
- Implement row-level and column-level security to ensure secure data access across user groups.
- Ensure data quality, consistency, and governance across all data layers.
- Collaborate with stakeholders to define analytics requirements and translate them into scalable data solutions.
Data Modernization & AI Enablement
- Lead data modernization initiatives, including migration of legacy data platforms to GCP.
- Enable and support AI/ML use cases using GCP services such as Vertex AI.
- Partner with data science and ML teams to operationalize analytics and AI solutions.
Technical Leadership
- Provide technical leadership and architectural guidance to data engineering and analytics teams.
- Define standards, reference architectures, and reusable patterns.
- Communicate effectively with business and technical stakeholders to align solutions with business goals.
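To make the BigQuery optimization techniques named above concrete, the sketch below shows the kind of DDL involved, held in a Python string for illustration. The dataset, table, and column names (analytics.sales_mart and its fields) are hypothetical examples, not part of this role description.

```python
# Illustrative BigQuery DDL for partitioning, clustering, and cost controls.
# All identifiers (analytics.sales_mart, columns) are hypothetical examples.

partitioned_table_ddl = """\
CREATE TABLE analytics.sales_mart (
  order_id   STRING,
  region     STRING,
  order_date DATE,
  revenue    NUMERIC
)
PARTITION BY order_date      -- partition pruning cuts scanned bytes (cost)
CLUSTER BY region, order_id  -- clustering co-locates rows for common filters
OPTIONS (
  partition_expiration_days = 730,   -- cost control: auto-drop old partitions
  require_partition_filter  = TRUE   -- block accidental full-table scans
);
"""

print(partitioned_table_ddl)
```

Partitioning by date plus a required partition filter is a common pairing: queries that forget a date predicate fail fast instead of silently scanning (and billing for) the whole table.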
Required Skills & Experience:
- Strong hands-on experience with BigQuery, Looker, LookML, and core GCP data services.
- Solid understanding of BI architecture, semantic modeling, and analytics design patterns.
- Proven experience working with multi-layered data architectures (raw, curated, mart).
- Expertise in performance and cost optimization for data warehouses and BI tools.
- Hands-on implementation of row-level security and governed data access.
- Strong understanding of data engineering, data modeling, and analytics workflows.
- Familiarity with AI/ML concepts and GCP AI services.
- Excellent communication and stakeholder engagement skills.
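As one concrete illustration of the row-level security skill listed above, BigQuery implements it with row access policies. The sketch below holds example DDL in a Python string; the table, policy name, and group address are all hypothetical.

```python
# Illustrative BigQuery row-level security: a row access policy restricts
# which rows a principal can read. All identifiers here are hypothetical.

row_access_policy_ddl = """\
CREATE ROW ACCESS POLICY emea_analysts_only
ON analytics.sales_mart
GRANT TO ('group:emea-analysts@example.com')
FILTER USING (region = 'EMEA');
"""

print(row_access_policy_ddl)
```

With the policy in place, members of the granted group see only rows where the filter predicate is true; all other principals need their own policy (or an all-rows policy) to read anything from the table.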
Preferred Qualifications:
- Google Cloud certifications (Professional Data Engineer, Professional Cloud Architect, or equivalent).
- Experience with streaming pipelines, data governance frameworks, and MLOps.
- Background delivering large-scale enterprise analytics and BI programs.
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or equivalent practical experience.