1824 Data Architecture Jobs - Page 41

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

12.0 - 17.0 years

19 - 25 Lacs

Mumbai

Work from Office

Management Level: I&F Decision Science Practitioner Associate Director

Job Location: Bangalore / Gurgaon / Mumbai

Must-have Skills: Generative AI, Cloud Computing, Emerging Data & AI Platforms/Technologies/Trends, Data & AI Strategy Development, Business Case Development & ROI Analysis, Data Architecture & Operating Model Design, AI Strategy Execution and Governance, Stakeholder Management and Executive Communication

Good-to-have Skills: Wealth Analytics (e.g., Asset Allocation, Robo Advisory, Personalization of Insights), Responsible AI Implementation, Cross-functional Leadership on Large Transformative Deals, Ecosystem Strategy and Partner Engagement

Job Summary: We are seeking an experienced and visionary Ind & Func AI Value Strategy Senior Manager to join the Accenture Strategy & Consulting team in the Global Network Data & AI practice. This role involves defining and leading data and AI strategies for large transformation deals within Financial Services, working on high-impact initiatives that shape how clients adopt AI, generate value, and innovate at scale.

Key Responsibilities (Strategy & Execution):
- Lead the development of robust data and AI strategies by identifying high-value business use cases and redesigning processes to unlock transformative outcomes
- Define current-state assessments and future-state roadmaps using tools like data diagnostics, AI maturity assessments, and business case development

Requirements:
- Deep understanding of Generative AI, cloud computing, and emerging data and AI platforms, technologies, and trends
- Master's degree in Computer Science, Data Science, Business Administration, or a related field; MBA preferred
- Proven track record of at least 12 years of experience in data and AI strategy, with a focus on driving business value through the strategic application of AI technologies
- Strong understanding of data and AI platforms and technologies, architecture principles, operating models, and best practices, with hands-on experience in designing and implementing scalable AI solutions
- Demonstrated ability to build compelling business cases for AI initiatives, with experience conducting cost-benefit analyses, ROI assessments, and risk evaluations
- Excellent strategic thinking and problem-solving skills, with the ability to identify opportunities for AI-driven innovation and value creation in complex business environments
- Exceptional communication and stakeholder management skills, with the ability to collaborate effectively with cross-functional teams and influence decision-making at all levels of the organization

Professional & Technical Skills:
- Deep understanding of data and AI technologies, cloud platforms (AWS, Azure, GCP), and architectural principles
- Proven ability to drive business transformation using strategic AI tools and techniques
- Strong stakeholder engagement and the ability to influence decisions at senior executive levels

Additional Information:
- Excellent communication, leadership, and cross-functional collaboration skills
- Willingness to travel as required for client engagements (up to 40%)

About Our Company | Accenture

Qualification:
- Experience: Minimum 12 years of experience in Data and AI Strategy, with a focus on driving business value through strategic AI applications
- Educational Qualification: Master's degree in Computer Science, Data Science, Business Administration, or a related field; MBA preferred

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer

Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Must-have Skills: Databricks Unified Data Analytics Platform
Good-to-have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Assist with the data platform blueprint and design
- Collaborate with Integration Architects and Data Architects
- Ensure cohesive integration between systems and data models
- Implement data platform components
- Troubleshoot and resolve data platform issues

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years of full-time education

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer

Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.

Must-have Skills: Microsoft Azure Data Services
Good-to-have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems, remaining involved in the end-to-end data management process.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead data architecture design and implementation
- Optimize data delivery and redesign infrastructure for greater scalability
- Implement data security and privacy measures
- Collaborate with data scientists and analysts to understand data needs

Professional & Technical Skills:
- Must-have: Proficiency in Microsoft Azure Data Services
- Strong understanding of cloud-based data solutions
- Experience with data warehousing and data lakes
- Knowledge of SQL and NoSQL databases
- Hands-on experience with data integration tools
- Good-to-have: Experience with Azure Machine Learning

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years of full-time education

Posted 1 month ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer

Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Must-have Skills: Databricks Unified Data Analytics Platform
Good-to-have Skills: NA
Minimum Experience: 2 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Engage in continuous learning to stay updated on the latest trends and technologies in data platforms
- Assist in the documentation of data architecture and integration processes

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of data integration techniques and methodologies
- Experience with cloud-based data solutions and architectures
- Familiarity with data modeling concepts and practices
- Ability to troubleshoot and optimize data workflows

Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years of full-time education

Posted 1 month ago

Apply

15.0 - 20.0 years

50 - 70 Lacs

Kalyani, Bengaluru

Work from Office

Please Note:
1. If you are a first-time user, please create your candidate login account before you apply for a job. (Click Sign In > Create Account)
2. If you already have a Candidate Account, please sign in before you apply.

Job Description:

About Us: Join the Avi Application Load Balancer Analytics team, which plays a critical role in driving insights, performance optimization, and intelligent automation across our platforms. Beyond our daily responsibilities, our team has a proven history of innovation, with an impressive portfolio of multiple patents. We encourage and support each other in exploring new ideas and turning them into deployable solutions, fostering a culture of creativity and intellectual curiosity. We're looking for a seasoned Staff Engineer to lead complex analytics initiatives that intersect big data, AI/ML, distributed systems, and high-performance computing. This is an on-site position in Bangalore, India, where you will be part of a larger cross-border team of smart and motivated engineers located in both Bangalore, India and Palo Alto, USA, while enjoying the autonomy and support to express yourself creatively and deliver at your best.

What You'll Do:
- Architect and lead the design of scalable, high-performance data analytics platforms and systems
- Develop and optimize services using GoLang, C++, and Python for ingesting, processing, and analyzing massive volumes of telemetry and network data
- Implement data pipelines and analytics workflows across SQL/NoSQL databases (e.g., PostgreSQL, TimescaleDB, Redis)
- Build and manage search and indexing systems using OpenSearch or Lucene, ensuring low-latency querying and efficient data retrieval
- Design and enforce strong data modeling practices across structured and unstructured data sources
- Collaborate closely with ML engineers to deploy AI/ML models for predictive analytics, anomaly detection, and intelligent insights

Must Have:
- 15+ years of experience in software/data engineering, preferably within large-scale networking, cloud infrastructure, or data platforms
- Expertise in GoLang, C++, and Python, with production-level experience in performance-critical systems
- Deep understanding of SQL and NoSQL databases, their tradeoffs, scaling strategies, and schema design
- Strong hands-on experience with search technologies such as OpenSearch, Lucene, or Elasticsearch
- Proven experience with data modeling, pipeline optimization, and data architecture
- Solid foundation in AI/ML concepts, with applied experience deploying models into production analytics systems
- Strong communication and leadership skills, with a passion for mentorship and technical excellence

Nice to Have:
- A strong background in networking, with proven experience building high-performance networking appliances
- Experience in telemetry, network data analytics, or observability systems
- Familiarity with Kubernetes, Kafka, Spark, or similar distributed systems technologies

Broadcom is proud to be an equal opportunity employer. We will consider qualified applicants without regard to race, color, creed, religion, sex, sexual orientation, national origin, citizenship, disability status, medical condition, pregnancy, protected veteran status, or any other characteristic protected by federal, state, or local law. We will also consider qualified applicants with arrest and conviction records consistent with local law. If you are located outside the USA, please be sure to fill out a home address, as this will be used for future correspondence.

Posted 1 month ago

Apply

6.0 - 8.0 years

12 - 16 Lacs

Pune

Work from Office

I am sharing the job description (JD) for the Data Architect role. We are looking for someone who can join as soon as possible, and I have included a few key points below. The ideal candidate should have 6 to 8 years of experience, and I am flexible for a strong profile.

Job Description: The ideal profile should have a strong foundation in data concepts, design, and strategy, with the ability to work across diverse technologies in a technology-agnostic manner.

Key Responsibilities:

Transactional Database Architecture:
- Design and implement high-performance, reliable, and scalable transactional database architectures
- Collaborate with cross-functional teams to understand transactional data requirements and create solutions that ensure data consistency, integrity, and availability
- Optimize database designs and recommend best practices and technology stacks
- Oversee the management of entire transactional databases, including modernization and de-duplication initiatives

Data Lake Architecture:
- Design and implement data lakes that consolidate data from disparate sources into a unified, scalable storage solution
- Architect and deploy cloud-based or on-premises data lake infrastructure
- Ensure self-service capabilities across the data engineering space for the business
- Work closely with Data Engineers, Product Owners, and Business teams

Data Integration & Governance:
- Understand ingestion and orchestration strategies
- Implement data sharing and data exchange, and assess data sensitivity and criticality to recommend appropriate designs
- Basic understanding of data governance practices

Innovation:
- Evaluate and implement new technologies, tools, and frameworks to improve data accessibility, performance, and scalability
- Stay up to date with industry trends and best practices to continuously innovate and enhance the data architecture strategy

Location: Pune | Brand: Merkle | Time Type: Full time | Contract Type: Permanent

Posted 1 month ago

Apply


10.0 - 14.0 years

45 - 55 Lacs

Bengaluru

Work from Office

As a Senior Engineering Manager - Myntra Data Platform, you will oversee the technical aspects of the data platform, driving innovation and ensuring efficient data management processes. Your role will have a significant impact on the organization's data strategy and overall business objectives.

Roles and Responsibilities:
- Lead and mentor a team of engineers to deliver high-quality data solutions
- Develop and execute strategies for data platform scalability and performance optimization
- Collaborate with cross-functional teams to align data platform initiatives with business goals
- Define and implement best practices for data governance, security, and compliance
- Drive continuous improvement through innovation and technological advancement
- Monitor and analyze data platform metrics to identify areas for enhancement
- Ensure seamless integration of new data sources and technologies into the platform

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 10-14 years of experience in engineering roles with a focus on data management and analysis
- Proven experience in leading high-performing engineering teams
- Strong proficiency in data architecture, ETL processes, and database technologies
- Excellent communication and collaboration skills to work effectively with stakeholders
- Relevant certifications in data management or related fields are a plus

Who are we? Myntra is India's leading fashion and lifestyle platform, where technology meets creativity. As pioneers in fashion e-commerce, we've always believed in disrupting the ordinary. We thrive on a shared passion for fashion, a drive to innovate to lead, and an environment that empowers each one of us to pave our own way. We're bold in our thinking, agile in our execution, and collaborative in spirit. Here, we create MAGIC by inspiring vibrant and joyous self-expression and expanding fashion possibilities for India, while staying true to what we believe in. We believe in taking bold bets and changing the fashion landscape of India. We are a company that is constantly evolving into newer and better forms, and we look for people who are ready to evolve with us. From our humble beginnings as a customization company in 2007 to being technology and fashion pioneers today, Myntra is going places, and we want you to take part in this journey with us. Working at Myntra is challenging but fun: we are a young and dynamic team, firm believers in meritocracy, believe in equal opportunity, encourage intellectual curiosity, and empower our teams with the right tools, space, and opportunities.

Posted 1 month ago

Apply

9.0 - 14.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & Responsibilities:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems
- Define and maintain data standards, naming conventions, and metadata for consistency across systems
- Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements
- Create entity-relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships
- Support data governance initiatives, including data lineage, quality, and cataloging
- Review and validate data models with business and technical stakeholders
- Provide guidance on normalization, denormalization, and performance tuning of database designs
- Ensure models comply with organizational data policies, security, and regulatory requirements

We are looking for a Data Modeler/Architect to design conceptual, logical, and physical data models. The candidate must translate business needs into scalable models for analytics and operational systems, with strength in normalization, denormalization, ERDs, and data governance practices. Experience in star/snowflake schemas and medallion architecture is preferred. The role requires close collaboration with architects, engineers, and analysts.

Keywords: data modeling, normalization, denormalization, star and snowflake schemas, medallion architecture, ERD, logical data model, physical data model, conceptual data model

Posted 1 month ago

Apply

8.0 - 10.0 years

16 - 20 Lacs

Kolkata

Work from Office

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions
- Mine and analyze data from company databases to drive optimization and improvement of business strategies
- Assess the effectiveness and accuracy of data sources and data-gathering techniques
- Develop custom data models and algorithms to apply to data sets
- Use predictive modelling to increase and optimize business outcomes
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes
- Coordinate with different functional teams to implement models and monitor outcomes
- Develop processes and tools to monitor and analyze model performance and data accuracy
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs
- Business improvements pre and post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software)

Qualifications and Education Requirements:
- Graduate (BSc/BTech) in applied sciences with year-2 statistics courses
- Relevant internship (at least 2 months) or relevant certifications in the preferred skills

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development
- Experience with the following coding languages: R or Python (data cleaning, statistical, and modelling packages), SQL, VBA, and DAX (Power BI)
- Knowledge of working with and creating data architectures
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications
- Excellent written and verbal communication skills for coordinating across teams
- A drive to learn and master new technologies and techniques

Compliance Requirements:
- GET has a Business Ethics Policy that guides all employees in their day-to-day roles and helps you and the business comply with the law at all times. The incumbent must read, understand, and comply with the policy at all times, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded

Skills:
- Analytical skills
- Negotiation
- Convincing skills

Key Competencies:
- Never-give-up attitude
- Flexible
- Eye for detail

Experience: Minimum 8 years of experience

Posted 1 month ago

Apply

10.0 - 11.0 years

24 - 30 Lacs

Kochi

Work from Office

7+ years in data architecture, including 3+ years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services; data warehousing, data lakes, and real-time data pipelines; SQL and Python.

Posted 1 month ago

Apply

4.0 - 8.0 years

14 - 20 Lacs

Mohali

Work from Office

We're looking for a fast-moving, detail-oriented Data Engineer with deep experience in AWS data services, ETL processes, and reporting tools like QuickSight, Tableau, and Power BI. You'll play a key role in building and maintaining the data infrastructure that powers reporting, analytics, and strategic decision-making across the organization.

Key Responsibilities:
- Design, develop, and maintain scalable ETL pipelines to process and transform data from multiple sources
- Build and optimize data models, data lakes, and data warehouses using AWS services such as AWS Glue, Athena, Redshift, S3, Lambda, and QuickSight
- Collaborate with business teams and analysts to deliver self-service reporting and dashboards via QuickSight
- Ensure data integrity, security, and performance across all reporting platforms
- Support cross-functional teams with ad hoc data analysis and the development of custom reporting solutions
- Monitor data pipelines and troubleshoot issues as they arise

Preferred Qualifications:
- 4+ years of experience as a Data Engineer or in a similar role
- Strong hands-on experience with the AWS data ecosystem, particularly AWS Glue (ETL), Redshift, S3, Athena, Lambda, and QuickSight
- Proficiency in SQL and scripting languages such as Python
- Experience working with QuickSight, Tableau, and Power BI in a production environment
- Strong understanding of data architecture, data warehousing, and dimensional modeling
- Familiarity with data governance, quality checks, and best practices for data privacy and compliance
- Comfortable working in a fast-paced, agile environment with shifting priorities
- Excellent communication and collaboration skills

Nice to Have:
- Experience with DevOps for data: Terraform, CloudFormation, or CI/CD pipelines for data infrastructure
- Background in financial services, SaaS, or a data-heavy industry

Why Join Us?
- Make a direct impact on high-visibility analytics and reporting projects
- Work with modern cloud-native data tools and a collaborative, high-performance team
- Flexible work environment with opportunities for growth and leadership

Shift timings: Swing shift, 2:30 PM to 11:30 PM; cab and food will be provided.

Posted 1 month ago

Apply

12.0 - 16.0 years

3 - 4 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Monitor and maintain data pipeline reliability, including logging, alerting, and troubleshooting failures
- Apply good knowledge of artificial intelligence and machine learning
- Design, develop, and optimize relational and NoSQL databases for diverse applications, including AI and large-scale data processing
- Build and manage ETL/ELT pipelines to ensure efficient data processing and transformation
- Optimize database performance for high-availability applications, reporting, and analytics
- Implement data partitioning, indexing, and sharding strategies for scalability
- Ensure data integrity, governance, and security across multiple applications
- Collaborate with teams to streamline data access, model storage, and training workflows when applicable
- Develop SQL queries, stored procedures, and views for efficient data retrieval
- Monitor and troubleshoot database performance, bottlenecks, and failures

Required Skills & Qualifications:
- Strong SQL expertise (writing complex queries, optimization, stored procedures, indexing)
- Experience with relational databases (PostgreSQL, SQL Server) and NoSQL databases (MongoDB, Redis)
- Knowledge of AI-related database optimizations, such as vector databases (e.g., Pinecone, FAISS, Weaviate) for embedding storage and retrieval, is a plus
- Experience working with enterprise data workflows, including data modeling and architecture
- Dimensional modeling / data warehousing: experience with dimensional modeling (star/snowflake schemas) and data warehousing concepts (e.g., Kimball, Inmon)
- Metadata management and data catalogs: familiarity with metadata management, data catalogs, or data lineage tools (e.g., Alation, Data Catalog in GCP, AWS Glue Data Catalog)
- Hands-on experience with cloud platforms (AWS, Azure, GCP) and cloud-based data storage solutions
- Familiarity with big data technologies (Spark, Hadoop, Kafka) is a plus
- Strong Python or SQL scripting skills for data manipulation and automation
- Knowledge of data security, privacy regulations (GDPR, CCPA), and compliance standards
- Unit/integration testing: experience testing data pipelines, including unit and integration testing for transformations
- Documentation: strong documentation practices for pipelines, database schemas, and data governance processes
- Excellent problem-solving skills and the ability to collaborate with cross-functional teams
- Experience with workflow orchestration tools like Apache Airflow or Prefect

Preferred Qualifications:
- Experience with vector databases and retrieval-augmented generation (RAG) workflows
- Understanding of AI model storage, caching, and retrieval from databases when applicable
- Experience with machine learning feature engineering and ML model versioning
- Experience with containerization technologies like Docker or Kubernetes for deploying data solutions
- Data quality and observability tools: experience with tools or frameworks for data quality checks, validation, and data observability (e.g., Great Expectations, Monte Carlo, Databand)

Posted 1 month ago

Apply

12.0 - 15.0 years

16 - 20 Lacs

Chennai

Work from Office

Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED

Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP)
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM; proficiency in data modeling, ETL processes, and data warehousing
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects; ability to balance technical and business needs in designing data solutions
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments; familiarity with big data technologies like Apache Hadoop and Spark
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders; ability to lead and mentor a team of data engineers and architects
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools; proficient in SQL, Python, and data visualization tools like Looker or Power BI

Posted 1 month ago

Apply

12.0 - 15.0 years

15 - 19 Lacs

Chennai

Work from Office

Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED

Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP)
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM; proficiency in data modeling, ETL processes, and data warehousing
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects; ability to balance technical and business needs in designing data solutions
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments; familiarity with big data technologies like Apache Hadoop and Spark
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders; ability to lead and mentor a team of data engineers and architects
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools; proficient in SQL, Python, and data visualization tools like Looker or Power BI

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 11 Lacs

Noida

Work from Office

Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives, and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data & AI and GenAI solutions
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel
- Stay up to date on the latest GCP offerings, trends, and best practices

Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing on-premises relational databases to scalable and cost-effective solutions on Google Cloud Platform (GCP)
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP
- Strong experience in BI reporting tools (Looker, Power BI, and Tableau)
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI)
- Strong knowledge and experience in providing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data
- Strong knowledge of and experience with best practices for data governance, security, and compliance
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs
- Strong analytical and problem-solving skills
- Ability to work independently and as part of a team

Posted 1 month ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Lucknow

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms)
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts
- Proficiency with industry-leading ETL tools (e.g., Informatica)
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics)
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field

Posted 1 month ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Ludhiana

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms)
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts
- Proficiency with industry-leading ETL tools (e.g., Informatica)
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics)
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field

Posted 1 month ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms)
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts
- Proficiency with industry-leading ETL tools (e.g., Informatica)
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics)
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field

Posted 1 month ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Patna

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms)
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts
- Proficiency with industry-leading ETL tools (e.g., Informatica)
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics)
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field

Posted 1 month ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Surat

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms)
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts
- Proficiency with industry-leading ETL tools (e.g., Informatica)
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics)
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field

Posted 1 month ago

Apply

8.0 - 10.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Position: Solution Architect (ETL)
Location: Bangalore
Experience: 8+ years
CTC: As per industry standards
Immediate joiners preferred

# Job Summary
We are seeking an experienced Solution Architect (ETL) to design and implement data integration solutions using ETL (extract, transform, load) tools. The ideal candidate will have a strong background in data warehousing, ETL, and data architecture.

# Key Responsibilities
1. Design and implement ETL solutions: Design and implement ETL solutions using tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator.
2. Data architecture: Develop and maintain data architectures that meet business requirements and ensure data quality and integrity.
3. Data warehousing: Design and implement data warehouses that support business intelligence and analytics.
4. Data integration: Integrate data from various sources, including databases, files, and APIs.
5. Data quality and governance: Ensure data quality and governance by implementing data validation, data cleansing, and data standardization processes.
6. Collaboration: Collaborate with cross-functional teams, including business stakeholders, data analysts, and IT teams.
7. Technical leadership: Provide technical leadership and guidance to junior team members.

# Requirements
1. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
2. Experience: Minimum 8 years of experience in ETL development, data warehousing, and data architecture.
3. Technical skills: ETL tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator; data warehousing and business intelligence tools such as Oracle, Microsoft, or SAP; programming languages such as Java, Python, or C#; data modeling and data architecture concepts.
4. Soft skills: Excellent communication and interpersonal skills; strong problem-solving and analytical skills; ability to work in a team environment and lead junior team members.

# Nice to Have
1. Certifications: Certifications in ETL tools, data warehousing, or data architecture.
2. Cloud experience: Experience with cloud-based data integration and data warehousing solutions.
3. Big data experience: Experience with big data technologies such as Hadoop, Spark, or NoSQL databases.

# What We Offer
1. Competitive salary and benefits package.
2. Opportunities for professional growth and career advancement.
3. Collaborative work environment with a team of experienced professionals.

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer

Project Role Description: Design, build, and configure applications to meet business process and application requirements.

Must-have Skills: Data Architecture Principles
Good-to-have Skills: Python (Programming Language), Data Building Tool, Snowflake Data Warehouse
Minimum Experience: 12 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating and implementing innovative solutions to enhance business processes and meet application needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the team in implementing data architecture principles effectively
- Develop and maintain data models and databases
- Ensure data integrity and security measures are in place

Professional & Technical Skills:
- Must-have: Proficiency in Data Architecture Principles
- Good-to-have: Experience with Python (Programming Language), Snowflake Data Warehouse, Data Building Tool
- Strong understanding of data architecture principles
- Experience in designing and implementing data solutions
- Knowledge of data modeling and database design

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years of full-time education

Posted 1 month ago

Apply

15.0 - 20.0 years

15 - 19 Lacs

Pune

Work from Office

Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: Google Cloud Machine Learning Services
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

As an AI/ML technical lead, you will develop applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and GenAI models as part of the solution, utilizing deep learning, neural networks, and chatbots.

Roles & Responsibilities:
- Design and develop CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, and Agent Assist
- Develop and implement chatbot solutions that integrate seamlessly with CCAI and other Cloud services
- Integrate Dialogflow agents with various platforms, such as Google Assistant, Facebook Messenger, Slack, and websites; hands-on experience with IVR integration and telephony systems such as Twilio, Genesys, and Avaya
- Integrate with IVR systems; proficiency in webhook setup and API integration
- Develop Dialogflow CX flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks (a webhook sketch follows this posting)
- Create agents in Agent Builder and integrate them into an end-to-end pipeline using Python
- Apply GenAI/Vertex AI models as part of the solution, utilizing deep learning, neural networks, chatbots, and image processing
- Work with Google Vertex AI to build, train, and deploy custom AI models that enhance chatbot capabilities
- Implement and integrate backend services (using Google Cloud Functions or other APIs) to fulfill user queries and actions
- Document technical designs, processes, and setup for various integrations
- Experience with programming languages such as Python/Node.js

Professional & Technical Skills:
- Must-Have Skills: Hands-on CCAI/Dialogflow CX experience and an understanding of generative AI
- Good-To-Have Skills: Cloud Data Architecture; Cloud ML/PCA/PDE certification
- Strong understanding of AI/ML algorithms and techniques
- Experience with chatbots, generative AI models, and prompt engineering
- Experience with cloud or on-prem application pipelines of production-ready quality

Additional Information:
- The candidate should have a minimum of 7 years of experience in Google Cloud Machine Learning Services/GenAI/Vertex AI/CCAI
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions
- 15 years of full-time education is required

Qualification: 15 years full-time education
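
For context on the webhook work listed above, here is a minimal sketch of a Dialogflow CX webhook fulfillment service using Flask; the endpoint path, tag name, and business logic are hypothetical, while the request and response shapes follow the documented CX webhook JSON format:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(silent=True) or {}
    # CX sends the matched webhook tag and session parameters in the request.
    tag = body.get("fulfillmentInfo", {}).get("tag", "")
    params = body.get("sessionInfo", {}).get("parameters", {})

    if tag == "order-status":  # hypothetical tag configured on a CX page
        reply = f"Order {params.get('order_id', 'unknown')} is being processed."
    else:
        reply = "Sorry, I could not handle that request."

    # Respond in the CX fulfillment format so the agent can present the reply.
    return jsonify({
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply]}}]
        }
    })

if __name__ == "__main__":
    app.run(port=8080)
```

In Dialogflow CX, this endpoint would be registered as a webhook resource and attached to a fulfillment on a page or route.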

Posted 1 month ago

Apply

15.0 - 20.0 years

9 - 14 Lacs

Mumbai

Work from Office

Project Role: AI / ML Engineer
Project Role Description: Develop applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Apply GenAI models as part of the solution; work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: Google Cloud Machine Learning Services
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

As an AI/ML technical lead, you will develop applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and GenAI models as part of the solution, utilizing deep learning, neural networks, and chatbots.

Roles & Responsibilities:
- Design and develop CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, and Agent Assist
- Develop and implement chatbot solutions that integrate seamlessly with CCAI and other Cloud services
- Integrate Dialogflow agents with various platforms, such as Google Assistant, Facebook Messenger, Slack, and websites; hands-on experience with IVR integration and telephony systems such as Twilio, Genesys, and Avaya
- Integrate with IVR systems; proficiency in webhook setup and API integration
- Develop Dialogflow CX flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks
- Create agents in Agent Builder and integrate them into an end-to-end pipeline using Python
- Apply GenAI/Vertex AI models as part of the solution, utilizing deep learning, neural networks, chatbots, and image processing (a model-call sketch follows this posting)
- Work with Google Vertex AI to build, train, and deploy custom AI models that enhance chatbot capabilities
- Implement and integrate backend services (using Google Cloud Functions or other APIs) to fulfill user queries and actions
- Document technical designs, processes, and setup for various integrations
- Experience with programming languages such as Python/Node.js

Professional & Technical Skills:
- Must-Have Skills: Hands-on CCAI/Dialogflow CX experience and an understanding of generative AI
- Good-To-Have Skills: Cloud Data Architecture; Cloud ML/PCA/PDE certification
- Strong understanding of AI/ML algorithms and techniques
- Experience with chatbots, generative AI models, and prompt engineering
- Experience with cloud or on-prem application pipelines of production-ready quality

Additional Information:
- The candidate should have a minimum of 7 years of experience in Google Cloud Machine Learning Services/GenAI/Vertex AI/CCAI
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions
- 15 years of full-time education is required

Qualification: 15 years full-time education
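
As a rough illustration of the Vertex AI usage named above, here is a minimal sketch of calling a generative model through the vertexai SDK to draft a chatbot reply (assuming the google-cloud-aiplatform package and a configured GCP project; the project ID, region, model name, and prompt are placeholders):

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project and region; real values come from the deployment config.
vertexai.init(project="my-gcp-project", location="us-central1")

# Placeholder model name; any available Vertex AI generative model would do.
model = GenerativeModel("gemini-1.0-pro")
response = model.generate_content(
    "Summarize the customer's refund options in two sentences."
)
print(response.text)
```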

Posted 1 month ago

Apply