
1817 Data Architecture Jobs - Page 39

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

8 - 12 Lacs

Kolkata, Hyderabad, Pune

Work from Office

Salesforce CRMA Consultant. Job Title: Salesforce CRMA Consultant. Location: Offshore. Duration: 6 Months. We are seeking a highly skilled Salesforce CRM Analytics (CRMA/Tableau CRM) Consultant to join our offshore team on a 6-month FTE engagement. The successful candidate will be responsible for designing and developing interactive dashboards, transforming Salesforce data into actionable insights, and collaborating closely with business stakeholders to meet reporting and analytics needs. Key Responsibilities: Design, develop, and deploy dashboards and visualizations using Salesforce CRMA/Tableau CRM. Create and manage dataflows and recipes for data preparation and transformation. Utilize SAQL, SOQL, and JSON-based dashboards to build customized analytical solutions. Ensure alignment with the Salesforce data model and security protocols. Optimize dashboard performance for scalability and responsiveness. Partner with business and technical teams to understand requirements and translate them into technical solutions. Maintain best practices for CRMA development and documentation. Required Skills: Hands-on experience with Salesforce CRMA (Tableau CRM). Proficiency in SAQL, SOQL, and JSON. Strong understanding of dataflows, recipes, and Salesforce data architecture. Experience with dashboard performance tuning and optimization. Familiarity with the Salesforce data security model (sharing rules, field-level security, etc.). Location: Pune, Hyderabad, Kolkata, Jaipur, Chandigarh
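
For context on the SAQL work mentioned above, here is a minimal, hedged sketch of running a SAQL query against the CRM Analytics REST query endpoint from Python. The instance URL, access token, dataset name, and field names are placeholders rather than details from this posting, and depending on the org the load statement may need the dataset ID and current version ID instead of an alias.

```python
import requests

# Minimal sketch: run a SAQL query via the CRM Analytics REST API.
# INSTANCE_URL, ACCESS_TOKEN, the dataset reference, and field names below
# are placeholders, not values from the job posting.
INSTANCE_URL = "https://your-org.my.salesforce.com"
ACCESS_TOKEN = "00D...session_token"

saql = """
q = load "opportunity_dataset";
q = group q by 'StageName';
q = foreach q generate 'StageName' as 'Stage', sum('Amount') as 'TotalAmount';
q = order q by 'TotalAmount' desc;
q = limit q 10;
"""  # note: some orgs require load "<datasetId>/<versionId>" here

resp = requests.post(
    f"{INSTANCE_URL}/services/data/v59.0/wave/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Content-Type": "application/json"},
    json={"query": saql},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json().get("results", {}).get("records", []):
    print(record["Stage"], record["TotalAmount"])
```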

Posted 1 month ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Join us as a Data & Analytics Analyst. This is an opportunity to take on a purpose-led role in a cutting-edge Data & Analytics team. You'll be consulting with our stakeholders to understand their needs and identify suitable data and analytics solutions to meet them, along with the associated business challenges, in line with our purpose. You'll bring advanced analytics to life through visualisation to tell powerful stories and influence important decisions for key stakeholders, giving you excellent recognition for your work. We're offering this role at associate vice president level. What you'll do: As a Data & Analytics Analyst, you'll be driving the use of advanced analytics in your team to develop business solutions which increase the understanding of our business, including its customers, processes, channels and products. You'll be working closely with business stakeholders to define detailed, often complex and ambiguous business problems or opportunities which can be supported through advanced analytics, making sure that new and existing processes are designed to be efficient, simple and automated where possible. As well as this, you'll be: Leading and coaching your colleagues to plan and deliver strategic project and scrum outcomes. Planning and delivering data and analytics resource, expertise and solutions which bring commercial and customer value to business challenges. Communicating data and analytics opportunities and bringing them to life in a way that business stakeholders can understand and engage with. Adopting and embedding new tools, technologies and methodologies to carry out advanced analytics. Developing strong stakeholder relationships to bring together advanced analytics, data science and data engineering work that is easily understandable and links back clearly to our business needs. The skills you'll need: We're looking for someone with a passion for data and analytics together with knowledge of data architecture, key tooling and relevant coding languages. Along with advanced analytics knowledge, you'll bring an ability to simplify data into clear data visualisations and compelling insight using appropriate systems and tooling. You'll also demonstrate: Strong knowledge of data management practices and principles. Experience of translating data and insights for key stakeholders. Good knowledge of data engineering, data science and decisioning disciplines. Strong communication skills with the ability to engage with a wide range of stakeholders. Coaching and leadership experience with an ability to support and motivate colleagues. Hours: 45. Job Posting Closing Date: 06/07/2025

Posted 1 month ago

Apply

8.0 - 13.0 years

32 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Data Scientist Architect. Location: Pan India (hybrid). Experience: 8+ years. Position Overview: We are seeking a Data Scientist Architect to lead and drive data science and architecture initiatives within Brillio. The ideal candidate will have a deep understanding of data science, data engineering, and architecture, and be highly proficient in implementing cutting-edge solutions using tools like Databricks, AWS, and Bedrock/Mistral. The role requires an individual with extensive experience in designing, building, and deploying large-scale data systems and machine learning models, along with the ability to lead and mentor cross-functional teams. As a Data Scientist Architect, you will have the opportunity to innovate and make a lasting impact across our diverse client base, providing them with tailored solutions that drive their data strategy forward. Key Responsibilities: Lead Data Architecture & Science Initiatives: Design and implement advanced data architectures and solutions to support complex data science and machine learning workflows. Build and deploy scalable, production-grade data pipelines and models leveraging cloud platforms like AWS and tools like Databricks. Architect solutions involving large-scale data ingestion, transformation, and storage, focusing on performance, scalability, and reliability. Platform Development & Integration: Implement and manage cloud-based infrastructure for data engineering, analytics, and machine learning on platforms like AWS, leveraging services like S3, Lambda, EC2, etc. Work with Bedrock/Mistral to deploy and manage machine learning models at scale, ensuring continuous optimization and improvement. Skills and Qualifications: Experience: 8+ years of experience in Data Science and Data Architecture, with a focus on large-scale data systems and cloud platforms. Proven track record of leading data science architecture projects from inception to deployment. Technical Skills: Proficiency in Databricks, AWS (S3, EC2, Lambda, Redshift, SageMaker, etc.), and Bedrock/Mistral.
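
As an illustration of the Bedrock/Mistral piece of this stack, below is a minimal sketch of invoking a Mistral model on Amazon Bedrock with boto3. The region, model ID, prompt, and response parsing are assumptions to check against the models actually enabled in the target account.

```python
import json
import boto3

# Minimal sketch: call a Mistral-family model hosted on Amazon Bedrock.
# Region and modelId are illustrative assumptions; the response shape shown
# below is the one documented for Mistral models and should be verified.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "prompt": "<s>[INST] Summarize last quarter's data pipeline incidents. [/INST]",
    "max_tokens": 256,
    "temperature": 0.2,
}

response = client.invoke_model(
    modelId="mistral.mistral-7b-instruct-v0:2",  # example model id, verify availability
    body=json.dumps(body),
    contentType="application/json",
    accept="application/json",
)

payload = json.loads(response["body"].read())
print(payload["outputs"][0]["text"])
```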

Posted 1 month ago

Apply

6.0 - 11.0 years

5 - 5 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office

Role & responsibilities: Hands-on experience in the relevant field (within the essential post-basic-qualification experience requirement) in: Proven track record in Database Performance Tuning, Database Security, Promoting Process Improvement, Problem Solving, Presenting Technical Information, Quality Focus, Database Management, Data Maintenance, Operating Systems, Attention to Detail, Information Security Policies. Proven enterprise database administration experience in Oracle/Microsoft SQL Server. Proficient in design and development of relational (and optionally hierarchical) databases, including the ability to design for performance, scalability, availability, flexibility and extensibility while meeting security requirements. Strong knowledge of database engineering activities, including installation and configuration of RDBMS software. Work experience as a Database Administrator (DBA): Oracle Database 19c, RAC configuration, Oracle GoldenGate configuration. Experience with any Application Performance Monitoring tool such as Dynatrace/AppDynamics. BASIC QUALIFICATION (AS ON 31.03.2025): B.Tech/B.E. in Computer Science/Computer Science & Engineering/Information Technology/Electronics/Electronics & Communications Engineering or an equivalent degree in the above specified disciplines with a minimum 60% score, or MCA or equivalent, or M.Tech/M.Sc. in Computer Science/Computer Science & Engineering/Information Technology/Electronics/Electronics & Communications Engineering or an equivalent degree in the above specified disciplines. (From a University/Institution/Board recognised by Govt. of India/approved by Govt. Regulatory Bodies.) OTHER QUALIFICATION (AS ON 31.03.2025): Preferred Certifications (valid as on 31.03.2025): Oracle Certified Associate (OCA) Database Administrator, Oracle Certified Professional (OCP), SQL Queries and PL/SQL. WORK EXPERIENCE: Minimum 6 years post-qualification experience in the IT industry.
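
As a small illustration of the routine DBA scripting implied above, here is a hedged sketch using the python-oracledb driver to list the top wait events for active sessions. The connection details are placeholders, and the query assumes SELECT access to the v$session view.

```python
import oracledb

# Minimal sketch: a health check a DBA might script against an Oracle 19c
# instance. User, password, and DSN are placeholders.
conn = oracledb.connect(user="monitor", password="***", dsn="dbhost:1521/ORCLPDB1")

cur = conn.cursor()
# Top non-idle wait events for currently active sessions
cur.execute("""
    SELECT event, COUNT(*) AS sessions
    FROM   v$session
    WHERE  status = 'ACTIVE' AND wait_class <> 'Idle'
    GROUP  BY event
    ORDER  BY sessions DESC
""")
for event, sessions in cur.fetchall():
    print(f"{event}: {sessions}")

cur.close()
conn.close()
```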

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Do you want to help create the future of healthcare? Our name, Siemens Healthineers, was selected to honor our people who dedicate their energy and passion to this cause. It reflects their pioneering spirit combined with our long history of engineering in the ever-evolving healthcare industry. We offer you a flexible and dynamic environment with opportunities to go beyond your comfort zone in order to grow personally and professionally. Sound interesting? Then come and join our global team as an Enterprise Architect (f/m/d) in IT to design the enterprise architecture for a large business unit or the entire company and to be responsible for the application landscape as well as for the technologies and development tools used. Your tasks and responsibilities: You will be responsible for enterprise architecture management (including business IT alignment and analysis of the application portfolio) of a large business unit or process domain and derive IT strategies from business requirements, ensuring alignment with the overall enterprise architecture. You will drive the architecture roadmap and the application and data architecture for the business unit with a focus on security, scalability, and reliability of the IT landscape. You will prepare decisions on the use of new technologies and platforms. You will model IT architecture and processes and promote consistent design, planning and implementation of IT solutions. You will be responsible for the coordination of communication with all important decision makers and relevant stakeholders and advise them on the development of the IT landscape. You will drive the composition of the IT landscape and balance organizational needs with enterprise architecture decisions and objectives. You will identify digitalization opportunities and synergies within the system landscape and represent system interrelationships holistically. To find out more about the specific business, have a look at https://www.siemens-healthineers.com/products-services. Your qualifications and experience: You have a degree in Computer Science, Industrial Engineering or a comparable qualification. You have 10+ years of experience in global IT organizations, ideally in a mix of operational and architecture roles. You have 5+ years of experience as a solution/application or enterprise architect. Based on your very good understanding of complex IT processes and your openness to new technologies, you have acquired in-depth knowledge of software development, application management, enterprise architecture, enterprise architecture methodologies, governance structures and frameworks (e.g. TOGAF). You also have deep technological expertise and several years of experience in complex technology landscapes. Functional or IT implementation experience across all key IT functions with a focus on PLM, SCM, Order-to-Cash and Accounting. In-depth knowledge in at least one business domain such as CRM/Sales, R&D/PLM or SCM. You have experience in business process analysis and modelling. You bring several years of proven experience working with different business process management models using Enterprise Architecture tools such as LeanIX or BizzDesign. Further, you have a very good understanding of the interrelationships between functional business and technical IT structures. Your attributes and skills: For working with specialist departments at home and abroad, we require very good English language skills, both spoken and written.
Ideally you also have very good German language skills. You are an organizational talent and impress with good communication and presentation skills, at very different levels in the organizational hierarchy. You are a team player with a high level of social competence who can operate confidently in a global environment. We don't compromise on quality: you work results- and quality-oriented with high commitment and possess good analytical and conceptual skills. You are flexible in thought and action, have a quick grasp and constructive assertiveness. Our global team: Siemens Healthineers is a leading global medical technology company. 50,000 dedicated colleagues in over 70 countries are driven to shape the future of healthcare. An estimated 5 million patients across the globe benefit every day from our innovative technologies and services in the areas of diagnostic and therapeutic imaging, laboratory diagnostics and molecular medicine, as well as digital health and enterprise services. Our culture: Our culture embraces different perspectives, open debate, and the will to challenge convention. Change is a constant aspect of our work. We aspire to lead the change in our industry rather than just react to it. That's why we invite you to take on new challenges, test your ideas, and celebrate success. Check our Careers Site at https://www.siemens-healthineers.com/de/careers. As an equal opportunity employer, we welcome applications from individuals with disabilities.

Posted 1 month ago

Apply

15.0 - 20.0 years

20 - 25 Lacs

Noida

Work from Office

Experience: 15+ years. Requires knowledge of Talend, along with knowledge of other data-related tools such as Databricks or Snowflake. The Senior Talend Developer/Architect is responsible for leading the design and development of, and managing, the INSEAD data infrastructure for the CRM ecosystem, for developing Talend jobs and flows, and for acting as a mentor for the other 3-4 Talend developers. This role will be instrumental in driving data pipeline architecture and ensuring data integrity, performance, and scalability using the Talend platform. The role is a key part of the HARMONIA project team while the engagement is active, and will also be part of the Harmonia Data Quality project and Data Operations scrum teams. It will contribute to additional activities such as data modelling and design, architecture, and integration, and will propose technology strategy. The position holder must organize and plan her/his work with special consideration for a frictionless information flow between Digital Solutions and the relevant business department. He/she will collaborate closely with cross-functional teams to deliver high-quality data solutions that support strategic business objectives. Job Requirements Details: Design, develop, and deploy scalable ETL/ELT solutions using Talend (e.g. Data Stewardship, Management Console, Studio). Architect end-to-end data integration workflows. Establish development best practices, reusable components, and job templates to optimize performance and maintainability. Responsible for delivering robust data architecture and tested, validated, deployable jobs/flows to production environments, following Talend best practices and the JIRA-based development framework. Translate business requirements into efficient and scalable Talend solutions, and assist with developer input/feedback on those requirements wherever necessary, by actively leading brainstorming sessions arranged by the project manager. Work closely with the Manager of Data Operations and Quality, the project manager, business analysts, data analysts, developers and other subject matter experts to align technical solutions with operational needs. Ensure alignment with data governance, security, and compliance standards. Ensure that new developments follow the standard styles already present in the current Talend jobs and flows developed by the current integrator and INSEAD teams. Actively participate in project-related activities and ensure the SDLC process is followed. Participate in the implementation and execution of data cleansing, normalization, deduplication and transformation projects. Conduct performance tuning, error handling, monitoring, and troubleshooting of Talend jobs and environments. Contribute to sprint planning and agile ceremonies with the Harmonia project team and Data Operations team. Document technical solutions, data flows, and design decisions to support operational transparency. Stay current with Talend product enhancements and industry trends, recommending upgrades or changes where appropriate. No budget responsibility.
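
Talend jobs themselves are built in Talend Studio rather than hand-coded, so the sketch below is only a language-neutral illustration (in Python/pandas) of the cleansing, normalization, and deduplication logic such a CRM data-quality job would implement. File and column names are hypothetical.

```python
import pandas as pd

# Illustrative sketch (not Talend code): the kind of cleansing, normalization,
# and deduplication step a CRM data-quality flow would perform.
contacts = pd.read_csv("crm_contacts_raw.csv")  # hypothetical input file

# Normalize: trim whitespace, lower-case emails, keep only digits in phone numbers
contacts["email"] = contacts["email"].str.strip().str.lower()
contacts["phone"] = contacts["phone"].str.replace(r"\D", "", regex=True)

# Deduplicate: keep the most recently modified record per email address
contacts = (
    contacts.sort_values("last_modified", ascending=False)
            .drop_duplicates(subset="email", keep="first")
)

contacts.to_csv("crm_contacts_clean.csv", index=False)
print(f"{len(contacts)} unique contacts written")
```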

Posted 1 month ago

Apply

1.0 - 6.0 years

15 - 19 Lacs

Bengaluru

Work from Office

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Responsibilities: Build, train, and deploy ML models using Python on Azure/AWS. 1+ years of experience in building machine learning and deep learning models in Python. Experience working on Azure ML/AWS SageMaker. Ability to deploy ML models with REST-based APIs. Proficient in distributed computing environments/big data platforms (Hadoop, Elasticsearch, etc.) as well as common database systems and value stores (SQL, Hive, HBase, etc.). Ability to work directly with customers, with good communication skills. Ability to analyze datasets using SQL and Pandas. Experience working on Azure Data Factory and Power BI. Experience with PySpark, Airflow, etc. Experience working on Docker/Kubernetes. Mandatory skill sets: Data Science, Machine Learning. Preferred skill sets: Data Science, Machine Learning. Education qualification: B.Tech/M.Tech/MBA/MCA. Education Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Master of Engineering. Degrees/Field of Study preferred: Required Skills: Data Science. Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Travel Requirements Government Clearance Required?
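
To illustrate the "deploy ML models with REST-based APIs" requirement, here is a minimal sketch that serves a previously trained scikit-learn model behind a FastAPI endpoint. The model file and feature names are illustrative assumptions.

```python
# Minimal sketch: exposing a fitted scikit-learn model as a REST endpoint.
# model.joblib and the feature names below are hypothetical.
import joblib
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="churn-model")
model = joblib.load("model.joblib")  # assumed to be a fitted classifier

class Features(BaseModel):
    tenure_months: float
    monthly_spend: float
    support_tickets: int

@app.post("/predict")
def predict(features: Features):
    # Build a one-row frame matching the training feature order
    frame = pd.DataFrame([features.dict()])
    proba = float(model.predict_proba(frame)[0, 1])
    return {"churn_probability": proba}

# Run locally with: uvicorn app:app --reload
```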

Posted 1 month ago

Apply

1.0 - 6.0 years

15 - 19 Lacs

Bengaluru

Work from Office

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Responsibilities: Build, train, and deploy ML models using Python on Azure/AWS. 1+ years of experience in building machine learning and deep learning models in Python. Experience working on Azure ML/AWS SageMaker. Ability to deploy ML models with REST-based APIs. Proficient in distributed computing environments/big data platforms (Hadoop, Elasticsearch, etc.) as well as common database systems and value stores (SQL, Hive, HBase, etc.). Ability to work directly with customers, with good communication skills. Ability to analyze datasets using SQL and Pandas. Experience working on Azure Data Factory and Power BI. Experience with PySpark, Airflow, etc. Experience working on Docker/Kubernetes. Mandatory skill sets: Data Science, Machine Learning. Preferred skill sets: Data Science, Machine Learning. Education qualification: B.Tech/M.Tech/MBA/MCA. Education Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering. Degrees/Field of Study preferred: Required Skills: Data Science. Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Travel Requirements Government Clearance Required?

Posted 1 month ago

Apply

4.0 - 7.0 years

6 - 10 Lacs

Kolkata

Work from Office

In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC? Learn more about us. Requirements: Experience of 4 to 7 years with adequate knowledge of Scala's object-oriented programming. Scala code written in the backend is the basis of the finance module reports, which are accessed via QuickSight. You will assess the Scala code written for the finance module reports, identify issues, and fix them. Mandatory skill sets: Scala and OOP. Preferred skill sets: Scala and OOP. Education qualification: B.Tech/MBA/MCA. Education Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration. Degrees/Field of Study preferred: Required Skills: Scala (Programming Language). Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more} No

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Founded in 1976, CGI is among the world's largest independent IT and business consulting services firms. With 94,000 consultants and professionals globally, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com. Job Title: Perl Scripting Developer / Lead. Position: SSE / LA. Experience: 5 to 8 years. Category: Software Development/Engineering. Job location: Bangalore or Chennai. Position ID: J0425-1535. Work Type: Hybrid. Employment Type: Full Time / Permanent. Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. The Technical Expert is responsible for performing the following tasks: Handling assistance requests, corrections, and enhancements. Drafting technical specifications. Producing estimates for minor and major enhancements. Carrying out unit and integration testing. Anticipating system failures or degradations as part of preventive maintenance. Updating documentation. Supervising the application. Improving platform reliability. Technical skills: Excellent command of scripting: Perl, Shell, preferably in an AIS environment. Strong expertise in Java/J2EE (JDBC, JSP, Servlet). Excellent command of SQL/Oracle/PostgreSQL. Good command of development tools: Git, an IDE, Linux system commands. Proficiency with Tomcat, SOAP and REST web services. Mastery of DevOps tools: Git, GitHub, CI/CD pipelines (DevOps Pipeline, GitHub Actions, Jenkins Pipeline). Experience working in Agile Scrum. #LI-GB9 Skills: Data Analysis, Data Architecture, Perl

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Kolkata

Work from Office

Some careers have more impact than others. If you're looking for a career where you can make a real impression, join HSBC and discover how valued you'll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions. We are currently seeking an experienced professional to join our team in the role of Product Owner, Change and Digital Enablement, Regulatory Compliance. Principal responsibilities: The individual will be responsible for reporting the RC AI & Analytics scorecard and key performance indicators in a timely and accurate manner. Promote a culture of data-driven decision making, aligning short-term decisions and investments with the longer-term vision and objectives. Help the business manage regulatory risk in a more effective, efficient, and commercial way through the adoption of data science (AI/ML and advanced analytics). Support communication and engagement with stakeholders and partners to increase understanding and adoption of data science products and services, and to research opportunities. Collaborate with other analytics teams across the bank to share insight and best practice. Foster a collaborative, open and agile delivery culture. Build positive momentum for change across the organization with the active support and buy-in of all stakeholders. The ability to communicate often complex analytical solutions to the wider department, ensuring a strong transfer of key findings and intelligence. Requirements: University degree in technology, data analytics or a related discipline, or relevant work experience in computer or data science. Understanding of Regulatory Compliance and its risks, and direct experience of deploying controls and analytics to manage those risks. Experience in Financial Services (experience within a tier-one bank) or a related industry; knowledge of the HSBC Group structure, its business and personnel, and HSBC's corporate culture. Experience of agile development and active contribution to strategy and innovation. Solid understanding of applied mathematics, statistics, data science principles and advanced computing (machine learning, modelling, NLP and Generative AI). Experience working within the Hadoop ecosystem in addition to strong technical skills in analytical languages such as Python and SQL. Specific knowledge of GCP, AWS, Azure, Spark and/or graph theory is an advantage. Experience of visualization tools and techniques including Qlik and Tableau. Solid understanding of data and architecture concepts including cloud, ETL, Ontology, and Data Modelling. Experience of using JIRA, GIT, Confluence, Teams, Slack, and Advanced Excel. You'll achieve more at HSBC.
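
As a small illustration of Hadoop-ecosystem analytics in Python of the kind referenced above, below is a hedged PySpark sketch that aggregates open compliance alerts. The dataset path and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch: a PySpark aggregation of the kind used for compliance
# analytics. The HDFS path and column names are placeholders.
spark = SparkSession.builder.appName("rc-analytics-sketch").getOrCreate()

alerts = spark.read.parquet("hdfs:///data/compliance/alerts")

summary = (
    alerts.filter(F.col("status") == "OPEN")
          .groupBy("business_line", "risk_type")
          .agg(F.count("*").alias("open_alerts"),
               F.avg("age_days").alias("avg_age_days"))
          .orderBy(F.desc("open_alerts"))
)

summary.show(20, truncate=False)
```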

Posted 1 month ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Role Overview: We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities: Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. MS Dynamics and Data Lake Integration: - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration: - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance: - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. Issue Resolution and Troubleshooting: - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication: - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work well both as an individual contributor and as a team player. Qualifications: Experience: - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills: - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS, etc.). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills: - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity.
- Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.
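
To illustrate the Azure Data Factory side of such an integration, here is a hedged sketch that triggers an ADF pipeline run (for example, a Dynamics-to-data-lake copy) using the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Minimal sketch: start an ADF pipeline run and check its status.
# Subscription, resource group, factory, pipeline, and parameters are placeholders.
credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")

run = adf.pipelines.create_run(
    resource_group_name="rg-data-platform",
    factory_name="adf-enterprise",
    pipeline_name="pl_copy_dynamics_to_lake",
    parameters={"loadDate": "2025-01-31"},
)
print("Started pipeline run:", run.run_id)

status = adf.pipeline_runs.get("rg-data-platform", "adf-enterprise", run.run_id)
print("Current status:", status.status)
```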

Posted 1 month ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Lucknow

Work from Office

Role Overview: We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities: Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. MS Dynamics and Data Lake Integration: - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration: - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance: - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. Issue Resolution and Troubleshooting: - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication: - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work well both as an individual contributor and as a team player. Qualifications: Experience: - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills: - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS, etc.). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills: - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity.
- Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 1 month ago

Apply

3.0 - 8.0 years

12 - 20 Lacs

Bengaluru

Hybrid

Role: Senior Data Science Engineer. Sabre is the global leader in innovative technology that leads the travel industry. We are always looking for bright and driven people who have a penchant for technology, are hackers at heart and want to hone their skills. If you are interested in challenging work, being part of a global team, and solving complex problems through technology, business intelligence and analytics, and Agile practices, then Sabre is right for you! It is our people who develop and deliver powerful solutions that meet the current and future needs of our airline, hotel, and travel agency customers. Sabre is seeking a talented senior full-stack software engineer as a Senior Data Science Engineer for the SabreMosaic team. At Sabre, we're passionate about building data science and data engineering solutions that solve real problems. In this role you will plan, design, develop and test data science/data engineering software systems or applications for software enhancements and new products based on cloud-based solutions. Role and Responsibilities: Develops, codes, tests and debugs new complex data-driven software solutions or enhancements to existing products. Designs, plans, develops and improves applications using advanced cloud-native technology. Works on issues where analysis of situations or data requires an in-depth knowledge of organizational objectives. Implements strategic policies when selecting methods and techniques. Encourages high coding standards, best practices and high quality. Regularly interacts with subordinate supervisors, architects, product managers and HR on matters concerning projects or team performance; requires the ability to change the thinking of, or gain acceptance from, others in sensitive situations without damage to the relationship. Provides technical mentorship and cultural/competency-based guidance to teams. Provides larger business/product context. Mentors on specific tech stacks/technologies. Qualifications and Education Requirements: Minimum 4-6 years of related experience as a full-stack developer. Expert in the field of Data Engineering/DW projects with Google Cloud based data engineering solutions. Designing and developing enterprise data solutions on the GCP cloud platform with native or third-party data technologies. Good working experience with relational databases and NoSQL databases, including but not limited to Oracle, Spanner, BigQuery, etc.
Expert-level SQL skills for data manipulation (DML) and data validation. Experience in design and development of data modeling, data design, data warehouses, data lakes and analytics platforms on GCP. Expertise in designing ETL data pipelines and data processing architectures for data warehouses. Experience in technical design and building of both streaming and batch processing systems. Good data warehousing experience in designing Star and Snowflake schemas and knowledge of dimensional data modelling. Work with data scientists, data teams and engineering teams to use Google Cloud Platform to analyze data and build data models on BigQuery, Bigtable, etc. Working experience in integrating different datasets from multiple data sources for data modelling for analytical and AI/ML models. Take ownership of production deployment of code. Understanding of and experience in Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, Docker. Expertise in Java Spring Boot, Python or other programming languages used for data engineering and integration projects. Strong problem-solving and analytical skills. AI/ML exposure, MLOps and Vertex AI experience is a great advantage. Familiarity with DevOps practices like CI/CD pipelines. Airline domain experience is a plus. Excellent spoken and written communication skills. GCP Professional Data Engineer certification is a plus.
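
As a small illustration of the GCP data engineering work described above, here is a hedged sketch that runs a batch aggregation in BigQuery with the google-cloud-bigquery client. The project, dataset, table, and column names are placeholders.

```python
from google.cloud import bigquery

# Minimal sketch: a batch aggregation on BigQuery of the kind used to feed
# downstream analytics or models. Project, dataset, and table names are placeholders.
client = bigquery.Client(project="my-travel-analytics")

sql = """
    SELECT origin, destination, COUNT(*) AS bookings
    FROM `my-travel-analytics.mosaic.bookings`
    WHERE booking_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY origin, destination
    ORDER BY bookings DESC
    LIMIT 20
"""

for row in client.query(sql).result():
    print(row.origin, row.destination, row.bookings)
```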

Posted 1 month ago

Apply

7.0 - 12.0 years

5 - 15 Lacs

Bengaluru

Work from Office

BE/B.Tech or equivalent. The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests. 7+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL data platform technologies, and ETL and data ingestion protocols). Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required. Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio or others) required. Experience in team management, communication, and presentation. Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL). Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively. Responsibilities: Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning). Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models. Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization. Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC. Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Required Skills: Data Modeling, Dimensional Modeling, Erwin, Data Management, RDBMS, SQL/NoSQL, ETL
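
To make the dimensional-modeling requirement concrete, here is a minimal star-schema sketch (one fact table, two dimensions) expressed as portable DDL and executed against SQLite purely so the example is self-contained. The table and column names are illustrative; in practice the same model would target the chosen RDBMS or cloud warehouse.

```python
import sqlite3

# Illustrative star schema: a sales fact table joined to date and product
# dimensions via surrogate keys. SQLite is used only to keep the sketch runnable.
ddl = """
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20250131
    full_date    TEXT NOT NULL,
    month        INTEGER,
    year         INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_code TEXT NOT NULL,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    quantity     INTEGER,
    net_amount   REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print("Star schema created:", tables)
```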

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Mohali

Work from Office

We are looking for a Snowflake Developer with 5+ years of experience in Snowflake Data Warehouse and related tools. You will build, manage, and optimize data pipelines, assist in data integration, and contribute to data architecture. The ideal candidate should understand data modeling and ETL processes and have experience with cloud-based data platforms. Key Responsibilities: Design, develop, and maintain Snowflake data warehouses. Create and manage Snowflake schemas, tables, views, and materialized views. Implement ETL processes to integrate data from various sources into Snowflake. Collaborate with Data Engineers, Data Scientists, and Analysts to build efficient data pipelines. Ensure data integrity, security, and compliance with data governance policies. Requirements: Proficient in SQL, SnowSQL, and ETL processes. Strong experience in data modeling and schema design in Snowflake. Experience with cloud platforms (AWS, Azure, or GCP). Familiarity with data pipelines, data lakes, and data integration tools. Experience in using tools like dbt, Airflow, or similar orchestration tools is a plus. Maintain records of conversations with the customer and analyze the data. Handle customer queries over chat and e-mail. Work with us: SourceMash Technologies has been a leading solution provider for internet-based applications and product development since 2008. Be a part of a company staffed by highly skilled professionals dedicated to providing total IT solutions under one roof. We offer remarkable services in the areas of Software Development, Quality Assurance, and Support. An employee welcome kit (custom notepad, T-shirt, water bottle, etc.) is included in the onboarding package. SourceMash Technologies offers employee health insurance that also covers employees' family members under the same policy. Annual leave is paid at the rate applicable in the period worked before the leave, and untaken leave cannot be counted towards the mandatory notice period.
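
As an illustration of the Snowflake development described above, here is a hedged sketch using the Snowflake Python connector to create a schema and table and load staged files with COPY INTO. The account, credentials, warehouse, database, stage, and table names are placeholders.

```python
import snowflake.connector

# Minimal sketch: create a schema and table in Snowflake, then load staged
# CSV files with COPY INTO. All identifiers and credentials are placeholders.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="etl_user",
    password="***",
    warehouse="WH_ETL",
    database="ANALYTICS",
)

cur = conn.cursor()
cur.execute("CREATE SCHEMA IF NOT EXISTS RAW")
cur.execute("""
    CREATE TABLE IF NOT EXISTS RAW.ORDERS (
        order_id   NUMBER,
        order_date DATE,
        amount     NUMBER(12,2)
    )
""")
cur.execute("""
    COPY INTO RAW.ORDERS
    FROM @RAW.ORDERS_STAGE
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
cur.close()
conn.close()
```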

Posted 1 month ago

Apply

12.0 - 15.0 years

13 - 18 Lacs

Gurugram

Work from Office

Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills: Data & AI Strategy. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business strategy and meets the needs of various stakeholders. You will collaborate with cross-functional teams to gather requirements, analyze data flows, and implement best practices in data management, all while maintaining a focus on scalability and performance. Your role will be pivotal in shaping the data landscape of the organization, driving innovation, and enabling data-driven decision-making across the enterprise. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate knowledge sharing and mentoring within the team to enhance overall capabilities. - Develop and maintain documentation related to data architecture and design principles. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data & AI Strategy. - Strong understanding of data modeling techniques and best practices. - Experience with data integration tools and methodologies. - Familiarity with cloud-based data storage solutions and architectures. - Ability to analyze complex data sets and derive actionable insights. Additional Information: - The candidate should have a minimum of 12 years of experience in Data & AI Strategy. - This position is based at our Gurugram office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 month ago

Apply

12.0 - 15.0 years

13 - 18 Lacs

Mumbai

Work from Office

Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills: Data & AI Strategy. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business strategy and meets the needs of various stakeholders. You will collaborate with cross-functional teams to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the development process. Your role will be pivotal in establishing a robust data framework that supports the organization's objectives and enhances data accessibility and usability. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate knowledge sharing and best practices among team members. - Monitor and evaluate the effectiveness of data strategies and make necessary adjustments. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data & AI Strategy. - Strong understanding of data modeling techniques and best practices. - Experience with data integration tools and methodologies. - Ability to design scalable and efficient data architectures. - Familiarity with cloud-based data solutions and platforms. Additional Information: - The candidate should have a minimum of 12 years of experience in Data & AI Strategy. - This position is based at our Mumbai office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 month ago

Apply

12.0 - 15.0 years

13 - 18 Lacs

Mumbai

Work from Office

Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills: Data & AI Strategy. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration. You will collaborate with various teams to ensure that the data architecture aligns with business objectives and supports the overall data strategy. You will also engage in discussions to refine data models and address any challenges that arise during the development process, ensuring that the data architecture is robust and scalable to meet future needs. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate workshops and meetings to gather requirements and feedback from stakeholders. - Mentor junior team members to enhance their skills and understanding of data architecture. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data & AI Strategy. - Strong understanding of data modeling techniques and best practices. - Experience with data integration tools and methodologies. - Familiarity with cloud data services and architectures. - Ability to design and implement data governance frameworks. Additional Information: - The candidate should have a minimum of 12 years of experience in Data & AI Strategy. - This position is based at our Mumbai office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 month ago

Apply

15.0 - 20.0 years

13 - 18 Lacs

Gurugram

Work from Office

Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills: Data & AI Strategy. Good to have skills: NA. Minimum 15 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business objectives and supports efficient data management and accessibility. You will collaborate with various teams to understand their data needs and provide innovative solutions that enhance data utilization across the organization. Roles & Responsibilities: - Expected to be a Subject Matter Expert with deep knowledge and experience. - Should have influencing and advisory skills. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate workshops and discussions to gather requirements and feedback from stakeholders. - Continuously evaluate and improve data architecture practices to ensure scalability and performance. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data & AI Strategy. - Strong understanding of data modeling techniques and best practices. - Experience with data integration tools and methodologies. - Ability to design and implement data governance frameworks. - Familiarity with cloud-based data storage solutions and architectures. Additional Information: - The candidate should have a minimum of 15 years of experience in Data & AI Strategy. - This position is based at our Gurugram office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 month ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Pune

Work from Office

Project Role: Data Platform Engineer. Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Engage in continuous learning to stay updated with industry trends and technologies. - Assist in the documentation of data platform processes and best practices. Professional & Technical Skills: - Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good-To-Have Skills: Experience with cloud data services and data warehousing solutions. - Strong understanding of data integration techniques and ETL processes. - Familiarity with data modeling concepts and practices. - Experience in working with big data technologies and frameworks. Additional Information: - The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Pune office. - A 15 years full time education is required. Qualification: 15 years full time education
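
For a concrete flavour of Databricks platform work like this, below is a hedged PySpark sketch of a simple bronze-layer ingestion step that writes a Delta table. On a Databricks cluster the Spark session and Delta support are provided by the runtime; the mount path and table names are placeholders.

```python
from pyspark.sql import SparkSession

# Minimal sketch of a Databricks-style ingestion step: read raw CSV files
# from a mounted landing zone and persist them as a Delta table.
# Path, database, and table names below are placeholders.
spark = SparkSession.builder.getOrCreate()

raw = (spark.read
            .option("header", True)
            .csv("/mnt/landing/customers/"))

spark.sql("CREATE DATABASE IF NOT EXISTS bronze")

(raw.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("bronze.customers"))

spark.sql("SELECT COUNT(*) AS row_count FROM bronze.customers").show()
```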

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Project Role: Technology Architect. Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs. Must have skills: Google Cloud Platform Architecture. Good to have skills: Google Cloud Machine Learning Services. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an AI/ML Data Engineer, you will be responsible for developing applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and GenAI models as part of the solution, utilizing deep learning, neural networks and chatbots. Roles & Responsibilities: Solutioning and designing CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, Agent Assist, and Conversational AI. Design, develop, and maintain intelligent chatbots and voice applications using Google Dialogflow CX. Develop CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, and Agent Assist. Develop and implement chatbot solutions that integrate seamlessly with CCAI and other Cloud services. Develop CCAI flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks. Create agents in Agent Builder and integrate them into the end-to-end pipeline. Apply GenAI/Vertex AI models as part of the solution and ensure a proper cloud application pipeline with production-ready quality. Experience with programming languages such as Python/Node.js. Professional & Technical Skills: - Must-Have Skills: CCAI/Dialogflow CX hands-on experience and generative AI understanding. - Good-To-Have Skills: Cloud Data Architecture, Cloud ML/PCA/PDE Certification. - Some understanding of AI/ML algorithms and techniques. - Experience with chatbots, generative AI models, prompt engineering. - Experience with cloud or on-prem application pipelines with production-ready quality. Additional Information: - The candidate should have a minimum of 5 years of experience in Google Cloud Machine Learning Services. - The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions. Qualification: 15 years full time education
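
To illustrate the Dialogflow CX integration work mentioned above, here is a hedged sketch that sends a single end-user utterance to a CX agent using the google-cloud-dialogflow-cx client. Project, location, and agent IDs are placeholders, and regional agents would also need a matching api_endpoint in the client options.

```python
import uuid
from google.cloud import dialogflowcx_v3 as cx

# Minimal sketch: detect intent for one utterance against a Dialogflow CX agent.
# PROJECT, LOCATION, and AGENT are placeholders, not real identifiers.
PROJECT, LOCATION, AGENT = "my-gcp-project", "global", "my-agent-id"

client = cx.SessionsClient()
session = (f"projects/{PROJECT}/locations/{LOCATION}"
           f"/agents/{AGENT}/sessions/{uuid.uuid4()}")

request = cx.DetectIntentRequest(
    session=session,
    query_input=cx.QueryInput(
        text=cx.TextInput(text="I want to change my flight"),
        language_code="en",
    ),
)

response = client.detect_intent(request=request)
for message in response.query_result.response_messages:
    if message.text:
        print(" ".join(message.text.text))
```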

Posted 1 month ago

Apply

3.0 - 8.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: SAP Master Data Governance (MDG) Tool
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: BE (Any Stream)

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with business objectives and supports efficient data management practices. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise in the data architecture process. Your role will be pivotal in establishing a robust data framework that enhances data accessibility and usability across the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Engage in continuous learning to stay updated with the latest data architecture trends and technologies.
- Collaborate with cross-functional teams to ensure data integrity and consistency across various platforms.

Professional & Technical Skills:
- Must Have Skills: Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with data governance frameworks and compliance standards.
- Ability to analyze complex data sets and derive actionable insights.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the SAP Master Data Governance (MDG) Tool.
- This position is based at our Bengaluru office.
- A BE in any stream is required.

Qualification: BE (Any Stream)

Posted 1 month ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Noida

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will collaborate with key stakeholders, including business representatives, data owners, and architects, to model current and new data, contributing to data architecture decisions and solutions.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data modeling efforts for various projects.
- Develop and maintain data models for databases.
- Ensure data integrity and quality in all data modeling activities.
- Conduct data analysis to support business requirements.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles.
- Experience with data modeling tools such as ERwin or PowerDesigner.
- Knowledge of data governance and data quality best practices.
- Good To Have Skills: Experience with data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Noida office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
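For context only (illustrative, not part of the posting): the modeling work described above is usually done in tools such as ERwin or PowerDesigner, but the same kind of design can be sketched in code. Below is a minimal star-schema example using SQLAlchemy; all table and column names are hypothetical.

# Minimal sketch of a star schema (one dimension, one fact table) expressed in SQLAlchemy.
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class DimCustomer(Base):
    __tablename__ = "dim_customer"
    customer_key = Column(Integer, primary_key=True)  # surrogate key
    customer_id = Column(String(50), nullable=False)  # natural/business key
    segment = Column(String(50))

class FactSales(Base):
    __tablename__ = "fact_sales"
    sales_key = Column(Integer, primary_key=True)
    customer_key = Column(Integer, ForeignKey("dim_customer.customer_key"), nullable=False)
    order_date = Column(Date, nullable=False)
    amount = Column(Numeric(12, 2), nullable=False)

# Generate the physical schema against a local SQLite database for demonstration.
engine = create_engine("sqlite:///model_demo.db")
Base.metadata.create_all(engine)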

Posted 1 month ago

Apply

10.0 - 14.0 years

9 - 14 Lacs

Pune

Work from Office

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution. May also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Google Cloud Platform Architecture
Good to have skills: Google Cloud Machine Learning Services
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an AI/ML lead, you will be responsible for developing applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and GenAI models as part of the solution, utilizing deep learning, neural networks, and chatbots. You should have hands-on experience in creating, deploying, and optimizing chatbots and voice applications using Google Conversational Agents and other tools.

Roles & Responsibilities:
- Solution and design CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, Agent Assist, and Conversational AI.
- Design, develop, and maintain intelligent chatbots and voice applications using Google Dialogflow CX.
- Integrate Dialogflow agents with various platforms, such as Google Assistant, Facebook Messenger, Slack, and websites; hands-on experience with IVR integration and telephony systems such as Twilio, Genesys, and Avaya.
- Integrate with IVR systems; proficiency in webhook setup and API integration.
- Develop Dialogflow CX flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks.
- Create agents in Agent Builder and integrate them into the end-to-end pipeline using Python.
- Apply GenAI/Vertex AI models as part of the solution, utilizing deep learning, neural networks, chatbots, and image processing.
- Work with Google Vertex AI to build, train, and deploy custom AI models that enhance chatbot capabilities.
- Implement and integrate backend services (using Google Cloud Functions or other APIs) to fulfill user queries and actions.
- Document technical designs, processes, and setup for various integrations.
- Experience with programming languages such as Python/Node.js.

Professional & Technical Skills:
- Must Have Skills: Hands-on experience with CCAI/Dialogflow CX and an understanding of generative AI.
- Good To Have Skills: Cloud Data Architecture; Cloud ML/PCA/PDE certification.
- Strong understanding of AI/ML algorithms, NLP, and related techniques.
- Experience with chatbots, generative AI models, and prompt engineering.
- Experience with cloud or on-prem application pipelines with production-ready quality.

Additional Information:
- The candidate should have a minimum of 10 years of experience in Google Cloud Machine Learning Services/GenAI/Vertex AI/CCAI.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- 15 years of full-time education is required.

Qualification: 15 years full time education
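For context only (illustrative, not part of the posting): applying GenAI/Vertex AI models of the kind mentioned above often starts with a call like the one sketched below. The project ID, location, model name, and prompt are placeholders, and the SDK class and method names should be checked against the current google-cloud-aiplatform release.

# Minimal sketch of invoking a generative model on Vertex AI (all values are placeholders).
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project/region

model = GenerativeModel("gemini-1.5-flash")  # model name is an example
response = model.generate_content(
    "Summarise the customer's last support ticket in two sentences."
)
print(response.text)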

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies