
976 Dataflow Jobs - Page 24

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

14 - 22 Lacs

Hyderabad

Work from Office

Role: Machine Learning Engineer

Required Skills & Experience
- 5+ years of hands-on experience building, training, and deploying machine learning models in a professional, production-oriented setting.
- Demonstrable experience with database creation and advanced querying (e.g., SQL, NoSQL), with a strong understanding of data warehousing concepts.
- Proven expertise in data blending, transformation, and feature engineering; adept at integrating and harmonizing both structured (e.g., relational databases, CSVs) and unstructured (e.g., text, logs, images) data.
- Strong practical experience with cloud platforms for machine learning development and deployment; significant experience with Google Cloud Platform (GCP) services (e.g., Vertex AI, BigQuery, Dataflow) is highly desirable.
- Proficiency in programming languages commonly used in data science (Python preferred; R also considered).
- Solid understanding of machine learning algorithms (e.g., regression, classification, clustering, dimensionality reduction) and experience with advanced techniques such as deep learning, natural language processing (NLP), or computer vision.
- Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Familiarity with MLOps tools and practices, including model versioning, monitoring, A/B testing, and continuous integration/continuous deployment (CI/CD) pipelines.
- Experience with containerization technologies such as Docker and orchestration tools such as Kubernetes for deploying ML models as REST APIs.
- Proficiency with version control systems (e.g., Git, GitHub/GitLab) for collaborative development.

Interested candidates, share your CV at dikshith.nalapatla@motivitylabs.com
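
For context on the "deploying ML models as REST APIs" item above, here is a minimal, illustrative sketch, assuming FastAPI and a pre-trained scikit-learn artifact; the model file name and flat feature vector are placeholders, not something the posting specifies.

```python
# Minimal sketch: serving a trained model behind a REST endpoint.
# "model.joblib" is a hypothetical artifact produced by an earlier training job.
import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical pre-trained model

class Features(BaseModel):
    values: list[float]  # flat feature vector; real schemas are usually richer

@app.post("/predict")
def predict(features: Features):
    X = np.array(features.values).reshape(1, -1)
    return {"prediction": model.predict(X).tolist()}
```

In a setup like the one described, this app would be packaged in a Docker image and deployed behind Kubernetes, with the endpoint serving as the model's contract.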

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Date: Jun 7, 2025
Location: Pune, MH, IN
Company: HMH

HMH is a learning technology company committed to delivering connected solutions that engage learners, empower educators and improve student outcomes. As a leading provider of K-12 core curriculum, supplemental and intervention solutions, and professional learning services, HMH partners with educators and school districts to uncover solutions that unlock students' potential and extend teachers' capabilities. HMH serves more than 50 million students and 4 million educators in 150 countries. HMH Technology India Pvt. Ltd. is our technology and innovation arm in India focused on developing novel products and solutions using cutting-edge technology to better serve our clients globally. HMH aims to help employees grow as people, and not just as professionals. For more information, visit www.hmhco.com

The data architect is responsible for designing, creating, and managing an organization's data architecture. This role is critical in establishing a solid foundation for data management within an organization, ensuring that data is organized, accessible, secure, and aligned with business objectives. The data architect designs data models, warehouses, file systems and databases, and defines how data will be collected and organized.

Responsibilities
- Interprets and delivers impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps
- Designs the structure and layout of data systems, including databases, warehouses, and lakes
- Selects and designs database management systems that meet the organization's needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures
- Defines and implements the long-term technology strategy and innovation roadmaps across analytics, data engineering, and data platforms
- Designs ETL processes that move data from various sources into the organization's data systems
- Translates high-level business requirements into data models and appropriate metadata, test data, and data quality standards
- Manages senior business stakeholders to secure strong engagement and ensures that project delivery aligns with longer-term strategic roadmaps
- Simplifies the existing data architecture, delivering reusable services and cost-saving opportunities in line with company policies and standards
- Leads and participates in the peer review and quality assurance of project architectural artifacts across the EA group through governance forums
- Defines and manages standards, guidelines, and processes to ensure data quality
- Works with IT teams, business analysts, and data analytics teams to understand data consumers' needs and develop solutions
- Evaluates and recommends emerging technologies for data management, storage, and analytics
- Designs, creates, and implements logical and physical data models for both IT and business solutions, capturing the structure, relationships, and constraints of relevant datasets
- Builds and operationalizes complex data solutions, corrects problems, applies transformations, and recommends data cleansing/quality solutions
- Collaborates and communicates effectively with various stakeholders to understand data and business requirements and translate them into data models
- Creates entity-relationship diagrams (ERDs), data flow diagrams, and other visualizations to represent data models
- Collaborates with database administrators and software engineers to implement and maintain data models in databases, data warehouses, and data lakes
- Develops data modeling best practices, and uses these standards to identify and resolve data modeling issues and conflicts
- Conducts performance tuning and optimization of data models for efficient data access and retrieval
- Incorporates core data management competencies, including data governance, data security and data quality

Education
- A bachelor's degree in computer science, data science, engineering, or a related field

Experience
- At least five years of relevant experience designing and implementing data models for enterprise data warehouse initiatives
- Experience leading projects involving data warehousing, data modeling, and data analysis
- Design experience in Azure Databricks, PySpark, and Power BI/Tableau

Skills
- Ability in programming languages such as Java, Python, and C/C++
- Ability in data science languages/tools such as SQL, R, SAS, or Excel
- Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks)
- Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata
- Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques
- Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture
- Ability to assess traditional and modern data architecture components based on business needs
- Experience with business intelligence tools and technologies such as ETL, Power BI, and Tableau
- Ability to regularly learn and adopt new technology, especially in the ML/AI realm
- Strong analytical and problem-solving skills
- Ability to synthesize and clearly communicate large volumes of complex information to senior management across levels of technical understanding
- Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders
- Ability to guide solution design and architecture to meet business needs
- Expert knowledge of data modeling concepts, methodologies, and best practices
- Proficiency in data modeling tools such as Erwin or ER/Studio
- Knowledge of relational databases and database design principles
- Familiarity with dimensional modeling and data warehousing concepts
- Strong SQL skills for data querying, manipulation, and optimization, plus knowledge of other data science languages, including JavaScript, Python, and R
- Ability to collaborate with cross-functional teams and stakeholders to gather requirements and align on data models
- Excellent analytical and problem-solving skills to identify and resolve data modeling issues
- Strong communication and documentation skills to effectively convey complex data modeling concepts to technical and business stakeholders

HMH Technology Private Limited is an Equal Opportunity Employer and considers applicants for all positions without regard to race, colour, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. We are committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit https://careers.hmhco.com/.
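
As a concrete illustration of the dimensional-modeling skills listed above, here is a minimal star-schema sketch in SQLAlchemy; every table and column name is invented for illustration and is not from the posting.

```python
# Illustrative star schema: one dimension table plus one fact table.
# An in-memory SQLite engine stands in for a real warehouse.
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class DimStudent(Base):
    __tablename__ = "dim_student"
    student_key = Column(Integer, primary_key=True)
    grade_level = Column(String(10))

class FactAssessment(Base):
    __tablename__ = "fact_assessment"
    assessment_key = Column(Integer, primary_key=True)
    student_key = Column(Integer, ForeignKey("dim_student.student_key"))
    taken_on = Column(Date)
    score = Column(Numeric(5, 2))

engine = create_engine("sqlite:///:memory:")  # stand-in for a real warehouse
Base.metadata.create_all(engine)
```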

Posted 1 month ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: GCP Data Engineer (Data Management & Engineering)

Job description: Overall more than 5 years of experience in data projects. Good knowledge of GCP, BigQuery, SQL, Python, and Dataflow. Has worked on implementation projects building data pipelines, transformation logic, and data models.

Education: Bachelor of Engineering in any discipline or equivalent.

Desired Candidate Profile
- Technology/engineering expertise: 4+ years of experience implementing data solutions using GCP, BigQuery, and SQL programming
- Proficient in dealing with the data access layer (RDBMS, NoSQL)
- Experience implementing and deploying big data applications with GCP Big Data Services
- Good to have SQL skills
- Able to deal with a diverse set of stakeholders; proficient in articulation, communication, and presentation; high integrity; problem-solving skills; learning attitude; team player

Key Responsibilities
- Implement data solutions using GCP; must be familiar with programming in SQL/Python
- Ensure clarity on NFRs and implement these requirements
- Work with the Client Technical Manager to understand the customer's landscape and IT priorities
- Lead performance engineering and capacity planning exercises for databases
- 4+ years of experience implementing data pipelines for data analytics solutions
- Experience building solutions using Google Cloud Dataflow, Apache Beam, and Java programming
- Experience with different development methodologies (RUP, Scrum, XP)

Mandatory Skills: GCP Storage, GCP BigQuery, GCP Dataproc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Datafusion, GCP Pub/Sub, ANSI SQL, GCP Dataflow, Big Data Hadoop Ecosystem
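
To ground the Dataflow / Apache Beam item above, here is a minimal, hedged Beam pipeline in Python. It runs locally with the default DirectRunner; a real Dataflow job would add --runner=DataflowRunner and GCP pipeline options. File paths and the three-field record shape are placeholders.

```python
# Minimal Beam pipeline: read lines, parse CSV fields, keep valid rows, write out.
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("input.csv")          # hypothetical input
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "FilterValid" >> beam.Filter(lambda fields: len(fields) == 3)
        | "Format" >> beam.Map(lambda fields: ",".join(fields))
        | "Write" >> beam.io.WriteToText("output")              # hypothetical output prefix
    )
```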

Posted 1 month ago

Apply

1.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
- Hands-on experience with data tools and technologies is a must.
- Partial design experience is acceptable, but the core focus should be on strong data skills.
- Will support the pre-sales team from a hands-on technical perspective.
- GCP experience: Looker / BigQuery / Vertex, any of these, with 6 months to 1 year of experience.

Requirements. On day one we'll expect you to:
- Own your modules and take complete ownership of the project
- Understand the scope, design, and business objective of the project and articulate it in the form of a design document
- Bring strong experience with Google Cloud Platform data services, including BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, and GenAI (Gemini, Imagen, Veo)
- Have experience implementing data governance on GCP
- Be familiar with integrating GCP services with other platforms such as Snowflake; hands-on Snowflake project experience is a plus
- Be an experienced coder in Python, SQL, ETL, and orchestration tools
- Have experience with containerized solutions using Google Kubernetes Engine
- Communicate well with internal teams and customers
- Bring expertise in PySpark (both batch and real-time), Kafka, SQL, and data querying tools
- Work within a team, continuously monitoring, contributing hands-on as an individual contributor, and helping the team deliver their work as you deliver yours
- Work with large volumes of data in distributed environments, keeping parallelism and concurrency in mind and ensuring performant, resilient systems
- Optimize the deployment architecture to reduce job run-times and resource utilization
- Develop and optimize data warehouses given the schema design
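
A small, hedged sketch of the hands-on BigQuery work mentioned above, using the official google-cloud-bigquery client; the project, dataset, and table names are invented, and credentials are assumed to come from the environment.

```python
# Run an aggregation query against a hypothetical events table and print results.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment
query = """
    SELECT country, COUNT(*) AS sessions
    FROM `example_project.analytics.events`   -- hypothetical table
    GROUP BY country
    ORDER BY sessions DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.country, row.sessions)
```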

Posted 1 month ago

Apply

55.0 years

6 - 9 Lacs

Pune

Remote

Capgemini Invent

Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role
- Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing and data lake solutions such as Snowflake, BigQuery, AWS Redshift, ADLS, and S3.
- Good knowledge of cloud compute services and load balancing.
- Good knowledge of cloud identity management, authentication and authorization.
- Proficiency with cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions.
- Experience using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.

Your Profile
- Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs. performance and scaling.
- Able to contribute to architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
- Must understand networking, security, design principles and best practices in cloud.

What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
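
One small, hedged example of the "cloud utility functions" item above: a CloudEvents-style Google Cloud Function (2nd gen) that fires when an object lands in a bucket. The function name is invented, and the event payload fields shown are the standard ones for GCS finalize events.

```python
# Event-driven function: log each new object uploaded to a bucket.
import functions_framework

@functions_framework.cloud_event
def on_file_uploaded(cloud_event):
    data = cloud_event.data  # GCS event payload
    print(f"New object {data['name']} in bucket {data['bucket']}")
```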

Posted 1 month ago

Apply

0 years

6 - 8 Lacs

Bengaluru

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

JD for L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficient in Individual and Group Life Insurance concepts and different types of Annuity products
- Proficient in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation
- Understanding of business rules for pay-out
- Demonstrated ability with insurance company operations such as nonforfeiture options, face amount increases/decreases, CVAT or GPT calculations, and dollar cost averaging, and the ability to perform their respective transactions
- Understanding of upstream and downstream interfaces for the policy lifecycle

Consulting Skills
- Experience creating business process maps for future-state architecture, creating a WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements
- Worked on multiple business transformation and modernization programs
- Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current-state maturity, gaps in functionality, and COTS solution features
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner
- Work with the client to define the most optimal future-state operational process and related product configuration
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams
- Work closely with the product design and development team to analyse and extract functional enhancements
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle

Technology Skills
- Proficient in technology solution architecture, with a focus on designing innovative and effective solutions
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions
- Industry certifications (AAPA/LOMA) will be an added advantage
- Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficient in Individual and Group Life Insurance concepts and different types of Annuity products
- Proficient in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation
- Understanding of business rules for pay-out
- Understanding of upstream and downstream interfaces for the policy lifecycle
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus

Consulting Skills
- Experience creating business process maps for future-state architecture, creating a WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner
- Work with the client to define the most optimal future-state operational process and related product configuration
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams
- Work closely with the product design and development team to analyse and extract functional enhancements
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle

Technology Skills
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions
- Industry certifications (AAPA/LOMA) will be an added advantage
- Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities and the planet.

Job Title: GCP Data Engineer
Location: Chennai
Duration: 12 Months
Work Type: Onsite

Position Description
- Bachelor's degree
- 2+ years with GCP services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Memorystore for Redis, Airflow, Cloud Storage
- 2+ years with data transfer utilities
- 2+ years with Git or another version control tool
- 2+ years with Confluent Kafka
- 1+ years of experience in API development
- 2+ years in an Agile framework
- 4+ years of strong experience in Python and PySpark development
- 4+ years of shell scripting to develop ad-hoc jobs for data importing/exporting

Skills Required: Python, Dataflow, Dataproc, GCP Cloud Run, Dataform, Agile software development, BigQuery, Terraform, Data Fusion, Cloud SQL, GCP, Kafka
Skills Preferred: Java
Experience Required: 8+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
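
A hedged sketch of the Confluent Kafka plus PySpark combination above: a Structured Streaming read from a Kafka topic. It assumes the spark-sql-kafka connector package is on the Spark classpath, and the broker address and topic name are invented.

```python
# Stream records from a Kafka topic and echo them to the console.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

query = (
    stream.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    .writeStream.format("console")
    .start()
)
query.awaitTermination()
```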

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Our company is seeking a seasoned Senior Data Engineer to join our dynamic team. In this role, you will concentrate on data integration and ETL projects for cloud-based systems. Your primary responsibilities will be to design and implement sophisticated data solutions, ensuring data accuracy, reliability, and accessibility.

Responsibilities
- Design and implement sophisticated data solutions for cloud-based systems
- Develop ETL workflows utilizing SQL, Python, and additional relevant technologies
- Maintain data accuracy, reliability, and accessibility for all stakeholders
- Collaborate with cross-functional teams to gauge data integration needs and specifications
- Create and uphold documentation, such as technical specifications, data flow diagrams, and data mappings
- Monitor and refine data integration processes to boost performance and efficiency while ensuring data accuracy and integrity

Requirements
- Bachelor's degree in Computer Science, Electrical Engineering, or related fields
- 5-8 years of experience in data engineering
- Experience using cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong knowledge of SQL for data querying and manipulation
- Proficiency in Snowflake for data warehousing
- Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
- Excellent problem-solving skills and attention to detail
- Good oral and written communication skills in English at a B2 level

Nice to have
- Proficiency in ETL using Python
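
A deliberately small ETL sketch in the SQL-plus-Python style the posting describes; the source file, target table, key column, and connection string are all placeholders.

```python
# Extract a CSV, apply a basic quality rule, and load into a SQL table.
import pandas as pd
from sqlalchemy import create_engine

def run_etl(source_csv: str, target_table: str, conn_str: str) -> int:
    df = pd.read_csv(source_csv)                        # extract
    df = df.dropna(subset=["customer_id"])              # transform: drop rows missing the key
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")        # audit column
    engine = create_engine(conn_str)
    df.to_sql(target_table, engine, if_exists="append", index=False)  # load
    return len(df)
```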

Posted 1 month ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: React.js, Cloud Network Operations
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Develop and implement scalable applications using Google BigQuery
- Collaborate with cross-functional teams to ensure application functionality
- Conduct code reviews and provide technical guidance to junior developers
- Stay updated on industry trends and best practices in application development
- Troubleshoot and resolve application issues in a timely manner

Professional & Technical Skills (project-specific)
- Primary: BigQuery, BigQuery Geospatial, Python, Dataflow, Composer; secondary: geospatial domain knowledge
- Must-have: proficiency in Google BigQuery
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity

Additional Information
- The candidate should have a minimum of 3 years of experience in Google BigQuery
- This position is based at our Bengaluru office
- 15 years of full-time education is required
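
A hedged sketch of the BigQuery Geospatial skill named above, using BigQuery GIS functions through the Python client; the table, its GEOGRAPHY column, and the reference coordinates are invented.

```python
# Find the sites nearest a reference point using BigQuery GIS functions.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT site_id,
           ST_DISTANCE(location, ST_GEOGPOINT(77.59, 12.97)) AS meters_from_ref
    FROM `example_project.geo.sites`   -- hypothetical table with a GEOGRAPHY column
    ORDER BY meters_from_ref
    LIMIT 5
"""
for row in client.query(sql).result():
    print(row.site_id, round(row.meters_from_ref))
```

Note that ST_GEOGPOINT takes longitude first, then latitude, and ST_DISTANCE returns meters.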

Posted 1 month ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
- Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures
- Deliver solutions for complex business problems through the standard software SDLC
- Build strong relationships with both internal and external stakeholders, including product, business and sales partners
- Demonstrate excellent communication skills, with the ability to both simplify complex problems and dive deeper when needed
- Build and manage strong technical teams that deliver complex software solutions that scale
- Manage teams with cross-functional skills that include software, quality and reliability engineers, project managers and scrum masters
- Provide deep troubleshooting skills, with the ability to lead and solve production and customer issues under pressure
- Leverage strong experience in full-stack software development and public cloud such as GCP and AWS
- Mentor, coach and develop junior and senior software, quality and reliability engineers
- Lead with a data/metrics-driven mindset, with a maniacal focus on optimizing and creating efficient solutions
- Ensure compliance with EFX secure software development guidelines and best practices; responsible for meeting and maintaining QE, DevSec, and FinOps KPIs
- Define, maintain and report SLAs, SLOs and SLIs meeting EFX engineering standards, in partnership with the product, engineering and architecture teams
- Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices
- Drive up-to-date technical documentation, including support and end-user documentation and runbooks
- Lead sprint planning, sprint retrospectives, and other team activities
- Own implementation architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise, audience-appropriate format

What Experience You Need
- Bachelor's degree or equivalent experience
- 7+ years of software engineering experience
- 7+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, and CSS
- 7+ years of experience with cloud technology: GCP, AWS, or Azure
- 7+ years of experience designing and developing cloud-native solutions
- 7+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes
- 7+ years of experience deploying and releasing software using Jenkins CI/CD pipelines; understands infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart
- Self-starter who identifies and responds to priority shifts with minimal supervision
- Strong communication and presentation skills
- Strong leadership qualities
- Demonstrated problem-solving skills and the ability to resolve conflicts
- Experience creating and maintaining product and software roadmaps
- Experience overseeing yearly as well as product/project budgets
- Experience working in a highly regulated environment
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
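
As a small, hedged illustration of the Pub/Sub item in the list above (not Equifax's own code), here is a publish call with the official Python client; the project and topic names are invented.

```python
# Publish one JSON message to a Pub/Sub topic and print its message id.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "events")  # hypothetical names
future = publisher.publish(topic_path, b'{"event": "signup"}')
print("Published message:", future.result())  # blocks until the server acks
```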

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Role: Azure Databricks Engineer
Location: Kolkata
Experience: 7+ years

Must-Have
- Build solutions for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components.
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases
- Experience with ADF and Dataflow
- Experience with big data tools like Delta Lake and Azure Databricks
- Experience with Synapse
- Skills in designing an Azure data solution
- Ability to assemble large, complex data sets that meet functional and non-functional business requirements

Good-to-Have
- Working knowledge of Azure DevOps

Responsibilities / Expectations from the Role
1. Customer centric: work closely with client teams to understand project requirements and translate them into technical design; experience working in scrum or with scrum teams.
2. Internal collaboration: work with project teams and guide the end-to-end project lifecycle; resolve technical queries; work with stakeholders including the executive, product, data and design teams to assist with data-related technical issues and support their data needs.
3. Soft skills: good communication skills; ability to interact with various internal groups and CoEs.
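
A hedged sketch of the Delta Lake work listed above: writing and reading a Delta table with PySpark. It assumes a Delta-enabled runtime (e.g., Databricks, or local Spark with the delta-spark package configured), and the mount path is a placeholder.

```python
# Write a small DataFrame as a Delta table, then read it back.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-demo").getOrCreate()

df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])
df.write.format("delta").mode("overwrite").save("/mnt/lake/people")  # hypothetical mount

spark.read.format("delta").load("/mnt/lake/people").show()
```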

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role: Java SB MS (Java Spring Boot microservices)
Required Technical Skill Set: Java SB MS
Desired Experience Range: 4 - 10 yrs
Notice Period: Immediate to 30 days only
Location of Requirement: Pune/Trivandrum

We are currently planning a virtual interview on 11th June 2025 (Wednesday).

Job Description. Desired skills (technical/behavioral):
Primary skill: Java Spring Boot, GCP services
Secondary skill: BigQuery, Jenkins, Dataflow, Cloud SQL

Posted 1 month ago

Apply

55.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Capgemini Invent

Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role
- Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing and data lake solutions such as Snowflake, BigQuery, AWS Redshift, ADLS, and S3.
- Good knowledge of cloud compute services and load balancing.
- Good knowledge of cloud identity management, authentication and authorization.
- Proficiency with cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions.
- Experience using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.

Your Profile
- Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs. performance and scaling.
- Able to contribute to architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
- Must understand networking, security, design principles and best practices in cloud.

What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior GCP Data Engineer
Location: Chennai
Experience: 8-15 years

Job Summary: We are looking for a Senior GCP Data Engineer with deep expertise in Google Cloud Platform (GCP) services and modern data engineering tools. The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines and solutions across the GCP ecosystem. This role is ideal for someone who thrives in Agile environments, understands infrastructure-as-code, and can work closely with data scientists, analysts, and DevOps teams.

Required Skills & Experience
- 8-15 years of overall experience in data engineering, with at least 3+ years of strong hands-on experience with GCP
- Expert-level programming in Python for data manipulation and orchestration
- Hands-on experience with Google Cloud services: BigQuery, Dataflow, Dataproc, Data Fusion, Cloud Run, Cloud SQL
- Proficiency with Terraform for managing GCP infrastructure
- Experience using Dataform or similar data modeling/orchestration tools
- Strong knowledge of SQL and performance optimization techniques
- Solid understanding of Agile software development practices and tools (e.g., JIRA, Confluence)
- Experience with real-time and batch data processing patterns
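
One hedged illustration of the SQL performance-optimization item above: creating a time-partitioned, clustered BigQuery table through the Python client, so that time- and user-filtered queries scan fewer bytes. All names are placeholders.

```python
# Create a partitioned, clustered table; partition pruning cuts query cost.
from google.cloud import bigquery

client = bigquery.Client()
table = bigquery.Table(
    "example-project.analytics.page_views",  # hypothetical table id
    schema=[
        bigquery.SchemaField("view_ts", "TIMESTAMP"),
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("url", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(field="view_ts")
table.clustering_fields = ["user_id"]
client.create_table(table)
```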

Posted 1 month ago

Apply

0.0 - 10.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

Locations: Indore, Madhya Pradesh, India; Noida, Uttar Pradesh, India

Job Description: We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming, and the following technical skills:
- PL/SQL and PostgreSQL programming: ability to write complex SQL queries and stored procedures
- Migration: working experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL
- Working experience with Cloud SQL/AlloyDB
- Working experience tuning autovacuum in PostgreSQL
- Working experience tuning AlloyDB/PostgreSQL for better performance
- Working experience with BigQuery, Firestore, Memorystore, Spanner, and bare-metal setups for PostgreSQL
- Experience with GCP Database Migration Service
- Working experience with MongoDB
- Working experience with Cloud Dataflow
- Working experience with database disaster recovery
- Working experience with database job scheduling
- Working experience with database logging techniques
- Knowledge of OLTP and OLAP
- Desirable: GCP Database Engineer certification

Other Skills
- Out-of-the-box thinking and problem-solving skills
- Ability to make tech choices (build vs. buy)
- Performance management (profiling, benchmarking, testing, fixing)
- Enterprise architecture
- Project management, delivery capability, and a quality mindset
- Scope management; planning (phasing, critical path, risk identification); schedule management and estimation
- Leadership skills
- Other soft skills: learning ability, innovation, initiative

Skills Required: PostgreSQL, PL/SQL, BigQuery

Role
- Develop, construct, test, and maintain data architectures
- Migrate enterprise Oracle databases from on-prem to GCP cloud
- Tune autovacuum in PostgreSQL
- Tune AlloyDB/PostgreSQL for better performance
- Performance-tune PostgreSQL stored procedure code and queries
- Convert Oracle stored procedures and queries to PostgreSQL stored procedures and queries
- Create a hybrid data store combining data warehouse and NoSQL GCP solutions with PostgreSQL
- Migrate Oracle table data from Oracle to AlloyDB
- Lead the database team

Experience: 7 to 10 years
Job Reference Number: 12779
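
A small, hedged sketch of the autovacuum tuning called out above: tightening per-table thresholds on a write-heavy table so vacuum and analyze run more often. The connection string and table name are placeholders; the two storage parameters are standard PostgreSQL settings.

```python
# Lower autovacuum thresholds on a hot table via psycopg2.
import psycopg2

with psycopg2.connect("dbname=app user=postgres") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute(
            """
            ALTER TABLE orders SET (
                autovacuum_vacuum_scale_factor = 0.02,
                autovacuum_analyze_scale_factor = 0.01
            )
            """
        )
# Exiting the connection context commits the change.
```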

Posted 1 month ago

Apply

16.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Experience: 16+ years
Location: Hybrid

Job Description
We are seeking a dynamic and experienced Enterprise Solution Architect to lead the design and implementation of innovative solutions that align with our organization's strategic objectives. The Enterprise Solution Architect will play a key role in defining the architecture vision, establishing technical standards, and driving the adoption of best practices across the enterprise. The ideal candidate will have a deep understanding of enterprise architecture principles, business processes, and technology trends, with a focus on delivering scalable, flexible, and secure solutions.

Responsibilities
- Drive client conversations and solutions, and build strong relationships with the client, acting as a trusted advisor and technical expert
- Lay down the architectural roadmap, guidelines, and high-level design covering the end-to-end lifecycle of the data value chain, from ingestion and integration through consumption (visualization, AI capabilities), data governance, and non-functionals (including data security)
- Deliver large-scale data platform implementations for telecom clients; telecom domain understanding is a must
- Implement data applications and platforms on GCP
- Execute a comprehensive data migration strategy for our telecom client, involving multiple source systems moving to GCP
- Deep-dive into client requirements to understand their data needs and challenges; proactively propose solutions that leverage GCP's capabilities or integrate with external tools for optimal results
- Spearhead solution calls with the client, translating complex data architecture and engineering concepts into clear, actionable plans for data engineers; demonstrate flexibility and adaptability to accommodate evolving needs
- Develop a robust data model for the telecom client, ensuring data is organized, consistent, and readily available for analysis
- Leverage expertise in data, AI, and ML to create a future-proof blueprint for the client's data landscape, enabling advanced analytics and insight generation
- Develop architectural principles, standards, and guidelines to ensure consistency, interoperability, and scalability across systems and applications
- Lead the design and implementation of end-to-end solutions that leverage emerging technologies and industry best practices to address business challenges and opportunities
- Conduct architectural reviews and assessments to validate design decisions, identify risks, and recommend mitigation strategies
- Collaborate with vendors, partners, and external consultants to evaluate and select technology solutions that meet business requirements and align with enterprise architecture standards
- Drive the adoption of cloud computing, microservices architecture, API management, and other emerging technologies to enable digital transformation and innovation
- Communicate the enterprise architecture vision, principles, and roadmap to stakeholders at all levels of the organization, and advocate for architectural decisions and investments

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- Total experience of 18+ years in data analytics implementations
- Minimum 10+ years of extensive experience as a Principal Solution Architect or in a similar senior role
- Proven success leading large-scale data migrations, particularly to GCP
- In-depth knowledge of data architecture principles and best practices
- Strong understanding of data modeling techniques and the ability to create efficient data models
- Experience working with GCP and its various data management services (e.g., BigQuery, Cloud Storage, Dataflow, dbt)
- Experience with at least one programming language commonly used in data processing (e.g., Python, Java)
- A demonstrable understanding of Data Science, Artificial Intelligence, and Machine Learning concepts

Job Category: Solution Designer
Job Type: Contract
Job Location: Hybrid

Posted 1 month ago

Apply

35.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description
Sutherland is a global leader in driving business and digital transformation and exceptional experiences along the entire journey of our clients' engagement with their customers. With over 35 years of experience, we combine deep domain expertise and extensive knowledge in proven optimization with both proprietary and partnered tools and platforms to drive growth, efficiency, and productivity across organizations. Sutherland brings together our people, processes, products and platforms across cognitive artificial intelligence (AI), intelligent automation, advanced analytics and digital services to create unique solutions for the industries that we serve. The core values of remaining agile, outside-the-box thinking, uncompromising integrity and flawless execution are key pillars of the company. We serve marquee brands across Healthcare, Insurance, Banking and Financial Services, Communications, Media and Entertainment, Technology, Travel and Logistics, and Retail. Sutherland has 212 unique and independent inventions associated with several patent grants in critical technologies in the US and UK.

Job Description
Role Overview: We are looking for a motivated Power BI Reporting/Analytics Specialist to join our team and help transform raw data into actionable insights within the context of SAP implementations. The ideal candidate will have 10+ years of experience working with SAP, as well as hands-on experience creating reports, dashboards, and analytics using Power BI. In this role, you will collaborate with SAP functional teams to gather data from various SAP modules and develop business intelligence solutions that empower data-driven decision-making.

Key Responsibilities

Data collection and integration:
- Collaborate with SAP functional consultants and business stakeholders to gather and understand data requirements for reporting and analytics
- Extract and integrate data from various SAP modules (e.g., SAP FICO, MM, SD, HR) to prepare datasets for reporting and analysis
- Work with data engineering teams to ensure clean, accurate, and reliable data pipelines for Power BI reports

Power BI report development:
- Design, develop, and maintain interactive and visually appealing Power BI reports and dashboards based on business requirements
- Create custom Power BI visualizations to present key metrics, KPIs, and trends derived from SAP data
- Implement drill-down capabilities, dynamic filtering, and other Power BI features to enhance the user experience and provide more granular insights

Data analysis and insights:
- Perform data analysis on SAP data to identify key trends, anomalies, and business performance indicators
- Work closely with business users to understand their analytical needs and provide actionable insights using Power BI
- Provide ongoing analysis and reporting for continuous monitoring of business performance

Collaboration with SAP functional teams:
- Work closely with SAP functional consultants (e.g., SAP FICO, MM, SD) to ensure accurate extraction of relevant data from SAP systems
- Assist in defining data models and ensuring that data from SAP is represented appropriately for reporting and analytics
- Support functional teams in implementing data governance processes to ensure data integrity and consistency across reports

Report optimization and performance tuning:
- Continuously optimize Power BI reports for performance, ensuring fast loading times and efficient data refreshes
- Troubleshoot and resolve performance issues in reports or dashboards to maintain smooth user experiences
- Implement best practices for report design, data model optimization, and visual consistency

User support and training:
- Provide training and support to end-users, ensuring they understand how to navigate Power BI reports and interpret the data
- Create user manuals or documentation for Power BI reports and dashboards, ensuring that business users can independently generate insights
- Act on user feedback, ensuring reports meet their needs and making necessary adjustments

Continuous improvement:
- Stay up to date with the latest features and capabilities of Power BI, and implement new functionality to improve the reporting experience
- Suggest improvements to existing reporting structures and processes to enhance reporting efficiency and accuracy

Required Skills & Qualifications
- Experience: 10+ years of hands-on experience with Power BI, ideally with exposure to SAP data reporting and analytics
- Technical skills: proficiency in Power BI Desktop, Power BI Service, and DAX (Data Analysis Expressions); understanding of data extraction techniques (e.g., SAP HANA, SAP BW) and their integration with Power BI; familiarity with SAP modules (FICO, MM, SD, HR) and their data structures; ability to design and implement effective data models and relationships in Power BI
- Data visualization: strong skills in creating effective and visually compelling reports and dashboards, ensuring clarity of insights
- SQL skills: knowledge of SQL for data extraction and transformation
- Analytical skills: strong analytical mindset, capable of identifying patterns and trends within data to provide actionable insights
- Collaboration skills: ability to work cross-functionally with SAP teams, business users, and IT teams to ensure the success of reporting initiatives
- Communication skills: strong verbal and written communication skills, with the ability to present complex data in a simple, user-friendly manner

Preferred Skills
- SAP experience: exposure to SAP systems and understanding of how data flows within SAP (SAP FICO, MM, SD, etc.)
- Power BI certification: certification in Power BI or other relevant BI tools
- Data warehousing knowledge: familiarity with data warehousing concepts and the integration of data sources into reporting tools
- Advanced Power BI features: experience with Power Query, dataflows, custom visuals, and data transformations
- Agile methodology: experience working in Agile/Scrum project environments

Additional Information
All your information will be kept confidential according to EEO guidelines.
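
Adjacent to the "efficient data refreshes" responsibility above, here is a hedged sketch of triggering a Power BI dataset refresh programmatically through the Power BI REST API; the dataset id and the Azure AD bearer token are placeholders you would obtain separately.

```python
# Queue a dataset refresh via the Power BI REST API.
import requests

DATASET_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical dataset id
TOKEN = "eyJ..."                                     # hypothetical AAD access token

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()  # HTTP 202 means the refresh was accepted and queued
```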

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities

Design and implement scalable, efficient, and secure data pipelines on GCP, utilizing tools such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Collaborate with cross-functional teams (data scientists, analysts, and software engineers) to understand business requirements and deliver actionable data solutions.
Develop and maintain ETL/ELT processes to ingest, transform, and load data from various sources into GCP-based data warehouses.
Build and manage data lakes and data marts on GCP to support analytics and business intelligence initiatives.
Implement automated data quality checks, monitoring, and alerting systems to ensure data integrity.
Optimize and tune performance for large-scale data processing jobs in BigQuery, Dataflow, and other GCP tools.
Create and maintain data pipelines to collect, clean, and transform data for analytics and machine learning purposes.
Ensure data governance and compliance with organizational policies, including data security, privacy, and access controls.
Stay up to date with new GCP services and features and make recommendations for improvements and new implementations.

Mandatory Skill Sets: GCP, BigQuery, Dataproc
Preferred Skill Sets: GCP, BigQuery, Dataproc, Airflow
Years of Experience Required: 4-7
Education Qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Fields of Study Required: Master of Business Administration, Bachelor of Engineering, Master of Engineering
Required Skills: Google Cloud Platform (GCP)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 18 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
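For context, the day-to-day pipeline work this posting describes often looks like the following minimal Apache Beam (Python) sketch: read raw CSV events from Cloud Storage, clean them, and load them into BigQuery via Dataflow. All project, bucket, table, and field names here are illustrative, not taken from the posting.

import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line):
    """Parse one CSV line into a BigQuery-ready dict (assumed 3-column layout)."""
    campaign_id, clicks, cost = next(csv.reader([line]))
    return {"campaign_id": campaign_id, "clicks": int(clicks), "cost": float(cost)}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",           # swap for "DirectRunner" to test locally
        project="my-project",              # hypothetical project id
        region="asia-south1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events.csv",
                                             skip_header_lines=1)
            | "Parse" >> beam.Map(parse_row)
            | "DropBadRows" >> beam.Filter(lambda r: r["clicks"] >= 0)  # simple data quality gate
            | "Load" >> beam.io.WriteToBigQuery(
                "my-project:marketing.events",
                schema="campaign_id:STRING,clicks:INTEGER,cost:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()

The same pipeline code runs locally on the DirectRunner for development and scales out on Dataflow in production, which is why Beam is a common choice for the ETL/ELT responsibilities listed above.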

Posted 1 month ago


Apply

6.0 years

5 - 10 Lacs

Noida

On-site

Country/Region: IN
Requisition ID: 26218
Location: INDIA - NOIDA - BIRLASOFT OFFICE
Title: Technical Specialist - Data Engineering

Description: Area(s) of responsibility
JD - Snowflake Developer

6 years of hands-on development experience with DBT, Aptitude, and Snowflake on the Azure platform, covering Dataflow, data ingestion, and data storage & security
Expertise in the ETL tool DBT
Design data integration (ETL) projects using DBT
Strong hands-on experience building custom data models and a semantic reporting layer in Snowflake to support customer reporting and current platform requirements
Experience with any other ETL tool is good to have
Participate in the entire project lifecycle, including design and development of ETL solutions
Design the data integration and conversion strategy, exception handling mechanism, and data retention and archival strategy
Ability to communicate platform features and development effectively to the customer SME and technical team.
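To illustrate the Snowflake side of this role, here is a minimal Python sketch (an assumption-laden example, not from the posting) that loads a raw file and builds a simple semantic reporting view using the snowflake-connector-python library. The account, credentials, and object names are placeholders.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # hypothetical account identifier
    user="etl_user",
    password="***",          # placeholder; use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="REPORTING",
)
try:
    cur = conn.cursor()
    # Ingestion step: load staged CSV files into a raw table
    cur.execute(
        "COPY INTO raw_orders FROM @raw_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Semantic reporting layer: a view aggregating the raw table
    cur.execute("""
        CREATE OR REPLACE VIEW reporting.daily_orders AS
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM raw_orders
        GROUP BY order_date
    """)
finally:
    conn.close()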

Posted 1 month ago

Apply

6.0 years

0 Lacs

Greater Hyderabad Area

On-site

Area(s) of responsibility
JD - Snowflake Developer

6 years of hands-on development experience with DBT, Aptitude, and Snowflake on the Azure platform, covering Dataflow, data ingestion, and data storage & security
Expertise in the ETL tool DBT
Design data integration (ETL) projects using DBT
Strong hands-on experience building custom data models and a semantic reporting layer in Snowflake to support customer reporting and current platform requirements
Experience with any other ETL tool is good to have
Participate in the entire project lifecycle, including design and development of ETL solutions
Design the data integration and conversion strategy, exception handling mechanism, and data retention and archival strategy
Ability to communicate platform features and development effectively to the customer SME and technical team.
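On the DBT side of this role, dbt-core (1.5 and later) can be driven programmatically from Python as well as from the CLI. A minimal sketch, assuming a dbt project whose profiles.yml already points at the Snowflake account; the "staging" selector name is hypothetical.

from dbt.cli.main import dbtRunner

dbt = dbtRunner()

# Run only the staging models, then test them: equivalent to
# `dbt run --select staging` followed by `dbt test --select staging`.
run_result = dbt.invoke(["run", "--select", "staging"])
if run_result.success:
    dbt.invoke(["test", "--select", "staging"])
else:
    raise RuntimeError("dbt run failed", run_result.exception)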

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderābād

On-site

Summary

We are seeking a highly skilled and motivated GCP Data Engineering Manager to join our dynamic team. As a Data Engineering Manager specializing in Google Cloud Platform (GCP), you will play a crucial role in designing, implementing, and maintaining scalable data pipelines and systems. You will leverage your expertise in Google BigQuery, SQL, Python, and analytical skills to drive data-driven decision-making processes and support various business functions.

About the Role

Key Responsibilities:
Data Pipeline Development: Design, develop, and maintain robust data pipelines using GCP services such as Dataflow and Dataproc, ensuring high performance and scalability.
Google BigQuery Expertise: Use your hands-on experience with Google BigQuery to manage and optimize data storage, retrieval, and processing.
SQL Proficiency: Write and optimize complex SQL queries to transform and analyze large datasets, ensuring data accuracy and integrity.
Python Programming: Develop and maintain Python scripts for data processing, automation, and integration with other systems and tools.
Data Integration: Collaborate with data analysts and other stakeholders to integrate data from various sources, ensuring seamless data flow and consistency.
Data Quality and Governance: Implement data quality checks, validation processes, and governance frameworks to maintain high data standards.
Performance Tuning: Monitor and optimize the performance of data pipelines, queries, and storage solutions to ensure efficient data processing.
Documentation: Create comprehensive documentation for data pipelines, processes, and best practices to facilitate knowledge sharing and team collaboration.

Minimum Qualifications:
Proven experience (minimum 6-8 years) as a Data Engineer, with significant hands-on experience in Google Cloud Platform (GCP) and Google BigQuery.
Proficiency in SQL for data transformation, analysis, and performance optimization.
Strong programming skills in Python, with experience in developing data processing scripts and automation.
Proven analytical skills with the ability to interpret complex data and provide actionable insights.
Excellent problem-solving abilities and attention to detail.
Strong communication and collaboration skills, with the ability to work effectively in a team environment.

Desired Skills:
Experience with Google Analytics data and an understanding of digital marketing data.
Familiarity with other GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Dataproc.
Knowledge of data visualization tools such as Looker, Tableau, or Data Studio.
Experience with machine learning frameworks and libraries.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Division: US
Business Unit: Universal Hierarchy Node
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Marketing
Job Type: Full time
Employment Type: Regular
Shift Work: No

Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to [email protected] and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.
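As an illustration of the BigQuery + SQL + Python combination this posting emphasizes, here is a minimal sketch that runs a parameterized SQL transformation and materializes the result into a table with the google-cloud-bigquery client. The project, dataset, and column names are invented for the example.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

sql = """
    SELECT campaign_id,
           DATE(event_ts) AS day,
           COUNT(*) AS sessions
    FROM `my-project.raw.web_events`
    WHERE DATE(event_ts) >= @start_date
    GROUP BY campaign_id, day
"""
job_config = bigquery.QueryJobConfig(
    destination="my-project.analytics.daily_sessions",  # materialize into this table
    write_disposition="WRITE_TRUNCATE",                 # rebuild the table each run
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ],
)
job = client.query(sql, job_config=job_config)
job.result()  # block until the transformation finishes
print(f"Loaded {job.destination.table_id}, {job.total_bytes_processed} bytes scanned")

Using query parameters rather than string interpolation keeps such scripts safe to automate, which matters once they run unattended inside scheduled pipelines.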

Posted 1 month ago

Apply

8.0 - 13.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are looking for a skilled Lead Data Engineer to join our dynamic team. In this role, you will take charge of designing, building, and maintaining data integration solutions for our clients, and you will lead a team of engineers to deliver high-quality, scalable, and efficient solutions. This is an exciting challenge for an experienced data integration specialist who is passionate about technology and thrives in a fast-paced, evolving environment.

Responsibilities
Design, build, and maintain data integration solutions for clients
Lead a team of engineers to ensure high-quality, scalable, and efficient data integration solutions
Collaborate with multidisciplinary teams to understand business needs and devise appropriate data integration solutions
Ensure the security, reliability, and efficiency of data integration solutions
Create and update documentation, such as technical specifications, data flow diagrams, and data mappings
Continuously update knowledge and skills related to the latest data integration methods and tools

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or a related field
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Background in Snowflake for cloud data warehousing
Familiarity with at least one cloud platform such as AWS, Azure, or GCP
Experience leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
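For a flavor of the Spark-based ETL work listed in the requirements, the following minimal PySpark sketch reads raw CSVs, standardizes them, and writes partitioned Parquet; the same pattern carries over to AWS Glue or Dataproc jobs. Paths and column names are examples only.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV drop zone (hypothetical path)
raw = (spark.read
       .option("header", True)
       .csv("s3://example-bucket/raw/orders/"))

# Transform: enforce types, deduplicate, drop invalid rows
cleaned = (raw
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
           .dropDuplicates(["order_id"])
           .filter(F.col("amount") > 0))

# Load: partitioned columnar output for downstream warehousing
(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-bucket/curated/orders/"))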

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

We are seeking a highly skilled and motivated Lead DS/ML Engineer to join our team. The role is critical to the development of a cutting-edge reporting, insights, and recommendations platform designed to measure and optimize online marketing campaigns. The ideal candidate is a Data Scientist / ML Engineer with a strong foundation in data engineering (ELT, data pipelines) and advanced machine learning, able to build scalable data pipelines, develop sophisticated ML models, and deploy solutions in production, and comfortable working across data engineering, the ML model lifecycle, and cloud-native technologies.

Job Description:

Key Responsibilities:

Data Engineering & Pipeline Development
Design, build, and maintain scalable ELT pipelines for ingesting, transforming, and processing large-scale marketing campaign data.
Ensure high data quality, integrity, and governance using orchestration tools like Apache Airflow, Google Cloud Composer, or Prefect.
Optimize data storage, retrieval, and processing using BigQuery, Dataflow, and Spark for both batch and real-time workloads.
Implement data modeling and feature engineering for ML use cases.

Machine Learning Model Development & Validation
Develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization.
Experiment with different algorithms (regression, classification, clustering, reinforcement learning) to drive insights and recommendations.
Leverage NLP, time-series forecasting, and causal inference models to improve campaign attribution and performance analysis.
Optimize models for scalability, efficiency, and interpretability.

MLOps & Model Deployment
Deploy and monitor ML models in production using tools such as Vertex AI, MLflow, Kubeflow, or TensorFlow Serving.
Implement CI/CD pipelines for ML models, ensuring seamless updates and retraining.
Develop real-time inference solutions and integrate ML models into BI dashboards and reporting platforms.

Cloud & Infrastructure Optimization
Design cloud-native data processing solutions on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow.
Work on containerized deployment (Docker, Kubernetes) for scalable model inference.
Implement cost-efficient, serverless data solutions where applicable.

Business Impact & Cross-functional Collaboration
Work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives.
Translate complex model insights into actionable business recommendations.
Present findings and performance metrics to both technical and non-technical stakeholders.

Qualifications & Skills:

Educational Qualifications:
Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field.
Certification in Google Cloud (Professional Data Engineer, ML Engineer) is a plus.

Must-Have Skills:
Experience: 5-10 years with the skill set below and relevant hands-on experience.
Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer).
ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP.
Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing.
Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms.
MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools).
Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing.

Nice-to-Have Skills:
Experience with Graph ML, reinforcement learning, or causal inference modeling.
Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards.
Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies.
Experience with distributed computing frameworks (Spark, Dask, Ray).

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
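To make the orchestration part of this role concrete, here is a minimal Apache Airflow sketch (assuming Airflow 2.x; DAG id and task bodies are placeholders) of a daily pipeline that ingests campaign data, transforms it, and triggers a model-retraining step.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest(**context):
    # Placeholder: pull raw campaign data for the run's logical date
    print("ingesting campaign data for", context["ds"])

def transform(**context):
    # Placeholder: clean the data and build ML features
    print("transforming data for", context["ds"])

def retrain(**context):
    # Placeholder: retrain and register the model
    print("retraining model for", context["ds"])


with DAG(
    dag_id="campaign_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="ingest", python_callable=ingest)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="retrain", python_callable=retrain)
    t1 >> t2 >> t3

In practice each placeholder task would call out to Dataflow, BigQuery, or Vertex AI jobs, with Airflow providing the scheduling, retries, and dependency ordering the posting describes.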

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies