2.0 - 6.0 years
4 - 8 Lacs
Kochi
Work from Office
As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience in Big Data technologies such as Hadoop, Apache Spark, and Hive
- Practical experience in Core Java (1.8 preferred), Python, or Scala
- Experience with AWS cloud services, including S3, Redshift, EMR, etc.
- Strong expertise in RDBMS and SQL
- Good experience in Linux and shell scripting
- Experience building data pipelines with Apache Airflow

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 4 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Hyderabad
Work from Office
One Azure backend expert (Strong SC or Specialist Senior). Should have hands-on experience working with ADLS, ADF, and Azure SQL DW, and a minimum of 3 years' experience delivering Azure projects.

Must Have:
- 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation
- Optimize and tune Databricks jobs for performance and scalability
- Experience with Scala and/or Python programming languages
- Proficiency in SQL for querying and managing data
- Expertise in ETL (Extract, Transform, Load) processes
- Knowledge of data modeling and data warehousing concepts
- Implement best practices for data pipelines, including monitoring, logging, and error handling
- Excellent problem-solving skills and attention to detail
- Excellent written and verbal communication skills
- Strong analytical and problem-solving abilities
- Experience with version control systems (e.g., Git) to manage and track changes to the codebase
- Document technical designs, processes, and procedures related to Databricks development
- Stay current with Databricks platform updates and recommend improvements to existing processes

Good to Have:
- Agile delivery experience
- Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP)
- Knowledge of Agile and Scrum software development methodologies
- Understanding of data lake architectures
- Familiarity with tools like Apache NiFi, Talend, or Informatica
- Skills in designing and implementing data models

Skills: ADF, SQL, ADLS, Azure, Azure SQL DW
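The "monitoring, logging, and error handling" practice this posting asks for in ETL pipelines follows a common pattern: reject bad records without aborting the batch, and log both outcomes. A minimal, framework-agnostic sketch in plain Python; the record fields and transform are illustrative, not from the posting:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def transform(record):
    # Illustrative transform: normalise a name field and cast an amount.
    return {"name": record["name"].strip().title(),
            "amount": float(record["amount"])}

def run_batch(records):
    """Process a batch, logging and collecting failures instead of aborting the run."""
    ok, failed = [], []
    for i, rec in enumerate(records):
        try:
            ok.append(transform(rec))
        except (KeyError, ValueError) as exc:
            log.error("record %d rejected: %r", i, exc)
            failed.append((i, rec))
    log.info("batch done: %d ok, %d failed", len(ok), len(failed))
    return ok, failed

good, bad = run_batch([
    {"name": "  alice ", "amount": "10.5"},
    {"name": "bob"},  # missing amount -> rejected, run continues
])
```

In a Databricks job the same idea usually appears as a quarantine table or bad-records path rather than an in-memory list.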
Posted 4 weeks ago
13.0 - 18.0 years
44 - 48 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners: KPI Partners is a leading provider of data-driven insights and innovative analytics solutions. We strive to empower organizations to harness the full potential of their data, driving informed decision-making and business success.

We are seeking an enthusiastic and experienced professional to join our dynamic team as an Associate Director / Director in Data Engineering & Modeling. We are looking for a highly skilled and motivated Associate Director / Director, Data Engineering & Solution Architecture, to support the strategic delivery of modern data platforms and enterprise analytics solutions. This is a hands-on leadership role that bridges technology and business, helping design, develop, and operationalize scalable cloud-based data ecosystems. You will work closely with client stakeholders, internal delivery teams, and practice leadership to drive architecture, implementation, and best practices across key initiatives.

Key Responsibilities:
- Solution Design & Architecture: Collaborate on designing robust, secure, and cost-efficient data architectures using cloud-native platforms such as Databricks, Snowflake, Azure Data Services, AWS, and Incorta.
- Data Engineering Leadership: Oversee the development of scalable ETL/ELT pipelines using ADF, Airflow, dbt, PySpark, and SQL, with an emphasis on automation, error handling, and auditing.
- Data Modeling & Integration: Design data models (star, snowflake, canonical), resolve dimensional hierarchies, and implement efficient join strategies.
- API-based Data Sourcing: Work with REST APIs for data acquisition, managing pagination, throttling, authentication, and schema evolution.
- Platform Delivery: Support the end-to-end project lifecycle, from requirement analysis and PoCs to development, deployment, and handover.
- CI/CD & DevOps Enablement: Implement and manage CI/CD workflows using Git, Azure DevOps, and related tools to enforce quality and streamline deployments.
- Mentoring & Team Leadership: Mentor senior engineers and developers, conduct code reviews, and promote best practices across engagements.
- Client Engagement: Interact with clients to understand needs, propose solutions, resolve delivery issues, and maintain high satisfaction levels.

Required Skills & Qualifications:
- 14+ years of experience in Data Engineering, BI, or Solution Architecture roles
- Strong hands-on expertise in at least one cloud platform, such as Azure, Databricks, Snowflake, or AWS (EMR)
- Proficiency in Python, SQL, and PySpark for large-scale data transformation
- Proven skills in developing dynamic and reusable data pipelines (metadata-driven preferred)
- Strong grasp of data modeling principles and modern warehouse design
- Experience with API integrations, including error handling and schema versioning
- Ability to design modular and scalable solutions aligned with business goals
- Solid communication and stakeholder management skills

Preferred Qualifications:
- Exposure to data governance, data quality frameworks, and security best practices
- Certifications in Azure Data Engineering, Databricks, or Snowflake are a plus
- Experience working with Incorta and building materialized views or delta-based architectures
- Experience working with enterprise ERP systems
- Experience leading data ingestion from Oracle Fusion ERP and other enterprise systems

What We Offer:
- Opportunity to work on cutting-edge data transformation projects for global enterprises
- Mentorship from senior leaders and a clear path to Director-level roles
- Flexible work environment and a culture that values innovation, ownership, and growth
- Competitive compensation and professional development support
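The API-based data sourcing this role describes (pagination, throttling, retries) follows a well-known pattern. A minimal sketch in plain Python, with a pluggable `fetch_page` callable standing in for an authenticated REST client; all names are illustrative, not from the posting:

```python
import time

def paged_records(fetch_page, page_size=100, max_retries=3, pause=0.0):
    """Yield records from a paginated source.

    `fetch_page(offset, limit)` is any callable returning a list of records
    (an empty list signals the last page). In practice it would wrap an
    authenticated HTTP call; keeping it pluggable makes the pattern testable.
    """
    offset = 0
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(offset, page_size)
                break
            except IOError:
                # Simple exponential backoff on throttling/transient errors.
                time.sleep(pause * (2 ** attempt))
        else:
            raise RuntimeError(f"gave up after {max_retries} retries at offset {offset}")
        if not page:
            return
        yield from page
        offset += len(page)

# Usage with a fake source standing in for a REST endpoint:
data = list(range(250))
fake_fetch = lambda off, lim: data[off:off + lim]
records = list(paged_records(fake_fetch))
```

Schema evolution, also mentioned above, would be handled downstream of this loop (e.g., by validating each page against a versioned schema before loading).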
Posted 4 weeks ago
2.0 - 4.0 years
7 - 9 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
POSITION: Senior Data Engineer / Data Engineer
LOCATION: Bangalore/Mumbai/Kolkata/Gurugram/Hyderabad/Pune/Chennai
EXPERIENCE: 2+ years
JOB TITLE: Senior Data Engineer / Data Engineer

OVERVIEW OF THE ROLE:
As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in the implementation and automation of scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security.

HASHEDIN BY DELOITTE 2025

Mandatory Skills:
- Hands-on software coding or scripting for a minimum of 3 years
- Experience in product management for at least 2 years
- Stakeholder management experience for at least 3 years
- Experience in one of the GCP, AWS, or Azure cloud platforms

Key Responsibilities:
- Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi).
- Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks.
- Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads.
- Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infra-as-code).
- Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks.
- Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation.
- Collaborate with Data Scientists, Analysts, and DevOps engineers to ingest, structure, and expose structured, semi-structured, and unstructured data for diverse use-cases.
- Contribute to data modeling, schema design, and data partitioning strategies, and ensure adherence to best practices for performance and cost optimization.
- Implement, document, and extend data lineage, cataloging, and observability through tools such as AWS Glue, Azure Purview, Amundsen, or open-source technologies.
- Apply and enforce data security, privacy, and compliance requirements (e.g., access control, data masking, retention policies, GDPR/CCPA).
- Take ownership of the end-to-end data pipeline lifecycle: design, development, code reviews, testing, deployment, operational monitoring, and maintenance/troubleshooting.
- Contribute to frameworks, reusable modules, and automation to improve development efficiency and maintainability of the codebase.
- Stay abreast of industry trends and emerging technologies, participating in code reviews, technical discussions, and peer mentoring as needed.

Skills & Experience:
- Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar).
- Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.).
- Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity).
- Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming).
- Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro).
- Strong SQL development skills for ETL, analytics, and performance optimization.
- Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments.
- Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing.
- Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management.
- Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA).
- Familiarity with BI or visualization tools (PowerBI, Tableau, Looker, etc.) is an advantage but not core.
- Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus.
- Bonus: exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes.

Professional Attributes:
- Strong analytical and problem-solving skills; attention to detail and commitment to code quality and documentation.
- Ability to communicate technical designs and issues effectively with team members and stakeholders.
- Proven self-starter, fast learner, and collaborative team player who thrives in dynamic, fast-paced environments.
- Passion for mentoring, sharing knowledge, and raising the technical bar for data engineering practices.

Desirable Experience:
- Contributions to open-source data engineering/tools communities.
- Implementing data cataloging, stewardship, and data democratization initiatives.
- Hands-on work with DataOps/DevOps pipelines for code and data.
- Knowledge of ML pipeline integration (feature stores, model serving, lineage/monitoring integration) is beneficial.

EDUCATIONAL QUALIFICATIONS:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
- Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks).
- Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes.
- Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus.
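The data quality responsibilities this role lists (validation, anomaly detection, reconciliation) are often implemented as rule-based batch checks. A tiny illustrative sketch of the idea that frameworks such as Great Expectations or Deequ, named in the posting, provide declaratively; the rule and field names here are hypothetical:

```python
def check_batch(rows, rules):
    """Return a list of (row_index, rule_name) violations.

    `rules` maps a rule name to a predicate over a row dict: a tiny
    stand-in for declarative data quality frameworks.
    """
    violations = []
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                violations.append((i, name))
    return violations

# Hypothetical rules over hypothetical records:
rules = {
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "id_present": lambda r: r.get("id") is not None,
}
rows = [{"id": 1, "amount": 5}, {"id": None, "amount": -2}]
violations = check_batch(rows, rules)
```

In a pipeline, a non-empty violation list would typically fail the run or route rows to quarantine rather than silently loading them.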
Posted 1 month ago
12.0 - 20.0 years
35 - 40 Lacs
Navi Mumbai
Work from Office
Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities:
- Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
- Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
- Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
- Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions.
- Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
- Lead by example in object-oriented development, particularly using Scala and Java.
- Translate complex requirements into clear, actionable technical tasks for the team.
- Contribute to the development of ETL processes for integrating data from various sources.
- Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications:
- 8+ years of professional experience in Big Data development and engineering.
- Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
- Solid object-oriented development experience with Scala and Java.
- Strong SQL skills with experience working with large data sets.
- Practical experience designing, installing, configuring, and supporting Big Data clusters.
- Deep understanding of ETL processes and data integration strategies.
- Proven experience mentoring or supporting junior engineers in a team setting.
- Strong problem-solving, troubleshooting, and analytical skills.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
- Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
- Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer:
- Opportunity to work on challenging, high-impact Big Data projects.
- Leadership role in shaping and mentoring the next generation of engineers.
- Supportive and collaborative team culture.
- Flexible working environment.
- Competitive compensation and professional growth opportunities.
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Chennai
Work from Office
Skills: Medical Coding, Healthcare, CPT, ICD-9, EMR, Medical Billing, Healthcare Management, Revenue Cycle, ICD-10, HIPAA
Education Qualification: No data available
Certification: No data available

Role Description Overview: The Coder is accountable for managing the day-to-day activities of coding patient charts and diagnosis reports.

Responsibility Areas:
- Coding or auditing charts, based on requirements
- Updating/clearing the production/pending reports
- Working closely with the team leader
- Reviewing emails for any updates
- Identifying issues and escalating them to the immediate supervisor
- Strict adherence to company policies and procedures
- Sound knowledge of medical coding concepts
- Should have 6 months to 3 years of coding experience
- Understanding the client requirements and specifications of the project
- Meeting the productivity targets of clients within the stipulated time (daily and monthly)
- Applying the instructions/updates received from the client during production
- Preparing and maintaining reports
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Skills: Medical Coding, Healthcare, HIPAA, CPT, ICD-9, EMR, Medical Billing, Healthcare Management, Revenue Cycle, ICD-10
Education Qualification: No data available
Certification: No data available

Responsibility Areas:
- Coding or auditing charts, based on requirements
- Updating/clearing the production/pending reports
- Working closely with the team leader
- Reviewing emails for any updates
- Identifying issues and escalating them to the immediate supervisor
- Strict adherence to company policies and procedures
- Sound knowledge of medical coding concepts
- Should have 6 months to 3 years of coding experience
- Understanding the client requirements and specifications of the project
- Meeting the productivity targets of clients within the stipulated time (daily and monthly)
- Applying the instructions/updates received from the client during production
- Preparing and maintaining reports
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Chennai
Work from Office
Skills: Medical Coding, Healthcare, HIPAA, CPT, ICD-9, EMR, Medical Billing, Healthcare Management, Revenue Cycle, ICD-10
Education Qualification: No data available
Certification: No data available

Role Description Overview: The Coder is accountable for managing the day-to-day activities of coding patient charts and diagnosis reports.

Responsibility Areas:
- Coding or auditing charts, based on requirements
- Updating/clearing the production/pending reports
- Working closely with the team leader
- Reviewing emails for any updates
- Identifying issues and escalating them to the immediate supervisor
- Strict adherence to company policies and procedures
- Sound knowledge of medical coding concepts
- Should have 6 months to 3 years of coding experience
- Understanding the client requirements and specifications of the project
- Meeting the productivity targets of clients within the stipulated time (daily and monthly)
- Applying the instructions/updates received from the client during production
- Preparing and maintaining reports
Posted 1 month ago
1.0 - 2.0 years
2 - 4 Lacs
Noida
Work from Office
Role & Responsibilities:
- Follow up with insurance companies to check on claim status.
- Identify denial reasons and work on resolution.
- Prevent claims from being written off through timely follow-up.
- Insurance collections and insurance ageing.
- Prepare various AR reports, such as ageing reports and collection reports.
- Analyze claims.
- Initiate telephone calls to insurance companies to request the status of claims in queue, follow up on past-due invoices, and establish payment arrangements.
- Meet quality and productivity standards.
- Process health insurance claims.
- Contact insurance companies for further explanation of denials and underpayments.
- Take appropriate action on claims to guarantee resolution.
- Audit claims and ensure accurate and timely follow-up where required.
- Review denials to determine the necessary steps for claim review.
- Respond to client inquiries via phone and email regarding account or software issues.

NOTE: This role is open only to candidates from Noida/Ghaziabad/Mayur Vihar/New Ashok Nagar/Laxmi Nagar/Vinod Nagar/Ghazipur/Khora.

Perks and benefits
Posted 1 month ago
1.0 - 6.0 years
2 - 3 Lacs
Patna
Work from Office
GNM or BSc Nursing
- Patient care; treat and medicate as per protocols
- Maintain accurate and detailed patient records
- Assist in medical procedures and interventions
- Communicate effectively with patients, families, and the team
Call +919815295303, leadermaker.rv@gmail.com

Required Candidate Profile:
- GNM or BSc Nursing
- Strong clinical knowledge
- Passionate and patient-centric approach
- Good communication and teamwork skills
- Ability to handle emergencies
Call +919815295303, leadermaker.rv@gmail.com
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
- Develop efficient software code leveraging the Spark framework, using Python or Scala and Big Data technologies, for the various use cases built on the platform.
- Develop streaming pipelines.
- Work with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Cloud Data Platforms on AWS
- Experience with AWS EMR, AWS Glue, Databricks, AWS Redshift, and DynamoDB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience:
- Certification in AWS, and Databricks or Cloudera Spark certified developers
Posted 1 month ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions and acting as a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and its associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- A total of 5-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills
- Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Cloud Data Platforms on AWS
- Exposure to streaming solutions and message brokers such as Kafka
- Experience with AWS EMR, AWS Glue, Databricks, AWS Redshift, and DynamoDB
- Good to excellent SQL skills

Preferred technical and professional experience:
- Certification in AWS, and Databricks or Cloudera Spark certified developers
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Responsibilities:
- Develop key features and enhance the platform's performance, scalability, and security
- Design and implement secure, efficient APIs to support communication and data flow
- Optimize database performance and adhere to data standards
- Collaborate with team members to align on technical and business requirements
- Contribute to the maintenance of CI/CD processes and proactive monitoring
- Stay updated on advancements in PHP, Laravel, AWS, and healthcare technology

Requirements:
- Experience with PHP frameworks, particularly Laravel, and a good understanding of modern development practices
- Knowledge of optimizing database performance, including query optimization and indexing
- Experience in designing and implementing RESTful APIs with authentication standards such as OAuth 2.0
- Understanding of software architecture patterns, including MVC and microservices, to ensure maintainable and scalable code

CI/CD and Monitoring Skills:
- Familiarity with CI/CD processes and tools, with experience in automating testing and deployment
- Knowledge of monitoring tools for system health and performance

Core Infrastructure Skills:
- Practical experience in deploying, managing, and scaling applications using AWS services like EC2, RDS, and S3
- Basic experience managing cloud-based architectures, including EC2 instances and load balancing
- Familiarity with containerization technologies such as Docker for application deployment

Nice to Have:
- Experience with front-end development
- Basic knowledge of front-end technologies (HTML, CSS, JavaScript) for collaborative development

Integration Skills:
- Experience integrating third-party APIs, such as EMR systems or communication tools like Twilio

Regulatory Awareness:
- Understanding of regulatory standards impacting healthcare technology, such as HIPAA compliance

Communication and Problem-Solving:
- Strong communication skills to work effectively with team members and stakeholders
- Ability to analyze and troubleshoot technical issues to improve application performance
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Hiring for a US-based multinational company (MNC).

We are seeking a skilled and detail-oriented Data Engineer to join our team. In this role, you will design, build, and maintain scalable data pipelines and infrastructure to support business intelligence, analytics, and machine learning initiatives. You will work closely with data scientists, analysts, and software engineers to ensure that high-quality data is readily available and usable.

Responsibilities:
- Design and implement scalable, reliable, and efficient data pipelines for processing and transforming large volumes of structured and unstructured data.
- Build and maintain data architectures, including databases, data warehouses, and data lakes.
- Collaborate with data analysts and scientists to support their data needs and ensure data integrity and consistency.
- Optimize data systems for performance, cost, and scalability.
- Implement data quality checks, validation, and monitoring processes.
- Develop ETL/ELT workflows using modern tools and platforms.
- Ensure data security and compliance with relevant data protection regulations.
- Monitor and troubleshoot production data systems and pipelines.

Requirements:
- Proven experience as a Data Engineer or in a similar role
- Strong proficiency in SQL and at least one programming language such as Python, Scala, or Java
- Experience with data pipeline tools such as Apache Airflow, Luigi, or similar
- Familiarity with modern data platforms and tools:
  - Big Data: Hadoop, Spark
  - Data Warehousing: Snowflake, Redshift, BigQuery, Azure Synapse
  - Databases: PostgreSQL, MySQL, MongoDB
- Experience with cloud platforms (AWS, Azure, or GCP)
- Knowledge of data modeling, schema design, and ETL best practices
- Strong analytical and problem-solving skills
Posted 1 month ago
10.0 - 15.0 years
22 - 37 Lacs
Bengaluru
Work from Office
Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. As an AWS Data Engineer at Kyndryl, you will be responsible for designing, building, and maintaining scalable, secure, and high-performing data pipelines using AWS cloud-native services. This role requires extensive hands-on experience with both real-time and batch data processing, expertise in cloud-based ETL/ELT architectures, and a commitment to delivering clean, reliable, and well-modeled datasets. Key Responsibilities: Design and develop scalable, secure, and fault-tolerant data pipelines utilizing AWS services such as Glue, Lambda, Kinesis, S3, EMR, Step Functions, and Athena. Create and maintain ETL/ELT workflows to support both structured and unstructured data ingestion from various sources, including RDBMS, APIs, SFTP, and Streaming. Optimize data pipelines for performance, scalability, and cost-efficiency. Develop and manage data models, data lakes, and data warehouses on AWS platforms (e.g., Redshift, Lake Formation). Collaborate with DevOps teams to implement CI/CD and infrastructure as code (IaC) for data pipelines using CloudFormation or Terraform. Ensure data quality, validation, lineage, and governance through tools such as AWS Glue Data Catalog and AWS Lake Formation. Work in concert with data scientists, analysts, and application teams to deliver data-driven solutions. 
- Monitor, troubleshoot, and resolve issues in production pipelines.
- Stay abreast of AWS advancements and recommend improvements where applicable.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- Over 8 years of experience in data engineering
- More than 3 years of experience with the AWS data ecosystem
- Strong experience with PySpark, SQL, and Python
- Proficiency in AWS services: Glue, S3, Redshift, EMR, Lambda, Kinesis, CloudWatch, Athena, Step Functions
- Familiarity with data modelling concepts, dimensional models, and data lake architectures
- Experience with CI/CD, GitHub Actions, CloudFormation/Terraform
- Understanding of data governance, privacy, and security best practices
- Strong problem-solving and communication skills

Preferred Skills and Experience
- Experience working as a Data Engineer and/or in cloud modernization.
- Experience with AWS Lake Formation and Data Catalog for metadata management.
- Knowledge of Databricks, Snowflake, or BigQuery for data analytics.
- AWS Certified Data Engineer or AWS Certified Solutions Architect is a plus.
- Strong problem-solving and analytical thinking.
- Excellent communication and collaboration abilities.
- Ability to work independently and in agile teams.
- A proactive approach to identifying and addressing challenges in data workflows.

Being You
Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
Posted 1 month ago
0.0 - 5.0 years
1 - 3 Lacs
Chennai
Work from Office
Role & responsibilities
- Taking care of accident & emergency care patients.
- Interacting with patients & attendants to coordinate and work as a team.

Preferred candidate profile
- Good communication skills.
- Willing to work in rotational shifts.
- Qualification: Bachelor's or Diploma degree in Accident & Emergency Care or Trauma Care.
- Timing: 10 AM to 12 PM (Mon to Fri)
- Experience: 0 to 5 years
Posted 1 month ago
5.0 - 10.0 years
7 - 17 Lacs
Hyderabad, Pune, Chennai
Work from Office
Airflow Data Engineer (AWS Platform)
Job Title: Apache Airflow Data Engineer (role as per TCS Role Master)
• 4-8 years of experience in AWS, Apache Airflow (on the Astronomer platform), Python, PySpark, SQL
• Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement.
• Experience in creating data pipelines and orchestrating them using Apache Airflow
• Significant experience with data migrations and development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes and Data Marts.
• Good to have: Experience with cloud ETL and ELT in one of the tools like DBT, Glue, EMR, or Matillion, or any other ELT tool
• Excellent communication skills to liaise with Business & IT stakeholders.
• Expertise in planning project execution and effort estimation.
• Exposure to Agile ways of working.
Candidates for this position will be offered employment with TAIC or TCSL as the entity.
Keywords: data warehousing, PySpark, GitHub, AWS data platform, Glue, EMR, Redshift, Databricks, Data Marts, DBT/Glue/EMR or Matillion, data engineering, data modelling, data consumption
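The posting above centers on creating data pipelines and orchestrating them with Apache Airflow. As a rough, illustrative sketch (not part of the posting), the common pattern is to keep pipeline steps as plain Python callables and let Airflow wire them into tasks; the source data, function names, and DAG details below are all hypothetical.

```python
# Plain-Python task callables; in Airflow these would become @task-decorated
# functions or PythonOperator callables.
def extract():
    # Hypothetical source: in practice this might read from S3 or an RDBMS.
    return [{"id": 1, "amount": "120.50"}, {"id": 2, "amount": "80.00"}]

def transform(rows):
    # Cast string amounts to floats and drop malformed rows.
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue
    return clean

def load(rows):
    # Placeholder for a warehouse write (e.g. Redshift); returns a row count.
    return len(rows)

# Hypothetical Airflow wiring (requires apache-airflow to be installed):
#
# from datetime import datetime
# from airflow.decorators import dag, task
#
# @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
# def sales_pipeline():
#     load(transform(extract()))   # TaskFlow infers the task dependencies
#
# sales_pipeline()

if __name__ == "__main__":
    print(load(transform(extract())))  # run the pipeline once, locally
```

Keeping the callables pure like this also makes each step unit-testable without an Airflow scheduler running.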
Posted 1 month ago
4.0 - 9.0 years
8 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in Epic Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.

Roles & Responsibilities:
• The Epic Analyst will provide primary support for their designated application/module.
• Take on more advanced issues that arise during the project for their application area, along with more complex tasks in system configuration, testing and administration.
• Provide ongoing system support and maintenance based on the support roster.
• Respond in a timely manner to system issues and requests.
• Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues.
• Handle and deliver Service Requests / Change Requests / New Builds.
• Perform system monitoring (error queues, alerts, batch jobs, etc.) and execute the required actions or SOPs.
• Perform/support regular and periodic system patching, maintenance and verification.
• Perform/support planned system upgrade work, cutover to production, and post-cutover support and stabilization.
• Perform/support the work required to comply with audit and security requirements.
• Required to overlap with client business or office hours.
• Comply with compliance requirements as mandated by the project.

Professional & Technical Skills:
- Must-Have Skills: Certified in Epic modules (RWB, EpicCare Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB)
- Experience in troubleshooting and resolving application issues.
Email me - maya@mounttalent.com
Posted 1 month ago
5.0 - 10.0 years
5 - 15 Lacs
Gurugram, Bengaluru, Mumbai (All Areas)
Hybrid
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must-have skills: AWS Big Data
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Lead, you will be responsible for developing and configuring software systems either end-to-end or for a specific stage of the product lifecycle. You will apply your knowledge of technologies, applications, methodologies, processes, and tools to support clients, projects, or entities. Your typical day will involve collaborating with the team, making team decisions, engaging with multiple teams, and providing solutions to problems for your immediate team and across multiple teams. You will also contribute to key decisions and ensure the successful execution of projects.
Roles & Responsibilities:
- AWS EMR, Glue, S3, Python/PySpark
- Resource must have SQL experience
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the development and configuration of software systems
- Ensure the successful execution of projects
- Contribute to the improvement of processes and methodologies

Professional & Technical Skills:
- Must-Have Skills: Proficiency in AWS Big Data
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
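The skills list above names data munging: cleaning, transformation, and normalization. A tiny, self-contained illustration of those steps (the record schema and field names are made up for the example; production work would typically run this logic in PySpark on EMR or Glue):

```python
def clean(records, required=("user_id", "score")):
    """Drop records missing any required field (data cleaning)."""
    return [r for r in records if all(k in r and r[k] is not None for k in required)]

def normalize(records, field="score"):
    """Min-max normalize one numeric field to [0, 1] (normalization)."""
    values = [r[field] for r in records]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero when all values are equal
    return [{**r, field: (r[field] - lo) / span} for r in records]

raw = [
    {"user_id": 1, "score": 10},
    {"user_id": 2, "score": None},   # dropped by clean()
    {"user_id": 3, "score": 30},
]
prepared = normalize(clean(raw))  # → scores rescaled to 0.0 and 1.0
```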
Posted 1 month ago
4.0 - 7.0 years
9 - 12 Lacs
Pune
Hybrid
So, what’s the role all about?
At NiCE, a Senior Software professional specializes in designing, developing, and maintaining applications and systems using the Java programming language, playing a critical role in building scalable, robust, and high-performing applications for a variety of industries, including finance, healthcare, technology, and e-commerce.

How will you make an impact?
- Working knowledge of unit testing
- Working knowledge of user stories or use cases
- Working knowledge of design patterns or equivalent experience
- Working knowledge of object-oriented software design
- Team player

Have you got what it takes?
- Bachelor’s degree in Computer Science, Business Information Systems or a related field, or equivalent work experience, is required.
- 4+ years of (SE) experience in software development
- Well-established technical problem-solving skills
- Experience in Java, Spring Boot and microservices
- Experience with Kafka, Kinesis, KDA, Apache Flink
- Experience in Kubernetes operators, Grafana, Prometheus
- Experience with AWS technology, including EKS, EMR, S3, Kinesis, Lambda, Firehose, IAM, CloudWatch, etc.

You will have an advantage if you also have:
- Experience with Snowflake or any DWH solution
- Excellent communication, problem-solving and decision-making skills
- Experience in databases
- Experience in CI/CD: Git, GitHub Actions, Jenkins-based pipeline deployments
- Strong experience in SQL

What’s in it for you?
Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 6965
Reporting into: Tech Manager
Role Type: Individual Contributor
Posted 1 month ago
4.0 - 6.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!!
Role: AWS Data Engineer
Experience Required: 4 to 6 yrs
Work Location: Bangalore/Pune/Hyderabad/Chennai
Required Skills: PySpark, AWS Glue
Interested candidates can send resumes to nandhini.spstaffing@gmail.com
Posted 1 month ago
2.0 - 5.0 years
14 - 17 Lacs
Mumbai
Work from Office
Experience with Scala object-oriented/functional programming. Strong SQL background. Experience in Spark SQL and Hive as a Data Engineer. Experience with data pipelines & data lakes. Strong background in distributed computing.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- SQL experience with data pipelines & data lakes
- Strong background in distributed computing
- Experience with Scala object-oriented/functional programming
- Strong SQL background

Preferred technical and professional experience
- Core Scala development experience
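The posting above leans heavily on SQL for data pipelines. As a small, illustrative example of the kind of aggregation query such roles involve (run here against in-memory SQLite so the SQL itself is the focus; the `orders` table and its columns are invented, and a Spark SQL or Hive engine would accept the same query shape):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('south', 100.0), ('south', 50.0), ('north', 75.0);
""")

# Group-and-aggregate: order counts and revenue per region, largest first.
rows = conn.execute("""
    SELECT region, COUNT(*) AS n, SUM(amount) AS total
    FROM orders
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
conn.close()
```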
Posted 1 month ago
3.0 - 8.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Health Insurance Operations
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the application development process and ensure successful implementation.

Roles & Responsibilities:
- Facets skill experience is mandatory
- Participate in code reviews and quality gate definitions.
- Collaborate with the development team to complete unit testing.
- Develop strategic plans for testing efforts and create test estimates.
- Define and build reusable testing assets for large/complex projects.
- Provide technical leadership and support the creation of complex tests.
- Identify and describe appropriate test techniques and supporting tools.
- Define and maintain a Test Automation Architecture.
- Specify and verify the required Test Environment Configurations.
- Verify and assess the Test Approach.
- Define and carry out plans and strategies for performance risk management of business products.
- Inspire developers, designers, and product owners to be quality conscious by providing extensive training and workshops about testing culture and best practices.
- Plan and prioritize different strategies according to business needs.
- Improve quality practices across functional and non-functional testing.
Professional & Technical Skills:
- 5+ years of experience in FACETS development and customization.
- Proficiency in SQL, PL/SQL, and FACETS extensions.
- Familiarity with healthcare EDI transactions (837, 835, 270/271, 276/277, etc.).
- Strong understanding of healthcare domain standards and HIPAA compliance.

Preferred Skills:
- Experience with .NET or Java technologies.
- Knowledge of FACETS workflow management and integration frameworks.
- Understanding of Agile/Scrum development methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Ready to work in shifts: 12 PM to 10 PM

Additional Information:
- The candidate should have a minimum of 3 years of experience in Health Insurance Operations.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
Posted 1 month ago
4.0 - 9.0 years
20 - 25 Lacs
Gurugram
Work from Office
Job Title: S&C Global Network - AI - Healthcare Analytics - Consultant
Management Level: 9 - Team Lead/Consultant
Location: Bangalore/Gurgaon
Must-have skills: R, Python, SQL, Spark, Tableau, Power BI
Good-to-have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills.

Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.

Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance.

WHAT'S IN IT FOR YOU
- An opportunity to work on high-visibility projects with top Pharma clients around the globe.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy & consulting acumen and grow your skills, industry knowledge, and capabilities.
- Opportunity to thrive in a culture that is committed to accelerating equality for all.
- Engage in boundaryless collaboration across the entire organization.

What you would do in this role
- Support delivery of small to medium-sized teams to deliver consulting projects for global clients.
- Responsibilities may include strategy, implementation, process design, and change management for specific modules.
- Work with the team, or as an individual contributor, on the assigned project, which draws on a variety of skills from data engineering to data science.
- Provide subject matter expertise in various sub-segments of the LS industry.
- Develop assets and methodologies, points of view, research, or white papers for use by the team and the larger community.
- Acquire new skills that have utility across industry groups.
- Support strategies and operating models focused on some business units and assess likely competitive responses. Also, assess implementation readiness and points of greatest impact.
- Co-lead proposals and business development efforts, and coordinate with other colleagues to create consensus-driven deliverables.
- Execute a transformational change plan aligned with the client's business strategy and context for change. Engage stakeholders in the change journey and build commitment to change.
- Make presentations wherever required to a known audience or client on functional aspects of his or her domain.

Who are we looking for?
- Bachelor's or Master's degree in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, Information Systems, or another quantitative field.
- Proven experience (4+ years) in working on Life Sciences/Pharma/Healthcare projects and delivering successful outcomes.
- Excellent understanding of Pharma data sets: commercial, clinical, RWE (Real World Evidence) & EMR (Electronic Medical Records).
- Leverage one's hands-on experience of working across one or more of these areas, such as real-world evidence data, R&D clinical data, and digital marketing data.
- Hands-on experience with handling datasets like Komodo, RAVE, IQVIA, Truven, Optum, etc.
- Hands-on experience in building and deployment of statistical models/machine learning, including segmentation & predictive modeling, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization.
- Proficiency in programming languages such as R, Python, SQL, Spark, etc.
- Ability to work with large data sets and present findings/insights to key stakeholders; data management using databases like SQL.
- Experience with any of the cloud platforms like AWS, Azure, or Google Cloud for deploying and scaling language models.
- Experience with any of the data visualization tools like Tableau, Power BI, QlikView, or Spotfire is good to have.
- Excellent analytical and problem-solving skills, with a data-driven mindset.
- Proficient in Excel, MS Word, PowerPoint, etc.
- Ability to solve complex business problems and deliver client delight.
- Strong writing skills to build points of view on current industry trends.
- Good communication, interpersonal, and presentation skills.

Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.

Additional Information:
- Opportunity to work on innovative projects.
- Career growth and leadership exposure.

About Our Company | Accenture
Qualification
Experience: 4-8 Years
Educational Qualification: Bachelor's or Master's degree in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, Information Systems, or another quantitative field.
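The requirements above name building statistical models such as predictive modeling. A minimal, self-contained sketch of one such technique, simple linear regression fit by ordinary least squares, in pure Python (real projects would use R, scikit-learn, or Spark MLlib; the sample data is invented):

```python
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error for y ~ m*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is covariance(x, y) / variance(x); intercept pins the line
    # through the point of means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# The sample points lie exactly on y = 2x + 1, so the fit recovers that line.
slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```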
Posted 1 month ago