3.0 - 6.0 years
3 - 7 Lacs
Noida
Work from Office
Job Responsibilities: Estimates and develops scalable solutions using .Net technologies in a highly collaborative agile environment, with strong experience in C#, ASP.NET Core and Web API. Maintains relevant documentation around the solutions. Conducts code reviews and ensures SOLID principles and standard design patterns are applied to system architectures and implementations. Evaluates, understands and recommends new technologies, languages or development practices that offer benefits worth implementing. Collaborates with the Agile practitioners to help avoid distractions for the team, so that the team stays focused on delivering its sprint commitments. Drives adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation and performance engineering to deliver high-quality, high-value software. Identifies and delivers reusable components, or decouples components from the existing code base to build a framework. Leads code reviews with other team members. Fosters a culture and mindset of continuous learning to develop agility, using the three pillars of transparency, inspection and adaptation across levels and geographies. Mentors other members of the development team. Leads sessions with scrum team members to structure solution source code and design implementation approaches, optimizing for code that follows engineering best practices and maximizes maintainability, testability and performance. Relevant exposure to agile ways of working, preferably Scrum and Kanban.
Job Specification: B.E/B.Tech/MCA or equivalent professional degree. 3-6 years of experience designing and developing n-tier web applications using .NET Framework, .NET Core, ASP.NET, WCF and C#, MVC 4/5 web development, RESTful API services, Web API and JSON. Well versed with C#, modern UI technologies and database/ORM technologies. Must have a solid understanding of modern architectural and design patterns. Comprehensive knowledge of automation testing and modern testing practices, e.g. TDD, BDD. Strong exposure to one or more CI/CD implementations using Jenkins and Docker containerization. Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence. Excellent communicator with a demonstrable ability to influence decisions. Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA and FHIR would be preferred. A good working understanding of application architecture concepts like microservices, Domain-Driven Design, broker pattern/message bus, event-driven, CQRS, ports & adapters/hexagonal/onion and SOA would be preferred. Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
Posted 3 weeks ago
8.0 - 13.0 years
10 - 15 Lacs
Noida
Work from Office
About R1: R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and other international locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, our employees, and the communities we operate in. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience. Our approach to building software is disciplined and quality-focused, with an emphasis on creativity, craftsmanship and commitment. We are looking for smart, quality-minded individuals who want to be part of a high-functioning, dynamic global team. Position summary: You will manage and oversee the development and deployment of high-quality software products. You will ensure that the development teams adopt and follow modern engineering practices to deliver a high-quality, high-value product. You will be responsible for working with different stakeholders to accomplish business and software engineering goals. You will improve the team's capabilities, improve engagement and minimize business risks. Key duties & responsibilities: Develop high-performing teams that are equipped with the right capabilities in terms of skills, tools, technology, and resources to continuously deliver high-quality and high-value software. Collaborate with the Agile practitioners to help avoid distractions for the team, so that the team is focused on delivering its sprint commitments.
Drive adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation and performance engineering to deliver high-quality, high-value software for 1-2 scrum teams. Craft individual development plans for team members and provide growth opportunities. Act as a key communication channel between the team and senior leadership. Assess and provide team members with timely feedback, and conduct 360-degree feedback for self- and team assessment. Foster a mindset of keeping customers' needs first and learning continuously. Qualification: B.E/B.Tech/MCA or equivalent professional degree. Experience, Skills and Knowledge: 8+ years of experience in building web-based enterprise software using the Microsoft .NET technology stack. Demonstrable experience of leading teams of highly skilled software engineers (8-12 team members) and working successfully across cultures. Must have a solid understanding of modern architectural and design patterns. Comprehensive knowledge of automation testing and modern testing practices, e.g. TDD, BDD. Well versed with C#, modern UI technologies and database/ORM technologies. Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence. Excellent communicator with a demonstrable ability to influence decisions. Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA and FHIR would be preferred. Key competency profile: Spot new opportunities by anticipating change and planning accordingly. Find ways to better serve customers and patients.
Be accountable for customer service of the highest quality. Create connections across teams by valuing differences and including others. Own your development by implementing and sharing your learnings. Motivate each other to perform at our highest level. Help people improve by learning from successes and failures. Work the right way by acting with integrity and living our values every day. Succeed by proactively identifying problems and solutions for yourself and others. Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
Posted 3 weeks ago
3.0 - 6.0 years
3 - 7 Lacs
Noida
Work from Office
R1 RCM India is proud to be recognized among India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India, with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities. Job Responsibilities: Estimates and develops scalable solutions using .Net technologies in a highly collaborative agile environment, with strong experience in C#, ASP.NET Core and Web API. Maintains relevant documentation around the solutions. Conducts code reviews and ensures SOLID principles and standard design patterns are applied to system architectures and implementations. Evaluates, understands and recommends new technologies, languages or development practices that offer benefits worth implementing. Collaborates with the Agile practitioners to help avoid distractions for the team, so that the team stays focused on delivering its sprint commitments. Drives adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation and performance engineering to deliver high-quality, high-value software. Identifies and delivers reusable components, or decouples components from the existing code base to build a framework. Leads code reviews with other team members. Fosters a culture and mindset of continuous learning to develop agility, using the three pillars of transparency, inspection and adaptation across levels and geographies. Mentors other members of the development team.
Leads sessions with scrum team members to structure solution source code and design implementation approaches, optimizing for code that follows engineering best practices and maximizes maintainability, testability and performance. Relevant exposure to agile ways of working, preferably Scrum and Kanban. Job Specification: B.E/B.Tech/MCA or equivalent professional degree. 3-6 years of experience designing and developing n-tier web applications using .NET Framework, .NET Core, ASP.NET, WCF and C#, MVC 4/5 web development, RESTful API services, Web API and JSON. Well versed with C#, modern UI technologies and database/ORM technologies. Must have a solid understanding of modern architectural and design patterns. Comprehensive knowledge of automation testing and modern testing practices, e.g. TDD, BDD. Strong exposure to one or more CI/CD implementations using Jenkins and Docker containerization. Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence. Excellent communicator with a demonstrable ability to influence decisions. Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA and FHIR would be preferred. A good working understanding of application architecture concepts like microservices, Domain-Driven Design, broker pattern/message bus, event-driven, CQRS, ports & adapters/hexagonal/onion and SOA would be preferred. Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care.
We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
Posted 3 weeks ago
2.0 - 6.0 years
5 - 14 Lacs
Gurugram
Work from Office
Job Title: Virtual Medical Assistant (Offshore) Position Type: Full-Time Reports To: Practice Manager or Clinical Operations Lead Position Summary: We are seeking a detail-oriented and dependable Virtual Medical Assistant to support clinical operations by handling essential administrative and coordination tasks. This role is critical in maintaining accurate medical records, ensuring effective communication with primary care providers, and coordinating patient care through timely appointment scheduling. Key Responsibilities: Lab Entry: Accurately input laboratory results and related information into the Electronic Medical Record (EMR) system in a timely and organized manner. Fax Coordination: Send patient progress notes and relevant documentation to primary care physicians and specialists, ensuring compliance with privacy standards (e.g., HIPAA-equivalent). Patient Outreach: Call patients to schedule, reschedule, or confirm appointments while maintaining professionalism and excellent customer service. Documentation: Ensure all interactions and actions are properly documented in the EMR. Other Duties as Assigned: Perform additional administrative or clinical coordination tasks as requested by the clinical team or supervisor. Qualifications: Prior experience as a Virtual Medical Assistant, Medical Receptionist, or similar healthcare support role preferred Familiarity with EMR systems (Athena, Epic, or similar) Strong written and verbal communication skills in English Ability to work independently and manage time effectively in a remote setting High level of attention to detail and accuracy Reliable internet connection and private, professional work environment Preferred Skills: Previous experience supporting U.S.-based medical practices Understanding of medical terminology and clinical documentation Customer service experience, particularly in healthcare
Posted 3 weeks ago
8.0 - 12.0 years
20 - 25 Lacs
Pune
Work from Office
Designation: Big Data Lead/Architect. Location: Pune. Experience: 8-10 years. Notice Period: immediate joiner / 15-30 days. Reports To: Product Engineering Head. Job Overview: We are looking to hire a talented big data engineer to develop and manage our company's Big Data solutions. In this role, you will be required to design and implement Big Data tools and frameworks, implement ELT processes, collaborate with development teams, build cloud platforms, and maintain the production system. To ensure success as a big data engineer, you should have in-depth knowledge of Hadoop technologies, excellent project management skills, and high-level problem-solving skills. A top-notch big data engineer understands the needs of the company and institutes scalable data solutions for its current and future needs. Responsibilities: Meeting with managers to determine the company's Big Data needs. Developing big data solutions on AWS using Apache Spark, Databricks, Delta Tables, EMR, Athena, Glue, Hadoop, etc. Loading disparate data sets and conducting pre-processing services using Athena, Glue, Spark, etc. Collaborating with the software research and development teams. Building cloud platforms for the development of company applications. Maintaining production systems. Requirements: 8-10 years of experience as a big data engineer. Must be proficient with Python and PySpark. In-depth knowledge of Hadoop, Apache Spark, Databricks, Delta Tables, and AWS data analytics services. Must have extensive experience with Delta Tables and the JSON and Parquet file formats. Good to have experience with AWS data analytics services like Athena, Glue, Redshift, and EMR. Familiarity with data warehousing will be a plus. Must have knowledge of NoSQL and RDBMS databases. Good communication skills. Ability to solve complex data-processing and transformation-related problems.
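The load-and-pre-process flow this role describes (load disparate data sets, clean them, aggregate) can be sketched in plain Python. This is a conceptual illustration only: the posting's actual stack is Spark/Databricks/Glue, and the field names (`id`, `amount`) are invented for the example.

```python
import json

def extract(lines):
    """Parse raw JSON-lines input into records (extract step)."""
    return [json.loads(line) for line in lines]

def transform(records):
    """Drop incomplete rows and normalise amounts to floats (transform step)."""
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in records
        if r.get("id") and r.get("amount") is not None
    ]

def load(records):
    """Stand-in for a write to Delta/Parquet: aggregate totals per id."""
    totals = {}
    for r in records:
        totals[r["id"]] = totals.get(r["id"], 0.0) + r["amount"]
    return totals

raw = ['{"id": "a", "amount": "10.5"}', '{"id": "a", "amount": "2"}',
       '{"id": null, "amount": "99"}']
totals = load(transform(extract(raw)))
```

In a real pipeline each stage would be a Spark job over distributed data; the three-function split simply mirrors the extract/transform/load boundaries the posting refers to.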
Posted 3 weeks ago
5.0 - 10.0 years
12 - 16 Lacs
Noida
Work from Office
Increasing digitalization and flexibility of production processes presents outstanding potential. In Digital Industries, we enable our customers to unlock their full potential and drive digital transformation with a unique portfolio of automation and digitalization technologies. From hardware to software to services, we've got quite a lot to offer. How about you? We blur the boundaries between industry domains by integrating the virtual and physical, hardware and software, design and manufacturing worlds. With the rapid pace of innovation, digitalization is no longer tomorrow's idea. We take what the future promises tomorrow and make it real for our customers today. Join us - where your career meets tomorrow. Siemens EDA is a global technology leader in Electronic Design Automation software. Our software tools enable companies around the world to develop highly innovative electronic products faster and more efficiently. Our customers use our tools to push the boundaries of technology and physics to deliver better products in the increasingly complex world of chip, board, and system design. Questa Simulation Product: a core R&D team working on multiple verticals of simulation, and a very energetic and enthusiastic team of motivated individuals. This role is based in Noida, but you'll also get to visit other locations in India and across the globe, so you'll need to go where this job takes you. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. Responsibilities: We are looking for a highly motivated software engineer to work in the QuestaSim R&D team of Siemens EDA. Development responsibilities will include core algorithmic advances and software design/architecture. You will collaborate with a senior group of software engineers, contributing to the final production-level quality of new components and algorithms, creating new engines and supporting existing code.
Self-motivation, self-discipline and the ability to set personal goals and work consistently towards them in a dynamic environment will go far towards contributing to your success. We are not looking for superheroes, just super minds! We've got quite a lot to offer. How about you? Required Experience: We seek a graduate with at least 5 years of relevant working experience and a B.Tech or M.Tech in CSE/EE/ECE from a reputed engineering college. Proficiency in C/C++, algorithms and data structures. Compiler concepts and optimizations. Experience with UNIX and/or Linux platforms is vital. Basic digital electronics concepts. We value your knowledge of Verilog, SystemVerilog and VHDL. Experience in parallel algorithms and job distribution. Understanding of ML/AI algorithms and their implementation in data-driven tasks. Exposure to simulation- or formal-based verification methodologies would be a plus! The candidate should be self-motivated, able to work independently, and able to guide others towards project completion. Good problem-solving and analytical skills. A collection of over 377,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us shape tomorrow! #LI-EDA #LI-HYBRID #DVT
Posted 3 weeks ago
8.0 - 13.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Hello, Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make optimal use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective processes. We are looking for a Sr. AWS Cloud Architect. Architect and Design: Develop scalable and efficient data solutions using AWS services such as AWS Glue, Amazon Redshift, S3, Kinesis (Apache Kafka), DynamoDB, Lambda, AWS Glue (streaming ETL) and EMR. Integration: Integrate real-time data from various Siemens organizations into our data lake, ensuring seamless data flow and processing. Data Lake Management: Design and manage a large-scale data lake using AWS services like S3, Glue, and Lake Formation. Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency. Snowflake Integration: Implement and manage data pipelines to load data into Snowflake, utilizing Iceberg tables for optimal performance and flexibility. Performance Optimization: Optimize data processing pipelines for performance, scalability, and cost-efficiency. Security and Compliance: Ensure that all solutions adhere to security best practices and compliance requirements. Collaboration: Work closely with cross-functional teams, including data engineers, data scientists, and application developers, to deliver end-to-end solutions. Monitoring and Troubleshooting: Implement monitoring solutions to ensure the reliability and performance of data pipelines; troubleshoot and resolve any issues that arise. You'd describe yourself as: Experience: 8+ years of experience in data engineering or cloud solutioning, with a focus on AWS services.
Technical Skills: Proficiency in AWS services such as AWS API, AWS Glue, Amazon Redshift, S3, Apache Kafka and Lake Formation. Experience with real-time data processing and streaming architectures. Big Data Querying Tools: Strong knowledge of big data querying tools (e.g., Hive, PySpark). Programming: Strong programming skills in languages such as Python, Java, or Scala for building and maintaining scalable systems. Problem-Solving: Excellent problem-solving skills and the ability to troubleshoot complex issues. Communication: Strong communication skills, with the ability to work effectively with both technical and non-technical stakeholders. Certifications: AWS certifications are a plus. Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. Find out more about Siemens careers at
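The streaming pattern this role centres on (consume events, enrich them, land them in the lake) can be sketched without Kafka itself. Everything here is an assumption for illustration: the event shape, the `source` tag, and the in-memory list standing in for a Kinesis/Kafka consumer.

```python
def process_event(event):
    """Enrich one raw event before it is written downstream (hypothetical rules)."""
    return {**event, "source": "siemens-org", "valid": "order_id" in event}

def consume(stream):
    """Simulate a consumer loop over an in-memory stream of events."""
    return [process_event(e) for e in stream]

events = consume([{"order_id": 1, "qty": 2}, {"qty": 5}])
```

A real implementation would replace the list with a Kinesis or Kafka consumer and write the enriched records to S3/Glue, but the per-event transform function is the part that carries the business logic either way.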
Posted 3 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering, BCA, BTech, MBA, MTech, MCA. Service Line: Application Development and Maintenance. Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional: At least 1 year of experience in HL7 FHIR implementation. Deep knowledge of the HL7 FHIR 4.0.1 standard. Knowledge of FHIR implementation guides like Da Vinci, CARIN, US Core, etc.
Experience performing data mapping of source data sets to FHIR resources. Analyzes business needs, defines detailed requirements, and provides potential solutions/approaches with the business stakeholders. Strong experience with and understanding of Agile methodologies. Strong written and oral communication and interpersonal skills. Strong analytical, planning, organizational, time management and facilitation skills. Strong understanding and experience of the SDLC, and documentation skills. Proficiency in the Microsoft suite (Word, Excel, Access, PowerPoint, Project, Visio, Outlook), Microsoft SQL Studio and JIRA. Preferred Skills: Domain-Healthcare-Healthcare - ALL
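The data-mapping requirement above, translating source data sets into FHIR resources, can be illustrated with a minimal sketch. The source field names (`mrn`, `last_name`, etc.) and the mapping rules are hypothetical; a real mapping would follow an implementation guide such as US Core. The output shape, however, follows the standard FHIR R4 Patient resource.

```python
def to_fhir_patient(row):
    """Map a hypothetical flat source record to a FHIR R4 Patient resource."""
    return {
        "resourceType": "Patient",
        "identifier": [{"system": "urn:example:mrn", "value": row["mrn"]}],
        "name": [{"family": row["last_name"], "given": [row["first_name"]]}],
        "gender": row["sex"].lower(),   # FHIR expects lowercase codes
        "birthDate": row["dob"],        # assumed already ISO 8601 in this sketch
    }

patient = to_fhir_patient(
    {"mrn": "12345", "last_name": "Doe", "first_name": "Jane",
     "sex": "Female", "dob": "1980-04-02"}
)
```

In practice the mapping spec is exactly what the analyst in this role produces: a column-by-column table from source fields to FHIR elements, which a function like this then encodes.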
Posted 3 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Pune
Work from Office
BA JD: Primary responsibilities are as follows: 1. Creation of epics and writing of user stories in conjunction with the Product Owner/business teams, including non-functional requirements. 2. Supports the Product Owner/business teams in managing the product backlog. 3. Works with the Product Owner/business teams to document business requirements. 4. Analysis of customer journeys, product features and impact on systems. 5. Process analysis and improvement activities. 6. Assesses operational considerations to support effective solution design. 7. Collaborative interactions with teams from different locations and regions. 8. Functional support to IT teams during the project execution phase. Profile expectations: 1. Postgraduate with a background in Finance, MBA, CA. 2. 7+ years of experience in banks and/or as an IT BA on banking projects (preferably for global banks like HSBC). 3. Knowledge of banking products (deposits, overdraft, loans, payments, basics of finance) and processes (onboarding, fulfilment, operations, reporting). 4. Effective communication skills, both written and verbal, for technical and non-technical audiences. 5. Experience with the Agile delivery model.
Posted 3 weeks ago
10.0 - 13.0 years
12 - 15 Lacs
Bengaluru
Work from Office
About the Opportunity. Job Type: Application, 31 July 2025. Title: Principal Data Engineer (Associate Director). Department: ISS. Location: Bangalore. Reports To: Head of Data Platform - ISS. Grade: 7. Department Description: The ISS Data Engineering Chapter is an engineering group comprised of three sub-chapters - Data Engineers, Data Platform and Data Visualisation - that supports the ISS Department. Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our investment process. These programmes span asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research, and Trading. Purpose of your role: This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform. Key Responsibilities: Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics. Be accountable for technical delivery and take ownership of solutions. Lead a team of senior and junior developers, providing mentorship and guidance. Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress. Drive technical innovation within the department to increase code reusability, code quality and developer productivity. Challenge the status quo by bringing the very latest data engineering practices and techniques.
Essential Skills and Experience. Core Technical Skills: Expert in leveraging cloud-based data platform capabilities (Snowflake, Databricks) to create an enterprise lakehouse. Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services such as Lambda, EMR, MSK, Glue and S3. Experience designing event-based or streaming data architectures using Kafka. Advanced expertise in Python and SQL; open to expertise in Java/Scala, but enterprise experience with Python is required. Expert in designing, building and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation. Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements. Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings. Experience implementing CDC ingestion. Experience using orchestration tools (Airflow, Control-M, etc.). Bonus Technical Skills: Strong experience in containerisation and deploying applications to Kubernetes. Strong experience in API development using Python-based frameworks like FastAPI. Key Soft Skills: Problem-Solving: Leadership experience in problem-solving and technical decision-making. Communication: Strong in strategic communication and stakeholder engagement. Project Management: Experienced in overseeing project lifecycles, working with project managers to manage resources. Feel rewarded: For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
Posted 3 weeks ago
10.0 - 15.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Experience: 8+ years of experience in data engineering, specifically in cloud environments like AWS. Proficiency in PySpark for distributed data processing and transformation. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2. Technical Skills: Proficiency in Python and PySpark for data processing and transformation tasks. Deep understanding of ETL concepts and best practices. Familiarity with AWS Glue (ETL jobs, Data Catalog, and Crawlers). Experience building and maintaining data pipelines with AWS Data Pipeline or similar orchestration tools. Familiarity with AWS S3 for data storage and management, including file formats (CSV, Parquet, Avro). Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with data warehousing and big data technologies, specifically within AWS. Additional Skills: Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with data lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications. Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes. ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets. Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms. Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
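File-format handling of the kind the skills list mentions (CSV landing in S3 before conversion to Parquet) starts with parsing and type-casting, which can be sketched with the standard library. The column names are hypothetical, and in the actual role this step would typically run inside a Glue or PySpark job rather than plain Python.

```python
import csv
import io

def read_orders(csv_text):
    """Parse CSV text and cast columns to proper types before loading."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{"order_id": int(r["order_id"]), "total": float(r["total"])}
            for r in reader]

rows = read_orders("order_id,total\n1,19.99\n2,5.00\n")
```

Casting at the ingestion boundary, rather than downstream, is the design choice that keeps later stages (Redshift loads, Parquet writes) free of string-typed numeric columns.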
Posted 3 weeks ago
2.0 - 7.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering, BCA, BTech, MTech, MBA, MCA. Service Line: Application Development and Maintenance.
Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Domain experience: Payer core - claims/membership/provider management. Domain experience: Provider clinical/RCM, pharmacy benefit management. Healthcare Business Analysts with Agile/SAFe-Agile business analysis experience. Medicaid-experienced Business Analysts. FHIR, HL7 data analysis and interoperability consulting. Healthcare digital transformation consultants with skills/experience in cloud data solutions design, data analysis/analytics, and RPA solution design. Keywords: Claims, Provider, utilization management experience, Pricing, Agile, BA. Preferred Skills: Domain - Healthcare - ALL; Technology - Analytics - Functional - Business Analyst.
Posted 3 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Key responsibilities include the following: Develop and maintain scalable data pipelines using PySpark; proven experience as a developer with expertise in PySpark is required. Knowledge of Ab Initio is good to have. Experience with distributed computing and parallel processing. Proficiency in SQL and experience with database systems. Collaborate with data engineers and data scientists to understand and fulfil data processing needs. Optimize and troubleshoot existing PySpark applications for performance improvements. Write clean, efficient, and well-documented code following best practices. Participate in design and code reviews. Develop and implement ETL processes to extract, transform, and load data. Ensure data integrity and quality throughout the data lifecycle. Stay current with the latest industry trends and technologies in big data and cloud computing.
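"Data integrity and quality throughout the data lifecycle" usually comes down to automated checks run between pipeline stages. A minimal sketch, assuming hypothetical column names (`claim_id`, `amount`) and a row-dict representation rather than a Spark DataFrame:

```python
def check_data_quality(rows, required_columns, key_column):
    """Basic data-quality checks: required columns populated, keys unique.

    Returns a list of (row_index, message) issues; -1 marks a dataset-level issue.
    """
    issues = []
    for i, row in enumerate(rows):
        missing = [c for c in required_columns if row.get(c) in (None, "")]
        if missing:
            issues.append((i, f"missing values: {missing}"))
    keys = [r.get(key_column) for r in rows]
    if len(keys) != len(set(keys)):
        issues.append((-1, "duplicate keys"))
    return issues

rows = [
    {"claim_id": "A1", "amount": 120.0},
    {"claim_id": "A2", "amount": None},   # fails the required-column check
    {"claim_id": "A2", "amount": 75.5},   # duplicates key A2
]
print(check_data_quality(rows, ["claim_id", "amount"], "claim_id"))
```

In PySpark the same checks would typically be expressed as `filter`/`groupBy` aggregations so they run distributed rather than row by row in the driver.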
Posted 3 weeks ago
8.0 - 13.0 years
5 - 10 Lacs
Hyderabad
Work from Office
6+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements. Ensure performance tuning, fault tolerance, and reliability of distributed data processing systems.
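The distributed-computing model behind Spark's batch processing is a map phase that emits key/value pairs per partition, followed by a keyed reduction. A toy sketch in plain Python (Spark would run this per partition across an EMR cluster; the event schema is hypothetical):

```python
from collections import defaultdict

def map_phase(records):
    """Map step: emit (key, value) pairs, as each Spark task would per partition."""
    for r in records:
        yield r["event_type"], 1

def reduce_phase(pairs):
    """Reduce step: aggregate values by key, analogous to Spark's reduceByKey."""
    totals = defaultdict(int)
    for k, v in pairs:
        totals[k] += v
    return dict(totals)

events = [{"event_type": "click"}, {"event_type": "view"}, {"event_type": "click"}]
print(reduce_phase(map_phase(events)))  # counts per event type
```

Understanding this shuffle boundary (the hand-off between map and reduce) is what most Spark performance tuning, batch or streaming, revolves around.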
Posted 3 weeks ago
8.0 - 13.0 years
5 - 10 Lacs
Bengaluru
Work from Office
6+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements. Ensure performance tuning, fault tolerance, and reliability of distributed data processing systems.
Posted 3 weeks ago
8.0 - 13.0 years
8 - 12 Lacs
Hyderabad
Work from Office
10+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements.
Posted 3 weeks ago
6.0 - 11.0 years
8 - 12 Lacs
Gurugram
Work from Office
6-8 years of experience, with at least 4 years in test automation. Prior automation experience is a must. Familiarity with Python for test automation and scripting. Minimum 6 years of experience in QA/testing, with a focus on payments, ETL, and data engineering projects. Excellent communication skills to work effectively with cross-functional teams. Good to Have: 6-8 years of experience in Payments/SWIFT/ISO/ETL testing. Strong SQL skills for querying, comparing, and validating large datasets. Experience in testing ETL pipelines and data transformations. Prior experience testing ETL migrations from legacy systems like SAS DI to modern platforms is a plus. Hands-on experience with cloud platforms, particularly AWS services like S3, EMR, and PostgreSQL on AWS. Knowledge of SAS DI is highly desirable for understanding legacy pipelines.
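The core of ETL-migration testing is reconciliation: proving the migrated dataset matches the legacy one key for key and field for field. A minimal sketch of such a comparison in Python (the `key` column and row shape are hypothetical; in practice both sides would come from SQL extracts):

```python
def compare_datasets(source_rows, target_rows, key):
    """Reconcile two datasets by key: report keys missing from the target,
    extra keys in the target, and shared keys whose row content differs."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    return {
        "missing": sorted(set(src) - set(tgt)),       # in legacy, not migrated
        "extra": sorted(set(tgt) - set(src)),         # migrated, not in legacy
        "mismatched": sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k]),
    }

legacy = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
migrated = [{"id": 1, "amount": 10}, {"id": 3, "amount": 30}]
print(compare_datasets(legacy, migrated, "id"))
```

An empty report across all three buckets is the pass condition a migration test suite asserts on.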
Posted 3 weeks ago
8.0 - 13.0 years
3 - 7 Lacs
Hyderabad
Work from Office
P1-C3-STS. Seeking a developer with good experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation, and other AWS serverless resources. Can optimize data models for performance and efficiency. Able to write SQL queries to support data analysis and reporting. Design, implement, and maintain the data architecture for all AWS data services. Work with stakeholders to identify business needs and requirements for data-related projects. Design and implement ETL processes to load data into the data warehouse.
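A serverless data service of the kind listed above is typically entered through a Lambda handler. A hedged sketch of the handler shape only; the event schema is invented for illustration, and the downstream load (e.g., into Redshift via Glue or DMS) is left out:

```python
import json

def lambda_handler(event, context):
    """Illustrative AWS Lambda handler: validate an incoming record and
    return an API Gateway-style response dict."""
    record = event.get("record", {})
    if "id" not in record:
        # Reject malformed input before any downstream processing
        return {"statusCode": 400, "body": json.dumps({"error": "id required"})}
    # A real handler would enqueue or load the record here
    return {"statusCode": 200, "body": json.dumps({"id": record["id"], "status": "queued"})}

print(lambda_handler({"record": {"id": 42}}, None))
```

Keeping the handler a plain function of `(event, context)` makes it testable locally without deploying anything to AWS.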
Posted 3 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Minimum 6 years of hands-on experience in data engineering or big data development roles. Strong programming skills in Python and experience with Apache Spark (PySpark preferred). Proficient in writing and optimizing complex SQL queries. Hands-on experience with Apache Airflow for orchestration of data workflows. Deep understanding and practical experience with AWS services: Data Storage & Processing: S3, Glue, EMR, Athena. Compute & Execution: Lambda, Step Functions. Databases: RDS, DynamoDB. Monitoring: CloudWatch. Experience with distributed data processing, parallel computing, and performance tuning. Strong analytical and problem-solving skills. Familiarity with CI/CD pipelines and DevOps practices is a plus.
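Orchestrators like Airflow or Step Functions reduce a workflow to a dependency graph and run tasks in a topologically valid order. A small sketch of that ordering logic in plain Python (the task names are hypothetical; Airflow would express the same graph with operators and `>>`):

```python
def execution_order(dag):
    """Resolve a runnable task order from a graph of task -> upstream tasks,
    via depth-first traversal (assumes the graph is acyclic)."""
    order, done = [], set()

    def visit(task):
        for upstream in dag.get(task, []):
            if upstream not in done:
                visit(upstream)
        if task not in done:
            done.add(task)
            order.append(task)

    for task in dag:
        visit(task)
    return order

dag = {"extract": [], "transform": ["extract"], "load": ["transform"], "report": ["load"]}
print(execution_order(dag))  # ['extract', 'transform', 'load', 'report']
```

The guarantee that matters is not one specific ordering but that every task appears after all of its upstream dependencies.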
Posted 3 weeks ago
8.0 - 13.0 years
5 - 10 Lacs
Pune
Work from Office
Data Engineer Position Summary: The Data Engineer is responsible for building and maintaining data pipelines, ensuring the smooth operation of data systems, and optimizing workflows to meet business requirements. This role will support data integration and processing for various applications.
Minimum Qualifications: 6 years overall IT experience, with a minimum of 4 years of work experience in the tech skills below.
Tech Skills: Proficient in Python scripting and PySpark for data processing tasks. Strong SQL capabilities, with hands-on experience managing big data using ETL tools like Informatica. Experience with the AWS cloud platform and its data services, including S3, Redshift, Lambda, EMR, Airflow, Postgres, SNS, and EventBridge. Skilled in Bash shell scripting. Understanding of data lakehouse architecture, particularly with the Iceberg format, is a plus.
Preferred: Experience with Kafka and MuleSoft API. Understanding of healthcare data systems is a plus. Experience in Agile methodologies. Strong analytical and problem-solving skills. Effective communication and teamwork abilities.
Responsibilities: Develop and maintain data pipelines and ETL processes to manage large-scale datasets. Collaborate to design and test data architectures that align with business needs. Implement and optimize data models for efficient querying and reporting. Assist in the development and maintenance of data quality checks and monitoring processes. Support the creation of data solutions that enable analytical capabilities. Contribute to aligning data architecture with overall organizational solutions.
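The "efficient querying and reporting" side of the role is ultimately SQL against a warehouse. A self-contained sketch using sqlite3 purely as a stand-in for Redshift/Athena; the `claims` table and its columns are hypothetical:

```python
import sqlite3

# In-memory database standing in for a warehouse table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER, state TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1, "paid", 100.0), (2, "denied", 50.0), (3, "paid", 25.0)],
)

# Typical reporting aggregation: count and total amount per claim state
totals = conn.execute(
    "SELECT state, COUNT(*), SUM(amount) FROM claims GROUP BY state ORDER BY state"
).fetchall()
print(totals)  # [('denied', 1, 50.0), ('paid', 2, 125.0)]
```

On Redshift the data model question becomes which columns to use as distribution and sort keys so this kind of GROUP BY avoids cross-node shuffles.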
Posted 3 weeks ago
8.0 - 13.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Experience: 8 years of experience in data engineering, specifically in cloud environments like AWS. Proficiency in PySpark for distributed data processing and transformation. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.
Technical Skills: Proficiency in Python and PySpark for data processing and transformation tasks. Deep understanding of ETL concepts and best practices. Familiarity with AWS Glue (ETL jobs, Data Catalog, and Crawlers). Experience building and maintaining data pipelines with AWS Data Pipeline or similar orchestration tools. Familiarity with AWS S3 for data storage and management, including file formats (CSV, Parquet, Avro). Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with Data Warehousing and Big Data technologies, specifically within AWS.
Additional Skills: Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications.
Responsibilities: Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes. ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets. Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs. Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms. Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
Posted 3 weeks ago
10.0 - 15.0 years
30 - 35 Lacs
Noida
Work from Office
We at Innovaccer are looking for a Director-Clinical Informatics. You need to have structured problem-solving skills, strong analytical abilities, willingness to take initiatives and drive them, excellent verbal and written communication skills, and high levels of empathy towards internal and external stakeholders, among other things. The technology that once promised to simplify patient care has brought more issues than anyone ever anticipated. At Innovaccer, we defeat this beast by making full use of all the data healthcare has worked so hard to collect, and replacing long-standing problems with ideal solutions. Data is our bread and butter for innovation. We are looking for a leader who will own and manage the clinical ontologies at Innovaccer. He/she will also help Innovaccer build clinical workflows and care protocols to facilitate clinical decision support at the point of care.
A Day in the Life: Built a new product development pipeline aligning the company's portfolio with market insights across personas using clinical decision support in EHRs. Owned market research and built business cases to enable prioritization and build/buy/partner assessment by executive-level innovation governance. Worked successfully in a matrixed environment across business units to understand the big picture, build cross-functional relationships, and leverage content assets to solve customer (internal and external) problems. Worked on a pioneering FHIR-based, EHR-integrated, patient-context-specific, evidence-based guideline solution to reduce care variability. Solid understanding of clinical informatics standards (FHIR, CCDA, CDS Hooks, etc.) and terminologies (RxNorm, LOINC, SNOMED, etc.). Built a successful Clinical Quality Improvement program for assessing clinical credibility of Nuance's NLP engines for clinical documentation quality.
Created buy-in from executive leadership and cross-functional alignment among stakeholders from product, engineering, and the implementation/customer success teams. Owned the creation of analytics and quality metrics for provider and payor benchmarking and its monetization, for the speech recognition and revenue cycle products, and worked with the CMO, CMIOs, clinical documentation specialists, and the Product-Engineering team to productize them. Lead development of clinical content for clinical decision support (CDS) to improve clinical documentation. Collaborate with clinical informaticists, data scientists, clinical SMEs, product, and engineering teams to build CDS solutions with a deep understanding of the EHR workflow. Manage and define clinical ontologies and implement industry best practices for building value sets. The role involves client interaction during US hours, so you should be comfortable working in that time zone.
What You Need: Advanced healthcare degree (MD, PharmD, RN, or Master's in Health Informatics) with 10+ years of clinical informatics experience and 5+ years in managerial/leadership roles. Deep technical expertise in clinical informatics standards (FHIR, HL7, CCDA, CDS Hooks) and terminologies (SNOMED CT, LOINC, RxNorm) with hands-on EHR experience. Proven track record of implementing clinical decision support systems, EHR integrations, and healthcare analytics platforms in complex healthcare environments. Strong clinical knowledge with understanding of care delivery processes, evidence-based medicine, clinical workflows, and regulatory requirements (HIPAA, CMS programs).
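Work with FHIR and terminologies like LOINC often starts with structural validation of resources. A deliberately minimal sketch (real validation would use a FHIR profile/schema validator; the checks below cover only a few required elements of an Observation):

```python
def validate_observation(resource):
    """Minimal structural check on a FHIR Observation dict: correct
    resourceType, a status, and at least one coding with a system and code."""
    errors = []
    if resource.get("resourceType") != "Observation":
        errors.append("resourceType must be 'Observation'")
    if not resource.get("status"):
        errors.append("status is required")
    codings = resource.get("code", {}).get("coding", [])
    if not any(c.get("system") and c.get("code") for c in codings):
        errors.append("code.coding needs at least one system and code")
    return errors

obs = {
    "resourceType": "Observation",
    "status": "final",
    # LOINC 8867-4 is the heart-rate code
    "code": {"coding": [{"system": "http://loinc.org", "code": "8867-4"}]},
}
print(validate_observation(obs))  # [] means the minimal checks pass
```

Checks like these are the first gate before terminology-level validation (is the code actually in the bound value set?) in a CDS pipeline.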
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and also focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.
Process Manager Roles and responsibilities: Designing and implementing scalable, reliable, and maintainable data architectures on AWS. Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments. Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc. Integrating AWS data solutions with existing systems and third-party services. Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval. Implementing data security and encryption best practices in AWS environments. Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed.
Working closely with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.
Technical and Functional Skills: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments. Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Ability to analyze complex technical problems and propose effective solutions. Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
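Data lake layouts on S3 conventionally use Hive-style partition paths so engines like Athena and Glue can prune partitions during queries. A small sketch of generating such keys (the bucket name and table are hypothetical):

```python
from datetime import date

def partition_key(table, event_date, file_name):
    """Build a Hive-style partitioned S3 key: year=/month=/day= segments
    let query engines skip irrelevant partitions entirely."""
    return (
        f"s3://example-lake/{table}/"
        f"year={event_date.year}/month={event_date.month:02d}/"
        f"day={event_date.day:02d}/{file_name}"
    )

print(partition_key("orders", date(2024, 3, 7), "part-0000.parquet"))
```

A query filtered on `year = 2024 AND month = 3` then reads only the matching prefixes, which is the main lever for both performance and S3/Athena cost.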
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Process Manager - AWS Data Engineer. Mumbai/Pune | Full-time (FT) | Technology Services. Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel - NA. The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.
Process Manager Roles and responsibilities: Understand clients' requirements and provide effective and efficient solutions in AWS using Snowflake. Assemble large, complex sets of data that meet non-functional and functional business requirements. Architect and design with Snowflake/Redshift to create data pipelines and consolidate data in the data lake and data warehouse.
Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts. Understanding of data pipelines and modern ways of automating data pipelines using cloud-based tooling. Test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions. Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.
Technical and Functional Skills: AWS Services: Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Programming Languages: Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Data Warehousing: Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. ETL Tools: Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Database Management: Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Big Data Technologies: Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Version Control: Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Problem-solving Skills: Ability to analyze complex technical problems and propose effective solutions. Communication Skills: Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders. Education and Experience: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.
About eClerx: eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.
About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
Posted 3 weeks ago