1.0 - 4.0 years
8 - 13 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Cloud Engineer
As a Cloud Engineer, you will be an individual contributor and subject matter expert who maintains and participates in the design and implementation of technology solutions. The engineer will collaborate within a team of technologists to produce enterprise-scale solutions for our clients' needs. This position will work with the latest Amazon Web Services technologies around cloud architecture, infrastructure automation, and network security.

What You'll Do:
Identify, test and prototype solutions and proofs of concept on public clouds. Help develop architectural standards and guidelines for scalability, performance, resilience, and efficient operations while adhering to necessary security and compliance standards. Work with application development teams to select and automate repeatable tasks, and participate and assist in root cause analysis activities. Architect cloud solutions using industry-leading DevSecOps best practices and technologies. Review software product designs to ensure consistency with architectural best practices; participate in regular implementation reviews to ensure consistent quality and adherence to internal standards. Partner closely with cross-functional leaders (platform engineers, software development, product management, business leaders) to ensure a clear understanding of business and technical needs; jointly select the best strategy after evaluating the benefits and costs associated with different approaches. Collaborate closely with implementation teams to ensure understanding and utilization of the most optimal approach.

What You'll Bring:
1-4 years in an Infrastructure Engineering / Software Engineering / DevOps role, deploying and maintaining SaaS applications. 1-4 years' experience with AWS/Azure/GCP cloud technologies; at least one cloud proficiency certification is required. Hands-on experience with AWS services such as Lambda, S3, RDS, EMR, CloudFormation (or Terraform), CodeBuild, Config, Systems Manager, Service Catalog, etc.
Experience building automation using scripting languages such as Bash, Python, or PowerShell. Experience working on and contributing to software application development, deployment, and management processes. Experience architecting and implementing cloud-based solutions with robust Business Continuity and Disaster Recovery requirements. Experience working in agile teams with short release cycles. Strong verbal, written, and team presentation communication skills; ZS is a global firm, and fluency in English is required. This role requires healthy doses of initiative and the ability to remain flexible and responsive in a very dynamic environment, along with the ability to work around unknowns and develop robust solutions. Experience delivering quality work on defined tasks with limited oversight. Ability to quickly learn new platforms, cloud technologies, languages, tools, and techniques as needed to meet project requirements.

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find Out More At: www.zs.com
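For illustration only, a minimal sketch of the kind of scripted automation and DevSecOps checking this role describes, written in Python with boto3. The bucket inventory and the idea of "flagging" are assumptions; a real control would more likely run as AWS Config rules or a scheduled Lambda.

```python
# Hypothetical sketch: flag S3 buckets that lack a default encryption configuration.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def unencrypted_buckets():
    """Return names of buckets with no default encryption configuration."""
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_bucket_encryption(Bucket=name)
        except ClientError as err:
            code = err.response["Error"]["Code"]
            if code == "ServerSideEncryptionConfigurationNotFoundError":
                flagged.append(name)
            else:
                raise
    return flagged

if __name__ == "__main__":
    for name in unencrypted_buckets():
        print(f"Bucket without default encryption: {name}")
```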
Posted 1 month ago
1.0 - 2.0 years
8 - 13 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

ZS's Cloud Center of Excellence (CCoE) team defines and implements cloud best practices that ensure secure and resilient enterprise-grade systems architecture for client-facing/delivery software solutions. The Cloud team at ZS is a casual, collaborative, and smart group with offices in Evanston, Illinois and Pune, India. The Cloud Administrator will be part of the CCoE application deployment team in Pune. As a Cloud Administrator, you will participate in AWS and on-premises product deployment and configuration across multiple environments for multiple client product instances. You will actively engage in deployment planning, scheduling and deployment execution discussions with the product development and quality assurance teams, and collaborate with other team members to ensure seamless deployments.

What you'll do:
Application deployment and configuration management hosted on private and public cloud (AWS). Assist and work with the engineering team in developing and implementing the deployment plan. Participate in application deployment troubleshooting and root cause analysis. Act as the primary contact for deployment-related inquiries, planning and issues. Participate in infrastructure maintenance activities on weekends. Develop deployment support documentation for customers. Migrate an existing on-premises data warehouse to AWS. Identify improvement areas in application deployment and upgrade processes. Identify automation opportunities and participate in automation implementations.

What you'll bring:
Bachelor's Degree in CS, IT or EE. 1-2 years of experience in AWS administration, application deployment and configuration management. Good knowledge of AWS services (EC2, S3, IAM, VPC, Lambda, EMR, CloudFront, Elastic Load Balancer, etc.). Good knowledge of CI/CD tools like JetBrains TeamCity, SVN, Bitbucket, etc. Good knowledge of Windows and Linux operating system administration. Basic knowledge of RDBMS and database technologies like SQL Server and PostgreSQL.
Basic knowledge of web server software like IIS or Apache Tomcat. Experience with scripting languages such as PowerShell, Python, or Bash/Shell. Knowledge of DevOps methodology. Experience working in an ITIL-based environment.

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find Out More At: www.zs.com
Posted 1 month ago
4.0 - 9.0 years
22 - 30 Lacs
Noida, Hyderabad
Hybrid
Hiring Alert: Data Engineer | Xebia | Hyderabad & Noida
We're on the lookout for skilled Data Engineers with 4+ years of experience to join our dynamic team at Xebia! If you thrive on solving complex data problems and have solid hands-on experience in Python, PySpark, and AWS, we'd love to hear from you.

Location: Hyderabad / Noida. Work Mode: 3 days Work From Office (WFO) per week. Timings: 2:30 PM – 10:30 PM IST. Notice Period: Immediate to 15 days max.

Required Skills:
Programming: strong in Python, Spark, and PySpark. SQL: proficient in writing and optimizing complex queries. AWS Services: experience with S3, SNS, SQS, EMR, Lambda, Athena, Glue, RDS (PostgreSQL), CloudWatch, EventBridge, CloudFormation. CI/CD: exposure to Jenkins pipelines. Analytical Thinking: strong problem-solving capabilities. Communication: ability to explain technical topics to non-technical audiences.

Preferred Skills:
Jenkins for CI/CD. Familiarity with big data tools and frameworks.

Interested? Apply Now! Send your updated CV along with the following details to vijay.s@xebia.com: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / Last Working Day (if serving notice), Primary Skill Set (choose from above or mention any other relevant expertise), LinkedIn Profile URL. Please apply only if you have not applied recently and are not currently in the interview process for any open roles at Xebia.

Let's build the future of data together! #DataEngineer #Xebia #AWS #Python #PySpark #BigData #HiringNow #HyderabadJobs #NoidaJobs #ImmediateJoiners #DataJobs
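As a rough illustration of the Python/PySpark/AWS skill set listed above, here is a minimal PySpark ETL sketch. The S3 paths, column names, and business rules are hypothetical, not an actual Xebia pipeline.

```python
# Illustrative only: read raw CSV from S3, clean it, aggregate, write partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = (
    spark.read.option("header", "true")
    .csv("s3://example-raw-bucket/orders/")          # hypothetical landing path
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_status") == "COMPLETED")
)

daily_totals = (
    orders.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Partitioned Parquet output suitable for downstream Athena/Glue consumption.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/daily_totals/"
)
spark.stop()
```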
Posted 1 month ago
5.0 - 10.0 years
5 - 6 Lacs
Chennai
Work from Office
We, India's largest healthcare consultancy, are planning to provide Healthcare IT solutions such as HMIS, EMR/EHR, RPM, e-ICU, virtual clinics, etc. We are looking for a marketing professional with Healthcare IT experience to lead this initiative.
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Hyderabad
Work from Office
Key Responsibilities
Administer and maintain AWS environments supporting data pipelines, including S3, EMR, Athena, Glue, Lambda, CloudFormation, and Redshift. Cost analysis: use AWS Cost Explorer to analyze service usage and create dashboards that alert on usage and cost outliers. Performance and audit: use AWS CloudTrail and CloudWatch to monitor system performance and usage. Monitor, troubleshoot, and optimize infrastructure performance and availability. Provision and manage cloud resources using Infrastructure as Code (IaC) tools (e.g., AWS CloudFormation, Terraform). Collaborate with data engineers working in PySpark, Hive, Kafka, and Python to ensure infrastructure alignment with processing needs. Support code integration with Git repositories. Implement and maintain security policies, IAM roles, and access controls. Participate in incident response and support resolution of operational issues, including on-call responsibilities. Manage backup, recovery, and disaster recovery processes for AWS-hosted data and services. Interface directly with client teams to gather requirements, provide updates, and resolve issues professionally. Create and maintain technical documentation and operational runbooks.

Required Qualifications
3+ years of hands-on administration experience managing AWS infrastructure, particularly in support of data-centric workloads. Strong knowledge of AWS services including but not limited to S3, EMR, Glue, Lambda, Redshift, and Athena. Experience with infrastructure automation and configuration management tools (e.g., CloudFormation, Terraform, AWS CLI). Proficiency in Linux administration and shell scripting, including installing and managing software on Linux servers. Familiarity with Kafka, Hive, and distributed processing frameworks such as Apache Spark. Ability to manage and troubleshoot IAM configurations, networking, and cloud security best practices. Demonstrated experience with monitoring tools (e.g., CloudWatch, Prometheus, Grafana) and alerting systems. Excellent verbal and written communication skills. Comfortable working with cross-functional teams and engaging directly with clients.

Preferred Qualifications
AWS certification (e.g., Solutions Architect Associate, SysOps Administrator). Experience supporting data science or analytics teams. Familiarity with DevOps practices and CI/CD pipelines. Familiarity with Apache Iceberg-based data pipelines.
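A hedged sketch of the cost-analysis responsibility described above: pulling month-to-date cost per service with the AWS Cost Explorer API and flagging outliers. The threshold and the simple print-based alert are assumptions for illustration, not the actual dashboarding setup.

```python
# Illustrative only: month-to-date unblended cost per service, flagged against a made-up threshold.
import boto3
from datetime import date

ce = boto3.client("ce")

resp = ce.get_cost_and_usage(
    TimePeriod={
        "Start": date.today().replace(day=1).isoformat(),  # first day of current month
        "End": date.today().isoformat(),                   # illustrative; fails on the 1st itself
    },
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

THRESHOLD_USD = 500.0  # hypothetical alerting threshold
for group in resp["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    if amount > THRESHOLD_USD:
        print(f"Cost outlier: {service} = ${amount:,.2f}")
```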
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
We are looking for a self-motivated individual with an appetite to learn new skills and be part of a fast-paced team that delivers cutting-edge solutions driving new products and features critical for our customers. Our senior software engineers are responsible for designing, developing and ensuring the quality, reliability and availability of key systems that provide critical data and algorithms. Responsibilities of this role include developing new and enhancing existing applications, working collaboratively with technical leads and architects to design, develop and test these critical applications.

About the role
Actively participate in the full life cycle of software delivery, including analysis, design, implementation and testing of new projects and features using Hadoop, Spark/PySpark, Scala or Java, Hive, SQL, and other open-source tools and design patterns. Python knowledge is a bonus for this role. Working experience with Hudi, Snowflake or similar is expected; must-have technologies include Big Data and AWS services such as EMR, S3, Lambda, Elastic, and Step Functions. Actively participate in the development and testing of features for assigned projects with little to no guidance. The position offers opportunities to work under technical experts and also to provide guidance and assistance to less experienced team members or new joiners over the course of the project. An appetite for learning will be a key attribute for doing well in the role, as the organization is very dynamic and has tremendous scope across various technical landscapes. We consider AI inclusion a key to excelling in this role; we want dynamic candidates who use AI tools as build partners and share experiences to ignite the organization. Proactively share knowledge and best practices on using new and emerging technologies across all of the development and testing groups. Create, review and maintain technical documentation of software development and testing artifacts. Work collaboratively with others in a team-based environment. Identify and participate in the resolution of issues with the appropriate technical and business resources. Generate innovative approaches and solutions to technology challenges. Effectively balance and prioritize multiple projects concurrently.

About you
Bachelor's or Master's degree in computer science or a related field. 7+ years of experience in the IT industry; product and platform development preferred. Strong programming skills with Java or Scala. Must-have technologies include Big Data and AWS, with exposure to services like EMR, S3, Lambda, Elastic, and Step Functions. Knowledge of Python is preferred. Experience with Agile methodology, continuous integration and/or Test-Driven Development. Self-motivated with a strong desire for continual learning. Takes personal responsibility to impact results and deliver on commitments. Effective verbal and written communication skills. Ability to work independently or as part of an agile development team.
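For illustration, a small sketch of working with the AWS services named above: submitting a Spark step to an existing EMR cluster with boto3. The cluster ID and S3 paths are placeholders, not real resources.

```python
# Illustrative only: add a spark-submit step to a running EMR cluster.
import boto3

emr = boto3.client("emr")

response = emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLECLUSTERID",  # hypothetical cluster ID
    Steps=[
        {
            "Name": "nightly-aggregation",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "spark-submit",
                    "--deploy-mode", "cluster",
                    "s3://example-artifacts/jobs/nightly_aggregation.py",  # hypothetical job script
                ],
            },
        }
    ],
)
print("Submitted step:", response["StepIds"][0])
```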
Posted 1 month ago
5.0 - 9.0 years
20 - 30 Lacs
Pune
Hybrid
Job Summary: We are looking for a highly skilled AWS Data Engineer with over 5 years of experience in designing, developing, and maintaining scalable data pipelines on AWS. The ideal candidate will be proficient in data engineering best practices and cloud-native technologies, with hands-on experience in building ETL/ELT pipelines, working with large datasets, and optimizing data architecture for analytics and business intelligence.

Key Responsibilities:
Design, build, and maintain scalable and robust data pipelines and ETL processes using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3, Athena). Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver high-quality solutions. Implement data lake and data warehouse architectures, ensuring data governance, data quality, and compliance. Optimize data pipelines for performance, reliability, scalability, and cost. Automate data ingestion and transformation workflows using Python, PySpark, or Scala. Manage and monitor data infrastructure including logging, error handling, alerting, and performance metrics. Leverage infrastructure-as-code tools like Terraform or AWS CloudFormation for infrastructure deployment. Ensure security best practices are implemented for data access and storage (IAM, KMS, encryption, etc.). Document data processes, architectures, and standards.

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Minimum 5 years of experience as a Data Engineer with a focus on AWS cloud services. Strong experience in building ETL/ELT pipelines using AWS Glue, EMR, Lambda, and Step Functions. Proficiency in SQL, Python, PySpark, and data modeling techniques. Experience working with data lakes (S3) and data warehouses (Redshift, Snowflake, etc.). Experience with Athena, Kinesis, Kafka, or similar streaming data tools is a plus. Familiarity with DevOps and CI/CD processes, using tools like Git, Jenkins, or GitHub Actions. Understanding of data privacy, governance, and compliance standards such as GDPR, HIPAA, etc. Strong problem-solving and analytical skills, with the ability to work in a fast-paced environment.
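A minimal sketch of a Glue-based ETL step of the kind this role covers. It assumes it runs inside an AWS Glue job (where the awsglue libraries are available), and the catalog database, table, column, and bucket names are hypothetical.

```python
# Illustrative Glue job script: read from the Data Catalog, drop bad rows, write curated Parquet.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical source table registered in the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_raw_db", table_name="transactions"
)

# Basic cleansing: require a transaction id and an amount (assumed columns).
cleaned = source.toDF().dropna(subset=["transaction_id", "amount"])

# Assumes an ingest_date column exists for partitioning the curated output.
cleaned.write.mode("append").partitionBy("ingest_date").parquet(
    "s3://example-curated-bucket/transactions/"
)
job.commit()
```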
Posted 1 month ago
5.0 - 6.0 years
10 - 15 Lacs
Chennai, Bengaluru
Work from Office
AI/ML, AWS-based solutions. Amazon SageMaker, Python and ML libraries, data engineering on AWS, AI/ML algorithms & model deployment strategies. CI/CD (CloudFormation, Terraform). AWS Certified Machine Learning. Generative AI, real-time inference & edge.
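As a brief illustration of the real-time inference piece mentioned above, a hedged sketch of invoking a deployed SageMaker endpoint with boto3. The endpoint name and payload schema are hypothetical.

```python
# Illustrative only: call a real-time SageMaker endpoint with a JSON payload.
import json
import boto3

runtime = boto3.client("sagemaker-runtime")

payload = {"features": [0.3, 1.7, 42.0]}  # made-up model input
response = runtime.invoke_endpoint(
    EndpointName="example-churn-model",   # placeholder endpoint name
    ContentType="application/json",
    Body=json.dumps(payload),
)
prediction = json.loads(response["Body"].read())
print("Model prediction:", prediction)
```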
Posted 1 month ago
3.0 - 6.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Roles and Responsibilities: Responsible for development of automation frameworks using Python. Explore and apply different design approaches and software practices in the development life cycle.

Desired Experience: 3 to 6 years of experience in Python development. Good knowledge of software engineering processes and experience working in an agile environment. Proven ability in framework design and OOP concepts. Knowledge of and familiarity with PACS/DICOM standards is an added advantage.
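For illustration, a minimal sketch of the kind of object-oriented automation-framework skeleton this role describes: a base class defines the check life cycle and a small runner executes concrete checks. The example check and its DICOM-flavored data are hypothetical, not an actual PACS/DICOM validator.

```python
# Illustrative framework skeleton, assuming Python 3.9+.
from abc import ABC, abstractmethod

class AutomationCheck(ABC):
    """Base class: subclasses implement a single automated check."""

    name: str = "unnamed check"

    def setup(self) -> None:      # optional per-check preparation
        pass

    @abstractmethod
    def run(self) -> bool:        # return True on pass
        ...

    def teardown(self) -> None:   # optional cleanup
        pass

class DicomHeaderPresent(AutomationCheck):
    """Hypothetical check; a real one would parse an actual DICOM file."""
    name = "dicom header present"

    def run(self) -> bool:
        sample_header = {"Modality": "CT", "PatientID": "ANON"}
        return "Modality" in sample_header

def run_suite(checks: list[AutomationCheck]) -> None:
    for check in checks:
        check.setup()
        try:
            status = "PASS" if check.run() else "FAIL"
        finally:
            check.teardown()
        print(f"{check.name}: {status}")

if __name__ == "__main__":
    run_suite([DicomHeaderPresent()])
```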
Posted 1 month ago
3.0 - 6.0 years
6 - 15 Lacs
Pune
Work from Office
Sr. Software Engineer with advanced Python for product development and ML & Generative AI. Hands-on with FastAPI servers in a production environment. AI engineers to design and develop a high-quality Generative AI platform on AWS.
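A hedged sketch of a FastAPI service of the kind described above. The endpoint, request/response models, and the stubbed model call are illustrative assumptions, since the actual Generative AI backend is not specified in the posting.

```python
# Illustrative FastAPI app; run with: uvicorn app:app --reload  (assuming this file is app.py)
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-genai-service")

class GenerateRequest(BaseModel):
    prompt: str
    max_tokens: int = 128

class GenerateResponse(BaseModel):
    completion: str

def call_model(prompt: str, max_tokens: int) -> str:
    # Placeholder: a real service would call a hosted LLM (e.g., via Bedrock or SageMaker).
    return f"[stubbed completion for: {prompt[:40]}]"

@app.post("/generate", response_model=GenerateResponse)
def generate(req: GenerateRequest) -> GenerateResponse:
    return GenerateResponse(completion=call_model(req.prompt, req.max_tokens))
```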
Posted 1 month ago
3.0 - 7.0 years
15 - 20 Lacs
Nagpur, Pune
Work from Office
Hi, we are hiring for a leading ITES company for a Lead Data Manager profile.

Job Description
Perform day-to-day Clinical Data Management activities. Work and coordinate with the team to perform data management activities and deliver an error-free, quality database in accordance with the data management plan and regulatory standards. Read and understand the study protocol and the timelines. Perform test data entry in the TEST environment, data listing review, data reconciliation and query management tasks. Escalate/action discrepancies in the clinical data as appropriate. Perform external checks to handle manual discrepancies and action the same. Ensure error-free, quality data with no open queries. Escalate any discrepancy in the clinical data to the study lead as appropriate. Timely completion of training, and any other tasks deemed appropriate. Perform medical data collection and analysis of prostate cancer data using databases like HIS/EMR (Electronic Medical Record) and Cases Rave, CDM (startup, close-out, conduct). Client interaction and meetings. Bring up new ideas and execute new plans to cope with the backlog. Train new team members as and when required.

Key Skills:
a) Minimum 3 years of experience in leading clinical studies / Clinical Data Management
b) Hands-on experience with study conduct and close-out is a must
c) Any graduate

To Apply, WhatsApp 'Hi' @ 9151555419 and follow the steps below:
>Click on the Start option to apply and fill in the details
>Select the location as Other (to get multiple location options)
a) To apply for the above job role (Pune), type: Job Code # 96
b) To apply for the above job role (Nagpur), type: Job Code # 97
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Electronic Medical Records (EMR)
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of electronic medical records (EMR) systems and supporting the healthcare industry.

Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Ensure the smooth functioning of electronic medical records (EMR) systems. Troubleshoot and resolve issues within critical business systems. Collaborate with cross-functional teams to identify and solve complex problems. Contribute to the continuous improvement of system performance and stability.

Professional & Technical Skills: Must-have skills: Proficiency in Electronic Medical Records (EMR). Strong understanding of software engineering principles and practices. Experience in troubleshooting and resolving issues within critical business systems. Knowledge of database management and SQL queries. Familiarity with healthcare industry standards and regulations. Good-to-have skills: Experience with healthcare information systems. Experience with the ITIL framework and incident management processes.

Additional Information: The candidate should have a minimum of 5 years of experience in Electronic Medical Records (EMR). This position is based at our Pune office. 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Chennai
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: EPIC Systems
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.

Roles & Responsibilities: The Epic Analyst will provide primary support for their designated application/module, take on more advanced issues that arise during the project for their application area, and take on more complex tasks with respect to system configuration, testing and administration. Provide ongoing system support and maintenance based on the support roster. Respond in a timely manner to system issues and requests. Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues. Handle and deliver Service Requests / Change Requests / New Builds. Perform system monitoring, such as error queues, alerts, batch jobs, etc., and execute the required actions or SOPs. Perform/support regular/periodic system patching, maintenance and verification. Perform/support planned system upgrade work, cutover to production and post-cutover support and stabilization. Perform/support the work required to comply with audit and security requirements. Required to overlap with client business or office hours. Comply with compliance requirements as mandated by the project.

Professional & Technical Skills: Must-have skills: Certified in Epic modules (RWB, Epic Care Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB). Experience in troubleshooting and resolving application issues.

Additional Information: The candidate should have a minimum of 5 years of experience in EPIC Systems. This position is based at our Chennai office. 15 years of full-time education is required.

Qualifications: 15 years full-time education
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: EPIC Systems
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.

Roles & Responsibilities: The Epic Analyst will provide primary support for their designated application/module, take on more advanced issues that arise during the project for their application area, and take on more complex tasks with respect to system configuration, testing and administration. Provide ongoing system support and maintenance based on the support roster. Respond in a timely manner to system issues and requests. Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues. Handle and deliver Service Requests / Change Requests / New Builds. Perform system monitoring, such as error queues, alerts, batch jobs, etc., and execute the required actions or SOPs. Perform/support regular/periodic system patching, maintenance and verification. Perform/support planned system upgrade work, cutover to production and post-cutover support and stabilization. Perform/support the work required to comply with audit and security requirements. Required to overlap with client business or office hours. Comply with compliance requirements as mandated by the project.

Professional & Technical Skills: Must-have skills: Certified in Epic modules (RWB, Epic Care Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB). Experience in troubleshooting and resolving application issues.

Additional Information: The candidate should have a minimum of 5 years of experience in EPIC Systems. This position is based at our Bengaluru office. 15 years of full-time education is required.

Qualifications: 15 years full-time education
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: EPIC Systems
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. Your day will involve troubleshooting, analyzing system performance, and collaborating with cross-functional teams to ensure seamless operations.

Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute to providing solutions to work-related problems. Proactively identify and resolve technical issues within critical business systems. Collaborate with cross-functional teams to analyze system performance and optimize operations.

Professional & Technical Skills: Must-have skills: Proficiency in EPIC Systems. Strong understanding of system architecture and integration. Experience in troubleshooting and resolving software issues. Knowledge of the ITIL framework for incident and problem management.

Additional Information: The candidate should have a minimum of 2 years of experience in EPIC Systems. This position is based at our Hyderabad office. 15 years of full-time education is required.

Qualifications: 15 years full-time education
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: EPIC Systems
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.

Roles & Responsibilities: The Epic Analyst will provide primary support for their designated application/module, take on more advanced issues that arise during the project for their application area, and take on more complex tasks with respect to system configuration, testing and administration. Provide ongoing system support and maintenance based on the support roster. Respond in a timely manner to system issues and requests. Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues. Handle and deliver Service Requests / Change Requests / New Builds. Perform system monitoring, such as error queues, alerts, batch jobs, etc., and execute the required actions or SOPs. Perform/support regular/periodic system patching, maintenance and verification. Perform/support planned system upgrade work, cutover to production and post-cutover support and stabilization. Perform/support the work required to comply with audit and security requirements. Required to overlap with client business or office hours. Comply with compliance requirements as mandated by the project.

Professional & Technical Skills: Must-have skills: Certified in Epic modules (RWB, Epic Care Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB). Experience in troubleshooting and resolving application issues.

Additional Information: The candidate should have a minimum of 5 years of experience in EPIC Systems. This position is based at our Hyderabad office. 15 years of full-time education is required.

Qualifications: 15 years full-time education
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: EPIC Systems
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service identifying and solving issues within multiple components of critical business systems. You will play a crucial role in ensuring the smooth functioning of the applications and resolving any technical glitches that may arise. Your expertise in EPIC Systems and problem-solving skills will be instrumental in maintaining the efficiency and reliability of the systems.

Roles & Responsibilities: The Epic Analyst will provide primary support for their designated application/module, take on more advanced issues that arise during the project for their application area, and take on more complex tasks with respect to system configuration, testing and administration. Provide ongoing system support and maintenance based on the support roster. Respond in a timely manner to system issues and requests. Conduct investigation, assessment and evaluation, and deliver solutions and fixes to resolve system issues. Handle and deliver Service Requests / Change Requests / New Builds. Perform system monitoring, such as error queues, alerts, batch jobs, etc., and execute the required actions or SOPs. Perform/support regular/periodic system patching, maintenance and verification. Perform/support planned system upgrade work, cutover to production and post-cutover support and stabilization. Perform/support the work required to comply with audit and security requirements. Required to overlap with client business or office hours. Comply with compliance requirements as mandated by the project.

Professional & Technical Skills: Must-have skills: Certified in Epic modules (RWB, Epic Care Link, Haiku, Healthy Planet, MyChart, Rover, Willow Ambulatory, Cogito, Ambulatory, ClinDoc, Orders, ASAP, RPB, RHB, HIM Identity, HIM ROI, HIM DT, Cadence, Prelude, GC, OpTime, Anesthesia, Beacon, Willow Imp, Cupid, Phoenix, Radiant, Beaker AP, Beaker CP, Bridges, Clarity, Radar, RWB). Experience in troubleshooting and resolving application issues.

Additional Information: The candidate should have a minimum of 5 years of experience in EPIC Systems. This position is based at our Pune office. 15 years of full-time education is required.

Qualifications: 15 years full-time education
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: The ideal candidate will work in a team environment that demands technical excellence, whose members are expected to hold each other accountable for the overall success of the end product. The focus for this team is on the delivery of innovative solutions to complex problems, but also with a mind to drive simplicity in refining and supporting the solution by others.

About the Role & Responsibilities: Be accountable for delivery of business functionality. Work on the AWS cloud to migrate/re-engineer data and applications from on-premises to cloud. Responsible for engineering solutions conformant to enterprise standards, architecture, and technologies. Provide technical expertise through a hands-on approach, developing solutions that automate testing between systems. Perform peer code reviews, merge requests and production releases. Implement design/functionality using Agile principles. Proven track record of quality software development and an ability to innovate outside of traditional architecture/software patterns when needed. A desire to collaborate in a high-performing team environment, and an ability to influence and be influenced by others. Have a quality mindset: not just code quality, but also ensuring ongoing data quality by monitoring data to identify problems before they have business impact. Be entrepreneurial and business-minded; ask smart questions, take risks, and champion new ideas. Take ownership and accountability.

Experience Required: 3 to 5 years of experience in application program development.
Experience Desired: Knowledge and/or experience with healthcare information domains. Documented experience in a business intelligence or analytic development role on a variety of large-scale projects. Documented experience working with databases larger than 5 TB and excellent data analysis skills. Experience with TDD/BDD. Experience working with Spark and real-time analytic frameworks.
Education and Training Required: Bachelor's degree in Engineering or Computer Science.
Primary Skills: Python, Databricks, Teradata, SQL, UNIX, ETL, data structures, Looker, Tableau, Git, Jenkins, RESTful & GraphQL APIs. AWS services such as Glue, EMR, Lambda, Step Functions, CloudTrail, CloudWatch, SNS, SQS, S3, VPC, EC2, RDS, IAM.
Additional Skills: Ability to rapidly prototype and storyboard/wireframe development as part of application design. Write referenceable and modular code. Willingness to continuously learn & share learnings with others. Ability to communicate design processes, ideas, and solutions clearly and effectively to teams and clients. Ability to manipulate and transform large datasets efficiently. Excellent troubleshooting skills to root-cause complex issues.

Qualifications: 15 years full-time education
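To illustrate the data-quality mindset called out above (monitoring data to catch problems before they have business impact), a minimal PySpark data-quality gate sketch; paths, column names, and thresholds are hypothetical.

```python
# Illustrative only: count null and duplicate keys before publishing a dataset.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()

df = spark.read.parquet("s3://example-staging-bucket/claims/")  # hypothetical staging path

total = df.count()
null_keys = df.filter(F.col("claim_id").isNull()).count()
duplicates = total - df.dropDuplicates(["claim_id"]).count()

print(f"rows={total} null_claim_ids={null_keys} duplicate_claim_ids={duplicates}")

# Fail the pipeline run (e.g., a Step Functions or Jenkins stage) if thresholds are breached.
if null_keys > 0 or duplicates > total * 0.01:
    raise RuntimeError("Data quality gate failed; dataset not published")

spark.stop()
```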
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Noida
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: The ideal candidate will work in a team environment that demands technical excellence, whose members are expected to hold each other accountable for the overall success of the end product. The focus for this team is on the delivery of innovative solutions to complex problems, but also with a mind to drive simplicity in refining and supporting the solution by others.

About the Role & Responsibilities: Be accountable for delivery of business functionality. Work on the AWS cloud to migrate/re-engineer data and applications from on-premises to cloud. Responsible for engineering solutions conformant to enterprise standards, architecture, and technologies. Provide technical expertise through a hands-on approach, developing solutions that automate testing between systems. Perform peer code reviews, merge requests and production releases. Implement design/functionality using Agile principles. Proven track record of quality software development and an ability to innovate outside of traditional architecture/software patterns when needed. A desire to collaborate in a high-performing team environment, and an ability to influence and be influenced by others. Have a quality mindset: not just code quality, but also ensuring ongoing data quality by monitoring data to identify problems before they have business impact. Be entrepreneurial and business-minded; ask smart questions, take risks, and champion new ideas. Take ownership and accountability.

Experience Required: 3 to 5 years of experience in application program development.
Experience Desired: Knowledge and/or experience with healthcare information domains. Documented experience in a business intelligence or analytic development role on a variety of large-scale projects. Documented experience working with databases larger than 5 TB and excellent data analysis skills. Experience with TDD/BDD. Experience working with Spark and real-time analytic frameworks.
Education and Training Required: Bachelor's degree in Engineering or Computer Science.
Primary Skills: Python, Databricks, Teradata, SQL, UNIX, ETL, data structures, Looker, Tableau, Git, Jenkins, RESTful & GraphQL APIs. AWS services such as Glue, EMR, Lambda, Step Functions, CloudTrail, CloudWatch, SNS, SQS, S3, VPC, EC2, RDS, IAM.
Additional Skills: Ability to rapidly prototype and storyboard/wireframe development as part of application design. Write referenceable and modular code. Willingness to continuously learn & share learnings with others. Ability to communicate design processes, ideas, and solutions clearly and effectively to teams and clients. Ability to manipulate and transform large datasets efficiently. Excellent troubleshooting skills to root-cause complex issues.

Qualifications: 15 years full-time education
Posted 1 month ago
9.0 - 14.0 years
25 - 40 Lacs
Bengaluru
Hybrid
Greetings from tsworks Technologies India Pvt. We are hiring for Sr. Data Engineer - Snowflake with AWS. If you are interested, please share your CV to mohan.kumar@tsworks.io

Position: Senior Data Engineer. Experience: 9+ years. Location: Bengaluru, India (Hybrid).

Mandatory Required Qualifications
Strong proficiency in AWS data services such as S3 buckets, Glue and Glue Catalog, EMR, Athena, Redshift, DynamoDB, QuickSight, etc. Strong hands-on experience building data lakehouse solutions on Snowflake, using features such as streams, tasks, dynamic tables, data masking, data exchange, etc. Hands-on experience using scheduling tools such as Apache Airflow, DBT, AWS Step Functions, and data governance products such as Collibra. Expertise in DevOps and CI/CD implementation. Excellent communication skills.

In This Role, You Will
Design, implement, and manage scalable and efficient data architecture on the AWS cloud platform. Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes. Perform complex data transformations and processing using PySpark (AWS Glue, EMR or Databricks), Snowflake's data processing capabilities, or other relevant tools. Hands-on experience working with data lake solutions such as Apache Hudi, Delta Lake or Iceberg. Develop and maintain data models within Snowflake and related tools to support reporting, analytics, and business intelligence needs. Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions. Integrate data from various sources, both internal and external, ensuring data quality and consistency.

Skills & Knowledge
Bachelor's degree in computer science, engineering, or a related field. 9+ years of experience in information technology, designing, developing and executing solutions. 4+ years of hands-on experience in designing and executing data solutions on AWS and Snowflake cloud platforms as a Data Engineer. Strong proficiency in AWS services such as Glue, EMR, Athena, Databricks, with file formats such as Parquet and Avro. Hands-on experience in data modelling, batch and real-time pipelines, using Python, Java or JavaScript, and experience working with RESTful APIs are required. Hands-on experience in handling real-time data streams from Kafka or Kinesis is required. Expertise in DevOps and CI/CD implementation. Hands-on experience with SQL and NoSQL databases. Hands-on experience in data modelling, implementation, and management of OLTP and OLAP systems. Knowledge of data quality, governance, and security best practices. Familiarity with machine learning concepts and integration of ML pipelines into data workflows. Hands-on experience working in an Agile setting. Is self-driven, naturally curious, and able to adapt to a fast-paced work environment. Can articulate, create, and maintain technical and non-technical documentation. AWS and Snowflake certifications are preferred.
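For illustration, a hedged sketch of a scheduled load task using Apache Airflow, one of the orchestration tools listed above; the DAG id, schedule, and task body are placeholders rather than the actual pipeline (a real load would typically use the Snowflake provider's operators or a COPY INTO statement).

```python
# Illustrative Airflow 2.x DAG with a single daily Python task.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_snowflake(**context):
    # Placeholder task body: a real task would pull from S3/Glue and load into Snowflake.
    print("loading partition", context["ds"])

with DAG(
    dag_id="example_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
```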
Posted 1 month ago
1.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We are looking for a skilled and experienced PySpark Tech Lead to join our dynamic engineering team. In this role, you will lead the development and execution of high-performance big data solutions using PySpark. You will work closely with data scientists, engineers, and architects to design and implement scalable data pipelines and analytics solutions. As a Tech Lead, you will mentor and guide a team of engineers, ensuring the adoption of best practices for building robust and efficient systems while driving innovation in the use of data technologies.

Key Responsibilities
Lead and Develop: Design and implement scalable, high-performance data pipelines and ETL processes using PySpark on distributed systems.
Tech Leadership: Provide technical direction and leadership to a team of engineers, ensuring the delivery of high-quality solutions that meet both business and technical requirements.
Architect Solutions: Develop and enforce best practices for architecture, design, and coding standards. Lead the design of complex data engineering workflows, ensuring they are optimized for performance and cost-effectiveness.
Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand data requirements, translating them into scalable technical solutions.
Optimization & Performance Tuning: Optimize large-scale data processing pipelines to improve efficiency and performance. Implement best practices for memory management, data partitioning, and parallelization in Spark.
Code Review & Mentorship: Conduct code reviews to ensure high-quality code, maintainability, and scalability. Provide guidance and mentorship to junior and mid-level engineers.
Innovation & Best Practices: Stay current on new data technologies and trends, bringing fresh ideas and solutions to the team. Implement continuous integration and deployment pipelines for data workflows.
Problem Solving: Identify bottlenecks, troubleshoot, and resolve issues related to data quality, pipeline failures, and performance optimization.

Skills and Qualifications
Experience: 7+ years of hands-on experience in PySpark and large-scale data processing.
Technical Expertise: Strong knowledge of PySpark, Spark SQL, and Apache Kafka. Experience with cloud platforms like AWS (EMR, S3), Google Cloud, or Azure. In-depth understanding of distributed computing, parallel processing, and data engineering principles.
Data Engineering: Expertise in building ETL pipelines, data wrangling, and working with structured and unstructured data. Experience with databases (relational and NoSQL) such as SQL, MongoDB, or DynamoDB. Familiarity with data warehousing solutions and query optimization techniques.
Leadership & Communication: Proven ability to lead a technical team, make key architectural decisions, and mentor junior engineers. Excellent communication skills, with the ability to collaborate effectively with cross-functional teams and stakeholders.
Problem Solving: Strong analytical skills with the ability to solve complex problems involving large datasets and distributed systems.
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
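As a small illustration of the optimization and partitioning practices mentioned above, a PySpark sketch showing a broadcast join of a small lookup table and partition-aware output; table paths and column names are hypothetical.

```python
# Illustrative only: broadcast a small dimension table and control output partitioning.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tuning-example").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")        # large fact table (assumed)
countries = spark.read.parquet("s3://example-bucket/countries/")  # small lookup table (assumed)

# Broadcasting the small side avoids a shuffle-heavy sort-merge join.
enriched = events.join(F.broadcast(countries), on="country_code", how="left")

# Repartition by the write key so each output partition maps to a reasonable file count.
(
    enriched.repartition("event_date")
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/enriched_events/")
)
spark.stop()
```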
Posted 1 month ago
6.0 - 11.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
Lead a team to design, develop, test, deploy, maintain and continuously improve software. Mentor the engineering team to develop and perform as highly as possible. Guide and help the team adopt best engineering practices. Support driving modern solutions to complex problems. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
7+ years of overall IT experience. 3+ years of experience with AWS (all services needed for Big Data pipelines, such as S3, EMR, SNS/SQS, EventBridge, Lambda, CloudWatch, MSK, Glue, container services, etc.), Spark, Scala and Hadoop. 2+ years of experience with Python, shell scripting, orchestration (Airflow or MWAA preferred), SQL, CI/CD (Git preferred, with experience in deployment pipelines) and DevOps (including supporting the production stack and working with SRE teams). 1+ years of experience with infrastructure as code (Terraform preferred). 1+ years of experience with Spark streaming. Healthcare domain and data standards.

Preferred Qualifications
Azure, Big Data and/or Cloud certifications.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
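For illustration, a hedged sketch of the Spark streaming experience listed above: a Structured Streaming job reading from Kafka (e.g., MSK) and landing data in S3. It assumes the Spark-Kafka connector is on the classpath, and the brokers, topic, and paths are placeholders.

```python
# Illustrative only: Kafka -> Parquet-on-S3 Structured Streaming job.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical MSK brokers
    .option("subscribe", "claims-events")                # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast the value to string for downstream parsing.
messages = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    messages.writeStream.format("parquet")
    .option("path", "s3://example-bucket/stream-landing/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/claims-stream/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```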
Posted 1 month ago
4.0 - 7.0 years
10 - 14 Lacs
Chennai
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

The Role in Brief: The Data Integration Analyst is responsible for implementing, maintaining, and supporting HL7 interfaces between customers, both external and internal, and Optum's integration platforms. The Engineer will work in a team but will have individual assignments to work on independently. Engineers are expected to work under aggressive schedules, be self-sufficient, work within established standards, and be able to work on multiple assignments simultaneously. Candidates must be willing to work in a 24/7 environment and will be on call as needed for critical issues.

Primary Responsibilities
Interface Design and Development: Interface Analysis: HL7 message investigation to determine gaps or remediate issues. Interface Design, Development, and Delivery: interface planning, filtering, transformation, and routing. Interface Validation: review, verification, and monitoring to ensure the delivered interface passes acceptance testing. Interface Go-Live and Transition to Support: completing cutover events with teams/partners and executing turnover procedures for hand-off. Provider Enrollments: provisioning and documentation of all integrations.
Troubleshooting and Support: Issue Resolution: troubleshoot issues raised by alarms, support, or project managers, from root cause identification to resolution. Support Requests: handle tier 2/3 support requests and provide timely solutions to ensure client satisfaction. Enhancements/Maintenance: ensure stable and continuous data delivery.
Collaboration and Communication: Stakeholder Interaction: work closely with clients, project managers, product managers and other stakeholders to understand requirements and deliver solutions. Documentation: contribute to technical documentation of specifications and processes. Communication: effectively communicate complex concepts, both verbally and in writing, to team members and clients.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Basic Qualifications
Education: Bachelor's degree in Computer Science or any engineering field
Experience: 2+ years of experience working with HL7 data and integration engines or platforms
Technical Aptitude: Ability to learn new technologies
Skills: Proven solid analytical and problem-solving skills
Required Qualifications
Undergraduate degree or equivalent experience
HL7 Standards knowledge: HL7 v2, v3, CDA (see the parsing sketch after this listing)
Integration Tools knowledge: InterSystems IRIS, Infor Cloverleaf, NextGen Mirth Connect, or equivalent
Cloud Technology knowledge: Azure or AWS
Scripting and Structure: Proficiency in T-SQL and procedural scripting, XML, JSON
Preferred Qualifications
HL7 Standards knowledge: HL7 FHIR, US Core
Integration Tools knowledge: InterSystems Ensemble or IRIS, Caché
Scripting and Structure knowledge: ObjectScript, Perl, Tcl, JavaScript
US Healthcare knowledge
Health Information Systems: Working knowledge
Clinical Data Analysis knowledge
Clinical Processes: Understanding of clinical processes and vocabulary
Soft Skills
Analytical and Creative: Highly analytical, curious, and creative
Organized: Proven solid organization skills and attention to detail
Ownership: Takes ownership of responsibilities
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
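As a rough illustration of the HL7 v2 message handling this role describes, the sketch below pulls a few fields out of a small ADT message with plain Python string handling. The sample message and extracted fields are fabricated for illustration; production interfaces would run on an engine such as Cloverleaf, IRIS, or Mirth Connect rather than hand-rolled parsing.

```python
# Minimal sketch: extract fields from an HL7 v2 message using the standard delimiters.
# HL7 v2 separates segments with carriage returns and fields with '|'.
# The sample message below is fabricated for illustration only.
sample_message = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|202401150830||ADT^A01|MSG00001|P|2.5\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F\r"
    "PV1|1|I|ICU^01^01\r"
)


def parse_segments(message: str) -> dict:
    """Return a dict mapping segment name -> list of fields for the first occurrence."""
    segments = {}
    for raw in message.strip().split("\r"):
        fields = raw.split("|")
        segments.setdefault(fields[0], fields)
    return segments


segments = parse_segments(sample_message)
msh = segments["MSH"]
pid = segments["PID"]

print("Message type:", msh[8])               # MSH-9, e.g. ADT^A01
print("Patient ID:", pid[3].split("^")[0])    # first component of PID-3 (identifier, type MR)
print("Patient name:", pid[5])                # PID-5, family^given
```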
Posted 1 month ago
3.0 - 7.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Overall Responsibilities:
Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform, ensuring data integrity and accuracy (see the PySpark sketch after this listing).
Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements.
Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes.
Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline.
Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem.
Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes.
Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives.
Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.
Software Requirements:
Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques.
Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
Familiarity with Hadoop, Kafka, and other distributed computing tools.
Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
Strong scripting skills in Linux.
Category-wise Technical Skills:
PySpark: Advanced proficiency, including working with RDDs, DataFrames, and optimization techniques.
Cloudera Data Platform: Strong experience with CDP components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools.
Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
Scripting and Automation: Strong scripting skills in Linux.
Experience:
3+ years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform.
Proven track record of implementing data engineering best practices.
Experience in data ingestion, transformation, and optimization on the Cloudera Data Platform.
Day-to-Day Activities:
Design, develop, and maintain ETL pipelines using PySpark on CDP.
Implement and manage data ingestion processes from various sources.
Process, cleanse, and transform large datasets using PySpark.
Conduct performance tuning and optimization of ETL processes.
Implement data quality checks and validation routines.
Automate data workflows using orchestration tools.
Monitor pipeline performance and troubleshoot issues.
Collaborate with team members to understand data requirements.
Maintain documentation of data engineering processes and configurations.
Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Relevant certifications in PySpark and Cloudera technologies are a plus.
Soft Skills:
Strong analytical and problem-solving skills.
Excellent verbal and written communication abilities.
Ability to work independently and collaboratively in a team environment.
Attention to detail and commitment to data quality.
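To make the PySpark responsibilities above concrete, here is a minimal sketch of an ETL step with a basic data quality filter. The source/target paths and column names are hypothetical examples, not details from the posting, and a CDP deployment would typically add Hive table integration, tuning, and orchestration around a step like this.

```python
# Minimal PySpark ETL sketch: read raw records, apply simple quality checks,
# derive a partition column, and write partitioned Parquet. Paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

raw = spark.read.option("header", True).csv("hdfs:///data/raw/orders/")  # hypothetical source path

# Data quality: drop rows missing mandatory keys and exclude negative amounts.
clean = (
    raw.dropna(subset=["order_id", "customer_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") >= 0)
)

# Simple transformation: add a load-date partition column.
enriched = clean.withColumn("load_date", F.current_date())

(enriched.write
    .mode("overwrite")
    .partitionBy("load_date")
    .parquet("hdfs:///data/curated/orders/"))  # hypothetical target path

spark.stop()
```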
Posted 1 month ago
5.0 - 10.0 years
18 - 32 Lacs
Hyderabad
Hybrid
Greetings from AstroSoft Technologies. We are back with an exciting job opportunity for AWS Data Engineer professionals. Join our growing team and explore with us at our Hyderabad office (Hybrid - Gachibowli).
No. of Openings: 10 positions
Role: AWS Data Engineer
Project Domain: USA client - BFSI, Fintech
Experience: 5+ years
Work Location: Hyderabad (Hybrid - Gachibowli)
Job Type: Full-Time
Company: AstroSoft Technologies (https://www.astrosofttech.com/)
Astrosoft is an award-winning company that specializes in the areas of Data, Analytics, Cloud, AI/ML, Innovation and Digital. We have a customer-first mindset and take extreme ownership in delivering solutions and projects for our customers, and have consistently been recognized by our clients as the premium partner to work with. We bring to bear top-tier talent, a robust and structured project execution framework and our significant experience over the years, and have an impeccable record in delivering solutions and projects for our clients. Founded in 2004, with headquarters in Florida and Texas, USA, and a corporate office in Hyderabad, India.
Benefits from AstroSoft Technologies
H1B Sponsorship (depends on project and performance)
Lunch & Dinner (every day)
Health Insurance Coverage - Group
Industry-standard Leave Policy
Skill Enhancement Certification
Hybrid Mode
JOB DETAILS:
Role: Senior AWS Data Engineer
Location: India, Hyderabad, Gachibowli (Vasavi SkyCity)
Job Type: Full Time
Shift Timings: 12.30 PM to 9.30 PM IST
Experience Range: 5+ years
Work Mode: Hybrid (Fri & Mon - WFH)
Interview Process: 3 tech rounds
Job Summary:
Strong experience and understanding of streaming architecture and development practices using Kafka, Kinesis, Spark, Flink, etc. (see the streaming sketch after this listing)
Strong AWS development experience using S3, SNS, SQS, MWAA (Airflow), Glue, DMS and EMR
Strong knowledge of one or more programming languages: Python/Java/Scala (ideally Python)
Experience using Terraform to build IaC components in AWS
Strong experience with ETL tools in AWS; ODI experience is a plus
Strong experience with database platforms: Oracle, AWS Redshift
Strong experience in SQL tuning, tuning ETL solutions, and physical optimization of databases
Very familiar with SRE concepts, including evaluating and implementing monitoring and observability tools like Splunk, Datadog, CloudWatch and other job, log or dashboard concepts for customer support and application health checks
Ability to collaborate with our business partners to understand and implement their requirements
Excellent interpersonal skills and the ability to build consensus across teams
Strong critical thinking and ability to think out of the box
Self-motivated and able to perform under pressure
AWS certified (preferred)
Thanks & Regards
Karthik Kumar
HR-TAG Lead - India
Astrosoft Technologies, Unit 1810, Level 18, Vasavi Sky City, Gachibowli, Hyderabad, Telangana 500081.
Contact: +91-8712229084
Email: karthik.jangam@astrosofttech.com
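As a hedged illustration of the streaming stack named in this summary (Kafka/MSK plus Spark), the sketch below reads a Kafka topic with Spark Structured Streaming and lands it in S3. The broker address, topic name, and S3 locations are placeholders; a real MWAA/Glue/EMR deployment would add schema handling, authentication, and monitoring, and requires the Spark Kafka connector package on the classpath.

```python
# Minimal Spark Structured Streaming sketch: consume a Kafka topic and write Parquet to S3.
# Broker, topic, and S3 locations are hypothetical placeholders.
# Requires the spark-sql-kafka connector package to be available to the Spark job.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka_to_s3_sketch").getOrCreate()

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder bootstrap servers
         .option("subscribe", "payments-events")               # placeholder topic
         .option("startingOffsets", "latest")
         .load()
)

# Kafka values arrive as binary; cast to string and keep the event timestamp.
decoded = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

query = (
    decoded.writeStream
           .format("parquet")
           .option("path", "s3://example-bucket/curated/payments/")              # placeholder sink
           .option("checkpointLocation", "s3://example-bucket/checkpoints/payments/")
           .trigger(processingTime="1 minute")
           .start()
)

query.awaitTermination()
```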
Posted 1 month ago