
9069 Query Jobs - Page 47

JobPe aggregates listings for convenient access; applications are submitted directly on the original job portal.

8.0 years

0 Lacs

Andhra Pradesh, India

On-site


Requires 8 years of hands-on experience with Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices.

Responsibilities: Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Ensure compliance; implement monitoring and automation. Guide developers on schema design and query optimization. Conduct DB health audits and capacity planning. Collaborate with cross-functional teams to define, design, and ship new features. Work across the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues promptly to keep systems performing optimally. Keep up to date with industry trends and advancements, incorporating best practices into development processes.

Qualifications: Bachelor's or master's degree in computer science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills. TM Vault core banking knowledge is good to have.

Posted 6 days ago

Apply


0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity

This is an opportunity to become part of the Global Talent Delivery Team, responsible for ensuring alignment of talent solutions, processes, and data, enabling continuous improvement through digital enablement, and delivering management information and predictive insights.
The successful candidate will join the Talent Insights and Analytics Team – Data & Insights, a key sub-function, and will build key relationships and deliver reporting and analytics services to Talent teams globally.

Your Key Responsibilities

Essential Functions of the Job: Collaborate with Talent Insights and Analytics - Business Consulting and Analytics & Workforce Planning teams to build and enable reporting services at scale. Support the delivery of advanced and predictive reporting techniques to deliver robust analyses and support the delivery of insights to the Talent Executive teams. Ensure consistent delivery of reports, compliance/legal reporting, strategic reporting, ad-hoc analysis, technical/complex requests, SuccessFactors (SF) report development, and management reporting/cross-functional reports. Configure and create new reporting and analysis to meet operational and management demands. Apply deep subject matter expertise in data engineering, visualization, and related functional domains to generate and support insights and analysis. Take responsibility for the delivery of reporting services via direct access and bespoke requests, leveraging both automation techniques and manual reporting. Liaise with other groups such as vendors, IT, and all other teams within Talent Delivery. Understand and deliver complex, ad-hoc report analytics requests by leveraging analytics expertise. Provide better managerial insights to stakeholders through integrated and standardized data reports and dashboards. Deliver on diverse requests spanning SF configuration and reporting functionality, SQL, Power BI, advanced Excel, data configuration, storytelling, etc. Develop delivery expertise on the different technologies used in SF within Talent, liaising with different solution and process owners to ensure data availability. Seek ways to automate standard reporting to aid and develop the reporting landscape.
Perform data analysis to assess quality and meaning of data, maintain database and data systems to ensure reorganization of data in a readable format. Support and execute ongoing development of existing solutions by identifying and prioritizing needs, defining the requirements for third party delivery. Analytical/Decision Making Responsibilities: Provide delivery expertise and knowledge in how reporting and analysis operates. Understand the reporting landscape and optimize functional delivery standards. For allocated processes support and implement decisions for defining, delivering, and continuously improving the process. Leverage and review data and information to monitor reporting performance against agreed metrics. (e.g., timelines / efficiency of service delivery) Share market insights and review findings with key stakeholders / networks influencing change as required. Other Responsibilities: Collaborate with extended teams to ensure effective execution of technology implementation, drive quality and performance standards Work closely with the teams across the Talent Delivery and wider Talent Functions for configuration, development, testing and implementation of technological solutions that support business and functional delivery. 
Ability to develop people, skills in coaching, mentoring, and learning on the job Effectiveness in building trust, respect, and cooperation among teams Other Requirements: Due to global nature of the role; travel and willingness to work alternative hours will be required Due to global nature of the role; English language skills - excellent written and verbal communication will be required Skills And Attributes For Success Experience: Experience in delivering functional reporting solutions for business Experience on supporting reporting capabilities and its implementation (SAP, Oracle, SuccessFactors, custom solutions) in a relevant industry or consulting environment Experience of having worked on reporting and analytics solutions and its delivery Demonstrable experience of collaborating with talent colleagues to understand needs/requirements and of underlying reporting and data governance processes & systems Experience of participating in global dispersed teams to enhance services, processes, and standards Demonstrable experience of working in fast-paced, ambiguous, stressful environments to deliver required results Demonstrable experience of working with third party vendors / external system implementors to deliver reporting solutions Demonstrable experience of anticipating issues and challenges and proactively working to navigate challenges Experience of conducting internal and external research and analysis, providing best practices and insights to drive improvements Demonstrable experience of having worked in a collaborative environment or provide subject matter resource advice to achieve successful change outcomes To qualify for the role, you must have Bring deep knowledge of the reporting and analytics operating model, and organization design and ways of working across the talent eco-system. 
Strong business acumen – ability to understand the Talent systems landscape and to consider functionality and integration requirements in line with the capabilities required to implement reporting and data analytics priorities. Ability to participate effectively in virtual teams and networks across diverse and dispersed geographies. Proactive consulting skills that drive business impact; able to interpret functional/technological requirements and, where prioritized, co-create the most relevant and pragmatic approach. Strong teaming skills; collaborate effectively across the talent ecosystem, within the Talent Delivery team, and the firm at large. Strong communication skills for sharing thought leadership across EY and externally to enhance EY’s reputation. Strong organizational skills and attention to detail – the ability to operate within budget and effective time frames. Strong research and analytical skills to track and interpret trending directions for designing reporting and analytics solutions and to identify potential future options. Significant ability to cope with ambiguity and to drive change and performance outcomes in a complex and agile environment.

Reporting, Analytics and Technical Requirements:

Reporting: Understanding and manipulating data and creating reports.

Technical: SuccessFactors report development expertise (SF Report Stories, SF Canvas Reports), SF Plateau Report Designer; Excel (advanced, such as Power Query, VBA macros, etc.); SQL, SSIS, SSMS, SSRS, ETL, relational databases, data modeling. Advanced SQL skills to develop and optimize complex queries for data extraction using aggregate functions, CTEs, window functions, etc. Experience with data manipulation and transformation, including creation of SQL tables, views, and stored procedures. Experience developing and optimizing SSIS packages for data integration and transformation tasks.

Visualization/Dashboards: Microsoft Power BI; MS Power Platform (Power Apps, Power Automate, etc.)
Familiarity with AI platforms Ideally, you’ll also have Functional Experience Learning processes and compliance reporting Learning Consumption Reporting Report on key learning metrics (such as satisfaction, value, hours consumed, speed of delivery and redundant content) Fluency in learning technologies (such as SF LMS or similar product) Strong knowledge of applying analytics to talent and learning data Education: Educated to degree level Higher professional or master’s qualification is preferred, not required Certification Requirements: Higher professional or master’s qualification in a related discipline is preferred, not required Active membership in related professional bodies or industry groups is preferred, not required What We Look For Talent Insights and Analytics – Data & Insights team is looking for an individual with the skills and experience we require, who can work well with our team, takes charge of their personal development, and go above and beyond expectations to help EY build a better working world. What We Offer As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial, and social well-being. Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer: Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. 
Please apply to this role only through the ‘Apply’ link (not through the local office). Your application will then be routed to the appropriate recruiting team. The Exceptional EY Experience. It’s Yours To Build. EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
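As an illustration of the kind of advanced SQL this role asks for (aggregate functions, CTEs, window functions), here is a minimal, hypothetical sketch using Python’s built-in sqlite3 module. The `training_hours` table and all names in it are invented for the example; the pattern itself (a CTE feeding a `RANK()` window function) is the same in PostgreSQL or SQL Server.

```python
import sqlite3

# In-memory database with a small, invented "training_hours" table
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE training_hours (employee TEXT, region TEXT, hours REAL);
INSERT INTO training_hours VALUES
  ('Asha', 'APAC', 12), ('Ben', 'APAC', 7),
  ('Carla', 'EMEA', 15), ('Dev', 'EMEA', 9);
""")

# A CTE aggregates hours per employee, then a window function
# ranks employees within each region by total hours consumed.
query = """
WITH totals AS (
    SELECT employee, region, SUM(hours) AS total_hours
    FROM training_hours
    GROUP BY employee, region
)
SELECT employee, region, total_hours,
       RANK() OVER (PARTITION BY region
                    ORDER BY total_hours DESC) AS region_rank
FROM totals
ORDER BY region, region_rank;
"""
for row in conn.execute(query):
    print(row)
```

Window functions require SQLite 3.25 or newer (bundled with Python 3.8+); the equivalent query runs unchanged on PostgreSQL.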

Posted 6 days ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Very urgent position for Data Engineer Lead.

Job Title: Data Engineer Lead
Experience: 5-10 years
Budget: 10-14 LPA (based on experience)
Location: Pune (hybrid)
Notice Period: Immediate to 15 days
Mandatory skills: Python, GCP, Spark, SQL (expert)

Tech Stack Table
Skills | Experience | Rating out of 10
Python | |
GCP | |
Spark | |
SQL (Expert) | |

Lead Responsibilities
Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project. Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL PostgreSQL. Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL. Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency. Collaborate strategically with database administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans. Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes. Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration. Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures. Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP.
Implement schema conversion or a custom schema mapping strategy for the SQL Server to PostgreSQL shift. Refactor and translate complex stored procedures and T-SQL logic to PostgreSQL-compatible constructs while preserving functional equivalence. Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover. Design fallback procedures and lead post-migration verification and support to ensure business continuity. Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools.

Must-Have Skills
Expertise in data engineering, specifically for Google Cloud Platform (GCP). Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning. Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL. Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions. Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting. Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery. Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration. Strong programming skills in Python or Java for building data pipelines and automating processes. Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services. Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline. Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP.
Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver, Prometheus) for real-time data pipeline visibility and alerting.

Good-to-Have Skills
Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools. Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows. Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner. Exposure to DataOps tools and methodologies for managing data workflows. Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines. Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.

Kindly share profiles only in this tracker format, attached in the body of the mail; profiles without the tracker format will not be considered. Tracker fields: Sl. No, Date, Position, Name of the Candidate, Mobile Number, Email ID, Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period / On Paper, Current Organisation, Current Location, Address with PIN Code, Reason for Leaving, DOB, Offer in Hand, Vendor Name, Comments.
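The post-cutover data reconciliation this listing describes can be sketched in a few lines of Python. This is a simplified, hypothetical illustration — it uses sqlite3 as a stand-in for the real SQL Server and Cloud SQL PostgreSQL connections, and the `accounts` table and fingerprint approach are invented for the example; a production version would use the actual database drivers and account for type and encoding differences between engines.

```python
import sqlite3

def table_fingerprint(conn, table):
    """Row count plus an order-independent checksum of all rows."""
    count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    # XOR of per-row hashes is order-independent, so it tolerates
    # differing physical row order between source and target.
    checksum = 0
    for row in conn.execute(f"SELECT * FROM {table}"):
        checksum ^= hash(tuple(str(v) for v in row))
    return count, checksum

# Stand-ins for the legacy (SQL Server) and migrated (PostgreSQL) databases
legacy = sqlite3.connect(":memory:")
migrated = sqlite3.connect(":memory:")
for db in (legacy, migrated):
    db.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
    db.executemany("INSERT INTO accounts VALUES (?, ?)",
                   [(1, 100.0), (2, 250.5), (3, 0.0)])

src = table_fingerprint(legacy, "accounts")
dst = table_fingerprint(migrated, "accounts")
print("match" if src == dst else "MISMATCH", src, dst)
```

For petabyte-scale tables, the same idea is typically applied per partition or key range so that mismatches can be localized rather than recomputed over the whole table.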

Posted 6 days ago

Apply

4.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Database Administrator At Optimum Info, we are continually innovating and developing a range of software solutions empowering the Network Development and Field Operations businesses in the Automotive, Power Sports and Equipment industries. Our integrated suite of comprehensive solutions provides a seamless and rich experience to our customers, helping them become more effective at their work and create an impact on the organization. Our sharp cultural focus on outstanding customer service and employee empowerment is core to our growth and success. As a growing company, we offer incredible opportunities for learning and growth with the opportunity to manage high-impact business solutions. Position Overview The Database Administrator (DBA) will acquire a functional understanding of all Optimum products and develop expertise on designs of databases. This role will have key participation in database design and management activities to maintain performance of running products and features as well as ensuring optimal performance of new features developed for various product lines across our network development solutions. The job will be based in Noida, India. Key Responsibilities Understand current database designs and database deployment architectures for all product lines. Review and propose steps for optimal database response, train and advise development and support teams in appropriate data handling techniques covering security, design, collection, query and storage. Track resource utilization, collect performance metrics and tune server configuration for optimal performance. Prepare and execute database management processes, including but not limited to patching, maintenance plans, performance review plans, self-audits etc. Prepare and maintain documentation relevant to data models for all product lines. Prepare and maintain documentation relevant to data management e.g. data flows and integration, data security policies and procedures etc. 
Prepare and maintain documentation relevant to database administration e.g. disaster prevention and recovery processes and policies, access control, maintenance schedules etc. Report status of data management and database administration activities and checklists in reviews with senior management. Periodically identify risks and challenges, define mitigation and plans for continuous improvement in services. Review and execute DDL and DML scripts on production environments, maintain logs of changes; manage access to test and production databases. Participate in incident management and RCAs. Plan and execute approved activities in consultation with senior IT management. Keep track of industry standards and emerging tools, techniques and practices and make appropriate recommendations for implementation. Desired qualifications and experience Bachelor’s degree or equivalent with experience in administering Microsoft SQL Server databases hosted on cloud and RDS (on AWS) / managed services on any public cloud. Work experience of 4 - 6 years with 3 – 5 years as a Microsoft SQL Server DBA. Experience working with product development teams and understanding of the role a DBA plays in maintaining application performance and availability. Automotive domain experience will be beneficial. Must be proficient in MS Office tools like Word, Excel, and PowerPoint.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Join a Team That’s Passionate About Making Lives Better!

At Bill Gosling Outsourcing, we believe that success starts with an amazing team. As a global leader in outsourcing solutions, we focus on making lives better, one connection at a time. We provide tailored solutions to businesses around the globe, specializing in customer care, sales, and financial services. We’re looking for enthusiastic, driven individuals to join our dynamic work environment where fun meets results!

The Interaction Analyst is tasked with call listening/analysis and delivering actionable insights for query generation, contributing to process enhancement projects using the Speech Analytics tool. This role ensures the provision of valuable insights to the client through projects.

What You’ll Do
Primary Responsibilities: Collaborates with Business Analysts and Senior Business Analysts on team goals, offering strategic insights. Generates inputs from call listening that can be transformed into queries for actionable insights. Analyzes results and compiles data to aid in the development of queries and analytical goals in alignment with client agreements.

Experience
Mandatory: A background as a quality auditor, with a preference for those who have provided support for process enhancement to both external and internal stakeholders.
Desirable: A minimum of 2 years’ experience in the current role, preferably engaging in projects related to customer interaction analysis or call evaluation, ideally within contact center environments. Experience with speech analytics platforms would be preferred.

Required Skills And Attributes
An enquiring mind, curiosity, and a desire to understand ‘why’. Exceptional listening abilities, particularly in structured call analysis. The capacity to quickly grasp new technologies and the initiative to self-educate.
Outstanding communication skills. Well-developed presentation skills. Proficiency in PowerPoint and Excel. A bachelor’s degree or an equivalent qualification.

What We’re Looking For
All information security responsibilities can be located in The Book of Bill (Global) and The Book of Bill (Global) – French. Please note that information security responsibilities are based on role.

Why Join Us?
Growth Opportunities: We believe in promoting from within and providing opportunities for career advancement. Comprehensive Training: We offer extensive paid training to ensure you’re equipped for success. Team-Oriented Culture: Work in a collaborative, supportive environment with peers who are passionate about what they do. Diversity & Inclusion: We celebrate the unique perspectives and contributions of all our employees. Fun Workplace: Join a vibrant team that knows how to have fun! From team engagement activities to social events, we foster a lively and inclusive work environment where you’ll build strong connections. State-of-the-Art Offices: Work in our modern, well-equipped offices designed to enhance collaboration and productivity. Rewarding Work: Help businesses grow while making a real difference in people’s lives!

Get to Know Us Better!
Follow us to get an insider view of our team in action, our values in motion, and a sneak peek into what makes us an awesome place to work! Twitter & Instagram: bgocareers. Facebook: Bill Gosling Outsourcing. LinkedIn: Bill Gosling Outsourcing. Website: https://www.billgosling.com/careers

By applying to this position, you acknowledge that you have read and understood Bill Gosling Outsourcing’s Privacy Policy and consent to the collection, use, and storage of personal information in accordance with the policy. At Bill Gosling Outsourcing, we believe that diversity makes us stronger. We welcome applicants from all backgrounds and are committed to creating an inclusive and supportive workplace where everyone can thrive.
Regardless of your race, gender, age, ability status, or any other characteristic, you are valued here. If you require accommodations at any stage of the hiring process, we are happy to work with you to ensure you have the support you need – just let us know. Bill Gosling Outsourcing – Where your career thrives!

Posted 6 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer.

Analytics at Innovaccer
Our analytics team is dedicated to weaving analytics and data science magic across our products. They are the owners and custodians of the intelligence behind our products. With their expertise and innovative approach, they play a crucial role in building various analytical models (including descriptive, predictive, and prescriptive) to help our end users make smart decisions. Their focus on continuous improvement and cutting-edge methodologies ensures that they're always creating market-leading solutions that propel our products to new heights of success.

About The Role
We are looking for a Senior Data Analyst for the analytics team who can help us build the next generation of dashboards, reports, and other analytics for our customers in the provider/payer market.

A Day in the Life
Experience with Docker and container orchestration/management tools (Jenkins) and software tools (Jira, Confluence, GitHub, etc.). Develop, scale, and control strategies, standards, guidelines, and governance of continuous integration systems. Collaborate with internal development and QA teams to help ensure end-to-end quality. Develop and maintain Python scripts for automating data processing, API interactions, and system integrations. Proven expertise developing end-to-end data integration for complex projects using SQL, preferably on Snowflake/Postgres databases. Performance tuning at the database level and SQL query optimization. Work with the Data Architect/Solution Architect and application development team to implement data strategies. Contribute to all phases of the software development lifecycle on projects, from requirements discussions, development, and deployment, to final testing and validation of the final product, as well as post-delivery product support. Be able to understand customer requests and implement simple and effective solutions with minimal guidance and oversight. Implement projects in Agile; participate in daily scrum meetings and coordinate with various teams to achieve common goals.

What You Need
3+ years of experience in data analytics, with experience in SQL and Python. Possess a customer-focused attitude through conversations and documentation. Ability to lead a small team and manage the project deliverables for yourself and for the team. Strong written and spoken communication skills. Hands-on ability to set up reporting tools and build reports and ad hoc functionality. Should be a very data-driven person with loads of curiosity and willingness to ask questions about the data. Ability to think out of the box and find ways to bring efficiency to existing processes.

We offer competitive benefits to set you up for success in and outside of work.

Here's What We Offer
Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days. Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition. Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered. Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury. Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only. Creche Facility for Children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices.

Where And How We Work
Our Noida office is situated in a posh tech space, equipped with various amenities to support our work environment.
Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team. Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered. Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.
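Roles like the Senior Data Analyst position above typically pair SQL with small Python scripts for automated data-quality checks. A minimal sketch of that idea, using SQLite as a stand-in for Snowflake/Postgres (the table, column names, and thresholds are hypothetical):

```python
import sqlite3

def row_count_check(conn, table, min_rows):
    """Return True if `table` has at least `min_rows` rows."""
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return count >= min_rows

def null_rate(conn, table, column):
    """Fraction of rows where `column` IS NULL."""
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) FROM {table}"
    ).fetchone()
    return (nulls or 0) / total if total else 0.0

# Hypothetical claims table standing in for a real warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER, payer TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [(1, "A"), (2, None), (3, "B"), (4, "A")])

print(row_count_check(conn, "claims", 3))   # True
print(null_rate(conn, "claims", "payer"))   # 0.25
```

In practice the same checks would run against the warehouse on a schedule, with failures routed to alerting rather than printed.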

Posted 6 days ago


3.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


For over four decades, PAR Technology Corporation (NYSE: PAR) has been a leader in restaurant technology, empowering brands worldwide to create lasting connections with their guests. Our innovative solutions and commitment to excellence provide comprehensive software and hardware that enable seamless experiences and drive growth for over 100,000 restaurants in more than 110 countries. Embracing our "Better Together" ethos, we offer Unified Customer Experience solutions, combining point-of-sale, digital ordering, loyalty and back-office software solutions as well as industry-leading hardware and drive-thru offerings. To learn more, visit partech.com or connect with us on LinkedIn, X (formerly Twitter), Facebook, and Instagram. Position Description PAR Technology is seeking a technically skilled Sustaining Engineer to maintain and troubleshoot our PAR POS and integrated payment solutions for the hospitality industry. This role focuses on resolving complex technical issues, improving system reliability, and supporting our production environments through hands-on problem-solving. The ideal candidate is a detail-oriented engineer with strong diagnostic skills and practical experience maintaining critical payment systems and POS infrastructure. 
Position Location: Jaipur
Reports To: Manager

Unleash Your Potential: What You Will Be Doing and Owning
Diagnose and resolve production issues across POS, payment processing, and integrated systems
Develop and maintain scripts (PowerShell/Python) to automate troubleshooting and monitoring tasks
Analyze system logs, application dumps, and SQL query performance to identify root causes
Troubleshoot API integrations (REST/SOAP) and middleware message queues
Support PCI-compliant payment systems, including EMV/NFC transaction flows
Debug POS peripherals (printers, scanners, cash drawers) using manufacturer tools
Collaborate with engineering teams to implement permanent fixes for recurring issues
Maintain and enhance system monitoring using New Relic/Datadog
Document technical solutions and create knowledge base articles
Participate in on-call rotation for critical production support

What We're Looking For
3+ years of hands-on experience maintaining production POS, payment, or distributed systems
Proficiency in Windows/Linux system administration and log analysis
Strong SQL skills, including query optimization and deadlock troubleshooting
Scripting proficiency in PowerShell and/or Python for automation
Payment systems expertise (EMV/NFC, PCI DSS compliance requirements)
API troubleshooting with both REST (Postman, curl) and SOAP (SoapUI, Wireshark)
Networking knowledge (TCP/IP, Wi-Fi) and packet analysis (Wireshark/tcpdump)
Experience with observability platforms (New Relic/Datadog)
Understanding of cryptographic processes and TLS/SSL management
Ability to analyze .NET/Java application dumps and system event logs
POS integration layers and middleware experience

Preferred Qualifications
Certifications: Microsoft SQL Server, Network+, CCNA, or PCI Professional (PCIP)
Experience with containerization (Docker, Kubernetes)
Cloud platform knowledge (Azure, AWS)
Familiarity with POS software (NCR Aloha, Micros, or similar)
Hospitality/restaurant technology background

Interview Process
Interview #1: Phone screen with the Talent Acquisition Team
Interview #2: Video interview with the Technical Teams (via MS Teams/F2F)
Interview #3: Video interview with the Hiring Manager (via MS Teams/F2F)

PAR is proud to provide equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. We also provide reasonable accommodations to individuals with disabilities in accordance with applicable laws. If you require reasonable accommodation to complete a job application, pre-employment testing, a job interview or to otherwise participate in the hiring process, or for your role at PAR, please contact accommodations@partech.com. If you'd like more information about your EEO rights as an applicant, please visit the US Department of Labor's website.
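The log-analysis duty in the sustaining-engineer role above is often automated with a short script that groups errors into candidate root causes. An illustrative sketch (the log format and error names are made up for the example):

```python
import re
from collections import Counter

# Assumed log line shape: "<date> <time> <LEVEL> <message>"
LOG_LINE = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def summarize_errors(lines):
    """Count ERROR messages by their first token, a rough root-cause grouping."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("msg").split()[0]] += 1
    return counts

sample = [
    "2025-06-23 10:01:02 INFO POS started",
    "2025-06-23 10:01:05 ERROR PaymentTimeout gateway=emv1",
    "2025-06-23 10:02:10 ERROR PaymentTimeout gateway=emv2",
    "2025-06-23 10:03:00 ERROR PrinterOffline station=3",
]

print(summarize_errors(sample))
# Counter({'PaymentTimeout': 2, 'PrinterOffline': 1})
```

A repeated error class (here, two payment timeouts against one printer fault) is where a root-cause investigation would usually start.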

Posted 6 days ago


4.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a growing internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
Competitive compensation, including base pay and annual incentive
Comprehensive health and life insurance and well-being benefits, based on location
Pension / Retirement benefits
Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact you will have in this role:
The Database Administrator will provide database administrative support for all DTCC environments, including Development, QA, client test, and our critical high-availability production environment and DR data centers. The role requires extensive knowledge of all aspects of MSSQL database administration and the ability to support other database platforms, including both Aurora PostgreSQL and Oracle. This DBA will have a high level of impact in the generation of new processes and solutions, while operating under established procedures and processes in a critically important financial services infrastructure environment. The ideal candidate will ensure optimal performance, data security, and reliability of our database infrastructure.

What You'll Do:
Install, configure, and maintain SQL Server instances (on-prem and cloud-based)
Implement and manage high-availability solutions, including Always On availability groups and clustering.
Support development, QA, PSE, and Production environments using the ServiceNow ticketing system.
Review production performance reports for variances from normal operation.
Optimize SQL queries and indexes for better efficiency. Analyze queries and recommend tuning strategies.
Maintain database performance by calculating optimum values for database parameters, implementing new releases, completing maintenance requirements, and evaluating computer operating systems and hardware products.
Implement database backup and recovery strategies using tools like SQL Server backup, log shipping, and other technologies.
Provide 3rd-level support for DTCC critical production environments. Participate in root cause analysis for database issues.
Prepare users by conducting training, providing information, and resolving problems.
Maintain quality service by establishing and enforcing organizational standards.
Set up and maintain database replication and clustering solutions.
Maintain professional and technical knowledge by attending educational workshops, reviewing professional publications, establishing personal networks, benchmarking innovative practices, and participating in professional societies.
Share responsibility for off-hours support.
Maintain documentation on database configurations and procedures.
Provide leadership and direction for the architecture, design, maintenance, and L1, L2, and L3 level support of a 7x24 global infrastructure.

Qualifications:
Bachelor's degree or equivalent experience

Talents Needed for Success:
Strong Oracle experience with 19c, 21c, and 22c.
A minimum of 4+ years of proven relevant experience in SQL
Solid understanding of MSSQL Server and Aurora Postgres databases
Strong knowledge of Python and Angular.
Working knowledge of Oracle's GoldenGate replication technology
Demonstrated strong performance tuning and optimization skills in MSSQL, PostgreSQL, and Oracle databases.
Good experience with High Availability and Disaster Recovery (HA/DR) options for SQL Server.
Good experience with backup and restore processes.
Proficiency in PowerShell scripting for automation.
Good interpersonal skills and the ability to coordinate with various stakeholders.
Follow standard processes for organizational change, incident management, and problem management.
Demonstrated ability to solve complex systems and database environment issues.

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
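The "optimize SQL queries and indexes" duty in the DBA role above is commonly verified by comparing query plans before and after adding an index. A small sketch using SQLite's `EXPLAIN QUERY PLAN` as a neutral stand-in for SQL Server/Postgres tooling (the `trades` schema is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany("INSERT INTO trades (account, amount) VALUES (?, ?)",
                 [("ACC%03d" % (i % 50), float(i)) for i in range(1000)])

def plan(sql):
    """Return the query-plan detail strings for `sql`."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT SUM(amount) FROM trades WHERE account = 'ACC007'"

before = plan(query)          # full table scan: no index on `account` yet
conn.execute("CREATE INDEX idx_trades_account ON trades(account)")
after = plan(query)           # now searches via idx_trades_account

print(before, after, sep="\n")
```

The before/after plan comparison (scan versus index search) is the same workflow a DBA would run with SQL Server execution plans or Postgres `EXPLAIN ANALYZE`.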

Posted 6 days ago


5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Responsibilities
The main role of a Senior Consultant (Database) is to troubleshoot highly complex technical problems (Oracle Database & GoldenGate/Exadata) requiring a high level of technical expertise. Works directly with customers. Open to working in rotational shifts based on business requirements. Drives improvements in product quality. Serves as Situation Manager on highly sensitive Customer issues. Consults with management to direct the resolution of critical Customer situations. Consults with Customers on complex use of Oracle products. Achieves knowledge transfer through the development and delivery of training, knowledge sessions, mentoring, etc. Creates/reviews Knowledge Articles. Analyzes workload, determines best practices, and implements changes to improve productivity. Proactively contributes to increasing the team's efficiency by sharing knowledge, providing feedback about best practices, and writing tools/utilities.

Who are we looking for?
Required Qualification:
5 to 10 years of work experience in Oracle DB 12c/18c/19c installation, upgrade, configuration, migration, and performance tuning.
Technical degree, i.e., BE / B.Tech / M.Tech / MCA / M.Sc. in Computer Science / Management Information Systems / Engineering.
Open to working in 24x7 shifts.
Oracle OCP DBA certification - preferred.

Technical skills:
Database architecture knowledge and administration.
Experience with cloud technologies from different vendors.
Thorough understanding of Oracle database features.
Extensive hands-on interaction with large database management systems.
Multitenant, Real Application Clusters (RAC) & ASM/storage areas.
Backup and Recovery, RMAN, Data Guard, knowledge of various restore and recovery scenarios.
Performance tuning, parallel query, query tuning.
Partitioning.
Database security.
GoldenGate & replication.
Enterprise Manager.
Experience with engineered systems - Exadata is a plus.
General UNIX/Linux concepts and administration: managing kernel parameters, partitioning, and file systems. Operating systems expertise will be an added advantage.

Personal Attributes:
Self-driven and result-oriented.
Open to working from the client's location (locally).
Strong customer support and client relations skills.
Ability to work effectively in high-volume and high-pressure situations.
Ability and flexibility to work late shifts.

Certifications preferred: OCI/OCP certification.

Posted 6 days ago


4.0 years

0 Lacs

India

Remote


Data Engineer
Location: India/Remote
Type of Contract: Permanent, Full-time
Start Date: ASAP
Salary: Competitive

Who we are: Oxford International
Oxford International Education Group is a renowned institution dedicated to providing exceptional educational experiences to international students. With a global presence and a commitment to academic excellence, we strive to empower students to achieve their full potential and thrive in a dynamic, interconnected world. We are proud of our culture and have recently been officially certified as a Great Place to Work©!

Job Purpose
OIEG is a Private Equity-backed, rapidly growing organisation that requires an experienced Data Engineer to support the operations within the Data function. As a data engineer, you will be responsible for designing, building, and maintaining the assets and artefacts required for data storage, processing, and analysis. Your role will involve working within the data and analytics team, alongside business stakeholders, to deliver data and analytics capabilities. The role will include end-to-end implementation with a focus on data engineering elements. The ideal person should be inquisitive, have an innovative approach, have a proactive and can-do attitude, be a good problem solver, and be open to learning new skills.

Key Responsibilities
Design, implement, and maintain scalable data lakehouses and warehouses using best practices in cloud and distributed systems
Build and orchestrate end-to-end data pipelines using Data Factory and Notebooks within Microsoft Fabric, and optimize, monitor, manage, and maintain them
Transform data using SQL and Python/pySpark
Develop robust ETL/ELT frameworks to extract, load, and transform data from APIs, databases, flat files, and third-party sources into lakehouses or data warehouses
Ensure data reliability and integrity through automated quality checks, validation rules, and observability tooling.
Implement and maintain documentation and metadata management practices for reproducibility and knowledge sharing
Collaborate with BI Developers, Analysts, and Business Stakeholders to define data requirements and deliver analytics-ready datasets

Person Specifications
Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, Data Science, or a related technical discipline

Experience
Minimum of 4 years' hands-on experience as a Data Engineer or in a similar role, ideally in an Azure/Fabric environment. Any Azure/Fabric certifications would be advantageous.
Data engineering, data transformation, and data modelling
BI tools/visualization
Knowledge of data governance
Data engineering: designing, building, and maintaining data/delta lakes/DWHs, data orchestration, and ETL. Understanding of data architecture and design principles
Any exposure to Fivetran or similar data acquisition tools, or an understanding of them
Data modelling techniques using the Star/Kimball methodology
Knowledge of Power BI or similar tools, and adept at telling stories through data and visualization
Source control: ideally exposure to Azure DevOps, GitHub

Skills
Development of lakehouses/DWHs, ideally within an Azure/Fabric environment
Fabric/Azure Data Factory/Synapse Analytics/Dataflows Gen2, or similar tools such as Databricks; or similar ETL/data acquisition tools
Proficient in Python/pySpark for data engineering, including scripting, automation, and building transformation logic
Advanced knowledge of SQL/T-SQL, including query optimization, stored procedures, and indexing strategies
Skills with Power BI, DAX, and Power Query

This job description is provided as a guide to the role. It is not intended to be an exhaustive description of duties and responsibilities and may be subject to periodic revision. Oxford International is committed to safeguarding and promoting the welfare of children.
Recruitment checks, including checks with past employees, are undertaken in accordance with our Recruitment and Selection policy. Oxford International is an equal opportunity employer. Every applicant and employee has the same opportunities regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status.
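The ETL/ELT and star-schema responsibilities in the Data Engineer role above can be sketched in plain Python (in practice this would be pySpark or SQL in Fabric; the course dimension and raw records are invented for the example):

```python
# Toy star-schema load: resolve a business key to a surrogate key, then emit fact rows.
course_dim = {"ENG101": 1, "SCI200": 2}   # hypothetical dimension: business key -> surrogate key

def transform(raw_rows):
    """Clean raw booking records into fact rows keyed on the course dimension."""
    facts = []
    for row in raw_rows:
        code = row["course"].strip().upper()   # standardize the business key
        if code not in course_dim:             # unresolvable key: skip (a real pipeline would quarantine it)
            continue
        facts.append({
            "course_key": course_dim[code],
            "students": int(row["students"]),
            "revenue": round(float(row["revenue"]), 2),
        })
    return facts

raw = [
    {"course": " eng101 ", "students": "12", "revenue": "2400.50"},
    {"course": "SCI200", "students": "8", "revenue": "1600"},
    {"course": "ART999", "students": "5", "revenue": "0"},   # unknown course, dropped
]

print(transform(raw))
```

The same shape (standardize, resolve dimension keys, cast types, route rejects) carries over directly to a pySpark job writing into a lakehouse fact table.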

Posted 6 days ago


5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Our technology services client is seeking multiple Sitecore Developers to join their team on a contract basis. These positions offer a strong potential for conversion to full-time employment upon completion of the initial contract period. Below are further details about the role:

Job Title: Sitecore Developer
Location: Pune, Bangalore
Years of Experience: 5+ Years
Shifts: Rotational
Notice Period: 15 Days

Responsibilities
Partner with business and technical teams to scope projects and define high-level requirements.
Drive solution architecture and implementation using Sitecore XM Cloud, JSS, and headless development approaches.
Ensure solutions meet business requirements, follow Sitecore best practices, and maintain high-quality standards.
Lead development teams in defining and executing implementation plans for Sitecore XM Cloud projects.
Act as the primary escalation point for technical issues and challenges during development and deployment.
Identify, assess, and mitigate risks associated with Sitecore implementations.
Collaborate with other developers to resolve technical challenges and promote knowledge sharing.
Maintain strong partnerships with business stakeholders and provide strategic technical leadership in Sitecore architecture.
Lead and support testing efforts, including defect analysis, code fixes, and user acceptance testing (UAT) support.
Contribute to the adoption and continuous improvement of modern Sitecore development practices, including Experience Edge, SXA Headless, and DevOps automation.

Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field.
Sitecore Developer Certification (preferably XM Cloud) is a plus.
Proven experience with Sitecore XM Cloud, headless CMS architecture, and JSS (JavaScript Services).
Proficiency in integrating Sitecore with modern front-end frameworks such as React, Next.js, Angular, or Vue.js.
Hands-on experience with Sitecore development tools, including Sitecore CLI, TDS, Unicorn, and Glass.Mapper.
Strong understanding of reusable components, templates, customization, and personalization in Sitecore.
Experience with Sitecore Search, Sitecore Workflows, Sitecore Commerce / OrderCloud, Sitecore Email Experience Manager (EXM), and configuration and deployment of Experience Edge and CD/CM instances.
Solid knowledge of relational databases and SQL query design and optimization.
Familiarity with CI/CD pipelines, containerization (Docker), and cloud-native deployment strategies is a plus.

If you are interested, share your updated resume with vinod@s3staff.com

Posted 6 days ago


4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 23-Jun-2025
Job ID: 10081

Description And Requirements
Job Responsibilities
Manages design, distribution, performance, replication, security, availability, and access requirements for large and complex Oracle databases from version 11g to 19c (CDB/PDB and standalone) on Linux/AIX operating systems.
Designs and develops physical layers of databases to support various application needs.
Implements backup, recovery, archiving, conversion strategies, and performance tuning.
Manages job scheduling, application release, database change, and compliance.
Makes use of advanced database features such as partitioning, advanced compression, multitenant architecture, etc.
Participates in the design, implementation, and maintenance of automated infrastructure solutions using Infrastructure as Code tools like Ansible, Elastic, and Terraform.
Participates in the development and management of Azure DevOps CI/CD pipelines to automate infrastructure deployments using Ansible and Elastic.
Identifies and resolves problems utilizing structured tools and techniques.
Provides technical assistance and mentoring to staff in all aspects of database management.
Consults and advises application development teams on database security, query optimization, and performance.
Writes scripts for automating DBA routine tasks and documents database maintenance processing flows per standards.
Participates in basic Root Cause Analysis (RCA).
Maintains and administers data infrastructure security policies, safeguarding information, evaluating existing data infrastructure security procedures, and identifying new areas of risk.
Working knowledge of the ServiceNow ticketing system, KB article creation and maintenance, and CMDB maintenance
4+ years of experience with performance tuning, physical database design, database programming, and shell scripting
Working experience with cloud infrastructure, Elastic, Ansible, data replication, and project management

Education, Technical Skills & Other Critical Requirements
Education
Bachelor's degree in Computer Science, Information Systems, or another related field with 7+ years of IT and infrastructure engineering work experience.

Experience (In Years)
7+ years total IT experience and 4+ years relevant experience in Oracle database

Technical Skills
4+ years of related work experience with application database implementation; knowledge of all key Oracle utilities such as SQL*Plus, RMAN, OEM, Data Pump, Active Data Guard, and OID.
2+ years with Unix and Linux operating systems and 1 year of shell scripting.
Strong database analytical skills to improve application performance.
Management of database elements, including creation, alteration, deletion, and copying of schemas, databases, tables, views, indexes, stored procedures, triggers, and integrity constraints
Engineering and support experience with PeopleSoft Financials application databases and GoldenGate replication.
Extensive experience in backup and recovery (Data Pump, RMAN, Rubrik).
Good knowledge of performance tuning, with hands-on experience with AWR, ADDM, and SQLTRPT.
Experience in Data Guard configuration (DR).
Experience in database switchover and failover.
Working knowledge of cloud computing (Azure, OCI)
Data security: managing roles and privileges for users and groups.
Experience in installation and configuration of OEM Agent and monitoring.
Experience working with ticketing tools (ServiceNow)
Understanding of modern IT infrastructure such as cloud architecture, as well as the Agile DevOps framework.

Other Critical Requirements
Automation tools and programming such as Python and Ansible.
Monitoring the database using Elastic and Oracle Enterprise Manager (OEM)
OCA or OCP 19c (or above) certification preferable
Excellent written and oral communication skills, including the ability to clearly communicate/articulate technical and functional issues with conclusions and recommendations to stakeholders.
Prior experience in handling stateside and offshore stakeholders
Experience in creating and delivering business presentations.
Demonstrated ability to work independently and in a team environment
Ability to work 24x7 rotational shifts to support production, development, and test databases

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
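The "scripts for automating DBA routine tasks" responsibility above often amounts to a small threshold check over monitoring output. An illustrative sketch (the tablespace figures are made up; a real script would query DBA_DATA_FILES or pull metrics from OEM/Elastic):

```python
def tablespace_alerts(usage, warn=0.85, crit=0.95):
    """Classify tablespaces by fullness ratio; returns {name: severity}."""
    alerts = {}
    for name, (used_mb, total_mb) in usage.items():
        ratio = used_mb / total_mb
        if ratio >= crit:
            alerts[name] = "CRITICAL"
        elif ratio >= warn:
            alerts[name] = "WARNING"
    return alerts

# Hypothetical (used_mb, total_mb) snapshot per tablespace.
usage = {
    "SYSTEM": (900, 1000),   # 90% -> warning
    "USERS":  (990, 1000),   # 99% -> critical
    "SYSAUX": (400, 1000),   # 40% -> ok, not reported
}

print(tablespace_alerts(usage))   # {'SYSTEM': 'WARNING', 'USERS': 'CRITICAL'}
```

Wired to a scheduler and a ticketing integration, the same check becomes the kind of proactive automation the listing describes.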

Posted 6 days ago


0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


We are seeking an enthusiastic and personable individual who thrives on team interaction and has a genuine passion for leading and managing people. The ideal candidate will possess strong leadership skills and the ability to cultivate a positive, inclusive workplace culture. This role requires someone committed to building strong relationships within the team and promoting a culture that drives engagement, growth, and harmony. Job Description: Recruitment and Selection Identify staffing needs, conduct candidate sourcing, screening, interviewing, and coordinate onboarding and exit processes. Supervision and Assessment Supervise employees, ensuring productivity and performance, and conduct regular assessments with feedback. Attendance and Payroll Administration Track attendance, leaves, and ensure accurate, timely payroll processing. Hiring Outreach Reach out to colleges via LinkedIn, WhatsApp, and other platforms, and lead the campus ambassador program. Personal Assistant to the Founder Manage the founder’s calendar, screen queries, set up meetings, and represent the founder in communications and meetings. Staff Relations and Engagement Serve as a point of contact for staff inquiries, address concerns, and foster a positive work environment. Handle complaints, conflicts, and disciplinary actions. Departmental Supervision Supervise, lead, and assist departments, while shadowing the founder across all brand functions (Auor, The Loft, and the new fashion venture). Query Handling and Departmental Coordination Manage incoming queries on LinkedIn, Instagram, and email, ensuring prompt responses and appropriate delegation. HR Policies, Records, and Reporting Prepare SOPs and company policies, maintain accurate staff records, and provide daily HR activity reports to stakeholders. 
Skills Required: Must be a quick learner with the capacity to be proactive in taking up tasks Efficient time management skills and the ability to prioritize work Ability to regulate work schedules of interns and lead a team of 10+ members Excellent written and verbal communication skills Outstanding project management skills Excellent interpersonal and relationship building skills Ability to multi-task and manage different operations timely Please Note: Working hours needed: 10:00 am to 8:00 pm every day. An experience letter will be provided after successful completion of the term. Letter of Recommendation will be provided based on exceptional conduct. Minimum tenure for this position would be 12 months, with the first 1 month being probationary. The associate will go through a training process in the first month, upon which we will confirm the role, based solely on his/her performance. CTC: INR 2.4 lac per annum fixed + incentives. Please fill in the below form in order to proceed further with the application process: https://forms.gle/bWWuMXp6LZqxA7NQ9

Posted 6 days ago


2.0 years

0 Lacs

Pune, Maharashtra, India

Remote


At NiCE, we don't limit our challenges. We challenge our limits. Always. We're ambitious. We're game changers. And we play to win. We set the highest standards and execute beyond them. And if you're like us, we can offer you the ultimate career opportunity that will light a fire within you.

So, what's the role all about?
We are seeking a skilled and experienced Developer with expertise in .NET programming along with knowledge of LLMs and AI to join our dynamic team. As a Contact Center Developer, you will be responsible for developing and maintaining contact center applications, with a specific focus on AI functionality. Your role will involve designing and implementing robust and scalable AI solutions, ensuring an efficient agent experience. You will collaborate closely with cross-functional teams, including software developers, system architects, and managers, to deliver cutting-edge solutions that enhance our contact center experience.

How will you make an impact?
Develop, enhance, and maintain contact center applications with an emphasis on copilot functionality.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Perform system analysis, troubleshooting, and debugging to identify and resolve issues.
Conduct regular performance monitoring and optimization of code to ensure optimal customer experiences.
Maintain documentation, including technical specifications, system designs, and user manuals.
Stay up to date with industry trends and emerging technologies in contact center, AI, LLM, and .NET development, and apply them to enhance our systems.
Participate in code reviews and provide constructive feedback to ensure high-quality code standards.
Deliver high-quality, sustainable, maintainable code.
Participate in reviewing design and code (pull requests) for other team members - again with a secure-code focus.
Work as a member of an agile team responsible for product development and delivery.
Adhere to agile development principles while following and improving all aspects of the scrum process.
Follow established department procedures, policies, and processes.
Adhere to the company Code of Ethics and CXone policies and procedures.
Excellent English and experience working in international teams are required.

Have you got what it takes?
BS or MS in Computer Science or related degree
2-4 years' experience in software development.
Strong knowledge of working with and developing microservices.
Design, develop, and maintain scalable .NET applications specifically tailored for contact center copilot solutions using LLM technologies.
Good understanding of .NET and design patterns, and experience implementing them
Experience developing with REST APIs
Integrate various components, including LLM tools, APIs, and third-party services, within the .NET framework to enhance functionality and performance.
Implement efficient database structures and queries (SQL/NoSQL) to support high-volume data processing and real-time decision-making capabilities.
Utilize Redis for caching frequently accessed data and optimizing query performance, ensuring scalable and responsive application behavior.
Identify and resolve performance bottlenecks through code refactoring, query optimization, and system architecture improvements.
Conduct thorough unit testing and debugging of applications to ensure reliability, scalability, and compliance with specified requirements.
Utilize Git or similar version control systems to manage source code and coordinate with team members on collaborative projects.
Experience with Docker/Kubernetes is a must.
Experience with a cloud service provider, Amazon Web Services (AWS)
Experience with AWS Cloud on any technology (preferred: Kafka, EKS, Kubernetes)
Experience with Continuous Integration workflow and tooling.
Stay updated with industry trends, emerging technologies, and best practices in .NET development and LLM applications to drive innovation and efficiency within the team. You will have an advantage if you also have: Strong communication skills Experience with cloud service provider like Amazon Web Services (AWS), Google Cloud Engine, Azure or equivalent Cloud provider is a must. Experience with ReactJS. What’s in it for you? Join an ever-growing, market disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr! Enjoy NiCE-FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Requisition ID: 7442 Reporting into: Sandip Bhattcharjee Role Type: Individual Contributor About NiCE NICE Ltd. (NASDAQ: NICE) software products are used by 25,000+ global businesses, including 85 of the Fortune 100 corporations, to deliver extraordinary customer experiences, fight financial crime and ensure public safety. Every day, NiCE software manages more than 120 million customer interactions and monitors 3+ billion financial transactions. Known as an innovation powerhouse that excels in AI, cloud and digital, NiCE is consistently recognized as the market leader in its domains, with over 8,500 employees across 30+ countries. NiCE is proud to be an equal opportunity employer. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, age, sex, marital status, ancestry, neurotype, physical or mental disability, veteran status, gender identity, sexual orientation or any other category protected by law.
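The Redis caching requirement in the listing above (caching frequently accessed data with expiry to keep queries responsive) can be sketched in miniature. This is an illustrative in-memory TTL cache in Python, not NiCE's actual implementation; the class, keys, and TTL values are all invented for the example:

```python
import time

class TTLCache:
    """Minimal in-memory cache with per-entry expiry, illustrating the
    Redis-style caching of frequently accessed data described above."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict expired entries on read
            return default
        return value

# Hypothetical usage: cache an agent profile lookup for a short window.
cache = TTLCache(ttl_seconds=0.05)
cache.set("agent:42", {"name": "Ada"})
print(cache.get("agent:42"))   # fresh entry is returned
time.sleep(0.06)
print(cache.get("agent:42"))   # expired entry is evicted and the default (None) returned
```

A real deployment would use Redis (with `SET key value EX ttl`) so the cache is shared across service instances; the sketch only shows the expiry semantics.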

Posted 6 days ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Responsibilities
The main role of a Senior Consultant (Database) is to troubleshoot highly complex technical problems (Oracle Database, GoldenGate, Exadata) requiring a high level of technical expertise.
- Works directly with customers; open to working rotational shifts based on business requirements.
- Drives improvements in product quality.
- Serves as Situation Manager on highly sensitive customer issues.
- Consults with management to direct the resolution of critical customer situations.
- Consults with customers on the complex use of Oracle products.
- Achieves knowledge transfer through the development and delivery of training, knowledge sessions, mentoring, etc., and creates/reviews Knowledge Articles.
- Analyzes workload, determines best practices, and implements changes to improve productivity.
- Proactively contributes to increasing the team's efficiency by sharing knowledge, providing feedback about best practices, and writing tools/utilities.

Who are we looking for?
Required qualifications:
- 5 to 10 years of work experience in Oracle DB 12c/18c/19c installation, upgrade, configuration, migration, and performance tuning.
- Technical degree, i.e. BE / B.Tech / M.Tech / MCA / M.Sc. in Computer Science / Management Information Systems / Engineering.
- Open to working in 24x7 shifts.
- Oracle OCP DBA certification preferred.

Technical skills:
- Database architecture knowledge and administration.
- Experience with cloud technologies from different vendors.
- Thorough understanding of Oracle database features.
- Extensive hands-on interaction with large database management systems.
- Multitenant, Real Application Clusters (RAC), and ASM/storage areas.
- Backup and recovery: RMAN, Data Guard, and knowledge of various restore and recovery scenarios.
- Performance tuning: parallel query, query tuning, and partitioning.
- Database security.
- GoldenGate and replication.
- Enterprise Manager.
- Experience with engineered systems (Exadata) is a plus.
- General UNIX/Linux concepts and administration: managing kernel parameters, partitioning, and file systems. Operating systems expertise is an added advantage.

Personal attributes:
- Self-driven and result-oriented.
- Open to working from the client's location (locally).
- Strong customer support and client relations skills.
- Ability to work effectively in high-volume, high-pressure situations.
- Ability and flexibility to work late shifts.

Certifications preferred: OCI/OCP certification.

Posted 6 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Summary:
We are seeking a skilled and experienced Senior Backend Developer to join our dynamic team. The ideal candidate will have a strong computer science educational background and extensive hands-on experience building scalable web applications using Django, GCP, GraphQL, cloud technologies, React, and Expo. As a senior engineer, you will play a pivotal role in designing, developing, and maintaining robust, scalable, and efficient systems that drive our business goals.

Key Responsibilities:
- Design, develop, and maintain scalable APIs and mobile/web applications using Django, GCP, cloud technologies, GraphQL, React, and Expo.
- Collaborate with internal systems teams.
- Ensure the performance, quality, and responsiveness of applications, including through code review.
- Identify and correct bottlenecks and fix bugs to ensure smooth application functionality.
- Help maintain code quality, organization, and automation by following best practices.
- Participate in code reviews and provide constructive feedback to team members to enhance code quality.
- Stay up to date with the latest industry trends and technologies to keep our solutions competitive.
- Architect, implement, and optimize database solutions using PostgreSQL.
- Utilize GCP services to deploy, scale, and manage cloud-based applications effectively.
- Write clean, maintainable, and well-documented code, adhering to industry best practices.
- Debug and resolve complex technical issues, ensuring application performance and reliability.
- Optimize application performance for maximum speed and scalability.
- Participate in the entire software development lifecycle, from requirements analysis to deployment and support.

Qualifications and Skills:
- Bachelor's or Master's degree in Computer Science or a related field (mandatory).
- 5+ years of professional experience as a full-stack developer.
- Proficiency in backend development using Django and Python.
- Strong understanding of PostgreSQL, including query optimization and database design.
- Hands-on experience with Google Cloud Platform (GCP) services such as Compute Engine, Cloud SQL, and App Engine.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Experience with RESTful API design and integration.
- Solid understanding of software design principles, architecture, and best practices.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.
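The "query optimization and database design" skill called out above often comes down to knowing when an index turns a full table scan into an index search. A minimal, hedged sketch: SQLite (via Python's stdlib sqlite3) stands in for PostgreSQL here, and the `events` table and index name are invented for illustration.

```python
import sqlite3

# Hypothetical schema for illustration only; table and column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, created_at TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, created_at) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

def plan(sql):
    """Return the query planner's description of how it will execute `sql`."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM events WHERE user_id = 7"
print(plan(query))   # without an index, the planner scans the whole table

conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
print(plan(query))   # the planner now searches via idx_events_user instead
```

In PostgreSQL the equivalent check is `EXPLAIN ANALYZE`, which also reports actual timings; the design habit the listing asks for is the same: inspect the plan before and after adding an index.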

Posted 6 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About the Role and Team
The Knot Worldwide is looking for a passionate Marketing CRM Automation and Salesforce Marketing Cloud expert who will be at the heart of creating and executing dynamic customer journeys, campaigns, and best-in-class automation across all CRM touchpoints, including email, push, SMS, and in-app messaging, for multiple brands (The Knot and WeddingWire). The right candidate will be hands-on in the marketing platform (Salesforce Marketing Cloud) and needs the technical capacity to modify HTML email code; develop email, push notification, in-app modal, and SMS templates and messages; segment data (SQL); and set up tracking for proper attribution. This role is a great fit for someone with a marketing mindset who has the technical know-how to create sophisticated personalized communications. This is your chance to turn data into business opportunities while collaborating with a fun and forward-thinking team to build the personalized, automated journeys that guide our couples' wedding planning experience with TKWW.

Responsibilities:
- Building and managing exciting customer journeys and email, push, SMS, and in-app campaigns and automations with Salesforce Marketing Cloud (think Journey Builder, Automation Studio, Email Studio, Data Extensions, query segmentation, and AMPscript for personalization).
- A/B testing complex personalized CRM programs, triggers, and journeys.
- Developing a deep understanding of customer data, tables, and Data Extensions to enable and execute highly complex campaigns.
- Analyzing customer data from a variety of data table sources to create targeted segmentation and smart campaign strategies.
- Partnering with CRM campaign strategy managers to brainstorm, build, and bring to life data-driven CRM campaigns and triggers.
- Building and executing personalized customer engagement CRM campaigns and automating them for maximum impact.
- Sharing your Salesforce Marketing Cloud expertise with the team and supporting campaigns across various business units.
- Keeping up with the latest in Salesforce Marketing Cloud and SMS technology, and suggesting how we can improve and innovate internally.
- Setting up performance reports to track how well channels, campaigns, customer groups, and segments are performing.
- Working closely with the Marketing, Data Engineering, and CRM Marketing Operations teams to keep things running smoothly and to bring new ideas to life.

Successful candidates have:
- Salesforce Marketing Cloud expertise, with solid experience in Journey Builder, Automation Studio, SQL, AMPscript, HTML/CSS, and Email Studio. Bonus points if you're familiar with tools like Power BI, Alteryx, ERP and CRM systems, and data warehouses.
- Deep grounding in business intelligence, marketing analytics, and ecommerce analytics, with experience applying analytics to digital commerce or digital marketing.
- Experience managing large-scale projects.
- Experience using Salesforce Marketing Cloud (preferred).
- Expertise in email best practices, deliverability, and CAN-SPAM regulations.
- Experience with A/B testing methodologies.
- Strong verbal, interpersonal, and written communication skills.
- Excellent team player with strong collaborative skills and the ability to work cross-functionally.
- Ability to anticipate needs, innovate, and flourish in a fast-paced, global environment.
- Fluency in English and willingness to work with US-based teams/hours.
- A minimum of 8 years' experience in CRM marketing automation/campaign execution: building, testing, and deploying email, push notification, SMS, and in-app marketing.
- A minimum of 3-5 years of Salesforce Marketing Cloud experience as an expert in Journey Builder, Automation Studio, SQL, AMPscript, HTML, CSS, and Email Studio.
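The SQL segmentation work the listing describes (querying Data Extensions to build a targeted audience) can be sketched roughly. SQLite stands in for Marketing Cloud's SQL query activities here, and every table, column, and email address is hypothetical; the point is only the shape of a "recently engaged, opted-in subscribers" segment query:

```python
import sqlite3

# Illustrative stand-in for an SFMC SQL query activity; all names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subscribers (email TEXT PRIMARY KEY, opt_in INTEGER);
CREATE TABLE opens (email TEXT, opened_at TEXT);
INSERT INTO subscribers VALUES ('a@x.com', 1), ('b@x.com', 1), ('c@x.com', 0);
INSERT INTO opens VALUES ('a@x.com', date('now', '-3 days')),
                         ('b@x.com', date('now', '-90 days'));
""")

# Segment: opted-in subscribers who opened an email in the last 30 days.
engaged = conn.execute("""
    SELECT s.email
    FROM subscribers s
    JOIN opens o ON o.email = s.email
    WHERE s.opt_in = 1
      AND o.opened_at >= date('now', '-30 days')
""").fetchall()
print(engaged)   # only the recently engaged, opted-in subscriber remains
```

In Marketing Cloud the same query would target a Data Extension from a query activity inside Automation Studio, with the result feeding a Journey Builder entry audience.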

Posted 6 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion. It's a place where you can grow, belong and thrive.

Your day at NTT DATA
The Data Engineer is a seasoned subject matter expert, responsible for transforming data into a structured format that can be easily analyzed in a query or report. This role develops structured data sets that can be reused or complemented by other data sets and reports. The role analyzes data sources and data structures, and designs and develops data models to support the analytics requirements of the business, including management, operational, predictive, and data science capabilities.

Key responsibilities:
- Creates data models in a structured data format to enable analysis.
- Designs and develops scalable extract, transform, and load (ETL) packages from business source systems, and develops ETL routines to populate data from sources.
- Participates in the transformation of object and data models into appropriate database schemas within design constraints.
- Interprets installation standards to meet project needs and produces database components as required.
- Creates test scenarios and participates in thorough testing and validation to support the accuracy of data transformations.
- Accountable for running data migrations across different databases and applications, for example MS Dynamics, Oracle, SAP, and other ERP systems.
- Works across multiple IT and business teams to define and implement data table structures and data models based on requirements.
- Accountable for the analysis and development of ETL and migration documentation.
- Collaborates with various stakeholders to evaluate potential data requirements.
- Accountable for the definition and management of scoping, requirements definition, and prioritization activities for small-scale changes, and assists with more complex change initiatives.
- Collaborates with various stakeholders, contributing to recommended improvements in automated and non-automated components of data tables, data queries, and data models.

To thrive in this role, you need to have:
- Seasoned knowledge of the definition and management of scoping, requirements definition, and prioritization activities.
- Seasoned understanding of database concepts, object and data modelling techniques, and design principles, with conceptual knowledge of building and maintaining physical and logical data models.
- Seasoned expertise in Microsoft Azure Data Factory, SQL Analysis Server, SAP Data Services, and SAP BTP.
- Seasoned understanding of the data architecture landscape between physical and logical data models.
- An analytical mindset with excellent business acumen.
- Problem-solving aptitude and the ability to communicate effectively, both written and verbal.
- The ability to build effective relationships at all levels within the organization.
- Seasoned expertise in programming languages (Perl, bash, shell scripting, Python, etc.).

Academic qualifications and certifications:
- Bachelor's degree or equivalent in computer science, software engineering, information technology, or a related field.
- Relevant certifications preferred, such as SAP or Microsoft Azure; Certified Data Engineer or Certified Professional certification preferred.

Required experience:
- Seasoned experience in data engineering and data mining within a fast-paced environment.
- Proficient in building modern data analytics solutions that deliver insights from large and complex data sets at multi-terabyte scale.
- Seasoned experience with the architecture and design of secure, highly available, and scalable systems.
- Seasoned proficiency in automation and scripting, with proven examples of successful implementation.
- Seasoned proficiency using scripting languages (Perl, bash, shell scripting, Python, etc.).
- Seasoned experience with big data tools like Hadoop, Cassandra, Storm, etc.
- Seasoned experience in any applicable language, preferably .NET.
- Seasoned proficiency working with SAP, SQL, MySQL databases, and Microsoft SQL Server.
- Seasoned experience working with data sets and ordering data through MS Excel functions, e.g. macros and pivots.

Workplace type: Hybrid Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
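The ETL responsibilities above can be sketched in miniature. The following illustrative pipeline is not NTT DATA's tooling; the CSV source, target schema, and dedupe rule are invented to show the extract, transform, and load stages end to end:

```python
import csv
import io
import sqlite3

# Hypothetical source feed; a real pipeline would read from a business source system.
source = io.StringIO("id,amount,currency\n1,10.50,USD\n2,3.20,EUR\n2,3.20,EUR\n")

rows = list(csv.DictReader(source))            # extract: parse the raw feed

seen, cleaned = set(), []
for r in rows:                                 # transform: dedupe by id and apply types
    if r["id"] in seen:
        continue
    seen.add(r["id"])
    cleaned.append((int(r["id"]), float(r["amount"]), r["currency"]))

conn = sqlite3.connect(":memory:")             # load: write into the target schema
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", cleaned)
print(conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0])  # 2 (duplicate dropped)
```

Production ETL packages (Azure Data Factory, SAP Data Services) wrap the same three stages in scheduling, lineage, and error handling; the sketch only shows the data flow.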

Posted 6 days ago

Apply

4.0 years

0 Lacs

New Delhi, Delhi, India

On-site


Job Title: Golang Developer
Experience: 4+ Years
Location: Okhla, New Delhi
Work Mode: Work From Office (WFO)
Company: Mobiloitte Technologies

About Mobiloitte
Mobiloitte Technologies is a full-stack digital solutions provider, delivering innovative web, mobile, and enterprise applications. We partner with global clients to accelerate their digital transformation journeys using the latest technologies and best practices.

Role Overview
We are seeking a Golang Developer with 4+ years of hands-on experience, including proven expertise in GraphQL, to join our development team. You will design, build, and maintain high-performance back-end services that power our web and mobile products.

Key Responsibilities
- Architect, develop, test, and maintain microservices in Go.
- Design and implement GraphQL schemas, resolvers, and query optimizations.
- Integrate Go services with databases (SQL/NoSQL), message queues, and third-party APIs.
- Write clean, secure, and well-documented code following best practices.
- Perform code reviews, identify performance bottlenecks, and optimize for scale.
- Collaborate with front-end, QA, and DevOps teams to deliver end-to-end solutions.
- Troubleshoot and resolve production issues, ensuring high availability.
- Mentor junior developers and contribute to knowledge-sharing sessions.

Required Skills & Qualifications
- 4+ years of professional experience in Go (Golang) development.
- Strong GraphQL experience: schema design, resolver implementation, and query performance tuning.
- Proficiency with RESTful APIs and microservices architecture.
- Hands-on experience with databases (PostgreSQL, MongoDB, etc.) and ORMs (GORM, etc.).
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Solid understanding of concurrency patterns and memory management in Go.
- Experience with version control (Git), CI/CD pipelines, and automated testing.
- Excellent problem-solving skills and strong communication abilities.

Nice to Have
- Exposure to gRPC or WebSockets.
- Experience with cloud platforms (AWS, GCP, Azure).
- Knowledge of message brokers (Kafka, RabbitMQ).
- Previous work on high-traffic, distributed systems.
- Certifications in Go or cloud technologies are a plus.
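The "concurrency patterns" requirement above usually means worker pools: a bounded set of workers draining a queue of jobs. In Go this is goroutines reading from a channel; as a language-neutral sketch (shown in Python purely for illustration, with an invented `handle` job), the same pattern looks like:

```python
from concurrent.futures import ThreadPoolExecutor

def handle(job_id):
    """Stand-in for real per-job work (a DB call, an API request, etc.)."""
    return job_id * job_id

# Bounded worker pool: at most 4 jobs run concurrently, like 4 goroutines
# draining a shared channel; map() fans the jobs out and collects results
# in submission order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(handle, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Bounding the pool is the point of the pattern: it caps resource use under load instead of spawning one worker per request.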

Posted 6 days ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Greetings from TCS!

Walk-in Details:
Job Title: SAP BODS
Interview Location: Tata Consultancy Services, Gitanjali Park, International Financial Hub (CBD), Newtown, Kolkata, Chakpachuria, West Bengal 700156
Interview Date: 28th June 2025
Interview Time: 9 am-1 pm
Experience Range: 4-12 yrs
Please apply only if you can attend the in-person drive at the TCS Kolkata office on 28th June 2025 (Saturday).

Job description:
TCS has always been in the spotlight for being adept in the next big technologies. What we can offer you is a space to explore varied technologies and quench your techie soul.

What we are looking for:
- 4+ years' experience in SAP BusinessObjects Data Services (BODS), designing and developing ETL solutions.
- SAP BODS proficiency including: audit rules, built-in functions, the interactive debugger, and transforms such as Query, Key Generation, Table Comparison, History Preserving, and Validation.
- Ability to develop ETL logic utilizing the various features available in the Data Integrator component.
- Expert in SQL and stored procedures for developing, debugging, and implementing complex transformations.
- A self-starter, able to lead ETL projects or ETL activities within a project.
- Familiarity with the Systems Development Life Cycle (SDLC) and software development best practices.
- Development experience with two or more of the following: SQL, PL/SQL, T-SQL, SQL Server, Oracle.
- Experience collaborating with DevOps, Security, Network, Systems, and other IT teams.

Minimum Qualification: 15 years of full-time education.

Posted 6 days ago

Apply

4.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description Summary
In this role, you will be responsible for managing payroll input collection, validating employee non-CTC reimbursement claims, handling payroll and full-and-final related employee queries, and ensuring timely and accurate processing of payroll and full-and-final settlements. The role places emphasis on data integrity and analysis to support key HR decisions, while providing coverage and support as needed. You are also responsible for ensuring that payroll disbursements are executed accurately and on time. This includes the statutory payment process, which covers payments mandated by law, typically relating to payroll: taxes, PF, ESI, PT, and the other deductions that must be processed during the payroll cycle.

Job Description

Payroll Input Management
- Collect and validate monthly payroll inputs from various sources: relocation reimbursements, gym reimbursements, and higher education reimbursements.
- Maintain proper documentation and audit trails for all input submissions.
- Ensure compliance with company policies before processing.

Employee Query Management
- Respond to payroll-related employee queries within the defined 48-hour TAT.
- Provide high-quality resolution to ensure employee satisfaction and service excellence.

Exit and Full and Final Process
- Manage end-to-end exit payroll processing.
- Authorize and update separation details in the separation screen.
- Coordinate with HRMs, Finance, and payroll vendors for timely F&F computation and payout.
- Ensure all clearances and statutory requirements are met before closure.
- Address employees' exit-related questions promptly and ensure resolution within defined timelines.

Payroll Process Management
- Manage payroll queries from employees, managers, and HRMs in a timely and professional manner.
- Correctly administer benefit/payroll processes.
- Manage service delivery metrics in relation to outsourced benefits/payroll. This includes those who lead Payroll & Benefits Service Delivery teams and/or the operations for such teams; mostly found in shared service teams but sometimes in business teams.
- Oversee the accurate and timely processing of payroll for South Asia, taking responsibility for the quality delivery of payroll processes and service to the businesses.
- Ensure compliance with payroll laws, regulations, and company policies.
- Process and manage the disbursement of salaries.
- Make timely deposits of income tax withholdings, PT, and contributions to PF.
- Generate internal reports for reconciliation and to inform management about the status and financial impact of statutory payments.

Qualifications/Requirements
- Bachelor's degree in human resources, finance, business administration, or a related field.
- 4-6 years of relevant experience in payroll management, full-and-final settlement processes, and employee separation procedures, preferably in a large organization or shared service environment.
- Familiarity with payroll systems, HRIS platforms, calculation, and reporting.
- Excellent analytical, problem-solving, and decision-making skills, with keen attention to detail and accuracy in calculations and documentation.
- Commitment to integrity, confidentiality, and professionalism in handling sensitive employee information and financial transactions.

Desired Characteristics
- Collaborative approach.
- Continuous improvement mindset.
- Customer service orientation.
- Organization skills.
- Attention to detail.

Additional Information
Relocation Assistance Provided: No

Posted 6 days ago

Apply

4.0 - 15.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena, and there's nothing that can stop us from growing together. We are hiring for SAP ABAP Consultant. We are delighted to invite you for a discussion to get to know more about you and your professional experience. The interview will be in person.

Venue details
Date: 28-June-25
Registration Time: 9.30 AM to 12.30 PM
Location: SEZ Unit, Synergy Park, Premises 2 56/1/36, Survey Number 26, CMC Campus, Gachibowli, Hyderabad, Telangana, 500032

Job Description: SAP ABAP Consultant
Experience: 4 to 15 years
Location: Hyderabad
Required Technical Skill Set: SAP ABAP

Desired Competencies (Technical/Behavioral)
Responsibilities include:
- This is a SAP DevOps technical lead role in an SAP S/4 environment.
- Good experience working in an onshore/offshore model.
- Experience working with interfaces to SAP.
- Excellent interpersonal and organizational skills, with the ability to communicate clearly.
- The candidate should be ready to work flexible timings.
- Strong knowledge and understanding of IT service provider organizations is helpful.
- Good integration knowledge with other areas of the supply chain such as PP, QM, SD, and SAP APO.
- Knowledge of batch job monitoring and IDoc processing is mandatory.

Must-Have:
- Experience in each of the following areas: list (report) programming, dialog (transaction) programming, enhancements (user exits, BAdI, BAPI, etc.), interfaces (IDocs, Remote Function Calls, web services, etc.), and Adobe Forms/Smart Forms.
- Hands-on experience with HANA-compatible developments such as CDS views, AMDP, OData, SQL queries, SQL functions, and ABAP on HANA.
- Knowledge of ABAP object-oriented programming.
- Knowledge of SAP Workflow.

Good-to-Have:
- Self-starter with excellent communication skills in English.
- Able to work with the SAP team on functional issues.
- Ready to work flexible timings.
- Clear understanding of the industry and competitive trends that impact the role.

Minimum Qualification: 15 years of full-time education.

Posted 6 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Us:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700+ clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree, a Larsen & Toubro Group company, combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit www.ltimindtree.com.

Job Title: Sr. Java GCP Developer
Work Location: Hyderabad, India

Job Description:
- Design and review the solution according to the requirements.
- Execute project-specific development activities in accordance with applicable standards and quality parameters.
- Develop and review code; set up the right environment for projects.
- Ensure delivery within schedule by adhering to engineering and quality standards.
- Own and deliver end-to-end projects within GCP for the Payments Data Platform.
- Be available on the support rota, one week per month, for GCP 24/7 on-call.
- Basic knowledge of payments ISO standards, message types, etc.
- Able to work under pressure on deliverables, P1 violations, and incidents.
- Fluent and clear in written and verbal communication.
- Able to follow Agile ways of working.
- Must have hands-on experience with Java and GCP; shell script and Python knowledge is a plus.
- In-depth knowledge of Java and Spring Boot.
- Experience with GCP services such as Dataflow, Bigtable, and BigQuery.
- Experience managing large databases.
- Experience with requirements, design, and development of event-driven and near-real-time data patterns (ingress/egress); hands-on experience is a must.
- As per the Agile development methodology, should be flexible in supporting developed code in the production environment.

LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.

Safe return to office: In order to comply with LTIMindtree's company COVID-19 vaccine mandate, candidates must be able to provide proof of full vaccination against COVID-19 before or by the date of hire. Alternatively, one may submit a request for reasonable accommodation from LTIMindtree's COVID-19 vaccination mandate for approval, in accordance with applicable state and federal law, by the date of hire. Any request is subject to review through LTIMindtree's applicable processes.

Posted 6 days ago

Apply

4.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Functional Area: Data Science & AI/ML Employment Type: Full Time, Permanent Role Category: Engineering and Technology Experience: 4-6 years Job Description: As a growing organization in the healthcare domain, we seek a Data Science Application Expert with expertise in JupyterLab, SAS Studio, and R Studio. The ideal candidate will design, develop, and optimize data science workflows, ensuring robust, scalable, and efficient data processing pipelines. You will collaborate with cross-functional teams to support data-driven decision-making, build machine learning models, and implement analytical solutions aligned with industry standards for security and compliance. Key Responsibilities: Data Science & Application Management: Develop, manage, and optimize JupyterLab, SAS Studio, and R Studio environments for data science workflows. Support and troubleshoot Jupyter Notebooks, R scripts, and SAS programs for performance and reliability. Assist data scientists and analysts in configuring and optimizing their analytical workflows. Maintain and enhance computational environments to support data exploration, statistical modeling, and machine learning. Machine Learning & Analytics: Implement and manage machine learning models using Python (TensorFlow, Scikit-learn, PyTorch), R, and SAS. Automate data preprocessing, feature engineering, and model training workflows. Work with AWS SageMaker, EMR, and other cloud-based ML solutions for scalable model deployment. Monitor and optimize model performance, ensuring reproducibility and efficiency. Data Engineering & Integration: Design and optimize ETL (Extract, Transform, Load) pipelines to handle large-scale healthcare datasets. Integrate JupyterLab, SAS Studio, and R Studio with cloud-based storage (AWS S3, Redshift, Snowflake). Manage structured and unstructured data, ensuring data integrity and compliance with healthcare standards (HIPAA, GDPR). Optimize query performance for large datasets using SQL, Pandas, and SAS data steps. 
Infrastructure as Code (IaC) & Cloud Deployment:
- Deploy and manage JupyterHub, RStudio Server, and SAS Viya on AWS, Azure, or on-premise environments.
- Automate environment provisioning using Terraform, AWS CDK, or CloudFormation.
- Ensure efficient resource allocation and auto-scaling of cloud-based data science environments.

Security & Compliance:
- Implement role-based access controls (RBAC) for secure access to data science applications.
- Ensure data security and encryption using AWS KMS, Secrets Manager, and IAM policies.
- Adhere to HIPAA, GDPR, and other regulatory compliance requirements for healthcare data.

Collaboration & Stakeholder Management:
- Work closely with data scientists, engineers, and business analysts to understand analytical requirements.
- Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives.
- Provide technical guidance and mentorship to junior team members.

Job Requirements:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.

Experience:
- 4-6 years of experience in data science application management, with a focus on JupyterLab, SAS Studio, and R Studio.

Technical Skills:

Core Data Science Platforms:
- Expertise in JupyterLab, SAS Studio, and R Studio for data science workflows.
- Strong understanding of SAS programming (Base SAS, Advanced SAS, SAS Viya), R, and Python.
- Experience managing and scaling JupyterHub, RStudio Server, and SAS Viya in cloud or on-premise environments.

Programming & Frameworks:
- Proficiency in Python, R, SAS, SQL, and shell scripting.
- Experience with Pandas, NumPy, Scikit-learn, TensorFlow, and PyTorch for machine learning.

Cloud & Infrastructure:
- Experience deploying and managing JupyterLab, R Studio, and SAS Viya on AWS, Azure, or GCP.
- Hands-on experience with AWS SageMaker, Glue, Lambda, Step Functions, and EMR.
- Proficiency in Terraform, AWS CDK, or CloudFormation for infrastructure automation.
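The RBAC requirement above can be illustrated with a minimal sketch. The role and permission names are hypothetical, chosen only to show the role-to-permission lookup pattern; a real deployment would delegate enforcement to IAM policies or the platform's authentication layer (e.g. JupyterHub authenticators):

```python
# Hypothetical role-to-permission mapping for a data science platform;
# a real system would source this from IAM or an identity provider.
ROLE_PERMISSIONS = {
    "analyst": {"read_notebook"},
    "data_scientist": {"read_notebook", "write_notebook", "run_job"},
    "admin": {"read_notebook", "write_notebook", "run_job", "manage_users"},
}

def is_authorized(role: str, action: str) -> bool:
    """Deny by default: return True only if the role explicitly grants the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_authorized("data_scientist", "run_job"))  # granted
print(is_authorized("analyst", "run_job"))         # denied by default
```

The deny-by-default lookup (unknown roles get an empty permission set) mirrors the least-privilege posture that HIPAA and GDPR audits typically expect.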
Database Management & ETL:
- Experience with SQL and NoSQL databases (PostgreSQL, DynamoDB, Snowflake, Redshift, MongoDB).
- Hands-on experience building ETL pipelines and data wrangling using SAS, Python, and SQL.

DevOps & CI/CD Tools:
- Familiarity with CI/CD pipelines using Jenkins, GitLab, or AWS-native tools.
- Experience with Docker, Kubernetes, and containerized deployments.

Additional Skills:
- Event-Driven Architecture: Experience in real-time data processing using Kafka, Kinesis, or SNS/SQS.
- Security Best Practices: Implementation of secure access controls and data encryption.
- Cost Optimization: Understanding of cloud pricing models and optimizing compute resources.
- Agile Development: Hands-on experience with Agile methodologies such as Scrum and Kanban.

Key Attributes:
- Problem-Solving Mindset: Ability to troubleshoot complex data science workflows and propose scalable solutions.
- Detail-Oriented: Strong focus on data integrity, performance optimization, and reproducibility.
- Collaborative: A team player who thrives in a dynamic, cross-functional environment.
- User-Centric Approach: Commitment to delivering scalable and efficient data science applications.
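The event-driven architecture skill listed above can be sketched with an in-memory stand-in for a Kafka or Kinesis consumer loop. The topic and event names are hypothetical; a real consumer would use a client library such as kafka-python or boto3, but the dispatch-by-type pattern is the same:

```python
from collections import deque

# In-memory queue standing in for a Kafka topic or Kinesis shard.
event_queue = deque([
    {"type": "record_created", "id": 1},
    {"type": "record_updated", "id": 1},
    {"type": "unknown_event", "id": 2},
])

processed = []

def handle(event: dict) -> None:
    """Dispatch each event by type; unrecognized types are skipped, not fatal."""
    if event["type"] in {"record_created", "record_updated"}:
        processed.append(event["id"])

# Consume in arrival order, as a streaming consumer would per partition.
while event_queue:
    handle(event_queue.popleft())

print(processed)  # → [1, 1]
```

Tolerating unknown event types, as the handler does here, is a common design choice in streaming systems: producers can evolve their schemas without breaking existing consumers.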

Posted 6 days ago