2.0 - 4.0 years
15 - 20 Lacs
Bengaluru
Hybrid
Project Role: Python Development
Work Experience: 2 to 4 years
Work Location: Bengaluru
Work Mode: Hybrid
Must-Have Skills: Python, API development, Azure Cloud
Technical Skills and Experience:
- 3-4 years of professional experience in Python development.
- Strong understanding of object-oriented programming principles and design patterns.
- Proficiency in developing and hosting APIs using FastAPI (or similar frameworks).
- Experience with relational and/or NoSQL databases.
- Solid understanding of RESTful API principles and best practices.
- Experience with version control systems (e.g., Git).
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) is a plus.
- Excellent problem-solving and debugging skills.
- Strong communication and collaboration skills.
- Good to have: experience working with containerization technologies (Kubernetes).
Once you join the team, you will be trained on AI/ML, GenAI, and LLM skills.
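The core requirement here is developing and hosting APIs with FastAPI. As a purely illustrative sketch of that kind of work (the resource name, fields, and in-memory store are assumptions, not part of the posting):

```python
# Illustrative FastAPI sketch; resource names and the in-memory store are assumptions.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Example Item Service")

class Item(BaseModel):
    name: str
    price: float

_items: dict[int, Item] = {}  # stands in for a relational/NoSQL database

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    # Store the validated payload under the path parameter's ID.
    _items[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="Item not found")
    return _items[item_id]
```

Run locally with `uvicorn main:app --reload`; FastAPI then serves interactive OpenAPI docs at /docs, which is how endpoints like these are typically exercised during development.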
Posted 1 week ago
5.0 - 8.0 years
6 - 10 Lacs
Noida
Work from Office
Position Summary
As a staff engineer, you will be part of the development team and apply your expert technical knowledge, broad knowledge of software engineering best practices, problem solving, critical thinking, and creativity to build and maintain software products that achieve technical, business, and customer-experience goals, and inspire other engineers to do the same. You will be responsible for working with different stakeholders to accomplish business and software engineering goals.
Key Duties & Responsibilities:
- Estimates and develops scalable solutions using .NET technologies in a highly collaborative agile environment, with strong experience in C#, ASP.NET Core, and Web API.
- Maintains relevant documentation around the solutions.
- Conducts code reviews and ensures SOLID principles and standard design patterns are applied to system architectures and implementations.
- Evaluates, understands, and recommends new technologies, languages, or development practices that offer benefits worth implementing.
- Collaborates with Agile practitioners to help shield the team from distractions so it stays focused on delivering its sprint commitments.
- Drives adoption of modern engineering practices such as Continuous Integration, Continuous Deployment, code reviews, TDD, functional/non-functional testing, test automation, and performance engineering to deliver high-quality, high-value software.
- Fosters a culture and mindset of continuous learning to develop agility using the three pillars of transparency, inspection, and adaptation across levels and geographies.
- Mentors other members of the development team.
- Leads sessions with scrum team members to structure solution source code and design implementation approaches, optimizing for code that follows engineering best practices and maximizes maintainability, testability, and performance.
- Relevant exposure to agile ways of working, preferably Scrum and Kanban.
Skills and Knowledge:
- B.E/B.Tech/MCA or equivalent professional degree.
- 5-8 years of experience designing and developing n-tier web applications using .NET Framework, .NET Core, ASP.NET, WCF and C#, MVC 4/5 web development, RESTful API services, Web API, and JSON.
- Well versed with C#, modern UI technologies, and database/ORM technologies.
- Must have a solid understanding of modern architectural and design patterns.
- Comprehensive knowledge of automation testing and modern testing practices, e.g., TDD, BDD.
- Strong exposure to one or more implementations of CI/CD using Jenkins and Docker containerization.
- Strong exposure to Agile software development methodologies and enabling tools such as Jira and Confluence.
- Excellent communicator with demonstrable ability to influence decisions.
- Knowledge of healthcare revenue cycle management, HL7, EMR systems, HIPAA, and FHIR would be preferred.
- Good to have knowledge of Azure Cloud.
- Good working understanding of application architecture concepts like microservices, Domain-Driven Design, broker pattern/message bus, event-driven, CQRS, ports & adapters/hexagonal/onion, and SOA would be preferred.
Key Competency Profile:
- Spot new opportunities by anticipating change and planning accordingly.
- Find ways to better serve customers and patients.
- Be accountable for customer service of the highest quality.
- Create connections across teams by valuing differences and including others.
- Own your development by implementing and sharing your learnings.
- Motivate each other to perform at our highest level.
- Help people improve by learning from successes and failures.
- Work the right way by acting with integrity and living our values every day.
- Succeed by proactively identifying problems and solutions for yourself and others.
Posted 1 week ago
8.0 - 13.0 years
5 - 9 Lacs
Noida
Work from Office
R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst Best in Healthcare, Top 100 Best Companies for Women by Avtar & Seramount, and amongst the Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India, with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.
We are looking for a Senior Database Engineer to administer and maintain NoSQL and relational SQL Server/MySQL databases. The candidate will be part of the team providing operations support on multiple NoSQL clusters running on Azure and will be responsible for installing, configuring, monitoring, designing, implementing, and supporting our mission-critical MongoDB, Cosmos DB, Couchbase, and SQL Server environments. The ideal candidate should be a fast learner, eager, passionate about automating development and production environments, and should enjoy the challenge of working in a highly distributed and dynamic hybrid-cloud environment. As part of a service-oriented team, the role requires the individual to collaborate effectively with other internal engineering teams to gather requirements and deliver on various database platforms. There will be plenty of opportunities to develop your skills, as we look to improve constantly with the latest technologies.
Essential Responsibilities:
- Create, administer, monitor, and maintain multiple Elasticsearch, MongoDB, Cosmos DB, and Couchbase environments.
- Work with development teams to design and implement optimized NoSQL databases.
- Implement relational databases, tables, and table changes.
- Support application development for problem solving and performance tuning.
- Assist in administering, monitoring, and maintaining SQL Server environments, including for disaster recovery.
- Work on new and existing logical/physical database designs for applications and infrastructure.
- Provide after-hours support for database emergencies, routine scheduled maintenance, and database server patching.
- Work closely with the business and engineering teams to understand and plan for storage and database needs.
- Implement, configure, maintain, and tune SQL Server RDBMS systems to ensure the availability and operational readiness (security, health, and performance) of our corporate applications in the cloud (managing cloud infrastructure related to SQL Data Services in Azure).
- Assist app dev teams with complex query tuning and schema refinement.
- Utilize various tools to evaluate performance and implement remedies to improve performance, including tuning database parameters and SQL statements.
Required Qualifications:
- 8+ years of experience working in Database, Data Management, or Engineering roles.
- 6+ years of progressive experience in high-volume/high-transaction data administration, with at least 3 years working with Microsoft Azure Cloud technologies.
- 6+ years of experience managing NoSQL databases such as Couchbase, MongoDB, and Cosmos DB.
- 2+ years of experience with Elasticsearch.
- 6+ years of experience in performance tuning and database monitoring using query analysis, indexes, statistics, and execution plans.
- Prior experience working with large (2 TB+) transactional databases and across a large environment with hundreds to thousands of databases in scope.
Desired Technical Skills:
- Ability to troubleshoot performance issues with NoSQL databases (Elasticsearch, MongoDB, Cosmos DB, and Couchbase).
- Accurately recommend configuration changes for optimal performance of NoSQL databases (Elasticsearch, MongoDB, Cosmos DB, and Couchbase).
- Experience in the design, testing, implementation, maintenance, and control of the organization's NoSQL databases across multiple platforms, technologies (for example physical, relational, and object-oriented), and computing environments.
- Ability to develop queries to extract information based on compounded search criteria.
- Strong expertise with relational databases (Microsoft SQL Server; MySQL is a plus) with enhanced troubleshooting and performance tuning skills.
- Fundamental proficiency in data modeling in practical applications of a moderate nature.
- Firm understanding of the most prominent Azure database technologies, such as Azure SQL Database and Azure SQL Managed Instance.
- Ability to back up, restore, secure, scale, monitor, and tune an Azure SQL Database.
- Experience translating environments into Azure Managed Instance and other Azure technologies will be given a strong preference.
Nice to Have:
- Certifications in Azure/SQL Server/NoSQL.
- Experience with Postgres and MySQL is a big plus but not mandatory.
- Knowledge of SQL monitoring tools (SolarWinds DPA, Redgate, etc.).
- ServiceNow and Azure DevOps experience.
Posted 1 week ago
3.0 - 8.0 years
17 - 30 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Design, develop, and maintain high-performance, scalable, and secure Java-based applications using Spring Boot, JPA, and Hibernate.
- Work with both SQL (MySQL, PostgreSQL, Oracle) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
- Implement and optimize RESTful APIs, microservices, and event-driven architectures.
- Leverage cloud platforms (AWS/Azure/GCP) for deploying, monitoring, and scaling applications.
- Integrate message queue systems (Pub/Sub, Kafka, RabbitMQ, SQS, Azure Service Bus) for asynchronous processing.
- Contribute to data lake and data warehouse solutions, ensuring efficient data ingestion, processing, and retrieval.
- Collaborate with frontend teams if needed (knowledge of React/Angular is a plus).
- Troubleshoot and debug complex issues, ensuring optimal performance and reliability.
- Follow industry best practices in coding standards, security (OWASP), CI/CD, and DevOps methodologies.
- Own the delivery of an integral piece of a system or application.
Mandatory Skills & Qualifications:
- 3-5 years of hands-on experience in Java/J2EE, Spring Boot, Hibernate, and JPA.
- Strong expertise in SQL and NoSQL databases, query optimization, and data modeling.
- Proven experience with cloud platforms (AWS/Azure/GCP): Lambda, EC2, S3, Azure Functions, GCP Cloud Run, etc.
- Knowledge of at least one message queue system (Pub/Sub, Kafka, RabbitMQ, ActiveMQ, SQS).
- Familiarity with data lakes (Delta Lake, Snowflake, Databricks) and data warehouses (BigQuery, Redshift, Synapse).
- Experience with Docker, Kubernetes, and CI/CD pipelines (Jenkins/GitHub Actions/Azure DevOps).
- Strong problem-solving, debugging, and performance tuning skills.
- Deep hands-on technical expertise and experience in eCommerce.
Good to Have (Plus Skills):
- Frontend experience with React.js/Angular.
- Knowledge of GraphQL, gRPC, or WebSockets.
- Understanding of AI/ML integration in backend systems.
- Certifications in cloud (AWS/Azure/GCP) or Java/Spring.
- Experience with monitoring and logging tools in GCP (Cloud Monitoring, Cloud Logging).
Soft Skills:
- Strong analytical and communication skills.
- Ability to work in a fast-paced, collaborative environment.
- Proactive mindset with a focus on continuous learning.
Posted 1 week ago
6.0 - 11.0 years
5 - 15 Lacs
Tirupati
Work from Office
About the Role
We are seeking an experienced and driven Technical Project Manager / Technical Delivery Manager to lead complex, high-impact data analytics and data science projects for global clients. This role demands a unique blend of project management expertise, technical depth in cloud and data technologies, and the ability to collaborate across cross-functional teams. You will be responsible for ensuring the successful delivery of data platforms, data products, and enterprise analytics solutions that drive business value.
Key Responsibilities
Project & Delivery Management:
- Lead the full project lifecycle for enterprise-scale data platforms, including requirement gathering, development, testing, deployment, and post-production support.
- Own the delivery of Data Warehousing and Data Lakehouse solutions on cloud platforms (Azure, AWS, or GCP).
- Prepare and maintain detailed project plans (Microsoft Project Plan), and align them with the Statement of Work (SOW) and client expectations.
- Utilize hybrid project methodologies (Agile + Waterfall) for managing scope, budget, and timelines.
- Monitor key project KPIs (e.g., SLA, MTTR, MTTA, MTBF) and ensure adherence using tools like ServiceNow.
Data Platform & Architecture Oversight:
- Collaborate with data engineers and architects to guide the implementation of scalable Data Warehouses (e.g., Redshift, Synapse) and Data Lakehouse architectures (e.g., Databricks, Delta Lake).
- Ensure data platform solutions meet performance, security, and governance standards.
- Understand and help manage data integration pipelines, ETL/ELT processes, and BI/reporting requirements.
Client Engagement & Stakeholder Management:
- Serve as the primary liaison for US/UK clients; manage regular status updates, escalation paths, and expectations across stakeholders.
- Conduct WSRs, MSRs, and QBRs with clients and internal teams to drive transparency and performance reviews.
- Facilitate team meetings, highlight risks or blockers, and ensure consistent stakeholder alignment.
Technical Leadership & Troubleshooting:
- Provide hands-on support and guidance in data infrastructure troubleshooting using tools like Splunk, AppDynamics, and Azure Monitor.
- Lead incident, problem, and change management processes with data platform operations in mind.
- Identify automation opportunities and propose technical process improvements across data pipelines and workflows.
Governance, Documentation & Compliance:
- Create and maintain SOPs, runbooks, implementation documents, and architecture diagrams.
- Manage project compliance related to data privacy, security, and internal/external audits.
- Initiate and track Change Requests (CRs) and look for revenue expansion opportunities with clients.
Continuous Improvement & Innovation:
- Participate in and lead at least three internal process optimization or innovation initiatives annually.
- Work with engineering, analytics, and DevOps teams to improve CI/CD pipelines and data delivery workflows.
- Monitor production environments to reduce deployment issues and improve time-to-insight.
Must-Have Qualifications
- 10+ years of experience in technical project delivery, with a strong focus on data analytics, BI, and cloud data platforms.
- Strong hands-on experience with SQL and data warehouse technologies like Snowflake, Synapse, Redshift, BigQuery, etc.
- Proven experience delivering Data Warehouse and Data Lakehouse solutions.
- Familiarity with tools such as Redshift, Synapse, BigQuery, Databricks, and Delta Lake.
- Strong cloud knowledge with Azure, AWS, or GCP.
- Proficiency in project management tools like Microsoft Project Plan (MPP), JIRA, Confluence, and ServiceNow.
- Expertise in Agile project methodologies.
- Excellent communication skills, both verbal and written, with no MTI or grammatical errors.
- Hands-on experience working with global delivery models (onshore/offshore).
Preferred Qualifications
- PMP or Scrum Master certification.
- Understanding of ITIL processes and DataOps practices.
- Experience managing end-to-end cloud data transformation projects.
- Experience in project estimation, proposal writing, and RFP handling.
Desired Skills & Competencies
- Deep understanding of SDLC, data architecture, and data governance principles.
- Strong leadership, decision-making, and conflict-resolution abilities.
- High attention to detail and accuracy in documentation and reporting.
- Ability to handle multiple concurrent projects in a fast-paced, data-driven environment.
- A passion for data-driven innovation and business impact.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.
Posted 1 week ago
5.0 - 8.0 years
4 - 9 Lacs
Ahmedabad
Work from Office
Summary: We are looking for a highly skilled Senior MERN Stack Developer to join our team and contribute to the design, development, and maintenance of scalable, high-performance web applications. The ideal candidate should have 5 to 7 years of strong hands-on experience with MongoDB, Express.js, React.js, and Node.js, with an added advantage if they have worked in the healthcare domain or have experience with real-time communication (Socket.IO, WebSockets) or Bluetooth integrations.
Key Responsibilities:
- Develop and maintain full-stack web applications using the MERN stack.
- Design RESTful APIs and integrate third-party APIs as needed.
- Build reusable components and front-end libraries for future use.
- Optimize components for maximum performance across devices and browsers.
- Work with MongoDB for schema design, indexing, performance tuning, and data modeling.
- Implement secure authentication and authorization using JWT, OAuth, etc.
- Collaborate with cross-functional teams including UI/UX, QA, and DevOps.
- Participate in code reviews and provide constructive feedback.
- Ensure responsive design and cross-browser compatibility.
- Implement real-time features using Socket.IO, WebSockets, or Bluetooth (if applicable).
Must-Have Skills:
- 5-7 years of hands-on experience with the MERN stack (MongoDB, Express, React, Node.js).
- Proficiency in JavaScript (ES6+), HTML5, and CSS3.
- Strong experience with MongoDB, including aggregation, indexing, and performance tuning.
- Experience in building and consuming RESTful APIs and/or GraphQL.
- Solid understanding of Git version control and CI/CD pipelines.
- Familiarity with Agile/Scrum development processes.
- Strong debugging and performance optimization skills.
Good to Have (Preferred):
- Healthcare domain experience: understanding of clinical workflows, HL7/FHIR, HIPAA compliance, etc.
- Real-time communication experience with Socket.IO, WebSockets, or Bluetooth integrations (especially for IoT or mobile-connected devices).
- Experience with cloud services like AWS, Azure, or GCP.
- Experience with testing frameworks like Jest, Mocha, or Cypress.
Soft Skills:
- Strong communication and interpersonal skills.
- Ability to mentor junior developers and lead code reviews.
- Problem-solving mindset and a proactive attitude.
- Ability to work independently and in a team.
Posted 1 week ago
11.0 - 20.0 years
20 - 35 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Define and drive the digital transformation roadmap across business functions including Supply Chain, Finance, and Operations.
- Leverage Microsoft technologies like Azure Cloud, Azure Machine Learning, Power BI, and RPA tools to design and implement scalable digital solutions.
- Build and deploy automation solutions (RPA) to improve process efficiency, reduce manual effort, and enhance data accuracy.
- Lead cloud hosting initiatives, ensuring scalable, secure, and cost-effective deployments.
- Design and build advanced analytics dashboards and predictive models using Power BI and Azure ML, closely aligned with business KPIs.
- Translate business challenges into digital solutions with clear ROI and strategic value.
- Collaborate with cross-functional business and IT teams to ensure successful adoption of digital tools and platforms.
- Stay current with emerging technologies and continuously bring innovative ideas to the business.
Key Skills & Competencies:
- 15+ years of progressive experience in digital transformation, technology consulting, or enterprise IT roles.
- Deep expertise in Microsoft technologies: Power BI (including advanced DAX and data modeling), Azure Machine Learning and Azure Cloud Services, RPA (UiPath, Power Automate, or similar), and cloud hosting, infrastructure setup, and management on Azure.
- Strong understanding of business domains, especially Supply Chain Management and Finance processes in ERP, preferably SAP.
- Proven track record of implementing technology-driven business transformations.
- Excellent leadership, stakeholder management, and communication skills.
- Ability to bridge the gap between technology and business strategy effectively.
Preferred Qualifications:
- Certification(s) in Azure Architect, Azure Data Scientist, or RPA Developer.
- Experience with ERP systems integration (SAP, Oracle, Microsoft Dynamics).
- Exposure to data governance, security, and compliance frameworks.
- Bachelor's degree in Engineering.
Posted 1 week ago
5.0 - 10.0 years
22 - 27 Lacs
Navi Mumbai
Work from Office
As an Architect at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Architect, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Strong understanding of data lake approaches, industry standards, and industry best practices.
- Detail-level understanding of the Hadoop framework and ecosystem, MapReduce, and data on containers (data in OpenShift).
- Applies individual experience/competency and IBM's architectural thinking model to analyzing client IT systems.
- Experience with relational SQL, Big Data, etc.
- Experience with cloud-native platforms such as AWS, Azure, Google Cloud, or IBM Cloud, or cloud-native data platforms like Snowflake.
Preferred technical and professional experience:
- Knowledge of MS Azure Cloud.
- Experience in Unix shell scripting and Python.
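Since the responsibilities name enterprise search applications such as Elasticsearch, here is a minimal, hypothetical sketch using the official Elasticsearch Python client; the index name, document fields, and local cluster URL are assumptions, not part of the posting:

```python
# Hypothetical example with the official Elasticsearch Python client (8.x API).
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

# Index a document into an assumed "articles" index.
es.index(index="articles", id="1", document={
    "title": "Revenue analytics overview",
    "body": "Example text used only to demonstrate full-text search.",
})

# Run a simple full-text query and print matching IDs with relevance scores.
resp = es.search(index="articles", query={"match": {"body": "analytics"}})
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_score"])
```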
Posted 1 week ago
4.0 - 9.0 years
0 Lacs
Noida
Work from Office
We're hiring a Senior Software Developer with experience in full-stack development (C#, ASP.NET, React.js), CI/CD (Azure DevOps), AWS (S3, ECS), SQL (MSSQL, PostgreSQL), and REST APIs. Bonus: Agile, network, and contact center knowledge. Office cab/shuttle available.
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter.
- Knowledge of cloud, Power BI, and data migration to the cloud.
- Experience in Unix shell scripting and Python.
- Experience with relational SQL, Big Data, etc.
Preferred technical and professional experience:
- Knowledge of MS Azure Cloud.
- Experience with Informatica PowerCenter.
- Experience in Unix shell scripting and Python.
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter.
- Knowledge of cloud, Power BI, and data migration to the cloud.
- Experience in Unix shell scripting and Python.
- Experience with relational SQL, Big Data, etc.
Preferred technical and professional experience:
- Knowledge of MS Azure Cloud.
- Experience with Informatica PowerCenter.
- Experience in Unix shell scripting and Python.
Posted 1 week ago
4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Hands-on development experience with microservices, .NET, and C#.
- Hands-on development experience with VB, Azure, and DevOps.
- Hands-on development experience with Kafka or Azure Service Bus.
- Hands-on development experience with AKS containerization.
- Good-to-have skills: GitHub, Git, SQL/RDBMS.
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
Posted 1 week ago
15.0 - 20.0 years
30 - 35 Lacs
Bengaluru
Remote
We are looking for a seasoned Senior Engineering Manager with over 15 years of experience in software engineering and integration technologies. This leadership role involves driving the strategic vision, architecture, and execution of enterprise-grade integration solutions. You will lead high-performing teams, collaborate with senior stakeholders, and ensure seamless connectivity across platforms, systems, and services.
Key Responsibilities:
- Strategic Leadership: Define and execute the integration technology roadmap aligned with business goals.
- Team Management: Lead, mentor, and grow a team of engineers across multiple integration domains.
- Architecture Oversight: Guide the design of scalable, secure, and resilient integration architectures using APIs, messaging systems, and middleware.
- Cross-Functional Collaboration: Partner with Product, Infrastructure, Security, and Data teams to deliver cohesive solutions.
- Cloud Integration: Leverage Azure Integration Services (Logic Apps, API Management, Service Bus, Event Grid) to build scalable and secure integrations.
- API & Middleware Management: Guide the development of RESTful APIs, microservices, and event-driven systems using Java and frameworks like Spring Boot.
- Technology Governance: Establish best practices, coding standards, and compliance protocols for integration development.
- Quality Assurance: Establish data quality and governance standards, implement monitoring and alerting frameworks, and ensure reliable data delivery.
- Innovation & R&D: Stay ahead of emerging trends in cloud-native integration, AI-driven automation, and event-driven architectures.
Required Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 15+ years of experience in software engineering, with 5+ years in engineering leadership roles.
- Deep expertise in integration platforms (e.g., TIBCO, IIB, Azure iPaaS).
- Strong hands-on experience with Java, Spring Boot, and related frameworks.
- Strong background in API management, microservices, and cloud-native development (AWS, Azure).
- Proven experience in leading large-scale integration projects across enterprise environments.
- Excellent leadership, communication, and stakeholder management skills.
- Familiarity with CI/CD pipelines and DevOps practices for product engineering.
Please share your updated CV, CTC, ECTC & notice period at monika.yadav@ness.com
Posted 1 week ago
9.0 - 14.0 years
37 - 40 Lacs
Kolkata, Bengaluru, Delhi / NCR
Work from Office
Dear Candidate,
We are looking for a skilled Python Developer to build scalable web applications and data-driven solutions. If you have expertise in Django, Flask, and API development, we'd love to hear from you!
Key Responsibilities:
- Design and develop robust, scalable Python applications.
- Create RESTful APIs for seamless integration with front-end applications.
- Optimize application performance and database queries.
- Work with cloud platforms (AWS, Azure, GCP) for deployment.
- Implement best coding practices and security standards.
- Collaborate with data scientists, DevOps, and front-end teams.
Required Skills & Qualifications:
- Proficiency in Python and web frameworks (Django, Flask, FastAPI).
- Hands-on experience with databases (PostgreSQL, MySQL, MongoDB).
- Knowledge of front-end technologies (HTML, CSS, JavaScript) is a plus.
- Familiarity with cloud services and containerization (Docker, Kubernetes).
- Experience with version control (Git, GitHub, GitLab).
Soft Skills:
- Strong analytical and problem-solving skills.
- Ability to work independently and in a collaborative environment.
- Good communication and teamwork skills.
Note: If interested, please share your updated resume and your preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa, Delivery Manager, Integra Technologies
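As a hedged illustration of the "create RESTful APIs" responsibility, the sketch below uses Flask, one of the frameworks named in the posting; the endpoint, payload shape, and in-memory store are invented for the example:

```python
# Illustrative Flask REST API; endpoint and payload shapes are assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)
_users = {}  # in-memory store standing in for PostgreSQL/MySQL

@app.route("/users", methods=["POST"])
def create_user():
    # Accept a JSON body and assign a simple incrementing ID.
    data = request.get_json(force=True)
    user_id = len(_users) + 1
    _users[user_id] = {"id": user_id, "name": data.get("name")}
    return jsonify(_users[user_id]), 201

@app.route("/users/<int:user_id>", methods=["GET"])
def get_user(user_id):
    user = _users.get(user_id)
    if user is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(user)

if __name__ == "__main__":
    app.run(debug=True)
```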
Posted 1 week ago
4.0 - 6.0 years
7 - 8 Lacs
Bengaluru
Work from Office
As a Dynamics CRM Architect, you will play a pivotal role in the configuration, customization, and management of Dynamics 365 solutions. Your responsibilities will span development, troubleshooting, and implementation, ensuring seamless CRM operations.
Key Responsibilities:
- Configure and customize Dynamics 365, including workflows, plugins, HTML, JavaScript, and the Unified Interface.
- Utilize Microsoft Power Platform and Microsoft Power Automate to enhance system functionality.
- Work with web technologies and languages such as ASP.NET, HTML, JavaScript, XML, JSON, and .NET Core, ensuring multi-threaded application development.
- Manage cloud/on-premises databases like Azure SQL and SQL Server, and integrate using tools like KingswaySoft.
- Document software components, such as technical and functional specifications.
- Provide troubleshooting and technical support across environments.
- Design and integrate Microsoft Power Portals, including UI development and Liquid scripting.
- Utilize code management tools like Git, TFS, JIRA, and Confluence for project tracking.
- Employ design tools such as ER modeling, UML diagrams, and flowcharts.
- Handle Dynamics 365 CRM development, code merging, deployments, and CI/CD pipelines.
Required Skills:
- Proficiency in Azure Cloud, including authentication and resource management.
- Experience with QAD or SAP ERP integrations.
- Expertise in data migrations, customizations, workflows, and plugins.
- Strong understanding of Power Apps, both cloud and on-premises databases, and virtual entities.
- Knowledge of exception handling, timeouts, performance tuning, and role-based feature implementation.
- Ability to manage managed services and unmanaged solutions.
- Extensive experience in Sales and Service within the CRM sales lifecycle.
- End-to-end CRM development and implementation experience, including deployment across various environments.
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Pune, Coimbatore
Hybrid
Data Software Engineer - Spark, Python, (AWS, Kafka or Azure Databricks or GCP)
Job Description:
1. 5-12 years of experience in Big Data and data-related technologies
2. Expert-level understanding of distributed computing principles
3. Expert-level knowledge of and experience in Apache Spark
4. Hands-on programming with Python
5. Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
6. Experience with building stream-processing systems, using technologies such as Apache Storm or Spark Streaming
7. Experience with messaging systems, such as Kafka or RabbitMQ
8. Good understanding of Big Data querying tools, such as Hive and Impala
9. Experience with integration of data from multiple data sources such as RDBMS (SQL Server, Oracle), ERP, and files
10. Good understanding of SQL queries, joins, stored procedures, and relational schemas
11. Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
12. Knowledge of ETL techniques and frameworks
13. Performance tuning of Spark jobs
14. Experience with native cloud data services: AWS or Azure Databricks, GCP
15. Ability to lead a team efficiently
16. Experience with designing and implementing Big Data solutions
17. Practitioner of Agile methodology
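For context on items 3, 4, and 13 above (Apache Spark, Python, and Spark job tuning), a minimal PySpark batch job of the kind implied might look like the following; file paths and column names are assumptions, not part of the posting:

```python
# Minimal PySpark batch job: read a CSV, aggregate, write Parquet.
# Paths and column names are assumptions for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

# Read raw orders with a header row; all columns arrive as strings.
orders = spark.read.option("header", True).csv("/data/input/orders.csv")

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.count("*").alias("order_count"),
    )
)

# Write the aggregate back out in a columnar format for downstream queries.
daily_revenue.write.mode("overwrite").parquet("/data/output/daily_revenue")
spark.stop()
```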
Posted 1 week ago
3.0 - 8.0 years
1 - 2 Lacs
Bengaluru
Remote
- 3+ years in a similar data engineering role
- Proficient in Java and Python
- Strong knowledge of XML & JSON formats
- Experience with Oracle, MSSQL, Snowflake, Tableau, Power BI, and Solidatus
- Familiar with AWS and Azure cloud platforms
Posted 1 week ago
4.0 - 9.0 years
3 - 8 Lacs
Bengaluru
Work from Office
Position Summary: We are seeking a skilled and detail-oriented Microsoft Administrator - Vulnerability Remediation to join our team. The ideal candidate will have a strong background in Microsoft platforms, vulnerability remediation, and VMware administration. This role requires a proactive professional with a focus on identifying and resolving vulnerabilities across a large number of hosts, ensuring robust system security and compliance. The position involves repetitive tasks, requiring consistency, precision, and adherence to established protocols.
Key Responsibilities
Vulnerability Remediation:
- Analyze, prioritize, and remediate vulnerabilities across large-scale Microsoft environments.
- Implement patches, updates, and configuration changes to resolve vulnerabilities while minimizing disruptions to operations.
- Collaborate with security teams to ensure timely and effective mitigation of identified risks.
Microsoft Platform Administration:
- Manage, monitor, and maintain Microsoft environments, including Windows Server and related services.
- Work on Azure administration tasks, including provisioning, monitoring, and maintaining cloud-based resources.
- Follow and implement standard work documents, protocols, and best practices for the Microsoft ecosystem.
Azure Administration:
- Perform tasks related to Azure cloud services, including virtual machines, storage, networking, and security configurations.
- Monitor and troubleshoot Azure-based solutions, ensuring optimal performance and security.
VMware Administration:
- Oversee VMware infrastructure, including installation, configuration, and maintenance of virtualized environments.
- Work with ESXi hosts, vCenter servers, and other VMware products to support business requirements.
- Conduct regular updates, backups, and performance optimization of VMware systems.
Repetitive Task Management:
- Execute remediation processes consistently and accurately across a high volume of hosts.
- Document completed tasks and provide regular updates to stakeholders.
Collaboration and Reporting:
- Work closely with cross-functional teams, including IT security, infrastructure, and operations.
- Provide detailed reports on remediation activities, status updates, and compliance metrics.
Required Skills and Qualifications
Experience:
- Minimum 3+ years of Microsoft administration experience, with a strong focus on vulnerability remediation.
- 3+ years of hands-on experience with the Microsoft platform and Azure Administrator responsibilities.
- Solid experience in VMware administration, including virtualized environments and related tools.
Technical Skills:
- In-depth knowledge of Windows Server and Azure cloud services.
- Proficiency in identifying, analyzing, and remediating security vulnerabilities.
- Experience with patch management tools and techniques.
- Strong understanding of VMware environments, including ESXi and vSphere.
Soft Skills:
- Excellent attention to detail and commitment to high-quality work.
- Strong problem-solving and troubleshooting skills.
- Ability to follow detailed procedures and standard documentation.
- Good communication and teamwork abilities.
Posted 1 week ago
5.0 - 8.0 years
15 - 18 Lacs
Pune, Bengaluru
Hybrid
• Design, develop, and maintain scalable and secure microservices applications using .NET technologies on the Azure platform.
• Integrate microservices with other systems and third-party APIs to enable seamless data exchange.
Posted 1 week ago
1.0 - 3.0 years
2 - 5 Lacs
Coimbatore
Work from Office
- Design, deploy, and manage cloud infrastructure solutions.
- Monitor cloud-based systems to ensure performance, reliability, and security.
- Automate processes to streamline operations and reduce manual tasks.
Required Candidate Profile:
- Experience in Linux system administration
- Exposure to AWS/Azure cloud platforms
- Knowledge of scripting languages like Python, Bash, or similar
- Understanding of CI/CD pipelines and DevOps practices
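As one possible illustration of the Python scripting and cloud monitoring mentioned above, a small boto3 (AWS SDK for Python) script could report instance states; the region is an assumption and the snippet presumes AWS credentials are already configured:

```python
# Small monitoring/automation script using boto3 (AWS SDK for Python).
# Region is an assumption; credentials are expected to be configured already.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

def report_instance_states() -> None:
    """Print the ID and current state of every EC2 instance in the region."""
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                print(instance["InstanceId"], instance["State"]["Name"])

if __name__ == "__main__":
    report_instance_states()
```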
Posted 1 week ago
14.0 - 21.0 years
25 - 40 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Azure Architect
Experience: 12+ years
Location: Gurugram (Hybrid)
Job Description:
We are looking for an experienced and visionary Azure Architect to lead the design, implementation, and optimization of enterprise-level Azure solutions. The ideal candidate will have a deep understanding of Azure infrastructure, enterprise solution design, network interfaces, and architecture capabilities. This role requires strategic planning, hands-on expertise, and a focus on delivering secure, scalable, and cost-efficient cloud solutions.
Technical Competencies:
Comprehensive knowledge of infrastructure services:
- Windows servers
- Linux servers
- Networking
- Firewall security and web application firewalls
- Storage solutions
- Automation using MS-DOS and PowerShell
Azure infrastructure management expertise:
- Enterprise-level resource provisioning and management
- Advanced networking and connectivity design
- Identity and Access Management (IAM) for secure operations
- Backup, disaster recovery, and high-availability solutions
- Security, compliance, and governance policies
- Cost management and performance optimization
- Automation and scripting for scalable deployments
- Patch management and continuous updates
Required Skills:
- Expertise in Azure Resource Manager (ARM), Virtual Machines, Azure Networking, Storage Accounts, and Databases.
- Proficiency in scripting and automation (PowerShell, Azure CLI, or Python).
- Experience with infrastructure-as-code tools such as Terraform or Bicep.
- Strong understanding of Azure monitoring tools (Azure Monitor, Log Analytics, Application Insights).
- In-depth knowledge of Azure security (Azure AD, IAM, Security Center).
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes on AKS) is a plus.
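To illustrate the "scripting and automation (PowerShell, Azure CLI, or Python)" requirement, a minimal sketch with the Azure SDK for Python that enumerates virtual machines might look like this; the subscription ID is a placeholder, and authentication relies on whatever credential DefaultAzureCredential can find in the environment:

```python
# Hedged sketch: enumerate VMs with the Azure SDK for Python.
# The subscription ID is a placeholder; DefaultAzureCredential picks up
# whatever login is available (Azure CLI, managed identity, env vars).
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()
subscription_id = "<subscription-id>"  # placeholder, not a real value

compute = ComputeManagementClient(credential, subscription_id)

# List every virtual machine in the subscription with its region.
for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location)
```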
Posted 1 week ago
5.0 - 10.0 years
7 - 17 Lacs
Pune
Hybrid
Senior Data Engineer:
At Acxiom, our vision is to transform data into value for everyone. Our data products and analytical services enable marketers to recognize, better understand, and then deliver highly applicable messages to consumers across any available channel. Our solutions enable true people-based marketing with identity resolution and rich descriptive and predictive audience segmentation. We are seeking an experienced Data Engineer with a versatile skill set to undertake data engineering efforts to build the next-generation ML infrastructure for Acxiom's business. As part of the Data Science and Analytics Team, the Sr. Data Engineer will partner with Data Scientists, work hands-on with Big Data technologies, and build a scalable infrastructure to support development of machine-learning-based audience propensity models and solutions for our domestic and global businesses. The Sr. Data Engineer's responsibilities include collaborating with internal and external stakeholders to identify data ingestion, processing, ETL, and data warehousing requirements and to develop appropriate solutions using modern data engineering tools in the cloud. We want this person to help us build a scalable data lake and EDW using a modern tech stack from the ground up. Success in this role comes from combining a strong data engineering background with product and business acumen to deliver scalable data pipeline and database solutions that can enable and support a high-performance, large-scale modeling infrastructure at Acxiom. The Sr. Data Engineer will be a champion of the latest cloud database technologies and data engineering tools and will lead by example in influencing adoption and migration to the new stack.
What you will do:
- Partner with ML Architects and data scientists to drive POCs to build a scalable, next-generation model development, model management, and governance infrastructure in the cloud.
- Be a thought leader and champion for adoption of new cloud-based database technologies and enable migration to the new cloud-based modeling stack.
- Collaborate with other data scientists and team leads to define project requirements and build the next-generation data source ingestion, ETL, data pipelining, and data warehousing solutions in the cloud.
- Build data engineering solutions by developing a strong understanding of business and product data needs.
- Manage environment security permissions and enforce role-based compliance.
- Build expert knowledge of the various data sources brought together for audience propensity solutions: survey/panel data, 3rd-party data (demographics, psychographics, lifestyle segments), media content activity (TV, digital, mobile), and product purchase or transaction data, and develop solutions for seamless ingestion and processing of the data.
- Resolve defects/bugs during QA testing, pre-production, production, and post-release patches.
- Contribute to the design and architecture of services across the data landscape.
- Participate in development of the integration team, contributing to reviews of methodologies, standards, and processes.
- Contribute to comprehensive internal documentation of designs and service components.
Required Skills:
- Background in data pipelining, warehousing, and ETL development solutions for data science and other Big Data applications.
- Experience with distributed, columnar, and/or analytics-oriented databases or distributed data processing frameworks.
- Minimum of 4 years of experience with cloud databases: Snowflake, Azure SQL Database, AWS Redshift, Google Cloud SQL, or similar.
- Experience with NoSQL databases such as MongoDB or Cassandra is nice to have.
- Snowflake and/or Databricks certification preferred.
- Minimum of 3 years of experience in developing data ingestion, data processing, and analytical pipelines for big data, relational databases, data lake, and data warehouse solutions.
- Minimum of 3 years of hands-on experience with Big Data technologies such as Hadoop, Spark, PySpark, Spark SQL, Hive, Pig, and Oozie, and streaming technologies such as Kafka, Spark Streaming, ingestion APIs, Unix shell/Perl scripting, etc.
- Strong programming skills using Java, Python, PySpark, Scala, or similar.
- Experience with public cloud architectures, pros/cons, and migration considerations.
- Experience with container-based application deployment frameworks (Kubernetes, Docker, ECS/EKS, or similar).
- Experience with data visualization tools such as Tableau, Looker, or similar.
- Outstanding troubleshooting, attention to detail, and communication skills (verbal/written) in a fast-paced setting.
- Bachelor's Degree in Computer Science or a relevant discipline, or 7+ years of relevant work experience.
- Solid communication skills: demonstrated ability to explain complex technical issues to technical and non-technical audiences.
- Strong understanding of the software design/architecture process.
- Experience with unit testing and data quality checks.
- Building infrastructure-as-code for public cloud using Terraform.
- Experience in a DevOps engineering or equivalent role.
- Experience developing, enhancing, and maintaining CI/CD automation and configuration management using tools such as Jenkins, Snyk, and GitHub.
What will set you apart (preferred skills):
- Ability to work in white space and develop solutions independently.
- Experience building ETL pipelines with health claims data is a plus.
- Prior experience with cloud-based ETL tools such as AWS Glue, AWS Data Pipeline, or similar.
- Experience building real-time and streaming data pipelines is a plus.
- Experience with MLOps tools such as Apache MLflow or Kubeflow is a plus.
- Exposure to end-to-end ML platforms such as AWS SageMaker, Azure ML Studio, Google AI/ML, DataRobot, Databricks, or similar is a plus.
- Experience with ingestion, processing, and management of 3rd-party data.
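Given the posting's emphasis on Spark, PySpark, and streaming technologies such as Kafka and Spark Streaming, a minimal Spark Structured Streaming ingest is sketched below; the broker address, topic, schema, and output paths are illustrative assumptions, and the job needs the spark-sql-kafka connector package on the Spark classpath:

```python
# Illustrative Spark Structured Streaming ingest from Kafka into Parquet.
# Broker, topic, schema, and paths are assumptions; requires the
# spark-sql-kafka connector package to be available to Spark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("value", DoubleType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    # Kafka delivers bytes; decode the value column and parse it as JSON.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/lake/events")
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()
```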
Posted 1 week ago
6.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- End-to-End Solution Design: Work closely with cross-functional teams to define and develop full-stack applications, including backend services, frontend integration, and cloud-based solutions.
- API Development: Design and implement robust APIs, ensuring that they are scalable and maintainable.
- Collaborative Development: Work in collaboration with product managers, data scientists, and UX teams to develop new features and improve existing ones.
- Performance Optimisation: Enhance application performance, scalability, and security by developing efficient algorithms and making optimal use of cloud resources.
- Code Quality & Best Practices: Follow best practices in coding, testing, and deployment to ensure high-quality, production-ready solutions.
- Documentation & Maintenance: Ensure proper documentation for both the backend architecture and API usage. Provide ongoing maintenance and updates for the applications.
Skills & Experience Required:
- Experience: 4 to 6 years of product development experience.
- Backend Development: Strong experience in building scalable and performant backend services using Python and frameworks such as Flask, FastAPI, and Django.
- Strong Coding Skills: Solid programming foundation with experience in writing clean, maintainable, and efficient code.
- Database Expertise: Hands-on experience with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB) databases, and ORM frameworks for managing and optimising database interactions.
- Cloud Services: Proficiency in cloud platforms like AWS, GCP, or Azure, and knowledge of deploying applications using cloud-native services.
- Front-End Integration: Experience with front-end technologies like ReactJS, HTML, CSS, and JavaScript for integrating the frontend with backend services.
- Version Control & CI/CD: Experience with Git, and tools like Jenkins or CircleCI for continuous integration and continuous deployment.
- Testing: Ability to write unit tests and integration tests and ensure the application is highly reliable.
Preferred Skills:
- AI/ML Experience: Exposure to working with AI/ML tools and integrating machine learning models into backend services.
- Containerization & Orchestration: Knowledge of Docker and Kubernetes for containerising applications and managing microservices.
- Leadership & Mentorship: Ability to guide junior developers, participate in code reviews, and contribute to technical decision-making.
Education and Experience:
- A degree in Computer Science, Engineering, or a related field (B.Tech/M.Tech or equivalent).
- 4 to 6 years of experience in full-stack development, with a focus on backend development and microservices architecture.
Why Join Us:
- Work on cutting-edge AI and machine learning technologies.
- Be part of a collaborative and innovative environment.
- Solve real-world industrial challenges with impactful solutions.
- Opportunities for professional growth and learning.
Posted 1 week ago
5.0 - 10.0 years
0 - 0 Lacs
Hyderabad
Work from Office
Role: Python Lead
Location: Hyderabad
Job Description:
OSI is looking for a results-driven backend lead with 5+ years of experience in creating clean, modern, scalable, secure, and maintainable code for web and cloud products and applications.
Responsibilities:
- Collaborate with the Technical Architect and project stakeholders to define and refine the technical direction of the project.
- Drive the design and development of robust, scalable, and efficient solutions using Python (FastAPI & MongoDB).
- Programming and development; conduct code reviews, distribute work among the team, identify areas for improvement, and enforce code quality standards.
- Collaborate with cross-functional teams, including DevOps/CI/CD, QA, and business stakeholders, to ensure successful project delivery.
- Stay up to date with industry trends, emerging technologies, and best practices in Python frameworks.
Required Skills:
- 5+ years of experience in developing enterprise applications using Python.
- Strong understanding of microservices architecture and API development, including experience with API gateways and service orchestration.
- Excellent Python coding skills and understanding of Python best practices (PEP 8) and the fundamental design principles behind scalable applications.
- Proficiency in building RESTful APIs using FastAPI or similar web frameworks (Flask), with an in-depth understanding of the OpenAPI schema.
- Experience working with ORM (object-relational mapping) libraries (SQLAlchemy, Pydantic SQLModel, etc.) and NoSQL (MongoDB).
- Working knowledge of cloud platforms such as Azure, and experience building cloud-native applications.
- Proficiency in data modeling and working knowledge of DynamoDB and distributed caches.
- Excellent communication and interpersonal skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders.
- Strong problem-solving skills, with the ability to anticipate potential issues and propose effective solutions.
- Familiarity with DevOps practices and tools, such as version control systems (e.g., Git) and CI/CD pipelines.
We are NOT looking for someone who:
- Restricts themselves to resource/customer management only.
- Prefers to work in fully remote mode.
Please share the details below, along with your updated CV, to samaravadi@osidigital.com:
- Total experience
- Relevant experience as a Python developer
- Relevant experience with FastAPI/Django
- Relevant experience with AWS/Azure Cloud
- Relevant experience with MongoDB
- CTC, ECTC, and notice period
- Current location and preferred location
- Willingness to work from the Hyderabad office
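Because the stack here is explicitly FastAPI with MongoDB, a minimal, hypothetical sketch of an async endpoint backed by the Motor driver is shown below; the connection string, database, collection, and model fields are assumptions:

```python
# Hypothetical FastAPI endpoint backed by MongoDB via Motor (async driver).
# Connection string, database, collection, and fields are assumptions.
from fastapi import FastAPI, HTTPException
from motor.motor_asyncio import AsyncIOMotorClient
from pydantic import BaseModel

app = FastAPI()
client = AsyncIOMotorClient("mongodb://localhost:27017")
orders = client["exampledb"]["orders"]

class Order(BaseModel):
    order_id: str
    amount: float

@app.post("/orders")
async def create_order(order: Order) -> Order:
    # model_dump() is pydantic v2; use .dict() on pydantic v1.
    await orders.insert_one(order.model_dump())
    return order

@app.get("/orders/{order_id}")
async def get_order(order_id: str) -> Order:
    doc = await orders.find_one({"order_id": order_id})
    if doc is None:
        raise HTTPException(status_code=404, detail="Order not found")
    return Order(order_id=doc["order_id"], amount=doc["amount"])
```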
Posted 1 week ago
4.0 - 9.0 years
6 - 15 Lacs
Bengaluru
Hybrid
Before coming, please call 9701923036.
We Are Hiring | Java Backend Developer
Location: Bangalore | Work Mode: Hybrid
Experience: 4-6 Years
Interview Mode: F2F – In-Person (Both L1 & L2). Only apply if you're available for in-person interviews.
Job Description:
We are looking for a skilled Java Backend Developer with cloud experience to join our dynamic team!
Key Responsibilities & Requirements:
- Proven experience in backend development using Java
- Strong knowledge of Spring Boot, Hibernate, and related frameworks
- Hands-on experience with cloud platforms, preferably Microsoft Azure
- Experience with containerization and orchestration (Docker, Kubernetes)
- Working knowledge of Azure App Services, Functions, Storage, and SQL Database
- Familiarity with CI/CD pipelines and DevOps best practices
- Good understanding of SQL and NoSQL databases
Interested? Share your updated resume at gopi.c@acesoftlabs.com
Call/WhatsApp: +91 9701923036
Posted 1 week ago