4.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Position: We are seeking an experienced Application Support (L2) professional to design, implement, and optimize solutions specific to the Payments domain.

Role: Application Support (L2 Support)
Location: Bangalore
Experience: 4 to 7 years
Job Type: Full-time employment

What You'll Do:
You will design, develop, and maintain high-performance applications in an application support capacity. You will revamp the legacy lending system, ensuring platformization, extensibility, testability, and scalability as foundational principles. You will engage with stakeholders to gather and define requirements, skillfully managing ambiguity throughout the project lifecycle. You will collaborate with vendors to integrate the lending system with their offerings. You will work closely with the client’s cross-functional teams, including product and business teams, to influence technical decisions and define feature specifications.

Expertise You'll Bring:
4-7 years of hands-on industry experience in building distributed systems to deliver scalable and reliable solutions.
Proven experience in application support engineering, with a strong emphasis on Kubernetes environments.
Proven experience supporting microservices-based applications.
Good understanding of Google Cloud Platform and its services.
Solid understanding of containerization and orchestration tools, with hands-on experience in Kubernetes.
Proficiency in implementing and managing monitoring solutions (e.g., Dynatrace, AppDynamics) for applications and infrastructure.
Strong troubleshooting skills and the ability to analyze logs and performance metrics.
Excellent communication and collaboration skills to work effectively with cross-functional teams.
Experience with cloud platforms (e.g., GCP, AWS, Azure) is a plus.
Object-oriented analysis and design using common design patterns.
Good knowledge of Java and Angular.
Knowledge of relational databases, SQL, and ORM technologies (JPA2, Hibernate).
Experience in Shell and Python scripting.

Benefits:
Competitive salary and benefits package.
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications.
Opportunity to work with cutting-edge technologies.
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
Annual health check-ups.
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with a disability and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.
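Editor's note: much of the L2 troubleshooting described above comes down to scanning application logs quickly. As a rough illustration only (not part of the posting), here is a minimal Python sketch of the kind of log-triage helper such a role might script; the log path and message format are assumptions.

```python
import re
from collections import Counter
from pathlib import Path

# Hypothetical log location and error format -- adjust to the actual application.
LOG_FILE = Path("/var/log/payments/app.log")
ERROR_PATTERN = re.compile(r"ERROR\s+\[(?P<component>[\w.-]+)\]\s+(?P<message>.+)")

def summarize_errors(log_path: Path, top_n: int = 10) -> None:
    """Count ERROR lines per component to spot the noisiest failure source."""
    counts = Counter()
    with log_path.open(encoding="utf-8", errors="replace") as handle:
        for line in handle:
            match = ERROR_PATTERN.search(line)
            if match:
                counts[match.group("component")] += 1
    for component, total in counts.most_common(top_n):
        print(f"{total:6d}  {component}")

if __name__ == "__main__":
    summarize_errors(LOG_FILE)
```

A helper like this only narrows the search; root-cause analysis still happens in the monitoring tools (Dynatrace, AppDynamics) named above.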
Our company fosters a values-driven and people-centric work environment that enables our employees to:
Accelerate growth, both professionally and personally
Impact the world in powerful, positive ways, using the latest technologies
Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
Unlock global opportunities to work and learn with the industry’s best
Let’s unleash your full potential at Persistent. “Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.”
Posted 11 hours ago
0.0 - 8.0 years
20 - 25 Lacs
Pune, Maharashtra
On-site
We’re Looking For a Sr DevOps Engineer!

At Organization You Will:
• Design stable and polished solutions for CI/CD pipelines with multiple stages and gates to handle quality checks, bundling of assets, and creation of container images
• Design optimal cloud infrastructure for given requirements using architectural tradeoff analysis methods
• Be responsible for mapping out the current and future state of the deployment strategy
• Learn and grow your existing skill set to help solidify your knowledge as well as to advance the industry's practices
• Mentor junior engineers to develop solutions that fit industry best practices

What You Bring To The Table:
• 5+ years of professional DevOps experience
• 3+ years of experience with continuous integration and deployment tools, such as Jenkins, Azure DevOps, GitHub Actions, GitLab Pipelines, etc.
• 3+ years of experience with containers, container orchestration, and deployment tools, such as Kubernetes, Docker, and Helm
• Experience with configuration management tools (Chef, Ansible)
• Expertise in using a major cloud provider (AWS, Azure, GCP)
• Expertise in using containerization (Docker, Kubernetes)
• Expertise in using Infrastructure-as-Code tools (Terraform, Pulumi)
• Comfort with the Linux CLI (being able to navigate the filesystem, check processes, and understand what’s happening on the OS)
• Knowledge of a programming/scripting language (Bash, PowerShell, and/or TypeScript, Python)
• Knowledge of instrumenting applications to expose internal state for capturing metrics to be monitored
• Knowledge of observability tooling to assist in debugging of applications (Prometheus/Grafana/EFK)

What Makes You Stand Out:
• Solid verbal and written communication skills
• A true consultant mentality
• You are a self-starter who takes initiative and ownership of a challenge while providing practical and innovative solutions
• Strong analytical skills
• Experience speaking to technical and business audiences while working globally
• Collaboration mindset
• An ability to work as part of small and large teams
• Leadership skills
• Inspiring others to follow your lead
• Familiarity with Scrum/Agile development methodology

Job Types: Full-time, Permanent
Pay: ₹2,000,000.00 - ₹2,500,000.00 per year
Benefits: Cell phone reimbursement, Health insurance, Life insurance, Paid sick time, Paid time off, Provident Fund
Schedule: Monday to Friday
Supplemental Pay: Performance bonus
Experience: DevOps: 8 years (Required)
Language: English (Required)
Location: Pune, Maharashtra (Required)
Work Location: In person
Speak with the employer: +91 7876212244
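Editor's note: the CI/CD "stages and gates" mentioned above often end with a smoke-test gate. The sketch below is a minimal, hedged Python example of such a gate; the endpoint URLs are placeholders, not anything specified by the employer. A non-zero exit code is what CI tools such as Jenkins, GitHub Actions, or GitLab typically treat as a failed stage.

```python
import sys
import urllib.request

# Hypothetical service endpoints checked as a post-deployment gate in a CI/CD stage.
ENDPOINTS = [
    "https://staging.example.com/healthz",
    "https://staging.example.com/readyz",
]

def check(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with an HTTP 2xx status within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 300
    except Exception as exc:  # covers timeouts, connection errors, HTTP errors
        print(f"FAIL {url}: {exc}")
        return False

if __name__ == "__main__":
    ok = all(check(url) for url in ENDPOINTS)
    sys.exit(0 if ok else 1)  # non-zero exit fails the pipeline stage
```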
Posted 11 hours ago
1.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Infosys BPM Ltd. – Windows Support (Chat Process), Hyderabad

Are you ready to take your technical support career to the next level? Infosys BPM is hiring Technical Process Specialists and Senior Technical Process Specialists for our Windows Support Chat Process in Hyderabad! If you're passionate about troubleshooting and customer service, this is your chance to work with Microsoft's consumer products and deliver exceptional support via chat.

About the Role:
Position: Senior Technical Process Specialist / Technical Process Specialist
Location: Hyderabad (100% Work From Office)
Experience: 1 - 6 Years
Shift: 24x7 Rotational Shifts

What You'll Do:
Be the first point of contact for customers seeking assistance with Microsoft Windows & O365 issues.
Provide fast, clear, and effective solutions through chat support.
Create a positive experience by showing empathy and patience while troubleshooting.
Use your technical expertise to diagnose and resolve complex problems.

Must-Have Skills & Qualifications:
Typing speed of 35 WPM or higher (mandatory)
Prior experience in technical support via chat
Excellent written and verbal communication skills
Any Graduate or Postgraduate (Degree Certificate & Consolidated Marksheet mandatory)
Hands-on experience troubleshooting Microsoft Windows & O365 queries
Certifications like Microsoft Certified: Azure Fundamentals or M365 Fundamentals are a strong plus
Strong problem-solving skills and ability to thrive in a fast-paced environment
Patience, empathy, and a customer-first mindset
Must be ready for 100% WFO; no hybrid option

Interview Details:
Date: 26th July 2025
Time: 10:00 AM to 01:00 PM
Venue: Infosys STP, Office no 5, 6-963/2 Madhava Reddy Colony, ISB Road, Gachibowli, Hyderabad - 500032. Building number 4, 1st floor, Aryabhata conference room. Landmark: near ISB college.

What to Bring:
Updated resume (printout)
Two valid photo ID proofs (PAN Card/Driving License/Voter ID/Passport)
Original education documents for verification (10th, 12th, graduation semester-wise marksheets, CMM, provisional & original degree certificates)

Important Notes:
Laptops, cameras, and other electronic devices are not allowed at the venue for security reasons.
An original government ID is mandatory for security clearance.
Preference will be given to immediate joiners.

Infosys BPM is an equal opportunity employer. We celebrate diversity and are committed to fostering an inclusive workplace where everyone can thrive. Ready to launch your career with a global leader? Come meet us and take the first step toward a rewarding future!

Regards,
Infosys BPM Talent Acquisition | INFY HR
Posted 11 hours ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Description
Role: Senior Data Engineer
Location: Kochi / Trivandrum / Bangalore
Experience: 5+ years
Mandatory skills: Strong in MS SQL and SSIS, Data Lake, Azure SQL, ADF; lead experience
Start Date: Aug 6, 2025
Salary: 18 to 23 LPA

Job Purpose (both Onsite / Offshore)
Responsible for delivering senior-level innovative, compelling, coherent software solutions for our consumer, internal operations, and value chain constituents across a wide variety of enterprise applications through the creation of discrete business services and their supporting components.

Job Specification / Skills and Competencies
1. Designs, develops and delivers solutions that meet business line and enterprise requirements.
2. Participates in rapid prototyping and POC development efforts.
3. Advances overall enterprise technical architecture and implementation best practices.
4. Assists in efforts to develop and refine functional and non-functional requirements.
5. Participates in iteration and release planning.
6. Informs efforts to develop and refine functional and non-functional requirements.
7. Demonstrates knowledge of, adherence to, monitoring of, and responsibility for compliance with state and federal regulations and laws as they pertain to this position.
8. Strong ability to produce high-quality, properly functioning deliverables the first time.
9. Delivers work product according to established deadlines.
10. Estimates tasks with a level of granularity and accuracy commensurate with the information provided.
11. Works collaboratively in a small team.
12. Excels in a rapid iteration environment with short turnaround times.
13. Deals positively with high levels of uncertainty, ambiguity, and shifting priorities.
14. Accepts a wide variety of tasks and pitches in wherever needed.
15. Constructively presents, discusses and debates alternatives.
16. Takes shared ownership of the product.
17. Communicates effectively both verbally and in writing.
18. Takes direction from team leads and upper management.
19. Ability to work with little to no supervision while performing duties.
20. Proficient in SSIS & ADF.
21. Strong in MS SQL.
22. Hands-on experience in Data Lake.
23. Hands-on experience in data marts and data warehousing, including variant schemas (Star, Snowflake).
24. 5+ years of experience with advanced queries, stored procedures, views, triggers, etc.
25. 5+ years of experience in performance tuning of queries.
26. 5+ years of experience with both DDL and DML.
27. 5+ years of experience designing enterprise database systems using Microsoft SQL Server/Azure SQL preferred.
28. Experience in Lakehouse architecture preferred.
29. Experience with cloud technologies – AWS, Snowflake – is preferred.
30. Deep understanding of one or more source/version control systems; develops branching and merging strategies.
31. Working understanding of Web API, REST, JSON.
32. Working understanding of unit test creation.
33. Bachelor’s Degree is required, and/or a minimum of four (4)+ years of related work experience.
34. To adhere to the Information Security Management policies and procedures.
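Editor's note: for readers unfamiliar with the data-mart loading this role describes, the following is a small, hypothetical sketch of a set-based upsert from a staging table into a dimension table, driven from Python against SQL Server/Azure SQL. The connection string, schema, and column names are invented for illustration and are not from the posting.

```python
import pyodbc  # pip install pyodbc

# Hypothetical connection string and table names -- illustration only.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=warehouse;"
    "UID=etl_user;PWD=***"
)

UPSERT_DIM_CUSTOMER = """
MERGE dbo.DimCustomer AS tgt
USING stg.Customer AS src
    ON tgt.CustomerKey = src.CustomerKey
WHEN MATCHED THEN
    UPDATE SET tgt.Name = src.Name, tgt.City = src.City
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, Name, City)
    VALUES (src.CustomerKey, src.Name, src.City);
"""

def refresh_dim_customer() -> None:
    """Upsert the customer dimension from the staging schema in one set-based MERGE."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(UPSERT_DIM_CUSTOMER)
        conn.commit()
        print(f"Rows affected: {cursor.rowcount}")

if __name__ == "__main__":
    refresh_dim_customer()
```

In practice a load like this would more likely run inside an SSIS package or an ADF pipeline activity; the point of the sketch is only the set-based MERGE pattern.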
Posted 11 hours ago
0.0 - 2.0 years
0 - 0 Lacs
New Town, Kolkata, West Bengal
On-site
Job Title: Backend Developer
Company: Benda Infotech
Location: Newtown, Kolkata
Shift: Night Shift (US Shift) | 8:00 PM to 5:00 AM IST
Experience Required: 2 to 5 Years
Salary: As per industry standards
Joining: Immediate Joiners Preferred

Job Description: Benda Infotech is seeking a skilled and enthusiastic Backend Developer with 2 to 5 years of hands-on experience to join our growing development team. The ideal candidate will work closely with frontend developers, project managers, and QA teams to build and maintain robust, scalable backend systems. This is a full-time night shift position aligned with US working hours.

Key Responsibilities:
Design, develop, test, and maintain server-side applications and APIs.
Optimize application performance, scalability, and security.
Work with relational and/or NoSQL databases for efficient data management.
Collaborate with frontend teams to integrate user-facing elements using server-side logic.
Write clean, maintainable, and well-documented code.
Debug and resolve technical issues in production and non-production environments.
Implement and maintain third-party integrations and RESTful services.
Ensure proper code versioning using Git or similar tools.

Required Skills:
Strong knowledge of backend programming languages such as Node.js, PHP, Python, Java, or .NET (as per company tech stack).
Experience with RESTful API development and integration.
Hands-on experience with MySQL, PostgreSQL, MongoDB, or similar databases.
Proficiency in version control tools like Git.
Familiarity with cloud services (AWS, Azure, GCP) is a plus.
Understanding of Agile/Scrum methodologies.
Ability to work independently and troubleshoot complex issues.

Good to Have:
Knowledge of CI/CD pipelines.
Experience working in a B2B SaaS environment or with large-scale applications.
Familiarity with microservices architecture.

What We Offer:
Competitive salary and night shift allowance.
Supportive team environment and career growth opportunities.
Work with global clients and gain international exposure.

How to Apply: If you are passionate about backend development and eager to work in a dynamic night shift environment, apply now or send your CV to damayanti.benda@gmail.com.

Job Types: Full-time, Permanent
Pay: ₹15,000.00 - ₹25,000.00 per year
Benefits: Paid sick time
Schedule: Fixed shift, Monday to Friday, Night shift, US shift
Experience: Back-end development: 2 years (Required)
Location: New Town, Kolkata, West Bengal (Required)
Shift availability: Night Shift (Required)
Work Location: In person
Speak with the employer: +91 8967667532
Application Deadline: 04/08/2025
Expected Start Date: 04/08/2025
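Editor's note: as context for the "RESTful API development" requirement, here is a minimal Flask sketch of the pattern, using Python since it is one of the stacks the posting lists. The routes and in-memory store are illustrative assumptions, not the company's actual stack.

```python
from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)

# In-memory store standing in for MySQL/PostgreSQL/MongoDB in this sketch.
ITEMS: dict[int, dict] = {}

@app.route("/api/items", methods=["POST"])
def create_item():
    """Create an item from the JSON body and return it with a 201 status."""
    payload = request.get_json(force=True)
    item_id = len(ITEMS) + 1
    ITEMS[item_id] = {"id": item_id, "name": payload.get("name", "")}
    return jsonify(ITEMS[item_id]), 201

@app.route("/api/items/<int:item_id>", methods=["GET"])
def get_item(item_id: int):
    """Fetch a single item or return 404 if it does not exist."""
    item = ITEMS.get(item_id)
    if item is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(item)

if __name__ == "__main__":
    app.run(debug=True)
```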
Posted 11 hours ago
12.0 years
0 Lacs
Delhi, India
On-site
Job Role
• We are looking for Cloud Architects to design data management solutions, with strong knowledge of architecting and designing highly available and scalable databases on cloud.
• The architect will deliver hands-on, business-oriented strategic and technical consulting on requirements for cloud-native and marketplace data / database management architecture and solutions.

Key Responsibilities
• Designing PaaS and IaaS database technology (RDBMS, NoSQL, Distributed database)
• Designing cloud infrastructure services (Compute, Storage, Network, etc.) for DB deployment
• Designing Database Authentication and Authorization (IAM, RBAC) solutions
• Capacity planning, performance analysis and database optimization to manage DB workload
• Analysing and identifying infrastructure requirements for on-premise and for other cloud environments like Azure, Google, AWS
• Designing High Availability and Disaster Recovery solutions for database deployment on IaaS and PaaS platforms
• Designing database Backup and Recovery solutions using native or enterprise backup tools
• Designing database / data management and optimization job / task automation
• Designing Homogeneous and Heterogeneous database migration solutions within On-Premise or from On-Premise to Cloud (IaaS and PaaS)
• Designing database monitoring, alert notification/reporting, and data masking/encryption solutions
• Designing ETL / ELT solutions for data ingestion and data transformation
• Mentoring implementation teams, handholding when needed on best practices, and making sure the solution is implemented the right way
• Preparing high-level and low-level design documents as required for the implementation team
• Databases Technology and DB Services: Azure SQL, Azure SQL MI, PostgreSQL, MySQL, Oracle, SQL Server, AWS RDS, Amazon Aurora, Cloud SQL, Cloud Spanner, Cosmos DB, Azure Synapse Analytics / Google BigQuery / Amazon Redshift

Educational Qualifications
• Bachelor's degree in Engineering / Computer Science, Computer Engineering, or Information Technology.

Years of Experience (minimum & maximum)
Min: 12 Years, Max: 20 Years

What are the nature and scope of responsibilities the candidate should have handled?
• Understanding the customer's overall data estate, business principles and operations, and discovering / assessing database workloads
• Designing complex, highly available, distributed, fail-safe cloud managed and unmanaged databases
• HLD and LLD document preparation
• Evaluating and recommending the cloud managed database services best suited to customer needs for an optimal solution
• Driving cloud managed database technology initiatives end to end and across multiple layers of architecture
• Providing strong technical leadership in technology adoption

Knowledge & Skills
• Understanding of Public / Private / Hybrid Cloud solutions and Database services on Cloud
• Extensive experience in conducting Cloud Readiness Assessments for database environments, observing both business and technical perspectives
• Knowledge of Cloud best practices and guidelines for database deployment
• Knowledge of cloud-native HA-DR and database backup solutions
• Experience and strong knowledge of Reference Architectures of Azure / GCP
• Azure / GCP certified Architect (preferred)
• Good oral and written communication
• Ability to work on a distributed and multi-cultural team
• Good understanding of ITSM processes and related tools
• Willing to learn and explore new technologies

About: Jio Platforms Limited is an Indian technology company specializing in Internet, telecommunications, cloud computing, e-commerce, retail, and artificial intelligence, and is a subsidiary of Reliance Industries Limited, headquartered in Mumbai, India. Established in 2019, Jio Platforms acts as a holding company for India's largest mobile network operator Jio and other digital businesses of Reliance.
Posted 11 hours ago
0 years
0 Lacs
Port Blair, Andaman and Nicobar Islands, India
On-site
Manage, monitor, and maintain server environments, both on-premises and in the cloud (e.g., AWS EC2).
Develop and implement scripts and tools to automate routine tasks.
Evaluate and deploy solutions to ensure scalability and high availability for applications.
Ensure minimal downtime and maintain service-level agreements (SLAs).
Deploy, manage, and troubleshoot containerized applications using Docker.
Oversee database systems (MySQL, PostgreSQL, or equivalent), ensuring performance, security, and backups.
Manage cloud infrastructure (AWS, Azure, or GCP) and ensure security compliance.
Monitor network performance, implement firewalls, and manage encryption protocols.
Develop and deploy CI/CD pipelines to automate software delivery processes.
Use tools like GitLab CI or equivalent to streamline deployment workflows.

Preferred Skills
Excellent written and verbal communication skills in the English language.
Exceptional problem-solving skills and the ability to work collaboratively with a team or independently.
Ability to prioritize tasks and act on them effectively.
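Editor's note: to make the "scripts and tools to automate routine tasks" point concrete, below is a small hedged Python example of a routine disk-usage check; the mount points and threshold are placeholders and would differ per server, and the alerting channel (email, Slack, ticketing) is left out.

```python
import shutil
from datetime import datetime, timezone

# Hypothetical mount points and threshold -- adjust per server.
MOUNT_POINTS = ["/", "/var"]
THRESHOLD_PERCENT = 85.0

def disk_report() -> list[str]:
    """Return warning lines for any mount point above the usage threshold."""
    warnings = []
    for mount in MOUNT_POINTS:
        usage = shutil.disk_usage(mount)
        used_pct = usage.used / usage.total * 100
        if used_pct >= THRESHOLD_PERCENT:
            warnings.append(f"{mount} at {used_pct:.1f}% used")
    return warnings

if __name__ == "__main__":
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    for line in disk_report():
        # In a real setup this would feed the SLA/alerting tooling rather than stdout.
        print(f"[{stamp}] WARNING {line}")
```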
Posted 12 hours ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Data Engineer - Azure Databricks, PySpark, Python, Airflow - Chennai/Pune, India (6-10 years experience only)

YOU’LL BUILD TECH THAT EMPOWERS GLOBAL BUSINESSES

Our Connect Technology teams are working on our new Connect platform, a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on Connect data and insights to innovate and grow.

As a Junior Data Engineer, you’ll be part of a team of smart, highly skilled technologists who are passionate about learning and supporting cutting-edge technologies such as Spark, Scala, PySpark, Databricks, Airflow, SQL, Docker, Kubernetes, and other data engineering tools. These technologies are deployed using DevOps pipelines leveraging Azure, Kubernetes, Jenkins, and Bitbucket/GitHub.

Responsibilities
Develop, test, troubleshoot, debug, and make application enhancements leveraging Spark, PySpark, Scala, Pandas, Databricks, Airflow, and SQL as the core development technologies.
Deploy application components using CI/CD pipelines.
Build utilities for monitoring and automating repetitive functions.
Collaborate with Agile cross-functional teams - internal and external clients including Operations, Infrastructure, and Tech Ops.
Collaborate with the Data Science team to productionize the ML models.
Participate in a rotational support schedule to provide responses to customer queries and deploy bug fixes in a timely and accurate manner.

Qualifications
6-10 years of applicable software engineering experience.
Strong fundamentals with experience in Big Data technologies: Spark, PySpark, Scala, Pandas, Databricks, Airflow, SQL.
Must have experience in cloud technologies, preferably Microsoft Azure.
Must have experience in performance optimization of Spark workloads.
Good to have experience with DevOps technologies such as GitHub, Kubernetes, Jenkins, Docker.
Good to have knowledge of Snowflake.
Good to have knowledge of relational databases, preferably PostgreSQL.
Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business.
Minimum B.S. degree in Computer Science, Computer Engineering or a related field.

Additional Information
Enjoy a flexible and rewarding work environment with peer-to-peer recognition platforms.
Recharge and revitalize with the help of wellness plans made for you and your family.
Plan your future with financial wellness tools.
Stay relevant and upskill yourself with career development opportunities.

Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee-Assistance-Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce.
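Editor's note: as an illustration of the PySpark batch work described above, here is a minimal transform sketch; the storage paths and column names are assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical input path and columns -- illustration of a typical batch rollup.
spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

sales = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("abfss://raw@example.dfs.core.windows.net/sales/*.csv")
)

daily_totals = (
    sales
    .withColumn("sale_date", F.to_date("sale_timestamp"))
    .groupBy("sale_date", "store_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("order_id").alias("orders"),
    )
)

# Partitioning by date keeps downstream reads and Airflow backfills cheap.
daily_totals.write.mode("overwrite").partitionBy("sale_date").parquet(
    "abfss://curated@example.dfs.core.windows.net/daily_sales/"
)
spark.stop()
```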
We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
Posted 12 hours ago
14.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role: Technical Microsoft Practice Head
Experience: 14 Years
Location: Chennai
Mandatory Skills: Microsoft technologies, solutions, and services; Microsoft product suite, including Azure, Office 365, Dynamics 365, and Power Platform.

JD: Key Responsibilities:

1. Commercial Strategy Development:
• Develop and execute a comprehensive commercial strategy for the Microsoft practice, aligned with overall business objectives.
• Identify market trends, customer needs, and the competitive landscape to formulate effective go-to-market strategies.
• Drive revenue growth by identifying opportunities for expansion, upselling, and cross-selling Microsoft solutions and services.

2. Client Relationship Management:
• Cultivate and maintain strong relationships with key clients, understanding their business challenges and requirements.
• Collaborate with sales teams to identify new business opportunities, participate in client meetings, and contribute to proposal development.
• Act as a trusted advisor to clients, offering insights and recommendations on leveraging Microsoft technologies to achieve their business goals.

3. Technical Leadership:
• Provide technical leadership and guidance to a team of Microsoft consultants, architects, and developers.
• Stay abreast of the latest Microsoft technologies, trends, and best practices, and ensure their incorporation into solution design and delivery.
• Drive innovation by exploring emerging technologies and evaluating their applicability to client needs.

4. Project Delivery and Quality Assurance:
• Oversee the delivery of Microsoft projects, ensuring adherence to timelines, budgets, and quality standards.
• Conduct regular project reviews and performance assessments, identifying areas for improvement and implementing corrective actions as necessary.
• Champion a culture of continuous improvement and knowledge sharing within the Microsoft practice.

5. Team Development and Talent Management:
• Recruit, onboard, and retain top talent for the Microsoft practice, fostering a culture of excellence, collaboration, and accountability.
• Provide mentorship, coaching, and professional development opportunities to team members, helping them enhance their skills and advance their careers.
• Encourage a culture of innovation, creativity, and problem-solving among team members.

6. Collaboration and Partnership:
• Collaborate closely with other practice heads, sales teams, and cross-functional stakeholders to drive synergies and maximize business outcomes.
• Forge strategic partnerships with Microsoft and other ecosystem partners to enhance service offerings, access new markets, and strengthen competitive positioning.

Requirements:
• Bachelor’s degree in Computer Science, Engineering, Business Administration, or a related field; advanced degree preferred.
• Extensive experience (14+ years) in the IT industry, with a focus on Microsoft technologies, solutions, and services.
• Proven track record of success in driving commercial growth, managing client relationships, and leading technical teams.
• Strong understanding of the Microsoft product suite, including Azure, Office 365, Dynamics 365, and Power Platform.
• Excellent leadership, communication, and interpersonal skills, with the ability to influence and inspire others.
• Strategic thinker with a results-oriented mindset and a passion for innovation.
• Relevant certifications (e.g., Microsoft Certified: Azure Solutions Architect, Microsoft Certified: Dynamics 365) preferred.
Posted 12 hours ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Responsibilities: As a WebFOCUS Development Engineer, your core responsibilities will include: Designing, developing, testing, and deploying WebFOCUS reports, dashboards, and visualizations. Creating and managing WebFOCUS procedures (.fex), metadata layers, and reporting cubes. Building and maintaining ReportCaster schedules for automated report distribution. Collaborating with business users and product managers to gather requirements and deliver insights. Performing data analysis, data modeling, and report optimization. Writing clear and effective functional and technical documentation. Participating in solution architecture discussions and offering technical recommendations. Ensuring report performance, access security, and high-quality user experiences. Must Have Skills: 3–7 years of experience in WebFOCUS development Strong command of App Studio, InfoAssist, Developer Studio Excellent verbal and written communication skills Expertise in: WebFOCUS metadata modeling (Master/Access file creation) WebFOCUS report/cube development ReportCaster configuration and scheduling WebFOCUS security design and sign-on integration Proficiency in SQL and working with relational databases Knowledge of PowerBI, Microsoft Fabric and familiarity with Azure Services. Strong understanding of JavaScript, HTML5 and CSS Strong analytical and problem-solving skills Ability to engage directly with end-users for dashboard design and troubleshooting Nice To Have: Familiarity with Azure Services like Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field
Posted 12 hours ago
14.0 years
20 - 25 Lacs
Pune, Maharashtra, India
On-site
Urgent Hiring | Sr. DevOps Engineer (with Java) | Pune | Immediate Joiners Only Job Location: Yerwada, Pune (Hybrid – 3 Days Office/Week) Domain: Information Technology (IT) Position Type: Full-Time Experience: 8–14 Years Salary Range: ₹20 – ₹25 LPA Joining Timeline: Immediate only (Mandatory) Shift Timing: US Time Zones (EST/CST) Interview Rounds: 2–3 Technical Rounds (US Panel) 🎯 Role Overview – Sr. DevOps Engineer (DevOps 70% + Java 30%) This is a hands-on technical role involving DevOps engineering with a strong Java (Spring Boot, Kafka, Microservices) development background. You'll work closely with cross-functional teams, mentor junior engineers, and support high-velocity enterprise-grade delivery environments. ✅ Must-Have Skills 5+ years in DevOps Engineering 3+ years with CI/CD tools: Jenkins, GitHub Actions, GitLab Pipelines, Azure DevOps Expertise in Kubernetes, Docker, Helm Strong experience with any major cloud provider: AWS, Azure, or GCP Infrastructure-as-Code expertise: Terraform, Pulumi Proficient in Linux CLI, scripting (Bash, PowerShell, Python, TypeScript) Experience in Java 8/17, Spring Boot, Kafka, and Microservices Proven track record of mentoring junior engineers Excellent verbal & written communication; strong analytical mindset 🏆 Nice to Have Prior experience working in Agile/Scrum environments Consulting or client-facing experience Familiarity with observability tooling (Prometheus, Grafana, EFK) Skills: powershell,kafka,terraform,devops,jenkins,docker,python,azure,azure devops,gcp,pulumi,aws,github actions,typescript,java,kubernetes,ci/cd tools,bash,boot,helm,spring boot,spring,linux cli,gitlab pipelines,microservices
Posted 12 hours ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hello Candidates,

Please find below the job description (JD) and other essential details. It is crucial that we adhere to the following criteria to ensure we select the most suitable candidates.

Role: Sales Manager - Microsoft Services Sales
Experience: 7-12 years
Payroll Mode: Permanent
Location: Pune
Work mode: Work from office (5 days)
Commercial: Up to 25 LPA
No. of Positions: 2

We are looking for a dynamic and experienced Sales Manager to lead our Microsoft Services Sales vertical. This role requires a proactive sales professional with strong experience in managing OEM relationships, specifically Microsoft, developing new business opportunities in international markets, and driving go-to-market strategies in collaboration with marketing and delivery teams.

Key Responsibilities:

1. OEM Relationship Management:
Establish and strengthen strategic relationships with Microsoft as a key OEM partner.
Engage with Microsoft regional teams to identify co-selling opportunities, joint GTMs, and access to partner-led programs.
Build region-wise relationship maps and act as a bridge between delivery and Microsoft alliance teams.
Stay updated with Microsoft’s product roadmap, partner programs, and incentives to leverage them effectively.

2. Business Development & Sales:
Identify and qualify leads for Microsoft services (such as Azure, Dynamics 365, Microsoft 365, Power Platform, and Security offerings) in targeted geographies (EMEA, APAC, Americas).
Achieve quarterly and annual sales targets by generating new logos and expanding existing accounts.
Lead end-to-end sales cycles – including opportunity assessment, client presentations, solution discussions, proposal submission, and commercial negotiation.
Build strong client relationships and act as a trusted advisor for Microsoft technology adoption and digital transformation initiatives.

3. Collaboration with Marketing:
Work closely with the marketing team to build compelling Microsoft services portfolio collateral.
Support campaign planning and execution focused on specific Microsoft technologies and verticals.
Contribute to content creation for events, webinars, and social media showcasing Microsoft capabilities.

4. Internal Coordination:
Collaborate with pre-sales, delivery, and solutioning teams to design solutions tailored to client needs.
Ensure smooth handover of closed deals to the delivery team with proper documentation and stakeholder alignment.
Provide feedback on market trends, client needs, and competitor activities to leadership and marketing.

Key Requirements:
Bachelor's degree in Business, Engineering, or IT (MBA preferred).
Proven track record in selling Microsoft services across geographies.
In-depth knowledge of Microsoft’s ecosystem, including Azure, M365, Dynamics 365, Power Platform, and related services.
Experience in navigating partner ecosystems and managing OEM relationships.
Strong understanding of the global IT services landscape and ability to identify white spaces.
Excellent communication, presentation, and interpersonal skills.
Willingness to travel internationally as required.
Posted 12 hours ago
0.0 - 6.0 years
0 Lacs
Delhi, Delhi
On-site
Role Overview: We are seeking a skilled and analytical FinOps Engineer to join our cloud engineering team. The ideal candidate will have hands-on experience with Azure cloud services, a strong understanding of cloud cost optimization, and proficiency in scripting and automation to drive financial governance and operational efficiency in the cloud.

Key Responsibilities:
Implement and manage FinOps practices to optimize Azure cloud costs across the organization.
Develop and maintain automated scripts/tools (PowerShell, Azure CLI, Python, etc.) to collect, analyze, and report on usage and billing data.
Collaborate with finance, engineering, and DevOps teams to define budgets, forecasts, and alerts for cloud consumption.
Analyze Azure cost and usage data to identify trends, anomalies, and opportunities for cost savings.
Establish cost allocation models (e.g., tagging strategies) and ensure compliance across subscriptions and teams.
Generate dashboards and reports using Azure Cost Management, Power BI, or other visualization tools.
Participate in cloud architectural reviews to ensure cost-effective design and scaling.
Stay current with Azure pricing changes, service updates, and industry best practices for FinOps.

Must-Have Skills:
Proven experience with Azure cloud engineering and services (e.g., VMs, Storage, Networking, AKS, App Services).
Strong FinOps knowledge, including Azure Cost Management, budgets, reservations, and cost analysis.
Proficiency in scripting languages such as PowerShell, Azure CLI, Python, or Bash.
Experience implementing or supporting cloud tagging policies and chargeback/showback models.
Familiarity with infrastructure-as-code (e.g., ARM, Bicep, Terraform).
Strong analytical mindset and ability to work with large datasets.

Job Types: Full-time, Contractual / Temporary
Contract length: 6 months
Pay: Up to ₹150,000.00 per month
Schedule: Rotational shift
Ability to commute/relocate: Delhi, Delhi: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): What is your current CTC and expected CTC? What is your notice period?
Experience: FinOps Engineer, Azure: 6 years (Required)
Work Location: In person
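Editor's note: one concrete form the "collect, analyze, and report on usage and billing data" responsibility can take is rolling up a cost export by tag. The Python sketch below assumes a CSV cost-and-usage export with hypothetical column names; real Azure Cost Management export schemas vary, so treat every name here as a placeholder.

```python
import pandas as pd  # pip install pandas

# Assumed export file and column names (UsageDate, CostInBillingCurrency, cost-center tag).
EXPORT_FILE = "azure_cost_export.csv"

def cost_by_tag(path: str, tag_key: str = "cost-center") -> pd.DataFrame:
    """Roll up spend per tag value per month to support showback/chargeback reporting."""
    df = pd.read_csv(path, parse_dates=["UsageDate"])
    df[tag_key] = df[tag_key].fillna("untagged")
    summary = (
        df.groupby([tag_key, df["UsageDate"].dt.to_period("M")])["CostInBillingCurrency"]
        .sum()
        .reset_index()
        .sort_values("CostInBillingCurrency", ascending=False)
    )
    return summary

if __name__ == "__main__":
    report = cost_by_tag(EXPORT_FILE)
    print(report.head(20).to_string(index=False))
    # Untagged spend is usually the first target for tagging-policy enforcement.
    untagged = report[report["cost-center"] == "untagged"]["CostInBillingCurrency"].sum()
    print(f"Untagged spend in this export: {untagged:,.2f}")
```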
Posted 12 hours ago
0 years
12 - 16 Lacs
Pune, Maharashtra, India
On-site
Job Description We are seeking a skilled Generative AI Engineer with a strong background in Python to join our dynamic team. In this role, you will integrate backend development expertise with the latest advancements in AI to create impactful solutions. If you excel in a fast-paced environment and enjoy tackling complex challenges, we encourage you to apply. Key Responsibilities Generative AI Development Develop and implement generative AI models using frameworks like LangChain or Llama-Index. Apply prompt engineering techniques to design effective queries and ensure optimal LLM responses for diverse use cases. Master advanced LLM functionalities, including prompt optimization, hyperparameter tuning, and response caching. Implement Retrieval-Augmented Generation (RAG) workflows by integrating vector databases like Pinecone, Weaviate, Supabase, or PGVector for efficient similarity searches. Work with embeddings and build solutions that leverage similarity search for personalized query resolution. Explore and process multimodal data, including image and video understanding and generation. Integrate observability tools for monitoring and evaluating LLM performance to ensure system reliability. Backend Engineering Build and maintain scalable backend systems using Python frameworks such as FastAPI, Django, or Flask. Design and implement RESTful APIs for seamless communication between systems and services. Optimize database performance with relational databases (PostgreSQL, MySQL) and integrate vector databases (Pinecone, PGVector, Weaviate, Supabase) for advanced AI workflows. Implement asynchronous programming and adhere to clean code principles for maintainable, high-quality code. Seamlessly integrate third-party SDKs and APIs, ensuring robust interoperability with external systems. Develop backend pipelines for handling multimodal data processing, and supporting text, image, and video workflows. Manage and schedule background tasks with tools like Celery, cron jobs, or equivalent job queuing systems. Leverage containerization tools such as Docker for efficient and reproducible deployments. Ensure security and scalability of backend systems with adherence to industry best practices. Qualifications Essential: Strong Programming Skills: Proficiency in Python and experience with backend frameworks like FastAPI, Django, or Flask. Generative AI Expertise: Knowledge of frameworks like LangChain, Llama-Index, or similar tools, with experience in prompt engineering and Retrieval-Augmented Generation (RAG). Data Management: Hands-on experience with relational databases (PostgreSQL, MySQL) and vector databases (Pinecone, Weaviate, Supabase, PGVector) for embeddings and similarity search. Machine Learning Knowledge: Familiarity with LLMs, embeddings, and multimodal AI applications involving text, images, or video. Deployment Experience: Proficiency in deploying AI models in production environments using Docker and managing pipelines for scalability and reliability. Testing and Debugging: Strong skills in writing and managing unit and integration tests (e.g., Pytest), along with application debugging and performance optimization. Asynchronous Programming: Understanding of asynchronous programming concepts for handling concurrent tasks efficiently. Preferred Cloud Proficiency: Familiarity with platforms like AWS, GCP, or Azure, including serverless applications and VM setups. 
Frontend Basics: Understanding of HTML, CSS, and optionally JavaScript frameworks like Angular or React for better collaboration with frontend teams. Observability and Monitoring: Experience with observability tools to track and evaluate LLM performance in real-time. Cutting-Edge Tech: Awareness of trends in generative AI, including multimodal AI applications and advanced agentic workflows. Security Practices: Knowledge of secure coding practices and backend system hardening. Certifications: Relevant certifications in AI, machine learning, or cloud technologies are a plus. Skills: postgresql,rag,prompt engineering,integration testing,genai,llama-index,langchain,flask,weaviate,asynchronous programming,unit testing,docker,fastapi,pinecone,multimodal ai applications,django,gcp,mysql,llm,supabase,observability tools,aws,retrieval-augmented generation (rag),pgvector,azure,python
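Editor's note: as a purely illustrative aside on the Retrieval-Augmented Generation workflow this role centers on, the sketch below shows the retrieval step in miniature using plain NumPy cosine similarity. The embed function is a toy stand-in for a real embedding model, and no specific vector-database API (Pinecone, Weaviate, Supabase, PGVector) is assumed.

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy embedding (hash-seeded RNG) -- a stand-in for a real embedding model/API."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.normal(size=dim)
    return vec / np.linalg.norm(vec)

DOCUMENTS = [
    "Refunds are processed within 5 business days.",
    "Premium support is available 24x7 for enterprise plans.",
    "Invoices can be downloaded from the billing dashboard.",
]
INDEX = np.vstack([embed(doc) for doc in DOCUMENTS])  # in-memory stand-in for a vector store

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Return the top_k documents by cosine similarity to the query embedding."""
    scores = INDEX @ embed(query)              # vectors are unit-norm, so dot product = cosine
    best = np.argsort(scores)[::-1][:top_k]
    return [DOCUMENTS[i] for i in best]

if __name__ == "__main__":
    context = retrieve("How long do refunds take?")
    prompt = "Answer using only this context:\n" + "\n".join(context) + "\n\nQ: How long do refunds take?"
    print(prompt)  # this augmented prompt is what would be sent to the LLM
```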
Posted 12 hours ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
*Hiring Alert*: *Job Title: Java Spring Boot Developer*
Openings: 3 positions at *Code Decode Labs*
Location: Baner, Pune
Experience: 4–6 years
Employment Type: Full-time
Joining: Immediate Joiners

*Job Description:*
We are seeking highly skilled Java Spring Boot Developers to join our development team. The ideal candidate should have a strong background in building scalable, high-quality, and high-performance backend systems using Java and Spring Boot.

*Required Skills:*
1. Strong programming experience in Java 8+.
2. Proficient in Spring Boot, Spring MVC, and Spring Data JPA.
3. Experience with RESTful API design and integration.
4. Solid understanding of Microservices architecture and its best practices.
5. Experience with Hibernate, JPA, and relational databases such as PostgreSQL, MySQL, or Oracle.
6. Familiarity with Git, Maven/Gradle, and build/deployment tools.
7. Basic knowledge of containerization tools (e.g., Docker) is a plus.
8. Exposure to CI/CD pipelines is a plus.
9. Familiarity with cloud platforms (AWS, GCP, or Azure) is desirable.

Please apply or refer suitably matching candidates at careers@codedecodelabs.com
www.codedecodelabs.com
Posted 12 hours ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Data Engineer - Azure Databricks, PySpark, Python, Airflow - Chennai/Pune, India (3-6 years experience only)

YOU’LL BUILD TECH THAT EMPOWERS GLOBAL BUSINESSES

Our Connect Technology teams are working on our new Connect platform, a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on Connect data and insights to innovate and grow.

As a Junior Data Engineer, you’ll be part of a team of smart, highly skilled technologists who are passionate about learning and supporting cutting-edge technologies such as Spark, Scala, PySpark, Databricks, Airflow, SQL, Docker, Kubernetes, and other data engineering tools. These technologies are deployed using DevOps pipelines leveraging Azure, Kubernetes, Jenkins, and Bitbucket/GitHub.

Responsibilities
Develop, test, troubleshoot, debug, and make application enhancements leveraging Spark, PySpark, Scala, Pandas, Databricks, Airflow, and SQL as the core development technologies.
Deploy application components using CI/CD pipelines.
Build utilities for monitoring and automating repetitive functions.
Collaborate with Agile cross-functional teams - internal and external clients including Operations, Infrastructure, and Tech Ops.
Collaborate with the Data Science team to productionize the ML models.
Participate in a rotational support schedule to provide responses to customer queries and deploy bug fixes in a timely and accurate manner.

Qualifications
3-6 years of applicable software engineering experience.
Strong fundamentals with experience in Big Data technologies: Spark, PySpark, Scala, Pandas, Databricks, Airflow, SQL.
Must have experience in cloud technologies, preferably Microsoft Azure.
Must have experience in performance optimization of Spark workloads.
Good to have experience with DevOps technologies such as GitHub, Kubernetes, Jenkins, Docker.
Good to have knowledge of Snowflake.
Good to have knowledge of relational databases, preferably PostgreSQL.
Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business.
Minimum B.S. degree in Computer Science, Computer Engineering or a related field.

Additional Information
Enjoy a flexible and rewarding work environment with peer-to-peer recognition platforms.
Recharge and revitalize with the help of wellness plans made for you and your family.
Plan your future with financial wellness tools.
Stay relevant and upskill yourself with career development opportunities.

Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee-Assistance-Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce.
We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
Posted 12 hours ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us: At Calfus, we are known for delivering cutting-edge AI agents and products that transform businesses in ways previously unimaginable. We empower companies to harness the full potential of AI, unlocking opportunities they never imagined possible before the AI era. Our software engineering teams are highly valued by customers, whether start-ups or established enterprises, because we consistently deliver solutions that drive revenue growth. Our ERP solution teams have successfully implemented cloud solutions and developed tools that seamlessly integrate with ERP systems, reducing manual work so teams can focus on high-impact tasks. None of this would be possible without talent like you! Our global teams thrive on collaboration, and we’re actively looking for skilled professionals to strengthen our in-house expertise and help us deliver exceptional AI, software engineering, and solutions using enterprise applications. As one of the fastest-growing companies in our industry, we take pride in fostering a culture of innovation where new ideas are always welcomed—without hesitation. We are driven and expect the same dedication from our team members. Our speed, agility, and dedication set us apart, and we perform best when surrounded by high-energy, driven individuals. To continue our rapid growth and deliver an even greater impact, we invite you to apply for our open positions and become part of our journey!

About the role: The Technical Project Manager will be responsible for the successful delivery of complex technical projects, ensuring projects are completed on time, within budget, and meet customer requirements. This role requires strong leadership, project management, and technical skills. You will manage project teams, define project scope, create timelines, allocate resources, and monitor project progress to meet delivery milestones. You will also act as a liaison between the technical teams and stakeholders to ensure effective communication and issue resolution.

What You’ll Do:
Extensively manage product and software engineering projects.
Connect and empathize with customers, discover their needs, translate them into requirements, and engage for clarity.
Conduct negotiations with customers on priorities and deliver emergent products.
Establish a strong engineering team grounded in respect, empathy, and exceptional engineering practices, such as seeking diverse solutions and questioning the status quo while respecting opinions.
Partner with principal engineers across Calfus locations to deliver best-in-class products and solutions to customers.
Delight customers through high-quality solutions delivered through strong software engineering principles.
Bring breadth of experience covering diverse customer verticals and/or different businesses.
Stitch together complete workflows that enable customers to accelerate their business by demonstrating an understanding of business needs.
Bring a strong programming background in any one or more of the following languages – Node JS/Java/Python, React.js/Angular.
Bring experience in cloud architecture and technology.
Bring in-depth experience with any one or more of the application/front-end stack, middle layer, and backend stack; working with data from ingestion to dashboards is a huge plus.
Anchor engineering teams as the go-to person who understands issues on the ground and helps engineers become unstuck.
Bring a strong foundation in agile ways of working.
On your first day, we'll expect you to have:
10+ years of relevant experience in project management, with at least 5+ years in a technical or IT-focused environment.
Proven track record of delivering complex projects on time, within scope, and on budget.
Strong understanding of software development processes, methodologies (Agile, Scrum, Waterfall), and tools.
Experience working with technical teams and an understanding of system architectures, software engineering, and infrastructure.
Familiarity with development platforms and technologies such as cloud platforms (AWS, Azure), databases, and APIs.
Ability to motivate and guide cross-functional teams to achieve project goals.
Ability to communicate effectively with both technical and non-technical stakeholders.
Ability to make decisions based on data and insights and drive issues to resolution.
Proficiency in project management tools (e.g., Jira, Asana, MS Project, Trello).
Familiarity with development tools (e.g., Git, Jenkins, Docker).

We'd be super excited if you have:
PMP, ScrumMaster, or other relevant certifications (a plus).
Understanding of DevOps and CI/CD practices.

Benefits: At Calfus, we value our employees and offer a strong benefits package. This includes medical, group, and parental insurance, coupled with gratuity and provident fund options. Further, we support employee wellness and provide birthday leave as a valued benefit.

Calfus Inc. is an Equal Opportunity Employer. We believe diversity drives innovation. We’re committed to creating an inclusive workplace where everyone—regardless of background, identity, or experience—has the opportunity to thrive. We welcome all applicants!
Posted 12 hours ago
0 years
12 - 16 Lacs
Pune, Maharashtra, India
On-site
Job Description We are seeking a skilled Generative AI Engineer with a strong background in Python to join our dynamic team. In this role, you will integrate backend development expertise with the latest advancements in AI to create impactful solutions. If you excel in a fast-paced environment and enjoy tackling complex challenges, we encourage you to apply. Key Responsibilities Generative AI Development Develop and implement generative AI models using frameworks like LangChain or Llama-Index. Apply prompt engineering techniques to design effective queries and ensure optimal LLM responses for diverse use cases. Master advanced LLM functionalities, including prompt optimization, hyperparameter tuning, and response caching. Implement Retrieval-Augmented Generation (RAG) workflows by integrating vector databases like Pinecone, Weaviate, Supabase, or PGVector for efficient similarity searches. Work with embeddings and build solutions that leverage similarity search for personalized query resolution. Explore and process multimodal data, including image and video understanding and generation. Integrate observability tools for monitoring and evaluating LLM performance to ensure system reliability. Backend Engineering Build and maintain scalable backend systems using Python frameworks such as FastAPI, Django, or Flask. Design and implement RESTful APIs for seamless communication between systems and services. Optimize database performance with relational databases (PostgreSQL, MySQL) and integrate vector databases (Pinecone, PGVector, Weaviate, Supabase) for advanced AI workflows. Implement asynchronous programming and adhere to clean code principles for maintainable, high-quality code. Seamlessly integrate third-party SDKs and APIs, ensuring robust interoperability with external systems. Develop backend pipelines for handling multimodal data processing, and supporting text, image, and video workflows. Manage and schedule background tasks with tools like Celery, cron jobs, or equivalent job queuing systems. Leverage containerization tools such as Docker for efficient and reproducible deployments. Ensure security and scalability of backend systems with adherence to industry best practices. Qualifications Essential: Strong Programming Skills: Proficiency in Python and experience with backend frameworks like FastAPI, Django, or Flask. Generative AI Expertise: Knowledge of frameworks like LangChain, Llama-Index, or similar tools, with experience in prompt engineering and Retrieval-Augmented Generation (RAG). Data Management: Hands-on experience with relational databases (PostgreSQL, MySQL) and vector databases (Pinecone, Weaviate, Supabase, PGVector) for embeddings and similarity search. Machine Learning Knowledge: Familiarity with LLMs, embeddings, and multimodal AI applications involving text, images, or video. Deployment Experience: Proficiency in deploying AI models in production environments using Docker and managing pipelines for scalability and reliability. Testing and Debugging: Strong skills in writing and managing unit and integration tests (e.g., Pytest), along with application debugging and performance optimization. Asynchronous Programming: Understanding of asynchronous programming concepts for handling concurrent tasks efficiently. Preferred Cloud Proficiency: Familiarity with platforms like AWS, GCP, or Azure, including serverless applications and VM setups. 
Frontend Basics: Understanding of HTML, CSS, and optionally JavaScript frameworks like Angular or React for better collaboration with frontend teams. Observability and Monitoring: Experience with observability tools to track and evaluate LLM performance in real-time. Cutting-Edge Tech: Awareness of trends in generative AI, including multimodal AI applications and advanced agentic workflows. Security Practices: Knowledge of secure coding practices and backend system hardening. Certifications: Relevant certifications in AI, machine learning, or cloud technologies are a plus. Skills: retrieval-augmented generation,postgresql,pinecone,observability tools,prompt engineering,docker,weaviate,asynchronous programming,python,azure,gcp,langchain,unit testing,aws,flask,fastapi,celery,llama-index,supabase,llm,pgvector,mysql,django,generative ai,integration testing
Posted 12 hours ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Hello Connections,
Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, is listed on NASDAQ, operates in over 60 countries, and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. Major delivery centers in India include Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.
Job Title: Mainframe Testing
· Location: Pune, Chennai, Hyderabad, Coimbatore, Bangalore
· Experience: 6 to 10 years (minimum 6 years relevant in mainframe testing)
· Job Type: Contract to hire
· Work Mode: Work from Office (5 days)
· Notice Period: Immediate joiners
Mandatory Skills:
Required Technical Skills: Mainframe Testing, z/OS Mainframe, JCL, DB2, IBM Utilities, TSO/ISPF commands
Good-to-have Technical Skills: Cloud Infrastructure Testing (AWS/Azure/GCP), Test Environment Management, Service
• Should have 6+ years of experience in the testing life cycle process, including creation of test cases/data/execution as per requirement/design
• Should have good knowledge of editing or creating JCL to submit the test batch jobs
• Should be aware of TSO/ISPF commands in Mainframe
• Good knowledge in analyzing the logs in Spool for abended jobs and providing the root cause of the issue for further analysis to the development/support team
• Work with IT developers to analyze the COBOL program, investigate issues, and identify input and output files
• Able to edit Mainframe files using layouts/copybooks with File-AID to modify data according to testing requirements
• Verify the database in DB2 or output files to validate the outputs (a small output-verification sketch follows this listing)
• Test data preparation according to test requirements
• Experienced in the STLC (Software Testing Life Cycle) or Agile methodology, and in preparing test closure reports/sign-off for testing
Key Responsibilities:
• Creation of Test Strategy/Test Plan documents to define the scope and approach of testing
• Analyze the requirements, identify test scenarios, and design the test cases
• Prepare the test data/test JCL according to test scenarios
• Execute test cases by submitting jobs and analyzing the results
• Report issues and coordinate with the development/support team to fix errors
• Participate in capability building and upskilling programs; contribute to training programs in the practice
• Support practice associates in respective domains with relevant expertise
Card domain experience should be at least 3 years.
Screening details requested with application: relevant experience in Mainframe testing, JCL, DB2, VSAM, CICS, card domain, TSO/ISPF; notice period; availability for virtual interview L1 on 2nd August (yes/no); CTC; ECTC.
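As a small illustration of the output-file verification step above, here is a hedged Python sketch that slices fixed-width records using a copybook-style layout and compares them with expected results. The file names, field names, and offsets are all placeholders, not part of the posting.

```python
# Illustrative sketch: verify a fixed-width mainframe output file against
# expected values. The record layout below is hypothetical; in practice it
# would be derived from the program's copybook.
import csv

# Hypothetical layout: field name -> (start offset, length), zero-based.
LAYOUT = {
    "ACCOUNT_ID": (0, 10),
    "TXN_AMOUNT": (10, 9),
    "STATUS":     (19, 2),
}

def parse_record(line: str) -> dict:
    """Slice one fixed-width record into named fields per LAYOUT."""
    return {name: line[start:start + length].strip()
            for name, (start, length) in LAYOUT.items()}

def verify(output_path: str, expected_csv: str) -> list[str]:
    """Compare parsed output records with expected rows; return mismatches."""
    with open(expected_csv, newline="") as f:
        expected = {row["ACCOUNT_ID"]: row for row in csv.DictReader(f)}
    mismatches = []
    with open(output_path) as f:
        for line_no, line in enumerate(f, start=1):
            rec = parse_record(line.rstrip("\n"))
            exp = expected.get(rec["ACCOUNT_ID"])
            if exp is None:
                mismatches.append(f"line {line_no}: unexpected account {rec['ACCOUNT_ID']}")
            elif rec != {k: exp[k] for k in LAYOUT}:
                mismatches.append(f"line {line_no}: {rec} != expected {exp}")
    return mismatches

if __name__ == "__main__":
    for issue in verify("batch_output.dat", "expected_results.csv"):
        print(issue)
```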
Posted 12 hours ago
0.0 - 1.0 years
0 - 0 Lacs
Gurugram, Haryana
On-site
AI Analytics Intern
Company Overview: Branding Pioneers, located at 750 Udyog Vihar, Phase 5, Gurgaon, is a renowned digital marketing agency specializing in tailored online marketing solutions, with a significant emphasis on the healthcare sector.
Location: Gurgaon (Gurugram), Haryana
Duration: 3-6 months
About Branding Pioneers: Branding Pioneers is a premier digital marketing agency specializing in healthcare marketing. Our comprehensive services include SEO, social media marketing, content marketing, and influencer collaborations, all aimed at enhancing our clients' online presence and engagement. brandingpioneers.com
Role Overview: As an AI Analytics Intern, you will work closely with our Data Science and AI teams to extract insights from complex datasets, build predictive models, and support data-driven decision-making across the organization.
Key Responsibilities:
• Assist in collecting, cleaning, and organizing large datasets from various sources.
• Perform exploratory data analysis (EDA) to identify patterns, trends, and anomalies.
• Develop and test predictive models using machine learning algorithms (a minimal EDA-and-model sketch follows this listing).
• Use data visualization tools (e.g., Power BI, Tableau, Matplotlib, Seaborn) to present analytical findings.
• Interpret data and communicate insights to technical and non-technical stakeholders.
• Support the team in developing AI tools, dashboards, and reports.
• Collaborate on A/B testing and optimization initiatives.
• Document processes, models, and results for knowledge sharing.
Qualifications:
• Currently pursuing or recently completed a Bachelor's/Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
• Strong foundation in statistics, data analysis, and machine learning.
• Experience with Python (NumPy, Pandas, Scikit-learn), R, or SQL.
• Familiarity with visualization tools like Power BI and Tableau, or libraries like Plotly.
• Understanding of AI/ML concepts and data modeling techniques.
• Knowledge of cloud platforms (AWS, Google Cloud, Azure) is a plus.
• Excellent analytical and problem-solving skills.
• Strong communication and teamwork abilities.
What You'll Gain:
• Hands-on experience working with AI and analytics projects in a collaborative environment.
• Exposure to real business challenges and how AI-driven insights solve them.
• Mentorship from experienced AI and Data professionals.
• Opportunity to contribute to impactful projects with potential for a full-time offer.
Job Type: Internship (day shift, in person)
Contract length: 6 months
Pay: ₹5,000.00 - ₹10,000.00 per month
Ability to commute/relocate: Gurugram, Haryana: reliably commute or plan to relocate before starting work (Preferred)
Application Question(s): How many years of experience do you have in AI Analytics?
Experience: Analytics: 1 year (Preferred)
Willingness to travel: 25% (Preferred)
Work Location: In person
Application Deadline: 1 August 2025
Expected Start Date: 7 August 2025
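To illustrate the kind of EDA and predictive modelling described above, here is a minimal pandas/scikit-learn sketch. The leads.csv file, its columns, and the conversion target are hypothetical placeholders, not data from the company.

```python
# Minimal EDA + predictive-model sketch (illustrative; the CSV file and
# column names are hypothetical placeholders).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Load and inspect the data.
df = pd.read_csv("leads.csv")          # hypothetical marketing-leads dataset
print(df.describe(include="all"))      # quick summary statistics
print(df.isna().sum())                 # missing-value counts

# Basic cleaning: drop rows missing the target, fill numeric gaps with medians.
df = df.dropna(subset=["converted"])
numeric_cols = df.select_dtypes("number").columns
df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].median())

# Train a simple baseline classifier to predict lead conversion.
X = df[["page_views", "time_on_site", "ad_clicks"]]   # hypothetical features
y = df["converted"]                                    # hypothetical 0/1 target
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```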
Posted 12 hours ago
8.0 years
0 Lacs
Mohali district, India
On-site
Job Summary: We are looking for an experienced Tech Lead/Software Architect to lead projects and manage the technology team. The ideal candidate should have a strong personality, excellent communication skills, and hands-on coding expertise. This role requires a strategic thinker who can drive technical excellence, ensure best practices, and provide architectural guidance while being actively involved in development when needed.
Key Responsibilities:
• Lead and mentor the development team, ensuring smooth execution of projects.
• Architect and design scalable, secure, and high-performance solutions.
• Stay hands-on with coding, reviewing, and debugging to maintain code quality.
• Collaborate with cross-functional teams to define technical roadmaps and project timelines.
• Evaluate and implement best engineering practices, tools, and frameworks.
• Ensure code efficiency, performance, and security standards are met.
• Take ownership of technical decisions, system architecture, and design patterns.
• Guide the team in problem-solving, troubleshooting, and optimizing performance.
• Communicate effectively with stakeholders, management, and clients to align technical goals with business objectives.
Required Skills & Qualifications:
• 8+ years of hands-on experience in software development, system architecture, and technical leadership.
• Proficiency in any backend and frontend technologies (e.g., Node.js, .NET, PHP, Python, React, Angular).
• Strong knowledge of cloud platforms (AWS, Azure, GCP) and DevOps practices.
• Experience with database design and management (SQL and NoSQL).
• Expertise in microservices architecture, APIs, and system scalability.
• Strong problem-solving skills and ability to handle complex technical challenges.
• Excellent communication, leadership, and stakeholder management skills.
• Ability to work in a fast-paced environment and manage multiple projects effectively.
Posted 12 hours ago
2.0 years
6 - 8 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are looking for a highly motivated and skilled Python Backend Developer with 2 years of experience to join our growing team in Chennai. The ideal candidate should have hands-on experience in backend development using Python and Flask, and strong expertise in writing SQL queries. Experience with Elasticsearch is a must.
Key Responsibilities:
• Design, develop, and maintain scalable backend services using Python and Flask.
• Integrate and manage Elasticsearch for optimized search functionalities (a minimal Flask/Elasticsearch sketch follows this listing).
• Write complex and efficient SQL queries for data manipulation and reporting.
• Collaborate with front-end developers, product managers, and QA teams to deliver robust and scalable features.
• Optimize applications for maximum speed and scalability.
• Participate in code reviews and contribute to team best practices and documentation.
Required Skills:
• Strong programming skills in Python, with a focus on backend development.
• Hands-on experience with the Flask web framework.
• Good understanding and working knowledge of Elasticsearch.
• Proficient in SQL, able to write optimized and complex queries.
• Familiarity with RESTful APIs and microservice architecture.
• Good problem-solving and debugging skills.
• Strong communication and collaboration abilities.
Good to Have:
• Experience with Git, Docker, or any CI/CD tools.
• Knowledge of NoSQL databases.
• Exposure to cloud platforms like AWS or Azure.
Skills: elasticsearch, sql, ci/cd tools, aws, backend development, azure, python, restful apis, git, docker, flask, nosql databases, microservice architecture
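As an illustration of the Flask-plus-Elasticsearch integration mentioned above, here is a hedged sketch of a small search endpoint. It assumes the elasticsearch-py 8.x client; the cluster URL, the "products" index, and the "name" field are placeholders.

```python
# Illustrative Flask + Elasticsearch search endpoint (the cluster URL, index
# name, and field names are placeholders).
from elasticsearch import Elasticsearch
from flask import Flask, jsonify, request

app = Flask(__name__)
es = Elasticsearch("http://localhost:9200")  # assumes a local ES 8.x cluster

@app.route("/search")
def search():
    """Full-text search over a hypothetical 'products' index."""
    query = request.args.get("q", "").strip()
    if not query:
        return jsonify({"error": "missing query parameter 'q'"}), 400

    resp = es.search(
        index="products",
        query={"match": {"name": query}},
        size=10,
    )
    hits = [
        {"id": hit["_id"], "score": hit["_score"], **hit["_source"]}
        for hit in resp["hits"]["hits"]
    ]
    return jsonify({"count": len(hits), "results": hits})

if __name__ == "__main__":
    app.run(debug=True)
```

A relational lookup (PostgreSQL/MySQL) would typically sit alongside this for exact reporting queries, with Elasticsearch reserved for full-text and relevance-ranked search.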
Posted 12 hours ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
ROLE PURPOSE: We are seeking a highly skilled Senior Power Automate Developer with extensive experience in designing, developing, and managing Power Automate workflows and solutions. This role involves working closely with GBS business stakeholders, IT teams, and other developers to automate processes and optimize business operations. The ideal candidate will possess advanced development skills, in-depth knowledge of Microsoft Power Platform, and a proven track record of implementing robust automation solutions.
ROLE AND RESPONSIBILITIES:
• Design, develop, and deploy custom Power Automate workflows to automate business processes and integrate various data sources.
• Collaborate with business users to gather and understand requirements and translate these into technical solutions using Power Automate.
• Develop custom connectors, expressions, and advanced workflow logic to extend the capabilities of Power Automate.
• Integrate Power Automate with SharePoint, Power Apps, MS Teams, and other external systems via APIs (a small integration sketch follows this listing).
• Troubleshoot and resolve issues related to Power Automate workflows and integrations.
• Collaborate with Power BI and Power Apps developers to deliver comprehensive Microsoft Power Platform solutions.
• Ensure high-quality code by following best practices, including version control, testing, and documentation.
• Provide mentoring and technical guidance to junior developers on Power Automate best practices and advanced techniques.
• Monitor, maintain, and continuously optimize existing workflows for performance, scalability, and security.
• Stay up to date with the latest Power Platform updates, tools, and best practices to ensure solutions are cutting-edge.
QUALIFICATIONS:
• Experience: 5+ years of development experience with Microsoft Power Automate (Flow) and related technologies.
• Microsoft Power Platform: Extensive hands-on experience with Power Automate, Power Apps, Power BI, and Dataverse.
• Development Expertise: Advanced knowledge of building complex flows, custom connectors, and expressions.
• Scripting: Experience with PowerShell scripting and JSON manipulation.
• API Integration: Strong experience integrating Power Automate with REST/SOAP APIs, databases, and other systems.
• Microsoft 365 Suite: In-depth knowledge of integrating Power Automate with SharePoint, Teams, Outlook, Dynamics 365, and Excel.
• Problem Solving: Proven track record in diagnosing issues, troubleshooting, and optimizing workflows.
• Collaboration: Excellent communication and teamwork skills, with the ability to work effectively with stakeholders at all levels.
• Agile/Scrum: Experience working in Agile or Scrum development environments.
PREFERRED / NICE TO HAVE:
• Experience with Azure Logic Apps and Azure Functions.
• Familiarity with Power Virtual Agents.
• Microsoft Power Platform certifications (Power Automate, Power Apps, or Power BI).
• Knowledge of RPA (Robotic Process Automation) tools and strategies.
To view our Privacy Policy, please click on the link below or copy and paste the URL into your browser: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
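Purely as an illustration of the API-integration item above, here is a hedged sketch of calling an HTTP-request-triggered Power Automate flow from an external Python service. The trigger URL (with its signature) and the payload fields are placeholders; the real URL is generated when the flow's "When an HTTP request is received" trigger is saved.

```python
# Illustrative only: invoke an HTTP-triggered Power Automate flow from an
# external system. The trigger URL (with its SAS signature) and the payload
# schema are placeholders.
import requests

FLOW_TRIGGER_URL = (
    "https://prod-00.centralindia.logic.azure.com:443/workflows/<flow-id>/"
    "triggers/manual/paths/invoke?api-version=2016-06-01&sig=<signature>"
)

def notify_flow(ticket_id: str, priority: str) -> None:
    """POST a JSON payload matching the flow trigger's request schema."""
    payload = {"ticketId": ticket_id, "priority": priority}
    resp = requests.post(FLOW_TRIGGER_URL, json=payload, timeout=30)
    resp.raise_for_status()  # flows without a Response action typically return 202
    print(f"Flow accepted ticket {ticket_id} (HTTP {resp.status_code})")

if __name__ == "__main__":
    notify_flow("INC-1042", "high")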
Posted 12 hours ago
35.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Overview: When it comes to IT solution providers, there are a lot of choices. But when it comes to providers with innovative and differentiating end-to-end service offerings, there's really only one: Zones – First Choice for IT.™ Zones is a Global Solution Provider of end-to-end IT solutions with an unmatched supply chain. Positioned to be the IT partner you need, Zones, a Minority Business Enterprise (MBE) in business for over 35 years, specializes in Digital Workplace, Cloud & Data Center, Networking, Security, and Managed/Professional/Staffing services. Operating in more than 120 countries, leveraging a robust portfolio, and utilizing the highest certification levels from key partners, including Microsoft, Apple, Cisco, Lenovo, Adobe, and more, Zones has mastered the science of building digital infrastructures that change the way business does business, ensuring that whatever clients need, they can Consider IT Done. Follow Zones, LLC on Twitter @Zones, and on LinkedIn and Facebook.
Position Overview: The primary focus of this position is to design, develop, and maintain robust data pipelines using Azure Data Factory, and to implement and manage ETL processes that ensure efficient data flow and transformation.
What you'll do as a BI Developer Lead:
• Design, develop, and maintain robust data pipelines using Azure Data Factory (a minimal pipeline-trigger sketch follows this listing).
• Implement and manage ETL processes to ensure efficient data flow and transformation.
• Develop and maintain data models and data warehouses using Azure SQL Database and Azure Synapse Analytics.
• Create and manage Power BI reports and dashboards to provide actionable insights to stakeholders.
• Ensure data quality, integrity, and security across all data systems.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions.
• Optimize data storage and retrieval processes for performance and cost efficiency.
• Monitor and troubleshoot data pipelines and workflows to ensure smooth operations.
• Create and maintain tabular models for efficient data analysis and reporting.
• Stay updated with the latest Azure services and best practices to continuously improve data infrastructure.
What will you bring to the team:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Azure Data Engineer certification or related Azure certifications will be an added advantage.
• Experience with machine learning and AI services on Azure will be an added advantage.
• Proven experience in designing and maintaining data pipelines using Azure Data Factory.
• Strong proficiency in SQL and experience with Azure SQL Database.
• Hands-on experience with Azure Synapse Analytics and Azure Data Lake Storage.
• Proficiency in creating and managing Power BI reports and dashboards.
• Knowledge of Azure DevOps for CI/CD pipeline implementation.
• Strong problem-solving skills and attention to detail.
• Excellent communication and collaboration skills.
• Knowledge of data governance and compliance standards.
Zones offers a comprehensive benefits package. While we're committed to providing top-tier solutions, we're just as committed to supporting our own teams. We offer a competitive compensation package where our team members are rewarded based on their performance and recognized for the value they bring to our business. Our team members enjoy a variety of comprehensive benefits, including medical insurance coverage, group term life and personal accident cover to handle the uncertainties of life, and a flexible leave policy to balance their work life.
At Zones, work is more than a job – it's an exciting career immersed in an inventive, collaborative culture. If you're interested in working on the cutting edge of IT innovation, sales, engineering, operations, administration, and more, Zones is the place for you! All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or on the basis of disability.
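As a small illustration of working with Azure Data Factory pipelines programmatically, here is a hedged sketch using the azure-identity and azure-mgmt-datafactory packages. The subscription, resource group, factory, pipeline, and parameter names are placeholders, and the client calls follow the SDK's documented pattern, which may vary slightly between versions.

```python
# Illustrative sketch: trigger an Azure Data Factory pipeline run and poll its
# status. Subscription, resource group, factory, and pipeline names are
# placeholders; the calls follow the azure-mgmt-datafactory SDK pattern.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-analytics"          # placeholder
FACTORY_NAME = "adf-sales-ingestion"     # placeholder
PIPELINE_NAME = "pl_daily_sales_load"    # placeholder

def run_pipeline() -> None:
    credential = DefaultAzureCredential()
    adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

    # Kick off a run, optionally overriding pipeline parameters.
    run = adf_client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
        parameters={"loadDate": "2025-08-01"},  # placeholder parameter
    )
    print(f"Started pipeline run {run.run_id}")

    # Poll until the run reaches a terminal state.
    while True:
        status = adf_client.pipeline_runs.get(
            RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
        print(f"Status: {status}")
        if status in ("Succeeded", "Failed", "Cancelled"):
            break
        time.sleep(30)

if __name__ == "__main__":
    run_pipeline()
```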
Posted 12 hours ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
The ideal candidate will be familiar with the full software design life cycle. They should have experience in designing, coding, testing, and consistently managing applications. They should be comfortable coding in a number of languages and have the ability to test code in order to maintain high-quality code.
Desired Skills (must have): ASP.NET, Angular, Azure
Location: Ahmedabad only; immediate joiners preferred.
Responsibilities:
• Design, code, test, and manage various applications
• Collaborate with the engineering team and product team to establish the best products
• Follow outlined standards of quality related to code and systems
• Develop automated tests and conduct performance tuning
Qualifications:
• Bachelor's degree in Computer Science or a relevant field
• 3+ years of experience working with .NET or relevant experience
• Experience developing web-based applications in C#, HTML, JavaScript, VBScript/ASP, or .NET
• Experience working with MS SQL Server and MySQL
• Knowledge of practices and procedures for the full software design life cycle
• Experience working in an agile development environment
Posted 12 hours ago