2.0 - 5.0 years
4 - 7 Lacs
Ahmedabad
Work from Office
Roles and Responsibilities: Collaborate with stakeholders to understand business requirements and data needs. Translate business requirements into scalable and efficient data engineering solutions. Design, develop, and maintain data pipelines using AWS serverless technologies. Implement data modeling techniques to optimize data storage and retrieval processes. Develop and deploy data processing and transformation frameworks for real-time and batch processing. Ensure data pipelines are scalable, reliable, and performant for large-scale data sizes. Implement data documentation and observability tools and practices to monitor...
Posted 5 days ago
3.0 - 5.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Position summary: We are seeking a Senior Software Development Engineer – Data Engineering with 3-5 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. Key Responsibilities: Work with cloud-based data solutions (Azure, AWS, GCP). Implement data modeling and warehousing solutions. Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes. Design and optimize data storage solutions, including data warehouses and data lakes. Ensure data quality and integrity through data validation, cleansing, and error handling. Collaborate with data analysts, data architects, and software engineers to understand data requirements and deliver relevant data sets (e.g., for business intelligence). Implement data security measures and access controls to protect sensitive information. Monitor and troubleshoot issues in data pipelines, notebooks, and SQL queries to ensure seamless data processing. Develop and maintain Power BI dashboards and reports. Work with DAX and Power Query to manipulate and transform data. Basic Qualifications: Bachelor's or master's degree in Computer Science or Data Science. 3-5 years of experience in data engineering, big data processing, and cloud-based data platforms. Proficiency in SQL, Python, or Scala for data manipulation and processing. Proficiency in developing data pipelines using Azure Synapse, Azure Data Factory, and Microsoft Fabric. Experience with Apache Spark, Databricks, and Snowflake is highly beneficial for handling big data and cloud-based analytics solutions. Preferred Qualifications: Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub). Experience with BI and analytics tools (Tableau, Power BI, Looker). Familiarity with data observability tools (Monte Carlo, Great Expectations). Contributions to open-source data engineering projects.
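To illustrate the kind of Spark-based ETL work this role describes, here is a minimal PySpark sketch of a batch pipeline (the input path, column names, and validation rule are hypothetical, not taken from the posting):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV data (placeholder path)
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: cast types and apply a basic data-quality filter
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)

# Aggregate and load: write partitioned Parquet to a curated zone
daily = orders.groupBy("order_date").agg(F.sum("amount").alias("daily_revenue"))
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_revenue")

The same pattern carries over to Databricks- or Snowflake-backed pipelines; only the source and sink connectors change.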
Posted 5 days ago
3.0 - 6.0 years
3 - 6 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Responsibilities: Responsible for ETL/data processing and data transformation. Monitor data traffic and data processing schedules. Capture data changes and escalate issues as needed. Improve and optimize existing data processes. Handle ad-hoc data tool development and customization. Identify data inconsistencies and variances for resolution. Manage timelines and produce regular reports. Execute complex calculations and validations. Perform filing validations and sanity checks. Conduct logical reasoning and analysis to solve complex data-related problems. Required Skills: Strong proficiency in SQL, including: Writing and optimizing complex queries. Working with intricate data structures. Extracting insights from raw data. Strong data analysis and logical reasoning abilities: Analyze complex datasets. Trace data through intricate logic. Apply critical thinking to resolve data issues. Ability to transform and manipulate raw data based on business/client logic. Gather requirements effectively from clients regarding: Data transformation needs. Reporting expectations. Excellent communication skills (verbal and written) to understand and meet client expectations. Familiarity with ETL and reporting tools (preferred but not mandatory): Tools such as SSRS or Power BI for reporting and visualization. Qualifications: Bachelor's degree in a relevant field, preferably with a focus on Mathematics or Statistics. Preferred Experience & Competencies: Prior experience in the finance industry. Strong understanding of ETL concepts and data pipelines. Proven ability to: Communicate clearly and consistently with clients. Solve advanced problems and explain complex concepts to team/client. Manage multiple competing tasks efficiently in a fast-paced environment. Work independently and as part of a collaborative team. Remain calm and professional under pressure. Understand industry trends and drive process innovation.
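As a small illustration of the window-function style of "complex query" this role calls for, here is a self-contained sketch using Python's built-in sqlite3 module (assumes an SQLite build with window-function support; the table, columns, and figures are invented):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE positions (account TEXT, trade_date TEXT, balance REAL);
    INSERT INTO positions VALUES
        ('A001', '2024-01-01', 100.0),
        ('A001', '2024-01-02', 150.0),
        ('A001', '2024-01-03', 90.0);
""")

# Compute day-over-day variances per account with a window function
query = """
SELECT account, trade_date, balance,
       balance - LAG(balance) OVER (
           PARTITION BY account ORDER BY trade_date
       ) AS day_change
FROM positions
"""
for row in conn.execute(query):
    print(row)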
Posted 6 days ago
1.0 - 5.0 years
4 - 7 Lacs
Pune
Work from Office
Works with clients to finalize desired technical specifications and application design. Codes, tests, debugs, and documents complex programs, and enhances existing programs to ensure that data processing production systems continue to meet user requirements. Develops and maintains application design, program specification documents, and proprietary web applications. Contributes effectively as a member of the team; takes ownership of individual assignments and projects with moderate oversight. Manages and updates the issue-tracking system when gaps in code and documentation are discovered. Designs and develops software and mobile applications for external clients. Works with the project lead and internal stakeholders to formulate the sprint backlog. Develops detailed system design specifications to serve as a guide for system/program development. Identifies and resolves system operating problems in order to provide continuous business operations. Interacts with user management regarding project status and user requirements to promote an environment with improved productivity and satisfaction. Provides technical leadership and training to Software Engineers I. Assists in scheduling, determining manpower requirements, and estimating costs to project completion in order to meet user requirements. Develops new control applications from a set of specifications and tests new and modified control applications. Provides remote support for field personnel as they install and troubleshoot new applications. Provides on-site support for some scheduled installations and upgrades, and end-user phone support, primarily concerning application issues. Creates documentation for configurations and how to implement and test the applications. Experience & Educational Requirements: Bachelor's Degree in Computer Science, Information Technology, or any other related discipline, or equivalent related experience. 2+ years of directly related or relevant experience, preferably in software design and development. **Developer, Senior Developer, Team Lead & Management positions available, depending on experience. Skills & Knowledge: Behavioral Skills: Critical Thinking, Detail Oriented, Interpersonal Communication, Learning Agility, Problem Solving, Time Management
Posted 6 days ago
0.0 - 3.0 years
10 - 14 Lacs
Mumbai, Hyderabad
Work from Office
Analyze, research, and document laws and regulations and their potential impact on products (e.g., tax returns). Research and respond to client inquiries as directed by management. Prioritizes and assesses risk for correspondence and assessments. About the Role: Research and monitor trade compliance content from authorized government websites across multiple countries. Track and analyze Denied Parties Lists, sanctions, and embargo regulations to ensure compliance with global trade laws. Extract, interpret, and convert government legislation related to Denied Party Screening (DPS) into standardized formats using MS Excel. Transform complex regulatory data into software-compatible formats for system integration. Perform daily monitoring of regulatory changes to update denied party lists according to established SLAs. Validate data integrity by comparing source information with system data to ensure accuracy. Utilize various translation tools to process international trade compliance information. Create and maintain comprehensive documentation of processes and work instructions. Apply technological solutions to meet client compliance needs and improve data processing efficiency. Ensure timely updates of trade compliance databases to maintain regulatory adherence. About You: 1+ years of experience in DPS (Denied Party Screening). Experience in Import/Export Operations, Excel, Power BI, and SQL. Experience tracking and analyzing Denied Parties Lists, sanctions, and embargo regulations to ensure compliance with global trade laws. What's in it For You? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency.
Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
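As a rough sketch of the source-versus-system validation step described above (a minimal pandas example; the entity names, columns, and output file are hypothetical):

import pandas as pd

# Hypothetical extracts: a government denied-parties list and the system's current data
source = pd.DataFrame({"entity": ["Acme Corp", "Globex Ltd"], "country": ["XX", "YY"]})
system = pd.DataFrame({"entity": ["Acme Corp"], "country": ["XX"]})

# Normalise names before comparison to reduce spurious mismatches
for df in (source, system):
    df["key"] = df["entity"].str.strip().str.upper()

# Entries present on the source list but missing from the system
missing = source[~source["key"].isin(system["key"])]
missing.drop(columns="key").to_csv("dps_gap_report.csv", index=False)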
Posted 6 days ago
6.0 - 11.0 years
10 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Design systems for local data processing. Implement solutions for IoT and real-time analytics. Ensure system security and performance.
Posted 6 days ago
0.0 - 1.0 years
2 - 3 Lacs
Noida
Hybrid
3-6 months of experience in Data Entry, proficiency in MS Excel, and good communication skills required. Registration link: https://forms.gle/aZSF8HKPgmrBb68W8 BU: Study Abroad - upGrad. Job Title: Intern - SEO (Data Entry). Location: Noida Sec 125 (Hybrid). Experience Required: 3-6 months. Salary: up to 3 LPA (3-month internship plus PPO). Job Summary: We are seeking a candidate with 3-6 months of experience in Data Entry and MS Excel with good communication skills. Key Responsibilities: Data entry in Excel of SEO-related data (e.g., keyword rankings, content status, SEO performance metrics). Perform data entry tasks such as updating and maintaining content on the CMS. Support the SEO team in organizing and managing campaign data. Learn and adapt to SEO best practices, staying updated on the latest trends. Requirements: Graduate in any field. Knowledge of MS Excel. Good communication skills.
Posted 6 days ago
5.0 - 10.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Senior Backend Engineer - Phronetic. Who We Are: Phronetic, the artificial intelligence division of Infibeam Avenues Limited, is building the foundational platform for the "agentic economy," moving beyond simple chatbots to create an ecosystem for autonomous AI agents. Powered by Infibeam and trusted by industry leaders like Rediff, they aim to provide tools for developers to launch, manage, and monetize AI agents as "digital coworkers." The Challenge: The current AI stack is fragmented, leading to issues with multimodal data, silent webhook failures, unpredictable token usage, and nascent agent-to-agent collaboration. Phronetic is building a unified, robust backend to resolve these issues for the developer community. Your Mission: As a foundational member of the backend team, you will architect core systems, focusing on: Agent Nervous System: Designing agent-to-agent messaging, lifecycle management, and high-concurrency, low-latency communication. Multimodal Chaos Taming: Engineering systems to process and understand real-time images, audio, video, and text. Bulletproof Systems: Developing secure, observable webhook systems with robust billing, metering, and real-time payment pipelines. What You'll Bring: Phronetic seeks an experienced engineer comfortable with complex systems and ambiguity. Core Experience: Typically 5+ years of experience in backend engineering roles. Expertise in Python, especially with async frameworks like FastAPI. Strong command of Docker and cloud deployment (AWS, Cloud Run, or similar). Proven experience designing and building microservice or agent-based architectures. Specialized Experience (Ideal): Real-Time Systems: Experience with real-time media transport such as WebRTC and WebSockets, and with processing those streams. Scalable Systems: Experience building scalable, fault-tolerant systems with a strong understanding of observability, monitoring, and alerting best practices. Reliable Webhooks: Knowledge of scalable webhook infrastructure with retry logic, backoffs, and security. Data Processing: Experience with multimodal data (e.g., OCR, audio transcription, video chunking with FFmpeg/OpenCV). Payments & Metering: Familiarity with usage-based billing systems or token-based ledgers. Bonus Skills: Knowledge of Rust and Android development is a plus. Your Impact: The systems designed by this role will form the foundation for: Thousands of AI agents for major partners across chat, video, and APIs. A new creator economy enabling developers to earn revenue through agents. The overall speed, security, and scalability of the Phronetic platform. Why Join Us? Opportunity to solve hard problems with clean, scalable code. Small, fast-paced team with high ownership and zero micromanagement. Belief in platform engineering as a craft and care for developer experience. Conviction that AI agents are the future, and a desire to build the platform powering them. Dynamic, collaborative in-office work environment in Bengaluru. Meaningful equity in a growing, well-backed company. Direct work with founders and engineers from top AI companies. A real voice in architectural and product decisions. Opportunity to solve cutting-edge problems with no legacy code. Ready to Build the Future? Phronetic is building the core platform for the next software paradigm. Interested candidates are encouraged to apply with their GitHub, resume, or anything that showcases their thinking.
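To sketch the "reliable webhooks" requirement above, here is a minimal async delivery helper with exponential backoff, in the Python/httpx style the posting's stack suggests (the URL, payload, and retry policy are illustrative assumptions):

import asyncio
import httpx

async def deliver_webhook(url: str, payload: dict, max_attempts: int = 4) -> bool:
    """Attempt webhook delivery, backing off exponentially on failure."""
    async with httpx.AsyncClient(timeout=5.0) as client:
        for attempt in range(max_attempts):
            try:
                resp = await client.post(url, json=payload)
                if resp.status_code < 500:
                    return resp.is_success  # 2xx delivered; 4xx will not improve on retry
            except httpx.TransportError:
                pass  # network error: fall through to backoff and retry
            await asyncio.sleep(2 ** attempt)  # 1s, 2s, 4s, ...
    return False

# asyncio.run(deliver_webhook("https://example.com/hook", {"event": "agent.created"}))

A production version would add request signing, idempotency keys, and a dead-letter queue, as the posting implies.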
Posted 6 days ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Introduction: Are you searching for an opportunity to play a key role in driving the dramatic growth of a highly successful software company? At Poppulo, we're working on what's next in communications and workplace technology. As a pioneer in this industry, we understand that meaningfully reaching every employee is hard. And so is managing office space in a hybrid world. And so is improving the customer and guest experience. We exist to make each of these things easier. We exist to bring harmony to our customers. And we do that at enterprise scale. Our omnichannel employee communications, customer communications, and workplace experience platform is trusted by over 6,000 organizations today, reaching more than 35M employees and delivering content to 500,000+ digital signs. We know there's no such thing as a "perfect" candidate - we're all a work in progress and are growing new skills and capabilities all the time. We encourage you to apply for a position with Poppulo even if you don't meet 100% of the requirements. We believe in fostering an environment where there is a diversity of perspectives, in hopes that we can all thrive. The Opportunity: We are seeking a skilled Senior Software Development Engineer to join our Extensions & Adapters team within Data Integrations. In this role, you'll build and enhance plugins that expand the functionality of our Integration Framework. Your work will enable seamless data retrieval and transformation from a variety of third-party sources, facilitating efficient data handling for our customers. Experience or enthusiasm for AI technologies, especially Large Language Models (LLMs), generative AI, or agentic coding tools, is strongly preferred. Key Responsibilities: Develop and maintain robust extensions, adapters, listeners, actions, and endpoints within our Integration Framework, primarily as Windows applications. Collaborate closely with product management, architects, and integration engineers to understand and translate customer requirements. Enhance the existing infrastructure to support a wide variety of third-party data sources effectively. Write clean, efficient, well-documented code adhering to best practices and coding standards. Participate actively in code reviews, ensuring the quality and integrity of the team's deliverables. Identify opportunities for performance improvements and implement optimized data processing solutions. Explore and integrate AI-driven tools and techniques, leveraging Large Language Models and generative AI technologies. Engage proactively in agile methodologies, contributing to sprints, stand-ups, and retrospectives. Skill & Experience Requirements: Bachelor's degree in Computer Science or a related field. 6+ years of software development experience focused on integration and data transformation. Proven track record in building plugins, adapters, or extensions for enterprise frameworks. Demonstrated enthusiasm or practical exposure to AI and related technologies. Solid experience with C#, .NET Framework, and integration technologies. Proficiency in relational databases (MS SQL). Experience creating APIs and working with third-party integrations. Familiarity with data transformation techniques and event-driven architectures. Knowledge of modern software development practices, including CI/CD and automated testing. Interest or experience in AI technologies, generative AI models (LLMs), and agentic coding tools (Cursor, Windsurf).
Who We Are: We are a values-driven organization that encourages our employees to bring their authentic selves to work every day and empowers everyone to make a tangible impact on our products, clients, and culture. We offer a dynamic environment with driven, fun, and flexible individuals who thrive on challenge and responsibility. This is an opportunity to contribute to our culture and join a company that's on the move. We live the Poppulo values each day, as they are key to everything we do. Bring Your Best Self: We show up authentically, are self-aware, and always strive to be better. See it. Own it. Solve it.: We proactively innovate and solve for our customers and each other. We set an example with high standards for our work. We foster a culture of learning, acknowledging our successes and our failures. Together We're Better: We value and celebrate our diversity. We learn from others, respecting their expertise, and focus on building trust. That's what makes us a team. Named a Great Place to Work in 2015, 2016, 2017, 2018, 2019, 2020, and 2021, we are a fast-growing global technology company, with offices in Ireland, the US, and the UK. Poppulo is an equal opportunity employer. We are committed to protecting your privacy. For details on how we collect, use, and protect your personal information, please refer to our Job Applicant Privacy Policy.
Posted 6 days ago
9.0 - 14.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Introduction: Are you searching for an opportunity to play a key role in driving the dramatic growth of a highly successful software company? At Poppulo, we're working on what's next in communications and workplace technology. As a pioneer in this industry, we understand that meaningfully reaching every employee is hard. And so is managing office space in a hybrid world. And so is improving the customer and guest experience. We exist to make each of these things easier. We exist to bring harmony to our customers. And we do that at enterprise scale. Our omnichannel employee communications, customer communications, and workplace experience platform is trusted by over 6,000 organizations today, reaching more than 35M employees and delivering content to 500,000+ digital signs. We know there's no such thing as a "perfect" candidate - we're all a work in progress and are growing new skills and capabilities all the time. We encourage you to apply for a position with Poppulo even if you don't meet 100% of the requirements. We believe in fostering an environment where there is a diversity of perspectives, in hopes that we can all thrive. The Opportunity: We are seeking a skilled Principal Software Development Engineer to join our Extensions & Adapters team within Data Integrations. In this role, you'll build and enhance plugins that expand the functionality of our Integration Framework. Your work will enable seamless data retrieval and transformation from a variety of third-party sources, facilitating efficient data handling for our customers. Experience or enthusiasm for AI technologies, especially Large Language Models (LLMs), generative AI, or agentic coding tools, is strongly preferred. Key Responsibilities: Develop and maintain robust extensions, adapters, listeners, actions, and endpoints within our Integration Framework, primarily as Windows applications. Collaborate closely with product management, architects, and integration engineers to understand and translate customer requirements. Enhance the existing infrastructure to support a wide variety of third-party data sources effectively. Write clean, efficient, well-documented code adhering to best practices and coding standards. Participate actively in code reviews, ensuring the quality and integrity of the team's deliverables. Identify opportunities for performance improvements and implement optimized data processing solutions. Explore and integrate AI-driven tools and techniques, leveraging Large Language Models and generative AI technologies. Engage proactively in agile methodologies, contributing to sprints, stand-ups, and retrospectives. Skill & Experience Requirements: Bachelor's degree in Computer Science or a related field. 9+ years of software development experience focused on integration and data transformation. Proven track record in building plugins, adapters, or extensions for enterprise frameworks. Demonstrated enthusiasm or practical exposure to AI and related technologies. Solid experience with C#, .NET Framework, and integration technologies. Proficiency in relational databases (MS SQL). Experience creating APIs and working with third-party integrations. Familiarity with data transformation techniques and event-driven architectures. Knowledge of modern software development practices, including CI/CD and automated testing. Interest or experience in AI technologies, generative AI models (LLMs), and agentic coding tools (Cursor, Windsurf).
Who We Are: We are a values-driven organization that encourages our employees to bring their authentic selves to work every day and empowers everyone to make a tangible impact on our products, clients, and culture. We offer a dynamic environment with driven, fun, and flexible individuals who thrive on challenge and responsibility. This is an opportunity to contribute to our culture and join a company that's on the move. We live the Poppulo values each day, as they are key to everything we do. Bring Your Best Self: We show up authentically, are self-aware, and always strive to be better. See it. Own it. Solve it.: We proactively innovate and solve for our customers and each other. We set an example with high standards for our work. We foster a culture of learning, acknowledging our successes and our failures. Together We're Better: We value and celebrate our diversity. We learn from others, respecting their expertise, and focus on building trust. That's what makes us a team. Named a Great Place to Work in 2015, 2016, 2017, 2018, 2019, 2020, and 2021, we are a fast-growing global technology company, with offices in Ireland, the US, and the UK. Poppulo is an equal opportunity employer. We are committed to protecting your privacy. For details on how we collect, use, and protect your personal information, please refer to our Job Applicant Privacy Policy.
Posted 6 days ago
1.0 - 3.0 years
4 - 8 Lacs
Gurugram
Work from Office
We are seeking a highly skilled and motivated Senior GIS Data Analyst/Engineer to join our innovative team in India. This role will leverage advanced expertise in GIS, data science, and programming to extract actionable insights from geospatial data, driving impactful business outcomes through cutting-edge visualization and analytical tools. Responsibilities: Data Analysis and Management: Conduct advanced spatial data analysis using GIS software (ArcGIS, QGIS) to derive meaningful insights. Manage, manipulate, and analyze large geospatial datasets to produce high-quality maps and actionable reports. Ensure data accuracy and integrity through rigorous quality control measures and regular audits. Programming and Automation: Develop and implement Python scripts for data processing, analysis, and automation, with proficiency in SQL for querying and managing databases.
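As a small example of the Python automation this role describes (a sketch assuming GeoPandas is available; the file name, CRS, and buffer distance are placeholders):

import geopandas as gpd

# Load a layer, reproject to a metric CRS, and buffer each feature
sites = gpd.read_file("sites.shp").to_crs(epsg=32643)  # UTM zone 43N, metres
sites["buffer_500m"] = sites.geometry.buffer(500)

# Simple quality-control check: flag missing or empty geometries
bad = sites[sites.geometry.is_empty | sites.geometry.isna()]
print(f"{len(bad)} records failed geometry checks")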
Posted 6 days ago
3.0 - 7.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Job Description: Maintain contracts in the contracts repository: keep parent-child linkages and metadata updated, and validate executed contracts. Identify and track issues; coordinate and track remediation actions with the global SS team, vendors, risk teams, and business stakeholders until closure. Support and execute efficient processes: I. Support locating and understanding how to upload, manage, and retrieve documentation across internal tools. II. Support ad-hoc projects on contract management and administration - transition to new tools, etc. III. Exposure to ERP tools, preferably Coupa, including all modules of the procure-to-pay cycle from vendor set-up to the payment process. IV. Exposure to the procure-to-pay cycle, purchase order/contract processing, data analysis and reporting, catalogue management, and vendor management. V. Deep understanding of the contract review process, contract templates, and contract clauses, especially master agreements, NDAs, amendments, etc. Review basic contractual documents (POs, work orders, service orders) from a contractual and commercial perspective to ensure compliance with company policies and practices. Experience working with globally distributed internal and external teams. Collaborates with cross-functional (XFN) teams to meet objectives, business controls, and compliance requirements. Review of contracts from a commercial perspective, to ensure value for money and reduced risk for the organisation. Negotiate cost structures across multiple spend categories to drive savings and avoidance for the organisation. Contribute to the creation and maintenance of spend category strategies. Manage and support the vendor relationship, contract, and high-level sourcing of top spend categories where the products, services, or vendors are used across multiple sites or business units. Be process-driven and project-focused, with the ability to prioritise and complete projects in a set timeframe. Prioritise workload under the direction of your line manager to meet the changing demands of the business and the market. Qualifications: BA/BS degree. 5+ years of related experience. Experience with NetSuite & Coupa (or other ERP systems). Experience working with contract documents such as Statements of Work and Master Service Agreements.
Posted 6 days ago
8.0 - 15.0 years
45 - 55 Lacs
Bengaluru
Work from Office
About Credit Saison India: Established in 2019, CS India is one of the country's fastest growing Non-Bank Financial Company (NBFC) lenders, with verticals in wholesale, direct lending, and tech-enabled partnerships with Non-Bank Financial Companies (NBFCs) and fintechs. Its tech-enabled model coupled with underwriting capability facilitates lending at scale, meeting India's huge gap for credit, especially with underserved and underpenetrated segments of the population. Credit Saison India is committed to growing as a lender and evolving its offerings in India for the long term for MSMEs, households, individuals, and more. CS India is registered with the Reserve Bank of India (RBI) and has an AAA rating from CRISIL (a subsidiary of S&P Global) and CARE Ratings. Currently, CS India has a branch network of 45 physical offices, 1.2 million active loans, an AUM of over US$1.5B, and an employee base of about 1,000 people. Credit Saison India (CS India) is part of Saison International, a global financial company with a mission to bring people, partners, and technology together, creating resilient and innovative financial solutions for positive impact. Across its business arms of lending and corporate venture capital, Saison International is committed to being a transformative partner in creating opportunities and enabling the dreams of people. Based in Singapore, over 1,000 employees work across Saison's global operations spanning Singapore, India, Indonesia, Thailand, Vietnam, Mexico, and Brazil. Saison International is the international headquarters (IHQ) of Credit Saison Company Limited, founded in 1951 and one of Japan's largest lending conglomerates with over 70 years of history, listed on the Tokyo Stock Exchange. The Company has evolved from a credit-card issuer to a diversified financial services provider across payments, leasing, finance, real estate, and entertainment. Roles & Responsibilities: Define and drive the long-term AI engineering strategy and roadmap aligned with the company's business goals and innovation vision, focusing on scalable AI and machine learning solutions including Generative AI. Lead, mentor, and grow a high-performing AI engineering team, fostering a culture of innovation, collaboration, and technical excellence. Collaborate closely with product, data science, infrastructure, and business teams to identify AI use cases, design end-to-end AI solutions, and integrate them seamlessly into products and platforms. Oversee the architecture, development, deployment, and continuous improvement of AI/ML models and systems, ensuring scalability, robustness, and real-time performance. Own the full AI/ML lifecycle including data strategy, model development, validation, deployment, monitoring, and retraining pipelines. Evaluate and incorporate state-of-the-art AI technologies, frameworks, and external AI services (e.g., APIs, pre-trained models) to accelerate delivery and enhance capabilities. Establish and enforce engineering standards, best practices, and observability tools (e.g., MLflow, LangSmith) for model governance, performance tracking, and compliance with data privacy and security requirements. Collaborate with infrastructure and DevOps teams to design and maintain cloud infrastructure optimized for AI workloads, including GPU acceleration and MLOps automation. Manage project timelines, resource allocation, and cross-team coordination to ensure timely delivery of AI initiatives.
Stay abreast of emerging AI trends, research, and tools to continuously evolve the AI engineering function. Required Skills & Qualifications: 10 to 15 years of experience in AI, machine learning, or data engineering roles, with at least 8 years in leadership or managerial positions. A Bachelor's, Master's, or PhD degree from a top-tier college in Computer Science, Statistics, Mathematics, or related quantitative fields is strongly preferred. Proven experience leading AI engineering teams and delivering production-grade AI/ML systems at scale. Strong expertise in machine learning algorithms, deep learning, NLP, computer vision, and Generative AI technologies. Hands-on experience with AI/ML frameworks and libraries such as TensorFlow, PyTorch, Keras, Hugging Face Transformers, LangChain, MLflow, and related tools. Solid understanding of data engineering concepts, ETL pipelines, and working knowledge of distributed computing frameworks (Spark, Hadoop). Experience with cloud platforms (AWS, Azure, GCP) and container orchestration (Kubernetes, Docker). Familiarity with software engineering best practices including CI/CD, version control (Git), and microservices architecture. Strong problem-solving skills with a product-oriented mindset and the ability to translate business needs into technical solutions. Excellent communication skills to collaborate effectively across technical and non-technical teams. Experience in AI governance, model monitoring, and compliance with data privacy/security standards. Preferred Qualifications: Experience building or managing ML platforms or MLOps pipelines. Knowledge of NoSQL databases (MongoDB, Cassandra) and real-time data processing. Prior exposure to AI in specific domains like banking, finance, and credit is a strong plus. This role offers the opportunity to lead AI innovation at scale, shaping the future of AI-powered products and services in a fast-growing, technology-driven environment.
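To make the MLflow-based model-governance point above concrete, here is a minimal experiment-tracking sketch (synthetic data; the run name and logged fields are illustrative, and the API shown is MLflow's 2.x style):

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, random_state=0)

with mlflow.start_run(run_name="baseline"):
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)                       # record hyperparameters
    mlflow.log_metric("train_accuracy", model.score(X, y))  # record performance
    mlflow.sklearn.log_model(model, "model")                # version the artifact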
Posted 6 days ago
2.0 - 7.0 years
3 - 6 Lacs
Mumbai
Work from Office
We are seeking an experienced Azure IoT Systems Developer to develop a comprehensive industrial IoT monitoring and control system. This is a hands-on technical role focused exclusively on Azure cloud services configuration, backend development, and IoT solution architecture. Work Model: Remote with occasional on-site collaboration. No. of positions: 3. Key Functions & Roles of the Candidate: Azure IoT Platform Development: Design and implement Azure IoT Hub configuration for device connectivity and telemetry ingestion. Develop device provisioning services for secure device onboarding. Create Azure Digital Twins models (DTDL) representing industrial equipment and production sites. Implement real-time data synchronization between physical devices and digital twins. Backend Services & Integration: Develop Azure Functions for data processing and business logic implementation. Implement Stream Analytics jobs for real-time telemetry processing. Create batch processing services with complex business rule implementation. Data Management & Analytics: Configure hot and cold data storage solutions (Azure SQL, Cosmos DB, Data Lake). Implement data pipelines for real-time and historical analytics. Develop notification services for system alerts and monitoring. Create data archival and retention policies. Implement CI/CD pipelines for automated deployment. Configure security policies for IoT devices and cloud services. Set up monitoring and logging solutions. Required Technical Skills: Azure IoT Hub & IoT Edge - device connectivity, telemetry ingestion, and edge computing. Azure Digital Twins - DTDL modeling, twin relationships, and queries. Azure Service Bus - message queuing, sessions, and dead-letter handling. Azure Functions - serverless computing and event-driven processing. Azure Stream Analytics - real-time data processing and analytics. Azure API Management - API gateway and security implementation. Key Deliverables: Complete Azure infrastructure setup and configuration. Fully functional IoT data ingestion and processing pipeline. Digital twin implementation with real-time synchronization. Task processing system with business rules engine. Backend APIs for system integration and monitoring. Comprehensive documentation and deployment guides. Unit tests and integration test suites. Requirements: At least 1 year of IoT systems development. Proven experience building end-to-end solutions. Design and develop RESTful APIs using Azure API Management.
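As an indicative sketch of the Azure Functions telemetry processing described above (Python v2 programming model; the hub name, connection setting, and payload fields are assumptions, not taken from the posting):

import json
import logging
import azure.functions as func

app = func.FunctionApp()

@app.event_hub_message_trigger(arg_name="event", event_hub_name="telemetry",
                               connection="EventHubConnection")
def process_telemetry(event: func.EventHubEvent):
    # Parse a device message and flag out-of-range readings
    body = json.loads(event.get_body().decode("utf-8"))
    if body.get("temperature", 0) > 80:
        logging.warning("Overheat alert for device %s", body.get("deviceId"))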
Posted 6 days ago
1.0 - 3.0 years
9 - 13 Lacs
Kochi, Chennai
Work from Office
Key Responsibilities: Apply deep domain knowledge in structural biology to characterize macromolecular complexes and understand their roles in disease pathways. Integrate structural insights with multi-omics data (genomics, transcriptomics, proteomics, and metabolomics) to understand molecular mechanisms of disease. Utilize and develop computational tools for structural modeling, molecular dynamics simulations, and ligand docking to complement experimental data. Stay up to date with the latest advancements in structural biology, biophysics, and computational methods to incorporate cutting-edge research into disease modeling efforts. Collaboration & Project Management: Work closely with cross-functional teams, including disease domain experts, computational scientists, and clinicians to enhance translational research efforts. Work with the Scientific Manager to help in project planning, execution, and reporting, ensuring alignment with research objectives. Communicate findings effectively through reports, presentations, and discussions with internal teams and external collaborators. Publish research findings in high-impact, peer-reviewed journals. Present work at scientific conferences, symposia, and internal research meetings. Contribute to grant applications and funding proposals where relevant. Qualifications & Experience: PhD in structural biology, biophysics, biochemistry, or a closely related field. Strong expertise in at least one major structural biology technique (e.g., X-ray crystallography, cryo-electron microscopy (cryo-EM), NMR spectroscopy). Solid understanding of protein structure-function relationships, macromolecular interactions, and their relevance to disease biology. Experience in experimental design, data collection, data processing, and structure determination. Excellent verbal and written communication skills, with the ability to convey complex structural biology concepts clearly. Preferred: Experience with computational structural biology tools (e.g., Rosetta, AlphaFold, molecular dynamics software). Previous experience working in a multidisciplinary research environment. Familiarity with AI/ML approaches in structural biology or drug discovery. Other Considerations: Fresh PhD graduates will be hired as Postdoctoral Fellows for a two-year term with an opportunity for promotion to Scientist based on performance. Candidates with at least two years of postdoctoral experience in structural biology in either academia or industry will be hired as Scientists with opportunities for career advancement.
Posted 6 days ago
1.0 - 5.0 years
20 - 25 Lacs
Pune
Work from Office
Engineer - Mechanical Design - EMH Crane & Components. Job Description Requirements: Graduate Mechanical Engineer with 1-5 years of experience in design. Crane industry preferred, but not mandatory. Knowledge of basic mechanical & structural design is a must. Familiarity with the rules and regulations of the crane industry, such as IS:3177, IS:807, etc., is desirable. Must have hands-on experience with 3D CAD software like Solid Edge. Design validation documentation and activities. Basic knowledge of mechanical components and machining processes. Working experience with PDM/PLM software like Teamcenter would be an added advantage. Innovative & self-learner. Good communication skills. Roles & Responsibilities: Work on design & detail engineering of EOT & other cranes as per Indian and international standards. Design calculations in an Excel-based tool & detail engineering using Solid Edge software. Provide technical support to marketing during tendering. Document the offer design process. Be innovative and incorporate new ways for efficient and quick offering methods. Should possess the aptitude for learning and self-development. Improve existing designs for better maintainability, ease of manufacturing, cost reduction, etc.
Posted 6 days ago
5.0 - 10.0 years
8 - 9 Lacs
Thiruvananthapuram
Work from Office
What you'll do: Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax Solutions. Manage sole project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities. What experience you need: Bachelor's degree or equivalent experience. 5+ years of software engineering experience. 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS. 5+ years of experience with cloud technology: GCP, AWS, or Azure. 5+ years of experience designing and developing cloud-native solutions. 5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes. 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs. What could set you apart: Knowledge or experience with Apache Beam for stream and batch data processing. Familiarity with big data tools and technologies like Apache Kafka, Hadoop, or Spark. Experience with containerization and orchestration tools (e.g., Docker, Kubernetes). Exposure to data visualization tools or platforms. Primary Location: IND-Trivandrum-Equifax Analytics-PEC. Function: Function - Tech Dev and Client Services. Schedule: Full time
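Apache Beam, listed above as a differentiator, unifies batch and streaming under one pipeline model; here is a minimal batch sketch with the Beam Python SDK (file names and record format are hypothetical):

import apache_beam as beam

# Count events per user from a newline-delimited "user,action" log
with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("events.txt")
        | "KeyByUser" >> beam.Map(lambda line: (line.split(",")[0], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
        | "Write" >> beam.io.WriteToText("user_counts")
    )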
Posted 6 days ago
10.0 - 15.0 years
13 - 17 Lacs
Bengaluru
Work from Office
You'll lead the Data Science function supporting Lending, overseeing credit risk scoring models across PD, LGD, and collections. You'll guide the team in leveraging alternative data to improve model accuracy and signal. You'll lead the full model lifecycle, driving strategy, standards, and execution across model development, validation, deployment, and monitoring. You'll partner with business, risk, and ops leaders to shape the credit roadmap and influence decisions with data-driven insights. You are experienced in leading teams while being hands-on when needed. This role is suited for professionals with 10+ years of experience in data science and risk analytics. You will report to the Head of Data Science, and this role is onsite, based in Bangalore. The Critical Tasks You Will Perform: Lead the team in building predictive models to separate good vs. bad borrowers using ML and traditional methods. Drive development of ML and deep learning models for loss estimation across portfolios. Oversee model validation and performance monitoring across diverse data sets. Guide feature engineering strategies and explore new external data sources. Champion model governance in collaboration with risk, compliance, and audit teams. Ensure timely identification of performance drift and lead model refresh cycles. Communicate modeling outcomes and trade-offs to senior leadership and key stakeholders. Translate analytics into strategic levers: policy, pricing, targeting, and credit expansion. Set the vision for solving hard data science problems using best-in-class ML techniques. The Essential Skills You Need: Deep domain expertise in credit risk modelling across PD, LGD, EAD, and collections. Prior experience in FinTech/credit businesses, especially digital lending or unsecured loan portfolios. Proven track record of applying data science to credit underwriting, risk segmentation, and portfolio management. Expertise in Python for ML model development, with experience building scalable, production-grade solutions. Proficiency in Spark, SQL, and large-scale distributed data processing frameworks. Grasp of advanced ML concepts, including model interpretability, bias mitigation, and performance optimization. Experience with ML/DL libraries (scikit-learn, XGBoost, TensorFlow/PyTorch) and guiding teams on their best use (a toy modelling sketch follows at the end of this posting). Working knowledge of MLOps and orchestration tools (Airflow, MLflow, etc.), with experience standardising model deployment pipelines. Exposure to LLMs and Generative AI, with a perspective on their potential applications in credit risk. Design of robust, reusable feature pipelines from structured and unstructured data sources. Familiarity with Git, CI/CD, and model versioning frameworks as part of scaling DS delivery. Mindset and ability to coach team members through complex modelling issues, with experience aligning technical outputs with business strategy. What We Offer - About Grab and Our Workplace: Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. In Grab, purpose gives us joy and habits build excellence, while harnessing the power of Technology and AI to deliver the mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility.
Life at Grab: We care about your well-being at Grab. Here are some of the global benefits we offer: We have your back with Term Life Insurance and comprehensive Medical Insurance. With GrabFlex, create a benefits package that suits your needs and aspirations. Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave. We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges. What We Stand for at Grab: We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
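For a concrete, if toy, picture of the PD-scoring work this posting centres on, here is a scikit-learn sketch on synthetic data (features, labels, and figures are all invented for illustration):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 3))  # stand-ins for borrower features
y = (X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)
model = LogisticRegression().fit(X_tr, y_tr)

# PD = predicted probability of the "bad" class; AUC/Gini measure separation power
pd_scores = model.predict_proba(X_te)[:, 1]
auc = roc_auc_score(y_te, pd_scores)
print(f"AUC: {auc:.3f}  Gini: {2 * auc - 1:.3f}")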
Posted 6 days ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Join Team Amex and let's lead the way together. Responsible for contacting clients with overdue accounts to secure the settlement of the account; also does preventive work to avoid future overdues with accounts that have high exposure. The eCRMS organization is looking for a hands-on Software Engineer for the Customer 360 (C360) Engineering team. The C360 Platform is a critical platform in American Express that provides a holistic view of the customers' relationships with various American Express products, manages customers' demographics, and provides intelligent insights about customers' contact preferences. This platform is an integral part of all critical user journeys and is at the forefront of all the new initiatives the company is undertaking. In C360, we build and operate highly available and scalable services using event-driven reactive architecture to provide real-time services that power critical use cases across the company. We perform data analysis, work on anomaly detection, and create new data insights. This Engineer role will be an integral part of a team that builds large-scale, cloud-native, event-driven reactive applications to create a 360-degree view of the customer. Specifically, you will help: Lead the build of new microservices that help manage our rapidly growing data hub. Lead the build of services to perform real-time data processing at scale for relational, analytical queries across multi-dimensional data. Lead the build of services that generalize stream processing to make it trivial to source, sink, and stream-process data. Improve the efficiency, reliability, and scalability of our data pipelines. Work on cross-functional initiatives and collaborate with Engineers across organizations. Influence team members with creative changes and improvements by challenging the status quo and demonstrating risk-taking. Be a productivity multiplier for your team by analyzing your workflow and contributing to making the team more effective and productive, demonstrating faster and stronger results. Are you up for the challenge? 5+ years of experience in building large-scale distributed applications with object-oriented design using a Java-related stack. Holds a master's or bachelor's degree in Computer Science, Information Systems, or another related field (or has equivalent work experience). Ability to implement scalable, high-performing, secure, highly available solutions. Proficient in developing solution architecture for business problems and communicating it to large teams. Proficient in weighing the pros and cons of different solution options and gaining alignment on the preferred option with multiple stakeholders. Experience with NoSQL technologies such as Cassandra, Couchbase, etc. Experience with web services and API development on enterprise platforms using REST, GraphQL, and gRPC. Expertise in Big Data technologies like Hive, MapReduce, and Spark. Experience in event-driven microservice architecture: Vert.x, Kafka, etc. Experience with automated release management using Maven, Git, Jenkins. Experience with Vert.x and event-driven architecture is a plus. Experience with Postgres is a plus. Experience with Docker/OpenShift-based deployment is a plus. We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries. Bonus incentives. Support for financial well-being and retirement. Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location). Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need. Generous paid parental leave policies (depending on your location). Free access to global on-site wellness centers staffed with nurses and doctors (depending on location). Free and confidential counseling support through our Healthy Minds program. Career development and training opportunities.
Posted 6 days ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. About the team: The mission of Roku's Data Engineering team is to develop a world-class big data platform so that internal and external customers can leverage data to grow their businesses. Data Engineering works closely with business partners and Engineering teams to collect metrics on existing and new initiatives that are critical to business success. As a Senior Data Engineer working on device metrics, you will design data models and develop scalable data pipelines to capture different business metrics across different Roku devices. About the role: Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and monetize large audiences, and provide advertisers with unique capabilities to engage consumers. Roku streaming players and Roku TV models are available around the world through direct retail sales and licensing arrangements with TV brands and pay-TV operators. With tens of millions of players sold across many countries, thousands of streaming channels, and billions of hours watched over the platform, building a scalable, highly available, fault-tolerant big data platform is critical for our success. This role is based in Bangalore, India and requires hybrid working, with 3 days in the office. What you'll be doing: Build highly scalable, available, fault-tolerant distributed data processing systems (batch and streaming systems) processing tens of terabytes of data ingested every day and a petabyte-sized data warehouse. Build quality data solutions and refine existing diverse datasets into simplified data models encouraging self-service. Build data pipelines that optimize for data quality and are resilient to poor-quality data sources. Own the data mapping, business logic, transformations, and data quality. Low-level systems debugging, performance measurement, and optimization on large production clusters. Participate in architecture discussions, influence the product roadmap, and take ownership and responsibility for new projects. Maintain and support existing platforms and evolve to newer technology stacks and architectures. We're excited if you have: Extensive SQL skills. Proficiency in at least one scripting language; Python is required. Experience in big data technologies like HDFS, YARN, Map-Reduce, Hive, Kafka, Spark, Airflow, Presto, etc. (see the orchestration sketch at the end of this posting). Proficiency in data modeling, including designing, implementing, and optimizing conceptual, logical, and physical data models to support scalable and efficient data architectures. Experience with AWS, GCP, or Looker is a plus. Ability to collaborate with cross-functional teams such as developers, analysts, and operations to execute deliverables. 5+ years of professional experience as a data or software engineer. BS in Computer Science; MS in Computer Science preferred. The Roku Culture: We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer.
That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.
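As a minimal sketch of the Airflow orchestration named in this posting's stack (Airflow 2.x API assumed; the DAG id, schedule, and task bodies are placeholders):

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw device metrics from the upstream source")

def transform():
    print("refine raw metrics into a simplified, self-service data model")

with DAG(dag_id="device_metrics_daily", start_date=datetime(2024, 1, 1),
         schedule="@daily", catchup=False) as dag:
    t1 = PythonOperator(task_id="ingest", python_callable=ingest)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # transform runs only after ingest succeeds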
Posted 6 days ago
13.0 - 18.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Job Role Value Proposition: The MetLife Corporate Technology (CT) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit, and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission are to create innovative, transformative, and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. As part of the Tech Talent Transformation (T3) agenda, MetLife is establishing a Technology Center in India. This technology center will perform as an integrated organization between onshore, offshore, and strategic vendor partners in an Agile delivery model. The US Actuarial Technology Delivery Lead will be part of the larger Actuarial, Reinsurance, and Treasury (ART) leadership team. As a key technical leader, you will design, develop, and maintain scalable data pipelines, implement efficient data solutions, and drive data-driven strategies for our organization. Your expertise in Azure, Databricks, Scala, and big data technologies will be critical in optimizing data flows and empowering analytics initiatives. You will collaborate closely with actuarial teams to deliver data-driven insights and solutions, supporting risk modeling, pricing strategies, and other key business initiatives. Key Relationships: Internal Stakeholders - Corporate Technology ART Leader, ART Leadership team, India Corporate Technology AVP, and Business Process Owners for US Actuarial. Key Responsibilities: Lead the design, development, and maintenance of scalable data architectures and pipelines in Azure Cloud environments. Develop strategies for the integration of actuarial data sources, both structured and unstructured, to enable advanced analytics. Build and optimize data models and ETL/ELT processes using Databricks, Scala, and Spark. Ensure the integrity, security, and quality of data through robust data governance practices. Implement performance tuning strategies to optimize big data processing workflows. Develop automation frameworks and CI/CD pipelines for data pipeline deployment. Lead and mentor junior data engineers, providing technical guidance and support. Stay up to date with emerging technologies in the data ecosystem and drive innovation. People Management - Managing data engineers located in multiple countries, upskilling/reskilling, the hiring/retention agenda, and adoption of an engineering culture. Stakeholder Management - Managing key business stakeholders to deliver the required technology capabilities to support the digital transformation agenda, and driving prioritization of the product backlog; this includes managing key vendors providing resources, SaaS, and other capabilities. Ways of Working - Adoption of Agile ways of working in the software delivery lifecycle.
E2E Software Lifecycle Management (Architecture, Design, Development, Testing & Production): Drive decisions on, and implementation of, technology best practices, architecture, and technology stack, including adoption of cloud, AI/data, and other relevant emerging technology that is fit for purpose. Conduct peer reviews of solution designs and related code, ensuring adherence to best-practice guidelines and compliance with security and privacy mandates. Investigate and resolve escalated production management incidents, problems, and service requests. Ensure disaster recovery, privacy, and security are aligned to enable application/platform stability, including technology currency management. Partner with Cyber & IT Security, Data Analytics, Infrastructure, Global Risk Management, Compliance, and Internal Audit to provide the holistic provision of technology services to the business, including risks, controls, and security.
Education
A Bachelor's or Master's degree in computer science or an equivalent field.
Experience
13+ years of experience leading software engineering teams.
Proven experience (5+ years) in data engineering, big data processing, and distributed systems.
Proficiency in Azure services (Data Factory, Azure Synapse, Azure Data Lake, Event Hubs).
Strong hands-on experience with Databricks and Spark for big data processing.
Expertise in Scala programming for data processing workflows.
Deep understanding of ETL/ELT processes, data modeling, and data warehousing principles.
Familiarity with cloud-native architectures and CI/CD pipelines.
Experience with data governance, security, and compliance practices.
Excellent problem-solving and communication skills.
Strong leadership and mentoring abilities.
Proficiency in the software development lifecycle, including CI/CD, test-driven development, domain-driven design, and Agile ways of working.
Proven track record of partnering with the business to deliver mission-critical transformation via an Agile approach.
Preferred Skills:
Knowledge of additional programming languages such as Python and SQL.
Familiarity with actuarial tools such as Prophet or other risk modeling systems.
Experience with DevOps tools and practices (Azure DevOps, Terraform).
Understanding of machine learning workflows and MLOps practices.
Proficiency in data visualization tools (Power BI).
Soft Skills:
Strong leadership and project management capabilities.
Excellent problem-solving, communication, and stakeholder management skills.
Ability to balance technical innovation with business value delivery.
Skills and Competencies:
Communication: Ability to influence, help communicate the organization's direction, and ensure results are achieved.
Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment.
People Management: Inspiring, motivating, and leading diverse and distributed teams.
Diverse environment: Can-do attitude and ability to work in a fast-paced environment.
Technical Stack:
Programming: Python, Scala, and SQL for data transformation and analytics workflows.
Azure Services: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Storage, and Event Hubs.
Databricks Platform: Strong knowledge of cluster management, performance tuning, job scheduling, and advanced analytics with Spark.
Big Data Tools: Apache Spark, Delta Lake, and distributed computing concepts.
Data Security: Security best practices, including RBAC, encryption, and GDPR compliance strategies.
CI/CD: DevSecOps for DataOps using GitHub Actions, Azure DevOps, or similar tools for automation.
Data Governance: Knowledge of data quality frameworks, lineage tracking, and metadata management.
Cloud Infrastructure: Azure networking, IAM, and infrastructure monitoring.
Certifications
Azure Data Engineer / Azure AI Engineer
Azure Architect
Databricks Azure Platform Architect
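For context on the kind of ETL/ELT work this posting describes, here is a minimal PySpark sketch of a Databricks-style extract-transform-load step writing to a Delta table. The path, column names, and table name are hypothetical illustrations, not part of the role description.

```python
# Minimal PySpark ETL sketch for a Databricks-style pipeline.
# Paths and names (policy_raw, actuarial.policy_curated) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("actuarial-etl").getOrCreate()

# Extract: read raw policy records landed in the data lake.
raw = spark.read.format("parquet").load("/mnt/datalake/raw/policy_raw")

# Transform: basic cleansing and a derived column, the kind of step
# an ETL/ELT pipeline applies before analytics.
curated = (
    raw.dropDuplicates(["policy_id"])
       .filter(F.col("premium") > 0)
       .withColumn("load_date", F.current_date())
)

# Load: write to a Delta table for downstream actuarial analytics.
curated.write.format("delta").mode("overwrite").saveAsTable("actuarial.policy_curated")
```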
Posted 6 days ago
2.0 - 6.0 years
25 - 30 Lacs
Bengaluru
Work from Office
We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us.
The Team:
Our team is responsible for building the Celonis end-to-end Task Mining solution. Task Mining is the technology that allows businesses to capture user interaction (desktop) data, so they can analyze how people get work done, and how they can do it even better. We own all the related components, e.g. the desktop client, the related backend services, the data processing capabilities, and the Studio frontend applications.
The Role:
Celonis is looking for a Senior Software Engineer to build new features and increase the reliability of our Task Mining solution. You would contribute to the development of our Task Mining Client, so expertise in C# and the .NET framework is required; knowledge of Java and Spring Boot is a plus.
The work you'll do:
Implement highly performant and scalable desktop components to improve our existing Task Mining software.
Own the implementation of end-to-end solutions: leading the design, implementation, build, and delivery to customers.
Increase the maintainability, reliability, and robustness of our software.
Continuously improve and automate our development processes.
Document procedures and concepts, and share knowledge within and across teams.
Manage complex requests from support, finding the right technical solution and managing the communication with stakeholders.
Occasionally work directly with customers, including getting to know their systems in detail and helping them debug and improve their setup.
The qualifications you need:
2-6 years of professional experience building .NET applications.
Passion for writing clean code that follows SOLID principles.
Hands-on experience in C# and the .NET framework.
Experience in user interface development using WPF and MVVM.
Familiarity with Java and the Spring framework is a plus.
Familiarity with containerization technologies (e.g., Docker).
Experience with REST APIs and/or distributed microservice architectures.
Experience with monitoring and log analysis tools (e.g., Datadog).
Experience in writing and setting up unit and integration tests.
Experience in refactoring legacy components.
Ability to supervise and coach junior colleagues.
Experience interacting with customers is a plus.
Strong communication skills.
What Celonis Can Offer You:
Pioneer Innovation: Work with the leading, award-winning process mining technology, shaping the future of business.
Accelerate Your Growth: Benefit from clear career paths, internal mobility, a dedicated learning program, and mentorship opportunities.
Receive Exceptional Benefits: Including generous PTO, hybrid working options, company equity (RSUs), comprehensive benefits, extensive parental leave, dedicated volunteer days, and much more.
Prioritize Your Well-being: Access to resources such as gym subsidies, counseling, and well-being programs.
Connect and Belong: Find community and support through dedicated inclusion and belonging programs.
Make Meaningful Impact: Be part of a company driven by strong values that guide everything we do: Live for Customer Value, The Best Team Wins, We Own It, and Earth Is Our Future.
Collaborate Globally: Join a dynamic, international team of talented individuals.
Empowered Environment: Contribute your ideas in an open culture with autonomous teams.
About Us:
Celonis makes processes work for people, companies, and the planet. The Celonis Process Intelligence Platform uses industry-leading process mining and AI technology and augments it with business context to give customers a living digital twin of their business operation. It's system-agnostic and without bias, and provides everyone with a common language for understanding and improving businesses. Celonis enables its customers to continuously realize significant value across the top, bottom, and green line. Celonis is headquartered in Munich, Germany, and New York City, USA, with more than 20 offices worldwide. Get familiar with the Celonis Process Intelligence Platform by watching this video.
Celonis Inclusion Statement:
At Celonis, we believe our people make us who we are and that The Best Team Wins. We know that the best teams are made up of people who bring different perspectives to the table. And when everyone feels included, able to speak up, and knows their voice is heard - that's when creativity and innovation happen.
Your Privacy:
Any information you submit to Celonis as part of your application will be processed in accordance with Celonis' Accessibility and Candidate Notices. By submitting this application, you confirm that you agree to the storing and processing of your personal data by Celonis as described in our Privacy Notice for the Application and Hiring Process. Please be aware of common job offer scams, impersonators, and frauds. Learn more here.
Posted 6 days ago
2.0 - 4.0 years
2 - 2 Lacs
Jaipur, Bhankrota
Work from Office
1. Enter and update data in Excel and internal systems.
2. Verify accuracy and resolve discrepancies.
3. Maintain organized data records.
4. Coordinate with WH teams for data collection and validation.
Posted 6 days ago
0.0 years
1 - 1 Lacs
Viluppuram
Work from Office
Greetings from Annexmed!
Huge openings for Data Analyst - Non-Voice Process (Freshers) - Villupuram
Desired Skills:
* Typing skill (upper / lower case)
* Qualification: Diploma or any degree
* Year of passing: 2022 to 2025
* Good communication skills
* Location: Candidates must reside within a 15 km radius of the office location.
Interview time: 11:00 AM to 4:00 PM
Interview days: Monday to Friday
Working days: 5 days only; Saturday & Sunday fixed off
Contact: Geetha (HR) 8220529346
Shift: Night shift only (9:30 PM to 5:30 AM)
Posted 6 days ago
2.0 - 5.0 years
4 - 7 Lacs
Navi Mumbai
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
Strive for continuous improvement by testing the built solution and working within an agile framework.
Discover and implement the latest technology trends to maximize value and build creative solutions.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Experience with Apache Spark (PySpark): In-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
Big Data Technologies: Familiarity with Hadoop, HDFS, Kafka, and other big data tools.
Data Engineering Skills: Strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
Strong proficiency in Python: Expertise in Python programming with a focus on data processing and manipulation.
Data Processing Frameworks: Knowledge of data processing libraries such as Pandas and NumPy.
SQL Proficiency: Experience writing optimized SQL queries for large-scale data analysis and transformation.
Cloud Platforms: Experience working with cloud platforms such as AWS, Azure, or GCP, including using cloud storage systems.
Preferred technical and professional experience:
Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
Good to have: detection and prevention tools for Company products and Platform and customer-facing
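As a rough illustration of the PySpark and SQL proficiency this posting lists, here is a minimal sketch that registers a DataFrame as a temporary view and aggregates it with Spark SQL. The sample data and column names are invented for the example.

```python
# Minimal PySpark sketch combining DataFrame and SQL APIs;
# sample data and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-example").getOrCreate()

orders = spark.createDataFrame(
    [("O-1", "IN", 120.0), ("O-2", "US", 80.5), ("O-3", "IN", 42.0)],
    ["order_id", "country", "amount"],
)

# Register the DataFrame so it can be queried with SQL.
orders.createOrReplaceTempView("orders")
summary = spark.sql(
    "SELECT country, COUNT(*) AS orders, SUM(amount) AS revenue "
    "FROM orders GROUP BY country ORDER BY revenue DESC"
)
summary.show()
```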
Posted 6 days ago
The data processing job market in India is thriving. With growing demand for data-driven insights across industries, the need for professionals skilled in data processing is on the rise. Whether you are a fresh graduate starting your career or an experienced professional looking to advance, there are ample opportunities in India for data processing roles.
Major cities across India are actively hiring for data processing roles, with a multitude of job opportunities available for job seekers.
The average salary range for data processing professionals in India varies based on experience and skill level. Entry-level professionals can expect to earn INR 3-6 lakh per annum, while experienced professionals can earn upwards of INR 10 lakh per annum.
A typical career path in data processing may include roles such as Data Analyst, Data Engineer, Data Scientist, and Data Architect. As professionals gain experience and expertise in the field, they may progress from Junior Data Analyst to Senior Data Analyst, and eventually to roles such as Data Scientist or Data Architect.
In addition to data processing skills, professionals in this field are often expected to have knowledge of programming languages such as Python, SQL, and R. Strong analytical and problem-solving skills are also essential for success in data processing roles.
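As a small, hypothetical illustration of the kind of Python task such roles involve, the following Pandas sketch cleans a tiny dataset and summarizes it; the data and column names are invented.

```python
# Illustrative Pandas snippet: a typical cleanse-and-aggregate task.
# All data and column names here are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "city": ["Bengaluru", "Hyderabad", "Bengaluru", None],
    "salary_lpa": [6.5, 4.0, 10.0, 5.5],
})

# Drop incomplete rows, then summarize salary by city.
clean = df.dropna(subset=["city"])
print(clean.groupby("city")["salary_lpa"].agg(["count", "mean"]))
```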
As you explore opportunities in the data processing job market in India, remember to prepare thoroughly for interviews and showcase your skills and expertise confidently. With the right combination of skills and experience, you can embark on a successful career in data processing in India. Good luck!