
106374 Python Jobs - Page 48

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 years

0 Lacs

Kochi, Kerala, India

On-site

Senior ADAS Function Development Engineer
Location: Kochi, Kerala, India (Full-time)

Job Summary
We're seeking a seasoned engineer with 6+ years of hands-on experience in advanced driver-assistance system (ADAS) function development. You'll own the end-to-end design, implementation, and validation of perception and fusion algorithms across radar, LiDAR, camera, and GNSS sensors.

Key Responsibilities
• Lead ADAS feature design, from requirements gathering through software integration and validation
• Develop and optimize simulation environments using CARLA, NVIDIA Isaac Sim, IPG TruckMaker, and MATLAB
• Configure and run dSPACE tool-chain workflows for HIL and SIL testing
• Integrate and validate sensor data streams (radar, LiDAR, camera, GNSS) and develop fusion algorithms
• Collaborate with system architects, calibration engineers, and test teams to deliver production-ready solutions
• Drive continuous improvement: code reviews, performance profiling, and documentation

Required Qualifications and Experience
• Bachelor's or Master's in Electrical, Electronics, Computer Science, or Mechatronics Engineering
• Minimum 6 years in ADAS function development (perception, tracking, fusion)
• Strong simulation expertise with industry tools:
  o CARLA
  o NVIDIA Isaac Sim
  o IPG TruckMaker
  o MATLAB/Simulink
• Hands-on experience with the dSPACE tool-chain (ControlDesk, ConfigurationDesk, etc.)
• Sensor domain expertise:
  o Radar: Continental, Bosch, Aptiv
  o LiDAR: Ouster
  o Cameras: stereo and monocular setups
  o GNSS integration and data processing
• Proficiency in C/C++, Python, and ROS

Preferred Qualifications
• Prior work on automotive production projects or in a Tier-1 supplier environment
• Familiarity with version control (Git) and CI/CD for embedded systems

What We Offer
• Competitive salary and performance-based increments
• Professional development budget for conferences, certifications, and training
• Collaborative culture with cutting-edge projects in autonomous driving

Posted 21 hours ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Looking for AI and Python engineers from premium colleges with 3+ years of experience building AI agents and working on domain-specific RAG.

Role Description
This is a full-time, on-site role as an Artificial Intelligence Engineer at Corp Placements, located in Kochi. The AI Engineer will be responsible for tasks related to pattern recognition, neural networks, software development, and natural language processing (NLP) to drive innovative solutions and technologies within the organization.

Qualifications
Strong background in Computer Science
Proficiency in pattern recognition and neural networks
Experience in software development, particularly in AI applications
Knowledge of natural language processing (NLP)
Excellent problem-solving and analytical skills
Ability to work collaboratively in a team environment
Master's degree in a related field such as Computer Science, AI, or Engineering

Posted 21 hours ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Company Description
CDN Solutions Group, founded in 2000 by Mr. Surajit Mitra and Mr. Chetan Naik, is a leading app development company. The company has earned numerous accolades, including being listed among the Top 15 Python development companies worldwide and ranking among the top enterprise app development companies. CDN Solutions Group serves diverse sectors such as healthcare, education, retail, real estate, and finance with customized IT solutions. They specialize in various technologies, including open-source development, iOS/Android app development, blockchain technology, and more. The company is certified with ISO 9001:2015 and CRISIL SME.

Role Description
This is a full-time, on-site role for a Business Manager, located in Indore. The Business Manager will be responsible for overseeing daily operations, managing business development strategies, and ensuring the achievement of revenue targets. Other day-to-day tasks include team management, client relationship management, marketing and sales coordination, financial planning, and performance analysis.

Qualifications
Proven experience in business development, sales, and marketing strategies
Strong leadership and team management skills
Excellent client relationship management
Ability to identify and close software services leads worldwide
Strong written and verbal communication skills
Ability to work independently across time zones and geographies
Experience in the IT services industry is MANDATORY
Bachelor's degree in Business Administration, Management, or a related field

Posted 21 hours ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

We are seeking a talented and experienced Python and Power BI Developer to join our Automation Chapter. The ideal candidate will have a minimum of 5 years of experience in Python and Power BI, demonstrating proficiency in Python with web frameworks such as Django and a strong understanding of REST API frameworks for consuming third-party app APIs. The candidate should possess a sound understanding of Python packages used for data analytics, such as pandas, and be adept at creating analytical charts in Power BI based on outputs generated by Python scripts.

Key Responsibilities:
• Develop and maintain web applications using Python with the Django framework.
• Design and implement REST APIs to integrate third-party applications and services.
• Utilize Python packages like pandas for data manipulation and analysis.
• Create insightful and interactive analytical charts using Power BI to visualize data outputs from Python scripts (see the sketch after this listing).
• Collaborate with cross-functional teams to identify automation opportunities and develop solutions.
• Participate in code reviews and ensure adherence to best practices and coding standards.
• Implement and maintain DevOps tools and practices, including Git, GitHub Actions, Jenkins, and JIRA.
• Communicate effectively with team members and stakeholders to ensure project goals are met.
• Contribute to the Automation Chapter by sharing knowledge and expertise and participating in continuous improvement initiatives.

Qualifications:
• Bachelor's degree in Computer Science or a related field.
• Minimum of 5 years of experience in Python development, with expertise in Django and REST API frameworks.
• Strong proficiency in Power BI for data visualization and analytics.
• Solid understanding of Python packages for data analytics, such as pandas.
• Experience with DevOps tools and practices, including Git, GitHub Actions, Jenkins, and JIRA.
• Excellent communication skills with the ability to collaborate effectively in a team environment.
• Experience with other relevant tools and technologies, such as SQL, Docker, and Kubernetes, is a plus.
• Ability to work independently and manage multiple tasks simultaneously.

Preferred Skills:
• Familiarity with cloud platforms such as AWS or Azure.
• Experience in Agile development methodologies.
• Knowledge of machine learning libraries and techniques.
• Understanding of CI/CD pipelines and automation scripts.
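To illustrate the Python-to-Power BI hand-off this posting describes (Python scripts producing outputs that Power BI charts consume), here is a minimal, hypothetical pandas sketch; the file names and column names are invented for illustration and are not from the posting.

```python
# Minimal sketch: aggregate raw sales data with pandas and write a tidy CSV
# that a Power BI report can use as a refreshable data source.
# All file and column names here are hypothetical examples.
import pandas as pd

def build_monthly_summary(input_csv: str, output_csv: str) -> pd.DataFrame:
    df = pd.read_csv(input_csv, parse_dates=["order_date"])
    summary = (
        df.assign(month=df["order_date"].dt.to_period("M").astype(str))
          .groupby(["month", "region"], as_index=False)
          .agg(total_revenue=("revenue", "sum"), orders=("order_id", "count"))
    )
    summary.to_csv(output_csv, index=False)  # Power BI refreshes from this file
    return summary

if __name__ == "__main__":
    build_monthly_summary("sales_raw.csv", "monthly_summary.csv")
```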

Posted 21 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka

On-site

About the role:
The Resident Architect will play a dual role, focusing equally on architectural leadership and hands-on development. This position is responsible for discovering and onboarding new ADOC use cases and building new connectors for ADOC, and requires a strong developer background in Python or Java.

Key Responsibilities:

1. Solutions Architect: Discover & Onboard ADOC Use Cases
Identify, evaluate, and prioritise new use cases for ADOC within the organisation.
Collaborate with business stakeholders to understand requirements and translate them into technical solutions.
Lead the onboarding process for new ADOC use cases, ensuring seamless integration and adoption.
Develop architectural frameworks and best practices for ADOC implementation.
Provide architectural guidance and support to project teams throughout the lifecycle of ADOC use cases.
Monitor and report on the progress and effectiveness of onboarded use cases.
Ensure compliance with organisational standards, security, and scalability requirements.

2. Development: Build New Connectors for ADOC
Design, develop, and maintain custom connectors for ADOC using Java or Python.
Collaborate with cross-functional teams to gather connector requirements and ensure robust integration.
Write clean, efficient, and well-documented code for connector development.
Troubleshoot, debug, and optimise connector performance.
Stay updated with the latest advancements in Python, Java, and connector development best practices.
Ensure all connectors meet quality, security, and compliance standards.
Provide technical mentorship to junior developers.

Qualifications:
Experience with a cloud platform is highly desirable.
Proven experience as an architect with a track record of discovering and onboarding technology use cases.
Strong hands-on development experience in Python or Java, with a focus on building connectors or integrations.
Excellent problem-solving, communication, and stakeholder management skills.
Ability to work independently and as part of a collaborative team.
Bachelor's or Master's degree in Computer Science, Information Technology, Architecture, or a related field.

Competencies:
Strategic thinking and a solution-oriented mindset.
Strong technical acumen in both architecture and software development.
Ability to balance multiple priorities and deliver results in a fast-paced environment.
Commitment to continuous learning and professional growth.

This role is ideal for professionals who are passionate about both architectural strategy and hands-on software development, particularly in the context of ADOC use cases and connector development.

At Acceldata, our new Agentic Data Management (ADM) offering introduces AI-powered agents that collaborate with human teams to proactively monitor, diagnose, and resolve data issues. We are revolutionising how enterprises manage and observe data by offering comprehensive data observability solutions tailored to each organisation's unique needs. Our platform integrates various technologies, enabling seamless data observability for modern enterprises.

Posted 21 hours ago

Apply

7.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Overview
We are seeking an ETL Developer with expertise in Advanced SQL, Python, and Shell Scripting. This full-time position reports to the Data Engineering Manager and is available in a hybrid work model. This is a replacement position within the SRAI - EYC Implementation team.

Key Responsibilities
Design and develop ETL processes for data extraction, transformation, and loading.
Utilize Advanced SQL for data processing and analysis.
Implement data processing solutions using Python and Shell Scripting.
Collaborate with cross-functional teams to understand data requirements.
Maintain and optimize data pipelines for performance and reliability.
Provide insights and analysis to support business decisions.
Ensure data quality and integrity throughout the ETL process.
Stay updated on industry trends and best practices in data engineering.

Must-Have Skills and Qualifications
7-8 years of experience as an ETL Developer.
Expertise in Advanced SQL for data manipulation and analysis.
Proficient in Python and Shell Scripting.
Foundational understanding of Databricks and Power BI.
Strong logical problem-solving skills.
Experience in data processing and transformation.
Understanding of the retail domain is a plus.

Good-to-Have Skills and Qualifications
Familiarity with cloud data platforms (AWS, Azure).
Knowledge of data warehousing concepts.
Experience with data visualization tools.
Understanding of Agile methodologies.

What We Offer
Competitive salary and comprehensive benefits package.
Opportunities for professional growth and advancement.
Collaborative and innovative work environment.
Flexible work arrangements.
Impactful work that drives industry change.

Posted 21 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: AI Ops Engineer
Experience: 3–5 years

About The Role
We are seeking a hands-on and proactive AI Ops Engineer to operationalize and support the deployment of large language model (LLM) workflows, including agentic AI applications, across Marvell's enterprise ecosystem. This role requires strong prompt engineering capabilities, the ability to triage AI pipeline issues, and a deep understanding of how LLM-based agents interact with tools, memory, and APIs. You will be expected to diagnose and remediate real-time problems, from prompt quality issues to model behavior anomalies.

Key Responsibilities
Design, fine-tune, and manage prompts for various LLM use cases tailored to Marvell's enterprise operations.
Operate, monitor, and troubleshoot agentic AI applications, including identifying whether issues stem from prompt quality or structure, model configuration or performance, or tool usage, API failures, and memory/recall issues.
Build diagnostics and playbooks to triage LLM-driven failures, including handling fallback strategies, retries, or re-routing to human workflows.
Collaborate with architects, ML engineers, and DevOps to optimize agent orchestration across platforms like LangGraph, CrewAI, AutoGen, or similar.
Support integration of agentic systems with enterprise apps like Jira, ServiceNow, Glean, or Confluence using REST APIs, webhooks, and adapters.
Implement observability and logging best practices for model outputs, latency, and agent performance metrics.
Contribute to building self-healing mechanisms and alerting strategies for production-grade AI workflows.

Required Qualifications
3–6 years of experience in software engineering, DevOps, or MLOps with exposure to AI/LLM workflows.
Strong foundation in prompt engineering and experience with LLMs like GPT, Claude, LLaMA, etc.
Practical understanding of AIOps platforms or operational AI use cases (incident triage, log summarization, root cause analysis, etc.).
Exposure to agentic AI architectures, such as LangGraph, AutoGen, CrewAI, etc.
Familiarity with scripting (Python), RESTful APIs, and basic system debugging.
Strong analytical skills and the ability to trace issues across multi-step pipelines and asynchronous agents.

Good to Have
Glean
DevRev
Codium
Cursor
Atlassian AI
Databricks Mosaic AI

Posted 21 hours ago

Apply

9.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Mainframe Developer
Duration: Full-time
Location: Kharadi, Pune, India

Job Description:
Minimum 9+ years of experience developing software on the mainframe.
Should have strong and proven technical skills in mainframe technologies: COBOL, DB2, JCL, CICS, and MQ.
Should have strong technical skills and experience working with Assembler language.
Should have strong object-oriented analysis and design skills.
Should have exposure to tools like GitLab, Sonar, etc.
Should have good exposure to and experience with CI/CD topics.
Preferred to have good domain knowledge of Investment Banking - Trade Settlement Systems and Payments.
Should have good communication and presentation skills.
Should have hands-on experience with Agile methodologies and metrics like velocity, burndown charts, story points, etc.
Should have strong organizational and quality assurance skills.
Strong experience working within large global teams.
Good to have additional knowledge of Java (core Java concepts, Spring/Hibernate) OR proficiency in building ML (Machine Learning) and NLP (Natural Language Processing) solutions using common ML libraries and frameworks, with Python experience across ML toolkits like TensorFlow, PyTorch, Keras, and Scikit-learn.

Posted 21 hours ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Company
We're Hitachi Vantara, the data foundation trusted by the world's innovators. Our resilient, high-performance data infrastructure means that customers – from banks to theme parks – can focus on achieving the incredible with data. If you've seen the Las Vegas Sphere, you've seen just one example of how we empower businesses to automate, optimize, innovate – and wow their customers. Right now, we're laying the foundation for our next wave of growth. We're looking for people who love being part of a diverse, global team – and who get excited about making a real-world impact with data.

Meet our Team

What You'll Be Doing
We are recruiting a Software Development Engineer in Test (SDET) to work as part of the software development team in Hitachi Vantara Engineering, which is responsible for designing, developing and testing the Hitachi NAS Platform product (https://www.hitachivantara.com/en-us/products/storage/network-attached-storage-platform.html). As part of an agile team you will understand the deliverables and proactively take part in the team's self-organization to ensure sprint goals are met. You will play a key role in shaping the design, testing and integration of prioritized work. This will involve tackling problems of diverse scope, and designing and implementing tests to ensure the quality of the product. An ideal candidate will understand the Scrum framework for software development and will be willing to work as part of a team to achieve common sprint goals. Duties are likely to involve software design, coding, hands-on ad-hoc and exploratory testing, and automated test development work. You will also help to define test strategies and promote test coverage and test automation, helping us move towards shorter integration and deployment cycles. A successful candidate must have a proven record of accomplishment in automated test development.

What You Bring To The Team
Design and implement automated tests in collaboration with the agile development team.
Create and execute ad-hoc and exploratory tests to ensure the quality of the product.
Build and support automated testing infrastructure.
Test and develop software across the stack (embedded C++, APIs, management applications and user interfaces).
Maintain existing tests, and eventually be involved in diagnosing and resolving escalated problems.
Work as a part of an agile scrum team to deliver team commitments and achieve continual improvements.
Produce and maintain high-quality documentation.
Ensure good, automated regression test coverage.

Qualifications:
Computer Science degree or equivalent industry experience.
At least 8 years in quality engineering or development.
Programming experience in Python, C++ and/or Java.
Proficient sysadmin skills across all major OSes (Linux and Windows are a must), storage systems, hypervisors, and containers.
Aptitude: self-starter, love of learning, inquisitive mind, openness to new ideas.
Excellent attention to detail, and adaptability to change.
Strong verbal and written communication skills.
Driven and self-motivated, with a strong work ethic and a passion for problem solving and improving engineering deliverables.

Experience with any of the following technologies would be an advantage:
Network Attached Storage (NAS)
Storage systems
Networking
Unix/Windows client/server environments
Virtualization
File serving protocols, e.g. NFS or SMB

Our Company
Hitachi Vantara is part of the Global Hitachi family.
We balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society. Our people are our biggest asset; they drive our innovation advantage and we strive to offer a flexible and collaborative workplace where they can thrive. Diversity of thought is welcomed and our employee base is represented by several active Employee Resource Group communities. We offer industry-leading benefits packages (flexible working, generous pension and private healthcare) and promote a creative and inclusive culture. If driving real change gives you a sense of pride and you are passionate about powering social good, we'd love to hear from you.

Championing diversity, equity, and inclusion
Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You
We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We're also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We're always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you'll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with.

We're proud to say we're an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.

Posted 21 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Greetings! We are hiring a Business Analyst for one of our MNC clients through INNOVA.

The Business Analyst will analyze, document, and optimize business processes, identify areas for improvement, and ensure that the project is executed efficiently. The role sits in the Credit Risk MI workstream and requires:
Data analysis and BI tooling experience (Tableau, Qlik, etc.)
SQL and Python skills for data extraction
Experience in credit risk MI implementation in a banking domain (retail credit)
Ability to work with Credit Risk stakeholders

Role Responsibilities:
AIRB Portfolio Management - Credit Risk MI workstream
Document the current state for credit risk MI (measures, dimensions, star schemas, data schemas)
Understand the technical architecture for the future-state build
Ensure sourcing of golden-source data using appropriate data warehouses
Collaborate with the UK-based workstream lead and BI lead
• Liaising with SMEs and facilitating workshops to understand and prioritise requirements
• Identify root causes of business problems and create business cases
• Assist the testing team with ready-to-release solutions and their implementation
• Supporting the deployment of changes by coordinating business readiness activities
• Continuous improvement activities, defining best practices and sharing knowledge
• Previous experience of working in a Regulatory Reporting Change environment
• Produce business requirements/user stories
• Strong knowledge of banking products and the regulatory landscape; specifically, knowledge of COREP, FINREP and liquidity is important
• You must be able to demonstrate a solid understanding of and experience with data analysis

Posted 21 hours ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Join our digital revolution in NatWest Digital X
In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter.

Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India and as such all normal working days must be carried out in India.

Job Description
Join us as a DevOps Engineer
This is an opportunity for a driven individual to take on an exciting new career challenge
You'll be able to build and maintain a wide network of stakeholders of varying degrees of seniority
It's a chance to have a tangible effect on our function, put your existing skills to the test and advance your career
We're offering this role at associate level

What you'll do
You'll be working with platform and feature teams to develop, build and configure DevOps tools, technologies, techniques, patterns, and processes, and create trusted pipelines of work from development through to production environments utilising high degrees of automation. You'll also:
Contribute to building the DevOps engineering capability, culture and mindsets within the organisation
Work with platform and feature teams as an SME to develop, build and configure DevOps tools, technologies, techniques, patterns, and processes
Create trusted pipelines of work from development through to production environments utilising high degrees of automation
Coach and mentor feature and platform teams to higher levels of DevOps capability, to drive continuous improvements and enhance the end-customer experience
Support the set-up of DevOps methodologies and tools within platform and feature teams and demonstrate technical implementation
Support the development and establishment of appropriate DevOps practices

The skills you'll need
You'll need knowledge of the core principles and benefits of DevOps, and knowledge and experience of software engineering and/or IT operations. You'll also need:
5+ years of proficiency in Python, AWS Cloud, React JS, DynamoDB, AWS Contact Centre experience, and other AWS services such as Lambda, VPC, CloudTrail, CloudWatch, and Route 53
Knowledge of GitLab and its features (Runners and Build) and hands-on experience with IaC (Terraform)
Efficiency in writing shell scripts and Linux programming, with experience of native AWS pipeline services such as CodeBuild, CodePipeline, and CloudFormation
Experience of common DevOps tools such as CI tools, source code management, deployment, and configuration tooling
Experience of scripting
Experience of working within a governed or regulated change framework
Knowledge and experience of agile ways of working

Posted 21 hours ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Head of Options Trading
Location: Gurgaon, India
Company: Univest
Salary: No bar for the right candidate
Experience Required: 7+ years in options trading

Role Overview
We are seeking an exceptional Head of Options Trading to lead our options trading desk, provide high-accuracy daily trading calls, and build scalable trading systems. This is a Gurgaon-based leadership role ideal for a seasoned trader with a proven track record in options strategy, risk management, and team leadership.

Key Responsibilities
Lead the Options Trading Desk: Take full ownership of options trading strategies, execution, and team operations.
Generate High-Accuracy Calls: Deliver daily options market calls with an accuracy rate of 80%+, maintaining a strong risk-reward ratio.
Market Research & Analysis: Conduct and share real-time market research and technical and derivative analysis to guide users and internal teams.
Team Management: Lead, mentor, and scale a team of traders and analysts; establish a performance-driven culture.
System Development: Work closely with product and tech teams to design robust trading and risk management systems, dashboards, and automation tools.
Compliance & Risk Oversight: Ensure all trades comply with regulatory requirements and internal risk controls.
User Engagement: Actively contribute to user-facing content, webinars, and market outlooks to enhance brand credibility.

Ideal Candidate Profile
Experience: 7+ years of hands-on experience in options trading (index and/or stock derivatives).
Proven Track Record: Demonstrated history of delivering profitable and consistent trading performance.
Strong Analytical Mindset: Proficient in technical analysis, derivative strategies, and data-driven decision-making.
Leadership Skills: Experience leading a high-performance trading or research team.
Communication: Excellent verbal and written communication skills for sharing calls, views, and reports with the community.
Technology Friendly: Comfortable working with tools like TradingView, Excel, Python/R (optional), and backtesting platforms.
Certifications (Preferred): NISM Series VIII (Equity Derivatives) or relevant industry certifications.

Posted 21 hours ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Role: Technical Analyst - DevOps - Senior Level
Job Location: Noida, Uttar Pradesh, India
Required Experience: 8 - 12 Years
Skills: CI/CD pipelines, AWS, GitLab, GCP, Azure, DeepSource, SonarQube

JOB DESCRIPTION
We are looking for a highly skilled and motivated DevOps Engineer to join our dynamic team. As a DevOps Engineer, you will be responsible for managing our infrastructure and CI/CD pipelines, and for automating processes to ensure smooth deployment cycles. The ideal candidate will have a strong understanding of cloud platforms (AWS, Azure, GCP), version control tools (GitHub, GitLab), CI/CD tools (GitHub Actions, Jenkins, Azure DevOps, and Argo CD with GitOps methodologies), and the ability to work in a fast-paced environment.

RESPONSIBILITIES
Design, implement, and manage CI/CD pipelines using GitHub Actions, Jenkins, Azure DevOps, and Argo CD (GitOps methodologies).
Manage and automate the deployment of applications on cloud platforms such as AWS, GCP, and Azure.
Maintain and optimize cloud-based infrastructure, ensuring high availability, scalability, and performance.
Utilize GitHub and GitLab for version control, branching strategies, and managing code repositories.
Collaborate with development, QA, and operations teams to streamline the software delivery process.
Monitor system performance and resolve issues related to automation, deployments, and infrastructure.
Implement security best practices across CI/CD pipelines, cloud resources, and other environments.
Troubleshoot and resolve infrastructure issues, including scaling, outages, and performance degradation.
Automate routine tasks and infrastructure management to improve system reliability and developer productivity.
Stay up to date with the latest DevOps practices, tools, and technologies.

REQUIRED SKILLS
At least 8 years' experience as a DevOps Engineer.
Proven experience as a DevOps Engineer, Cloud Engineer, or similar role.
Expertise in CI/CD tools, including GitHub Actions, Jenkins, Azure DevOps, and Argo CD (GitOps methodologies).
Strong proficiency with GitHub and GitLab for version control, repository management, and collaborative development.
Extensive experience working with cloud platforms such as AWS, Azure, and Google Cloud Platform (GCP).
Solid understanding of infrastructure-as-code (IaC) tools like Terraform or CloudFormation.
Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
Knowledge of monitoring, logging, and alerting systems (e.g., Prometheus, Grafana, ELK stack).
Experience in scripting languages such as Python, Bash, or PowerShell.
Strong knowledge of networking, security, and performance optimization in cloud environments.
Familiarity with Agile development methodologies and collaboration tools.

Education
B.Tech/M.Tech/MBA/BE/MCA degree

Sierra Development is a leading North America-based software development company. SD Global Services (www.sierradev.in) is a wholly owned subsidiary of Sierra Development LLC (www.sierradev.com) and is backed by The Riverside Company (www.riversidecompany.com), a global private equity industry leader in the US. Riverside was founded in 1988 and has multiple office locations. We provide The Riverside Company access to highly trained software development resources.

Posted 21 hours ago

Apply

0 years

3 - 0 Lacs

Gunjur, Bengaluru, Karnataka

On-site

Location: Chrysalis High, Gunjur (https://maps.app.goo.gl/y1d2nC1R65BNUXkE8)
Job Type: Full-time

Job Summary:
We are seeking a STEAM teacher to lead Robotics and Computer classes for students. The ideal candidate should have strong Python skills and a passion for hands-on, tech-based learning.

Responsibilities:
Teach Robotics and Computer Science with a focus on Python.
Plan and deliver engaging, project-based lessons.
Guide students in building and programming robots.
Maintain a safe and tech-friendly classroom environment.

Requirements:
B.Tech/B.E degree in a related field.
Good knowledge of Python and basic robotics tools.
Strong classroom and communication skills.

Pay: Up to ₹25,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Work Location: In person

Posted 21 hours ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
International Risk and Control sits within International Card Services (ICS), which comprises all the issuing functions across all 28 international markets excluding the US - colleagues operate across a variety of geographies and disciplines. ICS issues products to our Consumer and Commercial customers; the organization is an integral part of the global growth strategy for American Express. Offering differentiated products and services is critical to our success and promises to drive significant growth and value through the delivery of innovative products tailored to the needs of our customers.

The objective of the ICS Regulatory Compliance Strategy & Operations team is to establish a robust and sustainable framework for Anti-Money Laundering (AML) and Know Your Customer (KYC) within the India market while simultaneously ensuring the effective implementation of KYC/AML regulatory obligations. This team provides support to business units in fulfilling all relevant operational risk, regulatory, and compliance requirements. The role will involve extensive collaboration with multiple partners across numerous business units, functional areas, and geographies.

Purpose of the Role:
The main area of responsibility for this position is to support the governance of the Risk & Control framework within ICS, with a focus on ensuring compliance with the regulatory landscape in India. This is a Band 35 role. We are looking for an experienced candidate who can demonstrate a truly collaborative approach to project delivery and management of our stakeholders. The ideal candidate will be able to show a track record of operating effectively in high-pressure, rapidly changing circumstances without ever compromising quality of delivery.

Responsibilities:
The main responsibilities for this role include but are not limited to the following:
Drive delivery of critical AML/KYC initiatives for the market in line with strategic priorities and the AML roadmap.
Be responsible for the end-to-end execution of AML regulatory changes and KYC requirements in partnership with business stakeholders across teams, managing program governance and ensuring effective implementation.
Support the business in execution of the company's operational risk framework, ensuring compliance with the requirements of the Enterprise-wide Operational Risk Policy and related guidance.
Support internal/external audits and reviews related to AML & KYC functions.
Act as a trusted advisor supporting the business on assessing, mitigating, and accepting risk; ensure a strong control focus is integrated into day-to-day operations and risks are managed effectively with root causes identified and addressed.
Prepare and collate status reports on the key initiatives to go into a Senior Leadership Steering Committee.
Minimum Qualifications
Proven experience leading large, complex projects/programs of change through to execution, ensuring compliance with regulatory control management requirements.
Exceptional and confident communication and relationship management skills to lead, influence and work closely with a large audience of partners at various levels of seniority and cross-functional partners, with proven experience of producing executive-level updates.
Project management, organizational and problem-solving skills with demonstrable experience in navigating changing circumstances and an evolving landscape.
Demonstrated ability in balancing multiple demands and expectations, managing conflicts while operating in a fast-paced, highly complex environment, ensuring quality and impact of program deliverables.
Essential to be able to demonstrate strong resilience and a calm approach under pressure.
Ability to ask the "right" questions without having extensive knowledge in a particular business area.
Strong analytics, tracking/reporting of metrics/data, and structured problem-solving skills.
High levels of proficiency in MS Visio, PowerPoint, and Excel, with advanced Excel being an advantage for success in this role.

Preferred Qualifications
Overall 10+ years' experience, with a minimum of 3 years' experience in Operational Risk Management, Project Management, Compliance, Internal Audit, or a related discipline.
Knowledge and experience of the Indian regulatory environment will be a strong advantage.
Previous experience leading and performing analysis using advanced Excel, SQL, Python or similar.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 21 hours ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
· Analyze large datasets to identify trends, patterns, and relationships for actionable insights
· Develop and implement statistical models and algorithms to solve complex business problems
· Collaborate with stakeholders to define project objectives and design experiments
· Build predictive models and conduct scenario analysis to support forecasting and decision-making processes
· Monitor and evaluate model performance, making recommendations for improvements as needed
· Conduct research and stay up-to-date with the latest trends and advancements in data science and analytics
· Present findings and recommendations to senior management and other stakeholders in a clear and concise manner

Minimum Qualifications
· Bachelor's degree in Data Science, Computer Science, Statistics, or a related field
· Proven experience working as a Data Scientist in the Financial Services industry
· Strong proficiency in programming languages such as Python or R
· Excellent analytical and problem-solving skills, with the ability to work with complex data and generate insights
· Experience with data visualization tools such as Tableau or Power BI
· Knowledge of machine learning techniques and algorithms
· Familiarity with databases and data querying languages such as SQL
· Strong communication and collaboration skills, with the ability to explain complex concepts to non-technical stakeholders

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 21 hours ago

Apply

0 years

1 - 2 Lacs

Naubatpur, Patna, Bihar

On-site

MIS Reporting
Data maintenance and data analysis
Inward/Outward tracking
Cycle count/Inventory count

Job Type: Full-time
Pay: ₹12,000.00 - ₹18,000.00 per month
Schedule: Day shift
Work Location: In person

Posted 22 hours ago

Apply

1.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

🚀 We're Hiring: Technical Trainers / Freelancers!
📍 Location: Coimbatore & Gobi
🏢 Company: Nschool Academy | 🌐 www.n-school.com

Excited about guiding aspiring tech minds and making a real difference in the IT world? Join Nschool Academy as a Technical Trainer or Freelancer and play a key role in shaping the future of technology professionals. As a trainer, you'll lead hands-on sessions, share your industry knowledge, and mentor students to become job-ready. This is your opportunity to work in a collaborative, growth-oriented environment while empowering the next generation of tech talent.

💻 Tech Stack: MERN & MEAN Stack Trainer, Python Full Stack Trainer, Java Full Stack Trainer, Data Science Trainer, Data Analytics Trainer, Power BI Trainer
📅 Training Schedule: Flexible (module-based)
📌 Location: Coimbatore / Gobi
💬 Remuneration: Discussed directly with shortlisted candidates

🎓 Eligibility:
Degree in CSE / IT / MCA / MSc IT or related fields
Minimum 1 year of relevant experience
Strong communication & presentation skills
A passion for teaching and continuous learning

📩 Apply Now: Send your resume to 👉 hr@n-school.com
📞 For More Details: 📍 Coimbatore: +91 90434 94941 | 📍 Gobichettipalayam: +91 63741 48844

🔁 Tag or refer someone who would be excited to teach and shape tech careers!

#nschoolacademy #emergewiztechnologies #coimbatore #upskill #careerpath #itcareer #codinginstructor #techjobsindia #pythontraining #javascriptdeveloper #webdevelopment #frontenddeveloper #backenddeveloper #jobseekers #codingtrainer #careerintech #learningneverstops #hiringtechtrainers #educationjobs #instructorjobs #teachtech #teachcoding #techcoimbatore #itcoimbatore #gobicareer #techcommunity #codingmentor

Posted 22 hours ago

Apply

12.0 years

0 Lacs

Madurai, Tamil Nadu, India

On-site

Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (see the sketch after this listing)
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or enterprise data platforms
Minimum 3–5 years of hands-on experience with GCP data services
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, star/snowflake schemas)
Experience with real-time data processing, streaming architectures, and batch ETL pipelines
Good understanding of IAM, networking, security models, and cost optimization on GCP
Prior experience in leading cloud data transformation projects
Excellent communication and stakeholder management skills

Preferred Qualifications:
GCP Professional Data Engineer / Architect certification
Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
Exposure to AI/ML use cases and MLOps on GCP
Experience working in agile environments and client-facing roles

What We Offer:
Opportunity to work on large-scale data modernization projects with global clients
A fast-growing company with a strong tech and people culture
Competitive salary, benefits, and flexibility
Collaborative environment that values innovation and leadership
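As a purely illustrative sketch of the ingestion pipelines the responsibilities above mention (Dataflow, Pub/Sub, Apache Beam), a minimal Beam pipeline of that shape might look like the following; the bucket paths and field names are assumptions, not from the posting, and on Dataflow the same code would run by supplying DataflowRunner pipeline options and typically writing to BigQuery instead of text files.

```python
# Minimal, hypothetical Apache Beam sketch: read newline-delimited JSON events,
# drop malformed records, and count events per type. Paths/fields are examples.
import json
import apache_beam as beam

def parse_event(line: str):
    record = json.loads(line)
    if "event_type" in record:  # keep only well-formed events
        yield record

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.FlatMap(parse_event)
        | "KeyByType" >> beam.Map(lambda r: (r["event_type"], 1))
        | "CountPerType" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda event_type, n: f"{event_type},{n}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/event_counts")
    )
```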

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

USV, a top-15 pharmaceutical company in India, excels in the diabetes and cardio sectors. We have a presence in over 65 countries with a dynamic team of over 7,000 across generations. Our commitment to brand building is evident in our popular products like Glycomet GP, Ecosprin AV, Jalra, Tazloc, Glynase, MVI and more. Join USV and be part of our journey as we continue to innovate, transform lives, and shape the future of healthcare.

Job Title: Business Intelligence Manager
Designation: Business Intelligence Manager (individual contributor role)
Department/Function: Sales & Marketing
Reporting To: Sr. VP - Sales & Marketing
Direct Reportees: Nil
Location: HO @ Govandi, Mumbai
Work Schedule:
🗓️ Workdays: All 5 days, including 1st, 3rd & 5th Saturdays
🚫 Off days: 2nd & 4th Saturdays

Job Summary:
The Business Intelligence Manager will be instrumental in driving business growth and leading strategic improvement initiatives within the pharmaceutical organization. This role entails close collaboration with cross-functional teams to identify and analyze business needs, uncover opportunities, and develop data-driven solutions. The position focuses on leveraging advanced analytics and actionable insights to enhance decision-making, optimize processes, and achieve impactful business outcomes.

Key Responsibilities
Business Needs Assessment: Collaborate with stakeholders to thoroughly understand and assess business needs, translating them into clear, actionable requirements for innovative solutions.
Advanced Data Analysis: Analyze large and complex datasets using advanced tools to uncover trends, generate actionable insights, and drive informed business decisions.
Solution Design and Development: Create compelling business cases and proposals for solutions, including process enhancements, technology integrations, and organizational optimizations to support business growth.
Stakeholder Collaboration: Build and maintain strong communication channels with stakeholders, including senior leadership, ensuring alignment, transparency, and buy-in throughout the solution development process.
End-to-End Project Management: Lead projects from conception to completion, ensuring timely delivery, adherence to budgets, and alignment with strategic goals.
Continuous Process Optimization: Identify and implement opportunities for streamlining processes to improve efficiency, effectiveness, and overall operational performance.
Regulatory Adherence: Ensure all proposed and implemented solutions comply with industry standards and regulatory requirements, such as FDA guidelines, safeguarding organizational integrity and compliance.

Requirements
1. Education:
Bachelor's degree in Mathematics, Engineering, Business Administration, or a related field. An MBA or an advanced degree in Business Analytics, Data Science, or a related field is preferred. Additional certifications in analytics tools or methodologies (e.g., Power BI, SQL, or Python) are a plus.
2. Experience:
3–5 years of experience in the pharmaceutical industry, preferably in a business intelligence, data analytics, or related role. Proven track record in delivering actionable insights and driving data-driven decision-making in sales and marketing contexts.
3. Skills:
Analytical Expertise: Strong proficiency in handling and analyzing large datasets to uncover trends and opportunities.
Technical Proficiency: Skilled in tools such as Power BI, Tableau, Google Workspace, Excel, SQL, Python, and data visualization frameworks.
AI/ML Knowledge: Familiarity with advanced analytics, predictive modeling, and machine learning algorithms is an advantage.
Pharmaceutical Knowledge: Comprehensive understanding of industry trends, regulations (e.g., FDA), and sales force effectiveness metrics.
Problem-Solving Ability: Strong critical thinking skills with a solution-oriented approach to complex business challenges.
Communication Skills: Excellent ability to communicate insights effectively to diverse audiences, including senior leadership, through presentations and dashboards.
Project Management: Demonstrated capability to manage multiple priorities in a fast-paced environment, delivering on time and within budget.
Stakeholder Collaboration: Ability to work cross-functionally and foster alignment among teams to achieve common objectives.

Posted 22 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Name: Senior Data Engineer - Azure
Years of Experience: 5

Job Description:
We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description:
This data engineering role requires creating and managing the technological infrastructure of a data platform: being in charge of, or involved in, architecting, building, and managing data flows/pipelines, and constructing data storage (NoSQL, SQL), tools to work with big data (Hadoop, Kafka), and integration tools to connect sources or other databases.

Role Responsibility:
Translate functional specifications and change requests into technical specifications
Translate business requirement documents, functional specifications, and technical specifications into related coding
Develop efficient code with unit testing and code documentation
Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
Set up the development environment and configure the development tools
Communicate with all project stakeholders on the project status
Manage, monitor, and ensure the security and privacy of data to satisfy business needs
Contribute to the automation of modules, wherever required
Be proficient in written, verbal and presentation communication (English)
Coordinate with the UAT team

Role Requirement:
Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
Knowledgeable in Shell / PowerShell scripting
Knowledgeable in relational databases, non-relational databases, data streams, and file stores
Knowledgeable in performance tuning and optimization
Experience in data profiling and data validation
Experience in requirements gathering and documentation processes and in performing unit testing
Understanding and implementing QA and various testing processes in the project
Knowledge of any BI tools will be an added advantage
Sound aptitude, outstanding logical reasoning, and analytical skills
Willingness to learn and take initiative
Ability to adapt to a fast-paced Agile environment

Additional Requirement:
Demonstrated expertise as a Data Engineer, specializing in Azure cloud services.
Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics.
Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory.
Utilize Azure Databricks for data transformation and processing.
Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services.
Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools.
Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages.

Posted 22 hours ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Full Stack AI Developer – Generative AI, LLMs, and Scalable Applications Job Summary We are looking for a dynamic and hands-on Full Stack AI Developer with expertise in Python programming , Generative AI , LLMs , API development , and front-end frameworks like React and Streamlit. The ideal candidate will be responsible for building scalable AI applications, integrating advanced LLM capabilities, developing RESTful APIs, and delivering high-quality user interfaces optimized for performance, responsiveness, and usability. Key Responsibilities Design, develop, and maintain AI-powered full stack applications. Build and optimize REST APIs using FastAPI/Flask, including authentication, deployment, and integration with LLMs and vector databases. Create, test, and containerize backend services, ensuring robust deployment using Docker and CI/CD pipelines. Implement Generative AI features using models from Hugging Face, OpenAI, or LangChain/LlamaIndex. Develop Streamlit or React front-ends to demonstrate AI capabilities through interactive dashboards and tools. Integrate retrieval-augmented generation (RAG) and agentic AI concepts into product workflows. Apply best practices in software engineering including Git version control, testing, debugging, and documentation. Work with cloud platforms (AWS, Azure, GCP) and monitor AI systems in production environments. Required Technical Skills 🔸 Core Python Development Proficiency in Python 3 with experience in syntax, control flow, data structures, functions, error handling. Familiar with file I/O, modules, decorators, list/dict comprehensions, and exception management. 🔸 Scientific & Data Libraries NumPy: Array operations, broadcasting. Pandas: Data manipulation, aggregation, data cleaning. Database Integration: SQL, SQLite, MongoDB, Postgres; CRUD operations; vector DBs (Chroma, PGVector, FIZZ). 🔸 API Development RESTful API design with FastAPI or Flask. Concepts like query/path parameters, async/await, dependency injection, request validation, JWT/OAuth2 auth. Vector DB integration and streaming LLM responses. 🔸 Generative AI / LLMs Understanding of Transformer architectures, LLMs (GPT, BERT, LLaMA). Working knowledge of LangChain, LlamaIndex, Hugging Face Transformers. Hands-on experience with prompt engineering, parameter tuning, RAG, and agent-based workflows. 🔸 Deployment & Containerization Proficient with Docker, Dockerfiles, and Docker Compose. Familiar with CI/CD pipelines, GitHub Actions, and deploying containerized services. 🔸 Frontend Development HTML5, CSS3, and modern JavaScript (ES6+). Experience with Streamlit for rapid AI dashboards. Strong grasp of React.js (with TypeScript): Hooks, component state, form validation, async data fetching, and UI libraries (Material-UI, Tailwind). Familiarity with building conversational UIs, integrating streaming outputs via WebSockets/SSE. Desirable Skills Experience with cloud services: AWS, Azure, or GCP for hosting, AI/ML services, and serverless deployments. Monitoring and logging for AI systems (e.g., Prometheus, ELK stack). Familiarity with testing frameworks like pytest, httpx, React Testing Library. Exposure to PDF extraction, OCR tools (e.g., PyMuPDF, Tesseract). Good understanding of frontend accessibility (A11y) and performance optimization. Soft Skills Strong communication and documentation skills. Problem-solving mindset with the ability to independently debug complex systems. Ability to collaborate across cross-functional teams (AI/ML, DevOps, Frontend, Product). 
Educational Qualification
Bachelor’s or Master’s degree in Computer Science, Data Science, AI/ML, or a related field. Certifications or hands-on experience in Generative AI/LLMs is a plus.

Summary
This role is ideal for candidates passionate about developing end-to-end AI applications: from data ingestion and model integration to real-time deployment and UI/UX delivery. You’ll be working on real-world GenAI products using LLMs, APIs, vector databases, and modern front-end stacks in a production-ready setup.

Skills: streamlit, langchain, docker, llms, pandas, mongodb, api development, python, sql, ci/cd, gcp, ml, sqlite, azure, javascript, html5, flask, python programming, llamaindex, numpy, aws, ai, git, css3, generative ai, fastapi, react, hugging face, postgres
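To illustrate the kind of API work this posting describes (FastAPI, RAG-style retrieval, streamed LLM output), here is a minimal sketch. The `retrieve_context` and `stream_llm_tokens` helpers are hypothetical placeholders for a vector-database lookup and an LLM client, not part of any specific product:

```python
# Minimal sketch of a RAG-style FastAPI endpoint with a streamed response.
# retrieve_context and stream_llm_tokens are placeholders to be replaced by
# real vector-DB (e.g. Chroma/PGVector) and LLM client calls.
from typing import AsyncIterator

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()


class Query(BaseModel):
    question: str


def retrieve_context(question: str) -> list[str]:
    # Placeholder: embed the question and run a similarity search here.
    return ["(retrieved passage 1)", "(retrieved passage 2)"]


async def stream_llm_tokens(prompt: str) -> AsyncIterator[str]:
    # Placeholder: a real implementation would stream tokens from an LLM API.
    for token in prompt.split():
        yield token + " "


@app.post("/ask")
async def ask(query: Query) -> StreamingResponse:
    context = "\n".join(retrieve_context(query.question))
    prompt = f"Answer using the context below.\n{context}\n\nQuestion: {query.question}"
    return StreamingResponse(stream_llm_tokens(prompt), media_type="text/plain")
```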

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Role: Python + DevOps
Experience: 5+ years
Location: Bangalore
Budget: 2 LPM

Job Description:
You'll architect and scale document processing pipelines that handle thousands of financial documents daily, ensuring high availability and cost efficiency.

What You'll Do
⦁ Build scalable async processing pipelines for document classification, extraction, and validation (a minimal Celery-style sketch follows this listing)
⦁ Optimize cloud infrastructure costs while maintaining 99.9% uptime for document processing workflows
⦁ Design and implement APIs for document upload, processing status, and results retrieval
⦁ Manage Kubernetes deployments with autoscaling based on document processing load
⦁ Implement monitoring and observability for complex multistage document workflows
⦁ Optimize database performance for high-volume document metadata and processing results
⦁ Build CI/CD pipelines for safe deployment of processing algorithms and business rules

Technical Requirements
Must Have:
⦁ 5+ years backend development (Python or Go)
⦁ Strong experience with async processing (Celery, Temporal, or similar)
⦁ Docker containerization and orchestration
⦁ Cloud platforms (AWS/GCP/Azure) with cost optimization experience
⦁ API design and development (REST/GraphQL)
⦁ Database optimization (MongoDB, PostgreSQL)
⦁ Production monitoring and debugging

Nice to Have:
⦁ Kubernetes experience
⦁ Experience with document processing or ML pipelines
⦁ Infrastructure as Code (Terraform/CloudFormation)
⦁ Message queues (SQS, RabbitMQ, Kafka)
⦁ Performance optimization for high-throughput systems

Interested candidates can apply through: https://thexakal.com/share-job?jobId=686e09563a69611b52ad693f
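As a rough illustration of the async pipeline work listed above, here is a minimal Celery sketch. The broker URL and the classify/extract/validate stages are illustrative assumptions, not a description of this employer's actual system:

```python
# Minimal sketch of an async document-processing pipeline using Celery.
from celery import Celery, chain

app = Celery("docs", broker="redis://localhost:6379/0", backend="redis://localhost:6379/1")


@app.task
def classify(document_id: str) -> dict:
    # Placeholder: run a document classifier and record its label.
    return {"document_id": document_id, "doc_type": "invoice"}


@app.task
def extract(meta: dict) -> dict:
    # Placeholder: extract fields based on the classified document type.
    meta["fields"] = {"total": "1250.00", "currency": "USD"}
    return meta


@app.task
def validate(meta: dict) -> dict:
    # Placeholder: apply business rules before persisting results.
    meta["valid"] = float(meta["fields"]["total"]) > 0
    return meta


def submit(document_id: str):
    # Chain the stages so each task feeds its result to the next worker.
    return chain(classify.s(document_id), extract.s(), validate.s()).apply_async()
```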

Posted 22 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

🚀 Hiring: Python Developer – Financial Modeling & AI (Individual Contributor)
📍 Location: Remote / India-based
🕒 Type: Full-time | Startup Environment

We're looking for a self-driven Python Developer to join our early-stage team and lead the development of a financial modeling engine with integrated AI capabilities. This is a high-impact, individual contributor role ideal for someone who thrives in fast-paced environments and enjoys building from scratch.

Key Responsibilities
- Design and develop scalable Python-based financial models and simulation tools (a minimal valuation sketch follows this listing)
- Integrate financial datasets, APIs, and compliance logic
- Apply AI/ML techniques to enhance forecasting, risk analysis, and decision support
- Build modular, testable code for valuation and analytics
- Collaborate with product and strategy teams to translate business logic into intelligent systems
- Own the full development lifecycle, from architecture to deployment

Required Skills
- Strong proficiency in Python 3.x, including NumPy, Pandas, SciPy
- Experience with financial modeling, valuation techniques, or quantitative analysis
- Hands-on knowledge of AI/ML frameworks like Scikit-learn, TensorFlow, or PyTorch
- Ability to work independently and manage timelines in a startup setting
- Experience with version control (Git) and CI/CD pipelines
- Bonus: Exposure to FastAPI, Flask, or Dash for building data-driven apps

What We Offer
- Opportunity to build a product from the ground up
- Flexible work culture
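To give a sense of the NumPy/Pandas valuation work this role mentions, here is a minimal discounted cash flow (DCF) sketch. The projected cash flows and the 10% discount rate are illustrative assumptions only:

```python
# Minimal sketch of a simple discounted cash flow (DCF) valuation helper.
import numpy as np
import pandas as pd


def discounted_cash_flow(cash_flows: pd.Series, rate: float) -> pd.DataFrame:
    """Discount a series of yearly cash flows back to present value."""
    years = np.arange(1, len(cash_flows) + 1)
    discount_factors = 1.0 / (1.0 + rate) ** years
    present_values = cash_flows.to_numpy() * discount_factors
    return pd.DataFrame(
        {"cash_flow": cash_flows.to_numpy(),
         "discount_factor": discount_factors,
         "present_value": present_values},
        index=pd.Index(years, name="year"),
    )


if __name__ == "__main__":
    flows = pd.Series([120.0, 135.0, 150.0, 160.0, 170.0])  # projected yearly cash flows
    table = discounted_cash_flow(flows, rate=0.10)
    print(table)
    print("Present value of projected cash flows:", round(table["present_value"].sum(), 2))
```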

Posted 22 hours ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Join our Team

About this opportunity:
The Data Analyst is primarily responsible for advanced data analysis and visualization with Tableau, as well as defining, creating, automating, and maintaining operational, network performance, and financial analysis, and for developing analysis processes to increase task effectiveness and to consolidate in-scope customers.
- Perform and showcase advanced analysis of data
- Define, create, automate, and maintain key operational and network performance reporting (standard and customized)
- Develop analysis processes to increase task effectiveness and to consolidate in-scope customers
- Describe best practices and standards for data analysis (visualization)
- Troubleshoot, debug, and upgrade existing data systems

What you will do:
- Handle a variety of reporting and analysis tasks; extract, compile, and interpret key operational and statistical data
- Own the overall approach to reporting and analysis within the SDU
- Develop new processes to increase the effectiveness of reporting; consolidate and maintain departmental reports used internally and externally by parties such as other departments and senior management
- Develop and maintain new standardized reporting on a routine basis, i.e., define, create, automate, and maintain operational recurring reports (a minimal automation sketch follows this listing)
- Maintain a reporting schedule and documentation of reporting procedures
- Generate and maintain control documentation of reporting procedures to comply with governance audits and regulatory requirements; build audit processes to ensure data integrity and efficiency
- Perform ad hoc duties in accordance with business needs; build/design new reports based on requests
- Create documentation for new reports and modify or maintain it for existing ones
- Evaluate the effort needed to build specific reports

The skills you bring:
- Industry experience: Telecom
- Years of experience: 5 years minimum
- Qualifications: Degree in Electronics Engineering, Telecommunication Engineering, Computer Science, Computer Engineering, or equivalent
- Knowledge of Tableau, Power BI, or other reporting/visualization applications
- Knowledge of programming for machine learning, such as Python, would be considered a plus
- Knowledge of Business Objects (to manage/develop reports in BO)
- Knowledge of VBA would be an advantage (to manage/automate reports)
- Language skills: proficiency in written and spoken English and Hindi
- Values and behaviors: Ericsson Core Values, honesty, integrity

Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible and to build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
Click Here to find all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.

Primary country and city: India (IN) || Noida
Req ID: 769091
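As an illustration of the recurring-report automation described above, here is a minimal Pandas sketch that aggregates daily network KPIs into a per-site summary. The file name and column names (site, date, availability, throughput_mbps) are illustrative assumptions:

```python
# Minimal sketch: aggregate daily network KPIs from a CSV export into a
# per-site summary that downstream Tableau/BO dashboards could consume.
import pandas as pd


def build_daily_report(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["date"])
    report = (
        df.groupby(["site", pd.Grouper(key="date", freq="D")])
          .agg(avg_availability=("availability", "mean"),
               peak_throughput_mbps=("throughput_mbps", "max"),
               samples=("availability", "size"))
          .reset_index()
    )
    return report


if __name__ == "__main__":
    report = build_daily_report("network_kpis.csv")  # hypothetical input file
    report.to_csv("daily_network_report.csv", index=False)
```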

Posted 22 hours ago

Apply
