2.0 - 7.0 years
0 Lacs
Karnataka
On-site
You should have 2-7 years of experience in Python with a good understanding of big data ecosystems and frameworks such as Hadoop and Spark. Your experience should include developing data processing tasks using PySpark. Expertise in at least one popular cloud provider, preferably AWS, would be a plus. Additionally, you should possess good knowledge of any RDBMS/NoSQL database with strong SQL writing skills. Experience with data warehouse tools like Snowflake and any one ETL tool would be advantageous.

Your skills should include strong analytical and problem-solving capability, excellent verbal and written communication skills, and the ability to work directly with clients to build trusted relationships with stakeholders. You should also be able to collaborate effectively across global teams and have a strong understanding of data structures, algorithms, object-oriented design, and design patterns. Experience in the use of multi-dimensional data, data curation processes, and the measurement and improvement of data quality is required. General knowledge of business processes, data flows, and quantitative models that generate or consume data is also preferred. You should be an independent thinker, willing to engage with, challenge, and learn new technologies.

Role & Responsibilities:
- Maintain high-quality coding standards and deliver work within the stipulated time frame.
- Review the work of team members and occasionally provide guidance.
- Develop an understanding of the Work Breakdown Structure and assist the manager in delivering it.
- Develop sector initiatives such as credential building and knowledge management.
- Act as a team lead and proficiently deliver key responsibilities in line with the project plan.
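To give a flavour of the PySpark work described above, here is a minimal sketch of a batch processing task; the bucket paths, schema, and column names are invented for illustration and are not taken from the posting.

```python
# A minimal sketch (hypothetical dataset and column names) of the kind of
# PySpark task the role describes: read raw events, clean them, and
# aggregate with the DataFrame API.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-order-rollup").getOrCreate()

# Read raw data; the path and schema are illustrative assumptions.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")           # drop incomplete records
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "region")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write the aggregate back out, partitioned for downstream SQL consumers.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_order_totals/"
)
```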
Posted 6 days ago
7.0 - 11.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
At ITIDATA, an EXL Company, you will be responsible for tasks including working with Cypher or Gremlin query languages, Neo4J, Python, PySpark, Hive, and Hadoop. Your expertise in graph theory will be utilized to create and manage knowledge graphs using Neo4J effectively. In this role, we are looking for Neo4J Developers with 7-10 years of experience in data engineering, including 2-3 years of hands-on experience with Neo4J. If you are seeking an exciting opportunity in graph databases, this position offers the chance to work on optimizing the performance and scalability of graph databases, as well as researching and implementing new technology solutions.

Key Skills & Responsibilities:
- Expertise in Cypher or Gremlin query languages
- Strong understanding of graph theory
- Experience in creating and managing knowledge graphs using Neo4J
- Optimizing performance and scalability of graph databases
- Researching and implementing new technology solutions
- Working with application teams to integrate graph database solutions

We are looking for candidates who are available immediately or within 30 days to join our team and contribute to our dynamic projects.
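For illustration, a small sketch of how a knowledge graph might be populated and queried from Python using the official Neo4j driver and Cypher; the connection details, node labels, and properties are hypothetical.

```python
# Illustrative only: connection details, labels, and properties are
# assumptions, not details from the posting.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def upsert_employment(tx, person, company):
    # MERGE is idempotent: nodes and the relationship are created only if absent.
    tx.run(
        "MERGE (p:Person {name: $person}) "
        "MERGE (c:Company {name: $company}) "
        "MERGE (p)-[:WORKS_AT]->(c)",
        person=person, company=company,
    )

with driver.session() as session:
    session.execute_write(upsert_employment, "Asha", "ExampleCorp")
    # Query the graph back with Cypher; .values() is consumed inside the transaction.
    rows = session.execute_read(
        lambda tx: tx.run(
            "MATCH (p:Person)-[:WORKS_AT]->(c:Company) RETURN p.name, c.name"
        ).values()
    )
    print(rows)

driver.close()
```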
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
The role of Senior Java Developer with Big Data, based in Gurugram (onsite), is a full-time position that requires a highly skilled individual with expertise in Java development, particularly in Spring Boot and SQL. The primary responsibility of the ideal candidate will be to design, develop, and maintain robust backend systems. Additionally, experience with cloud platforms and big data technologies would be advantageous for this role.

As a Senior Java Developer, you will be tasked with designing, developing, and maintaining backend services using Java and Spring Boot. Your role will involve writing efficient SQL queries and collaborating with cross-functional teams to deliver new features. Ensuring code quality through unit testing, code reviews, and troubleshooting production issues are also key aspects of this position. It is essential to document technical designs and processes for effective communication within the team.

The required skills for this role include strong experience in Java development (version 8 or above), a solid understanding of Spring Boot, and proficiency in SQL and relational databases such as PostgreSQL, MySQL, or Oracle. Familiarity with RESTful API design and implementation is also necessary.

Nice-to-have skills for this position include experience with cloud platforms like Google Cloud Platform (GCP), AWS, or Azure; exposure to big data technologies such as Hadoop, Spark, or BigQuery; familiarity with Adobe Workfront and Adobe Personalization products; and an understanding of CI/CD pipelines and containerization using tools like Docker and Kubernetes.

To qualify for this role, candidates should possess a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 5 years of relevant development experience.
Posted 6 days ago
9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Big Data Engineer
Experience: 5-9 Years
Location: Hyderabad (Hybrid)
Employment Type: Full-Time

Job Summary: We are seeking a skilled Big Data Engineer with 5-9 years of experience in building and managing scalable data pipelines and analytics solutions. The ideal candidate will have strong expertise in Hadoop, Apache Spark, SQL, and Data Lake/Data Warehouse architectures. Experience working with any cloud platform (AWS, Azure, or GCP) is preferred.

Required Skills:
- 5-9 years of hands-on experience as a Big Data Engineer.
- Strong proficiency in Apache Spark (PySpark or Scala).
- Solid understanding of and experience with SQL and database optimization.
- Experience with data lake or data warehouse environments and architecture patterns.
- Good understanding of data modeling, performance tuning, and partitioning strategies.
- Experience working with large-scale distributed systems and batch/stream data processing.

Preferred Qualifications: Experience with cloud platforms such as AWS, Azure, or GCP.

Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
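A brief sketch of the partitioning and tuning ideas this posting calls out, written in PySpark; the table paths and column names are assumptions for the example.

```python
# Minimal sketch of partitioning-aware Spark processing; paths and
# columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("partitioning-demo").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/events/")

# Prune early: filter and select before wide operations to cut shuffle volume.
recent = (
    events
    .where(F.col("event_date") >= "2024-01-01")
    .select("event_date", "country", "user_id")
)

# Repartition by the aggregation key so each task handles one key range.
by_country = recent.repartition("country")

daily = by_country.groupBy("event_date", "country").agg(
    F.countDistinct("user_id").alias("active_users")
)

# Partition the output on the column most downstream queries filter by.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_active_users/"
)
```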
Posted 6 days ago
0.0 - 3.0 years
0 Lacs
Karnataka
On-site
Location: Karnataka, Bengaluru
Experience Range: 7 - 15 Years
Job Description: Spark/Scala

As a Software Development Engineer 2, you will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline designer and data wrangler who enjoys optimising data systems and building them from the ground up. The Data Engineer will lead our software developers on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimising or even re-designing our company's data architecture to support our next generation of products and data initiatives.

Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, and coordinating infrastructure re-design for greater scalability.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Support PROD systems.

Qualifications:
- Must have about 5 - 11 years of experience, with at least 3 years of relevant experience in Big Data.
- Must have experience in building highly scalable business applications that involve implementing large, complex business flows and dealing with huge amounts of data.
- Must have experience in Hadoop, Hive, and Spark with Scala, with good experience in performance tuning and debugging issues.
- Good to have experience with stream processing (Spark/Java with Kafka).
- Must have experience in the design and development of Big Data projects.
- Good knowledge of functional programming and OOP concepts, SOLID principles, and design patterns for developing scalable applications.
- Familiarity with build tools like Maven.
- Must have experience with any RDBMS and at least one NoSQL database, preferably PostgreSQL.
- Must have experience writing unit and integration tests using ScalaTest.
- Must have experience using a version control system - Git.
- Must have experience with CI/CD pipelines - Jenkins is a plus.
- Basic hands-on experience with one cloud provider (AWS/Azure) is a plus.
- Databricks Spark certification is a plus.
Posted 6 days ago
0.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Tesco India • Bengaluru, Karnataka, India • Hybrid • Full-Time • Permanent • Apply by 16-Jul-2025

About the role: Enable data-driven decision making across the Tesco business globally by developing analytics solutions using a combination of math, tech, and business knowledge.

What is in it for you: At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities, and planet a little better every day. Our Tesco Rewards framework consists of three pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable.
- Salary: Your fixed pay is the guaranteed pay as per your contract of employment.
- Performance Bonus: Opportunity to earn additional compensation based on performance, paid annually.
- Leave & Time-off: Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free: In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth: Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their families. Our medical insurance provides coverage for dependents, including parents or in-laws.
- Mental Wellbeing: We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing: Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE): Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing: Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, and badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

You will be responsible for:
- Understanding business needs and developing an in-depth understanding of Tesco processes.
- Building on Tesco processes and knowledge by applying CI tools and techniques.
- Completing tasks and transactions within agreed KPIs.
- Solving problems by analyzing solution alternatives.
- Engaging with business and functional partners to understand business priorities, asking relevant questions, and scoping them into an analytical solution document that calls out how the application of data science will improve decision making.
- Developing an in-depth understanding of techniques to prepare the analytical data set, leveraging multiple complex data sources.
- Building statistical models and ML algorithms with practitioner-level competency.
- Writing structured, modularized, and codified algorithms using Continuous Improvement principles (development of knowledge assets and reusable modules on GitHub, Wiki, etc.) with expert competency.
- Building an easy visualization layer on top of the algorithms in order to empower end-users to take decisions - this could be on a visualization platform (Tableau / Python) or through a recommendation set delivered through PPTs.
- Working with the line manager to ensure application/consumption, and proactively identifying opportunities to help the larger Tesco business with areas of improvement.
- Keeping up to date with the latest in data science and retail analytics and disseminating the knowledge among colleagues.

You will need:
- 2 - 4 years of experience in data science application in Retail or CPG.
- Preferred functional experience: Marketing, Supply Chain, Customer, Merchandising, Operations, Finance, or Digital.
- Applied math: Applied Statistics, Design of Experiments, Regression, Decision Trees, Forecasting, Optimization algorithms, Clustering, NLP.
- Tech: SQL, Hadoop, Spark, Python, Tableau, MS Excel, MS PowerPoint, GitHub.
- Business: Basic understanding of the Retail domain.
- Soft skills: Analytical thinking and problem solving, storyboarding, articulate communication.

About us: Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.

Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single-entity traditional shared services operation in Bengaluru, India (from 2004) to a global, purpose-driven, solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.
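As a hedged illustration of the applied-math toolkit listed above (regression in particular), here is a minimal Python sketch; the file name and feature columns are invented for the example.

```python
# Hypothetical columns and file path; the modelling step mirrors the
# regression techniques the posting lists.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

sales = pd.read_csv("weekly_store_sales.csv")  # assumed layout

X = sales[["promo_flag", "avg_price", "footfall"]]
y = sales["units_sold"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LinearRegression().fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```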
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Cloud Support Engineer at Snowflake, you will be a key member of the expanding Support team, dedicated to providing high-quality resolutions that help deliver data-driven business insights and results. You will have the opportunity to work with a wide variety of operating systems, database technologies, big data, data integration, connectors, and networking to solve complex issues. Snowflake's core values of putting customers first, acting with integrity, owning initiative and accountability, and getting it done are reflected in everything we do.

Your role will involve delighting customers with your passion and knowledge of the Snowflake Data Warehouse, providing technical guidance, and offering expert advice on the effective and optimal use of Snowflake. You will also be the voice of the customer, providing product feedback and improvements to Snowflake's product and engineering teams. Additionally, you will play a crucial role in building knowledge within the team and contributing to strategic initiatives for organizational and process improvements.

As a Senior Cloud Support Engineer, you will drive technical solutions to complex problems, adhere to response and resolution SLAs, and demonstrate good problem-solving skills. You will utilize the Snowflake environment, connectors, third-party partner software, and tools to investigate issues, document known solutions, and report bugs and feature requests. Partnering with engineering teams, you will prioritize and resolve customer requests, participate in a variety of Support initiatives, and provide support coverage during holidays and weekends based on business needs.

The ideal candidate for this role will have a Bachelor's or Master's degree in Computer Science or an equivalent discipline, along with 5+ years of experience in a Technical Support environment or a similar technical function in a customer-facing role. Solid knowledge of at least one major RDBMS, an in-depth understanding of SQL data types, aggregations, and advanced functions, and proficiency in database patch and release management are essential. Additionally, familiarity with distributed computing principles and frameworks, scripting/coding experience, database migration and ETL experience, and the ability to monitor and optimize cloud spending using cost management tools are considered nice-to-haves.

Special requirements for this role include participation in pager duty rotations during nights, weekends, and holidays, as well as the ability to work the 4th/night shift starting from 10 pm IST. Applicants should be flexible with schedule changes to meet business needs. Snowflake is a rapidly growing company, and we are looking for individuals who share our values, challenge ordinary thinking, and drive innovation while building a future for themselves and Snowflake. Join us in making an impact and accelerating our growth.
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Join us as a Product Test Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions.

To be successful as a Product Test Engineer, you should have hands-on experience in one or more technical skills under any of the technology platforms below:
- Mainframe: COBOL, IMS, CICS, DB2, VSAM, JCL, TWS, File-Aid, REXX
- Open Systems and tools: Selenium, Java, Jenkins, J2EE, web services, APIs, XML, JSON, Parasoft SOAtest service virtualization
- API Testing Tools: SoapUI, Postman, Insomnia
- Mid-Tier technology: MQ, WebSphere, UNIX, API third-party hosting platforms
- Data warehouse: ETL, Informatica, Ab Initio, Oracle, Hadoop
- Good knowledge of API architecture and API concepts
- Experience with JIRA and similar test management tools

Test Automation Skills:
- Hands-on experience of test automation using Java or any other object-oriented programming language
- Hands-on experience of automation framework creation and optimization
- Good understanding of Selenium, Appium, SeeTest, jQuery, JavaScript, and Cucumber
- Experience working with build tools like Apache Ant, Maven, and Gradle
- Knowledge or previous experience of DevOps and continuous integration using Jenkins, Git, and Docker
- Experience in API automation frameworks like REST Assured and Karate

Some other highly valued skills may include:
- End-to-end integration testing experience
- Previous Barclays experience
- Understanding of mainframes and Barclays systems will be an added advantage
- Understanding of cloud technologies like AWS and Azure
- Hands-on experience in Agile methodology
- Domain/testing/technical certification will be an advantage

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the Role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.
Accountabilities:
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyze requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.

Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise. If the position has leadership responsibilities, People Leaders lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources, and are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energize and inspire, A - Align across the enterprise, D - Develop others. An individual contributor, by contrast, develops technical expertise in the work area, acting as an advisor where appropriate, and will have an impact on the work of related teams within the area, partnering with other functions and business areas. They take responsibility for the end results of a team's operational processing and activities, escalate breaches of policies/procedures appropriately, and take responsibility for embedding new policies/procedures adopted due to risk mitigation. They advise and influence decision making within their area of expertise, take ownership for managing risk and strengthening controls in relation to the work they own or contribute to, and deliver work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. They maintain and continually build an understanding of how their own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function, and demonstrate an understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function. They resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents; guide and persuade team members; and communicate complex or sensitive information. They act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge, and Drive - the operating manual for how we behave.
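Although this posting is largely Java-centric, a minimal API test sketch is shown below in Python for brevity; the base URL, endpoints, and payload fields are hypothetical, and the pattern carries over directly to REST Assured in Java.

```python
# Runnable with pytest against a hypothetical service; endpoint paths
# and payload fields are illustrative assumptions.
import requests

BASE_URL = "https://api.example.com"

def test_create_customer_returns_201():
    payload = {"name": "Test User", "accountType": "SAVINGS"}
    response = requests.post(f"{BASE_URL}/customers", json=payload, timeout=10)
    assert response.status_code == 201
    body = response.json()
    # Verify the service echoes back the created resource.
    assert body["name"] == payload["name"]
    assert "id" in body

def test_unknown_customer_returns_404():
    response = requests.get(f"{BASE_URL}/customers/does-not-exist", timeout=10)
    assert response.status_code == 404
```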
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
NTT DATA is looking for an App. Software Dev. Prin. Cnslt. to join their team in Chennai, Tamil Nadu, India. As a potential candidate, you should have a minimum of 4 years of experience and possess strong skills in .NET Core/Razor with Web Forms. Additionally, you should have hands-on experience in SQL, exposure to ETL technologies, and a solid understanding of Hadoop/Hive.

NTT DATA is a trusted global innovator in business and technology services, with a commitment to helping clients innovate, optimize, and transform for long-term success. With a presence in over 50 countries and a diverse team of experts, NTT DATA offers services in business and technology consulting, data and artificial intelligence, and industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. As a leading provider of digital and AI infrastructure, NTT DATA is part of the NTT Group, which invests significantly in R&D to drive organizations and society towards a sustainable digital future.

If you are looking to be part of an inclusive, adaptable, and forward-thinking organization, and possess the required skills and experience, we encourage you to apply now and grow with NTT DATA. Visit us at us.nttdata.com to learn more about our company and the exciting opportunities we offer.
Posted 6 days ago
5.0 - 23.0 years
0 Lacs
Pune, Maharashtra
On-site
The ideal candidate for the role of People Analytics professional should have a strong background in transforming data into actionable insights to drive evidence-based HR decision-making. You will be responsible for designing, developing, and managing advanced dashboards and data visualizations using tools such as Tableau, Power BI, and other modern BI platforms. Building strong partnerships with key stakeholders across HR and the business is essential to deeply understand their challenges and translate their needs into actionable data solutions.

In this role, you will need to develop and implement statistical models and machine learning solutions for HR analytics, while managing end-to-end data workflows including extraction, transformation, and loading (ETL). You will be required to design and deliver regular and ad-hoc reports on key HR metrics, ensuring data accuracy through thorough testing and quality checks.

The successful candidate should have a Bachelor's degree in a related field, with a minimum of 5 years of experience in analytics, including specialization in people analytics and HR data analysis. Strong proficiency in RStudio/Python, SQL, data visualization tools such as Power BI or Tableau, machine learning, statistical analysis, and cloud platforms is required. Hands-on experience working with Oracle Cloud HCM data structures and reporting tools is highly desirable.

You should bring strong problem-solving skills, effective communication abilities to convey data insights through compelling storytelling, and experience managing multiple projects independently in fast-paced, deadline-driven environments. An entrepreneurial mindset and leadership experience are key to successfully leading high-visibility analytics projects and driving collaboration across teams and departments.

As a member of the Global People Analytics team, you will collaborate with key stakeholders within Talent Management, Talent Acquisition, Total Rewards, HR Services, and HR Information Systems to drive data-driven decision-making across the organization. This role offers an exciting opportunity to shape the future of people analytics, leverage advanced technologies, and contribute to high-impact, strategic HR initiatives.
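For a sense of the workflow described, here is a small, illustrative Python sketch of an HR metric prepared for a dashboard; the dataset and column names are assumptions, not details from the role.

```python
# Hypothetical HR dataset; shows the shape of a typical attrition metric
# feeding a Tableau/Power BI layer.
import pandas as pd

hc = pd.read_csv("headcount_monthly.csv")  # assumed columns: month, dept, headcount, exits

attrition = (
    hc.groupby(["month", "dept"], as_index=False)
      .agg(headcount=("headcount", "sum"), exits=("exits", "sum"))
)
attrition["attrition_rate"] = attrition["exits"] / attrition["headcount"]

# Export for the visualization layer to pick up.
attrition.to_csv("attrition_by_dept.csv", index=False)
```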
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
The role of a software engineer in Corporate Planning and Management (CPM) involves providing engineering solutions to facilitate budget planning, financial forecasting, expense allocation, spend management, third-party risk assessment, and corporate decision-making aligned with strategic objectives. As a software engineer in CPM Engineering, you will have the opportunity to contribute to the development and transformation of financial and spend management workflows, as well as the creation of intelligent reporting systems to drive commercial benefits for the firm. Working in small, agile teams, you will be at the forefront of impacting various aspects of corporate planning and management in a fast-paced environment.

To excel in this role, you should possess the following qualities:
- Demonstrate energy, self-direction, and motivation, while fostering long-term relationships with clients and colleagues.
- Approach problem-solving collaboratively within a team setting.
- Showcase exceptional analytical skills to deliver creative and commercially viable solutions through informed decision-making.
- Exhibit a strong willingness to learn and actively contribute innovative ideas to the team.
- Thrive in dynamic work environments, displaying independence and adaptability.
- Efficiently manage multiple tasks, demonstrating sound judgment in prioritization.
- Offer advanced financial products digitally to clients.
- Engage with a diverse, globally distributed cross-functional team to develop customer-centric products.
- Evaluate existing software systems for enhancement opportunities and provide estimates for new feature implementations.
- Maintain and update documentation related to team processes, best practices, and software runbooks.

Basic Qualifications:
- Minimum of 5 years of relevant professional experience.
- Bachelor's degree or higher in Computer Science or an equivalent field.
- 3+ years of experience in Java API development.
- Proficiency in React JS, HTML5, and Java.
- Strong written and verbal communication skills.
- Ability to establish trusted partnerships with product leaders and executive stakeholders.
- Hands-on experience in building transactional systems and a solid understanding of software architecture.
- Familiarity with integrating RESTful web services.
- Comfortable working in agile operating environments.

Preferred Qualifications:
- Knowledge of microservices architecture.
- Proficiency in React JS.
- Experience with Apache Spark, Hadoop, Hive, and Spring Boot frameworks.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
You should have a B.Tech/B.E/MSc/MCA qualification along with a minimum of 10 years of experience. As a Software Architect - Cloud, your responsibilities will include architecting and implementing AI-driven Cloud/SaaS offerings. You will be required to research and design new frameworks and features for various products, ensuring they meet high-quality standards and are designed for scale, resiliency, and efficiency. Additionally, motivating and assisting lead and senior developers in their professional and technical growth, contributing to academic outreach programs, and participating in company branding activities will be part of your role.

To qualify for this position, you must have experience in designing and delivering widely used enterprise-class SaaS applications, preferably in marketing technologies. Essential requirements also include knowledge of cloud computing infrastructure and AWS certification; hands-on experience with scalable distributed systems, AI/ML technologies, big data technologies, in-memory databases, caching systems, ETL tools, containerization solutions like Kubernetes, large-scale RDBMS deployments, and SQL optimization; and familiarity with Agile and Scrum development processes, Java, Spring technologies, Git, and DevOps practices.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
The Data Engineer will be responsible for designing, implementing, and maintaining the data infrastructure and pipelines necessary for AI/ML model training and deployment. You will work closely with data scientists and engineers to ensure data is clean, accessible, and efficiently processed. You should have 6-8 years of experience in data engineering, ideally in financial services. Strong proficiency in SQL, Python, and big data technologies (e.g., Hadoop, Spark) is required. Experience with cloud platforms (e.g., AWS, Azure, GCP) and data warehousing solutions is a must. Familiarity with ETL processes and tools, as well as knowledge of data governance, security, and compliance best practices, is essential.

Key Responsibilities:
- Build and maintain scalable data pipelines for data collection, processing, and analysis.
- Ensure data quality and consistency for training and testing AI models.
- Collaborate with data scientists and AI engineers to provide the required data for model development.
- Optimize data storage and retrieval to support AI-driven applications.
- Implement data governance practices to ensure compliance and security.

At GlobalLogic, we prioritize a culture of caring where people come first. You'll experience an inclusive culture of acceptance and belonging and build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development, offering opportunities to try new things, sharpen your skills, and advance your career. You will have the chance to work on projects that matter, engage your curiosity and problem-solving skills, and contribute to cutting-edge solutions shaping the world today. We support balance and flexibility in work and life, providing various career areas, roles, and work arrangements. Joining GlobalLogic means being part of a high-trust organization where integrity is key, and trust is fundamental to our relationships with employees and clients. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution, collaborating with clients to transform businesses and industries through intelligent products, platforms, and services.
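As a concrete, purely illustrative example of the data-quality responsibility above, a minimal PySpark validation gate might look like the sketch below; the paths, columns, and thresholds are invented assumptions.

```python
# Minimal data-quality gate: validate staged data before it reaches
# model training. Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()
trades = spark.read.parquet("s3://example-bucket/staging/trades/")

total = trades.count()
null_ids = trades.filter(F.col("trade_id").isNull()).count()
dupes = total - trades.dropDuplicates(["trade_id"]).count()

# Fail the pipeline run rather than let bad data through silently.
assert null_ids == 0, f"{null_ids} rows missing trade_id"
assert dupes / max(total, 1) < 0.01, f"duplicate ratio too high: {dupes}/{total}"

trades.write.mode("overwrite").parquet("s3://example-bucket/clean/trades/")
```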
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
Karnataka
On-site
We are looking for someone who is enthusiastic to contribute to the implementation of a metadata-driven platform managing the full lifecycle of batch and streaming Big Data pipelines. This role involves applying ML and AI techniques in data management, such as anomaly detection for identifying and resolving data quality issues, and data discovery. The platform facilitates the delivery of Visa's core data assets to both internal and external customers. You will provide Platform-as-a-Service offerings that are easy to consume, scalable, secure, and reliable, using open source-based cloud solutions for Big Data technologies. Working at the intersection of infrastructure and software engineering, you will design and deploy data and pipeline management frameworks using open-source components like Hadoop, Hive, Spark, HBase, Kafka streaming, and other Big Data technologies. Collaboration with various teams is essential to build and maintain innovative, reliable, secure, and cost-effective distributed solutions. Facilitating knowledge transfer to the Engineering and Operations teams, you will work on technical challenges and process improvements with geographically distributed teams.

Your responsibilities will include designing and implementing agile, innovative data pipeline and workflow management solutions that leverage technology advances for cost reduction, standardization, and commoditization. Driving the adoption of open-standard toolsets to reduce complexity and support operational goals for increasing automation across the enterprise is a key aspect of this role. As a champion for the adoption of open infrastructure solutions that are fit for purpose, you will keep technology relevant. The role involves spending 80% of the time writing code in different languages, frameworks, and technology stacks.

At Visa, your uniqueness is valued. Working here provides an opportunity to make a global impact, invest in your career growth, and be part of an inclusive and diverse workplace. Join our global team of disruptors, trailblazers, innovators, and risk-takers who are driving economic growth worldwide, moving the industry forward creatively, and engaging in meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers. This position is hybrid, and the expectation of days in the office will be confirmed by your hiring manager.

Basic Qualifications:
- Minimum of 6 months of work experience or a bachelor's degree
- Bachelor's degree in Computer Science, Computer Engineering, or a related field
- Good understanding of data structures and algorithms
- Good analytical and problem-solving skills

Preferred Qualifications:
- 1 or more years of work experience or an advanced degree (e.g., Master's) in Computer Science
- Excellent programming skills with experience in at least one of the following: Python, Node.js, Java, Scala, GoLang
- MVC (model-view-controller) for end-to-end development
- Knowledge of SQL/NoSQL technology and familiarity with databases like Oracle, DB2, SQL Server, etc.
- Proficiency in Unix-based operating systems and bash scripts
- Strong communication skills, including clear and concise written and spoken communications with professional judgment
- Team player with excellent interpersonal skills
- Demonstrated ability to lead and navigate through ambiguity
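A minimal sketch, assuming a hypothetical broker and topic, of the kind of Kafka-based streaming ingestion this platform role describes, written in PySpark Structured Streaming.

```python
# Illustrative streaming ingest; broker address, topic, and paths are
# assumptions. Requires the spark-sql-kafka package on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-ingest").getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers key/value as binary; cast to strings for downstream use.
parsed = raw.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    parsed.writeStream.format("parquet")
    .option("path", "/data/landing/events/")
    .option("checkpointLocation", "/data/checkpoints/events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```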
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Delhi
On-site
As an AVP, Marketing Technology Audience Analyst at Synchrony, you will play a crucial role in understanding, building, and tracking audiences across various platforms. Your primary focus will be on developing best practices for audience governance and supporting the broader audience strategy development.

Your responsibilities will include performing audience analyses using internal and external data sources to optimize current audience campaigns and shape future campaign strategies. You will create data personas, collaborate with marketing partners to identify trends and audience opportunities, and work with cross-functional teams to collect data for audience insights. Additionally, you will be responsible for building audiences, managing the workflow from CRM data onboarding to audience segment delivery for programmatic and personalization campaigns. You will establish partnerships with cross-functional teams to understand their business needs and goals, delivering processes and opportunities accordingly.

To qualify for this role, you should have a Bachelor's degree with at least 5 years of experience in mining and analyzing digital audience performance. Alternatively, a minimum of 7 years of relevant experience in the financial domain will be considered in lieu of a degree. You should have a strong background in enterprise-level data science, analytics, and customer intelligence, with at least 3 years of professional digital marketing experience.

Desired characteristics for this role include proficiency in data mining techniques and analytic programming languages such as Python, SQL, Java, and SAS. You should have leadership experience working with cross-functional partners and familiarity with analytic platforms and tools like Hadoop, R, Hive, and Tableau. Experience in areas such as probability and statistics, machine learning, and artificial intelligence will be advantageous.

As an ideal candidate, you should be able to execute analyses with massive data sets, collaborate effectively with diverse teams, and provide strategic recommendations based on data insights. You should be a creative thinker with a history of synthesizing insights to drive business decisions and lead strategic discussions.

If you meet the eligibility criteria and possess the required skills and experience, we encourage you to apply for this role. This is a Level 10 position, and the work timings are from 2:00 PM to 11:00 PM IST. For internal applicants, it is essential to understand the criteria and mandatory skills needed for the role before applying. Informing your manager and HRM, updating your professional profile, and ensuring your resume is up to date are crucial steps in the application process. Employees at Level 8 and above who meet the specified tenure requirements are eligible to apply.

Join us at Synchrony and be part of a dynamic team that drives ROI, elevates brand presence, and fosters a culture of innovation in the ever-evolving market landscape.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Software Engineer II at JPMorgan Chase within the Employee Platforms team, you will have the opportunity to enhance your software engineering career while working with a team of agile professionals. Your main responsibility will be to design and deliver cutting-edge technology products in a secure, stable, and scalable manner. You will play a crucial role in developing technology solutions across different technical areas to support the firm's business objectives.

Your key responsibilities will include executing innovative software solutions, developing high-quality production code, and identifying opportunities to enhance operational stability. You will lead evaluation sessions with external vendors and internal teams to drive architectural designs and technical applicability. Additionally, you will collaborate with various teams to drive feature development and produce documentation of cloud solutions.

To qualify for this role, you should have formal training or certification in software engineering concepts along with at least 2 years of practical experience. You must possess advanced skills in system design, application development, and testing. Proficiency in programming languages, automation, and continuous delivery methods is essential. An in-depth understanding of agile methodologies, such as CI/CD, Application Resiliency, and Security, is required. Knowledge of Python, Big Data technologies, and financial services industry IT systems will be advantageous.

Your success in this role will depend on your ability to innovate, collaborate with stakeholders, and excel in a diverse and improvement-focused environment. You should have a strong track record of technology implementation projects, along with expertise in software applications and technical processes within a technical discipline. Preferred skills include teamwork, initiative, and knowledge of financial instruments and specific programming languages like Core Java 8, Spring, JPA/Hibernate, and React JavaScript.
Posted 1 week ago
2.0 - 12.0 years
0 Lacs
Haryana
On-site
At American Express, the culture is built on a 175-year history of innovation, shared values, and Leadership Behaviors, with an unwavering commitment to supporting customers, communities, and colleagues. As a part of Team Amex, you will experience comprehensive support for your holistic well-being and numerous opportunities to learn new skills, develop as a leader, and advance your career. Your voice and ideas hold significance here, as your work creates an impact and contributes to defining the future of American Express.

Java Backend Developer: As a Java Backend Developer, you will serve as a core member of an agile team responsible for driving user story analysis and elaboration. You will design and develop responsive web applications using the best engineering practices. Your role will involve hands-on software development, including writing code, unit tests, proofs of concept, code reviews, and testing in ongoing sprints. Continuous improvement through ongoing code refactoring is essential. You will develop a deep understanding of integrations with other systems and platforms within the supported domains. Managing your time effectively, working both independently and as part of a team, is crucial. Bringing a culture of innovation, ideas, and continuous improvement is encouraged. Challenging the status quo, taking risks, and implementing creative ideas are key aspects of the role. Collaboration with product managers, back-end, and front-end engineers to implement versatile solutions to web development problems is expected, as is embracing emerging standards and promoting best practices and consistent framework usage.

Qualifications:
- BS or MS degree in computer science, computer engineering, or a related technical discipline
- Total experience: 3-12 years, with 2+ years of experience working in Java and demonstrating good Java knowledge
- Proficiency in Java 7 and Java 8 is preferred
- Demonstrated knowledge of web fundamentals and the HTTP protocol
- Positive attitude, effective communication skills, and a willingness to learn and collaborate
- 2+ years of development experience in Java applications within an enterprise setting
- Experience in developing Java applications using frameworks such as Spring, Spring Boot, and Dropwizard is a plus
- Proficiency in Test Driven Development (TDD) / Behavior Driven Development (BDD) practices and various testing frameworks
- Experience in continuous integration and continuous delivery environments
- Working experience in an Agile or SAFe development environment is advantageous

Data Engineer: As a Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines. Serving as a core member of an agile team, you will drive user story analysis, design, and development of responsive web applications. Collaborating closely with data scientists, analysts, and partners is essential to ensure seamless data flow. Building and optimizing reports for analytical and business purposes, monitoring and resolving data pipeline issues, implementing data quality checks and validation processes, and applying data governance policies, access controls, and security measures are all part of your responsibilities. Developing a deep understanding of integrations with other systems and platforms, fostering a culture of innovation, ideas, and continuous improvement, challenging the status quo, and taking risks to implement creative ideas are key aspects of the role.
Managing your time effectively, working independently and as part of a team, adopting emerging standards, and promoting best practices and consistent framework usage are crucial. Collaborating with Product Owners to define requirements for new features and plan increments of work is also expected.

Qualifications:
- BS or MS degree in computer science, computer engineering, or a related technical subject area
- 3+ years of work experience
- At least 5 years of hands-on experience with SQL, including schema design, query optimization, and performance tuning
- Experience with distributed computing frameworks such as Hadoop, Hive, and Spark for processing large-scale data sets
- Proficiency in programming languages like Python and PySpark for building data pipelines and automation scripts
- Understanding of cloud computing and exposure to cloud services like GCP, AWS, or Azure
- Knowledge of CI/CD, Git commands, and deployment processes
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues and optimize data processing workflows
- Excellent communication and collaboration skills

American Express offers benefits that support your holistic well-being, including competitive base salaries, bonus incentives, financial well-being and retirement support, comprehensive medical, dental, vision, life insurance, and disability benefits, flexible working models, paid parental leave, access to wellness centers, counseling support, and career development and training opportunities. The offer of employment with American Express is subject to the successful completion of a background verification check, as per applicable laws and regulations.
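To illustrate the Hive/Spark pattern mentioned in the Data Engineer qualifications, here is a short PySpark sketch; the database, table, and columns are hypothetical.

```python
# Hive-on-Spark sketch with hypothetical table and column names.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("txn-summary")
    .enableHiveSupport()   # lets Spark read Hive metastore tables
    .getOrCreate()
)

# Filtering on the partition column (txn_date) lets Hive prune partitions
# instead of scanning the whole table -- a common performance-tuning step.
summary = spark.sql("""
    SELECT txn_date, merchant_category,
           COUNT(*)    AS txn_count,
           SUM(amount) AS total_amount
    FROM analytics.card_transactions
    WHERE txn_date >= '2024-01-01'
    GROUP BY txn_date, merchant_category
""")

summary.write.mode("overwrite").saveAsTable("analytics.daily_txn_summary")
```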
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler Job Description: Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise include:
- Analyze and translate business needs into long-term solution data models.
- Evaluate existing data systems and recommend improvements.
- Define rules to translate and transform data across data models.
- Work with the development team to create conceptual data models and data flows.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Implement data strategies and develop physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to enhance data system efficiency.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems to ensure optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
Posted 1 week ago
5.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler Job Description: Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise include:
- Analyze and translate business needs into long-term solution data models.
- Evaluate existing data systems and recommend improvements.
- Define rules to translate and transform data across data models.
- Work with the development team to create conceptual data models and data flows.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Implement data strategies and develop physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to enhance data system efficiency.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems to ensure optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
Posted 1 week ago
7.0 - 10.0 years
0 Lacs
Greater Kolkata Area
Remote
Job Title: Senior Data Scientist
Location: Remote
Department: Data Science / Analytics / AI & ML
Experience: 7-10 years
Employment Type:

Summary: We are seeking an experienced and highly motivated Senior Data Scientist with 7-10 years of industry experience to lead advanced analytics initiatives and drive data-driven decision-making across the organization. The ideal candidate will be skilled in statistical modeling, machine learning, and data engineering, with a strong business sense and the ability to mentor junior team members.

Responsibilities:
- Lead end-to-end data science projects from problem definition through model deployment.
- Build, evaluate, and deploy machine learning models and statistical algorithms to solve complex business problems.
- Collaborate with cross-functional teams, including Product, Engineering, and Business stakeholders, to integrate data science solutions.
- Work with large, complex datasets using modern data tools (e.g., Spark, SQL, Airflow).
- Translate complex analytical results into actionable insights and present them to non-technical audiences.
- Mentor junior data scientists and provide technical guidance.
- Stay current with the latest trends in AI/ML, data science, and data engineering.
- Ensure reproducibility, scalability, and performance of machine learning systems in production.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, Engineering, or a related field. A PhD is a plus.
- 7-10 years of experience in data science, machine learning, or applied statistics roles.
- Strong programming skills in Python and/or R; proficiency in SQL.
- Deep understanding of statistical techniques, hypothesis testing, and predictive modeling.
- Hands-on experience with ML libraries such as scikit-learn, TensorFlow, PyTorch, XGBoost, etc.
- Familiarity with data processing tools like Spark, Hadoop, or equivalent.
- Experience deploying models into production environments (APIs, MLOps, CI/CD pipelines).
- Excellent communication skills and the ability to convey technical insights to business audiences.
- Experience working in cloud environments such as AWS, GCP, or Azure.
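As a hedged example of the train-evaluate-deploy loop described above, a minimal Python sketch follows; the dataset, features, and model choice are illustrative assumptions.

```python
# Train, evaluate, and persist a model for a serving layer to load.
# The dataset and feature columns are invented for illustration.
import joblib
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("customer_events.csv")  # assumed layout
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=7
)

model = GradientBoostingClassifier().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")

# Persist the fitted model so a serving API (e.g., FastAPI) can load it.
joblib.dump(model, "churn_model.joblib")
```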
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Senior Programmer Analyst position is an intermediate-level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your primary objective will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will be responsible for monitoring and controlling all phases of the development process, including analysis, design, construction, testing, and implementation. Additionally, you will provide user and operational support on applications to business users. You will utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgments. Furthermore, you will recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. You will also consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems.

As the Applications Development Senior Programmer Analyst, you will ensure that essential procedures are followed, help define operating standards and processes, and serve as an advisor or coach to new or lower-level analysts. You will be able to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a subject matter expert to senior stakeholders and/or other team members.

In this role, you will appropriately assess risk when business decisions are made, demonstrate particular consideration for the firm's reputation, and safeguard Citigroup, its clients, and assets by driving compliance with applicable laws, rules, and regulations. You must have strong analytical and communication skills and be results-oriented, willing, and able to take ownership of engagements. Additionally, experience in the banking domain is a must.

Qualifications:

Must have:
- 8+ years of application/software development and maintenance experience
- 5+ years of experience with Big Data technologies like Apache Spark, Hive, and Hadoop
- Knowledge of the Python, Java, or Scala programming languages
- Experience with Java, web services, XML, JavaScript, microservices, SOA, etc.
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Ability to work independently, multi-task, and take ownership of various analyses or reviews

Good to have:
- Work experience in Citi or regulatory reporting applications
- Hands-on experience with cloud technologies, AI/ML integration, and the creation of data pipelines
- Experience with vendor products like Tableau, Arcadia, Paxata, and KNIME
- Experience with API development and the use of data formats

Education:
- Bachelor's degree/University degree or equivalent experience

This is a high-level overview of the job responsibilities and qualifications. Other job-related duties may be assigned as required.
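For flavour, here is a small, illustrative PySpark sketch of a reconciliation check of the kind regulatory-reporting pipelines often need; the table names and keys are assumptions, not details from the posting.

```python
# Reconciliation between a source table and its downstream report copy;
# database, table, and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("recon").enableHiveSupport().getOrCreate()

source = spark.table("staging.gl_postings")
reported = spark.table("reporting.gl_postings")

# An anti-join surfaces postings present in the source but missing downstream.
missing = source.join(reported, on="posting_id", how="left_anti")
missing_count = missing.count()

if missing_count > 0:
    missing.select("posting_id", "amount").show(20, truncate=False)
    raise RuntimeError(f"{missing_count} postings missing from the report")
```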
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.

Location: Gurugram
Experience: 3-8 years (Mainframe Developer Associate: 3-6 years)
Qualification: Any
Role: Mainframe Developer / Data Engineer / AWS Specialist / Business Analyst (Insurance background)
Skills: Mainframe (COBOL, JCL, DB2, VSAM, IMS, CICS)
Mandatory skill sets: Captives (GCC) - Mainframe Developer / Data Engineer / Business Analyst / AWS Specialist
Preferred skill sets: Hadoop, Hive, SQL, Spark; CDC tools with AWS services (DMS, Glue, Glue Catalog, Athena, S3, Lake Formation); ETL concepts and SQL; BA from the Insurance domain; Mainframe (COBOL, JCL, DB2, VSAM, IMS, CICS); Python, AWS API Gateway, AWS Lambda, SNS, S3, SQS, Event Notification, EventBridge, CloudWatch, DynamoDB, AWS SAM, VPC, Security, CloudFormation/Terraform, IAM, AWS CLI, etc.
Years of experience required: 2-5 years
Qualifications: Any
Required Skills: Business Analytics, Data Engineering, Mainframe Development
Optional Skills: Apache Hadoop, Apache Hive, Apache Spark, Structured Query Language (SQL), Virtual Private Cloud
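As an illustration of the serverless AWS stack named in the preferred skill sets, a small sketch of a Lambda handler wired to S3 event notifications and SNS; the bucket, topic ARN, and payload fields are placeholders, not from the posting:

```python
# Sketch of an AWS Lambda handler for an S3 event notification:
# inspects the newly arrived object and publishes a message to SNS.
# The record shape follows the standard S3 event structure; all
# resource names are hypothetical.
import json
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:file-arrivals"  # placeholder

def lambda_handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Fetch object metadata without downloading the body.
        head = s3.head_object(Bucket=bucket, Key=key)
        sns.publish(
            TopicArn=TOPIC_ARN,
            Message=json.dumps(
                {"bucket": bucket, "key": key, "size": head["ContentLength"]}
            ),
        )
    return {"statusCode": 200}
```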
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
As an Advisor, Statistical Analysis at Fiserv, you will utilize your expertise in Computer Science, Data Science, Statistics, Mathematics, or related fields to drive impactful insights. Your responsibilities will involve leveraging your proficiency in Python, SQL, and data science libraries such as pandas, scikit-learn, TensorFlow, and PyTorch to analyze and interpret data effectively.

To excel in this role, you should have hands-on experience with big data technologies such as Spark and Hadoop, as well as cloud platforms including AWS, GCP, and Azure. Your strong problem-solving skills will be crucial in tackling complex challenges, and your ability to thrive in a dynamic and collaborative work environment will contribute to the success of our team.

If you are a forward-thinker with a passion for innovation and are seeking a rewarding career where you can make a difference, Fiserv is the place for you. Join us in shaping the future of financial technology and unlock your full potential.

To apply for this position, please submit your application using your legal name. Complete the step-by-step profile and attach your resume to be considered for this exciting opportunity. We appreciate your interest in becoming part of the Fiserv team.

At Fiserv, we are committed to fostering an inclusive and diverse workplace where every individual is valued and respected. We believe that diversity drives innovation, and we embrace the unique perspectives and experiences of our employees.

Please note that Fiserv does not accept resume submissions from agencies without existing agreements. Kindly refrain from sending resumes to Fiserv associates, as we are not liable for any fees related to unsolicited submissions. We also caution applicants to be vigilant against fake job posts that are not affiliated with Fiserv; these fraudulent postings may be used by cybercriminals to obtain personal information or financial data. Any communication from a Fiserv representative will come from a legitimate Fiserv email address to ensure transparency and security throughout the hiring process.
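A small illustrative example of the pandas-based statistical analysis this role calls for; the SciPy t-test is an assumed addition, since the posting names pandas but not a specific statistics library, and the data and column names are made up:

```python
# Illustrative analysis: compare spend between two experiment groups
# with pandas and a two-sample t-test. All names are placeholders.
import pandas as pd
from scipy import stats   # SciPy is an assumption, not named in the posting

df = pd.read_csv("transactions.csv")   # hypothetical input

print(df.groupby("variant")["spend"].describe())

control = df.loc[df["variant"] == "A", "spend"]
treatment = df.loc[df["variant"] == "B", "spend"]

# Welch's t-test (unequal variances) on mean spend per variant.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```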
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
As a highly skilled Senior Data Scientist, you will bring expertise in Python, Machine Learning (ML), Natural Language Processing (NLP), Generative AI (GenAI), and Azure Cloud Services to our team. Your primary responsibility will be to design, develop, and deploy advanced AI/ML models that support data-driven decision-making. You must possess strong analytical skills, proficiency in AI/ML technologies, and hands-on experience with cloud-based solutions.

Your key responsibilities will include designing and developing ML, NLP, and GenAI models to address complex business challenges. You will build, train, and optimize AI models using Python and various ML frameworks, implement Azure AI/ML services for scalable deployment, develop and integrate APIs for real-time model inference and decision-making, collaborate with cross-functional teams, and ensure adherence to software engineering best practices and Agile methodologies.

To excel in this role, you must stay updated on cutting-edge AI/ML advancements, conduct research on emerging trends, and contribute to the development of innovative solutions. Providing technical mentorship to junior data scientists, optimizing model performance in production environments, and continuously enhancing models and algorithms will also be part of your responsibilities.

The required skills and qualifications include proficiency in Python and ML frameworks such as TensorFlow, PyTorch, or scikit-learn. Hands-on experience with NLP techniques, expertise in Generative AI models, strong knowledge of Azure AI/ML services, and familiarity with CI/CD pipelines are essential. Additionally, you should have a strong understanding of software engineering principles, experience working in an Agile development environment, excellent problem-solving skills, and a background in statistical analysis, data mining, and data visualization.

Preferred qualifications include experience in MLOps, knowledge of vector databases and retrieval-augmented generation techniques, exposure to big data processing frameworks, and familiarity with Graph Neural Networks and recommendation systems. Strong communication skills for conveying complex ideas to technical and non-technical stakeholders, along with experience in AutoML frameworks and hyperparameter tuning strategies, will be advantageous.

This is a full-time or part-time permanent position with benefits such as health insurance and provident fund. The work schedule includes day shifts from Monday to Friday, with weekend availability. An additional performance bonus will be provided based on your contribution to the team.

If you have at least 8 years of experience with Python, Azure AI/ML services, and ML, NLP, and GenAI models in Senior Data Scientist roles, we encourage you to apply for this opportunity. The work location is in person to foster collaboration and innovation within the team.
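To give a concrete flavor of the NLP work described, a minimal sketch using the Hugging Face transformers library; the library choice is an assumption, since the posting names NLP techniques broadly rather than this specific toolkit:

```python
# Illustrative NLP snippet: sentiment classification with a pretrained
# model via the transformers pipeline API. Inputs are made-up examples.
from transformers import pipeline

# Downloads a default pretrained sentiment model on first use.
classifier = pipeline("sentiment-analysis")

reviews = [
    "The onboarding flow was smooth and fast.",
    "Support never responded to my ticket.",
]

for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8} ({result['score']:.2f})  {review}")
```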
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
As a Data Engineer specializing in supply chain applications, you will play a crucial role in the Supply Chain Analytics team at NovintiX, based in Coimbatore, India. Your primary responsibility will be to design, develop, and optimize scalable data solutions that support logistics, procurement, and demand planning.

Your key responsibilities will include building and enhancing data pipelines for inventory, shipping, and procurement data, integrating data from ERP, PLM, and third-party sources, and creating APIs to facilitate seamless data exchange. Additionally, you will design and maintain enterprise-grade data lakes and warehouses while ensuring high standards of data quality, integrity, and security.

Collaborating with stakeholders, you will develop reporting dashboards using tools such as Power BI, Tableau, or QlikSense to support supply chain decision-making through data-driven insights. You will also build data models and algorithms for demand forecasting and logistics optimization, leveraging ML libraries and concepts for predictive analysis.

Your role will involve cross-functional collaboration with supply chain, logistics, and IT teams, translating complex technical solutions into business language to drive operational efficiency. Implementing robust data governance frameworks and ensuring data compliance and audit readiness will also be essential aspects of the job.

To qualify for this position, you should have at least 7 years of experience in Data Engineering, a Bachelor's degree in Computer Science/IT or a related field, and expertise in technologies such as Python, Java, SQL, Spark SQL, Hadoop, PySpark, NoSQL, Power BI, Tableau, QlikSense, Azure Data Factory, Azure Databricks, and AWS. Strong collaboration and communication skills, along with experience in fast-paced, agile environments, are also desired.

This is a full-time position based in Coimbatore, Tamil Nadu, requiring in-person work. If you are passionate about leveraging data to drive supply chain efficiency and are ready to take on this exciting challenge, please send your resume to shanmathi.saravanan@novintix.com before the application deadline on 13/07/2025.
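As a sketch of the demand-forecasting responsibility mentioned above, a simple lag-feature baseline built with pandas and scikit-learn; the file and column names are placeholders, not from the posting:

```python
# Simple demand-forecasting baseline: lagged demand as predictors for
# a linear regression. All data and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

demand = pd.read_csv("weekly_demand.csv", parse_dates=["week"])
demand = demand.sort_values("week")

# Lag features: last week's demand and the same week one month back.
demand["lag_1"] = demand["units"].shift(1)
demand["lag_4"] = demand["units"].shift(4)
demand = demand.dropna()

train = demand.iloc[:-8]          # hold out the last 8 weeks for testing
test = demand.iloc[-8:]

model = LinearRegression().fit(train[["lag_1", "lag_4"]], train["units"])
test = test.assign(forecast=model.predict(test[["lag_1", "lag_4"]]))

# Mean absolute percentage error on the hold-out window.
mape = (abs(test["units"] - test["forecast"]) / test["units"]).mean() * 100
print(f"8-week hold-out MAPE: {mape:.1f}%")
```

A baseline like this is typically the starting point before moving to richer models for the logistics-optimization work the role describes.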
Posted 1 week ago