Jobs
Interviews

6187 Scala Jobs - Page 3

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary
We are seeking a results-driven and technically proficient Business Intelligence (BI) Officer to join our Data & Analytics team. This role requires strong expertise in ETL processes, data warehousing, and data visualization, along with hands-on experience in both on-premise and AWS cloud BI environments. The ideal candidate should possess a deep understanding of modern BI tools and have domain exposure to sectors like banking, financial services, telecom, government, aviation, or technology consulting.

Must-Have Skills (Mandatory)
- Strong experience in ETL development and workflow design
- Proficiency in data warehousing concepts, data modeling, and architecture
- Hands-on experience with SQL, PL/SQL, and Python for data manipulation
- Experience with big data tools such as Spark, Hive, Kafka, HDFS, and NiFi
- Proficiency in data visualization tools like Power BI and Tableau
- Familiarity with data governance, data validation, and security standards
- Experience with structured and unstructured datasets

Good-to-Have Skills (Optional)
- Experience with NoSQL databases and traditional RDBMS
- Familiarity with Scala and data lake architectures
- Understanding of CI/CD practices in BI solution deployments
- Relevant certifications in AWS, BI tools, or data engineering

Qualifications & Experience
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field
- Minimum of 6 years of hands-on experience in BI development or data engineering
- Experience working in on-premise as well as AWS cloud environments
- Exposure to one or more of the following sectors is highly preferred: banking / financial services, telecommunications, government / public sector, aviation, technology / IT consulting
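For a flavor of the ETL work such roles describe, here is a minimal extract-transform-load sketch in plain Scala. The record type, sample rows, and cleaning rules are invented for illustration; real pipelines would read from files or queues and load into a warehouse rather than an in-memory map.

```scala
// Minimal ETL sketch: extract raw CSV-like rows, transform (validate and
// normalize), and load into an in-memory "warehouse" keyed by id.
// All names and rules here are illustrative assumptions, not from any posting.
object EtlSketch {
  case class Customer(id: Int, name: String, country: String)

  // Extract: raw lines as they might arrive from a file or message queue.
  val rawRows: List[String] = List(
    "1,Asha, india",
    "2,Ravi,INDIA",
    "bad row",
    "3,Meera,usa"
  )

  // Transform: parse, reject malformed rows, normalize casing.
  def parse(line: String): Option[Customer] =
    line.split(",").map(_.trim) match {
      case Array(id, name, country) if id.nonEmpty && id.forall(_.isDigit) =>
        Some(Customer(id.toInt, name, country.toUpperCase))
      case _ => None // data validation: drop rows that do not fit the schema
    }

  // Load: key the cleaned records by id.
  def run(): Map[Int, Customer] =
    rawRows.flatMap(parse).map(c => c.id -> c).toMap
}
```

Running `EtlSketch.run()` keeps the three well-formed rows and silently drops the malformed one; a production pipeline would route rejects to a quarantine table instead.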

Posted 1 day ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Fusionpact is looking for a motivated and talented Scala and Akka Intern to join our remote team. This 12-month internship is a perfect opportunity for fresh MCA graduates with a solid foundation in the Java stack to kickstart a career in functional programming. You'll work alongside experienced engineers, gaining hands-on experience in building scalable and resilient applications using cutting-edge technologies. We're committed to your growth, and a successful internship will lead to a full-time position with a competitive salary.

Responsibilities
- Assist in the design, development, and testing of software applications using Scala and Akka.
- Collaborate with the development team to understand project requirements and deliver high-quality solutions.
- Write clean, efficient, and well-documented code.
- Participate in code reviews and contribute to a culture of continuous improvement.
- Debug and resolve technical issues in a timely manner.
- Stay updated with the latest industry trends and technologies to enhance your skills.

Qualifications and Skills
- Education: a completed MCA degree with a minimum CGPA of 8.0.
- Primary skills: strong understanding of the Java stack, including Core Java, the Spring Framework, and related technologies.
- Eagerness to learn: a genuine passion for learning new technologies, particularly Scala and Akka.
- Problem-solving: excellent analytical and problem-solving abilities.
- Communication: strong verbal and written communication skills to collaborate effectively with a remote team.
- Self-starter: ability to work independently and manage your time effectively in a remote work environment.

Why Join Fusionpact?
- Growth-oriented environment: we provide a supportive and challenging environment that fosters professional growth.
- Mentorship: you will be mentored by senior developers who will guide you in mastering Scala and Akka.
- Career path: a successful internship leads to a full-time role with a CTC of ₹3.75 lakhs per annum.
- Remote work: enjoy the flexibility and convenience of a fully remote position.
- Competitive stipend: you will receive a monthly stipend of ₹5,000 during your internship.

If you're ready to take the next step in your career and dive into the world of functional programming with Scala and Akka, we encourage you to apply!
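For applicants coming from imperative Java, here is a small, self-contained sketch of the style this kind of role leads toward: immutable state and message-driven updates. It uses only plain Scala (no Akka dependency), and the counter example is invented for illustration; Akka's typed actors follow a similar "state plus message yields next state" shape.

```scala
// Actor-style message handling sketched with plain Scala collections:
// state is never mutated in place; each message produces a new state.
// The Msg hierarchy and counter logic are illustrative assumptions.
object ActorSketch {
  sealed trait Msg
  case class Increment(by: Int) extends Msg
  case object Reset extends Msg

  // A pure handler: current state plus a message yields the next state.
  def handle(state: Int, msg: Msg): Int = msg match {
    case Increment(by) => state + by
    case Reset         => 0
  }

  // "Delivering" a mailbox of messages is just a fold over the handler.
  def run(mailbox: List[Msg]): Int = mailbox.foldLeft(0)(handle)
}
```

For example, `ActorSketch.run(List(Increment(2), Increment(3), Reset, Increment(1)))` processes the mailbox in order and ends at 1.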

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chandigarh

On-site

bebo Technologies is a leading complete software solution provider. bebo stands for 'be extension be offshore'. We are a business partner of QASource, Inc., USA (www.QASource.com). We offer outstanding services in the areas of software development, sustenance engineering, quality assurance, and product support. bebo is dedicated to providing high-caliber offshore software services and solutions. Our goal is to 'Deliver in time - every time'. For more details, visit our website: www.bebotechnologies.com. Take a 360° tour of our bebo premises via the link below: https://www.youtube.com/watch?v=S1Bgm07dPmM

Key Skill Set Required:
- 5-7 years of software development experience, with 3+ years focused on building ML systems.
- Advanced programming skills in Python; working knowledge of Java, Scala, or C++ for backend services.
- Proficiency with ML frameworks: TensorFlow, PyTorch, Scikit-learn.
- Experience deploying ML solutions in cloud environments (AWS, GCP, Azure) using tools like SageMaker, Vertex AI, or Databricks.
- Strong grasp of distributed systems, CI/CD for ML, containerization (Docker/K8s), and serving frameworks.
- Deep understanding of algorithms, system design, and data pipelines.
- Experience with MLOps platforms (MLflow, Kubeflow, TFX) and feature stores.
- Familiarity with LLMs, RAG architectures, or multimodal AI.
- Experience with real-time data and streaming systems (Kafka, Flink, Spark Streaming).
- Exposure to governance/compliance in regulated industries (e.g., healthcare, finance).
- Published research, patents, or contributions to open-source ML tools is a strong plus.

Posted 1 day ago

Apply

5.0 years

4 - 9 Lacs

Hyderabad

Remote

Data Engineer - Remote role based in India.

Note: This is a full-time, remote, salaried position through Red Elk Consulting, LLC, based in India. This role is 100% focused on and dedicated to supporting Together Labs, as a consultant, and includes salary, benefits, vacation, and a local India-based support team.

We are seeking an experienced and motivated Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines, managing our data warehouse infrastructure, and supporting analytics initiatives across the organization. You will work closely with data scientists, analysts, and other stakeholders to ensure data quality, integrity, and accessibility, enabling the organization to make data-driven decisions.

RESPONSIBILITIES
- Design and Develop Data Pipelines: Architect, develop, and maintain robust and scalable data pipelines for ingesting, processing, and transforming large volumes of data from multiple sources in real-time and batch modes.
- Data Warehouse Management: Manage, optimize, and maintain the data warehouse infrastructure, ensuring data integrity, security, and availability. Oversee the implementation of best practices for data storage, partitioning, indexing, and schema design.
- ETL Processes: Design and build efficient ETL (Extract, Transform, Load) processes to move data across various systems while ensuring high performance, reliability, and scalability.
- Data Integration: Integrate diverse data sources (structured, semi-structured, and unstructured data) into a unified data model that supports analytics and reporting needs.
- Support Analytics and BI: Collaborate with data analysts, data scientists, and business intelligence teams to understand data requirements and provide data sets, models, and solutions that support their analytics needs.
- Data Quality and Governance: Establish and enforce data quality standards, governance policies, and best practices. Implement monitoring and alerting to ensure data accuracy, consistency, and completeness.
- Operational Excellence: Drive the development of automated systems for provisioning, deployment, monitoring, failover, and recovery. Implement systems to monitor key performance metrics, logs, and alerts with a focus on automation and reducing manual intervention.
- Cross-functional Collaboration: Work closely with product, engineering, and QA teams to ensure the infrastructure supports and enhances development workflows and that services are deployed and operated smoothly at scale.
- Incident Management & Root Cause Analysis: Act as a first responder to data production issues, leading post-mortems and implementing long-term solutions to prevent recurrence. Ensure all incidents are handled promptly with a focus on minimizing impact.
- Security & Compliance: Ensure our infrastructure is designed with security best practices in mind, including encryption, access control, and vulnerability scanning.
- Continuous Improvement: Stay up to date with industry trends, technologies, and best practices, bringing innovative ideas into the team to improve reliability, performance, and scale.

QUALIFICATIONS
Education & Experience:
- Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 5+ years of experience in data engineering, with a strong background in systems architecture, distributed systems, cloud infrastructure, or a related field.
- Proven experience building and managing data pipelines, data warehouses, and ETL processes.

Technical Skills:
- Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, Oracle) and data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
- Expertise in data pipeline tools and frameworks (e.g., AWS Glue, Google Dataflow, Apache Airflow, Apache NiFi, dbt).
- Hands-on experience with cloud platforms and their data services (e.g., AWS, Azure, Google Cloud Platform).
- Proficiency in programming languages such as Python, Java, or Scala for data manipulation and automation.
- Knowledge of data modeling, schema design, and data governance principles.
- Familiarity with distributed data processing frameworks like Apache Spark, Hadoop, or similar.
- Experience with BI tools (e.g., Tableau, Power BI, Looker).
- Experience with AWS and standard practices for working in cloud-based environments.

Soft Skills:
- Strong problem-solving and analytical skills with keen attention to detail.
- Excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders.
- Proactive mindset with the ability to work independently and handle multiple tasks in a fast-paced environment.

ABOUT US
Together Labs innovates technologies that empower people worldwide to connect, create, and earn in virtual worlds. Our mission is to redefine social media as a catalyst for authentic human connection through the development of a family of products grounded in this core value. These include IMVU, the world's largest friendship discovery and social platform, and VCOIN, the first regulatory-approved transferable digital currency. For more information, please visit https://togetherlabs.com/

Founded in 2004 and based in the heart of Silicon Valley, Together Labs is led by a team that's dedicated to pioneering in virtual worlds. Together Labs is backed by venture investors Allegis Capital, Bridgescale Partners, and Best Buy Capital. Together Labs (formerly IMVU) has been named a Best Place to Work in Silicon Valley for nine years running.

HOW TO APPLY
Please familiarize yourself with our products and feel free to try out our core product at https://www.imvu.com/

Together Labs is an equal opportunity employer and is committed to fostering a culture of inclusion. Our unique differences enable us to learn, collaborate, and grow together. We welcome all applicants without regard to race, color, religious creed, sex, national origin, citizenship status, age, physical or mental disability, sexual orientation, gender identification, marital, parental, veteran or military status, unfavorable military discharge, decisions regarding reproductive health, or any other status protected by applicable federal, state, or local law. This is a remote position.
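The data-quality duties such postings describe (validation, monitoring for completeness and consistency) can be sketched in a few lines of plain Scala. The rule names, record type, and checks below are illustrative assumptions, not from the posting; a real system would run such checks per batch and feed failures into alerting.

```scala
// Data-quality sketch: run simple completeness/validity rules over a batch
// of records and report which rules failed, for alerting. Illustrative only.
object QualityChecks {
  case class Row(orderId: String, amount: Option[Double])

  // A rule is a human-readable name plus a predicate over the whole batch.
  type Rule = (String, Seq[Row] => Boolean)

  val rules: List[Rule] = List(
    ("non-empty batch", rows => rows.nonEmpty),
    ("no missing amounts", rows => rows.forall(_.amount.isDefined)),
    ("ids are unique", rows => rows.map(_.orderId).distinct.size == rows.size)
  )

  // Returns the names of the rules that failed on this batch.
  def failures(rows: Seq[Row]): List[String] =
    rules.collect { case (name, check) if !check(rows) => name }
}
```

A batch with a duplicate id and a missing amount would report both rule names; a clean batch reports none, so an alert can fire exactly when the failure list is non-empty.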

Posted 1 day ago

Apply

8.0 years

0 Lacs

Hyderabad

On-site

- 8+ years of experience and strong knowledge of the Scala programming language; able to write clean, maintainable, and efficient Scala code following best practices.
- Good knowledge of fundamental data structures and their usage.
- 8+ years of experience in designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies, with expertise in Spark Core, Spark SQL, and Spark Streaming.
- Experience with Hadoop, HDFS, Hive, and other big data technologies.
- Familiarity with data warehousing and ETL concepts and techniques.
- Expertise in database concepts and SQL/NoSQL operations.
- UNIX shell scripting is an added advantage for scheduling/running application jobs.
- 8+ years of experience in project development life cycle activities and maintenance/support projects.
- Work in an Agile environment, participating in daily scrum standups, sprint planning, reviews, and retrospectives.
- Understand project requirements and translate them into technical solutions that meet project quality standards.
- Ability to work in a team in a diverse, multiple-stakeholder environment and collaborate with upstream/downstream functional teams to identify, troubleshoot, and resolve data issues.
- Strong problem-solving and good analytical skills.
- Excellent verbal and written communication skills.
- Experience with, and desire to work in, a global delivery environment.
- Stay up to date with new technologies and industry trends in development.

Job Types: Full-time, Permanent, Contractual / Temporary
Pay: ₹5,000.00 - ₹9,000.00 per day
Work Location: In person
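As a flavor of the Spark-style transformations such roles center on, here is a word count written against plain Scala collections. The same map/group/reduce shape carries over almost directly to Spark's RDD API (there it would read roughly `textFile(...).flatMap(...).map(w => (w, 1)).reduceByKey(_ + _)`); this is an illustrative sketch, not code from the posting.

```scala
// Word count over plain Scala collections, mirroring the shape of a
// typical Spark pipeline: tokenize, group by key, aggregate per key.
object WordCount {
  def count(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.toLowerCase.split("\\s+")) // tokenize each line
      .filter(_.nonEmpty)                   // drop empty tokens
      .groupBy(identity)                    // analogous to a shuffle by key
      .map { case (word, occs) => word -> occs.size } // per-key aggregation
}
```

For example, `WordCount.count(Seq("spark scala", "Spark streaming"))` counts "spark" twice because tokens are lowercased before grouping.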

Posted 1 day ago

Apply

1.0 years

1 - 5 Lacs

Hyderabad

On-site

Are you looking for an opportunity to join a team of engineers positively affecting the experience of every consumer who uses Microsoft products? The OSSE team in the OPG group is focused on building client experiences and services that light up Microsoft Account experiences across all devices and platforms. We are passionate about working together to build delightful and inclusive account experiences that empower customers to get the most out of what Microsoft has to offer. We're looking for a collaborative, inclusive, and customer-obsessed engineer to help us build and sustain authentication experiences like passkeys, as well as engage with our customers by building experiences that help users keep their account secure and connected across multiple devices and applications. We're looking for an enthusiastic Software Engineer to help us build account experiences and deliver business intelligence through data for experiences across 1.5 billion Windows devices and various Microsoft products. Your responsibilities will include working closely with a variety of teams such as Engineering, Program Management, Design, and application partners to understand the key business questions for customer-facing scenarios, set up the key performance indicators, and build data pipelines to identify insights and experiment ideas that move our business metrics.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
- Enable the Windows, Developers, and Experiences team to do more with data across all aspects of the development lifecycle.
- Contribute to a data-driven culture as well as a culture of experimentation across the organization.
- Provide new platform offerings, and improve upon existing ones, with a fundamental understanding of the end-to-end scenarios.
- Collaborate with partner teams and customers to scope and deliver projects.
- Write secure, reliable, scalable, and maintainable code, and then effectively debug it, test it, and support it.
- Author and design big data ETL pipelines in SCOPE, Scala, SQL, Python, or C#.

Qualifications
Required Qualifications:
- Bachelor's degree in Computer Science or a related technical discipline with proven experience coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python, OR equivalent experience.
- Proven coding and debugging skills in C#, C++, Java, or SQL.
- Ability to work and communicate effectively across disciplines and teams.

Preferred Qualifications:
- 1+ years of experience in data engineering.
- Understanding of and experience with cloud data computing technologies such as Azure Synapse, Azure Data Factory, SQL, Azure Data Explorer, Power BI, PowerApps, Hadoop, YARN, and Apache Spark.
- Excellent analytical skills with a systematic and structured approach to software design.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 day ago

Apply

12.0 years

1 - 10 Lacs

Gurgaon

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Manage and mentor a team of data engineers, fostering a culture of innovation and continuous improvement
- Design and maintain robust data architectures, including databases and data warehouses
- Oversee the development and optimization of data pipelines for efficient data processing
- Implement measures to ensure data integrity, including validation, cleansing, and governance practices
- Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions
- Analyze, synthesize, and interpret data from a variety of data sources, investigating, reconciling, and explaining data differences to understand the complete data lifecycle
- Architect with a modern technology stack and design public cloud applications leveraging Azure
- Follow a basic, structured, standard approach to work
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Undergraduate degree or equivalent experience
- 12+ years of implementation experience on time-critical production projects following key software development practices
- 8+ years of programming experience in Python or any programming language
- 6+ years of hands-on programming experience in Spark using Scala/Python
- 4+ years of hands-on working experience with Azure services such as Azure Databricks, Azure Data Factory, Azure Functions, and Azure App Service
- Good knowledge of writing SQL queries
- Good knowledge of building REST APIs
- Good knowledge of tools like Azure DevOps and GitHub
- Ability to understand the existing application codebase, perform impact analysis, and update the code when required based on the business logic or for optimization
- Ability to learn modern technologies and be part of fast-paced teams
- Proven excellent analytical and communication skills (both verbal and written)
- Proficiency with AI-powered development tools such as GitHub Copilot, AWS CodeWhisperer, Google's Codey (Duet AI), or any relevant tools is expected. Candidates should be adept at integrating these tools into their workflows to accelerate development, improve code quality, and enhance delivery velocity, and are expected to proactively leverage AI tools throughout the software development lifecycle to drive faster iteration, reduce manual effort, and boost overall engineering productivity

Preferred Qualification:
- Good knowledge of Docker and Kubernetes services

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location, and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 1 day ago

Apply

0 years

2 - 7 Lacs

Gurgaon

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

Description:
The Analytics, Investment and Marketing Enablement (AIM) team - a part of the GCS Marketing Organization - is the analytical engine that enables the Global Commercial Card business. The team drives profitable growth in acquisitions through data, analytics, and AI-powered targeting and personalization capabilities. This B30 role would be a part of the AIM India team, based out of Gurgaon, and would be responsible for proactive retention and save-a-card analytics for the SME segment across marketing and sales distribution channels. This critical role represents a unique opportunity to make a charge volume impact of 2+ billion. A very important focus for the role shall be quantitatively determining the value, deriving insights, and then assuring the insights are leveraged to create positive impact that makes a meaningful difference to the business.

Key responsibilities include:
- Develop/enhance precursors in AI models, partnering with Decision Science, and collaborate across Marketing, Risk, and Sales to help design customized treatments depending upon the precursors.
- Be a key analytical partner to the Marketing and Measurement teams to report on digital, field, and phone programs that promote growth and retention.
- Support and enable GCS partners with actionable, insightful analytical solutions (such as triggers and prioritization tiers) to help the field and phone sales teams prioritize efforts effectively.
- Partner with functional leaders, strategic business partners, and senior leaders to assess and identify opportunities for better customer engagement and revenue growth.
- Drive automation and ongoing refinement of analytical frameworks.
- Excellent communication skills, with the ability to engage, influence, and inspire partners and stakeholders to drive collaboration and alignment.
- Exceptional execution skills: able to resolve issues, identify opportunities, define success metrics, and make things happen.
- Willingness to challenge the status quo; breakthrough thinking to generate insights, alternatives, and opportunities for business success.
- High degree of organization, individual initiative, and personal accountability.

Minimum Qualifications:
- Strong programming skills and experience with building models and analytical data products are required.
- Experience with technologies such as Java, Big Data, PySpark, Hive, Scala, and Python.
- Proficiency and experience in applying cutting-edge statistical and machine learning techniques to business problems, leveraging external thinking (from academia and/or other industries) to develop best-in-class data science solutions.
- Excellent communication and interpersonal skills, and ability to build and retain strong working relationships.
- Ability to interact effectively and deliver compelling messages to business leaders across various band levels.

Preferred Qualifications:
- Good knowledge of statistical techniques like hypothesis testing, regression, kNN, t-test, and chi-square test.
- Demonstrated ability to work independently and across a matrix organization, partnering with capabilities, decision sciences, technology teams, and external vendors to deliver solutions at top speed.
- Experience with commercial data and the ability to create insights and drive results.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 1 day ago

Apply

4.0 years

19 - 39 Lacs

Noida

On-site

Sr Software Engineer (3 Openings)

RACE Consulting is hiring for one of our top clients in the cybersecurity and AI space. If you're passionate about cutting-edge technology and ready to work on next-gen AI-powered log management and security automation, we want to hear from you!

Role Highlights:
- Work on advanced agentic workflows, threat detection, and behavioral analysis
- Collaborate with a world-class team of security researchers and data scientists
- Tech stack: Scala, Python, Java, Go, Docker, Kubernetes, IaC

Who We're Looking For:
- 4+ years of experience in backend development
- Strong knowledge of microservices, containerization, and cloud-native architecture
- Bonus if you've worked in cybersecurity or AI-driven analytics

Job Type: Full-time
Pay: ₹1,950,000.00 - ₹3,900,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
- Leave encashment
- Provident Fund
Work Location: In person

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Role: Data Engineer (remote opportunity)

Group Objective
The mission of the business intelligence team is to create a data-driven culture that empowers leaders to integrate data into daily decisions and strategic planning. We aim to provide visibility, transparency, and guidance regarding the quantity and quality of results, activities, financial KPIs, and leading indicators, making it easy to identify trends for data-based decision-making.

Position Objective
As a Senior Data Engineer, you will be responsible for designing, architecting, and implementing robust data solutions in a cloud-based environment (GCP). You will partner with other data engineers and technical teams to ensure the availability, reliability, and performance of our data systems.

Position Summary

Programming & Code Writing
- Architect and build complex data pipelines using advanced cloud data technologies
- Lead efforts to optimize data pipelines for performance, scalability, and cost-efficiency
- Define industry best practices for building data pipelines
- Ensure data security, compliance, and governance standards are met
- Partner with the leadership team to define and implement agile and DevOps methodologies

Consulting & Partnership
- Serve as a subject matter expert and define data architecture and infrastructure requirements
- Partner with business analysts to plan project execution, including appropriate product and technical specifications, direction, resources, and realistic completion times
- Understand data technology trends, identify opportunities to implement new technologies, and provide forward-thinking recommendations
- Proactively partner with internal stakeholders to bridge gaps, provide historical references, and design the appropriate processes

Troubleshooting & Continuous Improvement
- Design and implement a robust data observability process
- Resolve escalated reporting requests and communicate proactively and in a timely manner
- Troubleshoot, and provide technical guidance to resolve, issues related to misaligned or inaccurate data or data fields or new customer requirements
- Maintain new release, migration, and sprint schedules for software upgrades, enhancements, and fixes to aid with product evolution
- Write QA/QC scripts to conduct a first round of testing, and partner with the BA team on test validation for new developments prior to moving to production
- Use industry knowledge and feedback to aid in the development of the technology roadmap and future product(s) vision
- Document standard ways of working via QRGs, intranet pages, and video series

Senior Activities
- Drive day-to-day development activities of the development team in close collaboration with on-site and offshore resources, scrum masters, and product owners
- Bootstrap a data engineering team at an early stage in the team's evolution
- Provide technical leadership in difficult situations, facilitate contentious discussions, and report up when necessary
- Guide, mentor, and coach offshore resources
- Provide input in forming a long-term data strategy

Education
Master's degree in Computer Science / Information Technology or a related field highly preferred

Experience
- Extensive knowledge of BI concepts and related technologies that help drive sustainable technical solutions
- Extensive experience with data lakes, ETL, and data warehouses
- Advanced experience building data pipelines
- Passion for building quality BI software
- Project management and/or process improvement experience highly preferred

Knowledge, Skills, and Abilities
- Polyglot coder with expert-level skills in multiple languages and tools, including Python, R, Java, SQL, relational databases, ERP, and DOMO or other data visualization tools (e.g., Tableau)
- Advanced, proven experience with Google Cloud Platform (GCP) preferred, but experience with Microsoft Azure / Amazon will be considered; any exposure to Kafka, Spark, and Scala is an added advantage
- Strong understanding of OOP concepts and methodologies
- Expert-level understanding of data engineering
- Intrinsic motivation and problem-solving
- Proactive leadership, project management, time management, and problem-solving skills
- Demonstrated continuous improvement, process documentation, and workflow skills
- Extensive experience with data analysis, modeling, and data pipelining, including data cleaning, standardizing, scaling, tuning, scheduling, and deployment
- Experience composing detailed technical documentation and procedures for data models
- Ability to prioritize and manage multiple projects and tasks, meeting deadlines while maintaining quality
- Strong drive and commitment to delivering outstanding results
- Strong follow-up and service orientation

Posted 1 day ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:
Looking to hire a Data Engineer who has a good understanding of the Data Product Lifecycle, Standards and Practices. You will be responsible for building scalable and efficient data solutions to support the Finance, Franchising & Development function, with a specific focus on the Finance Analytics product and initiatives. As a Data Engineer, you will collaborate with data scientists, analysts, and other cross-functional teams to ensure the availability, reliability, and performance of data systems. You will be a vital team member on initiatives that enable trusted financial data, support decision-making, and partner with business and technology teams to align data capabilities with strategic finance objectives. Expertise in cloud computing platforms, technologies, and data engineering best practices will play a crucial role within this domain.

Who we are looking for:

Primary Responsibilities:
Build and maintain relevant and reliable data products that support Finance Analytics.
Develop and implement new technology solutions as needed to ensure ongoing improvement, with data reliability and observability in view.
Participate in new software development and data engineering initiatives supporting Finance Analytics, ensuring timely and accurate delivery of financial data products.
Drive and implement best data engineering practices for pipeline development, data governance, data security, and quality across financial datasets.
Implement security and privacy controls in data workflows, ensuring compliance with finance regulatory requirements.
Monitor, troubleshoot, and improve the performance and reliability of existing finance data pipeline infrastructure.
Stay up to date with emerging data engineering technologies, trends, and best practices, and evaluate their applicability to evolving financial analytics needs.
Document data engineering processes, workflows, and solutions for knowledge sharing and future reference.
Partner and collaborate with data engineers, particularly on finance-centric data models and processing frameworks.
Coordinate and work flexibly with teams distributed across time zones, as needed.

Skills:
Applies technical data engineering expertise to develop reliable pipelines and improve data quality in support of finance and analytics initiatives
Bachelor's or master's degree in computer science or a related engineering field, and deep experience with cloud computing
3+ years of professional experience in data engineering or related fields
Proficiency in Python, Java, or Scala for data processing and automation
Hands-on experience with data orchestration tools (e.g., Apache Airflow, Luigi) and big data ecosystems (e.g., Hadoop, Spark, NoSQL)
Good working knowledge of data quality functions such as cleansing, standardization, parsing, de-duplication, mapping, and hierarchy management
Ability to perform extensive data analysis (comparing multiple datasets) using a variety of tools
Effective communication and stakeholder management skills to drive alignment and adoption of data engineering standards
Demonstrated experience in data management and data governance capabilities
Familiarity with data warehousing principles and best practices
Excellent problem solver - uses data and technology to solve problems and answer complex data-related questions
Excellent collaboration skills to work effectively in cross-functional teams

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.
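Of the data quality functions named in the skills list, de-duplication is easy to sketch; the business key and keep-the-latest-record rule below are assumptions chosen for illustration:

```python
# Illustrative de-duplication: keep the most recent row per business key.
def deduplicate(rows, key=("account_id",), ts_field="updated_at"):
    """Collapse duplicates, retaining the row with the latest timestamp."""
    latest = {}
    for row in rows:
        k = tuple(row[f] for f in key)
        # ISO-8601 date strings compare correctly as plain strings.
        if k not in latest or row[ts_field] > latest[k][ts_field]:
            latest[k] = row
    return list(latest.values())

rows = [
    {"account_id": 1, "balance": 100, "updated_at": "2024-01-01"},
    {"account_id": 1, "balance": 120, "updated_at": "2024-02-01"},
    {"account_id": 2, "balance": 50,  "updated_at": "2024-01-15"},
]
deduped = deduplicate(rows)
```

At warehouse scale the same logic is usually expressed as a window function (row_number over the key, ordered by timestamp) rather than an in-memory loop.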

Posted 1 day ago

Apply

7.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

About the Role:
Grade Level (for internal use): 10

The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.

What's in it for you?
Opportunities for innovation and learning new state-of-the-art technologies
Work in pure agile and scrum methodology

Responsibilities:
Design and implement software-related projects.
Perform analyses and articulate solutions.
Design underlying engineering for use in multiple product offerings supporting a large volume of end-users.
Develop project plans with task breakdowns and estimates.
Manage and improve existing solutions.
Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.

What we're Looking For:
Basic Qualifications:
Bachelor's degree in Computer Science or equivalent
7+ years' related experience
Passionate, smart, and articulate developer
Strong C#, .NET, and SQL skills
Experience implementing: Web Services (with WCF, RESTful JSON, SOAP, TCP), Windows Services, and Unit Tests
Dependency Injection
Able to demonstrate strong OOP skills
Able to work well individually and with a team
Strong problem-solving skills
Good work ethic, self-starter, and results-oriented
Agile/Scrum experience a plus
Exposure to Data Engineering and Big Data technologies like Hadoop, big data processing engines/Scala, NiFi, and ETL is a plus
Experience with container platforms is a plus
Experience working in cloud computing environments like AWS, Azure, GCP, etc.
What's In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology-the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide-so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. 
That's why we provide everything you-and your career-need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards-small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . 
Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning) Job ID: 317843 Posted On: 2025-07-20 Location: Hyderabad, Telangana, India
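The Dependency Injection item in the qualifications above is a language-agnostic pattern; here is a minimal constructor-injection sketch (the posting's stack is C#/.NET, and the class names here are invented for illustration):

```python
# Constructor-based dependency injection: the service receives its
# collaborator instead of constructing it, so tests can pass a fake.
class QuoteService:
    """Depends on an abstract price source injected at construction time."""
    def __init__(self, price_source):
        self._prices = price_source  # injected dependency

    def quote(self, symbol: str) -> str:
        return f"{symbol}: {self._prices.latest(symbol):.2f}"

class FakePriceSource:
    """Test double standing in for a real market-data client."""
    def latest(self, symbol):
        return {"ABC": 101.5}.get(symbol, 0.0)

svc = QuoteService(FakePriceSource())
```

In .NET the same wiring is typically done through a DI container registration rather than by hand, but the testability benefit is identical.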

Posted 1 day ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s: One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. 
McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.

Position Summary:
Looking to hire a Data Engineer at the G4 level who has a deep understanding of the Data Product Lifecycle, Standards and Practices. You will be responsible for building scalable and efficient data solutions to support the Brand Marketing / Menu function, with a specific focus on the Menu Data product and initiatives. As a Data Engineer, you will collaborate with data scientists, analysts, and other cross-functional teams to ensure the availability, reliability, and performance of data systems. This role leads initiatives to enable trusted Menu data, supports decision-making, and partners with business and technology teams to deliver scalable data solutions that drive insights into menu performance, customer preferences, and marketing effectiveness. Expertise in cloud computing platforms, technologies, and data engineering best practices will play a crucial role within this domain.

Who we are looking for:

Primary Responsibilities:
Build and maintain relevant and reliable Menu data products that support menu and marketing analytics.
Develop and implement new technology solutions as needed to ensure ongoing improvement, with data reliability and observability in view.
Participate in new software development engineering and lead data engineering initiatives supporting Product Mix Analytics, ensuring timely and accurate delivery of marketing and menu-related products.
Work closely with the Product Owner to help define the business rules that determine the quality of Menu datasets.
Drive and implement best practices for pipeline development, data governance, data security, and quality across marketing and menu-related datasets.
Ensure scalability, maintainability, and quality of data systems powering menu item tracking, promotion data, and marketing analytics.
Stay up to date with emerging data engineering technologies, trends, and best practices, and evaluate their applicability to evolving Product Mix analytics needs.
Document data engineering processes, workflows, and solutions for knowledge sharing and future reference.
Mentor and coach junior data engineers, particularly in areas related to menu item tracking, promotion data, and marketing analytics.
Coordinate and work flexibly with teams distributed across time zones, as needed.

Skills:
Leads teams to drive scalable data engineering practices and technical excellence within the Menu Data ecosystem
Bachelor's or master's degree in computer science or a related engineering field, and deep experience with cloud computing
5+ years of professional experience in data engineering or related fields
Proficiency in Python, Java, or Scala for data processing and automation
Hands-on experience with data orchestration tools (e.g., Apache Airflow, Luigi) and big data ecosystems (e.g., Hadoop, Spark, NoSQL)
Expert knowledge of data quality functions such as cleansing, standardization, parsing, de-duplication, mapping, and hierarchy management
Ability to perform extensive data analysis (comparing multiple datasets) using a variety of tools
Proven ability to mentor team members and lead technical initiatives across multiple workstreams
Effective communication and stakeholder management skills to drive alignment and adoption of data engineering standards
Demonstrated experience in data management and data governance capabilities
Familiarity with data warehousing principles and best practices
Excellent problem solver - uses data and technology to solve problems and answer complex data-related questions
Excellent collaboration skills to work effectively in cross-functional teams.

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.
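The posting's idea of business rules that determine dataset quality might look roughly like this; every rule name and field below is hypothetical, invented only to show the shape of such a check:

```python
# Hedged sketch of rule-based data quality validation for menu-like records.
RULES = {
    "price_positive": lambda item: item["price"] > 0,
    "has_category":   lambda item: bool(item.get("category")),
    "name_present":   lambda item: bool(item.get("name", "").strip()),
}

def validate(item: dict) -> list[str]:
    """Return the names of the rules an item fails (empty list = clean)."""
    return [name for name, rule in RULES.items() if not rule(item)]

good = {"name": "Fries", "price": 1.99, "category": "Sides"}
bad  = {"name": " ", "price": 0, "category": ""}
```

Keeping rules in a named table like this makes failure reports readable and lets the Product Owner review the rule set without reading pipeline code.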

Posted 1 day ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category: Engineering
Experience: Sr. Manager
Primary Address: Bangalore, Karnataka

Overview: Voyager (94001), India, Bangalore, Karnataka

Senior Lead - Machine Learning Engineering

At Capital One India, we work in a fast-paced and intellectually rigorous environment to solve fundamental business problems at scale. Using advanced analytics, data science, and machine learning, we derive valuable insights about product and process design, consumer behavior, regulatory and credit risk, and more from large volumes of data, and use them to build cutting-edge patentable products that drive the business forward. We’re looking for a Senior Lead Engineer to join the Machine Learning Experience (MLX) team! As a Capital One Senior Lead Engineer, you'll be part of a team focusing on observability and model governance automation for cutting-edge generative AI use cases. You will build solutions to collect metadata, metrics, and insights from the large-scale Gen AI platform, and build intelligent solutions that derive deep insights into the performance of the platform's use cases and their compliance with industry standards. You will contribute to building a system that does this for Capital One models, accelerating the move from fully trained models to deployable model artifacts ready to fuel business decisioning, and to building an observability platform that monitors the models and platform components. The MLX team is at the forefront of how Capital One builds and deploys well-managed ML models and features. We onboard and educate associates on the ML platforms and products that the whole company uses. We drive new innovation and research, and we’re working to seamlessly infuse ML into the fabric of the company. The ML experience we're creating today is the foundation that enables each of our businesses to deliver next-generation ML-driven products and services for our customers.
What You’ll Do:
Architect and develop full stack solutions for monitoring, logging, and managing Generative AI and machine learning workflows and models.
Architect, build, and deploy well-managed core APIs and SDKs for observability of LLMs and proprietary Foundation Models, including training, pre-training, fine-tuning, and prompting.
Work with model and platform teams to build systems that ingest large amounts of model and feature metadata and runtime metrics, to build an observability platform and to make governance decisions that ensure ethical use, data integrity, and compliance with industry standards for Gen-AI.
Partner with product and design teams to develop and integrate advanced observability tools tailored to Gen-AI.
Leverage cloud-based architectures and technologies to deliver solutions for platform users, providing deep insights into model performance, data flow, and system health.
Collaborate as part of a cross-functional Agile team with data scientists, ML engineers, and other stakeholders to understand requirements and translate them into scalable and maintainable solutions.
Use programming languages like Python, Scala, or Java.
Leverage continuous integration and continuous deployment best practices, including test automation and monitoring, to ensure successful deployments of machine learning models and application code.

Basic Qualifications:
Master's Degree in Computer Science or a related field
Minimum 12 years of experience in software engineering and solution architecture
At least 8 years of experience designing and building data-intensive solutions using distributed computing
At least 8 years of experience programming with Python, Go, or Java
Proficiency in observability tools such as Prometheus, Grafana, ELK Stack, or similar, with a focus on adapting them for Gen AI systems
Excellent knowledge of OpenTelemetry and prior experience building SDKs and APIs
Excellent communication skills, capable of articulating complex technical concepts to diverse audiences and driving cross-functional initiatives. Experience developing and deploying ML platform solutions in a public cloud such as AWS, Azure, or Google Cloud Platform No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. 
Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC). How We Hire We take finding great coworkers pretty seriously. Step 1 Apply It only takes a few minutes to complete our application and assessment. Step 2 Screen and Schedule If your application is a good match you’ll hear from one of our recruiters to set up a screening interview. Step 3 Interview(s) Now’s your chance to learn about the job, show us who you are, share why you would be a great addition to the team and determine if Capital One is the place for you. Step 4 Decision The team will discuss — if it’s a good fit for us and you, we’ll make it official! How to Pick the Perfect Career Opportunity Overwhelmed by a tough career choice? Read these tips from Devon Rollins, Senior Director of Cyber Intelligence, to help you accept the right offer with confidence. Your wellbeing is our priority Our benefits and total compensation package is designed for the whole person. Caring for both you and your family. Healthy Body, Healthy Mind You have options and we have the tools to help you decide which health plans best fit your needs. Save Money, Make Money Secure your present, plan for your future and reduce expenses along the way. Time, Family and Advice Options for your time, opportunities for your family, and advice along the way. It’s time to BeWell. Career Journey Here’s how the team fits together. We’re big on growth and knowing who and how coworkers can best support you.
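As a rough illustration of the runtime-metric aggregation an observability platform like the one described might perform, here is a library-free sketch; the window size, percentile choice, and latency budget are invented for the example (a real system would export these to Prometheus or a similar backend):

```python
# Minimal rolling-window latency monitor with a p95 alert threshold.
from collections import deque

class LatencyMonitor:
    def __init__(self, window=100, p95_budget_ms=500.0):
        self.samples = deque(maxlen=window)  # rolling window of latencies
        self.p95_budget_ms = p95_budget_ms

    def record(self, latency_ms: float) -> None:
        self.samples.append(latency_ms)

    def p95(self) -> float:
        """Nearest-rank 95th percentile over the current window."""
        ordered = sorted(self.samples)
        return ordered[int(0.95 * (len(ordered) - 1))]

    def breached(self) -> bool:
        return self.p95() > self.p95_budget_ms

mon = LatencyMonitor(window=10, p95_budget_ms=200.0)
for ms in [120, 130, 110, 150, 900]:
    mon.record(ms)
```

The `deque(maxlen=...)` keeps memory bounded, which matters when the same monitor tracks many models at once.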

Posted 1 day ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category: Engineering
Experience: Director
Primary Address: Bangalore, Karnataka

Overview: Voyager (94001), India, Bangalore, Karnataka

Distinguished Engineer - Machine Learning Engineering

At Capital One India, we work in a fast-paced and intellectually rigorous environment to solve fundamental business problems at scale. Using advanced analytics, data science, and machine learning, we derive valuable insights about product and process design, consumer behavior, regulatory and credit risk, and more from large volumes of data, and use them to build cutting-edge patentable products that drive the business forward. We’re looking for a Distinguished Engineer - Machine Learning Engineering to join the Machine Learning Experience (MLX) team! As a Capital One Machine Learning Engineer (MLE), you'll be part of a team focusing on observability and model governance automation. You will work with model training, feature, and serving metadata at scale to enable automated model governance decisions and to build a model observability platform. You will contribute to building a system that does this for Capital One models, accelerating the move from fully trained models to deployable model artifacts ready to fuel business decisioning, and to building an observability platform that monitors the models and platform components. The MLX team is at the forefront of how Capital One builds and deploys well-managed ML models and features. We onboard and educate associates on the ML platforms and products that the whole company uses. We drive new innovation and research, and we’re working to seamlessly infuse ML into the fabric of the company. The ML experience we're creating today is the foundation that enables each of our businesses to deliver next-generation ML-driven products and services for our customers.
What You’ll Do:
Work with model and platform teams to build systems that ingest large amounts of model and feature metadata and runtime metrics, to build an observability platform and to make governance decisions.
Partner with product and design teams to build elegant and scalable solutions that speed up model governance observability.
Collaborate as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art, next-generation big data and machine learning applications.
Leverage cloud-based architectures and technologies to deliver optimized ML models at scale.
Construct optimized data pipelines to feed machine learning models.
Use programming languages like Python, Scala, or Java.
Leverage continuous integration and continuous deployment best practices, including test automation and monitoring, to ensure successful deployments of machine learning models and application code.

Basic Qualifications:
Master's Degree in Computer Science or a related field
At least 15 years of experience in software engineering or solution architecture
At least 10 years of experience designing and building data-intensive solutions using distributed computing
At least 10 years of experience programming with Python, Go, or Java
At least 8 years of on-the-job experience with an industry-recognized ML framework such as scikit-learn, PyTorch, Dask, Spark, or TensorFlow
At least 5 years of experience productionizing, monitoring, and maintaining models

Preferred Qualifications:
Master’s Degree or PhD in Computer Science, Electrical Engineering, Mathematics, or a similar field
5+ years of experience building, scaling, and optimizing ML systems
5+ years of experience with data gathering and preparation for ML models
10+ years of experience developing performant, resilient, and maintainable code
Experience developing and deploying ML solutions in a public cloud such as AWS, Azure, or Google Cloud Platform
5+ years of experience with distributed file systems or multi-node database paradigms
Contributed to open source ML software
Authored/co-authored a paper on an ML technique, model, or proof of concept
5+ years of experience building production-ready data pipelines that feed ML models
Experience designing, implementing, and scaling complex data pipelines for ML models and evaluating their performance
5+ years of experience in MLOps, using either open source tools like MLflow or commercial tools
2+ years of experience developing applications using Generative AI, i.e., open source or commercial LLMs

No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.
For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC). How We Hire We take finding great coworkers pretty seriously. Step 1 Apply It only takes a few minutes to complete our application and assessment. Step 2 Screen and Schedule If your application is a good match you’ll hear from one of our recruiters to set up a screening interview. Step 3 Interview(s) Now’s your chance to learn about the job, show us who you are, share why you would be a great addition to the team and determine if Capital One is the place for you. Step 4 Decision The team will discuss — if it’s a good fit for us and you, we’ll make it official! How to Pick the Perfect Career Opportunity Overwhelmed by a tough career choice? Read these tips from Devon Rollins, Senior Director of Cyber Intelligence, to help you accept the right offer with confidence. Your wellbeing is our priority Our benefits and total compensation package is designed for the whole person. Caring for both you and your family. Healthy Body, Healthy Mind You have options and we have the tools to help you decide which health plans best fit your needs.
Save Money, Make Money Secure your present, plan for your future and reduce expenses along the way. Time, Family and Advice Options for your time, opportunities for your family, and advice along the way. It’s time to BeWell. Career Journey Here’s how the team fits together. We’re big on growth and knowing who and how coworkers can best support you.
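The Capital One listing above asks for production-ready data pipelines that feed ML models. As a rough illustration only (the record fields, schema, and thresholds here are invented, not Capital One's), one pipeline step that validates raw records before featurizing them might be sketched as:

```python
from typing import Dict, List, Optional

def validate(raw: Dict) -> Optional[Dict]:
    """Drop malformed records before they can poison downstream model features."""
    try:
        amount = float(raw["amount"])
    except (KeyError, TypeError, ValueError):
        return None
    if amount < 0:  # negative spend is treated as invalid in this toy schema
        return None
    return {"user_id": str(raw.get("user_id", "unknown")), "amount": amount}

def featurize(records: List[Dict]) -> Dict[str, float]:
    """Aggregate per-user totals - a stand-in for real model features."""
    features: Dict[str, float] = {}
    for rec in records:
        features[rec["user_id"]] = features.get(rec["user_id"], 0.0) + rec["amount"]
    return features
```

Validation and feature aggregation are kept as separate pure functions so each stage of the pipeline can be tested and scaled independently, which is the property such roles typically evaluate.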

Posted 1 day ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Information Date Opened 08/07/2025 Job Type Permanent RSD NO 10926 Industry IT Services Min Experience 5 Max Experience 7 City Bangalore North State/Province Karnataka Country India Zip/Postal Code 560001 Job Description JOB RESPONSIBILITIES: Work with product management and dev team to design, develop and deliver features and enhancements Collaborate closely with peers to develop clean code: readable, testable, high quality, performant, and secure Develop code using pair and team programming approaches Perform peer code reviews and walk-throughs Automate testing and deployment of software to enable delivering improvements to customers on a regular cadence Work closely with the agile team to innovate and improve everything MINIMUM REQUIREMENTS: B.S. in Computer Science or equivalent is preferred 4+ years of experience with modern languages such as Java/C#/JavaScript/Scala Recent 2+ years of Scala functional programming in an Enterprise SW environment Experience with RESTful applications Experience with Microservices Ability to work effectively in a distributed, collaborative, agile environment and deliver solutions on a regular cadence At Indium diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
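The posting above emphasizes clean, testable code built from small reviewable units, in a functional style. As a hedged sketch of that style (shown in Python rather than Scala, with invented helper names), composing pure functions keeps each step trivial to unit-test in isolation:

```python
from functools import reduce
from typing import Callable

def compose(*fns: Callable) -> Callable:
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Each step is pure and testable on its own - the property that
# pair programming and peer code reviews aim to preserve.
normalize = compose(str.lower, str.strip)
```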

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Data Analytics Intmd Analyst position is ideal for a developing professional who can independently handle most issues and solve complex problems with some flexibility. By integrating expert knowledge in a specific area with a solid grasp of industry standards, you will contribute to the team's success and align with broader organizational goals. Your role will involve applying analytical thinking and utilizing data analysis tools to make informed judgments and recommendations based on factual information. Attention to detail is crucial, as your decisions can have a significant impact on business outcomes. Responsibilities: - Utilize in-depth data analysis knowledge and industry best practices to drive results. - Understand how data analytics teams collaborate with other departments to achieve common goals. - Apply project management skills effectively. - Analyze data using various tools and methodologies to provide insights into local and wider operational impacts. - Exercise professional judgment to interpret data accurately and communicate findings clearly. - Demonstrate strong communication and diplomacy skills to handle complex information sensitively. - Ensure high-quality, timely service delivery to enhance team and group effectiveness. - Provide informal guidance and on-the-job training to new team members. - Assess risks appropriately in business decision-making processes, prioritizing the firm's reputation and compliance with regulations. Qualifications: - 5+ years of relevant experience - Proficiency in Hadoop, Python, Spark, Hive, RDBMS, and Scala - Knowledge of statistical modeling tools for large datasets - Strong analytical, interpretive, and problem-solving skills - Excellent interpersonal, verbal, and written communication abilities Education: - Bachelor's/University degree or equivalent experience This job description offers an overview of the primary responsibilities. Additional duties related to the role may be assigned as necessary. 
Citi is committed to providing equal employment opportunities and fostering a diverse workforce. If you have a disability and require accommodation during the application process, please review the Accessibility at Citi guidelines.
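The analyst role above centers on interpreting data to make informed judgments and recommendations. As a minimal, standard-library-only sketch of that kind of task (the z-score threshold is an arbitrary illustrative choice, not a Citi standard), a summary that also flags outliers might look like:

```python
import statistics
from typing import List, Tuple

def summarize(values: List[float], z: float = 1.5) -> Tuple[float, float, List[float]]:
    """Return (mean, sample stdev, outliers): points more than z stdevs from the mean."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    outliers = [v for v in values if abs(v - mu) > z * sigma]
    return mu, sigma, outliers
```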

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Java Developer at our company located in Bangalore, Karnataka, India, you will be responsible for designing and developing Java services utilizing the Java Spring Boot framework. Your role will involve implementing, supporting, troubleshooting, and maintaining applications, while ensuring the development of high-standard SAS/Python code and model documentation. Additionally, you will be working on the release cycle of modern, Java-based web applications and developing automation scripts using Python to streamline day-to-day team activities. Your expertise will be crucial in writing efficient, reusable, and reliable Java code. To excel in this position, you should possess a Bachelor's Degree in Computer Science or a related field, along with 5-7 years of experience in Java development. Strong proficiency in Java/J2EE technologies, web frontend technologies like HTML, JavaScript, and CSS, as well as Java frameworks like Spring MVC and Spring Security are essential requirements. Experience with REST APIs, writing Python libraries, familiarity with databases such as MySQL or Oracle, and strong scripting skills in languages like Python, Perl, or Bash will be beneficial. Additionally, your background in backend programming with Java/Python/Scala and ability to work on full-stack development using Java technologies will be advantageous. Your skills should include strong Java programming abilities, experience with Java Spring framework and Hibernate, proficiency in developing microservices using Java, knowledge of design patterns and Java frameworks, and hands-on experience with front-end and back-end Java technologies. Familiarity with automation tools like Selenium and Protractor, a good understanding of web services and RESTful APIs, and experience in using ORM frameworks like Hibernate/JPA will also be valuable assets. 
Furthermore, proficiency in Python or relevant scripting languages, experience in web/mobile application development, understanding of high-level JavaScript concepts, ability to work with automation tools for testing, and knowledge of machine learning, AI, or data science will be advantageous. The ability to commute/relocate to Bangalore is required, and a Master's degree is preferred. Core Java experience of 6 years, Spring Boot experience of 4 years, and Python experience of 3 years are required. A willingness to travel 100% of the time is preferred. This is a full-time position with benefits including health insurance and Provident Fund. The work location is in-person with a day shift schedule from Monday to Friday. If you are comfortable relocating to Bangalore and meet the aforementioned requirements, we encourage you to apply and join our dynamic team.
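The Java Developer role above mentions writing Python automation scripts to streamline day-to-day team activities. As a toy illustration of such a script (the log format and level names are assumptions, not this team's conventions), a small log-triage helper might be:

```python
import re
from typing import Dict, Iterable

LEVEL_RE = re.compile(r"\b(ERROR|FATAL)\b")

def count_errors(lines: Iterable[str]) -> Dict[str, int]:
    """Tally ERROR/FATAL lines per level so the team can triage at a glance."""
    counts: Dict[str, int] = {}
    for line in lines:
        match = LEVEL_RE.search(line)
        if match:
            level = match.group(1)
            counts[level] = counts.get(level, 0) + 1
    return counts
```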

Posted 2 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As an Engineering Manager at Autodesk's Platform Services (APS) team in Pune, India, you will be leading a team of software engineers to develop scalable cloud services that enable customers to access and utilize their design and engineering data via the cloud. Your role will involve collaborating with stakeholders to extend the product vision, design and develop features, and achieve high standards of quality, security, and user experience. You will be responsible for leading key business outcomes, contributing to the company's long-term strategy, and coaching and developing the team to unlock creativity and deliver high-quality solutions. Additionally, you will research and prototype new ideas, technologies, and patterns, collaborate with global teams, and enhance engineering efficiency by identifying and influencing improvement opportunities. To qualify for this role, you should have a Bachelor's degree in any Engineering discipline, along with at least 10 years of hands-on experience in designing and developing cloud-scale server applications. You should also have a strong background in leading and managing development teams, hands-on experience with Java, Springboot, Scala, and cloud technologies like AWS, and familiarity with relational and NoSQL databases. Experience in Agile/Scrum principles, hiring top engineering talent, setting up successful teams, and working with product management on goals alignment are essential for this role. Strong communication skills, the ability to manage geo-distributed teams, and a self-starter mindset to drive solutions for identified problems are also key attributes for this position. If you are passionate about leading a high-performing team, driving innovation in a lean/agile environment, and contributing to the evolution of engineering disciplines, this opportunity at Autodesk is perfect for you. Join us in shaping the future and creating meaningful impact through technology and innovation.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer, you will be responsible for building data pipelines, Big data processing solutions, and data lake infrastructure using various Big data and ETL technologies. You will assemble and process large, complex data sets to meet both functional and non-functional business requirements. ETL tasks will involve extracting data from a variety of sources such as MongoDB, S3, Server-to-Server, Kafka, and processing it using SQL and big data technologies. Additionally, you will develop analytical tools to offer actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Your role will also include constructing interactive and ad-hoc query self-serve tools for analytics use cases. Furthermore, you will be responsible for creating data models and data schema to ensure performance, scalability, and meet functional requirements. You will build processes that support data transformation, metadata, dependency, and workflow management. As part of your responsibilities, you will conduct research, experiments, and prototypes with new tools/technologies to ensure their successful implementation. **Must-Have Skills:** - Strong proficiency in Python/Scala - Experience with Big data technologies such as Spark, Hadoop, Athena/Presto, Redshift, Kafka, etc. - Proficiency in various file formats like parquet, JSON, Avro, orc, etc. - Familiarity with workflow management tools like Airflow - Experience in batch processing, streaming, and message queues - Knowledge of visualization tools such as Redash, Tableau, Kibana - Working experience with structured and unstructured data sets - Strong problem-solving skills **Good-to-Have Skills:** - Exposure to NoSQL databases like MongoDB - Familiarity with Cloud platforms like AWS, GCP, etc. - Understanding of Microservices architecture - Knowledge of Machine learning techniques This role is full-time and offers full benefits. 
The company is headquartered in the San Francisco Bay Area with its India headquarters located in Bengaluru.
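The Data Engineer posting above describes extracting data from varied sources and processing it with SQL. As a self-contained toy of that extract-transform-load flow, using SQLite in place of a real warehouse (the event schema and field names are invented for illustration):

```python
import json
import sqlite3
from typing import List, Tuple

def run_etl(raw_events: List[str]) -> List[Tuple[str, float]]:
    """Extract JSON events, load them into SQLite, and aggregate with SQL."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (user TEXT, amount REAL)")
    for blob in raw_events:
        event = json.loads(blob)  # extract + parse
        conn.execute("INSERT INTO events VALUES (?, ?)", (event["user"], event["amount"]))
    rows = conn.execute(
        "SELECT user, SUM(amount) FROM events GROUP BY user ORDER BY user"
    ).fetchall()  # aggregate in SQL
    conn.close()
    return rows
```

In a production pipeline the in-memory SQLite would be replaced by a warehouse such as Redshift or Athena, but the extract/load/aggregate shape is the same.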

Posted 2 days ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

Remote

About Tala Tala is on a mission to unleash the economic power of the Global Majority – the 4 billion people overlooked by existing financial systems. With nearly half a billion dollars raised from equity and debt, we are serving millions of customers across three continents. Tala has been named by the Fortune Impact 20 list, CNBC’s Disruptor 50 five years in a row, CNBC’s World's Top Fintech Company, Forbes’ Fintech 50 list for eight years running, and Chief's The New Era of Leadership Award. We are expanding across product offerings, countries and crypto and are looking for people who have an entrepreneurial spirit and are passionate about our mission. By creating a unique platform that enables lending and other financial services around the globe, people in emerging markets are able to start and expand small businesses, manage day-to-day needs, and pursue their financial goals with confidence. Currently, over nine million people across Kenya, the Philippines, Mexico, and India have used Tala products. Due to our global team, we have a remote-first approach, and also have offices in Santa Monica, CA (HQ); Nairobi, Kenya; Mexico City, Mexico; Manila, the Philippines; and Bangalore, India. Most Talazens join us because they connect with our mission. If you are energized by the impact you can make at Tala, we’d love to hear from you! The Role The Senior Backend Engineer builds and extends Tala’s backend architecture to support new country launches, new products and features, and a fast-growing user base. As a technologist and a leader, the Senior Backend Engineer pushes the team towards building a highly available, scalable, reliable, fault-tolerant, and performant microservices platform.
The Senior Backend Engineer follows and improves upon Tala’s engineering processes and standards while advancing Tala’s mission and business objectives. What You'll Do Develop, test, and deploy software solutions using Java, Scala, or Kotlin Design and contribute to backend systems, making key architectural decisions Work with deployment infrastructure and tooling, including CI/CD pipelines Handle schema evolution and data migrations in production systems Optimize backend systems for performance, including profiling, caching, and JVM tuning Ensure code quality and consistency through best practices and code reviews Create and maintain clear and concise technical documentation What You'll Need 5+ years of professional software development experience Expertise in at least one of the following languages: Java, Scala, or Kotlin Solid understanding of software development principles, design patterns, and best practices Experience with databases (SQL and/or NoSQL) and data migrations Familiarity with message brokers or event-driven architectures (e.g., Kafka, RabbitMQ) Experience with containerization and orchestration tools like Docker or Kubernetes Experience with Cloud infrastructure (AWS, Google Cloud, or Azure) and deploying services at the infra level Our vision is to build a new financial ecosystem where everyone can participate on equal footing and access the tools they need to be financially healthy. We strongly believe that inclusion fosters innovation and we’re proud to have a diverse global team that represents a multitude of backgrounds, cultures, and experience. We hire talented people regardless of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
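The Tala role above calls out handling schema evolution and data migrations in production. As a minimal sketch of versioned migrations of that kind, using SQLite for illustration (the table names and the 'KE' default are invented, not Tala's schema):

```python
import sqlite3
from typing import List

MIGRATIONS: List[str] = [
    "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
    "ALTER TABLE users ADD COLUMN country TEXT DEFAULT 'KE'",
]

def migrate(conn: sqlite3.Connection) -> int:
    """Apply any migrations not yet recorded; safe to call repeatedly."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    current = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()[0] or 0
    for version, stmt in enumerate(MIGRATIONS[current:], start=current + 1):
        conn.execute(stmt)
        conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()
    return len(MIGRATIONS) - current  # number applied this run
```

Recording the applied version makes the routine idempotent, which is what lets schema changes roll out safely alongside code deploys.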

Posted 2 days ago

Apply


8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Role : Data Engineer Experience : 8+ Years Mode : Hybrid Key Responsibilities Design and implement enterprise-grade Data Lake solutions using AWS (e.g., S3, Glue, Lake Formation). Define data architecture patterns, best practices, and frameworks for handling large-scale data ingestion, storage, compute, and processing. Optimize cloud infrastructure for performance, scalability, and cost-effectiveness. Develop and maintain ETL pipelines using tools such as AWS Glue or similar platforms. Manage CI/CD pipelines in DevOps. Create and manage robust Data Warehousing solutions using technologies such as Redshift. Ensure high data quality and integrity across all pipelines. Design and deploy dashboards and visualizations using tools like Tableau, Power BI, or Qlik. Collaborate with business stakeholders to define key metrics and deliver actionable insights. Implement best practices for data encryption, secure data transfer, and role-based access control. Lead audits and compliance certifications to maintain organizational standards. Work closely with cross-functional teams, including Data Scientists, Analysts, and DevOps engineers. Mentor junior team members and provide technical guidance for complex projects. Partner with stakeholders to define and align data strategies that meet business objectives. Qualifications & Skills Strong experience in building Data Lakes using AWS Cloud Platforms Tech Stack. Proficiency with AWS technologies such as S3, EC2, Glue/Lake Formation (or EMR), QuickSight, Redshift, Athena, Airflow (or Lambda + Step Functions + EventBridge), Data and IAM. Expertise in AWS tools that include Data Lake Storage, Compute, Security and Data Governance. Advanced skills in ETL processes, SQL (like Cloud SQL, Aurora, Postgres), NoSQL DBs (like DynamoDB, MongoDB, Cassandra) and programming languages (e.g., Python, Spark, or Scala). Real-time streaming applications preferably in Spark, Kafka, or other streaming platforms.
AWS Data Security: Good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager. Hands-on experience with Data Warehousing solutions and modern architectures like Lakehouses or Delta Lake. Proficiency in visualization tools such as Tableau, Power BI, or Qlik. Strong problem-solving skills and ability to debug and optimize applications for performance. Strong understanding of Database/SQL for database operations and data management. Familiarity with CI/CD pipelines and version control systems like Git. Strong understanding of Agile methodologies and working within scrum teams. Preferred Qualifications Bachelor of Engineering degree in Computer Science, Information Technology, or a related field. AWS Certified Solutions Architect Associate (required). Experience with Agile/Scrum methodologies and design sprints. (ref:hirist.tech)
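One small, concrete piece of the data-lake layout this posting describes is building Hive-style partition prefixes (year=/month=/day=), the form Glue and Athena use to prune scans by date. A sketch (bucket and table names are placeholders):

```python
from datetime import date

def partition_prefix(bucket: str, table: str, day: date) -> str:
    """Hive-style S3 partition prefix so Athena/Glue can prune by date."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={day.year:04d}/month={day.month:02d}/day={day.day:02d}/"
    )
```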

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a highly skilled and experienced Senior Data Scientist with a strong background in Artificial Intelligence (AI) and Machine Learning (ML), seeking to join our team. You are known for your innovative, analytical, and collaborative approach, with a proven track record in end-to-end AI/ML project delivery, encompassing data processing, modeling, and model deployment. With a minimum of 5-7 years of experience in data science, focusing on AI/ML applications, you have a deep understanding of a wide range of ML algorithms, including regression, classification, clustering, decision trees, neural networks, and deep learning architectures (e.g., CNNs, RNNs, GANs). Your strong programming skills in Python, R, or Scala, coupled with proficiency in ML libraries like TensorFlow, PyTorch, and Scikit-Learn, enable you to excel in this role. You are experienced in data wrangling, cleaning, and feature engineering, with familiarity in SQL and data processing frameworks such as Apache Spark. Your expertise extends to deploying models using tools like Docker, Kubernetes, and cloud services (AWS, GCP, or Azure), showcasing your comprehensive skill set. Your solid foundation in statistics, probability, and mathematical concepts used in AI/ML, along with proficiency in data visualization tools like Tableau, Power BI, or matplotlib, further enhances your capabilities. Additionally, familiarity with big data tools, natural language processing (NLP) techniques, time-series analysis, MLOps, and leadership experience are highly advantageous. Your problem-solving skills, combined with strong written and verbal communication abilities, enable you to convey technical insights clearly to diverse audiences. Your curiosity and adaptability drive you to stay updated with the latest advancements in AI and ML, allowing for quick learning and implementation of new technologies.
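As one tiny example of the modeling fundamentals this Senior Data Scientist role lists, here is ordinary least squares for a single feature, written out in closed form with the standard library only (in practice a library like Scikit-Learn would be used):

```python
from typing import List, Tuple

def fit_line(xs: List[float], ys: List[float]) -> Tuple[float, float]:
    """Fit y = a*x + b by ordinary least squares (single feature, closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)  # spread of x around its mean
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))  # co-variation
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept
```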

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

As a Data Engineer, you will be responsible for analyzing and organizing raw data, building data systems and pipelines, evaluating business needs and objectives, interpreting trends and patterns, conducting complex data analysis, and reporting on results. You will prepare data for prescriptive and predictive modeling, build algorithms and prototypes, and combine raw information from different sources. Your role will involve exploring ways to enhance data quality and reliability, identifying opportunities for data acquisition, developing analytical tools and programs, and collaborating with data scientists and architects on various projects. To excel in this role, you should have previous experience as a data engineer or in a similar position. You must possess technical expertise in data models, data mining, and segmentation techniques. Proficiency in programming languages such as Java and Python is essential. Hands-on experience with SQL database design and great numerical and analytical skills are also required. Your skills in big data, Python, SQL, and Scala will be crucial for successfully fulfilling the responsibilities of this position. If you are passionate about working with diverse data sets, building data systems, and collaborating with cross-functional teams to drive insights and innovation, we encourage you to apply for this exciting opportunity.
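The posting above describes combining raw information from different sources while enhancing data quality. As a toy version of that merge (the `id` and `updated_at` field names are assumptions for illustration), keeping the freshest copy of each record:

```python
from typing import Dict, Iterable

def merge_sources(*sources: Iterable[dict]) -> Dict[str, dict]:
    """Merge records keyed by 'id', keeping the copy with the latest 'updated_at'."""
    merged: Dict[str, dict] = {}
    for source in sources:
        for record in source:
            key = record["id"]
            if key not in merged or record["updated_at"] > merged[key]["updated_at"]:
                merged[key] = record
    return merged
```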

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies