
361 Apache Spark Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

7 - 17 Lacs

Bengaluru

Work from Office

About this role: Wells Fargo is seeking a Lead Data Engineer. In this role, you will:
- Lead complex initiatives with broad impact and act as a key participant in large-scale software planning for the Technology area
- Design, develop, and run tooling to discover problems in data and applications and report the issues to engineering and product leadership
- Review and analyze complex software enhancement initiatives for business, operational, or technical improvements that require in-depth evaluation of multiple factors, including intangibles or unprecedented factors
- Make decisions in complex and multi-faceted data engineering situations requiring understanding of software package options, programming languages, and compliance requirements that influence and lead Technology to meet deliverables and drive organizational change
- Strategically collaborate and consult with internal partners to resolve high-risk data engineering challenges
Required Qualifications:
- 5+ years of database engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
Desired Qualifications:
- 5+ years of experience building and leading data-driven application design and implementation using Hadoop/Spark/Scala, with strong experience in relational databases such as Teradata or Oracle Exadata
- Designing, developing, and maintaining data pipelines using Apache Spark/Hadoop/MapReduce/HBase on MapR or other Hadoop distributions
- Optimizing Spark jobs for performance, scalability, and resource utilization in shared infrastructure
- Experience in Python/Scala/Java programming languages
- Experience with one of the cloud providers (AWS, GCP, Azure), preferably GCP, in the data area
- Experience with data storage services (Google Cloud Storage), data processing (Google Cloud Dataflow), and data warehousing (Redshift/Azure Synapse/Google BigQuery)
- Designing and building Data Lake/Data Warehouse solutions, with experience in data modeling and ETL processes using Teradata/Oracle relational databases
- Experience building semantic models and reporting applications using Power BI
- Experience with Ab Initio/Talend ETL tools
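For a sense of the Spark job optimization work this listing names, here is a minimal, illustrative PySpark sketch; the session settings, paths, and table names are assumptions for demonstration, not details from the posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

# Illustrative tuning for a shared cluster; the values are assumptions
# to adapt, not recommendations from the job posting.
spark = (
    SparkSession.builder
    .appName("shared-infra-etl")
    .config("spark.sql.shuffle.partitions", "400")      # size shuffles to the data, not the default 200
    .config("spark.dynamicAllocation.enabled", "true")  # release idle executors on shared infrastructure
    .config("spark.sql.adaptive.enabled", "true")       # let AQE coalesce small shuffle partitions at runtime
    .getOrCreate()
)

orders = spark.read.parquet("s3://bucket/orders/")        # hypothetical input paths
dim_customer = spark.read.parquet("s3://bucket/dim_customer/")

# Broadcast the small dimension table to avoid a full shuffle join.
enriched = orders.join(broadcast(dim_customer), "customer_id")
enriched.write.mode("overwrite").parquet("s3://bucket/enriched_orders/")
```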

Posted 7 hours ago

Apply

12.0 - 17.0 years

0 - 0 Lacs

Pune

Remote

Hiring: Data Engineer SME
Location: Hinjewadi, Pune (Hybrid)
Contract: 12 Months
Experience: 12 to 17 Years
About the Role: We are looking for an experienced Subject Matter Expert (SME) in Cloud & Data Engineering to lead the design, implementation, and integration of cloud-based data solutions. The ideal candidate will have deep expertise in AWS, data engineering, and application integration, along with proven leadership capabilities.
Key Responsibilities:
- Lead the design and implementation of cloud-based solutions for large-scale data engineering projects
- Orchestrate the integration of diverse applications and data sources across the cloud
- Collaborate with cross-functional teams to ensure seamless data transfer and connectivity
- Mentor and guide junior engineers in cloud technologies and data engineering best practices
- Stay current with advancements in cloud platforms, data engineering tools, and industry best practices
Required Skills & Experience:
- 12 to 17 years of experience in cloud engineering, data engineering, or related fields
- Strong expertise in cloud platforms (preferably AWS) and data engineering frameworks
- Proficiency in application and data integration within cloud environments
- Hands-on experience with Apache Spark (preferred)
- Excellent problem-solving skills and ability to work in a fast-paced environment
- Strong leadership and communication skills

Posted 12 hours ago

Apply

4.0 - 6.0 years

8 - 12 Lacs

Gurugram

Work from Office

Role Description: As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues
Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understanding of and alignment with the company's long-term vision
- Providing leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 14 hours ago

Apply

8.0 - 12.0 years

15 - 20 Lacs

Pune

Work from Office

We are seeking a dynamic and experienced Tech Lead with a strong foundation in Java and Apache Spark to join our team. In this role, you will lead the development and deployment of scalable cloud-based data solutions, leveraging your expertise in AWS and big data technologies.
Key Responsibilities:
- Lead the design, development, and deployment of scalable and reliable data processing solutions on AWS using Java and Spark
- Architect and implement big data processing pipelines using Apache Spark on AWS EMR
- Develop and deploy serverless applications using AWS Lambda, integrating with other AWS services
- Utilize Amazon EKS for container orchestration and microservices management
- Design and implement workflow orchestration using Apache Airflow for complex data pipelines
- Collaborate with cross-functional teams to define project requirements and ensure seamless integration of services
- Mentor and guide team members in Java development best practices, cloud architecture, and data engineering
- Monitor and optimize performance and cost of deployed solutions across AWS infrastructure
- Stay current with emerging technologies and industry trends to drive innovation and maintain a competitive edge
Required Skills:
- Strong hands-on experience in Java development
- Proficiency in Apache Spark for distributed data processing
- Experience with AWS services including EMR, Lambda, EKS, and Airflow
- Solid understanding of serverless architecture and microservices
- Proven leadership and mentoring capabilities
- Excellent problem-solving and communication skills
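As an illustration of orchestrating a Spark job on AWS EMR with Apache Airflow, as this listing describes, here is a hedged sketch assuming Airflow 2.4+ with the Amazon provider package installed; the cluster id, JAR path, and class name are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

# The cluster id, JAR location, and main class below are placeholders.
SPARK_STEPS = [{
    "Name": "daily-aggregation",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": ["spark-submit", "--class", "com.example.DailyAggregation",
                 "s3://bucket/jars/etl.jar"],
    },
}]

with DAG("spark_on_emr", start_date=datetime(2024, 1, 1),
         schedule="@daily", catchup=False) as dag:
    submit = EmrAddStepsOperator(
        task_id="submit_spark_job",
        job_flow_id="j-XXXXXXXXXXXX",  # id of an already-running EMR cluster
        steps=SPARK_STEPS,
    )
    wait = EmrStepSensor(
        task_id="wait_for_step",
        job_flow_id="j-XXXXXXXXXXXX",
        step_id="{{ task_instance.xcom_pull(task_ids='submit_spark_job')[0] }}",
    )
    submit >> wait
```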

Posted 1 day ago

Apply

6.0 - 8.0 years

3 - 8 Lacs

Hyderabad

Work from Office

Role & responsibilities Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor and evaluate team performance to ensure alignment with project goals. Professional & Technical Skills: - Must To Have Skills: Proficiency in Scala. - Good To Have Skills: Experience with Java, Apache Spark, Microsoft Azure Data Services. - Strong understanding of data modeling and database design principles. - Experience with cloud-based data solutions and architectures. - Familiarity with data integration tools and techniques. Additional Information: - The candidate should have minimum 5 years of experience in Scala. - This position is based at our Hyderabad office. - A 15 years full time education is required

Posted 2 days ago

Apply

4.0 - 6.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Job Description: We are looking for a highly skilled Senior Data Engineer to join the Catalog Management team. The ideal candidate will design, build, and maintain scalable data pipelines and infrastructure on Google Cloud Platform (GCP), ensuring data availability and reliability for analytics and business needs.
Key Responsibilities:
- Design and develop scalable data pipelines using Apache Spark (Java), Apache Beam, or Kubeflow
- Manage and configure GCP services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage
- Develop and maintain RESTful APIs using Spring Boot
- Implement test automation, CI/CD pipelines, and monitoring solutions
- Collaborate with cross-functional teams to deliver optimized data solutions
- Ensure performance, scalability, and cost-efficiency of data systems
Required Skills:
- Strong proficiency in Java programming
- Expertise in Apache Spark for large-scale data processing
- Solid hands-on experience with GCP services
- Experience in API development (Spring Boot)
- Knowledge of CI/CD tools (Jenkins, GitLab CI, Cloud Build)
- Strong debugging, problem-solving, and communication skills
Preferred Skills:
- Apache Beam, Flink, or Kubeflow
- Infrastructure as code (Terraform, Deployment Manager)
- Containerization (Docker, Kubernetes)
- Exposure to AI/ML data pipeline requirements
Education: Bachelor's degree in Computer Science/Engineering or equivalent.
How to Apply: Interested candidates are requested to apply with their updated CV, mentioning:
- Years of experience as a Data Engineer
- Years of experience with GCP
- Years of experience with GenAI/LLM
- Years of experience with Java programming
- Years of experience with Apache Spark

Posted 2 days ago

Apply

5.0 - 8.0 years

14 - 24 Lacs

Chennai, Bengaluru

Work from Office

- Expertise and experience with Apache Spark, PySpark, and Python-based pipeline jobs
- Solid Data Lake/Data Warehouse principles, techniques, and technologies: star schema, SQL (AWS EMR, Apache Iceberg, Parquet)
- Strong database skills: NoSQL databases as well as relational databases, often with large data volumes
- Strong data modelling concepts and principles; experience building data architectures
- Cloud experience, preferably AWS
- Experience working with APIs (designing with OpenAPI is desirable)
- CI/CD pipelines (GitLab desirable)
- Kubernetes and Docker
- Experience developing microservices-based architectures, including distributed messaging patterns, is a plus
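To illustrate the PySpark pipeline and star-schema work this listing names, a minimal sketch follows; the paths, columns, and aggregation grain are assumptions for demonstration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema-load").getOrCreate()

# Hypothetical raw events; paths and column names are assumptions.
events = spark.read.json("s3://bucket/raw/events/")

# Dimension table: one row per distinct user.
dim_user = events.select("user_id", "country", "plan").dropDuplicates(["user_id"])

# Fact table keyed by the dimension, aggregated to daily grain.
fact_daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "user_id")
    .agg(F.count("*").alias("event_count"),
         F.sum("amount").alias("total_amount"))
)

# Partitioned Parquet layout, in line with the EMR/Iceberg/Parquet stack named above.
dim_user.write.mode("overwrite").parquet("s3://bucket/warehouse/dim_user/")
fact_daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://bucket/warehouse/fact_daily/")
```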

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

At Crimson Enago, our main focus is on developing AI-powered tools and services that can significantly enhance the productivity of researchers and professionals. We understand that the journey of every researcher or professional involves stages of knowledge discovery, knowledge acquisition, knowledge creation, and knowledge dissemination, all of which can be mentally demanding and interconnected. To address this challenge, we have introduced our flagship products, Trinka and RAx, which are designed to simplify and expedite these four stages seamlessly. Trinka is an AI-powered English grammar checker and language enhancement writing assistant tailored for academic and technical writing. Created by a team of linguists, scientists, and language enthusiasts, Trinka is capable of identifying and rectifying a myriad of intricate writing errors, thus saving you valuable time and effort. It not only corrects contextual spelling mistakes and advanced grammar errors but also improves vocabulary usage and offers real-time writing suggestions. Furthermore, Trinka goes beyond grammar correction to help professionals and academics ensure their writing is professional, concise, and engaging. With its subject-specific correction feature, Trinka comprehends the nuances of different subjects and ensures the writing is tailored to suit the specific subject. Additionally, Trinka's Enterprise solutions provide unlimited access and offer extensive customization options to leverage all of Trinka's powerful capabilities. RAx is the pioneering smart workspace designed to assist researchers (including students, professors, and corporate researchers) in enhancing their efficiency and effectiveness in research projects. Powered by proprietary AI algorithms and a unique problem-solving approach combining design and technology, RAx aims to become the go-to workspace for any research-intensive projects. Launched in 2019, this product connects various sources of information (such as research papers, blogs, wikis, books, courses, and videos) with different actions (reading, writing, annotating, discussing, etc.), thereby uncovering new insights and opportunities in the academic realm that were previously unattainable or unimaginable. Our team comprises passionate researchers, engineers, and designers who have joined forces to develop a product that can transform the landscape of research-intensive project work. At the core of our mission is the objective of reducing cognitive load and aiding individuals in converting information into knowledge. The engineering team is dedicated to building a scalable platform that can handle vast amounts of data, perform AI processing on the data, and facilitate interactions among users worldwide. We firmly believe that research plays a pivotal role in enhancing the world, and our goal is to simplify the research process and make it enjoyable for everyone involved. As an SDE-3 Fullstack at Trinka, you will play a pivotal role in leading a team of talented web developers, setting high engineering standards, and assuming significant responsibility for end-to-end project development and delivery. Collaborating with the Engineering Manager, Principal Engineer, other SDE-3 leads, and Technical Project Manager, you will also be involved in recruitment and training activities for the team. Your primary focus will involve hands-on coding to drive project progress and success. 
We are looking for an SDE-3 Fullstack professional with over 5 years of experience in enterprise frontend-full-stack web development, particularly working with the AngularJS-Java-AWS stack. The ideal candidate should possess excellent research skills to devise technical solutions for complex business challenges, a strong background in unit and integration testing, and a commitment to maintaining high-quality, testable code based on robust software design patterns. Furthermore, the candidate should demonstrate proficiency in creating optimized scalable solutions, breaking down complex problems into manageable tasks, and conducting thorough code reviews to ensure code quality and performance. The ability to estimate project efforts accurately, communicate effectively within the team, and collaborate with senior developers and project stakeholders to enhance cloud infrastructure and reduce associated costs is essential. Additionally, familiarity with best practices in project deployment, developer tooling, testing, monitoring, and observability will be advantageous.

The successful candidate should have a proven track record of architecting cost-efficient and highly scalable solutions, extensive experience in frontend-full-stack development, and proficiency in working with various backend technologies, including relational and document databases, CI/CD, and AWS services. Strong expertise in HTML5, CSS3, CSS processors, CSS frameworks, and CDN optimization is required, along with a keen eye for detail and a passion for creating exceptional front-end experiences. A deep understanding of software engineering principles, collaborative teamwork, and a relentless pursuit of excellence in user experience are key attributes we are looking for in our ideal candidate. Experience with Elasticsearch server cluster optimization, Apache Spark/Ray, and Root Cause Analysis would be considered an added advantage.

If you are passionate about leveraging technology to drive innovation in research and possess the necessary skills and experience to excel in a dynamic and collaborative environment, we encourage you to explore this exciting opportunity at Trinka. Join us in our mission to revolutionize the way research-intensive projects are approached and make a positive impact on the world through the power of technology and innovation.

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate for the Data QA position should have 6 to 10 years of experience in ETL + DWH testing and cloud testing, with expertise in AWS (S3 & Glue) and Azure, along with proficiency in SQL Server, including writing SQL queries. Familiarity with Snowflake, ADF, and API testing is essential.

Key technical skills required for this role include a strong understanding of Boomi or similar integration tools (MuleSoft, Informatica, Talend, or Workato), experience with Databricks, in-depth knowledge of ETL testing, proficiency in writing SQL queries, and good technical documentation skills. The candidate should also possess strong communication skills to interact effectively with both technical and non-technical stakeholders.

Desirable skills for this role include data testing on AWS using Athena, the ability to run and troubleshoot AWS Glue jobs, an understanding of data warehousing concepts, and proficiency in Python and Apache Spark.

The working hours for this position are from 11:00 am to 8:00 pm, with flexibility required for potential overlap with EST as per project requirements. This is a full-time position with a duration of 1 year, and the candidate is expected to work 2-3 days from the office in Bangalore, Pune, Mumbai, Hyderabad, or Noida, based on the hybrid model.
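As an example of the Python/Spark data-testing skills this listing names, here is a minimal, hypothetical reconciliation sketch; the connection details, table names, and checks are assumptions:

```python
from pyspark.sql import SparkSession

# Connection details and table names are placeholders; the SQL Server JDBC
# driver must be on the Spark classpath for the source read.
spark = SparkSession.builder.appName("etl-reconciliation").getOrCreate()

source = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://host:1433;databaseName=staging")
          .option("dbtable", "dbo.orders")
          .option("user", "qa_user")
          .option("password", "***")
          .load())
target = spark.read.parquet("s3://bucket/curated/orders/")

# Typical ETL checks: row counts reconcile, keys are unique, no null keys.
assert source.count() == target.count(), "row count mismatch"
assert target.count() == target.dropDuplicates(["order_id"]).count(), "duplicate keys"
assert target.filter("order_id IS NULL").count() == 0, "null keys in target"
```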

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

A career in the Advisory Acceleration Centre at PwC is an opportunity to leverage PwC's global delivery capabilities and provide premium, cost-effective, and high-quality services to support client engagements. As an Azure Senior Developer at PwC - AC, you will collaborate with the Offshore Manager and Onsite Business Analyst to grasp the project requirements and take charge of the complete implementation of Cloud data engineering solutions. Your role will involve utilizing your expertise in Azure cloud services such as Storage services (Blob, ADLS, PaaS SQL), Azure Data Factory, and Azure Synapse. You should excel in planning and organization, and have the ability to lead as a cloud developer in an agile team, delivering automated cloud solutions.

With 4-8 years of hands-on experience, you must be proficient in Azure cloud computing, including big data technologies. A deep understanding of traditional and modern data architecture and processing concepts will be critical, covering relational databases, data warehousing, big data, NoSQL, and business analytics. Experience with Azure ADLS, Databricks, Data Flows, HDInsight, and Azure Analysis Services is essential, along with building stream-processing systems using solutions such as Storm or Spark Streaming. Designing and implementing scalable ETL/ELT pipelines using Databricks and Apache Spark, optimizing data workflows, and understanding big data use cases and design patterns are key requirements.

Your role will also involve the architecture, design, implementation, and support of complex application architectures, as well as hands-on implementation of Big Data solutions using the Microsoft Data Platform and Azure Data Services. Knowledge of Azure SQL DB, Azure Synapse Analytics, Azure HDInsight, Azure Data Lake Storage, Azure Data Lake Analytics, Azure Machine Learning, Stream Analytics, Azure Data Factory, Azure CosmosDB, and Power BI is essential. Exposure to open-source technologies such as Apache Spark, Hadoop, NoSQL, Kafka, and Solr/Elastic Search, as well as expertise in quality processes, design strategies leveraging Azure and Databricks, and Application DevOps tools, will be advantageous.

Desired skills include experience with stream-processing systems, Big Data ML toolkits, Python, and AWS Architecture certification. Experience in offshore/onsite engagements, experience with AWS services like STEP & Lambda, good project management skills, and knowledge of cloud technologies such as AWS, GCP, Informatica-Cloud, Oracle-Cloud, and Cloud DW - Snowflake & DBT are also beneficial. The ideal candidate should have a professional background in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA, possess good analytical and problem-solving skills, and excel in communication and presentation.

In conclusion, as an Azure Senior Developer at PwC - AC, you will play a pivotal role in delivering high-quality Cloud data engineering solutions, leveraging your expertise in Azure cloud services, big data technologies, and modern data architecture concepts to support client engagements effectively.

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As part of the data and analytics engineering team at PwC, your focus will be on utilizing advanced technologies and techniques to create robust data solutions for clients. Your role will involve transforming raw data into actionable insights, enabling informed decision-making, and contributing to business growth. Specifically in data engineering at PwC, you will be responsible for designing and constructing data infrastructure and systems that facilitate efficient data processing and analysis, including the development and implementation of data pipelines, data integration, and data transformation solutions.

At PwC - AC, we are seeking an Azure Manager specializing in Data & AI, with a strong background in managing end-to-end implementations of Azure Databricks within large-scale Data & AI programs. In this role, you will architect, design, and deploy scalable and secure solutions that meet business requirements, encompassing ETL, data integration, and migration. Collaboration with cross-functional, geographically dispersed teams and clients will be key to understanding strategic needs and translating them into effective technology solutions. Your responsibilities will span technical project scoping, delivery planning, team leadership, and ensuring the timely execution of high-quality solutions. Utilizing big data technologies, you will create scalable, fault-tolerant components, engage stakeholders, overcome obstacles, and stay abreast of emerging technologies to enhance client ROI.

Candidates applying for this role should possess 8-12 years of hands-on experience and meet the following position requirements:
- Proficiency in designing, architecting, and implementing scalable Azure Data Analytics solutions utilizing Azure Databricks
- Expertise in Azure Databricks, including Spark architecture and optimization
- Strong grasp of Azure cloud computing and big data technologies
- Experience in traditional and modern data architecture and processing concepts, encompassing relational databases, data warehousing, big data, NoSQL, and business analytics
- Proficiency in Azure ADLS, Databricks, Data Flows, HDInsight, and Azure Analysis Services
- Ability to build stream-processing systems using solutions like Storm or Spark Streaming
- Practical knowledge of designing and building Near-Real-Time and Batch Data Pipelines, with expertise in SQL and data modeling within an Agile development process
- Experience in the architecture, design, implementation, and support of complex application architectures
- Hands-on experience implementing Big Data solutions using the Microsoft Data Platform and Azure Data Services
- Familiarity with working in a DevOps environment using tools like Chef, Puppet, or Terraform
- Strong analytical and troubleshooting skills, along with proficiency in quality processes and implementation
- Excellent communication skills and business/domain knowledge in Financial Services, Healthcare, Consumer Markets, Industrial Products, Telecommunications, Media and Technology, or Deal Advisory
- Familiarity with Application DevOps tools like Git, CI/CD frameworks, Jenkins, or GitLab
- Good understanding of Data Modeling and Data Architecture

Certification in Data Engineering on Microsoft Azure (DP 200/201/203) is required.

Additional Information:
- Travel Requirements: Travel to client locations may be necessary based on project needs
- Line of Service: Advisory
- Horizontal: Technology Consulting
- Designation: Manager
- Location: Bangalore, India

In addition to the above, the following skills are considered advantageous:
- Cloud expertise in AWS, GCP, Informatica-Cloud, Oracle-Cloud
- Knowledge of Cloud DW technologies like Snowflake and Databricks
- Certifications in Azure Databricks
- Familiarity with open-source technologies such as Apache Spark, Hadoop, NoSQL, Kafka, and Solr/Elastic Search
- Data engineering skills in Java, Python, PySpark, and R programming
- Data visualization proficiency in Tableau and Qlik

Accepted education qualifications include BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Senior Data Architect, you will play a crucial role in designing and implementing scalable, secure, and high-performance Big Data architectures using Databricks, Apache Spark, and cloud-native services. Your expertise will be essential in leading the end-to-end data architecture lifecycle, from requirements gathering to deployment and optimization. You will design repeatable and reusable data ingestion pipelines for various ERP source systems such as SAP, Salesforce, HR, Factory, and Marketing systems. Collaboration with cross-functional teams to integrate SAP data sources into modern data platforms will be a key aspect of your role, along with driving cloud cost optimization strategies and ensuring efficient resource utilization.

Furthermore, you will provide technical leadership and mentorship to a team of data engineers and developers, and develop and enforce data governance, data quality, and security standards. Your ability to translate complex business requirements into technical solutions and data models will be crucial, as will staying current with emerging technologies and industry trends in data architecture and analytics. Your skill set should include proven expertise in Databricks, Apache Spark, Delta Lake, and MLflow, along with strong programming skills in Python, SQL, and PySpark. Experience with SAP data extraction and integration, as well as hands-on experience with cloud platforms like Azure, AWS, or GCP, will be highly beneficial.

Required qualifications include a minimum of 6 years of experience in Big Data architecture, data engineering, and AI-assisted BI solutions within Databricks and AWS technologies, and a bachelor's degree in computer science, information technology, data science, data analytics, or a related field. You should also have a solid understanding of data modeling, ETL/ELT pipelines, and data warehousing, along with demonstrated team leadership and project management capabilities. Strong communication, problem-solving, and stakeholder management skills are essential for success in this role. Preferred qualifications include experience in the manufacturing domain; certifications in Databricks, cloud platforms, or data architecture; and familiarity with CI/CD pipelines, DevOps practices, and infrastructure as code (e.g., Terraform).

Posted 4 days ago

Apply

12.0 - 16.0 years

0 Lacs

Jodhpur, Rajasthan

On-site

This is a Principal Data Architect (Contract) role in Jodhpur with AnandRathi IT Pvt Ltd Services. AnandRathi is a prominent financial services firm in India, offering a diverse range of services in wealth management, capital markets, brokerage, investment banking, and financial planning. As part of their digital transformation journey, they are focused on developing a state-of-the-art data ecosystem to enhance insights, scalability, and innovation across all sectors.

As a Principal Data Architect, you will play a crucial role in shaping and advancing the enterprise data architecture. Your responsibilities will include defining the data architecture strategy aligned with business objectives, designing scalable and secure data pipelines and platforms, overseeing data integration from various sources, maintaining data model integrity, implementing data governance frameworks, and ensuring compliance with regulatory standards. You will lead data engineering teams in building robust data pipelines using tools like Apache Spark, Airflow, and Kafka. Your expertise in data modelling, distributed systems, cloud-native data platforms, data governance, and AI/ML pipelines will be essential for this role. Collaboration with stakeholders from multiple departments, presentation of architectural decisions to executive stakeholders, and participation in the company's digital and AI-first journey are key aspects of this position.

To be successful in this role, you should have over 12 years of experience in data architecture, engineering, or analytics roles, a deep understanding of data modelling and modern data stack tools, familiarity with cloud-native data platforms, strong knowledge of data governance and security frameworks, and hands-on experience with AI/ML pipelines. A background in the financial services domain, excellent communication skills, and relevant certifications will be advantageous.

If you are looking to be part of a dynamic team driving AnandRathi's digital and AI-first future, collaborating with industry experts, and working on enterprise-scale systems in the BFSI space, this role could be the perfect fit for you. Join AnandRathi and contribute to building a cutting-edge data ecosystem that powers innovation and growth across the organization.

Posted 5 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Technology Lead Analyst position is a senior-level role that involves establishing and implementing new or revised application systems and programs in coordination with the Technology team. Your main objective will be to lead applications systems analysis and programming activities. You will partner with multiple management teams to ensure the integration of functions to meet goals and identify necessary system enhancements for deploying new products and process improvements. Additionally, you will resolve high-impact problems and projects through in-depth evaluation of complex business processes and industry standards.

Your responsibilities will include providing expertise in applications programming, ensuring application designs align with the overall architecture blueprint, developing standards for coding, testing, debugging, and implementation, and gaining comprehensive knowledge of how different areas of the business integrate to achieve goals. You will also provide in-depth analysis to define issues and develop innovative solutions, serve as an advisor or coach to mid-level developers and analysts, and assess risks when making business decisions.

To qualify for this role, you should have 6-10 years of relevant experience in applications development or systems analysis, extensive experience in system analysis and programming of software applications, and experience managing and implementing successful projects. You should be a Subject Matter Expert (SME) in at least one area of applications development, have the ability to adjust priorities quickly, demonstrate leadership and project management skills, and possess clear and concise written and verbal communication skills. A Bachelor's degree or equivalent experience is required, with a preference for a Master's degree.

As a Vice President (VP), you will lead a technical vertical (Frontend, Backend, or Data), mentor developers, and ensure timely, scalable, and testable delivery across your domain. Your responsibilities will include leading a domain-specific team of engineers, translating architecture into execution with detailed designs and guidance, reviewing complex components, leading data platform migration projects, integrating CI/CD pipelines, enforcing code quality, and evaluating AI-based tools for productivity and code improvement.

The required skills for this role include 10-14 years of experience leading development teams; proficiency in programming languages such as Java, Python, and JavaScript/TypeScript; familiarity with frameworks like Spring Boot/WebFlux and Angular; expertise in databases like Oracle, MongoDB, and Redis; knowledge of cloud and data technologies; experience with development practices like TDD and CI/CD pipelines; and proficiency in quality tools like SonarQube and automated testing frameworks. Strong mentoring, conflict resolution, and cross-team communication skills are also essential.

This job description provides a high-level overview of the role, and other job-related duties may be assigned as required.

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Senior Programmer Analyst position involves participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your role will contribute to applications systems analysis and programming activities.

Responsibilities include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will monitor and control all phases of the development process, provide user and operational support on applications to business users, and analyze complex problems and issues utilizing in-depth specialty knowledge of applications development. Additionally, you will recommend and develop security measures, consult with users/clients and other technology groups, and ensure essential procedures are followed while defining operating standards and processes. You will also serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and act as an SME to senior stakeholders and/or other team members.

Furthermore, as the Applications Development Senior Programmer Analyst, you will assess risks when making business decisions, safeguard the firm's reputation and assets, adhere to policy, apply ethical judgment, and escalate control issues transparently. Your role will involve communicating with different tech and functional teams across the globe, including teams in LATAM, the US, Mexico, and India. You are expected to be available during the second shift, supporting Indian timings, and to attend calls during the first half of the US time zone, with minimal interaction with external clients.

Qualifications Required:
- 8+ years of application/software development/maintenance experience
- 5+ years of experience with Big Data technologies like Apache Spark, Hive, Hadoop
- Proficiency in Python, Java, or Scala programming languages
- Experience with Java (Core Java, J2EE, Spring Boot RESTful services), web services (REST, SOAP), XML, JavaScript, microservices, SOA, etc.
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Experience developing frameworks and utility services, and delivering high-quality software following continuous delivery
- Ability to work independently, multi-task, and take ownership of various analyses or reviews
- Strong analytical and communication skills

Preferred Skills:
- Work experience in Citi or regulatory reporting applications
- Hands-on experience with cloud technologies, AI/ML integration, and creation of data pipelines
- Familiarity with vendor products like Tableau, Arcadia, Paxata, KNIME
- Experience with API development and data formats

Education: Bachelor's degree/University degree or equivalent experience.

Please note that this job description provides an overview of the work performed, and other job-related duties may be assigned as required.

Posted 5 days ago

Apply

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

At Crimson Enago, we are dedicated to developing AI-powered tools and services that enhance the productivity of researchers and professionals. We understand that the stages of knowledge discovery, acquisition, creation, and dissemination can be cognitively demanding and interconnected. In response to this, our flagship products Trinka and RAx have been designed to streamline these stages and make them efficient and straightforward. Trinka is an AI-powered English grammar checker and language enhancement writing assistant tailored for academic and technical writing purposes. Developed by experts in linguistics, science, and language, Trinka is capable of identifying and correcting a wide array of complex writing errors, ensuring high-quality output without the need for manual intervention. Beyond basic grammar and spelling corrections, Trinka also offers advanced grammar error detection, vocabulary enhancement suggestions, and real-time writing recommendations. With specialized corrections for different subjects, Trinka ensures that writing is not only grammatically correct but also tailored to the specific subject matter. Moreover, Trinka's Enterprise solutions provide unlimited access and customizable options to maximize its powerful capabilities. RAx, on the other hand, is a cutting-edge smart workspace designed to assist researchers, including students, professors, and corporate researchers, in optimizing their research projects. With proprietary AI algorithms and innovative problem-solving approaches, RAx serves as an essential tool for any research-intensive project. By bridging the gap between information sources such as research papers, blogs, and courses, and research activities like reading, writing, and discussing, RAx unlocks new insights and possibilities in the academic landscape. Our team at Crimson Enago consists of passionate researchers, engineers, and designers united in our mission to revolutionize research-intensive projects. By alleviating cognitive burdens and facilitating the transformation of information into knowledge, we aim to make the research process more accessible and enjoyable. Our engineering team is dedicated to building a scalable platform that handles vast amounts of data, leverages AI processing capabilities, and connects users worldwide. We firmly believe in the transformative power of research and strive to simplify and enhance the research experience. As a Principal Engineer Backend at Crimson Enago, you will play a pivotal role in leading a team of web developers, setting engineering standards, and overseeing project development and delivery. Collaborating with the Engineering Manager, Principal Engineers, SDE-3 leads, and Technical Project Manager, you will be responsible for guiding the team, contributing to hiring and training efforts, and engaging in hands-on coding tasks. Your expertise in enterprise backend web development, particularly on the NodeJS-AWS stack, will be crucial in driving the success of the RAx product. The ideal candidate for this role should possess at least 7 years of experience in backend web development, with a strong focus on NodeJS-AWS technologies. Key attributes we are looking for include excellent research skills, a commitment to high-quality code design and review processes, and a proactive approach to problem-solving. Additionally, proficiency in architecting scalable solutions, working with various backend technologies, and optimizing cloud infrastructure are highly valued. 
A collaborative mindset, dedication to user experience, and a passion for innovation are essential qualities we seek in our team members. If you have a proven track record in backend development, a keen eye for detail, and a drive to deliver impactful solutions, we invite you to join our dynamic team at Crimson Enago. Together, we can shape the future of research and empower professionals to achieve their full potential.

Posted 5 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

We are seeking a seasoned Senior Developer & Tech Lead who is enthusiastic about writing clean and efficient code, constructing scalable systems, promoting engineering excellence, and supervising a team of skilled developers in a fast-paced, Agile environment. This position is well suited to developers with extensive hands-on experience in Java and Apache Spark, coupled with a solid understanding of object-oriented design principles.

Your responsibilities will include conducting detailed impact analysis for code changes, designing and implementing scalable, high-performance code using Java and big data/Apache Spark, and ensuring the code is of high quality, maintainable, modular, and adherent to industry-standard design patterns and SOLID principles. You will also be responsible for writing robust unit tests using JUnit, leading code reviews to enforce clean design and best engineering practices, fostering an environment of ownership and accountability, and mentoring a team of developers through technical challenges.

As a Senior Developer & Tech Lead, you will collaborate closely with architects, quality engineers, DevOps, and product owners to deliver high-quality code at speed. You will work in a cross-functional Agile team, participating in daily stand-ups, sprint planning, retrospectives, and backlog grooming. Additionally, you will translate user stories into technical tasks and ensure timely delivery of high-quality solutions.

The ideal candidate should possess at least 8 years of development experience with a strong background in Java, big data/Apache Spark, and object-oriented programming. Experience with REST APIs, RDBMS databases, and Kafka messaging systems is required, along with exposure to microservices architecture and containerization tools such as Docker and Kubernetes. Proven experience leading teams and mentoring developers in a fast-paced development environment is essential, as are a solid understanding of the software development lifecycle (SDLC) and Agile methodologies, excellent problem-solving skills, and the ability to think critically under pressure. Strong communication skills and the ability to collaborate effectively in cross-functional teams are highly valued. A Bachelor's degree or equivalent experience is required; a Master's degree is preferred.

If you are a person with a disability and require a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. You can also view Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Greetings from Themesoft! We are currently seeking a Senior Associate in GenAI & Data Science to join our CES Innovation team in either Chennai or Pune. In this role, you will have the opportunity to work on cutting-edge GenAI, Natural Language Processing (NLP), and Large Language Model (LLM) solutions to address real-world Environmental, Social, and Governance (ESG) challenges.

Key Responsibilities:
- Build multi-agentic RAG pipelines using LLMs
- Develop NLP solutions utilizing high-quality domain training data
- Apply Machine Learning (ML), Optical Character Recognition (OCR), and machine translation techniques to process complex document types
- Collaborate closely with stakeholders to tackle ESG data challenges effectively

Requirements:
- Minimum of 5 years of experience in the field
- Proficiency in ML, Python, GenAI/NLP, and LLMs, with at least 4 years of hands-on experience
- Expertise in deep learning frameworks such as PyTorch and TensorFlow
- Strong knowledge of NLP concepts including Transformers, embeddings, Named Entity Recognition (NER), etc.
- Previous experience with Apache Spark/PySpark and Continuous Integration/Continuous Deployment (CI/CD) is preferred

If you possess the required skills and experience and are enthusiastic about contributing to innovative solutions in the field of GenAI and Data Science, we encourage you to share your updated resume with us at mythili@themesoft.com.

Posted 6 days ago

Apply

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Scientist with 3-8 years of experience, you will collaborate with interdisciplinary scientists, IT, and engineering professionals to solve critical problems and support business decision-making. Strong communication and organizational skills are essential, paired with a problem-solving attitude.

Key expertise required for this role includes at least 3 years in analytics, particularly with big data solutions, and Subject Matter Expertise (SME) in:
- Statistics, analytics, data science, machine learning, and deep learning
- Big data platforms (SQL, Hadoop, Apache Spark, etc.)
- Programming languages (Python and R)
- Techniques such as statistical models, ML algorithms (e.g., Random Forest, SVM), deep learning models (e.g., CNN, RNN), and NLP

You should be able to build proposals for analytics/data science solutions, conduct hands-on data analysis, and derive actionable insights. Experience collaborating with global teams and senior stakeholders is also valued. The ideal candidate will have a strong analytical and logical mindset, be well organized with good prioritization skills, and possess an independent, self-starting attitude with ownership of tasks. You should be curious and collaborative, keen to learn, and proficient in English for a global team environment.

Your educational background should include a degree in a field such as Statistics, Mathematics, Operations Research, Economics, Engineering, or Information Science. Contributions on platforms like GitHub or Kaggle are advantageous for this position.
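As an illustration of the Random Forest modelling skills this listing names, here is a minimal, hypothetical scikit-learn sketch; the synthetic dataset and parameters are assumptions, not details from the posting:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; in practice features would come from the big data
# platforms (SQL/Hadoop/Spark) the listing names.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"holdout accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```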

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Senior Associate in the Data, Analytics & Specialist Managed Service Tower at PwC, with 6 to 10 years of experience, you will be part of a team of problem solvers addressing complex business issues from strategy to execution. Your responsibilities at this management level include utilizing feedback and reflection for personal development, being flexible in stretch opportunities, and demonstrating critical thinking skills to solve unstructured problems. You will also be involved in ticket quality reviews and project status reporting, and in ensuring adherence to SLAs and incident, change, and problem management processes. Seeking diverse opportunities, communicating effectively, upholding ethical standards, demonstrating leadership capabilities, and collaborating in a team environment are essential aspects of this role. You will also be expected to contribute to cross-competency work and COE activities, and to manage escalations and risks.

As a Senior Azure Cloud Engineer, you are required to have a minimum of 6 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms, along with 3-5 years of operate/managed services/production support experience. Your responsibilities will include designing scalable and secure data structures, developing data pipelines for downstream consumption, and implementing ETL processes using tools like Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, and Python. Experience with data analytics tools, data governance solutions, and ITIL processes, together with strong communication and problem-solving skills, is essential for this role. Knowledge of Azure Data Factory, Azure SQL Database, Azure Data Lake, Azure Blob Storage, Azure Databricks, Azure Synapse Analytics, and Apache Spark is also required, as is experience in data validation, cleansing, security, and privacy measures, SQL querying, data governance, and performance tuning. Azure certification is a nice-to-have qualification for this role.

The Managed Services - Data, Analytics & Insights Managed Service practice at PwC focuses on providing integrated services and solutions to clients, enabling them to optimize operations and accelerate outcomes through technology and human-enabled experiences. The team emphasizes a consultative approach to operations, leveraging industry insights and talent to drive transformational journeys and sustained client outcomes. As a member of the Data, Analytics & Insights Managed Service team, you will be involved in critical offerings, help desk support, enhancement and optimization work, and strategic advisory engagements. Your role will require a mix of technical expertise and relationship management skills to support customer engagements effectively.
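For context on the Spark-and-Python ETL work this listing describes, here is a minimal, illustrative PySpark sketch against ADLS Gen2; the storage account, containers, and columns are placeholders, and cluster access to ADLS (e.g., on Azure Databricks) is assumed:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes a cluster already authorized for ADLS Gen2 access (e.g., Azure
# Databricks); storage account, containers, and columns are placeholders.
spark = SparkSession.builder.appName("adls-etl").getOrCreate()

src = "abfss://raw@examplestore.dfs.core.windows.net/sales/"
dst = "abfss://curated@examplestore.dfs.core.windows.net/sales_clean/"

df = spark.read.option("header", "true").csv(src)

# Basic validation and cleansing of the kind the listing mentions: drop rows
# missing the key and cast the amount to a numeric type.
clean = (df.dropna(subset=["order_id"])
           .withColumn("amount", F.col("amount").cast("double")))

clean.write.mode("overwrite").parquet(dst)
```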

Posted 1 week ago

Apply

9.0 - 15.0 years

0 Lacs

Karnataka

On-site

As a Senior Data Engineer with 9 to 15 years of experience, you will be joining our team in Bangalore, Pune, or Hyderabad in a hybrid work environment. Your role will involve collaborating closely with client teams, particularly ML engineers and product owners, to design, optimize, and implement advanced data and AI solutions.

Your responsibilities will include providing technical leadership to clients in the data domain, optimizing and expanding data pipelines using Apache Spark, working with client ML engineers to establish and execute MLOps and LLMOps processes, supporting Generative AI workflows, and utilizing Databricks for advanced analytics and large-scale data processing. Hands-on experience with Databricks is a mandatory requirement for this role. Additionally, you will engage with client Product Owners to determine resource needs, plan milestones, and oversee cross-stream releases. Leading and mentoring a team of data engineers to ensure high-quality project delivery and alignment with goals will also be part of your responsibilities.

To be successful in this role, you should have 9 to 15 years of data engineering experience with a focus on big data technologies, expertise in building and optimizing Spark-based data pipelines, exposure to machine learning workflows and MLOps tools, and familiarity with GenAI and LLMOps. Strong stakeholder management skills, experience with Databricks, and the ability to lead teams and deliver projects in a fast-paced, client-facing environment are essential. Preferred qualifications include a Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field, exposure to cloud platforms like Azure, AWS, or GCP, and familiarity with tools such as MLflow, Airflow, Kubernetes, or Docker.
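To give a flavour of the MLOps tooling this listing references (MLflow is among the preferred tools), here is a minimal, illustrative experiment-tracking sketch; the experiment name, parameters, and metric value are placeholders:

```python
import mlflow

# The experiment name, parameters, and metric value are placeholders;
# in a real MLOps flow the metric would come from a validation step.
mlflow.set_experiment("demand-forecast")

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model", "gradient_boosted_trees")
    mlflow.log_param("max_depth", 6)
    mlflow.log_metric("rmse", 12.4)
    # mlflow.spark.log_model(fitted_pipeline, "model")  # for Spark ML pipelines
```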

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

- 5+ years of experience designing and building data pipelines using Apache Spark, Databricks, or equivalent big data frameworks
- Hands-on expertise with streaming and messaging systems like Apache Kafka, Confluent Cloud, RabbitMQ, or Azure Event Hubs, including creating producers, consumers, and topics, and integrating them into downstream processing
- Deep understanding of relational databases and Change Data Capture (CDC): proficiency in SQL Server, Oracle, or other RDBMSs, and experience capturing change events using tools like Debezium or native CDC tooling and transforming them for downstream consumption
- Proficiency in programming languages such as Python, Scala, or Java, along with solid knowledge of SQL for data manipulation and transformation
- Cloud platform expertise, including experience with Azure or AWS services for data storage, compute, and orchestration (e.g., ADLS, S3, Azure Data Factory, AWS Glue, Airflow, Databricks, DLT)
- Knowledge of data modeling and warehousing, including familiarity with data lakehouse architectures, Delta Lake, partitioning strategies, and performance optimization
- Familiarity with version control and DevOps practices, with experience in Git and CI/CD pipelines, and the ability to automate deployment and manage infrastructure as code
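As an illustration of integrating a Debezium CDC topic into downstream Spark processing, as described above, here is a hedged sketch; it assumes the spark-sql-kafka package is on the cluster and that the Debezium envelope arrives as unwrapped JSON, and the broker, topic, and schema are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType

spark = SparkSession.builder.appName("cdc-downstream").getOrCreate()

# Debezium change events carry an op code and the row state after the change;
# this trimmed envelope is an assumption about the connector's JSON output.
envelope = StructType([
    StructField("op", StringType()),     # c = create, u = update, d = delete
    StructField("after", StringType()),  # post-change row state as JSON
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
       .option("subscribe", "dbserver1.dbo.orders")        # placeholder topic
       .option("startingOffsets", "earliest")
       .load())

changes = (raw
           .select(F.from_json(F.col("value").cast("string"), envelope).alias("e"))
           .select("e.op", "e.after"))

# Write the parsed change feed to the console for inspection; a real pipeline
# would merge it into a lakehouse table instead.
query = changes.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```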

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Associate in GenAI & Data Science at Themesoft, you will be an integral part of our CES Innovation team, focusing on cutting-edge GenAI, NLP, and LLM solutions to address real-world ESG challenges. With a primary presence in either Chennai or Pune, you will utilize your 5+ years of experience to build multi-agentic RAG pipelines using LLMs, develop NLP solutions with high-quality domain training data, and apply ML, OCR, and machine translation techniques to various complex document types. Collaboration with stakeholders to effectively tackle ESG data challenges will also be a key aspect of your role.

Your profile should include:
- A minimum of 4 years of experience in ML, Python, GenAI/NLP, and LLMs
- Proficiency in deep learning frameworks such as PyTorch and TensorFlow
- Strong knowledge of NLP concepts including Transformers, embeddings, and NER
- Hands-on experience with Apache Spark/PySpark and CI/CD processes

If you are passionate about leveraging your expertise in AI and data science to drive impactful solutions in the realm of ESG challenges, we encourage you to share your updated resume with us at mythili@themesoft.com. Join our team and be at the forefront of innovation in the field of artificial intelligence and data science.

Posted 1 week ago

Apply