
7759 Hadoop Jobs - Page 27

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

The Software Engineering team at Dell Technologies is dedicated to delivering next-generation application enhancements and new products to meet the evolving needs of the world. As a Software Principal Engineer in Bangalore, you will be at the forefront of designing and developing software using cutting-edge technologies, tools, and methodologies in collaboration with both internal and external partners. Your primary responsibility will be to develop sophisticated systems and software solutions aligned with our customers' business goals and requirements.

You will work closely with business stakeholders and independently conduct data analysis, ETL tasks, and data administration. Additionally, you will design, develop, and maintain scalable ETL pipelines, collaborating with various teams to ensure data requirements are met. It will be essential to stay updated on industry trends, mentor junior data engineers, and provide technical guidance as needed.

To excel in this role, you should have a minimum of 8 years of industry experience with a focus on advanced ETL skills, including proficiency in tools like Airflow, Control-M, or Informatica. Strong Python programming skills for data manipulation, along with a solid understanding of SQL and NoSQL databases, are required. Experience with big data technologies such as Hadoop, Spark, or Kafka, as well as familiarity with cloud platforms and services like AWS, Azure, Azure Data Factory (ADF), Synapse, and SQL Server, will be beneficial. Desirable qualifications include experience with product data and product lifecycle management, as well as working knowledge of data visualization tools like Tableau or Power BI.

At Dell Technologies, we believe in the power of each team member to make a positive impact. We prioritize our team members' growth and development, offering opportunities to work with cutting-edge technology and some of the industry's best minds.
If you are seeking a rewarding career where you can contribute to a future that benefits everyone, we invite you to join us. Application closing date: 30 July 2025. Dell Technologies upholds the principle of equal employment opportunity and is committed to providing a work environment free of discrimination and harassment for all employees. If you are ready to take the next step in your career with us, we look forward to welcoming you to our team.
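The ETL pipeline work this role describes (extract, transform, load, with data-quality rules applied in the transform step) can be sketched in plain Python. In practice an orchestrator such as Airflow or Control-M would schedule each stage as a task; the function names and sample records below are hypothetical, purely for illustration:

```python
# A minimal extract -> transform -> load sketch; names and records are invented.

def extract(rows):
    """Pull raw records from a source (here: an in-memory list)."""
    return list(rows)

def transform(records):
    """Normalise fields and drop records missing a product id (a quality rule)."""
    cleaned = []
    for rec in records:
        if rec.get("product_id") is None:
            continue  # skip incomplete rows
        cleaned.append({"product_id": rec["product_id"],
                        "qty": int(rec.get("qty", 0))})
    return cleaned

def load(records, sink):
    """Append validated records to the target store; return the row count."""
    sink.extend(records)
    return len(records)

warehouse = []
raw = [{"product_id": "A1", "qty": "3"}, {"qty": "9"}, {"product_id": "B2"}]
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 rows survive validation
```

An orchestrator adds scheduling, retries, and dependency tracking on top of exactly this kind of stage chaining.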

Posted 1 week ago

Apply

10.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Job Title: IT Lead
Reporting Structure: Reports to Manager - IT Infra (Navi Mumbai location)
Education: Bachelor's degree in any engineering discipline / BCA / MCA

Qualifications:
- Minimum 7 and maximum 10 years of hands-on experience managing overall IT day-to-day operations to improve infrastructure costs, performance and end-user satisfaction.
- Able to work with cutting-edge technology and assimilate information rapidly.
- Independent thinker with the ability to work with little or no oversight.
- Strong interpersonal and articulation skills, both spoken and written.
- Strong problem-solving and communication skills.
- Practical experience in building IT infrastructure strategy in collaboration with business departments and vendors.
- Experience managing IT staff and IT infrastructure equipment, with in-depth knowledge of servers, databases, storage, network, backup and HW/SW components.
- Knowledge of the Red Hat OpenShift Container Platform, along with Kubernetes and OS administration of RHEL and Windows.
- Practical experience in handling IT infrastructure, with working knowledge of replication, storage provisioning, IP subnetting, backup policies and OS hardening.
- Working knowledge of Cloudera in a Hadoop (Big Data) environment.
- Practical experience in managing on-premises IT infrastructure and hosting applications in an on-premises DC.
- Service management knowledge of handling incidents, changes and problems, with the ability to troubleshoot complex problems and provide RCA.
- Able to translate complex ideas for non-technical staff/customers, empowering learning and knowledge transfer.
- Good knowledge of vendor, stakeholder and SLA management.
- Analytical thinking, communication, teamwork, relationship management, subject matter expertise, soft skills and service delivery commitments.

Industry: Technology, ITES, Shared Services or Banking organization

Responsibilities:
- Build infrastructure strategy in collaboration with business departments.
- Be involved in all IT infrastructure activities, including infra RFP creation and vendor evaluation; assist the vendor team with infra-related discussions and defining infrastructure requirements.
- Oversee the daily operations of the technical support team while participating as an active member of the team.
- Maintain and track records of daily reported problems, resolutions and actions taken for user issues.
- Provide timely resolution of ITSM service tickets to maintain SLA commitments.
- Check, update and track IT asset inventory.
- Create relevant support material for the team.
- Ensure that all customer inquiries and issues are resolved correctly, promptly and professionally.
- Review all technical-support processes and documentation for continuous improvement.
- Manage staff and network/server equipment.
- Participate in and manage the DR drill, along with documentation of BCP plans and creation of SOPs for various IT processes.
- Understand application flow and support troubleshooting of application-related issues.
- Follow change management and organizational processes.
- Actively participate in planning and execution of activities, collaborating with the team on future direction and opportunities for new technology.
- Provide regular updates to senior management and the client with good communication skills.

Must Have: Knowledge of ITSM tools; patch, backup, archival and restoration management; operating systems; antivirus and security solutions. In-depth knowledge of servers and networks, data organization, and Linux and Microsoft servers.

Good To Have: Conceptual knowledge of Big Data and RDBMS. (CCNA, ITIL, RHEL, SAN/NAS, CompTIA or Microsoft certifications preferred.)

Employment Type: All positions are on a fixed-term, full-time contract exclusively for ReBIT, initially for a period of five years, extendable by mutual consent.

Location: Navi Mumbai (Kharghar/Belapur) (ref:hirist.tech)

Posted 1 week ago

Apply

13.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Tech Specialist with strong analytical and technical ability and over 13 years of experience in enterprise web applications, REST services and workflow processing service development using Java/J2EE technologies. Experienced in working on medium-to-large enterprise projects, preferably in financial services.

Knowledge/Experience:
- Hands-on experience designing and developing scalable, distributed applications; able to architect large-scale applications using Spark, Kafka and big data technologies.
- Knowledge of Hadoop architecture.
- Knowledge of frameworks: Velocity, Spring, Spring Boot.
- Knowledge of OOAD, UML and design patterns.
- Strong insight into OOP concepts and good hands-on experience with Java (version 1.8 or above) and other Java-based frameworks such as Spring Batch, Spring IoC, Spring Annotation and Spring Security.
- Hands-on experience with a messaging platform such as Kafka.
- Good working knowledge of jBPM as a BPMN framework is a must.
- Good working knowledge of Docker, Kubernetes and OpenShift is a must.
- Strong knowledge of Java design patterns, microservice design patterns, event streams, event/message-based architecture, domain-driven design, etc.
- Strong knowledge of API-based architecture and SOA.
- Expertise in serverless, Tomcat (embedded/non-embedded), Jetty (embedded/non-embedded), WebSphere, Spring Batch, Spring IoC, Spring Annotation and Spring Security.
- Expertise in mocking, JUnit and performance testing of solutions.
- Basic Unix/Linux knowledge: able to write and understand basic shell scripts and basic Unix commands.
- Good working knowledge of in-memory distributed caches (Hazelcast, GemFire) is good to have.
- Experience working in an Agile/DevOps environment.
- Knowledge of web server setup and configuration with reverse proxy/SSL setup (preferably nginx) is a plus.

Good to have skills:
- Financial markets background is preferable but not a must.
- Knowledge of testing concepts (TDD, BDD) is a plus.
- Knowledge of ELK/AppDynamics.
- Knowledge of other languages and frameworks such as Vaadin (UI framework), Kotlin, Scala and shell scripting is good to have.

Key Responsibilities:
- Act as a seasoned SME and technical specialist in the Client Onboarding/AML/KYC/Account Opening domain.
- Translate business requirements into technical documents/code.
- Employ appropriate design standards, frameworks and patterns while designing and developing components.
- Implement and maintain a suite of workflow-driven Java applications with RESTful services.
- Develop high-quality code employing software engineering and testing best practices.
- Develop software that processes, persists and distributes data via relational and non-relational technologies.
- Hands-on coding, authoring unit tests/JUnit and performance tests, and maintaining high code quality.
- React and provide quick turnaround to business requirements and management requests.
- Be well versed in the Agile development life cycle and capable of leading a team of developers.
- Partner with database developers to implement ingestion, orchestration, quality/reconciliation and distribution services.
- Ability to work independently, good communication skills, and experience working on complex, medium-to-large projects.

Job Background:
- The position is based in India and focuses on delivery of the work, ensuring a robust design.
- This role may report to the technology team lead based in Pune.
- The candidate should be able to work independently and be self-motivated, and might be required to work with vendors or third parties in joint delivery teams.
- The role requires application of technical skills and knowledge of the business to develop solutions to meet business needs.
- As part of large, geographically distributed teams, the candidate may have to manage stakeholders across multiple functional areas.
- The position requires analytical skills to filter, prioritize and validate potentially complex material, technical, business or otherwise, from multiple sources.
- The candidate will work on complex and variable issues with substantial potential impact, weighing alternatives and balancing potentially conflicting needs.

Qualification:
- Bachelor's degree (in science, computers, information technology or engineering).
- The candidate should be willing to work late in the evening, India time, on a need basis in order to interact with US/other global teams.

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary: We are seeking an experienced professional (7+ years) to join our Production/Application Support team. The ideal candidate will bring a blend of strong technical skills (Unix, SQL, ITIL, Autosys, Big Data technologies) and good domain expertise in financial services (e.g. securities, secured financing, rates, liquidity reporting, derivatives, front-office/back-office systems, trading lifecycle).

Key Responsibilities:
- Provide L2 production support for mission-critical liquidity reporting and financial applications, ensuring high availability and performance.
- Monitor and resolve incidents related to trade capture, batch failures, position keeping, market data, pricing, risk and liquidity reporting.
- Proactively manage alerts, logs and jobs using Autosys, Unix tools and monitoring platforms (ITRS/AWP).
- Execute advanced SQL queries and scripts for data analysis, validation and issue resolution.
- Support multiple applications built on stored procedures, SSIS, SSRS and big data ecosystems (Hive, Spark, Hadoop), and troubleshoot data pipeline issues.
- Maintain and improve knowledge bases, SOPs and runbooks for production support.
- Participate in change management and release activities, including deployment validations.
- Lead root cause analysis (RCA), conduct post-incident reviews, and drive permanent resolutions.
- Collaborate with infrastructure teams on capacity, performance and system resilience initiatives.
- Contribute to continuous service improvement, stability management and automation initiatives.

Required Skills & Qualifications:
- Bachelor's or master's degree in computer science, information technology, engineering, or a related field.
- 7+ years of experience in application or production support, with 2+ years at an advanced level.
- Strong hands-on experience with Unix/Linux (scripting, file manipulation, job control); SQL (MSSQL/Oracle or similar; stored procedures, SSIS, SSRS); big data technologies (Hadoop, Hive, Spark); job schedulers such as Autosys; and log analysis tools.
- Solid understanding of financial instruments and the trade lifecycle (equities, fixed income, secured financing, derivatives, liquidity management).
- Knowledge of front-office/back-office and reporting workflows and operations.
- Excellent analytical and problem-solving skills, with the ability to work in a time-sensitive environment.
- Effective communication and stakeholder management skills across business and technical teams.
- Experience with ITIL processes, including incident, problem and change management.

------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Support ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
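As a small illustration of the "SQL queries for data analysis, validation, and issue resolution" work this role describes, here is a hedged sketch using Python's built-in sqlite3 module; the trade table, its schema, and the break rule are invented for the example, not taken from any real support runbook:

```python
import sqlite3

# Hypothetical trade table; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id TEXT, qty INTEGER, status TEXT)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    ("T1", 100, "SETTLED"),
    ("T2", 0,   "SETTLED"),   # suspect: settled with zero quantity
    ("T3", 50,  "PENDING"),
])

# Validation query: settled trades with a non-positive quantity are breaks
# that a support analyst would investigate.
breaks = conn.execute(
    "SELECT trade_id FROM trades WHERE status = 'SETTLED' AND qty <= 0"
).fetchall()
print(breaks)  # [('T2',)]
```

The same pattern scales up: in production the query would run against MSSQL/Oracle or Hive rather than an in-memory database.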

Posted 1 week ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

We Are Hiring: Data Engineer | 5+ Years Experience

Job Title: Data Engineer
Location: Ahmedabad
Work Mode: On-site
Experience: 5+ years
Employment Type: Full-time
Availability: Immediate joiner preferred

Join our team as a Data Engineer. We are seeking a passionate and experienced Data Engineer to join our dynamic and forward-thinking team in Ahmedabad. This is an exciting opportunity for someone who thrives on transforming raw data into powerful insights and building scalable, high-performance data infrastructure. As a Data Engineer, you will work closely with data scientists, analysts, and cross-functional teams to design robust data pipelines, optimize data systems, and enable data-driven decision-making across the organization.

Your Key Responsibilities:
- Architect, build, and maintain scalable and reliable data pipelines from diverse data sources.
- Design effective data storage, retrieval mechanisms, and data models to support analytics and business needs.
- Implement data validation, transformation, and quality monitoring processes.
- Collaborate with cross-functional teams to deliver impactful, data-driven solutions.
- Proactively identify bottlenecks and optimize existing workflows and processes.
- Provide guidance and mentorship to junior engineers on the team.

Skills & Expertise We're Looking For:
- 4+ years of hands-on experience in Data Engineering or related roles.
- Strong expertise in Python and data pipeline design.
- Experience working with big data tools like Hadoop, Spark, and Hive.
- Proficiency with SQL, NoSQL databases, and data warehousing solutions.
- Solid experience with cloud platforms, particularly Azure.
- Familiarity with distributed computing, data modeling, and performance tuning.
- Understanding of DevOps, Power Automate, and Microsoft Fabric is a plus.
- Strong analytical thinking, collaboration skills, excellent communication skills, and the ability to work independently or as part of a team.
Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field. (ref:hirist.tech)
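The "data validation, transformation, and quality monitoring" responsibility above often starts with something as simple as a null-rate report per required field. A minimal plain-Python sketch; the field names and sample rows are assumptions for illustration, not a prescribed implementation:

```python
def quality_report(records, required_fields):
    """Return the fraction of records missing each required field."""
    missing = {field: 0 for field in required_fields}
    for rec in records:
        for field in required_fields:
            if rec.get(field) in (None, ""):
                missing[field] += 1
    total = len(records)
    return {field: missing[field] / total for field in required_fields}

# Hypothetical input rows with two quality problems.
rows = [
    {"id": 1, "city": "Ahmedabad"},
    {"id": 2, "city": ""},          # empty city
    {"id": None, "city": "Pune"},   # missing id
]
report = quality_report(rows, ["id", "city"])
print(report)  # each field is missing in 1 of 3 rows
```

A pipeline would typically compare such rates against thresholds and fail the run, or route bad rows to a quarantine table, when a rate is too high.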

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a skilled and experienced Database Administrator (DBA), you will be responsible for managing and supporting our database environments to ensure optimal performance, integrity, and security. Working closely with other IT team members and stakeholders, you will play a crucial role in ensuring that our data systems operate efficiently and meet the business needs. Your qualifications include a Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree or relevant certifications such as Oracle DBA or Microsoft SQL Server Certified would be a plus. With at least five years of proven experience in managing database systems, you should have hands-on experience with major DBMS platforms like Oracle, SQL Server, MySQL, PostgreSQL, and MongoDB. Proficiency in SQL for querying and managing databases, along with knowledge of database design, data modeling, and normalization, is essential. Your responsibilities will include installing, configuring, and maintaining database software and related tools, monitoring database performance, and ensuring optimal resource utilization. Additionally, you will perform routine maintenance tasks, implement database security measures, and analyze performance metrics to identify bottlenecks and improve query efficiency. Strong analytical and problem-solving skills, excellent communication abilities, and the capacity to manage multiple tasks and projects simultaneously are required. Experience with cloud-based database services like AWS RDS, Google Cloud SQL, and big data technologies such as Hadoop would be beneficial. You will also participate in database design and data modeling activities, ensure data integrity through normalization and data validation, and develop and maintain documentation including data dictionaries and schema diagrams.
Implementing robust backup and recovery procedures, managing disaster recovery planning, enforcing database security policies, and ensuring compliance with data privacy regulations are crucial aspects of your role. Collaboration with developers, system administrators, and stakeholders to ensure seamless database integration, as well as providing technical support and troubleshooting for database-related issues, will be part of your everyday tasks. Additionally, you may need to participate in on-call rotations and respond to critical database incidents to maintain the efficiency and security of our database systems.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Data Engineer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analysis, in conjunction with fellow engineers, business analysts, and business stakeholders.

To be successful as a Data Engineer you should have:
- Strong experience with ETL tools such as Ab Initio, Glue, PySpark, Python, DBT, Databricks, and various AWS services/products.
- Advanced SQL knowledge across multiple database platforms (Teradata, Hadoop, SQL, etc.).
- Experience with data warehousing concepts and dimensional modeling.
- Proficiency in scripting languages (Python, Perl, shell scripting) for automation.
- Knowledge of big data technologies (Hadoop, Spark, Hive) is highly desirable.
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience in ETL development and data integration, with a proven track record of implementing complex ETL solutions in enterprise environments.
- Experience with data quality monitoring and implementing data governance practices.
- Knowledge of cloud data platforms (AWS, Azure, GCP) and their ETL services.

Some other highly valued skills include:
- Strong analytical and problem-solving skills.
- Ability to work with large and complex datasets.
- Excellent documentation skills.
- Attention to detail and commitment to data quality.
- Ability to work independently and as part of a team.
- Strong communication skills to explain technical concepts to non-technical stakeholders.
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. If the position has leadership responsibilities, they lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources; People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. For an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate, and will have an impact on the work of related teams within the area.
Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within your own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex or sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
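The "dimensional modeling" skill this posting asks for usually means a star schema: a fact table of measurable events keyed to small descriptive dimension tables. A minimal, hedged sketch using Python's built-in sqlite3; the table names, columns, and rows are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes, one row per account.
conn.execute("CREATE TABLE dim_account (account_id INTEGER PRIMARY KEY, region TEXT)")
conn.executemany("INSERT INTO dim_account VALUES (?, ?)",
                 [(1, "EMEA"), (2, "APAC")])

# Fact table: measurable events, each keyed to a dimension row.
conn.execute("CREATE TABLE fact_txn (account_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO fact_txn VALUES (?, ?)",
                 [(1, 100.0), (1, 50.0), (2, 30.0)])

# A typical warehouse query: aggregate the facts, sliced by a dimension attribute.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM fact_txn f "
    "JOIN dim_account d ON f.account_id = d.account_id "
    "GROUP BY region"
).fetchall())
print(totals)  # {'EMEA': 150.0, 'APAC': 30.0}
```

The design choice is that facts stay narrow and append-only while descriptive detail lives in the dimensions, which keeps large-scale aggregation cheap on platforms like Teradata or Hive.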

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a talented Big Data Engineer, you will be responsible for developing and managing our company's Big Data solutions. Your role will involve designing and implementing Big Data tools and frameworks, implementing ELT processes, collaborating with development teams, building cloud platforms, and maintaining the production system. To excel in this position, you should possess in-depth knowledge of Hadoop technologies, exceptional project management skills, and advanced problem-solving abilities. A successful Big Data Engineer comprehends the company's needs and establishes scalable data solutions to meet current and future requirements effectively. Your responsibilities will include meeting with managers to assess the company's Big Data requirements and developing solutions on AWS utilizing tools like Apache Spark, Databricks, Delta Tables, EMR, Athena, Glue, and Hadoop. You will also be involved in loading disparate data sets, conducting pre-processing using services such as Athena, Glue, and Spark, collaborating with software research and development teams, building cloud platforms for application development, and ensuring the maintenance of production systems. The requirements for this role include a minimum of 5 years of experience as a Big Data Engineer; proficiency in Python and PySpark; and expertise in Hadoop, Apache Spark, Databricks, Delta Tables, and AWS data analytics services. Additionally, you should have extensive experience with Delta Tables and the JSON and Parquet file formats, familiarity with AWS data analytics services like Athena, Glue, Redshift, and EMR, and knowledge of data warehousing, NoSQL, and RDBMS databases. Good communication skills and the ability to solve complex data processing and transformation problems are essential for success in this role.
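The posting's emphasis on JSON versus Parquet file formats comes down to row-oriented versus columnar layouts. A toy sketch of that pivot using only the standard library; real Parquet/Delta writers additionally handle encoding, compression, and column statistics, and the sample records here are invented:

```python
import json

# Row-oriented JSON records: the kind of semi-structured input the role describes.
raw = '[{"sku": "A", "price": 10}, {"sku": "B", "price": 20}]'
rows = json.loads(raw)

# Pivot to a columnar layout, the storage idea behind Parquet and Delta files:
# engines can then scan only the columns a query actually touches.
columns = {key: [row[key] for row in rows] for key in rows[0]}
print(columns)  # {'sku': ['A', 'B'], 'price': [10, 20]}
```

This is why analytics services like Athena and Spark read Parquet far faster than raw JSON: a query over one column never has to parse the others.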

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a Java Developer to produce scalable software solutions on distributed systems like Hadoop using the Spark framework. You will be part of a cross-functional team responsible for the full software development life cycle, from conception to deployment. As a Developer, you should be comfortable with back-end coding, development frameworks, third-party libraries, and the Spark APIs required for application development on distributed platforms like Hadoop. Being a team player with a knack for visual design and utility is essential. Familiarity with Agile methodologies will be an added advantage. A large part of the workloads and applications will be cloud-based, so knowledge and experience with Google Cloud Platform (GCP) will be handy.

As part of our flexible scheme, here are some of the benefits you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under the childcare assistance benefit (gender-neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 years and above

Your key responsibilities will include working with development teams and product managers to ideate software solutions, designing client-side and server-side architecture, building features and applications capable of running on distributed platforms and/or the cloud, developing and managing well-functioning applications supporting a microservices architecture, testing software for responsiveness and efficiency, troubleshooting, debugging, and upgrading software, creating security and data protection settings, and writing technical and design documentation. Additionally, you will be responsible for writing effective APIs (REST & SOAP).

To be successful in this role, you should have proven experience as a Java Developer or in a similar role as an individual contributor or development lead; familiarity with common stacks; strong knowledge and working experience of Core Java, Spring Boot, REST APIs, and the Spark API; knowledge of the React framework and UI experience; knowledge of JUnit, Mockito, or other test frameworks; familiarity with GCP services, design/architecture, and security frameworks; experience with databases (e.g., Oracle, PostgreSQL, BigQuery); familiarity with developing on distributed application platforms like Hadoop with Spark; excellent communication and teamwork skills; organizational skills; an analytical mind; a degree in Computer Science, Statistics, or a relevant field; and experience working in Agile environments. Good-to-have skills include knowledge of JavaScript frameworks (e.g., Angular, React, Node.js) and UI/UX design, knowledge of Python, and knowledge of NoSQL databases like HBase and MongoDB. You should have 4-7 years of prior working experience in a global banking/insurance/financial organization.

You will receive training and development to help you excel in your career, coaching and support from experts in your team, and a culture of continuous learning to aid progression. We strive for a culture in which we are empowered to excel together every day, acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 1 week ago

Apply

4.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Big Data Lead with 7-12 years of experience, you will be responsible for software development using multiple computing languages. Your role will involve working on distributed data processing systems and applications, specifically in Business Intelligence/Data Warehouse (BIDW) programs. Additionally, you should have previous experience in development through testing, preferably on the J2EE stack. Your knowledge and understanding of best practices and concepts in data warehouse applications will be crucial to your success in this role. You should possess a strong foundation in distributed systems and computing systems, with hands-on engineering skills. Hands-on experience with technologies such as Spark, Scala, Kafka, Hadoop, HBase, Pig, and Hive is required. An understanding of NoSQL data stores, data modeling, and data management is essential for this position. Good interpersonal communication skills, along with excellent oral and written communication and analytical skills, are necessary for effective collaboration within the team. Experience with Data Lake implementation as an alternative to a Data Warehouse is preferred. You should have hands-on experience with DataFrames using Spark SQL and proficiency in SQL. A minimum of 2 end-to-end implementations in either Data Warehouse or Data Lake is required for this role as a Big Data Lead.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

kolkata, west bengal

On-site

You must have knowledge of Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL. Working knowledge of Azure DevOps and Git flow would be an added advantage. Alternatively, you should have working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift. Demonstrable expertise in working with time-series data is essential. Experience in delivering data engineering/data science projects in Industry 4.0 is an added advantage, and knowledge of Palantir is required. You must possess strong problem-solving skills with a focus on sustainable and reusable development. Proficiency in statistical computing languages and libraries such as Python/PySpark, Pandas, NumPy, and seaborn/matplotlib is necessary; knowledge of Streamlit.io is a plus. Familiarity with Scala, Go, Java, and big data tools such as Hadoop, Spark, and Kafka is beneficial. Experience with relational databases such as Microsoft SQL Server, MySQL, PostgreSQL, and Oracle, and with NoSQL databases including Hadoop, Cassandra, and MongoDB is expected. Proficiency in data pipeline and workflow management tools like Azkaban, Luigi, and Airflow is required, as is experience in building and optimizing big data pipelines, architectures, and data sets. You should possess strong analytical skills related to working with unstructured datasets, provide innovative solutions to data engineering problems, document technology choices and integration patterns, and apply best practices for project delivery with clean code. Demonstrate innovation and proactiveness in meeting project requirements. Reporting to: Director - Intelligent Insights and Data Strategy. Travel: Must be willing to be deployed at client locations worldwide for long and short terms, and flexible for shorter durations within India and abroad.
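The time-series expertise called out above can be illustrated with a minimal pure-stdlib sketch (no pandas required): a trailing moving average, the basic building block of smoothing and simple forecasting. The sensor readings are hypothetical; in pandas the one-liner equivalent would be `series.rolling(window).mean()`.

```python
from collections import deque

def rolling_mean(series, window):
    """Trailing moving average over a time series; emits one value per
    point once the window is full. Stdlib sketch of a common
    time-series manipulation."""
    buf, out = deque(maxlen=window), []
    for x in series:
        buf.append(x)
        if len(buf) == window:
            out.append(sum(buf) / window)
    return out

readings = [10, 12, 11, 13, 15, 14]   # hypothetical sensor values
print(rolling_mean(readings, 3))      # [11.0, 12.0, 13.0, 14.0]
```

The `deque(maxlen=window)` keeps memory constant regardless of series length, which is the property that matters when the same idea is scaled up in Spark or Databricks.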

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Programmer Analyst position at our organization is an intermediate-level role where you will be involved in establishing and implementing new or updated application systems and programs in collaboration with the Technology team. Your primary goal will be to contribute to activities related to applications systems analysis and programming. Your responsibilities will include utilizing your knowledge of applications development procedures and concepts, as well as basic knowledge of other technical areas, to identify and define necessary system enhancements. You will be expected to identify and analyze issues, provide recommendations, and implement solutions, and to use your understanding of business processes, system processes, and industry standards to solve complex problems. Analyzing information, making evaluative judgments, recommending solutions and improvements, conducting testing and debugging, utilizing script tools, and writing basic code for design specifications are also part of your responsibilities. You will need to assess the applicability of similar experiences and evaluate options under circumstances not covered by procedures. Developing a working knowledge of areas such as Citi's information systems, procedures, standards, client-server application development, network operations, database administration, systems administration, data center operations, and PC-based applications will be crucial. Moreover, you are expected to appropriately assess risk when making business decisions, with a particular focus on safeguarding Citigroup, its clients, and assets by ensuring compliance with applicable laws, rules, and regulations.
Qualifications for this role include 2-5 years of relevant experience, experience in programming/debugging for business applications, working knowledge of industry practices and standards, comprehensive knowledge of a specific business area for application development, working knowledge of programming languages, and consistently demonstrating clear and concise written and verbal communication. A Bachelor's degree/University degree or equivalent experience is required. The ideal candidate should have at least 3 years of hands-on experience in data engineering. Proficiency in Hadoop, Spark, Hive, Impala, performance tuning, the Java programming language, SQL, and Oracle is essential, and certifications such as Java or Big Data are considered a plus. This job description offers an overview of the work performed in this role, and additional job-related duties may be assigned as necessary. Citi is an equal opportunity and affirmative action employer. Time Type: Full time
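The performance-tuning skill mentioned above usually comes down to one principle: help the engine avoid scanning data it doesn't need. Below is a minimal sketch of that idea using stdlib sqlite3 (table and index names are hypothetical); in Hive or Impala the analogue is partitioning plus predicate pushdown rather than a B-tree index, but the scan-avoidance reasoning is the same.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?)",
                 [(i % 100, float(i)) for i in range(1000)])

query = "SELECT SUM(amount) FROM txns WHERE account_id = 7"

# Without an index the engine has to scan every row of txns.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# With an index it seeks straight to the matching rows -- the same
# scan-avoidance idea behind Hive partitioning and predicate pushdown.
conn.execute("CREATE INDEX idx_account ON txns(account_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before[0][3], "->", after[0][3])
```

Reading the query plan before and after a schema change is the tuning loop itself, whatever the engine.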

Posted 1 week ago

Apply

8.0 - 15.0 years

0 Lacs

karnataka

On-site

As a Data Science Team Lead at our organization, you will be responsible for leading a team of data scientists, planning data projects, building analytic systems, and managing a team of data scientists and machine learning engineers. Your role will also involve participating in pre-sales activities and developing proposals. With your strong expertise in machine learning, deep learning, NLP, data mining, and information retrieval, you will design, prototype, and build the next-generation analytics engines and services. Your responsibilities will include leading data mining and collection procedures, ensuring data quality and integrity, interpreting and analyzing data problems, conceiving, planning, and prioritizing data projects, building analytic systems and predictive models, testing the performance of data-driven products, visualizing data, creating reports, aligning data projects with organizational goals, understanding business problems, and designing end-to-end analytics use cases. Additionally, you will develop complex models and algorithms to drive innovation throughout the organization, conduct advanced statistical analysis to provide actionable insights, collaborate with model developers to implement scalable solutions, and provide thought leadership by researching best practices and collaborating with industry leaders. 
To qualify for this role, you should have 8-15 years of experience in a statistical and/or data science role and proven experience as a Data Scientist or in a similar role, along with strong organizational and leadership skills and a degree in Computer Science, Data Science, Mathematics, or a similar field. Deep knowledge and experience in Large Language Models (LLMs), strong knowledge of Generative AI, and deep knowledge of machine learning, statistics, optimization, or a related field are required. You should have experience with linear and non-linear regression models, classification models, and unsupervised models, rich experience in NLP techniques, and experience with deep learning techniques such as CNNs, RNNs, GANs, and Markov models. Real-world experience with at least one machine learning environment such as R, Python, or MATLAB, good knowledge of Explainable AI, and experience working with large datasets, simulation/optimization, and distributed computing tools are also expected. Excellent written and verbal communication skills, a strong desire to work in cross-functional teams, and an attitude to thrive in a fun, fast-paced, startup-like environment are essential, and experience with semi-structured and unstructured databases is an additional advantage. Optional qualifications include programming skills in languages like C, C++, Java, or .NET.
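For the regression experience listed above, the single-feature case reduces to a closed-form least-squares fit. The sketch below (hypothetical data) shows the formula that libraries like scikit-learn generalise to many features; it is illustrative, not a production implementation.

```python
def fit_line(xs, ys):
    """Ordinary least squares for one feature: returns (slope, intercept)
    minimising squared error, via the closed-form solution
    slope = cov(x, y) / var(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical data generated from y = 2x + 1; the fit recovers it exactly.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
print(fit_line(xs, ys))  # (2.0, 1.0)
```

Classification and the non-linear models named above swap in different loss functions and optimisers, but the fit-predict-evaluate loop stays the same.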

Posted 1 week ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title: Senior Data Scientist, Product Data & Analytics team. Overview: The Product Data & Analytics team builds internal analytic partnerships, strengthening focus on the health of the business, portfolio and revenue optimization opportunities, initiative tracking, new product development and go-to-market strategies. Are you excited about data assets and the value they bring to an organization? Are you an evangelist for data-driven decision making? Are you motivated to be part of a global analytics team that builds large-scale analytical capabilities supporting end users across six continents? Do you want to be the go-to resource for data analytics in the company? The ideal candidate has a knack for seeing solutions in sprawling data sets and the business mindset to convert insights into strategic opportunities for our company. Role & Responsibilities: Work closely with global and regional teams to architect, develop, and maintain data engineering, advanced reporting and data visualization capabilities on large volumes of data to support analytics and reporting needs across products, markets and services. Obtain data from multiple sources; collate, analyze, and triangulate information to develop reliable fact bases. Effectively use tools to manipulate large-scale databases, synthesizing data insights.
Execute cross-functional projects using advanced modeling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities. Build, develop and maintain data models, reporting systems, dashboards and performance metrics that support key business decisions. Extract intellectual capital from engagement work and actively share tools, methods and best practices across projects Provide 1st level insights/conclusions/assessments and present findings via Tableau/PowerBI dashboards, Excel and PowerPoint. Apply quality control, data validation, and cleansing processes to new and existing data sources. Lead, mentor and guide more junior team members. Communicate results and business impacts of insight initiatives to stakeholders in leadership, technology, sales, marketing and product teams. Bring your Passion and Expertise All About You Experience in data management, data mining, data analytics, data reporting, data product development and quantitative analysis Financial Institution or a Payments experience a plus Experience presenting data findings in a readable and insight driven format. Experience building support decks. 
Advanced SQL skills and the ability to write optimized queries for large (big data) data sets are required, along with experience on platforms and environments such as Cloudera Hadoop, the big data technology stack, SQL Server, and the Microsoft BI stack. Experience with data visualization tools such as Looker, Tableau, and Power BI is expected; experience with Python, R, and Databricks is a plus, and experience with SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and SQL Server Reporting Services (SSRS) will be an added advantage. The role calls for excellent problem-solving, quantitative and analytical skills; in-depth technical knowledge, drive and the ability to learn new technologies; strong attention to detail and quality; and a team player with excellent communication skills who can interact with management and internal stakeholders, collect requirements, perform in a team, use judgment and operate under ambiguity. Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, Mathematics, or Statistics. Additional Competencies: Excellent English, quantitative, technical, and communication (oral/written) skills; analytical/problem solving; strong attention to detail and quality; creativity/innovation; self-motivated, operates with a sense of urgency; project management/risk mitigation; able to prioritize and perform multiple tasks simultaneously. Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-245981
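As a hedged illustration of the "optimized queries for large data sets" requirement above: window functions rank rows within each group in a single pass, avoiding the self-join that the same question would otherwise need. The sketch uses stdlib sqlite3 (window functions need SQLite 3.25+) on a hypothetical spend table; the identical SQL runs on the Hadoop/SQL Server platforms listed above.

```python
import sqlite3

# Hypothetical per-market merchant spend; RANK() OVER a partition finds
# the top merchants per market without a self-join.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (market TEXT, merchant TEXT, total REAL)")
conn.executemany("INSERT INTO spend VALUES (?, ?, ?)", [
    ("IN", "a", 50.0), ("IN", "b", 90.0),
    ("US", "c", 70.0), ("US", "d", 40.0),
])
rows = conn.execute("""
    SELECT market, merchant,
           RANK() OVER (PARTITION BY market ORDER BY total DESC) AS rnk
    FROM spend ORDER BY market, rnk
""").fetchall()
print(rows)
# [('IN', 'b', 1), ('IN', 'a', 2), ('US', 'c', 1), ('US', 'd', 2)]
```

On genuinely large tables the single-pass plan is the optimization: the engine sorts each partition once instead of joining the table to itself.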

Posted 1 week ago

Apply

0.0 - 2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

The Engineering Analyst 2 is an intermediate level position responsible for a variety of engineering activities including the design, acquisition and development of hardware, software and network infrastructure in coordination with the Technology team. The overall objective of this role is to ensure quality standards are being met within existing and planned frameworks. Responsibilities: Perform system and application monitoring, capacity planning and systems tests to ensure products meet performance requirements Evaluate technologies, develop prototypes, contribute to design issues, and implement solutions Work with various internal and external teams to identify and resolve problems Consult with end users and clients to identify and correct systems problems or propose solutions Assist in the development of software and systems tools used by integration teams to create end user packages Provide support for operating systems and in-house applications, including third party applications, as needed Perform coding, analysis, testing or other appropriate functions in order to identify problems and propose solutions Adhere to Citi technology standards, audit requirements and corporate compliance issues and requirements Apply knowledge of engineering procedures and concepts and basic knowledge of other technical areas to day to day activities Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. 
Qualifications: 0-2 years of relevant experience in an engineering role; experience working in financial services or a large, complex and/or global environment; project management experience; consistently demonstrates clear and concise written and verbal communication; comprehensive knowledge of design metrics, analytics tools, benchmarking activities and related reporting to identify best practices; demonstrated analytic/diagnostic skills; ability to work in a matrix environment and partner with virtual teams; ability to work independently, multi-task, and take ownership of various parts of a project or initiative; ability to work under pressure and manage tight deadlines or unexpected changes in expectations or requirements; proven track record of operational process change and improvement. Education: Bachelor’s degree/University degree or equivalent experience. Roles & Responsibilities: knowledge of Apigee implementation and support; working experience of all CI/CD processes, including LSE and ECS; Hadoop cluster experience; cloud computing knowledge on AWS; SRE knowledge and self-healing implementation (required); experience with automatic server patching and batch management; working experience with DevOps tools and technologies. Skillset: Big Data, Hadoop cluster, Kafka, GemFire, Neo4j, TeamCity, uDeploy, Autosys, RHEL, Oracle ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Systems & Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

170.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary: This role could be based in India, China, Malaysia or Singapore. When you start the application process you will be presented with a drop-down menu showing all countries; please ensure that you only select a country where the role is based. This is a pivotal role that bridges the gap between advanced AI technologies and the delivery of the bank's AI use cases. The position involves designing, optimizing, and managing AI prompts to enhance the performance of language models, ensuring they align with business objectives. Additionally, the role requires analyzing complex datasets to extract actionable insights, supporting strategic initiatives and enhancing operational efficiency. The ideal candidate will possess a unique blend of technical expertise in AI and data analytics. This role will be part of the AI squad and will work closely with internal staff, clients and third parties. Key Responsibilities: Strategy - delivering, building and maintaining the solutions to AI use cases and augmenting the functionality of the bank's AI tool. Business - accelerate delivery of AI use cases for the bank's various businesses and functions, with a focus on the responsibilities below. Responsibilities: Design and Optimize AI Prompts: develop, test, and refine AI prompts to ensure optimal performance and alignment with business goals. Data Analysis and Interpretation: analyze large datasets to uncover trends, patterns, and insights that support strategic decision-making. Collaborate with Cross-Functional Teams: work closely with data scientists, software engineers, and business stakeholders to integrate AI solutions into banking operations. Continuous Improvement: stay updated with the latest advancements in AI and data analytics to continuously improve the quality and efficiency of AI models and data analysis processes. Documentation and Training: document processes, methodologies, and findings, and provide training to team members on best practices in prompt engineering and data analysis.
Accountabilities Quality and Performance: Accountable for the quality and performance of AI prompts and data analysis outcomes, ensuring they meet the bank’s standards and objectives. Compliance: Responsible for ensuring all AI and data-related activities adhere to regulatory and compliance requirements. Stakeholder Satisfaction: Accountable for delivering actionable insights and effective AI solutions that satisfy the needs of internal stakeholders and support the bank's strategic goals. People & Talent Work with a team of data science analysts and full stack AI developers. Risk Management Ensure all data handling and analysis processes comply with the bank’s data privacy and cybersecurity standards. Governance Develop and implement change management plans to ensure successful adoption of AI solutions across the organization. Provide training and support to business users to help them understand and leverage AI tools and technologies. Foster a culture of innovation and continuous improvement by promoting the benefits of AI and encouraging experimentation. Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. Key stakeholders CDO Squads, Business Partners, ITO, internal staff, clients and 3rd parties. Other Responsibilities Embed Here for good and Group’s brand and values in AI squad deployed to various Business and Functions; Perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures. 
Identify and prioritize AI use cases that have the potential to deliver significant business value. Develop business cases for AI projects, including cost-benefit analysis, risk assessment, and ROI estimation. Skills and Experience: software development experience using the .NET framework or Java; completed real-world ML, NLP and DO projects using R or Python; extensive experience in SQL and NoSQL database design, queries and stored procedures; hands-on experience with Windows Server, Azure, AWS and Git; highly proficient in data visualisation tools... Documentation - requirements/use cases/business rules/user stories, etc. Report types - gap analysis/problem analysis/initial assessments, etc. Process/data modelling - 'as is'/'to be'/Visio/Enterprise Architect/System Architect, etc. Strong query language skills (SQL, Hive, ETL, Hadoop, Spark, R, Python); good experience with Business Intelligence tools and decision support systems; strong data analysis skills using Hive, Spark, R, Python, Dremio, MicroStrategy and Tableau; proven experience in working with key stakeholders within the business; proven problem-solving skills; workshop facilitation. Qualifications: Education - graduate or master's degree in Knowledge Engineering or Data Science. Training - completed real-world ML, NLP and DO projects using Python. Languages - English. About Standard Chartered: We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us.
Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description: The Smart Cube, a WNS company, is a trusted partner for high-performing intelligence that answers critical business questions, and we work with our clients to figure out how to implement the answers, faster. Job Description - Roles and Responsibilities: Assistant Managers must understand client objectives and collaborate with the Project Lead to design effective analytical frameworks. They should translate requirements into clear deliverables with defined priorities and constraints. Responsibilities include managing data preparation, performing quality checks, and ensuring analysis readiness. They should implement analytical techniques and machine learning methods such as regression, decision trees, segmentation, forecasting, and algorithms like Random Forest, SVM, and ANN. They are expected to perform sanity checks and quality control of their own work as well as that of junior analysts to ensure accuracy. The ability to interpret results in a business context and identify actionable insights is critical. Assistant Managers should handle client communications independently and interact with onsite leads, discussing deliverables and addressing queries over calls or video conferences. They are responsible for managing the entire project lifecycle from initiation to delivery, ensuring timelines and budgets are met. This includes translating business requirements into technical specifications, managing data teams, ensuring data integrity, and facilitating clear communication between business and technical stakeholders. They should lead process improvements in analytics and act as project leads for cross-functional coordination. Client Management: They serve as client leads, maintaining strong relationships and making key decisions.
They participate in deliverable discussions and guide project teams on next steps and execution strategy. Technical Requirements: Assistant Managers must know how to connect databases with Knime (e.g., Snowflake, SQL) and understand SQL concepts such as joins and unions. They should be able to read and write data to and from databases and use macros and schedulers to automate workflows. They must design and manage Knime ETL workflows to support BI tools and ensure end-to-end data validation and documentation. Proficiency in Power BI is required for building dashboards and supporting data-driven decision-making. They must be capable of leading analytics projects using Power BI, Python, and SQL to generate insights. Visualizing key findings using PowerPoint or BI tools like Tableau or QlikView is essential. Ideal Candidate: Candidates should have 4-7 years of experience in advanced analytics across Marketing, CRM, or Pricing in Retail or CPG; experience in other B2C domains is acceptable. They must be skilled in handling large datasets using Python, R, or SAS and have worked with multiple analytics or machine learning techniques. Comfort with client interactions and working independently is expected, along with a good understanding of consumer sectors such as Retail, CPG, or Telecom. They should have experience with various data formats and platforms, including flat files, RDBMS, Knime workflows and server, SQL Server, Teradata, Hadoop, and Spark, whether on-premises or in the cloud. Basic knowledge of statistical and machine learning techniques like regression, clustering, decision trees, forecasting (e.g., ARIMA), and other ML models is required. Other Skills: Strong written and verbal communication is essential. They should be capable of creating client-ready deliverables using Excel and PowerPoint. Knowledge of optimization methods, supply chain concepts, VBA, Excel Macros, Tableau, and QlikView will be an added advantage.
Qualifications: Engineers from top-tier institutes (IITs, DCE/NSIT, NITs) or postgraduates in Maths/Statistics/OR from top-tier colleges/universities, or an MBA from top-tier B-schools.
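The joins-and-unions requirement in the technical section above can be sketched in a few lines of standard SQL: UNION ALL stacks compatible row sets, while JOIN matches rows across tables. Shown via Python's stdlib sqlite3 with hypothetical tables; a Knime database node would issue the same SQL against Snowflake or SQL Server.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE online_sales (sku TEXT, qty INTEGER);
    CREATE TABLE store_sales  (sku TEXT, qty INTEGER);
    CREATE TABLE products     (sku TEXT, name TEXT);
    INSERT INTO online_sales VALUES ('A1', 3);
    INSERT INTO store_sales  VALUES ('A1', 2), ('B2', 5);
    INSERT INTO products     VALUES ('A1', 'widget'), ('B2', 'gadget');
""")
# UNION ALL stacks rows from the two compatible sales tables, then a
# JOIN matches the combined rows against the products table by key.
rows = conn.execute("""
    SELECT p.name, SUM(s.qty)
    FROM (SELECT * FROM online_sales
          UNION ALL
          SELECT * FROM store_sales) AS s
    JOIN products AS p ON p.sku = s.sku
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 5), ('widget', 5)]
```

UNION (without ALL) would additionally deduplicate rows, which is the distinction interviewers for such roles most often probe.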

Posted 1 week ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

What You'll Do: Criteo is in search of a passionate, highly motivated Data Analyst to join our Analytics team. You will turn business requests into data problems and tackle them in a scalable and efficient way, working together with analyst teams across Criteo locations. Aside from solving business challenges, this position also involves technically rigorous work, including the use of SQL, Excel, Hive, Python, and other leading-edge data tools. We are looking for a team player who is both business-driven and highly analytical. He or she will work with cross-functional business units to perform back-office data analysis and reporting that doesn’t require market context nor interaction with final customers. The ideal candidate will be able to take a recurrent business need and look for ways to address it in an automated and scalable way, both through process optimization and creation of dedicated tools. This role supports our EMEA business and work hours will be between 12.30pm IST – 9.30pm IST. This role is based in Gurgaon, India. Develop & share - deep knowledge of Criteo’s technology, products, and position in the marketplace. Provide actionable insights & create best practices to solve operational problems and actively look for opportunities for scaling analysis and tools across different business units. Leverage Python and SQL to answer commercial requests. Own and maintain reports/tools in Tableau, Python and other data tools. Conduct back-office ad-hoc analysis, problem-solving, and troubleshooting along with Root Cause Analysis. Automate the persistent tasks to enhance efficiency and reduce delivery times. Collaborate with teams based in other countries to support their analytical needs. Who You Are: Bachelor’s degree or higher in a quantitative/business field (Mathematics, Statistics, Engineering, Economics, Business, Finance, etc.). At least 3+ years of work experience in business / data analytics role. 
Preferably from a consulting, product-tech, retail, or e-commerce background. Strong intellectual curiosity and the ability to structure and solve difficult problems with minimal supervision. Excellent technical skills: strong SQL, basic Python, and visualization are a must. Effective business acumen and client-engaging skills to provide clear, actionable insights. Experience in any of the following is a plus: Excel, Tableau, Hive/Hadoop, Vertica, Git/Gerrit. Knowledge of agency or digital marketing is a plus. We acknowledge that many candidates may not meet every single role requirement listed above. If your experience looks a little different from our requirements but you believe that you can still bring value to the role, we’d love to see your application! Who We Are: Criteo is the global commerce media company that enables marketers and media owners to deliver richer consumer experiences and drive better commerce outcomes through its industry-leading Commerce Media Platform. At Criteo, our culture is as unique as it is diverse. From our offices around the world or from home, our incredible team of 3,600 Criteos collaborates to develop an open and inclusive environment. We seek to ensure that all of our workers are treated equally, and we do not tolerate discrimination based on race, gender identity, gender, sexual orientation, color, national origin, religion, age, disability, political opinion, pregnancy, migrant status, ethnicity, marital or family status, or other protected characteristics at all stages of the employment lifecycle, including how we attract and recruit, through promotions, pay decisions, benefits, career progression and development. We aim to ensure employment decisions and actions are based solely on business-related considerations and not on protected characteristics.
As outlined in our Code of Business Conduct and Ethics, we strictly forbid any kind of discrimination, harassment, mistreatment or bullying towards colleagues, clients, suppliers, stakeholders, shareholders, or any visitors of Criteo. All of this supports us in our mission to power the world’s marketers with trusted and impactful advertising, encouraging discovery, innovation and choice in an open internet. Why Join Us: At Criteo, we take pride in being a caring culture and are committed to providing our employees with valuable benefits that support their physical, emotional and financial wellbeing, their interests, and important life events. We aim to create a place where people can grow and learn from each other while having a meaningful impact. We want to set you up for success in your job, and an important part of that includes comprehensive perks & benefits. Benefits may vary depending on the country where you work and the nature of your employment with Criteo. When determining compensation, we carefully consider a wide range of job-related factors, including experience, knowledge, skills, education, and location. These factors can cause your compensation to vary.
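The responsibilities above center on answering recurring commercial requests with Python and SQL in an automated, scalable way. A minimal sketch of that pattern, with invented table and column names (a real report would query Hive or Vertica rather than in-memory SQLite):

```python
# Hypothetical sketch: automating a recurring back-office report with SQL + Python.
# The "events" table and its columns are invented for illustration only.
import sqlite3

def campaign_spend_report(rows):
    """Aggregate total spend per campaign from raw daily rows."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (campaign TEXT, day TEXT, spend REAL)")
    con.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    cur = con.execute(
        "SELECT campaign, SUM(spend) AS total_spend "
        "FROM events GROUP BY campaign ORDER BY total_spend DESC"
    )
    report = cur.fetchall()
    con.close()
    return report

rows = [("summer_sale", "2024-06-01", 120.0),
        ("summer_sale", "2024-06-02", 80.0),
        ("retargeting", "2024-06-01", 150.0)]
print(campaign_spend_report(rows))  # → [('summer_sale', 200.0), ('retargeting', 150.0)]
```

Wrapping the query in a function is what makes the task "automated and scalable": the same report can be scheduled or reused across business units instead of being rebuilt by hand each time.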

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior Manager - Data Engineering Career Level - E Introduction To Role Join our Commercial IT Data Analytics & AI (DAAI) team as a Data Engineering Lead, where you will play a pivotal role in ensuring the quality and stability of our data platforms built on AWS services, Databricks, and Snaplogic. Based in Chennai GITC, you will drive the quality engineering strategy, lead a team of quality engineers, and contribute to the overall success of our data platform. Accountabilities As the Data Engineering Lead for data platforms, you will provide leadership and mentorship to your team, drive the adoption of high-quality engineering standards, and foster effective collaboration across departments. You will lead the design, development, and maintenance of scalable and secure data infrastructure and tools to support the data analytics and data science teams. You will also develop and implement data and data engineering quality assurance strategies and plans tailored to data product build and operations. Essential Skills/Experience Bachelor’s degree or equivalent in Computer Engineering, Computer Science, or a related field Proven experience in a product quality engineering or similar role, with at least 3 years of experience managing and leading a team. Experience working within a quality and compliance environment and applying policies, procedures, and guidelines A broad understanding of cloud architecture (preferably AWS) Strong experience in Databricks, PySpark, and the AWS suite of applications (such as S3, Redshift, Lambda, Glue, EMR). Proficiency in programming languages such as Python Experience in Agile development techniques and methodologies. Solid understanding of data modelling, ETL processes, and data warehousing concepts Excellent communication and leadership skills, with the ability to collaborate effectively with technical and non-technical stakeholders. 
Experience with big data technologies such as Hadoop or Spark. Certification in AWS or Databricks. Prior significant experience working in a pharmaceutical or healthcare industry IT environment. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, we are committed to disrupting an industry and changing lives. Our work has a direct impact on patients, transforming our ability to develop life-changing medicines. We empower the business to perform at its peak and lead a new way of working, combining cutting-edge science with leading digital technology platforms and data. We dare to lead, applying our problem-solving mindset to identify and tackle opportunities across the whole enterprise. Our spirit of experimentation is lived every day through our events like hackathons. We enable AstraZeneca to perform at its peak by delivering world-class technology and data solutions. Are you ready to be part of a team that has the backing to innovate, disrupt an industry and change lives? Apply now to join us on this exciting journey! Date Posted 21-Jul-2025 Closing Date 25-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. 
We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
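The accountabilities above include developing data quality assurance strategies tailored to data product operations. A hedged sketch of what one such check might look like; on the platform described it would typically run as a Databricks/PySpark job, but plain Python keeps the example self-contained, and all field names are invented:

```python
# Illustrative data-quality check of the kind a QA strategy might codify:
# verify each record carries the required fields and that null rates stay
# under a threshold. Field names here are hypothetical.
def run_quality_checks(records, required_fields, max_null_rate=0.1):
    """Return a dict summarising schema and null-rate checks for a batch."""
    results = {"schema_ok": True, "null_rates": {}, "passed": True}
    for rec in records:
        if set(required_fields) - set(rec):  # a required field is missing entirely
            results["schema_ok"] = False
            results["passed"] = False
    n = len(records) or 1
    for field in required_fields:
        rate = sum(1 for rec in records if rec.get(field) is None) / n
        results["null_rates"][field] = rate
        if rate > max_null_rate:
            results["passed"] = False
    return results

batch = [{"id": 1, "dose_mg": 50}, {"id": 2, "dose_mg": None}]
print(run_quality_checks(batch, ["id", "dose_mg"], max_null_rate=0.5))
```

In a production pipeline the same checks would gate promotion of a data product from build to operations, failing the run rather than letting bad batches through.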

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Vadodara

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications
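The ELT layering this role describes (staging models feeding reporting marts) is what DBT formalises. A minimal sketch of that pattern using SQLite views so it runs anywhere; in the role itself these would be DBT models materialised in Snowflake, and all table and model names below are invented:

```python
# Hedged sketch of DBT-style layering: raw source -> staging view -> mart view.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_orders (order_id INT, amount REAL, status TEXT)")
con.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                [(1, 100.0, "complete"), (2, 40.0, "cancelled"), (3, 60.0, "complete")])

# Staging model: clean and filter the raw source (akin to a stg_orders.sql model).
con.execute("CREATE VIEW stg_orders AS "
            "SELECT order_id, amount FROM raw_orders WHERE status = 'complete'")

# Mart model: aggregate for reporting (akin to a fct_revenue.sql model).
con.execute("CREATE VIEW fct_revenue AS "
            "SELECT COUNT(*) AS orders, SUM(amount) AS revenue FROM stg_orders")

orders, revenue = con.execute("SELECT orders, revenue FROM fct_revenue").fetchone()
print(orders, revenue)  # → 2 160.0
```

Because each layer is a view over the one below, fixing a rule in staging automatically flows through to the mart, which is the maintainability argument for this layering.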

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Agra

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Agra

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks. Key Responsibilities: Provide technical leadership across Big Data and Python-based projects Architect, design, and implement scalable data pipelines and processing systems Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions Conduct code reviews and mentor junior engineers to improve code quality and skills Evaluate and implement new tools and frameworks to enhance data capabilities Troubleshoot complex data-related issues and support production deployments Ensure compliance with data security and governance standards
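The scalable pipelines this role architects rely on chunked, memory-bounded processing; a Tech Anchor would normally express that in Spark over a distributed cluster, but the generator pattern below sketches the same idea in plain Python (function names are illustrative, not from any framework):

```python
# Illustrative sketch: process a dataset in fixed-size chunks so the pipeline
# never materialises the whole input, the core idea behind partitioned
# processing in distributed systems.
def read_in_chunks(records, chunk_size):
    """Yield successive fixed-size chunks of the input."""
    for i in range(0, len(records), chunk_size):
        yield records[i:i + chunk_size]

def pipeline(records, chunk_size=2):
    total = 0
    for chunk in read_in_chunks(records, chunk_size):
        # Transform step: drop invalid (negative) records, aggregate incrementally.
        total += sum(r for r in chunk if r >= 0)
    return total

print(pipeline([3, -1, 4, 1, 5]))  # → 13
```

Each chunk is independent, so the same filter-then-aggregate step could be fanned out across Spark partitions with the partial sums combined at the end.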

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Vadodara

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks. Key Responsibilities: Provide technical leadership across Big Data and Python-based projects Architect, design, and implement scalable data pipelines and processing systems Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions Conduct code reviews and mentor junior engineers to improve code quality and skills Evaluate and implement new tools and frameworks to enhance data capabilities Troubleshoot complex data-related issues and support production deployments Ensure compliance with data security and governance standards

Posted 1 week ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities Create Solution Outline and Macro Design to describe end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles Contribute to pre-sales and sales support through RfP responses, solution architecture, planning and estimation Contribute to reusable component / asset / accelerator development to support capability development Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies Participate in customer PoCs to deliver the outcomes Participate in delivery reviews / product reviews and quality assurance, and work as design authority Preferred Education Master's Degree Required Technical And Professional Expertise Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems Experience in data engineering and architecting data platforms Experience in architecting and implementing data platforms on the Azure Cloud Platform. Experience on Azure cloud is mandatory (ADLS Gen 1 / Gen 2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hubs, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala/PySpark, Python, etc.) 
with Cloudera or Hortonworks Preferred Technical And Professional Experience Experience in architecting complex data platforms on the Azure Cloud Platform and on-premises Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualization, Talend, or Tibco Data Fabric Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Skill: Hadoop architecture, Spark, Python & advanced SQL (Hadoop Developer) Mode: Contract to Hire & Hybrid Rate: Max 225000 + taxes Location: All Brillio locations Notice Period: Immediate to max 15 days
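The "advanced SQL" this listing asks for usually means analytic constructs like window functions. A quick sketch of one, run in SQLite for portability; on the job the same query would target Hive or Spark SQL over Hadoop, and the table and data are invented:

```python
# Illustrative window-function query: rank rows within each partition and
# keep the top record per region. Table and values are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, rep TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("north", "asha", 300.0), ("north", "ravi", 500.0), ("south", "meena", 400.0),
])
top_reps = con.execute(
    "SELECT region, rep FROM ("
    "  SELECT region, rep, RANK() OVER ("
    "    PARTITION BY region ORDER BY amount DESC) AS rnk FROM sales"
    ") WHERE rnk = 1 ORDER BY region"
).fetchall()
print(top_reps)  # → [('north', 'ravi'), ('south', 'meena')]
```

A plain GROUP BY could return the top amount per region, but not the matching rep in the same pass; the window function keeps row-level detail while ranking within each partition.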

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies