
7004 Hadoop Jobs - Page 18

JobPe aggregates results for easy access, but you apply directly on the original job portal.

15.0 - 20.0 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: PySpark
Good to have skills: Python (Programming Language)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process while ensuring alignment with organizational goals. You will also engage in strategic planning to enhance application performance and user experience, making critical decisions that impact the overall success of the projects you oversee.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to foster their professional growth and enhance team capabilities.
- Continuously assess and improve application processes to increase efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark.
- Good-to-have skills: Experience with Python (Programming Language).
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration and ETL processes.
- Familiarity with cloud platforms and services related to application deployment.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 3 days ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Chennai

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: PySpark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle, fostering a collaborative environment that encourages team growth and success.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Good-to-have skills: PySpark, AWS, Airflow, Databricks, SQL, Scala.
- Experience should be 6+ years in the primary skill.
- Candidate must be a strong hands-on senior developer.
- As a lead, steer the team in completing their tasks and solving technical issues.
- Candidate must possess good technical and non-technical communication skills to highlight areas of concern or risk.
- Should have good troubleshooting skills to perform root cause analysis (RCA) of production support issues.
- Prior experience working with senior client stakeholders is preferable.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
- Candidate must be willing to work in Shift B, i.e., 11 AM IST to 9 PM IST.

Qualification: 15 years full time education

Posted 3 days ago

Apply

10.0 - 14.0 years

8 - 13 Lacs

Navi Mumbai

Work from Office

Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Assoc Manager
Qualifications: Any Graduation
Years of Experience: 10 to 14 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do? Help transform back office and network operations, reduce time to market, and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience is preferred, along with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.

What are we looking for?
- 5 years of programming skills at an advanced level, with responsibility for maintaining existing queries and creating new ones via SQL scripts.
- Python and PySpark programming skills; experience with Databricks (Palantir is an advantage).
- Direct, active participation in GenAI and Machine Learning projects.

Other skills:
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind; problem solver
- Knowledge of Telecom products and services

Roles and Responsibilities:
- In this role you are required to analyze and solve moderately complex problems.
- Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures.
- Requires understanding of the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with the direct supervisor or team leads.
- Generally interacts with peers and/or management levels at a client and/or within Accenture.
- Should require minimal guidance when determining methods and procedures on new assignments.
- Decisions often impact the team in which they reside and occasionally impact other teams.
- Would manage medium-to-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
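The SQL-driven billing data-quality work this listing describes can be illustrated with a minimal sketch. Table and column names are hypothetical, and SQLite stands in for a production warehouse:

```python
import sqlite3

# Hypothetical billing table; names are illustrative, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE billing_records (
    account_id TEXT,
    invoice_id TEXT,
    amount REAL,
    billed_at TEXT
);
INSERT INTO billing_records VALUES
    ('A1', 'INV-1', 120.0, '2024-05-01'),
    ('A1', 'INV-1', 120.0, '2024-05-01'),
    ('A2', 'INV-2', -30.0, '2024-05-02'),
    ('A3', 'INV-3', 75.5,  '2024-05-03');
""")

# Data-quality checks: duplicated invoices and negative amounts,
# the kind of anomaly a problem-management RCA would start from.
dupes = conn.execute("""
    SELECT invoice_id, COUNT(*) AS n
    FROM billing_records
    GROUP BY invoice_id
    HAVING n > 1
""").fetchall()

negatives = conn.execute(
    "SELECT invoice_id FROM billing_records WHERE amount < 0"
).fetchall()

print(dupes)      # [('INV-1', 2)]
print(negatives)  # [('INV-2',)]
```

Each flagged row would then feed a root-cause investigation rather than being silently corrected.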

Posted 3 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

This demand is for Ataccama. JD as under:
- Data engineering skills: Knowledge of data integration, data warehousing, and data lake technologies.
- Data quality and governance skills: Experience with data quality tools, data governance frameworks, and data profiling techniques.
- Programming skills: Proficiency in languages like Java, Python, or SQL, depending on the specific role.
- Cloud computing skills: Experience with cloud platforms like AWS, Azure, or Google Cloud Platform.
- Problem-solving skills: Ability to troubleshoot data issues and identify solutions.

As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead data governance strategy implementation.
- Develop and maintain data governance frameworks.
- Conduct data quality assessments.

Professional & Technical Skills:
- Strong understanding of data governance principles.
- Experience in implementing data governance solutions.
- Knowledge of data privacy regulations.
- Familiarity with data quality management practices.

Additional Information:
- The candidate should have a minimum of 5+ years of experience in Ataccama data governance.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
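A data-quality assessment like the one named in this listing can be sketched in plain Python. Records, field names, and the null-rate threshold are hypothetical; in practice a role like this would use a governance tool such as the one the listing names:

```python
# Minimal data-profiling sketch: null rate and distinct count per field.
records = [
    {"customer_id": "C1", "email": "a@x.com", "country": "IN"},
    {"customer_id": "C2", "email": None,      "country": "IN"},
    {"customer_id": "C3", "email": "c@x.com", "country": None},
]

def profile(rows):
    """Compute per-field null rate and distinct non-null count."""
    report = {}
    for field in rows[0].keys():
        values = [r[field] for r in rows]
        non_null = [v for v in values if v is not None]
        report[field] = {
            "null_rate": round(1 - len(non_null) / len(values), 2),
            "distinct": len(set(non_null)),
        }
    return report

report = profile(records)
print(report["email"])  # {'null_rate': 0.33, 'distinct': 2}

# A governance rule might flag any field whose null rate exceeds a threshold.
flagged = [f for f, stats in report.items() if stats["null_rate"] > 0.2]
print(flagged)  # ['email', 'country']
```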

Posted 3 days ago

Apply

8.0 - 12.0 years

25 - 35 Lacs

Pune, Chennai

Hybrid

Hi All,

Skill: Bigdata Engineer
Exp: 6-9 Years
Location: Pune, Chennai
Mandatory Skills: PySpark, Spark, Python, GCP, Scala, SQL, Hadoop, Hive, AWS

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL workflows using PySpark, Hadoop, and Hive.
- Deploy and manage big data workloads on cloud platforms like GCP and AWS.
- Work closely with cross-functional teams to understand data requirements and deliver high-quality solutions.
- Optimize data processing jobs for performance and cost-efficiency on cloud infrastructure.
- Implement automation and CI/CD pipelines to streamline deployment and monitoring of data workflows.
- Ensure data security, governance, and compliance in cloud environments.
- Troubleshoot and resolve data issues, monitoring job executions and system health.

Mandatory Skills:
- PySpark: Strong experience in developing data processing jobs and ETL pipelines.
- Google Cloud Platform (GCP): Hands-on experience with BigQuery, Dataflow, Dataproc, or similar services.
- Hadoop Ecosystem: Expertise with Hadoop, Hive, and related big data tools.
- AWS: Familiarity with AWS data services like S3, EMR, Glue, or Redshift.
- Strong SQL and data modeling skills.

Good to Have:
- Experience with CI/CD tools and DevOps practices (Jenkins, GitLab, Terraform, etc.).
- Containerization and orchestration knowledge (Docker, Kubernetes).
- Experience with Infrastructure as Code (IaC).
- Knowledge of data governance and data security best practices.
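The ETL-pipeline work described above can be sketched in plain Python rather than PySpark so the example stays self-contained; the records and field names are hypothetical:

```python
from collections import defaultdict

# Hypothetical raw events; in a PySpark job these would arrive as a DataFrame.
raw_events = [
    {"user": "u1", "event": "click", "bytes": 120},
    {"user": "u2", "event": "view",  "bytes": 300},
    {"user": "u1", "event": "click", "bytes": 80},
    {"user": "u3", "event": "click", "bytes": None},  # malformed record
]

# Extract/transform: drop malformed rows, then aggregate bytes per user --
# the same shape a PySpark job would express with DataFrame operations
# (drop nulls, group by user, sum).
clean = [e for e in raw_events if e["bytes"] is not None]

totals = defaultdict(int)
for e in clean:
    totals[e["user"]] += e["bytes"]

print(dict(totals))  # {'u1': 200, 'u2': 300}
```

In a distributed setting the same logic shards the group-by across executors; the transformation itself is unchanged.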

Posted 3 days ago

Apply

6.0 - 11.0 years

19 - 27 Lacs

Haryana

Work from Office

About Company: Founded in 2011, ReNew is one of the largest renewable energy companies globally, with a leadership position in India. Listed on Nasdaq under the ticker RNW, ReNew develops, builds, owns, and operates utility-scale wind energy projects, utility-scale solar energy projects, utility-scale firm power projects, and distributed solar energy projects. In addition to being a major independent power producer in India, ReNew is evolving to become an end-to-end decarbonization partner providing solutions in a just and inclusive manner in the areas of clean energy, green hydrogen, value-added energy offerings through digitalisation, storage, and carbon markets that increasingly are integral to addressing climate change. With a total capacity of more than 13.4 GW (including projects in pipeline), ReNew’s solar and wind energy projects are spread across 150+ sites, with a presence spanning 18 states in India, contributing to 1.9% of India’s power capacity. Consequently, this has helped to avoid 0.5% of India’s total carbon emissions and 1.1% of India’s total power sector emissions. In the over 10 years of its operation, ReNew has generated almost 1.3 lakh jobs, directly and indirectly. ReNew has achieved market leadership in the Indian renewable energy industry against the backdrop of the Government of India’s policies to promote growth of this sector. ReNew’s current group of stockholders contains several marquee investors including CPP Investments, Abu Dhabi Investment Authority, Goldman Sachs, GEF SACEF and JERA. Its mission is to play a pivotal role in meeting India’s growing energy needs in an efficient, sustainable, and socially responsible manner. ReNew stands committed to providing clean, safe, affordable, and sustainable energy for all and has been at the forefront of leading climate action in India.

Job Description

Key responsibilities:
1. Understand, implement, and automate ETL pipelines in line with industry standards.
2. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, designing infrastructure for greater scalability, etc.
3. Develop, integrate, test, and maintain existing and new applications.
4. Design and create data pipelines (data lake / data warehouses) for real-world energy analytics solutions.
5. Expert-level proficiency in Python (preferred) for automating everyday tasks.
6. Strong understanding of and experience in distributed computing frameworks, particularly Spark, Spark SQL, Kafka, Spark Streaming, Hive, Azure Databricks, etc.
7. Limited experience using other leading cloud platforms, preferably Azure.
8. Hands-on experience with Azure Data Factory, Logic Apps, Analysis Services, Azure Blob Storage, etc.
9. Ability to work in a team in an agile setting, familiarity with JIRA, and a clear understanding of how Git works.
10. Must have 5-7 years of experience.

Posted 3 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are… When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Your New Role: We are seeking a highly skilled and strategic Senior Manager, Platform Capabilities to lead the development, implementation, and governance of our platform capabilities. This leadership role will be responsible for ensuring our platforms are optimized for scalability, future readiness, and operational efficiency. The ideal candidate will have a strong background in platform engineering, architecture, and cross-functional leadership, combined with the ability to drive impactful changes and implement cutting-edge technologies in alignment with business needs.

Technology Leadership & Strategy
- Lead the design and development of highly scalable, reliable, and future-proof platforms using modern cloud-based technologies and architectures (e.g., cloud-native, microservices, and containerized environments).
- Evaluate and select new technologies and tools to enhance platform infrastructure, ensuring they meet long-term business goals and compliance standards.

Platform Governance
- Establish and maintain a robust framework for platform governance, ensuring adherence to industry best practices, data security, compliance standards, and regulatory requirements.
- Define and enforce policies related to platform access, privacy, data quality, data lifecycle management, and platform auditing.
- Manage FinOps and operational aspects of the platform, ensuring cost optimization, budget adherence, and efficient resource utilization.

Platform Architecture & Implementation
- Lead the architecture and design of platform components, integrating them across various ecosystems (cloud, on-premise, hybrid) to ensure seamless data flow, processing, and accessibility.
- Ensure the platform is optimized for high availability, disaster recovery, and performance, while being cost-effective and scalable to meet future business needs.
- Direct the end-to-end implementation of platform initiatives, from initial requirements gathering through deployment and continuous improvement, while ensuring timely and efficient execution.

Cross-functional Collaboration
- Work closely with engineering, analytics, data science, and business teams to ensure the platform meets the diverse needs of stakeholders and provides meaningful insights for the business.
- Lead cross-functional teams in delivering projects, providing guidance and mentorship to ensure timely delivery and alignment with strategic goals.
- Develop strong relationships with key internal stakeholders to prioritize platform enhancements and ensure continued support and adoption.

Innovation & Future Readiness
- Stay ahead of emerging technologies and trends in the platform landscape (AI, machine learning, automation) and incorporate innovative solutions to continuously enhance platform capabilities.
- Ensure the platform is adaptable to support future initiatives, such as advanced analytics, AI, and machine learning use cases.
- Lead the evolution of platform architecture to meet evolving business and technology needs, ensuring it supports future growth and technological advancements.

Team Leadership and Development
- Build, develop, and mentor a high-performing platform engineering team, fostering a culture of collaboration, innovation, and excellence.

Qualifications & Experiences
- 8+ years of experience in data engineering, platform architecture, and implementation, with a focus on building and scaling complex data platforms.
- 3+ years of management experience.
- Proven experience in designing and implementing cloud-based data platforms (AWS, Azure, GCP) with expertise in modern data architectures and tools (e.g., Hadoop, Spark, Kafka, Snowflake, etc.).
- Strong understanding of data governance, security, and compliance standards (GDPR, CCPA, HIPAA, etc.).
- Expertise in managing the data platform lifecycle, including strategy, architecture, governance, and operations.
- Strong leadership skills with a proven track record of leading cross-functional teams and driving projects to successful completion.
- Excellent communication and stakeholder management skills, with the ability to influence at all levels of the organization.
- A Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.

How We Get Things Done… This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD: Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.

Posted 3 days ago

Apply

5.0 - 10.0 years

2 - 5 Lacs

Chennai

Work from Office

Job Title: Data Engineer
Experience: 5-10 Years
Location: Remote

Responsibilities:
- Design, build and maintain core data infrastructure pieces that allow Aircall to support our many data use cases.
- Enhance the data stack, lineage monitoring and alerting to prevent incidents and improve data quality.
- Implement best practices for data management, storage and security to ensure data integrity and compliance with regulations.
- Own the core company data pipeline, responsible for converting business needs into efficient and reliable data pipelines.
- Participate in code reviews to ensure code quality and share knowledge.
- Lead efforts to evaluate and integrate new technologies and tools to enhance our data infrastructure.
- Define and manage evolving data models and data schemas.
- Manage SLAs for data sets that power our company metrics.
- Mentor junior members of the team, providing guidance and support in their professional development.
- Collaborate with data scientists, analysts and other stakeholders to drive efficiencies for their work, supporting complex data processing, storage and orchestration.

A little more about you:
- Bachelor's degree or higher in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a strong focus on designing and building data pipelines and infrastructure.
- Proficient in SQL and Python, with the ability to translate complexity into efficient code.
- Experience with data workflow development and management tools (dbt, Airflow).
- Solid understanding of distributed computing principles and experience with cloud-based data platforms such as AWS, GCP, or Azure.
- Strong analytical and problem-solving skills, with the ability to effectively troubleshoot complex data issues.
- Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
- Experience with data tooling, data governance, business intelligence and data privacy is a plus.
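Workflow tools like Airflow, which this listing asks for, model a pipeline as a DAG of dependent tasks that a scheduler runs in dependency order. A minimal pure-Python sketch of that core idea, with hypothetical task names:

```python
# Minimal DAG-style task ordering, the scheduling idea behind tools like Airflow.
# Task names and dependencies are illustrative only.
deps = {
    "extract": [],
    "transform": ["extract"],
    "quality_check": ["transform"],
    "load": ["quality_check"],
    "report": ["load"],
}

def topo_order(graph):
    """Return tasks so each runs only after its dependencies (Kahn's algorithm)."""
    remaining = {task: set(d) for task, d in graph.items()}
    order = []
    while remaining:
        # Tasks whose dependencies have all completed are ready to run.
        ready = sorted(t for t, d in remaining.items() if not d)
        if not ready:
            raise ValueError("cycle detected in pipeline")
        for t in ready:
            order.append(t)
            del remaining[t]
        for d in remaining.values():
            d.difference_update(ready)
    return order

print(topo_order(deps))
# ['extract', 'transform', 'quality_check', 'load', 'report']
```

Airflow adds scheduling, retries, and alerting on top of exactly this ordering; tasks with no mutual dependency can run in parallel.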

Posted 3 days ago

Apply

9.0 - 12.0 years

15 - 20 Lacs

Chennai

Work from Office

Job Title: Data Engineer Lead / Architect (ADF)
Experience: 9-12 Years
Location: Remote / Hybrid

Role and Responsibilities:
- Talk to client stakeholders and understand the requirements for building their data warehouse / data lake / data lakehouse.
- Design, develop and maintain data pipelines in Azure Data Factory (ADF) for ETL from on-premise and cloud-based sources.
- Design, develop and maintain data warehouses and data lakes in Azure.
- Run large data platform and other related programs to provide business intelligence support.
- Design and develop data models to support business intelligence solutions.
- Implement best practices in data modelling and data warehousing.
- Troubleshoot and resolve issues related to ETL and data connections.

Skills Required:
- Excellent written and verbal communication skills.
- Excellent knowledge of and experience in ADF.
- Well versed with ADLS Gen 2.
- Knowledge of SQL for data extraction and transformation.
- Ability to work with various data sources (Excel, SQL databases, APIs, etc.).
- Knowledge of SAS would be an added advantage.
- Knowledge of Power BI would be an added advantage.

Posted 3 days ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation for application processes and workflows.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts and architecture.
- Experience with SQL and data manipulation techniques.
- Familiarity with cloud computing platforms and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 3 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: standard 15 years of education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Your role will also include monitoring data workflows and troubleshooting any issues that arise, ensuring that data is accessible and reliable for stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and optimize data pipelines to enhance data processing efficiency.
- Collaborate with data scientists and analysts to understand data needs and provide the necessary support.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A standard 15 years of education is required.

Qualification: standard 15 years of education

Posted 3 days ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Engineering
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are looking for a highly skilled and experienced ETL Lead with a strong background in data warehousing, ETL processes, and big data technologies. The ideal candidate should have hands-on experience with Cloudera tools, excellent SQL/PLSQL skills, and the ability to lead a team while working closely with clients in the Banking/Insurance domain.

Roles & Responsibilities:
- Design, develop, and maintain end-to-end data warehouse and ETL solutions.
- Work extensively with SQL, PLSQL, Oracle, Hadoop, and Cloudera Data Platform tools (Spark, Hive, Impala).
- Lead a team of 4+ engineers, guiding both technical and operational tasks.
- Manage daily operations and provide L1/L2/L3 support, ensuring no SLA breaches.
- Collaborate with clients to understand requirements and provide scalable, reliable solutions.
- Conduct team meetings, client discussions, and knowledge transfer (KT) sessions.
- Utilize shell scripting and Unix commands, with working knowledge of Java/Python for task automation and integration.
- Ensure the team follows best practices in coding, documentation, and support.
- Stay up to date with new technologies and demonstrate a willingness to learn as per client needs.
- Be available to work from the client location and travel daily as required.
- Proficient in analysing and rewriting legacy systems.
- Solid team leadership skills with experience managing 4+ people.
- Excellent communication and interpersonal skills.
- Demonstrated ability to manage operations and support teams efficiently.
- Proactive and positive attitude toward learning and adopting new technologies.
- Strong analytical skills and a solution-oriented mindset.
- Ability to work under minimal supervision in a fast-paced environment.

Professional & Technical Skills:
- 8+ years of experience in data warehousing and ETL technologies.
- Strong expertise in SQL, PLSQL, Oracle, and the Hadoop ecosystem.
- Hands-on experience with Spark, Hive, and Impala on Cloudera Data Platform.
- Strong understanding of the Banking/Insurance domain.
- Ability to understand legacy code and translate it effectively to modern technologies.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Engineering.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 3 days ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Pune

Work from Office

Experience with ETL processes and data warehousing. Proficient in SQL and Python/Java/Scala. Team lead experience.

Posted 3 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Pune

Work from Office

Experience with ETL processes and data warehousing. Proficient in SQL.

Posted 3 days ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Hyderabad, Pune, Telangana

Work from Office

We have immediate openings in Big Data for a contract-to-hire role for multiple clients.

Job Details:
- Skills: Big Data
- Job type: Contract to Hire

Primary Skills:
- 6-8 years of experience working as a big data developer / supporting environments.
- Strong knowledge of Unix/big data scripting.
- Strong understanding of the big data (CDP/Hive) environment.
- Hands-on with GitHub and CI/CD implementations.
- Attitude to learn and understand every task being done, and the reasoning behind it.
- Ability to work independently on specialized assignments within the context of project deliverables.
- Take ownership of providing solutions and tools that iteratively increase engineering efficiency.
- Excellent communication skills; team player.
- Good to have Hadoop and Control-M tooling knowledge.
- Good to have automation experience and knowledge of any monitoring tools.

Role:
- You will work with the team handling an application developed using Hadoop/CDP and Hive.
- You will work within the Data Engineering team and with the Lead Hadoop Data Engineer and Product Owner.
- You are expected to support the existing application as well as design and build new data pipelines.
- You are expected to support evergreening or upgrade activities of CDP/SAS/Hive.
- You are expected to participate in the service management of the application.
- Support issue resolution, improve processing performance, and prevent issues from recurring.
- Ensure the use of Hive, Unix scripting, and Control-M reduces lead time to delivery.
- Support the application in the UK shift as well as on-call support overnight/on weekends. This is mandatory.

Working Hours:
- UK Shift - one week per month
- On Call - one week per month

Posted 3 days ago

Apply

4.0 - 8.0 years

0 - 0 Lacs

Pune

Hybrid

So, what’s the role all about? Within Actimize, the AI and Analytics Team is developing the next-generation advanced analytical cloud platform that will harness the power of data to provide maximum accuracy for our clients’ Financial Crime programs. As part of the PaaS/SaaS development group, you will be responsible for developing this platform for Actimize cloud-based solutions and will work with cutting-edge cloud technologies. How will you make an impact? NICE Actimize is the largest and broadest provider of financial crime, risk and compliance solutions for regional and global financial institutions and has been consistently ranked number one in the space. At NICE Actimize, we recognize that every employee’s contributions are integral to our company’s growth and success. To find and acquire the best and brightest talent around the globe, we offer a challenging work environment, competitive compensation and benefits, and rewarding career opportunities. Come share, grow and learn with us – you’ll be challenged, you’ll have fun and you’ll be part of a fast-growing, highly respected organization. This new SaaS platform will enable our customers (some of the biggest financial institutions around the world) to create solutions on the platform to fight financial crime. Have you got what it takes? Design, implement, and maintain real-time and batch data pipelines for fraud detection systems. Automate data ingestion from transactional systems, third-party fraud intelligence feeds, and behavioral analytics platforms. Ensure high data quality, lineage, and traceability to support audit and compliance requirements. Collaborate with fraud analysts and data scientists to deploy and monitor machine learning models in production. Monitor pipeline performance and implement alerting for anomalies or failures. Ensure data security and compliance with financial regulations. Qualifications: Bachelor’s or master’s degree in Computer Science, Data Engineering, or a related field.
4-6 years of experience in a DataOps role, preferably in fraud or risk domains. Strong programming skills in Python and SQL. Knowledge of financial fraud patterns, transaction monitoring, and behavioral analytics. Familiarity with fraud detection systems, rules engines, or anomaly detection frameworks. Experience with AWS cloud platforms. Understanding of data governance, encryption, and secure data handling practices. Experience with fraud analytics tools or platforms like Actimize. What’s in it for you? Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr! Enjoy NiCE-FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Requisition ID: 7822 Reporting into: Director Role Type: Tech Manager
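Where the posting asks for anomaly detection over transaction data, the core idea can be sketched with a simple z-score screen over transaction amounts. A minimal stdlib sketch: the amounts and the 2.5-sigma cutoff are invented for illustration and are not Actimize logic.

```python
import statistics

def flag_anomalies(amounts, threshold=2.5):
    """Flag amounts more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:
        return []  # no variation, nothing to flag
    return [a for a in amounts if abs(a - mean) / stdev > threshold]

# Invented transaction amounts: seven routine payments and one extreme outlier.
txns = [120.0, 95.5, 130.2, 110.0, 101.3, 99.9, 125.4, 9800.0]
print(flag_anomalies(txns))
```

A production pipeline would compute such statistics per customer or per merchant segment and feed flagged transactions to a rules engine or an ML model, but the screening step itself is this simple.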

Posted 3 days ago

Apply

5.0 - 8.0 years

11 - 21 Lacs

Pune

Work from Office

This role is accountable for developing, expanding and optimizing the Data Management Architecture, Design & Implementation under Singtel Data Platform & Management. Design, develop and implement data governance and management solutions, data quality, privacy, protection and associated control technology solutions as per best industry practice. Review, evaluate and implement Data Management standards, primarily Data Classification and Data Retention, across systems. Design, develop and implement automated data discovery rules to identify the presence of PII attributes. Drive development, optimization, testing and tooling to improve overall data control management (security, data privacy, protection, data quality). Review, analyze, benchmark, and approve solution designs from product companies, internal teams, and vendors. Ensure that proposed solutions are aligned and conform to the data landscape, big data architecture guidelines and roadmap. SECTION B: KEY RESPONSIBILITIES AND RESULTS 1 Design and implement data management standards like Catalog Management, Data Quality, Data Classification, Data Retention 2 Drive BAU process, testing and tooling to improve data security, privacy, and protection 3 Identify, design, and implement internal process improvements: automating manual processes, controlling and optimizing data technology service delivery 4 Implement and support the Data Management Technology solution throughout its lifecycle: user onboarding, upgrades, fixes, access management, etc. SECTION C: QUALIFICATIONS / EXPERIENCE / KNOWLEDGE REQUIRED Category Essential for this role Education and Qualifications Diploma in Data Analytics, Data Engineering, IT, Computer Science, Software Engineering, or equivalent.
Work Experience Exposure to Data Management and Big Data concepts Knowledge and experience in Data Management, Data Integration and Data Quality products Technical Skills Informatica CDGC, Collibra, Alation, Informatica Data Quality, Data Privacy Management, Azure Databricks
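The automated data discovery responsibility above amounts to running pattern rules over sampled column values and tagging columns that match. A minimal sketch, assuming simple regex rules: the rule set, column names and 80% match threshold are all illustrative, not the behavior of Informatica or Collibra.

```python
import re

# Hypothetical detection rules; patterns are illustrative only.
PII_RULES = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "phone_in": re.compile(r"^(\+91[\s-]?)?[6-9]\d{9}$"),
    "pan_in": re.compile(r"^[A-Z]{5}\d{4}[A-Z]$"),
}

def discover_pii(column_samples, min_match_ratio=0.8):
    """Tag a column as PII when most sampled values match a known pattern."""
    findings = {}
    for column, samples in column_samples.items():
        values = [s for s in samples if s]
        if not values:
            continue
        for name, pattern in PII_RULES.items():
            hits = sum(1 for v in values if pattern.match(v))
            if hits / len(values) >= min_match_ratio:
                findings[column] = name
                break
    return findings

# Invented sampled values from two columns.
samples = {
    "contact": ["alice@example.com", "bob@example.org", "carol@example.net"],
    "notes": ["call later", "sent invoice", "n/a"],
}
print(discover_pii(samples))
```

Real discovery tools add dictionary lookups, column-name heuristics and confidence scoring on top of this sampling-plus-patterns core.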

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru South, Karnataka, India

On-site

You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together. This role is for a Data Testing Analyst in the Regulatory Reporting automation program. This individual will be responsible for assisting the Business Specialist Manager in driving the definition, gathering, exploration, and analysis of Finance data to deliver the end-to-end automation of our regulatory reporting platform. This individual will assist the organization in coordinating with several groups within American Express during the design, implementation, and migration of the implemented solution into production. The individual selected will partner closely with the Business Specialist Manager and Product Owners to support defining the functionality to be built, collaborate with Technology to design how the functionality will work, and validate at regular intervals that the software features developed align with the original requirements provided to the team. How will you make an impact in this role?
Support data analysis on existing processes and datasets to understand and support Point of Arrival (POA) process design Support and guide the determination of portfolios, data elements and grain of data required for designing processes Support the team in reviewing data scenarios and provide clarification on how to report on these scenarios in alignment with regulatory guidance Identify and support business requirements, functional design, prototyping, testing, training, and supporting implementations Support the development of functional requirement documents (FRDs) and process-specific design documentation to support process and report owner requirements Document and understand core components of the solution architecture, including data patterns, data-related capabilities, and standardization and conformance of disparate datasets Support the implementation of master and reference data to be used across operational and reporting processes Participate in daily meetings with the pods (implementation groups for various portfolios of the Company for data sourcing and regulatory classification and reporting). 
Coordinate with various Product Owners, Process Owners, Subject Matter Experts, Solution Architecture colleagues, and the Data Management team to ensure builds are appropriate for American Express products Participate in user acceptance testing, parallel run testing, and any other testing required to ensure the build meets the authored requirements, including development and execution of test cases and documentation of results Assist in the development of executable testing algorithms that enable validation of the expected system functionality, including replication of deterministic logic and filtering criteria Minimum Qualifications SQL and data analysis experience (minimum 4+ years) Product/platform understanding and process design experience Knowledgeable about Financial Data Warehouse and Reporting Solutions (such as ODS, AxiomSL, OFSAA, and Hadoop concepts) Knowledgeable in data analytics/profiling Knowledgeable in creating S2T and functional designs Knowledgeable in creating data mappings, analyzing SOR (System of Record) data, and implementing Data Quality Rules to identify data issues in SORs Experience with MS Excel and Power Query Testing management and execution experience Foundational data warehousing principles and data modeling experience is a plus Agile training is a plus Financial reporting or accounting experience is a plus A good understanding of banking products is a plus Exhibits organizational skills with the ability to meet/exceed critical deadlines and manage multiple deliverables simultaneously A self-starter, proactive team player with a passion to consistently deliver high-quality service and exceed customers’ expectations Excellent written and verbal communication with the ability to communicate highly complex concepts and processes in simple terms, pragmatically, across Finance, Business and Technology stakeholders Excellent relationship building, presentation and collaboration skills Knowledge of US Regulatory Reports 
(Y9C, Y14, Y15, 2052a, among others) Working exposure to data analysis and testing of financial data domains to support regulatory and analytical requirements for large-scale banking/financial organizations Experience in the development of testing automation capabilities Experience with cloud capabilities We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
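The testing-algorithm responsibility described above, independently replicating deterministic logic and comparing it against system output, can be sketched as follows. The rule, thresholds and classification labels are invented for illustration and are not an actual regulatory classification rule.

```python
# Hypothetical deterministic rule, re-implemented independently of the system
# under test so the two can be compared row by row.
def classify_exposure(balance, days_past_due):
    """Replica of a (made-up) deterministic reporting classification rule."""
    if days_past_due >= 90:
        return "non_accrual"
    if balance < 0:
        return "credit_balance"
    return "performing"

def validate(system_rows):
    """Compare system-of-record output against the replicated rule."""
    mismatches = []
    for row in system_rows:
        expected = classify_exposure(row["balance"], row["dpd"])
        if expected != row["system_class"]:
            mismatches.append((row["account"], expected, row["system_class"]))
    return mismatches

# Invented test fixture: A2 is deliberately misclassified by the "system".
rows = [
    {"account": "A1", "balance": 1500.0, "dpd": 0, "system_class": "performing"},
    {"account": "A2", "balance": 900.0, "dpd": 120, "system_class": "performing"},
]
print(validate(rows))
```

In practice the replica runs over full extracts during UAT or parallel runs, and every mismatch becomes a defect or a requirements clarification.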

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Company Description At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future. Job Description The QA Leader is responsible for formulating test strategies, designing test plans and test cases for the QA team as well as executing tests to validate the application is working as required by the business. This position provides clear and concise feedback to management on the production readiness of the software being tested, ensuring that Nielsen systems meet the highest standard of quality that is practical to obtain. 
Requirements Graduate in Computer Engineering or Computer Science (or similar degrees) Ability to communicate in English; able to work in a team Experience with tools like JIRA and software development and collaboration tools such as Confluence Plan for test teams and environments to have the right tools (manual and/or automated) in place, enabling the ability to deliver consistent, quality output Assist testers in test planning Support other test team members in the implementation of testing-related activities Previous experience as a QA Lead Skills Strong testing, functional, analytical and technical abilities; ability to find bugs; attention to detail; troubleshooting; able to document in detail the tests performed Basic knowledge of database management, SQL, Microsoft SQL Server and administration tasks Detail oriented Review requirements, specifications and technical design documents to provide timely and meaningful feedback Execute, estimate, prioritize, plan and coordinate testing activities Identify the testable requirements and prepare the test strategy for each of the testing types identified Experience in test planning, design and execution Maintain and evolve formal QA testing processes for a software development team Monitor the effectiveness of testing and bring about improvements through insights gained via analysis Exposure to stress, reliability, and performance testing Preferred Experience with highly critical production environments; understanding the importance of production environments (test, deploy) Test automation (GUI, web and Windows apps) Experience in writing unit/integration tests, including test automation. 
Improve product quality by identifying, implementing, and using automated testing tools and frameworks Strong analytical and communication skills Java/J2EE experience on the Linux platform Familiarity with Hadoop Java software development Familiarity with a messaging bus, such as AMQ or Kafka SQL programming (PostgreSQL) Experienced in building RESTful APIs Work with web technologies including AWS, Docker, Java, Python, JavaScript, React/Redux. Additional Information Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Posted 3 days ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Noida, Pune, Bengaluru

Work from Office

We are hiring a Big Data Lead for one of our clients based in Noida / Indore / Bangalore / Hyderabad / Pune for a full-time position; we are willing to hire immediately. Please share your resume with rana@enormousenterprise.in / anjana@enormousenterprise.in Interview Mode: video interview, followed by a second video interview or an F2F interview Experience Level - 7 to 10 years, with a minimum of 2-5 years of lead experience. Position Summary: We are looking for candidates with hands-on experience in Big Data or cloud technologies. Must-have technical skills: 7 to 10 years of experience Data ingestion, processing and orchestration knowledge Expertise and hands-on experience with Spark DataFrames and Hadoop ecosystem components – Must Have Good hands-on experience with any of the clouds (AWS/Azure/GCP) – Must Have Good knowledge of PySpark (SparkSQL) – Must Have Good knowledge of shell scripting & Python – Good to Have Good knowledge of SQL – Good to Have Good knowledge of migration projects on Hadoop – Good to Have Good knowledge of a workflow engine like Oozie or Autosys – Good to Have Good knowledge of Agile development – Good to Have Passionate about exploring new technologies – Good to Have Automation approach – Good to Have Good communication skills – Must Have Roles & Responsibilities Lead technical implementation of Data Warehouse modernization projects for Impetus Design and development of applications on cloud technologies Lead technical discussions with internal & external stakeholders Resolve technical issues for the team Ensure that the team completes all tasks & activities as planned Code development
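As a rough illustration of the kind of transformation the Spark DataFrame requirement refers to, the stdlib snippet below mirrors what a SparkSQL aggregation such as `df.groupBy("city").agg(sum("amount"))` expresses; it is a stand-in, not PySpark API usage, and the records are invented.

```python
from collections import defaultdict

# Invented rows, shaped like what a DataFrame would hold.
records = [
    {"city": "Pune", "amount": 10.0},
    {"city": "Noida", "amount": 4.0},
    {"city": "Pune", "amount": 6.5},
]

# Group-by-and-sum, the single-machine analogue of the distributed aggregation.
totals = defaultdict(float)
for row in records:
    totals[row["city"]] += row["amount"]

print(dict(totals))
```

The value of Spark is that the same declarative grouping runs partitioned across a cluster; the logic a candidate writes in PySpark is conceptually this loop.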

Posted 3 days ago

Apply

10.0 - 13.0 years

27 Lacs

Pune, Kharadi

Work from Office

Shift - 11am to 8pm IST Mandatory - Strong knowledge of, and hands-on development experience in, Oracle PL/SQL - Strong knowledge of, and hands-on development experience with, SQL analytic functions - Experience with developing complex, numerically intensive business logic - Good knowledge of and experience in database performance tuning - Fluency in UNIX scripting. Good-to-have - Knowledge of/experience in any of Python, Hadoop/Hive/Impala, horizontally scalable databases, columnar databases - Oracle certifications - Any of the DevOps tools/techniques: CI/CD, Jenkins/GitLab, source control/git, deployment automation such as Liquibase - Experience with production issues/deployments.
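The SQL analytic functions asked for above compute aggregates over a window of rows without collapsing them, unlike GROUP BY. A small sketch using Python's bundled SQLite as a stand-in for Oracle (the table and columns are invented; window functions need SQLite 3.25+):

```python
import sqlite3

# In-memory table of invented trades.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, amt REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("fx", 100.0), ("fx", 50.0), ("rates", 70.0)])

# SUM(...) OVER (PARTITION BY ...) attaches the per-desk total to every row
# while keeping each individual row visible.
rows = conn.execute("""
    SELECT desk, amt,
           SUM(amt) OVER (PARTITION BY desk) AS desk_total
    FROM trades
    ORDER BY desk, amt
""").fetchall()
print(rows)
```

Oracle adds richer analytic functions (RANK, LAG/LEAD, windowed running totals), but the OVER/PARTITION BY shape shown here is the same.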

Posted 3 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Come work at a place where innovation and teamwork come together to support the most exciting missions in the world! We are seeking a talented Sr. QA Engineer to deliver roadmap features of Enterprise TruRisk Platform which would help customers to Measure, Communicate and Eliminate Cyber Risks. The Lead QA Engineer will design, implement, document, and maintain testing frameworks. You will be responsible for the quality of core product capabilities using micro-services and Big Data based components. This is a fantastic opportunity to be an integral part of a team building Qualys next generation platform using Big Data & Micro-Services based technology to process over billions of transactions data per day, leverage open-source technologies, and work on challenging and business-impacting initiatives. Responsibilities: Perform functional testing of the Enterprise TruRisk Platform and its various modules. Conduct integration testing across different systems, working closely with cross-functional teams to ensure seamless data and service flow. Test Big Data ingestion and aggregation pipelines using Spark shell, SQL, and other data tools. Develop and maintain automation frameworks for functional and regression testing. Own and execute end-to-end workflow automation using custom or industry-standard frameworks. Define test strategies, test plans, and test cases for new features, platform enhancements, and services. Debug and troubleshoot issues identified in pre-production or production environments. Drive system performance testing of the platform and data applications. Define operational procedures, service monitors, alerting mechanisms, and coordinate implementation with the NOC team. Collaborate with product and engineering teams to review requirements, specifications, and technical designs, and ensure proper test coverage. Recreate complex production/customer issues to verify root causes and ensure resolution. 
Identify technical interdependencies, potential issues, and propose effective solutions. Requirements: 6 years of experience in a full-time functional testing & automation role as a lead. Hands-on experience automating backend applications (e.g., databases, REST APIs, server-side components). Knowledge of relational databases and SQL. Good debugging skills. Working experience in Linux/Unix environments. Good understanding of testing methodologies. Good to have: hands-on experience working with Big Data technologies like Hadoop, Spark, Airflow, Kafka, Elastic and other distributed components. Experience in the security domain is an advantage.
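One common form of the ingestion-pipeline testing described above is key-level reconciliation between a source extract and the ingested target. A hedged stdlib sketch with invented in-memory datasets (real runs would pull the two sides from Spark/SQL):

```python
def reconcile(source, target, key="id"):
    """Report keys missing from, or unexpected in, the ingested target."""
    src_keys = {row[key] for row in source}
    tgt_keys = {row[key] for row in target}
    return {
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }

# Invented datasets: simulate one record dropped during ingestion.
source_rows = [{"id": i} for i in range(5)]
target_rows = [{"id": i} for i in range(5) if i != 3]

print(reconcile(source_rows, target_rows))
```

Count checks catch gross failures; key-level diffs like this also localize *which* records were dropped or duplicated, which is what a functional QA sign-off needs.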

Posted 3 days ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description United's Digital Technology team is comprised of many talented individuals all working together with cutting-edge technology to build the best airline in the history of aviation. Our team designs, develops and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Job Overview And Responsibilities United Airlines’ Enterprise Data Analytics department partners with business and technology leaders across the company to transform data analytics into a competitive advantage. An offshore team based in Delhi, India will work closely with this group and support it with complementing skills and capabilities. The key objectives are to improve operating performance, boost customer experience and drive incremental revenue by embedding data in decision making across all levels of the organization. An entry-level data analyst will be responsible for data, analytics and modeling, digging deep into details as well as assessing the big picture. Execute solutions to business problems using data analysis, data mining, optimization tools, statistical modeling and machine learning techniques Continuously develop and improve analysis methodologies Analyze and model data, working from concept through to execution Understand a business problem and the available data, and identify what analytics and modeling techniques can be applied to answer a business question This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc. Qualifications What’s needed to succeed (Minimum Qualifications): Bachelor's degree in a quantitative field like Math, Statistics or Operations Research At least 2 years of experience in modeling/machine learning required Proven comfort and an intellectual curiosity for working with very large sets of data, and hands-on knowledge in predictive modeling Strong knowledge of either R or Python Experienced in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships Understanding of how and when to apply predictive and machine learning techniques like logistic regression, random forest, GBM, neural nets, SVM, etc. is required Proficient in using database querying tools and able to write complex queries and procedures using Teradata SQL and/or Microsoft T-SQL Hands-on experience using Big Data ecosystems (Hadoop/Spark), API gateways and non-structured data is a plus Ability to communicate complex quantitative analysis and algorithms in a clear, precise and actionable manner Must be legally authorized to work in India for any employer without sponsorship Must be fluent in English (written and spoken) Successful completion of an interview is required to meet job qualifications Reliable, punctual attendance is an essential function of the position What will help you propel from the pack (Preferred Qualifications): Master's degree GGN00002106
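Of the modeling techniques listed above, logistic regression is the simplest to show end to end. A toy sketch fitted by plain stochastic gradient descent on an invented one-feature dataset (label is 1 when x > 2); real work would use R or Python ML libraries as the posting says.

```python
import math

# Invented training data: (feature, label).
data = [(0.5, 0), (1.0, 0), (1.5, 0), (2.5, 1), (3.0, 1), (3.5, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Stochastic gradient descent on the log-loss of p = sigmoid(w*x + b).
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    for x, y in data:
        p = sigmoid(w * x + b)
        w -= lr * (p - y) * x
        b -= lr * (p - y)

def predict(x):
    return sigmoid(w * x + b)

print(round(predict(1.0), 3), round(predict(3.0), 3))
```

The per-sample update `(p - y) * x` is the gradient of the log-loss, which is why no separate derivative code is needed; library implementations add regularization and batching but optimize the same objective.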

Posted 3 days ago

Apply

4.0 - 9.0 years

0 Lacs

Nagpur, Maharashtra, India

On-site

Position: Big Data (Senior Developer / Lead) Experience: 4 - 9 years Location: Nagpur Responsibilities: Preferred skillset: Spark, Scala, Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.), SQL, Jenkins. Experience in Big Data technologies and real-time data processing platforms (Spark Streaming - Kafka). Experience in Cloudera would be an advantage. Hands-on experience with Unix commands. Strong foundation in computer science fundamentals: data structures, algorithms, and coding. Experienced in performance optimization techniques. Consistently demonstrates clear and concise written and verbal communication. Ability to multi-task and provide weekend support for production releases.
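The real-time processing skills named above (Spark Streaming with Kafka) center on windowed aggregation over an event stream. The stdlib sketch below mimics a sliding 60-second count per event key; the event names and timestamps are invented, and this is a single-machine stand-in, not Spark Streaming API usage.

```python
from collections import deque

def windowed_counts(events, window_seconds=60):
    """Sliding-window count per key; `events` is (timestamp, key) in time order."""
    window = deque()   # events still inside the window, oldest first
    counts = {}        # live count per key
    results = []       # snapshot of counts after each event
    for ts, key in events:
        window.append((ts, key))
        counts[key] = counts.get(key, 0) + 1
        # Evict events that have fallen out of the window.
        while window and window[0][0] <= ts - window_seconds:
            _, old_key = window.popleft()
            counts[old_key] -= 1
            if counts[old_key] == 0:
                del counts[old_key]
        results.append(dict(counts))
    return results

# Invented stream: two logins, then a click after the logins have expired.
events = [(0, "login"), (30, "login"), (90, "click")]
print(windowed_counts(events))
```

A Spark Streaming job expresses the same eviction-and-count logic declaratively (window/slide durations over a Kafka DStream) and distributes the state across executors.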

Posted 3 days ago

Apply

8.0 - 13.0 years

0 Lacs

Delhi, India

On-site

Responsibilities: Developing intelligent and scalable engineering solutions from scratch Partnering with customers to share the product vision and goals Working on high/low-level product designs & roadmaps along with a team of ace developers (S)he will be responsible for server-side component design, detailed technical design, development, testing, implementation, and maintenance. Building products and applications using cutting-edge technologies on an open-source Java tech stack, HTML5, Backbone.js, Hadoop, Cassandra, MongoDB, etc. (S)he will also review and understand business requirements, ensuring that development tasks are completed within the timeline provided and that issues are fully tested with minimal defects Must Have: B.E/B.Tech/MCA with at least 8-13 years of hands-on web development experience in Core Java and J2EE Should have experience in end-to-end application development in an Agile environment Should be able to define technical architecture, be a hands-on coder, and maintain standards and other team policies Should have experience managing a team of at least 3 people Experience with OOAD frameworks such as Spring, Hibernate, REST Experience in TDD, continuous integration and build tools (Maven, Jenkins, Gradle) Good knowledge of design patterns Understanding of the latest technologies and tools in the Java/JEE space and using them Good experience with databases like MySQL or Oracle or any NoSQL database Strong interpersonal skills with the ability to work effectively across team boundaries Experience with Agile methodology and development tools Know-how around integration patterns for queuing, caching, etc. Non-relational platforms like DynamoDB/MongoDB (NoSQL) would be an add-on Good to have: Experience in cloud computing or Linux Ability to respond well under pressure Logical mind with keen analytical skills

Posted 3 days ago

Apply