
4069 Hadoop Jobs - Page 2

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 years

20 - 28 Lacs

Bengaluru

On-site

Source: GlassDoor

Job Purpose:
We are seeking a dynamic and skilled Data Scientist to join our analytics team. The ideal candidate will work collaboratively with data scientists, business stakeholders, and subject matter experts to deliver end-to-end advanced analytics projects that support strategic decision-making. This role demands technical expertise, strong problem-solving abilities, and the capability to translate business needs into impactful analytical solutions.

Key Responsibilities:
- Collaborate with internal stakeholders to design and deliver advanced analytics projects.
- Independently manage project workstreams with minimal supervision.
- Identify opportunities where analytics can support or improve business decision-making processes.
- Provide innovative solutions beyond traditional analytics methodologies.
- Apply strong domain knowledge and technical expertise to develop conceptually sound models and tools.
- Mentor and guide junior team members in their professional development.
- Communicate analytical findings effectively through clear and impactful presentations.

Desired Skills & Experience:
- Relevant Experience: 5+ years of analytics experience in Financial Services (Universal Bank/NBFC/Insurance), Rating Agencies, E-commerce, Retail, or Consulting. Exposure to areas like Customer Analytics, Retail Analytics, Collections & Recovery, Credit Risk Ratings, etc.
- Statistical & Modeling Expertise: Hands-on experience in techniques such as Logistic & Linear Regression, Bayesian Modeling, Classification, Clustering, Neural Networks, Non-parametric Methods, and Multivariate Analysis.
- Tools & Languages: Proficiency in R, S-Plus, SAS, STATA. Exposure to Python and SPSS is a plus.
- Data Handling: Experience with relational databases and intermediate SQL skills. Comfortable working with large datasets using tools like Hadoop, Hive, and MapReduce.
- Analytical Thinking: Ability to derive actionable insights from structured and unstructured data. Strong problem-solving mindset with the ability to align analytics with business objectives.
- Communication: Excellent verbal and written communication skills to articulate findings and influence stakeholders.
- Learning Orientation: Eagerness to learn new techniques and apply creative thinking to solve real-world business problems.

Job Types: Full-time, Permanent
Pay: ₹2,030,998.06 - ₹2,822,692.49 per year
Benefits: Health insurance, Provident Fund
Schedule: Morning shift
Supplemental Pay: Performance bonus, yearly bonus
Application Question(s): How many years of experience in the NBFC/BFSI domain? What's your notice period?
Experience: Data science: 5 years (Required)
Work Location: In person
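For candidates gauging the modeling bar here, below is a minimal sketch of the kind of logistic-regression classifier this posting lists. The credit-style features, synthetic data, and column names are entirely invented for illustration; they are not from the employer.

```python
# Hypothetical sketch: a logistic-regression credit-risk model of the sort the
# posting describes. All features and data are synthetic, for illustration only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 5_000
df = pd.DataFrame({
    "income": rng.lognormal(mean=11, sigma=0.5, size=n),
    "utilization": rng.uniform(0, 1, size=n),
    "delinquencies": rng.poisson(0.3, size=n),
})
# Synthetic default flag loosely driven by the features above
logit = -4 + 3 * df["utilization"] + 0.8 * df["delinquencies"]
df["default"] = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    df[["income", "utilization", "delinquencies"]], df["default"],
    test_size=0.25, random_state=0,
)
# Scale features before fitting, since income dwarfs the other columns
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("Test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```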

Posted 15 hours ago

Apply

5.0 years

4 - 6 Lacs

Bengaluru

On-site

Source: GlassDoor

Help empower our global customers to connect to culture through their passions.

Technology at StockX:
Our Technology Team is on a mission to build the next generation e-commerce platform for the next generation customer. We build world-class, innovative experiences and products that give our users access to the world's most-coveted products and unlock economic opportunity by turning reselling into a business for anyone. Our team uses cutting-edge technologies that handle massive scale globally. We're an internet-native, cloud-native company from day 1 - you won't find legacy technology here. If you're a curious leader who loves solving problems, wearing multiple hats, and learning new things, join us!

About the role:
As a Senior Data Engineer, you will be empowered to leverage data to drive amazing customer experiences and business results. You will own the end-to-end development of data engineering solutions to support the analytical needs of the business. The ideal candidate will be passionate about working with disparate datasets and will love bringing data together to answer business questions at speed. You should have deep expertise in the creation and management of datasets and the proven ability to translate the data into meaningful insights through collaboration with analysts, data scientists, and business stakeholders.

What you'll do:
- Design and build mission-critical data pipelines with a highly scalable distributed architecture, including data ingestion (streaming, events, and batch), data integration, and data curation
- Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders
- Build and support reusable frameworks to ingest, integrate, and provision data
- Automate the end-to-end data pipeline with metadata, data quality checks, and audits
- Build and support a big data platform on the cloud
- Define and implement automation of jobs and testing
- Optimize the data pipeline to support ML workloads and use cases
- Support mission-critical applications and near-real-time data needs from the data platform
- Capture and publish metadata and new data to subscribed users
- Work collaboratively with business analysts, product managers, and data scientists, as well as business partners, and actively participate in design thinking sessions
- Participate in design and code reviews
- Motivate, coach, and serve as a role model and mentor for other development team members who leverage the platform

About you:
- Minimum of 5 years' experience in data warehouse / data lake technical architecture
- 3+ years of experience using programming languages (Python / Scala / Java / C#) to build data pipelines
- Minimum 3 years of Big Data and Big Data tools in one or more of the following: batch processing (e.g., Hadoop distributions, Spark), real-time processing (e.g., Kafka, Flink/Spark Streaming)
- Minimum of 2 years' experience with AWS or engineering in other cloud environments
- Experience with database architecture/schema design
- Strong familiarity with batch processing and workflow tools such as Airflow and NiFi
- Ability to work independently with business partners and management to understand their needs and exceed expectations in delivering tools/solutions
- Strong interpersonal, verbal, and written communication skills and the ability to present complex technical/analytical concepts to an executive audience
- Strong business mindset with customer obsession; ability to collaborate with business partners to identify needs and opportunities for improved data management and delivery
- Experience providing technical leadership and mentoring other engineers on data engineering best practices
- Bachelor's degree in Computer Science or a related technical field
- Nice to have: Master's in Computer Science or a related quantitative field

About StockX:
StockX is proud to be a Detroit-based technology leader focused on the large and growing online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. StockX's powerful platform connects buyers and sellers of high-demand consumer goods from around the world using dynamic pricing mechanics. This approach affords access and market visibility powered by real-time data that empowers buyers and sellers to determine and transact based on market value. The StockX platform features hundreds of brands across verticals including Jordan Brand, adidas, Nike, Supreme, BAPE, Off-White, Louis Vuitton, Gucci; collectibles from brands including LEGO, KAWS, Bearbrick, and Pop Mart; and electronics from industry-leading manufacturers Sony, Microsoft, Meta, and Apple. Launched in 2016, StockX employs 1,000 people across offices and verification centers around the world. Learn more at www.stockx.com.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. This job description is intended to convey information essential to understanding the scope of the job and the general nature and level of work performed by job holders within this job. However, this job description is not intended to be an exhaustive list of qualifications, skills, efforts, duties, responsibilities, or working conditions associated with the position. StockX reserves the right to amend this job description at any time. StockX may utilize AI to rank job applicant submissions against the position requirements to assist in determining candidate alignment.
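For a sense of the streaming-ingestion work described above, here is a minimal PySpark Structured Streaming sketch that reads order events from Kafka and lands them as partitioned Parquet. The broker address, topic name, schema, and storage paths are all hypothetical, and running it requires the spark-sql-kafka connector on the classpath.

```python
# Hypothetical sketch: Kafka -> Spark Structured Streaming -> curated Parquet.
# Topic, schema, and paths are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("order-events-ingest").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("sku", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker address
    .option("subscribe", "order-events")               # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Curate and land as date-partitioned Parquet; the checkpoint directory lets
# Spark recover the stream and avoid duplicate writes after restarts.
query = (
    events.withColumn("dt", F.to_date("event_time"))
    .writeStream.format("parquet")
    .option("path", "s3://example-lake/curated/orders/")        # hypothetical
    .option("checkpointLocation", "s3://example-lake/chk/orders/")
    .partitionBy("dt")
    .start()
)
```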

Posted 15 hours ago

Apply

2.0 - 4.0 years

3 - 10 Lacs

Bengaluru

On-site

Source: GlassDoor

About the role:
Enable data-driven decision making across the Tesco business globally by developing analytics solutions using a combination of math, tech, and business knowledge.

You will be responsible for:
- Understanding business needs and developing an in-depth understanding of Tesco processes
- Building on Tesco processes and knowledge by applying CI tools and techniques
- Completing tasks and transactions within agreed KPIs
- Solving problems by analyzing solution alternatives
- Engaging with business and functional partners to understand business priorities, ask relevant questions, and scope these into an analytical solution document, calling out how the application of data science will improve decision making
- Developing an in-depth understanding of techniques to prepare the analytical data set, leveraging multiple complex data sources
- Building statistical models and ML algorithms with practitioner-level competency
- Writing structured, modularized & codified algorithms using Continuous Improvement principles (development of knowledge assets and reusable modules on GitHub, Wiki, etc.) with expert competency
- Building an easy visualization layer on top of the algorithms to empower end-users to take decisions - this could be on a visualization platform (Tableau / Python) or through a recommendation set delivered through PPTs
- Working with the line manager to ensure application/consumption and proactively identifying opportunities to help the larger Tesco business with areas of improvement
- Keeping up to date with the latest in data science and retail analytics and disseminating the knowledge among colleagues

You will need:
- 2-4 years' experience in data science application in Retail or CPG
- Preferred functional experience: Marketing, Supply Chain, Customer, Merchandising, Operations, Finance, or Digital
- Applied Math: Applied Statistics, Design of Experiments, Regression, Decision Trees, Forecasting, Optimization algorithms, Clustering, NLP
- Tech: SQL, Hadoop, Spark, Python, Tableau, MS Excel, MS PowerPoint, GitHub
- Business: Basic understanding of the Retail domain
- Soft Skills: Analytical thinking & problem solving, storyboarding, articulate communication

What's in it for you?
At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities, and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco are determined by four principles: simple, fair, competitive, and sustainable.
- Salary: Your fixed pay is the guaranteed pay as per your contract of employment.
- Performance Bonus: Opportunity to earn additional compensation based on performance, paid annually.
- Leave & Time-off: Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free: In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth: Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their families. Our medical insurance provides coverage for dependents, including parents or in-laws.
- Mental Wellbeing: We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing: Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE): Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing: Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.

About Us:
Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues.

Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single-entity traditional shared services organisation in Bengaluru, India (from 2004) to a global, purpose-driven, solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.
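As a concrete taste of the applied-math toolkit this posting lists (clustering in particular), here is a minimal k-means customer-segmentation sketch. The features, data, and segment count are synthetic and purely illustrative.

```python
# Hypothetical sketch: k-means customer segmentation of the sort the posting
# mentions. Features and data are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Columns: weekly spend, visit frequency, share of promo purchases
X = np.column_stack([
    rng.gamma(2.0, 30.0, size=2_000),
    rng.poisson(3, size=2_000),
    rng.beta(2, 5, size=2_000),
])
# Standardize first: k-means is distance-based, so raw scales would dominate
X_scaled = StandardScaler().fit_transform(X)
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)
print(np.bincount(segments))  # customers per segment
```

In practice the segment count would be chosen with a diagnostic such as the elbow method or silhouette scores rather than fixed at four.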

Posted 15 hours ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

Your work days are brighter here.

At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture - a culture driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy, employee-centric, collaborative culture is the essential mix of ingredients for success in business. That's why we look after our people, communities, and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don't need to hide who you are. You can feel the energy and the passion; it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here.

At Workday, we value our candidates' privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday.

About the Team:
It's fun to work in a company where people truly believe in what they're doing. At Workday, we're committed to bringing passion and customer focus to the business of enterprise applications. We work hard, and we're serious about what we do. But we like to have a good time, too. In fact, we run our company with that principle in mind every day: one of our core values is fun.

About the Role:
Workday is looking for a Support Engineer specializing in Analytics with expertise in troubleshooting, performance optimization, and data analysis across Workday's analytics services, including Prism Analytics, People Analytics, Discovery Boards, and Accounting Center. The ideal candidate has a solid foundation in big-data processing, data transformation, and reporting frameworks, with the ability to diagnose and resolve complex issues by analyzing logs, performance metrics, and system integrations. This role requires hands-on experience with query performance tuning, data pipeline debugging, and structured troubleshooting methodologies to support Workday's analytics solutions. Strong data modeling, log analysis, and problem-solving skills, combined with clear, effective communication, are essential for success in this role.

Key Areas of Responsibility:
- Provide sophisticated technical support for Workday's reporting and analytics tools, including Prism Analytics, People Analytics, Discovery Boards, and Accounting Center, focusing on performance optimization, index debugging, memory management, and system health debugging.
- Develop expertise in Workday analytics services to drive high-performance reporting and data analytics solutions, using Prism Analytics, People Analytics, and SQL best practices.
- Collaborate with clients to define business requirements and translate them into optimized reports and configurations, improving query performance, data accuracy, and system health using Prism Analytics and Discovery Boards.
- Troubleshoot and resolve issues related to report configurations, system performance, integrations, and memory management, including detailed analysis of logs, query performance, and data pipelines.
- Guide customers in building, modifying, and optimizing reports, ensuring scalability, data integrity, and alignment with business needs, especially in Prism Analytics and Accounting Center.
- Educate users on standard methodologies for Workday reporting, security, and data governance, emphasizing People Analytics and Discovery Boards.
- Collaborate cross-functionally with engineering teams to address data quality issues, security concerns, and performance optimizations across Prism Analytics and Accounting Center, with a focus on memory management and system health.
- Contribute to documentation, QA efforts, and the optimization of analytics tools, with a focus on SQL querying, indexing, and debugging system health issues.
- Participate in 24x7 global support coverage, providing timely and efficient support across time zones.

Key Technical Skills & Knowledge:
- Bachelor's degree in Computer Science, Information Management, Statistics, Data Science, or a related field.
- 3+ years of experience in customer support, system performance optimization, data analysis, or similar roles, with a solid background in big data technologies and AI-driven analytics.
- Demonstrable experience with data platforms (e.g., Spark, Hadoop) and working with large-scale datasets, including data pipeline design and distributed processing.
- Hands-on experience with advanced reporting tools and analytics solutions, including AI-powered reporting platforms and big data tools like Spark for data transformation and analysis.
- Strong proficiency in SQL and data querying, with the ability to analyze complex data sets, optimize queries, and derive data-driven insights to enhance system performance and business processes.
- Demonstrated ability to gather and map business requirements to advanced analytics and application capabilities, ensuring alignment with AI-driven insights and reporting solutions.
- Solid understanding of data architecture, including data lakes, ETL processes, and real-time data streaming.
- Strong analytical skills to collect, organize, and interpret complex datasets, using AI and big data tools to drive product improvements and optimize reporting performance.
- Ability to deliver data-driven insights to technical and non-technical partners, presenting complex findings to end-users and executive teams in an actionable manner.
- Proven collaboration skills, working across teams to drive issue resolution and using AI or machine learning models to enhance system functionality and customer experience.
- Strong written and verbal communication skills, with experience in technical consulting, customer support, or AI/ML-driven technical roles.
- Self-motivated, with the ability to work independently in a fast-paced environment while using AI and big data technologies to identify and resolve issues.

Our Approach to Flexible Work:
With Flex Work, we're combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter.

Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
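To illustrate the log-analysis side of the troubleshooting this role describes, here is a minimal Python sketch that scans an application log for slow report executions. The log format, file name, and latency threshold are all invented for illustration and are not Workday's actual formats.

```python
# Hypothetical sketch: find the reports with the most slow executions in a
# log. The line format and threshold are invented for illustration.
import re
from collections import Counter

LINE = re.compile(r"report=(?P<report>\S+)\s+duration_ms=(?P<ms>\d+)")
SLOW_MS = 5_000

slow = Counter()
with open("analytics.log") as fh:  # hypothetical log file
    for line in fh:
        m = LINE.search(line)
        if m and int(m.group("ms")) > SLOW_MS:
            slow[m.group("report")] += 1

# Reports with the most slow runs are the first tuning candidates
for report, hits in slow.most_common(10):
    print(f"{report}: {hits} slow executions")
```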

Posted 15 hours ago

Apply

0 years

5 - 10 Lacs

Bengaluru

On-site

Source: GlassDoor

Join us as a Software Engineer
- This is an opportunity for a driven Software Engineer to take on an exciting new career challenge
- Day-to-day, you'll build a wide network of stakeholders of varying levels of seniority
- It's a chance to hone your existing technical skills and advance your career
- We're offering this role at associate vice president level

What you'll do:
In your new role, you'll engineer and maintain innovative, customer-centric, high-performance, secure, and robust solutions. You'll be working within a feature team and using your extensive experience to engineer software, scripts, and tools that are often complex, as well as liaising with other engineers, architects, and business analysts across the platform.

You'll also be:
- Producing complex and critical software rapidly and to a high standard that adds value to the business
- Working in permanent teams who are responsible for the full life cycle, from initial development, through enhancement and maintenance, to replacement or decommissioning
- Collaborating to optimise our software engineering capability
- Designing, producing, testing, and implementing our working code
- Working across the life cycle, from requirements analysis and design, through coding, to testing, deployment, and operations

The skills you'll need:
You'll need a background in hands-on technical development, with at least eight years of industry experience in data engineering and at least two years of experience with the Quantexa platform. You'll need to configure and deploy Quantexa software using tools such as Spark, Hadoop, Scala, and Elasticsearch, with our platform hosted on public clouds such as GCP and AWS.

You'll also need:
- Experience of Scala, Java, Python, or another programming language associated with data engineering
- Experience of modern development tooling (e.g., Git, Gradle, Nexus) and technologies supporting automation and DevOps (e.g., Jenkins, Docker), plus a little good old Bash scripting
- Experience of testing libraries for common programming languages, such as ScalaTest or equivalent
- A background in solving highly complex, analytical, and numerical problems
- Experience of implementing programming best practice, especially around scalability, automation, virtualisation, optimisation, availability, and performance
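On the testing-library requirement, here is a minimal pytest sketch, offered as a Python stand-in for the ScalaTest-style tests the posting mentions. The function under test and its fields are invented for illustration.

```python
# Hypothetical sketch: a unit test for a small data-transformation function,
# analogous to ScalaTest-style testing but written with pytest.
def normalise_record(rec: dict) -> dict:
    """Trim whitespace and canonicalise the identifier fields."""
    return {
        "customer_id": rec["customer_id"].strip().lower(),
        "country": rec["country"].strip().upper(),
    }

def test_normalise_record():
    raw = {"customer_id": "  ABC123 ", "country": " gb"}
    assert normalise_record(raw) == {"customer_id": "abc123", "country": "GB"}
```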

Posted 15 hours ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Source: LinkedIn

Position Summary... What you'll do...

About Team:
This is the team which builds reusable technologies that aid in acquiring customers, and in onboarding and empowering merchants, besides ensuring a seamless experience for both these stakeholders. We also optimize tariffs and assortment, adhering to the Walmart philosophy - Everyday Low Cost. In addition to ushering in affordability, we also create personalized experiences for customers the omnichannel way, across all channels: in-store, on the mobile app, and on websites. Marketplace is the gateway to domestic and international third-party sellers; we enable them to manage their end-to-end onboarding, catalog management, order fulfilment, and return & refund management. Our team is responsible for the design, development, and operations of large-scale distributed systems, leveraging cutting-edge technologies in web/mobile, cloud, big data & AI/ML. We interact with multiple teams across the company to provide scalable, robust technical solutions.

What you'll do:
As a Staff Data Scientist, you will be responsible for building scalable end-to-end data science solutions for our data products.
- Work closely with data engineers and data analysts to help build ML- and statistics-driven data quality and continuous data monitoring workflows
- Solve business problems by scaling advanced machine learning algorithms and complex statistical models on large volumes of data
- Own the MLOps lifecycle, from data monitoring to refactoring data science code to building robust model monitoring workflows for model lifecycle management
- Demonstrate strong thought leadership and consult with product and business stakeholders to build, scale, and deploy holistic machine learning solutions after successful prototyping
- Follow industry best practices, stay up to date with and extend the state of the art in machine learning research and practice, and drive innovation
- Promote and support company policies, procedures, mission, values, and standards of ethics and integrity

What you'll bring:
Preferred qualifications:
- Knowledge of the foundations of machine learning and statistics
- Experience with web service standards and related patterns (REST, gRPC)
- Experience architecting solutions with Continuous Integration and Continuous Delivery in mind
- Familiarity with distributed in-memory computing technologies
- Solid experience working with state-of-the-art supervised and unsupervised machine learning algorithms on real-world problems
- Strong Python coding and package development skills
- Experience with Big Data and analytics in general, leveraging technologies like Hadoop, Spark, and MapReduce; ability to work in a big data ecosystem, with expertise in SQL/Hive and the ability to work with Spark
- Able to refactor data science code, having collaborated with data scientists in developing ML solutions
- Experience playing the role of a full-stack data scientist and taking solutions to production
- Experience developing proper metrics instrumentation in software components to help facilitate real-time and remote troubleshooting/performance monitoring
- Educational qualifications preferably in Computer Science, Statistics, Engineering, or a related area
- Good, effective communication skills (both written and verbal) and the ability to present complex ideas in a clear & concise way to different audiences
- A team player with a good work ethic
- Prior experience in the Delivery Promise Optimization domain, or in the Supply Chain domain with hands-on experience solving large-scale optimization problems (both linear and non-linear optimization), is required
- Mandatory hands-on experience working in Spark or other comparable distributed computing frameworks

About Walmart Global Tech:
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions, and reimagine the future of retail.

Flexible, hybrid work:
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits:
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging:
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer:
Walmart, Inc., is an Equal Opportunities Employer - By Choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing unique styles, experiences, identities, ideas, and opinions while being welcoming of all people.

Minimum Qualifications:
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
- Option 1: Bachelor's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or a related field and 4 years' experience in an analytics-related field.
- Option 2: Master's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or a related field and 2 years' experience in an analytics-related field.
- Option 3: 6 years' experience in an analytics or related field.

Preferred Qualifications:
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Primary Location: 4, 5, 6, 7 Floor, Building 10, SEZ, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-2127472
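For a flavor of the ML- and statistics-driven data-quality monitoring this role describes, here is a minimal pandas sketch. The thresholds, column names, and sample batch are invented for illustration; a production version would run inside a scheduled pipeline.

```python
# Hypothetical sketch: simple statistics-driven data-quality checks of the
# kind used for continuous data monitoring. Thresholds are illustrative.
import pandas as pd

def null_rate_alerts(df: pd.DataFrame, max_null_rate: float = 0.05) -> dict:
    """Return columns whose null rate exceeds the allowed threshold."""
    rates = df.isna().mean()
    return rates[rates > max_null_rate].to_dict()

def drift_alert(current: pd.Series, baseline_mean: float,
                baseline_std: float, z: float = 3.0) -> bool:
    """Flag a batch whose mean drifts more than z standard errors from baseline."""
    se = baseline_std / max(len(current), 1) ** 0.5
    return abs(current.mean() - baseline_mean) > z * se

batch = pd.DataFrame({"price": [10.0, 12.5, None, 11.0], "qty": [1, 2, 3, 4]})
print(null_rate_alerts(batch))   # {'price': 0.25} exceeds the 5% threshold
print(drift_alert(batch["qty"], baseline_mean=2.0, baseline_std=1.0))  # False
```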

Posted 15 hours ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Position Summary... What you'll do...

About The Team:
Ever wondered what a convergence of online and offline advertising systems would look like? Ever wondered how we can bridge the gap between sponsored search, display, and video ad formats? Ever thought about how we can write our own ad servers which serve billions of requests in near real time? Our Advertising Technology team is building an end-to-end advertising platform that is key to Walmart's overall growth strategy. We use cutting-edge machine learning, data mining, and optimization algorithms to ingest, model, and analyze Walmart's proprietary online and in-store data, encompassing 95% of American households. Importantly, we build smart data systems that deliver relevant retail ads and experiences that connect our customers with the brands and products they love.

What You'll Do:
- Design and Develop Data Pipelines: Design, develop, and maintain data pipelines to extract, transform, and load data from various sources into data warehouses, data lakes, or other data repositories.
- Data Architecture: Collaborate with stakeholders to design and implement data architectures that meet business requirements, ensuring scalability, reliability, and performance.
- Data Quality and Governance: Develop and implement data quality and governance policies to ensure data accuracy, completeness, and integrity.
- Data Integration: Integrate data from various sources, such as databases, files, and APIs, using ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes.
- Data Warehousing: Design and develop data warehouses, data marts, or other data repositories to store and manage large datasets.
- Big Data Processing: Process large datasets using big data technologies such as Hadoop, Spark, or NoSQL databases.
- Data Visualization: Develop data visualizations and reports to help stakeholders understand complex data insights.
- Data Security: Ensure data security and compliance with regulatory requirements such as GDPR, HIPAA, or PCI-DSS.
- Collaboration: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand business requirements and develop data solutions.
- Monitoring and Maintenance: Monitor and maintain data pipelines, data warehouses, and data repositories to ensure they are running smoothly and efficiently.

What You'll Bring:
- An engineering degree (B.E./B.Tech/M.Tech/MS) in any stream; Computer Science preferred.
- Minimum 5+ years of object-oriented programming experience in Python or Scala.
- 5-9 years of experience with data processing frameworks, data storage systems, and data visualization tools.
- Data processing frameworks: Apache Spark, Apache Hadoop, or AWS Glue.
- Data storage systems: relational databases (e.g., MySQL), NoSQL databases (e.g., MongoDB), or cloud-based storage (e.g., AWS S3).
- Data processing tools: Apache Beam, Apache Flink, or AWS Lambda.
- Data visualization tools: Tableau, Power BI, or D3.js.
- Large-scale distributed systems experience, including scalability and fault tolerance.
- Exposure to cloud infrastructure such as OpenStack, Azure, GCP, or AWS.
- A continuous drive to explore, improve, enhance, automate, and optimize systems and tools.
- Strong computer science fundamentals in data structures and algorithms.
- Exposure to information retrieval, statistics, and machine learning.
- Excellent oral and written communication skills.

About Walmart Global Tech:
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions, and reimagine the future of retail.

Flexible, hybrid work:
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits:
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging:
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Minimum Qualifications:
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
- Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area and 3 years' experience in software engineering or a related area at a technology, retail, or data-driven company.
- Option 2: 5 years' experience in software engineering or a related area at a technology, retail, or data-driven company.

Preferred Qualifications:
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
- Certification in Security+, GISF, CISSP, CCSP, or GSEC; Master's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area and 1 year's experience leading information security or cybersecurity projects; Information Technology - CISCO Certification.

Primary Location: G, 1, 3, 4, 5 Floor, Building 11, SEZ, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-2176084
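To ground the extract-transform-load duties this posting leads with, here is a minimal PySpark batch ETL sketch. The source and target paths, field names, and aggregation are all hypothetical.

```python
# Hypothetical sketch: a daily batch ETL job - extract raw impression events,
# transform into per-campaign aggregates, load to a curated zone.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ads-daily-etl").getOrCreate()

# Extract: raw impression events from the data lake (hypothetical path)
raw = spark.read.json("s3://example-lake/raw/impressions/dt=2024-01-01/")

# Transform: deduplicate and aggregate impressions per campaign
daily = (
    raw.dropDuplicates(["impression_id"])
       .groupBy("campaign_id")
       .agg(F.count("*").alias("impressions"),
            F.countDistinct("user_id").alias("reach"))
)

# Load: write to the curated zone for downstream marts and reporting
daily.write.mode("overwrite").parquet(
    "s3://example-lake/curated/campaign_daily/dt=2024-01-01/"
)
```

In a production setting the date would come from an orchestrator such as Airflow rather than being hard-coded.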

Posted 15 hours ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Requirements

Description and Requirements

Summary:
A Big Data (Hadoop) Administrator responsible for supporting the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, automation, and scripting (e.g., Ansible, Azure DevOps, Shell, Python) to streamline operations and improve efficiency is highly valued.

Job Responsibilities:
- Assist in the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux.
- Perform routine monitoring, troubleshooting, and issue resolution to ensure the stability and performance of Hadoop clusters.
- Develop and maintain scripts (e.g., Python, Bash, Ansible) to automate operational tasks and improve system efficiency.
- Collaborate with cross-functional teams, including application development, infrastructure, and operations, to support business requirements and implement new features.
- Implement and follow best practices for cluster security, including user access management and integration with tools like Apache Ranger and Kerberos.
- Support backup, recovery, and disaster recovery processes to ensure data availability and business continuity.
- Conduct performance tuning and optimization of Hadoop clusters to enhance system efficiency and reduce latency.
- Analyze logs and use tools like Splunk to debug and resolve production issues.
- Document operational processes, maintenance procedures, and troubleshooting steps to ensure knowledge sharing and consistency.
- Stay updated on emerging technologies and contribute to the adoption of new tools and practices to improve cluster management.

Education:
Bachelor's degree in Computer Science, Information Systems, or another related field, with 7+ years of IT and infrastructure engineering work experience.

Experience: 7+ years total IT experience, with 4+ years of relevant experience in Big Data databases.
- Big Data Platform Management: Knowledge in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, and Apache Spark, as well as JanusGraph and IBM Big SQL.
- Automation and Scripting: Expertise in automation tools and scripting languages such as Ansible, Python, and Bash to streamline operational tasks and improve efficiency.
- DevOps Practices: Proficiency in DevOps tools and methodologies, including CI/CD pipelines, version control systems (e.g., Git), and infrastructure-as-code practices.
- Monitoring and Troubleshooting: Experience with monitoring and observability tools such as Splunk, Elastic Stack, or Prometheus to identify and resolve system issues.
- Linux Administration: Solid knowledge of Linux operating systems, including system administration, troubleshooting, and performance tuning.
- Backup and Recovery: Familiarity with implementing and managing backup and recovery processes to ensure data availability and business continuity.
- Security and Access Management: Understanding of security best practices, including user access management and integration with tools like Kerberos.
- Agile Methodologies: Knowledge of Agile practices and frameworks, such as SAFe, with experience working in Agile environments.
- ITSM Tools: Familiarity with ITSM processes and tools like ServiceNow for incident and change management.

Other Critical Requirements:
- Excellent analytical and problem-solving skills.
- Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability.
- Excellent written and oral communication skills, including the ability to clearly communicate and articulate technical and functional issues, with conclusions and recommendations, to stakeholders.
- Prior experience in handling stateside and offshore stakeholders.
- Experience in creating and delivering business presentations.
- Demonstrated ability to work independently and in a team environment.
- Demonstrated willingness to learn and adopt new technologies and tools to improve operational efficiency.

About MetLife:
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
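In the spirit of the automation and monitoring duties listed above, here is a minimal Python sketch that wraps the standard `hdfs dfsadmin -report` command and warns when cluster usage is high. The 80% alert threshold is an invented example; a real deployment would likely feed this into Prometheus or Splunk rather than print it.

```python
# Hypothetical sketch: parse `hdfs dfsadmin -report` output and warn on high
# DFS usage. Threshold and alerting are invented for illustration.
import re
import subprocess

report = subprocess.run(
    ["hdfs", "dfsadmin", "-report"],
    capture_output=True, text=True, check=True,
).stdout

m = re.search(r"DFS Used%:\s*([\d.]+)%", report)
if m and float(m.group(1)) > 80.0:
    print(f"WARNING: HDFS usage at {m.group(1)}% - "
          "consider rebalancing or adding capacity")
else:
    print("HDFS capacity within limits")
```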

Posted 15 hours ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Summary Description

Summary of This Role:
Works throughout the software development life cycle and performs in a utility capacity to create, design, code, debug, maintain, test, implement, and validate applications with a broad understanding of a variety of languages and architectures. Analyzes existing applications or formulates logic for new applications, procedures, flowcharting, coding, and debugging programs. Maintains and utilizes application and programming documents in the development of code. Recommends changes in development, maintenance, and system standards. Creates appropriate deliverables and develops application implementation plans throughout the life cycle in a flexible development environment.

What Part Will You Play?
- Develops basic to moderately complex code using front- and/or back-end programming languages within multiple platforms as needed, in collaboration with business and technology teams, for internal and external client software solutions.
- Designs, creates, and delivers routine to moderately complex program specifications for code development and support on multiple projects/issues, with a wide understanding of the application/database to better align interactions and technologies.
- Analyzes, modifies, and develops moderately complex code/unit testing in order to develop concise application documentation.
- Performs testing and validation requirements for moderately complex code changes.
- Performs corrective measures for moderately complex code deficiencies and escalates alternative proposals.
- Participates in client-facing meetings, joint-venture discussions, and vendor partnership teams to determine solution approaches.
- Provides support to leadership for the design, development, and enforcement of business/infrastructure application standards, including associated controls, procedures, and monitoring, to ensure compliance and accuracy of data.
- Applies a full understanding of procedures, methodology, and application standards, including Payment Card Industry (PCI) security compliance.
- Conducts and provides basic billable hours and resource estimates on initiatives, projects, and issues.
- Assists with on-the-job training and provides guidance to other software engineers.

What Are We Looking For in This Role?

Minimum Qualifications:
- BS in Computer Science, Information Technology, Business/Management Information Systems, or a related field
- Typically a minimum of 4 years of professional experience in coding, designing, developing, and analyzing data
- Typically has advanced knowledge and use of one or more front-/back-end languages/technologies and a moderate understanding of the corresponding other, including but not limited to: two or more modern programming languages used in the enterprise; experience working with various APIs and external services; experience with both relational and NoSQL databases

Preferred Qualifications:
- BS in Computer Science, Information Technology, Business/Management Information Systems, or a related field
- 6+ years of professional experience in coding, designing, developing, and analyzing data, and experience with IBM Rational Tools

What Are Our Desired Skills and Capabilities?
- Skills/Knowledge: A seasoned, experienced professional with a full understanding of their area of specialization; resolves a wide range of issues in creative ways. This job is the fully qualified, career-oriented, journey-level position.
- Job Complexity: Works on problems of diverse scope where analysis of data requires evaluation of identifiable factors. Demonstrates good judgment in selecting methods and techniques for obtaining solutions. Networks with senior internal and external personnel in their own area of expertise.
- Supervision: Normally receives little instruction on day-to-day work and general instructions on new assignments.
- Operating Systems: Linux distributions, including one or more of the following: Ubuntu, CentOS/RHEL, Amazon Linux; Microsoft Windows; z/OS; Tandem/HP NonStop
- Database: Design, and familiarity with DDL and DML, for one or more of the following databases: Oracle, MySQL, MS SQL Server, IMS, DB2, Hadoop
- Back-end technologies: Java, Python, .NET, Ruby, Mainframe COBOL, Mainframe Assembler
- Front-end technologies: HTML, JavaScript, jQuery, CICS
- Web frameworks: Node.js, React.js, Angular, Redux
- Development tools: Eclipse, Visual Studio, Webpack, Babel, Gulp
- Mobile development: iOS, Android
- Machine learning: Python, R, MATLAB, TensorFlow, DMTK

Posted 16 hours ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

Ready to be pushed beyond what you think you're capable of?

At Coinbase, our mission is to increase economic freedom in the world. It's a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform - and with it, the future global financial system.

To achieve our mission, we're seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high-caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company's hardest problems. Our work culture is intense and isn't for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there's no better place to be.

While many roles at Coinbase are remote-first, we are not remote-only. In-person participation is required throughout the year. Team and company-wide offsites are held multiple times annually to foster collaboration, connection, and alignment. Attendance is expected and fully supported.

The mission of the Platform Product Group engineers is to build a trusted, scalable, and compliant platform to operate with speed, efficiency, and quality. Our teams build and maintain the platforms critical to the existence of Coinbase. There are many teams that make up this group, including Product Foundations (i.e., Identity, Payment, Risk, Proofing & Regulatory, Finhub), Machine Learning, Customer Experience, and Infrastructure.

As a machine learning engineer, you will play a pivotal role in constructing essential infrastructure for the open financial system. This involves harnessing diverse and extensive data sources, including the blockchain, to grant millions of individuals access to cryptocurrency while simultaneously identifying and thwarting malicious entities. Your impact extends beyond safeguarding Coinbase, as you'll have the opportunity to employ machine learning to enhance the overall user experience. This includes imbuing intelligence into recommendations, risk assessment, chatbots, and various other aspects, making our product not only secure but also exceptionally user-friendly.

What you'll be doing (i.e., job duties):
- Investigate and harness cutting-edge machine learning methodologies, including deep learning, large language models (LLMs), and graph neural networks, to address diverse challenges throughout the company. These challenges encompass areas such as fraud detection, feed ranking, recommendation systems, targeting, chatbots, and blockchain mining.
- Develop and deploy robust, low-maintenance applied machine learning solutions in a production environment.
- Create onboarding codelabs, tools, and infrastructure to democratize access to machine learning resources across Coinbase, fostering a culture of widespread ML utilization.

What we look for in you (i.e., job requirements):
- 2+ years of industry experience as a machine learning and software engineer
- Experience building backend systems at scale with a focus on data processing/machine learning/analytics
- Experience with at least one class of ML model: LLMs, GNNs, deep learning, logistic regression, gradient-boosted trees, etc.
- Working knowledge of one or more of the following: data mining, information retrieval, advanced statistics, natural language processing, computer vision
- Exhibits our core cultural values: add positive energy, communicate clearly, be curious, and be a builder

Nice to haves:
- BS, MS, or PhD degree in Computer Science, Machine Learning, Data Mining, Statistics, or a related technical field
- Knowledge of Apache Airflow, Spark, Flink, Kafka/Kinesis, Snowflake, Hadoop, Hive
- Experience with Python
- Experience with model interpretability and responsible AI
- Experience with data analysis and visualization

Job ID: GPML04IN

Please be advised that each candidate may submit a maximum of four applications within any 30-day period. We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying.

Commitment to Equal Opportunity:
Coinbase is committed to diversity in its workforce and is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation, or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state, and local law. For US applicants, you may view the Know Your Rights notice here. Additionally, Coinbase participates in the E-Verify program in certain locations, as required by law. Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations[at]coinbase.com to let us know the nature of your request and your contact information. For quick access to screen-reading technology compatible with this site, click here to download a free compatible screen reader (a free step-by-step tutorial can be found here).

Global Data Privacy Notice for Job Candidates and Applicants:
Depending on your location, the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available here. By submitting your application, you are agreeing to our use and processing of your data as required. For US applicants only, by submitting your application you are agreeing to arbitration of disputes as outlined here.
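To make the fraud-detection and gradient-boosted-trees requirements concrete, here is a minimal scikit-learn sketch of a fraud classifier. The features, synthetic data, and label-generating logic are entirely invented for illustration and bear no relation to Coinbase's systems.

```python
# Hypothetical sketch: a gradient-boosted fraud classifier of the kind the
# posting lists. Features and data are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10_000
X = np.column_stack([
    rng.exponential(100.0, size=n),   # transaction amount
    rng.integers(0, 24, size=n),      # hour of day
    rng.poisson(2, size=n),           # transactions in the past hour
])
# Synthetic label: large, bursty, late-night transactions skew fraudulent
logit = -5 + 0.004 * X[:, 0] + 0.3 * X[:, 2] + 0.5 * (X[:, 1] < 5)
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
# PR-AUC suits rare-positive problems like fraud better than accuracy
print("PR-AUC:", average_precision_score(y_te, clf.predict_proba(X_te)[:, 1]))
```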

Posted 16 hours ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Start.io is a mobile marketing and audience platform. Start.io empowers the mobile app ecosystem and simplifies mobile marketing, audience building, and mobile monetization. Start.io's direct integration with over 500,000 monthly active mobile apps provides access to unprecedented levels of global first-party data, which can be leveraged to understand and predict behaviors, identify new opportunities, and fuel growth.

If you are a data enthusiast and want to participate in real-time data streams of billions of events from billions of users, your place is with us. Our data team is expanding, and we are actively seeking a passionate Data Analyst with expertise in numbers and SQL.

In this role, you will have the exciting opportunity to:
- Work with large and complex data sets to solve a wide array of challenging problems using different analytical and statistical approaches.
- Apply technical expertise with quantitative analysis, experimentation, data mining, and the presentation of data to develop business strategies for our products, which serve billions of people.
- Identify and measure the success of product efforts through goal setting, forecasting, and monitoring of key product metrics to understand trends.
- Define, understand, and test opportunities and levers to improve the product, and drive roadmaps through your insights and recommendations.
- Partner with Business, Product, Engineering, and cross-functional teams to inform, influence, support, and execute product strategy and investment decisions.
- Share your thoughts and insights within our production environment, which fosters creativity and open communication.
- Thrive in a fast-paced company.

What you MUST bring:
- Bachelor's degree in engineering, scientific fields, or equivalent (e.g., industrial engineering, information systems, computer science, statistics)
- 5 years of practical experience with SQL & Python
- Fluent English and excellent communication skills (written and verbal)
- Proficiency in MS Excel

What will be an advantage?
- Experience working with large data sets and distributed computing tools (e.g., Vertica/Redshift, Hadoop, Spark) - a big advantage.
- Knowledge of or experience with statistical modeling, prediction, and ML algorithms - a big advantage.
- Experience with BI tools: Tableau, MicroStrategy, Looker, etc.
- Ability to drive and manage end-to-end processes and collaborate with multiple interfaces.
- Experience in the Adtech/media ecosystem.

Other requirements:
- High-level analytical and problem-solving skills.
- Demonstrated strategic and creative business thinking.
- Proactive, independent, and self-motivated.
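For a taste of the product-metric monitoring this analyst role describes, here is a minimal pandas sketch computing a daily-active-users trend from raw events. The events table is tiny and synthetic, purely for illustration.

```python
# Hypothetical sketch: daily active users (DAU) and day-over-day growth from
# an events table. Data is synthetic, for illustration only.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 2, 1],
    "ts": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-01 10:00", "2024-01-02 09:30",
        "2024-01-02 11:00", "2024-01-03 08:00", "2024-01-03 12:00",
    ]),
})
dau = (events.assign(day=events["ts"].dt.date)
             .groupby("day")["user_id"].nunique())
print(dau)                          # unique users per day: 1, 2, 2
print(dau.pct_change().round(2))    # day-over-day growth: NaN, 1.0, 0.0
```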

Posted 16 hours ago

Apply

8.0 - 10.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

Databricks Architect. Should have a minimum of 10+ years of experience.

Must-have skills: Databricks, Delta Lake, PySpark or Scala Spark, Unity Catalog. Good-to-have skills: Azure and/or AWS Cloud.

Hands-on exposure in:
• Strong experience with Databricks as a lakehouse solution
• Establishing the Databricks Lakehouse architecture
• Ingesting and transforming batch and streaming data on the Databricks Lakehouse Platform
• Orchestrating diverse workloads for the full lifecycle, including Delta Live Tables, PySpark, etc.

Mandatory Skills: Databricks - Data Engineering. Experience: 8-10 Years.
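
As a minimal sketch of the batch-ingestion work this role describes, the snippet below lands raw files into a Delta table; the paths are hypothetical, and it assumes a Databricks runtime or a Spark session configured with the delta-spark package.

```python
from pyspark.sql import SparkSession

# Assumes Delta Lake support is available (Databricks runtime or the
# delta-spark package). Paths are hypothetical placeholders.
spark = SparkSession.builder.appName("bronze-ingest").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("/mnt/landing/orders/2024-06-01/"))   # hypothetical landing zone

(raw.write
    .format("delta")
    .mode("append")
    .save("/mnt/lakehouse/bronze/orders"))        # hypothetical bronze layer
```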

Posted 16 hours ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends and prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous, and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes, and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver (Performance Parameter: Measure):
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability Development: Triages completed, Technical Test performance

Mandatory Skills: Hadoop. Experience: 5-8 Years.

Posted 16 hours ago

Apply

5.0 - 10.0 years

18 - 24 Lacs

Bengaluru

Work from Office

Naukri logo

Hiring Senior Data Engineer (5+ yrs) with expertise in Azure Data Factory, Databricks, PySpark, and AWS to build scalable ETL pipelines. Location: Bangalore/Hyderabad/Chennai. Immediate to 30-day joiners. Share your resume at vadiraj@vtrickstech.com. Benefits: health insurance, provident fund.

Posted 16 hours ago

Apply

3.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Location: Bangalore. Experience: 3-7 Yrs. Notice Period: Immediate to 15 Days.

Job Description: We are looking for energetic, high-performing, and highly skilled Quality Assurance Engineers to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio focused on delivering the next generation of global marketing capabilities. This team is responsible for global campaign tracking of new account acquisition and bounty payments, and leverages transformational technologies such as SQL, Hadoop, Spark, PySpark, HDFS, MapReduce, Hive, HBase, Kafka, and Java.

Focus: Provides domain expertise to engineers on Automation, Testing, and Quality Assurance (QA) methodologies and processes; crafts and executes test scripts; assists in preparation of test strategies; sets up and maintains test data and environments; and logs results.

3-6 years of hands-on software testing experience in developing test cases and test plans, with extensive knowledge of automated testing and architecture. Expert knowledge of testing frameworks and test automation design patterns such as TDD, BDD, etc. Expertise in developing software test cases for Hive, Spark, and SQL written in PySpark SQL and Scala. Hands-on experience with performance and load testing tools such as JMeter, pytest, or similar. Experience with industry-standard tools for defect tracking, source code management, test case management, test automation, and other management and monitoring tools. Experience working with Agile methodology. Experience with a cloud platform (GCP). Experience in designing, developing, testing, debugging, and operating resilient distributed systems using Big Data clusters. Good sense for software quality, clean code principles, test-driven development, and an agile mindset. High engagement, self-organization, strong communication skills, and team spirit. Experience with building and adopting new test frameworks. Bonus skills: testing machine learning/data mining.

Roles & Responsibilities: Responsible for testing and quality assurance of a large data processing pipeline using PySpark and SQL (see the hedged test sketch below). Develops and tests software, including ongoing refactoring of code, and drives continuous improvement in code structure and quality. Functions as a platform SME who drives quality and automation strategy at the application level, identifies new opportunities, and guides Software Engineers to deliver the highest quality code. Delivers on capabilities for the portfolio automation strategy and executes against the test and automation strategy defined at the portfolio level. Works with engineers to drive improvements in code quality via manual and automated testing. Involved in the review of the user story backlog and requirements specifications for completeness and weaknesses in function, performance, reliability, scalability, testability, usability, and security and compliance testing, and provides recommendations. Plans and defines the testing approach, providing advice on prioritization of testing activity in support of identified risks in project schedules or test scenarios.
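
A hedged sketch of the kind of automated test this role describes, using pytest against a local PySpark session; the transformation under test and all names are hypothetical.

```python
import pytest
from pyspark.sql import SparkSession

def dedupe_accounts(df):
    # Hypothetical transformation under test: drop duplicate account rows.
    return df.dropDuplicates(["account_id"])

@pytest.fixture(scope="session")
def spark():
    # Local single-threaded session keeps the test self-contained.
    return (SparkSession.builder
            .master("local[1]")
            .appName("qa-tests")
            .getOrCreate())

def test_dedupe_accounts_removes_duplicates(spark):
    df = spark.createDataFrame(
        [(1, "open"), (1, "open"), (2, "closed")],
        ["account_id", "status"],
    )
    assert dedupe_accounts(df).count() == 2
```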

Posted 17 hours ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Country India Working Schedule Full-Time Work Arrangement Hybrid Relocation Assistance Available No Posted Date 23-Jun-2025 Job ID 10076 Summary Description and Requirements A Big Data (Hadoop) Administrator responsible for supporting the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, automation, and scripting (e.g., Ansible, Azure DevOps, Shell, Python) to streamline operations and improve efficiency is highly valued. Job Responsibilities Assist in the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Perform routine monitoring, troubleshooting, and issue resolution to ensure the stability and performance of Hadoop clusters. Develop and maintain scripts (e.g., Python, Bash, Ansible) to automate operational tasks and improve system efficiency. Collaborate with cross-functional teams, including application development, infrastructure, and operations, to support business requirements and implement new features. Implement and follow best practices for cluster security, including user access management and integration with tools like Apache Ranger and Kerberos. Support backup, recovery, and disaster recovery processes to ensure data availability and business continuity. Conduct performance tuning and optimization of Hadoop clusters to enhance system efficiency and reduce latency. Analyze logs and use tools like Splunk to debug and resolve production issues. Document operational processes, maintenance procedures, and troubleshooting steps to ensure knowledge sharing and consistency. Stay updated on emerging technologies and contribute to the adoption of new tools and practices to improve cluster management. Education Bachelor's degree in computer science, Information Systems, or another related field with 7+ years of IT and Infrastructure engineering work experience Experience 7+ Years Total IT experience & 4+ Years relevant experience in Big Data databases. Big Data Platform Management: Knowledge in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL. Automation and Scripting: Expertise in automation tools and scripting languages such as Ansible, Python, and Bash to streamline operational tasks and improve efficiency. DevOps Practices: Proficiency in DevOps tools and methodologies, including CI/CD pipelines, version control systems (e.g., Git), and infrastructure-as-code practices. Monitoring and Troubleshooting: Experience with monitoring and observability tools such as Splunk, Elastic Stack, or Prometheus to identify and resolve system issues. Linux Administration: Solid knowledge of Linux operating systems, including system administration, troubleshooting, and performance tuning. Backup and Recovery: Familiarity with implementing and managing backup and recovery processes to ensure data availability and business continuity. Security and Access Management: Understanding of security best practices, including user access management and integration with tools like Kerberos. Agile Methodologies: Knowledge of Agile practices and frameworks, such as SAFe, with experience working in Agile environments.
ITSM Tools: Familiarity with ITSM processes and tools like ServiceNow for incident and change management. Other Critical Requirements Excellent analytical and problem-solving skills. Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability. Excellent written and oral communication skills, including the ability to clearly communicate/articulate technical and functional issues with conclusions and recommendations to stakeholders. Prior experience in handling stateside and offshore stakeholders. Experience in creating and delivering business presentations. Demonstrated ability to work independently and in a team environment. Demonstrated willingness to learn and adopt new technologies and tools to improve operational efficiency. About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
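
As a hedged illustration of the routine-monitoring automation this posting describes, the sketch below checks HDFS capacity from a cluster edge node. It assumes the `hdfs` CLI is on the PATH and that `hdfs dfsadmin -report` prints a "DFS Used%" summary line; the alert threshold is a hypothetical example value, not MetLife tooling.

```python
import subprocess

THRESHOLD_PCT = 80.0  # hypothetical alerting threshold

# Capture the cluster capacity report from the standard HDFS admin CLI.
report = subprocess.run(
    ["hdfs", "dfsadmin", "-report"],
    capture_output=True, text=True, check=True,
).stdout

for line in report.splitlines():
    # The summary typically includes a line like: "DFS Used%: 63.21%"
    if line.strip().startswith("DFS Used%"):
        used_pct = float(line.split(":")[1].strip().rstrip("%"))
        if used_pct > THRESHOLD_PCT:
            print(f"ALERT: HDFS usage {used_pct}% exceeds {THRESHOLD_PCT}%")
        else:
            print(f"OK: HDFS usage {used_pct}%")
        break
```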

Posted 17 hours ago

Apply

10.0 - 14.0 years

30 - 37 Lacs

Noida

Hybrid

Naukri logo

Required Qualifications: Undergraduate degree or equivalent experience. 5+ years of work experience with Big Data skills. 5+ years of experience managing a team. 5+ years of work experience in people management. 3+ years of work experience with Azure Cloud skills. Experience or knowledge of Azure Cloud, Databricks, Terraform, CI/CD, Spark, Scala, Java, HBase, Hive, Sqoop, GitHub, Jenkins, Elastic Search, Grafana, UNIX, SQL, OpenShift, Kubernetes, Oozie, etc. Solid technical knowledge and work experience with Big Data and Azure Cloud skills.

Primary Responsibilities: Design and develop large-scale data processing systems, using expertise in big data technologies to ensure that the systems are efficient, scalable, and secure. Ensure that the developed systems run smoothly: monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize performance. Process, clean, and integrate large data sets from various sources to ensure that the data is accurate, complete, and consistent. Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to ensure that the systems developed meet the organization's requirements and can support its goals. Collaborate closely with senior stakeholders to understand business requirements and effectively translate them into technical requirements for the development team. Plan and document comprehensive technical specifications for features or system design, ensuring a clear roadmap for development and implementation. Design, build, and configure applications to meet business process and application requirements, leveraging technical expertise and problem-solving skills. Direct the development team in all aspects of the software development life cycle, including design, development, coding, testing, and debugging, to deliver high-quality solutions. Write testable, scalable, and efficient code, lead by example, and set coding standards for the team. Conduct code reviews and provide constructive feedback to ensure code quality and adherence to best practices. Mentor and guide junior team members, fostering their professional growth and encouraging the adoption of industry best practices. Ensure that software quality standards are met by enforcing code standards, conducting rigorous testing, and implementing continuous improvement processes. Stay updated with the latest technologies and industry trends, continuously enhancing technical skills and driving innovation within the development team. Set and communicate team priorities that support the broader organization's goals; align strategy, processes, and decision-making across teams. Set clear expectations with individuals based on their level and role, aligned to the broader organization's goals; meet regularly with individuals to discuss performance and development and provide feedback and coaching. Develop the long-term technical vision and roadmap within, and often beyond, the scope of your teams; evolve the roadmap to meet anticipated future requirements and infrastructure needs.
Identify, navigate, and overcome technical and organizational barriers that may stand in the way of delivery. Constantly improve the processes and practices around development and delivery. Always think customer first, striving to outperform their expectations. Work effectively with Product Managers, Program Managers, and other stakeholders to ensure the customer benefits from the work. Foster and facilitate Agile methodologies globally and work in an agile environment using Scrum or Kanban. Work with Program Managers/leads to consume the product backlog and generate technical designs. Lead by example on design and development of platform features. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Posted 17 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Make an impact with NTT DATA. Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it's a place where you can grow, belong and thrive.

Your day at NTT DATA: The Data Engineer is a seasoned subject matter expert, responsible for the transformation of data into a structured format that can be easily analyzed in a query or report. This role is responsible for developing structured data sets that can be reused or complemented by other data sets and reports. This role analyzes data sources and data structures and designs and develops data models to support the analytics requirements of the business, which include management, operational, predictive, and data science capabilities.

Key responsibilities: Creates data models in a structured data format to enable analysis thereof. Designs and develops scalable extract, transformation, and loading (ETL) packages from the business source systems, and develops ETL routines to populate data from sources. Participates in the transformation of object and data models into appropriate database schemas within design constraints. Interprets installation standards to meet project needs and produces database components as required. Creates test scenarios and is responsible for participating in thorough testing and validation to support the accuracy of data transformations (a hedged reconciliation sketch follows this posting). Accountable for running data migrations across different databases and applications, for example MS Dynamics, Oracle, SAP, and other ERP systems. Works across multiple IT and business teams to define and implement data table structures and data models based on requirements. Accountable for analysis and development of ETL and migration documentation. Collaborates with various stakeholders to evaluate potential data requirements. Accountable for the definition and management of scoping, requirements definition, and prioritization activities for small-scale changes, and assists with more complex change initiatives. Collaborates with various stakeholders, contributing to the recommendation of improvements in automated and non-automated components of the data tables, data queries, and data models.

To thrive in this role, you need to have: Seasoned knowledge of the definition and management of scoping, requirements definition, and prioritization activities. Seasoned understanding of database concepts, object and data modelling techniques, and design principles, with conceptual knowledge of building and maintaining physical and logical data models. Seasoned expertise in Microsoft Azure Data Factory, SQL Analysis Server, SAP Data Services, SAP BTP. Seasoned understanding of the data architecture landscape between physical and logical data models. Analytical mindset with excellent business acumen skills. Problem-solving aptitude with the ability to communicate effectively, both written and verbal. Ability to build effective relationships at all levels within the organization. Seasoned expertise in programming languages (Perl, bash, Shell Scripting, Python, etc.).

Academic qualifications and certifications: Bachelor's degree or equivalent in computer science, software engineering, information technology, or a related field. Relevant certifications preferred, such as SAP, Microsoft Azure, etc. Certified Data Engineer or Certified Professional certification preferred.
Required experience: Seasoned experience in data engineering and data mining within a fast-paced environment. Proficient in building modern data analytics solutions that deliver insights from large and complex data sets at multi-terabyte scale. Seasoned experience with architecture and design of secure, highly available, and scalable systems. Seasoned proficiency in automation and scripting, with proven examples of successful implementation. Seasoned proficiency using scripting languages (Perl, bash, Shell Scripting, Python, etc.). Seasoned experience with big data tools like Hadoop, Cassandra, Storm, etc. Seasoned experience in any applicable language, preferably .NET. Seasoned proficiency working with SAP, SQL, MySQL databases, and Microsoft SQL. Seasoned experience working with data sets and ordering data through MS Excel functions, e.g., macros and pivots.

Workplace type: Hybrid Working

About NTT DATA: NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer: NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
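
A hedged sketch of the post-load validation step mentioned above: reconcile row counts and a simple amount checksum between a source table and its ETL target. The connections, table, and column names are hypothetical placeholders, and SQLite's TOTAL() aggregate stands in for whatever the actual databases provide.

```python
import sqlite3

def reconcile(src_conn, tgt_conn, table, amount_col):
    """Assert that source and target agree on row count and amount sum."""
    # Table and column names are trusted placeholders in this sketch;
    # real code would validate or parameterize them.
    src_count, src_sum = src_conn.execute(
        f"SELECT COUNT(*), TOTAL({amount_col}) FROM {table}").fetchone()
    tgt_count, tgt_sum = tgt_conn.execute(
        f"SELECT COUNT(*), TOTAL({amount_col}) FROM {table}").fetchone()
    assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"
    assert abs(src_sum - tgt_sum) < 1e-6, "amount checksum mismatch"
    return src_count

# Tiny self-contained demo with two in-memory databases.
src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
print("rows reconciled:", reconcile(src, tgt, "orders", "amount"))
```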

Posted 17 hours ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

Naukri logo

The Senior Spark Tech Lead will be responsible for integrating and maintaining the Quantexa platform, a Spark-based software provided by a UK fintech, into our existing systems to enhance our anti-money laundering capabilities. This role requires deep expertise in Spark development, as well as the ability to analyze and understand the underlying data. Additionally, the candidate should have an interest in exploring open-source applications distributed by Apache, Kubernetes, OpenSearch, and Oracle. Should be able to work as a Scrum Master.

Responsibilities

Direct Responsibilities: Integrate and upgrade the Quantexa tool with our existing systems for enhanced anti-money laundering measures. Develop and maintain Spark-based applications deployed on Kubernetes clusters (see the hedged submission sketch below). Conduct data analysis to understand and interpret underlying data structures. Collaborate with cross-functional teams to ensure seamless integration and functionality. Stay updated with the latest trends and best practices in Spark development and Kubernetes.

Contributing Responsibilities: Take complete ownership of project activities and understand each task in detail. Ensure that the team delivers on time without any delays and that deliveries meet high quality standards. Estimate, plan, and schedule the project. Ensure all internal timelines are respected and the project is on track. Work with the team to develop robust software adhering to the timelines and following all the standard guidelines. Act proactively to ensure smooth team operations and effective collaboration. Make sure the team adheres to all compliance processes and intervene if required. Assign tasks to the team and track them until completion. Report status proactively to management. Identify risks in the project and highlight them to the Manager. Create contingency and backup plans as necessary. Create a mitigation plan. Make decisions independently based on the situation. Play the role of mentor and coach team members as and when required to meet target goals. Gain functional knowledge of the applications worked upon. Create knowledge repositories for future reference. Arrange knowledge-sharing sessions to enhance the team's functional capability. Evaluate new tools and come up with POCs. Provide feedback on the team to upper management on a timely basis.

Technical & Behavioral Competencies

Required Qualifications: 7+ years of experience in development. Extensive experience in Hadoop, Spark, and Scala development (5 years minimum). Strong analytical skills and experience in data analysis (SQL), data processing (such as ETL), parsing, data mapping, and handling real-life data quality issues. Excellent problem-solving abilities and attention to detail. Strong communication and collaboration skills. Experience in Agile development. High-quality coding skills, including code control, unit testing, design, and documentation (code, test). Experience with tools such as Sonar. Experience with Git and Jenkins.

Specific Qualifications: Experience with development and deployment of Spark applications on Kubernetes clusters. Hands-on development experience (Java, Scala, etc.) via system integration projects; Python, Elastic (optional).

Behavioural Skills: Ability to collaborate / teamwork. Adaptability. Creativity & innovation / problem solving. Attention to detail / rigor.

Transversal Skills: Analytical ability. Ability to develop and adapt a process. Ability to develop and leverage networks.

Education Level: Bachelor's degree or equivalent. Experience Level: At least 7 years. Fluent in English. Team player. Strong analytical skills. Quality oriented and well organized. Willing to work under pressure and mission oriented. Excellent oral and written communication skills, motivational skills, results-oriented.
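
As a hedged illustration of deploying a Spark application to Kubernetes, the sketch below drives spark-submit from Python; the API-server URL, namespace, and container image are hypothetical, while the flags shown are standard spark-submit Kubernetes options.

```python
import subprocess

# Hypothetical cluster endpoint, namespace, and image; the local://
# scheme points at an application file baked into the container image.
cmd = [
    "spark-submit",
    "--master", "k8s://https://k8s-apiserver.example.com:6443",
    "--deploy-mode", "cluster",
    "--name", "aml-scoring",
    "--conf", "spark.kubernetes.namespace=analytics",
    "--conf", "spark.kubernetes.container.image=registry.example.com/spark-app:1.0",
    "--conf", "spark.executor.instances=4",
    "local:///opt/app/main.py",
]
subprocess.run(cmd, check=True)
```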

Posted 17 hours ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Come work at a place where innovation and teamwork come together to support the most exciting missions in the world!

Job Description: We are seeking a talented Lead Big Data Engineer to deliver roadmap features of Unified Asset Inventory. This is a great opportunity to be an integral part of a team building Qualys' next-generation micro-services-based platform processing over 100 million transactions and terabytes of data per day, leveraging open-source technologies and working on challenging and business-impacting projects.

Responsibilities: You will build the Unified Asset Management product in the cloud. You will build highly scalable micro-services that interact with the Qualys Cloud Platform. Research, evaluate, and adopt next-generation technologies. Produce high-quality software following good architecture and design principles that you and your team will find easy to work with in the future.

Qualifications: Bachelor's degree in computer science or equivalent. 10+ years of total experience. 4+ years of relevant experience in design and architecture of Big Data solutions using Spark. 3+ years of experience working with engineering resources for innovation. 4+ years of experience with Big Data event-flow pipelines. 3+ years of experience in performance testing for large infrastructure. 3+ years of in-depth experience with search solutions (Solr/Elasticsearch). 3+ years of experience with Kafka. In-depth experience with data lakes and related ecosystems. In-depth experience with messaging queues. In-depth experience in specifying requirements to build scalable architectures for Big Data and micro-services environments. In-depth experience with caching components or services. Knowledge of Presto. Knowledge of Airflow. Hands-on experience in scripting and automation. In-depth understanding of RDBMS/NoSQL, Oracle, Cassandra, Kafka, Redis, Hadoop, lambda architecture, and kappa/kappa++ architectures with Flink data streaming and rule engines. Experience working with ML model engineering and related deployment. Design and implement secure Big Data clusters to meet compliance and regulatory requirements. Experience leading the delivery of large-scale systems focused on managing the infrastructure layer of the technology stack. Strong experience in performance benchmarking for Big Data technologies. Strong troubleshooting skills. Experience leading development life cycle processes and best practices. Experience in Big Data services administration would be added value. Experience with Agile management (SCRUM, RUP, XP), OO modeling, and working on internet, UNIX, middleware, and database-related projects. Experience mentoring/training the engineering community on complex technical issues.
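
A hedged sketch of the event-consuming micro-service pattern this posting describes, using the kafka-python package; the topic name, broker address, and event fields are hypothetical.

```python
import json
from kafka import KafkaConsumer  # assumes the kafka-python package is installed

# Hypothetical topic and broker; a real service would take these from config.
consumer = KafkaConsumer(
    "asset-events",
    bootstrap_servers=["kafka.example.com:9092"],
    group_id="asset-inventory",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Downstream processing (enrichment, indexing, alerting) would go here.
    print(event.get("asset_id"), event.get("event_type"))
```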

Posted 18 hours ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Who are we? Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both the individual end user and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com

📍 Location: Pune 📈 Experience: 6 to 10 Years 🔍 Must-Have Skills: 💡 Generative AI 🐍 Python 📚 RAG (Retrieval-Augmented Generation) 🧠 LLM / NLP 📊 EDA (Exploratory Data Analysis) ☁️ Cloud Platforms (AWS, Azure, GCP) 🔄 Transformers 🧾 Explainable AI 🌌 Deep Learning

Desired Background: At least 4+ years of relevant experience and a track record in Data Science: Machine Learning, Deep Learning, and Statistical Data Analysis. MSc or PhD degree in CS, Mathematics, Bioinformatics, Statistics, Engineering, Physics, or a similar discipline. Strong hands-on experience in Python, with a focus on statistical algorithm development and GenAI practices. Experience with data science libraries such as sklearn, pandas, numpy, pytorch/tensorflow. Experience with GenAI concepts (RAG, LLM) and agentic development: from conversational to autonomous agents. Team player, able to work in collaboration with subject matter experts, with the ability to clearly present and communicate findings. Proven ability to build and deliver data solutions in a short time frame. Experience with Azure, Docker, and development methodologies is an advantage. Proven experience in production and DevOps practices.

Qualifications: Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.). At least 1-2 years of experience in quantitative analytics or data modeling. Deep understanding of predictive modeling, machine learning, clustering and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau).
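
A hedged toy of the retrieval step in a RAG pipeline: rank documents against a query before passing the top hits to an LLM as context. TF-IDF from scikit-learn keeps the sketch self-contained; production systems would typically use embedding models and a vector store instead, and the documents here are illustrative.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Amdocs helps service providers migrate to the cloud.",
    "Retrieval-augmented generation grounds LLM answers in documents.",
    "Exploratory data analysis summarizes datasets before modeling.",
]
query = ["How does RAG ground model answers?"]

# Vectorize the corpus, then score each document against the query.
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(docs)
query_vector = vectorizer.transform(query)

scores = cosine_similarity(query_vector, doc_vectors)[0]
best = scores.argmax()
print(f"Top context ({scores[best]:.2f}): {docs[best]}")
```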

Posted 18 hours ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information. What You'll Do Manage the customer's priorities across projects and requests Assess customer needs utilizing a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost Design and implement software products (Big Data related) including data models and visualizations Demonstrate participation with the teams you work in Deliver good solutions against tight timescales Be proactive, suggest new approaches, and develop your capabilities Share what you are good at while learning from others to improve the team overall Show a solid level of understanding across a range of technical skills, attitudes, and behaviors Deliver great solutions Be focused on driving value back into the business Expertise You'll Bring 6 years' experience in designing and developing enterprise application solutions for distributed systems Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume) Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce, and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases Apache Spark or other streaming Big Data processing preferred; Java or other Big Data technologies will be a plus Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment •We offer hybrid work options and flexible working hours to accommodate various needs and preferences. 
•Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential. See Beyond, Rise Above

Posted 19 hours ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Naukri logo

About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Big Data Developer to carry out coding or programming of Hadoop applications and develop software using Hadoop technologies like Spark, Scala, Python, HBase, Hive, and Cloudera. In this role, you will need to concentrate on creating, testing, implementing, and monitoring applications designed to meet the organization's strategic goals. What You'll Do Develop (code) for Hadoop, Spark, Java, and AngularJS Collaborate with like-minded team members to establish best practices and identify optimal technical solutions (20%) Review code and provide feedback relative to best practices; improve performance Design, develop, and test a large-scale, custom distributed software system using the latest Java, Scala, and Big Data technologies Adhere to appropriate SDLC and Agile practices Contribute actively to the definition of the technological strategy (design, architecture, and interfaces) in order to effectively respond to our client's business needs Participate in technological watch and the definition of standards to ensure that our systems and data warehouses are efficient, resilient, and durable Provide guidance and coaching to associate software developers Use Informatica or similar products, with an understanding of heterogeneous data replication techniques Conduct performance tuning, improvement, balancing, usability, and automation Expertise You'll Bring Experience developing code on distributed databases using Spark, HDFS, Hive 3+ years of experience as an Application Developer / Data Architect, or equivalent role Strong knowledge of data and data models Good understanding of data consumption patterns by business users Solid understanding of business processes and structures Basic knowledge of the securities trading business and risk Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment •We offer hybrid work options and flexible working hours to accommodate various needs and preferences. 
•Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential at Persistent. See Beyond, Rise Above.

Posted 19 hours ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Pune

Work from Office

Naukri logo

About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Big Data Developer to carry out coding or programming of Hadoop applications and develop software using Hadoop technologies like Spark, Scala, Python, HBase, Hive, and Cloudera. In this role, you will need to concentrate on creating, testing, implementing, and monitoring applications designed to meet the organization's strategic goals. What You'll Do Develop (code) for Hadoop, Spark, Java, and AngularJS Collaborate with like-minded team members to establish best practices and identify optimal technical solutions (20%) Review code and provide feedback relative to best practices; improve performance Design, develop, and test a large-scale, custom distributed software system using the latest Java, Scala, and Big Data technologies Adhere to appropriate SDLC and Agile practices Contribute actively to the definition of the technological strategy (design, architecture, and interfaces) in order to effectively respond to our client's business needs Participate in technological watch and the definition of standards to ensure that our systems and data warehouses are efficient, resilient, and durable Provide guidance and coaching to associate software developers Use Informatica or similar products, with an understanding of heterogeneous data replication techniques Conduct performance tuning, improvement, balancing, usability, and automation Expertise You'll Bring Experience developing code on distributed databases using Spark, HDFS, Hive 3+ years of experience as an Application Developer / Data Architect, or equivalent role Strong knowledge of data and data models Good understanding of data consumption patterns by business users Solid understanding of business processes and structures Basic knowledge of the securities trading business and risk Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment •We offer hybrid work options and flexible working hours to accommodate various needs and preferences. 
•Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential at Persistent. See Beyond, Rise Above.

Posted 19 hours ago

Apply

12.0 - 17.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Naukri logo

About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for an experienced Python Lead/Architect to join our engineering team and help us create dynamic software applications for our clients. In this role, you will be responsible for writing and testing scalable code, developing back-end components, and integrating user-facing elements in collaboration with front-end developers. To be successful as a Python Lead, you should possess in-depth knowledge of object-relational mapping, experience with server-side logic, and above-average knowledge of Python programming. Ultimately, as a top-class Python developer, you should be able to design highly responsive web applications that perfectly meet the needs of the client. 
What You'll Do Be fully responsible for the quality of code for which the team is responsible (either through personal review or thoughtful delegation) Write code whenever required (this is not a pure management role) Participate in the development and evangelization of the Python coding standards within the organization Take full responsibility for delivering solutions into production (working through operations teams) Be responsible for training and mentoring developers on the team and recommending actions around hiring, firing, and promotion Work with technical project management to create and maintain a prioritized backlog and schedule for the team Be responsible for architectural decisions in consultation with other members of the engineering leadership Contribute to team effort by accomplishing related results as needed Display solid fiscal responsibility by managing and adhering to budgets and always seeking out operating efficiencies and economies Expertise You'll Bring 4+ years in leading development teams Leading teams successfully in a dynamic, fast time-to-market and customer-focused environment Leading initiatives where teams were comprised of onshore and offshore resources Working with HTML / CSS / JS Programming in Python Leveraging serverless architecture within AWS or similar cloud platforms Developing server-side web applications, REST APIs, and / or microservices Working with small, nimble development teams Cybersecurity concepts and technologies Data pipelines or distributed message queues Internet / Web technologies, such as web browsers, AJAX, HTTP, HTML / XML, REST, JavaScript, CSS, XSL / XSLT, XPATH etc Strong organizational ability, including quick responses in a fast-paced environment Commitment to quality and an eye for detail Passion for software security Eagerness to learn new technology and solve problems Inclusive, roll-up-your-sleeves work ethic: show a willingness to participate in daily workloads when needed to meet deadlines Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment •We offer hybrid work options and flexible working hours to accommodate various needs and preferences. •Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential. See Beyond, Rise Above.
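
A hedged sketch of the minimal REST micro-service work this role describes, using Flask; the routes and payloads are hypothetical, and a production service would add authentication, configuration, and a real datastore.

```python
from flask import Flask, jsonify  # assumes the Flask package is installed

app = Flask(__name__)

@app.route("/health")
def health():
    # Lightweight liveness endpoint for orchestrators and load balancers.
    return jsonify(status="ok")

@app.route("/api/v1/items/<int:item_id>")
def get_item(item_id):
    # A real service would fetch this from a datastore.
    return jsonify(id=item_id, name=f"item-{item_id}")

if __name__ == "__main__":
    app.run(port=8080)
```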

Posted 19 hours ago

Apply

Exploring Hadoop Jobs in India

The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving IT industry and have a high demand for Hadoop professionals.

Average Salary Range

The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.

Career Path

In the Hadoop field, a typical career path may include roles such as Junior Developer, Senior Developer, Tech Lead, and eventually progressing to roles like Data Architect or Big Data Engineer.

Related Skills

In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.

Interview Questions

  • What is Hadoop and how does it work? (basic)
  • Explain the difference between HDFS and MapReduce. (medium)
  • How do you handle data skew in Hadoop? (medium; see the salting sketch after this list)
  • What is YARN in Hadoop? (basic)
  • Describe the concept of NameNode and DataNode in HDFS. (medium)
  • What are the different types of join operations in Hive? (medium)
  • Explain the role of the ResourceManager in YARN. (medium)
  • What is the significance of the shuffle phase in MapReduce? (medium)
  • How does speculative execution work in Hadoop? (advanced)
  • What is the purpose of the Secondary NameNode in HDFS? (medium)
  • How do you optimize a MapReduce job in Hadoop? (medium)
  • Explain the concept of data locality in Hadoop. (basic)
  • What are the differences between Hadoop 1 and Hadoop 2? (medium)
  • How do you troubleshoot performance issues in a Hadoop cluster? (advanced)
  • Describe the advantages of using HBase over traditional RDBMS. (medium)
  • What is the role of the JobTracker in Hadoop? (medium)
  • How do you handle unstructured data in Hadoop? (medium)
  • Explain the concept of partitioning in Hive. (medium)
  • What is Apache ZooKeeper and how is it used in Hadoop? (advanced)
  • Describe the process of data serialization and deserialization in Hadoop. (medium)
  • How do you secure a Hadoop cluster? (advanced)
  • What is the CAP theorem and how does it relate to distributed systems like Hadoop? (advanced)
  • How do you monitor the health of a Hadoop cluster? (medium)
  • Explain the differences between Hadoop and traditional relational databases. (medium)
  • How do you handle data ingestion in Hadoop? (medium)
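
As a worked answer to the data-skew question above, here is a hedged PySpark sketch of key salting: spread a hot key across partitions by adding a random salt, aggregate on (key, salt), then roll the partial results back up by key. The dataset and salt count are illustrative.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.master("local[2]").appName("skew-demo").getOrCreate()

# Illustrative skewed data: one hot key dominates the distribution.
df = spark.createDataFrame(
    [("hot", 1)] * 1000 + [("cold", 1)] * 10,
    ["key", "value"],
)

SALTS = 8  # number of salt buckets; tune to cluster size
salted = df.withColumn("salt", (F.rand() * SALTS).cast("int"))

# First pass spreads the hot key across partitions; second pass
# combines the partial sums back into one row per key.
partial = salted.groupBy("key", "salt").agg(F.sum("value").alias("partial_sum"))
result = partial.groupBy("key").agg(F.sum("partial_sum").alias("total"))
result.show()
```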

Closing Remark

As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies