1.0 - 2.0 years
3 - 4 Lacs
Tamil Nadu
Work from Office
We are looking for a highly motivated and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 1-2 years of experience in the BFSI industry, preferably with knowledge of Assets, Inclusive Banking, SBL, Mortgages, and Receivables.

Roles and Responsibilities:
- Manage and oversee branch receivables operations for timely and accurate payments.
- Develop and implement strategies to improve receivables management and reduce delinquencies.
- Collaborate with cross-functional teams to resolve customer complaints and issues.
- Analyze and report on receivables performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Maintain accurate records and reports of receivables transactions.

Job Requirements:
- Strong understanding of BFSI industry trends and regulations.
- Experience in managing branch receivables operations and improving efficiency.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong analytical and problem-solving skills.
- Proficiency in using financial software and systems.

Location: Inclusive Banking - SBL, South, Tamil Nadu, Madurai, Dhindukkal North, Kambam, Theni, 1281, Kambam
Posted 2 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Karnataka
Work from Office
We are looking for a highly skilled and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 2 to 7 years of experience in the BFSI industry, with expertise in Assets, Inclusive Banking, SBL, Mortgages, and Receivables.

Roles and Responsibilities:
- Manage and oversee the daily operations of the branch receivable office.
- Develop and implement strategies to improve receivable management processes.
- Collaborate with cross-functional teams to resolve customer complaints and issues.
- Analyze and report on receivable performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Train and guide junior staff members on receivable procedures and best practices.

Job Requirements:
- Strong knowledge of BFSI industry trends and regulations.
- Experience in managing assets, inclusive banking, SBL, mortgages, and receivables.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong analytical and problem-solving skills.
- Proficiency in Microsoft Office and other relevant software applications.

Location: Inclusive Banking - SBL, South, Karnataka, Karnataka, Chitradurga, Bhadravathi, Karnataka, 3112, Tiptur
Posted 2 weeks ago
1.0 - 5.0 years
1 - 3 Lacs
Pondicherry
Work from Office
We are looking for a highly skilled and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 1 to 4 years of experience in the BFSI industry, preferably with SBL.

Roles and Responsibilities:
- Manage and oversee the receivables process at the branch level.
- Ensure timely collection of payments from customers and maintain accurate records.
- Develop and implement strategies to improve receivables management.
- Collaborate with other departments to resolve customer complaints and issues.
- Analyze and report on receivables performance metrics.
- Maintain compliance with regulatory requirements and company policies.

Job Requirements:
- Strong knowledge of inclusive banking principles and practices.
- Experience in managing mortgages and receivables processes.
- Excellent communication and interpersonal skills.
- Ability to work effectively in a fast-paced environment and meet deadlines.
- Strong analytical and problem-solving skills.
- Familiarity with branch operations and procedures.
Posted 2 weeks ago
1.0 - 2.0 years
1 - 3 Lacs
Madurai, Kovilpatti
Work from Office
We are looking for a highly motivated and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 1-2 years of experience in the BFSI industry, preferably with knowledge of Assets, Inclusive Banking, SBL, Mortgages, and Receivables.

Roles and Responsibilities:
- Manage and oversee branch receivables operations for timely and accurate payments.
- Develop and implement strategies to improve receivables management and reduce delinquencies.
- Collaborate with cross-functional teams to resolve customer complaints and issues.
- Analyze and report on receivables performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Maintain accurate records and reports of receivables transactions.

Job Requirements:
- Strong understanding of BFSI industry trends and regulations.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Proficiency in MS Office and other relevant software applications.
- Strong analytical and problem-solving skills.
- Experience working with diverse stakeholders, including customers, colleagues, and management.
- Familiarity with Equitas Small Finance Bank's products and services is an added advantage.
Posted 2 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Thuraiyur, Tamil Nadu
Work from Office
We are looking for a highly skilled and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 2 to 7 years of experience in the BFSI industry, with expertise in Assets, Inclusive Banking, SBL, Mortgages, and Receivables.

Roles and Responsibilities:
- Manage and oversee branch receivables operations for efficient cash flow.
- Develop and implement strategies to improve receivables management.
- Collaborate with cross-functional teams to resolve customer issues and enhance service quality.
- Analyze and report on receivables performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Train and guide junior staff members to improve their skills and knowledge.

Job Requirements:
- Strong understanding of BFSI industry trends and regulations.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Proficiency in MS Office and other relevant software applications.
- Strong analytical and problem-solving skills.
- Experience in managing and leading a team of receivables professionals.

Location: Inclusive Banking - SBL, South, Tamil Nadu, Kumbakonam, Perambalur, Thuraiyur, Musiri, 1060, Musiri
Posted 2 weeks ago
5.0 - 10.0 years
14 - 17 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that address clients' needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Experience developing Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (similar to a rules engine).
- Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark.
- Experience using Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
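The "custom framework for generating rules (similar to a rules engine)" mentioned above can be sketched in a few lines of plain Python. This is a hypothetical illustration only; the names (`Rule`, `apply_rules`) and the sample records are invented, not the actual framework the posting refers to:

```python
# Minimal rules-engine sketch: each rule pairs a predicate with an action
# label, and applying the rules returns every action whose predicate fires.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]  # returns True when the rule fires
    action: str                        # label attached to matching records

RULES = [
    Rule("high_value", lambda r: r["amount"] > 10_000, "review"),
    Rule("stale", lambda r: r["days_overdue"] > 90, "escalate"),
]

def apply_rules(record: dict) -> list[str]:
    """Return the actions of every rule that fires for this record."""
    return [rule.action for rule in RULES if rule.predicate(record)]

print(apply_rules({"amount": 25_000, "days_overdue": 120}))
# → ['review', 'escalate']
```

In a real pipeline the same predicates would typically be compiled into PySpark column expressions so they run distributed over a DataFrame rather than per-record in Python.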
Posted 2 weeks ago
6.0 - 11.0 years
19 - 27 Lacs
Haryana
Work from Office
About the Company
Founded in 2011, ReNew is one of the largest renewable energy companies globally, with a leadership position in India. Listed on Nasdaq under the ticker RNW, ReNew develops, builds, owns, and operates utility-scale wind energy projects, utility-scale solar energy projects, utility-scale firm power projects, and distributed solar energy projects. In addition to being a major independent power producer in India, ReNew is evolving to become an end-to-end decarbonization partner, providing solutions in a just and inclusive manner in the areas of clean energy, green hydrogen, value-added energy offerings through digitalisation, storage, and carbon markets that are increasingly integral to addressing climate change. With a total capacity of more than 13.4 GW (including projects in the pipeline), ReNew's solar and wind energy projects are spread across 150+ sites, with a presence spanning 18 states in India, contributing to 1.9% of India's power capacity. Consequently, this has helped to avoid 0.5% of India's total carbon emissions and 1.1% of India's total power sector emissions. In over 10 years of operation, ReNew has generated almost 1.3 lakh jobs, directly and indirectly. ReNew has achieved market leadership in the Indian renewable energy industry against the backdrop of the Government of India's policies to promote the growth of this sector. ReNew's current group of stockholders includes several marquee investors such as CPP Investments, Abu Dhabi Investment Authority, Goldman Sachs, GEF SACEF, and JERA. Its mission is to play a pivotal role in meeting India's growing energy needs in an efficient, sustainable, and socially responsible manner. ReNew stands committed to providing clean, safe, affordable, and sustainable energy for all and has been at the forefront of leading climate action in India.

Job Description
Key responsibilities:
1. Understand, implement, and automate ETL pipelines to industry standards.
2. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, designing infrastructure for greater scalability, etc.
3. Develop, integrate, test, and maintain existing and new applications.
4. Design and create data pipelines (data lake / data warehouse) for real-world energy analytics solutions.
5. Expert-level proficiency in Python (preferred) for automating everyday tasks.
6. Strong understanding of and experience with distributed computing frameworks, particularly Spark, Spark SQL, Kafka, Spark Streaming, Hive, Azure Databricks, etc.
7. Some experience using other leading cloud platforms, preferably Azure.
8. Hands-on experience with Azure Data Factory, Logic Apps, Analysis Services, Azure Blob Storage, etc.
9. Ability to work in a team in an agile setting, familiarity with JIRA, and a clear understanding of how Git works.
10. Must have 5-7 years of experience.
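The "understand, implement, and automate ETL pipelines" responsibility above reduces to three composable steps: extract, transform, load. A minimal sketch in plain Python, using SQLite as a stand-in sink (illustrative only; the function names and sample meter readings are invented, not ReNew's codebase, which would target Spark/Databricks):

```python
# Toy ETL sketch: extract raw meter readings, transform (drop blanks,
# convert kWh to MWh, normalise site ids), and load into SQLite.
import sqlite3

def extract(rows):
    """Stand-in source: yield raw (site, kwh) records, skipping blanks."""
    for site, kwh in rows:
        if kwh is not None:
            yield site, kwh

def transform(records):
    """Convert kWh to MWh and normalise site identifiers."""
    for site, kwh in records:
        yield site.strip().upper(), round(kwh / 1000, 3)

def load(records, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS readings (site TEXT, mwh REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", records)

raw = [("wind-01 ", 1500.0), ("solar-07", None), ("wind-02", 250.0)]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT site, mwh FROM readings ORDER BY site").fetchall())
# → [('WIND-01', 1.5), ('WIND-02', 0.25)]
```

Because each stage is a generator, records stream through without materialising the whole batch, which is the same shape a Spark pipeline takes at much larger scale.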
Posted 2 weeks ago
8.0 - 12.0 years
7 - 11 Lacs
Pune
Work from Office
- Experience with ETL processes and data warehousing
- Proficient in SQL and Python/Java/Scala
- Team lead experience
Posted 2 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Overall Responsibilities:
- Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform, ensuring data integrity and accuracy.
- Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
- Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements.
- Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes.
- Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline.
- Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem.
- Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes.
- Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives.
- Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.

Category-wise Technical Skills:
- PySpark: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques.
- Cloudera Data Platform: Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
- Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
- Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools.
- Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
- Scripting and Automation: Strong scripting skills in Linux.

Experience:
- 5-12 years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform.
- Proven track record of implementing data engineering best practices.
- Experience in data ingestion, transformation, and optimization on the Cloudera Data Platform.

Day-to-Day Activities:
- Design, develop, and maintain ETL pipelines using PySpark on CDP.
- Implement and manage data ingestion processes from various sources.
- Process, cleanse, and transform large datasets using PySpark.
- Conduct performance tuning and optimization of ETL processes.
- Implement data quality checks and validation routines.
- Automate data workflows using orchestration tools.
- Monitor pipeline performance and troubleshoot issues.
- Collaborate with team members to understand data requirements.
- Maintain documentation of data engineering processes and configurations.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Relevant certifications in PySpark and Cloudera technologies are a plus.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication abilities.
- Ability to work independently and collaboratively in a team environment.
- Attention to detail and commitment to data quality.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
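The "data quality checks, monitoring, and validation routines" this posting asks for are, at their core, a set of column-level assertions run over each batch before it is loaded. A hypothetical pure-Python sketch (a real CDP pipeline would express the same checks as PySpark DataFrame filters; the check names and sample batch are invented):

```python
# Hypothetical data-quality checks of the kind an ETL pipeline runs before
# loading: completeness (key present), uniqueness, and range checks.

def check_quality(rows, key="id"):
    """Return a dict of check name -> list of offending row indexes."""
    failures = {"missing_key": [], "duplicate_key": [], "negative_amount": []}
    seen = set()
    for i, row in enumerate(rows):
        if row.get(key) is None:
            failures["missing_key"].append(i)
        elif row[key] in seen:
            failures["duplicate_key"].append(i)
        else:
            seen.add(row[key])
        if row.get("amount", 0) < 0:
            failures["negative_amount"].append(i)
    return failures

batch = [
    {"id": 1, "amount": 100.0},
    {"id": 1, "amount": -5.0},    # duplicate key AND negative amount
    {"id": None, "amount": 7.5},  # missing key
]
print(check_quality(batch))
# → {'missing_key': [2], 'duplicate_key': [1], 'negative_amount': [1]}
```

Reporting offending row indexes rather than a bare pass/fail flag is what makes such routines useful for the monitoring and troubleshooting duties listed above.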
Posted 2 weeks ago
10.0 - 13.0 years
12 - 15 Lacs
Hyderabad, Gurugram, Ahmedabad
Work from Office
About the Role: Grade Level (for internal use): 11

The Role: Lead Data Engineer

Join Our Team: Step into a dynamic team at the forefront of data innovation! You'll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork.

The Impact: As a Lead Data Engineer at S&P Global, you'll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning innovative solutions with business needs. By ensuring seamless integration and continuous delivery, you'll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world.

What's in it for You:
- Career Development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement.
- Dynamic Work Environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions.
- Skill Enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development.
- Versatile Experience: Dive into full-stack development with hands-on exposure to cloud computing and large-scale data technologies.
- Leadership Opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global.

Responsibilities:
- Architect and develop scalable cloud applications, utilizing a range of services to create robust, high-performing solutions.
- Design and implement advanced automation pipelines, streamlining software delivery for fast, reliable deployments.
- Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients.
- Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes.
- Deliver high-quality code and detailed system design documents, setting the standard with technical walkthroughs that inspire excellence.
- Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions.
- Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership.

What We're Looking For: We're seeking a passionate and experienced professional who brings:
- 10-13 years of expertise in designing and building data-intensive solutions using distributed computing, with a proven track record of scalable architecture design.
- 5+ years of hands-on experience with Python, distributed/big data processing frameworks, and data/workflow orchestration tools, demonstrating technical versatility.
- Proficiency in SQL and NoSQL databases, with deep experience operationalizing data pipelines for large-scale processing.
- Extensive experience deploying data engineering solutions in public cloud environments, leveraging cloud capabilities to their fullest potential.
- A strong history of collaborating with business stakeholders and users to shape research directions and deliver robust, maintainable products.
- A talent for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines.
- Exceptional communication and documentation skills, with the ability to explain complex ideas to both technical and non-technical audiences.

Good to Have Skills:
- Strong knowledge of Generative AI and advanced tools and technologies that enhance developer productivity.
- Advanced programming skills used in big data processing ecosystems, supported by a portfolio of impactful projects.
- Expertise in containerization, scripting, and automation practices, ready to excel in a modern development ecosystem.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
Posted 2 weeks ago
5.0 - 10.0 years
2 - 5 Lacs
Chennai, Bengaluru
Work from Office
Job Title: Data Engineer
Experience: 5-10 Years
Location: Chennai, Bangalore

Requirements:
- Minimum 5+ years of development and design experience in Informatica Big Data Management.
- Extensive knowledge of Oozie scheduling, HQL, Hive, HDFS (including usage of storage controllers), and data partitioning.
- Extensive experience working with SQL and NoSQL databases.
- Linux OS configuration and use, including shell scripting.
- Good hands-on experience with design patterns and their implementation.
- Well versed in Agile, DevOps, and CI/CD principles (GitHub, Jenkins, etc.), and actively involved in troubleshooting issues in a distributed services ecosystem.
- Familiar with distributed services resiliency and monitoring in a production environment.
- Experience in designing, building, testing, and implementing security systems, including identifying security design gaps in existing and proposed architectures and recommending changes or enhancements.
- Responsible for adhering to established policies, following best practices, and possessing an in-depth understanding of exploits and vulnerabilities, resolving issues by taking the appropriate corrective action.
- Knowledge of security controls design for source and data transfers, including CRON, ETLs, and JDBC-ODBC scripts.
- Understanding of networking basics, including DNS, proxy, ACL, policy, and troubleshooting.
- High-level knowledge of compliance and regulatory requirements for data, including but not limited to encryption, anonymization, data integrity, and policy control features in large-scale infrastructures.
- Understanding of data sensitivity in terms of logging, events, and in-memory data storage, such as no card numbers or personally identifiable data in logs.
- Implements wrapper solutions for new/existing components with no/minimal security controls to ensure compliance with bank standards.
Posted 2 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Noida, India
Work from Office
Requirements:
- Hands-on technical experience with Spark/PySpark data processing.
- Table design knowledge using Hive, similar to RDBMS knowledge.
- SQL knowledge for data retrieval and transformation queries such as joins (full, left, right), ranking, and group by.
- Good communication skills.
- Additional skills: GitHub, Jenkins, and shell scripting would be an added advantage.

Mandatory Competencies:
- Big Data: Hadoop, Spark, PySpark, Hive
- DevOps/Configuration Mgmt: Jenkins; GitLab, GitHub, Bitbucket; basic Bash/shell script writing
- Database: SQL programming
- Behavioral: Communication and collaboration
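The transformation queries named above (joins, ranking, group by) look the same in Hive QL as in standard SQL. A small self-contained sketch using SQLite from Python, with invented table names and rows purely for illustration:

```python
# Illustrative SQL: LEFT JOIN plus GROUP BY over two tiny invented tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    CREATE TABLE customers (name TEXT, region TEXT);
    INSERT INTO orders VALUES (1, 'asha', 40.0), (2, 'asha', 60.0), (3, 'ravi', 10.0);
    INSERT INTO customers VALUES ('asha', 'south'), ('ravi', 'north');
""")
# LEFT JOIN keeps every order even when the matching customer row is
# missing; GROUP BY then aggregates order amounts per region.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM orders o
    LEFT JOIN customers c ON o.customer = c.name
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # → [('north', 10.0), ('south', 100.0)]
```

A FULL join keeps unmatched rows from both sides, and ranking is expressed with window functions such as `RANK() OVER (PARTITION BY ... ORDER BY ...)`, which Hive supports with the same syntax.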
Posted 2 weeks ago
7.0 - 12.0 years
11 - 16 Lacs
Bengaluru
Work from Office
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up . Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasnt happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Targets global team and has more than 4,000 team members supporting the companys global strategy and operations. Target as a tech companyAbsolutely. Were the behind-the-scenes powerhouse that fuels Targets passion and commitment to cutting-edge innovation. We anchor every facet of one of the worlds best-loved retailers with a strong technology framework that relies on the latest tools and technologiesand the brightest peopleto deliver incredible value to guests online and in stores. Target Technology Services is on a mission to offer the systems, tools and support that guests and team members need and deserve. Our high-performing teams balance independence with collaboration, and we pride ourselves on being versatile, agile and creative. We drive industry-leading technologies in support of every angle of the business, and help ensure that Target operates smoothly, securely and reliably from the inside out. Role overview As a Lead Engineer, you serve as the technical anchor for the engineering team that supports a product. 
You create, own and are responsible for the application architecture that best serves the product in its functional and non-functional needs. You identify and drive architectural changes to accelerate feature development or improve the quality of service (or both). You have deep and broad engineering skills and are capable of standing up an architecture in its whole on your own, but you choose to influence a wider team by acting as a force multiplier. Core responsibilities of this job are described within this job description. Job duties may change at any time due to business needs. Use your skills, experience and talents to be a part of groundbreaking thinking and visionary goals. As a Lead Engineer, youll take the lead as youUse your technology acumen to apply and maintain knowledge of current and emerging technologies within specialized area(s) of the technology domain. Evaluate new technologies and participates in decision-making, accounting for several factors such as viability within Targets technical environment, maintainability, and cost of ownership. Initiate and execute research and proof-of-concept activities for new technologies. Lead or set strategy for testing and debugging at the platform or enterprise level. In complex and unstructured situations, serve as an expert resource to create and improve standards and best practices to ensure high-performance, scalable, repeatable, and secure deliverables. Lead the design, lifecycle management, and total cost of ownership of services. Provide the team with thought leadership to promote re-use and develop consistent, scalable patterns. Participate in planning services that have enterprise impact. Provide suggestions for handling routine and moderately complex technical problems, escalating issues when appropriate. 
Gather information, data, and input from a wide variety of sources; identify additional resources when appropriate, engage with appropriate stakeholders, and conduct in-depth analysis of information. Develop plans and schedules, estimate resource requirements, and define milestones and deliverables. Monitor workflow and risks; play a leadership role in mitigating risks and removing obstacles. Lead and participate in complex construction, automation, and implementation activities, ensuring successful implementation with architectural and operational requirements met. Establish new standards and best practices to monitor, test, automate, and maintain IT components or systems. Serve as an expert resource in disaster recovery and disaster recovery planning. Stay current with Target's technical capabilities, infrastructure, and technical environment. Develop fully attributed data models, including logical, physical, and canonical. Influence data standards, policies, and procedures. Install, configure, and/or tune data management solutions with minimal guidance. Monitor data management solution(s) and identify optimization opportunities. About you: Bachelor's degree (or equivalent experience) in Computer Science, Engineering, or related field. 7+ years of hands-on software development experience, including at least one full-cycle project implementation. Expertise in Target's technology landscape, with a solid understanding of industry trends, competitors' products, and differentiating features. Proficient in Kotlin with advanced knowledge of Microservices architecture and Event-driven architectures. Strong experience with high-priority, large-scale applications capable of processing millions of records. Proven ability to design and implement highly scalable and observable systems.
Working on mission-critical applications with large transaction volumes and high throughput. Building systems that are scalable, with a focus on performance and resilience. Leveraging cutting-edge tools for data correlation and pattern analysis. Experience with Scala, Hadoop, and other Big Data technologies is preferred. Strong retail domain knowledge with experience working on multi-channel platforms. Hands-on experience with high-performance messaging platforms that are highly scalable. Useful Links: Life at Target: https://india.target.com/ Benefits: https://india.target.com/life-at-target/workplace/benefits Culture: https://india.target.com/life-at-target/belonging
Posted 2 weeks ago
8.0 - 12.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Lead Data Analyst. As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. As a part of the Merchandising Analytics and Insights team, our analysts work closely with business owners as well as technology and data product teams staffed with product owners and engineers. They support all Merchandising strategic initiatives with data, reporting and analysis. Merchandising teams rely on this team of analysts to bring data to support decision making. PRINCIPAL DUTIES AND RESPONSIBILITIES: As a Lead Data Analyst, your responsibilities will include exploring data, technologies, and the application of mathematical techniques to derive business insights. Data analysts spend their time determining the best approach to gather, model, manipulate, analyze and present data. You will lead an agile team, which requires active participation in ceremonies and team meetings.
In this role, you'll have opportunities to continuously upskill to stay current with new technologies in the industry via formal training, peer training groups and self-directed education. This role specifically will support new initiatives requiring new data and metric development. Your curiosity and ability to roll up guest-level insights will be critical. The team will leverage in-store analytic data to measure the impact that promotions, product placement, or physical changes at any given store have on store operations, guest experience and purchase decisions. This new capability is still being defined, which allows for creative thinking and leadership opportunities. Job duties may change at any time due to business needs. Key Responsibilities*: Work with MC, Merch, Planning, Marketing, Digital, etc. teams to use data to identify opportunities, bridge gaps, and build advanced capabilities. Partner with Product, DS, DE, etc. to determine the best approach to gather, model, manipulate, analyze and present data. Develop data and metrics to support key business strategies, initiatives, and decisions. Explore data, technologies, and the application of mathematical techniques to derive business insights. Desired Skills & Experiences*: Ability to break down complex problems, identify the root cause of an issue, and develop sustainable solutions. Ability to influence cross-functional teams and partners at multiple levels of the organization. Possess analytical skills (SQL, Python, R, etc.) to find, manipulate, and present data in meaningful ways to clients. Desire to continuously upskill to stay current with new technologies in the industry via formal training, peer training groups and self-directed education. About you (in technical terms): B.E/B.Tech, M.Tech, M.Sc., MCA - overall 8-12 years of experience, and 6-8 years of data ecosystem experience. Strong architect of data capabilities and analytical tools.
Proven experience architecting enterprise-level data warehouse solutions and BI implementations across Domo, Tableau and other visualization tools. Provide expertise and the ability to train and guide the team to implement top design architectures to build next-generation analytics. Deep Big Data experience: solid experience in the Hadoop ecosystem and its components, writing Map-Reduce programs, developing Hive and PySpark SQL, and designing and developing Oozie workflows. Hands-on experience in object-oriented or functional programming such as Scala and/or Python/R or other open-source languages. Strong foundational mathematics and statistics. Experience in analytical techniques like linear and non-linear regression, logistic regression, time-series models, classification techniques, etc. (In terms of soft skills for a lead role): Strong stakeholder management with product teams and business leaders. Strong problem-solving and analytical skills, and the ability to manage ambiguity. Ability to communicate results of complex analytic findings to both technical and non-technical audiences and business leaders. Ability to lead change, and work through conflict and setbacks. Experience working in an agile environment (stories, backlog refinement, sprints, etc.). Excellent attention to detail and timelines. Strong sense of ownership. Desire to continuously upskill to stay current with new technologies in the industry via formal training, peer training groups and self-directed education. Useful Links: Life at Target: https://india.target.com/ Benefits: https://india.target.com/life-at-target/workplace/benefits Culture: https://india.target.com/life-at-target/belonging
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Pune
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Key skills: Azure Data Factory (primary), Azure Databricks Spark (PySpark, SQL). Experience: 5 to 10 years. Must-have skills: Cloud certified in one of these categories: Azure Data Engineer. Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, data ingestion, curation. Semantic modelling / optimization of data model to work within Rahona. Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle. Experience in Sqoop / Hadoop. Microsoft Excel (for metadata files with requirements for ingestion). Any other certificate in Azure/AWS/GCP and hands-on data engineering experience in cloud. Strong programming skills with at least one of Python, Scala or Java. Strong SQL skills (T-SQL or PL-SQL). Data file movement via mailbox. Source-code versioning/promotion tools, e.g. Git/Jenkins. Orchestration tools, e.g. Autosys, Oozie. Nice-to-have skills: Experience working with mainframe files. Experience in an Agile environment, JIRA/Confluence tools.
Posted 2 weeks ago
7.0 - 12.0 years
11 - 15 Lacs
Gurugram
Work from Office
Project description: We are looking for an experienced Data Engineer to contribute to the design, development, and maintenance of our database systems. This role will work closely with our software development and IT teams to ensure the effective implementation and management of database solutions that align with the client's business objectives. Responsibilities: The successful candidate would be responsible for managing technology in projects and providing technical guidance/solutions for work completion: (1.) To be responsible for providing technical guidance/solutions (2.) To ensure process compliance in the assigned module and participate in technical discussions/reviews (3.) To prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations (4.) Being self-organized and focused on delivering quality software on time. Skills: Must have: At least 7 years of experience in development on data-specific projects. Must have working knowledge of the streaming data Kafka framework (kSQL, MirrorMaker, etc.). Strong programming skills in at least one of these programming languages: Groovy/Java. Good knowledge of data structures, ETL design, and storage. Must have worked in streaming data environments and pipelines. Experience working in near-real-time/streaming data pipeline development using Apache Spark, StreamSets, Apache NiFi or similar frameworks. Nice to have: N/A
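The near-real-time pattern this posting describes (consume from a topic, transform, emit downstream) can be sketched without a Kafka dependency. Below, a stdlib thread-safe queue stands in for a topic, and the event fields and tax enrichment are invented for the example; a real pipeline would use a Kafka client with kSQL or MirrorMaker alongside it:

```python
import queue
import threading

# A toy stand-in for a Kafka topic: a thread-safe queue carrying
# events from a producer thread to a streaming transform.
topic = queue.Queue()
results = []

def producer():
    for i in range(5):
        topic.put({"id": i, "amount": i * 10})
    topic.put(None)  # sentinel marking end of stream

def consumer():
    # Consume until the sentinel; enrich each event before "storage".
    while (event := topic.get()) is not None:
        event["amount_with_tax"] = round(event["amount"] * 1.18, 2)
        results.append(event)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(results))  # 5 enriched events
```

The shape (decoupled producer/consumer, in-order delivery, a transform per event) is what carries over to Spark Streaming, StreamSets, or NiFi.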
Posted 3 weeks ago
5.0 - 8.0 years
4 - 7 Lacs
Pune
Work from Office
JC-67103 Band - B2, B3. Location - Chennai, Coimbatore, Bangalore, Pune. Key skills: Azure Data Factory (primary), Azure Databricks Spark (PySpark, SQL). Experience - 7 to 10 years. Must-have skills: Cloud certified in one of these categories: Azure Data Engineer. Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, data ingestion, curation. Semantic modelling / optimization of data model to work within Rahona. Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle. Experience in Sqoop / Hadoop. Microsoft Excel (for metadata files with requirements for ingestion). Any other certificate in Azure/AWS/GCP and hands-on data engineering experience in cloud. Strong programming skills with at least one of Python, Scala or Java. Strong SQL skills (T-SQL or PL-SQL). Data file movement via mailbox. Source-code versioning/promotion tools, e.g. Git/Jenkins. Orchestration tools, e.g. Autosys, Oozie. Nice-to-have skills: Experience working with mainframe files. Experience in an Agile environment, JIRA/Confluence tools. Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.
Posted 3 weeks ago
3.0 - 8.0 years
5 - 8 Lacs
Mumbai
Work from Office
Role Overview: Seeking an experienced Apache Airflow specialist to design and manage data orchestration pipelines for batch/streaming workflows in a Cloudera environment. Key Responsibilities: * Design, schedule, and monitor DAGs for ETL/ELT pipelines * Integrate Airflow with Cloudera services and external APIs * Implement retries, alerts, logging, and failure recovery * Collaborate with data engineers and DevOps teams Required education: Bachelor's Degree Preferred education: Master's Degree Required technical and professional expertise / Skills Required: * Experience: 3-8 years * Expertise in Airflow 2.x, Python, Bash * Knowledge of CI/CD for Airflow DAGs * Proven experience with Cloudera CDP, Spark/Hive-based data pipelines * Integration with Kafka, REST APIs, databases
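In Airflow the retry/alert responsibility above is declared on the task (e.g. `retries=3`, `retry_delay=timedelta(minutes=5)`); this stdlib-only sketch just shows the behaviour those settings buy you. `extract_partition` is a hypothetical task body, not a real API:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(task, retries=3, delay=0.01):
    """Run `task`, retrying on failure with a fixed delay, logging each
    attempt; raise (and alert) only once retries are exhausted."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                log.error("task exhausted retries; alerting on-call")
                raise
            time.sleep(delay)  # fixed backoff; Airflow also supports exponential

calls = {"n": 0}
def extract_partition():
    # Hypothetical flaky extract: fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source temporarily unavailable")
    return "partition loaded"

print(run_with_retries(extract_partition))  # succeeds on the 3rd attempt
```

In a real DAG you would not write this loop yourself; you would set the equivalent keyword arguments on the operator and let the scheduler handle recovery.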
Posted 3 weeks ago
8.0 - 13.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering. Service Line: Strategic Technology Group. Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will be working on complex engineering projects, platforms and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be Polyglots. They are go-getters with a drive to solve end-customer challenges, and will spend most of their time designing and coding. End-to-end contribution to technology-oriented development projects. Providing solutions with minimum system requirements, in Agile mode. Collaborate with Power Programmers, the open-source community, and tech user groups. Custom development of new platforms, solutions and opportunities. Work on large-scale digital platforms and marketplaces. Work on complex engineering projects using cloud-native architecture. Work with innovative Fortune 500 companies in cutting-edge technologies. Co-create and develop new products and platforms for our clients. Contribute to open source and continuously upskill in the latest technology areas. Incubating tech user groups. Technical and Professional: Big Data - Spark, Scala, Hive, Kafka. Preferred Skills: Technology-Big Data-Hbase, Technology-Big Data-Sqoop, Technology-Java-Apache-Scala, Technology-Functional Programming-Scala, Technology-Big Data - Data Processing-Map Reduce, Technology-Big Data - Data Processing-Spark
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking a skilled Big Data Developer with 3+ years of experience to develop, maintain, and optimize large-scale data pipelines using frameworks like Spark, PySpark, and Airflow. The role involves working with SQL, Impala, Hive, and PL/SQL for advanced data transformations and analytics, designing scalable data storage systems, and integrating structured and unstructured data using tools like Sqoop. The ideal candidate will collaborate with cross-functional teams to implement data warehousing strategies and leverage BI tools for insights. Proficiency in Python programming, workflow orchestration with Airflow, and Unix/Linux environments is essential. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
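The "advanced data transformations" layer this posting describes (Hive/Impala/PL-SQL) can be illustrated with a toy aggregation; stdlib `sqlite3` stands in for the warehouse engine, and the table and column names are invented for the example:

```python
import sqlite3

# Raw order rows rolled up into a per-day summary: the kind of
# GROUP BY transformation a warehouse pipeline step performs.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_day TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("2024-01-01", 10.0), ("2024-01-01", 15.5), ("2024-01-02", 7.25)],
)
rows = con.execute(
    "SELECT order_day, COUNT(*), SUM(amount) "
    "FROM orders GROUP BY order_day ORDER BY order_day"
).fetchall()
print(rows)  # [('2024-01-01', 2, 25.5), ('2024-01-02', 1, 7.25)]
con.close()
```

On the job the same statement would target a partitioned Hive table via Spark SQL or Impala; the SQL itself carries over almost unchanged.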
Posted 3 weeks ago
4.0 - 9.0 years
10 - 12 Lacs
Bengaluru, Doddakannell, Karnataka
Work from Office
We are seeking a highly skilled Data Engineer with expertise in ETL techniques, programming, and big data technologies. The candidate will play a critical role in designing, developing, and maintaining robust data pipelines, ensuring data accuracy, consistency, and accessibility. This role involves collaboration with cross-functional teams to enrich and maintain a central data repository for advanced analytics and machine learning. The ideal candidate should have experience with cloud-based data platforms, data modeling, and data governance processes. Location - Bengaluru, Doddakannell, Karnataka, Sarjapur Road
Posted 3 weeks ago
3.0 - 6.0 years
25 - 30 Lacs
Chennai
Work from Office
Zalaris is looking for Senior Data Engineer to join our dynamic team and embark on a rewarding career journey Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 3 weeks ago
3.0 - 6.0 years
25 - 30 Lacs
Pune
Work from Office
Diverse Lynx is looking for Data Engineer to join our dynamic team and embark on a rewarding career journey Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
New Delhi, Ahmedabad, Bengaluru
Work from Office
We are seeking a skilled Big Data Developer with 3+ years of experience to develop, maintain, and optimize large-scale data pipelines using frameworks like Spark, PySpark, and Airflow. The role involves working with SQL, Impala, Hive, and PL/SQL for advanced data transformations and analytics, designing scalable data storage systems, and integrating structured and unstructured data using tools like Sqoop. The ideal candidate will collaborate with cross-functional teams to implement data warehousing strategies and leverage BI tools for insights. Proficiency in Python programming, workflow orchestration with Airflow, and Unix/Linux environments is essential. Location: Remote - Delhi / NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 3 weeks ago
5.0 - 10.0 years
15 - 27 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking a highly motivated Senior Data Engineer with expertise in designing, building, and securing data systems. The ideal candidate will have a strong background in data engineering, security compliance, and distributed systems, with a focus on ensuring adherence to industry standards and regulatory requirements. Location: Bangalore. Experience: 4 to 13 years. Must have: Informatica BDM, Oozie scheduling, Hive, HDFS. Key Responsibilities: Design, implement, and maintain secure data systems, including wrapper solutions for components with minimal security controls, ensuring compliance with bank standards. Identify security design gaps in existing and proposed architectures and recommend enhancements to strengthen system resilience. Develop and enforce security controls for data transfers, including CRON, ETLs, and JDBC-ODBC scripts. Ensure compliance with data sensitivity standards, such as avoiding storage of card numbers or PII in logs, and maintaining data integrity. Collaborate on distributed systems, focusing on resiliency, monitoring, and troubleshooting in production environments. Work with Agile/DevOps practices, CI/CD pipelines (GitHub, Jenkins), and scripting tools to optimize data workflows. Troubleshoot and resolve issues in large-scale data infrastructures, including SQL/NoSQL databases, HDFS, Hive, and HQL. Requirements: 5+ years of total experience, with 4+ years in Informatica Big Data Management. Extensive knowledge of Oozie scheduling, HQL, Hive, HDFS, and data partitioning. Proficiency in SQL and NoSQL databases, along with Linux OS configuration and shell scripting. Strong understanding of networking concepts (DNS, Proxy, ACL, Policy) and data transfer security. In-depth knowledge of compliance and regulatory requirements (encryption, anonymization, policy controls). Familiarity with Agile/DevOps, CI/CD, and distributed systems monitoring. Ability to address data sensitivity concerns in logging, events, and in-memory storage.
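The "no card numbers in logs" control mentioned above amounts to masking anything PAN-shaped before a line is written out. A minimal sketch: the regex catches 13-16 digit runs (with optional spaces/hyphens) and keeps only the last four digits. This is a simplification (no Luhn check, no tokenization); a production control would be stricter:

```python
import re

# Matches 13-16 digits, optionally separated by spaces or hyphens,
# capturing the final four digits so they can be kept for support use.
PAN_RE = re.compile(r"\b(?:\d[ -]?){9,12}(\d{4})\b")

def sanitize(line: str) -> str:
    """Mask anything that looks like a card number before logging."""
    return PAN_RE.sub(r"****\1", line)

print(sanitize("payment ok card=4111 1111 1111 1111 user=42"))
# payment ok card=****1111 user=42
```

The same filter would typically be installed as a logging `Filter` or formatter so every handler applies it, rather than relying on each call site to remember.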
About Us For a customer in the banking sector with financial services requirements, we worked on Informatica Big Data Management, Oozie, Hive, and security compliance frameworks. Contact [dlt] and [slt] for more details.
Posted 3 weeks ago