3.0 - 7.0 years
5 - 9 Lacs
Mumbai
Work from Office
We are looking for a highly skilled Data Scientist with 7 to 10 years of experience to join our team as a Senior Process Manager at eClerx Services Ltd., located in Mumbai.
Roles and Responsibilities:
- Develop and implement data-driven solutions to enhance business processes.
- Collaborate with cross-functional teams to identify areas for improvement and optimize workflows.
- Design and maintain databases and data systems to support business intelligence initiatives.
- Analyze complex data sets to inform strategic decisions and drive business growth.
- Develop and maintain reports and dashboards to track key performance indicators.
- Ensure data quality and integrity by implementing data validation and testing procedures.
Job Requirements:
- Strong understanding of data analysis, machine learning, and statistical modeling techniques.
- Experience with data visualization tools such as Tableau or Power BI.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication and collaboration skills to work effectively with stakeholders.
- Ability to design and implement process improvements to increase efficiency and productivity.
- Strong knowledge of data management principles and practices.
Posted 2 weeks ago
3.0 - 5.0 years
6 - 10 Lacs
Chennai
Work from Office
Oracle Master Data Management
Role Purpose: The Device Launch Readiness Team in the client's Xfinity Mobile organization (Technology & Product, Wireless Technologies & New Business) manages master data for mobile devices, mobile device accessories, packaging, and Xfinity Home products. The person in this position will support the SKU lifecycle management process.
Core Responsibilities:
- Manage device and accessories master data.
- Write complex SQL to query large data platforms for analysis (see the sketch following this listing).
- Perform queries and create new datasets.
- Analyze and package data to create or update records.
- Clean and parse data and make it available to groups of users.
- Deep dive into data to understand business drivers and problems.
- Update Jira for completed activities and report to users/manager.
- Support production and stage migrations.
General Skillsets:
- 3-5 years of experience in RDBMS.
- Working experience in the mobile device/service domain.
- Knowledge of mobile business acronyms.
- Advanced Excel skills, including macros, VLOOKUP, and formula accuracy.
Other Expectations: Understand our Comcast Operating Principles; make them the guidelines for how you do your job. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Drive results and growth.
Mandatory Skills: Oracle Master Data Management - MDM. Experience: 3-5 years.
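As an illustration of the SQL-driven analysis and dataset preparation this role describes, here is a minimal Python sketch; the connection string, table, and column names are assumptions for illustration, not part of the posting.

```python
# Minimal sketch: query a device master table, clean the results, and
# write a new dataset for downstream users. Table/column names are illustrative.
import pandas as pd
from sqlalchemy import create_engine

# Assumed connection details; replace with the real data platform's URL.
engine = create_engine("oracle+oracledb://user:password@host:1521/?service_name=MDM")

query = """
    SELECT sku, device_name, manufacturer, launch_date, status
    FROM device_master
    WHERE status IN ('ACTIVE', 'PENDING_LAUNCH')
"""

df = pd.read_sql(query, engine)

# Basic cleanup: trim whitespace, normalize manufacturer names, drop duplicate SKUs.
df["device_name"] = df["device_name"].str.strip()
df["manufacturer"] = df["manufacturer"].str.strip().str.title()
df = df.drop_duplicates(subset="sku")

# Package the cleaned dataset for other users.
df.to_csv("device_master_clean.csv", index=False)
print(f"Wrote {len(df)} cleaned device records")
```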
Posted 2 weeks ago
2.0 - 7.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Work with Us. Change the World. At AECOM, we're delivering a better world. Whether improving your commute, keeping the lights on, providing access to clean water, or transforming skylines, our work helps people and communities thrive. We are the world's trusted infrastructure consulting firm, partnering with clients to solve the world's most complex challenges and build legacies for future generations. There has never been a better time to be at AECOM. With accelerating infrastructure investment worldwide, our services are in great demand. We invite you to bring your bold ideas and big dreams and become part of a global team of over 50,000 planners, designers, engineers, scientists, digital innovators, program and construction managers and other professionals delivering projects that create a positive and tangible impact around the world. We're one global team driven by our common purpose to deliver a better world. Join us.
AECOM is seeking a Graduate Environmental Data Specialist with 2+ years of experience to support our enterprise environmental data management system (EarthSoft EQuIS). The ideal candidate will have a strong understanding of environmental data and terminology, good communication skills, and the ability to collaborate with both technical and non-technical stakeholders. This position offers a hybrid work arrangement, combining office and remote work schedules, and will be based in our Bengaluru, India office. This role includes, but is not limited to, the following activities:
Role and Responsibilities:
- Understand requests from environmental subject matter experts, communicate new functions and features to users, and maintain a good understanding of environmental data and its terminology.
- Work on issues of diverse scope where analysis of situations or data requires evaluation of a variety of factors, including an understanding of current business trends.
- Prepare and update environmental reports, drawing on a sound understanding of environmental data and the ability to transform and analyze large and diverse environmental datasets.
- Translate environmental problems into digital and data solutions.
- Demonstrate commitment to data quality at all levels and scales.
- Develop custom reports and user-requested queries and views on the platforms listed in the desired skill set.
- Be responsive to client (user) requests; excellent communication skills.
- Provide technical support to field sampling teams and act as a liaison between project staff, the analytical laboratory, the data validator, and GIS analysts.
- Research state and federal regulations necessary to manage action levels or clean-up criteria.
Professional Qualification & Experience Desired: Bachelor's degree in environmental/civil/chemical engineering or science in a related discipline, with a focus on environmental data, and 2+ years of experience working in the environmental domain, preferably with relevant environmental data experience.
Skills Required:
- Ability to manage data using excellent computer skills, performing transformations in spreadsheets and databases.
- Expertise and experience with environmental data and database systems (MS SQL Server, MS Access).
- Expertise with relational databases such as EarthSoft's Environmental Quality Information System (EQuIS) / EIM / ESdat.
- Ability to continually analyze data at all stages for problems, logic, and consistency concerning field data collection and analytical reporting; other expertise with EQuIS sub-tools (Collect, Edge, ArcGIS) is highly desirable but not essential.
- Assist projects globally and deliver tasks with high quality and within deadlines.
- Manage data (geological, field, and chemical laboratory data) for technical report writing and interpretation as required by the team.
- Maintain and update project dashboards using the web-based EQuIS Enterprise system, and prepare report-ready data tables, charts, and figures for internal review and external client reports.
- Use of visualization tools such as Power BI to help management make effective decisions in the environmental domain is desirable but not essential.
- Programming and/or coding experience (e.g., Python, R) is a plus.
- Understanding of data engineering, AI/ML, and data science is highly desirable but not essential; this can come from academic or work experience.
- Intermediate to expert-level understanding of Office 365, Excel, Power Query, and Power Automate.
- Strong attention to detail with excellent analytical, judgment, and problem-solving capabilities.
- Comfortable running meetings and presentations.
- Strong written and oral communication skills.
Preferred: Master's degree in environmental/civil/chemical engineering or science in a related discipline, with a focus on environmental data, and a minimum of 2-5 years of experience working in the environmental domain, preferably with relevant environmental data experience.
Additional Information
Posted 2 weeks ago
3.0 - 5.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Skill required: Delivery - Sales Reporting. Designation: I&F Decision Sci Practitioner Analyst. Qualifications: Any Graduation. Years of Experience: 3 to 5 years.
About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com
What would you do: Within Operations, our Sales Operations Data Analytics, Reporting & Insights team is rapidly expanding. We take a bold and modern approach to sales excellence, combining deep operational expertise, cutting-edge technology, and data-driven insights to optimize sales performance and empower go-to-market teams to succeed. As a key member of the Sales Reporting team, you will lead a cross-functional team of data analytics, visualization, and data modeling experts working within the Sales Operations space. The ideal candidate brings a strong background in business intelligence, reporting, visualization, and advanced analytics in the sales domain, has a proven track record in project delivery, stakeholder management, and team leadership, and will adeptly balance hands-on reporting and analytics tasks with team management.
What are we looking for:
- Sales Operations Functional Expertise: Deep understanding of end-to-end sales lifecycle processes including Sales Support, Pricing & Quoting, Bid & Proposal Management, Contract Lifecycle, Order Management, and Incentives.
- Sales Transformation: Proven experience in leading or supporting large-scale sales operations transformation projects, preferably for external clients, with a focus on process standardization, consolidation, and operating model redesign.
- Sales Analytics & Insights, Sales Reporting and Visualization: Strong skills in sales data analysis, business intelligence, and visualization; ability to generate actionable insights and drive decision-making.
- Stakeholder Management, Consulting & Communication Skills.
- Ability to work well in a team.
- Innovative & Future-Focused.
Roles and Responsibilities:
- Ensure high-quality delivery of reports, dashboards, and data models.
- Ensure data quality, governance, and secure access across platforms.
- Author, review, and validate dashboards, visual designs, and model outcomes.
- Encourage adoption of best practices in code management, versioning, and documentation.
- Manage and mentor a high-performing team of SQL, Python, Power Platform, and Power BI (or similar platform) developers and data analytics practitioners.
- Proactively lead work allocation, delivery timelines, data quality, review metrics, and KPIs.
- Manage multiple projects and tasks while maintaining high attention to detail.
- Provide technical and functional guidance to team members.
- Conduct regular performance reviews, skill development sessions, and team health checks.
- Collaborate with senior business leaders, stakeholders, and global teams to define problem statements, solutions, KPIs, and success metrics.
- Serve as the client-facing SME for all analytics and visualization needs in the sales domain.
- Communicate data analytics and visualizations effectively through storytelling.
- Ensure proactive communication of status, updates, asks, achievements, and challenges.
Qualification: Any Graduation
Posted 2 weeks ago
0.0 - 3.0 years
2 - 6 Lacs
Pune
Work from Office
About The Role: eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The role also involves creating technical specifications and product descriptions for online presentation, and working on consultancy projects to redesign e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area is a plus.
Apprentice_Analyst - Roles and responsibilities:
- Data enrichment/gap fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc.
- Data quality checks and correction.
- Data profiling and reporting (basic).
- Email communication with the client on request acknowledgment, project status, and responses to queries.
- Help customers enhance their product data quality from the technical specification and description perspective.
- Provide technical consulting to the customer's category managers around industry best practices for product data enhancement.
Technical and Functional Skills:
- Bachelor's Degree (Any Graduate).
- Good understanding of tools and technology.
- Intermediate knowledge of MS Office/Internet.
Posted 2 weeks ago
0.0 - 3.0 years
2 - 6 Lacs
Mumbai
Work from Office
About The Role: eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The role also involves creating technical specifications and product descriptions for online presentation, and working on consultancy projects to redesign e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area is a plus.
Apprentice_Analyst - Roles and responsibilities:
- Data enrichment/gap fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc.
- Data quality checks and correction.
- Data profiling and reporting (basic).
- Email communication with the client on request acknowledgment, project status, and responses to queries.
- Help customers enhance their product data quality from the technical specification and description perspective.
- Provide technical consulting to the customer's category managers around industry best practices for product data enhancement.
Technical and Functional Skills:
- Bachelor's Degree (Any Graduate).
- Good understanding of tools and technology.
- Intermediate knowledge of MS Office/Internet.
Posted 2 weeks ago
7.0 - 12.0 years
12 - 22 Lacs
Mumbai
Work from Office
Job Name: (Digital Banking) Associate Data Analyst. Location: Mumbai. Grade: Senior Manager / AVP.
We are looking for a Business Analyst working in an RBI-regulated sector (bank or lending NBFC) or at a consulting firm working on banking data, with experience in business credit risk.
Predominant Skills: data quality and remediation processes (databases, SQL, and Python); data visualisation skills (dashboards, Tableau, Power BI); Informatica Data Quality; basic understanding of data lakes and cloud environments.
Job Purpose: HDFC Bank has a huge volume of data, both structured and unstructured, and we are focused on creating assets out of data and deriving the best value from that data for the Bank. The Data Remediation and DaaS specialist will be responsible for improving customer data quality through various internal data remediation methodologies. This role will also focus on designing, implementing, and maintaining global and local data marts on the Bank's Data Lake to support business, marketing, analytics, regulatory, and other functional use cases. The role is crucial in ensuring high-quality customer data while enabling business functions with reliable and well-structured data marts. The ideal candidate will have a passion for data quality, strong technical skills, and a strategic mindset to drive data-driven decision-making across the Bank.
Role & Responsibilities:
Customer Data Quality Management:
- Analyze and assess data quality issues in customer records.
- Implement data cleansing, standardization, and deduplication strategies (a sketch follows this listing).
- Monitor and improve the accuracy, completeness, and consistency of customer data.
Formulate Data Remediation Strategies:
- Conduct root cause analysis to identify sources of poor data quality.
- Coordinate with internal stakeholders to drive data improvement initiatives.
Data Mart Development & Maintenance:
- Engage with business, product, credit, risk, analytics, marketing, finance, BIU, and other stakeholders to discover data mart requirements and the current challenges faced.
- Provide inputs and recommendations on continuous improvement of policies, procedures, processes, standards, and controls pertaining to data marts.
- Quantify the impact, in business value terms (revenue/cost/loss), of launching global and local data marts.
Experience Required:
- 5-7 years of total work experience in data quality / data product creation.
- 5+ years of experience in banking and financial services.
- Experience working in a large, multi-functional, matrix organization.
- Strong technical and functional understanding of data remediation and data products, including staging, mapping, cleanse functions, match rules, validation, trust scores, remediation techniques, mart creation methodologies, best practices, etc.
- Experience with industry-leading master data/metadata/data quality suites, such as Informatica Data Quality.
- Exposure to working in a cloud environment will be an added advantage.
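A minimal sketch of the customer-record cleansing, standardization, and deduplication work described above, using pandas; the column names and the match rule are illustrative assumptions, not the Bank's actual remediation logic.

```python
# Illustrative customer data remediation sketch (assumed columns: cust_id, name, email, mobile).
import pandas as pd

customers = pd.DataFrame({
    "cust_id": [101, 102, 103, 104],
    "name": [" Asha Rao", "asha rao", "Vikram Shah ", "Vikram  Shah"],
    "email": ["asha@example.com", "ASHA@EXAMPLE.COM", None, "vikram@example.com"],
    "mobile": ["9876543210", "9876543210", "9123456780", "9123456780"],
})

# Standardize: trim/collapse whitespace, title-case names, lower-case emails.
customers["name"] = customers["name"].str.strip().str.replace(r"\s+", " ", regex=True).str.title()
customers["email"] = customers["email"].str.lower()

# Simple completeness check before deduplication.
missing_email = customers["email"].isna().sum()
print(f"Records missing email: {missing_email}")

# Deduplicate on a simple match rule: same normalized name + mobile number.
deduped = customers.drop_duplicates(subset=["name", "mobile"], keep="first")
print(deduped)
```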
Posted 2 weeks ago
0.0 - 3.0 years
4 - 5 Lacs
Pune
Work from Office
Job Description:
Essential Job Functions:
- Contribute to data engineering tasks and projects, including data processing and data integration.
- Support data pipeline development and maintenance.
- Collaborate with colleagues to meet data requirements and ensure data quality.
- Assist in data analysis for insights and reporting.
- Follow data engineering standards and best practices.
- Pursue opportunities for continuous learning and growth in the data engineering domain.
- Learn from experienced data engineers and analysts within the team.
- Use data engineering tools and techniques to accomplish tasks.
Basic Qualifications:
- Bachelor's degree in a relevant field or equivalent combination of education and experience.
- Typically, 2+ years of relevant work experience.
- Proven experience in data engineering.
- Proficiency in data engineering tools and technologies.
- A continuous learner who stays abreast of industry knowledge and technology.
Other Qualifications:
- Advanced degree in a relevant field a plus.
- Relevant certifications, such as Google Cloud Professional Data Engineer or Cloudera Certified Data Analyst, a plus.
At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.
Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites or unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor does it ask a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
Posted 2 weeks ago
0.0 - 4.0 years
5 - 9 Lacs
Chennai
Work from Office
The primary expectation for this role as a Linguist on the linguistics team is proficiency in Portuguese, enabling you to effectively manage, develop, and optimize linguistic resources. Your role will be to foster this language and develop it for a multitude of products delivered to customers. Your job will be to build and maintain the language per Lightcast standards and help in the development of further features. To fill this role we are looking for a dynamic and multilingual person who will quickly learn the ins and outs of the role in order to become an active part of a multicultural team.
Major Responsibilities:
- Analyze and improve data quality of multilingual text classifiers.
- Translate various taxonomies such as Skills, Titles, and Occupations.
- Annotate data used for model training and validation.
Education and Experience:
- Bachelor's degree in Linguistics, Data Analytics, Engineering, Computer Science, Statistics, Artificial Intelligence, NLP, or similar.
- Strong linguistics knowledge.
Skills/Abilities:
- Understanding of syntax and structural analysis of languages.
- Microsoft Excel experience (including VLOOKUPs, data cleanup, and functions).
- Experience with data analysis using tools such as Excel.
- Knowledge of RegEx is preferred (see the sketch following this listing).
Lightcast is a global leader in labor market insights, headquartered in Moscow, Idaho, with offices in the United Kingdom, Europe, and India. We work with partners across six continents to help drive economic prosperity and mobility by providing the insights needed to build and develop our people, our institutions and companies, and our communities.
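Since the posting calls out RegEx knowledge, here is a small, hedged Python sketch of the kind of pattern-based cleanup a linguist might apply when normalizing Portuguese taxonomy labels before annotation; the patterns and sample strings are illustrative assumptions.

```python
# Illustrative regex cleanup for taxonomy/title labels before annotation.
import re

raw_titles = ["Engenheiro(a) de Dados  Jr.", "Analista de BI - Sr ", "Desenvolvedor(a)   Python"]

def normalize_title(title: str) -> str:
    title = title.strip()
    title = re.sub(r"\(a\)", "", title)      # drop gender markers like "(a)"
    title = re.sub(r"\s*-\s*", " ", title)   # unify hyphen separators
    title = re.sub(r"\s{2,}", " ", title)    # collapse repeated whitespace
    return title

print([normalize_title(t) for t in raw_titles])
# e.g. ['Engenheiro de Dados Jr.', 'Analista de BI Sr', 'Desenvolvedor Python']
```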
Posted 2 weeks ago
1.0 - 7.0 years
12 - 16 Lacs
Hyderabad
Work from Office
A Day in the Life - Careers that Change Lives: At Medtronic, we push the limits of technology to make tomorrow better than today, which makes it an exciting and rewarding place to work. We value what makes you unique. Be a part of a company that thinks differently to solve problems, make progress, and deliver meaningful innovations. As a Data Engineer II, you will be part of our data engineering team responsible for developing, deploying, monitoring, and supporting the data mart platform. In addition, you will be responsible for creating tools and automating operational tasks to integrate the data platform with external systems. Your entrepreneurial mindset and technical skills will be used to create solutions that meet business needs and optimize customer experience, directly impacting the organization and affecting the lives of millions. We believe that when people from diverse cultures, genders, and points of view come together, innovation is the result and everyone wins. Medtronic walks the walk, creating an inclusive culture where you can thrive.
A Day in the Life: In general, the following responsibilities apply for the Data Engineer II role. This includes, but is not limited to, the following:
- Work effectively within geographically dispersed and cross-functional teams during all phases of the product development process.
- Be responsive, flexible, self-motivated, and able to succeed within an open, collaborative peer environment.
- Participate in reviews and code inspections, and support the development of required documentation.
- Be agile and effectively navigate through changing project priorities.
- Work independently under limited supervision.
- Set up proactive monitoring and alerting.
- Troubleshoot production issues.
Qualifications - Must Have (Minimum Requirements; to be considered for this role, please be sure the minimum requirements are evident on your resume):
- Overall 4-7 years of IT experience with a Bachelor's degree in Computer Engineering, Software Engineering, Computer Science, Electrical Engineering, or a related technical field.
- Minimum 3 years of relevant experience in data engineering.
- Minimum 2 years of working experience in PySpark and other data processing tools such as Hive, Sqoop, etc. (a PySpark sketch follows this listing).
- Minimum 1 year of experience in AWS and AWS native tools: S3, Glue, Lambda, EMR, Athena.
- Minimum 1 year of hands-on experience with programming languages such as Python.
- Strong expertise in writing SQL queries.
- Experience with source control systems (Git/GitHub).
- Strong problem-solving skills.
- Experience in writing unit tests and developing data quality frameworks.
- Strong written and verbal communication and presentation skills.
Nice to Have:
- Previous healthcare industry experience a plus.
- Experience working with CI/CD tools, preferably Azure Pipelines and Terraform.
- AWS certifications (AWS Developer / AWS Data Engineer).
- Working experience in a reporting tool such as Power BI.
Benefits & Compensation: Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage. We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission to alleviate pain, restore health, and extend life unites a global team of 95,000+ passionate people. We are engineers at heart, putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves, and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary. Learn more about our business, mission, and our commitment to diversity here.
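A minimal PySpark sketch of the kind of data mart load this role describes (read raw data from S3, apply basic quality filtering, write a curated table); the bucket, paths, and column names are assumptions for illustration only.

```python
# Illustrative PySpark job: raw S3 data -> basic quality checks -> curated parquet output.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("device_events_mart").getOrCreate()

# Assumed input location and schema-on-read; replace with real paths.
raw = spark.read.json("s3://example-bucket/raw/device_events/")

curated = (
    raw
    .filter(F.col("event_id").isNotNull())            # drop incomplete records
    .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
    .dropDuplicates(["event_id"])                     # simple dedup rule
)

# Write a partitioned data mart table for downstream reporting.
(curated.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/device_events/"))
```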
Posted 2 weeks ago
6.0 - 11.0 years
25 - 27 Lacs
Noida
Work from Office
We are seeking a Tech Operations Lead for our Technology - Business Management Office group, intended to provide decision support and analytics primarily focused on IT Asset Management. This position will support business decisions by providing accurate information on hardware and software assets deployed and used by the organization, ensuring all assets are tracked to manage financial, legal, and compliance risks.
Responsibilities:
- Perform end-to-end lifecycle management of hardware and software assets, ensuring compliance with licensing terms and internal policies.
- Plan, monitor, and record software license and/or hardware assets in asset management tools to ensure they comply with vendor contracts.
- Develop and enforce asset tagging, tracking, and data reconciliation procedures while maintaining an accurate inventory of all hardware assets in the asset management tool.
- Design and deliver periodic and ad-hoc reports on asset utilization; generate weekly non-compliance reconciliation reports, publish monthly AMC and SLA reports, and manage daily machine and material records.
- Ensure proper hardware provisioning, deployment, maintenance, relocation, and disposal aligned with company standards and lifecycle policies.
- Interface with other support organizations to ensure effective use of the CMDB and Configuration Management System; maintain and recommend improvements to facilitate effective use and integrity of the CMDB.
- Ensure all changes to the CIs and the CMS are controlled, audited, and reported, and that the CMS is up to date.
- Ensure that CI owners maintain adequate Configuration Management process disciplines and systems for the CIs they own.
- Define and enhance the scheme for identifying hardware- and software-related assets as well as CIs, including versioning and dependencies, in the asset management tools, attributes, the contract management library, and the CMDB.
- Drive cost optimization strategies and identify opportunities for savings through effective license reuse, consolidation, and vendor negotiations.
- Onboard new software vendors for BAU governance by collaborating with Procurement and Line of Business Operations teams to create a baseline inventory of entitlements and deployments.
- Manage the lifecycle of hardware and software models in the DML, from their introduction to their retirement.
- Ensure data quality, audits of data, and interfaces between the tools, and provide reporting on asset management configuration items.
- Gather data and report the effectiveness of IT asset management processes using pre-defined KPIs/metrics.
- Assist stakeholders with solutions to business needs for hardware and software cascades and technology chargebacks.
- Create process guidelines, documentation, and procedures to mature the Ameriprise IT asset management area.
Experience:
- 7+ years of experience in hardware asset management and Software Asset Management (SAM), including standards, purchasing, and lifecycle practices.
- Experience with license management tools such as Flexera FNMS and ServiceNow SAM and HAM Pro is highly desirable.
- Configuration Management experience with document control, source code management, and defect management tools.
- Experience working in a multi-site environment.
Preferred Knowledge:
- Knowledge of IT asset management tools such as ServiceNow, Flexera, Aspera, iTunes (discovery agents), etc.
- Strong knowledge of Excel, Access, and reporting tools is required.
- SAM tool operational knowledge and certification is preferred.
- Strong written and verbal communication skills with attention to detail.
- Independent problem-solving ability and handling of complex analysis.
- Ability to manage multiple tasks and projects.
- Sound business knowledge (preferably tech business) and the ability to apply it in analysis.
Location: Gurugram/Noida. Timings: 2:00 PM - 10:30 PM. Cab facility provided: Yes.
Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm's focus areas include Asset Management and Advice, Retirement Planning, and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions and work with other talented individuals who share your passion for doing great work. You'll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven, and want to work for a strong ethical company that cares, take the next step and create a career at Ameriprise India LLP.
Full-Time/Part-Time. Timings: 2:00 PM - 10:30 PM. India Business Unit: AWMPO AWMP&S President's Office. Job Family Group: Technology
Posted 2 weeks ago
3.0 - 7.0 years
9 - 14 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.
In this role, you will: The Senior Data Engineer will be responsible for designing, building, and managing the data infrastructure and data pipeline processes for the bank. This role involves leading a team of data engineers and working closely with data scientists, analysts, and IT professionals to ensure data is accessible, reliable, and secure. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a thorough understanding of the banking industry's data requirements.
Leadership and Team Management:
- Lead, mentor, and develop a team of data engineers.
- Establish best practices for data engineering and ensure team adherence.
- Coordinate with other IT teams, business units, and stakeholders.
Data Pipeline Integration and Management:
- Design and implement scalable data architectures to support the bank's data needs.
- Develop and maintain ETL (Extract, Transform, Load) processes.
- Ensure the data infrastructure is reliable, scalable, and secure.
- Oversee the integration of diverse data sources into a cohesive data platform.
- Ensure data quality, data governance, and compliance with regulatory requirements.
- Develop and enforce data security policies and procedures.
- Monitor and optimize data pipeline performance.
- Troubleshoot and resolve data-related issues promptly.
- Implement monitoring and alerting systems for data processes.
Requirements - to be successful in this role, you should meet the following (must-have) requirements:
- 8+ years of experience in data engineering or a related field.
- Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark).
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services.
- Deep understanding of ETL processes and data pipeline orchestration tools (Airflow, Apache NiFi); a sketch follows this listing.
- Knowledge of data modeling, data warehousing concepts, and data integration techniques.
- Strong problem-solving skills and ability to work under pressure.
- Excellent communication and interpersonal skills.
- Experience in the banking or financial services industry.
- Familiarity with regulatory requirements related to data security and privacy in the banking sector.
- Certifications in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
- Experience with machine learning and data science frameworks.
Location: Pune and Bangalore
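Since the role calls out pipeline orchestration tools such as Airflow, here is a minimal, hedged DAG sketch of a daily ETL flow, assuming a recent Airflow 2.x release; the task functions and names are placeholders, not HSBC's actual pipelines.

```python
# Minimal Airflow 2.x DAG sketch: extract -> transform -> load, run daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    print("extracting source data")


def transform():
    # Placeholder: clean and reshape the extracted records.
    print("transforming data")


def load():
    # Placeholder: write the curated output to the warehouse.
    print("loading into warehouse")


with DAG(
    dag_id="daily_banking_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```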
Posted 2 weeks ago
8.0 - 9.0 years
10 - 11 Lacs
Mumbai
Work from Office
You are a strategic thinker with a passion for driving solutions and an innovation mindset. You have found the right team. As a Data Engineer in our STO team, you will be a strategic thinker passionate about promoting solutions using data. You will mine, interpret, and clean our data, asking questions, connecting the dots, and uncovering hidden opportunities for realizing the data's full potential. As part of a team of specialists, you will slice and dice data using various methods and create new visions for the future. Our STO team is focused on collaborating and partnering with the business to deliver efficiency and enhance controls via technology adoption and infrastructure support for Global Finance & Business Management India.
Job Responsibilities:
- Write efficient Python and SQL code to extract, transform, and load (ETL) data from various sources into Databricks.
- Perform data analysis and computation to derive actionable insights from the data.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data processes.
- Develop solutions optimized for performance and scalability.
- Monitor and troubleshoot data workflows to ensure reliability and efficiency.
- Document data engineering processes, methodologies, and workflows.
- Communicate analytical findings to senior leaders through data visualization and storytelling.
Required qualifications, capabilities and skills:
- Minimum 3+ years of hands-on experience in developing, implementing, and maintaining Python automation solutions, including the use of LLMs.
- Develop, implement, and maintain new and existing solutions.
- Write efficient Python and SQL code to extract, transform, and load (ETL) data from various sources.
- Ability to use LLMs to build AI solutions.
- Perform data analysis and computation to derive actionable insights from the data.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data processes.
- Monitor and troubleshoot data workflows to ensure reliability and efficiency.
- Document data engineering processes, methodologies, and workflows.
Preferred qualifications, capabilities and skills:
- Hands-on experience in Python desktop solution development.
- Knowledge of machine learning and data science concepts will be a plus.
- Experience with the data visualization tool Tableau will be a plus.
Posted 2 weeks ago
2.0 - 9.0 years
7 - 8 Lacs
Mumbai
Work from Office
Step into a transformative role as an Operations Analyst in Collateral Operations, where you'll be instrumental in driving portfolio reconciliation and ensuring regulatory adherence across all regulations. Your expertise will span cross-LOB metrics and projects, fostering a culture of continuous improvement that supports business functions across Back Office, Middle Office, and Global teams.
Job Summary: As an Operations Analyst in Collateral Operations, you will be responsible for portfolio reconciliation, regulatory adherence for all regulations, and cross-LOB metrics and projects. Additionally, you will be building a culture of continuous improvement, supporting the business across Back Office and Middle Office as well as Global teams. You will interact with multiple Operations and Technology teams within the organization to provide business support.
Job Responsibilities:
- Perform portfolio reconciliation and collateral dispute management.
- Understand MTM breaks, including data quality, strategic projects, etc.; focus on deep dives and fixing upstream issues to keep breaks to a minimum.
- Resolve breaks with Middle Offices, Credit Risk, VCG, etc.
- Check regulatory compliance (CFTC, EMIR, NCMR, etc.).
- Perform UAT testing.
- Implement strategic automation projects.
Required qualifications, capabilities and skills:
- Graduate or Post-Graduate with 2 years' experience in operations.
- Familiarity with Capital Markets and OTC Derivatives (i.e., Investment Banking), including OTC product, process, and system knowledge.
- Ability to drive results through a "hands-on" approach.
- Excellent verbal and written communication skills, and adept at communicating with all levels of the business and technical parts of the organization.
- Skilled in MS Office applications including Outlook, PowerPoint, Excel, Word, and Access.
- Can operate effectively in a dynamic environment with tight deadlines, and can prioritize one's own and the team's work to achieve goals.
- Flexibility to work global hours.
Preferred qualifications, capabilities and skills:
- Knowledge of CFTC, EMIR, and NCMR regulations preferable.
- Experience with OTC Confirmations, Collateral Management, and Reconciliation platforms will be an advantage.
Posted 2 weeks ago
9.0 - 14.0 years
15 - 19 Lacs
Bengaluru
Work from Office
About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.
Key Responsibilities:
- Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting.
- Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink.
- Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset.
- Good understanding of open table formats like Delta and Iceberg.
- Scale data quality frameworks to ensure data accuracy and reliability.
- Build data lineage tracking solutions for governance, access control, and compliance.
- Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms.
- Improve system stability, monitoring, and observability to ensure high availability of the platform.
- Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack.
- Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
- 9+ years of experience in building large-scale data platforms.
- Expertise in big data architectures using Databricks, Trino, and Debezium.
- Strong experience with streaming platforms, including Confluent Kafka.
- Experience in data ingestion, storage, processing, and serving in a cloud-based environment.
- Hands-on experience implementing data quality checks using Great Expectations (a sketch follows this listing).
- Deep understanding of data lineage, metadata management, and governance practices.
- Strong knowledge of query optimization, cost efficiency, and scaling architectures.
- Familiarity with OSS contributions and keeping up with industry trends in data engineering.
Soft Skills:
- Strong analytical and problem-solving skills with a pragmatic approach to technical challenges.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to lead large-scale projects in a fast-paced, dynamic environment.
- Passion for continuous learning, open-source collaboration, and building best-in-class data products.
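Since the role asks for hands-on data quality checks with Great Expectations, below is a minimal sketch using the legacy PandasDataset interface (ge.from_pandas); newer Great Expectations releases use a context/validator workflow instead, and the dataset and column names here are illustrative assumptions.

```python
# Illustrative data quality checks on a small event dataset using the
# legacy Great Expectations PandasDataset API (newer releases differ).
import pandas as pd
import great_expectations as ge

events = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": [499.0, 1299.0, 250.0, 89.0],
    "status": ["PLACED", "SHIPPED", "PLACED", "CANCELLED"],
})

ge_events = ge.from_pandas(events)

# Expectations: keys present, amounts in a sane range, status from a known set.
ge_events.expect_column_values_to_not_be_null("order_id")
ge_events.expect_column_values_to_be_between("amount", min_value=0, max_value=100000)
ge_events.expect_column_values_to_be_in_set("status", ["PLACED", "SHIPPED", "CANCELLED"])

results = ge_events.validate()
print("all checks passed:", results.success)
```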
Posted 2 weeks ago
9.0 - 14.0 years
11 - 16 Lacs
Bengaluru
Work from Office
About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.
Key Responsibilities:
- Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting.
- Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink.
- Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset.
- Good understanding of open table formats like Delta and Iceberg.
- Scale data quality frameworks to ensure data accuracy and reliability.
- Build data lineage tracking solutions for governance, access control, and compliance.
- Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms.
- Improve system stability, monitoring, and observability to ensure high availability of the platform.
- Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack.
- Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
- 9+ years of experience in building large-scale data platforms.
- Expertise in big data architectures using Databricks, Trino, and Debezium.
- Strong experience with streaming platforms, including Confluent Kafka.
- Experience in data ingestion, storage, processing, and serving in a cloud-based environment.
- Hands-on experience implementing data quality checks using Great Expectations.
- Deep understanding of data lineage, metadata management, and governance practices.
- Strong knowledge of query optimization, cost efficiency, and scaling architectures.
- Familiarity with OSS contributions and keeping up with industry trends in data engineering.
Soft Skills:
- Strong analytical and problem-solving skills with a pragmatic approach to technical challenges.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to lead large-scale projects in a fast-paced, dynamic environment.
- Passion for continuous learning, open-source collaboration, and building best-in-class data products.
Posted 2 weeks ago
9.0 - 14.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.
Key Responsibilities:
- Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting.
- Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink.
- Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset.
- Good understanding of open table formats like Delta and Iceberg.
- Scale data quality frameworks to ensure data accuracy and reliability.
- Build data lineage tracking solutions for governance, access control, and compliance.
- Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms.
- Improve system stability, monitoring, and observability to ensure high availability of the platform.
- Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack.
- Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
- 9+ years of experience in building large-scale data platforms.
- Expertise in big data architectures using Databricks, Trino, and Debezium.
- Strong experience with streaming platforms, including Confluent Kafka.
- Experience in data ingestion, storage, processing, and serving in a cloud-based environment.
- Hands-on experience implementing data quality checks using Great Expectations.
- Deep understanding of data lineage, metadata management, and governance practices.
- Strong knowledge of query optimization, cost efficiency, and scaling architectures.
- Familiarity with OSS contributions and keeping up with industry trends in data engineering.
Soft Skills:
- Strong analytical and problem-solving skills with a pragmatic approach to technical challenges.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Ability to lead large-scale projects in a fast-paced, dynamic environment.
- Passion for continuous learning, open-source collaboration, and building best-in-class data products.
Posted 2 weeks ago
4.0 - 9.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Job title: Business Analyst
Responsibilities:
Analytical Support:
- Gather all operational and financial data across all centers to provide inputs into the weekly MIS as well as the Monthly Review Meeting.
- Drive meaningful weekly/monthly reports that help the regional managers make decisions on their centers' health.
- Analyze financial data (budgets, income statements, etc.) to understand Oasis Fertility's financial health.
- Coordinate all operational issues captured at the center level and program-manage their closure through cross-functional collaboration.
- Evaluate operational expenditures (OPEX) and capital expenditures (CAPEX) against the budget to identify variances.
- Analyze operational data to identify trends and areas for improvement.
- Conduct ad-hoc analytics toward a hypothesis and derive insights that will impact business performance.
Operational Support:
- Coordinate assimilation of data for calculating doctor payouts and facilitate the final file to finance.
- Coordinate and assimilate data to calculate incentives for the eligible operations team members.
- Use key metrics like yearly growth, return on assets (ROA), return on equity (ROE), and earnings per share (EPS) to assess operational performance (see the sketch after this list).
- Collaborate with the operations and finance teams to ensure alignment between operational and financial goals.
Strategic Support:
- Conduct business studies to understand past, present, and potential future performance.
- Conduct market research to stay updated on financial trends in the fertility industry.
- Evaluate the effectiveness of current processes and recommend changes for better efficiency.
- Develop data-driven recommendations to improve operational efficiency.
- Prepare financial models to assess the profitability of different business units and potential investment opportunities.
- Participate in process improvement initiatives and policy development to optimize business functions.
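A small, hedged Python sketch of how the metrics named above (ROA, ROE, EPS, yearly growth) are typically computed from standard financial-statement figures; the numbers are made up for illustration and are not company data.

```python
# Standard textbook formulas; input figures are illustrative only.
net_income = 12_000_000.0
total_assets = 150_000_000.0
shareholders_equity = 60_000_000.0
shares_outstanding = 4_000_000.0
revenue_this_year = 90_000_000.0
revenue_last_year = 75_000_000.0

roa = net_income / total_assets                      # return on assets
roe = net_income / shareholders_equity               # return on equity
eps = net_income / shares_outstanding                # earnings per share
yoy_growth = (revenue_this_year - revenue_last_year) / revenue_last_year

print(f"ROA: {roa:.1%}, ROE: {roe:.1%}, EPS: {eps:.2f}, YoY growth: {yoy_growth:.1%}")
```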
Posted 2 weeks ago
4.0 - 8.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary: A BIE responsible for identifying data objects, developing strategies, and analysing data to provide insights and recommendations for product data migrations. SAP MM module with 7+ years of experience.
Responsibilities:
- As a BIE, must be hands-on with ERP (SAP), CRM (SFDC), and LSMW/Winshuttle.
- Should have hands-on experience with material master data, purchase info records, condition records, source list creation, and stock migration from legacy to new systems.
- Gather business requirements and compliance needs, and document the requirements per process documents in information models, ensuring end-to-end data model consistency across processes and IT applications.
- Accountable for the implementation of Data Requirement Specifications based on inputs from various stakeholders such as the Business Process Owner, Business Process Experts, Business Information Owner, Markets, Business Group representatives, and Program/Project Managers.
- Support the implementation of the Services domain roadmap, including implementation of data-related solutions and processes, keeping data quality and compliance controls in mind.
Posted 2 weeks ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data platform's capabilities. You will be actively involved in problem-solving and contributing innovative ideas to improve the overall data architecture, ensuring that the platform meets the evolving needs of the organization and its stakeholders.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and management frameworks.
- Ability to work with large datasets and perform data analysis.
Additional Information:
- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Navi Mumbai
Work from Office
Project Role : Application Lead
Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills : Microsoft Purview
Good to have skills : Collibra Data Governance
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful outcomes. You will also engage in problem-solving activities, ensuring that the applications meet the required standards and specifications while fostering a collaborative environment for your team.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously assess and improve application development processes to increase efficiency.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft Purview.
- Good To Have Skills: Experience with Collibra Data Governance.
- Strong understanding of data governance principles and practices (see the classification sketch after this listing).
- Experience in application design and architecture.
- Familiarity with cloud-based solutions and integration techniques.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Purview.
- This position is based in Mumbai.
- A 15 years full time education is required.

Qualification
15 years full time education
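Governance tools such as Purview automate data classification at scale. As a rough illustration of the underlying idea only (not Purview's classifiers or API), the sketch below tags a column using regex-based rules; the patterns and the 80% match threshold are assumptions.

```python
# Generic illustration of rule-based data classification, the kind of scan a
# governance tool automates. The patterns and labels are assumptions, not any
# product's built-in classifiers.
import re

CLASSIFIERS = {
    "EMAIL": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "PHONE": re.compile(r"^\+?\d[\d\s-]{7,14}$"),
}

def classify_column(values: list[str]) -> set[str]:
    """Label a column with every classifier that matches most of its sample values."""
    labels = set()
    for label, pattern in CLASSIFIERS.items():
        hits = sum(1 for v in values if pattern.match(v or ""))
        if values and hits / len(values) >= 0.8:   # assumed 80% match threshold
            labels.add(label)
    return labels

sample = ["alice@example.com", "bob@example.org", "carol@example.net"]
print(classify_column(sample))   # {'EMAIL'}
```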
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Data Modeler
Project Role Description : Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and meet the requirements of the organization, facilitating smooth data integration and accessibility for users across the company.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to increase efficiency and effectiveness.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and techniques (see the sketch after this listing).
- Experience with ETL processes and data integration.
- Familiarity with data governance and data quality principles.
- Ability to work with various data visualization tools to present data models effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification
15 years full time education
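As a rough illustration of the dimensional-modeling step such a role would formalise in warehouse DDL, the pandas sketch below splits a flat extract into a customer dimension and an order fact. The column names and surrogate-key scheme are assumptions, not a prescribed Snowflake design.

```python
# Minimal sketch of a dimensional split (fact + dimension) using pandas;
# a real Snowflake model would express this as warehouse DDL rather than
# DataFrames, and the column names here are hypothetical.
import pandas as pd

flat = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer": ["Acme", "Acme", "Globex"],
    "region":   ["West", "West", "East"],
    "amount":   [120.0, 80.0, 200.0],
})

# Dimension: one row per customer, with a surrogate key.
dim_customer = flat[["customer", "region"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus a foreign key back to the dimension.
fact_orders = flat.merge(dim_customer, on=["customer", "region"])[
    ["order_id", "customer_key", "amount"]
]

print(dim_customer)
print(fact_orders)
```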
Posted 2 weeks ago
7.0 - 12.0 years
10 - 14 Lacs
Nagpur
Work from Office
Project Role : Application Lead
Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills : SAP FSCM Treasury and Risk Management (TRM)
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact for the project.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP FSCM Treasury and Risk Management (TRM).
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity (see the sketch after this listing).

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP FSCM Treasury and Risk Management (TRM).
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification
15 years full time education
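The skills above name data cleaning/normalization and basic classifiers alongside the SAP TRM requirement. The sketch below is a generic scikit-learn illustration of those analytics techniques on synthetic data; it has nothing to do with SAP TRM itself, and the dataset and parameters are assumptions.

```python
# Small scikit-learn sketch of the techniques named in the listing:
# imputation and scaling (data munging) feeding a logistic regression classifier.
# The dataset is synthetic and purely illustrative.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[rng.random(X.shape) < 0.05] = np.nan      # sprinkle missing values to clean up later
y = (np.nan_to_num(X[:, 0]) > 0).astype(int)  # simple synthetic target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = Pipeline([
    ("impute", SimpleImputer(strategy="median")),   # cleaning: fill missing values
    ("scale", StandardScaler()),                    # normalization
    ("clf", LogisticRegression()),
])
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```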
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Kolkata
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Informatica Data Quality
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement Informatica Data Quality solutions.
- Collaborate with cross-functional teams to analyze and address data quality issues.
- Create and maintain documentation for data quality processes.
- Participate in data quality improvement initiatives.
- Assist in training junior professionals in data quality best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Informatica Data Quality.
- Strong understanding of data quality principles.
- Experience with ETL processes and data integration.
- Knowledge of data profiling and cleansing techniques (see the sketch after this listing).
- Familiarity with data governance and metadata management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Kolkata office.
- A 15 years full-time education is required.

Qualification
15 years full time education
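The profiling and cleansing concepts listed above can be illustrated generically. The pandas sketch below computes null counts, distinct counts, and duplicate rows, then flags malformed emails; it is not Informatica tooling, and the columns and rules are assumptions.

```python
# Generic data-profiling sketch in pandas, illustrating profiling and cleansing
# concepts only; the columns and validation rule are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, None],
    "email": ["a@x.com", "b@x.com", "b@x.com", "not-an-email", None],
})

# Column-level profile: nulls, distinct values, and (repeated per column) exact duplicates.
profile = pd.DataFrame({
    "null_count": df.isna().sum(),
    "distinct": df.nunique(dropna=True),
    "duplicate_rows": [df.duplicated().sum()] * df.shape[1],
})
print(profile)

# Simple cleansing pass: drop exact duplicates, flag malformed emails.
cleansed = df.drop_duplicates()
cleansed = cleansed.assign(
    email_valid=cleansed["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)
)
print(cleansed)
```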
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Software Development Engineer
Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills : Informatica MDM
Good to have skills : Informatica Administration
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary : As a Software Development Engineer, you will work in a dynamic environment where you analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of enhancements and maintenance tasks, while also developing new features to meet client needs. You will be responsible for delivering high-quality code and participating in discussions that drive project success.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to best practices and coding standards.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Informatica MDM.
- Good To Have Skills: Experience with Informatica Administration.
- Strong understanding of data integration and data quality processes (see the matching sketch after this listing).
- Experience with ETL processes and data warehousing concepts.
- Familiarity with database management systems and SQL.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification
15 years full time education
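Two ideas at the heart of MDM work are candidate matching and survivorship. The sketch below illustrates them generically with fuzzy name matching and a "latest update wins" rule; it is not Informatica MDM, and the records, 0.6 threshold, and survivorship rule are assumptions.

```python
# Generic sketch of core MDM concepts: fuzzy matching of candidate duplicates
# and a simple "most recently updated wins" survivorship rule. Records,
# threshold, and rule are hypothetical.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Corp",        "updated": "2024-05-01"},
    {"id": 2, "name": "ACME Corporation", "updated": "2024-06-15"},
    {"id": 3, "name": "Globex Ltd",       "updated": "2024-04-20"},
]

def similar(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Pair up records whose names look alike (0.6 is an assumed match threshold).
matches = [
    (r1, r2)
    for i, r1 in enumerate(records)
    for r2 in records[i + 1:]
    if similar(r1["name"], r2["name"]) >= 0.6
]

# Survivorship: within each matched pair, keep the most recently updated record.
golden = [max(pair, key=lambda r: r["updated"]) for pair in matches]
print(golden)   # expected: the 'ACME Corporation' record survives
```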
Posted 2 weeks ago