2.0 - 7.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Travel Designer Group
Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as:
* Rez.Tez
* Affiliate.Travel
* Designer Voyages
* Designer Indya
* RezRewards
* RezVault
With a presence in over 32 countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/
Profile: ETL Developer
ETL Tools (any one): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
Workflow & Orchestration (any one; good to have, not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M
Programming & Scripting: SQL (advanced), Python (mandatory), Bash/Shell (mandatory), Java or Scala (optional, for Spark)
Databases & Data Warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (good to have)
Cloud & Data Storage (any one or two): AWS S3 / Azure Blob Storage / Google Cloud Storage (mandatory); Kafka / Kinesis / Pub/Sub
Interested candidates can also share their resume at shivani.p@rezlive.com
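The role above centres on the extract-transform-load pattern that all of the listed tools (Talend, Glue, NiFi, etc.) implement at scale. As a minimal sketch of that pattern using only the Python standard library (the feed, column names, and dedup rule are illustrative, not from the posting):

```python
import csv
import io
import sqlite3

# Illustrative raw feed; a real pipeline would read from files, APIs, or queues.
RAW = """booking_id,city,amount
B001,Ahmedabad, 1200
B002,Mumbai,950
B001,Ahmedabad,1200
"""

def extract(text):
    """Parse the raw CSV feed into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop duplicate bookings and normalize the amount column."""
    seen, clean = set(), []
    for r in rows:
        if r["booking_id"] in seen:
            continue  # duplicate booking: keep first occurrence only
        seen.add(r["booking_id"])
        r["amount"] = float(r["amount"].strip())
        clean.append(r)
    return clean

def load(rows, conn):
    """Write cleaned rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS bookings "
        "(booking_id TEXT PRIMARY KEY, city TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO bookings VALUES (:booking_id, :city, :amount)", rows
    )

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM bookings").fetchone())
```

The same three-stage shape is what the listed orchestration tools (Airflow, Dagster, etc.) schedule and monitor as a DAG of tasks.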
Posted 5 hours ago
3.0 years
4 - 10 Lacs
Coimbatore
Remote
Industry: IT
Qualification: Any Degree
Required Skills: Python, Pandas, SQL
Working Shift: 2 PM to 11 PM IST
City: Coimbatore
Country: India
Name of the position: Data Engineer
Location: Remote
No. of resources needed: 01
Mode: Contract (2 months, with possible extension)
Years of experience: 3+ years
Shift: UK shift

Job Summary:
We are looking for a highly motivated and detail-oriented Data Engineer with a strong background in data cleansing, Python scripting, and SQL to join our team. The ideal candidate will play a critical role in ensuring data quality, transforming raw datasets into actionable insights, and supporting data-driven decision-making across the organization.

Key Responsibilities:
Design and implement efficient data cleansing routines to remove duplicates, correct anomalies, and validate data integrity. Write robust Python scripts to automate data processing, transformation, and integration tasks. Develop and optimize SQL queries for data extraction, aggregation, and reporting. Work closely with data analysts, business stakeholders, and engineering teams to understand data requirements and deliver clean, structured datasets. Build and maintain data pipelines that support large-scale data processing. Monitor data workflows and troubleshoot issues to ensure accuracy and reliability. Contribute to documentation of data sources, transformations, and cleansing logic.

Requirements:
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field. 3+ years of hands-on experience in data engineering, with a focus on data quality and cleansing. Strong proficiency in Python, including libraries like Pandas and NumPy. Expert-level knowledge of SQL and working with relational databases (e.g., PostgreSQL, MySQL, SQL Server). Familiarity with data profiling tools and techniques. Excellent problem-solving skills and attention to detail. Good communication and documentation skills.
Preferred Qualifications: Experience with cloud platforms (AWS, Azure, GCP) and data services (e.g., S3, BigQuery, Redshift). Knowledge of ETL tools like Apache Airflow, Talend, or similar. Exposure to data governance and data cataloging practices.
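The cleansing routine this posting describes (remove duplicates, correct anomalies, validate integrity) maps directly onto a few Pandas operations. A minimal sketch, with an invented dataset and made-up column names:

```python
import numpy as np
import pandas as pd

# Illustrative dataset; the columns and the "negative age" anomaly rule are
# assumptions for the sketch, not requirements from the posting.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "email": ["a@x.com", "b@x.com", "b@x.com", None, "d@x.com"],
    "age": [34, 29, 29, -5, 41],  # -5 is an obvious anomaly
})

# 1. Remove exact duplicate rows.
df = df.drop_duplicates()

# 2. Correct anomalies: treat a negative age as missing.
df["age"] = df["age"].where(df["age"] >= 0, np.nan)

# 3. Validate integrity: quarantine rows still missing required fields.
invalid = df[df["email"].isna() | df["age"].isna()]
clean = df.drop(invalid.index)

print(len(clean), len(invalid))
```

In a production pipeline the quarantined rows would typically be written to a reject table for review rather than silently dropped.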
Posted 10 hours ago
0 years
0 Lacs
Bengaluru
On-site
APPLICATION DEVELOPER
Publication Date: Jun 20, 2025
Ref. No: 532101
Location: Bangalore, IN

We are looking for a Data Engineer to join our team and bring the analytics practice to the next level. We are looking for a motivated person who thrives in a dynamic and challenging environment, who loves working with cutting-edge tools, who has no problem switching between multiple programming languages, and who is able to find out-of-the-box solutions to complex problems. In this role, you will be at the heart of the definition and implementation of world-class analytics solutions, and you will help establish data as a strategic advantage for BRP.

Responsibilities
Design, develop, implement, and support robust ETL/ELT/data-pipelining solutions. Coordinate with multiple development teams to achieve delivery objectives. Provide support in requirements definition, estimation, development, and delivery of robust and scalable solutions. Develop and support real-time data ingestion processes from various data sources. Develop and support data models optimized for business intelligence usage. Build integrations with APIs from external providers such as Google, Facebook, Salesforce, SAP, and others. Adhere to industry standards and laws such as GDPR and SOX. Be a leader in best-practices definition and creative thinking.

Required Skills
Master's degree in Business Intelligence or equivalent. Solid, demonstrated experience with the following technologies: Snowflake, dbt, Talend and/or Azure Data Factory, Microsoft SQL Server, Power BI. Fluent in various programming languages such as T-SQL, Python, JavaScript / Node.js, and Java. Understands and puts into practice data modeling and the design and development of solid ETL/ELT data pipelines. Fluent in writing, executing, and optimizing complex SQL queries. Experience implementing API service architectures (REST). Develops clean and maintainable code in a CI/CD environment. Experience using cloud BI technologies such as Azure or similar. Experience translating business requirements into advanced data models able to fulfill analysts' and data scientists' requirements. Experience in data profiling. Experience working within an agile team to build big data / analytics solutions. Strong interpersonal relations; motivated and loves to work on multiple challenging projects. Strong communication skills, both spoken and written, in both French and English. Open-minded and able to adapt to new ways of working (data vault, event-driven architecture, unstructured data, self-service analytics, etc.). Well organized and able to self-prioritize, sometimes with conflicting deadlines. Continuously seeks improvements and is eager to get hands on new technologies.
Posted 10 hours ago
1.0 - 9.0 years
5 - 8 Lacs
Bengaluru
On-site
Job requisition ID: 84728
Date: Jun 22, 2025
Location: Bengaluru
Designation: Consultant
Entity: Technology & Transformation, EAD: ETL Testing, Analyst/Consultant/Senior Consultant

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning.

Your work profile:
As an Analyst/Consultant/Senior Consultant in our T&T Team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations:
Develop and execute automated test cases for ETL processes. Validate data transformation, extraction, and loading accuracy. Collaborate with data engineers and QA teams to understand ETL workflows. Identify and document defects and inconsistencies. Maintain test documentation and support manual testing efforts. Design and implement automated ETL test scripts and frameworks. Validate end-to-end data flows and transformation logic.
Collaborate with data architects, developers, and QA teams. Integrate ETL testing into CI/CD pipelines where applicable. Analyze test results and troubleshoot data issues. Lead the architecture and development of advanced ETL automation frameworks. Drive best practices in ETL testing and data quality assurance. Mentor and guide junior consultants and analysts. Collaborate with stakeholders to align testing strategies with business goals. Integrate ETL testing within DevOps and CI/CD pipelines.

Desired Qualifications
1 to 9 years' experience in ETL testing and automation. Knowledge of ETL tools such as Informatica, Talend, or DataStage. Experience with SQL and database querying. Basic scripting or programming skills (Python, Shell, etc.). Good analytical and communication skills. Strong SQL skills and experience with ETL tools like Informatica, Talend, or DataStage. Proficiency in scripting languages for automation (Python, Shell, etc.). Knowledge of data warehousing concepts and best practices. Strong problem-solving and communication skills. Expert knowledge of ETL tools and strong SQL proficiency. Experience with automation scripting and data validation techniques. Strong leadership, communication, and stakeholder management skills. Familiarity with big data technologies and cloud platforms is a plus.

Location and way of working:
Base location: Bangalore. This profile involves occasional travelling to client locations. Hybrid is our default way of working; each domain has customized the hybrid approach to its unique needs.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level.
Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters. Drive your career At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. Everyone’s welcome… entrust your happiness to us Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you. Interview tips We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Posted 10 hours ago
1.0 years
6 - 9 Lacs
Noida
On-site
Job Description Job ID ANALY014395 Employment Type Regular Work Style on-site Location Noida,UP,India Role Analytics Consultant I Company Overview With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200+ needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people —then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you. Job Summary The Analytics Consultant I is a business intelligence focused expert that participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions and UKG Datahub. The candidate is also responsible for interacting with other businesses and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. 
The Analytics Consultant I will also be responsible for developing custom analytics solutions and reports to the specifications provided and for supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Key Responsibilities:
Interact with other business and technical project stakeholders to gather business requirements. Deploy and configure the UKG Analytics and Data Hub products based on the design documents. Develop and deliver best-practice visualizations and dashboards using BI tools such as Cognos, BIRT, or Power BI. Put together a test plan, validate the solution deployed, and document the results. Provide support during production cutover and, after go-live, act as the first level of support for any requests that come through from the customer or other consultants. Analyze the customer's data to spot trends and issues and present the results back to the customer.

Required Qualifications:
1-3 years' experience designing and delivering analytical/business intelligence solutions required. Cognos, BIRT, Power BI, or other business intelligence toolset experience required. ETL experience using Talend or other industry-standard ETL tools strongly preferred. Advanced SQL proficiency is a plus. Knowledge of Google Cloud Platform, Azure, or something similar is desired but not required. Knowledge of Python is desired but not required. Willingness to learn new technologies and adapt quickly is required. Strong interpersonal and problem-solving skills. Flexibility to support customers in different time zones is required.

Where we're going
UKG is on the cusp of something truly special.
Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process. Disability Accommodation For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com
Posted 10 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role Description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and Dataproc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes
Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes
Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction of recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches.

Outputs Expected
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples
Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples
Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP Dataproc/Dataflow, and Azure ADF and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques.
Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering.

Additional Comments
Tech skills: Proficient in Python (including popular Python packages, e.g. Pandas, NumPy) and SQL. Strong background in distributed data processing and storage (e.g. Apache Spark, Hadoop). Large-scale (TBs of data) data engineering skills: model data and create production-ready ETL pipelines. Development experience with at least one cloud (Azure highly preferred; AWS, GCP). Knowledge of data lake and data lakehouse patterns. Knowledge of ETL performance tuning and cost optimization. Knowledge of data structures and algorithms and good software engineering practices.
Soft skills: Strong communication skills to articulate complex situations concisely. Comfortable picking up new technologies independently. An eye for detail, good data intuition, and a passion for data quality. Comfortable working in a rapidly changing environment with ambiguous requirements.
Skills: Python, SQL, AWS, Azure
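The posting's knowledge examples call out SQL windowing functions specifically. A minimal illustration using SQLite (available in any modern Python; the schema and data are made up): `RANK()` orders rows within each partition, while an un-ordered `SUM() OVER` computes a per-partition aggregate without collapsing the rows.

```python
import sqlite3

# Toy orders table to demonstrate window functions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES
  ('a', 50), ('a', 75), ('b', 20), ('b', 90), ('b', 40);
""")

# Rank each customer's orders by amount, and attach the customer's total,
# all in one pass and without a self-join or GROUP BY.
rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY customer) AS customer_total
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
for r in rows:
    print(r)
```

The same `OVER (PARTITION BY … ORDER BY …)` syntax carries over to Snowflake, BigQuery, and Spark SQL, which is why interviewers for roles like this one lean on it.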
Posted 10 hours ago
3.0 - 8.0 years
16 - 20 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
We're Hiring: Data Governance Developer (Microsoft Purview)
Locations: Hyderabad / Indore / Ahmedabad (Work from Office)
Experience: 4-6 Years
Budget: Depending on experience & skills
Apply by sharing your resume with: current CTC, expected CTC, notice period, and preferred location.
Email your profile to: navaneetha@suzva.com
Contact: +91 90329 56160

Role Overview
As a Data Governance Developer at Kanerika, you will be responsible for developing and managing robust metadata, lineage, and compliance frameworks using Microsoft Purview and other leading tools. You'll work closely with engineering and business teams to ensure data integrity, regulatory compliance, and operational transparency.

Key Responsibilities
Set up and manage Microsoft Purview: accounts, collections, RBAC, and policies. Integrate Purview with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake. Schedule and monitor metadata scanning, classification, and lineage tracking jobs. Build ingestion workflows for technical, business, and operational metadata. Tag, enrich, and organize assets with glossary terms and metadata. Automate lineage, glossary, and scanning processes via REST APIs, PowerShell, ADF, and Logic Apps. Design and enforce classification rules for PII, PCI, and PHI. Collaborate with domain owners on glossary and metadata quality governance. Generate compliance dashboards and lineage maps in Power BI.

Tools & Technologies
Governance platforms: Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog.
Integration tools: Azure Data Factory, dbt, Talend.
Automation & scripting: PowerShell, Azure Functions, Logic Apps, REST APIs.
Compliance areas in Purview: Sensitivity Labels, Policy Management, Auto-labeling, Data Loss Prevention (DLP), Insider Risk Management, Records Management, Compliance Manager, Lifecycle Management, eDiscovery, Audit, DSPM, Information Barriers, Unified Catalog.

Required Qualifications
4-6 years of experience in Data Governance / Data Management.
Hands-on with Microsoft Purview, especially lineage and classification workflows. Strong understanding of metadata management, glossary governance, and data classification. Familiarity with Azure Data Factory, dbt, Talend. Working knowledge of data compliance regulations: GDPR, CCPA, SOX, HIPAA. Strong communication skills to collaborate across technical and non-technical teams.
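The classification rules the role mentions (PII, PCI, PHI) ultimately come down to pattern matching plus validation over scanned data. A deliberately simplified sketch of that idea; the patterns are illustrative only, and Purview's built-in classifiers are far more robust than ad-hoc regexes like these:

```python
import re

# Toy PII detection rules. Real governance platforms ship curated,
# well-tested classifiers; these patterns are assumptions for illustration.
RULES = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PAN":   re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),  # Indian PAN card format
    "PHONE": re.compile(r"\b\d{10}\b"),              # bare 10-digit number
}

def classify(text):
    """Return the sorted list of rule labels that match the given text."""
    return sorted(label for label, rx in RULES.items() if rx.search(text))

sample = "Contact navaneetha@suzva.com or +91 9032956160, PAN ABCDE1234F"
print(classify(sample))
```

In Purview, a matching rule would trigger tagging of the column or asset with the corresponding sensitivity label rather than just returning a list.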
Posted 10 hours ago
1.0 - 5.0 years
4 - 8 Lacs
Mumbai
Work from Office
Piscis Networks is looking for a TAC Support Engineer to join our dynamic team and embark on a rewarding career journey.
Responding to customer inquiries and resolving technical issues via phone, email, or chat. Conducting diagnostic tests to identify the root cause of customer issues. Providing technical guidance to customers and walking them through solutions to resolve their problems. Collaborating with development teams to escalate and resolve complex technical issues. Maintaining accurate records of customer interactions and issue resolutions in a CRM system. Participating in the development and delivery of customer training and support materials. Communicating with customers and internal stakeholders to provide status updates on issue resolution.
Requirements: Strong technical background and understanding of hardware and software systems. Excellent communication and interpersonal skills. Experience with CRM and ticketing systems.
Posted 12 hours ago
6.0 - 10.0 years
13 - 17 Lacs
Bengaluru
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.

About The Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do
Manage the customer's priorities of projects and requests. Assess customer needs utilizing a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost. Design and implement software products (Big Data related), including data models and visualizations. Demonstrate participation with the teams you work in. Deliver good solutions against tight timescales. Be proactive, suggest new approaches, and develop your capabilities. Share what you are good at while learning from others to improve the team overall. Show a solid understanding of a range of technical skills, attitudes, and behaviors. Deliver great solutions. Be focused on driving value back into the business.

Expertise You'll Bring
6 years' experience in designing and developing enterprise application solutions for distributed systems. Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume). Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig, and MapReduce, and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases. Apache Spark or other streaming Big Data processing preferred. Java or Big Data technologies will be a plus.

Benefits
Competitive salary and benefits package. Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications. Opportunity to work with cutting-edge technologies. Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards. Annual health check-ups. Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above
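The MapReduce model named in the expertise list above reduces to three phases: map each record to key-value pairs, shuffle the pairs by key, and reduce each group. A toy word-count in plain Python makes the shape concrete (a real Hadoop or Spark job distributes these same phases across a cluster):

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Mapper: emit (word, 1) for each word, as in the classic word count."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: sum the values for each key."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big pipelines", "data quality matters"]
pairs = list(chain.from_iterable(map_phase(line) for line in lines))
counts = reduce_phase(shuffle(pairs))
print(counts)
```

The framework's value lies in running the map and reduce phases in parallel over partitioned data and handling the shuffle over the network, which this single-process sketch elides.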
Posted 13 hours ago
6.0 - 10.0 years
13 - 17 Lacs
Pune
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.

About The Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do Manage customers' priorities across projects and requests Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost Design and implement software products (Big Data related), including data models and visualizations Participate actively in the teams you work with Deliver good solutions against tight timescales Be proactive, suggest new approaches, and develop your capabilities Share what you are good at while learning from others to improve the team overall Demonstrate a working understanding of a range of technical skills, attitudes, and behaviors Deliver great solutions Stay focused on driving value back into the business Expertise You'll Bring 6 years' experience in designing and developing enterprise application solutions for distributed systems Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume) Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig, MapReduce, and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases Apache Spark or other streaming Big Data processing preferred Java or Big Data technologies a plus Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment • We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential. See Beyond, Rise Above
Posted 13 hours ago
6.0 - 10.0 years
13 - 17 Lacs
Hyderabad
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com About The Position We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do Manage customers' priorities across projects and requests Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost Design and implement software products (Big Data related), including data models and visualizations Participate actively in the teams you work with Deliver good solutions against tight timescales Be proactive, suggest new approaches, and develop your capabilities Share what you are good at while learning from others to improve the team overall Demonstrate a working understanding of a range of technical skills, attitudes, and behaviors Deliver great solutions Stay focused on driving value back into the business Expertise You'll Bring 6 years' experience in designing and developing enterprise application solutions for distributed systems Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume) Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig, MapReduce, and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases Apache Spark or other streaming Big Data processing preferred Java or Big Data technologies a plus Benefits Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment • We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential. See Beyond, Rise Above
Posted 13 hours ago
5.0 years
0 - 0 Lacs
Mumbai Metropolitan Region
On-site
Job Title: Data Migration Project Manager (Snowflake) Location: Mumbai (Onsite) About the Role: We are seeking a results-driven Data Migration Project Manager with expertise in managing complex data migration initiatives, particularly involving Snowflake . This role will be responsible for end-to-end planning, execution, and delivery of data migration projects from legacy systems or on-premise databases to cloud-based solutions using Snowflake. Key Responsibilities: Lead and manage full lifecycle data migration projects with a focus on Snowflake as the target platform. Develop and maintain detailed project plans, schedules, risk registers, and resource allocation to ensure successful execution. Collaborate closely with data architects, engineers, business analysts, and other stakeholders to define and validate migration requirements. Oversee data extraction, transformation, and loading (ETL/ELT) activities, ensuring data integrity, quality, and security throughout the migration. Coordinate testing phases including data validation, reconciliation, and user acceptance testing (UAT). Drive issue resolution and ensure mitigation plans are in place for project risks. Provide regular project status updates to leadership and stakeholders. Ensure compliance with data governance, regulatory, and security requirements during migrations. Requirements Required Qualifications: 5+ years of experience in project management with a focus on data migration or data platform modernization. Hands-on experience managing data migrations to or from Snowflake. Strong understanding of Snowflake architecture, capabilities, and best practices. Experience working with ETL/ELT tools (e.g., Talend, Informatica, dbt, Matillion). Familiarity with data modeling, SQL, and cloud platforms (AWS, Azure, or GCP). PMP, Agile, or similar project management certification preferred.
Posted 14 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Location: Hyderabad (Work from Office) Working Hours: 5:00 PM - 2:00 AM IST or 6:00 PM - 3:00 AM IST; candidates should be comfortable working night shifts as required. Position Summary: The Senior Consultant will integrate and map customer data from client source system(s) to our industry-leading platform. The role will include, but is not limited to: Using strong technical data migration, scripting, and organizational skills to ensure the client data is converted efficiently and accurately to the insight software (ISW) platform. Performing extract, transform, load (ETL) activities to ensure accurate and timely data conversions. Providing in-depth research and analysis of complex scenarios to develop innovative solutions that meet customer needs while remaining within project governance. Mapping and maintaining business requirements to the solution design using tools such as requirements traceability matrices (RTM). Presenting findings, requirements, and problem statements for ratification by stakeholders and working groups. Identifying and documenting data gaps to allow change impact and downstream impact analysis to be conducted. Qualifications: Experience assessing data and analytic requirements to establish mapping rules from source to target systems to meet business objectives. Experience with real-time, batch, and ETL processing for complex data conversions. Working knowledge of extract, transform, load (ETL) methodologies and tools such as Talend, Dell Boomi, etc. Ability to use data mapping tools to prepare data for data loads based on target system specifications. Working experience with various data applications/systems such as Oracle SQL, Excel, .csv files, etc. Strong SQL scripting experience. Ability to communicate with clients and/or the ISW Project Manager to scope, develop, test, and implement conversions/integrations. Ability to communicate effectively with ISW Project Managers and customers to keep projects on target. Drive for continual improvement of the data migration process.
Collaborate via phone and email with clients and/or the ISW Project Manager throughout the conversion/integration process. Demonstrated collaboration and problem-solving skills. Working knowledge of software development lifecycle (SDLC) methodologies, including but not limited to Agile and Waterfall. Clear understanding of cloud and application integrations. Ability to work independently, prioritize tasks, and manage multiple tasks simultaneously. Ensure the client's data is converted/integrated accurately and within deadlines established by the ISW Project Manager. Experience in customer SIT, UAT, migration, and go-live support.
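The ETL and reconciliation duties listed above can be sketched in a few lines. This is a minimal illustration only, using SQLite in place of the Oracle/Snowflake systems the role mentions; the table and column names (`legacy_customers`, `customers`) are hypothetical:

```python
import sqlite3

def migrate_customers(src: sqlite3.Connection, tgt: sqlite3.Connection) -> dict:
    """Extract rows from a legacy table, apply a simple transform, load them
    into the target schema, then reconcile counts and totals between systems."""
    tgt.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, balance REAL)"
    )
    # Extract + transform: trim whitespace and normalize names to title case.
    rows = [
        (cid, name.strip().title(), balance)
        for cid, name, balance in src.execute(
            "SELECT cust_id, cust_name, balance FROM legacy_customers"
        )
    ]
    # Load in a single transaction so a failure leaves the target unchanged.
    with tgt:
        tgt.executemany("INSERT INTO customers VALUES (?, ?, ?)", rows)
    # Reconcile: row counts and balance totals must match the source exactly.
    src_count, src_total = src.execute(
        "SELECT COUNT(*), COALESCE(SUM(balance), 0) FROM legacy_customers"
    ).fetchone()
    tgt_count, tgt_total = tgt.execute(
        "SELECT COUNT(*), COALESCE(SUM(balance), 0) FROM customers"
    ).fetchone()
    return {"rows_match": src_count == tgt_count, "totals_match": src_total == tgt_total}

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE legacy_customers (cust_id INTEGER, cust_name TEXT, balance REAL)")
src.executemany(
    "INSERT INTO legacy_customers VALUES (?, ?, ?)",
    [(1, "  alice jones ", 120.5), (2, "BOB SMITH", 75.0)],
)
tgt = sqlite3.connect(":memory:")
result = migrate_customers(src, tgt)
print(result)  # {'rows_match': True, 'totals_match': True}
```

Real engagements would replace the inline transform with Talend or Boomi jobs; the reconciliation step at the end is the part the posting's "accurate and timely data conversions" language refers to.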
Posted 15 hours ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Test Engineer Location: Hyderabad (Onsite) Experience Required: 5 Years Job Description: We are looking for a detail-oriented and skilled Test Engineer with 5 years of experience in testing SAS applications and data pipelines . The ideal candidate should have a solid background in SAS programming , data validation , and test automation within enterprise data environments. Key Responsibilities: Conduct end-to-end testing of SAS applications and data pipelines to ensure accuracy and performance. Write and execute test cases/scripts using Base SAS, Macros, and SQL . Perform SQL query validation and data reconciliation using industry-standard practices. Validate ETL pipelines developed using tools like Talend, IBM Data Replicator , and Qlik Replicate . Conduct data integration testing with Snowflake and use explicit pass-through SQL to ensure integrity across platforms. Utilize test automation frameworks using Selenium, Python, or Shell scripting to increase test coverage and reduce manual efforts. Identify, document, and track bugs through resolution, ensuring high-quality deliverables. Required Skills: Strong experience in SAS programming (Base SAS, Macro) . Expertise in writing and validating SQL queries . Working knowledge of data testing frameworks and reconciliation tools . Experience with Snowflake and ETL validation tools like Talend, IBM Data Replicator, Qlik Replicate. Proficiency in test automation using Selenium , Python , or Shell scripts . Solid understanding of data pipelines and data integration testing practices.
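The "SQL query validation and data reconciliation" responsibility above is commonly implemented as a two-way EXCEPT comparison between staging and warehouse tables. A minimal sketch, with hypothetical table names and SQLite standing in for Snowflake:

```python
import sqlite3

def find_discrepancies(conn, source_table, target_table, columns):
    """Return rows present in one table but not the other, using EXCEPT in
    both directions -- a common pattern for validating ETL output."""
    cols = ", ".join(columns)
    missing_in_target = conn.execute(
        f"SELECT {cols} FROM {source_table} EXCEPT SELECT {cols} FROM {target_table}"
    ).fetchall()
    unexpected_in_target = conn.execute(
        f"SELECT {cols} FROM {target_table} EXCEPT SELECT {cols} FROM {source_table}"
    ).fetchall()
    return missing_in_target, unexpected_in_target

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 99.0), (3, 30.0);
""")
missing, unexpected = find_discrepancies(conn, "stg_orders", "dw_orders", ["order_id", "amount"])
print(missing)     # [(2, 20.0)]  -- source row the pipeline mangled
print(unexpected)  # [(2, 99.0)]  -- corrupted value that landed in the warehouse
```

In practice a test engineer would wrap checks like this in a pytest suite and assert that both result lists are empty after each ETL run.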
Posted 15 hours ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Summary The Senior Consultant will integrate and map customer data from client source system(s) to our industry-leading platform. The role will include, but is not limited to: Using strong technical data migration, scripting, and organizational skills to ensure the client data is converted efficiently and accurately to the organization's platform. Performing extract, transform, load (ETL) activities to ensure accurate and timely data conversions. Providing in-depth research and analysis of complex scenarios to develop innovative solutions that meet customer needs while remaining within project governance. Mapping and maintaining business requirements to the solution design using tools such as requirements traceability matrices (RTM). Presenting findings, requirements, and problem statements for ratification by stakeholders and working groups. Identifying and documenting data gaps to allow change impact and downstream impact analysis to be conducted. Work Mode – Hybrid (2-3 days a week working from the office) Shift Timings – 2 PM to 11 PM IST (should be flexible to EST hours when required) Qualifications 6+ years of experience assessing data and analytic requirements to establish mapping rules from source to target systems to meet business objectives. Experience with real-time, batch, and ETL processing for complex data conversions. Working knowledge of extract, transform, load (ETL) methodologies and tools such as Talend, Dell Boomi, etc. Ability to use data mapping tools to prepare data for data loads based on target system specifications. Working experience with various data applications/systems such as Oracle SQL, Excel, .csv files, etc. Strong SQL scripting experience. Ability to communicate with clients and/or the Project Manager to scope, develop, test, and implement conversions/integrations. Ability to communicate effectively with Project Managers and customers to keep projects on target. Drive for continual improvement of the data migration process.
Collaborate via phone and email with clients and/or the Project Manager throughout the conversion/integration process. Demonstrated collaboration and problem-solving skills. Working knowledge of software development lifecycle (SDLC) methodologies, including but not limited to Agile and Waterfall. Clear understanding of cloud and application integrations. Ability to work independently, prioritize tasks, and manage multiple tasks simultaneously. Ensure the client's data is converted/integrated accurately and within deadlines established by the Project Manager. Experience in customer SIT, UAT, migration, and go-live support.
Posted 16 hours ago
4.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: ETL Test Engineer Experience range: 4-10 years Location: Hyderabad only Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL - Expert-level knowledge of core SQL concepts and querying.
3. ETL automation - Experience in Datagap; experience with tools like Informatica, Talend, and Ab Initio is good to have.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts - OLAP vs. OLTP - and of deploying applications on cloud servers.
7. Preferably a good understanding of the design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA - Hands-on experience with a test management tool, preferably ADO or JIRA.
9. Agile concepts - Good understanding of agile methodology (Scrum, Lean, etc.).
10. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.
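The query-optimization skill in point 4 can be illustrated generically: plan the same query before and after adding an index on the filtered column. SQLite is used here purely for portability; warehouse engines expose analogous EXPLAIN output, and the table name is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(i, "APAC" if i % 2 else "EMEA", i * 1.5) for i in range(10_000)],
)

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN reports whether SQLite will scan the table
    # or use an index to satisfy the WHERE clause.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
before = plan(query)  # no index yet: the plan is a full table scan
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = plan(query)   # the planner now searches via idx_sales_region
print(before)
print(after)
```

Reading plans like this is the first step of the optimization work the posting asks for; rewriting the query or adjusting indexes follows from what the plan reveals.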
Posted 17 hours ago
15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Details: Job Description Experience: 15+ years. Requires knowledge of Talend, as well as other data-related tools such as Databricks or Snowflake. The Senior Talend Developer/Architect is responsible for leading the design and development of, and managing, the INSEAD data infrastructure for the CRM ecosystem, developing Talend jobs and flows, and acting as a mentor for the other 3-4 Talend developers. This role will be instrumental in driving data pipeline architecture and ensuring data integrity, performance, and scalability using the Talend platform. The role is a key part of the HARMONIA project team while the engagement is active, and is also part of the Harmonia Data Quality project and Data Operations scrum teams. It will contribute to additional activities such as data modelling and design, architecture, and integration, and will propose technology strategy. The position holder must organize and plan her/his work with particular attention to a frictionless flow of information between Digital Solutions and the relevant business department. He/she will collaborate closely with cross-functional teams to deliver high-quality data solutions that support strategic business objectives. Job Requirements Details: Design, develop, and deploy scalable ETL/ELT solutions using Talend (e.g., Data Stewardship, Management Console, Studio). Architect end-to-end data integration workflows. Establish development best practices, reusable components, and job templates to optimize performance and maintainability. Deliver robust data architecture and tested, validated, production-deployable jobs/flows, following Talend best practices and the established JIRA development framework. Translate business requirements into efficient and scalable Talend solutions, providing developer input and feedback on those requirements wherever necessary by actively leading brainstorming sessions arranged by the project manager.
Work closely with the Manager of Data Operations and Quality, project manager, business analysts, data analysts, developers, and other subject matter experts to align technical solutions with operational needs. Ensure alignment with data governance, security, and compliance standards. Ensure that new developments follow the styles already established in the current Talend jobs and flows developed by the current integrator and INSEAD teams. Actively participate in project-related activities and ensure the SDLC process is followed. Participate in the implementation and execution of data cleansing, normalization, deduplication, and transformation projects. Conduct performance tuning, error handling, monitoring, and troubleshooting of Talend jobs and environments. Contribute to sprint planning and agile ceremonies with the Harmonia Project Team and Data Operations Team. Document technical solutions, data flows, and design decisions to support operational transparency. Stay current with Talend product enhancements and industry trends, recommending upgrades or changes where appropriate. No budget responsibility. Personnel responsibility: Provide technical mentorship to junior Talend developers and contribute to developing the internal knowledge base (INSEAD and external).
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a highly skilled and experienced Snowflake Developer to join our dynamic data engineering team. The ideal candidate will have 6 to 10 years of hands-on experience in Snowflake development, data warehousing, and cloud-based data solutions. This role offers the opportunity to work on cutting-edge projects in a collaborative and fast-paced environment. Key Responsibilities: Design, develop, and optimize Snowflake data pipelines and data models. Implement data integration solutions using Snowflake and other ETL tools. Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Ensure data quality, performance tuning, and efficient data storage practices. Develop and maintain documentation for data processes and architecture. Required Skills & Qualifications: 6–10 years of experience in data engineering or data warehousing. Strong expertise in Snowflake including SnowSQL, Snowpipe, and performance tuning. Proficiency in SQL and experience with ETL tools like Informatica, Talend, or Matillion. Experience with cloud platforms (AWS, Azure, or GCP). Familiarity with data modeling concepts and best practices. Excellent problem-solving and communication skills. Preferred Qualifications: Snowflake certification(s) is a plus. Experience with CI/CD pipelines and version control tools like Git. Knowledge of Python or other scripting languages. Location: This position is open for candidates based in Chennai or Hyderabad . Why Join Us? Work with a passionate and innovative team. Opportunity to work on large-scale, high-impact data projects. Competitive compensation and benefits. Flexible work culture and continuous learning environment.
Posted 1 day ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Description Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve. Pay And Benefits Competitive compensation, including base pay and annual incentive Comprehensive health and life insurance and well-being benefits, based on location Pension / Retirement benefits Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee). The Impact You Will Have In This Role The Enterprise Intelligence Lead will be responsible for building data pipelines using their deep knowledge of Talend, SQL, and data analysis on the bespoke Snowflake data warehouse for Enterprise Intelligence. This role will be in the Claw Team within Enterprise Data & Corporate Technology (EDCT). The Enterprise Intelligence team maintains the firm's business intelligence tools and data warehouse.
Your Primary Responsibilities Working on and leading engineering and development focused projects from start to finish with minimal supervision Providing technical and operational support for our customer base as well as other technical areas within the company that utilize Claw Risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk and audit related objectives Administrative functions for our tools, such as keeping the tool documentation current and handling service requests Participate in user training to increase awareness of Claw Ensuring incident, problem, and change tickets are addressed in a timely fashion, and escalating technical and managerial issues Following DTCC's ITIL process for incident, change, and problem resolution Qualifications Minimum of 8 years of related experience Bachelor's degree preferred, or equivalent experience Talents Needed For Success Must have experience in Snowflake or SQL Minimum of 5 years of related data warehousing work experience 5+ years managing data warehouses in a production environment, covering all phases of lifecycle management: planning, design, deployment, upkeep, and retirement Strong understanding of star/snowflake schemas and data integration methods and tools Moderate to advanced competency with Windows and Unix-like operating system principles Developed competencies around essential project management, communication (oral, written), and personal effectiveness Working experience with MS Office tools such as Outlook, Excel, PowerPoint, Visio, and Project Optimize/tune source streams, queries, and Powerbase dashboards Good knowledge of the technical components of Claw (i.e. Snowflake, Talend, PowerBI, PowerShell, Autosys) Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company.
We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. About Us With over 50 years of experience, DTCC is the premier post-trade market infrastructure for the global financial services industry. From 20 locations around the world, DTCC, through its subsidiaries, automates, centralizes, and standardizes the processing of financial transactions, mitigating risk, increasing transparency, enhancing performance and driving efficiency for thousands of broker/dealers, custodian banks and asset managers. Industry owned and governed, the firm innovates purposefully, simplifying the complexities of clearing, settlement, asset servicing, transaction processing, trade reporting and data services across asset classes, bringing enhanced resilience and soundness to existing financial markets while advancing the digital asset ecosystem. In 2024, DTCC’s subsidiaries processed securities transactions valued at U.S. $3.7 quadrillion and its depository subsidiary provided custody and asset servicing for securities issues from over 150 countries and territories valued at U.S. $99 trillion. DTCC’s Global Trade Repository service, through locally registered, licensed, or approved trade repositories, processes more than 25 billion messages annually. To learn more, please visit us at www.dtcc.com or connect with us on LinkedIn , X , YouTube , Facebook and Instagram . DTCC proudly supports Flexible Work Arrangements favoring openness and gives people freedom to do their jobs well, by encouraging diverse opinions and emphasizing teamwork. 
When you join our team, you’ll have an opportunity to make meaningful contributions at a company that is recognized as a thought leader in both the financial services and technology industries. A DTCC career is more than a good way to earn a living. It’s the chance to make a difference at a company that’s truly one of a kind. Learn more about Clearance and Settlement by clicking here . About The Team IT Architecture and Enterprise Services are responsible for enabling digital transformation of DTCC. The group manages complexity of the technology landscape within DTCC and enhances agility, robustness and security of the technology footprint. It does so by serving as the focal point for all technology architectural activities in the organization as well as engineering a portfolio of foundational technology assets to enable our digital transformation.
Posted 2 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description At least 6-14+ years of Maximo technical experience Experience with Maximo 7.x Must have experience generating precise functional and technical designs and a data migration strategy for the Maximo Asset Management, Work Management, Purchasing, Spatial, and Inventory modules Define application configuration design, data migration strategy, and integration designs, and prepare the necessary documentation Extensive experience with Maximo data conversion/migration using MX Data Loader, Talend, and scripts Experience with Maximo integration technologies (MIF, MEA, Object Structures, services, channels, etc.), user exit classes, XSLT/XML, SOAP/REST APIs, etc. Experience with Maximo customization technologies (JavaScript, Python/Jython, Java/J2EE, SQL) Experience with WebSphere 8.5/9.0 application server administration (Linux environment preferred) Experience with BIRT reports Knowledge of MAS and MAS Functional/Technical Certification would be an added advantage Job Summary We are seeking a Sr. Analyst MAP with 6 to 8 years of experience to join our team. The ideal candidate will have expertise in IBM Maximo Asset Management and will work in a hybrid model. This role involves day shifts and does not require travel. The candidate will play a crucial role in managing and analyzing geospatial data to support our asset management initiatives. Responsibilities Manage and analyze geospatial data to support asset management initiatives. Oversee the implementation and maintenance of IBM Maximo Asset Management systems. Provide technical expertise in geospatial data analysis and asset management. Collaborate with cross-functional teams to ensure data accuracy and integrity. Develop and maintain geospatial databases and related systems. Conduct regular audits to ensure data quality and compliance with industry standards. Create detailed reports and visualizations to support decision-making processes. Train and support team members in the use of geospatial tools and technologies.
Identify and address any issues related to geospatial data and asset management. Ensure that all geospatial data is up-to-date and accurately reflects current asset conditions. Work closely with stakeholders to understand their geospatial data needs and requirements. Contribute to the development of best practices and standards for geospatial data management. Stay updated with the latest trends and advancements in geospatial technologies. Qualifications Possess a strong background in IBM Maximo Asset Management with hands-on experience. Demonstrate expertise in geospatial data analysis and management. Have excellent problem-solving and analytical skills. Show proficiency in creating detailed reports and visualizations. Exhibit strong communication and collaboration skills. Be familiar with industry standards and best practices for geospatial data management. Have the ability to train and support team members in geospatial tools. Stay updated with the latest trends in geospatial technologies. Be detail-oriented and ensure data accuracy and integrity. Have experience in conducting data audits and ensuring compliance. Be able to work in a hybrid model with day shifts. Demonstrate the ability to work closely with stakeholders. Show a commitment to continuous learning and improvement. Certifications Required Certified Maximo Asset Management Professional GIS Certification
Posted 2 days ago
5.0 - 10.0 years
22 - 25 Lacs
Hyderabad
Work from Office
Role & responsibilities - We are seeking a talented data modeler to assist with the design and implementation of company databases. Mandatory skills - database management, SQL, NoSQL, data modelling, ERwin, IBM Data Architect, ETL tools (Talend, Apache, Microsoft SSIS), big data fundamentals (Hadoop, Athena, and Redshift), and cloud computing on AWS (must-have). As a data modeler, you will work closely with data architects and data analysts to implement data modeling solutions that streamline and support enterprise information management. To ensure success as a data modeler, you should have in-depth knowledge of data warehousing, as well as expert communication skills. Ultimately, a top-notch data modeler should be able to design models that reduce data redundancy, streamline data movements, and improve enterprise information management. Collaborate with business analysts and stakeholders to understand business processes and requirements, translating them into data modelling solutions. Design and develop logical and physical data models that effectively capture the granularity of data necessary for analytical and reporting purposes. Migrate and optimize existing data models from traditional on-premises data stores to Azure/Databricks cloud environments, ensuring scalability and performance. Establish data modelling standards and best practices to maintain the integrity and consistency of the data architecture. Work closely with data engineers and BI developers to ensure that the data models support the needs of analytical and operational reporting. Conduct data profiling and analysis to understand data sources, relationships, and quality, informing the data modelling process. Continuously evaluate and refine data models to accommodate evolving business needs and to leverage new data modelling techniques and cloud capabilities.
Document data models, including entity-relationship diagrams, data dictionaries, and metadata, to provide clear guidance for development and maintenance. Provide expertise in data modelling and data architecture to support the development of data governance policies and procedures. Preferred candidate profile - We are a startup and expect each team member to wear multiple hats, take initiative, and spot and solve problems. Candidates with a healthcare reimbursement background are preferred. Bachelor's degree in computer science, information technology, or a similar field. 5 years of hands-on experience with physical and relational data modeling. Expert knowledge of metadata management and related tools. Knowledge of mathematical foundations and statistical analysis. Strong interpersonal skills. Experience with team management. Excellent communication and presentation skills. Advanced troubleshooting skills. Knowledge, Skills, Behaviors: Minimum of 5 years of experience in data modelling, with a strong background in both traditional RDBMS and modern cloud-based data platforms. Proficiency in SQL and experience with data modelling tools (e.g., ER/Studio, ERwin, PowerDesigner). Familiarity with Azure cloud services, Databricks, and other big data technologies. Understanding of data warehousing concepts, including dimensional modelling, star schemas, and snowflake schemas. Ability to translate complex business requirements into effective data models that support analytical and reporting functions. Strong analytical skills and attention to detail. Excellent communication and collaboration abilities, with the capacity to engage with both technical and non-technical stakeholders.
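The dimensional-modelling concepts this role calls for (star schemas, fact and dimension tables) can be illustrated with a small self-contained example. This is a hedged sketch using an in-memory SQLite database; all table and column names are invented for illustration and do not come from any real project:

```python
import sqlite3

# Minimal star-schema sketch: one fact table with foreign keys into two
# dimension tables. Table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date  (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_claim (claim_key INTEGER PRIMARY KEY, claim_type TEXT);
CREATE TABLE fact_reimbursement (
    date_key  INTEGER REFERENCES dim_date(date_key),
    claim_key INTEGER REFERENCES dim_claim(claim_key),
    amount    REAL
);
INSERT INTO dim_date  VALUES (20250101, '2025-01-01');
INSERT INTO dim_claim VALUES (1, 'outpatient'), (2, 'pharmacy');
INSERT INTO fact_reimbursement VALUES
    (20250101, 1, 1200.0), (20250101, 2, 300.0), (20250101, 1, 800.0);
""")
# Typical analytical query: aggregate the fact table by a dimension attribute.
cur.execute("""
SELECT c.claim_type, SUM(f.amount)
FROM fact_reimbursement f JOIN dim_claim c ON f.claim_key = c.claim_key
GROUP BY c.claim_type ORDER BY c.claim_type
""")
print(cur.fetchall())
```

The design choice the star schema makes is to keep the fact table narrow (keys plus measures) so analytical aggregations stay simple joins, which is exactly the redundancy-reducing, reporting-friendly modelling the posting describes.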
Posted 2 days ago
12.0 years
3 - 6 Lacs
Chandigarh
On-site
Job Summary As a key contributor to our ERP Transformation Services team, the Senior ETL Data Migration Analyst is responsible for owning the design, development, and execution of enterprise-wide data migration activities. This role is instrumental in the success of global ERP implementations, primarily Oracle EBS and SAP ECC, by ensuring consistent, auditable, and high-quality data migration processes using industry-standard tools and frameworks. In This Role, Your Responsibilities Will Be: Pre-Go-Live: Planning & Development Design and implement global data migration strategies for Oracle and SAP ERP projects. Develop ETL processes using Syniti DSP / SKP or an equivalent tool to support end-to-end migration. Collaborate with legacy system teams to extract and analyze source data. Build workflows for data profiling, cleansing, enrichment, and transformation. Ensure auditability and traceability of migrated data, aligned with compliance and governance standards. Go-Live & Cutover Execution Support mock loads, cutover rehearsals, and production data loads. Monitor data load progress and resolve issues related to performance, mapping, or data quality. Maintain a clear log of data migration actions and reconcile with source systems. Post-Go-Live: Support & Stewardship Monitor data creation and updates to ensure business process integrity post go-live. Provide data extract/load services for ongoing master data maintenance. Contribute to legacy data archiving strategies, tools, and execution. Tools, Documentation & Collaboration Maintain documentation of ETL procedures, technical specifications, and data lineage. Partner with implementation teams to translate business requirements into technical solutions. Contribute to the development and refinement of ETL frameworks and reusable components.
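The reconciliation duty above (keeping a clear log of migration actions and reconciling with source systems) commonly follows one pattern: compare keys and per-row checksums between source and target extracts. Below is a minimal sketch under assumed data shapes; the record keys and fields are hypothetical, and a real migration would reconcile against the actual ERP load tables:

```python
import hashlib

# Sketch of source-to-target reconciliation for a data migration:
# compare key sets and per-key row checksums between legacy and target.
def row_fingerprint(row):
    # Stable checksum of a row's values, used to detect silent data drift.
    joined = "|".join(str(v) for v in row.values())
    return hashlib.sha256(joined.encode()).hexdigest()

def reconcile(source, target):
    # Both inputs: dict mapping primary key -> row dict.
    missing = sorted(set(source) - set(target))
    extra   = sorted(set(target) - set(source))
    changed = sorted(k for k in set(source) & set(target)
                     if row_fingerprint(source[k]) != row_fingerprint(target[k]))
    return {"missing": missing, "extra": extra, "changed": changed}

source = {"M-001": {"desc": "Bearing", "uom": "EA"},
          "M-002": {"desc": "Gasket",  "uom": "EA"}}
target = {"M-001": {"desc": "Bearing", "uom": "EA"},
          "M-002": {"desc": "Gasket",  "uom": "BOX"}}  # transformed differently
print(reconcile(source, target))
```

The output of such a check is exactly what feeds the migration log and the audit trail the posting asks for: which records were dropped, which appeared unexpectedly, and which were altered in transit.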
Travel Requirements Willingness to travel up to 20% for project needs, primarily during key implementation phases. Who You Are: You show a tremendous amount of initiative in tough situations; you have strong analytical and problem-solving skills. You are self-motivated, accountable, and proactive in learning and applying new technologies. You possess superb communication and collaboration skills across global teams. For This Role, You Will Need: 12+ years of IT experience with a focus on ETL, data management, and ERP data migration. Strong hands-on experience with Oracle EBS or SAP ECC implementations. Proficiency in Syniti DSP, Informatica, Talend, or similar enterprise ETL tools. Proficient SQL skills, with the ability to write and optimize queries for large datasets. Demonstrable track record in data profiling, cleansing, and audit trail maintenance. Academic background in MCA / BE / BSC - Computer Science, Engineering, Information Systems, or Business Administration. Proven application development experience in .NET, ABAP, or scripting languages. Familiarity with data migration implementations and data modeling principles. Knowledge of project management methodologies (Agile, PMP, etc.). Performance Indicators: Successful execution of data migration cutovers with minimal errors. Complete data traceability and audit compliance from source to target. Timely delivery of ETL solutions and reports per project phases. Continuous improvement and reuse of ETL frameworks and standard processes. Our Culture & Commitment to You: At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives, because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive.
Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams working together are key to driving growth and delivering business results. We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.
WHY EMERSON
Our Commitment to Our People
At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration. We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world's most complex problems, for our customers, our communities, and the planet. You'll contribute to this vital work while further developing your skills through our award-winning employee development programs. We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor. At Emerson, you'll see firsthand that our people are at the center of everything we do. So, let's go. Let's think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let's go, together. Accessibility Assistance or Accommodation If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com .
ABOUT EMERSON Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical and advanced factory automation operate more sustainably while improving productivity, energy security and reliability. With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources and enhance their safety. We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. Whether you’re an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you’ll find your chance to make a difference with Emerson. Join our team – let’s go! No calls or agencies please.
Posted 2 days ago
5.0 years
0 Lacs
Hyderābād
On-site
Job Information
Date Opened: 06/19/2025 | Job Type: Full time | Work Experience: 5+ years | Industry: IT Services | City: Hyderabad | State/Province: Telangana | Country: India | Zip/Postal Code: 500032
Job Description
As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance.
Key Responsibilities:
1. Governance Strategy & Stakeholder Alignment: Develop and maintain enterprise data governance strategies, policies, and standards. Align governance with business goals: compliance, analytics, and decision-making. Collaborate across business, IT, legal, and compliance teams for role alignment. Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation: Manage Microsoft Purview accounts, collections, and RBAC aligned to the org structure. Optimize the Purview setup for large-scale environments (50TB+). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake. Schedule scans, set up classification jobs, and maintain collection hierarchies.
3. Metadata & Lineage Management: Design metadata repositories and maintain business glossaries and data dictionaries. Implement ingestion workflows via ADF, REST APIs, PowerShell, and Azure Functions. Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis.
4. Data Classification & Security Governance: Define classification rules and sensitivity labels (PII, PCI, PHI). Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager. Enforce records management, lifecycle policies, and information barriers.
5. Data Quality & Policy Management: Define KPIs and dashboards to monitor data quality across domains. Collaborate on rule design, remediation workflows, and exception handling. Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship: Maintain the business glossary with domain owners and stewards in Purview. Enforce approval workflows, standard naming, and steward responsibilities. Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration: Automate governance processes using PowerShell, Azure Functions, and Logic Apps. Create pipelines for ingestion, lineage, glossary updates, and tagging. Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance: Set up dashboards for audit logs, compliance reporting, and metadata coverage. Oversee data lifecycle management across its phases. Support internal and external audit readiness with proper documentation.
Requirements
7+ years of experience in data governance and data management. Proficient in Microsoft Purview and Informatica data governance tools. Strong in metadata management, lineage mapping, classification, and security. Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools. Knowledge of GDPR, CCPA, HIPAA, SOX, and related compliance needs. Skilled in bridging technical governance with business and compliance goals.
Benefits
1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits: L&D with FLEX - Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
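The classification responsibility in this role (defining classification rules and sensitivity labels for PII) is typically rule-driven. Below is a deliberately simplified sketch of rule-based tagging; the patterns are illustrative only and far weaker than the built-in classifiers a tool like Microsoft Purview provides:

```python
import re

# Sketch of rule-based sensitivity classification (PII tagging).
# Patterns are simple illustrations, not production-grade detectors.
RULES = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PAN":   re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),  # Indian PAN number format
}

def classify(value):
    # Return the sorted list of sensitivity labels whose rule matches.
    return sorted(label for label, rx in RULES.items() if rx.search(value))

print(classify("reach me at shivani@example.com"))  # ['EMAIL']
print(classify("PAN ABCDE1234F on file"))           # ['PAN']
print(classify("no sensitive content here"))        # []
```

In a governance platform the same idea scales out: each label carries a rule set, scans apply the rules across scanned assets, and the resulting labels drive DLP and lifecycle policies.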
Posted 2 days ago
7.0 years
3 - 6 Lacs
Hyderābād
On-site
As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance.
Key Responsibilities:
1. Governance Strategy & Stakeholder Alignment: Develop and maintain enterprise data governance strategies, policies, and standards. Align governance with business goals: compliance, analytics, and decision-making. Collaborate across business, IT, legal, and compliance teams for role alignment. Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation: Manage Microsoft Purview accounts, collections, and RBAC aligned to the org structure. Optimize the Purview setup for large-scale environments (50TB+). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake. Schedule scans, set up classification jobs, and maintain collection hierarchies.
3. Metadata & Lineage Management: Design metadata repositories and maintain business glossaries and data dictionaries. Implement ingestion workflows via ADF, REST APIs, PowerShell, and Azure Functions. Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis.
4. Data Classification & Security Governance: Define classification rules and sensitivity labels (PII, PCI, PHI). Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager. Enforce records management, lifecycle policies, and information barriers.
5. Data Quality & Policy Management: Define KPIs and dashboards to monitor data quality across domains. Collaborate on rule design, remediation workflows, and exception handling. Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship: Maintain the business glossary with domain owners and stewards in Purview. Enforce approval workflows, standard naming, and steward responsibilities. Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration: Automate governance processes using PowerShell, Azure Functions, and Logic Apps. Create pipelines for ingestion, lineage, glossary updates, and tagging. Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance: Set up dashboards for audit logs, compliance reporting, and metadata coverage. Oversee data lifecycle management across its phases. Support internal and external audit readiness with proper documentation.
Requirements
7+ years of experience in data governance and data management. Proficient in Microsoft Purview and Informatica data governance tools. Strong in metadata management, lineage mapping, classification, and security. Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools. Knowledge of GDPR, CCPA, HIPAA, SOX, and related compliance needs. Skilled in bridging technical governance with business and compliance goals.
Benefits
1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits: L&D with FLEX - Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
Posted 2 days ago
4.0 - 5.0 years
0 Lacs
Bengaluru
On-site
Job Title: ETL Tester Experience: 4-5 years We are looking for a highly skilled and detail-oriented ETL Tester with 4-5 years of hands-on experience in validating data pipelines, ETL processes, and data warehousing systems. The ideal candidate will have a strong understanding of data extraction, transformation, and loading processes, and will be responsible for ensuring data accuracy, completeness, and quality across various systems.
Key responsibilities:
- Review and understand ETL requirements and source-to-target mappings.
- Develop and execute comprehensive test cases, test plans, and test scripts for ETL processes.
- Validate data accuracy, transformations, and data flow between source and target systems.
- Perform data validation, data reconciliation, and back-end/database testing using SQL.
- Identify, document, and track defects using tools like JIRA or HP ALM.
- Work closely with developers, business analysts, and data engineers to resolve issues.
- Automate testing processes where applicable using scripting or ETL testing tools.
Required Skills:
- 4-5 years of hands-on experience in ETL testing.
- Strong SQL skills for writing complex queries and performing data validation.
- Experience with ETL tools (e.g., Informatica, SSIS, Talend).
- Familiarity with data warehousing concepts and data migration projects.
- Proficiency in defect tracking and test management tools (e.g., JIRA, ALM, TestRail).
- Knowledge of automation frameworks or scripting for ETL test automation is a plus.
- Good understanding of Agile/Scrum methodology.
Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience in cloud-based data platforms (AWS, Azure, GCP) is a plus.
- Exposure to reporting and BI tools (Tableau, Power BI) is an advantage.
Job Type: Full-time Schedule: Day shift / Morning shift Work Location: In person
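The SQL-based data validation and reconciliation duties above usually reduce to a handful of query patterns: row-count comparisons for completeness, and anti-joins to find rows that were dropped or altered by the transformation. A minimal sketch using an in-memory SQLite database; table and column names are invented:

```python
import sqlite3

# Sketch of SQL-based ETL validation: compare source and target tables
# for completeness (row counts) and accuracy (missing/altered rows).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 100.0), (2, 250.0), (3, 75.0);
INSERT INTO tgt_orders VALUES (1, 100.0), (2, 999.0); -- row 3 missing, row 2 altered
""")
# Completeness check: row counts must match.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
print("counts:", src_count, tgt_count)

# Accuracy check (anti-join): source rows missing or changed in the target.
cur.execute("""
SELECT s.order_id FROM src_orders s
LEFT JOIN tgt_orders t ON s.order_id = t.order_id AND s.amount = t.amount
WHERE t.order_id IS NULL ORDER BY s.order_id
""")
print("mismatches:", [r[0] for r in cur.fetchall()])
```

An ETL tester would run the same two checks against the real warehouse engine (Informatica/SSIS/Talend targets alike), then log each mismatch as a defect with the offending keys.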
Posted 2 days ago
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
These cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.
The average salary range for Talend professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
- Data Warehousing
- ETL (Extract, Transform, Load) processes
- SQL
- Big Data technologies (e.g., Hadoop, Spark)
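The ETL process listed above is the core pattern a tool like Talend automates: extract rows from a source, transform (cleanse and normalize) them, and load them into a target. A toy sketch with invented data, just to make the three stages concrete:

```python
# Minimal extract-transform-load sketch. All names and data are illustrative.
def extract():
    # In a real job this would read from a database, API, or file.
    return [{"name": " Asha ", "city": "ahmedabad"},
            {"name": "Ravi",   "city": "HYDERABAD"}]

def transform(rows):
    # Cleanse: trim whitespace, normalize casing.
    return [{"name": r["name"].strip(), "city": r["city"].title()} for r in rows]

def load(rows, target):
    # In a real job this would write to a warehouse table.
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

Talend jobs express the same three stages graphically (input components, transformation components such as tMap, output components), which is why SQL and data-warehousing fundamentals transfer directly to the tool.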
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!