
210 Data Transformation Jobs

JobPe aggregates listings for easy browsing, but applications are submitted directly on the original job portal.

4.0 - 6.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Minimum 3.5 years of experience with end-to-end MuleSoft integration (Anypoint Platform) across various systems and applications: SaaS, legacy systems, databases, and web services (SOAP & REST).
Knowledge of integration design patterns.
Hands-on experience with Mule connectors such as Salesforce, FTP, File, SFTP, IMAP, Database, and HTTP.
Experience developing middle-tier applications using Mule ESB (API and batch processing).
Experience with RDBMS SQL queries, functions, and stored procedures.
Strong knowledge of data transformations using MuleSoft DataWeave, and of exception handling.
Hands-on experience with Mule 4, RAML 1.0, Maven, MUnit, and the current version of MuleSoft Anypoint Studio; Anypoint Platform in a cloud, on-premises, or Runtime Fabric implementation.
Security, logging, auditing, policy management, and performance monitoring/KPIs for end-to-end process execution.
Experience with MuleSoft-Java integration; basic knowledge of Java.
Intermediate-level knowledge of web services technologies (XML, SOAP, REST, XSLT) and cloud APIs.
Basic knowledge of Salesforce in a cloud implementation.

Other Qualifications:
Familiarity with Agile (Scrum) project management methodology (nice to have).
Familiarity with Salesforce in a cloud implementation.
Familiarity with the Microsoft Office suite, including Visio and draw.io.

If you are interested, please share your updated resume along with the details below:
First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding any offer.

Posted 3 hours ago

Apply

3.0 - 7.0 years

11 - 16 Lacs

Gurugram

Work from Office


Project description
We are looking for a star Python Developer who is not afraid of hard work and challenges! Partnering with a well-known financial institution, we are assembling a team of professionals with a wide range of skills to successfully deliver business value to the client.

Responsibilities
Analyse existing SAS DI pipelines and SQL-based transformations.
Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark.
Develop and maintain scalable ETL pipelines using Python on AWS EMR.
Implement data transformation, cleansing, and aggregation logic to support business requirements.
Design modular and reusable code for distributed data processing tasks on EMR clusters.
Integrate EMR jobs with upstream and downstream systems, including AWS S3, Snowflake, and Tableau.
Develop Tableau reports for business reporting.

Skills
Must have:
6+ years of experience in ETL development, with at least 5 years working with AWS EMR.
Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
Proficiency in Python for data processing and scripting.
Proficiency in SQL and experience with one or more ETL tools (e.g., SAS DI, Informatica).
Hands-on experience with AWS services: EMR, S3, IAM, VPC, and Glue.
Familiarity with data storage systems such as Snowflake or RDS.
Excellent communication skills and ability to work collaboratively in a team environment.
Strong problem-solving skills and ability to work independently.
Nice to have: N/A
Other Languages: English B2 Upper Intermediate
Seniority: Senior
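As a loose illustration of the SAS-to-Python translation work this role describes, a SAS PROC SQL-style group-and-sum step might be re-expressed in plain Python as below. This is only a sketch: the field names are hypothetical, and in the actual PySpark pipelines the same step would be written against a DataFrame rather than a list of dicts.

```python
from collections import defaultdict

def group_total(rows, key_field, value_field):
    # Re-create a SAS PROC SQL "SELECT key, SUM(value) ... GROUP BY key" step.
    # The PySpark analogue would group the DataFrame by key_field and sum value_field.
    totals = defaultdict(float)
    for row in rows:
        totals[row[key_field]] += row[value_field]
    return dict(totals)

rows = [
    {"region": "North", "amount": 10.0},
    {"region": "South", "amount": 5.0},
    {"region": "North", "amount": 2.5},
]
print(group_total(rows, "region", "amount"))  # -> {'North': 12.5, 'South': 5.0}
```

The same shape generalizes to most of the aggregation logic such migrations carry over; the hard part in practice is verifying that each translated step reproduces the legacy output.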

Posted 6 hours ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office


6+ years of experience with solution scoping, development, and delivery using Winshuttle Studio, Composer, and Foundation.
Architect-level experience in data management (integration/migration and governance).
Experience in SAP implementations using Winshuttle as the data load tool into SAP S/4HANA.
Extensive work with Winshuttle Transaction, Winshuttle Query, and Winshuttle Direct.
Responsible throughout design, development, testing, loading, and reconciliation.
Experience with SAP data migration tools, ETL, and other data migration tools is an advantage. Should also be hands-on with development techniques.
Knowledge of data structures in the SAP FI/SD/MM modules and their dependencies is a must.
Responsible for data migration documentation and cut-over planning.
Experience leading a team across different process areas, geographies, workforces, and technical specialties.
Coordination with project stakeholders: business/data owners, SAP functional teams, SAP ABAP/Basis teams, ETL teams, legacy data teams, etc.
Rich experience in data transformation procedures, post-migration data validation procedures, data retention, and data security procedures.

Posted 7 hours ago

Apply

5.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Job Summary:
We are seeking a skilled and motivated Workday Developer with 3+ years of experience and a strong background in Workday Studio, Core Connectors, EIBs, and web service APIs. The ideal candidate will be responsible for designing, building, and reviewing Workday integrations in collaboration with Workday Functional Leads and business SMEs. This role is critical in delivering scalable, secure, and efficient solutions aligned with evolving business needs.

Key Responsibilities:
Understand business and functional requirements; design, build, test, and deploy Workday integrations using Studio, EIB, Core Connectors, and Workday APIs.
Analyze and translate functional specifications and change requests into detailed technical specifications.
Develop and maintain calculated fields, condition rules, business processes, custom reports, and security configurations.
Collaborate with functional leads and stakeholders to deliver end-to-end technical solutions.
Troubleshoot integration issues, perform root cause analysis, and provide ongoing production support.
Document system design, integration maps, deployment strategies, and change logs.
Assess the impact of Workday updates on existing integrations and configurations.
Support data migrations, audits, and compliance reporting activities.
Review and validate reports to ensure accuracy, security, and data privacy compliance.
Train end users on Workday reporting tools and available dashboards.
Adhere to established standards, documentation protocols, and change management processes.
Ensure compliance with Workday security and governance policies in all development activities.

Required Qualifications and Experience:
Bachelor's degree in Computer Science, Information Systems, MIS, or a related field (preferred).
3+ years of hands-on Workday integration development experience.
Proficiency in Workday Studio, EIB, Core Connectors, Web Services (SOAP/REST), and XSLT.
Strong understanding of Workday HCM modules such as Payroll, Benefits, Time Tracking, and Compensation.
Experience with Workday Report Writer and calculated fields.
Proficiency in XML, XSLT, JSON, and data transformation technologies.
Ability to translate business needs into technical solutions.
Strong analytical and problem-solving skills with the ability to thrive in a fast-paced environment.

Preferred Qualifications:
Workday Integration Certification (Studio or Core Connectors).
Experience working in Agile or Scrum-based development environments.
Exposure to Workday Prism, Extend, or People Experience.
Knowledge of HR business processes and compliance standards such as SOX and GDPR.

Technical / Soft Skills:
Quick learner with a strong aptitude for mastering report-writing tools.
Familiarity with finance-related reporting.
Proficiency in Microsoft Office tools (Word, Excel, PowerPoint).
Ability to consolidate data from multiple sources into comprehensive reports.
Strong problem-solving, troubleshooting, and analytical abilities.
Self-motivated with the ability to prioritize and execute tasks independently.
Demonstrated ability to meet deadlines and manage multiple priorities.
Excellent communication, presentation, and stakeholder management skills.

Work Environment:
Hybrid work setup with 3-4 days required in the office each week.
Collaborative culture with continuous professional development and learning opportunities.

Posted 8 hours ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office


Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.

We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.

Key Responsibilities:
Write clean, efficient code for data processing and transformation.
Debug and resolve technical issues.
Evaluate and review code to ensure quality and compliance.

Required Qualifications:
1+ year of Bash development experience.
Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts.
Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts.

Why Join Us:
Competitive pay (₹1000/hour). Flexible hours. Remote opportunity.
NOTE: Pay will vary by project and typically is up to Rs. .

Shape the future of AI with Soul AI!

Posted 9 hours ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.

We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.

Key Responsibilities:
Write clean, efficient code for data processing and transformation.
Debug and resolve technical issues.
Evaluate and review code to ensure quality and compliance.

Required Qualifications:
1+ year of Bash development experience.
Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts.
Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts.

Why Join Us:
Competitive pay (₹1000/hour). Flexible hours. Remote opportunity.
NOTE: Pay will vary by project and typically is up to Rs. .

Shape the future of AI with Soul AI!

Posted 9 hours ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Mumbai

Work from Office


Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.

We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.

Key Responsibilities:
Write clean, efficient code for data processing and transformation.
Debug and resolve technical issues.
Evaluate and review code to ensure quality and compliance.

Required Qualifications:
1+ year of Bash development experience.
Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts.
Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts.

Why Join Us:
Competitive pay (₹1000/hour). Flexible hours. Remote opportunity.
NOTE: Pay will vary by project and typically is up to Rs. .

Shape the future of AI with Soul AI!

Posted 9 hours ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Kolkata

Work from Office


Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.

We are seeking a skilled Linux System Administrator with a minimum of 1 year of Bash development experience to join us as a freelancer and contribute to impactful projects.

Key Responsibilities:
Write clean, efficient code for data processing and transformation.
Debug and resolve technical issues.
Evaluate and review code to ensure quality and compliance.

Required Qualifications:
1+ year of Bash development experience.
Strong scripting skills in Bash, with expertise in automating tasks, managing system processes, and creating shell scripts.
Experience with Linux/Unix environments, command-line tools, and integrating Bash scripts.

Why Join Us:
Competitive pay (₹1000/hour). Flexible hours. Remote opportunity.
NOTE: Pay will vary by project and typically is up to Rs. .

Shape the future of AI with Soul AI!

Posted 9 hours ago

Apply

9.0 - 14.0 years

17 - 30 Lacs

Noida

Hybrid


We are looking for a skilled Telecom Billing Mediation Specialist to manage and optimize the mediation process between network elements and the postpaid billing system. Connect with me over LinkedIn: https://www.linkedin.com/in/nitin-tushir-abc0048/
The ideal candidate will have a strong background in telecom mediation platforms, CDR (Call Detail Record) processing, billing integration, and data transformation. This role involves ensuring seamless data collection, processing, and delivery to downstream billing and revenue management systems.

What you will do:
Mediation System Management: Configure, monitor, and troubleshoot mediation systems for postpaid billing. Ensure accurate and timely collection, aggregation, and transformation of CDRs from multiple network elements. Implement rules for data filtering, deduplication, and enrichment before sending to the billing system.
Integration & Optimization: Work with network, IT, and billing teams to ensure smooth integration between mediation and billing platforms. Optimize mediation rules to handle high-volume CDR processing efficiently. Perform data reconciliation between network elements, mediation, and billing systems.
Issue Resolution & Performance Monitoring: Investigate and resolve discrepancies in mediation and billing data. Monitor system health, troubleshoot issues, and ensure high availability of mediation services. Conduct root cause analysis (RCA) for mediation-related issues and implement corrective actions.
Compliance & Reporting: Ensure adherence to regulatory, audit, and revenue assurance requirements. Generate reports on mediation performance, errors, and processed CDR volumes. Support fraud management and revenue assurance teams by providing mediation-related insights.

You will bring:
Technical Skills: Hands-on experience with billing mediation platforms (e.g., Amdocs Mediation, IBM, HP Openet, etc.). Proficiency in SQL, Linux/Unix scripting, and data transformation tools. Strong understanding of CDR structures and mediation rules configuration. Familiarity with ETL processes, data parsing, and API integrations.
Domain Knowledge: Solid understanding of telecom postpaid billing systems (e.g., Amdocs, HP, Oracle BRM). Knowledge of network elements (MSC, MME, SGSN, GGSN, PCRF, OCS, IN) and their impact on mediation. Awareness of revenue assurance and fraud detection in telecom billing.
Soft Skills: Strong problem-solving and analytical skills. Ability to work in a cross-functional team and communicate effectively with stakeholders.

Key Qualifications: Bachelor's degree in Computer Science, E.C.E., or Telecommunications. 10+ years of experience in telecom billing mediation. Experience with cloud-based mediation solutions (AWS, Azure, GCP) is a plus. Knowledge of 5G mediation and real-time charging architectures is an advantage.
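The deduplication rule described for the mediation layer can be sketched in a few lines of Python. This is a simplified illustration, not any vendor's implementation: the CDR field names (call_id, start_time) are hypothetical, and real mediation platforms apply such rules over streaming batches rather than in-memory lists.

```python
def dedupe_cdrs(cdrs):
    # Keep the first occurrence of each CDR, keyed on fields that identify a
    # unique call event; later duplicates (e.g., from re-sent network batches)
    # are dropped before the records reach the billing system.
    seen = set()
    unique = []
    for cdr in cdrs:
        key = (cdr["call_id"], cdr["start_time"])
        if key not in seen:
            seen.add(key)
            unique.append(cdr)
    return unique

batch = [
    {"call_id": "C1", "start_time": "2024-01-01T10:00:00", "duration": 60},
    {"call_id": "C1", "start_time": "2024-01-01T10:00:00", "duration": 60},  # duplicate
    {"call_id": "C2", "start_time": "2024-01-01T10:05:00", "duration": 30},
]
print(len(dedupe_cdrs(batch)))  # -> 2
```

Filtering and enrichment steps follow the same pattern: a per-record rule applied before the cleaned stream is handed to billing and reconciled against the source counts.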

Posted 1 day ago

Apply

4.0 - 6.0 years

3 - 5 Lacs

Bengaluru

Work from Office


Job Overview:
We are looking for an individual to join the Middle Office - Calypso Platform team as a System Analyst. The person should have a clear understanding of the system and its functionality, with real hands-on experience performing setups to onboard clients/funds for trading, settlement, and PnL report generation (non-operations). This role requires a solid understanding of capital markets across both exchange-traded and OTC markets. The ideal candidate should have at least 4 years of hands-on experience with platforms, preferably Calypso, and a total experience not exceeding 12 years. The role involves supporting various Middle Office functions, particularly funds onboarding, while collaborating closely with Client Service Managers in regions including Europe, Asia, and the US.

Key Responsibilities:
1. Provide daily support for the Calypso platform, assisting Middle Office (MO) users.
2. Manage fund onboarding tasks and related responsibilities.
3. Help develop Standard Operating Procedures (SOPs) for Operations.
4. Track and resolve issues related to the platform in a timely manner.
5. Assist with tasks related to trade booking, allocations, matching, settlement, and related activities.
6. Engage with both Front Office and Back Office setups, ensuring smooth workflow and message handling.
7. Collaborate with the team to ensure timely and accurate resolution of market data, trade processing, and scheduled tasks.
8. Work with tools such as Jira, Confluence, and Excel (including VBA, Macros, etc.) for process optimization and tracking.

Required Skills & Experience:
1. 4-12 years of experience in financial services, with direct experience using the Calypso platform or a similar system for FO & BO setups.
2. Expertise in asset classes such as Commodities, Equity Derivatives, Credit Derivatives, Exotic Structures, Fixed Income, Futures, FX (Foreign Exchange), FX Options, Interest Rate Derivatives, and Money Markets.
3. Knowledge of financial products including Bonds, Repo, TRS, Equity Swaps, CDS, CDX, Futures, Options, and Equities, along with their confirmation, settlement, and P&L tracking.
4. Strong data transformation and analysis skills.
5. Proficiency in advanced Excel functions, including VBA and Macros.
6. Excellent problem-solving abilities with critical and objective thinking.
7. Outstanding communication and interpersonal skills to work effectively with teams across different regions.

Qualifications:
1. Postgraduate degree in Commerce, MBA in Finance, or professional qualifications such as CA/CMA/CFA.

Posted 1 day ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office


As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values diverse voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII
At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from diverse backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

Pyramid overview
A role with Target Data Science & Engineering means the chance to help develop and manage state-of-the-art predictive algorithms that use data at scale to automate and optimize decisions. Whether you join our Statistics, Optimization, or Machine Learning teams, you'll be challenged to harness Target's impressive data breadth to build the algorithms that power solutions our partners in Marketing, Supply Chain Optimization, Network Security, and Personalization rely on.

Team Overview
Global Supply Chain and Logistics at Target is evolving at an incredible pace. We are constantly reimagining how we get the right product to our guests better, faster, and more cost-effectively than before across 1,900 locations. Our Supply Chain Data Science team oversees the development of state-of-the-art mathematical techniques to help solve important problems for Target's supply chain, e.g., identifying the optimal quantities and positioning of inventory across multiple channels and locations, planning for the right mix of inventory investments vs. guest experience, digital order fulfillment planning, transportation resource planning, etc. As a Data Scientist in the Digital Fulfillment Planning space, you will have the opportunity to work with Product, Tech, and business partners to solve retail challenges at scale for our fulfillment network.

Position Overview
As a Senior Data Scientist at Target, you will get an opportunity to design, develop, deploy, and maintain data science models and tools. You'll work closely with applied data scientists, data analysts, and business partners to continuously learn and understand evolving business needs. You'll also collaborate with engineers and data scientists on peer teams to build and productionize fulfillment solutions for our supply chain/logistics needs. In this role, you will:
Develop a strong understanding of business and operational processes within Target's supply chain.
Develop an in-depth understanding of the various systems and processes that influence digital order fulfillment speed and costs.
Analyze large datasets for insights leading to business process improvements or solution development.
Develop optimization-based solutions and approximate mathematical models (probabilistic/deterministic) of real-world phenomena, build predictive models, and implement them in real production systems with measurable impact.
Work with the team to build and maintain complex software systems and tools.
Add new capabilities and features to the simulation framework to reflect the complexities of an evolving digital fulfillment network.
Develop and deploy modules to run simulations for testing and validating multiple scenarios to evaluate the impact of various fulfillment strategies.
Enhance and maintain the simulation environment to enable testing/deploying new features for running custom, user-defined scenarios through the simulator.
Adopt modular architecture and good software development/engineering practices to enhance overall product performance and guide other team members.
Produce clean, efficient code based on specifications.
Coordinate the analysis, troubleshooting, and resolution of issues in the models and software.
Ensure your Target Career Profile is up to date before applying for this position.

About You
Must Haves:
Bachelor's/Master's/PhD in Mathematics, Statistics, Operations Research, Industrial Engineering, Computer Science, or a related quantitative field.
3+ years of direct, hands-on experience building optimization models (e.g., linear, mixed-integer, network flow, or combinatorial).
Strong proficiency in Python and Spark for developing and deploying optimization-based solutions.
Solid understanding of operations research techniques and their application to real-world business problems.
Demonstrated experience working on end-to-end solution delivery, preferably with production-grade implementation.
Strong verbal and written communication skills; able to translate technical detail into business insight, and vice versa.
Strong analytical thinking, data transformation, and problem-solving ability, especially under ambiguity.
Team player with the ability to collaborate effectively across cross-functional, geographically distributed teams.

Preferred Experience:
Experience building large-scale optimization models and the ability to build models with smart heuristic solutions.
Familiarity with supply chain or e-commerce fulfillment data and business processes.
Experience working with large datasets.
Experience deploying solutions at scale for business impact.
Background in clean code practices, version control (Git), and collaborative development environments.

Know more here:
Life at Target - https://india.target.com/
Benefits - https://india.target.com/life-at-target/workplace/benefits
Culture - https://india.target.com/life-at-target/belonging
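To give a flavor of the fulfillment-assignment decisions this team models, here is a deliberately tiny greedy sketch: ship each order from the cheapest node that still has stock. Everything here is hypothetical (node names, costs, the greedy rule itself); production systems solve this as a mixed-integer or network-flow optimization, not a greedy pass.

```python
def assign_orders(orders, nodes):
    # Toy model of digital-order fulfillment: for each (order_id, item), pick
    # the node with the lowest shipping cost that still has inventory, then
    # decrement that node's stock. A real solver would optimize jointly.
    assignments = {}
    for order_id, item in orders:
        candidates = [n for n in nodes if nodes[n]["stock"].get(item, 0) > 0]
        if not candidates:
            assignments[order_id] = None  # unfulfillable in this toy model
            continue
        best = min(candidates, key=lambda n: nodes[n]["ship_cost"])
        nodes[best]["stock"][item] -= 1
        assignments[order_id] = best
    return assignments

nodes = {
    "store_blr": {"ship_cost": 2.0, "stock": {"sku1": 1}},
    "dc_hyd": {"ship_cost": 5.0, "stock": {"sku1": 10, "sku2": 3}},
}
orders = [("o1", "sku1"), ("o2", "sku1"), ("o3", "sku2")]
print(assign_orders(orders, nodes))
# -> {'o1': 'store_blr', 'o2': 'dc_hyd', 'o3': 'dc_hyd'}
```

The gap between this greedy heuristic and a jointly optimized assignment is exactly the kind of trade-off (speed vs. cost vs. inventory position) the simulation framework described above is built to evaluate.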

Posted 2 days ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Kolkata, Hyderabad, Pune

Work from Office


Job Title: ETL QA Tester
Job Summary:
We are looking for an experienced ETL tester to ensure the quality and integrity of our data processing and reporting systems. The ideal candidate will have a strong background in ETL processes and data warehousing, and experience with Snowflake and Tableau. This role involves designing and executing test plans, identifying and resolving data quality issues, and collaborating with development teams to enhance data processing systems.

Key Responsibilities:
Design, develop, and execute comprehensive test plans and test cases for ETL processes.
Validate data transformation, extraction, and loading processes to ensure accuracy and integrity.
Perform data validation and data quality checks using Snowflake and Tableau.
Identify, document, and track defects and data quality issues.
Collaborate with developers, business analysts, and stakeholders to understand requirements and provide feedback on data-related issues.
Create and maintain test data, test scripts, and test environments.
Generate and analyze reports using Tableau to validate data accuracy and completeness.
Conduct performance testing and optimization of ETL processes.
Develop and maintain automated testing scripts and frameworks for ETL testing.
Ensure compliance with data governance and security standards.

Location: Pune, Hyderabad, Kolkata, Chandigarh
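A minimal sketch of the source-to-target validation such a tester automates: compare the key sets of the source extract and the loaded target. The column name and the in-memory rows are hypothetical; in practice both sides would come from warehouse queries (e.g., against Snowflake), with the same check applied per table.

```python
def reconcile_keys(source_rows, target_rows, key="id"):
    # Completeness check for an ETL load: every source key should land in the
    # target, and the target should contain no keys the source never had.
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {"missing_in_target": sorted(src - tgt),
            "unexpected_in_target": sorted(tgt - src)}

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}, {"id": 4}]
print(reconcile_keys(source, target))
# -> {'missing_in_target': [2], 'unexpected_in_target': [4]}
```

Row-count, checksum, and column-level comparisons extend the same idea; automated suites typically run these checks after every load and fail the pipeline on any non-empty difference.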

Posted 2 days ago

Apply

3.0 - 8.0 years

8 - 11 Lacs

Bengaluru

Work from Office


Education Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 3 years of relevant experience.

Position Description:
Experience in the design and implementation of data pipelines using DataStage, with strong SQL and decent Unix knowledge. Should be able to:
- create, review, and understand technical and requirement documents
- participate in walkthrough reviews of technical specifications, programs, code, and unit testing
- plan design and implementation, ensuring solution quality
- estimate the tasks/time required to perform design, coding, and unit testing
- ensure the highest quality standards and best practices are followed in ETL pipeline designs
- perform detailed technical analysis, coding, validation, and documentation of new features
- perform data transformations, analysis, performance tuning, and optimization, etc.

Required qualifications to be successful in this role:
Must-Have Skills: Strong analytical and problem-solving skills. ETL development (IBM DataStage). Additional skills: Teradata, DB2, SQL, and Unix. Optional: SAS SIS. Excellent communication and collaboration abilities. Ability to work in a fast-paced, dynamic environment with minimal supervision. Attention to detail and a commitment to data accuracy.
Good-to-Have Skills: Knowledge of SDLC (mostly Agile), data warehousing techniques, data modeling, Git, CI/CD pipelines, etc.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation. Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs.

Life at CGI: It is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Skills: Data Migration, ETL, Teradata, Unix

Posted 2 days ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Data Services, Microsoft Dynamics AX (Technical)
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary:
We are looking for a Team Lead - Migration Engineer with deep expertise in Microsoft Dynamics 365 Finance & Operations (D365 F&O) to drive successful data migration, system upgrades, and platform transitions. The ideal candidate will be responsible for leading migration projects, ensuring data integrity, optimizing migration processes, and collaborating with teams to ensure seamless transitions.

Key Responsibilities:
Lead end-to-end migration projects for transitioning legacy systems to Microsoft Dynamics 365 F&O.
Develop migration strategies, roadmaps, and execution plans for seamless data transfer.
Design and implement ETL processes to ensure accurate and efficient data migration.
Collaborate with technical teams to configure data mappings, transformations, and validation rules.
Ensure compliance with Microsoft best practices for data migration and security.
Conduct data integrity checks, validation, and reconciliation post-migration.
Provide guidance and mentorship to a team of migration engineers, developers, and consultants.
Troubleshoot migration-related issues and optimize performance for large-scale data transfers.
Stay updated on the latest D365 F&O upgrades, tools, and methodologies related to data migration.

Required Skills & Qualifications:
Proven experience in Microsoft Dynamics 365 F&O migration projects.
Strong knowledge of data architecture, ETL tools, and integration frameworks.
Expertise in SQL, Azure Data Factory, and other data transformation tools.
Hands-on experience with data cleansing, mapping, validation, and reconciliation.
Ability to lead and manage teams in large-scale migration projects.
Excellent analytical and problem-solving skills.
Strong communication and stakeholder management abilities.

Preferred Certifications:
Microsoft Certified: Dynamics 365 Finance Functional Consultant Associate
Microsoft Certified: Dynamics 365 Data Migration & Integration Specialist
Microsoft Certified: Azure Data Engineer Associate

Qualification: 15 years of full-time education

Posted 2 days ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must-have skills: SAS Analytics
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.

Qualification: 15 years full time education

Posted 2 days ago

Apply

8.0 - 13.0 years

14 - 19 Lacs

Coimbatore

Work from Office

Naukri logo

Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.

Qualification: 15 years full time education

Posted 2 days ago

Apply

8.0 - 13.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information: The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.

Qualification: 15 years full time education

Posted 2 days ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Lucknow

Remote

Naukri logo

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.

Posted 2 days ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Ludhiana

Remote

Naukri logo

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.

Posted 2 days ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Remote

Naukri logo

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
- Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.

Posted 2 days ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Mohali, Pune

Work from Office

Naukri logo

Experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Storage, SQL, Git, CI/CD, Azure DevOps, RESTful APIs, Data APIs, event-driven architecture, and data governance, lineage, security, and privacy best practices. Immediate joiners required.

Candidate profile: Data Warehousing, Data Lake, Azure Cloud Services, Azure DevOps; ETL (SSIS, ADF, Synapse); SQL Server, Azure SQL; Data Transformation, Modelling, Integration. Microsoft Certified: Azure Data Engineer.

Posted 2 days ago

Apply

6.0 - 11.0 years

11 - 18 Lacs

Noida, Greater Noida, Delhi / NCR

Work from Office

Naukri logo

Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Domo.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Implement data transformation and data warehousing solutions to support business intelligence and analytics.
- Optimize and troubleshoot data workflows to ensure efficiency and reliability.
- Develop and maintain documentation for data processes and systems.
- Ensure data quality and integrity through rigorous testing and validation.
- Monitor and manage data infrastructure to ensure optimal performance.
- Stay updated with industry trends and best practices in data engineering and Domo.

Mandatory Skills: Domo, Data Transformation Layer (SQL, Python), Data Warehouse Layer (SQL, Python)

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer, with a strong focus on data transformation and data warehousing.
- Proficiency in Domo and its various tools and functionalities.
- Experience with SQL, Python, and other relevant programming languages.
- Strong understanding of ETL processes and data pipeline architecture.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Strong communication skills to collaborate effectively with stakeholders.

Preferred Qualifications:
- Knowledge of data visualization and reporting tools.
- Familiarity with Agile methodologies and project management tools.
- Data Transformation Layer (SQL, Python)
- Data Warehouse Layer (SQL, Python)

Share your resume at Aarushi.Shukla@coforge.com

Posted 2 days ago

Apply

5.0 - 7.0 years

12 - 15 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Naukri logo

We are looking for a skilled Data Engineer with expertise in SSIS, Tableau, SQL, and ETL processes. The ideal candidate should have experience in data modeling, data pipelines, and Agile methodologies. Responsibilities include designing and maintaining data pipelines, implementing ETL processes using SSIS, optimizing data models for reporting, and developing advanced dashboards in Tableau. The role requires proficiency in SQL for complex data transformations, troubleshooting data workflows, and ensuring data integrity and compliance. Strong problem-solving skills, Agile collaboration experience, and the ability to work independently in a remote setup are essential.

Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 3 days ago

Apply

6.0 - 10.0 years

13 - 14 Lacs

Jaipur, Delhi / NCR, Bengaluru

Hybrid

Naukri logo

Location: Delhi / NCR / Jaipur / Bangalore / Hyderabad
Work Mode: Hybrid - 2 days WFO
Working Time: 1:00 PM to 10:00 PM IST

iSource Services is hiring for one of their USA-based clients for the position of Data Integration Specialist.

About the Role: We are seeking a skilled Data Integration Specialist to manage data ingestion, unification, and activation across Salesforce Data Cloud and other platforms. You will design and implement robust integration workflows, leveraging APIs and ETL tools to enable seamless data flow and support a unified customer experience.

Key Responsibilities:
- Design and implement data ingestion workflows into Salesforce Data Cloud
- Unify data from multiple sources to create a 360-degree customer view
- Develop integrations using APIs, ETL tools, and middleware (e.g., MuleSoft)
- Collaborate with cross-functional teams to gather and fulfill data integration requirements
- Monitor integration performance and ensure real-time data availability
- Ensure compliance with data privacy and governance standards
- Enable data activation across Salesforce Marketing, Sales, and Service Clouds

Must-Have Skills:
- Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery)
- Salesforce certifications (e.g., Data Cloud Consultant, Integration Architect)
- Hands-on experience with Salesforce Data Cloud (CDP)
- Proficiency in ETL, data transformation, and data mapping
- Strong knowledge of REST/SOAP APIs and integration tools
- Solid understanding of data modeling and customer data platforms
- Familiarity with data privacy regulations (e.g., GDPR, CCPA)

Posted 5 days ago

Apply

4.0 - 7.0 years

10 - 15 Lacs

Mumbai

Work from Office

Naukri logo

We are seeking a skilled Business Intelligence Manager to construct and uphold analytics and reporting solutions that convert data into actionable insights. The BI Manager role is pivotal, involving the conversion of provided data into meaningful insights through user-friendly dashboards and reports. An ideal BI Manager possesses proficiency in Business Intelligence tools and technology, overseeing the creation and administration of BI tools with comprehensive knowledge of the BI system, and managing stakeholder expectations to ensure the team delivers on them. This role demands a grasp of business concepts, strong problem-solving abilities, and prior experience in data and business analysis. Analytical prowess and effective communication skills are highly valued attributes for this position.

BI Responsibilities. The day-to-day responsibilities include, but are not limited to:
- Develop actionable insights that can be used to make business decisions by building reports and dashboards.
- Understand business stakeholders' objectives, the metrics that are most important to them, and how they measure performance.
- Translate data into highly leveraged and effective visualizations.
- Share knowledge and skills with your teammates to grow analytics impact.
- Devise an overall design strategy for all analytics that improves the user experience.
- Influence and educate stakeholders on the appropriate data, tools, and visualizations.
- Review all analytics for quality before final outputs are delivered to stakeholders.
- Take responsibility for version control and creating technical documentation.
- Partner with IT to provide different ways of improving existing processes.
- Contribute to successful delivery through the development and implementation of best-in-class data visualization and insights.
- Build strong relationships with business stakeholders to ensure understanding of business needs.
- Improve performance of all visualizations through optimized code.
- Work with custom and third-party visuals.
- Design, implement, and maintain scalable data pipelines and architectures.

Essential Traits

Qualifications/Skills:
- Graduate or equivalent level qualification, preferably in a related discipline; Master's degree preferred.
- 6-8 years of analytical experience in data and analytics, building reports and dashboards.
- 6-8 years of experience with visualization tools such as Power BI.
- Hands-on experience in DAX, Power Query, and SQL, and the ability to build data models that generate meaningful insights.
- Experience working with and creating analytics to enable stakeholders to make data-driven decisions.
- 4+ years of experience with requirements gathering.
- Expert-level proficiency in data transformation/configuration and connecting data to Power BI dashboards.
- Exposure to implementing row-level security and bookmarks.

Competencies:
- Highly motivated and influential team player with a proven track record of driving results.
- Strong communicator and collaborator with exceptional interpersonal skills.
- Analytical problem-solver with a passion for innovation and continuous improvement.
- Teachable, embraces best practices, and leverages feedback as a means of continuous improvement.
- Consistently high achiever marked by perseverance, humility, and a positive outlook in the face of challenges.
- Strong problem solving, quantitative, and analytical abilities.
- Solid written and verbal communication skills and the knowledge to build strong relationships.

Preferred: Microsoft or any other BI certification.

About Kroll: In a world of disruption and increasingly complex business challenges, our professionals bring truth into focus with the Kroll Lens. Our sharp analytical skills, paired with the latest technology, allow us to give our clients clarity, not just answers, in all areas of business. We value the diverse backgrounds and perspectives that enable us to think globally. As part of One Team, One Kroll, you'll contribute to a supportive and collaborative work environment that empowers you to excel. Kroll is the premier global valuation and corporate finance advisor with expertise in complex valuation, disputes and investigations, M&A, restructuring, and compliance and regulatory consulting. Our professionals balance analytical skills, deep market insight, and independence to help our clients make sound decisions. As an organization, we think globally and encourage our people to do the same. Kroll is committed to equal opportunity and diversity, and recruits people based on merit. To be considered for a position, you must formally apply via careers.kroll.com.

Posted 6 days ago

Apply

Exploring Data Transformation Jobs in India

India has seen a significant rise in the demand for data transformation professionals in recent years. With the increasing importance of data in business decision-making, companies across various industries are actively seeking skilled individuals who can transform raw data into valuable insights. If you are considering a career in data transformation in India, here is a comprehensive guide to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi NCR
  4. Hyderabad
  5. Pune

These cities are known for their thriving tech industries and have a high demand for data transformation professionals.

Average Salary Range

The average salary range for data transformation professionals in India varies based on experience levels. Entry-level positions typically start at INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

A typical career path in data transformation may include roles such as Data Analyst, Data Engineer, Data Scientist, and Data Architect. As professionals gain experience and expertise, they may progress to roles like Senior Data Scientist, Lead Data Engineer, and Chief Data Officer.

Related Skills

In addition to data transformation skills, professionals in this field are often expected to have knowledge of programming languages (such as Python, R, or SQL), data visualization tools (like Tableau or Power BI), statistical analysis, and machine learning techniques.
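Two of the most common transformation tasks mentioned above, cleaning and normalization, can be sketched in plain Python. The values and field names below are invented purely for illustration:

```python
# Hypothetical raw values with inconsistent casing and stray whitespace
cities = [" bangalore", "Mumbai ", "DELHI"]
revenue = [120.0, 80.0, 100.0]

# Clean: trim whitespace and standardize casing
clean_cities = [c.strip().title() for c in cities]

# Normalize: min-max scale revenue to the [0, 1] range
lo, hi = min(revenue), max(revenue)
scaled = [(v - lo) / (hi - lo) for v in revenue]

print(clean_cities)  # ['Bangalore', 'Mumbai', 'Delhi']
print(scaled)        # [1.0, 0.0, 0.5]
```

In practice the same steps are usually expressed with libraries such as pandas or in SQL, but the underlying logic is identical.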

Interview Questions

  • What is data transformation and why is it important? (basic)
  • How do you handle missing data during the transformation process? (basic)
  • Can you explain the difference between ETL and ELT? (medium)
  • How do you ensure the quality and accuracy of transformed data? (medium)
  • Describe a data transformation project you worked on and the challenges you faced. (medium)
  • What are the benefits of using data transformation tools like Apache Spark or Talend? (advanced)
  • How would you optimize a data transformation process for large datasets? (advanced)
  • Explain the concept of data lineage and its significance in data transformation. (advanced)
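The missing-data question above is worth preparing with a concrete answer. A minimal sketch in plain Python, using invented records, contrasts the two most common strategies, imputation and row dropping:

```python
# Hypothetical records from a data feed; None marks a missing value
rows = [
    {"region": "North", "sales": 100.0},
    {"region": None,    "sales": None},
    {"region": "South", "sales": 300.0},
    {"region": "North", "sales": None},
]

# Strategy 1: impute -- fill numeric gaps with the column mean,
# and categorical gaps with a sentinel value
observed = [r["sales"] for r in rows if r["sales"] is not None]
mean_sales = sum(observed) / len(observed)  # mean of 100 and 300 is 200
imputed = [
    {
        "region": r["region"] if r["region"] is not None else "Unknown",
        "sales": r["sales"] if r["sales"] is not None else mean_sales,
    }
    for r in rows
]

# Strategy 2: drop -- discard any record with a missing field
dropped = [r for r in rows if all(v is not None for v in r.values())]

print([r["sales"] for r in imputed])  # [100.0, 200.0, 300.0, 200.0]
print(len(dropped))                   # 2 complete rows survive
```

Imputation preserves row counts but can bias aggregates; dropping preserves accuracy per row but loses data. A strong interview answer explains when each trade-off is acceptable.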

Closing Remark

As the demand for data transformation professionals continues to rise in India, now is a great time to explore opportunities in this field. By honing your skills, gaining relevant experience, and preparing for interviews, you can position yourself for a successful career in data transformation. Good luck!

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies