990 Data Integration Jobs

JobPe aggregates listings so they are easy to browse in one place; applications are submitted directly on the original job portal.

7.0 - 9.0 years

12 - 15 Lacs

Hyderabad

Work from Office

We are seeking an experienced ETL Developer with a strong background in Python and Airflow to join our dynamic team in Hitech City, Hyderabad. The ideal candidate will have over 7 years of experience in ETL processes and data integration, with a focus on optimizing and enhancing data pipelines. While expertise in Snowflake is not mandatory, a strong understanding of RDBMS and SQL is essential.
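For applicants who want to gauge the bar here: below is a minimal, hedged sketch of the kind of Python + Airflow pipeline this posting describes. It assumes Airflow 2.x; the DAG id, task names, and sample data are illustrative, not from the employer.

```python
# A hedged sketch of a Python + Airflow ETL pipeline (assumes Airflow 2.x;
# the DAG id, task names, and data are illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Stand-in for a real source query (e.g., an RDBMS read via a SQL hook).
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.5}]


def transform(ti):
    rows = ti.xcom_pull(task_ids="extract")
    # Simple enrichment step; real logic would validate and reshape the rows.
    return [{**r, "amount_x2": r["amount"] * 2} for r in rows]


with DAG(
    dag_id="example_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```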

Posted 10 hours ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Power BI Developer

Must have: Power BI, data integration and visualization, data lakes, and dashboard development. Good to have: Tableau and the ELK stack (Elasticsearch, Logstash, Kibana) to retrieve, analyze, and visualize log or event data. Experience: 7+ years.

Job description:
- Develop and manage Power BI solutions: design, develop, and maintain Power BI reports and dashboards, built from scratch, that generate comprehensive reports and insights to support business decision-making.
- Data lake integration: integrate data from various sources into a data lake, ensuring data is clean, accurate, and accessible; troubleshoot and resolve issues related to data integration and dashboard functionality.
- Data modeling: create and maintain data models to support reporting and analytics needs.
- ETL processes: design and implement ETL (Extract, Transform, Load) processes to move data from source systems to the data lake and Power BI.
- Performance optimization: optimize Power BI reports and dashboards for performance and usability.
- Documentation: document data models, ETL processes, and Power BI solutions to ensure maintainability and knowledge sharing.

Operational duties:
- Oversee and support the process by reviewing daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries per the SLAs defined in the contract.
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot recurring trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge skill gaps identified through interviews with Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Performance parameters (No. / Parameter / Measure):
1. Process: no. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team management: productivity, efficiency, absenteeism.
3. Capability development: triages completed, Technical Test performance.

Mandatory skills: Power BI visualization on cloud. Experience: 5-8 years.

Posted 13 hours ago

Apply

5.0 - 10.0 years

12 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Design and implement robust and scalable data pipelines using Azure Data Factory, Azure Data Lake, and Azure SQL. Work extensively with Azure Fabric, Cosmos DB, and SQL Server to develop and optimize end-to-end data solutions. Perform database design, data modeling, and performance tuning to ensure system reliability and data integrity. Write and optimize complex SQL queries to support data ingestion, transformation, and reporting needs. Proactively implement SQL optimization and preventive maintenance strategies to ensure efficient database performance. Lead data migration efforts from on-premise to cloud or across Azure services. Collaborate with cross-functional teams to gather requirements and translate them into technical solutions. Maintain clear documentation and follow industry best practices for security, compliance, and scalability.

Required skills:
- Proven experience working with Azure Fabric, SQL Server, Azure Data Factory, Azure Data Lake, and Cosmos DB.
- Strong hands-on expertise in complex SQL queries, SQL query efficiency and optimization, database design and data modeling, and data migration techniques and performance tuning.
- Solid understanding of cloud infrastructure and data integration patterns in Azure.
- Experience working in agile environments with CI/CD practices.

Nice to have: Microsoft Azure certifications related to Data Engineering or Azure Solutions.

Locations: Bengaluru, Hyderabad, Chennai, Pune, Noida, Mumbai.
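As context for the ADF-heavy duties above, here is a hedged sketch of triggering and polling a Data Factory pipeline run with the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, and pipeline names are placeholders, not from the posting.

```python
# Hedged sketch: trigger and poll an Azure Data Factory pipeline run.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # assumption: taken from your environment
RESOURCE_GROUP = "rg-data-platform"     # hypothetical resource group
FACTORY_NAME = "adf-demo"               # hypothetical factory
PIPELINE_NAME = "copy_sales_to_lake"    # hypothetical pipeline

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run, then poll until it leaves the Queued/InProgress states.
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME)
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(15)
print(f"Pipeline finished with status: {status}")
```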

Posted 14 hours ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Marketing Manager - Analytics

Job title: Marketing Manager - Analytics. Experience: 1-4 years. Location: Chennai, Hyderabad, Bangalore.

Role summary: Manage the marketing technology stack and drive data-driven insights to optimize campaigns and user journeys.

Key responsibilities:
- Implement and manage tools like GA4, Mixpanel, HubSpot, and Segment.
- Analyze campaign performance and user behavior.
- Build dashboards and reports for marketing KPIs.
- Collaborate with tech and marketing teams on data integration.

Requirements:
- Strong analytical skills and experience with marketing analytics tools.
- Knowledge of SQL, Excel, and data visualization platforms.
- Understanding of attribution models and funnel analysis.

Key skills: Google Analytics 4 (GA4), Mixpanel, Segment; HubSpot, Salesforce, or similar CRM platforms; SQL and Excel for data manipulation; data visualization tools (Tableau, Power BI); UTM tracking and campaign attribution; funnel analysis and cohort tracking.
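To make the funnel-analysis requirement concrete, a small self-contained pandas example follows; the event names and data are invented for demonstration.

```python
# Illustrative funnel analysis on a toy event log; columns are assumptions.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "event":   ["visit", "signup", "purchase", "visit", "signup", "visit"],
})

funnel_steps = ["visit", "signup", "purchase"]
# Unique users who reached each step of the funnel.
users_per_step = {
    step: events.loc[events["event"] == step, "user_id"].nunique()
    for step in funnel_steps
}
# Conversion rate of each step relative to the top of the funnel.
conversion = {
    step: users_per_step[step] / users_per_step[funnel_steps[0]]
    for step in funnel_steps
}
print(users_per_step)  # {'visit': 3, 'signup': 2, 'purchase': 1}
print(conversion)      # {'visit': 1.0, 'signup': 0.67, 'purchase': 0.33}
```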

Posted 14 hours ago

Apply

1.0 - 5.0 years

4 - 8 Lacs

Mumbai

Work from Office

Piscis Networks is looking for a TAC Support Engineer to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Responding to customer inquiries and resolving technical issues via phone, email, or chat.
- Conducting diagnostic tests to identify the root cause of customer issues.
- Providing technical guidance to customers and walking them through solutions to resolve their problems.
- Collaborating with development teams to escalate and resolve complex technical issues.
- Maintaining accurate records of customer interactions and issue resolutions in a CRM system.
- Participating in the development and delivery of customer training and support materials.
- Communicating with customers and internal stakeholders to provide status updates on issue resolution.

Requirements:
- Strong technical background and understanding of hardware and software systems.
- Excellent communication and interpersonal skills.
- Experience with CRM and ticketing systems.

Posted 14 hours ago

Apply

2.0 - 6.0 years

6 - 9 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office

We are seeking a skilled SQL + Python Developer with a minimum of 3 years of experience to join our dynamic team. This role involves a mix of database development, administration, and data engineering tasks, including designing and implementing ETL processes for data integration.

Required candidate profile: The ideal candidate will have a strong background in SQL, PL/SQL, and Python scripting, with proven expertise in SQL query tuning, database performance optimization, and Snowflake Data Warehouse.

Perks and benefits: To be disclosed post interview.
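A minimal sketch of the SQL + Python ETL work this posting describes, using the standard-library sqlite3 module as a stand-in for the production RDBMS (table names are illustrative):

```python
# Minimal SQL + Python ETL sketch with sqlite3 standing in for the real DB.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.5, "ok"), (2, -5.0, "error"), (3, 300.0, "ok")],
)

# Transform + load: keep only valid rows, aggregate into a reporting table.
conn.execute(
    """
    CREATE TABLE order_summary AS
    SELECT status, COUNT(*) AS n, SUM(amount) AS total
    FROM raw_orders
    WHERE amount > 0
    GROUP BY status
    """
)
print(conn.execute("SELECT * FROM order_summary").fetchall())  # [('ok', 2, 420.5)]
conn.close()
```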

Posted 2 days ago

Apply

6.0 - 8.0 years

18 - 22 Lacs

Bangalore Rural, Chennai, Bengaluru

Work from Office

Senior Python ETL Developer/Lead: 5+ years of experience in ETL development using Python, Apache Airflow, PySpark, and Pandas; Oracle SQL and PL/SQL; UNIX and Windows environments; OOAD and SOA; data warehousing concepts, data modeling, and data integration.
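For reference, a short PySpark aggregation of the kind this role lists; the schema and values are illustrative, not from the posting.

```python
# Illustrative PySpark transformation: daily order counts and revenue.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_example").getOrCreate()

orders = spark.createDataFrame(
    [(1, "2025-01-01", 120.5), (2, "2025-01-01", 80.0), (3, "2025-01-02", 99.9)],
    ["order_id", "order_date", "amount"],
)

daily = (
    orders.groupBy("order_date")
    .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
    .orderBy("order_date")
)
daily.show()
spark.stop()
```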

Posted 2 days ago

Apply

4.0 - 9.0 years

9 - 12 Lacs

Mumbai

Work from Office

SUMMARY

Job role: Microsoft Azure Data Service Professional. Location: Mumbai. Experience: 4+ years.

Must-have: At least 3 years of relevant experience in Microsoft Azure Data Services.

Job description: As a Software Development Engineer, you will be part of a dynamic work environment, responsible for analyzing, designing, coding, and testing various components of application code for multiple clients. Your role will involve collaborating with team members to ensure successful feature implementation, troubleshooting issues, and contributing to the overall improvement of the software development process. Additionally, you will participate in code reviews and maintain documentation to support the development lifecycle, ensuring that the applications meet the required standards and specifications.

Roles & responsibilities:
- Independently perform and become a Subject Matter Expert (SME).
- Actively participate and contribute in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct thorough testing and debugging of application components to ensure high-quality deliverables.

Professional & technical skills:
- Must have: proficiency in Microsoft Azure Data Services.
- Strong understanding of cloud computing concepts and architecture.
- Experience with data integration and ETL processes.
- Familiarity with database management systems and data modeling.
- Knowledge of programming languages such as C#, Python, or Java.

Additional information: The candidate should have a minimum of 3 years of experience in Microsoft Azure Data Services. This position is based at our Mumbai office. A 15-year full-time education is required.

Posted 2 days ago

Apply

6.0 - 10.0 years

13 - 14 Lacs

Jaipur, Delhi / NCR, Bengaluru

Hybrid

Location: Delhi / NCR / Jaipur / Bangalore / Hyderabad. Work mode: Hybrid (2 days WFO). Working time: 1:00 PM to 10:00 PM IST.

iSource Services is hiring for one of their USA-based clients for the position of Data Integration Specialist.

About the role: We are seeking a skilled Data Integration Specialist to manage data ingestion, unification, and activation across Salesforce Data Cloud and other platforms. You will design and implement robust integration workflows, leveraging APIs and ETL tools to enable seamless data flow and support a unified customer experience.

Key responsibilities:
- Design and implement data ingestion workflows into Salesforce Data Cloud.
- Unify data from multiple sources to create a 360-degree customer view.
- Develop integrations using APIs, ETL tools, and middleware (e.g., MuleSoft).
- Collaborate with cross-functional teams to gather and fulfill data integration requirements.
- Monitor integration performance and ensure real-time data availability.
- Ensure compliance with data privacy and governance standards.
- Enable data activation across Salesforce Marketing, Sales, and Service Clouds.

Must-have skills:
- Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
- Salesforce certifications (e.g., Data Cloud Consultant, Integration Architect).
- Hands-on experience with Salesforce Data Cloud (CDP).
- Proficiency in ETL, data transformation, and data mapping.
- Strong knowledge of REST/SOAP APIs and integration tools.
- Solid understanding of data modeling and customer data platforms.
- Familiarity with data privacy regulations (e.g., GDPR, CCPA).
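By way of illustration only, the sketch below shows pushing records to a REST ingestion endpoint of the sort used in Data Cloud workflows; the URL, payload shape, and token handling are hypothetical placeholders, not the documented Salesforce API surface.

```python
# Heavily hedged sketch of a REST ingestion call; everything below that names
# a URL or payload field is hypothetical, not the real Salesforce API.
import requests

TOKEN = "<oauth-access-token>"  # assumption: obtained via your OAuth flow
ENDPOINT = "https://example.my.salesforce.com/ingest/contacts"  # hypothetical

payload = {"data": [{"email": "a@example.com", "city": "Mumbai"}]}
resp = requests.post(
    ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()  # surface HTTP errors instead of failing silently
print(resp.status_code)
```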

Posted 3 days ago

Apply

4.0 - 5.0 years

10 - 11 Lacs

Mumbai

Remote

Location: Remote / Pan India.

iSource Services is hiring for one of their clients for the position of Adobe CDP Consultant (Execution Specialist).

About the role: We are hiring an Adobe CDP Consultant (Execution Specialist) to support hands-on implementation of Adobe Real-Time CDP. Working under a Senior Consultant, you will manage technical setup, data flows, segmentation, and activation processes. The ideal candidate has strong execution experience in CDP environments and a solid grasp of data integration and marketing tech ecosystems.

Roles & responsibilities:
- Assist in the technical setup, configuration, and deployment of Adobe Real-Time CDP.
- Support data ingestion, transformation, identity resolution, and activation workflows.
- Implement and execute audience segmentation strategies based on strategic direction.
- Ensure seamless integration between Adobe Real-Time CDP and downstream marketing platforms.
- Troubleshoot data pipeline issues and optimize data flow performance.
- Build dashboards and reports to monitor campaign performance, data quality, and audience insights.
- Collaborate with cross-functional teams including marketing, analytics, and IT to deliver on marketing use cases.
- Stay current with Adobe Experience Cloud features and enhancements to drive innovation and continuous improvement.

Required skills:
- 4-5 years of CDP execution experience.
- Hands-on experience with Adobe Real-Time CDP or similar platforms.
- Proficiency in data modeling, identity resolution, and segmentation.
- Knowledge of APIs, data workflows, and troubleshooting.
- Familiarity with Adobe Analytics, Target, and automation tools.
- Adobe certifications are a plus.

Posted 3 days ago

Apply

5.0 - 7.0 years

12 - 15 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job title: Power Apps QA. Location: Remote (Hyderabad, Ahmedabad, Pune, Chennai, Kolkata). Notice period: Immediate.

iSource Services is hiring for one of their clients for the position of Power Apps QA.

About the role: We are looking for an experienced Power Apps QA with 5-7 years of experience to join our team remotely. In this role, you will be responsible for designing, developing, and managing workflows and data pipelines to streamline business processes and ensure quality automation solutions. The ideal candidate will have in-depth experience with Microsoft Power Automate, Azure Data Factory (ADF), and other Azure services, alongside a strong ability to troubleshoot and optimize automated solutions. You will be integral in ensuring that the automation and data workflows run smoothly and efficiently and comply with security and quality standards.

1. Power Automate:
- Experience in creating and managing workflows using Microsoft Power Automate.
- Proficiency in integrating Power Automate with other Microsoft tools like SharePoint and Microsoft Teams.
- Ability to design, develop, and implement automated solutions to streamline business processes.
- Strong understanding of connectors and their configurations in Power Automate.

2. Azure Data Factory:
- In-depth knowledge of Azure Data Factory (ADF), including data pipelines, data flows, and data integration activities.
- Experience in creating, scheduling, and managing data pipelines.
- Understanding of various data storage services (e.g., Azure Blob Storage, Azure SQL Database) and their integration in ADF.
- Expertise in debugging, performance tuning, and troubleshooting data workflows in ADF.
- Familiarity with version control systems like Git for managing ADF assets.

3. General Azure services:
- General knowledge of other Azure services like Azure Functions, Logic Apps, and Azure DevOps.
- Understanding of Azure security and compliance standards.

Posted 3 days ago

Apply

7.0 - 9.0 years

27 - 42 Lacs

Pune

Work from Office

Job summary: We are seeking a Sr. Developer with 7 to 9 years of experience, proficient in ITX, TIBCO Adapters, TIBCO Business Works, and related technologies. The ideal candidate will work in a hybrid model, focusing on data integration and financial messaging. Strong English communication skills are essential.

Responsibilities:
- Develop and maintain integration solutions using ITX, TIBCO Business Works, and related technologies.
- Implement and manage Docker containers for application deployment and orchestration.
- Utilize TIBCO Adapters and TIBCO BW Container Edition to streamline integration processes.
- Leverage TIBCO Cloud Integration for seamless data integration across platforms.
- Monitor and manage TIBCO Rendezvous, TIBCO Hawk, and TIBCO EMS for efficient message handling.
- Administer TIBCO environments using TIBCO Administrator and the TIBCO ADB Adapter.
- Develop and maintain Java-based applications and services.
- Create and manage REST APIs, ensuring robust and secure data exchange.
- Work with XML, JSON, and XSLT to transform and validate data.
- Implement SOAP-based web services for reliable communication.
- Utilize Java Message Service for asynchronous messaging.
- Collaborate with cross-functional teams to ensure seamless integration and data flow.
- Provide technical guidance and support to junior developers.
- Ensure all solutions adhere to industry best practices and company standards.

Qualifications:
- Strong understanding of Docker containers, TIBCO Adapters, and TIBCO Business Works.
- Experience with TIBCO BW Container Edition and TIBCO Cloud Integration.
- Proficiency in data integration using YAML, Kubernetes, and XSD.
- Expertise in TIBCO Rendezvous, TIBCO Hawk, and TIBCO EMS.
- Experience with TIBCO Administrator and the TIBCO ADB Adapter.
- Skilled in Java, JSON, REST APIs, XML, XSLT, SOAP, and Java Message Service.
- Nice to have: experience in financial messaging (Payments-Corp).

Certifications required: TIBCO Certified Professional, Docker Certified Associate, Java SE 8 Programmer Certification.
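To make the XML/JSON mapping requirement concrete, a small standard-library Python example follows; the message shape is invented for illustration.

```python
# Small XML-to-JSON mapping example using only the standard library;
# the <payment> message shape is invented, not a real financial schema.
import json
import xml.etree.ElementTree as ET

xml_msg = """
<payment>
  <id>42</id>
  <amount currency="INR">1500.00</amount>
</payment>
"""

root = ET.fromstring(xml_msg)
doc = {
    "id": int(root.findtext("id")),
    "amount": float(root.findtext("amount")),
    "currency": root.find("amount").get("currency"),
}
print(json.dumps(doc))  # {"id": 42, "amount": 1500.0, "currency": "INR"}
```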

Posted 3 days ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Madurai, Tiruppur, Salem

Work from Office

Req ID: 125023. Remote position: Hybrid. Region: Asia. Country: India. State/Province: Chennai. City: Guindy, Chennai.

Summary: The Senior Specialist, IT Solutions is a key role that evaluates, implements, and manages security solutions to protect Celestica's systems and data. Responsibilities include implementing automation technologies, performing risk assessments, contributing to automation policies and standards, and advising on automation best practices. This role also mentors junior team members and provides advanced technical support for automation solutions.

Detailed description (performs tasks such as, but not limited to, the following):
- Maintain security infrastructure for operational efficiency; collaborate with other IT infrastructure, application, and network teams to ensure seamless integration of tools and technology.
- Develop and implement playbooks for security automation and orchestration to respond to security events and incidents.
- Design and implement integrations between security tools such as EDR, SIEM, and ServiceNow to automate incident response and threat intelligence sharing.
- Automate security processes, such as vulnerability scanning, patching, and user provisioning, using scripting and configuration management tools.
- Develop custom scripts and tools, such as parsers and data enrichment scripts, to automate repetitive security tasks and integrate disparate security data sources.
- Create and maintain comprehensive documentation and runbooks for security automation processes and integrations.
- Collaborate with other security team members, such as threat intelligence analysts and incident responders, to identify automation opportunities and implement effective security automation solutions.
- Stay up to date on emerging security threats and technologies to proactively identify and address potential security risks through automation.

Knowledge/Skills/Competencies:
- Expert knowledge of information security principles, practices, and technologies.
- Expert knowledge of EDR, SIEM, and ServiceNow.
- Strong understanding of data integration and API development.
- In-depth knowledge of information security standards and regulations (e.g., ISO 27001, NIST).
- Strong understanding of software design processes and data modeling.
- Excellent problem-solving and analytical skills.
- Strong leadership, mentoring, and communication skills.
- Ability to work independently and as part of a team.

Physical demands: Duties of this position are performed in a normal office environment. Duties may require extended periods of sitting and sustained visual concentration on a computer monitor or on numbers and other detailed data. Repetitive manual movements (e.g., data entry, using a computer mouse, using a calculator) are frequently required.

Typical experience: 6 to 8 years of experience in information security, with a proven track record of evaluating, implementing, and managing security solutions.

Typical education: Bachelor's degree in Software Engineering, Computer Science, Information Security, or a related field. Relevant industry certifications (e.g., CISSP, CISM) are highly desirable.

Notes: This job description is not intended to be an exhaustive list of all duties and responsibilities of the position. Employees are held accountable for all duties of the job. Job duties and the percentage of time identified for any function are subject to change at any time.

Celestica is an equal opportunity employer. All qualified applicants will receive consideration for employment and will not be discriminated against on any protected status (including race, religion, national origin, gender, sexual orientation, age, marital status, veteran or disability status, or other characteristics protected by law). At Celestica we are committed to fostering an inclusive, accessible environment where all employees and customers feel valued, respected, and supported. Special arrangements can be made for candidates who need them throughout the hiring process; please indicate your needs and we will work with you to meet them.

Company overview: Celestica (NYSE, TSX: CLS) enables the world's best brands. Through our recognized customer-centric approach, we partner with leading companies in Aerospace and Defense, Communications, Enterprise, HealthTech, Industrial, Capital Equipment and Energy to deliver solutions for their most complex challenges. As a leader in design, manufacturing, hardware platform and supply chain solutions, Celestica brings global expertise and insight at every stage of product development, from drawing board to full-scale production and after-market services, for products ranging from advanced medical devices to highly engineered aviation systems to next-generation hardware platform solutions for the Cloud. Headquartered in Toronto, with talented teams spanning 40+ locations in 13 countries across the Americas, Europe and Asia, we imagine, develop and deliver a better future with our customers.

Celestica would like to thank all applicants; however, only qualified applicants will be contacted. Celestica does not accept unsolicited resumes from recruitment agencies or fee-based recruitment services.
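As a flavor of the parser/enrichment scripting this role automates, here is a hedged Python sketch over an invented log format; a real playbook would raise a ServiceNow ticket or trigger a SOAR action rather than print.

```python
# Sketch of a security log parser/enrichment script; the log format,
# threshold, and alerting are invented for illustration.
import re
from collections import Counter

LOG_LINES = [
    "2025-06-01T10:00:01 FAILED_LOGIN user=alice src=10.0.0.5",
    "2025-06-01T10:00:09 FAILED_LOGIN user=alice src=10.0.0.5",
    "2025-06-01T10:01:02 LOGIN_OK user=bob src=10.0.0.9",
]

pattern = re.compile(r"FAILED_LOGIN user=(?P<user>\S+) src=(?P<src>\S+)")
failures = Counter(
    (m["user"], m["src"])
    for line in LOG_LINES
    if (m := pattern.search(line))
)

# Flag sources that exceed a simple threshold; a production playbook would
# open a ticket or push to a SIEM enrichment pipeline here instead.
for (user, src), count in failures.items():
    if count >= 2:
        print(f"ALERT: {count} failed logins for {user} from {src}")
```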

Posted 3 days ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Job description: Tietoevry Create is seeking a skilled Snowflake Developer to join our team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and maintaining data solutions using Snowflake's cloud data platform. You will work closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business value.

- 7+ years of experience in the design and development of data warehouse and data integration projects (SSE/TL level).
- Experience working in an Azure environment.
- Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; writing SQL queries against Snowflake.
- Good understanding of database design concepts: transactional, data mart, data warehouse, etc.
- Expertise in loading from disparate data sets and translating complex functional and technical requirements into detailed designs; will also perform analysis of vast data stores and uncover insights.
- Snowflake data engineers will be responsible for architecting and implementing large-scale data intelligence solutions around Snowflake Data Warehouse.
- Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is a must.
- Very good articulation skills; flexible and ready to learn new skills.

Additional information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity.
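For orientation, a hedged sketch of querying Snowflake from Python with the snowflake-connector-python package; the account, credentials, and warehouse/database names are placeholders.

```python
# Hedged sketch: run a query against Snowflake from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account-identifier>",  # assumption: your account locator
    user="<user>",
    password="<password>",
    warehouse="COMPUTE_WH",          # hypothetical warehouse
    database="ANALYTICS",            # hypothetical database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")  # simple connectivity check
    print(cur.fetchone())
finally:
    conn.close()
```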

Posted 3 days ago

Apply

3.0 - 4.0 years

11 - 14 Lacs

Mumbai

Work from Office

AEP Data Architect

The posting lists the same profile under its responsibilities, requirements, and qualifications; consolidated, it reads:
- 10+ years of strong experience with data transformation and ETL on large data sets.
- Experience designing customer-centric datasets (i.e., CRM, call center, marketing, offline, point of sale, etc.).
- 5+ years of data modeling experience (i.e., relational, dimensional, columnar, big data).
- 5+ years of complex SQL or NoSQL experience.
- Experience in advanced data warehouse concepts.
- Experience in industry ETL tools (i.e., Informatica, Unifi).
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (i.e., Tableau, Power BI).
- Experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal and written communication skills to interface with the sales team and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.
- Degree in Computer Science, Information Systems, Data Science, or a related field.

Posted 3 days ago

Apply

4.0 - 7.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Looking for a Cloud Backend Engineer (4-7 yrs exp). Design scalable backend services on Kubernetes with GCP/AWS. Work with Python/Java/Kotlin and data tools like BigQuery or Snowflake. Onsite | 5-day week.

Required candidate profile: 4-7 yrs backend development with 2+ yrs Kubernetes in production, cloud expertise (GCP/AWS), Python/Java/Kotlin. Must know cloud-native systems and BigQuery or Snowflake. Onsite role; immediate joiner preferred.

Posted 3 days ago

Apply

5.0 - 8.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Date: 17 Jun 2025. Location: Bangalore, IN. Company: Alstom. Req ID: 486332.

STRUCTURE, REPORTING, NETWORKS & LINKS

Organization structure:
CITO
|-- Data & AI Governance Vice President
|-- Enterprise Data Domain Director
|-- Head of Analytics Platform
|-- Analytics Delivery Architect
|-- Analytics Technical Specialist

Organizational reporting: Reports to the Delivery Manager.

Networks & links: Internally, the transversal Digital Platforms team, Innovation team, application platform owners, business process owners, and infrastructure team. Externally, third-party technology providers and strategic partners.

Location: Position will be based in Bangalore; willing to travel occasionally for onsite meetings and team workshops as required.

RESPONSIBILITIES
- Design, develop, and deploy interactive dashboards and reports using MS Fabric and Qlik Cloud, ensuring alignment with business requirements and goals.
- Implement and manage data integration workflows utilizing MS Fabric to ensure efficient data processing and accessibility.
- Translate business needs into technical specifications; design, build, and deploy solutions.
- Understand and integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS), Teams, SharePoint, or API automation.
- Be responsible for access management of app workspaces and content.
- Integrate Power BI servers with different data sources and handle timely upgrades/servicing of Power BI.
- Schedule and refresh jobs on the Power BI on-premises data gateway.
- Configure standard system reports, as well as customized reports as required.
- Help with various kinds of database connections (SQL, Oracle, Excel, etc.) with Power BI services.
- Investigate and troubleshoot reporting issues and problems.
- Maintain the reporting schedule and document reporting procedures.
- Monitor and troubleshoot data flow issues, optimizing the performance of MS Fabric applications as needed.
- Optimize application performance and data models in Qlik Cloud while ensuring data accuracy and integrity.
- Ensure collaboration with functional and technical architects as business cases are set up for each initiative; collaborate with other analytics teams to drive and operationalize analytical deployment.
- Maintain clear and coherent communication, both verbal and written, to understand data needs and report results.
- Ensure compliance with internal policies and regulations.
- Strong ability to take the lead and be autonomous; proven planning, prioritization, and organizational skills; ability to drive change through innovation and process improvement.
- Report to management and stakeholders in a clear and concise manner.
- Good to have: contribution to the integration and utilization of Denodo for data virtualization, enhancing data access across multiple sources; document Denodo processes, including data sources and transformations, to support knowledge sharing within the team.
- Facilitate effective communication with stakeholders regarding project updates, risks, and resolutions to ensure transparency and alignment.
- Participate in team meetings and contribute innovative ideas to improve reporting and analytics solutions.

EDUCATION: Bachelor's/Master's degree in Computer Science Engineering/Technology or a related field.

Experience: 6 years of total experience. Mandatory: 2+ years of experience in Power BI end-to-end development using Power BI Desktop, connecting multiple data sources (SAP, SQL, Azure, REST APIs, etc.). Experience in MS Fabric components along with Denodo.

Technical competencies:
- Proficient in using MS Fabric for data integration and automation of ETL processes.
- Understanding of data governance principles for quality and security.
- Strong expertise in creating dashboards and reports using Power BI and Qlik.
- Knowledge of data modeling concepts in Qlik and Power BI.
- Proficient in writing complex SQL queries for data extraction and analysis.
- Skilled in utilizing analytical functions in Power BI and Qlik.
- Experience in troubleshooting performance issues in MS Fabric and Denodo.
- Experience in developing visual reports, dashboards, and KPI scorecards using Power BI Desktop and Qlik.
- Understanding of the Power BI application security layer model.
- Hands-on experience with PowerPivot, role-based data security, Power Query, DAX queries, Excel pivots/charts/grids, and Power View.
- Good to have: Power BI Services and administration knowledge; experience developing data models using Denodo to support business intelligence and analytics needs; proficiency in creating base views and derived views for effective data representation; ability to implement data transformations and enrichment within Denodo; skill in using Denodo's SQL capabilities to write complex queries for data retrieval; familiarity with integrating Denodo with various data sources such as databases, web services, and big data platforms.

BEHAVIORAL COMPETENCIES: The candidate should demonstrate a strong sense of collaboration and being a team player; the ability to articulate issues and propose solutions; a structured thought process; critical thinking and problem-solving skills; an analytical bent of mind and willingness to question the status quo; excellent soft skills; the ability to work as an individual contributor while being proactive and showing leadership, guiding and driving the team from a technical standpoint; excellent written, verbal, and interpersonal skills; self-motivation and quick learning; fluency in English; and the ability to influence and deliver.

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!

Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
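To illustrate the "API automation" duty above, a hedged Python sketch of queueing a Power BI dataset refresh via the REST API follows; the workspace and dataset IDs are placeholders, and Azure AD token acquisition (e.g., via MSAL) is out of scope here.

```python
# Hedged sketch: queue a Power BI dataset refresh through the REST API.
import requests

TOKEN = "<azure-ad-access-token>"  # assumption: acquired via MSAL beforehand
GROUP_ID = "<workspace-id>"        # placeholder app-workspace id
DATASET_ID = "<dataset-id>"        # placeholder dataset id

url = (
    "https://api.powerbi.com/v1.0/myorg/groups/"
    f"{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
resp.raise_for_status()  # a 202 Accepted response means the refresh was queued
print(resp.status_code)
```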

Posted 3 days ago

Apply

8.0 - 12.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Date: 12 Jun 2025. Location: Bangalore, KA, IN. Company: Alstom.

As a promoter of sustainable mobility, Alstom develops and markets systems, equipment and services for the transport sector. Alstom offers a complete range of solutions (from high-speed trains to metros, tramways and e-buses), passenger solutions, customised services (maintenance, modernisation), infrastructure, signalling and digital mobility solutions. Alstom is a world leader in integrated transport systems. The company recorded sales of €7.3 billion and booked €10.0 billion of orders in the 2016/17 fiscal year. Headquartered in France, Alstom is present in over 60 countries and employs 32,800 people.

A UNIFE report forecasts India's accessible market at €4 billion over 2016-18, with growth of 6.6%. Alstom has established a strong presence in India and is currently executing metro projects in several Indian cities including Chennai, Kochi and Lucknow, where it is supplying rolling stock manufactured out of its state-of-the-art facility at SriCity in Andhra Pradesh. In the mainline space, Alstom is executing signalling and power supply systems for the 343 km section of the World Bank-funded Eastern Dedicated Freight Corridor. Construction of the new electric locomotive factory for manufacturing and supply of 800 units of high-horsepower locomotives is also in full swing at Madhepura in Bihar. Alstom has set up an Engineering Centre of Excellence in Bengaluru, and this, coupled with a strong manufacturing base and localized supply chains, is uniquely positioned to serve customers across the globe. Today, Alstom in India employs close to 3,000 people and, in line with the Government of India's Make in India policy initiative, has been investing heavily in the country to produce world-class rolling stock, components, design, research and development, serving not only the domestic market but also the rest of the world.

OVERALL PURPOSE OF THE ROLE: The purpose of this role is to build in-house technical expertise for the data integration area and deliver data integration services for the platform.

Primary goals and objectives: This role is responsible for the delivery model for data integration services, for building technical expertise on data integration solutions, and for providing data integration services. The role is viewed as an expert in solution design, development, performance tuning and troubleshooting for data integration.

RESPONSIBILITIES:

Technical:
- Hands-on experience architecting and delivering solutions related to enterprise integration, APIs, service-oriented architecture, and technology modernizations.
- 3-4 years of hands-on experience with the design and implementation of integrations in the area of Dell Boomi.
- Understand business requirements and functional requirement documents, and design a technical solution to match the needs.
- Good grasp of Master Data Management, migration and governance best practices.
- Extensive data quality and data migration experience, including proficiency in data warehousing, data analysis and conversion planning for data migration activities.
- Lead and build data migration objects as needed for conversions of data from different sources.
- Should have architected integration solutions using Dell Boomi for cloud, hybrid and on-premise integration landscapes.
- Ability to build and architect high-performing, highly available, highly scalable Boomi Molecule infrastructure.
- In-depth understanding of enterprise integration patterns and the prowess to apply them in the customer's IT landscape.
- Assist project teams during system design to promote the efficient re-use of IT assets.
- Advise project teams during system development to assure compliance with architectural principles, guidelines and standards.
- Adept at building Boomi processes following error-handling and email-alert logging best practices.
- Proficient in using enterprise-level and database connectors.
- Excellent understanding of REST, with in-depth understanding of how Boomi processes can expose and consume services using the different HTTP methods, URIs and media types.
- Understand Atom, Molecule and Atmosphere configuration and management, platform monitoring, performance optimization suggestions, platform extension, and user permissions control.
- Knowledge of API governance and skills like caching, DB management and data warehousing.
- Hands-on experience configuring AS2, HTTPS and SFTP involving different authentication methods.
- Thorough knowledge of process deployment, applying extensions, setting up schedules, Web Services user management, process filtering and process reporting.
- Expert with XML and JSON activities like creation, mapping and migrations.
- Experience integrating SAP, SuccessFactors, SharePoint, cloud-based apps, web applications and engineering applications.
- Support and resolve issues related to data integration deliveries or the platform.

Project management:
- Deliver data integration projects using the data integration platform.
- Manage partner deliveries by setting up governance of their deliveries.
- Understand project priorities, timelines, budget and deliverables, and proactively push yourself and others to achieve project goals.

Managerial: The person is an individual contributor, operationally managing a small technical team.

Qualifications & skills:
- 10+ years of experience in the area of enterprise integrations.
- Minimum 3-4 years of experience with Dell Boomi.
- Working experience with databases like SQL Server, and with data warehousing.
- Hands-on experience with REST, SOAP, XML, JSON, SFTP, EDI.
- Experience integrating multiple technologies like SAP, web, and cloud-based apps.

EDUCATION: B.E.

BEHAVIORAL COMPETENCIES:
- Demonstrate excellent collaboration skills, as the person will be interacting with multiple business units, solution managers and internal IT teams.
- Excellent analytical and problem-solving skills.
- Coaches, supports and trains other team members.
- Demonstrates excellent communication skills.

TECHNICAL COMPETENCIES & EXPERIENCE: Technical expertise in Dell Boomi for data integration is a MUST. Language skills: English. IT skills: Dell Boomi, SQL, REST APIs, EDI, JSON, XML.

Location for the role: Bangalore. Travel: yes, approximately 5%.

Alstom is committed to creating a diverse and international working environment that reflects the future of our industry, our clients and end-users. As an employee, you will have a unique opportunity to continue to build your career and directly contribute to the expanding growth of the global transport industry.

Job type: Experienced.

Posted 3 days ago

Apply

4.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Date: 19 Jun 2025. Location: Bangalore, IN. Company: Alstom.

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

OVERALL PURPOSE OF THE ROLE: He/she will act as an Anaplan expert and ensure the platform is suitable for user adoption per the business requirements and compliant with Alstom security standards at all times. He/she will play a crucial role in implementing an architecture that is optimized for performance and storage, and is expected to lead and coordinate end-to-end delivery on projects and demands. In addition, he/she will be responsible for tracking users and managing licenses to ensure compliance with the contractual objectives.

STRUCTURE, REPORTING, NETWORKS & LINKS

Organization structure:
CITO
|-- VP Data & AI Governance
|-- Enterprise Data Domain Director
|-- Head of Analytics Platform
|-- Analytics Delivery Architect
|-- Analytics Technical Analyst

Organizational reporting: Reports to the Head of Analytics Platform.

Networks & links: Internally, the Digital Platforms team, Innovation team, application platform owners, business process owners, and infrastructure team. Externally, third-party technology providers and strategic partners.

Location: Position will be based in Bangalore.

RESPONSIBILITIES
- Design, develop, and deploy interactive dashboards and reports using MS Fabric and Qlik Cloud (good to have), ensuring alignment with business requirements and goals.
- Implement and manage data integration workflows utilizing MS Fabric to ensure efficient data processing and accessibility.
- Use Python scripts to automate data cleaning and preprocessing tasks for data models.
- Understand and integrate Power BI reports into other applications using embedded analytics such as the Power BI service (SaaS), Teams, SharePoint, or API automation.
- Be responsible for access management of app workspaces and content.
- Integrate Power BI servers with different data sources and handle timely upgrades/servicing of Power BI.
- Schedule and refresh jobs on the Power BI on-premises data gateway.
- Configure standard system reports, as well as customized reports as required.
- Help with various kinds of database connections (SQL, Oracle, Excel, etc.) with Power BI services.
- Investigate and troubleshoot reporting issues and problems.
- Maintain the reporting schedule and document reporting procedures.
- Monitor and troubleshoot data flow issues, optimizing the performance of MS Fabric applications as needed.
- Ensure collaboration with functional and technical architects as business cases are set up for each initiative; collaborate with other analytics teams to drive and operationalize analytical deployment.
- Maintain clear and coherent communication, both verbal and written, to understand data needs and report results.
- Ensure compliance with internal policies and regulations.
- Strong ability to take the lead and be autonomous; proven planning, prioritization, and organizational skills; ability to drive change through innovation and process improvement.
- Report to management and stakeholders in a clear and concise manner.
- Good to have: contribution to the integration and utilization of Denodo for data virtualization, enhancing data access across multiple sources.
- Facilitate effective communication with stakeholders regarding project updates, risks, and resolutions to ensure transparency and alignment.
- Participate in team meetings and contribute innovative ideas to improve reporting and analytics solutions.

EDUCATION: Bachelor's/Master's degree in Computer Science Engineering/Technology or a related field.

Experience: Minimum 3 and maximum 5 years of total experience. Mandatory: 2+ years of experience in MS Fabric and Power BI end-to-end development using Power BI Desktop, connecting multiple data sources (SAP, SQL, Azure, REST APIs, etc.). Hands-on experience in Python, R, and SQL for data manipulation, analysis, data pipelines and database interaction. Experience or knowledge in using PySpark or Jupyter Notebook for data cleaning, transformation, exploration, visualization, and building data models on large datasets.

Technical competencies:
- Proficient in using MS Fabric for data integration and automation of ETL processes.
- Knowledge of PySpark and Python data-modelling libraries (NumPy, pandas).
- Hands-on use of the Python and R programming languages for data processing.
- Understanding of data governance principles for quality and security.
- Strong expertise in creating dashboards and reports using Power BI and Qlik.
- Knowledge of data modeling concepts in Qlik and Power BI.
- Proficient in writing complex SQL queries for data extraction and analysis.
- Skilled in utilizing analytical functions in Power BI and Qlik.
- Experience in developing visual reports, dashboards, and KPI scorecards using Power BI Desktop and Qlik.
- Hands-on experience with PowerPivot, role-based data security, Power Query, DAX queries, Excel pivots/charts/grids, and Power View.
- Good to have: Power BI Services and administration knowledge.

Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
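As an example of the Python data-cleaning automation this role mentions, a small pandas sketch follows; the toy dataset and column names are invented.

```python
# Illustrative data-cleaning step before modelling; the dataset is a toy.
import numpy as np
import pandas as pd

raw = pd.DataFrame({
    "customer": [" Alice ", "BOB", None, "Carol"],
    "revenue":  ["1,200", "950", "300", "n/a"],
})

clean = raw.dropna(subset=["customer"]).copy()
# Normalize names: strip whitespace, use title case.
clean["customer"] = clean["customer"].str.strip().str.title()
# Coerce revenue strings ("1,200", "n/a") into floats with NaN for missing.
clean["revenue"] = (
    clean["revenue"]
    .str.replace(",", "", regex=False)
    .replace("n/a", np.nan)
    .astype(float)
)
print(clean)
```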

Posted 3 days ago

Apply

2.0 - 6.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Job purpose: The Data Analyst plays a crucial lead role in managing and optimizing business intelligence solutions using Power BI.

- Leadership and strategy: Lead the design, development, and deployment of Power BI reports and dashboards. Provide strategic direction for data visualization and business intelligence initiatives. Interface with business owners, project managers, planning managers, resource managers, etc. Develop a roadmap for execution of complex data analytics projects.
- Data modeling and integration: Develop complex data models, establish relationships, and ensure data integrity. Oversee data integration from various sources.
- Advanced analytics: Perform advanced data analysis using DAX (Data Analysis Expressions) and other analytical tools to derive insights and support decision-making.
- Collaboration: Work closely with stakeholders to gather requirements, define data needs, and ensure the delivery of high-quality BI solutions.
- Performance optimization: Optimize solutions for performance, ensuring efficient data processing and report rendering.
- Mentorship: Mentor and guide junior developers, providing technical support and best practices for Power BI development.
- Data security: Implement and maintain data security measures, ensuring compliance with data protection regulations.
- Demonstrated experience leading complex projects with a team of varied experience levels.

You are meant for this job if you have:
- Educational background: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Experience working with unstructured data and data integration.
- Technical skills: proficiency in Power BI, DAX, SQL, and data modeling, with exposure to data engineering; experience with data integration tools and ETL processes; hands-on experience with Snowflake.
- Experience: 7-8 years in business intelligence and data analytics, with a focus on Power BI.
- Soft skills: strong analytical and problem-solving skills, excellent communication abilities, and the capacity to lead and collaborate with global cross-functional teams.

Skills: Change Leadership, Process Mapping.

Posted 3 days ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Chennai, Bengaluru

Work from Office

ODI Developer, Chennai/Bangalore (WFO only).

- 5-8 years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). ODI expertise is a must; profiles whose ETL experience is only in Informatica or any tool other than ODI will be rejected.
- Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL development.
- Experience in data integration, transformation, and loading from heterogeneous data sources.
- Strong understanding of data modeling concepts and ETL best practices.
- Familiarity with performance tuning and troubleshooting of ETL processes.
- Knowledge of scripting languages (e.g., Python, JavaScript) for automation is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication skills to work effectively with cross-functional teams.

For more information, please call Varsha at 7200847046.

Posted 3 days ago

Apply

3.0 - 6.0 years

5 - 6 Lacs

Gurugram

Work from Office

Design, develop, and deploy backend services and APIs using Python. Write Python code to implement workflows and automation. Work closely with the Odoo developer. Write optimized, scalable SQL scripts, and build data pipelines for processing large datasets via Python and SQL.

Required candidate profile: 3-6 years of professional experience in Python and SQL; solid understanding of relational databases; advanced SQL queries and stored procedures; experience with RESTful API development.
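A minimal sketch of the backend API work described above, using FastAPI with a SQLite stand-in for the production database; route and table names are illustrative.

```python
# Minimal FastAPI service with one SQL-backed endpoint; names are illustrative.
import sqlite3

from fastapi import FastAPI

app = FastAPI()


def get_conn() -> sqlite3.Connection:
    conn = sqlite3.connect("app.db")
    conn.row_factory = sqlite3.Row  # lets us convert rows to dicts
    return conn


@app.get("/orders/{order_id}")
def read_order(order_id: int) -> dict:
    conn = get_conn()
    try:
        row = conn.execute(
            "SELECT id, amount FROM orders WHERE id = ?", (order_id,)
        ).fetchone()
        return dict(row) if row else {"error": "not found"}
    finally:
        conn.close()

# Run with: uvicorn main:app --reload  (assumes this file is named main.py)
```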

Posted 3 days ago

Apply

6.0 - 9.0 years

27 - 42 Lacs

Pune

Work from Office

Job summary: The Sr. Developer role focuses on designing, developing, and implementing data solutions using IBM InfoSphere DataStage and PL/SQL. With a hybrid work model, the candidate will contribute to optimizing data warehousing processes and ensuring seamless data integration. This position requires a minimum of 6 years of experience in ETL and scheduling basics, with a preference for expertise in the account management and cash management domains.

Responsibilities:
- Design and develop efficient ETL processes using IBM InfoSphere DataStage to ensure seamless data integration and transformation.
- Implement data warehousing concepts to optimize storage and retrieval processes, enhancing overall data management efficiency.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications for data solutions.
- Oversee the scheduling of data processes to ensure timely execution and delivery of data insights to stakeholders.
- Provide technical expertise in PL/SQL to develop robust database solutions that support business operations.
- Analyze existing data systems and recommend improvements to enhance performance and scalability.
- Ensure data quality and integrity by implementing rigorous testing and validation procedures throughout the development lifecycle.
- Troubleshoot and resolve data-related issues to maintain smooth operations and minimize downtime.
- Document technical processes and solutions to facilitate knowledge sharing and support future development efforts.
- Stay updated with industry trends and advancements in data integration technologies to drive innovation within the team.
- Support the hybrid work model by effectively managing tasks both remotely and on-site, ensuring consistent productivity.
- Communicate effectively in English to collaborate with team members and present findings to stakeholders.
- Contribute to the company's purpose by developing data solutions that enhance decision-making and impact society positively.

Qualifications:
- Strong technical skills in data warehousing concepts, ETL, and data integration, with hands-on experience in IBM InfoSphere DataStage.
- Proficiency in PL/SQL for developing and optimizing database solutions.
- Solid understanding of scheduling basics to manage data processes efficiently.
- Experience in the account management and cash management domains is a plus, providing valuable insight into financial data handling.
- Excellent communication skills in English, both written and spoken.
- Adaptability to the hybrid work model, showcasing flexibility and productivity in both remote and on-site settings.
- Proactive approach to learning and applying new technologies to enhance data solutions.

Posted 3 days ago

Apply

5.0 - 8.0 years

18 - 25 Lacs

Pune

Work from Office

We are seeking an experienced Modern Microservice Developer to join our team and contribute to the design, development, and optimization of scalable microservices and data processing workflows. The ideal candidate will have expertise in Python, containerization, and orchestration tools, along with strong skills in SQL and data integration.

Key responsibilities:
- Develop and optimize data processing workflows and large-scale data transformations using Python.
- Write and maintain complex SQL queries in Snowflake to support efficient data extraction, manipulation, and aggregation.
- Integrate diverse data sources and perform validation testing to ensure data accuracy and integrity.
- Design and deploy containerized applications using Docker, ensuring scalability and reliability.
- Build and maintain RESTful APIs to support a microservices architecture.
- Implement CI/CD pipelines and manage orchestration tools such as Kubernetes or ECS for automated deployments.
- Monitor and log application performance, ensuring high availability and quick issue resolution.

Requirements (mandatory):
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5-8 years of experience in Python development, with a focus on data processing and automation.
- Proficiency in SQL, with hands-on experience in Snowflake.
- Strong experience with Docker and containerized application development.
- Solid understanding of RESTful APIs and microservices architecture.
- Familiarity with CI/CD pipelines and orchestration tools like Kubernetes or ECS.
- Knowledge of logging and monitoring tools to ensure system health and performance.

Preferred skills: Experience with cloud platforms (AWS, Azure, or GCP) is a plus.

Posted 3 days ago

Apply

6.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Naukri logo

Job Summary

The Sr. Developer role involves designing, developing, and maintaining data integration solutions using Ab Initio admin and other ETL tools. The candidate will work on data warehousing projects, ensuring efficient data processing and integration. This position requires a strong understanding of data warehousing concepts and proficiency in SQL and Unix Shell Scripting. The role is hybrid, with no travel required.

Responsibilities

  • Design and develop data integration solutions using Ab Initio tools to ensure seamless data processing and transformation.
  • Collaborate with cross-functional teams to gather and analyze requirements for data warehousing projects.
  • Implement ETL processes to extract, transform, and load data from various sources into data warehouses.
  • Optimize SQL queries to enhance performance and ensure efficient data retrieval and manipulation.
  • Utilize Unix Shell Scripting for automation and scheduling of data processing tasks (see the sketch after this listing).
  • Monitor and troubleshoot data integration workflows to ensure data accuracy and integrity.
  • Provide technical support and guidance to team members on data warehousing and ETL best practices.
  • Conduct regular reviews of data integration processes to identify areas for improvement and implement the necessary changes.
  • Ensure compliance with data governance and security standards in all data integration activities.
  • Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
  • Develop and maintain documentation for data integration processes and workflows.
  • Stay updated with the latest trends and technologies in data warehousing and ETL to drive innovation.
  • Contribute to the company's mission by enabling data-driven decision-making through robust data integration solutions.

Qualifications

  • Expertise in data warehousing concepts and scheduling basics to design effective data solutions.
  • Strong skills in ETL and SQL to manage data extraction and transformation processes efficiently.
  • Proficiency in Ab Initio GDE, Conduct>It, and Co>Operating System for advanced data integration tasks.
  • Experience in Unix Shell Scripting to automate and streamline data workflows.
  • Domain experience in Telecom is nice to have, for understanding industry-specific data requirements.
  • Ability to work in a hybrid model, balancing remote and on-site tasks effectively.
  • Strong analytical and problem-solving skills to address data integration challenges.

Certifications Required

  • Ab Initio Certification
  • SQL Certification
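
The listing above pairs ETL development with Unix Shell Scripting for automation and scheduling. The Python sketch below shows one common pattern: wrapping a shell-based ETL job so a scheduler can run it and act on its exit code. The script and log paths are hypothetical placeholders; an Ab Initio shop would invoke its own site-specific wrappers.

    # A minimal sketch of automating a scheduled ETL task from Python.
    # ETL_SCRIPT and LOG_FILE are hypothetical placeholder paths.
    import subprocess
    import sys
    from datetime import datetime

    ETL_SCRIPT = "/opt/etl/bin/run_nightly_load.sh"   # hypothetical path
    LOG_FILE = "/var/log/etl/nightly_load.log"        # hypothetical path

    def run_etl() -> int:
        """Run the ETL shell wrapper, log its output, return its exit code."""
        result = subprocess.run([ETL_SCRIPT], capture_output=True, text=True)
        with open(LOG_FILE, "a") as log:
            log.write(f"{datetime.now().isoformat()} rc={result.returncode}\n")
            log.write(result.stdout)
            log.write(result.stderr)
        return result.returncode

    if __name__ == "__main__":
        # A non-zero exit code lets the scheduler (cron, Conduct>It, or
        # similar) detect failure and alert.
        sys.exit(run_etl())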

Posted 3 days ago

Apply

Exploring Data Integration Jobs in India

Data integration is a crucial aspect of businesses today, as organizations strive to streamline their data management processes and gain valuable insights from their data. In India, the demand for data integration professionals is on the rise, with companies across various industries actively seeking skilled individuals to fill these roles.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Pune
  5. Hyderabad

Average Salary Range

The average salary range for data integration professionals in India varies based on experience levels. Entry-level positions typically start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of data integration, a typical career path may include roles such as Data Analyst, ETL Developer, Data Engineer, Data Integration Specialist, and Data Architect. Progression in this field often involves moving from Junior Developer to Senior Developer, and eventually to a Tech Lead position.

Related Skills

In addition to expertise in data integration tools and technologies, professionals in this field are often expected to have skills in data modeling, SQL, ETL processes, data warehousing, and data governance.

Interview Questions

  • What is data integration and why is it important? (basic)
  • Can you explain the difference between ETL and ELT processes? (medium)
  • How do you handle data quality issues during the data integration process? (medium)
  • Describe a challenging data integration project you worked on and how you overcame obstacles. (advanced)
  • How do you stay updated with the latest trends and technologies in data integration? (basic)
  • What is your experience with data migration and synchronization tasks? (medium)
  • Explain the role of metadata in data integration. (medium)
  • Can you discuss the benefits and limitations of using cloud-based data integration tools? (advanced)
  • How do you ensure data security and privacy in data integration processes? (medium)
  • What is your experience with integrating unstructured data into a structured database? (medium)
  • Describe a scenario where you had to optimize data integration performance. (advanced)
  • How do you handle data transformation requirements in a data integration project? (medium)
  • What are the common challenges faced during data integration projects and how do you address them? (medium)
  • Can you explain the concept of data mapping and its significance in data integration? (basic; see the sketch after this list)
  • How do you approach data profiling and cleansing tasks in a data integration project? (medium)
  • Describe your experience with implementing real-time data integration solutions. (advanced)
  • How do you ensure data consistency and accuracy in a large-scale data integration project? (medium)
  • What are your preferred data integration tools and why? (basic)
  • How do you collaborate with cross-functional teams during a data integration project? (medium)
  • Explain the importance of data lineage in data integration processes. (medium)
  • What are the key factors to consider when designing a data integration strategy for an organization? (advanced)
  • How do you handle data conflicts and inconsistencies during the data integration process? (medium)
  • Can you discuss your experience with implementing data governance policies in data integration projects? (advanced)
  • Describe a scenario where you had to troubleshoot data integration issues in a production environment. (advanced)
  • How do you prioritize and manage multiple data integration projects simultaneously? (medium)
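
For the data mapping question above, the minimal Python sketch below shows the idea: translating one system's field names onto a target schema before loading. All field names here are invented for illustration.

    # A minimal data mapping sketch: renaming source fields to a target
    # schema. Every field name here is invented for illustration.
    FIELD_MAP = {
        "cust_id": "customer_id",
        "fname": "first_name",
        "lname": "last_name",
    }

    def map_record(source: dict) -> dict:
        """Apply the source-to-target field mapping to a single record."""
        return {target: source.get(src) for src, target in FIELD_MAP.items()}

    print(map_record({"cust_id": 42, "fname": "Asha", "lname": "Rao"}))
    # {'customer_id': 42, 'first_name': 'Asha', 'last_name': 'Rao'}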

Closing Remark

As you explore opportunities in the data integration field in India, remember to showcase your expertise, stay updated with industry trends, and practice your interview skills. With the right preparation and confidence, you can land a rewarding career in data integration. Good luck!
