
1102 Data Integration Jobs - Page 7

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

6 - 10 Lacs

Gurugram

Work from Office


As a consultant you will serve as a client-facing practitioner who sells, leads and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful Consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions used to solve the most challenging business problems. You will work developing solutions that excel at user experience, style, performance, reliability and scalability to reduce costs and improve profit and shareholder value. Your primary responsibilities include: Build, automate and release solutions based on clients' priorities and requirements. Explore and discover risks, resolve issues that affect release scope, schedule and quality, and bring potential solutions to the table. Make sure that all integration solutions meet the client specifications and are delivered on time. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Minimum 5+ years of experience in the IT industry. Minimum 4+ years of experience in Oracle Applications and Oracle Cloud in a technical domain. 2 end-to-end implementations in Oracle Supply Chain Management Cloud as a Functional Consultant. Should have worked in Inventory, Order Management, Cost Management, GOP Cloud, Data Integration, FBDI, ADFDI. Minimum 4+ years of experience in BIP reporting. Preferred technical and professional experience: You'll have access to all the technical and management training courses you need to become the expert you want to be. Should have a minimum of 3 or more years of relevant experience in Oracle Cloud Technical (Oracle Fusion) 12c development and implementation. Should have good knowledge of integrating with web services, XML (Extensible Markup Language) and other APIs (Application Programming Interface) to transfer data from source to target, in addition to the database.

Posted 5 days ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office


As a consultant you will serve as a client-facing practitioner who sells, leads and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful Consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions used to solve the most challenging business problems. You will work developing solutions that excel at user experience, style, performance, reliability and scalability to reduce costs and improve profit and shareholder value. Your primary responsibilities include: Build, automate and release solutions based on clients' priorities and requirements. Explore and discover risks, resolve issues that affect release scope, schedule and quality, and bring potential solutions to the table. Make sure that all integration solutions meet the client specifications and are delivered on time. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Minimum 5+ years of experience in the IT industry. Minimum 4+ years of experience in Oracle Applications and Oracle Cloud in a technical domain. 2 end-to-end implementations in Oracle Supply Chain Management Cloud as a Functional Consultant. Should have worked in Inventory, Order Management, Cost Management, GOP Cloud, Data Integration, FBDI, ADFDI. Minimum 4+ years of experience in BIP reporting. Preferred technical and professional experience: You'll have access to all the technical and management training courses you need to become the expert you want to be. Should have a minimum of 3 or more years of relevant experience in Oracle Cloud Technical (Oracle Fusion) 12c development and implementation. Should have good knowledge of integrating with web services, XML (Extensible Markup Language) and other APIs (Application Programming Interface) to transfer data from source to target, in addition to the database.

Posted 5 days ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Bengaluru

Work from Office


As a BigData Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Big Data Developer, Hadoop, Hive, Spark, PySpark, strong SQL. Ability to incorporate a variety of statistical and machine learning techniques. Basic understanding of Cloud (AWS, Azure, etc.). Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java. Preferred technical and professional experience: Basic understanding of or experience with predictive/prescriptive modeling skills. You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
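For context on the day-to-day work, here is a minimal PySpark sketch of a source-to-target batch pipeline of the kind this listing describes. The paths and column names (claim_id, claim_amount, claim_date) are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("source-to-target-demo").getOrCreate()

# Extract: read raw CSV files from a landing zone (path is illustrative).
raw = spark.read.option("header", True).csv("s3://landing/claims/*.csv")

# Transform: deduplicate, enforce types, and drop unusable rows.
clean = (
    raw.dropDuplicates(["claim_id"])
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .filter(F.col("claim_amount").isNotNull())
)

# Load: write partitioned Parquet into the curated zone.
clean.write.mode("overwrite").partitionBy("claim_date").parquet("s3://curated/claims/")
```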

Posted 5 days ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

Coimbatore

Work from Office


As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact! The ability to be a team player. The ability and skill to train other people in procedural and technical topics. Strong communication and collaboration skills. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Strong knowledge and experience in database design, modeling and development using PL/SQL; minimum of 6 years. Proficiency with Oracle databases and tools such as SQL Developer and Toad. In-depth understanding of SQL tuning and optimization techniques. Knowledge of database performance monitoring and troubleshooting. Familiarity with ETL processes and data integration techniques, and strong analytical and problem-solving skills. Preferred technical and professional experience: Ability to work in a fast-paced environment and meet deadlines. Knowledge of agile software development practices is a plus. Bachelor's degree in computer science or a related field is preferred, but not required.
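As a flavor of the SQL tuning work mentioned above, a minimal Python sketch that pulls an Oracle execution plan via DBMS_XPLAN, assuming the python-oracledb driver and a hypothetical orders table and connection:

```python
import oracledb  # pip install oracledb

# Connection details are placeholders.
conn = oracledb.connect(user="app", password="***", dsn="dbhost/ORCLPDB1")
with conn.cursor() as cur:
    # Ask the optimizer for its plan before deciding how to tune the query.
    cur.execute("EXPLAIN PLAN FOR SELECT * FROM orders WHERE customer_id = 42")
    cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
    for (line,) in cur:
        print(line)  # look for full table scans that an index could avoid
conn.close()
```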

Posted 5 days ago

Apply

3.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office


As a member of the Data and Technology practice, you will be working on advanced AI ML engagements tailored for the investment banking sector. This includes developing and maintaining data pipelines, ensuring data quality, and enabling data-driven insights. Your core responsibility will be to build and manage scalable data infrastructure that supports our proof-of-concept initiatives (POCs) and full-scale solutions for our clients. You will work closely with data scientists, DevOps engineers, and clients to understand their data requirements, translate them into technical tasks, and develop robust data solutions. Your primary duties will encompass: Develop, optimize, and maintain scalable and reliable data pipelines using tools such as Python, SQL, and Spark. Integrate data from various sources including APIs, databases, and cloud storage solutions such as Azure, Snowflake, and Databricks. Implement data quality checks and ensure the accuracy and consistency of data. Manage and optimize data storage solutions, ensuring high performance and availability. Work closely with data scientists and DevOps engineers to ensure seamless integration of data pipelines and support machine learning model deployment. Monitor and optimize the performance of data workflows to handle large volumes of data efficiently. Create detailed documentation of data processes. Implement security best practices and ensure compliance with industry standards. Experience / Skills: 5+ years of relevant experience, including: Experience in a data engineering role, preferably within the financial services industry. Strong experience with data pipeline tools and frameworks such as Python, SQL, and Spark. Proficiency in cloud platforms, particularly Azure, Snowflake, and Databricks. Experience with data integration from various sources including APIs and databases. Strong understanding of data warehousing concepts and practices. Excellent problem-solving skills and attention to detail. Strong communication skills, both written and oral, with a business and technical aptitude. Additionally, desired skills: Familiarity with big data technologies and frameworks. Experience with financial datasets and understanding of investment banking metrics. Knowledge of visualization tools (e.g., PowerBI). Education: Bachelor's or Master's in Science or Engineering disciplines such as Computer Science, Engineering, Mathematics, Physics, etc.
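As one illustration of the data quality checks mentioned above, a minimal pandas sketch of a pre-load validation gate; the file and the column names (trade_id, notional) are hypothetical:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Compute simple completeness and consistency metrics before loading."""
    return {
        "row_count": len(df),
        "null_trade_ids": int(df["trade_id"].isna().sum()),
        "duplicate_trade_ids": int(df["trade_id"].duplicated().sum()),
        "negative_notionals": int((df["notional"] < 0).sum()),
    }

trades = pd.read_parquet("trades.parquet")  # illustrative source file
report = run_quality_checks(trades)
# Fail the pipeline run rather than load suspect data downstream.
assert report["null_trade_ids"] == 0, f"quality gate failed: {report}"
```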

Posted 5 days ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


We are seeking an experienced ETL Data Engineer with expertise in Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter to support our ongoing and upcoming projects. The ideal candidate will be responsible for designing, developing, and maintaining data integration processes using both IICS and PowerCenter. Proficiency in Oracle is essential, including hands-on experience in building, optimizing, and managing data solutions on the platform. The candidate should have the ability to handle tasks independently, demonstrating strong problem-solving skills and initiative in managing data integration projects. This role involves close collaboration with business stakeholders, data architects, and cross-functional teams to deliver effective data solutions that align with business objectives. Who you are: Basic Qualification: Education: Bachelor's in Computer/IT or similar. Mandatory Skills: ETL Data Engineer, IICS, Informatica PowerCenter. Nice to have: Unix

Posted 5 days ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Bengaluru

Work from Office


We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role will serve as a primary point of accountability for the technical implementation of the data flows, repositories and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows and data-centric solutions you create will support a wide range of reporting, analytics, decision support and (Generative) AI solutions. Your Role: Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies. Write optimized SQL queries for data extraction, transformation, and loading. Utilize Python for advanced data processing, automation tasks, and system integration. Be an advisor with your in-depth knowledge of Snowflake architecture, features, and best practices. Develop and maintain complex data pipelines and ETL processes in Snowflake (see the sketch below). Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions. Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions. Ensure data quality, integrity, and compliance throughout the data lifecycle. Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements. Document data models, processes, and workflows clearly for future reference and knowledge sharing. Build data tests, unit tests and mock data frameworks. Who You Are: Bachelor's or master's degree in computer science, mathematics, or related fields. At least 8 years of experience as a data warehouse expert, data engineer or data integration specialist. In-depth knowledge of Snowflake components, including security and governance. Proven experience in implementing complex data models (e.g. OLTP, OLAP, Data Vault). A strong understanding of ETL including end-to-end data flows, from ingestion to data modeling and solution delivery. Proven industry experience with DBT and JINJA scripts. Strong proficiency in SQL, with additional knowledge of Python (i.e. pandas and PySpark) being advantageous. Familiarity with data & analytics solutions such as AWS (especially Glue, Lambda, DMS) is nice to have. Experience working with Azure DevOps and warehouse automation tools (e.g. Coalesce) is a plus. Experience with Healthcare R&D is a plus. Excellent English communication skills, with the ability to effectively engage both with R&D scientists and software engineers. Experience working in virtual and agile teams.
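To make one of these ELT steps concrete, a minimal Python sketch that merges staged rows into a Data Vault style satellite in Snowflake. The connection parameters and table/column names (stg_customer, sat_customer, customer_hk, hash_diff) are hypothetical:

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection parameters are placeholders.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="RAW_VAULT",
)
try:
    # Only insert new hub keys or rows whose attribute hash has changed.
    conn.cursor().execute("""
        MERGE INTO sat_customer AS tgt
        USING stg_customer AS src
          ON tgt.customer_hk = src.customer_hk
        WHEN MATCHED AND tgt.hash_diff <> src.hash_diff THEN UPDATE SET
          hash_diff = src.hash_diff,
          load_ts   = CURRENT_TIMESTAMP()
        WHEN NOT MATCHED THEN INSERT (customer_hk, hash_diff, load_ts)
          VALUES (src.customer_hk, src.hash_diff, CURRENT_TIMESTAMP())
    """)
finally:
    conn.close()
```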

Posted 5 days ago

Apply

8.0 - 13.0 years

12 - 13 Lacs

Pune, Chennai, Bengaluru

Work from Office


Hi everyone. Open positions in the Grafana Dashboard Specialist role. Greetings from Tekaccel! This is an excellent opportunity with us. If you have that unique and unlimited passion for building world-class enterprise software products that turn into actionable intelligence, then we have the right opportunity for you and your career. What are we looking for? Job Title: Grafana Dashboard Specialist. Total Experience Required: 8+ Years. Relevant Experience: 8+ Years. Work Location: Pune / Chennai / Chandigarh / Bangalore. Working Mode: Work From Office (WFO). Shift Timings: EMEA/NAM Shift. Employment Type: Contract. Key Responsibilities: Dashboard Development: Design and build robust Grafana dashboards to deliver actionable insights for monitoring complex production environments. Collaboration & Metrics Identification: Work closely with technology and management teams to identify key metrics and ensure effective visualization. Data Integration: Integrate data from multiple sources (Splunk, AppDynamics, microservices, Spark, Jira, and ServiceNow) into Grafana for real-time monitoring and alerting. SPL Query Development: Write and optimize SPL queries within Grafana to extract, manipulate, and visualize relevant data from Splunk (see the sketch below). Log Aggregation & Insights: Utilize log aggregation and analysis to drive actionable insights from large datasets. Custom Scripting: Develop custom Unix/Python scripts to automate data retrieval and processing, feeding into Grafana dashboards. Operational Reporting: Create dashboards in Jira and ServiceNow for enhanced visibility and reporting of operational KPIs. Production Support: Troubleshoot dashboard performance issues and ensure accuracy, supporting the overall production monitoring ecosystem. Qualifications: 8+ years of relevant experience in Grafana dashboard development and production support monitoring. Strong expertise in Grafana and Splunk, with proficiency in writing and optimizing SPL queries. In-depth knowledge of AppDynamics, microservices architecture, and Apache Spark. Advanced scripting abilities using Unix and Python. Proven experience in log aggregation and transforming raw logs into meaningful business and technical metrics. Familiarity with Jira and ServiceNow for operational dashboards and reporting. Excellent problem-solving skills, with a keen eye for detail and a drive for continuous improvement. Strong communication and collaboration skills, with the ability to engage both technical and management stakeholders. Mandatory Skills: Grafana, Splunk, AppDynamics, Microservices, Spark, Log Aggregation. If interested, please share your updated resume at naveen@tekaccel.com or WhatsApp at +91 7997763537. Tekaccel Software Services India
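For flavor, a minimal Python sketch of the scripted data retrieval this role describes: streaming the results of an SPL search from Splunk's REST export endpoint, ready to feed a Grafana-backed store. The host, credentials, index, and query are hypothetical:

```python
import requests

BASE = "https://splunk.example.com:8089"  # Splunk management port (placeholder)
AUTH = ("svc_grafana", "***")             # service account (placeholder)

# /services/search/jobs/export streams results as they are produced.
resp = requests.post(
    f"{BASE}/services/search/jobs/export",
    auth=AUTH,
    verify=False,  # lab setting only; verify certificates in production
    data={
        "search": "search index=prod sourcetype=app_logs level=ERROR"
                  " | stats count by service",
        "output_mode": "json",
    },
    stream=True,
)
for line in resp.iter_lines():
    if line:
        print(line.decode())  # one JSON result object per line
```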

Posted 6 days ago

Apply

8.0 - 12.0 years

13 - 20 Lacs

Ranchi

Work from Office


Key Responsibilities: OSI PI System Development: Design, configure, and implement OSI PI System solutions, including PI Asset Framework (AF), PI Data Archive, PI Vision, and PI Integrators. Asset Framework (AF) Modeling: Develop hierarchical asset models, templates, and calculations to standardize data across industrial operations. Real-time Data Integration: Work with SCADA, DCS, PLCs, and IoT systems to integrate real-time and historical data into OSI PI. Scripting & Automation: Develop scripts using PowerShell, Python, or PI SDK (AF SDK, PI Web API, or PI SQL DAS) to automate data processes.
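To make the scripting item concrete, a minimal Python sketch that reads a snapshot value through the PI Web API REST interface; the server URL, credentials, and tag path are hypothetical:

```python
import requests

BASE = "https://pi-server.example.com/piwebapi"  # placeholder endpoint
session = requests.Session()
session.auth = ("pi_user", "***")  # placeholder credentials

# Resolve a PI point by its path to obtain its WebId.
point = session.get(
    f"{BASE}/points", params={"path": r"\\PISRV\Plant1.Furnace.Temp"}
).json()

# Read the current (snapshot) value of that stream.
value = session.get(f"{BASE}/streams/{point['WebId']}/value").json()
print(value["Timestamp"], value["Value"])
```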

Posted 6 days ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office


About The Role. Job Title: Senior Data Engineer. As a Senior Data Engineer, you will play a key role in designing and implementing data solutions @Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality and scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives. Responsibilities: 1. Data Architecture and Design: a. Design and develop scalable, high-performance data architecture and data models. b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions. c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects. d. Define and enforce data engineering best practices, standards, and guidelines. 2. Data Pipeline Development & Maintenance: a. Develop and maintain robust and scalable data pipelines for data ingestion, transformation, and loading for real-time and batch use cases. b. Implement ETL processes to integrate data from various sources into data storage systems. c. Optimise data pipelines for performance, scalability, and reliability: i. Identify and resolve performance bottlenecks in data pipelines and analytical systems. ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions. iii. Optimise database performance, including query tuning, indexing, and partitioning strategies. d. Implement real-time and batch data processing solutions. 3. Data Quality and Governance: a. Implement data quality frameworks and processes to ensure high data integrity and consistency. b. Design and enforce data management policies and standards. c. Develop and maintain documentation, data dictionaries, and metadata repositories. d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies. 4. ML Models Deployment & Management (is a plus): a. Responsible for designing, developing, and maintaining the infrastructure and processes necessary for deploying and managing machine learning models in production environments. b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes. c. Optimise model performance and latency for real-time inference in consumer applications. d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment. e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues. f. Implement monitoring and logging solutions to track model performance, data drift, and system health. 5. Team Leadership and Mentorship: a. Lead data engineering projects, providing technical guidance and expertise to team members: i. Conduct code reviews and ensure adherence to coding standards and best practices. b. Mentor and coach junior data engineers, fostering their professional growth and development. c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes. d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team: i. Participate in the evaluation and selection of data engineering tools and technologies. Qualifications: 1. 3-5 years' experience with a Bachelor's Degree in Computer Science, Engineering, Technology or a related field required. 2. Good understanding of streaming technologies like Kafka, Spark Streaming (see the sketch below). 3. Experience with Enterprise Business Intelligence Platform/Data platform sizing, tuning, optimization and system landscape integration in large-scale, enterprise deployments. 4. Proficiency in one programming language, preferably Java, Scala or Python. 5. Good knowledge of Agile, SDLC/CI-CD practices and tools. 6. Must have proven experience with Hadoop, MapReduce, Hive, Spark, Scala programming. Must have in-depth knowledge of performance tuning/optimizing data processing jobs, debugging time-consuming jobs. 7. Proven experience in development of conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse) and OLAP database solutions. 8. Good understanding of distributed systems. 9. Experience working extensively in a multi-petabyte DW environment. 10. Experience in engineering large-scale systems in a product environment.
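A minimal Structured Streaming sketch of the Kafka-based real-time pipelines referenced in the qualifications; the broker address, topic, and payload fields are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

# Subscribe to an event topic (broker and topic names are placeholders).
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")
         .option("subscribe", "transactions")
         .load()
)

# Pull one field out of the JSON payload and aggregate per 1-minute window.
amounts = events.select(
    F.get_json_object(F.col("value").cast("string"), "$.amount")
     .cast("double").alias("amount"),
    "timestamp",
)
totals = amounts.groupBy(F.window("timestamp", "1 minute")).agg(
    F.sum("amount").alias("total_amount")
)

# Write running totals to the console; a real job would target a sink table.
query = totals.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```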

Posted 6 days ago

Apply

4.0 - 9.0 years

11 - 21 Lacs

Pune, Hinjewadi, Hinjewadi-Pune

Work from Office


SECTION A: POSITION SUMMARY This role is accountable to develop, expand and optimize Data Management Architecture, Design & Implementation under Singtel Data Platform & Management. 1. Design, develop and implement data governance and management solutions, data quality, privacy, protection and associated control technology solutions as per best industry practice. 2. Review, evaluate and implement Data Management standards, primarily Data Classification and Data Retention, across systems. 3. Design, develop and implement Automated Data Discovery rules to identify the presence of PII attributes (see the sketch below). 4. Drive development, optimization, testing and tooling to improve overall data control management (Security, Data Privacy, Protection, Data Quality). 5. Review, analyze, benchmark, and approve solution designs from product companies, internal teams, and vendors. 6. Ensure that proposed solutions are aligned and conform to the data landscape, big data architecture guidelines and roadmap. SECTION B: KEY RESPONSIBILITIES AND RESULTS 1. Design and implement data management standards like Catalog Management, Data Quality, Data Classification, Data Retention. 2. Drive BAU process, testing and tooling to improve data security, privacy, and protection. 3. Identify, design, and implement internal process improvements: automating manual processes, controlling and optimizing data technology service delivery. 4. Implement and support Data Management Technology solutions throughout the lifecycle: user onboarding, upgrades, fixes, access management, etc. SECTION C: QUALIFICATIONS / EXPERIENCE / KNOWLEDGE REQUIRED Education and Qualifications: Diploma in Data Analytics, Data Engineering, IT, Computer Science, Software Engineering, or equivalent. Work Experience: Exposure to Data Management and Big Data concepts; knowledge and experience in Data Management, Data Integration, Data Quality products. Technical Skills: Informatica CDGC, Collibra, Alation, Informatica Data Quality, Data Privacy Management, Azure Databricks.
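As a simple illustration of automated PII discovery, a Python sketch that profiles sampled column values against regex rules. The rules are illustrative only; production catalog tools such as those listed combine patterns with dictionaries and classifiers:

```python
import re

# Illustrative detection rules; real discovery uses many more signals.
PII_RULES = {
    "email":     re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "nric_like": re.compile(r"\b[STFG]\d{7}[A-Z]\b"),  # Singapore NRIC shape
    "phone":     re.compile(r"\b\+?\d{8,12}\b"),
}

def scan_column(values: list) -> dict:
    """Return, per rule, the fraction of sampled values that match."""
    hits = {name: 0 for name in PII_RULES}
    for v in values:
        for name, rule in PII_RULES.items():
            if rule.search(str(v)):
                hits[name] += 1
    n = max(len(values), 1)
    return {name: count / n for name, count in hits.items()}

# A column whose match rate exceeds a threshold gets flagged for review.
print(scan_column(["alice@example.com", "S1234567A", "hello world"]))
```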

Posted 6 days ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Primary Responsibilities: Reporting Development and Data Integration: Assist with data projects related to integration with our core claims adjudication engines, eligibility, and other database items as necessary. Support the data leads by producing ad hoc reports as needed based on requirements from the business. Report on key milestones to our project leads. Ensure all reporting aligns with brand standards. Ensure PADU guidelines for tools, connections, and data security. Build a network with internal partners to assist with validating data quality. Analytical Skills Utilization: Apply analytical skills and develop business knowledge to support operations. Identify automation opportunities through trends and day-to-day tasks to help create efficiencies within the team. Perform root cause analysis via the 5 Whys method to identify process gaps and initiate process improvement efforts. Assist with user testing for reports and business insights dashboards, and assist with automation validation review. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Degree or equivalent data science, analysis, or mathematics experience. Experience supporting operational teams' performance with reports and analytics. Experience using Word (creating templates/documents), PowerPoint (creation and presentation), Teams, and SharePoint (document access/storage, sharing, List development and management). Basic understanding of reporting using business insights tools including Tableau and PowerBI. Expertise in Excel (data entry, sorting/filtering) and VBA. Proven solid communication skills including oral, written, and organizational skills. Proven ability to manage emotions effectively in high-pressure situations, maintaining composure and fostering a positive work environment conducive to collaboration and productivity. Preferred Qualifications: Experience leveraging and creating automation such as macros, PowerAutomate, Alteryx/ETL applications. Experience working with cloud-based servers, knowledge of database structure and stored procedures. Experience performing root cause analysis and demonstrated problem-solving skills. Knowledge of R/Python, SQL, DAX or other coding languages. Knowledge of multiple lines of business, benefit structures and claims processing systems. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 6 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office


Detailed Job Description. Responsibilities: 1. Building, maintaining and enhancing the financial systems (Planning Analytics) for budgeting/forecasting/actual MIS reporting. 2. Experience in developing objects in TM1/Planning Analytics, including cubes, rules, dimensions, MDX expressions, TI processes and Web sheets (see the sketch below). 3. Create and administer security at all levels: client groups, cubes, dimensions, processes, etc. 4. Analyze and resolve data and technical issues raised by the users. 5. Strong in business processes (manufacturing domain knowledge) and data warehousing concepts. 6. Experience with Planning Analytics Workspace and Planning Analytics for Excel. Total Years of Experience: 5+ years. Relevant Years of Experience: 4+ years in development and support activities on a live project. Mandatory Skills for screening: 1. Cognos TM1 (Planning Analytics) 2. TM1 Rules and Feeders 3. ETL Processes 4. Performance Optimization 5. Data Integration. Good to have (not mandatory): 1. Cognos Analytics 2. SQL Queries
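For a sense of how such objects are automated, a minimal Python sketch using the open-source TM1py client to run a TI process and query a cube with MDX; the server details, process name, parameter, and cube/dimension names are all hypothetical:

```python
from TM1py import TM1Service  # pip install tm1py

# Connection details are placeholders.
with TM1Service(address="tm1.example.com", port=8001,
                user="admin", password="***", ssl=True) as tm1:
    # Kick off a TurboIntegrator process that loads actuals for a month.
    tm1.processes.execute_with_return("Load Actuals", pMonth="2024-03")

    # Read a slice of a planning cube via MDX.
    mdx = """
        SELECT {[Version].[Budget]} ON COLUMNS,
               {[Account].[Revenue]} ON ROWS
        FROM [Planning]
    """
    for coordinates, cell in tm1.cubes.cells.execute_mdx(mdx).items():
        print(coordinates, cell["Value"])
```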

Posted 6 days ago

Apply

3.0 - 8.0 years

11 - 21 Lacs

Pune

Work from Office


P2 Grade (3+ to 5 yrs): CTC up to 11.50 LPA. P3 Grade (5 to 8 yrs): CTC up to 21 LPA. POSITION SUMMARY This role is accountable to develop, expand and optimize Data Management Architecture, Design & Implementation under Singtel Data Platform & Management. 1. Design, develop and implement data governance and management solutions, data quality, privacy, protection and associated control technology solutions as per best industry practice. 2. Review, evaluate and implement Data Management standards, primarily Data Classification and Data Retention, across systems. 3. Design, develop and implement Automated Data Discovery rules to identify the presence of PII attributes. 4. Drive development, optimization, testing and tooling to improve overall data control management (Security, Data Privacy, Protection, Data Quality). 5. Review, analyze, benchmark, and approve solution designs from product companies, internal teams, and vendors. 6. Ensure that proposed solutions are aligned and conform to the data landscape, big data architecture guidelines and roadmap. KEY RESPONSIBILITIES AND RESULTS 1. Design and implement data management standards like Catalog Management, Data Quality, Data Classification, Data Retention. 2. Drive BAU process, testing and tooling to improve data security, privacy, and protection. 3. Identify, design, and implement internal process improvements: automating manual processes, controlling and optimizing data technology service delivery. 4. Implement and support Data Management Technology solutions throughout the lifecycle: user onboarding, upgrades, fixes, access management, etc. QUALIFICATIONS / EXPERIENCE / KNOWLEDGE REQUIRED Education and Qualifications: Diploma in Data Analytics, Data Engineering, IT, Computer Science, Software Engineering, or equivalent. Work Experience: Exposure to Data Management and Big Data concepts; knowledge and experience in Data Management, Data Integration, Data Quality products. Technical Skills: Informatica CDGC, Collibra, Alation, Informatica Data Quality, Data Privacy Management, Azure Databricks.

Posted 6 days ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office


A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently. Job Description - Grade Specific: An expert on the principles and practices associated with data platform engineering, particularly within cloud environments, who demonstrates proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass: Team Leadership and Management: Supervising a team of platform engineers, with a focus on team dynamics and the efficient delivery of cloud platform solutions. Technical Guidance and Decision-Making: Providing technical leadership and making pivotal decisions concerning platform architecture, tools, and processes; balancing hands-on involvement with strategic oversight. Mentorship and Skill Development: Guiding team members through mentorship, enhancing their technical proficiencies, and nurturing a culture of continual learning and innovation in platform engineering practices. In-Depth Technical Proficiency: Possessing a comprehensive understanding of platform engineering principles and practices, and demonstrating expertise in crucial technical areas such as cloud services, automation, and system architecture. Community Contribution: Making significant contributions to the development of the platform engineering community, staying informed about emerging trends, and applying this knowledge to drive enhancements in capability.

Posted 6 days ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office


A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently. Job Description - Grade Specific: A strong grasp of the principles and practices associated with data platform engineering, particularly within cloud environments, with demonstrated proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass: Community Engagement: Actively participating in the professional data platform engineering community, sharing insights, and staying up-to-date with the latest trends and best practices. Project Contributions: Making substantial contributions to client delivery, particularly in the design, construction, and maintenance of cloud-based data platforms and infrastructure. Technical Expertise: Demonstrating a sound understanding of data platform engineering principles and knowledge in areas such as cloud data storage solutions (e.g., AWS S3, Azure Data Lake), data processing frameworks (e.g., Apache Spark), and data orchestration tools. Independent Work and Initiative: Taking ownership of independent tasks, displaying initiative and problem-solving skills when confronted with intricate data platform engineering challenges. Emerging Leadership: Commencing leadership roles, which may encompass mentoring junior engineers, leading smaller project teams, or taking the lead on specific aspects of data platform projects.

Posted 6 days ago

Apply

4.0 - 8.0 years

19 - 25 Lacs

Pune

Work from Office


5-7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment. Proficient in ETL tools, SQL, and data warehouse development. Proficient in Snowflake, Power BI, and OBIEE reporting platforms; must have worked on implementations using these tools and technologies. Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases. Working knowledge of Oracle databases and Oracle EBS structures. Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools. Build and automate data integration workflows that extract, transform, and load data from various sources including Oracle EBS and other enterprise systems. Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods. Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders. Work with IT and Business teams to gather reporting requirements and translate them into scalable technical solutions. Participate in data modeling and storage architecture using star and snowflake schema designs. Contribute to the implementation of data governance, metadata management, and access control mechanisms. Maintain documentation for solutions and participate in testing and validation activities. Support migration and replication of data using tools such as Qlik Replicate and contribute to cloud-based data architecture. Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes.

Posted 6 days ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Bengaluru

Remote


Job Title: Data Engineer (ETL & Spatial Data Expert). Locations: Bengaluru / Gurugram / Nagpur / Remote. Department: Data Engineering / GIS / ETL. Experience: As per requirement (CTC capped at 3.5x of experience in years). Notice Period: Max 30 days. Role Overview: We are looking for a detail-oriented and technically proficient Data Engineer with strong experience in FME, spatial data handling, and ETL pipelines. The role involves building, transforming, validating, and automating complex geospatial datasets and dashboards to support operational and analytical needs. Candidates will work closely with internal teams, local authorities (LA), and HMLR specs. Key Responsibilities: 1. Data Integration & Transformation: Build ETL pipelines using FME to ingest and transform data from Idox/CCF systems. Create Custom Transformers in FME to apply reusable business rules. Use Python (standalone or within FME) for custom transformations, date parsing, and validations. Conduct data profiling to assess completeness, consistency, and accuracy. 2. Spatial Data Handling: Manage and query spatial datasets using PostgreSQL/PostGIS. Handle spatial formats like GeoPackage, GML, GeoJSON, Shapefiles. Fix geometry issues like overlaps or invalid polygons using FME or SQL (see the sketch below). Ensure proper coordinate system alignment (e.g., EPSG:27700). 3. Automation & Workflow Orchestration: Use FME Server/FME Cloud to automate and monitor ETL workflows. Schedule batch processes via CI/CD, Cron, or Python. Implement audit trails and logs for all data processes and rule applications. 4. Dashboard & Reporting Integration: Write SQL views and aggregations to support dashboard visualizations. Optionally integrate with Power BI, Grafana, or Superset. Maintain metadata tagging for each data batch. 5. Collaboration & Communication: Interpret validation reports and collaborate with Analysts/Ops teams. Translate business rules into FME logic or SQL queries. Map data to LA/HMLR schemas accurately. Preferred Tools & Technologies: ETL: FME (Safe Software), Talend (optional), Python. Spatial DB: PostGIS, Oracle Spatial. GIS Tools: QGIS, ArcGIS. Scripting: Python, SQL. Validation: FME Testers, AttributeValidator, SQL views. Formats: CSV, JSON, GPKG, XML, Shapefiles. Collaboration: Jira, Confluence, Git. Ideal Candidate Profile: Strong hands-on experience with FME workflows and spatial data transformation. Proficient in scripting using Python and working with PostGIS. Demonstrated ability to build scalable data automation pipelines. Effective communicator capable of converting requirements into technical logic. Past experience with LA or HMLR data specifications is a plus. Required Qualifications: B.E./B.Tech. (Computer Science, IT, or ECE), B.Sc. (IT/CS), or full-time MCA. Strict Screening Criteria: No employment gaps over 4 months. Do not consider candidates from Jawaharlal Nehru University. Exclude profiles from Hyderabad or Andhra Pradesh (education or employment). Reject profiles with BCA, B.Com, Diploma, or open university backgrounds. Projects must detail technical tools/skills used clearly. Max CTC is 3.5x of total years of experience. No flexibility on notice period or compensation. No candidates from Noida for the Gurugram location.
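To illustrate the geometry-repair item, a minimal Python/PostGIS sketch that finds and fixes invalid polygons in place with ST_MakeValid, staying in British National Grid (EPSG:27700). The connection string and the parcels table are hypothetical:

```python
import psycopg2  # pip install psycopg2-binary

# Connection string and table/column names are placeholders.
conn = psycopg2.connect("dbname=land_registry user=etl password=***")
with conn, conn.cursor() as cur:
    # Count invalid geometries first so the fix can be audited.
    cur.execute("SELECT count(*) FROM parcels WHERE NOT ST_IsValid(geom);")
    print("invalid geometries:", cur.fetchone()[0])

    # Repair in place; ST_CollectionExtract(..., 3) keeps only polygons,
    # and ST_Multi normalizes the result to MultiPolygon.
    cur.execute("""
        UPDATE parcels
        SET geom = ST_Multi(ST_CollectionExtract(ST_MakeValid(geom), 3))
        WHERE NOT ST_IsValid(geom)
          AND ST_SRID(geom) = 27700;
    """)
conn.close()
```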

Posted 1 week ago

Apply

12.0 - 20.0 years

35 - 50 Lacs

Bengaluru

Hybrid


Data Architect with cloud expertise: Data Architecture, Data Integration & Data Engineering. ETL/ELT: Talend, Informatica, Apache NiFi. Big Data: Hadoop, Spark. Cloud platforms (AWS, Azure, GCP), Redshift, BigQuery. Python, SQL, Scala. GDPR, CCPA.

Posted 1 week ago

Apply

5.0 - 9.0 years

4 - 7 Lacs

Gurugram

Work from Office


Primary Skills: SQL (Advanced Level); SSAS (SQL Server Analysis Services), Multidimensional and/or Tabular Model; MDX / DAX (strong querying capabilities); Data Modeling (Star Schema, Snowflake Schema). Secondary Skills: ETL processes (SSIS or similar tools); Power BI / Reporting tools; Azure Data Services (optional but a plus). Role & Responsibilities: Design, develop, and deploy SSAS models (both tabular and multidimensional). Write and optimize MDX/DAX queries for complex business logic. Work closely with business analysts and stakeholders to translate requirements into robust data models. Design and implement ETL pipelines for data integration. Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred). Optimize existing cubes and data models for performance and scalability. Ensure data quality, consistency, and governance standards. Top Skill Set: SSAS (Tabular + Multidimensional modeling); strong MDX and/or DAX query writing; advanced SQL for data extraction and transformations; data modeling concepts (Fact/Dimension, Slowly Changing Dimensions, etc.); ETL tools (SSIS preferred); Power BI or similar BI tools; understanding of OLAP & OLTP concepts; performance tuning (SSAS/SQL). Skills: analytical skills, ETL processes (SSIS or similar tools), collaboration, Multidimensional Expressions (MDX), Power BI / reporting tools, SQL (advanced level), SQL proficiency, DAX, SSAS (multidimensional and tabular model), ETL, data modeling (star schema, snowflake schema), communication, Azure Data Services, MDX, data modeling, SSAS, data visualization

Posted 1 week ago

Apply

5.0 - 9.0 years

13 - 17 Lacs

Pune

Work from Office


Diacto is looking for a highly capable Data Architect with 5 to 9 years of experience to lead cloud data platform initiatives with a primary focus on Snowflake and Azure Data Hub. This individual will play a key role in defining the data architecture strategy, implementing robust data pipelines, and enabling enterprise-grade analytics solutions. This is an on-site role based in our Baner, Pune office. Qualifications: B.E./B.Tech in Computer Science, IT, or related discipline; MCS/MCA or equivalent preferred. Key Responsibilities: Design and implement enterprise-level data architecture with a strong focus on Snowflake and Azure Data Hub. Define standards and best practices for data ingestion, transformation, and storage. Collaborate with cross-functional teams to develop scalable, secure, and high-performance data pipelines. Lead Snowflake environment setup, configuration, performance tuning, and optimization. Integrate Azure Data Services with Snowflake to support diverse business use cases. Implement governance, metadata management, and security policies. Mentor junior developers and data engineers on cloud data technologies and best practices. Experience and Skills Required: 5-9 years of overall experience in data architecture or data engineering roles. Strong, hands-on expertise in Snowflake, including design, development, and performance tuning. Solid experience with Azure Data Hub and Azure Data Services (Data Lake, Synapse, etc.). Understanding of cloud data integration techniques and ELT/ETL frameworks. Familiarity with data orchestration tools such as DBT, Airflow, or Azure Data Factory. Proven ability to handle structured, semi-structured, and unstructured data. Strong analytical, problem-solving, and communication skills. Nice to Have: Certifications in Snowflake and/or Microsoft Azure. Experience with CI/CD tools like GitHub for code versioning and deployment. Familiarity with real-time or near-real-time data ingestion. Why Join Diacto Technologies: Work with a cutting-edge tech stack and cloud-native architectures. Be part of a data-driven culture with opportunities for continuous learning. Collaborate with industry experts and build transformative data solutions.

Posted 1 week ago

Apply

3.0 - 8.0 years

2 Lacs

Hyderabad

Work from Office


Key responsibilities: Understand the program's service catalog and document the list of tasks which have to be performed for each. Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse. Implement best practices for data loading, ensuring optimal performance and data quality. Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes. Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements. Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements. Work on data modeling and schema design to optimize database structures for ETL processes. Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading. Troubleshoot and resolve issues related to data integration and performance bottlenecks. Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions. Provide guidance and mentorship to junior members of the data engineering team. Create and maintain comprehensive documentation for ETL processes, data models, and data flows. Ensure that documentation is kept up to date with any changes to data architecture or ETL workflows. Use Jira for task tracking and project management. Implement data quality checks and validation processes to ensure data integrity and reliability. Maintain detailed documentation of data engineering processes and solutions. Required Skills: Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS. Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi). Expertise in IDMC principles, including data governance, data quality, and metadata management. Solid understanding of data warehousing concepts and practices. Strong SQL skills and experience working with relational databases. Excellent problem-solving and analytical skills. Qualified candidates should APPLY NOW for immediate consideration! Please hit APPLY to provide the required information, and we will be back in touch as soon as possible. Thank you! ABOUT INNOVA SOLUTIONS: Founded in 1998 and headquartered in Atlanta, Georgia, Innova Solutions employs approximately 50,000 professionals worldwide and reports an annual revenue approaching $3 Billion. Through our global delivery centers across North America, Asia, and Europe, we deliver strategic technology and business transformation solutions to our clients, enabling them to operate as leaders within their fields. Recent Recognitions: One of the largest IT consulting staffing firms in the USA, recognized as #4 by Staffing Industry Analysts (SIA 2022). ClearlyRated Client Diamond Award Winner (2020). One of the largest certified MBE companies in the NMSDC network (2022). Advanced Tier Services partner with AWS and Gold with MS.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office


As a BigData Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Big Data Developer, Hadoop, Hive, Spark, PySpark, strong SQL. Ability to incorporate a variety of statistical and machine learning techniques. Basic understanding of Cloud (AWS, Azure, etc.). Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java. Preferred technical and professional experience: Basic understanding of or experience with predictive/prescriptive modeling skills. You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office


As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements. Preferred technical and professional experience: Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling. Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.

Posted 1 week ago

Apply

12.0 - 17.0 years

6 - 10 Lacs

Mumbai, Vikhroli

Work from Office


Oracle Analytics/ADW solution architect with good knowledge of Oracle Data Integrator and Oracle Analytics Cloud; hands-on experience is a must. Solution architect to design the warehouse on Oracle ADW and implement security. Ability to design, implement, and maintain data integration solutions (using Oracle Data Integrator, or ODI) with the skills to build and optimize data visualizations and reports in Oracle Analytics Cloud (OAC). This role involves working with diverse data sources (Oracle Fusion ERP/Procurement Cloud, SAP SuccessFactors, Salesforce and on-premises databases), transforming them, and delivering insights through dashboards and reports. Manage a team of junior developers to deliver warehouse needs. Good communication skills and experience working on a financial warehouse; good understanding of finance reporting needs. Qualifications: Any graduate with 12+ years of technology experience; 8+ years of experience working on Oracle Analytics and ADW on cloud. Additional Information: Certifications are good to have. Job Location

Posted 1 week ago

Apply