
4508 Informatica Jobs - Page 34

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Informatica IDMC Developer
Skills: Informatica Intelligent Data Management Cloud (IDMC/IICS), SQL, AWS/Azure/GCP, CI/CD pipelines, Snowflake/Redshift/BigQuery
Experience Required: 5-8 years
Job Location: Greater Noida only
Send your CV to Gaurav.2.Kumar@coforge.com

We at Coforge are hiring an Informatica IDMC Developer with the following skill set.

Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines using Informatica IDMC (IICS).
- Collaborate with data architects, analysts, and business stakeholders to gather and understand data requirements.
- Integrate data from diverse sources, including databases, APIs, and flat files.
- Optimize data workflows for performance, scalability, and reliability.
- Monitor and troubleshoot ETL jobs and resolve data quality issues.
- Implement data governance and security best practices.
- Maintain comprehensive documentation of data flows, transformations, and architecture.
- Participate in code reviews and contribute to continuous improvement initiatives.

Required Skills & Qualifications:
- Strong hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools.
- Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, and PostgreSQL.
- Experience working with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with data warehousing concepts and tools such as Snowflake, Redshift, or BigQuery.
- Excellent problem-solving abilities and strong communication skills.

Preferred Qualifications:
- Experience with CI/CD pipelines and version control systems.
- Knowledge of data modeling and metadata management.
- Certification in Informatica or cloud platforms is a plus.

Posted 1 week ago

Apply

7.0 - 11.0 years

30 - 35 Lacs

Bengaluru

Hybrid

Lead Data Engineer

We're Hiring: Lead Data Engineer | Bangalore | 7-11 Years Experience
Location: Bangalore (Hybrid)
Position Type: Permanent
Mode of Interview: Face to Face
Experience: 7-11 years
Skills: Snowflake, ETL tools (Informatica/BODS/DataStage), Scripting (Python/PowerShell/Shell), SQL, Data Warehousing

Candidates who are available for a face-to-face discussion can apply.
Interested? Send your updated CV to: radhika@theedgepartnership.com
Do connect with me on LinkedIn: https://www.linkedin.com/in/radhika-gm-00b20a254/

Skills and Qualifications (Functional and Technical)

Functional Skills:
- Team Player: Support peers, team, and department management.
- Communication: Excellent verbal, written, and interpersonal communication skills.
- Problem Solving: Excellent problem-solving skills, incident management, root-cause analysis, and proactive solutions to improve quality.
- Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders.
- Attention to Detail: Ensure accuracy and thoroughness in all tasks.

Technical/Business Skills:
- Data Engineering: Experience in designing and building data warehouses and data lakes. Good knowledge of data warehouse principles and concepts. Technical expertise working in large-scale data warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server. Experience with public cloud-based data platforms, especially Snowflake and AWS.
- Data Integration: Expertise in the design and development of complex data pipeline solutions using industry-leading ETL tools such as SAP BusinessObjects Data Services (BODS), Informatica Cloud Data Integration Services (IICS), or IBM DataStage. Experience with ELT tools such as DBT, Fivetran, and AWS Glue.
- SQL: Expert in SQL, with development experience in at least one scripting language (e.g., Python); adept at tracing and resolving data integrity issues.
- Strong knowledge of data architecture, data design patterns, modeling, and cloud data solutions (Snowflake, AWS Redshift, Google BigQuery).
- Data Modeling: Expertise in logical and physical data models using relational or dimensional modeling practices, and in high-volume ETL/ELT processes.
- Performance tuning of data pipelines and DB objects to deliver optimal performance.
- Experience with GitLab version control and CI/CD processes.
- Experience working in the financial industry is a plus.

Posted 1 week ago

Apply

7.0 - 10.0 years

10 - 12 Lacs

Hyderabad

Work from Office

We are seeking a SQL Developer with 7+ years of experience in enterprise-level database management and excellent communication skills. This role involves handling large datasets (5-10M records) with a focus on performance tuning, deadlock management, and schema design for high-volume data systems. Responsibilities include writing and optimizing SQL code, building ETL transformations using SSIS, and managing database transactions. The candidate will also work with MySQL, AWS data warehousing, and PostgreSQL, while ensuring scalability and efficiency for tera/petabyte-scale data. Prior experience with data warehouses, conceptualizing schemas, and working with multi-client SaaS solutions is essential. Expertise in tools like AWS QuickSight is preferred. Knowledge of data science and statistics is a plus.

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Kolkata

Work from Office

Diverse Lynx is looking for a Power BI Developer to join our dynamic team and embark on a rewarding career journey.
- Responsible for designing, developing, and implementing business intelligence solutions using Power BI, Microsoft's data visualization and reporting tool.
- Connecting to and integrating data from various sources, including databases, spreadsheets, and cloud services.
- Designing and creating data models, dashboards, reports, and other data visualizations.
- Enhancing existing Power BI solutions to meet evolving business requirements.
- Collaborating with stakeholders to understand their data needs and requirements.
- Building and maintaining data pipelines and ETL processes to ensure data quality and accuracy.
- Developing and implementing security and access control measures to protect sensitive data.
- Troubleshooting and resolving issues with Power BI solutions.
- Documenting and communicating solutions to stakeholders.
- Excellent communication, analytical, and problem-solving skills.

Posted 1 week ago

Apply

1.0 - 4.0 years

25 - 30 Lacs

Bengaluru

Work from Office

SKF has been around for more than a century, and today we are one of the world's largest global suppliers of bearings and supporting solutions for rotating equipment. Our products can be found literally everywhere in society, which means we are an important part of the everyday lives of people and companies around the world.

In September 2024, SKF announced the separation of its Automotive business, with the objective of building two world-leading businesses. The role you are applying for will be part of the automotive business. This means you will have the opportunity to help shape a new company aimed at meeting the needs of the transforming global automotive market. Would you like to join us in shaping the future of motion?

We are now looking for a Data Engineer, India - Automobile Business, to design, build, and maintain the data infrastructure and systems that support SKF VA's data needs. By leveraging skills in data modeling, data integration, data processing, data storage, data retrieval, and performance optimization, this role helps VA manage and utilize its data more effectively.

Key responsibilities (or what you can expect in the role):
- Build a VA data warehouse that is scalable, secure, and compliant using Snowflake technologies, including designing and developing Snowflake data models.
- Work with central data warehouses such as SDW, MDW, and OIDW to extract data and enrich it with VA-specific customer grouping, program details, etc.
- Data integration: Integrate data from ERPs, BPC, and other systems into Snowflake and SKF standard data warehouses, ensuring that data is accurate, complete, and consistent.
- Performance optimization: Optimize the performance of Snowflake queries and data-loading processes. This involves optimizing SQL queries, creating indexes, and tuning data-loading processes.
- Security and access management: Manage the security and access controls of the Snowflake environment, including configuring user roles and permissions, managing encryption keys, and monitoring access logs.
- Maintain existing databases and warehouse solutions, addressing support needs, enhancements, troubleshooting, etc.

Metrics:
- Technical metrics: data quality for the whole VA BU, data processing time, data storage capacity, and systems availability.
- Business metrics: data-driven decision making, data security and compliance, and cross-functional collaboration.

Competencies:
- Good understanding of data modeling concepts; familiar with Snowflake's data modeling tools and techniques.
- SQL: Expert in SQL; able to write complex SQL queries and optimize SQL performance in Snowflake.
- Pipeline Management & ETL: Able to design and manage data pipelines on Snowflake and Azure using ETL/ELT tools (e.g., DBT, Alteryx, Talend, Informatica).
- Good understanding of cloud computing concepts; familiar with the cloud infrastructure on which Snowflake operates.
- Good understanding of data warehousing concepts; familiar with Snowflake's data warehousing tools and techniques.
- Familiar with data governance and security concepts.
- Able to identify and troubleshoot issues with Snowflake and SKF's data infrastructure.
- Experience with Agile solution development.
- Good to have: knowledge of SKF ERP systems (XA, SAP, PIM, etc.) and of sales, supply chain, and manufacturing data.

Candidate Profile: Bachelor's degree in computer science, information technology, or a related field.

SKF is committed to creating a diverse environment, and we firmly believe that a diverse workforce is essential for our continued success. Therefore, we focus only on your experience, skills, and potential. Come as you are - just be yourself. #weareSKF

Some additional information: This position will be located in Bangalore.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 6 Lacs

Bengaluru

Work from Office

Req ID: 331269

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica Admin to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Informatica Cloud Data Governance & Catalog (CDGC):
- Glossary creation, metadata management, data classification
- Data lineage, policy definition, and domain configuration

Informatica Administration:
- User/role management, Secure Agent installation & maintenance
- Job monitoring, repository backups, system troubleshooting
- Environment configuration and version upgrades

Informatica Data Quality (IDQ):
- Data profiling, rule specification, transformations
- Scorecards, DQ metrics, accuracy, and completeness checks
- Exception handling and remediation

Additionally, it would be beneficial if the candidate has knowledge and experience in:

Scripting:
- Shell scripting (Bash/Korn) and Python scripting for automation
- Experience in building monitoring and housekeeping scripts

Cloud:
- Familiarity with Azure, AWS, or GCP
- Working with cloud-hosted Informatica services

DevOps & CI/CD:
- Azure DevOps: creating and managing pipelines, repos, and releases
- Integration with Informatica for automated deployments

Posted 1 week ago

Apply

2.0 - 8.0 years

6 - 7 Lacs

Noida

Work from Office

Req ID: 328482

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an ETL Informatica/IICS Developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

- Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal.
- ETL code development, unit testing, source code control, technical specification writing, and production implementation.
- Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux.
- Interact with BAs and other teams for requirement clarification, query resolution, testing, and sign-offs.
- Develop software that conforms to a design model within its constraints.
- Prepare documentation for design, source code, and unit test plans.
- Ability to work as part of a global development team.
- Should have good knowledge of the healthcare domain and data warehousing concepts.

Posted 1 week ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Noida

Work from Office

Req ID: 331538

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Database-Oracle Developer to join our team in Ban/Hyd/Chn/Gur/Noida, Karnataka (IN-KA), India (IN).

Job Title: Developer

Role Description:
The Developer is responsible for analyzing technical requirements and performing code changes, unit testing, integration testing facilitation, UAT, and production release. The developer also works as L3 support for fixing any production defects.

Your key responsibilities:
- Design, develop, test, deploy, maintain, and improve software.
- Manage individual project priorities, deadlines, and deliverables.
- Directly interact with a broad spectrum of stakeholders in multiple regions.
- Liaise with other technical areas, conduct technology research, and evaluate software required for maintaining the development environment.
- Perform research on various technologies and define architectural improvements; build prototypes or core features as needed.
- Help set the technical direction that will achieve the best user experience.

Your skills and experience/Qualifications:
- Overall 8+ years of experience.
- Technical skills required:
  - Oracle SQL, PL/SQL
  - Unix
  - Control-M
  - Informatica (good to have)
  - Python (good to have)
- Knowledge of DevOps configuration management tools (Chef, Puppet, Docker, TeamCity, Jenkins, uDeploy, Kubernetes, Maven, etc.).
- Experience with tooling across the SDLC: Sonar, Crucible, JIRA, HP ALM, HP UFT, Confluence, Nexus, Artifactory, TeamCity, Git/Bitbucket.
- Experience in the banking/wealth management domain.
- Experience working in an Agile environment.
- Relevant and demonstrable experience designing and documenting requirements in agile teams.
- An organized self-starter able to manage in a complex environment.
- A team player who continually collaborates and shares information.
- Continually looks to simplify and standardize solutions; actively seeks to reduce complexity and do the "right thing".
- Persistent in your drive for quality and excellence.
- Architecturally minded, with an ability to simplify complex activities.
- An influencer and problem solver.
- Fluent in English (written/verbal); additional language(s) are an advantage.
- Familiar with Excel, PowerPoint, Visio, etc.
- Ability to work in a matrix organization with stakeholders spread across geographies.
- Understanding of executing projects in agile (Scrum) methodology.
- Ability to identify and interpret stakeholders' needs and requirements.
- Self-motivated, with the flexibility to work autonomously as well as in virtual teams and matrix/global organizations, including an appreciation of different cultures when collaborating and sharing.
- Ability to influence and motivate other team members and stakeholders through strong dialogue, facilitation, and persuasiveness.

Preferred domain knowledge: wealth management and banking regulations.
Nice-to-have skills: banking, database, BigQuery, Google Cloud knowledge.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office

Description:
The Data & Analytics Team is seeking a Data Engineer with a hybrid skillset in data integration and application development. This role is crucial for designing, engineering, governing, and improving our entire Data Platform, which serves customers, partners, and employees through self-service access. You'll demonstrate expertise in data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles.

Requirements:
• 5+ years of experience with system/data integration, development, or implementation of enterprise and/or cloud software.
• Strong experience with Web APIs (RESTful and SOAP).
• Strong experience setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud).
• Demonstrated proficiency with Python.
• Strong experience with data wrangling and query authoring in SQL and NoSQL environments for both structured and unstructured data.
• Experience in a cloud-based computing environment, specifically GCP.
• Expertise in documenting business requirements and functional & technical documentation.
• Expertise in writing unit & functional test cases, test scripts, and run books.
• Expertise in incident management systems like Jira, ServiceNow, etc.
• Working knowledge of Agile software development methodology.
• Strong organizational and troubleshooting skills with attention to detail.
• Strong analytical ability, judgment, and problem-analysis techniques.
• Excellent interpersonal skills with the ability to work effectively in a cross-functional team.

Job Responsibilities:
• Lead system/data integration, development, or implementation efforts for enterprise and/or cloud software.
• Design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes.
• Perform extensive data wrangling and author complex queries in both SQL and NoSQL environments for structured and unstructured data.
• Develop and integrate applications, leveraging strong proficiency in Python and Web APIs (RESTful and SOAP).
• Provide operational support for the data platform and applications, including incident management.
• Create comprehensive business requirement, functional, and technical documentation.
• Develop unit & functional test cases, test scripts, and run books to ensure solution quality.
• Manage incidents effectively using systems like Jira, ServiceNow, etc.
• Prepare change management packages and implementation plans for migrations across different environments.
• Actively participate in enterprise risk management processes.
• Work within an Agile software development methodology, contributing to team success.
• Collaborate effectively within cross-functional teams.

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication-skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with your colleagues over a game, plus discounts at popular stores and restaurants!

Posted 1 week ago

Apply

8.0 - 13.0 years

6 - 7 Lacs

Mohali

Work from Office

ETL & DWH Engineer/Data Analyst Jobs | Grazitti

Position: Lead/Associate/Sr. ETL & DWH Engineer / Data Analyst

Job Description:
We are looking for an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of experience in data integration, warehousing, and analytics. The ideal candidate will have deep technical expertise in ETL tools, strong data modeling knowledge, and the ability to lead complex data engineering projects from design to deployment.

Key Skills:
- 4+ years of hands-on experience with ETL tools like SSIS, Informatica, DataStage, or Talend.
- Proficient in relational databases such as SQL Server and MySQL.
- Strong understanding of Data Mart/EDW methodologies.
- Experience in designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Knowledge of reporting and analytics tools like Tableau and Power BI.
- Scripting and programming proficiency using Python.
- Familiarity with cloud platforms such as AWS or Azure.
- Ability to lead recruitment, estimation, and project execution.
- Exposure to sales and marketing data domains.
- Experience with cross-functional and geographically distributed teams.
- Ability to translate complex data problems into actionable insights.
- Strong communication and client management skills.
- Self-starter with a collaborative attitude and problem-solving mindset.

Roles & Responsibilities:
- Deliver high-level and low-level design documents for middleware and ETL architecture.
- Design and review data integration components, ensuring adherence to standards and best practices.
- Own delivery quality and timeliness across one or more complex projects.
- Provide functional and non-functional assessments for global data implementations.
- Offer technical problem-solving guidance and support to junior team members.
- Drive QA for deliverables and validate progress against project timelines.
- Lead issue escalation, status tracking, and continuous improvement initiatives.
- Support planning, estimation, and resourcing across data engineering efforts.

Address: Grazitti Interactive LLP (SEZ Unit), 2nd Floor, Quark City SEZ, A-40A, Phase VIII Extn., Mohali, SAS Nagar, Punjab, 160059, India

Posted 1 week ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Chennai

Work from Office

Req ID: 327893

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a SQL, ETL Testing, Python - Tester to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

- Able to review defects on a regular basis and perform root-cause analysis.
- Working experience integrating automation code with a DevOps pipeline for CI/CD, and usage of tools like Jenkins, uDeploy, Concourse, etc.
- Experience working in an Agile software development environment.
- Implementing QA best practices acquired through prior experience for QA process improvement.

Skills that are good to have for this role:
- Experience in reporting tools like OBIEE, Power BI, Tableau, etc. is an advantage.
- Experience with a semantic layer platform (AtScale).
- Experience in data test automation tools like iCEDQ.

Minimum experience on key skills: 6-9 years.

General expectations:
1) Must have good communication skills.
2) Must be ready to work in a 10:30 AM to 8:30 PM shift.
3) Flexible to work at the client location: Ramanujam IT Park, Taramani, Chennai.
4) Must be ready to work from the office in a hybrid work environment; full remote work is not an option.
5) Expect a full return to office in 2025.

Pre-requisites before submitting profiles:
1) Must have genuine and digitally signed Form 16 for ALL employments.
2) All employment history/details must be present in UAN/PPF statements.
3) Candidates must be screened via video to ensure they are genuine and have a proper work setup.
4) Candidates must have real work experience in the mandatory skills mentioned in the JD.
5) Profiles must list as employers the companies the candidates are on the payroll of, not the client names.
6) As these are competitive positions and the client will not wait 60 days and carry the risk of drop-outs, candidates must have a 0-to-3-week notice period.
7) Candidates must be screened for any gaps after education and during employment to verify the genuineness of the reasons.

Posted 1 week ago

Apply

6.0 - 9.0 years

4 - 5 Lacs

Bengaluru

Work from Office

Req ID: 329103

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Oracle, PL/SQL - Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Mandatory: Oracle, PL/SQL

Good to have:
- Some knowledge of ETL, Informatica, Control-M, Autosys
- Some knowledge of Azure and DevOps

Minimum experience on key skills: 6-9 years.

General expectations:
1) Must have good communication skills.
2) Must be ready to work in a 10:30 AM to 8:30 PM shift.
3) Flexible to work at the client location: GV, Manyata or EGL, Bangalore.
4) Must be ready to work from the office in a hybrid work environment; full remote work is not an option.
5) Expect a full return to office in 2025.

Pre-requisites before submitting profiles:
1) Must have genuine and digitally signed Form 16 for ALL employments.
2) All employment history/details must be present in UAN/PPF statements.
3) Candidates must be screened via video to ensure they are genuine and have a proper work setup.
4) Candidates must have real work experience in the mandatory skills mentioned in the JD.
5) Profiles must list as employers the companies the candidates are on the payroll of, not the client names.
6) As these are competitive positions and the client will not wait 60 days and carry the risk of drop-outs, candidates must have a 0-to-3-week notice period.
7) Candidates must be screened for any gaps after education and during employment to verify the genuineness of the reasons.

Posted 1 week ago

Apply

6.0 - 9.0 years

6 - 10 Lacs

Chennai

Work from Office

Req ID: 329233

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an ETL, Informatica - Production Support role to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Minimum experience on key skills: 6-9 years.

General expectations:
1) Must have good communication skills.
2) Must be ready to work in a 10:30 AM to 8:30 PM shift.
3) Flexible to work at the client location: Ramanujam IT Park, Taramani, Chennai.
4) Must be ready to work from the office in a hybrid work environment; full remote work is not an option.
5) Expect a full return to office in 2025.

Pre-requisites before submitting profiles:
1) Must have genuine and digitally signed Form 16 for ALL employments.
2) All employment history/details must be present in UAN/PPF statements.
3) Candidates must be screened via video to ensure they are genuine and have a proper work setup.
4) Candidates must have real work experience in the mandatory skills mentioned in the JD.
5) Profiles must list as employers the companies the candidates are on the payroll of, not the client names.
6) As these are competitive positions and the client will not wait 60 days and carry the risk of drop-outs, candidates must have a 0-to-3-week notice period.
7) Candidates must be screened for any gaps after education and during employment to verify the genuineness of the reasons.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a highly skilled and experienced OBIEE Consultant with over 5 years of expertise in OBIEE reporting and RPD development, including at least 2 years working on the BI 12c version. The role requires strong SQL skills to write and debug scripts effectively. The ideal candidate will have experience managing large-scale projects, with a solid understanding of project lifecycles and OBIEE security configurations. Proficiency in OBIEE reporting, Informatica, and DAC is essential. The consultant should be adept at accessing Informatica tools for log analysis and checking schedules in DAC. The position demands strong problem-solving skills and the ability to work collaboratively within a fast-paced IT environment. Immediate joiners are preferred for this remote opportunity. Location: Remote

Posted 1 week ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a highly skilled and experienced OBIEE Consultant with over 5 years of expertise in OBIEE Reporting and RPD development, including at least 2 years working on BI 12c version. The role requires strong SQL skills to write and debug scripts effectively. The ideal candidate will have experience managing large-scale projects, with a solid understanding of project lifecycles and OBIEE security configurations. Proficiency in OBIEE reporting, Informatica, and DAC is essential. The consultant should be adept at accessing Informatica tools for log analysis and checking schedules in DAC. The position demands strong problem-solving skills and the ability to work collaboratively within a fast-paced IT environment. Immediate joiners are preferred for this remote opportunity. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote

Posted 1 week ago

Apply

4.0 - 7.0 years

15 - 17 Lacs

Hyderabad, Bengaluru

Work from Office

Design, develop, and implement data solutions using AWS data stack components such as Glue and Redshift. Write and optimize advanced SQL queries for data extraction, transformation, and analysis. Develop data processing workflows and ETL processes using Python and PySpark.

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Responsibilities:
Design, develop, and maintain relational and non-relational database systems.
Define analytical architecture, including data lakes, lakehouse, data mesh, and medallion patterns.
Understand and analyze business requirements and translate them into analytical or relational database designs, including designs for NoSQL data stores.
Optimize SQL queries, stored procedures, and database performance.
Create and maintain ETL processes for data integration from various sources.
Work closely with application teams to design database schemas and support integration.
Monitor, troubleshoot, and resolve database issues related to performance, storage, and replication.
Implement data security, backup, recovery, and disaster recovery procedures.
Ensure data integrity and enforce best practices in database development.
Participate in code reviews and mentor junior developers.
Collaborate with business and analytics teams on reporting and data warehousing needs.
Must Have:
Strong expertise in SQL and PL/SQL.
Hands-on experience with at least one of: Snowflake, ADLS, BigQuery, SQL Server, Oracle, PostgreSQL, or MySQL.
Experience with at least one NoSQL database: MongoDB, Cassandra, or Redis.
Experience in performance tuning and optimization.
Database design and modeling tools: Erwin, dbForge, or similar.
Design and development experience on cloud services (AWS, GCP, or Azure).
Understanding of indexing, partitioning, replication, and sharding.
Experience working in Agile/Scrum environments.
Knowledge of star schema and snowflake schema.
Good to Have:
ETL development tools: SSIS, Informatica, Talend, or equivalent.
Knowledge of CI/CD pipelines and DevOps practices for database deployments.
Cloud certification.
Experience with Big Data technologies (Hadoop, Spark).
Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 8+ years of relevant experience in database design and development.
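As a small illustration of the "performance tuning and indexing" skills this role asks for, the sketch below shows how adding an index changes a query plan. It uses SQLite so it is runnable anywhere; the table and column names are invented for the example, not taken from the posting.

```python
import sqlite3

# Hypothetical orders table used only to demonstrate index-driven tuning.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index the plan is a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Indexing the filter column is a common first tuning step.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before)  # plan detail mentions a SCAN
print(plan_after)   # plan detail mentions SEARCH ... USING INDEX
```

The same idea carries over to the RDBMS platforms the posting lists, though each engine has its own EXPLAIN syntax and plan format.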

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Snowflake + DBT
Mandatory Skills: The candidate should have 5-10 years of experience in Snowflake (cloud data warehousing), ELT with DBT (Data Build Tool), and Informatica design and development.
Primary technical skills:
Snowflake (Cloud Data Warehouse):
Good understanding of the Snowflake ecosystem.
Good experience with data modeling and dimensional modeling techniques; able to drive technical discussions with IT, business, architects, and data modelers.
Able to guide the team and provide technical solutions.
Able to prepare technical solutions and architectures as part of project requirements.
Virtual Warehouse (Compute): good understanding of warehouse creation and management.
Data Modeling & Storage: strong knowledge of LDM/PDM design.
Data loading/unloading and data sharing: good knowledge.
SnowSQL (CLI): expertise and excellent understanding of Snowflake internals and integration.
Strong hands-on experience with SnowSQL queries, stored procedures, and performance tuning techniques.
Good knowledge of preparing SnowSQL scripts for data validation and audits.
Snowpipe: good knowledge of Snowpipe implementation.
Expertise and excellent understanding of S3 internal data copy/movement.
Good knowledge of security, and of reader and consumer accounts.
Good knowledge of, and hands-on experience with, query performance tuning techniques.
DBT (Data Build Tool), hands-on experience with all modules:
In-depth hands-on experience with dbt CLI and dbt Cloud, plus GitHub version control and code repository knowledge.
Design and develop dbt models and build data pipelines that perform complex ELT processes with dbt scripting.
Good knowledge of, and hands-on experience with, dbt concepts such as models, materializations, sources, seeds, snapshots, packages, hooks, exposures, and analyses; able to write complex SQL queries.
Informatica (10.x and above, all modules):
ETL design, development, and troubleshooting/debugging experience.
Repository Manager.
Mapping Designer: strong hands-on experience creating mappings and mapplets using all transformations.
Workflow Designer: good understanding of session and task implementation and job scheduling.
SQL knowledge:
Advanced SQL knowledge and hands-on experience writing complex queries with analytical functions.
Strong knowledge of stored procedures.
Troubleshooting, problem solving, and performance tuning of SQL queries accessing the data warehouse.
Other:
Ability to participate in RFI/RFP discussions and provide technical solutions and effort estimates.
Excellent presentation and communication skills, both written and verbal.
Ability to problem-solve and architect in an environment with unclear requirements.
Good work attitude and ability to work in an onshore/offshore model.
Nice to have:
Understanding of Cisco processes and data, such as IB data, service contract management, auto renewals, services domain, booking, quotes, digital case services, quote to cash, invoicing, licensing and provisioning, sales compensation, and software and recurring revenue.
SnowPro Core certification (Snowflake).
Experience or knowledge of reporting technologies (e.g., Tableau, Power BI) is an added advantage.
Hi-tech manufacturing domain experience is good to have.
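The "complex queries with analytical functions" requirement above can be illustrated with a small window-function example. It runs here against SQLite (which supports the same `ROW_NUMBER() OVER (PARTITION BY ...)` syntax as Snowflake); the table and data are invented for illustration.

```python
import sqlite3

# Hypothetical sales table, used only to demonstrate an analytical function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, rep TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", "Asha", 500.0), ("East", "Ravi", 700.0),
    ("West", "Meena", 300.0), ("West", "Karan", 900.0),
])

# Top rep per region: rank rows within each region, then keep rank 1.
rows = conn.execute("""
    SELECT region, rep, amount FROM (
        SELECT region, rep, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    ) WHERE rn = 1
    ORDER BY region
""").fetchall()
print(rows)  # [('East', 'Ravi', 700.0), ('West', 'Karan', 900.0)]
```

In a dbt project, a query like this would typically live in a model file and be materialized as a view or table.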

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a highly skilled and experienced OBIEE Consultant with over 5 years of expertise in OBIEE Reporting and RPD development, including at least 2 years working on the BI 12c version. The role requires strong SQL skills to write and debug scripts effectively. The ideal candidate will have experience managing large-scale projects, with a solid understanding of project lifecycles and OBIEE security configurations. Proficiency in OBIEE reporting, Informatica, and DAC is essential. The consultant should be adept at accessing Informatica tools for log analysis and checking schedules in DAC. The position demands strong problem-solving skills and the ability to work collaboratively within a fast-paced IT environment. Immediate joiners are preferred for this remote opportunity. Location: Remote

Posted 1 week ago

Apply

5.0 - 9.0 years

8 - 15 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a highly skilled and experienced OBIEE Consultant with over 5 years of expertise in OBIEE Reporting and RPD development, including at least 2 years working on the BI 12c version. The role requires strong SQL skills to write and debug scripts effectively. The ideal candidate will have experience managing large-scale projects, with a solid understanding of project lifecycles and OBIEE security configurations. Proficiency in OBIEE reporting, Informatica, and DAC is essential. The consultant should be adept at accessing Informatica tools for log analysis and checking schedules in DAC. The position demands strong problem-solving skills and the ability to work collaboratively within a fast-paced IT environment. Immediate joiners are preferred for this remote opportunity. Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata

Posted 1 week ago

Apply

7.0 - 8.0 years

12 - 15 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are looking for an experienced Informatica MDM Specialist with 7-8 years of expertise in Informatica Intelligent Cloud Services (IICS), including Cloud Application Integration (CAI) and Cloud Data Integration (CDI). The role involves developing business-critical Informatica entities, managing IDMC administration and architecture, and implementing application integration components. The ideal candidate will have hands-on experience in IDMC, CDI, CAI, and data integration best practices. Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Hyderabad

Work from Office

Details of the role: 8 to 10 years of experience as an Informatica Admin (IICS).
Key responsibilities:
Understand the program's service catalog and document the list of tasks to be performed for each item.
Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
Implement best practices for data loading, ensuring optimal performance and data quality.
Use your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
Collaborate with data architects to design and implement scalable, efficient data architectures that support business intelligence and analytics requirements.
Work on data modeling and schema design to optimize database structures for ETL processes.
Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
Troubleshoot and resolve issues related to data integration and performance bottlenecks.
Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
Provide guidance and mentorship to junior members of the data engineering team.
Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and keep it up to date with any changes to the data architecture or ETL workflows.
Use Jira for task tracking and project management.
Implement data quality checks and validation processes to ensure data integrity and reliability.
Required skills:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience as a senior ETL data engineer, with a focus on IDMC/IICS.
Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
Expertise in IDMC principles, including data governance, data quality, and metadata management.
Solid understanding of data warehousing concepts and practices.
Strong SQL skills and experience working with relational databases.
Excellent problem-solving and analytical skills.
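The "data quality checks and validation processes" the role mentions can take many forms; one minimal sketch is a null-ratio check over a batch of rows before loading. The function name, fields, and threshold below are hypothetical, not from the posting.

```python
def run_quality_checks(records, required_fields, max_null_ratio=0.05):
    """Return a dict of failed checks for a batch of row dicts.

    A field fails when the share of null/empty values exceeds
    max_null_ratio (an illustrative threshold).
    """
    failures = {}
    total = len(records)
    for field in required_fields:
        nulls = sum(1 for r in records if r.get(field) in (None, ""))
        if total and nulls / total > max_null_ratio:
            failures[field] = f"{nulls}/{total} null or empty"
    return failures

# Tiny invented batch: one of three emails is empty, so 'email' fails.
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "c@example.com"},
]
print(run_quality_checks(batch, ["id", "email"]))
# {'email': '1/3 null or empty'}
```

In an IDMC setting, checks like this would typically be expressed as data quality rules or pre-load validation mappings rather than hand-written Python, but the underlying logic is the same.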

Posted 1 week ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Mumbai

Work from Office

We are looking for a highly skilled and experienced Senior Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical and problem-solving skills.
Roles and Responsibilities:
Collaborate with cross-functional teams to identify and prioritize project requirements.
Develop and implement process improvements to increase efficiency and productivity.
Analyze complex data sets to inform business decisions and drive growth.
Provide expert-level support for data analysis and reporting.
Identify and mitigate risks associated with data quality and integrity.
Develop and maintain technical documentation for processes and procedures.
Job Requirements:
Strong understanding of IT Services & Consulting industry trends and technologies.
Excellent analytical and problem-solving skills, with attention to detail.
Ability to work collaboratively in a fast-paced environment.
Strong communication and interpersonal skills, with the ability to present complex ideas simply.
Experience with data analysis and reporting tools, such as Excel or SQL.
Ability to adapt to changing priorities and deadlines in a dynamic environment.
About the Company: eClerx Services Ltd. is a leading provider of IT Services & Consulting solutions, committed to delivering exceptional results and exceeding client expectations.

Posted 1 week ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Chandigarh

Work from Office

We are looking for a highly skilled and experienced Senior Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical and problem-solving skills.
Roles and Responsibilities:
Collaborate with cross-functional teams to identify and prioritize project requirements.
Develop and maintain complex data models and reports using various tools and technologies.
Analyze large datasets to extract insights and trends, and provide recommendations to stakeholders.
Design and implement process improvements to increase efficiency and productivity.
Provide expert-level support for data analysis and reporting, ensuring high-quality results.
Stay up to date with industry trends and emerging technologies to continuously improve skills and knowledge.
Job Requirements:
Strong understanding of IT Services & Consulting principles and practices.
Excellent analytical and problem-solving skills, with the ability to think critically and creatively.
Proficiency in data modeling and reporting tools, with experience working with large datasets.
Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
Ability to prioritize tasks and manage multiple projects simultaneously, demonstrating strong time management skills.
Experience with process improvement initiatives, focusing on increasing efficiency and productivity.

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Chennai

Work from Office

Oracle Master Data Management
Role Purpose: The Device Launch Readiness Team in the client's Xfinity Mobile organization, under Technology & Product - Wireless Technologies & New Business, manages master data for mobile devices, mobile device accessories, packaging, and Xfinity Home products. The person in this position will help with SKU lifecycle management.
Core Responsibilities:
Manage device and accessories master data.
Write complex SQL to query large data platforms for analysis.
Perform queries and create new datasets.
Analyze and package data to create or update records.
Clean and parse data, and make it available to groups of users.
Deep dive into data to understand business drivers and problems.
Update Jira for completed activities and report to users and managers.
Support prod and stage migrations.
General Skillsets:
3-5 years of experience in RDBMS.
Working experience in the mobile device/service domain.
Knowledge of mobile business acronyms.
Advanced Excel skills, including macros, VLOOKUP, and formula accuracy.
Other Expectations:
Understand our Comcast Operating Principles; make them the guidelines for how you do your job.
Know your stuff: be enthusiastic learners, users, and advocates of our game-changing technology, products, and services, especially our digital tools and experiences.
Win as a team: make big things happen by working together and being open to new ideas. Drive results and growth.
Mandatory Skills: Oracle Master Data Management (MDM). Experience: 3-5 years.
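The VLOOKUP-style matching this role describes can be sketched in plain Python as a keyed left join: every master record keeps its row, and unmatched keys fall back to a placeholder, mirroring VLOOKUP's behaviour for missing lookups. The SKU data below is invented for illustration.

```python
# Hypothetical device master data and a lookup table of launch statuses.
devices = [
    {"sku": "XM-100", "name": "Phone A"},
    {"sku": "XM-200", "name": "Phone B"},
]
launch_status = {"XM-100": "Launched", "XM-300": "Planned"}

# Left-join on SKU: unmatched devices get "#N/A", like an Excel VLOOKUP.
merged = [dict(d, status=launch_status.get(d["sku"], "#N/A")) for d in devices]
print(merged)
```

In SQL the equivalent would be a LEFT JOIN from the device table to the status table with COALESCE over the missing side.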

Posted 1 week ago

Apply