6.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Looking for an ETL/DB tester with 5+ years of experience. Should have strong SQL skills and hands-on coding knowledge in a scripting language. Should be able to design and write SQL queries for data validation, and to verify and test ETL processes. Understanding of data warehousing concepts is a plus. Good communication skills and a testing mindset.
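As a hedged illustration of the data-validation work this role describes: a minimal Python sketch comparing a staging table against its warehouse target. The table names and the two checks are illustrative assumptions, not from the posting.

```python
import sqlite3

# Hypothetical illustration: compare row counts and a column total
# between a staging (source) table and a warehouse (target) table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5);
""")

checks = {
    "row_count":  "SELECT (SELECT COUNT(*)   FROM stg_orders) - (SELECT COUNT(*)   FROM dw_orders)",
    "amount_sum": "SELECT (SELECT SUM(amount) FROM stg_orders) - (SELECT SUM(amount) FROM dw_orders)",
}
for name, sql in checks.items():
    diff = conn.execute(sql).fetchone()[0]
    status = "PASS" if diff == 0 else f"FAIL (diff={diff})"
    print(f"{name}: {status}")
```

In practice the same pattern runs against the real source and target databases, with one query per mapped column or business rule.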
Posted 1 day ago
4.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Hybrid
- Lead the technical design and architecture of Informatica Data Management and Data Integration platforms, ensuring alignment with customer requirements, industry best practices, and project objectives.
- Conduct reviews, ensure adherence to coding standards and best practices, and drive continuous improvement in code quality and performance.
- Expertise in fine-tuning slow-performing Data Integration jobs and workflows.
- Development and support of data ingestions using various integration patterns, including (but not limited to) Informatica IICS, SFTP, and REST API based frameworks.
- Support of the Data Governance capability on Informatica Cloud Data Governance and Catalog.
- Monitor and optimise platform performance and resource usage.
- Implement CI/CD pipelines for integration workflows.
- Strong communication and presentation skills.
Posted 1 day ago
6.0 - 11.0 years
5 - 9 Lacs
Hyderabad, Bengaluru
Work from Office
Skill: Snowflake Developer with dbt (Data Build Tool), ADF, and Python.
Job Description: We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting is an added advantage.
Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes for data warehousing projects.
- Work extensively with Snowflake, ensuring efficient data modeling and query optimization (see the sketch after this list).
- Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration.
- Implement data transformations, testing, and documentation using dbt.
- Collaborate with cross-functional teams to ensure data accuracy, consistency, and security.
- Troubleshoot data-related issues.
- (Optional) Utilize Python for scripting, automation, and data processing tasks.
Required Skills & Qualifications:
- Experience in data warehousing with a strong understanding of best practices.
- Hands-on experience with Snowflake (data modeling, query optimization).
- Proficiency in Azure Data Factory (ADF) for data pipeline development.
- Strong working knowledge of dbt (Data Build Tool) for data transformations.
- (Optional) Experience in Python scripting for automation and data manipulation.
- Good understanding of SQL and query optimization techniques.
- Experience in cloud-based data solutions (Azure).
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Experience with CI/CD pipelines for data engineering.
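A minimal sketch of running a Snowflake data check from Python, roughly the kind of validation a dbt uniqueness test also expresses. The connection parameters and the fact_orders table are placeholders, and the snowflake-connector-python package is assumed.

```python
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="my_password",    # placeholder
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Uniqueness check on a hypothetical fact table: no duplicate keys
    cur.execute("""
        SELECT order_id, COUNT(*) AS n
        FROM fact_orders
        GROUP BY order_id
        HAVING COUNT(*) > 1
    """)
    dupes = cur.fetchall()
    print("duplicate keys:", len(dupes))
finally:
    conn.close()
```

With dbt, the equivalent check is usually declared as a `unique` test on the model's key column rather than hand-written.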
Posted 1 day ago
4.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
AWS Data Lake Lead (India; 8 to 10 yrs exp):
- Lead the technical design and architecture of AWS Data Lake and its related services, ensuring alignment with customer requirements, industry best practices, and project objectives.
- Conduct reviews, ensure adherence to coding standards and best practices, and drive continuous improvement in code quality and performance.
- Provide technical support, troubleshoot problems, and provide timely resolution of incidents, service requests, and minor enhancements as required for AWS platforms and services.
- Add, update, or delete datasets in the AWS Data Lake.
- Monitor storage usage and handle capacity planning (see the sketch after this list).
- Optimize schemas in Snowflake and the AWS Data Lake for query performance.
- Strong communication and presentation skills.
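A minimal sketch of the storage-usage monitoring mentioned above, using the standard boto3 S3 paginator; the bucket name and prefix are illustrative assumptions.

```python
import boto3  # pip install boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Sum object count and size under a hypothetical raw-zone prefix
total_bytes, total_objects = 0, 0
for page in paginator.paginate(Bucket="my-datalake-bucket", Prefix="raw/"):
    for obj in page.get("Contents", []):
        total_objects += 1
        total_bytes += obj["Size"]

print(f"{total_objects} objects, {total_bytes / 1024**3:.2f} GiB under raw/")
```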
Posted 1 day ago
5.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Hybrid
5+ years of strong experience in Informatica Cloud. Experience in Informatica Cloud Designer and Informatica Cloud Portal. Experience in transformations, Mapping Configuration Tasks, Task Flows, and Parameterized Templates. Experience in Informatica PowerCenter and Designer. Good knowledge of Oracle, SQL, and PL/SQL. Should have experience scheduling Informatica Cloud ETL mappings. Experience integrating Informatica Cloud with sources such as SFDC and SAP. Experience in Business Objects and other business intelligence platforms is an advantage. Should be good at understanding functional and business requirements.
Posted 1 day ago
5.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Looking for 4 to 6 years of extensive experience in Informatica PowerCenter. Strong hands-on experience in advanced SQL and RDBMS concepts. Good hands-on experience in Unix and shell scripting. Good understanding of data warehousing concepts. Excellent problem-solving, communication, and analytical skills.
Posted 1 day ago
6.0 - 10.0 years
20 - 27 Lacs
Indore, Gurugram, Jaipur
Work from Office
Data pipelines. Hands-on with log analytics, user engagement metrics, and product performance metrics. Ability to identify patterns, trends, and anomalies in log data to generate actionable insights for product enhancements and feature optimization.
Posted 1 day ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Immediate openings for CPQ Developer (PAN India, contract). Experience: 5+ years. Skills: CPQ Developer. Location: PAN India. Notice period: immediate. Employment type: contract. SAP Callidus CPQ developer/consultant with a minimum of 5-7 years of experience in CPQ configurations, customizations, and development; 5-7 years of technology consulting project implementation experience; experience creating UI and configurations with CPQ scripting, and working knowledge of IronPython.
Posted 1 day ago
10.0 - 15.0 years
4 - 8 Lacs
Chandigarh, Dadra & Nagar Haveli, Daman
Work from Office
Strong change and project management skills. Stakeholder management, communications, and reporting. Data management, data governance, and data quality management domain knowledge, with 10+ years of experience. Subject matter expertise required in more than one of the following areas: data management, data governance, data quality measurement and reporting, and data quality issues management. Liaise with IWPB markets and stakeholders to coordinate delivery of organizational DQ governance objectives, and provide consultative support to facilitate progress. Conduct analysis of the IWPB DQ portfolio to identify thematic trends and insights, to effectively advise stakeholders in managing their respective domains. Proficiency in MI reporting and visualization is strongly preferred. Proficiency in change and project management is strongly preferred. Ability to prepare programme update materials and present them to senior stakeholders, with prompt response to any issue escalations. Strong communication and stakeholder management skills: should be able to work effectively and maintain strong working relationships as an integral part of a larger team. Location - Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Hyderabad, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim
Posted 1 day ago
5.0 years
0 Lacs
India
On-site
Company Description: Seosaph-infotech is an emerging company in the customized software development sector. We offer various tech solutions to businesses across different industries, including finance, healthcare, and e-commerce. In just two years, Seosaph has expanded rapidly by delivering exceptional solutions and services. Our vision is to help clients become industry leaders by providing optimal technology solutions and trusted services.
Job Summary: We are seeking an experienced Informatica professional to lead and execute the migration of data integration workflows and assets from Informatica PowerCenter (on-premises) to Informatica Intelligent Cloud Services (IICS) or other cloud-based data integration platforms. The ideal candidate will possess strong hands-on experience in data integration, ETL/ELT, and cloud migration projects, with a deep understanding of Informatica tools, cloud platforms (AWS, Azure, GCP), and enterprise data systems.
Key Responsibilities:
- Assess and analyze existing Informatica PowerCenter mappings, workflows, and dependencies.
- Design and develop migration strategies to transition ETL jobs from on-premises to the cloud (IICS or other platforms).
- Re-engineer or refactor legacy ETL jobs for optimization in cloud environments.
- Execute data and workflow migration, ensuring data accuracy, security, and integrity throughout the process.
- Collaborate with architects, cloud engineers, and stakeholders to define requirements and ensure alignment with business goals.
- Conduct system and integration testing, resolve data discrepancies, and implement performance tuning.
- Document all processes, mappings, workflows, and data lineage before and after migration.
- Provide post-migration support, troubleshoot issues, and optimize cloud ETL jobs.
- Ensure compliance with data governance, security policies, and best practices.
Required Skills & Experience:
- 5+ years of experience with Informatica PowerCenter and ETL development.
- Hands-on experience with Informatica Intelligent Cloud Services (IICS) or Cloud Data Integration tools.
- Proven expertise in cloud platforms (AWS, Azure, or GCP) and related services (e.g., S3, Azure Blob, BigQuery, Snowflake).
- Strong understanding of data warehousing concepts, data modeling, and relational databases.
- Experience with performance tuning, parallel processing, and scheduling tools (e.g., Autosys, Control-M).
- Proficiency in SQL and scripting (Shell, Python, etc.).
- Experience with version control, CI/CD, and DevOps pipelines (nice to have).
- Strong problem-solving skills and ability to work independently or in a team environment.
Preferred Qualifications:
- Informatica certifications (PowerCenter, IICS, or Cloud Data Integration).
- Experience with other ETL tools or cloud data migration frameworks.
- Background in finance, healthcare, retail, or other data-intensive industries is a plus.
- Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA).
Posted 1 day ago
6.0 - 11.0 years
8 - 15 Lacs
Hyderabad
Work from Office
Skills: Spanish speaking. Experience: 6+ years. Location: PAN India. Job type: contract to hire. Work model: hybrid. Must know the Spanish language (spoken, written, and read).
An Engagement Manager with a focus on Master Data Management (MDM) typically has a multifaceted role that combines client relationship management, project leadership, and data governance. Here is a brief overview of the key responsibilities and skills required for this position.
Responsibilities:
- Client relationship management: develop and maintain strong relationships with key client stakeholders; act as the primary point of contact for clients, ensuring their needs are met and expectations managed.
- Project leadership: lead project teams to deliver MDM solutions, ensuring projects are completed on time, within budget, and to the client's satisfaction; define project scope, requirements, and deliverables in collaboration with internal teams; establish and manage project timelines and plans, ensuring milestones are met.
- Data governance: oversee the implementation of MDM strategies and solutions; ensure data quality, consistency, and compliance with relevant standards and regulations; manage data integration, migration, and maintenance processes.
- Risk management: identify potential project risks and develop mitigation strategies; resolve any issues or problems faced by clients, maintaining trust and satisfaction.
- Reporting and analysis: prepare and present reports on project performance and data metrics to stakeholders; analyze engagement metrics and provide insights for continuous improvement.
Skills and Qualifications:
- Technical skills: proficiency in MDM tools and technologies; strong understanding of data management principles and practices.
- Project management: proven experience in leading and managing projects; excellent organizational and multitasking abilities.
- Communication: strong verbal and written communication skills; ability to effectively communicate complex technical concepts to non-technical stakeholders.
- Problem-solving: sharp business acumen and problem-solving aptitude.
Posted 1 day ago
6.0 - 9.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Must have: T-SQL, SSIS, SSRS or Informatica PowerCenter, and data warehousing. Good to have: Snowflake.
- Good knowledge of T-SQL, including the ability to write stored procedures, views, functions, etc. (see the sketch after this list).
- Good experience in designing, developing, unit testing, and implementing data integration solutions using ETL in SSIS and the SSRS reporting platform.
- Experience with data warehousing concepts and enterprise data modeling techniques.
- Good knowledge of relational and dimensional database structures, theories, principles, and best practices.
- Conduct thorough analysis of existing MSBI (Microsoft Business Intelligence) legacy applications and Informatica PowerCenter.
- Identify and document the functionalities, workflows, and dependencies of legacy systems.
- Create detailed mapping specifications for data integration and transformation processes.
- Collaborate with business stakeholders/architects and data modelers to understand their needs and translate them into technical documentation.
- Ensure accurate documentation of data sources, targets, and transformation rules.
- Perform data validation, cleansing, and analysis to ensure data accuracy and integrity.
- Update design documents after successful code changes and testing.
- Provide deployment support.
- Possess good knowledge of Agile and Waterfall methodologies.
Requirements:
- Bachelor's degree in computer science, engineering, or a related field.
- Highly skilled at handling complex technical situations, with exceptional verbal and written communication skills.
- 5+ years of experience with and understanding of data lifecycle, governance, and migration processes.
- 5+ years of experience with SSIS, SSRS (or Informatica PowerCenter) and MS SQL Server, T-SQL.
- 5+ years of experience with data warehouse technologies.
- 3+ years of experience with Agile methodologies (Scrum, Kanban, JIRA).
- Nice to have: experience in the wealth management domain.
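A minimal sketch of a T-SQL source-to-target completeness check run from Python via pyodbc; the connection string and the stg/dim table names are illustrative assumptions.

```python
import pyodbc  # pip install pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-server;DATABASE=my_dw;Trusted_Connection=yes;"  # placeholders
)
cur = conn.cursor()
# Rows present in the staging table but missing from the target dimension
cur.execute("""
    SELECT s.customer_id
    FROM stg.customers AS s
    EXCEPT
    SELECT d.customer_id
    FROM dim.customers AS d;
""")
missing = cur.fetchall()
print(f"{len(missing)} staged customers missing from target")
conn.close()
```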
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
TCS Hiring for Snowflake Developer Role!! TCS presents an excellent opportunity for a Snowflake Developer.
Role: Snowflake on AWS Developer. Desired experience range: 6-10 years. Location: Chennai, Hyderabad. Mode of interview: virtual. Date: 03-07-2025 (Thursday).
Must have:
- Strong experience in Snowflake and SQL for data manipulation and engineering tasks.
- Strong expertise in SQL and Snowflake architecture.
- Experience in one or more cloud platforms, preferably Azure/AWS/GCP.
- In-depth knowledge of data warehousing concepts and experience in building and managing data warehouses.
- Knowledge of ETL concepts; needs to manage data migration into Snowflake, including extracting data from various sources, transforming it into a usable form, and loading it into the Snowflake platform.
- Informatica, dbt.
Posted 1 day ago
6.0 years
0 Lacs
India
Remote
Job Title: ServiceNow to ServiceMax Data Migration Specialist. Experience: 6+ years. Location: remote. Employment type: full-time / contract (both).
Job Summary: We are looking for an experienced Data Migration Specialist to lead the end-to-end migration of data from ServiceNow to ServiceMax as part of a strategic transformation initiative. The ideal candidate will have strong experience in data extraction, mapping, transformation, and validation, with prior exposure to Salesforce ecosystems, including ServiceMax. This is a remote opportunity, and we are seeking candidates who can work independently and collaboratively across global teams.
Key Responsibilities:
- Design and implement a data migration strategy from ServiceNow to ServiceMax.
- Collaborate with business and technical stakeholders to define data mapping rules, transformation logic, and validation criteria.
- Extract, cleanse, transform, and load (ETL) data using appropriate tools and technologies.
- Ensure data integrity, accuracy, and consistency throughout the migration lifecycle.
- Conduct data quality audits and support User Acceptance Testing (UAT).
- Troubleshoot data-related issues and support go-live data readiness.
- Document the data migration process, rules, and any post-migration validations.
Required Skills & Experience:
- Minimum 6+ years of experience in data migration, especially across enterprise applications.
- Proven experience with ServiceNow and Salesforce/ServiceMax platforms.
- Hands-on experience with ETL tools (such as Informatica, Jitterbit, MuleSoft, Data Loader, etc.).
- Deep understanding of Salesforce data models, ServiceMax architecture, and the ServiceNow schema.
- Proficiency in data mapping, cleansing, validation, and reconciliation techniques (see the sketch after this list).
- Strong problem-solving and analytical skills.
- Excellent communication and documentation abilities.
Preferred Qualifications:
- Experience with ServiceMax implementation/migration projects.
- Familiarity with Salesforce APIs, SOQL/SOSL, and data loader tools.
- Certifications in Salesforce or ServiceMax are a plus.
- Experience working in remote, cross-functional environments.
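A minimal sketch of a post-migration reconciliation step on the Salesforce/ServiceMax side. The credentials are placeholders, the WorkOrder object name is an assumption, and the source count would come from the ServiceNow extract in practice.

```python
from simple_salesforce import Salesforce  # pip install simple-salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder
    password="password",           # placeholder
    security_token="token",        # placeholder
)
# Count migrated records on the target side via a SOQL aggregate;
# aggregate results are returned under the "expr0" key.
res = sf.query("SELECT COUNT(Id) FROM WorkOrder")  # object name is an assumption
target_count = res["records"][0]["expr0"]

source_count = 12345  # would come from the ServiceNow extract in practice
print("match" if source_count == target_count else
      f"mismatch: source={source_count}, target={target_count}")
```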
Posted 1 day ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
"At BMC trust is not just a word - it's a way of life!"
We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud!
We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation!
We are seeking a highly analytical and technically skilled professional to join our Finance team as Manager Finance | Principal Finance Analyst. In this role, you will own and lead the reporting and analytics of key top-line metrics in the software industry, ensuring the accuracy, consistency, and usability of data to support critical financial and business decisions. You will work cross-functionally with Finance, Sales Operations, Business Intelligence, and IT teams to build scalable solutions, streamline reporting processes, and drive data integrity across systems.
Here is how, through this exciting role, YOU will contribute to BMC's and your own success:
- Lead the design and delivery of reporting and analytics for top-line KPIs, including Total Contract Value (TCV), Annual Recurring Revenue (ARR), Remaining Performance Obligations (RPO), and more.
- Partner with FP&A, Revenue Accounting, SalesOps, and BI teams on alignment of standardized metric definitions, data logic, and governance across systems like Tableau, Snowflake, and Workday Adaptive Planning.
- Support monthly and quarterly financial close processes by validating, reconciling, and finalizing revenue-related data in partnership with accounting teams.
- Design and manage interactive, self-service dashboards in Tableau that enable business users to explore revenue and customer trends effectively.
- Build and maintain robust ETL pipelines using tools such as Informatica or SSIS to transform and model data from various sources into finance reporting layers.
- Develop and optimize complex SQL queries and stored procedures to support dynamic reporting, reconciliations, and business insights.
- Ensure data quality and accuracy by implementing automated data validation, reconciliation checks, and exception reporting mechanisms.
- Identify and lead process automation opportunities to enhance reporting speed, consistency, and scalability.
- Collaborate with IT and Finance Systems teams to test, implement, and document system and data model enhancements.
- Support audit and compliance activities by preparing necessary documentation, validating financial controls, and participating in audit walkthroughs.
- Cross-train team members and end users.
To ensure you are set up for success, you will bring the following skillset and experience:
Required skills: Bachelor's degree required (MBA preferred), with at least 10 years' experience, especially as a domain expert in building and maintaining financial metrics (TCV, ACV, ARR, revenue).
Technical skills:
- Strong proficiency in SQL (Snowflake preferred), with experience building scalable, modular queries and views.
- Hands-on experience with Tableau: workbook development, LODs, parameters, dashboard actions, and performance tuning.
- Knowledge of ETL tools such as Informatica, SSIS, Alteryx, or custom Python/SQL-based pipelines.
- Understanding of data warehousing concepts and data modeling (e.g., star schema, dimensional modeling).
- Experience working with ERP/CRM systems such as Salesforce or Oracle.
- Familiarity with Workday Adaptive Planning and Power BI is a plus.
Finance & business acumen:
- Strong understanding of software industry top-line metrics (ARR, TCV, RPO, churn, upsell, etc.) and the finance logic behind software revenue recognition.
- Prior experience supporting FP&A, Revenue Accounting, or Business Operations teams in a data or systems role.
- Strong communication and collaboration skills to work effectively with both technical and non-technical stakeholders.
Our commitment to you! BMC's culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won't be known just by your employee number, but for your true authentic self. BMC lets you be YOU!
If, after reading the above, you're unsure whether you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talents from diverse backgrounds and experience to ensure we face the world together with the best ideas!
BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page.
BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process. At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 3,380,000 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices.
(Returnship@BMC) Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to know more and how to apply.
Posted 1 day ago
3.0 years
0 Lacs
India
On-site
Job Title: Oracle Product Data Hub (PDH) Technical Consultant - Product Master Data Specialist. Location: India. Job type: full-time consultant. Experience level: mid to senior. Industry: ERP / Master Data Management / Manufacturing / Retail / Supply Chain.
Job Summary: We are seeking a skilled Oracle Product Data Hub (PDH) Technical Consultant with deep expertise in product master data management to support the end-to-end lifecycle of finished goods, raw materials, and pricing data in Oracle Fusion PDH. The ideal candidate will have hands-on experience in data cleansing, enrichment, transformation, validation, and mass data loading into Oracle Cloud PDH using best practices and tools such as FBDI, REST/SOAP APIs, and data import templates. This role requires strong technical knowledge of Oracle PDH, a problem-solving mindset, and experience collaborating with functional teams and business users to ensure clean, standardized, and accurate product data is maintained across systems.
Key Responsibilities:
- Lead technical efforts in product data onboarding, including finished goods, raw materials, and pricing structures, into Oracle Fusion Product Data Hub.
- Perform data cleansing, de-duplication, normalization, and transformation activities using industry best practices and custom rulesets.
- Develop and execute data migration strategies using Oracle FBDI templates, import maps, REST/SOAP APIs, and spreadsheets.
- Create and maintain scripts or tools for mass upload, update, and validation of product data.
- Collaborate with business analysts, data stewards, and IT to define and implement product data governance, data quality rules, and workflows.
- Conduct data validation and reconciliation activities post-load, ensuring accuracy, completeness, and compliance with business rules.
- Troubleshoot and resolve technical issues related to PDH data imports, validations, and integrations.
- Support product hierarchy setup, item class configuration, attribute groups, catalogs, and data quality scorecards.
- Document technical specifications, data load procedures, and configuration guides.
Required Skills and Experience:
- 3+ years of hands-on technical experience with Oracle Fusion Product Data Hub (PDH).
- Proven experience in mass loading and maintaining product data, including finished goods, raw materials, and pricing.
- Strong experience with Oracle FBDI templates, REST/SOAP web services, and Excel-based data load tools.
- Proficiency in SQL and PL/SQL for data analysis and transformation.
- Solid understanding of Oracle Fusion Product Hub structures: item classes, templates, catalogs, attributes, and change orders.
- Knowledge of item lifecycle management, global product definitions, and cross-functional data dependencies.
- Familiarity with Oracle SCM modules (Inventory, Costing, Pricing) is a plus.
- Experience in large-scale data migration, cleansing, and conversion projects.
- Excellent analytical, communication, and stakeholder engagement skills.
Preferred Qualifications:
- Oracle Cloud certification in Product Data Management or SCM.
- Experience with data governance frameworks or MDM tools.
- Exposure to tools like Oracle Integration Cloud (OIC), OACS, or Informatica MDM.
- Experience in the manufacturing, apparel, or retail industries preferred.
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
ETL Tester. Location: Chennai. Experience: 5 to 8 years.
Key Responsibilities:
- Review ETL design documents and understand data flows, mapping documents, and business requirements.
- Develop comprehensive test plans, test cases, and test scripts for validating ETL processes.
- Perform data validation and data quality testing at various stages of the ETL cycle.
- Write and execute SQL queries to verify data transformation logic, source-to-target data mapping, and business rules (see the sketch after this list).
- Identify, troubleshoot, and document data anomalies, discrepancies, and system defects.
- Work closely with development teams to replicate, debug, and resolve issues.
- Participate in daily stand-ups, sprint planning, and defect triage meetings.
- Communicate clearly with stakeholders and provide timely updates on test status and results.
- Contribute to the development and maintenance of automated ETL testing solutions (optional, based on project).
- Ensure compliance with testing standards and best practices across data projects.
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of hands-on experience in ETL testing or data validation roles.
- Strong knowledge of SQL and the ability to write complex queries for data verification.
- Familiarity with ETL tools (e.g., Informatica, Talend, DataStage, SSIS, etc.).
- Experience working with large datasets and relational databases (Oracle, SQL Server, PostgreSQL, etc.).
- Excellent problem-solving skills with a keen eye for identifying data quality issues.
- Strong analytical and critical thinking skills.
- Clear and concise verbal and written communication skills for cross-functional collaboration.
- Ability to work in agile/scrum environments with fast-changing priorities.
Nice to have:
- Experience with test automation for ETL pipelines using tools like Selenium, PyTest, or Apache Airflow validation scripts.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Exposure to BI tools like Power BI, Tableau, or Looker.
- Understanding of data warehousing and data lake concepts.
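A minimal sketch of verifying one transformation rule, using a hypothetical mapping in which the target full_name should equal the source first and last names joined by a space.

```python
import sqlite3

# Hypothetical source and target tables with one seeded defect
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_customers (id INTEGER, first_name TEXT, last_name TEXT);
    CREATE TABLE tgt_customers (id INTEGER, full_name TEXT);
    INSERT INTO src_customers VALUES (1, 'Ada', 'Lovelace'), (2, 'Alan', 'Turing');
    INSERT INTO tgt_customers VALUES (1, 'Ada Lovelace'), (2, 'Alan T.');
""")

# Report rows where the target value deviates from the mapped expression
violations = conn.execute("""
    SELECT s.id,
           s.first_name || ' ' || s.last_name AS expected,
           t.full_name                        AS actual
    FROM src_customers s
    JOIN tgt_customers t ON s.id = t.id
    WHERE t.full_name <> s.first_name || ' ' || s.last_name
""").fetchall()
for row in violations:
    print("rule violated:", row)  # prints (2, 'Alan Turing', 'Alan T.')
```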
Posted 1 day ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred, and a Databricks certification is preferred.
Key Responsibilities:
- Design, develop, and maintain data integration solutions using Databricks.
- Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes (see the sketch below).
- Optimize and troubleshoot Databricks workflows and performance issues.
- Ensure data quality and integrity throughout the data lifecycle.
- Provide technical guidance and mentorship to junior developers.
- Stay updated with the latest industry trends and best practices in data integration and Databricks.
Required Qualifications:
- Bachelor's degree in computer science or equivalent.
- Minimum of 5 years of hands-on experience with Databricks.
- Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
- Well-versed in data warehousing and data lake concepts.
- Proficient in SQL and Python for data manipulation and analysis.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills.
- Strong communication and collaboration skills.
Preferred Qualifications:
- Certified Databricks Engineer.
- Experience in the life sciences domain.
- Familiarity with Reltio or similar MDM (Master Data Management) tools.
- Experience with data governance and data security best practices.
IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
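A minimal sketch of a Databricks-style transform in PySpark, with illustrative inline data; on Databricks itself a SparkSession is already provided, and the output path is a placeholder.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Hypothetical raw order data; in practice this would be read from a source table
raw = spark.createDataFrame(
    [("P1", "2024-01-05", 120.0),
     ("P1", "2024-01-09", 80.0),
     ("P2", "2024-01-07", 50.0)],
    ["product_id", "order_date", "amount"],
)

# Transform step: aggregate to product-level totals for a curated layer
curated = raw.groupBy("product_id").agg(F.sum("amount").alias("total_amount"))
curated.show()

# Load step (path is a placeholder):
# curated.write.mode("overwrite").parquet("/mnt/curated/product_totals")
```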
Posted 1 day ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About ProcDNA: ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?
What We Are Looking For: You'll be driving the adoption of the latest technologies in our solutions, bringing in thought leadership to guide clients on complex data management problems, and driving business performance. You will work with the leadership team to bring subject matter expertise in areas such as Big Data, ETL, Reporting, CRM, Data Warehousing, MDM, DevOps, Software Development, etc. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.
What You'll Do:
- Lead end-to-end data management solution projects for multiple clients across data engineering and BI technologies.
- Create the project management plan and ensure adherence to project timelines.
- Integrate multiple data sources into one visualization to tell a story.
- Interact with customers to understand their business problems and provide best-in-class analytics solutions.
- Interact with data platform leaders and understand data flows that integrate into Tableau/analytics.
- Understand data governance, quality, and security, and integrate analytics with these enterprise platforms.
- Interact with UX/UI global functions and design best-in-class visualizations for customers, harnessing all product capabilities.
Requirements:
- 7-10 years of data warehousing and data engineering experience.
- Experience interacting with life science clients directly, discussing requirements, and managing stakeholders.
- Experience in requirement gathering and designing enterprise warehouse solutions from scratch.
- Hands-on experience with ETL tools like ADF, Databricks, and Informatica; experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow (see the sketch after this list); experience with data warehouses: SQL/NoSQL, Amazon Redshift, Snowflake, Apache Hive, HDFS, etc.
- BI tools knowledge and experience leading the implementation of dashboards.
- Deep understanding of data governance and data quality management frameworks.
- Strong communication and presentation skills with a strong problem-solving attitude.
- Excellent analytical, problem-solving, and debugging skills, with a strong ability to quickly learn and comprehend business processes and problems in order to develop technical solutions to their requirements.
Skills: MDM, SQL, HDFS, data warehousing, big data, DevOps, cloud, Amazon Redshift, Snowflake, pharmaceutical consulting, data management, Apache Hive, Azure, reporting, problem-solving, Luigi, Informatica, analytical skills, presentation skills, data governance, ADF, data engineering, CRM, Databricks, BI technologies, Airflow, team management, business technology, AWS, Azkaban, software development, ETL, client management, data quality management, life science
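A minimal Airflow DAG sketch of the workflow orchestration named in the requirements; the DAG id, task bodies, and schedule are illustrative assumptions (Airflow 2.4+ uses the `schedule` argument; older versions use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")  # placeholder extract step

def load():
    print("write to warehouse")  # placeholder load step

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2  # extract runs before load
```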
Posted 1 day ago
5.0 - 8.0 years
12 - 22 Lacs
Mumbai
Hybrid
Informatica Developer. Experience: 5 to 8 years. Location: Mumbai. Skills: Informatica PowerCenter, ETL, RDBMS concepts, shell scripting, Java, and Python.
Informatica Developer with 5-7 years of experience in ETL, RDBMS concepts, Informatica, shell scripting, Java, and Python to support the applications in the Surveillance area.
Duties and Responsibilities:
- Support the applications in the Surveillance area.
- Build and enhance applications in the Surveillance area as needed.
- Support and develop ETL data loads in Informatica and shell scripts.
- Work with users and stakeholders, including application owners for upstream systems, to resolve any support issues.
- Take full ownership of issues that arise, provide analysis of issues, escalate them as and when necessary, and take them through to resolution within defined SLAs.
Interested candidates, share your CV at himani.girnar@alikethoughts.com with the below details:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact no.
- Total exp
- Relevant experience
- Current org
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- PAN card no.
Posted 1 day ago
4.0 - 6.0 years
6 - 8 Lacs
Chennai
Work from Office
Implement data integration solutions using Informatica PowerCenter and Teradata. Optimize ETL workflows and ensure efficient data processing for analytics.
Posted 1 day ago
4.0 - 5.0 years
6 - 7 Lacs
Bengaluru
Work from Office
Implement and manage Informatica Data Quality (IDQ) solutions to ensure the accuracy, completeness, and reliability of data. Design data profiling, cleansing, and validation processes to maintain high-quality data across systems.
Posted 1 day ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Develop and implement OBIEE solutions, focusing on Oracle Business Intelligence Enterprise Edition (OBIEE) for reporting and analytics. You will work with SQL, Unix, and IPC for application development. Expertise in OBIEE and SQL is required.
Posted 1 day ago
4.0 - 5.0 years
6 - 7 Lacs
Bengaluru
Work from Office
Develop and manage integration solutions using Informatica Data Management Cloud (IDMC). Ensure smooth data integration processes, automate workflows, and ensure the consistency and reliability of data across systems.
Posted 1 day ago
8.0 - 11.0 years
20 - 30 Lacs
Gurugram, Bengaluru
Hybrid
Key responsibilities:
- An expert in solution design with the ability to see the big picture across the portfolio, providing guidance and governance for the analysis, solution design, development, and implementation of projects.
- A strategic thinker who will be responsible for the technical strategy within the portfolio, ensuring it aligns with the overall architecture roadmap and business strategy.
- An effective communicator who will utilize their technical/business knowledge to lead technical discussions with project teams, business sponsors, and external technical partners in a manner that ensures understanding of risks, options, and overall solutions.
- An individual with the ability to effectively consult on and support technical discussions with Account Delivery and Business Sponsors, ensuring alignment with both technology and business visions.
- Collaborate with designers, business system analysts, application analysts, and testing specialists to deliver high-quality solutions.
- Able to prepare high-level and detailed designs based on technical/business requirements and defined architectures, and maintain documentation.
- Has been instrumental in platform migration and technical migration work in the past and understands the intricacies involved.
- Analyze, define, and document requirements for data, workflow, logical processes, interface design, internal and external checks, controls, and outputs.
- Ensure information security standards and requirements are incorporated into all solutions.
- Stay current with trends in emerging technologies and how they could apply to Sun Life.
Key experience:
- A bachelor's or master's degree in Computer Science or a related field.
- 8-11 years of progressive information technology experience with the full application development life cycle.
- Domain knowledge of insurance and retail wealth management.
- Experience in Informatica PowerCenter / IDMC development.
- Experience applying various Informatica transformations and different types of sources.
- Ability to write complex T-SQL, stored procedures, and views; experience in SQL Server 2014 and above.
- Exposure to DevOps and API architecture.
- Should have experience leading small teams (5-8 developers).
- Good knowledge and experience of Java 1.8 or above.
- Experience in PostgreSQL and NoSQL databases like MongoDB.
- Good knowledge of coding best practices; should be able to review peers' code.
- Produce clean, efficient code based on specifications, and troubleshoot, debug, and upgrade existing software.
Posted 1 day ago