3.0 - 8.0 years
3 - 6 Lacs
Lucknow
Work from Office
Job Title: Social Media Manager & Graphic Designer
Company: BE Realty
Location: Lucknow
Job Type: Full-time

If you're a passionate Social Media Manager with a flair for Meta platforms, design, social media campaigns, and branding, we would love to hear from you. Apply today by sending your resume, portfolio, and a brief cover letter to hr.head@be.realty.

Job Overview: BE Realty is seeking a creative and strategic Social Media Manager with graphic design experience to elevate our brand presence across digital platforms. The ideal candidate will be responsible for creating engaging visual content, managing social media accounts, and implementing marketing strategies to enhance audience engagement and business growth.

Key Responsibilities: Develop and execute social media strategies across platforms such as Facebook, Instagram, and LinkedIn. Manage social media content calendars, ensuring timely posting and consistent branding. Monitor analytics and performance metrics to optimize engagement and reach. Collaborate with marketing and sales teams to align content with business goals. Stay updated on design trends and social media best practices to keep content fresh and innovative. Engage with followers, respond to inquiries, and foster community growth. Assist with website content updates and digital marketing initiatives. Design high-quality graphics, including social media posts, flyers, brochures, email campaigns, and digital ads.

Qualifications & Skills: Proven experience as a Social Media Manager (real estate industry experience and strong graphic design skills are a plus). Proficiency in Meta, Instagram and Facebook Ads, Adobe Creative Suite (Photoshop, Illustrator, InDesign, etc.) and Canva. Experience with social media management tools (e.g., Hootsuite, Buffer, or Meta Business Suite). Strong understanding of branding, typography, and visual storytelling. Excellent writing and communication skills. Knowledge of digital marketing, SEO, and paid social media campaigns is a plus. Ability to multitask, meet deadlines, and work independently.

What We Offer: Competitive salary based on experience. Opportunity to work in a dynamic and growing real estate company. Creative freedom to develop engaging content. A supportive and collaborative team environment. Professional growth and learning opportunities.
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a Data Platform Engineer to build scalable infrastructure for data ingestion, processing, and analysis. Key Responsibilities: Architect distributed data systems. Enable data discoverability and quality. Develop data tooling and platform APIs. Required Skills & Qualifications: Experience with Spark, Kafka, and Delta Lake. Proficiency in Python, Scala, or Java. Familiar with cloud-based data platforms. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy Delivery Manager Integra Technologies
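The listing above asks for experience with Spark, Kafka, and Delta Lake for data ingestion. As a hedged illustration only, the sketch below shows what a minimal Kafka-to-Delta ingestion job can look like in PySpark; the broker address, topic name, schema, and storage paths are assumptions made for the example and are not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Assumes a cluster with the Kafka and Delta Lake connectors available.
spark = SparkSession.builder.appName("events-ingestion-sketch").getOrCreate()

# Hypothetical event schema; a real platform would load this from a schema registry.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

# Read a Kafka topic as a stream and parse the JSON payload.
raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
       .option("subscribe", "platform.events")             # placeholder topic
       .load())

events = (raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Land the stream into a Delta table, checkpointing so the sink stays consistent.
query = (events.writeStream.format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/platform_events")
         .outputMode("append")
         .start("/data/bronze/platform_events"))

query.awaitTermination()
```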
Posted 1 month ago
3.0 - 8.0 years
12 - 20 Lacs
Hyderabad
Work from Office
Data Governance, Data Quality, Data Management
Collibra, Alation
MDM and BI/reporting processes
Data privacy regulations (GDPR, CCPA, etc.)
Agile project environments
Posted 1 month ago
5.0 - 7.0 years
7 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Description
Publicis Sapient is looking for a Senior AEM Developer to join our team of bright thinkers and doers. You'll use your problem-solving creativity to design, architect, and develop high-end technology solutions that solve our clients' most complex and challenging problems across different industries. We are on a mission to transform the world, and you will be instrumental in shaping how we do it with your ideas, thoughts, and solutions.

Your Impact
Drive the design, planning, and implementation of multifaceted applications, giving you breadth and depth of knowledge across the entire project lifecycle. Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients' business. Constantly innovate and evaluate emerging technologies and methods to provide scalable and elegant solutions that help clients achieve their business goals.

Qualifications
Overall experience of 6 to 8 years, with 2-3 years of AEM including AEM 6.x. Strong hands-on experience in components, templates, taxonomy, metadata management, forward and reverse replication, workflows, content publishing and unpublishing, tagging, analytics, deployment (Maven), and content migration/planning. Significant hands-on design experience with AEM and very strong concepts of OSGi, Apache Sling, Apache Sightly, Apache Oak, and Adobe Dispatcher. Has worked on and implemented at least one of the popular search engines with AEM, such as Solr, Elasticsearch, or Lucene, and has been involved in search query performance tuning. Has worked on integrations with popular products/technologies such as Salesforce, SSO, LDAP, and API gateways using REST. Understands and implements the quality properties of the system in everyday work, including accessibility, SEO, URL management, security, performance, and responsive architecture. Has implemented quality processes for projects such as continuous integration (Bamboo/Jenkins/Git/Bitbucket/Cloud Manager), SonarQube, code reviews (manual and automated), code formatters, and automation testing. Understanding of frontend technologies such as Bootstrap, Backbone.js, ReactJS, Handlebars, Grunt, Angular, CSS3, HTML5, and jQuery. Mastery of all core web and Java technologies including Java 8/11, JEE, XML, XHTML, client/server-side scripting languages such as JavaScript, JSP, HTL, and web services development using RESTful implementations. A good understanding of AEM capabilities, including Multi-Site Manager (MSM) and blueprinting, and the use of online marketing components such as advanced targeting/personalization and multivariate testing, is preferred. Proficient knowledge of the end-to-end content lifecycle, web content management, content publishing/deployment, and delivery processes. Knowledge of cloud-native approaches and platforms including AWS, Azure, or GCP; understanding of AEM as a Cloud Service. Familiarity with Adobe I/O Runtime and Adobe I/O Events. A good understanding of integration patterns and content-centric application development patterns using AEM Search, the Commerce package, or other platforms is preferred.
Education: Full-time Bachelor's/Master's degree (Science or Engineering preferred)

Additional Information
Develop digital consumer experiences based on the Adobe marketing product suite, including Adobe Sites (AEM), CRX, WCM, Adobe Launch, Adobe Assets (DAM) & Adobe Social. Develop powerful features such as multi-site and multi-channel delivery, personalization/targeting, content aggregation & syndication, multi-lingual support, automated workflow management, social media, etc. Diagnose and solve technical problems related to web content management implementation. Interact with clients to create end-to-end specifications and solution architecture for content & collaboration solutions. Mentor and provide inputs and direction to senior developers on the team for design and implementation. Collaborate with your architect to define implementation processes, quality gates, and standards. Write application code and extensions for the AEM platform that exceed the defined quality standards.

Additional Information
Gender-Neutral Policy. 18 paid holidays throughout the year. Generous parental leave and new parent transition program. Flexible work arrangements. Employee Assistance Programs to help you in wellness and well-being.

Company Description
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.
Posted 1 month ago
3.0 - 4.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Quality Engineer
Experience: 3-4 years
Location: Bangalore

We are seeking a detail-oriented and highly motivated Data Quality Engineer to join our growing data team. In this role, you will be responsible for designing, implementing, and maintaining data quality frameworks to ensure the accuracy, completeness, consistency, and reliability of enterprise data. You will work closely with business stakeholders, data stewards, and data engineers to enforce data governance policies and utilize tools like Ataccama to support enterprise data quality initiatives. We only need immediate joiners.

Key Responsibilities: Design and implement robust data quality frameworks and rules using Ataccama ONE or similar data quality tools. Develop automated data quality checks and validation routines to proactively detect and remediate data issues. Collaborate with business and technical teams to define data quality metrics, thresholds, and standards. Support the data governance strategy by identifying critical data elements and ensuring alignment with organizational policies. Monitor, analyze, and report on data quality trends, providing insights and recommendations for continuous improvement. Work with data stewards to resolve data issues and ensure adherence to data quality best practices. Support metadata management, data lineage, and data profiling activities. Document processes, data flows, and data quality rules to facilitate transparency and reproducibility. Conduct root cause analysis on data issues and implement corrective actions to prevent recurrence.

Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 3+ years of experience in a Data Quality, Data Governance, or Data Engineering role. Hands-on experience with Ataccama ONE or similar data quality tools, including rule creation, data profiling, and issue management. Strong knowledge of data governance frameworks, principles, and best practices. Proficient in SQL and data analysis, with the ability to query complex datasets. Experience with data management platforms and enterprise data ecosystems. Excellent problem-solving skills and attention to detail. Strong communication and stakeholder engagement skills.

Preferred Qualifications: Experience with cloud data platforms (e.g., Snowflake, AWS, Azure). Familiarity with data catalog tools (e.g., Collibra, Alation). Knowledge of industry data standards and regulatory requirements (e.g., GDPR, HIPAA).
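The responsibilities above centre on automated data quality checks and validation routines. Ataccama ONE rules are configured in the tool itself, so as a neutral, hedged illustration, the sketch below shows the kind of completeness, uniqueness, and validity checks such a framework automates, written in PySpark; the table name, column names, and reference values are assumptions for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

# Hypothetical customer table; in practice this would come from the governed platform.
df = spark.table("analytics.customers")
total = df.count()

# Completeness: share of non-null values in a critical column.
non_null_email = df.filter(F.col("email").isNotNull()).count()
completeness = non_null_email / total if total else 0.0

# Uniqueness: count of business keys that appear more than once.
duplicate_keys = (df.groupBy("customer_id").count()
                    .filter(F.col("count") > 1).count())

# Validity: country codes restricted to an approved reference list (illustrative list).
invalid_country = df.filter(~F.col("country_code").isin("IN", "US", "GB")).count()

results = {
    "email_completeness": completeness,
    "duplicate_customer_ids": duplicate_keys,
    "invalid_country_codes": invalid_country,
}

# A real framework would persist these metrics and alert when thresholds are breached.
for metric, value in results.items():
    print(f"{metric}: {value}")
```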
Posted 1 month ago
3.0 - 6.0 years
9 - 14 Lacs
Mumbai
Work from Office
Role Overview: We are looking for a Talend Data Catalog Specialist to drive enterprise data governance initiatives by implementing Talend Data Catalog and integrating it with Apache Atlas for unified metadata management within a Cloudera-based data lakehouse. The role involves establishing metadata lineage, glossary harmonization, and governance policies to enhance trust, discovery, and compliance across the data ecosystem.

Key Responsibilities:
o Set up and configure Talend Data Catalog to ingest and manage metadata from source systems, the data lake (HDFS), Iceberg tables, the Hive metastore, and external data sources.
o Develop and maintain business glossaries, data classifications, and metadata models.
o Design and implement bi-directional integration between Talend Data Catalog and Apache Atlas to enable metadata synchronization, lineage capture, and policy alignment across the Cloudera stack.
o Map technical metadata from Hive/Impala to business metadata defined in Talend.
o Capture end-to-end lineage of data pipelines (e.g., from ingestion in PySpark to consumption in BI tools) using Talend and Atlas.
o Provide impact analysis for schema changes, data transformations, and governance rule enforcement.
o Support definition and rollout of enterprise data governance policies (e.g., ownership, stewardship, access control).
o Enable role-based metadata access, tagging, and data sensitivity classification.
o Work with data owners, stewards, and architects to ensure data assets are well-documented, governed, and discoverable.
o Provide training to users on leveraging the catalog for search, understanding, and reuse.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: 6-12 years in data governance or metadata management, with at least 2-3 years in Talend Data Catalog. Talend Data Catalog, Apache Atlas, Cloudera CDP, Hive/Impala, Spark, HDFS, SQL. Business glossary, metadata enrichment, lineage tracking, stewardship workflows. Hands-on experience in Talend-Atlas integration, either through REST APIs, Kafka hooks, or metadata bridges.
Posted 1 month ago
7.0 - 12.0 years
27 - 42 Lacs
Chennai
Work from Office
Collibra Certified Rangers (preferred). Analyze current architecture, data flow, data dependencies, and related documentation. Develop Collibra integration solutions. Trace data from the source system, across the various contact points of the data landscape, to the final destination system. Use the Lineage Harvester and adhere to industry best practices. Experience in metadata extraction and in building business and technical lineage among assets. Design, develop, and test Collibra integrations and workflows. Take advantage of the depth and breadth of integrations to connect the data ecosystem to the Collibra Data Intelligence Platform. Access Collibra-supported integrations, partner and other pre-built integrations, and APIs to gain visibility. Understand and share insights. Understand Data Governance, Metadata Management, Reference Data Management, Data Modeling, Data Integration & Data Analysis. Work as a solution provider. Collibra API development.
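The role above involves building Collibra integrations through its REST APIs. The snippet below is only a hedged sketch of calling a metadata API with Python's requests library; the host, base path, endpoint paths, and payload fields are assumptions made for illustration and should be taken from Collibra's official API documentation, not from this example.

```python
import requests

# All of these values are placeholders, not verified Collibra endpoints or credentials.
BASE_URL = "https://collibra.example.com/rest/2.0"   # assumed API base path
SESSION = requests.Session()
SESSION.auth = ("svc_integration", "change-me")      # assumed service-account credentials

def find_assets(name_fragment: str) -> dict:
    """Search assets by name; the query parameter shape is an assumption."""
    resp = SESSION.get(f"{BASE_URL}/assets", params={"name": name_fragment})
    resp.raise_for_status()
    return resp.json()

def tag_asset(asset_id: str, tag: str) -> dict:
    """Attach a tag to an asset; the endpoint and payload shape are assumptions."""
    resp = SESSION.post(f"{BASE_URL}/tags/asset/{asset_id}", json={"tagNames": [tag]})
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Usage sketch: list a few matching assets before tagging them.
    for asset in find_assets("customer").get("results", []):
        print(asset.get("id"), asset.get("name"))
```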
Posted 1 month ago
9.0 - 14.0 years
50 - 85 Lacs
Noida
Work from Office
About the Role
We are looking for a Staff Engineer to lead the design and development of a scalable, secure, and robust data platform. You will play a key role in building data platform capabilities for data quality, metadata management, lineage tracking, and compliance across all data layers. If you're passionate about building foundational data infrastructure that accelerates innovation in healthcare, we'd love to talk.

A Day in the Life
Architect, design, and build scalable data governance tools and frameworks. Collaborate with cross-functional teams to ensure data compliance, security, and usability. Lead initiatives around metadata management, data lineage, and data cataloging. Define and evangelize standards and best practices across data engineering teams. Own the end-to-end lifecycle of governance tooling, from prototyping to production deployment. Mentor and guide junior engineers and contribute to technical leadership across the organization. Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions.

What You Need
8+ years of experience in software engineering. Strong experience building distributed systems for metadata management, data lineage, and quality tracking. Proficient in backend development (Python, Java, Scala, or Go) and familiar with RESTful API design. Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc. Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus. Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings. Prior experience in building metadata management frameworks at scale.
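The role above is about building tooling for metadata management and lineage tracking. As a tool-agnostic, hedged sketch, the snippet below models lineage edges as plain Python dataclasses of the kind such a governance service might expose; the field names and store design are assumptions for illustration, not a reference to any specific framework such as Atlas, Amundsen, or DataHub.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass(frozen=True)
class Dataset:
    platform: str   # e.g. "snowflake", "kafka" (illustrative values)
    name: str       # fully qualified dataset name

@dataclass
class LineageEdge:
    upstream: Dataset
    downstream: Dataset
    job: str        # pipeline or task that produced the edge
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class LineageStore:
    """Tiny in-memory store; a production service would persist to a graph or metadata DB."""
    def __init__(self) -> None:
        self._edges: List[LineageEdge] = []

    def record(self, edge: LineageEdge) -> None:
        self._edges.append(edge)

    def upstream_of(self, dataset: Dataset) -> List[Dataset]:
        return [e.upstream for e in self._edges if e.downstream == dataset]

# Usage sketch with invented dataset names.
store = LineageStore()
raw = Dataset("kafka", "claims.events")
curated = Dataset("snowflake", "analytics.claims_daily")
store.record(LineageEdge(upstream=raw, downstream=curated, job="claims_daily_etl"))
print(store.upstream_of(curated))
```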
Posted 1 month ago
3 - 7 years
3 - 7 Lacs
Bengaluru
Work from Office
Hyperion Essbase Developer
Full-time. Department: Enterprise Applications

Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment, and we do things differently. We're focused on our core values; using these, we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
A Hyperion Essbase Developer will be responsible for designing, developing, and maintaining Oracle Hyperion Planning and Essbase applications. This role requires expertise in multidimensional databases, OLAP technologies, and financial data modeling.

Technical Responsibilities
Essbase Development: Design and develop BSO (Block Storage Option) and ASO (Aggregate Storage Option) cubes. Implement calculation scripts, business rules, and member formulas. Optimize cube performance using indexing, partitioning, and aggregation techniques.
Hyperion Planning: Configure and maintain Hyperion Planning applications. Develop data forms, task lists, and workflow processes.
Data Integration & Automation: Implement ETL processes using FDMEE (Financial Data Quality Management Enterprise Edition). Develop SQL scripts for data extraction and transformation. Automate data loads, metadata updates, and security provisioning.
Security & Performance Optimization: Manage user roles, access permissions, and authentication. Optimize query performance using Essbase tuning techniques. Monitor system health and troubleshoot performance issues.

Qualifications
Oracle Hyperion Planning & Essbase. Essbase Calculation Scripts & Business Rules. SQL & PL/SQL. FDMEE & Data Integration. EPM Automate. Smart View & Financial Reporting. Metadata Management & Security Configuration.

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees.
We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
Posted 1 month ago
3 - 5 years
11 - 15 Lacs
Hyderabad
Work from Office
Overview
As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of data modeling, working closely with the Data Governance, Data Engineering and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data str/cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications
8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture. 3+ years of experience with Data Lake infrastructure, Data Warehousing, and Data Analytics tools. 4+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI).
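One deliverable named above is source-to-target mappings for ETL and BI developers. As a hedged, illustrative sketch only, the snippet below encodes such a mapping as plain Python data and applies it with PySpark; the source table, column names, and target types are invented for the example and are not from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stm-sketch").getOrCreate()

# Hypothetical source-to-target mapping: source column -> (target column, target type).
MAPPING = {
    "cust_no":       ("customer_id", "string"),
    "cust_nm":       ("customer_name", "string"),
    "ord_amt_local": ("order_amount", "decimal(18,2)"),
    "ord_dt":        ("order_date", "date"),
}

source_df = spark.table("staging.orders_raw")   # placeholder source table

# Apply the mapping: rename and cast each mapped column, dropping everything else.
target_df = source_df.select(
    *[F.col(src).cast(dtype).alias(tgt) for src, (tgt, dtype) in MAPPING.items()]
)

# The resulting frame matches the physical target model and can be handed to ETL/BI teams.
target_df.write.mode("overwrite").saveAsTable("warehouse.fact_orders")
```

Keeping the mapping as data rather than hard-coded transformations makes it easy to review with data modelers and to document alongside the data dictionary.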
Posted 1 month ago
3 - 5 years
5 - 10 Lacs
Mumbai
Work from Office
We are seeking a proactive and detail-oriented Digital Asset Manager to oversee the organization and coordination of digital assets for a high-volume, multi-market automotive campaign. This role is ideal for individuals with strong project management skills and the ability to adapt quickly to new tools and workflows. While prior experience with digital asset management platforms like Cape is advantageous, it is not mandatory. The position offers an opportunity to evolve into a leadership role, encompassing responsibilities such as asset governance, quality assurance, and cross-functional coordination with stakeholders, including teams in Germany.

Key responsibilities
This role requires strong project management skills, which may be applied across a range of areas including, but not limited to, the following:
Asset organization: Ensure assets are properly tagged and stored for easy retrieval and compliance. Maintain a structured taxonomy and metadata schema for organizing digital assets.
Workflow coordination: Collaborate with creative teams to streamline asset creation and approval processes. Coordinate with layout experts and other stakeholders to ensure timely delivery of assets.
Quality assurance: Develop and implement QA checklists to maintain asset quality and consistency. Conduct regular audits of assets to ensure they meet brand and technical standards.
Governance and compliance: Establish and enforce guidelines for asset usage and distribution. Ensure compliance with licensing agreements and usage rights.
Stakeholder communication: Serve as the primary point of contact between the creative team and external stakeholders. Facilitate regular updates and feedback sessions with teams in Germany and other locations.

Skills & requirements
3-5 years of experience in project management, digital asset management, or related fields. Strong organizational skills and attention to detail. Excellent communication and interpersonal abilities. Ability to adapt quickly to new tools and technologies. Familiarity with digital asset management systems is a plus, but not mandatory. Proficiency in Microsoft Office Suite and project management tools.

Nice to have
Experience with creative automation platforms like Cape. Understanding of metadata standards and taxonomy development. Background in quality assurance or compliance roles. Exposure to international stakeholder management.
Posted 1 month ago
10 - 15 years
15 - 18 Lacs
Hyderabad
Work from Office
Skilled in data modeling (ER/Studio, Erwin), MPP databases (Databricks, Snowflake), GitHub, CI/CD, metadata/lineage, agile/DevOps, SAP HANA/S4, and retail data (IRI, Nielsen). Mail: kowsalya.k@srsinfoway.com
Posted 1 month ago
4 - 7 years
20 - 22 Lacs
Pune, Gurugram
Work from Office
Core skills and Competencies
1. Design, develop, and maintain data pipelines, ETL/ELT processes, and data integrations to support efficient and reliable data ingestion, transformation, and loading.
2. Collaborate with API developers and other stakeholders to understand data requirements and ensure the availability, reliability, and accuracy of the data.
3. Optimize and tune the performance of data processes and workflows to ensure efficient data processing and analysis at scale.
4. Implement data governance practices, including data quality monitoring, data lineage tracking, and metadata management.
5. Work closely with infrastructure and DevOps teams to ensure the scalability, security, and availability of the data platform and data storage systems.
6. Continuously evaluate and recommend new technologies, tools, and frameworks to improve the efficiency and effectiveness of data engineering processes.
7. Collaborate with software engineers to integrate data engineering solutions with other systems and applications.
8. Document and maintain data engineering processes, including data pipeline configurations, job schedules, and monitoring and alerting mechanisms.
9. Stay up-to-date with industry trends and advancements in data engineering, cloud technologies, and data processing frameworks.
10. Provide mentorship and guidance to junior data engineers, promoting best practices in data engineering and ensuring the growth and development of the team.
11. Able to implement and troubleshoot REST services in Python (see the sketch after this listing).
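Point 11 above asks for the ability to implement and troubleshoot REST services in Python. A minimal, hedged sketch using Flask is shown below; the routes and the pipeline-status payload are invented purely for illustration and do not come from the posting.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for pipeline run metadata; a real service would query a metastore.
PIPELINE_RUNS = {
    "daily_sales_load": {"status": "succeeded", "rows_loaded": 120_543},
}

@app.route("/pipelines/<name>/status", methods=["GET"])
def pipeline_status(name: str):
    """Return the last recorded status of a pipeline, or 404 if unknown."""
    run = PIPELINE_RUNS.get(name)
    if run is None:
        return jsonify({"error": f"unknown pipeline '{name}'"}), 404
    return jsonify({"pipeline": name, **run})

@app.route("/pipelines/<name>/status", methods=["POST"])
def update_status(name: str):
    """Record a new status payload for a pipeline."""
    payload = request.get_json(force=True)
    PIPELINE_RUNS[name] = payload
    return jsonify({"pipeline": name, **payload}), 201

if __name__ == "__main__":
    app.run(port=8080, debug=True)
```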
Posted 1 month ago
7 - 11 years
15 - 19 Lacs
Hyderabad
Work from Office
ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. What you will do Role Description: We are seeking a Data Solutions Architect to design, implement, and optimize scalable and high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies. Roles & Responsibilities: Design and implement scalable, modular, and future-proof data architectures that support enterprise data lakes, data warehouses, and real-time analytics. Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains. Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms. Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms. Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities. Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms. Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions. Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency. Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities. Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals. Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability. What we expect of you Must-Have Skills: Experience in data architecture, enterprise data management, and cloud-based analytics solutions. Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks. Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization. Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions. Deep understanding of data governance, security, metadata management, and access control frameworks. 
Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC). Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives. Strong problem-solving, strategic thinking, and technical leadership skills. Experienced with SQL/NoSQL databases and vector databases for large language models. Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with Apache Spark. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Good-to-Have Skills: Deep expertise in the Biotech & Pharma industries. Experience with Data Mesh architectures and federated data governance models. Certification in cloud data platforms or enterprise architecture frameworks. Knowledge of AI/ML pipeline integration within enterprise data architectures. Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications: Doctorate degree with 6-8+ years of experience in Computer Science, IT or related field; OR Master's degree with 8-10+ years of experience in Computer Science, IT or related field; OR Bachelor's degree with 10-12+ years of experience in Computer Science, IT or related field. AWS Certified Data Engineer preferred. Databricks Certificate preferred.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized and detail-oriented. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
3 - 6 years
6 - 9 Lacs
Hyderabad
Work from Office
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: You will play a key role in the implementation and adoption of the data governance framework which will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. This role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. You will leverage domain, technical and business process expertise to provide exceptional support of Amgen's data governance framework. This role involves working closely with business stakeholders and data analysts to ensure implementation and adoption of the data governance framework. You will collaborate with Product Data Owners and data leaders to increase the trust and reuse of data across Amgen.

Roles & Responsibilities: Responsible for the adoption of the data governance framework implementation for a given domain of expertise (Research, Development, Supply Chain, etc.). Responsible for the operationalization of the enterprise data governance framework and aligning the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication and change management. Works with Enterprise MDM and Reference Data to enforce standards and data reusability. Drives cross-functional alignment in his/her domain(s) of expertise to ensure adherence to Data Governance principles. Develops policies for data, metadata, privacy, lineage, access, security, retention, and archival. Maintains documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains. Ensures compliance with data privacy, security, and regulatory policies for the assigned domains. Jointly with Technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric, etc.), defines the specifications shaping the development and implementation of data foundations. Builds strong relationships with key business leads and partners to ensure their needs are met.

Functional Skills:
Must-Have Functional Skills: Technical skills (Advanced SQL, Python, etc.) with knowledge of Pharma processes and specialization in a domain (e.g., Research, Clinical Trials, Commercial, etc.). In-depth experience of working with or supporting systems used for the data governance framework, e.g., Collibra, Alation. In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. In-depth experience with the data product development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy.
Customer focused with excellent written and verbal communication skills who can confidently work with internal Amgen business stakeholders and external service partners on business process and technology topics. Ability to manage scrum teams.

Good-to-Have Functional Skills: Experience of working with data governance councils or forums. Experience with Agile software development methodologies (Scrum).

Soft Skills: Excellent analytical skills. Ability to work effectively with global, virtual teams. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to build business relationships and understand end-to-end data use and needs. Excellent interpersonal skills (team player). People management skills, either in a matrix or direct line function. Strong verbal and written communication skills.

Basic Qualifications: Minimum 9-12 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields OR

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
3 - 7 years
6 - 9 Lacs
Hyderabad
Work from Office
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: You will play a key role in the implementation and adoption of the data governance framework which will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. This role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. This role involves working closely with business stakeholders and data analysts to ensure implementation and adoption of the data governance framework. You will collaborate with Data Product Owners, Data Stewards and technology teams to increase the trust and reuse of data across Amgen.

Roles & Responsibilities: Responsible for the execution of the data governance framework for a given domain of expertise (Research, Development, Supply Chain, etc.). Contribute to the operationalization of the enterprise data governance framework and aligning the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication and change management. Works with Enterprise MDM and Reference Data to enforce standards and data reusability. Contribute to cross-functional alignment in his/her domain(s) of expertise to ensure adherence to Data Governance principles. Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains. Partner with business teams to identify compliance requirements with data privacy, security, and regulatory policies for the assigned domains. Jointly with Technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric, etc.), deliver data foundations. Build strong relationships with key business leads and partners to ensure their needs are met.

Functional Skills:
Must-Have Functional Skills: Technical skills (Advanced SQL, Python, etc.) with knowledge of Pharma processes and specialization in a domain (e.g., Research, Clinical Trials, Commercial, etc.). Experience of working with or supporting systems used for the data governance framework, e.g., Collibra, Alation. General knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. Experience with the data product development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy.
Customer focused with excellent written and verbal communication skills who can confidently work with internal Amgen business stakeholders and external service partners on business process and technology topics. Excellent problem-solving skills and a committed attention to detail in finding solutions.

Good-to-Have Functional Skills: Experience with Agile software development methodologies (Scrum).

Soft Skills: Excellent analytical skills. Ability to work effectively with global, virtual teams. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to build business relationships and understand end-to-end data use and needs. Strong verbal and written communication skills.

Basic Qualifications: Minimum 5-8 years of experience in Business, Engineering, IT or a related field.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
7 - 11 years
15 - 19 Lacs
Hyderabad
Work from Office
ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. What you will do Role Description: We are seeking a Data Solutions Architect with deep R&D expertise in Biotech/Pharma to design, implement, and optimize scalable and high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with R&D and engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies. Roles & Responsibilities: Design and implement scalable, modular, and future-proof data architectures that support R&D initiatives in enterprise. Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains. Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms. Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms. Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities. Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms. Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions. Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency. Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities. Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals. Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability. What we expect of you Must-Have Skills: Experience in data architecture, enterprise data management, and cloud-based analytics solutions. Well versed in R&D domain of Biotech/Pharma industry and has been instrumental in solving complex problems for them using data strategy. Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks. Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization. Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions. 
Deep understanding of data governance, security, metadata management, and access control frameworks. Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC). Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives. Strong problem-solving, strategic thinking, and technical leadership skills. Experienced with SQL/NoSQL databases and vector databases for large language models. Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with Apache Spark and Apache Airflow. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Good-to-Have Skills: Experience with Data Mesh architectures and federated data governance models. Certification in cloud data platforms or enterprise architecture frameworks. Knowledge of AI/ML pipeline integration within enterprise data architectures. Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications: Doctorate degree with 3-5+ years of experience in Computer Science, IT or related field; OR Master's degree with 6-8+ years of experience in Computer Science, IT or related field; OR Bachelor's degree with 8-10+ years of experience in Computer Science, IT or related field. AWS Certified Data Engineer preferred. Databricks Certificate preferred.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized and detail-oriented. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
2 - 5 years
4 - 8 Lacs
Hyderabad
Work from Office
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: You will play a key role in the implementation and adoption of the data governance framework which will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. This role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. This role involves working closely with business stakeholders and data analysts to ensure implementation and adoption of the data governance framework. You will collaborate with Data Stewards to increase the trust and reuse of data across Amgen.

Roles & Responsibilities: Contribute to the data governance and data management framework implementation for a given domain of expertise (Research, Development, Supply Chain, etc.). Assess and document with the stakeholder community their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication and change management. Works with Enterprise MDM and Reference Data to enforce standards and data reusability. Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains. Partner with business teams to identify compliance requirements with data privacy, security, and regulatory policies for the assigned domains. Jointly with Technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric, etc.), deliver data foundations.

Functional Skills:
Must-Have Functional Skills: General knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. General knowledge of the data product development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy. Customer focused with excellent written and verbal communication skills who can confidently work with internal Amgen business stakeholders and external service partners on business process and technology topics. Excellent problem-solving skills and a committed attention to detail in finding solutions.

Good-to-Have Functional Skills: Knowledge of Pharma processes with specialization in a domain (e.g., Research, Clinical Trials, Commercial, etc.). Experience of working with or supporting systems used for data management, e.g., Collibra or the Ataccama Data Quality platform. Experience with Agile software development methodologies (Scrum).

Soft Skills: Good analytical skills. Ability to work effectively with global, virtual teams. Team-oriented, with a focus on achieving team goals. Ability to build business relationships and understand end-to-end data use and needs. Good verbal and written communication skills.

Basic Qualifications: Minimum 2-5 years of experience in Business, Engineering, IT or a related field.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead the team in implementing innovative solutions. Ensure adherence to project timelines and quality standards.

Professional & Technical Skills:
Must-Have Skills: Proficiency in Informatica MDM. Strong understanding of data integration and data quality management. Experience in designing and implementing MDM solutions. Knowledge of data modeling and metadata management. Hands-on experience with Informatica PowerCenter.
Good-To-Have Skills: Experience with Informatica Data Quality.

Additional Information: The candidate should have a minimum of 5 years of experience in Informatica MDM. This position is based at our Bengaluru office. A 15 years full-time education is required.

Qualifications: 15 years full-time education
Posted 1 month ago
2 - 5 years
7 - 11 Lacs
Mumbai
Work from Office
What you’ll do
As a Data Engineer – Data Modeling, you will be responsible for:
Data Modeling & Schema Design
Developing conceptual, logical, and physical data models to support enterprise data requirements.
Designing schema structures for Apache Iceberg tables on Cloudera Data Platform.
Collaborating with ETL developers and data engineers to optimize data models for efficient ingestion and retrieval.
Data Governance & Quality Assurance
Ensuring data accuracy, consistency, and integrity across data models.
Supporting data lineage and metadata management to enhance data traceability.
Implementing naming conventions, data definitions, and standardization in collaboration with governance teams.
ETL & Data Pipeline Support
Assisting in the migration of data from IIAS to Cloudera Data Lake by designing efficient data structures.
Working with Denodo for data virtualization, ensuring optimized data access across multiple sources.
Collaborating with teams using Talend Data Quality (DQ) tools to ensure high-quality data in the models.
Collaboration & Documentation
Working closely with business analysts, architects, and reporting teams to understand data requirements.
Maintaining data dictionaries, entity relationships, and technical documentation for data models.
Supporting data visualization and analytics teams by designing reporting-friendly data models.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
4-7 years of experience in data modeling, database design, and data engineering.
Hands-on experience with ERwin Data Modeler for creating and managing data models.
Strong knowledge of relational databases (PostgreSQL) and big data platforms (Cloudera, Apache Iceberg).
Proficiency in SQL and NoSQL database concepts.
Understanding of data governance, metadata management, and data security principles.
Familiarity with ETL processes and data pipeline optimization.
Strong analytical, problem-solving, and documentation skills.
Preferred technical and professional experience
Experience working on Cloudera migration projects.
Exposure to Denodo for data virtualization and Talend DQ for data quality management.
Knowledge of Kafka, Airflow, and PySpark for data processing.
Familiarity with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance.
Certifications in Data Modeling, Cloudera Data Engineering, or IBM Data Solutions.
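As a rough illustration of the Iceberg schema design work described above, a minimal PySpark sketch follows; it assumes a Spark session already wired to an Iceberg catalog on Cloudera, and the catalog, schema, and column names are invented for the example:

```python
# Minimal sketch, assuming a Spark session already configured with an Iceberg
# catalog named "lake" on Cloudera; catalog, schema, and column names are
# illustrative only and do not come from the posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-schema-sketch").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.finance.loan_accounts (
        account_id     BIGINT        COMMENT 'Surrogate key from the master data hub',
        customer_id    BIGINT        COMMENT 'Reference to the customer dimension',
        product_code   STRING,
        open_date      DATE,
        balance_amount DECIMAL(18,2),
        load_ts        TIMESTAMP     COMMENT 'Ingestion timestamp, useful for lineage'
    )
    USING iceberg
    PARTITIONED BY (months(open_date))
""")
```

Iceberg's hidden partitioning (the months(open_date) transform) is one reason it pairs well with careful physical modeling: the partition layout can evolve later without forcing query rewrites.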
Posted 1 month ago
6 - 10 years
14 - 17 Lacs
Mumbai
Work from Office
A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, and able to design and implement scalable, high-performance data processing solutions.
What you’ll do
As a Data Engineer – Data Platform Services, responsibilities include:
Data Ingestion & Processing
Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake.
Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark).
Working with IBM CDC and Universal Data Mover to manage data replication and movement.
Big Data & Data Lakehouse Management
Implementing Apache Iceberg tables for efficient data storage and retrieval.
Managing distributed data processing with Cloudera Data Platform (CDP).
Ensuring data lineage, cataloging, and governance for compliance with Bank/regulatory policies.
Optimization & Performance Tuning
Optimizing Spark and PySpark jobs for performance and scalability.
Implementing data partitioning, indexing, and caching to enhance query performance.
Monitoring and troubleshooting pipeline failures and performance bottlenecks.
Security & Compliance
Ensuring secure data access, encryption, and masking using Thales CipherTrust.
Implementing role-based access controls (RBAC) and data governance policies.
Supporting metadata management and data quality initiatives.
Collaboration & Automation
Working closely with Data Scientists, Analysts, and DevOps teams to integrate data solutions.
Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus.
Supporting Denodo-based data virtualization for seamless data access.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
6-10 years of experience in big data engineering, data processing, and distributed computing.
Proficiency in Apache Spark, PySpark, Kafka, Iceberg, and Cloudera Data Platform (CDP).
Strong programming skills in Python, Scala, and SQL.
Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM).
Knowledge of data security, encryption, and compliance frameworks.
Experience working with metadata management and data quality solutions.
Preferred technical and professional experience
Experience with data migration projects in the banking/financial sector.
Knowledge of graph databases (DGraph Enterprise) and data virtualization (Denodo).
Exposure to cloud-based data platforms (AWS, Azure, GCP).
Familiarity with MLOps integration for AI-driven data processing.
Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
Architectural review and recommendations on the migration/transformation solutions.
Experience working with Banking Data model.
“Meghdoot” Cloud platform knowledge.
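For illustration, here is a minimal sketch of the kind of streaming ingestion path this role describes, reading a Kafka topic with PySpark Structured Streaming and appending to an Iceberg table; the broker, topic, payload schema, and table names are placeholders, not details of the actual IIAS migration:

```python
# Minimal sketch, assuming the Kafka and Iceberg connectors are available to the
# Spark session; broker, topic, schema, and table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-to-iceberg-sketch").getOrCreate()

payload_schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
    .option("subscribe", "transactions")                 # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), payload_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/transactions")  # placeholder path
    .toTable("lake.payments.transactions")                          # placeholder table
)
query.awaitTermination()
```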
Posted 1 month ago
2 - 5 years
7 - 11 Lacs
Mumbai
Work from Office
Who you are
A highly skilled Data Engineer specializing in Data Modeling, with experience in designing, implementing, and optimizing data structures that support the storage, retrieval and processing of data for large-scale enterprise environments. You have expertise in conceptual, logical, and physical data modeling, along with a deep understanding of ETL processes, data lake architectures, and modern data platforms, and are proficient in ERwin, PostgreSQL, Apache Iceberg, Cloudera Data Platform, and Denodo. Your ability to work with cross-functional teams, data architects, and business stakeholders ensures that data models align with enterprise data strategies and support analytical use cases effectively.
What you’ll do
As a Data Engineer – Data Modeling, you will be responsible for:
Data Modeling & Architecture
Designing and developing conceptual, logical, and physical data models to support data migration from IIAS to Cloudera Data Lake.
Creating and optimizing data models for structured, semi-structured, and unstructured data stored in Apache Iceberg tables on Cloudera.
Establishing data lineage and metadata management for the new data platform.
Implementing Denodo-based data virtualization models to ensure seamless data access across multiple sources.
Data Governance & Quality
Ensuring data integrity, consistency, and compliance with regulatory standards, including Banking/regulatory guidelines.
Implementing Talend Data Quality (DQ) solutions to maintain high data accuracy.
Defining and enforcing naming conventions, data definitions, and business rules for structured and semi-structured data.
ETL & Data Pipeline Optimization
Supporting the migration of ETL workflows from IBM DataStage to PySpark, ensuring models align with the new ingestion framework.
Collaborating with data engineers to define schema evolution strategies for Iceberg tables.
Ensuring performance optimization for large-scale data processing on Cloudera.
Collaboration & Documentation
Working closely with business analysts, architects, and developers to translate business requirements into scalable data models.
Documenting the data dictionary, entity relationships, and mapping specifications for data migration.
Supporting reporting and analytics teams (Qlik Sense/Tableau) by providing well-structured data models.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
Experience in Cloudera migration projects in the banking or financial sector.
Knowledge of PySpark, Kafka, Airflow, and cloud-native data processing.
Experience with Talend DQ for data quality monitoring.
Preferred technical and professional experience
Experience in Cloudera migration projects in the banking or financial sector.
Knowledge of PySpark, Kafka, Airflow, and cloud-native data processing.
Experience with Talend DQ for data quality monitoring.
Familiarity with graph databases (DGraph Enterprise) for data relationships.
Experience with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance.
IBM, Cloudera, or AWS/GCP certifications in Data Engineering or Data Modeling.
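The schema evolution strategies for Iceberg tables mentioned above can be sketched briefly; the table and column names are illustrative, and the Spark session is assumed to be configured for Iceberg:

```python
# Minimal sketch, assuming an existing Iceberg table "lake.risk.exposures" (an
# illustrative name) and a Spark session configured for Iceberg; column names
# and types are assumptions for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-schema-evolution-sketch").getOrCreate()

# Add a new nullable column; existing data files are not rewritten.
spark.sql("ALTER TABLE lake.risk.exposures ADD COLUMNS (collateral_type STRING)")

# Rename a column; Iceberg tracks columns by id, so this is metadata-only.
spark.sql("ALTER TABLE lake.risk.exposures RENAME COLUMN cpty_id TO counterparty_id")

# Widen a decimal's precision, one of the type promotions Iceberg allows in place.
spark.sql("ALTER TABLE lake.risk.exposures ALTER COLUMN exposure_amount TYPE decimal(28,2)")
```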
Posted 1 month ago
5 - 10 years
10 - 20 Lacs
Bengaluru
Work from Office
Analyze business workflows and translate them into data structures. Define data requirements, advise on integration and modeling, ensure metadata accuracy, and support delivery teams with reusable data assets and insights.
Required Candidate profile
Analyst with a strong understanding of data models, business workflows, metadata management, and integration best practices. Work with cross-functional teams to align data with operations and strategic goals.
Posted 1 month ago