Jobs
Interviews

1817 Data Architecture Jobs - Page 32

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Senior Software Engineer, you will be responsible for the entire software life cycle - design, development, test, release, and maintenance - and will translate business needs into working software. The Senior Software Engineer believes in a non-hierarchical culture of collaboration, transparency, safety and trust. We believe that you are focused on value creation, growth and serving customers with full ownership and accountability, delivering exceptional customer and business results.

Key Responsibilities:
- Apply DevOps methodologies, CI/CD pipelines, GitHub Actions, Git workflows and cloud-native development practices.
- Build and set up new development tools and infrastructure.
- Work on ways to automate and improve development and release processes.
- Ensure that systems and architecture are safe and secure against cybersecurity threats.
- Design event-driven architectures and messaging patterns (Solace and Kafka).
- Develop integrations - API-driven, microservices-oriented, and file-based - and handle transformations.

Qualifications:
- B.Tech/B.E/MS from a premier engineering college with 7+ years of total experience in software development.
- Good knowledge of retail and ecommerce.
- Knowledge of integration architecture and data architecture standards, frameworks and practices.
- 6+ years of experience in Azure PaaS - Azure Function App, Azure Logic App.
- Good knowledge of Azure Service Bus (messaging in Azure) and Blob Storage.
- Experience and knowledge of API design (GraphQL).
- Experience with the Azure platform and resources - Service Bus, Data Lake, API Management, App Service, Azure Event Hub, storage accounts, Kubernetes, services, SQL Server, etc.
- Good programming skills in C# (.NET) and knowledge of Java.
- Experience with messaging and working with message brokers such as Solace.
- Experience and knowledge of writing testable, high-quality code using xUnit and the WebJobs framework.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Knowledge of common system integration methods and technologies, including
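The messaging requirements in this listing (Solace, Kafka, event-driven patterns) can be loosely illustrated without any broker at all. The sketch below is a minimal in-memory publish/subscribe bus in Python; the topic name, handlers, and event shape are invented for illustration, and a real broker adds persistence, partitioning, and delivery guarantees that this toy omits.

```python
from collections import defaultdict

class MessageBus:
    """Minimal in-memory publish/subscribe bus illustrating the
    topic-based messaging pattern used with brokers like Solace or
    Kafka. No persistence or delivery guarantees - illustration only."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to every handler registered on the topic.
        for handler in self._subscribers[topic]:
            handler(event)

# Usage: one order event fans out to two independent consumers.
bus = MessageBus()
received = []
bus.subscribe("orders.created", lambda e: received.append(("billing", e)))
bus.subscribe("orders.created", lambda e: received.append(("shipping", e)))
bus.publish("orders.created", {"order_id": 42})
print(received)
```

The key design point the pattern buys you is that publishers never know who consumes an event, so new consumers can be added without touching the producer.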

Posted 1 month ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Tech Permanent Job Description. Be part of something bigger. Decode the future. At Electrolux, as a leading global appliance company, we strive every day to shape living for the better for our consumers, our people and our planet. We share ideas and collaborate so that together, we can develop solutions that deliver enjoyable and sustainable living. Come join us as you are. We believe diverse perspectives make us stronger and more innovative. In our global community of people from 100+ countries, we listen to each other, actively contribute, and grow together.

All about the role: We are looking for an Engineer to help drive our global MarTech strategy forward, with a particular focus on data engineering and data science to design and scale our customer data infrastructure. You will work closely with cross-functional teams - from engineering and product to data and CX teams - ensuring scalable, future-ready solutions that enhance both consumer and business outcomes. Great innovation happens when complexity is tamed and possibilities are unleashed. That's what we firmly believe! Join our team at Electrolux, where we lead Digital Transformation efforts. We specialize in developing centralized solutions to enhance inter-system communications, integrate third-party platforms, and establish ourselves as the Master Data source within Electrolux. Our focus is on delivering high-performance, scalable solutions that consistently achieve top-quality results on a global scale. Currently operating in Europe and North America, we are expanding our footprint to all regions worldwide.

About the CDI Experience Organization: The Consumer Direct Interaction Experience Organization is a Digital Product Organization responsible for delivering tech solutions to our end-users and consumers across both pre-purchase and post-purchase journeys. We are organized in 15+ digital product areas, providing solutions ranging from Contact Center, E-commerce, Marketing, and Identity to AI. You will play a key role in ensuring the right sizing, right skillset, and core competency across these product areas.

What you'll do:
- Design and implement scalable, secure data architectures that support advanced marketing use cases across platforms such as BlueConic (CDP), SAP CDC (Identity & Consent), Iterable (Marketing Automation), Qualtrics (Experience Management), and Dynamic Yield (Personalization).
- Define and govern data pipelines for collecting, processing, and enriching first-party and behavioural data from digital and offline touchpoints.
- Productionize probabilistic and/or machine learning models for audience segmentation, propensity scoring, content recommendations, and predictive analytics.
- Collaborate with Data Engineering and Cloud teams to build out event-driven and batch data flows using technologies such as Azure Data Factory, Databricks, Delta Lake, Azure Synapse, and Kafka.
- Lead the integration of MarTech data with enterprise data warehouses and data lakes, ensuring consistency, accessibility, and compliance.
- Translate business needs into scalable data models and transformation logic that empower marketing, analytics, and CX stakeholders.
- Establish data governance and quality frameworks, including metadata management, lineage tracking, and privacy compliance (GDPR, CCPA).
- Serve as a subject matter expert in both MarTech data architecture and advanced analytics capabilities.

Who you are:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data engineering, analytics platforms, and data science application roles, ideally within a MarTech, CX, or Digital environment.
- Hands-on experience with customer data platforms (CDPs) and integrating marketing data into enterprise ecosystems.
- Solid programming skills in Python and SQL, and experience working with Spark, ML pipelines, and ETL orchestration frameworks.
- Experience integrating marketing technology platforms (e.g., Iterable, Qualtrics, Dynamic Yield) into analytical workflows and consumer intelligence layers.
- Strong understanding of data privacy, consent management, and ethical AI practices.
- Excellent communication skills with the ability to influence and collaborate effectively across diverse teams and stakeholders.
- Experience in Agile development environments and working in distributed/global teams.

Where you'll be: This is a full-time position, based in Bangalore, India.

Benefits highlights:
- Flexible work hours/hybrid work environment
- Discounts on our award-winning Electrolux products and services
- Family-friendly benefits
- Extensive learning opportunities and a flexible career path.
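As a loose illustration of the propensity-scoring work this role describes, here is a minimal pure-Python logistic scorer. The features, weights, and bias are invented toy values; in production the weights would come from a trained model in an ML pipeline (e.g., Spark or Databricks), not be hand-picked.

```python
import math

def propensity_score(features, weights, bias):
    """Score one consumer profile with a logistic model.

    `features` and `weights` are parallel lists of floats; the
    result is a probability-like score in (0, 1).
    """
    z = bias + sum(f * w for f, w in zip(features, weights))
    return 1.0 / (1.0 + math.exp(-z))

# Toy example: two behavioural features (sessions, email clicks)
# with illustrative, hand-picked weights - NOT a trained model.
weights = [0.8, 1.2]
bias = -2.0
score = propensity_score([2.0, 1.0], weights, bias)
print(round(score, 3))  # 0.69
```

The sigmoid keeps the score interpretable as a purchase likelihood, which is what makes it usable downstream for audience segmentation thresholds.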

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

We are seeking a strategic and technically adept Product Owner to lead our Data & Analytics initiatives, with a strong focus on Data Governance and Data Engineering. This role will be pivotal in shaping and executing the data strategy, ensuring high data quality and compliance, and enabling scalable data infrastructure to support business intelligence and advanced analytics.

Profile:
- Data Governance Manager with extensive experience in data governance, quality management, and stakeholder engagement.
- Proven track record in designing and implementing global data standards and governance frameworks at Daimler Trucks.
- Expertise in managing diverse data sources from multiple domains and platforms.
- Skilled in tools such as Alation, Azure Purview, Informatica or similar products to build a marketplace for Data Products.
- Excellent communication skills for managing global CDO stakeholders, policy makers, and data practitioners.
- Certifications in Agile (e.g., CSPO), Data Governance (e.g., DCAM), or Cloud Platforms.
- Experience with data cataloging tools (e.g., Informatica, Collibra, Alation) and data quality platforms.

Key Responsibilities:

Product Ownership & Strategy
- Define and maintain the product vision, roadmap, and backlog for data governance and engineering initiatives.
- Collaborate with stakeholders across business units to gather requirements and translate them into actionable data solutions.
- Prioritize features and enhancements based on business value, technical feasibility, and compliance needs.

Data Governance
- Lead the implementation of data governance frameworks, policies, and standards.
- Ensure data quality, lineage, metadata management, and compliance with regulatory requirements (e.g., GDPR, CCPA).
- Partner with legal, compliance, and security teams to manage data risks and ensure ethical data usage.

Data Engineering
- Oversee the development and maintenance of scalable data pipelines and infrastructure.
- Work closely with data engineers to ensure robust ETL/ELT processes, data warehousing, and integration with analytics platforms.
- Advocate for best practices in data architecture, performance optimization, and cloud-based data solutions.

Stakeholder Engagement
- Act as the primary liaison between technical teams and business stakeholders.
- Facilitate sprint planning, reviews, and retrospectives with Agile teams.
- Communicate progress, risks, and dependencies effectively to leadership and stakeholders.

Qualifications:

Education & Experience
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- 5+ years of experience in data-related roles, with at least 2 years as a Product Owner or in a similar role.
- Proven experience in data governance and data engineering within enterprise environments.

Skills & Competencies
- Strong understanding of data governance principles, data privacy laws, and compliance frameworks.
- Hands-on experience with data engineering tools and platforms (e.g., Apache Spark, Airflow, Snowflake, AWS/GCP/Azure).
- Proficiency in Agile methodologies and product management tools (e.g., Jira, Confluence).
- Excellent communication, leadership, and stakeholder management skills.
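A data-quality framework like the one this role governs can be sketched in miniature as named rules applied to records, with violations routed to a report. The rules, field names, and records below are hypothetical; real platforms (e.g., Informatica or a warehouse-side tool) express such checks declaratively, but the shape is the same.

```python
def run_quality_checks(records, rules):
    """Apply named data-quality rules to a list of dict records and
    return violations as (rule name, record index) pairs."""
    violations = []
    for i, record in enumerate(records):
        for name, predicate in rules.items():
            if not predicate(record):
                violations.append((name, i))
    return violations

# Illustrative rules for a hypothetical customer table.
rules = {
    "email_present": lambda r: bool(r.get("email")),
    "age_in_range": lambda r: 0 <= r.get("age", -1) <= 120,
}
records = [
    {"email": "a@example.com", "age": 34},   # clean
    {"email": "", "age": 200},               # violates both rules
]
print(run_quality_checks(records, rules))
```

Keeping rules as named entries is what lets a governance team track which policy each violation maps back to, which is the point of a framework rather than ad hoc checks.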

Posted 1 month ago

Apply

12.0 - 18.0 years

20 - 30 Lacs

Hyderabad

Work from Office

At Techwave, we constantly work to foster a culture of growth and inclusivity. We ensure that everyone associated with the brand is challenged at every step and provided with all the necessary opportunities to excel in life. People are at the core of everything we do. Join us! https://techwave.net/join-us/

Who are we? Techwave is a leading global IT and engineering services and solutions company revolutionizing digital transformations. We believe in enabling clients to maximize their potential and reach a greater market with a wide array of technology services, including, but not limited to, Enterprise Resource Planning, Application Development, Analytics, Digital, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to enable businesses to accelerate their growth. Plus, we're a team of dreamers and doers who are pushing the boundaries of what's possible. And we want YOU to be a part of it.

Job Title: Data Architect
Experience: 12+ Years
Mode of Hire: Full-time

Key Skills:
- Engage with client executives and business analysts for architecture and requirements discussions
- Design data architecture aligned with business goals using Microsoft Fabric components
- Build modern medallion-based infrastructure for scalable analytics
- Lead execution of data strategy via reusable pipelines and modeling patterns
- Provide technical leadership to engineering teams across Fabric workloads
- Create reusable design templates and frameworks (e.g., pipelines, semantic models)
- Integrate external systems using APIs and connectors
- Enforce data governance, compliance, and security standards using Purview and ACLs
- Evaluate and recommend toolsets across Microsoft Fabric workloads (e.g., Lakehouse vs Warehouse)
- Review cost optimization strategies across Fabric capacity and workspace design
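The medallion architecture named in the skills list can be sketched, at a far smaller scale than a Fabric Lakehouse, as three layers of transforms: bronze holds raw records, silver cleans and deduplicates, gold aggregates for analytics. The field names and rows below are invented for illustration.

```python
# A framework-free sketch of the medallion pattern. In Microsoft
# Fabric the same layering is built with Lakehouse tables and
# pipelines rather than plain Python lists.

def to_silver(bronze_rows):
    """Clean and deduplicate raw rows on the `id` key."""
    seen, silver = set(), []
    for row in bronze_rows:
        if row.get("id") is None or row["id"] in seen:
            continue  # drop malformed rows and duplicates
        seen.add(row["id"])
        silver.append({"id": row["id"], "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Aggregate the cleaned layer into an analytics-ready metric."""
    return {"total_amount": sum(r["amount"] for r in silver_rows)}

bronze = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "10.5"},    # duplicate
    {"id": None, "amount": "3.0"},  # malformed
    {"id": 2, "amount": "4.5"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'total_amount': 15.0}
```

Keeping bronze untouched is the design choice that matters: silver and gold can always be rebuilt from it when cleaning rules change.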

Posted 1 month ago

Apply

12.0 - 17.0 years

25 - 32 Lacs

Pune, Chennai, Bengaluru

Hybrid

If interested, please share your resume with PriyaM4@hexaware.com, along with: Total Exp, CTC, ECTC, NP, Location.

- Should have 12+ years of experience.
- Must be well versed in data warehouse architecture, design patterns, and integration patterns, and must be able to explain clearly how he/she has implemented and enforced them in previous assignments.
- 10+ years of experience developing batch ETL/ELT processes using SQL Server and SSIS, ensuring all related data pipelines meet best-in-class standards and offer high performance.
- 10+ years of experience writing and optimizing SQL queries and stored procedures for data processing and data analysis.
- 10+ years of experience designing and building complete data pipelines, moving and transforming data for ODS, Staging, Data Warehousing and Data Marts using SQL Server Integration Services (SSIS) or other related technologies.
- 10+ years of experience implementing Data Warehouse solutions (Star Schema, Snowflake Schema) for reporting and analytical applications using SQL Server and SSIS, or other related technologies.
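The star-schema loading experience this posting asks for centers on one recurring step: resolving the natural keys on incoming fact rows to surrogate keys from a dimension table (in SSIS this is typically a Lookup transformation). The sketch below shows that step in plain Python; the table contents and column names are invented.

```python
def load_facts(staging_rows, customer_dim):
    """Resolve natural customer keys to surrogate keys; route
    unmatched rows to an error list (e.g., late-arriving dimensions)."""
    by_natural_key = {d["customer_code"]: d["customer_sk"] for d in customer_dim}
    facts, errors = [], []
    for row in staging_rows:
        sk = by_natural_key.get(row["customer_code"])
        if sk is None:
            errors.append(row)  # quarantine instead of failing the load
        else:
            facts.append({"customer_sk": sk, "amount": row["amount"]})
    return facts, errors

dim = [{"customer_sk": 101, "customer_code": "C-1"}]
staging = [{"customer_code": "C-1", "amount": 250.0},
           {"customer_code": "C-9", "amount": 80.0}]
facts, errors = load_facts(staging, dim)
print(facts, errors)
```

Routing misses to an error output rather than aborting mirrors how high-performance ETL keeps one bad row from blocking a batch.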

Posted 1 month ago

Apply

16.0 - 18.0 years

50 - 60 Lacs

Chennai, Gurugram, Bengaluru

Work from Office

Join us as a Data Engineer

We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design, while aspiring to be commercially successful through insights. If you're ready for a new challenge and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level.

What you'll do: Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data by using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for:
- Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions
- Participating in the data engineering community to deliver opportunities to support our strategic direction
- Carrying out complex data engineering tasks to build a scalable data architecture and transform data to make it usable to analysts and data scientists
- Building advanced automation of data engineering pipelines through the removal of manual stages
- Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required

The skills you'll need: To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. We'll expect you to have experience of ETL technical design, data quality testing, cleansing and monitoring, data sourcing, exploration and analysis, and data warehousing and data modelling capabilities.

You'll also need:
- Experience of using a programming language such as Python for developing custom operators and sensors in Airflow, improving workflow capabilities and reliability
- Good knowledge of Kafka and Kinesis for effective real-time data processing, and of Scala and Spark to enhance data processing efficiency and scalability
- Great communication skills with the ability to proactively engage with a range of stakeholders

Hours: 45
Job Posting Closing Date: 14/07/2025
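Developing custom operators in Airflow, as the skills list mentions, follows a subclass-of-BaseOperator pattern: implement `execute(self, context)` and raise to fail the task. The sketch below uses a stand-in base class so it runs without Airflow installed; the row-count check itself, its task id, and the threshold are a hypothetical example.

```python
class BaseOperator:
    """Stand-in for airflow.models.BaseOperator so this sketch is
    self-contained; in a real DAG you would import the real class."""
    def __init__(self, task_id):
        self.task_id = task_id

class RowCountCheckOperator(BaseOperator):
    """Fail the task if a table's row count is below a threshold.
    `fetch_count` stands in for a database hook call."""

    def __init__(self, task_id, fetch_count, min_rows):
        super().__init__(task_id=task_id)
        self.fetch_count = fetch_count
        self.min_rows = min_rows

    def execute(self, context):
        count = self.fetch_count()
        if count < self.min_rows:
            # Raising marks the Airflow task instance as failed.
            raise ValueError(
                f"{self.task_id}: expected >= {self.min_rows} rows, got {count}"
            )
        return count

check = RowCountCheckOperator("check_orders", fetch_count=lambda: 1200, min_rows=100)
print(check.execute(context={}))  # 1200
```

Packaging checks like this as reusable operators is what the listing means by improving workflow capabilities and reliability: the same guard drops into any DAG.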

Posted 1 month ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Requirements:
- Strong understanding of and hands-on experience with Collibra.
- Experience designing and implementing the operating model in DGC and scanning different sources with Collibra Catalog connectors; REST API knowledge.
- Experience in designing, developing and configuring workflows using Eclipse.
- Good experience in Groovy scripting.
- Experience with lineage harvesting in Collibra to track data movement and transformations across systems.
- Good understanding of and experience in developing and implementing Data Governance, Metadata Management, and Data Quality frameworks, policies and processes.
- Excellent communication and interpersonal skills, with the ability to interact effectively with senior stakeholders and cross-functional teams.
- Excellent analytical and problem-solving skills, with the ability to address complex data governance challenges.

Mandatory skill sets: Collibra Developer
Preferred skill sets: Collibra Developer
Years of experience required: 4-7 yrs
Education qualification: B.Tech & MBA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Required Skills: Collibra Data Governance
Other skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis (+ 16 more)

Posted 1 month ago

Apply

7.0 - 10.0 years

7 - 11 Lacs

Pune

Work from Office

Job Requirements

Why work for us: Alkegen brings together two of the world's leading specialty materials companies to create one new, innovation-driven leader focused on battery technologies, filtration media, and specialty insulation and sealing materials. Through global reach and breakthrough inventions, we are delivering products that enable the world to breathe easier, live greener, and go further than ever before. With over 60 manufacturing facilities and a global workforce of over 9,000 of the industry's most experienced talent, including insulation and filtration experts, Alkegen is uniquely positioned to help customers impact the environment in meaningful ways. Alkegen offers a range of dynamic career opportunities with a global reach. From production operators to engineers, technicians to specialists, sales to leadership, we are always looking for top talent ready to bring their best. Come grow with us!

Key Responsibilities:
- Lead and manage the Data Operations team, including BI developers and ETL developers, to deliver high-quality data solutions.
- Oversee the design, development, and maintenance of data models, data transformation processes, and ETL pipelines.
- Collaborate with business stakeholders to understand their data needs and translate them into actionable data insights solutions.
- Ensure the efficient and reliable operation of data pipelines and data integration processes.
- Develop and implement best practices for data management, data quality, and data governance.
- Utilize SQL, Python, and Microsoft SQL Server to perform data analysis, data manipulation, and data transformation tasks.
- Build and deploy data insights solutions using tools such as Power BI, Tableau, and other BI platforms.
- Design, create, and maintain data warehouse environments using Microsoft SQL Server and the data vault design pattern.
- Design, create, and maintain ETL packages using Microsoft SQL Server and SSIS.
- Work closely with cross-functional teams in a matrix organization to ensure alignment with business objectives and priorities.
- Lead and mentor team members, providing guidance and support to help them achieve their professional goals.
- Proactively identify opportunities for process improvements and implement solutions to enhance data operations.
- Communicate effectively with stakeholders at all levels, presenting data insights and recommendations in a clear and compelling manner.
- Implement and manage CI/CD pipelines to automate the testing, integration, and deployment of data solutions.
- Apply Agile methodologies and Scrum practices to ensure efficient and timely delivery of projects.

Skills & Qualifications:
- Master's or Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
- 7 to 10 years of experience in data modelling, data transformation, and building and managing ETL processes.
- Strong proficiency in SQL, Python, and Microsoft SQL Server for data manipulation and analysis.
- Extensive experience in building and deploying data insights solutions using BI tools such as Power BI and Tableau.
- At least 2 years of experience leading BI developers or ETL developers.
- Experience working in a matrix organization and collaborating with cross-functional teams.
- Proficiency in cloud platforms such as Azure, AWS, and GCP.
- Familiarity with data engineering tools such as ADF, Databricks, Power Apps, Power Automate, and SSIS.
- Strong stakeholder management skills with the ability to communicate complex data concepts to non-technical audiences.
- Proactive and results-oriented, with a focus on delivering value aligned with business objectives.
- Knowledge of CI/CD pipelines and experience implementing them for data solutions.
- Experience with Agile methodologies and Scrum practices.
- Relevant certifications in Data Analytics, Data Architecture, Data Warehousing, and ETL are highly desirable.
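The data vault design pattern named above can be illustrated by its hash-key convention: each business key is normalized and hashed to a deterministic key, and hub loads are insert-only for unseen keys. In the sketch below a plain dict stands in for the SQL Server hub table, and the business keys are invented; MD5 is used here only because it is a common data vault convention, not a security choice.

```python
import hashlib

def hash_key(*business_keys):
    """Deterministic hash key from one or more business key parts,
    normalized (trimmed, upper-cased) before hashing."""
    raw = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(raw.encode("utf-8")).hexdigest()

def load_hub(hub, business_keys):
    """Insert new business keys into the hub; return the keys added.
    Existing keys are skipped, so the load is idempotent."""
    added = []
    for bk in business_keys:
        hk = hash_key(bk)
        if hk not in hub:
            hub[hk] = bk
            added.append(bk)
    return added

hub = {}
print(load_hub(hub, ["CUST-1", "CUST-2"]))   # ['CUST-1', 'CUST-2']
print(load_hub(hub, ["cust-1 ", "CUST-3"]))  # ['CUST-3']
```

Normalizing before hashing is what makes "cust-1 " and "CUST-1" resolve to the same hub row across differently-formatted source systems.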
At Alkegen, we strive every day to help people - ALL PEOPLE - breathe easier, live greener and go further than ever before. We believe that diversity and inclusion are central to this mission and to our impact. Our diverse and inclusive culture drives our growth and innovation, and we nurture it by actively embracing our differences and using our varied perspectives to solve the complex challenges facing our changing and diverse world. Employment selection and related decisions are made without regard to sex, race, ethnicity, nation of origin, religion, color, gender identity and expression, age, disability, education, opinions, culture, languages spoken, veteran status, or any other protected class.

Posted 1 month ago

Apply

13.0 - 18.0 years

13 - 17 Lacs

Kochi, Thiruvananthapuram

Work from Office

"> Home / Home / Careers / Careers / Technical Project Ma... Technical Project Manager(Data) Introduction We are looking for 13+years experienced candidates for this role. Responsibilities include: Own the end-to-end delivery of data platform, AI, BI, and analytics projects, ensuring alignment with business objectives and stakeholder expectations. Develop and maintain comprehensive project plans, roadmaps, and timelines for data ingestion, transformation, governance, AI/ML models, and analytics deliverables. Lead cross-functional teams including data engineers, data scientists, BI analysts, architects, and business stakeholders to deliver high-quality, scalable solutions on time and within budget. Define, prioritize, and manage product and project backlogs covering data pipelines, data quality, governance, AI services, and BI dashboards or reporting tools. Collaborate closely with business units to capture and translate requirements into actionable user stories and acceptance criteria for data and analytics solutions. Oversee BI and analytics area including dashboard development, embedded analytics, self-service BI enablement, and ad hoc reporting capabilities. Ensure data quality, lineage, security, and compliance requirements are integrated throughout the project lifecycle, in collaboration with governance and security teams. Coordinate UAT, performance testing, and user training to ensure adoption and successful rollout of data and analytics products. Act as the primary point of contact for . This is to notify jobseekers that some fraudsters are promising jobs with Reflections Info Systems for a fee. Please note that no payment is ever sought for jobs in Reflections. We contact our candidates only through our official website or LinkedIn and all employment related mails are sent through the official HR email id. for any clarification/ alerts on this subject. Apply Now

Posted 1 month ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Pune

Work from Office

Job Title: Senior / Lead Data Engineer
Company: Synechron Technologies
Locations: Pune or Chennai
Experience: 5 to 12 years

Synechron Technologies is seeking an accomplished Senior or Lead Data Engineer with expertise in Java and Big Data technologies. The ideal candidate will have a strong background in Java Spark, with extensive experience working with big data frameworks such as Spark, Hadoop, HBase, Couchbase, and Phoenix. You will lead the design and development of scalable data solutions, ensuring efficient data processing and deployment in a modern technology environment.

Key Responsibilities:
- Lead the development and optimization of large-scale data pipelines using Java and Spark.
- Design, implement, and maintain data infrastructure leveraging Spark, Hadoop, HBase, Couchbase, and Phoenix.
- Collaborate with cross-functional teams to gather requirements and develop robust data solutions.
- Lead deployment automation and management using CI/CD tools including Jenkins, Bitbucket, Git, Docker, and OpenShift.
- Ensure the performance, security, and reliability of data processing systems.
- Provide technical guidance to team members and participate in code reviews.
- Stay updated on emerging technologies and leverage best practices in data engineering.

Qualifications & Skills:
- 5 to 14 years of experience as a Data Engineer or in a similar role.
- Strong expertise in Java programming and Apache Spark.
- Proven experience with Big Data technologies: Spark, Hadoop, HBase, Couchbase, and Phoenix.
- Hands-on experience with CI/CD tools: Jenkins, Bitbucket, Git, Docker, OpenShift.
- Solid understanding of data modeling, ETL workflows, and data architecture.
- Excellent problem-solving, communication, and leadership skills.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT: Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Candidate Application Notice

Posted 1 month ago

Apply

0.0 years

9 - 14 Lacs

Pune

Work from Office

Job Title: Senior Engineer for Data Management, Private Bank
Location: Pune, India

Role Description: Our Data Governance and Architecture team is driving forward data management together with the Divisional Data Office for Private Bank. In close collaboration between business and IT, we assign data roles, manage the documentation of data flows, align data requirements between consumers and producers of data, report data quality, and coordinate Private Bank's data delivery through the group data hub. We support our colleagues in the group Chief Data Office to optimize Deutsche Bank's Data Policy and the associated processes and methods to manage and model data. As part of the team you will be responsible for work streams from project planning to preparing reports to senior management. You combine regulatory compliance with data-driven business benefits for Deutsche Bank.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Establish and maintain the Private Bank contribution to the Deutsche Bank Enterprise Logical and Physical Data models and ensure its usefulness for the Private Bank business
- Understand the requirements of the group functions risk, finance, treasury, and regulatory reporting and cast them into data models in alignment with the producers of the data
- Co-own Private Bank relevant parts of the Deutsche Bank Enterprise Logical and Physical Data models
- Support the Private Bank experts and stakeholders in delivering the relevant data
- Optimize requirements management and modelling processes together with the group Chief Data Office and Private Bank stakeholders
- Align your tasks with the team and the Private Bank Data Council priorities

Your skills and experience:
- In-depth understanding of how data and data quality impact processes across the bank in the retail sector
- Hands-on experience with data modelling in the financial industry
- Extensive experience with data architecture and the challenges of harmonized data provisioning
- Project and stakeholder management capabilities
- Open-minded team player, making different people work together well across the world
- Fluent in English

How we'll support you / About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 month ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Career Category Information Systems Job Description Join Amgen s Mission of Serving Patients At Amgen, if you feel like you re part of something bigger, it s because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we ve helped pioneer the world of biotech in our fight against the world s toughest diseases. With our focus on four therapeutic areas -Oncology, Inflammation, General Medicine, and Rare Disease- we reach millions of patients each year. As a member of the Amgen team, you ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Specialist IS Architect What you will do Let s do this. Let s change the world. In this vital role you will responsible for developing and maintaining the overall IT architecture of the organization. In this role you will be responsible for designing and implementing information system architectures to support business needs. You will analyze requirements, develop architectural designs, evaluate technology solutions, and ensure alignment with industry best practices and standards. You will be working closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles. 
Roles & Responsibilities: Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives for Corporate Functions data architecture. Collaborating closely with business clients and key collaborators to align solutions with strategic objectives. Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities for Corporate Functions data architecture Establish and enforce architectural standards, policies, and governance frameworks Evaluate emerging technologies and assess their potential impact on the solution architecture Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient Maintain comprehensive documentation of the architecture, including principles, standards, and models Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency Work with partners to gather and analyze requirements, ensuring that solutions meet both business and technical needs Ensure seamless integration between systems and platforms, both within the organization and with external partners Design systems that can scale to meet growing business needs and performance demands Deliver high-quality Salesforce solutions using LWC, Apex, Flows and other Salesforce technologies. 
Ensure alignment to established standard methodologies and definitions of done, maintaining high-quality standards in work Create architectural design and data model as per business requirements and Salesforce standard methodologies Proactively identify technical debt and collaborate with the Principal Architect and Product Owner to prioritize and address it effectively Negotiate solutions to complex problems with both the product teams and third-party service providers Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of a single or your current project What we expect of you We are all different, yet we all use our unique contributions to serve patients. Doctorate, Master's, or Bachelor's degree and 8 to 13 years of Computer Science, IT or related field experience Preferred Qualifications: Strong architectural design and modeling skills Proficiency in Salesforce Health Cloud / Service Cloud implementation for a Call Center Solid hands-on experience of implementing Salesforce Configurations, Apex, LWC and integrations Solid understanding of declarative tools like Flows and Process Builder Proficiency in using Salesforce tools such as Data Loader and Salesforce Inspector to query, manipulate and export data Experience in developing differentiated and deliverable solutions Ability to analyze client requirements and translate them into solutions Ability to train and guide junior developers in standard methodologies Familiarity with Agile practices such as user story creation and sprint planning Experience creating proofs of concept (PoCs) to validate new ideas or backlog items. 
Professional Certifications: Salesforce Admin Salesforce Advanced Administrator Salesforce Platform Developer 1 (Mandatory) Salesforce Platform Developer 2 Platform Builder Salesforce Application Architect Salesforce Health Cloud Accredited Professional (Preferred) Soft Skills: Excellent critical-thinking and problem-solving skills Good communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated awareness of presentation skills What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

About the job We are seeking a highly skilled and motivated AI-ML Lead with expertise in Generative AI and LLMs to join our team. As a Generative AI and LLM Expert, you will play a crucial role in developing and implementing cutting-edge generative models and algorithms to solve complex problems and generate high-quality outputs. You will collaborate with a multidisciplinary team of researchers, engineers, and data scientists to explore innovative applications of generative AI across various domains. Responsibilities: Research and Development: Stay up-to-date with the latest advancements in generative AI, including LLMs, GPTs, GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and other related techniques. Conduct research to identify and develop novel generative models and algorithms. Model Development: Design, develop, and optimize generative models to generate realistic and diverse outputs. Implement and fine-tune state-of-the-art generative AI architectures to achieve desired performance metrics. Data Processing and Preparation: Collect, preprocess, and curate large-scale datasets suitable for training generative models. Apply data augmentation techniques and explore strategies to handle complex data types and distributions. Training and Evaluation: Train generative models using appropriate deep learning frameworks and libraries. Evaluate model performance using quantitative and qualitative metrics. Iterate and improve models based on feedback and analysis of results. Collaboration: Collaborate with cross-functional teams, including researchers, engineers, and data scientists, to understand project requirements, define objectives, and identify opportunities to leverage generative AI techniques. Provide technical guidance and support to team members. Innovation and Problem Solving: Identify and tackle challenges related to generative AI, such as mode collapse, training instability, and generating diverse and high-quality outputs. 
Propose innovative solutions and approaches to address these challenges. Documentation and Communication: Document research findings, methodologies, and model architectures. Prepare technical reports, papers, and presentations to communicate results and insights to both technical and non-technical stakeholders. Requirements: Education: A Master's or Ph.D. degree in Computer Science, Artificial Intelligence, or a related field. A strong background in deep learning, generative models, and computer vision is preferred. Experience: Proven experience in designing and implementing generative models using deep learning frameworks (e.g., TensorFlow, PyTorch). Demonstrated expertise in working with GPTs, GANs, VAEs, or other generative AI techniques. Experience with large-scale dataset handling and training deep neural networks is highly desirable. Technical Skills: Proficiency in programming languages such as Python, and familiarity with relevant libraries and tools. Strong mathematical and statistical skills, including linear algebra and probability theory. Experience with cloud computing platforms and GPU acceleration is a plus. Research and Publication: Track record of research contributions in generative AI, demonstrated through publications in top-tier conferences or journals. Active participation in the AI research community, such as attending conferences or workshops, is highly valued. Analytical and Problem-Solving Abilities: Strong analytical thinking and problem-solving skills to tackle complex challenges in generative AI. Ability to think creatively and propose innovative solutions. Attention to detail and the ability to analyze and interpret experimental results. Collaboration and Communication: Excellent teamwork and communication skills to effectively collaborate with cross-functional teams. Ability to explain complex technical concepts to both technical and non-technical stakeholders. Strong written and verbal communication skills. 
Adaptability and Learning: Enthusiasm for staying updated with the latest advancements in AI and generative models. Willingness to learn new techniques and adapt to evolving technologies and methodologies.
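The diversity/quality trade-off this role calls out for generative models is often controlled at sampling time via temperature. A minimal pure-Python sketch (toy logits, not tied to any specific model or framework):

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Convert raw logits to a probability distribution.

    Lower temperature sharpens the distribution (more deterministic,
    higher-confidence outputs); higher temperature flattens it (more
    diverse but noisier outputs).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical next-token scores
sharp = softmax_with_temperature(logits, temperature=0.5)
flat = softmax_with_temperature(logits, temperature=2.0)
# The low-temperature distribution concentrates more mass on the top logit.
assert sharp[0] > flat[0]
```

The same knob appears (under the same name) in most LLM sampling APIs; the point here is only the underlying arithmetic.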

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

We are seeking a skilled and motivated Data Engineer with hands-on experience in Snowflake, Azure Data Factory (ADF), and Fivetran. The ideal candidate will be responsible for building and optimizing data pipelines, ensuring efficient data integration and transformation to support analytics and business intelligence initiatives. Key Responsibilities: Design, develop, and maintain robust data pipelines using Fivetran, ADF, and other ETL tools. Build and manage scalable data models and data warehouses on Snowflake. Integrate data from various sources into Snowflake using automated workflows. Implement data transformation and cleansing processes to ensure data quality and integrity. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements. Monitor pipeline performance, troubleshoot issues, and optimize for efficiency. Maintain documentation related to data architecture, processes, and workflows. Ensure data security and compliance with company policies and industry standards. Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. 3+ years of experience in data engineering or a similar role. Proficiency with Snowflake including architecture, SQL scripting, and performance tuning. Hands-on experience with Azure Data Factory (ADF) for pipeline orchestration and data integration. Experience with Fivetran or similar ELT/ETL automation tools. Strong SQL skills and familiarity with data warehousing best practices. Knowledge of cloud platforms, preferably Microsoft Azure. Familiarity with version control tools (e.g., Git) and CI/CD practices. Excellent communication and problem-solving skills. Preferred Qualifications: Experience with Python, dbt, or other data transformation tools. Understanding of data governance, data quality, and compliance frameworks. Knowledge of additional data tools (e.g., Power BI, Databricks, Kafka) is a plus.
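The "transformation and cleansing processes to ensure data quality" responsibility above usually boils down to validating required fields and deduplicating on a business key before load. A minimal pure-Python sketch of that step (record shape, field names, and the "latest wins" rule are illustrative assumptions, not tied to Snowflake, ADF, or Fivetran specifically):

```python
def cleanse_records(records, required_fields, key):
    """Drop rows missing any required field, then deduplicate on a
    business key, keeping the last occurrence ('latest wins')."""
    valid = [r for r in records
             if all(r.get(f) is not None for f in required_fields)]
    deduped = {}
    for r in valid:
        deduped[r[key]] = r  # later rows overwrite earlier duplicates
    return list(deduped.values())

raw = [
    {"id": 1, "email": "a@x.com", "amount": 10},
    {"id": 1, "email": "a@x.com", "amount": 12},  # duplicate key, newer value
    {"id": 2, "email": None, "amount": 5},        # fails validation
]
clean = cleanse_records(raw, required_fields=["email", "amount"], key="id")
# One surviving row: id 1 with the newer amount of 12.
assert len(clean) == 1 and clean[0]["amount"] == 12
```

In a real pipeline the same semantics would typically be expressed as a `MERGE`/upsert in the warehouse rather than in application code.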

Posted 1 month ago

Apply

3.0 - 8.0 years

55 - 60 Lacs

Bengaluru

Work from Office

Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment, and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them. Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself. Job Summary: At Empower, a Sr. Architect is a mix of a leadership position and a thought-leadership role. A Sr. Architect works with enterprise architects, and both business and IT teams, to align solutions to the technology vision they help create. This role supports enterprise Architects in the development of technology strategies, reference architectures, solutions, best practices and guidance across the entire IT development organization; all the while addressing total cost of ownership, stability, performance and efficiency. The candidate will also be working with the Empower Innovation Lab team as the team is experimenting with emerging technologies, such as Generative AI, and Advanced Analytics. In this fast-paced environment, the person must possess a "can-do" attitude while demonstrating a strong work ethic. This person should have a strong aptitude to help drive decisions. He or she will be actively involved in influencing the strategic direction of technology at Empower Retirement. There will be collaboration across all teams including IT Infrastructure, PMO office, Business, and third-party integrators in reviewing, evaluating, designing and implementing solutions. The Architect must understand available technology options and educate and influence technology teams to leverage them where appropriate. 
The Architect will recognize and propose alternatives, make recommendations, and describe any necessary trade-offs. In some cases, particularly on key initiatives, the Architect will participate on the design and implementation of end-to-end solutions directly with development teams. The ideal candidate will leverage their technical leadership/direction-setting skills with the development organization to be able to prove technical concepts quickly using a variety of tools, methods, & frameworks. Responsibilities: Help Enterprise Architect, work with peer Sr. Architects and more junior resources to define and execute on the business aligned IT strategy and vision. Develop, document, and provide input into the technology roadmap for Empower. Create reference architectures that demonstrate an understanding of technology components and the relationships between them. Design and modernize complex systems into cloud compatible or cloud native applications where applicable. Create strategies and designs for migrating applications to cloud systems. Participate in the evaluation of new applications, technical options, challenge the status quo, create solid business cases, and influence direction while establishing partnerships with key constituencies. Implement best practices, standards & guidance, then subsequently provide coaching of technology team members. Make leadership recommendations regarding strategic architectural considerations related to process design and process orchestration. Provide strong leadership and direction in development/engineering practices. Collaborate with other business and technology teams on architecture and design issues. Respond to evolving and changing security conditions. Implement and recommend security guidelines. Provide thought-leadership, advocacy, articulation, assurance, and maintenance of the enterprise architecture discipline. Provide solution, guidance, and implementation assistance within full stack development teams. 
Recommend long-term scalable and performant architecture changes while keeping cost in control. Preferred Qualifications: 12+ years of experience in the development and delivery of data systems. This experience should be relevant to roles such as Data Analyst, ETL (Extract, Transform and Load) Developer (Data Engineer), Database Administrator (DBA), Business Intelligence Developer (BI Engineer), Machine Learning Developer (ML Engineer), Data Scientist, Data Architect, Data Governance Analyst, or a managerial position overseeing any of these functions. 3+ years of experience creating solution architectures and strategies across multiple architecture domains (business, application, data, integration, infrastructure and security). Solid experience with the following technology disciplines: Python, Cloud architectures, AWS (Amazon Web Services), Big data (300+ TB), Advanced Analytics, Advanced SQL skills, Data Warehouse systems (Redshift or Snowflake), Advanced Programming, NoSQL, Distributed Computing, Real-time streaming Nice to have experience in Java, Kubernetes, Argo, Aurora, Google Analytics, META Analytics, Integration with 3rd party APIs, SOA & microservices design, modern integration methods (API gateway/web services, messaging & RESTful architectures). Familiarity with BI tools such as Tableau/QuickSight. Experience with code coverage tools. Working knowledge of addressing architectural cross-cutting concerns and their tradeoffs, including topics such as caching, monitoring, operational surround, high availability, security, etc. Demonstrates competency applying architecture frameworks and development methods. Understanding of business process analysis and business process management (BPM). Excellent written and verbal communication skills. Experience in mentoring junior team members through code reviews and recommending adherence to best practices. Experience working with global, distributed teams. Interacts with people constantly, demonstrating strong people skills. 
Able to motivate and inspire, influencing and evangelizing a set of ideals within the enterprise. Requires a high degree of independence, proactively achieving objectives without direct supervision. Negotiates effectively at the decision-making table to accomplish goals. Evaluates and solves complex and unique problems with strong problem-solving skills. Thinks broadly, avoiding tunnel vision and considering problems from multiple angles. Possesses a general understanding of the wealth management industry, comprehending how technology impacts the business. Stays on top of the latest technologies and trends through continuous learning, including reading, training, and networking with industry colleagues. Data Architecture - Proficiency in platform design and data architecture, ensuring scalable, efficient, and secure data systems that support business objectives. Data Modeling - Expertise in designing data models that accurately represent business processes and facilitate efficient data retrieval and analysis. Cost Management - Ability to manage costs associated with data storage and processing, optimizing resource usage, and ensuring budget adherence. Disaster Recovery Planning - Planning for data disaster recovery to ensure business continuity and data integrity in case of unexpected events. SQL Optimization/Performance Improvements - Advanced skills in optimizing SQL queries for performance, reducing query execution time, and improving overall system efficiency. CICD - Knowledge of continuous integration and continuous deployment processes, ensuring rapid and reliable delivery of data solutions. Data Encryption - Implementing data encryption techniques to protect sensitive information and ensure data privacy and security. Data Obfuscation/Masking - Techniques for data obfuscation and masking to protect sensitive data while maintaining its usability for testing and analysis. 
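The data obfuscation/masking requirement above is often met with deterministic pseudonymization: equal inputs map to equal tokens, so joins and group-bys still work on masked data, but the original values are not exposed. A minimal stdlib-only sketch (the salt value and token format are illustrative assumptions):

```python
import hashlib

def mask_value(value, salt="static-project-salt"):
    """Deterministically pseudonymize a sensitive field.

    Hashing with a fixed salt preserves referential integrity across
    tables (same input -> same token) while hiding the raw value.
    """
    digest = hashlib.sha256((salt + str(value)).encode("utf-8")).hexdigest()
    return "tok_" + digest[:12]

a = mask_value("jane.doe@example.com")
b = mask_value("jane.doe@example.com")
c = mask_value("john.roe@example.com")
assert a == b          # deterministic: joins on masked keys still line up
assert a != c          # distinct inputs stay distinct
assert "jane" not in a  # raw value does not leak into the token
```

Production masking would add key management and, where reversibility is required, format-preserving encryption instead of hashing; this only illustrates the core idea.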
Reporting - Experience with static and dynamic reporting to provide comprehensive and up-to-date information to business users. Dashboards and Visualizations - Creating dashboards and visualizations to present data in an intuitive and accessible manner, facilitating data-driven insights. Generative AI / Machine Learning - Understanding of generative artificial intelligence and machine learning to develop advanced predictive models and automate decision-making processes. Understanding of machine learning algorithms, deep learning frameworks, and AI model architectures. Understanding of ethical AI principles and practices. Experience implementing AI transparency and explainability techniques. Knowledge of popular RAG frameworks and tools (e.g., LangChain, LlamaIndex). Familiarity with fairness metrics and techniques to mitigate bias in AI models. Sample technologies: Cloud Platforms – AWS (preferred) or Azure or Google Cloud Databases - Oracle, Postgres, MySQL (preferred), RDS, DynamoDB (preferred), Snowflake or Redshift (preferred) Data Engineering (ETL, ELT) - Informatica, Talend, Glue, Python (must), Jupyter Streaming – Kafka or Kinesis CICD Pipeline – Jenkins or GitHub or GitLab or ArgoCD Business Intelligence – Quicksight (preferred), Tableau (preferred), Business Objects, MicroStrategy, Qlik, PowerBI, Looker Advanced Analytics - AWS Sagemaker (preferred), TensorFlow, PyTorch, R, scikit-learn Monitoring tools – DataDog (preferred) or AppDynamics or Splunk Bigdata technologies – Apache Spark (must), EMR (preferred) Container Management technologies – Kubernetes, EKS (preferred), Docker, Helm Preferred Certifications: AWS Solution Architect AWS Data Engineer AWS Machine Learning Engineer EDUCATION: Bachelor’s and/or master’s degree in computer science or related field (information systems, mathematics, software engineering). We are an equal opportunity employer with a commitment to diversity. 
All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to age, race, color, national origin, ancestry, sex, sexual orientation, gender, gender identity, gender expression, marital status, pregnancy, religion, physical or mental disability, military or veteran status, genetic information, or any other status protected by applicable state or local law.

Posted 1 month ago

Apply

10.0 - 20.0 years

30 - 45 Lacs

Hyderabad, Jaipur

Hybrid

Data Architect - AI & Azure - Lead & Coach Teams Job Description: Job Title: Data Architect - AI & Azure - Lead & Coach Teams Shift Timings: 12 PM - 9 PM Location: Jaipur, hybrid Experience required: 10 to 20 years Experience: Total Experience: 5+ years in data architecture and implementation. Pre-Sales Experience: Minimum 1 year in a client-facing pre-sales or technical solutioning role is mandatory. Must-Have Skills & Qualifications: Technical Expertise: In-depth knowledge of the Microsoft Azure data platform (Azure Synapse Analytics, Azure Data Factory, Azure SQL, Azure Data Lake Storage). Modern Data Platforms: Hands-on experience with Databricks and/or Snowflake. AI Acumen: Strong understanding of AI workflows and data requirements. Must have a solid grasp of Gen AI applications and concepts. Leadership: Experience in mentoring, coaching, or leading technical teams or project initiation phases. Solutioning: Proven ability to create high-quality technical proposals, respond to RFPs, and design end-to-end data solutions. Communication: Exceptional English communication and presentation skills are essential for this client-facing role. If interested Please share your resume on shivam.gaurav@programmers.io

Posted 1 month ago

Apply

2.0 - 5.0 years

12 - 16 Lacs

Pune

Work from Office

Overview We are looking for a Senior Data Engineer with deep hands-on expertise in PySpark, Databricks, and distributed data architecture. This individual will play a lead role in designing, developing, and optimizing data pipelines critical to our Ratings Modernization, Corrections, and Regulatory implementation programs under PDB 2.0. The ideal candidate will thrive in fast-paced, ambiguous environments and collaborate closely with engineering, product, and governance teams. Responsibilities Design, develop, and maintain robust ETL/ELT pipelines using PySpark and Databricks. Own pipeline architecture and drive performance improvements through partitioning, indexing, and Spark optimization. Collaborate with product owners, analysts, and other engineers to gather requirements and resolve complex data issues. Perform deep analysis and optimization of SQL queries, functions, and procedures for performance and scalability. Ensure high standards of data quality and reliability via robust validation and cleansing processes. Lead efforts in Delta Lake and cloud data warehouse architecture, including best practices for data lineage and schema management. Troubleshoot and resolve production incidents and pipeline failures quickly and thoroughly. Mentor junior team members and guide best practices across the team. Qualifications Bachelor's degree in Computer Science, Engineering, or a related technical field. 6+ years of experience in data engineering or related roles. Advanced proficiency in Python, PySpark, and SQL. Strong experience with Databricks, BigQuery, and modern data lakehouse design. Hands-on knowledge of Azure or GCP data services. Proven experience in performance tuning and large-scale data processing. 
Strong communication skills and the ability to work independently in uncertain or evolving contexts What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. 
We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 1 month ago

Apply

6.0 - 11.0 years

17 - 20 Lacs

Mumbai

Work from Office

We are looking for a highly skilled and experienced professional with 6 to 11 years of experience to join our team as a Manager - Business Transformation in Mumbai. Roles and Responsibilities Develop and implement automation processes to enhance efficiency and productivity. Create MIS dashboards using Tableau for business insights and decision-making. Design analytical models, including scorecards, based on business requirements. Conduct deviation analytics to identify areas for improvement. Collaborate with cross-functional teams to drive business transformation initiatives. Analyze data to provide actionable recommendations to stakeholders. Graduate with a strong understanding of MIS and visualization tools. Proven experience in process automation and data analysis. Strong knowledge of Tableau and other data visualization tools. Excellent analytical and problem-solving skills. Ability to work collaboratively with cross-functional teams. Strong communication and interpersonal skills.

Posted 1 month ago

Apply

10.0 - 15.0 years

25 - 40 Lacs

Noida, Pune, Delhi / NCR

Hybrid

Role & responsibilities As an Architect, you will work to solve some of the most complex and captivating data management problems that enable clients to operate as a data-driven organization; Seamlessly switch between roles of an Individual Contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might Engage the clients & understand the business requirements to translate those into data models. Analyze customer problems, propose solutions from a data structural perspective, and estimate and deliver proposed solutions. Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights. Use the Data Modelling tool to create appropriate data models Create and maintain the Source to Target Data Mapping document that includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc. Gather and publish Data Dictionaries. Ideate, design, and guide the teams in building automations and accelerators Assist in maintaining data models as well as capturing data models from existing databases and recording descriptive information. Contribute to building data warehouse & data marts (on Cloud) while performing data profiling and quality analysis. Use version control to maintain versions of data models. Collaborate with Data Engineers to design and develop data extraction and integration code modules. Partner with the data engineers & testing practitioners to strategize ingestion logic, consumption patterns & testing. Ideate to design & develop the next-gen data platform by collaborating with cross-functional stakeholders. 
Work with the client to define, establish and implement the right modelling approach as per the requirement Help define the standards and best practices Assist in monitoring project progress to keep the leadership teams informed on the milestones, impediments, etc. Coach team members and review code artifacts. Contribute to proposals and RFPs Preferred candidate profile 10+ years of experience in Data space. Decent SQL knowledge Able to suggest modeling approaches for a given problem. Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server) Real-time experience working in OLAP & OLTP database models (Dimensional models). Comprehensive understanding of Star schema, Snowflake schema, and Data Vault Modelling. Also, on any ETL tool, Data Governance, and Data quality. Eye to analyze data & comfortable with following agile methodology. Adept understanding of any of the cloud services is preferred (Azure, AWS & GCP) Enthusiasm for coaching team members & collaborating with various stakeholders across the organization and taking complete ownership of deliverables. Experience in contributing to proposals and RFPs Good experience in stakeholder management Decent communication and experience in leading the team You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities so even if you don’t tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
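The Source to Target Data Mapping and Data Dictionary duties above amount to keeping a machine-checkable record of where each target column comes from. A minimal pure-Python sketch of such a mapping plus a completeness check (table, column, and rule names are hypothetical):

```python
# Minimal source-to-target mapping: each target column records its
# source table.column and a transformation rule, mirroring an STM document.
mapping = {
    "customer_key": {"source": "crm.customers.id",      "rule": "surrogate key"},
    "full_name":    {"source": "crm.customers.name",    "rule": "trim + title case"},
    "signup_date":  {"source": "crm.customers.created", "rule": "cast to DATE"},
}

def undocumented_columns(target_columns, mapping):
    """Return target columns with no documented source -- a quick
    completeness check before publishing a data dictionary."""
    return sorted(set(target_columns) - set(mapping))

gaps = undocumented_columns(
    ["customer_key", "full_name", "signup_date", "region"], mapping)
# 'region' has no documented lineage and would fail review.
assert gaps == ["region"]
```

In practice the mapping would live in a modeling or catalog tool rather than code, but the same check (every target column has a documented source and rule) applies.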

Posted 1 month ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities

As an Architect, you will solve some of the most complex and captivating data management problems to enable clients to become data-driven organizations, seamlessly switching between the roles of individual contributor, team member, and Data Modeling Architect as each project demands to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage clients to understand business requirements and translate them into data models.
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
- Create and maintain Logical Data Models (LDM) and Physical Data Models (PDM), applying best practices to provide business insights.
- Use a data modelling tool to create appropriate data models.
- Create and maintain source-to-target data mapping documents covering all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide teams in building automations and accelerators.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
- Collaborate with cross-functional stakeholders to design and develop the next-gen data platform.
- Work with clients to define, establish, and implement the right modelling approach for each requirement.
- Help define standards and best practices.
- Monitor project progress and keep leadership informed of milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

Preferred candidate profile:
- 10+ years of experience in the data space.
- Solid SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Hands-on experience with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as of any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfort with agile methodology.
- Working knowledge of at least one cloud platform (Azure, AWS, or GCP) preferred.
- Enthusiasm for coaching team members, collaborating with stakeholders across the organization, and taking complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Strong communication skills and experience leading a team.

You are important to us, let's stay connected! Every individual brings a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 1 month ago

Apply

10.0 - 17.0 years

25 - 40 Lacs

Chennai

Work from Office

Extensive experience in big data architecture, with a focus on cloud-native and/or cloud-based services and solutions, and on data processing technologies such as Hadoop, Spark, and Kafka within the cloud ecosystem (AWS, Azure, and GCP).
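As a toy illustration of the map/shuffle/reduce pattern that Hadoop and Spark scale out across a cluster, here is a word count in plain Python. The input documents and variable names are invented for the example:

```python
from collections import defaultdict
from itertools import chain

docs = ["big data on the cloud", "data pipelines in the cloud"]

# Map phase: emit (word, 1) pairs per document
# (on a real cluster this runs in parallel across workers).
mapped = chain.from_iterable(((w, 1) for w in d.split()) for d in docs)

# Shuffle + reduce phase: group pairs by key and sum the counts.
counts = defaultdict(int)
for word, n in mapped:
    counts[word] += n

print(counts["cloud"], counts["data"])  # 2 2
```

Spark's `rdd.flatMap(...).reduceByKey(...)` expresses the same two phases; the framework's contribution is distributing them and handling failures.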

Posted 1 month ago

Apply

15.0 - 20.0 years

13 - 17 Lacs

Noida

Work from Office

We are looking for a skilled Senior Data Architect with 15 to 20 years of experience to lead our data warehousing function, setting the vision and direction for driving actionable insights across revenue, subscriptions, paid marketing channels, and operational functions. This role is based remotely.

Roles and responsibilities:
- Define and execute the long-term strategy for our data warehousing platform using medallion architecture and modern cloud-based solutions.
- Oversee end-to-end pipeline design, implementation, and maintenance for seamless integration with business intelligence tools.
- Champion best practices in data modeling, including the effective use of DBT packages to streamline complex transformations.
- Establish rigorous data quality standards, governance policies, and automated validation frameworks across all data streams.
- Develop frameworks to reconcile revenue discrepancies and unify validation across Finance, SEM, and Analytics teams.
- Implement robust monitoring and alerting systems to quickly identify, diagnose, and resolve data pipeline issues.
- Lead, mentor, and grow a high-performing team of data warehousing specialists, fostering a culture of accountability, innovation, and continuous improvement.
- Partner with RevOps, Analytics, SEM, Finance, and Product teams to align the data infrastructure with business objectives, serving as the primary data warehouse expert in discussions around revenue attribution and paid marketing channel performance.
- Translate complex technical concepts into clear business insights for both technical and non-technical stakeholders.
- Oversee deployment processes, including staging, QA, and rollback strategies, to ensure minimal disruption during updates.
- Regularly assess and optimize data pipelines for performance, scalability, and reliability while reducing operational overhead.
- Lead initiatives to transition from legacy on-premise systems to modern cloud-based architectures for improved agility and cost efficiency.
- Stay abreast of emerging trends and technologies in data warehousing, analytics, and cloud solutions.
- Propose and lead innovative projects to enhance our data capabilities, with a particular focus on predictive and prescriptive analytics.
- Represent the data warehousing function in senior leadership discussions and strategic planning sessions.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- Proven track record in designing and implementing scalable data warehousing solutions in cloud environments.
- Deep experience with medallion architecture and modern data pipeline tools, including DBT (and DBT packages), Databricks, SQL, and cloud-based data platforms.
- Strong understanding of ETL/ELT best practices, data modeling (logical and physical), and large-scale data processing.
- Hands-on experience with BI tools (e.g., Tableau, Looker) and familiarity with Google Analytics and other tracking systems.
- Solid understanding of attribution models (first-touch, last-touch, multi-touch) and experience working with paid marketing channels.
- Excellent leadership and team management skills, with the ability to mentor and inspire cross-functional teams.
- Outstanding communication skills, capable of distilling complex technical information into clear business insights.
- Demonstrated ability to lead strategic initiatives, manage competing priorities, and deliver results in a fast-paced environment.
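The medallion architecture mentioned above refines data through bronze (raw), silver (validated), and gold (business-level) layers. A minimal sketch in plain Python, with in-memory lists standing in for cloud tables and all field names invented for the example:

```python
# Bronze layer: raw landing zone, keeps everything, even malformed records.
bronze = [
    {"order_id": "1", "amount": "100.0", "channel": "sem"},
    {"order_id": "2", "amount": "bad",   "channel": "sem"},
    {"order_id": "3", "amount": "50.0",  "channel": "organic"},
]

def to_silver(rows):
    """Silver layer: validated, typed records; invalid rows are dropped here
    (a real pipeline would quarantine them for inspection)."""
    out = []
    for r in rows:
        try:
            out.append({**r, "amount": float(r["amount"])})
        except ValueError:
            pass
    return out

def to_gold(rows):
    """Gold layer: a business-ready aggregate, e.g. revenue per channel."""
    agg = {}
    for r in rows:
        agg[r["channel"]] = agg.get(r["channel"], 0.0) + r["amount"]
    return agg

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'sem': 100.0, 'organic': 50.0}
```

In a DBT/Databricks stack, each layer would be a materialized model or Delta table, with the validation and aggregation expressed as SQL transformations rather than Python functions.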

Posted 1 month ago

Apply

10.0 - 17.0 years

35 - 60 Lacs

Noida, Gurugram, Bengaluru

Hybrid

This is an individual contributor role. Candidates from a Product/Life Sciences/Pharma/Consulting background only.

POSITION: Data Architect
LOCATION: NCR/Bangalore/Gurugram
PRODUCT: Axtria DataMAx is a global cloud-based data management product specifically designed for the life sciences industry. It facilitates the rapid integration of both structured and unstructured data sources, enabling accelerated and actionable business insights from trusted data. This product is particularly useful for pharmaceutical companies looking to streamline their data processes and enhance decision-making capabilities.

JOB OBJECTIVE: To leverage expertise in data architecture and management to design, implement, and optimize a robust data warehousing platform for the pharmaceutical industry. The goal is to ensure seamless integration of diverse data sources, maintain high standards of data quality and governance, and enable advanced analytics through the definition and management of semantic and common data layers. Utilizing Axtria DataMAx and generative AI technologies, the aim is to accelerate business insights and support regulatory compliance, ultimately enhancing decision-making and operational efficiency.

Key Responsibilities:
- Data Modeling: Design logical and physical data models to ensure efficient data storage and retrieval.
- ETL Processes: Develop and optimize ETL processes to accurately and efficiently move data from various sources into the data warehouse.
- Infrastructure Design: Plan and implement the technical infrastructure, including hardware, software, and network components.
- Data Governance: Ensure compliance with regulatory standards and implement data governance policies to maintain data quality and security.
- Performance Optimization: Continuously monitor and improve the performance of the data warehouse to handle large volumes of data and complex queries.
- Semantic Layer Definition: Define and manage the semantic layer architecture and technology stack, managing the lifecycle of semantic constructs including their consumption by downstream systems.
- Common Data Layer Management: Integrate data from multiple sources into a centralized repository, ensuring consistency and accessibility. Deep expertise in architecting enterprise-grade software systems that are performant, scalable, resilient, and manageable; architecting GenAI-based systems is an added plus.
- Advanced Analytics: Enable advanced analytics and machine learning to identify patterns in genomic data, optimize clinical trials, and personalize medication.
- Generative AI: Should have worked with a production-ready GenAI-based data use case.
- Stakeholder Engagement: Work closely with business stakeholders to understand their data needs and translate them into technical solutions.
- Cross-Functional Collaboration: Collaborate with IT, data scientists, and business analysts to ensure the data warehouse supports various analytical and operational needs.
- Data Modeling: Strong expertise in data modelling, with the ability to design complex data models from the ground up and clearly articulate the rationale behind design choices.
- ETL Processes: Must have worked with different loading strategies for facts and dimensions, such as SCD, full load, incremental load, upsert, append-only, rolling window, etc.
- Cloud Warehouse Skills: Expertise in leading cloud data warehouse platforms (Snowflake, Databricks, and Amazon Redshift), with a deep understanding of their architectural nuances, strengths, and limitations, enabling the design and deployment of scalable, high-performance data solutions aligned with business objectives.

Qualifications:
- Proven experience in data architecture and data warehousing, preferably in the pharmaceutical industry.
- Strong knowledge of data modeling, ETL processes, and infrastructure design.
- Experience with data governance and regulatory compliance in the life sciences sector.
- Proficiency in using Axtria DataMAx or similar data management products.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration skills.

Preferred Skills:
- Familiarity with advanced analytics and machine learning techniques.
- Experience in managing semantic and common data layers.
- Knowledge of FDA guidelines, HIPAA regulations, and other relevant regulatory standards.
- Experience with generative AI technologies and their application in data warehousing.
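Among the dimension-loading strategies listed above, SCD Type 2 is the one that preserves history: an attribute change expires the current row and appends a new version. A hedged sketch in plain Python (real implementations run as warehouse MERGE statements; all record fields here are invented):

```python
from datetime import date

# Current dimension rows; end_date of None marks the active version.
dim = [
    {"customer_id": 1, "city": "Noida",
     "start_date": date(2023, 1, 1), "end_date": None},
]

def scd2_merge(dim, updates, as_of):
    """Apply SCD Type 2 semantics: expire changed rows, append new versions."""
    for upd in updates:
        current = next((r for r in dim
                        if r["customer_id"] == upd["customer_id"]
                        and r["end_date"] is None), None)
        if current is None:
            # Brand-new member: insert as the first active version.
            dim.append({**upd, "start_date": as_of, "end_date": None})
        elif current["city"] != upd["city"]:
            current["end_date"] = as_of  # expire the old version
            dim.append({**upd, "start_date": as_of, "end_date": None})
    return dim

scd2_merge(dim, [{"customer_id": 1, "city": "Gurugram"}], date(2024, 6, 1))
active = [r for r in dim if r["end_date"] is None]
print(len(dim), active[0]["city"])  # 2 Gurugram
```

A full load would simply truncate and rewrite the table, while an incremental upsert (SCD Type 1) would overwrite the city in place and lose the Noida history.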

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. (Employment Firms/Recruitment Services Firms); experience: 5 to 10 years.

Job Title: SQL + ADF
Job Location: Gurgaon
Job Type: Full Time

Strong experience in SQL development, along with experience in the AWS cloud and good experience in ADF.

Job Summary: We are looking for a skilled SQL + Azure Data Factory (ADF) Developer to join our data engineering team. The ideal candidate will have strong experience in writing complex SQL queries, developing ETL pipelines using Azure Data Factory, and integrating data from multiple sources into cloud-based data solutions. This role will support data warehousing, analytics, and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and maintain data integration pipelines using Azure Data Factory (ADF).
- Write optimized and complex SQL queries, stored procedures, and functions for data transformation and reporting.
- Extract data from various structured and unstructured sources and load it into Azure-based data platforms (e.g., Azure SQL Database, Azure Data Lake).
- Schedule and monitor ADF pipelines, ensuring data quality, accuracy, and availability.
- Collaborate with data analysts, data architects, and business stakeholders to gather requirements and deliver solutions.
- Troubleshoot data issues and implement corrective actions to resolve pipeline or data quality problems.
- Implement and maintain data lineage, metadata, and documentation for pipelines.
- Participate in code reviews, performance tuning, and optimization of ETL processes.
- Ensure compliance with data governance, privacy, and security standards.

Requirements:
- Hands-on experience with T-SQL / SQL Server.
- Experience working with Azure Data Factory (ADF) and Azure SQL.
- Strong understanding of ETL processes, data warehousing concepts, and cloud data architecture.
- Experience working with Azure services such as Azure Data Lake, Blob Storage, and Azure Synapse Analytics (preferred).
- Familiarity with Git/DevOps CI/CD pipelines for ADF deployments is a plus.
- Excellent problem-solving, analytical, and communication skills.

Title: SQL + ADF (ref: 6566294)
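A common pattern behind the ADF pipelines this role describes is the watermark-based incremental load: each run copies only rows modified since the last recorded watermark, then advances it. A sketch using an in-memory SQLite source (table, column, and variable names are invented for the example; in ADF the watermark is typically persisted in a control table and read by a Lookup activity):

```python
import sqlite3

# Source system with a last-modified column, as an incremental copy expects.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, modified_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-02-01"), (3, "2024-03-01")])

last_watermark = "2024-01-15"  # persisted by the pipeline between runs

# Copy step: pull only rows newer than the previous watermark.
new_rows = src.execute(
    "SELECT id, modified_at FROM orders WHERE modified_at > ? ORDER BY id",
    (last_watermark,),
).fetchall()

# After loading, advance the watermark to the max value just copied.
last_watermark = max(m for _, m in new_rows)
print(len(new_rows), last_watermark)  # 2 2024-03-01
```

ISO-8601 date strings compare correctly as text, which is why the string comparison in the WHERE clause works here; with other formats a real column type is required.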

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 15 Lacs

Kochi

Remote

We are seeking a highly skilled ETL/Data Engineer with expertise in Informatica DEI BDM to design and implement robust data pipelines handling medium- to large-scale datasets. The role involves building efficient ETL frameworks that support batch processing.

Posted 1 month ago

Apply


