8.0 - 12.0 years
10 - 20 Lacs
Chennai
Hybrid
Senior Data Architect GCP | 8 - 12 Yrs | Chennai
Location: Chennai | Notice Period: Immediate / Serving Notice / 30 Days Max | Experience: 8 - 12 Years | Employment Type: Full Time
About the Role: Join our dynamic Materials Management Platform (MMP) team. This platform is redefining how we plan and manage inventory across Product Development, Manufacturing, Finance, Purchasing, and N-Tier Supply Chain systems. We are looking for a highly experienced Data Architect who excels in designing and deploying data-centric architectures on GCP. You'll work across modern and legacy ecosystems, building highly scalable, secure, and efficient data solutions that power real-time and batch operations.
Must-Have Skills (Top Priority): Core GCP (Google Cloud Platform) expertise; strong foundations in large-scale data architecture and engineering; BigQuery, GCP Pub/Sub, Airflow; Java / Python / Spark / SQL / Scala; streaming and batch pipelines; microservices and REST APIs; DevOps tools: Terraform, GitHub Actions, Tekton, Docker; RDBMS: MySQL, PostgreSQL, SQL Server.
Good to Have: Cloud Solution Architecture certification; automotive domain experience; onshore-offshore collaboration experience; Agile methodology exposure (JIRA).
Education & Background: Bachelor's or equivalent in Computer Science / IT / Engineering; 8+ years in data engineering / cloud software development; proven experience launching data products at scale. #GCP
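The stack named here (BigQuery, Pub/Sub, Airflow) is typically stitched together with orchestrated jobs. As an illustrative sketch only, assuming Airflow 2.x with the Google provider installed, and with project, dataset, and table names invented rather than taken from this posting, a minimal DAG that runs a scheduled BigQuery aggregation could look like this:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily inventory rollup; all identifiers below are placeholders.
with DAG(
    dag_id="inventory_daily_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="aggregate_inventory",
        configuration={
            "query": {
                "query": """
                    SELECT plant_id, part_id, SUM(quantity) AS on_hand_qty
                    FROM `example-project.supply_chain.inventory_events`
                    GROUP BY plant_id, part_id
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "supply_chain",
                    "tableId": "inventory_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

In practice the streaming side (Pub/Sub into BigQuery) would run separately, with a DAG like this handling the batch aggregations on top.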
Posted 2 months ago
3.0 - 8.0 years
12 - 20 Lacs
Noida, Gurugram, Mumbai (All Areas)
Work from Office
3+ years of experience in data engineering or backend development, with a focus on highly scalable data systems. Experience at a B2B SaaS or AI company, ideally in a high-growth or startup environment, designing and scaling cloud-based data platforms (AWS, GCP, Azure).
Posted 2 months ago
12.0 - 16.0 years
19 - 34 Lacs
Hyderabad
Hybrid
Travel Requirement: Must travel to the EA Hyderabad office at least once a month.
Key Responsibilities: Lead the end-to-end design and architecture of Salesforce solutions. Collaborate with business stakeholders to convert requirements into scalable, effective solutions. Provide guidance to development teams on best practices in Apex, LWC, Aura, and system integrations. Ensure alignment with Salesforce security protocols and data governance policies. Manage deployment processes using tools like ANT, Change Sets, and CI/CD pipelines.
Required Experience & Skills: Minimum of 8 years' experience in Salesforce, with strong proficiency in Apex, Visualforce, Lightning Web Components (LWC), and Aura. Solid understanding of Salesforce architecture across Sales Cloud, Service Cloud, and Experience Cloud. Expertise in integration technologies such as REST, SOAP APIs, and WSDL. Hands-on experience with development tools like VS Code, Data Loader, Workbench, and version control using Git. Familiarity with Agile practices and tools like Jira and Agile Accelerator. Possession of multiple Salesforce certifications, such as Application Architect or System Architect, is highly preferred.
Essential Skills: Lightning Web Components (LWC); Apex, Aura, Visualforce; Salesforce integration (REST, SOAP, Streaming APIs, Events); data architecture and migration; business process mapping and design; integration and estimation; architecture strategy and roadmap planning.
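For the REST integration skills listed above, a small hedged example of calling the Salesforce REST API from Python is shown below; the instance URL, API version, and access token are placeholders and would normally come from an OAuth flow rather than being hard-coded:

```python
import requests

INSTANCE_URL = "https://example.my.salesforce.com"   # placeholder org URL
ACCESS_TOKEN = "<oauth-access-token>"                # placeholder token from OAuth

def soql_query(soql: str) -> list[dict]:
    """Run a SOQL query through the Salesforce REST API and return the records."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v58.0/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]

# Example: pull open cases for a Service Cloud reporting feed.
open_cases = soql_query("SELECT Id, Subject, Status FROM Case WHERE IsClosed = false")
print(len(open_cases), "open cases")
```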
Posted 2 months ago
7.0 - 12.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Senior Azure Databricks Engineer
We are looking for a Senior Azure Databricks Engineer to support and maintain our internal BI platform, used by our Finance and Business Operations teams. This is a hands-on technical role focused on backend data operations, including data ingestion, transformation, and CI/CD support within a cloud-based data warehouse environment.
Key Responsibilities: Ensure stable operation of the internal BI platform used by Finance and Business Operations. Develop, maintain, and troubleshoot data pipelines for ingestion, transformation, and load using Azure Databricks (PySpark, SQL). Support and optimize CI/CD pipelines (Azure DevOps) for smooth deployments and minimal downtime. Collaborate with BI front-end analysts, IT teams, and business stakeholders to ensure alignment of data needs and delivery. Monitor and improve system performance, resolve incidents, and ensure data quality and consistency. Maintain data architecture standards and support platform scalability and compliance. Integrate data from systems like D365 Finance & Operations and other business applications. Work with Azure services such as Data Lake, Key Vaults, Service Principals, and SQL Database. Maintain proper documentation of processes, configurations, and procedures. Participate in improvement initiatives to enhance platform efficiency and usability.
What you need to succeed: 7+ years of experience with business data analytics platforms. Strong hands-on experience with Azure Databricks, PySpark, and Spark SQL or SQL. Solid understanding of CI/CD pipelines (preferably with Azure DevOps) and troubleshooting deployment issues. Proficiency in Python and working knowledge of shell scripting. Experience with data ingestion, ETL processes, and managing large-scale data pipelines. Experience with Azure services such as Azure Key Vaults, Azure SQL, Azure Data Lake, and Service Principals. Understanding of data governance, security standards, and handling sensitive data. Ability to work closely with both IT and finance/business stakeholders. Good knowledge of data integration from sources like D365 FO, Unit4, and the Azure Portal. Strong analytical and problem-solving skills, and excellent collaboration and communication skills.
Department: CFO | Remote status: Hybrid | Employment type: Full-time | Employment level: First/Mid-Level Officials | Application deadline: 30 June 2025 | Location: Bengaluru
OUR POWER IS CURIOSITY, CREATION AND INNOVATION. We believe you love to experiment, challenge the established, co-create, develop and cultivate. Together we can explore new answers to today's challenges and future opportunities, and talk about how industrial digitalisation can be a part of the solution for a better tomorrow. We believe that different perspectives are crucial for developing game-changing technology for a better tomorrow. Join us in taking on this challenge!
About Kongsberg Digital: Kongsberg Digital is a provider of next-generation software and digital solutions to customers within maritime, oil & gas, and utilities. Together with the rest of KONGSBERG, Kongsberg Digital offers solutions within autonomy, smart data, augmented reality and other areas. Join Kongsberg Digital as we pursue our mission to digitalize the world's industries for a better tomorrow. We truly believe that technology will drive more efficient and sustainable operations, making the oil sector more energy efficient, ships less polluting and green energy future proof.
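As a hedged illustration of the ingestion-and-transformation work this role describes, here is a minimal PySpark sketch for Databricks; the storage path, column names, and target table are invented placeholders, not details from this posting:

```python
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` already exists; getOrCreate() keeps the script self-contained.
spark = SparkSession.builder.getOrCreate()

# Ingest raw journal lines landed in the data lake (placeholder ADLS path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/d365/general_ledger/")

# Transform: typing, deduplication, and a basic quality filter.
clean = (
    raw.withColumn("posting_date", F.to_date("posting_date"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["journal_id", "line_id"])
       .filter(F.col("amount").isNotNull())
)

# Load into a Delta table that the BI front end reads from.
(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("finance.general_ledger_clean"))
```

A job like this would normally be promoted through the Azure DevOps CI/CD pipelines the posting mentions rather than run ad hoc.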
Posted 2 months ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. A Senior Data Engineer designs and oversees the entire data infrastructure, data products and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost effective. This lead role carries an expectation of 10-15 years of relevant experience and will provide mentorship to junior members of the team.
Key responsibilities: Oversee the entire data infrastructure to ensure scalability, operational efficiency and resiliency. Mentor junior data engineers within the organization. Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Azure Fabric). Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2, Azure Blob Storage). Collaborate with data scientists, data analysts, data architects and other stakeholders to understand data requirements and deliver high-quality data solutions. Optimize data pipelines in the Azure environment for performance, scalability, and reliability. Ensure data quality and integrity through data validation techniques and frameworks. Develop and maintain documentation for data processes, configurations, and best practices. Monitor and troubleshoot data pipeline issues to ensure timely resolution. Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge. Manage the CI/CD process for deploying and maintaining data solutions.
Required Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience), with demonstrated high proficiency in programming fundamentals. At least 5 years of proven experience as a Data Engineer or in a similar role dealing with data and ETL processes. 10-15 years of overall experience. Strong knowledge of Microsoft Azure services, including Azure Data Factory, Azure Synapse, Azure Databricks, Azure Blob Storage and Azure Data Lake Gen2. Experience using SQL DML to query modern RDBMSs efficiently (e.g., SQL Server, PostgreSQL). Strong understanding of software engineering principles and how they apply to data engineering (e.g., CI/CD, version control, testing). Experience with big data technologies (e.g., Spark). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills.
Preferred Qualifications: Learning agility. Technical leadership. Consulting on and managing business needs. Strong experience in Python is preferred, but experience in other languages such as Scala, Java, or C# is accepted. Experience building Spark applications using PySpark. Experience with file formats such as Parquet, Delta, and Avro. Experience efficiently querying API endpoints as a data source. Understanding of the Azure environment and related services such as subscriptions, resource groups, etc. Understanding of Git workflows in software development. Using Azure DevOps pipelines and repositories to deploy and maintain solutions. Understanding of Ansible and how to use it in Azure DevOps pipelines.
Chevron participates in E-Verify in certain locations as required by law.
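One of the preferred qualifications above is querying API endpoints as a data source. A minimal hedged sketch of that pattern is below; the endpoint, pagination scheme, and output file are assumptions for illustration, and writing Parquet this way requires pandas plus pyarrow or fastparquet:

```python
import requests
import pandas as pd

def fetch_all_pages(url: str, page_size: int = 500) -> pd.DataFrame:
    """Pull every page from a paginated REST endpoint into a single DataFrame."""
    rows, page = [], 1
    while True:
        resp = requests.get(url, params={"page": page, "per_page": page_size}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:          # an empty page signals the end of the dataset
            break
        rows.extend(batch)
        page += 1
    return pd.DataFrame(rows)

if __name__ == "__main__":
    orders = fetch_all_pages("https://api.example.com/v1/orders")  # placeholder endpoint
    # Parquet preserves schema and compresses well for downstream Spark / Data Factory loads.
    orders.to_parquet("orders.parquet", index=False)
```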
Posted 2 months ago
10.0 - 15.0 years
30 - 37 Lacs
Mumbai
Work from Office
Elevate your career and harness your expertise to influence data strategy and craft innovative solutions in a dynamic environment. Be the keystone in revolutionizing data insights and turning challenges into opportunities. As a Data Operations Director in Markets Operations, you conduct, facilitate, and oversee expert analysis to uncover patterns which lead to new questions and solutions through data collection, integrity, utilization, requirements, and analysis. You apply your extensive in-depth expertise and problem-solving methodologies to accomplish, plan, and review tasks across multiple large-scope projects in various technical areas. You use your advanced technical skills to advise on the design and development of metrics reporting and dashboards to enable Operations Management to execute their strategic objectives and ensure conformance with all controls, policies, and procedures.
Job responsibilities: Oversees the consultative partnerships across multiple stakeholders, including Markets Operations Executives and senior Business Management, with a strong understanding of the business success factors and underlying data. Sets the strategic direction on data requirements and defines, leads, and implements KPIs, trend analysis, dashboards, and analyses to improve business function performance. Conceptualizes, structures, and implements multiple programs in line with business priorities and leads the solutioning of highly complex and critical issues and business analysis activities, including improving data utilization and identifying patterns. Drives the adoption of cutting-edge technology and automation solutions to streamline operations and enhance data analytics and insight capabilities. Manages operational, financial, and technical activities, including financial budgeting, billing, and business planning activities, while ensuring adherence to risk-associated controls and regulatory requirements. Communicates information, insights, and solutions to senior management and stakeholders, and designs the strategy to resolve problems through broad decision making. Leads and develops a global high-performing team, ensuring effective communication and collaboration across regions.
Required qualifications, capabilities, and skills: 10+ years of experience or equivalent expertise in delivering data-driven problem solving and leading global teams. Proven ability to lead complex data collection and analysis, and to advise on the development of conclusions. Demonstrable experience leveraging advanced knowledge of data technologies to execute solutioning of complex issues and business analysis activities. Exceptional leadership and communication skills, with the ability to influence and collaborate with senior management and cross-functional teams. Proven ability to develop and retain talent with excellent coaching and mentoring, and an inclusive work culture.
Preferred qualifications, capabilities, and skills: MBA or Master's degree. In-depth experience with the data architecture discipline, including various database design techniques, modeling tools, and data architecture principles.
Posted 2 months ago
13.0 - 18.0 years
2 Lacs
Chennai
Work from Office
Job Description: We are looking for a BI Architect with 13+ years of experience to lead the design and implementation of scalable BI and data architecture solutions. The role involves driving data modeling, cloud-based pipelines, migration projects, and data lake initiatives using technologies like AWS, Kafka, Spark, SQL, and Python. Experience with EDW modeling and architecture is a strong plus.
Key Responsibilities: Design and develop scalable BI and data models to support enterprise analytics. Lead data platform migration from legacy BI systems to modern cloud architectures. Architect and manage data lakes, batch and streaming pipelines, and real-time integrations via Kafka and APIs. Support data governance, quality, and access control initiatives. Partner with data engineers, analysts, and business stakeholders to deliver reliable, high-performing data solutions. Contribute to architecture decisions and platform scalability planning.
Qualifications: 13 - 19 years of relevant experience, including 10+ years in BI, data engineering, or data architecture roles. Proficiency in SQL, Python, Apache Spark, and Kafka.
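Real-time integration via Kafka, as mentioned above, usually starts with a consumer reading an event stream. Below is a minimal hedged sketch using the kafka-python library; the broker address, topic, and event fields are invented for illustration:

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Consume order events and keep a running revenue total per region.
consumer = KafkaConsumer(
    "orders",                                     # placeholder topic
    bootstrap_servers=["localhost:9092"],         # placeholder broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="bi-aggregator",
)

totals: dict[str, float] = {}
for message in consumer:
    event = message.value
    region = event.get("region", "unknown")
    totals[region] = totals.get(region, 0.0) + float(event.get("amount", 0))
    print(totals)  # in practice this would land in a serving table, not stdout
```

A production pipeline would more likely use Spark Structured Streaming for this, but the consumer loop shows the basic event flow.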
Posted 2 months ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Please note that we will only be able to accept candidates who have the appropriate rights and documentation for employment in India.
Who we are: Axi is a leading global provider of margin and deliverable Foreign Exchange, Contracts for Difference (CFDs), and Financial Spread betting. Our business has evolved into a world-class, multifaceted brokerage with offices in six regions. With heavy investment in the latest trading technology, Axi seeks to offer the most comprehensive end-to-end trading experience available, servicing traders of all levels from beginners to institutional-level clients.
Let's talk about the cool stuff you do at Axi! As the Senior Data Engineer for the Customer Data Platform, you will play a pivotal role in improving Axi's data capabilities, which deliver positive performance impacts on our global trading products. You will sit within our Product Data Team and proactively collaborate with broader business teams across Product, IT, Sales/Marketing and Finance. You will implement data initiatives which build Axi's utilisation of our Customer Data Platform to deliver growth strategies that increase the adoption of Axi's trading products and generate new trading clients.
Your EDGE Assignment / You Will: Be responsible for building and applying configurations supporting Axi's Customer Data Platform (CDP) - Segment. Collaborate with cross-functional teams across Product, Technology and Marketing to understand requirements, design scalable data pipelines and implement new features or enhancements to the CDP using new data sources and destinations. Support the business to derive insights using Power BI dashboards. Design and develop data pipelines including orchestration, ingestion, transformation, and delivery across streaming, event and batch-style processing. Work with a team of data specialists and analysts helping measure product onboarding effectiveness and marketing activities. Develop, maintain, and optimize data integration and ETL processes using GCP and Azure services and tools. Create new data pipelines to efficiently enrich Axi's Customer Data Platform. Support our Marketing Data Warehouse and the daily synchronisation of channel cost data. Proactively derive insights and data from Axi's platforms to inform digital experience design and strategy, and feed into executive reporting. Democratise data for stakeholders. Be responsible for maturing Axi's data model, which facilitates Customer Data Platform audience activations for Marketing, Sales and broader departments. Keep up to date with industry developments and trends, the latest data architecture techniques, and tools. Adhere to all Axi's policies and procedures to ensure compliance with Axi's internal processes.
Are you the one? Overall 5+ years of experience in data/software engineering. Strong experience in Python/SQL. Strong experience using Databricks/Snowflake or other Spark-backed data engineering tools. Strong experience using Azure DevOps. Bonus: experience using Kafka.
Axi's bag of delights: Competitive and attractive compensation. Extensive learning opportunities, such as professional training and certifications and soft skills development. 18 annual leave days per year. 12 sick leave days per year. Public holidays as declared by local government. Maternity leave as per law. Health insurance.
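Since the role centres on Segment as the CDP, a small hedged sketch of sending profile and event data from Python is shown below, assuming Segment's analytics-python server-side library is the SDK in use; the write key, user ID, and event names are placeholders:

```python
import analytics  # Segment's analytics-python library (assumed SDK; pip install analytics-python)

analytics.write_key = "<SEGMENT_WRITE_KEY>"  # placeholder key

# Identify a trader so every downstream destination (warehouse, marketing tools)
# receives the same profile traits.
analytics.identify("user_1234", {
    "email": "trader@example.com",
    "account_tier": "standard",
})

# Track a product-adoption event that audience activations can later be built on.
analytics.track("user_1234", "Demo Account Opened", {
    "platform": "web",
    "source": "onboarding_flow",
})

analytics.flush()  # send queued events before the script exits
```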
Axi's interview journey: Talent Acquisition Interview (45 minutes), Cultural Interview (30 minutes), Technical Interview (90 minutes).
Please note that our organization works with recruitment agencies on a pre-approved basis only. A recruitment agency that wishes to submit candidate profiles or resumes for consideration must obtain prior written consent from our talent acquisition team. We do not accept unsolicited resumes from recruitment agencies, and we will not be responsible for any fees related to unsolicited resumes. Should we receive an unsolicited resume from a recruitment agency that does not have prior written consent, we will not be responsible for the payment of any fees related to the recruitment of the candidate represented in the unsolicited resume.
Posted 2 months ago
3.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job_Description":" About tsworks: tsworks is a leading technology innovator, providing transformative products and services designed for the digital-first world. Our mission is to provide domain expertise, innovative solutions and thought leadership to drive exceptional user and customer experiences. Demonstrating this commitment , we have a proven track record of championing digital transformation for industries such as Banking, Travel and Hospitality, and Retail (including e-commerce and omnichannel), as well as Distribution and Supply Chain, delivering impactful solutions that drive efficiency and growth. We take pride in fostering a workplace where your skills, ideas, and attitude shape meaningful customer engagements. About This Role: tsworks Technologies India Private Limited is seeking driven and motivated Senior Data Engineers to join its Digital Services Team. You will get hands-on experience with projects employing industry-leading technologies. This would initially be focused on the operational readiness and maintenance of existing applications and would transition into a build and maintenance role in the long run. Requirements Position: DataEngineer II Experience: 3 to 10+Years Location: Bangalore, India Mandatory RequiredQualification Strongproficiency in Azure services such as Azure Data Factory, Azure Databricks,Azure Synapse Analytics, Azure Storage, etc. Expertise inDevOps and CI/CD implementation Good knowledgein SQL ExcellentCommunication Skills In This Role, YouWill Design,implement, and manage scalable and efficient data architecture on the Azurecloud platform. Develop andmaintain data pipelines for efficient data extraction, transformation, andloading (ETL) processes. Perform complexdata transformations and processing using Azure Data Factory, Azure Databricks,Snowflakes data processing capabilities, or other relevant tools. Develop andmaintain data models within Snowflake and related tools to support reporting,analytics, and business intelligence needs. Collaboratewith cross-functional teams to understand data requirements and designappropriate data integration solutions. Integrate datafrom various sources, both internal and external, ensuring data quality andconsistency. Ensure datamodels are designed for scalability, reusability, and flexibility. Implement dataquality checks, validations, and monitoring processes to ensure data accuracyand integrity across Azure and Snowflake environments. Adhere to datagovernance standards and best practices to maintain data security andcompliance. Handlingperformance optimization in ADF and Snowflake platforms Collaboratewith data scientists, analysts, and business stakeholders to understand dataneeds and deliver actionable insights Provideguidance and mentorship to junior team members to enhance their technicalskills. Maintaincomprehensive documentation for data pipelines, processes, and architecturewithin both Azure and Snowflake environments including best practices,standards, and procedures. Skills Knowledge Bachelors orMasters degree in Computer Science, Engineering, or a related field. 3 + Years ofexperience in Information Technology, designing, developing and executingsolutions. 3+ Years ofhands-on experience in designing and executing data solutions on Azure cloudplatforms as a Data Engineer. Strongproficiency in Azure services such as Azure Data Factory, Azure Databricks,Azure Synapse Analytics, Azure Storage, etc.
Posted 2 months ago
7.0 - 12.0 years
20 - 35 Lacs
Gurugram
Work from Office
Exciting opportunity for a Delivery Lead Data Architect to join a high-growth analytics environment. You will be responsible for leading end-to-end technical delivery across data platforms, ensuring robust architecture, performance, and cross-functional alignment.
Location: Gurugram (Hybrid)
Your Future Employer: A reputed analytics-driven organization focused on delivering innovative and scalable data solutions, known for its inclusive work culture and continuous learning environment.
Responsibilities: 1) Leading design and delivery across the complete SDLC for data and analytics projects 2) Translating business requirements into scalable data architecture and models 3) Collaborating with engineering, BI, testing, and support teams for smooth execution 4) Guiding development of ETL pipelines and reporting workflows 5) Mentoring the team on best practices in data modeling, engineering, and architecture 6) Driving client communication, estimations, and technical workshops
Requirements: 1) Bachelor's or Master's in Computer Science, IT, or a related field 2) 6+ years of experience in data architecture and delivery leadership 3) Proficiency in SQL, Python, data modeling, and ETL tools 4) Experience with cloud platforms (AWS, Azure, or GCP) and Power BI 5) Strong understanding of SDLC, DevOps, and managed services delivery 6) Excellent communication, stakeholder management, and team leadership skills
What's in it for you: 1) Opportunity to lead enterprise-level data programs 2) Work across modern cloud-native technologies 3) Competitive compensation with growth opportunities 4) Inclusive and collaborative work environment
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Pune
Work from Office
What You'll Do: We're hiring a Senior Analytics Consultant to lead the analytics strategy behind how we scale and improve customer implementations at Avalara. This is an independent contributor role where you'll own critical dashboards, run deep-dive investigations, and shape operational narratives that guide decisions across GoLive, Sales, and Product. You're not just a builder of dashboards; you're an advisor who connects dots between data, operations, and customer experience. You'll identify patterns, make recommendations, and communicate insights in ways that influence decisions at every level of the organization. You will report to the Senior Manager, Data & Automation, GoLive.
What Your Responsibilities Will Be
Strategic Analytics Leadership: Lead the design, evolution, and governance of dashboard assets across GoLive operational metrics (e.g., implementation velocity, CPO, quality). Translate complex data into simple, compelling business stories that influence decision-making across executive and operational teams. Identify trends, risks, and opportunities. Promote action through storytelling and strategic framing.
Data Architecture & Dashboard Development: Develop Power BI dashboards and analytics models that scale with evolving operational needs. Write, improve, and document SQL queries to integrate and validate data from multiple enterprise systems (Salesforce, internal databases). Establish and enforce best practices for metric definitions, documentation, and data quality.
Cross-Functional Partnership: Be the analytics expert for GoLive vertical leads, implementation managers, and executive partners. Lead working sessions, executive readouts and business review sessions to ensure insights land with context.
Advisory & Influence: Consult on process improvement efforts by using data to evaluate what's working and what's not. Frame analytical work in the context of broader business goals. Be the voice that ties operational data to customer impact and strategic value.
What You'll Need to be Successful: 10+ years of experience in analytics, business intelligence or data strategy in a SaaS or tech-enabled operations environment. Expert-level SQL and experience with BI tools (Power BI preferred; Tableau, Looker, etc. also accepted). Demonstrated experience independently managing analytical workstreams that influence senior partners. Exceptional communication skills, able to translate raw data into a compelling, executive-ready narrative. Experience with operational metrics, capacity modeling, customer implementation, or success analytics.
What You'll Bring: A strategic mindset grounded in data; you understand both what to measure and why it matters. High ownership and autonomy; you don't wait for direction to uncover and solve important problems. Experience making the complex simple, bridging technical detail and business clarity with ease.
Posted 2 months ago
5.0 - 9.0 years
19 - 23 Lacs
Mumbai
Work from Office
Overview: MSCI has an immediate opening in one of our fastest growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission-critical systems of MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability, but also innovation. Your ability to take technology decisions in a consistent framework to support the growth of our company and products, lead the various software implementations in close partnership with global leaders and multiple product organizations, and drive technology innovations will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise.
Responsibilities: Engages technical teams and business stakeholders to discuss and propose technical approaches to meet current and future needs. Defines the technical target state of the product and drives achievement of the strategy. As the Lead Architect you will be responsible for leading the design, development, and maintenance of our data architecture, ensuring scalability, efficiency, and reliability. Create and maintain comprehensive documentation for the architecture, processes, and best practices, including Architecture Decision Records (ADRs). Evaluates recommendations and provides feedback on new technologies. Develops secure and high-quality production code, and reviews and debugs code written by others. Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems. Collaborates with a cross-functional team to draft, implement and adapt the overall architecture of our products and support infrastructure in conjunction with software development managers and product management teams. Stays abreast of new technologies and issues in the software-as-a-service industry, including current technologies, platforms, standards and methodologies. Is actively engaged in setting technology standards that impact the company and its offerings. Ensures the knowledge sharing of engineering best practices across departments, and develops and monitors technical standards to ensure adherence to them.
Qualifications: Prior senior software architecture roles. Demonstrated proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases. Drive the development of conceptual, logical, and physical data models aligned with business requirements. Lead the implementation and optimization of data technologies, including Apache Spark. Experience with one of the table formats, such as Delta or Iceberg. Strong hands-on experience in data architecture, database design, and data modeling.
• Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio. • Experience with cloud platforms such as AWS, Azure, or Google Cloud. • Ability to dive into details, hands on technologist with strong core computer science fundamentals. • Strong preference for financial services experience • Proven leadership of large-scale distributed software teams that have delivered great products on deadline • Experience in a modern iterative software development methodology • Experience with globally distributed teams and business partners • Experience in building and maintaining applications that are mission critical for customers • M.S. in Computer Science, Management Information Systems or related engineering field • 15+ years of software engineering experience • Demonstrated consensus builder and collegial peer What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion belonging and connection, including eight Employee Resource Groups. All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. 
If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 2 months ago
10.0 - 15.0 years
35 - 40 Lacs
Hyderabad
Work from Office
Step into a pivotal role where your expertise in data architecture will shape the future of analytics at Novartis. As Associate Director - Data Architect, you'll lead the design and implementation of innovative data solutions that empower business decisions and drive digital transformation. This is your opportunity to influence enterprise-wide strategies, collaborate with cross-functional teams, and bring emerging technologies to life, all while making a meaningful impact on global healthcare.
About the Role
Key Responsibilities - Design and implement scalable data architecture solutions aligned with business strategy and innovation goals - Lead architecture for US&I Analytics Capabilities including GenAI, MLOps, NLP, and data visualization - Collaborate with cross-functional teams to ensure scalable, future-ready data solutions - Define and evolve architecture governance frameworks, standards, and best practices - Drive adoption of emerging technologies through rapid prototyping and enterprise-scale deployment - Architect data solutions using AWS, Snowflake, Databricks, and other modern platforms - Oversee delivery of data lake projects including acquisition, transformation, and publishing - Ensure data security, governance, and compliance across all architecture solutions - Promote a data product-centric approach to solution design and delivery - Align innovation efforts with business strategy, the IT roadmap, and regulatory requirements
Essential Requirements - Bachelor's degree in computer science, engineering, or a related field - Over 10 years of experience in analytical and technical frameworks for descriptive and prescriptive analytics - Strong expertise in AWS, Databricks, and Snowflake service offerings - Proven experience delivering data lake projects from acquisition to publishing - Deep understanding of data security, governance policies, and enforcement mechanisms - Agile delivery experience managing multiple concurrent delivery cycles - Strong knowledge of MLOps and analytical data lifecycle management - Excellent communication, problem-solving, and cross-functional collaboration skills
Desirable Requirements - Experience working with pharmaceutical data and familiarity with global healthcare data sources - Exposure to regulatory frameworks and compliance standards in the life sciences industry
Posted 2 months ago
3.0 - 8.0 years
10 - 15 Lacs
Kolkata
Work from Office
Data Engineer || Product Based MNC (Direct Payroll) || Kolkata Location
Role & responsibilities: Understand requirements and be involved in discussions relating to the technical and functional design of the sprint/module/project. Design and implement end-to-end data solutions (storage, integration, processing, and visualization) in Azure. Ingest data from various sources such as SQL Server, Excel, Oracle, and Azure SQL into Azure Data Factory and Azure Data Lake Storage (ADLS). Extract data from one database and load it into another. Build data architecture for ingestion, processing, and surfacing of data for large-scale applications. Use many different scripting languages, understanding the nuances and benefits of each, to combine systems. Research and discover new methods to acquire data, and new applications for existing data. Work with other members of the data team, including data architects, data analysts, and data scientists. Prepare data sets for analysis and interpretation. Perform statistical analysis and fine-tuning using test results. Create libraries and extend the existing frameworks. Create design documents based on discussions and assist in providing technical solutions for the business process.
Preferred candidate profile: In-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (extract, transform, load) framework. 3+ years of overall experience with Azure, Data Factory and .NET. Strong in Data Factory; able to create manual and auto-trigger pipelines. Able to create, update, edit and delete ETL jobs in Azure Synapse Analytics. Recreate existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database and SQL Data Warehouse environment. Knowledge of SQL queries, SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS). Proven ability to take initiative and be innovative. Analytical mind with a problem-solving aptitude.
Compensation: 10 LPA - 15 LPA
Posted 2 months ago
5.0 - 10.0 years
25 - 30 Lacs
Gurugram
Work from Office
Join Team Amex and let's lead the way together.
About Enterprise Architecture: Enterprise Architecture is an organization within the Chief Technology Office at American Express and is a key enabler of the company's technology strategy. The four pillars of Enterprise Architecture include: Architecture as Code: this pillar owns and operates foundational technologies that are leveraged by engineering teams across the enterprise. Architecture as Design: this pillar includes the solution and technical design for transformation programs and business-critical projects which need architectural guidance and support. Governance: this pillar is responsible for defining technical standards and developing innovative tools that automate controls to ensure compliance. Colleague Enablement: this pillar is focused on colleague development, recognition, training, and enterprise outreach.
We are seeking a seasoned Solution Architect to lead the design and delivery of scalable, enterprise-grade solutions across our financial, servicing, or compliance platforms. The ideal candidate will possess deep domain knowledge and a strong architectural mindset, capable of translating complex business needs into robust technical designs that meet regulatory, operational, and performance standards.
Responsibilities: Contribute to enterprise architecture initiatives, domain reviews, and solution architecture. Design and prototype scalable, secure, and resilient applications and data pipelines using modern technologies. Collaborate with solution architects and engineering teams to deliver architectural documentation and design patterns that align with business and technology goals. Build and evaluate proofs of concept (PoCs) to test and validate emerging technologies and architectural approaches. Participate in enterprise forums and working groups to drive architectural decisions and cross-domain alignment. Communicate architectural guidance effectively across both technical and non-technical stakeholders. Foster innovation by exploring new tools, frameworks, and design methodologies.
Qualifications: 7+ years of experience in software engineering, with 3+ years of experience in solution architecture and strong knowledge of distributed system design and high-throughput data architecture. Proven experience in finance, servicing, or compliance domains, including regulatory and operational considerations. Proven track record of designing and deploying data-intensive applications in distributed or cloud-native environments. Ability to write clear architectural documentation and present ideas concisely. Experience with enterprise platforms such as microservices and event-driven architecture. Familiarity with cloud-native patterns (GCP, AWS or Azure) and modern DevOps pipelines. Experience with agentic AI, including autonomous agents and AI reasoning, and knowledge of AI/ML frameworks is preferred. Familiarity with regulatory frameworks and experience designing compliant solutions. Excellent communication and stakeholder management skills; ability to present technical concepts to non-technical audiences.
We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries. Bonus incentives. Support for financial well-being and retirement. Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location). Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need. Generous paid parental leave policies (depending on your location). Free access to global on-site wellness centers staffed with nurses and doctors (depending on location). Free and confidential counseling support through our Healthy Minds program. Career development and training opportunities.
Posted 2 months ago
10.0 - 20.0 years
25 - 40 Lacs
Hyderabad, Pune
Hybrid
Data Modeler / Lead - Healthcare Data Systems
Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.
Key Responsibilities
Data Architecture & Modeling: Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management. Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment). Create and maintain data lineage documentation and data dictionaries for healthcare datasets. Establish data modeling standards and best practices across the organization.
Technical Leadership: Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica. Architect scalable data solutions that handle large volumes of healthcare transactional data. Collaborate with data engineers to optimize data pipelines and ensure data quality.
Healthcare Domain Expertise: Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI). Design data models that support analytical, reporting and AI/ML needs. Ensure compliance with healthcare regulations including HIPAA/PHI and state insurance regulations. Partner with business stakeholders to translate healthcare business requirements into technical data solutions.
Data Governance & Quality: Implement data governance frameworks specific to healthcare data privacy and security requirements. Establish data quality monitoring and validation processes for critical health plan metrics. Lead efforts to standardize healthcare data definitions across multiple systems and data sources.
Required Qualifications
Technical Skills: 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data. Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches. Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing. Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks). Proficiency with data modeling tools (Hackolade, ERwin, or similar).
Healthcare Industry Knowledge: Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data. Experience with healthcare data standards and medical coding systems. Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment). Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).
Leadership & Communication: Proven track record of leading data modeling projects in complex healthcare environments. Strong analytical and problem-solving skills with ability to work with ambiguous requirements. Excellent communication skills with ability to explain technical concepts to business stakeholders. Experience mentoring team members and establishing technical standards.
Preferred Qualifications: Experience with Medicare Advantage, Medicaid, or Commercial health plan operations. Cloud platform certifications (AWS, Azure, or GCP). Experience with real-time data streaming and modern data lake architectures. Knowledge of machine learning applications in healthcare analytics. Previous experience in a lead or architect role within healthcare organizations.
Posted 2 months ago
7.0 - 12.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Individual Accountabilities
Collaboration: Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, if appropriate, the respective business stakeholders in architecting data solutions for their data service needs. Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented. Contributes to prototype or proof-of-concept efforts. Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions. Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution. Suggests architecture designs in collaboration with the Ontologies and MDM teams.
Technical skills & design: Significant experience working with structured and unstructured data at scale and comfort with a variety of different stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses. Deep understanding of modern data services in leading cloud environments, and able to select and assemble data services with maximum cost efficiency while meeting business requirements of speed, continuity, and data integrity. Creates data architecture artifacts such as architecture diagrams, data models, design documents, etc. Guides domain architects on the value of a modern data and analytics platform. Researches, designs, tests, and evaluates new technologies, platforms and third-party products. Working experience with Azure Cloud, Data Mesh, MS Fabric, ontologies, MDM, IoT, BI solutions and AI would be a strong asset. Expert troubleshooting skills and experience.
Leadership: Mentors aspiring data architects, typically operating in data engineering and software engineering roles.
Key shared accountabilities: Leads medium to large data services projects. Provides technical partnership to product owners. Shared stewardship, with domain architects, of the Arcadis data ecosystem. Actively participates in the Arcadis Tech Architect community.
Key profile requirements: Minimum of 7 years of experience in designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines. Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies to help guide and steer decisions. Experience working in large-scale development and cloud environments.
Posted 2 months ago
3.0 - 8.0 years
35 - 40 Lacs
Chennai
Work from Office
Amazon Retail Financial Intelligence Systems is seeking a seasoned and talented Senior Data Engineer to join the Fortune Platform team. Fortune is a fast-growing team with a mandate to build tools to automate profit-and-loss forecasting and planning for the Physical Consumer business. We are building the next generation of Business Intelligence solutions using big data technologies such as Apache Spark, Hive/Hadoop, and distributed query engines. As a Data Engineer at Amazon, you will be working in a large, extremely complex and dynamic data environment. You should be passionate about working with big data and able to learn new technologies rapidly and evaluate them critically. You should have excellent communication skills and be able to work with business owners to translate business requirements into system solutions. You are a self-starter, comfortable with ambiguity, and comfortable working in a fast-paced and ever-changing environment. Ideally, you are also experienced with at least one programming language such as Java, C++, Spark/Scala, or Python.
Major Responsibilities: Work with a team of product and program managers, engineering leaders, and business leaders to build data architectures and platforms to support the business. Design, develop, and operate highly scalable, high-performance, low-cost, and accurate data pipelines in distributed data processing platforms. Recognize and adopt best practices in data processing, reporting, and analysis: data integrity, test design, analysis, validation, and documentation. Keep up to date with big data technologies; evaluate and make decisions around the use of new or existing software products to design the data architecture. Design, build and own all the components of a high-volume data warehouse end to end. Provide end-to-end data engineering support for project lifecycle execution (design, execution and risk assessment). Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources. Own the functional and nonfunctional scaling of software systems in your ownership area. Implement big data solutions for distributed computing.
As a Data Engineer on our team, you will be responsible for leading the data modelling, database design, and launch of some of the core data pipelines. You will have significant influence on our overall strategy by helping define the data model, drive the database design, and spearhead the best practices that deliver high-quality products.
About the team: Profit Intelligence systems measure and predict true profit (or loss) for each item as a result of a specific shipment to an Amazon customer. Profit Intelligence is all about providing intelligent ways for Amazon to understand profitability across the retail business. What are the hidden factors driving growth or profitability across millions of shipments each day? We compute the profitability of each and every shipment that gets shipped out of Amazon. Guess what, we predict the profitability of future possible shipments too. We are a team of agile, can-do engineers, who believe that not only are moon shots possible but that they can be done before lunch. All it takes is finding new ideas that challenge our preconceived notions of how things should be done. Process and procedure matter less than ideas and the practical work of getting stuff done. This is a place for exploring the new and taking risks.
We push the envelope in using cloud services in AWS as well as the latest in distributed systems, forecasting algorithms, and data mining. 3+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with SQL Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, FireHose, Lambda, and IAM roles and permissions Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
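The requirements above mention AWS Glue alongside S3 and Redshift. A minimal hedged sketch of triggering and checking a Glue ETL job from Python with boto3 is shown below; the job name, region, and argument are invented for illustration:

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")  # placeholder region

# Kick off a nightly ETL run; the job name and argument are placeholders.
run = glue.start_job_run(
    JobName="profitability-daily-etl",
    Arguments={"--snapshot_date": "2024-01-01"},
)

# Poll the run state once (a real pipeline would wait, or rely on EventBridge notifications).
status = glue.get_job_run(JobName="profitability-daily-etl", RunId=run["JobRunId"])
print(status["JobRun"]["JobRunState"])  # e.g. RUNNING, SUCCEEDED, FAILED
```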
Posted 2 months ago
7.0 - 10.0 years
20 - 27 Lacs
Kolkata
Hybrid
We are looking for a Senior Data Lead to lead enterprise-level data modernization and innovation. In this highly strategic role, you will design scalable, secure, and future-ready data architectures, modernize legacy systems, and provide trusted technical leadership across both technology and business teams. This is a unique opportunity to make a company-wide impact by influencing data strategy and enabling smarter, faster decision-making through data.
Key Responsibilities
Architect & Design: Lead the development of robust, scalable data models, data management systems, and integration frameworks to ensure enterprise-wide data accuracy, consistency, and security.
Domain Expertise: Act as a subject matter expert across key business functions such as Supply Chain, Product Engineering, Sales & Marketing, Manufacturing, Finance, and Legal.
Modernization Leadership: Drive the transformation of legacy systems and manage end-to-end cloud migrations with minimal business disruption.
Collaboration: Partner with data engineers, scientists, analysts, and IT leaders to build high-performance, scalable data pipelines and transformation solutions.
Governance & Compliance: Establish and maintain data governance frameworks including metadata repositories, data dictionaries, and data lineage documentation.
Strategic Advisory: Provide guidance on data architecture best practices, technology selection, and roadmap alignment to senior leadership and cross-functional teams.
Mentorship: Serve as a mentor and thought leader to junior data professionals, fostering a culture of innovation, knowledge sharing, and technical excellence.
Innovation & Trends: Stay abreast of emerging technologies in cloud, data platforms, and AI/ML to identify and implement innovative solutions.
Communication: Translate complex technical concepts into clear, actionable insights for technical and non-technical audiences alike.
Required Qualifications: 10+ years of experience in data architecture, engineering, or enterprise data management roles. Demonstrated success leading large-scale data initiatives in life sciences or other highly regulated industries. Deep expertise in modern data architecture paradigms such as Data Lakehouse, Data Mesh, or Data Fabric. Strong hands-on experience with cloud platforms like AWS, Azure, or Google Cloud Platform (GCP). Proficiency in data modeling, ETL/ELT frameworks, and enterprise integration patterns. Deep understanding of data governance, metadata management, master data management (MDM), and data quality practices. Experience with tools and platforms including but not limited to: Data Integration: Informatica, Talend; Data Governance: Collibra; Modeling/Transformation: dbt; Cloud Platforms: Snowflake, Databricks. Excellent problem-solving skills with the ability to translate business requirements into scalable data solutions. Exceptional communication skills and experience engaging with both executive stakeholders and engineering teams.
Preferred Qualifications (Nice to Have): Experience with AI/ML data pipelines or real-time streaming architectures. Certifications in cloud technologies (e.g., AWS Certified Solutions Architect, Azure Data Engineer). Familiarity with regulatory frameworks such as GxP, HIPAA, or GDPR.
Posted 2 months ago
10.0 - 15.0 years
35 - 40 Lacs
Hyderabad
Hybrid
Role Description: The Director for Data Architecture and Solutions will lead Amgen's enterprise data architecture and solutions strategy, overseeing the design, integration, and deployment of scalable, secure, and future-ready data systems. This leader will define the architectural vision and guide a high-performing team of architects and technical experts to implement data and analytics solutions that drive business value and innovation. This role demands a strong blend of business acumen, deep technical expertise, and strategic thinking to align data capabilities with the company's mission and growth. The Director will also serve as a key liaison with executive leadership, influencing technology investment and enterprise data direction.
Roles & Responsibilities:
Develop and own the enterprise data architecture and solutions roadmap, aligned with Amgen's business strategy and digital transformation goals.
Provide executive leadership and oversight of data architecture initiatives across business domains (R&D, Commercial, Manufacturing, etc.).
Lead and grow a high-impact team of data and solution architects. Coach, mentor, and foster innovation and continuous improvement in the team.
Design and promote modern data architectures (data mesh, data fabric, lakehouse, etc.) across hybrid cloud environments and enable AI readiness.
Collaborate with stakeholders to define solution blueprints, integrating business requirements with technical strategy to drive value.
Drive enterprise-wide adoption of data modeling, metadata management, and data lineage standards.
Ensure solutions meet enterprise-grade requirements for security, performance, scalability, compliance, and data governance.
Partner closely with Data Engineering, Analytics, AI/ML, and IT Security teams to operationalize data solutions that enable advanced analytics and decision-making.
Champion innovation and continuous evolution of Amgen's data and analytics landscape through new technologies and industry best practices.
Communicate architectural strategy and project outcomes to executive leadership and other non-technical stakeholders.
Functional Skills:
Must-Have Skills:
10+ years of experience in data architecture or solution architecture leadership roles, including experience at the enterprise level.
Proven experience leading architecture strategy and delivery in the life sciences or pharmaceutical industry.
Expertise in cloud platforms (AWS, Azure, or GCP) and modern data technologies (data lakes, APIs, ETL/ELT frameworks).
Strong understanding of data governance, compliance (e.g., HIPAA, GxP), and data privacy best practices.
Demonstrated success managing cross-functional, global teams and large-scale data programs.
Experience with enterprise architecture frameworks (TOGAF, Zachman, etc.).
Proven leadership skills with a track record of managing and mentoring high-performing data architecture teams.
Good-to-Have Skills:
Master's or doctorate degree in Computer Science, Engineering, or a related field.
Certifications in cloud architecture (AWS, GCP, Azure).
Experience integrating AI/ML solutions into enterprise data architecture.
Familiarity with DevOps, CI/CD pipelines, and Infrastructure as Code (Terraform, CloudFormation).
Scaled Agile or similar methodology experience.
Leadership and Communication Skills:
Strategic thinker with the ability to influence at the executive level.
Strong executive presence with excellent communication and storytelling skills.
Ability to lead in a matrixed, global environment with multiple stakeholders.
Highly collaborative, proactive, and business-oriented mindset.
Strong organizational and prioritization skills to manage complex initiatives.
High degree of initiative and self-motivation. Ability to manage multiple priorities successfully.
Basic Qualifications:
Doctorate degree and 2 years of Information Systems experience, or Master's degree and 6 years of Information Systems experience, or Bachelor's degree and 8 years of Information Systems experience, or Associate's degree and 10 years of Information Systems experience, or 4 years of managerial experience directly managing people and leadership experience leading teams, projects, or programs.
Posted 2 months ago
8.0 - 17.0 years
20 - 27 Lacs
Hyderabad
Work from Office
Career Category: Engineering
Job Description
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.
ABOUT THE ROLE
Role Description: We are seeking a seasoned Principal Architect - Solutions to drive the architecture, development and implementation of data solutions for Amgen functional groups. The ideal candidate is able to work on large-scale data and analytics initiatives, engaging and working along with Business, Program Management, Data Engineering and Analytic Engineering teams, and championing the enterprise data and analytics strategy, data architecture blueprints and architectural guidelines. As a Principal Architect, you will play a crucial role in designing, building, and optimizing data solutions for Amgen functional groups such as R&D, Operations and GCO.
Roles & Responsibilities:
Implement and manage large-scale data and analytics solutions for Amgen functional groups that align with the Amgen data strategy.
Collaborate with Business, Program Management, Data Engineering and Analytic Engineering teams to deliver data solutions.
Responsible for the design, development, optimization, delivery and support of data solutions on AWS and Databricks architecture.
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
Provide expert guidance and mentorship to team members, fostering a culture of innovation and best practices.
Be passionate and hands-on to quickly experiment with new data-related technologies.
Define guidelines, standards, strategies, security policies and change management policies to support the Enterprise Data platform.
Collaborate and align with EARB, Cloud Infrastructure, Security and other technology leaders on Enterprise Data Architecture changes.
Work with different project and application groups to drive growth of the Enterprise Data Platform using effective written/verbal communication skills, and lead demos at different roadmap sessions.
Overall management of the Enterprise Data Platform on the AWS environment to ensure that service delivery is cost effective and business SLAs around uptime, performance and capacity are met.
Ensure scalability, reliability, and performance of data platforms by implementing best practices for architecture, cloud resource optimization, and system tuning.
Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible.
Maintain knowledge of market trends and developments in data integration, data management and analytics software/tools.
Work as part of a team in a SAFe Agile/Scrum model.
Basic Qualifications and Experience:
Master's degree with 12 - 15 years of experience in Computer Science, IT or related field OR Bachelor's degree with 14 - 17 years of experience in Computer Science, IT or related field.
Functional Skills:
Must-Have Skills:
8+ years of hands-on experience in data integrations, data management and the BI technology stack.
Strong experience with one or more data management tools such as AWS data lake, Snowflake or Azure Data Fabric.
Expert-level proficiency with Databricks and experience in optimizing data pipelines and workflows in Databricks environments.
Strong experience with Python, PySpark, and SQL for building scalable data workflows and pipelines.
Experience with Apache Spark, Delta Lake, and other relevant technologies for large-scale data processing.
Familiarity with BI tools including Tableau and Power BI.
Demonstrated ability to enhance cost-efficiency, scalability, and performance for data solutions.
Strong analytical and problem-solving skills to address complex data solutions.
Good-to-Have Skills:
Experience in life sciences, tech, or consultative solution architecture roles.
Experience working with agile development methodologies such as Scaled Agile.
Professional Certifications:
AWS Certified Data Engineer preferred.
Databricks certification preferred.
Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
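For orientation on the Databricks / Delta Lake / PySpark stack listed above, here is a minimal, hypothetical sketch of an incremental upsert into a Delta table; the staging path and table name are invented for the example.

from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Illustrative sketch only: merge a batch of staged updates into a Delta table (upsert).
spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

updates = spark.read.parquet("s3://example-staging/customer_updates/")  # hypothetical path
target = DeltaTable.forName(spark, "analytics.customers")               # hypothetical table

(target.alias("t")
       .merge(updates.alias("u"), "t.customer_id = u.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())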
Posted 2 months ago
5.0 - 14.0 years
22 - 30 Lacs
Hyderabad
Work from Office
Career Category: Engineering
Job Description
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.
ABOUT THE ROLE
Role Description: We are seeking a seasoned and passionate Principal Architect (Enterprise Architect - Data Platform Engineering) in our Data Architecture & Engineering group to drive the architecture, development and implementation of our strategy spanning the Data Fabric, Data Management, and Data Analytics platform stack. The ideal candidate possesses deep technical expertise, an understanding of the data and analytics landscape, current tools and technology trends, and data engineering principles, coupled with strong leadership and data-driven problem-solving skills. As a Principal Architect, you will play a crucial role in building the strategy and driving the implementation of best practices across data and analytics platforms.
Roles & Responsibilities:
Must be passionate about data, content and AI technologies, with the ability to quickly evaluate and assess new technology and market trends with enterprise architecture in mind.
Drive the strategy and implementation of the enterprise data platform and technical roadmaps that align with the Amgen data strategy.
Maintain the pulse of current market trends in the data and AI space and be able to quickly perform hands-on experimentation and evaluations.
Provide expert guidance and influence management and peers from functional groups with an enterprise mindset and goals.
Responsible for the design, development, optimization, delivery and support of the Enterprise Data platform on AWS and Databricks architecture.
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
Advise and support application teams (product managers, architects, business analysts, and developers) on tools, technology, and methodology related to the design and development of applications that have large data volumes and a variety of data types.
Collaborate and align with EARB, Cloud Infrastructure, Security and other technology leaders on Enterprise Data Architecture changes.
Ensure scalability, reliability, and performance of data platforms by implementing best practices for architecture, cloud resource optimization, and system tuning.
Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible.
Basic Qualifications and Experience:
Master's degree with 8 - 10 years of experience in Computer Science, IT or related field OR Bachelor's degree with 10 - 14 years of experience in Computer Science, IT or related field.
Functional Skills:
Must-Have Skills:
8+ years of experience in data architecture and engineering or related roles, with hands-on experience building enterprise data platforms in a cloud environment (AWS, Azure, GCP).
5+ years of experience leading enterprise-scale data platforms and solutions.
Expert-level proficiency with Databricks and experience in optimizing data pipelines and workflows in Databricks environments.
Deep understanding of distributed computing, data architecture, and performance optimization in cloud-based environments.
Experience with an enterprise mindset / certifications such as TOGAF are a plus. Big Tech or Big Consulting experience is highly preferred.
Solid knowledge of data security, governance, and compliance practices in cloud environments.
Exceptional communication skills to engage and influence architects and leaders in the organization.
Good-to-Have Skills:
Experience with Gen AI tools in Databricks.
Experience with unstructured data architecture and pipelines.
Experience working with agile development methodologies such as Scaled Agile.
Professional Certifications:
AWS Certified Data Engineer preferred.
Databricks certification preferred.
Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
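As a small, hypothetical illustration of the cloud resource optimization and system tuning responsibility above, the sketch below sets a few explicit Spark session options; the values are placeholders rather than recommendations for any particular workload.

from pyspark.sql import SparkSession

# Illustrative sketch only: session-level tuning knobs a platform team might standardize.
spark = (
    SparkSession.builder.appName("platform-baseline")
    .config("spark.sql.adaptive.enabled", "true")                     # adaptive query execution
    .config("spark.sql.adaptive.coalescePartitions.enabled", "true")  # merge small shuffle partitions
    .config("spark.sql.shuffle.partitions", "200")                    # baseline shuffle parallelism
    .getOrCreate()
)

# Confirm what the running session actually picked up.
for key in ("spark.sql.adaptive.enabled", "spark.sql.shuffle.partitions"):
    print(key, "=", spark.conf.get(key))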
Posted 2 months ago
11.0 - 19.0 years
32 - 40 Lacs
Hyderabad
Work from Office
You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.
As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platform and data products. Drive significant business impact and help shape the global target-state architecture through your capabilities in multiple data architecture domains.
Job responsibilities
Represents the data architecture team at technical governance bodies and provides feedback regarding proposed improvements to data architecture governance practices
Evaluates new and current technologies using existing data architecture standards and frameworks
Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others
Drives data architecture decisions that impact data product platform design, application functionality, and technical operations and processes
Serves as a function-wide subject matter expert in one or more areas of focus
Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle
Influences peers and project decision-makers to consider the use and application of leading-edge technologies
Advises junior architects and technologists
Required qualifications, capabilities, and skills
7+ years of hands-on practical experience delivering data architecture and system design, data engineering, testing, and operational stability
Advanced knowledge of architecture, applications, and technical processes with considerable in-depth knowledge of the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.)
Practical cloud-based data architecture and deployment experience, preferably AWS
Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres
Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical and physical data models deployed as operational vs. analytical data stores
Advanced in one or more data engineering disciplines, e.g. streaming, ELT, event processing
Ability to tackle design and functionality problems independently with little to no oversight
Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future-state data architecture
Preferred qualifications, capabilities, and skills
Financial services experience; card and banking a big plus
Practical experience in modern data processing technologies, e.g., Kafka streaming, dbt, Spark, Airflow, etc.
Practical experience in data mesh and/or data lake
Practical experience in machine learning/AI with Python development a big plus
Practical experience in graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin
Knowledge of architecture assessment frameworks, e.g. Architecture Tradeoff Analysis
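To make the orchestration and ELT skills listed above concrete, here is a minimal, hypothetical Airflow DAG sketch (assuming a recent Airflow 2.x release); the DAG id, schedule, and task bodies are placeholders.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Illustrative sketch only: a two-step daily ELT pipeline.
def extract():
    print("pull daily transactions from the source system")  # placeholder

def load():
    print("load curated records into the analytical store")  # placeholder

with DAG(
    dag_id="daily_transactions_elt",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) >> PythonOperator(
        task_id="load", python_callable=load
    )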
Posted 2 months ago
12.0 - 22.0 years
8 - 18 Lacs
Pune, Bengaluru
Hybrid
Role & responsibilities
Understanding of the business area that the project is involved with.
Working with data stewards to understand the data sources.
Clear understanding of data entities, relationships, cardinality, etc. for the inbound sources, based on inputs from the data stewards / source system experts.
Performance tuning: understanding the overall requirement and its reporting impact.
Data modeling for the business and reporting models, as required by the reporting needs or delivery needs to other downstream systems.
Experience with components and languages such as Databricks, Python, PySpark, Scala, and R.
Ability to ask strong questions to help the team see areas that may lead to problems.
Ability to validate the data by writing SQL queries and comparing results against the source system and transformation mapping.
Work closely with teams to collect and translate information requirements into data to develop data-centric solutions.
Ensure that industry-accepted data architecture principles and standards are integrated and followed for modeling, stored procedures, replication, regulations, and security, among other concepts, to meet technical and business goals.
Continuously improve the quality, consistency, accessibility, and security of our data activity across company needs.
Experience with the Azure DevOps project tracking tool or equivalent tools such as JIRA.
Outstanding verbal and non-verbal communication skills.
Experience and desire to work in a global delivery environment.
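As an example of the source-versus-target validation described above, here is a minimal, hypothetical Spark SQL reconciliation sketch; the database and table names are invented for the example.

from pyspark.sql import SparkSession

# Illustrative sketch only: compare daily row counts between a source table and its curated copy.
spark = SparkSession.builder.appName("recon-check").getOrCreate()

mismatches = spark.sql("""
    SELECT s.order_date, s.src_count, t.tgt_count
    FROM   (SELECT order_date, COUNT(*) AS src_count FROM source_db.orders  GROUP BY order_date) s
    LEFT JOIN (SELECT order_date, COUNT(*) AS tgt_count FROM curated_db.orders GROUP BY order_date) t
           ON s.order_date = t.order_date
    WHERE  s.src_count <> COALESCE(t.tgt_count, 0)
""")
mismatches.show(truncate=False)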
Posted 2 months ago