5.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
The Sr Associate IS Analyst - DocuBridge will play a key role in the implementation and lifecycle management of structured regulatory submission solutions, with a main focus on Lorenz DocuBridge. This role demands strong technical expertise, hands-on experience with Lorenz DocuBridge, and the ability to assist in managing multi-functional partner relationships across business, IT, and vendor teams. The candidate will collaborate with internal Regulatory Affairs team members and vendor partners to ensure accurate interpretation of requirements, delivery of compliant submissions, and successful deployment of the solution. The role includes assisting in validation testing, user grievance resolution, and overall user experience optimization. Timely submission is critical, and the candidate must support the project to closure within set timelines and quality standards.

Assist in the implementation and system ownership of the Lorenz DocuBridge Suite, ensuring the solution meets both global and regional regulatory requirements (e.g., eCTD & NeeS). Collaborate with multi-functional partners including Regulatory Affairs, Quality Assurance, IT Security, and vendor teams to gather detailed business requirements and translate them into scalable, compliant technical solutions. Support the team in defining and managing project plans, timelines, resource allocation, and key landmarks to ensure end-to-end project execution - from system assessment, procurement, configuration, validation, and launch to post-production support. Assist in driving validation and compliance readiness by overseeing the development of validation plans, IQ/OQ/PQ protocols, and related documentation as per GxP and 21 CFR Part 11 guidelines. Ensure regulatory submission readiness by enabling structured document authoring workflows, lifecycle management, and integration with content sources such as Regulatory Veeva RIM. Oversee user access controls, role-based privileges, and audit trail configurations to ensure system integrity and security are maintained. Support the change control process for the submission system by aligning with ITIL standards and ensuring traceability for all updates, patches, and configuration changes. Support the team in developing training materials and conducting hands-on user training to onboard regulatory users and business owners, ensuring effective adoption of the system. Monitor production performance and work with the business team and vendor partner to solve issues, ensuring timely resolution of incidents with minimal impact on business continuity. Lead continuous improvement initiatives to enhance system usability, performance, and regulatory compliance alignment. Track KPIs and provide regular status updates to leadership on system performance, user adoption, and project-level challenges and risks. Contribute to technology roadmap planning by finding opportunities for tool upgrades, integration with newer modules (e.g., Lorenz eValidator), and regulatory intelligence platforms. Act as the SME for structured submissions and represent the function during audits, inspections, and regulatory reviews. Demonstrate adaptability to agile methodology, ensuring flexibility and responsiveness to changing project requirements. Manage and lead teams effectively, fostering collaboration and productivity. Use Jira and ServiceNow for project tracking, issue resolution, and service management.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
The professional we seek is someone with these qualifications.

Basic Qualifications: Master's degree / Bachelor's degree and 5 to 9 years of relevant experience.

Must-Have Skills: Demonstrate a deep understanding of pharma industry regulations and compliance requirements, including FDA and EUCTR. Have good knowledge of submission publishing systems like Lorenz's docuBridge application and Regulatory Veeva RIM. Demonstrated experience in managing technology initiatives and teams with a track record of successful innovation and fostering the development of talent. Must be flexible and able to manage multiple activities and priorities with minimal direction in a rapidly changing and demanding environment. Experience in applying technology standard process methodologies such as Scaled Agile (SAFe) and ITIL. Exceptional collaboration and communication skills; must be flexible and able to manage multiple activities and priorities with minimal direction in a rapidly changing and demanding environment. Possess strong knowledge of information systems and network technologies.

Good-to-Have Skills: Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.). Proficiency in programming languages such as Python, JavaScript, or other programming languages. Outstanding written and verbal communication skills, and ability to translate technical concepts for non-technical audiences. Experience with ETL tools (Informatica, Databricks). Experience with API integrations such as MuleSoft. Solid understanding of and proficiency in writing SQL queries. Hands-on experience with reporting tools such as Tableau, Spotfire & Power BI.

Professional Certifications: Veeva Vault Platform Administrator or Equivalent Vault Certification (Mandatory); SAFe for Teams (Preferred).

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
Posted 1 week ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Position: Capability Lead – GCP (Director/Enterprise Architect)
Location: Chennai, Hyderabad, Bangalore or Noida (Hybrid) - no remote option available
Duration: Full Time
Reporting: Practice Head
Budget: 30-65 LPA (depending on level of expertise)
Notice Period: Immediate joiner / currently serving / notice of less than 60 days
Level of experience: 12+ years
Shift Timings: 2 pm - 11 pm IST, overlapping with the EST time zone

Job Summary: As a key member of the Data business leadership team, the role will be responsible for building and expanding the Google Cloud Platform data and analytics capability within the organization. This individual will drive technical excellence, innovative solution development, and successful delivery of GCP-based data initiatives. The role requires close collaboration with clients, delivery teams, GCP alliance partners, and internal stakeholders to grow GCP offerings, build talent pipelines, and ensure delivery excellence.

Areas of Responsibility
1. Offering and Capability Development: Design and enhance GCP-based data platform offerings and accelerators. Define architectural standards, best practices, and reusable components. Collaborate with alliance teams to strengthen the strategic partnership.
2. Technical Leadership: Provide architectural guidance for data solutions on GCP. Lead solutioning for proposals, RFIs, and RFPs that involve GCP services. Conduct technical reviews to ensure alignment with GCP architecture best practices. Act as the escalation point for complex architecture or engineering challenges.
3. Delivery Oversight: Support project delivery teams with deep technical expertise in GCP. Drive project quality, performance optimization, and technical risk mitigation. Ensure best-in-class delivery aligned with GCP's security, performance, and cost standards.
4. Talent Development: Build and lead a high-performing GCP data engineering and architecture team. Define certification and upskilling paths aligned with GCP learning programs. Mentor team members and foster a culture of technical excellence and knowledge sharing.
5. Business Development Support: Collaborate with sales and pre-sales teams to position solutions effectively. Assist in identifying new opportunities within existing and new accounts. Participate in client presentations, solution demos, and technical workshops.
6. Thought Leadership and Innovation: Develop whitepapers, blogs, and technical assets to showcase GCP leadership. Stay current on market updates and innovations in the data engineering landscape. Contribute to internal innovation initiatives and PoCs involving GCP.

Job Requirements: 12–15 years of experience in Data Engineering & Analytics, with 3–5 years of deep GCP expertise.
Proven experience leading data platforms using GCP technologies (BigQuery, Dataflow, Dataproc, Vertex AI, Looker), containerization (Kubernetes, Docker), API-based microservices architecture, CI/CD pipelines, and infrastructure-as-code tools like Terraform. Experience with tools such as DBT, Airflow, Informatica, Fivetran, and Looker/Tableau, and programming skills in languages such as PySpark, Python, Java, or Scala. Architectural best practices in cloud around user management, data privacy, data security, performance, and other non-functional requirements. Familiarity with building AI/ML models on cloud solutions built in GCP. GCP certifications preferred (e.g., Professional Data Engineer, Professional Cloud Architect). Exposure to data governance, privacy, and compliance practices in cloud environments. Strong presales, client engagement, and solution architecture experience. Excellent communication and stakeholder management skills. Prior experience in IT consulting, system integration, or technology services environments.

About Mastech InfoTrellis: Mastech InfoTrellis is the Data and Analytics unit of Mastech Digital. At Mastech InfoTrellis, we have built intelligent Data Modernization practices and solutions to help companies harness the true potential of their data. Our expertise lies in providing timely insights from your data to make better decisions…FASTER. With our proven strategies and cutting-edge technologies, we foster intelligent decision-making, increase operational efficiency, and drive substantial business growth. With an unwavering commitment to building a better future, we are driven by the purpose of transforming businesses through data-powered innovation. (Who We Are | Mastech InfoTrellis)

Mastech Digital is an Equal Opportunity Employer - all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
Posted 1 week ago
8.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Specialist, Data Architecture
Job Posting Title: Tech Lead, Data Architecture

What does a successful Specialist, Data Architecture do? We are seeking a seasoned Data Architect with extensive experience in architecting ETL solutions with Informatica Data Management Cloud (IDMC)/Informatica Intelligent Cloud Services (IICS) and hands-on experience in designing, implementing, and optimizing enterprise-wide IDMC applications to meet the evolving needs of our organization. As a Data Architect, you will play a pivotal role in ensuring the robustness, scalability, and efficiency of our data systems.

What You Will Do: Collaborate within a team environment in development, testing, and support of software development project lifecycles. Understand incoming data elements, map them with existing data structures, and design data interfaces with underlying business logic. Prepare and review any necessary technical documentation prepared by the team. Track and report daily and weekly activities. Perform code reviews and code remediation. Design unit test scenarios and integration tests. Participate in on-call support for product releases. Research problems discovered by QA or product management and develop solutions to the problems. Perform additional duties as determined by business needs and as directed by management.

What You Will Need To Have: Must have 8 to 10 years of experience in ETL development using Informatica Data Management Cloud (IDMC)/Informatica Intelligent Cloud Services (IICS) with Snowflake as the cloud data warehouse, unit testing, code review, coding standards, Scrum (Agile), and JIRA. Ability to track progress against assigned tasks, report status, and proactively identify issues. Demonstrate the ability to present information effectively in communications with peers and project management teams. Highly organized and works well in a fast-paced, fluid, and dynamic environment. Familiarity with data governance and data security best practices. Demonstrated ability to lead and mentor technical teams.

What Would Be Great To Have: Experience working in a Scrum development team. Banking and Financial Services experience, preferably in the Cards and Payments domain. Knowledge of performance tuning and optimization techniques in IDMC/IICS.

Thank you for considering employment with Fiserv. Please apply using your legal name, complete the step-by-step profile, and attach your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning About Fake Job Posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 1 week ago
4.0 - 5.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Job Requirements: JD for the ETL Snowflake developer role. 4-5 years of strong experience in advanced SQL on any database (preferably Snowflake, Oracle, or Teradata).

Work Experience: Extensive experience in the data integration area and hands-on experience in any of the ETL tools like DataStage, Informatica, SnapLogic, etc. Able to transform technical requirements into data collection queries. Should be capable of working with business and other IT teams and converting the requirements into queries. Good understanding of ETL architecture and design. Good knowledge of UNIX commands, databases, SQL, and PL/SQL. Good to have experience with AWS Glue. Good to have knowledge of the Qlik replication tool.
Posted 1 week ago
10.0 - 15.0 years
25 - 30 Lacs
Hyderabad
Work from Office
We are seeking a highly experienced and passionate Data Engineer to join our growing data team. As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining our robust data infrastructure, ensuring data quality, availability, and scalability. You will play a critical role in empowering our data-driven decision-making by building and optimizing data pipelines, ETL processes, and data warehousing solutions. This is a senior-level position requiring extensive experience with database technologies, data modelling, and cloud platforms.

Responsibilities:
Data Pipeline Development & Maintenance: Design, build, and maintain end-to-end data pipelines for ingesting, processing, and transforming large datasets from various sources (e.g., relational databases, APIs, flat files, streaming data).
ETL/ELT Process Optimization: Develop and optimize ETL/ELT processes using industry-standard tools and techniques, ensuring data accuracy, efficiency, and scalability.
Database Design & Management: Design, implement, and manage relational databases, including schema design, indexing, performance tuning, and data governance.
Data Warehousing: Design and implement data warehousing solutions to support business intelligence and reporting needs. Experience with star schema, snowflake schema, and other data modelling techniques is essential.
Database Administration: Perform database administration tasks, including performance monitoring, capacity planning, backup and recovery, and security management.
Data Quality & Governance: Implement data quality checks, validation rules, and data governance policies to ensure data accuracy and consistency.
Cloud Platform Expertise: Leverage cloud platforms (e.g., AWS, Azure, GCP) for data storage, processing, and management.
Collaboration & Communication: Collaborate with cross-functional teams (e.g., data scientists, business analysts, software engineers) to understand data requirements and deliver effective data solutions. Clearly communicate technical concepts to both technical and non-technical audiences.
Mentoring & Knowledge Sharing: Mentor junior engineers and share knowledge and best practices within the team.
Automation & Scripting: Automate data engineering tasks using scripting languages (e.g., Python, Bash).
Stay Up to Date: Continuously research and evaluate new data technologies and techniques to improve our data infrastructure.
Production Support: Monitor batch jobs daily, on weekdays and weekends alike.
Production Release: Actively participate in the release process.

Qualifications:
Experience: Minimum of 10 years of experience as a Data Engineer or in a similar role.
Database Expertise: Expert proficiency in SQL Server, including database design, performance tuning, query optimization, and database administration. Expert proficiency in PostgreSQL, including database design, performance tuning, query optimization, and database administration. Solid experience with other relational databases like MySQL, Oracle, etc. is a plus.
ETL/ELT Tools: Proven experience with ETL/ELT tools (e.g., Apache Airflow, Informatica, Talend, SSIS, ADF, etc.).
Data Modelling: Strong understanding of data modelling principles and techniques (e.g., dimensional modelling, star schema, snowflake schema).
Cloud Computing: Experience with cloud platforms (AWS, Azure, or GCP) and related data services (e.g., S3, Redshift, Snowflake, Azure Data Lake Storage).
Programming & Scripting: Proficiency in scripting languages such as Python, Bash, or similar.
Data Governance & Quality: Experience implementing data quality checks, data governance policies, and data validation rules.
Problem-Solving & Analytical Skills: Excellent problem-solving and analytical skills with the ability to identify and resolve complex data-related issues.
Communication & Collaboration: Excellent communication, collaboration, and interpersonal skills. Ability to work effectively in a team environment.
Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Bonus Points: Experience with data streaming technologies (e.g., Kafka, Spark Streaming). Experience with data visualization tools (e.g., Tableau, Power BI, Looker). Relevant certifications (e.g., AWS Certified Data Engineer, Microsoft Certified: Azure Data Engineer Associate).
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Description & Requirements

Introduction: A Career at HARMAN Lifestyle. We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. As a member of HARMAN Lifestyle, you connect consumers with the power of superior sound.

About the Role: The person will be responsible for the end-to-end General Trade business of the defined territory. They will be responsible for implementing the overall strategic sales plan, targets, and tools to monitor sales achievements.

What You Will Do: Contribute your talents to high-end, esteemed brands like JBL, Mark Levinson, and Revel. Unite your passion for audio innovation with high-tech product development. Create pitch-perfect, cutting-edge technology that elevates the listening experience. Have the skills necessary to create and maintain a data management framework that establishes roles and responsibilities for data governance and decision making to meet the enterprise's data objectives and goals. Responsible for loading and validating data into the data management framework. Responsible for building end-to-end lineage for all data assets within the data management framework. Investigate data quality related issues, identify root causes, and build solution plans. Investigate data management platform related issues and optimize performance. Have the skills necessary to understand the organization's data and current analytical landscape, including complex data models. Database expert, able to perform environment assessments and optimize performance to the maximum extent within available resources. Knowledge of industry-leading data quality and data protection management practices. Knowledge of data governance practices, business and technology issues related to management of enterprise information assets, and approaches related to data protection. Knowledge of data-related government regulatory requirements and emerging trends and issues. Understand the overall core concepts for analytics and data management. Monitor usage of the data management platform to identify potential capacity overloads and bottlenecks. Collaborate with different stakeholders to identify, define, develop, and implement new requirements. Ability to prioritize tasks and work concurrently on multiple tasks. Good to have advanced statistical and machine learning knowledge to discover similar data and subsets of data, helping users find the most relevant and trusted data the business needs.

What You Need to Be Successful: Informatica, MDM, SQL, Axon, EDC, Data Governance. Minimum 8+ years of experience in MDM, Data Governance, and EDC. Strong experience in team handling. Only local candidates will be considered.

Bonus Points if You Have: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Experience working in cross-functional teams and collaborating effectively with different stakeholders. Strong problem-solving and analytical skills. Excellent communication skills to document and present technical concepts clearly. 8-10 years of relevant and proven experience.

What Makes You Eligible: Be willing to travel up to 30%, including domestic travel. Work location: Bangalore.

What We Offer: Flexible work environment, allowing for full-time remote work globally for positions that can be performed outside a HARMAN or customer location. Access to employee discounts on world-class Harman and Samsung products (JBL, Harman Kardon, AKG, etc.)
Extensive training opportunities through our own HARMAN University. Competitive wellness benefits. Tuition reimbursement. "Be Brilliant" employee recognition and rewards program. An inclusive and diverse work environment that fosters and encourages professional and personal development.
Posted 1 week ago
12.0 - 17.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role you will act as a Business Analyst and Product Analyst for Amgen's enterprise Data Governance initiatives, focusing on tools, platforms, and frameworks that enable data stewardship, metadata management, data quality, security, and lineage.

As a Senior Product Analyst, you will: Collaborate with data governance leads, business SMEs, data engineers, and product managers to gather, document, and manage requirements for Data Governance tools and programs. Lead stakeholder sessions to capture and refine needs across areas such as Metadata Management, Data Quality, Data Lineage, and Data Security. Develop user stories, process flows, and use cases for features related to governance platforms such as Collibra, Informatica, or similar tools. Define scope, success criteria, and KPIs for governance tools implementation and enhancements. Partner with product and engineering teams to translate requirements into technical design specifications and delivery roadmaps. Drive sprint-level execution by facilitating backlog grooming, user story refinement, and feature acceptance reviews. Support enterprise governance teams in evaluating vendor solutions and conducting proof-of-concepts (PoCs). Ensure change control, documentation, and training processes are in place for governance implementations. Apply a deep understanding of enterprise data landscapes and data governance frameworks to assess business impact and drive adoption. Maintain metadata, data catalogs, and stewardship models by ensuring tools reflect current data architecture and ownership.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Principal Product Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders, with these qualifications.

Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of business analyst experience for enterprise Data Governance platforms and programs. Experience with tools like Collibra, Informatica Axon/EDC, Alation, or equivalent. Understanding of the data lifecycle and principles of data quality, data security, metadata, and lineage. Strong key collaborator management and communication skills, including experience with executive-level reporting. Familiarity with Agile development processes and backlog management in JIRA, Azure DevOps, or similar tools. Ability to author and maintain documentation including business requirement documents, process diagrams, and RACI charts. Experience with writing user requirements and acceptance criteria. Affinity to work in a DevOps environment and an Agile mindset. Ability to work in a team environment, effectively interacting with others. Ability to meet deadlines and schedules and be accountable.

Preferred Qualifications: Industry certifications in data governance or data management (e.g., DCAM, CDMP). Understanding of data protection regulations such as GDPR, HIPAA, and CCPA. Experience with cloud-based data architectures (AWS, Azure, GCP). Strong analytical skills with working knowledge of SQL or data profiling tools. Exposure to enterprise MDM, data stewardship workflows, and metadata repositories.
Must-Have Skills: Excellent problem-solving skills and a passion for solving complex challenges in Data Governance areas. Experience with Agile software development methodologies (Scrum). Superb communication skills and the ability to work with senior leadership with confidence and clarity. Experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA. Experience in managing product features for PI planning and developing product roadmaps and user journeys.

Good-to-Have Skills: Demonstrated expertise in data and analytics and related technology concepts. Understanding of data and analytics software systems strategy, governance, and infrastructure. Familiarity with low-code, no-code test automation software. Technical thought leadership. Able to communicate technical or complex subject matters in business terms. Jira Align experience. Experience with DevOps, Continuous Integration, and Continuous Delivery methodology.

Soft Skills: Able to work under minimal supervision. Excellent analytical and gap/fit assessment skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.
Posted 1 week ago
3.0 - 8.0 years
7 - 12 Lacs
Bengaluru
Work from Office
The Data Engineer II will work closely with clients and provide technical consulting services, configuration of the elluminate platform, and development for specific projects that include trial configuration, quality control, process improvements, system validation, custom analytics development, clinical software implementations and integrations, platform configuration, and ETL development. The Data Engineer II will engage in technical development and implementation of various software service delivery related activities. As a Data Engineer II, you will play a crucial role in providing high-level technical consulting services and configuring the elluminate platform. You will oversee specific projects, including trial configuration, quality control, and project management.

KEY TASKS & RESPONSIBILITIES: Collaborating closely with clients to deliver technical consulting services and configure the elluminate platform. Guiding and supporting the team of data engineers on various technical service delivery activities. Designing, developing, testing, and deploying efficient SQL code to support SDTM, custom reports, and visualizations using tools like MS SQL, elluminate Mapper, and Qlik. Providing technical guidance, training, and support to team members and users on processes, technology, and products. Managing multiple timelines and deliverables for single or multiple clients, and handling client communications as assigned. Possessing in-depth knowledge of at least one elluminate module, with hands-on experience in all other modules. Delivering proactive technical support for all client-reported support tickets. Facilitating client onboarding workshops and conducting training sessions for end users on the elluminate platform. Configuring, migrating, and supporting the elluminate platform for assigned clients. Creating and maintaining all required specifications and quality control documents as per SOPs and processes.

CANDIDATE'S PROFILE
Education & Experience: 3+ years of professional experience preferred. Bachelor's degree or equivalent experience preferred. Experience developing back-end, database/warehouse architecture, design, and development preferred. Knowledge of a variety of data platforms including SQL Server, DB2, and Teradata (cloud-based databases a plus). Understanding of cloud / hybrid data architecture concepts is a plus. Knowledge of clinical trial data is a plus - CDISC ODM, SDTM, or ADaM standards. Experience in the Pharmaceutical / Biotechnology / Life Science industry is a plus.

Professional Skills: Ability to work with different technical and cross-functional teams. Must be proactive, demonstrate initiative, and be a logical thinker. Must be team oriented with strong collaboration, prioritization, and adaptability skills. Good understanding of technical challenges and capability to analyze requirements and technical problems. Excited to learn new tools and product modules and adapt to changing technology and requirements. Excellent knowledge of English; verbal and written communication skills. Ensuring compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures. Demonstrating strong analytical and problem-solving skills to identify issues and develop creative solutions that drive results. Conveying information clearly and concisely to diverse audiences, facilitating understanding and collaboration. Working effectively in a team environment, contributing to group objectives, and supporting colleagues.
Adapting to changing circumstances and accepting new challenges with a positive attitude. Understanding clinical trial data and applying CDISC SDTM standards. Performing other duties as assigned.

Technical Skills: Proficient in SQL, T-SQL, and PL/SQL programming. Experience in Microsoft Office applications, specifically MS Project and MS Excel. Familiarity with multiple database platforms: Oracle, SQL Server, Teradata, DB2. Familiarity with data reporting tools: QlikSense, QlikView, Spotfire, Tableau, JReview, Business Objects, Cognos, MicroStrategy, IBM DataStage, Informatica, Spark, or related. Familiarity with other languages and concepts: .NET, C#, Python, R, Java, HTML, SSRS, AWS, Azure, Spark, REST APIs, Big Data, ETL, Data Pipelines, Data Modelling, Data Analytics, BI, Data Warehouse, Data Lake, or related.
Posted 1 week ago
7.0 - 12.0 years
12 - 16 Lacs
Bengaluru
Work from Office
The Data Engineer, Sr - II will work closely with clients and provide high-level technical consulting services, configuration of the elluminate platform, and development and oversight for specific projects that include trial configuration, quality control, project management, assessing new technologies, process improvements, system validation, SOP development, clinical software implementations and integrations, platform configuration, ETL, and custom analytics development. As a Data Engineer, Sr - II you will serve as the primary technical lead, spearheading consulting efforts related to clinical systems software. You will also design and develop reporting programs as needed.

KEY TASKS & RESPONSIBILITIES: Leading consulting efforts and providing high-level technical consulting services to clients, including configuring the elluminate platform and overseeing specific projects such as trial configuration, quality control, and project management. Guiding and supporting the team of data engineers on various technical service delivery activities. Designing, developing, testing, and deploying efficient SQL code to support SDTM, custom reports, and visualizations using tools like MS SQL, elluminate Mapper, and Qlik. Managing multiple timelines and deliverables for single or multiple clients, and handling client communications as assigned. Demonstrating in-depth knowledge of at least one elluminate module, with hands-on experience in all other modules. Delivering proactive technical support for all client-reported support tickets. Facilitating client onboarding workshops and conducting training sessions for end users on the elluminate platform. Configuring, migrating, and supporting the elluminate platform for assigned clients. Creating and maintaining all required specifications and quality control documents as per SOPs and processes. Ensuring compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures. Demonstrating strong analytical and problem-solving skills to identify issues and develop creative solutions that drive results. Conveying information clearly and concisely to diverse audiences, facilitating understanding and collaboration.
CANDIDATE'S PROFILE: 7+ years of professional experience in a Services or Consulting role preferred. Bachelor's or Master's degree in a science, technical, or business discipline or equivalent experience preferred. 5+ years of database design and development experience preferred. Understanding of cloud / hybrid data architecture concepts is a plus. Prior experience in customer-facing roles is a plus. Knowledge of clinical trial data is a plus - CDISC ODM, SDTM, or ADaM standards. Experience in the Pharmaceutical / Biotechnology / Life Science industry is a plus.

Professional Skills & Experience: Critical observation and communication skills to identify any gaps in processes or products. Ability to work with various technical and non-technical teams, both internal to eCS and clients. Must be team oriented with strong collaboration, prioritization, and adaptability skills. Excellent knowledge of English; verbal and written communication skills with the ability to interact with users and clients, providing solutions. Experience in the Life Sciences industry, CRO / clinical trial regulated environment preferred. Experience in regulatory computer systems validation a strong plus.

Technical Skills & Experience: Proficient in SQL, T-SQL, or PL/SQL programming, or R, Python, or SAS. Proficiency in Microsoft Office applications, specifically MS Project and MS Excel. Experience with multiple database platforms: Oracle, SQL Server, Teradata, DB2. Working effectively in a team environment, contributing to group objectives, and supporting colleagues. Adapting to changing circumstances and accepting new challenges with a positive attitude. Understanding clinical trial data and applying CDISC SDTM/ADaM/TLF standards. Having a good understanding of clinical data collection tools like RAVE, InForm, Veeva EDC, Siebel CTMS, and analytics tools. Performing other duties as assigned. Experience with data reporting tools: QlikSense, QlikView, Spotfire, Tableau, JReview, Business Objects, Cognos, MicroStrategy, IBM DataStage, Informatica, Spark, or related. Familiarity with other languages and concepts: .NET, C#, Python, R, Java, HTML, SSRS, AWS, Azure, Spark, REST APIs, Big Data, ETL, Data Pipelines, Data Modelling, Data Analytics, BI, Data Warehouse, Data Lake, or related.
Posted 1 week ago
9.0 - 14.0 years
45 - 50 Lacs
Bengaluru
Work from Office
We're looking for a diverse group of collaborators who believe data has the power to improve society. Adventurous minds who value solving some of the world's most challenging problems. Here, employees are encouraged to push their boldest ideas forward, united by a passion to create a world where data improves the quality of life for people and businesses everywhere.

Principal Performance Engineer - Bangalore. We're looking for a Principal Performance Engineer candidate with experience in performance engineering and benchmarking for Java enterprise applications. You will report to the Senior Manager, Engineering. Hybrid work mode.

Technology You'll Use: Java, JMeter, Shell/Python, Prometheus, Grafana, ELK (Logstash) and APM, AWS, MongoDB, Kubernetes, Spark, Kafka, Hadoop, network protocols, Linux internals, and operating system performance tuning.

Your Role Responsibilities - Here's What You'll Do

Essentials: The successful candidate will be based in Bangalore, India and will:
Strategic Performance Leadership: Define and evangelize the performance engineering strategy and roadmap for the product, aligning with the overall product vision and business objectives.
Architecture & Design Influence: Collaborate closely with product management, architects, and development teams to embed performance considerations into the early stages of the software development lifecycle (SDLC), influencing architectural decisions and design patterns for optimal performance and scalability.
Performance Testing & Analysis: Lead the design, development, and execution of comprehensive performance testing strategies (load, stress, scalability, endurance, resilience) across various components and layers of the cloud-based platform. This includes API services, data ingestion pipelines, metadata stores, search engines, and user interfaces.
Bottleneck Identification & Root Cause Analysis: Utilize advanced profiling, monitoring, and tracing tools to identify and deep-dive into performance bottlenecks within the application, database, infrastructure, and cloud services. Conduct root cause analysis and provide actionable recommendations for resolution.
Performance Optimization: Drive optimization efforts by collaborating with development teams on code improvements, database tuning, infrastructure scaling, and cloud resource utilization.
Tooling & Framework Development: Develop and maintain robust, automated performance testing frameworks, tools, and simulations tailored for cloud-native distributed systems. Evaluate and integrate new performance engineering technologies and best practices.
Monitoring & Observability: Establish and enhance comprehensive performance monitoring and observability capabilities in production, defining key performance indicators (KPIs), metrics, and dashboards to proactively identify and address performance regressions.
Mentorship & Thought Leadership: Mentor and guide a team of performance engineers, fostering their technical growth. Act as a subject matter expert and thought leader in performance engineering, contributing to industry best practices and internal knowledge sharing.
Capacity Planning & Forecasting: Collaborate with operations and SRE teams on capacity planning, predicting future performance needs based on growth projections and usage patterns.
Cross-Functional Collaboration: Partner effectively with cross-functional teams including SRE, DevOps, security, and customer support to ensure optimal system performance and reliability across the entire product lifecycle.
What We'd Like to See: BTech/MS in Computer Science, Computer Engineering, or an equivalent technical degree. 9+ years of experience in performance engineering, with at least 3+ years in a Principal or Lead role focusing on cloud-native applications and distributed systems. Deep expertise in performance testing methodologies, tools (e.g., JMeter, Gatling, k6), and frameworks. Strong understanding of cloud platforms (AWS, Azure, GCP - specific expertise in one or more) and their services (e.g., Kubernetes, serverless functions, message queues, managed databases, storage solutions). Proficiency in programming languages commonly used in performance engineering and cloud development (e.g., Java, Python). Extensive experience with performance monitoring and observability tools (e.g., Prometheus, Grafana, ELK (Logstash) and APM). Solid understanding of distributed systems concepts, microservices architectures, and their performance implications. Experience with relational and NoSQL databases (e.g., Graph DB, Elasticsearch) and their performance tuning. Experience with CI/CD pipelines and integrating performance testing into the development workflow. Proven ability to lead and mentor other engineers.

Perks & Benefits: Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and 401k plan or international pension/retirement plans). Flexible time-off policy and hybrid working practices. Equity opportunities and an employee stock purchase program (ESPP). Comprehensive Mental Health and Employee Assistance Program (EAP) benefit.
Posted 1 week ago
10.0 - 15.0 years
30 - 35 Lacs
Bengaluru
Work from Office
We're hiring a Cloud Operations Manager - TechSales to lead our global cloud infrastructure team in Bangalore (Hybrid). You'll ensure secure, scalable, cost-effective cloud environments for demos and enablement, leading a global support team and collaborating with Sales, Cloud Architects, and Partners to drive technical sales excellence. You will report to the Director, Technical Sales Support.

You will work with Technical Sales systems to build automation that streamlines workflows, reduces cloud costs, and speeds environment readiness. Partnering with R&D and DevOps, you'll manage platform operations like patching and upgrades to ensure uninterrupted demos. Regular collaboration with IT, Security, and InfoSec teams will maintain compliance. Success requires expertise in hybrid/multi-cloud architecture, security, and governance. You'll also mentor your team, coordinate with Sales and IT, and manage 24/7 support for reliable operations.

Technology You'll Use: Cloud platforms (AWS, Azure, GCP, Oracle Cloud), Docker, Kubernetes, infrastructure automation, monitoring.

Your Role Responsibilities - Here's What You'll Do: Lead a global operations and support team providing 24x5 support, provisioning, monitoring, and incident response for technical sales cloud environments used in demos, PoCs, and training. Provision, maintain, and decommission cloud infrastructure (AWS, Azure, GCP, and Oracle Cloud resources), helping all Technical Sales team members perform demo-related activities. Ensure systems uptime and reliability through monitoring, alerting, and observability frameworks. Perform performance tuning of runtime environments (VMs, Docker containers, and hybrid deployments). Improve automation plans to refine provisioning and deployment of technical sales resources. Collaborate with Technical Sales, IT, Partner teams, and product teams to align priorities and enhance infrastructure. Lead infrastructure projects focused on improving performance, reliability, scalability, and cost control. Implement resource governance to ensure compliance and manage cloud spend. Oversee risk and incident management with clear escalation and prompt resolution. Forecast resource usage and capacity planning to balance demand without overspending. Develop operational roadmaps and provide transparent reporting on environment health and indicators to leadership.

What We'd Like to See: 10+ years in cloud infrastructure/DevOps or operations, with 3+ years in leadership. Experience managing cloud-based technical sales environments on AWS, Azure, GCP, or Oracle Cloud.
Expertise in mission-critical global environments, cloud security, compliance, and multi-cloud operations. Experience with global support models and on-call management. Proficient in infrastructure automation, scripting, Docker, and Kubernetes. Analytical, with an ability to improve cloud spend using usage data. Able to translate technical needs into scalable solutions.

Preferred Skills: Knowledge of the Informatica IDMC platform (runtime, configuration, deployment, and tuning). Experience supporting Sales demos, PoCs, and enablement platforms. Proven collaboration with Technical Sales, GSI, and Partner teams. Project management skills across infrastructure projects.

Perks & Benefits: Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and 401k plan or international pension/retirement plans). Flexible time-off policy and hybrid working practices. Equity opportunities and an employee stock purchase program (ESPP). Comprehensive Mental Health and Employee Assistance Program (EAP) benefit.
Posted 1 week ago
5.0 - 9.0 years
5 - 9 Lacs
Hyderābād
On-site
India - Hyderabad
JOB ID: R-220243
ADDITIONAL LOCATIONS: India - Hyderabad
WORK LOCATION TYPE: On Site
DATE POSTED: Jul. 11, 2025
CATEGORY: Information Systems

Join Amgen’s Mission of Serving Patients: At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Sr Associate IS Analyst - docuBridge

What you will do: The Sr Associate IS Analyst – DocuBridge will play a key role in the implementation and lifecycle management of structured regulatory submission solutions, with a main focus on Lorenz DocuBridge. This role demands strong technical expertise, hands-on experience with Lorenz DocuBridge, and the ability to assist in managing multi-functional partner relationships across business, IT, and vendor teams. The candidate will collaborate with internal Regulatory Affairs team members and vendor partners to ensure accurate interpretation of requirements, delivery of compliant submissions, and successful deployment of the solution. The role includes assisting in validation testing, user grievance resolution, and overall user experience optimization. Timely submission is critical, and the candidate must support the project to closure within set timelines and quality standards.

Assist in the implementation and system ownership of the Lorenz DocuBridge Suite, ensuring the solution meets both global and regional regulatory requirements (e.g., eCTD & NeeS). Collaborate with multi-functional partners including Regulatory Affairs, Quality Assurance, IT Security, and vendor teams to gather detailed business requirements and translate them into scalable, compliant technical solutions. Support the team in defining and managing project plans, timelines, resource allocation, and key landmarks to ensure end-to-end project execution - from system assessment, procurement, configuration, validation, and launch to post-production support. Assist in driving validation and compliance readiness by overseeing the development of validation plans, IQ/OQ/PQ protocols, and related documentation as per GxP and 21 CFR Part 11 guidelines. Ensure regulatory submission readiness by enabling structured document authoring workflows, lifecycle management, and integration with content sources such as Regulatory Veeva RIM. Oversee user access controls, role-based privileges, and audit trail configurations to ensure system integrity and security are maintained. Support the change control process for the submission system by aligning with ITIL standards and ensuring traceability for all updates, patches, and configuration changes. Support the team in developing training materials and conducting hands-on user training to onboard regulatory users and business owners, ensuring effective adoption of the system.
Monitor production performance and work with the business team and vendor partner to solve issues, ensuring timely resolution of incidents with minimal impact on business continuity. Lead continuous improvement initiatives to enhance system usability, performance, and regulatory compliance alignment. Track KPIs and provide regular status updates to leadership on system performance, user adoption, and project-level challenges and risks. Contribute to technology roadmap planning by finding opportunities for tool upgrades, integration with newer modules (e.g., Lorenz eValidator), and regulatory intelligence platforms. Act as the SME for structured submissions and represent the function during audits, inspections, and regulatory reviews. Demonstrate adaptability to agile methodology, ensuring flexibility and responsiveness to changing project requirements. Manage and lead teams effectively, fostering collaboration and productivity. Use Jira and ServiceNow for project tracking, issue resolution, and service management.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.

Basic Qualifications: Master's degree / Bachelor's degree and 5 to 9 years of relevant experience.

Must-Have Skills: Demonstrate a deep understanding of pharma industry regulations and compliance requirements, including FDA and EUCTR. Have good knowledge of submission publishing systems like Lorenz’s docuBridge application and Regulatory Veeva RIM. Demonstrated experience in managing technology initiatives and teams with a track record of successful innovation and fostering the development of talent. Must be flexible and able to manage multiple activities and priorities with minimal direction in a rapidly changing and demanding environment. Experience in applying technology standard process methodologies such as Scaled Agile (SAFe) and ITIL. Exceptional collaboration and communication skills; must be flexible and able to manage multiple activities and priorities with minimal direction in a rapidly changing and demanding environment. Possess strong knowledge of information systems and network technologies.

Good-to-Have Skills: Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.). Proficiency in programming languages such as Python, JavaScript, or other programming languages. Outstanding written and verbal communication skills, and ability to translate technical concepts for non-technical audiences. Experience with ETL tools (Informatica, Databricks). Experience with API integrations such as MuleSoft. Solid understanding of and proficiency in writing SQL queries. Hands-on experience with reporting tools such as Tableau, Spotfire & Power BI.

Professional Certifications: Veeva Vault Platform Administrator or Equivalent Vault Certification (Mandatory); SAFe for Teams (Preferred).

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
4.0 years
7 - 8 Lacs
Hyderābād
On-site
Job title: Data Engineer
Location: Hyderabad

About Sanofi: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people’s lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions.

Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives.

Who You Are: You are a dynamic Data Engineer interested in challenging the status quo to design and develop globally scalable solutions that are needed by Sanofi’s advanced analytics, AI, and ML initiatives for the betterment of our global patients and customers. You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business use needs. You have a keen eye for improvement opportunities while continuing to fully comply with all data quality, security, and governance standards.

Our vision for digital, data analytics and AI: Join us on our journey in enabling Sanofi’s digital transformation through becoming an AI-first organization. This means:
AI Factory - Versatile Teams Operating in Cross Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
Leading Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack.
World Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skillsets. Job Highlights: Propose and establish technical designs to meet business and technical requirements Develop and maintain data engineering solutions based on requirements and design specifications using appropriate tools and technologies Create data pipelines / ETL pipelines and optimize performance Test and validate developed solutions to ensure they meet requirements Create design and development documentation based on standards for knowledge transfer, training, and maintenance Work with business and products teams to understand requirements, and translate them into technical needs Adhere to and promote best practices and standards for code management, automated testing, and deployments Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases Develop automated tests for CI/CD pipelines Gather/organize large & complex data assets, and perform relevant analysis Conduct peer reviews for quality, consistency, and rigor for production-level solutions Actively contribute to Data Engineering community and define leading practices and frameworks Communicate results and findings in a clear, structured manner to stakeholders Remain up to date on the company’s standards, industry practices and emerging technologies Key Functional Requirements & Qualifications: Experience working with cross-functional teams to solve complex data architecture and engineering problems Demonstrated ability to learn new data and software engineering technologies in a short amount of time Good understanding of agile/scrum development processes and concepts Able to work in a fast-paced, constantly evolving environment and manage multiple priorities Strong technical analysis and problem-solving skills related to data and technology solutions Excellent written, verbal, and interpersonal skills with ability to communicate ideas, concepts and solutions to peers and leaders Pragmatic and capable of solving complex issues, with technical intuition and attention to detail Service-oriented, flexible, and approachable team player Fluent in English (Other languages a plus) Key Technical Requirements & Qualifications: Bachelor’s Degree or equivalent in Computer Science, Engineering, or relevant field 4 to 5+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or comparable role with relevant technologies and tools, such as Spark/Scala, Informatica/IICS/Dbt Understanding of data structures and algorithms Working knowledge of scripting languages (Python, Shell scripting) Experience in cloud-based data platforms (Snowflake is a plus) Experience with job scheduling and orchestration (Airflow is a plus; an illustrative DAG sketch follows this posting) Good knowledge of SQL and relational database technologies/concepts Experience working with data models and query tuning Nice to haves: Experience working in life sciences/pharmaceutical industry is a plus Familiarity with data ingestion through batch, near real-time, and streaming environments Familiarity with data warehouse concepts and architectures (data mesh a plus) Familiarity with Source Code Management Tools (GitHub a plus) Why choose us? Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally.
Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs and at least 14 weeks’ gender-neutral parental leave. Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas. Pursue Progress . Discover Extraordinary . Progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let’s pursue Progress. And let’s discover Extraordinary together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!
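The Data Engineer posting above asks for building and orchestrating ETL pipelines and lists Airflow as a plus. As a purely illustrative sketch (not part of the job description), a minimal daily pipeline might look like the following; the DAG name, file paths, and cleansing rules are hypothetical placeholders.

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    # Hypothetical source: in practice this could be Informatica/IICS output,
    # a REST API, or a raw file landed in cloud storage.
    df = pd.read_csv("/tmp/raw_orders.csv")
    df.to_csv("/tmp/staged_orders.csv", index=False)


def transform_load(**_):
    df = pd.read_csv("/tmp/staged_orders.csv")
    # Basic cleansing: drop duplicates and rows missing the business key.
    df = df.drop_duplicates().dropna(subset=["order_id"])
    # The load step is stubbed out; a real pipeline would write to a
    # warehouse table (for example Snowflake) instead of a local file.
    df.to_csv("/tmp/curated_orders.csv", index=False)


with DAG(
    dag_id="orders_daily_etl",        # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_load", python_callable=transform_load)
    extract_task >> load_task
```

In a production setting the same structure would typically add automated tests and CI/CD deployment of the DAG, in line with the responsibilities listed in the posting.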
Posted 1 week ago
15.0 years
0 Lacs
Hyderābād
On-site
Project Role : Program/Project Management Lead Project Role Description : Manage overall delivery of a program or project to achieve business outcomes. Define project scope and monitor execution of deliverables. Communicate across multiple stakeholders to manage expectations, issues and outcomes. Must have skills : Stibo Product Master Data Management Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As a Program/Project Management Lead, you will manage the overall delivery of a program or project to achieve business outcomes. Your typical day will involve defining project scope, monitoring the execution of deliverables, and communicating with multiple stakeholders to effectively manage expectations, issues, and outcomes. You will play a crucial role in ensuring that the project aligns with the strategic goals of the organization while fostering collaboration among team members and stakeholders. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate regular team meetings to ensure alignment and address any challenges. - Mentor junior team members to enhance their skills and professional growth. Professional & Technical Skills: - Must-Have Skills: Proficiency in Informatica MDM. - Strong understanding of data integration and data quality management. - Experience with project management methodologies such as Agile or Waterfall. - Ability to analyze project risks and develop mitigation strategies. - Excellent communication and interpersonal skills to engage with stakeholders. Additional Information: - The candidate should have a minimum of 5 years of experience in Informatica MDM. - This position is based at our Hyderabad office. - A 15 years full time education is required.
Posted 1 week ago
5.0 years
6 - 9 Lacs
Bengaluru
On-site
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as, technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Career Level - IC3 As a Sr. Support Engineer, you will be the technical interface to customers, Original Equipment Manufacturers (OEMs) and Value-Added Resellers (VARs) for resolution of problems related to the installation, recommended maintenance and use of Oracle products. Have an understanding of all Oracle products in their competencies and in-depth knowledge of several products and/or platforms. Also, you should be highly experienced in multiple platforms and be able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues. RESPONSIBILITIES: To manage and resolve Service Requests logged by customers (internal and external) on Oracle products and contribute to proactive support activities according to product support strategy and model Provide expert-level troubleshooting and technical support for Oracle’s Big Data Service (BDS) , DFS, DIS, Data Catalog, and associated cloud services Diagnose and resolve complex issues across the Hadoop ecosystem (e.g., HDFS, YARN, Spark, Hive, Impala, Sqoop, Oozie) Manage cluster configurations, upgrades, patches, and installations using tools like Ambari Support real-time data processing frameworks (Kafka, Flink) and ETL pipelines (ODI, Informatica) Collaborate with OCI platform teams to support secure and scalable AI/ML data workflows Engage in hands-on support for agentic frameworks (LangChain, Semantic Kernel, CrewAI) and RAG-based systems Interact regularly with customers, build technical documentation, and contribute to knowledge sharing Collaborate cross-functionally with product engineering, infrastructure, and cloud ops teams for holistic support delivery Qualifications Bachelor’s degree in Computer Science, Engineering, or a related technical field 5+ years of proven experience supporting Oracle Big Data platforms including Oracle’s Big Data Service (BDS) , DFS, DIS, Data Catalog, and Oracle Cloud Infrastructure (OCI) Strong expertise in Hadoop ecosystem: HDFS, YARN, Spark, Hive, Impala, Sqoop, Oozie, Ranger, Kerberos Experience in Linux OS administration, networking, TLS/SSL, and SSO integration Experience with data integration tools (ODI/Informatica) and cloud data sources (FusionApps/BICC, Snowflake) Hands-on experience with LLMs, agentic frameworks (LangChain, Semantic Kernel, CrewAI), RAG pipelines, and vector databases (FAISS, Pinecone, Weaviate) Proficiency in Python and Shell scripting Skills & Competencies Deep understanding of Oracle’s Big Data Service (BDS) , Data Flow Service (DFS), Data Integration Service(DIS), Data Catalog architecture and operations Cluster administration using Ambari and troubleshooting across the Cloudera stack Real-time processing using Kafka, Flink AI/ML workflow support, including OCI Gen AI services and integration of agentic pipelines Working knowledge of cloud 
services, networking, system-level security, and distributed architectures Experience supporting multi-tier enterprise applications Personal Competencies Strong customer focus with ability to handle escalations and technical deep dives Structured problem-solving mindset Self-motivated with a continuous learning attitude Excellent communication, documentation, and global collaboration skills Results-oriented with a passion for service quality and technical excellence
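The support role above spans the Hadoop ecosystem (HDFS, YARN, Spark, Hive). Purely as an illustration of the kind of job an engineer might reproduce while diagnosing a customer-reported Spark or Hive issue, here is a minimal PySpark sketch; the database, table, column, and output path are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark read tables registered in the Hive metastore.
spark = (
    SparkSession.builder
    .appName("bds_support_repro")      # hypothetical application name
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical source table; a real case would come from the customer's cluster.
events = spark.table("sales_db.events")

# A simple aggregation of the sort that often surfaces skew or partition issues.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date")
    .count()
)

# Write partitioned Parquet back to HDFS (the path is a placeholder).
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "hdfs:///tmp/support/daily_counts"
)

spark.stop()
```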
Posted 1 week ago
5.0 - 7.0 years
2 - 9 Lacs
Bengaluru
On-site
Associate Business Analyst Category: Supply Chain Location: Bangalore, Karnataka, IN Novo Nordisk Global Business Services (GBS) India Department – DSC-OPS Master Data, GBS Are you passionate about supply chain management and master data? Do you thrive in a dynamic environment where accuracy and efficiency are key? If you are ready to take your career to the next level, we have the perfect opportunity for you! Join us as an Associate Business Analyst at Novo Nordisk and help shape the future of our supply chain operations. Read more and apply today. Apply Now! The position As an Associate Business Analyst at Novo Nordisk, you will be responsible for creating and maintaining master data in SAP and Winshuttle in alignment with established business processes and rules. You will also handle Change Requests (CR-cases) and Development Requests (DV) related to master data creation. You will be entrusted with the below responsibilities: Create and maintain master data for raw, semi-finished, and finished goods in SAP ECC and Winshuttle, including Bill of Materials (BOM) individually and in mass, and manage change-controlled objects using Engineering Change Master. Support global serialization master data maintenance by updating specification sheets, addressing ad hoc requests, and executing related support tasks. Manage material master data across the product life cycle—including creation, maintenance, and deactivation—by leading and documenting product change control, analyzing material usage and inventory trends, delivering actionable reports, and supporting cross-functional projects to ensure timely execution. Perform data cleansing to ensure accuracy and integrity and create and maintain Standard Operating Procedures (SOPs) and work instructions in compliance with Novo Nordisk standards. Identify process improvement opportunities within master data management and standardization and ensure creation and maintenance of SOPs and work instructions in alignment with Novo Nordisk standards. Manage stakeholder relationships effectively, support new process implementations, ensure adherence to defined KPIs, and contribute to training and onboarding of new joiners. Qualifications Bachelor’s degree in supply chain management, production, mechanical engineering, or equivalent from a well-recognised institute. 5 to 7 years of experience within SAP master data, preferably within pharma or supply chain. Experience with master data. Ability to analyse and process data. Must have experience in Product Life Cycle Management, S/4HANA, Winshuttle, SAP ECC, and Master Data Management. Good understanding of supply chain concepts (Plan, Make, Source, Deliver, and Return) and the supporting master data. Proficient user of Microsoft Office (Excel, PowerPoint). Experience in automation with advanced Excel and building macros or have ETL knowledge with Informatica/Winshuttle. Experience in conducting meetings with peers, including preparation and facilitation. Knowledge of business rules for processes and attributes within SAP. Excellent communication skills in English, both written and oral. About the department Supply Chain was established in March 2017 as part of Product Supply Devices & Supply Chain Management business plan. The Business plan has three parts; Robust, Ready, Effective and Supply Chain Global Business Service (GBS) is part of the last one focusing business through offshoring. 
The unit is anchored under Supply Chain Planning (SCP) at headquarters and is the agreed place to consolidate Supply Chain activities across Novo Nordisk. The Supply Chain offshoring journey has started in D&S, the Service Delivery Catalogue is taking form, and other areas within Product Supply can soon join or add to the Catalogue to optimize costs and reduce complexity by operating an effective supply chain. Working at Novo Nordisk Novo Nordisk is a leading global healthcare company with a 100-year legacy of driving change to defeat serious chronic diseases. Building on our strong legacy within diabetes, we are growing massively and expanding our commitment, reaching millions around the world and impacting more than 40 million patient lives daily. All of this has made us one of the 20 most valuable companies in the world by market cap. Our success relies on the joint potential and collaboration of our more than 72,000 employees around the world. We recognize the importance of the unique skills and perspectives our people bring to the table, and we work continuously to bring out the best in them. Working at Novo Nordisk, we’re working toward something bigger than ourselves, and it’s a collective effort. Join us! Together, we go further. Together, we’re life changing. Contact To submit your application, please upload your CV online (click on Apply and follow the instructions). Deadline 21st July 2025 Disclaimer It has been brought to our attention that there have recently been instances of fraudulent job offers, purporting to be from Novo Nordisk and/or its affiliate companies. The individuals or organisations sending these false employment offers may pose as a Novo Nordisk recruiter or representative and request personal information, purchasing of equipment, or funds to further the recruitment process or offer paid trainings. Be advised that Novo Nordisk does not extend unsolicited employment offers. Furthermore, Novo Nordisk does not charge prospective employees fees or make requests for funding as a part of the recruitment process. We commit to an inclusive recruitment process and equality of opportunity for all our job applicants. We’re not your typical healthcare company. In a modern world of quick fixes, we focus on solutions to defeat serious chronic diseases and promote long-term health. Our unordinary mindset is at the heart of everything we do. We seek out new ideas and put people first as we push the boundaries of science, make healthcare more accessible, and treat, prevent, and even cure diseases that affect millions of lives. Because it takes an unordinary approach to drive real, lasting change in health.
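The Associate Business Analyst role above includes data cleansing and accuracy checks on material master data. Purely as an illustrative sketch (the file and column names are hypothetical, not Novo Nordisk's actual SAP data model), a quick quality check on an extracted material list might look like this:

```python
import pandas as pd

# Hypothetical extract of material master records (e.g. pulled via Winshuttle or a table export).
materials = pd.read_csv("material_master_extract.csv")

# Duplicate business keys are a classic master-data defect.
duplicates = materials[materials.duplicated(subset=["material_number"], keep=False)]

# Mandatory attributes that should never be blank (illustrative list only).
mandatory = ["material_number", "material_type", "base_unit_of_measure", "plant"]
missing = materials[materials[mandatory].isna().any(axis=1)]

print(f"{len(duplicates)} records share a duplicated material number")
print(f"{len(missing)} records are missing mandatory attributes")

# Findings could then be exported for correction through the change request (CR) process.
duplicates.to_csv("duplicate_materials.csv", index=False)
missing.to_csv("incomplete_materials.csv", index=False)
```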
Posted 1 week ago
7.0 years
4 - 5 Lacs
Noida
On-site
Country India Working Schedule Full-Time Work Arrangement Hybrid Relocation Assistance Available No Posted Date 11-Jul-2025 Job ID 10082 Description and Requirements Position Summary The engineer role is to support external data transmission, operations, scheduling and middleware transmission. The role requires experience in Windows and Linux environments and knowledge of Informatica MFT & Data Exchange tools. Should be able to handle day-to-day customer transmissions and Informatica MFT/DX activities. Job Responsibilities Design and implement complex integration solutions through collaboration with engineers, application teams and operations team across the global enterprise Provide technical support to application developers when required. This includes promoting use of best practices, ensuring standardization across applications and troubleshooting Able to create new setups and support existing transmissions Able to diagnose and troubleshoot transmission and connection issues (a small connectivity-check sketch follows this posting) Experience in Windows administration and good to have expertise in IBM workload scheduler Hands-on experience in tools like IIS, Informatica MFT & DX console, Splunk and IBM workload scheduler Responsibilities also include planning, engineering, and implementation of new transmissions as well as migration of setups The role will participate in the evaluation and recommendation of new products and technologies The role will also represent the domain in relevant automation and value innovation efforts Technical leadership, ability to think strategically and effectively communicate solutions to a variety of stakeholders Able to debug production issues by analyzing the logs directly and using tools like Splunk. Learn new technologies based on demand and help team members by coaching and assisting Willing to work in rotational shifts Good communication skills with the ability to communicate clearly and effectively Knowledge, Skills and Abilities Education Bachelor's degree in computer science, Information Systems, or related field Experience 7+ years of total experience and at least 4+ years of experience in the design and implementation of complex integration solutions through collaboration with engineers, application and operations teams Create new setups and support existing transmissions Experience in tools like IIS, Informatica MFT & DX console, Splunk and IBM workload scheduler SSH/SSL/Tectia Microsoft IIS IBM Connect:Direct IBM Sterling Informatica MFT Operating System Knowledge (Linux/Windows/AIX) Troubleshooting Azure Dev Ops Pipeline Knowledge Mainframe z/OS Knowledge Open Shift and Kube Enterprise Scheduling Knowledge (Maestro) Good to Have : Python and/or Powershell Agile SAFe for Teams Ansible (Automation) Elastic Other Requirements (licenses, certifications, specialized training – if required) Working Relationships Internal Contacts (and purpose of relationship): MetLife internal partners External Contacts (and purpose of relationship) – If Applicable MetLife external partners About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.
Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
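The transmission-support role above involves diagnosing connection failures to partner endpoints (SFTP, IBM Connect:Direct, and similar listeners). The standard-library sketch below simply checks TCP reachability for a list of hypothetical endpoints; it is an illustration, not an Informatica MFT or MetLife tool, and the hostnames and ports are placeholders.

```python
import socket

# Hypothetical partner endpoints: (host, port) pairs for SFTP / Connect:Direct listeners.
ENDPOINTS = [
    ("partner-sftp.example.com", 22),
    ("cd-node.example.com", 1364),
]


def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for host, port in ENDPOINTS:
        status = "OK" if is_reachable(host, port) else "UNREACHABLE"
        print(f"{host}:{port} -> {status}")
```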
Posted 1 week ago
3.0 - 6.0 years
1 - 4 Lacs
Jaipur
On-site
Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market. Who are you? You are smart, collaborative and take ownership to get things done. You love to learn and are intellectually curious in business and technology tools, platforms and languages. You are energized by solving complex problems and bored when you don’t have something to do. You love working in teams, and are passionate about pulling your weight to make sure the team succeeds. What will you be doing at Atrium? In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what’s possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, Analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Senior Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. The Senior Data Engineering Consultant will: Create and maintain optimal data pipeline architecture Assemble large, complex data sets that meet functional / non-functional business requirements Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, AWS, and Big Data technologies Development of ETL processes to ensure timely delivery of required data for customers Implementation of Data Quality measures to ensure accuracy, consistency, and integrity of data Design, implement, and maintain data models that can support the organization's data storage and analysis needs Deliver technical and functional specifications to support data governance and knowledge sharing In this role, you will have: B.Tech degree in Computer Science, Software Engineering, or equivalent combination of relevant work experience and education 3-6 years of experience delivering consulting services to medium and large enterprises. Implementations must have included a combination of the following experiences: Data Warehousing or Big Data consulting for mid-to-large-sized organizations. 
Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture Strong experience with Snowflake and Data Warehouse architecture SnowPro Core certification is highly desired Hands-on experience with Python (Pandas, Dataframes, Functions) Hands-on experience with SQL (Stored Procedures, functions) including debugging, performance optimization, and database design Strong Experience with Apache Airflow and API integrations Solid experience in any one of the ETL tools (Informatica, Talend, SAP BODS, DataStage, Dell Boomi, Mulesoft, FiveTran, Matillion, etc.) Nice to have: Experience in Docker, DBT, data replication tools (SLT, HVR, Qlik, etc), Shell Scripting, Linux commands, AWS S3, or Big data technologies Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions Strong presentation and communication skills Next Steps Our recruitment process is highly personalized. Some candidates complete the hiring process in one week, others may take longer as it’s important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision! At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer and all qualified applicants will receive consideration for employment.
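The Senior Data Engineering Consultant role above emphasizes Python (pandas, dataframes, functions) and API integrations feeding warehouse pipelines. As a hedged illustration only, with an invented endpoint and field names, a small extraction step might look like this:

```python
import pandas as pd
import requests


def fetch_orders(api_url: str, page_size: int = 500) -> pd.DataFrame:
    """Pull paginated JSON records from a (hypothetical) REST endpoint into a DataFrame."""
    records, page = [], 1
    while True:
        resp = requests.get(api_url, params={"page": page, "per_page": page_size}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        records.extend(batch)
        page += 1
    return pd.DataFrame.from_records(records)


if __name__ == "__main__":
    # Placeholder endpoint; a real engagement would use the client's source system API.
    df = fetch_orders("https://api.example.com/v1/orders")
    # Light deduplication before staging the data for the warehouse load.
    df = df.drop_duplicates(subset=["order_id"])
    df.to_parquet("orders_stage.parquet", index=False)
```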
Posted 1 week ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements AI & Data Testing Consultant The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment. An ETL Tester is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into the target systems. They work closely with ETL developers, data analysts, and other stakeholders to ensure the quality of data and the reliability of the ETL processes. Education And Experience Education: B.Tech/M.Tech/MCA/MS/MBA We require experienced ETL testers (Informatica PowerCenter) with 6-10 years of experience and the skills below: Required Skills Strong in Data warehouse testing - ETL and BI Strong Database Knowledge – Oracle, SQL Server, Teradata and Snowflake Strong SQL skills with experience in writing complex data validation SQL queries Experience working in an Agile environment Experience creating test strategy, release-level test plans and test cases Develop and Maintain test data for ETL testing Design and Execute test cases for ETL processes and data integration Good Knowledge of Rally, Jira and HP ALM Experience in ETL Automation and data validation using Python (an illustrative validation sketch follows this posting) Document test results and communicate with stakeholders on the status of ETL testing Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them.
Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300070
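The ETL testing role above centres on data-validation SQL and Python-based automation. Below is a minimal, illustrative reconciliation check; SQLite is used only so the sketch is self-contained, whereas a real engagement would connect to Oracle, SQL Server, Teradata, or Snowflake, and the table and column names are hypothetical.

```python
import sqlite3

# Stand-in connections; in practice these would be source and target warehouse connections.
source = sqlite3.connect("source.db")
target = sqlite3.connect("target.db")


def scalar(conn, sql):
    """Run a single-value validation query and return the result."""
    return conn.execute(sql).fetchone()[0]


# Typical reconciliation checks run against both environments after a load.
checks = {
    "row_count": "SELECT COUNT(*) FROM customer_dim",
    "null_keys": "SELECT COUNT(*) FROM customer_dim WHERE customer_id IS NULL",
    "amount_sum": "SELECT ROUND(SUM(total_amount), 2) FROM customer_dim",
}

for name, sql in checks.items():
    src_val, tgt_val = scalar(source, sql), scalar(target, sql)
    status = "PASS" if src_val == tgt_val else "FAIL"
    print(f"{name}: source={src_val} target={tgt_val} -> {status}")
```

Results like these would then be documented and reported to stakeholders, as the posting describes.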
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Company Brace Infotech is looking for a Java Backend Developer with strong expertise in SQL and hands-on experience with ETL tools. The ideal candidate will be responsible for backend development, database handling, and data integration workflows. Responsibilities Proficient in Java (Core and/or Spring Boot) Strong working knowledge of SQL (query optimization, joins, stored procedures) Experience with ETL tools (e.g., Informatica, Talend, Apache Nifi, or others) Good understanding of data flow, transformation logic, and loading mechanisms Familiarity with version control tools like Git Excellent problem-solving and communication skills Qualifications 4–6 Years of experience (can be adjusted based on your need) Immediate Joiners Only Preferred Skills Experience working with cloud platforms (AWS, Azure, or GCP) Knowledge of data warehousing concepts Familiarity with REST APIs Pay range and compensation package Full-Time position with a competitive salary based on experience. Equal Opportunity Statement Brace Infotech is committed to diversity and inclusivity in the workplace.
Posted 1 week ago
7.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Salesforce system integration at PwC will focus on connecting Salesforce with other systems, applications, or databases to enable seamless data flow and process automation. You will be responsible for designing, developing, and implementing integration solutions using various integration technologies and tools, such as Salesforce APIs, middleware platforms, and web services. Growing as a strategic advisor, you leverage your influence, expertise, and network to deliver quality results. You motivate and coach others, coming together to solve complex problems. As you increase in autonomy, you apply sound judgment, recognising when to take action and when to escalate. You are expected to solve through complexity, ask thoughtful questions, and clearly communicate how things fit together. Your ability to develop and sustain high performing, diverse, and inclusive teams, and your commitment to excellence, contributes to the success of our Firm. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Craft and convey clear, impactful and engaging messages that tell a holistic story. Apply systems thinking to identify underlying problems and/or opportunities. Validate outcomes with clients, share alternative perspectives, and act on client feedback. Direct the team through complexity, demonstrating composure through ambiguous, challenging and uncertain situations. Deepen and evolve your expertise with a focus on staying relevant. Initiate open and honest coaching conversations at all levels. Make difficult decisions and take action to resolve issues hindering team effectiveness. Model and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. Job Summary - A career in our Managed Services team will provide you an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Application Evolution Services team will provide you with the opportunity to help organizations harness the power of their enterprise applications by optimizing the technology while driving transformation and innovation to increase business performance. We assist our clients in capitalizing on technology improvements, implementing new capabilities, and achieving operational efficiencies by managing and maintaining their application ecosystems. We help our clients maximize the value of their Salesforce investment by managing the support and continuous transformation of their solutions in areas such as sales, service, marketing and customer relationship Management.
Minimum Degree Required (BQ) *: Bachelor’s Degree Degree Preferred Bachelor's degree Required Field(s) Of Study (BQ) Preferred Field(s) of Study: Computer and Information Science, Management Information Systems Minimum Year(s) of Experience (BQ) *: US Certification(s) Preferred Minimum of 7 years of experience Salesforce.com certifications: Certified Administrator, Certified Developer, Certified Business Analyst, or Certified Sales/Service Consultant; Additional specialized Salesforce.com certifications such as Marketing Cloud, Experience Cloud, App Builder, AI Associate Preferred Knowledge/Skills *: Demonstrates intimate level of abilities and/or a proven record of success identifying and addressing client needs: Acts as Salesforce domain specialist and provides coaching, mentoring, guidance and feedback to develop skills of team members; Analyzes and customizes Salesforce seasonal release notes for engagements, presenting them to clients; Serve as an SME for resolving complex production support issues and enhancements, staying current on Salesforce’s product roadmap and proposing solutions to clients; Leads ticket procedure calls with clients in their area of expertise; Leads design, development, and deployment of enhancements; Designs and develops deliverables/processes to improve delivery quality and efficiency; Leads aspects of delivery on multiple engagements; Manages client relationships and expectations, confirming client satisfaction of services; Leads delivery resource recruitment efforts; Develops training and certification plans for delivery resources; Conceptualizes, designs, and develops deliverables/processes to improve delivery quality and efficiency; Facilitates team operations management of multiple engagements and clients; Possesses strong functional and/or technical skills in Salesforce to provide solution architecture, design trade-offs, and ability to decipher design choices; Has ability to provide functional and/or technical leadership setting industry leading practices including quality of design, implementation, maintenance, and support; Possesses extensive experience in Force.com platform using Apex, Lightning and LWC; Proven experience with software configuration, Mobile solutions, Apex coding, or Visualforce coding experience with Salesforce and/or Veeva; Understanding of enterprise applications to which Salesforce.com clouds (for example: Sales, Service, Marketing, Revenue, Slack, MuleSoft) is commonly integrated to enable an end-to-end ecosystem for enterprise customers (e.g., SAP, Oracle, Marketo and related cloud and/or on-premises ERP business applications); Extensive abilities and/or a proven record of success serving as a solution or technical architect on one or more Salesforce Managed services engagements; Leads continuous improvement of solutions; Identifies automations and designs solutions to improve service delivery or simplify application processes for end users; Oversees transitioning and leading application support operations; Understands the common issues facing PwC's clients of specific industry sectors; Manages teams to deliver contracted services including troubleshooting and resolving production issues, developing and testing enhancements, and assessing impact of solutions within applications; Provides consistent communication of status to clients, including managing client expectations in regard to scope and service levels; Experience in: Development methodologies including Agile; Application technology stack for Salesforce; DevOps 
processes and tools; and, ITIL process knowledge/understanding is highly preferred. Should have strong technical skills in Salesforce to provide solution architecture, design trade-offs, and ability to decipher design choices. Should have managed multi environments, multi regions complex implementation support projects and therefore able to define scalable and robust solutions. Extensive experience in Force.com platform using Apex, Lightning and LWC. Solid implementation support experience using Sales / Service / Marketing /Custom cloud. Should have strong experience in working with middleware that supports Salesforce platforms like Mulesoft, Boomi, Informatica, Tibco, and Fusion middleware. Demonstrated solutioning experience in handling one or more Industry domain. Deep expertise in one or more Salesforce domain products – CPQ, CLM, nCino, Vlocity, FSL etc. Ability to address security complexities, and design solutions aligning with Salesforce security models. Experience in working with a broad range of emerging Salesforce products – B2B Commerce, Tableau CRM, CG Cloud, MFG Cloud, Loyalty cloud and Slack. Demonstrating ability to develop value propositions, solution approaches, and business proposals to meet client goals. Good experience with proposal activities like RFI/RFP analysis, RAID analysis, resource and effort estimation for Salesforce projects. Demonstrating communication skills to lead client executive discussions focused on scope, approach, design and implementation support considerations. Extensive experience managing and delivering multiple projects using Agile Methodology. Able to run practice initiatives and enable capabilities within the Salesforce practice. Good experience in articulating Point of Values and defining Go-to market solution. Review releases from Salesforce.com on a regular basis to determine new features that are appropriate for end users. Define, develop, and follow best practices in Salesforce. Able to handle data management inclusive of data load, data translation, data hygiene, data migration and integration. Proven ability to look at technical processes from a strategic standpoint and understand the inter-relationships. Recommend to team members or customers the appropriate and optimal use/configuration of a custom build solution. Familiarity building custom solutions on: SAP, Oracle, MS-SQL Server, or other RDMS. Proven track record of writing, interpreting, and managing deliverables of a consulting engagement. Awareness of the changing Cloud ecosystem and adjust to new technologies, methods and apps. Strong communication skills, negotiation skills, and conflict resolution. Possess advanced Salesforce certifications and Certified as Scrum Master. Demonstrating and directing multi-competency teams to deliver complex, quote-to-cash transformation programs. Additional Information Experience Level: 12-15 years
Posted 1 week ago
4.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a Salesforce consulting generalist at PwC, you will possess a broad range of consulting skills and experience across various Salesforce applications. You will provide consulting services to clients, analysing their needs, implementing software solutions, and offering training and support for effective utilisation of Salesforce applications. Your versatile knowledge will allow you to assist clients in optimising operational efficiency and achieving their strategic objectives. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Job Title: Salesforce Lightning, LWC Developer Job Level - Sr. Associate Year of Experience –4 Yrs – 8 Yrs Educational Qualifications : BE / B Tech / MCA/ M.Sc / M.E / M.Tech Key Skills : Salesforce, Lightning, LWC, Job Description 4+ Years of Total IT experience. 4+ years of SFDC experience. Extensive experience in Force.com platform using APEX and Visualforce. Solid Implementation experience using Sales / Service / Custom cloud. Experience in working with HTML, CSS, Ajax, JavaScript , JQuery. Must have Field service Lightning tool configuration experience. Must have Salesforce Field service Lightning Technical/Functional Skill. Must have Hands on Customization APEX, Visual Force, Workflow/ Process Builder, Triggers, Batch, Schedule Apex, VF Components, Test Class , Web services/APEX/REST etc Additional Desired Skills Good working knowledge in Object Oriented programming like Java, Ruby, C++. Experience in working with Bootstrap, Angular JS. Experience in working with Lightning and design components. Experience in marketing tools like Marketing Cloud, Exact Target, Eloqua Experience in products like Apttus, Veeva, nCino, Adobe Flex Able to handle data management inclusive of data load, data translation, data hygiene, data migration and integration. 
Proven ability to look at technical processes from a strategic standpoint and understand the inter-relationships. Recommend to team members or customers the appropriate and optimal use/configuration of a custom build solution. Exemplary enthusiast for code honesty, code modularity, code cleanliness and version control. Familiarity building custom solutions on: SAP, Oracle, MS-SQL Server, or other RDMS. Understanding of integration platforms such as, but not limited to: Cast Iron, Boomi, Informatica, Tibco, and Fusion. Able to translate the customer requirements and gap/fit analysis into a comprehensible functional configuration of Salesforce.com. Proven track record of writing, interpreting and managing deliverables of a consulting engagement. Must be able to think independently and creatively. Aptitude for taking on technical challenges. Awareness of the changing Cloud ecosystem and the ability to adjust to new technologies, methods and apps.
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities Design and implement scalable Snowflake data warehouse architectures, including schema modeling and data partitioning. Lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake. Develop ETL/ELT pipelines and integrate data using tools such as DBT, Fivetran, Informatica, Airflow, etc. Define and implement best practices for data modeling, query optimization, and storage efficiency in Snowflake. Collaborate with cross-functional teams including data engineers, analysts, BI developers, and stakeholders to align architectural solutions. Ensure data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake. Work with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments. Optimize resource utilization, monitor workloads, and manage cost-effectiveness of the platform. Stay updated with Snowflake features, cloud vendor offerings, and best practices. Qualifications & Skills: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. years of experience in data engineering, data warehousing, or analytics architecture. 3+ years of hands-on experience in Snowflake architecture, development, and administration. Strong knowledge of cloud platforms (AWS, Azure, or GCP). Solid understanding of SQL, data modeling, and data transformation principles. Experience with ETL/ELT tools, orchestration frameworks, and data integration. Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance. Preferred Qualifications: Snowflake certification (SnowPro Core / Advanced). Experience in building data lakes, data mesh architectures, or streaming data platforms. Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics. Experience with Agile delivery models and CI/CD workflows (ref:hirist.tech)
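The responsibilities above include implementing RBAC and masking policies in Snowflake. The sketch below is illustrative only: the account, role, warehouse, schema, and column names are hypothetical, and in practice statements like these would usually be applied through a controlled migration or CI/CD process rather than an ad-hoc script.

```python
import snowflake.connector

# Connection parameters are placeholders; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345",
    user="DEPLOY_USER",
    password="********",
    role="SECURITYADMIN",
)

statements = [
    # Role-based access: a reporting role with read-only access to one schema.
    "CREATE ROLE IF NOT EXISTS REPORTING_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE REPORTING_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO ROLE REPORTING_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE REPORTING_RO",
    # Dynamic data masking: only a privileged role sees the raw e-mail address.
    """CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.CURATED.EMAIL_MASK
       AS (val STRING) RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END""",
    """ALTER TABLE ANALYTICS.CURATED.CUSTOMERS
       MODIFY COLUMN EMAIL SET MASKING POLICY ANALYTICS.CURATED.EMAIL_MASK""",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```

Keeping grants and policies in version-controlled scripts like this also supports the audit-trail and governance expectations mentioned in the posting.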
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description We are looking for a highly skilled and experienced Snowflake Architect to lead the design, development, and deployment of enterprise-grade cloud data solutions. The ideal candidate will have a strong background in data architecture, cloud data platforms, and Snowflake implementation, with hands-on experience in end-to-end data pipeline and data warehouse design. This role requires strategic thinking, technical leadership, and the ability to work collaboratively across cross-functional teams. Responsibilities: Lead the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. Define data modeling standards, best practices, and governance frameworks. Design and optimize ETL/ELT pipelines using tools like Snowpipe, Azure Data Factory, Informatica, or DBT. Collaborate with stakeholders to understand data requirements and translate them into robust architectural solutions. Implement data security, privacy, and role-based access controls within Snowflake. Guide development teams on performance tuning, query optimization, and cost management in Snowflake. Ensure high availability, fault tolerance, and compliance across data platforms. Mentor developers and junior architects on Snowflake capabilities. Skills & Experience: 8+ years of overall experience in data engineering, BI, or data architecture, with at least 3+ years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization. Strong experience with SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP). Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion. Good understanding of data lakes, data mesh, and modern data stack principles. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks. Solid knowledge of data governance, metadata management, and cataloging. Nice to Have: Snowflake certification (e.g., SnowPro Core/Advanced Architect). Familiarity with Apache Airflow, Kafka, or event-driven data ingestion. Knowledge of data visualization tools such as Power BI, Tableau, or Looker. Experience in healthcare, BFSI, or retail domain projects (ref:hirist.tech)
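The architect role above also covers performance tuning and cost management in Snowflake. As a small, hedged illustration (connection details are placeholders), weekly credit consumption per warehouse can be pulled from the ACCOUNT_USAGE share for review alongside query history:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",          # placeholder account locator
    user="MONITOR_USER",
    password="********",
    role="ACCOUNTADMIN",        # ACCOUNT_USAGE views require elevated access
    warehouse="ADMIN_WH",
)

# Credits consumed per warehouse over the last 7 days.
sql = """
    SELECT warehouse_name,
           SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
"""

cur = conn.cursor()
try:
    for warehouse, credits in cur.execute(sql):
        print(f"{warehouse}: {credits:.2f} credits in the last 7 days")
finally:
    cur.close()
    conn.close()
```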
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be responsible for designing, developing, and implementing data-centric software solutions using various technologies. This includes conducting code reviews, recommending best coding practices, and providing effort estimates for the proposed solutions. Additionally, you will design audit business-centric software solutions and maintain comprehensive documentation for all proposed solutions. As a key member of the team, you will lead architect and design efforts for product development and application development for relevant use cases. You will provide guidance and support to team members and clients, implementing best practices of data engineering and architectural solution design, development, testing, and documentation. Your role will require you to participate in team meetings, brainstorming sessions, and project planning activities. It is essential to stay up-to-date with the latest advancements in the data engineering area to drive innovation and maintain a competitive edge. You will stay hands-on with the design, development, and validation of systems and models deployed. Collaboration with audit professionals to understand business, regulatory, and risk requirements, as well as key alignment considerations for audit, is a crucial aspect of the role. Driving efforts in the data engineering and architecture practice area will be a key responsibility. In terms of mandatory technical and functional skills, you should have a deep understanding of RDBMS (MS SQL Server, ORACLE, etc.), strong programming skills in T-SQL, and proven experience in ETL and reporting (MSBI stack/COGNOS/INFORMATICA, etc.). Additionally, experience with cloud-centric databases (AZURE SQL/AWS RDS), ADF (AZURE Data Factory), data warehousing skills using SYNAPSE/Redshift, understanding and implementation experience of datalakes, and experience in large data processing/ingestion using Databricks APIs, Lakehouse, etc., are required. Knowledge in MPP databases like SnowFlake/Postgres-XL is also essential. Preferred technical and functional skills include understanding financial accounting, experience with NoSQL using MONGODB/COSMOS, Python coding experience, and an aptitude towards emerging data platforms technologies like MS AZURE Fabric. Key behavioral attributes required for this role include strong analytical, problem-solving, and critical-thinking skills, excellent collaboration skills, the ability to work effectively in a team-oriented environment, excellent written and verbal communication skills, and the willingness to learn new technologies and work on them.,
Posted 1 week ago