15.0 years
0 Lacs
India
On-site
Job Title: Delivery Leader – Analytics and Data Science
Department: Digital Services
Reports to: Vice President
Band: AVP
Kindly click on the link below to apply: https://forms.office.com/r/zsjy1bD9yf

Main Purpose of Job
We are looking for a dynamic leader, skilled in solving problems with the power of analytics, to join our Digital Services team. The Analytics and Data Science team is an integral part of our strategy to deliver superior value to our clients, combined with deep knowledge of the industries we operate in. As the leader for Analytics and Data Science, you will work with a team of highly motivated and insightful colleagues in an entrepreneurial environment and play a crucial role in identifying, building, and deploying analytics solutions that drive customers' strategic goals. This is an expanding team delivering multi-million-dollar projects. You must be able to collaborate across business segments within the organization to drive change and deliver value.

Key responsibilities:
· Leverage analytics for prediction and forecasting, data analysis, and visualization across varied sources – predominantly image and text data
· Collaborate with cross-functional teams to support solutioning for proposals and the planning and execution of programs
· Establish credibility by thought-partnering with pre-sales and customer teams on analytics topics
· Communicate analytical insights through sophisticated synthesis and packaging of results (including PowerPoint presentations, documents, dashboards, and charts)
· Help build document collateral that enhances core capabilities and serves as supporting reference material for internal documents
· Support business development activities (proposals, etc.) and build sales collateral to generate leads
· Contribute to the team's content and IP development; mentor the team on analytical methodologies and platforms and help with quality checks; plan the team's training needs and work with the corresponding teams to ensure training programs are delivered
· Develop and implement analytics strategies, frameworks, and best practices to ensure accurate data collection, analysis, and reporting
· Define and document data governance policies and procedures to ensure data quality, security, and compliance with regulations
· Conduct regular audits and assessments of analytics implementations, identifying gaps, errors, and opportunities for improvement
· Define and track key performance indicators (KPIs) and metrics to measure the success of digital marketing campaigns and website performance
· Stay updated on emerging trends and advancements in web analytics and data architecture, recommending and implementing innovative solutions to enhance analytics capabilities

Technical skills:
· Strong proficiency in statistics (concepts and methodologies such as hypothesis testing and sampling) and their application and interpretation
· Hands-on data mining and predictive modeling experience: linear regression, clustering (K-means, DBSCAN, etc.), classification (logistic regression, decision trees/random forest/boosted trees), time series (SARIMAX), etc.
· Strong experience with at least one of the prominent cloud providers (Azure, AWS, GCP) and working knowledge of AutoML solutions (SageMaker, Azure ML, Databricks, etc.)
· At least one tool in each category:
o Programming languages – Python (must have), R or SAS or PySpark, SQL (must have)
o Data visualization – Tableau, QlikView, Power BI, Streamlit
o Data management – Alteryx, MS Access, or any RDBMS
o ML deployment tools – Airflow, MLflow, Luigi, Docker, etc.
o Version control – Git (GitHub/GitLab)
o MS Office – Excel, PowerPoint, Word
o Coding IDE – VS Code/PyCharm
o Nice to have:
§ Big data technologies (Hadoop ecosystem, Spark)
§ Data warehouse solutions (Teradata, Azure SQL DW/Synapse, Redshift, BigQuery, etc.)
§ GenAI tools (OpenAI, Google PaLM/BERT, Hugging Face, etc.)

Experience
· Proven experience (15+ years) in analytics, ML, BI, AI, and data modeling
· Strong understanding of data analytics tools, technologies, and best practices; hands-on experience applying appropriate analytics techniques to build statistical models, including handling of all pre- and post-modeling deliverables
· Working knowledge of Python and SQL; hands-on knowledge of at least one visualization tool among Power BI/Tableau
· Experience across structured and unstructured (text) data, leveraging classification, regression, unsupervised, NLP, and other techniques
· Excellent problem-solving and analytical skills, with the ability to think strategically and provide data-driven insights

Desired requirements:
· Exposure to process mining platforms and concepts
· Experience working with operations and business teams in a BPO environment
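Since the posting leads with hypothesis testing among the required statistical skills, here is a minimal, dependency-free sketch of one such method, a two-sample permutation test; the function name, sample data, and threshold are invented for illustration and are not part of the role.

```python
import random
import statistics

def permutation_test(a, b, n_resamples=10_000, seed=0):
    """Two-sample permutation test for a difference in means.

    Returns an approximate two-sided p-value: the fraction of random
    label shufflings whose absolute mean difference is at least as
    extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    n_a = len(a)
    extreme = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_resamples

# Invented example: two small samples with clearly different means
control = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]
variant = [11.0, 11.2, 10.9, 11.1, 11.3, 10.8]
p = permutation_test(control, variant)
```

A small p-value here indicates the difference in means is unlikely under random labelling; in practice one of the posting's named libraries (e.g. a statistics package) would be used rather than hand-rolled code.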
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Tesco India • Bengaluru, Karnataka, India • Hybrid • Full-Time • Apply by 01-Jul-2025 About the role Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues. Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single entity traditional shared services in Bengaluru, India (from 2004) to a global, purpose-driven solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. 
TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.

What is in it for you
Analyse complex datasets and make them consumable using visual storytelling and visualisation tools such as reports and dashboards built using approved tools (Tableau, PyDash).

You will be responsible for
- Understanding business needs and Tesco processes in depth; building on Tesco processes and knowledge by applying CI tools and techniques
- Completing tasks and transactions within agreed KPIs
- Solving problems by analysing solution alternatives
- Engaging with market leaders to understand the problems to be solved, translating business problems into analytical problems, taking ownership of specified analyses, and translating the answers back to decision makers in the business
- Manipulating, analysing, and synthesising large, complex data sets from different sources while ensuring data quality and integrity
- Thinking beyond the ask and developing analyses and reports that contribute beyond the basic request
- Being accountable for high-quality and timely completion of specified work deliverables and ad-hoc business asks
- Writing code that is well documented, structured, and compute-efficient
- Driving value delivery through efficiency gains by automating repeatable tasks, report creation, or dashboard refreshes
- Collaborating with colleagues to craft, implement, and measure consumption of analyses, reports, and dashboards
- Contributing to the development of knowledge assets and reusable modules on GitHub/Wiki
- Handling high-volume, time-pressured business asks and ad-hoc requests

You will need
2-4 years of experience in analysis-oriented delivery, preferably in one of the following domains - retail, CPG, telecom, or hospitality - and in one of the following functional areas - marketing, supply chain, customer, space range and merchandising, operations, finance, or digital. Strong understanding of business decisions; skills to develop visualisations, self-service dashboards, and reports using Tableau; basic statistical concepts (correlation analysis and hypothesis testing); good skills in analysing data using advanced Excel, advanced SQL, Hive, and Python; data warehousing concepts (Hadoop, Teradata); automation using Alteryx and Python.

Apply
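As a small illustration of the correlation-analysis skill this posting names, here is a dependency-free Pearson correlation sketch; the function and the footfall/sales figures are invented for the example.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented example: weekly store footfall vs. sales
footfall = [120, 150, 170, 160, 200, 210]
sales = [2300, 2900, 3350, 3100, 4000, 4150]
r = pearson_r(footfall, sales)
```

An `r` close to 1 indicates a strong positive linear relationship; in day-to-day work the same number would come from SQL, Excel, or a Python data library rather than hand-written code.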
Posted 2 weeks ago
4.0 - 6.0 years
7 - 9 Lacs
Hyderabad
On-site
Job description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title.

In this role, you will:
- Perform system development work around ETL, including both the development of new functions and facilities and the ongoing support of live systems
- Document, code, and maintain new and existing Extract, Transform, and Load (ETL) processes within the Enterprise Data Warehouse
- Investigate live system faults, diagnose problems, and propose and provide solutions
- Work closely with various teams to design, build, test, deploy, and maintain insightful MI reports
- Support system acceptance testing, system integration, and regression testing
- Identify any issues that may pose a delivery risk, formulate preventive actions or corrective measures, and escalate major project risks and issues to the service owner in a timely manner
- Execute test cases and log defects
- Be proactive in understanding the existing system, identifying areas for improvement, and taking ownership of assigned tasks
- Work independently with minimal supervision while ensuring timely delivery of tasks

Requirements
To be successful in this role, you should meet the following requirements:
- 4-6 years of experience in Data Warehousing, specialised in ETL
- Given that the current team is highly technical, experience in technologies such as DataStage, Teradata Vantage, Unix scripting, scheduling using Control-M, and DevOps tools
- Good knowledge of SQL and a demonstrated ability to write efficient, optimised queries
- Hands-on experience with or knowledge of GCP’s data storage and processing services, such as BigQuery, Dataflow, Bigtable, Cloud Spanner, and Cloud SQL, would be an added advantage
- Hands-on experience with Unix, Git, and Jenkins would be an added advantage
- Ability to develop and implement solutions on both on-prem and Google Cloud Platform (GCP), conducting migrations, where necessary, to bring tools and other elements into the cloud, along with software upgrades
- Proficiency with JIRA and Confluence, and experience working on projects that follow Agile methodologies

You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
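The extract-transform-load work this role centres on can be sketched, with invented table names and data, using Python's built-in sqlite3 module; a real pipeline would use the posting's tools (DataStage, Teradata, BigQuery) rather than an in-memory database.

```python
import sqlite3

# Hypothetical staging-to-warehouse load: extract raw rows, transform
# (normalise amounts from cents to currency units), and load the result.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, amount_cents INTEGER);
    CREATE TABLE dw_orders (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO staging_orders VALUES (1, 1250), (2, 990), (3, 40000);
""")

# Extract
rows = conn.execute("SELECT order_id, amount_cents FROM staging_orders").fetchall()

# Transform: cents -> whole currency units
transformed = [(oid, cents / 100.0) for oid, cents in rows]

# Load
conn.executemany("INSERT INTO dw_orders VALUES (?, ?)", transformed)
conn.commit()

total = conn.execute("SELECT SUM(amount) FROM dw_orders").fetchone()[0]
```

The same extract/transform/load shape scales up: the extract becomes a source-system query, the transform a set-based SQL or scripted step, and the load a bulk insert into the warehouse target.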
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad
On-site
Job description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title.

In this role, you will need:
- Bachelor’s degree or international equivalent
- Excellent written and verbal communication skills; presentation skills preferred
- Focus on detail and consistency
- Strong prioritisation and time management skills
- Experience in the financial domain (banking)
- Good knowledge of and experience with ETL, DataStage, DB2, Teradata, Oracle, and Unix
- Self-motivation, focus, attention to detail, and the ability to work efficiently to deadlines
- Ability to work with a degree of autonomy while also working in a collaborative team environment
- A high degree of personal integrity
- Experience developing UNIX shell scripts to enhance ETL job performance
- Hands-on experience applying modeling techniques to actual business solutions
- Experience designing, developing, and implementing Business Intelligence tools
- Understanding of and experience with Unix/Linux systems, file systems, and shell scripting
- Strong knowledge of scheduling tools such as Control-M

Requirements
To be successful in this role, you should meet the following requirements: candidates must have outstanding analytical and problem-solving skills and a good grasp of the technical side of business intelligence.
- Provide effective coordination and communication with distributed teams
- Assist and support the implementation of releases
- Identify, interpret, and document requirements for new or altered systems by working closely with various departments
- Prepare detailed specifications that describe input, output, and logical operation
- Prepare/submit Change Requests/GSDs to support the implementation process
- Review designs to ensure they meet the criteria and that work is of an acceptable quality
- Undertake technical investigations and program design on new and existing applications as required
- Update computer programs to increase operating efficiency or adapt to new requirements
- Review code from offshore analysts/developers as part of the quality assurance process
- Produce unit test plans with detailed expected results to fully exercise the code
- Work with scheduling software, such as Control-M, to automate ETL processes

You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
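Control-M itself is configured through its own tooling, but the dependency-ordering idea behind ETL job scheduling that this posting emphasises can be sketched with Python's standard-library graphlib; the job names and dependencies below are invented for illustration.

```python
from graphlib import TopologicalSorter

# Hypothetical nightly batch: each job maps to the set of jobs it
# depends on, mirroring the "run B only after A succeeds" conditions
# a scheduler such as Control-M enforces.
jobs = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_sales": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_sales"},
    "refresh_reports": {"load_warehouse"},
}

# A valid execution order: every job appears after all of its dependencies.
order = list(TopologicalSorter(jobs).static_order())
```

A real scheduler adds what this sketch omits: retries on failure, calendars, file-watcher triggers, and alerting when a predecessor job does not complete.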
Posted 2 weeks ago
9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview
We are seeking an Associate Manager - Data IntegrationOps to support and assist in managing data integration and operations (IntegrationOps) programs within our growing data organization. In this role, you will help maintain and optimize data integration workflows, ensure data reliability, and support operational excellence. This position requires a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support.
- Support the management of Data IntegrationOps programs by assisting in aligning with business objectives, data governance standards, and enterprise data strategies.
- Monitor and enhance data integration platforms by implementing real-time monitoring, automated alerting, and self-healing capabilities to help improve uptime and system performance under the guidance of senior team members.
- Assist in developing and enforcing data integration governance models, operational frameworks, and execution roadmaps to ensure smooth data delivery across the organization.
- Support the standardization and automation of data integration workflows, including report generation and dashboard refreshes.
- Collaborate with cross-functional teams to help optimize data movement across cloud and on-premises platforms, ensuring data availability, accuracy, and security.
- Provide assistance in Data & Analytics technology transformations by supporting full sustainment capabilities, including data platform management and proactive issue identification with automated solutions.
- Contribute to promoting a data-first culture by aligning with PepsiCo’s Data & Analytics program and supporting global data engineering efforts across sectors.
- Support continuous improvement initiatives to help enhance the reliability, scalability, and efficiency of data integration processes.
- Engage with business and IT teams to help identify operational challenges and provide solutions that align with the organization’s data strategy.
- Develop technical expertise in ETL/ELT processes, cloud-based data platforms, and API-driven data integration, working closely with senior team members.
- Assist with monitoring, incident management, and troubleshooting in a data operations environment to ensure smooth daily operations.
- Support the implementation of sustainable solutions for operational challenges by helping analyze root causes and recommending improvements.
- Foster strong communication and collaboration skills, contributing to effective engagement with cross-functional teams and stakeholders.
- Demonstrate a passion for continuous learning and adapting to emerging technologies in data integration and operations.

Responsibilities
- Support and maintain data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members.
- Assist in developing API-driven data integration solutions using REST APIs and Kafka to ensure seamless data movement across platforms.
- Contribute to the deployment and management of cloud-based data platforms like Azure Data Services, AWS Redshift, and Snowflake, working closely with the team.
- Help automate data pipelines and participate in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins.
- Monitor system reliability using observability tools such as Splunk, Grafana, Prometheus, and other custom monitoring solutions, reporting issues as needed.
- Assist in end-to-end data integration operations by testing and monitoring processes to maintain service quality and support global products and projects.
- Support the day-to-day operations of data products, ensuring SLAs are met and assisting in collaboration with SMEs to fulfill business demands.
- Support incident management processes, helping to resolve service outages and ensuring the timely resolution of critical issues.
- Assist in developing and maintaining operational processes to enhance system efficiency and resilience through automation.
- Collaborate with cross-functional teams like Data Engineering, Analytics, AI/ML, CloudOps, and DataOps to improve data reliability and contribute to data-driven decision-making.
- Work closely with teams to troubleshoot and resolve issues related to cloud infrastructure and data services, escalating to senior team members as necessary.
- Support building and maintaining relationships with internal stakeholders to align data integration operations with business objectives.
- Engage directly with customers, actively listening to their concerns, addressing challenges, and helping set clear expectations.
- Promote a customer-centric approach by contributing to efforts that enhance the customer experience and empower the team to advocate for customer needs.
- Assist in incorporating customer feedback and business priorities into operational processes to ensure continuous improvement.
- Contribute to the work intake and Agile processes for data platform teams, ensuring operational excellence through collaboration and continuous feedback.
- Support the execution of Agile frameworks, helping drive a culture of adaptability, efficiency, and learning within the team.
- Help align the team with a shared vision, ensuring a collaborative approach while contributing to a culture of accountability.
- Mentor junior technical team members, supporting their growth and ensuring adherence to best practices in data integration.
- Contribute to resource planning by helping assess team capacity and ensuring alignment with business objectives.
- Remove productivity barriers in an agile environment, assisting the team to shift priorities as needed without compromising quality.
- Support continuous improvement in data integration processes by helping evaluate and suggest optimizations to enhance system performance.
- Leverage technical expertise in cloud and computing technologies to support business goals and drive operational success.
- Stay informed on emerging trends and technologies, helping bring innovative ideas to the team and supporting ongoing improvements in data operations.

Qualifications
- 9+ years of technology work experience in a large-scale, global organization - CPG (Consumer Packaged Goods) industry preferred.
- 4+ years of experience in Data Integration, Data Operations, and Analytics, supporting and maintaining enterprise data platforms.
- 4+ years of experience working in cross-functional IT organizations, collaborating with teams such as Data Engineering, CloudOps, DevOps, and Analytics.
- 1+ years of leadership/management experience supporting technical teams and contributing to operational efficiency initiatives.
- 4+ years of hands-on experience in monitoring and supporting SAP BW processes for data extraction, transformation, and loading (ETL), including: managing process chains and batch jobs to ensure smooth data load operations and identifying failures for quick resolution; debugging and troubleshooting data load failures and performance bottlenecks in SAP BW systems; validating data consistency and integrity between source systems and BW targets.
- Strong understanding of SAP BW architecture, InfoProviders, DSOs, Cubes, and MultiProviders.
- Knowledge of SAP BW process chains and event-based triggers to manage and optimize data loads.
- Exposure to SAP BW on HANA and knowledge of SAP’s modern data platforms.
- Basic knowledge of integrating SAP BW with other ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, and Azure Data Factory.
- Knowledge of ETL/ELT tools such as Informatica IICS, PowerCenter, Teradata, and Azure Data Factory.
- Hands-on knowledge of cloud-based data integration platforms such as Azure Data Services, AWS Redshift, Snowflake, and Google BigQuery.
- Familiarity with API-driven data integration (e.g., REST APIs, Kafka) and supporting cloud-based data pipelines.
- Basic proficiency in Infrastructure-as-Code (IaC) tools such as Terraform, GitOps, Kubernetes, and Jenkins for automating infrastructure management.
- Understanding of Site Reliability Engineering (SRE) principles, with a focus on proactive monitoring and process improvements.
- Strong communication skills, with the ability to explain technical concepts clearly to both technical and non-technical stakeholders.
- Ability to effectively advocate for customer needs and collaborate with teams to ensure alignment between business and technical solutions.
- Interpersonal skills to help build relationships with stakeholders across both business and IT teams.
- Customer obsession: enthusiastic about ensuring high-quality customer experiences and continuously addressing customer needs.
- Ownership mindset: willingness to take responsibility for issues and drive timely resolutions while maintaining service quality.
- Ability to support and improve operational efficiency in large-scale, mission-critical systems.
- Some experience leading or supporting technical teams in a cloud-based environment, ideally within Microsoft Azure.
- Able to deliver operational services in fast-paced, transformation-driven environments.
- Proven capability in balancing business and IT priorities, executing solutions that drive mutually beneficial outcomes.
- Basic experience with Agile methodologies and an ability to collaborate effectively across virtual teams and different functions.
- Understanding of master data management (MDM) and data standards, and familiarity with data governance and analytics concepts.
- Openness to learning new technologies, tools, and methodologies to stay current in the rapidly evolving data space.
- Passion for continuous improvement and keeping up with trends in data integration and cloud technologies.
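As a rough, stdlib-only illustration of the SLA monitoring this role supports, the sketch below flags when a pipeline's recent success rate falls below a target; the run log, window, and threshold are all invented, and in practice these records would come from an observability tool such as Splunk, Grafana, or Prometheus.

```python
from datetime import datetime, timedelta

# Hypothetical run log: (finished_at, succeeded) pairs.
now = datetime(2025, 6, 1, 8, 0)
runs = [
    (now - timedelta(hours=1), True),
    (now - timedelta(hours=3), True),
    (now - timedelta(hours=5), False),
    (now - timedelta(hours=7), True),
    (now - timedelta(hours=30), False),  # outside the 24h window, ignored
]

def sla_breached(runs, window, min_success_rate, now):
    """Return True if the success rate inside the window falls below the SLA."""
    recent = [ok for ts, ok in runs if now - ts <= window]
    if not recent:
        return False  # no data in window: treated as no breach (an assumption)
    rate = sum(recent) / len(recent)
    return rate < min_success_rate

breached = sla_breached(runs, timedelta(hours=24), 0.9, now)
```

A production version would page on-call staff or open an incident when `breached` flips to True, rather than just returning a boolean.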
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
About the Role: We are hiring sharp, hands-on Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you.

What you will do:
- Build and manage high-performance data pipelines for batch and near real-time use cases
- Write optimized, complex SQL queries and stored procedures for analytics and reporting
- Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy
- Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency
- Build versioned, testable data models using DBT
- Orchestrate multi-step workflows with Apache Airflow
- Collaborate across teams to convert data needs into robust technical solutions

Mandatory Skill Sets (‘must have’ knowledge, skills, and experience)
- 5+ years of hands-on experience in Data Engineering
- Strong command of SQL and Python, especially for transformation and automation
- Deep experience with DBT and Airflow in production environments
- Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning
- Strong analytical thinking and debugging skills

Preferred Skill Sets (‘good to have’ knowledge, skills, and experience)
- Experience with Teradata and Starburst (Presto/Trino)
- Familiarity with cloud platforms (Azure/GCP/Snowflake)
- Exposure to on-prem to cloud data migrations
- Knowledge of Git-based workflows and CI/CD pipelines

Years Of Experience Required 5-8 years
Education Qualification BE, B.Tech, ME, M.Tech, MBA, MCA (60% or above)
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering
Degrees/Field Of Study Preferred
Certifications (if blank, certifications not specified)
Required Skills Structured Query Language (SQL)
Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
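The "modular Python scripts for file processing and data transformation" this role calls for might look, in a toy stdlib-only form with invented columns and data, like the following (a real pipeline would use Pandas/NumPy and read from files rather than an in-memory string):

```python
import csv
import io

# Hypothetical raw input; in practice this would be an actual CSV file.
raw = io.StringIO(
    "order_id,amount,currency\n"
    "1,19.99,GBP\n"
    "2,,GBP\n"          # missing amount: dropped by the transform
    "3,250.00,GBP\n"
)

def transform(fp):
    """Yield cleaned records with amount parsed to float; skip incomplete rows."""
    for row in csv.DictReader(fp):
        if not row["amount"]:
            continue
        yield {"order_id": int(row["order_id"]), "amount": float(row["amount"])}

records = list(transform(raw))
total = sum(r["amount"] for r in records)
```

Keeping the transform as a generator that consumes any file-like object is what makes the script modular: the same function can be fed a local file, an S3 stream, or a test fixture.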
Posted 2 weeks ago
9.0 - 14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Specialist IS Analyst What You Will Do Let’s do this. Let’s change the world. In this vital role you will be part of the Enterprise Data Fabric (EDF) Platform team. In this role you will be leveraging AI and other automation tools to innovate and provide solutions for the business. The role leverages domain and business process expertise to detail product requirements as epics and user stories, along with supporting artifacts like business process maps, use cases, and test plans for the Enterprise Data Fabric (EDF) Platform team. This role involves working closely with varied business stakeholders - business users, data engineers, data analysts, and testers - to ensure that the technical requirements for upcoming development are thoroughly elaborated. This enables the delivery team to estimate, plan, and commit to delivery with high confidence and identify test cases and scenarios to ensure the quality and performance of IT Systems. In this role you will analyze business requirements and help design solutions for the EDF platform.
You will collaborate with multi-functional teams to understand business needs, identify system enhancements, and drive system implementation projects. Experience in business analysis, system design, and project management will enable this role to deliver innovative and effective technology products. What We Expect Of You Roles & Responsibilities Collaborate with System Architects and Product Owners to manage business analysis activities for systems, ensuring alignment with engineering and product goals Capture the voice of the customer to define business processes and product needs Collaborate with business stakeholders, Architects and Engineering teams to prioritize release scopes and refine the Product backlog Facilitate the breakdown of Epics into Features and Sprint-sized User Stories and participate in backlog reviews with the development team Clearly express features in User Stories/requirements so all team members and stakeholders understand how they fit into the product backlog Ensure Acceptance Criteria and Definition of Done are well-defined Stay focused on software development to ensure it meets requirements, providing proactive feedback to stakeholders Develop and execute effective product demonstrations for internal and external stakeholders Help develop and maintain a product roadmap that clearly outlines the planned features and enhancements, timelines, and achievements Identify and manage risks associated with the systems, requirement validation, and user acceptance Develop & maintain documentation of configurations, processes, changes, communication plans and training plans for end users Ensure operational excellence, cybersecurity, and compliance. Collaborate with geographically dispersed teams, including those in the US and other international locations. 
Foster a culture of collaboration, innovation, and continuous improvement Ability to work flexible hours that align with US time zones Basic Qualifications: Master’s degree with 9 - 14 years of experience in Computer Science, Business, Engineering, IT or related field OR Bachelor’s degree with 10 - 14 years of experience in Computer Science, Business, Engineering, IT or related field OR Diploma with 10 - 14 years of experience in Computer Science, Business, Engineering, IT or related field. Must-have Skills: Proven ability in translating business requirements into technical specifications and writing user requirement documents. Able to communicate technical or complex subject matters in business terms Experience with Agile software development methodologies (Scrum) Excellent communication skills and the ability to interface with senior leadership with confidence and clarity Strong knowledge of data engineering processes Experience in managing product features for PI planning and developing product roadmaps and user journeys Technical thought leadership Good-to-have Skills: Experience maintaining SaaS (software as a service) solutions and COTS (commercial off-the-shelf) solutions Experience with AWS Services (like EC2, S3), Salesforce, Jira, and API gateway, etc. Hands on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
Experience in understanding microservices architecture and API development Experience with data analysis, data modeling, and data visualization solutions such as Tableau and Spotfire Professional Certifications: SAFe for Teams certification (preferred) Certified Business Analysis Professional (preferred) Soft Skills: Excellent critical-thinking, analytical and problem-solving skills Strong verbal & written communication and collaboration skills Demonstrated awareness of how to function in a team setting Strong presentation and public speaking skills Ability to work effectively with global, virtual teams Ability to manage multiple priorities successfully High degree of initiative and self-motivation Ability to work under minimal supervision Skilled in providing oversight and mentoring team members. Demonstrated ability in effectively delegating work Team-oriented, with a focus on achieving team goals What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title. In this role, you will: Be responsible for performing system development work around ETL, which can include both the development of new functions and facilities, and the ongoing systems support of live systems. Be responsible for the documentation, coding, and maintenance of new and existing Extract, Transform, and Load (ETL) processes within the Enterprise Data Warehouse. Investigate live systems faults, diagnose problems, and propose and provide solutions. Work closely with various teams to design, build, test, deploy and maintain insightful MI reports. Support System Acceptance Testing, System Integration and Regression Testing. Identify any issues that may pose a delivery risk, formulate preventive actions or corrective measures, and escalate major project risks and issues to the service owner in a timely manner. Execute test cases and log defects. Be proactive in understanding the existing system, identifying areas for improvement, and taking ownership of assigned tasks. Work independently with minimal supervision while ensuring timely delivery of tasks. Requirements To be successful in this role, you should meet the following requirements: 4-6 years of experience in Data Warehousing specialized in ETL.
Given that the current team is highly technical in nature, the expectation is that the candidate has experience in technologies like DataStage, Teradata Vantage, Unix scripting, and scheduling using Control-M and DevOps tools. The candidate should possess good knowledge of SQL and demonstrate the ability to write efficient and optimized queries. Hands-on experience with or knowledge of GCP’s data storage and processing services, such as BigQuery, Dataflow, Bigtable, Cloud Spanner, and Cloud SQL, would be an added advantage, as would hands-on experience with Unix, Git and Jenkins. This individual should be able to develop and implement solutions both on-prem and on Google Cloud Platform (GCP), conducting migrations, where necessary, to bring tools and other elements into the cloud, along with software upgrades. Should have proficiency in using JIRA and Confluence and be experienced in working on projects that follow Agile methodologies. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
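The posting's emphasis on efficient, optimized SQL is the kind of habit an index makes concrete; a minimal sketch using Python's stdlib sqlite3 (the table and column names are invented for illustration):

```python
# Minimal illustration (stdlib sqlite3) of the query optimization the role
# calls for: adding an index so a filtered lookup uses an index search
# instead of a full table scan. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, acct TEXT, amt REAL)")
conn.executemany("INSERT INTO txn (acct, amt) VALUES (?, ?)",
                 [(f"A{i % 100}", float(i)) for i in range(1000)])

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN reports how SQLite will execute the statement.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)

query = "SELECT SUM(amt) FROM txn WHERE acct = 'A7'"
before = plan(query)          # full table scan, e.g. "SCAN txn"
conn.execute("CREATE INDEX idx_txn_acct ON txn (acct)")
after = plan(query)           # index search, e.g. "SEARCH txn USING INDEX idx_txn_acct"
```

The same check-the-plan-before-optimizing habit carries over to Teradata and BigQuery, each of which exposes its own query-plan tooling.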
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title. In this role, you will: Bachelor’s degree or international equivalent Excellent written and verbal communication skills; presentation skills preferred. Focus on detail and consistency. Strong prioritization and time management skills Experience in the financial domain (banking). Good knowledge of and experience with ETL, DataStage, DB2, Teradata, Oracle, Unix Self-motivated, focused, detail-oriented and able to work efficiently to deadlines are essential. Ability to work with a degree of autonomy while also being able to work in a collaborative team environment. High degree of personal integrity Experienced in development of UNIX shell scripts for enhancing ETL job performance. Hands-on experience applying modeling techniques to actual business solutions. Experience in designing, developing, and implementing Business Intelligence tools. Understanding of and experience with Unix/Linux systems, file systems, shell scripting Strong knowledge of scheduling tools such as Control-M Requirements To be successful in this role, you should meet the following requirements: Candidates must have outstanding analytical and problem-solving skills and a good grasp of the technical side of business intelligence.
Provide effective co-ordination and communication with distributed teams. Assist and support the implementation of releases. Identify, interpret, and document requirements for new or altered systems, by working closely with various departments. Prepare detailed specifications that describe input, output, and logical operation. Prepare/submit Change Requests/GSDs to support the implementation process. Review design to ensure it meets the criteria and work is of an acceptable quality. Undertake technical investigations and program design on new and existing applications as required. Update computer programs to increase operating efficiency or adapt to new requirements. Review code from offshore Analysts/Developers as part of the quality assurance process. Produce unit test plans with detailed expected results to fully exercise the code. Work with scheduling software, such as Control-M, to automate ETL processes You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
Posted 2 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Sr Mgr Software Development Engineering What You Will Do Let’s do this. Let’s change the world. In this vital role you will Provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions. Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source format, and current technologies Develop, refactor, research and improve Weave cloud platform capabilities. Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide them the best service.
Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development Build strong partnerships with partner teams Build data products and service processes which perform data transformation, metadata extraction, workload management and error processing management to ensure high quality data Provide clear documentation for delivered solutions and processes Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver to those needs Work with multi-functional teams to design and document effective and efficient solutions. Develop organisational change strategies and assist in their implementation. Mentor junior data engineers on standard processes in the industry and in the Amgen data landscape What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of relevant experience Must-Have Skills: Superb communication and interpersonal skills, with the ability to work closely with multi-functional GTM, product, and engineering teams. Minimum of 10+ years overall Software Engineer or Cloud Architect experience Minimum 3+ years in an architecture role using public cloud solutions such as AWS Experience with the AWS technology stack Good-to-Have Skills: Familiarity with big data technologies, AI platforms, and cloud-based data solutions. Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams. Passion for technology and customer success, particularly in driving innovative AI and data solutions.
Experience working with teams of data scientists, software engineers and business experts to drive insights Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway. Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) Solid understanding of relevant data standards and industry trends Ability to understand new business requirements and prioritize them for delivery Experience working in the biopharma/life sciences industry Proficient in one of the coding languages (Python, Java, Scala) Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience with schema design & dimensional data modeling. Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting Hands-on experience using Databricks/Jupyter or a similar notebook environment. Experience working with GxP systems Experience working in an agile environment (i.e. user stories, iterative development, etc.) Experience working with test-driven development and software test automation Experience working in a product environment Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards. Soft Skills: Excellent analytical and problem-solving skills. Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
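The pipeline responsibilities described in this posting (data transformation, metadata extraction, error processing to ensure high-quality data) follow a common quarantine pattern; a minimal sketch, with hypothetical field names and validation rules:

```python
# Rough sketch of the transform-plus-error-processing pattern described
# above: good records flow through, bad records are quarantined with a
# reason, and run metadata is captured. All field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class RunResult:
    clean: list = field(default_factory=list)
    quarantined: list = field(default_factory=list)
    metadata: dict = field(default_factory=dict)

def run_pipeline(records: list[dict]) -> RunResult:
    result = RunResult()
    for rec in records:
        # Validation: every record needs a positive numeric 'value'.
        try:
            value = float(rec["value"])
            if value <= 0:
                raise ValueError("non-positive value")
        except (KeyError, TypeError, ValueError) as exc:
            result.quarantined.append({"record": rec, "reason": str(exc)})
            continue
        # Transformation: normalize and keep only the target fields.
        result.clean.append({"id": rec.get("id"), "value": round(value, 2)})
    # Metadata extraction: summarize the run for downstream monitoring.
    result.metadata = {
        "input_rows": len(records),
        "clean_rows": len(result.clean),
        "error_rows": len(result.quarantined),
    }
    return result
```

Quarantining rather than dropping bad rows preserves an audit trail, which matters in GxP-regulated environments like the one this posting mentions.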
Posted 2 weeks ago
3.0 - 8.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions. Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source format, and current technologies Develop, refactor, research and improve Weave cloud platform capabilities. Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide them the best service. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development Build strong partnerships with partner teams Build data products and service processes which perform data transformation, metadata extraction, workload management and error processing management to ensure high quality data Provide clear documentation for delivered solutions and processes Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver to those needs Work with multi-functional teams to design and document effective and efficient solutions. Develop organisational change strategies and assist in their implementation. Mentor junior data engineers on standard processes in the industry and in the Amgen data landscape What we expect of you We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of relevant experience Must-Have Skills: Superb communication and interpersonal skills, with the ability to work closely with multi-functional GTM, product, and engineering teams.
Minimum of 10+ years overall Software Engineer or Cloud Architect experience Minimum 3+ years in an architecture role using public cloud solutions such as AWS Experience with the AWS technology stack Good-to-Have Skills: Familiarity with big data technologies, AI platforms, and cloud-based data solutions. Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams. Passion for technology and customer success, particularly in driving innovative AI and data solutions. Experience working with teams of data scientists, software engineers and business experts to drive insights Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway. Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) Solid understanding of relevant data standards and industry trends Ability to understand new business requirements and prioritize them for delivery Experience working in the biopharma/life sciences industry Proficient in one of the coding languages (Python, Java, Scala) Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience with schema design & dimensional data modeling. Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting Hands-on experience using Databricks/Jupyter or a similar notebook environment. Experience working with GxP systems Experience working in an agile environment (i.e. user stories, iterative development, etc.) Experience working with test-driven development and software test automation Experience working in a product environment Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards. Soft Skills: Excellent analytical and problem-solving skills. Ability to work effectively with global, virtual teams High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals
Posted 2 weeks ago
9.0 - 14.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Career Category Information Systems Job Description Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Specialist IS Analyst What you will do Let’s do this. Let’s change the world. In this vital role you will be part of the Enterprise Data Fabric (EDF) Platform team. In this role you will be leveraging AI and other automation tools to innovate and provide solutions for the business. The role leverages domain and business process expertise to detail product requirements as epics and user stories, along with supporting artifacts like business process maps, use cases, and test plans for the Enterprise Data Fabric (EDF) Platform team. This role involves working closely with varied business stakeholders - business users, data engineers, data analysts, and testers - to ensure that the technical requirements for upcoming development are thoroughly elaborated. This enables the delivery team to estimate, plan, and commit to delivery with high confidence and identify test cases and scenarios to ensure the quality and performance of IT Systems.
In this role you will analyze business requirements and help design solutions for the EDF platform. You will collaborate with multi-functional teams to understand business needs, identify system enhancements, and drive system implementation projects. Experience in business analysis, system design, and project management will enable this role to deliver innovative and effective technology products. What we expect of you Roles & Responsibilities Collaborate with System Architects and Product Owners to manage business analysis activities for systems, ensuring alignment with engineering and product goals Capture the voice of the customer to define business processes and product needs Collaborate with business stakeholders, Architects and Engineering teams to prioritize release scopes and refine the Product backlog Facilitate the breakdown of Epics into Features and Sprint-sized User Stories and participate in backlog reviews with the development team Clearly express features in User Stories/requirements so all team members and stakeholders understand how they fit into the product backlog Ensure Acceptance Criteria and Definition of Done are well-defined Stay focused on software development to ensure it meets requirements, providing proactive feedback to stakeholders Develop and execute effective product demonstrations for internal and external stakeholders Help develop and maintain a product roadmap that clearly outlines the planned features and enhancements, timelines, and achievements Identify and manage risks associated with the systems, requirement validation, and user acceptance Develop & maintain documentation of configurations, processes, changes, communication plans and training plans for end users Ensure operational excellence, cybersecurity, and compliance. Collaborate with geographically dispersed teams, including those in the US and other international locations.
Foster a culture of collaboration, innovation, and continuous improvement Ability to work flexible hours that align with US time zones Basic Qualifications: Master’s degree with 9 - 14 years of experience in Computer Science, Business, Engineering, IT or related field OR Bachelor’s degree with 10 - 14 years of experience in Computer Science, Business, Engineering, IT or related field OR Diploma with 10 - 14 years of experience in Computer Science, Business, Engineering, IT or related field. Must-have Skills: Proven ability in translating business requirements into technical specifications and writing user requirement documents. Able to communicate technical or complex subject matters in business terms Experience with Agile software development methodologies (Scrum) Excellent communication skills and the ability to interface with senior leadership with confidence and clarity Strong knowledge of data engineering processes Experience in managing product features for PI planning and developing product roadmaps and user journeys Technical thought leadership Good-to-have Skills: Experience maintaining SaaS (software as a service) solutions and COTS (commercial off-the-shelf) solutions Experience with AWS Services (like EC2, S3), Salesforce, Jira, and API gateway, etc. Hands on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.) Experience in understanding microservices architecture and API development Experience with data analysis, data modeling, and data visualization solutions such as Tableau and Spotfire Professional Certifications: SAFe for Teams certification (preferred) Certified Business Analysis Professional (preferred) Soft Skills: Excellent critical-thinking, analytical and problem-solving skills Strong verbal & written communication and collaboration skills Demonstrated awareness of how to function in a team setting Strong presentation and public speaking skills Ability to work effectively with global, virtual teams Ability to manage multiple priorities successfully High degree of initiative and self-motivation Ability to work under minimal supervision Skilled in providing oversight and mentoring team members. Demonstrated ability in effectively delegating work Team-oriented, with a focus on achieving team goals What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 2 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Data Modelers. Multiple openings. Analyze and design software requirement specifications. Conduct Gap Analysis and User Acceptance Testing. Prepare templates, reports, test plans and test cases. Develop module components and proxies and review code. Develop, test, and deploy UIs. Develop test scripts and scenarios. Implement business processes and define test data. Develop estimations and enhancements. Prepare design patterns and write stored procedures. Work with Erwin, Cognos, Tableau, Oracle, Toad, SVN, Jira, Teradata, and Hadoop. 40 hrs/wk. Must have a Master's degree or equivalent in Computer Science, Electrical Engineering, Mechanical Engineering, or a related field (will accept a Bachelor's degree plus 5 years of progressive post-baccalaureate experience in lieu of a Master's) and 1 year of experience (or 1 year of experience as a Data Analyst, Computer Systems Analyst, or related occupation). Must have 1 year of experience analyzing and designing software requirement specifications, conducting Gap Analysis and User Acceptance Testing, and working with Erwin, Cognos and Tableau.
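The data-modeling and stored-procedure duties above can be illustrated with a minimal sketch. This is not part of the posting: sqlite3 stands in for an enterprise RDBMS such as Teradata or Oracle, and all table, column, and function names are invented for illustration.

```python
import sqlite3

# Illustrative only: sqlite3 stands in for an enterprise RDBMS
# (Teradata, Oracle); schema and names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A minimal star schema: one dimension table and one fact table.
cur.execute("""
    CREATE TABLE dim_product (
        product_id   INTEGER PRIMARY KEY,
        product_name TEXT NOT NULL
    )""")
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount     REAL NOT NULL
    )""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 10.0), (2, 1, 15.0), (3, 2, 7.5)])

def sales_by_product(cur):
    """Python stand-in for a small reporting stored procedure."""
    cur.execute("""
        SELECT p.product_name, SUM(f.amount)
        FROM fact_sales f
        JOIN dim_product p USING (product_id)
        GROUP BY p.product_name
        ORDER BY p.product_name""")
    return cur.fetchall()

print(sales_by_product(cur))  # [('Gadget', 7.5), ('Widget', 25.0)]
```

In a real engagement the schema would be designed in a tool like Erwin and the aggregation logic would live in the database as a stored procedure; the sketch only shows the fact/dimension split and a joined aggregate.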
Posted 2 weeks ago
3.0 - 8.0 years
30 - 35 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud - the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions. You will make an impact: The Data Engineer will work closely with clients and provide technical consulting services, configuration of the elluminate platform, and development for specific projects that include trial configuration, platform configuration, quality control, process improvements, system validation, ETL, custom analytics development, and clinical software implementations and integrations. The Data Engineer will engage in technical development and implementation of various software service delivery related activities. Accelerate your skills and career within a fast-growing company while impacting the future of healthcare. Your day-to-day: Design, develop, test, and deploy highly efficient SQL code and data mapping code according to specifications. Develop ETL code in support of analytic software applications and related analysis projects. Work with Analytics developers, other team members and clients to review business requirements and translate them into database objects and visualizations. Build analytics reports and visualizations using tools like JReview and Qlik. Provide diagnostic support and fix defects as needed. Ensure compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures. Other duties as assigned. Take the first step towards your dream career. Here is what we are looking for in this role.
Qualifications: 3+ years of professional experience preferred. Bachelor's degree or equivalent experience preferred. Experience developing back-end, database/warehouse architecture, design and development preferred. Knowledge of a variety of data platforms including SQL Server, DB2, Teradata (cloud-based databases a plus). Understanding of cloud/hybrid data architecture concepts is a plus. Knowledge of clinical trial data is a plus - CDISC ODM, SDTM, or ADaM standards. Experience in the Pharmaceutical/Biotechnology/Life Science industry is a plus. Proficient in SQL, T-SQL, PL/SQL programming. Experience in Microsoft Office applications, specifically MS Project and MS Excel. Familiarity with multiple database platforms: Oracle, SQL Server, Teradata, DB2. Familiarity with data reporting tools: QlikSense, QlikView, Spotfire, Tableau, JReview, Business Objects, Cognos, MicroStrategy, IBM DataStage, Informatica, Spark or related. Familiarity with other languages and concepts: .NET, C#, Python, R, Java, HTML, SSRS, AWS, Azure, Spark, REST APIs, Big Data, ETL, Data Pipelines, Data Modelling, Data Analytics, BI, Data Warehouse, Data Lake or related. Accelerate your skills and career within a fast-growing company while impacting the future of healthcare. We have shared our story, now we look forward to learning yours! eClinical is a winner of the 2023 Top Workplaces USA national award! We have also received numerous Culture Excellence Awards celebrating our exceptional company vision, values, and employee experience. See all the details here: https://topworkplaces.com/company/eclinical-solutions/ eClinical Solutions is a people-first organization. Our inclusive culture values the contribution that diversity brings to our business. We celebrate individual experiences that connect us and that inspire innovation in our community. Our team seeks out opportunities to learn, grow and continuously improve. Bring your authentic self, you are welcome here!
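The quality-control duties in this posting can be sketched with a small, self-contained example. This is not eClinical's code: the check rules are invented, and although USUBJID and VISITNUM are standard SDTM variable names, the record layout here is hypothetical.

```python
# Illustrative sketch of a simple clinical-data quality-control check;
# the rules and the SDTM-like record layout are hypothetical.
def qc_check(records):
    """Return a list of (row_index, issue) findings for a raw domain dataset."""
    findings = []
    for i, rec in enumerate(records):
        if not rec.get("USUBJID"):
            findings.append((i, "missing subject identifier"))
        if rec.get("VISITNUM") is not None and rec["VISITNUM"] < 0:
            findings.append((i, "negative visit number"))
    return findings

rows = [
    {"USUBJID": "S-001", "VISITNUM": 1},
    {"USUBJID": "",      "VISITNUM": 2},   # fails: blank identifier
    {"USUBJID": "S-003", "VISITNUM": -1},  # fails: negative visit
]
print(qc_check(rows))  # [(1, 'missing subject identifier'), (2, 'negative visit number')]
```

In practice checks like these would run as SQL or platform-configured edit checks against the clinical database rather than in application code; the sketch only shows the shape of a row-level validation pass.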
We are proud to be an equal opportunity employer that values diversity. Our management team is committed to the principle that employment decisions are based on qualifications, merit, culture fit and business need.
Posted 2 weeks ago
5.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
What you'll do: Docusign is seeking a talented and results-oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features, in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, developing solutions, and loading tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive can-do attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow -- a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This is an individual contributor role reporting to the Manager, Data Engineering. Responsibilities: Design, develop and maintain scalable and efficient data pipelines. Analyze and develop data quality and validation procedures. Work with stakeholders to understand data requirements and provide solutions. Troubleshoot and resolve data issues on time. Learn and leverage available AI tools for increased developer productivity. Collaborate with cross-functional teams to ingest data from various sources. Continuously evaluate and improve data architecture and processes. Own, monitor, and improve solutions to ensure SLAs are met. Develop and maintain documentation for data infrastructure and processes. Execute projects using Agile Scrum methodologies and be a team player. Job Designation Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required.
(Frequency: Minimum 2 days per week; may vary by team but will be a weekly in-office expectation.) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law. What you bring. Basic: Bachelor's Degree in Computer Science, Data Analytics, Information Systems, etc. Experience developing data pipelines in one of the following languages: Python or Java. 5+ years of dimensional and relational data modeling experience. Preferred: 5+ years in data warehouse engineering (OLAP): Snowflake, Teradata, etc. 5+ years with transactional databases (OLTP): Oracle, SQL Server, MySQL. 5+ years with commercial ETL tools: dbt, Matillion, etc. 5+ years delivering ETL solutions from source systems, databases, APIs, flat files, JSON. Experience developing Entity Relationship Diagrams with Erwin, SQLDBM, or equivalent. Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS). Familiarity with Gen AI tools like GitHub Copilot and dbt Copilot. Good understanding of Gen AI application frameworks. Knowledge of any agentic platforms. Experience building BI dashboards with tools like Tableau. Experience in the financial domain: sales and marketing, accounts payable, accounts receivable, invoicing. Experience managing work assignments using tools like Jira and Confluence. Experience with Scrum/Agile methodologies. Ability to work independently and as part of a team. Excellent analytical, problem-solving and communication skills. Excellent SQL and database management skills. Life at Docusign. Working here: Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work.
You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live. Accommodation: Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. Please contact us for assistance. Applicant and Candidate Privacy Notice #LI-Hybrid #LI-SA4
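One responsibility named in this posting is to "analyze and develop data quality and validation procedures." A minimal pure-Python sketch of a pre-load validation step follows; it is not Docusign's implementation, and the field names and threshold are invented for illustration.

```python
# Conceptual sketch of a pre-load validation step in a data pipeline;
# field names and the min_rows threshold are hypothetical.
def validate_batch(rows, required_fields, min_rows=1):
    """Raise ValueError if the batch fails basic quality checks, else return True."""
    if len(rows) < min_rows:
        raise ValueError(f"expected at least {min_rows} rows, got {len(rows)}")
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            raise ValueError(f"row {i} missing required fields: {missing}")
    return True

batch = [{"id": 1, "email": "a@example.com"},
         {"id": 2, "email": "b@example.com"}]
print(validate_batch(batch, ["id", "email"]))  # True
```

A check like this would typically run inside an orchestrated pipeline task (e.g., an Airflow or dbt test step) so that a failing batch blocks the warehouse load rather than corrupting it.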
Posted 2 weeks ago
5.0 - 7.0 years
9 - 13 Lacs
Mumbai
Work from Office
As a Power BI Developer, you will play a crucial role in handling Power BI dashboard development and maintenance independently. Your key responsibilities will include reviewing requests from cross-functional teams and developing/maintaining dashboards accordingly to successfully deliver projects. You will need to ensure that these projects meet the cost, timescale, and quality parameters aligned with our overall business objectives, needs, and group guidelines. Key Responsibilities: Design, develop, and maintain interactive Power BI dashboards and reports. Translate business requirements into technical specifications for effective data visualizations. Implement data models and integrate data sources in Power BI for efficient data retrieval. Perform ETL processes from various sources, including SQL, Teradata, Excel, and APIs. Ensure data quality and consistency through data cleansing and transformation. Create unified datasets from multiple sources for comprehensive reporting and analysis. Collaborate with business stakeholders to understand their reporting and analytics needs. Work closely with IT teams and data engineers to optimize data flows and processes. Provide training and support to end-users on Power BI tools and best practices. Optimize Power BI dashboards and reports for performance and fast loading times. Monitor and troubleshoot Power BI environments to resolve performance and security issues. Implement and maintain data security practices within Power BI, ensuring compliance. Manage user roles (RLS) and permissions within Power BI to control data access appropriately. Document data models, processes, and workflows to ensure knowledge sharing and continuity. Stay updated with the latest Power BI features and best practices and identify opportunities for process and tool improvements. Required Skills/Abilities: Graduate (Bachelor's degree from a recognized university in any discipline). 5-7 years of experience in Power BI development and data analytics.
Proven track record of delivering high-quality dashboards and reports in a business environment. Proficient in Power BI / MS Fabric, including DAX, building data models, Power Query, and Power BI Service. Strong in data transformations, data modelling and data visualization layers. Performance-tune existing Power BI dashboards, using best practices to develop dashboards/reports. Analyze and provide solutions to improve the data model design for Power BI dashboards where the user experience rating is lower. Strong SQL skills for querying and data manipulation. Experience with ETL processes and tools. Knowledge of data warehousing concepts and data modeling. Experience with various data sources (e.g., Teradata, MSSQL) is desirable. Strong analytical and problem-solving skills. Ability to translate complex data into meaningful insights. Experience in data visualization and storytelling with data. Excellent communication skills, both written and verbal. Ability to work collaboratively with cross-functional teams. Strong stakeholder management skills. Apply now and embark on an exciting journey with us! We offer: We recognize and reward your hard work through competitive compensation and performance-based incentives. We empower you to learn and grow through training that gives you the knowledge, skills, and abilities to develop into your role, and a great range of resources to support your future career aspirations and personal development. Flexible work arrangements to support work/life balance. Generous paid time off: Privilege (earned leave). Comprehensive medical insurance coverage including voluntary parental cover (applicable for IN only). Recognition & Engagement culture.
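The "Manage user roles (RLS)" responsibility above refers to Power BI row-level security, which is actually configured with DAX filter expressions and role assignments in the Power BI service. As a language-neutral sketch of the underlying idea only (the role-to-region mapping is invented, and this is not how Power BI itself is programmed):

```python
# Conceptual analogue of Power BI row-level security: each role sees only
# the rows its filter permits. Role names and regions are hypothetical.
ROLE_REGION = {"emea_analyst": "EMEA", "apac_analyst": "APAC"}

def rows_visible_to(role, rows):
    """Return only the rows whose region matches the role's allowed region."""
    region = ROLE_REGION.get(role)
    return [r for r in rows if r["region"] == region]

sales = [
    {"region": "EMEA", "amount": 120},
    {"region": "APAC", "amount": 80},
]
print(rows_visible_to("emea_analyst", sales))  # [{'region': 'EMEA', 'amount': 120}]
```

In Power BI the equivalent would be a role whose DAX filter on the sales table is something like `[region] = "EMEA"`, applied automatically for every user mapped to that role.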
Posted 2 weeks ago
5.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate. Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities. About the Role: We are hiring sharp, hands-on Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you. What you will do: Build and manage high-performance data pipelines for batch and near real-time use cases. Write optimized, complex SQL queries and stored procedures for analytics and reporting. Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy. Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency. Build versioned, testable data models using dbt. Orchestrate multi-step workflows with Apache Airflow. Collaborate across teams to convert data needs into robust technical solutions. Mandatory skill sets (must-have knowledge, skills and experience): 5+ years of hands-on experience in Data Engineering. Strong command of SQL and Python, especially for transformation and automation. Deep experience with dbt and Airflow in production environments. Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning. Strong analytical thinking and debugging skills. Preferred skill sets (good-to-have knowledge, skills and experience): Experience with Teradata and Starburst (Presto/Trino). Familiarity with cloud platforms (Azure/GCP/Snowflake). Exposure to on-prem to cloud data migrations. Knowledge of Git-based workflows and CI/CD pipelines. Years of experience required: 5-8 years. Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above). Education Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering. Required Skills: Structured Query Language (SQL). Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data
Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline (plus 27 more).
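The "orchestrate multi-step workflows with Apache Airflow" duty above can be illustrated without Airflow itself: orchestration is, at its core, running tasks in an order that respects a dependency DAG. The sketch below uses only the standard library; the three-step extract/transform/load shape and all names are invented for illustration, not taken from any real pipeline.

```python
from graphlib import TopologicalSorter

# In production this is what Apache Airflow provides (plus scheduling,
# retries, and monitoring); this stdlib sketch only shows
# dependency-ordered execution. Task names are hypothetical.
def run_workflow(tasks, deps):
    """Run callables in an order that respects the dependency DAG."""
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        # Each task receives the results of everything that ran before it.
        results[name] = tasks[name](results)
    return results

tasks = {
    "extract":   lambda r: [3, 1, 2],
    "transform": lambda r: sorted(r["extract"]),
    "load":      lambda r: f"loaded {len(r['transform'])} rows",
}
deps = {"transform": {"extract"}, "load": {"transform"}}
print(run_workflow(tasks, deps)["load"])  # loaded 3 rows
```

An Airflow DAG expresses the same `deps` structure with operators and `>>` dependencies, and the scheduler plays the role of `run_workflow`.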
Posted 2 weeks ago
5.0 - 8.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. About the Role: We are hiring sharp, hands-on Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you.
What you will do: Build and manage high-performance data pipelines for batch and near real-time use cases. Write optimized, complex SQL queries and stored procedures for analytics and reporting. Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy. Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency. Build versioned, testable data models using dbt. Orchestrate multi-step workflows with Apache Airflow. Collaborate across teams to convert data needs into robust technical solutions. Mandatory skill sets (must-have knowledge, skills and experience): 5+ years of hands-on experience in Data Engineering. Strong command of SQL and Python, especially for transformation and automation. Deep experience with dbt and Airflow in production environments. Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning. Strong analytical thinking and debugging skills. Preferred skill sets (good-to-have knowledge, skills and experience): Experience with Teradata and Starburst (Presto/Trino). Familiarity with cloud platforms (Azure/GCP/Snowflake). Exposure to on-prem to cloud data migrations. Knowledge of Git-based workflows and CI/CD pipelines. Years of experience required: 5-8 years. Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above). Education Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology, Master of Engineering. Required Skills: Data Engineering, Python (Programming Language), Structured Query Language (SQL). Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security
Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling (plus 32 more).
Posted 2 weeks ago
5.0 - 8.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Not Applicable Specialism Data, Analytics & AI & Summary . In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decisionmaking for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purposeled and valuesdriven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. s About the Role We are hiring sharp, handson Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you. 
What you will do Build and manage highperformance data pipelines for batch and near realtime use cases Write optimized, complex SQL queries and stored procedures for analytics and reporting Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy Optimize queries and scripts over largescale datasets (TBs) with a focus on speed and efficiency Build versioned, testable data models using DBT Orchestrate multistep workflows with Apache Airflow Collaborate across teams to convert data needs into robust technical solutions Mandatory skill sets Must have knowledge, skills and experiences 5+ years of handson experience in Data Engineering Strong command over SQL and Python, especially for transformation and automation Deep experience with DBT and Airflow in production environments Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning Strong analytical thinking and debugging skills Preferred skill sets Good to have knowledge, skills and experiences Experience with Teradata and Starburst (Presto/Trino) Familiarity with cloud platforms (Azure/GCP/Snowflake) Exposure to onprem to cloud data migrations Knowledge of Gitbased workflows and CI/CD pipelines Years of experience required Experience 58 years Education qualification o BE, B.Tech, ME, M,Tech, MBA, MCA (60% above) Education Degrees/Field of Study required Bachelor of Technology, Master of Engineering, Master of Business Administration, Bachelor of Engineering Degrees/Field of Study preferred Required Skills Data Engineering, Python (Programming Language), Structured Query Language (SQL) Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security 
Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more} No
Posted 2 weeks ago
5.0 - 8.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Not Applicable Specialism Data, Analytics & AI & Summary . In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decisionmaking for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purposeled and valuesdriven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. s About the Role We are hiring sharp, handson Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you. 
What you will do:
· Build and manage high-performance data pipelines for batch and near real-time use cases
· Write optimized, complex SQL queries and stored procedures for analytics and reporting
· Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy
· Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency
· Build versioned, testable data models using DBT
· Orchestrate multi-step workflows with Apache Airflow
· Collaborate across teams to convert data needs into robust technical solutions
Mandatory skill sets (must-have knowledge, skills and experience):
· 5+ years of hands-on experience in Data Engineering
· Strong command of SQL and Python, especially for transformation and automation
· Deep experience with DBT and Airflow in production environments
· Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning
· Strong analytical thinking and debugging skills
Preferred skill sets (good-to-have knowledge, skills and experience):
· Experience with Teradata and Starburst (Presto/Trino)
· Familiarity with cloud platforms (Azure/GCP/Snowflake)
· Exposure to on-prem to cloud data migrations
· Knowledge of Git-based workflows and CI/CD pipelines
Years of experience required: 5-8 years
Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Degrees/Field of Study required: Bachelor of Technology, Master of Engineering, Master of Business Administration, Bachelor of Engineering
Required Skills: Data Engineering, Python (Programming Language), Structured Query Language (SQL)
Additional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security 
Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
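As an illustration of the Pandas-based transformation work this role describes, a modular cleaning-and-aggregation step might look like the following minimal sketch (the function, column names, and sample data are hypothetical, not taken from the posting):

```python
import pandas as pd

def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative transformation step: normalize raw order rows,
    then aggregate spend per customer."""
    out = df.copy()
    out["order_date"] = pd.to_datetime(out["order_date"])  # normalize types
    out["amount"] = out["amount"].fillna(0.0)              # handle missing values
    return (
        out.groupby("customer_id", as_index=False)["amount"]
           .sum()
           .rename(columns={"amount": "total_amount"})
    )

raw = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "order_date": ["2024-01-01", "2024-01-02", "2024-01-03"],
    "amount": [10.0, None, 5.0],
})
print(clean_orders(raw))
```

Keeping each transformation in a small, testable function like this is what makes such scripts "modular": each step can be unit-tested and reused across pipelines.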
Posted 2 weeks ago
6.0 years
7 - 8 Lacs
Gurgaon
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.
Description
United's Digital Technology team is comprised of many talented individuals all working together with cutting-edge technology to build the best airline in the history of aviation. Our team designs, develops and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
Job overview and responsibilities
United Airlines’ Enterprise Data Analytics department partners with business and technology leaders across the company to transform data analytics into a competitive advantage. An offshore team based in Delhi, India will work closely with this group and support it with complementary skills and capabilities. The key objectives are to improve operating performance, boost customer experience and drive incremental revenue by embedding data in decision making across all levels of the organization. The team is currently looking for a leader who has a passion for data and analytics, with the willingness to dig deep into details as well as the ability to assess the big picture. Developing and maintaining strong relationships with key stakeholders in the US, as well as training and retaining key talent within the offshore team, are keys to success in this role. This role will require strategic thinking and strong client focus. 
· Manage a team of data analysts by guiding them on modeling techniques, approaches and methodologies
· Execute solutions to business problems using data analysis, data mining, optimization tools, statistical modeling and machine learning techniques
· Continuously develop and demonstrate improved analysis methodologies
· Ensure alignment and prioritization with business objectives and initiatives – help business owners make faster, smarter decisions
· Create and develop presentations for United leadership and external stakeholders
· Encourage development and sharing of internal best practices and foster collaboration with internal and external teams
This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
Qualifications
What’s needed to succeed (Minimum Qualifications):
· Bachelor's degree required
· At least 6 years of experience in analytics required
· At least 2 years of experience in a supervisory role
· Experienced in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
· Proven comfort and intellectual curiosity for working with very large sets of data, pulling in relevant team members to address identified – and sometimes undiscovered – issues
· Able to communicate complex quantitative analysis and algorithms in a clear, precise and actionable manner
· Adept at juggling several projects and initiatives simultaneously through appropriate prioritization
· Proficient in using database querying tools and able to write complex queries and procedures using Teradata SQL and/or Microsoft TSQL
· Familiar with one or more reporting tools – Spotfire / Slate
· Must be legally authorized to work in India for any employer without sponsorship
· Must be fluent in English (written and spoken)
· Successful completion of interview required to meet job qualification
· Reliable, punctual attendance is an essential function of the position
What will help you propel from the pack (Preferred Qualifications):
· Master's Degree in a quantitative field like Math, Statistics and/or MBA preferred
· Hands-on experience setting up and using Big Data ecosystems like Hadoop/Spark
· Extensive knowledge of predictive modeling, test design and database querying
· Strong knowledge of either R or Python
· Basic programming skills for web scraping and experience working with non-structured data will be a plus
· Deep technical experience in distributed computing, machine learning, and statistics related work
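As a small illustration of the kind of pattern-highlighting analysis this role describes, pairwise correlations are one standard way to surface relationships in multi-dimensional data with pandas (the dataset and column names below are purely hypothetical):

```python
import pandas as pd

# Hypothetical operational dataset; column names are illustrative only.
df = pd.DataFrame({
    "departure_delay":  [5, 30, 0, 45, 10, 60],
    "taxi_out_minutes": [12, 25, 10, 35, 15, 40],
    "gate_changes":     [0, 2, 0, 3, 1, 4],
})

# Pairwise Pearson correlations highlight which features move together.
corr = df.corr()
print(corr.round(2))
```

On real high-volume data the same call works unchanged; the analytical skill is in choosing features worth comparing and interpreting which correlations are actionable rather than spurious.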
Posted 2 weeks ago
2.0 years
7 - 8 Lacs
Gurgaon
On-site
High-level responsibilities of the role include:
· Execute solutions to business problems using data analysis, data mining, optimization tools, statistical modeling and machine learning techniques
· Continuously develop and demonstrate improved analysis methodologies
· Ensure alignment and prioritization with business objectives and initiatives – help business owners make faster, smarter decisions
· Share internal best practices and foster collaboration with internal and external teams
· Create and develop presentations for United leadership and external stakeholders
This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
Qualifications
What’s needed to succeed (Minimum Qualifications):
· Bachelor's degree required
· At least 2 years of experience in analytics required
· Proven comfort and intellectual curiosity for working with very large sets of data, pulling in relevant team members to address identified – and sometimes undiscovered – issues
· Strong knowledge of either R or Python
· Proficient in using database querying tools and able to write complex queries and procedures using Teradata SQL and/or Microsoft TSQL
· Experienced in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
· Familiar with one or more reporting tools – Spotfire / Tableau
· Able to communicate complex quantitative analysis and algorithms in a clear, precise and actionable manner
· Must be legally authorized to work in India for any employer without sponsorship
· Must be fluent in English (written and spoken)
· Successful completion of interview required to meet job qualification
· Reliable, punctual attendance is an essential function of the position
What will help you propel from the pack (Preferred Qualifications):
· Master's Degree in a quantitative field like Math, Statistics and/or MBA
· Hands-on experience with Big Data products will be a big plus
· Basic programming skills for web scraping and experience working with non-structured data will be a plus
Posted 2 weeks ago
8.0 years
4 - 10 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-217915 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 27, 2025 CATEGORY: Information Systems
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Sr Mgr Software Development Engineering
What you will do
Let’s do this. Let’s change the world. In this vital role you will:
· Provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions.
· Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source format, and current technologies
· Develop, refactor, research and improve Weave cloud platform capabilities
· Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide them the best service
· Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
· Build strong partnerships with partner teams
· Build data products and service processes which perform data transformation, metadata extraction, workload management and error processing management to ensure high-quality data
· Provide clear documentation for delivered solutions and processes
· Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver on those needs
· Work with multi-functional teams to design and document effective and efficient solutions
· Develop organisational change strategies and assist in their implementation
· Mentor junior data engineers on standard industry processes and the Amgen data landscape
What we expect of you
We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.
Basic Qualifications:
· Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of relevant experience
Must-Have Skills:
· Superb communication and interpersonal skills, with the ability to work closely with multi-functional GTM, product, and engineering teams
· Minimum of 10+ years overall Software Engineer or Cloud Architect experience
· Minimum 3+ years in an architecture role using public cloud solutions such as AWS
· Experience with the AWS technology stack
Good-to-Have Skills:
· Familiarity with big data technologies, AI platforms, and cloud-based data solutions
· Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams
· Passion for technology and customer success, particularly in driving innovative AI and data solutions
· Experience working with teams of data scientists, software engineers and business experts to drive insights
· Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway
· Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
· Solid understanding of relevant data standards and industry trends
· Ability to understand new business requirements and prioritize them for delivery
· Experience working in the biopharma/life sciences industry
· Proficient in one of the coding languages (Python, Java, Scala)
· Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
· Experience with schema design and dimensional data modeling
· Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting
· Hands-on experience using Databricks/Jupyter or a similar notebook environment
· Experience working with GxP systems
· Experience working in an agile environment (i.e. user stories, iterative development, etc.)
· Experience working with test-driven development and software test automation
· Experience working in a product environment
· Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards
Soft Skills:
· Excellent analytical and problem-solving skills
· Ability to work effectively with global, virtual teams
· High degree of initiative and self-motivation
· Ability to manage multiple priorities successfully
· Team-oriented, with a focus on achieving team goals
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. 
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Make a lasting impact with the Amgen team: careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
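The schema design and dimensional-modeling experience listed above can be illustrated with a minimal star-schema query: a fact table joined to a dimension table and aggregated. This sketch uses SQLite for portability; the table and column names are hypothetical:

```python
import sqlite3

# Minimal star schema: one fact table referencing one dimension table.
# Table and column names are illustrative, not from any real system.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY, product_id INTEGER, qty INTEGER);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales  VALUES (1, 1, 3), (2, 1, 2), (3, 2, 7);
""")

# Typical dimensional query: aggregate fact measures, grouped by a
# dimension attribute.
rows = cur.execute("""
SELECT p.name, SUM(f.qty) AS total_qty
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
GROUP BY p.name
ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 7), ('widget', 5)]
```

The same fact/dimension shape scales to the RDBMS platforms the posting names (Redshift, Teradata, Oracle, etc.); only the connection layer changes.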
Posted 2 weeks ago
9.0 - 14.0 years
5 - 9 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-219095 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 27, 2025 CATEGORY: Information Systems
Specialist IS Analyst
What you will do
Let’s do this. Let’s change the world. In this vital role you will be part of the Enterprise Data Fabric (EDF) Platform team, leveraging AI and other automation tools to innovate and provide solutions for the business. The role leverages domain and business process expertise to detail product requirements as epics and user stories, along with supporting artifacts like business process maps, use cases, and test plans for the EDF Platform team. This role involves working closely with varied business stakeholders – business users, data engineers, data analysts, and testers – to ensure that the technical requirements for upcoming development are thoroughly elaborated. 
This enables the delivery team to estimate, plan, and commit to delivery with high confidence and identify test cases and scenarios to ensure the quality and performance of IT systems. In this role you will analyze business requirements and help design solutions for the EDF platform. You will collaborate with multi-functional teams to understand business needs, identify system enhancements, and drive system implementation projects. Experience in business analysis, system design, and project management will enable this role to deliver innovative and effective technology products.
What we expect of you
Roles & Responsibilities:
· Collaborate with System Architects and Product Owners to manage business analysis activities for systems, ensuring alignment with engineering and product goals
· Capture the voice of the customer to define business processes and product needs
· Collaborate with business stakeholders, Architects and Engineering teams to prioritize release scopes and refine the Product backlog
· Facilitate the breakdown of Epics into Features and Sprint-sized User Stories and participate in backlog reviews with the development team
· Clearly express features in User Stories/requirements so all team members and stakeholders understand how they fit into the product backlog
· Ensure Acceptance Criteria and Definition of Done are well-defined
· Stay focused on software development to ensure it meets requirements, providing proactive feedback to stakeholders
· Develop and execute effective product demonstrations for internal and external stakeholders
· Help develop and maintain a product roadmap that clearly outlines the planned features and enhancements, timelines, and achievements
· Identify and manage risks associated with the systems, requirement validation, and user acceptance
· Develop and maintain documentation of configurations, processes, changes, communication plans and training plans for end users
· Ensure operational excellence, cybersecurity, and compliance
· Collaborate with geographically dispersed teams, including those in the US and other international locations
· Foster a culture of collaboration, innovation, and continuous improvement
· Ability to work flexible hours that align with US time zones
Basic Qualifications:
Master’s degree with 9-14 years of experience in Computer Science, Business, Engineering, IT or a related field OR Bachelor’s degree with 10-14 years of experience in Computer Science, Business, Engineering, IT or a related field OR Diploma with 10-14 years of experience in Computer Science, Business, Engineering, IT or a related field.
Must-have Skills:
· Proven ability in translating business requirements into technical specifications and writing user requirement documents; able to communicate technical or complex subject matter in business terms
· Experience with Agile software development methodologies (Scrum)
· Excellent communication skills and the ability to interface with senior leadership with confidence and clarity
· Strong knowledge of data engineering processes
· Experience in managing product features for PI planning and developing product roadmaps and user journeys
· Technical thought leadership
Good-to-have Skills:
· Experience maintaining SaaS (software as a service) solutions and COTS (commercial off-the-shelf) solutions
· Experience with AWS services (like EC2, S3), Salesforce, Jira, and API Gateway, etc.
· Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.) 
· Experience in understanding microservices architecture and API development
· Experience with data analysis, data modeling, and data visualization solutions such as Tableau and Spotfire
Professional Certifications:
· SAFe for Teams certification (preferred)
· Certified Business Analysis Professional (preferred)
Soft Skills:
· Excellent critical-thinking, analytical and problem-solving skills
· Strong verbal and written communication and collaboration skills
· Demonstrated awareness of how to function in a team setting
· Strong presentation and public speaking skills
· Ability to work effectively with global, virtual teams
· Ability to manage multiple priorities successfully
· High degree of initiative and self-motivation
· Ability to work under minimal supervision
· Skilled in providing oversight and mentoring team members; demonstrated ability in effectively delegating work
· Team-oriented, with a focus on achieving team goals
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply for a career that defies imagination. Objects in your future are closer than they appear. Join us: careers.amgen.com
Posted 2 weeks ago