
7570 Hadoop Jobs - Page 27

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 6.0 years

8 - 15 Lacs

India

On-site

We are seeking a highly skilled Python Developer with expertise in Machine Learning and Data Analytics to join our team. The ideal candidate should have 5-6 years of experience developing end-to-end ML-driven applications and handling data-driven projects independently. You will be responsible for designing, developing, and deploying Python-based applications that leverage data analytics, statistical modeling, and machine learning techniques.

Key Responsibilities:
- Design, develop, and deploy Python applications for data analytics and machine learning.
- Work independently on machine learning model development, evaluation, and optimization.
- Develop ETL pipelines and process large-scale datasets for analysis.
- Implement scalable and efficient algorithms for predictive analytics and automation.
- Optimize code for performance, scalability, and maintainability.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Integrate APIs and third-party tools to enhance functionality.
- Document processes, code, and best practices for maintainability.

Required Skills & Qualifications:
- 5-6 years of professional experience in Python application development.
- Strong expertise in Machine Learning, Data Analytics, and AI frameworks (TensorFlow, PyTorch, Scikit-learn, etc.).
- Proficiency in Python libraries such as Pandas, NumPy, SciPy, and Matplotlib.
- Experience with SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
- Hands-on experience with big data technologies (Apache Spark, Delta Lake, Hadoop, etc.).
- Strong experience developing APIs and microservices using FastAPI, Flask, or Django.
- Good understanding of data structures, algorithms, and software development best practices.
- Strong problem-solving and debugging skills.
- Ability to work independently and handle multiple projects simultaneously.
- Good to have: working knowledge of cloud platforms (Azure/AWS/GCP) for deploying ML models and data applications.

Job Type: Full-time. Pay: ₹800,000.00 - ₹1,500,000.00 per year. Schedule: Day shift.
Ability to commute/relocate: Chandrasekharpur, Bhubaneswar, Orissa: reliably commute or planning to relocate before starting work (preferred).
Experience: Python: 5 years (required). Work Location: In person. Expected Start Date: 01/08/2025
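To make the "end-to-end ML-driven application" expectation concrete, here is a minimal sketch of a train-evaluate-persist workflow with scikit-learn. The input file and column names are hypothetical placeholders, not details from the posting.

```python
# Minimal end-to-end ML sketch: load data, train a pipeline, evaluate,
# and persist the fitted model for deployment.
import pandas as pd
import joblib
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

df = pd.read_csv("training_data.csv")            # hypothetical input file
X, y = df.drop(columns=["label"]), df["label"]   # hypothetical label column
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

joblib.dump(model, "model.joblib")               # persist for serving
```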

Posted 6 days ago

Apply

0 years

7 - 10 Lacs

Calcutta

On-site

Job description
Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Analyst - Regulatory Compliance Artificial Intelligence and Analytics.

Principal responsibilities
- Report the RC AI & Analytics scorecard and key performance indicators in a timely and accurate manner.
- Promote a culture of data-driven decision making, aligning short-term decisions and investments with the longer-term vision and objectives.
- Help the business manage regulatory risk in a more effective, efficient, and commercial way through the adoption of data science (AI/ML and advanced analytics).
- Support communication and engagement with stakeholders and partners to increase understanding and adoption of data science products and services, and research opportunities.
- Collaborate with other analytics teams across the bank to share insight and best practice.
- Foster a collaborative, open, and agile delivery culture.
- Build positive momentum for change across the organization with the active support and buy-in of all stakeholders.
- Communicate often complex analytical solutions to the wider department, ensuring a strong transfer of key findings and intelligence.

Requirements
- University degree in technology, data analytics, or a related discipline, or relevant work experience in computer or data science.
- Understanding of Regulatory Compliance risks and direct experience deploying controls and analytics to manage those risks.
- Experience in Financial Services (within a tier-one bank) or a related industry.
- Knowledge of the HSBC Group structure, its business and personnel, and HSBC’s corporate culture.
- Good interpersonal and communication skills, coupled with proven experience working in a matrixed management structure and managing global teams.
- Active contribution to strategy and innovation.
- Able to work independently and solve complex business problems while keeping stakeholders informed.
- Client focused, with strong relationship-building and analytical skills.
- Effective communication (both verbal and written) and presentation skills.
- Sound judgment and critical thinking skills, with the ability to think laterally.
- Able to manage numerous tasks with continual re-prioritization. Any of the above skills that the role holder does not currently bring to the role will need to be developed.
- Fair understanding of applied mathematics, statistics, data science principles, and advanced computing.
- Moderate experience working within the Hadoop ecosystem, in addition to strong technical skills in analytical languages such as Python and SQL.

You’ll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, gender, genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, disability, color, national origin, or veteran status. We consider all applications based on merit and suitability for the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. ***Issued By HSBC Electronic Data Processing (India) Private LTD***

Posted 6 days ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Position Overview
This role is responsible for defining and delivering ZURU’s next-generation data architecture, built for global scalability, real-time analytics, and AI enablement. You will lead the unification of fragmented data systems into a cohesive, cloud-native platform that supports advanced business intelligence and decision-making. Sitting at the intersection of data strategy, engineering, and commercial enablement, this role demands both deep technical acumen and strong cross-functional influence. You will drive the vision and implementation of robust data infrastructure, champion governance standards, and embed a culture of data excellence across the organisation.

Position Impact
In the first six months, the Head of Data Architecture will gain a deep understanding of ZURU’s operating model, technology stack, and data fragmentation challenges. You’ll conduct a comprehensive review of the current architecture, identifying performance gaps, security concerns, and integration challenges across systems like SAP, Odoo, POS, and marketing platforms. By month twelve, you’ll have delivered a fully aligned architecture roadmap: cloud-native infrastructure, data governance standards, and scalable models and pipelines to support AI and analytics. You will have stood up a Centre of Excellence for Data, formalised global data team structures, and established yourself as a trusted partner to senior leadership.

What are you going to do?
- Lead Global Data Architecture: own the design, evolution, and delivery of ZURU’s enterprise data architecture across cloud and hybrid environments.
- Consolidate Core Systems: unify data sources across SAP, Odoo, POS, IoT, and media into a single analytical platform optimised for business value.
- Build Scalable Infrastructure: architect cloud-native solutions that support both batch and streaming data workflows using tools like Databricks, Kafka, and Snowflake (a streaming sketch follows this posting).
- Implement Governance Frameworks: define and enforce enterprise-wide data standards for access control, privacy, quality, security, and lineage.
- Enable Metadata & Cataloguing: deploy metadata management and cataloguing tools to enhance data discoverability and self-service analytics.
- Operationalise AI/ML Pipelines: lead data architecture that supports AI/ML initiatives, including demand forecasting, pricing models, and personalisation.
- Partner Across Functions: translate business needs into data architecture solutions by collaborating with leaders in Marketing, Finance, Supply Chain, R&D, and Technology.
- Optimise Cloud Cost & Performance: roll out compute and storage systems that balance cost efficiency, performance, and observability across platforms.
- Establish Data Leadership: build and mentor a high-performing data team across India and NZ, and drive alignment across engineering, analytics, and governance.
- Vendor and Tool Strategy: evaluate external tools and partners to ensure the data ecosystem is future-ready, scalable, and cost-effective.

What are we looking for?
- 8+ years of experience in data architecture, with 3+ years in a senior or leadership role across cloud or hybrid environments
- Proven ability to design and scale large data platforms supporting analytics, real-time reporting, and AI/ML use cases
- Hands-on expertise with ingestion, transformation, and orchestration pipelines (e.g. Kafka, Airflow, DBT, Fivetran)
- Strong knowledge of ERP data models, especially SAP and Odoo
- Experience with data governance, compliance (GDPR/CCPA), metadata cataloguing, and security practices
- Familiarity with distributed systems and streaming frameworks like Spark or Flink
- Strong stakeholder management and communication skills, with the ability to influence both technical and business teams
- Experience building and leading cross-regional data teams

Tools & Technologies
- Cloud Platforms: AWS (S3, EMR, Kinesis, Glue), Azure (Synapse, ADLS), GCP
- Big Data: Hadoop, Apache Spark, Apache Flink
- Streaming: Kafka, Kinesis, Pub/Sub
- Orchestration: Airflow, Prefect, Dagster, DBT
- Warehousing: Snowflake, Redshift, BigQuery, Databricks Delta
- NoSQL: Cassandra, DynamoDB, HBase, Redis
- Query Engines: Presto/Trino, Athena
- IaC & CI/CD: Terraform, GitLab Actions
- Monitoring: Prometheus, Grafana, ELK, OpenTelemetry
- Security/Governance: IAM, TLS, KMS, Amundsen, DataHub, Collibra, DBT for lineage

What do we offer?
💰 Competitive compensation
💰 Annual performance bonus
⌛️ 5 working days with flexible working hours
🚑 Medical insurance for self & family
🚩 Training & skill development programs
🤘🏼 Work with the global team and make the most of its diverse knowledge
🍕 Several discussions over multiple pizza parties
A lot more! Come and discover us!
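As referenced in the infrastructure bullet above, here is a hedged PySpark Structured Streaming sketch of Kafka ingestion into a data lake. The broker address, topic name, and s3a:// paths are illustrative assumptions, not details from the posting.

```python
# Stream Kafka events into partitioned Parquet files with checkpointing,
# one common pattern for unifying POS/IoT event sources in a lake.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("pos-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
    .option("subscribe", "pos-events")                  # hypothetical topic
    .load()
    .select(col("key").cast("string"),
            col("value").cast("string"),
            col("timestamp"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://lake/raw/pos_events/")            # hypothetical path
    .option("checkpointLocation", "s3a://lake/_chk/pos_events/")
    .start()
)
query.awaitTermination()
```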

Posted 6 days ago

Apply

7.0 - 10.0 years

8 - 16 Lacs

India

On-site

Role: Sr. Data Engineer
Location: Indore, Madhya Pradesh
Experience: 7-10 years
Job Type: Full-time

Job Summary: As a Data Engineer with a focus on Python, you'll play a crucial role in designing, developing, and maintaining data pipelines and ETL processes. You will work with large-scale datasets and leverage modern tools like PySpark, Airflow, and AWS Glue to automate and orchestrate data processes. Your work will support critical decision-making by ensuring data accuracy, accessibility, and efficiency across the organization.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines using Python.
- Develop ETL processes for extracting, transforming, and loading data.
- Optimise SQL queries and database schemas for enhanced performance.
- Collaborate with data scientists, analysts, and stakeholders to understand data needs.
- Implement and monitor data quality checks to resolve any issues.
- Automate data processing tasks with Python scripts and tools.
- Ensure data security, integrity, and regulatory compliance.
- Document data processes, workflows, and system designs.

Primary Skills:
- Python Proficiency: experience with Python, including libraries such as Pandas, NumPy, and SQLAlchemy.
- PySpark: hands-on experience in distributed data processing using PySpark.
- AWS Glue: practical knowledge of AWS Glue for building serverless ETL pipelines.
- SQL Expertise: advanced knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Data Pipeline Development: proven experience building and maintaining data pipelines and ETL processes.
- Cloud Data Platforms: familiarity with cloud-based platforms like AWS Redshift, Google BigQuery, or Azure Synapse.
- Data Warehousing: knowledge of data warehousing and data modelling best practices.
- Version Control: proficiency with Git.

Preferred Skills:
- Big Data Technologies: experience with tools like Hadoop or Kafka.
- Data Visualization: familiarity with visualisation tools (e.g., Tableau, Power BI).
- DevOps Practices: understanding of CI/CD pipelines and DevOps practices.
- Data Governance: knowledge of data governance and security best practices.

Pay: ₹800,000.00 - ₹1,600,000.00 per year. Work Location: In person.
Application Deadline: 30/07/2025. Expected Start Date: 19/07/2025
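Since the posting names AWS Glue specifically, here is a rough skeleton of a Glue PySpark job, using the awsglue library that Glue provides to job scripts. The catalog database, table, mappings, and S3 target are hypothetical assumptions.

```python
# AWS Glue ETL skeleton: read from the Glue Data Catalog, remap columns,
# and write curated Parquet to S3.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

src = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"     # hypothetical catalog refs
)
mapped = ApplyMapping.apply(
    frame=src,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amount", "double", "amount", "double")],
)
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://bucket/curated/orders/"},  # hypothetical
    format="parquet",
)
job.commit()
```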

Posted 6 days ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Role Proficiency: Leverage expertise in a technology area (e.g. Informatica transformation, Teradata data warehouse, Hadoop analytics). Responsible for the architecture of small/mid-size projects.

Outcomes
- Implement data extraction and transformation, a data warehouse (ETL, data extracts, data load logic, mapping, workflows, stored procedures), a data analysis solution, data reporting solutions, or cloud data tools on any one of the cloud providers (AWS/Azure/GCP).
- Understand business workflows and related data flows. Develop designs for data acquisition and data transformation or data modelling; apply business intelligence on data, or design data fetching and dashboards.
- Design information structure and work- and dataflow navigation. Define backup, recovery, and security specifications.
- Enforce and maintain naming standards and the data dictionary for data models.
- Provide or guide the team to perform estimates.
- Help the team develop proofs of concept (POCs) and solutions relevant to customer problems; able to troubleshoot problems while developing POCs.
- Architect/Big Data specialty certification in AWS/Azure/GCP (general, e.g. via Coursera or a similar learning platform, or any ML certification).

Measures of Outcomes
- Percentage of billable time spent in a year developing and implementing data transformation or data storage.
- Number of best practices documented for any new tool or technology emerging in the market.
- Number of associates trained on the data service practice.

Outputs Expected
Strategy & Planning: Create or contribute short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. Implement methods and procedures for tracking data quality, completeness, redundancy, and improvement. Ensure that data strategies and architectures meet regulatory compliance requirements. Begin engaging external stakeholders, including standards organizations, regulatory bodies, operators, and scientific research communities, or attend conferences on data in the cloud.

Operational Management: Help architects establish governance, stewardship, and frameworks for managing data across the organization. Provide support in implementing the appropriate tools, software, applications, and systems to support data technology goals. Collaborate with project managers and business teams on all projects involving enterprise data. Analyse data-related issues with systems integration, compatibility, and multi-platform integration.

Project Control and Review: Provide advice to teams facing complex technical issues in the course of project delivery. Define and measure project- and program-specific architectural and technology quality metrics.

Knowledge Management & Capability Development: Publish and maintain a repository of solutions, best practices, standards, and other knowledge articles for data management. Conduct and facilitate knowledge-sharing and learning sessions across the team. Gain industry-standard certifications on technology or area of expertise. Support technical skill building (including hiring and training) for the team based on inputs from the project manager/RTEs. Mentor new team members in technical areas. Gain and cultivate domain expertise to provide the best and most optimized solutions to customers (delivery).

Requirement Gathering and Analysis: Work with customer business owners and other teams to collect, analyze, and understand the requirements, including NFRs (and define NFRs). Analyze gaps and trade-offs based on the current system context and industry practices; clarify the requirements by working with the customer. Define the systems and sub-systems that make up the programs.

People Management: Set goals and manage the performance of team engineers. Provide career guidance to technical specialists and mentor them.

Alliance Management: Identify alliance partners based on an understanding of service offerings and client requirements. In collaboration with the architect, create a compelling business case around the offerings. Conduct beta testing of the offerings and assess relevance to the program.

Technology Consulting: In collaboration with Architects II and III, analyze the application and technology landscape, processes, and tools to arrive at the architecture options that best fit the client program. Analyze the costs versus benefits of solution options. Support Architects II and III in creating a technology/architecture roadmap for the client. Define the architecture strategy for the program.

Innovation and Thought Leadership: Participate in internal and external forums (seminars, paper presentations, etc.). Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency. Identify business opportunities to create reusable components/accelerators, and reuse existing components and best practices.

Project Management Support: Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies.

Stakeholder Management: Monitor the concerns of internal stakeholders such as Product Managers and RTEs, and of external stakeholders such as client architects, on architecture aspects. Follow through on commitments to achieve timely resolution of issues. Conduct initiatives to meet client expectations. Work to expand your professional network in the client organization at the team and program levels.

New Service Design: Identify potential opportunities for new service offerings based on customer voice/partner inputs. Conduct beta testing/POCs as applicable. Develop collateral and guides for go-to-market (GTM).

Skill Examples
- Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under the guidance of architects.
- Use technology knowledge to create proofs of concept and (reusable) assets under the guidance of the specialist. Apply best practices in your own area of work, helping with performance troubleshooting and other complex troubleshooting.
- Define, decide, and defend the technology choices made; review solutions under guidance.
- Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST.
- Use independent knowledge of design patterns, tools, and principles to create high-level designs for the given requirements. Evaluate multiple design options and choose the appropriate option for the best possible trade-offs. Conduct knowledge sessions to enhance the team's design capabilities. Review the low- and high-level designs created by specialists for efficiency (consumption of hardware, memory, memory leaks, etc.).
- Use knowledge of software development processes, tools, and techniques to identify and assess incremental improvements to the development process, methodology, and tools. Take technical responsibility for all stages of the software development process. Write optimal code with a clear understanding of memory leakage and its impact.
- Implement global standards and guidelines relevant to programming and development; come up with points of view and new technological ideas.
- Use knowledge of project management and agile tools and techniques to support, plan, and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies.
- Use knowledge of project metrics to understand their relevance to the project. Collect and collate project metrics and share them with the relevant stakeholders.
- Use knowledge of estimation and resource planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place.
- Strong proficiency in understanding data workflows and dataflow. Attention to detail. High analytical capability.

Knowledge Examples
- Data visualization and data migration.
- RDBMSs (relational database management systems) and SQL.
- Hadoop technologies like MapReduce, Hive, and Pig.
- Programming languages, especially Python and Java.
- Operating systems like UNIX and MS Windows.
- Backup/archival software.

Additional Comments
AI Architect Role Summary: Hands-on AI Architect with strong expertise in Deep Learning, Generative AI, and real-world AI/ML systems. The role involves leading the architecture, development, and deployment of AI agent-based solutions, supporting initiatives such as intelligent automation, anomaly detection, and GenAI-powered assistants across enterprise operations and engineering. This is a hands-on role ideal for someone who thrives in fast-paced environments, is passionate about AI innovations, and can adapt across multiple opportunities based on business priorities.

Key Responsibilities:
- Design and architect AI-based solutions, including multi-agent GenAI systems using LLMs and RAG pipelines.
- Build POCs, prototypes, and production-grade AI components for operations, support automation, and intelligent assistants.
- Lead end-to-end development of AI agents for use cases such as triage, RCA automation, and predictive analytics.
- Leverage GenAI (LLMs) and time-series models to drive intelligent observability and performance management.
- Work closely with product, engineering, and operations teams to align solutions with domain and customer needs.
- Own the model lifecycle from experimentation to deployment using modern MLOps and LLMOps practices.
- Ensure scalable, secure, and cost-efficient implementation across AWS and Azure cloud environments.

Key Skills & Technology Areas:
- AI/ML Expertise: 8+ years in AI/ML, with hands-on experience in deep learning, model deployment, and GenAI.
- LLMs & Frameworks: GPT-3+, Claude, LLAMA3, LangChain, LangGraph, Transformers (BERT, T5), RAG pipelines, LLMOps.
- Programming: Python (advanced), Keras, PyTorch, Pandas, FastAPI, Celery (for agent orchestration), Redis.
- Modeling & Analytics: time-series forecasting, predictive modeling, synthetic data generation.
- Data & Storage: ChromaDB, Pinecone, FAISS, DynamoDB, PostgreSQL, Azure Synapse, Azure Data Factory.
- Cloud & Tools: AWS (Bedrock, SageMaker, Lambda), Azure (Azure ML, Azure Databricks, Synapse), GCP (Vertex AI, optional).
- Observability Integration: Splunk, ELK Stack, Prometheus.
- DevOps/MLOps: Docker, GitHub Actions, Kubernetes, CI/CD pipelines, model monitoring & versioning.
- Architectural Patterns: microservices, event-driven architecture, multi-agent systems, API-first design.

Other Requirements:
- Proven ability to work independently and collaboratively in agile, innovation-driven teams.
- Strong problem-solving mindset and product-oriented thinking.
- Excellent communication and technical storytelling skills.
- Flexibility to work across multiple opportunities based on business priorities.
- Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus.

Skills: Python, Pandas, AI/ML, GenAI
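Since the role centres on RAG pipelines, here is a minimal sketch of the retrieval step only: embed a corpus, index it with FAISS, and retrieve context for a query before handing it to an LLM (the LLM call is left as a placeholder). The encoder model name and the toy corpus are illustrative assumptions.

```python
# Minimal RAG retrieval sketch: embed, index, retrieve, build a prompt.
import faiss
from sentence_transformers import SentenceTransformer

docs = ["Reset a node by draining it first.",
        "RCA reports live in the ops wiki."]       # toy corpus
encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder model
vecs = encoder.encode(docs).astype("float32")

index = faiss.IndexFlatL2(vecs.shape[1])           # exact L2 index
index.add(vecs)

query = "how do I reset a node?"
q = encoder.encode([query]).astype("float32")
_, ids = index.search(q, 1)                        # top-1 neighbour
context = docs[ids[0][0]]

prompt = f"Answer using only this context:\n{context}\n\nQ: {query}"
# response = some_llm_client.complete(prompt)      # placeholder LLM call
print(prompt)
```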

Posted 6 days ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

We are seeking a highly skilled and motivated Big Data Engineer to join our data engineering team. The ideal candidate will have hands-on experience with the Hadoop ecosystem and Apache Spark, and programming expertise in Python (PySpark), Scala, and Java. You will be responsible for designing, developing, and optimizing scalable data pipelines and big data solutions to support analytics and business intelligence initiatives.
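As an illustration of the PySpark pipeline work this posting describes, here is a small batch sketch: read raw CSV from HDFS, clean it, and write partitioned Parquet. Paths and column names are hypothetical.

```python
# Illustrative PySpark batch ETL: read, clean, write partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-etl").getOrCreate()

raw = spark.read.option("header", True).csv("hdfs:///raw/sales/")
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())          # drop unparseable rows
       .withColumn("sale_date", F.to_date("sale_ts"))
)
clean.write.mode("overwrite").partitionBy("sale_date") \
     .parquet("hdfs:///curated/sales/")
```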

Posted 6 days ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

- Collaborate with Product Owners and stakeholders to understand the business requirements.
- Good experience in Apache Kafka, Python, Tableau, and MSBI (SSIS, SSRS), including Kafka integration with Python and the data loading process (a consumer sketch follows this posting).
- Analyse data from key source systems and design suitable solutions that transform the data from source to target.
- Provide support to the stakeholders and scrum team throughout the development lifecycle and respond to any design queries.
- Support testing and implementation, and review solutions to ensure functional and data assurance requirements are met.
- Passionate about data and delivering high-quality, data-led solutions.
- Able to influence stakeholders, build strong business relationships, and communicate in a clear, concise manner.
- Experience working with SQL or any big data technologies is a plus (Hadoop, Hive, HBase, Scala, Spark, etc.).
- Good in Control-M, Git, and CI/CD pipelines.
- Good team player with a strong team ethos.

Skills Required: MSBI SSIS, MS SQL Server, Kafka, Airflow, ANSI SQL, shell scripting, Python, Scala, HDFS.
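As referenced above, here is a hedged sketch of the Kafka-with-Python integration the posting asks about, using the kafka-python package: consume JSON messages and stage them in batches for a downstream load. The topic, broker, and batch size are assumptions.

```python
# Consume JSON messages from Kafka and flush them in fixed-size batches.
import json
from kafka import KafkaConsumer   # from the kafka-python package

consumer = KafkaConsumer(
    "orders",                                    # hypothetical topic
    bootstrap_servers="localhost:9092",          # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="etl-loader",
)

batch = []
for msg in consumer:
    batch.append(msg.value)
    if len(batch) >= 500:
        # load_to_staging(batch) would bulk-insert into SQL Server here
        batch.clear()
```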

Posted 6 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position: Data Scientist
Location: Chennai, India (Work from Office)
Experience: 2-5 years

About the Opportunity: Omnihire is seeking a Data Scientist to join a leading AI-driven data-solutions company. As part of the Data Consulting team, you’ll collaborate with scientists, IT, and engineering to solve high-impact problems and deliver actionable insights.

Key Responsibilities:
- Analyze large structured and unstructured datasets (SQL, Hadoop/Spark) to extract business-critical insights
- Build and validate statistical models (regression, classification, time-series, segmentation) and machine-learning algorithms (Random Forest, Boosting, SVM, KNN)
- Develop deep-learning solutions (CNN, RNN, LSTM, transfer learning) and apply NLP techniques (tokenization, stemming/lemmatization, NER, LSA)
- Write production-quality code in Python and/or R using libraries such as scikit-learn, TensorFlow/PyTorch, pandas, NumPy, and NLTK/spaCy
- Collaborate with cross-functional teams to scope requirements, propose analytics solutions, and present findings via clear visualizations (Power BI, Matplotlib)
- Own end-to-end ML pipelines: data ingestion → preprocessing → feature engineering → model training → evaluation → deployment
- Contribute to solution proposals and maintain documentation for data schemas, model architectures, and experiment tracking (Git, MLflow)

Required Qualifications:
- Bachelor’s or Master’s in Computer Science, Statistics, Mathematics, Data Science, or a related field
- 2-5 years of hands-on experience as a Data Scientist (or similar) in a data-driven environment
- Proficiency in Python and/or R for statistical modeling and ML
- Strong SQL skills and familiarity with Big Data platforms (e.g., Hadoop, Apache Spark)
- Demonstrated experience building, validating, and deploying ML/DL models in production or staging
- Excellent problem-solving skills, attention to detail, and ability to communicate technical concepts clearly
- Self-starter who thrives in a collaborative, Agile environment

Nice-to-Have:
- Active GitHub/Kaggle portfolio showcasing personal projects or contributions
- Exposure to cloud-based ML services (Azure ML Studio, AWS SageMaker) and containerization (Docker)
- Familiarity with advanced NLP frameworks (e.g., Hugging Face Transformers) or production monitoring tools (Azure Monitor, Prometheus)

Why Join?
- Work on high-impact AI/ML projects that drive real business value
- Rapid skill development with exposure to cutting-edge technologies
- Collaborative, Agile culture with mentorship from senior data scientists
- Competitive compensation package and comprehensive benefits
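To make the NLP preprocessing steps listed here (tokenization, stop-word removal, lemmatization) concrete, here is a small NLTK sketch. The sample sentence is illustrative, and the required corpora must be downloaded once via nltk.download().

```python
# Tokenize, drop stop words, and lemmatize a sentence with NLTK.
from nltk.tokenize import word_tokenize
from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer

# One-time setup:
# import nltk
# nltk.download("punkt"); nltk.download("stopwords"); nltk.download("wordnet")

text = "The models were retrained after the quarterly data refresh."
tokens = [t.lower() for t in word_tokenize(text) if t.isalpha()]
stops = set(stopwords.words("english"))
lemmatizer = WordNetLemmatizer()
terms = [lemmatizer.lemmatize(t) for t in tokens if t not in stops]
print(terms)   # e.g. ['model', 'retrained', 'quarterly', 'data', 'refresh']
```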

Posted 6 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

- Experience in SQL and understanding of ETL best practices.
- Good hands-on experience in ETL/Big Data development.
- Extensive hands-on experience in Scala.
- Experience with Spark/YARN and troubleshooting Spark, Linux, and Python.
- Setting up a Hadoop cluster, plus backup, recovery, and maintenance.

Posted 6 days ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Gurugram

Work from Office

We are looking for a PySpark Developer who loves solving complex problems across a full spectrum of technologies. You will help ensure our technological infrastructure operates seamlessly in support of our business objectives.

Responsibilities
- Develop and maintain data pipelines implementing ETL processes.
- Take responsibility for Hadoop development and implementation.
- Work closely with a data science team implementing data analytic pipelines.
- Help define data governance policies and support data versioning processes.
- Maintain security and data privacy, working closely with the Data Protection Officer internally.
- Analyse a vast number of data stores and uncover insights.

Skillset Required
- Ability to design, build, and unit test applications in PySpark.
- Experience with Python development and Python data transformations.
- Experience with SQL scripting on one or more platforms: Hive, Oracle, PostgreSQL, MySQL, etc.
- In-depth knowledge of Hadoop, Spark, and similar frameworks.
- Strong knowledge of Data Management principles.
- Experience normalizing/de-normalizing data structures, and developing tabular, dimensional, and other data models.
- Knowledge of YARN, clusters, executors, and cluster configuration.
- Hands-on work with different file formats like JSON, Parquet, CSV, etc. (see the sketch after this posting).
- Experience with the CLI on Linux-based platforms.
- Experience analysing current ETL/ELT processes, and defining and designing new processes.
- Experience analysing business requirements in a BI/Analytics context and designing data models to transform raw data into meaningful insights.
- Good to have: knowledge of data visualization.
- Experience processing large amounts of structured and unstructured data, including integrating data from multiple sources.
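As referenced in the file-formats bullet, here is a short PySpark sketch reading JSON, CSV, and Parquet inputs and normalizing them to one schema. The paths and column names are placeholders.

```python
# Read three file formats and unify them into one output dataset.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("formats").getOrCreate()

json_df = spark.read.json("/data/in/events.json")
csv_df = spark.read.option("header", True).csv("/data/in/events.csv")
pq_df = spark.read.parquet("/data/in/events.parquet")

cols = ["event_id", "event_type", "event_ts"]     # hypothetical common schema
unified = (json_df.select(cols)
           .unionByName(csv_df.select(cols))
           .unionByName(pq_df.select(cols)))
unified.write.mode("append").parquet("/data/out/events/")
```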

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are seeking an experienced Lead Data Software Engineer to join our dynamic team and tackle rewarding challenges. As a Lead Engineer, you will be pivotal in creating and implementing Data solutions across various projects. The ideal candidate will possess deep experience in Data and associated technologies, with a strong emphasis on Apache Spark, Python, Azure, and AWS.

Responsibilities
- Develop and execute end-to-end Data solutions for intricate business needs
- Work alongside cross-functional teams to comprehend project requirements and deliver superior software solutions
- Apply your knowledge of Apache Spark, Python, Azure, and AWS to build scalable and effective data processing systems
- Maintain the performance, security, and scalability of Data applications
- Keep abreast of industry trends and advancements in Data technologies to enhance our development processes

Requirements
- 8-12 years of hands-on experience in Data and Data-related technologies
- Expert-level knowledge and practical experience with Apache Spark
- Strong proficiency with Hadoop and Hive
- Proficiency in Python
- Experience working with native Cloud data services, specifically AWS and Azure

Nice to have
- Background in machine learning algorithms and techniques
- Skills in data visualization tools such as Tableau or Power BI
- Knowledge of real-time data processing frameworks like Apache Flink or Apache Storm
- Understanding of containerization technologies like Docker or Kubernetes

Technologies: Hadoop, Hive

Posted 6 days ago

Apply

1.0 - 3.0 years

9 - 13 Lacs

Pune

Work from Office

Overview
We are hiring an Associate Data Engineer to support our core data pipeline development efforts and gain hands-on experience with industry-grade tools like PySpark, Databricks, and cloud-based data warehouses. The ideal candidate is curious, detail-oriented, and eager to learn from senior engineers while contributing to the development and operationalization of critical data workflows.

Responsibilities
- Assist in the development and maintenance of ETL/ELT pipelines using PySpark and Databricks under senior guidance.
- Support data ingestion, validation, and transformation tasks across Rating Modernization and Regulatory programs.
- Collaborate with team members to gather requirements and document technical solutions.
- Perform unit testing, data quality checks, and process monitoring activities.
- Contribute to the creation of stored procedures, functions, and views.
- Support troubleshooting of pipeline errors and validation issues.

Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related discipline.
- 3+ years of experience in data engineering, or internships in data/analytics teams.
- Working knowledge of Python and SQL, and ideally PySpark.
- Understanding of cloud data platforms (Databricks, BigQuery, Azure/GCP).
- Strong problem-solving skills and eagerness to learn distributed data processing.
- Good verbal and written communication skills.

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
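To illustrate the "data quality checks" responsibility in this Associate role, here is a small pandas sketch of rule-based checks that could run inside a unit test. The frame and column names are assumptions for illustration.

```python
# Rule-based data-quality checks returning a list of violations.
import pandas as pd

def check_ratings_frame(df: pd.DataFrame) -> list:
    """Return a list of data-quality violations (empty list = pass)."""
    problems = []
    if df["rating_id"].duplicated().any():
        problems.append("duplicate rating_id values")
    if df["rating_date"].isna().any():
        problems.append("null rating_date values")
    if not df["score"].between(0, 100).all():
        problems.append("score outside expected 0-100 range")
    return problems

sample = pd.DataFrame({
    "rating_id": [1, 2],
    "rating_date": pd.to_datetime(["2025-01-01", "2025-01-02"]),
    "score": [88, 91],
})
assert check_ratings_frame(sample) == []   # would be a pytest assertion
```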

Posted 6 days ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Pune

Work from Office

What You'll Do
The Global Analytics & Insights (GAI) team is looking for a Senior Data Engineer to lead the build of the data infrastructure for Avalara's core data assets, empowering us with accurate data to make data-backed decisions. As a Senior Data Engineer, you will help architect, implement, and maintain our data infrastructure using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow. You will immerse yourself in our financial, marketing, and sales data to become an expert in Avalara's domain. You will have deep SQL experience, an understanding of modern data stacks and technology, a desire to build things the right way using modern software principles, and experience with data and all things data related.

What Your Responsibilities Will Be
- Architect repeatable, reusable solutions to keep our technology stack DRY.
- Conduct technical and architecture reviews with engineers, ensuring all contributions meet quality expectations.
- Develop scalable, reliable, and efficient data pipelines using dbt, Python, or other ELT tools.
- Implement and maintain scalable data orchestration and transformation, ensuring data accuracy and consistency.
- Collaborate with cross-functional teams to understand complex requirements and translate them into technical solutions.
- Build scalable, complex dbt models.
- Demonstrate ownership of complex projects and calculations of core financial metrics and processes.
- Work with Data Engineering teams to define and maintain scalable data pipelines.
- Promote automation and optimization of reporting processes to improve efficiency.
You will report to a Senior Manager.

What You'll Need to Be Successful
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6+ years of experience in the data engineering field, with advanced SQL knowledge.
- 4+ years working with Git, with demonstrated experience collaborating with other engineers across repositories.
- 4+ years working with Snowflake.
- 3+ years working with dbt (dbt Core).
- 3+ years working with Infrastructure as Code (Terraform).
- 3+ years working with CI/CD, with demonstrated ability to build and operate pipelines.
- AWS certified; Terraform certified.
- Experience working with complex Salesforce data.
- Snowflake and dbt certified.
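Since this stack pairs dbt with Airflow, here is a hedged sketch of one common orchestration pattern: a daily DAG that runs dbt models and then dbt tests. It assumes Airflow 2.4+ (for the schedule parameter); the commands, project path, and cron schedule are illustrative.

```python
# Daily Airflow DAG that runs dbt models, then dbt tests.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily",
    start_date=datetime(2025, 1, 1),
    schedule="0 6 * * *",   # daily at 06:00; Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",   # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt",
    )
    dbt_run >> dbt_test   # tests only run after a successful build
```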

Posted 6 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Summary
Job Description Summary: Responsible for coding, unit testing, and building high-performance, scalable applications that meet the needs of millions of Walmart International customers in the areas of supply chain management and customer experience.

About the Team: Our team collaborates with Walmart International, which has over 5,900 retail units operating outside of the United States under 55 banners in 26 countries, including Africa, Argentina, Canada, Central America, Chile, China, India, Japan, and Mexico, to name a few.

What you'll do:
- Design, build, test, and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from the products we build at Walmart scale.
- Work in a high-performing agile team environment, sharing innovative ideas and working collaboratively across teams.
- Work with talented engineers and product visionaries to contribute to the vision and design of our web and mobile products.
- Be a product-oriented Full Stack Developer, creating and experimenting with new ideas that will engage and excite our customers.
- Own and lead the delivery of products, working along with a team of junior developers.
- Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
- Perform root cause analysis on applications to answer specific business questions and identify opportunities for improvement.
- Utilize industry research to improve Sam’s technology environment.

What you'll bring:
- Bachelor's or Master’s degree with 6+ years of experience in Computer Science or a related field.
- Deep knowledge of Service-Oriented Architecture and experience implementing RESTful web services.
- Strong Java programming skills and knowledge of JVM internals (concurrency, multi-threading).
- Solid design and coding skills in Java and/or the Spring framework.
- Extensive hands-on experience building services using these technologies (Java, J2EE, Spring Boot, Hibernate, JAX).
- Strong computer science knowledge in algorithms, data structures, database concepts, and SQL technologies.
- Experience with storage technologies such as Cosmos DB, Elasticsearch, Hive, Cassandra, Hadoop, and Kafka is good to have.
- Cloud development experience.
- Good to have: experience in HTML5, JavaScript, CSS3, AJAX, GraphQL, React Native, React, Redux, Webpack, and Node.
- Experience building scalable/highly available distributed systems in production.
- Understanding of stream processing, with knowledge of Kafka.
- Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems.
- Very good expertise in production-support activities (issue identification and resolution).

About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That’s what we do at Walmart Global Tech. We’re a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world’s leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions, and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we’re able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Minimum Qualifications
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 3 years’ experience in software engineering or a related area at a technology, retail, or data-driven company.
Option 2: 5 years’ experience in software engineering or a related area at a technology, retail, or data-driven company.

Preferred Qualifications
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Certification in Security+, GISF, CISSP, CCSP, or GSEC; Master’s degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 1 year’s experience leading information security or cybersecurity projects. Information Technology - CISCO Certification.

Primary Location
BLOCK-1, PRESTIGE TECH PACIFIC PARK, SY NO. 38/1, OUTER RING ROAD KADUBEESANAHALLI, India R-2219378

Posted 6 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Summary
Job Description Summary: Responsible for coding, unit testing, and building high-performance, scalable applications that meet the needs of millions of Walmart International customers in the areas of supply chain management and customer experience.

About the Team: Our team collaborates with Walmart International, which has over 5,900 retail units operating outside of the United States under 55 banners in 26 countries, including Africa, Argentina, Canada, Central America, Chile, China, India, Japan, and Mexico, to name a few.

What you'll do:
- Design, build, test, and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from the products we build at Walmart scale.
- Work in a high-performing agile team environment, sharing innovative ideas and working collaboratively across teams.
- Work with talented engineers and product visionaries to contribute to the vision and design of our web and mobile products.
- Be a product-oriented Full Stack Developer, creating and experimenting with new ideas that will engage and excite our customers.
- Own and lead the delivery of products, working along with a team of junior developers.
- Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
- Perform root cause analysis on applications to answer specific business questions and identify opportunities for improvement.
- Utilize industry research to improve Sam’s technology environment.

What you'll bring:
- Bachelor's or Master’s degree with 6+ years of experience in Computer Science or a related field.
- Deep knowledge of Service-Oriented Architecture and experience implementing RESTful web services.
- Strong Java programming skills and knowledge of JVM internals (concurrency, multi-threading).
- Solid design and coding skills in Java and/or the Spring framework.
- Extensive hands-on experience building services using these technologies (Java, J2EE, Spring Boot, Hibernate, JAX).
- Strong computer science knowledge in algorithms, data structures, database concepts, and SQL technologies.
- Experience with storage technologies such as Cosmos DB, Elasticsearch, Hive, Cassandra, Hadoop, and Kafka is good to have.
- Cloud development experience.
- Good to have: experience in HTML5, JavaScript, CSS3, AJAX, GraphQL, React Native, React, Redux, Webpack, and Node.
- Experience building scalable/highly available distributed systems in production.
- Understanding of stream processing, with knowledge of Kafka.
- Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems.
- Very good expertise in production-support activities (issue identification and resolution).

About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That’s what we do at Walmart Global Tech. We’re a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world’s leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions, and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we’re able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Minimum Qualifications
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 3 years’ experience in software engineering or a related area at a technology, retail, or data-driven company.
Option 2: 5 years’ experience in software engineering or a related area at a technology, retail, or data-driven company.

Preferred Qualifications
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Certification in Security+, GISF, CISSP, CCSP, or GSEC; Master’s degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 1 year’s experience leading information security or cybersecurity projects. Information Technology - CISCO Certification.

Primary Location
Pardhanani Wilshire II, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-2180398

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: Senior Associate Tower: Data, Analytics & Specialist Managed Service Experience: 6 - 10 years Key Skills: Azure Educational Qualification: BE / B Tech / ME / M Tech / MBA Work Location: Bangalore Job Description As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket Quality and deliverables review, Status Reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working, with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player, take up cross competency work and contribute to COE activities. Escalation/Risk management. Position Requirements: Required Skills: Azure Cloud Engineer: Job description: Candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have minimum 6 years hand on experience building advanced Data warehousing solutions on leading cloud platforms. Should have minimum 3-5 years of Operate/Managed Services/Production Support Experience Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data that for downstream consumption like Business Intelligence systems, Analytics modelling, Data scientists etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient, ETL/ELT processes using industry leading tools like Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, Python etc. Should have Hands-on experience with Data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Utilizing Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Perform data transformation and processing tasks to prepare the data for analysis and reporting in Azure Databricks or Azure Synapse Analytics for large-scale data transformations using tools like Apache Spark. Implementing data validation and cleansing procedures will ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. 
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases Should have experience in Building and maintaining Data Governance solutions (Data Quality, Metadata management, Lineage, Master Data Management and Data security) using industry leading tools Scaling and optimizing schema and performance tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have Hands-on experience with Data analytics tools like databricks Should have Experience of ITIL processes like Incident management, Problem Management, Knowledge management, Release management, Data DevOps etc. Should have Strong communication, problem solving, quantitative and analytical abilities. Nice to have: Azure certification Managed Services- Data, Analytics & Insights Managed Service At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients’ better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our client’s enterprise through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative first approach to operations, leveraging our deep industry insights combined with world class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world class business and technology capabilities that keep pace with today’s dynamic business environment. Within our global, Managed Services platform, we provide Data, Analytics & Insights where we focus more so on the evolution of our clients’ Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to your business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment capable of working on a mix of critical Data, Analytics & Insights offerings and engagement including help desk support, enhancement, and optimization work, as well as strategic roadmap and advisory level work. 
It will also be key to lend experience and effort to helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
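To make the Azure Databricks responsibilities above concrete, a minimal sketch of such an ETL step might look like the following PySpark snippet. The storage paths and column names are hypothetical, not taken from the listing.

```python
# Minimal illustrative sketch (hypothetical paths/columns) of an
# Azure Databricks ETL step: ingest raw files, cleanse, write Delta.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-etl").getOrCreate()

# Ingest raw CSVs landed in Azure Data Lake Storage (hypothetical container).
raw = (spark.read
       .option("header", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/2025/07/"))

# Cleanse: drop duplicates, enforce types, reject rows missing the key.
clean = (raw.dropDuplicates(["order_id"])
            .withColumn("amount", F.col("amount").cast("double"))
            .filter(F.col("order_id").isNotNull()))

# Write a partitioned Delta table for downstream BI/analytics consumers.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("abfss://curated@examplelake.dfs.core.windows.net/sales_clean/"))
```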

Posted 6 days ago

Apply

4.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: As a Data Engineer, you will be responsible for designing, implementing, and maintaining our data infrastructure to support our rapidly growing business needs. The ideal candidate will have expertise in Apache Iceberg, Apache Hive, Apache Hadoop, SparkSQL, YARN, HDFS, MySQL, data modeling, data warehousing, Spark architecture, and SQL query optimization. Experience with Apache Flink, PySpark, automated data quality testing, and data migration is considered a plus. Working knowledge of at least one cloud stack (AWS or Azure) for data engineering is mandatory, in order to create data jobs and workflows and schedule them for automation.

Job Responsibilities & Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field; a Master's degree is preferred.
- 4-5 years of experience working as a Data Engineer.
- Mandatory experience in PySpark development for big data processing.
- Strong proficiency in Apache Iceberg, Apache Hive, Apache Hadoop, SparkSQL, YARN, HDFS, data modeling, and data warehousing.
- Core PySpark development, plus optimizing SQL queries and performance tuning to ensure optimal data retrieval and processing (see the sketch below).
- Experience with Apache Flink and automated data quality testing is a plus.
- Mandatory working knowledge of at least one cloud stack (AWS or Azure) for data engineering, to create data jobs and workflows and schedule them for automation.

Join Xiaomi India Technology and be part of a team that is shaping the future of technology innovation. Apply now and embark on an exciting journey with us!
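To illustrate the SparkSQL and query-optimization skills this role calls for, two of the most common techniques are partition pruning and broadcast joins. A minimal sketch, with hypothetical database, table, and column names:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (SparkSession.builder
         .appName("query-optimization-sketch")
         .enableHiveSupport()   # read Hive-managed tables via the metastore
         .getOrCreate())

# Partition pruning: filtering on the Hive partition column (hypothetical
# 'dt') lets Spark skip whole HDFS directories instead of scanning them.
events = spark.table("analytics.events").where("dt = '2025-07-01'")

# Broadcast join: ship the small dimension table to every executor so the
# large fact table is never shuffled across the cluster.
devices = spark.table("analytics.device_dim")
joined = events.join(broadcast(devices), "device_id")

# EXPLAIN confirms the pruned scan and BroadcastHashJoin in the plan.
joined.explain()
```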

Posted 6 days ago

Apply

0.0 years

1 - 2 Lacs

Mumbai

Work from Office

1-year paid internship for a Linux System Administrator.
What you will learn:
- Administering high-traffic production systems (IN, US, AU)
- Deploying and troubleshooting systems
- Participating in the on-call rotation
Required candidate profile. The candidate should:
- have excellent communication skills
- learn quickly and get things done on time
- have completed a Linux admin / DevOps course (a must)
- refer to the key skills listed for the required skillset

Posted 6 days ago

Apply

6.0 - 10.0 years

13 - 18 Lacs

Mumbai

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are looking for a skilled Data Engineer to design, build, and maintain scalable, secure, and high-performance data solutions. This role spans the full data engineering lifecycle, from research and architecture to deployment and support, within cloud-native environments, with a strong focus on AWS and Kubernetes (EKS).

Primary Responsibilities:
- Data engineering lifecycle: lead research, proof of concept, architecture, development, testing, deployment, and ongoing maintenance of data solutions.
- Data solutions: design and implement modular, flexible, secure, and reliable data systems that scale with business needs.
- Instrumentation and monitoring: integrate pipeline observability to detect and resolve issues proactively (a minimal sketch follows this listing).
- Troubleshooting and optimization: develop tools and processes to debug, optimize, and maintain production systems.
- Tech debt reduction: identify and address legacy inefficiencies to improve performance and maintainability.
- Debugging and troubleshooting: quickly diagnose and resolve unknown issues across complex systems.
- Documentation and governance: maintain clear documentation of data models, transformations, and pipelines to ensure security and governance compliance.
- Cloud expertise: leverage advanced skills in AWS and EKS to build, deploy, and scale cloud-native data platforms.
- Cross-functional support: collaborate with analytics, application development, and business teams to enable data-driven solutions.
- Team leadership: lead and mentor engineering teams to ensure operational efficiency and innovation.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree in Computer Science or a related field
- 5+ years of experience in data engineering or related roles
- Proven experience designing and deploying scalable, secure, high-quality data solutions
- Solid expertise across the full data engineering lifecycle (research to maintenance)
- Advanced AWS and EKS knowledge
- Proficiency in CI/CD, IaC, and addressing tech debt
- Proven skill in monitoring and instrumentation of data pipelines
- Proven advanced troubleshooting and performance optimization abilities
- Proven ownership mindset with the ability to manage multiple components
- Proven effectiveness as a cross-functional collaborator (data scientists, SMEs, and external teams)
- Proven exceptional debugging and problem-solving skills
- Proven solid individual contributor with a team-first approach

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
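As a rough illustration of the "instrumentation and monitoring" responsibility above, pipeline observability often starts with structured logs and step timing. A minimal, standard-library-only sketch; the step itself is hypothetical:

```python
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(name)s %(message)s")
log = logging.getLogger("pipeline")

def instrumented(step_name):
    """Log start/finish/duration and re-raise failures so alerts can fire."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            log.info("step=%s status=started", step_name)
            try:
                result = fn(*args, **kwargs)
            except Exception:
                log.exception("step=%s status=failed", step_name)
                raise
            log.info("step=%s status=ok duration_s=%.2f",
                     step_name, time.monotonic() - start)
            return result
        return wrapper
    return decorator

@instrumented("extract")
def extract():
    # Hypothetical extract step; real code would pull from a source system.
    return [1, 2, 3]

if __name__ == "__main__":
    extract()
```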

Posted 6 days ago

Apply


6.0 - 9.5 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Role: Associate
Tower: Data, Analytics & Specialist Managed Service
Experience: 6 - 9.5 years
Key Skills: AWS
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: Bangalore

Job Description
As an Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and to address development areas.
- Be flexible to work on stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables; handle status reporting for the project.
- Adhere to SLAs, with experience in incident management, change management, and problem management.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward, structured communication when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Be a good team player; take up cross-competency work and contribute to CoE activities.
- Handle escalation and risk management.

Position Requirements
Required Skills: AWS Cloud Engineer
The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
- A minimum of 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
- A minimum of 1-3 years of Operate/Managed Services/Production Support experience.
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumers such as Business Intelligence systems, analytics modelling, and data scientists.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Experience building efficient ETL/ELT processes using industry-leading tools such as AWS, AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, dbt, Prefect, and Snowflake.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS.
- Work with data scientists and analysts to understand data needs and create effective data workflows.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data (see the sketch after this listing).
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage.
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and maintaining data governance solutions (data quality, metadata management, lineage, master data management, and data security) using industry-leading tools.
- Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
- Hands-on experience with data analytics tools such as Informatica, Collibra, Hadoop, Spark, and Snowflake.
- Experience with ITIL processes such as incident management, problem management, knowledge management, release management, and data DevOps.
- Strong communication, problem-solving, quantitative, and analytical abilities.
Nice to have: AWS certification.

Managed Services - Data, Analytics & Insights Managed Service
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better.
Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes.
With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.
Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients' Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective.
As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and can work on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
Kindly send your resume to deepa.radhakrishnan.nair@pwc.com. Regards, Deepa Radhakrishnan
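To give a concrete flavor of the data validation and cleansing work this role describes, a minimal PySpark sketch might look like the following; the bucket paths, columns, and rules are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("validation-sketch").getOrCreate()

# Hypothetical landing zone on S3.
df = spark.read.parquet("s3://example-bucket/landing/orders/")

# Simple declarative checks: required keys, non-negative amounts, valid dates.
checks = {
    "null_order_id": F.col("order_id").isNull(),
    "negative_amount": F.col("amount") < 0,
    "bad_date": F.to_date("order_date").isNull(),
}

# Combine the checks: a row fails if any single check fails.
failed_any = None
for expr in checks.values():
    failed_any = expr if failed_any is None else (failed_any | expr)

# Quarantine failing rows for inspection; pass the rest downstream.
quarantine = df.filter(failed_any)
good = df.filter(~failed_any)

quarantine.write.mode("append").parquet("s3://example-bucket/quarantine/orders/")
good.write.mode("overwrite").parquet("s3://example-bucket/validated/orders/")
```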

Posted 6 days ago

Apply

5.5 - 9.9 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: Azure Data Engineer - Senior Associate
Experience: 5.5 - 9.9 years
Key Skills: Azure
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: Bangalore

Job Description
As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and to address development areas.
- Be flexible to work on stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables; handle status reporting for the project.
- Adhere to SLAs, with experience in incident management, change management, and problem management.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward, structured communication when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Be a good team player; take up cross-competency work and contribute to CoE activities.
- Handle escalation and risk management.

Position Requirements
Required Skills: Azure Cloud Engineer
The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
- A minimum of 6 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
- A minimum of 3-5 years of Operate/Managed Services/Production Support experience.
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumers such as Business Intelligence systems, analytics modelling, and data scientists.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Experience building efficient ETL/ELT processes using industry-leading tools such as Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, and Python.
- Hands-on experience with data analytics tools such as Informatica, Collibra, Hadoop, Spark, and Snowflake.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
- Work with data scientists and analysts to understand data needs and create effective data workflows.
- Create and maintain data storage solutions, including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
- Create and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory or comparable technologies (a merge-based sketch follows this listing).
- Perform data transformation and processing tasks to prepare data for analysis and reporting, using Azure Databricks or Azure Synapse Analytics for large-scale transformations with tools like Apache Spark.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage.
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and maintaining data governance solutions (data quality, metadata management, lineage, master data management, and data security) using industry-leading tools.
- Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
- Hands-on experience with data analytics tools such as Databricks.
- Experience with ITIL processes such as incident management, problem management, knowledge management, release management, and data DevOps.
- Strong communication, problem-solving, quantitative, and analytical abilities.
Nice to have: Azure certification.

Managed Services - Data, Analytics & Insights Managed Service
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better.
Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes.
With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.
Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients' Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective.
As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and can work on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work.
It will also be key to lend experience and effort to helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
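One way the incremental ETL work described above is commonly expressed on Azure Databricks is a Delta Lake merge (upsert). A minimal sketch, with hypothetical paths and keys:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-merge-sketch").getOrCreate()

# Hypothetical daily increment landed by an upstream Azure Data Factory copy.
updates = spark.read.parquet(
    "abfss://staging@examplelake.dfs.core.windows.net/customers_delta/")

target = DeltaTable.forPath(
    spark, "abfss://curated@examplelake.dfs.core.windows.net/customers/")

# Upsert: update existing customer rows, insert the new ones.
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```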

Posted 6 days ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary
Work as a Senior Developer for a strategic tax reporting application under Finance Technology. The individual will be responsible for end-to-end development, testing, and implementation of data solutions using the Dataiku tool, with Python, Spark, Hadoop, and Hive as the core programming languages and frameworks, to develop data products and APIs, integrate with other applications within the Bank, and leverage DevOps for continuous integration and continuous delivery.

Project Background
As part of the strategic tax reporting solution, the aim is to provide automated tax provision calculations for Group financial reporting and local statutory reporting, and to provide inputs for the global tax returns compliance process. The strategic reporting tool will use a vendor product called "Longview" as the tax calculation engine and will integrate with the enterprise ASPIRE and EDM infrastructure to provide the required automation capabilities. To integrate SAP S4 HANA with Longview, the interim architecture will use Dataiku to invoke S4 APIs to retrieve the GL data required for tax reporting by Longview and other tax processes as appropriate. (A conceptual sketch of this integration appears at the end of this listing.)

Key Responsibilities
Business:
- Responsible for end-to-end development, with tasks covering (but not limited to) analysis, design, development, data management, DevOps integration via pipeline build and maintenance, Level 3 support, issue analysis and debugging, performance tuning, configuration management, automation, and monitoring.
Processes:
- Ensure adherence to the change and incident management processes by coordinating and working with other teams in the organization to get a release deployed into production.
People and Talent:
- Effectively use teamwork to contribute positively to a high-morale, high-performance team culture, leading by example.
- A consulting attitude; approachable and ready to offer insights or assistance.
- Results-oriented with a positive attitude.
- An effective team player and collaborator.
- Strong personal integrity.
- Strong written and verbal communication skills; can effectively communicate business and technical information across the organization, being sensitive to the needs of unique audiences.
Risk Management:
- Work in coordination with production support and SRE to maintain the stability of production applications via controlled, automated deployments with minimal outages and impact.
- Able to collaborate with relevant IT teams and business users, whether product owners or end users, to manage problem resolution effectively.
- Work cross-functionally and think both critically and strategically.
Process and Governance:
- Document detailed unit test results, application functionality, design, etc. in Confluence and JIRA with supporting evidence and test cases.
- Embrace the DevOps methodology for development, build, and deployment, and work in an Agile delivery model.
- Coordinate UAT and system integration testing; answer user queries and support UAT/SIT/PT and other related activities.
- Provide updates to the project/program manager with regard to progress and issues, where appropriate.
Regulatory & Business Conduct:
- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct.
- Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters.

Skills and Experience
- Overall 8+ years of total experience in IT, preferably in the banking or financial services domain.
- 4+ years of experience in software development using Python and Spark, with strong experience in PySpark and experience developing big data solutions using Hadoop, HDFS, and Hive.
- Experience delivering major projects and programmes on schedule.
- Experience working on cloud-native infrastructure, with exposure to AWS or Azure.
- Good experience with, and a mindset toward, DevSecOps, with exposure to tools like Jenkins, Maven, Artifactory, Gradle, Ansible, and shell scripting.
- Experience with agile frameworks (e.g. Scrum, Kanban, Lean) and toolsets (Git, JIRA, Confluence).
- Experience developing APIs and managing, deploying, and integrating applications.
- Knowledge of Java and JavaScript is good to have.

Non-technical Skills
- Proven ability to work within a team environment.
- Ability to work on multiple tasks based on priorities and to switch between deliverables.
- Highly effective verbal and written English communication and presentation skills.
- Ability to quickly understand and articulate problems and solutions.
- Ability to make sound decisions and use independent judgement.
- Strong reasoning, analytical, and interpersonal skills.
- Excellent attention to detail and time management.
- Good knowledge of Agile practices.

Role-Specific Technical Competencies
- Python
- Spark
- Linux scripting and Ansible
- Gradle, Maven, Jenkins, Docker, Artifactory, and other DevOps tools
- JIRA, Confluence, and Agile
- Java, JavaScript
- Hadoop/Hive

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge, and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us.
Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do.
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well.
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial, and social wellbeing:
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders, and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill, and access to physical, virtual, and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions, and geographies, where everyone feels respected and can realise their full potential.
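As an illustration of the interim architecture described in the Project Background above, in which GL data is pulled from S4 APIs for downstream Spark processing, a conceptual sketch might look like the following. The endpoint, authentication, payload shape, and table name are entirely hypothetical, not the bank's actual code.

```python
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gl-ingest-sketch").getOrCreate()

# Hypothetical REST endpoint standing in for an S4 GL-data API.
resp = requests.get(
    "https://example-s4-gateway.internal/api/gl-balances",
    params={"period": "2025-07"},
    headers={"Authorization": "Bearer <token>"},  # token acquisition omitted
    timeout=60,
)
resp.raise_for_status()
records = resp.json()["results"]  # hypothetical payload shape

# Land the records as a Spark table for the downstream tax calculations.
gl = spark.createDataFrame(records)
gl.write.mode("overwrite").saveAsTable("finance.gl_balances_2025_07")
```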

Posted 6 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Company Description
Decision Point develops analytics and big data solutions for CPG, Retail, and consumer-focused industries, working with global Fortune 500 clients. We provide analytical insights and solutions that help develop sales and marketing strategies in the Retail and CPG industry by leveraging diverse sources of data, including point-of-sale data, syndicated category data, primary shipments, and other similar sources. Decision Point was founded by Ravi Shankar along with his classmates from IIT Madras, who have diverse experience across the CPG and marketing analytics domain. At Decision Point, you will meet data scientists, business consultants, and tech-savvy engineers passionate about extracting every ounce of value from data for our clients.

Role Description
This is a full-time on-site role for a Lead Data Engineer, located in Gurugram. The Lead Data Engineer will be responsible for designing, developing, and maintaining data pipelines, building data models, implementing ETL processes, and managing data warehousing solutions. The role also includes data analytics responsibilities to derive actionable insights for our clients. The candidate will engage with cross-functional teams to understand data requirements and deliver robust data solutions.

Qualifications
- Skills in data engineering and data modeling
- Experience with Extract, Transform, Load (ETL) processes and tools
- Proficiency in data warehousing solutions
- Strong data analytics capabilities
- Excellent problem-solving and analytical skills
- Ability to work collaboratively in a team environment
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Experience in the Retail or CPG industry is a plus
- Proficiency in programming languages such as Python and SQL, and in tools like Hadoop and Spark, is beneficial

Posted 6 days ago

Apply

5.5 - 9.9 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: AWS Data Engineer - Senior Associate
Experience: 5.5 - 9.9 years
Key Skills: AWS
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: Bangalore

Job Description
As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and to address development areas.
- Be flexible to work on stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables; handle status reporting for the project.
- Adhere to SLAs, with experience in incident management, change management, and problem management.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward, structured communication when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Be a good team player; take up cross-competency work and contribute to CoE activities.
- Handle escalation and risk management.

Position Requirements
Required Skills: AWS Cloud Engineer
The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
- A minimum of 5 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
- A minimum of 3-4 years of Operate/Managed Services/Production Support experience.
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumers such as Business Intelligence systems, analytics modelling, and data scientists.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Experience building efficient ETL/ELT processes using industry-leading tools such as AWS, AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, dbt, Prefect, and Snowflake (a skeleton Glue job follows this listing).
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS.
- Work with data scientists and analysts to understand data needs and create effective data workflows.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage.
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and maintaining data governance solutions (data quality, metadata management, lineage, master data management, and data security) using industry-leading tools.
- Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
- Hands-on experience with data analytics tools such as Informatica, Collibra, Hadoop, Spark, and Snowflake.
- Experience with ITIL processes such as incident management, problem management, knowledge management, release management, and data DevOps.
- Strong communication, problem-solving, quantitative, and analytical abilities.
Nice to have: AWS certification.

Managed Services - Data, Analytics & Insights Managed Service
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better.
Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes.
With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.
Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients' Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective.
As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and can work on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
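To make the AWS Glue work above concrete, a skeleton Glue PySpark job might look like the following; the catalog database, table, and S3 paths are hypothetical.

```python
# Skeleton of an AWS Glue PySpark job of the kind the listing describes.
import sys
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw table registered in the Glue Data Catalog (hypothetical names).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders")

# Transform with ordinary Spark: dedupe and drop invalid rows.
df = dyf.toDF().dropDuplicates(["order_id"]).filter("amount >= 0")

# Write curated Parquet back to S3 for downstream consumers.
glue_context.write_dynamic_frame.from_options(
    frame=DynamicFrame.fromDF(df, glue_context, "curated_orders"),
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)
job.commit()
```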

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
