Jobs
Interviews

6785 Hadoop Jobs - Page 23

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the employer's job portal.

4.0 - 8.0 years

2 - 3 Lacs

Chennai

On-site

The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems
- Apply fundamental knowledge of programming languages for design specifications
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging
- Serve as advisor or coach to new or lower-level analysts
- Identify problems, analyze information, and make evaluative judgements to recommend and implement solutions
- Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents
- Operate with a limited level of direct supervision, exercising independence of judgement and autonomy
- Act as SME to senior stakeholders and/or other team members
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Key Responsibilities:
- Design and implement ETL pipelines using PySpark and Big Data tools on platforms such as Hadoop, Hive, and HDFS
- Write scalable Python code for machine learning preprocessing tasks, working with libraries such as pandas and scikit-learn
- Develop data pipelines to support model training, evaluation, and inference

Skills:
- Proficiency in Python programming, with experience in PySpark for large-scale data processing
- Hands-on experience with Big Data technologies: Hadoop, Hive, HDFS, etc.
- Exposure to machine learning workflows, the model lifecycle, and data preparation
- Experience with ML libraries: scikit-learn, XGBoost, TensorFlow, PyTorch, etc.
- Exposure to cloud platforms (AWS/GCP) for data and AI workloads

Qualifications:
- 4-8 years of relevant experience in the financial services industry
- Intermediate-level experience in an Applications Development role
- Consistently demonstrates clear and concise written and verbal communication
- Demonstrated problem-solving and decision-making skills
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
- Job Family Group: Technology
- Job Family: Applications Development
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.
If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi . View Citi’s EEO Policy Statement and the Know Your Rights poster.
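For context on the machine learning preprocessing work this posting describes (cleaning and feature scaling), a minimal sketch in plain Python follows. It is a hypothetical illustration only, not Citi code; a production pipeline would use pandas, scikit-learn, or PySpark as listed above, and the data values are invented.

```python
import statistics

def impute_and_scale(values):
    """Median-impute missing values (None), then min-max scale to [0, 1]."""
    observed = [v for v in values if v is not None]
    median = statistics.median(observed)
    filled = [median if v is None else v for v in values]
    lo, hi = min(filled), max(filled)
    if hi == lo:  # constant feature: map everything to 0.0
        return [0.0 for _ in filled]
    return [(v - lo) / (hi - lo) for v in filled]

# 10.0 and 30.0 become the range endpoints; the None is filled with the median (20.0)
features = impute_and_scale([10.0, None, 30.0, 20.0])
```

The same two steps (imputation, scaling) are what a scikit-learn `Pipeline` or a PySpark job would apply column by column at scale.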

Posted 4 days ago

Apply

5.0 years

2 - 3 Lacs

Chennai

On-site

Job Description

The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Ab Initio Data Engineer

We are looking for an Ab Initio Data Engineer to design and build Ab Initio-based applications across the Data Integration, Governance & Quality domains for Compliance Risk programs. The individual will work with Technical Leads, Senior Solution Engineers, and prospective Application Managers to build applications, roll out and support production environments, leveraging the Ab Initio tech stack, and ensure the overall success of their programs. The programs are high-visibility, fast-paced key initiatives, which generally aim to acquire and curate data and metadata across internal and external sources, provide analytical insights, and integrate with other Citi systems.

Technical Stack:
- Ab Initio 4.0.x software suite: Co>Op, GDE, EME, BRE, Conduct>It, Express>It, Metadata>Hub, Query>It, Control>Center, Easy>Graph
- Big Data: Cloudera Hadoop, Hive, YARN
- Databases: Oracle 11g/12c, Teradata, MongoDB, Snowflake
- Others: JIRA, ServiceNow, Linux, SQL Developer, AutoSys, and Microsoft Office

Responsibilities:
- Design and build Ab Initio graphs (both continuous and batch) and Conduct>It plans, and integrate with the portfolio of Ab Initio software
- Build Web-Service and RESTful graphs and create RAML or Swagger documentation
- Complete understanding of, and analytical ability with, the Metadata Hub metamodel
- Strong hands-on multifile-system-level programming, debugging, and optimization skills
- Hands-on experience developing complex ETL applications
- Good knowledge of RDBMS (Oracle), with the ability to write the complex SQL needed to investigate and analyze data issues
- Strong UNIX shell/Perl scripting
- Build graphs interfacing with heterogeneous data sources: Oracle, Snowflake, Hadoop, Hive, AWS S3
- Build application configurations for Express>It frameworks: Acquire>It, Spec-To-Graph, Data Quality Assessment
- Build automation pipelines for Continuous Integration & Delivery (CI/CD), leveraging the Testing Framework and JUnit modules, integrating with Jenkins, JIRA, and/or ServiceNow
- Build Query>It data sources for cataloguing data from different sources
- Parse XML, JSON, and YAML documents, including hierarchical models
- Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment, and demonstrate experience leveraging various Ab Initio components
- Build AutoSys or Control Center jobs and schedules for process orchestration
- Build BRE rulesets for reformat, rollup, and validation use cases
- Build SQL scripts on the database, perform performance tuning and relational model analysis, and perform data migrations
- Ability to identify performance bottlenecks in graphs and optimize them
- Ensure the Ab Initio code base is appropriately engineered to maintain current functionality, and that development adheres to performance optimization, interoperability standards and requirements, and compliance with client IT governance policies
- Build regression test cases and functional test cases, and write user manuals for various projects
- Conduct bug fixing, code reviews, and unit, functional, and integration testing
- Participate in the agile development process, and document and communicate issues and bugs relative to data standards
- Pair up with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids
- Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment
- Perform other duties and/or special projects as assigned

Qualifications:
- Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics) and a minimum of 5 years of experience
- Minimum 5 years of extensive experience in the design, build, and deployment of Ab Initio-based applications
- Expertise in handling complex, large-scale Data Lake and Warehouse environments
- Hands-on experience writing complex SQL queries, and exporting and importing large amounts of data using utilities

Education:
- Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
- Job Family Group: Technology
- Job Family: Applications Development
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
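The responsibilities above include parsing hierarchical XML and JSON documents. As a rough illustration of that task, using Python's standard library rather than Ab Initio components (and omitting YAML, which has no stdlib parser), with invented sample documents:

```python
import json
import xml.etree.ElementTree as ET

# Invented sample documents with a small hierarchy (account -> holders).
JSON_DOC = '{"account": {"id": "A-1", "holders": [{"name": "Kim"}, {"name": "Lee"}]}}'
XML_DOC = '<account id="A-1"><holder>Kim</holder><holder>Lee</holder></account>'

# JSON: after json.loads, hierarchical access is plain dict/list indexing.
record = json.loads(JSON_DOC)
json_names = [h["name"] for h in record["account"]["holders"]]

# XML: ElementTree exposes the element hierarchy via findall() and attrib.
root = ET.fromstring(XML_DOC)
xml_names = [el.text for el in root.findall("holder")]
```

In an Ab Initio graph the equivalent work would be done by the suite's own read/normalize components; the point here is only the shape of hierarchical-document parsing.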

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune

On-site

Job description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
- Data Pipelines Integration and Management: demonstrate expertise in Scala-Spark/Python-Spark development and work with the Agile application development team to implement data strategies
- Design and implement scalable data architectures to support the bank's data needs
- Develop and maintain ETL (Extract, Transform, Load) processes
- Ensure the data infrastructure is reliable, scalable, and secure
- Oversee the integration of diverse data sources into a cohesive data platform
- Ensure data quality, data governance, and compliance with regulatory requirements
- Monitor and optimize data pipeline performance
- Troubleshoot and resolve data-related issues promptly
- Implement monitoring and alerting systems for data processes
- Troubleshoot and resolve technical issues, optimizing system performance and ensuring reliability
- Create and maintain technical documentation for new and existing systems, ensuring that information is accessible to the team
- Implement and monitor solutions that identify both system bottlenecks and production issues

Requirements

To be successful in this role, you should meet the following requirements:
- 5+ years of experience in data engineering or a related field, with hands-on experience building and maintaining ETL data pipelines
- Good experience designing and developing Spark applications using Scala or Python
- Good experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark)
- Proficiency in programming languages such as Python, Java, or Scala
- Optimization and performance tuning of Spark applications
- Git experience creating, merging, and managing repos
- Ability to perform unit testing and performance testing
- Good understanding of ETL processes and data pipeline orchestration tools such as Airflow and Control-M
- Strong problem-solving skills and ability to work under pressure
- Excellent communication and interpersonal skills

The successful candidate will also meet the following requirements (good to have):
- Experience in the banking or financial services industry
- Familiarity with regulatory requirements related to data security and privacy in the banking sector
- Experience with cloud platforms (Google Cloud) and their data services
- Certifications in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.)

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected, and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment.
Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
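The requirements above name pipeline orchestration tools such as Airflow and Control-M. At their core, these tools run a DAG of tasks in dependency order; a minimal standard-library Python sketch of that idea, with invented task names, follows.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical ETL task graph: each task maps to the set of tasks it depends on.
# Orchestrators such as Airflow or Control-M express the same idea as a DAG of jobs.
pipeline = {
    "extract_orders":    set(),
    "extract_customers": set(),
    "transform_join":    {"extract_orders", "extract_customers"},
    "load_warehouse":    {"transform_join"},
}

# static_order() yields every task only after all of its dependencies.
run_order = list(TopologicalSorter(pipeline).static_order())
```

Real orchestrators add the parts this sketch omits: scheduling, retries, backfills, and the monitoring and alerting the role also asks for.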

Posted 4 days ago

Apply

8.0 years

8 - 9 Lacs

Pune

On-site

Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Advisor, Statistical Analysis

- Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, Mathematics, or a related field
- 8+ years of experience in data science, preferably in the payments or fintech industry
- Proficiency in Python, SQL, and data science libraries (e.g., pandas, scikit-learn, TensorFlow, PyTorch)
- Strong understanding of payment systems, transaction flows, and fraud detection techniques
- Experience with big data technologies (e.g., Spark, Hadoop) and cloud platforms (e.g., AWS, GCP, Azure)
- Excellent problem-solving skills and the ability to work in a fast-paced, collaborative environment

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable)

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
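The role above calls for an understanding of fraud detection techniques on transaction flows. A real system would use the ML stack the posting lists; purely as a toy illustration, here is a simple statistical rule (invented amounts, plain Python) that flags transactions far from the mean.

```python
import statistics

def flag_outliers(amounts, threshold=3.0):
    """Flag amounts more than `threshold` population standard deviations from the mean."""
    mean = statistics.fmean(amounts)
    spread = statistics.pstdev(amounts)
    if spread == 0:  # all amounts identical: nothing to flag
        return [False] * len(amounts)
    return [abs(a - mean) / spread > threshold for a in amounts]

# Invented transaction amounts; the 980.0 charge stands out from the rest.
flags = flag_outliers([12.0, 9.5, 11.0, 10.5, 980.0], threshold=1.5)
```

Z-score rules like this are a common baseline; production fraud models (gradient boosting, deep learning) learn far richer features from transaction history.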

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bengaluru

On-site

Job description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will:
- Data Pipelines Integration and Management: demonstrate expertise in Scala-Spark/Python-Spark development and work with the Agile application development team to implement data strategies
- Design and implement scalable data architectures to support the bank's data needs
- Develop and maintain ETL (Extract, Transform, Load) processes
- Ensure the data infrastructure is reliable, scalable, and secure
- Oversee the integration of diverse data sources into a cohesive data platform
- Ensure data quality, data governance, and compliance with regulatory requirements
- Monitor and optimize data pipeline performance
- Troubleshoot and resolve data-related issues promptly
- Implement monitoring and alerting systems for data processes
- Troubleshoot and resolve technical issues, optimizing system performance and ensuring reliability
- Create and maintain technical documentation for new and existing systems, ensuring that information is accessible to the team
- Implement and monitor solutions that identify both system bottlenecks and production issues

Requirements

To be successful in this role, you should meet the following requirements:
- 5+ years of experience in data engineering or a related field, with hands-on experience building and maintaining ETL data pipelines
- Good experience designing and developing Spark applications using Scala or Python
- Good experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark)
- Proficiency in programming languages such as Python, Java, or Scala
- Optimization and performance tuning of Spark applications
- Git experience creating, merging, and managing repos
- Ability to perform unit testing and performance testing
- Good understanding of ETL processes and data pipeline orchestration tools such as Airflow and Control-M
- Strong problem-solving skills and ability to work under pressure
- Excellent communication and interpersonal skills

The successful candidate will also meet the following requirements (good to have):
- Experience in the banking or financial services industry
- Familiarity with regulatory requirements related to data security and privacy in the banking sector
- Experience with cloud platforms (Google Cloud) and their data services
- Certifications in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.)

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected, and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Karnataka

On-site

At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Global Payments and Risk is at the forefront of innovation, integrating groundbreaking technologies into our products and services. We are dedicated to using the power of data-driven solutions to solve real-world problems at internet scale, improve user experiences, and drive business outcomes. Join us in shaping the future with your expertise and passion for technology.

We are seeking a motivated Software Engineer with a strong background in full-stack software development. The ideal candidate has a platform-centric approach and a proven track record of building platforms and frameworks to solve complex problems. You will be instrumental in crafting innovative applications that safeguard our marketplace, mitigate risks, and curtail financial losses. Join our collaborative team that thrives on creativity and resourcefulness to tackle sophisticated challenges.

What you will accomplish:
- Develop high-performing solutions that align with eBay's business and technology strategies to enhance risk management, trust, and compliance
- Research, analyze, design, develop, and test solutions that are appropriate for the business and technology strategies
- Participate in design discussions, code reviews, and project-related team meetings, contributing significantly to the development process
- Collaborate effectively within a multi-functional team of engineers, architects, product managers, and operations to deliver innovative solutions that address business needs, performance, scale, and reliability
- Acquire domain expertise and apply this knowledge to tackle product challenges, ensuring continuous improvement in the domain
- Act as an onboarding buddy for new joiners, fostering a supportive and inclusive work environment

What you will bring:
- At least 8 years of software design and development experience, with a proven foundation in computer science and strong competencies in data structures, algorithms, distributed computing, and scalable software design
- Hands-on expertise with architectural and design patterns, open-source platforms and frameworks, technologies, and software engineering methodologies
- Hands-on experience developing applications with Spring/Spring Boot, REST, GraphQL, Java, JEE, and Spring Batch
- Hands-on experience building data models in Oracle/MySQL/RDBMS and NoSQL databases, e.g., key-value stores and document stores such as MongoDB, Couchbase, and Cassandra
- Hands-on experience building tools and user experiences using HTML5, Node.js, React, Arco Design, and Material Design
- Hands-on experience fine-tuning performance bottlenecks in Java, Node.js, and JavaScript
- Proficiency in data and streaming technologies such as Hadoop, Spark, Kafka, and Apache Flink
- Practiced agile development and the ability to adapt to changes in business priorities
- Experience building sophisticated integration solutions for internet-scale traffic is a major plus
- Risk domain and rule engine expertise is a major plus
- Excellent decision-making, communication, and collaboration skills
- Familiarity with prompt engineering and AI tools is a major plus
- Experience with Prometheus, Grafana, OLAP, and DevOps tools for observability is required

Behaviors:
- Innovates effectively in a dynamic, fast-changing environment; challenges convention
- Develops solutions that deliver tangible results
- Strong execution, alignment with timelines, and timely addressing of blocking issues when risks arise
- Practices learning and collaborates effectively in a multi-functional team

Education: Degree in computer science or an equivalent discipline with 8+ years of software application development experience.

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay. eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru

On-site

Job description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will:
- Influence a cross-functional/cross-cultural team and the performance of individuals and teams against performance objectives and plans
- Endorse team engagement initiatives, fostering an environment which encourages learning and collaboration to build a sense of community
- Create environments where only the best will do and high standards are expected, regularly achieved and appropriately rewarded; encourage and support continual improvements within the team based on ongoing feedback
- Develop a network of professional relationships across Wholesale Data & Analytics and our stakeholders to improve collaborative working and encourage openness - sharing ideas, information and collateral
- Encourage individuals to network and collaborate with colleagues beyond their own business areas and/or the Group to shape change and benefit the business and its customers
- Support the PMO and delivery managers in documenting accurate status reports as required, in a timely manner

Requirements

To be successful in this role, you should meet the following requirements:
- Serve as the voice of the customer
- Develop, scope, and define backlog items (epics/features/user stories) that guide the development team
- Manage the capture, analysis and documentation of data & analytics requirements and processes (both business and IT)
- Create analyses of customer journeys and own the product roadmap
- Manage implementation of data & analytics solutions
- Manage change interventions, from training and adoption planning to stakeholder management
- Track and document progress and manage delivery status
- Help define and track measures of success for products
- Risk identification, risk reporting, and devising interventions to mitigate risks
- Budget management and forecasting
- Management of internal delivery teams and/or external service providers
- Data Platform, Service and Product Owner experience
- Experience applying Design Thinking
- A data-driven mindset
- Outstanding cross-platform knowledge (incl. Teradata, Hadoop and GCP); understands the nuances of delivery across these platforms, both technically and functionally, and how they impact delivery
- Communication capabilities, decision-making, problem-solving skills, lateral thinking, and analytical and interpersonal skills
- Experience leading a team and managing multiple, competing priorities and demands in a dynamic environment
- Highly analytical with strong attention to detail
- Demonstrable track record of delivery in a Banking and Financial Markets context
- A strong and diverse technical background and foundation (ETL, SQL, NoSQL, APIs, Data Architecture, Data Management principles and patterns, Data Ingest, Data Refinery, Data Provision, etc.)
- Experience customising and managing integration tools, databases, warehouses and analytical tools
- A passion for designing towards consistency and efficiency, and always striving for continuous improvements via automated processes
- Experience in Agile project methodologies and tools such as JIRA and Confluence

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected, and where opinions count.
We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment.

Posted 4 days ago

Apply

0 years

0 Lacs

Sonipat, Haryana, India

On-site

About the Role

Overview: Newton School of Technology is on a mission to transform technology education and bridge the employability gap. As India’s first impact university, we are committed to revolutionizing learning, empowering students, and shaping the future of the tech industry. Backed by renowned professionals and industry leaders, we aim to solve the employability challenge and create a lasting impact on society. We are currently looking for a Data Engineer + Instructor – Data Mining to join our Computer Science Department. This is a full-time academic role focused on data mining, analytics, and teaching/mentoring students in core data science and engineering topics.

Key Responsibilities:
● Develop and deliver comprehensive and engaging lectures for the undergraduate “Data Mining”, “Big Data” and “Data Analytics” courses, covering the full syllabus from foundational concepts to advanced techniques.
● Instruct students on the complete data lifecycle, including data preprocessing, cleaning, transformation, and feature engineering.
● Teach the theory, implementation, and evaluation of a wide range of algorithms for classification, association rule mining, clustering, and anomaly detection.
● Design and facilitate practical lab sessions and assignments that give students hands-on experience with modern data tools and software.
● Develop and grade assessments, including assignments, projects, and examinations, that effectively measure the Course Learning Objectives (CLOs).
● Mentor and guide students on projects, encouraging them to work with real-world or benchmark datasets (e.g., from Kaggle).
● Stay current with the latest advancements, research, and industry trends in data engineering and machine learning to ensure the curriculum remains relevant and cutting-edge.
● Contribute to the academic and research environment of the department and the university.

Required Qualifications:
● A Ph.D. (or a Master's degree with significant, relevant industry experience) in Computer Science, Data Science, Artificial Intelligence, or a closely related field.
● Demonstrable expertise in the core concepts of data engineering and machine learning as outlined in the syllabus.
● Strong practical proficiency in Python and its data science ecosystem, specifically scikit-learn, pandas, NumPy, and visualization libraries (e.g., Matplotlib, Seaborn).
● Proven experience in teaching, preferably at the undergraduate level, with an ability to make complex topics accessible and engaging.
● Excellent communication and interpersonal skills.

Preferred Qualifications:
● A strong record of academic publications in reputable data mining, machine learning, or AI conferences/journals.
● Prior industry experience as a Data Scientist, Big Data Engineer, Machine Learning Engineer, or in a similar role.
● Experience with big data technologies (e.g., Spark, Hadoop) and/or deep learning frameworks (e.g., TensorFlow, PyTorch).
● Experience in mentoring student teams for data science competitions or hackathons.

Perks & Benefits:
● Competitive salary packages aligned with industry standards.
● Access to state-of-the-art labs and classroom facilities.
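The syllabus above covers classification among the core algorithms. As a classroom-sized illustration (invented data, plain Python rather than scikit-learn), here is the simplest classifier, 1-nearest-neighbor on a single feature:

```python
def nearest_neighbor_predict(train, query):
    """1-nearest-neighbor classification on a 1-D feature: return the label
    of the training point closest to the query."""
    _, label = min(train, key=lambda pair: abs(pair[0] - query))
    return label

# Invented classroom data: (feature, label) pairs.
train = [(1.0, "low"), (2.0, "low"), (10.0, "high"), (12.0, "high")]
prediction = nearest_neighbor_predict(train, 3.0)  # closest training point is 2.0
```

The same idea generalizes to k neighbors and multi-dimensional features, which is where scikit-learn's `KNeighborsClassifier` would take over in a lab session.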

Posted 4 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC? At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
" Job Description & Summary: A career within PWC Responsibilities: Job Title: Cloud Data Engineer (AWS/Azure/Databricks/GCP) Experience:3-8 years in Data Engineering Job Description: We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions. Key Responsibilities: - Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS, Azure, Databricks, and GCP. - Implement data ingestion and transformation processes to facilitate efficient data warehousing. - Utilize cloud services to enhance data processing capabilities: - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS. - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus. - GCP: Dataflow, BigQuery, DataProc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion. - Optimize Spark job performance to ensure high efficiency and reliability. - Stay proactive in learning and implementing new technologies to improve data processing frameworks. - Collaborate with cross-functional teams to deliver robust data solutions. - Work on Spark Streaming for real-time data processing as necessary. Qualifications: - 3-8 years of experience in data engineering with a strong focus on cloud environments. - Proficiency in PySpark or Spark is mandatory. - Proven experience with data ingestion, transformation, and data warehousing. - In-depth knowledge and hands-on experience with cloud services(AWS/Azure/GCP): - Demonstrated ability in performance optimization of Spark jobs. - Strong problem-solving skills and the ability to work independently as well as in a team. - Cloud Certification (AWS, Azure, or GCP) is a plus. 
- Familiarity with Spark Streaming is a bonus. Mandatory skill sets: Python, Pyspark, SQL with (AWS or Azure or GCP) Preferred skill sets: Python, Pyspark, SQL with (AWS or Azure or GCP) Years of experience required: 3-8 years Education qualification: BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Technology, Bachelor of Engineering, Master of Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills PySpark, Python (Programming Language) Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
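The Spark job performance optimization this role calls for often means mitigating key skew in aggregations. A framework-free sketch of the key-salting idea, in plain Python rather than PySpark (the bucket count, keys, and two-phase count are illustrative, not from any specific pipeline):

```python
import random

def salt_key(key, num_salts=4, rng=random):
    """Spread a hot key across several buckets by appending a random salt."""
    return f"{key}#{rng.randrange(num_salts)}"

def unsalt_key(salted):
    """Recover the original key after the per-salt partial aggregation."""
    return salted.rsplit("#", 1)[0]

def aggregate_with_salting(records, num_salts=4):
    """Two-phase count: partial counts per salted key, then merge by real key.

    In Spark this corresponds to a reduceByKey on salted keys followed by a
    second reduceByKey on the unsalted keys, so no single partition has to
    process the entire hot key.
    """
    partial = {}
    for key, value in records:
        sk = salt_key(key, num_salts)
        partial[sk] = partial.get(sk, 0) + value
    final = {}
    for sk, count in partial.items():
        k = unsalt_key(sk)
        final[k] = final.get(k, 0) + count
    return final

events = [("user_1", 1)] * 1000 + [("user_2", 1)] * 3  # user_1 is the hot key
print(aggregate_with_salting(events))  # {'user_1': 1000, 'user_2': 3}
```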

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Programmer Analyst position is an intermediate level role where you will be responsible for contributing to the establishment and implementation of new or updated application systems and programs in collaboration with the Technology team. Your main objective will be to participate in applications systems analysis and programming activities. Your responsibilities will include utilizing your knowledge of applications development procedures and concepts, as well as basic understanding of other technical areas, to identify and define necessary system enhancements. You will be expected to identify and analyze issues, provide recommendations, and implement solutions. Additionally, you will apply your knowledge of business processes, system processes, and industry standards to solve complex problems. Your role will involve analyzing information, making evaluative judgments, recommending solutions and improvements, conducting testing and debugging, utilizing script tools, and writing basic code based on design specifications. You will also need to assess the applicability of similar experiences and evaluate options under circumstances not covered by procedures. Developing a working knowledge of various technical aspects such as Citigroup's information systems, client-server application development, network operations, database administration, systems administration, data center operations, and PC-based applications will be essential. It is crucial that you appropriately assess risk when making business decisions, with a focus on safeguarding Citigroup, its clients, and assets by ensuring compliance with laws, rules, and regulations, adhering to policies, applying ethical judgment, and escalating control issues when necessary. 
Qualifications for this role include having 2-5 years of relevant experience, proficiency in programming/debugging for business applications, familiarity with industry practices and standards, comprehensive knowledge of a specific business area for application development, working knowledge of program languages, and demonstrating clear and concise written and verbal communication consistently. The education requirement for this position is a Bachelor's degree or equivalent experience. This job description offers a detailed overview of the job responsibilities and qualifications required. Other duties related to the role may be assigned as necessary. Skill sets required for this role include at least 3 years of hands-on experience in the data engineering stream, good knowledge of technologies such as Hadoop, Spark, Hive, Impala, performance tuning, the Java programming language, SQL, and Oracle; any certification such as Java/Big Data would be beneficial. Citi is an equal opportunity and affirmative action employer, and invites all qualified interested applicants to apply for career opportunities. If you require reasonable accommodation due to a disability to use search tools or apply for a career opportunity, review Accessibility at Citi.
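The SQL and performance-tuning skills this role asks for can be sketched with stdlib sqlite3 standing in for Oracle/Hive (the table, columns, and data are invented for illustration; the indexing idea carries over):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)"
)
conn.executemany(
    "INSERT INTO trades (symbol, qty) VALUES (?, ?)",
    [("AAPL", 100), ("MSFT", 50), ("AAPL", 25)],
)
# A covering index lets the filtered aggregate below be answered from the
# index alone instead of a full table scan -- the basic tuning move behind
# most "slow query" fixes, whatever the engine.
conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol, qty)")

total = conn.execute(
    "SELECT SUM(qty) FROM trades WHERE symbol = ?", ("AAPL",)
).fetchone()[0]
print(total)  # 125
```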

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You will be working as a Data Scientist/Analyst with a focus on extracting actionable insights from data. Your role will involve utilizing various tools such as Python, Alteryx, R, SQL, Tableau, MINITAB, CPLEX, QlikView, QlikSense, Hadoop, GCP, and ML. To qualify for this position, you should possess a Bachelor's degree in computer or data science, engineering, statistics, operations research, or another quantitative area. You must have experience in using SQL to extract, clean, and transform data from large, complex, nested databases. Proficiency in programming languages such as Python or R in a cloud platform is essential. Additionally, you should have a minimum of 3 years of experience as a researcher, analyst, data scientist, or solution developer with expertise in one or more of the following analytics tools: R, SQL, Tableau, Alteryx, MINITAB, CPLEX, Python, AnyLogic, QlikView, QlikSense, Google Cloud Platform, Hadoop, or SAP. Preferred qualifications include at least 3 years of experience as a researcher, analyst, data scientist, or solution developer, with a strong background in Python, Alteryx, R, SQL, Tableau, MINITAB, CPLEX, QlikView, QlikSense, and Hadoop. Experience in data mining, statistical analysis, modeling, optimization, GCP BigQuery, Vertex AI, and machine learning using Python is highly desirable for this role. Strong written and verbal communication skills, along with a high level of intellectual curiosity, are essential to excel in this position.
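The extract-clean-transform workflow described above typically starts with per-record normalization before any modeling. A minimal plain-Python sketch (the row shape and cleaning rules are hypothetical examples, not from the posting):

```python
def clean_row(row):
    """Normalize one raw record: trim text, coerce numerics, drop bad rows."""
    name = (row.get("name") or "").strip()
    try:
        score = float(row.get("score"))
    except (TypeError, ValueError):
        return None  # unparseable numeric -> exclude from analysis
    if not name:
        return None  # missing key field -> exclude from analysis
    return {"name": name.title(), "score": score}

raw = [
    {"name": "  alice ", "score": "91.5"},
    {"name": "", "score": "70"},      # missing name -> dropped
    {"name": "bob", "score": "n/a"},  # bad numeric -> dropped
]
cleaned = [r for r in (clean_row(x) for x in raw) if r is not None]
print(cleaned)  # [{'name': 'Alice', 'score': 91.5}]
```

In practice the same rules would be expressed in SQL (CASE/CAST with WHERE filters) or pandas, but the shape of the logic is the same.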

Posted 4 days ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango: TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary: As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3-5 years of hands-on experience with GCP data services
- Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience in leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect Certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- Collaborative environment that values innovation and leadership
(ref:hirist.tech)
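Star-schema modeling, listed among the required skills, amounts to a central fact table referencing dimension tables by surrogate key, with analytics done as join-then-rollup. A plain-Python sketch using dictionaries as stand-ins for warehouse tables (the tables, keys, and amounts are invented):

```python
# Dimension tables keyed by surrogate id
dim_product = {
    1: {"name": "Widget", "category": "Tools"},
    2: {"name": "Gadget", "category": "Toys"},
}
dim_date = {20240101: {"year": 2024, "month": 1}}

# Fact table rows reference the dimensions only by key
fact_sales = [
    {"product_id": 1, "date_id": 20240101, "amount": 30.0},
    {"product_id": 2, "date_id": 20240101, "amount": 20.0},
    {"product_id": 1, "date_id": 20240101, "amount": 50.0},
]

def sales_by_category(facts):
    """OLAP-style rollup: join each fact to its dimension, group by category."""
    totals = {}
    for f in facts:
        cat = dim_product[f["product_id"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + f["amount"]
    return totals

print(sales_by_category(fact_sales))  # {'Tools': 80.0, 'Toys': 20.0}
```

In BigQuery the same rollup would be a `JOIN` from the fact table to the product dimension with a `GROUP BY category`; a snowflake schema simply normalizes the dimensions one level further.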

Posted 4 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

Remote

Who We Are: Beyondsoft is a leading mid-sized business IT and consulting company that combines modern technologies and proven methodologies to tailor solutions that move your business forward. Our global head office is based in Singapore, and our team is made up of a diversely talented group of experts who thrive on innovation and pushing the bounds of technology to solve our customers' most pressing challenges. When it comes time to deliver, we set our sights on that sweet spot where brilliance, emerging technologies, best practices, and accountability converge. We have a global presence spanning four continents (North America, South America, Europe, and Asia). Our global network of talent and customer-centric engagement model enables us to provide top-quality services on an unprecedented scale. What We're About: We believe that collaboration, transparency, and accountability are the values that guide our business, our delivery, and our brand. Everyone has something to bring to the table, and we believe in working together with our peers and clients to leverage the best of one another in everything we do. When we proactively collaborate, business decisions become easier, innovation is greater, and outcomes are better. Our ability to achieve our mission and live out our values depends upon a diverse, equitable, and inclusive culture. So, we strive to foster a workplace where people have the respect, support, and voice they deserve, where innovative ideas flourish, and where people can unleash their brilliance. For more information regarding DEI at Beyondsoft, please go to https : SUMMARY: As a Data Engineer, you will be responsible for designing, building, and optimizing scalable data pipelines and infrastructure. You'll work closely with analytics, engineering, and product teams to ensure data integrity and enable high-impact decision-making. This position requires flexibility to work in the PST timezone.
Additional Requirement For Remote Positions: For remote positions, all candidates must complete a video screen with our corporate recruiting team.

What You Will Be Doing:
- Maintain automated data onboarding and diagnostic tools for AIP partners.
- Monitor ADF pipelines and mitigate pipeline runs as needed.
- Maintain the Privacy Dashboard and Bing user interests for Bing Growth team usage.
- Participate in and resolve live sites in related areas.
- Data platform development and maintenance, notebook-based processing pipelines, and MT migration.
- Manage the regular data quality Cosmos/MT jobs.
- Online tooling and support, such as DADT tools.
- Watch for abnormal patterns, perform ad-hoc data quality analysis, and investigate daily user ad-click broken cases.
- Perform additional duties as assigned.

Minimum Qualifications:
- Bachelor's degree or higher in Computer Science or a related field.
- At least 3 years of experience in software development.
- Good-quality software development and understanding.
- Ability to quickly communicate across time zones.
- Excellent written and verbal communication skills in English.
- Self-motivated.
- Coding languages: Java, C#, Python, Scala.
- Technologies: Apache Spark, Apache Flink, Apache Kafka, Hadoop, Cosmos, SQL.
- Azure resource management: Azure Data Factory, Azure Databricks, Azure Key Vault, Managed Identity, Azure Storage, etc.
- MS Project.
- Big data experience is a plus.
- Occasional, infrequent in-person activity may be required.

What We Have To Offer: Because we know how important our people are to the success of our clients, it's a priority to make sure we stay committed to our employees and to making Beyondsoft a great place to work. We take pride in offering competitive compensation and benefits along with a company culture that embodies continuous learning, growth, and training with a dedicated focus on employee satisfaction and work/life balance.
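Monitoring pipelines and mitigating failed runs, as described in the responsibilities above, usually boils down to re-running transient failures with backoff before paging anyone. A minimal stdlib sketch (the run function, attempt limit, and delays are illustrative, not ADF specifics):

```python
import time

def run_with_retry(run_fn, max_attempts=3, base_delay=0.01):
    """Re-run a flaky pipeline step with exponential backoff between attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return run_fn()
        except RuntimeError:
            if attempt == max_attempts:
                raise  # exhausted retries -> surface to the on-call/live-site process
            time.sleep(base_delay * 2 ** (attempt - 1))

attempts = []

def flaky_run():
    """Simulated pipeline run that fails twice, then succeeds."""
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("transient failure")
    return "succeeded"

print(run_with_retry(flaky_run))  # succeeded
```

Real orchestrators (Azure Data Factory activity retry policies, Airflow task retries) expose the same knobs declaratively; the loop above is just the idea laid bare.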
Beyondsoft provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type with regards to race, color, religion, age, sex, national origin, disability status, genetics, veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, and the full employee lifecycle up through and including termination. (ref:hirist.tech)

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Java Developer at our Pune, India location, you will be responsible for producing scalable software solutions on distributed systems like Hadoop using the Spark Framework. You will work within a cross-functional team involved in the full software development life cycle, from conception to deployment. Your role will require expertise in back-end coding, development frameworks, third-party libraries, and Spark APIs essential for application development on distributed platforms like Hadoop. Being a team player with a flair for visual design and utility is crucial, along with familiarity with Agile methodologies. Given that a significant portion of the workloads and applications will be cloud-based, knowledge and experience with Google Cloud Platform (GCP) will be beneficial. Your responsibilities will include collaborating with development teams and product managers to brainstorm software solutions, designing client-side and server-side architecture, building features and applications capable of running on distributed platforms or the cloud, managing well-functioning applications supporting micro-services architecture, testing software for responsiveness and efficiency, troubleshooting, debugging, and upgrading software, establishing security and data protection settings, writing technical and design documentation, and creating effective APIs (REST & SOAP). To be successful in this role, you should have proven experience as a Java Developer or in a similar role, familiarity with common stacks, strong knowledge and working experience of Core Java, Spring Boot, Rest APIs, Spark API, etc. Knowledge of the React framework and UI experience would be advantageous. Proficiency in Junit, Mockito, or other frameworks is necessary, while familiarity with GCP services, design/architecture, and security frameworks is an added advantage. 
Experience with databases like Oracle, PostgreSQL, and BigQuery, as well as developing on distributed application platforms like Hadoop with Spark, is expected. Excellent communication, teamwork, organizational, and analytical skills are essential, along with a degree in Computer Science, Statistics, or a relevant field and experience working in Agile environments. It would be beneficial to have knowledge of JavaScript frameworks (e.g., Angular, React, Node.js) and UI/UX design, Python, and NoSQL databases like HBase and MongoDB. The ideal candidate should have 4-7 years of prior working experience in a global banking/insurance/financial organization. We offer a supportive environment with training and development opportunities, coaching from experts in the team, a culture of continuous learning, and a range of flexible benefits tailored to suit your needs. If you are looking to excel in your career and contribute to a collaborative and inclusive work environment, we invite you to apply for the Java Developer position at our organization. For further information about our company and teams, please visit our company website at https://www.db.com/company/company.htm. We strive to create a culture where every individual is empowered to excel together, act responsibly, think commercially, take initiative, and work collaboratively. We celebrate the successes of our people and promote a positive, fair, and inclusive work environment. Join us at Deutsche Bank Group and be part of a team where together, we achieve excellence every day.
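The Spark-on-Hadoop development this role centers on follows the map-reduce pattern: a map phase emits key-value pairs and a reduce phase merges them per key. A framework-free Python sketch of that flow (word count is the standard teaching example; the shuffle stage is folded into the reduce for brevity):

```python
from functools import reduce

def mapper(line):
    """Map phase: emit (word, 1) pairs for each word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def combine(counts, pair):
    """Reduce phase: merge one (word, count) pair into the running totals."""
    word, n = pair
    counts[word] = counts.get(word, 0) + n
    return counts

lines = ["to be or not to be", "to code"]
pairs = [p for line in lines for p in mapper(line)]
word_counts = reduce(combine, pairs, {})
print(word_counts["to"])  # 3
```

In Spark the same computation is `sc.textFile(...).flatMap(mapper).reduceByKey(add)`; the framework's contribution is distributing the map and shuffle across the cluster, not changing the shape of the logic.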

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. 
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. Role: Senior Associate Tower: Data, Analytics & Specialist Managed Service Experience: 5 - 10 years Key Skills: Azure Educational Qualification: BE / B Tech / ME / M Tech / MBA Work Location: India Job Description As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket Quality and deliverables review, Status Reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working, with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player, take up cross competency work and contribute to COE activities. Escalation/Risk management. 
Position Requirements Required Skills: Azure Cloud Engineer Job description: Candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have a minimum of 5 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms. Should have a minimum of 5 years of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption like Business Intelligence systems, Analytics modelling, Data scientists etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry leading tools like Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, Python etc. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Utilizing Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Perform data transformation and processing tasks to prepare the data for analysis and reporting in Azure Databricks or Azure Synapse Analytics for large-scale data transformations using tools like Apache Spark. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Should have experience in building and maintaining Data Governance solutions (Data Quality, Metadata management, Lineage, Master Data Management and Data security) using industry leading tools. Scaling and optimizing schema and performance tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have experience of ITIL processes like Incident management, Problem Management, Knowledge management, Release management, Data DevOps etc. Should have strong communication, problem solving, quantitative and analytical abilities. Nice To Have: Azure certification. Managed Services - Data, Analytics & Insights Managed Service: At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our client's enterprise through technology and human-enabled experiences.
Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative first approach to operations, leveraging our deep industry insights combined with world class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world class business and technology capabilities that keep pace with today’s dynamic business environment. Within our global, Managed Services platform, we provide Data, Analytics & Insights where we focus more so on the evolution of our clients’ Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to your business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment capable of working on a mix of critical Data, Analytics & Insights offerings and engagement including help desk support, enhancement, and optimization work, as well as strategic roadmap and advisory level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
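The data validation and cleansing procedures this role calls for are often expressed as declarative per-record checks applied before load, with failing rows quarantined rather than silently dropped. A minimal plain-Python sketch (the rule names, columns, and batch are invented for illustration):

```python
def validate_record(rec, rules):
    """Return the names of all rules the record violates (empty list = valid)."""
    return [name for name, check in rules.items() if not check(rec)]

# Declarative quality rules (hypothetical examples)
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: isinstance(r.get("amount"), (int, float))
    and r["amount"] >= 0,
}

batch = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # fails id_present
    {"id": 3, "amount": -2.0},     # fails amount_non_negative
]

valid = [r for r in batch if not validate_record(r, rules)]
rejected = [(r, validate_record(r, rules)) for r in batch if validate_record(r, rules)]
print(len(valid), len(rejected))  # 1 2
```

Tools like Azure Data Factory data flows or Databricks expectations encode the same pattern; keeping the rejected rows and their violated rule names is what makes the pipeline debuggable.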

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. 
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Role: Senior Associate
Tower: Data, Analytics & Specialist Managed Service
Experience: 5 - 10 years
Key Skills: Azure
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: India

Job Description

As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs; experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player; take up cross-competency work and contribute to COE activities. Escalation/risk management.
Position Requirements

Required Skills: Azure Cloud Engineer

Job description: The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have a minimum of 5 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms. Should have a minimum of 5 years of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, Python, etc. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Utilizing Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Perform data transformation and processing tasks to prepare the data for analysis and reporting in Azure Databricks or Azure Synapse Analytics for large-scale data transformations using tools like Apache Spark. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
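The extract-transform-load pattern these responsibilities describe can be sketched in a few lines of plain Python. This is a minimal illustration only: the field names and CSV source are invented for the example, and a production pipeline would run in a service such as Azure Data Factory or Azure Databricks rather than in-process.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Read raw source rows (here: CSV text) into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Standardize fields for downstream BI consumption:
    trim whitespace, normalize case, and cast amounts to float."""
    return [
        {
            "customer_id": row["customer_id"].strip(),
            "country": row["country"].strip().upper(),
            "amount": float(row["amount"]),
        }
        for row in rows
    ]

def load(rows: list[dict], warehouse: dict) -> None:
    """Upsert rows into an in-memory 'warehouse' keyed by customer_id."""
    for row in rows:
        warehouse[row["customer_id"]] = row

raw = "customer_id,country,amount\n c001 ,in,125.50\nc002,us,80\n"
warehouse: dict = {}
load(transform(extract(raw)), warehouse)
print(warehouse["c001"])  # {'customer_id': 'c001', 'country': 'IN', 'amount': 125.5}
```

The same three-stage shape (extract, standardize, upsert) carries over directly to Spark DataFrames or an ADF pipeline; only the execution engine changes.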
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Should have experience in building and maintaining data governance solutions (data quality, metadata management, lineage, master data management and data security) using industry-leading tools. Scaling and optimizing schema and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have experience of ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice To Have

Azure certification

Managed Services - Data, Analytics & Insights Managed Service

At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients’ businesses better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients’ enterprises through technology and human-enabled experiences.
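The data validation and cleansing duties described above reduce to rule-based row checks. A minimal sketch, assuming invented rules (non-empty key, non-negative amount, no duplicate keys); real pipelines would enforce such rules with dedicated data quality tooling like the products named in the posting:

```python
def validate(rows):
    """Split rows into clean and rejected lists, recording why each
    row was rejected so failures can be triaged later."""
    clean, rejected = [], []
    seen_ids = set()
    for row in rows:
        errors = []
        if not row.get("customer_id"):
            errors.append("missing customer_id")
        if row.get("amount") is None or row["amount"] < 0:
            errors.append("invalid amount")
        if row.get("customer_id") in seen_ids:
            errors.append("duplicate customer_id")
        if errors:
            rejected.append({"row": row, "errors": errors})
        else:
            seen_ids.add(row["customer_id"])
            clean.append(row)
    return clean, rejected

rows = [
    {"customer_id": "c001", "amount": 10.0},
    {"customer_id": "", "amount": 5.0},        # missing key
    {"customer_id": "c001", "amount": 7.0},    # duplicate key
    {"customer_id": "c002", "amount": -3.0},   # negative amount
]
clean, rejected = validate(rows)
print(len(clean), len(rejected))  # 1 3
```

Keeping the rejected rows with their error reasons, rather than silently dropping them, is what makes the data quality process auditable.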
Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today’s dynamic business environment.

Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients’ Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment and are capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuous improvement and optimisation of the managed services processes, tools and services.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self-awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Role: Senior Associate
Tower: Data, Analytics & Specialist Managed Service
Experience: 5 - 10 years
Key Skills: AWS
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: India

Job Description

As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs; experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player; take up cross-competency work and contribute to COE activities. Escalation/risk management.
Position Requirements

Required Skills: AWS Cloud Engineer

Job description: The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have a minimum of 5 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms. Should have a minimum of 5 years of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry-leading tools like AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage.
Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Should have experience in building and maintaining data governance solutions (data quality, metadata management, lineage, master data management and data security) using industry-leading tools. Scaling and optimizing schema and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc. Should have experience of ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice To Have

AWS certification

Managed Services - Data, Analytics & Insights Managed Service

At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients’ businesses better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients’ enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes.
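The SQL performance-tuning skills listed above can be demonstrated on any relational engine. The sketch below uses Python's built-in sqlite3 module (the table and index names are invented for the example) to show how adding a covering index changes the query plan for a filtered aggregate; the exact plan wording varies by SQLite version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("APAC", 100.0), ("EMEA", 250.0), ("APAC", 75.0)],
)

query = "SELECT SUM(amount) FROM sales WHERE region = 'APAC'"

# Without an index, filtering on region forces a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before[0][-1])  # e.g. 'SCAN sales'

# A covering index on (region, amount) lets the query be answered
# from the index alone, without touching the base table.
conn.execute("CREATE INDEX idx_sales_region ON sales (region, amount)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after[0][-1])  # e.g. 'SEARCH sales USING COVERING INDEX idx_sales_region ...'

total = conn.execute(query).fetchone()[0]
print(total)  # 175.0
```

The same inspect-the-plan, add-an-index, re-inspect loop applies to warehouse engines like Snowflake (via clustering keys) or Redshift (via sort keys), even though the tuning knobs differ.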
With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today’s dynamic business environment.

Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients’ Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment and are capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.

Posted 4 days ago

Apply


3.0 - 5.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuous improvement and optimisation of the managed services processes, tools and services.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g.
refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

Role: Associate
Tower: Data, Analytics & Specialist Managed Service
Experience: 3 - 5.5 years
Key Skills: AWS
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: Bangalore

Job Description

As an Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs; experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player; take up cross-competency work and contribute to COE activities. Escalation/risk management.

Position Requirements

Required Skills: AWS Cloud Engineer

Job description: The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have a minimum of 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
Should have a minimum of 1-3 years of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry-leading tools like AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Should have experience in building and maintaining data governance solutions (data quality, metadata management, lineage, master data management and data security) using industry-leading tools. Scaling and optimizing schema and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
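Monitoring and troubleshooting pipeline steps, as described above, typically starts with retry logic for transient failures plus an audit trail for triage. A minimal sketch (the step function, retry counts, and log shape are invented for illustration; managed schedulers like AWS Step Functions provide the same behavior as configuration):

```python
import time

def run_step(step, *, retries=2, delay=0.0, log=None):
    """Run one pipeline step, retrying transient failures and
    appending a human-readable audit trail to `log`."""
    log = log if log is not None else []
    for attempt in range(1, retries + 2):
        try:
            result = step()
            log.append(f"{step.__name__}: ok on attempt {attempt}")
            return result
        except Exception as exc:
            log.append(f"{step.__name__}: attempt {attempt} failed ({exc})")
            if attempt > retries:
                raise  # retries exhausted: surface the failure
            time.sleep(delay)

# Simulated load step that fails twice before succeeding.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient storage error")
    return "loaded"

audit: list = []
result = run_step(flaky_load, retries=3, log=audit)
print(result)      # loaded
print(len(audit))  # 3 (two failures, one success)
```

The audit entries are what turn a silent retry loop into something an operate/support team can actually troubleshoot against an SLA.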
Should have experience of ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice To Have

AWS certification

Managed Services - Data, Analytics & Insights Managed Service

At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients’ businesses better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients’ enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today’s dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients’ Data and Analytics ecosystem.
Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and are capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.

Posted 4 days ago

Apply

13.0 years

0 Lacs

Andhra Pradesh, India

On-site

Summary about Organization: A career in our Advisory Acceleration Center is the natural extension of PwC's leading global delivery capabilities. The team consists of highly skilled resources that assist clients in transforming their business by adopting technology using bespoke strategy, operating models, processes, and planning. You'll be at the forefront of helping organizations around the globe adopt innovative technology solutions that optimize business processes or enable scalable technology. Our team helps organizations transform their IT infrastructure and modernize applications and data management to help shape the future of business. An essential and strategic part of Advisory's multi-sourced, multi-geography Global Delivery Model, the Acceleration Centers are a dynamic, rapidly growing component of our business. The teams out of these Centers have achieved remarkable results in process quality and delivery capability, resulting in a loyal customer base and a reputation for excellence. Job Description: Senior Data Architect with experience in the design, build, and optimization of complex data landscapes and legacy modernization projects. The ideal candidate will have deep expertise in database management, data modeling, cloud data solutions, and ETL (Extract, Transform, Load) processes. This role requires a strong leader capable of guiding data teams and driving the design and implementation of scalable data architectures. Key areas of expertise include: Design and implement scalable and efficient data architectures to support business needs. Develop data models (conceptual, logical, and physical) that align with organizational goals. Lead the database design and optimization efforts for structured and unstructured data. Establish ETL pipelines and data integration strategies for seamless data flow. Define data governance policies, including data quality, security, privacy, and compliance.
Work closely with engineering, analytics, and business teams to understand requirements and deliver data solutions. Oversee cloud-based data solutions (AWS, Azure, GCP) and modern data warehouses (Snowflake, BigQuery, Redshift). Ensure high availability, disaster recovery, and backup strategies for critical databases. Evaluate and implement emerging data technologies, tools, and frameworks to improve efficiency. Conduct data audits, performance tuning, and troubleshooting to maintain optimal performance. Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. 13+ years of experience in data modeling, including conceptual, logical, and physical data design. 5-8 years of experience with cloud data lake platforms such as AWS Lake Formation, Delta Lake, Snowflake, or Google BigQuery. Proven experience with NoSQL databases and data modeling techniques for non-relational data. Experience with data warehousing concepts, ETL/ELT processes, and big data frameworks (e.g., Hadoop, Spark). Hands-on experience delivering complex, multi-module projects in diverse technology ecosystems. Strong understanding of data governance, data security, and compliance best practices. Proficiency with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner). Excellent leadership and communication skills, with a proven ability to manage teams and collaborate with stakeholders. Preferred Skills: Experience with modern data architectures, such as data fabric or data mesh. Knowledge of graph databases and modeling for technologies like Neo4j. Proficiency with programming languages like Python, Scala, or Java. Understanding of CI/CD pipelines and DevOps practices in data engineering.
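The conceptual/logical/physical modeling work this role describes often lands, physically, as a star schema in the warehouse. A minimal illustrative sketch using sqlite3 (all table and column names here are hypothetical, not from any specific engagement):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A dimension table holds descriptive attributes; the fact table holds
# measures plus foreign keys into each dimension (classic star schema).
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT NOT NULL);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity    INTEGER NOT NULL,
    revenue     REAL    NOT NULL
);
""")

conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# A typical analytic query joins the fact table to its dimensions.
row = conn.execute("""
    SELECT p.name, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name
""").fetchone()
print(row)  # → ('Widget', 29.97)
```

The same layout translates directly to Snowflake, BigQuery, or Redshift DDL; only the type names and distribution/clustering options change.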

Posted 4 days ago

Apply

7.0 years

0 Lacs

Delhi, India

On-site

Role Expectations Data Collection and Cleaning: Collect, organize, and clean large datasets from various sources (internal databases, external APIs, spreadsheets, etc.). Ensure data accuracy, completeness, and consistency by cleaning and transforming raw data into usable formats. Data Analysis: Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies. Conduct statistical analysis to support decision-making and uncover insights. Use analytical methods to identify opportunities for process improvements, cost reductions, and efficiency enhancements. Reporting and Visualization: Create and maintain clear, actionable, and accurate reports and dashboards for both technical and non-technical stakeholders. Design data visualizations (charts, graphs, and tables) that communicate findings effectively to decision-makers. Experience with Power BI, Tableau, and Python data visualization libraries such as matplotlib (pyplot), seaborn, plotly, and pandas. Experience in generating descriptive, predictive, and prescriptive insights with Gen AI using MS Copilot in Power BI. Experience in prompt engineering and RAG architectures. Prepare reports for upper management and other departments, presenting key findings. Collaboration: Work closely with cross-functional teams (marketing, finance, operations, etc.) to understand their data needs and provide actionable insights. Collaborate with IT and database administrators to ensure data is accessible and well-structured. Provide support and guidance to other teams regarding data-related questions or issues. Data Integrity and Security: Ensure compliance with data privacy and security policies and practices. Maintain data integrity and assist with implementing best practices for data storage and access. Continuous Improvement: Stay current with emerging data analysis techniques, tools, and industry trends. Recommend improvements to data collection, processing, and analysis procedures to enhance operational efficiency.
Qualifications Education: Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field. A Master's degree or relevant certifications (e.g., in data analysis or business intelligence) is a plus. Experience: Proven experience as a Data Analyst or in a similar analytical role (typically 7+ years). Experience with data visualization tools (e.g., Tableau, Power BI, Looker). Strong knowledge of SQL and experience with relational databases. Familiarity with data manipulation and analysis tools (e.g., Python, R, Excel, SPSS). Experience with Power BI, Tableau, and Python data visualization libraries such as matplotlib (pyplot), seaborn, plotly, and pandas. Experience with big data technologies (e.g., Hadoop, Spark) is a plus. Technical Skills: Proficiency in SQL and data query languages. Knowledge of statistical analysis and methodologies. Experience with data visualization and reporting tools. Knowledge of data cleaning and transformation techniques. Familiarity with machine learning and AI concepts is an advantage (for more advanced roles). Soft Skills: Strong analytical and problem-solving abilities. Excellent attention to detail and ability to identify trends in complex data sets. Good communication skills to present data insights clearly to both technical and non-technical audiences. Ability to work independently and as part of a team. Strong time management and organizational skills, with the ability to prioritize tasks effectively. (ref:hirist.tech)
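The exploratory analysis and anomaly detection described above can be prototyped with the standard library alone. A sketch using a simple z-score rule (the threshold and the sample data are illustrative assumptions, not a prescribed methodology):

```python
import statistics

def find_outliers(values, z_threshold=2.0):
    """Flag values more than z_threshold standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Hypothetical daily order counts with one obvious spike.
daily_orders = [102, 98, 105, 99, 101, 97, 300]
outliers = find_outliers(daily_orders)
print(outliers)  # → [300]
```

In practice the same idea is usually applied column-wise with pandas (`df[abs((df.x - df.x.mean()) / df.x.std()) > 2]`), and the flagged rows feed the reporting and cleansing steps listed above.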

Posted 4 days ago

Apply

4.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Big Data Lead with 7-12 years of experience, you will be responsible for leading the development of data processing systems and applications, specifically in the area of Data Warehousing (DWH). Your role will involve utilizing your strong software development skills in multiple computing languages, with a focus on distributed data processing systems and BIDW programs. You should have a minimum of 4 years of software development experience and a proven track record in developing and testing applications, preferably on the J2EE stack. A sound understanding of best practices and concepts related to Data Warehouse applications is crucial for this role. Additionally, you should possess a strong foundation in distributed systems and computing systems, with hands-on experience in Spark and Scala, Kafka, Hadoop, HBase, Pig, and Hive. Experience with NoSQL data stores, data modeling, and data management will be beneficial for this role. Strong interpersonal communication skills are essential, along with excellent oral and written communication abilities. Knowledge of Data Lake implementation as an alternative to Data Warehousing is desirable. Hands-on experience with Spark SQL and SQL proficiency are mandatory requirements for this role. You should have a minimum of 2 end-to-end implementations in either Data Warehousing or Data Lake projects. Your role as a Big Data Lead will involve collaborating with cross-functional teams and driving data-related initiatives to meet business objectives effectively.
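Since Spark SQL largely follows ANSI SQL, the warehouse-style aggregations this role requires can be prototyped locally before running at scale. A sketch using sqlite3 (the table, regions, and amounts are made up; the same SELECT could, under that assumption, be passed to `spark.sql(...)` against a registered DataFrame view):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events(region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("APAC", 10.0), ("APAC", 15.0), ("EMEA", 7.5), ("EMEA", 2.5), ("AMER", 50.0)],
)

# A typical DWH rollup: total amount per region, largest first.
totals = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM events
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
print(totals)  # → [('AMER', 50.0), ('APAC', 25.0), ('EMEA', 10.0)]
```

Prototyping the SQL against a small local sample is a cheap way to validate logic before the same query is pointed at a Hive table or Data Lake partition.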

Posted 4 days ago

Apply

2.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Tiger Analytics is a global AI and analytics consulting firm that is at the forefront of solving complex problems using data and technology. With a team of over 2800 experts spread across the globe, we are dedicated to making a positive impact on the lives of millions worldwide. Our culture is built on expertise, respect, and collaboration, with a focus on teamwork. While our headquarters are in Silicon Valley, we have delivery centers and offices in various cities in India, the US, UK, Canada, and Singapore, as well as a significant remote workforce. As an Azure Big Data Engineer at Tiger Analytics, you will be part of a dynamic team that is driving an AI revolution. Your typical day will involve working on a variety of analytics solutions and platforms, including data lakes, modern data platforms, and data fabric solutions using Open Source, Big Data, and Cloud technologies on Microsoft Azure. Your responsibilities may include designing and building scalable data ingestion pipelines, executing high-performance data processing, orchestrating pipelines, designing exception handling mechanisms, and collaborating with cross-functional teams to bring analytical solutions to life. To excel in this role, we expect you to have 4 to 9 years of total IT experience with at least 2 years in big data engineering and Microsoft Azure. You should be well-versed in technologies such as Azure Data Factory, PySpark, Databricks, Azure SQL Database, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Your passion for writing high-quality, scalable code and your ability to collaborate effectively with stakeholders are essential for success in this role. Experience with big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, and Neo4j, as well as knowledge of different file formats and REST API design, will be advantageous.
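The "exception handling mechanisms" mentioned for pipeline design often amount to retrying transient failures with backoff before surfacing the error to the orchestrator. A minimal illustrative sketch (the function names, delays, and failure simulation are assumptions, not any specific orchestrator's API):

```python
import time

def run_with_retry(step, max_attempts=3, base_delay=0.01):
    """Run a pipeline step, retrying on failure with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: let the orchestrator mark the run failed
            time.sleep(base_delay * 2 ** (attempt - 1))

calls = {"n": 0}

def flaky_ingest():
    """Simulate an ingestion step that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return "ingested"

result = run_with_retry(flaky_ingest)
print(result, calls["n"])  # → ingested 3
```

Orchestrators like Azure Data Factory and Airflow expose the same idea declaratively (retry count and interval per activity/task); the wrapper above is the hand-rolled equivalent for custom steps.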
At Tiger Analytics, we value diversity and inclusivity, and we encourage individuals with varying skills and backgrounds to apply. We are committed to providing equal opportunities for all our employees and fostering a culture of trust, respect, and growth. Your compensation package will be competitive and aligned with your expertise and experience. If you are looking to be part of a forward-thinking team that is pushing the boundaries of what is possible in AI and analytics, we invite you to join us at Tiger Analytics and be a part of our exciting journey towards building innovative solutions that inspire and energize.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Key responsibilities: Working with clients to understand their data. Based on that understanding, you will be building the data structures and pipelines. You will be working on the application end to end, collaborating with UI and other development teams. You will be responsible for building the data pipelines to migrate and load the data into HDFS, either on-prem or in the cloud. Developing data ingestion/processing/integration pipelines effectively. Creating Hive data structures and metadata, and loading the data into data lakes / Big Data warehouse environments. Optimizing (performance tuning) data pipelines effectively to minimize cost. Keeping code version control and the git repository up to date. You will be responsible for building and maintaining CI/CD for the data pipelines. You will be managing the unit testing of all data pipelines. Requirements Skills & Experience: Bachelor's degree in computer science or a related field. Minimum of 5+ years of working experience with Spark and Hadoop ecosystems. Minimum of 4+ years of working experience designing data streaming pipelines. Should be an expert in Python, Scala, or Java. Should have experience in data ingestion and integration into a data lake using Hadoop ecosystem tools such as Sqoop, Spark, SQL, Hive, Airflow, etc. Should have experience optimizing (performance tuning) data pipelines. Minimum experience of 3+ years with NoSQL and Spark Streaming. Knowledge of Kubernetes and Docker is a plus. Should have experience with cloud services, either Azure or AWS. Should have experience with an on-prem distribution such as Cloudera, Hortonworks, or MapR. Basic understanding of CI/CD pipelines. Basic knowledge of the Linux environment and commands.
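Loading data into a data lake with Hive data structures typically means writing files into a partitioned directory layout that Hive (or Spark) can register as an external table. A stdlib-only sketch of the Hive-style `dt=<date>` partition convention (the paths, table name, and schema are illustrative):

```python
import csv
import tempfile
from pathlib import Path

def write_partitioned(root, table, rows):
    """Write rows into a Hive-style layout: <root>/<table>/dt=<date>/part-0.csv."""
    by_date = {}
    for row in rows:
        by_date.setdefault(row["dt"], []).append(row)
    for dt, part in by_date.items():
        part_dir = Path(root) / table / f"dt={dt}"
        part_dir.mkdir(parents=True, exist_ok=True)
        with open(part_dir / "part-0.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["dt", "user", "amount"])
            writer.writeheader()
            writer.writerows(part)

rows = [
    {"dt": "2024-01-01", "user": "a", "amount": "10"},
    {"dt": "2024-01-01", "user": "b", "amount": "20"},
    {"dt": "2024-01-02", "user": "c", "amount": "30"},
]
root = tempfile.mkdtemp()
write_partitioned(root, "events", rows)
partitions = sorted(p.name for p in (Path(root) / "events").iterdir())
print(partitions)  # → ['dt=2024-01-01', 'dt=2024-01-02']
```

Because queries filtered on `dt` only need to scan the matching directories, this layout is also the main lever for the cost-minimizing performance tuning the listing mentions.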

Posted 4 days ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

This job is with Amazon, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly. Description AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You'll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Amazon has built a global reputation for being the most customer-centric company, a company that customers from all over the world recognize, value, and trust for both our products and services. Amazon has a fast-paced environment where we "Work Hard, Have Fun and Make History." As an increasing number of enterprises move their critical systems to the cloud, AWS India is in need of highly efficient technical consulting talent to help our largest and strategically important customers navigate the operational challenges and complexities of AWS Cloud. We are looking for Technical Consultants to support our customers' creative and transformative spirit of innovation across all technologies, including Compute, Storage, Database, Data Analytics, Application services, Networking, Server-less and more. This is not a sales role, but rather an opportunity to be the principal technical advisor for organizations ranging from start-ups to large enterprises. As a Technical Account Manager, you will be the primary technical point of contact for one or more customers, helping to plan, debug, and oversee ongoing operations of business-critical applications.
You will get your hands dirty, troubleshooting application, network, database, and architectural challenges using a suite of internal AWS Cloud tools as well as your existing knowledge and toolkits. We are seeking individuals with strong backgrounds in I.T. Consulting and in any of these related areas such as Solution Designing, Application and System Development, Database Management, Big Data and Analytics, DevOps Consulting, and Media technologies. Knowledge of programming and scripting is beneficial to the role. Key job responsibilities Every day will bring new and exciting challenges on the job while you: Learn and use new Cloud technologies. Interact with leading technologists around the world. Work on critical, highly complex customer problems that may span multiple AWS Cloud services. Apply advanced troubleshooting techniques to provide unique solutions to our customers' individual needs. Work directly with AWS Cloud subject matter experts to help reproduce and resolve customer issues. Write tutorials, how-to videos, and other technical articles for the customer community. Leverage your extensive customer support experience and provide feedback to internal AISPL teams on how to improve our services. Drive projects that improve support-related processes and our customers' technical support experience. Assist in Design/Architecture of AWS and Hybrid cloud solutions. Help Enterprises define IT and business processes that work well with cloud deployments. Be available outside of business hours to help coordinate the handling of urgent issues as needed. A day in the life As a TAM, you'll start your day reviewing operational metrics and service health for your strategic enterprise customers. You might lead a morning technical deep-dive session with a customer's engineering team, helping them optimize their cloud architecture. 
By midday, you could be collaborating with AWS service teams to resolve a complex migration challenge or providing proactive recommendations for cost optimization. Afternoons often involve strategic planning sessions, where you'll help customers align their technical roadmap with business objectives. You'll also participate in architecture reviews, incident post-mortems, and best-practice workshops. Throughout the day, you'll leverage your technical expertise to provide timely solutions, whether it's improving security posture, enhancing operational excellence, or architecting for scale. While most work happens during business hours, you'll occasionally support critical situations outside regular hours, ensuring your customers' mission-critical workloads run smoothly. About The Team: Diverse Experiences: AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating - that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture: AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship & Career Growth: We're continuously raising our performance bar as we strive to become Earth's Best Employer.
That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance: We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve. Basic Qualifications: Bachelor's Degree with 10+ years of hands-on Infrastructure / Troubleshooting / Systems Administration / Networking / DevOps / Applications Development experience in a distributed systems environment. External enterprise customer-facing experience as a technical lead, with strong oral and written communication skills, presenting to both large and small audiences. Ability to manage multiple tasks and projects in a fast-moving environment. Be mobile and travel to client locations as needed. Preferred Qualifications: Advanced experience in one or more of the following areas: Software Design or Development, Content Distribution/CDN, Scripting/Automation, Database Architecture, Cloud Architecture, Cloud Migrations, IP Networking, IT Security, Big Data/Hadoop/Spark, Operations Management, Service Oriented Architecture, etc. Experience in a 24x7 operational services or support environment. Experience with AWS Cloud services and/or other Cloud offerings. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 4 days ago

Apply