5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Ab Initio Data Engineer
We are looking for an Ab Initio Data Engineer to design and build Ab Initio-based applications across the Data Integration, Governance & Quality domains for Compliance Risk programs. The individual will work with Technical Leads, Senior Solution Engineers and prospective Application Managers to build applications, roll out and support production environments leveraging the Ab Initio tech stack, and ensure the overall success of their programs. The programs are high-visibility, fast-paced key initiatives that generally aim to acquire and curate data and metadata across internal and external sources, provide analytical insights, and integrate with other Citi systems.

Technical Stack:
- Ab Initio 4.0.x software suite: Co>Op, GDE, EME, BRE, Conduct>It, Express>It, Metadata>Hub, Query>It, Control>Center, Easy>Graph
- Big Data: Cloudera Hadoop, Hive, YARN
- Databases: Oracle 11g/12c, Teradata, MongoDB, Snowflake
- Others: JIRA, ServiceNow, Linux, SQL Developer, AutoSys, and Microsoft Office

Responsibilities:
- Design and build Ab Initio graphs (both continuous and batch) and Conduct>It plans, and integrate with the portfolio of Ab Initio software.
- Build web-service and RESTful graphs and create RAML or Swagger documentation.
- Complete understanding and analytical command of the Metadata Hub metamodel.
- Strong hands-on multifile-system-level programming, debugging and optimization skills.
- Hands-on experience developing complex ETL applications.
- Good knowledge of RDBMS (Oracle), with the ability to write the complex SQL needed to investigate and analyze data issues.
- Strong UNIX shell/Perl scripting.
- Build graphs interfacing with heterogeneous data sources: Oracle, Snowflake, Hadoop, Hive, AWS S3.
- Build application configurations for Express>It frameworks: Acquire>It, Spec-To-Graph, Data Quality Assessment.
- Build automation pipelines for continuous integration and delivery (CI/CD), leveraging the Testing Framework and JUnit modules, integrating with Jenkins, JIRA and/or ServiceNow.
- Build Query>It data sources for cataloguing data from different sources.
- Parse XML, JSON and YAML documents, including hierarchical models.
- Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment, demonstrating experience with various Ab Initio components.
- Build AutoSys or Control>Center jobs and schedules for process orchestration.
- Build BRE rulesets for reformat, rollup and validation use cases.
- Build SQL scripts on the database, perform performance tuning and relational model analysis, and carry out data migrations.
- Identify performance bottlenecks in graphs and optimize them.
- Ensure the Ab Initio code base is appropriately engineered to maintain current functionality, and that development adheres to performance optimization, interoperability standards and requirements, and client IT governance policies.
- Build regression and functional test cases, and write user manuals for various projects.
- Conduct bug fixing, code reviews, and unit, functional and integration testing.
- Participate in the agile development process, and document and communicate issues and bugs relative to data standards.
- Pair with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids.
- Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment.
- Perform other duties and/or special projects as assigned.

Qualifications:
- Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics) and a minimum of 5 years of experience.
- Minimum 5 years of extensive experience in the design, build and deployment of Ab Initio-based applications.
- Expertise in handling complex, large-scale data lake and warehouse environments.
- Hands-on experience writing complex SQL queries and exporting/importing large amounts of data using utilities.

Education: Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Syniverse is the world's most connected company. Whether we're developing the technology that enables intelligent cars to safely react to traffic changes or freeing travelers to explore by keeping their devices online wherever they go, we believe in leading the world forward. Which is why we work with some of the world's most recognized brands. Eight of the top 10 banks. Four of the top 5 global technology companies. Over 900 communications providers. And how we're able to provide our incredible talent with an innovative culture and great benefits.

Who We're Looking For
The Sr Data Engineer is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems or building new solutions from the ground up. This role will work with developers, architects, product managers and data analysts on data initiatives and ensure optimal data delivery with good performance and uptime metrics. Your behaviors align strongly with our values.

Some Of What You'll Do
Scope of the Role:
- Direct Reports: This is an individual contributor role with no direct reports.

Key Responsibilities:
- Create, enhance, and maintain optimal data pipeline architecture and implementations.
- Analyze data sets to meet functional/non-functional business requirements.
- Identify, design, and implement data process improvements: automating processes, optimizing data delivery, etc.
- Build infrastructure and tools to increase data ETL velocity.
- Work with data and analytics experts to implement and enhance analytic product features.
- Provide life-cycle support to the Operations team for existing products, services, and functionality assigned to the Data Engineering team.

Experience, Education, And Certifications:
- Bachelor's degree in Computer Science, Statistics, Informatics or a related field, or equivalent work experience.
- 5+ years of software development experience, including 3+ years of experience in data engineering.
- Experience in building and optimizing big data pipelines, architectures, and data sets:
  - Experience with big data tools: Hadoop, Spark, Kafka, etc.
  - Experience with relational SQL databases, such as PostgreSQL, MySQL, etc.
  - Experience with stream-processing systems: Flink, KSQL, Spark Streaming, etc. (a minimal streaming sketch follows this posting).
  - Experience with programming languages such as Java, Scala, Python, etc.
  - Experience with cloud data engineering and development, such as AWS.

Additional Requirements:
- Familiar with Agile software design processes and methodologies.
- Good analytic skills related to working with structured and unstructured datasets.
- Knowledge of message queuing, stream processing and scalable big data stores.
- Ownership/accountability for tasks/projects, with on-time, quality deliveries.
- Good verbal and written communication skills.
- Teamwork with independent design and development habits.
- Work with a sense of urgency and a positive attitude.

Why You Should Join Us
Join us as we write a new chapter, guided by world-class leadership. Come be a part of an exciting and growing organization where we offer competitive total compensation, flexible/remote work, and a leadership team committed to fostering an inclusive, collaborative, and transparent organizational culture. At Syniverse connectedness is at the core of our business. We believe diversity, equity, and inclusion among our employees is crucial to our success as a global company as we seek to recruit, develop, and retain the most talented people who want to help us connect the world.

Know someone at Syniverse? Be sure to have them submit you as a referral prior to applying for this position.
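As a rough illustration of the stream-processing experience called out above, here is a minimal PySpark Structured Streaming sketch that consumes a Kafka topic and lands parsed events as Parquet; the broker, topic, schema and paths are hypothetical stand-ins, not anything specified by the posting.

```python
# Minimal sketch: Kafka -> parsed events -> Parquet with Spark Structured
# Streaming. Requires the spark-sql-kafka connector on the classpath; the
# broker, topic, schema and paths below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-events").getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("status", StringType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
       .option("subscribe", "device-events")              # hypothetical topic
       .load())

# Kafka values arrive as bytes; cast to string and parse the JSON payload.
events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

query = (events.writeStream.format("parquet")
         .option("path", "/data/device_events")             # hypothetical sink
         .option("checkpointLocation", "/chk/device_events")
         .start())
query.awaitTermination()
```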
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

What Will You Do?
- Build and own ETL data pipelines that will power all our reporting and analytics needs.
- Develop clean, safe, testable and cost-efficient solutions.
- Build fast and reliable pipelines with an underlying data model that can scale according to business needs and growth.
- Understand the system you are building, foresee possible shortcomings, and be able to resolve or compromise appropriately.
- Mentor junior engineers on data quality, pipelines, etc.

Company Overview
Fanatics is building the leading global digital sports platform to ignite and harness the passions of fans and maximize the presence and reach for hundreds of partners globally. Leveraging these long-standing partnerships, a database of more than 80 million global consumers and a trusted, beyond-recognizable brand name, Fanatics is expanding its position as the global leader for licensed sports merchandise to now becoming a next-gen digital sports platform, featuring an array of offerings across the sports ecosystem. The Fanatics family of companies currently includes Fanatics Commerce, a vertically integrated licensed merchandise business that has changed the way fans purchase their favorite team apparel, jerseys, headwear and hardgoods through a tech-infused approach to making and quickly distributing fan gear in today's 24/7 mobile-first economy; Fanatics Collectibles, a transformative company that is building a new model for the hobby and giving collectors an end-to-end collectibles experience; and Fanatics Betting & Gaming, a mobile betting, gaming and retail sportsbook platform. Fanatics' major partners include professional sports leagues (NFL, MLB, NBA, NHL, NASCAR, MLS, PGA) and hundreds of collegiate and professional teams, including several of the biggest global soccer clubs. As a market leader with more than 8,000 employees, and hundreds of partners, suppliers, and vendors worldwide, we take responsibility for driving toward more ethical and sustainable practices. We are committed to building an inclusive Fanatics community, reflecting and representing society at every level of the business, including our employees, vendors, partners and fans. Fanatics is also dedicated to making a positive impact in the communities where we all live, work, and play through strategic philanthropic initiatives. At Fanatics, we're a diverse, passionate group of employees aiming to ignite pride and passion in the fans we outfit, celebrate and support. We recognize that diversity helps drive and foster innovation, and through our IDEA program (inclusion, diversity, equality and advocacy) at Fanatics we provide employees with tools and resources to feel connected and engaged in who they are and what they do to support the ultimate fan experience.

Job Requirements
- Must have 5+ years of experience in the data engineering field, with a proven track record of exposure to Big Data technologies such as Hadoop, Amazon EMR, Hive, Spark.
- Expertise in SQL technologies and at least one major data warehouse technology (Snowflake, Redshift, BigQuery, etc.).
- Must have experience in building a data platform: designing and building data models, integrating data from many sources, building ETL and data-flow pipelines, and supporting all parts of the data platform.
- Programming proficiency in Python and Scala, with experience writing modular, reusable, and testable code, including robust error handling and logging in data engineering applications (an illustrative sketch follows this posting).
- Hands-on experience with AWS cloud services, particularly in areas such as S3, Lambda, Glue, EC2, RDS, and IAM.
- Experience with orchestration tools such as Apache Airflow for scheduling, monitoring, and managing data pipelines in a production environment.
- Familiarity with CI/CD practices, automated deployment pipelines, and version control systems (e.g., Git, GitHub/GitLab), ensuring reliable and repeatable data engineering workflows.
- Data analysis skills: can make arguments with data and proper visualization.
- Energetic, enthusiastic, detail-oriented, and passionate about producing high-quality analytics deliverables.
- Must have experience in developing applications with high performance and low latency.
- Ability to take ownership of initiatives and drive them independently from conception to delivery, including post-deployment monitoring and support.
- Strong communication and interpersonal skills, with the ability to build relationships with stakeholders, understand business requirements, and translate them into technical solutions.
- Comfortable working cross-functionally in a multi-team environment, collaborating with data analysts, product managers, and engineering teams to deliver end-to-end data solutions.

Job Summary
We are seeking a Sr. Data Engineer who has strong design and development skills and who upholds scalability, availability and excellence when building the next generation of our data pipelines and platform. You are an expert in various data processing technologies and data stores, appreciate the value of clear communication and collaboration, and are devoted to continual capacity planning and performance fine-tuning for emerging business growth. As the Senior Data Engineer, you will mentor junior engineers in the team.

Good To Have
- Experience in web services, API integration, and data exchanges with third parties is preferred.
- Experience in Snowflake is a big plus.
- Experience in NoSQL technologies (MongoDB, FoundationDB, Redis) is a plus.
- We would appreciate candidates who can demonstrate business-side functional understanding and effectively communicate the business context alongside their technical expertise.
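To give a flavour of the "modular, reusable, and testable code, including robust error handling and logging" requirement, here is a small PySpark sketch under assumed table names and paths; none of the specifics come from the posting itself.

```python
# Sketch of a modular PySpark ETL step: the transform is a pure function that
# can be unit-tested on its own, and the runner adds logging and error
# handling. Source/target paths and column names are hypothetical.
import logging

from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

log = logging.getLogger("orders_etl")
logging.basicConfig(level=logging.INFO)


def transform_orders(orders: DataFrame) -> DataFrame:
    """Keep completed orders and derive a revenue column."""
    return (orders.filter(F.col("status") == "COMPLETED")
                  .withColumn("revenue", F.col("quantity") * F.col("unit_price")))


def run(spark: SparkSession, source: str, target: str) -> None:
    try:
        orders = spark.read.parquet(source)
        result = transform_orders(orders)
        result.write.mode("overwrite").parquet(target)
        log.info("ETL step succeeded: %s -> %s", source, target)
    except Exception:
        log.exception("ETL step failed for source=%s", source)
        raise  # let the orchestrator retry/alert


if __name__ == "__main__":
    run(SparkSession.builder.appName("orders-etl").getOrCreate(),
        "s3://bucket/raw/orders", "s3://bucket/curated/orders")
```

Because the transform takes and returns a DataFrame, it can be exercised in a unit test with a tiny in-memory frame, independent of the I/O paths.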
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

Profile Description
We're seeking someone to join our team as an (Associate) BI Developer who has credibility, confidence and active participation with leadership teams, business units and technology groups across the project lifecycle. Specific assignments will depend on the size and complexity of the project.

WM_Technology
Wealth Management Technology is responsible for the design, development, delivery, and support of the technical solutions behind the products and services used by the Morgan Stanley Wealth Management business. Practice areas include: Analytics, Intelligence, & Data Technology (AIDT), Client Platforms, Core Technology Services (CTS), Financial Advisor Platforms, Global Banking Technology (GBT), Investment Solutions Technology (IST), Institutional Wealth and Corporate Solutions Technology (IWCST), Technology Delivery Management (TDM), User Experience (UX), and the CAO team.

Analytics Intelligence & Data Technology
Analytics, Intelligence and Data Technology (AIDT) enables and drives strategic data initiatives and business capabilities across Wealth Management.

Software Engineering
This is an Associate-level position that develops and maintains software solutions that support business needs.

Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals. At Morgan Stanley India, we support the Firm's global businesses, with a critical presence across Institutional Securities, Wealth Management, and Investment Management, as well as in the Firm's infrastructure functions of Technology, Operations, Finance, Risk Management, Legal and Corporate & Enterprise Services. Morgan Stanley has been rooted in India since 1993, with campuses in both Mumbai and Bengaluru. We empower our multi-faceted and talented teams to advance their careers and make a global impact on the business. For those who show passion and grit in their work, there's ample opportunity to move across the businesses. Interested in joining a team that's eager to create, innovate and make an impact on the world? Read on…

What You'll Do In The Role
- Develop and build enterprise-level applications using BI technologies.
- Design and develop scalable data models to create intuitive reporting and dashboards.
- Design and deliver cross-functional custom reporting solutions.
- Analyze business user stories and translate them into meaningful tasks.
- Interface with a global team of developers and business users.
- Work as part of an Agile Squad/Fleet.
- Perform proofs of concept in new areas of development.
- Support continuous improvement of automated systems.
- Participate in all aspects of the SDLC (analysis, design, coding, testing and implementation).

What You'll Bring To The Role
At least 2 years' relevant experience would generally be expected to find the skills required for this role.
- Business Objects report development and Universe design.
- Tableau dashboard development.
- Power BI development and Azure.
- Relational databases: DB2, Sybase, and Teradata.
- DDL and DML writing skills are essential, as is the ability to write complex SQL for data analysis.
- Experience in data modeling and transformation of large-scale data sources using SQL, Hadoop, Spark, Hive, Snowflake, Teradata or other Big Data technologies.
- Knowledge of version control systems, such as Git/Bitbucket.
- Data warehousing concepts (facts, dimensions, star and snowflake design, etc.).
- Experience with a scheduling tool (e.g., Tivoli, AutoSys).
- Database performance and tuning.
- Unix scripting.
- Excellent communication, organizational, and planning skills.
- Understanding of the requirements of large enterprise applications (security, entitlements, etc.).
- Knowledge of the SDLC (Software Development Life Cycle).
- Proficiency writing unit tests within a Test-Driven Development (TDD) methodology.

What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren't just beliefs, they guide the decisions we make every day to do what's best for our clients, communities and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you'll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There's also ample opportunity to move about the business for those who show passion and grit in their work. To learn more about our offices across the globe, please copy and paste https://www.morganstanley.com/about-us/global-offices into your browser.

Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross-section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.
Posted 1 week ago
3.0 - 5.0 years
4 - 8 Lacs
Pune
Work from Office
Role Purpose
The purpose of the role is to provide effective technical support to the process and actively resolve client issues directly or through timely escalation to meet process SLAs.

Do
- Support the process by managing transactions as per required quality standards.
- Field all incoming help requests from clients via telephone and/or email in a courteous manner.
- Document all pertinent end-user identification information, including name, department, contact information and nature of the problem or issue.
- Update own availability in the RAVE system to ensure productivity of the process.
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions.
- Follow standard processes and procedures to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Access and maintain internal knowledge bases, resources and frequently asked questions to aid in and provide effective problem resolution to clients.
- Identify and learn appropriate product details to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot recurring trends and prevent future problems.
- Maintain and update self-help documents for customers to speed up resolution time.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by complying with service agreements.
- Deliver excellent customer service through effective diagnosis and troubleshooting of client queries.
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions.
- Assist clients with navigating product menus and facilitate better understanding of product features.
- Troubleshoot all client queries in a user-friendly, courteous and professional manner.
- Maintain logs and records of all customer queries as per the standard procedures and guidelines.
- Accurately process and record all incoming calls and emails using the designated tracking software.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Undertake product trainings to stay current with product features, changes and updates.
- Enroll in product-specific and any other trainings per client requirements/recommendations.
- Partner with team leaders to brainstorm and identify training themes and learning issues to better serve the client.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Pentaho DI - Kettle.
Experience: 3-5 Years.
Posted 1 week ago
8.0 - 12.0 years
55 - 60 Lacs
Mumbai, Navi Mumbai
Work from Office
Must-Have Skills: Java, Web Development, AWS (Spring Boot, Hibernate, Dropwizard), Big Data (Hadoop, Kafka, Apache)

Role & Responsibilities / Requirements:
1. Proven work experience as a Java Developer
2. In-depth understanding of the entire web development process (design, development and deployment)
3. Hands-on experience with programming languages like Java (Spring Boot, Hibernate, Dropwizard, etc.)
4. Familiarity with cloud services such as AWS
5. Familiarity with Big Data technology (Hadoop, Kafka, Apache, etc.)
6. Familiarity with front-end languages (e.g. HTML, JavaScript and CSS)
7. Excellent analytical and time management skills
8. Teamwork skills with a problem-solving attitude

Please let us know if you would like to know more about the compensation and growth opportunities associated with this position, and share your resume at divya.bnh@gmail.com
Posted 1 week ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
Role: Senior Associate
Experience: 5-8 years
Location: Mumbai
1. Looking for candidates with 5 to 8 years of experience.
2. Hands-on experience implementing data pipelines using traditional DWH, Big Data and cloud ecosystems.
3. Hands-on experience with SQL.
4. Hands-on experience in Hadoop, Spark/Databricks, and cloud environments (Azure/AWS).
5. Good understanding of handling real-time/streaming pipelines.
6. Experience interacting with clients.
7. Should be able to gel with and work well within the team.
8. Should have an understanding of the finance domain.
9. Exposure to managing and leading teams.

Mandatory skill sets: AWS/Azure/Snowflake/Hadoop
Preferred skill sets: AWS/Azure/Snowflake/Hadoop
Years of experience required: 5-8 years
Education qualification: B.E. (B.Tech)/M.E./M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Apache Hadoop, AWS DevOps, Microsoft Azure, Snowflake (Platform)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 week ago
12.0 - 17.0 years
27 - 35 Lacs
Madurai, Chennai
Work from Office
Dear Candidate,

Greetings of the day!!

I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn: https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions aligned with the business goals of its partners. We are a full-scale, leading software and mobile app development company. Techmango is driven by the mantra "Client's Vision is our Mission", and we stay true to this statement. Our aim is to be the technologically advanced and most loved organization providing prime-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: GCP Data Architect
Location: Madurai/Chennai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP).
- Define data strategy, standards, and best practices for cloud data engineering and analytics.
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (an illustrative sketch follows this posting).
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery).
- Architect data lakes, warehouses, and real-time data platforms.
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP).
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers.
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards.
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem.

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms.
- Minimum 3-5 years of hands-on experience with GCP data services.
- Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, star/snowflake schema).
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines.
- Good understanding of IAM, networking, security models, and cost optimization on GCP.
- Prior experience in leading cloud data transformation projects.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- GCP Professional Data Engineer / Architect certification.
- Experience with Terraform, CI/CD, GitOps, and Looker/Data Studio/Tableau for analytics.
- Exposure to AI/ML use cases and MLOps on GCP.
- Experience working in agile environments and client-facing roles.

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients.
- A fast-growing company with a strong tech and people culture.
- Competitive salary, benefits, and flexibility.
- A collaborative environment that values innovation and leadership.
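For illustration of the Dataflow/Pub/Sub/BigQuery ingestion work listed above, a minimal Apache Beam sketch follows; the project, topic and table IDs are hypothetical, and on GCP this would run under the DataflowRunner.

```python
# Minimal Apache Beam sketch: read JSON events from Pub/Sub and append them
# to a BigQuery table. IDs are hypothetical; pass the DataflowRunner and a
# GCP project in the pipeline options to run this on Dataflow.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # Pub/Sub sources require streaming

with beam.Pipeline(options=options) as p:
    (p
     | "ReadPubSub" >> beam.io.ReadFromPubSub(topic="projects/my-proj/topics/events")
     | "ParseJson" >> beam.Map(json.loads)
     | "WriteBigQuery" >> beam.io.WriteToBigQuery(
           "my-proj:analytics.events",  # hypothetical dataset.table
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```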
Posted 1 week ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:

Job Summary:
We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key Responsibilities:
- Design and implement ETL/ELT pipelines using Databricks and PySpark.
- Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets (a minimal sketch follows this posting).
- Develop high-performance SQL queries and optimize Spark jobs.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Ensure data quality and compliance across all stages of the data lifecycle.
- Implement best practices for data security and lineage within the Databricks ecosystem.
- Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills:
- Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits).
- Strong hands-on skills with PySpark and Spark SQL.
- Solid experience writing and optimizing complex SQL queries.
- Familiarity with Delta Lake, data lakehouse architecture, and data partitioning.
- Experience with cloud platforms like Azure or AWS.
- Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with tools like Airflow, Git, Azure Data Factory, or dbt.
- Exposure to streaming data and real-time processing.
- Knowledge of DevOps practices for data engineering.

Mandatory skill sets: Databricks
Preferred skill sets: Databricks
Years of experience required: 7-14 years
Education qualification: BE/BTech, ME/MTech, MBA, MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering, Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
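As a minimal sketch of the Unity Catalog governance work described above (assuming a Databricks notebook, where `spark` is predefined; the catalog, schema, table and group names are hypothetical):

```python
# Create a governed Delta table and grant read access to an analyst group.
# Unity Catalog access control is expressed as plain SQL GRANT statements.
spark.sql("CREATE CATALOG IF NOT EXISTS finance")
spark.sql("CREATE SCHEMA IF NOT EXISTS finance.curated")
spark.sql("""
    CREATE TABLE IF NOT EXISTS finance.curated.transactions (
        txn_id STRING, amount DECIMAL(18, 2), txn_date DATE
    ) USING DELTA
""")
spark.sql("GRANT USE CATALOG ON CATALOG finance TO `data-analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA finance.curated TO `data-analysts`")
spark.sql("GRANT SELECT ON TABLE finance.curated.transactions TO `data-analysts`")
```

Because grants attach to the three-level namespace (catalog.schema.table), the same statements also drive the lineage and audit views that Unity Catalog exposes.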
Posted 1 week ago
5.0 - 10.0 years
7 - 14 Lacs
Noida
Hybrid
JD for Business Analyst Profile:

Duties:
- Understand and translate the business strategy, business goals, and business processes into an IT solution.
- End-to-end delivery of reporting for applications across Adobe source systems; interface with Adobe BUs and technical delivery teams to assist in finalising user stories, translating them into technical documentation and other system documentation required by Agile methodologies (Jira).
- Strong understanding of how data solves for analytical and operational use cases in Marketing, Sales, Finance, Product Development, or Customer Care.
- Produce visualisations and stories that connect data analysis, data science and analytics to useful business insights.
- Create impactful presentations and reports to educate internal and external customers on solutions, insights and opportunities.
- Design and build database-driven analytical solutions and dashboards.
- Analyse performance metrics, business functions and operations to determine insights and recommendations for customers.
- Provide direction to product managers and engineers using advanced data modeling and analysis techniques.
- Ability to work effectively in a fast-paced, results-oriented work environment.
- Ability to handle and adjust to changes in requirements and direction.
- A positive attitude, attention to detail and great problem-solving skills are important.
- Self-motivated, self-starter with a team-centric focus while working in a fluid environment.

Skills:
- Use structured and disciplined approaches to solving problems. Innovation, creativity and an outside-the-box mindset are all keys to your success as you work with stakeholders and delve into the real business needs of the client, not just the face value of what the needs may be.
- 5+ years of functional or data experience with attention to detail.
- 5+ years of hands-on experience with business intelligence tools (Power BI, MicroStrategy, Tableau, etc.).
- 5+ years of hands-on experience with OLAP/multi-dimensional data cubes.
- Hands-on experience working with Hadoop, Hive, and MS SQL technologies.
- Requires a bachelor's degree; a master's degree in business analytics/computer science or an MBA is a plus.
- Hands-on experience developing and deploying web analytics, Power BI, and Tableau dashboards.
- Experience with agile development methodologies.
- Strong experience with BI tools and the ability to create report mockups, self-serve data models and dashboard prototypes.
- Strong experience with the development of MDX/DAX multi-dimensional and tabular cube models.
- Strong understanding of big data platforms (HANA, Hadoop, Redshift, etc.) and both streaming and ETL processes, and familiarity with SQL/Python.
- Good understanding of data modelling and RDBMS concepts, and experience working on star and snowflake schemas.
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Chennai
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot recurring trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes and updates.
- Enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: DataBricks - Data Engineering.
Experience: 5-8 Years.
Posted 1 week ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
About The Role
Data Engineer - 1 (Experience 0-2 years)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is a central data org for Kotak Bank which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, which is currently an on-premise solution, into a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows a great opportunity to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today, be an early member in the digital transformation journey of Kotak, learn and leverage technology to build complex data platform solutions (including real-time, micro-batch, batch and analytics solutions) in a programmatic way, and also be futuristic in building systems which can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building the customer feature repository; building cost optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers and all analytics use cases.

Data Governance
The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools (a minimal orchestration sketch follows this posting).

BASIC QUALIFICATIONS for Data Engineer/SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
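A hypothetical Airflow DAG sketching the kind of pipeline orchestration this role involves; the DAG id, task bodies and schedule are illustrative assumptions only.

```python
# Toy Airflow DAG: stage data to S3, then run a Spark ETL step. The task
# bodies are placeholders; `schedule=` is the Airflow 2.4+ spelling.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def stage_to_s3(**_):
    print("extract source tables and stage to s3://bucket/staging/")  # placeholder


def run_spark_job(**_):
    print("submit the Spark ETL job, e.g. to EMR or Glue")  # placeholder


with DAG(
    dag_id="daily_lakehouse_load",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    stage = PythonOperator(task_id="stage_to_s3", python_callable=stage_to_s3)
    spark = PythonOperator(task_id="run_spark_job", python_callable=run_spark_job)
    stage >> spark  # run the Spark step only after staging succeeds
```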
Posted 1 week ago
5.0 - 9.0 years
7 - 19 Lacs
Coimbatore
Work from Office
Responsibilities:
* Design, develop & maintain data pipelines using Scala/Python & Hadoop/Spark.
* Ensure data quality through testing & monitoring.
* Collaborate with cross-functional teams on big data initiatives.
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do
- Provide adequate support in architecture planning, migration and installation for new projects in own tower (platform/database/middleware/backup).
- Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution.
- Conduct technology capacity planning by reviewing current and future requirements.
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable.
- Strategize and implement disaster recovery plans, and create and implement backup and recovery plans.
- Manage the day-to-day operations of the tower: troubleshoot issues, conduct root cause analysis (RCA) and develop fixes to avoid similar issues.
- Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower.
- Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges.
- Develop the shift roster for the team to ensure no disruption in the tower.
- Create and update SOPs, data responsibility matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities w.r.t. progress, updates, status, and next steps.
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness.

Team Management

Resourcing
- Forecast talent requirements as per current and future business needs.
- Hire adequate and right resources for the team.
- Train direct reportees to make right recruitment and selection decisions.

Talent Management
- Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness.
- Build an internal talent pool of HiPos and ensure their career progression within the organization.
- Promote diversity in leadership positions.

Performance Management
- Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports.
- Ensure that organizational programs like Performance Nxt are well understood, and that the team is taking the opportunities presented by such programs, for themselves and the levels below them.

Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team.
- Track team satisfaction scores and identify initiatives to build engagement within the team.
- Proactively challenge the team with larger and enriching projects/initiatives for the organization or team.
- Exercise employee recognition and appreciation.

Mandatory Skills: Hadoop Admin
Experience: 5-8 Years
Posted 1 week ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
Some careers have more impact than others. If you're looking for a career where you can make a real impression, join HSBC and discover how valued you'll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Manager - Business Consulting.

Principal Responsibilities
- Perform basic data ETL (Extract-Transform-Load) tasks: extract data from the source systems, clean it, handle exceptions, and prepare it to feed into various systems/tools depending on the projects/PODs/delivery channels (a small illustrative sketch follows this posting).
- Test and validate data models/data tables/data assets.
- Understand the data thoroughly and draw business insights, to be shared with various business stakeholders for informed decision-making and process improvement.
- Facilitate and manage regular meetings with senior managers, key stakeholders, subject matter experts and delivery partners to reach targeted milestones on time.
- Be flexible in approach, ready to understand ever-changing business scenarios and cater to dynamic requirements in a smart and prompt manner.
- Drive automation efforts across multiple reports/processes, understand gaps in the process, and drive projects to move away from tactical solutions toward a more strategic way of delivering reports and insights.
- Adhere to controls and governance when dealing with data: understand the basic risks arising from managing sensitive banking data and act accordingly to manage those risks in everyday work.
- Complete mandatory trainings on time and be self-motivated to take up the ample training opportunities the bank provides to upskill for the future of work.
- Demonstrate excellent communication skills, problem solving and critical thinking abilities.

Requirements
- Hands-on experience in Python, SQL, SAS, Excel and PySpark required.
- Prior project management experience, with knowledge of Agile methodology and exposure to project management tools like Jira, Confluence, GitHub.
- Exposure to Big Data and Hadoop data lakes is a big plus.
- Knowledge of cloud platforms like GCP, Azure, AWS is a big plus.
- Expertise in visualization tools like Qlik Sense, Tableau, Power BI is good to have.
- Excellent verbal and written communication.
- Good stakeholder management skills.

You'll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role.

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSBC Electronic Data Processing (India) Private Ltd
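To make the extract-clean-handle-exceptions loop above concrete, a small pandas sketch follows; the file paths and column names are invented for illustration.

```python
# Basic ETL sketch: extract a source file, clean it, quarantine exception
# rows, and stage a clean feed for downstream tools. All names hypothetical.
import logging

import pandas as pd

log = logging.getLogger("etl")
logging.basicConfig(level=logging.INFO)


def extract_clean_load(src: str, dst: str) -> None:
    try:
        df = pd.read_csv(src, parse_dates=["as_of_date"])
    except FileNotFoundError:
        log.error("Source extract missing: %s", src)
        raise
    # Cleaning: drop duplicate keys, normalise text, flag bad balances.
    df = df.drop_duplicates(subset=["account_id", "as_of_date"])
    df["segment"] = df["segment"].str.strip().str.upper()
    bad = df["balance"].isna() | (df["balance"] < 0)
    log.info("Quarantining %d exception rows", int(bad.sum()))
    df[bad].to_csv(dst + ".exceptions.csv", index=False)  # exceptions feed
    df[~bad].to_csv(dst, index=False)                     # clean feed


extract_clean_load("raw/accounts.csv", "staged/accounts.csv")
```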
Posted 1 week ago
6.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Experience with machine learning environments and LLMs
Certification in IBM watsonx.data or related IBM data and AI technologies
Experience integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures)
Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake)
Exposure to implementing, or an understanding of, database replication processes
Experience with NoSQL databases (e.g., MongoDB, Cassandra)
Experience with data modeling tools (e.g., ER/Studio, ERwin)
Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA)
Database administration skills
Experience with data security, backup, and disaster recovery strategies
Experience implementing WLM (workload management) on any database (preferably DB2)
Experience implementing data governance frameworks, including metadata management and data cataloging tools
Posted 1 week ago
5.0 - 8.0 years
8 - 13 Lacs
Mumbai
Work from Office
At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose.
Your role
Use design thinking and a consultative approach to conceive cutting-edge technology solutions for business problems, mining core insights as a service model, and contribute in the capacity of a Scrum Master/data business analyst to data projects.
Act as an agile coach, implementing and supporting agile principles, practices and the rules of the Scrum process and other rules the team has agreed upon. Apply knowledge of frameworks like Scrum, Kanban, XP etc. and drive tactical, consistent team-level improvement as part of the Scrum, working closely with the Product Owner to prioritize by business value, keep work aligned with objectives and maintain the health of the product backlog.
Facilitate the Scrum ceremonies and analyze sprint reports and burn-down charts to identify areas of improvement; support and coordinate system implementations through the project lifecycle, working with other teams on a local and global basis.
Engage with project activities across the information lifecycle, often related to paradigms like building and managing business data lakes and ingesting data streams to prepare data, developing machine learning and predictive models to analyse data, visualizing data, and specializing in business models and architectures across various industry verticals.
Your Profile
Proven working experience as a Scrum Master/Data Business Analyst with overall experience of 5 to 9+ years; preferably with domain knowledge of CPRD/FS/MALS/Utilities/TMT, and able to work independently with the Product Management team to prepare functional analysis and user stories.
Experience in technical writing to create BRDs, FSDs, non-functional requirement documents, user manuals and use-case specifications, with comprehensive and solid experience of Scrum as well as SDLC methodologies, excellent meeting moderation and facilitation skills, and strong stakeholder management skills.
Experience in Azure DevOps/JIRA/Confluence or an equivalent system, with strong knowledge of other agile frameworks; CSM/PSM 1 or 2 certification is mandatory; SAFe 5.0/6.0 is a plus.
Good to have: knowledge of the big data ecosystem (e.g., Hadoop) and working knowledge of R/Python.
What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.
Posted 1 week ago
10.0 - 20.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Databricks Engineer: 9+ years of total experience, with 5 years of relevant experience in the mandatory skills.
Mandatory Skills: Databricks, Hadoop, Python, Spark, Spark SQL, PySpark, Airflow and IBM StreamSets
Required Skills & Experience:
Develop data engineering and ML pipelines in Databricks and different AWS services, including S3, EC2, API, RDS, Kinesis/Kafka and Lambda, to build serverless applications
Solid understanding of Databricks fundamentals/architecture, with hands-on experience setting up Databricks clusters and working in Databricks modules (Data Engineering, ML and SQL warehouse)
Knowledge of the medallion architecture, DLT and Unity Catalog within Databricks (a brief sketch of a bronze-to-silver step follows below)
Experience migrating data from on-prem Hadoop to Databricks/AWS
Understanding of core AWS services, their uses, and AWS architecture best practices
Hands-on experience in different domains, like database architecture, business intelligence, machine learning, advanced analytics, big data, etc.
Solid knowledge of Airflow
Solid knowledge of CI/CD pipelines in AWS technologies
Application migration of RDBMS, Java/Python applications, model code, Elastic, etc.
Solid programming background in Scala, Python and SQL
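As a companion to the medallion-architecture point above, here is a hedged PySpark sketch of a bronze-to-silver step on Delta; the paths, schema, and de-duplication rule are invented for illustration, not taken from the posting.

```python
# Illustrative bronze-to-silver step in a medallion layout on Databricks.
# Paths, schema, and the dedupe rule are assumptions for the sketch.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

# Bronze: land raw files as-is, adding ingestion metadata
bronze = (
    spark.read.json("/mnt/raw/orders/")
         .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").save("/mnt/bronze/orders/")

# Silver: apply typing, filtering, and de-duplication for downstream use
silver = (
    spark.read.format("delta").load("/mnt/bronze/orders/")
         .filter(F.col("order_id").isNotNull())
         .dropDuplicates(["order_id"])
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders/")
```

In a production pipeline the same steps would typically be expressed as DLT tables with expectations, but the layering idea is the same.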
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Pune
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.
Do
Oversee and support the process by reviewing daily transactions on performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries within the SLAs defined in the contract
Develop understanding of the process/product among team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequent trends and prevent future problems
Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!
Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence, and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.
Process Overview
The Data Infrastructure & Strategic Initiatives team is responsible for all the automation aspects of testing processes, ensuring quality of data and independent testing of corporate and business-level process and regulatory controls by providing seamless access to the appropriate data and platforms required to execute the associated portfolio of tests. A test is defined in the Independent Testing Enterprise Policy as “An independent point-in-time examination of one or more processes, controls, policies and procedures or data sources utilized for managing risk to assess the effectiveness of the control environment. A test is focused on answering a specific objective and has a pre-defined pass/fail criteria.” Compliance testing may include activities such as automated surveillance and transaction-level testing, and may be performed onsite. Please note: this is not an application/software testing or application/software development role.
Job Description
Drive sustainable automation in EIT (Enterprise Independent Testing) test design, development, and implementation using a modern infrastructure and thoughtfully designed solutions.
Responsibilities:
Developing testing automation that provides timely, useful, and actionable independent insight on the operational health of Bank of America’s processes.
Work closely with process owners in the Front-Line Units and Control Functions (FLUs/CFs) to obtain an understanding of their processes, including underlying data, flows, and controls, and to identify risks to successful execution, so that appropriate testing and monitoring can be developed. The processes to be assessed span multiple products, regulations and enterprise processes.
The outputs of the methodologies will be used to drive process improvements and timely detection and reporting of errors. The role requires being able to document and verbally explain the intuition and details behind the methodologies in a manner that is clear, concise, and consumable for a broad set of audiences, including key stakeholders across the Bank as well as auditors and regulators.
The Test Tools and Automation specialist will then convert the test requirements into automated tests via Python and SQL (a hedged sketch follows below), enabling automatic test-document generation from code.
Leverage SDLC/Agile development; understand coding standards and best practices.
Perform debugging and code reviews to ensure quality of code.
Ensure accuracy and quality of development, and hold validation sessions with the stakeholders to get a sign-off.
Ensure adherence to the SLAs/metrics: productivity, turnaround time and accuracy.
Communicate regularly with management and other support colleagues.
Manage stakeholders with respect to business deliverables.
Drive projects independently with minimal oversight and ensure timely deliverables.
Requirements:
Education: MBA
Experience Range: 0-3 years
Foundational Skills
Prior experience/knowledge of coding in Python and SQL
Working knowledge of relational databases and familiarity with data analysis/mining tools; prior exposure to working with large datasets beneficial
Technical experience in object-oriented or functional software development
Experienced in writing effective, scalable code
Good understanding of software testing methodologies
Worked on varied data problems: structured/unstructured and small/large
Applies critical thinking and connects the dots on how processes relate to one another; demonstrates understanding of and passion for the “why”
Looks around the corner and explores uncharted territories with an “outside-in” perspective
Life-long learner who not only assertively educates self but encourages others to learn and grow
Feels ownership and accountability for delivering high-quality work; able to prioritize effectively, adapt, and meet strict deadlines
Ability to recommend and implement process control improvements
Strong written, verbal, and presentation creation and delivery skills; communications are timely, concise, easy to follow and tailored to topic, audience, and competing priorities
Exercises excellent judgment, discerning appropriate moments to challenge or insert a point of view; presentations tell a compelling story and influence action
Asks the next level of questions and applies context to determine direction
Flexible to shift changes at short notice
Ability to work cross-functionally and across borders to effectively execute the business goals
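A minimal sketch of the kind of automated control test described above, assuming Pytest (listed under desired skills below) and an in-memory SQLite stand-in for the real data source; the table, rows, and pass/fail criterion are illustrative only.

```python
# Hypothetical Pytest sketch of a control test with a pre-defined pass/fail
# criterion, in the spirit of the role described above. The table, sample
# rows, and threshold are assumptions, not details from the posting.
import sqlite3
import pytest

@pytest.fixture
def conn():
    # Stand-in for the real warehouse connection a test would use
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE txns (id INTEGER, amount REAL)")
    db.executemany("INSERT INTO txns VALUES (?, ?)",
                   [(1, 10.0), (2, 25.5), (3, 7.2)])
    yield db
    db.close()

def test_no_null_amounts(conn):
    # Pass/fail criterion: zero transactions with a missing amount
    nulls = conn.execute(
        "SELECT COUNT(*) FROM txns WHERE amount IS NULL").fetchone()[0]
    assert nulls == 0
```

Run with `pytest`; because the pass/fail criterion lives in code, the test document can be generated from the test itself, as the responsibilities suggest.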
Desired Skills:
Any experience in the Operational Risk, Audit or Compliance domain
Exposure to the Trifacta platform
Automation acumen
Experience using large-data tooling including Hadoop and S3 as well as Spark and Trino
Experience building unit and integration tests with Pytest
Location: Students should be willing to work in any of the following locations as per company requirement: Mumbai, Chennai, Gurugram, Gandhinagar (GIFT), Hyderabad
Mandatory Eligibility Requirements
Graduates from the Class of 2025 ONLY
Must have a specialization in Finance or as specified
Must have scored 60% in the last semester OR a CGPA of 6 on a scale of 10 in the last semester
No active backlogs in any of the semesters
Students should be willing to join any of the roles/skills/segments as per company requirement
Students should be willing to work in any shift, including night shifts
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to design, test and maintain software programs for operating systems or applications to be deployed at a client site, and to ensure they meet 100% quality assurance parameters.
Do
1. Be instrumental in understanding the requirements and design of the product/software
Develop software solutions by studying information needs, systems flow, data usage and work processes
Investigate problem areas throughout the software development life cycle
Facilitate root cause analysis of system issues and problem statements
Identify ideas to improve system performance and availability
Analyze client requirements and convert them into feasible designs
Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development
Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software
Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
Analyze information to recommend and plan the installation of new systems or modifications to an existing system
Ensure that code is error-free, with no bugs or test failures
Prepare reports on programming project specifications, activities and status
Ensure all defects are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns
Compile timely, comprehensive and accurate documentation and reports as requested
Coordinate with the team on daily project status and progress, and document it
Provide feedback on usability and serviceability; trace results to quality risk and report them to concerned stakeholders
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
Capture all requirements and clarifications from the client for better quality work
Take feedback on a regular basis to ensure smooth and on-time delivery
Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
Document necessary details and reports formally for proper understanding of the software, from client proposal to implementation
Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
Respond to customer requests in a timely manner, with no instances of complaints either internally or externally
Mandatory Skills: Hadoop Admin. Experience: 5-8 Years.
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Chennai, Coimbatore, Bengaluru
Work from Office
5+ years of data warehouse testing experience, 2+ years of Azure Cloud experience
Strong understanding of data marts and data warehouse concepts
Expert in SQL, with the ability to create source-to-target comparison test cases in SQL (a brief sketch follows below)
Creation of test plans, test cases, traceability matrices and closure reports
Proficient with PySpark, Python, Git, Jira, JTM
Band: B3
Location: Pune, Chennai, Coimbatore, Bangalore
Mandatory Skills: ETL Testing. Experience: 5-8 Years.
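A brief sketch of a source-to-target comparison test case, written here as Spark SQL driven from Python; the view names, columns, and key are assumptions for the example.

```python
# A minimal source-to-target comparison in SQL, of the kind a warehouse test
# case might use. Paths, table names, and columns are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s2t-compare").getOrCreate()

# Register the source extract and target table as views (layouts assumed)
spark.read.parquet("/data/source/customers/").createOrReplaceTempView("src")
spark.read.parquet("/data/mart/dim_customer/").createOrReplaceTempView("tgt")

# Rows present in the source but missing or different in the target
mismatches = spark.sql("""
    SELECT customer_id, name, country FROM src
    EXCEPT
    SELECT customer_id, name, country FROM tgt
""")

# Pre-defined pass/fail criterion: the comparison returns zero rows
assert mismatches.count() == 0, "source-to-target mismatch detected"
```

A symmetric second EXCEPT (target minus source) is usually added to catch rows that exist only in the target.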
Posted 1 week ago
12.0 - 14.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist
In this role, you will:
Work within the Data Engineering team, with a POD of Hadoop Data Engineers aligned to priorities from the Product Owner
Support existing applications as well as design and build new ones
Be part of an Agile team, taking on complex problems and designing and coding solutions
Participate in technical innovation within your product area
Support issue resolution and improve processing performance
Ensure the use of SQL, Hive, Pentaho and Control-M reduces lead time to delivery and aligns to the overall Group strategic direction, so that cross-functional development is usable (a brief Hive sketch follows this posting)
Take ownership of providing solutions and tools that iteratively increase engineering efficiencies
Work with the Product Owner and Solution & Platform Architects to identify required changes and to create and agree the necessary stories
Work with the Agile Lead(s) to ensure an efficient flow of the backlog of change activities
Support the solution architect in ensuring that solutions and services are supported by the right architectures and systems
Requirements
To be successful in this role, you should meet the following requirements:
12-14 years of overall hands-on IT experience, with mandatory exposure of 5-7 years in Big Data, Hadoop, Pentaho ETL and data pipelines
Good knowledge of industry best practices for ETL design, principles and concepts
Experience in Python or another mainstream programming language
Technical skills: ETL/Pentaho, Hive/SQL, Unix shell scripting, Control-M (or similar)
Ability to work independently on specialized assignments within the context of project deliverables
Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions
Experience in Agile ways of working
Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment
Flexibility to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment
Nice to have: experience with Big Data cluster monitoring tools such as Ambari, Cloudera Manager or Accel Pulse Data, deriving insights for performance tuning or monitoring
You’ll achieve more when you join HSBC. www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India
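A hedged sketch of the kind of Hive-backed SQL rollup such a POD might schedule through Control-M; the database, table names, and columns are assumptions, not details from the posting.

```python
# Hypothetical daily rollup over a Hive table, of the sort a Control-M job
# might trigger. Database, table, and column names are invented for the sketch.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("daily-positions-rollup")
    .enableHiveSupport()
    .getOrCreate()
)

# Summarise yesterday's positions from a Hive table into a reporting table
# (both tables are assumed to already exist in the metastore)
spark.sql("""
    INSERT OVERWRITE TABLE reporting.daily_position_summary
    SELECT business_date, desk, SUM(notional) AS total_notional
    FROM warehouse.positions
    WHERE business_date = date_sub(current_date(), 1)
    GROUP BY business_date, desk
""")
spark.stop()
```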
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
JOB_POSTING-3-72293-4
Job Description
Role Title: AVP, Marketing Technology Audience Analyst (L10)
Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~52% women talent. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.
Organizational Overview
The Performance Marketing Team is the engine behind business growth as it handles the majority of marketing activities. This includes targeting the right audience through campaigns, distributing content via channel marketing, conducting a thorough analysis of campaign launches and budgets, and ensuring compliance via surveillance and governance, all to maximize results and maintain a competitive edge. Together this team drives ROI and elevates Synchrony’s brand presence in a dynamic market.
Role Summary
Synchrony is continuing to build a world-class Performance Marketing Organization committed to driving results with our retail and payments partners while also delivering leading customer experiences. The AVP, Audience Analyst will be responsible for mining and analyzing digital audience performance, delivering insights centered on audience strategies and persona development, and developing digital audiences. Additionally, this role will excel in the understanding, building and tracking of audiences in various platforms, develop best practices for audience governance, and support the broader audience strategy development.
Key Responsibilities
Perform and deliver audience analyses using multiple internal and external data sources, and develop insights which will optimize current audience campaigns and direct future campaign strategies and activities.
Create data personas and consult with marketing partners to drive understanding of trends and audience opportunities
Partner with cross-functional teams to identify key audiences and assist in the collection of data for audience insight
Build audiences, managing the workflow from onboarding of CRM data through the anonymization process to pushing audience segments to destinations for programmatic and personalization campaigns (an illustrative sketch follows this section)
Develop partnerships with cross-functional teams, establishing a strong communicative and collaborative atmosphere, to understand their business needs and goals and ultimately deliver processes and opportunities
Perform other duties and/or special projects as assigned
Qualifications/Requirements
Bachelor's Degree with 5+ years of experience mining and analyzing digital audience performance, or, in lieu of a degree, a minimum of 7+ years of experience mining and analyzing digital audience performance in the financial domain
5+ years’ experience working in enterprise-level data science, analytics and/or customer intelligence, including:
3+ years of professional digital marketing experience, working directly within digital platforms (e.g., CDP, DMP, DV360, Google Analytics, Adobe Analytics, Neustar, LiveRamp, Dynamic Yield, Optimizely, Adobe Audience Manager, Salesforce, etc.)
Desired Characteristics
Proven experience executing analyses with massive data sets, complex data structures and multi-variate campaign strategies
Expert proficiency with leading-edge data mining techniques and analytic programming languages, including Python, SQL, Java, SAS and others
Leadership experience working with cross-functional partners (i.e., Media, Client Marketing, IT, Enterprise Operations) to deliver against mutual goals and ensure alignment of priorities and requirements
Working knowledge of analytic platforms and tools such as Hadoop, R and Hive, as well as BI tools such as Tableau
Operational understanding of areas such as probability and statistics; familiarity with machine learning and artificial intelligence a plus
Experience compiling and analyzing data from paid media and digital marketing campaigns to report actionable results
Ability to provide wing-to-wing analytic activities: data aggregation, analysis preparation, data interpretation and presenting strategic recommendations to client/product teams
Expert proficiency with leading-edge data mining techniques and analytic platforms (including but not limited to Python and R) to feed a strong data management foundation
Creative thinker with a successful history of synthesizing insights to inform business decisions and lead strategic discussions, providing innovative thought leadership and developing actionable outcomes with tangible results
Highly motivated, assertive self-starter with the ability to work autonomously or as a strong team participant
Experience in Agile methodologies and processes a plus
Experience with consumer financial services organizations, preferably with exposure to credit card marketing or retail marketing, consumer protection, privacy and related laws and policies
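An illustrative Python sketch of the audience workflow named above: onboard CRM records, anonymize identifiers, and export a segment. The field names, segment rule, and the choice of SHA-256 hashing are assumptions for the example, not details from the role.

```python
# Hypothetical audience-building sketch: onboard CRM data, anonymize the
# identifier, and export a segment for activation. All names and the
# high-value rule are invented for illustration.
import hashlib
import pandas as pd

crm = pd.DataFrame({
    "email": ["a@example.com", "b@example.com", "c@example.com"],
    "spend_12m": [1200.0, 80.0, 640.0],
})

# Anonymize: replace the raw identifier with a one-way hash before activation
crm["hashed_email"] = crm["email"].str.strip().str.lower().map(
    lambda e: hashlib.sha256(e.encode("utf-8")).hexdigest()
)

# Segment: a simple high-value audience rule, for illustration only
high_value = crm.loc[crm["spend_12m"] >= 500, ["hashed_email"]]

# Push: write the segment for delivery to a downstream destination
high_value.to_csv("high_value_audience.csv", index=False)
```

Normalizing (trim, lowercase) before hashing matters in practice, since onboarding platforms match on the hash and any formatting difference breaks the match.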
Eligibility Criteria
Bachelor's Degree with 5+ years of experience mining and analyzing digital audience performance, or, in lieu of a degree, a minimum of 7+ years of experience mining and analyzing digital audience performance in the financial domain
Work Timings: 2:00 PM to 11:00 PM IST
For Internal Applicants
Understand the criteria or mandatory skills required for the role before applying
Inform your manager and HRM before applying for any role on Workday
Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format)
Must not have any corrective action plan (Formal/Final Formal, LPP)
L8+ employees who have completed 18 months in the organization and 12 months in the current role and level are eligible; L8+ employees can apply
Level / Grade: 10
Job Family Group: Marketing
Posted 1 week ago
6.0 - 11.0 years
15 - 19 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description
During the 2008 financial crisis, many big banks failed or faced issues due to liquidity problems. A lack of liquidity can kill any financial institution overnight. That's why it is so critical to constantly monitor liquidity risks and properly maintain collateral. We are looking for a number of talented developers who would like to join our team in Pune, which is building a liquidity risk and collateral management platform for one of the biggest investment banks in the world. The platform is a set of front-end tools and back-end engines. It helps the bank increase efficiency and scalability, reduce operational risk, and eliminate the majority of manual interventions in processing margin calls.
Responsibilities
The candidate will work on the development of new functionality for the Liquidity Risk platform, closely with other teams across the globe (a brief sketch follows below).
Skills
Must have
Big Data experience (6+ years)
Java/Python, J2EE, Spark, Hive
SQL databases
UNIX shell
Strong experience in Apache Hadoop, Spark, Hive, Impala, Yarn, Talend, Hue
Big Data reporting, querying and analysis
Nice to have
Spark calculators based on business logic/rules
Basic performance tuning and troubleshooting knowledge
Experience with all aspects of the SDLC
Experience with complex deployment infrastructures
Knowledge of software architecture, design and testing
Data flow automation (Apache NiFi, Airflow, etc.)
Understanding of the difference between OOP and functional design approaches
Understanding of event-driven architecture
Spring, Maven, Git, uDeploy
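A brief, hedged PySpark sketch in the spirit of the platform described above, aggregating exposure by counterparty to flag margin-call candidates; the schema, paths, and threshold are invented for illustration.

```python
# Hypothetical exposure rollup for margin-call monitoring. The trade and
# collateral layouts, column names, and the call threshold are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("liquidity-sketch").getOrCreate()

trades = spark.read.parquet("/data/trades/")          # assumed layout
collateral = spark.read.parquet("/data/collateral/")  # assumed layout

exposure = (
    trades.groupBy("counterparty")
          .agg(F.sum("mark_to_market").alias("net_exposure"))
          .join(collateral, "counterparty", "left")
          .withColumn(
              "shortfall",
              F.col("net_exposure") - F.coalesce(F.col("posted"), F.lit(0)))
)

# Counterparties whose shortfall breaches the (assumed) call threshold
exposure.filter(F.col("shortfall") > 1_000_000).show()
```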
Posted 1 week ago