5.0 - 8.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions against performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after call/email requests
Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: DataStage ETL Testing
Experience: 5-8 Years
Posted 4 days ago
5.0 - 8.0 years
9 - 14 Lacs
Gurugram
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions against performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after call/email requests
Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Databricks - Data Engineering
Experience: 5-8 Years
Posted 4 days ago
10.0 years
0 Lacs
Chennai, Tamil Nadu
Remote
Title: Senior Data Architect
Years of Experience: 10+ years

Job Description
The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured/unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments.

Key Responsibilities
· Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns
· Design logical and physical data models, semantic layers, and metadata frameworks
· Establish data quality, lineage, governance, and security policies
· Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks
· Integrate AI and analytics solutions with operational data platforms
· Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake
· Lead architecture reviews, design sessions, and CoE reference architecture development

Technical Skills
· Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift
· Data Modeling: ERwin, dbt, PowerDesigner
· Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark
· Integration: Azure Data Factory, Kafka, Event Grid, SSIS
· Metadata/Lineage: Purview, Collibra, Informatica
· BI Platforms: Power BI, Tableau, Looker
· Security & Compliance: RBAC, encryption at rest/in transit, NIST/FISMA

Qualifications
· Bachelor's or Master's in Computer Science, Information Systems, or Data Engineering
· Microsoft Certified: Azure Data Engineer / Azure Solutions Architect
· Strong experience building cloud-native data architectures
· Demonstrated ability to create data blueprints aligned with business strategy and compliance

Job Types: Full-time, Permanent
Work Location: Hybrid remote in Chennai, Tamil Nadu
Expected Start Date: 12/07/2025
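For illustration only (not part of the posting), a minimal sketch of the kind of lake-to-Delta ingestion step a data architect in this role would review; the storage paths, column names and Spark/Delta setup are assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes a Spark cluster with the delta-spark package configured (e.g. Databricks).
spark = SparkSession.builder.appName("raw-to-delta-sketch").getOrCreate()

# Read raw CSV files landed in the data lake (hypothetical container/path).
raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")
)

# Light standardisation before persisting to the curated zone:
# stamp the load date and drop duplicate orders (hypothetical key column).
curated = (
    raw.withColumn("load_date", F.current_date())
       .dropDuplicates(["order_id"])
)

# Append to a Delta table partitioned by load date.
(
    curated.write
    .format("delta")
    .mode("append")
    .partitionBy("load_date")
    .save("abfss://curated@examplelake.dfs.core.windows.net/sales/")
)
```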
Posted 4 days ago
7.0 years
0 Lacs
India
On-site
Job Title: Data Analyst

About Us
Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry; these projects will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Requirements
Job Title: Data Analyst
Experience: 7+ years
Location: Hyderabad

Roles & Responsibilities
University degree in relevant disciplines.
Strong analytical and problem-solving skills.
Experience working within the Hadoop and GCP ecosystems, in addition to strong technical skills in analytical languages such as Python, R, SQL and SAS.
Good understanding of banking operations and processes, preferably in Risk, Compliance and Finance functions.
Proven experience working in Agile environments (Kanban/Scrum) and familiarity with Agile tools like JIRA, Confluence, MS Teams and SharePoint.
Excellent stakeholder engagement and management skills.
Ability to navigate within the organization.
Proficient skills in MS Excel and PowerPoint.
Posted 4 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
TransUnion's Job Applicant Privacy Notice

What We'll Bring:
OneTru Data Operations Team
As part of the Data Operations Team, this position is focused on delivering actionable insights to evaluate and control data ingestion processes across multiple sources (cloud and on-prem). This individual will leverage state-of-the-art tools to cultivate the analytical methods needed to consistently refine and improve the efficiency and effectiveness of data onboarding and data operations processes.

What You'll Bring:
Associate Data Analyst/Data Operations
In this role, you will act as an ETL Platform subject matter expert for the processes you are involved in, gathering knowledge from other SMEs and requirements from key stakeholders.

Responsibilities:
Identify, analyze, and troubleshoot possible data flow issues between servers and processing steps.
Identify, analyze, and interpret trends or patterns across flows and data sets.
Measure, track and report key performance indicators and quality metrics from large data sets, automated processes, and data processing stages.
Find and solve data problems, ensuring timely short-term and long-term preventive solutions.
Develop and improve existing processes to ensure data ingestion through the ETL Platform.
Work with management and teammates to prioritize business and information needs.
Locate and define new opportunities for process improvement or process automation.
Deliver excellent customer support through efficient and accurate handling of tickets/requests and general program inquiries.
Perform other work-related tasks and responsibilities assigned to you from time to time.
Participate in new product and feature deployments and propose technical solutions that meet business needs.

Requirements:
Active student in Systems Engineering, Statistics, Mathematics, Industrial Engineering, or a related field.
Logical thinking and troubleshooting skills.
Clear verbal and written communication skills.
B2+ English level.
Knowledge and experience with Microsoft Excel, SQL, and regular/glob expressions (a brief file-validation sketch follows this listing).
Knowledge and experience with visualization tools such as Tableau, Power BI or Google Looker (nice to have).
Experience with Unix/Linux, Hadoop, and scripting languages such as Python, Bash or JavaScript (nice to have).

Aptitude:
Results-oriented, with a mindset to improve data, processes, and procedures.
Ability to work independently and effectively in a team environment.
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and precision.
Ability to learn and apply new technologies promptly.
Interpersonal skills (leadership, teamwork, teaching, ability to dialogue, and effective interaction with different profiles of the organization).
Creative problem solving and research skills with the ability to recognize patterns in data.

Impact You'll Make:
This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title
Analyst, Data Analysis
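For illustration only (not from the posting), a minimal sketch of the regular-expression and glob-based file check the requirements above allude to; the landing directory and naming convention are hypothetical.

```python
import glob
import re

# Hypothetical convention: files named <source>_<YYYYMMDD>.csv landed in /data/incoming/.
PATTERN = re.compile(r"^(?P<source>[a-z]+)_(?P<date>\d{8})\.csv$")

for path in sorted(glob.glob("/data/incoming/*.csv")):
    name = path.rsplit("/", 1)[-1]
    match = PATTERN.match(name)
    if match:
        print(f"OK   {name} (source={match['source']}, date={match['date']})")
    else:
        print(f"FAIL {name} does not match the expected naming convention")
```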
Posted 4 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Mantas Scenario Developer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models

Required Qualifications & Work Experience
First Class Degree in Engineering/Technology (4-year graduate course)
5 to 8 years' experience implementing data-intensive solutions using agile methodologies
Experience with relational databases and using SQL for data querying, transformation and manipulation
Experience modelling data for analytical consumers
Hands-on Mantas expertise throughout the full development life cycle, including requirements analysis, functional design, technical design, programming, testing, documentation, implementation, and ongoing technical support
Ability to translate business needs (BRD) into effective technical solutions and documents (FRD/TSD)
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud-native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills

Technical Skills (Must Have)
ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Mantas: Expert in Oracle Mantas/FCCM, Scenario Manager and scenario development, with thorough knowledge and hands-on experience in Mantas FSDM, DIS and Batch Scenario Manager
Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of data warehousing concepts, relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
Others: Basics of job schedulers like Autosys; basics of entitlement management
Certification on any of the above topics would be an advantage.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Digital Software Engineering
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.
If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi.
View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 4 days ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
dunnhumby is the global leader in Customer Data Science, empowering businesses everywhere to compete and thrive in the modern data-driven economy. We always put the Customer First. Our mission: to enable businesses to grow and reimagine themselves by becoming advocates and champions for their Customers. With deep heritage and expertise in retail, one of the world's most competitive markets with a deluge of multi-dimensional data, dunnhumby today enables businesses all over the world, across industries, to be Customer First. dunnhumby employs nearly 2,500 experts in offices throughout Europe, Asia, Africa, and the Americas working for transformative, iconic brands such as Tesco, Coca-Cola, Meijer, Procter & Gamble and Metro.

We're looking for a Senior Big Data Engineer who expects more from their career. It's a chance to extend and improve dunnhumby's Data Engineering Team. It's an opportunity to work with a market-leading business to explore new opportunities for us and influence global retailers.

What We Expect From You
5+ years of experience in Information Technology
Experience managing the Big Data space
Extensive experience with high-level programming languages - Python, Java & Scala
Extensive experience in shell scripting and developing solutions in a Linux environment
Experience with Hive, Oozie, Airflow, HBase, MapReduce and Spark, along with working knowledge of Hadoop/Spark toolsets
Extensive experience working with Git and process automation
In-depth understanding of relational database management systems (RDBMS) and data flow development

What You Can Expect From Us
We won't just meet your expectations. We'll defy them. So you'll enjoy the comprehensive rewards package you'd expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You'll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don't just talk about diversity and inclusion. We live it every day, with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you.

Our approach to Flexible Working
At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work / life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information please see our Privacy Notice which can be found (here)
Posted 4 days ago
2.0 - 4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

Function Description:
The Data Management team within Global Servicing MIS supports the end-to-end data lifecycle journey for our business teams. This includes platform ownership, strategy, governance, ingestion, ETL builds, data quality, BI and downstream data enablement in collaboration with the tech organization.

Key responsibilities:
Understand business use cases and convert them into technical designs
Work as part of a cross-disciplinary team, closely with other data engineers, software engineers, data scientists, data managers and business partners
Design scalable, testable and maintainable data pipelines
Identify areas for data governance improvements and help resolve data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design changes
Develop metrics to measure effectiveness and drive adoption of Data Governance policies and standards that will be applied to mitigate identified risks across the data lifecycle (e.g., capture/production, aggregation/processing, reporting/consumption)
Continuously monitor, troubleshoot, and improve data pipelines and workflows to ensure optimal performance and cost-effectiveness
Review architecture and design on aspects such as scalability, security, design patterns, user experience and non-functional requirements, and ensure that all relevant best practices are followed

Key skills required:
2-4 years of experience in data engineering roles
Advanced SQL skills with a focus on optimisation techniques
Big data and Hadoop experience, with a focus on Spark, Hive (or other query engines) and big data storage formats (such as Parquet, ORC, Avro)
Cloud experience (GCP preferred) with solutions designed and implemented at production scale
Strong understanding of key GCP services, especially those related to data processing (batch/real time): BigQuery, Cloud Scheduler, Airflow, Cloud Logging and Monitoring
Hands-on experience with Git, advanced automation capabilities and shell scripting
Experience in design, development and implementation of data pipelines for data warehousing applications
Hands-on experience in performance tuning and debugging ETL jobs
(A brief illustrative BigQuery sketch follows this listing.)

We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
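For illustration only, a minimal sketch of the parameterised, partition-pruned BigQuery query style the listing's SQL-optimisation and GCP requirements point to; the project, dataset, table and column names are hypothetical and assume the google-cloud-bigquery client library.

```python
import datetime
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

# Filter on the (assumed) partitioning column and select only the columns needed.
query = """
    SELECT account_id, SUM(txn_amount) AS total_spend
    FROM `example-project.servicing.transactions`
    WHERE txn_date BETWEEN @start_date AND @end_date
    GROUP BY account_id
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", datetime.date(2024, 1, 1)),
        bigquery.ScalarQueryParameter("end_date", "DATE", datetime.date(2024, 1, 31)),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.account_id, row.total_spend)
```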
Posted 4 days ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Manager, Data Scientist

Our Vision
AI Garage is responsible for establishing Mastercard as an AI powerhouse. AI will be leveraged and implemented at scale within Mastercard, providing a foundational, competitive advantage for the future. All internal processes, products and services will be enabled by AI, continuously advancing our value proposition, consumer experience, and efficiency.

Opportunity
Join Mastercard's AI Garage @ Gurgaon, a newly created strategic business unit executing on identified use cases for product optimization and operational efficiency, securing Mastercard's competitive advantage through all things AI. The AI professional will be responsible for the creative application and execution of AI use cases, working collaboratively with other AI professionals and business stakeholders to effectively drive the AI mandate.

Role
Ensure all AI solution development is in line with industry standards for data management and privacy compliance, including the collection, use, storage, access, retention, output, reporting, and quality of data at Mastercard
Adopt a pragmatic approach to AI, capable of articulating complex technical requirements in a manner that is simple and relevant to stakeholder use cases
Gather relevant information to define the business problem, interfacing with global stakeholders
Creative thinker capable of linking AI methodologies to identified business challenges
Identify commonalities amongst use cases, enabling a microservice approach to scaling AI at Mastercard and building reusable, multi-purpose models
Develop AI/ML solutions/applications leveraging the latest industry and academic advancements
Leverage open and closed source technologies to solve business problems
Ability to work cross-functionally and across borders, drawing on a broader team of colleagues to effectively execute the AI agenda
Partner with technical teams to implement developed solutions/applications in a production environment
Support a learning culture continuously advancing AI capabilities

Experience - All About You
3+ years of experience in the Data Sciences field with a focus on AI strategy and execution and developing solutions from scratch
Demonstrated passion for AI, competing in sponsored challenges such as Kaggle
Previous experience with or exposure to:
Deep learning algorithm techniques, open source tools and technologies, statistical tools, and programming environments such as Python, R, and SQL
Big data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning
Classical machine learning algorithms like Logistic Regression, Decision Trees, Clustering (K-means, Hierarchical and Self-Organizing Maps), t-SNE, PCA, Bayesian models, Time Series (ARIMA/ARMA), Recommender Systems (Collaborative Filtering, FPMC, FISM, Fossil)
Machine learning and deep learning techniques like Random Forest, GBM, KNN, SVM, Bayesian methods, text mining techniques, Multilayer Perceptron, and neural networks (feedforward, CNN, LSTMs, GRUs) is a plus.
Optimization techniques: activity regularization (L1 and L2), Adam, Adagrad, Adadelta concepts; cost functions in neural nets: Contrastive Loss, Hinge Loss, Binary Cross-entropy, Categorical Cross-entropy; developed applications in KRR, NLP, speech and image processing
Deep learning frameworks for production systems like TensorFlow, Keras (for RPD and neural net architecture evaluation) and PyTorch, as well as XGBoost, Caffe, and Theano, is a plus
Exposure or experience using collaboration tools such as: Confluence (documentation), Bitbucket/Stash (code sharing), shared folders (file sharing), ALM (project management)
Knowledge of the payments industry is a plus
Experience with the SAFe (Scaled Agile Framework) process is a plus

Effectiveness
Effective at managing and validating assumptions with key stakeholders in compressed timeframes, without hampering development momentum
Capable of navigating a complex organization in a relentless pursuit of answers and clarity
Enthusiasm for Data Sciences, embracing the creative application of AI techniques to improve an organization's effectiveness
Ability to understand technical system architecture and overarching function along with interdependency elements, as well as anticipate challenges for immediate remediation
Ability to unpack complex problems into addressable segments and evaluate the AI methods most applicable to addressing each segment
Incredible attention to detail and focus, instilling confidence without qualification in developed solutions

Core Capabilities
Strong written and oral communication skills
Strong project management skills
Concentration in Computer Science
Some international travel required

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-249983
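For illustration only, a minimal, self-contained sketch of two of the classical techniques this listing names (logistic regression and K-means segmentation); the data is synthetic and the scikit-learn usage is a generic example, not Mastercard's method.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))              # synthetic features
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # synthetic binary target

# Supervised baseline: logistic regression on a holdout split.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", clf.score(X_test, y_test))

# Unsupervised segmentation: K-means into four clusters.
segments = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(X)
print("segment sizes:", np.bincount(segments))
```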
Posted 4 days ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Data Scientist

We are the global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

The Mastercard Launch program is aimed at early career talent, to help you develop skills and gain cross-functional work experience. Over a period of 18 months, Launch participants will be assigned to a business unit, learn and develop skills, and gain valuable on-the-job experience.

Mastercard has over 2 billion payment cards issued by 25,000+ banks across 190+ countries and territories, amassing over 10 petabytes of data. Millions of transactions flow to Mastercard in real time, providing an ideal environment to apply and leverage AI at scale. The AI team is responsible for building and deploying innovative AI solutions for all divisions within Mastercard, securing a competitive advantage. Our objectives include achieving operational efficiency, improving customer experience, and ensuring robust value propositions for our core products (Credit, Debit, Prepaid) and services (recommendation engine, anti-money laundering, fraud risk management, cybersecurity).

Role
Gather relevant information to define the business problem
Creative thinker capable of linking AI methodologies to identified business challenges
Develop AI/ML applications leveraging the latest industry and academic advancements
Ability to work cross-functionally and across borders, drawing on a broader team of colleagues to effectively execute the AI agenda

All About You:
Demonstrated passion for AI, competing in sponsored challenges such as Kaggle
Previous experience with or exposure to:
Deep learning algorithm techniques, open source tools and technologies, statistical tools, and programming environments such as Python, R, and SQL
Big data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning
Classical machine learning algorithms like Logistic Regression, Decision Trees, Clustering (K-means, Hierarchical and Self-Organizing Maps), t-SNE, PCA, Bayesian models, Time Series (ARIMA/ARMA), Recommender Systems (Collaborative Filtering, FPMC, FISM, Fossil)
Machine learning and deep learning techniques like Random Forest, GBM, KNN, SVM, Bayesian methods, text mining techniques, Multilayer Perceptron, and neural networks (feedforward, CNN, LSTMs, GRUs) is a plus.
Optimization techniques: activity regularization (L1 and L2), Adam, Adagrad, Adadelta concepts; cost functions in neural nets: Contrastive Loss, Hinge Loss, Binary Cross-entropy, Categorical Cross-entropy; developed applications in KRR, NLP, speech and image processing
Deep learning frameworks for production systems like TensorFlow, Keras (for RPD and neural net architecture evaluation) and PyTorch, as well as XGBoost, Caffe, and Theano, is a plus
Concentration in Computer Science

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-252120
Posted 4 days ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: GCP Data Architect
Location: Madurai, Chennai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or enterprise data platforms
Minimum 3-5 years of hands-on experience with GCP data services
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, star/snowflake schema)
Experience with real-time data processing, streaming architectures, and batch ETL pipelines
Good understanding of IAM, networking, security models, and cost optimization on GCP
Prior experience leading cloud data transformation projects
Excellent communication and stakeholder management skills

Preferred Qualifications:
GCP Professional Data Engineer / Architect Certification
Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
Exposure to AI/ML use cases and MLOps on GCP
Experience working in agile environments and client-facing roles

What We Offer:
Opportunity to work on large-scale data modernization projects with global clients
A fast-growing company with a strong tech and people culture
Competitive salary, benefits, and flexibility
Collaborative environment that values innovation and leadership
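For illustration only, a minimal sketch of the Pub/Sub-to-BigQuery ingestion pattern the responsibilities above describe for Dataflow; the topic and table names are hypothetical, the target table is assumed to already exist, and the code assumes the apache-beam[gcp] package.

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# On Dataflow these options would also carry project, region and runner flags.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")      # hypothetical topic
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.events",            # hypothetical, pre-created table
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
    )
```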
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Experience & Technical Skills:
Good problem-solving abilities, the ability to work independently and proactively, and good communication skills.
Eager and enthusiastic to learn new technologies and frameworks.
Proficiency and hands-on experience with any of the programming solutions including SAS, SQL, R, Python, PySpark, HQL, etc.
Good knowledge of relational databases like Oracle, SQL Server, MySQL, DB2 and HaaS (Hadoop), with experience in SQL/HQL scripting.
Experience with web-based application development; familiarity with HTML, CSS, JavaScript frameworks like AngularJS, and web frameworks such as Django or Flask.
Exposure to Unix/Linux systems.
Familiarity with basic software development productivity tools like Git, Jira, Confluence, etc. (Entry)

Experience:
Up to 3-7 years in the area of analytics and modeling using statistical tools and packages. Any experience in Banking and Financial Crime and Compliance would be an added advantage. (Entry)

Business: Job Summary
Analyse the comprehensive impact of financial crime related regulatory matters on the relevant business area and its operations. Ensure that key changes (to laws, rules, regulations) are communicated and cascaded (in region/country), in coordination with group communications.

Processes:
Perform threshold tuning/retuning for detection scenarios and risk indicators across products such as CASA, Trade, Credit Cards, Financing, Securities Services and Financial Markets – Third Party Payments (a brief illustrative tuning sketch follows this listing)
Perform segmentation of customers as per the Global Segmentation Model in force at SCB
Perform reconciliation of detection scenarios and risk indicators across products such as CASA, Trade, Credit Cards, Financing, Securities Services and Financial Markets – Third Party Payments
Perform ad hoc/bespoke analysis (impact assessments) based on requests from Country FCC stakeholders within the group/regions/country and FCC/FCSO teams
Regularly engage with business stakeholders to understand their requirements and address their concerns
For technical roles, write production-quality code and adhere to coding best practices such as following PEP-8 standards and writing unit tests

Key Responsibilities
People & Talent:
Promote and embed a culture of openness, trust and risk awareness, where ethical, legal, regulatory and policy-compliant conduct is the norm.
Stimulate an environment where forward planning, prioritisation, deadline management, streamlined workflows and collaborative, inclusive yet effective and efficient work practices are the norm.

Risk Management:
Understand technical aspects of systems relevant to CDD, Client Risk Assessments, AML Monitoring and Case Management
Apply risk and data analytic tools/techniques to optimise and tune relevant detection scenarios, and screening and monitoring systems
Review and assess existing systems and controls relevant to FCC to ascertain operational performance and effectiveness
Support the alignment of relevant systems and controls to industry best practice and close out any compliance gaps
Apply Group and FCC policies and processes (AML Monitoring) to manage risks
Ensure that detection scenarios that are developed and deployed are fit-for-purpose Governance Attend relevant team and leadership meetings Ensure tracking and remediation of surveillance and investigations related regulatory findings Prepare and cascade lessons learned from audit findings, FCC assurance activities and specific investigations Skills And Experience Domain Skills: Must be a quick learner, willing to learn any technology that might be required to complete the task Exposure and experience to monitoring systems such as DETICA and MANTAS Knowledge/Experience with Big data tools like Hadoop, Spark, etc. Knowledge/Experience in Machine learning algorithms/systems. Certification from the ACAMS - Association of Certified Anti-Money Laundering Specialists or equivalent Entry Analytics / Statistics / Quantitative Skills: Strong analytical and problem-solving expertise Good communication and documentation skills Excellent collaborative and team building skills and a desire to work as a part of a high functioning team of financial intelligence specialists. Experience in statistical modelling, and analysis using techniques such as regression analysis, multivariate analysis, factor analysis, and clustering Entry. Qualifications Post Graduate degree in Management/Statistics/Mathematics/ OR Graduate degree in Engineering from a reputed institution Active ACAMS / ICA / CFE preferred English About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We: Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. 
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential. Recruitment Assessments Some of our roles use assessments to help us understand how suitable you are for the role you've applied to. If you are invited to take an assessment, this is great news. It means your application has progressed to an important stage of our recruitment process. Visit our careers website www.sc.com/careers
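For illustration only, a minimal sketch of the above-the-line style of threshold tuning the Processes section of this listing refers to; the columns, values and threshold grid are entirely hypothetical.

```python
import pandas as pd

# Hypothetical scored transactions with an investigator disposition
# (1 = alert confirmed suspicious/productive, 0 = not productive).
txns = pd.DataFrame({
    "txn_amount": [1200, 5400, 300, 9800, 15000, 720, 8800, 2500],
    "productive": [0, 1, 0, 1, 1, 0, 0, 0],
})

results = []
for threshold in [1000, 2500, 5000, 10000]:
    alerted = txns[txns["txn_amount"] >= threshold]
    results.append({
        "threshold": threshold,
        "alert_volume": len(alerted),
        "productive_alerts": int(alerted["productive"].sum()),
        "hit_rate": alerted["productive"].mean() if len(alerted) else 0.0,
    })

print(pd.DataFrame(results))
```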
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuously improving and optimising the managed services processes, tools and services.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Respond effectively to the diverse perspectives, needs, and feelings of others.
Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
Use critical thinking to break down complex concepts.
Understand the broader objectives of your project or role and how your work fits into the overall strategy.
Develop a deeper understanding of the business context and how it is changing.
Use reflection to develop self-awareness, enhance strengths and address development areas.
Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Role: Senior Associate – Data Engineer
Tower: Data Analytics & Insights Managed Service
Experience: 6 - 10 years
Key Skills: Data Engineering
Educational Qualification: Bachelor's degree in computer science/IT or relevant field
Work Location: Bangalore AC

Job Description
As a Managed Services - Data Engineer Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution by using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include but are not limited to:
Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
Proven track record as an SME in the chosen domain.
Mentor junior resources within the team, conduct KSS and lessons learnt.
Flexible to work in stretch opportunities/assignments.
Demonstrate critical thinking and the ability to bring order to unstructured problems.
Ticket quality and deliverables review.
Status reporting for the project.
Adherence to SLAs, experience in incident management, change management and problem management.
Review your work and that of others for quality, accuracy, and relevance.
Know how and when to use tools available for a given situation and can explain the reasons for this choice.
Seek and embrace opportunities which give exposure to different situations, environments, and perspectives.
Use straightforward communication, in a structured way, when influencing and connecting with others.
Able to read situations and modify behavior to build quality relationships.
Uphold the firm's code of ethics and business conduct.
Demonstrate leadership capabilities by working with clients directly and leading the engagement.
Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
Good team player.
Take up cross-competency work and contribute to COE activities.
Escalation/risk management.

Position Requirements
Required Skills:
Primary skills: ETL/ELT, SQL, SSIS, SSMS, Informatica, Python
Secondary skills: Azure/AWS/GCP (preferably any one), Power BI, advanced Excel, Excel macros

Data Ingestion Senior Associate
Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc.
Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
Should have experience in building efficient ETL/ELT processes using industry-leading tools like Informatica, SSIS, SSMS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc.
Should have hands-on experience with data analytics tools like Informatica, Hadoop, Spark, etc.
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage.
Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data.
Should have experience in creating visually impactful dashboards in Tableau for data reporting.
Extract, interpret and analyze data to identify key metrics and transform raw data into meaningful, actionable information.
Good understanding of formulas/DAX, measures, establishing hierarchies, data refresh, row/column/report-level security, report governance, complex visualizations, level of detail (LOD) expressions, etc.
Ability to create and replicate functionalities like parameters (for top fields, sheet switching), interactive buttons to switch between dashboards, burger menus, etc.
Participate in requirement gathering with the business and evaluate the data as per the requirement.
Coordinate and manage data analytics and reporting activities with stakeholders.
Expertise in writing and analyzing complex SQL queries.
Excellent problem solving, design, debugging, and testing skills; competency in Excel (macros, pivot tables, etc.).
Good to have: minimum 5 years' hands-on experience delivering Managed Data and Analytics programs (managed services and managed assets).
Should have strong communication, problem-solving, quantitative and analytical abilities.
Effectively communicate with project team members and sponsors throughout the project lifecycle (status updates, gaps/risks, roadblocks, testing outcomes).

Nice To Have
Certification in any cloud platform
Experience in data ingestion technology using any of the industry tools like Informatica, Talend, DataStage, etc.

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions.
We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better.

Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.

Within our global Managed Services platform, we provide the Data, Analytics & Insights Managed Service, where we focus on the evolution of our clients' data, analytics, insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and are capable of working on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
Posted 4 days ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Lowe’s
Lowe’s Companies, Inc. (NYSE: LOW) is a FORTUNE® 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.
About The Team
The Pro & Services Reporting team is responsible for performing the quantitative analysis and dashboard building needed to help guide key business decisions. This includes applying knowledge of Lowe's data concepts to the creation of relevant analytic designs and making sound, data-driven business recommendations.
Job Summary
The Manager, Pro & Services Reporting is responsible for partnering with Pro & Services business stakeholders to define and execute Reporting and Analytics tools and resources. He/she will do this by acting as a subject matter expert and thought leader on their respective team, partnering with DACI and other Reporting & Analytics teams in the organization, and taking on the responsibility of managing and developing a team of individual contributors. To be successful in this role, the Manager, Pro & Services Reporting must have strong knowledge of analytical and reporting principles in addition to experience supporting a business unit in a cross-functional organization.
Core Responsibilities
Manages a team of Analysts who will provide reporting and insights to Pro & Services business units, including determining capacity and assigning work based on priorities. Coaches and develops analysts on best practices and technical expertise and provides strategic direction. Assists Analysts on technical work, reviewing for accuracy and providing technical help to analysts as needed. Provides subject matter expertise to business partners for matters concerning data availability, reportability, and accessibility. Assists the team in gathering business requirements and translates them into reporting solutions, analytic tools, and dashboards to deliver actionable data to end users. Collaborates cross-functionally with other teams, including serving as a liaison between Pro & Services business teams and DACI, ensuring all requirements are documented, communicated, and prioritized. Communicates data-driven insights to senior leaders by preparing analyses using multiple data sources, translating findings into clear, understandable themes, and identifying complete, consistent, and actionable insights and recommendations. Develops, configures, and modifies database components within various environments by using tools such as SQL and/or Power BI to access, manipulate, and present data.
Years Of Experience
Overall 8 to 12 years of experience, with 2+ years of experience leading people directly or indirectly.
Education Qualification & Certifications (optional)
Required Minimum Qualifications: Bachelor’s degree in Business Administration, Finance, Mathematics, or related fields and 6 years of related experience OR Master’s degree in Business Administration, Finance, Mathematics, or related fields and 4 years of related experience.
Primary Skills (Must Have)
8-10 years of overall experience with 2+ years of experience leading people directly or indirectly.
6+ years of applied reporting and analytics experience supporting a business unit in retail, technology, or other customer driven organization. 4+ years of experience using analytic tools (e.g., SQL, Alteryx, Knime, SAS). 4+ years of experience using data visualization tools (e.g., Power BI, Microstrategy, Tableau). 4+ years of experience working with Enterprise level databases (e.g., Hadoop, Teradata, GCP, Oracle, DB2). Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law. Starting rate of pay may vary based on factors including, but not limited to, position offered, location, education, training, and/or experience. For information regarding our benefit programs and eligibility, please visit https://talent.lowes.com/us/en/benefits.
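To make the SQL-and-dashboard side of the Lowe's reporting role concrete, here is a small illustrative sketch: an aggregation query executed from Python and returned as a dataframe that a Power BI or Tableau dashboard could consume. The table, columns, and SQLite connection are invented stand-ins for an enterprise warehouse such as Teradata or GCP.

```python
# Illustrative reporting query: weekly sales by pro-customer segment.
# Table and column names are hypothetical; any DB-API connection would work.
import pandas as pd
import sqlite3  # stand-in for an enterprise warehouse connector

QUERY = """
SELECT segment,
       strftime('%Y-%W', sale_date) AS sales_week,
       SUM(sale_amount)             AS total_sales,
       COUNT(DISTINCT customer_id)  AS active_customers
FROM pro_sales
GROUP BY segment, sales_week
ORDER BY sales_week, segment;
"""

with sqlite3.connect("pro_services.db") as conn:
    weekly = pd.read_sql_query(QUERY, conn)

print(weekly.head())
```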
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Associate Director, Software Engineering.
In this role, you will:
Be a Lead Automation Engineer with deep hands-on experience of software automation testing and performance testing tools, practices and processes.
Have a deep understanding of desktop, web and data warehouse applications, API development, design patterns, SDLC, IaC tools, testing and site reliability engineering, and related ways to design and develop automation frameworks.
Define and implement best practices for software automation testing, performance testing, frameworks and patterns, including testing methodologies.
Be a generalist with the breadth and depth of experience in CI/CD best practices and core experience in testing (i.e., TDD/BDD/automated testing/contract testing/API testing/desktop/web apps, DW test automation).
Be able to see a problem or an opportunity with the ability to engineer a solution, be respected for what you deliver and not just what you say, think about the business impact of your work, and take a holistic view of problem-solving.
Have proven industry experience of running an engineering team with a focus on optimization of processes, introduction of new technologies, solving challenges, building strategy, business planning, governance and stakeholder management.
Apply thinking to many problems across multiple technical domains and suggest ways to solve them.
Contribute to architectural discussions by asking the right questions to ensure a solution matches the business needs.
Identify opportunities for system optimization, performance tuning, and scalability enhancements; implement solutions to improve system efficiency and reliability.
Have excellent verbal and written communication skills to articulate technical concepts to both technical and non-technical stakeholders.
Build performance assurance procedures with the latest feasible tools and techniques, and establish performance test automation processes to improve testing productivity.
Be responsible for the end-to-end software testing, performance testing and engineering life cycle - technical scoping, performance scripting, testing, and tuning.
Analyse test assessment results and provide recommendations to improve performance or save infrastructure costs.
Represent the team at Scrum meetings and all other key project meetings and provide a single point of accountability and escalation for performance testing within the scrum teams.
Advise on needed infrastructure and performance engineering and testing guidelines, and be responsible for performance risk assessment of various application features.
Work with cross-functional teams, with the opportunity to work with software product, development, and support teams; be capable of handling tasks to accelerate testing delivery and improve the quality of applications at HSBC.
Provide support in product/application design from a performance point of view.
Communicate plans, status, and results as appropriate to the target audience.
Be willing to adapt, learn innovative technologies/trades and be flexible to work on projects as demanded by the business.
Define and implement best practices for software automation testing, including testing standards, test reviews, coverage, testing methodologies, and traceability between requirements and test cases.
Prepare, develop and maintain test automation frameworks that can be used for software testing and performance testing; write automation test scripts and conduct reviews.
Develop and execute regression, smoke and integration tests in a timely manner.
Requirements
To be successful in this role, you must meet the following requirements:
Experience in software testing approaches and automation testing using Tosca, Selenium and the Cucumber BDD framework.
Experienced in writing test plans and test strategies, and in test data management, including test artifact management for both automation and manual testing.
Experience setting up CI/CD pipelines and working with GitHub and Jenkins, along with integration with Cucumber and Jira.
Experience in agile methodology and proven experience working on agile projects.
Experience in analysis of bug tracking, prioritizing and bug reporting with bug tracking tools.
Experience in SQL, Unix, Control-M, ETL, data testing, API testing, and API automation using Rest Assured.
Familiar with the following performance testing tools: Micro Focus LoadRunner Enterprise (VuGen, Analysis, LRE OneLG); protocols: HTTP/HTML, Citrix; JMeter, Postman, Insomnia.
Familiar with the following observability tools: AppDynamics, New Relic, Splunk, Geneos, Datadog, Grafana.
Knowledge of the following will be an added advantage: GitHub, Jenkins, Kubernetes, Jira & Confluence.
Programming and scripting language skills in Java, Shell, Scala, Groovy, Python; WebLogic server administration.
Familiar with the BMC Control-M tool.
CI/CD tools – Ansible, AWS RO, G3; UNIX/Linux/web monitors and performance analysis tools to diagnose and resolve performance issues.
Experience of working in an Agile environment, "DevOps" team or a similar multi-skilled team in a technically demanding function.
Experience of working on performance testing and tuning of micro-services/APIs, desktop applications, web apps, cloud services, ETL apps, and database queries.
Experience of writing/modifying performance testing scripts, and implementation and usage of automated tools for result analysis.
Experience of working on performance testing and tuning of data warehouse applications doing batch processing on various stages of ETL and information delivery components.
Good to have skills: Knowledge of the latest technologies and tools, such as Python scripting, Tricentis Tosca, Dataflow, Hive, DevOps, REST API, Hadoop, the Kafka framework, GCP and AWS, will be an added advantage.
You’ll achieve more when you join HSBC. www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
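The HSBC posting above centres on test automation and latency budgets; as a language-neutral illustration (the posting's own stack includes Tosca, Selenium, Rest Assured, and LoadRunner), here is a minimal API-check sketch in Python using pytest and requests. The endpoint URL, expected fields, and the 500 ms threshold are assumptions for illustration only, not values from the posting.

```python
# Minimal API test sketch with pytest + requests, combining a functional
# assertion with a crude response-time check. The endpoint URL and the
# 500 ms threshold are illustrative assumptions.
import requests

BASE_URL = "https://api.example.internal"   # hypothetical service under test

def test_accounts_endpoint_returns_expected_schema():
    resp = requests.get(f"{BASE_URL}/v1/accounts/12345", timeout=5)
    assert resp.status_code == 200
    body = resp.json()
    # dict view supports set comparison: all expected keys must be present.
    assert {"accountId", "status", "balance"} <= body.keys()

def test_accounts_endpoint_meets_latency_budget():
    resp = requests.get(f"{BASE_URL}/v1/accounts/12345", timeout=5)
    # elapsed is measured by requests itself; real load tests would use
    # JMeter or LoadRunner rather than a single-request assertion.
    assert resp.elapsed.total_seconds() < 0.5
```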
Posted 4 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Key Responsibilities
Engage with the product and engineering team to design the best operational structure and processes.
Identify and drive opportunities to make systems resilient and help maintain business continuity.
Proactively perform troubleshooting and RCA, and implement permanent resolution of issues across the stack – hardware, software, database, network and so on.
Proactive performance engineering activities on infrastructure.
Proactive documentation of architecture diagrams and processes.
Implementation of proactive monitoring, alerting, trend analysis and self-healing systems.
Develop continuous delivery for multiple platforms in production and staging environments.
Regular benchmarking and capacity planning of the infrastructure.
Test and deploy new technologies/tools as per the project's needs.
Infrastructure and platform security.
Effectively use and maintain infrastructure and config management tools like Puppet, Chef, Ansible and Terraform to deploy and manage infrastructure.
Provide technical mentoring and coaching to team members.
Adaptable to working in a fast-paced environment and altering priorities as per business needs.
Required Skills
Experience with Unix/Linux operating system internals and administration (e.g. filesystems, inodes, system calls, etc.).
Good understanding of the network stack (e.g. TCP/IP, routing, network topologies and hardware, SDN, etc.).
Knowledge of performance engineering tools.
Hands-on experience with any of the public clouds like AWS, GCP, Azure.
Proactive in learning and testing new tools to improve infrastructure.
Understanding of scripting languages like Python and Bash and the ability to learn new languages when needed.
Strong understanding of project and infrastructure operational needs and infrastructure architecture.
You have expertise in some of the below tools/skills:
Container orchestration technologies like Kubernetes and Mesos.
Infrastructure as code (we use Puppet, Ansible and Terraform) and containerization tool sets (we use Docker).
Data-intensive applications and platforms like Kafka, Hadoop, Spark, Zookeeper, Cassandra, PostgreSQL OLAP, Druid.
Relational databases like MySQL, Oracle, PostgreSQL, etc.
NoSQL databases like Redis, MongoDB, Cassandra, CouchDB, etc.
One or more CI tools like Jenkins, TeamCity.
Strong knowledge of centralized logging systems, metrics, and tooling frameworks such as ELK, Prometheus, and Grafana.
Web and application servers like Apache, Nginx, Tomcat.
Versioning tools such as Git.
Ability to work independently and own problem statements end-to-end.
Great communication, interpersonal and teamwork skills.
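As a toy illustration of the "proactive monitoring, alerting ... and self-healing systems" responsibility above, the sketch below probes an HTTP health endpoint and restarts a systemd unit after repeated failures. The endpoint, service name, and thresholds are assumptions; a real setup would lean on Prometheus alerts, Kubernetes liveness probes, or similar tooling rather than a standalone script.

```python
# Toy self-healing probe: check an HTTP health endpoint and restart the
# service via systemd if it fails repeatedly. Service name, URL and retry
# counts are illustrative assumptions.
import subprocess
import time
import requests

HEALTH_URL = "http://localhost:8080/healthz"   # hypothetical endpoint
SERVICE = "payments-api.service"               # hypothetical systemd unit

def healthy() -> bool:
    try:
        return requests.get(HEALTH_URL, timeout=2).status_code == 200
    except requests.RequestException:
        return False

def probe_loop(max_failures: int = 3, interval: int = 30) -> None:
    failures = 0
    while True:
        failures = 0 if healthy() else failures + 1
        if failures >= max_failures:
            # Restart the unit and reset the counter; a real system would
            # also emit an alert here (PagerDuty, Slack, etc.).
            subprocess.run(["systemctl", "restart", SERVICE], check=False)
            failures = 0
        time.sleep(interval)

if __name__ == "__main__":
    probe_loop()
```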
Posted 4 days ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Please mention subject line “Machine Learning Engineer - ESLNK59” while applying to hr@evoortsolutions.com Job Title: Machine Learning Engineer Location: Remote | Full-Time Experience: 4+ years Job Summary We are seeking a highly skilled and self-motivated Machine Learning Engineer / Senior Machine Learning Engineer to join our fast-growing AI/ML startup. You will be responsible for designing and deploying intelligent systems and advanced algorithms tailored to real-world business problems across diverse industries. This role demands a creative thinker with a strong mathematical foundation, hands-on experience in machine learning and deep learning, and the ability to work independently in a dynamic, agile environment. Key Responsibilities Design and develop machine learning and deep learning algorithms in collaboration with cross-functional teams, including data scientists and business stakeholders. Translate complex client problems into mathematical models and identify the most suitable AI/ML approach. Build data pipelines and automated classification systems using advanced ML/AI models. Conduct data mining and apply supervised/unsupervised learning to extract meaningful insights. Perform Exploratory Data Analysis (EDA), hypothesis generation, and pattern recognition from structured and unstructured datasets. Develop and implement Natural Language Processing (NLP) techniques for sentiment analysis, text classification, entity recognition, etc. Extend and customize ML libraries/frameworks like PyTorch, TensorFlow, and Scikit-learn. Visualize and communicate analytical findings using tools such as Tableau, Matplotlib, ggplot, etc. Develop and integrate APIs to deploy ML solutions on cloud-based platforms (AWS, Azure, GCP). Provide technical documentation and support for product development, business proposals, and client presentations. Stay updated with the latest trends in AI/ML and contribute to innovation-driven projects. Required Skills & Qualifications Education: B.Tech/BE or M.Tech/MS in Computer Science, Computer Engineering, or related field. Solid understanding of data structures, algorithms, probability, and statistical methods. Proficiency in Python, R, or Java for building ML models. Hands-on experience with ML/DL frameworks such as PyTorch, Keras, TensorFlow, and libraries like Scikit-learn, SpaCy, NLTK, etc. Experience with cloud services (PaaS/SaaS), RESTful APIs, and microservices architecture. Strong grasp of NLP, predictive analytics, and deep learning algorithms. Familiarity with big data technologies like Hadoop, Spark, Hive, Kafka, and NoSQL databases is a plus. Expertise in building and deploying scalable AI/ML models in production environments. Ability to work independently in an agile team setup and handle multiple priorities simultaneously. Exceptional analytical, problem-solving, and communication skills. Strong portfolio or examples of applied ML use cases in real-world applications. Why Join Us? Opportunity to work at the forefront of AI innovation and solve real-world challenges. Be part of a lean, fast-paced, and high-impact team driving AI solutions across industries. Flexible remote working culture with autonomy and ownership. Competitive compensation, growth opportunities, and access to cutting-edge technology. Embrace our culture of Learning, Engaging, Achieving, and Pioneering (LEAP) in every project you touch.
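To give a flavour of the NLP/text-classification work mentioned in the Machine Learning Engineer posting above, here is a minimal scikit-learn sketch: TF-IDF features feeding a logistic regression classifier. The tiny in-memory dataset and its labels are purely illustrative.

```python
# Minimal NLP classification sketch with scikit-learn: TF-IDF features plus
# logistic regression for sentiment-style labels. The in-memory dataset is
# purely illustrative; real work would use a proper corpus and evaluation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "great product, works as expected",
    "terrible support, would not recommend",
    "fast delivery and easy setup",
    "broke after two days, very disappointed",
]
labels = ["positive", "negative", "positive", "negative"]

# Pipeline keeps vectorization and the model together for reuse at inference.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["setup was easy and support was great"]))
```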
Posted 4 days ago
1.0 - 3.0 years
3 - 5 Lacs
Bengaluru
Work from Office
As a Backend Engineer (Founding Engineer) at Zintlr, you will have the unique opportunity to build critical components of our cutting-edge AI product from the ground up. You'll play a key role in shaping the infrastructure to handle millions of data requests per hour, working closely with our Lead Architect and Product Owner to bring innovative features to life. This is not just another backend role; it's a chance to be at the forefront of creating a scalable, impactful product in the fast-growing 'people intelligence' space. Your contributions will directly influence the product's success and the experience of thousands of users globally.
Requirements
Strong expertise in Django and Python. Solid knowledge of SQL and NoSQL databases (e.g., MySQL, MongoDB). Practical experience deploying solutions on GCP or AWS. Strong computer science fundamentals and problem-solving skills. Hands-on experience in building backend applications or products. Passion for programming, with a proactive and organized approach to work.
Responsibilities
Build critical components of a scalable, high-performance AI system. Develop and maintain infrastructure to handle millions of data requests per hour. Write clean, efficient, and reliable code with minimal oversight. Collaborate with the Lead Architect and Product Owner to align engineering capabilities with product evolution.
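As a small illustration of the Django backend work the Zintlr posting describes, the sketch below defines a model and a JSON lookup endpoint. The model fields, URL path, and app layout are hypothetical assumptions; it presumes the code lives inside an installed Django app with migrations applied.

```python
# Minimal Django sketch: a model plus a JSON endpoint of the sort a
# high-throughput backend might expose. Model fields and the URL path are
# illustrative assumptions; assumes this module sits inside an installed app.
from django.db import models
from django.http import JsonResponse
from django.urls import path

class Company(models.Model):
    name = models.CharField(max_length=255)
    domain = models.CharField(max_length=255, unique=True)
    employee_count = models.IntegerField(default=0)

def company_lookup(request, domain):
    # values(...).first() returns a plain dict or None, which serializes cleanly.
    company = Company.objects.filter(domain=domain).values(
        "name", "domain", "employee_count").first()
    if company is None:
        return JsonResponse({"error": "not found"}, status=404)
    return JsonResponse(company)

urlpatterns = [
    path("api/companies/<str:domain>/", company_lookup),
]
```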
Posted 4 days ago
6.0 - 11.0 years
25 - 30 Lacs
Mumbai, Mumbai Suburban, Mumbai (All Areas)
Work from Office
Experience in using SQL, PL/SQL or T-SQL with RDBMSs like Teradata, MS SQL Server, or Oracle in production environments. Experience with Python, ADF, Azure, Databricks. Experience working on Microsoft Azure/AWS or other leading cloud platforms.
Required Candidate Profile
Hands-on experience with Hadoop, Spark, Hive, or similar frameworks. Data integration & ETL, data modelling, database management, data warehousing, big data frameworks, CI/CD.
Perks and Benefits
To be disclosed post interview.
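For context on the PySpark/cloud data-engineering skills listed above, here is a minimal, illustrative ETL job: read raw CSV from object storage, clean and type the data, and write partitioned Parquet. The bucket paths and column names are assumptions, not details from the posting.

```python
# Minimal PySpark ETL sketch: read raw CSV from a data lake, apply a couple of
# transformations, and write partitioned Parquet. Paths and column names are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_etl").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("s3://raw-zone/sales/2024-05-01/"))

clean = (raw
         .dropDuplicates(["order_id"])
         .withColumn("sale_amount", F.col("sale_amount").cast("double"))
         .withColumn("sale_date", F.to_date("order_ts"))
         .filter(F.col("sale_amount") > 0))

(clean.write
 .mode("overwrite")
 .partitionBy("sale_date")
 .parquet("s3://curated-zone/sales/"))
```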
Posted 4 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Data Engineer (Azure Data Factory and Azure Databricks)
Job Description
We seek a Data Engineer with at least three years of experience in Azure Databricks, Azure Data Lake Storage Gen2 (ADLS), and Azure Data Factory (ADF). The candidate will design, build, and maintain data pipelines using Azure ADF/Databricks. The candidate should be able to handle data modelling and governance, and will also work closely with our data science and engineering teams to develop and deploy machine learning models and other advanced analytics solutions.
Primary skills: Azure Data Factory, Azure Databricks, SQL, ADLS, Spark SQL, Python (Pandas), PySpark or Scala.
Secondary skills: Basics of Azure security (RBAC, Azure AD), Hadoop, HDFS, ADLS, Azure DBFS; Power BI and Tableau visualization tools are a plus.
Responsibilities
Design, build, and maintain data pipelines using Azure ADF/Databricks.
Strong expertise in Azure Data Factory, Databricks, and related technologies such as Azure Synapse Analytics and Azure Functions.
Hands-on experience in designing and implementing data pipelines and workflows in Azure Data Factory and Databricks.
Experience in extraction, transformation, and loading of data from multiple data sources into target databases, using Azure Databricks, Spark SQL, PySpark, and Azure SQL.
Designing and implementing data ingestion pipelines from multiple sources using Azure Databricks.
Sound working experience in cleansing, transformation, business logic, incremental transformation of data, and merging the data with datamart tables (a minimal illustration of such an incremental merge follows below).
Developing scalable and reusable frameworks for ingesting data sets.
Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times.
Interacting with stakeholders and leaders to understand business goals and data requirements.
Experience in working with Agile (Scrum, Sprint) and Waterfall methodologies.
Collaborate with data engineers, data architects, and business analysts to understand and translate business requirements into technical designs.
Provide technical guidance and support to junior team members.
Design, develop, and maintain SQL databases, including creating database schemas, tables, views, stored procedures, and triggers.
Self-starter and team player with excellent communication, problem-solving and interpersonal skills, and a good aptitude for learning.
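The incremental-merge illustration referenced above might look roughly like this on Azure Databricks, using Spark SQL's MERGE against a Delta table. The storage account, paths, and table names are placeholders, and it assumes the datamart table is stored in Delta format.

```python
# Illustrative incremental-load step on Azure Databricks: land a daily batch
# from ADLS and MERGE it into a Delta datamart table. Storage paths and table
# names are assumptions; assumes the target table is Delta.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # provided automatically in Databricks

updates = spark.read.format("parquet").load(
    "abfss://raw@mystorageacct.dfs.core.windows.net/orders/2024-05-01/")
updates.createOrReplaceTempView("orders_updates")

# Upsert: update existing orders, insert new ones.
spark.sql("""
    MERGE INTO datamart.orders AS target
    USING orders_updates AS source
    ON target.order_id = source.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```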
Posted 4 days ago
4.0 years
0 Lacs
Hyderābād
Remote
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary:
Design and Develop Data Pipelines: Build and optimize scalable ETL processes using PySpark, Scala, or Spark SQL to handle large volumes of structured and unstructured data from diverse sources.
Cloud-Based Data Solutions: Implement data ingestion, processing, and storage solutions on the Azure cloud platform, utilizing services such as Azure Databricks, Azure Data Lake Storage, and Azure Synapse Analytics.
Data Modeling and Management: Develop and maintain data models, schemas, and metadata to ensure efficient data access, high query performance, and support for analytics requirements.
Pipeline Monitoring and Optimization: Monitor the performance of data pipelines, troubleshoot issues, and enhance workflows for scalability, reliability, and cost-efficiency.
Security and Compliance: Enforce data security protocols and compliance measures to safeguard sensitive information and meet regulatory standards.
Responsibilities:
Experience: Proven track record as a Data Engineer with expertise in building and optimizing data pipelines using PySpark, SQL, and Apache Spark.
Cloud Proficiency: Hands-on experience with Azure cloud services, including Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
Programming Skills: Strong proficiency in Python, PySpark, and SQL with experience in software development practices, version control systems, and CI/CD pipelines.
Data Warehousing Knowledge: Familiarity with data warehousing concepts, dimensional modeling, and relational databases such as SQL Server, PostgreSQL, and MySQL.
Big Data Technologies: Exposure to big data frameworks and tools like Hadoop, Hive, and HBase is a plus.
Mandatory skill sets:
Hands-on experience with Azure cloud services, including Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
Monitor the performance of data pipelines, troubleshoot issues, and enhance workflows for scalability, reliability, and cost-efficiency.
Preferred skill sets: Hands-on experience with Azure cloud services, including Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
Years of experience required: 4-7 years.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure Cloud Services, Microsoft Azure Databricks
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
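Since the role pairs Databricks development with Azure Data Factory orchestration, a complementary sketch follows: triggering an ADF pipeline run and polling its status from Python via the Azure management SDK. The subscription ID, resource group, factory, pipeline name, and parameter are placeholders, and it assumes azure-identity and azure-mgmt-datafactory are installed with suitable Data Factory permissions.

```python
# Minimal sketch: trigger an Azure Data Factory pipeline run and poll its
# status from Python. Subscription, resource group, factory and pipeline names
# are placeholders; assumes the caller has appropriate Azure RBAC rights.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP, FACTORY, PIPELINE = "rg-analytics", "adf-ingest", "pl_daily_load"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION)
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY, PIPELINE,
                                  parameters={"load_date": "2024-05-01"})

# Poll until the run leaves the queued/in-progress states.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {status}")
```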
Posted 4 days ago
5.0 years
4 - 6 Lacs
Hyderābād
On-site
DESCRIPTION Value Added Services Tech (VASTech) improves customer experience and reduces cost-to-serve by transforming selection discovery and addressing un-optimizations during fulfillment. We support three independent LOBs of Home Services, Regionalization and Retail Supply Chain Procurement workflows. We seek talented Engineers with expertise in large-scale systems to join our mission. Ideal candidates are passionate problem-solvers who can drive innovative solutions from design to deployment across teams. You will play a key role in developing global fulfillment models, requiring creativity and excellence. We need adaptable professionals who thrive in dynamic environments, focusing on enhancing customer experience. Join us to shape the future of e-commerce logistics and make a significant impact in this transformative field. BASIC QUALIFICATIONS 5+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with SQL Experience managing a data or BI team Experience leading and influencing the data or BI strategy of your team or organization Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS Experience hiring, developing and promoting engineering talent PREFERRED QUALIFICATIONS Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with AWS Tools and Technologies (Redshift, S3, EC2) Bachelor's degree Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, TS, Hyderabad Software Development
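To illustrate the S3/Redshift side of the toolset named in the posting above, here is a minimal staging-and-COPY sketch. The bucket, table, cluster endpoint, credentials, and IAM role ARN are placeholders; it assumes boto3 and psycopg2 are available and that the cluster's IAM role can read from the bucket.

```python
# Illustrative load step: stage a file to S3 with boto3, then issue a Redshift
# COPY over the Postgres wire protocol. Bucket, table, endpoint and role ARN
# are placeholders only.
import boto3
import psycopg2

boto3.client("s3").upload_file("daily_orders.csv", "analytics-landing",
                               "orders/2024-05-01/daily_orders.csv")

conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439, dbname="dw", user="etl_user", password="REDACTED")

with conn.cursor() as cur:
    cur.execute("""
        COPY staging.orders
        FROM 's3://analytics-landing/orders/2024-05-01/daily_orders.csv'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        CSV IGNOREHEADER 1;
    """)
conn.commit()
conn.close()
```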
Posted 4 days ago
10.0 years
6 - 10 Lacs
Gurgaon
On-site
- Bachelor’s Degree with 10+ years of hands-on Infrastructure / Troubleshooting / Systems Administration / Networking / DevOps / Applications Development experience in a distributed systems environment. - External enterprise customer-facing experience as a technical lead, with strong oral and written communication skills, presenting to both large and small audiences. - Ability to manage multiple tasks and projects in a fast-moving environment. - Be mobile and travel to client locations as needed. AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You’ll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Amazon has built a global reputation for being the most customer-centric company, a company that customers from all over the world recognize, value, and trust for both our products and services. Amazon has a fast-paced environment where we “Work Hard, Have Fun and Make History.” As an increasing number of enterprises move their critical systems to the cloud, AWS India is in need of highly efficient technical consulting talent to help our largest and strategically important customers navigate the operational challenges and complexities of AWS Cloud. We are looking for Technical Consultants to support our customers creative and transformative spirit of innovation across all technologies, including Compute, Storage, Database, Data Analytics, Application services, Networking, Server-less and more. This is not a sales role, but rather an opportunity to be the principal technical advisor for organizations ranging from start-ups to large enterprises. As a Technical Account Manager, you will be the primary technical point of contact for one or more customers helping to plan, debug, and oversee ongoing operations of business-critical applications. You will get your hands dirty, troubleshooting application, network, database, and architectural challenges using a suite of internal AWS Cloud tools as well as your existing knowledge and toolkits. We are seeking individuals with strong backgrounds in I.T. Consulting and in any of these related areas such as Solution Designing, Application and System Development, Database Management, Big Data and Analytics, DevOps Consulting, and Media technologies. Knowledge of programming and scripting is beneficial to the role. Key job responsibilities Every day will bring new and exciting challenges on the job while you: • Learn and use new Cloud technologies. • Interact with leading technologists around the world. • Work on critical, highly complex customer problems that may span multiple AWS Cloud services. • Apply advanced troubleshooting techniques to provide unique solutions to our customers' individual needs. • Work directly with AWS Cloud subject matter experts to help reproduce and resolve customer issues. • Write tutorials, how-to videos, and other technical articles for the customer community. • Leverage your extensive customer support experience and provide feedback to internal AISPL teams on how to improve our services. 
• Drive projects that improve support-related processes and our customers’ technical support experience. • Assist in Design/Architecture of AWS and Hybrid cloud solutions. • Help Enterprises define IT and business processes that work well with cloud deployments. • Be available outside of business hours to help coordinate the handling of urgent issues as needed. A day in the life As a TAM, you'll start your day reviewing operational metrics and service health for your strategic enterprise customers. You might lead a morning technical deep-dive session with a customer's engineering team, helping them optimize their cloud architecture. By midday, you could be collaborating with AWS service teams to resolve a complex migration challenge or providing proactive recommendations for cost optimization. Afternoons often involve strategic planning sessions, where you'll help customers align their technical roadmap with business objectives. You'll also participate in architecture reviews, incident post-mortems, and best-practice workshops. Throughout the day, you'll leverage your technical expertise to provide timely solutions, whether it's improving security posture, enhancing operational excellence, or architecting for scale. While most work happens during business hours, you'll occasionally support critical situations outside regular hours, ensuring your customers' mission-critical workloads run smoothly About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve. Advanced experience in one or more of the following areas: Software Design or Development, Content Distribution/CDN, Scripting/Automation, Database Architecture, Cloud Architecture, Cloud Migrations, IP Networking, IT Security, Big Data/Hadoop/Spark, Operations Management, Service Oriented Architecture etc. Experience in a 24x7 operational services or support environment. Experience with AWS Cloud services and/or other Cloud offerings. 
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 4 days ago
10.0 years
0 Lacs
Gurgaon
On-site
DESCRIPTION AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You’ll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Amazon has built a global reputation for being the most customer-centric company, a company that customers from all over the world recognize, value, and trust for both our products and services. Amazon has a fast-paced environment where we “Work Hard, Have Fun and Make History.” As an increasing number of enterprises move their critical systems to the cloud, AWS India is in need of highly efficient technical consulting talent to help our largest and strategically important customers navigate the operational challenges and complexities of AWS Cloud. We are looking for Technical Consultants to support our customers creative and transformative spirit of innovation across all technologies, including Compute, Storage, Database, Data Analytics, Application services, Networking, Server-less and more. This is not a sales role, but rather an opportunity to be the principal technical advisor for organizations ranging from start-ups to large enterprises. As a Technical Account Manager, you will be the primary technical point of contact for one or more customers helping to plan, debug, and oversee ongoing operations of business-critical applications. You will get your hands dirty, troubleshooting application, network, database, and architectural challenges using a suite of internal AWS Cloud tools as well as your existing knowledge and toolkits. We are seeking individuals with strong backgrounds in I.T. Consulting and in any of these related areas such as Solution Designing, Application and System Development, Database Management, Big Data and Analytics, DevOps Consulting, and Media technologies. Knowledge of programming and scripting is beneficial to the role. Key job responsibilities Every day will bring new and exciting challenges on the job while you: Learn and use new Cloud technologies. Interact with leading technologists around the world. Work on critical, highly complex customer problems that may span multiple AWS Cloud services. Apply advanced troubleshooting techniques to provide unique solutions to our customers' individual needs. Work directly with AWS Cloud subject matter experts to help reproduce and resolve customer issues. Write tutorials, how-to videos, and other technical articles for the customer community. Leverage your extensive customer support experience and provide feedback to internal AISPL teams on how to improve our services. Drive projects that improve support-related processes and our customers’ technical support experience. Assist in Design/Architecture of AWS and Hybrid cloud solutions. Help Enterprises define IT and business processes that work well with cloud deployments. Be available outside of business hours to help coordinate the handling of urgent issues as needed. A day in the life As a TAM, you'll start your day reviewing operational metrics and service health for your strategic enterprise customers. 
You might lead a morning technical deep-dive session with a customer's engineering team, helping them optimize their cloud architecture. By midday, you could be collaborating with AWS service teams to resolve a complex migration challenge or providing proactive recommendations for cost optimization. Afternoons often involve strategic planning sessions, where you'll help customers align their technical roadmap with business objectives. You'll also participate in architecture reviews, incident post-mortems, and best-practice workshops. Throughout the day, you'll leverage your technical expertise to provide timely solutions, whether it's improving security posture, enhancing operational excellence, or architecting for scale. While most work happens during business hours, you'll occasionally support critical situations outside regular hours, ensuring your customers' mission-critical workloads run smoothly About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique. Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve. BASIC QUALIFICATIONS Bachelor’s Degree with 10+ years of hands-on Infrastructure / Troubleshooting / Systems Administration / Networking / DevOps / Applications Development experience in a distributed systems environment. External enterprise customer-facing experience as a technical lead, with strong oral and written communication skills, presenting to both large and small audiences. Ability to manage multiple tasks and projects in a fast-moving environment. Be mobile and travel to client locations as needed. PREFERRED QUALIFICATIONS Advanced experience in one or more of the following areas: Software Design or Development, Content Distribution/CDN, Scripting/Automation, Database Architecture, Cloud Architecture, Cloud Migrations, IP Networking, IT Security, Big Data/Hadoop/Spark, Operations Management, Service Oriented Architecture etc. Experience in a 24x7 operational services or support environment. Experience with AWS Cloud services and/or other Cloud offerings. 
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Job details IND, HR, Gurugram Solutions Architect
Posted 4 days ago
4.0 years
0 Lacs
Gurgaon
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary Analyst - Product Data & Analytics Our Purpose We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team – one that makes better decisions, drives innovation and delivers better business results. Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships, and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Our Team As consumer preference for digital payments continues to grow, ensuring a seamless and secure consumer experience is top of mind. Optimization Solutions team focuses on tracking of digital performance across all products and regions, understanding the factors influencing performance and the broader industry landscape. This includes delivering data-driven insights and business recommendations, engaging directly with key external stakeholders on implementing optimization solutions (new and existing), and partnering across the organization to drive alignment and ensure action is taken. Are you excited about Data Assets and the value they bring to an organization? Are you an evangelist for data-driven decision-making? Are you motivated to be part of a team that builds large-scale Analytical Capabilities supporting end users across 6 continents? Do you want to be the go-to resource for data science & analytics in the company? Role Expertise in creating ETL pipelines, ad hoc reporting and data visualizations. Work closely with global & regional teams to architect, develop, and maintain advanced reporting and data visualization capabilities on large volumes of data to support analytics and reporting needs across products, markets, and services. Act as the direct owner for major sections of the process contributing to a larger role. 
Translates client/stakeholder needs into technical analyses and/or custom solutions in collaboration with internal and external partners; derives insights and presents findings and outcomes to clients/stakeholders to solve critical business questions. Responsible for developing data-driven, innovative, scalable analytical solutions and identifying opportunities to support business and client needs in a quantitative manner and facilitate informed recommendations/decisions. Accountable for delivering high-quality project solutions and tools within agreed-upon timelines and budget parameters and conducting post-implementation reviews. Contributes to the development of custom analyses and solutions, and derives insights from extracted data to solve critical business questions.
All About You
4+ years of experience in data management, data mining, data analytics, data reporting, data product development and quantitative analysis. Experience in automating and creating data pipelines via tools such as Alteryx, SSIS. Expertise in data visualization tools (Tableau, Domo, and/or Power BI/similar tools). Experience with data validation, quality control and cleansing processes for new and existing data sources. Advanced SQL skills, with the ability to write optimized queries for large data sets. Experience on platforms/environments: Cloudera Hadoop, big data technology stack, SQL Server. Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, Mathematics or Statistics; M.S./M.B.A. preferred.
Additional Competencies
Excellent problem-solving, quantitative and analytical skills. In-depth technical knowledge, drive and ability to learn new technologies. Strong attention to detail and quality. Team player with excellent communication skills. Creativity/innovation. Self-motivated; operates with a sense of urgency. Able to prioritize and perform multiple tasks simultaneously.
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
Posted 4 days ago