Home
Jobs

751 Teradata Jobs - Page 29

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Who We Are

Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do

As part of the BCG X A&A team, you will work closely with consulting teams on a diverse range of advanced analytics topics. You will have the opportunity to leverage analytical methodologies to deliver value to BCG's Consulting (case) teams and Practice Areas (domains) by providing analytics subject matter expertise and accelerated execution support. You will collaborate with case teams to gather requirements and to specify, design, develop, deliver and support analytic solutions serving client needs. You will provide technical support through a deep understanding of relevant data analytics solutions and processes, building high-quality, efficient analytic solutions.

You're Good At
- Working with case (and proposal) teams
- Acquiring deep expertise in at least one analytics topic and an understanding of all analytics capabilities
- Defining and explaining the expected analytics outcome; defining the approach
- Delivering original analysis and insights to BCG teams, typically owning all or part of an analytics module and integrating with case teams
- Establishing credibility by thought partnering with case teams on analytics topics; drawing conclusions on a range of external and internal issues related to their module
- Communicating analytical insights through sophisticated synthesis and packaging of results (including PowerPoint presentations, documents, dashboards and charts); collecting, synthesizing and analysing case team learnings and feeding them into new best practices and methodologies
- Building a collateral of documents to enhance core capabilities and serve as internal reference material; sanitizing confidential documents and maintaining a repository
- Leading workstreams and modules independently or with minimal supervision
- Supporting business development activities (proposals, vignettes, etc.) and building sales collateral to generate leads

Team Requirements
- Guides juniors on analytical methodologies and platforms, and helps with quality checks
- Contributes to the team's content and IP development
- Imparts technical training to team members and the consulting cohort

Technical Skills
- Strong proficiency in statistics (concepts and methodologies such as hypothesis testing and sampling) and their application and interpretation
- Hands-on data mining and predictive modeling experience: linear regression, clustering (k-means, DBSCAN, etc.), classification (logistic regression, decision trees/random forest/boosted trees), time series (SARIMAX/Prophet), etc.
- Strong experience with at least one of the prominent cloud providers (Azure, AWS, GCP) and working knowledge of AutoML solutions (SageMaker, Azure ML, etc.)
- At least one tool in each category:
  - Programming languages: Python (must have), R or SAS or PySpark, SQL (must have)
  - Data visualization: Tableau, QlikView, Power BI, Streamlit
  - Data management: Alteryx, MS Access, or any RDBMS
  - ML deployment tools: Airflow, MLflow, Luigi, Docker, etc.
  - Big data technologies: Hadoop ecosystem, Spark
  - Data warehouse solutions: Teradata, Azure SQL DW/Synapse, Redshift, BigQuery, etc.
  - Version control: Git/GitHub/GitLab
  - MS Office: Excel, PowerPoint, Word
  - Coding IDE: VS Code/PyCharm
  - GenAI tools: OpenAI, Google PaLM/BERT, Hugging Face, etc.

Functional Skills
- Expertise in building analytical solutions and delivering tangible business value for clients in use cases such as: price optimization, promotion effectiveness, product assortment optimization, sales force effectiveness, personalization/loyalty programs, and labor optimization
- CLM and revenue enhancement: segmentation, cross-sell/up-sell, next product to buy, offer recommendation, loyalty, LTV maximization and churn prevention
- Communicating with confidence and ease: you will be a clear and confident communicator, able to deliver messages concisely with strong and effective written and verbal communication

What You'll Bring
- Bachelor's/Master's degree in business analytics, statistics, economics, operations research, applied mathematics, computer science, engineering, or a related field required; advanced degree preferred
- At least 2-4 years of relevant industry work experience providing analytics solutions in a commercial setting
- Prior work experience in a global organization, preferably in a data analytics role at a professional services organization
- Demonstrated depth in one or more industries, including but not limited to retail, CPG, healthcare and telco
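The hypothesis-testing skills listed above (for example, measuring promotion effectiveness) can be illustrated with a minimal sketch. This is not BCG's methodology, just a generic two-proportion z-test in pure Python; the conversion counts below are invented for illustration.

```python
# Two-proportion z-test: did a hypothetical promotion lift conversion?
# All numbers are invented; this only illustrates the technique.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 120/2400 converted; promotion group: 165/2500 converted.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=165, n_b=2500)
print(f"z = {z:.2f}, p = {p:.4f}")
```

At the conventional 5% level the invented promotion effect here would be judged significant; in practice the test choice (z vs. t, one- vs. two-sided) depends on the case context.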
#BCGXjob

Who You'll Work With

Our data analytics and artificial intelligence professionals mix deep domain expertise with advanced analytical methods and techniques to develop innovative solutions that help our clients tackle their most pressing issues. We design algorithms and build complex models out of large amounts of data.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Join us as a Senior Test Automation Engineer at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Senior Test Automation Engineer you should have experience with:
- Hands-on test automation, with a deep understanding of software/QA methodologies
- Understanding requirements and user stories; preparing test scope and test cases and executing them
- Executing non-functional tests, including performance, load, stress, scalability and reliability
- Testing/automation tools and frameworks such as Python, Pytest, BDD, TDD, Karate, REST Assured, Performance Center, LoadRunner, etc.
- A good understanding of a tech stack including AWS, Kafka (messaging queues), MongoDB, SQL, ETL and APIs
- CI/CD integration tools such as Jenkins, TeamCity, GitLab, etc.
- Collaborating closely with Dev/DevOps/BA teams
- Unix commands, ETL architecture and data warehouse concepts
- Python for test automation: an in-depth understanding of data structures (lists, tuples, sets, dictionaries), OOP concepts, data frames, lambda functions, Boto3, file handling, DB handling, debugging techniques, etc.
- Performing complex SQL queries/joins to validate data transformations, migration and integrity across source and target systems
- Test and defect management: documenting test results and defects, and tracking issues to resolution using Jira/X-Ray
- At least one relational database: Oracle (GoldenGate services), MySQL, SQL Server or Teradata
- At least one CI/CD tool for integrating test automation suites: Jenkins or TeamCity

Some other highly valued skills may include:
- Functional corporate banking knowledge
- A good understanding of Agile methodologies
- Hands-on experience with GenAI models
- A good understanding of Snowflake, dbt and PySpark
- Experience with BI tools such as Tableau/Power BI for visual data validations

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role

To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.

Accountabilities
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.

Assistant Vice President Expectations

To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Alternatively, an individual contributor will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources (such as procedures and practices in other areas, teams and companies) to solve problems creatively and effectively. Communicate complex information: 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
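One skill this posting names, validating data migration integrity with SQL across source and target systems, can be sketched roughly as follows. This is a generic illustration, not Barclays' framework: stdlib sqlite3 stands in for the Oracle/Teradata databases mentioned, and the table and column names are invented.

```python
# Sketch of a source-vs-target reconciliation check after a data migration:
# compare row counts, then per-key aggregates, between the two systems.
import sqlite3

def row_count(conn, table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def check_migration_integrity(source, target):
    """Assert counts and per-key sums match across source and target."""
    assert row_count(source, "accounts") == row_count(target, "accounts")
    query = "SELECT account_id, SUM(amount) FROM accounts GROUP BY account_id"
    src_sums = dict(source.execute(query))
    tgt_sums = dict(target.execute(query))
    assert src_sums == tgt_sums, "per-account totals diverge"

# Two in-memory databases stand in for the real source/target systems.
src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
rows = [(1, 100.0), (1, 50.0), (2, 75.0)]
for conn in (src, tgt):
    conn.execute("CREATE TABLE accounts (account_id INT, amount REAL)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", rows)

check_migration_integrity(src, tgt)
print("migration checks passed")
```

In a real suite these checks would typically live as Pytest test functions, with connections supplied by fixtures rather than built inline.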

Posted 1 month ago

Apply

8 - 10 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.

Senior Data Engineer - Google Cloud
- 7+ years of direct experience working with enterprise data warehouse technologies.
- 7+ years in a customer-facing role working with enterprise clients.
- Experience architecting, implementing and/or maintaining technical solutions in virtualized environments.
- Experience in the design, architecture and implementation of data warehouses, data pipelines and flows.
- Experience developing software in one or more languages such as Java, Python and SQL.
- Experience designing and deploying large-scale distributed data processing systems with technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy.
- Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations.
- Demonstrated excellent communication, presentation, and problem-solving skills.
- Experience in project governance and enterprise.

Mandatory certifications required:
- Google Cloud Professional Cloud Architect
- Google Cloud Professional Data Engineer

Mandatory skill sets: GCP Architecture/Data Engineering, SQL, Python
Preferred skill sets: GCP Architecture/Data Engineering, SQL, Python
Years of experience required: 8-10 years
Qualifications: B.E/B.Tech/MBA/MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required:
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Python (Programming Language)
Optional Skills:
Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
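The data-pipeline design experience described above can be illustrated with a toy extract-transform-load flow. This is only a sketch of the pipeline shape in plain Python generators, not any GCP service API; the record fields are invented.

```python
# A minimal extract -> transform -> load pipeline, with generators standing
# in for the managed services a cloud data engineer would normally use.
def extract(rows):
    """Source stage: yield raw records one at a time."""
    for row in rows:
        yield row

def transform(rows):
    """Cleanse stage: drop incomplete records, normalise amounts."""
    for row in rows:
        if row["amount"] is not None:
            yield {**row, "amount": round(row["amount"], 2)}

def load(rows, sink):
    """Sink stage: write the cleansed records to the target store."""
    for row in rows:
        sink.append(row)
    return sink

raw = [{"id": 1, "amount": 10.567}, {"id": 2, "amount": None}]
warehouse = load(transform(extract(raw)), sink=[])
print(warehouse)  # only the complete record survives, amount rounded
```

The same staged structure maps onto a real pipeline: the generators become ingestion, transformation and load jobs, and the in-memory list becomes a warehouse table.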

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote


Entity: Technology
Job Family Group: IT&S Group

Job Description:

Responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities; building and maintaining effective working relationships with a range of customers; ensuring relevant standards are defined and maintained; and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, and backlog prioritisation and improvement. Will coach, mentor and support the data engineering squad across the full range of data engineering and solutions development activities: requirements gathering and analysis, solution design, coding and development, testing, implementation and operational support. Will work closely with the Product Owner to understand requirements/user stories and be able to plan and estimate the time needed to deliver them. Will proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. Will be highly skilled and experienced in tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL and Athena.

Years of Experience: 13-15

Essential domain expertise:
- Experience in big data technologies: AWS, Redshift, Glue, PySpark
- Experience of MPP (Massively Parallel Processing) databases helpful, e.g. Teradata, Netezza
- Awareness of the challenges involved in big data, such as large table sizes (depth/width) and even distribution of data
- Programming experience: SQL, Python
- Data modelling experience/awareness: Third Normal Form, dimensional modelling
- Data pipelining skills: data blending, etc.
- Visualisation experience: Tableau, Power BI, etc.
- Data management experience, e.g. data quality, security, etc.
- Experience of working in a cloud environment: AWS
- Development/delivery methodologies: Agile, SDLC
- Experience working in a geographically disparate team

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Commercial Acumen, Communication, Data Analysis, Data cleansing and transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms (Inactive), Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 1 month ago

Apply

4 years

0 Lacs

Hyderabad, Telangana, India

Remote


Overview

Enterprise Data Operations Sr Analyst - L08

Job Overview: As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse to satisfy project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products in areas like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy that supports future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities:
- Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies.
- Govern data design/modeling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); and data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and the mapping of source system data to canonical data stores for research, analysis and productization.
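The "create source-to-target mappings for ETL and BI developers" responsibility can be sketched as a plain data structure plus a small applier. This is an invented illustration, not a PepsiCo artifact; all field names are hypothetical.

```python
# A source-to-target mapping expressed as data: each source column maps to
# a target column plus an optional transformation an ETL job can apply.
source_to_target = {
    "cust_nm":  {"target": "customer_name", "transform": str.strip},
    "ord_dt":   {"target": "order_date",    "transform": None},
    "rev_amt":  {"target": "revenue_usd",   "transform": float},
}

def apply_mapping(source_row, mapping):
    """Produce a target-shaped row from a source row using the mapping spec."""
    target_row = {}
    for src_col, spec in mapping.items():
        value = source_row[src_col]
        if spec["transform"] is not None:
            value = spec["transform"](value)
        target_row[spec["target"]] = value
    return target_row

row = apply_mapping(
    {"cust_nm": " Acme ", "ord_dt": "2024-01-31", "rev_amt": "1250.50"},
    source_to_target,
)
print(row)
```

In practice such mappings usually live in a spreadsheet or metadata repository rather than code, but the structure (source column, target column, transformation rule) is the same.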
Qualifications:
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with the integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake.
- Experience with version control systems like GitHub and deployment and CI tools.
- Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).

Does the person hired for this job need to be based in a PepsiCo office, or can they be remote?: Employee must be based in a PepsiCo office.
Primary Work Location: Hyderabad HUB-IND

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Senior Data Engineer will be responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities; building and maintaining effective working relationships with a range of customers; ensuring relevant standards are defined and maintained; and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, and backlog prioritisation and improvement. Will coach, mentor and support the data engineering squad across the full range of data engineering and solutions development activities: requirements gathering and analysis, solution design, coding and development, testing, implementation and operational support. Will work closely with the Product Owner to understand requirements/user stories and be able to plan and estimate the time needed to deliver them. Will proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. Will be highly skilled and experienced in tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL and Athena.

Years of Experience: 8-12

Essential domain expertise:
- Experience in big data technologies: AWS, Redshift, Glue, PySpark
- Experience of MPP (Massively Parallel Processing) databases helpful, e.g. Teradata, Netezza
- Awareness of the challenges involved in big data, such as large table sizes (depth/width) and even distribution of data
- Programming experience: SQL, Python
- Data modelling experience/awareness: Third Normal Form, dimensional modelling
- Data pipelining skills: data blending, etc.
- Visualisation experience: Tableau, Power BI, etc.
- Data management experience, e.g. data quality, security, etc.
- Experience of working in a cloud environment: AWS
- Development/delivery methodologies: Agile, SDLC
- Experience working in a geographically disparate team
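The "even distribution of data" concern listed above comes from how MPP databases such as Teradata hash a distribution key across parallel units, so a skewed key concentrates rows on one unit and serialises the work. A toy simulation in pure Python (invented key values, simplified hashing) shows the effect:

```python
# Simulate hash distribution of rows across parallel slices, comparing a
# well-chosen key (many distinct values) with a skewed one.
from collections import Counter

def distribution(keys, n_slices=4):
    """Count rows landing on each slice under simple hash distribution."""
    return Counter(hash(k) % n_slices for k in keys)

def skew_ratio(dist):
    """Max rows on any slice divided by the mean rows per slice."""
    return max(dist.values()) / (sum(dist.values()) / len(dist))

# Unique order IDs spread evenly; a default "UNKNOWN" value does not.
even = distribution([f"order-{i}" for i in range(10_000)])
skewed = distribution(["UNKNOWN"] * 9_000 + [f"order-{i}" for i in range(1_000)])

print(f"even key skew: {skew_ratio(even):.2f}, "
      f"skewed key skew: {skew_ratio(skewed):.2f}")
```

A skew ratio near 1.0 means the slices share the work evenly; the skewed key piles most rows onto one slice, which is why distribution-key choice matters on MPP platforms.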

Posted 1 month ago

Apply

3 - 10 years

0 Lacs

Gurugram, Haryana, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Roles & Responsibilities:
- Total experience: 3 to 10 years
- Languages: Scala/Python 3.x
- File system: HDFS
- Frameworks: Spark 2.x/3.x (Batch/SQL API), Hadoop, Oozie/Airflow
- Databases: HBase, Hive, SQL Server, Teradata
- Version control system: GitHub
- Other tools: Zendesk, JIRA

Mandatory skill sets: Big Data, Python, Hadoop, Spark
Preferred skill sets: Big Data, Python, Hadoop, Spark
Years of experience required: 3-10 years
Education qualification: BE, B.Tech, MCA, M.Tech

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Big Data
Optional Skills: Python (Programming Language)
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* Global Finance was set up in 2007 as a part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business and Enterprise Finance functions. 
The capabilities hosted include Legal Entity Controllership, General Accounting & Reconciliations, Regulatory Reporting, Operational Risk and Control Oversight, Finance Systems Support, etc. Our group supports the Finance function for the Consumer Banking, Wealth and Investment Management business teams across Financial Planning and Analysis, period-end close, management reporting and data analysis. Job Description* The candidate will be responsible for delivering complex and time-critical data mining and analytical requests for Consumer & Small Business Banking lending products such as credit cards, and in addition will be responsible for analysis of data for decision making by senior leadership. The candidate will be responsible for data management, data extraction and upload, data validation, scheduling & process automation, report preparation, etc. The individual will play a key role in the team responsible for FP&A data reporting, ad-hoc reporting & data requirements, and data analytics & business analysis, and will manage multiple projects in parallel, ensuring adequate understanding of the requirements to deliver data-driven insights and solutions to complex business problems. These projects are time-critical and require the candidate to comprehend and evaluate the strategic business drivers to bring in efficiencies through automation of existing reporting packages or code. The work is a mix of standard and ad-hoc deliverables based on dynamic business requirements. Technical competency is essential to build processes which ensure data quality and completeness across all projects and requests related to the business. The core responsibility of this individual is process management to achieve sustainable, accurate and well-controlled results. The candidate should have a clear understanding of the end-to-end process (including its purpose) and a discipline of managing and continuously improving those processes.
Responsibilities* Preparation and maintenance of various KPI reporting (consumer lending such as credit cards), including performing data- or business-driven deep-dive analysis. Credit card rewards reporting, data mining & analytics. Understand business requirements and translate them into deliverables. Support the business on periodic and ad-hoc projects related to consumer lending products. Develop and maintain code for data extraction, manipulation, and summarization in tools such as SQL, SAS, and emerging technologies like Tableau and Alteryx. Design solutions, generate actionable insights, optimize existing processes, build tool-based automations, and ensure overall program governance. Managing and improving the work: develop a full understanding of the work processes, maintain a continuous focus on process improvement through simplification, innovation, and use of emerging technology tools, and understand data sourcing and transformation. Managing risk: manage and reduce risk, proactively identify risks, issues and concerns, and manage controls to help drive responsible growth (e.g., compliance, procedures, data management, etc.); establish a risk culture that encourages early escalation and self-identification of issues. Effective communication: deliver transparent, concise, and consistent messaging while influencing change in the teams. Extremely good with numbers, with the ability to present various business/finance metrics, detailed analysis, and key observations to senior business leaders. Requirements* Education* - Master's/Bachelor's degree in Information Technology/Computer Science/MCA or MBA (Finance) with 7-10 years of relevant work experience. Experience Range* 7-10 years of relevant work experience in data analytics, business analysis & financial reporting in the banking or credit card industry. Exposure to consumer banking businesses would be an added advantage.
Foundational skills* Strong abilities in data extraction, data manipulation and business analysis, and strong financial acumen. Strong computer skills, including MS Excel, Teradata SQL, SAS and emerging technologies like Alteryx and Tableau. Prior banking and financial services industry experience, preferably retail banking and credit cards. Strong business problem-solving skills and the ability to deliver on analytics projects independently, from initial structuring to final presentation. Strong communication skills (both verbal and written), interpersonal skills and relationship management skills to navigate the complexities of aligning stakeholders, building consensus, and resolving conflicts. Proficiency in Base SAS, Macros, and SAS Enterprise Guide. Experience in data extraction, transformation & loading using SQL/SAS. Proven ability to manage multiple and often competing priorities in a global environment. Manages operational risk by building strong processes and quality control routines. SQL: querying data from multiple sources. Data Quality and Governance: ability to clean, validate and ensure data accuracy and integrity. Troubleshooting: expertise in debugging and optimizing SAS and SQL code. Desired Skills Ability to effectively manage multiple priorities under pressure, deliver, and adapt to changes. Able to work in a fast-paced, deadline-oriented environment. Stakeholder management. Attention to detail: strong focus on data accuracy and documentation. Work Timings* 11:30 am to 8:30 pm (will require stretching 7-8 days in a month to meet critical deadlines) Job Location* Mumbai/Gurugram
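As a flavor of the SQL-driven extraction-and-summarization work this role describes, here is a minimal sketch. The table, columns, and figures are invented purely for illustration, and sqlite3 stands in for Teradata; a real KPI report would query production tables via Teradata SQL or SAS.

```python
import sqlite3

# In-memory stand-in for a (hypothetical) credit-card transactions table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE card_txns (txn_id INTEGER, category TEXT, amount REAL);
    INSERT INTO card_txns VALUES
        (1, 'travel', 120.0), (2, 'grocery', 45.5),
        (3, 'travel', 80.0),  (4, 'grocery', 30.0);
""")

# Summarize spend by category -- the kind of aggregation behind a KPI report.
rows = conn.execute("""
    SELECT category, COUNT(*) AS txns, ROUND(SUM(amount), 2) AS total_spend
    FROM card_txns
    GROUP BY category
    ORDER BY total_spend DESC
""").fetchall()

for category, txns, total in rows:
    print(category, txns, total)
```

The same GROUP BY shape carries over to Teradata SQL; only the connection layer and table names would change.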

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* Global Finance was set up in 2007 as a part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business and Enterprise Finance functions. 
The capabilities hosted include Legal Entity Controllership, General Accounting & Reconciliations, Regulatory Reporting, Operational Risk and Control Oversight, Finance Systems Support, etc. Our group supports the Finance function for the Consumer Banking, Wealth and Investment Management business teams across Financial Planning and Analysis, period-end close, management reporting and data analysis. Job Description* The candidate will be responsible for delivering complex and time-critical data mining and analytical requests for Consumer & Small Business Banking lending products such as credit cards, and in addition will be responsible for analysis of data for decision making by senior leadership. The candidate will be responsible for data management, data extraction and upload, data validation, scheduling & process automation, report preparation, etc. The individual will play a key role in the team responsible for FP&A data reporting, ad-hoc reporting & data requirements, and data analytics & business analysis, and will manage multiple projects in parallel, ensuring adequate understanding of the requirements to deliver data-driven insights and solutions to complex business problems. These projects are time-critical and require the candidate to comprehend and evaluate the strategic business drivers to bring in efficiencies through automation of existing reporting packages or code. The work is a mix of standard and ad-hoc deliverables based on dynamic business requirements. Technical competency is essential to build processes which ensure data quality and completeness across all projects and requests related to the business. The core responsibility of this individual is process management to achieve sustainable, accurate and well-controlled results. The candidate should have a clear understanding of the end-to-end process (including its purpose) and a discipline of managing and continuously improving those processes.
Responsibilities* Preparation and maintenance of various KPI reporting (consumer lending such as credit cards), including performing data- or business-driven deep-dive analysis. Credit card rewards reporting, data mining & analytics. Understand business requirements and translate them into deliverables. Support the business on periodic and ad-hoc projects related to consumer lending products. Develop and maintain code for data extraction, manipulation, and summarization in tools such as SQL, SAS, and emerging technologies like Tableau and Alteryx. Design solutions, generate actionable insights, optimize existing processes, build tool-based automations, and ensure overall program governance. Managing and improving the work: develop a full understanding of the work processes, maintain a continuous focus on process improvement through simplification, innovation, and use of emerging technology tools, and understand data sourcing and transformation. Managing risk: manage and reduce risk, proactively identify risks, issues and concerns, and manage controls to help drive responsible growth (e.g., compliance, procedures, data management, etc.); establish a risk culture that encourages early escalation and self-identification of issues. Effective communication: deliver transparent, concise, and consistent messaging while influencing change in the teams. Extremely good with numbers, with the ability to present various business/finance metrics, detailed analysis, and key observations to senior business leaders. Requirements* Education* - Master's/Bachelor's degree in Information Technology/Computer Science/MCA or MBA (Finance) with 7-10 years of relevant work experience. Experience Range* 7-10 years of relevant work experience in data analytics, business analysis & financial reporting in the banking or credit card industry. Exposure to consumer banking businesses would be an added advantage.
Foundational skills* Strong abilities in data extraction, data manipulation and business analysis, and strong financial acumen. Strong computer skills, including MS Excel, Teradata SQL, SAS and emerging technologies like Alteryx and Tableau. Prior banking and financial services industry experience, preferably retail banking and credit cards. Strong business problem-solving skills and the ability to deliver on analytics projects independently, from initial structuring to final presentation. Strong communication skills (both verbal and written), interpersonal skills and relationship management skills to navigate the complexities of aligning stakeholders, building consensus, and resolving conflicts. Proficiency in Base SAS, Macros, and SAS Enterprise Guide. Experience in data extraction, transformation & loading using SQL/SAS. Proven ability to manage multiple and often competing priorities in a global environment. Manages operational risk by building strong processes and quality control routines. SQL: querying data from multiple sources. Data Quality and Governance: ability to clean, validate and ensure data accuracy and integrity. Troubleshooting: expertise in debugging and optimizing SAS and SQL code. Desired Skills Ability to effectively manage multiple priorities under pressure, deliver, and adapt to changes. Able to work in a fast-paced, deadline-oriented environment. Stakeholder management. Attention to detail: strong focus on data accuracy and documentation. Work Timings* 11:30 am to 8:30 pm (will require stretching 7-8 days in a month to meet critical deadlines) Job Location* Mumbai/Gurugram

Posted 1 month ago

Apply

4 - 9 years

0 - 3 Lacs

Chennai, Mumbai (All Areas)

Hybrid


Direct Responsibilities Analyze and interpret requirement and issue specifications received from business analysts or production teams. Work with analysts to ensure correct understanding and implementation of specifications. Collaborate with the project analyst to meet the client's expectations. Take charge of development work according to the priorities defined by the product owner. Propose technical solutions adapted to the business needs; develop technical requirements. Design and develop IT solutions based on the specifications received. Package and deploy on non-production environments, and monitor production releases & deployments. Participate in testing support (system, user acceptance, regression). Bring a high level of quality to developments, in terms of maintainability, testability and performance. Participate in the code reviews set up within the program. Contributing Responsibilities Participate in transversal, capability-building efforts for the bank. Implement best practices and coding & development cultures. Work closely as “one team” with all stakeholders to provide high-quality delivery. Work on data-driven issues, innovative technologies (Java 17 or 21 with the Quarkus framework, Kubernetes, Kogito) and in the Finance & Risk functional areas. Technical & Behavioral Competencies Technical (Mandatory Skills): Teradata development (intermediate to expert) Knowledge of Linux shell SQL queries Unit testing (Optional Skills): Management of temporality in Teradata Knowledge of the Linux environment Behavioral Skills Ability to work independently and collaborate as part of a team. Rigorous and disciplined, with deep attention to quality of work (a software craftsmanship approach is welcome). Result-oriented, with the ability to meet and respect deadlines. Curious, with the ability to learn and adapt to technological change. Good communication skills. Excellent analytical and problem-solving skills.
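The "management of temporality in Teradata" skill above refers to validity-period (temporal) data. A hedged sketch of the idea, with an invented schema and sqlite3 standing in for Teradata purely so the lookup logic can be unit-tested:

```python
import sqlite3

# Hypothetical table of client rates, each row valid over a date window.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE client_rate (client_id INT, rate REAL, valid_from TEXT, valid_to TEXT);
    INSERT INTO client_rate VALUES
        (1, 0.05, '2023-01-01', '2023-06-30'),
        (1, 0.07, '2023-07-01', '9999-12-31');
""")

def rate_as_of(client_id, as_of):
    """Return the rate whose validity window contains the as-of date, else None."""
    row = conn.execute(
        "SELECT rate FROM client_rate "
        "WHERE client_id = ? AND valid_from <= ? AND valid_to >= ?",
        (client_id, as_of, as_of),
    ).fetchone()
    return row[0] if row else None
```

Teradata has native temporal-table support that makes this windowing declarative; the query above only illustrates the underlying as-of semantics.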

Posted 1 month ago

Apply

11 - 13 years

13 - 15 Lacs

Hyderabad

Work from Office


Position Overview Developer with 11-13 years of strong design and development experience to build robust APIs and services using Java and Spring Boot, coupled with hands-on experience in data processing. Has the knowledge and experience to design and implement scalable on-prem/cloud solutions that efficiently manage and leverage large datasets. Proficient in Java/Spring Boot with demonstrated ability to integrate with different databases and other APIs and services while ensuring security and best practices are followed throughout the development lifecycle. Responsible for the overall design and development of data integration code for the engineering team/asset. Responsible for providing technical knowledge on integration and ETL processes to build and establish coding standards. Creates a working agreement within the team and with the other stakeholders involved. Partners and aligns with the Enterprise Architect, Product Owner, Production Support, Analysts, the PVS Testing team, and data engineers to build solutions that conform to defined standards. Performs code reviews and implements suggested improvements. Responsibilities Design, develop, and maintain APIs using Java and Spring Boot, ensuring efficient data exchange between applications. Implement API security measures including authentication, authorization, and rate limiting. Document API specifications and maintain API documentation for internal and external users. Develop integrations with different data sources and other APIs/web services. Develop integrations with IBM MQ and Kafka. Develop and maintain CI/CD pipelines. Perform performance evaluation and application tuning. Monitor and troubleshoot applications for stability and performance. Well versed in the design, development, and unit testing of ETL jobs that read and write data from database tables, flat files, datasets, IBM MQs, Kafka topics, S3 files, etc. Carry out data profiling of source data and generate logical data models (as required or applicable).
Define, document, and complete System Requirements Specifications including functional requirements, context diagrams, non-functional requirements, and business rules (as applicable for sprints to be completed). Create source-to-target mapping documents as required and applicable. Support definition of business requirements, including assisting the Product Owner in writing user stories and acceptance criteria for user stories. Support other scrum team members during the following activities (as required or applicable): design of test scenarios and test cases; developing and identifying data requirements for unit, system, and integration tests. Qualifications Required Skills Programming Languages: proficiency in Java. Web Development: experience with SOAP and RESTful services. Database Management: strong knowledge of SQL (Oracle). Version Control: expertise in using version control systems like Git. CI/CD: familiarity with CI/CD tools such as GitLab CI and Jenkins. Containerization & Orchestration: experience with Docker and OpenShift. Messaging Queues: knowledge of IBM MQ and Apache Kafka. Cloud Services: familiarity with cloud platforms such as AWS, Azure, or Google Cloud. Adept working experience in the design and development of performance-efficient ETL flows dealing with millions of rows in volume. Must have experience working in a SAFe Agile Scrum project delivery model. Good at writing complex SQL queries to pull data out of RDBMS databases like Oracle, SQL Server, DB2, Teradata, etc. Good working knowledge of Unix scripts. Batch job scheduling software such as CA ESP. Experienced in using CI/CD methodologies. Required Experience & Education Must have 11-13 years of hands-on development of ETL jobs using IBM DataStage version 11 or higher. Experience managing and/or leading a team of developers. Working knowledge of data modelling, solution architecture, normalization, data profiling, etc.
Adherence to good coding practices and technical documentation; must be a good team player. Desired Skills Analytical Thinking: ability to break down complex problems and devise efficient solutions. Debugging: skilled in identifying and fixing bugs in code and systems. Algorithm Design: proficiency in designing and optimizing algorithms. Leadership: proven leadership skills with experience mentoring junior engineers. Communication: strong verbal and written communication skills. Teamwork: ability to collaborate effectively with cross-functional teams. Time Management: competence in managing time and meeting project deadlines. Education Bachelor's degree in Computer Science, Software Engineering, or a related field. A Master's degree is a plus. Certifications: relevant certifications in AWS are a plus. Location & Hours of Work Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH). About Evernorth Health Services
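Of the API security measures the role lists, rate limiting is the easiest to sketch in isolation. Below is a minimal token-bucket limiter; the class name and parameters are illustrative only, and a production Spring Boot service would more likely enforce this at a gateway or filter layer rather than hand-roll it:

```python
import time

class TokenBucket:
    """Toy token-bucket rate limiter: capacity tokens, refilled at a fixed rate."""

    def __init__(self, capacity, refill_per_sec):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A burst of `capacity` requests passes immediately; further requests are rejected until tokens refill, which is the behavior an API's 429 responses would reflect.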

Posted 1 month ago

Apply

11 - 13 years

13 - 15 Lacs

Hyderabad

Work from Office


Position Overview Developer with 11-13 years of strong design and development experience to build robust APIs and services using Java and Spring Boot, coupled with hands-on experience in data processing. Has the knowledge and experience to design and implement scalable on-prem/cloud solutions that efficiently manage and leverage large datasets. Proficient in Java/Spring Boot with demonstrated ability to integrate with different databases and other APIs and services while ensuring security and best practices are followed throughout the development lifecycle. Creates a working agreement within the team and with the other stakeholders involved. Partners and aligns with the Enterprise Architect, Product Owner, Production Support, Analysts, the PVS Testing team, and data engineers to build solutions that conform to defined standards. Performs code reviews and implements suggested improvements. Responsibilities Design, develop, and maintain APIs using Java and Spring Boot, ensuring efficient data exchange between applications. Implement API security measures including authentication, authorization, and rate limiting. Document API specifications and maintain API documentation for internal and external users. Develop integrations with different data sources and other APIs/web services. Develop integrations with IBM MQ and Kafka. Develop and maintain CI/CD pipelines. Perform performance evaluation and application tuning. Monitor and troubleshoot applications for stability and performance. Carry out data profiling of source data and generate logical data models (as required or applicable). Define, document, and complete System Requirements Specifications including functional requirements, context diagrams, non-functional requirements, and business rules (as applicable for sprints to be completed). Create source-to-target mapping documents as required and applicable.
Support definition of business requirements, including assisting the Product Owner in writing user stories and acceptance criteria for user stories. Support other scrum team members during the following activities (as required or applicable): design of test scenarios and test cases; developing and identifying data requirements for unit, system, and integration tests. Qualifications Required Skills Programming Languages: proficiency in Java. Web Development: experience with SOAP and RESTful services. Database Management: strong knowledge of SQL (Oracle). Version Control: expertise in using version control systems like Git. CI/CD: familiarity with CI/CD tools such as GitLab CI and Jenkins. Containerization & Orchestration: experience with Docker and OpenShift. Messaging Queues: knowledge of IBM MQ and Apache Kafka. Cloud Services: familiarity with cloud platforms such as AWS, Azure, or Google Cloud. Adept working experience in the design and development of performance-efficient ETL flows dealing with millions of rows in volume. Must have experience working in a SAFe Agile Scrum project delivery model. Good at writing complex SQL queries to pull data out of RDBMS databases like Oracle, SQL Server, DB2, Teradata, etc. Good working knowledge of Unix scripts. Batch job scheduling software such as CA ESP. Experienced in using CI/CD methodologies. Required Experience & Education Must have 11-13 years of hands-on development experience. Extensive experience developing and maintaining APIs. Experience managing and/or leading a team of developers. Working knowledge of data modelling, solution architecture, normalization, data profiling, etc. Adherence to good coding practices and technical documentation; must be a good team player. Desired Skills Analytical Thinking: ability to break down complex problems and devise efficient solutions. Debugging: skilled in identifying and fixing bugs in code and systems. Algorithm Design: proficiency in designing and optimizing algorithms.
Leadership: proven leadership skills with experience mentoring junior engineers. Communication: strong verbal and written communication skills. Teamwork: ability to collaborate effectively with cross-functional teams. Time Management: competence in managing time and meeting project deadlines. Education Bachelor's degree in Computer Science, Software Engineering, or a related field. A Master's degree is a plus. Certifications: relevant certifications in AWS are a plus. Location & Hours of Work Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Posted 1 month ago

Apply

1 - 2 years

3 - 4 Lacs

Bengaluru

Work from Office


About Lowe's Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com. About the Team This team is responsible for performing the quantitative analysis and dashboard building needed to help guide key business decisions. This includes applying knowledge of Lowe's data concepts to the creation of relevant analytic designs and making sound, data-driven business recommendations. Job Summary This role leverages multiple resources, advanced analytic methodologies, and data streams to support recommendations for business decisions and reporting solutions. With a focus specifically on Pro & Services, this role provides data capture capabilities to support analytics needs for all Pro & Services business areas. This role translates business needs into effective analytics specifications that provide metrics for analytic solutions across various initiatives. This individual executes analytic, reporting, and automation projects with minimal support, taking direction from the manager and senior-level staff, to provide expertise in problem analysis, solution implementation, and ongoing opportunities in the assigned business area. To be successful, the individual in this role must have a fair understanding of analytical techniques, disparate data sources (both internal and external), and reporting tools and techniques.
Roles & Responsibilities: Core Responsibilities: Responsible for providing area-specific business data analytics and the development and deployment of necessary dashboards and reporting. Helps gather business requirements and translates them into reporting solutions, analytic tools, and dashboards to deliver actionable data to end users. Synthesizes findings, prepares reports and presentations, and presents findings to management. Communicates data-driven insights to leaders by preparing analyses using multiple data sources, translating findings into clear, understandable themes, and identifying complete, consistent, and actionable insights and recommendations. Develops, configures, and modifies database components within various computing environments by using tools such as SQL and/or Power BI to access, manipulate, and present data. Years of Experience: 1-2 years of experience using analytic tools (e.g., SQL, Alteryx, Knime, SAS). 1-2 years of experience using data visualization tools (e.g., Power BI, MicroStrategy, Tableau). 1-2 years of experience working with enterprise-level databases (e.g., Hadoop, Teradata, GCP, Oracle, DB2). Education Qualification & Certifications (optional) Required Minimum Qualifications: Bachelor's degree in Business Administration, Finance, Mathematics, or a related field and 1 year of related experience, OR a Master's degree in Business Administration, Finance, Mathematics, or a related field. Skill Set Required Primary Skills (must have) Hands-on experience with analytical tools (e.g., SQL, Alteryx, Knime, SAS). Experience using data visualization tools (e.g., Power BI, MicroStrategy, Tableau). Experience working with enterprise-level databases (e.g., Hadoop, Teradata, GCP, Oracle, DB2). Secondary Skills (desired) Basic understanding of the retail/home improvement industry
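The "translating findings into clear, understandable themes" responsibility usually means reducing raw figures to one presentation-ready metric. A tiny sketch of that step, with an invented metric and invented numbers:

```python
# Hypothetical weekly sales figures feeding a dashboard tile.
weekly_sales = {"2024-W01": 125000.0, "2024-W02": 131250.0}

def wow_change(curr, prev):
    """Percent change week over week, rounded for presentation."""
    return round((curr - prev) / prev * 100, 1)

change = wow_change(weekly_sales["2024-W02"], weekly_sales["2024-W01"])
print(f"WoW sales change: {change}%")
```

In practice the inputs would come from a SQL query or Power BI dataset rather than a literal dict; the summarization logic is the same.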

Posted 1 month ago

Apply

4 - 9 years

40 - 45 Lacs

Hyderabad

Work from Office


Description As a Cloud Data Platform Engineer, you will be responsible for leading all aspects of a database platform. This includes database design, database security, DR strategy, developing standard processes, evaluating new features, and analyzing workloads to identify optimization opportunities at a system and application level. You will drive automation efforts to effectively manage the database platform and build self-service solutions for users. You will also partner with development teams, product managers and business users to review the solution designs being deployed and provide recommendations to optimize and tune them. This role will also address any platform-wide performance and stability issues. We're looking for an individual who loves to take on challenges, tackles problems with imaginative solutions, and works well in collaborative teams to build and support a large Enterprise Data Warehouse. 4+ years of experience in database technologies like Snowflake (preferred), Teradata, BigQuery or Redshift. Demonstrated ability working with advanced SQL. Experience handling DBA functions, DR strategy, data security, governance, and the associated automation and tooling for a database platform. Experience with object-oriented programming in Python or Java. Ability to analyze production workloads and develop strategies to run the Snowflake database with scale and efficiency. Experience in performance tuning, capacity planning, and managing cloud spend and utilization. Experience with SaaS/PaaS enterprise services on GCP/AWS or Azure is a plus. Familiarity with in-memory database platforms like SingleStore is a plus. Experience with Business Intelligence (BI) platforms like Tableau, ThoughtSpot and Business Objects is a plus. Good communication and interpersonal skills: the ability to interact and work well with members of other functional groups in a project team, and a strong sense of project ownership.
Education & Experience Bachelor's Degree in Computer Science, Engineering or IT from a reputed school
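Analyzing production workloads, as the role above describes, typically starts by mining query-history metadata for outliers. A hedged sketch of that first pass, with fabricated records and invented field names (in Snowflake the real source would be the QUERY_HISTORY view):

```python
# Fabricated query-history records standing in for warehouse metadata.
history = [
    {"query_id": "q1", "elapsed_s": 4.2,  "bytes_scanned": 10_000_000},
    {"query_id": "q2", "elapsed_s": 95.0, "bytes_scanned": 8_000_000_000},
    {"query_id": "q3", "elapsed_s": 1.1,  "bytes_scanned": 500_000},
]

def slow_queries(records, threshold_s=60.0):
    """Return IDs of queries exceeding the elapsed-time threshold, slowest first."""
    hits = [r for r in records if r["elapsed_s"] > threshold_s]
    return [r["query_id"] for r in sorted(hits, key=lambda r: -r["elapsed_s"])]
```

The flagged query IDs would then be examined individually (query profile, clustering, warehouse sizing) to decide where tuning pays off.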

Posted 1 month ago

Apply

8 - 11 years

14 - 19 Lacs

Thiruvananthapuram

Work from Office


We are looking for a skilled professional with 8 to 11 years of industry experience to lead our migration of the data analytics environment from Teradata to Snowflake, focusing on performance and reliability. The ideal candidate will have strong technical expertise in big data engineering and hands-on experience with Snowflake. ### Roles and Responsibilities Lead the migration of data analytics environments from Teradata to Snowflake, emphasizing performance and reliability. Design and deploy big data pipelines in a cloud environment using Snowflake Cloud DW. Develop and migrate existing on-prem ETL routines to cloud services. Collaborate with senior leaders to understand business goals and contribute to workstream delivery. Design and optimize model code for faster execution. Work with cross-functional teams to ensure seamless integration of data analytics solutions. ### Job Requirements Minimum 8 years of experience as an architect on analytics solutions. Strong technical experience with Snowflake, including modeling, schema, and database design. Experience integrating with third-party tools, ETL, and dbt tools. Proficiency in programming languages such as Java, Scala, or Python. Excellent communication skills, both written and verbal, with the ability to communicate complex technical concepts effectively. Flexible and proactive working style with strong personal ownership of problem resolution. A computer science graduate or equivalent is required.
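One small mechanical chore in a Teradata-to-Snowflake migration is normalizing Teradata-specific SQL shorthand before queries are ported. A sketch of that idea, covering only the SEL/DEL abbreviations (a real migration also handles data types, utilities, and dialect differences, so this is flavor, not a converter):

```python
import re

# Teradata keyword shorthand expanded to the ANSI forms Snowflake expects.
REWRITES = [(r"\bSEL\b", "SELECT"), (r"\bDEL\b", "DELETE")]

def normalize_teradata_sql(sql):
    """Expand Teradata SEL/DEL abbreviations, case-insensitively."""
    for pattern, replacement in REWRITES:
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql

print(normalize_teradata_sql("sel * from sales_fact"))
```

The word-boundary anchors keep the rewrite from touching `SELECT` or `DELETE` that are already spelled out.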

Posted 1 month ago

Apply

3 - 7 years

10 - 14 Lacs

Hyderabad

Hybrid

Naukri logo

Job Title: Data Expert Teradata & SQL Location: Hyderabad Job Summary: We are seeking an experienced Data Expert with deep expertise in Teradata and SQL to join our data team. In this role, you will be responsible for designing, developing, optimizing, and maintaining scalable data solutions within our Teradata ecosystem. You will work closely with data engineers, analysts, and business stakeholders to enable high-quality, performant data pipelines and queries that support analytics and decision-making across the organization. Key Responsibilities: Develop, optimize, and troubleshoot complex Teradata SQL queries to support business intelligence and analytics initiatives. Design and implement efficient data models , including star and snowflake schemas in Teradata. Monitor, analyze, and tune query performance using Teradata EXPLAIN , DBQL , and statistics collection . Collaborate with cross-functional teams to understand data requirements and translate them into scalable data solutions. Develop and maintain ETL/ELT processes using BTEQ, FastLoad, MultiLoad, or Teradata Parallel Transporter (TPT). Implement data quality checks , validation logic, and ensure consistency across environments. Document data pipelines, definitions, and data lineage for governance and compliance. Support data migration or integration projects involving Teradata and other databases (e.g., Oracle, Snowflake, SQL Server). Automate repetitive tasks using scripting (Shell, Python) and workflow orchestration tools (e.g., Airflow, Control-M). Required Skills and Qualifications: 3+ years of hands-on experience with Teradata SQL in a production environment. Strong knowledge of Teradata architecture , indexing strategies (PI, PPI), and query tuning. Experience with Teradata utilities : BTEQ, FastLoad, MultiLoad, and TPT. Ajith D ajith.d@cgi.com

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

- 5+ years of data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with SQL - Experience managing a data or BI team - Experience leading and influencing the data or BI strategy of your team or organization - Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience hiring, developing and promoting engineering talent - Experience communicating to senior management and customers verbally and in writing We are seeking an ambitious Data Engineering Manager to join our Metrics and Data Platform team. The Metrics and Data Platform team plays a critical role in enabling Amazon Music’s business decisions and data-driven software development by collecting and providing behavioral and operational metrics to our internal teams. We maintain a scalable and robust data platform to support Amazon Music’s rapid growth, and collaborate closely with data producers and data consumers to accelerate innovation using data. As a Data Engineering Manager, you will manage a team of talented Data Engineers. Your team collects billions of events a day, manages petabyte-scale datasets on Redshift and S3, and develops data pipelines with Spark, SQL, EMR, and Airflow. You will collaborate with product and technical stakeholders to solve challenging data modeling, data availability, data quality, and data governance problems. At Amazon Music, engineering managers are the primary drivers of their team’s roadmap, priorities, and goals. You will be deeply involved in your team’s execution, helping to remove obstacles and accelerate progress. A successful candidate will be customer obsessed, highly analytical and detail oriented, able to work effectively in a data-heavy organization, and adept at leading across multiple different complex workstreams at once. 
Key job responsibilities - Hiring, motivating, mentoring, and growing a high-performing engineering team - Owning and managing big data pipelines, Amazon Music’s foundational datasets, and the quality and operational performance of the datasets - Collaborating with cross-functional teams and customers, including business analysts, marketing, product managers, technical program managers, and software engineers/managers - Defining and managing your team’s roadmap, priorities, and goals in partnership with Product, stakeholders, and leaders - Ensuring timely execution of team priorities and goals by proactively identifying risks and removing blockers - Recognizing and recommending process and engineering improvements that reduce failures and improve efficiency - Clearly communicating business updates, verbally and in writing, to both technical and non-technical stakeholders, peers, and leadership - Effectively influencing other team’s priorities and managing escalations - Owning and improving business and operational metrics of your team's software - Ensuring team compliance with policies (e.g., information security, data handling, service level agreements) - Identifying ways to leverage GenAI to reduce operational overhead and improve execution velocity - Introducing ideas to evolve and modernize our data model to address customer pain points and improve query performance About the team Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators. From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, Amazon Music is innovating at some of the most exciting intersections of music and culture. 
We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all the music in shuffle mode, and top ad-free podcasts, included with their membership; customers can upgrade to Amazon Music Unlimited for unlimited, on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale. Learn more at https://www.amazon.com/music. Experience with AWS Tools and Technologies (Redshift, S3, EC2) Experience in processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or Hadoop based big data solution) Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 1 month ago

Apply

0 years

0 - 0 Lacs

Chennai, Tamil Nadu

Work from Office

Indeed logo

Create COBOL, DB2, Informatica, Linux, Teradata and Oracle code artifacts Meet with various IT groups (other departments and computer operations' staff) to address issues/concerns Interact closely with Business Analysis team, ETL team and BI Reporting teams to ensure understanding of proper use of data architecture Analyze requirements to create technical designs, data models and migration strategies Design, build, and maintain physical databases, dimensional data models, OLAP cubes, ETL layer design and data integration strategies Evaluate and influence selection of data warehouse and business intelligence software Collaborate with technology stakeholders to define and implement actionable metrics, KPIs and data visualizations Lead technical design and implementation of dashboards and reporting capabilities Implement data quality, data integrity, and data standardization efforts across products and databases enabling key business processes and applications Recommend improvements to enhance existing ETL and data integration processes to enable performance and overall scalability Job Types: Full-time, Permanent, Fresher Pay: ₹18,455.00 - ₹28,755.00 per month Benefits: Provident Fund Schedule: Day shift Morning shift Rotational shift Supplemental Pay: Yearly bonus Work Location: In person

Posted 1 month ago

Apply

5 - 7 years

6 - 11 Lacs

Chennai

Work from Office

Naukri logo

Job Description Role: Data Engineer Experience level: 5 to 7 years Location: Chennai Can you say Yes, I have! to the following? Good understanding of distributed system architecture, data lake design and best practices Working knowledge of cloud-based deployments in AWS, Azure or GCP Coding proficiency in at least one programming language (Scala, Python, Java) Experience in Airflow is preferred Experience in data warehousing, relational database architectures (Oracle, SQL, DB2, Teradata) Expertise in Big Data storage and processing platform (Hadoop, Spark, Hive, HBASE) Skills: Problem solver, fast learner, energetic and enthusiastic Self-motivated and highly professional, with the ability to lead, and take ownership and responsibility Adaptable and flexible to business demands Can you say Yes, I will! to the following? Lead analytical projects and deliver value to customers Coordinate individual teams to fulfil client requirements and manage deliverables Communicate and present complex concepts to business audiences Manage and strategize business from an analytics point of view Travel to client locations when necessary Design algorithms for product development and build analytics-based products

Posted 1 month ago

Apply

2 - 6 years

7 - 17 Lacs

Hyderabad

Work from Office

Naukri logo

About this role: Wells Fargo is seeking an Analytics Consultant. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow. In this role, you will: Consult with business line and enterprise functions on less complex research Use functional knowledge to assist in non-model quantitative tools that support strategic decision making Perform analysis of findings and trends using statistical analysis and document process Present recommendations to increase revenue, reduce expense, maximize operational efficiency, quality, and compliance Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency Participate in all group technology efforts including design and implementation of database structures, analytics software, storage, and processing Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff Understand compliance and risk management requirements for supported area Ensure adherence to data management or data governance regulations and policies Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals Collaborate and consult with more experienced consultants and with partners in technology and other business groups Required Qualifications: 2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualification: Responsible for maintaining partner relationships, ensuring high quality team deliverables and SLAs. 
Working closely with the US partners on daily basis, interacting closely with multiple business partners and program managers. Work independently, foster a culture of healthy and efficient working for the team. Designing and solving complex business problems by analytical techniques and tools. Will be involved directly in the technical build-out and/or support of databases, query tools, reporting tools, BI tools, dashboards, etc. that enable analysis, modeling, and/or advanced data visualization including development of Business Objects reports using multiple databases. Recommends potential data sources, compiles and mine data from multiple, cross business sources. Works with typically very large data sets, both structured and unstructured, and from multiple sources. Develops specific, customized reports, ad hoc analyses and/or data visualizations, formatted with business user-friendly techniques to drive adoption, such as Excel macros/pivoting/filtering, PowerPoint slides and presentations, and clear verbal and e-mail communications. Works with senior consultants or directly with partners, responsible for identifying and defining business requirements and translating business needs into moderately complex analyses and recommendations. Works with local and international colleagues and with internal customers, responsible for identifying and defining business requirements and catering to business needs for the team. Ensure adherence to data management/data governance regulations and policies. Applies knowledge of business, customers, and/or products/services/portfolios to synthesize data to 'form a story' and align information to contrast/ compare to industry perspective. Ability to work overlap hours with US team. 2+ years of experience in one or more of the following: Modeling, Forecasting, Decision Trees as well as other statistical and performance analytics. 2+ years of experience in one or more of the following: Tableau, or Power BI and paginated reports. 
2+ years of Python and SQL 2+ years of experience in developing and creating BI dashboards, working on end-to-end reports, deriving insights from data. Excellent verbal, written, and interpersonal communication skill. Extensive knowledge and understanding of research and analysis. Strong analytical skills with high attention to detail and accuracy. Collaborative, team-focused attitude. Experience with Teradata/Oracle databases. Experience with Power Automate and GitHub. Domain knowledge within banking.

Posted 1 month ago

Apply

4 - 9 years

7 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

About this role: Wells Fargo is seeking a... In this role, you will: Participate in low risk initiatives within Risk Analytics Review process production, and model documentation in alignment with policy, analyzing trends in current population Receive direction from manager Exercise judgment within Risk Analytics while developing understanding of analytic models, policies, and procedures Provide monthly, quarterly, and annual reports to manager and experienced managers Required Qualifications: 6+ months of Risk Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Required Qualifications for Europe, Middle East Africa only: Experience in Risk Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualifications: 4+ years of experience SQL, Teradata, and or Hadoop experience. 4+ years of experience with BI tools such as Tableau, Power BI or Alteryx applications. 3+ years of experience in risk (includes compliance, financial crimes, operational, audit, legal, credit risk, market risk). Experience researching and resolving data problems and working with technology teams on remediation of data issues. Demonstrated strong analytical skills with high attention to detail and accuracy. Excellent verbal, written, and listening communication skills. Job Expectations: Participate in complex initiatives related to business analysis and modeling, including those that are cross functional, with broad impact, and act as key participant in data aggregation and monitoring for Risk Analytics. Fully understands Data Quality Checks, Methodology, Dimensions for data completeness, accuracy, and that policies and procedures are followed. Becomes a SME in the DQ Check elements, technology infrastructure utilized, and fully understands the metadata and lineage from DQ report to source data. 
Escalates potential risks, issues, or calendar/timeliness risks in a timely manner to management/Data Management Sharepoint. Ensures the organization and storage of DQ checks artifacts, files, and evidences are effective, efficient, and make sense. Perform deep dive analytics (both Adhoc and structured) and provide reporting or results to both internal and external stakeholders. Design and build rich data visualizations to communicate complex ideas and automate reporting and controls. Create and interpret Business Intelligence data (Reporting, Basic Analytics, Predictive Analytics and Prescriptive Analytics) combined with business knowledge to draw supportable conclusions about current and future risk levels. Becomes a SME in the Reporting, Data Quality check elements, technology infrastructure utilized, and fully understands the metadata and lineage from DQ report to source data. To demonstrate the ability to identify and implement areas of opportunities for quality assurance, data validation, analytics and data aggregation to improve overall reporting efficiencies. Creating and executing the UAT test cases, logging the defects and managing the defects till closure. Collaborate and consult with peers, less experienced to more experienced managers, to resolve production, project, and regulatory issues, and achieve risk analysts, and common modeling goals.

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu

Work from Office

Indeed logo

Job Description Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. Job Description - Grade Specific The involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. They play a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable and high-quality data solutions to support the organization's data-driven objectives. Skills (competencies) Ab Initio Agile (Software Development Framework) Apache Hadoop AWS Airflow AWS Athena AWS Code Pipeline AWS EFS AWS EMR AWS Redshift AWS S3 Azure ADLS Gen2 Azure Data Factory Azure Data Lake Storage Azure Databricks Azure Event Hub Azure Stream Analytics Azure Sunapse Bitbucket Change Management Client Centricity Collaboration Continuous Integration and Continuous Delivery (CI/CD) Data Architecture Patterns Data Format Analysis Data Governance Data Modeling Data Validation Data Vault Modeling Database Schema Design Decision-Making DevOps Dimensional Modeling GCP Big Table GCP BigQuery GCP Cloud Storage GCP DataFlow GCP DataProc Git Google Big Tabel Google Data Proc Greenplum HQL IBM Data Stage IBM DB2 Industry Standard Data Modeling (FSLDM) Industry Standard Data Modeling (IBM FSDM)) Influencing Informatica IICS Inmon methodology JavaScript Jenkins Kimball Linux - Redhat Negotiation Netezza NewSQL Oracle Exadata Performance Tuning Perl Platform Update Management Project Management PySpark Python R RDD Optimization SantOs SaS Scala Spark Shell Script Snowflake SPARK SPARK Code Optimization SQL Stakeholder Management Sun Solaris Synapse Talend Teradata Time Management Ubuntu Vendor Management

Posted 1 month ago

Apply

2 - 7 years

4 - 9 Lacs

Pune

Work from Office

Naukri logo

Role Overview: We are seeking an experienced ETL Developer with strong expertise in Informatica PowerCenter and Teradata to design and implement robust data integration solutions. This role involves end-to-end ownership of ETL workflows, performance optimization, and close collaboration with business and technical stakeholders to support enterprise data warehouse initiatives. Key Responsibilities: ETL Development (Informatica PowerCenter): Design, develop, and implement scalable ETL processes using Informatica PowerCenter . Extract, transform, and load data from multiple source systems into the Teradata data warehouse. Create, manage, and optimize ETL workflows and mappings. Teradata Database Management: Create and manage tables, indexes, stored procedures , and other database objects in Teradata . Ensure optimal database performance and maintain scalable data structures. Data Mapping and Transformation: Develop data mapping specifications and define transformation logic. Implement data cleansing, validation, and transformation rules within ETL processes. Performance Tuning: Optimize ETL performance by tuning SQL queries, mappings, and workflows. Identify and resolve performance bottlenecks in the ETL and data integration pipeline. Documentation: Maintain detailed documentation for ETL jobs, data mappings, SQL scripts , and Teradata configurations . Ensure adherence to coding standards and best practices. Collaboration & Quality Assurance: Work closely with data architects, business analysts, and cross-functional teams to understand data requirements and ensure accurate data delivery. Conduct unit, system, and integration testing to validate ETL workflows. Troubleshoot and resolve data-related issues in a timely and efficient manner. Required Skills : Hands-on experience in ETL development using Informatica PowerCenter . Strong knowledge of Teradata and its ecosystem. Proficient in SQL , with experience in query optimization and performance tuning. 
Solid understanding of data modeling , data warehousing concepts , and ETL architecture . Ability to create detailed and clear technical documentation . Familiarity with data quality , validation , and ","

Posted 1 month ago

Apply

7 years

0 Lacs

Chennai, Tamil Nadu, India

Linkedin logo

Job Summary Job Summary: We are looking for an experienced Senior Software Engineer with deep expertise in Spark SQL / SQL development to lead the design, development, and optimization of complex database systems. As a Senior Spark SQL/SQL Developer, you will play a key role in creating and maintaining high performance, scalable database solutions that meet business requirements and support critical applications. You will collaborate with engineering teams, mentor junior developers, and drive improvements in database architecture and performance. Key Responsibilities: Design, develop, and optimize complex Spark SQL / SQL queries, stored procedures, views, and triggers for high performance systems. Lead the design and implementation of scalable database architectures to meet business needs. Perform advanced query optimization and troubleshooting to ensure database performance, efficiency, and reliability. Mentor junior developers and provide guidance on best practices for SQL development, performance tuning, and database design. Collaborate with cross functional teams, including software engineers, product managers, and system architects, to understand requirements and deliver robust database solutions. Conduct code reviews to ensure code quality, performance standards, and compliance with database design principles. Develop and implement strategies for data security, backup, disaster recovery, and high availability. Monitor and maintain database performance, ensuring minimal downtime and optimal resource utilization. Contribute to long term technical strategies for database management and integration with other systems. Write and maintain comprehensive documentation on database systems, queries, and architecture. Required Skills & Qualifications :-- Experience: 7+ years of hands on experience in SQL Developer / data engineering or a related field. 
Expert level proficiency in Spark SQL and extensive experience with Bigdata (Hive), MPP (Teradata), relational databases such as SQL Server, MySQL, or Oracle. ¿ Strong experience in database design, optimization, and troubleshooting. Deep knowledge of query optimization, indexing, and performance tuning techniques. Strong understanding of database architecture, scalability, and high availability strategies. Experience with large scale, high transaction databases and data warehousing. Strong problem solving skills with the ability to analyze complex data issues and provide effective solutions. Data testing and data reconciliation Ability to mentor and guide junior developers and promote best practices in SQL development. Proficiency in database migration, version control, and integration with applications. Excellent communication and collaboration skills, with the ability to interact with both technical and non technical stakeholders. Preferred Qualifications: Experience with NoSQL databases (e.g., MongoDB, Cassandra) and cloud based databases (e.g., AWS RDS, Azure SQL Database). Familiarity with data analytics, ETL processes, and data pipelines. Experience in automation tools, CI/CD pipelines, and agile methodologies. Familiarity with programming languages such as Python, Java, or C#. Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (or equivalent experience).

Posted 1 month ago

Apply

1 years

0 Lacs

Gurugram, Haryana, India

Hybrid

Linkedin logo

You Lead the Way. We’ve Got Your Back With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you will learn and grow as we help you create a career journey that is unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you will be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we will do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together. The Global Servicing (GS) organization delivers extraordinary customer care to Card Members, merchants and commercial clients around the world, while providing world-class credit, collections and fraud services. The GS Servicing Insights & MIS team (part of Global Servicing Enablement, GSE) is the primary point of contact for all GS information needs and is responsible for Executive Decision Support through advanced analytics and MIS. 
The team has a global footprint and this position will be based out of the American Express Service Center in Gurgaon, India Purpose of the Role: MIS and Analytics to support GS Responsibilities: · Providing Analytical & Decision Support across GS through advanced analytics (from sourcing to staging data, generating insights to exposing them for consumption via reporting platforms/strategy implementation) · Enabling business user self-service through creation of MIS capabilities · Systematically identify out of pattern activities in a timely manner and address information gaps by providing insightful analytics · Working independently assuming responsibility for the development, validation and implementation of projects · Participate on global teams evaluating processes and making suggestions for process and system improvements · Interacting with all levels of the organization across multiple time zones. Critical Factors to Success: · Ensure timely and accurate MIS based on customer requirements · Centrally manage MIS and key operational metrics and address functional data needs across operations and support teams · Provide analytical and decision support framework and address information gaps through insightful analytics and developing lead indicators · Build collaborative relationships across GS groups and participate on global teams evaluating processes and making suggestions for process and system improvements · Put enterprise thinking first, connect the role’s agenda to enterprise priorities and balance the needs of customers, partners, colleagues & shareholders Past Experience: · Preferably a minimum 2 years’ experience with at least 1 year in Quantitative Business Analysis/Data Science with experience in handling large data sets Academic Background: · Bachelor's Degree or equivalent, preferably in a quantitative field · Post-graduate degree in a quantitative field will be an added advantage Functional Skills/Capabilities: · Must possess strong quantitative 
and analytical skills and be a conceptual and innovative thinker · Project management skills and ability to identify and translate business information needs into insights and information cubes for ease of consumption in reporting and analytics · Proven thought leadership, strong communication, relationship management skills · Ability to work on multiple projects simultaneously, flexibility and adaptability to work within tight deadlines and changing priorities · Data presentation & visualization skills Technical Skills/Capabilities: · Excellent programming skills on Hive/SAS/SQL/Teradata is essential with good understanding of Big Data ecosystems · Exposure to visualization using Business Intelligence software like Tableau or Qlikview will be an added advantage Knowledge of Platforms: · Advanced knowledge of Microsoft Excel and PowerPoint, Word, Access and Project Behavioral Skills/Capabilities: Set The Agenda: Define What Winning Looks Like, Put Enterprise Thinking First, Lead with an External Perspective Bring Others With You: Build the Best Team, Seek & Provide Coaching Feedback, Make Collaboration Essential Do It The Right Way: Communicate Frequently, Candidly & Clearly, Make Decisions Quickly & Effectively, Live the Blue Box Values, Great Leadership Demands Courage We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. 
Benefits include: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 1 month ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies