
751 Teradata Jobs - Page 26

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

8.0 - 10.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site


Overview
The Data Analyst will partner closely with business and S&T teams to prepare final analysis reports for stakeholders, enabling them to make important decisions based on facts and trends, and will lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities. The role interacts with the DG, DPM, EA, DE, EDF, PO, and D&AI teams to gather historical data requirements and source data for the Mosaic AI program as it scales to new markets.

Responsibilities
- Lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities (see the reconciliation sketch after this listing).
- Partner with the FP&A Product Owner and associated business SMEs to understand and document business requirements and associated needs.
- Analyze business data requirements and translate them into a data design that satisfies local, sector, and global requirements.
- Use automated tools to extract data from primary and secondary sources.
- Use statistical tools to identify, analyse, and interpret patterns and trends in complex data sets to support diagnosis and prediction.
- Work with engineers and business teams to identify process-improvement opportunities and propose system modifications.
- Proactively identify impediments and look for pragmatic, constructive solutions to mitigate risk.
- Champion continuous improvement and drive efficiency.

Preference will be given to candidates with a functional understanding of financial concepts (P&L, Balance Sheet, Cash Flow, Operating Expense) and experience modelling data and designing data flows.

Qualifications
- Bachelor of Technology from a reputed college.
- Minimum 8-10 years of relevant work experience in data modelling/analytics.
- Minimum 5-6 years of experience navigating data in Azure Databricks, Synapse, Teradata, or similar database technologies.
- Expertise in Azure (Databricks, Data Factory, Data Lake Store Gen2).
- Proficiency in SQL and PySpark to analyse data for both development validation and operational support is critical.
- Exposure to GenAI.
- Good communication and presentation skills are a must for this role.
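For context on the reconciliation work this posting describes, here is a minimal, illustrative PySpark sketch of a source-to-target reconciliation check. The table names, the "market" key, and the "net_revenue" measure are hypothetical, not taken from the posting.

```python
# Illustrative sketch only: compare row counts and a summed measure per key
# between a raw source table and its curated target. All names hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("source-target-reconciliation").getOrCreate()

source = spark.table("raw.sales_history")      # hypothetical source table
target = spark.table("curated.sales_history")  # hypothetical curated table

recon = (
    source.groupBy("market").agg(F.count("*").alias("src_rows"),
                                 F.sum("net_revenue").alias("src_revenue"))
    .join(
        target.groupBy("market").agg(F.count("*").alias("tgt_rows"),
                                     F.sum("net_revenue").alias("tgt_revenue")),
        on="market", how="full_outer")
    .withColumn("rows_match", F.col("src_rows") == F.col("tgt_rows"))
)
# Surface only the markets where the loaded data disagrees with the source.
recon.filter(~F.col("rows_match")).show()
```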

Posted 1 month ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About This Role
Wells Fargo is seeking a Financial Crimes Associate Manager.

In This Role, You Will
- Supervise entry- to mid-level roles in transactional or tactical, less complex tasks and processes to ensure timely completion, quality, and compliance.
- Manage the implementation of procedures, controls, analytics, and trend analysis to ensure identification, prevention, detection, investigation, recovery, and government and internal reporting of financial crime activity.
- Maintain awareness of financial crimes activity companywide and ensure all issues are proactively addressed and escalated where necessary.
- Ensure compliance with regulatory requirements such as the Bank Secrecy Act, USA PATRIOT Act, and FATCA.
- Identify opportunities for process improvement and risk-control development in less complex functions.
- Manage a risk-based financial crimes program or functional area with low to moderate risk and complexity.
- Lead implementation of multiple complex initiatives with low to moderate risk.
- Make supervisory and tactical decisions and resolve issues related to team supervision, work allocation, and daily operations under direction of functional area management.
- Leverage interpretation of policies, procedures, and compliance requirements.
- Collaborate and consult with peers, colleagues, and managers.
- Ensure coordination with the team, line of business, other business units, Audit, and regulators on risk-related topics.
- Manage allocation of people and financial resources for Financial Crimes.
- Mentor and guide talent development of direct reports and assist in hiring talent.

Required Qualifications:
- 2+ years of Financial Crimes, Operational Risk, Fraud, Sanctions, Anti-Bribery, or Corruption experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.
- 1+ years of leadership experience.

Desired Qualifications:
- Hands-on experience as a people manager in a financial institution managing a team of financial-crime data quality, model development, and financial reporting analysts.
- Experience managing procedures and controls for teams that support data analytics, data platforms, and reporting.
- Demonstrated experience working in Anti-Money Laundering (AML) programs and/or data platform management for large financial institutions.
- Experience mentoring and guiding talent development of direct reports and assisting in hiring talent.
- Hands-on experience handling BSA/AML/OFAC laws and regulations and financial-crimes/regulatory/fraud-specific data.
- Knowledge of financial-crime data quality activities and controls required for large financial institutions.
- Demonstrated experience with report and dashboard creation using large data sets, including non-standard data, is desired but not mandatory.
- Team-handling experience for UAT/regression testing on data outputs involving complex data-mapping designs.
- Hands-on experience as a people manager leading a team of 15+ data analysts responsible for conducting data quality analysis to support financial crime data modelling.
- Prior experience enhancing AML monitoring models and systems, including Oracle/Actimize, using tools like advanced SQL, SAS, and Python (a small threshold-test sketch follows this listing).
- Ability to manage a team of technical analysts working on AML technology leveraging SAS/SQL/Python/Teradata, technical data-validation tools, and relevant AML technologies including Norkom, Actimize, Oracle FCCM, etc., to support technical project deliveries.
- Support AML technology initiatives during new AML product implementations as well as during technology migrations for Transaction Monitoring and Fraud Detection.
- Experience handling large technology transformation programs with phased delivery of technical deliverables/features of a Transaction Monitoring and Fraud Detection system and associated data validations/transformation logic, as well as MIS reporting using Power BI/Tableau.
- Ability to manage a team delivering AML/BSA technology projects, including AML model development, Transaction Monitoring model validations and enhancements, Critical Data Elements identification, data quality/data validation, threshold testing, MIS reporting using Tableau/Power BI, and AI/ML-based AML technology development and testing.

Posting End Date: 28 May 2025. The job posting may come down early due to the volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements
Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number: R-448974
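The threshold testing mentioned above is easy to illustrate. Below is a minimal, hedged sketch in Python of a simple alert-threshold test over a transaction extract; the file, column names, segments, and the 10,000 threshold are all hypothetical, not Wells Fargo specifics.

```python
# Illustrative sketch only: measure how many transactions would alert at a
# candidate monitoring threshold, broken down by customer segment.
import pandas as pd

txns = pd.read_csv("transactions_sample.csv")  # hypothetical extract

THRESHOLD = 10_000  # hypothetical alerting threshold under test

txns["would_alert"] = txns["amount"] >= THRESHOLD
summary = (
    txns.groupby("segment")["would_alert"]
        .agg(alert_rate="mean", alerts="sum", volume="count")
        .reset_index()
)
print(summary.sort_values("alert_rate", ascending=False))
```

In practice an analyst would repeat this over a grid of candidate thresholds and compare alert volumes against investigative capacity before recommending a change.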

Posted 1 month ago

Apply

5.0 - 14.0 years

0 Lacs

Kochi, Kerala, India

On-site


Skill: Ab Initio
Experience: 5 to 14 years
Location: Kochi (Walk-in on 14th June)

Responsibilities
- Ab Initio skills: graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis.
- Database: proficient in SQL; expert in DB load/unload utilities; relevant experience in Oracle, DB2, Teradata (preferred).
- UNIX: shell scripting (a must); Unix utilities like sed, awk, perl, python.
- Scheduling knowledge (Control-M, Autosys, Maestro, TWS, ESP).
- Project profile: at least 2-3 source systems, multiple targets, simple business transformations on daily and monthly schedules; expected to produce LLDs, work with testers, work with the PMO, develop graphs and schedules, and provide 3rd-level support.
- Hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, and Lookup (a small rollup analogue is sketched after this listing).
- Experience in finance and ideally capital markets products.
- Experience in development and support of complex frameworks to handle multiple data ingestion patterns (e.g., messaging, files, hierarchical/polymorphic XML structures), conformance of data to a canonical model, and curation and distribution of data.
- Data modeling experience creating CDMs, LDMs, and PDMs using tools like ERwin, PowerDesigner, or MagicDraw.
- Detailed knowledge of the capital markets, including derivatives products (IRS, CDS, Options, structured products) and Fixed Income products.
- Knowledge of Jenkins and CI/CD concepts.
- Knowledge of scheduling tools like Autosys and Control Center.
- Demonstrated understanding of how Ab Initio applications and systems interact with the underlying hardware ecosystem.
- Experience working in an agile project development lifecycle.
- Strong in-depth knowledge of databases and database concepts; DB2 knowledge is a plus.
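Ab Initio graphs are built in the GDE rather than written as code, so no literal graph example is possible here; as a conceptual aid only, the sketch below shows the Partition by Key + Rollup pattern the posting names, expressed in pandas with hypothetical field names.

```python
# Conceptual analogue of Ab Initio's Partition by Key + Rollup components:
# one output record per key, carrying aggregated fields. Names hypothetical.
import pandas as pd

records = pd.DataFrame({
    "account_id": ["A1", "A1", "A2", "A2", "A2"],
    "amount":     [100.0, 250.0, 75.0, 20.0, 5.0],
})

rolled_up = records.groupby("account_id", as_index=False).agg(
    txn_count=("amount", "count"),
    total_amount=("amount", "sum"),
)
print(rolled_up)
```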

Posted 1 month ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

Hyderabad

Work from Office


Overview
This role serves as an Associate Analyst on the GTM Data Analytics COE project development team. The role is a go-to resource for building and maintaining the key reports, data pipelines, and advanced analytics needed to bring insights to light for senior leaders and Sector and field end users. The COE's core competencies are mastery of data visualization, data engineering, data transformation, and predictive and prescriptive analytics.

Responsibilities
- Enhance data discovery, processes, testing, and data acquisition from multiple platforms.
- Apply detailed knowledge of PepsiCo's applications for root-cause problem solving.
- Ensure compliance with PepsiCo IT governance rules and design best practices.
- Participate in project planning with stakeholders to analyze business opportunities and define end-to-end processes.
- Translate operational requirements into actionable data presentations.
- Support data recovery and integrity issue resolution between the business and PepsiCo IT.
- Provide performance reporting for the GTM function, including ad-hoc requests, using internal shipment data systems (see the trend-summary sketch after this listing).
- Develop on-demand reports and scorecards for improved agility and visualization.
- Collate and analyze large data sets to extract meaningful insights on performance trends and opportunities.
- Present insights and recommendations to the GTM Leadership team regularly.
- Manage expectations through effective communication with headquarters partners.
- Ensure timely and accurate data delivery per service level agreements (SLAs).
- Collaborate across functions to gather insights for action-oriented analysis.
- Identify and act on opportunities to improve work delivery.
- Implement process improvements, reporting standardization, and optimal technology use.
- Foster an inclusive and collaborative environment.
- Provide baseline support for monitoring SPA mailboxes, work intake, and other ad-hoc requests and queries.

Qualifications
- Undergraduate degree in Business or a related technology field.
- 3-4 years of working experience in Power BI.
- 1-2 years of working experience in SQL and Python.

Preferred qualifications:
- Information technology or analytics experience is a plus.
- Familiarity with Power BI/Tableau, Python, SQL, Teradata, Azure, MS Fabric.
- Analytical, critical-thinking, and problem-solving skills, with great attention to detail.
- Strong time management skills and the ability to multitask, set priorities, and plan.
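As a small, hedged illustration of the shipment-data trend reporting this role describes, the Python sketch below computes a month-over-month summary; the file and column names are hypothetical.

```python
# Illustrative sketch only: a quick monthly trend pull from a shipment
# extract, of the kind used in ad-hoc GTM performance reporting.
import pandas as pd

shipments = pd.read_csv("shipments.csv", parse_dates=["ship_date"])

monthly = (
    shipments.set_index("ship_date")
             .resample("MS")["cases_shipped"].sum()   # month-start buckets
             .to_frame()
)
monthly["mom_change_pct"] = monthly["cases_shipped"].pct_change() * 100
print(monthly.tail(6).round(1))
```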

Posted 1 month ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Overview
As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data-model reuse while satisfying project requirements. The role advocates Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy that supports future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies (a star-schema sketch follows this listing).
- Govern data design/modeling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security-feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data str/cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience integrating multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment and CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
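To make the dimensional-modeling output of this role concrete, here is a minimal sketch: a tiny star-schema DDL (one dimension, one fact) emitted and applied from Python. The entity and column names are hypothetical, and the dialect is generic ANSI SQL rather than any one platform's.

```python
# Illustrative sketch only: a minimal star schema (dim + fact) applied via
# any DB-API 2.0 connection. All table and column names are hypothetical.
DDL = """
CREATE TABLE dim_product (
    product_key   INTEGER PRIMARY KEY,
    product_code  VARCHAR(32) NOT NULL,   -- natural key from the source
    category      VARCHAR(64),
    effective_dt  DATE,                   -- SCD2 validity window start
    expiry_dt     DATE                    -- SCD2 validity window end
);

CREATE TABLE fact_revenue (
    product_key   INTEGER REFERENCES dim_product (product_key),
    market_key    INTEGER,
    period_dt     DATE,
    net_revenue   DECIMAL(18, 2)
)
"""

def apply_ddl(conn) -> None:
    """Run each statement in DDL on an open DB-API connection."""
    cur = conn.cursor()
    for stmt in filter(None, (s.strip() for s in DDL.split(";"))):
        cur.execute(stmt)
    conn.commit()
```

The surrogate keys (product_key, market_key) and the SCD2 validity columns are the extensibility hooks the posting alludes to: new attributes or history rules can be added without reworking the fact table.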

Posted 1 month ago

Apply

15.0 years

0 Lacs

India

Remote


Job Title: Data Engineer Lead - AEP
Location: Remote
Experience Required: 12-15 years overall; 8+ years in data engineering; 5+ years leading data engineering teams; cloud migration and consulting experience (GCP preferred).

Job Summary:
We are seeking a highly experienced and strategic Lead Data Engineer with a strong background in leading data engineering teams, modernizing data platforms, and migrating ETL pipelines and data warehouses to Google Cloud Platform (GCP). You will work directly with enterprise clients, architect scalable data solutions, and ensure successful delivery in high-impact environments.

Key Responsibilities:
- Lead end-to-end data engineering projects, including cloud migration of legacy ETL pipelines and data warehouses to GCP (BigQuery); a minimal load-DAG sketch follows this listing.
- Design and implement modern ELT/ETL architectures using Dataform, Dataplex, and other GCP-native services.
- Provide strategic consulting to clients on data platform modernization, governance, and data quality frameworks.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders.
- Define and enforce data engineering best practices, coding standards, and CI/CD processes.
- Mentor and manage a team of data engineers; foster a high-performance, collaborative team culture.
- Monitor project progress, ensure delivery timelines, and manage client expectations.
- Engage in technical pre-sales and solutioning, driving excellence in consulting delivery.

Technical Skills & Tools:
- Cloud platforms: strong experience with Google Cloud Platform (GCP), particularly BigQuery, Dataform, Dataplex, Cloud Composer, Cloud Storage, Pub/Sub.
- ETL/ELT tools: Apache Airflow, Dataform, dbt (if applicable).
- Languages: Python, SQL, shell scripting.
- Data warehousing: BigQuery, Snowflake (optional), traditional DWs (e.g., Teradata, Oracle).
- DevOps: Git, CI/CD pipelines, Docker.
- Data modeling: dimensional modeling, Data Vault, star/snowflake schemas.
- Data governance and lineage: Dataplex, Collibra, or equivalent tools.
- Monitoring and logging: Stackdriver, Datadog, or similar.

Preferred Qualifications:
- Proven consulting experience with premium clients or Tier 1 consulting firms.
- Hands-on experience leading large-scale cloud migration projects.
- GCP certification(s) (e.g., Professional Data Engineer, Cloud Architect).
- Strong client communication, stakeholder management, and leadership skills.
- Experience with agile methodologies and project management tools like JIRA.
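A common building block in the GCS-to-BigQuery migrations this posting describes is a small Airflow DAG that loads landed files into a warehouse table. The sketch below assumes Airflow 2.x with the Google provider installed; the DAG name, bucket, object pattern, and target table are hypothetical.

```python
# Illustrative sketch only: land Parquet files in GCS, load them to BigQuery.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="legacy_dw_to_bigquery",        # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_sales = GCSToBigQueryOperator(
        task_id="load_sales",
        bucket="migration-landing-zone",   # hypothetical bucket
        source_objects=["sales/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table="analytics.sales",  # hypothetical
        write_disposition="WRITE_TRUNCATE",
    )
```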

Posted 1 month ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Description
The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. In addition, the leader must demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, take the lead in raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager Role and Responsibilities
- Represent eClerx in client pitches and external forums.
- Own platform and expertise through various COE activities and content generation to promote practice and business development.
- Lead continuous research and assessments to explore the best and latest platforms, approaches, and methodologies.
- Contribute to developing the practice area through best practices, ideas, and points of view in the form of white papers and micro-articles.
- Lead/partner in multi-discipline assessments and workshops at client sites to identify new opportunities.
- Lead key projects and provide development/technical leadership to junior resources.
- Drive solution design and build to ensure scalability, performance, and reuse.
- Design robust data architectures, considering performance, data quality, scalability, and data latency requirements.
- Recommend and drive consensus around preferred data integration and platform approaches, including Azure and Snowflake.
- Anticipate data bottlenecks (latency, quality, speed) and recommend appropriate remediation strategies.

This is a hands-on position with a significant development component, and the ideal candidate is expected to lead the technical development and delivery of highly visible and strategic projects.

Technical and Functional Skills
- Bachelor's degree, with at least 2-3 large-scale cloud implementations within the Retail, Manufacturing, or Technology industries.
- 10+ years of overall experience with data management and cloud engineering.
- Expertise in Azure Cloud, Azure Data Lake, Databricks, Snowflake, Teradata, and compatible ETL technologies.
- Strong attention to detail and ability to collaborate with multiple parties, including analysts, data subject matter experts, external labs, etc.

About Us
At eClerx, we serve some of the largest global companies, including 50 Fortune 500 clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.

About The Team
eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Essential Qualifications: Bachelor of Engineering/Technology, preferably in IT, Computer Engineering, or Electronics & Communications, or ME/MTech/MS. MBA from a top institute preferred.

Job Overview:
PGP Glass Pvt Ltd is looking for a Head of Analytics to join our growing team of data analytics experts and manage the processes and people responsible for accurate data collection, processing, modeling, and analysis. The ideal candidate has a knack for seeing solutions in sprawling data sets and the business mindset to convert insights into strategic opportunities for our company. The Head of Analytics will work closely with leaders across Product (Engineering, Manufacturing/Integrated Supply Chain), Finance, Sales, and Marketing to support and implement high-quality, data-driven decisions. They will ensure data accuracy and consistent reporting by designing and creating optimal processes and procedures for analytics employees to follow. They will use advanced data modeling, predictive modeling, and analytical techniques to interpret key findings from company data and leverage these insights into initiatives that support business outcomes. The right person for the job will apply exhaustive knowledge of data analysis to solving real-world problems faced by our company and finding opportunities for improvement across multiple projects, teams, business units, and external partners/vendors.

Responsibilities for Head of Analytics
- Lead cross-functional projects using advanced data modeling and analysis techniques to discover insights that guide strategic decisions and uncover optimization opportunities.
- Build, develop, and maintain data models, reporting systems, data automation systems, dashboards, and performance metrics that support key business decisions.
- Design and build technical processes to address business issues.
- Oversee the design and delivery of reports and insights that analyze business functions and key operations and performance metrics.
- Recruit, train, develop, and supervise analyst-level employees.
- Ensure accuracy of data and deliverables of reporting employees with comprehensive policies and processes.
- Manage and optimize processes for data intake, validation, mining, and engineering, as well as modeling, visualization, and communication deliverables.
- Examine, interpret, and report results of analytical initiatives to stakeholders in leadership, technology, sales, marketing, and product teams.
- Oversee the data/report request process: tracking requests submitted, prioritization, approval, etc.
- Develop and implement quality controls and departmental standards to meet quality standards, organizational expectations, and regulatory requirements.
- Anticipate future demands of initiatives related to people, technology, budget, and business within the department, and design/implement solutions to meet those needs.
- Organize and drive successful completion of data insight initiatives through effective management of analyst and data employees and effective collaboration with stakeholders.
- Communicate results and business impacts of insight initiatives to stakeholders within and outside of the company.

Travel: Required to travel to sites as per business needs.

Qualifications for Head of Analytics
- Working knowledge of data mining principles: predictive analytics, mapping, and collecting data from multiple data systems, on-premises and cloud-based.
- Strong SQL skills; ability to perform effective querying involving multiple tables and subqueries.
- Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analyzing data, drawing conclusions, and developing actionable recommendations for business units.
- Experience and knowledge of statistical modeling techniques: GLM multiple regression, logistic regression, log-linear regression, variable selection, etc. (a small logistic-regression sketch follows this listing).
- Experience writing advanced SAS code statements, models, and macros.
- Experience working with and creating databases and dashboards, using all relevant data to inform decisions.
- Experience using analytics techniques to contribute to company growth efforts, increasing revenue and other key business outcomes.
- Strong problem-solving, quantitative, and analytical abilities.
- Strong ability to plan and manage numerous processes, people, and projects simultaneously.
- Excellent communication, collaboration, and delegation skills.

We're looking for someone with at least 5 years of experience in a position monitoring, managing, manipulating, and drawing insights from data, and at least 3 years of experience leading a team. The right candidate will also be proficient and experienced with the following tools/programs:
- Strong programming skills with querying languages: SQL, SAS, etc.
- Experience with big data tools: Teradata, Aster, Hadoop, Spark, etc.
- Experience with data visualization tools: Tableau, RAW, Chart.js, etc.
- Experience with Adobe Analytics and other analytics tools.
- Experience with data sets and databases of structured and unstructured characteristics.
- Python, C, C++, Java, or other programming languages.
- Experience with Excel, Word, and PowerPoint.
- Cloud analytical platforms: AWS Redshift, Azure Synapse, Azure Stream Analytics.
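As a small, hedged illustration of the logistic-regression technique listed above, here is a sketch using statsmodels in Python; the dataset, the outcome, and both predictors are hypothetical, not from the posting.

```python
# Illustrative sketch only: fit a logistic regression and report odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

orders = pd.read_csv("orders.csv")  # hypothetical extract

# Model the probability of a repeat purchase from two hypothetical drivers.
model = smf.logit("repeat_purchase ~ discount_pct + delivery_days",
                  data=orders).fit()

print(model.summary())
print(np.exp(model.params))  # odds ratios are easier to present to leaders
```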

Posted 1 month ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Experience: 12+ years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures for demising First Direct legacy systems and migrating to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the below experience and qualifications:
- Proven experience as a solution architect or in a similar role.
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders.
- The capability to drive, govern, and support establishing architecture functions.
- The capability to develop system documentation by studying the source code.
- Strong understanding of architecture and IT governance frameworks.
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Delhi, India

On-site


Experience: 12+ years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures for demising First Direct legacy systems and migrating to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the below experience and qualifications:
- Proven experience as a solution architect or in a similar role.
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders.
- The capability to drive, govern, and support establishing architecture functions.
- The capability to develop system documentation by studying the source code.
- Strong understanding of architecture and IT governance frameworks.
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Greater Kolkata Area

On-site


Experience: 12+ years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures for demising First Direct legacy systems and migrating to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the below experience and qualifications:
- Proven experience as a solution architect or in a similar role.
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders.
- The capability to drive, govern, and support establishing architecture functions.
- The capability to develop system documentation by studying the source code.
- Strong understanding of architecture and IT governance frameworks.
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Experience: 12+ years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures for demising First Direct legacy systems and migrating to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the below experience and qualifications:
- Proven experience as a solution architect or in a similar role.
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders.
- The capability to drive, govern, and support establishing architecture functions.
- The capability to develop system documentation by studying the source code.
- Strong understanding of architecture and IT governance frameworks.
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Experience: 12+ years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures for demising First Direct legacy systems and migrating to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the below experience and qualifications:
- Proven experience as a solution architect or in a similar role.
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders.
- The capability to drive, govern, and support establishing architecture functions.
- The capability to develop system documentation by studying the source code.
- Strong understanding of architecture and IT governance frameworks.
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Experience: 12+ years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures for demising First Direct legacy systems and migrating to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the below experience and qualifications:
- Proven experience as a solution architect or in a similar role.
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders.
- The capability to drive, govern, and support establishing architecture functions.
- The capability to develop system documentation by studying the source code.
- Strong understanding of architecture and IT governance frameworks.
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe.

Posted 1 month ago

Apply

5.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Title: Mainframe and Teradata DataStage Associate
Key Skills: Mainframe, Teradata, DataStage

Minimum Degree Required: Bachelor's degree in computer science/IT or a relevant field
Degree Preferred: Master's degree in computer science/IT or a relevant field
Minimum Years of Experience: 2-5.5 years
Certifications Required: NA

Job Summary:
We are seeking a skilled and experienced IT professional to join our team as a Mainframe and Teradata DataStage Associate. The successful candidate will be responsible for developing, maintaining, and optimizing ETL processes using IBM DataStage, as well as managing and supporting data operations on Mainframe and Teradata platforms.

Key Responsibilities:
- Design, develop, and implement ETL processes using IBM DataStage to support data integration and transformation requirements.
- Manage and maintain data on Mainframe and Teradata systems, ensuring data integrity and performance optimization.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Troubleshoot and resolve issues related to ETL processes and data management on Mainframe and Teradata platforms.
- Monitor and tune the performance of ETL jobs and database queries to ensure optimal performance (a small tuning sketch follows this listing).
- Develop and maintain documentation related to ETL processes, data flows, and system configurations.
- Participate in code reviews and ensure adherence to best practices and coding standards.
- Provide support for data migration and integration projects, ensuring timely and accurate data delivery.
- Stay updated with the latest developments in Mainframe, Teradata, and DataStage technologies and recommend improvements.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience working with Mainframe systems, including COBOL, JCL, and VSAM.
- Experience with Teradata, including SQL development and performance tuning.
- Strong proficiency in IBM DataStage, with experience designing and implementing complex ETL processes.
- Solid understanding of data warehousing concepts and database design principles.
- Excellent problem-solving skills and the ability to work under pressure in a fast-paced environment.
- Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Experience with version control systems and agile development practices is a plus.

Preferred Qualifications:
- Experience with additional ETL tools and data integration technologies.
- Knowledge of other database systems such as Oracle, SQL Server, or DB2.
- Experience with cloud data solutions and platforms.
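For the Teradata query-tuning work mentioned above, a common first step is pulling an EXPLAIN plan. The sketch below uses Teradata's official teradatasql Python driver; the host, credentials, and table are hypothetical placeholders.

```python
# Illustrative sketch only: connect to Teradata and pull an EXPLAIN plan for
# a slow query, the usual first step when tuning. All names hypothetical.
import teradatasql

with teradatasql.connect(
    host="tdprod.example.com", user="etl_svc", password="***"
) as conn:
    with conn.cursor() as cur:
        cur.execute("EXPLAIN SELECT region, SUM(amount) "
                    "FROM sales_fact GROUP BY region")
        for row in cur.fetchall():
            print(row[0])  # each row is one step of the optimizer's plan
```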

Posted 1 month ago

Apply

0.0 - 4.0 years

0 Lacs

Gurugram, Haryana

On-site


Location: Gurugram, Haryana, India
Category: Corporate
Job ID: GGN00002020
Function: Procurement / Strategic Sourcing / Purchasing
Job Type: Full-Time
Posted Date: 05/26/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
At United, we care about our customers. To be the best airline in aviation history, we need to deliver the best service to our customers. And it takes a whole team of dedicated customer-focused advocates to make it happen! From our Contact Center to customer analytics, insights, innovation, and everything in between, the Customer Experience team delivers outstanding service and helps us to run a more customer-centric and dependable airline.

Job overview and responsibilities
The team is currently looking for a well-rounded individual who has a passion for data and analytics. The role requires supporting the team by gathering data, conducting analyses, verifying reports, and assisting in ad-hoc decision support. Excellent time management, strong analytical capabilities, and communication skills are keys to success in this role.
- Extract and analyze data from relational databases and reporting systems.
- Proactively identify problems and opportunities, and perform root-cause analysis/diagnosis leading to business impact.
- Identify issues, create hypotheses, and translate data into meaningful insights; present recommendations to key decision makers.
- Develop and maintain reports, analyses, and dashboards to drive key business decisions.
- Build predictive models and analyze results for dissemination of insights to leadership of the numerous operations groups at United (a small modeling sketch follows this listing).
- Prepare presentations that summarize data and help facilitate decision-making for business partners and the senior leadership team.

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications
What's needed to succeed (Minimum Qualifications):
- Bachelor's degree required.
- 2-4 years of analytics-related experience.
- Must be proficient in Microsoft Excel and PowerPoint.
- Must be competent in querying and manipulating relational databases via Teradata SQL Assistant, Microsoft SQL Server, or Oracle SQL Developer.
- Must be proficient in at least one quantitative analysis tool: Python / R.
- Must be familiar with one or more reporting tools: Tableau / Oracle OBIEE / TIBCO Spotfire.
- Must be detail-oriented, thorough, and analytical, with a desire for continuous improvement.
- Must be adept at juggling several projects and initiatives simultaneously through appropriate prioritization.
- Must exhibit written and spoken English fluency.
- Must be legally authorized to work in India for any employer without sponsorship.
- Successful completion of an interview is required to meet job qualifications.
- Reliable, punctual attendance is an essential function of the position.

What will help you propel from the pack (Preferred Qualifications): MBA preferred.
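As a hedged illustration of the predictive-modeling work listed above, here is a minimal scikit-learn sketch; the extract file, its columns, and the delay-prediction framing are hypothetical, not from the posting.

```python
# Illustrative sketch only: fit a simple classifier on an operations extract
# and report holdout accuracy. All names hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

flights = pd.read_csv("flight_ops_extract.csv")  # hypothetical extract

features = ["dep_hour", "turn_time_min", "bags_loaded"]
X_train, X_test, y_train, y_test = train_test_split(
    flights[features], flights["delayed"], test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```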

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Ab Initio Data Engineer
We are looking for an Ab Initio Data Engineer able to design and build Ab Initio-based applications across the Data Integration, Governance & Quality domains for Compliance Risk programs. The individual will work with Technical Leads, Senior Solution Engineers, and prospective Application Managers to build applications, roll out and support production environments leveraging the Ab Initio tech stack, and ensure the overall success of their programs. The programs are high-visibility, fast-paced key initiatives that generally aim to acquire and curate data and metadata across internal and external sources, provide analytical insights, and integrate with other Citi systems.

Technical Stack:
- Ab Initio 4.0.x software suite: Co>Op, GDE, EME, BRE, Conduct>It, Express>It, Metadata>Hub, Query>It, Control>Center, Easy>Graph
- Big Data: Cloudera Hadoop, Hive, Yarn
- Databases: Oracle 11g/12c, Teradata, MongoDB, Snowflake
- Others: JIRA, ServiceNow, Linux, SQL Developer, AutoSys, Microsoft Office

Responsibilities:
- Design and build Ab Initio graphs (both continuous and batch) and Conduct>It plans, and integrate with the portfolio of Ab Initio software.
- Build web-service and RESTful graphs and create RAML or Swagger documentation.
- Complete understanding and analytical ability regarding the Metadata Hub metamodel.
- Strong hands-on multifile-system-level programming, debugging, and optimization skills.
- Hands-on experience developing complex ETL applications.
- Good knowledge of RDBMS (Oracle), with the ability to write the complex SQL needed to investigate and analyze data issues.
- Strong UNIX shell/Perl scripting.
- Build graphs interfacing with heterogeneous data sources: Oracle, Snowflake, Hadoop, Hive, AWS S3.
- Build application configurations for Express>It frameworks: Acquire>It, Spec-To-Graph, Data Quality Assessment.
- Build automation pipelines for Continuous Integration & Delivery (CI/CD), leveraging the Testing Framework and JUnit modules, integrating with Jenkins, JIRA, and/or ServiceNow.
- Build Query>It data sources for cataloguing data from different sources.
- Parse XML, JSON, and YAML documents, including hierarchical models (a small flattening sketch follows this listing).
- Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment, demonstrating experience leveraging various Ab Initio components.
- Build AutoSys or Control Center jobs and schedules for process orchestration.
- Build BRE rulesets for reformat, rollup, and validation use cases.
- Build SQL scripts on the database, perform tuning and relational model analysis, and carry out data migrations.
- Ability to identify performance bottlenecks in graphs and optimize them.
- Ensure the Ab Initio code base is appropriately engineered to maintain current functionality and development, adhering to performance optimization, interoperability standards and requirements, and compliance with client IT governance policies.
- Build regression and functional test cases, and write user manuals for various projects.
- Conduct bug fixing, code reviews, and unit, functional, and integration testing.
- Participate in the agile development process, and document and communicate issues and bugs relative to data standards.
- Pair up with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids.
- Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment.
- Perform other duties and/or special projects as assigned.

Qualifications:
- Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics) and a minimum of 5 years of experience.
- Minimum 5 years of extensive experience in the design, build, and deployment of Ab Initio-based applications.
- Expertise in handling complex large-scale Data Lake and Warehouse environments.
- Hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities.

Education: Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting.
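The hierarchical-document parsing mentioned above maps naturally to a flattening step before load. The sketch below is illustrative only; the document shape (case, parties, accounts) is hypothetical, standing in for the canonical-model conformance work the posting describes.

```python
# Illustrative sketch only: flatten a hierarchical JSON document into one
# record per (case, party, account), a canonical-model style shape.
import json

doc = json.loads("""
{
  "case_id": "C-100",
  "parties": [
    {"name": "Acme Ltd", "accounts": [{"id": "A1"}, {"id": "A2"}]},
    {"name": "Bolt Inc", "accounts": [{"id": "B9"}]}
  ]
}
""")

rows = [
    {"case_id": doc["case_id"], "party": p["name"], "account": a["id"]}
    for p in doc["parties"]
    for a in p["accounts"]
]
for row in rows:
    print(row)
```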

Posted 1 month ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Years of Experience: Candidates with 4+ years of experience in developing and delivering scalable big data pipelines using Apache Spark and Databricks on AWS.

Position Requirements

Must Have:
- Build and maintain scalable data pipelines using Databricks and Apache Spark (a minimal pipeline sketch follows this listing).
- Develop and optimize ETL/ELT processes for structured and unstructured data.
- Knowledge of Lakehouse architecture for efficient data storage, processing, and analytics.
- Orchestrating ETL/ELT pipelines: design and manage data workflows using Databricks Workflows and the Jobs API.
- Experience with AWS data services (S3, Lambda, CloudWatch) for seamless integration.
- Performance optimization: optimize queries using pushdown capabilities and indexing strategies.
- Experience with data governance using Unity Catalog, security policies, and access controls.
- Monitor, troubleshoot, and improve Databricks jobs and clusters.
- Exposure to end-to-end implementation of migration projects to AWS Cloud.
- AWS and Python expertise with hands-on cloud development.
- Orchestration: Airflow.
- Code repositories: Git, GitHub.
- Strong SQL-writing skills.
- Cloud data migration: deep understanding of processes.
- Strong analytical, problem-solving, and communication skills.

Good To Have Knowledge / Skills:
- Experience in Teradata, DataStage, SSIS.
- Knowledge of Databricks Delta Live Tables.
- Knowledge of Delta Lake.
- Streaming: Kafka, Spark Streaming.
- CI/CD: Jenkins.
- IaC and automation: Terraform for Databricks deployment.

Professional And Educational Background: BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
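Here is a minimal, illustrative PySpark sketch of the pipeline pattern this posting names: read raw files from S3, apply a cleaning transformation, and write a Delta table. The bucket paths and columns are hypothetical, and the code assumes a Databricks-style runtime where Delta Lake is available.

```python
# Illustrative sketch only: a small S3 -> transform -> Delta pipeline.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3://raw-landing/orders/")   # hypothetical bucket

cleaned = (
    raw.dropDuplicates(["order_id"])                     # idempotent re-runs
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)                      # basic validity rule
)

cleaned.write.format("delta").mode("overwrite").save(
    "s3://curated/orders_delta/"  # hypothetical target path
)
```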

Posted 1 month ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Years of Experience: Candidates with 8+ years of experience in architecting and delivering scalable big data pipelines using Apache Spark and Databricks on AWS.

Position Requirements

Must Have:
- Design, build, and maintain scalable data pipelines using Databricks and Apache Spark.
- Good knowledge of the Medallion architecture in the Databricks Lakehouse (a bronze-to-silver sketch follows this listing).
- Develop and optimize ETL/ELT processes for structured and unstructured data.
- Implement Lakehouse architecture for efficient data storage, processing, and analytics.
- Orchestrating ETL/ELT pipelines: design and manage data workflows using Databricks Workflows and the Jobs API.
- Work with AWS data services (S3, Lambda, CloudWatch) for seamless integration.
- Performance optimization: optimize queries using pushdown capabilities and indexing strategies.
- Implement data governance with Unity Catalog, security policies, and access controls.
- Collaborate with data scientists, analysts, and engineers to enable advanced analytics.
- Monitor, troubleshoot, and improve Databricks jobs and clusters.
- Strong expertise in end-to-end implementation of migration projects to AWS Cloud.
- Awareness of data management concepts and data modelling.
- AWS and Python expertise with hands-on cloud development.
- Spark performance tuning: Core, SQL, and Streaming.
- Orchestration: Airflow.
- Code repositories: Git, GitHub.
- Strong SQL-writing skills.
- Cloud data migration: deep understanding of processes.
- Strong analytical, problem-solving, and communication skills.

Good To Have Knowledge / Skills:
- Experience in Teradata, DataStage, SSIS, Mainframe (COBOL, JCL, Zeke Scheduler).
- Knowledge of Lakehouse Federation.
- Knowledge of Delta Lake.
- Knowledge of Databricks Delta Live Tables.
- Streaming: Kafka, Spark Streaming.
- CI/CD: Jenkins.
- IaC and automation: Terraform for Databricks deployment.
- Knowledge of integrating third-party APIs with Databricks.
- Knowledge of the Transport & Mobility domain.

Professional And Educational Background: BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
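The Medallion architecture this posting names layers tables as bronze (raw, append-only), silver (validated, de-duplicated), and gold (business-level aggregates). The sketch below shows the bronze-to-silver hop only; the table names and rules are hypothetical, and a Databricks-style runtime with Delta Lake is assumed.

```python
# Illustrative sketch only: promote raw bronze records to a validated silver
# table in a Medallion layout. All names and rules hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze.events")  # raw, append-only layer

silver = (
    bronze.filter(F.col("event_id").isNotNull())   # basic validity rule
          .dropDuplicates(["event_id"])            # safe to re-run
          .withColumn("ingest_date", F.to_date("ingest_ts"))
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")
```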

Posted 1 month ago

Apply

0 years

0 Lacs

Bhubaneshwar, Odisha, India

On-site

Linkedin logo

Job Description Senior Associate / Manager / IICS Developer Job Location: Bhubaneswar The IICS Developer will be responsible for designing, developing, and implementing cloud-based ETL (Extract, Transform, Load) solutions using Informatica Intelligent Cloud Services (IICS). The role involves working with Cloud Data Integration (CDI), Cloud Application Integration (CAI), and other Informatica tools to enable seamless data movement across cloud and on-premise environments. Key Responsibilities: Design and develop ETL pipelines and data integration workflows using IICS (CDI, CAI, and Cloud Data Quality). Extract, transform, and load data across cloud platforms (AWS, Azure, GCP) and on-premise databases. Work with REST/SOAP APIs to integrate cloud applications. Optimize ETL performance and ensure efficient data processing and workflow execution. Collaborate with data architects, analysts, and business teams to gather and understand data requirements. Implement error handling, logging, and performance tuning in ETL processes. Maintain and enhance data quality and governance within the organization. Work with various databases like Snowflake, Redshift, Teradata, Oracle, SQL Server, etc. for data integration. Develop automation scripts using Unix/Linux shell scripting or Python for workflow scheduling. Required Skills & Qualifications: Technical Skills: Strong experience in IICS (CDI, CAI, CDQ) and PowerCenter. Hands-on expertise in ETL development, data transformation, and integration. Proficiency in SQL, PL/SQL, and working with relational & cloud databases (Snowflake, Redshift, Teradata, etc.). Experience with API-based integrations (REST, SOAP). Exposure to cloud platforms (AWS, Azure, GCP) and working with cloud-native databases. Strong knowledge of data warehousing concepts, ETL methodologies, and best practices. Experience with performance tuning and troubleshooting ETL workflows. Skills Required Role: IICS Developer - SA/M Industry Type: IT/Computers - Software Functional Area: IT-Software Required Education: Any Graduates Employment Type: Full Time, Permanent Key Skills IICS ETL CDI AWS Other Information Job Code: GO/JC/21456/2025 Recruiter Name: Kamlesh Kumar
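As a rough illustration of the "automation scripts for workflow scheduling" responsibility above, here is a hedged Python sketch that triggers an IICS task over REST. The endpoint paths, payload fields, and region URL are assumptions based on the general shape of the Informatica Cloud v2 API and should be verified against current Informatica documentation before use.

```python
# Hedged sketch: trigger an IICS mapping task from a scheduler via REST.
# All URLs, payload fields, and task types below are assumptions.
import requests

LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"  # assumed region URL

def run_task(username: str, password: str, task_id: str) -> None:
    # Authenticate; the response is assumed to carry a session id and server URL
    resp = requests.post(LOGIN_URL, json={"@type": "login",
                                          "username": username,
                                          "password": password})
    resp.raise_for_status()
    session = resp.json()
    headers = {"icSessionId": session["icSessionId"]}

    # Start the mapping task; the "job" resource and "MTT" task type are assumptions
    run = requests.post(f'{session["serverUrl"]}/api/v2/job',
                        headers=headers,
                        json={"@type": "job", "taskId": task_id, "taskType": "MTT"})
    run.raise_for_status()
    print("Task started:", run.json())
```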

Posted 1 month ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Corporate Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. Job Responsibilities Provide end-to-end application and infrastructure service delivery for the successful business operations of the firm. Execute policies and procedures that ensure engineering and operational stability and availability. Monitor production environments for anomalies, address issues, and drive evolution of utilization of standard observability tools. Escalate and communicate issues and solutions to the business and technology stakeholders, actively participating from incident resolution to service restoration. Lead incident, problem, and change management in support of full-stack technology systems, applications, or infrastructure. Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 3+ years applied experience. Certified software engineering professional with 8+ years applied experience. Proficiency on the AWS Cloud Platform, with system design, application/tools/interface/module development, testing, problem solving, and operational stability. Hands-on experience with infrastructure-as-code tools such as Terraform and Helm charts. Design, deploy, and manage Kubernetes clusters across various environments (on-premises, AWS cloud, hybrid) as a Kubernetes platform engineer. Experience with K8s services, deployments, failover mechanisms, networking, and security policies, including CNI plugins, ingress controllers, and service meshes. Minimum 4+ years of experience in Terraform, Python, and shell scripting technologies. Hands-on with infrastructure-as-code tools (Terraform, Helm charts); independently design, build, test, and deploy code. Hands-on experience with Continuous Integration and Delivery tools such as Jules / Spinnaker / Jenkins. Ability to set up monitoring, logging, and alerting for Kubernetes clusters using Grafana, Prometheus, and Splunk. Should have strong SQL skills; PostgreSQL, AWS RDS, Aurora, or Teradata is preferred, but experience in any other RDBMS technology would suffice. Preferred Qualifications, Capabilities, And Skills Familiarity with modern front-end technologies. Exposure to cloud technologies.
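To make the Kubernetes monitoring and alerting requirement concrete, here is a small Python sketch using the official `kubernetes` client: it scans all namespaces and flags pods that are not healthy, as a starting point for the kind of alerting this role builds out with Grafana, Prometheus, and Splunk. It assumes a reachable kubeconfig; cluster details are not from the posting.

```python
# A minimal cluster-health scan, assuming the official `kubernetes`
# Python client and a valid kubeconfig (or in-cluster credentials).
from kubernetes import client, config

def unhealthy_pods() -> list[str]:
    config.load_kube_config()  # use config.load_incluster_config() when running in a pod
    v1 = client.CoreV1Api()
    bad = []
    for pod in v1.list_pod_for_all_namespaces().items:
        # Anything outside Running/Succeeded (e.g. Pending, Failed) is flagged
        if pod.status.phase not in ("Running", "Succeeded"):
            bad.append(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")
    return bad

if __name__ == "__main__":
    for line in unhealthy_pods():
        print(line)
```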

Posted 1 month ago

Apply

5.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Expertise in data modelling tools (ER/Studio, Erwin), IDM/ARDM models, and CPG / Manufacturing / Sales / Finance / Supplier / Customer domains. Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Working knowledge of SAP.

Posted 1 month ago

Apply

7.0 - 11.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Job Description: Senior Principal Data Engineer - Data Engineering Value Proposition Responsible for building the data platform that supports data integrations: building data pipelines to share enterprise data, and designing and building cloud solutions with appropriate data access, data security, data privacy, and data governance. Lead a team of Data Engineers to maintain the platform, constantly keeping it in line with new technologies. Use agile engineering practices and various data development technologies to rapidly develop creative and efficient data products. Job Details Position Title: Senior Principal Data Engineer Career Level: P5 Job Category: Vice President Role Type: Hybrid Job Location: Bangalore About the Team: The data engineering team is a community of dedicated professionals committed to designing, building, and maintaining data platform solutions for the organization. Impact (Job Summary/Why this Role Matters) The enterprise data warehouse supports several critical business functions for the bank, including Regulatory Reporting, Finance, Risk Steering, and Customer 360. This role is vital for building and maintaining the enterprise data platform and data processes, and for supporting business objectives. Our values of inclusivity, transparency, and excellence drive everything we do. Join us and make a meaningful impact on the organization. Key Deliverables (Duties and Responsibilities) As a Senior Principal Data Engineer, you will be responsible for building and maintaining the data platform that supports data integrations: enriching data pipelines to share enterprise data, and designing, building, and maintaining data platforms such as an Enterprise Data Warehouse, Operational Data Store, or Data Marts with appropriate data access, data security, data privacy, and data governance. Demonstrate technical knowledge and leadership in software development, data engineering frameworks, and best practices. Build a strategy and execution plan for multiple programs/initiatives across the organization. Help teams in architecting and designing large-scale applications. Act as a trusted advisor to leaders (Directors / Sr. Directors) on strategic technology and data solution directions. Participate on the Change Advisory Board (CAB) and ensure effective change control is implemented for all infrastructure and/or application installations, rollbacks, and updates. Collaborate with the Data Architects, Solution Architects & Data Modelers to enhance the data platform design, constantly identify a backlog of tech debt in line with identified upgrades, and provide and implement technical solutions. Collaborate with IT and CSO teams to ensure compliance with data governance, privacy, and security policies and regulations. Manage deliverables of developers, perform design reviews, and coordinate release management activities. Drive automation, identify inefficiencies, optimize processes and data flows, and recommend improvements. Use agile engineering practices and various data development technologies to rapidly develop and implement efficient data products. Work with global technology teams across different time zones (primarily US) to deliver timely business value. Skills and Qualification (Functional and Technical Skills) Functional Skills: Leadership: Driving strategic and technical initiatives for the data engineering team; providing guidance and mentorship. Business/Domain Knowledge: Good understanding of application systems and business domains. Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders. Communication: Excellent verbal, written, and interpersonal communication skills. Problem Solving: Excellent problem-solving skills, incident management, root cause analysis, and proactive solutions to improve quality. Team Player: Support peers, team, and department management. Attention to Detail: Ensure accuracy and thoroughness in all tasks. Technical/Business Skills: Data Engineering: Experience in designing and building Data Warehouse and Business Intelligence dashboards. Extensive knowledge of data warehouse principles, design, and concepts. Technical expertise working in large-scale Data Warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server. Deep technical knowledge in data engineering frameworks and best practices. Experience with public cloud-based data platforms, especially Snowflake and AWS, and machine learning capabilities such as SageMaker and DataRobot. Data integration skills: Expertise in creating and maintaining ETL processes and architecting complex data pipelines; knowledge of data modeling techniques and high-volume ETL/ELT design. Solutions using any industry-leading ETL tools such as SAP Business Objects Data Services (BODS), Informatica Intelligent Cloud Services (IICS), or IBM DataStage. Knowledge of ELT tools such as dbt, Fivetran, and AWS Glue. Data Model: Expert knowledge of Logical and Physical Data Models using Relational or Dimensional Modeling practices, and high-volume ETL/ELT design. Expert in SQL, with development experience in at least one scripting language (Python, etc.); adept in tracing and resolving data integrity issues. Data visualization using Power BI or Tableau. Performance tuning of data pipelines and DB objects to deliver optimal performance. Excellent data analysis skills using SQL and experience in incident management techniques. Data protection/compliance standards like GDPR, CCPA, HIPAA. Experience working in the financial industry is a plus. Leadership Qualities (For People Leaders) Communication: Clearly conveys ideas and listens actively. Inspiration: Motivates and encourages the team to achieve their best. Influence: Extensive stakeholder management experience and ability to influence people; driving strategic and technical initiatives. Relationships & Collaboration Reports to: Associate Director - Data Engineering Partners: Senior leaders and cross-functional teams Leads: A team of Data Engineering associates Accessibility Needs We are committed to providing an inclusive and accessible hiring process. If you require accommodations at any stage (e.g. application, interviews, onboarding), please let us know, and we will work with you to ensure a seamless experience.
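The "tracing and resolving data integrity issues" skill above is often exercised as a reconciliation check between a source system and the warehouse. Here is a small, hedged Python sketch of such a check; it works with any DB-API compatible driver, and the table names and connections are placeholders, not specifics from the posting.

```python
# Illustrative source-vs-target row-count reconciliation. Table names are
# assumed trusted inputs here; parameterize or validate them in real use.
def reconcile_counts(src_conn, tgt_conn, src_table: str, tgt_table: str) -> bool:
    src_cur, tgt_cur = src_conn.cursor(), tgt_conn.cursor()
    src_cur.execute(f"SELECT COUNT(*) FROM {src_table}")
    tgt_cur.execute(f"SELECT COUNT(*) FROM {tgt_table}")
    src_count, tgt_count = src_cur.fetchone()[0], tgt_cur.fetchone()[0]
    if src_count != tgt_count:
        print(f"MISMATCH: {src_table}={src_count} vs {tgt_table}={tgt_count}")
        return False
    print(f"OK: {src_count} rows in both {src_table} and {tgt_table}")
    return True
```

In practice a check like this is the first step of an incident investigation; column-level checksums or sampled comparisons follow once a count mismatch localizes the problem.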

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Linkedin logo

Job Title: Python Data Engineer – AWS Job Location: Remote Job Type: Full-time Client: Direct Description We are seeking a highly skilled Python Data Engineer with deep expertise in AWS-based data solutions. This role is responsible for designing, building, and optimizing large-scale data pipelines and frameworks that power analytics and machine learning workloads. You'll lead the modernization of legacy systems by migrating workloads from platforms like Teradata to AWS-native big data environments such as EMR, Glue, and Redshift. Strong emphasis is placed on reusability, automation, observability, and performance optimization. Key Responsibilities Migration & Modernization: Build reusable accelerators and frameworks to migrate data from legacy platforms (e.g., Teradata) to AWS-native architectures such as EMR and Redshift. Data Pipeline Development: Design and implement robust ETL/ELT pipelines using Python, PySpark, and SQL on AWS big data platforms. Code Quality & Testing: Drive development standards with test-driven development, unit testing, and automated validation of data pipelines. Monitoring & Observability: Build operational tooling and dashboards for pipeline observability, including metrics tracking (latency, throughput, data quality, cost). Cloud-Native Engineering: Architect scalable, secure data workflows using AWS services like Glue, Lambda, Step Functions, S3, and Athena. Collaboration: Partner with internal product teams, data scientists, and external stakeholders to clarify requirements and drive solutions aligned with business goals. Architecture & Integration: Work with enterprise architects to evolve data architecture while integrating AWS systems with on-premise or hybrid environments securely. ML Support & Experimentation: Enable data scientists to operationalize machine learning models by providing clean, well-governed datasets at scale. Documentation & Enablement: Document solutions thoroughly and provide technical guidance and knowledge sharing to internal engineering teams. Qualifications Experience: 4+ years in technology roles, with experience in data engineering, software development, and distributed systems. Programming: Expert in Python and PySpark (Scala is a plus); deep understanding of software engineering best practices. AWS Expertise: 4+ years of hands-on experience in the AWS data ecosystem; proficient in AWS Glue, S3, Redshift, EMR, Athena, Step Functions, and Lambda; experience with AWS Lake Formation and data cataloging tools is a plus; an AWS Data Analytics or Solutions Architect certification is a strong plus. Big Data & MPP Systems: Strong grasp of distributed data processing; experience with MPP data warehouses like Redshift, Snowflake, or Databricks on AWS. DevOps & Tooling: Experience with version control (GitHub/CodeCommit) and CI/CD tools (CodePipeline, Jenkins, etc.); familiarity with containerization and deployment in Kubernetes or ECS. Data Quality & Governance: Experience with data profiling and data lineage tools; understanding of metadata management and data security best practices. Bonus: Experience supporting machine learning or data science workflows; familiarity with BI tools such as QuickSight, Power BI, or Tableau.
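For a sense of the day-to-day work this posting describes, here is a minimal AWS Glue job skeleton: read a cataloged table (for example, one landed during a Teradata migration), transform it with PySpark, and write partitioned Parquet to S3. The database, table, and bucket names are illustrative assumptions.

```python
# Minimal AWS Glue ETL job skeleton; catalog and S3 names are placeholders.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a cataloged source table (e.g. one migrated from a legacy platform)
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="orders")

# Transform with plain PySpark, then write date-partitioned Parquet to S3
df = dyf.toDF().withColumn("load_date", F.current_date())
df.write.mode("overwrite").partitionBy("load_date") \
  .parquet("s3://example-curated-bucket/orders/")

job.commit()
```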

Posted 1 month ago

Apply

1.0 - 4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Our Company: At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise. What You'll Do: The Accounting Specialist will: Ensure the monthly accounts for your assigned country are submitted accurately and on time. Ensure all adjustments are submitted and processed in a timely manner. Ensure all subledgers are interfaced correctly with the General Ledger. Research and resolve accounting issues for your assigned country. Perform account reconciliation for your assigned country. Who You'll Work With: As an experienced accountant within the Indian Finance Centre, you will play a key role in providing accounting services to our Teradata organisations in APAC or America (which may change to other regions as per business requirements). Working with the Team Leader and other team members, you will be responsible for the delivery of accurate and timely accounting information. This will often involve close collaboration with other specialized departments. Minimum Requirements: Bachelor's degree in Accounting, Finance, or another related business discipline. You have 1 to 4 years' experience within a large multinational organization, preferably within a Shared Services Centre. You have solid familiarity with Microsoft Office products and Outlook. Experience in ERPs like SAP/Oracle & HFM would be preferred. You are fluent in English. You are energetic and results-oriented, with a "can do" attitude. What You'll Bring: Ability to collaborate and partner with other team members and BUs to provide an overall superior level of service. Ability to "take the lead" in researching and resolving issues, as needed. Ability to take ownership of special projects and effectively deliver positive results. Technical and comprehensive knowledge of Finance & Accounting systems and processing. Why We Think You'll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 month ago

Apply