
787 Teradata Jobs - Page 9

JobPe aggregates listings for easy application access; you apply directly on the original job portal.

0 years

0 Lacs

Hyderabad, Telangana, India

Remote


When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing…
As part of DMBM-BI, you will help create and deliver a comprehensive measurement and reporting approach for all of Verizon Consumer Groups. In this role, you will interact with cross-functional teams throughout Verizon, bringing new experiences to life for our customers. You will support measurement and reporting for cross-functional teams as they plan, build, and launch world-class experiences, and help translate raw data into actionable insights and better experiences for our customers. Your deep knowledge of measurement solutions will help determine the implementation approaches that best meet business needs.

- Working closely with the NBx/Pega business teams to deliver reporting stories each release and, where required, build new dashboards in Tableau or Qlik Sense.
- Contributing to requirement sessions with key stakeholders and actively participating in grooming sessions with business teams.
- Defining new metrics and business KPIs.
- Creating wireframes and mockups of reporting dashboards.
- Documenting all validated standards and processes to ensure accuracy across the enterprise.
- Collaborating with cross-functional teams to resolve NBx proposition anomalies and actively contributing to production defect resolutions.
What We’re Looking For…
You are a strong collaborator who can effectively own and prioritize multiple work streams and adapt during sometimes pressured situations. You display initiative and resourcefulness in achieving goals but are comfortable brainstorming and sharing ideas in a team environment. You have excellent communication skills and the ability to speak effectively to internal and external stakeholders, and you can partner across multiple business and technology teams. You should have strong Business Intelligence and analytics experience in the CX (Customer Experience) / root-cause analytics area with attention to detail, be adaptable to change and tight deadlines, and be focused on quality, with the ability to mine, extract, transform, and load large data sets and create concise readouts and analyses based on the actionable insights found in the data.

- Bachelor’s degree and six or more years of work experience.
- Six or more years of relevant work experience.
- Experience with SQL and SQL performance tuning.
- Experience with Tableau and Qlik Sense.
- Experience with data modeling for different data sources in Tableau or Qlik Sense.
- Knowledge of Google Suite and database management systems.
- Experience with dashboard creation with insightful visualization.
- Knowledge of OneJira or any ticketing tool.

Even better if you have one or more of the following:
- Experience with third-party reporting tools (e.g., ThoughtSpot, IBM Cognos, Looker).
- Exposure to HiveQL, GCP BigQuery, Teradata, and Oracle databases.
- Basic knowledge of programming languages (e.g., VBA/Python).
- Ability to derive insights from data and recommend action.
- Knowledge of the end-to-end ETL process.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.

Where you’ll be working
In this hybrid role, you’ll have a defined work location that includes work from home and assigned office days set by your manager.
Scheduled Weekly Hours: 40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Data n’ Analytics – Data Strategy - Manager, Strategy and Transactions
EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors.

The opportunity
We’re looking for a Manager - Data Strategy. The main objective of the role is to develop and articulate a clear and concise data strategy aligned with the overall business strategy; communicate the data strategy effectively to stakeholders across the organization, ensuring buy-in and alignment; establish and maintain data governance policies and procedures to ensure data quality, security, and compliance; oversee data management activities, including data acquisition, integration, transformation, and storage; and develop and implement data quality frameworks and processes. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role works closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also onshore-facing.

Discipline: Data Strategy

Key Skills
- Strong understanding of data models (relational, dimensional), data warehousing concepts, and cloud-based data architectures (AWS, Azure, GCP).
- Proficiency in data analysis techniques (e.g., SQL, Python, R), statistical modeling, and data visualization tools.
- Familiarity with big data technologies such as Hadoop, Spark, and NoSQL databases.
- Client handling and communication, problem solving, systems thinking, passion for technology, adaptability, agility, analytical thinking, collaboration.

Skills And Attributes For Success
- 10-12 years of total experience, with 8+ years in the Data Strategy and Architecture field.
- Solid hands-on 6+ years of professional experience designing and architecting data warehouses/data lakes on client engagements and helping create enhancements to a data warehouse.
- Architecture design and implementation experience with medium to complex on-prem to cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP).
- 5+ years’ experience in Azure database offerings (relational, NoSQL, data warehouse).
- 5+ years’ experience in various Azure services preferred: Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services & Databricks.
- Minimum of 8 years of hands-on database design, modelling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse.
- Knowledge of and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes, etc.).
- Strong creative instincts related to data analysis and visualization.
- Aggressive curiosity to learn the business methodology, data model and user personas.
- Strong understanding of BI and DWH best practices, analysis, visualization, and the latest trends.
- Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management.
- Willingness to mentor team members.
- Solid analytical, technical and problem-solving skills.
- Excellent written and verbal communication skills.
- Strong project and people management skills, with experience serving global clients.

To qualify for the role, you must have
- Master’s degree in Computer Science, Business Administration or equivalent work experience.
- A fact-driven and analytical mindset with excellent attention to detail.
- Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analysing large volumes of data.
- Relevant work experience of a minimum of 12 to 14 years in a Big 4 or technology/consulting set-up.
- Ability to help incubate new finance analytics products by executing pilot and proof-of-concept projects to establish capabilities and credibility with users and clients; this may entail working either as an independent SME or as part of a larger team.

Ideally, you’ll also have
- Ability to think strategically and end-to-end with a result-oriented mindset.
- Ability to build rapport within the firm and win the trust of clients.
- Willingness to travel extensively and to work on client sites/practice office locations.
- Strong experience in SQL Server and MS Excel, plus at least one other SQL dialect, e.g., MS Access, PostgreSQL, Oracle PL/SQL or MySQL.
- Strong grounding in data structures and algorithms.
- Experience interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata, etc.
- Preferred exposure to JSON, Cloud Foundry, Pivotal, MatLab, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, Angular JS, Python, etc.
What We Look For
- A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment.
- An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide.
- Opportunities to work with EY SaT practices globally with leading businesses across a range of industries.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


When you join Verizon
You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing…
As part of DMBM-BI, you will help create and deliver a comprehensive measurement and reporting approach for all of Verizon Consumer Groups. In this role, you will interact with cross-functional teams throughout Verizon, bringing new experiences to life for our customers. You will support measurement and reporting for cross-functional teams as they plan, build, and launch world-class experiences, and help translate raw data into actionable insights and better experiences for our customers. Your deep knowledge of measurement solutions will help determine the implementation approaches that best meet business needs.

- Working closely with the NBx/Pega business teams to deliver reporting stories each release and, where required, build new dashboards in Tableau or Qlik Sense.
- Contributing to requirement sessions with key stakeholders and actively participating in grooming sessions with business teams.
- Defining new metrics and business KPIs.
- Creating wireframes and mockups of reporting dashboards.
- Documenting all validated standards and processes to ensure accuracy across the enterprise.
- Collaborating with cross-functional teams to resolve NBx proposition anomalies and actively contributing to production defect resolutions.
What We’re Looking For…
You are a strong collaborator who can effectively own and prioritize multiple work streams and adapt during sometimes pressured situations. You display initiative and resourcefulness in achieving goals but are comfortable brainstorming and sharing ideas in a team environment. You have excellent communication skills and the ability to speak effectively to internal and external stakeholders, and you can partner across multiple business and technology teams. You should have strong Business Intelligence and analytics experience in the CX (Customer Experience) / root-cause analytics area with attention to detail, be adaptable to change and tight deadlines, and be focused on quality, with the ability to mine, extract, transform, and load large data sets and create concise readouts and analyses based on the actionable insights found in the data.

- Bachelor’s degree and six or more years of work experience.
- Six or more years of relevant work experience.
- Experience with SQL and SQL performance tuning.
- Experience with Tableau and Qlik Sense.
- Experience with data modeling for different data sources in Tableau or Qlik Sense.
- Knowledge of Google Suite and database management systems.
- Experience with dashboard creation with insightful visualization.
- Knowledge of OneJira or any ticketing tool.

Even better if you have one or more of the following:
- Experience with third-party reporting tools (e.g., ThoughtSpot, IBM Cognos, Looker).
- Exposure to HiveQL, GCP BigQuery, Teradata, and Oracle databases.
- Basic knowledge of programming languages (e.g., VBA/Python).
- Ability to derive insights from data and recommend action.
- Knowledge of the end-to-end ETL process.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.

Where you’ll be working
In this hybrid role, you’ll have a defined work location that includes work from home and assigned office days set by your manager.
Scheduled Weekly Hours: 40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Lowe’s
Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.

Lowe’s India, the Global Capability Center of Lowe’s Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe’s India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India.

About The Team
A Merchandising Analyst plays a pivotal role in driving the performance of product assortments by leveraging data to optimize strategies. They are responsible for analyzing key trends across sales, margins, inventory, turnover, and other critical KPIs. By incorporating macroeconomic factors and leveraging forecasted expectations, they develop effective strategies to maximize revenue and margins, optimize inventory levels, and ensure customer needs are met efficiently. The ideal candidate should possess strong technical expertise, enabling them to conduct root cause analyses, A/B testing, hypothesis testing, and regression analysis. Their insights should translate into actionable recommendations that drive business results.
Additionally, they are expected to collaborate with cross-functional teams to integrate metrics beyond merchandising and engage stakeholders to understand and address their specific requirements effectively.

Job Summary
The primary purpose of this role is to perform mathematical and statistical analysis or model building as appropriate. This includes following analytical best practices, analyzing and reporting accurate results, and identifying meaningful insights that directly support decision making. This role provides assistance in supporting one functional area of the business in partnership with other team members. At times, this role may work directly with the business function, but the majority of time is spent working with internal team members to identify and understand business needs.

Roles & Responsibilities
Core Responsibilities:
- Conduct in-depth analysis of business trends, financial performance, and market conditions.
- Develop and maintain data models, dashboards, and reports to support business decisions.
- Identify opportunities for operational improvements and recommend strategic solutions.
- Collaborate with cross-functional teams to translate data insights into actionable strategies.
- Ensure data accuracy, integrity, and security while handling large datasets.
- Present findings and recommendations to leadership in a clear and concise manner.

Years Of Experience
1 to 3 years of experience in data analytics

Education Qualification & Certifications (optional)
Required Minimum Qualifications: Bachelor’s degree in business administration, computer science, computer information systems (CIS), engineering, or a related field (or equivalent work experience in lieu of a degree).

Skill Set Required
- Experience using basic analytical tools such as R, Python, SQL, SAS, Adobe, Alteryx, Knime, Aster.
- Experience using visualization tools such as Power BI, Tableau.

Secondary Skills (desired)
- Experience with business intelligence and reporting tools (e.g., MicroStrategy, Business Objects, Cognos, Adobe, TM1, Alteryx, Knime, SSIS, SQL Server) and enterprise-level databases (Hadoop, GCP, Azure, Oracle, Teradata, DB2).
- Experience working with big, unstructured data in a retail environment.
- Experience with analytical tools like Python, Alteryx, Knime, SAS, R, etc.
- Experience with visualization tools like MicroStrategy VI, Power BI, SAS-VA, Tableau, D3, R-Shiny.
- Programming experience using tools such as R, Python.
- Data science experience using tools such as ML, text mining.
- Knowledge of SQL.
- Project management experience.
- Experience in home improvement retail.

Lowe’s is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law. Starting rate of pay may vary based on factors including, but not limited to, position offered, location, education, training, and/or experience. For information regarding our benefit programs and eligibility, please visit https://talent.lowes.com/us/en/benefits.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Job Overview
A Sr. Data Engineer II performs development activities within a data engineering team and helps guide, onboard, and train Data Engineer I associates. You will work closely with product management, engineering, account management, ETL, data warehouse, business intelligence, and reporting teams as you develop data pipelines and enhancements and investigate and troubleshoot issues. You possess an understanding of multiple data structures, including relational and non-relational data models.

Roles And Responsibilities
- Extracting, cleansing, and loading data.
- Building data pipelines using SQL, Kafka, and other technologies.
- Investigating and documenting new data sets.
- Triaging incoming bugs and incidents.
- Performing technical operations tasks.
- Investigating and troubleshooting issues with data and data pipelines.
- Participating in sprint refinement, planning, and kick-off to help estimate stories and raise awareness of additional implementation details.
- Helping monitor areas of the data pipeline and raising awareness to the team when issues arise.
- Performing and implementing new quality assurance rules to maintain consistent and accurate data.

Knowledge
- A solid understanding of data science concepts is required.
- Data analysis expertise.
- Working knowledge of ETL tools.
- Knowledge of BI tools.
- Handling DevOps tasks is preferable.
- Experience with Big Data technologies such as Hadoop and Kafka.
- Extensive experience with ML frameworks and libraries including TensorFlow, Spark, PyTorch, and MLPACK.

Skills (Technical)
- Experience designing and implementing a full-scale data warehouse solution based on Snowflake.
- A minimum of three years’ experience developing production-ready data ingestion and processing pipelines using Java, Spark, Scala, and Python.
- Experience with complex data warehouse solutions on Teradata, Oracle, or DB2 platforms, with 2 years of hands-on experience.
- Expertise and excellent proficiency with Snowflake internals and integration of Snowflake with other technologies for data processing and reporting.
- Experience with or knowledge of Excel and analytical tools such as Tableau, MicroStrategy, or Power BI would be an added advantage.

Abilities (Competencies)
- Works independently.
- Collaborates with team members.
- Self-motivated.

Typical Experience: 4 - 6 years

Posted 1 week ago

Apply

0 years

2 - 3 Lacs

Hyderabad

On-site


Job Description

About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview
DAIT (Data Analytics and Insights Technology) provides end-to-end technology solutions for multiple lines of business.
Job Description
The individual will be part of the Production Support L2 team (Batch Operations), with technical expertise in Hadoop, Teradata, DataStage, Autosys, and Linux. The role is responsible for platform stability, proactive application and job monitoring, issue management and resolution, triage, reporting and timely escalation, as well as break-fix activities that require reviewing Root Cause Analysis documents, making small code changes, reviewing unit test results, and helping deploy to production following the release management and code deployment process. The ideal candidate must be highly self-motivated and proactive, with attention to detail and good documentation and communication skills to interact with partners such as TI, Application, other production support teams (CCO, L1, L2, L3), and business stakeholders as required, along with the ability to identify process improvements that improve platform stability and resiliency.

Responsibilities
- Monitor and support applications to meet SLAs 100% of the time.
- Provide on-call support.
- Triage production tickets and issues.
- Prepare Root Cause Analysis (RCA) documents.
- Partner with the Application team, CCO, L1, and Level 2 support teams to resolve issues.
- Prepare and/or review impact analyses based on issue analysis.
- Handle Batch Ops (L1/L2) and L3 support workload hands-on.
- Write scripts to automate mundane daily BAU tasks.
- Provide support after office hours and on weekends, and stay on call when the business needs it.
- Identify root causes in the code and perform break-fix activities in the code and/or DB.
- Work on additional projects to improve production efficiency as well as reduce risk.

Requirements
Education: B.E./B.Tech/M.E./M.Tech/B.Sc./M.Sc./BCA/MCA (IT/CS specialization preferred)
Certifications, if any: BFSI domain certifications (not mandatory)
Experience Range: 6-10 years

Foundational skills:
- Experience in Big Data (Hadoop).
- Experience in UNIX and shell scripting.
- Experience in ETL (DataStage/Informatica).
- Experience in databases (Oracle/Exadata), Teradata, and DB2.
- Experience in job scheduling tools like Autosys.
- Awareness of ITIL concepts like Incident and Problem Management.
- Experience in application development or production support (preferably in batch processing, scheduling, monitoring, and triaging).

Desired skills:
- Experience with Hadoop architecture, HIVE, Impala, and coding in Python.
- Experience in DataStage 11.7 and above.
- Working experience with SQL, Teradata, Oracle, and DB2.

Work Timings: 06:30 a.m. to 03:30 p.m. and 11:30 a.m. to 08:30 p.m.
Job Location: Chennai

Posted 1 week ago

Apply

0 years

3 - 9 Lacs

Indore

On-site


AV-281592 | Indore, Madhya Pradesh, India | Full-time | Permanent | Global Business Services | DHL INFORMATION SERVICES (INDIA) LLP

Your IT Future, Delivered
Solutions Architect

With a global team of 5600+ IT professionals, DHL IT Services connects people and keeps the global economy running by continuously innovating and creating sustainable digital solutions. We work beyond global borders and push boundaries across all dimensions of logistics. You can leave your mark shaping the technology backbone of the biggest logistics company of the world. All our locations have earned #GreatPlaceToWork certification, reflecting our commitment to exceptional employee experiences. Digitalization. Simply delivered.

At IT Services, we are passionate about solution architecture in the data warehouse and business intelligence space. Our Customer Service Complex Data Solution team is continuously expanding. No matter your level of solution architecture proficiency, you can always grow within our diverse environment. #DHL #DHLITServices #GreatPlace #ppmt #Kart #cscombine

Grow together.
We strive to deliver efficient and optimized business solutions in the area of Customer Service Complex Data Solutions for our business. You will work as a Solutions Architect for existing and new applications, providing end-to-end architecture expertise on a wide range of technologies like Azure Cloud, Python, Snowflake, Teradata, Power BI, Matillion and many more. You will be our main architect providing guidance and direction on the implementation of Application Solutioning & Design, Analytics, Data Warehousing & Reporting products. You will ensure that the Analytics & Reporting solutions meet the required performance benchmarks and adhere to standards and guidelines. You will guide the development team with technical expertise to ensure business requirements are implemented as expected.
This means you sometimes have to get down to coding and provide a solution or a high-level approach to achieve a requirement and give direction to the development team. You will work with project teams to ensure business requirements are delivered with the end-to-end solution and application/data architecture in mind. You will get to work with some complex data structures that will need your expertise in data modelling and design, and you will be involved in optimizing the performance and resource utilization of existing solutions. As a senior member of the team, you will collaborate with business users on requirements, ensure that requirements are well defined before assigning them for development, and lead discussions with the business during UAT defect reviews. You will be working on the latest technologies like Snowflake, Matillion, Teradata, ERWIN, microservices, data pipelines, Jenkins, Jira/Confluence, Splunk, etc. You will get ample opportunities to grow within the organization and, with a focus on continuous learning, the opportunity to work with and learn many different technologies.

Ready to embark on the journey? Here’s what we are looking for:
As a Solution Architect, you are well versed in architecture design and software development, especially in Python, with familiarity with development frameworks as well as analytics and problem-solving skills. Excellent skills in relating the latest technology to business knowledge of the customer service experience are a huge plus. Very good knowledge of data modeling is an integral part of this role, as is experience implementing customer-facing applications. Experience as part of an Agile/Scrum team is useful. You are a business intelligence technology aficionado, so you have a good understanding of the latest analytics skill sets; experience implementing MVPs and POCs through rapid prototyping, including in the AI space and new technology adoption, is good to have.
You are able to work independently and to prioritize and organize your tasks under time and workload pressure. Working in a multinational environment, you can expect cross-region collaboration with teams around the globe, so being advanced in spoken and written English will certainly be useful. Basic certification or knowledge of AWS, Azure, Snowflake, Teradata or Power BI is a plus. An array of benefits for you: Hybrid work arrangements to balance in-office collaboration and home flexibility. Annual Leave: 42 days off apart from public/national holidays. Medical Insurance: self + spouse + 2 children, with an option to opt for voluntary parental insurance (parents/parents-in-law) at a nominal premium covering pre-existing diseases. In-house training programs: professional and technical training certifications.

Posted 1 week ago

Apply

0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Linkedin logo

Data Modeller JD We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices. Data Modelling: Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards. Create conceptual, logical, and physical data models that support the bank's strategic objectives. Ensure data models are optimized for performance, security, and scalability to support business operations and analytics. Collaboration with Data Architect: Work closely with the Data Architect to establish the overall data architecture strategy and framework. Contribute to the definition of data model structures within a data mesh environment. Data Quality and Governance: Ensure data quality and integrity in the data models by implementing best practices in data governance. Assist in the establishment of data management policies and standards. Conduct regular data audits and reviews to ensure data accuracy and consistency across systems. Data Modelling Tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar tools. Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j Graph. Data Warehousing Technologies: Snowflake, Teradata, or similar. ETL Tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar. Big Data Technologies: Hadoop, Spark (optional but preferred). Cloud Technologies: Experience with data modelling on cloud platforms, e.g. Microsoft Azure (Synapse, Data Factory).
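The dimensional modelling the posting describes can be sketched as a small star schema. This is a minimal illustration using SQLite as a stand-in database; the table and column names (a CASA balance fact with customer and product dimensions) are hypothetical assumptions, not taken from any actual banking model.

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
# All names are illustrative assumptions for a CASA balance data mart.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, segment TEXT);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, product_type TEXT);
CREATE TABLE fact_balance (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    balance_date TEXT,
    balance      REAL
);
INSERT INTO dim_customer VALUES (1, 'Asha', 'RETAIL');
INSERT INTO dim_product  VALUES (10, 'CASA');
INSERT INTO fact_balance VALUES (1, 10, '2024-01-31', 2500.0);
""")

# A typical analytical query the model must support:
# balances aggregated by customer segment and product type.
cur.execute("""
SELECT c.segment, p.product_type, SUM(f.balance)
FROM fact_balance f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_product  p ON p.product_key  = f.product_key
GROUP BY c.segment, p.product_type
""")
rows = cur.fetchall()
print(rows)  # [('RETAIL', 'CASA', 2500.0)]
```

The same shape (narrow conformed dimensions, additive facts) carries over to Snowflake or Teradata; only the DDL dialect changes.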

Posted 1 week ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Introduction Software Developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today - planes and trains take off on time, bank transactions complete in the blink of an eye, and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of. Your Role And Responsibilities This candidate is responsible for DB2 installation and configuration in the following environments: on-prem, multi-cloud, Red Hat OpenShift clusters, HADR, and both non-DPF and DPF. Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2). Create high-level and detail-level designs, maintaining product roadmaps that cover both modernization and leveraging cloud solutions. Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML. Perform health checks of the databases at the database and system level, make recommendations, and deliver tuning. Deploy DB2 databases as containers within Red Hat OpenShift clusters. Configure containerized database instances, persistent storage, and network settings to optimize performance and reliability. Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives. 
Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources). Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse. Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams. Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms. Participate in the development of architecture governance processes and promote best practices across the organization. Communicate complex technical concepts to both technical and non-technical stakeholders. Required Technical And Professional Expertise 8+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms. Strong proficiency in DB2, SQL and Python. Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark). Database migration project experience from one database to another (target database Db2). Experience deploying DB2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability. 
Excellent communication, collaboration, problem-solving, and leadership skills. Preferred Technical And Professional Experience Experience with machine learning environments and LLMs. Certification in IBM watsonx.data or related IBM data and AI technologies. Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake). Exposure to implementing, or an understanding of, DB replication processes. Experience integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures). Experience with NoSQL databases (e.g., MongoDB, Cassandra).
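One small piece of the database-migration work described above (e.g. Teradata to Db2) can be sketched as a column-type translation step. The mapping table here is a simplified assumption for illustration, not IBM's official conversion reference; real migrations also handle defaults, character sets, and constraints.

```python
# Hypothetical, simplified Teradata -> Db2 column-type mapping.
# Only a handful of types are shown; unknown types pass through unchanged.
TYPE_MAP = {
    "BYTEINT": "SMALLINT",   # Db2 has no 1-byte integer type
    "VARCHAR": "VARCHAR",
    "TIMESTAMP": "TIMESTAMP",
    "NUMBER": "DECIMAL",
}

def translate_column(name: str, td_type: str) -> str:
    """Return a Db2-style column definition for a Teradata column type."""
    base = td_type.split("(")[0].upper()
    suffix = td_type[len(base):]          # keep any length/precision, e.g. (20)
    db2_type = TYPE_MAP.get(base, base)   # fall back to the original type name
    return f"{name} {db2_type}{suffix}"

print(translate_column("txn_flag", "BYTEINT"))     # txn_flag SMALLINT
print(translate_column("acct_no", "VARCHAR(20)"))  # acct_no VARCHAR(20)
```

In practice this kind of rule table would be driven by the source system catalog (DBC tables on Teradata) rather than hard-coded.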

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Req ID: 328481 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal: ETL code development, unit testing, source code control, technical specification writing and production implementation. Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal and Linux. Interact with BAs and other teams for requirement clarification, query resolution, testing and sign-offs. Develop software that conforms to a design model within its constraints. Prepare documentation for design, source code, and unit test plans. Ability to work as part of a global development team. Should have good knowledge of the healthcare domain and data warehousing concepts. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. 
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Req ID: 328478 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal: ETL code development, unit testing, source code control, technical specification writing and production implementation. Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal and Linux. Interact with BAs and other teams for requirement clarification, query resolution, testing and sign-offs. Develop software that conforms to a design model within its constraints. Prepare documentation for design, source code, and unit test plans. Ability to work as part of a global development team. Should have good knowledge of the healthcare domain and data warehousing concepts. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. 
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 week ago

Apply

4.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Role: Data Scientist with Gen AI Shift Time: 2.00 PM - 10.00 PM Location: Bangalore/Hyderabad/Chennai Experience: 4 to 12 Years Work Mode: Hybrid (3 days work from office) Notice Period: Immediate to 10 days Mandatory Skills: Data Scientist, Gen AI, RAG, LLM, (Python OR Java), (Oracle OR SQL). Required Skills · B.S./B.Tech in a Science, Technology, Engineering, Mathematics (STEM) or Economics-related field of study. · 3 or more years with relational or NoSQL databases (Oracle, Teradata, SQL Server, Hadoop, ELK, etc.). · 3 or more years working with languages such as R, Python or Java. · 3 or more years working with Gen AI (LLMs, RAG patterns, foundational models). · 3 or more years working with advanced statistical methods such as regressions, classifiers, recommenders, anomaly detection, optimization algorithms, tree methods, neural nets, etc. · Experience presenting and delivering results to business partners. TIAA reviews/approvals
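The RAG pattern named in the mandatory skills has a retrieval step at its core: rank candidate documents against the query, then stuff the best match into the LLM prompt. A minimal pure-Python sketch of that step follows, using bag-of-words cosine similarity as a stand-in; production systems would use embeddings and a vector store, and all names here are illustrative.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    qv = Counter(query.lower().split())
    ranked = sorted(docs, key=lambda d: cosine(qv, Counter(d.lower().split())),
                    reverse=True)
    return ranked[:k]

docs = [
    "Teradata supports large scale data warehousing",
    "Anomaly detection flags unusual transactions",
]
context = retrieve("how to detect anomalies in transactions", docs)
# The retrieved chunk would then be injected into the generation prompt:
prompt = f"Answer using this context: {context[0]}"
```

Swapping `cosine` over word counts for embedding similarity (and `docs` for a vector index) turns this into the usual RAG retrieval stage.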

Posted 1 week ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Be able to align data models with business goals and enterprise architecture Collaborate with Data Architects, Engineers, Business Analysts, and Leadership teams Lead data modelling, governance discussions and decision-making across cross-functional teams Proactively identify data inconsistencies, integrity issues, and optimization opportunities Design scalable and future-proof data models Define and enforce enterprise data modelling standards and best practices Experience working in Agile environments (Scrum, Kanban) Identify impacted applications, size capabilities, and create new capabilities Lead complex initiatives with multiple cross-application impacts, ensuring seamless integration Drive innovation, optimize processes, and deliver high-quality architecture solutions Understand business objectives, review business scenarios, and plan acceptance criteria for proposed solution architecture Discuss capabilities with individual applications, resolve dependencies and conflicts, and reach agreements on proposed high-level approaches and solutions Participate in Architecture Review, present solutions, and review other solutions Work with Enterprise architects to learn and adopt standards and best practices Design solutions adhering to applicable rules and compliances Stay updated with the latest technology trends to solve business problems with minimal change or 
impact Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications Undergraduate degree or equivalent experience 8+ years of proven experience in a similar role, leading and mentoring a team of architects and technical leads Extensive experience with Relational, Dimensional, and NoSQL Data Modelling Experience in driving innovation, optimizing processes, and delivering high-quality solutions Experience in large-scale OLAP, OLTP, and hybrid data processing systems Experience in complex initiatives with multiple cross-application impacts Expert in Erwin for Conceptual, Logical, and Physical Data Modelling Expertise in Relational Databases, SQL, indexing and partitioning for databases like Teradata, Snowflake, Azure Synapse or traditional RDBMS Expertise in ETL/ELT architecture, data pipelines, and integration strategies Expertise in Data Normalization, Denormalization and Performance Optimization Exposure to cloud platforms, tools, and AI-based solutions Solid knowledge of 3NF, Star Schema, Snowflake schema, and Data Vault Exposure to Java, Python, Spring, Spring Boot framework, SQL, MongoDB, Kafka, React JS, Dynatrace, and Power BI Knowledge of Azure Platform as a Service (PaaS) offerings (Azure Functions, App Service, Event Grid) Good knowledge of the latest happenings in the technology world Advanced SQL skills for complex queries, stored procedures, indexing, partitioning, macros, 
recursive queries, query tuning and OLAP functions Understanding of Data Privacy Regulations, Master Data Management, and Data Quality Proven excellent communication and leadership skills Proven ability to think from a long-term perspective and arrive at intentional and strategic architecture Proven ability to provide consistent solutions across Lines of Business (LOB) At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
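The recursive queries called out in the qualifications above follow the ANSI `WITH RECURSIVE` construct supported by Teradata, Snowflake and most RDBMSs. A minimal sketch, with SQLite standing in and an invented three-row reporting hierarchy:

```python
import sqlite3

# Walk a reporting hierarchy with a recursive CTE.
# SQLite is used as a stand-in; the SQL is the standard ANSI form.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE employee (id INTEGER, name TEXT, manager_id INTEGER);
INSERT INTO employee VALUES (1, 'Priya', NULL), (2, 'Rahul', 1), (3, 'Meena', 2);
""")
cur.execute("""
WITH RECURSIVE chain(id, name, depth) AS (
    -- anchor: the root of the hierarchy
    SELECT id, name, 0 FROM employee WHERE manager_id IS NULL
    UNION ALL
    -- recursive step: attach each direct report one level deeper
    SELECT e.id, e.name, c.depth + 1
    FROM employee e JOIN chain c ON e.manager_id = c.id
)
SELECT name, depth FROM chain ORDER BY depth
""")
rows = cur.fetchall()
print(rows)  # [('Priya', 0), ('Rahul', 1), ('Meena', 2)]
```

The same pattern handles bill-of-materials explosions and org rollups; on Teradata the tuning concern is usually bounding the recursion depth.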

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

There is a job opening for a Data Analyst at Tata Consultancy Services. Experience: 5+ years. Location: Mumbai. JD - Skills: Teradata/SQL/Python with data transformation, implementation, and data management frameworks. • Understanding and clarifying the business need/opportunity/problem • Data collection and preparation as needed for analysis • Perform data mining and exploratory data analysis to identify patterns, trends, and outliers within datasets • Create visualizations (charts, graphs, dashboards) to communicate findings • Develop reports and presentations to present findings to stakeholders • Develop recommendations based on data findings to improve business performance • Communicate findings and recommendations to stakeholders
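The outlier-identification step in the exploratory analysis above can be sketched with a simple z-score rule. This is one common heuristic, not the posting's prescribed method, and the sample data is invented for illustration.

```python
import statistics

def find_outliers(values: list[float], z: float = 3.0) -> list[float]:
    """Flag values more than z sample standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > z * stdev]

# Hypothetical daily sales figures; the 500 is an obvious anomaly.
daily_sales = [100, 102, 98, 101, 99, 500]
print(find_outliers(daily_sales, z=1.5))  # [500]
```

In practice analysts pair a rule like this with visual checks (box plots, scatter plots), since a single extreme value inflates the standard deviation itself.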

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer and community banking- Data technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job Responsibilities Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Develops secure high-quality production code, and reviews and debugs code written by others Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 3+ years applied experience Experience in software engineering, including hands-on expertise in ETL/Data pipeline and data lake platforms like Teradata and Snowflake Hands-on practical experience delivering system design, application development, testing, and operational stability Proficiency in AWS services especially in Aurora Postgres RDS Proficiency in automation and continuous delivery methods Proficient in all aspects of the Software Development Life Cycle 
Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) In-depth knowledge of the financial services industry and their IT systems Preferred Qualifications, Capabilities, And Skills Experience in re-engineering and migrating on-premises data solutions to and for the cloud Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure Experience in building on emerging cloud serverless managed services to minimize/eliminate physical/virtual server footprint Advanced in Java, plus Python (nice to have)

Posted 1 week ago

Apply

6.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions. Key Responsibilities: Design, implement, and optimize scalable data pipelines and services using GCP Build and manage cloud-native applications deployed via Cloud Run Develop complex and performance-optimized SQL queries for analytics and data transformation Manage and automate data storage, retrieval, and archival using Cloud Storage Implement event-driven architectures using Google Pub/Sub Work with large datasets in BigQuery, including ETL/ELT design and query optimization Ensure security, monitoring, and compliance of cloud-based systems Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions Required Skills & Experience: 3+ years of experience working with Google Cloud Platform (GCP) Strong proficiency in SQL coding, query tuning, and handling complex data transformations Hands-on experience with: BigQuery, Cloud Run, Cloud Storage, Pub/Sub Understanding of data pipeline and ETL/ELT workflows in cloud environments Familiarity with containerized services and CI/CD pipelines Experience in scripting languages (e.g., Python, Shell) is a plus Strong analytical and problem-solving skills
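The event-driven pattern above typically looks like: Pub/Sub pushes a message to a Cloud Run service, which decodes the base64 payload and shapes a row for BigQuery. A minimal sketch of that decode-and-shape step follows; the field names are hypothetical and the actual BigQuery client call is omitted so the logic stays self-contained.

```python
import base64
import json

def handle_push(envelope: dict) -> dict:
    """Decode a Pub/Sub push envelope and shape a row for insertion.

    Pub/Sub push delivery wraps the payload as envelope["message"]["data"],
    base64-encoded. The output row schema here is an invented example.
    """
    payload = base64.b64decode(envelope["message"]["data"]).decode("utf-8")
    event = json.loads(payload)
    return {
        "user_id": event["user_id"],
        "event_type": event.get("event_type", "unknown"),
    }

# Simulate what Pub/Sub would POST to the Cloud Run endpoint.
raw = json.dumps({"user_id": 7, "event_type": "click"}).encode("utf-8")
msg = {"message": {"data": base64.b64encode(raw).decode("ascii")}}
row = handle_push(msg)
print(row)  # {'user_id': 7, 'event_type': 'click'}
```

In a real service, `row` would go to `bigquery.Client().insert_rows_json(...)` (or a batch load), and the handler would return 204 so Pub/Sub acknowledges the message.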

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Chennai

Work from Office

Naukri logo

Job Title: GCP Teradata Engineer Location: Chennai, Bangalore, Hyderabad Experience: 4-6 Years Job Summary: We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions. Key Responsibilities: Design, implement, and optimize scalable data pipelines and services using GCP Build and manage cloud-native applications deployed via Cloud Run Develop complex and performance-optimized SQL queries for analytics and data transformation Manage and automate data storage, retrieval, and archival using Cloud Storage Implement event-driven architectures using Google Pub/Sub Work with large datasets in BigQuery, including ETL/ELT design and query optimization Ensure security, monitoring, and compliance of cloud-based systems Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions Required Skills & Experience: 4 years of experience working with Google Cloud Platform (GCP) Strong proficiency in SQL coding, query tuning, and handling complex data transformations Hands-on experience with: BigQuery, Cloud Run, Cloud Storage, Pub/Sub Understanding of data pipeline and ETL/ELT workflows in cloud environments Familiarity with containerized services and CI/CD pipelines Experience in scripting languages (e.g., Python, Shell) is a plus Strong analytical and problem-solving skills

Posted 1 week ago

Apply

2.0 - 4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

The Applications Development Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas to identify and define necessary system enhancements Identify and analyze issues, make recommendations, and implement solutions Utilize knowledge of business processes, system processes, and industry standards to solve complex issues Analyze information and make evaluative judgements to recommend solutions and improvements Conduct testing and debugging, utilize script tools, and write basic code for design specifications Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures Develop working knowledge of Citi's information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Job Description: The role requires advanced skills and very good working knowledge of SQL with Tableau (BI visualization). The resource should be capable of performance tuning SQL and ready to serve in an L3 support role for all data-related concerns and reporting server issues. 
Requires very good knowledge of developing Tableau dashboards and reports. Past experience with databases such as Netezza (NPS), Teradata, Oracle, Big Data Hadoop, and Unix scripting. Utilize knowledge of applications development procedures and concepts and other technical aspects to identify and define requirements to enhance the system. Identify and analyze issues, make recommendations, and implement solutions. Utilize knowledge of business processes, system processes, and industry standards to solve complex issues. The resource should have experience with the SDLC deployment cycle and Agile methodology, and good conceptual and working knowledge of DevOps tools such as Jira, Bitbucket, Jenkins, etc. The candidate will extend support during weekend release windows and should be willing to upskill in new tech stacks for project needs. Qualifications: 2-4 years of relevant experience as a Tableau Developer with strong SQL query-writing skills. Strong knowledge of basic/advanced SQL in any database. Tableau BI visualization tool experience required. Database knowledge of Netezza (NPS), Teradata, Oracle, Big Data Hadoop, Unix scripting. Experience in programming/debugging used in business applications. Working knowledge of industry practice and standards. Comprehensive knowledge of the specific business area for application development. Working knowledge of programming languages. Consistently demonstrates clear and concise written and verbal communication. Professional SQL/Tableau certifications are good to have. Education: Bachelor's degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. 
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

Design, develop, and maintain applications using COBOL, JCL, and SQL. Collaborate with cross-functional teams to define, design, and ship new features. Troubleshoot and resolve application issues and bugs. Design, implement, and optimize SQL queries and database structures. Required Candidate profile Ensure the performance, quality, and responsiveness of applications using COBOL and JCL. Develop, maintain, and support mainframe applications using COBOL and JCL. Location: Hyderabad, Pune, Bengaluru, Chennai

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Naukri logo

An ETL Tester (4+ years, must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems, which can be in the cloud or on premises. They work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of the ETL processes, and they understand cloud architecture and design test strategies for data moving in and out of cloud systems. Roles and Responsibilities: Strong in data warehouse testing - ETL and BI. Strong database knowledge: Oracle/SQL Server/Teradata/Snowflake. Strong SQL skills with experience in writing complex data validation SQL. Experience working in an Agile environment. Experience creating test strategies, release-level test plans and test cases. Develop and maintain test data for ETL testing. Design and execute test cases for ETL processes and data integration. Good knowledge of Rally, Jira and HP ALM. Experience in automation testing and data validation using Python. Document test results and communicate with stakeholders on the status of ETL testing. Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle/SQL Server/Teradata/Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
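The data-validation work described above usually starts with reconciliation checks: compare row counts and a column checksum between source and target. A minimal sketch, with SQLite standing in for Oracle/Teradata/Snowflake and illustrative table names:

```python
import sqlite3

# Two tables stand in for the source system and the ETL target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src (id INTEGER, amount REAL);
CREATE TABLE tgt (id INTEGER, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.5);
INSERT INTO tgt VALUES (1, 10.0), (2, 20.5);
""")

def table_profile(table: str) -> tuple:
    """Row count and a simple sum-based checksum for one table."""
    cur.execute(f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {table}")
    return cur.fetchone()

src_profile, tgt_profile = table_profile("src"), table_profile("tgt")
assert src_profile == tgt_profile, f"mismatch: {src_profile} vs {tgt_profile}"
print("counts and checksums match:", src_profile)  # (2, 30.5)
```

Real suites extend this with per-key MINUS/EXCEPT queries to pinpoint which rows diverge, and wire the checks into Jira/Rally via the automation framework.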

Posted 1 week ago


8.0 - 12.0 years

22 - 27 Lacs

Indore, Chennai

Work from Office


We are hiring a Senior Python DevOps Engineer to develop scalable apps using Flask/FastAPI, automate CI/CD, manage cloud and ML workflows, and support containerized deployments in OpenShift environments.

Required Candidate Profile
8+ years in Python DevOps with expertise in Flask, FastAPI, CI/CD, cloud, ML workflows, and OpenShift. Skilled in automation, backend optimization, and global team collaboration.
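The CI/CD automation this role calls for usually reduces to running ordered pipeline stages and stopping on the first failure. A minimal, hedged sketch of that pattern follows; the stage names and commands are invented for illustration (a real pipeline would invoke pytest, container builds, `oc apply`, and so on).

```python
import subprocess
import sys

def run_pipeline(stages):
    """Run named command stages in order; stop and report on first failure."""
    for name, cmd in stages:
        proc = subprocess.run(cmd, capture_output=True, text=True)
        if proc.returncode != 0:
            return {"failed_stage": name, "stderr": proc.stderr}
    return {"failed_stage": None}

# Hypothetical stages using the current Python interpreter as a stand-in.
stages = [
    ("lint",  [sys.executable, "-c", "print('lint ok')"]),
    ("tests", [sys.executable, "-c", "import sys; sys.exit(0)"]),
]
outcome = run_pipeline(stages)
print(outcome)
```

Fail-fast ordering like this is the core of most CI systems; the tooling around it mainly adds logging, retries, and parallelism.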

Posted 1 week ago


3.0 - 10.0 years

15 - 16 Lacs

Hyderabad

Work from Office


As a Software Engineer III at JPMorgan Chase within Consumer and Community Banking - Data Technology, you serve as a seasoned member of an agile team that designs and delivers trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
Executes creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
Develops secure, high-quality production code, and reviews and debugs code written by others
Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies

Required qualifications, capabilities, and skills
Formal training or certification in software engineering concepts and 3+ years of applied experience
Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms such as Teradata and Snowflake
Hands-on practical experience delivering system design, application development, testing, and operational stability
Proficiency in AWS services, especially Aurora PostgreSQL RDS
Proficiency in automation and continuous delivery methods
Proficient in all aspects of the Software Development Life Cycle
Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security
Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile)
In-depth knowledge of the financial services industry and its IT systems

Preferred qualifications, capabilities, and skills
Experience re-engineering and migrating on-premises data solutions to and for the cloud
Experience with Infrastructure as Code (Terraform) for cloud-based data infrastructure
Experience building on emerging cloud serverless managed services to minimize or eliminate the physical/virtual server footprint
Advanced in Java, plus Python (nice to have)

Posted 1 week ago


3.0 - 10.0 years

15 - 16 Lacs

Hyderabad

Work from Office


We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Data Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted, market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
Executes creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
Develops secure, high-quality production code, and reviews and debugs code written by others
Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
Becomes a technical mentor in the team

Required qualifications, capabilities, and skills
Formal training or certification in software engineering concepts and 5+ years of applied experience
Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms such as Teradata and Snowflake
Hands-on practical experience delivering system design, application development, testing, and operational stability
Proficiency in AWS services, especially Aurora PostgreSQL RDS
Proficiency in automation and continuous delivery methods
Proficient in all aspects of the Software Development Life Cycle
Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security
Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile)
In-depth knowledge of the financial services industry and its IT systems

Preferred qualifications, capabilities, and skills
Experience re-engineering and migrating on-premises data solutions to and for the cloud
Experience with Infrastructure as Code (Terraform) for cloud-based data infrastructure
Experience building on emerging cloud serverless managed services to minimize or eliminate the physical/virtual server footprint
Advanced in Java, plus Python (nice to have)

Posted 1 week ago


6.0 - 8.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site


Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
This role works as part of Change Ops within the Cloud Ops L2 team, which is responsible for all Changes across the AWS, Azure, and Google Cloud platforms. Core responsibilities include, but are not limited to:
Managing Teradata’s as-a-service offering on public cloud (AWS/Azure/GC) as the Cloud Ops Administrator
Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, environment optimization, and third-party software support
Supporting onsite teams with customer migrations from on premise to cloud
Implementing security best practices and analyzing partner compatibility
Managing and coordinating all activities necessary to implement Changes in the environment
Ensuring Change status, progress, and issues are communicated to the appropriate groups
Reviewing and implementing the process lifecycle and reporting to upper management
Evaluating performance metrics against critical success factors and taking action to streamline the process
Performing Change-related activities documented in the Change Request to ensure each Change is implemented according to plan
Documenting closure activities in the Change record and completing the Change record
Escalating any deviations from plan to the appropriate TLs/Managers
Providing input for the ongoing improvement of the Change Management process
Managing and supporting 24x7 VaaS environments for multiple customers
Devising and implementing security and operations best practices
Implementing development and production environments for the data warehousing cloud environment
Backup, archive, and recovery planning and execution for the cloud-based data warehouses across AWS/Azure/GC resources
Ensuring SLAs are met while implementing Changes
Ensuring all scheduled Changes are implemented within the prescribed window
Serving as the first level of escalation and of help/support for team members

Who You’ll Work With
This role works as part of Change Ops within the Cloud Ops L2 team, which is responsible for all Cases, Incidents, and Changes across the Azure and Google Cloud platforms. The role reports to the Delivery Manager for Change Ops.

What Makes You a Qualified Candidate
Minimum 6-8 years of IT experience in a Systems Administrator/Engineer role
Minimum 4 years of hands-on cloud experience (Azure/AWS/GCP)
Cloud certification; ITIL or other relevant certifications are desirable
Day-to-day operations experience with ServiceNow or another ITSM tool
Must be willing to provide 24x7 on-call support on a rotational basis with the team
Must be willing to travel, both short-term and long-term

What You’ll Bring
A 4-year engineering degree or a 3-year Master of Computer Applications
Excellent oral and written communication skills in English
Teradata/DBMS experience: hands-on experience with Teradata administration and a strong understanding of cloud capabilities and limitations
Thorough understanding of cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service, and Software as a Service cloud delivery models; and the current competitive landscape
Experience implementing and supporting new and existing customers on VaaS infrastructure
Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution
Good knowledge of cloud services for compute, storage, network, and OS for at least one of the following cloud platforms: Azure
Experience managing responsibilities as a shift lead
Experience with enterprise VPN and Azure virtual LAN with a data center
Knowledge of monitoring, logging, and cost management tools
Hands-on experience with database architecture/modeling, RDBMS, and NoSQL
Good understanding of data archive/restore policies
Basic Teradata knowledge; VMware certification is an added advantage
Working experience in Linux administration and shell scripting
Working experience with any of the RDBMSs such as Oracle, DB2, Netezza, Teradata, SQL Server, or MySQL

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization.

We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
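The backup, archive, and recovery planning this listing describes usually includes an automated freshness check: confirm the newest backup artifact is younger than the agreed SLA window. Below is a hedged, self-contained sketch of that check; the file names and the 24-hour window are illustrative assumptions, and local temporary files stand in for the cloud storage objects a real Cloud Ops team would inspect.

```python
import os
import tempfile
import time

def newest_backup_age_seconds(backup_dir):
    """Return the age in seconds of the most recent file in backup_dir."""
    paths = [os.path.join(backup_dir, f) for f in os.listdir(backup_dir)]
    newest_mtime = max(os.path.getmtime(p) for p in paths)
    return time.time() - newest_mtime

# Illustrative setup: two fake backup files, one ~25 hours old and one fresh.
backup_dir = tempfile.mkdtemp()
old = os.path.join(backup_dir, "dw_backup_old.gz")
fresh = os.path.join(backup_dir, "dw_backup_new.gz")
for p in (old, fresh):
    open(p, "w").close()
stale_time = time.time() - 25 * 3600
os.utime(old, (stale_time, stale_time))  # backdate the old file

age = newest_backup_age_seconds(backup_dir)
sla_ok = age < 24 * 3600  # assumed 24-hour backup SLA
print(sla_ok)
```

A check like this is typically wired into the monitoring stack so a missed backup window raises an Incident rather than being discovered during a restore.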

Posted 1 week ago


6.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
This role works as part of Change Ops within the Cloud Ops L2 team, which is responsible for all Changes across the AWS, Azure, and Google Cloud platforms. Core responsibilities include, but are not limited to:
Managing Teradata’s as-a-service offering on public cloud (AWS/Azure/GC) as the Cloud Ops Administrator
Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, environment optimization, and third-party software support
Supporting onsite teams with customer migrations from on premise to cloud
Implementing security best practices and analyzing partner compatibility
Managing and coordinating all activities necessary to implement Changes in the environment
Ensuring Change status, progress, and issues are communicated to the appropriate groups
Reviewing and implementing the process lifecycle and reporting to upper management
Evaluating performance metrics against critical success factors and taking action to streamline the process
Performing Change-related activities documented in the Change Request to ensure each Change is implemented according to plan
Documenting closure activities in the Change record and completing the Change record
Escalating any deviations from plan to the appropriate TLs/Managers
Providing input for the ongoing improvement of the Change Management process
Managing and supporting 24x7 VaaS environments for multiple customers
Devising and implementing security and operations best practices
Implementing development and production environments for the data warehousing cloud environment
Backup, archive, and recovery planning and execution for the cloud-based data warehouses across AWS/Azure/GC resources
Ensuring SLAs are met while implementing Changes
Ensuring all scheduled Changes are implemented within the prescribed window
Serving as the first level of escalation and of help/support for team members

Who You’ll Work With
This role works as part of Change Ops within the Cloud Ops L2 team, which is responsible for all Cases, Incidents, and Changes across the Azure and Google Cloud platforms. The role reports to the Delivery Manager for Change Ops.

What Makes You a Qualified Candidate
Minimum 6-8 years of IT experience in a Systems Administrator/Engineer role
Minimum 4 years of hands-on cloud experience (Azure/AWS/GCP)
Cloud certification; ITIL or other relevant certifications are desirable
Day-to-day operations experience with ServiceNow or another ITSM tool
Must be willing to provide 24x7 on-call support on a rotational basis with the team
Must be willing to travel, both short-term and long-term

What You’ll Bring
A 4-year engineering degree or a 3-year Master of Computer Applications
Excellent oral and written communication skills in English
Teradata/DBMS experience: hands-on experience with Teradata administration and a strong understanding of cloud capabilities and limitations
Thorough understanding of cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service, and Software as a Service cloud delivery models; and the current competitive landscape
Experience implementing and supporting new and existing customers on VaaS infrastructure
Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution
Good knowledge of cloud services for compute, storage, network, and OS for at least one of the following cloud platforms: Azure
Experience managing responsibilities as a shift lead
Experience with enterprise VPN and Azure virtual LAN with a data center
Knowledge of monitoring, logging, and cost management tools
Hands-on experience with database architecture/modeling, RDBMS, and NoSQL
Good understanding of data archive/restore policies
Basic Teradata knowledge; VMware certification is an added advantage
Working experience in Linux administration and shell scripting
Working experience with any of the RDBMSs such as Oracle, DB2, Netezza, Teradata, SQL Server, or MySQL

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization.

We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago
