777 Teradata Jobs - Page 22

JobPe aggregates job listings for easy access; applications are made directly on the original job portal.

3.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

TCS is conducting a Walk-In Drive on 7th June, 2025 for Ab Initio Developer.

Location: TCS - Siruseri-ATL Building, Chennai
Date of Interview: 7th June, 2025
Years of Experience: 3-10 years (accurate)

Responsibilities:

Must-Have:
- Strong technical skills in Ab Initio, UNIX shell scripting, SQL (Teradata & Hadoop) and scheduling tools.
- Hands-on experience with Google Cloud technologies such as BigQuery, Cloud Storage and cloud-native ETL processing tools.
- Basic knowledge of Python preferred.
- Extensive work with Teradata/Oracle/Hadoop databases using Ab Initio as the ETL tool for large-scale data integration.
- Good understanding of data warehouse and metadata management concepts and tools.
- Experience on cloud would be an added advantage.

Good to Have:
- Experience working on Agile projects.
- Basic knowledge of Mainframe preferred.

Thanks & Regards,
Shilpa Silonee, BFSI A&I Team

Posted 3 weeks ago

Apply

0 years

0 Lacs

Calcutta

On-site

Source: Glassdoor

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

About EY GDS
Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires), China (Dalian), India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum), the Philippines (Manila), Poland (Warsaw and Wroclaw) and the UK (Manchester, Liverpool).

Careers in EY Global Delivery Services
Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines.

Our Consulting practice provides a differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology and alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It's about the application of these technologies in the real world to make a real, meaningful impact. We are looking for highly motivated, articulate individuals who have the skills to navigate the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems.

Your career in Consulting can span these technology areas / service lines:

Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end-to-end solutions to our customers that are digital, cloud-native and open source. Our skills include experience design, UI development, design thinking, architecture and design, full-stack development (.NET/Java/SharePoint/Power Platform), and emerging technologies such as blockchain, IoT, AR/VR, drones, cloud and DevSecOps. We use industrialized techniques, built on top of agile methods, utilizing our global teams to deliver end-to-end solutions at the best unit-cost proposition.

Testing Services: We are the yardstick of quality software products. We break things to make the product stronger and more successful. We provide the entire gamut of testing services, including business / user acceptance testing, so this is a team with all-round functional, technical and process skills.

Data & Analytics: Data and Analytics is among the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry. Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advanced Analytics (AA) and Artificial Intelligence (AI).

Oracle: We provide a one-stop solution for end-to-end project implementation enabled by Oracle and IBM products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformation programs. We develop solutions using languages such as SQL, PL/SQL, Java, JavaScript and Python, plus IBM Maximo and other Oracle utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools.

SAP: By building on SAP's S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services, from strategy and architecture to production deployment. EY supports clients in three main areas: technology implementation support; enterprise and industry application implementation; and Governance Risk Compliance (GRC) technology.

Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their core banking platforms and support their business / digital transformation, such as deposit system replacements, lending / leasing modernization, and cloud-native architecture (containerization).

Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency and effectiveness and manage the overall impact on bottom-line profitability by leveraging technology, data and digital teams. We run many operational-efficiency programs and technology-enabled transformations to re-platform clients' front and back offices with emerging technologies like AI, ML and blockchain.

Insurance Transformation: Changing macroeconomic trends continue to challenge insurers globally. With disruptive technologies, including IoT, autonomous vehicles and blockchain, we help companies through these challenges and create innovative strategies to transform their business through technology-enabled transformation programs. We provide end-to-end services to global P&C (general), life and health insurers, reinsurers and insurance brokers.

Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally-enabled initiative, from day one.

Technology Risk: A unique, industry-focused business unit that provides a broad range of integrated services, where you'll contribute technically to IT Risk and Assurance client engagements and internal projects.

An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You'll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you'll anticipate and identify risks within engagements and share any issues with senior members of the team.

Behavioral Competencies:
- Adaptive to the team and fosters a collaborative approach
- Innovative approach to the project, when required
- Shows passion and curiosity, desire to learn and can think digital
- Agile mindset and ability to multi-task
- An eye for detail

Skills needed:
- Understanding and/or experience of software development best practices and the software development life cycle
- Understanding of one or more programming languages such as Java, .NET or Python, data analytics, or databases such as SQL, Oracle or Teradata
- An internship in a relevant technology domain is an added advantage

Qualification:
- BE / B.Tech (IT / Computer Science / circuit branches)
- Should have secured 60% and above
- No active backlogs

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

3.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

TCS is conducting a Walk-In Drive on 7th June, 2025 for Ab Initio Developer.

Location: TCS - Block E Cafeteria, Gitanjali Park, Kolkata
Date of Interview: 7th June, 2025
Years of Experience: 3-10 years (accurate)

Responsibilities:

Must-Have:
- Strong technical skills in Ab Initio, UNIX shell scripting, SQL (Teradata & Hadoop) and scheduling tools.
- Hands-on experience with Google Cloud technologies such as BigQuery, Cloud Storage and cloud-native ETL processing tools.
- Basic knowledge of Python preferred.
- Extensive work with Teradata/Oracle/Hadoop databases using Ab Initio as the ETL tool for large-scale data integration.
- Good understanding of data warehouse and metadata management concepts and tools.
- Experience on cloud would be an added advantage.

Good to Have:
- Experience working on Agile projects.
- Basic knowledge of Mainframe preferred.

Thanks & Regards,
Shilpa Silonee, BFSI A&I Team

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: Glassdoor

Data Management Strategy & Governance Associate Advisor - HIH - Evernorth

ABOUT EVERNORTH:
Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview
Work with data governance products to assist Product Owner(s) in guiding functionality and content development with our business and engineering partners. Works daily with our delivery team partners to build trust in data by ensuring quality and accuracy across the organization. Works daily with partners to understand use cases and the quality checks necessary to govern the data. Works daily with our data stewards and data owners to onboard and train them on data governance platforms and concepts.

Qualifications

Required Skills:
- Ability to lead functional development and tool onboarding meetings, documenting decisions and issues
- Ability to translate business data governance requirements into technical requirements that meet multiple business needs
- Strong analytical, creative problem-solving, and process management skills
- Ability to work with various data assets and provide insight into the data
- Ability to plan and prioritize own work effectively to achieve set end results
- Excellent attention to detail
- Excellent oral, written & presentation communication skills
- Ability to accept ambiguity; open to new ideas; willing to test, fail, and attempt new processes
- Ability to influence peers and subordinates to modify behaviors and support the implementation and adoption of data governance and metadata strategy and practices
- Ability to work independently and as part of a team, with a positive attitude toward team goals
- Ability to work across multiple time zones
- Proactive, flexible attitude with a desire to learn new skills in the growing field of data governance
- Agile experience a plus

Required Experience & Education:
- 8-11 years of experience
- Experience (3-4+ years working experience) with data governance constructs, including technical and business metadata
- Understanding of structured and semi-structured data; data quality knowledge
- Strong ability to drive connections with a wide variety of data platforms, including AWS (Databricks, Teradata, Snowflake, Kafka), Azure, DB2 and SQL Server
- Knowledge of firewalls is a plus
- Experience using a BI reporting tool is a plus
- Experience with Collibra data governance products, healthcare industry data, and/or data governance regulations and controls is a plus
- Tooling: Excel; Agile tooling knowledge such as Jira or Rally; SQL; Tableau and other reporting tools; experience with data governance tools (Collibra DQ & Collibra Data Intelligence), Alation or similar desired

Location & Hours of Work
Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
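As context for the rule-based quality checks this role describes (and that platforms like Collibra DQ automate), here is a minimal, hypothetical sketch in Python using pandas; the file and column names are invented for illustration and are not Evernorth data:

```python
import pandas as pd

# Hypothetical member extract; file and column names are invented.
df = pd.read_csv("members.csv")

checks = {
    "member_id is unique": df["member_id"].is_unique,
    "member_id has no nulls": df["member_id"].notna().all(),
    "enrollment_date parses as a date": pd.to_datetime(
        df["enrollment_date"], errors="coerce"
    ).notna().all(),
}

for rule, passed in checks.items():
    print(f"{'PASS' if passed else 'FAIL'}: {rule}")
```

Broadly speaking, a governance platform runs comparable rules on a schedule and routes failures to data stewards for remediation.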

Posted 3 weeks ago

Apply

7.0 - 10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Source: LinkedIn

Job Description:

About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview
Global Finance was set up in 2007 as a part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business and Enterprise Finance functions. The capabilities hosted include Legal Entity Controllership, General Accounting & Reconciliations, Regulatory Reporting, Operational Risk and Control Oversight, Finance Systems Support, etc. Our group supports the Finance function for the Consumer Banking, Wealth and Investment Management business teams across Financial Planning and Analysis, period-end close, management reporting and data analysis.

Job Description
The candidate will be responsible for developing and validating dashboards and business reports using emerging-technology tools like MicroStrategy, Tableau and Alteryx, and for delivering complex, time-critical data mining and analytical projects for Consumer & Small Business Banking and lending products such as credit cards; in addition, the candidate will be responsible for analysis of data for decision making by senior leadership. The candidate will be responsible for data management, data extraction and upload, data validation, scheduling and process automation, report preparation, etc. The individual will play a key role in the team responsible for financial data reporting, ad-hoc reporting and data requirements, data analytics and business analysis, and will manage multiple projects in parallel by ensuring adequate understanding of the requirements and delivering data-driven insights and solutions to complex business problems. These projects are time-critical and require the candidate to comprehend and evaluate the strategic business drivers in order to bring in efficiencies through automation of existing reporting packages or code. The work is a mix of standard and ad-hoc deliverables based on dynamic business requirements. Technical competency is essential to build processes that ensure data quality and completeness across all projects and requests related to the business. The core responsibility of this individual is process management to achieve sustainable, accurate and well-controlled results. The candidate should have a clear understanding of the end-to-end process (including its purpose) and a discipline of managing and continuously improving those processes.

Responsibilities
- Preparation and maintenance of various KPI reporting for consumer lending (such as credit cards), including data- or business-driven deep-dive analysis.
- Credit card rewards reporting, data mining and analytics.
- Understand business requirements and translate them into deliverables.
- Support the business on periodic and ad-hoc projects related to consumer lending products.
- Develop and maintain code for data extraction, manipulation and summarization in tools such as SQL, SAS and emerging technologies like MicroStrategy, Tableau and Alteryx (see the extraction sketch after this posting).
- Design solutions, generate actionable insights, optimize existing processes, build tool-based automations, and ensure overall program governance.
- Manage and improve the work: develop a full understanding of the work processes, with continuous focus on process improvement through simplification, innovation, and use of emerging-technology tools, and understanding of data sourcing and transformation.
- Manage risk: manage and reduce risk, proactively identify risks, issues and concerns, and manage controls to help drive responsible growth (e.g., compliance, procedures, data management); establish a risk culture that encourages early escalation and self-identification of issues.
- Communicate effectively: deliver transparent, concise, and consistent messaging while influencing change in the teams.
- Be extremely good with numbers, with the ability to present various business/finance metrics, detailed analysis, and key observations to senior business leaders.

Requirements

Education
Master's/Bachelor's degree in Information Technology/Computer Science/MCA, or MBA Finance

Experience Range
7-10 years of relevant work experience in data analytics & reporting, business analysis & financial reporting in the banking or credit card industry. Exposure to consumer banking businesses would be an added advantage. Experience in credit card reporting and analytics is preferable.

Foundational Skills
- Strong abilities in data extraction, data manipulation and business analysis, and strong financial acumen.
- Strong computer skills, including MS Excel, Teradata SQL, SAS and emerging technologies like MicroStrategy, Alteryx and Tableau.
- Prior banking and financial services industry experience, preferably retail banking and credit cards.
- Strong business problem-solving skills, and the ability to deliver analytics projects independently, from initial structuring to final presentation.
- Strong communication skills (both verbal and written), interpersonal skills and relationship management skills to navigate the complexities of aligning stakeholders, building consensus, and resolving conflicts.
- Proficiency in Base SAS, macros and SAS Enterprise Guide; querying data from multiple sources.
- Experience in data extraction, transformation and loading using SQL/SAS.
- Proven ability to manage multiple and often competing priorities in a global environment.
- Manages operational risk by building strong processes and quality control routines.
- SQL: querying data from multiple sources.
- Data quality and governance: ability to clean, validate and ensure data accuracy and integrity.
- Troubleshooting: expertise in debugging and optimizing SAS and SQL code.

Desired Skills
- Ability to effectively manage multiple priorities under pressure, deliver, and adapt to change.
- Able to work in a fast-paced, deadline-oriented environment.
- Management of multiple stakeholders.
- Attention to detail: strong focus on data accuracy and documentation.

Work Timings
11:30 pm to 8:30 pm (will require stretching 7-8 days a month to meet critical deadlines)

Job Location
Mumbai
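For orientation, the Teradata SQL extraction work described above might look like the following minimal sketch, using Teradata's open-source teradatasql driver together with pandas; the host, credentials, schema, and KPI query are invented placeholders, not Bank of America systems:

```python
import pandas as pd
import teradatasql  # Teradata's official open-source Python DBAPI driver

# Hypothetical sketch: pull a rewards KPI into pandas for reporting.
# Host, credentials, database, and table names are placeholders.
with teradatasql.connect(host="td-prod", user="report_user", password="***") as con:
    df = pd.read_sql(
        """
        SELECT card_product, SUM(reward_points) AS total_points
        FROM cards_db.rewards_fact
        WHERE posting_dt >= DATE '2025-01-01'
        GROUP BY card_product
        """,
        con,
    )

print(df.sort_values("total_points", ascending=False).head())
```

From a frame like this, the report itself would typically be published through Tableau or MicroStrategy rather than printed.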

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

As a part of the AI & Data Platform organization, the Enterprise Business Intelligence (EBI) team is central to NXP's data analytics success, providing and maintaining the data solutions, platforms and methods that enable self-service business creation of solutions that drive the business forward. Data Engineers within EBI have responsibilities across solution design, delivery, and support. In this role you will work with Product Owners, Architects, Data Scientists, and other stakeholders to design, build and maintain ETL/ELT pipelines, data pipelines and jobs, combining data from multiple source systems into one or multiple target systems. Solutions delivered must adhere to EBI and IT architectural principles pertaining to capacity planning, performance management, data security, data privacy, lifecycle management and regulatory compliance. Assisting the Operational Support team with analysis and investigation of issues is also expected, as needed.

Required Skills and Qualifications
- Proven experience as a Data Engineer
- Hands-on experience in ETL design and development concepts (5+ years)
- Experience with AWS and Azure cloud platforms and their data service offerings
- Proficiency in SQL, PySpark, Python
- Experience with GitHub, GitLab, CI/CD
- Knowledge of advanced analytic concepts including AI/ML
- Strong problem-solving skills and ability to work in a fast-paced and collaborative environment
- Excellent oral and written communication skills

Preferred Skills & Qualifications
- Experience with Agile / DevOps
- Proficiency in SQL (Databricks, Teradata)
- Experience with DBT

More information about NXP in India...

Posted 3 weeks ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Bengaluru

Hybrid

Source: Naukri

We're Hiring: Teradata Developer | Bangalore | 5.0 - 8.0 Years Experience

Are you a Teradata expert looking for your next opportunity in a fast-paced and collaborative environment? We're looking for skilled professionals who can join within 1 month; immediate joiners preferred!

Location: Bangalore (Work from Office)
Experience: 5.0 - 8.0 years
Position: Teradata Developer

Key Responsibilities:
- Hands-on experience with Teradata utilities (BTEQ, FastLoad, MultiLoad, TPT)
- Strong command of SQL and ETL development
- Deep understanding of Teradata architecture, star/snowflake schemas, and data modeling
- Proven experience in SQL optimization and performance tuning (see the tuning sketch after this posting)
- Good knowledge of data warehousing concepts and best practices
- Experience in partitioning strategies, workload management, and system security
- Able to troubleshoot and resolve performance bottlenecks
- Collaborate effectively with business analysts to understand data requirements
- Create and maintain detailed technical documentation

Who Should Apply:
- Only candidates who can join within 1 month
- Professionals with strong Teradata knowledge and solid data warehousing fundamentals
- Immediate joiners based in, or willing to relocate to, Bangalore

Interested? Apply now or email vijay.s@xebia.com. Let's build data-driven solutions together!

#Teradata #BangaloreJobs #ImmediateJoiners #HiringNow #ETL #SQL #DataEngineering #DataWarehouse
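For readers new to Teradata tuning, here is a minimal sketch of the EXPLAIN-and-collect-statistics workflow that performance work like this usually starts with, using Teradata's open-source teradatasql Python driver; the host, credentials, table, and column are placeholders, not Xebia systems:

```python
import teradatasql  # Teradata's open-source Python DBAPI driver

# Placeholders throughout: host, credentials, and the sales.orders table.
with teradatasql.connect(host="td-dev", user="dbc", password="***") as con:
    cur = con.cursor()

    # Step 1: read the optimizer's plan for a slow report query.
    cur.execute(
        "EXPLAIN SELECT store_id, SUM(amount) FROM sales.orders GROUP BY store_id"
    )
    for row in cur.fetchall():
        print(row[0])  # each row is one line of the plan text

    # Step 2: refresh statistics so the optimizer can choose better plans.
    cur.execute("COLLECT STATISTICS ON sales.orders COLUMN (store_id)")
```

Reading the plan for product joins or poor confidence estimates, then refreshing statistics on the join and grouping columns, is a common first pass before restructuring SQL or changing indexes.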

Posted 3 weeks ago

Apply

0 years

0 Lacs

Khairatabad, Telangana, India

On-site

Source: LinkedIn

Location: IN - Hyderabad, Telangana
Goodyear Talent Acquisition Representative: Maria Monica Canding
Sponsorship Available: No
Relocation Assistance Available: No

Job Responsibilities
- You are responsible for designing and building data products, legal data layers, data streams, algorithms, and reporting systems (e.g., dashboards, front ends).
- You ensure the correct design of solutions, performance, and scalability while considering appropriate cost control.
- You link data product design with DevOps and infrastructure.
- You act as a reference within and outside the Analytics team.
- You serve as a technical partner to Data Engineers regarding digital product implementation.

Qualifications
- You have a Bachelor's degree in Computer Science, Engineering, Management Information Systems, or a related discipline, or you have 10 or more years of experience in Information Technology in lieu of a degree.
- You have 5 or more years of experience in Information Technology.
- You have an in-depth understanding of database structure principles.
- You have experience gathering and analyzing system requirements.
- You have knowledge of data mining and segmentation techniques.
- You have expertise in SQL and Oracle.
- You are familiar with data visualization tools (e.g., Tableau, Cognos, SAP Analytics Cloud).
- You possess proven analytical skills and a problem-solving attitude.
- You have a proven ability to work with distributed systems.
- You are able to develop creative solutions to problems.
- You have knowledge and strong skills with SQL and NoSQL databases and applications, such as Teradata, Redshift, MongoDB, or equivalent.

Goodyear is an Equal Employment Opportunity and Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to that individual's race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender identity, age, physical or mental disability, ethnicity, citizenship, or any other characteristic protected by law.

Goodyear is one of the world's largest tire companies. It employs about 74,000 people and manufactures its products in 57 facilities in 23 countries around the world. Its two Innovation Centers in Akron, Ohio and Colmar-Berg, Luxembourg strive to develop state-of-the-art products and services that set the technology and performance standard for the industry. For more information about Goodyear and its products, go to www.goodyear.com/corporate.

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Our company:
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:
We are looking for a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate is adept at multi-tasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications.
- Develop end-user applications, leveraging research capabilities and SQL knowledge.
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions.
- Communicate effectively across teams to ensure alignment and clarity throughout the development process.
- Provide post-production support.

Who you will work with:
On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What makes you a qualified candidate:
- 4+ years of relevant experience, preferably in R&D-based teams.
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices (see the sketch after this posting).
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation).
- Hands-on experience with Java, JSP and related areas.
- Proficiency in Docker and Unix or Linux platforms.
- Experience with Spring Framework or Spring Boot a plus.
- Expertise in designing and deploying scalable solutions in public cloud environments.
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies.
- Familiarity with software configuration management tools, defect tracking tools, and peer review tools.
- Excellent debugging skills to troubleshoot and resolve issues effectively.
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation.

What you will bring:
- Master's or bachelor's degree in computer science or a related discipline.
- Practical experience in development and support structures.
- Knowledge of cloud environments, particularly AWS.
- Proficiency in SQL.

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
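Since the posting asks for experience writing Python-based microservices, here is a minimal, generic sketch of one using Flask; the endpoints and payloads are illustrative and not part of any Teradata product:

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Liveness endpoint, handy when the service runs in a Docker container.
    return jsonify(status="ok")

@app.route("/api/v1/greet/<name>")
def greet(name: str):
    # Toy business endpoint; a real service would call a database or queue here.
    return jsonify(message=f"hello, {name}")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

A service of this shape is typically packaged in a Docker image and fronted by a production WSGI server such as gunicorn, which lines up with the Docker and cloud-deployment skills the posting lists.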

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Greetings from TCS!

TCS is hiring for Informatica PowerCenter - Teradata.

Desired Experience Range: 4 to 8 years
Job Location: Pune/Mumbai/Chennai

Must-Have:
1. Informatica PowerCenter
2. DB experience
3. SQL
4. Unix commands
5. PL/SQL
6. Teradata
7. Oracle

Good-to-Have:
1. Insurance domain knowledge
2. Good communication skills
3. Ability to work individually
4. Good team player / multi-tasking
5. Adaptability to new technologies

Thank You

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Source: LinkedIn

Primary:
- Technical data analysis for the business requirement
- Data platform solution design of data flow from source
- Mapping of source data elements to the target table
- SQL (Teradata, Oracle, DB2, MSSQL, DAS)
- ETL/EDW/Informatica
- Data lake / Azure Data Warehouse architecture
- Data modelling / data architecture

Secondary:
- Banking domain experience

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana

On-site

Source: Indeed

Location: Gurugram, Haryana, India
Category: Corporate
Job Id: GGN00002011
Marketing / Loyalty / MileagePlus / Alliances
Job Type: Full-Time
Posted Date: 06/02/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
United's Kinective Media Data Engineering team designs, develops, and maintains massively scaling ad-technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space-available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.

Job overview and responsibilities
The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial projects with a digital focus. The Data Engineer will partner with various teams to define and execute data acquisition, transformation and processing, and will make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. The role executes unit tests and validates expected results to ensure the accuracy and integrity of data and applications through analysis, coding, clear documentation and problem resolution. This role will also drive the adoption of data processing and analysis within the AWS environment (see the pipeline sketch after this posting) and help cross-train other members of the team.

- Leverage strategic and analytical skills to understand and solve customer- and business-centric questions.
- Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies and partners.
- Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business.
- Develop and implement innovative solutions leading to automation.
- Use Agile methodologies to manage projects.
- Mentor and train junior engineers.

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications

Required
- BS/BA in computer science or a related STEM field
- 2+ years of IT experience in software development
- 2+ years of development experience using Java, Python or Scala
- 2+ years of experience with Big Data technologies like PySpark, Hadoop, Hive, HBase, Kafka, NiFi
- 2+ years of experience with database systems like Redshift, MS SQL Server, Oracle, Teradata
- Creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights
- Individuals who have a natural curiosity and desire to solve problems are encouraged to apply
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

Preferred
- Master's in computer science or a related STEM field
- Experience with cloud-based systems like AWS, Azure or Google Cloud
- Certified Developer / Architect on AWS
- Strong experience with continuous integration & delivery using Agile methodologies
- Data engineering experience in the transportation/airline industry
- Strong problem-solving skills
- Strong knowledge of Big Data
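As a rough illustration of the AWS data-mart work described above, a PySpark job of this general shape reads raw events from S3 and writes a partitioned parquet mart; the bucket paths, schema, and column names are invented, not United's:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-impressions-mart").getOrCreate()

# Raw ad events landed in S3 (path and schema are invented for illustration).
events = spark.read.json("s3://example-bucket/raw/ad_events/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "campaign_id")
    .agg(F.count("*").alias("impressions"))
)

# Partitioned parquet keeps the mart cheap to scan from Athena or Redshift Spectrum.
(daily.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/marts/daily_impressions/"))
```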

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Karnataka, India

On-site

Source: LinkedIn

Our company:
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:
We are looking for a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate is adept at multi-tasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications.
- Develop end-user applications, leveraging research capabilities and SQL knowledge.
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions.
- Communicate effectively across teams to ensure alignment and clarity throughout the development process.
- Provide post-production support.

Who you will work with:
On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What makes you a qualified candidate:
- 2+ years of relevant experience, preferably in R&D-based teams.
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices.
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation).
- Hands-on experience with Java, JSP and related areas.
- Proficiency in Docker and Unix or Linux platforms.
- Experience with Spring Framework or Spring Boot a plus.
- Expertise in designing and deploying scalable solutions in public cloud environments.
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies.
- Familiarity with software configuration management tools, defect tracking tools, and peer review tools.
- Excellent debugging skills to troubleshoot and resolve issues effectively.
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation.

What you will bring:
- Master's or bachelor's degree in computer science or a related discipline.
- Practical experience in development and support structures.
- Knowledge of cloud environments, particularly AWS.
- Proficiency in SQL.

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

TCS is hiring for Teradata Developer.

Location: Chennai, Bangalore, Pune, Gurgaon, Hyderabad
Years of Experience: 7-10 years (precise)
Notice Period: 0-30 days (precise)

Responsibilities:
- Hands-on experience writing Teradata SQL.
- Expert in Teradata utilities.
- Good Unix shell scripting skills.
- Sound knowledge of the financial services logical data model (Teradata FSLDM).
- Work closely with business users to convert business requirements into technical requirements.
- Able to create mapping documents for FSLDM and downstream data marts.
- Must have worked with ETL, preferably in DataStage.
- Prepare test cases for unit testing/SIT/regression.
- Design and document development standards.
- Knowledge of data warehouse concepts.
- Hands-on experience with the Control-M scheduling tool.
- Able to take ownership and deliver independently.

Kindly share your updated CV if it matches the above requirements.

Thanks & Regards,
Shilpa Silonee, BFSI A&I TAG Team

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Evernorth
Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Position Overview
Work with data governance products to assist Product Owner(s) in guiding functionality and content development with our business and engineering partners. Works daily with our delivery team partners to build trust in data by ensuring quality and accuracy across the organization. Works daily with partners to understand use cases and the quality checks necessary to govern the data. Works daily with our data stewards and data owners to onboard and train them on data governance platforms and concepts.

Qualifications

Required Skills:
- Ability to lead functional development and tool onboarding meetings, documenting decisions and issues
- Ability to translate business data governance requirements into technical requirements that meet multiple business needs
- Strong analytical, creative problem-solving, and process management skills
- Ability to work with various data assets and provide insight into the data
- Ability to plan and prioritize own work effectively to achieve set end results
- Excellent attention to detail
- Excellent oral, written & presentation communication skills
- Ability to accept ambiguity; open to new ideas; willing to test, fail, and attempt new processes
- Ability to influence peers and subordinates to modify behaviors and support the implementation and adoption of data governance and metadata strategy and practices
- Ability to work independently and as part of a team, with a positive attitude toward team goals
- Ability to work across multiple time zones
- Proactive, flexible attitude with a desire to learn new skills in the growing field of data governance
- Agile experience a plus

Required Experience & Education:
- 8-11 years of experience
- Experience (3-4+ years working experience) with data governance constructs, including technical and business metadata
- Understanding of structured and semi-structured data; data quality knowledge
- Strong ability to drive connections with a wide variety of data platforms, including AWS (Databricks, Teradata, Snowflake, Kafka), Azure, DB2 and SQL Server
- Knowledge of firewalls is a plus
- Experience using a BI reporting tool is a plus
- Experience with Collibra data governance products, healthcare industry data, and/or data governance regulations and controls is a plus
- Tooling: Excel; Agile tooling knowledge such as Jira or Rally; SQL; Tableau and other reporting tools; experience with data governance tools (Collibra DQ & Collibra Data Intelligence), Alation or similar desired

Location & Hours of Work
Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Company Description
Seosaph-infotech is a rapidly growing company in customized software development, providing advanced technology solutions and trusted services across multiple business verticals. In just two years, Seosaph-infotech has delivered exceptional solutions to industries such as finance, healthcare, and e-commerce, establishing itself as a reliable IT partner for businesses seeking to enhance their technological capabilities.

Responsibilities:
- Independently complete conceptual, logical and physical data models for any supported platform, including SQL data warehouses, Spark, Databricks Delta Lakehouse or other cloud data warehousing technologies.
- Govern data design/modelling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Develop a deep understanding of business domains like Customer, Sales, Finance and Supplier, and of the enterprise technology inventory, to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Drive collaborative reviews of data model design, code, data and security features to drive data product development.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; the SAP data model.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
- Partner with the data stewards team for data discovery and action by business customers and stakeholders.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Support data lineage and mapping of source system data to canonical data stores.
- Create Source to Target Mappings (STTM) for ETL and BI.

Skills needed:
- Expertise in data modelling tools (ER/Studio, Erwin; IDM/ARDM models; CPG domains).
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data like IRI and Nielsen.

Location: Remote. (ref:hirist.tech)

Posted 4 weeks ago

Apply

2.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Our company:
At Teradata, we believe that people thrive when empowered with better information. That's why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers' customers—to make better, more confident decisions. The world's top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What you will do:
We are looking for a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate is adept at multi-tasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications.
- Develop end-user applications, leveraging research capabilities and SQL knowledge.
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions.
- Communicate effectively across teams to ensure alignment and clarity throughout the development process.
- Provide post-production support.

Who you will work with:
On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What makes you a qualified candidate:
- 2+ years of relevant experience, preferably in R&D-based teams.
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices.
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation).
- Hands-on experience with Java, JSP and related areas.
- Proficiency in Docker and Unix or Linux platforms.
- Experience with Spring Framework or Spring Boot a plus.
- Expertise in designing and deploying scalable solutions in public cloud environments.
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies.
- Familiarity with software configuration management tools, defect tracking tools, and peer review tools.
- Excellent debugging skills to troubleshoot and resolve issues effectively.
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation.

What you will bring:
- Master's or bachelor's degree in computer science or a related discipline.
- Practical experience in development and support structures.
- Knowledge of cloud environments, particularly AWS.
- Proficiency in SQL.

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 4 weeks ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description: We are seeking a Reference Data Management Senior Analyst who, as a member of the Reference Data Product team within the Enterprise Data Management organization, will be responsible for managing and promoting the use of reference data, partnering with business Subject Matter Experts on the creation of vocabularies, taxonomies and ontologies, and developing analytic solutions using semantic technologies.

Roles & Responsibilities:
- Work with the Reference Data Product Owner, external resources and other engineers as part of the product team
- Develop and maintain semantically appropriate concepts
- Identify and address conceptual gaps in both content and taxonomy
- Maintain ontology source vocabularies for new or edited codes
- Support product teams to help them leverage taxonomic solutions
- Analyze data from public and internal datasets
- Develop a data model/schema for each taxonomy
- Create taxonomies in the Semaphore Ontology Editor
- Bulk-import data templates into Semaphore to add or update terms in taxonomies
- Prepare SPARQL queries to generate ad hoc reports
- Perform gap analysis on current and updated data
- Maintain taxonomies in Semaphore through the change management process
- Develop and optimize automated data ingestion pipelines in Python/PySpark when APIs are available
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Identify and resolve complex data-related challenges
- Participate in sprint planning meetings and provide estimates for technical implementation

Basic Qualifications and Experience:
- Master's degree with 6 years of experience in Business, Engineering, IT or a related field; OR
- Bachelor's degree with 8 years of experience in Business, Engineering, IT or a related field; OR
- Diploma with 9+ years of experience in Business, Engineering, IT or a related field

Functional Skills:
Must-Have Skills:
- Knowledge of controlled vocabularies, classification, ontology and taxonomy
- Experience in ontology development using Semaphore or a similar tool
- Hands-on experience writing SPARQL queries on graph data
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Understanding of data modeling, data warehousing, and data integration concepts

Good-to-Have Skills:
- Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
- Experience using cloud services such as AWS, Azure or GCP
- Experience working in a product team environment
- Knowledge of Python/R, Databricks, and cloud data platforms
- Knowledge of NLP (Natural Language Processing) and AI (Artificial Intelligence) for extracting and standardizing controlled vocabularies
- Strong understanding of data governance frameworks, tools, and best practices

Professional Certifications:
- Databricks certification preferred
- SAFe® Practitioner certification preferred
- Any data analysis certification (SQL, Python)
- Any cloud certification (AWS or Azure)

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
- Effective problem-solving skills to address data-related issues and implement scalable solutions
- Ability to work effectively with global, virtual teams

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
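As an illustration of the ad hoc SPARQL reporting this role calls for, the sketch below runs a simple hierarchy report over a local taxonomy export using rdflib. The file name taxonomy.ttl and the assumption that the vocabulary is modeled in SKOS are illustrative only, not a description of Amgen's actual Semaphore setup.

```python
# Minimal sketch: an ad hoc SPARQL report over a SKOS taxonomy export,
# run locally with rdflib. "taxonomy.ttl" is a hypothetical file; in
# practice the data would come from a Semaphore export or endpoint.
from rdflib import Graph

g = Graph()
g.parse("taxonomy.ttl", format="turtle")  # hypothetical local export

# List each concept with its preferred label and broader (parent) concept,
# a typical starting point for gap analysis on a taxonomy.
QUERY = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label ?parent
WHERE {
    ?concept a skos:Concept ;
             skos:prefLabel ?label .
    OPTIONAL { ?concept skos:broader ?parent . }
}
ORDER BY ?label
"""

for concept, label, parent in g.query(QUERY):
    print(label, "->", parent if parent else "(top concept)")
```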

Posted 4 weeks ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Company Overview
At ReKnew, our mission is to empower enterprises to revitalize their core business and organization by positioning themselves for the new world of AI. We're a startup founded by seasoned practitioners, supported by expert advisors, and built on decades of experience in enterprise technology, data, analytics, AI, digital, and automation across diverse industries. We're actively seeking top talent to join us in this mission.

Job Description
We're seeking a highly skilled Senior Data Engineer with deep expertise in AWS-based data solutions. In this role, you'll be responsible for designing, building, and optimizing large-scale data pipelines and frameworks that power analytics and machine learning workloads. You'll lead the modernization of legacy systems by migrating workloads from platforms like Teradata to AWS-native big data environments such as EMR, Glue, and Redshift. A strong emphasis is placed on reusability, automation, observability, performance optimization, and managing schema evolution in dynamic data lake environments.

Key Responsibilities
- Migration & Modernization: Build reusable accelerators and frameworks to migrate data from legacy platforms (e.g., Teradata) to AWS-native architectures such as EMR, Glue, and Redshift.
- Data Pipeline Development: Design and implement robust ETL/ELT pipelines using Python, PySpark, and SQL on AWS big data platforms.
- Code Quality & Testing: Drive development standards with test-driven development (TDD), unit testing, and automated validation of data pipelines.
- Monitoring & Observability: Build operational tooling and dashboards for pipeline observability, including tracking key metrics like latency, throughput, data quality, and cost.
- Cloud-Native Engineering: Architect scalable, secure data workflows using AWS services such as Glue, Lambda, Step Functions, S3, and Athena.
- Collaboration: Partner with internal product teams, data scientists, and external stakeholders to clarify requirements and drive solutions aligned with business goals.
- Architecture & Integration: Work with enterprise architects to evolve data architecture while securely integrating AWS systems with on-premise or hybrid environments. This includes strategic adoption of data lake table formats like Delta Lake, Apache Iceberg, or Apache Hudi for schema management and ACID capabilities.
- ML Support & Experimentation: Enable data scientists to operationalize machine learning models by providing clean, well-governed datasets at scale.
- Documentation & Enablement: Document solutions thoroughly and provide technical guidance and knowledge sharing to internal engineering teams.
- Team Training & Mentoring: Act as a subject matter expert, providing guidance, training, and mentorship to junior and mid-level data engineers, fostering a culture of continuous learning and best practices within the team.

Qualifications
- Experience: 7+ years in technology roles, with at least 5+ years specifically in data engineering, software development, and distributed systems.
- Programming: Expert in Python and PySpark (Scala is a plus). Deep understanding of software engineering best practices.
- AWS Expertise: 3+ years of hands-on experience in the AWS data ecosystem. Proficient in AWS Glue, S3, Redshift, EMR, Athena, Step Functions, and Lambda. Experience with AWS Lake Formation and data cataloging tools is a plus. AWS Data Analytics or Solutions Architect certification is a strong plus.
- Big Data & MPP Systems: Strong grasp of distributed data processing. Experience with MPP data warehouses like Redshift, Snowflake, or Databricks on AWS. Hands-on experience with Delta Lake, Apache Iceberg, or Apache Hudi for building reliable data lakes with schema evolution, ACID transactions, and time travel capabilities.
- DevOps & Tooling: Experience with version control (e.g., GitHub/CodeCommit) and CI/CD tools (e.g., CodePipeline, Jenkins). Familiarity with containerization and deployment in Kubernetes or ECS.
- Data Quality & Governance: Experience with data profiling, data lineage, and relevant tools. Understanding of metadata management and data security best practices.
- Bonus: Experience supporting machine learning or data science workflows. Familiarity with BI tools such as QuickSight, PowerBI, or Tableau.
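As a hedged illustration of the migration pattern this role describes (legacy Teradata workloads landing in an AWS-native lake), the sketch below reads a Parquet extract from S3 and appends it to a Delta Lake table with schema evolution enabled. The bucket paths and column names are hypothetical, and it assumes a Spark environment with the delta-spark package configured, as on EMR or Glue.

```python
# Minimal sketch of a legacy-to-lake pipeline: read an extract unloaded
# from Teradata to S3 as Parquet, dedupe, stamp a load timestamp, and
# land it in a Delta table. Paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("teradata-to-lake").getOrCreate()

src = spark.read.parquet("s3://legacy-extracts/orders/")  # hypothetical path

cleaned = (
    src.dropDuplicates(["order_id"])              # assumed business key
       .withColumn("load_ts", F.current_timestamp())
)

(cleaned.write.format("delta")
        .mode("append")
        .option("mergeSchema", "true")  # tolerate evolving source schemas
        .save("s3://data-lake/curated/orders/"))
```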

Posted 4 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

What you'll do
DocuSign is seeking a talented and results-oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features, in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, developing solutions, and loading tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive can-do attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow: a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This position is an individual contributor role reporting to the Manager, Data Engineering.

Responsibility
- Design, develop and maintain scalable and efficient data pipelines
- Analyze and develop data quality and validation procedures
- Work with stakeholders to understand data requirements and provide solutions
- Troubleshoot and resolve data issues in a timely manner
- Learn and leverage available AI tools for increased developer productivity
- Collaborate with cross-functional teams to ingest data from various sources
- Continuously evaluate and improve data architecture and processes
- Own, monitor, and improve solutions to ensure SLAs are met
- Develop and maintain documentation for data infrastructure and processes
- Execute projects using Agile Scrum methodologies and be a team player

Job Designation
Hybrid: Employees divide their time between in-office and remote work. Access to an office location is required. (Frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation.) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote, specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.

What you bring
Basic:
- Bachelor's degree in Computer Science, Data Analytics, Information Systems, etc.
- Experience developing data pipelines in one of the following languages: Python or Java
- 8+ years of dimensional and relational data modeling experience
- Excellent SQL and database management skills

Preferred:
- 8+ years in data warehouse engineering (OLAP): Snowflake, BigQuery, Teradata
- 8+ years with transactional databases (OLTP): Oracle, SQL Server, MySQL
- 8+ years with big data, Hadoop, Data Lake, Spark in a cloud environment (AWS)
- 8+ years with commercial ETL tools: dbt, Matillion, etc.
- 8+ years delivering ETL solutions from source systems, databases, APIs, flat files, JSON
- Experience developing Entity Relationship Diagrams with Erwin, SQLDBM, or equivalent
- Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS)
- Familiarity with Gen AI tools like GitHub Copilot and dbt Copilot; good understanding of Gen AI application frameworks
- Knowledge of any agentic platforms
- Experience building BI dashboards with tools like Tableau
- Experience in the financial domain, master data management (MDM), sales and marketing, accounts payable, accounts receivable, invoicing
- Experience managing work assignments using tools like Jira and Confluence
- Experience with Scrum/Agile methodologies
- Ability to work independently and as part of a team
- Excellent analytical, problem-solving and communication skills

Life at Docusign
Working here: Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live.

Accommodation
Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. Contact us for assistance.

Applicant and Candidate Privacy Notice
#LI-Hybrid #LI-SA4
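As one possible shape of the scheduling-and-loading work described above, here is a minimal Airflow sketch that copies a daily extract into Snowflake. It assumes Airflow 2.4+ with the apache-airflow-providers-snowflake package installed; the connection ID, stage, and table names are illustrative, not DocuSign's actual pipeline.

```python
# Minimal sketch: a daily Airflow DAG that loads a staged extract into
# Snowflake via COPY INTO. All object names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_orders_to_snowflake",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = SnowflakeOperator(
        task_id="copy_into_orders",
        snowflake_conn_id="snowflake_default",  # assumed Airflow connection
        sql="""
            COPY INTO analytics.raw.orders
            FROM @analytics.raw.orders_stage/{{ ds }}/
            FILE_FORMAT = (TYPE = 'PARQUET')
        """,
    )
```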

Posted 4 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

What you'll do
Docusign is seeking a talented and results-oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features, in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, developing solutions, and loading tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive can-do attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow: a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This position is an individual contributor role reporting to the Manager, Data Engineering.

Responsibility
- Design, develop and maintain scalable and efficient data pipelines
- Analyze and develop data quality and validation procedures
- Work with stakeholders to understand data requirements and provide solutions
- Troubleshoot and resolve data issues in a timely manner
- Learn and leverage available AI tools for increased developer productivity
- Collaborate with cross-functional teams to ingest data from various sources
- Continuously evaluate and improve data architecture and processes
- Own, monitor, and improve solutions to ensure SLAs are met
- Develop and maintain documentation for data infrastructure and processes
- Execute projects using Agile Scrum methodologies and be a team player

Job Designation
Hybrid: Employees divide their time between in-office and remote work. Access to an office location is required. (Frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation.) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote, specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.

What you bring
Basic:
- Bachelor's degree in Computer Science, Data Analytics, Information Systems, etc.
- Experience developing data pipelines in one of the following languages: Python or Java
- 5+ years of dimensional and relational data modeling experience
- Excellent SQL and database management skills

Preferred:
- 5+ years in data warehouse engineering (OLAP): Snowflake, BigQuery, Teradata, Redshift
- 5+ years with transactional databases (OLTP): Oracle, SQL Server, MySQL
- 5+ years with big data, Hadoop, Data Lake, Spark in a cloud environment (AWS)
- 5+ years with commercial ETL tools: dbt, Matillion, etc.
- 5+ years delivering ETL solutions from source systems, databases, APIs, flat files, JSON
- Experience developing Entity Relationship Diagrams with Erwin, SQLDBM, or equivalent
- Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS)
- Familiarity with Gen AI tools like GitHub Copilot and dbt Copilot; good understanding of Gen AI application frameworks
- Knowledge of any agentic platforms
- Experience building BI dashboards with tools like Tableau
- Experience in the financial domain, sales and marketing, accounts payable, accounts receivable, invoicing
- Experience managing work assignments using tools like Jira and Confluence
- Experience with Scrum/Agile methodologies
- Ability to work independently and as part of a team
- Excellent analytical, problem-solving and communication skills

Life at Docusign
Working here: Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live.

Accommodation
Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. Contact us for assistance.

Applicant and Candidate Privacy Notice
#LI-Hybrid #LI-SA4
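The "data quality and validation procedures" responsibility above might look, in its simplest form, like the sketch below: a Python script that runs threshold checks against a freshly loaded Snowflake table and fails loudly on a breach. The account, credentials, table, and thresholds are all placeholders; in practice credentials would come from environment variables or a secrets manager.

```python
# Minimal sketch of automated post-load validation against Snowflake.
# Table names, checks, and connection details are illustrative.
import snowflake.connector

CHECKS = {
    "row_count": ("SELECT COUNT(*) FROM analytics.raw.orders",
                  lambda n: n > 0),
    "null_keys": ("SELECT COUNT(*) FROM analytics.raw.orders "
                  "WHERE order_id IS NULL",
                  lambda n: n == 0),
}

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # hypothetical account identifier
    user="etl_service",
    password="***",              # use env vars / secrets in practice
)
try:
    cur = conn.cursor()
    for name, (sql, is_ok) in CHECKS.items():
        value = cur.execute(sql).fetchone()[0]
        if not is_ok(value):
            raise ValueError(f"data quality check failed: {name}={value}")
        print(f"{name}: {value} (ok)")
finally:
    conn.close()
```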

Posted 4 weeks ago

Apply

4.0 - 9.0 years

4 - 7 Lacs

Gurugram

Work from Office

Naukri logo

Job Overview:
We are seeking a dynamic Consultant to join our data and analytics team, delivering innovative solutions with a focus on the life sciences industry. The ideal candidate will bring current, hands-on expertise in data warehousing (Snowflake, Redshift, Databricks or similar), master data management (MDM), and report development (Power BI, Tableau, Sigma or similar), leveraging cloud platforms (AWS, Azure, GCP). This role involves leading a small team of 2-3 developers, actively contributing to technical delivery, and engaging with clients in an onshore/offshore model. We are particularly excited to find someone passionate about applying Generative AI (Gen AI) to transform the life sciences space, with a preferred understanding of healthcare concepts and data.

Key Responsibilities:
- Hands-On Technical Delivery: Actively design, develop, and optimize data warehouse solutions using Snowflake, Redshift, and Databricks, ensuring high performance and scalability.
- Reporting & Visualization: Build and refine advanced dashboards and reports using Power BI, Tableau, Sigma, and web-based platforms to deliver actionable insights.
- Cloud Expertise: Implement and manage data solutions on AWS, Azure, and GCP, maintaining cutting-edge technical proficiency.
- Master Data Management: Execute MDM processes to ensure data quality, governance, and integration, with a focus on life sciences applications.
- Team Leadership: Lead and mentor a small team of 2-3 developers, providing technical guidance, code reviews, and workload coordination.
- Gen AI Exploration: Drive the application of Generative AI techniques to solve challenges in the life sciences domain, such as drug discovery, patient analytics, or personalized medicine.
- Client Collaboration: Work closely with clients to gather requirements, propose solutions, and deliver results that align with business and scientific objectives.
- Project Contribution: Support project execution within Agile frameworks, collaborating with onshore and offshore teams to meet deadlines.
- Innovation: Contribute to internal initiatives, particularly those exploring Gen AI and healthcare-focused analytics.

Key Qualifications:
Technical Skills:
- 4+ years in data engineering, analytics, or a related technical field, with at least 2 years in a leadership or managerial role.
- Strong proficiency in technologies like Redshift, Teradata, Databricks, Snowflake, or similar solutions.
- Experience handling huge volumes of data and setting up large-scale solutions using tools like Airflow, Airbyte, dbt, etc.
- Proficiency with cloud platforms like AWS, Azure, or Google Cloud, including services like S3, EC2, and Lambda.
- Proficiency with database technologies such as MySQL, PostgreSQL, SnowSQL, etc.
- Familiarity with back-end technologies like Node.js, Python (Django/Flask), Ruby on Rails, or Java (Spring Boot).
- Familiarity with front-end technologies such as HTML5, CSS3, JavaScript, and frameworks like React.js, Angular, or Vue.js.
- Experience with API design and development (RESTful and/or GraphQL).
- Knowledge of version control systems like Git and collaboration platforms such as GitHub, GitLab, or Bitbucket.
- Experience working with US-based pharma clients and datasets is preferred.

Leadership and Management:
- Strong leadership skills with experience in building and managing technical teams.
- Excellent problem-solving abilities and a strategic mindset.
- Strong knowledge of master data management principles and practices.
- Excellent project management skills, with experience in Agile/Scrum frameworks.

What We Offer:
- Opportunity to work on transformative healthcare projects.
- A collaborative and inclusive work environment.
- Competitive salary, performance-based bonuses, and professional development opportunities.
- Work on cutting-edge cloud and Gen AI solutions.
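As a toy illustration of the MDM work this role mentions, the sketch below deduplicates customer master records in pandas, keeping the most recently updated record as the "golden record" under a simple survivorship rule. Real MDM platforms apply far richer matching and survivorship logic; the column names and data here are invented.

```python
# Minimal MDM-style sketch: collapse duplicate master records per
# customer_id, letting the most recently updated record survive.
import pandas as pd

records = pd.DataFrame({
    "customer_id": [101, 101, 102],
    "name": ["Acme Pharma", "ACME Pharma Inc.", "BioGen Labs"],
    "updated_at": pd.to_datetime(["2024-01-05", "2024-06-30", "2024-03-12"]),
})

golden = (
    records.sort_values("updated_at", ascending=False)
           .drop_duplicates(subset="customer_id", keep="first")
)
print(golden)  # one surviving "golden record" per customer_id
```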

Posted 4 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Issue Remediation Senior Analyst – C11

About us: Analytics & Information Management
AIM is a global community driving data-driven transformation across Citi in multiple functions, with the objective of creating actionable intelligence for our business leaders. We are a fast-growing organization working with Citi businesses and functions across the world.

Remediation & Remuneration COE
The Remediation team is responsible for cross-functional coordination of customer-facing remediation efforts. It provides oversight, prioritization, and scheduling of remediation activities with remediation partner teams including Technology, FSO, Analytics groups, Shared Services (mail vendor) and Controllers. The R&R AIM team works as the data analytics partner for the Issue Remediation business team.

Job responsibilities:
The R&R team manages the analysis of customer remediation issues across the globe, currently in the retail consumer bank. The critical areas of work are divided into:
- Remediation analysis: Execution of the comprehensive data remediation approach on customer issues arising from gaps observed in policies and governance, whether self-identified or identified through IA.
- Impact assessment: Identification of the number of customers and the dollar amount impacted by these issues.
- Issue management & root cause analysis: Identifying the issues and the reasons behind them by leveraging analytical methods.
- Audit support: Tracking implementation plans and providing data evidence and artifacts for audit completion.

Expertise Required:
Tools and Platforms
- Proficient in SAS, SQL, RDBMS, Teradata, Unix
- Proficient in MS Excel, PowerPoint, and VBA
- Jira, Bitbucket, Mainframes
- Exposure to Big Data and Python

Domain Skills
- Good understanding of the banking domain and consumer products (retail banking, deposits, loans, wealth management, mortgage, insurance, etc.)
- (Preferred) Knowledge of finance regulations and understanding of the retail business/banking domain

Analytical Skills
- Ability to identify, clearly articulate and solve complex business problems and present them to management in a structured and simpler form
- Data analysis, data profiling, and data management skills
- MIS reporting and generation of actionable business insights
- Devising automated techniques to reduce redundancy, remove false positives and enhance optimization
- Identification of control gaps and provision of recommendations as per the data strategy
- (Preferred) Exposure to risk & control metrics and audit frameworks

Interpersonal Skills
- Excellent communication and interpersonal skills
- Good process/project management skills
- Ability to work well across multiple functional areas
- Ability to thrive in a dynamic and fast-paced environment
- Identifying and implementing new collaboration ideas
- Contributing to organizational initiatives in wide-ranging areas including competency development, training, and organization-building activities
- Proactive approach to solving problems and an eye for detail; identifying process gaps in solution implementation and suggesting alternatives

Other Info:
- Education Level: Master's / advanced degree in Information Technology / Computer Applications / Engineering / MBA from a premier institute
- Overall experience of 5-8 years, with at least 2 years in the banking industry delivering data solutions
- Job Category: Decision Management
- Schedule: Full-time
- Shift: Regular local working hours (aligned with NAM working hours)

Job Family Group: Decision Management
Job Family: Data/Information Management
Time Type: Full time

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster, the EEO is the Law Supplement, the EEO Policy Statement, and the Pay Transparency Posting.
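An impact assessment of the kind described above often reduces to a sizing query over the affected population. The sketch below shows one hedged way to run it from Python with the teradatasql driver; the host, credentials, schema, table, and issue ID are all hypothetical.

```python
# Minimal sketch: size the customer population and dollar impact of a
# remediation issue on Teradata. All object names are illustrative.
import teradatasql

IMPACT_SQL = """
SELECT COUNT(DISTINCT customer_id) AS impacted_customers,
       SUM(overcharge_amt)         AS impacted_dollars
FROM   remediation.fee_overcharges
WHERE  issue_id = ?
"""

with teradatasql.connect(host="tdprod.example.com",
                         user="analyst", password="***") as conn:
    with conn.cursor() as cur:
        cur.execute(IMPACT_SQL, ["ISS-2024-017"])  # hypothetical issue id
        customers, dollars = cur.fetchone()
        print(f"{customers} customers, ${dollars:,.2f} impacted")
```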

Posted 4 weeks ago

Apply

3.0 - 8.0 years

13 - 14 Lacs

Hyderabad, Chennai

Work from Office

Naukri logo

Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and encouraging team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a growing internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension / retirement benefits
- Paid time off and personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (Tuesdays, Wednesdays, and a day unique to each team or employee)

The impact you will have in this role:
The Data Quality and Integration role is a highly technical position, considered a technical expert in system implementation, with an emphasis on providing design, ETL, data quality and warehouse modeling expertise. This role will be accountable for knowledge of capital development efforts. It performs the technical design of application components at an experienced level, builds applications and interfaces between applications, and understands data security, retention, and recovery. The role can research technologies independently and recommend appropriate solutions; contributes to technology-specific best-practice standards; contributes to success criteria from design through deployment, including reliability, cost-effectiveness, performance, data integrity, maintainability, reuse, extensibility, usability and scalability; provides expertise on significant application components, vendor products, programming languages, databases, operating systems, etc.; and completes the plan by building components, testing, configuring, tuning, and deploying solutions.

The Software Engineer (SE) for Data Quality and Integration applies specific technical knowledge of data quality and data integration to assist in the design and construction of critical systems. The SE works as part of an AD project squad and may interact with the business, Functional Architects, and domain experts on related integrating systems. The SE will contribute to the design of components or individual programs and participate fully in construction and testing. This involves working with the Senior Application Architect and other technical contributors at all levels. This position contributes expertise to project teams through all phases, including post-deployment support: researching specific technologies and applications, contributing to the solution design, supporting development teams, testing, troubleshooting, and production support. The SE must possess experience integrating large volumes of data efficiently and in a timely manner. This position requires working closely with the functional and governance functions and more senior technical resources, reviewing technical designs and specifications, and contributing to cost estimates and schedules.

What You'll Do:
- Technology Expertise: Be a domain expert in one or more of programming languages, vendor products (specifically Informatica Data Quality and Informatica Data Integration Hub), DTCC applications, data structures, and business lines.
- Platforms: Work with infrastructure partners to stand up development, testing, and production environments.
- Requirements Elaboration: Work with the Functional Architect to ensure designs satisfy functional requirements.
- Data Modeling: Review and extend data models.
- Data Quality Concepts: Experience in data profiling, scorecards, monitoring, matching, and cleansing.
- Frameworks: Awareness of frameworks that promote concepts of isolation, extensibility, and extendibility.
- System Performance: Contribute to solutions that satisfy performance requirements; construct test cases and strategies that account for performance requirements; tune application performance issues.
- Security: Implement solutions and complete test plans while mentoring other team members in standard processes.
- Standards: Awareness of technology standards and understanding that technical solutions need to be consistent with them.
- Documentation: Develop and maintain system documentation.
- Familiarity with different software development methodologies (Waterfall, Agile, Scrum, Kanban).
- Align risk and control processes with day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.
- Educational background and work experience that includes mathematics and conversion of expressions into run-time executable code.
- Ensure your own and the team's practices support success across all geographic locations.
- Mitigate risk by following established procedures and monitoring controls, spotting key errors and demonstrating strong ethical behavior.
- Help roll out standards and policies to other team members.
- Financial industry experience, including trades, clearing and settlement.

Education and Experience:
- Bachelor's degree or equivalent experience.
- Minimum of 3+ years in Data Quality and Integration.
- Basic understanding of logical data modeling and database design is a plus.
- Technical experience with multiple database platforms: Sybase, Oracle, DB2 and distributed databases like Teradata / Greenplum / Redshift / Snowflake containing high volumes of data.
- Knowledge of data management processes and standard methodologies preferred.
- Proficiency with Microsoft Office tools required.
- Supports the team in managing client expectations and resolving issues on time.
- Strong technical skills highly preferred, along with strong analytical skills.

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. Please contact us to request accommodation.
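As a tool-agnostic illustration of the data-profiling concepts listed above (the role itself centers on Informatica Data Quality rather than hand-rolled scripts), the sketch below computes null rates, distinct counts, and value ranges per column as scorecard inputs, using pandas on a small invented trades dataset.

```python
# Minimal data-profiling sketch: one scorecard row per column, covering
# null rate, cardinality, and range. Sample data is invented.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return a simple profiling scorecard for each column of df."""
    rows = []
    for col in df.columns:
        s = df[col]
        rows.append({
            "column": col,
            "null_pct": round(100 * s.isna().mean(), 2),
            "distinct": s.nunique(dropna=True),
            "min": s.min(skipna=True) if s.notna().any() else None,
            "max": s.max(skipna=True) if s.notna().any() else None,
        })
    return pd.DataFrame(rows)

trades = pd.DataFrame({"trade_id": [1, 2, 2, None],
                       "qty": [100, None, 50, 75]})
print(profile(trades))
```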

Posted 4 weeks ago

Apply

8.0 - 13.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

SUMMARY
Job Role: Snowflake Data Engineering Professional
Location: Bangalore
Experience: The ideal candidate should possess at least 8 years of experience in Snowflake with a focus on Data Engineering.
Primary Skills: Proficiency in Snowflake, DBT, and AWS.
Good-to-Have Skills: Familiarity with Fivetran (HVR) and Python.

Responsibilities:
- Design, develop, and maintain data pipelines using Snowflake, DBT, and AWS.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize and troubleshoot existing data workflows to ensure efficiency and reliability.
- Implement best practices for data management and governance.
- Stay updated with the latest industry trends and technologies to continuously improve the data infrastructure.

Required Skills:
- Strong experience in data modeling, ETL processes, and data warehousing.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.

Preferred Skills:
- Knowledge of Fivetran (HVR) and Python.
- Familiarity with data integration tools and techniques.
- Ability to work in a fast-paced and agile environment.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8 years of relevant experience in Snowflake with Data Engineering.
- Proficiency in Snowflake, DBT, and AWS.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
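As a hedged sketch of how the Snowflake + dbt stack named above can fit together, below is a minimal dbt Python model (supported on Snowflake via Snowpark). The referenced staging model stg_orders and the aggregation are illustrative, not part of this posting.

```python
# Minimal sketch of a dbt Python model on Snowflake. In a dbt project,
# this file would live under models/ and dbt supplies the `dbt` and
# `session` objects at run time. "stg_orders" is a hypothetical model.
import snowflake.snowpark.functions as F

def model(dbt, session):
    dbt.config(materialized="table")
    orders = dbt.ref("stg_orders")  # Snowpark DataFrame on Snowflake
    return (
        orders.group_by("customer_id")
              .agg(F.sum("order_amount").alias("lifetime_value"))
    )
```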

Posted 4 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies