
794 Teradata Jobs - Page 11

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Overview C5i C5i is a pure-play AI & Analytics provider that combines the power of human perspective with AI technology to deliver trustworthy intelligence. The company drives value through a comprehensive solution set, integrating multifunctional teams that have technical and business domain expertise with a robust suite of products, solutions, and accelerators tailored for various horizontal and industry-specific use cases. At the core, C5i’s focus is to deliver business impact at speed and scale by driving adoption of AI-assisted decision-making. C5i caters to some of the world’s largest enterprises, including many Fortune 500 companies. The company’s clients span Technology, Media, and Telecom (TMT), Pharma & Lifesciences, CPG, Retail, Banking, and other sectors. C5i has been recognized by leading industry analysts like Gartner and Forrester for its Analytics and AI capabilities and proprietary AI-based platforms. Global offices United States | Canada | United Kingdom | United Arab Emirates | India Job Summary We are looking for experienced Data Modelers to support large-scale data engineering and analytics initiatives. The role involves developing logical and physical data models, working closely with business and engineering teams to define data requirements, and ensuring alignment with enterprise standards. • Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse or other cloud data warehousing technologies. • Govern data design/modelling – documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. • Develop a deep understanding of business domains like Customer, Sales, Finance, Supplier, and the enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. • Drive collaborative reviews of data model design, code, data, and security features to drive data product development. • Demonstrate expertise with data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; SAP Data Model. • Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping. • Partner with the data stewards team for data discovery and action by business customers and stakeholders. • Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. • Assist with data planning, sourcing, collection, profiling, and transformation. • Support data lineage and mapping of source system data to canonical data stores. • Create Source to Target Mappings (STTM) for ETL and BI developers. Skills needed: • Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models; CPG/Manufacturing/Sales/Finance/Supplier/Customer domains) • Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake. • Experience with version control systems like GitHub and deployment & CI tools. • Experience with metadata management, data lineage, and data glossaries is a plus. • Working knowledge of agile development, including DevOps and DataOps concepts. • Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data such as IRI and Nielsen. C5i is proud to be an equal opportunity employer.
We are committed to equal employment opportunity regardless of race, color, religion, sex, sexual orientation, age, marital status, disability, gender identity, etc. If you have a disability or special need that requires accommodation, please let us know during the hiring process so that we can make the necessary accommodations.
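For context, a Source to Target Mapping of the kind this role produces can be applied mechanically in code. The sketch below is a minimal, hypothetical PySpark example on a Delta Lakehouse; the table names, columns, and types are illustrative assumptions, not details from the posting.

```python
# Hypothetical STTM applied with PySpark on a Delta Lakehouse.
# All table and column names below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sttm_example").getOrCreate()

# Source-to-target mapping expressed as data: source column -> (target column, target type)
sttm = {
    "cust_id": ("customer_id", "bigint"),
    "cust_nm": ("customer_name", "string"),
    "ord_amt": ("order_amount", "decimal(18,2)"),
    "ord_dt":  ("order_date", "date"),
}

source_df = spark.read.table("raw.sales_orders")  # extract from the source system

# Rename and cast every column according to the mapping
target_df = source_df.select(
    [F.col(src).cast(dtype).alias(tgt) for src, (tgt, dtype) in sttm.items()]
)

# Physicalise the logical model as a curated Delta table
target_df.write.format("delta").mode("overwrite").saveAsTable("curated.fact_orders")
```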

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous, and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triages to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Deliver (performance parameters and measures): 1. Process – No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT. 2. Team Management – Productivity, efficiency, absenteeism. 3. Capability Development – Triages completed, Technical Test performance. Mandatory Skills: Teradata. Experience: 5-8 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills.
We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

25 - 30 Lacs

Indore, Chennai

Work from Office


We are seeking a Senior Python DevOps Engineer to develop Python services and build CI/CD pipelines for AI/data platforms. Must have strong cloud, container, and ML workflow deployment experience. Required Candidate Profile: Experienced Python DevOps engineer with expertise in CI/CD, cloud, and AI platforms. Skilled in Flask/FastAPI, Airflow, MLflow, and model deployment on Dataiku and OpenShift.
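As a rough sketch of the stack this listing names (FastAPI services packaged for container platforms such as OpenShift), a minimal service might look like the following; the endpoints and the scoring stub are hypothetical and not part of the listing.

```python
# Minimal FastAPI service sketch; endpoint names and the scoring logic are
# illustrative assumptions only, not taken from the job listing.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-service")

class PredictRequest(BaseModel):
    features: list[float]

@app.get("/health")
def health() -> dict:
    # Typically wired to container liveness/readiness probes
    return {"status": "ok"}

@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    # A real service would load a registered model (e.g. from MLflow); stubbed here
    score = sum(req.features) / max(len(req.features), 1)
    return {"score": score}

# Local run: uvicorn main:app --reload
```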

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Title: GCP Teradata Engineer Location: Chennai, Bangalore, Hyderabad Experience: 4-6 Years Job Summary We are seeking a GCP Data & Cloud Engineer with strong expertise in Google Cloud Platform services, including BigQuery, Cloud Run, Cloud Storage, and Pub/Sub. The ideal candidate will have deep experience in SQL coding, data pipeline development, and deploying cloud-native solutions. Key Responsibilities Design, implement, and optimize scalable data pipelines and services using GCP Build and manage cloud-native applications deployed via Cloud Run Develop complex and performance-optimized SQL queries for analytics and data transformation Manage and automate data storage, retrieval, and archival using Cloud Storage Implement event-driven architectures using Google Pub/Sub Work with large datasets in BigQuery, including ETL/ELT design and query optimization Ensure security, monitoring, and compliance of cloud-based systems Collaborate with data analysts, engineers, and product teams to deliver end-to-end cloud solutions Required Skills & Experience 4 years of experience working with Google Cloud Platform (GCP) Strong proficiency in SQL coding, query tuning, and handling complex data transformations Hands-on experience with: BigQuery, Cloud Run, Cloud Storage, Pub/Sub Understanding of data pipeline and ETL/ELT workflows in cloud environments Familiarity with containerized services and CI/CD pipelines Experience in scripting languages (e.g., Python, Shell) is a plus Strong analytical and problem-solving skills
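The two GCP building blocks this posting stresses, Pub/Sub for event-driven ingestion and BigQuery for analytics, can be exercised with the official Python clients. The sketch below is illustrative only; the project, topic, and table names are assumptions, not values from the posting.

```python
# Sketch of two GCP building blocks: publishing an event to Pub/Sub and
# running a BigQuery query. Project, topic, and table names are placeholders.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"   # assumed project id
TOPIC = "order-events"   # assumed topic name

# Event-driven ingestion: publish a message for downstream pipelines
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
payload = json.dumps({"order_id": 123, "status": "created"}).encode("utf-8")
publisher.publish(topic_path, data=payload).result()

# Analytics: a performance-conscious BigQuery query (filter before aggregating)
bq = bigquery.Client(project=PROJECT)
sql = """
    SELECT customer_id, SUM(order_amount) AS total_spend
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
"""
for row in bq.query(sql).result():
    print(row.customer_id, row.total_spend)
```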

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities, and the planet. Job Title: Data Fullstack - Descriptive Analytics Location: Chennai Work Type: Onsite Position Description: The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google Big Query, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core line of businesses for both employees and dealers. This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily Operational tasks until the products are well understood and will then progress to assisting with Engineering tasks. Skills Required: GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift Experience Required: Position Qualifications: Bachelor's Degree in a relevant field At least 5 years of experience with Descriptive Analytics technologies Dev/Ops experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell and managing large GCP installations (OR) System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engines or OpenShift Virtualization VMs Strong troubleshooting and problem-solving skills Understanding of Product Life Cycle Ability to coordinate issue resolution with vendors on behalf of the client Strong written and verbal communication skills Understanding of technologies like Power BI, Big Query, Teradata, SQL Server, Oracle DB2, etc. Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.) Experience Preferred: Recommended: Experience with PowerApps and Power Automate Familiarity with Jira Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC) Education Required: Bachelor's Degree TekWissen® Group is an equal opportunity employer supporting workforce diversity.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Perforce is a community of collaborative experts, problem solvers, and possibility seekers who believe work should be both challenging and fun. We are proud to inspire creativity, foster belonging, support collaboration, and encourage wellness. At Perforce, you’ll work with and learn from some of the best and brightest in business. Before you know it, you’ll be in the middle of a rewarding career at a company headed in one direction: upward. With a global footprint spanning more than 80 countries and including over 75% of the Fortune 100, Perforce Software, Inc. is trusted by the world’s leading brands to deliver solutions for the toughest challenges. The best-run DevOps teams in the world choose Perforce. Position Summary: The Delphix team is seeking engineers with a passion for data security to join our data compliance engineering team. In this position, you will get the opportunity to contribute to the Hyperscale Compliance product, for which development and testing are driven completely by the India Engineering Team. This product was launched into the market in July 2022 and handles compliance use cases for large-scale datasets. Delphix Hyperscale Compliance is based on a microservices architecture and uses a cluster of Delphix Continuous Compliance engines to achieve faster results. Delphix Continuous Compliance provides a single platform to secure and deliver data across the enterprise, ensuring that sensitive information is protected and allowing data operators to centrally manage security policies and compliance requirements. You will be responsible for writing automation tests for complex features and performing manual, performance, and scale tests. You will work closely with Product Management, customers, and other engineering stakeholders to design tests for the new solution. You will also collaborate with other team members to deliver highly scalable, secure, and maintainable features. Responsibilities: Collaborate with Product Management and other stakeholders within Engineering to maintain a high bar for quality. Advocate for improvements to product quality, security, and performance. Craft code that meets our internal standards for style, maintainability, and best practices for a high-scale web environment. Maintain and advocate for these standards through code reviews. Scope, design and implement test automation for allocated features. Monitor and maintain test automation across multiple platforms and configurations. Requirements: 2-4 years of experience testing enterprise applications deployed on-prem and/or in the cloud using Python Proficiency in Python or a related programming language Excellent analytical and problem-solving skills Ability and desire to work in a fast-paced, test-driven, agile, collaborative, and iterative programming environment Ability to think clearly and articulate your vision with the appropriate technical depth A desire to build great products, learn new technical areas, and dive in wherever there is a need Desired Experience: Proficiency in Docker and Kubernetes Previous experience writing automation frameworks in Python Deep understanding of file systems and operating systems Experience with large/complex relational databases, data warehouses (Oracle, SQL Server, DB2, Azure, Amazon RDS, Teradata, etc.) and other business data formats (XML, ASC X12, etc.) Come work with us! Our team members are valued for their contributions, introduced to new opportunities, and rewarded well.
Perforce combines the experience and rewards of a start-up with the security of an established and privately held profitable company. If you are passionate about the technology that impacts our day-to-day lives and want to work with talented and dedicated people across the globe, apply today! www.perforce.com
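Test automation of the kind this role describes typically drives a REST API from pytest. The snippet below is a hypothetical sketch; the endpoint, payload, and status values are invented for illustration and are not Delphix's actual API.

```python
# Hypothetical pytest-style API test; the endpoint, payload, and statuses are
# illustrative assumptions and do not describe the real Delphix API.
import requests

BASE_URL = "https://compliance-engine.example.com/api"  # assumed test environment

def test_masking_job_is_accepted_and_tracked():
    # Submit a (hypothetical) masking job
    resp = requests.post(f"{BASE_URL}/jobs", json={"ruleset_id": 42}, timeout=30)
    assert resp.status_code == 201
    job_id = resp.json()["job_id"]

    # The job should be visible and in a valid state immediately after submission
    status = requests.get(f"{BASE_URL}/jobs/{job_id}", timeout=30).json()["status"]
    assert status in {"QUEUED", "RUNNING", "SUCCEEDED"}
```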

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Overview TekWissen is a global workforce management provider throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities, and the planet. Job Title: Systems Engineering Practitioner Location: Chennai Duration: 12 Months Work Type: Onsite Position Description The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google Big Query, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core line of businesses for both employees and dealers. This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily Operational tasks until the products are well understood and will then progress to assisting with Engineering tasks. Skills Required GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift Experience Required Position Qualifications: Bachelor's Degree in a relevant field At least 5 years of experience with Descriptive Analytics technologies Dev/Ops experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell and managing large GCP installations (OR) System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engines or OpenShift Virtualization VMs Strong troubleshooting and problem-solving skills Understanding of Product Life Cycle Ability to coordinate issue resolution with vendors on behalf of the client Strong written and verbal communication skills Understanding of technologies like Power BI, Big Query, Teradata, SQL Server, Oracle DB2, etc. Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.) Experience Preferred Recommended: Experience with PowerApps and Power Automate Familiarity with Jira Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC) Education Required Bachelor's Degree TekWissen® Group is an equal opportunity employer supporting workforce diversity.

Posted 2 weeks ago

Apply

15.0 years

0 Lacs

Maharashtra, India

On-site


What You’ll Do The COE Solution Development Lead is a thought leader responsible for overseeing the detailed design, development, and maintenance of complex data and analytics solutions. This role requires strong technical management, project management, team building, mentoring and interpersonal skills, a deep understanding of the Teradata Solutions Strategy, Teradata Technology, and Data Architecture, and an understanding of the partner engagement model. This role reports directly to Teradata’s Head of Solution COE. Key Responsibilities: Lead the team responsible for development of scalable, efficient, and innovative data and analytics solutions that address complex business problems. Oversee the end-to-end process of solution development, including data ingestion, storage, processing, analysis, and visualization. Lead the development of comprehensive solution architectures, including identifying key components, integration points, and potential challenges, ensuring alignment with industry requirements and business objectives. Ensure design flexibility for integration of various data sources and platforms, optimizing for both real-time and batch processing where needed. Design and build custom data pipelines and dashboards, leveraging emerging technologies. Implement best practices in the development of data analytics solutions to maintain security, privacy, and legal compliance, and collaborate with relevant domain owners for review and compliance. Collaborate with senior leadership to align data and analytics goals with broader company objectives, focusing on business value creation through IP and analytics. Lead and mentor a team of data scientists, analysts, engineers, and IP professionals to foster a culture of innovation, collaboration, and continuous learning. Deliver solutions on time and within budget. Facilitate knowledge sharing across teams and ensure that data solutions are scalable, secure, and aligned with the organization’s overall technological roadmap. The successful candidate will work from any Teradata GSIH (Global Services Innovation Hub) facility or virtually in India/Pakistan/Prague and will be expected to travel on business (20-30% travel). Who You’ll Work With You would work with the COE Solutions Lead to transform the conceptual solution into a detailed design and develop it into a packaged solution. You will lead a team of data scientists, solution engineers, data engineers and software engineers, plus borrowed resources with the same skills from other organizations in Teradata. Collaborate with product development, legal, IT, and business teams to ensure seamless integration of data analytics solutions and the protection of related IP. What Makes You a Qualified Candidate Educational Requirements: Bachelor’s degree in Computer Science, Engineering, Data Science or a related field (MS or MBA preferred). Skills & Experience Required: 15+ years of experience in IT, with 10+ years in data & analytics solution development, IP management, or a related field, including at least 4 years in a leadership or senior management position. Proven track record of successfully developing data-driven solutions (e.g., AI/ML models, analytics platforms, data pipelines) and managing a diverse IP portfolio. Experience working with cross-functional teams, including legal, R&D, and product teams, to develop and protect analytics-related IP. Strong understanding of emerging trends in data analytics technologies, such as big data, cloud computing, and artificial intelligence.
Deep technical knowledge in data analytics, machine learning, or related fields, combined with strong business acumen. In-depth understanding of intellectual property laws, patent filings, and IP strategy in the technology sector. Strong negotiation, communication, and relationship-building skills. Ability to manage complex, cross-functional projects while ensuring alignment with organizational goals. Excellent leadership and people management skills, with a focus on fostering an innovative, collaborative environment. What You’ll Bring Ability and willingness to learn. Collaborative attitude and team player. Very strong analytical skills. Ability to independently manage critical situations. Story-building skills. Critical thinking and problem-solving skills. Excellent communication skills, both written and verbal, with the ability to present complex ideas to technical and non-technical audiences. Ability to coordinate with SMEs and stakeholders, manage timelines and escalations, and deliver on time. Ability to prioritize assignments, handle shifting deadlines, and work independently as well as in a team environment. Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 2 weeks ago

Apply

15.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site


What You’ll Do The COE Solution Development Lead is a thought leader responsible for overseeing the detailed design, development, and maintenance of complex data and analytics solutions. This role requires strong technical management, project management, team building, mentoring and interpersonal skills, a deep understanding of the Teradata Solutions Strategy, Teradata Technology, and Data Architecture, and an understanding of the partner engagement model. This role reports directly to Teradata’s Head of Solution COE. Key Responsibilities: Lead the team responsible for development of scalable, efficient, and innovative data and analytics solutions that address complex business problems. Oversee the end-to-end process of solution development, including data ingestion, storage, processing, analysis, and visualization. Lead the development of comprehensive solution architectures, including identifying key components, integration points, and potential challenges, ensuring alignment with industry requirements and business objectives. Ensure design flexibility for integration of various data sources and platforms, optimizing for both real-time and batch processing where needed. Design and build custom data pipelines and dashboards, leveraging emerging technologies. Implement best practices in the development of data analytics solutions to maintain security, privacy, and legal compliance, and collaborate with relevant domain owners for review and compliance. Collaborate with senior leadership to align data and analytics goals with broader company objectives, focusing on business value creation through IP and analytics. Lead and mentor a team of data scientists, analysts, engineers, and IP professionals to foster a culture of innovation, collaboration, and continuous learning. Deliver solutions on time and within budget. Facilitate knowledge sharing across teams and ensure that data solutions are scalable, secure, and aligned with the organization’s overall technological roadmap. The successful candidate will work from any Teradata GSIH (Global Services Innovation Hub) facility or virtually in India/Pakistan/Prague and will be expected to travel on business (20-30% travel). Who You’ll Work With You would work with the COE Solutions Lead to transform the conceptual solution into a detailed design and develop it into a packaged solution. You will lead a team of data scientists, solution engineers, data engineers and software engineers, plus borrowed resources with the same skills from other organizations in Teradata. Collaborate with product development, legal, IT, and business teams to ensure seamless integration of data analytics solutions and the protection of related IP. What Makes You a Qualified Candidate Educational Requirements: Bachelor’s degree in Computer Science, Engineering, Data Science or a related field (MS or MBA preferred). Skills & Experience Required: 15+ years of experience in IT, with 10+ years in data & analytics solution development, IP management, or a related field, including at least 4 years in a leadership or senior management position. Proven track record of successfully developing data-driven solutions (e.g., AI/ML models, analytics platforms, data pipelines) and managing a diverse IP portfolio. Experience working with cross-functional teams, including legal, R&D, and product teams, to develop and protect analytics-related IP. Strong understanding of emerging trends in data analytics technologies, such as big data, cloud computing, and artificial intelligence.
Deep technical knowledge in data analytics, machine learning, or related fields, combined with strong business acumen. In-depth understanding of intellectual property laws, patent filings, and IP strategy in the technology sector. Strong negotiation, communication, and relationship-building skills. Ability to manage complex, cross-functional projects while ensuring alignment with organizational goals. Excellent leadership and people management skills, with a focus on fostering an innovative, collaborative environment. What You’ll Bring Ability and willingness to learn. Collaborative attitude and team player. Very strong analytical skills. Ability to independently manage critical situations. Story-building skills. Critical thinking and problem-solving skills. Excellent communication skills, both written and verbal, with the ability to present complex ideas to technical and non-technical audiences. Ability to coordinate with SMEs and stakeholders, manage timelines and escalations, and deliver on time. Ability to prioritize assignments, handle shifting deadlines, and work independently as well as in a team environment. Why We Think You’ll Love Teradata We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 2 weeks ago

Apply

15.0 years

0 Lacs

India

On-site


SAS Solution Designer We are seeking a highly experienced SAS Solution Designer to join our team in a solution engineering lead capacity. This role requires in-depth knowledge of SAS technologies, cloud-based platforms, and data solutions. The ideal candidate will be responsible for end-to-end solution design aligned with enterprise architecture standards and business objectives, providing technical leadership across squads and development teams. Mitra AI is currently looking for experienced SAS Solution Designers who are based in India and are open to relocating. This is a hybrid opportunity in Sydney, Australia. JOB SPECIFIC DUTIES & RESPONSIBILITIES Own and define the end-to-end solution architecture for data platforms, ensuring alignment with business objectives, enterprise standards, and architectural best practices. Design reliable, stable, and scalable SAS-based solutions that support long-term operational effectiveness. Lead solution engineers and Agile squads to ensure the delivery of high-quality, maintainable data solutions. Collaborate independently with business and technical stakeholders to understand requirements and translate them into comprehensive technical designs. Provide high-level estimates for proposed features and technical initiatives to support business planning and prioritization. Conduct and participate in solution governance forums to secure approval for data designs and strategies. Drive continuous improvement by identifying technical gaps and implementing best practices, emerging technologies, and enhanced processes. Facilitate work breakdown sessions and actively participate in Agile ceremonies such as sprint planning and backlog grooming. Ensure quality assurance through rigorous code reviews, test case validation, and enforcement of coding and documentation standards. Troubleshoot complex issues by performing root cause analysis, log reviews, and coordination with relevant teams for resolution. Provide mentoring and coaching to solution engineers and technical leads to support skills growth and consistency in solution delivery. REQUIRED COMPETENCIES AND SKILLS Deep expertise in SAS technologies and ecosystem. Strong proficiency in cloud-based technologies and data platforms (e.g., Azure, Hadoop, Teradata). Solid understanding of RDBMS, ETL/ELT tools (e.g., Informatica), and real-time data streaming. Ability to work across relational and NoSQL databases and integrate with various data and analytics tools. Familiarity with BI and reporting tools such as Tableau and Power BI. Experience guiding Agile delivery teams, supporting full-stack solution development through DevOps and CI/CD practices. Capability to define and implement secure, scalable, and performant data solutions. Strong knowledge of metadata management, reference data, and data lineage concepts. Ability to communicate effectively with both technical and non-technical stakeholders. Problem-solving mindset with attention to detail and an emphasis on delivering high-quality solutions. REQUIRED EXPERIENCE AND QUALIFICATIONS A minimum of 15 years of experience in solution design and development roles, including leadership responsibilities. Strong exposure to SAS and enterprise data platforms in the financial services industry. Prior experience working within risk, compliance, or credit risk domains is highly desirable. Practical experience with Agile methodologies and DevOps principles. Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
Experience working in cross-functional teams with a focus on business alignment and technology delivery.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Greetings from TCS! TCS is hiring for Informatica PowerCenter with Teradata/Oracle. Location: Chennai Desired Experience Range: 4 - 6 Years Must-Have: • Informatica, IICS • Teradata/Oracle • Unix Good-to-Have: • Control-M • Designing and impact analysis experience with the above technologies • Agile Scrum experience • Exposure to data ingestion from disparate sources onto a big data platform Thanks, Anshika

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


TEKsystems is seeking a Senior AWS + Data Engineer to join our dynamic team. The ideal candidate should have expertise as a Data Engineer with Hadoop, Scala/Python, and AWS services. This role involves designing, developing, and maintaining scalable and reliable software solutions. Job Title: Data Engineer – Spark/Scala (Batch Processing) Location: Manyata (Hybrid) Experience: 7+ years Type: Full-Time Mandatory Skills: 7-10 years’ experience in design, architecture or development in Analytics and Data Warehousing. Experience in building end-to-end solutions with the Big Data platform and Spark or Scala programming. 5 years of solid experience in ETL pipeline building with the Spark or Scala programming framework, with knowledge of developing UNIX Shell Script and Oracle SQL/PL-SQL. Experience in the Big Data platform for ETL development with the AWS cloud platform. Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, Step Functions, SQS, SNS, CloudWatch. Excellent skills in Python-based framework development are mandatory. Should have experience with Oracle SQL database programming, SQL performance tuning, and relational model analysis. Extensive experience with Teradata data warehouses and Cloudera Hadoop. Proficient across Enterprise Analytics/BI/DW/ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, and Hive. Analytics & BI architecture appreciation and broad experience across all technology disciplines. Experience in working within a Data Delivery Life Cycle framework & Agile methodology. Extensive experience in large enterprise environments handling large volumes of datasets with high SLAs. Good knowledge of developing UNIX scripts, Oracle SQL/PL-SQL, and Autosys JIL scripts. Well versed in AI-powered engineering tools like Cline and GitHub Copilot. Please send resumes to nvaseemuddin@teksystems.com or kebhat@teksystems.com
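As an illustration of the batch-processing stack this listing describes (Spark reading from and writing to S3 on AWS), a minimal PySpark job might look like the sketch below; the bucket paths and column names are placeholders, not details from the listing.

```python
# Minimal batch ETL sketch in PySpark: read raw CSVs from S3, aggregate, and
# write partitioned Parquet back to S3. Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_batch").getOrCreate()

# s3:// works on EMR via EMRFS; open-source Spark typically uses s3a://
raw = spark.read.option("header", True).csv("s3://example-raw/sales/2025-06-15/")

daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .groupBy("region", "sale_date")
       .agg(F.sum("amount").alias("total_amount"),
            F.countDistinct("order_id").alias("order_count"))
)

# Partitioned output keeps downstream Athena / Redshift Spectrum scans cheap
daily.write.mode("overwrite").partitionBy("sale_date").parquet(
    "s3://example-curated/daily_sales/"
)
```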

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra

Remote


Solution Engineer - Data & AI Mumbai, Maharashtra, India Date posted Jun 16, 2025 Job number 1830869 Work site Up to 50% work from home Travel 25-50% Role type Individual Contributor Profession Technology Sales Discipline Technology Specialists Employment type Full-Time Overview As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Qualifications 6+ years technical pre-sales, technical consulting, or technology delivery, or related experience OR equivalent experience 4+ years experience with cloud and hybrid, or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks), and Azure Synapse Gen2. OR 5+ years technical pre-sales or technical consulting experience OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ year(s) technical pre-sales or technical consulting experience OR equivalent experience Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps. Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes. Responsibilities Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft’s cloud platforms.
Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work. • Industry-leading healthcare • Educational resources • Discounts on products and services • Savings and investments • Maternity and paternity leave • Generous time away • Giving programs • Opportunities to network and connect Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana

Remote


Solution Engineer - Cloud & Data AI Gurgaon, Haryana, India Date posted Jun 16, 2025 Job number 1830866 Work site Up to 50% work from home Travel 25-50% Role type Individual Contributor Profession Technology Sales Discipline Technology Specialists Employment type Full-Time Overview As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Qualifications Preferred: 6+ years technical pre-sales, technical consulting, or technology delivery, or related experience OR equivalent experience 4+ years experience with cloud and hybrid, or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks), and Azure Synapse Gen2. OR 5+ years technical pre-sales or technical consulting experience OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ year(s) technical pre-sales or technical consulting experience OR equivalent experience Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps. Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and other cloud products (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes. Responsibilities Drive technical conversations with decision makers using demos and PoCs to influence solution design and enable production deployments.
Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft’s cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in the Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work. • Industry-leading healthcare • Educational resources • Discounts on products and services • Savings and investments • Maternity and paternity leave • Generous time away • Giving programs • Opportunities to network and connect Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 19 Lacs

Mumbai

Work from Office


Overview The Climate Data and Content Services team is responsible for the research and assessment of carbon footprint and business initiatives related to climate change for approximately 14,000 companies globally. The team’s research is focused on climate-related metrics, which include carbon emissions, energy performance, and climate change targets and commitments, using a variety of sources, such as annual reports, sustainability reports, quantitative data feeds from third-party providers, news publications, and other company disclosures. Responsibilities Your primary responsibility is to contribute to the sector expertise of the team and to support all our climate data integration efforts. You will support the quality of our Climate Change Metrics data set within your sector and engage with internal and external stakeholders for better understanding of methodology and data. Your tasks will include reviewing and validating climate data provided by vendors and issuers. In addition, you are expected to address queries pertaining to our Climate data from our internal and external users, reconciling data challenges, training users on our methodologies and processes, and maintaining process documentation. Work with a global team of researchers, data and IT specialists, and vendors to enhance and improve our research and assessment of companies’ carbon footprint, climate change targets and commitments, and climate change mitigation practices. Capture and transform climate data metrics into meaningful information/ratings/scores; Drive coverage and content expansion projects defined by the business; Update and refine industry assessment guides for data collectors and internal analysts; Qualifications Bachelor's degree in finance, statistics, sustainability, environmental science, business, or related degrees. Knowledge of climate change issues and regulations, including carbon offsets, climate reporting standards (e.g. TCFD, ISSB) and frameworks (e.g. IPCC, UNFCCC, etc.). A minimum of 5 years' working experience in the Energy (Oil & Gas), Industrials, or Transport sectors and/or a background in GHG assessments, GRI reporting, or environmental impact assessments. Experience in data visualization tools is desirable. Ability to articulate and communicate complex concepts with ease. Fast learner and strong logical thinker. Ability to work under minimal supervision. What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.
At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 2 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description: Location: Indore, Noida, Pune and Bengaluru Qualifications: BE/B.Tech/MCA/M.Tech/M.Com in Computer Science or a related field Required Skills: EDW Expertise: Hands-on experience with Teradata or Oracle. PL/SQL Proficiency: Strong ability to write complex queries. Performance Tuning: Expertise in optimizing queries to meet SLA requirements. Communication: Strong verbal and written communication skills. Experience Required (1-3 Years) Preferred Skills: Cloud Technologies: Working knowledge of AWS S3 and Redshift or equivalent. Database Migration: Familiarity with database migration processes. Big Data Tools: Understanding of Spark SQL and PySpark. Programming: Experience with Python for data processing and analytics. Data Management: Experience with import/export operations. Roles & Responsibilities Module Ownership: Manage a module and assist the team. Optimized PL/SQL Development: Write efficient queries. Performance Tuning: Improve database speed and efficiency. Requirement Analysis: Work with business users to refine needs. Application Development: Build solutions using complex SQL queries. Data Validation: Ensure integrity of large datasets (TB/PB). Testing & Debugging: Conduct unit testing and fix issues. Database Strategies: Apply best practices for development. Interested candidates can share their resumes at anubhav.pathania@impetus.com
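Performance tuning on Teradata, as called out above, usually starts with reading the optimizer's EXPLAIN plan before running the query. The sketch below uses the teradatasql Python driver as an illustration; the host, credentials, and table names are placeholders, not details from the posting.

```python
# Sketch of a query-tuning workflow against Teradata with the teradatasql
# driver: inspect the EXPLAIN plan first, then run the query.
# Host, credentials, and table names are placeholder assumptions.
import teradatasql

query = """
    SELECT store_id, SUM(net_sales) AS total_sales
    FROM edw.daily_sales
    WHERE sales_date BETWEEN DATE '2025-01-01' AND DATE '2025-03-31'
    GROUP BY store_id
"""

with teradatasql.connect(host="edw-host.example.com",
                         user="etl_user", password="***") as con:
    with con.cursor() as cur:
        # Review the optimizer's plan (full-table scans, spool usage, etc.)
        cur.execute("EXPLAIN " + query)
        for row in cur.fetchall():
            print(row[0])

        # Run the query once the plan looks acceptable
        cur.execute(query)
        for store_id, total_sales in cur.fetchall():
            print(store_id, total_sales)
```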

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 27 Lacs

Bengaluru

Work from Office

Naukri logo

About the Role
As part of the AI Data organization, the Enterprise Business Intelligence (EBI) team is central to NXP's data analytics success. We provide and maintain scalable data solutions, platforms, and methodologies that empower business users to create self-service analytics and drive data-informed decisions. We are seeking a Data Engineering Manager to lead a team of skilled Data Engineers. In this role, you will be responsible for overseeing the design, development, and maintenance of robust data pipelines and data models across multiple data platforms, including Databricks, Teradata, Postgres and others. You will collaborate closely with Product Owners, Architects, Data Scientists, and cross-functional stakeholders to ensure high-quality, secure, and scalable data solutions.

Key Responsibilities
• Lead, mentor, and grow a team of Data Engineers, fostering a culture of innovation, collaboration, and continuous improvement.
• Oversee the design, development, and optimization of ETL/ELT pipelines and data workflows across multiple cloud and on-premise environments (a small pipeline sketch follows this listing).
• Ensure data solutions align with enterprise architecture standards, including performance, scalability, security, privacy, and compliance.
• Collaborate with stakeholders to translate business requirements into technical specifications and data models.
• Drive adoption of best practices in data engineering, including code quality, testing, version control, and CI/CD.
• Partner with the Operational Support team to troubleshoot and resolve data issues and incidents.
• Stay current with emerging technologies and trends in data engineering and analytics.

Required Skills & Qualifications
• Proven experience as a Data Engineer with at least 12+ years in ETL/ELT design and development.
• 5+ years of experience in a technical leadership or management role, with a track record of building and leading high-performing teams.
• Strong hands-on experience with cloud platforms (AWS, Azure) and their data services (e.g., S3, Redshift, Glue, Azure Data Factory, Synapse).
• Proficiency in SQL, Python, and PySpark for data transformation and processing.
• Experience with data orchestration tools and CI/CD pipelines (GitHub Actions, GitLab CI).
• Familiarity with data modeling, data warehousing, and data lake architectures.
• Understanding of data governance, security, and compliance frameworks (e.g., GDPR, HIPAA).
• Excellent communication and stakeholder management skills.

Preferred Skills & Qualifications
• Experience with Agile methodologies and DevOps practices.
• Proficiency with Databricks, Teradata, Postgres, Fivetran HVR and DBT.
• Knowledge of AI/ML workflows and integration with data pipelines.
• Experience with monitoring and observability tools.
• Familiarity with data cataloging and metadata management tools (e.g., Alation, Collibra).
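One way the CI/CD and code-quality expectations above translate into day-to-day work is keeping transformation logic in pure, unit-testable functions. The sketch below assumes PySpark and invented dataset and column names; it is an illustration of the pattern, not NXP's actual pipeline code.

```python
# Hedged sketch of a CI/CD-friendly PySpark transformation: the business
# logic is a pure function over DataFrames, so it can be unit-tested
# without touching Databricks, Teradata, or Postgres.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def daily_revenue(orders: DataFrame) -> DataFrame:
    """Aggregate order amounts per day, excluding cancelled orders."""
    return (
        orders.filter(F.col("status") != "CANCELLED")
              .groupBy("order_date")
              .agg(F.sum("amount").alias("revenue"),
                   F.count("*").alias("order_count"))
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("elt_sketch").getOrCreate()
    sample = spark.createDataFrame(
        [("2024-01-01", "SHIPPED", 100.0),
         ("2024-01-01", "CANCELLED", 50.0),
         ("2024-01-02", "SHIPPED", 75.0)],
        ["order_date", "status", "amount"],
    )
    # In a real pipeline the result would be written to a curated table;
    # here we just display it.
    daily_revenue(sample).show()
```

Because the function takes and returns DataFrames, a pytest case can feed it a three-row fixture and assert on the output, which is what makes the CI step meaningful.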

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Noida

Work from Office

Naukri logo

">Business Analyst 5-10 Years Noida EDW BA Job Summary: We are seeking a detail-oriented and business-savvy EDW Functional Analyst / Business Analyst to join our data team. In this role, you will work closely with business stakeholders including leaders and subject matter experts to understand business requirements and translate them into data solutions that drive strategic decision-making. You will serve as a functional expert with a deep understanding of business processes, data analysis, and Enterprise Data Warehousing (EDW). Key Responsibilities: Collaborate with business stakeholders to gather, analyze, and document requirements related to reporting, KPIs, and business metrics. Translate business needs into comprehensive documentation, including Business Requirement Documents (BRD) , Functional Requirement Documents (FRD) , and user stories . Perform detailed data analysis using SQL to identify data gaps and inconsistencies, and recommend solutions. Partner with data architects and technical teams to ensure alignment between business requirements and data solutions. Contribute to the design of KPIs, metrics, and data models to support business intelligence and reporting initiatives. Ensure stakeholder alignment and satisfaction throughout the development lifecycle through effective communication and engagement. Adapt to evolving business requirements while maintaining a focus on strategic outcomes and data integrity. Required Skills Qualifications: Proven experience in requirement gathering and stakeholder management . Strong proficiency in data analysis using SQL . Solid understanding of data warehousing concepts and business intelligence processes. Experience creating BRDs, FRDs , and user stories . Ability to translate complex business requirements into clear, actionable insights and documentation. Excellent verbal and written communication skills. Strong problem-solving abilities and attention to detail. Preferred Qualifications: Experience with major EDW platforms (e.g., Snowflake, Redshift, Teradata, etc.). Familiarity with BI tools like Tableau, Power BI, or Looker. Background in a domain such as finance, healthcare, retail, etc.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Noida

Work from Office

Naukri logo

">QA Manual - EDW 5-10 Years Noida QA Manual EDW Job Summary: We are looking for a meticulous and analytical Quality Analyst to join our Enterprise Data Warehouse (EDW) team. In this role, you will be responsible for ensuring the accuracy, consistency, and integrity of data across the warehouse by designing and executing thorough testing strategies. You will collaborate closely with digital and business teams to validate data transformations, identify discrepancies, and ensure alignment with business requirements. Key Responsibilities: Develop and execute detailed test plans, test cases, and test strategies for EDW components and data pipelines. Perform functional and technical testing , including data validation, transformation logic verification, and end-to-end testing. Use SQL to validate data accuracy across source systems, staging, and data warehouse layers. Identify, log, and track defects using defect tracking tools; assist in root cause analysis and drive resolution in coordination with development teams. Collaborate with business analysts and developers to ensure thorough understanding of requirements and provide support during User Acceptance Testing (UAT) . Maintain testing documentation including test cases, data mapping documents, and QA status reports. Ensure adherence to business rules and data quality standards throughout the testing lifecycle. Continuously improve test processes and recommend quality best practices. Required Skills Qualifications: Proven experience in test planning, test execution , and data validation for data warehouse and ETL environments. Strong proficiency in SQL for data querying, validation, and debugging. Solid understanding of data warehousing concepts , data models, and ETL processes. Hands-on experience with defect tracking tools (e.g., JIRA, HP ALM, Azure DevOps). Excellent documentation and communication skills. Strong problem-solving ability and attention to detail . Ability to work independently and collaboratively in a cross-functional team environment. Preferred Qualifications: Experience in testing within major EDW platforms (e.g., Snowflake, Redshift, Teradata). Familiarity with automation tools or frameworks for data validation. Exposure to BI/reporting tools (e.g., Tableau, Power BI) for report testing.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

5 - 9 Lacs

Pune

On-site

GlassDoor logo

About VOIS
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

VOIS India
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Must-have technical / professional qualifications:
Primary Skills: Teradata SQL & ETL, Linux/Unix shell scripting
Alternate Skills: GCP (BigQuery, Dataform, Dataproc), Python, Git or similar versioning tools, Airflow or similar scheduling tools (an orchestration sketch follows this listing)

Good experience in:
• Minimum experience of 4-8 years in data engineering/data warehousing/ETL engineering
• Good knowledge of SQL
• Good knowledge of Python scripting
• Good understanding of cloud-native platforms (GCP)
• Telecommunication experience

Core competencies, knowledge and experience:
Essential:
• Strong data warehouse development experience in cloud-native technologies (GCP preferred)
• Strong SQL experience - advanced level of SQL scripting
• Expert in Python (at least 12 months in real-time projects)
• Excellent data interpretation skills
• Good knowledge of data warehouse and business intelligence, and a good understanding of a wide range of data manipulation and analysis techniques
• Excellent verbal, written and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; able to interact with the customer team and share ideas
• Strong analytical, problem-solving and decision-making skills, and the attitude to plan and organize work to deliver as agreed
• Hands-on experience in working with large datasets
• Able to manage different stakeholders

Experience:
• Exceptional data manipulation and analysis techniques; comfortable using very large (tens of millions of rows) datasets containing both structured and unstructured data
• Designing and implementing changes to the existing data model
• Developing and maintaining relational staging areas of the application layer
• Supporting the operations team on data quality, data consistency issues and essential business-critical processes
• Driving system optimization and simplification

VOIS Equal Opportunity Employer Commitment
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society.
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.

As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies that put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
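The Teradata-plus-shell-scripting and Airflow combination listed above typically takes the form of a DAG that sequences shell-wrapped load steps. Below is a minimal, hedged sketch assuming Airflow 2.x; the script paths, dataset names, and daily schedule are assumptions, not details from the posting.

```python
# Hedged sketch: an Airflow 2.x DAG chaining shell-script ETL steps of the
# kind this role mentions (a Teradata load, then a BigQuery refresh).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="teradata_daily_load_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Step 1: run a Unix shell wrapper around a Teradata BTEQ/TPT load.
    # (Trailing space stops Airflow from treating the .sh path as a template.)
    load_teradata = BashOperator(
        task_id="load_teradata_staging",
        bash_command="/opt/etl/scripts/load_staging.sh ",  # hypothetical script
    )

    # Step 2: refresh a downstream BigQuery dataset once the load succeeds.
    refresh_bigquery = BashOperator(
        task_id="refresh_bigquery_marts",
        bash_command="bq query --use_legacy_sql=false < /opt/etl/sql/refresh_marts.sql",
    )

    load_teradata >> refresh_bigquery
```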

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Andhra Pradesh

On-site

GlassDoor logo

Software Engineering Advisor - HIH - Evernorth

About Evernorth:
Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Position Summary:
Data engineer on the Data Integration team.

Job Description & Responsibilities:
• Work with business and technical leadership to understand requirements.
• Design to the requirements and document the designs.
• Write production-grade, performant code for data extraction, transformation and loading using Spark and PySpark (a small extract-transform sketch follows this listing).
• Perform data modeling as needed for the requirements.
• Write performant queries using Teradata SQL, Hive SQL and Spark SQL against Teradata and Hive.
• Implement DevOps pipelines to deploy code artifacts onto the designated platform/servers (AWS / Azure / GCP).
• Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment.
• Participate in sprint planning sessions, refinement/story-grooming sessions, daily scrums, demos and retrospectives.

Experience Required:
• Overall 8-10 years of experience.

Experience Desired:
• Strong development experience in Spark, PySpark, shell scripting and Teradata.
• Strong experience in writing complex and effective SQL (Teradata SQL, Hive SQL and Spark SQL) and stored procedures.
• Health care domain knowledge is a plus.

Primary Skills:
• Excellent work experience with Databricks for Data Lake implementations.
• Experience in Agile and working knowledge of DevOps tools (Git, Jenkins, Artifactory).
• Experience in AWS (S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch) / GCP / Azure.
• Databricks (Delta Lake, notebooks, pipelines, cluster management, Azure/AWS integration).

Additional Skills:
• Experience with Jira and Confluence.
• Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.

Location & Hours of Work: Hybrid, Hyderabad (11:30 AM - 8:30 PM)

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
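A minimal sketch of the extract-transform-load work described above, expressed in PySpark with the Spark SQL dialect the posting names. The claims table, columns, and output path are invented for illustration; in a real job the source would be a Hive table or a JDBC read against Teradata.

```python
# Hedged sketch of a PySpark ETL step: query a stand-in claims table with
# Spark SQL and land the result as Parquet. All names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Stand-in for a Hive/Teradata source table.
spark.createDataFrame(
    [("M001", "2024-01-05", 1200.0, "APPROVED"),
     ("M001", "2024-02-11", 300.0, "DENIED"),
     ("M002", "2024-01-20", 450.0, "APPROVED")],
    ["member_id", "service_date", "billed_amount", "status"],
).createOrReplaceTempView("claims")

# Transformation expressed in Spark SQL: monthly approved spend per member.
monthly = spark.sql("""
    SELECT member_id,
           substr(service_date, 1, 7) AS service_month,
           SUM(billed_amount)         AS approved_amount
    FROM claims
    WHERE status = 'APPROVED'
    GROUP BY member_id, substr(service_date, 1, 7)
""")

# Load step: write to a columnar format a downstream job or report can read.
monthly.write.mode("overwrite").parquet("/tmp/claims_monthly_sketch")
```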

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Position Description
Experience: 8+ years
Location: Hyderabad
Notice period: Only immediate joiners

• 8+ years of strong ETL Informatica experience.
• Should have Oracle, Hadoop and MongoDB experience.
• Strong SQL/Unix knowledge.
• Experience in working with RDBMS; preference for Teradata.
• Good to have Big Data/Hadoop experience.
• Good to have Python or other programming knowledge.

Your future duties and responsibilities

Required Qualifications To Be Successful In This Role
• B.Tech

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

At GoDaddy the future of work looks different for each team. Some teams work in the office full-time, others have a hybrid arrangement (they work remotely some days and in the office some days) and some work entirely remotely. This is a remote position, so you'll be working remotely from your home. You may occasionally visit a GoDaddy office to meet with your team for events or meetings.

Join Our Team
As a data-driven company, GoDaddy is looking for a quick learner and results-oriented Senior Analytics Engineer to join our Strategic Finance team. Strategic Finance is part of GoDaddy's Finance Data, Analytics, and Technology team, and our overall mission is to optimize the power of data insights and automation solutions by enabling capabilities around technology, data, and people for improved efficiency and scalability. In Strategic Finance, we leverage business intelligence and financial models to derive data-driven insights that drive top-line, strategic growth. As part of this team, you will enable efficiency and velocity of insight discovery through data product development: velocity is key, but be a good citizen! You will join forces with analytics and finance leaders to abstract complexity and enable data democratization. It's key that you can translate financial analyst and business partner needs into data products that eliminate the distance between raw data and strategic action. We are seeking team members who are technical and highly communicative. The right person will raise the profile and perfection of our entire team. You can make a difference here. An affinity for efficiency is a huge plus!

What you'll get to do
• Build highly customised data structures to enable rapid-fire analytics and visualisation (a small aggregation sketch follows this listing).
• Perform data automation tasks using a variety of tools: Amazon Redshift, Spectrum, S3, SQL Server, Teradata, Athena, not to mention ETL administration tools - we use it all here.
• Act as the specialist on technical implementation for novel data products: from the warehouse layer to the database/analytics/query layer to the visualisation layer, you know how to structure data sets and workflows for high efficiency and performance.
• Optimise new and existing workflows for gains in performance of the query and visualisation layers (all the way to the tuning of Tableau/QuickSight workbooks and data source performance).
• Build end-to-end reporting solutions from multiple data structures and sources.
• Support the design, development, and implementation of enterprise-wide views, dashboards, and custom reporting.
• Document your work products and conform to data governance standards set by the core data engineering team along the way.

Your experience should include
• 5+ years of work experience using data engineering and analytics to build an enterprise.
• Proven development skills, particularly in SQL and Python.
• Familiarity with the gamut of old-school and next-gen database technologies, particularly the AWS technical stack, Teradata, and MSSQL.
• Experience balancing both speed and quality; you know how to context-switch between the two priorities depending on the project and task.
• Experience with Tableau, crafting data sources, and tuning for performance.

We've got your back
We offer a range of total rewards that may include paid time off, retirement savings (e.g., 401k, pension schemes), bonus/incentive eligibility, equity grants, participation in our employee stock purchase plan, competitive health benefits, and other family-friendly benefits including parental leave. GoDaddy's benefits vary based on individual role and location and can be reviewed in more detail during the interview process.
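The "customised data structures for rapid-fire analytics" above usually means pre-aggregating raw facts into a compact extract that the visualisation layer can query cheaply. A hedged pandas sketch of that step follows; the column names and output path are illustrative, and in practice the aggregation would often run in Redshift or Teradata instead.

```python
# Hedged sketch: pre-aggregate raw rows into a small, BI-friendly extract
# so a Tableau/QuickSight workbook queries a summary table, not raw facts.
import pandas as pd

# Stand-in for rows pulled from Redshift/Teradata via an extract job.
raw = pd.DataFrame({
    "order_date": ["2024-01-01", "2024-01-01", "2024-01-02", "2024-01-02"],
    "product":    ["domains", "hosting", "domains", "hosting"],
    "revenue":    [120.0, 80.0, 95.0, 110.0],
})

# One row per date/product keeps the visualisation layer fast and cheap.
summary = (
    raw.groupby(["order_date", "product"], as_index=False)
       .agg(revenue=("revenue", "sum"), orders=("revenue", "size"))
)

# Land the extract where the BI tool (or a publish job) can pick it up.
summary.to_csv("/tmp/finance_daily_summary.csv", index=False)
print(summary)
```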

Posted 2 weeks ago

Apply

3.0 - 6.0 years

25 - 30 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

Naukri logo

Senior Analyst, Data Engineer

Data Science is all about breaking new ground to enable businesses to answer their most urgent questions. Pioneering massively parallel, data-intensive analytic processing, our mission is to develop a whole new approach to generating meaning and value from petabyte-scale data sets and shape brand new methodologies, tools, statistical methods and models. What's more, we collaborate with leading academics, industry experts and highly skilled engineers to equip our customers to generate sophisticated new insights from the biggest of big data. Join us to do the best work of your career and make a profound social impact as a Data Engineering Sr. Analyst on our Data Engineering Team in Bangalore/Hyderabad/Gurgaon.

What you'll achieve
As a Data Engineer, you will build and support leading-edge, AI-fueled, data-driven business solutions within the Services Parts, Services Repair and Services Engineering space. You will:
• Interact with business leaders and internal customers to create and establish design standards and assurance processes for software, systems and applications development to ensure compatibility and operability.
• Analyze business goals, define project scope and identify functional and technical requirements to produce accurate business insights.
• Develop fundamentally new approaches to generate meaning from data, creating specifications for reports and analysis based on business needs.
• Support existing AI and Analytics products by ensuring optimal operations and enhancing them with new functionality.

Take the first step towards your dream career. Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:

Essential Requirements
• A proactive, outcomes-driven mindset
• Detailed analytical data skills
• Hands-on experience developing coding, data and analytics solutions
• Hands-on knowledge of SQL
• Hands-on knowledge of Python, Java, or another modern programming language

Desirable Requirements
• Teradata macro programming
• Data management concepts

Application closing date: 30th June 25

Posted 2 weeks ago

Apply

3.0 - 6.0 years

25 - 30 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

Naukri logo

Data Engineering Advisor

Data Science is all about breaking new ground to enable businesses to answer their most urgent questions. Pioneering massively parallel, data-intensive analytic processing, our mission is to develop a whole new approach to generating meaning and value from petabyte-scale data sets and shape brand new methodologies, tools, statistical methods and models. What's more, we collaborate with leading academics, industry experts and highly skilled engineers to equip our customers to generate sophisticated new insights from the biggest of big data. Join us to do the best work of your career and make a profound social impact as a Data Engineering Advisor on our Data Engineering Team in Bangalore/Hyderabad/Gurgaon.

What you'll achieve
As a Data Engineer, you will build leading-edge, AI-fueled, data-driven business solutions within the Services Parts, Services Repair and Services Engineering space. You will:
• Interact with business leaders and internal customers to create and establish design standards and assurance processes for software, systems and applications development to ensure compatibility and operability.
• Analyze business goals, define project scope and identify functional and technical requirements to produce accurate business insights.
• Develop fundamentally new approaches to generate meaning from data, creating specifications for reports and analysis based on business needs.
• Support existing AI and Analytics products by ensuring optimal operations and enhancing them with new functionality.

Take the first step towards your dream career. Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:

Essential Requirements
• A proactive, outcomes-driven mindset
• Detailed analytical data skills
• Hands-on experience developing coding, data and analytics solutions
• Hands-on expert knowledge of SQL
• Hands-on expert knowledge of Python, Java, or another modern programming language

Desirable Requirements
• Teradata macro programming
• Data management concepts

Application closing date: 30th June 25

Posted 2 weeks ago

Apply