79 Netezza Jobs

JobPe aggregates listings for easy application access, but you actually apply directly on the original job portal.

8.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Source: Naukri

Urgent Opening for Solution Architect - Data Warehouse - Bangalore Posted On 04th Jul 2019 12:25 PM Location Bangalore Role / Position Solution Architect - Data Warehouse Experience (required) 8 plus years Description 8-10 years of consulting or IT experience supporting Enterprise Data Warehouse & Business Intelligence environments, including experience with data warehouse architecture & design, ETL design/development, and analytics. Responsible for defining the data strategy and for ensuring that programs and projects align to that strategy. Provides thought leadership in the following areas: Data Access, Data Integration, Data Visualization, Data Modeling, Data Quality and Metadata Management; Analytics, Data Discovery, use of statistical methods, Database Design and Implementation. Expertise in database appliances, RDBMS, Teradata, Netezza. Hands-on experience with data architecting, data mining, large-scale data modeling, and business requirements gathering/analysis. Experience in ETL and data migration tools. Direct experience in implementing enterprise data management processes, procedures, and decision support. Strong understanding of relational data structures, theories, principles, and practices. Strong familiarity with metadata management and associated processes. Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, and data profiling tools. Demonstrated expertise with repository creation, and data and information system life cycle methodologies. Experience with business requirements analysis, entity relationship planning, database design, reporting structures, and so on. Ability to manage data and metadata migration. Experience with data processing flowcharting techniques. Hands-on experience in Big Data technologies (5 years) - Hadoop, MapReduce, MongoDB, and integration with legacy environments - would be preferred. Experience with Spark using Scala or Python is a big plus. Experience in cloud technologies (primarily AWS, Azure) and integration with existing on-premise data warehouse technologies. Good knowledge of S3, Redshift, Blob Storage, Presto DB, etc. Attitude to learn and adopt emerging technologies. Send resumes to girish.expertiz@gmail.com

Posted Just now

5.0 - 10.0 years

5 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Key Responsibilities: Design, develop, and maintain QlikView applications and dashboards. Collaborate with business stakeholders to gather requirements and translate them into technical specifications. Perform data analysis and create data models to support business intelligence initiatives. Optimize QlikView applications for performance and scalability. Provide technical support and troubleshooting for QlikView applications. Ensure data accuracy and integrity in all QlikView applications. Integrate Snowflake with QlikView to enhance data processing and analytics capabilities. Stay updated with the latest QlikView features and best practices. Conduct training sessions for end-users to maximize the utilization of QlikView applications. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience of 2-5 years as a QlikView Developer. Strong knowledge of QlikView architecture, data modeling, and scripting. Proficiency in SQL and database management. Knowledge of Snowflake and its integration with QlikView. Excellent analytical and problem-solving skills. Ability to work independently and as part of a team. Strong communication and interpersonal skills.
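
For context on the Snowflake-to-QlikView integration this role describes, here is a minimal, hypothetical Python sketch that pre-aggregates data in Snowflake for a QlikView dashboard to pick up over ODBC. The account, credentials, and table names are illustrative assumptions, not details from the posting.

```python
# Hypothetical sketch: materialize an aggregate in Snowflake that a QlikView
# load script can then read over ODBC. All identifiers are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # assumed Snowflake account identifier
    user="qlik_etl",            # assumed service user for the Qlik feed
    password="***",
    warehouse="REPORTING_WH",
    database="ANALYTICS",
    schema="MARTS",
)
try:
    # Pre-aggregating in the warehouse keeps the QlikView load script simple
    # and pushes the heavy computation to Snowflake.
    conn.cursor().execute("""
        CREATE OR REPLACE TABLE SALES_DAILY AS
        SELECT order_date, region, SUM(amount) AS total_amount
        FROM RAW_ORDERS
        GROUP BY order_date, region
    """)
finally:
    conn.close()
```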

Posted 3 hours ago

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Source: Naukri

Develops ETL solutions using Informatica PowerCenter.

Posted 17 hours ago

4.0 - 6.0 years

6 - 8 Lacs

Chennai

Work from Office

Source: Naukri

Develop and manage data integration workflows using IBM InfoSphere DataStage. You will design, implement, and optimize ETL processes to ensure efficient data processing. Expertise in DataStage, ETL tools, and database management is required for this role.

Posted 17 hours ago

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Develop and manage ETL processes using Informatica, ensuring smooth data extraction, transformation, and loading across multiple systems. Optimize data workflows to ensure high-quality data management.

Posted 17 hours ago

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Design and develop ETL processes using IBM DataStage. Focus on data integration, transformation, and loading, ensuring efficient data pipelines.

Posted 17 hours ago

4.0 - 6.0 years

6 - 8 Lacs

Chennai

Work from Office

Source: Naukri

Designs and develops ETL pipelines using IBM InfoSphere DataStage. Handles data integration and transformation tasks.

Posted 17 hours ago

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Design and implement business intelligence solutions using MicroStrategy. Build dashboards, reports, and analytics tools that provide actionable insights to help organizations make informed business decisions.

Posted 17 hours ago

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Source: Naukri

Design and implement data integration solutions using IBM InfoSphere DataStage. Develop ETL jobs, write PL/SQL scripts, and use Unix Shell Scripting for text processing to manage large datasets efficiently.

Posted 17 hours ago

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

Source: LinkedIn

Key Responsibilities: Lead the end-to-end migration of legacy data warehouses (e.g., Teradata, Oracle, SQL Server, Netezza, Redshift) to Snowflake. Assess the current data architecture and define the migration strategy, roadmap, and timelines. Develop ELT/ETL pipelines using tools such as dbt, Apache Airflow, Matillion, Talend, Informatica, etc. Optimize Snowflake configurations, including clustering, caching, and resource management, for performance and cost efficiency. Implement security best practices, including role-based access, masking, and data encryption. Collaborate with data engineering, analytics, and business teams to ensure accurate and efficient data transfer. Create and maintain technical documentation, including migration plans, test scripts, and rollback procedures. Support validation, testing, and go-live activities. Required Skills & Experience: 5+ years in data engineering or data platform roles, with at least 2 years in Snowflake migration projects. Hands-on experience migrating large datasets from legacy data warehouses to Snowflake. Proficient in SQL, Python, and Snowflake scripting (SnowSQL, stored procedures, UDFs). Experience with data migration tools and frameworks (e.g., AWS SCT, Azure Data Factory, Fivetran, etc.). Strong knowledge of cloud platforms (AWS, Azure, or GCP). Familiarity with DevOps practices, CI/CD for data pipelines, and version control (Git). Excellent problem-solving and communication skills. Preferred Qualifications: Snowflake certification(s): SnowPro Core or Advanced Architect. Experience with real-time data ingestion (e.g., Kafka, Kinesis, Pub/Sub). Background in data governance, data quality, and compliance (GDPR, HIPAA). Prior experience in Agile/Scrum delivery environments.
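
As a concrete illustration of the validation work this role describes, here is a minimal, hypothetical Python sketch that compares row counts between a legacy warehouse and Snowflake after a migration. The DSN, account details, and table list are assumptions for illustration, not details from the posting.

```python
# Hypothetical post-migration validation: compare row counts between a legacy
# warehouse (reached over ODBC) and Snowflake. All identifiers are assumptions.
import pyodbc
import snowflake.connector

TABLES = ["CUSTOMERS", "ORDERS", "PAYMENTS"]    # assumed tables in scope

legacy = pyodbc.connect("DSN=legacy_dw")        # assumed ODBC DSN (e.g. Netezza)
sf = snowflake.connector.connect(
    account="my_account", user="migration_svc", password="***",
    warehouse="MIGRATION_WH", database="EDW", schema="PUBLIC",
)

for table in TABLES:
    src = legacy.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    cur = sf.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    tgt = cur.fetchone()[0]
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{table}: legacy={src} snowflake={tgt} [{status}]")

legacy.close()
sf.close()
```

Row counts are only a first-pass check; a real cutover would add checksums or column-level comparisons before sign-off.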

Posted 20 hours ago

6.0 - 11.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

5+ years of experience in Teradata. Strong in Teradata; should know BTEQ scripts, FastLoad, and MLoad. Informatica PowerCenter experience. Good at writing SQL queries using joins. Proficient knowledge of the Dataiku tool is essential. Proficient in Python scripting: able to read logs, write and automate scripts, and build out servers.
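
Since the posting asks for Python scripting to automate jobs and read logs, here is a minimal, hypothetical sketch of that workflow: run a BTEQ script from Python and scan its log for failures. The script name, log path, and error handling are illustrative assumptions (the BTEQ script itself is assumed to contain its own .LOGON).

```python
# Hypothetical sketch: run a Teradata BTEQ script and check its log for errors.
# Paths and the failure pattern are assumptions for illustration.
import subprocess

BTEQ_SCRIPT = "load_daily.bteq"   # assumed BTEQ script on disk
LOG_FILE = "load_daily.log"

with open(BTEQ_SCRIPT) as script, open(LOG_FILE, "w") as log:
    # BTEQ reads its commands from stdin and writes session output to stdout.
    result = subprocess.run(["bteq"], stdin=script, stdout=log,
                            stderr=subprocess.STDOUT)

# Simple post-run check: BTEQ reports errors with "*** Failure" lines.
with open(LOG_FILE) as log:
    failures = [line.strip() for line in log if "*** Failure" in line]

if result.returncode != 0 or failures:
    raise SystemExit(f"BTEQ run failed: {failures or result.returncode}")
print("BTEQ run completed cleanly")
```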

Posted 23 hours ago

10.0 - 14.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

Must Haves:
o Strong experience with SQL development/NZSQL, including stored procedures.
o Must have strong experience with advanced SQL development and SQL optimization.
o Must have used external tables and NZLOAD for file loading and unloading (see the sketch after this list).
o Experience with materialized views and CBTs.
o Worked in AWS Redshift development or Netezza for at least 2-3 years.
o Strong in Unix/Linux shell scripting.
o Ability to interpret data models (3NF or Kimball) and code SQL accordingly.
o Must have used DevOps tooling: Jenkins, Bitbucket/Git/Sourcetree, automated test scripts using Unix, etc.
o Must have strong analytical and problem-solving skills.
o Must have implemented an end-to-end BI DWH application.
Good to Have:
o Understanding of Control-M, IBM CDC, EC2, etc.
o Understanding of AWS S3 and AWS DMS services.
o Understanding of reporting tools like Tableau or Cognos.
o Insurance domain knowledge would be an added advantage.
o Experience in Agile ways of working.
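
The sketch below illustrates the NZLOAD usage the must-have list refers to: a hypothetical Python wrapper that shells out to nzload to bulk-load a delimited file into a staging table. Hostname, credentials, table, and file path are assumptions for illustration.

```python
# Hypothetical sketch of file loading with NZLOAD. All identifiers below are
# assumptions, not values from the posting.
import subprocess

cmd = [
    "nzload",
    "-host", "netezza-prod",   # assumed appliance hostname
    "-db", "EDW",
    "-u", "etl_user",
    "-pw", "***",
    "-t", "STG_CLAIMS",        # assumed staging table
    "-df", "/data/claims_20240101.csv",
    "-delim", ",",
]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
    # nzload writes rejected-row details to its log/bad files; surface stderr here.
    raise SystemExit(f"nzload failed:\n{result.stderr}")
print("Load completed:", result.stdout.strip())
```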

Posted 23 hours ago

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Where Data Does More. Join the Snowflake team. The Technical Instructor for the Snowflake Customer Education and Training Team will be responsible for creating and delivering compelling education content and training sets that make complex concepts come alive in instructor-led classroom venues. The senior instructor will be seen as a subject matter expert and leader in transferring knowledge of Snowflake to customers, partners and internal teams and in accelerating their technical on-boarding journey. This role will also be responsible for cross-training efforts and program management, and will help strategically ramp multiple resources within our external stakeholders. This role is a unique opportunity to contribute in a meaningful way to high value and high impact delivery at a very exciting time for the company. Snowflake is an innovative, high-growth, customer-focused company in a large and growing market. If you are an energetic, self-managed professional with experience teaching data courses to customers and possess excellent presentation and communication skills, we'd love to hear from you. AS A TECHNICAL INSTRUCTOR AT SNOWFLAKE, YOU WILL: Teach a breadth of technical courses to onboard customers and partners to Snowflake, the data warehouse built for the Cloud. Cross-train a breadth of technical courses to qualified individuals and resources. The scope of course concepts may include foundational and advanced courses in the discipline, including Snowflake data warehousing concepts, novel SQL capabilities, data consumption and connectivity interfaces, data integration and ingestion capabilities, database security features, database performance topics, Cloud ecosystem topics and more. Apply database and data warehousing industry/domain/technology expertise and experience during training sessions to help customers and partners ease their organizations into the Snowflake data warehouse from prior database environments. Deliver content and cross-train on delivery best practices using a variety of presentation formats, including engaging lectures, live demonstration, and technical labs. Work with customers and partners that are investing in the train-the-trainer program to certify their selected trainers, ensuring they are well prepared and qualified to deliver the course at their organization. Strong eye for design, making complex training concepts come alive in a blended educational delivery model. Work with the education content developers to help prioritize, create, integrate, and publish training materials and hands-on exercises to Snowflake end users; drive continuous improvement of training performance. Work with additional Snowflake subject-matter experts in creating new education materials and updates to keep pace with Snowflake product updates. OUR IDEAL TECHNICAL INSTRUCTOR WILL HAVE: Strong data warehouse and data-serving platform background. Recent experience using SQL, potentially in complex workloads. 5-10 years of experience in technical content training development and delivery. Strong desire and ability to teach and train. Prior experience with other databases (e.g. Oracle, IBM Netezza, Teradata, ...). Excellent written and verbal communication skills. Innovative and assertive, with the ability to pick up new technologies. Presence: enthusiastic and high energy, but also poised, confident and extremely professional. Track record of delivering results in a dynamic start-up environment. Experience working cross-functionally, ideally with solution architects, technical writers, and support. Strong sense of ownership and high attention to detail. A degree in a field such as Computer Science or Management Information Systems. Comfortable with travel up to 75% of the time. BONUS POINTS FOR EXPERIENCE WITH THE FOLLOWING: Creating and delivering training programs to mass audiences. Other databases (e.g. Teradata, Netezza, Oracle, Redshift, ...). Non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase, ...). Familiarity and experience with common BI and data exploration tools (e.g. MicroStrategy, Business Objects, Tableau, ...). Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, ...). Experience with ETL pipeline tools. Experience using AWS and Microsoft services. Participation in train-the-trainer programs. Proven success at enterprise software startups. Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com

Posted 2 days ago

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

We are looking for a Delivery Manager with DWH experience. Location: Chennai, Noida & Bangalore. Required Skills: 12+ years of experience in managing delivery of Data Warehouse projects (development & modernization/migration). Strong delivery background with experience in managing large, complex Data Warehouse engagements. Good to have experience on Snowflake, Matillion, DBT, Netezza/DataStage and Oracle. Healthcare Payer industry experience. Extensive experience in Program/Project Management and Iterative, Waterfall and Agile methodologies. Ability to track and manage complex program budgets. Experience in managing the delivery of complex programs to meet the needs and the required timelines set for the defined programs. Communicate program review results to various stakeholders. Experience in building teams and providing guidance and education as needed to ensure the success of priority programs and promote cross-training within the department. Experience in developing and managing integrated program plans that incorporate both technical and business deliverables. Verify that critical decision gates are well defined, communicated and monitored for executive approval throughout the program. Verify that work supports the corporate strategic direction. Review resulting vendor proposals and estimates to ensure they satisfy both our functional requirements and technology strategies. Project management methodologies, processes, and tools. Knowledge of the project development life cycle. Establish and maintain strong working relationships with various stakeholders, including team members, IT resources, resources in other areas of the business, and upper management. Strong business acumen and political savvy. Ability to collaborate while dealing with complex situations. Ability to think creatively and to drive innovation. Ability to motivate, lead and inspire a diverse group to a common goal/solution with multiple stakeholders. Ability to convert business strategy into action-oriented objectives and measurable results. Strong negotiating, influencing, and consensus-building skills. Ability to mentor, coach and provide guidance to others. Responsibilities: Responsible for the end-to-end delivery of the Application Development and Support services for the client. Coordinate with the Enterprise Program Management Office to execute programs following defined standards and governance structure to ensure alignment to the approved project development life cycle (PDLC). Interface regularly with key senior business leaders to enable a smooth transition from strategy development to program identification and execution. Facilitate meetings with task groups or functional areas as required for EPMO-supported initiatives and/or to resolve issues. Proactively engage other members of the organization with specific subject knowledge to resolve issues or provide assistance. Lead post-implementation reviews of major initiatives to provide lessons learned and continuous improvement. Develop accurate and timely summary reports for executive management that provide consolidated, clear, and concise assessments of strategic initiatives' implementation status. Collaborate with business owners to develop divisional business plans that support the overall strategic direction. Support the budget allocation process through ongoing financial tracking reports. Develop and maintain service plans considering the customer requirements. Track and monitor to ensure adherence to SLAs/KPIs. Identify opportunities for improvement to the service delivery process. Address service delivery issues/escalations/complaints; first point of escalation for customer escalations. Oversee shift management for various tracks. Responsible for publishing production support reports and metrics.

Posted 3 days ago

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Key Responsibilities:
• ARCHITECTURE AND DESIGN FOR DATA ENGINEERING AND MACHINE LEARNING PROJECTS: Establishing architecture and target design for data engineering and machine learning projects.
• REQUIREMENT ANALYSIS, PLANNING, EFFORT AND RESOURCE NEEDS ESTIMATION: Current inventory analysis, review and formalize requirements, project planning and execution plan.
• ADVISORY SERVICES AND BEST PRACTICES: Troubleshooting, performance tuning, cost optimization, operational runbooks and mentoring.
• LARGE MIGRATIONS: Assist customers with large migrations to Databricks from Hadoop ecosystems, data warehouses (Teradata, DataStage, Netezza, Ab Initio), ETL engines (Informatica), SAS, SQL, DW, and cloud-based data platforms like Redshift, Snowflake, EMR, etc.
• DESIGN, BUILD AND OPTIMIZE DATA PIPELINES: The Databricks implementation will be best in class, with flexibility for future iterations (see the sketch after this list).
• PRODUCTION READINESS: Assisting with production readiness for customers, including exception handling, production cutover, capture analysis, alert scheduling and monitoring.
• MACHINE LEARNING (ML) - MODEL REVIEW, TUNING, ML OPERATIONS AND OPTIMIZATION: Build and review ML models, ML best practices, model lifecycle, ML frameworks and deployment of models in production.
Must Have:
▪ Pre-sales experience is a must.
▪ Hands-on experience with a distributed computing framework like Databricks or the Spark ecosystem (Spark Core, PySpark, Spark Streaming, Spark SQL).
▪ Willing to work with product teams to best optimize product features/functions.
▪ Experience with batch workloads and real-time streaming with high-volume data frequency.
▪ Performance optimization of Spark workloads.
▪ Environment setup, user management, authentication and cluster management on Databricks.
▪ Professional curiosity and the ability to enable yourself in new technologies and tasks.
▪ Good understanding of SQL and a good grasp of relational and analytical database management theory and practice.
Key Skills:
• Python, SQL and PySpark
• Big Data ecosystem (Hadoop, Hive, Sqoop, HDFS, HBase)
• Spark ecosystem (Spark Core, Spark Streaming, Spark SQL) / Databricks
• Azure (ADF, ADB, Logic Apps, Azure SQL Database, Azure Key Vaults, ADLS, Synapse)
• AWS (Lambda, AWS Glue, S3, Redshift)
• Data modelling, ETL methodology
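
As a minimal illustration of the pipeline work described above, here is a hypothetical PySpark sketch of a small batch job of the kind run on Databricks: read raw files, aggregate, and write a Delta table. Paths and column names are assumptions for illustration.

```python
# Hypothetical batch pipeline sketch: raw CSV in, curated Delta table out.
# All paths and columns are assumptions, not details from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-batch").getOrCreate()

orders = (spark.read
          .option("header", True)
          .csv("/mnt/raw/orders/"))            # assumed landing location

daily = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .groupBy("order_date", "region")
         .agg(F.sum("amount").alias("total_amount")))

# Writing as Delta is the Databricks-native choice for downstream consumers.
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_orders")
```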

Posted 3 days ago

7.0 - 9.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Looking for a specialized data engineer who focuses on building and maintaining data pipelines to collect, process, and analyze large volumes of transaction data, specifically for the purpose of detecting and preventing fraudulent activity within a company's payment systems. As a Senior Data Engineer: Oversee the integration and management of data from various sources and storage systems, establishing processes and pipelines to produce cohesive datasets for analysis and modeling. Design and develop data pipelines to automate repetitive tasks within data science and data engineering. Demonstrated experience leading cross-functional teams or working across different teams to solve complex problems. Partner with software engineering teams to deploy and validate production artifacts. Identify patterns and innovative solutions in existing spaces, consistently seeking opportunities to simplify, automate tasks, and build reusable components for multiple use cases and teams. Create data products that are well-modeled, thoroughly documented, and easy to understand and maintain. Comfortable leading projects in environments with undefined or loose requirements. Qualifications: Bachelor's degree in Computer Science, Information Systems, or another related field. 7 to 9 years of experience in Data Engineering and implementing multiple end-to-end DW projects in a Big Data environment. Experience in building data pipelines through Spark with Java/Scala on Hadoop or object storage. Experience working with databases like Oracle and Netezza, with strong SQL knowledge. Experience working on NiFi will be an added advantage. Experience working with APIs will be an added advantage. Experience working in a Unix environment is an added advantage. Strong analytical skills are required for debugging production issues, providing root causes, and implementing the mitigation plan. Ability to multi-task across multiple projects, interface with external/internal resources, and provide technical leadership to junior team members. Flexibility to work as a member of matrix-based, diverse and geographically distributed project teams. Experience working in Agile teams.
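
To make the fraud-detection pipeline concrete, here is a hypothetical PySpark sketch of one common signal: transaction velocity per card over a short window. The source table, column names, and threshold are assumptions for illustration, not details from the posting.

```python
# Hypothetical fraud signal: flag cards with an unusually high transaction
# count in a 10-minute window. Table names and thresholds are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fraud-signals").getOrCreate()

txns = spark.read.table("payments.transactions")   # assumed source table

# Bursts of transactions per card are a classic velocity signal that
# downstream rules or models can consume.
velocity = (txns
    .groupBy("card_id", F.window("txn_ts", "10 minutes"))
    .agg(F.count("*").alias("txn_count"),
         F.sum("amount").alias("txn_amount"))
    .filter(F.col("txn_count") > 20))              # assumed alert threshold

# Assumes a "fraud" database exists for the alert table.
velocity.write.mode("overwrite").saveAsTable("fraud.velocity_alerts")
```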

Posted 3 days ago

3.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Strong experience in Netezza and Oracle, with the ability to write complex SQL queries. Expertise in Netezza & Oracle with strong SQL, ensuring data integrity, data security and data quality. Good understanding of the latest trends and best practices in data engineering & ETL tools. Good knowledge of stored procedures and data modeling. Excellent interpersonal and communication skills. At least 3-5 years of experience working in the healthcare payer domain, with a focus on data governance, HIPAA compliance, and healthcare-related data processing.
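
As an example of the "complex SQL" this role calls for, here is a hypothetical sketch of a window-function query, run from Python over ODBC, that flags duplicate claim rows in a staging table. The DSN and schema are assumptions for illustration.

```python
# Hypothetical data-quality check: find duplicate claim rows via ROW_NUMBER(),
# which both Netezza and Oracle support. All identifiers are assumptions.
import pyodbc

conn = pyodbc.connect("DSN=netezza_edw")   # assumed ODBC DSN for Netezza

DUPLICATE_CHECK = """
SELECT member_id, claim_id, load_ts
FROM (
    SELECT member_id, claim_id, load_ts,
           ROW_NUMBER() OVER (
               PARTITION BY member_id, claim_id
               ORDER BY load_ts DESC) AS rn
    FROM claims_stage
) t
WHERE rn > 1
"""

for row in conn.cursor().execute(DUPLICATE_CHECK):
    print("duplicate:", row.member_id, row.claim_id, row.load_ts)
conn.close()
```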

Posted 4 days ago

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

The future is our choice. At Atos, as the global leader in secure and decarbonized digital, our purpose is to help design the future of the information space. Together we bring the diversity of our people's skills and backgrounds to make the right choices with our clients, for our company and for our own futures. Roles & Responsibilities: Lead the modernization of operational databases, transitioning from traditional systems (Netezza, Teradata) to modern architectures. Design and implement scalable NoSQL solutions, with a focus on Cassandra. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Provide technical leadership and mentorship to database administrators and developers. Requirements: 10+ years of experience in creating data platform solutions. Expertise in Teradata SQL, data modeling, ETL tools, and database performance tuning. Must have 3+ years of experience in NoSQL data modelling using DataStax Cassandra. 5+ years of experience in implementing data lakes in Hadoop and the cloud. Good knowledge of NoSQL database architectures. Design and develop large-scale (multi-terabyte) data architecture for scalable technical solutions in the areas of data management, data processing, and performance optimization while handling data volume, variety and velocity. Good knowledge of best practices and hands-on experience with data storage, loading, and consumption of data from data lakes. Hands-on experience in Spark programming: PySpark & Scala. Good knowledge of and experience in Kafka implementations. Experience with streaming data ingestion and processing using tools like Flink, Spark Streaming, Storm, Flume. Experience in orchestration tools like Airflow, Oozie. Good knowledge/experience of Cloudera Data Platform (CDP). Experience working on proposals, customer workshops, assessments, etc. is preferred. Must have good communication and presentation skills. Databases: DataStax Cassandra, Teradata, Netezza. Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs & work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture. Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environment, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
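
To illustrate the Netezza/Teradata-to-NoSQL modeling shift this role centers on, here is a hypothetical sketch using the DataStax Python driver: a query-first Cassandra table keyed by its access pattern rather than by normalized relational structure. Contact points and schema are assumptions for illustration.

```python
# Hypothetical Cassandra modeling sketch: design the table around the read
# pattern ("latest readings for a device"). All identifiers are assumptions.
from cassandra.cluster import Cluster

cluster = Cluster(["cassandra-node1"])     # assumed contact point
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS telemetry
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")

# device_id partitions the data; reading_ts orders rows within the partition,
# so "latest N readings per device" is a single-partition read.
session.execute("""
    CREATE TABLE IF NOT EXISTS telemetry.device_readings (
        device_id text,
        reading_ts timestamp,
        value double,
        PRIMARY KEY ((device_id), reading_ts)
    ) WITH CLUSTERING ORDER BY (reading_ts DESC)
""")
cluster.shutdown()
```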

Posted 6 days ago

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Introduction: Software Developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today - planes and trains take off on time, bank transactions complete in the blink of an eye and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of. Your Role And Responsibilities: This candidate is responsible for DB2 installation and configuration in the following environments: on-prem, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF. Migration of other databases to Db2 (e.g. Teradata / Snowflake / SAP / Cloudera to Db2 migration). Create high-level designs and detail-level designs, maintaining product roadmaps that include both modernization and leveraging cloud solutions. Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML. Perform health checks of the databases, make recommendations, and deliver tuning at the database and system level. Deploy DB2 databases as containers within Red Hat OpenShift clusters. Configure containerized database instances, persistent storage, and network settings to optimize performance and reliability. Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives. Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources). Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse. Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams. Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms. Participate in the development of architecture governance processes and promote best practices across the organization. Communicate complex technical concepts to both technical and non-technical stakeholders. Required Technical And Professional Expertise: 8+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms. Strong proficiency in DB2, SQL and Python. Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark). Database migration project experience from one database to another (target database Db2). Experience in deploying DB2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability. Excellent communication, collaboration, problem-solving, and leadership skills. Preferred Technical And Professional Experience: Experience with machine learning environments and LLMs. Certification in IBM watsonx.data or related IBM data and AI technologies. Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake). Exposure to implementing or understanding DB replication processes. Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures). Experience with NoSQL databases (e.g., MongoDB, Cassandra).
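
As one concrete flavor of the database health checks mentioned above, here is a hypothetical Python sketch using the ibm_db driver to flag nearly-full tablespaces via Db2's MON_GET_TABLESPACE monitoring function. Connection details and the alert threshold are assumptions for illustration.

```python
# Hypothetical Db2 health-check sketch. Connection string, threshold, and
# environment details are assumptions, not values from the posting.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=EDW;HOSTNAME=db2-prod;PORT=50000;PROTOCOL=TCPIP;"
    "UID=db2inst1;PWD=***;", "", "")

# MON_GET_TABLESPACE reports live tablespace metrics across all members (-2).
sql = """
SELECT tbsp_name,
       tbsp_used_pages * 100.0 / NULLIF(tbsp_total_pages, 0) AS pct_used
FROM TABLE(MON_GET_TABLESPACE('', -2))
"""
stmt = ibm_db.exec_immediate(conn, sql)
row = ibm_db.fetch_assoc(stmt)
while row:
    if row["PCT_USED"] and row["PCT_USED"] > 85:   # assumed alert threshold
        print(f"tablespace {row['TBSP_NAME']} is {row['PCT_USED']:.1f}% used")
    row = ibm_db.fetch_assoc(stmt)
ibm_db.close(conn)
```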

Posted 1 week ago

6.0 - 8.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Our Company: Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment. What You'll Do: This role will mainly be working as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Changes across AWS, Azure & Google Cloud Platforms. A few core responsibilities, though not limited to these, are below. The Cloud Ops Administrator is responsible for managing Teradata's as-a-Service offering on public cloud (AWS/Azure/GC). Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, optimizing the environment, and third-party software support. Supporting the onsite teams with migration from on-premise to cloud for customers. Implementing security best practices and analyzing partner compatibility. Manages and coordinates all activities necessary to implement changes in the environment. Ensures change status, progress and issues are communicated to the appropriate groups. Views and implements the process lifecycle and reports to upper management. Evaluates performance metrics against the critical success factors and assures actions to streamline the process. Perform change-related activities documented in the Change Request to ensure the change is implemented according to plan. Document closure activities in the change record and complete the change record. Escalate any deviations from plans to appropriate TLs/Managers. Provide input for the ongoing improvement of the Change Management process. Manage and support 24x7 VaaS environments for multiple customers. Devise and implement security and operations best practices. Implementing development and production environments for the data warehousing cloud environment. Backup, archive and recovery planning and execution of the cloud-based data warehouses across all the platforms' (AWS/Azure/GC) resources. Ensuring SLAs are met while implementing the change. Ensure all scheduled changes are implemented within the prescribed window. First level of escalation for team members. First level of help/support for team members. Who You'll Work With: This role will mainly be working as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents and Changes across Azure & Google Cloud Platforms. This role reports into the Delivery Manager for Change Ops. What Makes You a Qualified Candidate: Minimum 6-8 years of IT experience in a Systems Administrator / Engineer role. Minimum 4 years of cloud hands-on experience (Azure/AWS/GCP). Cloud certification; ITIL or other relevant certifications are desirable. ServiceNow / ITSM tool day-to-day operations. Must be willing to provide 24x7 on-call support on a rotational basis with the team. Must be willing to travel, both short-term and long-term. What You'll Bring: 4-year Engineering degree or 3-year Masters of Computer Application. Excellent oral and written communication skills in the English language. Teradata/DBMS experience: hands-on experience with Teradata administration and a strong understanding of cloud capabilities and limitations. Thorough understanding of cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service and Software as a Service cloud delivery models; and the current competitive landscape. Implement and support new and existing customers on VaaS infrastructure. Thorough understanding of infrastructure (firewalls, load balancers, hypervisor, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution. Should have good knowledge of cloud services for compute, storage, network and OS for at least one of the following cloud platforms: Azure. Managed responsibilities as a shift lead. Should have experience in enterprise VPN and Azure virtual LAN with a data center. Knowledge of monitoring, logging and cost management tools. Hands-on experience with database architecture/modeling, RDBMS and NoSQL. Should have a good understanding of data archive/restore policies. Teradata basics. VMware certification will be an added advantage. Working experience in Linux administration and shell scripting. Working experience on any of the RDBMS like Oracle/DB2/Netezza/Teradata/SQL Server/MySQL. Why We Think You'll Love Teradata: We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago

6.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Our Company: Teradata is the connected multi-cloud data platform company for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment. What You'll Do: This role will mainly be working as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Changes across AWS, Azure & Google Cloud Platforms. A few core responsibilities, though not limited to these, are below. The Cloud Ops Administrator is responsible for managing Teradata's as-a-Service offering on public cloud (AWS/Azure/GC). Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, optimizing the environment, and third-party software support. Supporting the onsite teams with migration from on-premise to cloud for customers. Implementing security best practices and analyzing partner compatibility. Manages and coordinates all activities necessary to implement changes in the environment. Ensures change status, progress and issues are communicated to the appropriate groups. Views and implements the process lifecycle and reports to upper management. Evaluates performance metrics against the critical success factors and assures actions to streamline the process. Perform change-related activities documented in the Change Request to ensure the change is implemented according to plan. Document closure activities in the change record and complete the change record. Escalate any deviations from plans to appropriate TLs/Managers. Provide input for the ongoing improvement of the Change Management process. Manage and support 24x7 VaaS environments for multiple customers. Devise and implement security and operations best practices. Implementing development and production environments for the data warehousing cloud environment. Backup, archive and recovery planning and execution of the cloud-based data warehouses across all the platforms' (AWS/Azure/GC) resources. Ensuring SLAs are met while implementing the change. Ensure all scheduled changes are implemented within the prescribed window. First level of escalation for team members. First level of help/support for team members. Who You'll Work With: This role will mainly be working as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents and Changes across Azure & Google Cloud Platforms. This role reports into the Delivery Manager for Change Ops. What Makes You a Qualified Candidate: Minimum 6-8 years of IT experience in a Systems Administrator / Engineer role. Minimum 4 years of cloud hands-on experience (Azure/AWS/GCP). Cloud certification; ITIL or other relevant certifications are desirable. ServiceNow / ITSM tool day-to-day operations. Must be willing to provide 24x7 on-call support on a rotational basis with the team. Must be willing to travel, both short-term and long-term. What You'll Bring: 4-year Engineering degree or 3-year Masters of Computer Application. Excellent oral and written communication skills in the English language. Teradata/DBMS experience: hands-on experience with Teradata administration and a strong understanding of cloud capabilities and limitations. Thorough understanding of cloud computing: virtualization technologies; Infrastructure as a Service, Platform as a Service and Software as a Service cloud delivery models; and the current competitive landscape. Implement and support new and existing customers on VaaS infrastructure. Thorough understanding of infrastructure (firewalls, load balancers, hypervisor, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution. Should have good knowledge of cloud services for compute, storage, network and OS for at least one of the following cloud platforms: Azure. Managed responsibilities as a shift lead. Should have experience in enterprise VPN and Azure virtual LAN with a data center. Knowledge of monitoring, logging and cost management tools. Hands-on experience with database architecture/modeling, RDBMS and NoSQL. Should have a good understanding of data archive/restore policies. Teradata basics. VMware certification will be an added advantage. Working experience in Linux administration and shell scripting. Working experience on any of the RDBMS like Oracle/DB2/Netezza/Teradata/SQL Server/MySQL. Why We Think You'll Love Teradata: We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.

Posted 1 week ago

15.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Introduction: Joining the IBM Technology Expert Labs teams means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best: running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators, always willing to help and be helped, as you apply passion to work that will positively impact the world around us. Your Role And Responsibilities: This candidate is responsible for: DB2 installation and configuration in the following environments: on-prem, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF. Migration of other databases to Db2 (e.g. Teradata / Snowflake / SAP / Cloudera to Db2 migration). Create high-level designs and detail-level designs, maintaining product roadmaps that include both modernization and leveraging cloud solutions. Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML. Perform health checks of the databases, make recommendations, and deliver tuning at the database and system level. Deploy DB2 databases as containers within Red Hat OpenShift clusters. Configure containerized database instances, persistent storage, and network settings to optimize performance and reliability. Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives. Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources). Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse. Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams. Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms. Participate in the development of architecture governance processes and promote best practices across the organization. Communicate complex technical concepts to both technical and non-technical stakeholders. Required Technical And Professional Expertise: 15+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms. Strong proficiency in DB2, SQL and Python. Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark). Database migration project experience from one database to another (target database Db2). Experience in deploying DB2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability. Excellent communication, collaboration, problem-solving, and leadership skills. Preferred Technical And Professional Experience: Experience with machine learning environments and LLMs. Certification in IBM watsonx.data or related IBM data and AI technologies. Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake). Exposure to implementing or understanding DB replication processes. Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures). Experience with NoSQL databases (e.g., MongoDB, Cassandra). Experience in data modeling tools (e.g., ER/Studio, ERwin). Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA).

Posted 1 week ago


6.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Responsibilities: Drive technical conversations with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft's cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in the analytics portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums. Qualifications (Preferred): 6+ years technical pre-sales, technical consulting, or technology delivery, or related experience, OR equivalent experience. 4+ years experience with cloud and hybrid or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management. Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2. OR 5+ years technical pre-sales or technical consulting experience; OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience; OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience. Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration & modernization to creating new AI apps.
Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and other cloud products (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated data security & governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes. Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 week ago

6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Responsibilities: Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft's cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in the analytics portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums. Qualifications: 6+ years technical pre-sales, technical consulting, or technology delivery, or related experience, OR equivalent experience. 4+ years experience with cloud and hybrid or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management. Proficient in data warehouse & big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2. OR 5+ years technical pre-sales or technical consulting experience; OR Bachelor's Degree in Computer Science, Information Technology, or related field AND 4+ years technical pre-sales or technical consulting experience; OR Master's Degree in Computer Science, Information Technology, or related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience. Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration & modernization to creating new AI apps.
Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated data security & governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes. Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 week ago
