Home
Jobs

79 Netezza Jobs - Page 3

Filter Jobs
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

1.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Roles & Responsibilities

Job Description: Build pipelines to bring in a wide variety of data from multiple sources within the organization, as well as from social media and public data sources. Collaborate with cross-functional teams to source data and make it available for downstream consumption. Work with the team to provide an effective solution design to meet business needs. Ensure regular communication with key stakeholders; understand any key concerns about how the initiative is being delivered, or any risks/issues that have either not yet been identified or are not being progressed. Ensure dependencies and challenges (risks) are escalated and managed. Escalate critical issues to the Sponsor and/or Head of Data Engineering. Ensure timelines (milestones, decisions and delivery) are managed and the value of the initiative is achieved, without compromising quality and within budget. Ensure an appropriate and coordinated communications plan is in place for initiative execution and delivery, both internal and external. Ensure final handover of the initiative to business-as-usual processes, carry out a post-implementation review (as necessary) to confirm initiative objectives have been delivered, and feed any lessons learned into future initiative management processes.

Who We Are Looking For

Competencies & Personal Traits: Works as a team player. Excellent problem-analysis skills. Experience with at least one cloud infrastructure provider (Azure/AWS). Experience building data pipelines using batch processing with Apache Spark (Spark SQL, DataFrame API) or Hive Query Language (HQL). Knowledge of big data ETL processing tools. Experience with Hive and Hadoop file formats (Avro/Parquet/ORC). Basic knowledge of scripting (shell/bash). Experience working with multiple data sources, including relational databases (SQL Server/Oracle/DB2/Netezza). Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo and Azure DevOps. Basic understanding of DevOps practices using Git version control. Ability to debug, fine-tune and optimize large-scale data processing jobs.

Working Experience: 1-3 years of broad experience working with enterprise IT applications in cloud platform and big data environments.

Professional Qualifications: Certifications related to data and analytics would be an added advantage.

Education: Master's/Bachelor's degree in STEM (Science, Technology, Engineering, Mathematics).

Language: Fluency in written and spoken English.

Experience: 3-4.5 years

Skills: Primary Skill: Data Engineering. Sub Skill(s): Data Engineering. Additional Skill(s): Kafka, Big Data, Apache Hive, SQL Server DBA, CI/CD, Apache Spark.

About The Company: Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
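For context, here is a minimal PySpark sketch of the kind of batch pipeline this listing describes: a JDBC read from a relational source, a light DataFrame transformation, and a Parquet landing. The host, credentials, table and output path are placeholders rather than Infogain systems, and the appropriate JDBC driver is assumed to be on the Spark classpath.

```python
# Minimal PySpark batch-ingestion sketch: read a relational table over JDBC
# and land it as partitioned Parquet. All connection details are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("batch-ingest-example")
    .getOrCreate()
)

# Hypothetical JDBC source; swap in the driver/URL for SQL Server, Oracle, DB2 or Netezza.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "****")
    .load()
)

# Light transformation with the DataFrame API before landing the data.
curated = (
    orders
    .filter(F.col("order_date") >= "2024-01-01")
    .withColumn("ingest_ts", F.current_timestamp())
)

# Write as Parquet; Avro or ORC targets only require changing the writer format.
curated.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")

spark.stop()
```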

Posted 4 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Naukri logo

Design and implement ETL solutions using IBM InfoSphere DataStage to integrate and process large datasets. You will develop, test, and optimize data pipelines to ensure smooth data transformation and loading. Expertise in IBM InfoSphere DataStage, ETL processes, and data integration is essential for this position.

Posted 4 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Chennai

Work from Office

Naukri logo

The IBM InfoSphere DataStage role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage domain.

Posted 4 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

The IBM InfoSphere DataStage role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage domain.

Posted 4 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office

Naukri logo

The IBM InfoSphere DataStage, Teradata role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage, Teradata domain.

Posted 4 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

The IBM InfoSphere DataStage E3 role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage E3 domain.

Posted 4 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Linkedin logo

What You’ll Do

We are seeking a dynamic Account Executive, Artificial Intelligence (AI) to join our strong and strategic sales team. As an AE (AI), you will drive the adoption of our AI solutions in our “Rest of Cloud” (RoC) market, which includes Cloud Service Providers (CSPs) and emerging AI providers such as AI-native cloud builders, AI SaaS providers, and AI system integrators. You will understand their specific needs and drive AI infrastructure and networking solutions that align to their business operations. This role requires a deep understanding of AI infrastructure and large-scale networking, with a strong ability to translate technical concepts to a diverse audience.

Who You’ll Work With

The Cloud + AI Infrastructure team delivers one scalable strategy with local execution for customer transformation and growth. We are the worldwide go-to-market compute and data center networking engine, assembling market transitions and engaging with sellers to fuel growth for customers and Cisco. Alongside our colleagues, Cloud & AI Infrastructure builds the sales strategy, activates sellers and technical communities, and accelerates selling every single day.

Who You Are

You will develop and execute a strategy to deliver incremental revenue for AI and large-scale networking products and services, including network routing and switching, optics and data center interconnects, automation and performance optimization, across emerging AI provider accounts, and develop relationships with key decision-makers and partners. Engaging with your clients to understand their business challenges and conducting detailed analysis to find new opportunities for AI and networking solutions are the dynamic skills you will bring to this role. You understand AI technology and the market and can translate technical concepts into business value for clients.

Minimum Qualifications

8+ years of technology-related business development experience. Experience unlocking revenue for new, innovative technology-based solutions. Experience working with Cloud Service Providers, NeoCloud customers, and/or AI system integrators. Experience in understanding the business issues of cloud builders and providers, networking infrastructure, accelerated computing, data center technology, and deep learning and machine learning. Proven ability to work cross-functionally with Engineering and Marketing to develop and launch new AI or networking infrastructure offers.

Preferred Qualifications

Bachelor’s degree or equivalent experience in Business, Computer Science, Engineering, or a related field; an advanced degree is a plus. Excellent verbal and written communication skills. Experience bridging large-scale networking concepts with AI infrastructure (data center/compute). Experience with deep learning, data science, and NVIDIA GPUs. Experience in two or more data estate workloads such as: Microsoft’s Data & AI Platform (Azure Synapse Analytics, Azure Databricks, Cosmos DB, Azure SQL or HDInsight, etc.), AWS (Redshift, Aurora, Glue), Google (BigQuery), MongoDB, Cassandra, Snowflake, Teradata, Oracle Exadata, IBM Netezza, SAP (HANA, BW), Apache Hadoop & Spark, MapR or Cloudera/Hortonworks, etc.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Why Cisco

#WeAreCisco. We are all unique, but collectively we bring our talents to work as a team, to develop innovative technology and power a more inclusive, digital future for everyone. How do we do it? Well, for starters, with people like you! Nearly every internet connection around the world touches Cisco. We’re the Internet’s optimists. Our technology makes sure the data traveling at light speed across connections does so securely, yet it’s not what we make but what we make happen which marks us out. We’re helping those who work in the health service to connect with patients and each other; schools, colleges, and universities to teach in even the most challenging of times. We’re helping businesses of all shapes and sizes to connect with their employees and customers in new ways, providing people with access to the digital skills they need and connecting the most remote parts of the world, whether through 5G or otherwise. We tackle whatever challenges come our way. We have each other’s backs, we recognize our accomplishments, and we grow together. We celebrate and support one another, from big and small things in life to big career moments. And giving back is in our DNA (we get 10 days off each year to do just that). We know that powering an inclusive future starts with us. Because without diversity and a dedication to equality, there is no moving forward. Our 30 Inclusive Communities, which bring people together around commonalities or passions, are leading the way. Together we’re committed to learning, listening, caring for our communities, and supporting the most vulnerable with a collective effort to make this world a better place, whether with technology or through our actions. So, you have colorful hair? Don’t care. Tattoos? Show off your ink. Like polka dots? That’s cool. Pop culture geek? Many of us are. Passion for technology and world changing? Be you, with us! #WeAreCisco

Posted 1 month ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Linkedin logo

What You’ll Do

We are seeking a dynamic Account Executive, Artificial Intelligence (AI) to join our strong and strategic sales team. As an AE (AI), you will drive the adoption of our AI solutions in our “Rest of Cloud” (RoC) market, which includes Cloud Service Providers (CSPs) and emerging AI providers such as AI-native cloud builders, AI SaaS providers, and AI system integrators. You will understand their specific needs and drive AI infrastructure and networking solutions that align to their business operations. This role requires a deep understanding of AI infrastructure and large-scale networking, with a strong ability to translate technical concepts to a diverse audience.

Who You’ll Work With

The Cloud + AI Infrastructure team delivers one scalable strategy with local execution for customer transformation and growth. We are the worldwide go-to-market compute and data center networking engine, assembling market transitions and engaging with sellers to fuel growth for customers and Cisco. Alongside our colleagues, Cloud & AI Infrastructure builds the sales strategy, activates sellers and technical communities, and accelerates selling every single day.

Who You Are

You will develop and execute a strategy to deliver incremental revenue for AI and large-scale networking products and services, including network routing and switching, optics and data center interconnects, automation and performance optimization, across emerging AI provider accounts, and develop relationships with key decision-makers and partners. Engaging with your clients to understand their business challenges and conducting detailed analysis to find new opportunities for AI and networking solutions are the dynamic skills you will bring to this role. You understand AI technology and the market and can translate technical concepts into business value for clients.

Minimum Qualifications

8+ years of technology-related business development experience. Experience unlocking revenue for new, innovative technology-based solutions. Experience working with Cloud Service Providers, NeoCloud customers, and/or AI system integrators. Experience in understanding the business issues of cloud builders and providers, networking infrastructure, accelerated computing, data center technology, and deep learning and machine learning. Proven ability to work cross-functionally with Engineering and Marketing to develop and launch new AI or networking infrastructure offers.

Preferred Qualifications

Bachelor’s degree or equivalent experience in Business, Computer Science, Engineering, or a related field; an advanced degree is a plus. Excellent verbal and written communication skills. Experience bridging large-scale networking concepts with AI infrastructure (data center/compute). Experience with deep learning, data science, and NVIDIA GPUs. Experience in two or more data estate workloads such as: Microsoft’s Data & AI Platform (Azure Synapse Analytics, Azure Databricks, Cosmos DB, Azure SQL or HDInsight, etc.), AWS (Redshift, Aurora, Glue), Google (BigQuery), MongoDB, Cassandra, Snowflake, Teradata, Oracle Exadata, IBM Netezza, SAP (HANA, BW), Apache Hadoop & Spark, MapR or Cloudera/Hortonworks, etc.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Why Cisco

#WeAreCisco. We are all unique, but collectively we bring our talents to work as a team, to develop innovative technology and power a more inclusive, digital future for everyone. How do we do it? Well, for starters, with people like you! Nearly every internet connection around the world touches Cisco. We’re the Internet’s optimists. Our technology makes sure the data traveling at light speed across connections does so securely, yet it’s not what we make but what we make happen which marks us out. We’re helping those who work in the health service to connect with patients and each other; schools, colleges, and universities to teach in even the most challenging of times. We’re helping businesses of all shapes and sizes to connect with their employees and customers in new ways, providing people with access to the digital skills they need and connecting the most remote parts of the world, whether through 5G or otherwise. We tackle whatever challenges come our way. We have each other’s backs, we recognize our accomplishments, and we grow together. We celebrate and support one another, from big and small things in life to big career moments. And giving back is in our DNA (we get 10 days off each year to do just that). We know that powering an inclusive future starts with us. Because without diversity and a dedication to equality, there is no moving forward. Our 30 Inclusive Communities, which bring people together around commonalities or passions, are leading the way. Together we’re committed to learning, listening, caring for our communities, and supporting the most vulnerable with a collective effort to make this world a better place, whether with technology or through our actions. So, you have colorful hair? Don’t care. Tattoos? Show off your ink. Like polka dots? That’s cool. Pop culture geek? Many of us are. Passion for technology and world changing? Be you, with us! #WeAreCisco

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Software Development Engineer II (Data Engineering)

Overview

Mastercard is the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Enterprise Data Solution (EDS) is focused on enabling insights into the Mastercard network and helping build data-driven products by curating and preparing data in a secure and reliable manner. Moving to a “Unified and Fault-Tolerant Architecture for Data Ingestion and Processing” is critical to achieving this mission. As a Software Development Engineer (Data Engineering) in Enterprise Data Solution (EDS), you will have the opportunity to build high-performance data pipelines that load into the Mastercard Data Warehouse. Our Data Warehouse provides analytical capabilities to a number of business users, who help customers answer their business problems through data. You will play a vital role within a rapidly growing organization, while working closely with experienced and driven engineers to solve challenging problems.

Role

Participate in medium-to-large data engineering projects. Discover, ingest, and incorporate new sources of real-time, streaming, batch, and API-based data into our platform to enhance the insights we get from running tests and expand the ways and properties on which we can test. Assist the business in utilizing data-driven insights to drive growth and transformation. Build and maintain data processing workflows feeding Mastercard analytics domains. Facilitate reliable integrations with internal systems and third-party APIs as needed. Support data analysts as needed, advising on data definitions and helping them derive meaning from complex datasets. Work with cross-functional agile teams to drive projects through the full development cycle. Help the team improve through the use of data engineering best practices. Collaborate with other data engineering teams to improve the data engineering ecosystem and talent within Mastercard.

All About You

At least a Bachelor’s degree in Computer Science, Computer Engineering or a technology-related field, or equivalent work experience. Experience in data warehouse related projects in a product- or service-based organization. Expertise in data engineering and implementing multiple end-to-end DW projects in a big data environment. Experience working with databases such as Oracle and Netezza, with strong SQL knowledge. Additional experience building data pipelines with Spark (Scala/Python/Java) on Hadoop is preferred. Experience working with NiFi will be an added advantage. Experience working in Agile teams. Strong analytical skills required for debugging production issues, providing root cause and implementing mitigation plans. Strong communication skills, both verbal and written, and strong relationship, collaboration and organizational skills. Ability to be high-energy, detail-oriented, proactive and able to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results. Ability to quickly learn and implement new technologies, and to perform PoCs to explore the best solution for a problem statement. Flexibility to work as a member of matrix-based, diverse and geographically distributed project teams.

Corporate Security Responsibility

All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-246732
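As a rough illustration of the real-time/streaming ingestion mentioned above (not Mastercard's actual pipeline), the sketch below reads a Kafka topic with Spark Structured Streaming and appends Parquet files to a staging path. The topic, brokers, schema and paths are invented, and the spark-sql-kafka connector package is assumed to be available on the cluster.

```python
# Sketch of a Spark Structured Streaming job that ingests a Kafka topic and
# appends Parquet files to a warehouse staging area. Names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("streaming-ingest-example").getOrCreate()

event_schema = StructType([
    StructField("txn_id", StringType()),
    StructField("merchant", StringType()),
    StructField("amount", DoubleType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "payment-events")
    .load()
)

# Kafka delivers the payload as bytes; parse the JSON value against the schema.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .withColumn("ingest_ts", F.current_timestamp())
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/warehouse/staging/payment_events")
    .option("checkpointLocation", "/checkpoints/payment_events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```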

Posted 1 month ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

Linkedin logo

Role Description Job Title: Data Analyst Experience: 8+ Years Location: Thiruvananthapuram Job Summary We are seeking an experienced Data Analyst with a strong background in banking, regulatory reporting, and financial services. The ideal candidate should have expertise in data analysis, governance, SQL, system integration, and business intelligence. This role involves working closely with stakeholders to analyze complex business requirements, ensure data quality, and provide solutions that align with organizational goals. Core Responsibilities Business & System Analysis: Analyze and translate complex business requirements into functional system design documents. Perform technology and system analysis, leveraging knowledge of applications, interfaces, and data structures. Facilitate and participate in design whiteboarding sessions to ensure business needs are met within SVB standards. Provide leadership and support for system production issues, change requests, and maintenance while maintaining documentation. Data Analytics & Governance Perform data mapping, data quality checks, and data profiling to ensure accuracy and consistency. Apply advanced data analysis techniques to understand detailed data flows between and within systems. Ensure data governance best practices and compliance with regulatory standards. Banking & Regulatory Reporting Expertise Work with Sanctions, Fraud, KYC, AML, and Payments domain data. Develop insights and reports for regulatory and compliance reporting within financial services. Ensure data accuracy in risk and compliance frameworks. SQL & Data Management Write complex SQL queries for data extraction, transformation, and analysis. Work with relational databases and data warehouses to manage large datasets. Support ETL processes and API/Microservices-based system integrations. Systems & Implementation Support Configure systems and develop expertise in system functionality and design. Provide expert guidance in data-related projects, including systems implementation and integration. Ensure alignment of future technology and system trends with business needs. Collaboration & Communication Prepare and present subject matter expertise through documentation and presentations. Collaborate with cross-functional teams, including IT, finance, and regulatory teams, to define and refine data requirements. Work with remote teams to resolve production incidents quickly and efficiently. Mandatory Skills & Qualifications 8+ years of experience in Data Analytics, Business Intelligence, or related fields. 5+ years of experience in banking and regulatory reporting domains. Bachelor’s degree in Computer Science, Information Science, or related discipline (or equivalent work experience). Strong data background with expertise in data mapping, data quality, data governance, and data warehousing. Expertise in SQL for data querying, transformation, and reporting. Experience in Sanctions, Fraud, KYC, AML, and Payments domains. Strong experience with systems integration using API/Microservices/web services, ETL. Hands-on experience with SAP BODS (BusinessObjects Data Services). Experience working on systems implementation projects in the banking or financial sector. Excellent written and verbal communication skills for stakeholder interactions. Good To Have Skills & Qualifications Knowledge of Big Data technologies (Snowflake, etc.). Familiarity with BI tools (Tableau, Power BI, Looker, etc.). Exposure to AI/ML concepts and tools for data-driven insights. 
Skills: SAP BODS, Netezza, Big Data
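Purely as an illustration of the "complex SQL for regulatory reporting" aspect of this role, here is a hedged sketch of an AML-style aggregation run from Python. The schema, thresholds and the SQLite stand-in connection are all invented for the example; a real engagement would run against the bank's reporting warehouse.

```python
# Illustrative data-quality / regulatory-reporting style query run from Python.
# The schema (transactions, customers) and thresholds are invented for the example.
import sqlite3  # stand-in engine; in practice this would be a warehouse connection

AML_SUMMARY_SQL = """
SELECT c.customer_id,
       c.risk_rating,
       COUNT(*)                  AS txn_count,
       SUM(t.amount)             AS total_amount,
       SUM(CASE WHEN t.amount >= 10000 THEN 1 ELSE 0 END) AS large_txn_count
FROM transactions t
JOIN customers c ON c.customer_id = t.customer_id
WHERE t.txn_date >= :start_date
GROUP BY c.customer_id, c.risk_rating
HAVING SUM(CASE WHEN t.amount >= 10000 THEN 1 ELSE 0 END) > 0
ORDER BY total_amount DESC;
"""

def build_aml_summary(conn, start_date: str):
    """Return customers with at least one large transaction since start_date."""
    cur = conn.execute(AML_SUMMARY_SQL, {"start_date": start_date})
    return cur.fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # replace with the real reporting database
    # ... create/populate the tables here before querying ...
    # rows = build_aml_summary(conn, "2024-01-01")
```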

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Foreword: At iGreenTree, we're passionate about empowering energy and utility providers with innovative IT solutions. With deep domain knowledge and a dedication to innovation, we help our clients stay ahead of the curve in a rapidly changing industry. Whether you need IT consulting, application development, system integration, or digital transformation services, our team of experts has the expertise to deliver the right solution for your business. Partner with iGreenTree to unleash the power of technology and achieve sustainable growth in today's dynamic landscape.

Who We Are Looking For: An ideal candidate who can demonstrate in-depth knowledge and understanding of RDBMS concepts and is experienced in writing complex queries and data integration processes in SQL/T-SQL and NoSQL. This individual will help with the design, development and implementation of new and existing applications.

Roles and Responsibilities: Review existing database designs and data management procedures and provide recommendations for improvement. Provide subject matter expertise in the design of database schemas and perform data modeling (logical and physical models) for product feature enhancements as well as for extending analytical capabilities. Develop technical documentation as needed. Architect, develop, validate and communicate Business Intelligence (BI) solutions such as dashboards, reports, KPIs, instrumentation, and alerting tools. Define data architecture requirements for cross-product integration within and across cloud-based platforms. Analyze, architect, develop, validate and support integrating data into SaaS platforms (ERP, CRM, etc.) from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP) and RDBMS. Perform thorough analysis of complex data and recommend actionable strategies. Effectively translate data modeling and BI requirements into the design process. Big data platform design, i.e. tool selection, data integration, and data preparation for predictive modeling.

Required Skills: Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models). 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools such as Talend, Informatica, DataStage, etc. 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database, with a focus on building data integration processes. Exposure to a NoSQL technology, preferably MongoDB. Experience processing large data volumes, indicated by experience with big data platforms (Teradata, Netezza, Vertica, Cloudera, Hortonworks, SAP HANA, Cassandra, etc.). Understanding of data warehousing concepts and decision support systems. Ability to deal with sensitive and confidential material and adhere to worldwide data security standards. Experience writing documentation for design and feature requirements. Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc. Excellent communication and collaboration skills.
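As a small, hedged sketch of the file/API-to-RDBMS integration work described above, the snippet below pulls records from a hypothetical REST endpoint and a CSV file and bulk-inserts them into a staging table via SQLAlchemy. The URL, connection string, table and column names are placeholders, not iGreenTree systems.

```python
# Sketch: pull records from a REST endpoint and a CSV file, then load them
# into a relational staging table. All endpoints and credentials are illustrative.
import csv
import requests
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://etl_user:****@example-host/staging")

def load_api_rows(url: str) -> list[dict]:
    """Fetch JSON records from a (hypothetical) REST API."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

def load_csv_rows(path: str) -> list[dict]:
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))

def stage_rows(rows: list[dict]) -> None:
    insert_sql = text(
        "INSERT INTO stg_customers (customer_id, name, segment) "
        "VALUES (:customer_id, :name, :segment)"
    )
    with engine.begin() as conn:          # one transaction for the whole batch
        conn.execute(insert_sql, rows)    # executemany-style bulk insert

if __name__ == "__main__":
    stage_rows(load_api_rows("https://api.example.com/customers"))
    stage_rows(load_csv_rows("customers.csv"))
```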

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Naukri logo

Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Grade Specific: The role supports the team in building and maintaining data infrastructure and systems within an organization.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux (Red Hat), Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Posted 1 month ago

Apply

9.0 - 14.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

This role involves the development and application of engineering practice and knowledge in designing, managing and improving the processes for industrial operations, including procurement, supply chain, and facilities engineering and maintenance of the facilities. Project and change management of industrial transformations are also included in this role.

Grade Specific: Focus on Industrial Operations Engineering. Fully competent in own area. Acts as a key contributor in a more complex/critical environment. Proactively acts to understand and anticipate client needs. Manages costs and profitability for a work area. Manages own agenda to meet agreed targets. Develops plans for projects in own area. Looks beyond the immediate problem to the wider implications. Acts as a facilitator and coach, and moves teams forward.

Skills (competencies)

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Desired Competencies (Technical/Behavioral Competency)

Must-Have: IBM DataStage / Informatica, Snowflake, SQL, Unix
Good-to-Have: DBT

Responsibility of / Expectations from the Role:
1. Analyze and build ETL data assets using IBM DataStage / Informatica and on-prem databases such as Netezza. Propose, design and implement solutions to migrate such data assets to cloud data warehouses such as AWS S3 and Snowflake, and build ELT transformations using DBT and Python.
2. Create and manage data pipelines supporting CI/CD.
3. Work with project and business analyst leads to develop and clarify in-depth technical requirements, including logical and physical data modeling activities.
4. Develop, test, enhance, debug and support data assets/applications for business units or supporting functions using the IBM InfoSphere DataStage ETL tool suite, covering both ETL and ELT approaches. These application program solutions may involve diverse development platforms, software, hardware, technologies and tools.
5. Participate in the design, development and implementation of complex applications, often using IBM InfoSphere Information Server (IIS) products such as DataStage and QualityStage on a Linux grid environment, along with Snowflake and Control-M/scheduling tools.
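To illustrate the extract-load-transform flow this posting outlines (a flat-file extract staged into Snowflake, with dbt handling downstream transformations), here is a minimal sketch using the Snowflake Python connector. The account, stage, file and table names are placeholders, and the dbt step is only referenced in a comment.

```python
# Illustrative "extract on-prem, load to Snowflake, transform with ELT" sketch
# using the Snowflake Python connector. All identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="****",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # 1. Upload a flat-file extract (e.g. exported from Netezza) to the table stage.
    cur.execute("PUT file:///exports/orders.csv @%ORDERS_RAW AUTO_COMPRESS=TRUE")
    # 2. Bulk-load the staged file into the raw table.
    cur.execute(
        "COPY INTO ORDERS_RAW FROM @%ORDERS_RAW "
        "FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"' SKIP_HEADER = 1)"
    )
    # 3. Downstream ELT transformations would then run as dbt models over ORDERS_RAW,
    #    e.g. `dbt run --select staging.orders`.
finally:
    conn.close()
```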

Posted 1 month ago

Apply

6.0 - 12.0 years

0 Lacs

Delhi, India

On-site

Linkedin logo

TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena and there's nothing that can stop us from growing together.

What we are looking for
Role: SQL DBA
Experience Range: 6 - 12 years
Location: New Delhi / Gurugram
Interview Mode: Saturday virtual drive

Must Have:
1. MSSQL Server
2. Azure SQL Server
3. Certifications in SQL Server / Azure SQL

Good to Have (provide two to three lines of detail for at least two):
1. DB2
2. Netezza
3. PowerShell
4. Azure PostgreSQL

Essential: Administer and maintain database systems, with a focus on MS SQL Server along with Azure, PostgreSQL, and DB2. Support SQL Server in Azure environments as IaaS, SQL Managed Instance and PaaS services. Manage Azure SQL databases, SQL Managed Instances and Azure VMs in the Azure Portal. Monitor database performance and proactively address issues to ensure optimal functionality. Collaborate with project teams to understand database requirements and provide efficient solutions. Participate in the design, implementation, and maintenance of database structures for different applications. Work independently to troubleshoot and resolve database-related issues promptly. Implement best practices to enhance database performance and security. Manage databases on Azure Cloud, ensuring seamless integration and optimization for cloud-based solutions. Utilize SQL Server tools and other relevant technologies for effective database administration. Stay updated on the latest advancements in database tools and incorporate them into daily practices. Collaborate with cross-functional teams, including developers and system administrators, to achieve project goals, and provide guidance and support to team members on database-related issues.

Minimum Qualification:
15 years of full-time education
Minimum of 50% marks in 10th, 12th, UG & PG (if applicable)
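As an illustrative aside on the proactive monitoring this role calls for, the following Python/pyodbc sketch pulls the slowest cached queries from a SQL Server or Azure SQL instance using standard DMVs. The server, database and credentials are placeholders; in practice the same check could equally be scripted in PowerShell or T-SQL agent jobs.

```python
# Hedged monitoring example: list the slowest cached queries on a SQL Server /
# Azure SQL instance. Connection details are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=appdb;UID=dba_user;PWD=****;Encrypt=yes;"
)

TOP_QUERIES_SQL = """
SELECT TOP (10)
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       qs.execution_count,
       SUBSTRING(st.text, 1, 200) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_us DESC;
"""

with pyodbc.connect(CONN_STR, timeout=30) as conn:
    for avg_us, execs, query_text in conn.cursor().execute(TOP_QUERIES_SQL):
        print(f"{avg_us:>12} us avg | {execs:>6} execs | {query_text}")
```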

Posted 1 month ago

Apply

6.0 - 12.0 years

0 Lacs

Delhi, India

On-site

Linkedin logo

TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena and there's nothing that can stop us from growing together.

What we are looking for
Role: SQL DBA
Experience Range: 6 - 12 years
Location: New Delhi / Gurugram
Interview Mode: Saturday virtual drive

Must Have (provide two to three lines of detail for each):
1. MSSQL Server
2. Azure SQL Server
3. Certifications in SQL Server / Azure SQL

Good to Have (provide two to three lines of detail for at least two):
1. DB2
2. Netezza
3. PowerShell
4. Azure PostgreSQL

Essential: Administer and maintain database systems, with a focus on MS SQL Server along with Azure, PostgreSQL, and DB2. Support SQL Server in Azure environments as IaaS, SQL Managed Instance and PaaS services. Manage Azure SQL databases, SQL Managed Instances and Azure VMs in the Azure Portal. Monitor database performance and proactively address issues to ensure optimal functionality. Collaborate with project teams to understand database requirements and provide efficient solutions. Participate in the design, implementation, and maintenance of database structures for different applications. Work independently to troubleshoot and resolve database-related issues promptly. Implement best practices to enhance database performance and security. Manage databases on Azure Cloud, ensuring seamless integration and optimization for cloud-based solutions. Utilize SQL Server tools and other relevant technologies for effective database administration. Stay updated on the latest advancements in database tools and incorporate them into daily practices. Collaborate with cross-functional teams, including developers and system administrators, to achieve project goals, and provide guidance and support to team members on database-related issues.

Minimum Qualification:
15 years of full-time education
Minimum of 50% marks in 10th, 12th, UG & PG (if applicable)

Posted 1 month ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning. Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products including IaaS and PaaS services on the Azure Platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way. As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. You’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Responsibilities Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft’s cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions. 
Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Qualifications

5+ years of technical pre-sales or technical consulting experience; OR a Bachelor's degree in Computer Science, Information Technology, or a related field AND 4+ years of technical pre-sales or technical consulting experience; OR a Master's degree in Computer Science, Information Technology, or a related field AND 3+ years of technical pre-sales or technical consulting experience; OR equivalent experience. Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps. Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated data security and governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning. Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products including IaaS and PaaS services on the Azure Platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way. As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft’s cloud database and analytics stack across every stage of deployment. You’ll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. You’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform, all while enjoying flexible work opportunities. As a trusted technical advisor, you’ll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you’ll help customers modernize their data platform and realize the full value of Microsoft’s platform. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Responsibilities Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft’s cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in Analytics Portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance and Azure Databases: SQL DB, Cosmos DB, PostgreSQL. Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop & BI solutions. 
Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Qualifications

5+ years of technical pre-sales or technical consulting experience; OR a Bachelor's degree in Computer Science, Information Technology, or a related field AND 4+ years of technical pre-sales or technical consulting experience; OR a Master's degree in Computer Science, Information Technology, or a related field AND 3+ years of technical pre-sales or technical consulting experience; OR equivalent experience. Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps. Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated data security and governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Build the future of the AI Data Cloud. Join the Snowflake team. The Technical Instructor for the Snowflake Customer Education and Training Team will be responsible for creating and delivering compelling education contents and training sets that make complex concepts come alive in instructor-led classroom venues. The senior instructor will be seen as a subject matter expert and leader in transferring knowledge of Snowflake to customers, partners and internals and in accelerating their technical on-boarding journey. This role will also be responsible for the cross-training efforts, program management and help strategically ramp multiple resources within our external stakeholders. This role is a unique opportunity to contribute in a meaningful way to high value and high impact delivery at a very exciting time for the company. Snowflake is an innovative, high-growth, customer-focused company in a large and growing market. If you are an energetic, self-managed professional with experience teaching data courses to customers and possess excellent presentation and communication skills, we’d love to hear from you. AS A TECHNICAL INSTRUCTOR AT SNOWFLAKE, YOU WILL: Teach a breadth of technical courses to onboard customers and partners to Snowflake, the data warehouse built for the Cloud Cross-train a breadth of technical courses to qualified individuals and resources The scope of course concepts may include foundational and advanced courses in the discipline which includes Snowflake data warehousing concepts, novel SQL capabilities, data consumption and connectivity interfaces, data integration and ingestion capabilities, database security features, database performance topics, Cloud ecosystem topics and more Apply database and data warehousing industry/domain/technology expertise and experience during training sessions to help customers and partners ease their organizations into the Snowflake data warehouse from prior database environments Deliver contents and cross train on delivery best practices using a variety of presentation formats including engaging lectures, live demonstration, and technical labs Work with customers and partners that are investing in the train the trainer program to certify their selected trainers ensuring they are well prepared and qualified to deliver the course at their organization Strong eye for design, making complex training concepts come alive in a blended educational delivery model Work with the education content developers to help prioritize, create, integrate, and publish training materials and hands-on exercises to Snowflake end users; drive continuous improvement of training performance Work with additional Snowflake subject-matter-experts in creating new education materials and updates to keep pace with Snowflake product updates OUR IDEAL TECHNICAL INSTRUCTOR WILL HAVE: Strong data warehouse and data-serving platform background Recent experience with using SQL including potentially in complex workloads 5-10 years of experience in technical content training development and delivery Strong desire and ability to teach and train Prior experience with other databases (e.g. 
Oracle, IBM Netezza, Teradata, …). Excellent written and verbal communication skills. Innovative and assertive, with the ability to pick up new technologies. Presence: enthusiastic and high-energy, but also poised, confident and extremely professional. Track record of delivering results in a dynamic start-up environment. Experience working cross-functionally, ideally with solution architects, technical writers, and support. Strong sense of ownership and high attention to detail. Candidates with degrees in fields such as Computer Science or Management Information Systems. Comfortable with travel up to 75% of the time.

BONUS POINTS FOR EXPERIENCE WITH THE FOLLOWING: Experience with creating and delivering training programs to mass audiences. Experience with other databases (e.g. Teradata, Netezza, Oracle, Redshift, …). Experience with non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase, …). Familiarity and experience with common BI and data exploration tools (e.g. MicroStrategy, BusinessObjects, Tableau, …). Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, …). Experience with ETL pipeline tools. Experience using AWS and Microsoft services. Participation in Train the Trainer programs. Proven success at enterprise software startups.

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com

Posted 1 month ago

Apply

6.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

Diverse Lynx is looking for a DataStage Developer to join our dynamic team and embark on a rewarding career journey. Responsibilities include: analyzing business requirements and translating them into technical specifications; designing and implementing data integration solutions using DataStage; extracting, transforming, and loading data from various sources into target systems; developing and testing complex data integration workflows, including the use of parallel processing and data quality checks; collaborating with database administrators, data architects, and stakeholders to ensure the accuracy and consistency of data; monitoring performance and optimizing DataStage jobs so they run efficiently and meet SLAs; and troubleshooting and resolving problems related to data integration. The role requires knowledge of data warehousing, data integration, and data processing concepts; strong problem-solving skills and the ability to think creatively and critically; and excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders.

Posted 1 month ago

Apply

8 - 10 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Senior Data Engineer - Google Cloud

7+ years of direct experience working with enterprise data warehouse technologies. 7+ years in a customer-facing role working with enterprise clients. Experience with architecting, implementing and/or maintaining technical solutions in virtualized environments. Experience in the design, architecture and implementation of data warehouses, data pipelines and flows. Experience developing software code in one or more languages such as Java, Python and SQL. Experience designing and deploying large-scale distributed data processing systems with technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy. Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations. Demonstrated excellent communication, presentation, and problem-solving skills. Experience in project governance and enterprise.

Mandatory Certifications Required: Google Cloud Professional Cloud Architect; Google Cloud Professional Data Engineer
Mandatory skill sets: GCP Architecture/Data Engineering, SQL, Python
Preferred skill sets: GCP Architecture/Data Engineering, SQL, Python
Years of experience required: 8-10 years
Qualifications: B.E / B.Tech / MBA / MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study Required:
Degrees/Field of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: Python (Programming Language)
Optional Skills:
Desired Languages (if blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
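For context on the GCP data engineering skill set requested here, a minimal sketch using the BigQuery Python client to run an aggregation and write the result to a reporting table. The project, dataset and table IDs are placeholders, not client systems.

```python
# Minimal BigQuery sketch: run an aggregation query and materialise the result
# into a reporting table. All identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

destination = bigquery.TableReference.from_string(
    "example-project.reporting.daily_revenue"
)
job_config = bigquery.QueryJobConfig(
    destination=destination,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

sql = """
SELECT order_date, SUM(amount) AS revenue
FROM `example-project.raw.orders`
GROUP BY order_date
"""

query_job = client.query(sql, job_config=job_config)  # starts the job
query_job.result()                                     # waits for completion
print("Query finished;", query_job.total_bytes_processed, "bytes scanned")
```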

Posted 1 month ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Introduction: The Hybrid Data Management (HDM) team is looking for enthusiastic and talented software developers to join us. Our services include Db2 on Cloud, Db2 Warehouse on Cloud, Netezza on Cloud and Data Virtualization as a Service. Our services are tightly integrated with IBM Cloud Pak for Data, where customers can access a suite of leading data and AI capabilities in a unified experience.

Your Role and Responsibilities: Software developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today: planes and trains take off on time, bank transactions complete in the blink of an eye, and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of. Design, develop, test, operate and maintain database features in our products, services and tools to provide a secure environment for the product to be used by customers in the cloud. Evaluate new technologies and processes that enhance our service capabilities. Document and share your experience, and mentor others.

Preferred Education: Bachelor's Degree

Required Technical and Professional Expertise: 5+ years of relevant experience in software development. Strong software programming experience and skills in C/C++ or an equivalent programming language. Strong knowledge of data structures, algorithms, object-oriented programming, and test-driven development. Expertise in best practices in design and development. Strong problem determination and resolution skills.

Preferred Technical and Professional Experience: Knowledge of Linux/UNIX operating systems. Exposure to best practices in design, development and testing of software. Working experience with SQL databases (Db2, PostgreSQL, MySQL, Oracle, SQL Server, etc.).

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote


Entity: Technology
Job Family Group: IT&S Group

Job Description: Responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value.

Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvement activities. The role will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. The role will work closely with the Product Owner to understand requirements / user stories and plan and estimate the time needed to deliver them, and will proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts and Visualisation developers to meet the acceptance criteria. The role calls for a high level of skill and experience with tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL and Athena (a brief illustrative sketch follows this listing).

Years of Experience: 13-15

Essential domain expertise:
- Experience in Big Data technologies – AWS, Redshift, Glue, PySpark
- Experience of MPP (Massively Parallel Processing) databases helpful – e.g. Teradata, Netezza
- Challenges involved in Big Data – large table sizes (e.g. depth/width), even distribution of data
- Experience of programming – SQL, Python
- Data modelling experience/awareness – Third Normal Form, Dimensional Modelling
- Data pipelining skills – data blending, etc.
- Visualisation experience – Tableau, Power BI, etc.
- Data management experience – e.g. data quality, security
- Experience of working in a cloud environment – AWS
- Development/delivery methodologies – Agile, SDLC
- Experience working in a geographically disparate team

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Commercial Acumen, Communication, Data Analysis, Data cleansing and transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms (Inactive), Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g. accessing the job application, completing required assessments, participating in telephone screenings or interviews). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us.

If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
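Not part of the listing, but as a hedged illustration of the "AWS Data Lake / Glue / Spark SQL / Athena" toolset named above: a minimal PySpark sketch that reads raw events from S3, applies a Spark SQL transformation, and writes partitioned Parquet for downstream Athena queries. The S3 paths and column names are hypothetical.

    # Illustrative sketch only: S3 paths and column names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("daily-events-curation").getOrCreate()

    # Read the raw landing zone (JSON lines dropped by an upstream ingestion job).
    raw = spark.read.json("s3://example-data-lake/raw/events/")
    raw.createOrReplaceTempView("raw_events")

    # Spark SQL transformation: keep only the latest record per event id.
    curated = spark.sql("""
        SELECT event_id, customer_id, event_type, event_ts, amount
        FROM (
            SELECT *,
                   ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY event_ts DESC) AS rn
            FROM raw_events
        )
        WHERE rn = 1
    """)

    # Repartition on the partition column so files are reasonably even in size,
    # then write Parquet partitioned by date for Athena / Glue catalog consumers.
    (curated
        .withColumn("event_date", curated.event_ts.cast("date"))
        .repartition("event_date")
        .write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-data-lake/curated/events/"))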

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Senior Data Engineer will be responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value.

Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvement activities. The role will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. The role will work closely with the Product Owner to understand requirements / user stories and plan and estimate the time needed to deliver them, and will proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts and Visualisation developers to meet the acceptance criteria. The role calls for a high level of skill and experience with tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL and Athena.

Years of Experience: 8-12

Essential domain expertise:
- Experience in Big Data technologies – AWS, Redshift, Glue, PySpark
- Experience of MPP (Massively Parallel Processing) databases helpful – e.g. Teradata, Netezza
- Challenges involved in Big Data – large table sizes (e.g. depth/width), even distribution of data (see the illustrative sketch after this listing)
- Experience of programming – SQL, Python
- Data modelling experience/awareness – Third Normal Form, Dimensional Modelling
- Data pipelining skills – data blending, etc.
- Visualisation experience – Tableau, Power BI, etc.
- Data management experience – e.g. data quality, security
- Experience of working in a cloud environment – AWS
- Development/delivery methodologies – Agile, SDLC
- Experience working in a geographically disparate team
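Again not part of the posting, but to make the "even distribution of data" point above concrete: on an MPP warehouse such as Redshift, Teradata or Netezza, the distribution key determines which compute slice each row lands on, so a high-cardinality, evenly spread key avoids skewed workloads. A minimal sketch, assuming a hypothetical Redshift cluster reachable over the PostgreSQL wire protocol via psycopg2 and a hypothetical sales table:

    # Illustrative only: cluster endpoint, credentials and table are hypothetical.
    import psycopg2

    DDL = """
    CREATE TABLE IF NOT EXISTS sales_fact (
        sale_id      BIGINT,
        customer_id  BIGINT,
        sale_date    DATE,
        amount       DECIMAL(12,2)
    )
    DISTSTYLE KEY
    DISTKEY (customer_id)   -- high-cardinality key spreads rows evenly across slices
    SORTKEY (sale_date);    -- date-restricted scans can prune blocks
    """

    conn = psycopg2.connect(
        host="example-cluster.abc123.eu-west-1.redshift.amazonaws.com",
        port=5439, dbname="analytics", user="etl", password="secret",
    )
    with conn, conn.cursor() as cur:  # the connection context manager commits on success
        cur.execute(DDL)
    conn.close()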

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Sr. Software Development Engineer (Hadoop / Python / SQL / Impala Dev)

Overview: Mastercard is a technology company in the global payments industry. We operate the world's fastest payments processing network, connecting consumers, financial institutions, merchants, governments and businesses in more than 210 countries and territories. Mastercard products and solutions make everyday commerce activities – such as shopping, travelling, running a business and managing finances – easier, more secure and more efficient for everyone. Mastercard's Data & Services team is a key differentiator for Mastercard, providing cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this dynamic team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services.

We are currently seeking a Software Development Engineer-II for the Location Program within the Data & Services group. You will own end-to-end delivery of engineering projects for some of our analytics and BI solutions, which leverage Mastercard datasets combined with proprietary analytics techniques to help businesses around the world solve multi-million dollar business problems.

Roles and Responsibilities:
- Work as a member of the support team to resolve product-related issues; this requires good troubleshooting skills and solid knowledge of support work.
- Independently apply problem-solving skills to identify symptoms and root causes of issues.
- Make effective and efficient decisions even when data is ambiguous.
- Provide technical guidance, support and mentoring to more junior team members.
- Make active contributions to improvement decisions and make technology recommendations that balance business needs and technical requirements.
- Proactively understand stakeholder needs, goals, expectations and viewpoints to deliver results.
- Ensure design thinking accounts for the long-term maintainability of code.
- Thrive in a highly collaborative company environment where agility is paramount.
- Stay up to date with the latest technologies and technical advancements through self-study, blogs, meetups, conferences, etc.
- Perform system maintenance, production incident problem management, identification of root cause and issue remediation.

All About You:
- Bachelor's degree in Information Technology, Computer Science or Engineering, or equivalent work experience, with a proven track record of successfully delivering on complex technical assignments.
- A solid foundation in Computer Science fundamentals, web applications and microservices-based software architecture.
- Full-stack development experience, including databases (Oracle, Netezza, SQL Server) and hands-on experience with Hadoop, Python, Impala, etc. (see the illustrative sketch after this listing).
- Excellent SQL skills, with experience working with large and complex data sources and the ability to comprehend and write complex queries.
- Experience working in Agile teams and conversant with Agile/SAFe tenets and ceremonies.
- Strong analytical and problem-solving abilities, with quick adaptation to new technologies, methodologies, and systems.
- Excellent English communication skills (both written and verbal) to effectively interact with multiple technical teams and other stakeholders.
- High-energy, detail-oriented and proactive, with the ability to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results.

Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-240980
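As a final illustrative aside, not part of the posting, the Hadoop / Impala / SQL combination mentioned above often comes down to running analytical SQL against Impala from Python. A minimal sketch, assuming the impyla client package and a hypothetical coordinator host and transactions table:

    # Illustrative only: host, port and table are hypothetical; assumes the impyla package.
    from impala.dbapi import connect

    conn = connect(host="impala-coordinator.example.internal", port=21050)
    cur = conn.cursor()

    # Aggregate daily transaction counts from a (hypothetical) partitioned fact table.
    cur.execute("""
        SELECT txn_date, COUNT(*) AS txn_count, SUM(amount) AS total_amount
        FROM payments.transactions
        WHERE txn_date >= '2024-01-01'
        GROUP BY txn_date
        ORDER BY txn_date
    """)

    for txn_date, txn_count, total_amount in cur.fetchall():
        print(txn_date, txn_count, total_amount)

    cur.close()
    conn.close()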

Posted 1 month ago

Apply