0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Sr. Software Development Engineer (Big Data Engineer)

Overview / Job Description Summary
Mastercard is a technology company in the Global Payments Industry. We operate the world’s fastest payments processing network, connecting consumers, financial institutions, merchants, governments and businesses in more than 210 countries and territories. Mastercard products and solutions make everyday commerce activities – such as shopping, travelling, running a business and managing finances – easier, more secure and more efficient for everyone. Mastercard’s Data & Services team is a key differentiator for Mastercard, providing cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this dynamic team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services. We are currently seeking a Software Development Engineer-II for the Location Program within the Data & Services group. You will own end-to-end delivery of engineering projects for some of our analytics and BI solutions, which leverage Mastercard datasets combined with proprietary analytics techniques to help businesses around the world solve multi-million dollar business problems.
Roles And Responsibilities
- Work as a member of the support team to resolve issues related to the product; strong troubleshooting skills and solid knowledge of support work are expected.
- Independently apply problem-solving skills to identify symptoms and root causes of issues.
- Make effective and efficient decisions even when data is ambiguous.
- Provide technical guidance, support and mentoring to more junior team members.
- Make active contributions to improvement decisions and make technology recommendations that balance business needs and technical requirements.
- Proactively understand stakeholder needs, goals, expectations and viewpoints to deliver results.
- Ensure design thinking accounts for the long-term maintainability of code.
- Thrive in a highly collaborative company environment where agility is paramount.
- Stay up to date with the latest technologies and technical advancements through self-study, blogs, meetups, conferences, etc.
- Perform system maintenance, production incident problem management, identification of root cause and issue remediation.

All About You
- Bachelor's degree in Information Technology, Computer Science or Engineering, or equivalent work experience, with a proven track record of successfully delivering on complex technical assignments.
- A solid foundation in Computer Science fundamentals, web applications and microservices-based software architecture.
- Full-stack development experience, including databases (Oracle, Netezza, SQL Server) and hands-on experience with Hadoop, Python, Impala, etc.
- Excellent SQL skills, with experience working with large and complex data sources and the ability to comprehend and write complex queries.
- Experience working in Agile teams; conversant with Agile/SAFe tenets and ceremonies.
- Strong analytical and problem-solving abilities, with quick adaptation to new technologies, methodologies, and systems.
- Excellent English communication skills (both written and verbal) to effectively interact with multiple technical teams and other stakeholders.
- High-energy, detail-oriented and proactive, with the ability to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-240980
Posted 2 weeks ago
10.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Build the future of the AI Data Cloud. Join the Snowflake team. We are looking for a Solutions Architect to be part of our Professional Services team to deploy cloud products and services for our customers. This person must be a hands-on self-starter who loves solving challenging problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem to Snowflake’s solution and communicate that connection and vision to various technical and executive audiences. The person we’re looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get done what needs to be done to make Snowflake and our customers successful. It means keeping up to date on the ever-evolving technologies for data and analytics in order to be an authoritative resource for Snowflake, System Integrators and customers. And it means working collaboratively with a broad range of people both inside and outside the company.
AS A SOLUTIONS ARCHITECT AT SNOWFLAKE, YOU WILL:
- Be a technical expert on all aspects of Snowflake
- Guide customers through the process of migrating to Snowflake and develop methodologies to improve the migration process
- Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
- Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
- Maintain a deep understanding of competitive and complementary technologies and vendors, and how to position Snowflake in relation to them
- Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
- Provide guidance on how to resolve customer-specific technical challenges
- Support other members of the Professional Services team in developing their expertise
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
- Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role
- Experience migrating from one data platform to another and holistically addressing the unique challenges of migrating to a new platform
- University degree in computer science, engineering, mathematics or related fields, or equivalent experience
- Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
- Understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools
- Strong skills in databases, data warehouses, and data processing
- Extensive hands-on expertise with SQL and SQL analytics
- Experience and a track record of success selling data and/or analytics software to enterprise customers, including proven skills identifying key stakeholders, winning value propositions, and compelling events
- Extensive knowledge of and experience with large-scale database technology (e.g. Netezza, Exadata, Teradata, Greenplum, etc.)
- Software development experience with C/C++ or Java; scripting experience with Python, Ruby, Perl, Bash
- Ability and flexibility to travel to work with customers on-site

BONUS POINTS FOR THE FOLLOWING:
- Experience with non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase)
- Familiarity and experience with common BI and data exploration tools (e.g. MicroStrategy, Business Objects, Tableau)
- Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, OpenStack, etc.)
- Experience implementing ETL pipelines using custom and packaged tools
- Experience using AWS services such as S3, Kinesis, Elastic MapReduce, Data Pipeline
- Experience selling enterprise SaaS software
- Proven success in enterprise software

WHY JOIN OUR PROFESSIONAL SERVICES TEAM AT SNOWFLAKE?
- A unique opportunity to work on a truly disruptive software product
- Get unique, hands-on experience with bleeding-edge data warehouse technology
- Develop, lead and execute an industry-changing initiative
- Learn from the best! Join a dedicated, experienced team of professionals.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
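The migrate-and-validate loop this role centers on (copy schema, copy rows, verify counts before cutover) can be sketched in miniature. This is an illustrative toy, not Snowflake's tooling: SQLite stands in for the source and target platforms, and the table and column names are invented for the example.

```python
import sqlite3

def migrate_table(src, dst, table):
    """Copy one table from a source connection to a target, then verify.

    A toy sketch of the migrate-and-validate loop, with SQLite standing in
    for the real platforms (Teradata, Netezza, Snowflake, etc.).
    """
    # Recreate the table on the target from the source's stored DDL.
    ddl = src.execute(
        "SELECT sql FROM sqlite_master WHERE type='table' AND name=?", (table,)
    ).fetchone()[0]
    dst.execute(ddl)
    # Bulk-copy all rows in one pass.
    rows = src.execute(f"SELECT * FROM {table}").fetchall()
    if rows:
        placeholders = ",".join("?" * len(rows[0]))
        dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
        dst.commit()
    # Validate before cutover: row counts must match on both sides.
    count = lambda conn: conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    assert count(src) == count(dst), "row count mismatch"
    return count(dst)

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, region TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, "APAC"), (2, "EMEA")])
target = sqlite3.connect(":memory:")
print(migrate_table(source, target, "orders"))  # 2
```

In a real engagement the copy step would be platform-specific bulk loading and the validation would go beyond row counts (checksums, sampled comparisons), but the shape of the loop is the same.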
Posted 2 weeks ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

- 3+ years direct experience working in IT Infrastructure
- 2+ years in a customer-facing role working with enterprise clients
- Experience with implementing and/or maintaining technical solutions in virtualized environments
- Experience in design, architecture and implementation of data warehouses, data pipelines and flows
- Experience with developing software code in one or more languages such as Java and Python
- Proficient with SQL
- Experience designing and deploying large-scale distributed data processing systems with one or more technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy
- Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations
- Demonstrated excellent communication, presentation, and problem-solving skills

Mandatory Certifications Required: Google Cloud Professional Data Engineer
Mandatory skill sets: GCP Architecture/Data Engineering, SQL, Python
Preferred skill sets: GCP Architecture/Data Engineering, SQL, Python
Years of experience required: 8-10 years
Qualifications: B.E / B.TECH / MBA / MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required:
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Python (Programming Language)
Optional Skills:
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
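The SQL proficiency this posting asks for is the everyday aggregate-and-filter kind. A minimal sketch, using Python's built-in sqlite3 as a stand-in for BigQuery (the query shape is portable; the table and values are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INTEGER, region TEXT, amount REAL);
INSERT INTO events VALUES
  (1, 'APAC', 120.0), (2, 'APAC', 80.0),
  (3, 'EMEA', 200.0), (1, 'APAC', 50.0);
""")
# Aggregate spend per region, keep only regions over a threshold --
# the kind of GROUP BY / HAVING query the role exercises daily.
rows = conn.execute("""
    SELECT region, COUNT(*) AS n_events, SUM(amount) AS total
    FROM events
    GROUP BY region
    HAVING SUM(amount) > 100
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('APAC', 3, 250.0), ('EMEA', 1, 200.0)]
```

On BigQuery the same statement would run against a partitioned table via the `google-cloud-bigquery` client rather than a local connection, but the SQL itself is unchanged.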
Posted 2 weeks ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description
The AWS Fintech team is looking for a Data Engineering Manager to transform and optimize high-scale, world-class financial systems that power the global AWS business. The success of these systems will fundamentally impact the profitability and financial reporting for AWS and Amazon. This position will play an integral role in leading programs that impact multiple AWS cost optimization initiatives. These programs will involve multiple development teams across diverse organizations to build sophisticated, highly reliable financial systems. These systems enable routine finance operations as well as machine learning, analytics, and GenAI reporting that enable AWS Finance to optimize profitability and free cash flow. This position requires a proactive, highly organized individual with an aptitude for data-driven decision making, a deep curiosity for learning new systems, and collaborative skills to work with both technical and financial teams.

Key job responsibilities
- Build and lead a team of data engineers, application development engineers, and systems development engineers
- Drive execution of data engineering programs and projects
- Help our leadership team make challenging decisions by presenting well-reasoned and data-driven solution proposals and prioritizing recommendations
- Identify and execute on opportunities for our organization to move faster in delivering innovations to our customers
- This role has on-call responsibilities

A day in the life
The successful candidate will build and grow a high-performing data engineering team to transform financial processes at Amazon. The candidate will be curious and interested in the capabilities of Large Language Model-based development tools like Amazon Q to help teams accelerate transformation of systems. The successful candidate will begin with execution to familiarize themselves with the space and then construct a strategic roadmap for the team to innovate.
You thrive and succeed in an entrepreneurial environment, and are not hindered by ambiguity or competing priorities. You thrive driving strategic initiatives and also dig in deep to get the job done.

About The Team
The AWS FinTech team enables the growth of earth’s largest cloud provider by building world-class finance technology solutions for effective decision making. We build scalable long-term solutions that provide transparency into financial business insights while ensuring the highest standards of data quality, consistency, and security. We encourage a culture of experimentation and invest in big ideas and emerging technologies. We are a globally distributed team with software development engineers, data engineers, application developers, technical program managers, and product managers. We invest in providing a safe and welcoming environment where inclusion, acceptance, and individual values are honored.

Basic Qualifications
- Experience managing a data or BI team
- 2+ years of processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or Hadoop based big data solution) experience
- 2+ years of relational database technology (such as Redshift, Oracle, MySQL or MS SQL) experience
- 2+ years of developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes) experience
- 5+ years of data engineering experience
- Experience communicating to senior management and customers verbally and in writing
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS

Preferred Qualifications
- Knowledge of software development life cycle or agile development environment with emphasis on BI practices
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience with AWS Tools and Technologies (Redshift, S3, EC2)

Our inclusive culture empowers
Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2961772
Posted 3 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Data Engineer - Google Cloud
- 5+ years direct experience working in Enterprise Data Warehouse technologies
- 5+ years in a customer-facing role working with enterprise clients
- Experience with implementing and/or maintaining technical solutions in virtualized environments
- Experience in design, architecture and implementation of data warehouses, data pipelines and flows
- Experience with developing software code in one or more languages such as Java, Python and SQL
- Experience designing and deploying large-scale distributed data processing systems with one or more technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy
- Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations
- Demonstrated excellent communication, presentation, and problem-solving skills

Mandatory Certifications Required: Google Cloud Professional Cloud Architect, or Google Cloud Professional Data Engineer + AWS Big Data Specialty Certification
Mandatory skill sets: GCP Data Engineering, SQL, Python
Preferred skill sets: GCP Data Engineering, SQL, Python
Years of experience required: 4-8 years
Qualifications: B.E / B.TECH / MBA / MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required:
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Python (Programming Language)
Optional Skills:
Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:
Posted 3 weeks ago
3.0 years
5 - 8 Lacs
Hyderābād
On-site
We are seeking an experienced and motivated Data Engineer to join our team. In this role, you will design, build, and maintain scalable data solutions to support critical business needs. You will work with distributed data platforms, cloud infrastructure, and modern data engineering tools to enable efficient data processing, storage, and analytics. The role includes participation in an on-call rotation to ensure the reliability and availability of our systems and pipelines.

Key Responsibilities
- Data Platform Development: Design, develop, and maintain data pipelines and workflows on distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, or Teradata.
- Cloud Integration: Build and optimize cloud-based solutions using AWS or GCP to process and store large-scale datasets.
- Workflow Orchestration: Design and manage workflows and data pipelines using Apache Airflow to ensure scalability, reliability, and maintainability.
- Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes for efficient scalability and resource management.
- Event Streaming: Work with Kafka to implement reliable and scalable event streaming systems for real-time data processing.
- Programming and Automation: Write clean, efficient, and maintainable code in Python and SQL to automate data processing, transformation, and analytics tasks.
- Database Management: Design and optimize relational and non-relational databases to support high-performance querying and analytics.
- System Monitoring & Troubleshooting: Participate in the on-call rotation to monitor systems, address incidents, and ensure the reliability of production environments.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and product managers, to understand data requirements and deliver solutions that meet business objectives. Participate in code reviews, technical discussions, and team collaboration to deliver high-quality software solutions.
This role includes participation in an on-call rotation to ensure the reliability and performance of production systems:
- Rotation Schedule: Weekly rotation beginning Tuesday at 9:00 PM PST through Monday at 9:00 AM PST.
- Responsibilities During On-Call: Monitor system health and respond to alerts promptly. Troubleshoot and resolve incidents to minimize downtime. Escalate issues as needed and document resolutions for future reference.

Requirements
- Primary technologies: BigQuery or another distributed data platform (for example Big Data (Hadoop/EMR/DataProc), Snowflake, Teradata, or Netezza), AWS, GCP, Kubernetes, Kafka, Python, SQL
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent work experience)
- 3+ years of experience in data engineering or related roles
- Hands-on experience with distributed data platforms such as BigQuery, Hadoop/EMR/DataProc, Snowflake, or Teradata
- Proficiency in Apache Airflow for building and orchestrating workflows and data pipelines
- Proficiency in Python and SQL for data processing and analysis
- Experience with cloud platforms like AWS or GCP, including building scalable solutions
- Familiarity with Kubernetes for container orchestration
- Knowledge of Kafka for event streaming and real-time data pipelines
- Strong problem-solving skills and ability to troubleshoot complex systems
- Excellent communication and collaboration skills to work effectively in a team environment

Preferred
- Familiarity with CI/CD pipelines for automated deployments
- Knowledge of data governance, security, and compliance best practices
- Experience with DevOps practices and tools

We have a global team of amazing individuals working on highly innovative enterprise projects & products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintechs, and multiple Silicon Valley startups. What makes Confiz stand out is our focus on processes and culture.
Confiz is ISO 9001:2015 (QMS), ISO 27001:2022 (ISMS), ISO 20000-1:2018 (ITSM) and ISO 14001:2015 (EMS) certified. We have a vibrant culture of learning via collaboration and making the workplace fun. People who work with us use cutting-edge technologies while contributing to the success of the company as well as their own. To know more about Confiz Limited, visit https://www.linkedin.com/company/confiz/
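The workflow orchestration idea behind the Airflow requirement above (run each task only after its upstream dependencies finish) can be shown with a toy dependency-ordered runner. This is not Airflow itself, just an illustration of the DAG concept; the task names and data are invented, and real Airflow adds scheduling, retries, and cycle detection this sketch omits.

```python
# Toy DAG-style orchestration: run tasks only after their upstream
# dependencies have completed (no cycle detection, for brevity).
def run_dag(tasks, deps):
    """tasks: name -> callable; deps: name -> list of upstream names."""
    done, order = set(), []
    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)          # recurse into dependencies first
        tasks[name]()
        done.add(name)
        order.append(name)
    for name in tasks:
        run(name)
    return order

results = {}
tasks = {
    "extract":   lambda: results.setdefault("raw", [3, 1, 2]),
    "transform": lambda: results.setdefault("clean", sorted(results["raw"])),
    "load":      lambda: results.setdefault("loaded", len(results["clean"])),
}
deps = {"transform": ["extract"], "load": ["transform"]}
print(run_dag(tasks, deps))  # ['extract', 'transform', 'load']
```

In Airflow the same shape would be declared with operators and `>>` dependencies inside a `DAG` definition, and the scheduler, not a recursive call, would decide when each task runs.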
Posted 3 weeks ago
2.0 years
0 Lacs
Hyderābād
On-site
- 2+ years of processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or Hadoop based big data solution) experience
- 2+ years of relational database technology (such as Redshift, Oracle, MySQL or MS SQL) experience
- 2+ years of developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes) experience
- 5+ years of data engineering experience
- Experience managing a data or BI team
- Experience communicating to senior management and customers verbally and in writing
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS

Do you pioneer? Do you enjoy solving complex problems in building and analyzing large datasets? Do you enjoy focusing first on your customer and working backwards? The Amazon transportation controllership team is looking for an experienced Data Engineering Manager with experience in architecting large/complex data systems and a strong record of achieving results, scoping and delivering large projects end-to-end. You will be the key driver in building out our vision for scalable data systems to support the ever-growing Amazon global transportation network businesses.

Key job responsibilities
As a Data Engineering Manager in Transportation Controllership, you will be at the forefront of managing large projects, providing vision to the team, and designing and planning large financial data systems that will allow our businesses to scale world-wide. You should have deep expertise in the database design, management, and business use of extremely large datasets, including using AWS technologies such as Redshift, S3, EC2, Data Pipeline and other big data technologies. Above all, you should be passionate about warehousing large datasets together to answer business questions and drive change.
You should have excellent business acumen and communication skills to be able to work with multiple business teams, and be comfortable communicating with senior leadership. Due to the breadth of the areas of business, you will coordinate across many internal and external teams, and provide visibility to the senior leaders of the company with your strong written and oral communication skills. We need individuals with a demonstrated ability to learn quickly, think big, execute both strategically and tactically, and motivate and mentor their team to deliver business value to our customers on time.

A day in the life
On a daily basis you will:
- Manage and help grow a team of high-performing engineers
- Understand new business requirements and architect data engineering solutions for them
- Plan your team's priorities, working with relevant internal/external stakeholders, including sprint planning
- Resolve impediments faced by the team
- Update leadership as needed
- Use judgement in making the right tactical and strategic decisions for the team and organization
- Monitor the health of the databases and ingestion pipelines

Preferred Qualifications
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience with AWS Tools and Technologies (Redshift, S3, EC2)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
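The ETL/ELT pattern this posting keeps returning to (land raw records first, then transform with SQL inside the warehouse) can be sketched in miniature. This is an illustrative toy with invented table names and values, using SQLite as a stand-in for Redshift, where the load step would instead be a bulk COPY from S3.

```python
import sqlite3

# ELT sketch: land raw records as-is, then clean and aggregate with SQL
# inside the warehouse itself (SQLite standing in for Redshift).
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE raw_shipments (route TEXT, cost TEXT)")

# Load: ingest everything, including messy values, without pre-cleaning.
raw = [("DEL-BOM", "125.50"), ("DEL-BOM", "74.50"), ("BLR-HYD", "n/a")]
warehouse.executemany("INSERT INTO raw_shipments VALUES (?, ?)", raw)

# Transform: filter out non-numeric costs and aggregate per route,
# materializing the result as a query-ready table.
warehouse.execute("""
    CREATE TABLE route_costs AS
    SELECT route, SUM(CAST(cost AS REAL)) AS total_cost
    FROM raw_shipments
    WHERE cost GLOB '[0-9]*'
    GROUP BY route
""")
rows = warehouse.execute(
    "SELECT route, total_cost FROM route_costs ORDER BY route").fetchall()
print(rows)  # [('DEL-BOM', 200.0)]
```

Keeping the raw table untouched is the point of ELT: when the transformation logic changes, the derived table can be rebuilt from the landed data without re-extracting from source systems.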
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At Broadridge, we've built a culture where the highest goal is to empower others to accomplish more. If you’re passionate about developing your career, while helping others along the way, come join the Broadridge team.

- 8+ years designing, developing, and administering IBM Cognos 11.1.x applications. Cognos 11.1.x upgrade experience required.
- Installing hot fixes / service packs to the existing version of Cognos Analytics.
- Experience with Motio CI integration with Cognos Analytics.
- Knowledge of Cognos SDK and Cognos Lifecycle Manager is a plus.
- Hands-on experience with granular-level Cognos security customization and installing third-party tools.
- Hands-on experience with Cognos Framework Manager installation and configuration.
- Experience with publishing packages and customizing package access as per requirements.
- Knowledge of Cognos TM1 is a plus.
- Experience with Cognos/Tableau installation and configuration in AWS.
- Responsible for troubleshooting and resolving Cognos Analytics/Tableau issues, opening service requests with the Cognos vendor, working with different teams providing recommendations, driving standards, and planning and executing effective transitions of development and production operations.
- Deployment of Cognos in a clustered environment and performing upgrades.
- Implement and document best practices for a Cognos environment.
- Experience in Windows/Linux based operating system environments and well versed in Linux OS commands.
- Experience should include maintenance and support activities, performance monitoring and tuning, upgrading versions, software configuration, business continuity and disaster recovery planning, and general IT processes such as Change Management, Configuration Management, Problem Resolution, and Incident Tracking.
- Ability to cross-train the team.
- Implementation of proactive Cognos environment health checks.
- Hands-on Cognos user groups, security, and user entitlement administration.
- Experience with Cognos user LDAP/Active Directory integration/synchronization preferred.
- Experience with IIS 7.5 or higher is a plus.
- Integrating Cognos with a SharePoint portal/team site is a plus.
- Ability to provide 24x7 production support for Cognos in an on-call rotation, with excellent communication skills, required.
- Any other BI tool experience, such as Tableau/Jaspersoft/Crystal, is a plus.
- Experience with industry BI/reporting toolsets including Tableau, Jaspersoft, Cognos, Power BI, and Crystal. Tableau 2022.1.x upgrade experience required.
- Knowledge of Jasper Report Server upgrades from 6.2 to 8.1 is a plus.
- Experience with connecting to Hadoop, Oracle, Sybase, DB2, Netezza, Teradata, and SQL databases.
- Knowledge of Data Science integration and applications (Python, R).
- Knowledge of programming languages (SDKs, APIs, Java, JavaScript).
- Customizing the look and feel of Cognos and Tableau URLs is a plus.
- Excellent communication skills (must be able to interface with both technical and business leaders in the organization).
- Oversee and perform all system administration and change management responsibilities on the Tableau server, including server maintenance, patching, and hardware/software upgrades.
- Experience in migrating Tableau workbooks/data sources into higher environments.
- Expertise in installing/configuring Jasper Report Server in on-premises and cloud environments.
- Experience in deploying Jasper report code from one environment to another.
- Experience with installing/configuring Apache Tomcat and knowledge of customizing system.xml and web.xml files.
Posted 3 weeks ago
6.0 - 12.0 years
0 Lacs
Gurugram, Haryana, India
On-site
TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena and there’s nothing that can stop us from growing together. What we are looking for Role: SQL DBA Experience Range: 6 - 12 years Location: New Delhi / Gurugram Interview Mode: Saturday Virtual Drive Must Have: 1. MS SQL Server 2. Azure SQL Server 3. Certification in SQL Server / Azure SQL Good to Have: 1. DB2 2. Netezza 3. PowerShell 4. Azure PostgreSQL Essential: Administer and maintain database systems, with a focus on MS SQL Server along with Azure, PostgreSQL, and DB2. Supporting SQL Server in Azure as IaaS, SQL Managed Instance, and PaaS services. Managing Azure SQL databases, SQL Managed Instances, and Azure VMs in the Azure Portal. Monitor database performance and proactively address issues to ensure optimal functionality. Collaborate with project teams to understand database requirements and provide efficient solutions. Participate in the design, implementation, and maintenance of database structures for different applications. Work independently to troubleshoot and resolve database-related issues promptly. Implement best practices to enhance database performance and security. Manage databases on Azure Cloud, ensuring seamless integration and optimization for cloud-based solutions. Utilize SQL Server tools and other relevant technologies for effective database administration. Stay updated on the latest advancements in database tools and incorporate them into daily practices. Collaborate with cross-functional teams, including developers and system administrators, to achieve project goals, and provide guidance and support to team members on database-related issues. Minimum Qualification: •15 years of full-time education •Minimum percentile of 50% in 10th, 12th, UG & PG (if applicable)
Posted 3 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Are you looking for a career move that will put you at the heart of a global financial institution? Then bring your skills in Dashboard applications to our Markets Operations Technology team. By joining Citi, you will become part of a global organization whose mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. Team/Role Overview Our Markets Operations Technology team is a global team that provides exposure to countries such as India, the US, and the UK. The role of this team is to manage the end-to-end processing of a case within Dashboard Applications. What You’ll Do The Applications Development Senior Dashboard Programmer/Developer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. This role requires deep expertise in system design, hands-on coding, and strong problem-solving skills to create resilient, high-performing, and secure applications.
What We’ll Need From You Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas. Monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, as well as provide user and operational support on applications to business users. Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business processes, system processes, and industry standards, and make evaluative judgements. Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Ensure essential procedures are followed and help define operating standards and processes. Act as SME to senior stakeholders and/or other team members. Drive the adoption of modern engineering ways of working, including Agile, DevOps, and CI/CD. Advocate for automated testing, infrastructure as code, and continuous monitoring to enhance software reliability. Apply Behavior-Driven Development (BDD), Test-Driven Development (TDD), and unit testing to ensure code quality and functionality. Conduct thorough code reviews, ensuring adherence to best practices in readability, performance, and security. Implement and enforce secure coding practices, performing vulnerability assessments and ensuring compliance with security standards. Responsibilities: The candidate should have 8+ years of overall experience, including 2+ years in the financial services industry (preferably in investment banks). The ideal candidate would be a self-sufficient individual contributor with a go-getter attitude, able to develop software meeting the defined quality metrics within the project environment.
The candidate would have prior working experience in a competitive, fast-paced environment delivering software to meet business needs. Relevant technologies: QlikView, Qlik Sense, Tableau, NPrinting, JScript, HTML and Netezza. Technically, the dashboards are built on QlikView and Qlik Sense, with Netezza at the backend. NPrinting is used to generate and send user reports as mail attachments. Experience with high-performance, high-volume integrated dashboard development and database performance tuning. Strong QlikView and Qlik Sense knowledge and experience using Mashups to build dashboards is a must. Knowledge of design methodologies. Display sound analytical, problem solving, presentation and interpersonal skills to handle various critical situations. Ability to carry out adaptive changes necessitated by changes in business requirements and technology. Post-trade processing experience; familiarity with the trade life cycle and associated business processes. The role would be based in Pune, driving client interfacing with business and operations to drive new product onboarding onto the current platform. The person would be responsible for understanding business requirements and interacting with upstream systems. The candidate is expected to deliver new products to be included and enhancements to the existing product for broader coverage of the various regions' feeds and markets. Support and manage the existing code base. The candidate must have a strong desire to learn, commitment to their roles & responsibilities, and zeal for hard work in order to be a perfect fit for the team. Education: Bachelor’s degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 3 weeks ago
9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
🚀 We’re Hiring: Informatica Developer (6–9 Years Experience) | Chennai (Work From Office) 🚀 Are you ready to take your ETL expertise to the next level? Join our client, a leading IT solutions provider, as we look for Informatica professionals with 6–9 years of hands-on experience for an immediate opportunity in Chennai. 🔍 Role Overview As a key member of our data integration team, you'll design, develop, test, and support robust ETL solutions using Informatica and related tools. You'll play a critical role in driving data strategy for top-tier financial services clients. 📍 Location: Chennai (Work From Office) 🕒 Notice Period: Immediate joiners or candidates with up to 30 days notice only ✅ Important: Background verification, PF account, and no dual employment are mandatory 🛠️ Interview Process: Multiple technical rounds by our client ✅ What We’re Looking For: Must-Have Skills: 6–9 years in ETL/Informatica (Axway or similar tools also considered) Strong SQL skills for data analysis and validation Proficiency in Oracle, Netezza, and data warehouse/lake environments Automation experience using Java frameworks; scripting in Python, Unix, Shell Experience with DevOps tools: Jenkins, UDeploy, Concourse, CI/CD pipelines Familiarity with cloud platforms like AWS, Azure, and Snowflake Strong understanding of SDLC/STLC, defect tracking, and Agile methodologies Excellent communication and coordination with business & tech stakeholders Good-to-Have Skills: Exposure to BI/reporting tools like Power BI, OBIEE, Tableau Experience with AtScale (semantic layer platform) Hands-on with data test automation tools like iCEDQ 👤 Who Should Apply? Professionals with a passion for data integration and testing Candidates who thrive in fast-paced, Agile environments Individuals ready to work from our Chennai office and join immediately Ready to make an impact with our client? Apply now or tag someone who fits this role!
#HiringNow #InformaticaJobs #ETLDeveloper #ChennaiJobs #DataIntegration #SQL #CloudData #DevOps #ImmediateJoiners #6to9YearsExperience #strive4x #OGI
Posted 3 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Build the future of the AI Data Cloud. Join the Snowflake team. We are looking for a Solutions Architect to be part of our Professional Services team to deploy cloud products and services for our customers. This person must be a hands-on self-starter who loves solving innovative problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem with Snowflake’s solution and communicate that connection and vision to various technical and executive audiences. The person we’re looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get done what needs to be done to make Snowflake and our customers successful. It means keeping up to date on the ever-evolving technologies for data and analytics in order to be an authoritative resource for Snowflake, System Integrators and customers. And it means working collaboratively with a broad range of people both inside and outside the company.
AS A SOLUTIONS ARCHITECT AT SNOWFLAKE, YOU WILL: Be a technical expert on all aspects of Snowflake Guide customers through the process of migrating to Snowflake and develop methodologies to improve the migration process Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology Maintain a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments Provide guidance on how to resolve customer-specific technical challenges Support other members of the Professional Services team in developing their expertise Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing.
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE: Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role Experience migrating from one data platform to another and holistically addressing the unique challenges of migrating to a new platform University degree in computer science, engineering, mathematics or related fields, or equivalent experience Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos Understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools Strong skills in databases, data warehouses, and data processing Extensive hands-on expertise with SQL and SQL analytics Experience and a track record of success selling data and/or analytics software to enterprise customers; includes proven skills identifying key stakeholders, winning value propositions, and compelling events Extensive knowledge of and experience with large-scale database technology (e.g. Netezza, Exadata, Teradata, Greenplum, etc.) Software development experience with C/C++ or Java Scripting experience with Python, Ruby, Perl, Bash Ability and flexibility to travel to work with customers on-site BONUS POINTS FOR THE FOLLOWING: Experience with non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase) Familiarity and experience with common BI and data exploration tools (e.g. MicroStrategy, Business Objects, Tableau) Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, OpenStack, etc.) Experience implementing ETL pipelines using custom and packaged tools Experience using AWS services such as S3, Kinesis, Elastic MapReduce, Data Pipeline Experience selling enterprise SaaS software Proven success in enterprise software WHY JOIN OUR PROFESSIONAL SERVICES TEAM AT SNOWFLAKE?
Unique opportunity to work on a truly disruptive software product Get unique, hands-on experience with bleeding-edge data warehouse technology Develop, lead and execute an industry-changing initiative Learn from the best! Join a dedicated, experienced team of professionals. Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
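The "SQL and SQL analytics" expertise the Snowflake posting asks for typically centers on window-function queries. Here is a minimal, illustrative sketch using Python's built-in sqlite3 as a stand-in warehouse (the `sales` table, its columns, and its values are invented for this example; the window-function syntax shown is broadly similar in cloud warehouses):

```python
import sqlite3

# In-memory database standing in for a cloud warehouse; all data invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("EMEA", "2024-01", 100.0), ("EMEA", "2024-02", 150.0),
     ("APAC", "2024-01", 80.0), ("APAC", "2024-02", 120.0)],
)

# A per-region running total -- a typical "SQL analytics" window query.
rows = conn.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for r in rows:
    print(r)
```

Window functions require a reasonably recent SQLite (3.25+), which ships with current CPython builds.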
Posted 3 weeks ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Are you looking for a career move that will put you at the heart of a global financial institution? Then bring your skills in Dashboard applications to our Markets Operations Technology team. By joining Citi, you will become part of a global organization whose mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. Team/Role Overview Our Markets Operations Technology team is a global team that provides exposure to countries such as India, the US, and the UK. The role of this team is to manage the end-to-end processing of a case within Dashboard Applications. What You’ll Do The Applications Development Senior Dashboard Programmer/Developer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. This role requires deep expertise in system design, hands-on coding, and strong problem-solving skills to create resilient, high-performing, and secure applications.
What We’ll Need From You Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas. Monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, as well as provide user and operational support on applications to business users. Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business processes, system processes, and industry standards, and make evaluative judgements. Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. Ensure essential procedures are followed and help define operating standards and processes. Act as SME to senior stakeholders and/or other team members. Drive the adoption of modern engineering ways of working, including Agile, DevOps, and CI/CD. Advocate for automated testing, infrastructure as code, and continuous monitoring to enhance software reliability. Apply Behavior-Driven Development (BDD), Test-Driven Development (TDD), and unit testing to ensure code quality and functionality. Conduct thorough code reviews, ensuring adherence to best practices in readability, performance, and security. Implement and enforce secure coding practices, performing vulnerability assessments and ensuring compliance with security standards. Responsibilities: The candidate should have 12+ years of overall experience, with 8+ years of relevant experience in Tableau / Qlik Sense, including 2+ years in the financial services industry (preferably in investment banks). The ideal candidate would be a self-sufficient individual contributor with a go-getter attitude, able to develop software meeting the defined quality metrics within the project environment.
The candidate would have prior working experience in a competitive, fast-paced environment delivering software to meet business needs. The candidate will be responsible for migrating existing dashboards from Qlik Sense to Tableau as part of a strategic initiative. Relevant technologies: QlikView, Qlik Sense, Tableau, JScript, HTML and Netezza. Technically, the dashboards are built on Qlik Sense, with Netezza at the backend. Experience with high-performance, high-volume integrated dashboard development and database performance tuning. Strong Qlik Sense knowledge and experience using Mashups to build dashboards is a must. Experience migrating existing Qlik Sense dashboards to Tableau will be an added advantage. Knowledge of design methodologies. Display sound analytical, problem solving, presentation and interpersonal skills to handle various critical situations. Ability to carry out adaptive changes necessitated by changes in business requirements and technology. Post-trade processing experience; familiarity with the trade life cycle and associated business processes. The role would be based in Pune, driving client interfacing with business and operations to drive new product onboarding onto the current platform. The person would be responsible for understanding business requirements and interacting with upstream systems. The candidate is expected to deliver new products to be included and enhancements to the existing product for broader coverage of the various regions' feeds and markets. Support and manage the existing code base. The candidate must have a strong desire to learn, commitment to their roles & responsibilities, and zeal for hard work in order to be a perfect fit for the team. Education: Bachelor’s degree/University degree or equivalent experience. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Citi Global Functions Technology is a world-class technology group employing an innovative, disciplined, and business-focused approach to developing a wide variety of products and solutions. Across many diverse technology hubs worldwide, our 14k+ technologists design, build and deploy technology solutions for business stakeholders across Risk, Finance, Compliance and HR domains. Citi is going through a major transformation program to improve its overall financial management by implementing an industry-standard cloud-based software platform. Looking for a Business Analyst who has strong experience working with technical teams on Ab Initio, ETL, and data warehousing, in strong partnership with Finance stakeholders. Responsibilities : Leading business requirements for the buildout of engineering tools using Ab Initio, Metadata Hub, Tricentis Tosca, Unix and Oracle. This is an individual contributor role that owns all business requirements for data engineering utilities, enabling rollout and ensuring standard GFT processes are deployed for the successful go-live of the strategic ledger for a key project. Performing initiatives related to system business analysis, functional testing, SIT, the User Acceptance Testing (UAT) process and product rollout into production. You will be a BA specialist who works with technology project managers, UAT professionals and users to design and implement appropriate scripts/plans for an application testing strategy/approach. Responsibilities may also include defining business requirements for software quality assurance testing. Resolves complex and highly variable issues. Analyses trends at an organizational level to improve processes; follows and analyses industry trends. Documents design standards and procedures; ensures that they are adhered to throughout the software development life cycle. Manages organizational process change. Develops and implements methods for estimating cost, effort and milestones of IT Quality activities.
Strives for continuous improvement and streamlining of processes. Ensures consistency and quality of processes across the organization. Exhibits in-depth understanding of concepts and procedures within own area and basic knowledge of these elements in other areas. Requires in-depth understanding of how own area integrates within IT Quality and has basic commercial awareness. Responsible for budgeting, project estimates, task management & balancing prioritization across multiple streams of development. Collaborates with local and global stakeholders such as the QA team, production support team, environment management team, DBA team, etc. to ensure project stability and productivity. Experience with Citi implementations and Oracle configuration. Performs other duties and functions as assigned. Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients and assets. Qualifications : Relevant experience in software business analysis or IT covering Finance Technology in financial services. Relevant experience leading business requirements for development of enterprise-scale platforms, products, or frameworks, preferably using Oracle/Netezza/Teradata. Strong experience analyzing and communicating complex data problems with the tools available at Citi. Knowledge of any well-known software development and testing life-cycle methodology. Requires communication and diplomacy skills and an ability to persuade and influence. Adopting a standard process to ensure all test cases coming from key stakeholders are received, reviewed, and validated in a consistent fashion. Experience in all aspects of data, namely reconciliations, data comparison, data quality and data security for SaaS or cloud-based platforms.
Demonstrated experience in collaborating with different teams to ensure proactive cross-application/downstream impact analysis; responsible for creating test plans and strategy across multiple business-critical applications in Finance. Experience with existing Citi applications, implementations and configurations. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Business Analysis / Client Services ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
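The reconciliation and data-comparison work named in the qualifications above reduces to finding records missing on either side plus value mismatches within a tolerance. A toy sketch (the transaction IDs, amounts, and the 0.01 tolerance are all invented for illustration):

```python
# Hypothetical records: an upstream feed versus a ledger extract.
source = {"T1": 100.0, "T2": 250.5, "T3": 75.0}
target = {"T1": 100.0, "T2": 250.0, "T4": 60.0}

# Records present on one side only.
missing_in_target = sorted(set(source) - set(target))
missing_in_source = sorted(set(target) - set(source))

# Matched records whose amounts differ beyond a (hypothetical) tolerance.
breaks = {k: (source[k], target[k])
          for k in source.keys() & target.keys()
          if abs(source[k] - target[k]) > 0.01}

print(missing_in_target, missing_in_source, breaks)
```

A real reconciliation would key on composite identifiers and run against database extracts, but the break-classification logic follows this shape.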
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Senior Data Engineer will be responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation. The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvements. Will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. Will work closely with the Product Owner to understand requirements/user stories and have the ability to plan and estimate the time taken to deliver the user stories. Proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. Will be highly skilled and experienced in the use of tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL, and Athena. Years of Experience: 8-12 Essential domain expertise: Experience in Big Data technologies – AWS, Redshift, Glue, PySpark. Experience of MPP (Massively Parallel Processing) databases helpful – e.g. Teradata, Netezza. Challenges involved in Big Data – large table sizes (e.g. depth/width), even distribution of data. Experience of programming – SQL, Python. Data modelling experience/awareness – Third Normal Form, Dimensional Modelling. Data pipelining skills – data blending, etc. Visualisation experience – Tableau, PBI, etc. Data management experience – e.g. data quality, security, etc. Experience of working in a cloud environment – AWS. Development/delivery methodologies – Agile, SDLC. Experience working in a geographically disparate team.
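The dimensional-modelling and data-blending skills this posting lists come down to joining fact rows to dimension attributes and aggregating. A minimal pure-Python sketch, with all table contents invented for illustration (a real pipeline would express this as a join in Spark SQL or Redshift):

```python
# Hypothetical star-schema fragments: one dimension, one fact table.
dim_product = {1: {"name": "Widget", "category": "Hardware"},
               2: {"name": "Gadget", "category": "Electronics"}}
fact_sales = [{"product_id": 1, "qty": 3, "price": 10.0},
              {"product_id": 2, "qty": 1, "price": 25.0},
              {"product_id": 1, "qty": 2, "price": 10.0}]

# Blend: look up each fact row's dimension, then aggregate by category.
revenue_by_category = {}
for row in fact_sales:
    cat = dim_product[row["product_id"]]["category"]
    revenue_by_category[cat] = (revenue_by_category.get(cat, 0.0)
                                + row["qty"] * row["price"])

print(revenue_by_category)
```

The dictionary lookup plays the role of the fact-to-dimension foreign-key join; even data distribution matters in MPP engines precisely because this join is shuffled across nodes.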
Posted 3 weeks ago
1.0 - 3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Roles & Responsibilities Job Description: Build pipelines to bring in a wide variety of data from multiple sources within the organization as well as from social media and public data sources. Collaborate with cross-functional teams to source data and make it available for downstream consumption. Work with the team to provide an effective solution design to meet business needs. Ensure regular communication with key stakeholders; understand any key concerns in how the initiative is being delivered or any risks/issues that have either not yet been identified or are not being progressed. Ensure dependencies and challenges (risks) are escalated and managed. Escalate critical issues to the Sponsor and/or Head of Data Engineering. Ensure timelines (milestones, decisions and delivery) are managed and the value of the initiative is achieved, without compromising quality and within budget. Ensure an appropriate and coordinated communications plan is in place for initiative execution and delivery, both internal and external. Ensure final handover of the initiative to business-as-usual processes, carry out a post-implementation review (as necessary) to ensure initiative objectives have been delivered, and ensure any lessons learned are fed into future initiative management processes. Who We Are Looking For Competencies & Personal Traits Work as a team player Excellent problem analysis skills Experience with at least one cloud infrastructure provider (Azure/AWS) Experience in building data pipelines using batch processing with Apache Spark (Spark SQL, DataFrame API) or Hive query language (HQL) Knowledge of big data ETL processing tools Experience with Hive and Hadoop file formats (Avro / Parquet / ORC) Basic knowledge of scripting (shell / bash) Experience of working with multiple data sources including relational databases (SQL Server / Oracle / DB2 / Netezza) Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo and Azure DevOps.
Basic understanding of DevOps practices using Git version control Ability to debug, fine-tune and optimize large-scale data processing jobs Working Experience 1-3 years of broad experience working with enterprise IT applications in cloud platform and big data environments. Professional Qualifications Certifications related to Data and Analytics would be an added advantage Education Master’s/Bachelor’s degree in STEM (Science, Technology, Engineering, Mathematics) Language Fluency in written and spoken English Experience 3-4.5 Years Skills Primary Skill: Data Engineering Sub Skill(s): Data Engineering Additional Skill(s): Kafka, Big Data, Apache Hive, SQL Server DBA, CI/CD, Apache Spark About The Company Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
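The batch-pipeline pattern this posting describes (extract from a source, transform, load downstream) can be sketched in plain Python. The CSV payload and column names below are invented; in the posting's stack the transform step would run through Spark's DataFrame API or Spark SQL rather than an in-memory dict:

```python
import csv
import io

# Extract: parse a (hypothetical) batch input file.
raw = "user_id,country,amount\n1,IN,10\n2,US,20\n1,IN,5\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and aggregate per country
# (the equivalent of a groupBy("country").sum("amount") in Spark).
totals = {}
for rec in records:
    totals[rec["country"]] = totals.get(rec["country"], 0) + int(rec["amount"])

# Load: materialise the result; a real job would write Parquet/ORC
# (the Hadoop file formats mentioned above) to a data lake.
result = sorted(totals.items())
print(result)
```

The same three-stage shape scales up: only the extract and load endpoints and the execution engine change.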
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Mumbai
Work from Office
Design and implement ETL solutions using IBM InfoSphere DataStage to integrate and process large datasets. You will develop, test, and optimize data pipelines to ensure smooth data transformation and loading. Expertise in IBM InfoSphere DataStage, ETL processes, and data integration is essential for this position.
Posted 4 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Chennai
Work from Office
The IBM InfoSphere DataStage role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage domain.
Posted 4 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
The IBM InfoSphere DataStage role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage domain.
Posted 4 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Chennai
Work from Office
The IBM InfoSphere DataStage, Teradata role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage, Teradata domain.
Posted 4 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
The IBM InfoSphere DataStage E3 role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage E3 domain.
Posted 4 weeks ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
What You’ll Do: We are seeking a dynamic Account Executive, Artificial Intelligence (AI) to join our strong and strategic sales team. As an AE (AI), you will drive the adoption of our AI solutions in our “Rest of Cloud” (RoC) market, which includes Cloud Service Providers (CSPs) and emerging AI providers such as AI-native cloud builders, AI SaaS providers, and AI system integrators. You will understand their specific needs and drive AI infrastructure and networking solutions that align with their business operations. This role requires a deep understanding of AI infrastructure and large-scale networking, with a strong ability to translate technical concepts for a diverse audience.

Who You’ll Work With: The Cloud + AI Infrastructure team delivers one scalable strategy with local execution for customer transformation and growth. We are the worldwide go-to-market compute and data center networking engine, assembling market transitions and engaging with sellers to fuel growth for customers and Cisco. Alongside our colleagues, Cloud & AI Infrastructure builds the sales strategy, activates sellers and technical communities, and accelerates selling every single day.

Who You Are: You will develop and execute a strategy to deliver incremental revenue for AI and large-scale networking products and services — including network routing and switching, optics and data center interconnects, and automation and performance optimization — across emerging AI provider accounts, and develop relationships with key decision-makers and partners. Engaging with your clients to understand their business challenges and conducting detailed analysis to find new opportunities for AI and networking solutions are the dynamic skills you will bring to this role. You understand AI technology and the market and can translate technical concepts into business value for clients.
Minimum Qualifications: 8+ years of technology-related business development experience. Experience unlocking revenue for new, innovative technology-based solutions. Experience working with Cloud Service Providers, NeoCloud customers, and/or AI system integrators. Experience understanding the business issues of cloud builders and providers, networking infrastructure, accelerated computing, data center technology, and deep learning & machine learning. Proven ability to work cross-functionally with Engineering and Marketing to develop and launch new AI or networking infrastructure offers.

Preferred Qualifications: Bachelor’s degree or equivalent experience in Business, Computer Science, Engineering, or a related field; an advanced degree is a plus. Excellent verbal and written communication skills. Experience bridging large-scale networking concepts with AI infrastructure (data center/compute). Experience with deep learning, data science, and NVIDIA GPUs. Experience in two or more data estate workloads, such as Microsoft’s Data & AI Platform (Azure Synapse Analytics, Azure Databricks, Cosmos DB, Azure SQL, HDInsight, etc.), AWS (Redshift, Aurora, Glue), Google (BigQuery), MongoDB, Cassandra, Snowflake, Teradata, Oracle Exadata, IBM Netezza, SAP (HANA, BW), Apache Hadoop & Spark, MapR, or Cloudera/Hortonworks.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Why Cisco: #WeAreCisco. We are all unique, but collectively we bring our talents to work as a team, to develop innovative technology and power a more inclusive, digital future for everyone. How do we do it? Well, for starters – with people like you! Nearly every internet connection around the world touches Cisco. We’re the Internet’s optimists.
Our technology makes sure the data traveling at light speed across connections does so securely, yet it’s not what we make but what we make happen which marks us out. We’re helping those who work in the health service to connect with patients and each other; schools, colleges, and universities to teach in even the most challenging of times. We’re helping businesses of all shapes and sizes to connect with their employees and customers in new ways, providing people with access to the digital skills they need and connecting the most remote parts of the world – whether through 5G, or otherwise. We tackle whatever challenges come our way. We have each other’s backs, we recognize our accomplishments, and we grow together. We celebrate and support one another – from big and small things in life to big career moments. And giving back is in our DNA (we get 10 days off each year to do just that). We know that powering an inclusive future starts with us. Because without diversity and a dedication to equality, there is no moving forward. Our 30 Inclusive Communities, which bring people together around commonalities or passions, are leading the way. Together we’re committed to learning, listening, and caring for our communities, whilst supporting the most vulnerable with a collective effort to make this world a better place, either with technology or through our actions. So, you have colorful hair? Don’t care. Tattoos? Show off your ink. Like polka dots? That’s cool. Pop culture geek? Many of us are. Passion for technology and world changing? Be you, with us! #WeAreCisco
Posted 4 weeks ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Account Executive, Artificial Intelligence (AI): same role description, qualifications, and Why Cisco section as the Bengaluru listing above.
Posted 4 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary: Software Development Engineer II (Data Engineering). Overview: Mastercard is the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities. Enterprise Data Solution (EDS) is focused on enabling insights into the Mastercard network and helping build data-driven products by curating and preparing data in a secure and reliable manner. Moving to a “Unified and Fault-Tolerant Architecture for Data Ingestion and Processing” is critical to achieving this mission. As a Software Development Engineer (Data Engineering) in Enterprise Data Solution (EDS), you will have the opportunity to build high-performance data pipelines that load into the Mastercard Data Warehouse. Our Data Warehouse provides analytical capabilities to a number of business users who help customers answer their business problems through data. You will play a vital role within a rapidly growing organization, while working closely with experienced and driven engineers to solve challenging problems.
Role: Participate in medium-to-large data engineering projects. Discover, ingest, and incorporate new sources of real-time, streaming, batch, and API-based data into our platform to enhance the insights we get from running tests and expand the ways and properties on which we can test. Assist the business in utilizing data-driven insights to drive growth and transformation. Build and maintain data processing workflows feeding Mastercard analytics domains. Facilitate reliable integrations with internal systems and third-party APIs as needed. Support data analysts as needed, advising on data definitions and helping them derive meaning from complex datasets. Work with cross-functional agile teams to drive projects through the full development cycle. Help the team improve its usage of data engineering best practices. Collaborate with other data engineering teams to improve the data engineering ecosystem and talent within Mastercard.

All About You: At least a Bachelor's degree in Computer Science, Computer Engineering, or a technology-related field, or equivalent work experience. Experience in Data Warehouse projects in a product- or service-based organization. Expertise in data engineering and in implementing multiple end-to-end DW projects in a big data environment. Experience working with databases like Oracle and Netezza, with strong SQL knowledge. Additional experience building data pipelines with Spark (Scala/Python/Java) on Hadoop is preferred. Experience with NiFi is an added advantage. Experience working in agile teams. Strong analytical skills for debugging production issues, providing root causes, and implementing mitigation plans. Strong communication skills, both verbal and written, along with strong relationship, collaboration, and organizational skills. High-energy, detail-oriented, and proactive, with the ability to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive
results. Ability to quickly learn and implement new technologies, and to perform POCs to explore the best solution for a problem statement. Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams.

Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security training in accordance with Mastercard’s guidelines. R-246732
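The role above centers on building pipelines that load curated data into a warehouse (Oracle/Netezza) and serve analytical SQL to business users. As a hedged sketch of that load-then-query pattern, the snippet below uses Python's stdlib sqlite3 as a stand-in for a real warehouse; the `txn_fact` table and merchant values are illustrative assumptions.

```python
import sqlite3

# sqlite3 stands in for a warehouse such as Oracle or Netezza; table
# and column names here are illustrative, not Mastercard's schema.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE txn_fact (
    txn_id   INTEGER PRIMARY KEY,
    merchant TEXT,
    amount   REAL)""")

# Load step: a pipeline would stage cleaned records and bulk-insert them.
conn.executemany(
    "INSERT INTO txn_fact (txn_id, merchant, amount) VALUES (?, ?, ?)",
    [(1, "acme", 120.0), (2, "acme", 80.0), (3, "globex", 40.0)],
)

# Analytical query of the kind the warehouse serves to business users.
totals = dict(conn.execute(
    "SELECT merchant, SUM(amount) FROM txn_fact GROUP BY merchant"
).fetchall())
```

In production the load would be a bulk utility or a Spark/NiFi write rather than row-wise inserts, but the contract — staged load, then aggregate SQL — is the same.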
Posted 4 weeks ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
Role Description Job Title: Data Analyst Experience: 8+ Years Location: Thiruvananthapuram Job Summary We are seeking an experienced Data Analyst with a strong background in banking, regulatory reporting, and financial services. The ideal candidate should have expertise in data analysis, governance, SQL, system integration, and business intelligence. This role involves working closely with stakeholders to analyze complex business requirements, ensure data quality, and provide solutions that align with organizational goals. Core Responsibilities Business & System Analysis: Analyze and translate complex business requirements into functional system design documents. Perform technology and system analysis, leveraging knowledge of applications, interfaces, and data structures. Facilitate and participate in design whiteboarding sessions to ensure business needs are met within SVB standards. Provide leadership and support for system production issues, change requests, and maintenance while maintaining documentation. Data Analytics & Governance Perform data mapping, data quality checks, and data profiling to ensure accuracy and consistency. Apply advanced data analysis techniques to understand detailed data flows between and within systems. Ensure data governance best practices and compliance with regulatory standards. Banking & Regulatory Reporting Expertise Work with Sanctions, Fraud, KYC, AML, and Payments domain data. Develop insights and reports for regulatory and compliance reporting within financial services. Ensure data accuracy in risk and compliance frameworks. SQL & Data Management Write complex SQL queries for data extraction, transformation, and analysis. Work with relational databases and data warehouses to manage large datasets. Support ETL processes and API/Microservices-based system integrations. Systems & Implementation Support Configure systems and develop expertise in system functionality and design. 
Provide expert guidance in data-related projects, including systems implementation and integration. Ensure alignment of future technology and system trends with business needs. Collaboration & Communication Prepare and present subject matter expertise through documentation and presentations. Collaborate with cross-functional teams, including IT, finance, and regulatory teams, to define and refine data requirements. Work with remote teams to resolve production incidents quickly and efficiently. Mandatory Skills & Qualifications 8+ years of experience in Data Analytics, Business Intelligence, or related fields. 5+ years of experience in banking and regulatory reporting domains. Bachelor’s degree in Computer Science, Information Science, or a related discipline (or equivalent work experience). Strong data background with expertise in data mapping, data quality, data governance, and data warehousing. Expertise in SQL for data querying, transformation, and reporting. Experience in Sanctions, Fraud, KYC, AML, and Payments domains. Strong experience with systems integration using API/microservices/web services and ETL. Hands-on experience with SAP BODS (BusinessObjects Data Services). Experience working on systems implementation projects in the banking or financial sector. Excellent written and verbal communication skills for stakeholder interactions. Good To Have Skills & Qualifications Knowledge of Big Data technologies (Snowflake, etc.). Familiarity with BI tools (Tableau, Power BI, Looker, etc.). Exposure to AI/ML concepts and tools for data-driven insights. Skills: SAP BODS, Netezza, Big Data
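The data-quality and profiling duties above (null checks, duplicate detection, consistency) typically boil down to a handful of SQL patterns. A minimal sketch follows, again with stdlib sqlite3 standing in for the bank's relational warehouse; the `kyc_customer` table and its columns are assumptions for illustration only.

```python
import sqlite3

# sqlite3 stands in for the warehouse; the KYC table is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kyc_customer (cust_id INTEGER, country TEXT)")
conn.executemany(
    "INSERT INTO kyc_customer VALUES (?, ?)",
    [(1, "IN"), (2, None), (2, None), (3, "US")],
)

# Profiling query: row count, null count on a mandatory field, and
# distinct-key count, computed in a single pass over the table.
total, nulls, distinct_ids = conn.execute("""
    SELECT COUNT(*),
           SUM(CASE WHEN country IS NULL THEN 1 ELSE 0 END),
           COUNT(DISTINCT cust_id)
    FROM kyc_customer""").fetchone()

# Duplicate keys surface as the gap between total rows and distinct ids.
duplicates = total - distinct_ids
```

Checks like these are usually parameterized per table and fed into a data-quality dashboard or reconciliation report for the compliance teams.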
Posted 4 weeks ago