6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
GCL - C3

Introduction to role
Are you ready to lead the charge in AstraZeneca's digital transformation? We are seeking a dynamic SPFx & Power Platform Support Specialist with a strong background in Microsoft technologies. This role is perfect for someone who thrives on supporting and maintaining impactful SPFx web parts and solutions while leveraging the Power Platform to streamline business processes and enhance user experiences across Microsoft 365 environments. Join us in shaping the future of digital healthcare!

Accountabilities
- Manage and support SPFx deployments, web parts, extensions, and applications.
- Deploy and support end-to-end solutions using Power Platform components, including Power Apps, Power Automate, Power BI, AI Builder, Power Virtual Agents, and Copilot Studio.
- Integrate SharePoint Online and On-Premises solutions with Microsoft 365 and external systems.
- Work across multiple technologies on the M365 suite platform.
- Maintain and support existing SharePoint and Power Platform solutions, resolving issues and improving efficiency.
- Monitor and handle alerts.
- Handle service requests.
- Diagnose and fix P3 & P4 end-user incidents.
- Apply SharePoint Security & Cumulative Updates to servers.
- Perform initial triage, and remediation where possible, for all P1 & P2 incidents, escalating to senior levels otherwise.
- Monitor and manage the incident queue.
- Create and implement change requests.
- Fulfil standard change requests (requests for data, requests for information, server builds, and other service catalogue items).
- Create and update Known Error documentation.

Essential Skills/Experience
- 6-8 years of total IT experience in Microsoft technologies
- 4+ years of hands-on development experience with SPFx, Power Platform, Copilot Studio, Microsoft Teams, and Unified Communication Services.
- Expertise in developing Power Apps (Canvas and Model-driven), Power Automate flows, Power BI dashboards, Power Automate Desktop, and Copilot
- Experience with REST APIs, PowerShell, and SharePoint Framework integrations
- Knowledge of Power Platform, SharePoint Online, and On-Premises architectures
- Experience in scripting and automation using PowerShell
- Familiarity with Azure services (e.g., Azure Functions, Logic Apps, Azure AD) - good to have
- Excellent problem-solving skills and a proactive attitude
- Good communication and collaboration abilities to work effectively with multi-functional teams

Desirable Skills/Experience
NA

When we put unexpected teams in the same room, we spark bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, our work has a direct impact on patients by redefining our ability to develop life-changing medicines. We empower the business to perform at its peak by combining pioneering science with leading digital technology platforms. With a passion for data, analytics, AI, machine learning, and more, we drive cross-company change to disrupt the entire industry. Here you can innovate, take ownership, and explore new solutions in a dynamic environment that encourages lifelong learning. Ready to make an impact? Apply now to join our team!

Date Posted: 27-Jun-2025
Closing Date: 04-Jul-2025

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills.
We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
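The SharePoint Framework and REST API experience this posting asks for usually involves SharePoint's `_api/web` endpoints. As a hedged sketch (the site URL, list name, and field names below are hypothetical, not taken from the posting), a small helper that assembles an OData query URL for list items might look like:

```python
from urllib.parse import quote

def sharepoint_items_url(site_url: str, list_title: str,
                         select=None, top: int = 100) -> str:
    """Build a SharePoint REST URL for list items with optional OData parameters.

    A real integration would send this URL with an authenticated HTTP client;
    here we only construct it.
    """
    base = f"{site_url.rstrip('/')}/_api/web/lists/getbytitle('{quote(list_title)}')/items"
    params = [f"$top={top}"]
    if select:
        params.append("$select=" + ",".join(select))
    return base + "?" + "&".join(params)

# Hypothetical site and list, purely for illustration
url = sharepoint_items_url("https://contoso.sharepoint.com/sites/support",
                           "Incidents", select=["Title", "Priority"], top=50)
print(url)
```

The same URL shape is what a PowerShell or SPFx client would request; only the construction logic is shown here.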
Posted 2 days ago
5.0 years
0 Lacs
West Delhi, Delhi, India
On-site
We are seeking two highly skilled Cloud & Data Engineering Specialists to join our dynamic team. These roles will focus on designing, building, and optimizing scalable cloud-based solutions, data pipelines, and analytics platforms. The ideal candidates will have strong expertise in cloud platforms, data engineering, and modern technologies, with a focus on delivering robust, secure, and efficient data solutions.

Location: Off-Shore (India)
Work Hours: Overlap until 12pm CST

Key Responsibilities:
- Design and implement cloud solutions across Azure, AWS, and GCP platforms.
- Develop and optimize data pipelines using PySpark, Python, and SQL.
- Build and manage ETL workflows using Azure Data Factory (ADF).
- Work with big data technologies such as Apache Spark and Databricks to process large datasets.
- Design and deliver dashboards and reports using Tableau and Power BI.
- Implement DevOps practices, including version control with Git, CI/CD pipelines, and containerization using Docker.
- Collaborate with stakeholders to gather requirements and deliver scalable data solutions.

Key Skills:
- Proficiency in Azure, AWS, and GCP cloud platforms.
- Strong programming skills in Python, SQL, and PySpark.
- Experience with Snowflake and SQL Server databases.
- Expertise in ETL tools like Azure Data Factory (ADF).
- Hands-on experience with Apache Spark and Databricks for big data processing.
- Proficiency in reporting tools such as Tableau and Power BI.
- Knowledge of DevOps practices, including Git, CI/CD pipelines, and Docker.

General Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 5+ years of experience in cloud and data engineering roles.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Proven ability to work in a fast-paced, agile environment.
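The pipeline responsibilities above follow the classic extract-transform-load shape. A minimal pure-Python sketch of that pattern (the field names are invented for illustration; a real ADF or PySpark pipeline would run the same logic over distributed data):

```python
def extract(rows):
    """Extract: yield raw records from a source (an in-memory list stands in)."""
    yield from rows

def transform(records):
    """Transform: drop records missing a key and normalise the amount field."""
    for r in records:
        if r.get("id") is None:
            continue  # data-quality filter
        yield {"id": r["id"], "amount": round(float(r.get("amount", 0)), 2)}

def load(records):
    """Load: collect into a target store (a list stands in for a warehouse table)."""
    return list(records)

raw = [{"id": 1, "amount": "19.991"}, {"id": None}, {"id": 2, "amount": 5}]
result = load(transform(extract(raw)))
print(result)  # two clean rows survive
```

Chaining generators this way keeps each stage independent, which is the same separation of concerns an ADF pipeline expresses with activities.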
Posted 2 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience Required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for utilizing your expertise in the Databricks Unified Data Analytics Platform to develop efficient and effective applications.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate and contribute in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Design, develop, and test applications using the Databricks Unified Data Analytics Platform.
- Troubleshoot and debug applications to ensure optimal performance and functionality.
- Implement security and data protection measures.
- Document technical specifications and user manuals for reference and reporting purposes.

Professional & Technical Skills:
- Must-Have Skills: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data engineering concepts and techniques.
- Experience with data integration and ETL processes.
- Knowledge of programming languages such as Python or Scala.
- Familiarity with cloud platforms like AWS or Azure.
- Good-to-Have Skills: experience with big data technologies such as Hadoop or Spark.
- Understanding of data governance and data quality principles.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A minimum of 15 years of full-time education is required.
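The SQL knowledge listed above centres on exactly this kind of aggregation work. A minimal, self-contained sketch (using Python's built-in SQLite purely as a stand-in for a Databricks SQL warehouse; the table and columns are invented for illustration):

```python
import sqlite3

# In-memory database standing in for a warehouse table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("south", 100.0), ("south", 50.0), ("north", 75.0)])

# Aggregate revenue per region, highest first
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('south', 150.0), ('north', 75.0)]
```

The same GROUP BY/ORDER BY statement would run unchanged against Databricks SQL; only the connection layer differs.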
Posted 2 days ago
7.0 - 10.0 years
20 - 30 Lacs
Pune, Chennai
Hybrid
YOU'LL BUILD TECH THAT EMPOWERS GLOBAL BUSINESSES

Our Connect Technology teams are working on our new Connect platform, a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on Connect data and insights to innovate and grow. As a Senior Data Engineer, you'll be part of a team of smart, highly skilled technologists who are passionate about learning and supporting cutting-edge technologies such as Spark, Scala, PySpark, Databricks, Airflow, SQL, Docker, Kubernetes, and other data engineering tools. These technologies are deployed using DevOps pipelines leveraging Azure, Kubernetes, Jenkins, and Bitbucket/GitHub.

WHAT YOU'LL DO:
- Develop, test, troubleshoot, debug, and make application enhancements using Spark, PySpark, Scala, Pandas, Databricks, Airflow, and SQL as the core development technologies.
- Deploy application components using CI/CD pipelines.
- Build utilities for monitoring and automating repetitive functions.
- Collaborate with Agile cross-functional teams and internal and external clients, including Operations, Infrastructure, and Tech Ops.
- Collaborate with the Data Science team to productionize ML models.
- Participate in a rotational support schedule to respond to customer queries and deploy bug fixes in a timely and accurate manner.

Qualifications

WE'RE LOOKING FOR PEOPLE WHO HAVE:
- 8-10 years of applicable software engineering experience
- Strong fundamentals with experience in big data technologies: Spark, PySpark, Scala, Pandas, Databricks, Airflow, SQL
- Must have experience in cloud technologies, preferably Microsoft Azure.
- Must have experience in performance optimization of Spark workloads.
- Good to have experience with DevOps technologies such as GitHub, Kubernetes, Jenkins, and Docker.
- Good to have knowledge of relational databases, preferably PostgreSQL.
- Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business
- Minimum B.S. degree in Computer Science, Computer Engineering, or a related field

Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)
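Orchestrators like Airflow, listed in the stack above, schedule work by resolving a directed acyclic graph of task dependencies. A sketch of that core ordering idea in plain Python (Airflow's real operator API is much richer; the task names here are illustrative):

```python
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (a typical ETL chain)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "report": {"load"},
}

# Resolve a valid execution order: every task runs after its dependencies
order = list(TopologicalSorter(dag).static_order())
print(order)
```

An orchestrator adds scheduling, retries, and parallel execution of independent branches on top of exactly this topological ordering.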
Posted 2 days ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Cloud Data Platforms on AWS
- Experience with AWS EMR, AWS Glue, Databricks, AWS Redshift, and DynamoDB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers like Kafka

Preferred Technical And Professional Experience
- Certification in AWS and Databricks, or Cloudera Certified Spark Developer
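The streaming-pipeline experience called for above (Spark, Kafka) is built around incremental windowed computation. A minimal pure-Python analogue of a tumbling-window count (real Kafka and Spark Structured Streaming APIs differ substantially; this only illustrates the concept):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Count events per fixed-size (tumbling) time window.

    events: iterable of (timestamp_seconds, key) pairs, e.g. consumed
    from a Kafka topic.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Assign each event to the window that starts at a multiple of window_secs
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 5))
# 5-second windows: [0,5) has two clicks, [5,10) one view, [10,15) one click
```

Spark's `window()` function and Kafka Streams do the same bucketing, plus out-of-order handling via watermarks, which this sketch omits.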
Posted 2 days ago
10.0 years
0 Lacs
Mysore, Karnataka, India
On-site
Company Description
Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description
When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have as seamless an experience as possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to provide it by giving brands and retailers the visibility they need to make that belief a reality.

Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to do this across both online and in-store.

We are looking for a lead-level software engineer to lead the charge on a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.
What You Will Do
- Think like our customers: you will work with product and engineering leaders to define data solutions that support customers' business practices.
- Design, develop, and extend our data pipeline services and architecture to implement your solutions: you will collaborate on some of the most important and complex parts of our system that form the foundation for the business value our organization provides.
- Foster team growth: provide mentorship to junior team members and evangelize expertise to those on other teams.
- Improve the quality of our solutions: help build enduring trust within our organization and amongst our customers by ensuring high quality standards for the data we manage.
- Own your work: you will take responsibility for shepherding your projects from idea through delivery into production.
- Bring new ideas to the table: some of our best innovations originate within the team.

Technologies We Use
- Languages: SQL, Python
- Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka, Terraform
- Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
- Others: Tableau (as a business intelligence solution)

Qualifications
- Bachelor's/Master's degree in Computer Science or a relevant technical degree
- 10+ years of professional software engineering experience
- Strong proficiency with data languages such as Python and SQL
- Strong proficiency with data processing technologies such as Spark, Flink, and Airflow
- Strong proficiency with RDBMS/NoSQL/big data solutions (Postgres, MongoDB, Snowflake, etc.)
- Solid understanding of streaming solutions such as Kafka, Pulsar, Kinesis/Firehose, etc.
- Hands-on experience with Docker, Kubernetes, infrastructure as code using Terraform, and Kubernetes package management with Helm charts
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of columnar/row-oriented data structures (e.g. Parquet, ORC, Avro)
- Solid understanding of Apache Iceberg or other open table formats
- Proven ability to transform raw unstructured/semi-structured data into structured data in accordance with business requirements
- Solid understanding of AWS, Linux, and infrastructure concepts
- Proven ability to diagnose and address data abnormalities in systems
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Experience building data warehouses using conformed dimensional models
- Experience building data lakes and/or leveraging data lake solutions (e.g. Trino, Dremio, Druid)
- Experience working with business intelligence solutions (e.g. Tableau)
- Experience working with ML/agentic AI pipelines (e.g. LangChain, LlamaIndex)
- Understands Domain-Driven Design concepts and accompanying microservice architecture
- Passion for data, analytics, or machine learning
- Focus on value: shipping software that matters to the company and the customer

Bonus Points
- Experience working with vector databases
- Experience working within a retail or ecommerce environment
- Proficiency in other programming languages such as Scala, Java, Golang, etc.
- Experience working with Apache Arrow and/or other in-memory columnar data technologies

Supervisory Responsibility
- Provide mentorship to team members on adopted patterns and best practices.
- Organize and lead agile ceremonies such as daily stand-ups, planning, etc.

Additional Information

EEO STATEMENT
Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits discrimination, harassment, and retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc.
are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
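The columnar/row-oriented distinction in the qualifications above (Parquet and ORC versus row formats like Avro) comes down to data layout. A tiny illustration of why columnar layout suits analytics: an aggregate over one field scans a single contiguous column instead of every full record (the product fields here are invented for illustration):

```python
# Row-oriented: each record stored together, as in Avro or an OLTP row store
rows = [
    {"sku": "A1", "price": 10.0, "qty": 3},
    {"sku": "B2", "price": 4.5, "qty": 10},
]

# Columnar: each field stored contiguously, as in Parquet or ORC
columns = {
    "sku":   [r["sku"] for r in rows],
    "price": [r["price"] for r in rows],
    "qty":   [r["qty"] for r in rows],
}

# An analytic aggregate reads only the column it needs
total_qty = sum(columns["qty"])
print(total_qty)  # 13
```

On disk, the columnar layout also compresses far better, since values of one type and similar range sit next to each other.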
Posted 2 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Software Engineer

Job Summary
As a Senior Software Engineer focused on Data Quality, you will lead the design, development, and deployment of scalable data quality frameworks and pipelines. You will work closely with data engineers, analysts, and business stakeholders to build robust solutions that validate, monitor, and improve data quality across large-scale distributed systems.

Key Responsibilities
- Lead the design and implementation of data quality frameworks and automated validation pipelines using Python, Apache Spark, and Hadoop ecosystem tools.
- Develop, deploy, and maintain scalable ETL/ELT workflows using Apache Airflow and Apache NiFi to ensure seamless data ingestion, transformation, and quality checks.
- Collaborate with cross-functional teams to understand data quality requirements and translate them into technical solutions.
- Define and enforce data quality standards, rules, and monitoring processes.
- Perform root cause analysis on data quality issues and implement effective fixes and enhancements.
- Mentor and guide junior engineers, conducting code reviews and fostering best practices.
- Continuously evaluate and integrate new tools and technologies to enhance data quality capabilities.
- Ensure high code quality, performance, and reliability in all data processing pipelines.
- Create comprehensive documentation and reports on data quality metrics and system architecture.
Required Skills & Experience
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, with data engineering experience.
- 5+ years of professional experience in software development, with at least 2 years in a lead or senior engineering role.
- Strong proficiency in Python programming and experience building data processing applications.
- Hands-on expertise with Apache Spark and Hadoop for big data processing.
- Solid experience with workflow orchestration tools like Apache Airflow.
- Experience designing and managing data ingestion and integration pipelines with Apache NiFi.
- Understanding of data quality automation, CI/CD, Jenkins, Oracle, Power BI, and Splunk.
- Deep understanding of data quality concepts, data validation techniques, and distributed data systems.
- Strong problem-solving skills and ability to lead technical discussions.
- Experience with cloud platforms (AWS, GCP, or Azure) is a plus.
- Excellent communication and collaboration skills.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-251594
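At its core, a data-quality validation pipeline of the kind described above is a set of named rules applied per record. A hedged pure-Python sketch (the rule names and fields are invented; a production framework would execute such checks inside Spark or NiFi at scale):

```python
def not_null(field):
    """Rule factory: the field must be present and non-null."""
    return lambda rec: rec.get(field) is not None

def in_range(field, lo, hi):
    """Rule factory: the field must be a value within [lo, hi]."""
    return lambda rec: rec.get(field) is not None and lo <= rec[field] <= hi

# Named rules make violations reportable and auditable
RULES = {
    "txn_id_present": not_null("txn_id"),
    "amount_valid": in_range("amount", 0, 1_000_000),
}

def validate(record):
    """Return the names of all rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

bad = validate({"txn_id": None, "amount": -5})
print(bad)  # both rules fail
```

Keeping rules as named, composable predicates is what lets a framework report quality metrics per rule, the monitoring requirement the posting highlights.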
Posted 2 days ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Job Description
At Adobe, you will be immersed in an exceptional work environment that is recognized around the world.

The Opportunity
Structured content is at the heart of the engines powering new-age experiences such as chatbots, voice-based devices, and fluid, omni-channel content delivery. In the Technical Communication group of Adobe, we are developing a new-age Component Content Management System which powers the journey of structured content for large enterprises. Scale and efficiency are key here. Millions of documents are published regularly to multiple platforms and channels across the industry using this solution. We have a strong vision and are looking for a highly motivated, technically driven, hands-on leader to realize it for our product.

About The Team
AEM Guides is a new-age CCMS.
It is used by Fortune 500 companies that publish millions of documents regularly using this product. It is a unique opportunity to work in a startup-like environment within a large organization, where all product functions collaborate closely with their business counterparts and with large enterprise customers. Given the enterprise business domain and the startup nature of the team, we are growing fast and scaling the product scope and customer base at a very rapid pace.

The Challenge
As a Senior Computer Scientist, you will go beyond traditional coding responsibilities to lead and shape complex features within our systems. Your role will focus on:
- Architecting Scalable Solutions: design and implement features that integrate seamlessly with our broader system architecture, leveraging Adobe Experience Manager capabilities to manage customer workflows and drive actionable insights.
- Complex Systems Integration: ensure that new features interact effectively with existing components, maintaining high performance and reliability.
- Advanced Algorithmic Design: develop and optimize algorithms to solve complex problems, applying sophisticated design principles to enhance system functionality.
- Strategic Design Judgment: make informed, high-level design decisions that align with long-term product goals and architectural standards.
- Product and Technology Expertise: stay ahead of emerging technologies, using this knowledge to drive innovation and continuously improve our offerings.

We are looking for passionate and driven senior architects who can translate intricate product features into scalable, efficient solutions. Your expertise in architectural decision-making and team mentoring will be crucial to our success and innovation in the marketing technology space.

Roles & Responsibilities:
This is an individual contributor position.
Expectations are along the following lines:
- Responsible for the design and architecture of new services and features.
- Well versed in emerging industry technologies and trends, with the ability to channel that knowledge to the team and use it to influence product direction.
- Responsible for all phases of engineering: early specs, design/architecture, technology choice, development, unit-testing/integration automation, and deployment.
- Collaborate with architects, product management, and other engineering teams to build the services and product features.
- Build technical specifications, prototypes, and presentations to communicate your ideas.
- Participate in the resolution of production issues and develop solutions to prevent future issues from recurring.
- Orchestrate with the team to develop a product or parts of a large product.

Required Skills & Expertise:
- 10+ years of experience in technical roles, with proven experience across the product life cycle.
- Well versed in microservices architecture, cloud-based web services architecture, design patterns, and frameworks.
- Experience scaling and leading teams building solutions with cloud technologies.
- Excellent computer science fundamentals and a good understanding of the design and performance of algorithms.
- Mastery of Java SE (Java 8+), including functional programming, streams, lambdas, and concurrency APIs.
- Proficiency in designing and developing RESTful APIs and GraphQL.
- Excellent database fundamentals and hands-on experience with MySQL, Postgres, or MongoDB.
- Understanding of API versioning, security (OAuth 2.0, JWT), and documentation using tools like Swagger/OpenAPI.
- Knowledge of AWS and/or Azure, Kubernetes, Docker, Jenkins, Splunk.
- Knowledge of application security best practices (e.g., OWASP Top Ten).
- Experience implementing encryption, secure communication protocols (TLS/SSL), and vulnerability management.
- Strong ability to partner across organizations and divisions, with the opportunity to define and contribute to solving some of the most difficult problems.
- Basic understanding of UI/UX design and development is a plus.
- Ability to build consensus and drive decisions in ambiguous scenarios.
- Excellent work ethic and high motivation.
- Excellent oral and written communication skills (interpersonal and client-facing).
- Ability to manage systems development scope and changes in the context of the business environment.
- Minimum of a Bachelor's degree or equivalent in Computer Science, Information Technology, Engineering, or a related field.

Architectural Skills
- Scalable Design Patterns: proficiency in applying design patterns (e.g., Singleton, Factory, Strategy, and Observer) and architectural patterns like CQRS and Domain-Driven Design (DDD).
- Integration and Interoperability: experience integrating third-party services, SDKs, and APIs. Knowledge of data streaming and batch processing frameworks (e.g., Apache Flink, Apache Spark).
- Monitoring and Observability: familiarity with monitoring tools like Prometheus, Grafana, and New Relic. Experience with distributed tracing tools like Jaeger or Zipkin.
- Code Reviews & Standards: skilled in conducting thorough code reviews and enforcing best practices.
- Data Management: proficiency in handling large-scale data processing and ensuring data consistency across distributed systems. Knowledge of caching mechanisms (e.g., Redis, Memcached) for optimized performance.

Adobe is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of race, gender, religion, age, sexual orientation, gender identity, disability or veteran status.
We ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Other Info:
- Adobe Fast Facts: https://blogs.adobe.com/adobelife/category/apac/
- Life@Adobe Blog: https://blogs.adobe.com/adobelife/category/apac/
- Adobe Corporate Social Responsibility: http://www.adobe.com/corporateresponsibility/
- Adobe Culture and Benefits: https://benefits.adobe.com/in
- Adobe Investor Relations: http://www.adobe.com/aboutadobe/invrelations
- Discover Check-In: https://www.adobe.com/check-in.html

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here.

Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
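Of the design patterns the posting names, Observer is representative of the event-driven style used across such systems. A minimal Python sketch (the class and event names are illustrative only; a Java codebase would express the same shape with listener interfaces):

```python
class EventBus:
    """Minimal Observer/pub-sub: registered subscribers are notified on publish."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a callable to receive every published event."""
        self._subscribers.append(callback)

    def publish(self, event):
        """Notify all subscribers, in registration order."""
        for cb in self._subscribers:
            cb(event)

received = []
bus = EventBus()
bus.subscribe(received.append)                    # first observer records the event
bus.subscribe(lambda e: received.append(e.upper()))  # second reacts differently
bus.publish("doc-published")
print(received)  # ['doc-published', 'DOC-PUBLISHED']
```

The publisher knows nothing about its observers, which is what keeps components decoupled as a system grows.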
Posted 2 days ago
10.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Company Description
Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description
When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have as seamless an experience as possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to provide it by giving brands and retailers the visibility they need to make that belief a reality.

Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to do this across both online and in-store.

We are looking for an experienced software engineer to lead the charge on a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.
What You Will Do
Think like our customers – you will work with product and engineering leaders to define data solutions that support customers' business practices.
Design/develop/extend our data pipeline services and architecture to implement your solutions – you will be collaborating on some of the most important and complex parts of our system that form the foundation for the business value our organization provides.
Foster team growth – provide mentorship to junior team members and evangelize expertise to those on other teams.
Improve the quality of our solutions – help to build enduring trust within our organization and amongst our customers by ensuring high quality standards for the data we manage.
Own your work – you will take responsibility to shepherd your projects from idea through delivery into production.
Bring new ideas to the table – some of our best innovations originate within the team.
Technologies We Use
Languages: SQL, Python
Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka, Terraform
Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
Others: Tableau (as a business intelligence solution)
Qualifications
Bachelor's/Master's degree in Computer Science or a relevant technical degree
10+ years of professional software engineering experience
Strong proficiency with data languages such as Python and SQL
Strong proficiency working with data processing technologies such as Spark, Flink, and Airflow
Strong proficiency with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Snowflake, etc.)
Solid understanding of streaming solutions such as Kafka, Pulsar, Kinesis/Firehose, etc.
Hands-on experience with Docker, Kubernetes, infrastructure as code using Terraform, and Kubernetes package management with Helm charts
Solid understanding of ETL/ELT and OLTP/OLAP concepts
Solid understanding of columnar/row-oriented data structures (e.g. Parquet, ORC, Avro, etc.)
Solid understanding of Apache Iceberg or other open table formats
Proven ability to transform raw unstructured/semi-structured data into structured data in accordance with business requirements
Solid understanding of AWS, Linux, and infrastructure concepts
Proven ability to diagnose and address data abnormalities in systems
Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
Experience building data warehouses using conformed dimensional models
Experience building data lakes and/or leveraging data lake solutions (e.g. Trino, Dremio, Druid, etc.)
Experience working with business intelligence solutions (e.g. Tableau, etc.)
Experience working with ML/agentic AI pipelines (e.g. LangChain, LlamaIndex, etc.)
Understands Domain-Driven Design concepts and the accompanying Microservice Architecture
Passion for data, analytics, or machine learning
Focus on value: shipping software that matters to the company and the customer
Bonus Points
Experience working with vector databases
Experience working within a retail or ecommerce environment
Proficiency in other programming languages such as Scala, Java, Golang, etc.
Experience working with Apache Arrow and/or other in-memory columnar data technologies
Supervisory Responsibility
Provide mentorship to team members on adopted patterns and best practices.
Organize and lead agile ceremonies such as daily stand-ups, planning, etc.
Additional Information
EEO STATEMENT
Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits Discrimination, Harassment, and Retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc.
are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by the state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
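The core of the data-collection work described in the posting above is turning semi-structured records into structured rows with a stable schema. As an illustrative sketch only (the field names and normalization rules here are hypothetical, not Wiser's actual schema), in plain Python:

```python
import json

def normalize_product(raw: str) -> dict:
    """Flatten one semi-structured product record into a structured row.

    Missing or malformed fields are normalized rather than dropped, so
    downstream columnar storage (e.g. Parquet) sees a stable schema.
    """
    record = json.loads(raw)
    price = record.get("price")
    try:
        price = float(price)
    except (TypeError, ValueError):
        price = None  # flagged for a data-quality check downstream
    return {
        "sku": str(record.get("sku", "")).strip().upper(),
        "title": (record.get("title") or "").strip(),
        "price": price,
        "in_stock": record.get("availability") == "in_stock",
        "source": record.get("source", {}).get("site", "unknown"),
    }

rows = [normalize_product(r) for r in [
    '{"sku": "ab-123", "title": " Widget ", "price": "19.99",'
    ' "availability": "in_stock", "source": {"site": "acme.com"}}',
    '{"sku": "cd-456", "price": null}',
]]
```

In production this mapping would run inside a Spark job or an Airflow task over millions of records per day; the per-record normalization logic itself stays this simple.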
Posted 2 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Who We Are
Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips – the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world – like AI and IoT. If you want to work beyond the cutting-edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible® a Better Future.
What We Offer
Location: Bangalore, IND; Chennai, IND
At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We’re committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You’ll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible—while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied.
Software Architect
About Applied
Applied Materials is the leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. Our expertise in modifying materials at atomic levels and on an industrial scale enables customers to transform possibilities into reality.
At Applied Materials, our innovations make possible the technology shaping the future.
Our Team
Our team is developing a high-performance computing solution for low-latency and high-throughput image processing and deep-learning workloads that enables our chip manufacturing process control equipment to offer differentiated value to our customers.
Your Opportunity
As an architect, you will get the opportunity to grow in the field of high-performance computing, complex system design, and low-level optimizations for a better cost of ownership.
Roles and Responsibility
As a Software Architect, you will be responsible for designing and implementing high-performance computing software solutions for our organization. You will work closely with cross-functional teams, including software engineers, product managers, and business stakeholders, to understand requirements and translate them into architectural/software designs that meet business needs. You will be coding and developing quick prototypes to establish your design with real code and data. You will be a subject-matter expert who unblocks software engineers in the HPC domain. You will be expected to profile systems to understand bottlenecks and optimize workflows, code, and processes to improve cost of ownership.
Conduct technical reviews and provide guidance to software engineers during the development process.
Identify and mitigate technical risks and issues throughout the software development lifecycle.
Evaluate and recommend appropriate technologies and frameworks to meet project requirements.
Lead the design and implementation of complex software components and systems.
Ensure that software systems are scalable, reliable, and maintainable.
Mentor and coach junior software architects and engineers.
Your primary focus will be on ensuring that the software systems are scalable, reliable, maintainable, and cost-effective.
Our Ideal Candidate
Someone who has the drive and passion to learn quickly, with the ability to multi-task and switch contexts based on business needs.
Qualifications
7 to 15 years of experience in design and coding in C/C++, preferably in a Linux environment.
Very good knowledge of data structures, algorithms, and complexity analysis.
Experience in developing distributed high-performance computing software using parallel programming frameworks like MPI, UCX, etc.
In-depth experience in multi-threading, thread synchronization, inter-process communication, and distributed computing fundamentals.
Very good knowledge of computer science fundamentals like operating system internals (Linux preferred), networking, and storage systems.
Experience in performance profiling at the application and system level (e.g. VTune, OProfile, perf, Nvidia Nsight, etc.)
Experience in low-level code optimization techniques using vectorization and intrinsics, cache-aware programming, lock-free data structures, etc.
Experience in GPU programming using CUDA, OpenMP, OpenACC, OpenCL, etc.
Familiarity with microservices architecture, containerization technologies (Docker/Singularity), and low-latency message queues.
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Ability to mentor and coach junior team members.
Experience in Agile development methodologies.
Additional Qualifications
Experience in HPC job-scheduling and cluster management software (SLURM, Torque, LSF, etc.)
Good knowledge of low-latency and high-throughput data transfer technologies (RDMA, RoCE, InfiniBand)
Good knowledge of workflow orchestration software like Apache Airflow, Apache Spark, Apache Storm, or Intel TBB flow graph, etc.
Education
Bachelor's Degree or higher in Computer Science or related disciplines.
Years Of Experience 7 - 15 Years Additional Information Time Type: Full time Employee Type Assignee / Regular Travel Yes, 10% of the Time Relocation Eligible: Yes Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
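The profiling workflow named in the posting above (VTune, perf, Nsight) is native-code tooling, but the find-the-hotspot loop is the same everywhere: run a workload under a profiler, sort by cumulative time, and read off the offending function. As an illustrative analogue only (not the tools this role uses), the same loop with Python's built-in cProfile:

```python
import cProfile
import io
import pstats

def slow_distance_matrix(points):
    """Deliberately naive O(n^2) squared pairwise distances -- the 'hotspot'."""
    return [[(ax - bx) ** 2 + (ay - by) ** 2 for bx, by in points]
            for ax, ay in points]

def run():
    points = [(i % 50, i // 50) for i in range(200)]
    return slow_distance_matrix(points)

profiler = cProfile.Profile()
profiler.enable()
result = run()
profiler.disable()

# Sort by cumulative time and print the top entries; the report names
# slow_distance_matrix as the dominant cost, which is the cue to optimize it.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
report = buf.getvalue()
```

In the native world the same step would be `perf record` / `perf report` or a VTune hotspot analysis, followed by the vectorization and cache-aware rewrites the posting lists.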
Posted 2 days ago
2.0 - 6.0 years
5 - 8 Lacs
Pune
Work from Office
Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with Business and IT teams to understand requirements and best leverage technologies to enable agile data delivery at scale.
Note: Although the role category in the GPP is listed as Remote, the requirement is for a Hybrid work model.
Key Responsibilities:
Oversee the development and deployment of end-to-end data ingestion pipelines using Azure Databricks, Apache Spark, and related technologies.
Design high-performance, resilient, and scalable data architectures for data ingestion and processing.
Provide technical guidance and mentorship to a team of data engineers.
Collaborate with data scientists, business analysts, and stakeholders to integrate various data sources into the data lake/warehouse.
Optimize data pipelines for speed, reliability, and cost efficiency in an Azure environment.
Enforce and advocate for best practices in coding standards, version control, testing, and documentation.
Work with Azure services such as Azure Data Lake Storage, Azure SQL Data Warehouse, Azure Synapse Analytics, and Azure Blob Storage.
Implement data validation and data quality checks to ensure consistency, accuracy, and integrity.
Identify and resolve complex technical issues proactively.
Develop reliable, efficient, and scalable data pipelines with monitoring and alert mechanisms.
Use agile development methodologies, including DevOps, Scrum, and Kanban.
External Qualifications and Competencies
Technical Skills:
Expertise in Spark, including optimization, debugging, and troubleshooting.
Proficiency in Azure Databricks for distributed data processing.
Strong coding skills in Python and Scala for data processing.
Experience with SQL for handling large datasets.
Knowledge of data formats such as Iceberg, Parquet, ORC, and Delta Lake.
Understanding of cloud infrastructure and architecture principles, especially within Azure.
Leadership & Soft Skills:
Proven ability to lead and mentor a team of data engineers.
Excellent communication and interpersonal skills.
Strong organizational skills with the ability to manage multiple tasks and priorities.
Ability to work in a fast-paced, constantly evolving environment.
Strong problem-solving, analytical, and troubleshooting abilities.
Ability to collaborate effectively with cross-functional teams.
Competencies:
System Requirements Engineering: Uses appropriate methods to translate stakeholder needs into verifiable requirements.
Collaborates: Builds partnerships and works collaboratively to meet shared objectives.
Communicates Effectively: Delivers clear, multi-mode communications tailored to different audiences.
Customer Focus: Builds strong customer relationships and delivers customer-centric solutions.
Decision Quality: Makes good and timely decisions to keep the organization moving forward.
Data Extraction: Performs ETL activities and transforms data for consumption by downstream applications.
Programming: Writes and tests computer code, version control, and build automation.
Quality Assurance Metrics: Uses measurement science to assess solution effectiveness.
Solution Documentation: Documents information for improved productivity and knowledge transfer.
Solution Validation Testing: Ensures solutions meet design and customer requirements.
Data Quality: Identifies, understands, and corrects data flaws.
Problem Solving: Uses systematic analysis to address and resolve issues.
Values Differences: Recognizes the value that diverse perspectives bring to an organization.
Preferred Knowledge & Experience:
Exposure to Big Data open-source technologies (Spark, Scala/Java, MapReduce, Hive, HBase, Kafka, etc.).
Experience with SQL and working with large datasets.
Clustered compute cloud-based implementation experience.
Familiarity with developing applications requiring large file movement in a cloud-based environment.
Exposure to Agile software development and analytical solutions.
Exposure to IoT technology.
Additional Responsibilities Unique to this Position
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Experience:
3 to 5 years of experience in data engineering or a related field.
Strong hands-on experience with Azure Databricks, Apache Spark, Python/Scala, CI/CD, Snowflake, and Qlik for data processing.
Experience working with multiple file formats like Parquet, Delta, and Iceberg.
Knowledge of Kafka or similar streaming technologies.
Experience with data governance and data security in Azure.
Proven track record of building large-scale data ingestion and ETL pipelines in cloud environments.
Deep understanding of Azure Data Services.
Experience with CI/CD pipelines, version control (Git), Jenkins, and agile methodologies.
Familiarity with data lakes, data warehouses, and modern data architectures.
Experience with Qlik Replicate (optional).
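The data-validation responsibility in the posting above usually reduces to a small set of reusable checks: required columns present, keys unique, numeric values in range. A minimal sketch, assuming in-memory rows for brevity (in Databricks the same predicates would run as Spark DataFrame expressions over the full dataset):

```python
def check_not_null(rows, column):
    """Rows where a required column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Values that appear more than once in a supposedly unique column."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

def check_range(rows, column, lo, hi):
    """Rows whose numeric value falls outside [lo, hi]."""
    return [r for r in rows if r.get(column) is not None
            and not (lo <= r[column] <= hi)]

orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 1, "amount": -5.0},   # duplicate id, negative amount
    {"order_id": 2, "amount": None},   # missing amount
]
violations = {
    "null_amount": check_not_null(orders, "amount"),
    "dup_order_id": check_unique(orders, "order_id"),
    "bad_amount": check_range(orders, "amount", 0, 10_000),
}
```

Wiring checks like these into the pipeline's monitoring/alerting step is what turns "data quality" from a slide bullet into a gate that bad batches cannot pass.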
Posted 2 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We're looking for a QA Engineer
This role is Office Based, Hyderabad Office
Position Title: QA Engineer (Digital & Automation)
We are seeking an experienced QA Engineer with a strong background in Azure AI services, Salesforce applications, and Microsoft technologies. The ideal candidate will have extensive experience in testing AI-driven solutions within enterprise IT environments, ensuring seamless integration and functionality across platforms.
In this role, you will…
Design and execute comprehensive test plans for AI models and applications developed using Microsoft Azure AI services, ensuring optimal performance and reliability.
Conduct thorough testing of Salesforce applications, including Sales Cloud, Service Cloud, CPQ, CLM, Marketo, and ServiceNow, to ensure they meet business requirements and integrate seamlessly with other systems.
Develop and execute test strategies for mobile applications to ensure compatibility, usability, and performance across various devices and operating systems. This includes both manual and automated testing approaches.
Test third-party integrations with the Microsoft Power Platform suite, ensuring seamless functionality.
Develop and execute automated tests using tools such as Selenium and AccelQ.
Use test management tools like ServiceNow and TestRail to track and manage test cases and defects.
Develop and maintain test automation using standard frameworks.
Design and implement test strategies based on business requirements, in collaboration with Development, Product Management, and Customer Support teams.
Identify, report, and work with development teams to resolve software issues, ensuring solutions are validated.
Contribute to various testing activities, including functional, regression, automation, performance, and documentation testing.
Provide regular feedback on project/product quality through meetings, reports, and stakeholder communication.
Support software quality in production environments from a QA perspective.
Act as a quality advocate throughout the development lifecycle to ensure high-quality deliverables.
Lead major projects and provide training to junior and intermediate QA engineers.
Drive quality initiatives and ensure adherence to industry and organizational QA standards.
Continuously improve software and processes through creativity and knowledge sharing.
Cross-Functional Collaboration: Collaborate effectively with digital product teams, ensuring that digital solutions meet functional and non-functional testing requirements.
You’ve Got What It Takes If You Have…
Experience: Minimum of 3+ years in QA, with a focus on Microsoft applications and automation testing.
Microsoft Technologies: Extensive experience in testing Microsoft Azure and Microsoft Power Platform applications and ensuring their seamless operation within enterprise environments. Familiarity with Microsoft Test Studio is preferable.
Salesforce Expertise: Hands-on experience with Salesforce Sales Cloud, Service Cloud, CPQ, and other related Salesforce applications.
Automation Tools: Proficiency in Selenium or similar automation tools (e.g., AccelQ, cloud-based tools, or low-code/no-code automation platforms).
API Automation: Strong experience in automating API tests.
CI/CD: Familiarity with CI/CD processes, particularly in Salesforce environments.
Test Management: Experience using test management tools such as ServiceNow and TestRail in collaboration with the digital team to ensure alignment and smooth execution.
Defect Management: In-depth knowledge of ServiceNow or similar defect management systems.
Digital Testing Skills: Experience in testing digital applications such as websites, mobile apps, and other customer-facing digital solutions in collaboration with the digital team.
Additional Skills
Strong proficiency in QA methodologies and best practices.
Excellent communication skills for collaborating with cross-functional teams, especially with the digital team.
Our Culture
Spark Greatness. Shatter Boundaries. Share Success. Are you ready? Because here, right now – is where the future of work is happening. Where curious disruptors and change innovators like you are helping communities and customers enable everyone – anywhere – to learn, grow and advance. To be better tomorrow than they are today.
Who We Are
Cornerstone powers the potential of organizations and their people to thrive in a changing world. Cornerstone Galaxy, the complete AI-powered workforce agility platform, meets organizations where they are. With Galaxy, organizations can identify skills gaps and development opportunities, retain and engage top talent, and provide multimodal learning experiences to meet the diverse needs of the modern workforce. More than 7,000 organizations and 100 million+ users in 180+ countries and in nearly 50 languages use Cornerstone Galaxy to build high-performing, future-ready organizations and people today. Check us out on LinkedIn, Comparably, Glassdoor, and Facebook!
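The API-automation skill in the QA posting above boils down to asserting on status codes and payload shape, and collecting every failure in one pass so a run reports all broken fields at once. A small illustrative validator (the endpoint and fields are hypothetical, and a stubbed payload stands in for a real HTTP call):

```python
def validate_response(status_code, payload, required_fields):
    """Collect human-readable failures instead of stopping at the first one,
    so a single test run reports every broken field."""
    failures = []
    if status_code != 200:
        failures.append(f"expected HTTP 200, got {status_code}")
    for field, expected_type in required_fields.items():
        if field not in payload:
            failures.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            failures.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(payload[field]).__name__}")
    return failures

# Stubbed payload for a hypothetical GET /api/opportunities/42
stub = {"id": 42, "stage": "Negotiation", "amount": "not-a-number"}
failures = validate_response(
    200, stub, {"id": int, "stage": str, "amount": float})
```

In a real suite the stub would be replaced by a `requests.get(...)` call and the helper invoked from pytest, with the returned list asserted empty.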
Posted 2 days ago
0 years
0 Lacs
India
On-site
🌿 Good Friends Needed. Volunteers Welcome. 💚 In today’s fast-paced and hyper-digital world, genuine human connection is becoming rare. That’s why Friends Chaupal was born—a space where people come together to talk, listen, share, and simply be there for one another. Now, as we grow, we’re looking for volunteers who believe in kindness, empathy, and meaningful contribution. Whether you’re great at: • Hosting or organizing meetups (online/offline) • Designing, writing, or managing social media • Simply being a good listener …we have a place for you. This isn’t a job. It’s a calling. To be part of a movement that nurtures mental wellbeing, emotional connection, and community healing. 🫶 Just one hour of your time can spark a positive ripple in someone’s life. 📌 If this speaks to you, join our circle of changemakers: 👉 https://lnkd.in/ggbUBYiG Let’s build something beautiful—together. #FriendsChaupal #Volunteering #MentalWellbeing #Empathy #HumanConnection #SafeSpaces #CommunityHealing #Changemakers #KindnessInAction
Posted 2 days ago
50.0 years
0 Lacs
Delhi, India
On-site
About Gap Inc.
Our past is full of iconic moments — but our future is going to spark many more. Our brands — Gap, Banana Republic, Old Navy and Athleta — have dressed people from all walks of life and all kinds of families, all over the world, for every occasion for more than 50 years. But we’re more than the clothes that we make. We know that business can and should be a force for good, and it’s why we work hard to make product that makes people feel good, inside and out. It’s why we’re committed to giving back to the communities where we live and work. If you're one of the super-talented who thrive on change, aren't afraid to take risks and love to make a difference, come grow with us.
About The Role
In this role, you will be accountable for the development process and strategy execution for the assigned fabric/product departments. You will also be responsible for executing the overall country and mill/vendor strategy for the department in partnership with the relevant internal teams.
What You'll Do
Manage the fabric/product/vendor development process (P2M) in a timely manner (development sampling, initial costs, negotiation/production & capacity planning to meet the design aesthetic as well as commercially acceptable quality standards)
Manage relationships with mills/vendors and support vendor allocation & aggregated costing, along with overall capacity planning aligned to the cost targets, to drive competitive advantage
Partner with mills/vendors to drive innovation initiatives and superior quality while resolving any fabric/product and quality issues proactively
Onboard new mills/vendors and provide training to existing mills/vendors along with supporting the evaluation process
Look for opportunities for continuous improvement in fabric/product/vendor development, process management and overall sourcing procedures
Able to communicate difficult concepts in a simple manner
Participate in projects and assignments of diverse scope
Who You Are
Experience and
knowledge of work specific to global fabric/product/vendor development and understands design, merchandising, and global sourcing landscape Ability to drive results through planning and prioritizing along with influencing others and providing recommendations & solutions Present problem analysis and recommended solutions in a creative and logical manner Gap Inc. is an equal-opportunity employer and is committed to providing a workplace free from harassment and discrimination. We are committed to recruiting, hiring, training and promoting qualified people of all backgrounds, and make all employment decisions without regard to any protected status. We have received numerous awards for our long-held commitment to equality and will continue to foster a diverse and inclusive environment of belonging. In 2022, we were recognized by Forbes as one of the World's Best Employers and one of the Best Employers for Diversity.
Posted 2 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Service Operations - Production Engineer Support, AVP
Location: Pune, India
Role Description
You will be operating within the Corporate Bank Production domain or in Corporate Banking subdivisions as an AVP - Production Support Engineer. In this role, you will be accountable for driving a culture of proactive continual improvement in the Production environment through application and user request support, troubleshooting, and resolving errors in the production environment; automation of manual work, monitoring improvements, and platform hygiene; and training and mentoring new and existing team members, supporting the resolution of issues and conflicts, and preparing reports and meetings. The candidate should have experience in all relevant tools used in the Service Operations environment, have specialist expertise in one or more technical domains, and ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs). Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and that all application stability issues are well taken care of. Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied. Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies, and remove repetition to streamline support activities, reduce risk, and improve system availability. Be responsible for your own engineering delivery and, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams. Act as a Production Engineering role model to enhance the technical capability of the Production Support teams to create a future operating model embedded with an engineering culture.
Train and mentor team members to grow into the next role. Bring in a culture of innovation, engineering, and an automation mindset.
Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive Hospitalization Insurance for you and your dependents
Accident and Term Life Insurance
Complimentary health screening for those 35 yrs. and above
Your Key Responsibilities
Lead by example to drive a culture of proactive continual improvement into the Production environment through automation of manual work, monitoring improvements and platform hygiene.
Carry out technical analysis of the Production platform to identify and remediate performance and resiliency issues.
Engage in the Software Development Lifecycle (SDLC) to enhance Production Standards and controls.
Update the Run Book and KEDB as and when required.
Participate in all BCP and component failure tests based on the run books.
Understand the flow of data through the application infrastructure. It is critical to understand the dataflow so as to best provide operational support.
Event monitoring and management via a 24x7 workbench that is both monitoring and regularly probing the service environment and acting on instruction of a run book.
Drive knowledge management across the supported applications and ensure full compliance.
Work with team members to identify areas of focus where training may improve team performance and improve incident resolution.
Your Skills And Experience
Recent experience of applying technical solutions to improve the stability of production environments.
Working experience of some of the following technology skills:
Technologies/Frameworks: Shell scripting and/or Python; Java 8/OpenJDK 11 (at least) - for debugging; familiarity with the Spring Boot framework; Unix troubleshooting skills; Hadoop framework stack; Oracle 12c/19c - for PL/SQL, familiarity with OEM tooling to review AWR reports and parameters; NoSQL; MQ knowledge; ITIL v3 Certified (must)
Configuration Mgmt Tooling: Ansible
Operating System/Platform: RHEL 7.x (preferred), RHEL 6.x; OpenShift (as we move towards Cloud computing and the fact that Fabric is dependent on OpenShift)
CI/CD: Jenkins (preferred), TeamCity
APM Tooling: Splunk, Geneos, New Relic, Prometheus-Grafana
Other platforms: Scheduling – Ctrl-M is a plus, Airflow, crontab or Autosys, etc.
Methodology: Micro-services architecture; SDLC; Agile; fundamental network topology – TCP, LAN, VPN, GSLB, GTM, etc.
Distributed systems experience on cloud platforms such as Azure or GCP is a plus, as is familiarity with containerization/Kubernetes.
Tools: ServiceNow, Jira, Confluence, BitBucket and/or Git, Oracle, SQL Plus; familiarity with simple Unix tooling – PuTTY, mPutty, Exceed, (PL/)SQL Developer
Good understanding of the ITIL Service Management framework, including Incident, Problem, and Change processes.
Ability to self-manage a book of work and ensure clear transparency on progress with clear, timely communication of issues.
Excellent troubleshooting and problem-solving skills.
Excellent communication skills, both written and verbal, with attention to detail.
Ability to work in virtual teams and in matrix structures.
Experience | Exposure (Recommended):
11+ yrs experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function.
Service Operations and development experience within a global operations context.
Global Transaction Banking experience is a plus.
Experience of end-to-end Level 2, 3, 4 management and a good overview of Production/Operations Management overall.
Experience of supporting complex application and infrastructure domains.
ITIL / best practice service context; ITIL Foundation is a plus.
Good analytical and problem-solving skills.
An added advantage is knowledge of the following technologies:
ETL flows and pipelines.
Knowledge of Big Data, Spark, Hive, etc.
Hands-on experience with Splunk/New Relic for creating dashboards along with alerts/rules setups.
Understanding of messaging systems like SWIFT and MQ messages.
Understanding of trade life cycles, especially for back office.
How We’ll Support You
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
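The 24x7 event-monitoring workbench described in the support role above acts on run-book thresholds and routes findings into incident priorities. As a hedged illustration (not Deutsche Bank's actual tooling; metric names and thresholds are invented), such a check can be sketched as:

```python
# Minimal run-book style health check: probe a metric, compare against
# thresholds, and classify the result as OK / P3-P4 / P1-P2.
# Service names and thresholds are illustrative assumptions.

def classify_alert(metric_name, value, warn_at, critical_at):
    """Return an ITIL-style priority bucket for a probed metric."""
    if value >= critical_at:
        return "P1/P2"   # initial triage, then escalate to senior levels
    if value >= warn_at:
        return "P3/P4"   # diagnose and fix at L2
    return "OK"

def run_checks(samples, thresholds):
    """Evaluate every probed metric against its run-book thresholds."""
    return {
        name: classify_alert(name, value, *thresholds[name])
        for name, value in samples.items()
    }

thresholds = {"disk_used_pct": (80, 95), "heap_used_pct": (70, 90)}
samples = {"disk_used_pct": 97, "heap_used_pct": 75}
print(run_checks(samples, thresholds))
# {'disk_used_pct': 'P1/P2', 'heap_used_pct': 'P3/P4'}
```

In practice the probe loop would feed a queue-monitoring tool such as the Splunk or Geneos stacks named above; the classification step is the part the run book prescribes.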
Posted 2 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
We're looking for a Finance Operations Analyst. This role is office-based in our Pune office. Our Culture: Spark Greatness. Shatter Boundaries. Share Success. Are you ready? Because here, right now - is where the future of work is happening. Where curious disruptors and change innovators like you are helping communities and customers enable everyone - anywhere - to learn, grow and advance. To be better tomorrow than they are today. Who We Are: Cornerstone powers the potential of organizations and their people to thrive in a changing world. Cornerstone Galaxy, the complete AI-powered workforce agility platform, meets organizations where they are. With Galaxy, organizations can identify skills gaps and development opportunities, retain and engage top talent, and provide multimodal learning experiences to meet the diverse needs of the modern workforce. More than 7,000 organizations and 100 million+ users in 180+ countries and in nearly 50 languages use Cornerstone Galaxy to build high-performing, future-ready organizations and people today. Check us out on LinkedIn, Comparably, Glassdoor, and Facebook!
Posted 2 days ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose. Your role: You will play a key role in Data Strategy. We are looking for candidates with 8+ years' experience in Data Strategy (Tech Architects, Senior BAs) who will support our product, sales and leadership teams by creating data-strategy roadmaps. The ideal candidate is adept at understanding as-is enterprise data models to help Data Scientists and Data Analysts provide actionable insights to the leadership. They must have strong experience in understanding data using a variety of data tools. They must have a proven ability to understand the current data pipeline and ensure that a minimal-cost solution architecture is created, and must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes. Identify, design, and recommend internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., and identify data tools for the analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
Work with data and analytics experts to create frameworks for digital twins/digital threads, bringing relevant experience in data exploration and profiling; be involved in data literacy activities for all stakeholders; and coordinate with cross-functional teams as the SPOC for global master data. Your Profile: 8+ years of experience in a Data Strategy role and a graduate degree in Computer Science, Informatics, Information Systems, or another quantitative field. Candidates should also have experience using the following software/tools: big data tools such as Hadoop, Spark and Kafka; relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB; and data pipeline and workflow management tools such as Luigi and Airflow. 5+ years of advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases (Postgres/SQL/Mongo), plus 2+ years' working knowledge in Data Strategy (Data Governance/MDM, etc.). 5+ years of experience in creating data-strategy frameworks/roadmaps, in analytics and data maturity evaluation based on a current as-is vs. to-be framework, and in creating functional requirements documents and enterprise to-be data architecture. Relevant experience in identifying and prioritizing use cases for the business, including important KPI identification and opex/capex for CXOs, and 4+ years' experience in Data Analytics operating models with a vision covering prescriptive, descriptive, predictive and cognitive analytics. What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth.
Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
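Data exploration and profiling of as-is data models, mentioned in the role above, typically starts with a per-column scan for nulls and cardinality. A hedged pure-Python sketch with invented sample records (real engagements would use profiling tooling over the actual enterprise sources):

```python
# Sketch of a data-profiling pass used during data exploration:
# for each column, count nulls and distinct values to gauge data quality.
# The sample records and field names are invented for illustration.

def profile(records):
    """Return per-column null and distinct counts for a list of dicts."""
    columns = {key for row in records for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in records]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

customers = [
    {"id": 1, "country": "IN", "segment": "retail"},
    {"id": 2, "country": None, "segment": "retail"},
    {"id": 3, "country": "DE", "segment": None},
]
print(profile(customers))
```

A report like this is the raw input for the data-maturity evaluation the posting describes: columns with high null rates or unexpected cardinality become candidates for governance and MDM attention.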
Posted 2 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Data Engineer III at JPMorgan Chase within the Consumer & Community Banking Technology Team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role. Job Responsibilities Executes standard software solutions, design, development, and technical troubleshooting Writes secure and high-quality code using the syntax of at least one programming language with limited guidance Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture. Design & develop data pipelines end to end using PySpark, Java, Python and AWS Services. Utilize Container Orchestration services including Kubernetes, and a variety of AWS tools and services. 
Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
Adds to team culture of diversity, equity, inclusion, and respect
Required Qualifications, Capabilities, And Skills
Formal training or certification on software engineering concepts and 3+ years of applied experience.
Hands-on practical experience in system design, application development, testing, and operational stability
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Hands-on practical experience in developing Spark-based frameworks for end-to-end ETL, ELT and reporting solutions using key components like Spark and Spark Streaming
Proficient in coding in one or more languages: Core Java, Python and PySpark
Experience with relational and data warehouse databases, and cloud implementation experience with AWS, including:
AWS Data Services: proficiency in Lake Formation, Glue ETL (or EMR), S3, Glue Catalog, Athena, Airflow (or Lambda + Step Functions + EventBridge), ECS Cluster and ECS Apps
Data De/Serialization: expertise in at least 2 of the formats: Parquet, Iceberg, AVRO, JSON
AWS Data Security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, Secrets Manager
Proficiency in automation and continuous delivery methods.
Preferred Qualifications, Capabilities, And Skills
Experience in Snowflake is nice to have.
Solid understanding of agile methodologies and of CI/CD, application resiliency, and security.
In-depth knowledge of the financial services industry and their IT systems.
Practical cloud-native experience, preferably AWS.
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P.
Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.
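The de/serialization expertise listed in the qualifications above (Parquet, Iceberg, AVRO, JSON) centers on round-tripping records through a format while enforcing a schema. A minimal stdlib-JSON sketch; the schema and field names are invented for illustration, not JPMorgan's:

```python
import json

# Hedged sketch of a serialization round-trip with a light schema check,
# the kind of validation an ETL job applies before landing JSON records.
# The schema and sample record are illustrative assumptions.

SCHEMA = {"trade_id": int, "symbol": str, "qty": int}

def validate(record, schema):
    """Check that every schema field is present with the expected type."""
    for field, expected in schema.items():
        if not isinstance(record.get(field), expected):
            raise ValueError(f"bad field: {field}")
    return record

# Serialize, deserialize, then validate against the schema.
raw = json.dumps({"trade_id": 42, "symbol": "XYZ", "qty": 100})
record = validate(json.loads(raw), SCHEMA)
print(record["trade_id"])  # 42
```

Formats like Parquet and AVRO carry the schema with the data, so the equivalent check happens at read time; with schemaless JSON it must be done explicitly, as above.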
Posted 2 days ago
1.0 - 4.0 years
4 - 8 Lacs
Pune
Work from Office
Join us as an Engineer Developer at Barclays, where you will help us build, maintain and support all First Line of Controls applications. The successful candidate will be accountable for the technical operation of this control during Asia hours. This role requires a high degree of communication with the global leads in the US and India. To be successful as an Engineer Developer, you should have experience with:
Capturing functional requirements by talking to the business team and leads in the US and India.
Converting functional requirements into design and code.
Efficiently writing regression and unit test cases for the developed functionality.
Coordinating with the L1 support team and proactively getting involved in production support activities if required.
Contributing to the maturity of DevOps on the application.
Providing timely status updates to relevant stakeholders.
A graduate in Computer Science with computer application skills.
Proficiency in technologies like Java 17.0, Spark, Spring Boot, microservices and SQL.
It would be a great advantage if you know Kafka, Apache Ignite, the Cucumber framework and React.
Awareness of Agile Scrum/Kanban methodology.
Some other highly valued skills may include:
Partnering very closely with the business operations team.
Working closely with the global team to deliver on agreed outcomes.
Experience in GitLab, Autosys and DevOps.
You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.
Purpose of the role: to design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.
Accountabilities:
Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.
Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.
Assistant Vice President Expectations:
To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Alternatively, an individual contributor will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk, developing new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information; 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
Posted 2 days ago
1.0 - 4.0 years
8 - 12 Lacs
Jamnagar, Ahmedabad, Rajkot
Work from Office
About The Business: Tata Electronics Private Limited (TEPL) is a greenfield venture of the Tata Group with expertise in manufacturing precision components. Tata Electronics (a wholly owned subsidiary of Tata Sons Pvt. Ltd.) is building India's first AI-enabled state-of-the-art semiconductor foundry. This facility will produce chips for applications such as power management ICs, display drivers, microcontrollers (MCUs) and high-performance computing logic, addressing the growing demand in markets such as automotive, computing and data storage, wireless communications and artificial intelligence. The Tata Group operates in more than 100 countries across six continents, with the mission 'To improve the quality of life of the communities we serve globally, through long-term stakeholder value creation based on leadership with Trust'.
Job Responsibilities:
Architect and implement a scalable, offline Data Lake for structured, semi-structured, and unstructured data in an on-premises, air-gapped environment.
Collaborate with Data Engineers, Factory IT, and Edge Device teams to enable seamless data ingestion and retrieval across the platform.
Integrate with upstream systems like MES, SCADA, and process tools to capture high-frequency manufacturing data efficiently.
Monitor and maintain system health, including compute resources, storage arrays, disk I/O, memory usage, and network throughput.
Optimize Data Lake performance via partitioning, deduplication, compression (Parquet/ORC), and effective indexing strategies.
Select, integrate, and maintain tools like Apache Hadoop, Spark, Hive, HBase, and custom ETL pipelines suitable for offline deployment.
Build custom ETL workflows for bulk and incremental data ingestion using Python, Spark, and shell scripting.
Implement data governance policies covering access control, retention periods, and archival procedures with security and compliance in mind.
Establish and test backup, failover, and disaster recovery protocols specifically designed for offline environments.
Document architecture designs, optimization routines, job schedules, and standard operating procedures (SOPs) for platform maintenance.
Conduct root cause analysis for hardware failures, system outages, or data integrity issues.
Drive system scalability planning for multi-fab or multi-site future expansions.
Essential Attributes (Tech Stack):
Hands-on experience designing and maintaining offline or air-gapped Data Lake environments.
Deep understanding of Hadoop ecosystem tools: HDFS, Hive, MapReduce, HBase, YARN, ZooKeeper and Spark.
Expertise in custom ETL design and large-scale batch and stream data ingestion.
Strong scripting and automation capabilities using Bash and Python.
Familiarity with data compression formats (ORC, Parquet) and ingestion frameworks (e.g., Flume).
Working knowledge of message queues such as Kafka or RabbitMQ, with a focus on integration logic.
Proven experience in system performance tuning, storage efficiency, and resource optimization.
Qualifications: BE/ME in Computer Science, Machine Learning, Electronics Engineering, Applied Mathematics, or Statistics.
Desired Experience Level: 4 years' relevant experience post Bachelor's; 2 years' relevant experience post Master's. Experience with the semiconductor industry is a plus.
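The incremental-ingestion and deduplication responsibilities above can be illustrated with a small watermark-based loader: only rows newer than the stored watermark are loaded, and repeats of the same business key are dropped. This is a hedged sketch, not Tata Electronics' implementation; field names and the watermark scheme are assumptions:

```python
# Sketch of incremental, deduplicated ingestion into an offline data lake.
# "ts" is an event timestamp and "key" a business key; both are invented.

def incremental_ingest(batch, watermark, seen_keys):
    """Return rows past the watermark, the new watermark, in arrival order."""
    loaded = []
    for row in sorted(batch, key=lambda r: r["ts"]):
        if row["ts"] <= watermark or row["key"] in seen_keys:
            continue  # already ingested, or a duplicate within the batch
        seen_keys.add(row["key"])
        loaded.append(row)
    new_watermark = max([watermark] + [r["ts"] for r in loaded])
    return loaded, new_watermark

batch = [
    {"key": "lot-1", "ts": 5}, {"key": "lot-1", "ts": 7},  # duplicate key
    {"key": "lot-2", "ts": 9}, {"key": "lot-0", "ts": 2},  # before watermark
]
rows, wm = incremental_ingest(batch, watermark=3, seen_keys=set())
print(len(rows), wm)  # 2 9
```

In a Spark-based pipeline the same idea appears as a filter on a max-loaded timestamp plus a dropDuplicates step before writing partitioned Parquet/ORC output.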
Posted 2 days ago
3.0 - 8.0 years
11 - 16 Lacs
Pune
Work from Office
Job Title: Lead Engineer. Location: Pune. Corporate Title: Director. As a lead engineer within the Transaction Monitoring department, you will lead and drive forward critical engineering initiatives and improvements to our application landscape whilst supporting and leading the engineering teams to excel in their roles. You will be closely aligned to the architecture function and delivery leads, ensuring alignment with planning and that correct design and architecture governance is followed for all implementation work. You will lead by example and drive and contribute to automation and innovation initiatives with the engineering teams. Join the fight against financial crime with us! Your key responsibilities: Experienced hands-on cloud and on-premise engineer, leading by example with engineering squads. Thinking analytically, with a systematic and logical approach to solving complex problems, and high attention to detail. Design and document complex technical solutions at varying levels in an inclusive and participatory manner with a range of stakeholders. Liaise directly with stakeholders in technology, business and modelling areas. Collaborating with application development teams to design and prototype solutions (both on-premises and on-cloud), supporting/presenting these via the Design Authority forum for approval and providing good practice and guidelines to the teams. Ensuring engineering and architecture compliance with bank-standard processes for deploying new applications, working directly with central functions such as Group Architecture, Chief Security Office and Data Governance. Innovate and think creatively, showing willingness to apply new approaches to solving problems and to learn new methods, technologies and potentially outside-of-the-box solutions. Your skills and experience: Proven hands-on engineering and design experience in a delivery-focused (preferably agile) environment. Solid technical/engineering background, preferably with at least two high-level
languages and multiple relational databases or big-data technologies. Proven experience with cloud technologies, preferably GCP (GKE/DataProc/CloudSQL/BigQuery), GitHub and Terraform. Competence/expertise in technical skills across a wide range of technology platforms, and the ability to use and learn new frameworks, libraries and technologies. A deep understanding of the software development life cycle and the waterfall and agile methodologies. Experience leading complex engineering initiatives and engineering teams. Excellent communication skills, with demonstrable ability to interface and converse at both junior and senior levels and with non-IT staff. Line management experience, including working in a matrix management configuration. How we'll support you: Training and development to help you excel in your career. Flexible working to assist you in balancing your personal priorities. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
Posted 2 days ago
4.0 - 8.0 years
7 - 12 Lacs
Pune, Bengaluru
Work from Office
Job Title: StreamSets ETL Developer, Associate. Location: Pune, India. Role Description: Currently DWS sources technology infrastructure, corporate functions systems (Finance, Risk, HR, Legal, Compliance, AFC, Audit, Corporate Services, etc.) and other key services from DB. Project Proteus aims to strategically transform DWS into an Asset Management standalone operating platform: an ambitious and ground-breaking project that delivers separated DWS infrastructure and Corporate Functions in the cloud with essential new capabilities, further enhancing DWS' highly competitive and agile Asset Management capability. This role offers a unique opportunity to be part of a high-performing team implementing a strategic future-state technology landscape for all DWS Corporate Functions globally. We are seeking a highly skilled and motivated ETL developer (individual contributor) to join our integration team. The ETL developer will be responsible for developing, testing and maintaining robust and scalable ETL processes to support our data integration initiatives. This role requires a strong understanding of database, Unix and ETL concepts, excellent SQL skills and experience with ETL tools and databases. Your key responsibilities: This role will be primarily responsible for creating good-quality software using standard coding practices, with hands-on code development. Thorough testing of developed ETL solutions/pipelines. Reviewing code of other team members. Taking end-to-end accountability and ownership of work/projects, applying the right and robust engineering practices. Converting business requirements into technical design. Delivery, deployment, review, business interaction and maintaining environments.
Additionally, the role will include other responsibilities, such as: Collaborating across teams. Sharing information and transferring knowledge and expertise to team members. Working closely with stakeholders and other teams such as the Functional Analysis and Quality Assurance teams. Working with BA and QA to troubleshoot and resolve reported bugs/issues in applications. Your skills and experience: Bachelor's degree from an accredited college or university with a concentration in Science or an IT-related discipline (or equivalent). Hands-on experience with StreamSets, SQL Server and Unix. Experience of developing and optimizing ETL pipelines for data ingestion, manipulation and integration. Strong proficiency in SQL, including complex queries, stored procedures and functions. Solid understanding of relational database concepts. Familiarity with data modeling concepts (conceptual, logical, physical). Familiarity with HDFS, Kafka, Microservices and Splunk. Familiarity with cloud-based platforms (e.g. GCP, AWS). Experience with scripting languages (e.g. Bash, Groovy). Excellent knowledge of SQL. Experience of delivering within an agile delivery framework. Experience with distributed version control tools (Git, GitHub, Bitbucket). Experience with Jenkins or pipeline-based modern CI/CD systems.
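As a rough illustration of the ETL pipeline work described above, the following sketch runs a tiny extract-transform-load pass in SQLite, standing in for the SQL Server/StreamSets stack; the table and column names are invented:

```python
import sqlite3

# Minimal extract-transform-load sketch using an in-memory SQLite database.
# Extract: rows land in a staging table; Transform: aggregate and convert
# units; Load: write into the target table. All names are illustrative.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (account TEXT, amount_cents INTEGER)")
conn.executemany(
    "INSERT INTO staging VALUES (?, ?)",
    [("A-1", 1250), ("A-1", 750), ("A-2", 300)],
)

# Transform: aggregate per account and convert cents to a decimal amount.
conn.execute("CREATE TABLE ledger (account TEXT PRIMARY KEY, amount REAL)")
conn.execute(
    """INSERT INTO ledger
       SELECT account, SUM(amount_cents) / 100.0 FROM staging
       GROUP BY account"""
)

print(conn.execute("SELECT * FROM ledger ORDER BY account").fetchall())
# [('A-1', 20.0), ('A-2', 3.0)]
```

A real StreamSets pipeline expresses the same stages as configured origins, processors, and destinations rather than inline SQL, but the staging-transform-load shape is the same.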
Posted 2 days ago
1.0 - 4.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Data Engineer I (F Band). About The Role: As a Data Engineer, you will be responsible for implementing data pipelines and analytics solutions to support key decision-making processes in our Life & Health Reinsurance business. You will become part of a project that is leveraging cutting-edge technology that applies Big Data and Machine Learning to solve new and emerging problems for Swiss Re. You will be expected to gain a full understanding of the reinsurance data and business logic required to deliver analytics solutions.
Key responsibilities include:
Work closely with Product Owners and Engineering Leads to understand requirements and evaluate the implementation effort.
Develop and maintain scalable data transformation pipelines.
Implement analytics models and visualizations to provide actionable data insights.
Collaborate within a global development team to design and deliver solutions.
About The Team: Life & Health Data & Analytics Engineering is a key tech partner for our Life & Health Reinsurance division, supporting the transformation of the data landscape and the creation of innovative analytical products and capabilities. A large, globally distributed team working in an agile development landscape, we deliver solutions to make better use of our reinsurance data and enhance our ability to make data-driven decisions across the business value chain.
About You: Are you eager to disrupt the industry with us and make an impact? Do you wish to have your talent recognized and rewarded? Then join our growing team and become part of the next wave of data innovation.
Key qualifications include:
Bachelor's degree level or equivalent in Computer Science, Data Science or a similar discipline.
At least 1-3 years of experience working with large-scale software systems.
Proficient in Python/PySpark.
Proficient in SQL (Spark SQL preferred).
Palantir Foundry experience is a strong plus.
Experience working with large data sets on enterprise data platforms and distributed computing (Spark/Hive/Hadoop preferred).
Experience with JavaScript/HTML/CSS is a plus.
Experience working in a Cloud environment such as AWS or Azure is a plus.
Strong analytical and problem-solving skills.
Enthusiasm to work in a global and multicultural environment of internal and external professionals.
Strong interpersonal and communication skills, demonstrating a clear and articulate standard of written and verbal communication in complex environments.
About Swiss Re: Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and express their passion for sustainability. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.
Reference Code: 134086
Posted 2 days ago
5.0 - 10.0 years
7 - 14 Lacs
Pune
Work from Office
We are looking for a skilled Data Engineer with 5-10 years of experience to join our team in Pune. The ideal candidate will have a strong background in data engineering and excellent problem-solving skills. Roles and Responsibility: Design, develop, and implement data pipelines and architectures. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data systems and databases. Ensure data quality, integrity, and security. Optimize data processing and analysis workflows. Participate in code reviews and contribute to improving overall code quality. Job Requirements: Strong proficiency in programming languages such as Python or Java. Experience with big data technologies like Hadoop or Spark. Knowledge of relational database management systems like MySQL, as well as NoSQL databases. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills. Notice period: Immediate joiners preferred.
Posted 2 days ago
5.0 - 7.0 years
7 - 9 Lacs
Hyderabad
Work from Office
We are looking for a skilled Snowflake Developer with 5-7 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have expertise in designing, developing, and implementing data warehousing solutions using Snowflake. Roles and Responsibility Design and develop scalable data warehousing solutions using Snowflake. Collaborate with cross-functional teams to identify business requirements and design data models. Develop and maintain complex SQL queries for data extraction and manipulation. Implement data validation and quality checks to ensure accuracy and integrity. Optimize database performance and troubleshoot issues. Work closely with stakeholders to understand business needs and provide technical guidance. Job Requirements Strong understanding of data modeling and data warehousing concepts. Proficiency in writing complex SQL queries and stored procedures. Experience with Snowflake development tools and technologies. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills.
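The data validation and quality checks mentioned in the Snowflake role above are often expressed as plain SQL rules run against each table after load. A hedged sketch using SQLite as a stand-in for Snowflake; the rule names, table, and sample data are invented:

```python
import sqlite3

# Sketch of SQL-based data-quality checks: each named rule is a query that
# returns a count of violating rows; a non-zero count marks a failure.
# SQLite stands in for Snowflake here; all names are illustrative.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO dim_customer VALUES (?, ?)",
    [(1, "a@x.com"), (2, None), (2, "b@x.com")],
)

checks = {
    "no_null_email": "SELECT COUNT(*) FROM dim_customer WHERE email IS NULL",
    "no_dup_ids": """SELECT COUNT(*) FROM (
                       SELECT id FROM dim_customer
                       GROUP BY id HAVING COUNT(*) > 1)""",
}

failures = {
    name: bad for name, sql in checks.items()
    if (bad := conn.execute(sql).fetchone()[0]) > 0
}
print(failures)  # {'no_null_email': 1, 'no_dup_ids': 1}
```

Keeping the rules as data (a dict of named queries) makes it easy to add checks per table and to report every failed rule in one pass instead of stopping at the first.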
Posted 2 days ago