8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Minimum qualifications: Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience. 8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript). 3 years of experience in a technical leadership role; overseeing projects, with 2 years of experience in a people management, supervision/team leadership role. Experience in one or more disciplines such as machine learning, recommendation systems, natural language processing, computer vision, pattern recognition, or artificial intelligence. Preferred qualifications: Understanding of agentic AI/ML and Large Language Model (LLM). Excellent coding skills. About The Job Like Google's own ambitions, the work of a Software Engineer goes beyond just Search. Software Engineering Managers have not only the technical expertise to take on and provide technical leadership to major projects, but also manage a team of Engineers. You not only optimize your own code but make sure Engineers are able to optimize theirs. As a Software Engineering Manager you manage your project goals, contribute to product strategy and help develop your team. Teams work all across the company, in areas such as information retrieval, artificial intelligence, natural language processing, distributed computing, large-scale system design, networking, security, data compression, user interface design; the list goes on and is growing every day. Operating with scale and speed, our exceptional software engineers are just getting started -- and as a manager, you guide the way. With technical and leadership expertise, you manage engineers across multiple teams and locations, a large product budget and oversee the deployment of large-scale projects across multiple sites internationally. At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google’s IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers. Responsibilities Manage a team of AI software engineers, fostering a collaborative and high-performing environment. This includes hiring, mentoring, performance management, and career development. Drive the design, development, and deployment of scalable and reliable Artificial Intelligence/Machine Learning (AI/ML) systems and infrastructure relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning). Collaborate with Product Managers and HR stakeholders to understand business needs, define product requirements, and translate them into technical specifications and project plans. Oversee the architecture and implementation of data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) to support AI/ML initiatives. Stay up-to-date of the latest advancements in AI/ML and related technologies, evaluating their potential application within human resources and guiding the team's adoption of relevant innovations. Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. 
We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 1 month ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and their impact on network or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities. What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years of experience with Cloud technology: GCP, AWS, or Azure 5+ years of experience designing and developing cloud-native solutions 5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter who identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and GitHub) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 1 month ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What You’ll Do Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and their impact on network or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax Solutions. Manage sole project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities. What Experience You Need Bachelor's degree or equivalent experience 2+ years of software engineering experience 2+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 2+ years of experience with Cloud technology: GCP, AWS, or Azure 5+ years of experience designing and developing cloud-native solutions 2+ years of experience designing and developing microservices using Java, Spring Framework, GCP SDKs, GKE/Kubernetes 2+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs Big Data Technologies: Spark/Scala/Hadoop What could set you apart Experience designing and developing big data processing solutions using DataProc, Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others Cloud certification, especially in GCP Self-starter who identifies/responds to priority shifts with minimal supervision. You have excellent leadership and motivational skills You have an inquisitive and innovative mindset with a demonstrated ability to recognize opportunities to create distinctive value You can successfully evaluate workload to drive efficiency
Posted 1 month ago
7.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry. Job Description In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights. Contribute to the release of code and applications. Support IT functions hosting services on Azure & GCP, continuous integration, test automation. Help our team support build and own the implementation and IaC supporting CI/CD Use modern infrastructure tools and platforms to automate our systems. Help define the DevOps roadmap, discover what is needed, define the scope and technologies, and help to build the backlog. Job Qualifications 7-9 years professional work experience Proficient with Cloud based implementations including Azure & GCP Strong knowledge of Ansible, Terraform & IaC scripting Continuous integration (Semaphore, Jenkins), Test Automation DB: PostgreSQL, Snowflake, MySQL Snowflake, Google Dataflow, & PowerBI DevOps knowledge Experience with NGINX, Linux Experience with Windows Server and MS SQL Server 2019 or newer Strong programming and scripting fundamentals (Bash, PowerShell, Visual script) Containers such as Docker Excellent communication skills Strong attention to detail and excellent analytical capabilities A desire to write tools and applications to automate work rather than do everything by hand Passionate about Continuous Build, Integration, Test, and Delivery systems Familiarity With The Following Is a Strong Plus Tools: JIRA, Confluence, Github. TFS Experience or exposure to Agile/SCRUM is ideal Cotality's Diversity Commitment Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone’s unique contributions, experiences and values. 
We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences. Equal Opportunity Employer Statement Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration. Privacy Policy Global Applicant Privacy Policy By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBE and will automatically be opted out company-wide.
Posted 1 month ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary Position Summary Strategy & Analytics AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. 
The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 6-9 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. 
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300079
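To ground the BigQuery and data-pipeline expectations listed in the qualifications above, here is a minimal, illustrative Python sketch that loads a Cloud Storage extract into BigQuery and runs a simple query. It is not part of the posting; the project, dataset, bucket, and column names are hypothetical placeholders.

```python
# Illustrative sketch only: load a CSV extract from Cloud Storage into BigQuery
# and run a small aggregation. All resource names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

table_id = "my-project.analytics.orders"          # hypothetical destination table
source_uri = "gs://my-bucket/exports/orders.csv"  # hypothetical source file

load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load the file and wait for the job to finish.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=load_config)
load_job.result()

# Query the freshly loaded table.
query = f"""
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `{table_id}`
    GROUP BY customer_id
    ORDER BY total_amount DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.customer_id, row.total_amount)
```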
Posted 1 month ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary Position Summary Strategy & Analytics AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. 
The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. 
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300075
Posted 1 month ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary Position Summary Strategy & Analytics AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. 
The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. 
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300075
Posted 1 month ago
6.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary Position Summary Strategy & Analytics AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. 
The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 6-9 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. 
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300079
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description The Opportunity: Full Stack Data Engineer We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP Native technologies like BigQuery, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at Ford. Responsibilities What You'll Do: ( Responsibilities) Data Pipeline Architect & Builder: Spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources. Ensure data is standardized, high-quality, and optimized for analytical use. Leverage cutting-edge tools and technologies, including Python, SQL, and DBT/Dataform, to build robust and efficient data pipelines. End-to-End Integration Expert: Utilize your full-stack skills to contribute to seamless end-to-end development, ensuring smooth and reliable data flow from source to insight. GCP Data Solutions Leader : Leverage your deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that not only meet but exceed business needs and expectations. Data Governance & Security Champion : Implement and manage robust data governance policies, access controls, and security best practices, fully utilizing GCP's native security features to protect sensitive data. Data Workflow Orchestrator : Employ Astronomer and Terraform for efficient data workflow management and cloud infrastructure provisioning, championing best practices in Infrastructure as Code (IaC). Performance Optimization Driver : Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost-effectiveness. Collaborative Innovator : Collaborate effectively with data architects, application architects, service owners, and cross-functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering. Automation & Reliability Advocate : Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency. Effective Communicator : Clearly and transparently communicate complex technical decisions to both technical and non-technical stakeholders, fostering understanding and alignment. Continuous Learner : Stay ahead of the curve by continuously learning about industry trends and emerging technologies, proactively identifying opportunities to improve our data platform and enhance our capabilities. Business Impact Translator : Translate complex business requirements into optimized data asset designs and efficient code, ensuring that our data solutions directly contribute to business goals. Documentation & Knowledge Sharer : Develop comprehensive documentation for data engineering processes, promoting knowledge sharing, facilitating collaboration, and ensuring long-term system maintainability. 
Qualifications What You'll Bring: (Qualifications) Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience). 5-7 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred). Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and DataProc. Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform. Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery). Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments. Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks. Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues. Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc). A passion for data, innovation, and continuous learning.
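As one hedged illustration of the orchestration and curation work described above (an Astronomer/Cloud Composer-style Airflow deployment running a BigQuery curation step), a sketch might look like the following. The DAG id, project, dataset, and SQL are hypothetical, and it assumes a recent Airflow with the Google provider package installed.

```python
# Hypothetical sketch of a daily curation task orchestrated with Airflow.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

CURATION_SQL = """
CREATE OR REPLACE TABLE `my-project.curated.daily_orders` AS
SELECT order_id, customer_id, amount, DATE(created_at) AS order_date
FROM `my-project.raw.orders`
WHERE DATE(created_at) = CURRENT_DATE()
"""

with DAG(
    dag_id="curate_daily_orders",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # run daily at 02:00
    catchup=False,
) as dag:
    curate = BigQueryInsertJobOperator(
        task_id="curate_orders",
        configuration={"query": {"query": CURATION_SQL, "useLegacySql": False}},
    )
```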
Posted 1 month ago
5.0 years
6 - 7 Lacs
Hyderābād
On-site
Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist In this role you will be Design and Develop ETL Processes: Lead the design and implementation of ETL processes using all kinds of batch/streaming tools to extract, transform, and load data from various sources into GCP. Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs. Data Pipeline Optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows. Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks. Data Integration and Management: Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency. Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting. GCP Dataflow Development: Write Apache Beam based Dataflow Job for data extraction, transformation, and analysis, ensuring optimal performance and accuracy. Collaborate with data analysts and data scientists to prepare data for analysis and reporting. Automation and Monitoring: Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention. Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs. Data Governance and Security: Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies. Collaborate with security teams to implement data protection measures and address vulnerabilities. Documentation and Knowledge Sharing: Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members. Conduct training sessions and workshops to share expertise and promote best practices within the team. Requirements To be successful in this role, you should meet the following requirements: Education: Bachelor’s degree in Computer Science, Information Systems, or a related field. Experience: Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on Data Stage and GCP. Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development. Technical Skills: Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering. Experience of cloud-based solutions, especially in GCP, cloud certified candidate is preferred. Experience and knowledge of Bigdata data processing in batch mode and streaming mode, proficient in Bigdata eco systems, e.g. Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark, etc. 
Familiarity with Java & Python for data manipulation on Cloud/Big Data platforms. Analytical Skills: Strong problem-solving skills with a keen attention to detail. Ability to analyze complex data sets and derive meaningful insights. Benefits: Competitive salary and comprehensive benefits package. Opportunity to work in a dynamic and collaborative environment on cutting-edge data projects. Professional development opportunities to enhance your skills and advance your career. If you are a passionate data engineer with expertise in ETL processes and a desire to make a significant impact within our organization, we encourage you to apply for this exciting opportunity! You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
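For orientation only, a minimal Apache Beam streaming pipeline of the kind this role describes (Pub/Sub in, BigQuery out, runnable on Dataflow) could be sketched as below. The subscription, table, and field names are hypothetical, and the target BigQuery table is assumed to already exist.

```python
# A minimal sketch, assuming a Pub/Sub subscription carrying JSON events and a
# pre-created BigQuery table. Run locally with DirectRunner or submit to
# Dataflow with --runner=DataflowRunner.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "KeepValid" >> beam.Filter(lambda e: "event_id" in e and "amount" in e)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```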
Posted 1 month ago
10.0 years
25 - 30 Lacs
Cochin
On-site
Job Role: Data Architect Experience: 10+ years Notice period: Immediate to 15 days Location: Trivandrum / Kochi Introduction We are looking for candidates with 10 +years of experience in data architect role. Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Implement best practices for data security, privacy, and compliance in cloud environments. Integrate machine learning workflows with data pipelines and analytics tools. Define data governance frameworks and manage data lineage. Lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimize cloud infrastructure for scalability, performance, and reliability. Mentor junior team members and ensure adherence to architectural standards. Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager). Ensure high availability and disaster recovery solutions are built into data systems. Conduct technical reviews, audits, and performance tuning for data solutions. Design solutions for multi-region and multi-cloud data architecture. Stay updated on emerging technologies and trends in data engineering and GCP. Drive innovation in data architecture, recommending new tools and services on GCP. Certifications : Google Cloud Certification is Preferred. Primary Skills : 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills. Job Types: Full-time, Permanent Pay: ₹2,500,000.00 - ₹3,000,000.00 per year Benefits: Paid time off Provident Fund Schedule: Monday to Friday UK shift Supplemental Pay: Performance bonus Ability to commute/relocate: Kochi, Kerala: Reliably commute or planning to relocate before starting work (Required) License/Certification: Google Cloud Certification (Required) Willingness to travel: 100% (Required) Work Location: In person
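As a hedged illustration of "BigQuery solutions for optimized performance and cost efficiency" mentioned above, one common pattern is a date-partitioned, clustered table so queries scan less data. The sketch below uses the google-cloud-bigquery client; the project, table name, and schema are hypothetical.

```python
# Illustrative only: create a date-partitioned, clustered BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()

table = bigquery.Table(
    "my-project.warehouse.transactions",  # hypothetical table
    schema=[
        bigquery.SchemaField("txn_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("amount", "NUMERIC"),
        bigquery.SchemaField("txn_date", "DATE", mode="REQUIRED"),
    ],
)

# Partition by day on txn_date and cluster by customer_id so point lookups and
# per-customer aggregations prune most of the table.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="txn_date",
)
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```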
Posted 1 month ago
7.0 years
30 Lacs
Cochin
On-site
Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Must-have skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills. Job Type: Full-time Pay: Up to ₹3,000,000.00 per year Application Question(s): How many years of experience do you have in total? Do you have all skills as per the JD? Current CTC? Expected CTC? Location? Notice Period? Work Location: In person
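To illustrate the real-time integration side mentioned above (Cloud Pub/Sub feeding Dataflow or Cloud Functions), a minimal publishing sketch is shown below. The project, topic, and event fields are hypothetical assumptions, and the topic is assumed to exist.

```python
# A small sketch of the publishing side of a real-time integration.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "order-events")  # hypothetical

event = {"order_id": "A-1001", "amount": 249.50, "status": "CREATED"}

# Messages are bytes; attributes can carry routing metadata for downstream
# Dataflow or Cloud Functions consumers.
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source="order-service",
)
print("Published message id:", future.result())
```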
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

Basic Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study.
5+ years - Ability to work effectively across organizations, product teams, and business partners.
5+ years - Knowledge of Agile (Scrum) methodology and experience writing user stories.
5+ years - Strong understanding of database concepts and experience with multiple database technologies, optimizing query and data processing performance.
5+ years - Full-stack data engineering competency in a public cloud (Google, MS Azure, AWS); critical thinking skills to propose data solutions, test them, and make them a reality.
5+ years - Highly proficient in SQL, Python, Java, Scala, or Go (or similar); experience programming engineering transformations in Python or a similar language.
5+ years - Demonstrated ability to lead data engineering projects, design sessions, and deliverables to successful completion.
Cloud-native technologist with a deep understanding of data service ecosystems, including data warehousing, lakes, metadata, meshes, fabrics, and AI/ML use cases.
User experience advocacy through empathetic stakeholder relationships.
Effective communication both internally (with team members) and externally (with stakeholders).
Knowledge of data warehouse concepts and experience with data warehouse/ETL processes.
Strong process discipline and thorough understanding of IT processes (ISP, data security).

Responsibilities:
Interact with GDIA product lines and business partners to understand data engineering opportunities, tooling, and needs.
Collaborate with Data Engineering and Data Architecture to design and build templates, pipelines, and data products, including automation, transformation, and curation using best practices.
Develop custom cloud solutions and pipelines with GCP-native tools: Data Prep, Data Fusion, Dataflow, DBT, and BigQuery.
Operationalize and automate data best practices: quality, auditability, timeliness, and completeness.
Participate in design reviews to accelerate the business and ensure scalability.
Work with Data Engineering, Architecture, and Data Platform Engineering to implement strategic solutions.
Advise and direct team members and business partners on Ford standards and processes.

Preferred Qualifications:
Excellent communication, collaboration, and influence skills; ability to energize a team.
Knowledge of data, software, and architecture operations; data engineering and data management standards, governance, and quality.
Hands-on experience in Python using libraries like NumPy, Pandas, etc.
Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Data Fusion, Pub/Sub / Kafka, Looker Studio, Vertex AI.
Experience with Teradata, Hadoop, Hive, Spark, and other parts of the legacy data platform.
Experience with recoding, redeveloping, and optimizing data operations, data science, and analytical workflows and products.
Data governance concepts, including GDPR (General Data Protection Regulation), CCPA (California Consumer Privacy Act), PoLP, and how these can impact technical architecture.
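As a hedged illustration of the Python/pandas transformation work named in the qualifications (not part of the posting), the sketch below pulls a query result into a DataFrame, applies a simple curation step, and loads it back to a curated BigQuery table; the project, dataset, and column names are hypothetical.

```python
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

# Pull raw rows into a DataFrame (requires the pandas/db-dtypes extras).
raw = client.query(
    "SELECT vin, reading_ts, fuel_level FROM `my-gcp-project.raw.telemetry`"
).to_dataframe()

# Simple curation: drop incomplete rows and normalise the fuel reading to 0-1.
curated = raw.dropna(subset=["vin", "reading_ts"]).copy()
curated["fuel_level"] = curated["fuel_level"].clip(lower=0, upper=100) / 100.0

# Load the curated frame into a downstream table, replacing previous contents.
job = client.load_table_from_dataframe(
    curated,
    "my-gcp-project.curated.telemetry",
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE"),
)
job.result()  # wait for the load job to finish
```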
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description
UST is looking for a talented GCP Data Engineer with 5 to 10 years of experience to join our team and play a crucial role in designing and implementing efficient data solutions on the Google Cloud Platform (GCP). The ideal candidate should possess strong data engineering skills, expertise in GCP services, and proficiency in data processing technologies, particularly PySpark.

Responsibilities
Data Pipeline Development: Design, implement, and optimize end-to-end data pipelines on GCP, focusing on scalability and performance. Develop and maintain ETL workflows for seamless data processing.
GCP Cloud Expertise: Utilize GCP services such as BigQuery, Cloud Storage, and Dataflow for effective data engineering. Implement and manage data storage solutions on GCP.
Data Transformation with PySpark: Leverage PySpark for advanced data transformations, ensuring high-quality and well-structured output. Implement data cleansing, enrichment, and validation processes using PySpark.

Requirements
Proven experience as a Data Engineer, with a strong emphasis on GCP.
Proficiency in GCP services such as BigQuery, Cloud Storage, and Dataflow.
Expertise in PySpark for data processing and analytics is a must.
Experience with data modeling, ETL processes, and data warehousing.
Proficiency in programming languages such as Python, SQL, or Scala for data processing.
Relevant certifications in GCP or data engineering are a plus.

Skills: GCP, PySpark
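The following is a minimal sketch of the kind of PySpark cleansing and validation work described above (illustrative only, not taken from the posting); the bucket paths and column names are assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-cleanse-demo").getOrCreate()

# Hypothetical landing-zone path; the GCS connector is preconfigured on Dataproc.
raw = spark.read.json("gs://my-landing-bucket/customers/*.json")

cleansed = (
    raw
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    .withColumn("signup_date", F.to_date(F.col("signup_date"), "yyyy-MM-dd"))
    .dropDuplicates(["customer_id"])
)

# Basic validation: rows must carry a customer_id and a parseable signup_date.
valid = cleansed.filter(
    F.col("customer_id").isNotNull() & F.col("signup_date").isNotNull()
)
rejected = cleansed.subtract(valid)

# Valid rows feed the curated zone; rejects go to a quarantine path for review.
valid.write.mode("overwrite").parquet("gs://my-curated-bucket/customers/")
rejected.write.mode("overwrite").parquet("gs://my-quarantine-bucket/customers/")
```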
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
At Swarovski, where innovation meets inspiration, our people desire to explore, experience and create. We are looking for a Data Engineer where you will get a chance to work in a rewarding role within a diverse team that is pushing boundaries. Be part of a truly iconic global brand, learn and grow with us. We’re bold and inventive, revealing astonishing things like no one else can. A world of wonder awaits you.

About The Job
Maintain and support data transformation pipelines from the Landing Zone to the Consumption Layer.
Monitor the health and performance of data pipelines, infrastructure, and associated systems within GCP and SAP BW environments, ensuring seamless and efficient operations.
Identify, diagnose, and resolve issues or anomalies in real time, minimizing downtime and safeguarding data integrity.
Manage and resolve support tickets related to data engineering operations, ensuring timely and effective responses to stakeholder needs.
Collaborate with cross-functional teams to understand ongoing data requirements and provide necessary support for existing solutions, ensuring alignment with current business needs.
Act as the primary contact for troubleshooting and resolving data-related issues raised by other departments.
Regularly review the performance of existing data pipelines, identifying opportunities for improvement and implementing optimizations to enhance efficiency, performance, and scalability.
Provide ongoing support to data scientists and analysts, ensuring consistent access to reliable data and the necessary tools for analysis and reporting.
Support and maintain processes for integrating and transforming data within the Data Warehouse and across all Data Analytics platforms.

About You
We are looking for a unique and amazing talent, who brings along the following:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
5 years of hands-on experience in data engineering, building and maintaining data pipelines and systems
Strong proficiency in GCP (Google Cloud Platform), including services such as BigQuery, Dataflow, Dataform, Cloud Storage and Pub/Sub
Solid programming skills in languages like Python and SQL
Experience with data modeling, data warehousing, and ETL/ELT processes
Familiarity with data governance, data privacy, and security practices
Strong problem-solving and analytical skills, with the ability to handle complex data-related issues
Excellent communication and collaboration skills to work effectively with cross-functional teams
Self-motivated and able to work independently, as well as in a team-oriented environment
Deep understanding of SAP modules
Nice-to-have: Experience in SAP BW data engineering (SAP BW 3.x, 7.x, BW on HANA)

About Swarovski
Masters of Light Since 1895. Swarovski creates beautiful crystal-based products of impeccable quality and craftsmanship that bring joy and celebrate individuality. Founded in 1895 in Austria, the company designs, manufactures and sells the world's finest crystals, gemstones, Swarovski Created Diamonds and zirconia, jewelry, and accessories, as well as objects and home accessories. Swarovski Crystal Business has a global reach with approximately 2,400 stores and 6,700 points of sale in over 150 countries and employs more than 18,000 people. Together with its sister companies Swarovski Optik (optical devices) and Tyrolit (abrasives), Swarovski Crystal Business forms the Swarovski Group. A responsible relationship with people and the planet is part of Swarovski’s heritage.
Today this legacy is rooted in sustainability measures across the value chain, with an emphasis on circular innovation, championing diversity, inclusion and self-expression, and in the philanthropic work of the Swarovski Foundation, which supports charitable organizations bringing positive environmental and social impact.
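To illustrate the pipeline-monitoring duties in the listing above (an illustrative sketch only, not Swarovski's tooling), the snippet below checks how stale the latest load in a consumption-layer table is and flags an SLA breach; the table name and threshold are hypothetical.

```python
from datetime import datetime, timezone
from google.cloud import bigquery

FRESHNESS_SLA_HOURS = 6  # hypothetical SLA for the consumption layer

client = bigquery.Client(project="my-gcp-project")  # hypothetical project
row = next(iter(client.query(
    "SELECT MAX(load_ts) AS last_load "
    "FROM `my-gcp-project.consumption.sales_daily`"
).result()))

lag_hours = (datetime.now(timezone.utc) - row.last_load).total_seconds() / 3600
if lag_hours > FRESHNESS_SLA_HOURS:
    # In practice this would raise an alert (e.g., Cloud Monitoring or a ticket).
    print(f"STALE: last load {lag_hours:.1f}h ago exceeds {FRESHNESS_SLA_HOURS}h SLA")
else:
    print(f"OK: last load {lag_hours:.1f}h ago")
```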
Posted 1 month ago
7.0 years
0 Lacs
Kochi, Kerala, India
On-site
Company Description
Chadwick Professional Services specializes in IT Permanent and Temporary Staffing, Leadership/Executive Hiring, Recruitment Process Outsourcing (RPO), and Market Insight. Recognized by MSME, Start-up India, and Start-up Karnataka, Chadwick is dedicated to innovative solutions in the industry. Our goal is to help clients undertake crucial projects by providing and managing exceptional talent. We offer a spectrum of staffing solutions, from short-term to strategic, suitable for organizations across all sizes and locations. With a blend of advanced technology and an agile team of recruiters, we enable our clients to focus on maximizing business performance and productivity.

Role Description
We are seeking a full-time GCP Data Architect for an on-site role located in Kochi and Trivandrum. The GCP Data Architect will be responsible for designing, implementing, and managing data architecture solutions. Daily tasks include overseeing data governance, data modeling, ETL processes, and data warehousing. The role requires close collaboration with various teams to ensure that data architecture solutions meet the business's needs and strategic goals.

Qualifications
• 7+ years of experience in data architecture, with at least 3 years in GCP environments.
• Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
• Strong experience in data warehousing, data lakes, and real-time data pipelines.
• Proficiency in SQL, Python, or other data processing languages.
• Experience with cloud security, data governance, and compliance frameworks.
• Strong problem-solving skills and ability to architect solutions for complex data environments.
• Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred.
• Leadership experience and ability to mentor technical teams.
• Excellent communication and collaboration skills.
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Python, Spark, Data Engineer, Cloudera, On-premise, Azure, Snowflake, Kafka

Overview Of The Company
Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.

Team Overview
The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!

About The Role
Title: Lead Data Engineer
Location: Mumbai

Responsibilities
End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution.
Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.

Qualification Details
Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOPS and functional programming concepts.
Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data.
Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.

Desired Skills & Attributes
Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively.
Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
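As a rough sketch of the streaming pipelines this role describes (illustrative only, under assumed broker, topic, schema, and path names, not Jio's actual stack), the snippet below consumes a Kafka topic with Spark Structured Streaming and writes parsed events to Parquet.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Requires the spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("kafka-events-demo").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical brokers
    .option("subscribe", "transactions")                 # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/curated/transactions")
    .option("checkpointLocation", "/data/checkpoints/transactions")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```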
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Python, Apache Spark, Snowflake, Data Engineer, Spark, Kafka, Azure

Overview Of The Company
Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.

Team Overview
The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!

About The Role
Title: Lead Data Engineer
Location: Mumbai

Responsibilities
End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution.
Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.

Qualification Details
Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOPS and functional programming concepts.
Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data.
Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.

Desired Skills & Attributes
Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively.
Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
Posted 1 month ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools . Agile environments (e.g. Scrum, XP) Relational databases Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. 
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
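As a hedged sketch of the Composer/Airflow orchestration mentioned under "What could set you apart" in the listing above (illustrative only; assumes Airflow 2.4+ with the Google provider package installed, and hypothetical project, dataset, and table names), the DAG below runs a daily BigQuery transformation.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_reporting_summary",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Rebuild a small reporting table once per day from a raw events table.
    build_summary = BigQueryInsertJobOperator(
        task_id="build_summary",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.reporting.daily_summary` AS
                    SELECT DATE(event_ts) AS report_date, COUNT(*) AS record_count
                    FROM `my-project.raw.events`
                    GROUP BY report_date
                """,
                "useLegacySql": False,
            }
        },
    )
```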
Posted 1 month ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. How will you make an impact in this role? We are looking for energetic, high-performing and highly skilled Java Full Stack Engineers with some GCP Cloud experience to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio focused on delivering the next generation global marketing capabilities. This team is responsible for building products that power Merchant Offers personalization for Amex card members. Job Description: Demonstrated leadership in designing sustainable software products, setting development standards, automated code review process, continuous build and rigorous testing etc Ability to effectively lead and communicate across 3rd parties, technical and business product managers on solution design Primary focus is spent writing code, API specs, conducting code reviews & testing in ongoing sprints or doing proof of concepts/automation tools Applies visualization and other techniques to fast track concepts Functions as a core member of an Agile team driving User story analysis & elaboration, design and development of software applications, testing & builds automation tools Works on a specific platform/product or as part of a dynamic resource pool assigned to projects based on demand and business priority Identifies opportunities to adopt innovative technologies Design, develop, and deploy scalable applications on the Google Cloud Platform using Java 8 & above. In-depth understanding of Google Cloud Platform services and architecture. Proficiency in developing and deploying applications on GCP using services such as Compute Engine, App Engine, Cloud Functions, Big Query, Cloud SQL, DataProc, Dataflow etc. Experience with containerization technologies such as Docker and Kubernetes is a plus. Awareness of best practices for cloud-based application development, ensuring security, reliability, and efficiency. Troubleshoot and resolve issues related to application performance and functionality on cloud Qualification: A bachelor’s degree in computer science, computer engineering, other technical discipline, or equivalent work experience 7+ years of experience in software development; 3-5 years in leading teams of engineers Demonstrated experience with Agile or other rapid application development methods Demonstrated experience with object-oriented design and coding Full-Stack development exposure Demonstrated experience on these core technical skills (Mandatory) Core Java, Spring Framework, Spring boot, Java EE Relational Database (Postgres / MySQL / DB2 etc.) 
Data Serialization techniques (Ex: Avro) Cloud development (Micro-services) Parallel & distributed (multi-tiered) systems Application design, software development and automated testing Demonstrated experience on these additional technical skills (Nice to Have) Unix / Shell scripting GenNext languages: Python / Scala / Golang / Julia etc Message Queuing, Stream processing (Kafka) AJAX tools/ Frameworks. Web services, open API development, and REST concepts Experience with implementing integrated automated release management using tools/technologies/frameworks like Maven, Git, code/security review tools, Jenkins, Automated testing and Junit. We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 1 month ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk
Management Level: Senior Associate

Job Description & Summary
We are seeking a highly skilled SailPoint Developer. Candidates with 2-3 years of experience must be SailPoint certified; for candidates with more than 3 years of experience, SailPoint certification is not mandatory but good to have.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
We are seeking an experienced BigQuery Specialist with 3-4 years of hands-on experience in managing, analyzing, and integrating data within Google Cloud Platform. The ideal candidate will have a strong background in data exploration, ETL processes, and log management, along with proficiency in Python for data manipulation and exportation.

Responsibilities
BigQuery Setup and Modification:
Design, implement, and modify BigQuery datasets and tables.
Optimize BigQuery architecture for performance and cost-effectiveness.
Data Analysis:
Querying BigQuery: Writing and executing SQL queries to extract, transform, and load (ETL) data from BigQuery.
Data Exploration: Using SQL to explore datasets and identify patterns, trends, and anomalies.
Data Interpretation: Analyzing results and drawing meaningful conclusions to inform business decisions.
Data Visualization: Creating reports and dashboards using tools like Looker Studio, Google Sheets, or other BI tools to communicate insights.
Data Quality: Identifying and addressing data quality issues, ensuring data accuracy and consistency.
Log Ingestion and Modification:
Develop and manage processes for ingesting logs into BigQuery tables.
Ensure data accuracy and consistency through effective data transformation techniques.
Integration with Security Solutions:
Collaborate with security teams to integrate BigQuery with other security solutions.
Implement data governance and security best practices within BigQuery.
Licensing and Module Management:
Understand and manage GCP licensing and module configurations related to BigQuery.
Provide guidance on cost management and optimization strategies.
Data Modeling and Management:
Data Modeling: Understanding data structures and relationships within BigQuery datasets.
Data Pipelines: Developing and maintaining data pipelines for ingesting, transforming, and loading data into BigQuery.
Data Governance: Ensuring data security, privacy, and compliance with relevant regulations.
Data Exportation and Python Proficiency:
Export logs and data efficiently from BigQuery using Python.
Develop scripts and tools for data manipulation and automation tasks.
Optional Skills:
Data visualization skills (e.g., Tableau, Looker).
Basic programming skills (e.g., Python, R).
Experience with machine learning or predictive analytics.

Mandatory Skill Sets
3-4 years of experience working with Google BigQuery and GCP.
Strong proficiency in SQL and Python programming.
Experience with data integration, ETL processes, and data pipeline management.
Familiarity with security solutions and data governance practices.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.

Preferred Skill Sets
Experience with other GCP services such as Google Cloud Storage, Dataflow, or Pub/Sub.
Knowledge of machine learning concepts and BigQuery ML.
Certification in Google Cloud Platform (e.g., Professional Data Engineer).

Years of experience required: 3-7 years
Education Qualification: B.Tech/MCA/MBA with IT background, or a Bachelor's degree in Information Technology, Cybersecurity, or Computer Science
Degrees/Field of Study required: Bachelor of Engineering
Required Skills: Google BigQuery
Optional Skills: Accepting Feedback, Access Control Models, Access Control System, Access Management, Active Listening, Analytical Thinking, Authorization Compliance, Authorization Management Systems, Azure Active Directory, Cloud Identity and Access Management (IAM), Communication, Creativity, CyberArk Management, Cybersecurity, Embracing Change, Emotional Regulation, Empathy, Encryption Technologies, Federated Identity Management, ForgeRock Identity Platform, Identity and Access Management (IAM), Identity-Based Encryption, Identity Federation, Identity Governance Framework (IGF) {+ 22 more}
Travel Requirements: Not specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
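As a hedged illustration of the "export logs from BigQuery using Python" responsibility above (project, dataset, table, and bucket names are assumptions, not client specifics), this sketch materialises a filtered log table and exports it to compressed CSV shards in Cloud Storage.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

# Materialise the filtered logs into a dedicated table first.
client.query(
    """
    CREATE OR REPLACE TABLE `my-gcp-project.security.login_failures_7d` AS
    SELECT timestamp, principal_email, source_ip
    FROM `my-gcp-project.security.audit_logs`
    WHERE event_type = 'LOGIN_FAILURE'
      AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    """
).result()

# Export that table to GCS as gzipped CSV shards.
job_config = bigquery.ExtractJobConfig(
    destination_format="CSV", compression="GZIP", print_header=True
)
client.extract_table(
    "my-gcp-project.security.login_failures_7d",
    "gs://my-export-bucket/login_failures/part-*.csv.gz",
    job_config=job_config,
).result()
```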
Posted 1 month ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: GCP Data Engineer
Experience: 3-4 Years
Location: Gurgaon (On-site)
Shift: Rotational (Night shift allowance applicable)
Joiners: Immediate only

Role Overview:
We are looking for a skilled and motivated GCP Data Engineer with 3-4 years of experience to join our growing team in Gurgaon. This role involves working in rotational shifts, including night shifts, for which an additional allowance will be provided. The ideal candidate should have a strong command of Python, SQL, GCP, and BigQuery.

Key Responsibilities:
Develop, manage, and optimize data pipelines and workflows on Google Cloud Platform (GCP)
Write and optimize complex SQL queries to transform and extract data
Build scalable and efficient solutions using BigQuery
Collaborate with data scientists and business teams to understand requirements and deliver data-driven solutions
Monitor and troubleshoot production jobs and ensure data quality
Work in rotational shifts to support global data operations

Required Skills:
Proficient in Python for data manipulation and scripting
Strong in SQL: writing, debugging, and optimizing queries
Hands-on experience with Google Cloud Platform (GCP)
Expertise in BigQuery for data warehousing and analytics
Good understanding of data pipelines, ETL/ELT processes, and cloud data architecture

Nice to Have:
Experience working in a production environment with monitoring and alerting tools
Knowledge of GCP services like Dataflow, Cloud Storage, and Pub/Sub
Exposure to CI/CD pipelines and version control systems (e.g., Git)

Additional Information:
Location: Gurgaon (on-site role)
Shift: Rotational (night shift allowance will be provided)
Notice Period: Immediate joiners preferred
Posted 1 month ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What you’ll do? Design, develop, and operate high scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax Solutions. Manage sole project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of relevant software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI Cloud Certification strongly preferred
Posted 1 month ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Get to know Okta Okta is The World’s Identity Company. We free everyone to safely use any technology—anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box - we’re looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We’re building a world where Identity belongs to you. Job Description: Okta is looking for a Marketing Operations Manager to join the Marketing Data Operations and Technology team. Okta has ambitious growth plans and is striving for operational excellence across the underlying marketing technology infrastructure to support the wider go-to-market organization. This role will be instrumental in optimizing our marketing automation and CRM processes, ensuring data integrity, and driving lead lifecycle management operations. You will work closely with various teams across the organization (including Marketing Automation, Business Technology, and Digital Marketing) to maintain and optimize Okta’s Marketing Operations infrastructure. Our ideal candidate will be a creative problem-solver with a strong technical aptitude and a passion for leveraging our tools to help solve business problems. This role will be reporting into the Sr. Manager, Marketing Operations. Job Duties And Responsibilities : Oversee Marketo and Marketo integrations as a subject matter expert (including tool provisioning, supporting internal security initiatives, monitoring Marketo system health, and engaging with Marketo technical support teams). Develop and maintain comprehensive documentation including technical architecture, business requirements, and data dictionaries. Support and troubleshoot third party integration setup and dataflow into Marketo such as paid media, webinar platforms (On24, BrightTalk…etc.), content syndication vendors, and more. Lead tool technical support and drive tool optimization in Marketo integrated systems, including Splash, 6sense, Folloze, and more. Routinely audit, optimize, and manage the core operational processing programs and architecture within Marketo. Partner closely with the Marketing Automation & Analytics team to ensure that data collection and data transfer across Marketo and the Marketing Technology ecosystem support the marketing needs. Partner closely with Legal & Privacy to manage Marketo privacy and compliance programs through audits, updates, and documentation. Support extended teams like Marketing and Business Technology with Marketing Operations initiatives and inquiries. Skills & Experience: To successfully undertake this role, the individual will be expected to have: 7+ years relevant Marketing Operations experience at a fast-growing B2B SaaS company Required: 5+ years of experience of managing and working with Marketo at an admin capacity - Marketo Certified Expert preferred Required: Strong critical thinker and problem solver, with an eye for detail Required: Proven ability to collaborate effectively across functionally and geographically distributed teams Required: Experience of working with business stakeholders to understand existing workflows, business requirements and translate this into solution design and delivery. 
Required: Exceptional written and verbal communication skills with the ability to convey technical information in a clear and concise manner.
Preferred: Experience of working with Salesforce (preference for candidates who have worked directly with systems integrations with Salesforce). Candidates should be comfortable with the core Salesforce object models.
Preferred: Exposure to a range of Marketing Technology applications, for example:
Sales outreach platforms, such as Outreach / SalesLoft
ABM platforms, such as 6sense / Folloze
Event tech platforms, such as On24 / BrightTalk / Rainfocus
Data enrichment tools, such as Leadspace / Clay / ZoomInfo / Clearbit

This role requires in-person onboarding and travel to our Bengaluru, IN office during the first week of employment.

What you can look forward to as a Full-Time Okta employee!
Amazing Benefits
Making Social Impact
Developing Talent and Fostering Connection + Community at Okta

Okta cultivates a dynamic work environment, providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today! https://www.okta.com/company/careers/. Some roles may require travel to one of our office locations for in-person onboarding. Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding, please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/.
Posted 1 month ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Lead a team of Software Engineers to design, develop, and operate high scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity. What Experience You Need Bachelor's degree or equivalent experience 10+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs. What Could Set You Apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 1 month ago