4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. The ideal candidate is passionate about technology, dedicated to continuous learning, and committed to delivering exceptional customer experiences through client interactions.

Qualifications:
- Degree in BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or a related field.
- Expertise and hands-on experience in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud.

Job Description:
- Proficient in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures.
- Experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
- Familiarity with Cloud SQL/AlloyDB and tuning them for optimal performance.
- Working knowledge of BigQuery, Firestore, Memorystore, Spanner, and bare-metal setup for PostgreSQL.
- Experience with GCP Data Migration Service, MongoDB, Cloud Dataflow, disaster recovery, job scheduling, logging techniques, and OLTP/OLAP.
- Desirable: GCP Database Engineer Certification.

Roles & Responsibilities:
- Develop, test, and maintain data architectures.
- Migrate enterprise Oracle databases from on-premises to GCP cloud.
- Tune autovacuum in PostgreSQL (see the sketch below).
- Performance-tune PostgreSQL stored procedures and queries.
- Convert Oracle stored procedures and queries to PostgreSQL equivalents.
- Create a hybrid data store combining data warehouse and NoSQL GCP solutions with PostgreSQL.
- Migrate Oracle table data to AlloyDB.
- Lead the database team.

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization.

To apply, please share your resume at sonali.mangore@impetus.com with details of your current CTC, expected CTC, notice period, and last working day (LWD).
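As context for the autovacuum responsibility above, here is a minimal PostgreSQL sketch of per-table autovacuum tuning; the orders table and the threshold values are hypothetical and would need adjusting per workload.

-- Hypothetical table; lower the scale factor so a large, update-heavy
-- table is vacuumed after ~1% of rows change instead of the 20% default.
ALTER TABLE orders SET (
    autovacuum_vacuum_scale_factor = 0.01,
    autovacuum_vacuum_threshold = 1000,
    autovacuum_analyze_scale_factor = 0.02
);

-- Verify the effect: dead-tuple counts and last autovacuum times.
SELECT relname, n_dead_tup, last_autovacuum
FROM pg_stat_user_tables
ORDER BY n_dead_tup DESC;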
Posted 12 hours ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Wipro Limited is a leading technology services and consulting company dedicated to developing innovative solutions that cater to the most complex digital transformation needs of clients. Our comprehensive range of consulting, design, engineering, and operational capabilities enables us to assist clients in achieving their most ambitious goals and establishing sustainable, future-ready businesses. With a global presence of over 230,000 employees and business partners spanning 65 countries, we remain committed to supporting our customers, colleagues, and communities in navigating an ever-evolving world. We are currently seeking an individual with hands-on experience in data modeling for both OLTP and OLAP systems. The ideal candidate should possess a deep understanding of conceptual, logical, and physical data modeling, coupled with a robust grasp of indexing, partitioning, and data sharding, supported by practical experience. Experience in identifying and mitigating factors impacting database performance for near-real-time reporting and application interaction is essential. Proficiency in at least one data modeling tool, preferably DBSchema, is required. Additionally, functional knowledge of the mutual fund industry would be beneficial. Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery is preferred. The role requires working from our Chennai office, with mandatory on-site presence at the customer site five days per week. Cloud-PaaS-GCP-Google Cloud Platform is a mandatory skill set for this position. The successful candidate should have 5-8 years of relevant experience and should be prepared to contribute to the reimagining of Wipro as a modern digital transformation partner. We are looking for individuals who are inspired by reinvention: of themselves, their careers, and their skills. At Wipro, we encourage continuous evolution, reflecting our commitment to adapt to the changing world around us. Join us in a business driven by purpose, where you have the freedom to shape your own reinvention. Realize your ambitions at Wipro. We welcome applications from individuals with disabilities. For more information, please visit www.wipro.com.
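As an illustration of the indexing and partitioning skills this role calls for, here is a minimal PostgreSQL sketch; the mutual-fund-style table and columns are hypothetical.

-- Range-partition a fast-growing time-series table by date.
CREATE TABLE nav_history (
    fund_id   BIGINT        NOT NULL,
    nav_date  DATE          NOT NULL,
    nav_value NUMERIC(12,4) NOT NULL
) PARTITION BY RANGE (nav_date);

CREATE TABLE nav_history_2024 PARTITION OF nav_history
    FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');

-- Composite index supporting the common per-fund time-series lookup.
CREATE INDEX idx_nav_fund_date ON nav_history (fund_id, nav_date);

Partition pruning keeps near-real-time report queries scanning only the relevant date range instead of the full history.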
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
Maharashtra
On-site
As a Solutions Architect with over 7 years of experience, you will have the opportunity to leverage your expertise in cloud data solutions to architect scalable and modern solutions on AWS. In this role at Quantiphi, you will be a key member of our high-impact engineering teams, working closely with clients to solve complex data challenges and design cutting-edge data analytics solutions. Your responsibilities will include acting as a trusted advisor to clients, leading discovery/design workshops with global customers, and collaborating with AWS subject matter experts to develop compelling proposals and Statements of Work (SOWs). You will also represent Quantiphi in various forums such as tech talks, webinars, and client presentations, providing strategic insights and solutioning support during pre-sales activities. To excel in this role, you should have a strong background in AWS Data Services including DMS, SCT, Redshift, Glue, Lambda, EMR, and Kinesis. Your experience in data migration and modernization, particularly with Oracle, Teradata, and Netezza to AWS, will be crucial. Hands-on experience with ETL tools such as SSIS, Informatica, and Talend, as well as a solid understanding of OLTP/OLAP, Star and Snowflake schemas, and data modeling methodologies, are essential for success in this position. Additionally, familiarity with backend development using Python, APIs, and stream processing technologies like Kafka, along with knowledge of distributed computing concepts including Hadoop and MapReduce, will be beneficial. A DevOps mindset with experience in CI/CD practices and Infrastructure as Code is also desired. Joining Quantiphi as a Solutions Architect is more than just a job: it's an opportunity to shape digital transformation journeys and influence business strategies across various industries. If you are a cloud data enthusiast looking to make a significant impact in the field of data analytics, this role is perfect for you.
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have a minimum of 8 years of experience as a Power BI Developer, with 7 to 12 years of total experience. Your role will involve hands-on experience in handling teams and clients. You should possess expert knowledge of advanced calculations in MS Power BI Desktop, including DAX functions such as aggregate, date, logical, string, and table functions. Prior experience in connecting Power BI with both on-premise and cloud computing platforms is required. A deep understanding of, and the ability to utilize and explain, various aspects of relational database design, multidimensional database design, OLTP, OLAP, KPIs, scorecards, and dashboards is essential for this role. You should have a very good understanding of data modeling techniques for analytical data, including facts, dimensions, and measures. Experience in data warehouse design, specifically dimensional modeling, and data mining will be beneficial for this position. Additionally, hands-on experience in SSIS, SSRS, and SSAS will be considered a plus.
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will be responsible for creating and optimizing data models for both OLTP and OLAP systems, ensuring they are well-designed for performance and maintainability. Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, Cloud SQL, and AlloyDB. Additionally, you will be involved in supporting schema documentation, reverse engineering, and visualization tasks. Your must-have skills for this role include proficiency in using the DBSchema modeling tool, strong experience with GCP databases such as BigQuery, Cloud SQL, and AlloyDB, as well as knowledge of OLTP and OLAP system structures and performance tuning. It is essential to have expertise in SQL and schema evolution/versioning best practices. Preferred skills include experience integrating DBSchema with CI/CD pipelines and knowledge of real-time ingestion pipelines and federated schema design. As a Data Modeler, you should possess soft skills such as being detail-oriented, organized, and communicative. You should also feel comfortable presenting schema designs to cross-functional teams. By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.
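As a sketch of the schema evolution/versioning practice mentioned above, here is a backward-compatible, two-step migration; the Flyway-style file names, table, and column are hypothetical.

-- V2__add_risk_rating.sql: additive change, safe for running clients.
ALTER TABLE portfolios ADD COLUMN risk_rating TEXT;
UPDATE portfolios SET risk_rating = 'UNRATED' WHERE risk_rating IS NULL;

-- V3__enforce_risk_rating.sql: tighten the constraint only after all
-- writers populate the new column.
ALTER TABLE portfolios ALTER COLUMN risk_rating SET NOT NULL;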
Posted 5 days ago
10.0 - 15.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Staff Software Engineer specializing in Java at Walmart Global Tech in Chennai, you will play a crucial role in guiding the team in architectural decisions and best practices for building scalable applications. Your responsibilities will include driving design, development, implementation, and documentation of cutting-edge solutions that impact associates of Walmart globally. You will collaborate with engineering teams across different locations, engage with Product Management and Business to drive product agendas, and work closely with architects to ensure solutions meet Quality, Cost, and Delivery standards. With a Bachelor's/Master's degree in Computer Science or a related field and a minimum of 10 years of experience in software design, development, and automated deployments, you will bring valuable expertise to the team. Your prior experience in delivering highly scalable Java applications, strong system design skills, and proficiency in CS fundamentals, microservices, data structures, and algorithms will be essential for success in this role. You should have hands-on experience with CI/CD development environments and tools like Git, Maven, and Jenkins, as well as expertise in writing modular and testable code using frameworks such as JUnit and Mockito. Your experience in building Java-based backend systems, working with cloud-based solutions, and familiarity with technologies like Spring Boot, Kafka, and Spark will be crucial. Additionally, you should be well-versed in microservices architecture, distributed concepts, design patterns, and cloud-native development. Your experience with relational and NoSQL databases, caching technologies, event-based systems like Kafka, monitoring tools like Prometheus and Splunk, and containerization tools like Docker and Kubernetes will be highly valuable. At Walmart Global Tech, you will have the opportunity to work in an innovative environment where your contributions can impact millions of people. The company values diversity, inclusion, and belonging, and offers a flexible, hybrid work environment along with competitive compensation, benefits, and opportunities for personal and professional growth. As an Equal Opportunity Employer, Walmart fosters a workplace culture where every individual is respected and valued, contributing to a welcoming and inclusive environment for all associates, customers, and suppliers.
Posted 5 days ago
9.0 - 14.0 years
25 - 40 Lacs
Chennai
Work from Office
Role & responsibilities

We are seeking a Data Modeller with over 12 years of progressive experience in information technology, including a minimum of 4 years on data migration projects to the cloud (refactor, replatform, etc.) and 2 years of exposure to GCP.

Preferred candidate profile
- In-depth knowledge of Data Warehousing/Lakehouse architectures, Master Data Management, Data Quality Management, Data Integration, and Data Warehouse architecture.
- Work with the business intelligence team to gather requirements for database design and modelling.
- Understand the current on-premise DB model and refactor it for Google Cloud for better performance.
- Knowledge of ER modelling, big data, enterprise data, and physical data models; design and implement data structures to support business processes and analytics, ensuring efficient data storage, retrieval, and management.
- Create a logical data model and validate it to ensure it meets the demands of the business application and its users.
- Experience developing physical models for SQL, NoSQL, key-value, and document databases such as Oracle, BigQuery, Spanner, PostgreSQL, Firestore, and MongoDB (a BigQuery DDL sketch follows below).
- Understand the data needs of the company or client.
- Collaborate with the development team to design and build the database model for both application and data warehousing development.
- Classify business needs and build both microservices and reporting database models.
- Strong hands-on experience in SQL and database procedures.
- Work with the development team to develop and implement a phase-wise migration plan covering co-existence of on-prem and cloud databases.
- Help determine and manage data cleaning requirements.
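To illustrate the physical-model work for BigQuery described above, a short BigQuery Standard SQL sketch; the dataset, table, and columns are hypothetical.

-- Partitioning prunes scans by day; clustering co-locates rows for
-- per-customer queries.
CREATE TABLE analytics.fact_transactions (
    txn_id      STRING    NOT NULL,
    customer_id INT64     NOT NULL,
    txn_ts      TIMESTAMP NOT NULL,
    amount      NUMERIC   NOT NULL
)
PARTITION BY DATE(txn_ts)
CLUSTER BY customer_id;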
Posted 6 days ago
5.0 - 10.0 years
1 - 6 Lacs
Chennai
Work from Office
- Strong knowledge of Linux internals (preferably RHEL/Ubuntu)
- Essential knowledge of Windows internals
- Comprehensive understanding of DevOps/SRE, IaC, and 12-Factor principles
- Excellent hands-on experience with configuration management, orchestration, and IaC tools (Ansible, Jenkins, Terraform)
- Strong understanding of virtualization technologies (KVM, libvirt, oVirt, KubeVirt, OVM, OpenStack)
- Strong understanding of software-defined storage technologies (Ceph, GlusterFS)
- Strong understanding of repository and artifact management tools (Red Hat Satellite, Spacewalk, Nexus)
- Strong understanding of container technologies (Docker, Kubernetes, OpenShift)
- Strong understanding of ELK and its Beats (Auditbeat, Filebeat)
- Strong understanding of OS compliance policies (CIS Benchmark)
- Agile methodologies and their ceremonies
- Architect, write, and implement software that improves the stability, scalability, and availability of products
- Own multiple services and have the autonomy to do what suits the business and our customers in IT
- Solve occurring problems and create solutions and automation to prevent them from happening again
- Plan for reliability so systems work across multiple datacenters/environments and handle outages
- Conceptual understanding of infrastructure and how it works: DNS (authoritative and non-authoritative DNS, dynamic and BIND DNS, forwarders), SSL communication (SSL handshake, cipher suites, encryption algorithms), Active Directory (security OUs, policies)
Posted 6 days ago
5.0 - 7.0 years
16 - 20 Lacs
Hyderabad
Work from Office
Ready to shape the future? We are a global technology-driven organization committed to innovation, data intelligence, and scalable digital transformation. Our mission is to empower businesses and communities through cutting-edge cloud and AI solutions. We believe in fostering a culture of collaboration, continuous learning, and impactful change.

Job Description
As a Solution Architect, you will design and govern modern, cloud-native, AI-enabled solutions. Collaborating with global teams, you'll review solution designs, identify architectural risks, and ensure alignment with enterprise standards, shaping the future technology landscape.

Key Responsibilities
You will be #LI-hybrid based in Hyderabad, reporting to the Head of Architecture.
- Lead architectural reviews and provide feedback on designs and non-functional requirements.
- Ensure architectural alignment and promote reuse and quality practices across global teams.
- Contribute to the UKI Batch Strategy and advocate for the UKI Batch Platform.
- Design scalable and secure AWS-based solutions.
- Guide teams in applying the AWS Well-Architected Framework and best practices.
- Maintain architectural documentation (blueprints, patterns, specs).
- Lead migration to the UKI Batch Platform and decommission legacy systems.
- Collaborate with product teams to gather requirements and shape the platform backlog.
- Help define and evolve the UKI Batch Platform roadmap.
- Stay updated on the latest technologies and best practices.
- Mentor engineering teams and promote architectural excellence.
- Provide leadership in integrating AI and Large Language Models (LLMs).

About Experian
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics, and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com

Experience and Skills
- Minimum 5 years of experience as a Solution Architect in enterprise solution design.
- Bachelor's degree in Computer Science, IT, or a related field.
- Strong hands-on expertise with core AWS services (compute, storage, databases), container orchestration, serverless, data streaming, ETL, monitoring, performance tuning, and infrastructure as code.
- Experience with cloud-native architectures, including OLTP, event-driven, and streaming workloads.
- Proficient in microservices, containers, serverless, and cloud architecture patterns.
- Experience with AI/ML frameworks and Large Language Models (LLMs).
- Familiarity with architectural frameworks and the AWS Well-Architected Framework.
- Knowledge of Agile methodologies and CI/CD practices.
- Proficient in C#, Java, and Python.

Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces 2024 (Fortune Global Top 25), Great Place To Work in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social media or our Careers Site and Glassdoor to understand why.

Benefits
Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavour, we offer best-in-class family well-being benefits, enhanced medical benefits, and paid time off.

Experian Careers - Creating a better tomorrow together. Find out what it's like to work for Experian by clicking here.
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Staff Software Engineer (Java) at Walmart Global Tech in Chennai, you will play a crucial role in guiding the team in making architectural decisions and implementing best practices for building scalable applications. Your responsibilities will include driving design, development, and documentation, as well as building, testing, and deploying cutting-edge solutions that impact associates of Walmart worldwide. You will collaborate with Walmart engineering teams globally, engage with Product Management and Business to drive the agenda, and work closely with Architects and cross-functional teams to deliver solutions meeting Quality, Cost, and Delivery standards. To excel in this role, you should have a Bachelor's/Master's degree in Computer Science or a related field with a minimum of 10 years of experience in software design, development, and automated deployments. Your expertise should include delivering highly scalable Java applications, strong system design skills, knowledge of CS fundamentals, microservices, data structures, and algorithms, and proficiency in writing modular and testable code. Experience with Java, Spring Boot, Kafka, and Spark, as well as working in cloud-based solutions, is essential. You should also have a good understanding of microservices architecture, distributed concepts, design principles, and cloud-native development. Additionally, your skills should encompass working with relational and NoSQL databases, caching technologies, event-based systems like Kafka, and monitoring tools like Prometheus and Splunk. Experience with containerization tools such as Docker, Helm, and Kubernetes, as well as knowledge of public cloud platforms like Azure and GCP, will be advantageous in this role. At Walmart Global Tech, you will work in an innovative environment where your contributions can impact millions of people. You will have the opportunity to grow your career, gain new skills, and collaborate with experts in the field. The company offers a flexible, hybrid work model, competitive compensation, and a range of benefits including maternity and parental leave, health benefits, and more. Walmart is committed to creating a culture of belonging where every associate feels valued and respected, fostering inclusivity and diversity across its global team. Join Walmart Global Tech to be part of a team that is shaping the future of retail, innovating at scale, and making a positive impact on the world.
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
- Hands-on experience in data modelling for both OLTP and OLAP systems.
- In-depth knowledge of conceptual, logical, and physical data modelling.
- Strong understanding of indexing, partitioning, and data sharding, with practical experience (see the index sketch below).
- Experience identifying and addressing factors affecting database performance for near-real-time reporting and application interaction.
- Proficiency with at least one data modelling tool (preferably DBSchema).
- Functional knowledge of the mutual fund industry is a plus.
- Familiarity with GCP databases like AlloyDB, Cloud SQL, and BigQuery.
- Willingness to work on-site at the Chennai customer location five days a week (office presence is mandatory).

Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform
Experience: 5-8 Years
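A minimal PostgreSQL sketch of the indexing techniques referenced above; the orders table and predicate are hypothetical, and INCLUDE requires PostgreSQL 11+.

-- A covering (INCLUDE) index lets the report query be answered from the
-- index alone; the partial WHERE clause keeps the index small and hot.
CREATE INDEX idx_orders_open_recent
    ON orders (customer_id, created_at DESC)
    INCLUDE (status, total_amount)
    WHERE status <> 'CLOSED';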
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. We are looking for an individual who is enthusiastic about technology, committed to continuous learning, and approaches every client interaction as an opportunity to deliver exceptional customer service.

Qualifications:
- BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or any related degree
- Proficiency in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud

Key Responsibilities:
- Proficient in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures
- Experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL (a conversion sketch follows below)
- Expertise in working with Cloud SQL/AlloyDB, tuning AlloyDB/PostgreSQL for enhanced performance, and utilizing BigQuery, Firestore, Memorystore, Spanner, and bare-metal setups
- Familiarity with GCP Data Migration Service, MongoDB, Cloud Dataflow, database disaster recovery, job scheduling, logging techniques, and OLTP/OLAP
- Desirable: GCP Database Engineer Certification

Additional Responsibilities:
- Develop, test, and maintain data architectures
- Migrate enterprise Oracle databases from on-premises to GCP cloud, with a focus on autovacuum in PostgreSQL
- Performance tuning of PostgreSQL stored procedure code and queries
- Convert Oracle stored procedures and queries to PostgreSQL equivalents
- Create hybrid data stores integrating data warehouse and NoSQL GCP solutions with PostgreSQL
- Lead the database team

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization

If you meet the requirements and are interested in this position, kindly share your resume with details including CTC, expected CTC, notice period, and last working day (LWD) at sonali.mangore@impetus.com.
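As a small illustration of the Oracle-to-PostgreSQL conversion work listed above, a hedged sketch; the accounts table and get_balance function are hypothetical.

-- Oracle original (PL/SQL):
--   CREATE OR REPLACE FUNCTION get_balance(p_id IN NUMBER) RETURN NUMBER IS
--     v_bal NUMBER;
--   BEGIN
--     SELECT balance INTO v_bal FROM accounts WHERE id = p_id;
--     RETURN NVL(v_bal, 0);
--   END;

-- PostgreSQL equivalent (PL/pgSQL): NUMBER maps to NUMERIC, NVL to COALESCE.
CREATE OR REPLACE FUNCTION get_balance(p_id NUMERIC)
RETURNS NUMERIC
LANGUAGE plpgsql AS $$
DECLARE
    v_bal NUMERIC;
BEGIN
    SELECT balance INTO v_bal FROM accounts WHERE id = p_id;
    RETURN COALESCE(v_bal, 0);
END;
$$;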
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Job Overview
We are on the cusp of a major transformation in our data capabilities, and we are looking for a hands-on Data Platform Engineer to help lead the way. Working alongside a Staff Data Architect and Engineer, you will play a key role in designing and setting up the next generation of our lakehouse and analytics platform: a foundation that will directly drive the next phase of our company's growth and innovation. This is a unique opportunity to build an entire modern data stack from the ground up, combining cutting-edge open-source tools and cloud-native infrastructure.

Key Responsibilities
- Hands-on Architecture and Development: Define, design, prototype, and deliver a robust lakehouse and analytics platform leveraging best-in-class technologies.
- Build the Core Platform: Collaborate with the Staff Data Architect to set up core infrastructure spanning ingestion (ETL/ELT), warehouse, transformation, orchestration, and BI/analytics layers.
- Leverage the Modern Data Stack: Work across an environment of GCP-based compute instances and containers (Airbyte, ClickHouse, dbt, metriql, Prefect, Metabase, Next.js frontend), bringing cutting-edge engineering principles to life (a transformation-layer sketch follows below).
- Optimize for Scale and Security: Implement warehouse best practices, optimize performance for large-scale analytics, and enforce role- and attribute-based access controls.
- Data Modeling Excellence: Design flexible and scalable models to support real-time analytics, complex business queries, and long-term historical analysis.
- Innovation and Automation: Drive automation wherever possible in data ingestion, validation, transformation, and reporting processes.
- Cross-functional Collaboration: Work closely with Product, Engineering, and Leadership teams to ensure alignment with broader company objectives and business strategies.
- Mentor and Uplift: Provide technical mentorship to junior engineers and set high standards for operational excellence across the data organization.
- Drive Success Metrics: Align milestone progress with leadership goals around platform adoption, scalability, reliability, and impact on business KPIs.

Skills Required
- 5+ years of experience designing, building, and scaling modern data platforms.
- Proven hands-on expertise with ETL/ELT pipelines, cloud-based warehouses (SQL, NoSQL, MPP, OLAP/OLTP), and advanced database optimization.
- Strong SQL and Python skills for data engineering and automation.
- Familiarity with setting up containerized data services (Docker/Kubernetes environments).
- Deep understanding of at least one cloud platform: GCP, AWS, or Azure.
- Exposure to analytics workflows and BI environments.
- Strong experience working with Git, Jira, Confluence, Jenkins, and modern DevOps pipelines.
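A minimal sketch of what a model in the dbt transformation layer mentioned above might look like; the model and the stg_events staging table are hypothetical.

-- models/marts/daily_active_users.sql
-- dbt resolves ref() to the concrete warehouse relation and manages the DDL.
SELECT
    event_date,
    COUNT(DISTINCT user_id) AS daily_active_users
FROM {{ ref('stg_events') }}
GROUP BY event_date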
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Snowflake Schema
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are functioning optimally and meeting the needs of stakeholders. Your role will require you to stay updated on industry trends and best practices to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Good-To-Have Skills: Experience with Snowflake Schema.
- Strong understanding of data warehousing concepts and best practices.
- Experience with SQL and data modeling techniques.
- Familiarity with cloud-based data solutions and integration methods.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google Dataproc, Apache Spark
Good-to-have skills: Apache Airflow
Minimum 5 years of experience is required.
Educational Qualification: minimum 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google Dataproc. Your typical day will involve working with Apache Spark and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google Dataproc.
- Collaborate with cross-functional teams to deliver impactful data-driven solutions.
- Utilize Apache Spark for data processing and analysis.
- Develop and maintain technical documentation for applications.

Professional & Technical Skills:
- Strong experience in Apache Spark and Java for Spark.
- Strong experience with multiple database models (SQL, NoSQL, OLTP, and OLAP).
- Strong experience with data streaming architecture (Kafka, Spark, Airflow).
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc, and other cloud-native offerings.
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible, etc.).
- Experience pulling data from a variety of data source types, including mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, time-series).
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google Cloud Platform (GCP).
- Comfortable communicating with various stakeholders (technical and non-technical).
- GCP Data Engineer Certification is nice to have.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Dataproc and Apache Spark.
- The ideal candidate will possess a strong educational background in software engineering or a related field.
- This position is based at our Mumbai office.
Posted 1 week ago
5.0 - 10.0 years
7 - 11 Lacs
Noida, Chandigarh, Hyderabad
Work from Office
- Design, develop, and execute comprehensive ETL test cases and scenarios to validate data transformation, migration, and integration processes.
- Perform data validation between source and target systems to ensure data accuracy and completeness (a reconciliation sketch follows below).
- Verify ETL workflows, transformations, mappings, and scripts for accuracy and performance.
- Strong hands-on experience in ETL tools such as Informatica, Talend, SSIS, or DataStage.
- Proficiency in SQL and database technologies (e.g., Oracle, SQL Server, MySQL).
- Familiarity with data warehousing concepts, data modelling, and OLAP/OLTP systems.
- Identify, document, and track defects in the ETL processes using defect management tools.
- Work closely with development and data engineering teams to resolve issues promptly.
- Collaborate with business analysts, developers, and stakeholders to gather and understand requirements.
- Support UAT and end-to-end testing efforts by coordinating with cross-functional teams.
- Generate and maintain detailed testing reports, including defect logs, execution results, and performance metrics.
- Communicate testing progress, risks, and outcomes effectively to stakeholders.
- Experience with test management and defect tracking tools like JIRA, HP ALM, or equivalent.
- Exposure to automation tools/frameworks for ETL testing is a plus.
- Ability to analyse and troubleshoot performance issues in ETL processes.
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
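To illustrate the source-to-target validation described above, a minimal SQL reconciliation sketch; the schema and table names are hypothetical, and the email normalization is just an example transformation rule.

-- Row-count reconciliation between source and target.
SELECT 'source' AS side, COUNT(*) AS row_cnt FROM src.customers
UNION ALL
SELECT 'target', COUNT(*) FROM tgt.customers;

-- Field-level drift: rows missing from the target, or where the target
-- value does not match the expected transformation of the source value.
SELECT s.customer_id
FROM src.customers s
LEFT JOIN tgt.customers t ON t.customer_id = s.customer_id
WHERE t.customer_id IS NULL
   OR COALESCE(t.email, '') <> COALESCE(LOWER(TRIM(s.email)), '');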
Posted 1 week ago
5.0 - 12.0 years
0 Lacs
Karnataka
On-site
Job Description:
As an Engineering Manager, you will lead a high-performing team of 8-12 engineers and engineering leads in the end-to-end delivery of software applications through sophisticated CI/CD pipelines. Your role involves mentoring engineers to build scalable, resilient, and robust cloud-based solutions for Walmart's suite of products, contributing to quality and agility. Within Enterprise Business Services, the Risk Tech/Financial Services Compliance team focuses on designing, developing, and operating large-scale data systems and real-time applications. The team works on creating pipelines, aggregating data on Google Cloud Platform, and collaborating with various teams to provide technical solutions.

Key Responsibilities:
- Manage a team of engineers and engineering leads across multiple technology stacks, including Java, NodeJS, and Spark with Scala on GCP.
- Drive design, development, and documentation processes.
- Establish best engineering and operational practices based on product and scrum metrics.
- Interact with Walmart engineering teams globally, contribute to the tech community, and collaborate with product and business stakeholders.
- Work with senior leadership to plan the future roadmap of products, participate in hiring and mentoring, and lead technical vision and roadmap development.
- Prioritize feature development aligned with strategic objectives, establish clear expectations with team members, and engage in organizational events.
- Collaborate with business owners and technical teams globally, and develop a mid-term technical vision and roadmap to meet future requirements.

Qualifications:
- Bachelor's/Master's degree in Computer Science or a related field with a minimum of 12 years of software development experience and at least 5 years of managing engineering teams.
- Experience in managing agile technology teams, building Java and Scala-Spark backend systems, and working in cloud-based solutions.
- Proficiency in JavaScript, NodeJS, ReactJS, NextJS, CS fundamentals, microservices, data structures, and algorithms.
- Strong skills in CI/CD development environments/tools, writing modular and testable code, microservices architecture, and working with relational and NoSQL databases.
- Hands-on experience with technologies like Spring Boot, concurrency, RESTful services, and cloud platforms such as Azure and GCP.
- Knowledge of containerization tools like Docker and Kubernetes, and monitoring/alerting tools like Prometheus and Splunk.
- Ability to lead a team, contribute to technical design, and collaborate across geographies.

About Walmart Global Tech:
Walmart Global Tech is a team of software engineers, data scientists, and service professionals at the forefront of retail disruption. We innovate to impact millions and reimagine the future of retail, offering opportunities for personal growth, skill development, and innovation at scale.

Flexible Work Approach:
Our hybrid work model combines in-office and virtual presence, ensuring collaboration, flexibility, and personal development opportunities across our global team.

Benefits:
In addition to competitive compensation, we offer incentive awards, best-in-class benefits, maternity/paternal leave, health benefits, and more.

Equal Opportunity Employer:
Walmart, Inc. is committed to diversity, inclusivity, and valuing unique identities, experiences, and opinions. We strive to create an inclusive environment where all individuals are respected and valued.

Minimum Qualifications:
- Bachelor's degree in computer science or a related field with 5 years of experience in software engineering, or 7 years of experience in software engineering with 2 years of supervisory experience.

Preferred Qualifications:
- Master's degree in computer science or a related field with 3 years of experience in software engineering.

Location: Pardhanani Wilshire II, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-1998235
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
Position Type: Full time
Type of Hire: Experienced (relevant combination of work and education)
Education Desired: Bachelor of Computer Engineering
Travel Percentage: 0%

Senior Lead Analyst - Implementation Conversion

As the world works and lives faster, FIS is leading the way. Our fintech solutions touch nearly every market, company, and person on the planet. Our teams are inclusive and diverse. Our colleagues work together and celebrate together. If you want to advance the world of fintech, we'd like to ask you: Are you FIS?

About the role:
STS is a major group in capital markets which comprises various teams such as test automation, data migration, and business analysis. This role is for a Data Migration and Integration Engineer in the Data Migration team, performing the duties listed below.

About the team:
The Data Migration team focuses on migration of client data from legacy/3rd-party systems into FIS products. The team consists of specialists in Adeptia, PL/SQL, and SSIS with solid domain knowledge of lending, risk, and treasury products.

What you will be doing:
As an Implementation Conversion Analyst, you will work on client or internal engagements with full responsibility for completing the data migration/conversion/integration within the prescribed time and with the highest quality. You will also be responsible for clear and timely communication on the projects you work on, escalating problems promptly and driving them to closure with dedicated effort.

What you will need:
- Graduate in Computer Science or Computer Applications, such as B.Sc./B.C.A./B.Tech./B.E./M.C.A.
- 8-10 years of overall experience, with major experience in ETL.
- 5-6 years of experience with an ETL tool, preferably in COBOL.
- 5-6 years of experience writing PL/SQL or T-SQL programs (stored procedures, functions, and packages) and queries on Oracle/SQL Server databases.
- Strong knowledge of RDBMS concepts and OLTP system architecture.
- Competent with other analytical programs like Power BI and Crystal Reports/SSRS, with the desire and ability to understand new software applications.
- Experience reviewing query performance and optimizing/developing more efficient code (see the sketch below).
- Experience with creating table indexes to improve database performance.
- Experience writing complex operations, views, stored procedures, triggers, and functions to support business needs in a high-availability environment.
- Strong knowledge of source code control mechanisms in any tool; knowledge of Git/Bitbucket is an added advantage.
- Strong knowledge of XML and JSON structures, and Jenkins.
- Experience with job scheduling and working knowledge of at least one 3rd-party scheduler.
- Experience utilizing SOAP and REST to access web services.
- Ability to analyze and solve problems using learned techniques and tools.
- Ability to translate client requirements into technical specifications and participate in migration and conversion strategy.
- Excellent written and verbal communication skills.
- Excellent interpersonal skills; comfortable establishing professional relationships, especially remotely (electronic, phone, written).
- Proven ability to plan and execute effectively to meet critical time-sensitive objectives.
- Ability to work effectively alone and independently.
- Experience in either the banking or financial industry is preferred.
- Good mentorship skills.
- Ability to deliver effectively in high-pressure situations.

Added bonus if you have:
- Hands-on experience with languages (Python/Java/C#).

What we offer you:
At FIS, you can learn, grow, and make an impact in your career.
- Extensive Health Benefits Program along with the Family Protection Plan
- Best-in-class career mobility options across the globe
- Award-winning learning offerings for career development
- Adaptable home-office work model
- Opportunity to work with global teams and clients

Privacy Statement
FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.

Sourcing Model
Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass
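A small sketch of the query-performance review and indexing work mentioned above, using PostgreSQL syntax (T-SQL would use SET STATISTICS IO/TIME or the execution plan instead); the staging table and column are hypothetical.

-- Inspect the plan of a slow conversion query.
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM staging.loans WHERE contract_ref = 'ABC123';

-- A sequential scan on a large staging table usually points to a missing index.
CREATE INDEX idx_loans_contract_ref ON staging.loans (contract_ref);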
Posted 1 week ago
15.0 - 20.0 years
50 - 55 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
About Netskope
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

About the team:
We are a distributed team of passionate engineers dedicated to continuous learning and building impactful products. Our core product is built from the ground up in Scala, leveraging the Lightbend stack (Play/Akka).

About Data Security Posture Management (DSPM)
DSPM is designed to provide comprehensive data visibility and contextual security for the modern AI-driven enterprise. Our platform automatically discovers, classifies, and secures sensitive data across diverse environments including AWS, Azure, Google Cloud, Oracle Cloud, and on-premise infrastructure. DSPM is critical in empowering CISOs and security teams to enforce secure AI practices, reduce compliance risk, and maintain continuous data protection at scale. As a key member of the DSPM team, you will contribute to developing innovative and scalable systems designed to protect the exponentially increasing volume of enterprise data. Our platform continuously maps sensitive data across all major cloud providers and on-prem environments, automatically detecting and classifying sensitive and regulated data types such as PII, PHI, financial, and healthcare information. It flags data at risk of exposure, exfiltration, or misuse and helps prioritize issues based on sensitivity, exposure, and business impact.

What you'll do:
- Drive the enhancement of Data Security Posture Management (DSPM) capabilities by enabling the detection of sensitive or risky data utilized in (but not limited to) training private LLMs or accessed by public LLMs.
- Improve the DSPM platform to extend product support to all major cloud infrastructures, on-prem deployments, and upcoming technologies.
- Provide technical leadership in all phases of a project, from discovery and planning through implementation and delivery.
- Contribute to product development: understand customer requirements and work with the product team to design, plan, and implement features.
- Support customers by investigating and fixing production issues.
- Help us improve our processes and make the right tradeoffs between agility, cost, and reliability.
- Collaborate with teams across geographies.

What you bring to the table:
- 15+ years of software development experience with enterprise-grade software.
- Experience building scalable, high-performance cloud services.
- Expert coding skills in Scala or Java.
- Development on cloud platforms including AWS.
- Deep knowledge of databases and data warehouses (OLTP, OLAP).
- Analytical and troubleshooting skills.
- Experience working with Docker and Kubernetes.
- Able to multitask and wear many hats in a fast-paced environment: this week you might lead the design of a new feature; next week you are fixing a critical bug or improving our CI infrastructure.
- Cybersecurity experience and adversarial thinking is a plus.
- Expertise in building REST APIs.
- Experience leveraging cloud-based AI services (e.g., AWS Bedrock, SageMaker), vector databases, and Retrieval Augmented Generation (RAG) architectures is a plus.

Education:
Bachelor's degree in Computer Science; Master's degree strongly preferred. #LI-SK3
Posted 1 week ago
5.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams.
- Support key business operations: this involves supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; implement sprint planning and execution to ensure timely deliveries.

Technical skills, qualification and experience required:
- Proficient in data modelling, with 5-10 years of experience in data modelling.
- Experience with data modelling tools (Erwin) and building ER diagrams.
- Hands-on experience with Erwin/Visio.
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools like Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
- Strong results-orientation and time management.
- True team player who is comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.

* Immediate joiners will be preferred; outstation candidates will not be considered.
Posted 2 weeks ago
7.0 - 11.0 years
20 - 25 Lacs
Noida, Kolkata, Pune
Work from Office
- Proficient in application, data, and infrastructure architecture disciplines.
- Advanced knowledge of architecture, design, and business processes.
- Hands-on experience with AWS.
- Proficiency in modern programming languages such as Python and Scala.
- Expertise in Big Data technologies like Hadoop, Spark, and PySpark.
- Experience with deployment tools for CI/CD, such as Jenkins.
- Design and develop integration solutions involving Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions.
- Apply system development lifecycle methodologies, such as Waterfall and Agile.
- Understand and implement data architecture and modeling practices, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling (a star-schema sketch follows below).
- Utilize knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and Big Data solutions.
- Work collaboratively in teams to develop meaningful relationships and achieve common goals.
- Strong analytical skills with deep expertise in SQL.
- Solid understanding of Big Data concepts, particularly with Spark and PySpark/Scala.
- Experience with CI/CD using Jenkins.
- Familiarity with NoSQL databases.
- Excellent communication skills.
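A minimal star-schema sketch illustrating the dimensional modeling practice named above; the tables and columns are hypothetical.

-- One fact table keyed to conformed dimensions.
CREATE TABLE dim_date (
    date_key   INT PRIMARY KEY,  -- e.g. 20240115
    full_date  DATE NOT NULL,
    month_name VARCHAR(10) NOT NULL
);

CREATE TABLE dim_product (
    product_key  INT PRIMARY KEY,
    product_name VARCHAR(100) NOT NULL
);

CREATE TABLE fact_sales (
    date_key    INT NOT NULL REFERENCES dim_date (date_key),
    product_key INT NOT NULL REFERENCES dim_product (product_key),
    quantity    INT NOT NULL,
    net_amount  DECIMAL(12,2) NOT NULL
);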
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Noida
Work from Office
Sr. MongoDB Administrator

Job Description:
- Optimize performance through indexing, query tuning, and resource allocation.
- Conduct benchmarking and stress testing to evaluate database performance and stability under load.
- Perform real-time monitoring and health checks for all databases.
- Create, review, and optimize complex NoSQL queries.
- Implement and oversee replication, sharding, and backup drills.
- Develop and maintain disaster recovery plans with regular testing.
- Lead database migration projects with minimal downtime.
- Design and manage database architecture for OLTP and OLAP systems.
- Manage integration with Big Data systems, data lakes, data marts, and data warehouses.
- Administer databases on Linux environments and AWS cloud (RDS, EC2, S3, etc.).
- Use Python scripting for automation and custom DBA tools.
- Collaborate with DevOps, engineering, and analytics teams.

Experience Range: 3 - 6 years
Educational Qualifications: Any graduation
Skills Required: MongoDB, AWS, Big Data
Posted 2 weeks ago
3.0 - 7.0 years
37 - 45 Lacs
Bengaluru
Work from Office
About Netskope
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

About the role
Please note, this team is hiring across all levels; candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization, and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. As part of the Digital Experience Management team, you will work on state-of-the-art, cloud-scale distributed systems at the intersection of networking, cloud security, and big data. You will be part of designing and building systems that provide critical infrastructure for global Fortune 100 companies.

What you will be doing
- Designing and implementing planet-scale distributed data platforms, services, and frameworks, including solutions to address high-volume and complex data collection, processing, transformation, and analytical reporting
- Working with the application development team to implement data strategies, build data flows, and develop conceptual data models
- Understanding and translating business requirements into data models supporting long-term solutions
- Analyzing data system integration challenges and proposing optimized solutions
- Researching effective data designs, new tools, and methodologies for data analysis
- Providing guidance and expertise to other developers on the effective implementation of data models, and building high-throughput data access services
- Providing technical leadership in all phases of a project, from discovery and planning through implementation and delivery

Required skills and experience
- 3 to 7 years of experience in designing and coding scalable distributed systems
- The ability to conceptualize and articulate ideas clearly and concisely
- Experience in building big data pipelines that manage petabytes of data and ingest billions of data items every day
- Experience with big data analytics
- Excellent algorithm, data structure, and coding skills with Python, Go, C++, and/or Rust
- Proficiency in networking protocols and network security such as TCP/IP, TLS, IPSec/GRE, PKI, etc.
- Experience coding with OSS like Kafka, Redis, and ClickHouse, as well as cloud infrastructure, ideally GCP
- Excellent written and verbal communication skills
- Bonus points for contributions to the open source community

Education
BSCS or equivalent required; MSCS or equivalent strongly preferred. #LI-JB3
Posted 2 weeks ago
1.0 - 3.0 years
4 - 9 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability using Python and open-source technologies.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and regions.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Test databases and perform bug fixes.
- Develop best practices for database design and development activities.
- Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase robustness and maintainability of the code.
- Take on technical leadership responsibilities for database projects across various scrum teams.
- Manage exploratory data analysis to support database and dashboard development.

Required Skills:
- Expert knowledge of databases like PostgreSQL (preferably cloud-hosted in one or more offerings like AWS, Azure, GCP) and any cloud-based data warehouse (like Snowflake, Azure Synapse), with strong programming experience in SQL.
- Competence in data preparation and/or ETL tools like SnapLogic, Matillion, Azure Data Factory, AWS Glue, and SSIS (preferably strong working experience in one or more) to build and maintain data pipelines and flows.
- Understanding of data modeling techniques and working knowledge of OLTP and OLAP systems.
- Deep knowledge of databases, stored procedures, and optimization for huge data volumes.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
- Experience building the infrastructure required for data ingestion and analytics.
- Ability to fine-tune report-generating queries.
- Solid understanding of normalization and denormalization of data, database exception handling, transactions, profiling queries, performance counters, debugging, and database and query optimization techniques.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions (a column-level encryption sketch follows below).
- Experience understanding source data from various platforms and mapping it into entity-relationship (ER) models for data integration and reporting.
- Adherence to standards for all databases, e.g., data models, data architecture, and naming conventions.
- Exposure to source control like Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
- Preferably experience with a NoSQL database, migrating data into other types of databases with real-time replication.
- Experience with automated testing and coverage tools.
- Experience with CI/CD automation tools (desirable).
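A minimal sketch of column-level encryption with the pgcrypto extension, one way to implement the SQL security techniques listed above; the customers table, the BYTEA column ssn_enc, and the inline key are hypothetical (real deployments pull keys from a secrets manager, never literals).

CREATE EXTENSION IF NOT EXISTS pgcrypto;

-- Encrypt on write; ssn_enc is a BYTEA column.
INSERT INTO customers (id, ssn_enc)
VALUES (1, pgp_sym_encrypt('123-45-6789', 'replace-with-managed-key'));

-- Decrypt on read, restricted to authorized roles.
SELECT pgp_sym_decrypt(ssn_enc, 'replace-with-managed-key') AS ssn
FROM customers
WHERE id = 1;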
Posted 2 weeks ago
7.0 - 10.0 years
30 - 36 Lacs
Gandhinagar
Work from Office
Responsibilities:
* Design and implement scalable cloud architectures on AWS using AWS CloudFormation, Python, Terraform, Kinesis, Lambda, Glue, S3, Redshift, DynamoDB, and OLAP and OLTP databases with SQL (a Redshift DDL sketch follows below).

Benefits:
* Flexible working and work from home
* Health insurance
* Performance bonus
* Provident fund
* Mobile bill reimbursements
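A brief Amazon Redshift DDL sketch of OLAP-oriented table design on AWS; the table and key choices are hypothetical.

-- DISTKEY co-locates rows that join on device_id; SORTKEY speeds
-- range-restricted scans over event time.
CREATE TABLE fact_events (
    event_id  BIGINT    NOT NULL,
    device_id BIGINT    NOT NULL,
    event_ts  TIMESTAMP NOT NULL,
    payload   VARCHAR(1024)
)
DISTKEY (device_id)
SORTKEY (event_ts);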
Posted 2 weeks ago