8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Data Modeler JD: Proven experience as a Data Modeler or in a similar role (8 years, depending on seniority level). Proficiency in data modeling tools (e.g., ER/Studio, Erwin, SAP PowerDesigner, or similar). Strong understanding of database technologies (e.g., SQL Server, Oracle, PostgreSQL, Snowflake). Experience with cloud data platforms (e.g., AWS, Azure, GCP). Familiarity with ETL processes and tools. Excellent knowledge of normalization and denormalization techniques. Strong analytical and problem-solving skills.
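As a quick illustration of the normalization and denormalization techniques this role calls for, here is a minimal, self-contained sketch. It uses SQLite only so it runs anywhere, and the customer/order schema is entirely hypothetical rather than anything from the posting: a normalized design holds each fact once, while a denormalized reporting view repeats customer attributes per order for join-free analytics.

```python
import sqlite3

# In-memory database so the sketch is self-contained and runnable.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: customer attributes live in one place,
# and orders reference customers by key, so nothing is duplicated.
cur.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL,
    order_date  TEXT NOT NULL
);

-- Denormalized reporting shape: customer attributes are repeated per order
-- so analysts can query one wide object without writing joins.
CREATE VIEW order_report AS
SELECT o.order_id, o.order_date, o.amount, c.name AS customer_name, c.city
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id;
""")

cur.execute("INSERT INTO customers VALUES (1, 'Acme Ltd', 'Hyderabad')")
cur.execute("INSERT INTO orders VALUES (10, 1, 2500.0, '2024-06-01')")
print(cur.execute("SELECT * FROM order_report").fetchall())
```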
Posted 2 weeks ago
6.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Gracenote is the content business unit of Nielsen that powers the world of media entertainment. Our metadata solutions help media and entertainment companies around the world deliver personalized content search and discovery, connecting audiences with the content they love. We’re at the intersection of people and media entertainment. With our cutting-edge technology and solutions, we help audiences easily find TV shows, movies, music and sports across multiple platforms. As the world leader in entertainment data and services, we power the world’s top streaming platforms, cable and satellite TV providers, media companies, consumer electronics manufacturers, music services and automakers to navigate and succeed in the competitive streaming world. Our metadata entertainment solutions have a global footprint of 80+ countries, 100K+ channels and catalogs, 70+ sports and 100M+ music tracks, all across 35 languages.
Job Purpose: As a senior DBA, your role is to own the databases in our data pipeline and the data governance of our Data Strategy. Our Data Strategy underpins our suite of Client-facing Applications, Data Science activities, Operational Tools and Business Analytics.
Responsibilities: Architect and build scalable, resilient and cost-effective data storage solutions to support complex data pipelines. The architecture has two facets: storage and compute. The DBA is responsible for designing and maintaining the different tiers of the data storage, including (but not limited to) archival, long-term persistent storage, transactional and reporting storage. Design, implement and maintain various data pipelines such as self-service ingestion tools, exports to application-specific warehouses, and indexing activities. The senior DBA is responsible for data modeling, as well as designing, implementing and maintaining various data catalogs, to support data transformation and product requirements. Configure and deploy databases on AWS cloud, ensuring optimal performance and scalability. Monitor database activities for compliance and security purposes. Set up and manage backup and recovery strategies for cloud databases, ensuring availability and quality. Monitor database performance metrics and identify areas for optimization. Create scripts for database configuration and provisioning. Collaborate with Data Science to understand, translate, and integrate methodologies into engineering build pipelines. Partner with product owners to translate complex business requirements into technical solutions, imparting design and architecture guidance. Provide expert mentorship to project teams on technology strategy, cultivating advanced skill sets in software engineering and modern SDLC. Stay informed about the latest technologies and methodologies by participating in industry forums, having an active peer network, and engaging actively with customers. Cultivate a team environment focused on continuous learning, where innovative technologies are developed and refined through teamwork.
Must have skills: Experience with languages such as ANSI SQL, T-SQL, PL/pgSQL, PL/SQL, plus database design, normalization, server tuning, and query plan optimization. 6+ years of professional DBA experience with large datastores, including HA and DR planning and support. Software engineering experience with programming languages such as Java, Scala, and Python. Demonstrated understanding and experience with big data tools such as Kafka, Spark and Trino/Presto. Experience with orchestration tools such as Airflow. Comfortable using Docker and Kubernetes for container management. DevOps experience deploying and tuning the applications you’ve built. Monitoring tools such as Datadog, Prometheus, Grafana, CloudWatch.
Good to have: Software engineering experience with Unix shell. Understanding of file systems. Experience configuring database replication (physical and/or logical). ETL experience (3rd party and proprietary). A personal technical blog. A personal (Git) repository of side projects. Participation in an open-source community.
Qualifications: B.E. / B.Tech / BCA / MCA in Computer Science, Engineering or a related subject. Strong Computer Science fundamentals. Comfortable with version control systems such as Git. A thirst for learning new tech and keeping up with industry advances. Excellent communication and knowledge-sharing skills. Comfortable working with technical and non-technical teams. Strong debugging skills. Comfortable providing and receiving code review feedback. A positive attitude, adaptability, enthusiasm, and a growth mindset.
About Nielsen: By connecting clients to audiences, we fuel the media industry with the most accurate understanding of what people listen to and watch. To discover what audiences love, we measure across all channels and platforms—from podcasts to streaming TV to social media. And when companies and advertisers are truly connected to their audiences, they can see the most important opportunities and accelerate growth. Do you want to move the industry forward with Nielsen? Our people are the driving force. Your thoughts, ideas, and expertise can propel us forward. Whether you have fresh thinking around maximizing a new technology or you see a gap in the market, we are here to listen and act. Our team is made strong by a diversity of thoughts, experiences, skills, and backgrounds. You’ll enjoy working with smart, fun, curious colleagues, who are passionate about their work. Come be part of a team that motivates you to do your best work!
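A minimal sketch of the backup scripting this role mentions ("set up and manage backup and recovery strategies", "create scripts for database configuration and provisioning"), assuming a PostgreSQL target. The host, database, user and directory names below are hypothetical placeholders, and a real setup would add retention policies, encryption, off-host storage and regular restore drills.

```python
import os
import subprocess
from datetime import datetime, timezone

# Hypothetical connection details -- replace with real values or a secrets manager lookup.
DB_HOST = "example-db.internal"
DB_NAME = "gracenote_metadata"
DB_USER = "backup_user"
BACKUP_DIR = "/var/backups/postgres"

def take_backup() -> str:
    """Run pg_dump in custom format and return the path of the archive produced."""
    os.makedirs(BACKUP_DIR, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out_path = os.path.join(BACKUP_DIR, f"{DB_NAME}_{stamp}.dump")
    env = dict(os.environ, PGPASSWORD=os.environ.get("BACKUP_PASSWORD", ""))
    subprocess.run(
        ["pg_dump", "-h", DB_HOST, "-U", DB_USER, "-F", "c", "-f", out_path, DB_NAME],
        check=True,  # raise if pg_dump exits non-zero so failures are visible to the scheduler
        env=env,
    )
    return out_path

if __name__ == "__main__":
    print("backup written to", take_backup())
```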
Posted 2 weeks ago
5.0 years
8 - 10 Lacs
Hyderābād
On-site
About FactSet
FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients’ needs and exceeding their expectations.
Your Team's Impact
FactSet is seeking an experienced software development engineer with proven proficiency in the deployment of software adhering to best practices, and with fluency in the development environment and related tools, code libraries and systems. Responsible for the entire development process, and collaborates to create a theoretical design. Demonstrated ability to critique code and production for improvement, as well as to receive and apply feedback effectively. Proven ability to maintain expected levels of productivity and to become increasingly independent as a software developer, requiring less direct engagement and oversight on a day-to-day basis from one’s manager. Focus is on developing applications, testing and maintaining software, and the implementation details of development; increasing the volume of work accomplished (with consistent quality, stability and adherence to best practices), along with gaining a mastery of the products to which one is contributing and beginning to participate in forward design discussions for how to improve based on one’s observations of the code, systems and production involved. Software Developers provide project leadership and technical guidance along every stage of the software development life cycle.
What You'll Do
Work on the Data Lake/DAM platform handling millions of documents annually. Focus on developing new features while supporting and maintaining existing systems, ensuring the platform's continuous improvement. Participate in weekly on-call support to address urgent queries and issues in common communication channels, ensuring operational reliability and user satisfaction. Create comprehensive design documents for major architectural changes and facilitate peer reviews to ensure quality and alignment with best practices. Collaborate with product managers and key stakeholders to thoroughly understand requirements and propose strategic solutions, leveraging cross-functional insights. Actively participate in technical discussions with principal engineers and architects to support proposed design solutions, fostering a collaborative engineering environment.
Work effectively as part of a geographically diverse team, coordinating with other departments and offices for seamless project progression.
What We're Looking For
Bachelor’s or master’s degree in Computer Science, Engineering, or a related field is required. 5+ years of experience in software development, with a focus on database systems handling and operations. Writing and optimizing complex SQL queries, stored procedures, views, triggers. Developing and maintaining database schemas and structures. Creating ETL pipelines for data ingestion and transformation. Troubleshooting data issues and performance bottlenecks. Mentoring junior developers. Proven experience working with APIs, ensuring robust connectivity and integration across the system. Working experience with AWS services such as Lambda, EC2, S3, and AWS Glue is beneficial for cloud-based operations and deployments. Strong analytical and problem-solving skills are critical for developing innovative solutions and optimizing existing platform components. Excellent collaborative and communication skills, enabling effective interaction with geographically diverse teams and key stakeholders. Capability to address system queries and provide weekly on-call support, ensuring system reliability and user satisfaction. Ability to prioritize and manage work effectively in a fast-paced environment, demonstrating self-direction and resourcefulness.
Desired Skills: Deep RDBMS knowledge (e.g., SQL Server, Oracle, PostgreSQL). Strong T-SQL/PL/SQL scripting. Query tuning and performance optimization. Data modelling and DWH concepts. Often part of app development or analytics teams. Stored procedures, functions, views, triggers. Query optimization techniques. Execution plan analysis. Indexing strategies. Partitioning and table optimization. Logical and physical data modelling. Normalization/denormalization.
What's In It for You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means: The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up. Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days. Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives. A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions. Career progression planning with dedicated time each month for learning and development. Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging. Learn more about our benefits here. Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications.
Company Overview: FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users.
Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees’ Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.
Ex US: At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
Diversity: At FactSet, we celebrate diversity of thought, experience, and perspective. We are committed to disrupting bias and a transparent hiring process. All qualified applicants will be considered for employment regardless of race, color, ancestry, ethnicity, religion, sex, national origin, gender expression, sexual orientation, age, citizenship, marital status, disability, gender identity, family status or veteran status. FactSet participates in E-Verify.
Return to Work: Returning from a break? We are here to support you! If you have taken time out of the workforce and are looking to return, we encourage you to apply and chat with our recruiters about our available support to help you relaunch your career.
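To make the "indexing strategies" and "execution plan analysis" items in the desired skills above concrete, here is a small self-contained sketch. It uses SQLite purely so it runs without a server (the role itself targets SQL Server, Oracle or PostgreSQL), and the trades table is invented for illustration: the reported plan changes from a full table scan to an index search once the index exists.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical table used only to demonstrate the effect of an index on the plan.
cur.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, px REAL)")
cur.executemany(
    "INSERT INTO trades (symbol, px) VALUES (?, ?)",
    [(f"SYM{i % 100}", 400.0 + i) for i in range(10_000)],
)

query = "SELECT count(*) FROM trades WHERE symbol = 'SYM7'"

# Before indexing: the plan shows a full table scan over all rows.
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# Add an index on the filter column and inspect the plan again:
# SQLite now reports an index search instead of a scan.
cur.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```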
Posted 2 weeks ago
0 years
0 Lacs
Gurgaon
On-site
JOB DESCRIPTION About KPMG in India KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Skills and Expertise: The consultant should have expertise in the following:
1. Database design and management using PostgreSQL, MSSQL and Oracle.
2. Advanced SQL development, including writing efficient queries, stored procedures, and performance tuning.
3. PL/SQL programming for developing complex database analytics functions and optimizing database performance.
4. Data modeling and normalization to ensure structured and optimized database schemas.
5. Query optimization techniques for improving database performance and reducing query execution time.
6. Indexing and partitioning strategies to enhance database efficiency and scalability.
7. ETL process development to extract, transform, and load data into databases.
8. Troubleshooting and debugging SQL queries to resolve issues related to data retrieval and integrity.
9. Database security and access control, including user permissions and role-based access management.
10. Integration of databases with applications, ensuring seamless data connectivity and interaction.
11. Proficiency in version control tools such as Git for managing database scripts and configurations.
12. Automating database processes using scripts for scheduled tasks and batch processing.
13. Experience in cloud-based database services such as Azure SQL Database or Oracle Cloud (good to have).
14. Strong communication and presentation skills to effectively convey technical concepts to stakeholders.
Equal employment opportunity information KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
QUALIFICATIONS Minimum Bachelor of Engineering (BE) required
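Items 6 and 12 above (partitioning strategies and scripted automation of database processes) could be sketched along the following lines. This is a minimal illustration assuming a PostgreSQL 10+ instance; the DSN, table and partition names are hypothetical and would differ in a real engagement.

```python
import psycopg2  # pip install psycopg2-binary

# Hypothetical DSN -- point this at a real PostgreSQL 10+ instance before running.
DSN = "dbname=analytics user=etl_user host=localhost"

PARTITION_DDL = """
CREATE TABLE IF NOT EXISTS events (
    event_id BIGINT,
    event_ts TIMESTAMPTZ NOT NULL,
    payload  JSONB
) PARTITION BY RANGE (event_ts);

CREATE TABLE IF NOT EXISTS events_2024 PARTITION OF events
    FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');

CREATE TABLE IF NOT EXISTS events_2025 PARTITION OF events
    FOR VALUES FROM ('2025-01-01') TO ('2026-01-01');

-- Index the common filter column on the active partition to keep lookups selective.
CREATE INDEX IF NOT EXISTS idx_events_2025_ts ON events_2025 (event_ts);
"""

def apply_partitioning() -> None:
    """Apply the partitioned-table DDL; safe to re-run thanks to IF NOT EXISTS."""
    conn = psycopg2.connect(DSN)
    try:
        with conn, conn.cursor() as cur:  # 'with conn' commits the transaction on success
            cur.execute(PARTITION_DDL)
    finally:
        conn.close()

if __name__ == "__main__":
    apply_partitioning()
```

A script like this is typically wired into a scheduler (cron, Airflow) so new yearly or monthly partitions are created ahead of time as part of routine batch processing.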
Posted 2 weeks ago
5.0 years
0 Lacs
New Delhi, Delhi, India
On-site
About the Role: We are looking for a hands-on Data Engineer to join our team and take full ownership of scraping pipelines and data quality. You'll be working on data from 60+ websites involving PDFs, processed via OCR and stored in MySQL/PostgreSQL. You’ll build robust, self-healing pipelines and fix common data issues (missing fields, duplication, formatting errors). Responsibilities: Own and optimize Airflow scraping DAGs for 60+ sites. Implement validation checks, retry logic, and error alerts. Build pre-processing routines to clean OCR'd text. Create data normalization and deduplication workflows. Maintain data integrity across MySQL and PostgreSQL. Collaborate with the ML team for downstream AI use cases. Requirements: 2–5 years of experience in Python-based data engineering. Experience with Airflow, Pandas, OCR (Tesseract or AWS Textract). Solid SQL and schema design skills (MySQL/PostgreSQL). Familiarity with CSV processing and data pipelines. Bonus: Experience with scraping using Scrapy or Selenium. Location: Delhi (in-office only). Salary Range: 50-80k/month.
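A minimal sketch of the kind of Airflow DAG the responsibilities above describe (retry logic, failure alerts, and OCR text cleanup). The site list, task names and cleaning rules are hypothetical placeholders rather than the team's actual pipeline, the real scrape/OCR step is stubbed out, and the snippet assumes Airflow 2.4+.

```python
import re
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical site list -- the real pipeline covers 60+ sources.
SITES = ["site_a", "site_b"]

def clean_ocr_text(raw: str) -> str:
    """Collapse whitespace and strip common OCR/PDF artifacts before loading."""
    text = re.sub(r"[ \t]+", " ", raw)
    text = text.replace("\x0c", "")  # form-feed characters left over from PDF pages
    return text.strip()

def scrape_and_clean(site: str, **_context) -> None:
    # Placeholder for the real fetch + OCR step (e.g. Tesseract / Textract output).
    raw = f"  Example   OCR output for {site} \x0c"
    print(clean_ocr_text(raw))

default_args = {
    "owner": "data-eng",
    "retries": 3,                       # retry transient scraping failures
    "retry_delay": timedelta(minutes=10),
    "email_on_failure": True,           # surfaces error alerts, per the requirements
}

with DAG(
    dag_id="scraping_pipeline_example",
    default_args=default_args,
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    for site in SITES:
        PythonOperator(
            task_id=f"scrape_{site}",
            python_callable=scrape_and_clean,
            op_kwargs={"site": site},
        )
```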
Posted 2 weeks ago
15.0 years
0 Lacs
Chennai
On-site
Project Role : Service Management Practitioner Project Role Description : Support the delivery of programs, projects or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable strategic agenda. Must have skills : Microsoft Power Business Intelligence (BI) Good to have skills : Microsoft Power Apps Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education
Summary: As a Service Management Practitioner, you will support the delivery of programs, projects, or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable strategic agenda.
Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Coordinate the delivery of programs, projects, or managed services. - Develop and maintain relationships with key stakeholders and sponsors. - Ensure high levels of commitment from stakeholders. - Enable strategic agenda through effective coordination. - Provide regular updates and reports on project progress.
Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Power Business Intelligence (BI). - Good To Have Skills: Experience with Microsoft Power Apps. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information: - The candidate should have a minimum of 3 years of experience in Microsoft Power Business Intelligence (BI). - This position is based at our Chennai office. - A 15 years full-time education is required.
Posted 2 weeks ago
3.0 years
0 Lacs
Kūdangulam
On-site
Job Summary The team is involved in the commissioning and maintenance of I&C systems, ensuring pre-job briefing, isolation, removal of instruments from their locations, cleaning, decontamination, physical inspection, testing, troubleshooting and repair, calibration, filling of data sheets, normalization, and giving feedback to ENC. Job Type: Full-time Pay: From ₹35,000.00 per month Schedule: Rotational shift Education: Diploma (Required) Experience: work: 3 years (Required) Electrical Engineering: 3 years (Required) Language: Hindi (Preferred) Work Location: In person
Posted 2 weeks ago
15.0 years
0 Lacs
Ahmedabad
On-site
Project Role : Technology Delivery Lead Project Role Description : Manages the delivery of large, complex technology projects using appropriate frameworks and collaborating with sponsors to manage scope and risk. Drives profitability and continued success by managing service quality and cost and leading delivery. Proactively support sales through innovative solutions and delivery excellence. Must have skills : SAP FI S/4HANA Accounting Good to have skills : NA Minimum 15 year(s) of experience is required Educational Qualification : Any Degree
Summary: As a Technology Delivery Lead, you will manage the delivery of large, complex technology projects using appropriate frameworks and collaborating with sponsors to manage scope and risk. You will drive profitability and continued success by managing service quality and cost and leading delivery. Additionally, you will proactively support sales through innovative solutions and delivery excellence.
Roles & Responsibilities: - Expected to be an SME with deep knowledge and experience. - Should have influencing and advisory skills. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Manage the delivery of large, complex technology projects using appropriate frameworks. - Collaborate with sponsors to manage scope and risk. - Drive profitability and continued success by managing service quality and cost. - Lead delivery and ensure delivery excellence. - Proactively support sales through innovative solutions.
Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP FI S/4HANA Accounting. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information: - The candidate should have a minimum of 15 years of experience in SAP FI S/4HANA Accounting. - This position is based at our Hyderabad office. - Any Degree is required.
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Proficiency in data modeling tools such as ER/Studio, ERwin or similar. Deep understanding of relational database design, normalization/denormalization, and data warehousing principles. Experience with SQL and working knowledge of database platforms like Oracle, SQL Server, PostgreSQL, or Snowflake. Strong knowledge of metadata management, data lineage, and data governance practices. Understanding of data integration, ETL processes, and data quality frameworks. Ability to interpret and translate complex business requirements into scalable data models. Excellent communication and documentation skills to collaborate with cross-functional teams.
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Summary SAP HANA Developer - Level 4 (Assistant Manager) The SAP HANA Developer should analyze, plan, design, develop, and implement SAP HANA solutions to meet the strategic, usability, performance, reliability, control, and security requirements of Analytics reporting processes. Requires good knowledge in the areas of Analytics, Data Warehouse, and reporting applications. Must be innovative.
Work you will do: A unique opportunity to be part of the growing SAP HANA Data and Analytics team that works on cutting-edge technology like HANA and BODS for developing Analytics applications. You will be responsible for the implementation and delivery of SAP HANA solutions to support the Deloitte US Member Firm. Should be able to understand the functional requirements from the client and appropriately convert them into Technical Design documents. Should be able to lead and assist the team with his/her technical skills whenever an issue is encountered. Should have good team-leader skills to coordinate with team members. Build relationships with clients and keep abreast of the various client developments. Excellent written, verbal, listening, analytical, and communication skills are required. Highly self-motivated and directed. Experience working in a team-oriented, collaborative environment. Should take ownership of individual deliverables.
Key Responsibilities: Perform SAP HANA programming as required. Perform troubleshooting and problem resolution of applications built and being supported. Perform effort estimation for various implementation and enhancement activities. Support and coordinate the efforts of Subject Matter Experts, Development, Quality Assurance, Usability, Training, Transport Management, and other internal resources for the successful implementation of system enhancements and fixes. Participate in continuous improvement processes as assigned. Work with team members to analyze, plan, design, develop, and implement solutions to meet strategic, usability, performance, reliability, control, and security requirements. Create and maintain internal documentation and end-user training materials as needed. Provide input to standards and guidelines and implement best practices to enable consistency across all projects.
Education: Bachelor’s degree in Computer Science or Business Information Systems, or MCA, or an equivalent degree.
Qualifications: 6 to 9 years of advanced-level experience in the design and creation of Information Models in HANA using Attribute views, Analytic views, and Calculation views (both script-based and graphical). An advanced level of experience in HANA SQL is a must. An advanced level of experience in using aggregate and window functions is a must. An advanced level of experience in stored procedures, triggers, functions, and creating tables and views using SQL is a must. Expertise in Transact-SQL (DDL, DML, DCL) and in the design and normalization of database tables. Hands-on knowledge and expertise in BODS ETL is desirable. Define reusable components/frameworks, common schemas, standards, and tools to be used. Translate business KPIs into HANA and define the reporting layouts. Experience in Crystal Reports, Analysis for Office, Tableau, and QlikView is desirable. Knowledge of SLT and SDI is desirable (good to have). Working in an Agile/Scrum methodology is desirable. Experience and good knowledge of performance tuning and stress testing is desirable (good to have).
Strong analytical, problem-solving and multi-tasking skills, as well as communication and interpersonal skills, are required. Strong verbal and written communication skills, with an ability to express complex business concepts in non-technical terms.
The Team: Information Technology Services (ITS) helps power Deloitte’s success. ITS drives Deloitte, which serves many of the world’s largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. The ~3,000 professionals in ITS deliver services including: Security, risk & compliance; Technology support; Infrastructure; Applications; Relationship management; Strategy; Deployment; PMO; Financials; Communications; Product Engineering (PxE).
Product Engineering (PxE) is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Their broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development.
How you will grow: At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.
Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Deloitte’s culture: Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.
Corporate citizenship: Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.
Disclaimer: Please note that this description is subject to change basis business/engagement requirements and at the discretion of the management. #EAG-Technology
Recruiting tips: From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Professional development: From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.
Requisition code: 300907
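The aggregate and window-function requirement in the qualifications above can be illustrated with a small self-contained sketch. It uses SQLite (which supports window functions from version 3.25) purely so the snippet runs anywhere; the role itself targets HANA SQL, and the sales table and column names are invented for the example.

```python
import sqlite3  # bundled SQLite must be >= 3.25 for window functions (Python 3.8+ typically is)

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE sales (region TEXT, fiscal_qtr TEXT, revenue REAL)")
cur.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("APAC", "Q1", 120.0), ("APAC", "Q2", 150.0),
        ("EMEA", "Q1", 200.0), ("EMEA", "Q2", 180.0),
    ],
)

# Window functions: rank quarters by revenue within each region,
# alongside a windowed aggregate giving the regional total on every row.
rows = cur.execute("""
    SELECT region,
           fiscal_qtr,
           revenue,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS qtr_rank,
           SUM(revenue) OVER (PARTITION BY region)                 AS region_total
    FROM sales
    ORDER BY region, qtr_rank
""").fetchall()

for row in rows:
    print(row)
```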
Posted 2 weeks ago
0 years
0 Lacs
India
On-site
Job Title: SQL Developer Employment Type: Full-time
Job Summary: Metacube Software is seeking a skilled SQL Developer to join our dynamic team and support our data management needs. The ideal candidate will have a strong background in SQL development, particularly with Microsoft SQL Server (MSSQL). This role focuses on performance tuning, indexing, database design, and SQL coding. Please note that we are not looking for candidates with a background in reporting or Business Intelligence (BI) work.
Key Responsibilities: Design, develop, and optimize complex SQL queries, stored procedures, views, and triggers. Create and maintain data models, schemas, and database objects. Collaborate with ETL teams to extract, transform, and load data from multiple sources. Work closely with developers and business users to understand data requirements and build efficient, scalable solutions. Ensure data accuracy, integrity, and consistency across platforms. Troubleshoot performance issues and recommend solutions to optimize database performance. Document solutions, data structures, and workflows for future reference. Maintain database security and implement data access controls as needed.
Required Skills and Qualifications: Proficiency in SQL (or PL/SQL if working with Oracle). Hands-on experience with at least one RDBMS (e.g., Microsoft SQL Server). Strong understanding of relational database design, normalization, and indexing. Ability to analyze and optimize long-running SQL queries. Familiarity with data warehousing and data modeling concepts is a plus. Basic knowledge of scripting or programming languages (Python, PowerShell, etc.) is an advantage.
If you are passionate about database development and optimization and want to be part of a collaborative team, we would love to hear from you! Metacube Software is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 2 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
Role: Database Engineer Location: Remote
Skills and Experience
● Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
● Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals.
● Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
● Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
● Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
● Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
● Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
● Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
● Knowledge of SQL and understanding of database design principles, normalization, and indexing.
● Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources.
● Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
● Eagerness to develop import workflows and scripts to automate data import processes.
● Knowledge of data security best practices, including access controls, encryption, and compliance standards.
● Strong problem-solving and analytical skills with attention to detail.
● Creative and critical thinking.
● Strong willingness to learn and expand knowledge in data engineering.
● Familiarity with Agile development methodologies is a plus.
● Experience with version control systems, such as Git, for collaborative development.
● Ability to thrive in a fast-paced environment with rapidly changing priorities.
● Ability to work collaboratively in a team environment.
● Good and effective communication skills.
● Comfortable with autonomy and ability to work independently.
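Since the skills list above calls out Pandas, SQLAlchemy and ETL between relational databases, here is a minimal sketch of that pattern. The connection strings, table names and cleansing rules are hypothetical placeholders; a production pipeline would add incremental loading, logging and error handling.

```python
import pandas as pd
from sqlalchemy import create_engine  # drivers assumed installed: pymysql, psycopg2

# Hypothetical connection strings -- swap in real credentials and hosts.
SOURCE_URL = "mysql+pymysql://etl:***@source-host/app_db"
TARGET_URL = "postgresql+psycopg2://etl:***@warehouse-host/analytics"

def run_etl() -> None:
    source = create_engine(SOURCE_URL)
    target = create_engine(TARGET_URL)

    # Extract: pull raw rows from the operational MySQL database.
    df = pd.read_sql("SELECT id, email, signup_date FROM users", source)

    # Transform: basic cleansing, normalization and deduplication before loading.
    df["email"] = df["email"].str.strip().str.lower()
    df = df.drop_duplicates(subset=["email"])
    df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")

    # Load: append into the PostgreSQL reporting schema.
    df.to_sql("dim_users", target, if_exists="append", index=False)

if __name__ == "__main__":
    run_etl()
```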
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities
· Design, write, and optimize complex SQL queries, stored procedures, views, and triggers in MS SQL Server.
· Collaborate with business analysts and application developers to gather data requirements and implement solutions.
· Work within the iPROOF LCNC platform to develop, configure, and deploy business processes and user interfaces.
· Document solutions, data structures, and workflows effectively.
Required Skills
· Strong knowledge of MS SQL Server, including T-SQL, indexing, and performance tuning.
· Familiarity with stored procedures, functions, and triggers.
· Understanding of relational database design and normalization.
· Willingness to learn and work on the iPROOF LCNC platform.
· Basic understanding of JSON data structures and their use in dynamic systems.
· Good communication and documentation skills.
Posted 2 weeks ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : SAP Sales and Distribution (SD) Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.
Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute to key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the effort to design, build, and configure applications - Act as the primary point of contact for the project - Manage the team and ensure successful project delivery - Collaborate with multiple teams to make key decisions - Provide solutions to problems for the immediate team and across multiple teams
Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP Sales and Distribution (SD) - Strong understanding of statistical analysis and machine learning algorithms - Experience with data visualization tools such as Tableau or Power BI - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information: - The candidate should have a minimum of 5 years of experience in SAP Sales and Distribution (SD) - This position is based in Mumbai - A 15 years full-time education is required
Posted 2 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location Bangalore, Karnataka, 560100 Category Engineering / Information Technology Job Type Full time Job Id 1184886 NoSQL Developer
This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office.
Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.
Job Description: What you'll do: HPE Operations is our innovative IT services organization. It provides the expertise to advise, integrate, and accelerate our customers’ outcomes from their digital transformation. Our teams collaborate to transform insight into innovation. In today’s fast-paced, hybrid IT world, being at business speed means overcoming IT complexity to match the speed of actions to the speed of opportunities. Deploy the right technology to respond quickly to market possibilities. Join us and redefine what’s next for you.
Scope of Work for NoSQL Database Administrators
1. Database Design and Architecture: Collaborate with developers and architects to design and implement efficient database schemas. Ensure proper normalization and indexing for optimal performance and scalability. Evaluate and implement replication architectures as needed for high availability and fault tolerance.
2. Performance Tuning: Monitor database performance using tools like NoSQL Enterprise Monitor or custom scripts. Identify and optimize poorly performing queries through query analysis, index optimization, and query rewriting. Configure NoSQL server settings, buffer pools, and caches to maximize throughput and minimize response times.
3. Security and Compliance: Configure role-based access controls (RBAC) and auditing features to ensure data integrity and confidentiality. Coordinate with UIDAI-appointed GRCP and security audit agencies to conduct regular security audits and share artifacts to address any identified risks promptly.
4. Backup and Disaster Recovery: Ensure integration of databases to suitable backup mechanisms and recovery procedures to safeguard against data loss and ensure business continuity. Coordinate with relevant teams to conduct regular DR drills.
5. Monitoring and Alerting: Set up monitoring systems to track database health and performance metrics. Configure automated alerts to notify administrators of critical issues, such as performance degradation, replication lags, or storage constraints. Proactively investigate and resolve alerts to maintain system stability and availability.
6. Capacity Planning: Monitor database growth trends and resource utilization to forecast future capacity requirements. Evaluate and recommend hardware upgrades or version upgrades to support long-term scalability goals.
7. Maintenance and Upgrades: Perform routine maintenance tasks, including database backups, index rebuilds, and statistics updates, during scheduled maintenance windows.
Support execution of NoSQL version upgrades and patch deployments, ensuring compatibility and minimal downtime. Coordinate with application teams to test and validate database changes in development and staging environments before production rollout.
8. Documentation and Knowledge Sharing: Maintain comprehensive documentation of database configurations and other Standard Operating Procedures. Provide training and knowledge transfer to team members on database administration best practices, tools, and technologies. Foster a culture of continuous learning and improvement through regular team meetings and knowledge sharing sessions.
9. Incident Response and Problem Resolution: Respond to database-related incidents and outages promptly, following established incident management procedures. Jointly work with relevant teams to carry out root cause analysis to identify underlying issues and support the implementation of corrective actions to prevent recurrence. Collaborate with cross-functional teams, including developers, network engineers, and system administrators, to troubleshoot complex issues and drive resolution.
10. Service Ticket Handling for DML Operations: Receive and prioritize service tickets related to Data Manipulation Language (DML) operations, including INSERT, UPDATE, DELETE, and SELECT queries. Analyze and troubleshoot reported issues, such as data inconsistency, performance degradation, or query optimization. Work closely with application developers and end-users to understand the context and requirements of DML operations. Provide guidance and recommendations to developers and architects on optimizing DML queries for improved performance and efficiency. Implement database schema changes, data migrations, and data transformations as requested through service tickets, ensuring proper testing and validation procedures are followed by the development team. Communicate updates and resolutions to stakeholders in a timely and transparent manner, ensuring customer satisfaction and alignment with service level agreements. Collaborate with other teams, such as application support, quality assurance, and release management, to address cross-functional dependencies and ensure smooth execution of DML-related tasks.
12. Collaboration with Developers for DDL Operations: Assist developers/application architects in planning and executing Data Definition Language (DDL) operations, such as creating, altering, and dropping database objects (tables, indexes, views, etc.). Review proposed schema changes and provide recommendations on best practices for database design and optimization in consultation with application architects and developers. Perform impact analysis to assess the potential implications of DDL changes on existing data, applications, and performance. Execute DDL changes during scheduled maintenance windows, following change management procedures and ensuring minimal disruption to production systems.
13. Data Archival and Cleanup: Collaborate with application architects and developers to define data retention policies and archival strategies for each schema/table based on UIDAI data retention policies and business needs. Develop and implement data archival processes to move inactive or historical data to secondary storage or archival databases, freeing up space and improving database performance.
Monitor data growth trends and implement proactive measures, such as purging, to manage database size and mitigate performance degradation, in consultation with application architects and developers. Document data archival and cleanup procedures, including retention periods, criteria for data selection, and execution schedules, ensuring compliance with data governance policies.
What you need to bring: Qualification – BE / BTech / MCA / MSc. Min years of TOTAL experience – 4+ years. Location – Bengaluru, UIDAI onsite deployment. Nature/Key activities – DBA related activities.
Additional Skills: Accountability, Action Planning, Active Learning (Inactive), Active Listening, Bias, Business Growth, Business Planning, Cloud Computing, Cloud Migrations, Coaching, Commercial Acumen, Creativity, Critical Thinking, Cross-Functional Teamwork, Customer Experience Strategy, Data Analysis Management, Data Collection Management (Inactive), Data Controls, Design Thinking, Empathy, Follow-Through, Growth Mindset, Hybrid Clouds, Infrastructure as a Service (IaaS) {+ 10 more}
What We Can Offer You:
Health & Wellbeing We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.
Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #operations
Job: Services Job Level: TCP_02
HPE is an Equal Employment Opportunity/ Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/ Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
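The "Monitoring and Alerting" scope above (automated alerts for replication lag and other critical issues) could be scripted along the lines below. The posting does not name a specific NoSQL engine, so this sketch assumes a MongoDB replica set for illustration; the connection string, threshold and alert hook are hypothetical placeholders.

```python
from pymongo import MongoClient  # pip install pymongo

# Hypothetical connection string -- point at a real replica set before running.
URI = "mongodb://monitor_user:***@db-host:27017/?replicaSet=rs0"
LAG_ALERT_SECONDS = 30

def check_replication_lag() -> None:
    """Compare each secondary's applied optime against the primary and flag large lags."""
    client = MongoClient(URI)
    status = client.admin.command("replSetGetStatus")

    # Find the primary's last applied operation time (raises if no primary is visible).
    primary_optime = next(
        m["optimeDate"] for m in status["members"] if m["stateStr"] == "PRIMARY"
    )
    for member in status["members"]:
        if member["stateStr"] != "SECONDARY":
            continue
        lag = (primary_optime - member["optimeDate"]).total_seconds()
        print(f"{member['name']}: lag {lag:.0f}s")
        if lag > LAG_ALERT_SECONDS:
            # Hook a real alerting channel (email, PagerDuty, etc.) in here.
            print(f"ALERT: {member['name']} is lagging beyond threshold")

if __name__ == "__main__":
    check_replication_lag()
```

In practice a check like this runs on a schedule and feeds the same alerting stack used for the other health metrics (storage, query latency, error rates).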
Posted 2 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location Bangalore, Karnataka, 560100 Category Engineering / Information Technology Job Type Full time Job Id 1184887 NoSQL Specialist
This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office.
Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.
Job Description: What you'll do: HPE Operations is our innovative IT services organization. It provides the expertise to advise, integrate, and accelerate our customers’ outcomes from their digital transformation. Our teams collaborate to transform insight into innovation. In today’s fast-paced, hybrid IT world, being at business speed means overcoming IT complexity to match the speed of actions to the speed of opportunities. Deploy the right technology to respond quickly to market possibilities. Join us and redefine what’s next for you.
Scope of Work for NoSQL Database Administrators
1. Database Design and Architecture: Collaborate with developers and architects to design and implement efficient database schemas. Ensure proper normalization and indexing for optimal performance and scalability. Evaluate and implement replication architectures as needed for high availability and fault tolerance.
2. Performance Tuning: Monitor database performance using tools like NoSQL Enterprise Monitor or custom scripts. Identify and optimize poorly performing queries through query analysis, index optimization, and query rewriting. Configure NoSQL server settings, buffer pools, and caches to maximize throughput and minimize response times.
3. Security and Compliance: Configure role-based access controls (RBAC) and auditing features to ensure data integrity and confidentiality. Coordinate with UIDAI-appointed GRCP and security audit agencies to conduct regular security audits and share artifacts to address any identified risks promptly.
4. Backup and Disaster Recovery: Ensure integration of databases to suitable backup mechanisms and recovery procedures to safeguard against data loss and ensure business continuity. Coordinate with relevant teams to conduct regular DR drills.
5. Monitoring and Alerting: Set up monitoring systems to track database health and performance metrics. Configure automated alerts to notify administrators of critical issues, such as performance degradation, replication lags, or storage constraints. Proactively investigate and resolve alerts to maintain system stability and availability.
6. Capacity Planning: Monitor database growth trends and resource utilization to forecast future capacity requirements. Evaluate and recommend hardware upgrades or version upgrades to support long-term scalability goals.
7. Maintenance and Upgrades: Perform routine maintenance tasks, including database backups, index rebuilds, and statistics updates, during scheduled maintenance windows.
Support execution of NoSQL version upgrades and patch deployments, ensuring compatibility and minimal downtime. Coordinate with application teams to test and validate database changes in development and staging environments before production rollout.
8. Documentation and Knowledge Sharing: Maintain comprehensive documentation of database configurations and other Standard Operating Procedures. Provide training and knowledge transfer to team members on database administration best practices, tools, and technologies. Foster a culture of continuous learning and improvement through regular team meetings and knowledge sharing sessions.
9. Incident Response and Problem Resolution: Respond to database-related incidents and outages promptly, following established incident management procedures. Jointly work with relevant teams to carry out root cause analysis to identify underlying issues and support the implementation of corrective actions to prevent recurrence. Collaborate with cross-functional teams, including developers, network engineers, and system administrators, to troubleshoot complex issues and drive resolution.
10. Service Ticket Handling for DML Operations: Receive and prioritize service tickets related to Data Manipulation Language (DML) operations, including INSERT, UPDATE, DELETE, and SELECT queries. Analyze and troubleshoot reported issues, such as data inconsistency, performance degradation, or query optimization. Work closely with application developers and end-users to understand the context and requirements of DML operations. Provide guidance and recommendations to developers and architects on optimizing DML queries for improved performance and efficiency. Implement database schema changes, data migrations, and data transformations as requested through service tickets, ensuring proper testing and validation procedures are followed by the development team. Communicate updates and resolutions to stakeholders in a timely and transparent manner, ensuring customer satisfaction and alignment with service level agreements. Collaborate with other teams, such as application support, quality assurance, and release management, to address cross-functional dependencies and ensure smooth execution of DML-related tasks.
12. Collaboration with Developers for DDL Operations: Assist developers/application architects in planning and executing Data Definition Language (DDL) operations, such as creating, altering, and dropping database objects (tables, indexes, views, etc.). Review proposed schema changes and provide recommendations on best practices for database design and optimization in consultation with application architects and developers. Perform impact analysis to assess the potential implications of DDL changes on existing data, applications, and performance. Execute DDL changes during scheduled maintenance windows, following change management procedures and ensuring minimal disruption to production systems.
13. Data Archival and Cleanup: Collaborate with application architects and developers to define data retention policies and archival strategies for each schema/table based on UIDAI data retention policies and business needs. Develop and implement data archival processes to move inactive or historical data to secondary storage or archival databases, freeing up space and improving database performance.
What you need to bring:
Qualification – BE/BTech/MCA/MSc
Min years of TOTAL experience – 4+ years
Location – Bengaluru, UIDAI onsite deployment
Nature/Key activities – DBA-related activities
Additional Skills: Accountability, Active Learning, Active Listening, Bias, Business Growth, Client Expectations Management, Coaching, Creativity, Critical Thinking, Cross-Functional Teamwork, Customer Centric Solutions, Customer Relationship Management (CRM), Design Thinking, Empathy, Follow-Through, Growth Mindset, Information Technology (IT) Infrastructure, Infrastructure as a Service (IaaS), Intellectual Curiosity, Long Term Planning, Managing Ambiguity, Process Improvements, Product Services, Relationship Building, and more
What We Can Offer You:
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.
Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #operations Job: Services Job Level: TCP_03
HPE is an Equal Employment Opportunity/Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
IMEA (India, Middle East, Africa) India LIXIL INDIA PVT LTD Employee Assignment Fully remote possible Full Time 1 May 2025 Title Senior Data Engineer Job Description A Data Engineer is responsible for designing, building, and maintaining large-scale data systems and infrastructure. Their primary goal is to ensure that data is properly collected, stored, processed, and retrieved to support business intelligence, analytics, and data-driven decision-making. Key Responsibilities Design and Develop Data Pipelines: Create data pipelines to extract data from various sources, transform it into a standardized format, and load it into a centralized data repository. Build and Maintain Data Infrastructure: Design, implement, and manage data warehouses, data lakes, and other data storage solutions. Ensure Data Quality and Integrity: Develop data validation, cleansing, and normalization processes to ensure data accuracy and consistency. Collaborate with Data Analysts and Business Process Owners: Work with data analysts and business process owners to understand their data requirements and provide data support for their projects. Optimize Data Systems for Performance: Continuously monitor and optimize data systems for performance, scalability, and reliability. Develop and Maintain Data Governance Policies: Create and enforce data governance policies to ensure data security, compliance, and regulatory requirements. Experience & Skills Hands-on experience in implementing, supporting, and administering modern cloud-based data solutions (Google BigQuery, AWS Redshift, Azure Synapse, Snowflake, etc.). Strong programming skills in SQL, Java, and Python. Experience in configuring and managing data pipelines using Apache Airflow, Informatica, Talend, SAP BODS or API-based extraction. Expertise in real-time data processing frameworks. Strong understanding of Git and CI/CD for automated deployment and version control. Experience with Infrastructure-as-Code tools like Terraform for cloud resource management. Good stakeholder management skills to collaborate effectively across teams. Solid understanding of SAP ERP data and processes to integrate enterprise data sources. Exposure to data visualization and front-end tools (Tableau, Looker, etc.). Strong command of English with excellent communication skills. Show more Show less
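Apache Airflow is one of the pipeline tools named in this posting. As a minimal, hedged sketch of the "design and develop data pipelines" responsibility, here is a skeleton extract-transform-load DAG; it assumes Airflow 2.4+, and the DAG id, schedule, and task bodies are invented placeholders rather than details from the role.

```python
"""Skeleton ETL DAG, assuming Apache Airflow 2.4+. The DAG id, schedule, and
task bodies are invented placeholders, not details from the posting."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    """Pull raw records from a source system (placeholder)."""


def transform() -> None:
    """Standardize and validate the extracted records (placeholder)."""


def load() -> None:
    """Write the transformed records to the warehouse (placeholder)."""


with DAG(
    dag_id="sales_daily_load",        # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```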
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
IMEA (India, Middle East, Africa) India LIXIL INDIA PVT LTD Employee Assignment Fully remote possible Full Time 1 May 2025 Title Data Engineer Job Description A Data Engineer is responsible for designing, building, and maintaining large-scale data systems and infrastructure. Their primary goal is to ensure that data is properly collected, stored, processed, and retrieved to support business intelligence, analytics, and data-driven decision-making. Key Responsibilities Design and Develop Data Pipelines: Create data pipelines to extract data from various sources, transform it into a standardized format, and load it into a centralized data repository. Build and Maintain Data Infrastructure: Design, implement, and manage data warehouses, data lakes, and other data storage solutions. Ensure Data Quality and Integrity: Develop data validation, cleansing, and normalization processes to ensure data accuracy and consistency. Collaborate with Data Analysts and Business Process Owners: Work with data analysts and business process owners to understand their data requirements and provide data support for their projects. Optimize Data Systems for Performance: Continuously monitor and optimize data systems for performance, scalability, and reliability. Develop and Maintain Data Governance Policies: Create and enforce data governance policies to ensure data security, compliance, and regulatory requirements. Experience & Skills Hands-on experience in implementing, supporting, and administering modern cloud-based data solutions (Google BigQuery, AWS Redshift, Azure Synapse, Snowflake, etc.). Strong programming skills in SQL, Java, and Python. Experience in configuring and managing data pipelines using Apache Airflow, Informatica, Talend, SAP BODS or API-based extraction. Expertise in real-time data processing frameworks. Strong understanding of Git and CI/CD for automated deployment and version control. Experience with Infrastructure-as-Code tools like Terraform for cloud resource management. Good stakeholder management skills to collaborate effectively across teams. Solid understanding of SAP ERP data and processes to integrate enterprise data sources. Exposure to data visualization and front-end tools (Tableau, Looker, etc.). Strong command of English with excellent communication skills. Show more Show less
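Data validation, cleansing, and normalization are listed among the responsibilities above. As a small, hedged sketch of that kind of step, the pandas snippet below cleans a made-up customer extract; the column names and rules are illustrative assumptions, not part of the posting.

```python
"""Sketch of a data validation / cleansing step with pandas. Column names and
rules are hypothetical examples of the kind of checks described above."""
import pandas as pd


def clean_customer_extract(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()

    # Normalize text fields: trim whitespace and standardize casing.
    out["customer_name"] = out["customer_name"].str.strip().str.title()
    out["country_code"] = out["country_code"].str.strip().str.upper()

    # Parse dates defensively; unparseable values become NaT for review.
    out["order_date"] = pd.to_datetime(out["order_date"], errors="coerce")

    # Drop exact duplicates and rows missing mandatory keys.
    out = out.drop_duplicates()
    out = out.dropna(subset=["customer_id", "order_date"])

    return out


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "customer_id": [1, 1, 2],
            "customer_name": ["  alice smith ", "  alice smith ", "BOB JONES"],
            "country_code": ["in ", "in ", "de"],
            "order_date": ["2025-05-01", "2025-05-01", "not-a-date"],
        }
    )
    print(clean_customer_extract(sample))
```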
Posted 2 weeks ago
0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
About Us
JOB DESCRIPTION
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential.
What’s In It For YOU
SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees. Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for the employees. Dynamic, Inclusive and Diverse team culture. Gender Neutral Policy. Inclusive Health Benefits for all - Medical Insurance, Personal Accident, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits. Commitment to the overall development of an employee through a comprehensive learning & development framework.
Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area basis targets set for resolution, normalization, rollback/absolute recovery and ROR.
Role Accountability
Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers. Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR. Ensure various critical segments as defined by business are reviewed and performance is driven on them. Ensure judicious use of hardship tools and adherence to the settlement waivers, both on rate and value. Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers. Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines. Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies. Ensure 100% data security using secured data transfer modes and data purging as per policy. Ensure all customer complaints received are closed within the time frame. Conduct thorough due diligence while onboarding/offboarding/renewing a vendor and ensure all necessary formalities are completed prior to allocating. Ensure agencies raise invoices in a timely manner. Monitor NFTE ACR CAPE as per the collection strategy.
Measures of Success: Portfolio Coverage, Resolution Rate, Normalization/Roll back Rate, Settlement waiver rate, Absolute Recovery, Rupee collected, NFTE CAPE, DRA certification of NFTEs, Absolute Customer Complaints, Absolute audit observations, Process adherence as per MOU.
Technical Skills / Experience / Certifications: Credit Card knowledge along with a good understanding of Collection Processes.
Competencies critical to the role: Analytical Ability, Stakeholder Management, Problem Solving, Result Orientation, Process Orientation.
Qualification: Post-Graduate / Graduate in any discipline.
Preferred Industry: FSI
Show more Show less
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Gruve
Gruve is an innovative software services startup dedicated to transforming enterprises into AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies, utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
About The Role
We are looking for a highly skilled SIEM Consultant with deep hands-on experience in designing, implementing, and configuring Splunk SIEM solutions. The ideal candidate will be responsible for deploying Splunk into customer environments, onboarding diverse log sources, configuring security use cases, and integrating external tools for end-to-end threat visibility. This role demands strong technical expertise, project delivery experience, and the ability to translate security monitoring requirements into Splunk configurations and dashboards.
Key Responsibilities
SIEM Design & Implementation: Lead the design and deployment of Splunk architecture (single/multi-site, indexer clustering, search head clustering, etc.). Define data ingestion strategies and architecture best practices. Install, configure, and optimize Splunk components (forwarders, indexers, heavy forwarders, search heads, deployment servers). Set up and manage Splunk deployment servers, apps, and configuration bundles.
Log Source Onboarding: Identify, prioritize, and onboard critical log sources across IT, cloud, network, security, and application environments. Develop onboarding playbooks for common and custom log sources. Create parsing, indexing, and field extraction logic using props.conf, transforms.conf, and custom apps. Ensure log data is normalized and categorized according to CIM (Common Information Model).
Use Case Development & Configuration: Work with SOC teams to define security monitoring requirements and detection logic. Configure security use cases, correlation rules, and alerting within Splunk Enterprise Security (ES) or core Splunk. Develop dashboards, alerts, and scheduled reports to support threat detection, compliance, and operational needs. Tune and optimize correlation rules to reduce false positives.
Tool Integration: Integrate Splunk with third-party tools and platforms such as ticketing systems (ServiceNow, JIRA), Threat Intelligence Platforms (Anomali), SOAR platforms (Splunk SOAR, Palo Alto XSOAR), and endpoint & network tools (CrowdStrike, Fortinet, Cisco, etc.). Develop and manage APIs, scripted inputs, and custom connectors for data ingestion and bidirectional integration.
Documentation & Handover: Maintain comprehensive documentation for architecture, configurations, onboarding steps, and operational procedures. Conduct knowledge transfer and operational training for security teams. Create runbooks, SOPs, and configuration backups for business continuity. Prepare HLD and LLD documents for the solution.
Required Skills & Experience: 5+ years of experience in SIEM implementation, with at least 3 years focused on Splunk. Strong knowledge of Splunk architecture, deployment methods, data onboarding, and advanced search. Experience in building Splunk dashboards, alerts, and use case logic using SPL (Search Processing Language). Familiarity with the Common Information Model (CIM) and data normalization. Experience integrating Splunk with external tools and writing automation scripts (Python, Bash, etc.); a short illustrative example follows this section.
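The skills above call out SPL and automation scripting in Python or Bash. As one hedged illustration, the sketch below runs an SPL search through Splunk's documented search/jobs REST endpoint on the default management port 8089; the host, the service-account credentials, and the query itself are invented placeholders rather than details from this role.

```python
"""Sketch: run an SPL search via Splunk's REST search/jobs endpoint.

The host, credentials, and SPL query are hypothetical placeholders; the
/services/search/jobs endpoint and port 8089 are Splunk's documented defaults.
"""
import requests

BASE = "https://splunk.example.local:8089"   # placeholder search head
AUTH = ("svc_automation", "changeme")        # placeholder service account
SPL = "search index=main sourcetype=firewall action=blocked | stats count by src_ip"


def run_blocking_search(spl: str) -> dict:
    # Create a blocking search job; with output_mode=json the response holds its sid.
    create = requests.post(
        f"{BASE}/services/search/jobs",
        data={"search": spl, "exec_mode": "blocking", "output_mode": "json"},
        auth=AUTH,
        verify=False,  # lab-only shortcut; verify TLS properly in production
    )
    create.raise_for_status()
    sid = create.json()["sid"]

    # Fetch the completed job's results as JSON.
    results = requests.get(
        f"{BASE}/services/search/jobs/{sid}/results",
        params={"output_mode": "json"},
        auth=AUTH,
        verify=False,
    )
    results.raise_for_status()
    return results.json()


if __name__ == "__main__":
    for row in run_blocking_search(SPL).get("results", []):
        print(row)
```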
Preferred Certifications: Splunk Core Certified Power User; Splunk Certified Admin or Architect; Splunk Enterprise Security Certified Admin (preferred); security certifications like CompTIA Security+, GCIA, or CISSP (optional but beneficial).
Why Gruve
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you. Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted. Show more Show less
Posted 2 weeks ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role : Service Management Practitioner Project Role Description : Support the delivery of programs, projects or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable strategic agenda. Must have skills : Microsoft Power Business Intelligence (BI) Good to have skills : Microsoft Power Apps Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Service Management Practitioner, you will support the delivery of programs, projects, or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable strategic agenda. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Coordinate the delivery of programs, projects, or managed services. - Develop and maintain relationships with key stakeholders and sponsors. - Ensure high levels of commitment from stakeholders. - Enable strategic agenda through effective coordination. - Provide regular updates and reports on project progress. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Power Business Intelligence (BI). - Good To Have Skills: Experience with Microsoft Power Apps. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Microsoft Power Business Intelligence (BI). - This position is based at our Chennai office. - A 15 years full-time education is required. Show more Show less
Posted 2 weeks ago
12.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : IBM Cognos TM1 Good to have skills : NA Minimum 12 Year(s) Of Experience Is Required Educational Qualification : Must Complete 15 years of full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the successful implementation of applications and collaborating with various teams to deliver high-quality solutions. Your typical day will involve designing and developing applications, troubleshooting issues, and contributing to key decisions to enhance application functionality and performance. Roles & Responsibilities: - Expected to be an SME in IBM Cognos TM1 - Collaborate and manage the team to perform effectively - Responsible for team decisions and ensuring successful application implementation - Engage with multiple teams and contribute to key decisions - Expected to provide solutions to problems that apply across multiple teams - Design, build, and configure applications based on business process and requirements - Troubleshoot and resolve application issues - Contribute to key decisions to enhance application functionality and performance Professional & Technical Skills: - Must To Have Skills: Proficiency in IBM Cognos TM1 - Strong understanding of statistical analysis and machine learning algorithms - Experience with data visualization tools such as Tableau or Power BI - Hands-on experience implementing various machine learning algorithms - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Additional Information: - The candidate should have a minimum of 12 years of experience in IBM Cognos TM1 - This position is based in Coimbatore - Must complete 15 years of full-time education Show more Show less
Posted 2 weeks ago
1.0 - 3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Gruve
Gruve is an innovative software services startup dedicated to transforming enterprises into AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies, utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
About The Role
We are seeking a skilled SIEM Administrator to manage and optimize different SIEM solutions. The ideal candidate will be responsible for system administration, log integration, troubleshooting, deployment, implementation, and maintaining the security posture of the organization.
Key Responsibilities
SIEM Administration: Install, configure, maintain, and upgrade SIEM components (IBM QRadar SIEM, DNIF, Splunk, and Securonix).
Log Management: Onboard, parse, and normalize logs from various data sources (firewalls, servers, databases, applications, etc.). Custom log source integration and parser development.
System Monitoring & Troubleshooting: Ensure SIEM tools are functioning optimally. Monitor SIEM tools and perform regular health checks. Troubleshoot system errors and resolve performance issues. Conduct regular performance tuning and capacity planning. Perform root cause analysis for system failures and performance issues. Optimize system performance and storage management for the SIEM.
Integration & Automation: Integrate third-party security tools (firewalls, EDR, threat intelligence feeds) with the SIEM.
Compliance & Audits: Ensure log retention policies comply with regulatory standards. Develop and enforce SIEM access controls and user roles/permissions.
Documentation & Training: Document system configurations, SOPs, and troubleshooting documents. Prepare monthly/weekly reports, presentations, and onboarding documentation as per business/client requirements.
Dashboard & Report Development: Create and maintain custom dashboards and reports. Optimize searches and reports for performance and efficiency.
Other Knowledge Base: Hands-on experience with Linux OS and Windows OS. Basic to intermediate networking knowledge. Familiarity with Azure, AWS, or GCP products.
Required Skills & Qualifications
B.E/B.Tech degree in Computer Science, Cybersecurity, or a related field (preferred). 1-3 years of experience as a SOC administrator. Strong knowledge of SIEM architecture, log sources, and event correlation. Proficiency in log management, regular expressions, and network security concepts. Experience integrating SIEM with various security tools (firewalls, IDS/IPS, antivirus, etc.). Scripting knowledge (Python, Bash, or PowerShell) is a plus. Training or certification in Splunk or IBM QRadar preferred.
Soft Skills
Strong analytical and problem-solving skills. Excellent communication and documentation abilities. Ability to work independently and in a team.
Must Have Skills
Hands-on experience with SIEM tools like IBM QRadar, Splunk, Securonix, LogRhythm, Microsoft Sentinel, DNIF, etc. Proficiency in IBM QRadar and Splunk administration: configuring, maintaining, and troubleshooting SIEM solutions. Log source integration, parsing, and normalization. Strong knowledge of TCP/IP, DNS, HTTP, SMTP, FTP, VPNs, proxies, and firewall rules. Familiarity with Linux and Windows system administration.
Why Gruve
At Gruve, we foster a culture of innovation, collaboration, and continuous learning.
We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you. Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted. Show more Show less
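Log source integration, parsing, and normalization appear in both the responsibilities and the must-have skills for this role. As a minimal illustration, the Python sketch below parses a made-up key=value firewall event into normalized, CIM-style field names; the log format and the field mapping are assumptions for the example only.

```python
"""Sketch: parse a hypothetical key=value firewall log line into normalized,
CIM-style fields. The log format and field mapping are illustrative only."""
import re

# Matches key=value pairs, allowing quoted values (hypothetical format).
KV_PATTERN = re.compile(r'(\w+)=("[^"]*"|\S+)')

# Vendor field name -> normalized (CIM-style) field name, as an example mapping.
FIELD_MAP = {"srcip": "src_ip", "dstip": "dest_ip", "dport": "dest_port", "act": "action"}


def normalize(line: str) -> dict:
    raw = {key: value.strip('"') for key, value in KV_PATTERN.findall(line)}
    return {FIELD_MAP.get(key, key): value for key, value in raw.items()}


if __name__ == "__main__":
    sample = 'ts=2025-05-01T10:15:00Z srcip=10.1.2.3 dstip=203.0.113.7 dport=443 act="blocked"'
    print(normalize(sample))
    # -> {'ts': '2025-05-01T10:15:00Z', 'src_ip': '10.1.2.3',
    #     'dest_ip': '203.0.113.7', 'dest_port': '443', 'action': 'blocked'}
```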
Posted 2 weeks ago
7.5 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Project Role : Full Stack Engineer Project Role Description : Responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. Use development skills to deliver innovative solutions that help our clients improve the services they provide. Leverage new technologies that can be applied to solve challenging business problems with a cloud-first and agile mindset. Must have skills : Java Full Stack Development, Node.js Good to have skills : NA Minimum 7.5 Year(s) Of Experience Is Required Educational Qualification : BE Summary: As a Full Stack Engineer, you will be responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. You will use your development skills to deliver innovative solutions that help our clients improve the services they provide. Additionally, you will leverage new technologies to solve challenging business problems with a cloud-first and agile mindset. Roles & Responsibilities: - Expected to be an SME, collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Develop and engineer end-to-end features of a system. - Deliver innovative solutions to improve client services. - Utilize development skills to solve challenging business problems. - Stay updated with new technologies and apply them to projects. Professional & Technical Skills: - Must To Have Skills: Proficiency in Java Full Stack Development, Apache Kafka. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Java Full Stack Development. - This position is based at our Bengaluru office. - A BE degree is required. Show more Show less
Posted 2 weeks ago
7.5 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role : Business Analyst Project Role Description : Analyze an organization and design its processes and systems, assessing the business model and its integration with technology. Assess current state, identify customer requirements, and define the future state and/or business solution. Research, gather and synthesize information. Must have skills : Data Analytics Good to have skills : NA Minimum 7.5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Business Analyst, you will analyze an organization and design its processes and systems, assessing the business model and its integration with technology. You will assess the current state, identify customer requirements, and define the future state and/or business solution. Research, gather, and synthesize information to drive business decisions. Roles & Responsibilities: - Expected to be an SME, collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Lead process improvement initiatives to enhance efficiency. - Conduct data analysis to identify trends and insights. - Develop business cases and recommendations based on data analysis. - Facilitate communication between business stakeholders and technical teams. Professional & Technical Skills: - Must To Have Skills: Proficiency in Data Analytics. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms (a brief illustrative sketch follows this posting). - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Data Analytics. - This position is based at our Hyderabad office. - A 15 years full-time education is required. Show more Show less
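The skills list above mentions hands-on experience with algorithms such as linear regression. Purely as an illustration on synthetic data, the sketch below fits a linear regression with scikit-learn and reports held-out R^2; the feature meanings and numbers are invented and are not part of the posting.

```python
"""Sketch: fit a linear regression with scikit-learn on synthetic data.
The features and coefficients are invented purely for illustration."""
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic example: predict monthly spend from ad budget and store visits.
X = rng.uniform(0, 100, size=(200, 2))
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 5, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print("Coefficients:", model.coef_)
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```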
Posted 2 weeks ago
Upload Resume
Drag or click to upload
Your data is secure with us, protected by advanced encryption.
Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
We have sent an OTP to your contact. Please enter it below to verify.
Accenture
20312 Jobs | Dublin
Wipro
11977 Jobs | Bengaluru
EY
8165 Jobs | London
Accenture in India
6667 Jobs | Dublin 2
Uplers
6464 Jobs | Ahmedabad
Amazon
6352 Jobs | Seattle,WA
Oracle
5993 Jobs | Redwood City
IBM
5803 Jobs | Armonk
Capgemini
3897 Jobs | Paris,France
Tata Consultancy Services
3776 Jobs | Thane