
2326 Data Governance Jobs - Page 30

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 8.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Are you passionate about data and code? Does the prospect of dealing with mission-critical data excite you? Do you want to build data engineering solutions that process a broad range of business and customer data? Do you want to continuously improve the systems that enable annual worldwide revenue of hundreds of billions of dollars? If so, then the eCommerce Services (eCS) team is for you!

In eCommerce Services (eCS), we build systems that span the full range of eCommerce functionality, from Privacy, Identity, Purchase Experience and Ordering to Shipping, Tax and Financial integration. eCommerce Services manages several aspects of the customer life cycle, starting from account creation and sign-in, to placing items in the shopping cart, proceeding through checkout, order processing, managing order history and post-fulfillment actions such as refunds and tax invoices. eCS services determine sales tax and shipping charges, and we ensure the privacy of our customers. Our mission is to provide a commerce foundation that accelerates business innovation and delivers a secure, available, performant, and reliable shopping experience to Amazon's customers.

The goal of the eCS Data Engineering and Analytics team is to provide high-quality, on-time reports to Amazon business teams, enabling them to expand globally at scale. Our team has a direct impact on retail CX, a key component that runs the Amazon flywheel. As a Data Engineer, you will own the architecture of DW solutions for the Enterprise using multiple platforms. You will have the opportunity to lead the design, creation and management of extremely large datasets, working backwards from business use cases. You will use your strong business and communication skills to work with business analysts and engineers to determine how best to design the data warehouse for reporting and analytics. You will be responsible for designing and implementing scalable ETL processes in the data warehouse platform to support the rapidly growing and dynamic business demand for data, and for delivering data as a service that has an immediate influence on day-to-day decision making.

Key responsibilities:
- Develop data products, infrastructure and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, Lambda, etc.) and internal BDT tools (DataNet, Cradle, QuickSight, etc.).
- Improve existing solutions and come up with a next-generation data architecture to improve scale, quality, timeliness, coverage, monitoring and security.
- Develop new data models and end-to-end data pipelines.
- Create and implement a data governance strategy for mitigating privacy and security risks.

Qualifications:
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Bachelor's degree
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
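The posting centers on ETL pipelines that land AWS source data in a warehouse. As a purely illustrative sketch (not Amazon's actual stack), here is a minimal PySpark batch job of that shape; the bucket paths and column names are hypothetical:

```python
# Minimal PySpark batch-ETL sketch: read raw order events from S3, apply a
# light transform, and write curated Parquet back out. All bucket names,
# paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/orders/2024/")  # hypothetical path

curated = (
    raw.filter(F.col("order_status") == "COMPLETED")           # keep fulfilled orders
       .withColumn("order_date", F.to_date("order_timestamp"))  # derive a partition column
       .select("order_id", "customer_id", "order_date", "total_amount")
)

# Partitioned Parquet on S3 is a common landing format before a Redshift COPY.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```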

Posted 2 weeks ago

Apply

2.0 - 5.0 years

10 - 14 Lacs

Pune

Work from Office

What you'll do:
- Work independently within Data and Analytics with limited design help from your manager or senior associates.
- Leverage coding best practices and advanced techniques to ensure efficient execution of code against large datasets, ensuring code is repeatable and scalable.
- Run, create and optimize standard processes to ensure metrics, reports and insights are delivered consistently to stakeholders with minimal manual intervention.
- Leverage knowledge of data structures to prepare data for ingestion efforts and analysis, assembling data from disparate data sources for the creation of insights; accurately integrate new and complex data sources.
- Integrate Equifax, customer and third-party data to solve internal or customer analytical problems of moderate complexity and report findings to managers and stakeholders.
- Review the output of code for anomalies, perform analysis to determine the cause, and work with Data, Analytics, Product and Technology counterparts to implement corrective measures.
- Communicate the impact and importance of findings on the business (either Equifax or an external customer) and recommend an appropriate course of action. Understand the concepts of quantitative and qualitative data and how to relate them to the customer to show the value of the analysis.
- Ensure proper use of Equifax data assets by working closely with data governance and compliance professionals.

What experience you need:
- BS degree in a STEM major or equivalent discipline
- 2-5 years of experience in a related analyst role
- Cloud certification strongly preferred
- Technical capabilities including SQL, BigQuery, R, Python, MS Excel / Google Sheets, Tableau, Looker
- Experience working as a team and collaborating with others on producing descriptive and diagnostic analysis

What could set you apart:
- Cloud certification such as GCP strongly preferred
- Self-starter
- Excellent communicator / client-facing
- Ability to work in a fast-paced environment
- Flexibility to work across A/NZ time zones based on project needs

Primary Location: IND-Pune-Equifax Analytics-PEC
Function: Data and Analytics
Schedule: Full time
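Since the role pairs Python with BigQuery for descriptive analysis, here is a hedged sketch of that workflow using the google-cloud-bigquery client; the project, dataset, and query are invented for illustration:

```python
# Illustrative only: a small Python + BigQuery pattern for descriptive
# analysis. Project, dataset, and table names are hypothetical; requires the
# google-cloud-bigquery package and application-default credentials.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")  # hypothetical project

sql = """
    SELECT region, COUNT(*) AS accounts, AVG(credit_score) AS avg_score
    FROM `example-analytics-project.reporting.accounts`
    GROUP BY region
    ORDER BY accounts DESC
"""

for row in client.query(sql).result():  # runs the job and waits for completion
    print(row["region"], row["accounts"], round(row["avg_score"], 1))
```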

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Noida

Work from Office

Job Description: We are looking for a seasoned Data Engineer with extensive experience in designing and implementing data pipelines using Medallion Architecture, Azure Databricks, and Snowflake. The ideal candidate will be responsible for building scalable ETL pipelines, optimizing data flows, and ensuring data quality for large-scale data platforms.

Key Responsibilities:
- Design, develop, and optimize data pipelines following Medallion Architecture (Bronze, Silver, Gold layers).
- Implement and maintain ETL pipelines using Databricks and Python (multi-threading and multi-processing).
- Leverage Snowflake for data warehousing, including schema design, data loading, and performance tuning. This also includes experience with Linux, Docker, Anaconda, Pandas, PySpark, Apache Hive and Iceberg, Trino, and Prefect.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver robust data solutions.
- Develop data models and manage data lakes for structured and unstructured data.
- Implement data governance and security best practices.
- Monitor and troubleshoot data pipelines for performance and reliability.
- Stay up to date with industry trends and best practices in data engineering and cloud technologies.

Minimum Qualification: B.Tech/B.E. (Computer Science/IT/Electronics), MCA, or a computer diploma in development; 3+ years of experience is compulsory.
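For readers unfamiliar with the Medallion Architecture the posting names, a minimal PySpark sketch of the Bronze/Silver/Gold progression follows; the paths and columns are hypothetical, and the "delta" format assumes Delta Lake is available, as it is on Databricks:

```python
# A minimal Bronze/Silver/Gold sketch in PySpark with Databricks-style paths.
# Table and path names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw ingestion, stored as-is with load metadata.
bronze = spark.read.json("/mnt/raw/sensor_events/").withColumn(
    "_ingested_at", F.current_timestamp()
)
bronze.write.format("delta").mode("append").save("/mnt/bronze/sensor_events")

# Silver: cleaned and de-duplicated records.
silver = (
    spark.read.format("delta").load("/mnt/bronze/sensor_events")
         .filter(F.col("device_id").isNotNull())
         .dropDuplicates(["device_id", "event_time"])
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/sensor_events")

# Gold: business-level aggregates ready for reporting.
gold = silver.groupBy("device_id").agg(F.avg("temperature").alias("avg_temp"))
gold.write.format("delta").mode("overwrite").save("/mnt/gold/device_avg_temp")
```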

Posted 2 weeks ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Mumbai

Work from Office

Overview: We are seeking a highly skilled and experienced Power BI Developer to join our dynamic team. The ideal candidate will have 5-7 years of hands-on experience in developing and implementing data analytics and business intelligence solutions using Power BI. This role involves working closely with business stakeholders, data engineers, and IT teams to deliver impactful insights and reports that drive business decisions.

Key Responsibilities:
- Report and Dashboard Development: Design, develop, and maintain interactive Power BI dashboards and reports, ensuring the quality and integrity of data visualizations.
- Data Modeling: Build and maintain complex data models, including transforming, cleaning, and structuring data from multiple sources to create user-friendly reports.
- DAX & Power Query Expertise: Develop advanced DAX measures and formulas to enhance the functionality of Power BI reports. Utilize Power Query for data transformation and manipulation.
- Performance Optimization: Ensure Power BI reports and dashboards are optimized for performance, ensuring quick data load and processing times.
- Stakeholder Collaboration: Work closely with business users and other teams (data engineering, data science, etc.) to understand business requirements and translate them into technical solutions.
- Data Integration: Integrate data from various sources (SQL Server, Excel, APIs, etc.) into Power BI for comprehensive analysis.
- Data Governance & Security: Implement row-level security (RLS) in Power BI reports to ensure data privacy and governance.
- Documentation & Training: Create and maintain documentation on report and dashboard functionality, and provide training to users on how to interact with reports and dashboards.
- Troubleshooting: Troubleshoot and resolve issues with Power BI reports, including data inconsistencies, performance problems, and system errors.

Skills & Qualifications:
- Experience: 5-7 years of hands-on experience in Power BI development, with a strong portfolio of reports and dashboards.
- Power BI Expertise: Proficient in Power BI Desktop, Power BI Service, and Power BI Report Server.
- DAX and Power Query: Advanced knowledge of DAX (Data Analysis Expressions) and the Power Query M language.
- Data Visualization: Strong understanding of data visualization best practices and the ability to create effective and engaging dashboards.
- Data Modeling: Experience in designing and implementing complex data models for reporting and analysis.
- SQL: Strong SQL skills, with the ability to write complex queries for data extraction and manipulation.
- ETL Processes: Knowledge of ETL processes and data integration from multiple sources (SQL Server, Excel, APIs, etc.).
- BI Tools Knowledge: Familiarity with other BI tools (Tableau, Qlik, etc.) is a plus.
- Data Warehousing: Knowledge of data warehousing concepts, star/snowflake schemas, and dimensional modeling is preferred.
- Problem Solving: Strong analytical and troubleshooting skills to identify and resolve issues with data and reporting.
- Communication Skills: Excellent verbal and written communication skills to work with business users and technical teams.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Overview: Azure Data Architect - Bangalore

Aptean is changing. Our ERP solutions are transforming a huge range of global businesses, from food producers to manufacturers. In a world of generic enterprise software, we provide targeted solutions that bring together the very best technology and drive greater results. With over 4,500 employees, 90 different products and a global client base, there's no better time to advance your career at Aptean. Are you ready for what's next, now? We are! If being part of a dynamic, high-growth organization excites you and you are a Senior Data Architect eager to learn and grow, then this opportunity is for you! Our fast-paced environment and dynamic global R&D department are eager for a mover and shaker to step into this role and become an integral part of our team.

Job Summary: We are looking for a seasoned Data Architect with deep expertise in Spark to lead the design and implementation of modern data processing solutions. The ideal candidate will have extensive experience in distributed data processing, large-scale data pipelines, and cloud-native data platforms. This is a strategic role focused on building scalable, fault-tolerant, and high-performance data systems.

Key Responsibilities:
- Architect, design, and implement large-scale data pipelines using Spark (batch and streaming).
- Optimize Spark jobs for performance, cost-efficiency, and scalability.
- Define and implement enterprise data architecture standards and best practices.
- Guide the transition from traditional ETL platforms to Spark-based solutions.
- Lead the integration of Spark-based pipelines into cloud platforms (Azure Fabric/Spark pools).
- Establish and enforce data architecture standards, including governance, lineage, and quality.
- Mentor data engineering teams on best practices with Spark (e.g., partitioning, caching, join strategies).
- Implement and manage CI/CD pipelines for Spark workloads using tools like Git or Azure DevOps.
- Ensure robust monitoring, alerting, and logging for Spark applications.

Required Skills & Qualifications:
- 10+ years of experience in data engineering, with 7+ years of hands-on experience with Apache Spark (PySpark/Scala).
- Proficiency in Spark optimization techniques, monitoring, caching, advanced SQL, and distributed data design.
- Experience with Spark on Databricks and Azure Fabric.
- Solid understanding of Delta Lake, Spark Structured Streaming, and data pipelines.
- Strong experience in cloud platforms (Azure).
- Proven ability to handle large-scale datasets (terabytes to petabytes).
- Familiarity with data lakehouse architectures, schema evolution, and data governance.
- At least 3+ years of experience in Power BI.

Preferred Qualifications:
- Experience implementing real-time analytics using Spark Streaming or Structured Streaming.
- Certifications in Databricks, Fabric or Spark would be a plus.

If you share our mindset, you can share in our success. To find out more about joining Aptean, get in touch today. Learn from our differences. Celebrate our diversity. Grow and succeed together. Aptean pledges to promote a company culture where diversity, equity and inclusion are central. We are committed to applying this principle as we interact with our customers, build our teams, cultivate our leaders and shape a company in which any employee can succeed, regardless of race, color, sex, national origin, sexuality and gender identity, religion, disability, age, status as a protected veteran or any other group status protected by law.
Celebrating our diverse experiences, opinions and beliefs allows us to embrace what makes us unique and to use this as an asset in bringing innovative solutions to our customer base. At Aptean, our global and diverse employee base is our greatest asset. It is through embracing and understanding our differences that we are able to harness our individual power to maximize the success of our customers, our employees and our company. - TVN Reddy

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Design and create compelling data visualizations, dashboards, and reports that provide actionable insights to support decision-making. Hands-on experience in writing complex SQL queries and creating stored procedures for SSRS paginated reporting. Good understanding of reporting. Work closely with data engineers and data analysts. Continuously optimize existing reports, ensuring performance, accuracy, and responsiveness, and addressing any data quality issues.

Skills:
- Minimum experience: 3 to 8 years
- T-SQL/PL-SQL, SSRS paginated reports, SAP BI, Power BI
- Good communication
- Strong aptitude

Qualifications (topic checklist):

Power BI:
- Key features of Power BI
- Data integration techniques
- Data refresh in Power BI
- Data governance and security in Power BI
- Active and inactive relationships
- Filters in Power BI
- DAX functions: CALCULATE, SUMX, AVERAGEX; time intelligence functions (e.g., YTD); filter functions; text functions; logical functions (e.g., AND); date and time functions (e.g., MONTH)

SSRS:
- Key components of SSRS
- Paginated reports
- Creating parameters in SSRS
- Best practices for SSRS paginated reports and for parameters
- Changing the sequence of report parameters in SSRS

SQL:
- DML & DDL
- Primary key, unique key, foreign key
- Types of joins
- Date and aggregate functions
- String functions
- Set operators
- Window functions
- CTEs
- Local and global temp tables
- Performance tuning
- Sample queries based on the above topics, e.g., a query to identify and remove data redundancy in a table, and the number of records produced by different joins
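The SQL checklist ends with a classic exercise: identifying and removing redundant rows. Below is a minimal sketch of one common answer, a CTE with a ROW_NUMBER() window function, run here against an in-memory SQLite database (the table and data are invented; a SQLite build of 3.25+ is assumed for window-function support, and the same pattern carries over to T-SQL/PL-SQL):

```python
# Interview-prep sketch: de-duplicate a table with a CTE + ROW_NUMBER().
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT);
    INSERT INTO customers VALUES
        (1, 'a@x.com'), (2, 'b@x.com'), (3, 'a@x.com'), (4, 'b@x.com');
""")

# Keep the lowest id per email; delete every later duplicate.
conn.execute("""
    WITH ranked AS (
        SELECT id, ROW_NUMBER() OVER (PARTITION BY email ORDER BY id) AS rn
        FROM customers
    )
    DELETE FROM customers WHERE id IN (SELECT id FROM ranked WHERE rn > 1)
""")

print(conn.execute("SELECT * FROM customers ORDER BY id").fetchall())
# -> [(1, 'a@x.com'), (2, 'b@x.com')]
```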

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 15 Lacs

Gurugram, Ahmedabad

Work from Office

We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.
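As an illustration of the real-time ingestion pattern this posting describes (Kafka feeding Spark on cloud storage), here is a hedged Structured Streaming sketch; the brokers, topic, and bucket paths are hypothetical, and the spark-sql-kafka connector package must be on the classpath:

```python
# Spark Structured Streaming reading from Kafka and landing raw payloads on
# GCS. Broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
         .option("subscribe", "clickstream")                  # hypothetical topic
         .load()
         .select(F.col("value").cast("string").alias("payload"),
                 F.col("timestamp"))
)

query = (
    events.writeStream.format("parquet")
          .option("path", "gs://example-lake/bronze/clickstream/")
          .option("checkpointLocation", "gs://example-lake/checkpoints/clickstream/")
          .start()
)
query.awaitTermination()
```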

Posted 2 weeks ago

Apply

8.0 - 10.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of the role is to facilitate visual interpretation of data from multiple sources and use this information to develop data-driven solutions as per the client's requirements.

Do:
1. Develop valuable insights from multiple data sources as per client requirements
a. Customer engagement and requirements gathering:
- Understand customer needs and objectives, technology trends and requirements to define how the data will be seen as the final output
- Develop wireframes, prototypes and use cases in order to demonstrate the final data output as required by the customer
- Analyse, propose and implement the data technology and tools used for data visualization
- Provide solutioning of RFPs received from clients and ensure the final data output meets business needs
- Validate the solution/prototype from a technology, cost-structure and customer-differentiation point of view
b. Design and implementation of data visual aspects:
- Architect and build data visualization capabilities to produce classical BI dashboards and solutions
- Create the solutions using a variety of data mining/data analysis methods, data tools, data models and data semantics
- Contribute to the design and implementation of the data platform architecture related to data visualization needs
- Collaborate with other data architects to establish and run data governance processes
- Manage metadata and semantic-layer data on data domains and other enterprise data assets
- Identify problem areas, perform root cause analysis of the overall data flow and provide relevant solutions
c. Enable the pre-sales team:
- Support the pre-sales team while presenting the entire data design and its principles to the client
- Negotiate, manage and coordinate with the client teams to ensure all requirements are met and create the visual data output as proposed
- Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor
2. Capability building and team management
a. Ensure completion of necessary trainings and certifications
b. Develop and present Wipro's point of view on data visualization concepts and architecture by writing white papers, blogs, etc.
c. Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
d. Mentor developers, designers and junior architects for their further career development and enhancement
e. Anticipate new talent requirements as per market/industry trends and client requirements
f. Hire adequate and right resources for the team
g. Contribute to the data visualization practice by conducting selection interviews, etc.

Deliver:
No. 1 - Performance Parameter: Project delivery - Measure: quality of design/architecture; delivery as per cost, quality and timeline.

Mandatory Skills: Business Analyst / Data Analyst (Maps). Experience: 8-10 years.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications that need to be deployed at the client end, and to ensure they meet 100% quality assurance parameters.

Do:
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyse client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and the proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance or upgrade interfaces
- Analyse information to recommend and plan the installation of new systems or modifications to existing systems
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks and report them to the concerned stakeholders
3. Maintain status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document all necessary details and reports in a formal way for proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver:
No. 1 - Performance Parameter: Continuous integration, deployment and monitoring of software - Measure: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan.
No. 2 - Performance Parameter: Quality & CSAT - Measure: on-time delivery, software management, troubleshooting of queries, customer experience, completion of assigned certifications for skill upgradation.
No. 3 - Performance Parameter: MIS & reporting - Measure: 100% on-time MIS and report generation.

Mandatory Skills: Data Governance. Experience: 5-8 years.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Pune

Work from Office

About The Role: eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The work also involves creating technical specifications and product descriptions for online presentation, as well as consultancy projects on redesigning e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and any concerns. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area are a plus.

Apprentice Analyst - Roles and responsibilities:
- Data enrichment/gap-fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc.
- Data quality checks and correction
- Data profiling and reporting (basic)
- Email communication with the client on request acknowledgment, project status and responses to queries
- Help customers enhance their product data quality from the technical specification and description perspective
- Provide technical consulting to the customer's category managers on industry best practices for product data enhancement

Technical and Functional Skills:
- Bachelor's degree (any graduate)
- Good understanding of tools and technology
- Intermediate knowledge of MS Office/Internet

Posted 2 weeks ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Mumbai

Work from Office

About The Role: eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The work also involves creating technical specifications and product descriptions for online presentation, as well as consultancy projects on redesigning e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and any concerns. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area are a plus.

Apprentice Analyst - Roles and responsibilities:
- Data enrichment/gap-fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc.
- Data quality checks and correction
- Data profiling and reporting (basic)
- Email communication with the client on request acknowledgment, project status and responses to queries
- Help customers enhance their product data quality from the technical specification and description perspective
- Provide technical consulting to the customer's category managers on industry best practices for product data enhancement

Technical and Functional Skills:
- Bachelor's degree (any graduate)
- Good understanding of tools and technology
- Intermediate knowledge of MS Office/Internet

Posted 2 weeks ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Coimbatore

Work from Office

About the job:
Experience: 5+ years. Notice period: Immediate to 15 days. Interview rounds: 3 (virtual).
Mandatory skills: Apache Spark, Hive, Hadoop, Scala, Databricks.

Job Description - The Role:
- Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights.
- Constructing infrastructure for efficient ETL processes from various sources and storage systems.
- Leading the implementation of algorithms and prototypes to transform raw data into useful information.
- Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
- Creating innovative data validation methods and data analysis tools.
- Ensuring compliance with data governance and security policies.
- Interpreting data trends and patterns to establish operational alerts.
- Developing analytical tools, programs, and reporting mechanisms.
- Conducting complex data analysis and presenting results effectively.
- Preparing data for prescriptive and predictive modeling.
- Continuously exploring opportunities to enhance data quality and reliability.
- Applying strong programming and problem-solving skills to develop scalable solutions.

Requirements:
- Experience with Big Data technologies (Hadoop, Spark, NiFi, Impala).
- 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
- High proficiency in Scala/Java and Spark for applied large-scale data processing.
- Expertise with big data technologies, including Spark, Data Lake, and Hive.

Posted 2 weeks ago

Apply

12.0 - 16.0 years

35 - 50 Lacs

Chennai

Work from Office

Job Summary: As an Infra Architect, you will be responsible for designing and implementing robust infrastructure solutions using Microsoft technologies. You will collaborate with cross-functional teams to ensure seamless integration and security of systems. Your expertise in Microsoft Purview, the Microsoft Defender Suite, and Azure AD Identity Protection will be crucial in enhancing the company's infrastructure capabilities.

Responsibilities:
- Design and implement infrastructure solutions leveraging Microsoft technologies to meet business needs.
- Collaborate with cross-functional teams to ensure seamless integration of systems and applications.
- Provide expertise in Microsoft Purview to enhance data governance and compliance across the organization.
- Utilize the Microsoft Defender Suite to strengthen the security posture of the company's infrastructure.
- Implement Azure AD Identity Protection to safeguard user identities and access management.
- Configure Always On VPN to ensure secure remote access for employees.
- Deploy AppLocker to control application execution and enhance endpoint security.
- Utilize Microsoft Defender ATP to detect, investigate, and respond to advanced threats.
- Manage Microsoft Entra ID to streamline identity and access management processes.
- Enhance Microsoft 365 security to protect organizational data and communications.
- Oversee the hybrid work model implementation, ensuring seamless connectivity and security.
- Provide technical guidance and support to IT teams to resolve complex infrastructure issues.
- Contribute to the continuous improvement of infrastructure processes and practices.

Qualifications:
- A deep understanding of Microsoft Purview and its application in data governance.
- Expertise in the Microsoft Defender Suite for comprehensive security management.
- Experience with Azure AD Identity Protection to enhance identity security.
- Proficiency in configuring Always On VPN for secure remote access.
- Capability in deploying AppLocker for application control.
- Skill in using Microsoft Defender ATP for threat detection and response.
- Knowledge of Microsoft Entra ID for effective identity management.

Certifications Required:
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Security, Compliance, and Identity Fundamentals

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 6 Lacs

Navi Mumbai

Work from Office

Required Details:
1. Total IT experience:
2. Experience in Kafka:
3. Experience in Kafka Connect, Schema Registry, Kafka Streams:
4. Experience in Kafka clusters:
5. Current CTC:
6. Expected CTC:
7. Notice period/LWD:
8. Current location:
9. Willing to relocate to Navi Mumbai:
10. Willing to work on alternate Saturdays:

Job Title: Kafka Administrator (5+ years of experience)
Location: CBD Belapur, Navi Mumbai
Job Type: Full-time
Experience Required: 5+ years
Educational Qualification: B.E / B.Tech / BCA / B.Sc-IT / MCA / M.Sc-IT / M.Tech

Job Summary: We are looking for a skilled and experienced Kafka Administrator with a minimum of 5 years of experience in managing Apache Kafka environments. The ideal candidate will be responsible for the deployment, configuration, monitoring, and maintenance of Kafka clusters to ensure system scalability, reliability, and performance.

Key Responsibilities:
- Install, configure, and maintain Apache Kafka clusters in production and development environments.
- Monitor Kafka systems using appropriate tools and proactively respond to issues.
- Set up Kafka topics, manage partitions, and define data retention policies.
- Perform upgrades and patch management for Kafka and its components.
- Collaborate with application teams to ensure seamless Kafka integration.
- Troubleshoot and resolve Kafka-related production issues.
- Develop and maintain scripts for automation of routine tasks.
- Ensure security, compliance, and data governance for the Kafka infrastructure.
- Maintain documentation and operational runbooks.

Required Skills:
- Strong experience with Apache Kafka and its ecosystem (Kafka Connect, Schema Registry, Kafka Streams).
- Proficiency in Kafka cluster monitoring and performance tuning.
- Experience with tools such as Prometheus, Grafana, and the ELK stack.
- Solid knowledge of Linux/Unix system administration.
- Hands-on experience with scripting languages like Bash and Python.
- Familiarity with DevOps tools (Ansible, Jenkins, Git).
- Experience with cloud-based Kafka deployments (e.g., Confluent Cloud, AWS MSK) is a plus.

Qualification Criteria: Candidates must hold at least one of the following degrees:
- B.E (Bachelor of Engineering)
- B.Tech (Bachelor of Technology)
- BCA (Bachelor of Computer Applications)
- B.Sc-IT (Bachelor of Science in Information Technology)
- MCA (Master of Computer Applications)
- M.Sc-IT (Master of Science in Information Technology)
- M.Tech (Master of Technology)

Preferred Certifications (not mandatory):
- Confluent Certified Administrator for Apache Kafka (CCAAK)
- Linux and cloud administration certifications (RHCSA, AWS, Azure)
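One of the routine admin tasks listed above, creating a topic with explicit partitions and a retention policy, can be scripted with the confluent-kafka Python package. This is an illustrative sketch with a hypothetical broker and topic name:

```python
# Create a Kafka topic with explicit partitions and a 7-day retention policy.
# Broker address and topic name are hypothetical; requires confluent-kafka.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker-1:9092"})  # hypothetical broker

topic = NewTopic(
    "orders.events",          # hypothetical topic name
    num_partitions=6,
    replication_factor=3,
    config={"retention.ms": str(7 * 24 * 60 * 60 * 1000)},  # 7-day retention
)

# create_topics() is asynchronous and returns {topic_name: future}.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()  # raises on failure (e.g., topic already exists)
        print(f"Created topic {name}")
    except Exception as exc:
        print(f"Failed to create {name}: {exc}")
```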

Posted 2 weeks ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Job Title: Business Analyst

Responsibilities:

Analytical Support:
- Gather all operational and financial data across all centers to provide inputs into the weekly MIS as well as the Monthly Review Meeting.
- Drive meaningful weekly/monthly reports that help the regional managers take decisions on their centers' health.
- Analyse financial data (budgets, income statements, etc.) to understand Oasis Fertility's financial health.
- Coordinate all operational issues captured at the center level and program-manage their closure through cross-functional collaboration.
- Evaluate operational expenditures (OPEX) and capital expenditures (CAPEX) against the budget to identify variances.
- Analyse operational data to identify trends and areas for improvement.
- Conduct ad-hoc analytics around a hypothesis and derive insights that impact business performance.

Operational Support:
- Coordinate the assimilation of data for calculating doctor payouts and facilitate the final file to finance.
- Coordinate and assimilate data to calculate incentives for eligible operations team members.
- Use key metrics like yearly growth, return on assets (ROA), return on equity (ROE), and earnings per share (EPS) to assess operational performance.
- Collaborate with the operations and finance teams to ensure alignment between operational and financial goals.

Strategic Support:
- Conduct business studies to understand past, present, and potential future performance.
- Conduct market research to stay updated on financial trends in the fertility industry.
- Evaluate the effectiveness of current processes and recommend changes for better efficiency.
- Develop data-driven recommendations to improve operational efficiency.
- Prepare financial models to assess the profitability of different business units and potential investment opportunities.
- Participate in process improvement initiatives and policy development to optimize business functions.

Posted 2 weeks ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Pune

Hybrid

Hi, wishes from GSN! Pleasure connecting with you!

We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT and non-IT clients in India, and have been successfully meeting our clients' needs for the last 20 years. This opening is with one of our leading MNC clients. Please find the details below:

Work Location: Pune
Job Role: Big Data Solution Architect
Experience: 10-20 years
CTC Range: 25-35 LPA
Work Type: Hybrid

Required Skills & Experience:
- 10+ years of progressive experience in software development, data engineering, and solution architecture roles, with a strong focus on large-scale distributed systems.
- Apache Spark: deep expertise in Spark architecture, Spark SQL, Spark Streaming, performance tuning, and optimization techniques; experience with both batch and real-time data processing paradigms.
- Hadoop ecosystem: strong understanding of HDFS, YARN, Hive, and other related Hadoop components.
- Real-time data streaming with Apache Kafka: expert-level knowledge of Kafka architecture, topics, partitions, producers, consumers, Kafka Streams, KSQL, and best practices for high-throughput, low-latency data pipelines.
- NoSQL databases, especially Couchbase: in-depth experience with Couchbase (or similar document/key-value NoSQL databases like MongoDB or Cassandra), including data modeling, indexing, querying (N1QL), replication, scaling, and operational best practices.
- API design & development: extensive experience in designing and implementing robust, scalable, and secure APIs (RESTful, GraphQL) for data access and integration.
- Programming & code review: hands-on coding proficiency in at least one relevant language (Python, Scala, Java), with a preference for Python and/or Scala for data engineering tasks; proven experience in leading and performing code reviews, ensuring code quality, performance, and adherence to architectural guidelines.
- Cloud platforms: extensive experience designing and implementing solutions on at least one major cloud platform (AWS, Azure, GCP), leveraging its Big Data, streaming, and compute services.
- Database fundamentals: solid understanding of relational database concepts, SQL, and data warehousing principles.
- System design & architecture patterns: deep knowledge of architectural patterns (e.g., Microservices, Event-Driven Architecture, Lambda/Kappa Architecture, Data Mesh) and their application in data solutions.
- DevOps & CI/CD: familiarity with DevOps principles, CI/CD pipelines, infrastructure as code (IaC), and automated deployment strategies for data platforms.

If interested, kindly apply for an immediate response.

Thanks & Regards,
SHOBANA
GSN | Mob: 8939666294 (WhatsApp) | Email: Shobana@gsnhr.net | Web: www.gsnhr.net
Google Reviews: https://g.co/kgs/UAsF9W

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress and make necessary adjustments to keep everything on track, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.

Professional & Technical Skills:
- Must have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

12.0 - 15.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work seamlessly together to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.
- Accountable and responsible for team outcomes and delivery.

Professional & Technical Skills:
- Must have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance frameworks and compliance standards.
- Proficient in Databricks and data transformations; has led teams before and has done conceptual work as well.
- Ability to troubleshoot and optimize data workflows for performance.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data governance and security best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Pune.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data platform's capabilities. You will be actively involved in problem-solving and contributing innovative ideas to improve the overall data architecture, ensuring that the platform meets the evolving needs of the organization and its stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and management frameworks.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Lead the application development process.
- Ensure effective communication within the team and with stakeholders.

Professional & Technical Skills:
- Must have skills: proficiency in Informatica MDM.
- Strong understanding of data integration and data quality management.
- Experience in leading application development projects.
- Knowledge of data governance principles.
- Hands-on experience in configuring and customizing Informatica MDM solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica MDM.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Navi Mumbai

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft Purview
Good to have skills: Collibra Data Governance
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful outcomes. You will also engage in problem-solving activities, ensuring that the applications meet the required standards and specifications while fostering a collaborative environment for your team.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously assess and improve application development processes to increase efficiency.

Professional & Technical Skills:
- Must have skills: proficiency in Microsoft Purview.
- Good to have skills: experience with Collibra Data Governance.
- Strong understanding of data governance principles and practices.
- Experience in application design and architecture.
- Familiarity with cloud-based solutions and integration techniques.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Purview.
- This position is based in Mumbai.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

15.0 - 25.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Engineering
Good to have skills: NA
Minimum 15 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: The Data Mesh Expert is responsible for supporting the design of the conceptual frameworks for data contract management within the SSDP (Self-Serve Data Platform, based on Databricks). This expert will define how data quality is governed and formalize the expectations between data producers and consumers by establishing data contracts as an integral part of a data product. A key responsibility is to test this proposal through a Proof-of-Concept implementation on the SSDP and EDC, and to find common acceptance of the framework such that the finalized result is a reusable pattern for all data domains.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.
- Mentor junior professionals in best practices and emerging technologies.

Professional & Technical Skills:
- Data governance expertise: deep knowledge of data quality management principles and data contract lifecycle management.
- Data Mesh & architecture: strong understanding of Data Mesh principles and experience in designing conceptual solutions for data platforms.
- Technical proficiency: familiarity with data platform technologies and data catalogs (like Atlan) to design and oversee a Proof-of-Concept; understanding of how contracts can be validated in data pipelines and CI/CD processes.
- Stakeholder management: ability to coordinate with and gather feedback from various data domain teams and persuade them on the final concept; ability to collaborate effectively with the Data Mesh Enablement Team for the final handover of deliverables.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Data Engineering.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
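To make the producer/consumer expectation behind data contracts concrete, here is a toy validation step of the kind the posting alludes to; the contract format and rules are invented for illustration, and a real implementation would typically validate schemas inside the pipeline or CI/CD instead:

```python
# Conceptual sketch: a producer publishes a contract (expected columns,
# types, and quality rules) and the pipeline validates a batch against it
# before promoting the data product. Contract format is invented.
CONTRACT = {
    "columns": {"customer_id": "string", "signup_date": "date", "country": "string"},
    "not_null": ["customer_id", "signup_date"],
}

def validate_batch(rows: list[dict]) -> list[str]:
    """Return a list of contract violations for a batch of records."""
    violations = []
    expected = set(CONTRACT["columns"])
    for i, row in enumerate(rows):
        missing = expected - row.keys()
        if missing:
            violations.append(f"row {i}: missing columns {sorted(missing)}")
        for col in CONTRACT["not_null"]:
            if row.get(col) is None:
                violations.append(f"row {i}: null in non-nullable column '{col}'")
    return violations

batch = [
    {"customer_id": "c1", "signup_date": "2024-01-02", "country": "DE"},
    {"customer_id": None, "signup_date": "2024-01-03", "country": "NL"},
]
print(validate_batch(batch))
# -> ["row 1: null in non-nullable column 'customer_id'"]
```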

Posted 2 weeks ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work seamlessly together to support the organization's data needs and objectives. Your role will require you to analyze requirements, propose solutions, and contribute to the overall strategy of the data platform, making it a dynamic and impactful position within the team.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance frameworks and compliance standards.
- Ability to work with large datasets and perform data analysis.
- Proficient in Databricks; has delivered on a project before.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Lead the application development team in designing and building applications.
- Act as the primary point of contact for project-related communication.
- Ensure timely delivery of projects and adherence to quality standards.
- Provide technical guidance and mentorship to team members.
- Collaborate with stakeholders to gather requirements and define project scope.

Professional & Technical Skills:
- Must have skills: proficiency in Informatica MDM.
- Strong understanding of data integration and master data management concepts.
- Experience in leading application development projects.
- Knowledge of data modeling and database concepts.
- Hands-on experience with Informatica MDM tools.
- Good to have skills: experience with data governance and data quality management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica MDM.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.

Posted 2 weeks ago

Apply