
2211 Data Governance Jobs - Page 25

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 5.0 years

3 - 5 Lacs

Kolkata

Remote

Job Title: Data Protection Officer (Contract-Based | Hourly | On-Call)
Location: Remote / India (with availability for EU/UK time zone coordination)
Type: Contractual | Hourly Basis | As-needed Engagement
Experience: 4-5 years of relevant experience

Job Summary: We are looking for an experienced and independent Data Protection Officer (DPO) to support our organization in ensuring compliance with the General Data Protection Regulation (GDPR) and other applicable data privacy laws. This is a contract-based, hourly paid position, and the DPO will be engaged on an as-needed basis. The role requires flexibility to provide consultation, conduct reviews, and respond to data protection matters when required.

Key Responsibilities: Serve as the point of contact for UK/EU residents, supervisory authorities, and internal teams regarding data protection issues. Identify and evaluate the company's data processing activities. Provide expert advice on conducting Data Protection Impact Assessments (DPIAs). Monitor compliance with GDPR and applicable local data protection laws. Review and advise on data management procedures and internal policies. Offer consultation on incident response and the handling of privacy breaches. Track regulatory changes and provide recommendations to maintain compliance. Maintain and update a register of processing operations, including risk-prone processes for prior checks. Support internal awareness and training initiatives regarding data protection obligations. Ensure that controllers and data subjects are informed about their data protection rights, obligations, and responsibilities, and raise awareness about them. Create a register of processing operations within the institution and notify the EDPS of those that present specific risks (so-called prior checks).

Requirements: Work experience in data protection and legal compliance is a must. Solid knowledge of GDPR and data protection laws. Ability to handle confidential information. Ethical, with the ability to remain impartial and report all non-compliances. Organizational skills with attention to detail.

Experience: 4-5 years of expertise in managing international data protection compliance programs and implementing data governance policies, technology compliance standards and programs, and privacy-by-design frameworks. To be successful in this role, you should have in-depth knowledge of GDPR and local data protection laws and be familiar with our industry and the nature of its data processing activities.

Posted 1 week ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Mumbai

Work from Office

Transform raw data into strategic insights. Conduct analyses to support decisions in demand forecasting, SKU profiling, procurement, warehouse management, and logistics. Optimize inventory levels and enhance overall supply chain efficiency.

Required Candidate Profile: Bachelor's or Master's degree in Data Science, Business Analytics, or Supply Chain Management. Advanced skills in Excel, SQL, and BI/visualisation tools (e.g., Power BI, Tableau). Oversee MIS platforms.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 5 Lacs

Dhule

Work from Office

The Development Lead will oversee the design, development, and delivery of advanced data solutions using Azure Databricks, SQL, and data visualization tools like Power BI. The role involves leading a team of developers, managing data pipelines, and creating insightful dashboards and reports to drive data-driven decision-making across the organization. The individual will ensure best practices are followed in data architecture, development, and reporting while maintaining alignment with business objectives.

Key Responsibilities:
Data Integration & ETL Processes: Design, build, and optimize ETL pipelines to manage the flow of data from various sources into data lakes, data warehouses, and reporting platforms.
Data Visualization & Reporting: Lead the development of interactive dashboards and reports using Power BI, ensuring that business users have access to actionable insights and performance metrics.
SQL Development & Optimization: Write, optimize, and review complex SQL queries for data extraction, transformation, and reporting, ensuring high performance and scalability across large datasets.
Azure Cloud Solutions: Implement and manage cloud-based solutions using Azure services (Azure Databricks, Azure SQL Database, Data Lake) to support business intelligence and reporting initiatives.
Collaboration with Stakeholders: Work closely with business leaders and cross-functional teams to understand reporting and analytics needs, translating them into technical requirements and actionable data solutions.
Quality Assurance & Best Practices: Implement and maintain best practices in development, ensuring code quality, version control, and adherence to data governance standards.
Performance Monitoring & Tuning: Continuously monitor the performance of data systems, reporting tools, and dashboards to ensure they meet SLAs and business requirements.
Documentation & Training: Create and maintain comprehensive documentation for all data solutions, including architecture diagrams, ETL workflows, and data models. Provide training and support to end-users on Power BI reports and dashboards.

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Proven experience as a Development Lead or Senior Data Engineer with expertise in Azure Databricks, SQL, Power BI, and data reporting/visualization.
Hands-on experience with Azure Databricks for large-scale data processing and analytics, including Delta Lake, Spark SQL, and integration with Azure Data Lake.
Strong expertise in SQL for querying, data transformation, and database management.
Proficiency in Power BI for developing advanced dashboards, data models, and reporting solutions.
Experience in ETL design and data integration across multiple systems, with a focus on performance optimization.
Knowledge of Azure cloud architecture, including Azure SQL Database, Data Lake, and other relevant services.
Experience leading agile development teams, with a strong focus on delivering high-quality, scalable solutions.
Strong problem-solving skills, with the ability to troubleshoot and resolve complex data and reporting issues.
Excellent communication skills, with the ability to interact with both technical and non-technical stakeholders.

Preferred Qualifications:
Knowledge of additional Azure services (e.g., Azure Synapse, Data Factory, Logic Apps) is a plus.
Experience in Power BI for data visualization and custom calculations.
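The posting above asks for writing and optimizing complex SQL queries for large datasets. As a minimal, self-contained sketch of that idea, the snippet below uses Python's built-in sqlite3 (not the Azure SQL/Databricks stack the role actually targets) to show how an index turns a full table scan into an index search; the table and column names are invented for illustration.

```python
import sqlite3

# Illustrative only: sqlite3 stands in for a real warehouse engine here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("APAC", 10.0), ("EMEA", 25.5), ("APAC", 7.25)],
)

# Without an index, filtering on region forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
).fetchall()

# After adding an index, the engine can seek directly to matching rows.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
).fetchall()

total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
).fetchone()[0]
```

The same scan-versus-seek reasoning carries over to warehouse engines, though their query plans and indexing mechanisms (e.g., Delta Lake data skipping) differ in detail.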

Posted 1 week ago

Apply

4.0 - 6.0 years

1 - 2 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Responsibilities:
Design and implement scalable data pipelines to ingest, process, and analyze large volumes of structured and unstructured data from various sources.
Develop and optimize data storage solutions, including data warehouses, data lakes, and NoSQL databases, to support efficient data retrieval and analysis.
Implement data processing frameworks and tools such as Apache Hadoop, Spark, Kafka, and Flink to enable real-time and batch data processing.
Collaborate with data scientists and analysts to understand data requirements and develop solutions that enable advanced analytics, machine learning, and reporting.
Ensure data quality, integrity, and security by implementing best practices for data governance, metadata management, and data lineage.
Monitor and troubleshoot data pipelines and infrastructure to ensure reliability, performance, and scalability.
Develop and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources and transform it into usable formats.
Stay current with emerging technologies and trends in big data and cloud computing, and evaluate their applicability to enhance our data engineering capabilities.
Document data architectures, pipelines, and processes to ensure clear communication and knowledge sharing across the team.

Requirements: Strong programming skills in Java, Python, or Scala. Strong understanding of data modelling, data warehousing, and ETL processes. Minimum 4 to maximum 6 years of relevant experience. Strong understanding of Big Data technologies and their architectures, including Hadoop, Spark, and NoSQL databases.

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
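The ETL responsibility above can be sketched in miniature. This is a hypothetical, stdlib-only illustration of the three stages (the role itself would use Spark/Kafka-scale tooling); the CSV data, column names, and quality rule are all invented for the example.

```python
import csv
import io
import sqlite3

# Hypothetical raw source: one row has a missing amount (a data-quality problem).
RAW_CSV = """order_id,region,amount
1,APAC,120.50
2,EMEA,
3,APAC,75.00
"""

def extract(raw):
    """Extract: parse the raw source into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: enforce a simple quality rule and cast types."""
    return [
        (int(r["order_id"]), r["region"], float(r["amount"]))
        for r in rows
        if r["amount"]  # data-quality rule: drop rows with a missing amount
    ]

def load(rows):
    """Load: write the cleaned records into a queryable store."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

In a production pipeline each stage would be a distributed job with monitoring and lineage tracking, but the extract-transform-load shape is the same.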

Posted 1 week ago

Apply

3.0 - 5.0 years

12 - 14 Lacs

Gurugram

Work from Office

We are seeking an experienced Salesforce Marketing Cloud Consultant to implement, integrate, and optimize Salesforce Marketing Cloud across multiple channels such as Email, SMS, WhatsApp, and Push notifications. Responsibilities include configuring SFMC components like Email Studio, Mobile Studio, Contact Builder, Audience Builder, Automation Studio, Journey Builder, Reporting, and Einstein features. The role involves developing automated processes, writing SQL queries, and handling API integrations. The consultant will collaborate with cross-functional teams, providing support, troubleshooting, and guiding clients in best practices, while ensuring system stability and performance. This role requires a strong understanding of marketing automation and data governance.

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 12 Lacs

Bengaluru, Doddakannell, Karnataka

Work from Office

We are seeking a highly skilled Data Engineer with expertise in ETL techniques, programming, and big data technologies. The candidate will play a critical role in designing, developing, and maintaining robust data pipelines, ensuring data accuracy, consistency, and accessibility. This role involves collaboration with cross-functional teams to enrich and maintain a central data repository for advanced analytics and machine learning. The ideal candidate should have experience with cloud-based data platforms, data modeling, and data governance processes. Location: Bengaluru (Doddakannell, Sarjapur Road), Karnataka

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Mumbai

Work from Office

JD for Power BI
Role Description: Power BI
Competencies: Digital: Microsoft Power BI
Experience (Years): 2-4
Essential Skills: Power BI

Key Responsibilities:
Power BI Report and Dashboard Development: Design, develop, and deploy interactive Power BI dashboards and reports for business users. Leverage Power BI features such as DAX (Data Analysis Expressions), Power Query, and Power BI Service to develop insightful analytics. Create custom visualizations and interactive reports that help business leaders understand key metrics.
Data Modeling and Transformation: Build and maintain optimized data models to support business requirements and improve report performance. Develop ETL (Extract, Transform, Load) processes using Power Query and integrate data from multiple sources (SQL Server, Excel, Azure, etc.).
Data Integration: Connect Power BI to various data sources (e.g., SQL Server, Excel, SharePoint, Azure, API integrations) to pull data and create real-time reports. Design and implement data pipelines for seamless data flow and processing.
Collaboration with Stakeholders: Work with business analysts, managers, and end-users to gather reporting requirements and translate them into Power BI solutions. Deliver training sessions to end-users, ensuring they can efficiently navigate Power BI reports and dashboards.
Performance Tuning: Optimize Power BI reports for speed and efficiency, ensuring they can handle large datasets and deliver fast results. Troubleshoot and resolve performance issues related to reports and dashboards.
Governance and Security: Implement role-based security to control access to specific data based on user roles. Maintain and monitor Power BI workspaces, ensuring appropriate governance and data access controls.
Documentation: Create and maintain documentation for developed reports, dashboards, and data models. Ensure that reports are properly version-controlled and aligned with data governance practices.
Continuous Improvement: Stay up to date with the latest Power BI features, best practices, and industry trends. Suggest improvements and enhancements to existing reports and dashboards based on user feedback and business needs.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 6 Lacs

Bengaluru

Work from Office

Req ID: 331269
NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica Admin to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Informatica Cloud Data Governance & Catalog (CDGC): Glossary creation, metadata management, data classification. Data lineage, policy definition, and domain configuration.
Informatica Administration: User/role management, Secure Agent installation and maintenance. Job monitoring, repository backups, system troubleshooting. Environment configuration and version upgrades.
Informatica Data Quality (IDQ): Data profiling, rule specification, transformations. Scorecards, DQ metrics, accuracy, and completeness checks. Exception handling and remediation.

Additionally, it would be beneficial if the candidate has knowledge and experience in:
Scripting: Shell scripting (Bash/Korn) and Python scripting for automation. Experience in building monitoring and housekeeping scripts.
Cloud Knowledge: Familiarity with Azure, AWS, or GCP. Working with cloud-hosted Informatica services.
DevOps & CI/CD: Azure DevOps: creating and managing pipelines, repos, and releases. Integration with Informatica for automated deployments.
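The posting above mentions building monitoring and housekeeping scripts in shell or Python. As a hypothetical sketch of the common "purge logs older than N days" housekeeping pattern, the snippet below runs against a throwaway temp directory; the file names and 7-day cutoff are invented for the demo.

```python
import os
import tempfile
import time

def purge_old_files(directory, max_age_days, now=None):
    """Delete files whose mtime is older than max_age_days; return what was removed."""
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400  # seconds per day
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return sorted(removed)

# Demo: one backdated "old" file and one fresh file in a temp directory.
with tempfile.TemporaryDirectory() as d:
    old, fresh = os.path.join(d, "old.log"), os.path.join(d, "fresh.log")
    for p in (old, fresh):
        open(p, "w").close()
    os.utime(old, (time.time() - 10 * 86400,) * 2)  # backdate mtime by 10 days
    removed = purge_old_files(d, max_age_days=7)
    survivors = sorted(os.listdir(d))
```

A real housekeeping job would add logging and a dry-run flag, and would typically run from cron or a scheduler rather than interactively.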

Posted 1 week ago

Apply

2.0 - 7.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Role: Product Manager, Campaign Management
Location: Bengaluru

What you'll do:
We're MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, so you will be too. No matter the role or the location, we're all united in the vision to lead the programmatic industry and make it better. As a Product Manager in our Product department, you'll have the chance to:

Research and stakeholder management: Gain a deep understanding of client needs by meeting with stakeholders, as well as global MiQ team leads, to ensure the global validity of roadmap additions. Coordinate resources across a range of departments of the business in order to develop great products, in particular working closely with product stakeholders across the product and tech team. Collaborate with product stakeholders to research, validate, and prioritize new features that align with business priorities.

Planning and delivery: Plan, prioritise, and project-manage product development features to accelerate development. Convert global client needs into technical features, and then convert the resulting features into marketable benefits. Work closely with the global product marketing team to align product enhancements internally with sales and marketing collateral externally. Integrate usability studies, research, and market study into product requirements. Lead the ideation, technical development, launch, and continued adoption of features you build. Drive product development by partnering with a team of platform product managers, engineers, data scientists, and UX/UI designers.

Partner evaluation and collaboration: Identify and work closely with key external partners to ensure that the MiQ product roadmap is additive.

Measuring and reporting on success: Understand, collect, and analyse metrics that inform the success of the product. Drive adoption and measure success of new measurement offerings and features.
Spread knowledge and train teams on new developments and evangelize MiQ's capabilities.

What impact will you create? At MiQ, the Product Manager is tasked with building on our existing capabilities for campaign management, identifying gaps and opportunities in our offering, working with MiQ's technology and product teams to execute a development roadmap for this capability, and working to scale features across our regional markets.

Who are your stakeholders? The key set of people from product stakeholders across the product and tech team, such as platform product managers, engineers, data scientists, and UX/UI designers, will be your major stakeholders.

What you'll bring: 2+ years of advertising platform technology experience in product management or a related capacity; buy-side experience preferred. In-depth knowledge of the programmatic landscape and the data and technology that underpin it. A fail-fast-and-learn mentality in experimenting with new concepts and industry opportunities. Detail-oriented, with an ability to prioritise projects/tasks simultaneously and to completion. Alignment with MiQ's core values. A can-do attitude to provide energy, drive, and enthusiasm.

We've highlighted some key skills, experience, and requirements for this role. But please don't worry if you don't meet every single one. Our talent team strives to find the best people. They might see something in your background that's a fit for this role, or another opportunity at MiQ. If you have a passion for the role, please still apply.

MiQ Values: Our values are so much more than statements. They unite MiQers in every corner of the world. They shape the way we work and the decisions we make. And they inspire us to stay true to ourselves and to aim for better. Our values are there to be embraced by everyone, so that we naturally live and breathe them. Just like inclusivity, our values flow through everything we do, no matter how big or small.

We do what we love - Passion. We figure it out - Determination. We anticipate the unexpected - Agility. We always unite - Unite. We dare to be unconventional - Courage.

What's in it for you? Our Center of Excellence is the very heart of MiQ, and it's where the magic happens. It means everything you do and everything you create will have a huge impact across our entire global business. MiQ is incredibly proud to foster a welcoming culture. We do everything possible to make sure everyone feels valued for what they bring. With global teams committed to diversity, equity, and inclusion, we're always moving towards becoming an even better place to work.

Benefits: Every region and office has specific perks and benefits, but every person joining MiQ can expect: A hybrid work environment. New hire orientation with job-specific onboarding and training. Internal and global mobility opportunities. Competitive healthcare benefits. Bonus and performance incentives. Generous annual PTO and paid parental leave, with two additional paid days to acknowledge holidays, cultural events, or inclusion initiatives. Employee resource groups designed to connect people across all MiQ regions, drive action, and support our communities. Apply today! Equal Opportunity Employer.

Role: Associate Product Manager
Location: Bengaluru

What you'll do:
We're MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, so you will be too. No matter the role or the location, we're all united in the vision to lead the programmatic industry and make it better.
As an Associate Product Manager in our Data Management product stream, you'll have the chance to: Ideate, vision, validate, and help build key data management features, which are at the core of MiQ's data-driven programmatic media offering. Help build tools, frameworks, SDKs, APIs, pipelines, data cubes, and data formats for other teams to consume data platforms and data products when building business-critical product features. Collaborate with product stakeholders to research, validate, and prioritize new features that align with business priorities. Identify and work closely with key external partners to ensure that the MiQ product roadmap is additive. Integrate usability studies, research, and market analysis into product requirements. Lead the ideation, technical development, launch, and continued adoption of features you build. Drive product development with a team of world-class engineers and data scientists. Define, collect, and analyse metrics that inform the success of the product.

Who are your stakeholders? The key set of people who will be consuming the data management features you help build are our analysts, data scientists, and product teams at MiQ; hence they will be your major stakeholders. In a few scenarios, we also have client-facing data platform requirements delivered directly by the Data Management team, e.g., the 1PD onboarding platform and data cubes. Here the stakeholders are clients and client-facing teams like Sales and Client Services who represent the clients internally within MiQ.

What you'll bring: Prior experience in product management for a minimum of 2+ years, preferably as a Technical Product Manager of data platforms/products meant for developers or data scientists (e.g., API products). Prior software development experience is preferable but not mandatory; however, you should have at least tried out a number of hobby projects, like app creation or data pipelines, on the cloud. Prior exposure to ETL solutions, big data tool sets, DataOps and data governance, Jupyter Notebooks, Cloudera, Apache NiFi. Exposure to and understanding of more than one of the following is preferable: data platforms like Databricks, Snowflake, etc.; data formats like Delta, Apache Iceberg; newer data paradigms like Data as a Product, Medallion Architecture, Data Mesh, etc. Understanding of cloud platforms like AWS or GCP. Prior experience handling data pipeline optimization product scenarios for failures, speed, and efficiency. Prior experience of stakeholder interaction, managing conflicting requirements/priorities across stakeholders. Extensive experience in handling and maintaining relationships with vendors, both for product issues and for build/buy decisioning. Exposure to ad tech is preferable but not mandatory. Willingness to try new technologies and the ability to grasp and learn quickly. Comfort with numbers and a quantitative approach to solving problems.

We've highlighted some key skills, experience, and requirements for this role. But please don't worry if you don't meet every single one. Our talent team strives to find the best people. They might see something in your background that's a fit for this role, or another opportunity at MiQ. If you have a passion for the role, please still apply.

What impact will you create? MiQ has a petabyte-scale data platform serving our data-driven programmatic offering. As an Associate Product Manager in the Data Management team, you will build key data management features that will have an impact on the bottom line of the company, enabling sophisticated data- and AI-driven programmatic media capabilities and elevating MiQ in its offering. You will get an opportunity to democratise data and AI within MiQ and contribute to the data culture organically in a product-led way.

What's in it for you? Our Center of Excellence is the very heart of MiQ, and it's where the magic happens. It means everything you do and everything you create will have a huge impact across our entire global business. MiQ is incredibly proud to foster a welcoming culture. We do everything possible to make sure everyone feels valued for what they bring. With global teams committed to diversity, equity, and inclusion, we're always moving towards becoming an even better place to work.

Values: Our values are so much more than statements. They unite MiQers in every corner of the world. They shape the way we work and the decisions we make. And they inspire us to stay true to ourselves and to aim for better. Our values are there to be embraced by everyone, so that we naturally live and breathe them. Just like inclusivity, our values flow through everything we do, no matter how big or small. We do what we love - Passion. We figure it out - Determination. We anticipate the unexpected - Agility. We always unite - Unite. We dare to be unconventional - Courage.

Benefits: Every region and office has specific perks and benefits, but every person joining MiQ can expect: A hybrid work environment. New hire orientation with job-specific onboarding and training. Internal and global mobility opportunities. Competitive healthcare benefits. Bonus and performance incentives. Generous annual PTO and paid parental leave, with two additional paid days to acknowledge holidays, cultural events, or inclusion initiatives. Employee resource groups designed to connect people across all MiQ regions, drive action, and support our communities. Apply today! Equal Opportunity Employer!

Posted 1 week ago

Apply

6.0 - 11.0 years

25 - 30 Lacs

Bengaluru

Work from Office

We are seeking an experienced Senior Data Engineer to join our data team. As a Data Engineer at ThoughtSpot, you will be responsible for designing, building, and maintaining the data infrastructure that powers our analytics and drives data-driven decision-making for leadership. You will work closely with business teams to ensure our data systems are robust, scalable, and efficient. We have a rapidly expanding list of happy customers who love our product, and we're growing to serve even more.

What you'll do: Design, develop, and maintain scalable data pipelines to process large volumes of data from various sources. Work closely with our business teams to process and curate analytics-ready data. Ensure data quality and consistency through rigorous testing and validation processes. Monitor and troubleshoot data pipeline performance and resolve any issues.

What you bring: 6+ years of experience in data engineering, building data infrastructure and pipelines. Experience building and maintaining large data pipelines and data infrastructure. Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases/warehouses. Experience with ETL tools like Hevo, Matillion, etc. Experience with data analytics products; ThoughtSpot experience is good to have. Experience with cloud services such as AWS, GCP, Azure, etc. Knowledge of data warehousing concepts and experience with an EDW like Databricks, Snowflake, or Redshift. Proficiency in programming languages such as Python and data processing libraries such as Pandas. Understanding of data governance, data quality, and security best practices. Knowledge of good development practices such as testing, code reviews, and git. Ability to work independently and coordinate with different stakeholders. You love building and leading exceptional teams in a fast-paced, entrepreneurial environment. You have a strong bias for action and being resourceful. You bring amazing problem-solving skills and an ability to identify, quantify, debug, and remove bottlenecks and functional issues. Great communication skills, both verbal and written, and an interest in working with a diverse set of peers and customers. Alignment with ThoughtSpot values.

What makes ThoughtSpot a great place to work? ThoughtSpot is the experience layer of the modern data stack, leading the industry with our AI-powered analytics and natural language search. We hire people with unique identities, backgrounds, and perspectives; this balance-for-the-better philosophy is key to our success. When paired with our culture of Selfless Excellence and our drive for continuous improvement (2% done), ThoughtSpot cultivates a respectful culture that pushes norms to create world-class products. If you're excited by the opportunity to work with some of the brightest minds in the business and make your mark on a truly innovative company, we invite you to read more about our mission and apply to the role that's right for you.

ThoughtSpot for All: Building a diverse and inclusive team isn't just the right thing to do for our people, it's the right thing to do for our business. We know we can't solve complex data problems with a single perspective. It takes many voices, experiences, and areas of expertise to deliver the innovative solutions our customers need. At ThoughtSpot, we continually celebrate the diverse communities that individuals cultivate to empower every Spotter to bring their whole authentic self to work. We're committed to being real and continuously learning when it comes to equality, equity, and creating space for underrepresented groups to thrive. Research shows that in order to apply for a job, women feel they need to meet 100% of the criteria, while men usually apply after meeting 60%. Regardless of how you identify, if you believe you can do the job and are a good match, we encourage you to apply.

Posted 1 week ago

Apply

3.0 - 8.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Are you passionate about data and code? Does the prospect of dealing with mission-critical data excite you? Do you want to build data engineering solutions that process a broad range of business and customer data? Do you want to continuously improve the systems that enable annual worldwide revenue of hundreds of billions of dollars? If so, then the eCommerce Services (eCS) team is for you!

In eCommerce Services (eCS), we build systems that span the full range of eCommerce functionality, from Privacy, Identity, Purchase Experience, and Ordering to Shipping, Tax, and Financial integration. eCommerce Services manages several aspects of the customer life cycle, starting from account creation and sign-in, to placing items in the shopping cart, proceeding through checkout, order processing, managing order history, and post-fulfillment actions such as refunds and tax invoices. eCS services determine sales tax and shipping charges, and we ensure the privacy of our customers. Our mission is to provide a commerce foundation that accelerates business innovation and delivers a secure, available, performant, and reliable shopping experience to Amazon's customers.

The goal of the eCS Data Engineering and Analytics team is to provide high-quality, on-time reports to Amazon business teams, enabling them to expand globally at scale. Our team has a direct impact on retail CX, a key component that runs our Amazon flywheel. As a Data Engineer, you will own the architecture of DW solutions for the Enterprise using multiple platforms. You will have the opportunity to lead the design, creation, and management of extremely large datasets, working backwards from business use cases. You will use your strong business and communication skills to work with business analysts and engineers to determine how best to design the data warehouse for reporting and analytics.

You will be responsible for designing and implementing scalable ETL processes in the data warehouse platform to support the rapidly growing and dynamic business demand for data, and for delivering data as a service, which will have an immediate influence on day-to-day decision making. Develop data products, infrastructure, and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, Lambda, etc.) and internal BDT tools (DataNet, Cradle, QuickSight, etc.). Improve existing solutions and come up with a next-generation Data Architecture to improve scale, quality, timeliness, coverage, monitoring, and security. Develop new data models and end-to-end data pipelines. Create and implement a Data Governance strategy for mitigating privacy and security risks.

Requirements: 3+ years of data engineering experience. Experience with data modeling, warehousing, and building ETL pipelines. Experience with SQL. Bachelor's degree. Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Experience with non-relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases).

Posted 1 week ago

Apply

2.0 - 5.0 years

10 - 14 Lacs

Pune

Work from Office

What you'll do: Work independently within Data and Analytics with limited design help from your manager or senior associates. Leverage coding best practices and advanced techniques to ensure efficient execution of code against large datasets, ensuring code is repeatable and scalable. Run, create and optimize standard processes to ensure metrics, reports and insights are delivered consistently to stakeholders with minimal manual intervention. Leverage knowledge of data structures to prepare data for ingestion and analysis, assembling data from disparate sources for the creation of insights; accurately integrate new and complex data sources. Integrate Equifax, customer and third-party data to solve internal or customer analytical problems of moderate complexity and report findings to managers and stakeholders. Review code output for anomalies, perform analysis to determine the cause, and work with Data, Analytics, Product and Technology counterparts to implement corrective measures. Communicate the impact and importance of findings on the business (either Equifax or an external customer) and recommend an appropriate course of action. Understand the concepts of quantitative and qualitative data and how to relate them to the customer to show the value of the analysis.
Ensure proper use of Equifax data assets by working closely with data governance and compliance professionals. What experience you need: BS degree in a STEM major or equivalent discipline. 2-5 years of experience in a related analyst role. Cloud certification strongly preferred. Technical capabilities including SQL, BigQuery, R, Python, MS Excel/Google Sheets, Tableau, Looker. Experience working as a team and collaborating with others on producing descriptive and diagnostic analysis. What could set you apart: Cloud certification such as GCP strongly preferred. Self-starter. Excellent communicator / client-facing. Ability to work in a fast-paced environment. Flexibility to work across A/NZ time zones based on project needs. Primary Location: IND-Pune-Equifax Analytics-PEC Function: Function - Data and Analytics Schedule: Full time

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Noida

Work from Office

Job Description We are looking for a seasoned Data Engineer with extensive experience in designing and implementing data pipelines using Medallion Architecture, Azure Databricks, and Snowflake. The ideal candidate will be responsible for building scalable ETL pipelines, optimizing data flows, and ensuring data quality for large-scale data platforms. Key Responsibilities: Design, develop, and optimize data pipelines following Medallion Architecture (Bronze, Silver, Gold layers). Implement and maintain ETL pipelines using Databricks and Python (multi-threading and multi-processing). Leverage Snowflake for data warehousing, including schema design, data loading, and performance tuning. This also includes experience with Linux, Docker, Anaconda, Pandas, PySpark, Apache Hive and Iceberg, Trino, and Prefect. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver robust data solutions. Develop data models and manage data lakes for structured and unstructured data. Implement data governance and security best practices. Monitor and troubleshoot data pipelines for performance and reliability. Stay up to date with industry trends and best practices in data engineering and cloud technologies. Minimum Qualification: B.Tech/B.E. (Computer Science/IT/Electronics), MCA, or a computer diploma, with 3+ years of development experience (compulsory).
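The Bronze → Silver → Gold flow that Medallion Architecture prescribes can be sketched in miniature. This is a hedged illustration using plain Python lists in place of Databricks Delta tables; all records and field names are invented.

```python
# Medallion sketch: raw data lands in Bronze, is cleaned into Silver, and is
# aggregated into business-level Gold tables. Records here are hypothetical.
bronze = [  # raw, as-ingested records (may contain duplicates and nulls)
    {"id": 1, "city": " pune ", "sales": 100},
    {"id": 1, "city": " pune ", "sales": 100},   # duplicate row
    {"id": 2, "city": "Noida", "sales": None},   # missing value
    {"id": 3, "city": "Noida", "sales": 250},
]

# Silver layer: deduplicated, cleaned, and standardized records.
seen, silver = set(), []
for row in bronze:
    if row["id"] in seen or row["sales"] is None:
        continue  # drop duplicates and rows failing quality checks
    seen.add(row["id"])
    silver.append({**row, "city": row["city"].strip().title()})

# Gold layer: business-ready aggregate (total sales per city).
gold = {}
for row in silver:
    gold[row["city"]] = gold.get(row["city"], 0) + row["sales"]
```

On Databricks the same layering would typically be three Delta tables, with the Silver step expressed as Spark transformations rather than Python loops.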

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Mumbai

Work from Office

Overview: We are seeking a highly skilled and experienced Power BI Developer to join our dynamic team. The ideal candidate will have 5-7 years of hands-on experience in developing and implementing data analytics and business intelligence solutions using Power BI. This role involves working closely with business stakeholders, data engineers, and IT teams to deliver impactful insights and reports that drive business decisions. Key Responsibilities: Report and Dashboard Development: Design, develop, and maintain interactive Power BI dashboards and reports, ensuring the quality and integrity of data visualizations. Data Modeling: Build and maintain complex data models, including transforming, cleaning, and structuring data from multiple sources to create user-friendly reports. DAX & Power Query Expertise: Develop advanced DAX measures and formulas to enhance the functionality of Power BI reports. Utilize Power Query for data transformation and manipulation. Performance Optimization: Ensure Power BI reports and dashboards are optimized for performance, ensuring quick data load and processing times. Stakeholder Collaboration: Work closely with business users and other teams (data engineering, data science, etc.) to understand business requirements and translate them into technical solutions. Data Integration: Integrate data from various sources (SQL Server, Excel, APIs, etc.) into Power BI for comprehensive analysis. Data Governance & Security: Implement row-level security (RLS) in Power BI reports to ensure data privacy and governance. Documentation & Training: Create and maintain documentation on report and dashboard functionality, and provide training to users on how to interact with reports and dashboards.
Troubleshooting: Troubleshoot and resolve issues with Power BI reports, including data inconsistencies, performance problems, and system errors. Skills & Qualifications: Experience: 5-7 years of hands-on experience in Power BI development, with a strong portfolio of reports and dashboards. Power BI Expertise: Proficient in Power BI Desktop, Power BI Service, and Power BI Report Server. DAX and Power Query: Advanced knowledge of DAX (Data Analysis Expressions) and the Power Query M language. Data Visualization: Strong understanding of data visualization best practices and ability to create effective and engaging dashboards. Data Modeling: Experience in designing and implementing complex data models for reporting and analysis. SQL: Strong SQL skills, with the ability to write complex queries for data extraction and manipulation. ETL Processes: Knowledge of ETL processes and data integration from multiple sources (SQL Server, Excel, APIs, etc.). BI Tools Knowledge: Familiarity with other BI tools (Tableau, Qlik, etc.) is a plus. Data Warehousing: Knowledge of data warehousing concepts, star/snowflake schema, and dimensional modeling is preferred. Problem Solving: Strong analytical and troubleshooting skills to identify and resolve issues with data and reporting. Communication Skills: Excellent verbal and written communication skills to work with business users and technical teams.
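In Power BI, row-level security is declared as a DAX filter on a role (for example, `[Region] = USERPRINCIPALNAME()`). The snippet below is only a conceptual Python sketch of that filtering idea, with a hypothetical user-to-region mapping; it is not Power BI code.

```python
# Conceptual RLS sketch: each user sees only the rows their role permits.
# The mapping and the data rows are invented for illustration.
ROLE_REGION = {"asha@example.com": "West", "ravi@example.com": "East"}

rows = [
    {"region": "West", "revenue": 120},
    {"region": "East", "revenue": 80},
]

def visible_rows(user, data):
    """Return only the rows the user's role mapping allows, mimicking RLS."""
    region = ROLE_REGION.get(user)
    return [r for r in data if r["region"] == region]
```

The key design point, as in Power BI RLS, is that the filter is applied by the platform based on the signed-in identity, not left to each report author.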

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Overview Azure Data Architect Bangalore Aptean is changing. Our ERP solutions are transforming a huge range of global businesses, from food producers to manufacturers. In a world of generic enterprise software, we provide targeted solutions that bring together the very best technology and drive greater results. With over 4500 employees, 90 different products and a global client base, there's no better time to advance your career at Aptean. Are you ready for what's next, now? We are! If being part of a dynamic, high-growth organization excites you and you are a Senior Data Architect eager to learn and grow, then this opportunity is for you! Our fast-paced environment and dynamic global R&D department is eager for a mover and shaker to step into this role and become an integral part of our team. Job Summary: We are looking for a seasoned Data Architect with deep expertise in Spark to lead the design and implementation of modern data processing solutions. The ideal candidate will have extensive experience in distributed data processing, large-scale data pipelines, and cloud-native data platforms. This is a strategic role focused on building scalable, fault-tolerant, and high-performance data systems. Key Responsibilities: Architect, design, and implement large-scale data pipelines using Spark (batch and streaming). Optimize Spark jobs for performance, cost-efficiency, and scalability. Define and implement enterprise data architecture standards and best practices. Guide the transition from traditional ETL platforms to Spark-based solutions. Lead the integration of Spark-based pipelines into cloud platforms (Azure Fabric/Spark pools). Establish and enforce data architecture standards, including governance, lineage, and quality. Mentor data engineering teams on best practices with Spark (e.g., partitioning, caching, join strategies). Implement and manage CI/CD pipelines for Spark workloads using tools like Git or Azure DevOps.
Ensure robust monitoring, alerting, and logging for Spark applications. Required Skills & Qualifications: 10+ years of experience in data engineering, with 7+ years of hands-on experience with Apache Spark (PySpark/Scala). Proficiency in Spark optimization techniques, monitoring, caching, advanced SQL, and distributed data design. Experience with Spark on Databricks and Azure Fabric. Solid understanding of Delta Lake, Spark Structured Streaming, and data pipelines. Strong experience in cloud platforms (Azure). Proven ability to handle large-scale datasets (terabytes to petabytes). Familiarity with data lakehouse architectures, schema evolution, and data governance. Candidates should also be experienced in Power BI, with at least 3 years of experience. Preferred Qualifications: Experience implementing real-time analytics using Spark Streaming or Structured Streaming. Certifications in Databricks, Fabric or Spark would be a plus. If you share our mindset, you can share in our success. To find out more about joining Aptean, get in touch today. Learn from our differences. Celebrate our diversity. Grow and succeed together. Aptean pledges to promote a company culture where diversity, equity and inclusion are central. We are committed to applying this principle as we interact with our customers, build our teams, cultivate our leaders and shape a company in which any employee can succeed, regardless of race, color, sex, national origin, sexuality and gender identity, religion, disability, age, status as a protected veteran or any other group status protected by law. Celebrating our diverse experiences, opinions and beliefs allows us to embrace what makes us unique and to use this as an asset in bringing innovative solutions to our customer base. At Aptean, our global and diverse employee base is our greatest asset.
It is through embracing and understanding our differences that we are able to harness our individual power to maximize the success of our customers, our employees and our company. - TVN Reddy

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Design and create compelling data visualizations, dashboards, and reports that provide actionable insights to support decision-making. Hands-on experience in writing complex SQL queries and creating stored procedures for SSRS paginated reporting. Good understanding of reporting. Work closely with data engineers and data analysts. Continuously optimize existing reports, ensuring performance, accuracy, and responsiveness, and addressing any data quality issues. Skills: Minimum Experience - 3 to 8 Years; T-SQL/PL-SQL, SSRS Paginated Reports, SAP BI, Power BI; good communication; strong aptitude. Qualifications Power BI: Key features of Power BI; data integration techniques; data refresh in Power BI; data governance and security in Power BI; active and inactive relationships; filters in Power BI; DAX functions in Power BI (CALCULATE, SUMX, AVERAGEX); time intelligence functions (e.g., YTD); filter functions; text functions; logical functions (e.g., AND); date and time functions (e.g., MONTH). SSRS: Key components of SSRS; paginated reports; creating parameters in SSRS; best practices for SSRS paginated reports; best practices for parameters; changing the sequence of report parameters in SSRS. SQL: DML & DDL; primary key, unique key, foreign key; types of joins; date & aggregate functions; string functions; set operators; window functions; CTEs; temp tables in SQL (local and global); performance tuning; sample queries based on the above topics; a query to identify and remove data redundancy in a table; number of records based on joins.
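The SQL checklist above ends with a query to identify and remove data redundancy. A hedged sketch of one common approach, combining a CTE with the ROW_NUMBER() window function, is shown here via SQLite with an invented `sales` table; on SQL Server the same pattern applies with T-SQL syntax.

```python
import sqlite3

# Deduplication sketch: rank identical rows with ROW_NUMBER() inside a CTE,
# then delete every row ranked after the first. Table and data are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(1, "South", 10.0), (1, "South", 10.0), (2, "North", 5.0)],
)

conn.execute("""
    WITH ranked AS (
        SELECT rowid AS rid,
               ROW_NUMBER() OVER (PARTITION BY id, region, amount
                                  ORDER BY rowid) AS rn
        FROM sales
    )
    DELETE FROM sales
    WHERE rowid IN (SELECT rid FROM ranked WHERE rn > 1)
""")
remaining = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
```

Partitioning by every column treats full-row duplicates as redundancy; narrowing the PARTITION BY list changes which rows count as duplicates.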

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 15 Lacs

Gurugram, Ahmedabad

Work from Office

We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.

Posted 1 week ago

Apply

8.0 - 10.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Role Purpose The purpose of the role is to facilitate visual interpretation of data from multiple sources and use this information to develop data-driven solutions as per the client's requirements. Do 1. Develop valuable insights from multiple data sources as per client requirements a. Customer engagement and requirements gathering i. Understand customer needs and objectives, technology trends and requirements to define how data will be seen as final output ii. Develop wireframes, prototypes and use cases in order to demonstrate the final data output as required by the customer iii. Analyse, propose and implement the data technology and tools used for data visualization iv. Provide solutioning for RFPs received from clients and ensure the final data output is as per business needs v. Validate the solution/prototype from a technology, cost structure and customer differentiation point of view b. Design and implementation of data visual aspects i. Architect and build data visualization capabilities to produce classical BI dashboards and solutions ii. Create the solutions using a variety of data mining/data analysis methods, data tools, data models and data semantics iii. Contribute to the design and implementation of the data platform architecture related to data visualization needs iv. Collaborate with other data architects to establish and run data governance processes v. Manage metadata and semantic layer data on data domains and other enterprise data assets vi. Identify problem areas, perform root cause analysis of overall data flow and provide relevant solutions to the problem c. Enable Pre-Sales Team i. Support the pre-sales team while presenting the entire data design and its principles to the client ii. Negotiate, manage and coordinate with the client teams to ensure all requirements are met and create visual data output as proposed iii.
Demonstrate thought leadership with strong technical capability in front of the client to win their confidence and act as a trusted advisor 2. Capability Building and Team Management a. Ensure completion of necessary trainings and certifications b. Develop and present Wipro's point of view on data visualization concepts and architecture by writing white papers, blogs, etc. c. Be the voice of Wipro's thought leadership by speaking in forums (internal and external) d. Mentor developers, designers and junior architects for their further career development and enhancement e. Anticipate new talent requirements as per market/industry trends or client requirements f. Hire adequate and right resources for the team g. Contribute to the data visualization practice by conducting selection interviews, etc. Deliver No Performance Parameter Measure 1. Project Delivery Quality of design/architecture; delivery as per cost, quality and timeline. Mandatory Skills: Business Analyst/Data Analyst (Maps). Experience: 8-10 Years.

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters Do 1. Instrumental in understanding the requirements and design of the product/software Develop software solutions by studying information needs, systems flow, data usage and work processes Investigate problem areas throughout the software development life cycle Facilitate root cause analysis of system issues and problem statements Identify ideas to improve system performance and impact availability Analyze client requirements and convert requirements to feasible design Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Confer with project managers to obtain information on software capabilities 2. Perform coding and ensure optimal software/module development Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
Analyze information to recommend and plan the installation of new systems or modifications of an existing system Ensure that code is error-free, with no bugs or test failures Prepare reports on programming project specifications, activities and status Ensure all code issues are raised as per the norms defined for the project/program/account with a clear description and replication patterns Compile timely, comprehensive and accurate documentation and reports as requested Coordinate with the team on daily project status and progress and document it Provide feedback on usability and serviceability, trace the result to quality risk and report it to concerned stakeholders 3. Status Reporting and Customer Focus on an ongoing basis with respect to the project and its execution Capture all the requirements and clarifications from the client for better quality work Take feedback on a regular basis to ensure smooth and on-time delivery Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code Document all necessary details and reports in a formal way for proper understanding of the software from client proposal to implementation Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc. Respond to customer requests in a timely manner, with no instances of complaints either internally or externally Deliver No. Performance Parameter Measure 1. Continuous Integration, Deployment & Monitoring of Software 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan 2.
Quality & CSAT On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation 3. MIS & Reporting 100% on-time MIS & report generation Mandatory Skills: Data Governance. Experience: 5-8 Years.

Posted 1 week ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Pune

Work from Office

About The Role eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The role also involves creating technical specifications and product descriptions for online presentation, as well as consultancy projects on redesigning e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and willingness to take up responsibility beyond the assigned work area is a plus. Apprentice Analyst roles and responsibilities: Data enrichment/gap fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources like the internet, specific websites, databases, etc. Data quality check and correction. Data profiling and reporting (basic). Email communication with the client on request acknowledgment, project status and responses to queries. Help customers enhance their product data quality from the technical specification and description perspective. Provide technical consulting to the customer category managers around industry best practices for product data enhancement. Technical and Functional Skills: Bachelor's Degree (Any Graduate). Good understanding of tools and technology. Intermediate knowledge of MS Office/Internet.

Posted 1 week ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Mumbai

Work from Office

About The Role eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality. The role also involves creating technical specifications and product descriptions for online presentation, as well as consultancy projects on redesigning e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and willingness to take up responsibility beyond the assigned work area is a plus. Apprentice Analyst roles and responsibilities: Data enrichment/gap fill, adding attributes, standardization, normalization, and categorization of online and offline product data via research through different sources like the internet, specific websites, databases, etc. Data quality check and correction. Data profiling and reporting (basic). Email communication with the client on request acknowledgment, project status and responses to queries. Help customers enhance their product data quality from the technical specification and description perspective. Provide technical consulting to the customer category managers around industry best practices for product data enhancement. Technical and Functional Skills: Bachelor's Degree (Any Graduate). Good understanding of tools and technology. Intermediate knowledge of MS Office/Internet.

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Coimbatore

Work from Office

About the job: Exp: 5+ yrs NP: Immediate to 15 days Rounds: 3 rounds (virtual) Mandatory Skills: Apache Spark, Hive, Hadoop, Scala, Databricks Job Description: The Role: - Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights. - Constructing infrastructure for efficient ETL processes from various sources and storage systems. - Leading the implementation of algorithms and prototypes to transform raw data into useful information. - Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations. - Creating innovative data validation methods and data analysis tools. - Ensuring compliance with data governance and security policies. - Interpreting data trends and patterns to establish operational alerts. - Developing analytical tools, programs, and reporting mechanisms. - Conducting complex data analysis and presenting results effectively. - Preparing data for prescriptive and predictive modeling. - Continuously exploring opportunities to enhance data quality and reliability. - Applying strong programming and problem-solving skills to develop scalable solutions. Requirements: - Experience in Big Data technologies (Hadoop, Spark, NiFi, Impala). - 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines. - High proficiency in Scala/Java and Spark for applied large-scale data processing. - Expertise with big data technologies, including Spark, Data Lake, and Hive.

Posted 1 week ago

Apply

12.0 - 16.0 years

35 - 50 Lacs

Chennai

Work from Office

Job Summary As an Infra. Architect you will be responsible for designing and implementing robust infrastructure solutions using Microsoft technologies. You will collaborate with cross-functional teams to ensure seamless integration and security of systems. Your expertise in Microsoft Purview, the Microsoft Defender suite, and Azure AD Identity Protection will be crucial in enhancing the company's infrastructure capabilities. Responsibilities Design and implement infrastructure solutions leveraging Microsoft technologies to meet business needs. Collaborate with cross-functional teams to ensure seamless integration of systems and applications. Provide expertise in Microsoft Purview to enhance data governance and compliance across the organization. Utilize the Microsoft Defender suite to strengthen the security posture of the company's infrastructure. Implement Azure AD Identity Protection to safeguard user identities and access management. Configure Always On VPN to ensure secure remote access for employees. Deploy AppLocker to control application execution and enhance endpoint security. Utilize Microsoft Defender ATP to detect, investigate, and respond to advanced threats. Manage Microsoft Entra ID to streamline identity and access management processes. Enhance Microsoft 365 security to protect organizational data and communications. Oversee the hybrid work model implementation, ensuring seamless connectivity and security. Provide technical guidance and support to IT teams to resolve complex infrastructure issues. Contribute to the continuous improvement of infrastructure processes and practices. Qualifications Possess a deep understanding of Microsoft Purview and its application in data governance. Demonstrate expertise in the Microsoft Defender suite for comprehensive security management. Have experience with Azure AD Identity Protection to enhance identity security. Be proficient in configuring Always On VPN for secure remote access.
Show capability in deploying AppLocker for application control. Be skilled in using Microsoft Defender ATP for threat detection and response. Have knowledge of Microsoft Entra ID for effective identity management. Certifications Required: Microsoft Certified: Azure Solutions Architect Expert; Microsoft Certified: Security, Compliance, and Identity Fundamentals

Posted 1 week ago

Apply

5.0 - 8.0 years

3 - 6 Lacs

Navi Mumbai

Work from Office

Required Details: 1. Total IT Exp: 2. Exp in Kafka: 3. Exp in Kafka Connect, Schema Registry, Kafka Streams: 4. Exp in Kafka cluster: 5. Current CTC: 6. Expected CTC: 7. Notice Period/LWD: 8. Current Location: 9. Willing to relocate to Navi Mumbai: 10. Willing to work on alternate Saturdays: Job Title: Kafka Administrator (5+ Years Experience) Location: CBD Belapur, Navi Mumbai Job Type: [Full-time] Experience Required: 5+ Years Educational Qualification: B.E/B.Tech/BCA/B.Sc-IT/MCA/M.Sc-IT/M.Tech Job Summary: We are looking for a skilled and experienced Kafka Administrator with a minimum of 5 years of experience in managing Apache Kafka environments. The ideal candidate will be responsible for the deployment, configuration, monitoring, and maintenance of Kafka clusters to ensure system scalability, reliability, and performance. Key Responsibilities: Install, configure, and maintain Apache Kafka clusters in production and development environments. Monitor Kafka systems using appropriate tools and proactively respond to issues. Set up Kafka topics, manage partitions, and define data retention policies. Perform upgrades and patch management for Kafka and its components. Collaborate with application teams to ensure seamless Kafka integration. Troubleshoot and resolve Kafka-related production issues. Develop and maintain scripts for automation of routine tasks. Ensure security, compliance, and data governance for Kafka infrastructure. Maintain documentation and operational runbooks. Required Skills: Strong experience with Apache Kafka and its ecosystem (Kafka Connect, Schema Registry, Kafka Streams). Proficient in Kafka cluster monitoring and performance tuning. Experience with tools such as Prometheus, Grafana, and the ELK stack. Solid knowledge of Linux/Unix system administration. Hands-on experience with scripting languages like Bash and Python. Familiarity with DevOps tools (Ansible, Jenkins, Git). Experience with cloud-based Kafka deployments (e.g., Confluent Cloud, AWS MSK) is a plus.
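The topic, partition, and retention duties above can be sketched with the stock Kafka CLI tools. This is a hedged ops fragment, not a tested recipe: the broker address, topic name, and settings are hypothetical examples.

```shell
# Create a topic with explicit partition and replication settings
# (hypothetical topic name and broker address).
kafka-topics.sh --create \
  --topic orders \
  --partitions 6 \
  --replication-factor 3 \
  --bootstrap-server localhost:9092

# Apply a 7-day retention policy (604800000 ms) to the topic.
kafka-configs.sh --alter \
  --entity-type topics --entity-name orders \
  --add-config retention.ms=604800000 \
  --bootstrap-server localhost:9092

# Verify the topic's partition layout and configuration.
kafka-topics.sh --describe --topic orders --bootstrap-server localhost:9092
```

In practice these commands require a reachable broker, and managed offerings (Confluent Cloud, AWS MSK) expose the same settings through their own tooling.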
Qualification Criteria: Candidates must hold at least one of the following degrees: - B.E (Bachelor of Engineering) - B.Tech (Bachelor of Technology) - BCA (Bachelor of Computer Applications) - B.Sc-IT (Bachelor of Science in Information Technology) - MCA (Master of Computer Applications) - M.Sc-IT (Master of Science in Information Technology) - M.Tech (Master of Technology) Preferred Certifications (Not Mandatory): Confluent Certified Administrator for Apache Kafka (CCAAK) Linux and Cloud Administration Certifications (RHCSA, AWS, Azure)

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Job title: Business Analyst Responsibilities: Analytical Support: Gather all operational and financial data across all centers to provide inputs into the weekly MIS as well as a Monthly Review Meeting. Drive meaningful weekly/monthly reports that will help the regional managers take decisions on their centers' health. Analyse financial data (budgets, income statements, etc.) to understand Oasis Fertility's financial health. Coordinate all operational issues captured at the center level and program-manage their closure through cross-functional collaboration. Evaluate operational expenditures (OPEX) and capital expenditures (CAPEX) against the budget to identify variances. Analyse operational data to identify trends and areas for improvement. Conduct ad-hoc analytics towards a hypothesis and derive insights that will impact business performance. Operational Support: Coordinate assimilation of data for calculating doctor payouts and facilitate the final file to finance. Coordinate and assimilate data to calculate incentives for the eligible operations team members. Use key metrics like yearly growth, return on assets (ROA), return on equity (ROE), and earnings per share (EPS) to assess operational performance. Collaborate with the operations and finance teams to ensure alignment between operational and financial goals. Strategic Support: Conduct business studies to understand past, present, and potential future performance. Conduct market research to stay updated on financial trends in the fertility industry. Evaluate the effectiveness of current processes and recommend changes for better efficiency. Develop data-driven recommendations to improve operational efficiency. Prepare financial models to assess the profitability of different business units and potential investment opportunities. Participate in process improvement initiatives and policy development to optimize business functions.
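The metrics named above have simple definitions: ROE is net income divided by shareholders' equity, and EPS is net income divided by shares outstanding. A small sketch with invented figures:

```python
# Metric sketch for the posting's KPIs; all figures below are invented.
def return_on_equity(net_income, shareholders_equity):
    """ROE = net income / shareholders' equity."""
    return net_income / shareholders_equity

def earnings_per_share(net_income, shares_outstanding):
    """EPS = net income / shares outstanding."""
    return net_income / shares_outstanding

roe = return_on_equity(2_500_000, 10_000_000)   # 0.25, i.e. 25%
eps = earnings_per_share(2_500_000, 1_000_000)  # 2.5 per share
```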

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies