7.0 - 11.0 years
15 - 20 Lacs
Mumbai
Work from Office
This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack.
Data Architecture: Develop and maintain the overall data architecture, including data models, data flows, and data quality standards. Design and implement data warehouses, data marts, and data lakes on the Microsoft Azure platform.
Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI.
Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory (see the sketch below this listing).
Data Governance: Establish and enforce data governance policies and standards.
Primary Skills & Experience: 15+ years of relevant experience in data warehousing, BI, and data governance. Proven track record of delivering successful data solutions on the Microsoft stack. Experience working with diverse teams and stakeholders.
Required Skills and Experience - Technical Skills: Strong proficiency in data warehousing concepts and methodologies. Expertise in Microsoft Power BI. Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Knowledge of SQL and scripting languages (Python, PowerShell). Strong understanding of data modeling and ETL/ELT processes.
Secondary Skills - Soft Skills: Excellent communication and interpersonal skills. Strong analytical and problem-solving abilities. Ability to work independently and as part of a team. Strong attention to detail and organizational skills.
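As a rough illustration of the ELT work described above, here is a minimal PySpark sketch of a Databricks transform step. The storage path, column names, and target table are hypothetical, and in practice a pipeline like this would be orchestrated by Azure Data Factory rather than run standalone.

```python
# A minimal PySpark ELT sketch for Azure Databricks. The storage path,
# column names, and target table are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales_elt").getOrCreate()

# Extract: read raw CSVs landed in the lake (e.g., by an ADF copy activity).
raw = spark.read.option("header", True).csv(
    "abfss://landing@examplelake.dfs.core.windows.net/sales/"
)

# Transform: enforce types and drop rows that would break reporting.
clean = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .filter(F.col("amount").isNotNull())
)

# Load: persist as a curated table that Power BI can query.
clean.write.mode("overwrite").saveAsTable("curated.sales_orders")
```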
Posted 1 month ago
1.0 - 3.0 years
7 - 10 Lacs
Coimbatore
Work from Office
The Opportunity: This is an entry-level position supporting Avantor's data management strategies by investigating and resolving data quality issues in enterprise applications via deletion and merging, while safeguarding against data loss. You will execute mass data management processes while ensuring data quality; manage documentation, updates to the Data Dictionary, and data management training materials under the guidance of the Enterprise Data Management & Analytics team; and coordinate and conduct mass data imports into core systems and mass data-cleansing initiatives, ensuring integrity and eliminating redundancy from corporate databases.
Job Summary: The Junior Associate in Customer & Vendor Master Data will be responsible for maintaining, updating, and ensuring the accuracy of customer and vendor information in the organization's database. This role requires high attention to detail and the ability to work collaboratively with internal teams and external stakeholders to support data integrity and smooth business operations. Experience: 0 to 1 year.
Key Responsibilities: Maintain and update customer and vendor master data within the company's database, ensuring accuracy and completeness. Verify and validate new customer and vendor data by liaising with relevant departments or stakeholders. Assist with the creation and review of data entry guidelines and processes. Support the data entry process for both new customers and vendors, as well as modifications to existing records. Ensure compliance with data governance standards, including privacy policies and regulations. Collaborate with internal teams (e.g., Sales, Procurement, Finance) to resolve any discrepancies or issues related to master data. Monitor data quality and take proactive steps to identify and resolve data inaccuracies. Assist in running regular data audits and clean-up activities to maintain the integrity of the customer and vendor database. Prepare and maintain reports related to master data for review by management. Provide support for system upgrades or data migration activities, ensuring data integrity is maintained. Assist in handling queries related to master data from both internal and external stakeholders.
Qualifications: Any degree. Strong attention to detail and accuracy in data entry and management. Basic understanding of database management systems and data governance principles. Proficient in the Microsoft Office Suite (Excel, Word, etc.). Strong communication and interpersonal skills to collaborate with various teams. Ability to manage multiple tasks with competing deadlines. Previous experience in data management or administrative support is a plus.
Skills and Competencies: Attention to detail and accuracy in handling sensitive data. Analytical mindset with the ability to identify and resolve discrepancies. Strong organizational and time management skills. Problem-solving and troubleshooting abilities. Ability to work independently and as part of a team.
Disclaimer: The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of employees assigned to this position. Avantor is proud to be an equal opportunity employer.
Why Avantor: Dare to go further in your career. Join our global team of 14,000+ associates whose passion for discovery and determination to overcome challenges relentlessly advances life-changing science.
The work we do changes people's lives for the better. It brings new patient treatments and therapies to market, giving a cancer survivor the chance to walk his daughter down the aisle. It enables medical devices that help a little boy hear his mom's voice for the first time. Outcomes such as these create unlimited opportunities for you to contribute your talents, learn new skills, and grow your career at Avantor. We are committed to helping you on this journey through our diverse, equitable and inclusive culture, which includes learning experiences to support your career growth and success. At Avantor, dare to go further and see how the impact of your contributions sets science in motion to create a better world. Apply today!
EEO Statement: We are an Equal Employment/Affirmative Action employer and VEVRAA Federal Contractor. We do not discriminate in hiring on the basis of sex, gender identity, sexual orientation, race, color, religious creed, national origin, physical or mental disability, protected Veteran status, or any other characteristic protected by federal, state/province, or local law. If you need a reasonable accommodation for any part of the employment process, please contact us by email at recruiting@avantorsciences.com and let us know the nature of your request and your contact information. Requests for accommodation will be considered on a case-by-case basis. Please note that only inquiries concerning a request for reasonable accommodation will be responded to from this email address.
3rd party non-solicitation policy:
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Pune
Work from Office
This job is hiring for a Pan-India location. Skills: Marketing Cloud Intelligence (Datorama) Specialist, Marketing Cloud Personalization (Interaction Studio), Marketing Cloud Account Engagement (Pardot), Salesforce Data Cloud, Salesforce Marketing Cloud B2C. We are seeking an experienced Salesforce Data Cloud professional. The ideal candidate will have a deep understanding of Salesforce Marketing Cloud (SFMC), Pardot, or Marketing Cloud Personalization, with a focus on data management, customer segmentation, and analytics within the Salesforce Data Cloud environment. This role will involve designing and implementing data solutions that drive customer engagement, optimize marketing efforts, and enhance data-driven decision-making. Key areas: Salesforce Data Cloud implementation, data management and integration, customer segmentation and personalization, analytics and reporting, and data governance and compliance.
Posted 1 month ago
6.0 - 10.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Position: Palantir Foundry & PySpark Data Engineer
Location: Hyderabad (PG&E Office)
Key Skills: Palantir Foundry, Python, Spark, AWS, PySpark
Experience: 6-10 years
Responsibilities: Preferred candidates have experience with Palantir Foundry (Code Repository, Contour, Data Connection, and Workshop); Palantir Foundry experience is a must-have (see the sketch after this listing). Develop and enhance data processing, orchestration, monitoring, and more by leveraging popular open-source software, AWS, and GitLab automation. Collaborate with product and technology teams to design and validate the capabilities of the data platform. Identify, design, and implement process improvements: automating manual processes, optimizing for usability, and re-designing for greater scalability. Provide technical support and usage guidance to the users of our platform's services. Drive the creation and refinement of metrics, monitoring, and alerting mechanisms to give us the visibility we need into our production services.
Qualifications: Experience building and optimizing data pipelines in a distributed environment. Experience supporting and working with cross-functional teams. Proficiency working in a Linux environment. 4+ years of advanced working knowledge of Palantir Foundry, SQL, Python, and PySpark. 2+ years of experience using a broad range of AWS technologies. Experience using tools such as Git/Bitbucket, Jenkins/CodeBuild, and CodePipeline. Experience with platform monitoring and alerting tools.
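For illustration, here is a minimal sketch of a Foundry Python transform of the kind this listing describes, assuming Foundry's standard transforms.api decorators; the dataset paths and column names are hypothetical.

```python
# Sketch of a Foundry Python transform, assuming the standard
# transforms.api decorators; dataset paths and columns are hypothetical.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output

@transform_df(
    Output("/Example/datasets/clean/meter_readings"),
    source=Input("/Example/datasets/raw/meter_readings"),
)
def clean_meter_readings(source):
    # Drop negative readings, normalise timestamps, and deduplicate.
    return (
        source.filter(F.col("reading_kwh") >= 0)
              .withColumn("read_at", F.to_timestamp("read_at"))
              .dropDuplicates(["meter_id", "read_at"])
    )
```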
Posted 1 month ago
9.0 - 14.0 years
18 - 25 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Dear Professional, We are excited to present a unique opportunity at Cognizant, a leading IT firm renowned for fostering growth and innovation. We are seeking talented professionals with 9 to 14 years of experience in Power BI Administration, Power BI Desktop, Power BI Service, Workspace Management, Dataset Management, Report Publishing, Tenant Migration, Security, Performance Optimization, SQL Server, RLS, Data Governance, DAX Optimization, Azure Synapse Analytics, and L3/L4 support to join our dynamic team. Your expertise in these areas is highly sought after, and we believe your contributions will be instrumental in driving our projects to new heights. We offer a collaborative environment where your skills will be valued and nurtured.
To proceed to the next step of the recruitment process, please send the following details along with your updated resume to sathish.kumarmr@cognizant.com. Please share the details below (mandatory):
Full Name (as per PAN card):
Contact number:
Email:
Current Location:
Interested Locations:
Total years of experience:
Relevant years of experience:
Current company:
Notice period:
Is the notice period negotiable? If yes, by how many days?
If you are serving a notice period, please mention your last working date:
Current CTC:
Expected CTC:
Availability for interview on weekdays?
Highest qualification?
Additionally, we would like to schedule a virtual interview with you on 26th May 2025. Kindly confirm your availability for the same. We look forward to the possibility of you bringing your valuable experience to Cognizant. Please respond at your earliest convenience.
Thanks & Regards,
Sathish Kumar M R
HR - Cognizant
Sathish.KumarMR@cognizant.com
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Mumbai
Work from Office
We are hiring Informatica CDQ professionals with 4 to 8 years of experience for a contract position (6 months to 1 year).
Type: Contract (6 months to 1 year)
Start Date: Immediate joiners preferred
Skills Required: Strong hands-on experience with Informatica Cloud Data Quality (CDQ). Expertise in data profiling, data cleansing, and implementing data quality rules. Solid knowledge of data governance and data management. Strong troubleshooting and performance optimization skills.
To Apply: Please share your updated resume along with your current CTC, expected CTC, and current location. Email to: navaneetha@suzva.com
Posted 1 month ago
4.0 - 9.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Develop a strong understanding of business flows, processes, and architecture, and leverage that in designing and developing BI content. Translate business/functional requirements into technical specifications encompassing the ETL, metadata, and reporting layers. Significant experience in the areas of HANA modeling, HANA data provisioning, HANA views, SQL Script, and stored procedures. Hands-on development within all layers of the SAP Datasphere environment: data acquisition, modeling, transformation, and load to the HANA platform. Design, build, and execute data conversions from legacy systems. Write and execute test plans (unit, regression, and integration). Produce technical and user documentation and training. Provide production support and user support, including researching data questions. Provide technical guidance and oversee work completed by junior team members.
Required Experience: 6+ years of full life-cycle experience in Data Warehouse/Reporting projects, preferably in an SAP environment. Hands-on experience in all facets of EDW architecture, data flow strategy, data modeling, metadata, and master data management. Experience with SAP HANA architecture and HANA modeling. Understanding of HANA data provisioning, HANA views, and SQL Script. Experience in an SAP Datasphere implementation project with different data sources such as ERP (SAP ECC on HANA) and Oracle R-12. Knowledge of extracting data from various sources, including SAP and non-SAP systems, to SAP Datasphere is required. Knowledge of advanced modeling concepts, including analytic views, attribute views, hierarchies, creating restricted and calculated columns, filter operations, variables, creating calculation views, SAP HANA SQL, SQL Script, and procedures. Understanding of BI and HANA security (users, roles, privileges, etc.). Excellent written and verbal communication skills. Strong technical documentation. SAP Datasphere knowledge.
Location: Remote
Posted 1 month ago
5.0 - 9.0 years
25 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Role: Microsoft Purview Consultant
Experience: 5-8 years
Location: All EXL locations (hybrid)
Key Responsibilities
Data Governance & Compliance: Design and implement Microsoft Purview solutions to ensure data classification, retention, and protection, aligning with organizational and regulatory standards.
Policy Development: Develop and enforce data policies, including Data Loss Prevention (DLP), Insider Risk Management (IRM), and Information Rights Management (IRM), to safeguard sensitive information.
Integration & Architecture: Leverage Azure core infrastructure to integrate Purview with other Azure services and Microsoft 365 applications, ensuring robust and secure data governance solutions.
Collaboration & Stakeholder Engagement: Work closely with IT, security, compliance, and business teams to understand requirements and deliver effective solutions, including providing training and support to end users and IT staff.
Documentation & Reporting: Generate comprehensive as-built documentation representing the total output of work delivered to clients, ensuring transparency and alignment with organizational policies.
Qualifications & Skills
Experience: Typically 3-8 years of experience in data governance, compliance, and security within a Microsoft 365 environment.
Certifications: Relevant certifications, such as Microsoft Certified: Security, Compliance, and Identity Fundamentals or Microsoft Certified: Information Protection Administrator Associate, are often preferred.
Technical Skills: Proficiency in Microsoft Purview, Microsoft 365 applications (Exchange Online, SharePoint, Teams, OneDrive), and Azure services.
Analytical & Communication Skills: Strong analytical and problem-solving skills, along with excellent communication and interpersonal abilities to collaborate effectively with various teams.
Posted 1 month ago
5.0 - 9.0 years
8 - 12 Lacs
Mumbai
Hybrid
Requirements
- Support the implementation of the data governance framework across Zurich Cover-More, ensuring regulatory compliance and adherence to Zurich standards.
- Collaborate with data owners to create documentation on data quality, privacy, and retention.
- Manage metadata in the data catalogue, ensuring accuracy for reviews and tests.
- Engage with various stakeholders to promote the adoption of the governance framework.
- Monitor project scope, timelines, and resources while identifying risks.
- Define and implement data quality rules with data owners and stewards.
- Work with legal and compliance teams to ensure adherence to data security and privacy regulations such as GDPR.
- Develop training materials on data governance principles.
- Establish metrics to track and improve the effectiveness of the governance framework.
- Process access requests and evaluate changes related to data assets.
- Refine the framework based on feedback and business needs.
- Conduct Privacy Impact Assessments to identify and mitigate risks with personal data.
- Manage the OneTrust platform for Data Mapping and Automated Assessments.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field with 5+ years of experience.
- Experience in data management or related fields.
- Understanding of data governance frameworks and concepts.
- Strong collaboration skills to work with cross-functional teams.
- Relevant certifications (CIPP/E, CDMP) are beneficial.
- Familiarity with GDPR, CCPA, and data privacy best practices.
- Experience conducting risk assessments and DPIAs.
Posted 1 month ago
3.0 - 6.0 years
12 - 22 Lacs
Gurugram
Work from Office
Overview
170+ Years Strong. Industry Leader. Global Impact. At Pinkerton, the mission is to protect our clients. To do this, we provide enterprise risk management services and programs specifically designed for each client. Pinkerton employees are one of our most important assets and critical to the delivery of world-class solutions. Bonded together, we share a commitment to integrity, vigilance, and excellence. Pinkerton is an inclusive employer who seeks candidates with diverse backgrounds, experiences, and perspectives to join our family of industry subject matter experts.
The Data Engineer will be part of a high-performing and international team whose goal is to expand Data & Analytics solutions for our CRM application, which is live in all Securitas countries. Together with the dedicated Frontend & BI Developer, you will be responsible for managing and maintaining the Databricks-based BI platform. The processes from data model changes to the implementation and development of pipelines are part of the daily focus, but ETL will get most of your attention (a minimal ETL sketch follows this listing). Continuous improvement requires the ability to think bigger and to work closely with the whole team. The Data Engineer (ETL Specialist) will collaborate with the Frontend & BI Developer to align on possibilities to improve the BI platform deliverables, specifically for the CEP organization. Cooperation with other departments, such as integrations or specific IT/IS projects and business specialists, is part of the job. The expectation is to always take data privacy into consideration when moving or sharing data; for that purpose, there is a need to develop the security layer as agreed with the legal department.
Responsibilities
Represent Pinkerton's core values of integrity, vigilance, and excellence. Maintain and develop the Databricks workspace used to host the BI CEP solution. Actively advise on changes needed in the data model to accommodate new BI requirements. Develop and implement new ETL scripts and improve the current ones. Take ownership of resolving incoming tickets for both incidents and requests. Plan activities to stay close to the Frontend & BI Developer to foresee upcoming changes to the backend. Improve teamwork across team members, using the DevOps tool to track the status of deliverables from start to end. Ensure understanding and visible implementation of the company's core values of integrity, vigilance, and helpfulness. Maintain knowledge of the skills and experience available and required in your area, today and tomorrow, to drive liaison with other departments if needed. All other duties, as assigned.
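As a sketch of the kind of incremental ETL work this listing emphasizes, the following assumes a Delta table on Databricks and upserts a daily CRM extract into it; the mount path, table name, and key column are assumptions.

```python
# Hypothetical incremental ETL step on Databricks: upsert the day's CRM
# extract into a Delta table. Path, table, and key column are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("/mnt/landing/crm/contacts/")
updates.createOrReplaceTempView("contact_updates")

# MERGE keeps the load idempotent: re-running the same extract updates
# existing rows instead of duplicating them.
spark.sql("""
    MERGE INTO bi.contacts AS t
    USING contact_updates AS s
    ON t.contact_id = s.contact_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```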
Qualifications
At least 3+ years of experience in data engineering. Understanding of designing and implementing data processing architectures in Azure environments. Experience with different SSAS modelling techniques (preferably Azure/Databricks, Microsoft-related). Understanding of data management and treatment to secure data governance and security (platform management and administration). An analytical mindset with clear communication and problem-solving skills. Experience working in a SCRUM setup. Fluent in English, both spoken and written; knowledge of additional language(s) is a bonus. Ability to communicate, present, and influence credibly at all levels, both internally and externally. Business acumen and commercial awareness.
Working Conditions: With or without reasonable accommodation, requires the physical and mental capacity to effectively perform all essential functions; regular computer usage; occasional reaching and lifting of small objects and operating office equipment; frequent sitting, standing, and/or walking; travel, as required.
Pinkerton is an equal opportunity employer to all applicants and positions without regard to race/ethnicity, color, national origin, ancestry, sex/gender, gender identity/expression, sexual orientation, marital/prenatal status, pregnancy/childbirth or related conditions, religion, creed, age, disability, genetic information, veteran status, or any protected status by local, state, federal or country-specific law.
Posted 1 month ago
8.0 - 13.0 years
16 - 20 Lacs
Noida
Work from Office
Who We Are
Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.
About the Job
At Brightly, our dedication to innovation drives our product management team to create products that address our customers' evolving needs. As a Senior Product Manager for Data, you will work in a collaborative, energetic, dynamic, and creative environment to drive the product strategy of market-leading data products for our emerging and existing vertical-focused products and services. Reporting to the Director of Product Management, you will play a crucial role in shaping a forward-thinking Data & AI strategy aligned with market needs.
What you'll be doing
Your key responsibility is to develop and execute a comprehensive product strategy aligned with market demands and business goals through:
Drive monetization: Build new high-value offers on our Snowflake Data Cloud. Build data-as-a-product to drive revenue. Enable the Reporting, Analytics and AI roadmap.
Data-Driven Decision Making: Utilize analytics and data to drive product decisions, measure success, and iterate on features.
Market Analysis: Stay up to date with industry trends, competitor products, and emerging technologies to ensure our data products remain competitive.
Stakeholder Management: Collaborate with stakeholders across the organization to align on product goals, priorities, financials, and timelines. Exposure to working with Legal to ensure compliance and data governance, data integrity, and data retention.
Customer Focus: Deep empathy for users and a passion for creating delightful product experiences.
User Research and Insights: Conduct user research and gather insights to inform product decisions and ensure the data products meet user needs and expectations.
Resource Management: Identify value-driven opportunities (including establishing TAM, SAM, SOM), understand and share the financial outlook, and align resources effectively to drive success for your domain.
What you'll need
Education: Bachelor's degree and an advanced degree in a technical field or an MBA from a top business school (IIM, XLRI, ISB), or equivalent experience.
Experience: Overall 8+ years of experience with at least 2 years in a business-facing role, preferably in SaaS/PaaS.
Adaptability: Comfortable navigating and prioritizing in situations of ambiguity, especially in the early stages of discovery and product development. Motivated self-starter with the ability to learn and adapt.
Communication Skills: Strong communication and social skills; ability to work across teams with geographically remote team members, including the ability to frame complex concepts for a non-technical audience.
Influence: Demonstrated ability to influence without authority and communicate to multi-level audiences, including growing and mentoring more junior product peers.
Who we are
Brightly, the global leader in intelligent asset management solutions, enables organizations to transform the performance of their assets. Brightly's sophisticated cloud-based platform leverages more than 20 years of data to deliver predictive insights that help users through the key phases of the entire asset lifecycle.
More than 12,000 clients of every size worldwide depend on Brightly's complete suite of intuitive software, including CMMS, EAM, Strategic Asset Management, IoT Remote Monitoring, Sustainability and Community Engagement. Paired with award-winning training, support, and consulting services, Brightly helps light the way to a bright future with smarter assets and sustainable communities.
The Brightly culture
Service. Ingenuity. Integrity. Together. These values are core to who we are and help us make the best decisions, manage change, and provide the foundations for our future. These guiding principles help us innovate, flourish, and make a real impact in the businesses and communities we help to thrive. We are committed to the great experiences that nurture our employees and the people we serve while protecting the environments in which we live. Together we are Brightly.
Posted 1 month ago
8.0 - 10.0 years
15 - 20 Lacs
Gurugram
Work from Office
Position Summary: We are looking for an experienced Microsoft 365 Specialist to join our dynamic team to streamline enterprise project data. The ideal candidate will possess strong proficiency in Microsoft 365 applications and generative AI tools, along with extensive knowledge of data governance principles. This role will focus on data aggregation, integration, and the development of a robust data architecture to ensure data integrity and accessibility across multiple digital projects in the organization. The candidate should be capable of acting as a developer to build a future-proof architecture that connects the various data storage options in our Digital business groups. This would make our digital projects future-proof and ready for AI implementation with respect to data flow and data quality, and lead to overall operational excellence.
A Snapshot of your Day
How You'll Make an Impact (responsibilities of the role)
Utilize the full suite of Microsoft 365 applications to streamline data and workflows across different digital projects and segments; customization will be needed as required. Act as a developer to build a future-proof architecture that connects various data storage options, including applications, cloud services, drives, and SharePoint. The designed architecture shall consolidate fragmented data from various sources to create a single, reliable source of truth for accurate reporting and analysis. Integrate and leverage generative AI tools, such as Copilot, to improve data analysis and reporting capabilities. Implement data governance policies, workflows, and practices to ensure data quality, security, and compliance with relevant regulations. Apply data integration and transformation techniques, including ETL (Extract, Transform, Load) processes, to ensure data consistency and accuracy. Collaborate with stakeholders to identify data needs and ensure accurate reporting and analysis. Ensure data integrity and accessibility across the organization, enabling informed decision-making. Communicate effectively with cross-functional teams and stakeholders to understand data requirements and deliver solutions that meet business needs. Provide training and support to team members on data governance policies and procedures and the required use of Microsoft 365 tools. Keep abreast of new features and capabilities in Microsoft 365 related to data governance.
What You Bring
Bachelor's/master's degree in information technology, computer science, or a related field. 8 to 10 years of experience in developing architectures for data governance. Proven experience with Microsoft 365 applications and generative AI tools like Copilot. Strong understanding of data governance principles, practices, and policies. Experienced in utilizing a variety of database management systems and data exchange formats to optimize data storage, retrieval, and interoperability. Knowledge of relevant industry regulations and standards. Proficiency in data architecture design. Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders. Strong problem-solving skills and the ability to work collaboratively in a dynamic team environment across the globe.
Posted 1 month ago
5.0 - 8.0 years
12 - 15 Lacs
Chennai
Work from Office
Job Summary
We are seeking a skilled Informatica DGDQ Developer with over 5 years of experience to manage data governance and data quality processes. The successful candidate will play a crucial role in ensuring data accuracy, consistency, and quality across various systems. You will work closely with cross-functional teams to design and implement solutions that ensure data integrity and compliance with governance policies.
Mandatory Skills: Proficiency in Informatica PowerCenter, including mappings, workflows, and transformations. Strong SQL and PL/SQL skills for data manipulation and querying. Experience with ETL processes and data warehousing concepts. Knowledge of data integration, data quality, and data migration techniques. Experience in performance tuning and troubleshooting of Informatica jobs. Knowledge of version control systems (e.g., Git) and CI/CD processes.
Roles and Responsibilities: Design, develop, and implement Informatica DGDQ solutions to manage data quality and governance. Collaborate with business teams to gather data quality and governance requirements. Perform data profiling, cleansing, and validation using Informatica tools. Develop and maintain data quality rules, workflows, and dashboards. Monitor and troubleshoot data quality issues, ensuring timely resolution. Ensure compliance with data governance policies and frameworks. Work with data stewards and stakeholders to define data standards and best practices. Document data governance processes, data lineage, and metadata management. Provide training and support for data quality tools and processes to the organization.
Qualifications: Bachelor's degree in computer science, information technology, or a related field. 5+ years of experience with Informatica DGDQ (Data Governance and Data Quality). Strong understanding of data governance frameworks and data quality management. Experience with data profiling, cleansing, and validation tools. Informatica Data Governance and Data Quality certification is a plus.
Technical Skills: Expertise in Informatica DGDQ tools for data quality and governance management. Strong knowledge of SQL and data modeling techniques. Experience with data profiling, cleansing, validation, and enrichment. Knowledge of data governance best practices, including data stewardship and metadata management. Experience working with large datasets and complex data environments. Familiarity with data security, compliance, and regulatory requirements.
Soft Skills: Excellent communication and collaboration skills. Ability to work closely with cross-functional teams and stakeholders. Strong problem-solving and analytical abilities. Detail-oriented with a focus on data accuracy and consistency. Adaptability and a proactive approach to addressing data governance challenges.
Good to Have: Experience with cloud platforms like AWS or Azure for data governance and quality solutions. Knowledge of Big Data tools and technologies. Experience with REST APIs for integrating data governance tools with third-party systems.
Work Experience: Minimum of 5 to 8 years of experience in Informatica development and data integration. Proven ability to deliver high-quality data solutions in a fast-paced environment.
Posted 1 month ago
11.0 - 16.0 years
27 - 32 Lacs
Noida
Work from Office
Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data & AI and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.
Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premises databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP (a minimal BigQuery sketch follows this listing).
- Strong experience in BI reporting tools (Looker, Power BI, and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI).
- Strong knowledge of and experience in providing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge of and experience with best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.
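To illustrate the BigQuery side of such pipelines, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and columns are hypothetical.

```python
# Minimal sketch with the google-cloud-bigquery client; the project,
# dataset, and columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT region, SUM(revenue) AS total_revenue
    FROM `example-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
    ORDER BY total_revenue DESC
"""

# Run the query and stream the result rows.
for row in client.query(sql).result():
    print(row.region, row.total_revenue)
```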
Posted 1 month ago
8.0 - 12.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Position: Python Lead
Total Experience Required: 6+ years (around 5 years relevant)
Mandatory skills: Strong Python coding and development
Good-to-have skills: Cloud, SQL, data analysis
Location: Pune - Kharadi - WFO - 3 days/week
About The Role: We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.
Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI (see the sketch after this listing).
- Utilize Databricks, PySpark, SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.
Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the pharma value chain is a plus.
Good to Have Skills:
- Experience with modern data solutions via Azure.
- Knowledge of the principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.
Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.
If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!
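As a small illustration of the RESTful development mentioned above, here is a minimal FastAPI sketch; the endpoint and model are invented for the example.

```python
# Minimal FastAPI sketch; the endpoint and model are invented for
# the example.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Order(BaseModel):
    order_id: int
    amount: float

@app.post("/orders")
def create_order(order: Order) -> dict:
    # A real service would persist the order; here we just acknowledge it.
    return {"status": "accepted", "order_id": order.order_id}

# Run locally with: uvicorn main:app --reload
```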
Posted 1 month ago
7.0 - 10.0 years
1 - 5 Lacs
Pune
Work from Office
Responsibilities:
- Design, develop, and deploy data pipelines using Databricks, including data ingestion, transformation, and loading (ETL) processes (a minimal ingestion sketch follows this listing).
- Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python.
- Work with Delta Lake and other advanced features.
- Leverage Unity Catalog for data governance, access control, and data discovery.
- Develop and optimize data pipelines for performance and cost-effectiveness.
- Integrate with various data sources, including but not limited to databases, cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs.
- Experience working with Parquet files for data storage and processing.
- Experience with data integration from Azure Data Factory, Azure Data Lake, and other relevant Azure services.
- Perform data quality checks and validation to ensure data accuracy and integrity.
- Troubleshoot and resolve data pipeline issues effectively.
- Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions.
- Participate in code reviews and contribute to best practices within the team.
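A minimal sketch of such an ingestion step, assuming a Databricks environment with Unity Catalog enabled; the storage path and the catalog.schema.table name are hypothetical.

```python
# Sketch of a Databricks ingestion step writing to a Unity Catalog table;
# the storage path and catalog.schema.table name are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

events = (
    spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/events/")
         .withColumn("ingested_at", F.current_timestamp())
)

# With Unity Catalog, tables use a three-level name: catalog.schema.table.
events.write.format("delta").mode("append").saveAsTable("main.analytics.events")
```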
Posted 1 month ago
0.0 - 1.0 years
8 - 12 Lacs
Mumbai
Work from Office
About The Role: At RBHU, we are a team of hyper-focused individuals committed to achieving clear, outstanding results. We thrive on a foundation of trust, collaboration, and a shared sense of purpose. By harnessing our collective talents and aligning our efforts, our teams consistently surpass benchmarks and set new standards of excellence.
Requirements: We are looking for bright, young engineers with good mathematics skills and an interest in data analytics/science. Experience: 0 to 2 years. Knowledge of the following will be advantageous:
- SQL query writing
- MS Power BI
- Python/R for data science
Responsibilities: Selected candidates will be responsible for the following:
- Data cleaning and preparation: filtering the data, handling missing values, and preparing the dataset for analysis to ensure accuracy and relevance (see the sketch after this listing).
- Data exploration and analysis: using statistical tools and techniques to explore and analyze data, identifying patterns, relationships, and trends.
- Data visualization: creating visual representations of data findings through charts, graphs, and dashboards to make the data understandable.
- Reporting: preparing reports and presentations to communicate insights and findings from the data to stakeholders, which can influence policy and decision-making processes.
- Collaboration: working with other departments to understand their data needs and help them make informed decisions based on data insights.
- Understand requirements given by business users in the functional domain.
- Access data stored in databases/warehouses/flat files.
- Create/test/deploy intuitive, interactive, analytical dashboards.
- Create algorithms in Python/R for advanced analytics.
- Create data governance policies like RLS to manage data security.
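For a flavour of the data cleaning and preparation work described above, here is a small pandas sketch; the file name and columns are assumptions.

```python
# Small pandas data-cleaning sketch; the file name and columns are
# assumptions.
import pandas as pd

df = pd.read_csv("survey_responses.csv")

# Handle missing values and obvious outliers before any analysis.
df["age"] = pd.to_numeric(df["age"], errors="coerce")
df = df.dropna(subset=["respondent_id"])
df["age"] = df["age"].fillna(df["age"].median())
df = df[df["age"].between(0, 110)]

# Quick exploration: summary statistics and a group-level view.
print(df.describe())
print(df.groupby("region")["satisfaction_score"].mean().sort_values())
```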
Posted 1 month ago
2.0 - 4.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Manager, Internal Audit - Bengaluru, India
About PhonePe Group: PhonePe is India's leading digital payments company with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants, covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (insurance, mutual funds, stock broking, and lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store, which is India's first localized app store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services.
Culture: At PhonePe, we take extra care to make sure you give your best at work, every day! And creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country, and executing your dreams with purpose and speed, join us!
Job Role / Responsibilities
- Develop a comprehensive understanding of PhonePe's business, systems, and processes.
- Conduct risk-based internal audits across stock-broking processes such as trading, settlements, depository participant (DP) operations, client onboarding, and regulatory reporting.
- Stay updated on and ensure compliance with SEBI, NSE, BSE, NSDL, and CDSL guidelines related to capital market regulations and Research Analyst requirements, including best practices and emerging trends.
- Acquire in-depth process understanding for planned audits, including the processes, systems involved, data flow, and datasets.
- Execute audits from risk assessment, audit planning, and scoping through execution and reporting.
- Prepare/review RCMs, process notes, flow charts, and other working papers.
- Drive to and understand the root cause of observations and follow the 5-why approach for corrective action plans.
- Prepare comprehensive audit reports, ensuring clarity, accuracy, and adherence to internal reporting standards.
- Facilitate discussions with auditees on audit findings, ensuring a clear understanding of identified issues and recommendations, and report significant issues to senior management.
- Provide timely updates on audits to the Internal Audit head, holding regular meetings with auditees.
- Ensure the overall quality of audit reports and audit documentation based on methodology.
- Plan resources and budgets for audits and be able to lead the audit team.
- Handle Internal Audit organizational activities, including budgeting, risk assessment, and external stakeholder management.
- Conduct periodic follow-ups with auditees to monitor the timely and effective implementation of management action plans.
- Demonstrate a good understanding of data governance processes, practices, policies, and guidelines.
Essential Skills/Qualifications
- Minimum 7 years of post-qualification experience in internal audit and/or a relevant function.
- Prior industry experience in the broking industry, especially in exchange or depository operations.
- Chartered Accountant or equivalent qualification.
- Sound understanding of control environments, compliance, and risk frameworks.
- Excellent written and verbal communication skills.
- Strong problem-solving and analytical skills.
- Ability to work in a fast-paced role with competing priorities; adaptable to project requirements and does what is required to get the job done.
- Demonstrated ability for seamless execution, continuous improvement, and problem solving.
Preferred Skills & Qualifications
- Experience in internal audits or organization risk and control functions.
- Review quantitative analysis that translates data into actionable insights.
PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles)
Insurance Benefits: Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
Wellness Program: Employee Assistance Program, Onsite Medical Center, Emergency Support System
Parental Support: Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
Mobility Benefits: Relocation benefits, Transfer Support Policy, Travel Policy
Retirement Benefits: Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
Other Benefits: Higher Education Assistance, Car Lease, Salary Advance Policy
Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.
Posted 1 month ago
7.0 - 8.0 years
17 - 22 Lacs
Mumbai
Work from Office
About the Role
- We are seeking a highly skilled and experienced Senior Data Architect to join our growing data engineering team.
- As a Senior Data Architect, you will play a critical role in designing, developing, and implementing robust and scalable data solutions to support our business needs.
- You will be responsible for defining data architectures, ensuring data quality and integrity, and driving data-driven decision-making across the organization.
Key Responsibilities
- Design and implement data architectures for various data initiatives, including data warehouses, data lakes, and data marts.
- Define data models, data schemas, and data flows for complex data integration projects.
- Develop and maintain data dictionaries and metadata repositories.
- Ensure data quality and consistency across all data sources.
- Design and implement data warehousing solutions, including ETL/ELT processes, data transformations, and data aggregations.
- Support the development and implementation of business intelligence and reporting solutions.
- Optimize data warehouse performance and scalability.
- Define and implement data governance policies and procedures.
- Ensure data security and compliance with relevant regulations (e.g., GDPR, CCPA).
- Develop and implement data access controls and data masking strategies (a toy masking sketch follows this listing).
- Design and implement data solutions on cloud platforms (AWS, Azure, GCP), leveraging cloud-native data services.
- Implement data pipelines and data lakes on cloud platforms.
- Collaborate effectively with data engineers, data scientists, business analysts, and other stakeholders.
- Communicate complex technical information clearly and concisely to both technical and non-technical audiences.
- Present data architecture designs and solutions to stakeholders.
Qualifications (Essential)
- 7+ years of experience in data architecture, data modeling, and data warehousing.
- Strong understanding of data warehousing concepts, including dimensional modeling, ETL/ELT processes, and data quality.
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data integration tools and technologies.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
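As a toy illustration of one masking strategy (deterministic hashing, so masked values can still be joined on), here is a short Python sketch; it is not a production solution and the salt handling is deliberately simplified.

```python
# Toy deterministic masking sketch: the local part of an email is replaced
# by a stable hash so masked datasets can still be joined on the column.
# Not a production solution; salt handling is deliberately simplified.
import hashlib

def mask_email(email: str, salt: str = "example-salt") -> str:
    local, _, domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode()).hexdigest()[:12]
    return f"{digest}@{domain}"

print(mask_email("jane.doe@example.com"))  # e.g. 'a1b2c3d4e5f6@example.com'
```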
Posted 1 month ago
1.0 - 3.0 years
5 - 9 Lacs
Noida
Work from Office
Join us as a Data Analyst at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with dashboard development, reporting, and workflow automation.
To be successful in a Data Analyst role you should have experience with:
Basic/Essential Qualifications: Graduate in any discipline. Hands-on experience with data visualization tools such as Tableau, Power BI, Business Objects, and Alteryx. Proficiency in MS Office, including advanced skills in Excel, PowerPoint, Word, and Visio. Expertise in generating data insights and creating dashboards from large and diverse data sets. Strong automation skills using VBA, Power Query, PowerApps, and other relevant tools. Experience with ETL tools. Proven experience in performing User Acceptance Testing (UAT). Excellent verbal and written communication skills. Highly motivated, business-focused, and forward-thinking. Experience in stakeholder management. Self-driven with a proactive approach to team initiatives. Demonstrated ability to identify and resolve problems independently.
Desirable Skillsets/Good To Have: Proficient in data crunching and analysis, including automation. Solid understanding of Database Management Systems (DBMS). Experience in developing within low-code/no-code environments. Strong grasp of data management principles and data governance. Skilled in designing and managing SharePoint sites. Knowledgeable in procurement processes and practices.
The location is based out of Noida.
Purpose of the role: To support the Risk Function in delivering its objective of safeguarding the bank's financial and operational stability by proactively identifying, assessing, mitigating, and monitoring risks across various business units and activities.
Accountabilities: Development of strategic direction for risk, including the implementation of up-to-date methodologies and processes. Management of the risk department, including oversight of risk colleagues and their performance, implementation of risk priorities and objectives, and oversight of department efficiency and effectiveness. Relationship management of risk stakeholders, including identifying relevant stakeholders and maintaining the quality of external third-party services. Adherence to the Risk policy, standards, and frameworks, and maintaining a robust control environment.
Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. May lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. An individual contributor instead develops technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
Posted 1 month ago
10.0 - 15.0 years
30 - 45 Lacs
Pune
Work from Office
Azure Cloud Data Solutions Architect
Job Title: Azure Cloud Data Solutions Architect
Location: Pune, India
Experience: 10-15 years
Work Mode: Full-time, office-based
Company: Smartavya Analytica Private Limited
Company Overview: Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With expertise spanning 25+ data modernization projects and experience handling large datasets of up to 24 PB in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on data modernization for on-premises, private, and public cloud platforms. Visit us at: https://smart-analytica.com
Job Summary: We are seeking an experienced Azure Cloud Data Solutions Architect to lead end-to-end architecture and delivery of enterprise-scale cloud data platforms. The ideal candidate will have deep expertise in Azure Data Services, Data Engineering, and Data Governance, with the ability to architect and guide cloud modernization initiatives.
Key Responsibilities: Architect and design data lakehouses, data warehouses, and analytics platforms using Azure Data Services. Lead implementations using Azure Data Factory (ADF), Azure Synapse Analytics, and Microsoft Fabric (OneLake ecosystem). Define and implement data governance frameworks, including cataloguing, lineage, security, and quality controls. Collaborate with business stakeholders, data engineers, and developers to translate business requirements into scalable Azure architectures. Ensure platform design meets performance, scalability, security, and regulatory compliance needs. Guide migration of on-premises data platforms to Azure cloud environments. Create architectural artifacts: solution blueprints, reference architectures, governance models, and best-practice guidelines. Collaborate with sales/presales in customer meetings to understand the business requirements and scope of work and propose relevant solutions. Drive MVPs/PoCs and capability demos for prospective customers and opportunities.
Must-Have Skills: 10-15 years of experience in data architecture, data engineering, or analytics solutions. Hands-on expertise in Azure cloud services: ADF, Synapse, Fabric (OneLake), and Databricks (good to have). Strong understanding of data governance, metadata management, and compliance frameworks (e.g., GDPR, HIPAA). Deep knowledge of relational and non-relational databases (SQL, NoSQL) on Azure. Experience with security practices (IAM, RBAC, encryption, data masking) in cloud environments. Strong client-facing skills with the ability to present complex solutions clearly.
Preferred Certifications: Microsoft Certified: Azure Solutions Architect Expert. Microsoft Certified: Azure Data Engineer Associate.
Posted 1 month ago
8.0 - 13.0 years
20 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Zetwerk is looking for an Architect & Data Governance professional to join our dynamic team and embark on a rewarding career journey. Responsibilities include: collaborating with clients, engineers, and other stakeholders to determine project requirements and goals; developing and presenting design concepts, plans, and models to clients for approval; conducting site surveys and analyzing data to determine the best design solutions for a particular location and purpose; preparing detailed drawings and specifications; staying current with relevant building codes, regulations, and industry trends; managing budgets, schedules, and other project-related activities; and ensuring that projects are completed within budget, on time, and to the satisfaction of clients and stakeholders. An Architect must possess a combination of technical, creative, and interpersonal skills.
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Noida
Work from Office
{"company":" At myKaarma, we re not just leading the way in fixed ops solutions for the automotive industry we re redefining what s possible for dealership service centers. Headquartered in Long Beach, California, and powered by a global team, our industry-leading SaaS platform combines communication, scheduling, and payment tools in one seamless solution that keeps dealerships and vehicle owners connected. With myKaarma, every service interaction flows effortlessly, bringing good karma to customers and service teams. Rooted in the principles of the Toyota Production System, we operate with precision, efficiency, and a relentless focus on continuous improvement to deliver a better experience for all. We re looking for innovators, problem-solvers, and tech enthusiasts passionate about building solutions that people love to use. If you re ready to make an impact in an industry ripe for change, join us at myKaarma and help shape the future of automotive service. ","role":" Role Description: We are building a modern data lake architecture centered around BigQuery and Looker, and we re looking for a hands-on Looker Data Engineer/Architect to help us shape and scale our data platform. In this role, you ll own the design and implementation of Looker Views, Explores, and Dashboards, working closely with data stakeholders to ensure accurate, efficient, and business-relevant insights. You ll play a critical role in modelling our existing data architecture into LookML, and driving modelling and visualization best practices across the organization. This will also include reviewing our existing data lake models and identifying inefficiencies/areas of improvement. This role also offers the opportunity to integrate AI/ML in our data lake and provide intelligent insights and recommendations to our internal as well as external customers. Key Responsibilities: Design and develop LookML models, views, and explores based on our legacy data warehouse in MariaDB Create and maintain high-quality dashboards and visualizations in Looker that deliver actionable business insights Collaborate with engineers, product managers, and business stakeholders to gather requirements and translate them into scalable data models Guide other engineers and non-technical staff on how to build and maintain Looker dashboards and models. 
Ensure data accuracy, performance, and efficiency across our Looker and BigQuery resources. Maintain strong ownership over the Looker platform, proactively improving structure, documentation, and data usability. Monitor and troubleshoot data issues in Looker and BigQuery. Required Skills and Qualifications: 5+ years of experience in data engineering and 2+ years of hands-on experience with Looker, including LookML modeling and dashboard development. Strong experience with Google BigQuery, including writing and optimizing complex SQL queries and managing BigQuery costs. Experience with building and maintaining projects in Google Cloud. Experience implementing row-level security, access controls, or data governance in Looker. Proven ability to manage and own end-to-end Looker projects with minimal supervision. Experience with source control systems, preferably Git. Excellent communication skills and a strong sense of ownership and accountability. Comfortable working in a fast-paced, collaborative environment. Nice-to-Have Skills & Qualifications: Familiarity with batch processing, stream processing, and real-time analytics. Familiarity with MySQL queries and syntax. Ability to understand and write Java code. We value diverse experiences and backgrounds, so we encourage you to apply if you meet some but not all of the listed qualifications. Total Rewards at myKaarma: At myKaarma, we offer a comprehensive Total Rewards package that extends beyond the base salary. Our commitment to competitive compensation includes bonuses and benefits that support both personal and professional well-being: Flexible Work Environment: We embrace a high-performance, flexible structure that values freedom and responsibility. Our Highly Aligned, Loosely Coupled model empowers teams to innovate and continuously improve using data-driven insights. Health and Wellness: Comprehensive medical, life, and disability benefits. Time Off: Generous vacation time to recharge and balance life outside work. In-Office Perks: Work in an agile office space with perks like ping pong and foosball to unwind and connect, plus unlimited lunch, snacks, and refreshments onsite. Our Commitment to Inclusion: At myKaarma, diverse perspectives drive innovation and success. We are committed to creating a safe, welcoming, and inclusive workplace where every employee feels valued and empowered and can do meaningful work. Our mission to deliver exceptional solutions to our clients is strengthened by the unique contributions and perspectives of our team members from all backgrounds. As an equal opportunity employer, myKaarma prohibits any form of unlawful discrimination or harassment based on race, color, religion, gender, gender identity, gender expression, sexual orientation, national origin, family or parental status, disability, age, veteran status, or any other status protected by applicable laws in the regions where we operate. We adhere to all EEOC regulations and actively promote an environment that celebrates and supports diversity, equity, and inclusion for all. Applicants with disabilities may be entitled to reasonable accommodation under the terms of the Americans with Disabilities Act and certain state or local laws. Reasonable accommodation is a change in the way things are normally done which will ensure an equal employment opportunity without imposing undue hardship on myKaarma. Please let us know if you require reasonable accommodations during the application or interview process.
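As a hedged illustration of the "managing BigQuery costs" requirement in the posting above: the google-cloud-bigquery client can cap the bytes a single query may scan, so a runaway query is rejected up front instead of billing for a full table scan. The project, dataset, and table names below are hypothetical, not taken from the posting.

```python
# Hedged sketch: run a BigQuery query with a hard byte-scan cap.
# Project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

job_config = bigquery.QueryJobConfig(
    maximum_bytes_billed=10 * 1024**3,  # abort if the query would scan >10 GiB
    use_query_cache=True,               # serve repeat queries from cache for free
)

sql = """
    SELECT dealer_id, COUNT(*) AS service_visits
    FROM `my_project.analytics.service_events`  -- hypothetical table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY dealer_id
"""

for row in client.query(sql, job_config=job_config).result():
    print(row.dealer_id, row.service_visits)
```

Partitioning and clustering the underlying tables shrink the scanned bytes further; the cap is simply the last line of defense.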
Posted 1 month ago
8.0 - 12.0 years
9 - 14 Lacs
Pune
Work from Office
Design, develop, and maintain high-performance data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Composer (Airflow). Design and develop Looker dashboards with appropriate security provisioning and drill-down capabilities. Ensure data security, lineage, quality, and compliance across GCP data ecosystems through IAM, audit logging, data encryption, and schema management. Monitor, troubleshoot, and optimize pipeline and warehouse performance using GCP-native tools such as Cloud Monitoring, Cloud Logging, and BigQuery Optimizer. Write SQL queries, dbt models, or Dataflow pipelines to transform raw data into analytics-ready datasets. Develop and optimize SQL queries and data transformation scripts for data warehousing and reporting purposes. Lead proof-of-concepts (POCs) and best-practice implementations for modern data architecture, including data lakes and cloud-native data warehouses. Ensure data quality, governance, and security best practices across all layers of the data stack. Write clean, maintainable, and efficient code following best practices. Requirements Data Engineering: 8–12 years of experience in data engineering, with at least 3–5 years of hands-on experience specifically in Google Cloud Platform (GCP) and BI tools like Looker. BigQuery (data modeling, optimization, security); advanced SQL proficiency with complex data transformation, windowing functions, and analytical querying. Ability to design and develop modular, maintainable SQL models using dbt best practices. Basic to intermediate knowledge of Python for scripting and automation. Exposure to ETL and batch scheduling/orchestration solutions. Strong understanding of data architecture patterns: data lakes, cloud-native data warehouses, event-driven architectures. Experience with version control systems like Git and branching strategies. Looker: Hands-on experience in Looker covering design, development, configuration/setup, dashboarding, and reporting techniques. Experience building and maintaining LookML models, Explores, PDTs, and semantic layers. Understanding of security provisioning and access controls, performance tuning of dashboards/reports over large datasets, and building drill-down capabilities. Proven ability to design scalable, user-friendly dashboards and self-service analytics environments. Expertise in optimizing Looker performance: materialized views, query tuning, aggregate tables. Strong command of Row-Level Security, Access Filters, and permission sets in Looker to support enterprise-grade data governance. General: Experience with Agile delivery methodologies (e.g., Scrum, Kanban). Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment. Conduct regular workshops, demos, and stakeholder reviews to showcase data solutions and capture feedback. Excellent communication and collaboration skills. Collaborate with development teams to streamline the software delivery process and improve system reliability. Mentor and upskill junior engineers and analysts on GCP tools, Looker modeling best practices, and advanced visualization techniques. Ability to translate business objectives into data solutions with a focus on delivering measurable business value. Flexible to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.
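A brief sketch, under stated assumptions, of the windowing-function transformation work this posting names: deduplicating a raw table to the latest record per key, a common staging step whether expressed as a dbt model or run directly against BigQuery. All table names are hypothetical.

```python
# Hedged sketch: ROW_NUMBER() window dedup, materialized to a clean table.
# Source and destination table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT * EXCEPT (rn)
    FROM (
      SELECT
        *,
        ROW_NUMBER() OVER (
          PARTITION BY order_id          -- one survivor per business key
          ORDER BY updated_at DESC       -- keep the most recent version
        ) AS rn
      FROM `my_project.raw.orders`       -- hypothetical source
    )
    WHERE rn = 1
"""

destination = bigquery.TableReference.from_string("my_project.analytics.orders_latest")
job_config = bigquery.QueryJobConfig(
    destination=destination,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full refresh
)
client.query(sql, job_config=job_config).result()
```

In dbt, the same SELECT would live in a model file with materialization handled by dbt itself; the client-library form is used here only to keep the example self-contained.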
Posted 1 month ago
8.0 - 10.0 years
10 - 12 Lacs
Bengaluru
Work from Office
This requisition covers two otherwise identical profiles: one with 8 - 10 years and one with 10 - 12 years of progressive experience in building and implementing model-driven, enterprise-level business solutions and applications. 1+ years of working Pega experience is required, with Pega Decisioning and Pega Marketing skills including making model changes. Experience in implementing Pega Marketing and a strong understanding of Pega methodologies. Excellent object-oriented analysis and design skills and system integration skills. Experience working on various rules and features such as Flows, Activities, User Interface, Flow Actions, Agents, SLA, Correspondence, Security, Reports, Listeners (File, MQ), Connectors, etc. Experience working collaboratively with business stakeholders, business analysts, data governance, analytics, and technical leads to ensure the right solution is created for the business need. Understanding of Predictive and Adaptive Analytics and the capabilities in Pega around Artificial Intelligence. Hands-on experience implementing Pega integration services using REST, SOAP, etc. Knowledge of industry-standard project delivery frameworks, including Agile, Waterfall, and Scrum. Pega Certified Decisioning Consultant (PCDC) and Pega Certified Marketing Consultant (PCMC) certifications. Develops and demonstrates an advanced knowledge of the PRPC Architecture and all PRPC design and implementation features. Experience and desire to work in a global offshore/onshore model. Approx. vendor billing rate (INR/day): 14,000 for the 8 - 10 years profile and 15,000 for the 10 - 12 years profile, negotiable based on the proficiency of the candidate. Work location (both profiles): Hyderabad. Background check (both profiles): pre-onboarding. Mandatory skills (both profiles): Pega CDH.
Posted 1 month ago