Jobs
Interviews

3321 Redshift Jobs - Page 19

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

18.0 years

5 - 10 Lacs

Gurgaon

On-site

Senior Assistant Vice President EXL/SAVP/1393076, Services, Gurgaon
Posted On: 14 Jul 2025 | End Date: 28 Aug 2025 | Required Experience: 18 - 28 years

Basic Section: Number of Positions: 1 | Band: D2 (Senior Assistant Vice President) | Cost Code: D014685 | Campus/Non-Campus: Non-Campus | Employment Type: Permanent | Requisition Type: New | Max CTC: 2,500,000 - 3,500,000 | Complexity Level: Not Applicable | Work Type: Hybrid (working partly from home and partly from office)

Organisational Group: Analytics | Sub Group: Analytics - UK & Europe | Organization: Services | LOB: Services | SBU: Analytics | Country: India | City: Gurgaon | Center: EXL - Gurgaon Center 38

Skills: SQL, Python | Minimum Qualification: BCOM | Certification: No data available

Job Description

Position Summary: We are seeking a highly experienced and visionary Principal Data Engineer with over 20 years of industry experience to lead the design, development, and optimization of our enterprise-scale data infrastructure. This role is ideal for a seasoned data professional who thrives at the intersection of technology strategy, architecture, and hands-on engineering. You will drive innovation, mentor engineering teams, and collaborate with cross-functional stakeholders to enable data-driven decision-making across the organization.

Key Responsibilities:

Architecture & Strategy: Define and own the data engineering roadmap, architecture, and data platform strategy. Evaluate and implement scalable, secure, and cost-effective data technologies. Lead the transition to modern data platforms (e.g., cloud, data lakehouse, streaming architectures).

Engineering Leadership: Lead end-to-end design and development of robust ETL/ELT pipelines, data lakes, and real-time streaming systems. Provide hands-on guidance on data modeling, pipeline optimization, and system integration. Oversee deployment and automation of data pipelines using CI/CD, orchestration tools (e.g., Airflow), and infrastructure-as-code.

Data Governance & Quality: Define best practices for data governance, lineage, privacy, and security. Implement data quality frameworks and monitoring strategies.

Cross-Functional Collaboration: Work closely with data scientists, analysts, product owners, and engineering leaders to define data requirements. Serve as a thought leader and technical advisor to senior management and stakeholders.

Mentorship & Team Development: Mentor and coach senior engineers and technical leads. Lead architectural and code reviews, and foster a culture of technical excellence.

Required Qualifications: 20+ years of hands-on experience in data engineering, software engineering, or data architecture. Proven track record of building and scaling enterprise data platforms and systems. Deep expertise in SQL, Python/Scala/Java, Spark, and distributed data systems. Strong experience with cloud platforms (AWS, GCP, or Azure), especially cloud-native data tools (e.g., Redshift, BigQuery, Snowflake, Databricks). Experience with real-time streaming technologies (Kafka, Flink, Kinesis). Solid understanding of data modeling (dimensional, normalized, NoSQL). Familiarity with DevOps and MLOps practices in a data environment. Excellent communication skills and the ability to influence at the executive level.

Preferred Qualifications: Master’s or Ph.D. in Computer Science, Data Engineering, or a related field. Experience in regulated industries (e.g., finance, healthcare). Prior experience in global organizations and managing geographically distributed teams.

Workflow Type: Back Office
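For context on the orchestration work this posting describes, a minimal Apache Airflow DAG of the kind such pipelines are built from might look like the sketch below. This is an illustration only, not part of the posting: the DAG name and task bodies are hypothetical placeholders.

```python
# Hedged sketch of a minimal Airflow DAG; the pipeline name and task bodies
# are hypothetical placeholders, not EXL systems.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    print("extracting...")


def load():
    # Placeholder: write transformed records to the warehouse.
    print("loading...")


with DAG(
    dag_id="daily_sales_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```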

Posted 1 week ago

Apply

3.0 years

6 - 9 Lacs

Ahmedabad, Gujarat

Remote

Job Title: Power BI Developer
Location: Ahmedabad, Gujarat (preferred)
Experience Required: 3+ years
Employment Type: Full-time (immediate joiners preferred)

About IGNEK: At IGNEK, we specialize in remote staff augmentation and custom development solutions, offering expert teams in technologies like Liferay, AEM, Java, React, and Node.js. We help global clients meet their project goals efficiently by delivering innovative and scalable digital solutions.

Job Summary: We’re looking for an experienced Power BI Developer to join our analytics team at IGNEK. The ideal candidate will be responsible for transforming complex data into visually impactful dashboards and providing actionable insights for data-driven decision-making.

Key Responsibilities: Develop, maintain, and optimize interactive Power BI dashboards and reports. Write complex SQL queries to extract, clean, and join data from multiple sources, including data warehouses and APIs. Understand business requirements and collaborate with cross-functional teams to deliver scalable BI solutions. Ensure data accuracy and integrity across all reporting outputs. Create robust data models and DAX measures within Power BI. Work with data engineers and analysts to streamline data pipelines. Maintain documentation for all dashboards, definitions, and processes. Optionally, use Python for automation, data manipulation, or API integration.

Requirements: 3+ years of experience in BI or analytics roles. Strong expertise in Power BI, including DAX, Power Query, and data modeling. Advanced SQL skills and experience with relational databases or cloud data warehouses (e.g., SQL Server, Redshift, Snowflake). Understanding of ETL processes and data quality management. Ability to communicate data-driven insights effectively to stakeholders. Bonus: working knowledge of Python for scripting or automation.

Preferred Qualifications: Hands-on experience with Power BI Service, Power BI Gateway, or Azure. Exposure to agile methodologies and collaborative development teams. Familiarity with key business metrics across functions like sales, operations, or finance.

How to Apply: Please send your resume and a cover letter detailing your experience.

Job Type: Full-time
Pay: ₹600,000.00 - ₹900,000.00 per year
Benefits: Flexible schedule, leave encashment, Provident Fund, work from home
Work Location: In person
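As a hedged illustration of the optional Python responsibility above (automation and data manipulation feeding Power BI), a minimal extract-and-clean step might look as follows. The connection string, schema, and file names are invented placeholders, not IGNEK specifics; Redshift speaks the PostgreSQL wire protocol, which is why a psycopg2-backed SQLAlchemy engine works here.

```python
# Hedged sketch only: the connection string, schema, and table are invented
# placeholders. Redshift is PostgreSQL-compatible, so psycopg2 can connect.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "postgresql+psycopg2://etl_user:secret@example-cluster:5439/analytics"
)

df = pd.read_sql(
    """
    SELECT order_id, customer_id, order_date, amount
    FROM sales.orders
    WHERE order_date >= CURRENT_DATE - INTERVAL '90 days'
    """,
    engine,
)

# Basic cleaning before the data feeds a Power BI dataset.
df = df.drop_duplicates(subset="order_id")
df["order_date"] = pd.to_datetime(df["order_date"])
df.to_csv("orders_clean.csv", index=False)  # consumed by Power BI as a source
```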

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics – Senior – AWS

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. In this way, we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity: We’re looking for Senior Cloud Experts with design experience in big data cloud implementations.

Your Key Responsibilities: AWS experience with Kafka, Flume, and the AWS tool stack (such as Redshift and Kinesis) is preferred. Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc. Experience in PySpark/Spark/Scala. Experience using software version control tools (Git, Jenkins, Apache Subversion). AWS certifications or other related professional technical certifications. Experience with cloud or on-premise middleware and other enterprise integration technologies. Experience writing MapReduce and/or Spark jobs. Demonstrated strength in architecting data warehouse solutions and integrating technical components. Good analytical skills with excellent knowledge of SQL. 4+ years of work experience with very large data warehousing environments. 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools. 4+ years of experience with data modelling concepts. 3+ years of Python and/or Java development experience. 3+ years of experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive). Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills, both written and verbal, formal and informal. Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment. Adaptable to new technologies and standards.

Skills and Attributes for Success: Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Exposure to tools like Tableau, Alteryx, etc.

To qualify for the role, you must have: BE/BTech/MCA/MBA; a minimum of 4 years of hands-on experience in one or more key areas; and a minimum of 7 years of industry experience.

What We Look For: People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers: At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching, and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
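As a hedged illustration of the PySpark-on-AWS skills this posting lists (S3, Spark, Redshift-bound curated data), a minimal aggregation job might look like the sketch below. Bucket names, fields, and paths are hypothetical, not EY infrastructure.

```python
# Hedged sketch; bucket names, fields, and paths are hypothetical, and the
# output is Parquet staged for a downstream Redshift COPY or Glue crawler.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3_daily_aggregate").getOrCreate()

raw = spark.read.json("s3://example-bucket/events/2025/07/")  # placeholder path

# Derive a date column and count events per day and type.
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .count()
)

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)
```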

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s: One of the world’s largest employers, with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Overview: We are seeking an exceptional Senior Data Product Engineering SRE to lead the development and operational excellence of data products that deliver insights and drive critical business decisions. This role uniquely combines a product engineering mindset, data platform expertise, and site reliability engineering practices to build, scale, and maintain customer-facing data products and internal analytics platforms. You will own the end-to-end reliability of data products, from ingestion to user experience, ensuring they deliver business value at scale.

Key Responsibilities: Define and execute product reliability strategy for customer-facing analytics, dashboards, and data APIs. Partner with Product Management to translate business requirements into scalable, reliable data product architectures. Establish product metrics, KPIs, and success criteria for data products serving external and internal customers. Lead cross-functional initiatives to improve data product adoption, engagement, and customer satisfaction. Build and maintain data products, including real-time dashboards, analytics APIs, and embedded analytics solutions. Design user-centric data experiences with a focus on performance, reliability, and scalability. Implement A/B testing frameworks and experimentation platforms for data product optimization. Establish and maintain SLAs for data product availability, latency, and accuracy. Implement comprehensive monitoring for user-facing data products, covering both frontend and backend metrics. Build automated testing frameworks for data product functionality, performance, and data quality. Lead incident response for data product issues affecting customer experience. Monitor and optimize data product performance from the end-user perspective (page load times, query response times). Implement user feedback collection and product analytics to drive continuous improvement. Collaborate closely with Product, Engineering, Data Science, and Customer Success teams. Establish engineering practices for data product development, including code reviews and deployment processes. Influence the product roadmap with technical feasibility and reliability considerations. Champion data product best practices across the organization. Balance innovation with operational stability and customer commitments. Collaborate with Product Management on feature prioritization and requirements.

Required Qualifications: 8+ years of experience in product engineering, data engineering, or SRE roles. 5+ years building customer-facing data products, analytics platforms, or business intelligence solutions. 3+ years in senior or lead positions with direct team management experience. Proven track record of shipping data products that drive measurable business impact. Experience with the product development lifecycle, from ideation to launch and optimization. Expert-level experience building user-facing applications and APIs. Deep expertise with analytics databases (Redshift, BigQuery, ClickHouse), real-time processing (Kafka, Spark Streaming), and BI tools (Tableau, Looker, Power BI). Proficiency with React, Vue.js, or Angular for building data visualization interfaces. Advanced skills in Python, Java, or Node.js for API development and data services. Expert-level SQL skills and experience optimizing queries for interactive analytics workloads. Extensive experience with AWS or GCP data and compute services. Strong product sense, with the ability to balance technical constraints against user needs. Experience with product analytics tools (Amplitude, Mixpanel, Google Analytics) and metrics-driven development. Ability to understand business requirements and translate them into technical solutions. Strong technical writing skills for customer-facing documentation and API specifications. Experience with agile product development methodologies (Scrum, Kanban, Design Thinking). Track record of building and scaling product engineering teams.

Work location: Hyderabad, India. Work pattern: Full-time role. Work mode: Hybrid.
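To make the latency-SLA responsibility above concrete, a minimal check might look like the sketch below. The endpoint and the two-second target are assumptions; a real SRE setup would emit the measurement to a metrics and alerting system rather than print it.

```python
# Hedged sketch: the endpoint and the 2-second target are assumptions; a real
# SRE setup would emit this to a metrics/alerting system instead of printing.
import time
import urllib.request

SLA_SECONDS = 2.0  # hypothetical latency target


def check_dashboard_latency(url: str) -> float:
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()  # force the full response to be fetched
    elapsed = time.monotonic() - start
    if elapsed > SLA_SECONDS:
        print(f"SLA breach: {url} took {elapsed:.2f}s (target {SLA_SECONDS}s)")
    return elapsed


check_dashboard_latency("https://analytics.example.com/api/dashboard/health")
```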

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics – Senior – AWS

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. In this way, we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity: We’re looking for Senior Cloud Experts with design experience in big data cloud implementations.

Your Key Responsibilities: AWS experience with Kafka, Flume, and the AWS tool stack (such as Redshift and Kinesis) is preferred. Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc. Experience in PySpark/Spark/Scala. Experience using software version control tools (Git, Jenkins, Apache Subversion). AWS certifications or other related professional technical certifications. Experience with cloud or on-premise middleware and other enterprise integration technologies. Experience writing MapReduce and/or Spark jobs. Demonstrated strength in architecting data warehouse solutions and integrating technical components. Good analytical skills with excellent knowledge of SQL. 4+ years of work experience with very large data warehousing environments. 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools. 4+ years of experience with data modelling concepts. 3+ years of Python and/or Java development experience. 3+ years of experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive). Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills, both written and verbal, formal and informal. Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment. Adaptable to new technologies and standards.

Skills and Attributes for Success: Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Exposure to tools like Tableau, Alteryx, etc.

To qualify for the role, you must have: BE/BTech/MCA/MBA; a minimum of 4 years of hands-on experience in one or more key areas; and a minimum of 7 years of industry experience.

What We Look For: People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers: At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching, and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics – Senior – AWS

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. In this way, we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity: We’re looking for Senior Cloud Experts with design experience in big data cloud implementations.

Your Key Responsibilities: AWS experience with Kafka, Flume, and the AWS tool stack (such as Redshift and Kinesis) is preferred. Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc. Experience in PySpark/Spark/Scala. Experience using software version control tools (Git, Jenkins, Apache Subversion). AWS certifications or other related professional technical certifications. Experience with cloud or on-premise middleware and other enterprise integration technologies. Experience writing MapReduce and/or Spark jobs. Demonstrated strength in architecting data warehouse solutions and integrating technical components. Good analytical skills with excellent knowledge of SQL. 4+ years of work experience with very large data warehousing environments. 7+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools. 4+ years of experience with data modelling concepts. 3+ years of Python and/or Java development experience. 3+ years of experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive). Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills, both written and verbal, formal and informal. Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment. Adaptable to new technologies and standards.

Skills and Attributes for Success: Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Exposure to tools like Tableau, Alteryx, etc.

To qualify for the role, you must have: BE/BTech/MCA/MBA; a minimum of 4 years of hands-on experience in one or more key areas; and a minimum of 7 years of industry experience.

What We Look For: People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers: At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching, and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Let’s be #BrilliantTogether

ISS STOXX is actively hiring a Senior Software Engineer in C#/.NET and AWS to join our Liquid Metrix team in our Mumbai office (Goregaon East).

Overview: Purpose, mastery, and autonomy! We provide all three at ISS to develop state-of-the-art data processing and analytical platforms. We're looking for a Senior Software Engineer in C#/.NET and AWS to join our LQM development team to architect, design, develop, and release optimal solutions to business requirements.

Shift hours: 11 AM to 8 PM IST

Responsibilities: Play an active role in a global agile team developing complex applications to consume, process, and transform large data sets. Take responsibility for all phases of the software development process - from requirements analysis, effort estimation, technical design, implementation, and testing to defect management and support. Study current systems and redesign them to add features and improve performance, stability, and scale.

Qualifications: Bachelor's or Master's degree in computer science, information technology, or related fields. 6 to 12 years of programming experience with C#, .NET 4.8+, and .NET Core 6+. Proficiency with SQL Server: schema design, query optimization, data imports, and partitioning. 2+ years of experience with terabyte-scale big data, using data-driven, distributed, and parallel processing. Knowledge of AWS Redshift and experience with S3, Kinesis Firehose, EC2, or Elasticsearch. Knowledge of algorithms and heuristics for processing large data sets. Excellent problem-solving and analytical skills. Experience architecting complex systems with pragmatic use of design patterns and SOLID principles. Ability to work independently to proactively meet the goals of a distributed global team. Experience with version control using Git or Maven. Experience with test-driven and behavior-driven development.

Good to Have: Proficiency in UI development using HTML/JS/React/Vue.js/Angular. Experience with WPF and LINQ. Performance profiling. CI/CD pipelines such as Azure DevOps. An interest in BI visualization tools such as Tableau, Sisense, and Power BI. Ability to quickly create prototypes. Familiarity with Python and a desire to learn more in order to enhance existing systems. Knowledge of financial markets and securities trading.

#MIDSENIOR #LQM

What You Can Expect From Us: At ISS STOXX, our people are our driving force. We are committed to building a culture that values diverse skills, perspectives, and experiences. We hire the best talent in our industry and empower them with the resources, support, and opportunities to grow, professionally and personally. Together, we foster an environment that fuels creativity, drives innovation, and shapes our future success. Let’s empower, collaborate, and inspire. Let’s be #BrilliantTogether.

About ISS STOXX: ISS STOXX GmbH is a leading provider of research and technology solutions for the financial market. Established in 1985, we offer top-notch benchmark and custom indices globally, helping clients identify investment opportunities and manage portfolio risks. Our services cover corporate governance, sustainability, cyber risk, and fund intelligence. Majority-owned by Deutsche Börse Group, ISS STOXX has over 3,400 professionals in 33 locations worldwide, serving around 6,400 clients, including institutional investors and companies focused on ESG, cyber, and governance risk. Clients trust our expertise to make informed decisions for their stakeholders' benefit.

Specifically, ISS LiquidMetrix provides a wide range of offerings, including Transaction Cost Analysis (TCA), execution quality, market abuse, and pre-trade analysis services across every public order and trade executed on European venues. Clients include sell sides, buy sides, exchanges, and regulators that require actionable analysis, reports, compliance tools, and global coverage.

Visit our website: https://www.issgovernance.com
View additional open roles: https://www.issgovernance.com/join-the-iss-team/

Institutional Shareholder Services (“ISS”) is committed to fostering, cultivating, and preserving a culture of diversity and inclusion. It is our policy to prohibit discrimination or harassment against any applicant or employee on the basis of race, color, ethnicity, creed, religion, sex, age, height, weight, citizenship status, national origin, social origin, sexual orientation, gender identity or gender expression, pregnancy status, marital status, familial status, mental or physical disability, veteran status, military service or status, genetic information, or any other characteristic protected by law (referred to as “protected status”). All activities, including but not limited to recruiting and hiring, recruitment advertising, promotions, performance appraisals, training, job assignments, compensation, demotions, transfers, terminations (including layoffs), benefits, and other terms, conditions, and privileges of employment, are and will be administered on a non-discriminatory basis, consistent with all applicable federal, state, and local requirements.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Job Description: Data Analyst – Professional Services

Why SailPoint? Love what you do. And love where you do it! Smart people, fun culture, innovative work, beautiful offices: that is what people say about SailPoint. We are known as the company where everyone wants to work, and we have the awards to prove it, most recently ranking #17 on Glassdoor’s Best Places to Work in 2025. If you are passionate about outsmarting cybercriminals and want to work at a company where you can truly have an impact alongside other good people, we want you to join our team. SailPoint empowers the largest, most complex organizations by putting identity at the center of security and IT. Our 2,000+ customers include global financial institutions, government entities, pharmaceutical organizations, and more.

Who You Are: You are an eager and highly motivated professional looking to grow your skills by building working relationships and quickly learning from peer interaction at all levels within an organization. You have worked indirectly and directly with data analytics teams and internal stakeholders to successfully complete analytics deliverables and coordinate meetings focused on performance metrics. You are an excellent communicator, with experience in planning for and achieving challenging deadlines, assessing and mitigating project risks, and proactively building elegant solutions that deliver results regardless of the initial starting point or data maturity.

What You Will Do: As a Data Analyst within the Professional Services team, you will play a leading role in establishing and maintaining successful relationships across the business, focused on making the process of building and reporting analytics easier in support of better business decisions. This role requires successful collaboration across many teams to understand business needs and to create and maintain solutions, while continually looking for ways to improve those solutions. While new data solutions will be created in this position from the ground up, the end goal is to mature these solutions so that they are ready for a successful transition to the IT team for long-term support.

Responsibilities and Qualifications: Must be willing to work four hours overlapping the US time zone, as you will work closely with the US team.

Business Collaboration: Partner with cross-functional stakeholders to gather requirements, understand data needs, and translate complex business questions into actionable solutions, modifying existing solutions and/or creating new ones.

Team Support: Collaborate with the Partner Delivery Enablement, Professional Services Enablement, and SailPoint Professional Services Operations teams to address their analytical needs.

Dashboard Development: Assume ownership of existing Tableau dashboards and Smartsheets to ensure their ongoing maintenance. Design, build, and maintain new interactive dashboards and reports using Tableau, Smartsheet, and Salesforce to support strategic decision-making.

Data Pipeline Support: Collaborate with data engineers to build robust data pipeline concepts that enable seamless visualization and reporting. Initially these pipelines are manual, with the role responsible for continually pushing toward full automation and support from IT.

Data Quality Assurance: Ensure the accuracy, consistency, and integrity of data across various platforms and processes.

Documentation: Develop and maintain clear, detailed documentation for data workflows, analyses, and dashboard configurations.

Performance Monitoring: Monitor and fine-tune the performance of Tableau dashboards and underlying data sources to ensure optimal efficiency.

Cross-Functional Training: Educate team members and stakeholders on Tableau functionality, data analysis techniques, and best practices to foster data literacy and independence.

Self-Education: Stay informed on the latest trends, tools, and best practices in data visualization and analytics to continually improve processes and outputs.

To excel in this role, you possess the following referenceable qualifications: A strong sense of responsibility and proactive, positive initiative to directly own project outcomes, drawing on wisdom gained through previous referenceable project delivery experience. Technical curiosity to quickly understand and proactively address needs in an elegant way, with an eye toward future maintainability, always working to advance simplified solutions. A strong inherent attention to detail, ensuring data accuracy is maintained while also seeing the larger business problems being solved. Proven hands-on expertise, with the confidence to demonstrate skills in Tableau, basic SQL (Redshift, BigQuery), Smartsheet, Excel, and other related tools. A history of successful project delivery and technical internal or external consultation work. A successful record of accomplishment as a data steward, working directly with internal teams and building strong relationships that solidify informed solutions created through cross-functional team collaboration. Exceptional interpersonal communication skills, with the ability to fluidly articulate complex technical solutions to non-technical audiences. Proficient problem-solving abilities to address data challenges that range in maturity from manually updated data acquisition to fully automated data pipelines and Tableau and/or Smartsheet reporting. You must qualify as a trusted representative of SailPoint, with a mature customer-centric attitude and unwavering commitment to company values. This is not a “nice to have” but a required trait, necessary for building the trust needed for collaborative relationships with the internal and external teams essential to success in this role.

Education & Certification: Bachelor's degree or equivalent; Tableau and/or Office certifications preferred.

Travel: SailPoint is a remote-first company; however, some travel may be required in your assigned region to meet with team members or attend company-sponsored events, estimated at 10-15%.

Onboarding Ramp:

30 Days: Complete SailPoint bootcamp, learning about the organization's mission, values, and culture. Complete required security and compliance training. Understand the structure of the Professional Services department and key stakeholders. Gain an understanding of the organization’s data architecture, tools, and processes. Review and familiarize yourself with existing dashboards, reports, and data workflows. Meet with key stakeholders to understand ongoing projects and business objectives. Shadow team members to learn about operational challenges and improvement opportunities.

60 Days: Take ownership of existing dashboards, ensuring they meet stakeholder expectations. Begin creating and refining new dashboards and reports based on identified needs. Collaborate with data engineers to evaluate and enhance data pipelines, with the goal of transferring work to fully automated workflows managed by IT. Start identifying areas for automation in data workflows and reporting. Propose process improvements and initiate small-scale enhancements.

90 Days: Lead the design and implementation of advanced dashboards to provide strategic insights. Develop automated solutions for recurring data transformation and reporting tasks. Conduct training sessions for team members to improve Tableau proficiency and data analysis skills. Present a plan for long-term optimization of data visualization and analytics processes. Actively participate in team projects, driving measurable outcomes and improvements. Develop a plan for continuous improvement beyond the initial 90 days.

SailPoint is an equal opportunity employer, and we welcome everyone to our team who is committed to living our four core values. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other category protected by applicable law. Alternative methods of applying for employment are available to individuals unable to submit an application through this site because of a disability. Contact hr@sailpoint.com or mail 11120 Four Points Dr, Suite 100, Austin, TX 78726, to discuss reasonable accommodations.

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

On-site

At The Institute of Clever Stuff (ICS), we don’t just solve problems, we revolutionise results. Our mission is to empower a new generation of Future Makers today, to revolutionise results and create a better tomorrow. Our vision is to pioneer a better future together. We are a consulting firm with a difference, powered by AI, driving world-leading results from data and change. We partner with visionary organisations to solve their toughest challenges, drive transformation, and deliver high-impact results. We combine a diverse network of data professionals, designers, software developers, and rebel consultants with our virtual AI consultant, fortu.ai, pairing human ingenuity with fortu.ai’s AI-powered intelligence to deliver smarter, faster, and more effective results.

Meet fortu.ai: Used by some of the world’s leading organisations as a business question pipeline generator, ROI tracker, and innovation engine all in one. Trained on 400+ accelerators and 8 years of solving complex problems with global organisations. With fortu.ai, we’re disrupting a $300+ billion industry, turning traditional consulting on its head.

Key Responsibilities:

Complete data modelling tasks: Initiate and manage gap analysis and source-to-target mapping exercises. Gain a comprehensive understanding of the EA extract. Map the SAP source used in EA extracts to the AWS Transform Zone, AWS Conform Zone, and AWS Enrich Zone. Develop a matrix view of all Excel/Tableau reports to identify any missing fields or tables from SAP in the Transform Zone. Engage with SMEs to finalize the Data Model (DM). Obtain email confirmation and approval for the finalized DM. Perform data modelling using ER Studio and STTM. Generate DDL scripts for data engineers to facilitate implementation.

Complete data engineering tasks: Set up infrastructure for pipelines, including Glue jobs, crawlers, scheduling, Step Functions, etc. Build, deploy, test, and run pipelines on demand in lower environments. Verify data integrity: no duplicates, all columns present in the final table, etc. Write unit tests for methods used in the pipeline, using standard testing tools. Apply code formatting and linting. Collaborate with other modelling engineers to align on the correct approach. Update existing pipelines for CZ tables (SDLF and OF) where necessary with new columns if they are required for EZ tables. Raise DDP requests to register databases and tables and to load data into the raw zone. Create comprehensive documentation, ensuring each task is accompanied by detailed notes specific to its functional area for clear tracking and reference. Analyse and manage bugs and change requests raised by the business/SMEs. Collaborate with data analysts and virtual engineers (VE) to refine and enhance semantic modelling in Power BI. Plan out work using Microsoft Azure DevOps (ADO), ensuring dependencies, status, and effort are correctly reflected.

Required Skills: Proven experience in data modelling and data pipeline development. Proficiency with tools like ER Studio, STTM, AWS Glue, Redshift and Athena, and Power BI. Strong SQL and experience generating DDL scripts. Experience working in SAP data environments. Experience in any of these domain areas is highly desirable: logistics, supply planning, exports, and IFOT. Familiarity with cloud platforms, particularly AWS. Hands-on experience with DevOps and agile methodologies (e.g., Azure ADO). Strong communication and documentation skills. Ability to work collaboratively with cross-functional teams.
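As a hedged illustration of the Glue-based pipeline operations this role describes, triggering a job and polling it with boto3 might look like the sketch below. The job name and region are invented placeholders, not ICS infrastructure; start_job_run and get_job_run are standard boto3 Glue client methods.

```python
# Hedged sketch: the job name and region are invented placeholders.
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-1")

run_id = glue.start_job_run(JobName="cz_orders_pipeline")["JobRunId"]

# Poll until the run reaches a terminal state.
while True:
    state = glue.get_job_run(JobName="cz_orders_pipeline", RunId=run_id)[
        "JobRun"
    ]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue run {run_id} finished with state {state}")
```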

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

On-site

Experience: 8+ years in Data Engineering or related roles, with a focus on cloud technologies and big data solutions.

Technical Expertise: Deep knowledge of cloud platforms such as Azure, AWS, and GCP. Hands-on experience with big data technologies like Apache Spark, Hadoop, Kafka, and Flink. Solid understanding of ETL frameworks (dbt, Apache Airflow, AWS Glue, etc.). Expertise in containerization with Docker and Kubernetes (AKS, EKS, GKE). Proven experience in data warehousing and modeling (Snowflake, Redshift, BigQuery, Synapse). Strong background in data security and governance (IAM, RBAC, encryption, data lineage). Experience with CI/CD pipelines using Terraform, GitHub Actions, and Azure DevOps.

Programming & Scripting Skills: Proficiency in Python, Bash, or PowerShell for automation tasks.

Cloud Architecture: Experience designing hybrid/multi-cloud architectures to ensure high availability and fault tolerance across Azure, AWS, and GCP.

Leadership & Mentorship: Proven ability to lead teams, mentor junior engineers, and collaborate effectively with cross-functional teams.

Preferred Skills: Familiarity with machine learning pipelines and predictive analytics. Experience with serverless computing (AWS Lambda, Azure Functions, Google Cloud Functions).

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role and Responsibilities: Analyze and solve problems at their root, stepping back to understand the broader context. Establish, meet, and monitor SLAs for support issues in conjunction with the rest of the teams. Interface with customers, understand their requirements, and deliver complete application and data solutions. Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment. Address and effectively manage sensitive issues and escalations from business teams. Build and maintain effective internal relationships, specifically with Engineering, Site Reliability Engineering, Client Support Managers, and Ad Operations, to help identify, report, and resolve issues quickly. Learn and understand a broad range of Samsung Ads platforms and applications, and know when, how, and which to use and which not to use. Ability to work in a fast-paced environment where ambiguity is the norm. Practical problem-solving skills with a focus on unblocking the customer and planning longer-term remediation measures. Organized and process-oriented, with the ability to drive resolutions by working with multiple groups. Strong interpersonal skills with a customer-success orientation. Weekly and monthly reporting. Flexibility with work timings.

Skills and Qualifications: Bachelor's degree in CS/IT or a related field. 4+ years of ad-tech domain experience. 1+ years of experience with SQL and databases. Strong written and verbal communication skills. Customer-centric mindset and a structured approach to troubleshooting and issue resolution. Demonstrated experience solving technical issues with third-party SSPs, MMPs, SSAI vendors, measurement solutions, fraud-prevention companies, etc. Experience with third-party measurement vendor troubleshooting and integration. Experience with support platforms, CRM, or ticketing tools. Ability to present complex technical information clearly and concisely to a variety of audiences, especially non-technical / go-to-market teams. Experience using cloud-based tools like S3, Athena, QuickSight, and Kibana is a plus. Experience as a Business/Product/Data Analyst is a plus. Experience with Redshift and Snowflake is a plus.

This job is provided by Shine.com

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

On-site

We are seeking a Subject Matter Expert (SME) in Big Data & Analytics to lead the design, development, and optimization of data-driven solutions. The ideal candidate will have deep experience in big data technologies, data pipelines, and advanced analytics to drive business intelligence, predictive modeling, and strategic decision-making.

Scope of Work:
● Create a course structure for a certificate program with 4-5 courses (the number of courses to be based on scoping). Each course is likely to have 4-5 modules and a total of 25 lessons, so a 4-course program could have up to 100 lessons.
● Work closely with the client in a rigorous scoping process to create a Job Task Analysis document and content structure for each program and course.
● Create program-level learning objectives for professional certificate courses. The number of objectives will depend on the level - beginner, intermediate, or advanced - and the type of certification course.
● Create course-level learning objectives aligned with the overall certification goal.
● Create module-level learning objectives based on skill development relevant to the TG’s career track.
● Review/create course outlines for each of the courses.
● Review video scripts and confirm the technical accuracy of the content; suggest edits and updates as required. Re-write content and code as needed. Incorporate one round of internal and client feedback.
● Record talking-head videos (onsite, or virtually on Zoom) for each course. Incorporate one round of internal and client feedback.
● Provide relevant recorded demos/screencasts to be integrated into the videos. Check the code and technical accuracy before providing the demos for integration. Incorporate one round of internal and client feedback.
● For AI/software/tool-based courses, suggest relevant freeware. Write/review and test the code.
● Create/review 2-3 readings per lesson (the "why" and "what", 1,500 words maximum per reading). The "how" readings should have detailed instructions/screenshots with short code-block-style practice that learners can do in their local environment.
● Create one Coach item per lesson - review/reflect on key ideas.
● Create/review an ungraded lab per lesson - an in-depth activity to apply skills in the learner's local environment.
● Create/review practice quizzes for each lesson, suggest suitable edits, and confirm technical accuracy. Incorporate one round of internal and client feedback.
● Create module-level and course-level graded assignments that meet ACE recommendation requirements, with two additional variations of each item in an assessment bank for each course.
● Create hands-on activities (3-4 labs or any other client-preferred format) per course. Incorporate one round of internal and client feedback.
● Create a minimum of one 3-5 minute career-resources video per course that showcases career path planning.

Requirements:
● 8+ years of experience in data engineering, big data architecture, or analytics roles.
● Strong expertise in the Hadoop ecosystem (HDFS, Hive, Pig, HBase) and Apache Spark.
● Proficiency in data integration tools and frameworks like Apache NiFi, Airflow, or Talend.
● Experience with cloud platforms (AWS Redshift, Azure Synapse, Google BigQuery) and data lake/storage solutions.
● Hands-on experience with SQL, Python, Scala, or Java.
● Solid understanding of data warehousing, data modeling, and real-time data streaming (e.g., Kafka, Flink).
● Familiarity with BI tools like Power BI, Tableau, or Looker.
● Strong problem-solving and communication skills, with the ability to explain technical concepts to non-technical stakeholders.

Preferred Qualifications:
● Master's or Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.
● Experience working in regulated industries (e.g., finance, healthcare) with a focus on data compliance and privacy.
● Familiarity with AI/ML frameworks like TensorFlow, PyTorch, or MLlib.
● Certifications in cloud platforms or big data technologies (e.g., AWS Big Data Specialty, GCP Data Engineer).

Interested candidates can share their resume at saloni@digifocal.in

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Greetings from Themesoft! We are seeking an AWS Data Engineer for one of our clients. Please go through the job details below and apply if interested.

Location: PAN India
Experience Required: 5-9 years
Notice Period: Immediate joiners
Mandatory Skillsets: RDBMS, AWS services, data integration, data warehousing, data security, scripting and automation, and troubleshooting.

Technical Skills: Proficiency in working with relational databases, particularly PostgreSQL, Microsoft SQL Server, and Oracle. Experience in database schema design, optimization, and management. Strong knowledge of AWS services, including S3, AWS DMS (Database Migration Service), and AWS Redshift Serverless. Experience in setting up and managing data pipelines using AWS DMS. Proficiency in creating and managing data storage solutions using AWS S3. Expertise in data integration techniques and tools. Experience in designing and implementing ETL (Extract, Transform, Load) processes. Ability to perform data mapping, transformation, and data cleansing activities. Experience in setting up and managing data warehouses, particularly AWS Redshift Serverless. Proficiency in creating and managing views in AWS Redshift. Proficiency in scripting languages such as Python or SQL for automating data integration tasks. Experience with automation tools and frameworks.

Share updated profiles with mythili@themesoft.com
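As a hedged sketch of the "create and manage views in AWS Redshift" skill listed above, the snippet below uses redshift_connector, AWS's open-source Python driver for Redshift. The workgroup endpoint, credentials, and schema are placeholders, not client systems.

```python
# Hedged sketch: endpoint, credentials, and schema are placeholders.
import redshift_connector

conn = redshift_connector.connect(
    host="example-wg.123456789012.ap-south-1.redshift-serverless.amazonaws.com",
    database="analytics",
    user="etl_user",
    password="***",
)

cur = conn.cursor()
cur.execute(
    """
    CREATE OR REPLACE VIEW reporting.active_customers AS
    SELECT customer_id, MAX(order_date) AS last_order
    FROM sales.orders
    GROUP BY customer_id
    """
)
conn.commit()  # redshift_connector does not autocommit by default
conn.close()
```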

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About BizAcuity - Who are we? BizAcuity is on a mission to help enterprises get the most out of their data by providing Business Intelligence and Data Analytics services, product development, and consulting services for clients across the globe in various domains/verticals. Established in 2011 by a strong leadership team, and with a team of 200+ engineers, we have made a mark as a world-class service provider that competes with large service providers to win business. BizAcuity has developed and delivered high-class enterprise solutions to many medium-to-large clients using modern, best-in-class technologies in the data engineering and analytics world. Our services include Business Intelligence Consulting, Advanced Analytics, Managed Services, Data Management, Cloud Services, Technology Consulting, Application Development, and Product Engineering. For more information on BizAcuity, log on to https://bizacuity.com/

Job Title: Pre-Sales Manager - Enterprise Solutions (full time, on BizAcuity payroll)
Job Mode: Work from office
Job Location: Bangalore / Hyderabad

Job Summary: The Pre-Sales Manager / Solution Consultant will be a key driver of new business acquisition, leveraging deep technical and business expertise to guide the sales process. This role demands a strong ability to understand complex customer challenges, architect tailored solutions, and persuasively demonstrate our product's value to prospective clients. The ideal candidate will possess a blend of technical acumen, sales aptitude, and a consultative approach to effectively translate client needs into compelling solutions.

Key Responsibilities:

Sales Skills & Customer Engagement: Collaborate strategically with the sales team to identify opportunities, qualify leads, and develop effective sales strategies. Employ a consultative sales approach to build strong relationships with prospects and customers, understand their business challenges, and position our solutions as the optimal choice. Effectively communicate the business value and ROI of our solutions to both technical and non-technical audiences, including C-level executives. Act as a trusted technical advisor and advocate for customers, building credibility and fostering long-term partnerships. Proactively identify and address customer concerns and objections throughout the sales process. Assist the sales team in preparing and delivering compelling responses to RFPs, RFIs, and technical inquiries. Create high-quality technical documentation, proposals, and presentations that support the sales process and effectively communicate our value proposition.

Technical Solutioning: Develop and maintain in-depth technical knowledge of our products, solutions, and underlying technologies. Stay current with industry trends, emerging technologies, and advancements in the data analytics and business intelligence space. Conduct thorough needs assessments to capture detailed customer requirements, business objectives, and technical specifications. Architect and design innovative, tailored solutions that precisely address customer challenges and align with their strategic goals. Develop and deliver compelling, persuasive technical presentations and product demonstrations that showcase the value and differentiation of our solutions. Lead the development of proof-of-concept (POC) projects and customized demonstrations to validate proposed solutions and build customer confidence. Provide expert technical guidance and solution recommendations throughout the sales cycle. Collaborate closely with product management, engineering, and support teams to ensure proposed solutions are feasible and customer requirements are accurately captured for implementation.

Post-Sales Transition: Ensure a smooth and efficient handover of new customers to the implementation and support teams. Maintain customer satisfaction during the onboarding process by providing ongoing support and guidance.

Qualifications:

Education: A Bachelor's degree in Engineering / Computer Science / Information Technology is a plus.

Experience: 8+ years of experience in the IT/software industry is a must, in a pre-sales, solution consulting, or similar customer-facing solutioning role. Experience in the Business Intelligence or Data Analytics domain is a big plus. Experience working on projects within a BI ecosystem - including solution design, implementation, and deployment, with a focus on data warehousing, data analytics, and master data management - is also a plus. Proven track record of driving revenue growth in a pre-sales or technical sales support role.

Technical Skills: Strong understanding of software solutions, cloud technologies, and enterprise IT environments. Proficiency in data analytics concepts, methodologies, and technologies (e.g., SQL, data warehousing, ETL, data modeling, data mining, BI tools, data visualization, reporting, and dashboarding). Experience with Master Data Management (MDM) principles, practices, and tools. Experience with BI platforms (e.g., Tableau, Power BI, Looker) and database systems. Familiarity with cloud-based data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).

Soft Skills: Excellent communication, presentation, and interpersonal skills, with the ability to effectively communicate complex technical concepts to diverse audiences. Exceptional problem-solving, analytical, and critical-thinking skills. Strong business acumen and a customer-centric approach. Ability to work independently, take initiative, and collaborate effectively within a team environment.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

gwalior, madhya pradesh

On-site

As a Data Engineer at Synram Software Services Pvt. Ltd., a subsidiary of FG International GmbH, you will be an integral part of our team dedicated to providing innovative IT solutions in ERP systems, E-commerce platforms, Mobile Applications, and Digital Marketing. We are committed to delivering customized solutions that drive success across various industries.

In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure. Working closely with data analysts, data scientists, and software engineers, you will facilitate data-driven decision-making throughout the organization. Your key responsibilities will include developing, testing, and maintaining data architectures; designing and implementing ETL processes; optimizing data systems; collaborating with cross-functional teams to understand data requirements; ensuring data quality, integrity, and security; automating repetitive data tasks; monitoring and troubleshooting production data pipelines; and documenting systems, processes, and best practices (a minimal pipeline sketch follows this listing).

To excel in this role, you should possess a Bachelor's/Master's degree in Computer Science, Information Technology, or a related field, along with at least 2 years of experience as a Data Engineer or in a similar role. Proficiency in SQL and Python or Scala is essential, as is experience with data pipeline tools like Apache Airflow and familiarity with big data tools such as Hadoop and Spark. Hands-on experience with cloud platforms like AWS, GCP, or Azure is preferred, along with knowledge of data warehouse solutions like Snowflake, Redshift, or BigQuery. Preferred qualifications include knowledge of CI/CD for data applications, experience with containerization tools like Docker and Kubernetes, and exposure to data governance and compliance standards.

If you are ready to be part of a data-driven transformation journey, apply now to join our team at Synram Software Pvt Ltd. For inquiries, contact us at career@synram.co or +91-9111381555. Benefits of this full-time, permanent role include a flexible schedule, internet reimbursement, leave encashment, a day shift with fixed hours and weekend availability, a joining bonus, and a performance bonus. The ability to commute/relocate to Gwalior, Madhya Pradesh, is preferred. Don't miss the opportunity to contribute your expertise to our dynamic team. The application deadline is 20/07/2025, and the expected start date is 12/07/2025. We look forward to welcoming you aboard for a rewarding and challenging career in data engineering.
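The posting's core stack (Airflow orchestrating SQL/Python ETL into a warehouse) can be illustrated with a minimal DAG. This is a sketch only: the DAG id, schedule, and task bodies are hypothetical and assume Airflow 2.x, not anything specified by the posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull new rows from a source system (stubbed for illustration)
    print("extracting...")


def transform():
    # Clean and reshape the extracted data (stubbed for illustration)
    print("transforming...")


def load():
    # Write the result to the warehouse, e.g. Redshift (stubbed for illustration)
    print("loading...")


with DAG(
    dag_id="daily_etl",            # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",             # Airflow 2.4+ argument
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load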

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

navi mumbai, maharashtra

On-site

Seekify Global is looking for an experienced and motivated Data Catalog Engineer to join the Data Engineering team. The ideal candidate should have a significant background in designing and implementing metadata and data catalog solutions within AWS-centric data lake and data warehouse environments. As a Data Catalog Engineer at Seekify Global, you will play a crucial role in improving data discoverability, governance, and lineage across our enterprise data assets.

Your responsibilities will include leading the end-to-end implementation of a data cataloging solution within AWS, establishing and managing metadata frameworks for structured and unstructured data assets, and integrating the data catalog with various AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR. You will collaborate closely with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes. Additionally, you will be responsible for developing automation scripts for catalog ingestion, lineage tracking, and metadata updates using tools like Python, Lambda, PySpark, or Glue/EMR custom jobs (a minimal catalog-inventory sketch follows this listing). Working in coordination with data engineers, data architects, and analysts, you will ensure that metadata is accurate, relevant, and up to date. Implementing role-based access controls and ensuring compliance with data privacy and regulatory standards will also be part of your role. Moreover, you will be expected to create detailed documentation and conduct training/workshops for internal stakeholders on effectively utilizing the data catalog.

**Key Responsibilities:**

- Lead end-to-end implementation of a data cataloging solution within AWS, preferably AWS Glue Data Catalog or third-party tools like Apache Atlas, Alation, Collibra, etc.
- Establish and manage metadata frameworks for structured and unstructured data assets in data lake and data warehouse environments.
- Integrate the data catalog with AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR.
- Collaborate with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes.
- Develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using Python, Lambda, PySpark, or Glue/EMR custom jobs.
- Work closely with data engineers, data architects, and analysts to ensure metadata is accurate, relevant, and up to date.
- Implement role-based access controls and ensure compliance with data privacy and regulatory standards.

**Required Skills and Qualifications:**

- 7-8 years of experience in data engineering or metadata management roles.
- Proven expertise in implementing and managing data catalog solutions within AWS environments.
- Strong knowledge of AWS Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation.
- Hands-on experience with metadata ingestion, data lineage, and classification processes.
- Proficiency in Python, SQL, and automation scripting for metadata pipelines.
- Familiarity with data governance and compliance standards (e.g., GDPR, RBI guidelines).
- Experience integrating with BI tools (e.g., Tableau, Power BI) and third-party catalog tools is a plus.
- Strong communication, problem-solving, and stakeholder management skills.

**Preferred Qualifications:**

- AWS Certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect).
- Experience with data catalog tools like Alation, Collibra, or Informatica EDC, or hands-on experience with open-source tools.
- Exposure to data quality frameworks and stewardship practices.
- Knowledge of data migration with data catalog and data marts is a plus.

This is a full-time position with the work location being in person.
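Much of the automation this role describes can be driven through the AWS Glue Data Catalog API. Below is a minimal sketch that inventories every database and table registered in the catalog; the region and the use of the classification table parameter are assumptions for illustration, not details from the posting.

import boto3

# Assumed region; credentials come from the standard AWS credential chain
glue = boto3.client("glue", region_name="ap-south-1")

# Walk every database and table registered in the Glue Data Catalog
for db_page in glue.get_paginator("get_databases").paginate():
    for db in db_page["DatabaseList"]:
        table_pages = glue.get_paginator("get_tables").paginate(
            DatabaseName=db["Name"]
        )
        for table_page in table_pages:
            for table in table_page["TableList"]:
                classification = table.get("Parameters", {}).get(
                    "classification", "unknown"
                )
                print(f'{db["Name"]}.{table["Name"]}: {classification}')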

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

haryana

On-site

As a Data Engineer 2 at GoKwik, you will have the opportunity to collaborate closely with product managers, data scientists, business intelligence teams, and SDEs to develop and implement data-driven strategies. Your role will involve identifying, designing, and executing process improvements to enhance data models, architectures, pipelines, and applications.

You will play a vital role in continuously optimizing data processes and overseeing data management, governance, security, and analysis to ensure data quality and security across all product verticals. Additionally, you will design, create, and deploy new data models and pipelines as necessary to achieve high performance, operational excellence, accuracy, and reliability in the system.

Your responsibilities will include utilizing tools and technologies to establish a data architecture that supports new data initiatives and next-gen products. You will focus on building test-driven products and pipelines that are easily maintainable and reusable. Furthermore, you will design and construct an infrastructure for data extraction, transformation, and loading from various data sources to support the marketing and sales teams (see the sketch after this listing).

To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Mathematics, or relevant computer programming training, along with a minimum of 4 years of experience in the data engineering field. Proficiency in SQL, relational databases, query authoring, data pipelines, and architectures, together with experience working with cross-functional teams in a dynamic environment, is essential. Experience with Python, data pipeline tools, and AWS cloud services is also required.

We are looking for individuals who are independent, resourceful, analytical, and adept at problem-solving. The ability to adapt to changing environments, excellent communication skills, and a collaborative mindset are crucial for success in this role. If you are passionate about tackling challenging problems at scale and making a significant impact within a dynamic and entrepreneurial setting, we welcome you to join our team at GoKwik.
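The extract-transform-load work described above can be sketched in a few lines of Python. Everything concrete here (connection strings, table and column names, and the Redshift dialect supplied by the sqlalchemy-redshift package) is a hypothetical illustration, not taken from the posting.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical source and target connections
source = create_engine("postgresql://etl:secret@source-db:5432/app")
target = create_engine("redshift+psycopg2://etl:secret@cluster:5439/analytics")

# Extract: pull yesterday's orders from the operational database
orders = pd.read_sql(
    "SELECT order_id, user_id, amount, created_at "
    "FROM orders WHERE created_at >= CURRENT_DATE - 1",
    source,
    parse_dates=["created_at"],
)

# Transform: aggregate to a daily marketing-facing summary
daily = (
    orders.groupby(orders["created_at"].dt.date)
    .agg(orders_count=("order_id", "count"), revenue=("amount", "sum"))
    .reset_index(names="order_date")
)

# Load: append into the warehouse table the sales team reports on
daily.to_sql("daily_orders", target, if_exists="append", index=False)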

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a technology services and solutions provider specializing in Data, AI, and Digital, Zimetrics is dedicated to assisting enterprises in harnessing the economic potential and business value of data from various sources. Our core principles of Integrity, Intellect, and Ingenuity shape our value system, engineering expertise, and organizational behavior, making us problem solvers and innovators who challenge conventional wisdom and believe in endless possibilities.

You will be responsible for designing scalable and secure cloud-based data architecture solutions, leading data modeling, integration, and migration strategies across platforms, and engaging directly with clients to understand business needs and translate them into technical solutions. Additionally, you will support sales and pre-sales teams with solution architecture, technical presentations, and proposals; collaborate with cross-functional teams including engineering, BI, and product; and ensure adherence to best practices in data governance, security, and performance optimization.

To excel in this role, you must possess strong experience with cloud platforms such as AWS, Azure, or GCP; a deep understanding of data warehousing concepts and tools like Snowflake, Redshift, and BigQuery; proven expertise in conceptual, logical, and physical data modeling; and excellent communication and client engagement skills. Experience in pre-sales or solution consulting is a strong advantage, and the ability to articulate complex technical concepts to non-technical stakeholders will be vital for success in this position.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler Job Description:

We are looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include:

Analyze and translate business needs into long-term solution data models.
Evaluate existing data systems and recommend improvements.
Define rules to translate and transform data across data models.
Work with the development team to create conceptual data models and data flows.
Develop best practices for data coding to ensure consistency within the system.
Review modifications of existing systems for cross-compatibility.
Implement data strategies and develop physical data models.
Update and optimize local and metadata models.
Utilize canonical data modeling techniques to enhance data system efficiency.
Evaluate implemented data systems for variances, discrepancies, and efficiency.
Troubleshoot and optimize data systems to ensure optimal performance.
Strong expertise in relational and dimensional modeling (OLTP, OLAP); a minimal star-schema sketch appears after this listing.
Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
Familiarity with ETL processes, data integration, and data governance frameworks.
Strong analytical, problem-solving, and communication skills.

Qualifications:
Bachelor's degree in Engineering or a related field.
3 to 5 years of experience in data modeling or a related field.
4+ years of hands-on experience with dimensional and relational data modeling.
Expert knowledge of metadata management and related tools.
Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
Knowledge of transactional databases and data warehouses.

Preferred Skills:
Experience in cloud-based data solutions (AWS, Azure, GCP).
Knowledge of big data technologies (Hadoop, Spark, Kafka).
Understanding of graph databases and real-time data processing.
Certifications in data management, modeling, or cloud data engineering.
Excellent communication and presentation skills.
Strong interpersonal skills to collaborate effectively with various teams.
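The dimensional-modeling expertise the role calls for centres on star schemas: a fact table keyed to surrounding dimension tables. Below is a minimal, self-contained sketch using Python's built-in sqlite3 module; the table and column names are invented for illustration and are not from the posting.

import sqlite3

conn = sqlite3.connect(":memory:")

# A tiny star schema: one fact table referencing two dimensions
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20250714
    full_date  TEXT,
    month      INTEGER,
    year       INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold  INTEGER,
    revenue     REAL
);
""")

conn.execute("INSERT INTO dim_date VALUES (20250714, '2025-07-14', 7, 2025)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (20250714, 1, 3, 149.97)")

# Typical OLAP query: aggregate facts by dimension attributes
for row in conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
"""):
    print(row)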

Posted 1 week ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler Job Description

We are looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include:

Analyze and translate business needs into long-term solution data models.
Evaluate existing data systems and recommend improvements.
Define rules to translate and transform data across data models.
Work with the development team to create conceptual data models and data flows.
Develop best practices for data coding to ensure consistency within the system.
Review modifications of existing systems for cross-compatibility.
Implement data strategies and develop physical data models.
Update and optimize local and metadata models.
Utilize canonical data modeling techniques to enhance data system efficiency.
Evaluate implemented data systems for variances, discrepancies, and efficiency.
Troubleshoot and optimize data systems to ensure optimal performance.
Strong expertise in relational and dimensional modeling (OLTP, OLAP).
Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
Familiarity with ETL processes, data integration, and data governance frameworks.
Strong analytical, problem-solving, and communication skills.

Qualifications

Bachelor's degree in Engineering or a related field.
5 to 9 years of experience in data modeling or a related field.
4+ years of hands-on experience with dimensional and relational data modeling.
Expert knowledge of metadata management and related tools.
Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
Knowledge of transactional databases and data warehouses.

Preferred Skills

Experience in cloud-based data solutions (AWS, Azure, GCP).
Knowledge of big data technologies (Hadoop, Spark, Kafka).
Understanding of graph databases and real-time data processing.
Certifications in data management, modeling, or cloud data engineering.
Excellent communication and presentation skills.
Strong interpersonal skills to collaborate effectively with various teams.

Posted 1 week ago

Apply

2.0 - 5.0 years

0 Lacs

Greater Kolkata Area

Remote

About The Role

We are looking for a talented and motivated Senior Data Scientist to join our Marketing and Product Analytics team. In this role, you will work closely with cross-functional teams to analyse data, build models, and generate insights that drive key decisions across our marketing and product functions. This is a fantastic opportunity to apply your analytical skills in a fast-paced, data-driven environment with a strong culture of experimentation and impact.

Responsibilities

Conduct deep-dive analyses to uncover insights on user behaviour, marketing performance, and product engagement.
Create clear, concise dashboards and reports to communicate findings to stakeholders.
Partner with product managers, marketers, and engineers to design experiments and evaluate their outcomes (a minimal evaluation sketch appears after this listing).
Continuously identify opportunities to improve decision-making through data.
Extract, clean, and transform data to ensure quality and usability across analytics initiatives.
Build, validate, and deploy machine learning and statistical models to solve business problems.

Required Skills

Proficient in Python and SQL (MySQL or Redshift experience preferred).
Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly).
Solid understanding of machine learning, A/B testing, customer segmentation, and statistical analysis.
Strong problem-solving skills and a structured approach to working with complex datasets.
Excellent communication skills with the ability to present findings clearly to non-technical audiences.
Highly motivated and collaborative, with the drive to excel in a fast-paced remote environment.

Education And Background

B.E./B.Tech. from a Tier I college (IITs/NITs/BITS) in Computer Science, Data Science, Engineering, or a related quantitative field.
2-5 years of experience in an analytics/data science role.
Prior experience in marketing/customer analytics is required.
Prior experience working in a consumer-tech, credit/lending, or fintech environment is a plus.

Why Join Us

Be part of a high-impact analytics team shaping marketing and product strategy.
Work on real-world problems with direct influence on business outcomes.
Enjoy a remote-first work environment with flexibility and autonomy.
Get access to mentorship, continuous learning, and growth opportunities.

(ref:hirist.tech)
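Evaluating an A/B test of the kind described usually reduces to comparing conversion rates between two groups. A minimal sketch using statsmodels follows; the conversion counts and traffic numbers are made up for illustration.

import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment results: control vs. variant
conversions = np.array([420, 510])   # converted users per group
visitors = np.array([10000, 10200])  # total users per group

# Two-sided z-test for a difference in conversion proportions
stat, p_value = proportions_ztest(conversions, visitors)

rates = conversions / visitors
print(f"control: {rates[0]:.2%}, variant: {rates[1]:.2%}")
print(f"z = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected.")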

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

As an Informatica IDMC Developer at Coforge, your primary responsibility will be to design, develop, and maintain robust ETL pipelines using Informatica Intelligent Data Management Cloud (IDMC/IICS). You will collaborate with data architects, analysts, and business stakeholders to gather and understand data requirements. Your role will involve integrating data from various sources, including databases, APIs, and flat files, and optimizing data workflows for performance, scalability, and reliability. Monitoring and troubleshooting ETL jobs to address data quality issues will be part of your daily tasks (a minimal quality-check sketch appears after this listing). Implementing data governance and security best practices will also be crucial, along with maintaining detailed documentation of data flows, transformations, and architecture. Your contributions to code reviews and continuous improvement initiatives will be valued.

The ideal candidate should possess strong hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools, proficiency in SQL, and prior experience working with relational databases such as Oracle, SQL Server, and PostgreSQL. Familiarity with cloud platforms such as AWS, Azure, or GCP, and knowledge of data warehousing concepts and tools like Snowflake, Redshift, or BigQuery are also required. Excellent problem-solving skills and effective communication abilities are highly desirable.

Preferred qualifications include experience with CI/CD pipelines and version control systems, as well as knowledge of data modeling and metadata management. Certifications in Informatica or cloud platforms will be considered a plus.

If you have 5-8 years of relevant experience and possess the skill set above, we encourage you to apply by sending your CV to Gaurav.2.Kumar@coforge.com. This position is based in Greater Noida.
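Data-quality monitoring of the kind this role describes is often backed by plain SQL assertions run against the target warehouse. Below is a hedged sketch using psycopg2 against a PostgreSQL-compatible target; the host, table, and check definitions are hypothetical, and IICS itself would typically enforce such rules through its own mappings rather than an external script like this.

import psycopg2

# Hypothetical warehouse connection
conn = psycopg2.connect(
    host="warehouse.example.com", dbname="dw", user="etl_monitor", password="change-me"
)

# Each check should return 0 rows-in-violation when the data is healthy
checks = {
    "null_customer_ids": "SELECT COUNT(*) FROM stg_orders WHERE customer_id IS NULL",
    "duplicate_order_ids": (
        "SELECT COUNT(*) FROM ("
        "  SELECT order_id FROM stg_orders"
        "  GROUP BY order_id HAVING COUNT(*) > 1"
        ") dupes"
    ),
}

with conn, conn.cursor() as cur:
    for name, sql in checks.items():
        cur.execute(sql)
        (violations,) = cur.fetchone()
        print(f"{name}: {violations} {'OK' if violations == 0 else 'FAIL'}")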

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

At Medtronic, you can embark on a rewarding career dedicated to exploration and innovation, all while contributing to the advancement of healthcare access and equity for all. As a Digital Engineer at our new Minimed India Hub, you will play a crucial role in leveraging technology to enhance healthcare solutions on a global scale. Specifically, as a PySpark Data Engineer, you will design, develop, and maintain data pipelines using PySpark. Your collaboration with data scientists, analysts, and stakeholders will be essential to ensuring the efficient processing and analysis of large datasets, including complex transformations and aggregations (a minimal pipeline sketch appears after this listing).

This role offers an exciting opportunity to work within Medtronic's Diabetes business. As the Diabetes division prepares for separation to foster future growth and innovation, you will have the chance to operate with increased speed and agility. Working as a separate entity will bring a sharper focus on driving meaningful innovation and enhancing the impact on patient care.

Your responsibilities will include designing, developing, and maintaining scalable and efficient ETL pipelines using PySpark; working with structured and unstructured data from various sources; optimizing PySpark applications for performance and scalability; collaborating with data scientists and analysts to understand data requirements; implementing data quality checks; monitoring and troubleshooting data pipeline issues; documenting technical specifications; and staying current with the latest trends and technologies in big data and distributed computing.

To excel in this role, you should possess a Bachelor's degree in computer science, engineering, or a related field, along with 4-5 years of experience in data engineering focusing on PySpark. Proficiency in Python and Spark, strong coding and debugging skills, knowledge of SQL and relational databases, hands-on experience with cloud platforms, familiarity with data warehousing solutions, experience with big data technologies, problem-solving abilities, and effective communication and collaboration skills are essential. Preferred skills include experience with Databricks, orchestration tools like Apache Airflow, knowledge of machine learning workflows, an understanding of data security and governance best practices, familiarity with streaming data platforms, and knowledge of CI/CD pipelines and version control systems.

Medtronic offers a competitive salary and a flexible benefits package, along with a commitment to recognizing and supporting employees at every stage of their career and life. As part of the Medtronic team, you will contribute to the mission of alleviating pain, restoring health, and extending life by tackling the most challenging health problems facing humanity. Join us in engineering solutions that make a real difference in people's lives.
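A PySpark pipeline covering the transformations, aggregations, and data-quality checks the posting mentions can be sketched as follows. The S3 paths, column names, and validity thresholds are hypothetical illustrations, not details from Medtronic.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("device_readings_etl").getOrCreate()

# Extract: raw device readings from a data lake (hypothetical path)
readings = spark.read.parquet("s3://example-bucket/raw/readings/")

# Data quality: drop duplicates and physiologically implausible values
clean = (
    readings
    .dropDuplicates(["device_id", "reading_ts"])
    .filter(F.col("glucose_mg_dl").between(20, 600))  # assumed valid range
)

# Transform: aggregate to one row per device per day
daily = (
    clean
    .groupBy("device_id", F.to_date("reading_ts").alias("reading_date"))
    .agg(
        F.avg("glucose_mg_dl").alias("avg_glucose"),
        F.count("*").alias("n_readings"),
    )
)

# Load: write a partitioned curated dataset
daily.write.mode("overwrite").partitionBy("reading_date").parquet(
    "s3://example-bucket/curated/daily_readings/"
)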

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

maharashtra

On-site

As the Technical Lead for Data Engineering at Assent, you will collaborate with other team members to identify opportunities and assess the feasibility of solutions. Your role will involve providing technical guidance, influencing decision-making, and aligning data engineering initiatives with business goals. You will drive the technical strategy, team execution, and process improvements needed to build resilient and scalable data systems. Mentoring a growing team and building robust, scalable data infrastructure will be essential aspects of your responsibilities.

Your key responsibilities will include driving the technical execution of data engineering projects, working closely with Architecture team members to design and implement scalable data pipelines, and providing technical guidance to ensure best practices in data engineering. You will collaborate cross-functionally with various teams to define and execute data initiatives, plan and prioritize work with the team manager, and stay current with emerging technologies to drive their adoption.

To be successful in this role, you should have 10+ years of experience in data engineering or related fields; expertise in cloud data platforms such as AWS; proficiency in modern data technologies like Spark, Airflow, and Snowflake; and a deep understanding of distributed systems and data pipeline design. Strong programming skills in languages like Python, SQL, or Scala, experience with infrastructure as code and DevOps best practices, and the ability to influence technical direction and advocate for best practices are also necessary. Strong communication and leadership skills, a learning mindset, and experience in security, compliance, and governance related to data systems will be advantageous.

At Assent, we value your talent, energy, and passion, and offer various benefits to support your well-being, financial health, and personal growth. Our commitment to diversity, equity, and inclusion ensures that all team members are included, valued, and given equal opportunities for success. If you require any assistance or accommodation during the interview process, please contact us at talent@assent.com.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Kochi, Kerala, India

On-site

Experience: 5–7 years

Role Overview

You will be responsible for the seamless replication and integration of data from source systems (Oracle EBS, PLM, Retail, SQL Server) to Redshift and BI platforms using GoldenGate, Qlik Replicate, and ODI.

Key Responsibilities

Configure, monitor, and troubleshoot Oracle GoldenGate replication pipelines (EBS/PLM/BI).
Administer Qlik Replicate jobs and monitor data sync from Oracle/SQL Server to AWS Redshift.
Manage ODI interfaces for batch ETL processes feeding the data lake or reporting systems.
Ensure data integrity, latency SLAs, and failure recovery in replication and transformation pipelines (a minimal parity-check sketch appears after this listing).
Collaborate with BI teams to resolve source-side data issues and support schema evolution planning.

Required Skills

Experience with Oracle GoldenGate (active-active and active-passive replication).
Familiarity with Qlik Replicate and Redshift pipelines.
ODI interface development and job orchestration.
Familiarity with data integrity validation, performance tuning, and log-based replication.
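One routine piece of the data-integrity validation named above is a row-count parity check between an Oracle source and the Redshift target. The sketch below uses the python-oracledb and psycopg2 drivers; the table list, credentials, and source-to-target naming convention are all assumptions for illustration, not details from the posting.

import oracledb
import psycopg2

# Hypothetical source tables replicated by GoldenGate / Qlik Replicate
TABLES = ["EBS.GL_JE_LINES", "EBS.AP_INVOICES_ALL"]

ora = oracledb.connect(user="monitor", password="change-me", dsn="ebs-db/ORCL")
rs = psycopg2.connect(
    host="redshift-cluster.example.com", port=5439,
    dbname="dw", user="monitor", password="change-me",
)

for table in TABLES:
    with ora.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        src_count = cur.fetchone()[0]
    # Assumed convention: EBS.GL_JE_LINES lands as ebs_gl_je_lines in Redshift
    target_table = table.lower().replace(".", "_")
    with rs.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {target_table}")
        tgt_count = cur.fetchone()[0]
    drift = src_count - tgt_count
    status = "in sync" if drift == 0 else f"drift={drift}"
    print(f"{table}: source={src_count} target={tgt_count} {status}")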

Posted 1 week ago

Apply