5.0 - 8.0 years
6 - 10 Lacs
Telangana
Work from Office
Key Responsibilities:
Team Leadership:
- Lead and mentor a team of Azure Data Engineers, providing technical guidance and support.
- Foster a collaborative and innovative team environment.
- Conduct regular performance reviews and set development goals for team members.
- Organize training sessions to enhance team skills and technical capabilities.
Azure Data Platform:
- Design, implement, and optimize scalable data solutions using Azure data services such as Azure Databricks, Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Ensure data engineering best practices and data governance are followed.
- Stay up to date with Azure data technologies and recommend improvements to enhance data processing capabilities.
Data Architecture:
- Collaborate with data architects to design efficient and scalable data architectures.
- Define data modeling standards and ensure data integrity, security, and governance compliance.
Project Management:
- Work with project managers to define project scope, goals, and deliverables.
- Develop project timelines, allocate resources, and track progress.
- Identify and mitigate risks to ensure successful project delivery.
Collaboration & Communication:
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver data-driven solutions.
- Communicate effectively with stakeholders to understand requirements and provide updates.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Team Lead or Manager in data engineering.
- Extensive experience with Azure data services and cloud technologies.
- Expertise in Azure Databricks, PySpark, and SQL.
- Strong understanding of data engineering best practices, data modeling, and ETL processes.
- Experience with agile development methodologies.
- Certifications in Azure data services (preferred).
Preferred Skills:
- Experience with big data technologies and data warehousing solutions.
- Familiarity with industry standards and compliance requirements.
- Ability to lead and mentor a team.
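The ETL best practices this posting emphasizes usually come down to making loads idempotent, so a failed run can simply be re-run. A minimal sketch of a staging-to-warehouse upsert, using Python's built-in sqlite3 as a small stand-in for a warehouse such as Azure SQL Database; all table and column names here are hypothetical:

```python
import sqlite3

# Hypothetical staging-to-warehouse upsert, with sqlite3 standing in
# for a cloud warehouse such as Azure SQL Database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customers (id INTEGER, name TEXT, city TEXT);
    CREATE TABLE dim_customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    INSERT INTO dim_customers VALUES (1, 'Asha', 'Hyderabad');
    INSERT INTO stg_customers VALUES (1, 'Asha', 'Bengaluru'),
                                     (2, 'Ravi', 'Chennai');
""")

# Idempotent load: re-running this statement leaves the warehouse unchanged.
# (WHERE true disambiguates the SELECT from the upsert clause in SQLite.)
conn.execute("""
    INSERT INTO dim_customers (id, name, city)
    SELECT id, name, city FROM stg_customers WHERE true
    ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city
""")
conn.commit()

rows = conn.execute("SELECT id, city FROM dim_customers ORDER BY id").fetchall()
print(rows)  # [(1, 'Bengaluru'), (2, 'Chennai')]
```

The same merge/upsert idea carries over to MERGE statements in Azure Synapse or Databricks Delta tables.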
Posted 1 month ago
2.0 - 7.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Amazon strives to be Earth's most customer-centric company, where people can find and discover virtually anything they want to buy online. By giving customers more of what they want: low prices, vast selection, and convenience, Amazon continues to grow and evolve as a world-class e-commerce platform. Do you have solid analytical thinking and metrics-driven decision making, and want to solve problems with solutions that will meet the growing worldwide need? Then SmartCommerce is the team for you. We are looking for a top-notch Business Intelligence Engineer to be part of our analytics team. The ideal candidate will be curious, have attention to detail, be energized by a challenging entrepreneurial environment, and be comfortable thinking big while also diving deep. Are you a smart, hungry, flexible, world-class analytics professional excited by the challenge of launching a new business initiative for Amazon? The SmartCommerce team is looking for a Business Intelligence Engineer to be part of a new team being built from the ground up. They will primarily work on our product SmartBiz. SmartBiz by Amazon is a one-stop shop for Indian sellers to fulfill their online selling needs. Whether a small business, an entrepreneur, or a neighborhood store, a seller can now create their own e-commerce store within minutes and start showcasing and selling their products online.
Responsibilities:
1. Responsible for designing, building, and maintaining complex data solutions for Amazon's SmartCommerce businesses
2. Actively participates in the code review process, design discussions, team planning, and operational excellence, and constructively identifies problems and proposes solutions
3. Makes appropriate trade-offs, re-uses where possible, and is judicious about introducing dependencies
4. Makes efficient use of resources (e.g., system hardware, data storage, query optimization, AWS infrastructure, etc.)
5. Asks the right questions when the data model and requirements are not well defined, and comes up with designs that are scalable, maintainable, and efficient
6. Makes enhancements that improve the team's data architecture, making it better and easier to maintain (e.g., data auditing solutions, automating ad-hoc or manual operation steps)
7. Owns the data quality of important datasets and any new changes/enhancements
Qualifications:
- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with one or more industry analytics visualization tools (e.g., Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g., t-test, chi-squared)
- Experience with a scripting language (e.g., Python, Java, or R)
- Master's degree or advanced technical degree
- Knowledge of data modeling and data pipeline design
- Experience with statistical analysis and correlation analysis
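The statistical methods the posting lists (t-test, chi-squared, correlation analysis) all reduce to straightforward computations over samples. As a rough illustration, here is correlation analysis in plain Python; the spend/orders figures are invented for the example:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical example: weekly ad spend vs. orders for six weeks.
spend  = [10, 12, 15, 17, 20, 24]
orders = [110, 118, 131, 140, 151, 170]
print(round(pearson_r(spend, orders), 4))
```

In practice a BI engineer would reach for Pandas or SciPy, but the underlying quantity is the same.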
Posted 1 month ago
9.0 - 12.0 years
15 - 20 Lacs
Chennai
Work from Office
Job Title: Data Engineer Lead / Architect (ADF)
Experience: 9-12 Years
Location: Remote / Hybrid

Role and Responsibilities:
- Talk to client stakeholders and understand the requirements for building their data warehouse / data lake / data lakehouse.
- Design, develop, and maintain data pipelines in Azure Data Factory (ADF) for ETL from on-premises and cloud-based sources.
- Design, develop, and maintain data warehouses and data lakes in Azure.
- Run large data platform and other related programs to provide business intelligence support.
- Design and develop data models to support business intelligence solutions.
- Implement best practices in data modelling and data warehousing.
- Troubleshoot and resolve issues related to ETL and data connections.
Skills Required:
- Excellent written and verbal communication skills.
- Excellent knowledge and experience in ADF.
- Well versed with ADLS Gen 2.
- Knowledge of SQL for data extraction and transformation.
- Ability to work with various data sources (Excel, SQL databases, APIs, etc.).
- Knowledge of SAS would be an added advantage.
- Knowledge of Power BI would be an added advantage.
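Incremental ETL pipelines of the kind this role builds in ADF are typically structured around a high-watermark lookup: load only rows changed since the last successful run, then advance the watermark. A minimal sketch of that pattern in plain Python (the names and dates are hypothetical; in ADF this would be a Lookup activity feeding a Copy activity):

```python
# High-watermark incremental load, the pattern ADF pipelines commonly
# implement with Lookup + Copy activities. All names are hypothetical.
source_rows = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-01-05"},
    {"id": 3, "updated_at": "2024-01-09"},
]
watermark = "2024-01-03"  # last successfully loaded timestamp

# Extract only rows changed since the watermark.
delta = [r for r in source_rows if r["updated_at"] > watermark]

# Load the delta, then advance the watermark for the next run.
target = []
target.extend(delta)
watermark = max(r["updated_at"] for r in delta)

print([r["id"] for r in target], watermark)  # [2, 3] 2024-01-09
```

Persisting the watermark (e.g. in a control table) is what makes the pipeline restartable without reloading history.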
Posted 1 month ago
12.0 - 22.0 years
40 - 60 Lacs
Bengaluru
Work from Office
Location: Bangalore (Onsite)
Experience: 12+ years
Type: Full-time

Role Overview
We are looking for a Technical Program Manager (TPM) to drive the execution of a next-generation data and AI platform that powers real-time analytics, machine learning, and industrial applications across multiple domains such as aviation, logistics, and manufacturing. You will work at the intersection of engineering, product, architecture, and business, managing the roadmap, resolving technical dependencies, and ensuring delivery of critical platform components across cross-functional and geographically distributed teams.

Key Responsibilities
Program & Execution Management
- Drive end-to-end delivery of platform features and sector-specific solutions by coordinating multiple scrum teams (AI/ML, Data, Fullstack, DevOps).
- Develop and maintain technical delivery plans, sprint milestones, and program-wide timelines.
- Identify and resolve cross-team dependencies, risks, and technical bottlenecks.
Technical Fluency & Architecture Alignment
- Understand the platform's architecture (Kafka, Spark, data lakes, ML pipelines, hybrid/on-prem deployments) and guide teams toward cohesive delivery.
- Translate high-level product goals into detailed technical milestones and backlog items in collaboration with Product Owners and Architects.
Cross-Functional Collaboration
- Liaise between globally distributed engineering teams, product owners, architects, and domain stakeholders to align on priorities and timelines.
- Coordinate multi-sector requirements and build scalable components that serve as blueprints across industries (aviation, logistics, etc.).
Governance & Reporting
- Maintain clear, concise, and timely program reporting (dashboards, OKRs, status updates) for leadership and stakeholders.
- Champion delivery best practices, quality assurance, and documentation hygiene.
Innovation & Agility
- Support iterative product development with flexibility to handle ambiguity and evolving priorities.
- Enable POCs and rapid prototyping efforts while planning for scalable production transitions.

Required Skills & Qualifications
- 12+ years of experience in software engineering and technical program/project management.
- Strong understanding of platform/data architecture, including event streaming (Kafka), batch/stream processing (Spark, Flink), and AI/ML pipelines.
- Proven success delivering complex programs in agile environments with multiple engineering teams.
- Familiarity with DevOps, cloud/on-prem infrastructure (AWS, Azure, hybrid models), CI/CD, and observability practices.
- Excellent communication, stakeholder management, and risk mitigation skills.
- Strong grasp of Agile/Scrum or SAFe methodologies.

Good-to-Have
- Experience working in or delivering solutions to industrial sectors such as aviation, manufacturing, logistics, or utilities.
- Experience with tools like Jira, Confluence, Notion, Asana, or similar.
- Background in engineering or data (Computer Science, Data Engineering, AI/ML, or related).
Posted 1 month ago
12.0 - 15.0 years
13 - 18 Lacs
Gurugram
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: SAP Data Services Development
Good to have skills: NA
Minimum experience required: 12 year(s)
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the design and implementation phases. Your role will be pivotal in ensuring that the data architecture is robust, scalable, and capable of supporting future growth and innovation within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor and evaluate the effectiveness of data solutions and make necessary adjustments.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP Data Migration.
- Experience with SAP Data & Development.
- Strong understanding of data modeling techniques and best practices.
- Familiarity with data integration tools and methodologies.
- Ability to design and implement data governance frameworks.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP Data Migration.
- This position is based at our Gurugram office.
- A 15 years full time education is required.
Posted 1 month ago
12.0 - 15.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Project Role: Solution Architect
Project Role Description: Translate client requirements into differentiated, deliverable solutions using in-depth knowledge of a technology, function, or platform. Collaborate with the Sales Pursuit and Delivery Teams to develop a winnable and deliverable solution that underpins the client value proposition and business case.
Must have skills: Solution Architecture
Good to have skills: NA
Minimum experience required: 12 year(s)
Educational Qualification: 15 years full time education

Summary: As a Solution Architect, you will engage in a dynamic and collaborative environment where you will translate client requirements into innovative and effective solutions. Your typical day will involve working closely with various teams to ensure that the solutions developed are not only deliverable but also align with the client's business objectives. You will leverage your extensive knowledge of technology and platforms to create value propositions that resonate with clients, ensuring that their needs are met with precision and creativity. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality outcomes that drive client satisfaction and business success.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.
- Mentor junior team members to enhance their skills and knowledge in solution architecture.

Professional & Technical Skills:
- Must-have skills: Proficiency in Solution Architecture.
- Strong understanding of cloud computing platforms and services.
- Experience with enterprise application integration and API management.
- Ability to design scalable and resilient architectures.
- Familiarity with agile methodologies and project management practices.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Solution Architecture.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 month ago
8.0 - 13.0 years
8 - 13 Lacs
Telangana
Work from Office
Key Responsibilities:
Team Leadership:
- Lead and mentor a team of Azure Data Engineers, providing technical guidance and support.
- Foster a collaborative and innovative team environment.
- Conduct regular performance reviews and set development goals for team members.
- Organize training sessions to enhance team skills and technical capabilities.
Azure Data Platform:
- Design, implement, and optimize scalable data solutions using Azure data services such as Azure Databricks, Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Ensure data engineering best practices and data governance are followed.
- Stay up to date with Azure data technologies and recommend improvements to enhance data processing capabilities.
Data Architecture:
- Collaborate with data architects to design efficient and scalable data architectures.
- Define data modeling standards and ensure data integrity, security, and governance compliance.
Project Management:
- Work with project managers to define project scope, goals, and deliverables.
- Develop project timelines, allocate resources, and track progress.
- Identify and mitigate risks to ensure successful project delivery.
Collaboration & Communication:
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver data-driven solutions.
- Communicate effectively with stakeholders to understand requirements and provide updates.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Team Lead or Manager in data engineering.
- Extensive experience with Azure data services and cloud technologies.
- Expertise in Azure Databricks, PySpark, and SQL.
- Strong understanding of data engineering best practices, data modeling, and ETL processes.
- Experience with agile development methodologies.
- Certifications in Azure data services (preferred).
Preferred Skills:
- Experience with big data technologies and data warehousing solutions.
- Familiarity with industry standards and compliance requirements.
- Ability to lead and mentor a team.
Posted 1 month ago
7.0 - 10.0 years
20 - 35 Lacs
Noida
Remote
Position: Cloud Data Architect
Location: Remote
Work Time: US EST hours

Job Description: ECC/BW/HANA Solution/Data Architect
- Design and implement end-to-end SAP ECC, BW, and HANA data architectures, ensuring scalable and robust solutions.
- Develop and optimize data models, ETL processes, and reporting frameworks across SAP landscapes.
- Lead integration efforts, defining and applying best practices for connecting SAP systems with external platforms and cloud services.
- Collaborate with business stakeholders to translate requirements into technical solutions, focusing on data quality and governance.
- Provide technical leadership and mentorship to project teams, ensuring alignment with enterprise integration patterns and standards.
Interested candidates can apply at: dsingh15@fcsltd.com
Posted 1 month ago
1.0 - 7.0 years
3 - 9 Lacs
Bengaluru
Work from Office
Responsibilities:
- Design, develop, and implement machine learning models and statistical algorithms.
- Analyze large datasets to extract meaningful insights and trends.
- Collaborate with stakeholders to define business problems and deliver data-driven solutions.
- Optimize and scale machine learning models for production environments.
- Present analytical findings and recommendations in a clear, actionable manner.
Key Skills:
- Proficiency in Python, R, and SQL.
- Experience with ML libraries such as TensorFlow, PyTorch, or scikit-learn.
- Strong knowledge of statistical methods and data visualization tools.
- Excellent problem-solving and storytelling skills.
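As a toy illustration of the model-fitting loop that libraries like scikit-learn or PyTorch wrap, here is single-feature linear regression trained by gradient descent in plain Python; the data is synthetic and the hyperparameters are illustrative:

```python
# Minimal single-feature linear regression trained by gradient descent,
# a toy stand-in for the optimization loop ML libraries perform internally.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # generated by y = 2x + 1

w, b, lr = 0.0, 0.0, 0.05
for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges to roughly 2.0 and 1.0
```

Production work adds the parts this sketch omits: train/test splits, regularization, feature pipelines, and model serving.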
Posted 1 month ago
8.0 - 12.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Job Summary
We are seeking an experienced Data Architect with expertise in Snowflake, dbt, Apache Airflow, and AWS to design, implement, and optimize scalable data solutions. The ideal candidate will play a critical role in defining data architecture, governance, and best practices while collaborating with cross-functional teams to drive data-driven decision-making.

Key Responsibilities
Data Architecture & Strategy:
- Design and implement scalable, high-performance cloud-based data architectures on AWS.
- Define data modeling standards for structured and semi-structured data in Snowflake.
- Establish data governance, security, and compliance best practices.
Data Warehousing & ETL/ELT Pipelines:
- Develop, maintain, and optimize Snowflake-based data warehouses.
- Implement dbt (Data Build Tool) for data transformation and modeling.
- Design and schedule data pipelines using Apache Airflow for orchestration.
Cloud & Infrastructure Management:
- Architect and optimize data pipelines using AWS services such as S3, Glue, Lambda, and Redshift.
- Ensure cost-effective, highly available, and scalable cloud data solutions.
Collaboration & Leadership:
- Work closely with data engineers, analysts, and business stakeholders to align data solutions with business goals.
- Provide technical guidance and mentoring to the data engineering team.
Performance Optimization & Monitoring:
- Optimize query performance and data processing within Snowflake.
- Implement logging, monitoring, and alerting for pipeline reliability.

Required Skills & Qualifications
- 10+ years of experience in data architecture, engineering, or related roles.
- Strong expertise in Snowflake, including data modeling, performance tuning, and security best practices.
- Hands-on experience with dbt for data transformations and modeling.
- Proficiency in Apache Airflow for workflow orchestration.
- Strong knowledge of AWS services (S3, Glue, Lambda, Redshift, IAM, EC2, etc.).
- Experience with SQL, Python, or Spark for data processing.
- Familiarity with CI/CD pipelines and Infrastructure-as-Code (Terraform/CloudFormation) is a plus.
- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, etc.).

Preferred Qualifications
- Certifications: AWS Certified Data Analytics - Specialty, Snowflake SnowPro Certification, or dbt Certification.
- Experience with streaming technologies (Kafka, Kinesis) is a plus.
- Knowledge of modern data stack tools (Looker, Power BI, etc.).
- Experience in OTT streaming would be an added advantage.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Database Administrator at NTT DATA, you will be a seasoned subject matter expert responsible for ensuring the availability, integrity, and performance of critical data assets. You will work closely with cross-functional teams to support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Collaboration with Change Control, Release Management, Asset and Configuration Management, and Capacity and Availability Management will be essential to establish user needs, monitor access and security, and control database environments. Key responsibilities include performing the installation, configuration, and maintenance of database management systems, collaborating with software developers/architects to optimize database schemas, and designing backup and disaster recovery strategies. You will monitor database performance, identify bottlenecks, and optimize queries for optimal performance. Additionally, you will work on database documentation, data validation, integrity checks, and data cleansing activities. Supporting database-related initiatives, applying patches, and communicating with technical teams and stakeholders are also crucial aspects of the role. To excel in this position, you should have seasoned proficiency in database administration tasks, SQL knowledge, database security principles, backup and recovery strategies, and data architecture. Effective communication, problem-solving, analytical skills, and the ability to manage multiple projects concurrently are necessary. Academic qualifications include a Bachelor's degree in computer science or related field, along with relevant certifications such as MCSE DBA or Oracle Associate. Prior experience as a Database Administrator in an IT organization, working with Oracle Enterprise and Microsoft SQL Server, and managing databases is required. 
NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a focus on R&D and a diverse team of experts, NTT DATA provides consulting, data, AI, and industry solutions to move organizations confidently into the digital future. As an Equal Opportunity Employer, NTT DATA offers a workplace where diversity and inclusion thrive, allowing employees to grow, belong, and succeed.
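Backup and recovery strategy is central to this DBA role. As a small-scale illustration of an online (hot) backup, Python's stdlib sqlite3 exposes a backup API that copies a live database page by page while it remains usable; enterprise engines like Oracle and SQL Server offer the same capability through their own tooling:

```python
import sqlite3

# Online backup of a live database via sqlite3's stdlib backup API,
# a small-scale analogue of the hot-backup strategies the role covers.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
src.execute("INSERT INTO accounts VALUES (1, 100.0), (2, 250.0)")
src.commit()

dest = sqlite3.connect(":memory:")   # in production this would be a file
src.backup(dest)                     # copies the database page by page

# Verify the restored copy matches the source.
restored = dest.execute("SELECT COUNT(*), SUM(balance) FROM accounts").fetchone()
print(restored)  # (2, 350.0)
```

The verification step matters: a backup strategy is only as good as its tested restores.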
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
You will be part of a global and diverse community at American Express, where you will have the opportunity to contribute to the company's success and work in an environment that values integrity and inclusivity. As a member of the U.S. Consumer Services and Enterprise Digital & Data Technology Team, you will play a key role in developing technology capabilities that enhance digital engagement and support various business lines and customer segments. In this role, you will be responsible for Business Process Management architecture, focusing on increasing adoption of automation platforms, delivering technology products, and providing innovative solutions. You will spend 25% of your time on hands-on technical projects, analyzing and recommending solutions based on research and introspection. Your responsibilities will include designing and leading solutions, ensuring alignment with enterprise architecture standards, and collaborating with engineering teams to implement solutions that drive measurable business improvements. To succeed in this role, you should have a minimum of 10 years of engineering or architecture experience, with a strong background in designing and deploying enterprise-grade technology assets. A bachelor's degree in computer science or a related field is required, and an advanced degree is preferred. You should have deep knowledge of architectural disciplines and experience in introducing new technologies based on research. Proficiency in technology architecture, containers, cloud management, and COTS product evaluation is essential, along with the ability to write, read, and debug code. Preferred qualifications include experience in delivering technology products for enterprise scale, exposure to AI technologies, and expertise in service-oriented architecture and microservices architecture. You should also have experience with high-throughput messaging technologies and channel-specific architecture skills. 
American Express offers competitive salaries, bonus incentives, and comprehensive benefits to support your holistic well-being, including medical, dental, vision, life insurance, and disability benefits. You will have access to career development opportunities and flexible working arrangements based on business needs. Join Team Amex and be part of a team that values your contributions and supports your professional growth and personal well-being.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
The Reporting & Data Product Owner - ISS Data (Associate Director) role at Fidelity involves leading the creation and execution of a future state data reporting product to enable Regulatory, Client, Vendor, Internal & MI reporting and analytics. This key role requires an in-depth knowledge of data domains related to institutional clients, the investment life cycle, and regulatory and client reporting data requirements. Sitting within the ISS Delivery Data Analysis chapter, the successful candidate will collaborate with Business Architecture, Data Architecture, and business stakeholders to build a future state platform. Maintaining strong relationships with various business contacts is essential to ensure superior service to internal business stakeholders and clients.

**Key Responsibilities**

**Leadership and Management:**
- Lead ISS distribution, Client Propositions, Sustainable Investing, and Regulatory reporting data outcomes
- Define data roadmap and capabilities, supporting execution and delivery of data solutions as a Data Product lead
- Line management responsibilities for junior data analysts within the chapter
- Define data product vision and strategy with end-to-end thought leadership
- Lead and define the data product backlog, documentation, analysis effort estimation, and planning
- Drive efficiencies, scale, and innovation as a catalyst for change

**Data Quality and Integrity:**
- Define data quality use cases for all required data sets
- Contribute to technical frameworks of data quality
- Align functional solutions with best practice data architecture & engineering

**Coordination and Communication:**
- Communicate at a senior management level to influence senior tech and business stakeholders globally
- Coordinate with internal and external teams impacted by data flows
- Advocate for the ISS Data Programme
- Collaborate closely with Data Governance, Business Architecture, Data owners, etc.
- Conduct workshops within scrum teams and across business teams, effectively documenting minutes and driving actions

This role offers a comprehensive benefits package, prioritizes wellbeing, supports development, and provides flexibility in work arrangements. Fidelity is committed to ensuring a motivating work environment where employees feel valued and part of a team. Visit careers.fidelityinternational.com to learn more about our work, approach to dynamic working, and opportunities for building a future with us.
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Noida
Remote
Role:
- Architect & manage data solutions using Snowflake & advanced SQL
- Design & implement data pipelines, data warehouses, & data lakes, ensuring efficient data transformation
- Develop best practices for data security, access control, & compliance
Required Candidate Profile:
- Experience: 8-14 yrs
- Strong data architect; SQL & Snowflake experience is a must
- Collaborate with cross-functional teams to integrate requirements & translate them into robust data architectures
- Manufacturing industry experience is a must
Posted 1 month ago
5.0 - 6.0 years
5 - 6 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider with operations in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities, and the planet.
Job Title: Software Engineer Practitioner
Location: Chennai
Work Type: Hybrid

Position Description:
We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP-native technologies like BigQuery, Dataform, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client.

Basic Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study
- 5+ years: Strong understanding of database concepts and experience with multiple database technologies, optimizing query and data processing performance
- 5+ years: Full stack data engineering competency in a public cloud (Google)
- Critical thinking skills to propose data solutions, test them, and make them a reality
- 5+ years: Highly proficient in SQL, Python, and Java; experience programming engineering transformations in Python or a similar language
- 5+ years: Ability to work effectively across organizations, product teams, and business partners
- 5+ years: Knowledge of Agile (Scrum) methodology and experience writing user stories
- Deep understanding of data service ecosystems, including data warehouses, lakes, and marts
- User experience advocacy through empathetic stakeholder relationships
- Effective communication both internally (with team members) and externally (with stakeholders)
- Knowledge of data warehouse concepts and experience with data warehouse/ETL processes
- Strong process discipline and thorough understanding of IT processes (ISP, data security)

Skills Required: Data Architecture, Data Warehousing, Dataform, Google Cloud Platform (BigQuery, Dataflow, Dataproc, Data Fusion), Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API

Experience Required:
- Excellent communication, collaboration, and influence skills; ability to energize a team
- Knowledge of data, software, and architecture operations; data engineering and data management standards, governance, and quality
- Hands-on experience in Python using libraries like NumPy, Pandas, etc.
- Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Dataform, Pub/Sub
- Experience with recoding, re-developing, and optimizing data operations, data science, and analytical workflows and products

Experience Required: 5+ Years
Education Required: Bachelor's Degree
TekWissen Group is an equal opportunity employer supporting workforce diversity.
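Data governance on a platform like this usually starts with validating rows before they reach the warehouse. A hedged sketch in plain Python: the schema and field names are invented, and in a real pipeline the `good` batch would feed a BigQuery load job (or a Dataflow/Pub/Sub stage) while `bad` rows go to a dead-letter table:

```python
# Lightweight row-level schema validation ahead of a warehouse load.
# SCHEMA and the field names are illustrative, not from a real system.
SCHEMA = {"event_id": int, "user": str, "amount": float}

def validate(row):
    """Return a list of violations for one row against SCHEMA."""
    problems = []
    for field, ftype in SCHEMA.items():
        if field not in row:
            problems.append(f"missing:{field}")
        elif not isinstance(row[field], ftype):
            problems.append(f"type:{field}")
    return problems

batch = [
    {"event_id": 1, "user": "a@x.com", "amount": 9.99},
    {"event_id": "2", "user": "b@x.com", "amount": 5.0},  # bad type
]
good = [r for r in batch if not validate(r)]
bad  = [r for r in batch if validate(r)]
print(len(good), len(bad))  # 1 1
```

Routing rejects to a dead-letter destination, rather than failing the whole load, keeps the pipeline flowing while preserving the evidence for data-quality follow-up.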
Posted 1 month ago
6.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Role Description: As a Data Engineering Lead, you will play a crucial role in overseeing the design, development, and maintenance of our organization's data architecture and infrastructure. You will be responsible for designing and developing the architecture for the data platform that ensures the efficient and effective processing of large volumes of data, enabling the business to make informed decisions based on reliable and high-quality data. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a proven track record of successfully managing complex data projects. Responsibilities : Data Architecture and Design : Design and implement scalable and efficient data architectures to support the organization's data processing needs Work closely with cross-functional teams to understand data requirements and ensure that data solutions align with business objectives ETL Development : Oversee the development of robust ETL processes to extract, transform, and load data from various sources into the data warehouse Ensure data quality and integrity throughout the ETL process, implementing best practices for data cleansing and validation Big Data Technology - Stay abreast of emerging trends and technologies in big data and analytics, and assess their applicability to the organization's data strategy Implement and optimize big data technologies to process and analyze large datasets efficiently Cloud Integration: Collaborate with the IT infrastructure team to integrate data engineering solutions with cloud platforms, ensuring scalability, security, and performance. Performance Monitoring and Optimization : Implement monitoring tools and processes to track the performance of data pipelines and proactively address any issues Optimize data processing. 
Documentation: Maintain comprehensive documentation for data engineering processes, data models, and system architecture. Ensure that team members follow documentation standards and best practices. Collaboration and Communication: Collaborate with data scientists, analysts, and other stakeholders to understand their data needs and deliver solutions that meet those requirements. Communicate effectively with technical and non-technical stakeholders, providing updates on project status, challenges, and opportunities. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 6-8 years of professional experience in data engineering. In-depth knowledge of data modeling, ETL processes, and data warehousing. In-depth knowledge of building data warehouses using Snowflake. Should have experience in data ingestion, data lakes, data mesh, and data governance. Must have experience in Python programming. Strong understanding of big data technologies and frameworks, such as Hadoop, Spark, and Kafka. Experience with cloud platforms, such as AWS, Azure, or Google Cloud. Familiarity with database systems (SQL and NoSQL) and data pipeline orchestration tools. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Proven ability to work collaboratively in a fast-paced, dynamic environment.
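The ETL responsibilities described above follow a common extract-transform-load shape: pull raw records, apply quality rules and type casts, then load the survivors into a warehouse table. A toy sketch using the stdlib `sqlite3` module as a stand-in for a warehouse such as Snowflake (the table, fields, and CSV payload are invented for illustration):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; one row has a missing amount and should be dropped.
raw = io.StringIO("order_id,amount\n1,19.99\n2,\n3,5.00\n")

def extract(fh):
    """Read the raw feed into a list of dicts."""
    return list(csv.DictReader(fh))

def transform(rows):
    """Apply a basic quality rule (non-empty amount) and cast types."""
    return [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(conn, rows):
    """Create the target table and bulk-insert the clean rows."""
    conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # 2 valid rows loaded; summed amount is about 24.99
```

In a real pipeline each stage would be a separate, monitored step, but the extract/transform/load boundaries stay the same.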
2.0 - 5.0 years
20 - 25 Lacs
Hyderabad
Work from Office
About the Role We are looking for an Analytics Engineer with 2+ years of experience to help build and maintain our modern data platform. You'll work with dbt, Snowflake, and Airflow to develop clean, well-documented, and trusted datasets. This is a hands-on role ideal for someone who wants to grow their technical skills while contributing to a high-impact analytics function. Key Responsibilities Build and maintain scalable data models using dbt and Snowflake Develop and orchestrate data pipelines with Airflow or similar tools Partner with teams across DAZN to translate business needs into robust datasets Ensure data quality through testing, validation, and monitoring practices Follow best practices in code versioning, CI/CD, and data documentation Contribute to the evolution of our data architecture and team standards What We're Looking For 2+ years of experience in analytics/data engineering or similar roles Strong skills in SQL and working knowledge of cloud data warehouses (Snowflake preferred) Experience with dbt for data modeling and transformation Familiarity with Airflow or other workflow orchestration tools Understanding of ELT processes, data modeling, and data governance principles Strong collaboration and communication skills Nice to Have Experience working in media, OTT, or sports technology domains Familiarity with BI tools like Looker, Tableau, or Power BI Exposure to testing frameworks like dbt tests or Great Expectations
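Orchestrators such as Airflow model a pipeline as a directed acyclic graph and run each task only after its dependencies have completed. The core scheduling idea can be sketched with the standard library's `graphlib`; this is a concept sketch with hypothetical task names, not Airflow code:

```python
from graphlib import TopologicalSorter

# Dependencies for a hypothetical daily pipeline: each task maps to the
# set of tasks that must finish before it can run.
deps = {
    "stage_raw": set(),
    "build_dim_customer": {"stage_raw"},
    "build_fct_orders": {"stage_raw"},
    "publish_dashboard": {"build_dim_customer", "build_fct_orders"},
}

# static_order() yields a valid execution order respecting every edge.
order = list(TopologicalSorter(deps).static_order())
print(order[0], order[-1])  # stage_raw runs first, publish_dashboard last
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks (here, the two `build_*` models could run concurrently), but the dependency-ordering core is exactly this topological sort.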
8.0 - 12.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Job Summary We are seeking an experienced Data Architect with expertise in Snowflake, dbt, Apache Airflow, and AWS to design, implement, and optimize scalable data solutions. The ideal candidate will play a critical role in defining data architecture, governance, and best practices while collaborating with cross-functional teams to drive data-driven decision-making. Key Responsibilities Data Architecture & Strategy: Design and implement scalable, high-performance cloud-based data architectures on AWS. Define data modelling standards for structured and semi-structured data in Snowflake. Establish data governance, security, and compliance best practices. Data Warehousing & ETL/ELT Pipelines: Develop, maintain, and optimize Snowflake-based data warehouses. Implement dbt (Data Build Tool) for data transformation and modelling. Design and schedule data pipelines using Apache Airflow for orchestration. Cloud & Infrastructure Management: Architect and optimize data pipelines using AWS services like S3, Glue, Lambda, and Redshift. Ensure cost-effective, highly available, and scalable cloud data solutions. Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to align data solutions with business goals. Provide technical guidance and mentoring to the data engineering team. Performance Optimization & Monitoring: Optimize query performance and data processing within Snowflake. Implement logging, monitoring, and alerting for pipeline reliability. Required Skills & Qualifications 10+ years of experience in data architecture, engineering, or related roles. Strong expertise in Snowflake, including data modeling, performance tuning, and security best practices. Hands-on experience with dbt for data transformations and modeling. Proficiency in Apache Airflow for workflow orchestration. Strong knowledge of AWS services (S3, Glue, Lambda, Redshift, IAM, EC2, etc.). Experience with SQL, Python, or Spark for data processing. 
Familiarity with CI/CD pipelines and Infrastructure-as-Code (Terraform/CloudFormation) is a plus. Strong understanding of data governance, security, and compliance (GDPR, HIPAA, etc.). Preferred Qualifications Certifications: AWS Certified Data Analytics - Specialty, Snowflake SnowPro Certification, or dbt Certification. Experience with streaming technologies (Kafka, Kinesis) is a plus. Knowledge of modern data stack tools (Looker, Power BI, etc.). Experience in OTT streaming would be an added advantage.
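The "logging, monitoring, and alerting for pipeline reliability" responsibility above is often implemented as a retry wrapper that logs each failure and re-raises after the final attempt so an alerting system can pick it up. A minimal stdlib sketch; the `flaky_load` task and logger name are hypothetical:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(fn, attempts=3, delay=0.0):
    """Run fn, logging each failure and re-raising after the last attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == attempts:
                raise  # let the final failure surface for alerting
            time.sleep(delay)

calls = {"n": 0}

def flaky_load():
    # Hypothetical task: fails on the first call, succeeds on the second.
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient warehouse error")
    return "loaded"

result = with_retries(flaky_load)
print(result)  # loaded
```

Production systems usually add exponential backoff and route the logged warnings to a monitoring backend, but the log-retry-reraise structure is the common core.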
3.0 - 6.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us. Summary: As a Data Engineer based out of our BMS Hyderabad office, you are part of the Data Platform team, supporting the larger Data Engineering community that delivers data and analytics capabilities across different IT functional domains. The ideal candidate will have a strong background in data engineering, DataOps, and cloud-native services, and will be comfortable working with both structured and unstructured data. Key Responsibilities The Data Engineer will be responsible for designing, building, and maintaining ETL pipelines and data products, evolving those data products, and utilizing the most suitable data architecture for our organization's data needs. Responsible for delivering high-quality data products and analytics-ready data solutions. Work with an end-to-end ownership mindset; innovate and drive initiatives through completion.
Develop and maintain data models to support our reporting and analysis needs. Optimize data storage and retrieval to ensure efficient performance and scalability. Collaborate with data architects, data analysts, and data scientists to understand their data needs and ensure that the data infrastructure supports their requirements. Ensure data quality and integrity through data validation and testing. Implement and maintain security protocols to protect sensitive data. Stay up-to-date with emerging trends and technologies in data engineering and analytics. Closely partner with the Enterprise Data and Analytics Platform team, other functional data teams, and the Data Community lead to shape and adopt data and technology strategy. Serves as the Subject Matter Expert on Data & Analytics Solutions. Knowledgeable in evolving trends in data platforms and product-based implementation. Has an end-to-end ownership mindset in driving initiatives through completion. Comfortable working in a fast-paced environment with minimal oversight. Mentors other team members effectively to unlock full potential. Prior experience working in an Agile/product-based environment. Qualifications & Experience 7+ years of hands-on experience implementing and operating data capabilities and cutting-edge data solutions, preferably in a cloud environment. Breadth of experience in technology capabilities that span the full life cycle of data management, including data lakehouses, master/reference data management, data quality, and analytics/AI-ML, is needed. In-depth knowledge and hands-on experience with AWS Glue services and the AWS data engineering ecosystem. Hands-on experience developing and delivering data and ETL solutions with technologies like AWS data services (Redshift, Athena, Lake Formation, etc.
), Cloudera Data Platform, or Tableau is a plus. 5+ years of experience in data engineering or software development. Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional/non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Strong programming skills in languages and libraries such as Python, R, PyTorch, PySpark, Pandas, and Scala. Experience with SQL and database technologies such as MySQL, PostgreSQL, Presto, etc. Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform. Strong analytical and problem-solving skills. Excellent communication and collaboration skills. Functional knowledge or prior experience in the Life Sciences Research and Development domain is a plus. Experience and expertise in establishing agile and product-oriented teams that work effectively with teams in the US and other global BMS sites. Initiates challenging opportunities that build strong capabilities for self and team. Demonstrates a focus on improving processes, structures, and knowledge within the team. Leads in analyzing current states, delivers strong recommendations grounded in an understanding of the environment's complexity, and executes to bring complex solutions to completion. If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career. With a single vision as inspiring as "Transforming patients' lives through science", every BMS employee plays an integral role in work that goes far beyond ordinary.
Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues. BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role: Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function. BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement. BMS cares about your well-being and the well-being of our staff, customers, patients, and communities.
As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/ Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
3.0 - 7.0 years
12 - 17 Lacs
Hyderabad
Work from Office
5.0 - 8.0 years
25 - 40 Lacs
Bengaluru
Hybrid
We are currently hiring for a Data Architect role with a leading technology-driven team focused on building scalable, AI-integrated systems. Please find the job details below: Role: Data Architect Location: Bangalore Experience: 5-8 years Work mode: Hybrid (3 days/week from office) Job description: High-Level Requirements: Design scalable, secure, and flexible data architecture spanning the entire PIQ lifecycle, from PoC through PIQ v1, v2, and v3. Architect hybrid graph-relational data models capturing complex multi-tier supplier-component relationships. Design data ingestion pipelines supporting external data and Bring Your Own Data (BYOD) integration. Ensure data governance, quality, lineage, and compliance across all product versions. Collaborate with technical leadership to define long-term data strategy and roadmap. Plan for scalability to cover extended supplier networks and support advanced analytics, risk scoring, and scenario modelling. Job Requirements: Extensive experience in enterprise data architecture and design. Expertise in graph and relational database technologies. Strong knowledge of data governance, security, and compliance. Proven ability to architect scalable data platforms integrating diverse data sources. Excellent leadership, strategic thinking, and collaboration skills. Regards, Arun Kumar A
18.0 - 23.0 years
15 - 19 Lacs
Hyderabad
Work from Office
About the Role We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 18 years of experience in data engineering and analytics and a proven track record of designing and implementing complex data solutions. As a Senior Principal Data Architect, you will be expected to design, create, deploy, and manage Blackbaud's data architecture. This role has considerable technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifically with data, of other projects. What you'll do Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services. Set, communicate, and facilitate technical direction more broadly for the AI Center of Excellence, and collaboratively beyond the Center of Excellence. Design and develop breakthrough products, services, or technological advancements in the Data Intelligence space that expand our business. Work alongside product management to craft technical solutions to solve customer business problems. Own the technical data governance practices and ensure data sovereignty, privacy, security, and regulatory compliance. Continuously challenge the status quo of how things have been done in the past. Build a data access strategy to securely democratize data and enable research, modelling, machine learning, and artificial intelligence work. Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice. Work in a cross-functional team to translate business needs into data architecture solutions. Ensure data solutions are built for performance, scalability, and reliability. Mentor junior data architects and team members.
Keep current on technology: distributed computing, big data concepts, and architecture. Promote internally how data within Blackbaud can help change the world. What you'll bring 18+ years of experience in data and advanced analytics At least 8 years of experience working on data technologies in Azure/AWS Expertise in SQL and Python Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies Expertise in Databricks and Microsoft Fabric Strong understanding of data modeling, data warehousing, data lakes, data mesh, and data products Experience with machine learning Excellent communication and leadership skills Preferred Qualifications Experience working with .Net/Java and microservice architecture Stay up to date on everything Blackbaud; follow us on LinkedIn, X, Instagram, Facebook and YouTube. Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.
4.0 - 8.0 years
7 - 11 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Job Title: Erwin Data Modeler, Insurance domain Location: Any Job Type: Full-Time | 2-11pm Shift Job Summary We are seeking a skilled and experienced Data Modeler with hands-on expertise in Erwin Data Modeler to join our team. The ideal candidate will have a strong background in data architecture and modeling, with a minimum of 4 years of relevant experience. Knowledge of the insurance domain is a significant plus. Key Responsibilities Design, develop, and maintain conceptual, logical, and physical data models using Erwin Data Modeler. Collaborate with business analysts, data architects, and developers to understand data requirements and translate them into data models. Ensure data models align with enterprise standards and best practices. Perform data analysis and profiling to support modeling efforts. Maintain metadata and documentation for data models. Support data governance and data quality initiatives. Participate in reviews and provide feedback on data models and database designs. Required Skills & Qualifications Strong understanding of data modeling concepts including normalization, denormalization, and dimensional modeling. Knowledge of any relational database will be an advantage. Familiarity with data warehousing and ETL processes. Excellent analytical and problem-solving skills. Strong communication and collaboration abilities.
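Dimensional modeling, named in the qualifications above, organizes data into fact tables (measures) surrounded by dimension tables (descriptive attributes), forming a star schema. A minimal sketch using stdlib `sqlite3`, with invented insurance-flavored table and column names:

```python
import sqlite3

# Minimal star schema: one dimension table, one fact table.
# All names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_policy (
    policy_key INTEGER PRIMARY KEY,
    product    TEXT,
    region     TEXT
);
CREATE TABLE fct_claim (
    claim_id   INTEGER PRIMARY KEY,
    policy_key INTEGER REFERENCES dim_policy(policy_key),
    paid_amt   REAL
);
""")
conn.executemany("INSERT INTO dim_policy VALUES (?, ?, ?)",
                 [(1, "auto", "west"), (2, "home", "east")])
conn.executemany("INSERT INTO fct_claim VALUES (?, ?, ?)",
                 [(10, 1, 500.0), (11, 1, 250.0), (12, 2, 900.0)])

# A typical dimensional query: roll up a fact measure by a dimension attribute.
rows = conn.execute("""
    SELECT d.product, SUM(f.paid_amt)
    FROM fct_claim f JOIN dim_policy d USING (policy_key)
    GROUP BY d.product ORDER BY d.product
""").fetchall()
print(rows)  # [('auto', 750.0), ('home', 900.0)]
```

The design choice is the classic dimensional trade-off: the fact table stays narrow (keys plus measures) while descriptive attributes live in the dimension, so adding a new attribute like policyholder age never touches the fact rows.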
16.0 - 18.0 years
50 - 60 Lacs
Bengaluru
Work from Office
Join us as a Data Engineer You'll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making. We'll look to you to drive the build of effortless, digital-first customer experiences. If you're ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you're looking for. We're offering this role at vice president level. What you'll do As a Data Engineer, you'll be looking to simplify our organisation by developing innovative data-driven solutions through data pipelines, modelling, and ETL design, aspiring to be commercially successful while keeping our customers, and the bank's data, safe and secure. You'll drive customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to gather and build data solutions. You'll support our strategic direction by engaging with the data engineering community to deliver opportunities, along with carrying out complex data engineering tasks to build a scalable data architecture. Your responsibilities will also include: Building advanced automation of data engineering pipelines through removal of manual stages Embedding new data techniques into our business through role modelling, training, and experiment design oversight Delivering a clear understanding of data platform costs to meet your department's cost saving and income targets Sourcing new data using the most appropriate tooling for the situation Developing solutions for streaming data ingestion and transformations in line with our streaming strategy The skills you'll need To thrive in this role, you must have twelve years' experience, and you'll need a strong understanding of data usage and dependencies and experience of extracting value and features from large-scale data. You'll also bring practical experience of programming languages alongside knowledge of data and software engineering fundamentals.
Additionally, you'll need: Experience of data engineering toolsets such as Airflow, RDBMS tools like PostgreSQL/Oracle/DB2, Snowflake, S3, EMR/Databricks, and data pipelines Proficiency in Python, PySpark, SQL, CI/CD pipelines, and Git version control Experience in reporting tools such as QuickSight would be an added advantage Good understanding of database, data warehouse, and ETL concepts Strong communication skills with the ability to proactively engage and manage a wide range of stakeholders Hours: 45 Job Posting Closing Date: 29/07/2025
6.0 - 11.0 years
8 - 13 Lacs
Pune
Work from Office
About Forma.ai: Forma.ai is a Series B startup that's revolutionizing how sales compensation is designed, managed, and optimized. We handle billions in annual managed commissions for market leaders like Edmentum, Stryker, and Autodesk. Our growth has been fuelled by our passion for fundamentally changing and shaping how companies use sales intelligence to drive business strategy. We're welcoming equally driven individuals who are excited about creating something big! About the Team: The Customer Operations team works closely with new and existing customers by implementing product features, managing the operational parts of the platform, and optimizing our clients' sales performance management processes. We are always ready to support and help our customers identify ways they can unleash the revenue-driving potential of their sales compensation program. If you're passionate about data analytics and want to contribute to sales operations, we'd love to hear from you! What you'll be doing: Work with new and existing customers to implement the company's new platform features as well as manage and optimize client processes Learn the architecture and design of the company's platform to the extent of being able to independently complete updates, enhancements, and change requests Lead onboarding activities, including requirements gathering for incentive compensation plan rules, data analysis, and quality control Assist the ongoing operations for a portfolio of existing customers and implement new features (i.e.
rule building, process execution, reporting and dashboarding, and product support) Scope, build, and test automated workflows, processes, and reporting capabilities to support automation of incentive compensation processes and improve business visibility Support design projects including analysis, financial modelling, project planning, customer workshops, and presentation/recommendation of findings Interact with key customer stakeholders to coordinate project execution Hire, guide, and coach a team of Analysts, Associates, and Managers to be high-performing and client-focused Act as the main point of contact for senior customer stakeholders Act as a key point of contact to articulate customer feedback and support the development of new product features Act as a key point of contact for the company when implementing new platform features across customers to support continuous product improvement What we're looking for: Education or background in Engineering, Commerce, Mathematics, and/or Statistics 6+ years of working experience Experience working with large datasets (SQL) and a strong understanding of logical structures of databases and workflows is a plus Strength in Excel to perform data profiling, segmentation, and aggregation Understanding of data architecture and process flows Analytical problem-solving ability, organizational skills, and project management skills Ability to take ownership and run with tasks in a fast-paced and evolving environment Understanding of common analytical and presentation tools (i.e., Excel, PowerPoint) Nice to haves: Experience with SQL and/or Python Experience with sales incentive compensation Our values: Work well, together. We're real. We have kids and pets. Mortgages and student loans. We're in this together, so no matter how brilliant any one of us is, we always play nice with one another - no exceptions. Be precise. Be relentless. We believe complacency breeds failure, so we set new goals as quickly as we achieve them.
We persist in the face of adversity, learn from our mistakes, and push each other to continuously improve. The status quo is kryptonite. Love our tech. Love our customers. Our platform solves a very complex problem in a currently underserved market. While not everyone at Forma is customer-facing, we're all customer-focused. Maybe even slightly customer-obsessed. Our commitment to you: We know that applying to a new role takes a lot of effort. You're encouraged to apply even if your experience doesn't precisely match the job description. There are many paths to a successful career and we're looking forward to reading yours. We thank all applicants for their interest.