6.0 - 8.0 years
8 - 10 Lacs
Hyderabad
Work from Office
Position: Sr. BI Developer
Work Location: Hyderabad | Mode of work: Hybrid | Experience: 6 to 8 years

Summary: We are seeking a skilled Senior Spotfire Developer with 6-8 years of experience to join our analytics team. The ideal candidate will bring expertise in TIBCO Spotfire and a strong foundation in business intelligence and data visualization. This role involves developing, optimizing, and supporting interactive dashboards and reports that provide key insights to support data-driven decision-making. You will work closely with cross-functional teams, including data analysts, engineers, and business stakeholders, to deliver impactful solutions that meet business objectives.

Key Responsibilities:
- Spotfire Development and Customization: Design, develop, and deploy Spotfire applications, dashboards, and reports to support various business units in data-driven initiatives.
- Requirement Analysis: Collaborate with stakeholders to gather and understand requirements, translating them into technical solutions within Spotfire.
- Data Integration and Transformation: Use data blending and transformation techniques to prepare data for analysis in Spotfire, ensuring quality and integrity.
- Performance Optimization: Implement best practices for data loading, caching, and optimization to ensure responsive and efficient dashboards.
- Customization and Scripting: Enhance Spotfire functionality through scripting (IronPython, JavaScript) and integrate R or Python when needed for advanced analytics.
- Documentation and Support: Maintain documentation for dashboards and reports and provide support to users, addressing any technical or functional issues.

Qualifications:
- Education: Bachelor's degree in Computer Science, Data Analytics, Information Systems, or a related field.
- Experience: 4-6 years in BI development, with 4+ years specifically in TIBCO Spotfire.
- Technical Skills: Proficiency in TIBCO Spotfire, including visualization techniques and dashboard configuration. Strong SQL skills and experience with data modeling and data blending. Scripting experience in IronPython and/or JavaScript; knowledge of R or Python for advanced Spotfire functionality. Familiarity with data integration tools such as Informatica, Alteryx, or equivalent.
- Analytical Skills: Ability to interpret complex data sets and create visually appealing, user-friendly dashboards.
- Soft Skills: Strong communication and interpersonal skills with the ability to work collaboratively in a team setting.

Preferred Skills:
- Experience with other BI tools (e.g., Power BI, Tableau) is a plus.
- Understanding of machine learning and predictive analytics in a BI context.
- Exposure to cloud platforms like AWS, Azure, or Google Cloud.
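As a rough illustration of the IronPython scripting this posting calls out, the hedged sketch below shows the kind of script that might be attached to a Spotfire text-area action control. The document property name, the script parameter, and the visualization are hypothetical and would differ in a real analysis file.

```python
# Hedged sketch of a Spotfire IronPython action script (hypothetical names).
# Assumes a script parameter 'viz' of type Visualization was configured on the
# action control, and that other visuals read the "SelectedRegion" property.
from Spotfire.Dxp.Application.Visuals import VisualContent

# Update a document property that drives calculated columns or filtering
# expressions elsewhere in the analysis.
Document.Properties["SelectedRegion"] = "EMEA"

# Retitle the chosen visualization to reflect the new selection.
content = viz.As[VisualContent]()
content.Title = "Sales overview - " + Document.Properties["SelectedRegion"]
```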
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
About Us
Observe.AI is transforming customer service with AI agents that speak, think, and act like your best human agents, helping enterprises automate routine customer calls and workflows, support agents in real time, and uncover powerful insights from every interaction. With Observe.AI, businesses boost automation, deliver faster, more consistent 24/7 service, and build stronger customer loyalty. Trusted by brands like Accolade, Prudential, Concentrix, Cox Automotive, and Included Health, Observe.AI is redefining how businesses connect with customers, driving better experiences and lasting relationships at every touchpoint.

The Opportunity
We are looking for a Senior Data Engineer with strong hands-on experience in building scalable data pipelines and real-time processing systems. You will be part of a high-impact team focused on modernizing our data architecture, enabling self-serve analytics, and delivering high-quality data products. This role is ideal for engineers who love solving complex data challenges, have a growth mindset, and are excited to work on both batch and streaming systems.

What you'll be doing:
- Build and maintain real-time and batch data pipelines using tools like Kafka, Spark, and Airflow.
- Contribute to the development of a scalable LakeHouse architecture using modern data formats such as Delta Lake, Hudi, or Iceberg.
- Optimize data ingestion and transformation workflows across cloud platforms (AWS, GCP, or Azure).
- Collaborate with Analytics and Product teams to deliver data models, marts, and dashboards that drive business insights.
- Support data quality, lineage, and observability using modern practices and tools.
- Participate in Agile processes (Sprint Planning, Reviews) and contribute to team knowledge sharing and documentation.
- Contribute to building data products for inbound (ingestion) and outbound (consumption) use cases across the organization.

Who you are:
- 5-8 years of experience in data engineering or backend systems with a focus on large-scale data pipelines.
- Hands-on experience with streaming platforms (e.g., Kafka) and distributed processing tools (e.g., Spark or Flink).
- Working knowledge of LakeHouse formats (Delta/Hudi/Iceberg) and columnar storage like Parquet.
- Proficient in building pipelines on AWS, GCP, or Azure using managed services and cloud-native tools.
- Experience with Airflow or similar orchestration platforms.
- Strong in data modeling and optimizing data warehouses like Redshift, BigQuery, or Snowflake.
- Exposure to real-time OLAP tools like ClickHouse, Druid, or Pinot.
- Familiarity with observability tools such as Grafana, Prometheus, or Loki.
- Some experience integrating data with MLOps tools like MLflow, SageMaker, or Kubeflow.
- Ability to work with Agile practices using JIRA and Confluence, and to participate in engineering ceremonies.

Compensation, Benefits and Perks
- Excellent medical insurance options and free online doctor consultations
- Yearly privilege and sick leaves as per the Karnataka S&E Act
- Generous holidays (national and festive), recognition, and parental leave policies
- Learning & Development fund to support your continuous learning journey and professional development
- Fun events to build culture across the organization
- Flexible benefit plans for tax exemptions (e.g., meal card, PF, etc.)

Our Commitment to Inclusion and Belonging
Observe.AI is an Equal Employment Opportunity employer that proudly pursues and hires a diverse workforce. Observe.AI does not make hiring or employment decisions on the basis of race, color, religion or religious belief, ethnic or national origin, nationality, sex, gender, gender identity, sexual orientation, disability, age, military or veteran status, or any other basis protected by applicable local, state, or federal laws or prohibited by Company policy. Observe.AI also strives for a healthy and safe workplace and strictly prohibits harassment of any kind. We welcome all people. We celebrate diversity of all kinds and are committed to creating an inclusive culture built on a foundation of respect for all individuals. We seek to hire, develop, and retain talented people from all backgrounds. Individuals from non-traditional backgrounds, historically marginalized or underrepresented groups are strongly encouraged to apply. If you are ambitious, make an impact wherever you go, and you're ready to shape the future of Observe.AI, we encourage you to apply. For more information, visit www.observe.ai.
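For the Kafka/Spark/Delta Lake stack this posting describes, the following is a minimal, hedged sketch of a Spark Structured Streaming job that reads a Kafka topic and appends it to a Delta Lake table. The broker address, topic name, and S3 paths are placeholders, and a production pipeline would add schema parsing, data-quality checks, and monitoring.

```python
# Minimal sketch: stream events from Kafka into a Delta Lake table.
# Broker, topic, and paths are hypothetical; assumes the Delta Lake and
# Kafka connector packages are available on the Spark cluster.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "interaction-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to string before downstream parsing.
events = raw.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/interaction-events")
    .outputMode("append")
    .start("s3://example-bucket/lake/bronze/interaction_events")
)
query.awaitTermination()
```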
Posted 2 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Manager, Data Engineer
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview:
For the Data Engineer role, we are looking for a professional with experience in designing, developing, and maintaining data pipelines. We intend to make data reliable, governed, secure, and available for analytics within the organization. As part of a team, this role will be responsible for data management with a broad range of activities, from data ingestion to cloud data lakes and warehouses, quality control, metadata management, and orchestration of machine learning models. We are also forward looking and plan to bring innovations like data mesh and data fabric into our ecosystem of tools and processes.

What will you do in this role:
- Design, develop, and maintain data pipelines to extract data from a variety of sources and populate the data lake and data warehouse.
- Develop data transformation rules and data modeling capabilities.
- Collaborate with Data Analysts, Data Scientists, and Machine Learning Engineers to identify and transform data for ingestion, exploration, and modeling.
- Work with the data governance team to implement data quality checks and maintain data catalogs.
- Use orchestration, logging, and monitoring tools to build resilient pipelines.
- Use test-driven development methodology when building ELT/ETL pipelines.
- Develop pipelines to ingest data into cloud data warehouses.
- Analyze data using SQL.
- Use serverless AWS services like Glue, Lambda, and Step Functions.
- Use Terraform code to deploy on AWS.
- Containerize Python code using Docker.
- Use Git for version control and understand various branching strategies.
- Build pipelines to work with large datasets using PySpark.
- Develop proofs of concept using Jupyter Notebooks.
- Work as part of an agile team.
- Create technical documentation as needed.

What should you have:
- 4-8 years of relevant experience
- Good experience with AWS services like S3, ECS, Fargate, Glue, Step Functions, CloudWatch, Lambda, and EMR
- Any AWS developer or architect certification
- Agile development methodology
- SQL
- Proficiency in Python and PySpark
- Good with Git, Docker, and Terraform
- Ability to work in cross-functional teams
- Bachelor's degree or equivalent experience in a relevant field such as Engineering (preferably computer engineering) or Computer Science

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who we are / What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today. #HYDIT2025

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Agile Application Development, Agile Methodology, Branching Strategy, Business, Business Intelligence (BI), Business Partnerships, Computer Science, Database Administration, Data Engineering, Data Management, Data Modeling, Data Pipelines, Data Quality Control, Data Visualization, Design Applications, Digital Transformation, Information Management, Information Technology Operations, IT Operation, Management Process, Social Collaboration, Software Development, Software Development Life Cycle (SDLC), SQL Data Analysis
Job Posting End Date: 07/31/2025 (a job posting is effective until 11:59:59 PM on the day before the listed end date, so please apply no later than the day before the job posting end date).
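As a hedged illustration of the Glue/PySpark work described above, the sketch below shows the skeleton of an AWS Glue PySpark job that reads a Glue Data Catalog table and writes partitioned Parquet to S3. The database, table, column, and bucket names are placeholders, not details from the posting.

```python
# Hedged sketch of an AWS Glue PySpark job (catalog/table/bucket names are
# placeholders). Glue supplies JOB_NAME via job arguments at run time.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a raw table registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="sales_events"
)

# Simple transformation: drop obviously bad rows, derive a partition column.
df = raw.toDF().dropna(subset=["event_id", "event_ts"])
df = df.withColumn("event_date", F.to_date("event_ts"))

# Write curated output as partitioned Parquet for downstream analytics.
(
    df.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/sales_events/")
)

job.commit()
```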
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
":" As a Senior Software Engineer, you will play a key role in designing, developing, and maintaining complex software systems. You will drive technical initiatives, and mentor junior engineers. Your expertise will be instrumental in ensuring high-quality, scalable, and performant solutions that align with the companys architectural goals and business needs. You will contribute to technical strategy, architectural decisions, and process improvements, while fostering a culture of innovation, collaboration, and engineering excellence. Key Outcomes/Objectives: Design and implement robust, scalable, and high-performance software architectures. Lead and mentor junior engineers, fostering a culture of technical excellence and continuous learning. Ensure code quality, adherence to coding standards, and best practices across the team, acting as a champion for engineering rigor. Drive the resolution of complex technical challenges and contribute to the development of innovative solutions, leading the way in overcoming technical obstacles. Contribute to the development of technical roadmaps and strategic plans, influencing the future direction of the product/sub-product. Core Responsibilities: Technical Leadership and Architecture: Design and implement complex software components and features with a focus on scalability, performance, and maintainability. Contribute to sub-product or feature-level architectural decisions, ensuring alignment with overall system architecture. Lead technical discussions within the team, influencing design choices and engineering practices. Identify and mitigate technical risks early in the development lifecycle. Evaluate and recommend new technologies, frameworks, and tools to improve development efficiency. Code Development and Quality Assurance: Write clean, efficient, and well-documented code that adheres to coding standards and best practices . Lead code reviews and ensure adherence to quality standards across the team . Develop and maintain automated tests (unit, integration, and end-to-end) to improve software reliability. Identify and resolve performance bottlenecks, scalability issues, and technical debt. Mentorship and Team Collaboration: Mentor and guide junior engineers in technical development, best practices, and problem-solving. Lead technical discussions and knowledge-sharing sessions within the team, fostering a culture of continuous learning and collaboration. Be an active contributor in your Community of Practice: You play an active role in the OVO Engineering community on all things related to engineering, sharing practices and offering firsthand experience to the wider community Project Execution and Agile Practices: Participate in sprint planning, backlog refinement, and daily stand-ups to ensure timely and efficient delivery. Break down complex projects into well-defined, executable tasks and contribute to sprint commitments. Monitor delivery progress and technical dependencies, proactively resolving potential blockers. Contribute to technical roadmaps and long-term engineering strategies for sub-products and features. Documentation and Knowledge Sharing: Create and maintain technical documentation, including architecture diagrams, design documents, and API specifications. Share knowledge and expertise through presentations, workshops, and documentation. Contribute to the development of internal tools and libraries. 
Community of Practice : Contribute to the appropriate Community of Practice (CoP) for your role by leading discussions, sharing practices, offering firsthand experience to the wider community, engaging in knowledge exchange / cross-pollination to further your craft. Create content and and individually contribute to the stated successful outcomes for this CoP Qualifications: Education / Experience : Bachelors or Master\u2019s degree in a technical field or equivalent qualifications, or substantial industry experience demonstrating comparable expertise 5-8 years of hands-on software development experience with a strong track record of delivering high-quality code. Committed to technical excellence and clean code, with the ability to work in Agile, Lean software teams Proven experience in designing and implementing complex software architectures. Experience leading technical initiatives and mentoring junior engineers Ability to thrive in high-ownership environments Skills: Strong proficiency in multiple programming languages, including Node.js, Python, TypeScript, JavaScript, React Native, and React.js, with a focus on building and maintaining microservices-based architectures. Equivalent experience with related technologies and frameworks will also be considered. Deep understanding of software architecture, design patterns, and distributed systems. Experience with cloud platforms such as GCP and AWS (Azure is not preferred), along with expertise in containerization technologies like Docker and Kubernetes. Strong understanding of database systems and data modeling. Experience with CI/CD pipelines and automation tools. Strong leadership and mentorship skills. Excellent communication and interpersonal skills. Strong problem-solving and analytical skills. Ability to work independently and as part of a team. Strong attention to detail and a commitment to quality. Ability to learn and adapt to new technologies quickly. Strategic thinking and planning skills. ","Experience":"5-8","
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Position: Guidewire Integration QA
Experience: 5+ years

Key Responsibilities:
- Perform functional, integration, and end-to-end testing for Guidewire InsuranceSuite (PolicyCenter, BillingCenter, ClaimCenter) integrations.
- Validate APIs, web services, and data flows across various upstream/downstream systems.
- Develop and maintain test plans, test cases, and test scripts for integration scenarios.
- Work closely with developers and business analysts to ensure high-quality deliverables.
- Conduct regression testing, defect management, and root cause analysis.

Must-Have Skills:
- Strong knowledge of Guidewire InsuranceSuite (PC/BC/CC) with a focus on integration points.
- Hands-on experience with SOAP/REST API testing (Postman, SoapUI, or similar tools).
- Experience with SQL queries and database validation.
- Familiarity with test automation frameworks (e.g., Selenium, TestNG, or similar).
- Understanding of insurance domain processes.

Good to Have:
- Exposure to CI/CD pipelines (Jenkins, Git).
- Knowledge of Gosu scripting and the Guidewire data model.
- Experience with performance and security testing tools.
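To give a flavor of the SOAP/REST integration testing this role covers, here is a hedged pytest sketch that validates a REST endpoint's status code and payload shape. The base URL, path, and fields are entirely hypothetical and are not part of any Guidewire product API.

```python
# Hedged sketch of a REST integration test (hypothetical endpoint and fields).
# Run with: pytest test_policy_api.py
import requests

BASE_URL = "https://example-integration-env.local/api"  # placeholder


def test_get_policy_returns_expected_fields():
    response = requests.get(f"{BASE_URL}/policies/POL-1001", timeout=10)

    # Contract checks: status code, content type, and required fields.
    assert response.status_code == 200
    assert response.headers["Content-Type"].startswith("application/json")

    body = response.json()
    for field in ("policyNumber", "status", "effectiveDate"):
        assert field in body, f"missing field: {field}"

    # A cross-system consistency check (e.g. against a downstream billing
    # feed) would normally follow here; omitted in this sketch.
    assert body["policyNumber"] == "POL-1001"
```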
Posted 2 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Gurugram
Work from Office
Key Responsibilities:
1. Ensuring rake availability for loading in coordination with LSPs and Indian Railways, and achievement of Railway dispatches as per the monthly plan.
2. Coordination with the MSIL planning team for timely planning of rakes from respective plants.
3. Ensuring timely invoicing/retrieval at the plant in coordination with LSP/SND.
4. Ensuring on-time departure and timely arrival of rakes at the destination, thereby maintaining overall standard transit time.
5. Optimise TAT (turnaround time) of rakes by reducing arrival-to-placement, loading, and drawn-out time.
6. Coordination with LSPs and alignment of fleet for managing first-mile and last-mile dispatch as per MSIL norms.
7. Coordination with all stakeholders within the MSIL plant and Railways for resolution of issues (electricity failure/OHE failure, derailments, P-way, damages, etc.).
8. Preparation of business plans and strategies.
9. Preparation of MIS.
10. Coordination with teams at TVPs and the port for Railway dispatches.
11. RFQ and rate negotiation for new and existing destinations.
12. Railway liaisoning and overall coordination within MSIL and with LSPs.

Desired Skills:
- Strong knowledge of Channel Management (Dealers & Distribution)
- Proficiency in MS Excel and data modelling
- Knowledge of Power BI is preferred
- Data analysis and data visualization, with the ability to handle large data sets
- Strong interpersonal skills and a collaborative approach
- Key account management skills also preferred
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Chennai
Work from Office
Job Title: ServiceNow Developer
Location: Offshore (Remote)
Experience Required: 5+ years
Job Type: Full Time

Job Summary:
We are looking for a skilled and experienced ServiceNow Developer to join our team. The ideal candidate will have strong scripting capabilities and be well-versed in end-to-end ServiceNow development. You will play a critical role in designing, configuring, and customizing the platform to meet business requirements, particularly across ITSM modules and beyond. Candidates should be skilled in all ServiceNow development (not specific to ITAM), with extensive experience in ITSM.

Key Responsibilities:
- Develop and customize core applications using ServiceNow platform tools and best practices
- Write clean, scalable server-side and client-side scripts using JavaScript and Glide APIs
- Implement workflows, business rules, UI actions, client scripts, and scheduled jobs
- Integrate ServiceNow with third-party applications and systems
- Participate in design sessions, provide architectural guidance, and ensure scalability of solutions
- Troubleshoot and resolve application issues and defects
- Maintain documentation for technical designs and code changes
- Collaborate with stakeholders, including ITSM process owners, to gather requirements and deliver tailored solutions

Required Skills & Experience:
- 4+ years of hands-on experience with ServiceNow development
- Strong scripting experience (JavaScript, Glide, Script Includes, Business Rules, etc.)
- Expertise in ServiceNow ITSM modules (Incident, Problem, Change, CMDB, Knowledge)
- Solid understanding of ServiceNow architecture and data model
- Experience with Flow Designer, IntegrationHub, and REST/SOAP integrations
- Knowledge of Agile/Scrum methodologies
- Familiarity with Service Portal and custom widget development is a plus
- ServiceNow certifications (CSA, CAD) are desirable

Nice to Have:
- Experience with Scoped Applications and App Engine Studio
- Knowledge of ITOM, SecOps, or HRSD modules
- Exposure to CI/CD pipelines with ServiceNow
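Although ServiceNow development itself is done in JavaScript/Glide on the platform, the REST integration work mentioned above often starts with the Table API. Below is a hedged Python sketch that queries recent incidents over that API; the instance name and credentials are placeholders, and a real integration would typically use OAuth and a least-privilege integration user.

```python
# Hedged sketch: query recent incidents via the ServiceNow Table API.
# Instance and credentials are placeholders; prefer OAuth in real use.
import requests

INSTANCE = "example-instance"  # placeholder subdomain
URL = f"https://{INSTANCE}.service-now.com/api/now/table/incident"

params = {
    "sysparm_query": "active=true^ORDERBYDESCsys_created_on",
    "sysparm_limit": 5,
    "sysparm_fields": "number,short_description,priority,sys_created_on",
}

response = requests.get(
    URL,
    params=params,
    auth=("integration.user", "password"),  # placeholder credentials
    headers={"Accept": "application/json"},
    timeout=15,
)
response.raise_for_status()

for record in response.json().get("result", []):
    print(record["number"], "-", record["short_description"])
```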
Posted 2 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Noida, Bengaluru
Work from Office
Job Summary: We are seeking a highly skilled Technical Project Manager with strong expertise in Power BI, Azure, and Agile methodologies to lead and deliver data-driven projects. The ideal candidate will have a strong technical background combined with excellent leadership, coordination, and communication skills to drive successful project outcomes.

Key Responsibilities:
- Manage end-to-end delivery of BI and cloud-based projects across multiple teams.
- Work closely with business stakeholders to gather requirements and define project scope, timelines, and deliverables.
- Lead Agile ceremonies: sprint planning, daily stand-ups, retrospectives, and sprint reviews.
- Oversee development and deployment of Power BI dashboards and reports, ensuring high-quality data visualization and insights.
- Coordinate with Azure engineering teams for data pipelines, integration, and platform readiness.
- Identify project risks and develop mitigation strategies.
- Ensure projects are delivered on time, within scope, and within budget.
- Act as the primary point of contact for technical teams, business users, and leadership.

Technical Skills Required:
- Power BI: data modeling, DAX, Power Query, report development, performance optimization.
- Azure: Azure Data Factory, Azure Synapse, Azure SQL Database, Azure DevOps.
- Experience with cloud data architecture and integration.
- Familiarity with CI/CD pipelines and release management (Azure DevOps preferred).

Project Management Skills:
- 7+ years of experience managing technical projects.
- Strong expertise in Agile frameworks (Scrum, Kanban, SAFe).
- Experience managing cross-functional technical teams.
- Excellent documentation, reporting, and stakeholder management skills.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Designation: Power BI Lead / Sr. Developer
Work Location: Hyderabad (Hybrid)
Experience: 5 to 10 years

Responsibilities:
- Understand business requirements and translate them into technical specifications for Power BI reports and dashboards.
- Design, develop, and publish interactive dashboards using Power BI (Power BI Desktop & Power BI Service).
- Integrate data from various sources including SQL Server, Excel, SharePoint, and cloud-based sources (Azure, etc.).
- Build and optimize data models (star/snowflake schema) and DAX queries for performance and usability.
- Develop and manage ETL processes using Power Query, Dataflows, or Azure Data Factory.
- Implement row-level security (RLS) and access controls within Power BI reports.
- Perform data validation and ensure accuracy of reports and dashboards.
- Collaborate with stakeholders, business analysts, and data engineers to ensure report alignment with business goals.
- Monitor Power BI Service performance and schedule dataset refreshes.
- Stay up to date with Power BI updates, best practices, and industry trends.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of hands-on experience in Power BI report and dashboard development.
- Proficiency in DAX, Power Query (M language), and data modeling.
- Strong SQL skills, with the ability to write complex queries and optimize them.
- Experience in integrating Power BI with cloud data sources (e.g., Azure SQL, Data Lake).
- Familiarity with Power BI Gateway, Power BI Report Server (optional), and workspace management.
- Solid understanding of data warehousing concepts and relational databases.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.

Preferred Qualifications:
- Microsoft Power BI certifications.
- Exposure to other BI tools like Tableau or QlikView (optional).
- Experience working in Agile or Scrum teams.
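Scheduling and monitoring dataset refreshes, as this posting mentions, can also be driven programmatically through the Power BI REST API. The hedged Python sketch below triggers a refresh for one dataset in a workspace; the workspace and dataset IDs are placeholders, and acquiring the Azure AD access token (for example via the msal library) is assumed to have happened separately.

```python
# Hedged sketch: trigger a Power BI dataset refresh via the REST API.
# group_id / dataset_id are placeholders; 'access_token' is assumed to be an
# Azure AD bearer token obtained elsewhere (e.g. with msal).
import requests

group_id = "00000000-0000-0000-0000-000000000000"    # workspace ID (placeholder)
dataset_id = "11111111-1111-1111-1111-111111111111"  # dataset ID (placeholder)
access_token = "<bearer-token>"                      # placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{group_id}/datasets/{dataset_id}/refreshes"
)

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {access_token}"},
    json={"notifyOption": "MailOnFailure"},
    timeout=30,
)

# 202 Accepted means the refresh was queued; refresh history can be polled
# with a GET on the same endpoint.
print(resp.status_code)
```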
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Job Summary
We are seeking a skilled Guidewire BillingCenter Configuration Developer to join our dynamic team working on enterprise-scale insurance platforms. The ideal candidate will have strong hands-on experience with BillingCenter configuration, business rules, and UI customization using Gosu.

Key Responsibilities
- Configure and customize Guidewire BillingCenter components using Gosu
- Develop and maintain business rules, workflows, and UI elements
- Integrate BillingCenter with internal and third-party systems via SOAP/REST APIs
- Collaborate with business analysts, architects, and QA teams to deliver high-quality solutions
- Participate in Agile/Scrum ceremonies and contribute to sprint planning and delivery

Must-Have Skills
- 3+ years of experience with Guidewire BillingCenter (version 10.x preferred)
- Proficiency in the Gosu programming language
- Strong experience in BillingCenter configuration: PCFs, data model, business rules, plugins
- Good understanding of Guidewire product architecture and integration frameworks
- Hands-on experience with SOAP/REST APIs, GUnit, Jenkins, and Git

Nice-to-Have Skills
- Experience with Guidewire Cloud Platform (GWCP)
- Exposure to DevOps and CI/CD tools (Jenkins, Docker, Kubernetes)
- Familiarity with other Guidewire modules (PolicyCenter/ClaimCenter)
- Insurance domain knowledge (especially billing processes)
Posted 2 weeks ago
10.0 - 17.0 years
35 - 40 Lacs
Pune
Work from Office
Job Overview:
As a Sr. Specialist - Software Development, you will develop new product features/modules using best practices and provide maintenance to existing systems. All our products are solutions for the airline, transport, and travel industry using different technologies.

Responsibilities:
- Translate processes and enhancement specifications into programs.
- Develop and refine error-free code within agreed timescales using development techniques, tools, methods, and languages with the aim of optimizing operational efficiency.
- Evaluate changes and perform impact analysis.
- Work with functional staff to establish and clarify requirements.
- Investigate reported faults in operational code to determine changes and approaches to the code for promotion and replacement, conforming to established procedures.
- Design and prepare unit testing plan strategies and write test scripts to validate all new software development and enhancements.
- Take ownership of the test and implementation phases of projects.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 8+ years of experience in software development.
- Strong problem-solving and analytical skills.
- ASP.NET 4.0, C#, .NET Framework & MVC 3.0, Entity Framework, SQL, AWS Cloud, Node.js.
- Strong knowledge and practical experience with AWS services for backend development and deployment.
- Experience in implementing and maintaining Test-Driven Development (TDD) practices.
- Familiarity with database technologies (e.g., SQL, NoSQL databases) and data modeling.
- Understanding of RESTful API design principles and best practices.
- Solid understanding of software development methodologies and practices.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities in a team environment.

Preferred Skills:
- Experience with serverless architectures (AWS Lambda, etc.).
- Familiarity with CI/CD pipelines and related tools.
- XML, XSLT, REST API, LINQ.
- Knowledge of performance optimization techniques for backend systems.
- Understanding of security principles and best practices in backend development.
Posted 2 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Chennai
Work from Office
At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability, and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo.

ZoomInfo is seeking a Business Intelligence Analyst III to play a pivotal role in scaling our product analytics operations and empowering teams with the tools and insights they need to make fast, data-driven decisions. Based in India, this role is at the center of enabling world-class product intelligence, ensuring our tools, dashboards, and systems run seamlessly and are fully leveraged by product managers across the organization. You'll work closely with Product Managers, Product Operations, Data Engineering, and other stakeholders to maintain, improve, and evangelize analytics tooling. You'll also be responsible for increasing tool adoption and proficiency across the product organization, ensuring every team has the skills and access they need to extract value from our data stack. This role is ideal for someone who loves blending technical expertise with cross-functional enablement and is excited about making product analytics more accessible, scalable, and impactful. You will serve as the primary expert and champion for analytics tools like Amplitude and Tableau, ensuring every team can self-serve insights and make confident decisions.

What You'll Do:
- Lead Amplitude administration and governance across the organization, including managing user permissions, monitoring and optimizing event and property usage, maintaining clean and scalable instrumentation, curating key dashboards and cohorts, and driving best practices to ensure long-term data hygiene and analytics consistency.
- Drive product analytics enablement: lead training and upskilling efforts for Product Managers and other stakeholders in tools like Amplitude and Tableau, introducing advanced features (e.g., screen recordings, heatmaps, and in-platform guides).
- Maintain and optimize Tableau dashboards: ensure business-critical dashboards are accurate, performant, and relevant to evolving product and business needs.
- Own Tableau Online administration, including project structure, permissions, consistent naming conventions, and documentation.
- Maintain and optimize business-critical dashboards to ensure they are accurate, performant, and aligned with evolving product and business needs.
- Establish and maintain best practices in tool usage and data accessibility, partnering with Data Engineering and Product Operations to improve data literacy across the organization.
- Document and operationalize tooling workflows for product analytics processes, such as event naming conventions, funnel tracking, retention metrics, and user segmentation.
- Monitor analytics tool adoption and effectiveness: gather feedback, identify gaps, and implement improvements to ensure teams are getting the most out of our analytics investments.
- Act as a bridge between technical and non-technical teams: translate business needs into technical requirements and vice versa, ensuring tool functionality aligns with real-world usage.
- Proactively recommend tooling enhancements and identify opportunities to scale self-service analytics capabilities across product teams.
- Automate recurring reports and enable a self-service analytics environment: streamline regular reporting processes through automation and empower teams to independently explore insights and access key metrics via intuitive, reusable dashboards.
- Support ad-hoc analytical requests from Product and cross-functional teams: translate business questions into structured analyses and deliver timely, actionable insights to inform decisions.

What You Bring:
- Bachelor's degree in Analytics, Computer Science, Information Systems, or a related field.
- 4+ years of experience in business intelligence, product analytics, or data operations within a SaaS or tech environment.
- At least 1-2 years of experience with Amplitude and at least 2 years with Tableau, including administration, dashboard development, and stakeholder training.
- Strong SQL skills with at least 4 years of hands-on experience, including query optimization and a solid understanding of data modeling concepts.
- Proven ability to lead enablement programs and deliver effective training content for both technical and non-technical audiences.
- Strong communication and collaboration skills to work effectively with cross-functional stakeholders.
- Experience managing tooling documentation, data taxonomies, or analytics governance frameworks (preferred).
- A passion for helping teams work smarter and more effectively with data.
- Strong project management skills to lead multiple initiatives in a fast-paced, data-driven environment.

Bonus Experience:
- Experience with dbt, Snowflake, and ETL tools
- Knowledge of product lifecycle metrics or product experimentation (e.g., A/B testing platforms)
- Data quality monitoring tools (e.g., Monte Carlo, Great Expectations)
- Python knowledge and experience
- Familiarity with tools like Looker, Mixpanel, Google Analytics, or other BI platforms
- Experience working in a distributed or global team environment

#LI-PR #LI-Hybrid

About us: ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller.
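As a hedged illustration of the recurring-report automation described above (not the company's actual stack), the sketch below pulls a metric from a SQL warehouse and publishes it as a dated Excel file that a scheduler could distribute. The connection string, table, and columns are placeholders.

```python
# Hedged sketch of a recurring-report automation step: pull a metric from a
# SQL warehouse and publish it as a dated Excel file. Connection string,
# table, and columns are placeholders. Writing .xlsx requires openpyxl.
from datetime import date

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@warehouse-host/analytics")

query = """
    SELECT event_date,
           COUNT(DISTINCT user_id) AS weekly_active_users
    FROM product_events
    WHERE event_date >= CURRENT_DATE - INTERVAL '7 days'
    GROUP BY event_date
    ORDER BY event_date
"""

df = pd.read_sql(query, engine)

# Write a dated snapshot that a scheduler (cron, Airflow, etc.) can publish.
output_path = f"wau_report_{date.today().isoformat()}.xlsx"
df.to_excel(output_path, index=False)
print(f"wrote {len(df)} rows to {output_path}")
```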
Posted 2 weeks ago
3.0 - 15.0 years
5 - 17 Lacs
Pune
Work from Office
Skills: Power Apps + Power Automate + Dataverse + any combination of (Copilot Studio / Azure Foundry / Power BI)
Location: Pan India
Experience Range: 3 to 15 years

Key Responsibilities:
- Lead the development of custom applications using Power Apps (Canvas and Model-Driven Apps).
- Design and implement complex UI in Canvas Apps and build critical workflows and approval processes using Power Automate.
- Work extensively with Microsoft Dataverse for data modeling and integration.
- Develop and integrate custom connectors and use APIs within Power Apps.
- Customize views and forms using JavaScript as per business requirements.
- Design and develop PCF controls for both Canvas and Model-Driven Apps.
- Create and manage Power BI reports and dashboards to support business insights.
- Work with Copilot Studio (formerly Power Virtual Agents) to build intelligent chatbots.
- Contribute to Power Pages development for external-facing applications.
- Implement CI/CD pipelines using Azure DevOps for streamlined development and deployment.
- Collaborate effectively with cross-functional teams while being able to work independently when required.
- Communicate clearly with technical and non-technical stakeholders.

Required Skills and Experience:
- Minimum of 3 years of experience working on Microsoft Power Platform.
- Proven involvement in at least 3 full-cycle implementation projects in a Senior Developer role.
- Strong proficiency in Power Apps, Power Automate, Dataverse, and API integrations.
- Good understanding of JavaScript and custom development for platform extensibility.
- Familiarity with Power BI, Copilot Studio, Power Pages, and Azure DevOps.

Key Attributes:
- Strong problem-solving skills and attention to detail.
- Excellent verbal and written communication skills.
- Ability to work autonomously and as part of a collaborative team.
Posted 2 weeks ago
6.0 - 9.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Company Overview
We are looking for an exceptionally talented professional to join one of our cross-functional product teams in our Hyderabad/Bangalore/Gurgaon office in an individual contributor role. This product team is responsible for building the investor allocation product offering, PerformA™. This position offers the opportunity to define and design the next generation of products on our platform, which is used by some of the most sophisticated hedge funds in the world, and to collaborate with some of the brightest minds in the industry.

What you'll do:
- Work closely with the engineers/architects to translate the product specification to design, and then to the product itself
- Prepare comprehensive business test cases/beds to aid the engineering process
- Rigorously and continuously evaluate the progress of the product/feature-in-flight by leveraging the created test cases/beds and ensure compliance with the product/feature specification and the vision
- Prepare prototypes using Python and AI
- Track and question risks/assumptions
- Proactively escalate issues and mitigate execution risks

What you'll need:
- 6 to 9 years of experience working in the front-, middle-, and/or back-office space, with a minimum of 3 years of Fund Accounting / Investor Allocation experience
- Technical skills needed:
  - Familiarity with all phases of the Software Development Life Cycle
  - Strong grasp of programming fundamentals in a high-level language like Java, Python, or JavaScript for simple scripting, quick prototyping, or understanding code reviews. Deep development expertise is not required, but enough to engage meaningfully with engineers.
  - Knowledge of core concepts like APIs, microservices, and basic programming paradigms.
  - Working knowledge of databases, data modeling, and the basics of SQL.
  - Basic understanding of cloud services (e.g., AWS, Azure) and how they impact scalability and deployment.
  - Ability to evaluate technical trade-offs in product features.
  - Ability to quickly learn and adapt to new systems, platforms, or tools as required by the project
- Exceptional verbal and written communication skills
- Critical thinking and the ability to articulate standpoints/ideas and influence stakeholders
- Candidates should have a graduate degree in software engineering
- Advanced knowledge in the field of Fund Accounting and Investor Allocations will be an added advantage
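Since the role calls for quick Python prototypes around investor allocation, here is a hedged, illustrative sketch of the simplest allocation rule: splitting a fund-level P&L pro rata by investor capital balances. Real allocation logic (series accounting, equalization, fee tiers) is far more involved, and the numbers below are made up.

```python
# Illustrative pro-rata allocation prototype (made-up numbers; real investor
# allocation handles series, fees, and equalization, which are omitted here).
from decimal import Decimal


def allocate_pro_rata(pnl: Decimal, balances: dict[str, Decimal]) -> dict[str, Decimal]:
    """Split a period P&L across investors in proportion to capital balances."""
    total = sum(balances.values())
    if total == 0:
        raise ValueError("total capital balance must be non-zero")
    # Quantize to cents so the figures are ledger-friendly; in practice the
    # rounding residual would be assigned per firm policy.
    return {
        investor: (pnl * balance / total).quantize(Decimal("0.01"))
        for investor, balance in balances.items()
    }


if __name__ == "__main__":
    period_pnl = Decimal("125000.00")
    capital = {
        "Investor A": Decimal("4000000"),
        "Investor B": Decimal("2500000"),
        "Investor C": Decimal("1500000"),
    }
    for name, share in allocate_pro_rata(period_pnl, capital).items():
        print(f"{name}: {share}")
```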
Posted 2 weeks ago
1.0 - 9.0 years
3 - 11 Lacs
Gurugram
Work from Office
Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
The Analyst, Sales Operations will focus on partnering with the U.S. Small and Medium Enterprises (SME) Sales and Account Development teams. They will be a key thought partner to the U.S. SME Sales Enablement leadership team on cross-channel workstreams, including owning the U.S. SME Hierarchy, project managing governance initiatives, and managing ad hoc requests from multiple stakeholders. The Analyst will work very closely with several cross-functional partners including SABE, SPT-P&I, L&D, Control Management, U.S. SME Commercial Effectiveness & Governance, and other supporting teams. The ideal candidate has familiarity with GCS sales and account development teams and will coordinate across key partners to ensure there is alignment and collaboration on priorities. They will possess thought leadership, critical thinking, communication, and organizational skills, and will have a consistent track record of excellence operating independently within a strong team environment!

Job Responsibilities
- Become a subject matter expert (SME) on the U.S. SME Hierarchy design and mechanics to streamline and improve the process
- Gather information and collaborate with cross-functional teams to deliver accurate and timely reports
- Automate repetitive reporting tasks using tools such as Excel (VBA), SQL, and Power BI
- Ensure data accuracy and consistency across reports and month on month
- Maintain documentation for reporting processes and metric definitions
- Perform risk management to proactively identify potential problems and mitigate risks to achieving desired objectives
- Serve as a PMO to lead highly complex, business-critical initiatives from inception to completion and act with an agile approach
- Possess a deep understanding of U.S. SME Sales and Account Development team business objectives, priorities, and challenges to formulate solutions
- Build positive relationships with U.S. SME Sales Enablement and Sales and Account Development teams to successfully gain consensus and support for strategic projects

Minimum Qualifications
- Undergraduate/Postgraduate degree required
- 2+ years of experience in a reporting, business analysis, or data analytics role
- Strong proficiency in Excel (pivot tables, formulas, charts) and SQL (preferred)
- Familiarity with databases, data warehousing, and data modeling concepts
- The candidate should be flexible for rotating shift hours (1:30 PM to 9:30 PM IST)
- This is a hybrid role, with the candidate expected to work from office 3 days a week
- Strong attention to detail and commitment to data integrity
- Strong project management skills with a record of successful results on complex, large-scale, cross-functional initiatives
- Ability to build strong partnerships and work collaboratively with others to meet shared objectives
- Exceptional written and verbal communication skills and comfort presenting at all levels of the organization
- Ability to manage multiple and complex workstreams, working across departmental boundaries to deliver a diverse set of initiatives that result in successful outcomes

Preferred Qualifications
- Postgraduate degree or equivalent experience in quantitative fields (math, economics, computer science, etc.)
- Proactive approach to tackling new opportunities and challenges with high energy and enthusiasm
- Accountability for self and others to meet all commitments and deliverables in a timely manner
- Strong business insight and experience with Sales and Account Development organizations, and the ability to understand their structure, operations, and strategic priorities
- A can-win attitude and a desire to learn in a fast-paced environment!

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
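As a hedged example of the kind of reporting automation this posting mentions (Excel pivot-style summaries produced without manual work), the sketch below builds a month-by-channel pivot from a raw extract with pandas and saves it to Excel. The file name and column names are placeholders, not details from the role.

```python
# Hedged sketch: reproduce a monthly pivot-table summary in pandas and save it
# to Excel. File and column names are placeholders for a raw sales extract;
# writing .xlsx requires openpyxl.
import pandas as pd

raw = pd.read_csv("sales_extract.csv", parse_dates=["booking_date"])

raw["month"] = raw["booking_date"].dt.to_period("M").astype(str)

pivot = pd.pivot_table(
    raw,
    index="month",
    columns="channel",
    values="booked_volume",
    aggfunc="sum",
    fill_value=0,
    margins=True,  # adds an 'All' row/column, like Excel's grand totals
)

pivot.to_excel("monthly_channel_summary.xlsx")
print(pivot.head())
```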
Posted 2 weeks ago
6.0 - 8.0 years
8 - 10 Lacs
Bengaluru
Work from Office
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focussed and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. The person will work on a variety of projects in a highly collaborative, fast-paced environment and will be responsible for software development activities of KPMG, India. As part of the development team, he/she will work on the full life cycle of the process, develop code, and perform unit testing. He/she will work closely with the Technical Architect, Business Analysts, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices are in compliance with KPMG's best practices, policies, and procedures. This role requires quick ramp-up on new technologies whenever required.

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Role: Celonis Data Engineer
Location: Bangalore
Experience: 6 to 8 years

Key Responsibilities:
- At least 6+ years of experience in databases, data integration, and data modeling.
- Minimum 3 years of experience in Celonis process mining.
- Strong skills in SQL and PQL.
- Experience with cloud-based data technologies.
- Ability to lead teams and drive project completion as per business requirements.
- Experience in managing the Celonis platform and optimizing APC consumption.
Posted 2 weeks ago
3.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
- Design, develop, and maintain backend services using Golang.
- Write clean, modular, and testable code that meets business and technical requirements.
- Integrate and collaborate with existing systems developed in Java, Python, or PHP.
- Build RESTful APIs, handle data modeling, and implement business logic.
- Ensure high system availability, performance, and scalability.
- Participate in code reviews, design discussions, and agile ceremonies.
- Work with DevOps and QA teams to support CI/CD and automated testing.

Skills: Golang, Java, Python
Posted 2 weeks ago
4.0 - 9.0 years
8 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Accelerate your development and exposure to high-performance applications and cloud infrastructure. Join Faptic Technology (remote or hybrid), a fast-growing scale-up organization with an ambition to be recognized as one of the leading technology companies in Europe. Our global client base needs builders: engineers and developers who love technology, have deep expertise in software and cloud technologies, and, importantly, have a passion for culture and customers.
- We obsess about our customers
- We build outstanding technical solutions
- We create an awesome culture
- We accelerate learning and careers

Technology
We are seeking a skilled Power Platform Developer with comprehensive experience in Power Apps, Pages, Automate, Dataverse & BI and general data integration to join our dynamic team. The ideal candidate will have a strong background in developing and deploying Power Platform solutions, integrating Power BI reports and dashboards, and ensuring seamless data integration from multiple sources.

Your activities will include:
Power Platform Development:
- Design and develop customized Power Apps and Power Automate applications tailored to business requirements.
- Utilize Power Apps formulas and development methods to create both canvas and model-driven apps.
- Implement variables (contextual and global), collections, and business rules to enhance app functionality.
- Conduct thorough testing and debugging to ensure optimal performance and user experience.
Power BI Integration:
- Integrate Power BI reports and dashboards into Power Apps to provide comprehensive data insights.
- Design and develop Power BI solutions including data modeling, DAX expressions, and visualizations.
- Work with various data sources and utilize import and DirectQuery connectivity modes.
- Create and manage calculated columns, measures, and transformations within Power BI.
Data Integration and Management:
- Connect to various data sources such as SQL Server, OneDrive for Business, and others to pull data into Dataverse and Power BI.
- Develop, test, and implement data integration solutions ensuring data integrity and reliability.
- Perform database management tasks including backup, restore, and optimization using T-SQL.
- Write and maintain DDL/DML/DCL commands for efficient data manipulation and storage.
Project Management:
- Engage with stakeholders to gather requirements and manage project lifecycles using agile methodologies.
- Provide consistent support and handle production issues and escalations within SLA.
- Participate in code reviews and contribute to the continuous improvement of development practices.
- Document and communicate project progress, challenges, and solutions effectively.
Additional Responsibilities:
- Stay updated with the latest Power Platform, Power BI, and data integration technologies.
- Train and mentor junior team members and share best practices.
- Create and maintain technical documentation and user guides.

Experience Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in IT with a focus on Power Apps and Power BI development.
- Proven experience with Power Apps, including creating canvas and model-driven apps.
- Strong knowledge of Power BI including data modeling, DAX expressions, and data visualization.
- Expertise in SQL Server management, T-SQL, and data integration techniques.
- Experience with REST APIs, Power Automate, and Microsoft 365 tools.
- Familiarity with agile methodologies and project management tools such as DevOps.
- Excellent problem-solving skills, attention to detail, and ability to work independently.
- Strong communication and interpersonal skills.

Preferred Certifications:
- Microsoft Certified: Power Platform Fundamentals (PL-900) or equivalent.
- Certified SAFe Agilist or other relevant certifications.

Benefits at Faptic:
- Private medical insurance
- Training on market trends and client needs
- Continuous personal improvement - 8h/month during work hours
- Lunch on Friday (twice per month, we pay)
- 21 days annual leave, with one extra day per year up to 25 days
- 3 days sick leave without medical proof
- 1 day off for your birthday
- 0.5 days off for Christmas shopping
- Competitive package
- Quarterly fun budget for team events

10+ years of programme management experience in technology delivery, with time spent in a consulting or services environment. Strong commercial and financial acumen, comfortable building pricing models and managing costs.
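For the SQL Server / T-SQL integration work listed above, here is a hedged Python sketch that pulls a recent-changes result set with pyodbc, as a staging step before loading into Dataverse or Power BI. The server, database, and table names are placeholders, and the ODBC driver name depends on what is installed locally.

```python
# Hedged sketch: read rows from SQL Server with pyodbc for a downstream
# integration step. Server/database/table names are placeholders, and the
# ODBC driver name must match what is installed locally.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-sql-server;"
    "DATABASE=SalesDB;"
    "Trusted_Connection=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute(
        """
        SELECT TOP (100) CustomerId, CustomerName, ModifiedDate
        FROM dbo.Customers
        WHERE ModifiedDate >= DATEADD(day, -1, SYSUTCDATETIME())
        ORDER BY ModifiedDate DESC
        """
    )
    rows = cursor.fetchall()

# Each row could now be upserted into Dataverse via its Web API or staged
# for a Power BI dataflow (not shown in this sketch).
for customer_id, name, modified in rows:
    print(customer_id, name, modified)
```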
Posted 2 weeks ago
4.0 - 9.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Be an essential element to a brighter future. We work together to transform essential resources into critical ingredients for mobility, energy, connectivity, and health. Join our values-led organization committed to building a more resilient world with people and planet in mind. Our core values are the foundation that makes us successful for ourselves, our customers, and the planet.

Job Description
Overview
As part of the Global Data & Analytics Technology team within Corporate IT, the Enterprise Master Data Architect plays a strategic role in shaping and executing enterprise-wide master data initiatives. This role partners closely with business leaders, the Corporate Master Data Management team, and Business Relationship Managers to define and deliver scalable solutions using SAP Master Data Governance (MDG). We're looking for a forward-thinking architect with a strong blend of technical expertise and business acumen: someone who can balance innovation with execution, and who thrives in a fast-paced, collaborative environment.

Key Responsibilities
- Collaborate with business stakeholders to define enterprise master data strategies and governance frameworks.
- Design and implement SAP MDG solutions that support the collection, processing, and stewardship of master data across domains.
- Lead the development and enforcement of data governance policies, standards, and best practices.
- Architect and deliver SAP-centric master data solutions that align with enterprise goals and compliance requirements.
- Provide technical leadership and mentorship to MDM team members and cross-functional partners.
- Ensure consistency, quality, and accessibility of master data across systems and business units.
- Drive continuous improvement in data architecture, modeling, and integration practices.

Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience designing and architecting enterprise master data solutions.
- 4+ years of hands-on experience with SAP MDG and SAP data architecture.
- Strong functional knowledge of master data domains: customer, vendor, product/material, and finance in S/4HANA or ECC.
- Experience with SAP Data Services and SAP Information Steward for data conversion, quality, and cleansing.
- Proficiency in defining systems strategy, requirements gathering, prototyping, testing, and deployment.
- Strong configuration and solution design skills.
- ABAP development experience required, including custom enhancements and data modeling.
- Experience with SAP S/4HANA 2021 or later preferred.
- Excellent communication, collaboration, and time management skills.
- Ability to lead cross-functional teams and manage multiple priorities in a dynamic environment.

Benefits of Joining Albemarle
- Competitive compensation
- Comprehensive benefits package
- A diverse array of resources to support you professionally and personally

We are partners to one another in pioneering new ways to be better for ourselves, our teams, and our communities. When you join Albemarle, you become our most essential element, and you can anticipate competitive compensation, a comprehensive benefits package, and resources that foster your well-being and fuel your personal growth. Help us shape the future, build with purpose, and grow together.
Posted 2 weeks ago
6.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Summary Guidewire is expanding its Professional Services team in India with a variety of roles open in our Professional Services center. In professional services, we provide implementation and production services to our customers around the world. We are constantly innovating to accelerate the pace, predictability, and affordability of solutions to help customers adopt the Guidewire platform and applications. We are looking for smart, proactive, and results-oriented Consultants to join our team in Bangalore. Job Description Technical Leadership Lead the design, development, and implementation of complex Guidewire PolicyCenter solutions Provide technical guidance and mentorship to junior and mid-level developers, fostering their growth and ensuring adherence to best practices. Conduct code reviews, provide constructive feedback, and ensure the quality, maintainability, and performance of the codebase. Act as a subject matter expert for Guidewire PolicyCenter within the team and organization Participate in client-facing discussions, understanding their needs, and providing expert technical advice Configuration Design, develop, and maintain Guidewire PolicyCenter application, tailoring them to meet specific client needs and business requirements. Work with Guidewire s proprietary language (Gosu), Page Configuration File(PCF), , Plugins, Workflow, etc. Create new data model elements and extend the existing applications data model with new properties for data persistence. Leverage PolicyCenter Biz Rules for developing & managing Activities,Underwriting Rules and Admin data . Experience in Conceptualization, Visualization, and Realization phases of product design phase or SBT. Use of XMIND and APD app to create Products/LOBS Develop a thorough understanding of the PolicyCenter data model to effectively manage and transform data within the system. Configure product models, lines of business, and UI elements based on client needs. Integration Establish seamless connections between PolicyCenter and other internal and external systems (e.g., third-party vendors, policy administration systems, digital portals, etc ) using various integration methods like SOAP/REST web services, batch processes, message queues, and event messaging. Develop and test integration interfaces to ensure data consistency and accuracy across integrated systems. Perform comprehensive unit testing and develop GUnit tests for custom code, aiming for maximum coverage. Understand and analyze integration requirements, propose solutions, and estimate effort. Collaboration & Teamwork Collaborate with BAs, PMs, QA, and other developers to understand requirements, plan deliverables, and deliver robust solutions. Effectively engage in team meetings to discuss project progress, challenges, and solutions Lead the estimation and prioritization of sprint backlog items, ensuring alignment with project goals and efficient resource utilization. Write clean, efficient, and maintainable code by adhering to Guidewire coding standards and guidelines, and contribute to the development of best practices. Create and maintain comprehensive technical documentation, including design specifications, integration guides, and deployment procedures. Required Skills/Experience: 6-10 years of hands-on experience working with Guidewire PolicyCenter. Ace certification would be preferred. Possess an in-depth understanding of the Guidewire platform, its components (UI, Data Model, Studio), out-of-the-box features, and accelerators.. 
Experience with Gosu, PCF, Entities, Typelists, Apache Camel, RESTful APIs, Bitbucket, and IntelliJ. Experience working on APD, Rating Engine, and Integration Gateway. In-depth understanding of Object-Oriented Programming (OOP) principles. Strong problem-solving and analytical skills to diagnose and address technical problems and ensure the smooth operation of the PolicyCenter application. Flexibility to do shift work as needed (aligning to US colleagues/customers). Familiarity with Agile/Scrum development methodologies. BSc in Computer Science or equivalent. Experience with Guidewire Cloud is preferred. As a partner to our customers, we continually evolve to enable their success. We are proud of our unparalleled implementation track record with 1600+ successful projects, supported by the largest R&D team and partner ecosystem in the industry. Our Marketplace provides hundreds of applications that accelerate integration, localization, and innovation. For more information, please visit www.guidewire.com and follow us on Twitter: @Guidewire_PandC.
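To illustrate the unit-testing expectation above ("develop GUnit tests for custom code"), here is a minimal, hypothetical sketch written in plain Java with JUnit 5. Real PolicyCenter test code would be written in Gosu with GUnit against Guidewire entities; the PolicyDto record, the toDto mapper, and every field name below are invented purely for illustration.

```java
// Illustrative only: a plain Java/JUnit analogue of the kind of mapping test the role describes.
// Real PolicyCenter work would use Gosu entities and GUnit; PolicyDto, toDto, and the field
// names below are hypothetical stand-ins, not Guidewire APIs.
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class PolicyMapperTest {

    // Hypothetical DTO representing the payload sent to an external system.
    record PolicyDto(String policyNumber, String lineOfBusiness, double totalPremium) {}

    // Hypothetical mapper converting internal policy fields into the outbound payload.
    static PolicyDto toDto(String policyNumber, String lob, double premium) {
        return new PolicyDto(policyNumber.trim(), lob.toUpperCase(), Math.round(premium * 100.0) / 100.0);
    }

    @Test
    void mapsPolicyFieldsOntoOutboundPayload() {
        PolicyDto dto = toDto(" P-000123 ", "personalAuto", 1234.567);

        assertEquals("P-000123", dto.policyNumber(), "policy number should be trimmed");
        assertEquals("PERSONALAUTO", dto.lineOfBusiness(), "LOB code should be normalized");
        assertEquals(1234.57, dto.totalPremium(), 0.001, "premium should be rounded to 2 decimals");
    }
}
```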
Posted 2 weeks ago
10.0 - 12.0 years
22 - 25 Lacs
Pune
Work from Office
Vice President, Full-Stack Engineer I We're seeking a future team member for the role of Vice President, Full-Stack Engineer I to join our Data Solution & SVCS Platform team. This role is in Pune, MH - HYBRID. In this role, you'll make an impact in the following ways: Design, develop, and maintain backend services and RESTful APIs using Java (Spring Boot or similar frameworks) Work closely with product, QA, operations, and business stakeholders to understand requirements and translate them into technical solutions Build and manage CI/CD pipelines to enable automated testing, deployment, and continuous delivery Participate actively in SAFe Agile ceremonies including PI planning, sprint planning, and retrospectives Collaborate with DBAs and data architects to ensure performant and accurate data handling within the Client Data Masters platform Contribute to front-end development as needed using modern JavaScript frameworks Ensure code quality and maintainability through code reviews, automated tests, and clean design Support production systems, troubleshoot issues, and implement sustainable fixes To be successful in this role, we're seeking the following: 10-12 years of full-stack development experience with a strong emphasis on backend/API development Proficient in Java, Spring/Spring Boot, and RESTful API design Hands-on experience with CI/CD tools like Jenkins, GitLab CI, or GitHub Actions Experience working in the Scaled Agile Framework (SAFe) or similar large-scale agile delivery environments Familiarity with front-end technologies (React, Angular, HTML, CSS, JavaScript) Strong understanding of relational databases (e.g., Oracle, PostgreSQL) and ability to write complex SQL queries Experience with source control systems (Git) and build automation Excellent communication and collaboration skills for working with cross-functional teams Proficiency in Jira, Confluence, and similar collaboration and project tracking tools Preferred Qualifications: - Strong DB performance tuning, data modeling, and ETL experience - Exposure to client data domains in financial services or regulated environments - Experience with microservices, containerization (Docker), and orchestration (Kubernetes) - Familiarity with cloud environments (AWS, Azure, or GCP) - Knowledge of messaging platforms (Kafka, MQ) and event-driven architecture Best Places to Work for Disability Inclusion, Disability:IN 100% score, 2023-2024
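As a rough illustration of the backend work this role describes (RESTful APIs with Java and Spring Boot), the sketch below shows a minimal read-only endpoint. The package name, ClientRecord type, ClientService, and the /api/clients path are hypothetical placeholders, not the actual Client Data Masters platform API.

```java
package com.example.clientapi; // hypothetical package for this sketch

import java.util.List;
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
class ClientApiApplication {
    public static void main(String[] args) {
        SpringApplication.run(ClientApiApplication.class, args);
    }
}

// Hypothetical client master record returned by the API.
record ClientRecord(String clientId, String legalName, String country) {}

// Hypothetical service backed by an in-memory map; a real implementation would query the
// client data store (e.g., Oracle/PostgreSQL) through a repository layer.
@Service
class ClientService {
    private final Map<String, ClientRecord> store = Map.of(
            "C-001", new ClientRecord("C-001", "Acme Holdings Ltd", "GB"),
            "C-002", new ClientRecord("C-002", "Globex Corporation", "US"));

    ClientRecord findById(String clientId) {
        return store.get(clientId);
    }

    List<ClientRecord> findAll() {
        return List.copyOf(store.values());
    }
}

@RestController
@RequestMapping("/api/clients")
class ClientController {
    private final ClientService clientService;

    ClientController(ClientService clientService) {
        this.clientService = clientService;
    }

    // GET /api/clients/{clientId} — fetch one client master record.
    @GetMapping("/{clientId}")
    ClientRecord getClient(@PathVariable("clientId") String clientId) {
        return clientService.findById(clientId);
    }

    // GET /api/clients — list all records; a production API would paginate and secure this.
    @GetMapping
    List<ClientRecord> listClients() {
        return clientService.findAll();
    }
}
```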
Posted 2 weeks ago
4.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate. Summary: In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. We are looking for a skilled Automation Engineer to join our dynamic team. The ideal candidate will have strong technical and automation expertise, with a focus on Java programming, API testing, and hands-on experience with key automation tools like Selenium, Git, and Jenkins. You will play a crucial role in ensuring the quality and reliability of our software products through automated testing. Responsibilities: Develop, implement, and maintain automated test scripts using Java. Perform API testing to ensure the functionality and reliability of our services. Write and execute Selenium-based automated tests for web applications. Work within an Agile environment, participating in sprint planning and daily standups. Maintain and update test cases in JIRA and track issues as part of the testing lifecycle. Requirements: Proven experience in automation testing with strong proficiency in Java. Hands-on experience with Selenium for automated UI testing. Proficiency with API testing using tools such as Postman, RestAssured, or similar. Understanding of version control with Git and CI tooling such as Jenkins. Familiarity with Agile methodologies and working in Agile teams. Knowledge of JIRA for test management and issue tracking. Nice to Have: Understanding of functional testing. Note: Please do not consider candidates whose work experience is primarily in BDD frameworks. Location: Bangalore. Experience: 4 to 7 years. Mandatory Skill Sets: Automation Testing. Preferred Skill Sets: QE Testing. Years of Experience Required: 4-7 years. Education Qualifications: BTech/MTech. Degrees/Field of Study required: Master of Engineering, Bachelor of Technology. Degrees/Field of Study preferred: Required Skills: Automation Engineering. Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytical Thinking, Analytic Research, Big Data, Business Data Analytics, Communication, Complex Data Analysis, Conducting Research, Creativity, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling, Data Pipeline {+ 38 more}
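For context on the kind of automation expected here, the following is a minimal Selenium WebDriver + JUnit 5 test in Java. The URL, element locators, credentials, and expected page title are placeholders, not a real application under test.

```java
// Minimal sketch of a Selenium + JUnit 5 UI test in Java, the style of automation this role
// calls for. The URL, locators, and expected title are hypothetical placeholders.
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

class LoginPageTest {

    private WebDriver driver;

    @BeforeEach
    void setUp() {
        // Headless Chrome so the test can also run on a Jenkins agent without a display.
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--headless=new");
        driver = new ChromeDriver(options);
    }

    @Test
    void validLoginLandsOnDashboard() {
        driver.get("https://example.test/login");            // placeholder URL
        driver.findElement(By.id("username")).sendKeys("qa_user");
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("login-button")).click();

        assertTrue(driver.getTitle().contains("Dashboard"),
                "expected the dashboard page after a successful login");
    }

    @AfterEach
    void tearDown() {
        driver.quit();
    }
}
```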
Posted 2 weeks ago
1.0 - 6.0 years
14 - 16 Lacs
Bengaluru
Work from Office
KPMG India is looking for an Associate Consultant - Data Governance to join our dynamic team and embark on a rewarding career journey. Undertake short-term or long-term projects to address a variety of issues and needs. Meet with management or appropriate staff to understand their requirements. Use interviews, surveys, etc. to collect necessary data. Conduct situational and data analysis to identify and understand a problem or issue. Present and explain findings to appropriate executives. Provide advice or suggestions for improvement according to objectives. Formulate plans to implement recommendations and overcome objections. Arrange for or provide training to people affected by change. Evaluate the situation periodically and make adjustments when needed. Replenish knowledge of industry, products, and field.
Posted 2 weeks ago
10.0 - 12.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Job Summary We are seeking a highly skilled and collaborative Analytics Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining scalable data models and transformation pipelines that power analytics and business intelligence across the organization. You will work closely with data analysts, data engineers, and business stakeholders to ensure data is accurate, accessible, and actionable. Key Responsibilities Data Modeling & Transformation Develop and maintain modular, well-documented data models using dbt Write efficient, maintainable SQL transformations across Snowflake and SQL Server Apply best practices in dimensional modeling and ELT architecture Cloud & Platform Integration Build and orchestrate data pipelines using a variety of tools including SSIS, Python, Azure Data Factory or Synapse Pipelines Integrate data from various sources including SQL Server, Oracle, Postgres, Snowflake, and Azure Blob Storage Data Quality & Governance Implement data validation tests and monitoring using SSIS, dbt and SQL Collaborate with data stewards to ensure data governance, lineage, and compliance Maintain clear and up-to-date documentation in Confluence Testing & Deployment Use GitHub for version control and CI/CD pipelines to deploy data pipeline and transformation projects Write and maintain unit and integration tests for data transformations Participate in code reviews and enforce data engineering standards Analytics Enablement Partner with analysts and business teams to deliver trusted, reusable datasets Build and maintain semantic layers and data marts for self-service analytics Project & Workflow Management Track work using Jira and participate in agile ceremonies (sprint planning, retrospectives) Document technical decisions, workflows, and architecture in Confluence Security & Performance Monitor and optimize query performance in Snowflake and SQL Server Implement role-based access controls and ensure secure data access Qualifications Solid understanding of analytics engineering, data engineering, or BI development Strong SQL and SSIS skills, and experience with dbt in a production environment Experience with Snowflake, SQL Server, Oracle, and Azure Data Services Familiarity with GitHub, CI/CD workflows, and agile project management tools (e.g., Jira) Excellent communication and documentation skills Nice to Have Experience with Python or Power BI Knowledge of data observability tools (e.g., Monte Carlo, Great Expectations) Exposure to data privacy and compliance frameworks (e.g., GDPR, HIPAA) Understanding of SDLC and its adaptation to data engineering
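As a loose illustration of the data-validation responsibility described above, the sketch below shows a standalone Java/JDBC check that fails when a key column contains NULLs. In this role such checks would more typically be expressed as dbt tests or SSIS validation tasks; the JDBC URL, credentials, and the dbo.fct_orders table are hypothetical placeholders.

```java
// Minimal Java/JDBC sketch of an automated data-quality check of the kind this role describes.
// In practice these checks would usually live as dbt tests or SSIS tasks; the connection
// details and table name below are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class OrdersNullCheck {

    public static void main(String[] args) throws SQLException {
        // Placeholder connection string; a real run would read this from secure configuration.
        String url = "jdbc:sqlserver://localhost:1433;databaseName=analytics;encrypt=true;trustServerCertificate=true";

        String check = "SELECT COUNT(*) FROM dbo.fct_orders WHERE customer_id IS NULL";

        try (Connection conn = DriverManager.getConnection(url, "etl_user", "change-me");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(check)) {

            rs.next();
            long nullRows = rs.getLong(1);

            // Fail loudly so a CI/CD pipeline (e.g., GitHub Actions) can block the deployment.
            if (nullRows > 0) {
                throw new IllegalStateException(nullRows + " rows in fct_orders have a NULL customer_id");
            }
            System.out.println("Data-quality check passed: no NULL customer_id rows in fct_orders");
        }
    }
}
```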
Posted 2 weeks ago
2.0 - 7.0 years
14 - 18 Lacs
Pune
Work from Office
Grade H - Office/Core. Responsible for supporting software/platform engineering activities (depending on specialism), working with users to capture requirements, using sound technical capabilities to lead the design, development, and maintenance of the relevant systems, and ensuring compliance with the relevant standards. Specialisms: Software Engineering; Platform Engineering. Entity: Technology IT&S Group. As an enterprise engineer, you will be responsible for building, maintaining, and troubleshooting the software infrastructure and services that power our technology platforms. In this role, you will work with a team of engineers and collaborators to ensure that the platform is highly available, scalable, and secure. You will also be responsible for automating routine tasks, improving the platform's performance, and providing technical support to other teams. What you will deliver: Design and build the technology platform's features and infrastructure. Ensure the platform and services are highly available, scalable, and secure. Continuously monitor and evaluate the platform to identify potential issues and make recommendations for improvements. Collaborate with other platform and services teams to identify and resolve complex problems. Mentor junior engineers and contribute to the development of the engineering team. Write software design and operational support documentation. What you will need to be successful (experience and qualifications): Technical skills: Bachelor's degree in Computer Science, Engineering, Computer Information Systems, or equivalent work experience. Ability to adapt to new technologies and processes, and to work both independently and as part of a team. Strong problem-solving skills: able to analyze complex problems, identify root causes, and develop creative and effective solutions. Excellent communication skills and the ability to communicate with peers through to senior leaders; you should be able to engage and influence others to collect requirements, describe what you're doing, work through problems, and find productive solutions. Self-starter, able to handle ambiguity, navigate uncertainty, identify risks, and find the right people and tools to get the job done. Infrastructure skills: Capable of building and scaling infrastructure services using Amazon Web Services or Microsoft Azure. Expertise in infrastructure as code, scripting, and infrastructure pipelines, configuring cloud resources using reusable templates. Solid understanding of core cloud application infrastructure services, including identity platforms, networking, storage, databases, containers, and serverless. Capable of solving large-scale distributed systems issues. 2+ years of experience in application development and support environments with more than one technology and multiple design techniques; you'll have supported these production systems through on-call rotations. Software skills: Proficient in C# or Python programming, specializing in developing cross-platform solutions that are efficient and scalable. 2+ years of non-internship professional software development experience. Well-versed in software engineering practices and standard methodologies for the full SDLC, including coding standards, code reviews, source control management, continuous deployments, testing, and operations. Experienced in building complex software systems end-to-end that have been effectively delivered and operated in production.
You should understand security and privacy standard methodologies, as well as how to properly monitor, log, and alarm production systems. Working knowledge of databases, such as relational, graph, document, and key-value stores, with ability in data modeling, database design, and SQL. About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner! We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Additional Information: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered. Travel Requirement: Up to 10% travel should be expected with this role. Relocation Assistance: This role is eligible for relocation within country. Remote Type: This position is a hybrid of office/remote working. Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms, Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSQL data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}
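As a small illustration of the "infrastructure as code ... reusable templates" expectation in this posting, the sketch below defines a private, encrypted S3 bucket with the AWS CDK in Java (CDK v2). The stack and bucket names are illustrative; an Azure-focused team would express the same idea with Bicep/ARM templates or Terraform.

```java
// Minimal AWS CDK (Java, CDK v2) sketch of defining cloud resources as reusable, versioned code,
// in line with the role's infrastructure-as-code expectation. Stack and bucket names are
// illustrative placeholders.
import software.amazon.awscdk.App;
import software.amazon.awscdk.Stack;
import software.amazon.awscdk.StackProps;
import software.amazon.awscdk.services.s3.BlockPublicAccess;
import software.amazon.awscdk.services.s3.Bucket;
import software.amazon.awscdk.services.s3.BucketEncryption;
import software.constructs.Construct;

public class PlatformStorageStack extends Stack {

    public PlatformStorageStack(final Construct scope, final String id, final StackProps props) {
        super(scope, id, props);

        // A private, encrypted, versioned bucket for platform logs; reviewed and deployed
        // through the same pipeline as application code.
        Bucket.Builder.create(this, "PlatformLogsBucket")
                .versioned(true)
                .encryption(BucketEncryption.S3_MANAGED)
                .blockPublicAccess(BlockPublicAccess.BLOCK_ALL)
                .build();
    }

    public static void main(final String[] args) {
        App app = new App();
        new PlatformStorageStack(app, "PlatformStorageStack", StackProps.builder().build());
        app.synth(); // emits the CloudFormation template that the deployment pipeline applies
    }
}
```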
Posted 2 weeks ago