5.0 - 10.0 years
14 - 18 Lacs
Hyderabad
Work from Office
The Impact you will have in this role:
The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients to define requirements and implement solutions. The Software Engineering role specializes in planning, documenting technical requirements, designing, developing, and testing all software systems and applications for the firm. It works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems.

Your Primary Responsibilities:
- Act as a technical expert on one or more applications used by DTCC
- Work with the Business System Analyst to ensure designs satisfy functional requirements
- Partner with Infrastructure to identify and deploy optimal hosting environments
- Tune application performance to eliminate and reduce issues
- Research and evaluate technical solutions consistent with DTCC technology standards
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately
- Apply different software development methodologies depending on project needs
- Contribute expertise to the design of components or individual programs, and participate in construction and functional testing
- Support development, testing, troubleshooting, and production support
- Create applications and construct unit test cases that ensure compliance with functional and non-functional requirements
- Work with peers to mature ways of working, continuous integration, and continuous delivery

Qualifications:
- Minimum of 8 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- Expertise in Snowflake DB, its architecture principles, and capabilities
- Experience with data warehousing, data architecture, ETL data pipelines, and/or data engineering environments at enterprise scale built on Snowflake
- Ability to create strong SQL procedures in Snowflake and build data pipelines in a cost-optimized, performance-efficient way (a minimal sketch follows this listing)
- Proficient understanding of code versioning tools - Git, Mercurial, SVN
- Knowledge of SDLC, testing, and CI/CD aspects such as Jenkins, BB, JIRA
- Fosters a culture where integrity and transparency are encouraged
- Stays ahead of changes in their own specialist area and seeks out learning opportunities to ensure knowledge is up to date
- Invests effort in individually coaching others
- Builds collaborative teams across the organization
- Communicates openly, keeping everyone across the organization advised
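The listing above asks for strong SQL procedures in Snowflake and cost-conscious pipelines. As a minimal illustration only, and not DTCC's actual implementation, the sketch below uses snowflake-connector-python to create a Snowflake Scripting procedure that upserts staged rows, then schedules it with a task; the account details, object names, and MERGE logic are all invented for the example.

```python
# Hypothetical sketch: a Snowflake stored procedure plus a scheduled task,
# created from Python with snowflake-connector-python. All names are
# placeholders; adapt to your own account and schema.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumption: your account locator
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# A Snowflake Scripting procedure that upserts staged rows into a target table.
cur.execute("""
CREATE OR REPLACE PROCEDURE load_orders()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  MERGE INTO analytics.core.orders t
  USING analytics.staging.orders_raw s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
  WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
    VALUES (s.order_id, s.status, s.updated_at);
  RETURN 'done';
END;
$$
""")

# Cost control: run it on a schedule via a task, on a dedicated warehouse.
cur.execute("""
CREATE OR REPLACE TASK load_orders_hourly
  WAREHOUSE = ETL_WH
  SCHEDULE = '60 MINUTE'
AS
  CALL load_orders()
""")
cur.execute("ALTER TASK load_orders_hourly RESUME")
```

Running an incremental MERGE inside a scheduled task on a small, auto-suspending warehouse is one common way to keep such loads both correct and cheap.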
Posted 1 week ago
5.0 - 10.0 years
14 - 19 Lacs
Hyderabad
Work from Office
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact you will have in this role:
The SLM and JDM applications monitor all jobs, processes, and output of systems run within DTCC. Our goal is to identify deviations from historical trends and known patterns of execution that may lead to issues later in the process cycle that would interrupt our clients' business interactions.

Your Primary Responsibilities:
- Working with the current Power BI based dashboards, design new Java based dashboards to replace and enhance the functionality of the JDM system
- Help develop specifications for new dashboards and the application changes required to support the new design
- Build and deploy the new dashboards, and work with the application and business support teams to train them in the usage of the application
- Utilize feedback to design further enhancements to the application

Qualifications:
- Minimum of 4 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 4+ years of active development experience/expertise in Java/J2EE based applications
- Must have: strong frontend experience - Angular
- Experience in web based UI development and SPA development
- Experience with CI/CD technologies like Git, Jenkins, and Maven
- Experience with containers like OpenShift is a plus
- Experience with Messaging, ETL, or Reporting tools is a plus
- Database and PL/SQL skills (Snowflake preferred) is a plus
- Knowledge of Python a plus
- Familiarity with Agile development methodology
Posted 1 week ago
5.0 - 9.0 years
14 - 19 Lacs
Chennai
Work from Office
The Impact you will have in this role:
The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients to define requirements and implement solutions. The Software Engineering role specializes in planning, documenting technical requirements, designing, developing, and testing all software systems and applications for the firm. It works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems.

Your Primary Responsibilities:
- Lead technical processes and designs considering reliability, data integrity, maintainability, reuse, extensibility, usability, and scalability
- Review the development team's code to ensure quality and adherence to best practices and standards
- Mentor junior developers to develop their skills and build strong talent
- Collaborate with Infrastructure partners to identify and deploy optimal hosting environments
- Define scalability and performance criteria for assigned applications
- Ensure applications meet performance, privacy, and security requirements
- Verify test plans to ensure compliance with performance and security requirements
- Support business and technical presentations in relation to technology platforms and business solutions
- Mitigate risk by following established procedures, monitoring controls, spotting key errors, and demonstrating strong ethical behavior
- Help develop solutions that balance cost and delivery while meeting business requirements
- Implement technology-specific best practices that are consistent with corporate standards
- Partner with multi-functional teams to ensure the success of product strategy and project deliverables
- Manage the software development process
- Drive new technical and business process improvements
- Estimate total costs of modules/projects covering both hours and expense
- Research and evaluate specific technologies and applications, and contribute to the solution design
- Construct application architecture encompassing end-to-end designs

Qualifications:
- Minimum of 7 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 7+ years of strong frontend experience - jQuery and JavaScript
- 7+ years of active development experience/expertise in Java/J2EE based applications
- Proven ability with Hibernate, Spring, and Spring MVC
- Experience in web based UI development
- Experience with CSS, HTML, JavaScript, and similar UI frameworks (jQuery, React)
- Familiarity with microservices based architecture and distributed systems
- Hands-on experience with AI tools such as Amazon Q is a plus
- Ability to develop and work with REST APIs using the Spring Boot framework
- Hands-on experience with AWS technologies and Snowflake is a plus
- Strong database and PL/SQL skills (Oracle, Postgres preferred)
- Experience with Messaging, ETL, or Reporting tools is a plus
- Knowledge of Python a plus
- Familiarity with Agile development methodology
- Collaborate with multiple stakeholders such as product management, application development, DevOps, and other technical groups
Posted 1 week ago
5.0 - 9.0 years
14 - 19 Lacs
Hyderabad, Chennai
Work from Office
The Impact you will have in this role:
The role involves developing and maintaining control functions for the GTR application. This role is also expected to work closely with the required development teams, our Enterprise Infrastructure partners, and our internal business clients to resolve and escalate technical support incidents where necessary.

Your Primary Responsibilities:
- Develop and maintain Python based control functions
- Develop data models for various applications based on the Snowflake database
- Work with Streams and Streamlit in Snowflake for GUI based development (a minimal sketch follows this listing)
- Work with support teams like EAS GTR to resolve Production and PSE related incidents

Qualifications:
- Minimum of 6 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 5+ years of active development experience/expertise in Python based applications
- Experience in ticket tracking tools like ServiceNow (SNOW), Jira, etc.
- Database and PL/SQL skills (Snowflake preferred) is a plus
- Experience with Bitbucket and Jenkins tools
- Experience with Messaging, ETL, or Reporting tools is a plus
- Familiarity with Agile development methodology
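Since the listing calls out Streams and Streamlit in Snowflake for GUI-based development, here is a minimal, hypothetical sketch of a Streamlit-in-Snowflake control dashboard. The monitoring table and columns are invented; get_active_session() is the standard way a Streamlit app hosted in Snowflake obtains its Snowpark session.

```python
# Hypothetical sketch of a Streamlit-in-Snowflake control dashboard.
# Table and column names are placeholders for illustration.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()

st.title("GTR Control Checks")

# Pull the latest control-run results from an assumed monitoring table.
df = session.sql("""
    SELECT check_name, status, run_ts
    FROM control_db.monitoring.check_results
    ORDER BY run_ts DESC
    LIMIT 100
""").to_pandas()

failed = df[df["status"] == "FAILED"]
st.metric("Failed checks (last 100 runs)", len(failed))
st.dataframe(df)
```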
Posted 1 week ago
9.0 - 14.0 years
30 - 35 Lacs
Chennai
Work from Office
DTCC Digital Assets
DTCC Digital Assets is at the forefront of driving institutional adoption of digital assets technology, with a steadfast commitment to innovation anchored in security and stability. As the financial services industry's trusted technology partner, we pride ourselves on empowering a globally interconnected and efficient ecosystem. Our mission is to provide secure and compliant infrastructure for digital assets, enabling financial institutions to unlock the full potential of blockchain technology.

We are seeking an experienced and highly skilled Principal Data Engineer to join our dynamic team. As a Principal Data Engineer, you will play a crucial role in designing, building, and growing our greenfield Snowflake Data Platform for Digital Assets.

The Impact you will have in this role:
The Principal Data Engineer role is substantial in shaping the data infrastructure and strategic direction of the Digital Assets department. By leading the design and implementation of a greenfield Snowflake Data Platform, this role directly influences the organization's ability to manage and leverage data for operational efficiency and risk assessment. The Associate Director ensures that data systems are scalable, secure, and aligned with business goals, enabling faster decision-making and innovation. Their leadership in managing cross-functional teams and collaborating with stakeholders ensures that technical solutions are not only robust but also responsive to evolving business needs.

Beyond technical execution, this role plays a pivotal part in fostering a culture of accountability, growth, and inclusion. By mentoring team members, driving employee engagement, and promoting best practices in agile development and data governance, the Associate Director helps build a resilient and high-performing engineering organization. Their contributions to incident management, platform adoption, and continuous improvement efforts ensure that the data platform remains reliable and future-ready, positioning the company to stay competitive in the rapidly evolving digital assets landscape.

Role description:
- Lead engineering and development focused projects from start to finish with minimal supervision
- Provide technical and operational support for our customer base as well as other technical areas within the company
- Review and supervise the system design and architecture
- Interact with stakeholders to understand requirements and provide solutions
- Perform risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk and audit related objectives
- Refine and prioritize the backlog for the team in partnership with product management
- Groom and guide the team of employees and consultants
- Be responsible for employee engagement, growth, and appraisals
- Participate in user training to increase awareness of the platform
- Ensure incident, problem, and change tickets are addressed in a timely fashion, escalating technical and managerial issues as needed
- Ensure quality and consistency of data from source systems, and align with data product managers on facilitating resolution of data issues in a consistent manner
- Follow DTCC's ITIL process for incident, change, and problem resolution

Talents Needed for Success:
- Bachelor's degree in Computer Science, Information Technology, Engineering (any), or a related field
- 8 years of experience in the job or a related position, with prior experience to include:
- 5 years of experience managing data warehouses in a production environment, covering all phases of lifecycle management: planning, design, deployment, upkeep, and retirement
- 5 years leading development teams with a mix of onshore and offshore members
- Experience designing and architecting data warehousing applications
- Warehousing concepts involving facts and dimensions, star/snowflake schemas, and data integration methods and tools
- Deep understanding of the Snowflake platform
- Designing data pipelines
- SQL and relational databases
- Development in agile scrum teams
- Development following CI/CD processes
- Demonstrable experience with data streaming technologies like Kafka for data ingestion (a minimal sketch follows this listing)
- Knowledge of Blockchain technologies, Smart Contracts, and Financial Services a plus
- Designing low latency data platforms a plus
- Knowledge of Data Governance principles a plus
- Optimize/tune source streams, queries, and Power BI (or equivalent) dashboards

Leadership competencies:
- Champion Inclusion - Embrace individual differences and create an environment of support, belonging, and trust
- Communicate Clearly - Listen to understand; ask questions for clarity and deliver messages with purpose
- Cultivate Relationships - Show care and compassion for others and authentically build networks across functions
- Instill Ownership - Ensure accountability, manage execution, and mitigate risk to deliver results
- Inspire Growth - Develop yourself and others through coaching, feedback, and mentorship to meet career goals
- Propel Change - Think critically, respectfully challenge, and create innovative ways to drive growth
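To make the streaming-ingestion requirement concrete, here is a hedged sketch, not DTCC's design, of a Kafka consumer landing raw events into a Snowflake staging table for later shaping into fact and dimension tables. Topic, table, and connection details are assumptions; a production build would more likely use the Snowflake Kafka connector or Snowpipe Streaming rather than per-message inserts.

```python
# Hypothetical sketch of Kafka-based ingestion into a Snowflake staging
# table. All names and credentials are placeholders.
import json
from kafka import KafkaConsumer          # kafka-python
import snowflake.connector

consumer = KafkaConsumer(
    "trade-events",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

conn = snowflake.connector.connect(account="...", user="...", password="...",
                                   database="DW", schema="STAGING",
                                   warehouse="LOAD_WH")
cur = conn.cursor()

for msg in consumer:
    event = msg.value
    # Land raw events; downstream jobs shape them into fact/dimension tables.
    cur.execute(
        "INSERT INTO trade_events_raw (event_id, payload) "
        "SELECT %s, PARSE_JSON(%s)",
        (event["id"], json.dumps(event)),
    )
```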
Posted 1 week ago
5.0 - 10.0 years
14 - 19 Lacs
Hyderabad, Chennai
Work from Office
The Impact you will have in this role:
The role involves developing and maintaining control functions for the GTR application. This role is also expected to work closely with the required development teams, our Enterprise Infrastructure partners, and our internal business clients to resolve and escalate technical support incidents where necessary.

Your Primary Responsibilities:
- Develop and maintain Python based control functions
- Develop data models for various applications based on the Snowflake database
- Work with Streams and Streamlit in Snowflake for GUI based development
- Work with support teams like EAS GTR to resolve Production and PSE related incidents

Qualifications:
- Minimum of 6 years of related experience
- Bachelor's degree preferred or equivalent experience

Talents Needed for Success:
- 5+ years of active development experience/expertise in Python based applications
- Experience in ticket tracking tools like ServiceNow (SNOW), Jira, etc.
- Database and PL/SQL skills (Snowflake preferred) is a plus
- Experience with Bitbucket and Jenkins tools
- Experience with Messaging, ETL, or Reporting tools is a plus
- Familiarity with Agile development methodology
Posted 1 week ago
5.0 - 10.0 years
18 - 22 Lacs
Hyderabad
Work from Office
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (Tuesdays, Wednesdays, and a day unique to each team or employee).

The impact you will have in this role:
The Lead Platform Engineer is responsible for design analysis, documentation, testing, installation, implementation, optimization, maintenance, and support for the z/OS Operating System, third party products, the UNIX System Services environment, Mainframe WebSphere Application Server, and WebSphere Liberty. You will collaborate with application developers, middleware support, database administrators, and other IT professionals. The role requires experience with z/OS, JES2, USS internals, SMP/E installations, and mainframe vendor product knowledge. Skills in creating and managing web sites using both common and advanced web programming languages are advantageous.

What You'll Do:
- Perform design analysis, documentation, testing, implementation, and support for the mainframe infrastructure environment
- Install and manage mainframe software deployments in a highly granular SYSPLEX environment
- Install and maintain WASz and Liberty
- Enhance reporting and automation using supported mainframe tools such as JCL, REXX, SAS, SQL, Python, and Java/JavaScript
- Complete assignments by due dates, without detailed supervision
- Be responsible for Incident, Problem, and Change Management for all assigned products
- Ensure incidents and problems are closed according to domain standards, and that all change management requirements are strictly followed
- Mitigate risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior
- Participate in the team's on-call coverage rotation, which includes tactical systems administration, and provide weekend support
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately
- Participate in disaster recovery tests (on weekends)
- Actively engage in strategic goals for mainframe engineering, the department, and the organization
- Provide input and follow-through for continuous improvement to mainframe systems engineering processes and procedures
- Perform level 1 network troubleshooting for mainframe applications

Education:
- Bachelor's degree or equivalent experience

Talents Needed for Success:
- Accountability: Demonstrates reliability by taking necessary actions to continuously meet required deadlines and goals
- Global Collaboration: Applies a global perspective when working within a team by being aware of one's own style and ensuring all relevant parties are involved in key team tasks and decisions
- Communication: Articulates information clearly and presents information effectively and confidently when working with others
- Influencing: Convinces others by making a strong case, bringing others along to their viewpoint; maintains strong, trusting relationships while remaining comfortable challenging ideas
- Innovation and Creativity

Additional Qualifications:
- A minimum of 6+ years of systems programming experience in an IBM z/OS environment
- REXX programming experience preferred
- HTML, XML, Java, and JavaScript programming experience is preferred
- Experience with mainframe system automation (BMC AMI Ops) is a plus
- Understanding of VTAM and TCP/IP is a plus
- Knowledge of Ansible, Splunk, Snowflake, ZOWE, SAS is a plus
- Knowledge of Bitbucket, Jira, and DevOps orchestration tools is a plus
- Excellent written and verbal skills; the ability to multitask and work in a team environment is a must
- Excellent customer service skills to develop mutually beneficial relationships with a diverse set of customers
- Knowledge of Infrastructure as Code (IaC) standards is a plus
- Experience in a 24x7 global environment with knowledge of system high availability (HA) design and industry standard disaster recovery practices
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Database Administrator
Project Role Description: Design, implement and maintain databases. Install database management systems (DBMS). Develop procedures for day-to-day maintenance and problem resolution.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Database Administrator, you will design, implement, and maintain databases to ensure optimal performance and reliability. Your typical day will involve installing database management systems, developing procedures for daily maintenance, and resolving any issues that arise. You will work collaboratively with team members to enhance database functionality and support various applications, ensuring that data is accessible and secure for users across the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Required active participation/contribution in team discussions
- Contribute to providing solutions to work related problems
- Monitor database performance and implement improvements as necessary (a monitoring sketch follows this listing)
- Ensure data integrity and security through regular audits and updates

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse
- Good To Have Skills: Experience with data modeling and ETL processes
- Strong understanding of database management systems and their architecture
- Familiarity with SQL and query optimization techniques
- Experience in troubleshooting and resolving database issues

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- A 15 years full time education is required
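As one concrete reading of "monitor database performance" in Snowflake terms, the sketch below queries the real SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view for the slowest queries of the past day; the connection parameters are placeholders.

```python
# Hypothetical DBA-style health check against Snowflake's ACCOUNT_USAGE
# views, listing the 20 slowest queries of the last 24 hours.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="dba", password="...",
                                   warehouse="ADMIN_WH")
cur = conn.cursor()
cur.execute("""
    SELECT query_id,
           user_name,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_s
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20
""")
for query_id, user, wh, elapsed_s in cur:
    print(f"{elapsed_s:>8.1f}s  {user:<20} {wh:<12} {query_id}")
```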
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Database Administrator
Project Role Description: Design, implement and maintain databases. Install database management systems (DBMS). Develop procedures for day-to-day maintenance and problem resolution.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Database Administrator, you will design, implement, and maintain databases to ensure optimal performance and reliability. Your typical day will involve installing database management systems, developing procedures for daily maintenance, and resolving any issues that arise. You will work collaboratively with team members to enhance database functionality and support various applications, ensuring that data is accessible and secure for users across the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Required active participation/contribution in team discussions
- Contribute to providing solutions to work related problems
- Monitor database performance and implement improvements as necessary
- Ensure data integrity and security through regular audits and updates

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse
- Good To Have Skills: Experience with data modeling and ETL processes
- Strong understanding of database management systems and their architecture
- Experience in performance tuning and optimization of database queries
- Familiarity with backup and recovery strategies to safeguard data

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- A 15 years full time education is required
Posted 1 week ago
8.0 - 12.0 years
20 - 35 Lacs
Hyderabad, Bengaluru
Hybrid
Essential Responsibilities:

Architecture & Design
- Define and document the overall data platform architecture in GCP, including ingestion (Pub/Sub, Dataflow), storage (BigQuery, Cloud Storage), and orchestration (Composer, Workflows).
- Establish data modeling standards (star/snowflake schemas, partitioning, clustering) to optimize performance and cost (see the partitioning sketch after this listing).

Platform Implementation
- Build scalable, automated ETL/ELT pipelines for IoT telemetry and events.
- Implement streaming analytics and CDC where required to support real-time dashboards and alerts.

Data Products & Exchange
- Collaborate with data scientists and product managers to package curated datasets and ML feature tables as consumable data products.
- Architect and enforce a secure, governed data exchange layer, leveraging BigQuery Authorized Views, Data Catalog, and IAM, to monetize data externally.

Cost Management & Optimization
- Design cost-control measures: table partitioning/clustering, query cost monitoring, budget alerts, and committed-use discounts.
- Continuously analyze query performance and storage utilization to drive down TCO.

Governance & Security
- Define and enforce data governance policies (cataloging, lineage, access controls) using Cloud Data Catalog and Cloud IAM.
- Ensure compliance with privacy, security, and regulatory requirements for internal and external data sharing.

Stakeholder Enablement
- Partner with business stakeholders to understand data needs and translate them into platform capabilities and SLAs.
- Provide documentation, training, and self-service tooling (Data Studio templates, APIs, notebooks) to democratize data access.

Mentorship & Leadership
- Coach and mentor engineers on big data best practices, SQL optimization, and cloud-native architecture patterns.
- Lead architecture reviews, proof-of-concepts, and pilot projects to evaluate emerging technologies (e.g., BigQuery Omni, Vertex AI).

Minimum Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years designing and operating large-scale data platforms, with at least 5 years of hands-on experience in GCP (BigQuery, Dataflow, Pub/Sub).
- Deep expertise in BigQuery performance tuning, data partitioning/clustering, and cost-control techniques.
- Proven track record building streaming and batch pipelines (Apache Beam, Dataflow, Spark).
- Strong SQL skills and experience with data modeling for analytics.
- Familiarity with data governance tools: Data Catalog, IAM, VPC Service Controls.
- Experience with Python or Java for ETL/ELT development.
- Excellent communication skills, able to translate technical solutions for non-technical stakeholders.
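The cost-control bullets above lean heavily on partitioning and clustering. As an illustrative sketch only (project, dataset, and schema are invented), this creates a partitioned, clustered BigQuery table with google-cloud-bigquery so telemetry queries prune by day.

```python
# Hypothetical sketch of the BigQuery cost-control pattern described above.
# Assumes application-default credentials are configured.
from google.cloud import bigquery

client = bigquery.Client(project="iot-platform")

ddl = """
CREATE TABLE IF NOT EXISTS telemetry.device_events (
  device_id   STRING,
  event_ts    TIMESTAMP,
  metric      STRING,
  value       FLOAT64
)
PARTITION BY DATE(event_ts)      -- prune scans to the days queried
CLUSTER BY device_id, metric     -- co-locate rows for selective filters
OPTIONS (partition_expiration_days = 90)
"""
client.query(ddl).result()

# Queries filtering on the partition column scan far less data (and cost less):
job = client.query("""
    SELECT device_id, AVG(value) AS avg_value
    FROM telemetry.device_events
    WHERE DATE(event_ts) = CURRENT_DATE()
    GROUP BY device_id
""")
for row in job:
    print(row.device_id, row.avg_value)
```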
Posted 1 week ago
3.0 - 7.0 years
8 - 10 Lacs
Vadodara
Remote
The Job:
At Convoso, we're constantly, vigilantly looking for ways to reshape the future of lead generation contact centers. Our mission is to revolutionize the call center industry by empowering agents to convert leads faster. That's where you come in.

We are looking for an experienced and detail-oriented Business Intelligence Analyst to join our dynamic team. As a Business Intelligence Analyst, you will play a critical role in transforming data into actionable insights that drive informed decision-making and optimize business performance. Your expertise in data analysis, reporting, and visualization will be instrumental in providing valuable strategic recommendations to our organization.

Stepping into this very challenging role will mean stepping into a dynamic environment. There'll be a steep learning curve, but we believe the future belongs to those who build it. Therefore, success for you would mean reaching your full potential in a short period of time, while doing whatever it takes to get up to speed. Success would also mean having a strong ability to manage multiple projects with competing deadlines.

Responsibilities:
- Data Analysis: Collect, organize, and analyze large volumes of structured and unstructured data from various sources to identify trends, patterns, and opportunities.
- Reporting and Visualization: Develop and maintain reports and interactive visualizations using BI tools. Present data in a clear and concise manner to facilitate understanding and enable stakeholders to make informed business decisions.
- Performance Monitoring: Monitor key performance indicators (KPIs) and track business metrics to identify areas for improvement and measure the impact of initiatives. Collaborate with cross-functional teams to define performance targets, establish benchmarks, and create performance reports.
- Data Quality Assurance: Ensure data accuracy, consistency, and integrity by conducting data validation, cleansing, and quality checks. Identify and resolve data discrepancies.
- Business Insights and Recommendations: Collaborate with cross-functional teams to understand their requirements and translate them into actionable insights.

Knowledge & Skills:
- Bachelor's degree in a relevant field such as Business Administration, Statistics, Mathematics, Economics, or Computer Science.
- 3-4 years of experience as a Business Intelligence Analyst or in a similar analytical role, with a focus on data analysis, reporting, and visualization.
- Experience in Alteryx or similar BI tools such as Tableau or Power BI; knowledge of SQL and Snowflake is a plus.
- Detail-oriented mindset with excellent analytical and problem-solving skills.
- Ability to work with complex datasets and derive meaningful insights.
- Strong communication skills, with the ability to effectively convey complex information to both technical and non-technical stakeholders.
- Knowledge of Excel is a must.
- Proven ability to handle multiple projects and prioritize tasks in a fast-paced environment.
- US company experience preferred.

Immediate Joiner Preferable | Full Time Remote
Posted 1 week ago
1.0 - 6.0 years
6 - 13 Lacs
Bengaluru
Work from Office
Are you a seasoned data engineer with a passion for hands-on technical work? Do you thrive in an environment that values innovation, collaboration, and cutting-edge technologies? We are looking for an experienced Integration Engineer to join our team, someone who is passionate about building and maintaining scalable data pipelines and integrations. The ideal candidate will have a strong foundation in Python programming, experience with Snowflake for data warehousing, proficiency in AWS and Kubernetes (EKS) for cloud services management, and expertise in CI/CD practices, Apache Airflow, DBT, and API development. This role is critical to enhancing our data integration capabilities and supporting our data-driven initiatives.

Role and Responsibilities:
As the Technical Data Integration Engineer, you will play a pivotal role in shaping the future of our data integration engineering initiatives. You will be part of a team of talented data integration engineers while remaining actively involved in the technical aspects of the projects. Your responsibilities will include:
- Hands-On Contribution: Continue to be hands-on with data integration engineering tasks, including data pipeline development, EL processes, and data integration. Be the go-to expert for complex technical challenges.
- Integrations Architecture: Design and implement scalable and efficient data integration architectures that meet business requirements. Ensure data integrity, quality, scalability, and security throughout the pipeline.
- Tool Proficiency: Leverage your expertise in Snowflake, SQL, Apache Airflow, AWS, APIs, and Python to architect, develop, and optimize data solutions (an orchestration sketch follows this listing). Stay current with emerging technologies and industry best practices.
- Data Quality: Monitor data quality and integrity, implementing data governance policies as needed.
- Cross-Functional Collaboration: Collaborate with data science, data warehousing, analytics, and other cross-functional teams to understand data requirements and deliver actionable insights.
- Performance Optimization: Identify and address performance bottlenecks within the data infrastructure. Optimize data pipelines for speed, reliability, and efficiency.

Qualifications:
- Minimum Bachelor's degree in Computer Science, Engineering, or a related field. An advanced degree is a plus.
- 5 years of hands-on experience in data engineering.
- Familiarity with cloud platforms such as AWS or Azure.
- Expertise in Apache Airflow, Snowflake, SQL, Python, shell scripting, API gateways, and web services setup.
- Strong experience in full-stack development, AWS, Linux administration, data lake construction, data quality assurance, and integration metrics.
- Excellent analytical, problem-solving, and decision-making abilities.
- Strong communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
- A collaborative mindset, with a focus on team success.

If you are a results-oriented Data Integration Engineer with a strong background in Apache Airflow, Snowflake, SQL, Python, and APIs, we encourage you to apply. Join us in building data solutions that drive business success and innovation.

Additional Information:
Intuitive is an Equal Employment Opportunity Employer.
We provide equal employment opportunities to all qualified applicants and employees, and prohibit discrimination and harassment of any type, without regard to race, sex, pregnancy, sexual orientation, gender identity, national origin, color, age, religion, protected veteran or disability status, genetic information or any other status protected under federal, state, or local applicable laws. We will consider for employment qualified applicants with arrest and conviction records in accordance with fair chance laws.
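For a concrete picture of the Airflow orchestration this listing centers on, here is a minimal, hypothetical DAG wiring an extract task to a Snowflake load task. The DAG id, task bodies, and schedule are assumptions, and the schedule= argument requires Airflow 2.4+.

```python
# Hypothetical Airflow DAG sketching an extract-then-load pipeline:
# pull from a source API to S3, then load into Snowflake. Task bodies
# are stubbed; all names are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    """Fetch a day's records from a (hypothetical) source API to S3."""
    ...  # e.g., requests.get(...) + boto3 upload, omitted for brevity


def load_to_snowflake(**context):
    """COPY the staged files into a Snowflake staging table."""
    ...  # e.g., snowflake.connector + COPY INTO, omitted for brevity


with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # Airflow 2.4+ keyword
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load
```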
Posted 1 week ago
7.0 - 8.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Role & responsibilities:
- Oversee the design, implementation, and optimization of data warehousing solutions leveraging tools like Snowflake, Databricks, and other cloud data platforms.
- Lead the delivery of software projects from initiation through implementation.
- Lead the delivery of ETL processes for ingesting, transforming, and managing large-scale datasets.
- Lead the delivery of data analytics dashboards and reports on modern data stacks.
- Develop project plans, allocate resources, and track progress using project management tools such as Jira, Asana, Trello, or MS Project.
- Act as the primary point of contact for clients, building strong relationships, providing regular updates, and addressing concerns promptly.
- Manage risks and resolve project roadblocks to ensure timely delivery of high-quality solutions.
- Ensure projects align with data governance best practices, security protocols, and client standards.
- Provide technical guidance to the development team, ensuring high-quality and timely delivery.
- Work with stakeholders to define KPIs and ensure delivery meets the business and technical goals.
- Drive continuous improvement initiatives in delivery processes, data quality, and team efficiency.
- Provide leadership and mentoring to project teams, fostering a culture of collaboration, accountability, and excellence.

Preferred candidate profile:
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience managing the delivery of data warehouse, data engineering, and data analytics projects.
- Strong experience with cloud-based data platforms such as Snowflake, Databricks, or Amazon Redshift.
- Proficiency in managing ETL pipelines and understanding data transformation processes.
- Solid knowledge of data warehousing concepts (e.g., dimensional modelling, star/snowflake schema, OLAP/OLTP systems).
- Experience working with SQL for data querying, performance optimization, and testing.
- Proven ability to manage multiple stakeholders, prioritize tasks, and ensure client satisfaction.
- Proficiency with project management tools: Jira, Asana, Trello, or MS Project.
- Familiarity with Agile, Scrum, and Waterfall methodologies.
Posted 1 week ago
8.0 - 13.0 years
0 - 1 Lacs
Chennai
Hybrid
Duties and Responsibilities:
- Lead the design and implementation of scalable, secure, and high-performance solutions for data-intensive applications.
- Collaborate with stakeholders, other product development groups, and software vendors to identify and define solutions for complex business and technical requirements.
- Develop and maintain cloud infrastructure using platforms such as AWS, Azure, or Google Cloud.
- Articulate technology solutions and explain the competitive advantages of various technology alternatives.
- Evangelize best practices to analytics teams.
- Ensure data security, privacy, and compliance with relevant regulations.
- Optimize cloud resources for cost-efficiency and performance.
- Lead the migration of on-premises data systems to the cloud.
- Implement data storage, processing, and analytics solutions using cloud-native services.
- Monitor and troubleshoot cloud infrastructure and data pipelines.
- Stay updated with the latest trends and best practices in cloud computing and data management.

Skills:
- 5+ years of hands-on design and development experience implementing data analytics applications using AWS services such as S3, Glue, AWS Step Functions, Kinesis, Lambda, Lake Formation, Athena, Elastic Container Service/Elastic Kubernetes Service, Elasticsearch, and Amazon EMR or Snowflake (an Athena query sketch follows this listing).
- Experience with AWS services such as AWS IoT Greengrass, AWS IoT SiteWise, AWS IoT Core, and AWS IoT Events.
- Strong understanding of cloud architecture principles and best practices.
- Proficiency in designing network topology, endpoints, application registration, and network pairing.
- Well versed in access management in Azure or the cloud.
- Experience with containerization technologies like Docker and Kubernetes.
- Expertise in CI/CD pipelines and version control systems like Git.
- Excellent problem-solving skills and attention to detail.
- Strong communication and leadership skills.
- Ability to work collaboratively with cross-functional teams and stakeholders.
- Knowledge of security and compliance standards related to cloud data platforms.

Technical / Functional Skills:
- At least 3+ years of experience in the implementation of all the Amazon Web Services listed above
- At least 3+ years of experience as a SAP BW Developer
- At least 3+ years of experience in Snowflake (or Redshift)
- At least 3+ years of experience as a Data Integration Developer in Fivetran/HVR/DBT, Boomi (or Talend/Informatica)
- At least 2+ years of experience with Azure OpenAI, Azure AI Services, Microsoft Copilot Studio, Power BI, Power Automate
- Experience in Networking and Security

Domain Expertise:
- Experience with SDLC/Agile/Scrum/Kanban

Project Experience:
- Hands-on experience in the end-to-end implementation of data analytics applications on AWS
- Hands-on experience in the end-to-end implementation of SAP BW applications for FICO, Sales & Distribution, and Materials Management
- Hands-on experience with Fivetran/HVR/Boomi in the development of data integration services with data from SAP, Salesforce, Workday, and other SaaS applications
- Hands-on experience in the implementation of Gen AI use cases using Azure services
- Hands-on experience in the implementation of advanced analytics use cases using Python/R

Certifications:
- AWS Certified Solutions Architect - Professional
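Among the AWS services listed, Athena lends itself to a compact example. The following hedged sketch runs a serverless query via boto3's Athena API and polls for completion; the database, table, and S3 output location are placeholders.

```python
# Hypothetical sketch: running and fetching an Athena query with boto3.
# Region, database, table, and output bucket are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

qid = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS n FROM orders GROUP BY status",
    QueryExecutionContext={"Database": "sales_lake"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(
        QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows[1:]:  # rows[0] is the header row
        print([col.get("VarCharValue") for col in row["Data"]])
```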
Posted 1 week ago
6.0 - 10.0 years
12 - 22 Lacs
Chennai
Hybrid
Role & responsibilities

The Role:
We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW) and supporting our POS (Point of Sale) Channel Data Management team. This role will include participating in the loading and extraction of data, including POS, to and from the warehouse. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.

Your Contribution:
Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you'll need for success at Logitech. In this role you will:
- Design, develop, document, and test ETL solutions using industry standard tools.
- Design physical and reporting data models for seamless cross-functional and cross-systems data reporting.
- Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.
- Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.
- Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.
- Work closely across our D&I teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau.
- Collaborate with channel data and cross-functional teams to define requirements for POS and MDM data flows.
- Support customer MDM and POS ad hoc requests and data clarifications from the Channel Data Team and the Finance Team.
- Collaborate with the BIOPS team to support quarter-end user activities and ensure compliance with SOX regulations.
- Be willing to explore and learn new technologies and concepts in order to provide the right kind of solution.

Key Qualifications:
For consideration, you must bring the following minimum skills and behaviors to our team:
- A total of 4 to 7 years of experience in ETL design, development, and populating data warehouses, including experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.
- At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.
- Practical experience working with cloud-based data warehouses such as Snowflake and Redshift.
- Significant hands-on experience with Snowflake utilities, including SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures (a short sketch of Streams and Time Travel follows this listing).
- Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.
- Demonstrated experience in designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMS, and flat files.
- Exposure to standard support ticket management tools.
- A strong understanding of Business Intelligence and data warehousing concepts and methodologies.
- Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical thinking capabilities.
- A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.
- A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.
- Familiarity with Snowflake's unique features, such as its multi-cluster architecture and shareable data capabilities.
- Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.
- The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability.
- Strong communication skills for effective collaboration with both technical and non-technical teams, ensuring a clear understanding of data engineering requirements.

In addition, preferable skills and behaviors include:
- Exposure to an Oracle ERP environment.
- Basic understanding of reporting tools like OBIEE and Tableau.

Education:
BS/BTech/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.

Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we're small and flexible enough for every person to take initiative and make things happen. But we're big enough in our portfolio, and reach, for those actions to have a global impact. That's a pretty sweet spot to be in and we're always striving to keep it that way.
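Two of the Snowflake utilities named above, Streams and Time Travel, can be shown in a few lines. This is an illustrative sketch with invented table names, not Logitech's pipeline: a stream feeds an incremental insert, and an AT(OFFSET => ...) clause reads the table as it stood an hour ago.

```python
# Hypothetical sketch of Snowflake Streams (change capture) and
# Time Travel (point-in-time reads). Object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...",
                                   database="EDW", schema="POS",
                                   warehouse="ETL_WH")
cur = conn.cursor()

# 1) A stream records inserts/updates/deletes on the source table...
cur.execute("CREATE STREAM IF NOT EXISTS pos_sales_stream ON TABLE pos_sales")

# ...so an incremental job can consume only what changed since last read.
cur.execute("""
    INSERT INTO pos_sales_curated
    SELECT * FROM pos_sales_stream
    WHERE METADATA$ACTION = 'INSERT'
""")

# 2) Time Travel: how many rows arrived in the last hour?
cur.execute("""
    SELECT COUNT(*) - (SELECT COUNT(*) FROM pos_sales AT(OFFSET => -3600))
    FROM pos_sales
""")
print(cur.fetchone())
```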
Posted 1 week ago
8.0 - 13.0 years
25 - 37 Lacs
Pune
Hybrid
Job Title: Data Engineer

Job Duties and Responsibilities:
We are looking for a self-starter to join our Data Engineering team. You will work in a fast-paced environment where you will get an opportunity to build and contribute to the full lifecycle development and maintenance of the data engineering platform. With the Data Engineering team you will get an opportunity to:
- Design and implement data engineering solutions that are scalable, reliable, and secure in a cloud environment
- Understand and translate business needs into data engineering solutions
- Build large scale data pipelines that can handle big data sets using distributed data processing techniques, supporting the efforts of the data science and data application teams (an incremental-processing sketch follows this listing)
- Partner with cross-functional stakeholders including product managers, architects, data quality engineers, and application and Quantitative Science end users to deliver engineering solutions
- Contribute to defining data governance across the data platform

Basic Requirements:
- A minimum of a BS degree in computer science, software engineering, or a related scientific discipline
- 5+ years of work experience building scalable and robust data engineering solutions
- Strong understanding of object-oriented programming and proficiency in Python (TDD) and PySpark to build scalable algorithms
- 5+ years of experience in distributed computing and big data processing using the Apache Spark framework, including Spark optimization techniques
- 5+ years of experience with Databricks, Delta tables, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), and incremental data processing
- Experience with Delta Lake and Unity Catalog
- Advanced SQL coding and query optimization experience, including the ability to write analytical and nested queries
- 5+ years of experience building scalable ETL/ELT data pipelines on Databricks and AWS (EMR)
- 5+ years of experience orchestrating data pipelines using Apache Airflow/MWAA
- Understanding and experience of AWS services including ADX, EC2, and S3
- 5+ years of experience with data modeling techniques for structured/unstructured datasets
- Experience with relational/columnar databases (Redshift, RDS) and interactive querying services (Athena/Redshift Spectrum)
- Passion for healthcare and improving patient outcomes
- Analytical thinking with strong problem-solving skills
- Stays on top of emerging technologies and possesses a willingness to learn

Bonus Experience (optional):
- Experience with an Agile environment
- Experience operating in a CI/CD environment
- Experience building HTTP/REST APIs using popular frameworks
- Healthcare experience
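As a hedged illustration of the incremental Delta processing the listing describes (not the employer's actual pipeline), this PySpark sketch reads only rows newer than a watermark and appends them to a curated Delta table; the table names and watermark source are assumptions.

```python
# Hypothetical incremental Delta-table job for a Databricks runtime
# with Delta Lake. All table/column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_incremental").getOrCreate()

# Read only new data since the last processed watermark.
last_watermark = "2024-01-01 00:00:00"   # in practice, read from a control table
new_rows = (
    spark.read.table("raw.claims")
    .where(F.col("ingested_at") > F.lit(last_watermark))
)

# Light transformation, then append into the curated Delta table.
curated = new_rows.select("claim_id", "member_id", "amount", "ingested_at")
(curated.write
        .format("delta")
        .mode("append")
        .saveAsTable("curated.claims"))
```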
Posted 1 week ago
8.0 - 10.0 years
10 - 15 Lacs
Pune
Work from Office
Role Purpose:
The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do:

1. Develop architectural solutions for new deals / major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
- Provide solutioning for RFPs received from clients and ensure overall design assurance
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture
- Provide technical leadership to the design, development, and implementation of custom solutions through thoughtful use of modern technology
- Define and understand current state solutions, and identify improvements, options, and tradeoffs to define target state solutions
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps
- Evaluate and recommend solutions to integrate with the overall technology ecosystem
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, mentioning all the artefacts in detail
- Validate the solution/prototype from a technology, cost structure, and customer differentiation point of view
- Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture
- Track industry and application trends and relate these to planning current and future IT needs
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
- Collaborate with all relevant parties in order to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture
- Identify implementation risks and potential impacts

2. Enable delivery teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
- Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects
- Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
- Recommend tools for reuse and automation for improved productivity and reduced cycle times
- Lead the development and maintenance of enterprise frameworks and related artefacts
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
- Ensure architecture principles and standards are consistently applied to all projects
- Ensure optimal client engagement: support the pre-sales team while presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and create an impact with the solution proposed; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

3. Competency building and branding
- Ensure completion of necessary trainings and certifications
- Develop Proof of Concepts (POCs), case studies, demos, etc. for new growth areas based on market and customer research
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through top analyst rankings, client testimonials, and partner credits
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
- Mentor developers, designers, and junior architects on the project for their further career development and enhancement
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management
- Resourcing: Anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team
- Talent management: Ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions
- Performance management: Set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team

Mandatory Skills: Snowflake.
Experience: 8-10 Years.
Posted 1 week ago
8.0 - 13.0 years
25 - 37 Lacs
Bengaluru
Work from Office
100% Remote

Snowflake / SQL Architect
• Architect and manage scalable data solutions using Snowflake and advanced SQL, optimizing performance for analytics and reporting.
• Design and implement data pipelines, data warehouses, and data lakes, ensuring efficient data ingestion and transformation.
• Develop best practices for data security, access control, and compliance within cloud-based data environments (a role-based access-control sketch follows this listing).
• Collaborate with cross-functional teams to understand business needs and translate them into robust data architectures.
• Evaluate and integrate third-party tools and technologies to enhance the Snowflake ecosystem and overall data strategy.
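One way to ground the access-control bullet: the sketch below issues standard Snowflake RBAC grants from Python. Role, warehouse, schema, and user names are invented; the GRANT statements themselves are ordinary Snowflake syntax.

```python
# Hypothetical role-based access-control setup in Snowflake.
# All object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="secadmin",
                                   password="...", role="SECURITYADMIN")
cur = conn.cursor()

for stmt in [
    "CREATE ROLE IF NOT EXISTS ANALYST",
    "GRANT USAGE ON WAREHOUSE BI_WH TO ROLE ANALYST",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST",
    "GRANT USAGE ON SCHEMA ANALYTICS.MART TO ROLE ANALYST",
    # Read-only on current and future tables in the mart schema.
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MART TO ROLE ANALYST",
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA ANALYTICS.MART TO ROLE ANALYST",
    "GRANT ROLE ANALYST TO USER jdoe",
]:
    cur.execute(stmt)
```

Granting through roles rather than directly to users keeps access auditable and easy to revoke, which is what the compliance bullet above is driving at.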
Posted 1 week ago
5.0 - 7.0 years
0 - 1 Lacs
Bengaluru
Remote
Job Role: Snowflake / SQL Architect
Duration: 6 months contract

Job Description:
• Architect and manage scalable data solutions using Snowflake and advanced SQL, optimizing performance for analytics and reporting.
• Design and implement data pipelines, data warehouses, and data lakes, ensuring efficient data ingestion and transformation.
• Develop best practices for data security, access control, and compliance within cloud-based data environments.
• Collaborate with cross-functional teams to understand business needs and translate them into robust data architectures.
• Evaluate and integrate third-party tools and technologies to enhance the Snowflake ecosystem and overall data strategy.
Posted 1 week ago
8.0 - 10.0 years
13 - 18 Lacs
Chennai
Work from Office
Core Qualifications:
- 12+ years in software/data architecture with hands-on experience.
- Agentic AI & AWS Bedrock (Must-Have): Demonstrated hands-on design, deployment, and operational experience with Agentic AI solutions leveraging AWS Bedrock and AWS Bedrock Agents (a minimal invocation sketch follows this listing).
- Deep expertise in cloud-native architectures on AWS (compute, storage, networking, security).
- Proven track record defining technology stacks across microservices, event streaming, and modern data platforms (e.g., Snowflake, Databricks).
- Proficiency with CI/CD and IaC (Azure DevOps, Terraform).
- Strong knowledge of data modeling, API design (REST/GraphQL), and integration patterns (ETL/ELT, CDC, messaging).
- Excellent communication and stakeholder-management skills; able to translate complex tech into business value.

Preferred:
- Media or broadcasting industry experience.
- Familiarity with Salesforce or other enterprise iPaaS solutions.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF.

Mandatory Skills: Generative AI.
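For orientation, here is a minimal, hypothetical call to a foundation model through AWS Bedrock's Converse API via boto3; agentic orchestration with Bedrock Agents would layer tool use and memory on top of calls like this. The model ID and prompt are placeholders, assuming the model is enabled in the account and boto3 is recent enough to include converse.

```python
# Hypothetical single-turn call to AWS Bedrock's Converse API.
# Model ID and prompt are placeholders; requires a recent boto3.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumption: enabled in the account
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize today's failed ingestion jobs in one paragraph."}],
    }],
    inferenceConfig={"maxTokens": 300},
)
print(response["output"]["message"]["content"][0]["text"])
```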
Posted 1 week ago
5.0 - 8.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Role Purpose:
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve issues, escalate them to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: AWS Glue.
Experience: 5-8 Years.
Posted 1 week ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
The Data Transformation Team is responsible for maintaining and evolving robust data transformation pipelines using dbt, enabling consistent and high-quality data delivery to power our BX data catalog and downstream analytics. We promote best practices in data engineering and work collaboratively across all BXTI teams to elevate the overall data maturity of the organization.
The success of this role is measured across two core capability areas:
1) Database Design & Data Analysis: Success in this role requires a strong foundation in data modeling and database design, with the ability to structure data that supports scalable and efficient analytics. The ideal candidate can analyze raw data, interpret business logic, and translate it into well-documented, tested data models. Familiarity with SDLC processes is essential, including requirement gathering, validation, and quality assurance of data assets.
2) Technical Execution & Infrastructure: The ideal candidate has strong expertise in developing and managing data transformation pipelines using SQL and Python, with a focus on performance, scalability, and reliability. They should be well versed in orchestrating workflows, deploying solutions in Snowflake, and working with AWS services across various environments. Experience with CI/CD using tools like Jenkins, along with Docker for containerization, is essential for maintaining robust and repeatable deployment processes.
Qualifications:
3+ years of hands-on experience in data transformation, data warehouse/database development, and ETL processing on large-scale datasets.
Proven expertise in SQL with a deep understanding of complex queries, including joins, unions, CTEs, and conditional logic.
Proficiency in Python for scripting, automation, and data manipulation.
Comfortable working in Linux environments and with version control systems (e.g., Git).
Experience working within SDLC frameworks and Agile methodologies in a professional software development setting.
Experience with containerization technologies such as Docker, Kubernetes, or serverless architectures.
Strong organizational and multitasking skills, with the ability to manage multiple parallel projects effectively.
Excellent communication skills, both written and verbal, to collaborate with business and technical teams toward shared outcomes.
Self-driven and proactive, with a strong customer-focused mindset.
Working knowledge of Regular Expressions (Regex).
Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
Preferred / Nice to Have:
Familiarity with dbt (data build tool) for managing and scaling transformation logic (see the sketch below).
Hands-on experience with business intelligence tools like Tableau or Sigma.
Mandatory Skills: Data Warehousing. Experience: 3-5 Years.
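Since dbt is the centerpiece of this team's pipelines, here is a minimal sketch of driving dbt from Python via the programmatic entry point that dbt-core 1.5+ exposes; the "staging" and "marts" selectors are hypothetical model groups, not names from this posting.

```python
# A minimal sketch of invoking dbt programmatically (dbt-core 1.5+);
# the "staging" and "marts" selectors are hypothetical examples.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Test the staging layer first, then build the downstream marts,
# mirroring the tested, well-documented models described above.
for command in (["test", "--select", "staging"], ["run", "--select", "marts"]):
    result: dbtRunnerResult = runner.invoke(command)
    if not result.success:
        raise SystemExit("dbt step failed: " + " ".join(command))
```

Failing fast on the test step keeps a broken staging model from ever reaching the marts, which is the usual reason to sequence the two commands this way.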
Posted 1 week ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Responsibilities:
- Design, develop, and maintain robust automation frameworks using Python and JavaScript.
- Create and execute automated test scripts for both front-end and back-end (API) testing.
- Develop and implement comprehensive test plans and test cases.
- Perform API testing and automation to ensure the reliability and performance of web services.
- Utilize performance/load testing tools like K6, Locust, or similar to evaluate system performance under load.
- Collaborate with development and DevOps teams to integrate automated tests into CI/CD pipelines using Jenkins.
- Deploy and manage test environments using Docker and AWS.
- Identify, document, and track defects and issues, working with development teams to ensure timely resolution.
- Continuously improve testing methodologies and processes to enhance quality and efficiency.
- Stay updated with the latest industry trends and technologies to ensure the continuous improvement of our testing practices.
- Create test cases and perform test case execution for the feature under test using a test management tool.
- Report defects for issues found during testing and manage each defect through its life cycle.
- Work collaboratively with design and development team members to resolve identified issues/bugs in a timely manner.
- Apply functional knowledge and understanding to determine the impacted testing areas and produce the relevant regression testing scope.
Requirements:
Bachelor's degree in Computer Science, Information Technology, or a related field.
3-6 years of experience in quality assurance/testing, with a focus on automation.
Strong proficiency in Python and JavaScript.
Proven experience in creating and maintaining automation frameworks.
Solid understanding of API testing and automation.
Hands-on experience with performance/load testing tools such as K6, Locust, or similar (see the sketch below).
Knowledge of AWS services and cloud architecture.
Proficiency with Docker for containerization.
Experience with CI/CD tools, particularly Jenkins.
Excellent problem-solving skills and attention to detail.
Strong communication and teamwork skills.
Ability to work independently and manage multiple tasks simultaneously.
Familiarity with Agile/Scrum methodologies.
Primary Skills:
Proven experience in creating and maintaining automation frameworks using Python.
API Testing/Automation
Functional Testing
Secondary Skills: AWS, Docker, Snowflake
Mandatory Skills: API Automation Testing. Experience: 3-5 Years.
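For the performance/load-testing requirement, here is a minimal Locust sketch; the host, endpoints, and payload are hypothetical placeholders, not details from this posting.

```python
# A minimal Locust load-test sketch; the host, endpoints, and payload
# are hypothetical placeholders.
from locust import HttpUser, task, between


class ApiUser(HttpUser):
    host = "https://api.example.com"  # hypothetical service under test
    wait_time = between(1, 3)         # seconds of think time per user

    @task(3)
    def list_items(self):
        # Weighted 3x to model the dominant read path.
        self.client.get("/v1/items")

    @task(1)
    def create_item(self):
        self.client.post("/v1/items", json={"name": "load-test"})
```

A headless run such as `locust -f locustfile.py --headless -u 50 -r 5 --run-time 1m` would then ramp 50 simulated users at 5 per second, which is the kind of CI-friendly invocation a Jenkins pipeline could schedule.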
Posted 1 week ago
4.0 - 8.0 years
0 - 1 Lacs
Hyderabad, Navi Mumbai, Pune
Work from Office
Role & Responsibilities
Key Responsibilities:
Design, develop, and deploy interactive dashboards and visualizations using TIBCO Spotfire.
Work with stakeholders to gather business requirements and translate them into scalable BI solutions.
Optimize Spotfire performance and apply best practices in visualization and data storytelling.
Integrate data from multiple sources such as SQL databases, APIs, Excel, SAP, or cloud platforms.
Implement advanced analytics using IronPython scripting, data functions, and R/statistical integration.
Conduct data profiling, cleansing, and validation to ensure accuracy and consistency.
Support end-users with training, troubleshooting, and dashboard enhancements.
Must-Have Skills:
5-8 years of experience in BI and Data Visualization.
Minimum 4 years hands-on with TIBCO Spotfire, including custom expressions and calculated columns.
Strong knowledge of data modeling, ETL processes, and SQL scripting.
Expertise in IronPython scripting for interactivity and automation within Spotfire (see the sketch below).
Experience working with large datasets and performance-tuning visualizations.
Good to Have:
Experience with R, Python, or Statistica for advanced analytics in Spotfire.
Familiarity with cloud-based data platforms (AWS Redshift, Snowflake, Azure Synapse).
Understanding of data governance, metadata management, and access controls.
Exposure to other BI tools like Tableau, Power BI, or QlikView.
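To illustrate the IronPython requirement, here is a minimal sketch of the kind of script attached to a Spotfire action control. It assumes a script parameter named `vis` bound to a visualization and a document property named `region` defined in the analysis; both names are hypothetical, and `Document` is injected by the Spotfire scripting environment, so the snippet only runs inside Spotfire.

```python
# A minimal Spotfire IronPython sketch (runs only inside Spotfire, which
# injects Document; "vis" is assumed to be a script parameter of type
# Visualization, and "region" a document property -- both hypothetical).
from Spotfire.Dxp.Application.Visuals import VisualContent

selected_region = Document.Properties["region"]

# Retitle the bound visualization to reflect the current property value.
content = vis.As[VisualContent]()
content.Title = "Sales - " + str(selected_region)
```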
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
As a member of the technology innovation team at Wipro, you will work with a group of talented individuals in a highly team-oriented environment to create the next generation of innovative solutions for the financial community at large. Our Product Owners are involved in the full development life cycle and work closely with all stakeholders during the conceptual, design, development, and production stages of each project. The Product Owner serves as the glue between accountants, quality assurance, and the engineers who build the designs, so they need to be able to discuss, manage, and document their work, product roadmaps, and decisions effectively for varied audiences.
Gather and transform business needs from business stakeholders (Real Estate, Private Equity, etc.) into epics/user stories, and work with the Engineering Team to drive solution delivery
Partner closely with global counterparts to facilitate regional implementation of global projects
Utilize available technology tools (e.g., Tableau, Sigma, Snowflake, Anaplan, Appian) within Blackstone to design and implement technology solutions for business stakeholders
Bridge cross-functional teams (client working groups, engineering, QA, etc.) to drive the entire project lifecycle from concept to completion
Contribute to product and project roadmaps to help the team define scope, deliverables, schedule, and workstreams
Assist with the product rollout process by driving decisions, tracking issues, and assisting in time estimation
Communicate project status updates to relevant stakeholders across the team, including but not limited to business management, senior technical leads, developers, and designers
Qualifications:
Wipro seeks to hire individuals who work well in a team-driven working group, are highly motivated, intelligent, have sound judgment, and have demonstrated excellence in prior endeavors. The successful candidate must possess:
Education: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field. Advanced degrees or certifications (e.g., CFA, CPA, Agile Product Owner) are a plus.
Prior experience as a Product Owner or Business Analyst is highly desirable.
Ability to understand user and technical requirements, establish a roadmap, manage projects, and drive decisions.
Strong understanding of fund accounting principles and processes.
Familiarity with fund accounting applications like Investran or similar systems.
Knowledge of Agile methodologies and tools (e.g., Jira, Confluence).
Proficiency in Tableau, Sigma, Snowflake, and Anaplan for data visualization, reporting, and financial planning.
A self-starting, entrepreneurial attitude, willing to work through obstacles to accomplish tasks.
An ability to juggle multiple projects with competing deadlines without compromising quality, and to take complete ownership of a product, line, or initiative.
Outstanding communication (written and verbal), presentation, documentation, and interpersonal skills.
Strong Microsoft Office skills (MS Excel, MS PowerPoint, and MS Word).
Programming skills and experience would be beneficial (i.e., Python, SQL, or VBA).
Experience working collaboratively with multiple groups in a variety of settings, and strong communication skills with the ability to listen, convey positions, and advocate designs.
Intellectual curiosity and the ability to ask thoughtful questions to identify core user needs.
Mandatory Skills: Business Analysis. Experience: 3-5 Years.
Posted 1 week ago