3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a DevOps Engineer for our team based in Europe, you will be responsible for leveraging your skills in Informatica PowerCenter and PowerExchange, Data Vault modeling, and Snowflake. With over 7 years of experience, you will bring valuable expertise in ETL development, specifically with Informatica PowerCenter and Data Vault modeling. Your proficiency in DevOps practices and SAFe methodologies will be essential to the smooth operation of our systems, and your hands-on experience with Snowflake and dbt will be advantageous in optimizing our data processes. You will work within a Scrum team, where your contributions will be vital; if you have previous experience as a Scrum Master or aspire to take on such a role, we encourage you to apply. If you are a detail-oriented professional with a passion for driving efficiency and innovation in a dynamic environment, we would love to hear from you. Please send your profile to contact@squalas.com to be considered for this exciting opportunity.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
The role within Niro Money's Data and Analytics team involves translating data into actionable insights to enhance marketing ROI, drive business growth, and improve customer experience across financial products such as personal loans, home loans, credit cards, and insurance. The successful candidate will have a strong background in data analytics and be capable of providing strategic recommendations to key stakeholders and business leaders.

You will lead, mentor, and develop a high-performing team of data analysts and data scientists focused on building decision science models and segmentations that predict customer behaviors. Collaborating with the Partnership & Marketing team, you will run marketing experiments to improve funnel conversion rates. You will also evaluate the effectiveness of marketing campaigns, identify successful strategies, recommend changes, and oversee the implementation of customer journey-related product enhancements. Creating a culture of collaboration, innovation, and data-driven decision-making across teams is crucial. You will manage multiple analytics projects concurrently, prioritizing them by potential business impact and ensuring timely, accurate completion; project planning, monitoring, and promptly addressing challenges to keep projects on track are essential responsibilities. Working with the Data Engineering, Technology, and Product teams, you will develop and implement data capabilities for running marketing experiments and delivering actionable insights at scale.

Applicants should hold a Master's degree in statistics, mathematics, data science, or economics, or a BTech in computer science or engineering. A minimum of 5 years of hands-on experience in decision science analytics and developing data-driven strategies, preferably in the financial services industry, is required, along with at least 2 years of experience managing and leading teams of data analysts and data scientists. Proficiency in statistical model development within financial services, including logistic regression and gradient boosting with Python packages such as scikit-learn, XGBoost, and statsmodels, or decision tree tools, is essential (a minimal modeling sketch follows this listing). Candidates should also have at least 2 years of practical experience in SQL and Python, and a proven track record of making data-driven decisions and solving problems with analytics. Familiarity with Snowflake, AWS Athena/S3, Redshift, and BI tools such as AWS QuickSight is advantageous. An analytical mindset, the ability to assess complex scenarios and make data-driven decisions, a creative and curious nature, willingness to learn new tools and techniques, and a data-oriented personality are desired traits, as are excellent communication and interpersonal skills for collaborating with diverse stakeholders.
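For illustration only, here is a minimal sketch of the kind of response model this listing describes, using scikit-learn's gradient boosting on synthetic data. Every feature and number is a placeholder, not something from the employer:

```python
# Minimal sketch of a conversion/response model of the kind named above.
# Synthetic data stands in for real customer features; nothing here is
# specific to the employer's stack.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Fake customer-level features with a binary "converted" label.
X, y = make_classification(n_samples=1000, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

# Gradient boosting is one of the algorithm families the listing names;
# XGBoost or logistic regression would slot in the same way.
model = GradientBoostingClassifier(n_estimators=100, max_depth=3)
model.fit(X_train, y_train)

# Holdout AUC is a typical gate before scores drive campaign targeting.
scores = model.predict_proba(X_test)[:, 1]
print("holdout AUC:", round(roc_auc_score(y_test, scores), 3))
```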
Posted 1 week ago
7.0 - 10.0 years
6 - 16 Lacs
Bengaluru
Hybrid
Experienced Database Developer (7-10 years) with strong skills in SQL/PL/SQL, performance tuning, slowly changing dimensions (SCD), dimensional modeling (star and snowflake schemas), and handling ad-hoc tasks across multiple projects.
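As a self-contained illustration of the SCD skill this listing names, the sketch below implements Slowly Changing Dimension Type 2 logic in pandas; all table and column names are assumed, and a real warehouse would typically use a SQL MERGE instead:

```python
# Minimal, runnable sketch of SCD Type 2: expire changed rows, insert new
# versions. Column names and data are illustrative assumptions.
from datetime import date
import pandas as pd

HIGH_DATE = date(9999, 12, 31)

# Current state of the dimension (one open row per business key).
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Pune", "Delhi"],
    "valid_from": [date(2023, 1, 1)] * 2,
    "valid_to": [HIGH_DATE] * 2,
    "is_current": [True, True],
})

# Incoming snapshot from the source system.
incoming = pd.DataFrame({"customer_id": [1, 2, 3],
                         "city": ["Pune", "Mumbai", "Chennai"]})

today = date.today()
merged = incoming.merge(dim[dim.is_current],
                        on="customer_id", how="left", suffixes=("", "_old"))

changed = merged[merged.city_old.notna() & (merged.city != merged.city_old)]
brand_new = merged[merged.city_old.isna()]

# Expire the open row for every changed key...
expire = dim.customer_id.isin(changed.customer_id) & dim.is_current
dim.loc[expire, ["valid_to", "is_current"]] = [today, False]

# ...then insert a fresh version for changed and brand-new keys.
new_rows = pd.concat([changed, brand_new])[["customer_id", "city"]].assign(
    valid_from=today, valid_to=HIGH_DATE, is_current=True)
dim = pd.concat([dim, new_rows], ignore_index=True)
print(dim.sort_values(["customer_id", "valid_from"]))
```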
Posted 1 week ago
7.0 - 9.0 years
20 - 25 Lacs
Hyderabad, Bengaluru
Work from Office
Immediate joiners only. Role & responsibilities:
- 6+ years of experience with Snowflake (Snowpipe, Streams, Tasks)
- Strong proficiency in SQL for high-performance data transformations
- Hands-on experience building ELT pipelines using cloud-native tools
- Proficiency in dbt for data modeling and workflow automation
- Python skills (Pandas, PySpark, SQLAlchemy) for data processing
- Experience with orchestration tools like Airflow or Prefect
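To make the Streams/Tasks pattern concrete, here is a hedged sketch: a stream captures changes on a raw table and a scheduled task merges them downstream. It assumes snowflake-connector-python, and every identifier and credential is an illustrative placeholder:

```python
# Hedged sketch of the Snowflake Streams + Tasks pattern named above.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # assumed creds
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

statements = [
    # Change-data capture on the landing table.
    "CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders",
    # A task that wakes every 5 minutes but runs only when the stream has data.
    """
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO dim_orders d
      USING orders_stream s ON d.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET d.status = s.status
      WHEN NOT MATCHED THEN INSERT (order_id, status)
                            VALUES (s.order_id, s.status)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_orders RESUME",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```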
Posted 1 week ago
15.0 - 20.0 years
40 - 100 Lacs
Bengaluru
Hybrid
Hiring: Investment Management and Risk Data Product Owner - ISS Data (Associate Director)

Role
The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future-state design for investment and risk data across our key business areas. The successful candidate will have in-depth knowledge of all data domains that service investment management, risk, and attribution capabilities within the asset management industry. The role sits within the ISS Delivery Data Analysis chapter, fully aligned to deliver the cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future-state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role also maintains strong relationships with the various business contacts to ensure a superior service to our clients.

Key Responsibilities
Leadership and Management:
- Lead the Investment and Risk data outcomes and capabilities for the ISS Data Programme.
- Realign existing resources and provide coaching and line management for junior data analysts within the chapter, influencing and motivating them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead data product documentation, enable peer reviews, estimate analysis effort, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for improving efficiencies and innovation.
Data Quality and Integrity:
- Define data quality use cases for all required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture and engineering.
Coordination and Communication:
- Communicate at senior management level to influence senior technology and business stakeholders globally and gain alignment on the roadmaps; act as an advocate for the ISS Data Programme.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- Collaborate closely with Data Governance, Business Architecture, and data owners.
- Conduct workshops within the scrum teams and across business teams, document the minutes effectively, and drive the actions.

Essential Skills Required
- Strong leadership and senior-management-level communication, internal and external client management, and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI, and FactSet support Investment, Risk, Performance, and Attribution business needs.
- Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk, and attribution.
- In-depth expertise in data and calculations across the investment industry, covering:
  - Financial data: asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions.
  - Asset-specific data: financial instrument reference data such as asset specifications, maintenance records, usage history, and depreciation schedules.
  - Market data: security prices, exchange rates, index constituents, and the licensing restrictions on them.
  - Risk data: data related to risk factors such as market risk, credit risk, operational risk, and compliance risk.
  - Performance & attribution data: fund performance returns and attribution using methodologies such as time-weighted returns and transaction-based performance attribution.
- Problem solving, attention to detail, and critical thinking.
- Technical skills: hands-on SQL, advanced Excel, Python, ML (optional), and knowledge of end-to-end technology solutions involving data platforms; knowledge of data management, data governance, and data engineering practices; hands-on experience with data modelling techniques such as dimensional and data vault.
- Willingness to own and drive things, with collaboration across business and technology stakeholders.
Posted 1 week ago
15.0 - 20.0 years
40 - 100 Lacs
Bengaluru
Hybrid
Hiring: Sustainable, Client and Regulatory Reporting Data Product Owner - ISS Data (Associate Director)

About your team
The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing, and customer service functions. The broader organisation incorporates infrastructure services that the firm relies on to operate day to day, including data centres, networks, proximity services, security, voice, incident management, and remediation. The Technology group is responsible for providing technology solutions to the Investment Solutions & Services (ISS) business, which covers the Investment Management, Asset Management Operations & Distribution business units globally. The Technology team supports and enhances existing applications as well as designing, building, and procuring new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching Investment Solutions and Services strategy.

About your role
The Investment Reporting Data Product Owner role is instrumental in the creation and execution of a future-state data reporting product to enable Regulatory, Client, Vendor, Internal & MI reporting and analytics. The successful candidate will have in-depth knowledge of all data domains that represent institutional clients, the investment life cycle, and regulatory and client reporting data requirements. The role sits within the ISS Delivery Data Analysis chapter, fully aligned with our cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future-state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role also maintains strong relationships with the various business contacts to ensure a superior service to our internal business stakeholders and our clients.

Key Responsibilities
Leadership and Management:
- Lead the ISS Distribution, Client Propositions, Sustainable Investing, and Regulatory reporting data outcomes, defining the data roadmap and capabilities, and supporting the execution and delivery of the data solutions as a Data Product lead within the ISS Data Programme.
- Take line management responsibility for junior data analysts within the chapter, coaching, influencing, and motivating them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead and define the data product backlog and documentation, enable peer reviews, estimate analysis effort, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for driving efficiencies, scale, and innovation.
Data Quality and Integrity:
- Define data quality use cases for all required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture and engineering.
Coordination and Communication:
- Communicate at senior management level to influence senior technology and business stakeholders globally and gain alignment on the roadmaps; act as an advocate for the ISS Data Programme.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- Collaborate closely with Data Governance, Business Architecture, and data owners.
- Conduct workshops within the scrum teams and across business teams, document the minutes effectively, and drive the actions.

Your Skills and Experience
- Strong leadership and senior-management-level communication, internal and external client management, and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- Outstanding knowledge of the client life cycle covering institutional and wholesale, with a focus on CRM data and transfer agency data.
- Very good understanding of the data generated by investment management processes and how it is leveraged in go-to-market capabilities such as client reporting, sales, and marketing.
- Excellent knowledge of the regulatory environment, with a focus on European regulations and ESG-specific ones such as MiFID II, EMIR, and SFDR.
- Ability to work effectively in different operating models such as insourcing, outsourcing, and hybrid models.
- An automation mindset that can drive efficiencies and quality in the reporting landscape.
- Knowledge of industry-standard data calculations for fund factsheets, institutional administration, and investment reports would be an added advantage.
- In-depth expertise in data and calculations across the investment industry, covering:
  - Client-specific data: institutional and wholesale client, account, and channel data, client preferences, and the data sets needed for client analytics; knowledge of Salesforce is desirable.
  - Transfer agency and platform data: granular client holdings at various levels, client transactions, and relevant reference data; knowledge of the role of TPAs acting as transfer agents and of integrating external feeds/products with strategic in-house data platforms.
  - Investment data: investment life cycle data covering domains such as trading, ABOR, IBOR, and security and fund reference data.
- Problem solving, attention to detail, and critical thinking.
- Technical skills: hands-on SQL, advanced Excel, Python, ML (optional), and knowledge of end-to-end technology solutions involving data platforms; knowledge of data management, data governance, and data engineering practices; hands-on experience with data modelling techniques such as dimensional and data vault.
- Willingness to own and drive things, with collaboration across business and technology stakeholders.
Posted 1 week ago
7.0 - 12.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Lead Data Engineer - What You Will Do:
As a PR3 Lead Data Engineer, you will be instrumental in driving our data strategy, ensuring data quality, and leading the technical execution of a small, impactful team. Your responsibilities will include:

Team Leadership:
- Establish the strategic vision for the evolution of our data products and technology solutions, then provide technical leadership and guidance for a small team of Data Engineers in executing the roadmap.
- Champion and enforce best practices for data quality, governance, and architecture within your team's work; embody a product mindset over the team's data.
- Oversee the team's use of Agile methodologies (e.g., Scrum, Kanban), ensuring smooth and predictable delivery, with an overt focus on continuous improvement.

Data Expertise & Domain Knowledge:
- Actively seek out, propose, and implement cutting-edge approaches to data transfer, transformation, analytics, and data warehousing to drive innovation.
- Design and implement scalable, robust, and high-quality ETL processes to support growing business demand for information, delivering data as a reliable service that directly influences decision making.
- Develop a profound understanding and "feel" for the business meaning, lineage, and context of each data field within our domain.

Communication & Stakeholder Partnership:
- Collaborate with other engineering teams and business partners, proactively managing dependencies and holding them accountable for their contributions to ensure successful project delivery.
- Actively engage with data consumers to achieve a deep understanding of their specific data usage, pain points, and current gaps, then plan initiatives to implement improvements collaboratively.
- Clearly articulate project goals, technical strategies, progress, challenges, and business value to both technical and non-technical audiences; produce clear, concise, and comprehensive documentation.

Your Qualifications:
At Vista, we value the experience and potential that individual team members add to our culture. Please don't hesitate to apply even if you don't meet the exact qualifications; we look forward to learning more about you!
- Bachelor's or Master's degree in computer science, data engineering, or a related field.
- 10+ years of professional experience, with at least 6 years of hands-on data engineering, specifically in e-commerce or direct-to-consumer, and 4 years of team leadership.
- Demonstrated experience leading a team of data engineers, providing technical guidance, and coordinating project execution.
- Stakeholder management experience and excellent communication skills.
- Strong knowledge of SQL and data warehousing concepts is a must.
- Strong knowledge of data modeling concepts and hands-on experience designing complex multi-dimensional data models.
- Strong hands-on experience designing and managing scalable ETL pipelines in cloud environments with large-volume datasets (both structured and unstructured data).
- Proficiency with cloud services in AWS (preferred), including S3, EMR, RDS, Step Functions, Fargate, Glue, etc.
- Critical hands-on experience with cloud-based data platforms (Snowflake strongly preferred).
- Data visualization experience with reporting and data tools (preferably Looker with LookML skills).
- Coding mastery in at least one modern programming language: Python (strongly preferred), Java, Golang, PySpark, etc.
- Strong knowledge of production standards such as versioning, CI/CD, data quality, documentation, and automation.
- Problem-solving and multi-tasking ability in a fast-paced, globally distributed environment.

Nice To Have:
- Experience with API development on enterprise platforms, with GraphQL APIs being a clear plus.
- Hands-on experience designing dbt data pipelines.
- Knowledge of finance, accounting, supply chain, logistics, operations, or procurement data is a plus.
- Experience managing work in Jira and writing documentation in Confluence.
- Proficiency in AWS account management, including IAM, infrastructure, and monitoring for health, security, and cost optimization.
- Experience with Gen AI/ML tools for enhancing data pipelines or automating analysis.

Why You'll Love Working Here:
There is a lot to love about working at Vista. We are an award-winning Remote-First company. We're an inclusive community. We're growing (which means you can too). And to help orient us all in the same direction, we have our Vista Behaviors, which exemplify the behavioral attributes that make us a culturally strong and high-performing team.

Our Team: Enterprise Business Solutions
Vista's Enterprise Business Solutions (EBS) domain is working to make our company one of the most data-driven organizations to support Finance, Supply Chain, and HR functions. The cross-functional team includes product owners, analysts, technologists, data engineers, and more - all focused on providing Vista with cutting-edge tools and data we can use to deliver jaw-dropping customer value. EBS team members are empowered to learn new skills, communicate openly, and be active problem-solvers.

Join our EBS domain as a Lead Data Engineer! This lead level within the organization will be responsible for the work of a small team of data engineers, focusing not only on implementations but also on operations and support. The Lead Data Engineer will implement best practices, data standards, and reporting tools, and will oversee and manage the work of other data engineers while also being an individual contributor. This role has a lot of opportunity to impact general ETL development and the implementation of new solutions. We will look to the Lead Data Engineer to modernize data technology solutions in EBS, including the opportunity to work on modern warehousing, finance, and HR datasets and integration technologies. This role requires an in-depth understanding of cloud data integration tools and cloud data warehousing, with a strong and pronounced ability to lead and execute initiatives to tangible results.
Posted 1 week ago
15.0 - 21.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Data Architect with over 15 years of experience, your primary responsibility will be to lead the design and implementation of scalable, secure, and high-performing data architectures. You will collaborate with business, engineering, and product teams to develop robust data solutions that support business intelligence, analytics, and AI initiatives.

Your key responsibilities will include designing and implementing enterprise-grade data architectures using cloud platforms such as AWS, Azure, or GCP. You will lead the definition of data architecture standards, guidelines, and best practices while architecting scalable data solutions such as data lakes, data warehouses, and real-time streaming platforms. Collaborating with data engineers, analysts, and data scientists, you will ensure optimal solutions are delivered for the data requirements at hand. In addition, you will oversee data modeling activities encompassing conceptual, logical, and physical data models. It will be your duty to ensure data security, privacy, and compliance with relevant regulations such as GDPR and HIPAA. Defining and implementing data governance strategies alongside stakeholders and evaluating data-related tools and technologies are also integral parts of the role.

To excel in this position, you should have at least 15 years of experience in data architecture, data engineering, or database development. Strong experience architecting data solutions on the major cloud platforms (AWS, Azure, or GCP) is essential. Proficiency in data management principles, data modeling, ETL/ELT pipelines, and modern data platforms and tools such as Snowflake, Databricks, and Apache Spark is required, as is familiarity with programming languages such as Python, SQL, or Java and real-time data processing frameworks such as Kafka, Kinesis, or Azure Event Hubs. Experience implementing data governance, data cataloging, and data quality frameworks is important, and knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus. Excellent problem-solving, communication, and stakeholder management skills are necessary. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred, along with a Cloud Architect or Data Architect certification (AWS/Azure/GCP).

Join us at Infogain, a human-centered digital platform and software engineering company, where you will have the opportunity to work on cutting-edge data and AI projects in a collaborative and inclusive work environment, with competitive compensation and benefits, while contributing to experience-led transformation for our clients across industries.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be responsible for designing, developing, and implementing data-centric software solutions using various technologies. This includes conducting code reviews, recommending best coding practices, and providing effort estimates for proposed solutions. Additionally, you will design audit business-centric software solutions and maintain comprehensive documentation for all proposed solutions. As a key member of the team, you will lead architecture and design efforts for product development and application development for relevant use cases. You will provide guidance and support to team members and clients, implementing best practices of data engineering and architectural solution design, development, testing, and documentation.

Your role will require you to participate in team meetings, brainstorming sessions, and project planning activities. It is essential to stay up to date with the latest advancements in the data engineering area to drive innovation and maintain a competitive edge, and to stay hands-on with the design, development, and validation of the systems and models deployed. Collaboration with audit professionals to understand business, regulatory, and risk requirements, as well as key alignment considerations for audit, is a crucial aspect of the role, as is driving efforts in the data engineering and architecture practice area.

Mandatory technical and functional skills include a deep understanding of RDBMS (MS SQL Server, Oracle, etc.), strong programming skills in T-SQL, and proven experience in ETL and reporting (MSBI stack, Cognos, Informatica, etc.). Experience with cloud-centric databases (Azure SQL, AWS RDS), Azure Data Factory (ADF), data warehousing with Synapse or Redshift, understanding and implementation experience of data lakes, and experience in large-scale data processing and ingestion using Databricks APIs, Lakehouse, etc., are required. Knowledge of MPP databases such as Snowflake or Postgres-XL is also essential.

Preferred technical and functional skills include an understanding of financial accounting, experience with NoSQL using MongoDB or Cosmos DB, Python coding experience, and an aptitude for emerging data platform technologies such as Microsoft Fabric. Key behavioral attributes for this role include strong analytical, problem-solving, and critical-thinking skills, excellent collaboration skills, the ability to work effectively in a team-oriented environment, excellent written and verbal communication skills, and the willingness to learn and work on new technologies.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Supply Chain Data Integration Consultant - Senior

The opportunity
We're looking for Senior-level Consultants with expertise in data modelling, data integration, data manipulation, and analysis to join the Supply Chain Technology group of our GDS consulting team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering. This role demands a highly technical, extremely hands-on data warehouse modelling consultant who will work closely with our EY Partners and external clients to develop new business as well as drive other initiatives across different business needs. The ideal candidate must have a good understanding of the value of data warehousing and ETL, along with Supply Chain industry knowledge and proven experience delivering solutions to different lines of business and technical leadership.

Your key responsibilities
- A minimum of 5+ years of experience in BI/data integration/ETL/DWH solutions on cloud and on-premises platforms such as Informatica PowerCenter/IICS, Alteryx, Talend, Azure Data Factory (ADF), SSIS, SSAS, and SSRS, plus experience with a reporting tool such as Power BI, Tableau, or OBIEE.
- Performing data analysis and data manipulation as per client requirements.
- Expertise in data modelling to simplify business concepts, and creating extensive ER diagrams to support business decision-making.
- Working experience with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures, and integrated datasets using data integration technologies.
- Developing sophisticated workflows and macros (batch, iterative, etc.) in Alteryx with enterprise data, and designing and developing ETL workflows and datasets in Alteryx for use by BI reporting tools.
- Performing end-to-end data validation to maintain the accuracy of data sets.
- Supporting client needs by developing SSIS packages in Visual Studio (version 2012 or higher) or Azure Data Factory, with extensive hands-on experience implementing data migration and data processing using ADF.
- Supporting client needs by delivering various integrations with third-party applications, and pulling data from a variety of data source types using appropriate connection managers.
- Developing, customizing, deploying, and maintaining SSIS packages as per client business requirements, with thorough knowledge of creating dynamic packages in Visual Studio covering concepts such as reading multiple files, error handling, archiving, configuration creation, and package deployment.
- Experience working with clients throughout various parts of the implementation lifecycle, with a proactive, solution-oriented mindset and readiness to learn new technologies for client requirements.
- Analyzing and translating business needs into long-term solution data models, evaluating existing data warehouses and systems, and applying strong knowledge of database structures and data mining.

Skills and attributes for success
- Deliver large and medium DWH programmes; demonstrate expert core consulting skills and an advanced level of Informatica, SQL, PL/SQL, Alteryx, ADF, SSIS, Snowflake, and Databricks knowledge, plus the industry expertise to support delivery to clients.
- Demonstrate management ability and lead projects or teams individually; experience in team management, communication, and presentation.

To qualify for the role, you must have
- 5+ years of ETL experience as a Lead/Architect, with expertise in ETL mappings and data warehouse concepts, and the ability to design a data warehouse and present solutions as per client needs.
- Thorough knowledge of Structured Query Language (SQL) and experience working with SQL Server, including SQL tuning and optimization using explain plans and SQL trace files.
- Experience developing and deploying SSIS batch jobs and scheduling jobs.
- Experience building Alteryx workflows for data integration, modelling, optimization, and data quality.
- Knowledge of Azure components such as ADF, Azure Data Lake, and Azure SQL DB.
- Knowledge of data modelling and ETL design, including designing and developing complex mappings, process flows, and ETL scripts, with in-depth experience in database design.

Ideally, you'll also have
- Strong knowledge of ELT/ETL concepts, design, and coding, plus expertise in data handling to resolve data issues as per client needs.
- Experience designing and developing database objects such as tables, views, indexes, materialized views, and analytical functions, and creating complex SQL queries for retrieving, manipulating, checking, and migrating complex datasets.
- Good knowledge of ETL technologies and tools such as Alteryx, SSAS, SSRS, Azure Analysis Services, and Azure Power Apps.
- Good verbal and written communication in English, plus strong interpersonal, analytical, and problem-solving abilities.
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
- Additional knowledge of BI tools such as Power BI or Tableau, and experience with cloud databases and multiple ETL tools, will be preferred.

What we look for
The incumbent should be able to drive ETL infrastructure-related developments. Additional knowledge of complex source system data structures, preferably in the financial services industry, and of reporting-related developments will be an advantage. This is an opportunity to be part of a market-leading, multi-disciplinary team of 10,000+ professionals, in the only integrated global transaction business worldwide, with opportunities to work with EY GDS consulting practices globally across a range of industries.

What working at EY offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers to the complex issues facing our world today.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
As a Software Engineer II at PAR, you will use your expertise in Golang to develop enterprise-grade systems that are scalable and maintainable. With 3+ years of experience, you will leverage the unique paradigms, idioms, and syntax of Go to create well-documented programs with reasonable test coverage. Your role will involve collaborating with the team to ensure the infrastructure functions seamlessly.

Key Responsibilities:
- Develop enterprise-grade systems using Golang
- Design scalable and maintainable programs
- Coordinate with team members across different layers of the infrastructure
- Ensure thorough documentation and reasonable test coverage
- Solve complex problems through collaborative problem-solving and sophisticated design

Requirements:
- Proficiency in Golang
- Experience working on enterprise-grade systems
- Experience designing web services
- Full-stack development skills with frontend JavaScript frameworks such as Vue.js and React.js
- Ability to scale systems around database bottlenecks
- Knowledge of microservices architecture
- Familiarity with OAuth, JWT, SSO, authentication, and identity federation
- Experience with AWS, Docker, and Kubernetes (pods, service meshes)
- Proficiency in MySQL, Snowflake, and MongoDB

Why Join Us:
- Contribute to writing scalable, robust, and testable code
- Translate software requirements into high-performance software
- Play a key role in architectural and design decisions for an efficient, distributed microservices architecture

If you are passionate about creating innovative solutions that connect people to the restaurants, meals, and moments they love, we welcome you to join our team at PAR as a Software Engineer II.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
You should have a minimum of 6 years of technical experience and the following skills: Python, Spark SQL, PySpark, Apache Airflow, dbt, Snowflake, CI/CD, Git, GitHub, and AWS. Your role will involve understanding the existing code base in AWS services and SQL, and converting it to a tech stack built primarily on Airflow, Iceberg, Python, and SQL (a minimal orchestration sketch follows this listing).

Your responsibilities will include designing and building data models to support business requirements, developing and maintaining data ingestion and processing systems, implementing data storage solutions, ensuring data consistency and accuracy through validation and cleansing techniques, and collaborating with cross-functional teams to address data-related issues. Proficiency in Python, experience with Spark for big data, orchestration experience with Airflow, and AWS knowledge are essential for this role. You should also have experience with security and governance practices such as role-based access control (RBAC) and data lineage tools, as well as knowledge of database management systems such as MySQL. Strong problem-solving and analytical skills, along with excellent communication and collaboration abilities, are key attributes for this position.

At NucleusTeq, we foster a positive and supportive culture that encourages our associates to perform at their best every day. We value and celebrate individual uniqueness, offering flexibility for making daily choices that contribute to overall well-being. Our well-being programs and continuous efforts to enhance our culture aim to create an environment where our people can thrive, lead healthy lives, and excel in their roles.
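As a hedged illustration of the target stack this listing describes, the sketch below shows a two-step Airflow DAG; the DAG id, schedule, and task bodies are placeholders, and the syntax assumes Airflow 2.4+:

```python
# Hedged sketch of an Airflow DAG: an extract step followed by a load step.
# Everything named here is an illustrative assumption, not the employer's code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    # Placeholder for pulling a daily batch from the legacy AWS source (e.g. S3).
    print("extracting orders for", context["ds"])

def load_to_iceberg(**context):
    # Placeholder for writing the transformed batch to an Iceberg table.
    print("loading partition", context["ds"])

with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_to_iceberg)
    extract >> load  # load runs only after extract succeeds
```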
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are familiar with AWS and Azure Cloud. You have extensive knowledge of Snowflake, with the SnowPro Core certification being a must-have. In at least one project, you have used dbt to deploy models in production. Furthermore, you have experience configuring and deploying Airflow and integrating various operators in Airflow, especially for dbt and Snowflake. Your capabilities also include designing build and release pipelines, and you have a solid understanding of the Azure DevOps ecosystem. Proficiency in Python, particularly PySpark, allows you to write metadata-driven programs. You are well-versed in Data Vault (Raw, Business) and concepts such as Point-in-Time (PIT) tables and the Semantic Layer. In ambiguous situations you demonstrate resilience, and you can clearly articulate problems in a business-friendly manner. You believe in documenting processes, managing artifacts, and evolving them over time, and you adhere to these practices diligently.

Required skills: Data Vault, dbt, Python, PySpark, Snowflake, AWS, Azure, Airflow, Azure DevOps.
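To illustrate the "metadata-driven program" idea this profile mentions, here is a hedged PySpark sketch in which per-table configuration drives a single generic ingest loop; paths, table names, and the staging schema are all assumptions:

```python
# Hedged sketch of a metadata-driven PySpark load: config, not code,
# decides what gets ingested. All identifiers are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata_driven_ingest").getOrCreate()

# In practice this metadata would live in a control table or a YAML file.
TABLES = [
    {"name": "customers", "path": "s3://bucket/raw/customers/", "format": "parquet"},
    {"name": "orders",    "path": "s3://bucket/raw/orders/",    "format": "csv"},
]

for meta in TABLES:
    reader = spark.read.format(meta["format"])
    if meta["format"] == "csv":
        reader = reader.option("header", "true")
    df = reader.load(meta["path"])
    # One generic code path per table: same load, same target layer.
    df.write.mode("overwrite").saveAsTable(f'staging.{meta["name"]}')
```

Adding a new source then means adding one dictionary entry rather than writing a new job, which is the main payoff of the pattern.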
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You will play a crucial role in enhancing the analytics capabilities for our businesses. Your responsibilities will include engaging with key stakeholders to understand Fidelity's sales, marketing, client services, and propositions context, and collaborating with internal teams such as the data support team and technology team to develop new tools, capabilities, and solutions. You will also work closely with IS Operations to expedite the development and sharing of customized data sets.

Maximizing the adoption of cloud-based data management services will be a significant part of your role. This involves setting up sandbox analytics environments using platforms such as Snowflake, AWS, Adobe, and Salesforce, and supporting data visualization and data science applications to enhance business operations.

In terms of stakeholder management, you will work with key stakeholders to understand business problems and translate them into suitable analytics solutions, facilitating smooth execution, delivery, and implementation of these solutions through effective engagement. Your role will also involve sharing knowledge and best practices with the team, including coaching on deep learning and machine learning methodologies. Taking independent ownership of projects and initiatives within the team is crucial, demonstrating leadership and accountability. Furthermore, you will develop and evaluate tools, methodologies, and infrastructure to address long-term business challenges, which may involve enhancing modelling software, methodologies, data requirements, and optimization environments to elevate the team's capabilities.

To excel in this role, you should have 5 to 8 years of overall experience in analytics, with at least 4 years of experience in SQL, Python, open-source machine learning libraries, and deep learning. Experience working in an AWS environment, preferably using Snowflake, is preferred. Proficiency in analytics applications such as Python, SAS, and SQL, and in interpreting statistical results, is necessary. Knowledge of Spark, Hadoop, and big data platforms will be advantageous.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Tirupati, Andhra Pradesh
On-site
You are an experienced Snowflake Data Engineer with expertise in Python and SQL, holding a Snowflake certification and at least 4 years of hands-on experience with Snowflake. Your primary responsibility will be to design, develop, and maintain robust data pipelines in a cloud environment, ensuring efficient data integration, transformation, and storage within the Snowflake data platform.

Your key responsibilities will include designing and developing data pipelines that handle large volumes of structured and unstructured data using Snowflake and SQL, and developing and maintaining efficient ETL/ELT processes to integrate data from various sources into Snowflake while ensuring data quality and availability. You will write Python scripts to automate data workflows, implement data transformation logic, and integrate with external APIs for data ingestion (a minimal ingestion sketch follows this listing). You will create and optimize complex SQL queries for data extraction, transformation, and reporting, and develop and maintain data models to support business intelligence and analytics, leveraging Snowflake best practices. Ensuring proper data governance, security, and compliance within the Snowflake environment by implementing access controls, encryption, and monitoring is also among your responsibilities. Collaboration is key: you will work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.

As a qualified candidate, you must hold an active Snowflake certification and have 4+ years of experience with Snowflake. You should possess strong experience with Python for data processing, automation, and API integration; expertise in writing and optimizing complex SQL queries; and experience with data warehousing and database management. Hands-on experience designing and implementing ETL/ELT pipelines with Snowflake is required, along with familiarity with cloud environments such as AWS, GCP, or Azure, especially for data storage and processing, and experience implementing data governance frameworks and security protocols on a cloud data platform.

Preferred skills include experience with CI/CD pipelines for data projects, knowledge of Apache Airflow or other orchestration tools, and familiarity with big data technologies and distributed systems. Your educational background should include a Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Strong problem-solving and analytical skills, excellent communication skills for interacting with both technical and non-technical stakeholders, and the ability to work in a fast-paced, agile environment are essential soft skills for this role.
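For illustration, here is a hedged sketch of the API-to-Snowflake ingestion duty described above; the endpoint, credentials, and table name are assumptions, not the employer's systems:

```python
# Hedged sketch: pull records from an external API and land them in Snowflake.
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# 1. Ingest from a hypothetical REST endpoint.
resp = requests.get("https://api.example.com/v1/transactions", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# 2. Light transformation before load.
df.columns = [c.upper() for c in df.columns]  # Snowflake-friendly names

# 3. Bulk-load into Snowflake (credentials below are placeholders).
conn = snowflake.connector.connect(
    account="my_account", user="loader", password="...",
    warehouse="LOAD_WH", database="RAW", schema="API",
)
try:
    ok, _, nrows, _ = write_pandas(conn, df, "TRANSACTIONS",
                                   auto_create_table=True)
    print(f"loaded {nrows} rows, success={ok}")
finally:
    conn.close()
```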
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
About Mindstix Software Labs:
Mindstix accelerates digital transformation for the world's leading brands. We are a team of passionate innovators specialized in Cloud Engineering, DevOps, Data Science, and Digital Experiences. Our UX studio and modern-stack engineers deliver world-class products for our global customers, including Fortune 500 enterprises and Silicon Valley startups. Our work impacts a diverse set of industries - eCommerce, luxury retail, ISV and SaaS, consumer tech, and hospitality. A fast-moving open culture powered by curiosity and craftsmanship. A team committed to bold thinking and innovation at the very intersection of business, technology, and design. That's our DNA.

Roles and Responsibilities:
Mindstix is looking for a proficient Data Engineer. You are a collaborative person who takes pleasure in finding solutions to issues that add to the bottom line. You appreciate hands-on technical work and feel a sense of ownership. You need a keen eye for detail, work experience as a data analyst, and in-depth knowledge of widely used databases and technologies for data analysis. Your responsibilities include:
- Building outstanding domain-focused data solutions with internal teams, business analysts, and stakeholders.
- Applying data engineering practices and standards to develop robust and maintainable solutions.
- Being motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases.
- Being a natural problem-solver and intellectually curious across a breadth of industries and topics.
- Being acquainted with different aspects of data management such as data strategy, architecture, governance, data quality, integrity, and data integration.
- Being extremely well-versed in designing incremental and full data load techniques (a minimal sketch follows this listing).

Qualifications and Skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or allied streams.
- 2+ years of hands-on experience in the data engineering domain with DWH development.
- Must have experience with end-to-end data warehouse implementation on Azure or GCP.
- Must have SQL and PL/SQL skills, implementing complex queries and stored procedures.
- Solid understanding of DWH concepts such as OLAP, ETL/ELT, RBAC, data modelling, data-driven pipelines, virtual warehousing, and MPP.
- Expertise in Databricks - Structured Streaming, Lakehouse architecture, DLT, data modeling, VACUUM, Time Travel, security, monitoring, dashboards, DBSQL, and unit testing.
- Expertise in Snowflake - monitoring, RBACs, virtual warehousing, query performance tuning, and Time Travel.
- Understanding of Apache Spark, Airflow, Hudi, Iceberg, Nessie, NiFi, Luigi, and Arrow (good to have).
- Strong foundations in computer science, data structures, algorithms, and programming logic.
- Excellent logical reasoning and data interpretation capability, and the ability to interpret business requirements accurately.
- Exposure to working with multicultural international customers.
- Experience in the retail, supply chain, CPG, e-commerce, or health industry is a plus.

Who Fits Best
- You are a data enthusiast and problem solver.
- You are a self-motivated and fast learner with a strong sense of ownership and drive.
- You enjoy working in a fast-paced creative environment.
- You appreciate great design, have a strong sense of aesthetics, and have a keen eye for detail.
- You thrive in a customer-centric environment with the ability to actively listen, empathize, and collaborate with globally distributed teams.
- You are a team player who desires to mentor and inspire others to do their best.
- You love expressing ideas and articulating well, with strong written and verbal English communication and presentation skills.
- You are detail-oriented with an appreciation for craftsmanship.

Benefits:
- Flexible working environment.
- Competitive compensation and perks.
- Health insurance coverage.
- Accelerated career paths.
- Rewards and recognition.
- Sponsored certifications.
- Global customers.
- Mentorship by industry leaders.

Location:
This position is primarily based at our Pune (India) headquarters, requiring all potential hires to work from this location. A modern workplace is deeply collaborative by nature, while also demanding a touch of flexibility. We embrace deep collaboration at our offices, with reasonable flexi-timing and hybrid options for our seasoned team members.

Equal Opportunity Employer.
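As a self-contained illustration of the incremental-load technique this listing names, the sketch below copies only rows newer than the target's high-watermark; SQLite stands in for a real source and warehouse, and all table names are assumptions:

```python
# Minimal, runnable sketch of a watermark-based incremental load.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src(id INTEGER, updated_at TEXT);
    CREATE TABLE tgt(id INTEGER, updated_at TEXT);
    INSERT INTO src VALUES (1,'2024-01-01'),(2,'2024-02-01'),(3,'2024-03-01');
    INSERT INTO tgt VALUES (1,'2024-01-01');            -- already loaded
""")

# 1. Read the current high-watermark from the target.
(watermark,) = con.execute(
    "SELECT COALESCE(MAX(updated_at), '1900-01-01') FROM tgt").fetchone()

# 2. Pull only source rows past the watermark (the incremental slice).
rows = con.execute(
    "SELECT id, updated_at FROM src WHERE updated_at > ?", (watermark,)
).fetchall()

# 3. Append the delta; a full load would instead truncate and reload tgt.
con.executemany("INSERT INTO tgt VALUES (?, ?)", rows)
print("loaded delta:", rows)
```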
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
You are an experienced Senior QA Specialist being sought to join a dynamic team for a critical AWS-to-GCP migration project. Your primary responsibility will be the rigorous testing of data pipelines and data integrity in the GCP cloud to ensure seamless reporting and analytics capabilities.

Your key responsibilities will include designing and executing test plans to validate data pipelines re-engineered from AWS to GCP, ensuring data integrity and accuracy. You will work closely with data engineering teams to understand the AVRO, ORC, and Parquet file structures in AWS S3, and analyze the data in the external tables created in Athena that are used for reporting. It will be essential to verify that the schema and data in BigQuery match those in Athena to support reporting in Power BI (a minimal parity-check sketch follows this listing). Additionally, you will test and validate Spark pipelines and other big data workflows in GCP. Documenting all test results and collaborating with development teams to resolve discrepancies are also part of your responsibilities, along with supporting UAT business users during UAT testing.

To excel in this role, you should have proven experience in QA testing within a big data DWBI ecosystem. Strong familiarity with cloud platforms such as AWS, GCP, or Azure, with hands-on experience in at least one, is necessary, as is deep knowledge of data warehousing solutions such as BigQuery, Redshift, Synapse, or Snowflake. Expertise in testing data pipelines and understanding of file formats such as Avro and Parquet are required, and experience with reporting tools such as Power BI is preferred. Excellent problem-solving skills, the ability to work independently, strong communication skills, and the ability to collaborate effectively across teams will be valuable.
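As a hedged sketch of one such migration check, the snippet below compares row counts for the same logical table in Athena (source) and BigQuery (target); the staging bucket, project, dataset, and table names are all illustrative assumptions, and real validation would extend this to column-level checksums:

```python
# Hedged sketch: row-count parity check between Athena and BigQuery.
from pyathena import connect as athena_connect
from google.cloud import bigquery

TABLE = "sales_orders"  # assumed table present on both sides

athena = athena_connect(
    s3_staging_dir="s3://my-athena-results/",  # assumed staging bucket
    region_name="us-east-1",
).cursor()
athena.execute(f"SELECT COUNT(*) FROM analytics.{TABLE}")
athena_count = athena.fetchone()[0]

bq = bigquery.Client(project="my-gcp-project")  # assumed project
bq_count = next(
    iter(bq.query(f"SELECT COUNT(*) AS n FROM analytics.{TABLE}").result())
).n

assert athena_count == bq_count, (
    f"{TABLE}: Athena={athena_count} vs BigQuery={bq_count}")
print(f"{TABLE}: {athena_count} rows match on both platforms")
```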
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Cloud Data Integration Consultant, you will lead a complex data integration project involving API frameworks, a data lakehouse architecture, and middleware solutions. The project centers on AWS, Snowflake, Oracle ERP, and Salesforce, with a high-transaction-volume POS system. Your role will involve building reusable and scalable API frameworks, optimizing middleware, and ensuring security and compliance in a multi-cloud environment.

Your expertise in API development and integration will be crucial: you should have deep experience managing APIs across multiple systems, building reusable components, and ensuring bidirectional data flow for real-time data synchronization, and your skills in middleware solutions and custom API adapters will be essential for integrating the various systems seamlessly.

On cloud infrastructure and data processing, strong experience with AWS services such as S3, Lambda, Fargate, and Glue is required for data processing, storage, and integration. You should also have hands-on experience optimizing Snowflake for querying and reporting, as well as knowledge of Terraform for automating the provisioning and management of AWS resources.

Security and compliance are critical aspects of the project: your deep understanding of cloud security protocols, API security, and compliance enforcement will be invaluable, and you should be able to set up audit logs, ensure traceability, and enforce compliance across cloud services. Handling high-volume transaction systems and real-time data processing requirements is part of the role; you should be familiar with optimizing AWS Lambda and Fargate for efficient data processing and skilled in operational monitoring and error-handling mechanisms. Collaboration and support are essential to the success of the project: you will provide post-go-live support and work with internal teams and external stakeholders to ensure seamless integration between systems.

To qualify for this role, you should have at least 10 years of experience in enterprise API integration, cloud architecture, and data management. Deep expertise in AWS services, Snowflake, Oracle ERP, and Salesforce integrations is required, along with a proven track record of delivering scalable API frameworks and handling complex middleware systems. Strong problem-solving skills, familiarity with containerization technologies, and experience in the retail or e-commerce industries are also desirable.

Your key responsibilities will include leading the design and implementation of reusable API frameworks, optimizing data flow through middleware systems, building robust security frameworks, and collaborating with the in-house team for seamless integration between systems, with ongoing support, monitoring, and optimization post-go-live.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
As a Data Engineering Lead/Architect with 10+ years of experience, you will play a crucial role in architecting and designing data solutions that meet business requirements effectively. You will collaborate with cross-functional teams to design scalable and efficient data architectures, models, and integration strategies, and your technical leadership will be essential in implementing data pipelines, ETL processes, and data warehousing solutions.

Your expertise in Snowflake technologies will be key to building and optimizing data warehouses. You will develop and maintain Snowflake data models and schemas, following best practices for cost analysis, resource allocation, and security configuration. Additionally, you will leverage Azure cloud services and the Databricks platform to manage and process large datasets efficiently, building and maintaining data pipelines on Azure services.

Implementing best practices for data warehousing and ensuring data quality, consistency, and reliability will be part of your responsibilities. You will create and manage data integration processes, including real-time and batch data movement between systems. Your mastery of complex SQL and PL/SQL will enable you to extract, transform, and load data effectively, and to optimize SQL queries and database performance for high-volume data processing. Continuous monitoring and enhancement of data pipelines and data storage systems will be crucial for performance tuning and optimization, and you will troubleshoot and resolve data-related issues to minimize downtime while documenting data engineering processes, data flows, and architectural decisions. Collaboration with data scientists, analysts, and stakeholders is essential to ensure data availability and usability, and you will implement data security measures and adhere to compliance standards to protect sensitive data.

Beyond your technical skills, you will be expected to drive data engineering strategies, engage in sales and proposal activities, develop customer relationships, lead a technical team, and mentor other team members. You should be able to clarify and translate customer requirements into epics and stories, removing ambiguity and aligning others to your ideas and solutions.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with over 10 years of experience in data engineering with a strong focus on architecture. Proven expertise in Snowflake, Azure, and Databricks technologies; comprehensive knowledge of data warehousing concepts, ETL processes, and data integration techniques; and exceptional SQL and PL/SQL skills are essential. Certifications in relevant technologies such as Snowflake and Azure are a plus. Strong problem-solving skills, the ability to work in a fast-paced, collaborative environment, and excellent communication skills for conveying technical concepts to non-technical stakeholders are also required.
Posted 1 week ago
5.0 - 8.0 years
10 - 20 Lacs
Hyderabad
Work from Office
7+ years of experience as a Data Engineer or Snowflake Developer. Expert-level knowledge of SQL (joins, subqueries, CTEs). Experience with ETL tools (e.g., Informatica, Talend, Matillion). Experience with cloud platforms like AWS, Azure, or GCP.
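For reference, the SQL constructs named in this listing (joins, subqueries, CTEs) can be shown with a small self-contained example using Python's standard-library sqlite3 module; the schema and data are invented purely for demonstration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1,'north',120),(2,'north',80),(3,'south',200),(4,'south',50);
""")

# CTE + subquery: regions whose total sales exceed the average regional total.
rows = conn.execute("""
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total
        FROM orders
        GROUP BY region
    )
    SELECT region, total
    FROM region_totals
    WHERE total > (SELECT AVG(total) FROM region_totals)
""").fetchall()
print(rows)  # [('south', 250.0)]
```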
Posted 2 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do

1. Develop architectural solutions for new deals and major change requests in existing deals: Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable. Provide solutioning for RFPs received from clients and ensure overall design assurance. Develop a direction for managing the portfolio of to-be solutions (systems, shared infrastructure services, applications) to better match business outcome objectives. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture. Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology. Define and understand current-state solutions, and identify improvements, options, and trade-offs to define target-state solutions. Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly. Evaluate and recommend solutions that integrate with the overall technology ecosystem. Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution. Produce detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts. Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view. Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant remedies. Collaborate with sales, program/project, and consulting teams to reconcile solutions to the architecture. Track industry and application trends and relate them to current and future IT needs. Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations. Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the enterprise architecture. Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks: Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards. Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects. Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams. Recommend tools for reuse and automation to improve productivity and reduce cycle times. Lead the development and maintenance of enterprise frameworks and related artefacts. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams. Ensure architecture principles and standards are consistently applied to all projects. Ensure optimal client engagement: support the pre-sales team in presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates impact; and demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding: Complete necessary trainings and certifications. Develop proofs of concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research. Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc. Attain market referenceability and recognition through top analyst rankings, client testimonials, and partner credits. Be the voice of Wipro's thought leadership by speaking at internal and external forums. Mentor developers, designers, and junior architects on the project for their further career development and enhancement. Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management: Resourcing — anticipate new talent requirements per market/industry trends and client requirements, and hire adequate and right-fit resources for the team. Talent management — ensure adequate onboarding and training for team members to enhance capability and effectiveness, build an internal talent pool and ensure career progression within the organization, manage team attrition, and drive diversity in leadership positions. Performance management — set goals for the team, conduct timely performance reviews and provide constructive feedback to direct reports, and ensure that Performance Nxt is followed for the entire team. Employee satisfaction and engagement — lead and drive engagement initiatives for the team, track team satisfaction scores, and identify initiatives to build engagement within the team.

Mandatory Skills: Adobe Launch and Analytics. Experience: 8-10 Years.
Posted 2 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Pune
Work from Office
Role Purpose
The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do

1. Develop architectural solutions for new deals and major change requests in existing deals: Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable. Provide solutioning for RFPs received from clients and ensure overall design assurance. Develop a direction for managing the portfolio of to-be solutions (systems, shared infrastructure services, applications) to better match business outcome objectives. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture. Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology. Define and understand current-state solutions, and identify improvements, options, and trade-offs to define target-state solutions. Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly. Evaluate and recommend solutions that integrate with the overall technology ecosystem. Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution. Produce detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts. Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view. Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant remedies. Collaborate with sales, program/project, and consulting teams to reconcile solutions to the architecture. Track industry and application trends and relate them to current and future IT needs. Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations. Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the enterprise architecture. Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks: Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards. Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects. Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams. Recommend tools for reuse and automation to improve productivity and reduce cycle times. Lead the development and maintenance of enterprise frameworks and related artefacts. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams. Ensure architecture principles and standards are consistently applied to all projects. Ensure optimal client engagement: support the pre-sales team in presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates impact; and demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding: Complete necessary trainings and certifications. Develop proofs of concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research. Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc. Attain market referenceability and recognition through top analyst rankings, client testimonials, and partner credits. Be the voice of Wipro's thought leadership by speaking at internal and external forums. Mentor developers, designers, and junior architects on the project for their further career development and enhancement. Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management: Resourcing — anticipate new talent requirements per market/industry trends and client requirements, and hire adequate and right-fit resources for the team. Talent management — ensure adequate onboarding and training for team members to enhance capability and effectiveness, build an internal talent pool and ensure career progression within the organization, manage team attrition, and drive diversity in leadership positions. Performance management — set goals for the team, conduct timely performance reviews and provide constructive feedback to direct reports, and ensure that Performance Nxt is followed for the entire team. Employee satisfaction and engagement — lead and drive engagement initiatives for the team, track team satisfaction scores, and identify initiatives to build engagement within the team.

Mandatory Skills: Telco Solution Architecture. Experience: 8-10 Years.
Posted 2 weeks ago
10.0 - 12.0 years
12 - 14 Lacs
Chennai
Work from Office
Role Purpose
The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do

Develop architectural solutions for new deals and major change requests in existing deals: Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable. Provide solutioning for RFPs received from clients and ensure overall design assurance. Develop a direction for managing the portfolio of to-be solutions (systems, shared infrastructure services, applications) to better match business outcome objectives. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture. Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology. Define and understand current-state solutions, and identify improvements, options, and trade-offs to define target-state solutions. Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly. Evaluate and recommend solutions that integrate with the overall technology ecosystem. Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution. Produce detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts. Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view. Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant remedies. Collaborate with sales, program/project, and consulting teams to reconcile solutions to the architecture. Track industry and application trends and relate them to current and future IT needs. Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations. Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the enterprise architecture. Identify implementation risks and potential impacts.

Enable delivery teams by providing optimal delivery solutions/frameworks: Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards. Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects. Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams. Recommend tools for reuse and automation to improve productivity and reduce cycle times. Lead the development and maintenance of enterprise frameworks and related artefacts. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams. Ensure architecture principles and standards are consistently applied to all projects. Ensure optimal client engagement: support the pre-sales team in presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates impact; and demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

Competency building and branding: Complete necessary trainings and certifications. Develop proofs of concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research. Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc. Attain market referenceability and recognition through top analyst rankings, client testimonials, and partner credits. Be the voice of Wipro's thought leadership by speaking at internal and external forums. Mentor developers, designers, and junior architects on the project for their further career development and enhancement. Contribute to the architecture practice by conducting selection interviews, etc.

Team management: Resourcing — anticipate new talent requirements per market/industry trends and client requirements, and hire adequate and right-fit resources for the team. Talent management — ensure adequate onboarding and training for team members to enhance capability and effectiveness, build an internal talent pool and ensure career progression within the organization, manage team attrition, and drive diversity in leadership positions. Performance management — set goals for the team, conduct timely performance reviews and provide constructive feedback to direct reports, and ensure that Performance Nxt is followed for the entire team. Employee satisfaction and engagement — lead and drive engagement initiatives for the team, track team satisfaction scores, and identify initiatives to build engagement within the team.

Mandatory Skills: Mainframe. Experience: >10 Years.
Posted 2 weeks ago
6.0 - 9.0 years
0 - 2 Lacs
Bengaluru
Work from Office
Manager - Data Engineer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets such as Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, meritocratic culture that prioritizes continuous learning, skill development, and work-life balance.

About Corporate and Investment Banking & Investment Research (CIB & IR): As a global leader in knowledge processes, research, and analytics, you'll be working with a team that specializes in global market research, working with top-rated investment research organizations, bulge-bracket investment banks, and leading asset managers. We cater to 8 of the top 10 global banks, working alongside their product and sector teams and supporting them on deal origination, execution, valuation, and transaction advisory projects.

What you will be doing at Evalueserve: Construct analytical dashboards from alternative-data use cases, such as sector or thematic and financial KPI dashboards. Load and import data into internal warehouses through Azure Blob Storage and/or S3 deliveries, SFTP, and other ingestion mechanisms. Design and implement ETL workflows for preprocessing transactional and aggregated datasets, including complex joins, window functions, aggregations, bins, and partitions. Manipulate and enhance time-series datasets into relational data stores. Implement and refine panels in transactional datasets and the relevant panel normalization. Conduct web scraping, extraction, and post-processing of numerical data from web-based datasets.

What we're looking for: Previous experience working within fundamental equity investment workflows, such as exposure to financial modeling. High proficiency in SQL and the Python data stack (pandas, numpy, sklearn). Experience with scheduling and execution platforms such as Airflow, Prefect, or similar scheduled DAG frameworks. Understanding of efficient query management in Snowflake, Databricks, or equivalent platforms. Optional familiarity with automating workflows that produce Excel outputs, such as through openpyxl. Optional familiarity with integrations and imports/exports to REST/gRPC/GraphQL APIs.

Security: This role is performed in a dedicated, secure workspace. Travel: Annual travel to the U.S. for onsite collaboration is expected.

Follow us on https://www.linkedin.com/company/evalueserve/. Learn more about what our leaders are saying about our achievements: an AI-powered supply chain optimization solution built on Google Cloud, and how Evalueserve is leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities. Know more about how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024! Want to learn more about our culture and what it's like to work with us?
Write to us at: careers@evalueserve.com Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances. Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
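As a minimal illustration of the panel and window-function work described in this posting, the following self-contained pandas sketch computes a per-merchant rolling mean and a simple first-observation normalization; the data and column names are invented purely for demonstration.

```python
import pandas as pd

# Toy transactional panel: per-merchant daily spend (illustrative data only)
df = pd.DataFrame({
    "merchant": ["A", "A", "A", "B", "B", "B"],
    "date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03"] * 2),
    "spend": [100.0, 120.0, 90.0, 40.0, 60.0, 80.0],
})

# Window function per panel member: 2-day rolling mean of spend
df = df.sort_values(["merchant", "date"])
df["spend_roll2"] = (
    df.groupby("merchant")["spend"]
      .transform(lambda s: s.rolling(window=2, min_periods=1).mean())
)

# Simple panel normalization: index each merchant's spend to its first observation
df["spend_idx"] = df.groupby("merchant")["spend"].transform(lambda s: 100 * s / s.iloc[0])
print(df)
```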
Posted 2 weeks ago