5.0 - 10.0 years
12 - 19 Lacs
Pune
Hybrid
- Background in data pipelining, warehousing, and ETL development solutions for data science and other Big Data applications
- Minimum of 4 years of experience with cloud databases
Posted 1 week ago
10.0 - 18.0 years
25 - 32 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
At Techwave, we constantly strive to foster a culture of growth and inclusivity. We ensure that everyone associated with the brand is challenged at every step and provided with all the necessary opportunities to excel in life. People are at the core of everything we do. Join us! https://techwave.net/join-us/

Who are we? Techwave is a leading global IT and engineering services and solutions company revolutionizing digital transformations. We believe in enabling clients to maximize their potential and reach a greater market with a wide array of technology services, including, but not limited to, Enterprise Resource Planning, Application Development, Analytics, Digital, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to enable businesses to accelerate their growth. Plus, we're a team of dreamers and doers who are pushing the boundaries of what's possible. And we want YOU to be a part of it.

Job Title: Data Lead
Experience: 10+ Years
Mode of Hire: Full-time

Responsibilities
We are seeking a senior-level database/ETL developer (10-13 years of experience) responsible for building relational and data warehousing applications. The primary responsibility will be to support the existing EDW, design and develop the different layers of our data, and test and document the ETL process.
- Designs and develops frameworks and services according to specifications within a team environment.
- Prepares detailed system documentation, including requirements, specifications, test plans, and user manuals.
- Performs unit and system tests and, as needed, validation testing.
- Coordinates with Operations staff on deployment of applications.
- Ensures all activities are performed with quality and compliance.
- Design and implementation of ETL batches that meet the SLAs.
- Development of data collection, data staging, data movement, data quality, and archiving strategies.
- Plan and conduct ETL unit and development tests, monitoring results and taking corrective action when necessary.
- Experience in handling slowly changing dimensions using ETL.
- Design automation processes to control data access, transformation, and movement, and ensure source system data availability.
- Assists with database design; expected to have a solid understanding of database design principles and database administration methods and techniques.
- Perform data integrity checks and monitor the performance of data structures.
- Ability to write complex SQL queries, dynamic SQL, and stored procedures.
- Ability to work on a data warehouse migration from the existing platform to Snowflake.
- Preparing time estimates and justification for assigned tasks.

Required Skills:
- 8-10 years of ETL/ELT experience
- Very strong SQL skills, stored procedures, and database development skills
- 3-5 years of experience in Azure Data Lake, Synapse, Azure Data Factory, and Databricks
- 3-5 years of experience in Snowflake
- A good understanding of the concepts and best practices of data warehouse ETL and ELT design and building relational databases
- Self-starter with the ability to work independently
- Strong database experience in DB2, SQL Server, and Azure
- Strong in designing relational and dimensional data models
- Good understanding of enterprise reporting, primarily Power BI
- Understanding of Agile practices and methodology is a plus
- Assist with the analysis and extraction of relevant information from large amounts of historical business data to feed Business Intelligence initiatives
- Hands-on experience conducting proofs of concept for new technology selection and proposing new data warehouse architecture
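To illustrate the slowly changing dimension and Snowflake skills this listing asks for, the following is a minimal, purely illustrative Python sketch (not part of the employer's posting) of a two-step Type 2 SCD load run through the Snowflake Python connector. The connection parameters, table names, and columns (dim_customer, stg_customer, address, segment) are hypothetical placeholders.

```python
# Illustrative only: a Type 2 slowly changing dimension load against Snowflake.
# Connection parameters, tables, and columns are hypothetical placeholders.
import snowflake.connector

EXPIRE_CHANGED_ROWS = """
UPDATE dim_customer
   SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
  FROM stg_customer s
 WHERE dim_customer.customer_id = s.customer_id
   AND dim_customer.is_current = TRUE
   AND (dim_customer.address <> s.address OR dim_customer.segment <> s.segment);
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, address, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
  FROM stg_customer s
  LEFT JOIN dim_customer d
    ON d.customer_id = s.customer_id AND d.is_current = TRUE
 WHERE d.customer_id IS NULL;
"""

def apply_scd2(conn) -> None:
    """Expire changed current rows, then insert fresh current versions."""
    cur = conn.cursor()
    try:
        cur.execute(EXPIRE_CHANGED_ROWS)
        cur.execute(INSERT_NEW_VERSIONS)
    finally:
        cur.close()

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="EDW", schema="DIM",
    )
    try:
        apply_scd2(conn)
    finally:
        conn.close()
```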
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an American Airlines team member in the Tech Hub in Hyderabad, India, you will have the opportunity to be part of a diverse, high-performing team dedicated to technical excellence. Your primary focus will be on delivering unrivaled digital products that drive a more reliable and profitable airline. The Data Domain you'll be working in is centered around managing and leveraging data as a strategic asset, including data management, storage, integration, and governance, with a strong emphasis on Machine Learning, AI, Data Science, and Business Intelligence.

In this role, you will work closely with source data application teams and product owners to design, implement, and support analytics solutions that provide valuable insights for better decision-making. You will be responsible for implementing data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, and more, as well as traditional data warehouse tools. Your responsibilities will include various aspects of the development lifecycle, such as design, cloud engineering, data modeling, testing, performance tuning, deployments, BI, alerting, and production support. You will collaborate within a team environment and independently to develop technical solutions. As part of a DevOps team, you will have ownership of and support for the product you work on, implementing both batch and streaming data pipelines using cloud technologies.

To be successful in this role, you should have a Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems, or a related technical discipline, or equivalent experience. You should have at least 1 year of software solution development experience using agile and DevOps practices, and data analytics experience using SQL. Experience with cloud development and data lake technologies, particularly in Microsoft Azure, is preferred. Preferred qualifications include additional years of experience in software solution development, data analytics, and full-stack development, as well as specific experience with Azure technologies. Skills in scripting languages like Python, Spark, Unix, and SQL, along with expertise in the Azure technology stack and various data platforms and BI analytics tools, are highly valued. Certifications such as the Azure Development Track and Spark are preferred.

Effective communication skills are essential for this role, as you will need to collaborate with team members at all levels within the organization. Physical abilities are also necessary to perform the essential functions of the position safely. American Airlines values inclusion and diversity, providing a supportive environment for all team members to reach their full potential. If you are ready to be part of a dynamic, tech-driven environment where your creativity and strengths are celebrated, join American Airlines in Hyderabad and immerse yourself in the exciting world of technological innovation. Feel free to be yourself and contribute to keeping the largest airline in the world running smoothly as we care for people on life's journey.
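To illustrate the kind of batch pipeline work this listing describes (Databricks/Spark on an Azure data lake), here is a minimal, illustrative PySpark sketch. It is not from the employer; the storage paths, container names, and column names are hypothetical placeholders, and it assumes a running Spark environment with access to the storage account.

```python
# Illustrative sketch only: a small PySpark batch step (read raw files from a data
# lake, apply a transformation, write curated output). Paths and columns are made up.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-bookings-batch").getOrCreate()

raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/bookings/2024-06-01/")

curated = (
    raw.filter(F.col("status") == "TICKETED")
       .withColumn("booking_date", F.to_date("booking_ts"))
       .groupBy("booking_date", "origin", "destination")
       .agg(F.count("*").alias("tickets"), F.sum("fare_amount").alias("revenue"))
)

curated.write.mode("overwrite").partitionBy("booking_date").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/bookings_daily/"
)
```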
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
Calfus is a Silicon Valley headquartered software engineering and platforms company that seeks to inspire its team to rise faster, higher, stronger, and work together to build software at speed and scale. The company's core focus lies in creating engineered digital solutions that bring about a tangible and positive impact on business outcomes while standing for #Equity and #Diversity in its ecosystem and society at large.

As a Data Engineer specializing in BI Analytics & DWH at Calfus, you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower the organization to make data-driven decisions. Leveraging expertise in Power BI, Tableau, and ETL processes, you will create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.

Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytical & DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data Integration: Oversee the ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Utilize SQL to write complex queries and stored procedures, and manage data transformations using joins and cursors.
- Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability.
- Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 6-12 years of experience in BI architecture and development, with a strong focus on Power BI and Tableau.
- Proven experience with ETL processes and tools, especially SSIS.
- Strong proficiency in SQL Server, including advanced query writing and database management.
- Exploratory data analysis with Python.
- Familiarity with the CRISP-DM model.
- Ability to work with different data models.
- Familiarity with databases like Snowflake, Postgres, Redshift & MongoDB.
- Experience with visualization tools such as Power BI, QuickSight, Plotly, and/or Dash.
- Strong programming foundation with Python for data manipulation and analysis using Pandas, NumPy, and PySpark; data serialization & formats like JSON, CSV, Parquet & Pickle; database interaction; data pipeline and ETL tools; cloud services & tools; and code quality and management using version control.
- Ability to interact with REST APIs and perform web scraping tasks is a plus.

Calfus Inc. is an Equal Opportunity Employer.
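As an illustration of the "exploratory data analysis with Python" skill this listing names, here is a short, generic Pandas sketch (not from the employer). The file name and column names are hypothetical placeholders.

```python
# Illustrative sketch only: basic exploratory data analysis with Pandas ahead of
# dashboard design. The CSV file and columns are hypothetical placeholders.
import pandas as pd

sales = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

print(sales.info())                   # column types and null counts
print(sales.describe(include="all"))  # summary statistics

# Simple profiling: monthly revenue by region.
monthly = (
    sales.assign(month=sales["order_date"].dt.to_period("M"))
         .groupby(["month", "region"], as_index=False)["revenue"].sum()
)
print(monthly.head())
```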
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Vadodara, Gujarat
On-site
You will be working with Polymer, a smart data loss prevention (DLP) system that offers advanced cloud & AI data security and compliance solutions. By leveraging Polymer, you will play a crucial role in automating data protection processes, reducing data exposure risks, and enabling employees to enhance data security practices seamlessly within their existing workflows.

Your responsibilities will include designing, developing, and maintaining ETL processes within large-scale data environments utilizing tools such as Snowflake and BigQuery. You will be tasked with constructing and deploying data pipelines to manage data ingestion, transformation, and loading operations from diverse sources. Additionally, you will create and manage data models and schemas optimized for performance and scalability, leveraging BI tools like QuickSight, Tableau, or Sigma to generate interactive dashboards and reports.

Collaboration with stakeholders to grasp business requirements and convert them into technical solutions will be a key aspect of your role. You will communicate complex data insights clearly to both technical and non-technical audiences, proactively identify and resolve data quality issues and performance bottlenecks, and contribute to enhancing the data infrastructure and best practices within the organization.

As a qualified candidate, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Computer Engineering, or a related field, along with 3-5 years of experience in a data science/engineering role. Proficiency in Python, including experience with Django or Flask, is essential, while expertise in Snowflake and BigQuery is advantageous. Experience with relational databases like MySQL or PostgreSQL, designing ETL processes in large-scale data environments, and working with cloud platforms such as AWS or GCP is highly valued.

Your problem-solving and analytical skills, combined with a data-driven mindset, will be crucial in this role. Strong communication and interpersonal skills, together with the ability to work both independently and collaboratively within a team, are essential attributes. Familiarity with Agile development methodologies will be beneficial for success in this position. This is an onsite opportunity located in Vadodara, Gujarat, India.
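For a sense of the ETL work this listing describes (relational source to a cloud warehouse in Python), here is a minimal illustrative sketch, not from the employer. It assumes the SQLAlchemy and google-cloud-bigquery packages are available; the connection string, table names, and columns are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal extract-transform-load step
# (PostgreSQL source -> transformation in Python -> BigQuery load).
# Connection strings, tables, and columns are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine
from google.cloud import bigquery

# Extract: pull the last day of events from a PostgreSQL source.
engine = create_engine("postgresql+psycopg2://etl_user:***@db-host:5432/appdb")
events = pd.read_sql(
    "SELECT user_id, event_type, occurred_at "
    "FROM events WHERE occurred_at >= now() - interval '1 day'",
    engine,
)

# Transform: normalise event types and stamp the load time.
events["event_type"] = events["event_type"].str.lower()
events["loaded_at"] = pd.Timestamp.now(tz="UTC")

# Load: append the batch to a BigQuery table.
client = bigquery.Client()
client.load_table_from_dataframe(events, "analytics.events_daily").result()
```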
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You are a Senior Software Engineer at Elevance Health, a prominent health company in America dedicated to enhancing lives and simplifying healthcare. Elevance Health is the largest managed healthcare company in the Blue Cross Blue Shield (BCBS) Association, serving over 45 million lives across 14 states. This Fortune 500 company is currently ranked 20th and led by Gail Boudreaux, a prominent figure in the Fortune list of most powerful women.

Your role will be within Carelon Global Solutions (CGS), a subsidiary of Elevance Health, focused on simplifying complex operational processes in the healthcare system. CGS brings together a global team of innovators across various locations, including Bengaluru and Gurugram in India, to optimize healthcare operations effectively and efficiently.

As a Senior Software Engineer, your primary responsibility involves collaborating with data architects to implement data models and ensure seamless integration with AWS services. You will be responsible for supporting, monitoring, and resolving production issues to meet SLAs, being available 24/7 for business application support. You should have hands-on experience with technologies like Snowflake, Python, AWS S3, Athena, RDS, CloudWatch, Lambda, and more. Your expertise should include handling nested JSON files, analyzing daily loads/issues, working closely with admin/architect teams, and understanding complex job and data flows in the project.

To qualify for this role, you need a Bachelor's degree in Information Technology/Data Engineering or equivalent education and experience, along with 5-8 years of overall IT experience and 2-9 years in AWS services. Experience in agile development processes is preferred. You are expected to have skills in Snowflake, AWS services, complex SQL queries, and technologies like Hadoop, Kafka, HBase, Sqoop, and Scala. Your ability to analyze, research, and solve technical problems will be crucial for success in this role.

Carelon promises limitless opportunities for its associates, emphasizing growth, well-being, purpose, and belonging. With a focus on learning and development, an innovative culture, and comprehensive rewards, Carelon offers a supportive environment for personal and professional growth. Carelon is an equal opportunity employer that values diversity and inclusivity. If you require accommodations due to a disability, you can request the Reasonable Accommodation Request Form. This is a full-time position that offers a competitive benefits package and a conducive work environment.
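Since the listing calls out handling nested JSON files, here is a small, self-contained illustrative sketch (not from the employer) of flattening a nested JSON record in Python before a tabular load. The field names are hypothetical placeholders.

```python
# Illustrative sketch only: flatten a nested JSON record so it can be loaded
# into tabular storage. Field names are hypothetical placeholders.
import json

def flatten(record: dict, parent_key: str = "", sep: str = "_") -> dict:
    """Recursively flatten nested dictionaries into a single-level dict."""
    flat = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat

claim = json.loads(
    '{"claim_id": "C123", "member": {"id": "M9", "plan": {"code": "GOLD"}}, "amount": 250.0}'
)
print(flatten(claim))
# {'claim_id': 'C123', 'member_id': 'M9', 'member_plan_code': 'GOLD', 'amount': 250.0}
```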
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
You are a strategic thinker passionate about driving solutions in Data Analytics. You have found the right team.

As an Analytics Solutions Vice President in our Finance team, you will define, refine, and deliver our firm's goals. If you're a skilled data professional passionate about transforming raw data into actionable insights and eager to learn and implement new technologies, you've found the right team. Join us in the Finance Data & Insights Team, an agile product team focused on developing, producing, and transforming financial data and reporting across CCB. Your role will involve creating data visualizations and intelligence solutions for top leaders to achieve strategic goals. You'll identify opportunities to eliminate manual processes and use automation tools like Alteryx, Tableau, and ThoughtSpot to develop automated solutions. Additionally, you'll extract, analyze, and summarize data for ad hoc requests and contribute to modernizing our data environment to a cloud platform.

Job responsibilities:
- Lead Data & Analytics requirements gathering sessions with varying levels of leadership and complete detailed project planning using JIRA to record planned project execution steps.
- Understand databases and ETL processes, and translate logic into requirements for the Technology team.
- Develop and enhance Alteryx workflows by collecting data from disparate sources and summarizing it as defined in requirements gathering with stakeholders, following best practices to source data from authoritative sources.
- Develop data visualization solutions using Tableau and/or ThoughtSpot to provide intuitive insights to key stakeholders.
- Conduct thorough control testing of each component of the intelligence solution, providing evidence that all data and visualizations offer accurate insights and evidence in the control process.
- Seek to understand stakeholder use cases to anticipate their requirements, questions, and objections.
- Become a subject matter expert in these responsibilities and support team members in becoming more proficient.

Required qualifications, capabilities, and skills:
- Bachelor's degree in MIS or Computer Science, Mathematics, Engineering, Statistics, or other quantitative or financial subject areas
- People management experience of at least 3 years is required
- Experience with business intelligence analytic and data wrangling tools such as Alteryx, SAS, or Python
- Experience with relational databases, optimizing SQL to pull and summarize large datasets, report creation and ad-hoc analyses, Databricks, and cloud solutions
- Experience in reporting development and testing, and ability to interpret unstructured data and draw objective inferences given known limitations of the data
- Demonstrated ability to think beyond raw data, understand the underlying business context, and sense business opportunities hidden in data
- Strong written and oral communication skills; ability to communicate effectively with all levels of management and partners from a variety of business functions
- Experience with ThoughtSpot or similar tools empowering stakeholders to better understand their data
- Highly motivated, self-directed, and curious to learn new technologies

Preferred qualifications, capabilities, and skills:
- Experience with ThoughtSpot / Python is a major advantage
- Experience with AI/ML or LLMs is an added advantage but not a must-have
- Minimum 8 years of experience developing advanced data visualizations and presentations, preferably with Tableau
- Experience with Hive, Spark SQL, Impala, or other big-data query tools
- AWS, Databricks, Snowflake, or other cloud data warehouse experience
- Minimum of 8 years of experience working with data analytics projects, preferably related to the financial services domain
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be a valuable member of the data engineering team, focusing on developing data pipelines, transforming data, exploring new data patterns, optimizing current data feeds, and implementing enhancements. Your primary responsibilities will draw on your expertise in RDBMS concepts, hands-on experience with the AWS Cloud platform and services (including IAM, EC2, Lambda, RDS, Timestream, Glue, etc.), familiarity with data streaming tools like Kafka, practical knowledge of ETL/ELT tools, and understanding of Snowflake, PostgreSQL, or any other database system. Ideally, you should also have a good grasp of data modeling techniques to further bolster your capabilities in this role.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
About this position:
We are eagerly seeking candidates with 4 to 8 years of experience for an Automation QA Dot Net role to join our dynamic team in Pune. The ideal candidate will play a pivotal role in ensuring the quality, performance, and reliability of our microservice components delivered to clients. You will be integral to our digital transformation efforts, utilizing industry best practices and advanced technologies such as cloud computing, artificial intelligence, and robotic process automation to enhance business capabilities.

Impact you will realize:
As an Automation QA Dot Net, you will be responsible for defining and executing test plans for various high-complexity products and solutions. This includes covering end-to-end testing across functional and cross-functional teams, as well as regression, usability, UI, and performance tests. You will monitor and track test plan/automation projects, review projects to handle changes in scope effectively, define and implement test automation projects, ensure build acceptance tests are executed within the given timeframe, and improve QA processes and best practices.

Key skills you will require:
Primary Skills:
- Strong knowledge of SQL, Snowflake, data products/services, queueing systems, and reporting.
- Experience in cross-team functional end-to-end testing.
- Knowledge of AWS services.
- Understanding of test automation and performance testing tools.
- Experience with TypeScript and backend testing.
- Proficiency in GIT commands.
- Strong experience in API and microservice testing using automation tools.
- Sound knowledge of SDLC phases and the testing life cycle.
- Strong requirements and business analysis skills with attention to detail.
- Test management in Jira: test cases, test execution.
- Ability to define and execute test suites/test cases.
- Test automation tools: Playwright, Selenium, Test Architect.
- Knowledge of continuous integration tools such as Jenkins.
- Experience with any data streaming application.
- Strong planning and tracking of work items.
- Experience in enterprise-level project release patterns (major & minor releases).
- Proficiency in Jira (fix versions, JQL).
- Understanding of various testing cycles based on release patterns (regression cycles importantly).

Why should you join Xoriant?
Xoriant is a trusted provider of digital engineering services, known for building and operating complex platforms and products at scale. With three decades of software engineering excellence, we combine modern technology expertise in Data & AI (GenAI), cloud & security, and domain and process consulting to solve complex technology challenges. We serve over 100 Fortune 500 companies and tech startups on their journey to becoming unicorns and beyond. With over 5000 passionate XFactors (our employees) from over 20 countries, we foster a culture focused on purpose and employee happiness. If you possess the XFactor and are looking to be part of a passionate team that creates a better future through tech & innovation, Xoriant is the place for you. Join us in a diverse and inclusive workspace where continuous learning, well-being, and work-life balance are prioritized. Be part of a culture that values innovation, rewards hard work, and celebrates diversity.

To know more about Xoriant, please visit: www.xoriant.com
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a DevOps Engineer for our team based in Europe, you will be responsible for leveraging your skills in Informatica PowerCenter and PowerExchange, Data Vault modeling, and Snowflake. With over 7 years of experience, you will bring valuable expertise in ETL development, specifically with Informatica PowerCenter and Data Vault modeling. Your proficiency in DevOps practices and SAFe methodologies will be essential in ensuring the smooth operation of our systems. Moreover, your hands-on experience with Snowflake and dbt will be advantageous in optimizing our data processes.

You will have the opportunity to work within a scrum team environment, where your contributions will be vital. If you have previous experience as a Scrum Master or aspire to take on such a role, we encourage you to apply. If you are a detail-oriented professional with a passion for driving efficiency and innovation in a dynamic environment, we would love to hear from you. Please send your profile to contact@squalas.com to be considered for this exciting opportunity.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
The role within the Niro Money Data and Analytics team involves translating data into actionable insights to enhance marketing ROI, drive business growth, and improve customer experience for various financial products such as personal loans, home loans, credit cards, and insurance. The successful candidate will possess a strong background in data analytics and be capable of providing strategic recommendations to key stakeholders and business leaders.

You will lead, mentor, and develop a high-performing team of data analysts and data scientists focused on building decision science models and segmentations to predict customer behaviors. Collaborating with the Partnership & Marketing team, you will conduct marketing experiments to enhance funnel conversion rates. Additionally, you will evaluate the effectiveness of marketing campaigns, identify successful strategies, recommend necessary changes, and oversee the implementation of customer journey-related product enhancements.

Creating a culture of collaboration, innovation, and data-driven decision-making across various teams is crucial. You will manage multiple analytics projects concurrently, prioritizing them based on potential business impact and ensuring timely and accurate completion. Project planning, monitoring, and addressing challenges promptly to keep projects on track are essential responsibilities. Collaborating with Data Engineering, Technology, and Product teams, you will develop and implement data capabilities for conducting marketing experiments and delivering actionable insights at scale.

Applicants should hold a Master's degree in statistics, mathematics, data science, or economics, or a BTech in computer science or engineering. A minimum of 5 years of hands-on experience in decision science analytics and developing data-driven strategies, preferably in the financial services industry, is required. You should also have at least 2 years of experience in managing and leading teams of data analysts and data scientists. Proficiency in statistical model development within the financial services industry, including the use of logistic regression/gradient boosting algorithms with Python packages like scikit-learn, XGBoost, or statsmodels, or decision tree tools, is essential.

Moreover, candidates should have a minimum of 2 years of practical experience in SQL and Python. A proven track record of making data-driven decisions and solving problems based on analytics is necessary. Familiarity with Snowflake, AWS Athena/S3, Redshift, and BI tools like AWS QuickSight is advantageous. An analytical mindset and the ability to assess complex scenarios and make data-driven decisions are essential qualities. A creative and curious nature, willingness to learn new tools and techniques, and a data-oriented personality are desired traits. Excellent communication and interpersonal skills are crucial for effectively collaborating with diverse stakeholders.
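For context on the modeling skills this listing names (logistic regression with scikit-learn for customer behavior prediction), here is a small illustrative sketch on purely synthetic data. It is not the employer's model; the features, coefficients, and figures are all made up.

```python
# Illustrative sketch only: a propensity-style model of the kind the role describes,
# trained with scikit-learn logistic regression on synthetic data (no real customers).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5_000
X = np.column_stack([
    rng.normal(50_000, 15_000, n),   # hypothetical income
    rng.integers(300, 900, n),       # hypothetical bureau score
    rng.integers(0, 10, n),          # hypothetical prior products held
])

# Synthetic conversion label loosely driven by the features.
logit = -6 + 0.00002 * X[:, 0] + 0.004 * X[:, 1] + 0.15 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```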
Posted 1 week ago
7.0 - 10.0 years
6 - 16 Lacs
Bengaluru
Hybrid
Experienced Database Developer (7–10 yrs) with strong skills in SQL/PLSQL, performance tuning, SCD, dimensional modeling (Star/Snowflake), and handling ad-hoc tasks across multiple projects
Posted 1 week ago
7.0 - 9.0 years
20 - 25 Lacs
Hyderabad, Bengaluru
Work from Office
Immediate Joiners Only

Role & responsibilities:
- 6+ years of experience with Snowflake (Snowpipe, Streams, Tasks)
- Strong proficiency in SQL for high-performance data transformations
- Hands-on experience building ELT pipelines using cloud-native tools
- Proficiency in dbt for data modeling and workflow automation
- Python skills (Pandas, PySpark, SQLAlchemy) for data processing
- Experience with orchestration tools like Airflow or Prefect
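To illustrate the orchestration and ELT skills listed above, here is a minimal, illustrative Airflow DAG sketch (not from the employer). It assumes Airflow 2.x with the Snowflake provider package installed; the connection IDs, stage, table names, schedule, and dbt project path are hypothetical placeholders.

```python
# Illustrative sketch only: an Airflow DAG that loads staged files into Snowflake
# and then runs dbt models. Connection IDs, objects, and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",
    catchup=False,
) as dag:
    load_stage = SnowflakeOperator(
        task_id="copy_into_staging",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO stg.orders FROM @raw_stage/orders/ FILE_FORMAT = (TYPE = PARQUET);",
    )

    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="cd /opt/dbt/analytics && dbt run --select orders_mart",
    )

    load_stage >> run_dbt
```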
Posted 1 week ago
15.0 - 20.0 years
40 - 100 Lacs
Bengaluru
Hybrid
Hiring: Investment Management and Risk Data Product Owner - ISS Data (Associate Director)

Role
The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future state design for investment and risk data across our key business areas. The successful candidate will have an in-depth knowledge of all data domains that service investment management, risk, and attribution capabilities within the asset management industry. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to delivering the cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients.

Key Responsibilities
Leadership and Management:
- Lead the Investment and Risk data outcomes and capabilities for the ISS Data Programme.
- Realign existing resources and provide coaching and line management for junior data analysts within the chapter; influence and motivate them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead data product documentation, enable peer reviews, obtain analysis effort estimation, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for improving efficiencies and innovation.
Data Quality and Integrity:
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best practice data architecture & engineering.
Coordination and Communication:
- Senior management level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps. An advocate for the ISS Data Programme.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- Collaborate closely with Data Governance, Business Architecture, and Data owners etc.
- Conduct workshops within the scrum teams and across business teams, effectively document the minutes, and drive the actions.

Essential Skills Required
- Strong leadership and senior management level communication, internal and external client management, and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI, and FactSet support Investment, Risk, Performance, and Attribution business needs.
- Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk, and attribution.
- In-depth expertise in data and calculations across the investment industry covering the below:
  - Financial data: information on asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions.
  - Asset-specific data: data related to financial instruments reference data, such as asset specifications, maintenance records, usage history, and depreciation schedules.
  - Market data: data such as security prices, exchange rates, index constituents, and licensing restrictions on them.
  - Risk data: data related to risk factors such as market risk, credit risk, operational risk, and compliance risk.
  - Performance & Attribution data: data on fund performance returns and attribution using various methodologies, such as time-weighted returns and transaction-based performance attribution.
- Should possess problem solving, attention to detail, and critical thinking.
- Technical skills: hands-on SQL, advanced Excel, Python, ML (optional), and knowledge of end-to-end tech solutions involving data platforms.
- Knowledge of data management, data governance, and data engineering practices.
- Hands-on experience with data modelling techniques such as dimensional and data vault.
- Willingness to own and drive things; collaboration across business and tech stakeholders.
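Since the listing references time-weighted returns among the performance methodologies, here is a small worked sketch of the calculation: geometrically chain-linking sub-period returns, TWR = prod(1 + r_i) - 1. The figures are made up and it assumes external cash flows occur at the start of each sub-period; it is an illustration, not the employer's methodology.

```python
# Illustrative sketch only: chain-linking sub-period returns into a time-weighted
# return (TWR). All market values and flows below are made-up example figures.
def sub_period_return(begin_value: float, end_value: float, net_flow: float) -> float:
    """Return for one sub-period, with the external flow assumed at the start."""
    return (end_value - begin_value - net_flow) / (begin_value + net_flow)

def time_weighted_return(periods: list[tuple[float, float, float]]) -> float:
    """Geometrically link sub-period returns: TWR = prod(1 + r_i) - 1."""
    twr = 1.0
    for begin_value, end_value, net_flow in periods:
        twr *= 1.0 + sub_period_return(begin_value, end_value, net_flow)
    return twr - 1.0

# (begin market value, end market value, external cash flow at period start)
periods = [
    (1_000_000, 1_040_000, 0),
    (1_040_000, 1_100_000, 20_000),
    (1_100_000, 1_090_000, 0),
]
print(f"TWR: {time_weighted_return(periods):.4%}")
```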
Posted 1 week ago
15.0 - 20.0 years
40 - 100 Lacs
Bengaluru
Hybrid
Hiring: Sustainable, Client and Regulatory Reporting Data Product Owner - ISS Data (Associate Director)

About your team
The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing, and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management, and remediation. The Technology group is responsible for providing Technology solutions to the Investment Solutions & Services business (which covers Investment Management, Asset Management Operations & Distribution business units globally). The Technology team supports and enhances existing applications as well as designs, builds, and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching Investment Solutions and Services strategy.

About your role
The Investment Reporting Data Product Owner role is instrumental in the creation and execution of a future state data reporting product to enable Regulatory, Client, Vendor, Internal & MI reporting and analytics. The successful candidate will have an in-depth knowledge of all data domains that represent institutional clients, the investment life cycle, and regulatory and client reporting data requirements. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned with our cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our internal business stakeholders and our clients.

Key Responsibilities
Leadership and Management:
- Lead the ISS distribution, Client Propositions, Sustainable Investing, and Regulatory reporting data outcomes, defining the data roadmap and capabilities and supporting the execution and delivery of the data solutions as a Data Product lead within the ISS Data Programme.
- Line management responsibilities for junior data analysts within the chapter, coaching, influencing, and motivating them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead and define the data product backlog and documentation, enable peer reviews, estimate analysis effort, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for driving efficiencies, scale, and innovation.
Data Quality and Integrity:
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best practice data architecture & engineering.
Coordination and Communication:
- Senior management level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- An advocate for the ISS Data Programme.
- Collaborate closely with Data Governance, Business Architecture, and Data owners etc.
- Conduct workshops within the scrum teams and across business teams, effectively document the minutes, and drive the actions.

Your Skills and Experience
- Strong leadership and senior management level communication, internal and external client management, and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- Outstanding knowledge of the client life cycle covering institutional & wholesale, with a focus on CRM data and transfer agency data.
- Very good understanding of the data generated by investment management processes and how that is leveraged in go-to-market capabilities such as client reporting, sales, and marketing.
- Excellent knowledge of the regulatory environment, with a focus on European regulations and ESG-specific ones such as MiFID II, EMIR, and SFDR.
- Work effortlessly in different operating models such as insourcing, outsourcing, and hybrid models.
- Automation mindset that can drive efficiencies and quality in the reporting landscape.
- Knowledge of industry-standard data calculations for fund factsheets, institutional admin, and investment reports would be an added advantage.
- In-depth expertise in data and calculations across the investment industry covering the below:
  - Client-specific data: institutional and wholesale client, account, and channels data, client preferences, and data sets needed for client analytics. Knowledge of Salesforce desirable.
  - Transfer Agency & Platform data: granular client holdings at various levels, client transactions, and relevant reference data. Knowledge of the role of TPAs as TA and of integrating external feeds/products with strategic in-house data platforms.
  - Investment data: investment life cycle data covering data domains such as trading, ABOR, IBOR, security, and fund reference.
- Should possess problem solving, attention to detail, and critical thinking.
- Technical skills: hands-on SQL, advanced Excel, Python, ML (optional), and knowledge of end-to-end tech solutions involving data platforms.
- Knowledge of data management, data governance, and data engineering practices.
- Hands-on experience with data modelling techniques such as dimensional and data vault.
- Willingness to own and drive things; collaboration across business and tech stakeholders.
Posted 1 week ago
7.0 - 12.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Lead Data Engineer - What You Will Do:
As a PR3 Lead Data Engineer, you will be instrumental in driving our data strategy, ensuring data quality, and leading the technical execution of a small, impactful team. Your responsibilities will include:

Team Leadership:
- Establish the strategic vision for the evolution of our data products and our technology solutions, then provide technical leadership and guidance for a small team of Data Engineers in executing the roadmap.
- Champion and enforce best practices for data quality, governance, and architecture within your team's work. Embody a product mindset over the team's data.
- Oversee the team's use of Agile methodologies (e.g., Scrum, Kanban), ensuring smooth and predictable delivery, and overtly focusing on continuous improvement.

Data Expertise & Domain Knowledge:
- Actively seek out, propose, and implement cutting-edge approaches to data transfer, transformation, analytics, and data warehousing to drive innovation.
- Design and implement scalable, robust, and high-quality ETL processes to support growing business demand for information, delivering data as a reliable service that directly influences decision making.
- Develop a profound understanding and "feel" for the business meaning, lineage, and context of each data field within our domain.

Communication & Stakeholder Partnership:
- Collaborate with other engineering teams and business partners, proactively managing dependencies and holding them accountable for their contributions to ensure successful project delivery.
- Actively engage with data consumers to achieve a deep understanding of their specific data usage, pain points, and current gaps, then plan initiatives to implement improvements collaboratively.
- Clearly articulate project goals, technical strategies, progress, challenges, and business value to both technical and non-technical audiences. Produce clear, concise, and comprehensive documentation.

Your Qualifications:
At Vista, we value the experience and potential that individual team members add to our culture. Please don't hesitate to apply even if you don't meet the exact qualifications; we look forward to learning more about you!
- Bachelor's or Master's degree in computer science, data engineering, or a related field.
- 10+ years of professional experience, with at least 6 years of hands-on Data Engineering, specifically in e-commerce or direct-to-consumer, and 4 years of team leadership.
- Demonstrated experience in leading a team of data engineers, providing technical guidance, and coordinating project execution.
- Stakeholder management experience and excellent communication skills.
- Strong knowledge of SQL and data warehousing concepts is a must.
- Strong knowledge of data modeling concepts and hands-on experience designing complex multi-dimension data models.
- Strong hands-on experience in designing and managing scalable ETL pipelines in cloud environments with large-volume datasets (both structured and unstructured data).
- Proficiency with cloud services in AWS (preferred), including S3, EMR, RDS, Step Functions, Fargate, Glue, etc.
- Critical hands-on experience with cloud-based data platforms (Snowflake strongly preferred).
- Data visualization experience with reporting and data tools (preferably Looker with LookML skills).
- Coding mastery in at least one modern programming language: Python (strongly preferred), Java, Golang, PySpark, etc.
- Strong knowledge of production standards such as versioning, CI/CD, data quality, documentation, automation, etc.
- Problem solving and multi-tasking ability in a fast-paced, globally distributed environment.

Nice To Have:
- Experience with API development on enterprise platforms, with GraphQL APIs being a clear plus.
- Hands-on experience designing DBT data pipelines.
- Knowledge of finance, accounting, supply chain, logistics, operations, and procurement data is a plus.
- Experience managing work in Jira and writing documentation in Confluence.
- Proficiency in AWS account management, including IAM, infrastructure, and monitoring for health, security, and cost optimization.
- Experience with Gen AI/ML tools for enhancing data pipelines or automating analysis.

Why You'll Love Working Here
There is a lot to love about working at Vista. We are an award-winning Remote-First company. We're an inclusive community. We're growing (which means you can too). And to help orient us all in the same direction, we have our Vista Behaviors, which exemplify the behavioral attributes that make us a culturally strong and high-performing team.

Our Team: Enterprise Business Solutions
Vista's Enterprise Business Solutions (EBS) domain is working to make our company one of the most data-driven organizations to support Finance, Supply Chain, and HR functions. The cross-functional team includes product owners, analysts, technologists, data engineers and more – all focused on providing Vista with cutting-edge tools and data we can use to deliver jaw-dropping customer value. EBS team members are empowered to learn new skills, communicate openly, and be active problem-solvers.

Join our EBS Domain as a Lead Data Engineer! This Lead level within the organization will be responsible for the work of a small team of data engineers, focusing not only on implementations but also on operations and support. The Lead Data Engineer will implement best practices, data standards, and reporting tools. The role will oversee and manage the work of other data engineers as well as being an individual contributor. This role has a lot of opportunity to impact general ETL development and implementation of new solutions. We will look to the Lead Data Engineer to modernize data technology solutions in EBS, including the opportunity to work on modern warehousing, finance, and HR datasets and integration technologies. This role will require an in-depth understanding of cloud data integration tools and cloud data warehousing, with a strong and pronounced ability to lead and execute initiatives to tangible results.
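As a flavour of the AWS-based ETL work this listing covers, here is a minimal illustrative Python sketch (not from the employer) that reads a raw CSV from S3, cleans it, and writes Parquet back. It assumes boto3, pandas, and pyarrow are installed; the bucket names, keys, and columns are hypothetical placeholders.

```python
# Illustrative sketch only: a small S3-based ETL step. Bucket names, object keys,
# and columns are hypothetical placeholders; requires boto3, pandas, and pyarrow.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

obj = s3.get_object(Bucket="example-raw-bucket", Key="finance/invoices/2024-06-01.csv")
invoices = pd.read_csv(io.BytesIO(obj["Body"].read()), parse_dates=["invoice_date"])

# Basic cleaning before the warehouse load.
invoices = invoices.dropna(subset=["invoice_id"]).drop_duplicates("invoice_id")
invoices["amount"] = invoices["amount"].round(2)

buffer = io.BytesIO()
invoices.to_parquet(buffer, index=False)
s3.put_object(
    Bucket="example-curated-bucket",
    Key="finance/invoices/dt=2024-06-01/invoices.parquet",
    Body=buffer.getvalue(),
)
```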
Posted 1 week ago
15.0 - 21.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Data Architect with over 15 years of experience, your primary responsibility will be to lead the design and implementation of scalable, secure, and high-performing data architectures. You will collaborate with business, engineering, and product teams to develop robust data solutions that support business intelligence, analytics, and AI initiatives.

Your key responsibilities will include designing and implementing enterprise-grade data architectures using cloud platforms such as AWS, Azure, or GCP. You will lead the definition of data architecture standards, guidelines, and best practices while architecting scalable data solutions like data lakes, data warehouses, and real-time streaming platforms. Collaborating with data engineers, analysts, and data scientists, you will ensure optimal solutions are delivered based on data requirements.

In addition, you will oversee data modeling activities encompassing conceptual, logical, and physical data models. It will be your duty to ensure data security, privacy, and compliance with relevant regulations like GDPR and HIPAA. Defining and implementing data governance strategies alongside stakeholders and evaluating data-related tools and technologies are also integral parts of your role.

To excel in this position, you should possess at least 15 years of experience in data architecture, data engineering, or database development. Strong experience in architecting data solutions on major cloud platforms like AWS, Azure, or GCP is essential. Proficiency in data management principles, data modeling, ETL/ELT pipelines, and modern data platforms/tools such as Snowflake, Databricks, and Apache Spark is required. Familiarity with programming languages like Python, SQL, or Java, as well as real-time data processing frameworks like Kafka, Kinesis, or Azure Event Hub, will be beneficial.

Moreover, experience in implementing data governance, data cataloging, and data quality frameworks is important. Knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus. Excellent problem-solving, communication, and stakeholder management skills are necessary for this role. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred, along with certifications like Cloud Architect or Data Architect (AWS/Azure/GCP).

Join us at Infogain, a human-centered digital platform and software engineering company, where you will have the opportunity to work on cutting-edge data and AI projects in a collaborative and inclusive work environment. Experience competitive compensation and benefits while contributing to experience-led transformation for our clients in various industries.
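Kafka is one of the streaming frameworks this listing names; the following is a minimal illustrative consumer sketch (not from the employer). It assumes the kafka-python package is installed; the broker address, topic, consumer group, and payload fields are hypothetical placeholders.

```python
# Illustrative sketch only: consuming a real-time stream with Kafka. Broker, topic,
# group, and payload fields are hypothetical placeholders; requires kafka-python.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders.events",
    bootstrap_servers=["broker-1:9092"],
    group_id="orders-ingestion",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline this is where validation, enrichment, and a write to the
    # lake or warehouse would happen; here we just print the parsed event.
    print(event.get("order_id"), event.get("status"))
```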
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be responsible for designing, developing, and implementing data-centric software solutions using various technologies. This includes conducting code reviews, recommending best coding practices, and providing effort estimates for the proposed solutions. Additionally, you will design audit business-centric software solutions and maintain comprehensive documentation for all proposed solutions.

As a key member of the team, you will lead architecture and design efforts for product development and application development for relevant use cases. You will provide guidance and support to team members and clients, implementing best practices of data engineering and architectural solution design, development, testing, and documentation. Your role will require you to participate in team meetings, brainstorming sessions, and project planning activities. It is essential to stay up to date with the latest advancements in the data engineering area to drive innovation and maintain a competitive edge. You will stay hands-on with the design, development, and validation of systems and models deployed.

Collaboration with audit professionals to understand business, regulatory, and risk requirements, as well as key alignment considerations for audit, is a crucial aspect of the role. Driving efforts in the data engineering and architecture practice area will be a key responsibility.

In terms of mandatory technical and functional skills, you should have a deep understanding of RDBMS (MS SQL Server, Oracle, etc.), strong programming skills in T-SQL, and proven experience in ETL and reporting (MSBI stack/Cognos/Informatica, etc.). Additionally, experience with cloud-centric databases (Azure SQL/AWS RDS), ADF (Azure Data Factory), data warehousing skills using Synapse/Redshift, understanding and implementation experience of data lakes, and experience in large data processing/ingestion using Databricks APIs, Lakehouse, etc., are required. Knowledge of MPP databases like Snowflake or Postgres-XL is also essential.

Preferred technical and functional skills include an understanding of financial accounting, experience with NoSQL using MongoDB/Cosmos DB, Python coding experience, and an aptitude for emerging data platform technologies like MS Azure Fabric.

Key behavioral attributes required for this role include strong analytical, problem-solving, and critical-thinking skills, excellent collaboration skills, the ability to work effectively in a team-oriented environment, excellent written and verbal communication skills, and the willingness to learn new technologies and work on them.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Supply Chain Data Integration Consultant Senior The opportunity We're looking for Senior Level Consultants with expertise in Data Modelling, Data Integration, Data Manipulation, and analysis to join the Supply Chain Technology group of our GDS consulting Team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering. This role demands a highly technical, extremely hands-on Data Warehouse Modelling consultant who will work closely with our EY Partners and external clients to develop new business as well as drive other initiatives on different business needs. The ideal candidate must have a good understanding of the value of data warehouse and ETL with Supply Chain industry knowledge and proven experience in delivering solutions to different lines of business and technical leadership. Your key responsibilities A minimum of 5+ years of experience in BI/Data integration/ETL/DWH solutions in cloud and on-premises platforms such as Informatica/PC/IICS/Alteryx/Talend/Azure Data Factory (ADF)/SSIS/SSAS/SSRS and experience on any reporting tool like Power BI, Tableau, OBIEE, etc. Performing Data Analysis and Data Manipulation as per client requirements. Expert in Data Modelling to simplify business concepts. Create extensive ER Diagrams to help business in decision-making. Working experience with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures, and integrated datasets using data integration technologies. Should be able to develop sophisticated workflows & macros (Batch, Iterative, etc.) in Alteryx with enterprise data. Design and develop ETL workflows and datasets in Alteryx to be used by the BI Reporting tool. Perform end-to-end Data validation to maintain the accuracy of data sets. Support client needs by developing SSIS Packages in Visual Studio (version 2012 or higher) or Azure Data Factory (Extensive hands-on experience implementing data migration and data processing using Azure Data Factory). Support client needs by delivering Various Integrations with third-party applications. Experience in pulling data from a variety of data source types using appropriate connection managers as per Client needs. Develop, Customize, Deploy, maintain SSIS packages as per client business requirements. Should have thorough knowledge in creating dynamic packages in Visual Studio with multiple concepts such as - reading multiple files, Error handling, Archiving, Configuration creation, Package Deployment, etc. Experience working with clients throughout various parts of the implementation lifecycle. Proactive with a Solution-oriented mindset, ready to learn new technologies for Client requirements. Analyzing and translating business needs into long-term solution data models. Evaluating existing Data Warehouses or Systems. Strong knowledge of database structure systems and data mining. Skills and attributes for success Deliver large/medium DWH programs, demonstrate expert core consulting skills and an advanced level of Informatica, SQL, PL/SQL, Alteryx, ADF, SSIS, Snowflake, Databricks knowledge, and industry expertise to support delivery to clients. 
- Demonstrate management skills and the ability to lead projects or teams independently.
- Experience in team management, communication, and presentation.
To qualify for the role, you must have
- 5+ years of ETL experience as Lead/Architect.
- Expertise in ETL mappings and data warehouse concepts. Should be able to design a data warehouse and present solutions as per client needs.
- Thorough knowledge of Structured Query Language (SQL) and experience working on SQL Server.
- Experience in SQL tuning and optimization using explain plans and SQL trace files.
- Experience in developing, deploying, and scheduling SSIS batch jobs.
- Building Alteryx workflows for data integration, modeling, optimization, and data quality.
- Knowledge of Azure components like ADF, Azure Data Lake, and Azure SQL DB.
- Knowledge of data modeling and ETL design.
- Design and develop complex mappings, process flows, and ETL scripts.
- In-depth experience in database design and data modeling.
Ideally, you'll also have
- Strong knowledge of ELT/ETL concepts, design, and coding.
- Expertise in data handling to resolve any data issues as per client needs.
- Experience in designing and developing DB objects such as tables, views, indexes, materialized views, and analytical functions.
- Experience creating complex SQL queries for retrieving, manipulating, checking, and migrating complex datasets.
- Good knowledge of ETL technologies/tools such as Alteryx, SSAS, SSRS, Azure Analysis Services, and Azure Power Apps.
- Good verbal and written communication in English, and strong interpersonal, analytical, and problem-solving abilities.
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
- Additional knowledge of BI tools such as Power BI, Tableau, etc. is preferred.
- Experience with cloud databases and multiple ETL tools.
What we look for
The incumbent should be able to drive ETL infrastructure-related developments. Additional knowledge of complex source-system data structures, preferably in the Financial Services industry, and of reporting-related developments will be an advantage.
An opportunity to be part of a market-leading, multidisciplinary team of 10,000+ professionals, in the only integrated global transaction business worldwide.
Opportunities to work with EY GDS consulting practices globally with leading businesses across a range of industries.
What working at EY offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As a Software Engineer II at PAR, you will utilize your expertise in GoLang to develop enterprise-grade systems that are scalable and maintainable. With 3+ years of experience, you will leverage the unique paradigms, idioms, and syntax of GoLang to create well-documented programs with reasonable test coverage. Your role will involve collaborating with the team to ensure the infrastructure functions seamlessly.
Key Responsibilities:
- Develop enterprise-grade systems using GoLang
- Design scalable and maintainable programs
- Coordinate with team members across different layers of the infrastructure
- Ensure thorough documentation and reasonable test coverage
- Solve complex problems through collaborative problem-solving and sophisticated design
Requirements:
- Proficiency in GoLang
- Experience working on enterprise-grade systems
- Experience designing web services
- Full-stack development skills with frontend JavaScript frameworks like Vue.js and React.js
- Ability to scale systems with database bottlenecks
- Knowledge of microservices architecture
- Familiarity with OAuth, JWT, SSO, authentication, and identity federation
- Experience with AWS, Docker, Kubernetes, pods, and meshes
- Proficiency in MySQL, Snowflake, and MongoDB
Why Join Us:
- Contribute to writing scalable, robust, and testable code
- Translate software requirements into high-performance software
- Play a key role in architectural and design decisions for an efficient microservices distributed architecture
If you are passionate about creating innovative solutions that connect people to the restaurants, meals, and moments they love, we welcome you to join our team at PAR as a Software Engineer II.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You should have a minimum of 6 years of experience in the technical field and possess the following skills: Python, Spark SQL, PySpark, Apache Airflow, DBT, Snowflake, CI/CD, Git, GitHub, and AWS. Your role will involve understanding the existing code base built on AWS services and SQL, and converting it to a tech stack primarily using Airflow, Iceberg, Python, and SQL. Your responsibilities will include designing and building data models to support business requirements, developing and maintaining data ingestion and processing systems, implementing data storage solutions, ensuring data consistency and accuracy through validation and cleansing techniques, and collaborating with cross-functional teams to address data-related issues. Proficiency in Python, experience with big data processing using Spark, orchestration experience with Airflow, and AWS knowledge are essential for this role. You should also have experience with security and governance practices such as role-based access control (RBAC) and data lineage tools, as well as knowledge of database management systems like MySQL. Strong problem-solving and analytical skills, along with excellent communication and collaboration abilities, are key attributes for this position.
At NucleusTeq, we foster a positive and supportive culture that encourages our associates to perform at their best every day. We value and celebrate individual uniqueness, offering flexibility for making daily choices that contribute to overall well-being. Our well-being programs and continuous efforts to enhance our culture aim to create an environment where our people can thrive, lead healthy lives, and excel in their roles.
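For context only, the sketch below illustrates the kind of validation and cleansing step this role describes, written in PySpark; the bucket paths, column names, and rules are hypothetical and not part of the original listing.

```python
# Rough sketch of a PySpark validation/cleansing step of the kind described above.
# Paths and column names are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_cleansing").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/orders/")   # hypothetical path

cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_amount") > 0)                     # basic validation rule
       .withColumn("order_date", F.to_date("order_ts"))
)

# Quarantine rows that fail validation instead of silently dropping them.
rejected = raw.filter(F.col("order_amount") <= 0)
rejected.write.mode("append").parquet("s3://example-bucket/quarantine/orders/")

cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```

In practice such a step would typically be wrapped in an Airflow task and write to an Iceberg table rather than plain Parquet, but the validation-and-quarantine pattern stays the same.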
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You are familiar with AWS and Azure Cloud. You have extensive knowledge of Snowflake, with SnowPro Core certification being a must-have. In at least one project, you have utilized DBT to deploy models in production. Furthermore, you have experience in configuring and deploying Airflow and integrating various operators in Airflow, especially the DBT and Snowflake operators. Your capabilities also include designing build and release pipelines, and a solid understanding of the Azure DevOps ecosystem. Proficiency in Python, particularly PySpark, allows you to write metadata-driven programs. You are well-versed in Data Vault (Raw, Business) and concepts such as Point In Time and Semantic Layer. In ambiguous situations, you demonstrate resilience and possess the ability to clearly articulate problems in a business-friendly manner. Documenting processes, managing artifacts, and evolving them over time are practices you believe in and adhere to diligently. Required Skills: Data Vault, DBT, Python, PySpark, Snowflake, Airflow, AWS, Azure, Azure DevOps.
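As a hedged illustration of the Airflow-with-DBT-and-Snowflake setup this listing refers to, the minimal DAG below chains a dbt run with a Snowflake post-step; the connection id, dbt project path, and SQL are assumptions, not details from the listing.

```python
# Illustrative Airflow DAG chaining a dbt build with a Snowflake post-step.
# The connection id, dbt project path, and SQL below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="dbt_snowflake_refresh",            # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",
    catchup=False,
) as dag:
    # Run the dbt project; assumes dbt and its profiles are available on the worker.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics_project && dbt run --target prod",
    )

    # Simple post-load housekeeping in Snowflake via the provider operator.
    refresh_grants = SnowflakeOperator(
        task_id="refresh_grants",
        snowflake_conn_id="snowflake_default",
        sql="GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE REPORTING;",
    )

    dbt_run >> refresh_grants
```

Depending on the setup, the dbt step could equally be driven by a dedicated dbt operator or by dbt Cloud's API; the BashOperator is simply the most portable assumption.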
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You will play a crucial role in enhancing the Analytics capabilities for our businesses. Your responsibilities will include engaging with key stakeholders to comprehend Fidelity's sales, marketing, client services, and propositions context. You will collaborate with internal teams such as the data support team and technology team to develop new tools, capabilities, and solutions. Additionally, you will work closely with IS Operations to expedite the development and sharing of customized data sets. Maximizing the adoption of cloud-based data management services will be a significant part of your role. This involves setting up sandbox analytics environments using platforms like Snowflake, AWS, Adobe, and Salesforce. You will also support data visualization and data science applications to enhance business operations. In terms of stakeholder management, you will work with key stakeholders to understand business problems and translate them into suitable analytics solutions. You are expected to facilitate smooth execution, delivery, and implementation of these solutions through effective engagement with stakeholders. Your role will also involve collaborating with the team to share knowledge and best practices, including coaching on deep learning and machine learning methodologies. Taking independent ownership of projects and initiatives within the team is crucial, demonstrating leadership and accountability. Furthermore, you will be responsible for developing and evaluating tools, methodologies, or infrastructure to address long-term business challenges. This may involve enhancing modelling software, methodologies, data requirements, and optimization environments to elevate the team's capabilities. To excel in this role, you should possess 5 to 8 years of overall experience in Analytics, with at least 4 years of experience in SQL, Python, open-source machine learning libraries, and deep learning. Experience working in an AWS environment, preferably using Snowflake, is preferred. Proficiency in analytics applications such as Python, SAS, SQL, and interpreting statistical results is necessary. Knowledge of Spark, Hadoop, and big data platforms will be advantageous.
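Purely as an illustrative sketch of the workflow this listing describes (a Snowflake sandbox queried from Python, feeding open-source ML libraries), the snippet below pulls a feature table into pandas and fits a simple scikit-learn model; the account, credentials, table, and columns are invented.

```python
# Illustrative sketch only: querying a Snowflake sandbox into pandas and fitting
# a simple scikit-learn model. Account, credentials, and table are placeholders.
import snowflake.connector
from sklearn.linear_model import LogisticRegression

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",      # hypothetical account
    user="ANALYTICS_USER",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SANDBOX",
    schema="CRM",
)

cur = conn.cursor()
cur.execute("SELECT tenure_months, num_products, is_churned FROM client_features")
df = cur.fetch_pandas_all()           # requires the pandas extra of the connector
cur.close()
conn.close()

# Snowflake returns unquoted identifiers in upper case.
X = df[["TENURE_MONTHS", "NUM_PRODUCTS"]]
y = df["IS_CHURNED"]

model = LogisticRegression(max_iter=1000).fit(X, y)
print("Training accuracy:", model.score(X, y))
```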
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
tirupati, andhra pradesh
On-site
You are an experienced Snowflake Data Engineer with expertise in Python and SQL, holding a Snowflake certification and having at least 4 years of hands-on experience with Snowflake. Your primary responsibility will be to design, develop, and maintain robust data pipelines in a cloud environment, ensuring efficient data integration, transformation, and storage within the Snowflake data platform. Your key responsibilities will include designing and developing data pipelines to handle large volumes of structured and unstructured data using Snowflake and SQL. You will also be responsible for developing and maintaining efficient ETL/ELT processes to integrate data from various sources into Snowflake, ensuring data quality and availability. Additionally, you will write Python scripts to automate data workflows, implement data transformation logic, and integrate with external APIs for data ingestion. You will create and optimize complex SQL queries for data extraction, transformation, and reporting purposes. Moreover, you will develop and maintain data models to support business intelligence and analytics, leveraging Snowflake best practices. Ensuring proper data governance, security, and compliance within the Snowflake environment will also be one of your responsibilities by implementing access controls, encryption, and monitoring. Collaboration is key, as you will work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable solutions. As a qualified candidate, you must hold an active Snowflake certification (SnowPro or equivalent) and have 4+ years of hands-on experience with Snowflake. You should possess strong experience with Python for data processing, automation, and API integration. Expertise in writing and optimizing complex SQL queries and experience with data warehousing and database management are essential. Hands-on experience designing and implementing ETL/ELT pipelines using Snowflake is also required. Familiarity with cloud environments such as AWS, GCP, or Azure, especially in relation to data storage and processing, is necessary. Experience with implementing data governance frameworks and security protocols in a cloud data platform is also a prerequisite. Preferred skills include experience with CI/CD pipelines for data projects, knowledge of Apache Airflow or other orchestration tools, and familiarity with big data technologies and distributed systems. Your educational background should include a Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Additionally, strong problem-solving and analytical skills, excellent communication skills to interact with both technical and non-technical stakeholders, and the ability to work in a fast-paced, agile environment are essential soft skills for this role.
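A minimal sketch of the API-to-Snowflake ingestion work described above is shown below; the endpoint, credentials, and table name are placeholders, and it assumes snowflake-connector-python installed with its pandas extras.

```python
# Minimal, illustrative ingestion script of the kind described above:
# pull JSON from an external API and land it in a Snowflake table.
# The API URL, credentials, and table name are placeholders.
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

API_URL = "https://api.example.com/v1/shipments"    # hypothetical endpoint

response = requests.get(API_URL, timeout=30)
response.raise_for_status()
df = pd.json_normalize(response.json()["items"])     # flatten nested JSON

# Light transformation before loading.
df.columns = [c.upper().replace(".", "_") for c in df.columns]
df["LOAD_TS"] = pd.Timestamp.now(tz="UTC")

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",                    # hypothetical account
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="LOGISTICS",
)
try:
    # write_pandas bulk-loads the DataFrame via an internal stage; assumes the
    # target table exists (or pass auto_create_table=True on newer connectors).
    success, _, nrows, _ = write_pandas(conn, df, table_name="SHIPMENTS_RAW")
    print(f"Loaded {nrows} rows, success={success}")
finally:
    conn.close()
```

In a production pipeline this script would usually be parameterized and scheduled from an orchestrator, with secrets pulled from a vault rather than hard-coded.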
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
About Mindstix Software Labs:
Mindstix accelerates digital transformation for the world's leading brands. We are a team of passionate innovators specialized in Cloud Engineering, DevOps, Data Science, and Digital Experiences. Our UX studio and modern-stack engineers deliver world-class products for our global customers, including Fortune 500 enterprises and Silicon Valley startups. Our work impacts a diverse set of industries - eCommerce, Luxury Retail, ISV and SaaS, Consumer Tech, and Hospitality. A fast-moving open culture powered by curiosity and craftsmanship. A team committed to bold thinking and innovation at the very intersection of business, technology, and design. That's our DNA.
Roles and Responsibilities:
Mindstix is looking for a proficient Data Engineer. You are a collaborative person who takes pleasure in finding solutions to issues that add to the bottom line. You appreciate hands-on technical work and feel a sense of ownership. You require a keen eye for detail, work experience as a data analyst, and in-depth knowledge of widely used databases and technologies for data analysis. Your responsibilities include:
- Building outstanding domain-focused data solutions with internal teams, business analysts, and stakeholders.
- Applying data engineering practices and standards to develop robust and maintainable solutions.
- Being motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases.
- Being a natural problem-solver and intellectually curious across a breadth of industries and topics.
- Being acquainted with different aspects of Data Management like Data Strategy, Architecture, Governance, Data Quality, Integrity & Data Integration.
- Being extremely well-versed in designing incremental and full data load techniques (illustrated by the sketch at the end of this listing).
Qualifications and Skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or allied streams.
- 2+ years of hands-on experience in the data engineering domain with DWH development.
- Must have experience with end-to-end data warehouse implementation on Azure or GCP.
- Must have SQL and PL/SQL skills, implementing complex queries and stored procedures.
- Solid understanding of DWH concepts such as OLAP, ETL/ELT, RBAC, Data Modelling, Data Driven Pipelines, Virtual Warehousing, and MPP.
- Expertise in Databricks - Structured Streaming, Lakehouse Architecture, DLT, Data Modeling, Vacuum, Time Travel, Security, Monitoring, Dashboards, DBSQL, and Unit Testing.
- Expertise in Snowflake - Monitoring, RBACs, Virtual Warehousing, Query Performance Tuning, and Time Travel.
- Understanding of Apache Spark, Airflow, Hudi, Iceberg, Nessie, NiFi, Luigi, and Arrow (good to have).
- Strong foundations in computer science, data structures, algorithms, and programming logic.
- Excellent logical reasoning and data interpretation capability.
- Ability to interpret business requirements accurately.
- Exposure to working with multicultural international customers.
- Experience in the Retail/Supply Chain/CPG/eComm/Health industry is a plus.
Who Fits Best?
- You are a data enthusiast and problem solver.
- You are a self-motivated and fast learner with a strong sense of ownership and drive.
- You enjoy working in a fast-paced creative environment.
- You appreciate great design, have a strong sense of aesthetics, and have a keen eye for detail.
- You thrive in a customer-centric environment with the ability to actively listen, empathize, and collaborate with globally distributed teams.
- You are a team player who desires to mentor and inspire others to do their best.
- You love expressing ideas and articulating well with strong written and verbal English communication and presentation skills.
- You are detail-oriented with an appreciation for craftsmanship.
Benefits:
- Flexible working environment.
- Competitive compensation and perks.
- Health insurance coverage.
- Accelerated career paths.
- Rewards and recognition.
- Sponsored certifications.
- Global customers.
- Mentorship by industry leaders.
Location: This position is primarily based at our Pune (India) headquarters, requiring all potential hires to work from this location. A modern workplace is deeply collaborative by nature, while also demanding a touch of flexibility. We embrace deep collaboration at our offices with reasonable flexi-timing and hybrid options for our seasoned team members.
Equal Opportunity Employer.
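For illustration, here is a minimal sketch of the incremental-load pattern referenced in the responsibilities above: merging a staged delta into a target table from Python. Snowflake is assumed, and the database, schema, and column names are invented.

```python
# Hedged sketch of an incremental load: merge a staged daily delta into a target
# table instead of reloading it in full. All object names are placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO CURATED.SALES.ORDERS AS tgt
USING RAW.SALES.ORDERS_STG AS src
  ON tgt.ORDER_ID = src.ORDER_ID
WHEN MATCHED AND src.UPDATED_AT > tgt.UPDATED_AT THEN UPDATE SET
  STATUS = src.STATUS,
  AMOUNT = src.AMOUNT,
  UPDATED_AT = src.UPDATED_AT
WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, AMOUNT, UPDATED_AT)
  VALUES (src.ORDER_ID, src.STATUS, src.AMOUNT, src.UPDATED_AT);
"""

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",   # hypothetical account
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    role="ETL_ROLE",
)
try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)         # incremental upsert instead of a full reload
    print("Rows affected:", cur.rowcount)
finally:
    conn.close()
```

A full load would instead truncate and re-insert the target; the MERGE keeps the load incremental and idempotent if the same delta is replayed.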
Posted 1 week ago