2.0 - 5.0 years
1 - 5 Lacs
Mumbai, Hyderabad, Bengaluru
Hybrid
Role 1: Snowflake Developer (Coding, Documentation)
Locations: Multiple locations (Bengaluru, Hyderabad, Chennai, Kolkata, Mumbai, Pune, Gurugram)
Work Mode: Hybrid
Interview Mode: Virtual (2 rounds)
Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM and Tuning Advisor and in analyzing AWR reports.
- Strong SQL skills, with the ability to guide application teams on improvements.
- Efficient management of compute and storage in the Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic data masking, and RBAC.
- Utilize Time Travel, cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.
Skills: Snowflake database design, coding, documentation, Apps DBA, Oracle, performance tuning, OEM, Tuning Advisor, AWR report analysis, SQL, compute management, storage management, data governance, column-level security, dynamic masking, RBAC, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, security policies, secure data sharing
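As a sketch of the governance features this role names (column-level security via dynamic masking, enforced through RBAC), the snippet below composes the relevant Snowflake DDL as plain text. It is a Python illustration rather than a live Snowflake session, and every object name (mask_email, pii_reader, customers, analyst_role) is hypothetical.

```python
# Sketch: composing Snowflake column-level security DDL as plain SQL text.
# All object names are invented; a real deployment would execute these
# statements through a Snowflake session, not print them.

def masking_policy_ddl(policy: str, unmask_role: str) -> str:
    """CREATE MASKING POLICY that reveals a STRING column only to one role."""
    return (
        f"CREATE OR REPLACE MASKING POLICY {policy} AS (val STRING) "
        f"RETURNS STRING -> CASE WHEN CURRENT_ROLE() = '{unmask_role.upper()}' "
        f"THEN val ELSE '*** MASKED ***' END"
    )

def apply_policy_ddl(table: str, column: str, policy: str) -> str:
    """ALTER TABLE to bind the policy to a column (dynamic data masking)."""
    return f"ALTER TABLE {table} MODIFY COLUMN {column} SET MASKING POLICY {policy}"

statements = [
    masking_policy_ddl("mask_email", "pii_reader"),
    apply_policy_ddl("customers", "email", "mask_email"),
    # RBAC: analysts read the table but see masked values; only the
    # pii_reader role sees cleartext via the policy above.
    "GRANT SELECT ON TABLE customers TO ROLE analyst_role",
]
```

Generating DDL as text keeps the governance rules reviewable and versionable before they are applied to an account.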
Posted 4 weeks ago
2.0 - 3.0 years
8 - 12 Lacs
Gurugram, Delhi / NCR
Work from Office
Job Title: Power Apps Developer (3-5 Years Experience)
Location: Gurgaon
Job Type: Full-Time
Job Summary: We are seeking a skilled and proactive Power Apps Developer with 3 to 5 years of experience in building robust, scalable applications on the Microsoft Power Platform. The ideal candidate will bring strong expertise in Power Apps (Canvas and Model-Driven), with a functional orientation towards web and app development. Familiarity with Azure services and Snowflake is essential, as you will be working on data-rich applications integrated into enterprise cloud environments. You will be responsible for designing, developing, testing, deploying, and maintaining Power Apps solutions to meet business requirements. You will collaborate with stakeholders, develop complex data models, integrate them with Power Apps, and ensure applications are performant, scalable, and secure. This includes utilizing Power Platform tools such as Power Apps, Power Automate, and Power BI.
Key Responsibilities:
- Design, develop, and maintain scalable apps using Microsoft Power Apps (Canvas & Model-Driven).
- Collaborate with stakeholders to gather and analyze functional requirements and translate them into user-friendly business applications.
- Integrate Power Apps solutions with Azure services, Snowflake, SharePoint, Dataverse, and other cloud data sources.
- Build and manage Power Automate flows to automate business processes.
- Implement responsive and intuitive UI/UX for mobile and web platforms.
- Participate in system architecture planning and contribute to best practices in application design and lifecycle management.
- Troubleshoot, debug, and resolve technical issues across environments.
- Maintain documentation, deployment scripts, and operational support materials.
- Work closely with cross-functional teams including analysts, cloud engineers, and data teams.
Required Qualifications:
- 2-3 years of hands-on experience in Power Apps development.
- Strong knowledge of Power Platform components including Power Apps, Power Automate, and Dataverse.
- Proficiency in JavaScript (including frameworks such as React.js for front-end development), HTML, and CSS for custom front-end functionality.
- Experience integrating with Azure services such as Azure Functions, Logic Apps, and Key Vault.
- Expertise in ASP.NET front ends, SAP, and SQL databases.
- Hands-on experience with Snowflake and its integration into apps or workflows.
- Strong understanding of REST APIs, connectors, and integration patterns.
- Excellent problem-solving skills and a user-centric mindset.
- Ability to communicate technical concepts effectively to non-technical stakeholders.
- RPA: understanding of Robotic Process Automation.
- PCF: experience with the Power Apps Component Framework.
- Connectors: experience with both out-of-the-box (OOB) and premium connectors in Power Apps.
Preferred Qualifications:
- Microsoft Power Platform certifications (e.g., PL-100, PL-400).
- Familiarity with Agile methodologies and tools like Azure DevOps or Jira.
- Knowledge of data governance, security roles, and access control within Power Platform.
- Experience with CI/CD pipelines for Power Platform.
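The REST/connector integration pattern this role calls for can be sketched generically: build an authenticated request against a Dataverse-style OData endpoint. This Python snippet uses only the standard library and does not send anything; the environment URL, API version, and token are illustrative assumptions (the token would normally come from Azure Key Vault).

```python
# Sketch of a connector-style REST call, of the kind a Power Apps custom
# connector or Azure Function might make against Dataverse. Endpoint,
# entity name, and token are hypothetical; no network call is made here.
import urllib.request

def build_request(base_url: str, entity: str, token: str) -> urllib.request.Request:
    """Prepare an authenticated GET against a Dataverse-style REST endpoint."""
    url = f"{base_url}/api/data/v9.2/{entity}"
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",  # bearer token from Key Vault in practice
            "Accept": "application/json",
            "OData-MaxVersion": "4.0",           # Dataverse speaks OData v4
        },
    )

req = build_request("https://contoso.crm.dynamics.com", "accounts", "fake-token")
```

Separating request construction from execution makes the integration easy to unit-test before wiring it into a flow or connector.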
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
- Create solution outlines and macro designs describing end-to-end product implementation in data platforms, covering system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles.
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation.
- Contribute to reusable components / assets / accelerators to support capability development.
- Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies.
- Participate in customer PoCs to deliver the outcomes.
- Participate in delivery reviews / product reviews and quality assurance, and act as design authority.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems.
- Experience in data engineering and in architecting and implementing data platforms.
- Azure Cloud Platform experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow.
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks.
Preferred technical and professional experience:
- Experience architecting complex data platforms on Azure Cloud Platform and on-prem.
- Experience and exposure to implementations of Data Fabric and Data Mesh concepts, and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric.
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
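The ingestion / processing / serving layering this architect role describes can be miniaturised into pure functions to show the data flow. This is a toy Python sketch under invented rules, not IBM's implementation; in a real Azure build each stage would be Data Factory, Databricks, and Synapse respectively.

```python
# Toy sketch of a layered data platform: ingestion -> processing -> serving.
# Stage rules and field names are illustrative only.

def ingest(raw_rows):
    """Ingestion layer: land records as-is, tagging their lineage."""
    return [dict(row, _source="landing") for row in raw_rows]

def process(landed):
    """Processing layer: standardise types and filter bad rows."""
    return [
        {"id": r["id"], "value": float(r["value"])}
        for r in landed
        if r.get("id") is not None
    ]

def serve(curated):
    """Serving layer: aggregate for downstream consumers."""
    return {"row_count": len(curated), "total": sum(r["value"] for r in curated)}

gold = serve(process(ingest([
    {"id": 1, "value": "2.5"},
    {"id": None, "value": "9"},   # rejected by the processing layer
])))
```

Keeping each layer a pure function mirrors the separation of concerns the solution outline would formalise across services.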
Posted 4 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience building data ingestion and transformation pipelines in Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.
Preferred technical and professional experience:
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
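One of the preferred skills above, data validation and cleansing inside an ETL process, reduces to a small, testable step. Here is a hedged Python sketch with invented field names and rules; a Talend job would express the same logic graphically before loading Snowflake.

```python
# Minimal sketch of an ETL validation/cleansing step. Field names and
# rules (required customer_id, trimmed strings, numeric amounts) are
# illustrative, not from any real schema.

def cleanse(records):
    """Split rows into clean and rejected; trim ids, coerce amounts to float."""
    clean, rejected = [], []
    for row in records:
        if not row.get("customer_id"):
            rejected.append(row)  # governance: quarantine bad rows, never silently drop
            continue
        clean.append({
            "customer_id": str(row["customer_id"]).strip(),
            "amount": float(row.get("amount") or 0.0),
        })
    return clean, rejected

good, bad = cleanse([
    {"customer_id": " C001 ", "amount": "19.99"},
    {"customer_id": None, "amount": "5"},
])
```

Returning the rejected rows, rather than discarding them, is what lets a governance framework audit and remediate the source data.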
Posted 4 weeks ago
12.0 - 17.0 years
6 - 10 Lacs
Bengaluru
Work from Office
The role of Sr. Analytics Consultant / Lead exists within the Analytics offshore and onshore teams to develop innovative analytics solutions using data visualization / BI solutions, Gen AI / ML, data analytics, and data science that will generate tangible value for business stakeholders and customers. The role focuses on using sophisticated data analysis and modelling techniques, alongside cognitive computing and artificial intelligence, combined with business acumen, to discover insights that guide business decisions and uncover business opportunities.
Key Accountabilities:
- Manage large Advanced Analytics teams, owning client delivery and team growth accountabilities with a consulting mindset: consulting, transformation, and building proposals and competency.
- Interpret big data to identify and analyse emerging trends and produce business insights that monitor portfolio performance and continuously drive improvement in business results.
- Develop advanced business performance and data analytics tools to assist Senior Specialists, Portfolio Managers, business stakeholders (including but not limited to Portfolio, Customer & Marketing functions), and wider Commercial team members to make, implement, and monitor data-based recommendations accurately.
- Develop statistical models to predict business performance and customer behaviour. Research customer behaviours and attitudes, leading to in-depth knowledge and understanding of differences in customer-level profitability.
- Promote innovation by improving current processes and developing new analytical methods and factors.
- Identify, investigate, and introduce additional rating factors with the objective of improving product risk and location selection to maximise profit.
- Provide consulting advice and solutions to solve business clients' hardest pain points and realise the biggest business opportunities through advanced use of data and algorithms; able to work on projects across functions based on the needs of the business.
- Actively add value by lifting the business's process-automation capability.
- Build, enhance, and maintain quality relationships with all internal and external customers. Adopt a customer focus in the delivery of internal/external services. Build positive stakeholder relationships to foster a productive environment and open communication channels.
- Bring new data science thinking to the group by staying on top of the latest developments, industry meet-ups, etc.
Experience and Qualifications:
- Typically 12+ years of relevant experience in a data science or advanced analytical consulting field, with at least 10 years of leadership experience.
- Experience within the insurance services industry is essential. Confident in leading large-scale data projects and working in product teams.
- Highly experienced in solving business problems with data-led tech solutions, and in managing diverse cross-functional teams with a strong commercial mindset.
- Expert-level knowledge of Gen AI/ML, Python, BI and visualization, transformation, and business consulting, including building technical proposals. Expert-level knowledge of statistical concepts.
- Superior results in a Bachelor's degree in a highly technical subject area (statistics, actuarial science, engineering, maths, programming, etc.). Postgraduate degree in a relevant statistical or data science related area (or equivalent, demonstrable online study).
Key Capabilities / Technical Competencies:
Mandatory:
- Proven ability to engage a team to achieve individual, team, and divisional goals.
- Lead and manage large-scale teams from a people, project management, and client management perspective.
- Solid programming experience in Tableau, R, SQL, and Python (AI/ML).
- Experience with AWS or another cloud service; familiarity with data lake platforms (e.g., Snowflake and Databricks).
- Demonstrated understanding of advanced machine learning algorithms, including some exposure to NLP and image processing.
- Planning and organization at an advanced level; delegation, project management, delivery, and productionizing of analytical services at an advanced level.
- High degree of specialist expertise within one or more data science capabilities, e.g. unstructured data analysis, cognitive/AI solutions (including use of API platforms such as IBM Watson, MS Azure, AWS, etc.), Hadoop/Spark-based ecosystems.
- Familiarity with Gen AI concepts.
Highly Valued:
- Good understanding of insurance products, the industry, the market environment, customer segments, and key business drivers.
- Strong knowledge of finance and budgeting/forecasting of key business drivers, with the ability to interpret and analyse reports relevant to the area of responsibility.
Additional:
- Creativity and Innovation: a curious mind that does not accept the status quo. Design thinking experience highly valued.
- Communication Skills: superior communication skills to co-design solutions with customers; emotional intelligence and the ability to communicate complex ideas to a range of internal stakeholders. Consulting skills highly valued.
- Business Focus: advanced analytical skills practiced with a significant business focus to ensure practical solutions that deliver tangible value.
- Strategic Focus: critically evaluate both company and key business customers' strategy, and keep abreast of best-practice advanced analytics strategies.
- Change management capability: ability to recognise, understand, and support the need for change and its anticipated impact on both the team and self. Adaptable and responsive to a continuously changing environment.
- Customer service: proven commitment to delivering a quality, differentiated experience.
- Time management: prioritisation of work without supervision.
- Project management: ability to plan, organize, implement, monitor, and control projects, ensuring efficient utilisation of technical and business resources, to achieve project objectives.
- Partnering: ability to deliver solutions utilising both onshore and offshore resources.
Posted 4 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Work from Office
At F5, we strive to bring a better digital world to life. Our teams empower organizations across the globe to create, secure, and run applications that enhance how we experience our evolving digital world. We are passionate about cybersecurity, from protecting consumers from fraud to enabling companies to focus on innovation.
Job Summary: We are seeking a skilled and driven Data Analyst, Business Intelligence, to join our global Services organization, supporting Customer Success and Renewals. This role is essential to enabling data-driven decision-making across a worldwide team by transforming complex, multi-source datasets into strategic insights. The ideal candidate will bring 5+ years of experience in data analysis, reporting, and business intelligence, with a demonstrated ability to work with large, complex datasets from diverse repositories. This individual will proactively identify data gaps, propose and implement solutions, and synthesize improved data with industry knowledge to deliver high-impact recommendations to business leaders. Success in this role means accelerating decision-making, improving operational efficiency, and uncovering opportunities that drive customer satisfaction, revenue retention, and long-term growth.
Key Responsibilities:
- Analyze global Services Renewals data to uncover trends, forecast performance, and support revenue optimization strategies.
- Design, build, and maintain dashboards and reports that surface key performance indicators (KPIs) related to renewals, churn, upsell, and customer retention.
- Collaborate cross-functionally with Renewals, Sales, Customer Success, and Finance teams to deliver insights that improve forecasting accuracy and operational execution.
- Manage an intake queue for ad hoc and strategic data requests, partnering with business leaders to clarify needs, propose analytical approaches, and drive solutions through to delivery.
- Support weekly and quarterly business reviews by delivering timely, accurate reporting and insight packages that inform executive decision-making.
- Work with large, complex datasets from multiple systems, ensuring data integrity, consistency, and usability across platforms.
- Proactively identify data gaps and quality issues, propose solutions, and lead remediation efforts to enhance analytical accuracy and business impact.
- Continuously explore data to uncover new opportunities, develop hypotheses, and recommend strategies that improve customer retention and revenue performance.
- Leverage BI tools (e.g., Power BI, Tableau, Looker) and SQL to automate reporting, streamline workflows, and scale analytics capabilities.
- Contribute to the development and refinement of predictive models that assess customer renewal behavior and risk indicators.
- Identify opportunities to apply artificial intelligence (AI) and machine learning tools to enhance forecasting, automate insights, and optimize customer success strategies.
- Stay current on emerging AI technologies and proactively recommend innovative solutions that improve analytical efficiency, insight generation, and strategic decision-making.
Skills / Knowledge / Abilities:
- Advanced proficiency in SQL and data visualization tools such as Power BI, Tableau, and Looker, with the ability to build scalable, user-friendly dashboards.
- Proven experience extracting, transforming, and analyzing large, complex datasets from multiple systems, ensuring data quality and consistency.
- Strong analytical thinking and problem-solving skills, with a proactive mindset for uncovering insights and driving business outcomes.
- Demonstrated ability to build and apply predictive models to assess customer behavior, renewal likelihood, and churn risk, using statistical or machine learning techniques.
- Ability to translate data into strategic recommendations, combining analytical rigor with business acumen and industry context.
- Experience supporting Customer Success, Renewals, or subscription-based business models; familiarity with churn, retention, and upsell analytics is highly preferred.
- Effective communicator with the ability to present insights clearly to both technical and non-technical stakeholders, including senior leadership.
- Skilled in managing multiple priorities in a fast-paced, cross-functional environment, with a strong sense of ownership and accountability.
- Familiarity with CRM and ERP systems such as Salesforce, Oracle, or SAP.
- Working knowledge of data warehousing and cloud platforms (e.g., Snowflake, BigQuery, Azure).
- Ability to identify and apply AI and machine learning tools to enhance forecasting, automate insights, and improve strategic decision-making.
Qualifications:
- Bachelor's degree in Business, Data Analytics, Statistics, Computer Science, or a related field.
- 5+ years of relevant experience in data analytics, preferably in a services, subscription, or renewals-focused environment.
This description is intended to be a general representation of the responsibilities and requirements of the job; it may not be all-inclusive, and responsibilities and requirements are subject to change. Please note that F5 only contacts candidates through an F5 email address (ending with @f5.com) or auto email notifications from Workday (ending with f5.com or @myworkday.com).
Equal Employment Opportunity: It is the policy of F5 to provide equal employment opportunities to all employees and employment applicants without regard to unlawful considerations of race, religion, color, national origin, sex, sexual orientation, gender identity or expression, age, sensory, physical, or mental disability, marital status, veteran or military status, genetic information, or any other classification protected by applicable local, state, or federal laws.
This policy applies to all aspects of employment, including, but not limited to, hiring, job assignment, compensation, promotion, benefits, training, discipline, and termination. F5 offers a variety of reasonable accommodations for candidates. Requesting an accommodation is completely voluntary. F5 will assess the need for accommodations in the application process separately from those that may be needed to perform the job. Request an accommodation by contacting accommodations@f5.com.
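The churn and retention analytics the F5 listing above centers on come down to a small calculation. As a hedged sketch, here is gross revenue retention and its complement, churn, computed in Python over invented sample renewal outcomes.

```python
# Back-of-the-envelope renewals analytics: gross revenue retention (GRR)
# and churn from a list of (contract_value, renewed) outcomes. The sample
# data is invented for illustration.

def retention_metrics(renewals):
    """renewals: list of (contract_value, renewed: bool) tuples."""
    total = sum(value for value, _ in renewals)
    kept = sum(value for value, renewed in renewals if renewed)
    grr = kept / total        # gross revenue retention: renewed value / total up for renewal
    churn = 1.0 - grr         # revenue churn is the complement of GRR
    return grr, churn

# 200 in contract value up for renewal; 150 renews, 50 churns.
grr, churn = retention_metrics([(100.0, True), (50.0, True), (50.0, False)])
```

In practice the same metric would be computed in SQL over the renewals warehouse and surfaced in a Power BI or Tableau dashboard.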
Posted 4 weeks ago
6.0 - 8.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Senior Data Scientist, Enterprise Analytics
Want to be part of the Data & Analytics organization, whose strategic goal is to create a world-class Data & Analytics company by building, embedding, and maturing a data-driven culture across Thomson Reuters? We are looking for a highly motivated individual with strong organizational and technical skills for the position of Senior Data Scientist. You will play a critical role working on the cutting edge of analytics, leveraging predictive models, machine learning, and generative AI to drive business insights, facilitate informed decision-making, and help Thomson Reuters rapidly scale data-driven initiatives.
About the Role
In this opportunity as Senior Data Scientist, you will:
- Engage with stakeholders, business analysts, and the project team to understand the data requirements.
- Work in multiple business domain areas, including Customer Service, Finance, Sales, and Marketing.
- Design analytical frameworks to provide insights into a business problem.
- Explore and visualize multiple data sets to understand the data available, and prepare data for problem solving.
- Build machine learning models and/or statistical solutions, including predictive models and generative AI solutions.
- Use Natural Language Processing to extract insight from text.
- Design database models (if a data mart or operational data store is required to aggregate data for modeling).
- Design visualizations and build dashboards in Tableau and/or Power BI.
- Extract business insights from the data and models, and present results to stakeholders (telling stories with data) using PowerPoint and/or dashboards.
About You
You're a fit for the role of Senior Data Scientist if your background includes:
- 6-8 years in the field of machine learning and AI, with a minimum of 3 years working in the data science domain.
- A degree, preferably in a quantitative field (Computer Science, Statistics, etc.).
- Both technical and business acumen.
Technical skills:
- Proficient in machine learning, statistical modelling, data science, and generative AI techniques.
- Highly proficient in Python and SQL; experience with Tableau and/or Power BI.
- Has worked with Amazon Web Services and SageMaker.
- Ability to build data pipelines for data movement using tools such as Alteryx and AWS Glue.
- Experience with predictive analytics for customer retention, upsell/cross-sell, and new customer acquisition; customer segmentation; recommendation engines (custom and AWS Personalize); and PoCs building generative AI solutions (GPT, Llama, etc.). Hands-on with prompt engineering.
- Experience in Customer Service, Finance, Sales, and Marketing domains.
Additional technical skills:
- Familiarity with Natural Language Processing, including feature extraction techniques, word embeddings, topic modeling, sentiment analysis, classification, sequence models, and transfer learning.
- Knowledgeable of AWS APIs for machine learning; has worked with Snowflake extensively.
- Good presentation skills and the ability to tell stories using data and PowerPoint/dashboard visualizations; ability to communicate complex results in a simple and concise manner at all levels within the organization.
- Consulting experience with a premier consulting firm.
What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.
As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender (including pregnancy), gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
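The customer-retention predictive modelling named in the Senior Data Scientist posting above can be illustrated with a toy scorer. The logistic form is standard, but the features and weights below are invented for illustration; in practice they would be learned with scikit-learn or SageMaker, not hard-coded.

```python
# Toy logistic scorer for renewal likelihood. Feature names and weights
# are hypothetical; a real model would be trained, not hand-weighted.
import math

WEIGHTS = {"support_tickets": -0.4, "logins_per_week": 0.3}  # invented weights
BIAS = 0.5

def renewal_probability(features: dict) -> float:
    """Logistic (sigmoid) of a weighted feature sum -> probability in (0, 1)."""
    z = BIAS + sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Heavier product usage raises the score; more support tickets lower it.
p = renewal_probability({"support_tickets": 2, "logins_per_week": 5})
```

Scores like this feed the retention and churn-risk dashboards the role describes, flagging accounts below a chosen threshold for outreach.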
Posted 4 weeks ago
3.0 - 7.0 years
11 - 15 Lacs
Hyderabad
Work from Office
The Manager, Software Development Engineering leads a team of technical experts in successfully executing technology projects and solutions that align with the strategy and have broad business impact. The Manager, Software Development Engineering will work closely with development teams to identify and understand key features and their underlying functionality while also partnering closely with Product Management and UX Design. They may exercise influence and govern overall end-to-end software development life cycle related activities including management of support and maintenance releases, minor functional releases, and major projects. The Manager, Software Development Engineering will lead & provide technical guidance for process improvement programs while leveraging engineering best practices. In this people leadership role, Managers will recruit, train, motivate, coach, grow and develop Software Development Engineer team members at a variety of levels through their technical expertise and providing continuous feedback to ensure employee expectations, customer needs and product demands are met. About the Role: Lead and manage a team of engineers, providing mentorship and fostering a collaborative environment. Design, implement, and maintain scalable data pipelines and systems to support business analytics and data science initiatives. Collaborate with cross-functional teams to understand data requirements and ensure data solutions align with business goals. Ensure data quality, integrity, and security across all data processes and systems. Drive the adoption of best practices in data engineering, including coding standards, testing, and automation. Evaluate and integrate new technologies and tools to enhance data processing and analytics capabilities. Prepare and present reports on engineering activities, metrics, and project progress to stakeholders. About You: Proficiency in programming languages such as Python, Java, or Scala. 
Data Engineering with API & any programming language. Strong understanding of APIs and possess forward-looking knowledge of AI/ML tools or models and need to have some knowledge on software architecture. Experience with cloud platforms (e.g., AWS,Google Cloud) and big data technologies (e.g., Hadoop, Spark). Experience with Rest/Odata API's Strong problem-solving skills and the ability to work in a fast-paced environment. Excellent communication and interpersonal skills. Experience with data warehousing solutions such as BigQuery or snowflakes Familiarity with data visualization tools and techniques. Understanding of machine learning concepts and frameworks. #LI-AD2 Whats in it For You Hybrid Work Model Weve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrows challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. 
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.

Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.

Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 4 weeks ago
7.0 - 9.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Want to be part of the Data & Analytics organization, whose strategic goal is to create a world-class Data & Analytics company by building, embedding, and maturing a data-driven culture across Thomson Reuters?

About The Role: We are looking for a highly motivated individual with strong organizational and technical skills for the position of Lead Data Engineer / Data Engineering Manager (Snowflake). You will play a critical role working on the cutting edge of data engineering and analytics, leveraging predictive models, machine learning, and generative AI to drive business insights, facilitate informed decision-making, and help Thomson Reuters rapidly scale data-driven initiatives. Effectively communicate across various levels, including executives, and functions within the global organization. Demonstrate strong leadership skills with the ability to drive projects/tasks to deliver value. Engage with stakeholders, business analysts, and the project team to understand the data requirements. Design analytical frameworks to provide insights into a business problem. Explore and visualize multiple data sets to understand the data available and prepare data for problem solving. Design database models (if a data mart or operational data store is required to aggregate data for modeling).
About You: You're a fit for the Lead Data Engineer / Data Engineering Manager (Snowflake) role if your background includes:

Qualifications: B.Tech/M.Tech/MCA or equivalent. Experience: 7-9 years of corporate experience. Location: Bangalore, India. Hands-on experience in developing data models for large-scale data warehouses/data lakes (Snowflake, BW). Map the data journey from operational system sources through any transformations in transit to its delivery into enterprise repositories (Warehouse, Data Lake, Master Data, etc.). Enable the overall master and reference data strategy, including the procedures to ensure the consistency and quality of Finance reference data. Experience across ETL, SQL, and other emerging data technologies, with experience in integrations of a cloud-based analytics environment. Build and refine end-to-end data workflows to offer actionable insights. Fair understanding of data strategy and data governance processes. Knowledge of BI analytics and visualization tools: Power BI, Tableau. #LI-NR1
Posted 4 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Hyderabad, Gurugram, Ahmedabad
Work from Office
About the Role: Grade Level (for internal use): 10

The Role: Senior Software Developer, Application Development

The Team: We are looking for a highly motivated, enthusiastic, and skilled software engineer for S&P Global Market Intelligence. This developer would help us accelerate different initiatives within Energy Datasets.

The Impact: As a Senior Software Developer, you will be part of the development team that manages and supports the internal applications supporting Energy Datasets.

What's in it for you: It's a fast-paced agile environment that deals with huge volumes of data, so you'll have an opportunity to sharpen your data skills and work on an emerging technology stack.

Responsibilities: Troubleshoot problems, analyze, and isolate issues. Able to work on projects and tasks independently or with little assistance. Build solutions to develop/support key business needs. Engineer components and common services based on standard development models, languages, and tools. Produce system design documents. Collaborate effectively with technical and non-technical partners. Quickly learn new and internal technologies and help junior teammates.

What We're Looking For: Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent. Experience working in Agile Scrum methodology. Minimum of 4 years of experience in application development using Microsoft and Big Data technologies. We are looking for engineers possessing solid expertise in AWS and Databricks; you will have to work on and lead multiple projects built around these technologies. A minimum of three years of hands-on experience in these technologies is a MUST. Solid command of big data and cloud-based technologies including Snowflake, OData, Python, Scala, and Postgres. We also have applications written in .NET C#; you will be asked to work on these as per business requirements.
An ideal candidate will possess strong knowledge of object-oriented design, .NET, .NET Core, C#, SQL Server, and design patterns including MVVM. Good knowledge and experience working on multi-tier applications. Experience with microservices architecture will be a huge plus. Experience building applications using WinForms, WPF, ADO.NET, and Entity Framework will be a huge plus. Experience working on Windows services and scheduler applications using .NET and C#. SQL Server platform and stored-procedure programming experience using Transact-SQL. Experience with troubleshooting applications, debugging, logging, and performance monitoring. Must be a team player and quick learner, willing to take on difficult tasks as and when required. Experience with other technologies including Azure Cloud, Google Cloud, and Docker is a plus but not mandatory.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.

Health & Wellness: Health care coverage designed for the mind and body.

Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.

Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.

Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.

Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 4 weeks ago
8.0 - 13.0 years
8 - 12 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 11

S&P Global Market Intelligence

The Role: Lead Software Engineer

The Team: The Market Intelligence Industry Data Solutions business line provides data technology and services supporting acquisition, ingestion, content management, mastering, and distribution to power our Financial Institution Group business and customer needs. We focus on platform scale to support the business by following a common data lifecycle that accelerates business value. We provide essential intelligence for the Financial Services, Real Estate, and Insurance industries.

The Impact: The FIG Data Engineering team will be responsible for implementing and maintaining services and/or tools to support existing feed systems, which allow users to consume FIG datasets, and for making FIG data available to a data fabric for wider consumption and processing within the company.

What's in it for you: Ability to work with global stakeholders and work on the latest tools and technologies.

Responsibilities: Build new data acquisition and transformation pipelines using big data and cloud technologies. Work with the broader technology team, including information architecture and data fabric teams, to align pipelines with the Lodestone initiative.

What We're Looking For: Bachelor's in Computer Science or equivalent with at least 8+ years of professional software work experience. Experience with big data platforms such as Apache Spark, Apache Airflow, Google Cloud Platform, and Apache Hadoop. Deep understanding of REST, good API design, and OOP principles. Experience with object-oriented/object-functional scripting languages: Python, C#, Scala, etc.
Good working knowledge of relational SQL and NoSQL databases. Experience in maintaining and developing software in production utilizing cloud-based tooling (AWS, Docker & Kubernetes, Okta). Strong collaboration and teamwork skills with excellent written and verbal communication skills. Self-starter, motivated, with the ability to work in a fast-paced software development environment. Agile experience highly desirable. Experience in Snowflake and Databricks will be a big plus.

Return to Work: Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace.
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 4 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Chennai
Work from Office
An ETL Tester (4+ years a must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into the target systems. These target systems can be on cloud or on premises. ETL testers work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of the ETL processes, and understand cloud architecture in order to design test strategies for data moving in and out of cloud systems.

Roles And Responsibilities: Strong in data warehouse testing - ETL and BI. Strong database knowledge: Oracle / SQL Server / Teradata / Snowflake. Strong SQL skills with experience in writing complex data validation SQLs. Experience working in an Agile environment. Experience creating test strategies, release-level test plans, and test cases. Develop and maintain test data for ETL testing. Design and execute test cases for ETL processes and data integration. Good knowledge of Rally, Jira, and HP ALM. Experience in automation testing and data validation using Python. Document test results and communicate with stakeholders on the status of ETL testing.

Skills: rally,agile environment,automation testing,data validation,jira,etl and bi,hp alm,etl testing,test strategy,data warehouse,data integration testing,test case creation,python,oracle/ sql server/ teradata / snowflake,sql,data warehouse testing,database knowledge,test data maintenance
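The Python-based data validation called out above can be sketched with the standard library alone; `validate_load`, the checksum scheme, and the sample rows are illustrative, not from any specific project:

```python
import hashlib

def row_checksum(row):
    """Build a stable checksum for one row so source/target rows can be compared."""
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def validate_load(source_rows, target_rows):
    """Compare row counts and per-row checksums between source and target extracts."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"count mismatch: source={len(source_rows)} target={len(target_rows)}")
    src = {row_checksum(r) for r in source_rows}
    tgt = {row_checksum(r) for r in target_rows}
    missing = src - tgt   # rows dropped in flight
    extra = tgt - src     # rows that appeared unexpectedly
    if missing:
        issues.append(f"{len(missing)} source rows missing from target")
    if extra:
        issues.append(f"{len(extra)} unexpected rows in target")
    return issues

source = [(1, "alice"), (2, "bob")]
target = [(1, "alice")]
print(validate_load(source, target))
```

In practice the row lists would come from cursors over the source and target databases; the comparison logic stays the same.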
Posted 4 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Chennai
Hybrid
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)

Key Skills & Responsibilities: Strong hands-on experience with Snowflake database design, coding, and documentation. Expertise in performance tuning for both Oracle and Snowflake. Experience as an Apps DBA, capable of coordinating with application teams. Proficiency in using OEM, Tuning Advisor, and analyzing AWR reports. Strong SQL skills with the ability to guide application teams on improvements. Efficient management of compute and storage in Snowflake architecture. Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices. Implement data governance via column-level security, dynamic masking, and RBAC. Utilize Time Travel, cloning, replication, and recovery methods. Manage DML/DDL operations, concurrency models, and security policies. Enable secure data sharing internally and externally.

Skills: documentation,apps dba,replication,dml/ddl operations,performance tuning,compute management,rbac,security policies,coding,recovery methods,dynamic masking,cloning,oem,storage management,secure data sharing,time travel,tuning advisor,snowflake database design,snowflake,column-level security,data governance,concurrency models,oracle,awr reports analysis,sql
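As an illustration of the governance items above (dynamic masking plus RBAC), here is a hedged sketch of the Snowflake DDL involved, composed as Python strings so it can be reviewed or templated; every table, column, policy, and role name below is invented, and the statement shapes follow Snowflake's documented syntax:

```python
# Illustrative only: object names are made up. These are the kinds of
# statements a Snowflake admin runs for dynamic masking and RBAC grants.
masking_policy = """
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '***MASKED***'
  END;
""".strip()

# Attach the policy so the column is masked for every role except PII_READER.
apply_policy = (
    "ALTER TABLE customers MODIFY COLUMN email "
    "SET MASKING POLICY email_mask;"
)

# Plain RBAC grant: analysts can read, but still see masked emails.
grant_role = "GRANT SELECT ON TABLE customers TO ROLE analyst;"

for stmt in (masking_policy, apply_policy, grant_role):
    print(stmt, end="\n\n")
```

In a real deployment these strings would be executed through the Snowflake connector or a migration tool rather than printed.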
Posted 4 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Role 1: Snowflake Developer (Coding, Documentation)
Locations: Multiple locations (Bangalore, Hyderabad, Chennai, Kolkata, Mumbai, Pune, Gurugram)
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Budget: 18L (max)

Key Skills & Responsibilities: Strong hands-on experience with Snowflake database design, coding, and documentation. Expertise in performance tuning for both Oracle and Snowflake. Experience as an Apps DBA, capable of coordinating with application teams. Proficiency in using OEM, Tuning Advisor, and analyzing AWR reports. Strong SQL skills with the ability to guide application teams on improvements. Efficient management of compute and storage in Snowflake architecture. Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices. Implement data governance via column-level security, dynamic masking, and RBAC. Utilize Time Travel, cloning, replication, and recovery methods. Manage DML/DDL operations, concurrency models, and security policies. Enable secure data sharing internally and externally.

Skills: documentation,apps dba,replication,dml/ddl operations,performance tuning,compute management,rbac,security policies,coding,recovery methods,dynamic masking,cloning,oem,storage management,secure data sharing,time travel,tuning advisor,snowflake database design,snowflake,column-level security,data governance,concurrency models,oracle,awr reports analysis,sql
Posted 4 weeks ago
3.0 - 8.0 years
7 - 17 Lacs
Bengaluru
Work from Office
Primary Skills: #Snowflake, #Cloud (#AWS, #GCP), #Scala, #Python, #Spark, #BigData, and #SQL.

Responsibilities: Strong development experience in #Snowflake, #Cloud (#AWS, #GCP), #Scala, #Python, #Spark, #BigData, and #SQL. Work closely with stakeholders, including product managers and designers, to align technical solutions with business goals. Maintain code quality through reviews and make architectural decisions that impact scalability and performance. Perform root cause analysis for any critical defects; address technical challenges, optimize workflows, and resolve issues efficiently. Expert in #Agile and #Waterfall program/project implementation. Manage strategic and tactical relationships with program stakeholders. Successfully execute projects within strict deadlines while managing intense pressure. Good understanding of the #SDLC (Software Development Life Cycle). Identify potential technical risks and implement mitigation strategies. Excellent verbal, written, and interpersonal communication abilities, coupled with strong problem-solving, facilitation, and analytical skills.

Cloud Management Activities: A good understanding of cloud architecture/containerization and application management on #AWS and #Kubernetes, with an in-depth understanding of #CICD pipelines and review tools like #Jenkins, #Bamboo, and #DevOps. Skilled at adapting to evolving work conditions and fast-paced challenges.
Posted 4 weeks ago
13.0 - 18.0 years
44 - 48 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners. KPI Partners is a leading provider of data-driven insights and innovative analytics solutions. We strive to empower organizations to harness the full potential of their data, driving informed decision-making and business success. We are seeking an enthusiastic and experienced professional to join our dynamic team as an Associate Director / Director in Data Engineering & Modeling. We are looking for a highly skilled and motivated Associate Director/ Director – Data Engineering & Solution Architecture to support the strategic delivery of modern data platforms and enterprise analytics solutions. This is a hands-on leadership role that bridges technology and business, helping design, develop, and operationalize scalable cloud-based data ecosystems. You will work closely with client stakeholders, internal delivery teams, and practice leadership to drive the architecture, implementation, and best practices across key initiatives. Key Responsibilities Solution Design & Architecture : Collaborate on designing robust, secure, and cost-efficient data architectures using cloud-native platforms such as Databricks, Snowflake, Azure Data Services, AWS, and Incorta. Data Engineering Leadership : Oversee the development of scalable ETL/ELT pipelines using ADF, Airflow, dbt, PySpark, and SQL, with an emphasis on automation, error handling, and auditing. Data Modeling & Integration : Design data models (star, snowflake, canonical), resolve dimensional hierarchies, and implement efficient join strategies. API-based Data Sourcing : Work with REST APIs for data acquisition — manage pagination, throttling, authentication, and schema evolution. Platform Delivery : Support end-to-end project lifecycle — from requirement analysis and PoCs to development, deployment, and handover. CI/CD & DevOps Enablement : Implement and manage CI/CD workflows using Git, Azure DevOps, and related tools to enforce quality and streamline deployments. 
Mentoring & Team Leadership : Mentor senior engineers and developers, conduct code reviews, and promote best practices across engagements. Client Engagement : Interact with clients to understand needs, propose solutions, resolve delivery issues, and maintain high satisfaction levels. Required Skills & Qualifications 14+ years of experience in Data Engineering, BI, or Solution Architecture roles. Strong hands-on expertise in one of the cloud like Azure, Databricks, Snowflake, and AWS (EMR). Proficiency in Python, SQL, and PySpark for large-scale data transformation. Proven skills in developing dynamic and reusable data pipelines (metadata-driven preferred). Strong grasp of data modeling principles and modern warehouse design. Experience with API integrations, including error handling and schema versioning. Ability to design modular and scalable solutions aligned with business goals. Solid communication and stakeholder management skills. Preferred Qualifications Exposure to data governance, data quality frameworks, and security best practices. Certifications in Azure Data Engineering, Databricks, or Snowflake are a plus. Experience working with Incorta and building materialized views or delta-based architectures. Experience working with enterprise ERP systems. Exposure leading data ingestion from Oracle Fusion ERP and other enterprise systems. What We Offer Opportunity to work on cutting-edge data transformation projects for global enterprises Mentorship from senior leaders and a clear path to Director-level roles Flexible work environment and a culture that values innovation, ownership, and growth Competitive compensation and professional development support
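The API-based data sourcing concerns listed above (pagination, throttling, retries) can be sketched as follows; `fetch_page` is a stand-in for a real authenticated HTTP call, and all names and the retry policy are assumptions for illustration:

```python
import time

def fetch_all(fetch_page, page_size=2, max_retries=3, backoff_s=0.01):
    """Collect every record from a paginated source, retrying throttled pages.

    fetch_page(offset, limit) must return (rows, has_more); a real
    implementation would wrap an authenticated HTTP GET.
    """
    rows, offset = [], 0
    while True:
        for attempt in range(max_retries):
            try:
                page, has_more = fetch_page(offset, page_size)
                break
            except RuntimeError:  # stand-in for an HTTP 429 throttling response
                time.sleep(backoff_s * (2 ** attempt))  # exponential backoff
        else:
            raise RuntimeError(f"page at offset {offset} kept failing")
        rows.extend(page)
        if not has_more:
            return rows
        offset += page_size

# Stubbed source standing in for a paginated REST endpoint.
DATA = ["a", "b", "c", "d", "e"]

def fake_page(offset, limit):
    chunk = DATA[offset:offset + limit]
    return chunk, offset + limit < len(DATA)

print(fetch_all(fake_page))  # collects all five records across three pages
```

Schema evolution and authentication would layer on top of this loop (header injection, response validation) without changing its shape.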
Posted 4 weeks ago
4.0 - 9.0 years
7 - 17 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Work from Office
Role & Responsibilities:
Strong, hands-on proficiency with Snowflake:
- In-depth knowledge of Snowflake architecture and features (e.g., Snowpipe, Tasks, Streams, Time Travel, Zero-Copy Cloning).
- Experience in designing and implementing Snowflake data models (schemas, tables, views).
- Expertise in writing and optimizing complex SQL queries in Snowflake.
- Experience with data loading and unloading techniques in Snowflake.
Solid experience with AWS Cloud services:
- Proficiency in using AWS S3 for data storage, staging, and as a landing zone for Snowflake.
- Experience with other relevant AWS services (e.g., IAM for security, Lambda for serverless processing, Glue for ETL, if applicable).
- Strong experience in designing and building ETL/ELT data pipelines.
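To give the Snowpipe and S3-staging requirements above some concrete shape, here is a minimal sketch that generates Snowpipe DDL from Python. The pipe, table, and stage names are hypothetical, and the external stage is assumed to already exist (created separately with CREATE STAGE pointing at the S3 landing prefix):

```python
def build_snowpipe_ddl(pipe: str, table: str, stage: str) -> str:
    """Generate DDL for a Snowpipe that auto-ingests files landing in S3.

    All object names are illustrative; AUTO_INGEST = TRUE makes the pipe
    react to S3 event notifications rather than requiring manual refreshes.
    """
    copy_stmt = (
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (TYPE = PARQUET)"
    )
    return f"CREATE PIPE IF NOT EXISTS {pipe} AUTO_INGEST = TRUE AS {copy_stmt};"
```

Generating DDL as strings like this keeps pipe definitions versionable and unit-testable before they are ever executed against an account.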
Posted 4 weeks ago
8.0 - 12.0 years
22 - 27 Lacs
Indore, Chennai
Work from Office
We are hiring a Senior Python DevOps Engineer to develop scalable apps using Flask/FastAPI, automate CI/CD, manage cloud and ML workflows, and support containerized deployments in OpenShift environments.

Required Candidate Profile:
8+ years in Python DevOps with expertise in Flask, FastAPI, CI/CD, cloud, ML workflows, and OpenShift. Skilled in automation, backend optimization, and global team collaboration.
Posted 4 weeks ago
6.0 - 11.0 years
13 - 22 Lacs
Hyderabad, Bengaluru
Work from Office
AWS Glue - mandatory
AWS S3 and AWS Lambda - should have some experience
Must have used Snowpipe to build integration pipelines
Ability to build stored procedures from scratch
Ability to write complex SQL queries
Python - NumPy and Pandas
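The NumPy/Pandas requirement above usually amounts to expressing SQL-style aggregates as DataFrame code. A small hedged example (the `order_ts` and `amount` column names are illustrative, not from any specific schema):

```python
import numpy as np
import pandas as pd

def daily_revenue(orders: pd.DataFrame) -> pd.DataFrame:
    """Pandas equivalent of: SELECT day, SUM(amount) ... GROUP BY day ORDER BY day."""
    out = (
        orders.assign(day=orders["order_ts"].dt.floor("D"))  # truncate timestamp to date
              .groupby("day", as_index=False)["amount"].sum()
              .sort_values("day", ignore_index=True)
    )
    out["amount"] = np.round(out["amount"], 2)  # avoid float noise in totals
    return out
```

The same pattern (assign a derived key, group, aggregate, sort) covers most of the "complex SQL in pandas" ground interviewers probe for.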
Posted 4 weeks ago
2.0 - 4.0 years
7 - 9 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
POSITION: Senior Data Engineer / Data Engineer
LOCATION: Bangalore/Mumbai/Kolkata/Gurugram/Hyderabad/Pune/Chennai
EXPERIENCE: 2+ Years
JOB TITLE: Senior Data Engineer / Data Engineer

OVERVIEW OF THE ROLE:
As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in the implementation and automation of scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security.

HASHEDIN BY DELOITTE 2025

Mandatory Skills:
- Hands-on software coding or scripting for a minimum of 3 years
- Experience in product management for at least 2 years
- Stakeholder management experience for at least 3 years
- Experience in at least one of the GCP, AWS, or Azure cloud platforms

Key Responsibilities:
- Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi).
- Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks.
- Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads.
- Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infra-as-code).
- Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks.
- Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation.
- Collaborate with Data Scientists, Analysts, and DevOps engineers to ingest, structure, and expose structured, semi-structured, and unstructured data for diverse use-cases.
- Contribute to data modeling, schema design, and data partitioning strategies, and ensure adherence to best practices for performance and cost optimization.
- Implement, document, and extend data lineage, cataloging, and observability through tools such as AWS Glue, Azure Purview, Amundsen, or open-source technologies.
- Apply and enforce data security, privacy, and compliance requirements (e.g., access control, data masking, retention policies, GDPR/CCPA).
- Take ownership of the end-to-end data pipeline lifecycle: design, development, code reviews, testing, deployment, operational monitoring, and maintenance/troubleshooting.
- Contribute to frameworks, reusable modules, and automation to improve development efficiency and maintainability of the codebase.
- Stay abreast of industry trends and emerging technologies, participating in code reviews, technical discussions, and peer mentoring as needed.

Skills & Experience:
- Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar).
- Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.).
- Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity).
- Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming).
- Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro).
- Strong SQL development skills for ETL, analytics, and performance optimization.
- Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments.
- Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing.
- Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management.
- Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA).
- Familiarity with BI or visualization tools (Power BI, Tableau, Looker, etc.) is an advantage but not core.
- Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus.
- Bonus: exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes.

Professional Attributes:
- Strong analytical and problem-solving skills; attention to detail and commitment to code quality and documentation.
- Ability to communicate technical designs and issues effectively with team members and stakeholders.
- Proven self-starter, fast learner, and collaborative team player who thrives in dynamic, fast-paced environments.
- Passion for mentoring, sharing knowledge, and raising the technical bar for data engineering practices.

Desirable Experience:
- Contributions to open-source data engineering/tools communities.
- Implementing data cataloging, stewardship, and data democratization initiatives.
- Hands-on work with DataOps/DevOps pipelines for code and data.
- Knowledge of ML pipeline integration (feature stores, model serving, lineage/monitoring integration) is beneficial.

EDUCATIONAL QUALIFICATIONS:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
- Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks).
- Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes.
- Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus.
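The data-quality responsibilities in this posting (validation, cleansing, anomaly detection, reconciliation) are usually automated with frameworks like Great Expectations or Deequ; as a rough sketch of the underlying idea, a hand-rolled batch check might look like this (the column names and rules are illustrative assumptions):

```python
def validate_batch(rows, required, non_negative):
    """Flag rows that violate simple expectations.

    `required`: columns that must be present and non-empty.
    `non_negative`: numeric columns that must not be negative.
    Returns a list of (row_index, column, reason) failures.
    """
    failures = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                failures.append((i, col, "missing"))
        for col in non_negative:
            value = row.get(col)
            if isinstance(value, (int, float)) and value < 0:
                failures.append((i, col, "negative"))
    return failures
```

In a pipeline, a non-empty failure list would typically fail the task (or quarantine the bad rows) before anything is loaded downstream.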
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad, Telangana
Work from Office
Notice Period: Immediate to 15 days preferred

We are actively looking for a MongoDB Developer to join our growing technology team. This role involves working on data modeling, ingestion, and integration with various systems while ensuring high performance, scalability, and reliability of data-driven applications.

Key Responsibilities:
- Design, develop, and manage scalable and optimized MongoDB-based database solutions
- Work on data modeling for performance and storage efficiency
- Develop effective and scalable queries and operations using MongoDB
- Integrate third-party services, APIs, and tools with MongoDB for streamlined data management
- Work collaboratively with developers, data engineers, and business teams to ensure seamless application integration
- Write and execute unit, integration, and performance tests for MongoDB implementations
- Conduct code reviews and database optimization, ensuring best practices in data security and architecture
- Document and maintain schema changes and performance improvements

Preferred Skills:
- Experience with Snowflake or similar cloud-based data warehouses is an advantage
- Exposure to Agile methodologies
- Familiarity with CI/CD pipelines for data workflows

Primary Skills Required:
- Strong experience with MongoDB (querying, schema design, ingestion)
- Familiarity with JavaScript/Node.js or Python (for integrations, if applicable)
- API integration
- Performance tuning and data management
- MongoDB Atlas (preferred)

To Apply, Please Share Your Resume with the Following Details:
Full Name:
Total Experience:
Relevant Experience in MongoDB:
Snowflake Experience (if any):
Current Company:
Current CTC:
Expected CTC:
Notice Period:
Preferred Location:
Willingness to work remotely (Yes/No):

Send your resume to: [Your Email Address]
Contact: [Your Phone Number, if needed]
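Much of the MongoDB query work this role describes centers on the aggregation pipeline. Since pipelines are plain JSON-like documents, they can be built and unit-tested in Python without a live server before being passed to pymongo's `collection.aggregate`. A sketch (the `customer_id` and `amount` field names are assumed for illustration):

```python
def top_customers_pipeline(min_total: float, limit: int = 5):
    """Build an aggregation pipeline: total spend per customer,
    filtered to totals >= min_total, sorted descending, capped at `limit`."""
    return [
        {"$group": {"_id": "$customer_id", "total": {"$sum": "$amount"}}},
        {"$match": {"total": {"$gte": min_total}}},
        {"$sort": {"total": -1}},
        {"$limit": limit},
    ]
```

Keeping pipeline construction in plain functions like this makes the stages easy to review and to vary per caller, which is most of what "scalable queries" means in practice.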
Posted 4 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Dear Candidate,

Greetings from Tata Consultancy Services (TCS). We are pleased to invite you to our in-person interview drive for professionals with Snowflake Developer expertise in Hyderabad.

Interview Drive Details:
Date: 21-Jun-2025
Time: 9:00 AM to 5:00 PM
Venue: TCS Deccan Park - LS1 Zone, Plot No 1, Survey No. 64/2, Software Units Layout, Serilingampally Mandal, Madhapur, Hyderabad - 500081, Telangana

Role: Snowflake Developer
Required Technical Skill Set: Snowflake
Desired Experience Range: 5 to 10 years
Location of Requirement: Hyderabad

Desired Competencies (Technical/Behavioral Competency)
Must-Have:
- At least 5+ years of relevant work experience in any Data Warehouse technologies
- At least 2+ years of experience in designing, implementing, and migrating Data/Enterprise/Engineering workloads onto the Snowflake DWH
- Should be able to take requirements from Business and coordinate with Business and IT teams on clarifications, dependencies, and status reporting
- As an individual contributor, should be able to create, test, and implement business solutions in Snowflake
- Experience in implementing DevOps/CI/CD using Azure DevOps / GitLab CI/CD is preferred
- Hands-on experience in data modeling
- Expert in SQL and query performance tuning techniques
- Experience with ingestion techniques using ETL tools (IICS) and Snowflake's COPY/Snowpipe/Streamlit utilities
- Strong in writing Snowflake stored procedures, views, UDFs, etc.
- Good exposure to handling CDC using Streams and Time Travel
- Proficient in working with Snowflake Tasks, Data Sharing, and data replication

Good-to-Have: DBT

Responsibility of / Expectations from the Role:
1. Good exposure to handling CDC using Streams and Time Travel
2. Expert in SQL and query performance tuning techniques
3. Experience with ingestion techniques using ETL tools (IICS) and Snowflake's COPY/Snowpipe/Streamlit utilities
4. Strong in writing Snowflake stored procedures, views, UDFs, etc.

We look forward to your confirmation and participation in the interview drive.
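Handling CDC with Streams, as this role requires, typically ends in a MERGE that applies the captured changes to a target table. A hedged sketch that generates such a statement (the table, stream, and column names are illustrative; METADATA$ACTION is the metadata column a Snowflake stream exposes to indicate the change type, and the stream is assumed to be defined on the source table with CREATE STREAM ... ON TABLE ...):

```python
def build_stream_merge(target: str, stream: str, key: str, cols: list) -> str:
    """Generate a Snowflake MERGE that applies stream-captured CDC rows."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {stream} s ON t.{key} = s.{key}\n"
        f"WHEN MATCHED AND s.METADATA$ACTION = 'DELETE' THEN DELETE\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED AND s.METADATA$ACTION = 'INSERT' THEN "
        f"INSERT ({col_list}) VALUES ({src_list});"
    )
```

Consuming the stream inside a DML statement like this also advances its offset, so each scheduled run (e.g., via a Snowflake Task) picks up only the changes since the last MERGE.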
Posted 1 month ago
8.0 - 13.0 years
20 - 25 Lacs
Vadodara
Work from Office
Expanding renewables. Transforming conventional power. Strengthening electrical grids. Driving industrial decarbonization. Securing the supply chain and necessary minerals.

Job Requirements:

A Snapshot of Your Day
Your key responsibility is to project-manage finance digitalization solutions to secure sustainable quality, efficiency, and performance together with the entire finance organization. By supporting our teams in implementing state-of-the-art technology, you will lead projects that build user-centric solutions to enable our joint digital transformation journey.

How You'll Make an Impact
- Manage end-to-end digitalization projects within the finance organization, from concept to delivery.
- Facilitate Agile ceremonies, including daily stand-ups, sprint planning, and retrospectives, to promote team collaboration and continuous improvement.
- Assist project teams in effectively using project management tools (e.g., Jira); monitor engagement, accuracy, and timely updates, including comments, statuses, and documentation.
- Identify and proactively remove obstacles to ensure smooth workflow and timely delivery of milestones.
- Support change management initiatives, equipping teams to adapt to new processes and digital solutions.
- Build strong relationships with business stakeholders, understanding their needs and translating them into sustainable digital solutions.

What You Bring
- Expert knowledge and practical experience in both Agile and traditional project management methodologies.
- Strong understanding of finance processes (e.g., record to report, purchase to pay, order to cash).
- Proficiency in digital tools such as Alteryx, SAP Analytics Cloud, Power BI, Tableau, Snowflake, Jira, Confluence, or similar solutions.
- Excellent communication abilities focusing on collaboration and empathy.
- Proficiency in English (spoken and written).
- Relevant formal certification (e.g., Scrum, PMP) required.
- Bachelor's degree or equivalent professional experience.
Posted 1 month ago
3.0 - 8.0 years
9 - 15 Lacs
Gurugram, Bengaluru
Hybrid
NOTICE: Immediate to 15 days (serving). PF mandatory!

MANDATORY SKILLS:
- Snowflake
- Cloud (AWS/GCP)
- Scala
- Python
- Spark

--
Thanks & Regards,
Karthik Kumar, IT Recruiter
SP Software (P) Limited (An ISO, ISMS & CMMI Level-3 certified company)
An SP Group Enterprise
Connect on: linkedin.com/in/b-karthik-kumar-116990179
Posted 1 month ago
12.0 - 20.0 years
35 - 40 Lacs
Navi Mumbai
Work from Office
Position Overview:
We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities:
- Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
- Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
- Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
- Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions.
- Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
- Lead by example in object-oriented development, particularly using Scala and Java.
- Translate complex requirements into clear, actionable technical tasks for the team.
- Contribute to the development of ETL processes for integrating data from various sources.
- Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications:
- 8+ years of professional experience in Big Data development and engineering.
- Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
- Solid object-oriented development experience with Scala and Java.
- Strong SQL skills with experience working with large data sets.
- Practical experience designing, installing, configuring, and supporting Big Data clusters.
- Deep understanding of ETL processes and data integration strategies.
- Proven experience mentoring or supporting junior engineers in a team setting.
- Strong problem-solving, troubleshooting, and analytical skills.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
- Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
- Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer:
- Opportunity to work on challenging, high-impact Big Data projects.
- A leadership role in shaping and mentoring the next generation of engineers.
- A supportive and collaborative team culture.
- A flexible working environment.
- Competitive compensation and professional growth opportunities.
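The Spark skills this posting lists rest on the map/flatMap/reduceByKey model of computation. The classic word count can be sketched in plain Python to show the shape of that computation without a cluster; this is an analogy to the Spark operations, not Spark API code:

```python
from collections import Counter

def word_count(lines):
    """Word count in plain Python, mirroring the Spark pattern:
    flatMap (split lines into words), map (normalize case),
    and reduceByKey (sum counts per word)."""
    # flatMap + map: one lowercase word at a time across all lines
    words = (word.lower() for line in lines for word in line.split())
    # reduceByKey analogue: accumulate a count per distinct word
    counts = Counter()
    for word in words:
        counts[word] += 1
    return dict(counts)
```

On a real cluster the same three steps would be `rdd.flatMap(str.split).map(str.lower).map(lambda w: (w, 1)).reduceByKey(add)`, with the shuffle between map and reduce doing the work the single `Counter` does here.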
Posted 1 month ago