
2818 Snowflake Jobs - Page 39

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Relevant experience: 5+ years. Detailed job description / skill set: Mandatory: design, build, and testing of a new Contractor Management Dashboard portal for the program and assets, utilizing current MyPass and other Snowflake tables. Good to have: review and optimization of Snowflake tables for MyPass data; user training, feedback, and refinement.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Project description: We are more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a commitment to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It's how we've contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services - and our open-access model - we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. We are headquartered in the United Kingdom, with significant operations in 70 countries across EMEA, North America, Latin America and Asia Pacific. We employ 25,000 people globally, more than half located in Asia Pacific.

Responsibilities: As a Senior Quality Assurance Engineer, you will be responsible for ensuring the quality and reliability of complex data-driven systems, with a focus on financial services applications. You will work closely with Data Engineers, Business Analysts, and Developers across global teams to validate the functionality, accuracy, and performance of software solutions, particularly around data migration from on-premises to cloud platforms. Key responsibilities include:
- Leading and executing end-to-end test plans, including functional, unit, regression, and back-to-back testing (see the sketch below)
- Designing test strategies for data migration projects, with a strong focus on Oracle-to-cloud transitions
- Verifying data accuracy and transformation logic across multiple environments
- Writing Python-based automated test scripts and utilities for validation
- Participating in Agile ceremonies and collaborating closely with cross-functional teams
- Proactively identifying and documenting defects, inconsistencies, and process improvements
- Contributing to continuous testing and integration practices
- Ensuring traceability between requirements, test cases, and delivered code

Skills - Must have: The ideal candidate must demonstrate strong experience (minimum 7 years) and hands-on expertise in the following areas:
- Data testing (Oracle to cloud migration): deep understanding of testing strategies for large-scale data movement and transformation validation between legacy on-premises systems and modern cloud platforms
- Python scripting: proficient in using Python to write automated test scripts and tools that streamline testing processes
- Regression testing: proven ability to develop and manage comprehensive regression test suites ensuring consistent software performance across releases
- Back-to-back testing: experience comparing results between old and new systems or components to validate data integrity post-migration
- Functional testing: skilled in verifying system behavior against functional requirements in a business-critical environment
- Unit testing: capable of writing and executing unit tests for small code components to ensure correctness at the foundational level

Nice to have: While not required, the following skills would be a strong plus and would enhance your effectiveness in the role:
- Advanced Python development: experience building complex QA tools or contributing to CI/CD pipelines using Python
- DBT (Data Build Tool): familiarity with DBT for transformation testing and documentation in data engineering workflows
- Snowflake: exposure to the Snowflake cloud data warehouse and an understanding of its testing and validation mechanisms
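The back-to-back testing this posting emphasizes boils down to running the same check against the legacy and target systems and comparing results. Below is a minimal, illustrative sketch of that idea, assuming the publicly available oracledb and snowflake-connector-python packages; the connection details, the trades table, and its columns are invented placeholders, not from the posting.

```python
# Minimal back-to-back check: run the same aggregate on Oracle (legacy)
# and Snowflake (target) and compare. Connection details, table, and
# column names are illustrative placeholders.
import oracledb
import snowflake.connector

CHECK_SQL = """
    SELECT COUNT(*), SUM(trade_amount)
    FROM trades
    WHERE trade_date = DATE '2024-01-31'
"""

def fetch_one(cursor, sql):
    cursor.execute(sql)
    return cursor.fetchone()

ora = oracledb.connect(user="qa_user", password="...", dsn="legacy-db/ORCL")
sf = snowflake.connector.connect(
    account="my_account", user="qa_user", password="...",
    warehouse="QA_WH", database="MIGRATED", schema="PUBLIC",
)
try:
    ora_count, ora_sum = fetch_one(ora.cursor(), CHECK_SQL)
    sf_count, sf_sum = fetch_one(sf.cursor(), CHECK_SQL)
    assert ora_count == sf_count, f"row counts differ: {ora_count} vs {sf_count}"
    assert ora_sum == sf_sum, f"sums differ: {ora_sum} vs {sf_sum}"
    print("back-to-back check passed")
finally:
    ora.close()
    sf.close()
```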

Posted 3 weeks ago

Apply

7.0 - 12.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Project description: Luxoft has been asked to contract a developer in support of a number of customer initiatives. The primary objective is to develop solutions based on client requirements in a telecom/network environment.

Responsibilities: A Data Engineer with experience in the following technologies:
- Databricks and Azure (Apache Spark-based); hands-on Python, SQL, and Apache Airflow
- Databricks clusters for ETL processes; integration with ADLS and Blob Storage
- Efficiently ingest data from various sources, including on-premises databases, cloud storage, APIs, and streaming data
- Use Azure Key Vault for managing secrets (illustrated in the sketch below)
- Hands-on experience working with APIs
- Hands-on experience with Kafka/Azure Event Hub streaming
- Hands-on experience with Databricks Delta APIs and Unity Catalog
- Hands-on experience working with version control tools (GitHub)
- Data analytics: supports various ML frameworks, with integration into Databricks for model training
- On-prem exposure: Linux-based systems and Unix scripting

Skills - Must have: Python, Apache Airflow, Microsoft Azure and Databricks, SQL, Databricks clusters for ETL, ADLS, Blob Storage, ingestion from various sources (databases, cloud storage, APIs, and streaming data), Kafka/Azure Event Hub, Databricks Delta APIs and Unity Catalog.
- Education: typically a Bachelor's degree in Computer Science (preferably an M.Sc. in Computer Science), Software Engineering, or a related field is required.
- Experience: 7+ years of experience in development or related fields.
- Problem-solving skills: ability to troubleshoot and resolve issues related to application development and deployment.
- Communication skills: ability to effectively communicate technical concepts to team members and stakeholders, both written and verbal.
- Teamwork: ability to work effectively in teams with diverse individuals and skill sets.
- Continuous learning: given the rapidly evolving nature of these technologies, a commitment to learning and adapting to new technologies and methodologies is crucial.

Nice to have: Snowflake, PostgreSQL, and Redis exposure; GenAI exposure; a good understanding of RBAC.
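The ingestion pattern in this listing (ADLS sources, Key Vault secrets, Delta output) can be sketched roughly as follows. This assumes a Databricks runtime, where spark and dbutils are provided, and a Key Vault-backed secret scope; the scope, key, storage account, and table names are illustrative placeholders.

```python
# Sketch of a Databricks ingestion step: read raw files from ADLS using a
# secret from an Azure Key Vault-backed scope, then write a Delta table.
# Assumes a Databricks runtime (spark and dbutils exist there); the scope,
# key, storage account, paths, and table name are placeholders.
storage_key = dbutils.secrets.get(scope="kv-scope", key="adls-access-key")
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net", storage_key
)

raw = (
    spark.read.format("json")
    .load("abfss://landing@mystorageacct.dfs.core.windows.net/events/")
)

# Basic cleanup before landing in the bronze layer.
cleaned = raw.dropDuplicates(["event_id"]).filter("event_ts IS NOT NULL")

(
    cleaned.write.format("delta")
    .mode("append")
    .saveAsTable("bronze.events")  # registered in Unity Catalog if enabled
)
```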

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Project description and responsibilities: We are seeking a highly skilled and motivated Data Scientist with 5+ years of experience to join our team. The ideal candidate will bring strong data science, programming, and data engineering expertise, along with hands-on experience in generative AI, large language models, and modern LLM application frameworks. This role also demands excellent communication and stakeholder management skills to collaborate effectively across business units.

Skills - Must have:
- Experience: 5+ years of industry experience as a Data Scientist, with a proven track record of delivering impactful, data-driven solutions
- Programming skills: advanced proficiency in Python, with extensive experience writing clean, efficient, and maintainable code; proficiency with version control tools such as Git
- Data engineering: strong working proficiency with SQL and distributed computing with Apache Spark
- Cloud platforms: experience building and deploying apps on Azure Cloud
- Generative AI & LLMs: practical experience with large language models (e.g., OpenAI, Anthropic, Hugging Face); knowledge of Retrieval-Augmented Generation (RAG) techniques and prompt engineering is expected (a minimal sketch follows below)
- Machine learning & modeling: strong grasp of statistical modeling, machine learning algorithms, and tools like scikit-learn, XGBoost, etc.
- Stakeholder engagement: excellent communication skills with a demonstrated ability to interact with business stakeholders, understand their needs, present technical insights clearly, and drive alignment across teams
- Tools and libraries: proficiency with libraries like Pandas and NumPy, and ML lifecycle tools such as MLflow
- Team collaboration: proven experience contributing to agile teams and working cross-functionally in fast-paced environments

Nice to have:
- Hands-on experience with Databricks and Snowflake
- Hands-on experience building LLM-based applications using agentic frameworks like LangChain, LangGraph, and AutoGen
- Familiarity with data visualization platforms such as Power BI, Tableau, or Plotly
- Front-end/full-stack development experience
- Exposure to MLOps practices and model deployment pipelines in production
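Since the role expects familiarity with RAG, here is a deliberately minimal sketch of the pattern, using scikit-learn's TF-IDF as a stand-in for embedding-based retrieval; the documents are invented, and the final LLM call is left as a comment.

```python
# Minimal Retrieval-Augmented Generation sketch: retrieve the most relevant
# documents for a query, then build a grounded prompt for an LLM.
# TF-IDF stands in for embedding search; documents are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Premium support is available 24/7 for enterprise customers.",
    "Invoices are issued on the first business day of each month.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    # Rank documents by cosine similarity to the query and keep the top k.
    scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def build_prompt(query: str) -> str:
    context = "\n".join(f"- {d}" for d in retrieve(query))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

print(build_prompt("How long do customers have to return a product?"))
# In a real pipeline, the prompt would be sent to an LLM API (e.g. OpenAI).
```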

Posted 3 weeks ago

Apply

3.0 - 8.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Project description: We are DXC Luxoft Financial Services - an award-winning provider of technology solutions dedicated to the financial services sector. Join our international team and become a member of our open-minded, progressive and professional team of financial services consultants. In this role you will be working on projects for the biggest investment banks across the globe. You will have a chance to grow your technical and soft skills, and build thorough expertise in the capital markets industry. On top of an attractive salary and benefits package, we will invest in your professional training and allow you to grow your professional career.

Responsibilities: We are seeking a Data Engineer who can design and build solutions for our team with minimum supervision. The person needs to be knowledgeable about modern software development practices, including Scrum, be up to date with the latest database tools and frameworks, and be able to work with the existing team to deliver solutions. The ideal candidate will be diligent, tenacious and delivery-focused, with a track record of building high-quality solutions that are resilient and maintainable. Use of cloud technology is increasing, so experience working with AWS/Snowflake would be beneficial.

Skills - Must have:
- Proven experience as a Senior Software Engineer with extensive experience in software development (at least 7 years' total experience)
- Proven experience with DBT (3+ years)
- Proven experience working with Oracle PL/SQL, XML, and JSON
- Strong understanding of ETL and the main DWH principles
- Strong understanding of sharding and clustering
- Optimization of dynamic SQL (4+ years)
- Good knowledge of T-SQL (4+ years)

Other:
- Strong analytical skills
- BS/MS degree in Computer Science, Engineering, and/or a related field, or equivalent work experience
- Good communication skills
- Able to work in a challenging, fast-paced environment

Nice to have:
- Snowflake experience
- Financial domain knowledge
- Experience in a market-data environment is a plus
- Proven experience working with Java and Snowflake

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Project description: The Institutional Banking Data Platform (IDP) is a state-of-the-art cloud platform engineered to streamline the data ingestion, transformation, and distribution workflows that underpin Regulatory Reporting, Market Risk, Credit Risk, Quants, and Trader Surveillance. In your role as Software Engineer, you will be responsible for ensuring the stability of the platform, performing maintenance and support activities, and driving innovative process improvements that add significant business value.

Responsibilities:
- Problem solving: advanced analytical and problem-solving skills to analyse complex information for key insights and present it as meaningful information to senior management
- Communication: excellent verbal and written communication skills, with the ability to lead discussions with varied stakeholders across levels
- Risk mindset: you are expected to proactively identify and understand, openly discuss, and act on current and future risks

Skills - Must have:
- Bachelor's degree in computer science, engineering, or a related field, or equivalent experience
- 5+ years of proven experience as a Software Engineer or in a similar role, with a strong track record of successfully maintaining and supporting complex applications
- Strong hands-on experience with Ab Initio GDE, including Express>It, Control Centre, and Continuous>Flows
- Should have handled and worked with XML, JSON, and Web APIs
- Strong hands-on experience in SQL
- Hands-on experience in any shell scripting language
- Experience with batch- and streaming-based integrations

Nice to have:
- Knowledge of CI/CD tools such as TeamCity, Artifactory, Octopus, Jenkins, SonarQube, etc.
- Knowledge of AWS services including EC2, S3, CloudFormation, CloudWatch, RDS and others
- Knowledge of Snowflake and Apache Kafka is highly desirable
- Experience with configuration management and infrastructure-as-code tools such as Ansible, Packer, and Terraform
- Experience with monitoring and observability tools like Prometheus/Grafana

Posted 3 weeks ago

Apply

6.0 - 11.0 years

11 - 16 Lacs

Gurugram

Work from Office

Project description: We are looking for a star Python Developer who is not afraid of work and challenges! Partnering with a famous financial institution, we are gathering a team of professionals with a wide range of skills to successfully deliver business value to the client.

Responsibilities:
- Analyse existing SAS DI pipelines and SQL-based transformations
- Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark (see the example below)
- Develop and maintain scalable ETL pipelines using Python on AWS EMR
- Implement data transformation, cleansing, and aggregation logic to support business requirements
- Design modular and reusable code for distributed data processing tasks on EMR clusters
- Integrate EMR jobs with upstream and downstream systems, including AWS S3, Snowflake, and Tableau
- Develop Tableau reports for business reporting

Skills - Must have:
- 6+ years of experience in ETL development, with at least 5 years working with AWS EMR
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field
- Proficiency in Python for data processing and scripting
- Proficiency in SQL and experience with one or more ETL tools (e.g., SAS DI, Informatica)
- Hands-on experience with AWS services: EMR, S3, IAM, VPC, and Glue
- Familiarity with data storage systems such as Snowflake or RDS
- Excellent communication skills and the ability to work collaboratively in a team environment
- Strong problem-solving skills and the ability to work independently

Nice to have: N/A
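To make the SAS-to-PySpark translation concrete, here is a hedged example of how a typical PROC SQL aggregation might be re-expressed in PySpark; the SAS snippet, table, columns, and S3 paths are invented for illustration.

```python
# Illustrative translation of a SAS PROC SQL aggregation into PySpark.
# Hypothetical source SAS:
#   PROC SQL;
#     CREATE TABLE summary AS
#     SELECT region, SUM(balance) AS total_balance
#     FROM accounts WHERE status = 'ACTIVE'
#     GROUP BY region;
#   QUIT;
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-migration").getOrCreate()

# Placeholder input path; on EMR this would point at the curated data lake.
accounts = spark.read.parquet("s3://my-bucket/curated/accounts/")

summary = (
    accounts.filter(F.col("status") == "ACTIVE")
    .groupBy("region")
    .agg(F.sum("balance").alias("total_balance"))
)

summary.write.mode("overwrite").parquet("s3://my-bucket/marts/summary/")
```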

Posted 3 weeks ago

Apply

8.0 - 13.0 years

22 - 27 Lacs

Bengaluru

Work from Office

MEET THE TEAM: We are a high-performing analytics team on a mission to embed AI into every layer of our decision-making. By applying machine learning, large language models (LLMs), and automation, we're redefining how data powers innovation and efficiency at scale.

YOUR IMPACT: As the AI Lead, you'll architect AI-powered BI systems integrating Snowflake, Tableau, and natural language interfaces. You'll transform ETL workflows with ML automation and build multi-agent AI systems to supervise KPIs, generate dashboards, and drive insights in real time.

MINIMUM QUALIFICATIONS:
- Proven expertise in building AI apps (LLMs, RAG, NLP, deep learning)
- Advanced Python, SQL (Snowflake), and Tableau dashboarding skills
- Experience with ML pipelines and GenAI frameworks (LangChain, Hugging Face)
- Ability to lead workshops and PoCs, and to mentor teams
- Close collaboration with AI/engineering/product teams
- Continuous-learning mindset with industry awareness

BASIC QUALIFICATIONS:
- Bachelor's/Master's in CS, AI, or a related field
- 10+ years in BI/analytics; 3+ in AI/ML roles
- Expertise in LLMs (OpenAI, Claude, Llama), RAG, and agent-based systems
- Strong MLOps skills: model deployment and lifecycle management
- Leadership in cross-functional teams and innovation delivery

PREFERRED QUALIFICATIONS:
- Experience with multi-agent AI in BI, Snowflake Cortex, and MLOps tools like Kubeflow
- Proven track record of integrating AI with enterprise BI platforms for business impact

Posted 3 weeks ago

Apply

7.0 - 10.0 years

18 - 30 Lacs

Hyderabad

Work from Office

Power BI Reporting Analyst with hands-on experience in ETL tools, DWH, SQL, and Snowflake. Excellent SQL skills. Good ETL knowledge, with hands-on experience in an ETL tool such as SSIS, Informatica, or Talend. Banking knowledge (understanding of deposits and loans).

Posted 3 weeks ago

Apply

5.0 - 7.0 years

5 - 9 Lacs

India, Bengaluru

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Snowflake Engineer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange - discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.

You'll make a difference by:
- Developing and delivering parts of a product in accordance with the customers' requirements and organizational quality norms
- Communicating within the team as well as with all the stakeholders
- Bringing strong customer focus; being a good learner, highly proactive, and a team player
- Implementing features and/or bug fixes and delivering solutions in accordance with coding guidelines, on time and with high quality
- Identifying and implementing a test strategy to ensure the solution addresses customer requirements and that the quality and security requirements of the product are met

Job requirements/skills:
- 5-7 years' work experience in software engineering, especially in professional software product development
- Strong knowledge of Snowflake, databases and tools
- Strong knowledge of data warehousing, data visualization, BI, ETL, and analytics
- Strong knowledge of RDBMS, stored procedures and triggers
- Strong knowledge of DBT
- Basic knowledge of AWS services
- Knowledge of a programming language such as Python or Java
- Basic experience with Agile/Lean and SAFe practices is preferred

Create a better #TomorrowWithUs! This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries - and the craft of things to come. We're Siemens, a collection of over 312,000 minds building the future, one day at a time, in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse digital minds to develop tomorrow's reality. Find out more about the digital world of Siemens here: /digitalminds

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

India, Bengaluru

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Snowflake Engineer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange - discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.

You'll make a difference by:
- Being responsible for the development and delivery of parts of a product in accordance with the customers' requirements and organizational quality norms
- Communicating well within the team as well as with all the stakeholders
- Bringing strong customer focus; being a good learner, highly proactive, and a team player
- Implementing features and/or bug fixes and delivering solutions in accordance with coding guidelines, on time and with high quality
- Identifying and implementing a test strategy to ensure the solution addresses customer requirements and that the quality and security requirements of the product are met

Job requirements/skills:
- 3-6 years' work experience in software engineering, especially in professional software product development
- Strong knowledge of Snowflake, databases and tools
- Strong knowledge of data warehousing, data visualization, BI, ETL, and analytics
- Strong knowledge of RDBMS, stored procedures and triggers
- Strong knowledge of DBT
- Basic knowledge of Power BI
- Knowledge of software engineering processes
- Basic experience with Agile

Create a better #TomorrowWithUs! This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries - and the craft of things to come. We're Siemens, a collection of over 312,000 minds building the future, one day at a time, in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse digital minds to develop tomorrow's reality. Find out more about the digital world of Siemens here: /digitalminds

Posted 3 weeks ago

Apply

3.0 - 5.0 years

15 - 30 Lacs

Bengaluru

Hybrid

The Modern Data Engineer is responsible for designing, implementing, and maintaining scalable data architectures using cloud technologies, primarily on AWS, to support our next evolutionary stage of the investment process. They build robust data pipelines, optimize data storage and access patterns, and ensure data quality while collaborating across engineering teams to deliver high-value data products.

Key responsibilities:
- Implement and maintain data pipelines for ingestion, transformation, and delivery
- Ensure data quality through validation and monitoring processes
- Collaborate with senior engineers to design scalable data solutions
- Work with business analysts to understand and implement data requirements
- Optimize data models and queries for performance and efficiency
- Follow engineering best practices and contribute to team standards
- Participate in code reviews and knowledge-sharing activities
- Implement data security controls and access policies
- Troubleshoot and resolve data pipeline issues

About you - core technical skills:
- Cloud platforms: proficient with cloud-based data platforms (Snowflake, data lakehouse architecture)
- AWS ecosystem: strong knowledge of AWS services including Lambda, Glue, and S3
- Streaming architecture: understanding of event-based or streaming data concepts using Kafka (see the sketch below)
- Programming: strong proficiency in Python and SQL
- DevOps: experience with CI/CD pipelines and infrastructure as code (Terraform)
- Data security: knowledge of implementing basic data access controls
- Database systems: experience with RDBMS (Oracle, Postgres, MSSQL) and exposure to NoSQL databases
- Data integration: understanding of data integration patterns and techniques
- Orchestration: experience using workflow tools (Airflow, Control-M, etc.)
- Engineering practices: experience with GitHub, code verification, and validation
- Domain knowledge: basic knowledge of investment management industry concepts
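As a small illustration of the event-based ingestion concepts listed above, the sketch below consumes JSON events from Kafka and lands micro-batches in S3. It assumes the kafka-python and boto3 packages; the broker, topic, and bucket names are placeholders.

```python
# Sketch of an event-based ingestion step: consume JSON events from Kafka
# and land micro-batches in S3. Broker, topic, and bucket are placeholders.
import json
from kafka import KafkaConsumer
import boto3

consumer = KafkaConsumer(
    "trade-events",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)
s3 = boto3.client("s3")

batch, BATCH_SIZE = [], 100
for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        # Use the last offset in the batch to build a unique object key.
        key = f"landing/events-{message.offset}.json"
        s3.put_object(
            Bucket="my-data-lake", Key=key,
            Body=json.dumps(batch).encode("utf-8"),
        )
        batch = []
```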

Posted 3 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

- Must have 5+ years of experience in testing
- Must have experience with Snowflake, Azure Cloud, and DWH
- Must have good experience in SQL queries
- Analyse and understand the business requirements, with the ability to convert them into test cases for functional and non-functional components
- Identification, preparation, documentation and execution of smoke, functional, non-functional, automation, regression and E2E test cases
- Experience working in agile teams, collaborating with POs, developers, and testers
- Adaptive team player who can be flexible enough to work on an aggressive timeline, as an individual contributor as well as with the team
- Good communication: articulating defects clearly, documenting the required evidence in JIRA, and re-testing issues until closure
- Experience with JIRA/XRAY
- Experience in the Securities & Capital Markets domain is preferred
- Excellent communication skills

Mandatory skills: ETL testing. Experience: 3-5 years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Role purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Oversee and support the process:
- Review daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory skills: Snowflake. Experience: 5-8 years.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

16 - 21 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain Power BI solutions using DAX, ETL tools, SQL, and Snowflake.
- Collaborate with cross-functional teams on data modeling and reporting requirements.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Job overview: We are looking for a BI & Visualization Developer who will be part of our Analytics Practice and will be expected to actively work in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is to support the design, development and maintenance of business intelligence and analytics solutions.

Responsibilities:
- Develop reports, dashboards, and advanced visualizations; work closely with product managers, business analysts, clients, etc. to understand the needs/requirements and develop the visualizations needed
- Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality
- Learn and develop new visualization techniques as required to keep up with contemporary visualization design and presentation
- Review solution requirements and architecture to ensure the selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies
- Collaborate in design reviews and code reviews to ensure standards are met; recommend new standards for visualizations
- Build and reuse templates/components/web services across multiple dashboards
- Support presentations to customers and partners
- Advise on new technology trends and possible adoption to maintain competitive advantage
- Mentor associates

Experience needed:
- 8+ years of related experience is required
- A Bachelor's or Master's degree in Computer Science or a related technical discipline is required
- Highly skilled in data visualization tools like Power BI, Tableau, QlikView, etc.
- Very good understanding of the Power BI Tabular Model/Azure Analysis Services using large datasets
- Strong SQL coding experience, with performance optimization experience for data queries
- Understands different data models such as normalized, denormalized, star, and snowflake models
- Has worked in big data environments, cloud data stores, different RDBMS and OLAP solutions
- Experience in the design, development, and deployment of BI systems
- Candidates with ETL experience preferred
- Familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery
- Has a strong technical background and remains evergreen with technology and industry developments

Additional requirements:
- Demonstrated ability to have successfully completed multiple, complex technical projects
- Prior experience with application delivery using an onshore/offshore model
- Experience with business processes across multiple master data domains in a services-based company
- Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality
- Demonstrates high standards of professional behavior in dealings with clients, colleagues and staff
- Strong written communication skills; is effective and persuasive in both written and oral communication
- Experience with gathering end-user requirements and writing technical documentation
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure
- May require occasional travel

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by downloading and completing the accommodation request form and emailing it as an attachment to FTADAAA@conduent.com. Conduent's ADAAA Accommodation Policy is also available on request. At Conduent we value the health and safety of our associates, their families and our community. For US applicants: while we do not require vaccination for most of our jobs, we do require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We're looking for a Senior Data Analyst to join our data-driven team at an ad-tech company that thrives on turning complexity into clarity. Our analysts play a critical role in transforming raw, noisy data into accurate, actionable signals that drive real-time decision-making and long-term strategy. You'll work closely with product, engineering, and business teams to uncover insights, shape KPIs, and guide performance optimization.

Responsibilities:
- Analyze large-scale datasets from multiple sources to uncover actionable insights and drive business impact
- Design, monitor, and maintain key performance indicators (KPIs) across ad delivery, bidding, and monetization systems
- Partner with product, engineering, and operations teams to define metrics, run deep-dive analyses, and influence strategic decisions
- Develop and maintain dashboards, automated reports, and data pipelines to ensure data accessibility and accuracy
- Lead investigative analysis of anomalies or unexpected trends in campaign performance, traffic quality, or platform behavior (a simple example follows below)

Requirements:
- BA/BSc in Industrial Engineering and Management, Information Systems Engineering, Economics, Statistics, Mathematics, or a similar background
- 3+ years of experience in data analysis and interpretation (marketing/business/product)
- High proficiency in SQL
- Experience with data visualization of large datasets using BI systems (Qlik Sense, Sisense, Tableau, Looker, etc.)
- Experience working with data warehouse/data lake tools like Athena, Redshift, Snowflake, or BigQuery
- Knowledge of Python - an advantage
- Experience building ETL processes - an advantage
- Fluent in English, both written and spoken - a must
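A common first pass at the anomaly investigation mentioned above is flagging KPI values that deviate sharply from a rolling baseline. Here is a minimal pandas sketch, using an invented daily-revenue series; a real input would come from the warehouse (Athena, Redshift, Snowflake, or BigQuery).

```python
# Flag daily KPI anomalies with a rolling z-score. The series is invented.
import pandas as pd
import numpy as np

rng = np.random.default_rng(seed=7)
revenue = pd.Series(
    rng.normal(10_000, 500, 60),
    index=pd.date_range("2024-01-01", periods=60, freq="D"),
)
revenue.iloc[45] = 4_000  # planted anomaly for the demo

# Compare each day to the trailing two-week baseline.
rolling_mean = revenue.rolling(14).mean()
rolling_std = revenue.rolling(14).std()
z = (revenue - rolling_mean) / rolling_std

anomalies = revenue[z.abs() > 3]
print(anomalies)  # should surface the planted drop around 2024-02-15
```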

Posted 3 weeks ago

Apply

5.0 - 10.0 years

19 - 25 Lacs

Bengaluru

Hybrid

5+ years of experience in data analysis, reporting, or business intelligence. Advanced proficiency in Tableau - ability to create complex, interactive dashboards. Strong SQL skills - experience with Snowflake.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

10 - 13 Lacs

Bengaluru

Hybrid

You can share your resume with: sowmya.v@acesoftlabs.com

Position: Data Engineer. Experience range: 3-8 years.

Key responsibilities:
- Data pipeline development: design, build, and optimize scalable data pipelines; ingest, transform, and load data from multiple sources; use Azure Databricks, Snowflake, and DBT for pipeline orchestration
- Data architecture & modeling: develop and manage data models within Snowflake; ensure efficient data organization, accessibility, and quality
- Data transformation: implement standardized data transformations using DBT (a load-step sketch follows below)
- Performance optimization: monitor pipeline performance, troubleshoot and resolve issues, and optimize workflows for efficiency
- Collaboration: work with data scientists, analysts, and business stakeholders to ensure access to reliable, well-structured data for analytics and reporting

Required qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field
- Proficiency in Azure Databricks for data processing
- Experience with Snowflake as a data warehouse platform
- Hands-on expertise with DBT for data transformations
- Strong SQL skills and an understanding of data modeling principles
- Ability to troubleshoot and optimize complex data workflows

Preferred/additional skills:
- Experience in MS Azure, Snowflake, DBT, and Big Data (Hadoop ecosystem)
- Knowledge of Hadoop architecture and storage frameworks
- Hands-on with Hadoop, Spark, Hive, and Databricks
- Experience with Data Lake solutions using Scala and Python
- Experience with Azure Data Factory (ADF) for orchestration
- Familiarity with CI/CD tools such as Jenkins, Azure DevOps, and GitHub
- Strong programming skills in Python or Scala
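One hedged illustration of the Snowflake load step in such a pipeline, using the write_pandas helper from snowflake-connector-python; the credentials, dataframe, and target table are placeholders, and in this stack the transforms themselves would typically live in Databricks or DBT.

```python
# Sketch: load a transformed dataframe into Snowflake with write_pandas.
# Connection parameters and the table name are illustrative placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame(
    {"CUSTOMER_ID": [1, 2], "LIFETIME_VALUE": [1200.50, 387.00]}
)

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
)
try:
    # write_pandas returns (success, num_chunks, num_rows, output).
    success, _, nrows, _ = write_pandas(
        conn, df, table_name="CUSTOMER_LTV", auto_create_table=True
    )
    print(f"loaded={success}, rows={nrows}")
finally:
    conn.close()
```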

Posted 3 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Hyderabad, Chennai

Work from Office

Key Responsibilities: Design, develop, and maintain data pipelines and data models using Snowflake. Validate and optimize ETL processes to ensure accurate and efficient data movement. Collaborate with data engineers and analysts to understand business requirements and deliver scalable solutions. Perform data quality checks and resolve data issues in the Snowflake environment. Implement best practices for data warehousing, performance tuning, and security in Snowflake. Document ETL processes, data flows, and architectural decisions.
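The data quality checks mentioned above often start as simple assertions run directly against Snowflake. A minimal sketch, assuming the snowflake-connector-python package; the connection details and the ORDERS table are illustrative, not from the posting.

```python
# Sketch: basic data quality checks against a Snowflake table -- row count
# and null-rate on a key column. Table and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="dq_user", password="...",
    warehouse="DQ_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()
try:
    cur.execute("SELECT COUNT(*), COUNT_IF(ORDER_ID IS NULL) FROM ORDERS")
    total, null_keys = cur.fetchone()
    assert total > 0, "ORDERS is unexpectedly empty"
    assert null_keys == 0, f"{null_keys} rows have a NULL ORDER_ID"
    print(f"quality checks passed on {total} rows")
finally:
    cur.close()
    conn.close()
```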

Posted 3 weeks ago

Apply

9.0 - 14.0 years

35 - 50 Lacs

Chennai

Hybrid

Hiring: Power BI Architect. Location: Chennai (hybrid, 3 days in office). Experience: 9+ years. Notice period: immediate to 30 days.

About the role: We are looking for an experienced and visionary Power BI Architect to lead the design, development, and optimization of business intelligence solutions across enterprise data platforms. This role demands deep expertise in Power BI, data modeling, ETL frameworks, and cloud data platforms like Snowflake or Data Lake. You will play a key role in architecting scalable and high-performing BI systems while collaborating closely with stakeholders, data engineers, and analysts.

Key responsibilities:
- Architect end-to-end Power BI solutions including semantic models, ETL pipelines, and secure data access layers
- Design and manage data modeling strategies (star/snowflake schemas), leveraging DAX, Power Query, and best practices
- Lead ETL development for data ingestion from various sources (cloud/on-prem) into Power BI or data lakes
- Define and enforce reporting standards, performance optimization guidelines, and governance frameworks
- Collaborate with engineering teams on data quality, integration, and automation using tools like Azure Data Factory, Dataflows, or Databricks
- Implement advanced security measures such as Row-Level Security (RLS) and data classification
- Mentor Power BI developers and contribute to code reviews, documentation, and solution templates
- Stay updated on the Power BI roadmap and proactively evaluate new features and tools for integration
- Participate in requirement gathering, solution design discussions, and architectural reviews

Required qualifications:
- Bachelor's/Master's degree in Computer Science, Information Systems, or a related field
- 9+ years of experience in business intelligence, with at least 5+ years hands-on in Power BI
- Expertise in DAX, Power Query (M), and building optimized data models
- Strong command of SQL, data warehousing, and dimensional modeling
- Proven experience with Snowflake or Azure Data Lake implementations
- Experience in designing and maintaining enterprise BI architecture and ETL frameworks
- Familiarity with the Azure ecosystem (ADF, Azure SQL DB, Azure Synapse, Databricks) is highly desirable
- Exposure to version control tools like Git/GitHub or Azure DevOps
- Strong communication, stakeholder management, and leadership skills

Nice to have:
- Experience in Power BI Embedded, deployment pipelines, or paginated reports
- Knowledge of CI/CD for BI solutions
- Certifications: Microsoft Certified Power BI Data Analyst Associate or Azure Data Engineer Associate

Why join us?
- Work with a dynamic and collaborative team building cutting-edge analytics solutions
- Flexible hybrid work setup
- Opportunity to architect enterprise-scale BI platforms in a modern data stack environment

Posted 3 weeks ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Chennai

Work from Office

Job summary: We are seeking a highly skilled Data Engineer to design, develop, and maintain robust data pipelines and architectures. The ideal candidate will transform raw, complex datasets into clean, structured, and scalable formats that enable analytics, reporting, and business intelligence across the organization. This role requires strong collaboration with data scientists, analysts, and cross-functional teams to ensure timely and accurate data availability and system performance.

Key responsibilities:
- Design and implement scalable data pipelines to support real-time and batch processing
- Develop and maintain ETL/ELT processes that move, clean, and organize data from multiple sources (an orchestration sketch follows below)
- Build and manage modern data architectures that support efficient storage, processing, and access
- Collaborate with stakeholders to understand data needs and deliver reliable solutions
- Perform data transformation, enrichment, validation, and normalization for analysis and reporting
- Monitor and ensure the quality, integrity, and consistency of data across systems
- Optimize workflows for performance, scalability, and cost-efficiency
- Support cloud and on-premises data integrations, migrations, and automation initiatives
- Document data flows, schemas, and infrastructure for operational and development purposes
- Apply best practices in data governance, security, and compliance

Required qualifications & skills:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
- Proven 6+ years of experience in data engineering, ETL development, or data pipeline management
- Proficiency with tools and technologies such as: SQL, Python, Spark, Scala; ETL tools (e.g., Apache Airflow, Talend); cloud platforms (e.g., AWS, GCP, Azure); big data tools (e.g., Hadoop, Hive, Kafka); data warehouses (e.g., Snowflake, Redshift, BigQuery)
- Strong understanding of data modeling, data architecture, and data lakes
- Experience with CI/CD, version control, and working in Agile environments

Preferred qualifications:
- Experience with data observability and monitoring tools
- Knowledge of data cataloging and governance frameworks
- AWS/GCP/Azure data certification is a plus
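Given the Airflow requirement, the sketch below shows how such an ETL pipeline might be orchestrated as a three-task DAG. It assumes Airflow 2.x; the DAG id, schedule, and task bodies are illustrative placeholders.

```python
# Minimal Airflow DAG sketch: extract -> transform -> load as three tasks.
# dag_id, schedule, and the task bodies are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from a source, e.g. Kafka or an S3 landing zone")

def transform():
    print("clean, validate, and enrich with Spark or pandas")

def load():
    print("write curated data to Snowflake/Redshift/BigQuery")

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```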

Posted 3 weeks ago

Apply

9.0 - 14.0 years

30 - 37 Lacs

Hyderabad

Hybrid

- SQL & database management: deep knowledge of relational databases (SQL/PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake
- ETL/ELT tools: experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines; extensive experience with ETL tools and data pipelines
- Data modeling & optimization: strong understanding of data modeling, OLAP systems, query optimization, and performance tuning
- Cloud & security: familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE)
- Data warehousing: experience managing large datasets and data marts, and optimizing databases for performance
- Agile & CI/CD: knowledge of Agile methodologies and CI/CD automation tools

Important: the candidate should have a strong data engineering background, with hands-on experience handling large volumes of data, data pipelines, and cloud-based data systems, and this should be reflected in the profile.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

12 - 16 Lacs

Gurugram

Hybrid

Primary role responsibilities:
- Develop and maintain data ingestion and transformation pipelines across on-premises and cloud platforms
- Develop scalable ETL/ELT pipelines that integrate data from a variety of sources (e.g., form-based entries, SQL databases, Snowflake, SharePoint)
- Collaborate with data scientists, data analysts, simulation engineers and IT personnel to deliver data engineering and predictive data analytics projects
- Implement data quality checks, logging, and monitoring to ensure reliable operations
- Follow and maintain data versioning, schema evolution, and governance controls and guidelines
- Help administer Snowflake environments for cloud analytics
- Work with more senior staff to improve solution architectures and automation
- Stay updated on the latest data engineering technologies and trends
- Participate in code reviews and knowledge-sharing sessions
- Participate in and plan new data projects that impact business and technical domains

Required qualifications:
- Bachelor's or Master's degree in computer science, data engineering, or a related field
- 1-3 years of experience in data engineering, ETL/ELT development, and/or backend software engineering
- Demonstrated expertise in Python and SQL
- Demonstrated experience working with data lakes and/or data warehouses (e.g., Snowflake, Databricks, or similar)
- Familiarity with source control and development practices (e.g., Git, Azure DevOps)
- Strong problem-solving skills and eagerness to work with cross-functional, globalized teams

Preferred qualifications (in addition to the required qualifications):
- Working experience and knowledge of scientific and R&D workflows, including simulation data and LIMS systems
- Demonstrated ability to balance operational support and longer-term project contributions
- Experience with Java
- Strong communication and presentation skills
- A motivated and self-driven learner

Posted 3 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Your impact: As a developer, you will work as part of a highly skilled team of professionals responsible for architecting, designing and developing cost-effective and sustainable solutions for the Security Product business of OpenText. Strong organizational skills, technical expertise and attention to detail are key in this customer-focused role.

What the role offers:
- Translate business requirements using complex methods/models to determine appropriate system solutions
- Work within a cross-functional team to provide technical expertise in the design and planning of system solutions
- Research, identify, test, certify, and select the technology required for solution delivery
- Maximize the performance, uptime, and supportability of the product
- Develop a highly scalable security product using technologies such as Java, J2EE, REST, Azure, AWS, GCP and Snowflake
- Work with the team to design solutions to security problems; monitor and analyze the security vulnerabilities reported in bundled third-party products
- Design and implement new interface components in collaboration with the product owner and other OpenText development teams
- Collaborate with engineering and development partners to develop reliable, cost-effective, and high-quality software solutions
- Maintain existing components and resolve problems reported by customers
- Enhance existing components with new capabilities while maintaining compatibility
- Provide feedback on test plans, test cases, and test methodologies
- Research new technologies for product improvements and the future roadmap
- Communicate with stakeholders, report project progress, and highlight any risks involved along with a mitigation plan

What you need to succeed:
- Bachelor's or master's degree in computer science, information systems, or equivalent
- 2-5 years of software development experience building large-scale and highly distributed applications
- Experience developing highly scalable security products using technologies such as Java, J2EE, REST/SOAP, AWS, GCP, Snowflake, Azure
- Demonstrated ability to have completed multiple, complex technical projects
- Strong programming skills in Java and J2EE
- Experience in cloud (AWS, GCP or Azure) is a must
- Experience working in a DevOps, continuous integration environment
- Excellent communication skills and the ability to interact effectively with both technical and non-technical staff
- In-depth technical experience in the IT infrastructure area; understanding of the operational challenges involved in managing complex systems
- Previous experience being a part of complex integration projects
- Technical execution of project activities and responsibility for on-time delivery and results
- Interfacing with customer-facing functions to gather project requirements and performing due diligence as required
- Providing technical guidance for troubleshooting and issue resolution when needed
- Familiarity with Agile software development (preferably Scrum)
- Unit testing and mocking frameworks like Mockito

Desired skills:
- Understanding of the security domain
- Experience in Azure, AWS, GCP and Hadoop
- Working knowledge of Linux
- Cloud technologies and cloud application development
- Good knowledge of security threat models and of various security encryption techniques
- Knowledge of different types of security vulnerabilities, attack vectors and common types of cyberattacks

Posted 3 weeks ago

Apply