9.0 - 14.0 years
8 - 12 Lacs
Hyderabad
Work from Office
- Develop and maintain a metadata-driven, generic ETL framework for automating ETL code.
- Design, build, and optimize ETL/ELT pipelines using Databricks (PySpark/SQL) on AWS.
- Ingest data from a variety of structured and unstructured sources (APIs, RDBMS, flat files, streaming).
- Develop and maintain robust data pipelines for batch and streaming data using Delta Lake and Spark Structured Streaming.
- Implement data quality checks, validations, and logging mechanisms.
- Optimize pipeline performance, cost, and reliability.
- Collaborate with data analysts, BI, and business teams to deliver fit-for-purpose datasets.
- Support data modeling efforts (star and snowflake schemas, denormalized table approaches) and assist with data warehousing initiatives.
- Work with orchestration tools such as Databricks Workflows to schedule and monitor pipelines.
- Follow best practices for version control, CI/CD, and collaborative development.
Skills
- Hands-on experience in ETL/data engineering roles.
- Strong expertise in Databricks (PySpark, SQL, Delta Lake); Databricks Data Engineer certification preferred.
- Experience with Spark optimization, partitioning, caching, and handling large-scale datasets.
- Proficiency in SQL and scripting in Python or Scala.
- Solid understanding of data lakehouse/medallion architectures and modern data platforms.
- Experience working with cloud storage systems such as AWS S3.
- Familiarity with DevOps practices (Git, CI/CD, Terraform, etc.).
- Strong debugging, troubleshooting, and performance-tuning skills.
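For illustration, a minimal sketch of the kind of streaming Delta Lake pipeline this role describes, assuming a Databricks cluster where `spark` is provided and Delta Lake is enabled; all paths and column names are hypothetical placeholders, not part of the posting.

# Minimal sketch: ingest streaming JSON from S3 and append to a Delta table.
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("payload", StringType()),
])

raw = (spark.readStream
       .format("json")
       .schema(event_schema)
       .load("s3://example-bucket/raw/events/"))          # hypothetical source path

cleaned = (raw
           .dropDuplicates(["event_id"])                   # basic data quality check
           .withColumn("ingest_date", F.to_date("event_ts")))

(cleaned.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .partitionBy("ingest_date")                            # partitioning helps downstream query pruning
    .start("s3://example-bucket/curated/events/"))         # hypothetical Delta table path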
Posted 1 day ago
8.0 - 13.0 years
5 - 10 Lacs
Mumbai
Work from Office
Senior developer with 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Good knowledge of data warehouse tools is needed to understand the existing system. The candidate should also have experience with data lakes, Teradata, and Snowflake, and should be proficient in Terraform.
- 8-10 years of experience in designing and developing Python and PySpark applications.
- Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools.
- Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
- Sound knowledge of data lake concepts and the ability to work on data migration projects.
- Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues.
- Expertise in practices such as Agile, peer reviews, and CI/CD pipelines.
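For illustration, a minimal AWS Glue job skeleton in PySpark of the kind this role involves: read a catalogued table and write curated Parquet back to S3 for Athena. The database, table, and bucket names are hypothetical; a real job would be registered in Glue with this script as its entry point.

# Hedged sketch of a Glue PySpark job (names are placeholders).
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (database and table assumed to exist).
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_orders")

# Simple transformation via the underlying Spark DataFrame.
orders = source.toDF().filter("order_status = 'COMPLETE'")

# Write curated output as Parquet for Athena to query.
(orders.write
    .mode("overwrite")
    .parquet("s3://example-bucket/curated/orders/"))

job.commit()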
Posted 1 day ago
8.0 - 13.0 years
5 - 10 Lacs
Hyderabad
Work from Office
Senior developer with 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Good knowledge of data warehouse tools is needed to understand the existing system. The candidate should also have experience with data lakes, Teradata, and Snowflake, and should be proficient in Terraform.
- 8-10 years of experience in designing and developing Python and PySpark applications.
- Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools.
- Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
- Sound knowledge of data lake concepts and the ability to work on data migration projects.
- Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues.
- Expertise in practices such as Agile, peer reviews, and CI/CD pipelines.
Posted 1 day ago
6.0 - 11.0 years
8 - 13 Lacs
Pune
Work from Office
P1, C2, STS
- 6+ years of strong hands-on experience in machine learning.
- Experience in Python coding, data pre-processing, and data modelling.
- Good experience with cloud platforms and Kubernetes.
- Experience with SQL queries and Snowflake.
- Should be able to perform in an individual-contributor role.
Skills: Machine Learning, Data Modelling and Analysis, SQL, Kubernetes
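For illustration, a small scikit-learn sketch of the pre-processing plus modelling workflow this role describes. The CSV path, feature names, and target column are hypothetical.

# Hedged sketch: preprocessing pipeline + classifier (column names are placeholders).
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler, OneHotEncoder
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("training_data.csv")                      # hypothetical input file
X, y = df.drop(columns=["label"]), df["label"]

preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["amount", "tenure_days"]),                    # assumed numeric columns
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["segment"]),            # assumed categorical column
])

model = Pipeline([("prep", preprocess),
                  ("clf", RandomForestClassifier(n_estimators=200, random_state=42))])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))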
Posted 1 day ago
9.0 - 14.0 years
12 - 16 Lacs
Pune
Work from Office
Skills required: strong SQL (minimum 6-7 years of experience), data warehousing, ETL.
The Data and Client Platform Tech project provides all data-related services to internal and external clients of the SST business. The ingestion team is responsible for acquiring and ingesting data into the data lake. This is a global team with development teams in Shanghai, Pune, Dublin, and Tampa. The ingestion team uses Big Data technologies such as Impala, Hive, Spark, and HDFS, as well as cloud technologies such as Snowflake for cloud data storage.
Responsibilities:
- Gain an understanding of the complex domain model and define the logical and physical data model for the Securities Services business.
- Constantly improve the ingestion, storage, and performance processes by analyzing them and automating them wherever possible.
- Define standards and best practices for the team in the areas of code standards, unit testing, continuous integration, and release management.
- Improve the performance of queries over lake tables and views.
- Work with a wide variety of stakeholders (source systems, business sponsors, product owners, scrum masters, enterprise architects) and possess excellent communication skills to articulate challenging technical details to different audiences.
- Work in Agile Scrum and complete all assigned tasks (JIRAs) per sprint timelines and standards.
Qualifications
- 5-8 years of relevant experience in data development, ETL, data ingestion, and performance optimization.
- Strong SQL skills are essential; experience writing complex queries spanning multiple tables is required.
- Knowledge of Big Data technologies (Impala, Hive, Spark) is nice to have.
- Working knowledge of performance tuning of database queries: understanding the inner workings of the query optimizer, query plans, indexes, partitions, etc.
- Experience in systems analysis and programming of software applications in SQL and other Big Data query languages.
- Working knowledge of data modelling and dimensional modelling tools and techniques.
- Knowledge of high-volume data ingestion and high-volume historic data processing is required.
- Exposure to scripting languages such as shell scripting and Python is required.
- Working knowledge of consulting/project management techniques and methods.
- Knowledge of working in Agile Scrum teams and processes.
- Experience in data quality, data governance, DataOps, and the latest data management techniques is a plus.
Education: Bachelor's degree / university degree or equivalent experience.
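For illustration, a small PySpark sketch of the lake-table ingestion and tuning work this posting describes: read a partitioned Hive/Impala table, filter on the partition column so the engine can prune partitions, and publish a curated table. Database, table, and column names are hypothetical.

# Hedged sketch of partition-pruned reads and a curated aggregate table.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("ingestion-example")
         .enableHiveSupport()
         .getOrCreate())

# Filtering on business_date (assumed partition column) lets Spark prune partitions
# instead of scanning the full table history.
trades = spark.sql("""
    SELECT account_id, trade_id, notional, business_date
    FROM lake_db.trades
    WHERE business_date = '2024-01-31'
""")

daily_positions = (trades
                   .groupBy("account_id")
                   .agg(F.sum("notional").alias("total_notional")))

daily_positions.write.mode("overwrite").saveAsTable("lake_db.daily_positions")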
Posted 1 day ago
8.0 - 13.0 years
8 - 13 Lacs
Hyderabad
Work from Office
P2-C3-STS JD: Data Warehouse. In this role you will be part of a team working to develop solutions that enable the business to leverage data as an asset at the bank. As a Lead ETL Developer, you will lead teams to develop, maintain, and enhance code, ensuring all IT SDLC processes are documented and practiced, and working closely with multiple technology teams across the enterprise. The Lead ETL Developer should have extensive knowledge of data warehousing and cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.
Key Responsibilities:
- Translate requirements and data mapping documents into a technical design.
- Develop, enhance, and maintain code following best practices and standards.
- Create and execute unit test plans; support regression and system testing efforts.
- Debug and resolve issues found during testing and/or production.
- Communicate status, issues, and blockers with the project team.
- Support continuous improvement by identifying and acting on improvement opportunities.
Basic Qualifications
- Bachelor's degree or military experience in a related field (preferably computer science).
- At least 5 years of experience in ETL development within a data warehouse.
- Deep understanding of enterprise data warehousing best practices and standards.
- Strong experience in software engineering: designing, developing, and operating robust and highly scalable cloud infrastructure services.
- Strong experience with Python/PySpark, DataStage ETL, and SQL development.
- Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably with Snowflake.
- Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies.
- Understanding of authentication and authorization services and identity and access management.
- Strong communication and interpersonal skills.
- Strong organization skills and the ability to work independently as well as with a team.
Preferred Qualifications
- AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional, and/or AWS Certified Solutions Architect Professional.
- Experience defining future-state roadmaps for data warehouse applications.
- Experience leading teams of developers within a project.
- Experience in the financial services (banking) industry.
Mandatory Skills: ETL/data warehouse concepts, AWS, Glue, SQL, Python, Snowflake, CI/CD tools (Jenkins, GitHub)
Secondary Skills: Zena, PySpark, Infogix
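For illustration, a minimal sketch of a warehouse loading step of the kind this role covers, using the Snowflake Python connector to stage a validated file and COPY it into a table. Account, stage, and table names are hypothetical, and credentials would normally come from a secrets manager.

# Hedged sketch: PUT + COPY INTO with the Snowflake Python connector (names are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",                 # placeholder; use a vault/secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Stage the file and COPY it into the target table (stage assumed to exist).
    cur.execute("PUT file:///tmp/daily_accounts.csv @ETL_STAGE AUTO_COMPRESS=TRUE")
    cur.execute("""
        COPY INTO STAGING.DAILY_ACCOUNTS
        FROM @ETL_STAGE/daily_accounts.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Basic post-load validation.
    cur.execute("SELECT COUNT(*) FROM STAGING.DAILY_ACCOUNTS")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()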
Posted 1 day ago
6.0 - 11.0 years
3 - 7 Lacs
Pune
Work from Office
Experience: 7-9 years.
- Experience in AWS services is a must: S3, Lambda, Airflow, Glue, Athena, Lake Formation, Step Functions, etc.
- Experience programming in Java and Python.
- Experience performing data analysis (not data science) on AWS platforms.
Nice to have:
- Experience with Big Data technologies (Teradata, Snowflake, Spark, Redshift, Kafka, etc.).
- Experience with data management processes on AWS is a huge plus.
- Experience implementing complex ETL transformations on AWS using Glue.
- Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
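For illustration, a small boto3 sketch of running an Athena query over S3 data, the kind of AWS-side data analysis this role calls for. The region, database, table, and bucket names are hypothetical.

# Hedged sketch: submit an Athena query and print the results (simplified polling).
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")   # assumed region

run = athena.start_query_execution(
    QueryString="SELECT customer_id, COUNT(*) AS orders FROM orders GROUP BY customer_id LIMIT 20",
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = run["QueryExecutionId"]

# Poll until the query finishes (production code would add timeouts and backoff).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows[1:]:                                    # first row is the header
        print([col.get("VarCharValue") for col in row["Data"]])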
Posted 1 day ago
3.0 - 7.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Talend:
- Design, develop, and document existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
AWS / Snowflake:
- Design, develop, and maintain data models using SQL and Snowflake / AWS Redshift-specific features.
- Collaborate with stakeholders to understand the requirements of the data warehouse.
- Implement data security, privacy, and compliance measures.
- Perform data analysis, troubleshoot data issues, and provide technical support to end users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Stay current with new AWS/Snowflake services and features and recommend improvements to the existing architecture.
- Design and implement scalable, secure, and cost-effective cloud solutions using AWS / Snowflake services.
- Collaborate with cross-functional teams to understand requirements and provide technical guidance.
Posted 1 day ago
6.0 - 11.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Senior developer with 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Good knowledge of data warehouse tools is needed to understand the existing system. The candidate should also have experience with data lakes, Teradata, and Snowflake, and should be proficient in Terraform.
- 8-10 years of experience in designing and developing Python and PySpark applications.
- Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools.
- Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
- Sound knowledge of data lake concepts and the ability to work on data migration projects.
- Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues.
- Expertise in practices such as Agile, peer reviews, and CI/CD pipelines.
Posted 1 day ago
8.0 - 13.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Responsibilities
- Design and develop scalable data pipelines: build and maintain robust data pipelines using Python to process, transform, and integrate large-scale data from diverse sources.
- Orchestration and automation: implement and manage workflows using orchestration tools such as Apache Airflow to ensure reliable and efficient data operations.
- Data warehouse management: work extensively with Snowflake to design and optimize data models, schemas, and queries for analytics and reporting.
- Queueing systems: leverage message queues such as Kafka, SQS, or similar tools to enable real-time or batch data processing in distributed environments.
- Collaboration: partner with Data Science, Product, and Engineering teams to understand data requirements and deliver solutions that align with business objectives.
- Performance optimization: optimize the performance of data pipelines and queries to handle large volumes of data efficiently.
- Data governance and security: ensure compliance with data governance and security standards to maintain data integrity and privacy.
- Documentation: create and maintain clear, detailed documentation for data solutions, pipelines, and workflows.
Qualifications
Required Skills:
- 5+ years of experience in data engineering roles with a focus on building scalable data solutions.
- Proficiency in Python for ETL, data manipulation, and scripting.
- Hands-on experience with Snowflake or equivalent cloud-based data warehouses.
- Strong knowledge of orchestration tools such as Apache Airflow or similar.
- Expertise in implementing and managing messaging queues such as Kafka, AWS SQS, or similar.
- Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data.
- Experience in data modeling, data warehousing, and database design.
- Proficiency in working with cloud platforms such as AWS, Azure, or GCP.
- Strong understanding of CI/CD pipelines for data engineering workflows.
- Experience working in an Agile development environment, collaborating with cross-functional teams.
Preferred Skills:
- Familiarity with other programming languages such as Scala or Java for data engineering tasks.
- Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
- Experience with stream processing frameworks such as Apache Flink.
- Experience with Apache Iceberg for data lake optimization and management.
- Exposure to machine learning workflows and their integration with data pipelines.
Soft Skills:
- Strong problem-solving skills with a passion for solving complex data challenges.
- Excellent communication and collaboration skills to work with cross-functional teams.
- Ability to thrive in a fast-paced, innovative environment.
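For illustration, a minimal Airflow DAG sketch of the extract, transform, and load-to-Snowflake flow described above, assuming Airflow 2.x. The task bodies are placeholders, and the DAG id, schedule, and helper functions are hypothetical.

# Hedged sketch of a daily three-step pipeline in Airflow.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # e.g. pull a batch of messages from Kafka/SQS and land them on S3
    ...

def transform(**context):
    # e.g. clean and conform the landed batch with pandas or PySpark
    ...

def load_to_snowflake(**context):
    # e.g. COPY the conformed files into a Snowflake staging table
    ...

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    t_extract >> t_transform >> t_load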
Posted 1 day ago
4.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Skill: Data Engineer. Role: T3, T2.
Key responsibility: Data Engineer. Must have 5+ years of experience in the skills mentioned below.
Must have: Big Data concepts, Python (core Python, able to write code), SQL, shell scripting, AWS S3.
Good to have: event-driven architecture / AWS SQS, microservices, API development, Kafka, Kubernetes, Argo, Amazon Redshift, Amazon Aurora.
Posted 1 day ago
3.0 - 8.0 years
4 - 9 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Role & responsibilities
Job Title: Snowflake Data Engineer (TL) — Snowflake with Python
Location: Bangalore, Hyderabad, Pune, Noida, Kolkata
Work Experience: 3 to 12 years
Role & Required Skills:
- Proven experience in Snowflake.
- Good experience in SQL and Python.
- Experience in data warehousing.
- Experience in data migration from SQL to Snowflake.
- AWS experience is nice to have.
- Good communication skills.
Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
- Analyze and translate business needs into long-term solution data models.
- Evaluate existing data systems.
- Work with the development team to create conceptual data models and data flows.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Implement data strategies and develop physical data models.
- Update and optimize local and metadata models.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems.
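For illustration, a hedged sketch of the SQL-to-Snowflake migration pattern mentioned above: read a table from the source RDBMS with pandas, then push it into Snowflake with write_pandas. The connection strings, table names, and credentials are hypothetical, and SQL Server is only an assumed source.

# Hedged sketch: copy one table from a source database into Snowflake.
import pandas as pd
import sqlalchemy
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Source: any SQLAlchemy-supported database (SQL Server shown as an assumption).
source_engine = sqlalchemy.create_engine(
    "mssql+pyodbc://etl_user:***@source-host/sales?driver=ODBC+Driver+17+for+SQL+Server")
df = pd.read_sql("SELECT * FROM dbo.customers", source_engine)

# Target: Snowflake.
conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="***",
    warehouse="MIGRATION_WH", database="SALES", schema="PUBLIC")

success, n_chunks, n_rows, _ = write_pandas(
    conn, df, table_name="CUSTOMERS",
    auto_create_table=True)     # available in recent connector versions; otherwise pre-create the table
print(f"loaded {n_rows} rows in {n_chunks} chunks, success={success}")
conn.close()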
Posted 1 day ago
12.0 - 17.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Data Tester Highlights:
- 5+ years of experience in data testing.
- ETL testing: validating the extraction, transformation, and loading (ETL) of data from various sources.
- Data validation: ensuring the accuracy, completeness, and integrity of data in databases and data warehouses.
- SQL proficiency: writing and executing SQL queries to fetch and analyze data.
- Data modeling: understanding data models, data mappings, and architectural documentation.
- Test case design: creating test cases and test data, and executing test plans.
- Troubleshooting: identifying and resolving data-related issues.
- Dashboard testing: validating dashboards for accuracy, functionality, and user experience.
- Collaboration: working with developers and other stakeholders to ensure data quality and functionality.
Primary Responsibilities — Dashboard Testing Components:
- Functional testing: simulating user interactions and clicks to ensure dashboards are functioning correctly.
- Performance testing: evaluating dashboard responsiveness and load times.
- Data quality testing: verifying that the data displayed on dashboards is accurate, complete, and consistent.
- Usability testing: assessing the ease of use and navigation of dashboards.
- Data visualization testing: ensuring charts, graphs, and other visualizations are accurate and present data effectively.
- Security testing: verifying that dashboards are secure and protect sensitive data.
Tools and Technologies:
- SQL: used for querying and validating data; hands-on Snowflake experience.
- ETL tools: tools like Talend, Informatica, or Azure Data Factory used for data extraction, transformation, and loading.
- Data visualization tools: Tableau, Power BI, or other BI tools used for creating and testing dashboards.
- Testing frameworks: frameworks like Selenium or JUnit used for automating testing tasks.
- Cloud platforms: AWS platforms used for data storage and processing.
- Healthcare domain knowledge is a plus.
Secondary Skills: automation frameworks, life science domain experience, UI testing, API testing, other ETL tools.
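For illustration, a small pytest-style sketch of the source-to-target reconciliation checks an ETL/data tester writes: compare row counts and a control total between the staging and mart copies of a table. The connection details and table names are hypothetical.

# Hedged sketch of data reconciliation tests against Snowflake.
import pytest
import snowflake.connector

def _scalar(conn, sql):
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

@pytest.fixture(scope="module")
def wh_conn():
    conn = snowflake.connector.connect(
        account="example_account", user="qa_user", password="***",
        warehouse="QA_WH", database="ANALYTICS", schema="PUBLIC")
    yield conn
    conn.close()

def test_row_counts_match(wh_conn):
    src = _scalar(wh_conn, "SELECT COUNT(*) FROM STAGING.CLAIMS")
    tgt = _scalar(wh_conn, "SELECT COUNT(*) FROM MART.CLAIMS")
    assert src == tgt, f"row count mismatch: staging={src}, mart={tgt}"

def test_paid_amount_totals_match(wh_conn):
    src = _scalar(wh_conn, "SELECT SUM(paid_amount) FROM STAGING.CLAIMS")
    tgt = _scalar(wh_conn, "SELECT SUM(paid_amount) FROM MART.CLAIMS")
    assert abs(src - tgt) < 0.01, "paid_amount totals diverge between staging and mart"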
Posted 1 day ago
6.0 - 11.0 years
5 - 9 Lacs
Gurugram
Work from Office
QA Automation. The successful candidate will be part of the ISS Technology QE team. The Senior Test Analyst's key responsibilities include developing a firm understanding of the system; creating and maintaining automation test plans; creating frameworks; analysing metrics from application and system logs; and simulating system behaviour to improve the performance and reliability of applications. The senior analyst will interact with a wide variety of stakeholders, including programme managers, development managers, architects, delivery leads, business analysts, test managers, and functional automation testers, so clear communication skills are very important.
Key Responsibilities
- Establish test objectives, acceptance criteria, workload profiles, and use-case scenarios.
- Work with product managers and developers to design, develop, and maintain the automation framework used in development and testing cycles.
- Work closely with development teams and architects to test the application under load and make recommendations to improve performance, reliability, and stability.
- Utilize innovative test technologies to develop a product's automation testing strategy.
Mandatory Skills
- Good grasp of SDLC, STLC, and the defect lifecycle.
- Writing and understanding complex database queries in various databases (Oracle, Snowflake).
- AWS/Azure concepts.
- Experience in maintaining automation suites.
- Java/Python/Selenium.
- Familiarity with CI/CD tools.
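For illustration, a small Selenium (Python) sketch of the kind of automated UI check this role describes: load a page, wait for a known element, and assert that it rendered. The URL, locator, and expected behaviour are hypothetical.

# Hedged sketch of a basic automated UI check with Selenium.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("https://dashboards.example.internal/portfolio")       # hypothetical URL

    # Wait for the summary widget to render before asserting on it.
    total_cell = WebDriverWait(driver, 30).until(
        EC.visibility_of_element_located((By.ID, "total-positions")))  # assumed element id

    assert total_cell.text.strip() != "", "summary widget rendered empty"
    print("dashboard summary rendered:", total_cell.text)
finally:
    driver.quit()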
Posted 1 day ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
P2-C3-STS JD. In this role you will be part of a team working to develop solutions that enable the business to leverage data as an asset at the bank. The Senior ETL Developer should have extensive knowledge of data warehousing and cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.
- Translate requirements and data mapping documents into a technical design.
- Develop, enhance, and maintain code following best practices and standards.
- Create and execute unit test plans; support regression and system testing efforts.
- Debug and resolve issues found during testing and/or production.
- Communicate status, issues, and blockers with the project team.
- Support continuous improvement by identifying and acting on improvement opportunities.
Basic Qualifications
- Bachelor's degree or military experience in a related field (preferably computer science).
- At least 5 years of experience in ETL development within a data warehouse.
- Deep understanding of enterprise data warehousing best practices and standards.
- Strong experience in software engineering: designing, developing, and operating robust and highly scalable cloud infrastructure services.
- Strong experience with Python/PySpark, DataStage ETL, and SQL development.
- Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably with Snowflake.
- Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies.
- Understanding of authentication and authorization services and identity and access management.
- Strong communication and interpersonal skills.
- Strong organization skills and the ability to work independently as well as with a team.
Preferred Qualifications
- AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional, and/or AWS Certified Solutions Architect Professional.
- Experience defining future-state roadmaps for data warehouse applications.
- Experience leading teams of developers within a project.
- Experience in the financial services (banking) industry.
Mandatory Skills: ETL/data warehouse concepts, Snowflake, AWS, Glue, CI/CD tools (Jenkins, GitHub), Python, DataStage
Secondary Skills: Zena, PySpark, Infogix
Posted 1 day ago
8.0 - 13.0 years
4 - 8 Lacs
Mumbai
Work from Office
Senior developer with 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Good knowledge of data warehouse tools is needed to understand the existing system. The candidate should also have experience with data lakes, Teradata, and Snowflake, and should be proficient in Terraform.
- 8-10 years of experience in designing and developing Python and PySpark applications.
- Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools.
- Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
- Sound knowledge of data lake concepts and the ability to work on data migration projects.
- Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues.
- Expertise in practices such as Agile, peer reviews, and CI/CD pipelines.
Posted 1 day ago
7.0 - 12.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Role: IICS Developer. Work mode: hybrid. Work timings: 2pm to 11pm. Location: Chennai & Hyderabad.
Primary Skills: IICS
Job Summary: We are looking for a highly experienced Senior Lead Data Engineer with strong expertise in Informatica IICS, Snowflake, Unix/Linux shell scripting, CI/CD tools, Agile, and cloud platforms. The ideal candidate will lead complex data engineering initiatives, optimize data architecture, and drive automation while ensuring high standards of data quality and governance within an agile environment.
Required Qualifications:
- Minimum 5+ years of experience in data warehousing and data warehouse concepts; extensive experience in Informatica IICS and Snowflake.
- Experience in designing, developing, and maintaining data integration solutions using IICS.
- Experience in designing, implementing, and optimizing data storage and processing solutions using Snowflake.
- Design and execute complex SQL queries for data extraction, transformation, and analysis.
- Strong proficiency in Unix/Linux shell scripting and SQL.
- Extensive expertise in CI/CD tools and ESP scheduling.
- Experience working in agile environments, with a focus on iterative improvements and collaboration.
- Knowledge of SAP Data Services is an added advantage.
- Expertise in cloud platforms (AWS, Azure).
- Proven track record in data warehousing, data integration, and data governance.
- Excellent data analysis and data profiling skills.
- Collaborate with stakeholders to define data requirements and develop effective data strategies.
- Strong leadership and communication skills, with the ability to drive strategic data initiatives.
Posted 1 day ago
8.0 - 13.0 years
4 - 8 Lacs
Hyderabad
Work from Office
- Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork.
- Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers.
- Collaborate with back-end web developers and programmers to improve usability.
- Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.
- Convert jobs from Talend ETL to Python and convert the lead SQLs to Snowflake.
- Developers with Python and SQL skills: developers should be proficient in Python (especially pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential.
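For illustration, a hedged pandas sketch of the Talend-to-Python conversion pattern described above: read a source extract, apply the transformation the Talend job is assumed to perform, and stage the result for loading into Snowflake. The file names, columns, and rules are hypothetical.

# Hedged sketch of an ETL step rewritten from a graphical tool into pandas.
import pandas as pd

orders = pd.read_csv("exports/orders_extract.csv", parse_dates=["order_date"])

# Transformation rules assumed to mirror the original Talend job.
orders = (orders
          .dropna(subset=["order_id"])                       # reject rows missing a key
          .assign(order_month=lambda d: d["order_date"].dt.to_period("M").astype(str),
                  net_amount=lambda d: d["gross_amount"] - d["discount_amount"]))

summary = (orders
           .groupby(["customer_id", "order_month"], as_index=False)
           .agg(order_count=("order_id", "count"),
                net_revenue=("net_amount", "sum")))

# Staged as compressed CSV for a Snowflake PUT / COPY INTO step downstream.
summary.to_csv("staging/order_summary.csv.gz", index=False, compression="gzip")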
Posted 1 day ago
7.0 - 12.0 years
4 - 8 Lacs
Pune
Work from Office
1. ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica.
2. Big Data: Experience with big data platforms such as Hadoop, Hive, or Snowflake for data storage and processing.
3. Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
4. Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures.
5. Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala.
6. DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management.
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>It, Data Profiler, Conduct>It, Control Center, and Continuous Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Basics of job schedulers such as Autosys; basics of entitlement management.
Certification in any of the above topics would be an advantage.
Posted 1 day ago
10.0 - 15.0 years
12 - 16 Lacs
Gurugram
Work from Office
1. Experience working with AWS cloud services (S3, AWS Glue, Glue Catalog, Step Functions, Lambda, EventBridge, etc.).
2. Must have hands-on experience with DQ libraries for data quality checks.
3. Proficiency in data modelling and database management.
4. Strong programming skills in Python and Unix, and in ETL technologies such as Informatica.
5. Experience with DevOps and Agile methodology and associated toolsets (including working with code repositories).
6. Knowledge of big data technologies such as Hadoop and Spark.
7. Must have hands-on experience with reporting tools: Tableau, QuickSight, and MS Power BI.
8. Must have hands-on experience with databases such as Postgres and MongoDB.
9. Experience using industry-recognised frameworks; experience with StreamSets & Kafka is preferred.
10. Experience in data sourcing, including real-time data integration.
11. Proficiency in Snowflake Cloud and the associated data migration from on-premise to cloud, with knowledge of databases such as Snowflake, Azure Data Lake, and Postgres.
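For illustration, a minimal PySpark sketch of the kind of data-quality checks mentioned in point 2, written here by hand rather than with a specific DQ library: null, uniqueness, and allowed-value checks over an S3 dataset. The paths and column names are hypothetical.

# Hedged sketch of hand-rolled data-quality checks in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/transactions/")

total = df.count()
checks = {
    "transaction_id not null": df.filter(F.col("transaction_id").isNull()).count() == 0,
    "transaction_id unique": df.select("transaction_id").distinct().count() == total,
    "currency in allowed set": df.filter(~F.col("currency").isin("INR", "USD", "EUR")).count() == 0,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"data quality checks failed: {failed}")
print(f"all {len(checks)} checks passed on {total} rows")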
Posted 1 day ago
14.0 - 19.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Skill: Data Engineer. Role: T2, T1.
Key responsibility: Data Engineer. Must have 9+ years of experience in the skills mentioned below.
Must have: Big Data concepts, Python (core Python, able to write code), SQL, shell scripting, AWS S3.
Good to have: event-driven architecture / AWS SQS, microservices, API development, Kafka, Kubernetes, Argo, Amazon Redshift, Amazon Aurora.
Posted 1 day ago
4.0 - 9.0 years
6 - 11 Lacs
Hyderabad
Work from Office
Gen AI
The company is a global leader in industrial packaging products and services. We are committed to providing innovative solutions that enhance our customers' productivity and sustainability. Our team is dedicated to excellence, and we strive to create a collaborative and inclusive work environment.
Position Overview: The company is seeking a skilled and motivated AI Engineer to join our dynamic team. The ideal candidate will have 2 to 4 years of experience in developing and deploying GenAI and AI/ML solutions to production. This requires hands-on experience with no-code, low-code, and SDK approaches to building AI systems. The candidate should be proficient in working with data platforms such as Microsoft Azure and Snowflake, and GenAI platforms such as Azure AI Foundry, Azure OpenAI, Copilot Studio, and ChatGPT. The ability to manage small projects with minimal supervision and a working knowledge of the Agile methodology are essential. The candidate must be comfortable with ambiguity and a fast-paced PoC (proof of concept) delivery schedule.
Key Responsibilities:
- Focus on designing and developing proofs of concept (PoCs) and demonstrating solutions on a tight schedule.
- Utilize GenAI no-code, low-code, and SDK tooling to build robust GenAI agents that automate business processes.
- Work with data platforms such as Microsoft Azure and Snowflake, and integration services such as Azure Data Factory, to build agentic workflows.
- Embed/integrate GenAI agents (Copilot agents) into business platforms such as Workday, Teams, etc.
- Manage small to medium-sized projects with minimal supervision.
- Apply Agile methodology to ensure efficient project delivery.
- Make informed decisions under uncertainty and adapt to changing project requirements.
Qualifications:
- Bachelor's or master's degree in AI, computer science, engineering, mathematics, or a related field.
- 2 to 4 years of experience in developing and deploying AI/ML solutions to production.
- Hands-on experience with no-code, low-code, and SDK-based AI system development.
- Proficiency in data platforms such as Microsoft Azure and Snowflake, and integration services such as Azure Data Factory.
- Experience with Azure Cloud, Azure AI Foundry, Copilot Studio, and frameworks such as LangChain, LangGraph, and MCP for building agentic systems.
- Strong understanding of Agile methodology and project management.
- Ability to manage projects independently and make decisions under ambiguity.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Skills:
- Automation of complex processes using GenAI agents, especially within the Azure GenAI ecosystem.
- Advanced Python programming.
- Hands-on experience with data storage systems, especially Snowflake, Azure Data Factory, Azure Fabric, and Azure Synapse.
- Building Copilot agents and embedding them into systems such as Workday, Teams, etc.
Mandatory Skills: Gen AI, Python, data storage systems (especially Snowflake, Azure Data Factory, Azure Fabric, and Azure Synapse)
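For illustration, a hedged minimal sketch of calling an Azure OpenAI chat deployment from Python, assuming the openai SDK v1.x. The endpoint, deployment name, API version, and prompt are placeholders; a production agent would add tool calling, grounding, and error handling.

# Hedged sketch: one chat completion against an Azure OpenAI deployment.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],   # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",                              # assumed API version
)

response = client.chat.completions.create(
    model="gpt-4o-mini-deployment",                        # name of the Azure deployment, not the base model
    messages=[
        {"role": "system", "content": "You are an assistant that summarises packaging quality reports."},
        {"role": "user", "content": "Summarise the key defects reported for line 3 last week."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)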
Posted 1 day ago
3.0 - 8.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will be at the forefront of designing, building, and configuring applications. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also be responsible for troubleshooting issues and providing guidance to team members, ensuring that projects are delivered on time and to the highest standards. Your role will require you to stay updated with the latest technologies and methodologies to enhance application performance and user experience.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support to enhance their skills and knowledge.
Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and best practices.
- Experience with SQL and data modeling techniques.
- Familiarity with ETL processes and tools.
- Ability to analyze and optimize database performance.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 day ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful project outcomes. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring adherence to best practices in application development.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Monitor project progress and make adjustments as necessary to ensure successful delivery.
Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and architecture.
- Experience with ETL processes and data integration techniques.
- Familiarity with SQL and data querying for effective data manipulation.
- Ability to analyze and optimize data models for performance improvements.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 day ago
7.0 - 10.0 years
17 - 22 Lacs
Noida
Work from Office
About the Role
We at Innovaccer are looking for a Site Database Reliability Engineer-II to build the most amazing product experience. You'll get to work with other engineers to build a delightful feature experience to understand and solve our customers' pain points.
A Day in the Life
- Design, model, implement, and size large-scale database systems using Snowflake, PostgreSQL, and MongoDB.
- Be responsible for provisioning, 24x7 availability, reliability, performance, security, maintenance, upgrades, and cost optimization.
- Perform capacity planning for large-scale database clusters.
- Automate DB provisioning, deployments, routine administration, maintenance, and upgrades.
- Address business-critical incidents (P0/P1) within the SLA, identify the root cause, and address the issue permanently.
- Sync data between multiple data stores (e.g., PostgreSQL to Elasticsearch and Snowflake to Elasticsearch).
- Design, document, and benchmark Snowflake or MongoDB maintenance, backups, health checks, alerting, and monitoring.
- Create processes and best practices, and enforce them.
- Identify and tune long-running queries to improve DB performance and reduce cost.
What You Need
- 4+ years of experience.
- Work in a fast-paced environment with the agility to change direction per business needs.
- Hands-on experience with SQL query writing along with Python or another scripting language in any database environment.
- Demonstrated experience with any cloud environment such as AWS, Azure, or GCP.
- In-depth knowledge of any two of MongoDB, Redis, or Elasticsearch; knowledge of PostgreSQL / Snowflake / MySQL is a plus.
- Setting up high availability, replication, and incremental backups for various datastores.
- Setting up database security best practices such as encryption, auditing, and role-based access control.
- Knowledge of DB design principles, partitioning/sharding, and query optimization.
- Expertise in troubleshooting database performance issues in production.
- Demonstrated experience with both cloud-managed and self-hosted databases, managing medium- to large-sized production deployments.
- Experience in building proofs of concept, trying out new solutions, and improving existing systems with best practices to solve business problems and support scaling.
- Knowledge of or experience with Terraform, Jenkins, and Ansible is a plus.
- Knowledge of database monitoring stacks such as Prometheus and Grafana.
- Expertise in Docker and Kubernetes is mandatory.
- Should be proactive and have the intellect to explore and come up with solutions to complex technical issues.
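For illustration, a small sketch of the routine health checks a database reliability engineer automates on PostgreSQL: replication lag on standbys and detection of long-running queries. The host names, credentials, and thresholds are hypothetical.

# Hedged sketch of automated PostgreSQL health checks with psycopg2.
import psycopg2

conn = psycopg2.connect(host="pg-primary.example.internal", dbname="appdb",
                        user="dbre", password="***")
cur = conn.cursor()

# Replication lag per standby, as reported by the primary.
cur.execute("SELECT application_name, replay_lag FROM pg_stat_replication")
for name, lag in cur.fetchall():
    print(f"standby {name}: replay lag {lag}")

# Queries running longer than 5 minutes.
cur.execute("""
    SELECT pid, now() - query_start AS runtime, left(query, 80)
    FROM pg_stat_activity
    WHERE state = 'active' AND now() - query_start > interval '5 minutes'
""")
for pid, runtime, query in cur.fetchall():
    print(f"long-running pid={pid} runtime={runtime}: {query}")

cur.close()
conn.close()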
Posted 1 day ago