5.0 - 10.0 years
7 - 11 Lacs
Pune
Work from Office
About the Role
We're looking for a Data Engineer to help build reliable and scalable data pipelines that power reports, dashboards, and business decisions at Hevo. You'll work closely with engineering, product, and business teams to make sure data is accurate, available, and easy to use.
Key Responsibilities
- Independently design and implement scalable ELT workflows using tools like Hevo, dbt, Airflow, and Fivetran.
- Ensure the availability, accuracy, and timeliness of datasets powering analytics, dashboards, and operations.
- Collaborate with Platform and Engineering teams to address issues related to ingestion, schema design, and transformation logic.
- Escalate blockers and upstream issues proactively to minimize delays for stakeholders.
- Maintain strong documentation and ensure discoverability of all models, tables, and dashboards.
- Own end-to-end pipeline quality, minimizing escalations or errors in models and dashboards.
- Implement data observability practices such as freshness checks, lineage tracking, and incident alerts (a freshness-check sketch follows this posting).
- Regularly audit and improve accuracy across business domains.
- Identify gaps in instrumentation, schema evolution, and transformation logic.
- Ensure high availability and data freshness through monitoring, alerting, and incident resolution processes.
- Set up internal SLAs, runbooks, and knowledge bases (data catalog, transformation logic, FAQs).
- Improve onboarding material and templates for future engineers and analysts.
Required Skills & Experience
- 3-5 years of experience in Data Engineering, Analytics Engineering, or related roles.
- Proficient in SQL and Python for data manipulation, automation, and pipeline creation.
- Strong understanding of ELT pipelines, schema management, and data transformation concepts.
- Experience with the modern data stack: dbt, Airflow, Hevo, Fivetran, Snowflake, Redshift, or BigQuery.
- Solid grasp of data warehousing concepts: OLAP/OLTP, star/snowflake schemas, relational & columnar databases.
- Understanding of REST APIs, webhooks, and event-based data ingestion.
- Strong debugging skills and the ability to troubleshoot issues across systems.
Preferred Background
- Experience in high-growth industries such as eCommerce, FinTech, or hyper-commerce environments.
- Experience working with or contributing to a data platform (ELT/ETL tools, observability, lineage, etc.).
Core Competencies
- Excellent communication and problem-solving skills
- Attention to detail and a self-starter mindset
- High ownership and urgency in execution
- Collaborative and coachable team player
- Strong prioritization and resilience under pressure
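For illustration, a minimal sketch of the kind of freshness check described above, assuming a Postgres-compatible warehouse reachable via psycopg2; the table name, timestamp column, SLA, and DSN are hypothetical:

```python
# Minimal data-freshness check: alert if the newest row in a table is older
# than the SLA window. Assumes the timestamp column is timezone-aware
# (timestamptz); table/column names are trusted identifiers in this sketch.
import datetime

import psycopg2

FRESHNESS_SLA = datetime.timedelta(hours=2)  # assumed SLA for illustration

def check_freshness(conn, table: str, ts_column: str) -> bool:
    """Return True if the newest row in `table` is within the SLA window."""
    with conn.cursor() as cur:
        cur.execute(f"SELECT MAX({ts_column}) FROM {table};")
        (latest,) = cur.fetchone()
    if latest is None:
        return False  # an empty table counts as stale
    lag = datetime.datetime.now(datetime.timezone.utc) - latest
    return lag <= FRESHNESS_SLA

if __name__ == "__main__":
    conn = psycopg2.connect("dbname=analytics")  # placeholder DSN
    if not check_freshness(conn, "orders_fact", "loaded_at"):
        print("ALERT: orders_fact is stale")  # hook up a real alert channel
```

In practice a check like this runs on a schedule (e.g. from Airflow) and feeds an alerting channel rather than printing.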
Posted 16 hours ago
3.0 - 8.0 years
7 - 12 Lacs
Mumbai
Work from Office
Oracle Data Integrator (ODI) consultant with 3+ years of relevant experience implementing batch/real-time integrations using ODI 11g.
- Ability to customize knowledge modules as per requirements; strong design and development skills.
- Ability to design and develop interfaces, packages, load plans, user functions, variables, and sequences in ODI.
- Understanding of ODI/ODQ administration, maintenance, and configuration.
- Experience working with multiple source/target systems such as Oracle, MS SQL Server, XML files, flat files, and MS Access/Excel documents.
- Exposure to OLAP, OLTP, data warehouse, and data mart development, and fact and dimensional DB designs.
- Experience developing dimension and fact tables using ODI.
- Experience in high-data-volume environments and performance tuning in ODI.
- Good to have: exposure to ODI administration and load balancing.
- Should be able to configure topology for all the technologies, and configure standalone and Java EE agents.
- Good to have: OAS/OAC exposure.
- Exposure to CDC/journalizing implementations and customizing knowledge modules (see the sketch after this posting).
- Experience in modelling (logical and physical) warehouses and marts; strong database design and relational and dimensional data modelling.
- Experience writing complex queries and stored procedures in PL/SQL.
- Experience with the UNIX and Windows operating systems.
- Experience with ODI 12c would be an added advantage.
Qualifications: Bachelor's Degree
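For illustration, a minimal sketch of the kind of incremental merge a CDC/journalizing mapping ultimately produces, written here as a hand-rolled Oracle MERGE executed from python-oracledb; all table, column, and connection names are hypothetical:

```python
# Apply a journalized (CDC) change set to a dimension table with a MERGE,
# the same upsert pattern an ODI incremental-update IKM generates.
import oracledb

MERGE_SQL = """
MERGE INTO dim_customer d
USING (
    SELECT customer_id, name, city
    FROM stg_customer_changes        -- journalized change set (CDC)
) s
ON (d.customer_id = s.customer_id)
WHEN MATCHED THEN
    UPDATE SET d.name = s.name, d.city = s.city
WHEN NOT MATCHED THEN
    INSERT (customer_id, name, city)
    VALUES (s.customer_id, s.name, s.city)
"""

with oracledb.connect(user="etl", password="***", dsn="dbhost/orclpdb") as conn:
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)
    conn.commit()
```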
Posted 3 days ago
8.0 - 13.0 years
10 - 15 Lacs
Gurugram
Work from Office
Required skills:
- Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and an understanding of cloud concepts.
- SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development.
- Good working experience using various snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have: the ability to build complex mappings with JSON path expressions, flat files, and Python scripting (a plain-Python analogue follows this posting).
- Good to have: experience in groundplex and cloudplex integrations.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift).
- Real-time experience working with OLAP & OLTP database models (dimensional models).
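As a rough analogue of the JSON-path mappings mentioned above, here is what the same extraction looks like in plain Python; the payload shape and field names are invented for illustration:

```python
# Flatten a nested JSON order payload into target rows, the way a pipeline
# mapping with JSON path expressions would.
payload = {
    "order": {
        "id": "SO-1001",
        "lines": [
            {"sku": "A1", "qty": 2, "price": 9.5},
            {"sku": "B7", "qty": 1, "price": 120.0},
        ],
    }
}

# Equivalent of the JSON path $.order.lines[*].sku
skus = [line["sku"] for line in payload["order"]["lines"]]

# Flatten each line into a target (e.g. JDBC) row, carrying the header id down
rows = [
    {"order_id": payload["order"]["id"], "sku": l["sku"],
     "amount": l["qty"] * l["price"]}
    for l in payload["order"]["lines"]
]

print(skus)   # ['A1', 'B7']
print(rows)
```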
Posted 3 days ago
5.0 - 10.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Design, develop, and maintain QlikView applications and dashboards.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Perform data analysis and create data models to support business intelligence initiatives.
- Optimize QlikView applications for performance and scalability.
- Provide technical support and troubleshooting for QlikView applications.
- Ensure data accuracy and integrity in all QlikView applications.
- Integrate Snowflake with QlikView to enhance data processing and analytics capabilities.
- Stay updated with the latest QlikView features and best practices.
- Conduct training sessions for end-users to maximize the utilization of QlikView applications.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience of 2-5 years as a QlikView Developer.
- Strong knowledge of QlikView architecture, data modeling, and scripting.
- Proficiency in SQL and database management.
- Knowledge of Snowflake and its integration with QlikView.
- Excellent analytical and problem-solving skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
Posted 3 days ago
8.0 - 12.0 years
27 - 32 Lacs
Bengaluru
Work from Office
1. Strong development knowledge in DB design & development with 6-10 years of experience (Postgres DB) - mandatory.
2. Strong hands-on experience writing complex PL/pgSQL, procedures, and functions, and preventing blocking and deadlocks (a deadlock-avoidance sketch follows this posting).
3. Conduct SQL object code reviews & performance tuning (mandatory).
4. Hands-on experience with Microsoft SQL and MySQL DB is an advantage.
5. Strong knowledge of RDBMS and NoSQL concepts, with strong logical thinking and solutions (highly required).
6. Expert in transactional databases (OLTP) and ACID properties, with experience handling large-scale application databases (mandatory).
7. Consult with application developers to provide suggestions/SQL/PL/pgSQL on the DB with best solutions.
8. Good communication and written skills.
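To make point 2 concrete, a minimal psycopg2 sketch of the classic deadlock-avoidance pattern: lock rows in a consistent key order before updating them. The accounts table and DSN are hypothetical:

```python
# Two concurrent transfers between the same pair of accounts can deadlock if
# each transaction locks the rows in a different order; sorting the keys and
# taking FOR UPDATE locks in that order removes the cycle.
import psycopg2

def transfer(conn, from_acct: int, to_acct: int, amount: int) -> None:
    """Move `amount` between two accounts without lock-order deadlocks."""
    first, second = sorted((from_acct, to_acct))  # consistent lock order
    with conn:  # commits on success, rolls back on error
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id FROM accounts WHERE id IN (%s, %s) "
                "ORDER BY id FOR UPDATE;", (first, second))
            cur.execute(
                "UPDATE accounts SET balance = balance - %s WHERE id = %s;",
                (amount, from_acct))
            cur.execute(
                "UPDATE accounts SET balance = balance + %s WHERE id = %s;",
                (amount, to_acct))
```

The same idea applies inside PL/pgSQL functions: any code path that touches the same set of rows should acquire its locks in one agreed order.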
Posted 4 days ago
12.0 - 15.0 years
16 - 20 Lacs
Bengaluru
Work from Office
As a Principal Technical Specialist, you will lead teams in technical discussions, architecture strategy, and system design for high-performance applications. You will collaborate with architects and developers to define workflows, interfaces, and domain models, leveraging SQL and NoSQL databases. Your role requires a hands-on approach, strong communication skills, and the ability to translate complex technical concepts into actionable insights.
You have:
- Bachelor's degree in Engineering with 12 to 15 years of relevant work experience.
- Experience with architectural patterns and programming languages such as Java, Python/Shell, and Golang.
- Familiarity with frameworks like Spring, Guice, or Micronaut, and libraries such as Pandas, Keras, PyTorch, SciPy, and NumPy.
- Experience with Kafka, Spark, and databases (SQL: Postgres, Oracle; NoSQL: Elastic, Prometheus, Mongo, Cassandra, Redis, and Pinot).
It would be nice if you also had:
- Experience in OLTP/real-time system management in enterprise software.
- Experience in both large-scale and small-scale development, with the ability to model domain-specific systems.
- Expertise in data engineering, statistical analytics, and AI/ML techniques (AI experience is a plus).
- Knowledge of NMS/EMS for the telecom domain, including Network Operational Lifecycle and Network Planning.
Responsibilities:
- Lead and guide the team in technical discussions; your expertise in Java, Python/Shell, and frameworks like Spring, Kafka, and AI/ML libraries will drive innovation in telecom network management and real-time enterprise software.
- Analyze and decompose requirements; work on long-lead items and define the architecture strategy for the application.
- Work with architects and developers in other applications to define interfaces and workflows, drawing on a strong foundation in statistical analytics, data engineering, and AI/ML.
- Communicate and present to internal and external audiences.
Posted 4 days ago
6.0 - 10.0 years
20 - 32 Lacs
New Delhi, Pune, Bengaluru
Work from Office
Job Description
Overall 7-10 years of IT experience with a Bachelor's degree in computer science, information technology, or a similar field.
- Must have 4+ years of strong experience in data modelling.
- Able to understand, analyze, and design enterprise data models, with expertise in a data modelling tool (Erwin, PowerDesigner).
- Has created flexible and scalable models.
- Has experience with at least two end-to-end implementations of a cloud data warehouse.
- 3 years of hands-on experience with physical and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Supporting and providing consultation to database users and developers.
- Preparing accurate database design and architecture reports for management and executive teams.
- Overseeing the migration of data from legacy systems to new solutions.
- Recommending solutions to improve new and existing database systems.
- Experience with team management.
- Excellent communication and presentation skills.
Roles & Responsibilities
- Involvement in the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation, and production support.
- Familiarity and experience with a work environment consisting of business analysts, production support teams, subject matter experts, database administrators, data engineers, and BI developers.
- Strong skills in SQL and PL/SQL packages, functions, stored procedures, triggers, and materialized views to implement business logic in the database.
- Develop mapping spreadsheets for the ETL team with source-to-target data mapping, physical naming standards, datatypes, volumetrics, domain definitions, and corporate metadata definitions.
- Exceptional communication and presentation skills and an established track record of client interactions.
- Experience with database SQL tuning and query optimization tools like Explain Plan.
- Experience in designing conceptual, logical, and physical data models.
Skills: Data Modeler, Data Modelling, Erwin, PowerDesigner, SQL
Posted 5 days ago
10.0 - 15.0 years
50 - 75 Lacs
Chennai
Work from Office
Position Summary
About the Team: Walmart's Enterprise Business Services (EBS) is a powerhouse of several exceptional teams delivering world-class technology solutions and services, making a profound impact at every level of Walmart. As a key part of Walmart Global Tech, our teams set the bar for operational excellence and leverage emerging technology to support millions of customers, associates, and stakeholders worldwide. Each time an associate turns on their laptop, a customer makes a purchase, a new supplier is onboarded, the company closes the books, physical and legal risk is avoided, and our associates are paid consistently and accurately, that is EBS. Joining EBS means embarking on a journey of limitless growth, relentless innovation, and the chance to set new industry standards that shape the future of Walmart.
What you'll do:
- Guide the team in architectural decisions and best practices for building scalable applications.
- Drive design, development, implementation, and documentation.
- Build, test, and deploy cutting-edge solutions at scale, impacting associates of Walmart worldwide.
- Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
- Engage with Product Management and Business to drive the agenda, set your priorities, and deliver awesome products.
- Drive the success of the implementation by applying technical skills to design and build enhanced processes and technical solutions in support of strategic initiatives.
- Work closely with the architects and cross-functional teams, following established practices for the delivery of solutions meeting QCD (Quality, Cost & Delivery) within the established architectural guidelines.
- Work with senior leadership to chart out the future roadmap of the products.
- Participate in hiring and build teams, enabling them to be high-performing agile teams.
- Interact closely on requirements with business owners and technical teams both within India and across the globe.
What you'll bring:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field, with a minimum of 10+ years of experience in software design, development, and automated deployments.
- Hands-on experience building Java-based backend systems and experience working on cloud-based solutions is a must.
- Proficiency in Java, Spring Boot, Kafka, and Spark.
- Prior experience delivering highly scalable, large-scale data-processing Java applications.
- Strong high- and low-level system design; experience designing data-intensive applications in an open stack.
- A good understanding of CS fundamentals, microservices, data structures, algorithms, and problem solving.
- Experience with CI/CD development environments and tools including, but not limited to, Git, Maven, and Jenkins.
- Strong in writing modular and testable code and test cases (unit, functional, and integration) using frameworks like JUnit, Mockito, and MockMVC.
- Experience with microservices architecture; a good understanding of distributed concepts, common design principles, design patterns, and cloud-native development.
- Hands-on experience in Spring Boot, concurrency, garbage collection, RESTful services, data-caching services, and ORM tools.
- Experience working with relational databases and writing complex OLAP, OLTP, and SQL queries; able to propose multiple alternatives for development frameworks, libraries, and tools.
- Experience working with NoSQL databases like Cosmos DB.
- Experience working with caching technology like Redis, Memcached, or other related systems.
- Experience with event-based systems like Kafka.
- Experience using monitoring and alerting tools like Prometheus, Splunk, and other related systems; excellent at debugging and troubleshooting issues.
- Exposure to containerization tools like Docker, Helm, and Kubernetes.
- Knowledge of public cloud platforms like Azure and GCP will be an added advantage.
- An understanding of mainframe databases will be an added advantage.
About Walmart Global Tech: Flexible, hybrid work. Benefits. Belonging.
Equal Opportunity Employer: Walmart, Inc. is an Equal Opportunities Employer - By Choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing unique styles, experiences, identities, ideas, and opinions, while being inclusive of all people.
Minimum Qualifications: Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or a related area and 4 years' experience in software engineering or a related area. Option 2: 6 years' experience in software engineering or a related area.
Preferred Qualifications: Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or a related area and 2 years' experience in software engineering or a related area.
Primary Location: RMZ Millenia Business Park, No 143, Campus 1B (1st-6th Floor), Dr. MGR Road (North Veeranam Salai), Perungudi, India
Posted 6 days ago
3.0 - 7.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Hiring for Python programming and PySpark - Pan India
- Strong hands-on experience in Python programming and PySpark (a batch ETL sketch follows this posting).
- Experience using AWS services (Redshift, Glue, EMR, S3 & Lambda).
- Experience working with Apache Spark and the Hadoop ecosystem.
- Experience in writing and optimizing SQL for data manipulation.
- Good exposure to scheduling tools; Airflow is preferable.
- Must have: data warehouse experience with AWS Redshift or Hive.
- Experience in implementing security measures for data protection.
- Expertise in building/testing complex data pipelines for ETL processes (batch and near real time).
- Readable documentation of all the components being developed.
- Knowledge of database technologies for OLTP and OLAP workloads.
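For illustration, a minimal PySpark sketch of the batch ETL pattern the posting describes: read raw data from S3, aggregate it, and write partitioned output back. The bucket names, columns, and rollup logic are hypothetical:

```python
# Daily rollup of raw order CSVs into partitioned Parquet on S3.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

orders = spark.read.option("header", "true").csv("s3://raw-bucket/orders/")

daily = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("order_count"))
)

# Partitioned Parquet is a common hand-off point for Redshift Spectrum or a
# COPY-based load into Redshift.
daily.write.mode("overwrite").partitionBy("order_date") \
     .parquet("s3://curated-bucket/orders_daily/")
```

A job like this would typically be scheduled from Airflow and run on EMR or Glue.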
Posted 6 days ago
6.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job Role: A Data Modeler designs and creates data structures to support business processes and analytics, ensuring data integrity and efficiency. They translate business requirements into technical data models, focusing on accuracy, scalability, and consistency. Here's a more detailed look at the role:
Responsibilities:
- Designing and developing data models: creating conceptual, logical, and physical models to represent data in a structured way (a star-schema sketch follows this posting).
- Translating business needs: working with stakeholders to understand business requirements and translate them into actionable data structures.
- Ensuring data integrity: implementing data validation rules and constraints to maintain the accuracy and reliability of data.
- Optimizing data models: optimizing models for performance, scalability, and usability, ensuring data can be efficiently stored and retrieved.
- Collaborating with other teams: working with database administrators, data engineers, and business analysts to ensure data models align with business needs and technical requirements.
- Documenting data models: providing clear documentation of data structures and relationships, including entity-relationship diagrams and metadata.
Skills:
- Data modeling techniques: knowledge of various data modeling approaches, including normalization, denormalization, and dimensional modeling.
- Database technologies: understanding of relational databases, NoSQL databases, and other database systems.
- SQL: proficiency in writing SQL queries for database management and data manipulation.
- Data modeling tools: familiarity with tools like PowerDesigner, ERwin, or Visio.
- Communication and collaboration: strong communication skills to work effectively with diverse teams and stakeholders.
- Problem-solving: the ability to identify and resolve data model performance issues and ensure data accuracy.
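For illustration, a minimal star-schema sketch of the dimensional modeling mentioned above, using sqlite3 so the example runs self-contained; all table and column names are hypothetical:

```python
# One fact table keyed to two dimensions: the smallest possible star schema.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20250630
    full_date    TEXT NOT NULL,
    month        INTEGER NOT NULL,
    year         INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL,
    category     TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key  INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity     INTEGER NOT NULL,
    revenue      REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
print("star schema created")
```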
Posted 6 days ago
8.0 - 10.0 years
9 - 13 Lacs
Navi Mumbai
Work from Office
Responsibilities:
- Assess the existing deployments of Postgres and associated tools at the customer site, make recommendations on how to improve them, and document the required technical steps.
- Gather and record enterprise-wide database requirements by consulting with stakeholders in the client teams.
- Work on creating proposals for professional services that meet the objectives and goals of customers.
- Possess knowledge of programming languages like C/C++, PHP, Python, Java, or Ruby to guide customers in the development of custom Postgres-based applications.
- Troubleshoot various database challenges, including performance issues.
- Perform database migrations from Oracle, SQL Server, and/or MySQL to PostgreSQL.
- Performance tuning for existing database deployments.
- Work with customers to upgrade their application to the latest version of Postgres and configure/change their application to take advantage of new features.
- Design and implement database high-availability configurations using PostgreSQL (a replication health-check sketch follows this posting).
- Evaluate customers' security needs and make recommendations.
- Document all work clearly and concisely in runbooks and customer communications throughout the project.
- Implement backup strategies and perform upgrades and patching.
- Develop data migration strategies to migrate large databases in line with customer downtime constraints.
- Implement advanced database monitoring and set up alerts using industry best practices and tools.
What you will bring:
- 8+ years of working experience for a professional consulting organization in the database area, with strong database foundation skills.
- 5+ years of direct working experience on PostgreSQL and Oracle databases.
- Expertise in PostgreSQL database implementation for high availability in large enterprise environments.
- Familiarity with database design and building applications for PostgreSQL, Oracle, or other databases.
- Expert knowledge of general system administration and performance tuning for Linux (Ubuntu, CentOS, Red Hat, or Debian), including troubleshooting and scripting.
- Well-versed in SQL and MS Windows.
- Strong time management skills and the ability to balance multiple projects in parallel.
- A bachelor's or master's degree in computer science/engineering or equivalent experience.
- Strong written and verbal communication skills.
- Excellent troubleshooting skills.
What will give you an edge:
- Proficiency with the PostgreSQL community version.
- Proficiency with database tuning and PostgreSQL tools such as pgBadger, PEM, and Barman.
- Proficiency with PostgreSQL replication technologies: streaming, logical, and bidirectional.
- Proficiency with PostgreSQL architecture.
- Proficiency with PostgreSQL networking, HBA, and connection pooling (PgBouncer, Pgpool).
- Working knowledge of OLTP/DSS/DW system design.
- Working knowledge of scripting languages: Bash, KornShell, Python, Perl, etc.
- Experience deploying open-source software.
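As an example of the monitoring this role sets up, a minimal sketch of a streaming-replication lag check run against a primary with psycopg2; the lag threshold and DSN are illustrative assumptions:

```python
# Check how far each standby is behind the primary, in WAL bytes, using
# pg_stat_replication (PostgreSQL 10+ column names).
import psycopg2

LAG_LIMIT_BYTES = 64 * 1024 * 1024  # alert if a standby is >64 MiB behind

QUERY = """
SELECT application_name,
       pg_wal_lsn_diff(pg_current_wal_lsn(), replay_lsn) AS lag_bytes
FROM pg_stat_replication;
"""

with psycopg2.connect("dbname=postgres") as conn:  # placeholder DSN
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for name, lag in cur.fetchall():
            ok = lag is not None and lag < LAG_LIMIT_BYTES
            print(f"{'OK' if ok else 'ALERT'}: standby {name} lag={lag} bytes")
```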
Posted 6 days ago
2.0 - 7.0 years
4 - 9 Lacs
Pune
Work from Office
Requisition ID: R-10360830 | Date posted: 06/19/2025 | End date: 06/30/2025 | City: Pune | State/Region: Maharashtra | Country: India | Location type: Onsite
Calling all innovators - find your future at Fiserv.
Job Title: Data Engineer
About your role: At Fiserv, we are dedicated to transforming financial services technology to benefit our clients. As a Data Engineer, you will play a key role in making sure that data movement is smooth and efficient. You will have a good understanding of all the data elements, schemas, and various data assets. The development of new pipelines to expand the value of our data assets is also an integral part of the role.
What you'll do:
- Ensure repeatability and robustness of data movement and storage.
- Execute data purge strategies for the OLTP systems (a batched-purge sketch follows this posting).
- Execute effective error detection and mitigation strategies.
- Keep data-related documents up to date.
Experience you'll need to have:
- Deep knowledge of the Microsoft Azure environment, managed and non-managed SQL.
- Deep expertise around Microsoft Fabric.
- Proven experience developing and managing data pipelines.
- Good oral and written communication skills.
- An undergraduate (Bachelor's) degree, preferably in Computer Science; a Master's degree will be an added advantage.
- 2+ years of experience post undergraduate/master's degree.
Experience that would be great to have: experience in the financial services industry.
Thank you for considering employment with Fiserv. Please apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
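For illustration, a minimal sketch of the batched purge pattern OLTP data purge strategies typically use, written with pyodbc against a SQL Server-style database; the table, retention window, batch size, and DSN are hypothetical:

```python
# Delete aged rows in small chunks so the purge never holds long locks or
# bloats the transaction log; commit after every chunk.
import pyodbc

BATCH = 5000
PURGE_SQL = f"""
DELETE TOP ({BATCH}) FROM dbo.audit_log
WHERE created_at < DATEADD(day, -90, SYSUTCDATETIME());
"""

conn = pyodbc.connect("DSN=azure-sql;")  # placeholder DSN
conn.autocommit = False
cur = conn.cursor()
while True:
    cur.execute(PURGE_SQL)
    deleted = cur.rowcount
    conn.commit()          # each chunk commits independently
    if deleted < BATCH:    # last partial chunk means we are done
        break
conn.close()
```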
Posted 1 week ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understanding operational needs by collaborating with specialized teams.
- Supporting key business operations: supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; implement sprint planning and execution to ensure timely deliveries.
Technical skills, qualification and experience required:
- Proficient in data modelling, with 5-10 years of experience in data modelling.
- Experience with data modeling tools (Erwin); building ER diagrams.
- Hands-on experience with the Erwin/Visio tools.
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools like Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
- Strong results-orientation and time management.
- A true team player who is comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.
* Only immediate joiners will be preferred. Outstation candidates will not be considered.
Posted 1 week ago
6.0 - 10.0 years
22 - 25 Lacs
Mumbai, Hyderabad
Work from Office
About the role
As a Data Warehouse Architect, you will be responsible for managing and enhancing a data warehouse that manages large volumes of customer life-cycle data flowing in from various applications, within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse (Vertica). In this role, you will manage a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouse will be gradually migrated to a data lake, enabling better analytical advantage; the role holder will also be responsible for guiding the team through this migration.
Key Responsibilities
- Data pipeline design: design and develop ETL data pipelines that help organise large volumes of data, using data warehousing technologies to ensure that the warehouse is efficient, scalable, and secure.
- Issue management: ensure that the data warehouse is running smoothly; monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance.
- Collaboration: collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
- Data integration and processing: process, clean, and integrate large data sets from various sources to ensure that the data is accurate, complete, and consistent.
- Data modelling: design and implement data modelling solutions to ensure that the organization's data is properly structured and organized for analysis.
Key Qualifications & Skills
- Educational qualification: B.E./B.Tech. in Computer Science, Information Technology, or an equivalent domain, with 6 to 10 years of experience and at least 5 years of relevant work experience in data warehousing/mining/BI/MIS.
- Experience in data warehousing: knowledge of ETL and data technologies, able to outline a future vision in OLTP and OLAP (Oracle/MSSQL).
- Data modelling, data analysis, and visualization experience (analytical tools such as Power BI, SAS, QlikView, Tableau, etc.).
- Good to have: exposure to Azure cloud data platform services like Cosmos DB, Azure Data Lake, Azure Synapse, and Azure Data Factory.
- Synergize with the team: regular interaction with business/product/functional teams to create mobility solutions.
- Certification: Azure-certified DP-900, PL-300, DP-203, or other data platform/data analyst certifications.
- Communication skills: good oral and written communication skills.
Posted 1 week ago
6.0 - 9.0 years
8 - 11 Lacs
Chennai
Work from Office
Job Title: Data Modeller - GCP
Experience: 6-9 Years
Work Type: On-site
Work Location: Chennai (Work from Client Office - Mandatory)
Job Description
We are seeking a skilled Data Modeller with strong experience in data modelling for OLTP and OLAP systems, particularly within Google Cloud Platform (GCP). The ideal candidate will be hands-on with designing efficient, scalable data architectures and have a solid grasp of performance tuning and cloud-based databases.
Key Responsibilities:
- Design and implement conceptual, logical, and physical data models for OLTP and OLAP systems.
- Apply best practices in data indexing, partitioning, and sharding for optimized performance (a BigQuery partitioning sketch follows this posting).
- Use data modelling tools (preferably DBSchema) to support and document database design.
- Ensure the data architecture supports near real-time reporting and application performance.
- Collaborate with cross-functional teams to translate business requirements into data structures.
- Work with GCP database technologies like AlloyDB, Cloud SQL, and BigQuery.
- Validate and improve database performance metrics through continuous optimization.
Must-Have Skills:
- GCP: AlloyDB, Cloud SQL, BigQuery.
- Strong hands-on experience with data modelling tools (DBSchema preferred).
- Expertise in OLTP & OLAP data models, indexing, partitioning, and data sharding.
- Deep understanding of database performance tuning and system architecture.
Good to Have:
- Functional knowledge of the mutual fund industry.
- Exposure to data governance and security best practices in the cloud.
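To make the partitioning guidance concrete, a minimal sketch of BigQuery DDL with partitioning and clustering, submitted through the official google-cloud-bigquery client; the dataset, table, and columns are hypothetical:

```python
# Create a date-partitioned, clustered table: partitioning prunes scans to
# the dates a query touches, clustering co-locates rows on a common filter.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

DDL = """
CREATE TABLE IF NOT EXISTS analytics.trades (
    trade_id    STRING NOT NULL,
    fund_code   STRING NOT NULL,
    trade_ts    TIMESTAMP NOT NULL,
    amount      NUMERIC NOT NULL
)
PARTITION BY DATE(trade_ts)
CLUSTER BY fund_code
"""

client.query(DDL).result()  # .result() blocks until the DDL job finishes
print("partitioned, clustered table ready")
```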
Posted 1 week ago
8.0 - 13.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role will serve as a primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (generative) AI solutions.
Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies.
- Write optimized SQL queries for data extraction, transformation, and loading.
- Utilize Python for advanced data processing, automation tasks, and system integration.
- Be an advisor with your in-depth knowledge of Snowflake architecture, features, and best practices.
- Develop and maintain complex data pipelines and ETL processes in Snowflake.
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions.
- Automate dbt jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions.
- Ensure data quality, integrity, and compliance throughout the data lifecycle.
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements.
- Document data models, processes, and workflows clearly for future reference and knowledge sharing.
- Build data tests, unit tests, and mock data frameworks.
Who You Are:
- Bachelor's or master's degree in computer science, mathematics, or related fields.
- At least 8 years of experience as a data warehouse expert, data engineer, or data integration specialist.
- In-depth knowledge of Snowflake components, including security and governance.
- Proven experience implementing complex data models (e.g. OLTP, OLAP, Data Vault).
- A strong understanding of ETL, including end-to-end data flows, from ingestion to data modeling and solution delivery.
- Proven industry experience with dbt and Jinja scripts.
- Strong proficiency in SQL, with additional knowledge of Python (i.e. pandas and PySpark) being advantageous.
- Familiarity with data & analytics solutions such as AWS (especially Glue, Lambda, DMS) is nice to have.
- Experience working with Azure DevOps and warehouse automation tools (e.g. Coalesce) is a plus.
- Experience with healthcare R&D is a plus.
- Excellent English communication skills, with the ability to effectively engage both with R&D scientists and software engineers.
- Experience working in virtual and agile teams.
Posted 1 week ago
8.0 - 13.0 years
10 - 20 Lacs
Bengaluru
Hybrid
Urgent Hiring: Network Engineer (8+ yrs) with SBC/Ribbon, VoIP, scripting (Python/Shell), MPLS, routing, telecom, OLP support, CPaaS, FreeSBC, ProSBC, TelcoBridges. C2H @ TE Infotech (Exotel), converted to permanent, BLR. Apply: ssankala@toppersedge.com
Posted 1 week ago
5.0 - 9.0 years
4 - 7 Lacs
Gurugram
Work from Office
Primary Skills
- SQL (advanced level)
- SSAS (SQL Server Analysis Services), multidimensional and/or tabular models
- MDX/DAX (strong querying capabilities)
- Data modeling (star schema, snowflake schema)
Secondary Skills
- ETL processes (SSIS or similar tools)
- Power BI / reporting tools
- Azure Data Services (optional but a plus)
Role & Responsibilities
- Design, develop, and deploy SSAS models (both tabular and multidimensional).
- Write and optimize MDX/DAX queries for complex business logic.
- Work closely with business analysts and stakeholders to translate requirements into robust data models.
- Design and implement ETL pipelines for data integration.
- Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred).
- Optimize existing cubes and data models for performance and scalability.
- Ensure data quality, consistency, and governance standards.
Top Skill Set
- SSAS (tabular + multidimensional modeling)
- Strong MDX and/or DAX query writing
- SQL (advanced level) for data extraction and transformations
- Data modeling concepts (fact/dimension, slowly changing dimensions, etc.)
- ETL tools (SSIS preferred)
- Power BI or similar BI tools
- Understanding of OLAP & OLTP concepts
- Performance tuning (SSAS/SQL)
Skills: analytical skills, ETL processes (SSIS or similar tools), collaboration, Multidimensional Expressions (MDX), Power BI / reporting tools, SQL (advanced level), SQL proficiency, DAX, SSAS (multidimensional and tabular models), ETL, data modeling (star schema, snowflake schema), communication, Azure Data Services, MDX, data modeling, SSAS, data visualization
Posted 1 week ago
4.0 - 6.0 years
3 - 7 Lacs
Nagar, Pune
Work from Office
Title: REF64648E - Python Developer + Chatbot with 4-6 years' experience - Pune/Mum/BNG/GGN/CHN | Assistant Manager - WTS
- 4-6 years of experience as a Python developer with a strong understanding of Python programming concepts and best practices.
- Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline.
- Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms (a minimal endpoint sketch follows this posting).
- Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation.
- Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras).
- Experience with web frameworks like Django or Flask.
- Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements.
- Experience with databases such as MS SQL Server, PostgreSQL, or MySQL; solid knowledge of OLTP and OLAP concepts.
- Experience with CI/CD tooling (at least Git and Jenkins).
- Experience with the Agile/Scrum/Kanban way of working.
- Self-motivated and hard-working.
- Knowledge of performance testing frameworks, including Mocha and Jest.
- Knowledge of RESTful APIs.
- Understanding of AWS and Azure cloud services.
- Experience with chatbot and NLU/NLP-based applications is required.
Qualifications: Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline.
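For illustration, a minimal Flask sketch of a chatbot endpoint with a stubbed intent classifier standing in for a real NLU model; every name here is an invented placeholder, not part of the role's actual codebase:

```python
# A single /chat endpoint: classify the user's message into an intent and
# return a canned reply for that intent.
from flask import Flask, jsonify, request

app = Flask(__name__)

def classify_intent(text: str) -> str:
    """Placeholder for a trained NLU model (e.g. a scikit-learn pipeline)."""
    if "balance" in text.lower():
        return "check_balance"
    return "fallback"

@app.post("/chat")
def chat():
    message = request.get_json(force=True).get("message", "")
    intent = classify_intent(message)
    replies = {
        "check_balance": "Let me fetch your balance.",
        "fallback": "Sorry, I didn't understand that.",
    }
    return jsonify({"intent": intent, "reply": replies[intent]})

if __name__ == "__main__":
    app.run(debug=True)
```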
Posted 1 week ago
16.0 - 22.0 years
40 - 55 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities
• Minimum 15 years of experience.
• Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies.
• Experience with data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases.
• Decent experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server).
• Good understanding of relational, dimensional, and Data Vault modelling.
• Experience implementing 2 or more data models in a database with data security and access controls.
• Good experience in OLTP and OLAP systems.
• Excellent data analysis skills with demonstrable knowledge of standard datasets and sources.
• Good experience with one or more cloud DWs (e.g. Snowflake, Redshift, Synapse).
• Experience with one or more cloud platforms (e.g. AWS, Azure, GCP).
• Understanding of DevOps processes.
• Hands-on experience with one or more data modelling tools.
• Good understanding of one or more ETL tools and data ingestion frameworks.
• Understanding of data quality and data governance.
• Good understanding of NoSQL databases and modeling techniques.
• Good understanding of one or more business domains.
• Understanding of the Big Data ecosystem.
• Understanding of industry data models.
• Hands-on experience in Python.
• Experience leading large and complex teams.
• Good understanding of agile methodology.
• Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization.
• Excellent communication skills.
• Understand the business requirements and translate them into conceptual, logical, and physical data models.
• Work as a principal advisor on data architecture across various data requirements: aggregation, data lake, data models, data warehouse, etc.
• Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling.
• Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects.
• Suggest the best modelling approach to the client based on their requirements and target architecture.
• Analyze and understand the datasets and guide the team in creating source-to-target mappings and data dictionaries, capturing all relevant details.
• Profile the data sets to generate relevant insights.
• Optimize the data models and work with the data engineers to define the ingestion logic, ingestion frequency, and data consumption patterns.
• Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance.
• Drive automation in modeling activities.
• Collaborate with business stakeholders, data owners, business analysts, and architects to design and develop the next-generation data platform.
• Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
• Guide/mentor team members and review artifacts.
• Contribute to the overall data strategy and roadmaps.
• Propose and execute technical assessments and proofs of concept to promote innovation in the data space.
Posted 1 week ago
6.0 - 11.0 years
20 - 35 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
We are seeking a highly skilled and experienced Java Developer with 6 to 14 years of experience to join our dynamic team. The ideal candidate will be proficient in Java development and possess a deep understanding of software development processes. This role requires hands-on experience with core Java frameworks, Spring Boot, microservices architecture, and various database technologies. If you're passionate about coding, enjoy working on complex applications, and thrive in a fast-paced environment, we want to hear from you!
Role & responsibilities
- Design, develop, and maintain robust, scalable, and high-performance Java-based applications.
- Write efficient and maintainable code following best practices in software development.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Implement microservices and APIs to ensure seamless integration between services.
- Troubleshoot and debug applications to optimize performance and resolve issues.
- Work with version control systems like Git to maintain and manage code repositories.
- Participate in code reviews to ensure code quality, performance, and maintainability.
- Follow Agile/Scrum methodologies for iterative development cycles.
- Stay updated with emerging technologies and trends in Java and related technologies.
Preferred candidate profile
- 6 to 11 years of professional experience in Java development.
- Strong expertise in Core Java (Java 8 or later) and object-oriented programming principles.
- Extensive experience with the Spring Framework, especially Spring Boot.
- In-depth knowledge of microservices architecture and RESTful web services.
- Proficiency with databases such as MySQL, PostgreSQL, or Oracle, and familiarity with JPA/Hibernate.
- Experience with cloud platforms like AWS, Azure, or GCP is a plus.
- Strong understanding of DevOps principles and familiarity with CI/CD tools like Jenkins, Docker, and Kubernetes.
- Proficient in multithreading, concurrency, and performance optimization techniques.
- Experience with NoSQL databases like MongoDB is a plus.
- Familiarity with testing frameworks such as JUnit, Mockito, and Selenium.
- Knowledge of front-end technologies like HTML5, CSS3, JavaScript, and Angular or React is an advantage.
Perks and benefits
Five reasons why you should join Zycus:
1. Cloud product company: We are a cloud SaaS company, and our products are created using the latest technologies like ML and AI. Our UI is in AngularJS, and we are developing our mobile apps using React.
2. A market leader: Zycus is recognized by Gartner (the world's leading market research analyst) as a Leader in Procurement Software Suites.
3. Move between roles: We believe that change leads to growth, and therefore we allow our employees to shift careers and move to different roles and functions within the organization.
4. Get global exposure: You get to work with and deal with our global customers.
5. Create an impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.
Posted 2 weeks ago
4.0 - 6.0 years
1 - 5 Lacs
Pune
Work from Office
- 4-6 years of experience as a Python developer with a strong understanding of Python programming concepts and best practices.
- Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline.
- Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms.
- Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation.
- Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras).
- Experience with web frameworks like Django or Flask.
- Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements.
- Experience with databases such as MS SQL Server, PostgreSQL, or MySQL; solid knowledge of OLTP and OLAP concepts.
- Experience with CI/CD tooling (at least Git and Jenkins).
- Experience with the Agile/Scrum/Kanban way of working.
- Self-motivated and hard-working.
- Knowledge of performance testing frameworks, including Mocha and Jest.
- Knowledge of RESTful APIs.
- Understanding of AWS and Azure cloud services.
- Experience with chatbot and NLU/NLP-based applications is required.
Qualifications: Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline.
Posted 2 weeks ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understanding operational needs by collaborating with specialized teams.
- Supporting key business operations: supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; implement sprint planning and execution to ensure timely deliveries.
Technical skills, qualification and experience required:
- Proficient in data modelling, with 4-10 years of experience in data modelling.
- Experience with data modeling tools (Erwin); building ER diagrams.
- Hands-on experience with the Erwin/Visio tools.
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools like Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
- Strong results-orientation and time management.
- A true team player who is comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.
* Only immediate joiners will be preferred.
Posted 2 weeks ago
7.0 - 12.0 years
0 Lacs
Hyderabad
Work from Office
DBA
Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., using Python/open-source technologies.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
- Work with stakeholders, including the executive, product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and regions.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Test databases and perform bug fixes.
- Develop best practices for database design and development activities.
- Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
- Take on technical leadership responsibilities for database projects across various scrum teams.
- Manage exploratory data analysis to support database and dashboard development.
Required Skills:
- Expert knowledge of databases like PostgreSQL (preferably cloud-hosted in one or more cloud offerings like AWS, Azure, GCP) and a cloud-based data warehouse (like Snowflake, Azure Synapse), with strong programming experience in SQL.
- Competence in data preparation and/or ETL tools like SnapLogic, Matillion, Azure Data Factory, AWS Glue, and SSIS (preferably strong working experience in one or more) to build and maintain data pipelines and flows.
- Understanding of data modeling techniques and working knowledge of OLTP and OLAP systems.
- Deep knowledge of databases and stored procedures, and of optimizing over huge data volumes.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
- Experience building the infrastructure required for data ingestion and analytics.
- Ability to fine-tune report-generating queries.
- Solid understanding of normalization and denormalization of data, database exception handling, transactions, profiling queries, performance counters, debugging, and database & query optimization techniques.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
- Experience understanding source data from various platforms and mapping it into entity-relationship (ER) models for data integration and reporting.
- Adherence to standards for all databases, e.g., data models, data architecture, and naming conventions.
- Exposure to source control like Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
- Preferably, experience with NoSQL databases and migrating data into other types of databases with real-time replication.
- Experience with automated testing and coverage tools.
- Experience with CI/CD automation tools (desirable).
Posted 2 weeks ago
4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Greetings from Tata Consultancy Services! Thank you for expressing your interest in exploring a career possibility with the TCS family.
Role: ETL Test Engineer
Experience: 4 to 10 years
Interview Location: Bangalore
Job description:
Minimum 4 to 10 years of experience in ETL testing.
1. SQL - expert-level knowledge of core SQL concepts and querying.
2. Lead and mentor a team of ETL testers, providing technical guidance, training, and support in ETL tools, SQL, and test automation frameworks.
3. Create and review complex test cases, test scripts, and test data for ETL processes.
4. ETL automation - experience in Datagaps; good to have experience in tools like Informatica, Talend, and Ab Initio.
5. Execute test cases, validate data transformations, and ensure data accuracy and consistency across source and target systems (a reconciliation-test sketch follows this posting).
6. Experience in query optimization, stored procedures/views, and functions.
7. Strong familiarity with data warehouse projects and data modeling.
8. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
9. Preferably a good understanding of the design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
10. Develop and maintain ETL test automation frameworks to enhance testing efficiency and coverage.
11. Integrate automated tests into the CI/CD pipeline to ensure continuous validation of ETL processes.
12. Azure DevOps/JIRA - hands-on experience with a test management tool, preferably ADO or JIRA.
13. Agile concepts - good experience understanding agile methodology (Scrum, Lean, etc.).
14. Communication - good communication skills to understand and collaborate with all stakeholders within the project.
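For illustration, a minimal pytest-style sketch of the source-to-target reconciliation checks described in point 5, using SQLAlchemy; the connection URLs and table names are placeholders:

```python
# Compare row counts and a column checksum between the source system and the
# warehouse target; any mismatch fails the test run.
import sqlalchemy as sa

source = sa.create_engine("oracle+oracledb://etl:***@src-host/ORCL")
target = sa.create_engine("postgresql+psycopg2://etl:***@dwh-host/dwh")

def scalar(engine, sql: str):
    with engine.connect() as conn:
        return conn.execute(sa.text(sql)).scalar()

def test_row_counts_match():
    src = scalar(source, "SELECT COUNT(*) FROM orders")
    tgt = scalar(target, "SELECT COUNT(*) FROM stg.orders")
    assert src == tgt, f"row count mismatch: source={src}, target={tgt}"

def test_amount_totals_match():
    src = scalar(source, "SELECT SUM(amount) FROM orders")
    tgt = scalar(target, "SELECT SUM(amount) FROM stg.orders")
    assert src == tgt, f"amount checksum mismatch: source={src}, target={tgt}"
```

Checks like these slot naturally into the CI/CD validation mentioned in point 11.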
Posted 2 weeks ago