6.0 - 7.0 years
22 - 25 Lacs
Pune
Remote
Engage stakeholders, architect data products, guide development teams, build and own ETL pipelines (PULSE, Snowflake DV2), ensure data quality and governance, and provide agile leadership. Passport mandatory. Contact: 9063478484 / v.aparna@tekgenieservices.com
Posted 2 months ago
4.0 - 9.0 years
15 - 25 Lacs
Ahmedabad, Gurugram
Work from Office
Hi, wishes from GSN! Pleasure connecting with you.

About the job: This is a GCP Data Engineer opportunity with a leading bootstrapped product company, a valued client of GSN HR.

Job Title: GCP Data Engineer
Experience: 4+ years
Work Location: Ahmedabad
Work Mode: WFO - 5 days in office
Work Timing: General
CTC Range: 20 LPA to 25 LPA

Job Summary: We are seeking a GCP Data Engineer professional to join a high-impact QA team working on mission-critical banking systems.

Key Responsibilities:
- Proficiency in Python for data processing and scripting
- Strong SQL knowledge and experience with relational databases (e.g., MySQL, PostgreSQL, SQL Server)
- Understanding of data modelling, data warehousing, and data architecture
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
- Proficiency in working with GCP (especially BigQuery and GCS)
- Version control skills using Git

If interested, click Apply now for an IMMEDIATE response.

Best,
DIVYA
GSN HR | divya@gsnhr.net | 9994042152 | Google review: https://g.co/kgs/UAsF9W
Posted 2 months ago
5.0 - 10.0 years
19 - 30 Lacs
Hyderabad
Work from Office
For Data Engineer: 3-5 years of experience, 2 openings.
For Sr. Data Engineer: 6-10 years of experience, 2 openings.

About Us
Logic Pursuits provides companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep Big Four consulting experience in business transformation and efficient processes, Logic Pursuits is a game-changer in any operations strategy.

Job Description
We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities
- Design and build robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, APIs, cloud storage, and flat files.
- Reverse-engineer and optimize SAP Data Services (SAP DS) jobs to support scalable migration to cloud-based data platforms.
- Implement layered data architectures (e.g., staging, intermediate, mart layers) to enable reliable and reusable data assets.
- Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
- Use orchestration tools like Airflow, dbt Cloud, and Control-M to schedule, monitor, and manage data workflows.
- Apply modular SQL practices, testing, documentation, and Git-based CI/CD workflows for version-controlled, maintainable code.
- Collaborate with data analysts, scientists, and architects to gather requirements, document solutions, and deliver validated datasets.
- Contribute to internal knowledge sharing through reusable dbt components and participate in Agile ceremonies to support consulting delivery.

Required Qualifications

Data Engineering Skills
- 3-5 years of experience in data engineering, with hands-on experience in Snowflake and basic to intermediate proficiency in dbt.
- Capable of building and maintaining ELT pipelines using dbt and Snowflake with guidance on architecture and best practices.
- Understanding of ELT principles and foundational knowledge of data modeling techniques (preferably Kimball/dimensional).
- Intermediate experience with SAP Data Services (SAP DS), including extracting, transforming, and integrating data from legacy systems.
- Proficient in SQL for data transformation and basic performance tuning in Snowflake (e.g., clustering, partitioning, materializations).
- Familiar with workflow orchestration tools like dbt Cloud, Airflow, or Control-M.
- Experience using Git for version control and exposure to CI/CD workflows in team environments.
- Exposure to cloud storage solutions such as Azure Data Lake, AWS S3, or GCS for ingestion and external staging in Snowflake.
- Working knowledge of Python for basic automation and data manipulation tasks.
- Understanding of Snowflake's role-based access control (RBAC), data security features, and general data privacy practices such as GDPR.

Data Quality & Documentation
- Familiar with dbt testing and documentation practices (e.g., dbt tests, dbt docs).
- Awareness of standard data validation and monitoring techniques for reliable pipeline development.

Soft Skills & Collaboration
- Strong problem-solving skills and ability to debug SQL and transformation logic effectively.
- Able to document work clearly and communicate technical solutions to a cross-functional team.
- Experience working in Agile settings, participating in sprints, and handling shifting priorities.
- Comfortable collaborating with analysts, data scientists, and architects across onshore/offshore teams.
- High attention to detail, proactive attitude, and adaptability in dynamic project environments.

Nice to Have
- Experience working in client-facing or consulting roles.
- Exposure to AI/ML data pipelines or tools like feature stores and MLflow.
- Familiarity with enterprise-grade data quality tools.

Education
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.

Additional Information
Why Join Us?
- Opportunity to work on diverse and challenging projects in a consulting environment.
- Collaborative work culture that values innovation and curiosity.
- Access to cutting-edge technologies and a focus on professional development.
- Competitive compensation and benefits package.
- Be part of a dynamic team delivering impactful data solutions.

Required Qualification: Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)
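The layered architecture this posting describes (raw data lands in a staging layer, and analysis-ready mart tables are derived from it) can be sketched without dbt or Snowflake; the following is a minimal illustrative Python example using the stdlib sqlite3 module, with hypothetical table and column names:

```python
import sqlite3

# Staging -> mart layering, as dbt would materialize it: raw rows land in a
# staging table, then a cleaned, analysis-ready mart table is built from it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount_cents INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, 1500, "shipped"), (2, 900, "cancelled"), (3, 2100, "shipped")],
)

# Mart layer: keep only completed orders and convert cents to currency units.
conn.execute("""
    CREATE TABLE mart_completed_orders AS
    SELECT order_id, amount_cents / 100.0 AS amount
    FROM stg_orders
    WHERE status = 'shipped'
""")
rows = conn.execute(
    "SELECT order_id, amount FROM mart_completed_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 15.0), (3, 21.0)]
```

In dbt the same separation would be expressed as one model per layer, with the mart model selecting from the staging model; the point here is only the layering pattern, not the tooling.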
Posted 2 months ago
12.0 - 20.0 years
35 - 50 Lacs
Bengaluru
Hybrid
Data Architect with cloud expertise: data architecture, data integration, and data engineering. ETL/ELT: Talend, Informatica, Apache NiFi. Big Data: Hadoop, Spark. Cloud platforms (AWS, Azure, GCP), Redshift, BigQuery. Python, SQL, Scala. GDPR, CCPA.
Posted 2 months ago
10.0 - 17.0 years
50 - 75 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role: Presales Senior Cloud Data Architect (with Data Warehousing Experience)
Employment Type: Full-Time

Professional Summary:
Onix is seeking an experienced Presales Senior Cloud Data Architect with a strong background in data warehousing and cloud platforms to play a pivotal role in the presales lifecycle and solution design process. This position is key to architecting scalable, secure, and cost-efficient data solutions that align with client business objectives. The ideal candidate will have deep expertise in data architecture, modeling, and cloud data platforms such as AWS and GCP, combined with the ability to lead and influence during the presales engagement phase.

Scope / Level of Decision Making:
This is an exempt position operating under limited supervision, with a high degree of autonomy in presales technical solutioning, client engagement, and proposal development. Complex decisions are escalated to the manager as necessary.

Primary Responsibilities:

Presales & Solutioning Responsibilities:
- Engage early in the sales cycle to understand client requirements, gather technical objectives, and identify challenges and opportunities.
- Partner with sales executives to develop presales strategies, define technical win themes, and align proposed solutions with client needs.
- Lead the technical discovery process, including stakeholder interviews, requirement elicitation, gap analysis, and risk identification.
- Design comprehensive cloud data architecture solutions, ensuring alignment with business goals and technical requirements.
- Develop Proofs of Concept (PoCs), technical demos, and architecture diagrams to validate proposed solutions and build client confidence.
- Prepare and deliver technical presentations, RFP responses, and detailed proposals for client stakeholders, including C-level executives.
- Collaborate with internal teams (sales, product, delivery) to scope solutions, define SOWs, and transition engagements to the implementation team.
- Drive technical workshops and architecture review sessions with clients to ensure stakeholder alignment.

Cloud Data Architecture Responsibilities:
- Deliver scalable and secure end-to-end cloud data solutions across AWS, GCP, and hybrid environments.
- Design and implement data warehouse architectures, data lakes, ETL/ELT pipelines, and real-time data streaming solutions.
- Provide technical leadership and guidance across multiple client engagements and industries.
- Leverage AI/ML capabilities to support data intelligence, automation, and decision-making frameworks.
- Apply cost optimization strategies, cloud-native tools, and best practices for performance tuning and governance.

Qualifications:

Required Skills & Experience:
- 8+ years of experience in data architecture, data modeling, and data management.
- Strong expertise in cloud-based data platforms (AWS/GCP), including data warehousing and big data tools.
- Proficient in SQL, Python, and at least one additional programming language (Java, C++, Scala, etc.).
- Knowledge of ETL/ELT pipelines, CI/CD, and automated delivery systems.
- Familiarity with NoSQL and SQL databases (e.g., PostgreSQL, MongoDB).
- Excellent presentation, communication, and interpersonal skills, especially in client-facing environments.
- Proven success working with C-level executives and key stakeholders.
- Experience with data governance, compliance, and security in cloud environments.
- Strong problem-solving and analytical skills.
- Ability to manage multiple initiatives and meet tight deadlines in a fast-paced setting.

Education: Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).

Travel Expectation: Up to 15% for client engagements and technical workshops.
Posted 2 months ago
5.0 - 9.0 years
13 - 17 Lacs
Pune
Work from Office
Diacto is looking for a highly capable Data Architect with 5 to 9 years of experience to lead cloud data platform initiatives with a primary focus on Snowflake and Azure Data Hub. This individual will play a key role in defining the data architecture strategy, implementing robust data pipelines, and enabling enterprise-grade analytics solutions. This is an on-site role based in our Baner, Pune office.

Qualifications:
- B.E./B.Tech in Computer Science, IT, or related discipline
- MCS/MCA or equivalent preferred

Key Responsibilities:
- Design and implement enterprise-level data architecture with a strong focus on Snowflake and Azure Data Hub
- Define standards and best practices for data ingestion, transformation, and storage
- Collaborate with cross-functional teams to develop scalable, secure, and high-performance data pipelines
- Lead Snowflake environment setup, configuration, performance tuning, and optimization
- Integrate Azure Data Services with Snowflake to support diverse business use cases
- Implement governance, metadata management, and security policies
- Mentor junior developers and data engineers on cloud data technologies and best practices

Experience and Skills Required:
- 5-9 years of overall experience in data architecture or data engineering roles
- Strong, hands-on expertise in Snowflake, including design, development, and performance tuning
- Solid experience with Azure Data Hub and Azure Data Services (Data Lake, Synapse, etc.)
- Understanding of cloud data integration techniques and ELT/ETL frameworks
- Familiarity with data orchestration tools such as dbt, Airflow, or Azure Data Factory
- Proven ability to handle structured, semi-structured, and unstructured data
- Strong analytical, problem-solving, and communication skills

Nice to Have:
- Certifications in Snowflake and/or Microsoft Azure
- Experience with CI/CD tools like GitHub for code versioning and deployment
- Familiarity with real-time or near-real-time data ingestion

Why Join Diacto Technologies?
- Work with a cutting-edge tech stack and cloud-native architectures
- Be part of a data-driven culture with opportunities for continuous learning
- Collaborate with industry experts and build transformative data solutions
Posted 2 months ago
5.0 - 9.0 years
14 - 17 Lacs
Pune
Work from Office
Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with DBT, GitHub, Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office.

Qualifications:
- B.E./B.Tech in Computer Science, IT, or related discipline
- MCS/MCA or equivalent preferred

Key Responsibilities:
- Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions
- Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery
- Define and implement best practices for data modeling, integration, governance, and security
- Collaborate with engineering and analytics teams to ensure data solutions meet business needs
- Lead development using tools such as DBT, Airflow, and GitHub for orchestration and version control
- Troubleshoot data issues and ensure system performance, reliability, and scalability
- Guide and mentor junior data engineers and developers
Posted 2 months ago
5.0 - 8.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Total Experience: 5-8 years, with 4+ years of relevant experience.

Skills:
- Proficiency on the Databricks platform
- Strong hands-on experience with PySpark, SQL, and Python
- Any cloud: Azure, AWS, GCP

Certifications (any of the following):
- Databricks Certified Associate Developer for Spark 3.0 (preferred)
- Databricks Certified Data Engineer Associate
- Databricks Certified Data Engineer Professional

Mandatory Skill Sets: Databricks, PySpark, SQL, Python, any cloud (Azure, AWS, GCP)
Preferred Skill Sets: Related certification - Databricks Certified Associate Developer for Spark 3.0 (preferred), Databricks Certified Data Engineer Associate, Databricks Certified Data Engineer Professional

Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred: -
Required Skills: Microsoft Azure
Additional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion
Posted 2 months ago
5.0 - 10.0 years
7 - 11 Lacs
Mumbai
Work from Office
Looking for a savvy Data Engineer to join a team of modeling/architecture experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives. This role requires a flexible working schedule, including potential weekend support for critical operations, while maintaining a 40-hour work week.

In this role, you will assist in maintaining the MDLZ DataHub Google BigQuery data pipelines and corresponding platforms (on-prem and cloud), working closely with global teams on DataOps initiatives. The D4GV platform spans three key GCP instances: NALA, MEU, and AMEA, supporting the global rollout of o9 across all Mondelez BUs over the next three years.

Requirements:
- 5+ years of overall industry experience and a minimum of 2-4 years of experience building and deploying large-scale data processing pipelines in a production environment
- Focus on excellence: has practical experience of data-driven approaches, is familiar with the application of data security strategy, and is familiar with well-known data engineering tools and platforms
- Technical depth and breadth: able to build and operate data pipelines, build and operate data storage, and has worked on big data architecture within distributed systems
- Is familiar with infrastructure definition and automation in this context, is aware of adjacent technologies to the ones they have worked on, and can speak to the alternative tech choices to those made on their projects

Responsibilities:
- Implementation and automation of internal data extraction from SAP BW / HANA
- Implementation and automation of external data extraction from openly available internet data sources via APIs
- Data cleaning, curation, and enrichment using Alteryx, SQL, Python, R, PySpark, SparkR
- Data ingestion and management in Hadoop / Hive
- Preparing consolidated DataMarts for use by data scientists and managing SQL databases
- Exposing data via Alteryx and SQL Database for consumption in Tableau
- Data documentation maintenance and updates
- Collaboration and workflow using a version control system (e.g., GitHub)
- Learning ability: is self-reflective, has a hunger to improve, has a keen interest in driving their own learning, and applies theoretical knowledge to practice
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
- Flexible working hours: this role requires the flexibility to work non-traditional hours, including providing support during off-hours or weekends for critical data pipeline job runs, deployments, or incident response, while ensuring the total work commitment remains a 40-hour week
- Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader
- Work with data and analytics experts to strive for greater functionality in our data systems

Skills and Experience:
- Deep knowledge in manipulating, processing, and extracting value from datasets; support the day-to-day operations of these GCP-based data pipelines, ensuring data governance, reliability, and performance optimization
- Hands-on experience with GCP data services such as Dataflow, BigQuery, Dataproc, Pub/Sub, and real-time streaming architectures is preferred
- 5+ years of experience in data engineering, business intelligence, data science, or a related field
- Proficiency with programming languages: SQL, Python, R, Spark, PySpark, SparkR for data processing
- Strong project management skills and ability to plan and prioritize work in a fast-paced environment
- Experience with: MS Azure Data Factory, MS Azure Data Lake Store, SQL Database, SAP BW / ECC / HANA, Alteryx, Tableau
- Ability to think creatively; highly driven and self-motivated
- Knowledge of SAP BW for HANA (extractors, transformations, modeling aDSOs, queries, OpenHubs)

No relocation support available.
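The data cleaning and curation work described above (normalizing raw extracts before they reach a DataMart) can be sketched in pure Python; this is an illustrative example only, and the field names and validation rules are hypothetical:

```python
# Minimal data-cleaning sketch: normalize raw records and reject invalid rows
# before loading the survivors into a curated, DataMart-style table.
raw_records = [
    {"sku": " A-100 ", "price": "19.99", "region": "meu"},
    {"sku": "B-200", "price": "bad", "region": "NALA"},   # unparseable price
    {"sku": "C-300", "price": "5.50", "region": "amea"},
]

def clean(record):
    """Return a normalized record, or None if it fails validation."""
    try:
        price = float(record["price"])
    except ValueError:
        return None  # reject rows whose price does not parse as a number
    return {
        "sku": record["sku"].strip(),          # trim stray whitespace
        "price": round(price, 2),              # standardize precision
        "region": record["region"].upper(),    # canonical region codes
    }

curated = [r for r in map(clean, raw_records) if r is not None]
print(curated)
# [{'sku': 'A-100', 'price': 19.99, 'region': 'MEU'},
#  {'sku': 'C-300', 'price': 5.5, 'region': 'AMEA'}]
```

In practice this shape of transform would run inside PySpark, Alteryx, or SQL as the posting lists; the rejected-row handling (quarantine vs. drop) is the design choice that matters most at scale.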
Posted 2 months ago
14.0 - 20.0 years
35 - 45 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Data Architect

Responsibilities:
- Design and implement enterprise data models, ensuring data integrity, consistency, and scalability.
- Analyse business needs and translate them into technical requirements for data storage, processing, and access.
- In-memory cache: optimizes query performance by storing frequently accessed data in memory.
- Query engine: processes and executes complex data queries efficiently.
- Business Rules Engine (BRE): enforces data access control and compliance with business rules.
- Select and implement appropriate data management technologies, including databases and data warehouses.
- Collaborate with data engineers, developers, and analysts to ensure seamless integration of data across various systems.
- Monitor and optimize data infrastructure performance, identifying and resolving bottlenecks.
- Stay up to date on emerging data technologies and trends, recommending and implementing solutions.
- Document data architecture and processes for clear communication and knowledge sharing, including the integration.

Qualifications:
- Proven experience in designing and implementing enterprise data models.
- Expertise in SQL and relational databases (e.g., Oracle, MySQL, PostgreSQL).
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP) is mandatory.
- Working experience with ETL tools and data ingestion leveraging real-time solutions (e.g., Kafka, streaming) is required.
- Strong understanding of data warehousing concepts and technologies.
- Familiarity with data governance principles and best practices.
- Excellent communication, collaboration, and problem-solving skills.
- Ability to work independently and as part of a team.
- Strong analytical and critical thinking skills.
- Experience with data visualization and UI development is a plus.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
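The in-memory cache idea mentioned above (serving repeated queries from memory instead of re-executing them) can be sketched in a few lines of Python; `run_query` and its call counter are hypothetical stand-ins for a real query engine:

```python
from functools import lru_cache

calls = {"count": 0}  # tracks how often the "expensive" query actually runs

@lru_cache(maxsize=128)
def run_query(sql):
    """Pretend to execute a query; results are cached, so repeated calls
    with the same SQL are served from memory without re-execution."""
    calls["count"] += 1
    return f"result-of:{sql}"

run_query("SELECT * FROM orders")
run_query("SELECT * FROM orders")  # cache hit: no second execution
run_query("SELECT * FROM users")
print(calls["count"])  # 2 -> only the two distinct queries were executed
```

A production cache would also need invalidation when the underlying tables change, which is the hard part the posting's BRE/governance bullets hint at; `lru_cache` only shows the lookup-by-key mechanics.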
Posted 2 months ago
10.0 - 15.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Position: Integration Architect

Job Description:

What you will be doing:
The Enterprise Architect contributes to enterprise architecture by ensuring that common architecture decisions are implemented consistently across business and IT in order to support the business and IT strategy. They will also research, analyze, design, propose, and deliver IT architecture solutions that are optimal for the business and IT strategies in one or more domains. The Enterprise Architect provides integrated systems analysis and recommends appropriate hardware, software, and communication links required to support IT goals and strategy. The Enterprise Architect is responsible for analyzing and incorporating, where appropriate, industry trends, and for being familiar with enterprise standards and methodology. The role will also have familiarity with the enterprise infrastructure and applications, and will specialize in a particular domain.

- Contribute to the definition of conceptual and logical architecture specifications (e.g., data architecture, application architecture, technical architecture) for the enterprise.
- Interact with business strategists, comprehend design impact to systems, and develop measurable business cases that support the architecture.
- Serve as an architecture conduit for external requests and queries, and provide education on enterprise architecture directions and goals.
- Manage organizational impacts of architecture.
- Work with other architects on enterprise architectural efforts, utilizing cross-functional knowledge (strategy, change management, and business process management).
- Analyze IT industry and market trends and determine potential impact on the enterprise, as well as identify and analyze enterprise business drivers to derive enterprise business, information, technical, and solution architecture requirements.
- Implement the overall architecture approach for all layers of a solution.
- Ensure that enterprise architecture standards, policies, and procedures are enacted uniformly across application development projects and programs.

What we are looking for:

Experience / Education: Typically requires a minimum of 10 years of related experience with a 4-year degree; or 8 years and an advanced degree; or equivalent experience.

Arrow Electronics, Inc. (NYSE: ARW) is an award-winning Fortune 133 company and one of Fortune Magazine's Most Admired Companies. Arrow guides innovation forward for over 220,000 leading technology manufacturers and service providers. With 2023 sales of USD $33.11 billion, Arrow develops technology solutions that improve business and daily life. Our broad portfolio, spanning the entire technology landscape, helps customers create, make, and manage forward-thinking products that make the benefits of technology accessible to as many people as possible. Learn more at www.arrow.com.

Our strategic direction of guiding innovation forward is expressed as Five Years Out, a way of thinking about the tangible future to bridge the gap between what's possible and the practical technologies to make it happen. Learn more at https://www.fiveyearsout.com/.

Location: IN-KA-Bangalore, India (SKAV Seethalakshmi) GESC
Time Type: Full time
Job Category: Information Technology
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Chennai
Work from Office
Job Summary
This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software). He/she participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements, provides input to applications development project plans and integrations, collaborates with teams, and supports emerging technologies to ensure effective communication and achievement of objectives. He/she provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Mandatory Skills: Full Stack Developer; .NET Core or above; Angular 14/16.

Responsibilities:
- Performs systems analysis and design.
- Designs and develops moderate to highly complex applications.
- Develops application documentation.
- Produces integration builds.
- Performs maintenance and support.
- Supports emerging technologies and products.

Qualifications:
- Bachelor's Degree or international equivalent
- Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - preferred
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Mumbai
Work from Office
Job Summary
This position provides leadership in full systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software) to ensure delivery is on time and within budget. He/she directs component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance, and develops and leads AD project activities and integrations. He/she guides teams to ensure effective communication and achievement of objectives, researches and supports the integration of emerging technologies, and provides knowledge and support for applications development, integration, and maintenance. This position leads junior team members with project-related activities and tasks, guides and influences department and project teams, and facilitates collaboration with stakeholders.

Responsibilities:
- Leads systems analysis and design.
- Leads design and development of applications.
- Develops and ensures creation of application documents.
- Defines and produces integration builds.
- Monitors emerging technology trends.
- Leads maintenance and support.

Primary Skills: .NET Full Stack (Angular 14 or above; .NET 6 or above)
Secondary Skills: Agile Development, DevOps, HTML, CSS, JS, WCAG 2.0

Qualifications:
- Bachelor's Degree or international equivalent
- Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - preferred
Posted 2 months ago
4.0 - 6.0 years
6 - 8 Lacs
Mumbai
Work from Office
Job Summary
This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software). He/she participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements, provides input to applications development project plans and integrations, collaborates with teams, and supports emerging technologies to ensure effective communication and achievement of objectives. He/she provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Knowledge of .NET Core and GCP is a must.

Responsibilities:
- Performs systems analysis and design.
- Designs and develops moderate to highly complex applications.
- Develops application documentation.
- Produces integration builds.
- Performs maintenance and support.
- Supports emerging technologies and products.

Qualifications:
- Bachelor's Degree or international equivalent
- Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - preferred
Posted 2 months ago
4.0 - 6.0 years
6 - 8 Lacs
Mumbai
Work from Office
Position - Senior Software Engineer - Full Stack - C#, .NET, MVC, Angular, Azure Cloud

Job Summary
This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software). He/she participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements, provides input to applications development project plans and integrations, collaborates with teams, and supports emerging technologies to ensure effective communication and achievement of objectives. He/she provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Responsibilities:
- Performs systems analysis and design.
- Designs and develops moderate to highly complex applications.
- Develops application documentation.
- Produces integration builds.
- Performs maintenance and support.
- Supports emerging technologies and products.

Primary Skills:
- C# / .NET Framework and .NET Core - MVC, Entity Framework
- Web development (ASP.NET / Angular)
- Database (MS SQL)
- REST API (micro/web services), WCF, Web API

Secondary Skills: Cloud & DevOps (CI/CD)

Qualifications:
- Bachelor's Degree or international equivalent
- Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - preferred
Posted 2 months ago
1.0 - 2.0 years
11 - 15 Lacs
Hyderabad
Work from Office
About the role We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 12 years of experience in software & data engineering and analytics and a proven track record of designing and implementing complex data solutions. You will be expected to design, create, deploy, and manage Blackbaud's data architecture. This role has considerable technical influence within the Data Platform, Data Engineering teams, and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifically with data, of other projects. What you'll be doing Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products and services Set, communicate and facilitate technical direction more broadly for the AI Center of Excellence and collaboratively beyond the Center of Excellence Design and develop breakthrough products, services or technological advancements in the Data Intelligence space that expand our business Work alongside product management to craft technical solutions to solve customer business problems. Own the technical data governance practices and ensure data sovereignty, privacy, security and regulatory compliance. Continuously challenge the status quo of how things have been done in the past. Build data access strategy to securely democratize data and enable research, modelling, machine learning and artificial intelligence work. Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice Work in a cross-functional team to translate business needs into data architecture solutions. Ensure data solutions are built for performance, scalability, and reliability. Mentor junior data architects and team members.
Keep current on technology: distributed computing, big data concepts and architecture. Promote internally how data within Blackbaud can help change the world. What we want you to have: 10+ years of experience in data and advanced analytics At least 8 years of experience working on data technologies in Azure/AWS Experience building modern products and infrastructure Experience working with .Net/Java and Microservice Architecture Expertise in SQL and Python Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies. Expertise in Databricks, Microsoft Fabric Strong understanding of data modeling, data warehousing, data lakes, data mesh and data products. Experience with machine learning Excellent communication and leadership skills. Able to work flexible hours as required by business priorities Ability to deliver software that meets consistent standards of quality, security and operability. Stay up to date on everything Blackbaud, follow us on LinkedIn, X, Instagram, Facebook and YouTube Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.
Posted 2 months ago
1.0 - 3.0 years
8 - 12 Lacs
Bengaluru
Work from Office
PhonePe is seeking passionate BI Engineers with 1-3 years of experience, ideally in Qlik Sense, to drive data availability and insights at scale. If you're driven by data and constantly seek better ways, join our innovative team! What would you get to do in this role Work with large-scale datasets and solve real-world data modeling challenges to ensure scalability, flexibility, and efficiency in reporting and analytics. Develop interactive Qlik dashboards for various stakeholders to support data-driven decision-making. Help build and optimize data models that support robust reporting and analytics capabilities, while ensuring seamless integration with the organization’s data architecture. Collaborate with stakeholders to understand data requirements and ensure the right data is provided at the right time. Use modern open-source tools and technologies in the data processing stack, with opportunities to experiment and implement automation to improve data workflows. Contribute to the design and development of scalable data warehousing pipelines to process and aggregate raw data into actionable insights. Learn and grow in a dynamic environment, gaining expertise in BI and data visualization best practices. What do you need to have to apply for this position 1-3 years of BI experience in relevant roles, preferably in a product-based firm. Proficient with Qlik Sense development, dashboard design and performance optimization. Proficient in creating and managing Qlik Sense reports, charts, and visualizations. Data warehousing, modeling & data flow understanding is desired. Strong knowledge in SQL - Hive experience will be preferred. Translate complex business requirements into interactive dashboards and reports. Good in collaboration and execution rigour.
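The modeling work this role describes often amounts to pre-aggregating raw events into summary tables a dashboard can load quickly. A minimal sketch in SQL using Python's built-in sqlite3 module; the table and column names are illustrative, not from the posting:

```python
import sqlite3

# Build a tiny transactions table and roll it up into a daily summary --
# the kind of pre-aggregated model a BI dashboard loads instead of
# scanning raw events on every refresh.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transactions (
    txn_id   INTEGER PRIMARY KEY,
    txn_date TEXT,
    category TEXT,
    amount   REAL
);
INSERT INTO transactions (txn_date, category, amount) VALUES
    ('2024-05-01', 'recharge', 199.0),
    ('2024-05-01', 'recharge', 99.0),
    ('2024-05-01', 'billpay',  450.0),
    ('2024-05-02', 'billpay',  300.0);
""")

# Daily summary: one row per (date, category) with count and total.
summary = conn.execute("""
    SELECT txn_date, category, COUNT(*) AS txn_count, SUM(amount) AS total
    FROM transactions
    GROUP BY txn_date, category
    ORDER BY txn_date, category
""").fetchall()

for row in summary:
    print(row)
```

In a real deployment the same GROUP BY logic would run in Hive or the warehouse, with the dashboard reading only the summary table.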
PhonePe Full Time Employee Benefits (Not applicable for Intern or Contract Roles) Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program Mobility Benefits - Relocation benefits, Transfer Support Policy, Travel Policy Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog. Life at PhonePe PhonePe in the news
Posted 2 months ago
10.0 - 15.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Novo Nordisk Global Business Services (GBS) India Department - Global Data & Artificial Intelligence Are you passionate about building scalable data pipelines and optimising data workflows? Do you want to work at the forefront of data engineering, collaborating with cross-functional teams to drive innovation? If so, we are looking for a talented Data Engineer to join our Global Data & AI team at Novo Nordisk. Read on and apply today for a life-changing career! The Position As a Senior Data Engineer, you will play a key role in designing, developing, and maintaining data pipelines and integration solutions to support analytics, Artificial Intelligence workflows, and business intelligence. It includes: Design, implement, and maintain scalable data pipelines and integration solutions aligned with the overall data architecture and strategy. Implement data transformation workflows using modern ETL/ELT approaches while establishing best practices for data engineering, including testing methodologies and documentation. Optimize data workflows by harmonizing and securely transferring data across systems, while collaborating with stakeholders to deliver high-performance solutions for analytics and Artificial Intelligence. Monitoring and maintaining data systems to ensure their reliability. Support data governance by ensuring data quality and consistency, while contributing to architectural decisions shaping the data platform's future. Mentoring junior engineers and fostering a culture of engineering excellence. Qualifications Bachelor's or master's degree in Computer Science, Software Development, or Engineering. Possess over 10 years of overall professional experience, including more than 4 years of specialized expertise in data engineering. Experience in developing production-grade data pipelines using Python, Databricks and Azure cloud, with a strong foundation in software engineering principles.
Experience in the clinical data domain, with knowledge of standards such as CDISC SDTM and ADaM (Good to have). Experience working in a regulated industry (Good to have). About the department You will be part of the Global Data & AI team. Our department is globally distributed and has the mission of harnessing the power of Data and Artificial Intelligence, integrating it seamlessly into the fabric of Novo Nordisk's operations. We serve as the vital link, weaving together the realms of Data and Artificial Intelligence throughout the whole organization, empowering Novo Nordisk to realize its strategic ambitions through our pivotal initiatives. The atmosphere is fast-paced and dynamic, with a strong focus on collaboration and innovation. We work closely with various business domains to create actionable insights and drive commercial excellence.
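The "harmonizing data across systems" step the role above describes typically means mapping each source's record shape onto one target schema. A toy pure-Python sketch; the source names, field names, and date formats are invented for illustration, not taken from the posting:

```python
from datetime import datetime

def harmonize(record: dict, source: str) -> dict:
    """Map a source-specific record onto a common target schema.

    Each source system names and formats its fields differently;
    the transform normalizes them so downstream analytics sees one shape.
    """
    if source == "lab_system":
        return {
            "subject_id": record["subj"],
            # lab system uses day/month/year; normalize to ISO-8601
            "measured_at": datetime.strptime(record["dt"], "%d/%m/%Y").date().isoformat(),
            "value": float(record["val"]),
        }
    if source == "clinic_system":
        return {
            "subject_id": record["patient_id"],
            "measured_at": record["date"],  # already ISO-8601
            "value": float(record["result"]),
        }
    raise ValueError(f"unknown source: {source}")

row = harmonize({"subj": "S-001", "dt": "03/07/2024", "val": "5.4"}, "lab_system")
print(row)  # {'subject_id': 'S-001', 'measured_at': '2024-07-03', 'value': 5.4}
```

Keeping the mapping in one small, pure function like this is what makes the "testing methodologies" best practice mentioned above cheap: each source mapping can be asserted against a known record in isolation.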
Posted 2 months ago
12.0 - 18.0 years
25 - 40 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities Azure Cloud Services (PaaS & IaaS): Proficient in deploying and managing cloud-based solutions using Azure's Platform-as-a-Service and Infrastructure-as-a-Service offerings. Data Engineering & Analytics: Azure Synapse Analytics: Integrated big data and data warehousing capabilities for comprehensive analytics solutions. Azure Data Factory: Developed and orchestrated ETL/ELT pipelines for seamless data movement and transformation. Azure Databricks & PySpark: Engineered scalable data processing workflows and machine learning models. Azure Stream Analytics: Implemented real-time data stream processing for immediate insights. Microsoft Fabric: Utilized AI-powered analytics for unified data access and management. Business Intelligence & Reporting: Power BI & SSRS: Designed and developed interactive dashboards and reports for data visualization and decision-making. SQL Server Analysis Services (SSAS): Built OLAP cubes and tabular models for multidimensional data analysis. Data Governance & Security: Microsoft Purview: Established comprehensive data governance frameworks to ensure compliance and data integrity. DevOps & Automation: Azure DevOps: Implemented CI/CD pipelines and automated deployment processes for efficient software delivery. Preferred candidate profile Technical Skills: Cloud Computing: Azure-Cloud Services (PaaS & IaaS), Active Directory, Application Insights, Azure Stream Analytics, Azure Search, Data Factory, Key Vault and SQL Azure, Azure Data Factory, Azure Analysis services, Azure Synapse Analytics (DW), Azure Data Lake, PySpark, Microsoft Fabric Database & BI Tools: SQL, T-SQL, SSIS, SSRS, SQL Server Management Studio (SSMS) 2016/2014, SQL Server Job Agent, Import and Export Data, Linked Servers. Reporting Tools: SSRS, Power BI reports, Tableau, Excel
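The real-time processing mentioned above (Azure Stream Analytics) is usually expressed as windowed aggregation over an event stream. A small pure-Python sketch of a tumbling window, the simplest windowing mode; the event shape and window size are assumptions for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count events per key in each window.

    This is the core of a tumbling window: each event belongs to exactly
    one window, determined by integer division of its timestamp.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(5, "sensor-a"), (42, "sensor-a"), (61, "sensor-b"), (119, "sensor-a")]
print(tumbling_window_counts(events))
# {0: {'sensor-a': 2}, 60: {'sensor-b': 1, 'sensor-a': 1}}
```

A streaming engine adds the hard parts this sketch omits: late-arriving events, watermarks, and emitting results incrementally rather than after the stream ends.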
Posted 2 months ago
16.0 - 22.0 years
40 - 55 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities • Minimum 15 years of experience • Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies. • Experience in Data strategies and developing logical and physical data models on RDBMS, NoSQL, and Cloud native databases. • Decent experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server) • Good understanding of Relational, Dimensional, Data Vault Modelling • Experience in implementing 2 or more data models in a database with data security and access controls. • Good experience in OLTP and OLAP systems • Excellent Data Analysis skills with demonstrable knowledge on standard datasets and sources. • Good Experience on one or more Cloud DW (e.g. Snowflake, Redshift, Synapse) • Experience on one or more cloud platforms (e.g. AWS, Azure, GCP) • Understanding of DevOps processes • Hands-on experience in one or more Data Modelling Tools • Good understanding of one or more ETL tools and data ingestion frameworks • Understanding of Data Quality and Data Governance • Good understanding of NoSQL Databases and modelling techniques • Good understanding of one or more Business Domains • Understanding of Big Data ecosystem • Understanding of Industry Data Models • Hands-on experience in Python • Experience in leading large and complex teams • Good understanding of agile methodology • Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization. • Excellent Communication skills • Understand the Business Requirements and translate business requirements into conceptual, logical and physical Data models. • Work as a principal advisor on data architecture, across various data requirements, aggregation, data lake data models, data warehouse, etc. • Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling.
• Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects. • Suggest best modelling approach to the client based on their requirement and target architecture. • Analyze and understand the Datasets and guide the team in creating Source to Target Mapping and Data Dictionaries, capturing all relevant details. • Profile the Data sets to generate relevant insights. • Optimize the Data Models and work with the Data Engineers to define the Ingestion logic, ingestion frequency and data consumption patterns. • Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance. • Drives automation in modelling activities. • Collaborate with Business Stakeholders, Data Owners, Business Analysts, Architects to design and develop next generation data platform. • Closely monitor the Project progress and provide regular updates to the leadership teams on the milestones, impediments etc. • Guide/mentor team members, and review artifacts. • Contribute to the overall data strategy and roadmaps. • Propose and execute technical assessments, proofs of concept to promote innovation in the data space.
Posted 2 months ago
16.0 - 21.0 years
40 - 60 Lacs
Pune, Gurugram, Delhi / NCR
Hybrid
Role & responsibilities Understand the Business Requirements and translate business requirements into conceptual, logical and physical Data models. Work as a principal advisor on data architecture, across various data requirements, aggregation, data lake data models, data warehouse, etc. Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling. Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects. Suggest best modelling approach to the client based on their requirement and target architecture. Analyze and understand the Datasets and guide the team in creating Source to Target Mapping and Data Dictionaries, capturing all relevant details. Profile the Data sets to generate relevant insights. Optimize the Data Models and work with the Data Engineers to define the Ingestion logic, ingestion frequency and data consumption patterns. Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance. Drives automation in modelling activities. Collaborate with Business Stakeholders, Data Owners, Business Analysts, Architects to design and develop next generation data platform. Closely monitor the Project progress and provide regular updates to the leadership teams on the milestones, impediments etc. Guide/mentor team members, and review artifacts. Contribute to the overall data strategy and roadmaps. Propose and execute technical assessments, proofs of concept to promote innovation in the data space. Preferred candidate profile Minimum 16 years of experience Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies. Experience in Data strategies and developing logical and physical data models on RDBMS, NoSQL, and Cloud native databases.
Decent experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server) Good understanding of Relational, Dimensional, Data Vault Modelling Experience in implementing 2 or more data models in a database with data security and access controls. Good experience in OLTP and OLAP systems Excellent Data Analysis skills with demonstrable knowledge on standard datasets and sources. Good Experience on one or more Cloud DW (e.g. Snowflake, Redshift, Synapse) Experience on one or more cloud platforms (e.g. AWS, Azure, GCP) Understanding of DevOps processes Hands-on experience in one or more Data Modelling Tools Good understanding of one or more ETL tools and data ingestion frameworks Understanding of Data Quality and Data Governance Good understanding of NoSQL Databases and modelling techniques Good understanding of one or more Business Domains Understanding of Big Data ecosystem Understanding of Industry Data Models Hands-on experience in Python Experience in leading large and complex teams Good understanding of agile methodology You are important to us, let’s stay connected! Every individual comes with a different set of skills and qualities so even if you don’t tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Perks and benefits
Posted 2 months ago
5.0 - 10.0 years
15 - 30 Lacs
Chennai
Hybrid
Job Summary: We are looking for a highly skilled Backend Data Engineer to join our growing FinTech team. In this role, you will design and implement robust data models and architectures, build scalable data ingestion pipelines, and ensure data quality across financial datasets. You will play a key role in enabling data-driven decision-making by developing efficient and secure data infrastructure tailored to the fast-paced FinTech environment. Key Responsibilities: Design and implement scalable data models and data architecture to support financial analytics, risk modeling, and regulatory reporting. Build and maintain data ingestion pipelines using Python or Java to process high-volume, high-velocity financial data from diverse sources. Lead data migration efforts from legacy systems to modern cloud-based platforms. Develop and enforce data validation processes to ensure accuracy, consistency, and compliance with financial regulations. Create and manage task schedulers to automate data workflows and ensure timely data availability. Collaborate with product, engineering, and data science teams to deliver reliable and secure data solutions. Optimize data processing for performance, scalability, and cost-efficiency in a cloud environment. Required Skills & Qualifications: Proficiency in Python and/or Java for backend data engineering tasks. Strong experience in data modelling, ETL/ELT pipeline development, and data architecture. Hands-on experience with data migration and transformation in financial systems. Familiarity with task scheduling tools (e.g., Apache Airflow, Cron, Luigi). Solid understanding of SQL and experience with relational and NoSQL databases. Knowledge of data validation frameworks and best practices in financial data quality. Experience with cloud platforms (AWS, GCP, or Azure), especially in data services. Understanding of data security, compliance, and regulatory requirements in FinTech.
Preferred Qualifications: Experience with big data technologies (e.g., Spark, Kafka, Hadoop). Familiarity with CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Exposure to financial data standards (e.g., FIX, ISO 20022) and regulatory frameworks (e.g., GDPR, PCI-DSS).
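The data-validation responsibility in this posting usually takes the shape of rule-driven checks run before records are loaded. A pure-Python sketch; the field names, accepted currencies, and rules are assumptions for illustration, not a real compliance ruleset:

```python
def validate_payment(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the
    record passes. Mirrors the kind of check an ingestion pipeline
    runs before loading financial data into the warehouse.
    """
    errors = []
    if not record.get("txn_id"):
        errors.append("missing txn_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        errors.append("amount must be a positive number")
    if record.get("currency") not in {"INR", "USD", "EUR"}:
        errors.append("unsupported currency")
    return errors

good = {"txn_id": "T1", "amount": 250.0, "currency": "INR"}
bad = {"txn_id": "", "amount": -5, "currency": "XYZ"}
print(validate_payment(good))  # []
print(validate_payment(bad))
```

Frameworks in this space (Great Expectations, for example) generalize the same idea: declarative rules evaluated per batch, with failing records quarantined rather than silently loaded.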
Posted 2 months ago
15.0 - 20.0 years
15 - 18 Lacs
Bengaluru
Work from Office
We are seeking a skilled and strategic Data and Analytics Executive to lead our data initiatives and build a high-performing data team. The ideal candidate will possess extensive experience in developing data teams, data architecture, analytics, and a strong ability to align data priorities with business goals. This role is essential in driving data-driven decision-making across the organization and ensuring that our data landscape serves our business objectives effectively. Key Responsibilities: 1. Team Building and Mentorship: Recruit, develop, and mentor a talented team of data professionals, including engineers, business analysts, data product managers, and visualization experts. Foster a culture of continuous learning and improvement within the team. Establish clear performance metrics and career development pathways for team members. 2. Goal Prioritization Based on Business Value: Collaborate with business teams to define and prioritize data and analytics initiatives that align with overall business strategy. Assess the potential business impact of various data projects to inform prioritization and resource allocation. Communicate project goals and progress to stakeholders to ensure alignment and transparency. 3. Design and Architecture of Data Landscape: Own the design and execution of the organization's data architecture, ensuring scalability, security, and accessibility. Evaluate and implement modern data technologies and frameworks that support business objectives. Collaborate with the Data Governance team and implement practices to maintain data quality, privacy, and compliance. 4. Development of Certified Data Products: Lead the development and deployment of data products that are reliable, scalable, and meet the needs of end users. Collaborate with product teams to identify opportunities for new data product development and enhancement. Establish testing and certification processes to validate data products' effectiveness and reliability. 5.
Ensuring Data Infrastructure Meets Business Requirements: Assess the organization's data infrastructure and identify opportunities for improvement to meet evolving business needs. Ensure that data pipelines, storage solutions, and analytics tools effectively support data integration and analysis. Monitor and improve the performance of data systems to ensure timely and accurate reporting. Required Qualifications: Bachelor's degree in Computer Science, Statistics, or a related field. Master's degree preferred. 15+ years of IT experience with a proven track record in building data teams with at least 5 years in a leadership role. Ability to build and lead high-performing teams in a dynamic and fast-paced environment. Strong understanding of data architecture, data governance, and analytics technologies. Excellent communication and interpersonal skills, with the ability to influence stakeholders at all levels.
Posted 2 months ago
5.0 - 8.0 years
17 - 19 Lacs
Bengaluru
Work from Office
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Requirements Total Experience 5-8 years with 4+ years of relevant experience Skills Proficiency on Databricks platform Strong hands-on experience with PySpark, SQL, and Python Any cloud - Azure, AWS, GCP Certifications (Any of the following) Databricks Certified Associate Developer for Spark 3.0 - Preferred Databricks Certified Data Engineer Associate Databricks Certified Data Engineer Professional Mandatory Skill Sets Databricks, PySpark, SQL, Python, Any cloud - Azure, AWS, GCP Preferred Skill Sets Related Certification - Databricks Certified Associate Developer for Spark 3.0 - Preferred Databricks Certified Data Engineer Associate Databricks Certified Data Engineer Professional Education Qualification BE, B.Tech, ME, M.Tech, MBA, MCA Education Degrees/Field of Study required Master of Engineering, Master of Business Administration, Bachelor of Engineering Degrees/Field of Study preferred Certifications (if blank, certifications not specified) Required Skills Databricks Platform Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline
Posted 2 months ago
5.0 - 8.0 years
17 - 19 Lacs
Bengaluru
Work from Office
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Requirements Total Experience 5-8 years with 4+ years of relevant experience Skills Proficiency on Databricks platform Strong hands-on experience with PySpark, SQL, and Python Any cloud - Azure, AWS, GCP Certifications (Any of the following) Databricks Certified Associate Developer for Spark 3.0 - Preferred Databricks Certified Data Engineer Associate Databricks Certified Data Engineer Professional Mandatory Skill Sets Databricks, PySpark, SQL, Python, Any cloud - Azure, AWS, GCP Preferred Skill Sets Related Certification - Databricks Certified Associate Developer for Spark 3.0 - Preferred Databricks Certified Data Engineer Associate Databricks Certified Data Engineer Professional Education Qualification BE, B.Tech, ME, M.Tech, MBA, MCA Education Degrees/Field of Study required Bachelor of Engineering, Master of Business Administration, Master of Engineering Required Skills Databricks Platform Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis
Posted 2 months ago