Jobs
Interviews

1817 Data Architecture Jobs - Page 24

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. The person will work on a variety of projects in a highly collaborative, fast-paced environment and will be responsible for software development activities of KPMG, India. As part of the development team, he/she will work on the full life cycle of the process, developing code and unit tests. He/she will work closely with Technical Architects, Business Analysts, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices comply with KPMG's best-practice policies and procedures. This role requires quick ramp-up on new technologies whenever required. Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Role: Azure Data Engineer
Location: Bangalore
Experience: 4 to 6 years

Responsibilities:
- Data Management: Design, implement, and manage data solutions on the Microsoft Azure cloud platform.
- Data Integration: Develop and maintain data pipelines, ensuring efficient data extraction, transformation, and loading (ETL) processes using Azure Data Factory.
- Data Storage: Work with various Azure data storage solutions such as Azure SQL Database, Azure Data Lake Storage, and Azure Cosmos DB.
- Big Data Processing: Utilize big data technologies such as Azure Databricks and Apache Spark to handle and analyze large datasets.
- Data Architecture: Design and optimize data models and architectures to meet business requirements.
- Performance Monitoring: Monitor and optimize the performance of data systems and pipelines.
- Collaboration: Collaborate with data scientists, analysts, and other stakeholders to support data-driven decision-making.
- Security and Compliance: Ensure data solutions comply with security and regulatory requirements.

Requirements:
- Technical Skills: Proficiency in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data tools.
- Analytical Skills: Strong analytical and problem-solving skills.
- Communication: Excellent communication and teamwork skills.
- Certifications: Relevant certifications such as Microsoft Certified: Azure Data Engineer Associate are a plus.
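The extract-transform-load pattern this role centers on can be sketched in a few lines of plain Python. This is a minimal illustration only: a real pipeline would use Azure Data Factory activities or Databricks rather than sqlite3, and the table, field names, and sample rows here are hypothetical.

```python
import sqlite3

# Hypothetical rows standing in for an extract from a source system.
raw_orders = [
    {"id": 1, "amount": "150.00", "region": " south "},
    {"id": 2, "amount": "99.50",  "region": "NORTH"},
]

def transform(row):
    # Normalize types and casing before loading into the warehouse table.
    return (row["id"], float(row["amount"]), row["region"].strip().lower())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [transform(r) for r in raw_orders])
conn.commit()

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 249.5
```

The same three stages (extract, per-row transform, bulk load) map directly onto ADF's copy and data-flow activities.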

Posted 1 month ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do:
- Provide adequate support in architecture planning, migration, and installation for new projects in own tower (platform/database/middleware/backup).
- Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution.
- Conduct technology capacity planning by reviewing current and future requirements.
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable.
- Strategize and implement disaster recovery plans; create and implement backup and recovery plans.
- Manage the day-to-day operations of the tower by troubleshooting issues, conducting root cause analysis (RCA), and developing fixes to avoid similar issues.
- Plan for and manage upgrades, migration, maintenance, backup, installation, and configuration functions for own tower.
- Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges.
- Develop a shift roster for the team to ensure no disruption in the tower.
- Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities: progress, updates, status, and next steps.
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness.

Team Management:
- Resourcing: Forecast talent requirements as per current and future business needs; hire adequate and right resources for the team; train direct reportees to make right recruitment and selection decisions.
- Talent Management: Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool of HiPos and ensure their career progression within the organization; promote diversity in leadership positions.
- Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities presented by such programs, for themselves and the levels below them.
- Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team; proactively challenge the team with larger and enriching projects/initiatives for the organization or team; exercise employee recognition and appreciation.

Mandatory Skills: Big Data Consulting.
Experience: 5-8 Years.

Posted 1 month ago

Apply

3.0 - 6.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience with the Cisco client is preferred.

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 9 Lacs

Nagpur

Work from Office

Project Role: SAP CPI Consultant
Project Role Description: Work on an end-to-end integration solution. Drive client discussions to define the integration requirements and translate the business requirements into the technology solution. Activities include mapping business processes to supporting applications, defining the data entities, and selecting integration technology components and patterns.
Work Experience: 4-6 years
Work Location: Nagpur
Must Have Skills: SAP CPI, API
Key Responsibilities: The candidate is expected to work with the Functional and Data Architecture team members to facilitate design and development across the required project life cycle, should be able to understand requirements from the client and develop the interfaces independently, and should be ready to work from the office for 15 days.

Posted 1 month ago

Apply

12.0 - 18.0 years

50 - 65 Lacs

Bengaluru

Work from Office

Oversee the delivery of data engagements across a portfolio of client accounts, understanding their specific needs, goals, and challenges. Provide mentorship and guidance for the Architects, Project Managers, and technical teams on data engagements.

Required candidate profile: 12+ years of experience, hands-on in Data Architecture, and an expert in Databricks or Azure. Should currently be in a data engineering leadership or management role.

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Project description
We are a leading international bank going through a significant transformation of its front-to-back operations, marked as one of the bank's top 3 transformation agendas. F2B Business Architecture is a small central global team in CIB-CTB supporting the delivery of these key front-to-back (F2B) transformation priorities of the management board. The Data Architecture team will play the central role of defining the data model that will align the business processes, ensure data lineage and effective controls, and implement efficient client strategy and reporting solutions. This will require building strong relationships with key stakeholders and helping deliver tangible value. The role reports to the India Head of Investment Bank and Cross Product F2B Operations.

Responsibilities
- Be part of the CTB team to define and manage data models used to implement solutions that automate F2B business processes and controls.
- Ensure the models follow the bank's data modelling standards and principles, and influence them as necessary.
- Actively partner with various functional leads and teams to socialize the data models towards adoption and execution of front-to-back solutions.

Skills
Must have:
- 10+ years in financial services, preferably strategy and solutions in the Corporate and Investment Banking domain.
- Strong knowledge of transaction banking domain processes and controls for banking and trading businesses, to drive conversations with business SMEs.
- Experience in developing models for transaction banking products is preferable.
- Knowledge of the Loans or Cash/Deposits lifecycle and/or customer lifecycle, and the related business data required to manage operations and analytics, is desirable.
- Well-developed business requirements analysis skills, including good communication abilities (both speaking and listening), influencing, and stakeholder management (all levels up to managing director).
- Can partner with Technology and the business to understand current issues and articulate recommendations and solutions.
- Experience working in an enterprise agile environment in a matrix organization.
- Critical problem-solving skills; able to think tactically and strategically.
- Strong design experience and experience defining solutions.
- Knowledge of banking industry data models/best practices is a plus.
- Consolidates process, data, and existing architecture to drive recommendations and solutions.
- Strong data analysis skills, SQL/Python experience, and the ability to build data models are desirable.
Nice to have: Good tech stack.

Posted 1 month ago

Apply

6.0 - 11.0 years

13 - 18 Lacs

Gurugram

Work from Office

Project description
We are looking for an experienced Senior ServiceNow (SNOW) Engineer to join our IT Operations team. You will be responsible for designing robust data models, developing custom reports, and building seamless API integrations within the ServiceNow platform. You should have a strong background in ITSM processes, data architecture, and hands-on experience with ServiceNow development and automation. You will play a pivotal role in optimizing our ServiceNow environment to enhance service delivery, operational visibility, and integration with enterprise systems.

Responsibilities
Internal Data Structures & Configuration:
- Design, build, and maintain data models, tables, and relationships within the ServiceNow platform.
- Extend and customize out-of-the-box modules (e.g., CMDB, Incident, Change, Request) to meet business requirements.
- Ensure data integrity, normalization, and performance optimization across the ServiceNow environment.
- Collaborate with stakeholders to translate business requirements into scalable ServiceNow configurations or custom applications.
Reporting & Dashboards:
- Develop real-time dashboards and reports using ServiceNow reporting tools and Performance Analytics.
- Deliver insights into key ITSM metrics such as SLAs, incident trends, and operational KPIs.
- Automate the generation and distribution of recurring reports to stakeholders.
- Work with business and technical teams to define and implement reporting frameworks tailored to their needs.
Automated Feeds & API Integration:
- Develop and manage robust data integrations using ServiceNow REST/SOAP APIs.
- Build and maintain data pipelines to and from external systems (e.g., CMDB, HRIS, ERP, Flexera).
- Implement secure, scalable automation for data exchange with appropriate error handling, logging, and monitoring.
- Troubleshoot and resolve integration-related issues to ensure smooth system interoperability.

Skills
Must have:
- Minimum 6+ years of hands-on experience with ServiceNow, including ITSM, CMDB, and integrations.
- Technical expertise: advanced knowledge of ServiceNow architecture, configuration, and scripting (JavaScript, Glide); strong experience with REST/SOAP APIs for ServiceNow integrations; solid understanding of relational databases, data normalization, and model optimization; familiarity with common enterprise systems such as ERP, HRIS, Flexera, and CMDB tools.
- Reporting skills: proficiency in ServiceNow Performance Analytics, standard reporting, and dashboard design; experience defining KPIs and building automated reporting solutions.
- Soft skills: strong communication and collaboration skills; proven ability to translate business requirements into scalable ServiceNow solutions; analytical and detail-oriented mindset with a problem-solving approach.
Nice to have: N/A.
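The REST integration work described above typically goes through the ServiceNow Table API, which exposes records at /api/now/table/<table>. A minimal sketch of building such a request with the standard library: the instance name and query values are hypothetical, and the request is only constructed here, not sent (a real integration would add authentication and error handling).

```python
from urllib.parse import urlencode

instance = "example"   # hypothetical ServiceNow instance name
table = "incident"

# Query open, high-priority incidents; sysparm_query uses
# ServiceNow's encoded-query syntax ("^" joins conditions).
params = urlencode({
    "sysparm_query": "active=true^priority=1",
    "sysparm_fields": "number,short_description,priority",
    "sysparm_limit": "10",
})
url = f"https://{instance}.service-now.com/api/now/table/{table}?{params}"
print(url)
```

Restricting `sysparm_fields` and setting `sysparm_limit` keeps the response payload small, which matters for scheduled feeds that poll the table frequently.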

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Project description
The Application Modernization Practice is a horizontal practice supporting all business verticals in DXC. As a Senior Modernization Architect, you'll play a pivotal role in both shaping modernization solutions and supervising delivery execution. You will partner with sales, delivery, and clients to design transformation paths from legacy to modern architectures, integrating GenAI accelerators and helping deliver tangible business value.

Responsibilities
- Collaborate on pre-sales engagements: assessments, proposals, orals, and business case creation.
- Design modernization paths from legacy systems (COBOL, z/OS, etc.) to modern stacks (Java, MSA, cloud).
- Lead effort estimation, tool strategy selection, and transformation approach definition.
- Provide architectural oversight during execution to ensure value realization.
- Participate in tooling evaluations and PoCs involving GenAI and automation accelerators.

Skills
Must have:
- 8+ years in enterprise application architecture, with at least 3 years in modernization.
- Proven ability to assess legacy estates and define future-state architectures.
- Proficiency in mainframe tech (COBOL, DB2, CICS) and modern stacks (Java, Spring, microservices).
- Exposure to GenAI use cases in application engineering and code conversion.
- Strong client communication, technical documentation, and stakeholder alignment skills.
Nice to have: Java, Python, C#

Posted 1 month ago

Apply

7.0 - 12.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Date: 25 Jun 2025
Location: Bangalore, KA, IN
Company: Alstom

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

Your future role
Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates, and you'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, but also of managing and optimizing object storage systems.

We'll look to you for:
- Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow.
- Creating robust Python scripts for data ingestion, transformation, and validation.
- Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage.
- Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets.
- Implementing data quality checks, monitoring, and alerting mechanisms.
- Ensuring data security, governance, and compliance with industry standards.
- Mentoring junior engineers and promoting best practices in data engineering.

All about you
We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark).
- Hands-on experience with Apache NiFi for data flow automation.
- Deep understanding of object storage systems and cloud data architectures.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow.
- Experience working in cross-functional teams with Data Scientists and ML Engineers.
- Cloud certifications or relevant technical certifications are a plus.

Things you'll enjoy
Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines.
- Work with advanced data and cloud technologies to drive innovation.
- Collaborate with cross-functional teams and helpful colleagues.
- Contribute to innovative projects that have a global impact.
- Utilise our flexible and hybrid working environment.
- Steer your career in whatever direction you choose, across functions and countries.
- Benefit from our investment in your development through award-winning learning programs.
- Progress towards leadership roles or specialized technical paths.
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension).

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!
Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

entomo is an Equal Opportunity Employer. The company promotes and supports a diverse workforce at all levels across the organization. We ensure that our associates, potential hires, third-party support staff, and suppliers are not discriminated against, directly or indirectly, based on color, creed, caste, race, nationality, ethnicity, national origin, marital status, pregnancy, age, disability, religion or similar philosophical belief, sexual orientation, gender, or gender reassignment.

We are looking for a skilled and experienced Data Engineer with 3 to 5 years of experience to design, build, and optimize scalable data pipelines and infrastructure. The ideal candidate will work closely with data scientists, analysts, and software engineers to ensure reliable and efficient data delivery throughout our data ecosystem.

Key Responsibilities
- Design, implement, and maintain robust data pipelines using ETL/ELT frameworks
- Build and manage data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
- Optimize data systems for performance, scalability, and cost-efficiency
- Ensure data quality, consistency, and integrity across various sources
- Collaborate with cross-functional teams to integrate data from multiple business systems
- Implement data governance, privacy, and security best practices
- Monitor and troubleshoot data workflows and perform root cause analysis on data issues
- Automate data integration and validation using scripting languages (e.g., Python, SQL)
- Work with DevOps teams to deploy data solutions using CI/CD pipelines

Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field
- 3-5 years of experience in data engineering or a similar role
- Strong proficiency in SQL and at least one programming language (Python, Java, or Scala)
- Experience with cloud platforms (AWS, Azure, or GCP)
- Hands-on experience with data pipeline tools (e.g., Apache Airflow, Luigi, DBT)
- Proficient with relational and NoSQL databases
- Familiarity with big data tools (e.g., Spark, Hadoop)
- Good understanding of data architecture, modeling, and warehousing principles
- Excellent problem-solving and communication skills

Preferred Qualifications
- Certifications in cloud platforms or data engineering tools
- Experience with containerization tools (e.g., Docker, Kubernetes)
- Knowledge of real-time data processing tools (e.g., Kafka, Flink)
- Exposure to data privacy regulations (e.g., GDPR, HIPAA)

By clicking the submit application button, you consent to entomo processing your personal information for the purpose of assessing your candidacy for this position, in accordance with the entomo Job Applicant Privacy Policy.
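The data-quality responsibility listed above (ensuring quality, consistency, and integrity across sources) often reduces to rule-based validation applied per record. A minimal stdlib sketch; the column names and rules below are hypothetical, and a production pipeline would run such checks inside a framework like Airflow or a tool like dbt tests:

```python
def validate(rows, rules):
    """Return a list of (row_index, column, message) for every rule violation."""
    failures = []
    for i, row in enumerate(rows):
        for column, check, message in rules:
            if not check(row.get(column)):
                failures.append((i, column, message))
    return failures

# Hypothetical rules: each is (column, predicate, failure message).
rules = [
    ("email", lambda v: isinstance(v, str) and "@" in v, "invalid email"),
    ("age",   lambda v: isinstance(v, int) and 0 <= v < 130, "age out of range"),
]

rows = [
    {"email": "a@example.com", "age": 34},
    {"email": "not-an-email",  "age": 34},
    {"email": "b@example.com", "age": -1},
]

print(validate(rows, rules))
# → [(1, 'email', 'invalid email'), (2, 'age', 'age out of range')]
```

Emitting structured failures (rather than raising on the first bad row) lets the pipeline quarantine offending records and alert on failure rates.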

Posted 1 month ago

Apply

3.0 - 8.0 years

17 - 20 Lacs

India, Bengaluru

Work from Office

We are looking for a highly motivated and experienced IT Enterprise Architect (f/m/d) with a strong focus on end-to-end (E2E) customer service processes. You will play a key role in shaping and aligning our IT landscape across platforms such as SAP, ServiceNow, and other customer service-related systems. Your expertise will help drive the digital transformation of our global service processes, ensuring scalability, resilience, and an excellent customer experience.

Your tasks and responsibilities:
- You are responsible for enterprise architecture management (including business-IT alignment and application portfolio analysis) and the derivation of IT strategies from business requirements.
- You design and maintain the end-to-end enterprise architecture for all customer service processes and supporting processes (e.g., spare parts management, returns management, technician skill matching).
- You lead cross-functional workshops and architecture communities to align business goals with IT strategy.
- You drive the development of the architecture framework, the architecture roadmap, and the application and data architecture for the end-to-end customer service business process.
- You guide the selection and integration of platforms such as SAP S/4HANA, SAP Sales Cloud, Salesforce, Oracle Sales Cloud, and ServiceNow.
- You model IT architectures and processes and drive the consistent design, planning, and implementation of IT solutions.
- You contribute to solution evaluations, RFI/RFP processes, and vendor selection in the customer service space.
- You coordinate communication with all key decision-makers and relevant stakeholders and advise them on the development of the IT landscape.
- You drive documentation and presentations to ensure executive alignment.

Your qualifications and experience:
- You have a degree in computer science, industrial engineering, or a comparable qualification.
- You have experience as an Enterprise Architect or Solution/Domain Architect in customer-facing IT landscapes.
- You are familiar with enterprise architecture methods and frameworks, governance structures, and IT service management frameworks (e.g., TOGAF, Zachman, ITIL).
- You bring functional or IT implementation experience across all customer service processes and functions (installation and maintenance, customer service, field service, material logistics, finance, etc.).
- You have experience in the implementation of customer service solutions (e.g., ServiceNow, Salesforce, SAP Service Cloud, SAP Field Service Management, Oracle Sales Cloud, CPQ, Spryker).
- You have extensive experience with data architecture and integration concepts and a very good understanding of cloud technologies (e.g., Azure, AWS).
- You have gained practical experience with enterprise architecture tools such as BizzDesign, LeanIX, or Avolution and have good knowledge of modeling and managing business processes.

Your attributes and skills:
- You have sound technological know-how and several years of experience in complex technology landscapes.
- We require a very good command of English, both spoken and written, for cooperation with specialist departments in Germany and abroad. Ideally, you also have a very good command of German.
- You are an organizational talent and impress with good communication and presentation skills.
- You are a team player with strong interpersonal skills who can operate confidently in a global environment.
- We do not compromise on quality: you work in a results- and quality-oriented manner with a high level of commitment and have good analytical and conceptual skills.
- You are flexible in thought and action, have a quick understanding, and demonstrate constructive assertiveness.

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office

Job Title: Data Modeler
Location: Pune, Maharashtra, India
Experience: 4 to 8 Years

As a Data Architect at JCI, you will play a pivotal role in designing and implementing robust data solutions that support our analytics and business intelligence initiatives. This role requires extensive experience in data modeling, data warehousing, and familiarity with cloud technologies.

How you will do it
- Design and implement data architecture solutions that meet business requirements and align with the overall data strategy.
- Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and ensure data availability and quality.
- Create and maintain data models, ensuring they are optimized for performance and scalability.
- Establish data governance practices to maintain data integrity and security across the organization.
- Lead the design and implementation of data integration processes, including ETL workflows and data pipelines.
- Evaluate and recommend new tools and technologies to improve data management capabilities.
- Provide technical leadership and mentorship to other team members in best practices for data architecture.
- Stay current with industry trends and advancements in data technologies and methodologies.

What we look for
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4 to 8 years of experience in data architecture or a similar role.
- Strong proficiency in SQL and experience with data modeling and database design.
- Experience with cloud data solutions, such as AWS, Azure, or Google Cloud Platform.
- Familiarity with data warehousing concepts and tools.
- Excellent analytical and problem-solving skills.
- Strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.

Join JCI and leverage your expertise to create impactful data solutions that drive our business forward!
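The data modeling and warehousing work this role describes often takes the form of a star schema: a central fact table joined to small dimension tables. A minimal sketch using sqlite3 so it runs anywhere; the table and column names are hypothetical, and a real warehouse would use Snowflake, Synapse, or similar:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Hypothetical star schema: one fact table keyed to two dimensions.
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,
    full_date TEXT NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER NOT NULL,
    amount      REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20250101, '2025-01-01')")
conn.execute("INSERT INTO dim_product VALUES (1, 'widget')")
conn.execute("INSERT INTO fact_sales VALUES (20250101, 1, 3, 29.97)")

# Typical analytics query: aggregate the fact, label via the dimension.
row = conn.execute("""
    SELECT p.name, SUM(f.quantity)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name
""").fetchone()
print(row)  # ('widget', 3)
```

Keeping measures in the fact table and descriptive attributes in dimensions is what makes such schemas fast to aggregate and easy for BI tools to navigate.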

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 27 Lacs

Pune

Hybrid

Role & responsibilities
Join as an AVP - Business Analyst (Data Designer) at a leading UK-based bank, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with data design compliance with best practices, governance, and security policies, data profiling and analysis, and data design specifications, as well as job-specific skillsets.

Basic/Essential Qualifications:
- Has designed and developed detailed data models, schemas, and database designs.
- Understands data requirements and translates them into effective data designs and data flows.
- Optimizes data structures for performance and scalability in alignment with business objectives.
- Experience in conducting data profiling and analysis to identify data quality issues and propose solutions.
- Understands data design specifications and maintains data dictionaries.
- Proficiency in SQL and familiarity with database management systems (e.g., Oracle, SQL Server, MySQL, Kafka, AWS).

Desirable skillsets / good to have:
- Bachelor's degree in Business Administration, Data Science, or a related field.
- Proven experience in data modeling, database design, and data governance frameworks.
- Knowledge of data warehousing concepts and tools.
- Basic understanding of the financial crime domain.
- Excellent communication skills to interact with both technical and non-technical stakeholders.
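Data profiling, as mentioned in the qualifications above, usually starts with per-column summaries: null counts and distinct-value counts that expose quality issues before modeling begins. A minimal stdlib sketch over hypothetical rows (real profiling would run in SQL or a tool like pandas against the source system):

```python
def profile(rows):
    """Per-column null count and distinct-value count for a list of dict rows."""
    columns = set().union(*(r.keys() for r in rows))
    report = {}
    for col in sorted(columns):
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

rows = [
    {"id": 1, "country": "IN"},
    {"id": 2, "country": None},
    {"id": 3, "country": "IN"},
]
print(profile(rows))
# → {'country': {'nulls': 1, 'distinct': 1}, 'id': {'nulls': 0, 'distinct': 3}}
```

A null rate of one in three on `country`, as here, is exactly the kind of finding a data designer would feed back into the data dictionary and quality rules.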

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Chennai

Work from Office

Job Summary:
We are seeking a highly skilled Data Engineer to design, develop, and maintain robust data pipelines and architectures. The ideal candidate will transform raw, complex datasets into clean, structured, and scalable formats that enable analytics, reporting, and business intelligence across the organization. This role requires strong collaboration with data scientists, analysts, and cross-functional teams to ensure timely and accurate data availability and system performance.

Key Responsibilities
- Design and implement scalable data pipelines to support real-time and batch processing.
- Develop and maintain ETL/ELT processes that move, clean, and organize data from multiple sources.
- Build and manage modern data architectures that support efficient storage, processing, and access.
- Collaborate with stakeholders to understand data needs and deliver reliable solutions.
- Perform data transformation, enrichment, validation, and normalization for analysis and reporting.
- Monitor and ensure the quality, integrity, and consistency of data across systems.
- Optimize workflows for performance, scalability, and cost-efficiency.
- Support cloud and on-premise data integrations, migrations, and automation initiatives.
- Document data flows, schemas, and infrastructure for operational and development purposes.
- Apply best practices in data governance, security, and compliance.

Required Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Proven 6+ years of experience in data engineering, ETL development, or data pipeline management.
- Proficiency with tools and technologies such as: SQL, Python, Spark, Scala; ETL tools (e.g., Apache Airflow, Talend); cloud platforms (e.g., AWS, GCP, Azure); big data tools (e.g., Hadoop, Hive, Kafka); data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Strong understanding of data modeling, data architecture, and data lakes.
- Experience with CI/CD, version control, and working in Agile environments.

Preferred Qualifications:
- Experience with data observability and monitoring tools.
- Knowledge of data cataloging and governance frameworks.
- AWS/GCP/Azure data certification is a plus.
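One recurring transformation behind responsibilities like "clean and organize data from multiple sources" is deduplication: keeping the most recent version of each record when the same key arrives from several systems. Deduplication isn't named in the listing; it's shown here as a representative cleaning step, with hypothetical field names:

```python
def dedupe_latest(records, key="id", version="updated_at"):
    """Keep the most recent record per key (last-write-wins)."""
    latest = {}
    for rec in records:
        k = rec[key]
        # ISO-8601 date strings compare correctly as plain strings.
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

records = [
    {"id": 1, "updated_at": "2025-01-01", "status": "new"},
    {"id": 1, "updated_at": "2025-02-01", "status": "shipped"},
    {"id": 2, "updated_at": "2025-01-15", "status": "new"},
]
print(dedupe_latest(records))
```

In a warehouse the same last-write-wins logic is commonly expressed as a `ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC)` filter.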

Posted 1 month ago

Apply

12.0 - 15.0 years

40 - 45 Lacs

New Delhi, Pune

Hybrid

Role & responsibilities
Experience: 10-14 years
Key skills: Azure Data Architect, SQL, data modeling, dimensional data modeling, Databricks or Synapse
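The dimensional data modeling skill named above can be illustrated with a minimal star-schema sketch: a fact table referencing dimension tables by surrogate key. All table and column names here are hypothetical, chosen purely for illustration.

```python
# Star-schema sketch: a sales fact table joined to date and product dimensions.
# Names and values are hypothetical illustrations.
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Hardware"},
}
dim_date = {
    20240101: {"year": 2024, "quarter": "Q1"},
    20240401: {"year": 2024, "quarter": "Q2"},
}
fact_sales = [
    {"date_key": 20240101, "product_key": 1, "amount": 100.0},
    {"date_key": 20240101, "product_key": 2, "amount": 50.0},
    {"date_key": 20240401, "product_key": 1, "amount": 75.0},
]

def sales_by_quarter(facts, dates):
    # Join fact rows to the date dimension and aggregate amounts by quarter.
    out = {}
    for row in facts:
        quarter = dates[row["date_key"]]["quarter"]
        out[quarter] = out.get(quarter, 0.0) + row["amount"]
    return out

print(sales_by_quarter(fact_sales, dim_date))  # {'Q1': 150.0, 'Q2': 75.0}
```

In Databricks or Synapse the same shape would be expressed as warehouse tables, with the join and aggregation written in SQL or Spark rather than plain Python.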

Posted 1 month ago

Apply

2.0 - 7.0 years

12 - 16 Lacs

Hyderabad

Work from Office

WHAT YOU'LL DO
- Build scalable data infrastructure solutions.
- Design and optimize new and existing data pipelines.
- Integrate new data sources into our existing data architecture.
- Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy's data needs.

WHAT YOU'LL NEED
- 3 to 5 years of hands-on experience building and maintaining large-scale ETL systems.
- Deep understanding of database design and data structures: SQL and NoSQL.
- Fluency in Python.
- Experience working with cloud-based data platforms (we use AWS).
- SQL and data warehousing skills: able to write clean and efficient queries.
- Ability to make pragmatic engineering decisions in a short amount of time.
- Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results.

WHAT WILL MAKE YOU STAND OUT
- Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, and Looker.
- Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies.
- Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB.
- Comfort with containerization technologies: Docker, Kubernetes, etc.
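The "clean and efficient queries" requirement above can be sketched with a single grouped aggregation rather than per-row lookups. The schema and data are hypothetical; SQLite stands in for a real warehouse such as Snowflake.

```python
# Sketch of an efficient aggregate query: one GROUP BY pass over the table
# instead of issuing a separate count query per user. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INTEGER, event TEXT);
INSERT INTO events VALUES (1, 'login'), (1, 'view'), (2, 'login');
""")

rows = conn.execute("""
    SELECT user_id, COUNT(*) AS n_events
    FROM events
    GROUP BY user_id
    ORDER BY user_id
""").fetchall()
print(rows)  # [(1, 2), (2, 1)]
```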

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Pune

Work from Office

JD for Azure Databricks
Role name: Developer
Role Description:
- ADB/ADF and PySpark.
- Proficiency with Azure services: Azure Databricks (Unity Catalog), Azure Data Factory, Azure Data Lake Storage, Azure Functions, Logic Apps, and Azure Monitor.
- Programming languages: Python, PySpark, and SQL for data processing.
- Strong SQL skills for querying, managing, and optimizing relational and non-relational databases.
- Experience with big data technologies: Spark and the Hadoop ecosystem.
- Expertise in building and managing ETL (extract, transform, load) pipelines.
- Data modeling and designing scalable data architecture.
Competencies: Digital : Microsoft Azure, Digital : Databricks
Experience (Years): 8-10
Essential Skills: As listed in the role description above.

Posted 1 month ago

Apply

7.0 - 12.0 years

0 - 0 Lacs

Hyderabad

Work from Office

Role & responsibilities
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Cloud Data Architecture
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while maintaining high standards of quality and performance. You will also be responsible for troubleshooting issues and implementing solutions that enhance the functionality and efficiency of the applications.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Mentor junior team members to foster their professional growth and development.
- Continuously evaluate and improve development processes to enhance team productivity.

Professional & Technical Skills:
- Must-have skills: Proficiency in Cloud Data Architecture.
- Strong understanding of cloud computing principles and architecture.
- Experience with data modeling and database design.
- Familiarity with cloud service providers such as AWS, Azure, or Google Cloud.
- Knowledge of application development frameworks and methodologies.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Cloud Data Architecture.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

About The Role
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements on existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to best practices and coding standards.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data warehousing concepts and technologies.
- Ability to analyze and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

15.0 - 20.0 years

15 - 19 Lacs

Kolkata

Work from Office

About The Role
Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Technology Architect, you will design and deliver technology architecture for a platform, product, or engagement. Your typical day will involve collaborating with various teams to define solutions that meet performance, capability, and scalability needs, ensuring that the architecture aligns with overall business objectives and technical requirements. You will engage with stakeholders to gather requirements, analyze existing systems, and propose innovative solutions that enhance the technology landscape. Your role will also include mentoring team members and providing guidance on best practices in technology architecture, fostering a culture of continuous improvement and collaboration within the team.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Evaluate and recommend new technologies to improve system performance and efficiency.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with performance tuning and optimization techniques.
- Familiarity with cloud technologies and architecture.
- Ability to design scalable and robust data architectures.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Kolkata

Work from Office

About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: PySpark
Good-to-have skills: Python (Programming Language)
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components effectively.
- Provide insights for enhancing data platform performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.
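The data munging techniques named in the skills above (cleaning, transformation, normalization) can be sketched in plain Python. The field names and cleaning rules here are hypothetical; a real PySpark job would express the same logic as DataFrame transformations.

```python
# Data-munging sketch: clean, coerce types, and normalize raw records before
# loading. Field names and validation rules are hypothetical illustrations.
def clean(record):
    out = {}
    # Normalize emails: trim whitespace and lowercase for consistent joins.
    out["email"] = record.get("email", "").strip().lower()
    # Coerce age to an integer; map unparseable values to None (i.e., NULL).
    raw_age = str(record.get("age", "")).strip()
    out["age"] = int(raw_age) if raw_age.isdigit() else None
    return out

raw = [
    {"email": " Ada@Example.COM ", "age": "36"},
    {"email": "grace@example.com", "age": "n/a"},
]
cleaned = [clean(r) for r in raw]
print(cleaned)
# [{'email': 'ada@example.com', 'age': 36}, {'email': 'grace@example.com', 'age': None}]
```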

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), Apache Spark, Databricks Unified Data Analytics Platform
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to identify data needs and optimize data workflows, contributing to the overall efficiency and effectiveness of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support business needs.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with Databricks Unified Data Analytics Platform, Apache Spark, and Python (Programming Language).
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud data warehousing solutions and big data technologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Kolkata

Work from Office

About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Alation Data Catalog
Good-to-have skills: NA
Minimum 15 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with various teams to ensure seamless integration between systems and data models, while also contributing to the overall architecture of the data platform. You will engage in discussions that shape the direction of data initiatives and work closely with Integration Architects and Data Architects to align on best practices and strategies for data management.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.
- Mentor junior professionals in best practices and methodologies related to data architecture.

Professional & Technical Skills:
- Must-have skills: Proficiency in Alation Data Catalog.
- Strong understanding of data governance and management principles.
- Experience with data integration tools and techniques.
- Familiarity with cloud data platforms and services.
- Ability to design and implement data models that support business needs.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Alation Data Catalog.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Hyderabad

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: SAP Analytics Cloud Development
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing innovative solutions that enhance the overall functionality of the applications you work on.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Modeling Techniques and Methodologies.
- Good-to-have skills: Experience with SAP Analytics Cloud Development.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data governance and data quality management.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud-based data solutions and architecture.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies