
1814 Data Architecture Jobs - Page 16

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

12.0 - 16.0 years

0 Lacs

Haryana

On-site

As the Head of Architecture at REA India, you will define and implement the end-to-end architecture strategy for the organization, with a primary focus on scalability, security, cloud optimization, and AI-driven innovation, while mentoring teams and enhancing development efficiency. Collaboration with leaders from REA Group will also be essential to align with the global architectural strategy.

Your responsibilities will include:
- Maintaining Architectural Decision Records (ADRs) to document technical choices.
- Defining and implementing scalable, secure architectures across Housing and PropTiger, aligning technical decisions with business goals.
- Optimizing cloud infrastructure, improving SEO performance, and enhancing CI/CD pipelines.
- Enforcing security best practices and establishing incident management best practices.
- Architecting data pipelines and AI-driven solutions.
- Mentoring engineering teams to foster technical excellence and innovation.

To be successful in this role, you should have at least 12 years of experience in software architecture, cloud platforms (AWS/GCP), and large-scale system design. Expertise in microservices, API design, DevOps, CI/CD, cloud cost optimization, security best practices, data architecture, AI/ML pipelines, and Gen AI applications is crucial, as is experience leading and mentoring high-performing engineering teams. Strong problem-solving, analytical, and cross-functional collaboration skills are also required.

Joining REA India offers the opportunity to build and lead high-scale real estate tech products, drive cutting-edge AI and cloud innovation, and mentor and shape the next generation of top engineering talent. If you are passionate about revolutionizing the real estate technology landscape, this role offers an exciting opportunity to make a significant impact.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Software Engineer Practitioner at TekWissen in Chennai, you will be a crucial part of the team responsible for the development and maintenance of the Enterprise Data Platform. Your main focus will be on designing, building, and optimizing scalable data pipelines within the Google Cloud Platform (GCP) environment. Working with GCP-native technologies such as BigQuery, Dataform, Dataflow, and Pub/Sub, you will ensure data governance, security, and optimal performance. This role offers you the opportunity to apply your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client.

To be successful in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study, and have at least 5 years of experience with a strong understanding of database concepts and multiple database technologies to optimize query and data processing performance. Proficiency in SQL, Python, and Java is essential, along with experience programming engineering transformations in Python or similar languages. You should also be able to work effectively across different organizations, product teams, and business partners, and have knowledge of Agile (Scrum) methodology and experience writing user stories.

Your skills should include expertise in data architecture, data warehousing, and Google Cloud Platform tools such as BigQuery, Dataflow, Dataproc, and Data Fusion. Experience with data warehouse concepts, ETL processes, and data service ecosystems is crucial for this role. Strong communication skills are necessary for both internal team collaboration and external stakeholder interactions; your role will involve advocating for user experience through empathetic stakeholder relationships and ensuring effective communication within the team and with stakeholders.

As a Software Engineer Practitioner, you should have excellent communication, collaboration, and influence skills to energize the team. Your knowledge of data, software, architecture operations, data engineering, and data management standards will be valuable in this role. Hands-on experience in Python using libraries like NumPy and Pandas is required, along with extensive knowledge of GCP offerings and bundled services related to data operations, and experience re-developing and optimizing data operations, data science, and analytical workflows and products.

TekWissen Group is an equal opportunity employer that supports workforce diversity, and we encourage applicants from diverse backgrounds to apply. Join us in shaping the future of data engineering and making a positive impact on lives, communities, and the planet.
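
For illustration, a minimal sketch of the kind of streaming ingestion this role describes: wiring a Pub/Sub subscription into BigQuery streaming inserts with the Google Cloud client libraries. The project, subscription, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: stream Pub/Sub messages into BigQuery (names are placeholders).
import json
from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"                   # hypothetical project id
SUBSCRIPTION = "events-sub"              # hypothetical subscription
TABLE = "my-project.analytics.events"    # hypothetical target table

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data)               # one JSON record per message
    errors = bq.insert_rows_json(TABLE, [row])   # streaming insert
    if errors:
        message.nack()   # let Pub/Sub redeliver on failure
    else:
        message.ack()

# Blocks until interrupted; a production pipeline would typically run this
# logic in Dataflow with batching and dead-lettering instead.
streaming_pull = subscriber.subscribe(sub_path, callback=handle)
streaming_pull.result()
```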

Posted 4 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

JSW One Platforms is a B2B e-commerce platform aimed at enhancing transparency, trust, and ease of business for MSMEs in India. Serving as the hub for JSW One MSME and JSW One Homes, it offers a comprehensive range of services tailored to the needs of MSMEs and individual home builders. JSW One MSME is a one-stop, multi-product digital marketplace dedicated to helping MSMEs elevate their businesses: through the integrated platform, it caters to the construction and manufacturing requirements of a diverse clientele by drawing on JSW One's extensive catalogue, logistics, and financing expertise. JSW One Homes, in turn, provides end-to-end home construction services for individual home builders, ensuring a hassle-free construction experience. As a technology-driven B2C e-commerce platform, it connects home builders with vetted turnkey contractors, offering a tech-enabled tool for customers to monitor the progress, cost, quality, and timeline of their construction projects.

As JSW One continues to grow, we are actively seeking experienced and dedicated individuals to join our team. At JSW One, we foster a culture of camaraderie and transparency, promoting efficient collaboration and a supportive work environment that encourages growth. Join us in our mission to transform India together!

Overview: We are looking for a highly motivated Product Manager - Data, Analytics, and AI to spearhead the strategy, development, and delivery of data and analytics products as well as data-driven features. This pivotal role requires a deep understanding of data architecture, analytics, and business strategy. You will collaborate cross-functionally with data engineers, analysts, product managers, and business leaders to create scalable data solutions. Reporting directly to the Chief Product Officer (CPO), this position plays a crucial role in converting raw data into actionable insights, enabling informed business decisions, enhancing customer experiences, and driving commercial growth.

Roles & Responsibilities:
- Develop and oversee the product roadmap for data platforms, AI, pipelines, APIs, and analytics products.
- Work closely with data engineering and analytics teams to establish and maintain data infrastructure and services.
- Collaborate with various stakeholders to ensure that data products align with user and business requirements.
- Translate business needs into technical specifications and prioritize features based on impact and feasibility.
- Implement and uphold standards for data quality, governance, privacy, and security in line with industry best practices.
- Define and monitor key performance indicators (KPIs) for data quality, product usage, and business impact.

Who can apply:
- Candidates with 6-10 years of product management experience, with a minimum of 3-5 years focused on data products and platforms.
- Strong background in data and analytics, with a track record of building data products.
- Proficiency in data infrastructure such as data lakes and warehouses (e.g., Snowflake, BigQuery).
- Familiarity with SQL and the ability to work independently with data.
- Experience with visualization and BI tools such as Tableau, Looker, and Power BI.
- Knowledge of and exposure to AI capabilities and applications.

Key Result Areas (KRAs):
- Define and execute the product vision and strategic roadmap for data, analytics, and AI products and platforms.
- Collaborate with business leaders, product teams, and engineering to drive the delivery of data solutions.
- Ensure the reliability, performance, and scalability of data solutions, including pipelines and APIs.
- Establish and monitor frameworks for data quality, lineage, governance, privacy, and security.
- Drive the adoption of data, analytics, and AI products by making insights accessible and actionable.
- Manage the migration and ongoing enhancement of key KPI reporting to BI tools.

Posted 4 weeks ago

Apply

5.0 - 8.0 years

8 - 11 Lacs

Hyderabad, Bengaluru

Hybrid

Use data mappings and models provided by the data modeling team to build robust pipelines in Snowflake. Design and implement data pipelines to proper 2NF/3NF normalization standards. Expert-level SQL and experience with data transformation are required.

Required candidate profile: expert-level SQL experience with data transformation; data architecture and normalization techniques (2NF/3NF); experience with cloud-based data platforms and pipeline design; experience with AWS data services; familiarity with the Carrier CAB process.
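
As a rough sketch of what such a normalized Snowflake load might look like: two 3NF tables and a deduplicating MERGE from staging. All connection parameters, table names, and the STG_ORDERS staging table are illustrative assumptions, not details from the posting.

```python
# Hedged sketch: loading a 3NF-normalized pair of tables in Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholders
    warehouse="ETL_WH", database="SALES", schema="CORE",
)
cur = conn.cursor()

# 3NF: customer attributes live once in CUSTOMER; ORDERS holds only the key.
cur.execute("""
    CREATE TABLE IF NOT EXISTS CUSTOMER (
        CUSTOMER_ID NUMBER PRIMARY KEY,
        NAME STRING,
        CITY STRING
    )""")
cur.execute("""
    CREATE TABLE IF NOT EXISTS ORDERS (
        ORDER_ID NUMBER PRIMARY KEY,
        CUSTOMER_ID NUMBER REFERENCES CUSTOMER(CUSTOMER_ID),
        ORDER_TS TIMESTAMP_NTZ,
        AMOUNT NUMBER(12, 2)
    )""")
# Deduplicate the staged feed (STG_ORDERS is hypothetical) into the target.
cur.execute("""
    MERGE INTO CUSTOMER c
    USING (SELECT DISTINCT CUSTOMER_ID, NAME, CITY FROM STG_ORDERS) s
    ON c.CUSTOMER_ID = s.CUSTOMER_ID
    WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, NAME, CITY)
        VALUES (s.CUSTOMER_ID, s.NAME, s.CITY)""")
conn.close()
```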

Posted 4 weeks ago

Apply

15.0 - 20.0 years

40 - 50 Lacs

Pune, Bengaluru

Work from Office

Role: Enterprise Architect (Data)
Location: Bangalore

Role overview: We are seeking an experienced Enterprise Data Architect to join our team, focusing primarily on data engineering, data analytics/visualization, and cloud engineering. The ideal candidate will play a crucial role in shaping our technology landscape, participating in project delivery, and contributing to presales activities during lean periods.

Requirements:
- 14 to 20 years of total experience in IT, with a minimum of 3 years in an Enterprise Data Architect capacity.
- Strong expertise in data and cloud technologies, with hands-on experience in data architecture, cloud migrations, and modern data platforms.
- Knowledge of design patterns and architectural styles.
- Experience with data modeling and database design.
- Experience with Google Cloud Platform is a MUST, including hands-on use of Google's data products; multi-cloud expertise is a plus.
- Proven track record of designing and implementing large-scale, complex systems.
- Familiarity with modern data tools such as dbt, Snowflake, and Kafka, and proficiency in SQL and Python.
- Excellent communication skills, with the ability to convey complex technical concepts to both technical and non-technical audiences.
- Strong leadership and mentoring skills.

What would you do here:
- Design and oversee enterprise-wide solutions spanning data engineering, data modeling, data analytics, and cloud architecture.
- Lead and participate in large-scale projects, integrating solutions across the cloud, data engineering, and analytics practices.
- Engage in customer-facing roles, including presales activities and project delivery.
- Develop robust data governance frameworks, ensuring compliance with regulations such as GDPR, CCPA, or other industry standards.
- Collaborate with cross-functional teams to ensure alignment of technology solutions with business objectives.
- Stay current with emerging technologies and industry trends, particularly in cloud computing.
- Build reusable assets, frameworks, and accelerators to enhance delivery efficiency.
- Participate in, and potentially lead, architectural reviews and governance processes.

Posted 4 weeks ago

Apply

18.0 - 25.0 years

35 - 55 Lacs

Chennai

Work from Office

Architect scalable data solutions for BFSI. Design data models for transactional systems and lead data migration from legacy systems. Experience with financial data modeling is required, along with proficiency in Erwin, PowerDesigner, or Toad Data Modeler.

Posted 4 weeks ago

Apply

12.0 - 20.0 years

40 - 60 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Hello folks! One of my IT clients is looking for an Enterprise Architect in Presales.

Location: Bangalore, Hyderabad, Pune
Experience: 12-20 years
NP: Immediate-60 days

The Solution Architect will be closely associated with a client (account) and responsible for translating the client's technology standards and framework utilization into specific system solution designs. In addition, the SA will work with client personnel and executives to understand functional and technical problem statements, identify patterns of issues, and subsequently work with the internal team to identify, justify, and design solutions.

Responsibilities for this position include:
- Demonstrates in-depth knowledge in one or more solution domains (Application, Data, or Cloud) and archetypes, as well as the customer's technical and business environment.
- Develops compelling customer proposals and critically reviews them; manages the expectations of internal stakeholders and customers, ensuring customers' business and technical requirements are met.
- Quantifies the impact of the business problem(s), positions business value, and identifies the strengths and weaknesses of the overall proposed solution to achieve long-term business objectives.
- Collaborates with project managers and technical teams to ensure proper architecture implementation.
- Acts as a trusted advisor to our project managers, providing end-to-end accountability for technological directions in project delivery.
- Participates in deep-dive discussions and partners with the account sales and delivery team to build customer relationships at all levels.
- Proactively shares knowledge with peers and helps develop more junior team members.

Qualifications:
- In-depth knowledge of building systems in a distributed environment.
- Grew up through the software development ranks.
- Proven experience in managing and responding to complex RFPs.
- Keen interest in keeping abreast of the latest developments in information technology.
- Ability to work on a set of concurrent technical and communication tasks.
- Experience working in the Banking and Financial Services domain.

If you are looking for this role and your skills match, drop your resume at chanchal@oitindia.com.

Posted 4 weeks ago

Apply

10.0 - 12.0 years

35 - 40 Lacs

Pune

Work from Office

Seeking a Data Architect with strong ETL, data modeling, data warehouse, SQL, PySpark, and cloud experience. Architect experience is mandatory. Only immediate joiners or candidates currently serving their notice period will be considered.

Posted 4 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Ahmedabad

Work from Office

Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Responsibilities:
- Design and build data pipelines and data lakes to automate ingestion of structured and unstructured data, providing fast, optimized, and robust end-to-end solutions.
- Knowledge of data lake and data warehouse concepts.
- Experience working with AWS big data technologies.
- Improve the data quality and reliability of data pipelines through monitoring, validation, and failure detection.
- Deploy and configure components to production environments.

Technology: Redshift, S3, AWS Glue, Lambda, SQL, PySpark

Mandatory skill sets: AWS Data Engineer
Preferred skill sets: AWS Data Engineer
Years of experience required: 4-8
Education qualification: B.Tech/MBA/MCA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology

Required Skills: AWS Development, Data Engineering, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
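
A hedged sketch of the kind of Glue job this stack implies: a PySpark script that lands raw S3 JSON as partitioned Parquet with a basic deduplication rule. The bucket paths and the event_id/event_date fields are invented for illustration.

```python
# Illustrative AWS Glue PySpark job: land raw S3 JSON into partitioned Parquet.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_ctx = GlueContext(SparkContext.getOrCreate())
job = Job(glue_ctx)
job.init(args["JOB_NAME"], args)

raw = glue_ctx.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw/events/"]},  # hypothetical
    format="json",
)
df = raw.toDF().dropDuplicates(["event_id"])  # basic data-quality rule

# Partitioned curated zone; assumes the records carry an event_date column.
df.write.mode("append").partitionBy("event_date").parquet(
    "s3://example-curated/events/"
)
job.commit()
```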

Posted 4 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Chennai

Work from Office

Why Quvia: Founded in 2019, Quvia is a fast-growing, Series A tech startup passionate about making connectivity and digital experiences better for everyone, everywhere. Our industry-first solutions are already addressing major challenges for companies in the aviation and cruise industries, and we're just getting started. Quvia is headquartered in the greater Miami region, with offices in the UK and India, and remote teams around the world. As an early-stage company, all Quvia employees have the opportunity to make a significant impact on our growth trajectory and the future of the industries we serve. Quvia is backed by Columbia Capital, a respected venture capital firm founded in 1989 that has raised over $5 Bn of fund commitments.

What You Will Do: As a Data Engineer, you will lead the design and development of scalable, reliable, and high-performance data systems that power analytics and decision-making across the organization. You will serve as a technical authority in data engineering and work cross-functionally to elevate our data infrastructure, accelerate innovation, and drive strategic outcomes. Your key responsibilities will include:
- Data Architecture & Pipeline Development: Design and build end-to-end data pipelines, including streaming and batch ETL processes, using modern tools and platforms (e.g., Kafka, Airflow, dbt).
- Data Modeling & Warehousing: Develop optimized data models and warehouse schemas to support business intelligence, reporting, and machine learning use cases.
- Collaboration & Communication: Work closely with analytics, data science, DevOps, and product engineering teams to integrate and align data infrastructure with company goals.
- Data Quality & Governance: Implement robust monitoring, testing, and governance practices to ensure data accuracy, consistency, and compliance.
- Performance Tuning & Optimization: Continuously optimize pipeline performance, storage strategies, and query execution for scalability and efficiency.
- Mentorship & Leadership: Mentor junior data engineers, share best practices, and help foster a high-performing, collaborative data team.
- Innovation & Improvement: Drive continuous improvement by evaluating new tools, frameworks, and processes to enhance our data ecosystem.

What You Will Need
Required Skills & Experience:
- Strong programming skills in Python or Java for data pipeline and backend development.
- Deep understanding of SQL and modern data warehousing practices.
- Proficiency with at least one major cloud platform (AWS, GCP, Azure).
- Experience with Kafka for real-time data streaming and ingestion.
- Hands-on experience with Git, Docker, dbt, Jenkins, and Airflow for version control, containerization, and orchestration.
- Ability to lead technical projects and collaborate effectively with cross-functional stakeholders.

Nice to Have:
- Experience with ClickHouse, including table design, performance tuning, and MergeTree optimization.
- Familiarity with Kubernetes for orchestration of containerized applications.

What We Offer:
- Deep domain exposure in the aviation and maritime industries.
- Cross-functional collaboration with global teams.
- Day-one health benefits.
- Generous paid time off.
- Performance-based bonuses.
- A fast-paced, collaborative environment with real ownership and impact.

Quvia is an Equal Opportunity Employer. Employment opportunities at Quvia are based upon one's qualifications and capabilities to perform the essential functions of a particular job. All employment opportunities are provided without regard to race, religion, sex (including sexual orientation and transgender status), pregnancy, childbirth or related medical conditions, national origin, age, veteran status, disability, genetic information, or any other characteristic protected by law.
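
To make the orchestration duties concrete, here is a minimal sketch of an Airflow DAG (Airflow 2.4+ assumed) that stages Kafka-landed exports and then runs dbt. The DAG id, script path, and project directories are invented for illustration.

```python
# Sketch of a daily batch leg of the pipeline: stage files, then run dbt.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_warehouse_build",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                    # `schedule_interval` on older Airflow
    catchup=False,
) as dag:
    stage = BashOperator(
        task_id="stage_kafka_exports",
        bash_command="python /opt/jobs/stage_kafka_exports.py",  # hypothetical
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    stage >> transform   # staging must finish before dbt models run
```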

Posted 4 weeks ago

Apply

3.0 - 8.0 years

12 - 16 Lacs

Bengaluru

Work from Office

As a Data Engineer in the Data Infrastructure and Strategy group, you will play a key role in transforming the way Operations Finance teams access and analyze data. You will work to advance the 3-year Data Infrastructure Modernization strategy and play a key role in adopting and expanding a unified data access platform and scalable governance and observability frameworks that follow modern data architecture and cloud-first designs.

Your responsibilities will include supporting and migrating the data analytics use cases of a targeted customer group and implementing new features within the central platform: component design, implementation using Native AWS (NAWS) services following best engineering practices, user acceptance testing, launch, adoption, and post-launch support. You will work on system design and integrate new components into the established architecture. You will engage in cross-team collaboration by building reusable design patterns and components and adopting designs created by others. You will contribute to buy-vs-build decisions by evaluating the latest product and feature releases for NAWS and internal products, performing gap analysis, and defining the feasibility of their adoption and the list of blockers.

The ideal candidate possesses a track record of creating efficient AWS-based data solutions; data models for both relational databases and the Glue/Athena/EMR stack; and solution documentation, project plans, user guides, and other project documentation. We are looking for individual contributors inspired to become data systems architects. A track record of production-level deliverables leveraging GenAI is a big plus.

Key responsibilities:
- Elevate and optimize existing solutions while driving strategic migration. Conduct thorough impact assessments to identify opportunities for transformative re-architecture or migration to central platforms. Your insights will shape the technology roadmap, ensuring we make progress towards deprecation goals while providing the best customer service.
- Design, review, and implement data solutions that support WW Operations Finance standardization and automation initiatives using AWS technologies and internally built tools, including Spark/EMR, Redshift, Athena, DynamoDB, Lambda, S3, Glue, Lake Formation, etc.
- Support adoption of data solutions by both finance and technical teams; identify and remove adoption blockers.
- Ensure speed of delivery and high quality: iteratively improve the development process and adopt mechanisms for optimization of development and support.
- Contribute to engineering excellence by reviewing designs and code created by others.
- Contribute to delivery execution, planning, operational excellence, retrospectives, problem identification, and solution proposals.
- Collaborate with finance analysts, engineers, product and program managers, and external teams to influence and optimize the value of delivery on the data platform.
- Create technical and customer-facing documentation for the products within the platform.

A day in the life: You work with the Engineering, Product, BI, and Operations teams to elevate existing data platforms and implement best-in-class data solutions for the WW Operations Finance organization. You solve unstructured customer pain points with technical solutions, focusing on users' productivity when working with data. You participate in discussions with stakeholders to provide updates on project progress, gather feedback, and align on priorities. Utilizing AWS CDK and various AWS services, you design, execute, and deploy solutions. Your broader focus is on system architecture rather than individual pipelines. You regularly review your designs with a Principal Engineer and incorporate the insights gathered. Conscious of your impact on customers and infrastructure, you establish efficient development and change management processes to guarantee the speed, quality, and scalability of delivered solutions.

About the team: WW Operations Finance Standardization and Automation improves customer experience and business outcomes across Amazon Operations Finance through innovative technical solutions, standardization and automation of processes, and use of modern data analytics technologies.

Qualifications:
- MS or BS in Computer Science, Electrical Engineering, or similar fields.
- Strong AWS engineering background; 5+ years of demonstrated track record designing and operating data solutions in Native AWS. The right person will be highly technical and analytical, with the ability to drive technical execution towards organizational goals.
- Exceptional triaging and bug-fixing skills; ability to assess risks and implement fixes without customer impact.
- Strong data modeling experience; 5+ years of data modeling practice is required. Expertise in designing both analytical and operational data models is a must. The candidate needs to demonstrate working knowledge of trade-offs in data model designs and platform-specific considerations, with concentration in Redshift, MySQL, EMR/Spark, and Athena.
- Excellent knowledge of modern data architecture concepts (data lakes, data lakehouses) as well as governance practices.
- Strong documentation skills and a proven ability to adapt documents to the audience; the ability to communicate information at levels ranging from executive summaries and strategy addendums to detailed design specifications is critical to success.
- Excellent communication skills, both written and oral; ability to communicate technical complexity to a wide range of stakeholders.
- Data governance frameworks experience.
- Compliance frameworks experience, SOX preferred.
- Familiarity or production-level experience with AI-based AWS offerings (Bedrock) is a plus.
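
As a rough illustration of the AWS CDK workflow mentioned above, here is a minimal CDK v2 stack in Python that provisions a data-lake bucket and a Glue catalog database. The stack, bucket, and database names are hypothetical, not part of the posting.

```python
# Hedged sketch: a CDK v2 stack for a data-lake bucket plus Glue database.
import aws_cdk as cdk
from aws_cdk import aws_glue as glue, aws_s3 as s3
from constructs import Construct

class FinanceDataStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned, encrypted landing bucket for the lake.
        s3.Bucket(
            self, "LakeBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
        )
        # Glue catalog database that Athena/EMR jobs would query against.
        glue.CfnDatabase(
            self, "FinanceDb",
            catalog_id=cdk.Aws.ACCOUNT_ID,
            database_input=glue.CfnDatabase.DatabaseInputProperty(
                name="finance_curated",  # hypothetical database name
            ),
        )

app = cdk.App()
FinanceDataStack(app, "FinanceDataStack")
app.synth()
```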

Posted 4 weeks ago

Apply

12.0 - 17.0 years

14 - 15 Lacs

Mumbai

Work from Office

- Create and manage Agile project management practices using JIRA and Kanban boards.
- Manage stakeholder relationships and build project pipelines for COE delivery.
- Design and execute MIS reporting for COE utilization and contributions.
- Conduct landscape assessment studies and define target states in collaboration with data architects.
- Lead project timeline estimation and address blockers for timely delivery.
- Develop executive-level decks for finance updates, executive updates, and knowledge-sharing forums.

Required qualifications, capabilities, and skills:
- 12+ years of experience in project/program management, with 3 years in Big Data projects.
- Expertise in Agile practices and tailoring project delivery mechanisms.
- Hands-on experience with Agile tools.
- Excellent communication skills for writing decks and presenting to senior management.
- Strong leadership skills to remove roadblocks and focus on delivery.
- Basic understanding of Big Data architecture.

Preferred qualifications, capabilities, and skills:
- Basic knowledge of SQL/Tableau.

Posted 4 weeks ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About Our Company/Team: At Oracle Finergy, we are committed to delivering innovative solutions to the Banking, Financial Services, and Insurance (BFSI) sector. Our team of experts leverages proven methodologies and cutting-edge technologies to address the complex financial needs of our clients. We pride ourselves on being a leading provider of end-to-end banking solutions, enhancing operational efficiency, and ensuring technology aligns with our clients' business goals. Our mission is to empower financial institutions to succeed in a rapidly changing world.

Job Summary: As a Microsoft Fabric Data Engineer/Developer, you will play a vital role in designing, developing, and implementing robust and scalable data solutions within the Microsoft Fabric ecosystem. You will collaborate closely with data architects, business stakeholders, and cross-functional teams to transform raw data into actionable insights, driving informed decision-making across the organization. If you are passionate about data engineering, possess a strong technical background, and excel in collaborative environments, we invite you to join our growing data team. Career Level: IC2

Microsoft Fabric Development:
- Design, develop, and deploy end-to-end data solutions using various components of Microsoft Fabric, including Lakehouse, Data Warehouse, Data Factory, and Data Engineering.
- Implement and optimize data pipelines for ingestion, transformation, and curation of data from diverse sources (e.g., Azure Data Lake Storage Gen2, on-premises databases, APIs, third-party systems).
- Develop and optimize data models within Microsoft Fabric, ensuring adherence to best practices for performance, scalability, and data quality.
- Utilize Power BI for data visualization and reporting, ensuring seamless integration with Fabric data assets.

Azure Data Services Integration:
- Demonstrate strong hands-on experience with core Microsoft Azure data services, including Azure Data Factory (for ETL/ELT orchestration), Azure Databricks (for advanced analytics and processing), and Azure Data Lake Storage Gen2.
- Integrate Microsoft Fabric solutions with existing Azure data services and other enterprise systems.

Data Architecture & Governance:
- Contribute to the design and implementation of robust, scalable, and secure data architectures within the Microsoft Fabric platform.
- Implement data quality, validation, and reconciliation processes to ensure data integrity and accuracy.
- Apply data governance best practices, including security, access controls (e.g., role-based access control), and compliance within Fabric and Azure Purview.

Documentation & Knowledge Sharing:
- Maintain comprehensive documentation for data architectures, pipelines, data models, and processes.
- Stay updated with the latest advancements in Microsoft Fabric, Azure data services, and data engineering best practices.

Qualifications & Skills

Mandatory:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 4-7 years of professional experience as a Data Engineer, Data Developer, or in a similar role.
- Hands-on experience with Microsoft Fabric, including its core components (Lakehouse, Data Warehouse, Data Factory, Data Engineering).
- Strong expertise in Microsoft Azure data services: Azure Data Factory (ADF), Azure Data Lake Storage Gen2.
- Proven experience in designing, developing, and maintaining scalable data pipelines.
- Solid understanding of data warehousing concepts, dimensional modeling, and data lakehouse architectures.
- Proficiency in SQL for data manipulation and querying.
- Experience with version control systems (e.g., Git, Azure Repos).
- Strong analytical and problem-solving skills with meticulous attention to detail.
- Excellent communication skills (written and verbal) and the ability to collaborate effectively with cross-functional teams.

Good to Have:
- Certification in Microsoft Azure or Microsoft Fabric.
- Experience with cloud-based data platforms such as Amazon Web Services (AWS) or Google Cloud Platform (GCP).
- Knowledge of data governance frameworks and best practices.

Additional Notes: It is ideal to have some background knowledge in Finance / Investment Banking / Fixed Income / OCIO business.

Self-Assessment Questions: To help you determine if this role is a good fit, please consider the following:
1) Can you describe your experience with Microsoft Fabric and its core components, highlighting specific projects or accomplishments?
2) How do you ensure data quality, validation, and reconciliation in your data pipelines, and can you provide an example from a previous project?
3) Can you explain your approach to data governance, including security, access controls, and compliance, and how you've applied this in a previous role?
4) How do you stay up to date with the latest advancements in Microsoft Fabric, Azure data services, and data engineering best practices?
5) Can you provide an example of a complex data problem you've solved in the past, highlighting your analytical and problem-solving skills?
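
For a flavor of the Lakehouse work described, a minimal PySpark sketch as it might run in a Fabric notebook (where a Spark session is pre-provided), landing a CSV drop as a managed Delta table. The Files path, trade_id column, and table name are assumptions for illustration.

```python
# Minimal sketch: Fabric Lakehouse ingestion of a CSV landing zone into Delta.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("Files/landing/trades/"))        # hypothetical Lakehouse Files path

curated = (raw
           .dropDuplicates(["trade_id"])     # assumed business key
           .withColumn("ingest_ts", F.current_timestamp()))

# Write to a managed Delta table in the Lakehouse for Power BI to consume.
curated.write.format("delta").mode("overwrite").saveAsTable("curated_trades")
```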

Posted 4 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office

ETRM Data Engineer: Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETRM systems.
- Work on data integration projects within the Energy Trading and Risk Management (ETRM) domain.
- Collaborate with cross-functional teams to integrate data from ETRM trading systems like Allegro, RightAngle, and Endur.
- Optimize and manage data storage solutions in Data Lake and Snowflake.
- Develop and maintain ETL processes using Azure Data Factory and Databricks.
- Write efficient and maintainable code in Python for data processing and analysis.
- Ensure data quality and integrity across various data sources and platforms.
- Ensure data accuracy, integrity, and availability across various trading systems.
- Collaborate with traders, analysts, and IT teams to understand data requirements and deliver robust solutions.
- Optimize and enhance the data architecture for performance and scalability.

Mandatory Skills: Python/PySpark, FastAPI, Pydantic, SQLAlchemy, Snowflake or SQL, Data Lake, Azure Data Factory (ADF), CI/CD, Azure fundamentals, Git, integration of data solutions with ETRM trading systems (Allegro, RightAngle, Endur).

Good to have: Databricks, Streamlit, Kafka, Power BI, Kubernetes, FastStream.
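
A small sketch of how the mandatory FastAPI/Pydantic pieces typically fit together: a typed trade model served over a REST endpoint. The in-memory store stands in for a real Snowflake or trading-system backend, and all field and route names are invented for illustration.

```python
# Hedged sketch: a FastAPI endpoint with a Pydantic trade model.
from datetime import date
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

class Trade(BaseModel):
    trade_id: str
    commodity: str
    volume_mwh: float
    trade_date: date

app = FastAPI(title="ETRM trade service")   # hypothetical service name

_TRADES = {  # stand-in for a Snowflake / Allegro / Endur query layer
    "T-1001": Trade(trade_id="T-1001", commodity="power",
                    volume_mwh=50.0, trade_date=date(2024, 5, 1)),
}

@app.get("/trades/{trade_id}", response_model=Trade)
def get_trade(trade_id: str) -> Trade:
    trade = _TRADES.get(trade_id)
    if trade is None:
        raise HTTPException(status_code=404, detail="trade not found")
    return trade
```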

Posted 4 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to define and develop the Enterprise Data Structure, along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening the modelling standards and business information.

Do:

1. Define and develop a Data Architecture that aids the organization and clients in new and existing deals:
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and to protect the organization from disruptions while also embracing innovation.
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy.
c. Create data strategy and road maps for the Reference Data Architecture as required by the clients.
d. Engage all stakeholders to implement data governance models and ensure that the implementation is based on every change request.
e. Ensure that data storage and database technologies are supported by the data management and infrastructure of the enterprise.
f. Develop, communicate, support, and monitor compliance with data modelling standards.
g. Oversee and monitor all frameworks to manage data across the organization.
h. Provide insights on database storage and platforms for ease of use and least manual work.
i. Collaborate with vendors to ensure integrity, objectives, and system configuration.
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization.
k. Present the data repository, objects, and source systems, along with data scenarios for front-end and back-end usage.
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes.
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view.
n. Oversee all data standards, references, and papers for proper governance.
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata.
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control.
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives.
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data.
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology.
iv. Define and understand current issues and problems and identify improvements.
v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout.
vi. Understand the root-cause problems in integrating business and product units.
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view.
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements.
ix. Track industry and application trends and relate these to planning current and future IT needs.

2. Build an enterprise technology environment for data architecture management:
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes.
b. Evaluate all implemented systems to determine their viability in terms of cost effectiveness.
c. Collect all structural and non-structural data from different places and integrate it in one database form.
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports.
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices.
f. Implement the best security practices across all databases based on accessibility and technology.
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG).
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration.

3. Enable delivery teams by providing optimal delivery solutions and frameworks:
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor.
b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications.
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
d. Monitor system capabilities and performance by performing tests and configurations.
e. Integrate new solutions and troubleshoot previously occurring errors.
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams.
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times.
j. Help the support and integration teams achieve better efficiency, client experience, and ease of use by applying AI methods.
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
l. Ensure architecture principles and standards are consistently applied to all projects.
m. Ensure optimal client engagement:
i. Support the pre-sales team in presenting the entire solution design and its principles to the client.
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met.
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

Mandatory Skills: Data Governance. Experience: 8-10 years.

Posted 4 weeks ago

Apply

10.0 - 15.0 years

15 - 20 Lacs

Hyderabad

Remote

Should possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models. Expertise in ER Studio is required; you will be responsible for developing and maintaining data models to support business requirements.

Required candidate profile: a data modeler with 5+ years of experience in data modeling and a minimum of 3 years of proficiency with ER Studio. Strong analytical, problem-solving, and communication skills are a must.

Posted 4 weeks ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We are looking for a Data & AI Associate to join NASSCOM's AI team and act as a strategic advisor to the industry on data & AI infrastructure, platforms, and innovation. This role is critical in enabling Indian enterprises, startups, and policymakers to navigate the evolving data ecosystem and adopt future-ready data strategies. You will be responsible for driving cross-industry engagement, developing best-practice frameworks, advising stakeholders on scalable data architecture and governance, and supporting key initiatives including IndiaAI, the Responsible AI Hub, and sectoral data partnerships.

Job Responsibilities:

Industry Advisory & Thought Leadership:
- Act as a domain expert on data & AI platforms (cloud, hybrid, on-prem), data lakes, warehouses, mesh, and lakehouses.
- Advise companies (from startups to small and medium enterprises) on modern data architecture, integration, governance, and tooling.
- Support sector-specific data strategies (healthcare, BFSI, manufacturing, etc.) through working groups and roundtables.

Research & Framework Development:
- Lead development of whitepapers, policy notes, and best practices on data interoperability, privacy-preserving data systems, and AI-ready data infrastructures.
- Track global benchmarks on data infrastructure and translate insights into actionable frameworks for companies.

Stakeholder Engagement:
- Collaborate with government bodies, regulators, and industry leaders to co-develop India's data economy vision.
- Engage with ecosystem players: cloud providers, analytics firms, data exchange platforms, startups, and academia.

Capacity Building:
- Support NASSCOM's data & AI skilling and literacy efforts through content, training modules, and evangelism.
- Mentor industry partners and incubated startups on building scalable and ethical data architectures.

Knowledge, Skills, Qualifications, Experience:
- 3 to 5 years of experience.
- Deep understanding of modern data platforms (e.g., Snowflake, Databricks, Google BigQuery, AWS Redshift, Azure Synapse); data integration and pipeline orchestration (e.g., Airflow, dbt, Kafka, Fivetran); and governance and compliance (e.g., data cataloging, lineage, access control, anonymization).
- Knows how to use AI models for building products and solutions, including choosing the right models for the proposed solution.
- Knowledge of AI tools (MCP, LangChain, LangGraph) and their integration techniques.
- Knowledge of AI products and copilots that can be used for better productivity.
- Experience advising or working with cross-functional stakeholders: technical teams, policymakers, and business leaders.
- Knowledge of data standards, India's data protection laws, and international practices (e.g., GDPR, Open Data).
- Ability to suggest development processes and to fine-tune and evaluate AI/ML models and algorithms to fit client needs.
- Build proofs-of-concept (PoCs), prepare demos, and support pilot projects in Generative AI (e.g., chatbots, text/image generation, data extraction).
- Assist with data wrangling, feature engineering, model training, and fine-tuning.
- Stay up to date on AI trends, tools, and frameworks to suggest relevant solutions for clients.
- Engage in knowledge sharing through whitepapers, internal training sessions, or publishing short research findings to maintain thought leadership.
- Assist in training upcoming AI and GenAI developers during NASSCOM developer sessions.

Preferred Qualifications:
- Experience working in consulting, think tanks, industry bodies, or tech product companies.
- Exposure to industry data challenges in sectors like BFSI, health, retail, or the public sector.
- Familiarity with AI/ML platforms and responsible AI frameworks is a strong plus.
- Graduate/postgraduate degree in Computer Science, Data Science, Engineering, or a related field.

Posted 4 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Introduction to Demandbase: Demandbase is the Smarter GTM company for B2B brands. We help B2B companies hit their revenue goals using fewer resources. How? By aligning their sales and marketing teams around a combination of their data, our data, and artificial intelligence (what we call Account Intelligence) so they can identify, engage, and focus their time and money on the accounts most likely to buy.

As a company, we're as committed to growing careers as we are to building world-class technology. We invest heavily in people, our culture, and the community around us. We have offices in the San Francisco Bay Area, Seattle, and India, as well as a team in the UK, and we have been continuously recognized as one of the best places to work in the San Francisco Bay Area, including Best Workplaces for Millennials and Best Workplaces for Parents!

We're committed to attracting, developing, retaining, and promoting a diverse workforce. By ensuring that every Demandbase employee is able to bring a diversity of talents to work, we're increasingly capable of living out our mission to transform how B2B goes to market. We encourage people from historically underrepresented backgrounds and all walks of life to apply. Come grow with us at Demandbase!

About the Role: Demandbase is seeking creative, highly motivated, enthusiastic engineers to be part of our product development team. You will work in a fast-paced agile environment to build and deliver key features of the application. This is a great opportunity to work with a talented, high-energy, and creative team focused on building a world-class product. Our technology stack is primarily in Java/Scala, and while experience with these languages is great, Demandbase engineering focuses on fundamentals, not the tools and languages that you already know. We welcome people from all backgrounds who seek the opportunity to help build a future where everyone and everything can move independently. If you have the curiosity, passion, and collaborative spirit, work with us, and let's move the world forward, together. We're looking for people who are excellent at fundamentals, have a willingness to learn, and have an unstoppable desire to follow through with the job.

What you'll be doing: This job is for a responsible individual contributor whose primary duty is leading the development effort and building scalable distributed systems.
- Design and develop scalable data processing platforms.
- Work on developing a scalable data architecture system. The role provides the opportunity and flexibility to own a problem space and drive its product roadmap. With ample opportunities to learn and explore, a highly motivated and committed engineer can also push the limits of technologies in the NLP area.
- Follow engineering best practices to solve data matching and data search related problems.
- Work closely with cross-functional teams in an agile environment.

What we're looking for:
- You have strong analytical and problem-solving skills.
- You are a self-motivated learner, eager to learn new technologies.
- You are receptive to constructive feedback.
- You are confident and articulate, with excellent written and verbal communication skills.
- You are open to working in a small development environment.

Skills Required:
- Bachelor's degree in computer science or an equivalent discipline from a top engineering institution.
- Adept in computer science fundamentals and passionate about algorithms, programming, and problem solving.
- 8-12 years of software engineering experience in product companies is a plus.
- Experience writing production-level code in Java or Scala is required; experience writing production-level code in Python is good to have.
- Experience in multithreading, distributed systems, and performance optimization.
- Good knowledge of database concepts and proficiency in SQL.
- Experience with a Big Data tech stack like Spark, Kafka, and Airflow is a plus.
- Knowledge of or experience with one of the clouds: AWS/Azure/GCP.
- Experience writing unit tests and integration tests is a must.

Our Commitment to Diversity, Equity, and Inclusion at Demandbase: We recognize that not all candidates will have every skill or qualification listed in this job description. If you feel you have the level of experience to be successful in the role, we encourage you to apply! We acknowledge that true diversity and inclusion require ongoing effort, and we are committed to doing the work required to make our workplace a safe and equitable space for all. Join us in building a community where we can learn from each other, celebrate our differences, and work together. Personal information that you submit will be used by Demandbase for recruiting and other business purposes. Our Privacy Policy explains how we collect and use personal information.

Posted 4 weeks ago

Apply

3.0 - 8.0 years

6 - 9 Lacs

Mumbai

Work from Office

Summary: Join Test Demo, a dynamic and innovative company based in the bustling city of Mumbai, as a Data Engineer. We are seeking a talented individual with 3 years of experience in data engineering to become an integral part of our team. In this in-office role, you will work in a collaborative environment, surrounded by a team of passionate professionals dedicated to pushing the boundaries of data technology.

As a Data Engineer at Test Demo, you will be responsible for designing, building, and maintaining scalable data pipelines that ensure the seamless flow of data across our systems. Your expertise will be crucial in optimizing data architecture and implementing robust data solutions that support our business objectives. You will work closely with cross-functional teams to understand data requirements and deliver high-quality data solutions that drive informed decision-making. The ideal candidate will have a strong background in data engineering, with proficiency in programming languages such as Python or Java, and experience with big data technologies like Hadoop or Spark. If you are a detail-oriented problem solver with a passion for data and a desire to work in a vibrant city like Mumbai, we invite you to apply and be a part of our exciting journey at Test Demo.

Responsibilities:
- Design, build, and maintain scalable data pipelines to ensure seamless data flow across systems.
- Optimize data architecture and implement robust data solutions to support business objectives.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
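
A minimal PySpark batch pipeline of the kind this posting describes: ingest, filter, aggregate, and publish. The input/output paths and column names are placeholders invented for illustration.

```python
# Sketch: daily revenue rollup from a raw orders feed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily").getOrCreate()

orders = spark.read.json("s3a://example/raw/orders/")   # hypothetical source

daily = (orders
         .filter(F.col("status") == "COMPLETED")        # keep settled orders
         .groupBy(F.to_date("created_at").alias("order_date"))
         .agg(F.sum("amount").alias("revenue"),
              F.count("*").alias("orders")))

daily.write.mode("overwrite").parquet("s3a://example/curated/daily_orders/")
spark.stop()
```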

Posted 4 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Identify customer needs and build a tailored data engineering solution. Design and implement robust data pipelines using Azure Data Factory, PySpark notebooks, Spark SQL, and Python. Develop ETL/ELT processes to ingest, transform, and load data into Azure Synapse and Data Lake.

Required candidate profile: 8+ years of experience building data/system architectures for analytics and data warehouses. Languages: Python 3.8+, SQL, PySpark. Azure: Data Factory, Databricks, Synapse Analytics, Data Lake Storage, SQL DB, storage accounts.
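
A sketch of the ELT step described above as it might look in a Synapse or Databricks PySpark notebook: land raw CSV data, transform it with Spark SQL, and write a curated Delta output. The ADLS paths, column names, and Delta availability are assumptions for illustration.

```python
# Hedged sketch: raw CSV -> Spark SQL cleanup -> curated Delta in ADLS.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

(spark.read
 .option("header", True)
 .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")  # hypothetical
 .createOrReplaceTempView("raw_sales"))

cleaned = spark.sql("""
    SELECT CAST(sale_id AS BIGINT)        AS sale_id,
           TRIM(region)                   AS region,
           CAST(amount AS DECIMAL(12,2))  AS amount,
           TO_DATE(sale_date)             AS sale_date
    FROM raw_sales
    WHERE sale_id IS NOT NULL
""")

cleaned.write.mode("overwrite").format("delta").save(
    "abfss://curated@examplelake.dfs.core.windows.net/sales/")  # hypothetical
```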

Posted 4 weeks ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening the modelling standards and business information.

Do
1. Define and develop Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets while protecting the organization from disruptions and embracing innovation
b. Assess the benefits and risks of data using tools such as business capability models to create a data-centric view that quickly visualizes which data matters most to the organization, based on the defined business strategy
c. Create data strategies and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure implementation is carried out for every change request
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks used to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present data repositories, objects, and source systems, along with data scenarios, for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards/references/papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to define a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem while keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build the enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness
c. Collect structured and unstructured data from different sources and integrate it into a single database form
d. Work through every stage of data processing: analysis, creation, physical data model design, solutions, and reports
e. Build enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement security best practices across all databases based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPIs/SLAs) to drive results
d. Monitor system capabilities and performance through tests and configuration
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement
i. Support the pre-sales team in presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: AI Application Integration. Experience: 10 years.
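Responsibility (p) above describes the core Master Data Management task of matching and consolidating records into a single trusted view. As a minimal illustration only (the posting names no specific toolchain, so the record fields and survivorship rule here are hypothetical), a match-and-consolidate step in Python might look like this:

```python
from collections import defaultdict

# Hypothetical customer records from two source systems; in practice these
# would come from the integrated source feeds the architect oversees.
records = [
    {"source": "crm", "email": "a.rao@example.com", "name": "A. Rao", "city": "Pune"},
    {"source": "billing", "email": "a.rao@example.com", "name": "Anita Rao", "city": ""},
    {"source": "crm", "email": "v.k@example.com", "name": "V. Kumar", "city": "Delhi"},
]

def consolidate(records, match_key="email"):
    """Group records by a match key and merge each group into one golden
    record, keeping the most complete non-empty value per field."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[match_key]].append(rec)

    golden = []
    for matched in groups.values():
        merged = {}
        for rec in matched:
            for field, value in rec.items():
                if field == "source":
                    continue
                # Simple survivorship rule: prefer the longest value seen.
                if value and len(str(value)) > len(str(merged.get(field, ""))):
                    merged[field] = value
        golden.append(merged)
    return golden

for row in consolidate(records):
    print(row)
```

Production MDM platforms layer fuzzy matching, configurable survivorship policies, and stewardship workflows on top of this basic group-and-merge idea.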

Posted 4 weeks ago

Apply

9.0 - 14.0 years

30 - 45 Lacs

Chennai, Bengaluru

Work from Office

Responsibilities
Lead and mentor the team, define goals, ensure timely project delivery, and manage code reviews.
Design scalable data architectures, select appropriate technologies, and ensure compliance with data security regulations.
Build and optimize ETL/ELT pipelines, automate workflows, and ensure data quality.
Build and manage scalable data pipelines hands-on in Databricks.
Manage cloud-based infrastructure, implement Infrastructure as Code (IaC), and optimize costs and availability.
Work with business stakeholders to translate requirements into technical solutions, track project milestones, and manage risks using Agile.
Stay current on new technologies, drive innovation, and optimize existing systems.
Maintain documentation, share knowledge, and conduct team training sessions.

Educational Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
9+ years of experience in Data Engineering, with at least 3+ years in an architectural role.
Strong expertise in data engineering tools and technologies (e.g., Apache Spark, at least one cloud platform (AWS, GCP, or Azure), SQL, Python).
Proficiency in any cloud platform (e.g., AWS, Azure, GCP) and its data services.
Experience with data modeling, ETL/ELT processes, and data warehousing solutions.
Knowledge of distributed systems, big data technologies, and real-time data processing.
Strong leadership, communication, and problem-solving skills.
Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Terraform).
Understanding of data governance, security, and compliance requirements.
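Since the role calls for hands-on Databricks pipeline work, here is a minimal PySpark sketch of a raw-to-cleansed (bronze-to-silver style) transform. The paths, columns, and table names are hypothetical, and the actual layering would follow the team's own conventions:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already provided as `spark`; creating one
# here keeps the sketch self-contained for local testing.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical raw orders feed (bronze layer).
raw = spark.read.json("/mnt/raw/orders/")

# Basic cleansing and typing (silver layer): drop malformed rows,
# normalise types, and deduplicate on the business key.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Write a partitioned Delta table for downstream consumers (Delta is the
# default table format on Databricks; the `silver` schema must exist).
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .format("delta")
      .saveAsTable("silver.orders"))
```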

Posted 4 weeks ago

Apply

8.0 - 12.0 years

15 - 30 Lacs

Gurugram

Work from Office

Role description
Lead and mentor a team of data engineers to design, develop, and maintain high-performance data pipelines and platforms.
Architect scalable ETL/ELT processes, streaming pipelines, and data lake/warehouse solutions (e.g., Redshift, Snowflake, BigQuery).
Own the roadmap and technical vision for the data engineering function, ensuring best practices in data modeling, governance, quality, and security.
Drive adoption of modern data stack tools (e.g., Airflow, Kafka, Spark) and foster a culture of continuous improvement.
Ensure the platform is reliable, scalable, and cost-effective across batch and real-time use cases.
Champion data observability, lineage, and privacy initiatives to ensure trust in data across the organization.

Skills
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
8+ years of hands-on experience in data engineering, with at least 2+ years in a leadership or managerial role.
Proven experience with distributed data processing frameworks such as Apache Spark, Flink, or Kafka.
Strong SQL skills and experience in data modeling, data warehousing, and schema design.
Proficiency with cloud platforms (AWS/GCP/Azure) and their native data services (e.g., AWS Glue, Redshift, EMR, BigQuery).
Solid grasp of data architecture, system design, and performance optimization at scale.
Experience working in an agile development environment and managing sprint-based delivery cycles.
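As a rough sketch of the modern-stack orchestration this posting mentions, here is a minimal Airflow DAG chaining an extract step into a load step. The DAG id, schedule, and task bodies are placeholders invented for illustration, not part of the job description:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull a batch from the source system.
    print("extracting batch")

def load():
    # Placeholder: write the transformed batch to the warehouse.
    print("loading batch")

with DAG(
    dag_id="orders_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older releases use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Load runs only after extract succeeds.
    extract_task >> load_task
```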

Posted 4 weeks ago

Apply

12.0 - 17.0 years

14 - 19 Lacs

Mumbai, Maharashtra

Work from Office

About the Role: Grade Level (for internal use): 12

The Team
You will be an expert contributor on the Ratings organization's Data Services Product Engineering Team. This team, which has broad, expert knowledge of the Ratings organization's critical data domains, technology stacks, and architectural patterns, fosters the knowledge sharing and collaboration that result in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings' next-generation analytics platform.

Responsibilities:
Architect, design, and implement innovative software solutions to enhance S&P Ratings' cloud-based analytics platform.
Mentor a team of engineers (as required), fostering a culture of trust, continuous growth, and collaborative problem-solving.
Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals.
Manage and improve existing software solutions, ensuring high performance and scalability.
Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes.
Produce comprehensive technical design documents and conduct technical walkthroughs.

Experience & Qualifications:
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field is required.
Proficient in software development lifecycle (SDLC) methodologies, including Agile and Test-Driven Development.
A total of 12+ years of experience, with 8+ years focused on designing enterprise products, modern data architectures, and analytics platforms.
6+ years of hands-on experience in application architecture and design, with proven knowledge of software and enterprise integration design patterns, as well as full-stack development across modern distributed front-end and back-end technology stacks.
5+ years of full-stack development experience using modern web development technologies, including proficiency in programming languages and UI frameworks, as well as experience with relational and NoSQL databases.
Experience in designing transactional systems, data warehouses, data lakes, and data integrations within a big data ecosystem using cloud technologies.
Thorough understanding of distributed computing principles.
A passionate, intelligent, and articulate developer with a quality-first mindset and a strong background in developing products for a global audience at scale.
Excellent analytical thinking, interpersonal skills, and oral and written communication skills, with a strong ability to influence both IT and business partners.
Superior knowledge of system architecture, object-oriented design, and design patterns.
Strong work ethic, self-starter mentality, and results-oriented approach.

Additional Preferred Qualifications:
Experience working with cloud service providers.
Familiarity with Agile frameworks, including scaled Agile methodologies.
Advanced degree in Computer Science, Information Systems, or a related field.
Hands-on experience in application architecture and design, with proven software and enterprise integration design principles.
Ability to prioritize and manage work to meet critical project timelines in a fast-paced environment.
Strong analytical and communication skills, with the ability to train and mentor others.
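The posting's emphasis on enterprise integration design patterns can be illustrated with a tiny in-process publish/subscribe sketch in Python. This is a generic textbook pattern, not S&P's actual architecture; every name in it is invented for illustration:

```python
from collections import defaultdict
from typing import Callable

class MessageBus:
    """Minimal publish/subscribe bus: publishers and subscribers are
    decoupled through topic names rather than direct references."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]):
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict):
        # Deliver the event to every handler registered for the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = MessageBus()
bus.subscribe("rating.updated", lambda e: print("analytics got:", e))
bus.subscribe("rating.updated", lambda e: print("audit log got:", e))
bus.publish("rating.updated", {"entity": "ACME Corp", "rating": "BBB+"})
```

The same decoupling idea underlies message-broker integrations (Kafka, SNS/SQS, and similar) in distributed analytics platforms.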

Posted 4 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Patna

Work from Office

Project description
We've been engaged by a large Australian company to work on their Digital Transformation project. We are building an offshore team that will work with the existing team. We require an experienced Microsoft SQL Server developer with broad exposure to financial markets. This will be a high-profile engagement, with a view to quickly growing our offshore footprint with this client.

Responsibilities
The role requires strong previous experience as an analyst in data-related functions and platforms, performing impact assessments, data discovery, data lineage, and current-state and gap analysis for complex data requirements.
Experience within financial services is preferred, ideally with exposure to the Banking and Finance data domain.
A natural ability to effectively influence and collaborate with multiple technical and non-technical stakeholder groups, and to work well with a range of internal cross-functional teams, is highly desirable.
Define problems and opportunities.
Perform business and technical data analysis that can be used to support or challenge business outcomes.
Effectively communicate requirements to stakeholders, managing conflicts, issues, and changes to ensure that stakeholders and project team members remain in agreement on solution scope.

Skills
Must have
8-12 years of experience as a SQL developer in a Microsoft environment.
Ability to write and optimize complex queries, handle large volumes, and remain efficient in a production setting.
Good knowledge of database data structures and data warehousing.
A bright candidate who is eager to learn.
Some exposure to or understanding of banking/financial markets.
Balance sheet management exposure is a great plus.
Technical knowledge of writing complex queries and stored procedures.
Understanding of data architecture and advanced data analysis skills.
Good communication skills; must be able to understand requirements and explain technical solutions to business stakeholders and the team.
Excellent troubleshooting and analytical skills; able to investigate, analyse, and troubleshoot tasks.
Experience in query optimization.
Familiarity with SSIS packages for integration and execution of stored procedures.
Track record of data analysis and writing SQL queries/stored procedures in large-scale applications and transformations.

Nice to have
A track record of working as a developer in the financial services industry, or exposure to loans, interest rates, and balance-sheet-related applications/datasets, is desirable but not mandatory.

Other Languages
English: C1 Advanced

Seniority
Senior
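To ground the stored-procedure requirement, here is a minimal Python sketch calling a SQL Server stored procedure through pyodbc. The connection string, procedure name, and parameters are all hypothetical stand-ins, not details from the engagement:

```python
import pyodbc

# Hypothetical connection details; real values would come from secure config.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=FinanceDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# A parameterized ODBC call keeps the execution plan cacheable and avoids
# SQL injection, the same reasons production queries favour parameters
# over string concatenation.
cursor.execute(
    "{CALL dbo.usp_GetBalanceSheet (?, ?)}",
    ("2024-06-30", "RETAIL"),
)

# Assumes the procedure returns a result set.
for row in cursor.fetchall():
    print(row)

cursor.close()
conn.close()
```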

Posted 4 weeks ago

Apply