
77 Data Architect Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 9.0 years

8 - 11 Lacs

Hubli, Mangaluru, Mysuru

Work from Office


Data Architect
Location: Bangalore | Job Type: Full Time | Experience Level: Mid-Senior

About the Role
We are looking for an experienced Data Architect to design and implement scalable, high-performance data architectures. You will be responsible for defining data models, optimizing data flows, and ensuring data governance across the organization. This role requires a deep understanding of database design, cloud data platforms, and big data technologies to support analytics and business intelligence.

Key Responsibilities
- Design and implement scalable data architectures that meet business and technical requirements.
- Develop data models, schemas, and metadata management strategies for structured and unstructured data.
- Define and enforce data governance, security, and compliance best practices.
- Work closely with data engineers, data scientists, and business teams to ensure efficient data pipelines and workflows.
- Optimize data storage and retrieval strategies for performance and cost-effectiveness.
- Architect cloud-based and on-premise data solutions (AWS, Azure, Google Cloud, etc.).
- Implement data integration and ETL/ELT processes using modern data platforms.
- Evaluate and recommend data management technologies, tools, and frameworks.
- Ensure data quality and integrity through robust data validation, monitoring, and error-handling mechanisms.
- Support advanced analytics, AI/ML workloads, and real-time data processing initiatives.

Required Skills & Qualifications
- Bachelor's / Master's degree in Computer Science, Data Science, or a related field.
- Extensive experience with relational and NoSQL databases (PostgreSQL, MySQL, MongoDB, Cassandra, etc.).
- Expertise in data modeling, data warehousing, and data lake architectures.
- Strong knowledge of big data technologies (Hadoop, Spark, Kafka, Snowflake, Redshift, BigQuery, etc.).
- Experience with ETL/ELT tools (Informatica, Talend, DBT, Apache NiFi, etc.).
- Hands-on experience with cloud data platforms (AWS, Azure, GCP) and their data services.
- Proficiency in SQL, Python, or Scala for data processing and scripting.
- Understanding of data security, privacy, and regulatory requirements (GDPR, HIPAA, CCPA, etc.).
- Strong analytical, problem-solving, and leadership skills.
- Awareness of data governance.

Preferred Qualifications
- Certifications in cloud data services (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
- Experience with real-time data processing and streaming (Kafka, Flink, Spark Streaming).
- Familiarity with machine learning and AI-driven data architectures.
- Knowledge of CI/CD and DevOps for data workflows.

Benefits
- Work with cutting-edge technologies and a talented team.
- Competitive salary and benefits package.
- Flexible work environment with growth opportunities.
- Access to professional development and learning resources.
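The "data validation, monitoring, and error-handling" responsibility above can be illustrated with a minimal sketch. This is a hypothetical example, not part of the listing: the schema, field names, and records are invented.

```python
# Minimal sketch of a data-quality validation pass, assuming an invented
# schema of required fields and expected types.
REQUIRED_FIELDS = {"id": int, "email": str, "amount": float}

def validate_record(record):
    """Return a list of human-readable problems found in one record."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"{field}: expected {expected_type.__name__}")
    return problems

def validate_batch(records):
    """Split a batch into clean rows and (row, problems) rejects."""
    clean, rejects = [], []
    for record in records:
        problems = validate_record(record)
        if problems:
            rejects.append((record, problems))
        else:
            clean.append(record)
    return clean, rejects

clean, rejects = validate_batch([
    {"id": 1, "email": "a@example.com", "amount": 9.5},
    {"id": "2", "email": "b@example.com"},  # wrong type + missing field
])
print(len(clean), len(rejects))  # 1 1
```

In a real pipeline the rejects would feed a monitoring dashboard or quarantine table rather than a print statement.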

Posted 2 days ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai

Work from Office


About BNP Paribas India Solutions: Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, the European Union's leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10,000 employees to provide support and develop best-in-class solutions.

About BNP Paribas Group: BNP Paribas is the European Union's leading bank and a key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial, Personal Banking & Services for the Group's commercial & personal banking and several specialised businesses, including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong, diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporates and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial & personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas, as well as a solid and fast-growing business in Asia-Pacific.
BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future while ensuring the Group's performance and stability.

Commitment to Diversity and Inclusion: At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit discrimination and harassment of any kind, and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to, their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status, etc. As a global bank, we truly believe that the inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in.

About Businessline/Function: CIB Client Engagement and Protection IT focuses on applications servicing Client Lifecycle Management, Due Diligence/KYC, Customer Relationship Management, Service Request Management, Referential, Transaction Monitoring (AML), and the Data & Document Platform. The landscape includes projects that are a mix of established systems and systems under transition to new platforms, custom developed as well as commercial products. Teams are located in Europe, North America and India, working in a distributed mode following Agile practices and CI-CD/DevSecOps practices, with a focus on automation testing coverage, monitoring tools, and observability.

Job Title: Data Technical Architect
Date: 01-May-25
Department: CEP IT
Location: Mumbai
Business Line / Function: Transaction Monitoring - AML
Reports to: (Direct) / (Functional)
Grade: (if applicable)
Number of Direct Reports: N/A
Directorship / Registration: NA

Position Purpose
The CEP AML IT team is in charge of AML monitoring tools for CIB and all regions.
AML monitoring tools are mainly used by Financial Security Compliance and CIB ITO LoD. We are seeking a highly skilled and experienced AML functional, technical, and data architect with expertise in Actimize models and rules. The candidate will have a strong background in developing and optimizing database models on the AML Data Lake architecture, specifically focusing on Actimize. They will be responsible for designing, implementing, and maintaining data architecture solutions that effectively support our AML and compliance activities, with a specific emphasis on Actimize models and rules. Working closely with stakeholders, including AML analysts, data scientists, and IT teams, the candidate will ensure that the data architecture solutions align with business requirements and adhere to relevant regulations and policies, while also optimizing Actimize models and rules for enhanced detection and prevention of financial crimes. Key responsibilities will include analyzing system requirements, devising data migration strategies, and ensuring the efficient and secure storage of company information.

Responsibilities

Direct Responsibilities
- Collaborate with stakeholders to understand AML functional requirements and translate them into Actimize solution design and architecture, and data requirements into data architecture solutions that support AML and compliance activities.
- Design and implement technical and data architecture solutions on the AML products and Data Lake architecture, ensuring scalability, performance, and data integrity.
- Work independently with the Program Manager to understand business requirements and translate them into technical solutions in the application.
- Configure Actimize modules and components to meet specific AML use cases and workflows.
- Integrate Actimize with other systems and data sources to ensure seamless data flow and information exchange.
- Develop and optimize database models to effectively store and retrieve AML-related data, considering data volume, complexity, and reporting needs.
- Establish and enforce data quality and data governance processes to ensure the accuracy, completeness, and consistency of AML data.
- Implement data security and access control processes to protect sensitive AML information and ensure compliance with security standards and privacy regulations.
- Evaluate and propose the integration of new technologies and innovative solutions to enhance AML data management processes, such as advanced analytics, machine learning, or automation.

Technical & Behavioral Competencies
- Minimum of 10 years of experience as a functional, technical, and data architect, with a strong focus on AML and compliance, demonstrating a deep understanding of industry best practices and regulatory requirements.
- Extensive expertise in Actimize technical and functional architecture, with a proven track record of successfully implementing Actimize models and rules that align with specific line-of-business needs.
- Able to work independently with the Program Manager to understand business requirements and translate them into technical solutions in the application.
- Demonstrated proficiency in developing and optimizing Actimize functional models and rules, as well as designing and optimizing database models on the AML Data Lake architecture.
- Strong experience in data-warehouse architectural design, providing efficient and effective solutions in the Compliance AML data domains.
- In-depth knowledge of AML and compliance regulations and policies, ensuring compliance with industry standards and legal requirements.
- Exceptional analytical and problem-solving skills, with the ability to identify and address complex issues related to AML and compliance architecture.
- Excellent communication and interpersonal skills, enabling effective collaboration with stakeholders at all levels of the organization.
- Ability to work both independently and as part of a team, demonstrating strong teamwork and collaboration skills.
- Strong project management skills, with the ability to effectively plan, prioritize, and execute projects within defined timelines and budgets.
- Good experience in technical analysis of n-tier applications with multiple integrations using object-oriented, API and microservices approaches.
- Very good understanding of the principles behind various DevSecOps practices and working experience with industry-standard tools.
- Experience with Agile methodology is a plus, showcasing adaptability and flexibility in project delivery.
- Good knowledge of front-end technologies, preferably Angular.
- Knowledge of software methodology practices: Agile methodology, SCRUM practices.

Business Skills
- IT / Business relation (Expert)
- Compliance Financial Security (Proficient)

IT Skills
- Database

Transversal Skills
- Ability to manage a project (Expert)
- Analytical ability (Expert)
- Ability to understand, explain and support change (Expert)
- Ability to set up relevant performance indicators

Behavioural Skills
- Ability to deliver / Results driven (Expert)
- Ability to collaborate (Expert)
- Adaptability (Expert)
- Personal impact / Ability to influence (Proficient)
- Resilience (Proficient)

Education Level: Bachelor's degree or equivalent
Experience Level: At least 10 years

Posted 3 days ago

Apply

10.0 - 15.0 years

11 - 21 Lacs

Hyderabad

Hybrid


Role & responsibilities
- Design and implement end-to-end data architecture, including data modeling, storage solutions, data pipelines, and security frameworks.
- Develop conceptual, logical, and physical data models to support analytical, transactional, and operational use cases.
- Lead the integration of structured and unstructured data from various sources into data lakes and data warehouses.
- Define and maintain data governance, security, data quality, and compliance standards (e.g., GDPR, HIPAA).
- Collaborate with cross-functional teams including data engineers, software developers, business analysts, and stakeholders to translate business needs into data solutions.
- Optimize data storage, retrieval, and processing workflows for performance and scalability.
- Evaluate and recommend new data management technologies, tools, and practices.
- Support real-time and batch processing solutions using modern big data and streaming technologies.
- Document data flows, data models, and architecture decisions.

Posted 3 days ago

Apply

5.0 - 10.0 years

0 - 3 Lacs

Noida

Work from Office


• Act as data domain expert for Snowflake in a collaborative environment, providing a demonstrated understanding of data management best practices and patterns.
• Design and implement robust data architectures to meet and support business requirements, leveraging Snowflake platform capabilities.
• Develop and enforce data modelling standards and best practices for Snowflake environments.
• Develop, optimize, and maintain Snowflake data warehouses.
• Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
• Ensure data architecture solutions meet performance, security, and scalability requirements.
• Stay current with the latest developments and features in Snowflake and related technologies, continually enhancing our data capabilities.
• Collaborate with cross-functional teams to gather business requirements, translate them into effective data solutions in Snowflake, and provide data-driven insights.
• Provide mentorship and guidance to junior data engineers and architects.
• Troubleshoot and resolve data architecture-related issues effectively.

Skills Requirement:
• 5+ years of proven experience as a Data Engineer, with 3+ years as a Data Architect.
• Proficiency in Snowflake, with hands-on experience with features such as clustering, materialized views, and semi-structured data processing.
• Experience in designing and building manual or auto-ingestion data pipelines using Snowpipe.
• Design and develop automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
• SnowSQL experience in developing stored procedures and writing queries to analyse and transform data.
• Working experience with ETL tools like Fivetran, dbt, MuleSoft.
• Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and time travel, and automating them.
• Excellent problem-solving skills and attention to detail.
• Effective communication and collaboration abilities.
• Relevant certifications (e.g., SnowPro Core / Advanced) are a must-have.
• Must have expertise in AWS, Azure, and the Salesforce Platform as a Service (PaaS) model and its integration with Snowflake to load/unload data.
• Strong communicator and exceptional team player with effective problem-solving skills.

Educational Qualification Required:
• Master's degree in Business Management (MBA / PGDM) / Bachelor's degree in Computer Science, Information Technology, or a related field.
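Two of the Snowflake concepts named above, zero-copy clone and time travel, boil down to short SQL statements. As a rough sketch (the table names are invented, and the statements would actually be run via SnowSQL or a connector against a Snowflake account), here is what they look like:

```python
# Sketch of the Snowflake SQL behind "zero-copy clone" and "time travel".
# Object names are hypothetical; this only builds the statements.

def clone_statement(source, target):
    """Zero-copy clone: a metadata-only copy, no storage duplicated."""
    return f"CREATE TABLE {target} CLONE {source};"

def time_travel_query(table, minutes_back):
    """Query a table as it existed N minutes ago (within retention)."""
    return f"SELECT * FROM {table} AT(OFFSET => -60 * {minutes_back});"

print(clone_statement("sales.orders", "sales.orders_backup"))
print(time_travel_query("sales.orders", 30))
```

The `AT(OFFSET => ...)` clause takes seconds relative to now, which is why the sketch multiplies minutes by 60.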

Posted 3 days ago

Apply

12.0 - 20.0 years

35 - 50 Lacs

Bengaluru

Hybrid


Data Architect with cloud expertise: Data Architecture, Data Integration & Data Engineering. ETL/ELT: Talend, Informatica, Apache NiFi. Big Data: Hadoop, Spark. Cloud platforms: AWS, Azure, GCP; Redshift, BigQuery. Languages: Python, SQL, Scala. Compliance: GDPR, CCPA.

Posted 6 days ago

Apply

4.0 - 12.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


- Hands-on experience in developing chatbots using Dialogflow CX and ES.
- Experience in designing/developing user-centric conversation experiences involving chat, text, or voice.
- Experience in architecting and building AI-powered chatbot applications using platforms such as Dialogflow CX.
- Sound knowledge of cloud platforms (GCP/AWS/Azure).
- Experience in integrating APIs using NodeJS and Python.
- Construct intents, entities, and annotations in the Dialogflow tool.
- Write API documentation that outlines the endpoints customers need to implement CCAI on their end.
- Liaise with the customer and Data Architect on use-case requirements and API technical requirements.
- Label agent questions and answers; the labels used by conversational architects in the conversation nodes can be considered as Pages in Dialogflow.
- Conversational architects need at least 50 complete interactions from start to finish to model the conversation flow effectively.

Keywords: Google Dialogflow, Dialogflow ES/CX
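The intent/entity/page concepts this listing refers to can be sketched abstractly. Everything below is invented for illustration (the real objects live in the Dialogflow CX console or its API; this is only a toy model of the concepts):

```python
# Toy model of Dialogflow CX building blocks: intents, entities, and
# pages (conversation nodes). All names and data here are hypothetical.
agent = {
    "intents": [
        {"name": "check_order",
         "training_phrases": ["where is my order", "track order 123"]},
    ],
    "entities": [
        {"name": "order_id", "kind": "regexp", "pattern": r"\d+"},
    ],
    # In CX, each conversation node the architect labels becomes a Page.
    "pages": ["Start", "CollectOrderId", "GiveStatus"],
}

def match_intent(utterance):
    """Toy keyword matcher standing in for real NLU intent detection."""
    for intent in agent["intents"]:
        if any(p.split()[0] in utterance for p in intent["training_phrases"]):
            return intent["name"]
    return "fallback"

print(match_intent("where is my order"))  # check_order
```

A real agent matches intents with a trained NLU model rather than keywords; the sketch only shows how the pieces relate.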

Posted 1 week ago

Apply

12.0 - 18.0 years

25 - 30 Lacs

Bengaluru

Work from Office


1. Familiar with data modelling and able to apply it to new business requirements
2. Hands-on with SQL, advanced SQL, and NoSQL databases
3. DWH and ETL experience
4. Able to interact with customers and understand business requirements
5. Familiar with Tableau and Power BI
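The data-modelling and DWH skills this listing asks for can be illustrated with a tiny star-schema sketch. Table and column names are invented, and SQLite stands in for a real warehouse:

```python
import sqlite3

# A minimal star schema: one fact table keyed to one dimension table.
# A real DWH model would have many dimensions and conformed keys.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (1, 1, 10.0), (2, 1, 5.0), (3, 2, 7.5);
""")
# Typical dimensional query: aggregate the fact, grouped by a dimension.
rows = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 15.0)]
```

The separation of descriptive attributes (dimension) from measures (fact) is what makes this kind of grouped aggregation cheap and uniform.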

Posted 1 week ago

Apply

2.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Hiring for MS SQL Server, Bangalore location.
- Develop SSIS packages based on instructions provided by the Data Architect.
- Maintain code in TFS and deploy SSIS packages to an SSIS server.
- Retrieve data from databases, design data solutions, create and maintain reporting processes, and resolve IT and data issues using SSRS (SQL Server Reporting Services).
- Develop triggers, functions, and stored procedures to support this effort.
- Support production processes.
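The trigger work mentioned in this listing follows a common pattern: audit a table automatically on insert. As a hedged sketch (SQLite stands in for SQL Server here, and the table names are invented), the pattern looks like this:

```python
import sqlite3

# Sketch of an audit trigger: every insert into `orders` automatically
# writes a row to `orders_audit`. SQLite syntax, standing in for T-SQL.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE orders_audit (order_id INTEGER, note TEXT);
    CREATE TRIGGER trg_orders_insert AFTER INSERT ON orders
    BEGIN
        INSERT INTO orders_audit VALUES (NEW.id, 'inserted');
    END;
""")
con.execute("INSERT INTO orders VALUES (1, 99.0)")
audit = con.execute("SELECT * FROM orders_audit").fetchall()
print(audit)  # [(1, 'inserted')]
```

In SQL Server the trigger body would use the `inserted` pseudo-table instead of `NEW`, but the shape of the pattern is the same.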

Posted 1 week ago

Apply

1.0 - 6.0 years

50 - 60 Lacs

Bengaluru

Work from Office


Number of Openings*: 1 Data Architect; 2 Application Architect
Approved ECMS RQ#*: Will provide post budget approval
Duration of contract*: 1 year
Total Yrs. of Experience*: 12-15
Relevant Yrs. of experience*: 7
Detailed JD* (Roles and Responsibilities): Attached
Mandatory skills*: Java, Spring Boot, Kafka, NoSQL (Cosmos Mongo, PostgreSQL, Cassandra), Azure SQL, Azure Kubernetes, Azure Cloud platform
Desired skills*: React JS
Domain*: Retail-Client, Transportation/Logistics, Supply Chain
Approx. vendor billing rate* (excluding service tax): 16,000 INR/Day
Work Location*: Hyderabad, Chennai
Background check process (before/after onboarding)*: After onboarding; can try for exception from customer if needed
BGV Agency: *
Mode of Interview (Telephonic/Face to Face/Virtual): F2F preferred; 2nd choice: Virtual

Posted 1 week ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Hyderabad

Work from Office


Qentelli Solutions is looking for a Data Architect - AWS to join our dynamic team and embark on a rewarding career journey.
- Designs AWS-based data solutions and warehouses
- Optimizes databases for performance and scalability
- Implements data security and governance
- Guides teams on cloud best practices

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Kanpur

Work from Office


Key Responsibilities: As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience with designing customer-centric datasets (i.e., CRM, Call Center, Marketing, Offline, Point of Sale, etc.).
- 5+ years of Data Modeling experience (i.e., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced Data Warehouse concepts.
- Proven experience with industry ETL tools (i.e., Informatica, Unifi).
- Solid experience with Business Requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with Reporting Technologies (i.e., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, PowerBI

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Hyderabad

Work from Office


Key Responsibilities: As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience with designing customer-centric datasets (i.e., CRM, Call Center, Marketing, Offline, Point of Sale, etc.).
- 5+ years of Data Modeling experience (i.e., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced Data Warehouse concepts.
- Proven experience with industry ETL tools (i.e., Informatica, Unifi).
- Solid experience with Business Requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with Reporting Technologies (i.e., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, PowerBI

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Noida

Work from Office


- Develop RESTful APIs using Azure APIM.
- Develop integration workflows using Logic Apps, Synapse, and Service Bus.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation using Azure Data Factory and Synapse pipelines.
- Collaborate closely with Product Owners to understand data pipeline requirements and design effective data workflows.
- Translate business requirements into technical specifications for data pipelines.
- Create and maintain data storage solutions using Azure Cosmos DB and Azure Data Lake Storage.
- Design and implement data models to optimize data storage and retrieval.
- Ensure data security and compliance with data governance policies.
- Analyze data pipeline performance metrics to identify bottlenecks and areas for improvement.
- Monitor data pipelines to ensure data consistency, availability, and adherence to service-level agreements.
- Integrate data pipelines with Azure DevOps to automate data pipeline deployment and testing processes.
- Leverage Azure DevOps tools for continuous integration and continuous delivery (CI/CD) of data pipelines.
- Work effectively in an Agile development environment and collaborate with cross-functional teams to deliver value.

About you:
- 5 years of work experience, including a minimum of 3 years in Microsoft Azure (Azure administration, Data Platform, Data Lake, Synapse pipelines, Synapse Analytics, API Management, and other data cloud architecture).
- Development environment: Git, Azure DevOps, ARM templates.
- Languages: C#, .NET, Python.
- Strong analytical problem solver with an organized approach.
- Fluent English.
- Excellent methodology (communication, documentation, collaborative approach).
- Acts independently and as a top-level contributor in resolving project strategy, scope, and direction.
- Excellent organizational skills and a proven ability to get results.
- Data mindset.

Nice to have:
- Microsoft Azure certifications.
- Scala, Java.
- Data-related projects: 5 years minimum.
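The ingest, transform, load shape that tools like Azure Data Factory and Synapse pipelines orchestrate can be sketched in plain Python. Stage names and fields below are invented for illustration; real pipelines would read from and write to actual sources and sinks:

```python
# Hedged sketch of a three-stage data pipeline. Each function stands in
# for a pipeline activity; the data and field names are hypothetical.

def ingest():
    """Stand-in for reading raw rows from a source system."""
    return [{"user": "a", "clicks": "3"}, {"user": "b", "clicks": "5"}]

def transform(rows):
    """Cast types and shape records, as a transformation activity would."""
    return [{"user": r["user"], "clicks": int(r["clicks"])} for r in rows]

def load(rows, sink):
    """Stand-in for writing to Cosmos DB / Data Lake Storage."""
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(ingest()), sink)
print(loaded, sink[0])  # 2 {'user': 'a', 'clicks': 3}
```

Keeping each stage a pure function is also what makes pipelines easy to unit-test in the CI/CD setup the listing describes.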

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Nagpur

Work from Office


Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance for reporting technologies such as Tableau and Power BI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple customer projects simultaneously.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, Power BI
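The customer-centric dataset design this role describes can be sketched in miniature: joining CRM profiles with Point-of-Sale transactions on a shared customer key. This is a hypothetical illustration in plain Python; the field names and join key are assumptions, not a real schema (production work would use Snowflake and the ETL tooling named above).

```python
# Hypothetical sketch: unify CRM and Point-of-Sale feeds into one
# customer-centric dataset, keyed by an assumed shared customer_id.

def build_customer_360(crm_rows, pos_rows):
    """Enrich CRM profiles with aggregated POS spend per customer."""
    spend = {}
    for sale in pos_rows:  # aggregate offline purchases per customer
        cid = sale["customer_id"]
        spend[cid] = spend.get(cid, 0.0) + sale["amount"]
    unified = []
    for profile in crm_rows:  # attach spend to each CRM profile
        unified.append({
            **profile,
            "total_pos_spend": round(spend.get(profile["customer_id"], 0.0), 2),
        })
    return unified

crm = [{"customer_id": 1, "name": "Asha"}, {"customer_id": 2, "name": "Ravi"}]
pos = [{"customer_id": 1, "amount": 250.0}, {"customer_id": 1, "amount": 99.5}]
result = build_customer_360(crm, pos)  # Ravi has no POS activity -> 0.0
```

At warehouse scale the same join would be expressed in SQL or Spark, but the modeling decision is identical: pick the conformed customer key and aggregate each source to that grain before joining.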

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Ahmedabad

Work from Office


Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance for reporting technologies such as Tableau and Power BI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple customer projects simultaneously.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, Power BI

Posted 1 week ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Noida

Work from Office


Be part of the solution at Technip Energies and embark on a one-of-a-kind journey. You will be helping to develop cutting-edge solutions to solve real-world energy problems. We are currently seeking an Azure Data Architect to join our Digi team based in Noida.

About us: Technip Energies is a global technology and engineering powerhouse. With leadership positions in LNG, hydrogen, ethylene, sustainable chemistry, and CO2 management, we are contributing to the development of critical markets such as energy, energy derivatives, decarbonization, and circularity. Our complementary business segments, Technology, Products and Services (TPS) and Project Delivery, turn innovation into scalable and industrial reality. Through collaboration and excellence in execution, our 17,000+ employees across 34 countries are fully committed to bridging prosperity with sustainability for a world designed to last.

About the opportunity we offer:
- Develop RESTful APIs using Azure API Management (APIM)
- Develop integration workflows using Logic Apps, Synapse, and Service Bus
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation using Azure Data Factory and Synapse pipelines
- Collaborate closely with Product Owners to understand data pipeline requirements and design effective data workflows
- Translate business requirements into technical specifications for data pipelines
- Create and maintain data storage solutions using Azure Cosmos DB and Azure Data Lake Storage
- Design and implement data models to optimize data storage and retrieval
- Ensure data security and compliance with data governance policies
- Analyze data pipeline performance metrics to identify bottlenecks and areas for improvement
- Monitor data pipelines to ensure data consistency, availability, and adherence to service-level agreements
- Integrate data pipelines with Azure DevOps to automate deployment and testing processes
- Leverage Azure DevOps tools for continuous integration and continuous delivery (CI/CD) of data pipelines
- Work effectively in an Agile development environment and collaborate with cross-functional teams to deliver value

About you:
- 5 years of work experience, including a minimum of 3 years in Microsoft Azure (Azure administration, Data Platform, Data Lake, Synapse pipelines, Synapse Analytics, API Management, and other cloud data architecture)
- Development environments: Git, Azure DevOps, ARM templates
- Languages: C#, .NET, Python
- Strong analytical problem solver with an organized approach
- Fluent English
- Excellent methodology (communication, documentation, collaborative approach)
- Acts independently and as a top-level contributor in resolving project strategy, scope, and direction
- Excellent organizational skills and a proven ability to get results
- Data mindset

Nice to have:
- Microsoft Azure certifications
- Scala, Java
- Data-related projects: 5 years minimum

Your career with us: Working at Technip Energies is an inspiring journey, filled with groundbreaking projects and dynamic collaborations. Surrounded by diverse and talented individuals, you will feel welcomed, respected, and engaged. Enjoy a safe, caring environment where you can spark new ideas, reimagine the future, and lead change. As your career grows, you will benefit from learning opportunities at T.EN University, such as The Future Ready Program, and from the support of your manager through check-in moments like the Mid-Year Development Review, fostering continuous growth and development.

What's next? Once we receive your application, our Talent Acquisition professionals will screen and match your profile against the role requirements. We ask for your patience while the team works through the volume of applications within a reasonable timeframe. You can check your application progress periodically via the candidate profile created during your application.
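The ingest-transform-load pattern behind these pipeline responsibilities can be sketched as three small stages; in practice Azure Data Factory or Synapse pipelines orchestrate such steps as chained activities. The stage functions and record shape below are illustrative assumptions, not a real pipeline definition.

```python
# Toy sketch of the ingest -> transform -> load stages that an ADF/Synapse
# pipeline would orchestrate as activities. Shapes and names are assumed.

def ingest(raw_lines):
    """Ingestion: drop blank lines, trim whitespace."""
    return [line.strip() for line in raw_lines if line.strip()]

def transform(records):
    """Transformation: cast each record into a typed row."""
    return [{"reading": float(r)} for r in records]

def load(rows, sink):
    """Load: append rows to the target store, return rows written."""
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(ingest(["1.5\n", "\n", "2.5\n"])), sink)
```

Each stage has one responsibility and a checkable output, which is what makes the real pipelines monitorable against service-level agreements, as the listing describes.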

Posted 1 week ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Data Architect
- Total Yrs. of Experience*: 15+
- Relevant Yrs. of experience*: 8+

Detailed JD (Roles and Responsibilities):
- Leadership qualities and the ability to lead a team of 8 data engineers plus Power BI resources
- Able to engage with business users and IT to provide consultation on data and visualization needs
- Excellent communication, articulation, and presentation skills
- Exposure to data architecture and ETL architecture
- Design, develop, and maintain scalable data pipelines using Python, ADF, and Databricks
- Implement ETL processes to extract, transform, and load data from various sources into Snowflake
- Ensure data is processed efficiently and made available for analytics and reporting
- 8+ years of experience in data engineering, with a focus on Python, ADF, Snowflake, Databricks, and ETL processes
- Proficiency in SQL and experience with cloud-based data storage and processing
- Strong problem-solving skills and the ability to work in a fast-paced environment
- Experience with Agile methodologies and working in a collaborative team environment
- Certification in Snowflake, Azure, or other relevant technologies is an added advantage
- Bachelor's degree in Computer Science, Engineering, Information Systems, or an equivalent field

Mandatory skills*: Python, Snowflake, Azure Data Factory, Databricks, SQL
Desired skills*: 1. Strong oral and written communication 2. Proactive and accountable for deliverable quality and timely submission
Domain*: Retail
Work Location*: India (PAN India)
Yrs of Exp: 15+
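As a rough illustration of the ETL flow into Snowflake mentioned above, the sketch below cleans and loads source rows into a warehouse table, using Python's built-in sqlite3 as a local stand-in. The table name and rejection rule are assumptions; a real pipeline would run in ADF/Databricks against a Snowflake connector.

```python
import sqlite3

# sqlite3 stands in for the Snowflake target so the sketch is runnable.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# Raw source extract: the last record is deliberately malformed.
source = [("north", "100.0"), ("south", "250.5"), ("north", "bad")]

def clean(row):
    """Transform: normalize the region and cast the amount, or reject."""
    try:
        return (row[0].upper(), float(row[1]))
    except ValueError:
        return None  # malformed record rejected before load

rows = [r for r in map(clean, source) if r is not None]
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

The rejection step is where the "ensure data is processed efficiently" responsibility bites in practice: bad records are quarantined in the transform, not discovered downstream in reports.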

Posted 1 week ago

Apply

12.0 - 13.0 years

35 - 60 Lacs

Bengaluru

Work from Office


ECMS # 528867
- Number of Openings: 1
- Duration of project: 6 months initially
- No. of years experience: 12 years
- Mandatory Skills: Databricks, Python, PySpark, Architect
- Vendor Billing range (local currency/day): 10000 INR/day
- Work Location: Bangalore/Pune
- Hybrid/Remote/WFO: Hybrid
- BGV Pre/Post onboarding: Pre-onboarding
- Shift timings: General shift, with the flexibility to extend a couple of hours if required

Role: Data Architect
In the role of Data Architect, you will interface with key stakeholders and apply your knowledge to understand the business and business data across source systems. You will play an important role in creating a detailed business data understanding, outlining problems, opportunities, and data solutions for the business.

Basic:
- Bachelor's degree or foreign equivalent from an accredited institution. Three years of progressive experience in the specialty will also be considered in lieu of every year of education.
- At least 12+ years of experience with Information Technology and 5+ years as a Data Architect
- Extensive experience in the design and architecture of large data transformation systems

Preferred:
- Understanding of the business area that the project is involved with
- Working with data stewards to understand the data sources
- Clear understanding of data entities, relationships, cardinality, etc. for the inbound sources, based on inputs from the data stewards / source system experts
- Performance tuning: understanding the overall requirement and reporting impact
- Data modeling for the business and reporting models, as required by the reporting needs or delivery needs of other downstream systems
- Experience with components and languages such as Databricks, Python, PySpark, Scala, and R
- Ability to ask strong questions to help the team see areas that may lead to problems
- Ability to validate the data by writing SQL queries and comparing against the source system and transformation mapping
- Work closely with teams to collect and translate information requirements into data to develop data-centric solutions
- Ensure that industry-accepted data architecture principles and standards are integrated and followed for modeling, stored procedures, replication, regulations, and security, among other concepts, to meet technical and business goals
- Continuously improve the quality, consistency, accessibility, and security of data activity across company needs
- Experience with Azure DevOps project tracking or equivalent tools such as JIRA
- Outstanding verbal and non-verbal communication
- Experience with, and a desire to work in, a global delivery environment
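The validation step described above, writing SQL queries and comparing against the source system, amounts to a reconciliation of row counts and aggregates between source and target. The sketch below uses sqlite3 to stand in for both systems; the table names and the count/sum checks are assumptions about what a given mapping would verify.

```python
import sqlite3

# One in-memory database plays both the source system and the target
# so the reconciliation is runnable end to end.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amt REAL);
    CREATE TABLE tgt (id INTEGER, amt REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")

def reconcile(conn, src_table, tgt_table):
    """Compare row counts and amount sums between two tables."""
    q = "SELECT COUNT(*), COALESCE(SUM(amt), 0) FROM {}"
    s = conn.execute(q.format(src_table)).fetchone()
    t = conn.execute(q.format(tgt_table)).fetchone()
    return {
        "rows_match": s[0] == t[0],
        "sums_match": abs(s[1] - t[1]) < 1e-9,  # float-safe comparison
    }

report = reconcile(conn, "src", "tgt")
```

In Databricks the same queries would run over the source extract and the transformed Delta table, with any mismatch traced back to the transformation mapping.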

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Lucknow

Work from Office


Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance for reporting technologies such as Tableau and Power BI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple customer projects simultaneously.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, Power BI

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Chennai

Work from Office


Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance for reporting technologies such as Tableau and Power BI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple customer projects simultaneously.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, Power BI

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Bengaluru

Work from Office


Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance for reporting technologies such as Tableau and Power BI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple customer projects simultaneously.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, Power BI

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 20 Lacs

Jaipur

Work from Office


Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models (Relational, Dimensional, Columnar, Big Data) to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance for reporting technologies such as Tableau and Power BI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, managing and prioritizing multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple customer projects simultaneously.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, Power BI

Posted 1 week ago

Apply

10.0 - 20.0 years

45 - 55 Lacs

Noida, Hyderabad, Gurugram

Work from Office


Data Architect (Telecom Domain)

To design comprehensive data architectures and technical solutions for telecommunications industry challenges, leveraging TM Forum frameworks and modern data platforms. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives.

Responsibilities:
- Design and articulate enterprise-scale telecom data architectures incorporating TM Forum standards and frameworks, including SID (Shared Information/Data Model), TAM (Telecom Application Map), and eTOM (enhanced Telecom Operations Map)
- Develop comprehensive data models aligned with TM Forum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management
- Create data architectures that support telecom-specific use cases including customer journey analytics, network performance optimization, fraud detection, and revenue assurance
- Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics
- Conduct technical discovery sessions with telecom clients to understand their OSS/BSS architecture, network analytics needs, customer experience requirements, and digital transformation objectives
- Design and deliver proofs of concept (POCs) and technical demonstrations showcasing modern data platforms solving real-world telecommunications challenges
- Create comprehensive architectural diagrams and implementation roadmaps for telecom data ecosystems spanning cloud, on-premises, and hybrid environments
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on telecom-specific requirements and regulatory compliance needs
- Design data governance frameworks compliant with telecom industry standards and regulatory requirements (GDPR, data localization, etc.)
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities
- Contribute to the development of best practices, reference architectures, and reusable solution components to accelerate proposal development

Qualifications:
- Bachelor's or Master's degree in Computer Science, Telecommunications Engineering, Data Science, or a related technical field
- 10+ years of experience in data architecture, data engineering, or solution architecture roles, with at least 5 years in the telecommunications industry
- Deep knowledge of TM Forum frameworks, including SID, eTOM, and TAM, and their practical implementation in telecom data architectures
- Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines for complex telecom data initiatives
- Hands-on experience building data models and platforms aligned with TM Forum standards and telecommunications business processes
- Strong understanding of telecom OSS/BSS systems, network management, customer experience management, and revenue management domains
- Hands-on experience with data platforms, including Databricks and Microsoft Azure, in telecommunications contexts
- Experience with modern data processing frameworks such as Apache Kafka, Spark, and Airflow for real-time telecom data streaming
- Proficiency in the Azure cloud platform and its data services, with an understanding of telecom-specific deployment requirements
- Knowledge of system monitoring and observability tools for telecommunications data infrastructure
- Experience implementing automated testing frameworks for telecom data platforms and pipelines
- Familiarity with telecom data integration patterns, ETL/ELT processes, and data governance practices specific to telecommunications
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines for telecom use cases
- Proficiency in programming languages commonly used in data processing (Python, Scala, SQL), with telecom domain applications
- Understanding of telecommunications regulatory requirements and data privacy compliance (GDPR, local data protection laws)
- Excellent communication and presentation skills, with the ability to explain complex technical concepts to telecom stakeholders
- Strong problem-solving skills and the ability to think creatively to address telecommunications industry challenges

Good to have:
- TM Forum certifications or telecommunications industry certifications
- Relevant data platform certifications, such as Databricks or Azure Data Engineer, are a plus
- Willingness to travel as required

If you meet all or most of the criteria, contact bdm@intellisearchonline.net, M: 9341626895

Posted 1 week ago


15.0 - 19.0 years

40 - 45 Lacs

Pune

Work from Office


Skill Name: Data Architect with Azure & Databricks + Power BI
Experience: 15 - 19 years

Responsibilities:
Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
Develop high-level and detailed data architecture and design documentation.
Implement data management and data governance strategies, ensuring compliance with industry standards.
Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
Design and manage data pipeline processes for historic data migration and data integration.
Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
Extensive experience with common Azure services such as ADLS, Synapse, Databricks, and Azure SQL.
Experience with Azure services such as ADF, PolyBase, and Azure Stream Analytics.
Proven expertise in Databricks architecture, Delta Lake, Delta Sharing, Unity Catalog, data pipelines, and Spark tuning.
Strong knowledge of Power BI architecture, DAX, and dashboard optimization.
In-depth experience with SQL, Python, and/or PySpark.
Hands-on knowledge of data governance, lineage, and cataloging tools such as Azure Purview and Unity Catalog.
Experience implementing CI/CD pipelines for data and BI components (e.g., using DevOps or GitHub).
Experience building semantic models in Power BI.
Strong expertise in data exploration using SQL and a deep understanding of data relationships.
Extensive knowledge and implementation experience in data management, governance, and security frameworks.
Proven experience in creating high-level and detailed data architecture and design documentation.
Strong aptitude for business analysis to understand domain data requirements.
Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
Hands-on experience architecting end-to-end data solutions for both batch and real-time designs.
Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
Familiarity with Data Fabric and Data Mesh architecture is a plus.
Excellent verbal and written communication skills.

Posted 1 week ago


10.0 - 15.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office


Roles and responsibilities:
Work closely with the Product Owners and stakeholders to design the technical architecture for the data platform to meet the requirements of the proposed solution.
Work with the leadership to set the standards for software engineering practices within the machine learning engineering team, and provide support across other disciplines.
Play an active role in leading team meetings and workshops with clients.
Choose and use the right analytical libraries, programming languages, and frameworks for each task.
Help the Data Engineering team produce high-quality code that allows us to put solutions into production.
Create and own the technical product backlogs for products, and help the team close the backlogs on time.
Refactor code into reusable libraries, APIs, and tools.
Help us shape the next generation of our products.

What We're Looking For:
10+ years of total experience in data management, including implementation of modern data ecosystems on AWS/cloud platforms.
Strong experience with AWS ETL/file movement tools (Glue, Athena, Lambda, Kinesis, and other AWS integration stack).
Strong experience with Agile development and SQL.
Strong experience with two or three AWS database technologies (Redshift, Aurora, RDS, S3, and other AWS data services), covering security, policies, and access management.
Strong programming experience with Python and Spark.
Ability to learn new technologies quickly.
Experience with Apache Airflow and other automation stack.
Excellent data modeling skills.
Excellent oral and written communication skills.
A high level of intellectual curiosity, external perspective, and interest in innovation.
Strong analytical, problem-solving, and investigative skills.
Experience in applying quality and compliance requirements.
Experience with security models and development on large data sets.

Posted 1 week ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
