
121 Data Modelling Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

0 - 2 Lacs

Pune

Work from Office


Primary Skills: Stored procedures, Relational Data Modeling, SQL, Enterprise Data Modeling, Data Modeling
Secondary Skills: Snowflake Modeling, Snowflake warehouse, Snowflake
Degree: Bachelor of Computer Science, BCA, BE, BE Computer Engineering, BE-IT, BTECH, M.Tech, MCA
Branch: Computer Science and Engineering, Computer Engineering

Job Description
Must Have Skills:
- 4-8 years of overall experience, including 4 years designing, implementing, and documenting data architecture and data modeling solutions using Azure SQL and Snowflake databases and SQL procedures.
- Knowledge of relational databases and data architecture systems, including SQL.
- Develop conceptual, logical, and physical data models and implement operational data stores (ODS), data marts, and data lakes on the target platforms (Azure SQL and Snowflake databases).
- Knowledge of ER modeling, big data, enterprise data, and physical data models.
- Oversee and govern the expansion of existing data architecture and optimize data query performance via best practices.
- Able to work both independently and collaboratively.
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
- Strong knowledge of Data Quality and Data Governance.
- Knowledge of ETL.

Professional Skills:
- Solid written, verbal, and presentation communication skills.
- Strong team and individual player; maintains composure in all situations and is collaborative by nature.
- High standards of professionalism, consistently producing high-quality results.
- Self-sufficient and independent, requiring little supervision or intervention.
- Demonstrates flexibility and openness in bringing creative solutions to issues.

Role & responsibilities: Outline the day-to-day responsibilities for this role.
Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
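For orientation, the Snowflake and stored-procedure requirements above can be illustrated with a minimal sketch. This is not from the posting: it assumes the snowflake-connector-python package, placeholder credentials, and hypothetical table and procedure names.

```python
# Hypothetical sketch: a small dimension/fact pair plus a SQL stored procedure
# in Snowflake via the official Python connector. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ANALYTICS_WH", database="DEMO_DB", schema="MODELING",
)
cur = conn.cursor()

# Dimension and fact tables for a minimal relational/dimensional model.
cur.execute("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key INTEGER AUTOINCREMENT PRIMARY KEY,
        customer_id  VARCHAR NOT NULL,
        region       VARCHAR
    )
""")
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_key    INTEGER AUTOINCREMENT PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer (customer_key),
        order_date   DATE,
        amount       NUMBER(12, 2)
    )
""")

# A simple SQL stored procedure that rebuilds a reporting aggregate.
cur.execute("""
    CREATE OR REPLACE PROCEDURE refresh_daily_sales()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
        CREATE OR REPLACE TABLE daily_sales AS
        SELECT order_date, SUM(amount) AS total_amount
        FROM fact_orders
        GROUP BY order_date;
        RETURN 'ok';
    END;
    $$
""")
cur.execute("CALL refresh_daily_sales()")
conn.close()
```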

Posted 4 hours ago

Apply

5.0 - 8.0 years

10 - 15 Lacs

Pune

Hybrid


Role Synopsis
This is a senior-level position within a data science team, responsible for providing leadership, strategic direction, and technical expertise in data science and machine learning. The role involves leading and guiding a team of data scientists while collaborating closely with cross-functional departments such as engineering, product management, and business stakeholders. The Lead Data Scientist plays a pivotal role in crafting data-driven strategies and solutions that contribute to the organization's success and growth.

Key Accountabilities
- Data Analysis and Modeling: A proven foundation in data analysis and a deep understanding of machine learning algorithms, applying these techniques to address sophisticated problems and extract valuable insights from data.
- Out-of-core computing: Use libraries that support out-of-core computing, such as Dask in Python; these libraries process data that doesn't fit into memory by reading it from disk in smaller portions (see the sketch after this description).
- Business Insight: Understand the FDO's business objectives and align data initiatives with them.
- Project Management: Skill in project management methodologies helps plan and drive data science projects efficiently.
- Machine Learning: Advanced machine learning skills for building and evaluating complex models; drive innovation and strategy.
- Collaboration and Communication: Communicate effectively with collaborators; explain the modeling approach and results; balance recommendations with privacy guidelines.
- Continuous Learning: Stay up to date to maintain a competitive edge, apply methodologies to practical business challenges, and meet with domain GPOs.
- Data cleaning, preprocessing, and analysis: The ability to clean and preprocess data effectively is a fundamental skill in any data science role.
- Data Ethics and Privacy: Open communication with customers, ethical considerations in algorithm design, secure data handling, and data minimization.
- Database Management: Proficiency in database systems and SQL is required for data retrieval and storage.
- Domain knowledge: Expertise in the working domain to better understand the context and requirements of data projects.
- Statistical Analysis and Mathematics: A solid grasp of statistical methods and mathematical concepts is needed for data analysis, modeling, and drawing substantive insights from data.

Experience and Job Requirements
The Data Science Team plays a crucial role in driving data-informed decision-making and generating actionable insights to support the company's goals. The team processes, analyzes, and interprets large and complex datasets from multiple sources to provide valuable insights and recommendations across various domains. Through advanced analytical techniques and machine learning models, the data science team helps optimize processes, predict trends, and build data-driven strategies. A bachelor's or master's degree (or equivalent experience) in a quantitative or qualitative field such as Computer Science, Statistics, Mathematics, Physics, Engineering, or a related data field is often required.

Skills: Leadership in data analysis; programming proficiency in Python, SQL, and Azure Databricks; statistics and mathematics; leadership qualities to steer the team; strategic direction and technical expertise.
Soft skills: Active listening, translating business problems into data questions, communication and collaboration, presentation, problem solving, cross-functional working, team management, partner management.
Data Sources: SAP, Concur, Salesforce, Workday, Excel files.
Other: Project management; domain knowledge (Procurement, Finance, Customer); business insight; critical thinking; storytelling. Able to prepare analytical reports, presentations, and/or visualisation dashboards to communicate findings, key metrics, and insights to both technical and non-technical customers. Stay up to date with industry trends, standard methodologies, and new technologies in data analytics, machine learning, and data science techniques.
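The out-of-core accountability referenced above can be shown in a few lines. A minimal sketch, assuming Dask is installed and a hypothetical directory of CSV files too large for memory; the paths and column names are placeholders.

```python
# Hypothetical sketch: process a dataset larger than memory with Dask.
import dask.dataframe as dd

# Dask reads the CSVs lazily as many small partitions instead of one big frame.
df = dd.read_csv("data/transactions_*.csv", dtype={"amount": "float64"})

# Operations build a task graph; nothing is loaded into memory yet.
summary = (
    df[df["amount"] > 0]
    .groupby("customer_id")["amount"]
    .sum()
)

# compute() streams partitions from disk and aggregates the partial results.
print(summary.compute().head())
```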

Posted 5 hours ago

Apply

10.0 - 14.0 years

30 - 45 Lacs

Hyderabad

Work from Office


- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum of 10+ years of experience in data architecture, with 1-3 years of experience in the healthcare domain.
- Strong hands-on experience with cloud databases such as Snowflake, Aurora, Google BigQuery, etc.
- Experience designing OLAP and OLTP systems for efficient data analysis and processing.
- Strong hands-on experience with enterprise BI/reporting tools (Looker, AWS QuickSight, Power BI, Tableau, and Cognos).
- A strong understanding of HIPAA regulations and healthcare data privacy laws is a must-have for this role, as the healthcare domain requires strict adherence to data privacy and security regulations.
- Experience with data privacy and tokenization tools such as Immuta, Privacera, Privitar, OpenText, and Protegrity.
- Experience with multiple full life-cycle data warehouse/transformation implementations in the public cloud (AWS, Azure, and GCP), with deep technical knowledge in at least one.
- Proven experience working as an Enterprise Data Architect or in a similar role, preferably in large-scale organizations.
- Proficient in data modelling (star schema / de-normalized models and transactional / normalized models) using tools like Erwin.
- Experience with ETL/ELT architecture and integration (Matillion, AWS Glue, Google PLEX, Azure Data Factory, etc.).
- Deep understanding of data architectures that utilize Data Fabric, Data Mesh, and Data Products implementations.
- Business and financial acumen to advise on product planning, conduct research and analysis, and identify the business value of new and emerging technologies.
- Strong SQL and database skills working with large structured and unstructured data.
- Experienced in implementing data virtualization and semantic-model-driven architecture.
- System development lifecycle (SDLC), Agile development, DevSecOps, and standard software development tools such as Git and Jira.
- Excellent written and oral communication skills to convey key choices, recommendations, and technology concepts to technical and non-technical audiences.
- Familiarity with AI/MLOps concepts and Generative AI technology.
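As an illustration of the ETL/ELT integration item above, here is a minimal AWS Glue job sketch in Python. It only runs inside a Glue job environment, and the catalog database, table, column, and bucket names are hypothetical, not from the posting.

```python
# Hypothetical sketch: a minimal AWS Glue ETL job that reads a cataloged source,
# renames/casts a few fields, and writes Parquet to S3. Names are placeholders.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw claims table registered in the Glue Data Catalog.
claims = glue_context.create_dynamic_frame.from_catalog(
    database="healthcare_raw", table_name="claims"
)

# Light conformance step: rename and retype columns for the curated layer.
curated = ApplyMapping.apply(
    frame=claims,
    mappings=[
        ("claim_id", "string", "claim_id", "string"),
        ("member_id", "string", "member_id", "string"),
        ("paid_amt", "string", "paid_amount", "double"),
    ],
)

glue_context.write_dynamic_frame.from_options(
    frame=curated,
    connection_type="s3",
    connection_options={"path": "s3://curated-bucket/claims/"},
    format="parquet",
)
job.commit()
```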

Posted 8 hours ago

Apply

7.0 - 12.0 years

16 - 31 Lacs

Pune, Delhi / NCR, Mumbai (All Areas)

Hybrid


Job Title: Lead Data Engineer

Job Summary
The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop, and deploy business intelligence / data integration solutions to support a variety of customer needs. This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. Provides technical expertise in needs identification, data modelling, data movement, and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges. Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices. Responsible for repeatable, lean, and maintainable enterprise BI design across organizations. Effectively partners with the client team. Leadership not only in the conventional sense, but also within a team: we expect people to be leaders. Candidates should show leadership qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Responsibilities:
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. (a minimal orchestration sketch follows this description).
- Create functional and technical documentation, e.g., ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
- Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and conceptual, logical, and physical data models based on those needs.
- Perform data analysis to validate data models and confirm the ability to meet business needs.
- May serve as project or DI lead, overseeing multiple consultants from various competencies.
- Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration.
- Ensure proper execution/creation of methodology, training, templates, resource plans, and engagement review processes.
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data related at the project or business unit level.
- Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include but are not limited to SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
- Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications:
- 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc.
- 5-8 years of management experience required; 5-8 years of consulting experience preferred.
- Minimum of 5 years of data architecture, data modelling, or similar experience.
- Bachelor's degree or equivalent experience; Master's degree preferred.
- Strong data warehousing, OLTP systems, data integration, and SDLC.
- Strong experience in orchestration and working experience with cloud-native / third-party ETL data load orchestration (e.g., Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar).
- Understanding and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.).
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data.
- Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, GCP).
- Strong experience in Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA, or similar, with experience in CI/CD using one or more code management platforms.
- Strong Databricks experience required, including creating notebooks in PySpark.
- Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.).
- Experience with major database platforms (e.g., SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.).
- 3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience:
- Knowledge and working experience with data integration processes, such as data warehousing, EAI, etc.
- Experience providing estimates for data integration projects, including testing, documentation, and implementation.
- Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions.
- Ability to provide technical direction to other team members, including contractors and employees.
- Ability to contribute to conceptual data modelling sessions to accurately define business processes independently of data structures, and then combine the two together.
- Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results.
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM.
- Can create documentation and presentations such that they "stand on their own".
- Can advise sales on the evaluation of data integration efforts for new or existing client work, and can contribute to internal/external data integration proofs of concept.
- Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered.
- Ability to work independently on projects as well as collaborate effectively across teams.
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
- Strong team building, interpersonal, analytical, and problem identification and resolution skills.
- Experience working with multi-level business communities.
- Can effectively utilise SQL and/or the available BI tool to validate and elaborate business rules.
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues.
- Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data.
- Demonstrates a complete understanding of, and utilises, DSC methodology documents to efficiently complete assigned roles and associated tasks.
- Deals effectively with all team members and builds strong working relationships/rapport with them.
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution.
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytic standpoint.
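The orchestration sketch referenced in the responsibilities above: a minimal batch data-integration DAG, assuming Airflow 2.x; DAG id, schedule, and the three task functions are hypothetical placeholders.

```python
# Hypothetical sketch: a three-step extract/transform/load DAG in Apache Airflow.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")        # placeholder for source extraction

def transform():
    print("apply mappings")          # placeholder for source-to-target mapping

def load():
    print("load target warehouse")   # placeholder for the load step

with DAG(
    dag_id="nightly_customer_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies define the batch order the posting describes.
    t_extract >> t_transform >> t_load
```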

Posted 2 days ago

Apply

8.0 - 12.0 years

20 - 22 Lacs

Pune

Work from Office


Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.
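Since this posting (and the similar SageMaker postings below) lists training automation and deployment only briefly, here is a hedged sketch using the SageMaker Python SDK. The IAM role, S3 paths, training script, and instance types are placeholders, not details from the posting.

```python
# Hypothetical sketch: train and deploy a scikit-learn model with the SageMaker
# Python SDK. All identifiers below are placeholders.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder

estimator = SKLearn(
    entry_point="train.py",            # your training script
    role=role,
    instance_type="ml.m5.large",
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Training data previously staged in S3 (placeholder URI).
estimator.fit({"train": "s3://my-bucket/churn/train/"})

# Deploy the trained model behind a real-time endpoint, then clean up.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
print(predictor.predict([[0.2, 1.5, 3.1]]))
predictor.delete_endpoint()  # avoid idle endpoint cost
```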

Posted 2 days ago

Apply

8.0 - 12.0 years

20 - 22 Lacs

Bengaluru

Work from Office


Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


Position Purpose
Shared Data Ecosystem (SDE) is an ITG-FRS department hosting various applications related to the Filière Unique program, which is in charge of collecting Accounting and Risk data from local entities in a single stream. All the Accounting and Risk data is loaded into the SRS, a data warehouse storing all group information at a granular level. Accounting and Risk datamarts are fed by this data warehouse, and restitution (reporting) tools are plugged into these datamarts. Our goal is to deliver efficient access to SRS data, for both local and central users, covering multiple use cases with a coherent approach and data model. Enable the Filière (1,800 users, from entities to central teams) to contribute smoothly to the closing process with:
- Datamarts built up consistently to allow data exposition
- Consistent and user-friendly BI tools
- Industrial access to produce granular analyses and financial & regulatory reporting

As a business analyst, your main activities are to:
- Analyze business needs and write business/functional requirements
- Explain the needs/changes required in the application to technical teams
- Test the deliveries/results built by technical teams
- Build BO reports to fulfil the needs
- Help SRS users in their daily work on the SRS exposition layer
- Monitor production (quarterly closing), with the possibility of on-call periods

Responsibilities
Direct Responsibilities
The following deliverables are the main outputs of the scope defined above in terms of the BA's responsibility. During project mode, or according to other recurrent work, new deliverables can be defined. The main deliverables are:
- Produce functional requirements
- Write and execute test cases
- Participate in designing innovative solutions aligned with the bank's informational architecture
- Build new BO queries based on Finance or Risk team requirements
- Assist Finance in their daily production work
- Perform root-cause analysis of any production incidents/defects raised by users
The BA is expected to ensure proper support to users of the tool, as well as to provide high-quality work and deliverables in the execution of the job. Working knowledge of the Microsoft Office Suite (Excel, PowerPoint, Word) and SharePoint.
Skills: SQL (mandatory). Good to have: restitution tools (Business Objects, Power BI, and SSAS cubes), Business Intelligence (data modelling), and experience in Finance/Accounting processes as a Business Analyst.

Contributing Responsibilities
Contribute to overall FRS as directed by Team and Department Management.

Technical & Behavioral Competencies
- Ability to simplify complex information in a clearly organized and visually interesting manner
- Proactive behaviour and the ability to work in a fast-changing and demanding environment
- At ease with multi-tasking
- Strong analytical mind and problem-solving skills
- Ensure a high service level for all customers of the tool
- Maintain a high level of communication with customers and other teams
- Improve processes that deliver user value; a mindset of getting better all the time, with an ongoing effort to improve
- Demonstrate improvements in terms of efficiency, effectiveness, and flexibility
- Take pertinent proactive measures
- Be aligned with the BNP values: Agility, Compliance Culture, Openness, Client Satisfaction

Skills Referential
Behavioural Skills: Ability to collaborate / teamwork; critical thinking; communication skills (oral & written); client focused
Transversal Skills: Ability to understand, explain and support change; ability to develop and adapt a process; ability to manage / facilitate a meeting, seminar, committee, or training
Education Level: Bachelor's degree or equivalent
Experience Level: At least 5 years

Posted 3 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Pune

Work from Office


Although the role category specified in the GPP is Remote, the requirement is for Hybrid.

Job Summary:
Responsible for building high-quality, innovative, and fully performing software in compliance with coding standards and technical design. Design, modify, develop, write, and implement software programming applications. Support and/or install software applications. Key participant in the testing process through test review and analysis, test witnessing, and certification of software.

Key Responsibilities:
- Develop software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; and following the software development lifecycle.
- Document and demonstrate solutions; develop flow charts, layouts, and documentation.
- Determine feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions; understand business needs and know how to create the tools to manage them.
- Prepare and install solutions by determining and designing system specifications, standards, and programming.
- Recommend state-of-the-art development tools, programming techniques, and computing equipment; participate in educational opportunities; read professional publications; maintain personal networks; participate in professional organizations; remain passionate about great technologies, especially open source.
- Provide information by collecting, analyzing, and summarizing development and issues while protecting IT assets by keeping information confidential; improve applications by conducting systems analysis and recommending changes in policies and procedures.
- Define applications and their interfaces, allocate responsibilities to applications, understand solution deployment, communicate requirements for interactions with the solution context, and define Non-functional Requirements (NFRs).
- Understand multiple architectures and how to apply architecture to solutions; understand programming and testing standards; understand industry standards for traditional and agile development.
- Provide oversight and foster built-in quality and team and technical agility; adopt new mindsets and habits in how people approach their work while supporting decentralized decision making.
- Maintain strong relationships to deliver business value using relevant Business Relationship Management practices.

External Qualifications and Competencies
Competencies:
- Business insight: Applying knowledge of business and the marketplace to advance the organization's goals.
- Communicates effectively: Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Building strong customer relationships and delivering customer-centric solutions.
- Global perspective: Taking a broad view when approaching issues, using a global lens.
- Manages conflict: Handling conflict situations effectively, with a minimum of noise.
- Agile Architecture: Designs the fundamental organization of a system embodied by its components and their relationships to each other and to the environment, to guide its emergent design and evolution.
- Agile Development: Uses API-first development, where requirements and solutions evolve through the collaborative effort of self-organizing and cross-functional teams and their customers/end users, to construct high-quality, well-designed technical solutions; understands and includes the Internet of Things (IoT), the Digital Mesh, and hyper-connectivity as inputs to API-first development so solutions are more adaptable to future trends in agile development.
- Agile Systems Thinking: Embraces a holistic approach to analysis that focuses on the way a system's constituent parts interrelate and how systems work over time and within the context of larger systems, to ensure the economic success of the solution.
- Agile Testing: Leads a cross-functional agile team, with special expertise contributed by testers working at a sustainable pace, by delivering business value desired by the customer at frequent intervals to ensure the economic success of the solution.
- Regulatory Risk Compliance Management: Evaluates the design and effectiveness of controls against established industry frameworks and regulations to assess adherence to legal/regulatory requirements.
- Solution Functional Fit Analysis: Composes and decomposes a system into its component parts using procedures, tools, and work aids for the purpose of studying how well the component parts were designed, purchased, and configured to interact holistically to meet business, technical, security, governance, and compliance requirements.
- Solution Modeling: Creates, designs, and formulates models, diagrams, and documentation using industry standards, tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.
- Values differences: Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications:
College, university, or equivalent degree in Computer Science, Engineering, or a related subject, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience:
Experience working as a software engineer with the following knowledge and experience is preferred:
- Working in Agile environments
- Fundamental IT technical skill sets
- Taking a system from scoping requirements through actual launch of the system
- Communicating with users, other technical teams, and management to collect requirements, identify tasks, provide estimates, and meet production deadlines
- Professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations

Additional Responsibilities Unique to this Position
- Expertise in Oracle Data Integrator (ODI) 12c: design, develop, and implement ETL processes.
- Collaborate with cross-functional teams to gather and analyse business requirements for data integration solutions.
- Experience working with multiple source technologies such as Oracle, SQL Server, Excel files, delimited files, and REST APIs.
- Exposure to Unix shell and Python scripting.
- Proficiency with SQL and PL/SQL (packages, stored procedures, triggers); a minimal sketch of calling PL/SQL from a Python loader follows this description.
- Proficiency in data modelling concepts, terminology, and architecture; knowledge of dimensional modelling.
- Experience working with multiple source/target systems such as Oracle, XML files, JSON, flat files, and Excel documents.
- Expertise in ODI customization, migration between environments, and load plan generation.
- Develop ETL mappings, workflows, and packages using ODI to integrate data from multiple sources.
- Knowledge of Oracle Database and related technologies.
- Create and maintain comprehensive documentation related to ETL processes, data mappings, and transformations.
- Optimize and tune ETL processes for optimal performance and efficiency.
- Troubleshoot and resolve data integration issues and errors.
- Collaborate with stakeholders to ensure data quality and integrity.
- Participate in code reviews and provide constructive feedback to peers.
- Excellent problem-solving, communication, and collaboration skills.
- Understanding of and experience with DevOps/DevSecOps.
- Experience with other ETL tools and technologies is a plus; willingness to learn and expand into other integration products like MuleSoft would be a big plus.
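The PL/SQL sketch referenced above: a small, hedged example of a Python loader script calling a packaged stored procedure through the python-oracledb driver. The DSN, credentials, and procedure name are hypothetical, not from the posting.

```python
# Hypothetical sketch: invoke a PL/SQL stored procedure from a Python ETL script
# using python-oracledb. All connection details and names are placeholders.
import oracledb

conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Call a packaged procedure that merges a staging table into the target model;
# the second argument is an OUT parameter returning a status string.
status = cur.var(str)
cur.callproc("etl_pkg.merge_customer_dim", [20240601, status])
print("load status:", status.getvalue())

conn.commit()
conn.close()
```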

Posted 3 days ago

Apply

6.0 - 10.0 years

22 - 25 Lacs

Mumbai, Hyderabad

Work from Office


About the role
As a Data Warehouse Architect, you will be responsible for managing and enhancing a data warehouse that manages large volumes of customer life-cycle data flowing in from various applications, within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse (Vertica). In this role, you will manage a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouses will gradually be migrated to a Data Lake, enabling better analytical advantage. The role holder will also be responsible for guiding the team through this migration.

Key Responsibilities
- Data Pipeline Design: Responsible for designing and developing ETL data pipelines that help organize large volumes of data. Use data warehousing technologies to ensure that the data warehouse is efficient, scalable, and secure.
- Issue Management: Responsible for ensuring that the data warehouse runs smoothly. Monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance.
- Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
- Data Integration and Processing: Responsible for processing, cleaning, and integrating large data sets from various sources to ensure that the data is accurate, complete, and consistent.
- Data Modelling: Responsible for designing and implementing data modelling solutions to ensure that the organization's data is properly structured and organized for analysis.

Key Qualifications & Skills
- Educational Qualification: B.E./B.Tech. in Computer Science, Information Technology, or an equivalent domain, with 6 to 10 years of experience and at least 5 years of relevant work experience in data warehousing/mining/BI/MIS.
- Experience in Data Warehousing: Knowledge of ETL and data technologies and the ability to outline a future vision for OLTP and OLAP (Oracle / MS SQL). Data modelling, data analysis, and visualization experience (analytical tools such as Power BI / SAS / QlikView / Tableau, etc.). Good to have exposure to Azure cloud data platform services such as Cosmos, Azure Data Lake, Azure Synapse, and Azure Data Factory.
- Synergize with the Team: Regular interaction with business/product/functional teams to create mobility solutions.
- Certification: Azure certified DP-900, PL-300, DP-203, or any other data platform / data analyst certifications.
- Communication skills: Good oral and written communication skills.
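To make the Vertica pipeline responsibility above more concrete, here is a minimal sketch using the vertica-python driver: bulk-load a staged file and run a completeness check. The host, schema, table, and file paths are placeholders, not from the posting.

```python
# Hypothetical sketch: load a daily extract into Vertica and run a quick
# data-quality check with the vertica-python driver. Names are placeholders.
import vertica_python

conn_info = {
    "host": "vertica.internal", "port": 5433,
    "user": "dw_loader", "password": "***", "database": "edw",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # COPY is Vertica's native high-throughput bulk-load path.
    cur.execute(
        "COPY stage.customer_events FROM LOCAL '/data/events_20240601.csv' "
        "DELIMITER ',' ABORT ON ERROR"
    )
    # Simple completeness check before promoting the load downstream.
    cur.execute("SELECT COUNT(*) FROM stage.customer_events")
    print("rows loaded:", cur.fetchone()[0])
```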

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Mode of work: Hybrid (2 days WFO)
Mode of Interview: 2 rounds (Virtual, F2F)
Notice Period: Immediate to 15 days

We are looking for a highly skilled Senior Backend Developer with solid experience in developing and maintaining scalable backend systems using Go. You'll be part of a core engineering team building robust APIs and distributed services.

Key Responsibilities: Develop scalable and high-performance backend services. Write clean, efficient, and testable code. Optimize systems for latency, reliability, and cost. Collaborate closely with front-end engineers and product teams. Handle data modelling and database performance tuning.

Required Skills: Strong in Go. Solid understanding of RESTful API design, SQL (PostgreSQL, MySQL), and NoSQL (MongoDB, Redis).

Location: Bangalore, Hyderabad, Pune, Mumbai, Chennai, Ahmedabad

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 8 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Position: Senior Engineer Golang Experience: 3-5 years Location: Bangalore, Hyderabad, Pune, Mumbai, Chennai, Ahmedabad Mode of work: Hybrid (2 days WFO) Mode of Interview: 2 Rounds (Virtual, F2F) Notice Period: Immediate-15 days We are looking for a highly skilled Senior Backend Developer with solid experience in developing and maintaining scalable backend systems using Go. You'll be part of a core engineering team building robust APIs and distributed services. Key Responsibilities: Develop scalable and high-performance backend services. Write clean, efficient, and testable code. Optimize systems for latency, reliability, and cost. Collaborate closely with front-end engineers and product teams. Handle data modelling and database performance tuning. Required Skills: Strong in Go Solid understanding of: RESTful API design SQL (PostgreSQL, MySQL) NoSQL (MongoDB, Redis)

Posted 5 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Project description We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform. Responsibilities Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system. Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models. Ensure alignment of data models with Avaloq's object model and industry best practices. Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG). Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms). Provide expert input on data governance, metadata management, and model documentation. Contribute to change requests, upgrades, and data migration projects involving Avaloq. Collaborate with cross-functional teams to drive data consistency, reusability, and scalability. Review and validate existing data models, identify gaps or optimisation opportunities. Ensure data models meet performance, security, and privacy requirements. Skills Must have Proven experience (5+ years) in data modelling or data architecture, preferably within financial services. 3+ years of hands-on experience with Avaloq Core Banking Platform, especially its data structures and object model. Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar). Proficient in SQL and data manipulation in Avaloq environments. Knowledge of banking products, client lifecycle data, and regulatory data requirements. Familiarity with data governance, data quality, and master data management concepts. Experience working in Agile or hybrid project delivery environments. Nice to have Exposure to Avaloq Scripting or parameterisation is a strong plus. Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms. Understanding of data privacy regulations (GDPR, FINMA, etc.). Certification in Avaloq or relevant financial data management domains is advantageous. Other Languages English: C1 Advanced Location - Pune,Bangalore,Hyderabad,Chennai,Noida

Posted 5 days ago

Apply

8.0 - 13.0 years

12 - 18 Lacs

Hyderabad

Work from Office


Data Engineering Team
As a Lead Data Engineer for India, you will be accountable for leading the technical aspects of product engineering by being hands-on, working on the enhancement, maintenance, and support of the product on which your team is working, within your technology area. You will be responsible for your own hands-on coding, providing design thinking and design solutions, ensuring the quality of your team's output, representing your team in product-level technical forums, and ensuring your team provides technical input to and aligns with the overall product road-map.

How will you make an impact?
You will work with Engineers in other technology areas to define the overall technical direction for the product in alignment with the Group's technology roadmap, standards, and frameworks; with product owners and business stakeholders to shape the product's delivery roadmap; and with support teams to ensure its smooth operation. You will be accountable for the overall technical quality of the work produced by India, in line with the expectations of stakeholders, clients, and the Group. You will also be responsible for line management of your team of Engineers, ensuring that they perform to the expected levels and that their career development is fully supported.

Key responsibilities
- Produce quality code: code follows team standards, is structured to ensure readability and maintainability, and goes through review smoothly, even for complex changes; designs respect best practices and are favourably reviewed by peers; critical paths through code are covered by appropriate tests.
- Produce quality technical design: high-level designs/architectures align to the wider technical strategy, presenting reusable APIs where possible and minimizing system dependencies; technical designs follow team and group standards and frameworks, are structured to ensure reusability, extensibility, and maintainability, and go through review smoothly, even for complex changes; data updates are monitored and complete within SLA.
- Operate at a high level of productivity: estimates are consistently challenging but realistic; most tasks are delivered within estimate; complex or larger tasks are delivered autonomously; sprint goals are consistently achieved.
- Squad collaboration: demonstrate commitment to continuous improvement of squad activities; the product backlog is consistently well-groomed, with a responsible balance of new features and technical debt mitigation; other Engineers in the squad feel supported in their development.
- People management: direct reports have meaningful objectives recorded in Quantium's Performance Portal and understand how those objectives relate to business strategy; direct reports' career aspirations are understood and documented, with action plans in place to move towards those goals; direct reports have regular catch-ups to discuss performance, career development, and their ongoing happiness/engagement in their role; any performance issues are identified, documented, and agreed, with realistic remedial plans in place.

Key activities
- Build technical product/application engineering capability in the team that is in line with the Group's technical roadmap, standards, and frameworks.
- Write polished code, aligned to team standards, including appropriate unit/integration tests.
- Review code and test cases produced by others, to ensure changes satisfy the associated business requirement, follow best practices, and integrate with the existing code-base.
- Provide constructive feedback to other team members on the quality of code and test cases.
- Collaborate with other Lead/Senior Engineers to produce high-level designs for larger pieces of work.
- Validate technical designs and estimates produced by other team members.
- Merge reviewed code into release branches, resolving any conflicts that arise, and periodically deploy updates to production and non-production environments.
- Troubleshoot production problems and raise/prioritize bug tickets to resolve any issues.
- Proactively monitor system health and act to report/resolve any issues.
- Provide out-of-hours support for periodic ETL processes, ensuring SLAs are met.
- Work with business stakeholders and other leads to define and estimate new epics.
- Contribute to backlog refinement sessions, helping to break down each epic into a collection of smaller user stories that will deliver the overall feature.
- Work closely with Product Owners to ensure the product backlog is prioritized to maximize business value and manage technical debt.
- Lead work breakdown sessions to define the technical tasks required to implement each user story.
- Contribute to sprint planning sessions, ensuring the team takes a realistic but challenging amount of work into each sprint and each team member will be productively occupied.
- Contribute to the team's daily stand-up, highlighting any delays or impediments to progress and proposing mitigations for those issues.
- Contribute to sprint review and sprint retro sessions, to maintain a culture of continuous improvement within the team.
- Coach and mentor more junior Engineers to support their continuing development.
- Set and periodically review delivery and development objectives for direct reports.
- Identify each direct report's longer-term career objectives and, as far as possible, factor these into work assignments.
- Hold fortnightly catch-ups with direct reports to review progress against objectives, assess engagement, and give them the opportunity to raise concerns about the product or team.
- Work through the annual performance review process for all team members.
- Conduct technical interviews as necessary to recruit new Engineers.

The superpowers you'll be bringing to the team:
1. 8+ years of experience designing, developing, and implementing end-to-end data solutions (storage, integration, processing, access) in Google Cloud Platform (GCP) or similar cloud platforms.
2. Strong experience with SQL.
3. Values delivering high-quality, peer-reviewed, well-tested code.
4. Creates ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real time.
5. Knowledge of DevOps functions and the ability to contribute to CI/CD pipelines.
6. Strong knowledge of data warehousing and data modelling and techniques such as dimensional modelling.
7. Strong hands-on experience with BigQuery/Snowflake, Airflow/Argo, Dataflow, Data Catalog, Vertex AI, Pub/Sub, etc., or equivalent products in other cloud platforms (a small BigQuery sketch follows this description).
8. Solid grip on programming languages like Python or Scala.
9. Hands-on experience manipulating Spark at scale, with true in-depth knowledge of the Spark API.
10. Experience working with stakeholders, and mentoring experience for juniors in the team, is good to have.
11. Recognized as a go-to person for high-level designs and estimations.
12. Experience working with source control tools (Git preferred) with a good understanding of branching/merging strategies.
13. Experience in Kubernetes and Azure will be an advantage.
14. Understanding of GNU/Linux systems and Bash scripting.
15. Bachelor's degree in Computer Science, Information Technology, or a related discipline.
16. Comfortable working in a fast-moving, agile development environment.
17. Excellent problem-solving and analytical skills.
18. Good written and verbal communication skills.
19. Commercially aware, with the ability to work with a diverse range of stakeholders.
20. Enthusiasm for coaching and mentoring junior engineers.
21. Experience in leading teams, including line management responsibilities.
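The BigQuery sketch referenced in the requirements above: a small ELT step with the google-cloud-bigquery client, loading a Cloud Storage file into a staging table and building a derived table with SQL. The project, dataset, bucket, and table names are placeholders.

```python
# Hypothetical sketch: load raw data from GCS into BigQuery, then transform it
# into a daily aggregate. All identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="analytics-prod")

# Extract/load: stage the raw CSV into a BigQuery table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders_20240601.csv",
    "analytics-prod.staging.orders",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # wait for completion

# Transform in place: build a daily aggregate for the reporting layer.
query = """
    CREATE OR REPLACE TABLE `analytics-prod.marts.daily_orders` AS
    SELECT DATE(order_ts) AS order_date, SUM(amount) AS total_amount
    FROM `analytics-prod.staging.orders`
    GROUP BY order_date
"""
client.query(query).result()
```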

Posted 5 days ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Chennai, Bengaluru

Work from Office


Key Responsibilities
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain a secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills
- Experience: 6+ years in Data Engineering
- Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, Containerization (Docker), clean coding practices

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
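A minimal sketch of the PySpark/ADLS/Delta pattern the responsibilities above describe, intended to run on an Azure Databricks cluster. The storage account, container, and column names are placeholders, not from the posting.

```python
# Hypothetical sketch: read raw CSVs from ADLS Gen2, aggregate to a daily grain,
# and write a Delta table. All paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

raw_path = "abfss://raw@mydatalake.dfs.core.windows.net/sales/2024/06/"
curated_path = "abfss://curated@mydatalake.dfs.core.windows.net/sales_daily/"

# Extract: schema inference kept simple for illustration.
df = spark.read.option("header", True).csv(raw_path)

# Transform: type the columns and aggregate to a daily grain.
daily = (
    df.withColumn("amount", F.col("amount").cast("double"))
      .withColumn("order_date", F.to_date("order_ts"))
      .groupBy("order_date")
      .agg(F.sum("amount").alias("total_amount"))
)

# Load: Delta format keeps the lake transactional and easy to audit.
daily.write.format("delta").mode("overwrite").save(curated_path)
```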

Posted 5 days ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Senior Data Engineer (Remote, 6-Month Contract): Databricks, ADF, and PySpark

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions in Azure. Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults. Design and maintain a secure and efficient data lake architecture. Work with stakeholders to gather data requirements and translate them into technical specs. Implement CI/CD pipelines for seamless data deployment using Azure DevOps. Monitor data quality, performance bottlenecks, and scalability issues. Write clean, organized, reusable PySpark code in an Agile environment. Document pipelines, architectures, and best practices for reuse.

Must-Have Skills: Experience: 6+ years in Data Engineering. Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults. Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance. Agile, SDLC, Containerization (Docker), clean coding practices.

Good-to-Have Skills: Event Hubs, Logic Apps, Power BI, strong logic building and a competitive programming background.

Contract Details: Role: Senior Data Engineer. Mode: Remote. Duration: 6 months. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune

Posted 5 days ago

Apply

6.0 - 11.0 years

0 - 2 Lacs

Kochi, Bengaluru

Work from Office


We are looking for a Senior Developer with solid hands-on experience in Optimizely SaaS CMS and a strong front-end and back-end tech stack. If you're ready to work on impactful enterprise projects and drive next-gen digital experiences, we'd love to hear from you!

Role Details:
- Position: Senior Developer
- Experience: 6+ years
- Location: Kochi / Bangalore
- Work Mode: Hybrid
- Notice Period: Immediate joiners preferred
- Client and Budget: Will discuss
- Interview: 1st round virtual, 2nd round F2F/virtual

Key Must-Haves:
- Mandatory hands-on experience with Optimizely SaaS CMS
- Strong expertise in Next.js (SSR & SSG), React, TypeScript
- Proficient in Node.js and front-end performance optimization
- Experience with the Optimizely Suite: CMS, Commerce, CDP, DAM
- Skilled in .NET / C#, ASP.NET MVC, and RESTful API integrations
- Optimizely CDP: data modeling, segmentation, personalization

Why join us:
- Work on cutting-edge CMS & personalization solutions
- Hybrid flexibility: collaborate in dynamic tech hubs in Kochi/Bangalore
- High-growth role with a competitive package
- Exposure to enterprise-level digital transformation projects

Interested? Apply now or DM us directly. Know someone who fits? Tag them! anzia.sabreen@bct-consulting.com

Posted 5 days ago

Apply

8.0 - 12.0 years

20 - 22 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office


Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.

Posted 6 days ago

Apply

6.0 - 10.0 years

20 - 35 Lacs

Pune

Hybrid


Design, implement, and optimize ETL/ELT pipelines to ingest, transform, and load data into AWS Redshift from various sources. Strong background in Python scripting, AWS services (Lambda, S3, Redshift), and data integration and pipeline development.

Required Candidate Profile
- 6+ years of experience in BI development and data engineering
- Python/R scripting for data processing and automation
- AWS services: Lambda, S3, and Redshift
- Data warehousing
- Proficiency in SQL
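As an illustration of the Lambda/S3/Redshift ingestion pattern listed above, here is a hedged sketch of a Lambda handler that loads a newly landed S3 object into Redshift through the Redshift Data API. The cluster, database, secret ARN, IAM role, and table names are placeholders, not from the posting.

```python
# Hypothetical sketch: S3-triggered Lambda that issues a Redshift COPY via the
# Redshift Data API. All identifiers are placeholders.
import boto3

redshift = boto3.client("redshift-data")

def handler(event, context):
    # The S3 object key arrives in the event payload from an S3 trigger.
    record = event["Records"][0]["s3"]
    source = f"s3://{record['bucket']['name']}/{record['object']['key']}"

    copy_sql = (
        "COPY analytics.orders "
        f"FROM '{source}' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole' "
        "FORMAT AS CSV IGNOREHEADER 1"
    )

    resp = redshift.execute_statement(
        ClusterIdentifier="analytics-cluster",
        Database="prod",
        SecretArn="arn:aws:secretsmanager:region:123456789012:secret:redshift-creds",
        Sql=copy_sql,
    )
    return {"statement_id": resp["Id"]}
```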

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Power BI and AAS expert (Strong SC or Specialist Senior)
- Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services.
- Should be able to write and test DAX queries.
- Should be able to generate paginated reports in Power BI.
- Should have a minimum of 3 years' working experience in delivering projects in Power BI.

Must Have:
- 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation.
- Optimize and tune Databricks jobs for performance and scalability.
- Experience with Scala and/or Python programming languages.
- Proficiency in SQL for querying and managing data.
- Expertise in ETL (Extract, Transform, Load) processes.
- Knowledge of data modeling and data warehousing concepts.
- Implement best practices for data pipelines, including monitoring, logging, and error handling.
- Excellent problem-solving skills and attention to detail.
- Excellent written and verbal communication skills; strong analytical and problem-solving abilities.
- Experience with version control systems (e.g., Git) to manage and track changes to the codebase.
- Document technical designs, processes, and procedures related to Databricks development.
- Stay current with Databricks platform updates and recommend improvements to existing processes.

Good to Have:
- Agile delivery experience.
- Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP).
- Knowledge of Agile and Scrum software development methodologies.
- Understanding of data lake architectures.
- Familiarity with tools like Apache NiFi, Talend, or Informatica.
- Skills in designing and implementing data models.

Skills: Azure, data modelling, Power BI, AAS, Azure SQL Data Warehouse, Azure Analysis Services, DAX queries, data warehouse, paginated reports

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office


Power BI and AAS expert (Strong SC or Specialist Senior)
- Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services.
- Should be able to write and test DAX queries.
- Should be able to generate paginated reports in Power BI.
- Should have a minimum of 3 years' working experience in delivering projects in Power BI.

Must Have:
- 3 to 8 years of experience designing, developing, and deploying ETL processes on Databricks to support data integration and transformation.
- Optimize and tune Databricks jobs for performance and scalability.
- Experience with Scala and/or Python programming languages.
- Proficiency in SQL for querying and managing data.
- Expertise in ETL (Extract, Transform, Load) processes.
- Knowledge of data modeling and data warehousing concepts.
- Implement best practices for data pipelines, including monitoring, logging, and error handling.
- Excellent problem-solving skills and attention to detail.
- Excellent written and verbal communication skills; strong analytical and problem-solving abilities.
- Experience with version control systems (e.g., Git) to manage and track changes to the codebase.
- Document technical designs, processes, and procedures related to Databricks development.
- Stay current with Databricks platform updates and recommend improvements to existing processes.

Good to Have:
- Agile delivery experience.
- Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP).
- Knowledge of Agile and Scrum software development methodologies.
- Understanding of data lake architectures.
- Familiarity with tools like Apache NiFi, Talend, or Informatica.
- Skills in designing and implementing data models.

Skills: Azure, data modelling, Power BI, AAS, Azure SQL Data Warehouse, Azure Analysis Services, DAX queries, data warehouse, paginated reports

Posted 6 days ago

Apply

2.0 - 5.0 years

3 - 8 Lacs

Jaipur

Work from Office


Role Description
The role is to perform a number of key functions that support and control the business in complying with regulatory requirements such as the Markets in Financial Instruments Directive (MiFID II). This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities, and Foreign Exchange. Key responsibilities include day-to-day exception management, MIS compilation, and User Acceptance Testing (UAT). This role will also support in-house tech requirements such as building out reports, macros, etc.

Your key responsibilities
- Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions.
- Ensure accurate, timely, and complete reporting.
- Working closely with our technology development teams to design system solutions, with the aim of automating as much of the exceptions process as possible.
- Liaising with internal and external teams to propose developments to the current architecture in order to ensure greater compliance with regulatory requirements and drive improved STP processing of our reporting across all asset classes.
- Perform root-cause analysis of exceptions, with investigation and appropriate escalation of any significant issues found through testing, rejection remediation, or any other stream to senior management, to ensure transparency exists in our controls.
- Ability to build and maintain effective operational processes and prioritise activities based on risk.
- Clear communication and escalation; ability to recognize high-risk situations and deal with them in a prompt manner.
- Documentation of BI deliverables.
- Support the design of data models, reports, and visualizations to meet business needs; develop end-user reports and visualizations.

Your skills and experience
- 5-8 years of work experience within an Ops role within financial services.
- Graduate in Science/Technology/Engineering/Mathematics.
- Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England, etc.) is preferred.
- Preferably experience in Middle Office/Back Office and Reference Data, and excellent knowledge of the trade life cycle (at least 2 asset classes: Equities, Credit, Rates, Foreign Exchange, Commodities).
- Ability to work independently, as well as in a team environment.
- Clear and concise communication and escalation; ability to recognise high-risk situations and deal with them in a prompt manner.
- Ability to identify and prioritize multiple tasks that have potential operational risk and P/L impact in an often high-pressure environment.
- Experience in data analysis with intermediate/advanced Microsoft Office Suite skills, including VBA.
- Experience in building reports and BI analysis with tools such as SAP Business Objects, Tableau, QlikView, etc. Advanced SQL experience is preferred.

Posted 6 days ago

Apply

3.0 - 5.0 years

20 - 25 Lacs

Pune, Greater Noida

Work from Office

Naukri logo

Responsibilities:- Candidate should have strong experience in Duck Creek. Candidate should have strong experience in Policy. Candidate should have strong experience in Duck Creek Example Platform 6X & 7X. Good understanding of underwriting, rating, insurance rules, forms, Example Author, Server, Express, Forms, Rating, Batch Processing, Task Creation, Transact, Address Validation. Good knowledge of the policy life cycle and various policy transactions. Good knowledge of the Duck Creek Policy System and workflows. Experience in the P&C insurance domain. Good knowledge of manuscripts, the data model and the inheritance model. Good understanding of the business and functional requirements and the policy workflow of the total application and project. Understanding the client's requirements properly before starting development in the core areas of DCT. Must have excellent communication skills.

Mandatory Skills: .Net, Duck Creek Policy / PAS / Policy Center, Example, Author, Pages, Rating, Forms, Insurance-P&C

Education / Qualification: BE / B.Tech / BCA / B.Sc. / MCA / M.Tech / Any Graduate

Posted 1 week ago

Apply

12.0 - 15.0 years

40 - 45 Lacs

Hyderabad

Work from Office

Naukri logo

Role Description: The Data Strategy and Governance Lead will operationalize the Enterprise Data Council vision across specific domains (Research, Clinical Trials, Commercial, etc.). He/she will coordinate activities at the tactical level, interpreting Enterprise Data Council direction and defining operational-level impact, deliverables and actions to build data foundations in specific domains. The Data Strategy and Governance Lead will partner with senior leadership and other Data Governance functional leads to align data initiatives with business goals. He/she will establish and enforce data governance policies and standards to provide high-quality data that is easy to reuse and connect, accelerating innovative AI solutions to better serve patients.

Roles & Responsibilities: Responsible for data governance and data management for a given domain of expertise (Research, Development, Supply Chain, etc.). Manage a team of Data Governance Specialists and Data Stewards for a specific domain. Responsible for operationalizing the enterprise data governance framework and aligning the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication and change management. Works with Enterprise MDM and Reference Data to enforce standards and data reusability. Drives cross-functional alignment in his/her domain(s) of expertise to ensure adherence to data governance principles. Provides expert guidance on business process and system design to support data governance and data/information modelling objectives. Maintain documentation and act as an expert on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization etc. for assigned domains. Ensure compliance with data privacy, security, and regulatory policies for the assigned domains. Publish metrics to measure effectiveness and drive adoption of data governance policy and standards, applied to mitigate identified risks across the data lifecycle (e.g., capture/production, aggregation/processing, reporting/consumption). Establish enterprise-level standards on the nomenclature, content, and structure of information (structured and unstructured data), metadata, glossaries, and taxonomies. Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Architecture, Enterprise Data Fabric, etc.), define the specifications shaping the development and implementation of data foundations.

Functional Skills: Must-Have Skills: Technical skills with in-depth knowledge of pharma processes, with preferred specialization in a domain (e.g., Research, Clinical, Commercial, Supply Chain, Finance, etc.). Aware of industry trends and priorities and able to apply them to governance and policies. In-depth knowledge of and experience with data governance principles and technology; can design and implement data governance operating models to drive Amgen's transformation into a data-driven organization. In-depth knowledge of data management, common data models, metadata management, data quality, reference & master data management, data stewardship, data protection, etc. Experience with the data product development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy.

Good-to-Have Skills: Experience adopting industry standards in data products. Experience managing industry external data assets (e.g. Claims, EHR, etc.). Ability to successfully execute complex projects in a fast-paced environment and to manage multiple priorities effectively. Ability to manage project or departmental budgets. Experience with modelling tools (e.g., Visio). Basic programming skills; experience with data visualization and data modeling tools. Experience working with agile development methodologies such as Scaled Agile.

Soft Skills: Ability to build business relationships and understand end-to-end data use and needs. Excellent interpersonal skills (team player). People management skills, either in a matrix or direct-line function. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Good presentation and public speaking skills. Strong attention to detail, quality, time management and customer focus.

Basic Qualifications: 12 to 15 years of Information Systems experience. 4 years of managerial experience directly managing people, and leadership experience leading teams, projects, or programs.
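For the metrics-publishing responsibility described above, here is a minimal sketch of the kind of basic data quality metrics (completeness, uniqueness) a governance lead might track; the dataset, columns, and values are invented purely for illustration.

```python
# Hedged sketch of simple data quality metrics that could feed a governance dashboard.
# Dataset and column names are made up for illustration only.
import pandas as pd

patients = pd.DataFrame({
    "patient_id": [101, 102, 102, 104],
    "country":    ["US", "DE", "DE", None],
})

metrics = {
    "row_count": len(patients),
    "patient_id_uniqueness": patients["patient_id"].nunique() / len(patients),
    "country_completeness": patients["country"].notna().mean(),
}
print(metrics)  # e.g. published periodically to measure adoption and data quality
```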

Posted 1 week ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Bengaluru

Hybrid

Naukri logo

Skill required: Data Engineer - Azure. Designation: Sr Analyst / Consultant. Job Location: Bengaluru. Qualifications: BE/BTech. Years of Experience: 4 - 11 years.

OVERALL PURPOSE OF JOB: Understand client requirements and build ETL solutions using Azure Data Factory, Azure Databricks and PySpark. Build the solution in such a way that it can absorb client change requests easily. Find innovative ways to accomplish tasks and handle multiple projects simultaneously and independently. Work with data and appropriate teams to effectively source required data. Identify data gaps and work with client teams to effectively communicate the findings to stakeholders/clients.

Responsibilities: Develop ETL solutions to populate a centralized repository by integrating data from various data sources. Create data pipelines, data flows and data models according to the business requirements. Proficient in implementing all transformations according to business needs. Identify data gaps in the data lake and work with relevant data/client teams to get the data required for dashboarding/reporting. Strong experience working on the Azure data platform, Azure Data Factory and Azure Databricks. Strong experience working on ETL components and scripting languages like PySpark and Python. Experience in creating pipelines, alerts, email notifications, and scheduling jobs. Exposure to development/staging/production environments. Provide support in creating, monitoring and troubleshooting the scheduled jobs. Effectively work with clients and handle client interactions.

Skills Required: Bachelor's degree in Engineering or Science, or equivalent, with at least 4-11 years of overall experience in data management including data integration, modeling and optimization. Minimum 4 years of experience working on Azure cloud, Azure Data Factory and Azure Databricks. Minimum 3-4 years of experience in PySpark, Python, etc. for data ETL. In-depth understanding of data warehousing, ETL concepts and modeling principles. Strong ability to design, build and manage data. Strong understanding of data integration. Strong analytical and problem-solving skills. Strong communication and client interaction skills. Ability to design databases to store the large volumes of data necessary for reporting and dashboarding. Ability and willingness to acquire knowledge of new technologies; good analytical and interpersonal skills with the ability to interact with individuals at all levels.
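As an illustration of the data-gap identification this role describes, a short PySpark sketch follows; the table names (src.orders, lake.orders) and the order_id key are assumptions for the example, not part of the job description.

```python
# Minimal sketch of a gap check between a source extract and the centralized data lake.
# Table and column names are assumed purely for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gap_check").getOrCreate()

source = spark.table("src.orders").select("order_id")
lake = spark.table("lake.orders").select("order_id")

# Orders present in the source system but missing from the centralized repository
gaps = source.join(lake, on="order_id", how="left_anti")
gap_count = gaps.count()

if gap_count > 0:
    # In a real pipeline this could trigger an alert or email notification from the job
    print(f"{gap_count} orders missing from the lake; sample:")
    gaps.show(10, truncate=False)
```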

Posted 1 week ago

Apply

9.0 - 12.0 years

30 - 35 Lacs

Mumbai, Pune, Greater Noida

Work from Office

Naukri logo

Notice Period: Immediate to 15 days

Mandatory Skills: .Net, Duck Creek Policy / PAS / Policy Center, Example, Author, Pages, Rating, Forms, Insurance-P&C

Responsibilities:- Candidate should have strong experience in Duck Creek. Candidate should have strong experience in Policy. Candidate should have strong experience in Duck Creek Example Platform 6X & 7X. Good understanding of underwriting, rating, insurance rules, forms, Example Author, Server, Express, Forms, Rating, Batch Processing, Task Creation, Transact, Address Validation. Good knowledge of the policy life cycle and various policy transactions. Good knowledge of the Duck Creek Policy System and workflows. Experience in the P&C insurance domain. Good knowledge of manuscripts, the data model and the inheritance model. Good understanding of the business and functional requirements and the policy workflow of the total application and project. Understanding the client's requirements properly before starting development in the core areas of DCT. Must have excellent communication skills.

Education / Qualification: BE / B.Tech / BCA / B.Sc. / MCA / M.Tech / Any Graduate

Work Location: Greater Noida, Mumbai, Pune & Hyderabad

Posted 1 week ago

Apply