
1817 Data Architecture Jobs - Page 33

JobPe aggregates job listings for easy access; you apply directly on the original job portal.

0.0 - 3.0 years

2 - 6 Lacs

Chandigarh

Work from Office

We are looking for a highly skilled and experienced Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical skills and attention to detail.

Roles and Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain complex data analysis systems and reports.
- Provide expert-level support for data analysis and reporting needs.
- Identify trends and patterns in large datasets to inform business decisions (illustrated in the sketch below).
- Design and implement process improvements to increase efficiency and productivity.
- Develop and maintain technical documentation for data analysis systems.

Job Requirements:
- Strong understanding of data analysis principles and techniques.
- Proficiency in data visualization tools and programming languages.
- Excellent communication and problem-solving skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong attention to detail and organizational skills.
- Experience working with large datasets and developing complex reports.
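As a small illustration of the trend analysis this role describes, here is a minimal pandas sketch that aggregates a hypothetical transactions extract into a monthly series and flags sharp declines; the file and column names are invented for the example:

```python
import pandas as pd

# Hypothetical extract: one row per transaction, with a timestamp and an amount.
df = pd.read_csv("transactions.csv", parse_dates=["transaction_date"])

# Aggregate to a monthly revenue series.
monthly = df.set_index("transaction_date").resample("M")["amount"].sum()

# Flag months where revenue fell more than 10% versus the prior month.
declines = monthly.pct_change()
print(declines[declines < -0.10])
```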

Posted 1 month ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Bengaluru

Work from Office

We're Hiring: Data Engineer
Experience: 8+ Years
Location: Bangalore / Chennai / Gurugram
Company: Derisk360

Are you passionate about building scalable data systems and driving data quality across complex ecosystems? Join Derisk360 to work on advanced cloud and data engineering initiatives that power intelligent business decision-making.

What You'll Do:
- Work with a broad stack of AWS services: S3, AWS Glue, Glue Catalog, Lambda, Step Functions, EventBridge, and more.
- Develop and implement robust data quality checks using DQ libraries (see the sketch after this listing).
- Lead efforts in data modeling and manage relational and NoSQL databases.
- Build and automate ETL workflows using Informatica, Python, and Unix scripting.
- Apply DevOps and Agile methodologies, including CI/CD tools and code repositories.
- Engineer scalable big data solutions with Hadoop and Apache Spark.
- Design impactful dashboards using Tableau, Amazon QuickSight, and Microsoft Power BI.
- Work extensively on PostgreSQL and MongoDB databases.
- Integrate real-time data pipelines with StreamSets and Kafka.
- Drive data sourcing strategies, including real-time integration solutions.
- Spearhead cloud migration efforts to Snowflake and Azure Data Lake, including data transitions from on-premise environments.

What You Bring:
- 8+ years of hands-on experience in data engineering roles.
- Proficiency in AWS cloud services and modern ETL technologies.
- Solid programming experience in Python and Unix.
- Strong understanding of data architecture, quality frameworks, and reporting tools.
- Experience working in Agile environments and using version control/CI pipelines.
- Exposure to big data frameworks, real-time integration tools, and cloud data platforms.

What You'll Get:
- Competitive compensation.
- The chance to lead and contribute to mission-critical data engineering projects.
- Work in a high-performance team at the intersection of cloud, data, and AI.
- A continuous learning environment with access to cutting-edge technologies.
- A collaborative work culture backed by technical excellence.
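For a sense of what "data quality checks" can look like in practice, here is a minimal, library-agnostic PySpark sketch; the bucket, columns, and thresholds are assumptions for the example:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical orders table landed in S3 by an AWS Glue job.
df = spark.read.parquet("s3://example-bucket/landing/orders/")

total = df.count()
checks = {
    # Completeness: order_id must never be null.
    "order_id_not_null": df.filter(F.col("order_id").isNull()).count() == 0,
    # Uniqueness: order_id must be a key.
    "order_id_unique": df.select("order_id").distinct().count() == total,
    # Validity: amounts must be non-negative.
    "amount_non_negative": df.filter(F.col("amount") < 0).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In a real pipeline this would alert or quarantine the batch.
    raise ValueError(f"Data quality checks failed: {failed}")
```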

Posted 1 month ago

Apply

2.0 - 5.0 years

8 - 12 Lacs

Pune

Work from Office

The headlines
Job Title: Senior Data Consultant (Delivery)
Start Date: Mid-July 2025
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹2,300,000 - ₹3,800,000/annum

A bit about the role
We're looking for passionate Senior Data Consultants to join our Delivery team: a thriving and fast-growing community of some of the industry's best cloud data engineers, ranging from interns and graduates up to seasoned experts. In this role, you'll combine deep technical expertise with strategic leadership and client engagement, acting as a trusted advisor to senior stakeholders. You'll take ownership of solution architecture, project planning, and business development opportunities, driving the successful delivery of high-impact data solutions. You'll have the opportunity to lead and mentor teams, shape best practices, and contribute to internal initiatives, thought leadership, and go-to-market propositions. With a culture that values collaboration, innovation, and professional growth, this is the perfect opportunity for a data leader looking to make a real impact within an international, industry-leading consultancy.

What you'll be doing
- Leading the design and delivery of enterprise-scale data solutions, ensuring alignment with business objectives.
- Building and managing client relationships at a senior level, acting as a trusted advisor to stakeholders.
- Owning and driving solution architecture, contributing to proposal development and project planning.
- Managing and mentoring teams, ensuring high-quality project execution and professional growth of team members.
- Identifying new business opportunities by understanding client needs and proactively proposing solutions.
- Driving internal initiatives such as capability development, internal products, and best practice frameworks.
- Contributing to pre-sales efforts, presenting at client meetings, industry events, and marketing initiatives.
- Establishing thought leadership: writing blogs, publishing articles, and presenting at external events.

What you'll need to succeed
- Expertise in data warehousing, cloud analytics, and modern data architectures (Snowflake, Matillion, Databricks, or similar).
- Proven ability to engage and influence senior stakeholders, providing strategic guidance and technical leadership.
- Strong consulting and client management experience, with a track record of delivering high-impact data projects.
- Leadership and team management skills, with experience guiding multiple teams or large-scale projects.
- An ability to manage complex project priorities, balancing resources effectively and ensuring on-time delivery.
- A passion for innovation and continuous improvement, with the ability to identify and implement best practices.
- Strong communication and influencing skills, capable of managing high-tension situations and facilitating negotiations.

So, what's in it for you?
- The chance to work on cutting-edge cloud data projects for leading enterprise clients.
- An opportunity to shape strategy and drive business impact, with the autonomy to influence major decisions.
- A chance to lead and mentor talented consultants, fostering a culture of excellence and knowledge-sharing.
- Opportunities to contribute to thought leadership through blogging, speaking engagements, and industry networking.
- A fast-growing, dynamic company culture that values innovation, collaboration, and personal development.

About Snap Analytics
We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Databricks, Matillion, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision. At Snap, we're not just consultants; we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!

Posted 1 month ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Pune

Work from Office

Join Barclays as a Senior Data Engineer. At Barclays, we are building the bank of tomorrow. As a Strategy and Transformation Lead you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with:
- Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success.
- A strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs that process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake, and Databricks.
- The ability to develop data ingestion programs that ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the sketch after this listing).
- Hands-on programming experience in Python and PySpark.
- An understanding of DevOps pipelines using Jenkins and GitLab; strong data modelling and data architecture concepts; familiarity with project management tools and Agile methodology.
- Sound knowledge of data governance principles and tools (Alation/Glue Data Quality, mesh), and the ability to suggest solution architectures for diverse technology applications.

Additional relevant skills that are highly valued:
- Experience in the financial services industry, including Settlements and Sub-ledger functions such as PNS, Stock Record and Settlements, and PNL.
- Knowledge of the BPS, IMPACT, and Gloss products from Broadridge, and of creating ML models using Python, Spark, and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities
- Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Vice President Expectations
To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR for an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
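As a flavour of the real-time ingestion work the posting describes, here is a minimal Spark Structured Streaming sketch that reads from a hypothetical Kafka topic and lands the stream in S3; the broker, topic, and path names are assumptions for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("live-ingest").getOrCreate()

# Subscribe to a hypothetical Kafka topic of trade events.
events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "trade-events")
         .load()
)

# Kafka delivers key/value as binary; decode the value payload.
decoded = events.select(F.col("value").cast("string").alias("payload"))

# Write the stream to S3 with checkpointing for fault-tolerant file output.
query = (
    decoded.writeStream
           .format("parquet")
           .option("path", "s3://example-bucket/raw/trade-events/")
           .option("checkpointLocation", "s3://example-bucket/checkpoints/trade-events/")
           .start()
)
query.awaitTermination()
```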

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

This role involves the development and application of engineering practice and knowledge in defining, configuring and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing and supply chain, and for managing manufacturing data.

Job Description - Grade Specific
Focus on Digital Continuity and Manufacturing. Fully competent in own area. Acts as a key contributor in a more complex, critical environment. Proactively acts to understand and anticipate client needs. Manages costs and profitability for a work area. Manages own agenda to meet agreed targets. Develops plans for projects in own area. Looks beyond the immediate problem to the wider implications. Acts as a facilitator and coach, moving teams forward.

Posted 1 month ago

Apply

11.0 - 16.0 years

40 - 45 Lacs

Pune

Work from Office

Role Description
This role is for a Senior Business Functional Analyst for Group Architecture. The role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers, aligning the target data architecture with the enterprise data architecture principles and applying agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

Your key responsibilities
- Data Architecture: Work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks, and design the solution's target architecture.
- AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage and data quality. Embed AI-powered data quality, detection and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used.
- GCP Data Architecture & Migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification; a small BigQuery sketch follows this listing. Experience handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g. GDPR), and security and access control. Experience developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.
- Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, plus good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
- Data Management Tooling: Assess tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain.
- Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience
- Demonstrable experience in designing and deploying AI tooling architectures and use cases.
- Extensive experience in data architecture within Financial Services.
- Strong technical knowledge of data integration patterns, batch and stream processing, data lake / data lakehouse / data warehouse / data mart architectures, caching patterns, and policy-based fine-grained data access.
- Proven experience with data management principles, data governance, data quality, data lineage and data integration, with a focus on Data Mesh.
- Knowledge of data modelling concepts such as dimensional modelling and 3NF, and experience with systematic, structured review of data models to enforce conformance to standards.
- High-level understanding of data management solutions (e.g. Collibra, Informatica Data Governance).
- Proficiency in data modelling and experience with different data modelling tools.
- Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingestion.
- Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.
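To make the GCP side concrete, here is a minimal sketch using the google-cloud-bigquery Python client to run a labelled query job for data-quality profiling; the project, dataset, and column names are assumptions for the example:

```python
from google.cloud import bigquery

# The client picks up credentials from the environment
# (e.g. GOOGLE_APPLICATION_CREDENTIALS).
client = bigquery.Client(project="example-project")

# Label the job so cost and lineage tooling can attribute it.
job_config = bigquery.QueryJobConfig(
    labels={"domain": "finance", "purpose": "dq-profiling"}
)

sql = """
    SELECT COUNTIF(customer_id IS NULL) AS null_ids, COUNT(*) AS total
    FROM `example-project.finance.transactions`
"""

result = client.query(sql, job_config=job_config).result()
for row in result:
    print(f"{row.null_ids} null customer_ids out of {row.total} rows")
```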

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Pune

Remote

Design databases and data warehouses, build Power BI solutions, support enterprise business intelligence, contribute as a strong team player, and drive continuous improvement. Experience in SQL, Oracle, SSIS/SSRS, Azure, ADF, CI/CD, Power BI, DAX, and Microsoft Fabric. Required candidate profile: source system data structures, data extraction, data transformation, warehousing, DB administration, query development, and Power BI development. Work from home.

Posted 1 month ago

Apply

3.0 - 8.0 years

12 - 13 Lacs

Pune

Work from Office

Join us as a Data Records Governance Analyst at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with Data and Records Management Governance, Data Lineage, and Data Controls, as well as job-specific skillsets.

To be successful as a Data Records Governance Analyst, you should have experience with:

Basic/Essential Qualifications:
- Strategic vision and leadership.
- Data governance and quality management.
- Knowledge spanning data architecture, integration, analytics, Artificial Intelligence, or cloud computing.

Desirable skillsets/good to have:
- Data modelling.
- Knowledge of data architecture or experience working with Data Architects.
- Data sourcing and provisioning.
- Data analytics.
- Data privacy and security.

This role will be based out of Pune.

Purpose of the role
To develop, implement, and maintain effective governance frameworks for all data and records across the bank's global operations.

Accountabilities
- Development and maintenance of a comprehensive data and records governance framework aligned with regulatory requirements and industry standards.
- Monitoring of data quality and records metrics and compliance with standards across the organization.
- Identification and addressing of data and records management risks and gaps.
- Development and implementation of a records management programme that ensures the proper identification, classification, storage, retention, retrieval and disposal of records.
- Development and implementation of a data governance strategy that aligns with the bank's overall data management strategy and business objectives.
- Provision of Group-wide guidance and training on Data and Records Management standard requirements.

Analyst Expectations
To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. They will have an impact on the work of related teams within the area, partner with other functions and business areas, and take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex or sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

Posted 1 month ago

Apply

4.0 - 9.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Career Category: Information Systems

Job Description
Join Amgen's Mission of Serving Patients. At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems (a small sketch follows this listing).
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
- Identify and resolve complex data-related challenges.
- Adhere to best practices for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help improve ETL platform performance.
- Participate in sprint planning meetings and provide estimations on technical implementation.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT or related field experience; OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience; OR
- Diploma and 7 to 9 years of Computer Science, IT or related field experience.

Must-Have Skills:
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning of big data processing.
- Proficiency in data analysis tools (e.g. SQL); proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores.
- Experience with ETL tools such as Apache Spark, and with Python packages for data processing and machine learning model development.
- Strong understanding of data modeling, data warehousing, and data integration concepts.
- Proven ability to optimize query performance on big data platforms.

Preferred Qualifications:
- Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing.
- Knowledge of Python/R, Databricks, and cloud data platforms.
- Strong understanding of data governance frameworks, tools, and best practices.
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA).

Professional Certifications:
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
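As a flavour of the Spark work described, here is a minimal PySpark ETL sketch that ingests a raw CSV, applies a SparkSQL transformation, and writes a curated Parquet output; the paths and column names are invented for the example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a hypothetical raw extract.
raw = spark.read.option("header", True).csv("/data/raw/lab_results.csv")
raw.createOrReplaceTempView("lab_results")

# Transform: standardize types and filter invalid rows via SparkSQL.
curated = spark.sql("""
    SELECT
        patient_id,
        CAST(result_value AS DOUBLE) AS result_value,
        to_date(result_date, 'yyyy-MM-dd') AS result_date
    FROM lab_results
    WHERE result_value IS NOT NULL
""")

# Load: write partitioned Parquet for downstream analytics.
curated.write.mode("overwrite").partitionBy("result_date").parquet(
    "/data/curated/lab_results/"
)
```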

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 4 Lacs

Nagercoil

Work from Office

We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role will be responsible for the end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads.

Required Skills and Qualifications:
- 1+ years of experience in data migration, ETL, or database development roles.
- Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling).
- Proven experience using Salesforce Data Loader for bulk data operations (a scripted sketch follows this listing).
- Solid understanding of Salesforce CRM architecture, including object relationships and schema design.
- Strong background in data transformation and cleansing techniques.

Nice to Have:
- Experience with large-scale data migration projects involving CRM or ERP systems.
- Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts.
- Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus.
- Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus.

Key Responsibilities:
- Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL).
- Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation.
- Utilize Salesforce Data Loader and/or Apex Data Loader CLI to manage high-volume data imports and exports.
- Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail).
- Perform data cleansing, de-duplication, and transformation to ensure quality and consistency.
- Troubleshoot and resolve data-related issues, load failures, and anomalies.
- Collaborate with cross-functional teams to gather data mapping requirements and ensure accurate system integration.
- Ensure data integrity and adherence to compliance standards, and document migration processes and mappings.
- Independently analyze, troubleshoot, and resolve data-related issues effectively.
- Follow best practices for data security, performance tuning, and migration efficiency.
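The posting names the Salesforce Data Loader GUI, but bulk loads are often scripted as well; here is a minimal sketch using the simple-salesforce Python library's Bulk API wrapper, chosen purely for illustration (the credentials and records are placeholders):

```python
from simple_salesforce import Salesforce

# Placeholder credentials; a real migration would pull these from a secrets store.
sf = Salesforce(
    username="user@example.com",
    password="password",
    security_token="token",
)

# Records already transformed and cleansed upstream (e.g. from SQL Server via T-SQL).
contacts = [
    {"LastName": "Doe", "Email": "jane.doe@example.com"},
    {"LastName": "Roe", "Email": "richard.roe@example.com"},
]

# Bulk insert into the standard Contact object; results report per-record success.
results = sf.bulk.Contact.insert(contacts, batch_size=10000)
for res in results:
    if not res["success"]:
        print("Load failure:", res["errors"])
```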

Posted 1 month ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Advanced Data Analyst

Join the industry leader to design the next generation of breakthroughs. When you join Honeywell, you become a member of our global team of thinkers, innovators, dreamers, and doers who make the things that make the future. That means changing the way we fly, fueling jets in an eco-friendly way, keeping buildings smart and safe, and driving automation with software-embedded products. Working at Honeywell isn't just about developing cool things. That's why all our employees enjoy access to dynamic career opportunities across different fields and industries. We offer amazing opportunities for career growth with a world-class team of diverse experts. Are you ready to help us make the future?

Join a team that is elevating our strategy to drive advanced analytics and visualization tools across the Commercial enterprise. In this role, Advanced Data Analyst - CX, you will design, implement, and manage the data architecture, systems, and processes to effectively collect, store, process and analyze high-volume, high-dimensional data to provide strategic insight into complex business problems. This will involve creating and maintaining scalable, efficient, and secure data pipelines, data warehouses, and data lakes. You need to ensure consistency in data quality and availability for analysis and reporting, including compliance with data governance and security standards.

YOU MUST HAVE
- 6 or more years of relevant experience in Data Engineering, ETL Development and Visualization.
- Hands-on experience in Power BI development.
- Expertise in scripting and querying languages, such as Python and SQL.
- Experience with both structured and unstructured data.
- Experience in Snowflake.
- SFDC or SAP business and technical knowledge.
- Knowledge of Agile development methodology.
- Adaptability to changing business priorities and flexibility with work time management.

WE VALUE
- Predictive analysis and trend analysis with large data.
- Knowledge of databases, data warehouse platforms (Snowflake) and cloud-based tools.
- Experience in using data integration tools for ETL processes.
- Demonstrated experience in Adobe Analytics or Google Analytics.
- Ability to develop and communicate a technical vision for projects and initiatives that can be understood by stakeholders and management.
- Proven mentoring ability to drive results and technical growth in peers.
- Effective communication skills (verbal, written, and presentation) for interacting with customers and peers.
- Demonstrated application of statistics, statistical modeling, and statistical process control.

Duties and Responsibilities
- Work on complex data science and analytics projects in support of the Customer Experience organization.
- Work with the GDM owner to identify data requirements and design, maintain, and optimize data pipelines to ingest, transform, and load structured and unstructured data from various sources into the data warehouse or data lake.
- Design and implement data models to support analytical and reporting requirements.
- Develop, operate and maintain advanced Power BI reporting for visualization.
- Develop and maintain ETL (Extract, Transform, Load) processes.
- Develop and maintain complex SQL queries.
- Perform exploratory data analysis to solve complex business problems (a sketch follows this listing).
- Enforce data governance policies, standards, and best practices to maintain data quality, privacy, and security, and perform audits of the same.
- Create and maintain comprehensive documentation for data architecture, processes, and systems.
- Troubleshoot and resolve data-related problems and optimize system performance.
- Partner with the IT support team on production processes, continuous improvement, and production deployments.
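The exploratory data analysis part of the role can be illustrated with a short pandas sketch; the survey file and column names are invented for the example:

```python
import pandas as pd

# Hypothetical customer-experience survey extract.
df = pd.read_parquet("cx_survey.parquet")

# Quick structural profile: schema, missingness, and summary statistics.
print(df.dtypes)
print(df.isna().mean().sort_values(ascending=False).head())  # worst columns first
print(df.describe(include="all").T)

# Example drill-down: average NPS by region and channel.
nps = (
    df.groupby(["region", "channel"])["nps_score"]
      .agg(["mean", "count"])
      .sort_values("mean")
)
print(nps)
```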

Posted 1 month ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Data Analyst II

Join the industry leader to design the next generation of breakthroughs. When you join Honeywell, you become a member of our global team of thinkers, innovators, dreamers, and doers who make the things that make the future. That means changing the way we fly, fueling jets in an eco-friendly way, keeping buildings smart and safe, and driving automation with software-embedded products. Working at Honeywell isn't just about developing cool things. That's why all our employees enjoy access to dynamic career opportunities across different fields and industries. We offer amazing opportunities for career growth with a world-class team of diverse experts. Are you ready to help us make the future?

Join a team that is elevating our strategy to drive advanced analytics and visualization tools across the Commercial enterprise. In this role, Advanced Data Analyst - CX, you will design, implement, and manage the data architecture, systems, and processes to effectively collect, store, process and analyze high-volume, high-dimensional data to provide strategic insight into complex business problems. This will involve creating and maintaining scalable, efficient, and secure data pipelines, data warehouses, and data lakes. You need to ensure consistency in data quality and availability for analysis and reporting, including compliance with data governance and security standards.

YOU MUST HAVE
- 6 or more years of relevant experience in Data Engineering, ETL Development and Visualization.
- Hands-on experience in Power BI development.
- Expertise in scripting and querying languages, such as Python and SQL.
- Experience with both structured and unstructured data.
- Experience in Snowflake.
- SFDC or SAP business and technical knowledge.
- Knowledge of Agile development methodology.
- Adaptability to changing business priorities and flexibility with work time management.

WE VALUE
- Predictive analysis and trend analysis with large data.
- Knowledge of databases, data warehouse platforms (Snowflake) and cloud-based tools.
- Experience in using data integration tools for ETL processes.
- Demonstrated experience in Adobe Analytics or Google Analytics.
- Ability to develop and communicate a technical vision for projects and initiatives that can be understood by stakeholders and management.
- Proven mentoring ability to drive results and technical growth in peers.
- Effective communication skills (verbal, written, and presentation) for interacting with customers and peers.
- Demonstrated application of statistics, statistical modeling, and statistical process control.

Duties and Responsibilities
- Work on complex data science and analytics projects in support of the Customer Experience organization.
- Work with the GDM owner to identify data requirements and design, maintain, and optimize data pipelines to ingest, transform, and load structured and unstructured data from various sources into the data warehouse or data lake.
- Design and implement data models to support analytical and reporting requirements.
- Develop, operate and maintain advanced Power BI reporting for visualization.
- Develop and maintain ETL (Extract, Transform, Load) processes.
- Develop and maintain complex SQL queries.
- Perform exploratory data analysis to solve complex business problems.
- Enforce data governance policies, standards, and best practices to maintain data quality, privacy, and security, and perform audits of the same.
- Create and maintain comprehensive documentation for data architecture, processes, and systems.
- Troubleshoot and resolve data-related problems and optimize system performance.
- Partner with the IT support team on production processes, continuous improvement, and production deployments.

Posted 1 month ago

Apply

3.0 - 6.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Impetus Technologies is seeking a skilled Senior Engineer with expertise in Java and Big Data technologies. As a Senior Engineer, you will be responsible for designing, developing, and deploying scalable data processing applications using Java and Big Data frameworks. Your role will involve collaborating with cross-functional teams to gather requirements, developing high-quality code, and optimizing data processing workflows. You will also mentor junior engineers and contribute to architectural decisions to enhance the performance and scalability of our systems.

Key Responsibilities:
- Design, develop, and maintain high-performance applications using Java and Big Data technologies.
- Implement data ingestion and processing workflows utilizing frameworks like Hadoop and Spark (see the sketch after this listing).
- Collaborate with the data architecture team to define data models and ensure efficient data storage and retrieval.
- Optimize existing applications for performance, scalability, and reliability.
- Mentor and guide junior engineers, providing technical leadership and fostering a culture of continuous improvement.
- Participate in code reviews and ensure best practices for coding, testing, and documentation are followed.
- Stay current with technology trends in Java and Big Data, and evaluate new tools and methodologies to enhance system capabilities.

Skills and Tools Required:
- Strong proficiency in the Java programming language, with experience building complex applications.
- Hands-on experience with Big Data technologies such as Apache Hadoop, Apache Spark, and Apache Kafka.
- Understanding of distributed computing concepts and technologies.
- Experience with data processing frameworks and libraries, including MapReduce and Spark SQL.
- Familiarity with storage and database systems such as HDFS, NoSQL databases (like Cassandra or MongoDB), and SQL databases.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Knowledge of version control systems like Git, and familiarity with CI/CD pipelines.
- Excellent communication and teamwork skills to collaborate effectively with peers and stakeholders.
- A Bachelor's or Master's degree in Computer Science, Engineering, or a related field is preferred.

About the Role:
- You will be responsible for designing and developing scalable Java applications to handle Big Data processing.
- Your role will involve collaborating with cross-functional teams to implement innovative solutions that align with business objectives.
- You will also play a key role in ensuring code quality and performance through best practices and testing methodologies.

About the Team:
- You will work with a diverse team of skilled engineers, data scientists, and product managers who are passionate about technology and innovation.
- The team fosters a collaborative environment where knowledge sharing and continuous learning are encouraged.
- Regular brainstorming sessions and technical workshops will provide opportunities to enhance your skills and stay updated with industry trends.

You are responsible for:
- Developing and maintaining high-performance Java applications that process large volumes of data efficiently.
- Implementing data integration and processing frameworks using Big Data technologies such as Hadoop and Spark.
- Troubleshooting and optimizing existing systems to improve performance and scalability.

To succeed in this role, you should have the following:
- Strong proficiency in Java and experience with Big Data technologies and frameworks.
- Solid understanding of data structures, algorithms, and software design principles.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.
- Familiarity with cloud platforms and distributed computing concepts is a plus.
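The posting is Java-centric, but the ingest-and-aggregate pattern it names is easiest to show compactly in PySpark (the Java/Scala DataFrame API is analogous); the paths and columns here are invented for the example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ingest-and-aggregate").getOrCreate()

# Ingest: raw event logs from HDFS (path invented for the example).
events = spark.read.json("hdfs:///data/raw/events/")
events.createOrReplaceTempView("events")

# Process: a Spark SQL aggregation equivalent to a classic MapReduce count job.
daily_counts = spark.sql("""
    SELECT event_type, to_date(event_ts) AS day, COUNT(*) AS n
    FROM events
    GROUP BY event_type, to_date(event_ts)
""")

daily_counts.write.mode("overwrite").parquet("hdfs:///data/processed/daily_counts/")
```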

Posted 1 month ago

Apply

7.0 - 12.0 years

45 - 50 Lacs

Bengaluru

Work from Office

The Risk and Identity Solutions (RaIS) team provides risk management services for banks, merchants, and other payment networks. Machine learning and AI models are the heart of the real-time insights used by our clients to manage risk. Created by the Visa Predictive Models (VPM) team, continual improvement and efficient deployment of these models is essential for our future success. To support our rapidly growing suite of predictive models, we are looking for engineers who are passionate about managing large volumes of data, creating efficient, automated processes, and standardizing ML/AI tools.

Job Description
This is a great opportunity to work with a new Data Engineering and MLOps team to scale and structure large-scale data engineering and ML/AI that drives significant revenue for Visa. As a member of the Risk and Identity Solutions modeling organization (VPM), your role will involve developing and implementing practices that allow deployment of machine learning models in large data science projects. You must be a hands-on expert able to navigate both data engineering and data science disciplines to build effective engineering solutions that support ML/AI models. You will partner closely with global stakeholders in RaIS Product, VPM Data Science and Visa Research to help create and prioritize our strategic roadmap. You will then leverage your expert technical knowledge of data engineering, tools and data architecture in the design and creation of the solutions on our roadmap. The position is based at Visa's offices in Bangalore, India.

Qualifications
- 7+ years of work experience with a Bachelor's degree, or 6+ years of work experience with a Master's or advanced degree in an analytical field such as computer science, statistics, finance, or economics.
- Working knowledge of the Hadoop ecosystem and associated technologies (e.g. Apache Spark, Python, Pandas).

Technical skills:
- Strong experience in creating large-scale data engineering pipelines, data-based decision-making, and quantitative analysis.
- Experience with SQL for extracting, aggregating, and processing big data pipelines using Hadoop, EMR and NoSQL databases.
- Experience with complex, high-volume, multi-dimensional data, as well as machine learning models based on unstructured, structured, and streaming datasets.

Preferred skills:
- ETL processes: developing and executing large-scale ETL processes to support data quality, reporting, data marts, and predictive modeling.
- Spark pipelines: building and maintaining efficient and robust Spark pipelines to create and access data sets and feature stores for ML models; experience writing and optimizing Spark and Hive code to process large data sets in big data environments.
- Strong development experience in more than one of the following: Golang, Java, Python, Rust.
- Knowledge of the standard big data and real-time stack, such as Hadoop, Spark, Kafka, Redis, Flink and similar technologies.
- Hands-on experience building and maintaining data pipelines and feature engineering pipelines; comfortable with core ML concepts.
- Hands-on experience engineering, testing, validating and productizing AI/ML models for high-performance use cases.
- Exposure to model serving engines such as TensorFlow Serving and Triton.
- Exposure to model development frameworks such as MLflow (sketched below).
- Proficiency in managing and operating AWS services, including EC2, S3 and SageMaker, and in setting up and managing distributed data and computing environments using AWS services.
- Knowledge of DR/HA topologies and reliability engineering, with hands-on experience implementing the same.
- Knowledge of using and maintaining DevOps tools and implementing automation for production.
- Experience working with containerized and virtualized environments (Docker, Kubernetes).
- Experience with Unix/shell or Python scripting, and exposure to scheduling tools like Airflow and Control-M.
- Experience creating and supporting production software/systems, with a proven track record of identifying and resolving performance bottlenecks in production systems.
- Exposure to deploying large-scale ML/AI models built by data science teams; experience with model development is a strong plus.
- Exposure to public cloud equivalents and their ecosystems is a plus.
- Strong experience with visualization tools like Tableau and Power BI is a plus.
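As a small illustration of the MLOps tooling named above, here is a minimal MLflow tracking sketch; the model, parameters, and metric are a toy stand-in for a real risk-scoring workflow:

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Toy stand-in for a risk-scoring dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="risk-model-sketch"):
    params = {"n_estimators": 200, "max_depth": 8}
    mlflow.log_params(params)

    model = RandomForestClassifier(**params, random_state=0).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    # Metrics and the fitted model are versioned alongside the run.
    mlflow.log_metric("auc", auc)
    mlflow.sklearn.log_model(model, "model")
```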

Posted 1 month ago

Apply

4.0 - 8.0 years

11 - 15 Lacs

Pune

Work from Office

Join us as a Data Records Governance Lead at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with Data and Records Management Governance, Data Lineage, and Data Controls, as well as job-specific skillsets.

To be successful as a Data Records Governance Lead, you should have experience with:

Basic/Essential Qualifications:
- Strategic vision and leadership.
- Data governance and quality management.
- Knowledge spanning data architecture, integration, analytics, Artificial Intelligence, or cloud computing.

Desirable skillsets/good to have:
- Data modelling.
- Knowledge of data architecture or experience working with Data Architects.
- Data sourcing and provisioning.
- Data analytics.
- Data privacy and security.

This role will be based out of Pune.

Purpose of the role
To develop, implement, and maintain effective governance frameworks for all data and records across the bank's global operations.

Accountabilities
- Development and maintenance of a comprehensive data and records governance framework aligned with regulatory requirements and industry standards.
- Monitoring of data quality and records metrics and compliance with standards across the organization.
- Identification and addressing of data and records management risks and gaps.
- Development and implementation of a records management programme that ensures the proper identification, classification, storage, retention, retrieval and disposal of records.
- Development and implementation of a data governance strategy that aligns with the bank's overall data management strategy and business objectives.
- Provision of Group-wide guidance and training on Data and Records Management standard requirements.

Assistant Vice President Expectations
To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices in other areas, teams and companies, to solve problems creatively and effectively. Communicate complex information; complex information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

Posted 1 month ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

To design, build, and optimize scalable data pipelines and solutions using Azure Databricks and related technologies, enabling Zodiac Maritime to make faster, data-driven decisions as part of its data transformation journey. Proficiency in data integration techniques, ETL processes, and data pipeline architectures is required, along with being well versed in data quality rules, principles, and implementation.

Key Result Areas and Activities: Data Pipeline Development: Design and implement robust batch and streaming data pipelines using Azure Databricks and Spark. Data Architecture Implementation: Apply Medallion Architecture to structure data layers (raw, enriched, curated). Data Quality & Governance: Ensure data accuracy, consistency, and governance using tools like Azure Purview and Unity Catalog. Performance Optimization: Optimize Spark jobs, Delta Lake tables, and SQL queries for efficiency and cost-effectiveness. Collaboration & Delivery: Work closely with analysts, architects, and business teams to deliver end-to-end data solutions.

Technical Experience: Must Have: Hands-on experience with Azure Databricks, Delta Lake, Data Factory. Proficiency in Python, PySpark, and SQL with strong query optimization skills. Deep understanding of Lakehouse architecture and Medallion design patterns. Experience building scalable ETL/ELT pipelines and data transformations. Familiarity with Git, CI/CD pipelines, and Agile methodologies. Good To Have: Knowledge of data quality frameworks and monitoring practices. Experience with Power BI or other data visualization tools. Understanding of IoT data pipelines and streaming technologies like Kafka/Event Hubs. Awareness of emerging technologies such as Knowledge Graphs.

Qualifications: Education: Likely a degree in Computer Science, Data Engineering, Information Systems, or a related field. Experience: Proven hands-on experience with the Azure data stack (Databricks, Data Factory, Delta Lake). Experience in building scalable ETL/ELT pipelines. Familiarity with data governance and DevOps practices. Qualities: Strong problem-solving and analytical skills. Attention to detail and commitment to data quality. Collaborative mindset and effective communication. Proactive and self-driven. Passion for learning and staying updated with emerging data technologies.
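For context on the Medallion pattern this role centres on, below is a minimal PySpark/Delta Lake sketch of a raw-to-enriched (bronze-to-silver) hop; the mount paths and column names are placeholders, and on Databricks the Spark session is provided for you.

```python
# Minimal sketch of one Medallion-style hop (raw/bronze -> enriched/silver)
# with Delta Lake. Paths and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created on Databricks

# Bronze: land raw files as-is, keeping an ingestion timestamp for lineage.
bronze = (spark.read.format("json").load("/mnt/raw/voyages/")
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").save("/mnt/bronze/voyages")

# Silver: apply basic data quality rules before exposing data to consumers.
silver = (spark.read.format("delta").load("/mnt/bronze/voyages")
          .dropDuplicates(["voyage_id"])
          .filter(F.col("voyage_id").isNotNull()))
silver.write.format("delta").mode("overwrite").save("/mnt/silver/voyages")
```

The design point of the pattern is that each layer is independently queryable and re-buildable: bronze preserves the raw record for audit, while silver carries the quality guarantees downstream consumers rely on.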

Posted 1 month ago

Apply

1.0 - 6.0 years

5 - 6 Lacs

Nagercoil

Work from Office

Job Summary: We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role will be responsible for the end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads.

Required Skills and Qualifications: 1+ years of experience in data migration, ETL, or database development roles. Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling). Proven experience using Salesforce Data Loader for bulk data operations. Solid understanding of Salesforce CRM architecture, including object relationships and schema design. Strong background in data transformation and cleansing techniques.

Nice to Have: Experience with large-scale data migration projects involving CRM or ERP systems. Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts. Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus. Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus.

Key Responsibilities: Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL). Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation. Utilize Salesforce Data Loader and/or the Apex Data Loader CLI to manage high-volume data imports and exports. Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail). Perform data cleansing, de-duplication, and transformation to ensure quality and consistency. Troubleshoot and resolve data-related issues, load failures, and anomalies. Collaborate with cross-functional teams to gather data mapping requirements and ensure accurate system integration. Ensure data integrity, adherence to compliance standards, and documentation of migration processes and mappings. Independently analyze, troubleshoot, and resolve data-related issues effectively. Follow best practices for data security, performance tuning, and migration efficiency.
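To make the extract-cleanse-load flow concrete, here is a hedged Python sketch that profiles and de-duplicates SQL Server data before handing a CSV to Salesforce Data Loader; the connection string, table, and column names are hypothetical.

```python
# Sketch only: extract accounts from SQL Server, cleanse and de-duplicate,
# then write a CSV suitable for bulk load via Salesforce Data Loader.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=legacy-db;"
    "DATABASE=crm;Trusted_Connection=yes;"  # placeholder connection details
)

# Profile first: row counts and duplicate keys feed the mapping document.
profile = pd.read_sql(
    "SELECT COUNT(*) AS total_rows, COUNT(DISTINCT email) AS distinct_emails "
    "FROM dbo.accounts", conn)
print(profile)

accounts = pd.read_sql("SELECT account_name, email, phone FROM dbo.accounts", conn)
accounts["email"] = accounts["email"].str.strip().str.lower()   # cleansing
deduped = accounts.drop_duplicates(subset="email", keep="first")
deduped.to_csv("accounts_for_data_loader.csv", index=False)     # Data Loader input
```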

Posted 1 month ago

Apply

9.0 - 12.0 years

1 - 2 Lacs

Hyderabad

Remote

Job Title: Data Architect. Location: Remote. Employment Type: Full-Time. Reports to: Lead Data Strategist.

About Client / Project: The client is a specialist data strategy and AI consultancy that empowers businesses to unlock tangible value from their data assets. We specialize in developing comprehensive data strategies tailored to address core business and operational challenges. By combining strategic advisory with hands-on implementation, we ensure data becomes a true driver of business growth, operational efficiency, and competitive advantage for our clients. As a solutions-focused and forward-thinking consultancy, we help organizations transform their data capabilities using modern technology, reduce costs, and accelerate business growth by aligning every initiative directly with our clients' core business objectives.

Role Overview: We are seeking a highly experienced Data Architect to lead the design and implementation of scalable data architectures for global clients across industries. You will define enterprise-grade data platforms leveraging cloud-native technologies and modern data frameworks.

Key Responsibilities: Design and implement cloud-based data architectures (GCP, AWS, Azure, Snowflake, Redshift, Databricks, or Hadoop). Develop conceptual, logical, and physical data models. Define data flows, ETL/ELT pipelines, and ingestion strategies. Design and maintain data catalogs, metadata, and domain structures. Establish data architecture standards, reference models, and blueprints. Oversee data lineage, traceability, and audit readiness. Guide integration of AI/ML pipelines and analytics solutions. Ensure data privacy, protection, and compliance (e.g., GDPR, HIPAA). Collaborate closely with Engineers, Analysts, and Strategists.

Required Skills & Qualifications: 8+ years of experience in data architecture or enterprise data platform roles. Deep experience with at least two major cloud platforms (AWS, Azure, GCP). Proven hands-on work with modern data platforms: Snowflake, Databricks, Redshift, Hadoop. Strong understanding of data warehousing, data lakes, and lakehouse architecture. Advanced proficiency in SQL, Python, Spark, and/or Scala. Experience with data cataloging and metadata tools (e.g., Informatica, Collibra, Alation). Knowledge of data governance frameworks and regulatory compliance. Strong documentation, stakeholder communication, and architectural planning skills. Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
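As one illustration of the lineage and audit-readiness responsibilities listed above, the sketch below models a lightweight lineage record in plain Python; the field names are assumptions made for the example and do not reflect any particular catalog vendor's schema.

```python
# Illustrative only: a minimal lineage record of the kind an architect might
# standardize before adopting a catalog tool (Collibra, Alation, etc.).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    source: str           # upstream dataset, e.g. an object-store path
    target: str           # downstream dataset, e.g. a warehouse table
    transformation: str   # short description of the ETL/ELT step applied
    owner: str            # accountable team or mailbox
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

record = LineageRecord(
    source="s3://raw/orders/2024/",
    target="analytics.fct_orders",
    transformation="dedupe on order_id; cast amounts to DECIMAL(18,2)",
    owner="data-platform@example.com",
)
print(record)
```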

Posted 1 month ago

Apply

8.0 - 12.0 years

18 - 27 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Job Description: Design, implement, and maintain data pipelines and data integration solutions using Azure Synapse. Develop and optimize data models and data storage solutions on Azure. Collaborate with data scientists and analysts to implement data processing and data transformation tasks. Ensure data quality and integrity through data validation and cleansing methodologies. Monitor and troubleshoot data pipelines to identify and resolve performance issues. Collaborate with cross-functional teams to understand and prioritize data requirements. Stay up to date with the latest trends and technologies in data engineering and Azure services.

Skills & Qualifications: Bachelor's degree in IT, computer science, computer engineering, or similar. 8+ years of experience in Data Engineering. Microsoft Azure Synapse Analytics experience is essential (Azure Data Factory, Dedicated SQL Pool, Lake Database, Azure Storage). Hands-on experience in Spark notebooks (Python or Scala) is mandatory. End-to-end Data Warehouse experience: ingestion, ETL, big data pipelines, data architecture, message queuing, BI/reporting, and data security. Advanced SQL/relational database knowledge and query authoring. Demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehouse. Strong skills in handling and analysing complex, high-volume data with excellent attention to detail. Knowledge of data modelling and data warehousing concepts, such as Data Vault or 3NF. Experience with Data Governance (quality, lineage, data dictionary, and security). Familiarity with Agile methodology and an Agile working environment. Ability to work independently with POs, BAs, and Architects.
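The validation-and-cleansing responsibility above might look something like the following in a Synapse Spark notebook: a minimal PySpark sketch, assuming Delta-format lake paths and invented column names, that quarantines failing rows rather than silently dropping them.

```python
# Sketch of a validation step for a Spark notebook; paths/columns are made up.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.format("delta").load("/lake/curated/orders")

# Integrity rules; the explicit isNotNull() on amount keeps the whole
# expression from evaluating to NULL, so every row lands in exactly one bucket.
rules = (F.col("order_id").isNotNull()
         & F.col("order_date").isNotNull()
         & F.col("amount").isNotNull()
         & (F.col("amount") >= 0))

valid = orders.filter(rules)
quarantine = orders.filter(~rules)  # kept visible for root-cause analysis

valid.write.format("delta").mode("overwrite").save("/lake/validated/orders")
quarantine.write.format("delta").mode("append").save("/lake/quarantine/orders")
print(f"valid={valid.count()}, quarantined={quarantine.count()}")
```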

Posted 1 month ago

Apply

6.0 - 8.0 years

30 - 32 Lacs

Bengaluru

Work from Office

We are seeking an experienced ER Modeling Expert / Data Modeler to design, develop, and maintain conceptual, logical, and physical data models for enterprise applications and data warehouses. The ideal candidate should have a deep understanding of relational databases, normalization, data governance, and schema design while ensuring data integrity and scalability. Key Responsibilities: Design and develop Entity-Relationship (ER) models for databases, data warehouses, and data lakes. Create conceptual, logical, and physical data models using tools like Erwin, Visio, Lucidchart, or PowerDesigner. Define primary keys, foreign keys, relationships, cardinality, and constraints for optimal data integrity. Work closely with DBAs, data architects, and software developers to implement data models. Optimize database performance, indexing, and query tuning for relational databases. Define and enforce data governance, data quality, and master data management (MDM) standards. Develop and maintain metadata repositories, data dictionaries, and schema documentation. Ensure compliance with data security and privacy regulations (GDPR, HIPAA, etc.). Support ETL/ELT pipeline design to ensure smooth data flow between systems. Work with big data platforms (Snowflake, Databricks, Redshift, BigQuery, or Synapse) to support modern data architectures. Required Skills & Qualifications: 6+ years of experience in data modeling, database design, and ER modeling. Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL, MySQL, etc.). Hands-on experience with data modeling tools (Erwin, PowerDesigner, DB Designer, Visio, or Lucidchart). Proficiency in SQL, indexing strategies, query performance tuning, and stored procedures. Deep understanding of normalization, denormalization, star schema, and snowflake schema. Experience with data governance, data quality, and metadata management. Strong knowledge of ETL processes, data pipelines, and data warehousing concepts. Familiarity with NoSQL databases (MongoDB, Cassandra, DynamoDB) and their modeling approaches. Ability to collaborate with cross-functional teams including data engineers, architects, and business analysts. Strong documentation and communication skills. Preferred Qualifications: Certifications in Data Management, Data Architecture, or Cloud Databases. Experience with cloud-based databases (AWS RDS, Azure SQL, Google Cloud Spanner, Snowflake). Knowledge of Graph Databases (Neo4j, Amazon Neptune) and hierarchical modeling.
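To ground the primary-key, foreign-key, and star-schema vocabulary above, here is a toy example. It uses SQLite purely so the snippet is self-contained and runnable; real DDL would target SQL Server, Oracle, or PostgreSQL, and every table and column name here is invented.

```python
# Toy star schema illustrating PK/FK constraints and grain, via SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires FK enforcement opt-in
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,       -- surrogate key, e.g. 20240131
    calendar_date TEXT NOT NULL UNIQUE
);
CREATE TABLE fact_sales (                    -- grain: one row per order line
    sales_id     INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount       NUMERIC NOT NULL CHECK (amount >= 0)
);
CREATE INDEX ix_fact_sales_customer ON fact_sales(customer_key);
""")
print("star schema created")
```

The fact table carries only keys and measures; descriptive attributes live in the dimensions, which is what keeps the schema both queryable and easy to extend.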

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Do you want to help solve the world's most pressing challenges? Feeding the world's growing population and slowing climate change are two of the world's greatest challenges, and AGCO is a part of the solution! Join us to make your contribution. AGCO is looking to hire candidates for the position of Senior Manager, AI & Data Systems Architecture. We are seeking an experienced and innovative Senior Manager, AI & Data Systems Architecture to lead the design, creation, and evolution of system architectures for AI, analytics, and data systems within our organization. The ideal candidate will have extensive experience delivering scalable, high-performance data and AI architectures across cloud platforms such as AWS, Google Cloud Platform, and Databricks, with a proven ability to align technology solutions with business goals. This individual will collaborate with cross-functional teams, including data engineers, data scientists, and other IT professionals, to create architectures that support cutting-edge AI and data initiatives, driving efficiency, scalability, and innovation.

Your Impact: Architecture Leadership: Lead the end-to-end architecture for AI and data systems, ensuring cost-effective scalability, performance, and security across cloud and on-premises environments; the goal is to build and support a modern data stack. AI & Data Systems: Design, implement, and manage data infrastructure and AI platforms, including but not limited to AWS, Azure, Google Cloud Platform, Databricks, and other key data tools; lead the data model approach for all data products and solutions. Cloud Expertise: Champion cloud adoption strategies, optimizing data pipelines, analytics workloads, and AI/ML model deployment, endpoint creation, and app integration. System Evolution: Drive the continuous improvement and evolution of data and AI architectures to meet emerging business needs, technological advancements, and industry trends. Collaboration & Leadership: Work closely with delivery teams, data engineers, data scientists, software engineers, and IT operations to implement comprehensive data architectures that support AI and analytics initiatives focused on continuous improvement. Strategic Vision: Partner with business and technology stakeholders to understand long-term goals, translating them into architectural frameworks and roadmaps that drive business value. Governance & Best Practices: Ensure best practices in data governance, security, and compliance, overseeing the implementation of standards across AI and data systems. Performance Optimization: Identify opportunities to optimize performance, cost-efficiency, and operational effectiveness of AI and data systems, including ETL, ELT, and data pipeline creation and evolution, and optimization of AI resource models.

Functional Knowledge: Experience: 10+ years of experience in data architecture, AI systems, or cloud infrastructure, with at least 3-5 years in a leadership role, and proven experience driving solutions from ideation to delivery and support. Cloud Expertise: Deep hands-on experience with cloud platforms like AWS, Google Cloud Platform (GCP), and Databricks; familiarity with other data and AI platforms is a plus. CRM Expertise: Hands-on experience with key CRM systems like Salesforce and the AI systems inside those solutions (e.g., Einstein). AI & Analytics Systems: Proven experience designing architectures for AI, machine learning, analytics, and large-scale data processing systems. Technical Knowledge: Expertise in data architecture, including data lakes, data warehouses, real-time data streaming, and batch processing frameworks. Cross-Platform Knowledge: Solid understanding of containerization (Docker, Kubernetes), infrastructure as code (Terraform, CloudFormation), and big data ecosystems (Spark, Hadoop). Experience in applying Agile methodologies, including Scrum, Kanban, or SAFe. Experience with top reporting solutions, preferably Tableau, which is one of our cornerstone reporting solutions. Leadership: Strong leadership and communication skills, with the ability to drive architecture initiatives in a collaborative and fast-paced environment; excellent problem-solving skills and a proactive mindset. Education: Bachelor's degree in Computer Science, Data Science, or a related field; a Master's degree or relevant certifications (e.g., AWS Certified Solutions Architect) is preferred.

Business Expertise: Experience in industries such as manufacturing, agriculture, or supply chain, particularly in AI and data use cases. Familiarity with regulatory requirements related to data governance and security. Experience with emerging technologies like edge computing, IoT, and AI/ML automation tools.

Your Experience and Qualifications: Excellent communication and interpersonal skills, capable of interacting with multiple levels of IT and business management/leadership. Hands-on experience with SAP HANA, SAP Data Services, or similar data storage, warehousing, and/or ETL solutions. 10+ years of progressive IT experience. Experience creating data models, querying data, and mapping business and technical processes. Successfully influences diverse groups and teams in a complex, ambiguous, and rapidly changing environment to deliver value-added solutions. Builds effective working relationships with the business to ensure business requirements are accurately captured, agreed, and accepted. Adaptable to new technologies/practices and acts as a change agent within teams.

Your Benefits: GLOBAL DIVERSITY: Diversity means many things to us, different brands, cultures, nationalities, genders, generations, even variety in our roles. You make us unique! ENTERPRISING SPIRIT: Every role adds value. We're committed to helping you develop and grow to realize your potential. POSITIVE IMPACT: Make it personal and help us feed the world. INNOVATIVE TECHNOLOGIES: You can combine your love for technology with manufacturing excellence and work alongside teams of people worldwide who share your enthusiasm. MAKE THE MOST OF YOU: Benefits include health care and wellness plans and flexible and virtual work options.

Your Workplace: AGCO is Great Place to Work Certified and has been recognized for delivering an exceptional employee experience and a positive workplace culture. We value inclusion and recognize the innovation a diverse workforce delivers to our farmers. Through our recruitment efforts, we are committed to building a team that includes a variety of experiences, backgrounds, cultures, and perspectives. Join us as we bring agriculture into the future and apply now! Please note that this job posting is not designed to cover or contain a comprehensive listing of all required activities, duties, responsibilities, or benefits and may change at any time with or without notice. AGCO is proud to be an Equal Opportunity Employer.

Posted 1 month ago

Apply

1.0 - 3.0 years

16 - 19 Lacs

Bengaluru

Work from Office

About the Position: Chevron invites applications for the role of Cloud Engineer, Data Hosting within our team in India. This position supports Chevron's data hosting environment by delivering modern digital data hosting capabilities in a cost-competitive, reliable, and secure manner. This position will provide broad exposure to the application of technology to enable business, with many opportunities for growth and professional development for the candidate.

Key Responsibilities: Design, implement, and manage scalable and secure data hosting solutions on Azure. Develop and maintain data architectures, including data models, data warehouses, and data lakes. Refine data storage and extraction procedures to enhance performance and cost-effectiveness. Uphold stringent data security measures and ensure adherence to relevant industry standards and regulatory requirements. Collaborate with data scientists, analysts, and other stakeholders to understand and address their data needs. Monitor and troubleshoot data hosting environments to ensure high availability and reliability. Streamline data workflows and operations through the automation capabilities of Azure Data Factory and comparable technologies. Design, develop, and deploy modular cloud-based systems. Develop and maintain cloud solutions in accordance with best practices.

Required Qualifications: Must have a bachelor's degree in computer science, engineering, or a related discipline. 0-5 years' experience, with at least 2 years of experience in data hosting for both on-premises and Azure environments. Microsoft AZ-900 certification. Proficient in utilizing Azure data services, including Azure SQL Database, Azure Data Lake Storage, and Azure Data Factory. In-depth understanding of cloud infrastructure, encompassing virtual networks, storage solutions, and compute resources within Azure. Extensive hands-on experience with Azure services such as Azure SQL Database, Azure Blob Storage, Azure Data Lake, and Azure Synapse Analytics. Well versed in on-premises storage systems from vendors like NetApp, Dell, and others. Skilled proficiency in scripting languages like Ansible, PowerShell, Python, and Azure CLI for automation and management tasks. Comprehensive knowledge of Azure security best practices, including identity and access management, encryption, and compliance standards.

Preferred Qualifications: Demonstrated proficiency in architecting, deploying, and managing secure and scalable data hosting solutions on the Azure platform. Extensive experience in developing and maintaining robust data architectures, including data models, data warehouses, and data lakes, utilizing Azure services. Expertise in optimizing data storage and retrieval processes for superior performance and cost efficiency within Azure environments. In-depth knowledge of data security protocols and compliance with industry standards and regulations, with a focus on Azure cloud compliance. Proven ability to collaborate effectively with data scientists, analysts, and other stakeholders to address their data needs using Azure's capabilities. Strong track record of monitoring and troubleshooting Azure data hosting environments to ensure high availability and system reliability. Skilled in automating data workflows and processes using Azure Data Factory and other Azure-based automation tools. Experience in designing, developing, and deploying modular, cloud-based systems, with a particular emphasis on Azure solutions. Commitment to maintaining cloud solutions in alignment with Azure best practices and continuously integrating Azure's latest updates and features. Possession of Azure certifications, such as Azure Data Engineer Associate or Azure Database Administrator Associate, with a preference for candidates holding the Azure Solutions Architect Expert certification or equivalent advanced credentials.

Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday, with working hours of 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
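As a flavour of the Python automation work described above, here is a hedged sketch using the Azure SDK for Python (the azure-identity and azure-storage-blob packages) to push a file into Blob Storage / Data Lake with Azure AD authentication; the account URL, container, and file names are placeholders.

```python
# Sketch of a routine hosting task: upload an extract to Azure Blob Storage.
# Requires: pip install azure-identity azure-storage-blob
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://mystorageacct.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),  # managed identity, az login, or env vars
)

container = service.get_container_client("raw-landing")  # hypothetical container
with open("daily_extract.parquet", "rb") as data:
    container.upload_blob(name="extracts/daily_extract.parquet",
                          data=data, overwrite=True)
print("upload complete")
```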

Posted 1 month ago

Apply

2.0 - 6.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Title: Data Governance & Management Associate. Location: Bangalore, India.

Role Description: The Compliance and Anti-Financial Crime (CAFC) Data Office is responsible for Data Governance and Management across key functions including AFC, Compliance, and Legal. The team supports these functions in establishing and improving data governance to achieve critical business outcomes such as effective control operation, regulatory compliance, and operational efficiency. The CAFC Data Governance and Management team implements Deutsche Bank's Enterprise Data Management Framework, focusing on controls, culture, and capabilities, to drive improved data quality, reduce audit and regulatory findings, and strengthen controls. As a member of the Divisional Data Office, the role holder will support both Run-the-Bank and Change-the-Bank initiatives, with a particular focus on Financial Crime Risk Assessment (FCRA) data collation, processing, testing, and automation.

Your key responsibilities: Document and maintain existing and new processes; respond to internal and external audit queries and communicate updates clearly to both technical and non-technical audiences. Independently manage the FCRA data collection process, including data collection template generation, quality checks, and stakeholder escalation. Execute data cleansing and transformation tasks to prepare data for analysis. Perform variance analysis and develop a deep understanding of the underlying data sources used in Financial Crime Risk Assessment. Document data quality findings and recommendations for improvement, feeding into technology requirements. Work with Data Architecture and developers to design and build FCRA risk data metrics. Investigate and analyse data issues related to quality, lineage, controls, and authoritative source identification. Ensure new data sources align with Deutsche Bank's Data Governance standards: maintain metadata in Collibra, visualize data lineage in Solidatus, and ensure certification and control coverage. Automate manual data processes using tools such as Python, SQL, Power Query, and MS Excel to improve efficiency and reduce operational risk. Translate complex technical issues into simple, actionable insights for business stakeholders, demonstrating strong communication and stakeholder management skills.

Your skills and experience: 6+ years of experience in data management within financial services, with a strong understanding of data risks and controls. Familiarity with industry-standard frameworks such as DCAM or DAMA (certification preferred). Hands-on experience with data cataloguing using Collibra, data lineage documentation using Solidatus, and data control assessment and monitoring. Proficiency in Python, SQL, and Power Query/Excel for data analysis and automation. Strong communication skills with the ability to explain technical concepts to non-technical stakeholders. Proven ability to work independently and collaboratively across global teams.
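The variance analysis mentioned above could start as simply as the pandas sketch below; the business units, metric name, and 25% tolerance are invented for illustration, not taken from the posting.

```python
# Minimal period-over-period variance check of the kind used to spot
# anomalies in risk-assessment data before escalation. Names are made up.
import pandas as pd

prior = pd.DataFrame({"business_unit": ["A", "B", "C"],
                      "alert_count": [120, 85, 40]})
current = pd.DataFrame({"business_unit": ["A", "B", "C"],
                        "alert_count": [150, 30, 42]})

merged = prior.merge(current, on="business_unit",
                     suffixes=("_prior", "_current"))
merged["variance_pct"] = (
    (merged["alert_count_current"] - merged["alert_count_prior"])
    / merged["alert_count_prior"] * 100
)

# Flag anything beyond an assumed +/-25% tolerance for stakeholder follow-up.
flagged = merged[merged["variance_pct"].abs() > 25]
print(flagged)
```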

Posted 1 month ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Pune

Work from Office

Step into the role of a Senior Data Engineer. At Barclays, innovation isn't just encouraged, it's expected. As a Senior Data Engineer you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with: Hands-on experience working with large-scale data platforms and developing cloud solutions on the AWS data platform, with a proven track record of driving business success. Strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake, and Databricks. Ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies. Hands-on programming experience in Python and PySpark. Understanding of DevOps pipelines using Jenkins and GitLab, strength in data modelling and data architecture concepts, and familiarity with project management tools and Agile methodology. Sound knowledge of data governance principles and tools (Alation/Glue Data Quality, Mesh), and the ability to suggest solution architectures for diverse technology applications.

Additional relevant skills given below are highly valued: Experience working in the financial services industry and in various settlements and sub-ledger functions like PNS, Stock Record and Settlements, and PNL. Knowledge of BPS, IMPACT, and Gloss products from Broadridge, and of creating ML models using Python, Spark, and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete, and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.

Vice President Expectations: To contribute to or set strategy, drive requirements, and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance, and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will train, guide, and coach less experienced specialists and provide information affecting long-term profits, organisational risks, and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives; in-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build, and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
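For the real-time ingestion skills the posting highlights, a minimal Spark Structured Streaming sketch reading from Apache Kafka might look like the following; the broker address, topic, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Sketch: real-time ingestion from Kafka into a Delta bronze table.
# Needs the spark-sql-kafka-0-10 package supplied at submit time.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("live-ingest").getOrCreate()

stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
          .option("subscribe", "settlements")                 # placeholder topic
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers bytes; cast the payload and keep the event timestamp.
events = stream.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

query = (events.writeStream.format("delta")
         .option("checkpointLocation", "/chk/settlements")  # exactly-once bookkeeping
         .start("/lake/bronze/settlements"))
query.awaitTermination()
```

The checkpoint location is what makes the stream restartable without data loss, which is the practical difference between this and a plain batch job run on a schedule.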

Posted 1 month ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a skilled Escalation Engineer with expertise in NetApp ONTAP, data center operations, and storage concepts. The ideal candidate will possess a robust technical background in data storage, coupled with extensive experience in providing technical support and leading teams in resolving complex issues. This role requires a deep understanding of product sustainability and engineering cycles, and a commitment to delivering exceptional customer service.

Job Requirements: Serve as a subject matter expert in NetApp ONTAP and related storage technologies. Lead and coordinate resolution efforts for escalated technical issues, collaborating closely with cross-functional teams. Provide advanced troubleshooting and problem-solving expertise to address complex customer issues. Conduct in-depth analysis of customer environments to identify root causes and develop effective solutions. Actively participate in product sustainability initiatives, including product lifecycle management and engineering cycles. Mentor and guide junior team members, fostering a culture of continuous learning and development. Communicate effectively with customers, internal stakeholders, and management, both verbally and in writing. Document technical solutions, best practices, and knowledge base articles to enhance team efficiency and customer satisfaction.

Education & Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. 10+ years of experience in technical support as a Sr. Engineer/Principal Engineer handling escalations, preferably in a storage or data center environment. In-depth knowledge of NetApp ONTAP and storage concepts such as SAN, NAS, RAID, and replication. Strong understanding of data center architectures, virtualization technologies, and cloud platforms. Proven track record of leading teams in resolving technical escalations and driving issue resolution. Excellent collaboration skills with the ability to work effectively in a cross-functional team environment. Exceptional verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. Demonstrated ability to prioritize and manage multiple tasks in a fast-paced environment. Relevant certifications such as NetApp Certified Implementation Engineer (NCIE) or equivalent are a plus.

At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process.

Equal Opportunity Employer: NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification.

Why NetApp? We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better, but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches. We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations. We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life. If you want to help us build knowledge and solve big problems, let's talk.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies