
832 Data Services Jobs - Page 7

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

6 - 11 Lacs

Noida

Work from Office

Role: Senior Databricks Engineer

As a Databricks Engineer, you will play a pivotal role in designing, implementing, and optimizing data processing pipelines and analytics solutions on the Databricks platform. You will collaborate closely with cross-functional teams to understand business requirements, architect scalable solutions, and ensure the reliability and performance of our data infrastructure. This role requires deep expertise in Databricks, strong programming skills, and a passion for solving complex engineering challenges.

What you'll do:
- Design and develop data processing pipelines and analytics solutions using Databricks.
- Architect scalable and efficient data models and storage solutions on the Databricks platform.
- Collaborate with architects and other teams to migrate the current solution to Databricks.
- Optimize the performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements.
- Apply best practices for data governance, security, and compliance on the Databricks platform.
- Mentor junior engineers and provide technical guidance.
- Stay current with emerging technologies and trends in data engineering and analytics to drive continuous improvement.

You'll be expected to have:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 5 to 8 years of overall experience, with 2+ years designing and implementing data solutions on the Databricks platform.
- Proficiency in programming languages such as Python, Scala, or SQL.
- Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark.
- Experience with cloud platforms such as AWS, Azure, or GCP and their associated data services.
- Proven track record of delivering scalable and reliable data solutions in a fast-paced environment.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Good to have: experience with containerization technologies such as Docker and Kubernetes.
- Knowledge of DevOps practices for automated deployment and monitoring of data pipelines.
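The pipeline work this role describes, cleansing and aggregating data with Python on Databricks, can be sketched in plain Python. This is a hedged illustration only: no Spark cluster is assumed, and the record shape is invented for the example; on Databricks the same logic would be expressed as DataFrame operations.

```python
from collections import defaultdict

def cleanse(records):
    """Drop rows with missing keys and normalize casing -- the kind of
    step a Databricks notebook cell would express with DataFrame ops."""
    return [
        {**r, "region": r["region"].strip().upper()}
        for r in records
        if r.get("region") and r.get("amount") is not None
    ]

def aggregate_by_region(records):
    """Group-and-sum, analogous to df.groupBy('region').sum('amount')."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += r["amount"]
    return dict(totals)

# Illustrative input records (invented for this sketch).
raw = [
    {"region": " north ", "amount": 10.0},
    {"region": "NORTH", "amount": 5.0},
    {"region": None, "amount": 3.0},   # dropped: missing region
    {"region": "south", "amount": 7.5},
]
print(aggregate_by_region(cleanse(raw)))  # {'NORTH': 15.0, 'SOUTH': 7.5}
```

On a real cluster, the cleanse step would also be where governance rules (masking, schema enforcement) are applied before any aggregation.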

Posted 1 week ago

Apply

8.0 - 13.0 years

8 - 13 Lacs

Telangana

Work from Office

Key Responsibilities:

Team Leadership:
- Lead and mentor a team of Azure Data Engineers, providing technical guidance and support.
- Foster a collaborative and innovative team environment.
- Conduct regular performance reviews and set development goals for team members.
- Organize training sessions to enhance team skills and technical capabilities.

Azure Data Platform:
- Design, implement, and optimize scalable data solutions using Azure data services such as Azure Databricks, Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Ensure data engineering best practices and data governance are followed.
- Stay up to date with Azure data technologies and recommend improvements to enhance data processing capabilities.

Data Architecture:
- Collaborate with data architects to design efficient and scalable data architectures.
- Define data modeling standards and ensure data integrity, security, and governance compliance.

Project Management:
- Work with project managers to define project scope, goals, and deliverables.
- Develop project timelines, allocate resources, and track progress.
- Identify and mitigate risks to ensure successful project delivery.

Collaboration & Communication:
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver data-driven solutions.
- Communicate effectively with stakeholders to understand requirements and provide updates.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Team Lead or Manager in data engineering.
- Extensive experience with Azure data services and cloud technologies.
- Expertise in Azure Databricks, PySpark, and SQL.
- Strong understanding of data engineering best practices, data modeling, and ETL processes.
- Experience with agile development methodologies.
- Certifications in Azure data services (preferred).

Preferred Skills:
- Experience with big data technologies and data warehousing solutions.
- Familiarity with industry standards and compliance requirements.
- Ability to lead and mentor a team.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Database Administrator at NTT DATA, you will be a seasoned subject matter expert responsible for ensuring the availability, integrity, and performance of critical data assets. You will work closely with cross-functional teams to support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Collaboration with Change Control, Release Management, Asset and Configuration Management, and Capacity and Availability Management will be essential to establish user needs, monitor access and security, and control database environments.

Key responsibilities include performing the installation, configuration, and maintenance of database management systems, collaborating with software developers/architects to optimize database schemas, and designing backup and disaster recovery strategies. You will monitor database performance, identify bottlenecks, and optimize queries for optimal performance. Additionally, you will work on database documentation, data validation, integrity checks, and data cleansing activities. Supporting database-related initiatives, applying patches, and communicating with technical teams and stakeholders are also crucial aspects of the role.

To excel in this position, you should have seasoned proficiency in database administration tasks, SQL knowledge, database security principles, backup and recovery strategies, and data architecture. Effective communication, problem-solving, and analytical skills, and the ability to manage multiple projects concurrently, are necessary. Academic qualifications include a Bachelor's degree in computer science or a related field, along with relevant certifications such as MCSE DBA or Oracle Associate. Prior experience as a Database Administrator in an IT organization, working with Oracle Enterprise and Microsoft SQL Server, and managing databases is required.
NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a focus on R&D and a diverse team of experts, NTT DATA provides consulting, data, AI, and industry solutions to move organizations confidently into the digital future. As an Equal Opportunity Employer, NTT DATA offers a workplace where diversity and inclusion thrive, allowing employees to grow, belong, and succeed.
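As a hedged illustration of the integrity-check and backup duties this DBA role describes, shown against SQLite from Python's standard library rather than the Oracle or SQL Server systems the role actually targets, a minimal routine might be:

```python
import sqlite3

def check_and_backup(src_path=":memory:"):
    """Integrity-check a database, then take an online backup.
    The table and data here are invented for the example."""
    src = sqlite3.connect(src_path)
    src.execute("CREATE TABLE IF NOT EXISTS accounts (id INTEGER PRIMARY KEY, balance REAL)")
    src.execute("INSERT INTO accounts (balance) VALUES (100.0)")
    src.commit()

    # Integrity check: SQLite reports 'ok' when no corruption is found.
    (status,) = src.execute("PRAGMA integrity_check").fetchone()

    # Online backup to a second database -- a stand-in for a real
    # backup/disaster-recovery strategy on an enterprise RDBMS.
    dst = sqlite3.connect(":memory:")
    src.backup(dst)
    (rows,) = dst.execute("SELECT COUNT(*) FROM accounts").fetchone()
    return status, rows

status, rows = check_and_backup()
print(status, rows)  # ok 1
```

Production DBAs would schedule this kind of check, alert on any status other than "ok", and verify that backups actually restore, not just that they complete.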

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

You should have a solid working knowledge of AWS database and data services as well as the Power BI stack. Your experience should include gathering requirements, modeling data, and designing and supporting high-performance big data backends and data visualization systems. You should also be proficient with methodologies and platform stacks such as MapReduce, Spark, streaming solutions (like Kafka and Kinesis), ETL systems (like Glue and Firehose), storage (like S3), warehouse stacks (like Redshift and DynamoDB), and equivalent open-source stacks.

Your responsibilities will involve designing and implementing solutions using visualization technologies like Power BI and QuickSight. You will also maintain and continuously groom the product backlog, the release pipeline, and the product roadmap, and capture problem statements and opportunities raised by customers as demand items, epics, and stories. Furthermore, you will lead database physical design sessions with the engineers in the team and oversee quality assurance and load tests of the solution to ensure customer experience is maintained. You should also support data governance and data quality (cleansing) efforts. Your primary skills should include expertise in AWS database and data services, the Power BI stack, and big data.
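The streaming experience this listing asks for (Kafka, Kinesis) centers on windowed aggregation. As a hedged, dependency-free sketch, a tumbling-window event count looks like the following; the event shape and the 60-second window are invented for illustration:

```python
from collections import defaultdict

def tumbling_counts(events, window_seconds=60):
    """Count events per tumbling window. Each event is a
    (unix_timestamp, payload) pair; keys are window start times."""
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = ts // window_seconds * window_seconds
        counts[window_start] += 1
    return dict(counts)

# Four events: two in [0, 60), one in [60, 120), one in [120, 180).
events = [(5, "a"), (59, "b"), (61, "c"), (130, "d")]
print(tumbling_counts(events))  # {0: 2, 60: 1, 120: 1}
```

Kinesis Data Analytics or Spark Structured Streaming apply the same windowing idea, with the added concerns of late-arriving data and checkpointing that this sketch omits.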

Posted 1 week ago

Apply

4.0 - 9.0 years

8 - 13 Lacs

Pune / Multiple Locations

Work from Office

Role: Senior Databricks Engineer

As a Databricks Engineer, you will play a pivotal role in designing, implementing, and optimizing data processing pipelines and analytics solutions on the Databricks platform. You will collaborate closely with cross-functional teams to understand business requirements, architect scalable solutions, and ensure the reliability and performance of our data infrastructure. This role requires deep expertise in Databricks, strong programming skills, and a passion for solving complex engineering challenges.

What you'll do:
- Design and develop data processing pipelines and analytics solutions using Databricks.
- Architect scalable and efficient data models and storage solutions on the Databricks platform.
- Collaborate with architects and other teams to migrate the current solution to Databricks.
- Optimize the performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements.
- Apply best practices for data governance, security, and compliance on the Databricks platform.
- Mentor junior engineers and provide technical guidance.
- Stay current with emerging technologies and trends in data engineering and analytics to drive continuous improvement.

You'll be expected to have:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 5 to 8 years of overall experience, with 2+ years designing and implementing data solutions on the Databricks platform.
- Proficiency in programming languages such as Python, Scala, or SQL.
- Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark.
- Experience with cloud platforms such as AWS, Azure, or GCP and their associated data services.
- Proven track record of delivering scalable and reliable data solutions in a fast-paced environment.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Good to have: experience with containerization technologies such as Docker and Kubernetes.
- Knowledge of DevOps practices for automated deployment and monitoring of data pipelines.

Posted 1 week ago

Apply

18.0 - 23.0 years

15 - 19 Lacs

Hyderabad

Work from Office

About the Role
We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 18 years of experience in data engineering and analytics and a proven track record of designing and implementing complex data solutions. As a senior principal data architect, you will be expected to design, create, deploy, and manage Blackbaud's data architecture. This role has considerable technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifically with data, of other projects.

What you'll do
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and, collaboratively, beyond it.
- Design and develop breakthrough products, services, or technological advancements in the Data Intelligence space that expand our business.
- Work alongside product management to craft technical solutions to customer business problems.
- Own the technical data governance practices and ensure data sovereignty, privacy, security, and regulatory compliance.
- Continuously challenge the status quo of how things have been done in the past.
- Build a data access strategy to securely democratize data and enable research, modelling, machine learning, and artificial intelligence work.
- Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice.
- Work in a cross-functional team to translate business needs into data architecture solutions.
- Ensure data solutions are built for performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Keep current on technology: distributed computing, big data concepts, and architecture.
- Promote internally how data within Blackbaud can help change the world.

What you'll bring
- 18+ years of experience in data and advanced analytics.
- At least 8 years of experience working on data technologies in Azure/AWS.
- Expertise in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Expertise in Databricks and Microsoft Fabric.
- Strong understanding of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership skills.

Preferred Qualifications
- Experience working with .Net/Java and microservice architecture.

Stay up to date on everything Blackbaud; follow us on LinkedIn, X, Instagram, Facebook, and YouTube. Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status, or any other basis protected by federal, state, or local law.

Posted 1 week ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

This role will be responsible for leading the design, implementation, and maintenance of database systems, ensuring their performance, security, and integrity. You will be involved in the software development lifecycle (SDLC), working closely with developers and other data analysts to optimize data management and retrieval processes.

What you'll be doing:
- Technical Leadership: Coach and mentor team members and DBAs to enhance their technical skills and knowledge in the database engineering domain.
- Strategy Development: Collaborate with stakeholders to set and implement the technical strategy for database management and development.
- Database Management: Design, implement, and maintain robust database systems using Azure, SQL Server, and NoSQL databases.
- Performance Optimization: Ensure optimal performance, reliability, and scalability of database systems.
- SaaS Integration: Develop and manage database solutions for SaaS applications, ensuring seamless integration and performance.
- Scripting and Automation: Develop and maintain scripts and automation tools to streamline database operations.
- Data Security: Implement and enforce data security measures to protect sensitive information.
- Monitoring: Monitor database performance and ensure high availability and security.
- Troubleshooting: Diagnose and resolve database-related issues promptly and efficiently.
- Documentation: Maintain comprehensive documentation of database configurations, processes, and procedures.

What we'll want you to have:
- Experience: Minimum of 8 years of experience in database engineering.
- Azure Expertise: Extensive experience with Azure SQL Database, Azure Cosmos DB, and other Azure data services; proficiency in deploying, managing, and scaling databases in Azure; knowledge of Azure Data Factory, Azure Data Lake, and Azure Synapse Analytics; experience with Azure Resource Manager (ARM) templates and Azure DevOps for CI/CD pipelines.
- SQL Server Expertise: In-depth knowledge of SQL Server architecture, performance tuning, and query optimization; experience with SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS); proficiency in database backup, recovery, and high-availability solutions such as Always On Availability Groups.
- NoSQL Databases: Proven experience with NoSQL databases like MongoDB, Cassandra, or Couchbase.
- SQL and Scripting Tools: Advanced proficiency in SQL and scripting tools such as PowerShell, Python, or Bash.
- SaaS Experience: Proven experience in developing and managing database solutions for SaaS applications.
- Leadership: Proven ability to coach and mentor team members.
- Strategic Thinking: Experience in setting and executing technical strategies.
- Problem-Solving: Strong analytical and problem-solving skills.
- Communication: Excellent verbal and written communication skills.
- Able to work flexible hours as required by business priorities.
- Able to deliver work that meets quality, security, and operability standards.

Preferred Qualifications:
- Experience with cloud-based database solutions.
- Knowledge of data warehousing and ETL processes.
- Familiarity with DevOps practices and tools.
- Certifications: Azure data and SQL Server certifications.

Stay up to date on everything Blackbaud; follow us on LinkedIn, X, Instagram, Facebook, and YouTube. Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status, or any other basis protected by federal, state, or local law.
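As a hedged sketch of the query-optimization work this role describes, here is the core index-tuning idea demonstrated with SQLite from Python's standard library in place of SQL Server: adding an index turns a full table scan into an index search, which is the same reasoning an execution-plan review applies (table and column names are invented for the example).

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 50, float(i)) for i in range(1000)])

def plan(sql):
    """Return the human-readable steps of the query plan.
    EXPLAIN QUERY PLAN rows end with a detail string."""
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT * FROM orders WHERE customer_id = 7"
before = plan(query)                      # full table scan
con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)                       # index search
print(before, after)
```

On SQL Server the equivalent workflow is reading the actual execution plan and watching a Clustered Index Scan become an Index Seek after the right nonclustered index is added.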

Posted 1 week ago

Apply

0.0 - 2.0 years

9 - 10 Lacs

Pune

Work from Office

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for 50 years in the USA. Data Axle has set up a strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases. Data Axle is headquartered in Dallas, TX, USA.

Roles and Responsibilities:
- Implement application components and systems according to department standards and guidelines.
- Work with product and designers to translate requirements into accurate representations.
- Analyze, design, code, debug, and test business applications.
- Conduct code reviews in accordance with team processes/standards.
- Understand and work with data in varying degrees of complexity and scale.
- Be responsible for planning, processing, and performing all jobs in an efficient manner.
- Provide assistance to testers and support personnel as needed to determine system problems.
- Resolve problems involved with integrating new technologies with systems.
- Remain knowledgeable of new emerging technologies and their impact on internal systems.
- Perform other miscellaneous duties as assigned by management.

Qualifications:
- 3+ years of experience with a range of software applications and technologies.
- Bachelor's degree in a technology-related area (Computer Science, Engineering, etc.) is required; Master's degree preferred.
- Good knowledge of design methodology and standard software design patterns.
- Experience of working in Agile teams.
- Strong technical written and verbal communication in English.
- Proven ability to develop systems and web services for data storage and access.
- Strong organizational and detail-oriented skills.

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Posted 1 week ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Pune

Work from Office

About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases. Data Axle India is recognized as a Great Place to Work! This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.

General Summary:
We are looking for a Collections Specialist Associate who will be responsible for contacting clients and customers regarding collection of payment, collecting unpaid balances, and applying payments to accounts.

Roles & Responsibilities:
- Make collection calls on past-due accounts.
- Collect payments due on unpaid balances from delinquent accounts.
- Work with customers to resolve disputed items; issue appropriate credits and post them accurately.
- Review and reconcile accounts; re-bill if necessary.
- Post daily cash receipts.
- Research negative-balance accounts.
- Review tax-exempt certificates and code them into the master file.
- Escalate any accounts that are more than 75 days delinquent.
- Prepare and distribute reports.
- Be at work on a regular and predictable basis or as scheduled.
- Handle ad hoc projects within accounting.

Qualifications:
- 2-4 years of previous relevant work experience required.
- Bachelor's degree.
- Ability to communicate with individuals at all levels in the company and with various external business contacts in an articulate, professional manner.
- Strong organizational skills and attention to detail.
- Must have experience in managing high-volume email communications.
- Ability to anticipate and react quickly in a dynamic business environment.
- Must be a team player.
- Must be able to multi-task to meet strict deadlines.
- Ability to use Microsoft Excel and Outlook at an advanced level.
- Experience with Oracle EBS.

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Disclaimer: Data Axle India follows a merit-based employee recruitment practice with extensive screening steps. Data Axle India does not charge/accept any amount or security deposit from job seekers during the recruitment process. Job seekers are requested to be aware of unsolicited or fraudulent communication regarding a job offer or an interview call against payment of money; please stay alert. All Data Axle India jobs are published on the Careers page of its website and/or on its LinkedIn profile. Interested job seekers may access the same and apply directly. If you believe you have been a victim of recruitment fraud, you are requested to approach law enforcement agencies immediately.

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Ahmedabad

Work from Office

Role: Senior Databricks Engineer

As a Databricks Engineer, you will play a pivotal role in designing, implementing, and optimizing data processing pipelines and analytics solutions on the Databricks platform. You will collaborate closely with cross-functional teams to understand business requirements, architect scalable solutions, and ensure the reliability and performance of our data infrastructure. This role requires deep expertise in Databricks, strong programming skills, and a passion for solving complex engineering challenges.

What you'll do:
- Design and develop data processing pipelines and analytics solutions using Databricks.
- Architect scalable and efficient data models and storage solutions on the Databricks platform.
- Collaborate with architects and other teams to migrate the current solution to Databricks.
- Optimize the performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements.
- Apply best practices for data governance, security, and compliance on the Databricks platform.
- Mentor junior engineers and provide technical guidance.
- Stay current with emerging technologies and trends in data engineering and analytics to drive continuous improvement.

You'll be expected to have:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 5 to 8 years of overall experience, with 2+ years designing and implementing data solutions on the Databricks platform.
- Proficiency in programming languages such as Python, Scala, or SQL.
- Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark.
- Experience with cloud platforms such as AWS, Azure, or GCP and their associated data services.
- Proven track record of delivering scalable and reliable data solutions in a fast-paced environment.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Good to have: experience with containerization technologies such as Docker and Kubernetes.
- Knowledge of DevOps practices for automated deployment and monitoring of data pipelines.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Overview
DataOps L3. The role will leverage and enhance existing technologies in the area of data and analytics solutions, such as Power BI, Azure data engineering technologies, ADLS, ADB (Azure Databricks), Azure Synapse, and other Azure services. The role will be responsible for developing and supporting IT products and solutions using these technologies and deploying them for business users.

Responsibilities
- 5 to 10 years of IT and Azure data engineering technologies experience.
- Prior experience in ETL, data pipelines, and data flow techniques using Azure Data Services.
- Working experience in Python, PySpark, Azure Data Factory, Azure Data Lake Gen2, Databricks, Azure Synapse, and file formats like JSON and Parquet.
- Experience in creating ADF pipelines to source and process data sets.
- Experience in creating Databricks notebooks to cleanse, transform, and enrich data sets.
- Development experience in orchestration of pipelines.
- Good understanding of SQL, databases, and data warehouse systems, preferably Teradata.
- Experience in deployment and monitoring techniques.
- Working experience with Azure DevOps CI/CD pipelines to deploy Azure resources.
- Experience in handling operations/integration with a source repository.
- Must have good knowledge of data warehouse concepts and data warehouse modelling.
- Working knowledge of ServiceNow (SNOW), including resolving incidents, handling change requests/service requests, and reporting on metrics to provide insights.
- Collaborate with the project team to understand tasks, model tables using data warehouse best practices, and develop data pipelines to ensure the efficient delivery of data.
- Strong expertise in performance tuning and optimization of data processing systems.
- Proficient in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data services.
- Develop and enforce best practices for data management, including data governance and security.
- Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Proficient in implementing a DataOps framework.

Qualifications
- Azure Data Factory
- Azure Databricks
- Azure Synapse
- PySpark/SQL
- ADLS
- Azure DevOps with CI/CD implementation

Nice-to-Have Skill Sets
- Business intelligence tools (preferred: Power BI)
- DP-203 certified
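The JSON-to-Parquet work this listing mentions usually starts by flattening nested JSON events into tabular records. A hedged, stdlib-only sketch of that cleanse step (the event shape and dotted-name convention are invented for illustration; in a Databricks notebook this would typically be a schema-driven DataFrame transform before writing Parquet):

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested JSON objects into a single dict with dotted keys,
    so each record maps cleanly onto a flat (Parquet-style) schema."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            out.update(flatten(value, name))
        else:
            out[name] = value
    return out

event = json.loads('{"order": {"id": 42, "customer": {"tier": "gold"}}, "total": 9.5}')
print(flatten(event))  # {'order.id': 42, 'order.customer.tier': 'gold', 'total': 9.5}
```

This sketch ignores arrays and schema drift, the two complications that make real ingestion pipelines lean on explicit schemas rather than ad hoc flattening.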

Posted 1 week ago

Apply

6.0 - 11.0 years

25 - 27 Lacs

Hyderabad

Work from Office

Overview
We are seeking a highly skilled and experienced Azure Data Engineer to join our dynamic team. In this critical role, you will be responsible for designing, developing, and maintaining robust and scalable data solutions on the Microsoft Azure platform. You will work closely with data scientists, analysts, and business stakeholders to translate business requirements into effective data pipelines and data models.

Responsibilities
- Design, develop, and implement data pipelines and ETL/ELT processes using Azure Data Factory, Azure Databricks, and other relevant Azure services.
- Develop and maintain data lakes and data warehouses on Azure, including Azure Data Lake Storage Gen2 and Azure Synapse Analytics.
- Build and optimize data models for data warehousing, data marts, and data lakes.
- Develop and implement data quality checks and data governance processes.
- Troubleshoot and resolve data-related issues.
- Collaborate with data scientists and analysts to support data exploration and analysis.
- Stay current with the latest advancements in cloud computing and data engineering technologies.
- Participate in all phases of the software development lifecycle, from requirements gathering to deployment and maintenance.

Qualifications
- 6+ years of experience in data engineering, with at least 3 years of experience working with Azure data services.
- Strong proficiency in SQL, Python, and other relevant programming languages.
- Experience with data warehousing and data lake architectures.
- Experience with ETL/ELT tools and technologies, such as Azure Data Factory, Azure Databricks, and Apache Spark.
- Experience with data modeling and data warehousing concepts.
- Experience with data quality and data governance best practices.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Experience with Agile development methodologies.
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
- Relevant Azure certifications (e.g., Azure Data Engineer Associate) are a plus.
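The data quality checks this role calls for often take the form of a gate that rejects a batch before it lands in the warehouse. A hedged, minimal sketch (the column name and 5% threshold are invented for illustration; on Azure this logic would typically run as a Databricks task or an ADF validation activity):

```python
def null_rate(rows, column):
    """Fraction of rows where the required column is missing or null."""
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def quality_gate(rows, column, max_null_rate=0.05):
    """Pass or fail a batch based on the null rate of a required column."""
    rate = null_rate(rows, column)
    return ("pass", rate) if rate <= max_null_rate else ("fail", rate)

batch = [{"customer_id": 1}, {"customer_id": None},
         {"customer_id": 3}, {"customer_id": 4}]
print(quality_gate(batch, "customer_id"))  # ('fail', 0.25)
```

In practice a failed gate would quarantine the batch and raise an alert rather than silently dropping rows, so downstream models never see the bad data.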

Posted 1 week ago

Apply

5.0 - 10.0 years

19 - 25 Lacs

Hyderabad

Work from Office

Overview Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives. Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred. Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence. Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy. Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency. Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments. Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes. Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture. Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution. Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps. Responsibilities Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics. 
Responsibilities:
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications:
- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
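The "CI/CD for data pipelines, automated testing" duty above can be illustrated with a minimal sketch: a pure-Python transformation plus an assertion-based smoke check of the kind a CI stage (for example, in Azure DevOps) could run on every commit. All function and field names here are hypothetical, not taken from the posting.

```python
def transform(records):
    """Normalize raw order records: drop rows missing an id,
    lowercase the region code, and coerce amount to float."""
    cleaned = []
    for row in records:
        if not row.get("order_id"):
            continue  # reject rows that fail the data contract
        cleaned.append({
            "order_id": row["order_id"],
            "region": str(row.get("region", "unknown")).lower(),
            "amount": float(row.get("amount", 0)),
        })
    return cleaned

def run_smoke_test():
    """A CI step would call this and fail the build on AssertionError."""
    raw = [
        {"order_id": "A1", "region": "EU", "amount": "10.5"},
        {"order_id": None, "region": "US", "amount": "3"},  # dropped
        {"order_id": "A2"},                                 # defaults applied
    ]
    out = transform(raw)
    assert len(out) == 2
    assert all(r["region"].islower() for r in out)
    assert all(isinstance(r["amount"], float) for r in out)
    return out
```

In practice the same idea scales up with a test framework and sample fixtures, but the principle is unchanged: every pipeline change runs automated checks before deployment.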

Posted 1 week ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Overview
We are seeking a skilled and proactive business analyst with expertise in Azure Data Engineering to join our dynamic team. In this role, you will bridge the gap between business needs and technical solutions, leveraging your analytical skills and Azure platform knowledge to design and implement robust data solutions. You will collaborate closely with stakeholders to gather and translate requirements, develop data pipelines, and ensure data quality and governance. This position requires a strong understanding of Azure services, data modeling, and ETL processes, along with the ability to thrive in a fast-paced, evolving environment.

Responsibilities:
- Collaborate with stakeholders to understand business needs and translate them into technical requirements.
- Design, develop, and implement data solutions using Azure Data Engineering technologies.
- Analyze complex data sets to identify trends, patterns, and insights that drive business decisions.
- Create and maintain detailed documentation of business requirements, data models, and data flows.
- Work in an environment where requirements are not always clearly defined, demonstrating flexibility and adaptability.
- Conduct data quality assessments and implement data governance practices.
- Provide training and support to end-users on data tools and solutions.
- Continuously monitor and optimize data processes for efficiency and performance.

Qualifications:
- Minimum of 2-4 years of experience as a data analyst with hands-on experience in Azure Data Engineering.
- Proficiency in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data services.
- Strong analytical and problem-solving skills with the ability to work in a fast-paced, ambiguous environment.
- Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
- Experience with data modeling, ETL processes, and data warehousing.
- Knowledge of data governance and data quality best practices.
- Ability to manage multiple projects and priorities simultaneously.

Preferred Skills:
- Experience with other cloud platforms and data engineering tools.
- Certification in Azure Data Engineering or related fields.
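The "data quality assessments" responsibility above boils down to computing a few metrics over a dataset. A minimal, framework-free sketch (names hypothetical; real assessments would typically run in Azure Databricks or Data Factory against production tables):

```python
def assess_quality(rows, key_field):
    """Compute simple data-quality metrics over a list of dict rows:
    per-column null/empty rate and duplicate count on the key field."""
    total = len(rows)
    columns = {c for row in rows for c in row}
    null_rate = {
        c: sum(1 for r in rows if r.get(c) in (None, "")) / total
        for c in columns
    }
    seen, dupes = set(), 0
    for r in rows:
        k = r.get(key_field)
        if k in seen:
            dupes += 1  # a repeated business key signals a quality issue
        seen.add(k)
    return {"rows": total, "null_rate": null_rate, "duplicate_keys": dupes}

# usage: three rows, one duplicate key, two bad "name" values
report = assess_quality(
    [{"id": 1, "name": "a"}, {"id": 1, "name": ""}, {"id": 2, "name": None}],
    key_field="id",
)
```

Governance practice then sets thresholds on these metrics (e.g., fail the load if the null rate on a mandatory column exceeds zero).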

Posted 1 week ago

Apply

10.0 - 15.0 years

4 - 8 Lacs

Noida

Work from Office

Highly skilled and experienced Data Modeler to join the Enterprise Data Modelling team. The candidate will be responsible for creating and maintaining conceptual, logical, and physical data models, ensuring alignment with industry best practices and standards. Working closely with business and functional teams, the Data Modeler will play a pivotal role in standardizing data models at portfolio and domain levels, driving efficiencies and maximizing the value of the client's data assets. Preference will be given to candidates with prior experience within an Enterprise Data Modeling team. The ideal domain experience would be Insurance or Investment Banking.

Roles and Responsibilities:
- Develop comprehensive conceptual, logical, and physical data models for multiple domains within the organization, leveraging industry best practices and standards.
- Collaborate with business and functional teams to understand their data requirements and translate them into effective data models that support their strategic objectives.
- Serve as a subject matter expert in data modeling tools such as ERwin Data Modeler, providing guidance and support to other team members and stakeholders.
- Establish and maintain standardized data models across portfolios and domains, ensuring consistency, governance, and alignment with organizational objectives.
- Identify opportunities to optimize existing data models and enhance data utilization, particularly in critical areas such as fraud, banking, and AML.
- Provide consulting services to internal groups on data modeling tool usage, administration, and issue resolution, promoting seamless data flow and application connections.
- Develop and deliver training content and support materials for data models, ensuring that stakeholders have the necessary resources to understand and utilize them effectively.
- Collaborate with the enterprise data modeling group to develop and implement a robust governance framework and metrics for model standardization, with a focus on long-term automated monitoring solutions.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 10 years of experience working as a Data Modeler or in a similar role, preferably within a large enterprise environment.
- Expertise in data modeling concepts and methodologies, with demonstrated proficiency in creating conceptual, logical, and physical data models.
- Hands-on experience with data modeling tools such as ERwin Data Modeler, as well as proficiency in database environments such as Snowflake and Netezza.
- Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into effective data models.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.

Keywords: problem-solving skills, business intelligence platforms, ERwin, data modeling, database management systems, data warehousing, ETL processes, big data technologies, agile methodologies, data governance, SQL, enterprise data modelling, data visualization tools, cloud data services, analytical skills, data modelling tools, data architecture, communication skills
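The logical-to-physical step described above (the part a tool like ERwin automates) can be sketched in a few lines: represent a logical entity as data, then render it as physical DDL. Everything here, including the `policy` entity, is an illustrative assumption, not a real client model.

```python
from dataclasses import dataclass

@dataclass
class Column:
    name: str
    dtype: str
    nullable: bool = True

@dataclass
class Entity:
    name: str
    columns: list
    primary_key: str

def to_ddl(entity):
    """Render a logical entity as a physical CREATE TABLE statement,
    the logical-to-physical translation a modeling tool automates."""
    cols = []
    for c in entity.columns:
        null = "" if c.nullable else " NOT NULL"
        cols.append(f"  {c.name} {c.dtype}{null}")
    cols.append(f"  PRIMARY KEY ({entity.primary_key})")
    return f"CREATE TABLE {entity.name} (\n" + ",\n".join(cols) + "\n)"

# usage: a toy insurance-domain entity
policy = Entity(
    name="policy",
    columns=[Column("policy_id", "INTEGER", nullable=False),
             Column("holder_name", "VARCHAR(100)")],
    primary_key="policy_id",
)
ddl = to_ddl(policy)
```

Standardizing models across domains then amounts to sharing and governing these entity definitions rather than hand-writing per-database DDL.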

Posted 1 week ago

Apply

0.0 - 5.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru

Job Responsibilities:
- Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS.
- Requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, as well as the design of data integration and publication pipelines.
- Develop Snowflake deployment and usage best practices.
- Help educate the rest of the team members on the capabilities and limitations of Snowflake.
- Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines.
- Design, build, test, and maintain data management systems.
- Work in sync with internal and external team members like data architects, data scientists, and data analysts to handle all sorts of technical issues.
- Act as technical leader within the team.
- Work in an Agile/Lean model.
- Deliver quality deliverables on time.
- Translate complex functional requirements into technical solutions.

EXPERTISE AND QUALIFICATIONS
Essential Skills, Education and Experience:
- Should have a B.E. / B.Tech. / MCA or equivalent degree along with 4-7 years of experience in Data Engineering.
- Strong experience in DBT concepts like model building and configurations, incremental load strategies, macros, and DBT tests.
- Strong experience in SQL.
- Strong experience in AWS.
- Creation and maintenance of optimum data pipeline architecture for ingestion and processing of data.
- Creation of necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake.
- Experience in data storage technologies like Amazon S3, SQL, and NoSQL.
- Data modeling technical awareness.
- Experience in working with stakeholders in different time zones.

Good to have:
- AWS data services development experience.
- Working knowledge of big data technologies.
- Experience collaborating with data quality and data governance teams.
- Exposure to reporting tools like Tableau; Apache Airflow and Apache Kafka (nice to have).
- In-depth understanding of the Payments domain (CRM, Accounting, etc.).
- Regulatory reporting exposure.

Other skills:
- Good communication skills.
- Team player and problem solver.
- Willing to learn new technologies, share your ideas, and assist other team members as needed.
- Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.
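The "incremental load strategies" skill named above follows a common pattern: process only source rows newer than a high watermark, then upsert on the business key. A plain-Python sketch of that pattern (in real DBT this is expressed in SQL inside a model guarded by `is_incremental()`; the table and column names here are hypothetical):

```python
def incremental_load(target, source, key, watermark_col):
    """Merge source rows into target, processing only rows newer than
    the target's current high watermark, upserting on the business key."""
    # High watermark: the newest timestamp already loaded into the target.
    watermark = max((r[watermark_col] for r in target), default=None)
    new_rows = [r for r in source
                if watermark is None or r[watermark_col] > watermark]
    by_key = {r[key]: r for r in target}
    for r in new_rows:
        by_key[r[key]] = r  # insert new keys, overwrite changed ones
    return sorted(by_key.values(), key=lambda r: r[key])

# usage: id 1 is unchanged (not newer), id 2 is updated, id 3 is new
target = [{"id": 1, "updated": 1}, {"id": 2, "updated": 2}]
source = [{"id": 1, "updated": 1}, {"id": 2, "updated": 3},
          {"id": 3, "updated": 3}]
result = incremental_load(target, source, key="id", watermark_col="updated")
```

The payoff is that each run touches only the delta, which is what makes large Snowflake models cheap to refresh.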

Posted 1 week ago

Apply

3.0 - 6.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are looking to hire a Data Engineer for the Platform Engineering team. It is a collection of highly skilled individuals, ranging from development to operations, with a security-first mindset, who strive to push the boundaries of technology. We champion a DevSecOps culture and raise the bar on how and when we deploy applications to production. Our core principles are centered around automation, testing, quality, and immutability, all via code. The role is responsible for building self-service capabilities that improve our security posture and productivity and reduce time to market, with automation at the core of these objectives. The individual collaborates with teams across the organization to ensure applications are designed for Continuous Delivery (CD) and are well-architected for their targeted platform, which can be on-premise or the cloud. If you are passionate about developer productivity, cloud-native applications, and container orchestration, this job is for you!

Principal Accountabilities:
- The incumbent is mentored by senior individuals on the team to capture the flow and bottlenecks in the holistic IT delivery process and define future tool sets.

Skills and Software Requirements:
- Experience with a language such as Python, Go, SQL, Java, or Scala
- GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, IAM)
- Experience with Jenkins, Maven, Git, Ansible, or Chef
- Experience working with containers, orchestration tools (like Kubernetes, Mesos, Docker Swarm, etc.), and container registries (GCE, Docker Hub, etc.)
- Experience with SaaS, PaaS, or IaaS (Software-as-a-Service, Platform-as-a-Service, or Infrastructure-as-a-Service)
- Acquire, cleanse, and ingest structured and unstructured data on the cloud
- Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a Data Lake)
- Enable and support data movement from one system or service to another
- Experience implementing or supporting automated solutions to technical problems
- Experience working in a team environment, proactively executing tasks while meeting agreed delivery timelines
- Ability to contribute to effective and timely solutions
- Excellent oral and written communication skills
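One building block behind the "automated solutions to technical problems" item above is retrying transient failures when ingesting data to cloud services. A minimal retry-with-exponential-backoff sketch (the `flaky_upload` callable is a stand-in for, say, a GCS or BigQuery client call; nothing here is a real GCP API):

```python
import time

def with_retries(fn, attempts=4, base_delay=0.01):
    """Call fn, retrying on exception with exponential backoff,
    a common pattern for transient cloud-service failures."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))

# usage: a hypothetical flaky call that succeeds on the third try
calls = {"n": 0}
def flaky_upload():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"
```

Production versions usually add jitter and retry only on error types known to be transient, but the control flow is the same.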

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As a Database Administrator at NTT DATA, you will be a seasoned subject matter expert responsible for ensuring the availability, integrity, and performance of critical data assets. You will work closely with cross-functional teams to support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Collaboration with Change Control, Release Management, Asset and Configuration Management, and Capacity and Availability Management will be essential to meet the needs of users and ensure database security and integrity. Key responsibilities include performing installation, configuration, and maintenance of database management systems, collaborating with software developers/architects to optimize database-related applications, designing backup and disaster recovery strategies, monitoring database performance, and providing technical support to end-users. You will also participate in database software upgrades, data validation activities, and work collaboratively with cross-functional teams to support database-related initiatives. To excel in this role, you should have seasoned proficiency in database administration tasks, a strong understanding of SQL, database security principles, and backup strategies. Effective communication, problem-solving, and analytical skills are crucial, along with the ability to manage multiple projects concurrently while maintaining attention to detail. Academic qualifications in computer science or related fields, along with relevant certifications like MCSE DBA or Oracle Certified Professional, are preferred. NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse workforce and a focus on R&D, NTT DATA is dedicated to moving organizations confidently into the digital future. 
As an Equal Opportunity Employer, NTT DATA offers a dynamic workplace where employees can thrive, grow, and make a difference.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You should have a solid working knowledge of AWS database and data services as well as the Power BI stack. Your experience in gathering requirements, modeling data, and designing and supporting high-performance big data backend and data visualization systems will be crucial. You should be adept at utilizing methodologies and platform stacks such as MapReduce and Spark, streaming solutions like Kafka and Kinesis, ETL systems like Glue and Firehose, storage solutions like S3, warehouse stacks like Redshift and DynamoDB, and equivalent open-source stacks. Designing and implementing solutions using visualization technologies like Power BI and QuickSight should be within your expertise.

You will be responsible for maintaining and continuously grooming the product backlog, the release pipeline, and the product roadmap. It will be your responsibility to capture problem statements and opportunities raised by customers as demand items, epics, and stories. Leading database physical design sessions with the engineers in the team and ensuring quality assurance and load testing of the solution to maintain customer experience are also part of the role. Additionally, you will be supporting data governance and data quality (cleansing) efforts.

Your primary skills should include proficiency in AWS database and data services, the Power BI stack, and big data.
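The streaming solutions mentioned above (Kafka, Kinesis) are usually paired with windowed aggregation. A toy sketch of a tumbling (fixed, non-overlapping) window count in plain Python; real deployments would express this in Spark Structured Streaming or Kinesis Data Analytics, and the event shape here is an assumption:

```python
def tumbling_window_counts(events, window_seconds):
    """Count (timestamp, key) events per fixed, non-overlapping window.
    Each event lands in exactly one window based on its timestamp."""
    out = {}
    for ts, key in events:
        # Align the timestamp down to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        out[(window_start, key)] = out.get((window_start, key), 0) + 1
    return out

# usage: 10-second windows over four events
counts = tumbling_window_counts(
    [(0, "a"), (5, "a"), (12, "a"), (13, "b")], window_seconds=10
)
```

Stream engines add the hard parts on top of this (late data, watermarks, state eviction), but the windowing arithmetic is exactly this floor-division.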

Posted 1 week ago

Apply

6.0 - 7.0 years

6 - 11 Lacs

Noida

Work from Office

Responsibilities:
- Data Architecture: Develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
- AWS Data Services: Expertise in using AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, EventBridge Scheduler, etc.
- Data Warehousing: Design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
- Data Lakes: Build and manage data lakes on AWS using AWS S3 and other relevant services.
- Data Pipelines: Design and develop efficient data pipelines to extract, transform, and load data from various sources.
- Data Quality: Implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
- Cloud Optimization: Optimize data engineering solutions for performance, cost-efficiency, and scalability on the AWS cloud.
- Team Leadership: Mentor and guide data engineers, ensuring they adhere to best practices and meet project deadlines.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6-7 years of experience in data engineering roles, with a focus on AWS cloud platforms.
- Strong understanding of data warehousing and data lake concepts.
- Proficiency in SQL and at least one programming language (Python/PySpark).
- Good to have: experience with big data technologies like Hadoop, Spark, and Kafka.
- Knowledge of data modeling and data quality best practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Certifications in AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Data.

Mandatory Competencies:
- Big Data: PySpark
- Data on Cloud: Azure Data Lake (ADL)
- Behavioral: Communication and collaboration
- Cloud - AWS: AWS Lambda, AWS EventBridge, AWS Fargate
- Cloud - AWS: AWS S3, S3 Glacier, AWS EBS
- Cloud - Azure: Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight
- Cloud - AWS: TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift
- Database - SQL Server: SQL Packages
- Data Science and Machine Learning: Python
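The extract-transform-load pipelines described above are, at their core, composed stages. A minimal sketch of stage composition in plain Python; a real AWS Glue or PySpark job wires the same shape onto distributed data, and the stage functions here are purely illustrative:

```python
from functools import reduce

def pipeline(*steps):
    """Compose extract/transform/load steps into a single callable
    that threads the data through each stage in order."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

# hypothetical stages: filter out bad records, deduplicate, then "load"
extract = lambda rows: [r for r in rows if r is not None]
dedupe = lambda rows: list(dict.fromkeys(rows))  # preserves first-seen order
load = lambda rows: {"loaded": len(rows), "rows": rows}

run = pipeline(extract, dedupe, load)
result = run([1, None, 2, 2, 3])
```

Keeping each stage a small pure function is what makes pipelines testable in isolation, which in turn is what makes the data-quality gates mentioned above enforceable.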

Posted 1 week ago

Apply

1.0 - 4.0 years

12 - 17 Lacs

Mumbai, Nagpur, Thane

Work from Office

Xpetize Technology Solutions Private Limited is looking for a Field Sales Executive to join our dynamic team and embark on a rewarding career journey.
- Promote and sell products or services to customers in the field
- Meet or exceed sales targets by developing and implementing sales strategies
- Build and maintain relationships with customers to increase product awareness and loyalty
- Conduct product presentations and provide product information to customers
- Provide excellent customer service, including addressing customer inquiries and resolving customer issues
- Monitor market trends and competitor activities to identify business opportunities and threats
- Maintain accurate sales records and report sales activity to management
- Strong knowledge of sales techniques, product knowledge, and customer service best practices
- Excellent communication and interpersonal skills

Posted 1 week ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Ahmedabad

Work from Office

- Develop, record, and maintain cutting-edge web-based PHP applications for the company.
- Build innovative, state-of-the-art applications and collaborate with the User Experience (UX) team.
- Ensure HTML, CSS, and shared JavaScript are valid and consistent across applications.
- Prepare and maintain all applications utilizing standard development tools.
- Utilize backend data services and contribute to expanding existing data services APIs.
- Lead the entire web application development life cycle, from concept stage to delivery and post-launch support.
- Communicate effectively on task progress, evaluations, suggestions, and schedules, along with technical and process issues.
- Document the development process, architecture, and standard components.
- Coordinate with co-developers, keep the project manager well informed of the status of the development effort, and serve as a liaison between development staff and the project manager.

Posted 1 week ago

Apply

4.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

We are looking for a skilled SAP ABAP Developer with a strong programming background and hands-on experience in modern SAP technologies. The ideal candidate should have solid technical expertise across ABAP and related frameworks, along with a strong educational foundation.
- Design, develop, and implement SAP applications using ABAP/ABAP OO
- Develop SAP Core Data Services (CDS) views, OData services, and Fiori/UI5 applications
- Work with technologies such as BOPF, RAP, and HANA
- Integrate SOAP APIs and work with frontend scripting using JavaScript
- Collaborate with functional teams to translate business requirements into technical solutions

Required Skills:
- 4 to 10 years of experience in software development
- Strong educational background: Bachelor's degree in Engineering or MCA from reputed institutes
- Expertise in: ABAP/ABAP OO, CDS Views/OData Services, Fiori/UI5, BOPF/HANA/RAP, SOAP API/JavaScript

Keywords: SAP ABAP, Migration, Implementation

Posted 1 week ago

Apply

7.0 - 12.0 years

7 - 11 Lacs

Kolkata

Work from Office

Your day in the role will include:
- Assisting clients in the selection, implementation, and support of Data Services for SAP.
- Leading projects of various sizes, as a team member or lead, to implement new functionalities and improve existing ones, including articulating and analyzing requirements and translating them into effective solutions.
- Preparing and conducting Unit Testing and User Acceptance Testing.

You will come with:
- Knowledge and experience in implementation planning, fit analysis, configuration, testing, rollout, and post-implementation support.
- Experience in working with teams to prioritize work and drive system solutions by clearly articulating business needs.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Overall, 7-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ and 3+ years of SAP functional experience specializing in design and configuration of SAP BODS/HANA SDI modules.
- At least 3 years of hands-on experience in Syniti ADM, apart from other data migration exposure.
- As a Syniti ADM consultant, primarily data migration experience from different legacy systems to SAP or non-SAP systems.
- Data migration experience from SAP ECC or non-SAP systems to SAP S/4HANA or other SAP target systems.
- Should be able to prepare mapping sheets combining functional and technical expertise.

Preferred technical and professional experience:
- Experience with or strong knowledge of SAP Data Hub.
- Experience with or strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, including the ability to develop flowgraphs to validate and transform data.
- Ability to develop workflows and data flows based on specifications using various stages in BODS.

Posted 1 week ago

Apply

4.0 - 7.0 years

8 - 9 Lacs

Hyderabad

Work from Office

Req ID: 332502

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior SAP S/4HANA FICO Consultant to join our team in Hyderabad, Telangana (IN-TG), India (IN).

At NTT DATA Services, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence, and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA Services and for the people who work here.

SAP S/4HANA FICO Consultant: An SAP S/4HANA FICO (Financial Accounting and Controlling) consultant is responsible for implementing, configuring, and supporting SAP S/4HANA Finance solutions. This involves understanding business requirements, designing solutions, performing configurations, testing, and providing user support. They also play a key role in integrating FICO with other SAP modules and external systems.

Responsibilities:
- Gathering and Analyzing Requirements: Work with business users to understand their needs and translate them into functional specifications for SAP S/4HANA FICO.
- Solution Design and Configuration: Design and configure the SAP S/4HANA FICO module to meet business requirements, including setting up company codes, chart of accounts, fiscal year variants, and other relevant configurations.
- Engage with customers in executive meetings, solution workshops, and design sessions, and clearly articulate the business value of SAP solutions through tailored presentations and demos.
- Identify and describe solution options, evaluate pros and cons, and make recommendations regarding best solution strategies and estimated opportunity timelines.
- Create business requirements documents, process maps, high-level solution design diagrams, and scopes of work for various business implementation scenarios and solution designs.
- Shape conversations with customers by guiding deep cross-functional discovery based on industry best practices, customer research, and SAP customer experiences across SAP solutions.
- Stay updated on SAP product roadmaps and trends, leveraging this knowledge to influence presales strategies.
- Deliver customized demos, workshops, and presentations to illustrate the value of SAP's cloud portfolio.
- Collaborate with SAP's technical and implementation teams to ensure a seamless transition from presales to delivery.
- Monitor market and competitive trends in the SAP cloud landscape to keep the presales strategy relevant.
- Train other members of the presales team on the technical aspects of the service offerings and products.

Position's General Duties and Tasks:
- 10+ years of experience in SAP S/4HANA FICO implementation, solution architecture, and SAP consulting.
- Strong knowledge of SAP ERP, S/4HANA, and other SAP Finance solutions.
- Deep understanding of SAP S/4HANA Finance concepts, including General Ledger (GL), Accounts Payable (AP), Accounts Receivable (AR), Asset Accounting (AA), Controlling (CO), and Profitability Analysis (PA).
- Excellent communication, presentation, and client relationship skills.
- Ability to translate technical details into business language and value-driven solutions.
- Experience in industries like manufacturing or Life Sciences.
- Bachelor's degree in Computer Science, Engineering, Business, or a related field.

Location: Hyderabad, Bangalore

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies