5.0 - 10.0 years
4 - 7 Lacs
bengaluru
Work from Office
We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
- Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing
- Ingesting vast amounts of identity and event data from our customers and partners
- Facilitating data transfers across systems
- Ensuring the integrity and health of our datasets
- And much more

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Using technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, and FastAPI daily
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the roadmap, and taking ownership of and responsibility for new projects
- Participating in the on-call rotation in your time zone (being available by phone or email in case something goes wrong)

Desired Characteristics: Minimum 5-10 years of software engineering experience. Proven long-term experience with, and enthusiasm for, distributed data processing at scale, and eagerness to learn new things.
- Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Fluency in Python, or solid experience in Scala or Java
- Proficiency with relational databases and advanced SQL
- Expertise with services such as Spark and Hive
- Experience with web frameworks such as Flask and Django
- Experience with a scheduler such as Apache Airflow, Apache Luigi, or Chronos
- Experience with Kafka or another stream message processing solution
- Experience using cloud services (AWS) at scale
- Experience with agile software development processes
- Excellent interpersonal and communication skills

Nice to have:
- Experience with large-scale / multi-tenant distributed systems
- Experience with columnar / NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase
- Experience with real-time streaming frameworks: Flink, Storm
- Experience with open table formats such as Iceberg, Hudi, or Delta Lake
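The "distributed data processing at scale" this listing centers on is, at its core, the split-apply-combine pattern. A minimal, standard-library sketch of that pattern (the `word_count` helper and the thread pool are illustrative only, standing in for what an engine like Spark does with partitions spread across a cluster):

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(partition):
    # "Map" step: each task counts words within its own partition.
    return Counter(partition.split())

def word_count(lines, workers=4):
    # Split-apply-combine: process partitions independently in
    # parallel, then merge the partial results. Engines like Spark
    # apply the same pattern with machines instead of threads.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(count_words, lines)
    total = Counter()
    for partial in partials:  # "Reduce" step: merge partial counts.
        total += partial
    return dict(total)
```

Because each partition is processed independently, the combine step is the only point of coordination, which is what lets the pattern scale horizontally.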
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a Senior Software Engineer in the Software department, you will be responsible for the following key tasks:
- Demonstrating proven work experience as a backend developer
- Utilizing your hands-on experience with Java, specifically Java 8 as a requirement; knowledge of Java 11/17 will be an added advantage
- Showing a strong understanding of Spring Framework and Spring Boot, and proficient skills in RESTful API design
- Leveraging AI-assisted development tools like GitHub Copilot and ChatGPT to improve code quality, speed up development, and automate repetitive tasks
- Employing AI models and frameworks such as Llama for natural language understanding and generation specific to product features
- Implementing and enhancing AI inference using Groq hardware accelerators to optimize performance-critical workloads
- Utilizing LangSmith or similar AI workflow management tools to create, monitor, and enhance AI model pipelines and integrations
- Having experience with advanced SQL and PL/SQL, and familiarity with version control systems, especially Git
- Ensuring compliance with coding conventions and industry best practices
- Possessing exceptional analytical and debugging skills

The ideal candidate profile for this role includes:
- Previous experience working with AI-powered development environments and tools to increase productivity and foster innovation
- A strong interest in keeping abreast of AI trends and integrating AI-driven solutions into software products
- Proficiency in Java, Spring Boot, advanced SQL, and PL/SQL for intricate data querying and database programming
- Familiarity with containerization using Docker, Kubernetes (K8s), and GCP

If you are excited about the opportunity to work with cutting-edge technologies and contribute to the development of innovative software solutions, this role might be the perfect fit for you.
Posted 4 days ago
9.0 - 13.0 years
0 Lacs
punjab
On-site
As a Salesforce Marketing Cloud Consultant in Sydney with over 9 years of experience, you are expected to have strong domain knowledge of the Salesforce Marketing Cloud platform. Your responsibilities will include integrating Marketing Cloud with Sales/Service Cloud and other external data systems for data push/pull, including CRMs, ERPs, eCommerce platforms, Google Analytics, and SMS. You should have a solid understanding of relational data models, SOAP APIs, REST APIs, and integration techniques, along with advanced SQL skills.

Your role will involve designing Marketing Cloud journeys and campaigns based on data dependencies. Proficiency in Email Studio, Journey Builder, and campaign management is essential, including data configuration, audience creation, and utilizing SFMC platform capabilities. You should be adept at AMPscript for email personalization and at creating complex CloudPages with AMPscript. A technical background with a history of understanding complex systems is required, along with the ability to work both independently and collaboratively in a team environment. Strong communication skills and team handling capabilities are crucial. Possession of the Salesforce Marketing Cloud Consultant certification is mandatory for this role. Your experience in Email Studio, Journey Builder, Automation Studio, Web Studio/CloudPages, AMPscript, and the Marketing Cloud APIs (REST and SOAP) will be valuable.

Key responsibilities include enabling and executing marketing automations, testing marketing automations with dynamic content, and designing and optimizing campaigns with strategies like A/B testing and throttled sending. You will also be responsible for personalizing marketing campaigns, building and executing email campaigns using Content Builder, and leveraging AMPscript for dynamic content. Subscriber and data management tasks, working with Data Extensions, profile management, and relational data models are also part of your responsibilities.

Additionally, you should be able to design mobile-responsive creatives, create microsites using ExactTarget, and configure ExactTarget Microsite Activity. Knowledge of UI and front-end web technologies like SSJS, HTML, CSS, and JavaScript/jQuery will be considered a value addition to your role as a Salesforce Marketing Cloud Consultant.
Posted 6 days ago
1.0 - 5.0 years
0 Lacs
pune, maharashtra
On-site
The ideal candidate should have 1 to 2 years of experience providing technical training and mentoring. You must possess a strong understanding of data analytics and hands-on experience with Python, advanced Python, R, SAS, and machine learning. Proficiency in SQL and advanced SQL is essential, along with a basic understanding of statistics. Knowledge of operating systems such as GNU/Linux and of networking fundamentals is required. Additionally, familiarity with MS Office applications (Excel, Word, PowerPoint) is necessary. The candidate should be self-motivated, technology-driven, possess excellent analytical and logical skills, and be a good team player. Exceptional communication and presentation skills are a must; good aptitude skills are preferred.

Responsibilities include quickly grasping new technologies and effectively training other employees. You should be capable of resolving technical queries, conducting training sessions, and ensuring placement-driven quality in the training process. The candidate should be able to work independently without constant supervision and actively participate in reviews and meetings.

This is a full-time position with a day shift schedule. The work location is in person.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
You are a Mid-Level ETL Tester / SQL / Python Data Quality Engineer with 4-8 years of experience, specializing in data quality, functional testing, and advanced SQL. Your role at EIL Global, a prominent IT services provider based in Adelaide, Australia, involves ensuring data integrity and accuracy across systems in Chennai or Pune. You will be required to work on-site three days a week.

You must have strong proficiency in Python or Java for automating testing processes and scripts. Proven experience in functional testing methodologies and advanced SQL skills are essential for efficiently extracting, manipulating, and validating large amounts of data. Experience with CI/CD pipelines, Selenium for automated web application testing, and Cucumber for behavior-driven development is crucial.

As a Data Quality Engineer, you will collaborate with cross-functional teams to understand data requirements, conduct thorough functional testing, develop robust test scripts using Python and SQL, and implement CI/CD practices. Your responsibilities include monitoring and maintaining data quality standards, documenting testing activities, and continuously enhancing testing strategies for optimal data assurance outcomes. The interview process comprises an L1 interview, a client interview, DIGI, and an HR interview, followed by an offer and onboarding.

Your expertise in advanced SQL topics like window functions, common table expressions, subqueries, analytical functions, full-text search, hierarchical queries, and optimization techniques is highly valued in this role. Join EIL Global to play a vital role in ensuring high standards of accuracy and consistency in data management.
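Two of the "advanced SQL" topics this listing names, common table expressions and window functions, fit in a few lines. A minimal sketch using Python's bundled sqlite3 driver (window functions require SQLite 3.25+, shipped with modern Python builds); the `orders` table is made up for illustration:

```python
import sqlite3

# In-memory table of order amounts per customer (illustrative data).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('alice', 100), ('alice', 300), ('bob', 200), ('bob', 50);
""")

# A CTE feeding a window function: rank each customer's orders by
# amount, then keep only the top order per customer.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer,
               amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer ORDER BY amount DESC
               ) AS rn
        FROM orders
    )
    SELECT customer, amount FROM ranked WHERE rn = 1
    ORDER BY customer
""").fetchall()

print(rows)  # [('alice', 300), ('bob', 200)]
```

The same "top N per group" query is a common ETL-testing building block: a tester can run it against source and target systems and diff the results.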
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a senior data consultant, you will serve as a trusted advisor and subject matter expert, using data to drive actionable business insights. Your role involves leading client engagements, delivering high-quality data solutions, and collaborating with sales teams to create accelerators, architectural artifacts, and pre-sales assets for RFPs.

In this position, you will lead strategic data consulting projects, working directly with clients to design and implement data-driven solutions. You will be responsible for developing and maintaining a library of accelerators, frameworks, and artifacts to support sales proposals and RFP responses. Additionally, you will collaborate with cross-functional teams to ensure successful client delivery and seamless handoffs between sales and project implementation.

Key Skills / Technologies:

Must-Have:
- Data analytics and visualization (Tableau, Power BI, etc.)
- Advanced SQL and data querying skills
- Strong statistical and analytical expertise
- Experience in data integration, cleansing, and modeling
- Excellent communication and stakeholder management skills

Good-to-Have:
- Familiarity with programming languages (Python, R) for advanced analytics
- Knowledge of data warehousing and big data platforms
- Experience in a consulting or client-facing role
- Familiarity with data governance and business intelligence frameworks

Responsibilities:

Client Consulting & Delivery:
- Lead data analysis projects, define client requirements, and deliver actionable insights
- Design data models and visualizations to support strategic decision-making across various business areas
- Advise clients on data management and analytics best practices to optimize business processes

Sales & Pre-Sales Support:
- Develop accelerators, consulting frameworks, and architectural artifacts for RFP responses and sales proposals
- Support the sales team through client presentations, technical workshops, and pre-sales engagements
- Provide expert insights and technical recommendations aligning client needs with technology solutions

Collaboration & Mentoring:
- Work closely with technical, sales, and delivery teams to ensure cohesive strategies and smooth client handoffs
- Mentor junior consultants and share best practices in data analysis and consulting methodologies

Required Qualifications:
- Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, or a related field
- 5+ years of experience in data consulting or analytics roles with client-facing responsibilities
- Demonstrated ability to develop data-driven solutions that enhance business performance
- Strong problem-solving and communication skills supporting both sales and client delivery

Why Join Us:
- Contribute to transforming businesses through data on diverse, high-impact projects
- Drive sales success and ensure high-quality client delivery simultaneously
- Collaborate with a passionate team of experts, with continuous professional growth in a dynamic environment and competitive benefits
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Documentation Specialist, you will be responsible for creating world-class customer-facing documentation that delights and excites customers. Your role involves removing ambiguity by documenting information effectively, leading to increased team efficiency and effectiveness. Your efforts will help convert tacit knowledge into explicit knowledge.

You will manage a full region or multiple customers within a region, owning end-to-end communication and status reporting to both leadership and customers. Your responsibilities include managing your portfolio, estimates, asset projection, unit metrics, tracking CARR (Contracted Annual Recurring Revenue), asset transfers, and cloud costs for fully owned projects. Additionally, you will provide valuable data insights to customers, identify early warning signs of issues, and collaborate with Customer Success stakeholders.

Collaborating effectively with stakeholders, managing escalations, planning transitions, and initiating hiring efforts are key aspects of your role. You will also drive initiatives to achieve the target gross profit margin and CSAT score for your allocated portfolio, while prioritizing work amidst changing timeframes and incomplete information.

Your leadership skills will be crucial in mentoring, grooming, assessing, and providing balanced feedback to your team members. Regular performance discussions and tracking of Individual Development Plans are essential. Additionally, you will act as a backup SEM for another region.
Required Skills:
- Advanced SQL and Unix experience
- Strong ETL and Python support skills
- Hands-on knowledge of analytics tools (Power BI or Tableau)
- Good healthcare knowledge
- Fundamental ITIL expertise
- Proficiency in support processes (SLAs, OLAs, product or application support)
- Project and program management abilities
- Escalation and team management skills
- Problem-solving mindset
- Excellent written and verbal communication skills
- Ambitious and adaptable to work in a flexible startup environment with a focus on achieving goals
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the opportunity to shape a career that is as unique as you are, benefiting from global support, an inclusive culture, and cutting-edge technology to empower you to reach your full potential. Your distinctive voice and perspective are crucial in contributing to EY's continuous improvement. Join us in creating an exceptional experience for yourself and in fostering a better working world for all.

As part of the EY-ER-Regulatory Compliance team, you will play a key role in understanding clients' business requirements and delivering solutions in alignment with EY guidelines and methodologies. In your role as a Regulatory Compliance Senior, you will actively cultivate and enhance both internal and external relationships. Upholding our commitment to quality, you will drive projects to successful completion with high-quality deliverables, enhance operational efficiency, identify and communicate risks to clients and EY senior management, and take the lead on internal initiatives.

We are seeking an ETQ Developer who will be responsible for designing, developing, and maintaining various modules of the ETQ Reliance platform. This role involves implementing system configurations and customizations, utilizing out-of-box features, writing ETQ scripts for complex configurations, and collaborating with cross-functional teams to gather requirements and ensure successful implementation of quality management systems in a regulated environment.

Key Responsibilities:
- Collaborate with stakeholders to gather requirements and define software functionalities
- Design and develop software applications on the ETQ Reliance platform in an Agile team setting
- Configure and customize the ETQ Reliance system to meet business needs
- Conduct unit testing to ensure software quality and performance
- Peer-review code and configurations
- Create and maintain technical documentation, including system configurations and workflows
- Perform code promotions following the defined SDLC process
- Execute test scripts for code promotions
- Provide technical support and training to end users
- Troubleshoot and resolve issues in the production environment
- Collaborate with technical leads and scrum masters to define project scope and deliverables
- Stay updated on the latest ETQ Reliance features and industry trends

Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field
- Proficiency in coding with Python and Java, advanced SQL, and DBMS
- Strong knowledge of the ETQ Reliance platform and its modules
- Excellent problem-solving and analytical skills
- Previous experience as an ETQ Developer, including exposure to various configurations, customizations, system integrations, data migration, and automation in a regulated environment
- Ability to work independently, manage multiple priorities, and follow Agile methodology
- Strong communication and collaboration skills

Good to Have:
- ETQ Reliance Promotion certification
- Intermediate or Advanced ETQ Designer certification

EY is dedicated to building a better working world by creating long-term value for clients, people, and society while fostering trust in the capital markets. Our diverse teams, spread across 150 countries, leverage data and technology to provide assurance and support clients in growth, transformation, and operations across various sectors. With expertise in assurance, consulting, law, strategy, tax, and transactions, EY teams are committed to asking the right questions to address the complex challenges of today's world.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You should have good knowledge of Replication Server (SRS) architecture and a deep understanding of Sybase architecture, including installation, configuration, maintenance, and applying security patches. Your responsibilities will include database shrink operations, SRS configuration, and troubleshooting latency issues. Performance tuning will be a key aspect of your role. Additionally, you should have experience with high availability and disaster recovery planning and solutions. Knowledge of scripting languages for automation is essential. Advanced SQL knowledge is required, including stored procedures, triggers, and complex query optimization techniques. You should also be familiar with backups, restores, and recovery models. Experience with Sybase SQL Anywhere is also relevant to this role.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Analytics Lead at Cummins Inc., you will be responsible for facilitating data, compliance, and environment governance processes for the assigned domain. Your role includes leading analytics projects to provide insights for the business, integrating data analysis findings into governance solutions, and ingesting key data into the data lake while ensuring the creation and maintenance of relevant metadata and data profiles.

You will coach team members, business teams, and stakeholders to find necessary and relevant data, contribute to communities of practice promoting responsible analytics use, and develop the capability of peers and team members within the Analytics Ecosystem. Additionally, you will mentor and review the work of less experienced team members, integrate data from various source systems to build models for business use, and cleanse data to ensure accuracy and reduce redundancy.

Your responsibilities will also involve leading the preparation of communications to leaders and stakeholders, designing and implementing data/statistical models, collaborating with stakeholders on analytics initiatives, and automating complex workflows and processes using tools like Power Automate and Power Apps. You will manage version control and collaboration using GitLab, utilize SharePoint for project management and data collaboration, and provide regular updates on work progress to stakeholders via Jira/Meets.

Qualifications:
- College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required
- This position may require licensing for compliance with export controls or sanctions regulations

Competencies:
- Balancing stakeholders
- Collaborating effectively
- Communicating clearly and effectively
- Customer focus
- Managing ambiguity
- Organizational savvy
- Data Analytics
- Data Mining
- Data Modeling
- Data Communication and Visualization
- Data Literacy
- Data Profiling
- Data Quality
- Project Management
- Valuing differences

Technical Skills:
- Advanced Python
- Databricks, PySpark
- Advanced SQL, ETL tools
- Power Automate
- Power Apps
- SharePoint
- GitLab
- Power BI
- Jira
- Mendix
- Statistics

Soft Skills:
- Strong problem-solving and analytical abilities
- Excellent communication and stakeholder management skills
- Proven ability to lead a team
- Strategic thinking
- Advanced project management

Experience:
- Intermediate level of relevant work experience required
- This is a hybrid role

Join Cummins Inc. and be part of a dynamic team where you can utilize your technical and soft skills to make a significant impact in the field of data analytics.
Posted 3 weeks ago
6.0 - 10.0 years
0 - 3 Lacs
Pune, Chennai
Hybrid
Greetings! We have an opening with one of our clients, Mphasis, for a Scala Developer position.

Role & responsibilities:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field
- Minimum 5 years of professional experience in data engineering, with a strong focus on big data technologies
- Proficiency in Scala for developing big data applications and transformations, especially with Apache Spark
- Expert-level proficiency in SQL; ability to write complex queries, optimize performance, and understand database internals
- Extensive hands-on experience with Apache Spark (Spark SQL, DataFrames, RDDs) for large-scale data processing and analytics
- Solid understanding of distributed computing concepts and experience with the Hadoop ecosystem (HDFS, Hive)
- Experience with building and optimizing ETL/ELT processes and data warehousing concepts
- Strong understanding of data modeling techniques (e.g., star schema, snowflake schema)
- Familiarity with version control systems (e.g., Git)
- Excellent problem-solving, analytical, and communication skills
- Ability to work independently and collaboratively in an Agile team environment

Work mode: Hybrid (3 days)
Location: Pune / Chennai
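The star-schema modeling this listing asks about can be sketched without any big data stack: a fact table whose rows reference dimension tables, joined (denormalized) at query time. A minimal standard-library sketch; the `dim_product` and `fact_sales` tables below are hypothetical:

```python
# Hypothetical star schema: a fact table of sales keyed to a product
# dimension. Joining fact rows to the dimension denormalizes them
# for analysis, e.g. aggregating revenue by product category.
dim_product = {
    1: {"name": "laptop", "category": "electronics"},
    2: {"name": "desk", "category": "furniture"},
}
fact_sales = [
    {"product_id": 1, "qty": 2, "revenue": 2000},
    {"product_id": 2, "qty": 1, "revenue": 300},
    {"product_id": 1, "qty": 1, "revenue": 1000},
]

def revenue_by_category(facts, dim):
    # Dimension lookup by key is effectively a hash join:
    # O(1) per fact row, which is why star schemas aggregate well.
    totals = {}
    for row in facts:
        category = dim[row["product_id"]]["category"]
        totals[category] = totals.get(category, 0) + row["revenue"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
# {'electronics': 3000, 'furniture': 300}
```

In Spark the same shape appears as a DataFrame join of a large fact table against small, often broadcast, dimension tables, followed by a groupBy/aggregate.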
Posted 3 weeks ago
10.0 - 15.0 years
25 - 30 Lacs
Nagpur, Pune
Work from Office
Design and manage scalable, secure, and high-performance database systems aligned with business goals. Optimize performance, ensure data integrity, and implement modern data solutions. Lead cross-functional collaboration.
Posted 3 weeks ago
2.0 - 4.0 years
12 - 15 Lacs
Bengaluru
Hybrid
Role & responsibilities:
- Manage end-to-end data pipelines, ensuring seamless flow and integrity of data from diverse sources to analytical systems
- Collaborate with data scientists, analysts, and business teams to understand data needs and develop efficient solutions
- Implement robust data governance practices to maintain data quality standards and facilitate reliable analysis and reporting
- Conduct thorough data validation procedures to ensure accuracy and reliability of analytical outputs
- Monitor data systems and pipelines, troubleshoot issues, and ensure the continuous availability of data
- Ensure data quality, integrity, and consistency across different data sources and storage systems
- Optimize data flow and storage processes for performance and scalability

Preferred candidate profile

Must have:
- At least 2-4 years of experience in analytics, reporting metrics, and deep-dive analysis
- Strong proficiency with advanced SQL (window functions, DML commands, DDL commands, CTEs, subqueries, etc.)
- Expertise in building end-to-end data pipelines and ETL frameworks and tools
- Ability to write complex queries and a solid understanding of database concepts
- Strong understanding of data modelling, schema design, and database optimization techniques
- Knowledge of version control (e.g., Git) and collaborative development practices
- Exceptional communication and collaboration skills

Nice to have:
- Exposure to the broader analytics ecosystem
- Experience with data lake architectures and big data technologies

Education:
- Bachelor's degree in Computer Science, Engineering, or a related field
- At least 2-4 years of relevant experience in the analytics organizations of large corporates, or in analytics roles at consulting companies
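The data-validation responsibility listed above might look something like the following in practice, a standard-library sketch of quality gates run before loading data into analytical systems. `validate_rows` and its field names are illustrative, not part of any specific framework:

```python
def validate_rows(rows, required, unique_key):
    # Basic data-quality gates: every required field must be present
    # and non-null, and the business key must be unique. Returns a
    # list of human-readable error strings (empty = rows passed).
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                errors.append(f"row {i}: missing {field}")
        key = row.get(unique_key)
        if key in seen:
            errors.append(f"row {i}: duplicate {unique_key}={key}")
        seen.add(key)
    return errors

sample = [{"id": 1, "amount": 10}, {"id": 1, "amount": None}]
print(validate_rows(sample, ["id", "amount"], "id"))
```

A pipeline can fail fast on a non-empty error list, or route the offending rows to a quarantine table for review.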
Posted 3 weeks ago
3.0 - 5.0 years
12 - 15 Lacs
Bengaluru
Hybrid
Role & responsibilities:
- Mine and analyse data to identify patterns and correlations among the various data points
- Perform end-to-end analysis across all digital touchpoints, including data gathering from large and complex data sets, data processing, and analysis
- Conduct in-depth analysis of user behaviour, customer journeys, and other relevant metrics to understand the effectiveness of digital initiatives and identify areas for improvement
- Present findings from analytics and research and make recommendations to business teams

Preferred candidate profile

Must have:
- 3-5 years of experience in analytics, reporting metrics, and deep-dive analysis
- Strong proficiency with advanced SQL (window functions, DML and DDL commands, CTEs, subqueries, etc.) for data analysis and building end-to-end data pipelines
- Ability to write complex queries and an understanding of database concepts
- Strong analytical problem-solving skills and an aptitude for learning quickly
- Expertise in data analysis and presentation skills
- Exceptional communication and collaboration skills
- Critical thinking and the ability to think beyond the obvious

Nice to have:
- Experience with web analytics tools (Adobe Omniture, Google Analytics, etc.)
- Experience with programming languages like Python and Unix shell for data pipeline automation and analysis
- Knowledge of statistics concepts and machine learning algorithms such as regression and clustering

Education:
- Bachelor's degree with post-graduation in management science or a related field
- 2-4 years of relevant experience in the analytics organizations of large corporates, or in analytics roles at consulting companies
Posted 3 weeks ago
6.0 - 8.0 years
8 - 18 Lacs
Pune
Hybrid
Job Title: Lead ETL Developer
Job Location: Pune

Company Introduction:
Join Nitor Infotech, an Ascendion company, where we harness data to drive impactful solutions. Our innovative team is dedicated to excellence in data processing and analytics, making a significant difference in the retail domain. Be part of a collaborative environment that values your expertise and contributions.

Job Overview:
We are seeking an ETL Developer with expertise in advanced SQL, Python, and shell scripting. This full-time position reports to the Data Engineering Manager and is available in a hybrid work model. This is a replacement position within the SRAI - EYC Implementation team.

Key Responsibilities:
- Design and develop ETL processes for data extraction, transformation, and loading
- Utilize advanced SQL for data processing and analysis
- Implement data processing solutions using Python and shell scripting
- Collaborate with cross-functional teams to understand data requirements
- Maintain and optimize data pipelines for performance and reliability
- Provide insights and analysis to support business decisions
- Ensure data quality and integrity throughout the ETL process
- Stay updated on industry trends and best practices in data engineering

Must-Have Skills and Qualifications:
- 7-8 years of experience as an ETL developer
- Expertise in advanced SQL for data manipulation and analysis
- Proficiency in Python and shell scripting
- Foundational understanding of Databricks and Power BI
- Strong logical problem-solving skills
- Experience in data processing and transformation
- Understanding of the retail domain is a plus

Good-to-Have Skills and Qualifications:
- Familiarity with cloud data platforms (AWS, Azure)
- Knowledge of data warehousing concepts
- Experience with data visualization tools
- Understanding of Agile methodologies

What We Offer:
- Competitive salary and comprehensive benefits package
- Opportunities for professional growth and advancement
- Collaborative and innovative work environment
- Flexible work arrangements
- Impactful work that drives industry change

DEI Statement:
At Nitor Infotech, we embrace diversity and inclusion. We actively foster an environment where all voices are heard and valued.

ISMS Statement:
Nitor Infotech maintains ISO 27001 certification. All employees must adhere to our information security policies.
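The extract-transform-load cycle at the heart of this role can be sketched end to end with only the standard library. A minimal example; the CSV layout and the `sales` target table are invented for illustration:

```python
import csv
import io
import sqlite3

def etl(csv_text, conn):
    # Extract: parse the raw CSV into dict rows.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: normalize names, cast amounts, drop malformed rows.
    clean = [
        (r["name"].strip().lower(), int(r["amount"]))
        for r in rows
        if r["amount"].strip().isdigit()
    ]
    # Load: write the transformed rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
    return len(clean)

conn = sqlite3.connect(":memory:")
raw = "name,amount\n Alice ,100\nBob,oops\nCarol,250\n"
print(etl(raw, conn))  # 2 -- the malformed 'Bob,oops' row is dropped
```

Real pipelines add the pieces the listing mentions on top of this skeleton: scheduling, incremental loads, error routing, and data-quality checks between each stage.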
Posted 3 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As a Databricks Engineer (Lead), you will be responsible for designing and developing ETL pipelines using Azure Data Factory for data ingestion and transformation. You will work with various Azure stack modules, such as Data Lakes and SQL Data Warehouse, to create robust data solutions. Your role will involve writing efficient SQL, Python, and PySpark code for data processing and transformation. It is essential to understand and translate business requirements into technical designs, develop mapping documents, and adhere to transformation rules as per the project scope. Effective communication with stakeholders to ensure smooth project execution is a crucial aspect of this role.

To excel in this position, you should have 7-10 years of experience in data ingestion, data processing, and analytical pipelines involving big data and relational databases. Hands-on experience with Azure services such as Azure Data Lake Storage, Azure Databricks, Azure Data Factory, Azure Synapse Analytics, and Azure SQL Database is required. Proficiency in SQL, Python, and PySpark for data manipulation is essential. Familiarity with DevOps practices and CI/CD deployments is a plus. Strong communication skills and attention to detail, especially in high-pressure situations, are highly valued. Previous experience in the insurance or financial industry is preferred.

This role is based in Hyderabad and requires the selected candidate to work from the office. If you are passionate about leveraging Databricks, PySpark, SQL, and other Azure technologies to drive innovative data solutions, this position offers an exciting opportunity to lead and contribute to impactful projects.
Posted 4 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
This role is a part of the upcoming ValueMomentum Data Engineering Recruitment Drive on July 19th. As a Tech Lead-Modern Data, you will be responsible for leading data integration processes utilizing Informatica IICS. With 7-10 years of experience, you will be involved in designing, developing, and managing ETL/ELT processes. Your role will entail close collaboration with cross-functional teams to ensure that data solutions not only meet business needs but also align with industry best practices. Joining ValueMomentum's Engineering Center means becoming part of a team of passionate engineers dedicated to addressing complex business challenges with innovative solutions. Our focus on transforming the P&C insurance value chain relies on a strong engineering foundation and continuous refinement of processes, methodologies, tools, agile delivery teams, and core engineering archetypes. With expertise in Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and domain expertise, we are committed to investing in your growth through our Infinity Program, empowering you to build your career with role-specific skill development using immersive learning platforms. As a Tech Lead, your responsibilities will include designing and implementing data integration processes using Informatica IICS, and constructing mappings, tasks, task flows, schedules, and parameter files. You will ensure adherence to ETL/ELT best practices, create ETL mapping documentation, and collaborate with stakeholders to understand data requirements and implement solutions. Supporting activities such as ticket creation and resolution in Jira/ServiceNow, working in an Agile/DevOps environment, and ensuring timely delivery of solutions are key aspects of this role. To be successful in this position, you should have at least 7 years of experience in Informatica, with a minimum of 2 years in Informatica IICS.
Strong experience in ETL tools and database design, a good understanding of Agile methodologies, experience working in onsite/offshore models, and experience in the insurance or financial industry are preferred. Strong problem-solving and analytical skills, attention to detail in high-pressure situations, and excellent verbal and written communication skills are essential requirements. ValueMomentum is a leading solutions provider for the global property and casualty insurance industry. With a focus on helping insurers achieve sustained growth and high performance, the company enhances stakeholder value and fosters resilient societies. Having served over 100 insurers, ValueMomentum stands as one of the largest services providers exclusively dedicated to the insurance industry. At ValueMomentum, we offer a congenial environment for your professional growth, surrounded by experienced professionals. Benefits available to you include a competitive compensation package, individual career development through coaching and mentoring programs, comprehensive training and certification programs, performance management tools such as goal setting, continuous feedback, and year-end appraisal, as well as rewards and recognition for outstanding performers.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
The Associate Manager, Compliance Analytics supports the development and implementation of compliance analytics across the global compliance organization. You will be responsible for designing and developing analytical solutions, generating insights, and fostering a culture of ethics and integrity. Your role will involve transforming data from multiple systems, designing the analytics approach, and executing the analytics. Your responsibilities will include designing, developing, and maintaining analytics to provide business insights, enabling risk-intelligent decisions, informing compliance risk assessment processes, and supporting remediation activities. You will be involved in scoping, gathering requirements, developing analytics models, validating and testing models, and communicating results to stakeholders for analytics projects. Additionally, you will support the execution of analytics initiatives to enhance the Global Compliance program and provide actionable insights through data and analysis to achieve business objectives. Furthermore, you will contribute to the development of standardized analytics processes and frameworks, including documentation and validation of work. You will also be responsible for setting up sustainable analytics solutions, including data pipelines and automated refresh schedules as necessary. Collaboration with compliance officers, IT, and stakeholders to understand business objectives and provide reliable and accurate reports, insights, and analysis to inform decision-making is an essential aspect of this role. As part of your duties, you will lead a culture of continuous improvement by enhancing existing databases, data collection methods, statistical methods, technology, procedures, and training. You will partner with data custodians and process experts to ensure the quality of data and definitions to support the building of reliable data models and analysis. 
Additionally, coaching and developing junior team members will be a key component of this role. To qualify for this position, you should have 8+ years of relevant work experience in Python, Advanced SQL, R, Azure, Databricks, PySpark, and Power BI. Strong knowledge of and experience with advanced analytics tools and languages to analyze large data sets from multiple sources are required. A BTech in Computer Science or IT, an MSc in Mathematics/Statistics, or an equivalent degree from an accredited university is necessary. You should possess a strong understanding of algorithms, mathematical models, statistical techniques, and data mining, along with experience implementing statistical and machine learning models. Experience with analyzing accounting and other financial data, as well as a demonstrated ability to exercise discretion and maintain confidentiality, are essential. Knowledge of and experience with data transformation and cleansing, data modeling, and database concepts are also preferred.
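As one concrete illustration of the compliance analytics described (flagging unusual entries in financial data), here is a minimal z-score outlier check using only the Python standard library; the expense figures and threshold are invented for the example, not a production model:

```python
# A toy compliance-analytics check: flag expense entries whose amounts
# deviate strongly from the rest of the population.
from statistics import mean, stdev

def flag_outliers(amounts, z_threshold=2.0):
    """Return indices of amounts more than z_threshold std devs from the mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [i for i, a in enumerate(amounts) if abs(a - mu) > z_threshold * sigma]

expenses = [120, 135, 110, 128, 131, 125, 9500, 119]
print(flag_outliers(expenses))  # the 9500 entry is flagged: [6]
```

In practice such a rule would be one of many signals feeding the risk assessment, validated and tuned before being communicated to stakeholders.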
Posted 1 month ago
8.0 - 12.0 years
18 - 27 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Job Description: Design, implement, and maintain data pipelines and data integration solutions using Azure Synapse. Develop and optimize data models and data storage solutions on Azure. Collaborate with data scientists and analysts to implement data processing and transformation tasks. Ensure data quality and integrity through data validation and cleansing methodologies. Monitor and troubleshoot data pipelines to identify and resolve performance issues. Collaborate with cross-functional teams to understand and prioritize data requirements. Stay up to date with the latest trends and technologies in data engineering and Azure services. Skills & Qualifications: Bachelor's degree in IT, computer science, computer engineering, or similar. 8+ years of experience in Data Engineering. Microsoft Azure Synapse Analytics experience is essential (Azure Data Factory, Dedicated SQL Pool, Lake Database, Azure Storage). Hands-on experience with Spark notebooks (Python or Scala) is mandatory. End-to-end Data Warehouse experience: ingestion, ETL, big data pipelines, data architecture, message queuing, BI/reporting, and data security. Advanced SQL/relational database knowledge and query authoring. Demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehousing. Strong skills in handling and analysing complex, high-volume data with excellent attention to detail. Knowledge of data modelling and data warehousing concepts, such as Data Vault or 3NF. Experience with Data Governance (quality, lineage, data dictionary, and security). Familiarity with Agile methodology and an Agile working environment. Ability to work independently with POs, BAs, and Architects.
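The data validation and cleansing step mentioned above might look something like this minimal plain-Python sketch (in Synapse it would typically run as a Spark notebook; the column names are hypothetical):

```python
# Illustrative validation-and-cleansing step: reject rows missing required
# fields, silently drop duplicate keys. Column names are invented.
def cleanse(rows, key="customer_id", required=("customer_id", "email")):
    """Return (clean rows, rejected rows)."""
    seen, clean, rejected = set(), [], []
    for row in rows:
        if any(row.get(col) in (None, "") for col in required):
            rejected.append(row)          # fails validation
        elif row[key] not in seen:        # first occurrence of this key
            seen.add(row[key])
            clean.append(row)
    return clean, rejected

rows = [
    {"customer_id": 1, "email": "a@x.com"},
    {"customer_id": 1, "email": "a@x.com"},  # duplicate: dropped
    {"customer_id": 2, "email": ""},         # missing email: rejected
]
clean, rejected = cleanse(rows)
print(len(clean), len(rejected))  # 1 1
```

Keeping rejected rows rather than discarding them lets the pipeline report data-quality issues back to the source system.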
Posted 1 month ago
2.0 - 7.0 years
8 - 18 Lacs
Gurugram
Work from Office
About this role: This is a career opportunity for a Data Analyst within the Chief Data and Analytics group. The role supports business processes (including but not limited to Enterprise, Account, and Person data management) with a focus on building data management and analytics solutions to improve Data Quality at scale. What you'll do: Analyse data to define and create data quality metrics that support strategic decision making. Drive data quality initiatives to improve the accuracy of the client data that's leveraged in critical business decisions throughout the enterprise, while curating data insights that will ultimately improve the transparency and value of data across Gartner. Complete analysis and interpret results using a variety of techniques, ranging from simple data aggregation to more complex statistical analysis. Create executive-level dashboards to present data quality metrics. Collaborate across business and technical teams, both onsite and offshore, to create business deliverables such as data flow diagrams, business requirements, and functional requirements. Develop an understanding of the relevant business areas, technical options, limitations, costs, and risks in order to communicate trade-offs and recommend solutions or suggest alternatives to proposed solutions. Independently drive critical data workshops with business and IT stakeholders to develop requirements and execution processes while managing dependencies across teams. Collaborate with and support the team's data scientists with the right data insights to help them build AI models for better data quality. What you'll need: 2-4 years of experience as a Data Analyst; prior experience working in a Data Warehousing, Business Intelligence, Master Data Management, or Analytics environment is a plus. Advanced SQL skills are mandatory. Strong Python skills are required. Dashboard-building experience (Power BI, Tableau) is required to present data insights. Strong analytical, strategic thinking, and problem-solving skills, including the ability to clearly and concisely gather, interpret, analyze, and document business process, user, and data functional requirements in a structured way. An understanding of data analysis tools and techniques is required. Ability to break down complex business problems and workflows into meaningful components that are understandable at various levels. Well versed in tools such as MS Excel, MS Word, JIRA, and Confluence. Knowledge of and experience with Scrum/Agile methodology. Expert-level oral and written communication with both technical and non-technical personnel. Who you are: Effective time management skills and the ability to meet deadlines. Excellent communication skills when interacting with technical and business audiences. Excellent organization, multitasking, and prioritization skills. A willingness and aptitude to embrace new technologies and ideas and master concepts rapidly. Intellectual curiosity and a passion for technology and keeping up with new trends. A track record of delivering project work on time, within budget, and with high quality.
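A data quality metric of the kind described, column completeness, can be computed directly in SQL. A small illustrative sketch using SQLite with a hypothetical accounts table:

```python
# Completeness = share of non-null values per column, a common DQ metric.
# Table and column names are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, name TEXT, industry TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?, ?)", [
    (1, "Acme",    "Manufacturing"),
    (2, "Globex",  None),
    (3, None,      None),
    (4, "Initech", "Software"),
])

# COUNT(col) counts only non-null values, so the ratio is completeness.
row = conn.execute("""
    SELECT ROUND(100.0 * COUNT(name)     / COUNT(*), 1),
           ROUND(100.0 * COUNT(industry) / COUNT(*), 1)
    FROM accounts
""").fetchone()
print(f"name: {row[0]}% complete, industry: {row[1]}% complete")
```

The same query shape scales to a scheduled job whose output feeds a dashboard trend line per column.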
Posted 1 month ago
2.0 - 7.0 years
3 - 7 Lacs
Bengaluru
Work from Office
About the Team: As Business Analysts, it's on us to dive into data and derive insights from it. These then become actionable solutions in the form of changes, improvements, upgrades, and new features. As a Business Analyst at Meesho, you will play a crucial role in identifying, improving, and developing technology solutions that drive our strategic goals. This is a tremendous opportunity to learn about high-priority initiatives and collaborate with colleagues throughout the firm and across teams. We work at the intersection of business and technology, continuously developing our leadership, management, and communication skills in the process. The exact team you will be working with will be decided during or after the hiring process. Regardless, you are sure to learn and grow, and have fun doing so too. Each of our teams at Meesho has its own fun rituals, from casual catch-ups to bar hopping, movie nights, and games. So, join us! About the Role: As a Senior Business Analyst, you will work on improving the reporting tools, methods, and processes of the team you are assigned to. You will also create and deliver weekly, monthly, and quarterly metrics critical for tracking and managing the business. You will manage numerous requests concurrently and strategically, prioritising them when necessary. You will actively engage with internal partners throughout the organisation to meet and exceed customer service levels and transport-related KPIs. You will brainstorm simple, scalable solutions to difficult problems, and seamlessly manage projects under your purview. You will maintain excellent relationships with our users and, in fact, advocate for them while keeping in mind the business goals of your team.
What you will do: Create algorithms for optimizing demand and supply data. Conduct analysis and solution-building based on insights captured from data. Give insights to management and help with strategic planning. Analyze metrics, key indicators, and other available data sources to discover the root causes of process defects. Support business development and help create efficient designs and solution processes. Determine efficient utilization of resources. Research and implement cost-reduction opportunities. Must-have skills: MBA in any discipline. 2+ years of experience as a Business Analyst. Proficiency in Advanced Excel, Advanced SQL, and Python (all must-haves). Understanding of basic statistics and probability concepts. Proven problem-solving skills.
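As a toy illustration of the demand/supply optimization mentioned above, here is a greedy allocation of limited supply across per-city demand; the cities and quantities are invented for the example:

```python
# Greedy allocation: fill the largest demands first until supply runs out.
# A deliberately simple stand-in for the kind of algorithm the role describes.
def allocate(supply, demand):
    """Return a per-city allocation given total supply and per-city demand."""
    allocation = {}
    for city, qty in sorted(demand.items(), key=lambda kv: -kv[1]):
        allocation[city] = min(qty, supply)
        supply -= allocation[city]
    return allocation

print(allocate(100, {"Bengaluru": 60, "Pune": 50, "Nashik": 20}))
# {'Bengaluru': 60, 'Pune': 40, 'Nashik': 0}
```

Real demand planning would weigh cost and service levels rather than raw volume, but the shape of the problem is the same.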
Posted 2 months ago
3.0 - 8.0 years
16 - 30 Lacs
Bengaluru
Work from Office
Role & responsibilities: Design, develop, and optimize complex SQL queries, stored procedures, and data models for Oracle-based systems. Create and maintain efficient data pipelines for extract, transform, and load (ETL) processes using Informatica or Python. Implement data quality controls and validation processes to ensure data integrity. Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications. Document database designs, procedures, and configurations to support knowledge sharing and system maintenance. Troubleshoot and resolve database performance issues through query optimization and indexing strategies. Preferred candidate profile: 3+ years of experience with Oracle databases, including advanced SQL and PL/SQL development. Strong knowledge of data modelling principles and database design. Proficiency with Python for data processing and automation. Experience implementing and maintaining data quality controls.
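The indexing strategies mentioned for performance troubleshooting can be demonstrated in miniature. This sketch uses SQLite standing in for Oracle (the table is hypothetical), comparing the query plan before and after adding an index:

```python
# Same query, two plans: a full scan before the index, an index search after.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, 10.0) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = 42"
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

print(before)  # a full table scan of orders
print(after)   # a search using idx_orders_customer
```

In Oracle the equivalent diagnosis would use EXPLAIN PLAN and the optimizer's execution plan, but the before/after discipline is identical.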
Posted 2 months ago
4.0 - 9.0 years
8 - 18 Lacs
Navi Mumbai, Pune, Mumbai (All Areas)
Hybrid
Job Overview: We are seeking a highly skilled Data Engineer with expertise in SQL, Python, Data Warehousing, AWS, Airflow, ETL, and Data Modeling. The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines, ensuring efficient data processing and integration across various platforms. This role requires strong problem-solving skills, an analytical mindset, and a deep understanding of modern data engineering frameworks. Key Responsibilities: Design, develop, and optimize scalable data pipelines and ETL processes to support business intelligence, analytics, and operational data needs. Build and maintain data models (conceptual, logical, and physical) to enhance data storage, retrieval, and transformation efficiency. Develop, test, and optimize complex SQL queries for efficient data extraction, transformation, and loading (ETL). Implement and manage data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) for structured and unstructured data storage. Work with AWS, Azure, and cloud-based data solutions to build high-performance data ecosystems. Utilize Apache Airflow for orchestrating workflows and automating data pipeline execution. Collaborate with cross-functional teams to understand business data requirements and ensure alignment with data strategies. Ensure data integrity, security, and compliance with governance policies and best practices. Monitor, troubleshoot, and improve the performance of existing data systems for scalability and reliability. Stay updated with emerging data engineering technologies, frameworks, and best practices to drive continuous improvement. Required Skills & Qualifications: Proficiency in SQL for query development, performance tuning, and optimization. Strong Python programming skills for data processing, automation, and scripting. Hands-on experience with ETL development, data integration, and transformation workflows. Expertise in data modeling for efficient database and data warehouse design. Experience with cloud platforms such as AWS (S3, Redshift, Lambda), Azure, or GCP. Working knowledge of Airflow or similar workflow orchestration tools. Familiarity with Big Data frameworks like Hadoop or Spark (preferred but not mandatory). Strong problem-solving skills and the ability to work in a fast-paced, dynamic environment.
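Airflow, mentioned above, resolves task execution order from declared dependencies. The same idea can be sketched with Python's standard-library graphlib; the task names are hypothetical:

```python
# A DAG as a dict: task -> set of tasks that must run first.
# extract feeds two transforms; both transforms feed the load step.
from graphlib import TopologicalSorter

dag = {
    "transform_orders":    {"extract"},
    "transform_customers": {"extract"},
    "load_warehouse":      {"transform_orders", "transform_customers"},
}
order = list(TopologicalSorter(dag).static_order())
print(order)  # extract first, load_warehouse last
```

Airflow adds scheduling, retries, and parallel execution on top, but the dependency resolution at its core is exactly this topological sort.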
Posted 2 months ago
5.0 - 10.0 years
4 - 7 Lacs
Bengaluru
Work from Office
We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include: developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing; ingesting vast amounts of identity and event data from our customers and partners; facilitating data transfers across systems; ensuring the integrity and health of our datasets; and much more. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations. Essential Responsibilities: As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Engineers to optimize data models and workflows; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the roadmap, and taking ownership of and responsibility for new projects; and participating in the on-call rotation in your time zone (being available by phone or email in case something goes wrong). Desired Characteristics: A minimum of 5-10 years of software engineering experience. Proven long-term experience with and enthusiasm for distributed data processing at scale, and an eagerness to learn new things.
Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle, from inception to production and monitoring. Fluency in Python, or solid experience in Scala or Java. Proficiency with relational databases and advanced SQL. Expert use of services such as Spark and Hive. Experience with web frameworks such as Flask and Django. Experience with a scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience with Kafka or other stream message processing solutions. Experience using cloud services (AWS) at scale. Experience with agile software development processes. Excellent interpersonal and communication skills. Nice to have: Experience with large-scale / multi-tenant distributed systems. Experience with columnar / NoSQL databases such as Vertica, Snowflake, HBase, Scylla, and Couchbase. Experience with real-time streaming frameworks such as Flink and Storm. Experience with open table formats such as Iceberg, Hudi, or Delta Lake.
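As a toy illustration of the real-time side of this stack (in production, Kafka feeding a stream processor such as Flink or Spark), here is a tumbling-window event count in plain Python; the events, timestamps, and window size are invented:

```python
# Tumbling-window aggregation: each event lands in the window whose start
# is its timestamp rounded down to a multiple of window_secs.
from collections import Counter

def window_counts(events, window_secs=60):
    """Count events per (window_start, event_type) tumbling window."""
    counts = Counter()
    for ts, event_type in events:
        window_start = ts - (ts % window_secs)
        counts[(window_start, event_type)] += 1
    return counts

events = [(5, "click"), (30, "view"), (61, "click"), (62, "click"), (130, "view")]
print(window_counts(events))
```

A real stream processor adds event-time watermarks and late-data handling on top, but the windowing arithmetic is the same.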
Posted 2 months ago
3.0 - 8.0 years
8 - 14 Lacs
Nashik
Work from Office
Architect and develop apps (.NET Core, Angular, SQL Server). Design and optimize database schemas. Ensure coding/testing best practices. Conduct code reviews. Collaborate with cross-functional teams. Stay updated with new tech. Troubleshoot complex issues.
Posted 2 months ago