8.0 - 12.0 years
0 Lacs
haryana
On-site
As an Assistant Vice President, Data Engineering Expert at Analytics & Information Management (AIM) in Gurugram, you will play a crucial role in leading the Data/Information Management Team. You will drive the development and implementation of data analytics solutions that support key business objectives for Legal Operations within the COO (Chief Operating Office), build and manage high-performing teams, deliver impactful insights, and foster a data-driven culture within the organization.

In this role, you will support Business Execution and Legal Data & Reporting activities for the Chief Operating Office by implementing data engineering solutions to manage banking operations. This involves establishing monitoring routines, scorecards, and escalation workflows, and overseeing Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques. Additionally, you will enable proactive issue detection, implement a governance framework, and interface between business and technology partners to digitize data collection. You will also communicate findings and recommendations to senior management, stay current with the latest trends in analytics, ensure compliance with data governance policies, and set up a governance operating framework to enable operationalization of data domains.

To excel in this role, you should have at least 8 years of experience in Business Transformation Solution Design roles with proficiency in tools and technologies such as Python, PySpark, Tableau, MicroStrategy, and SQL. A strong understanding of Data Transformation, Data Strategy, Data Architecture, Data Tracing & Lineage, and Database Management & Optimization is essential, and experience with AI solutions, banking operations, and regulatory requirements related to data privacy and security will be beneficial. A Bachelor's/University degree in STEM is required, with a Master's degree preferred. Your ability to work as a senior member of a team of data engineering professionals and to manage end-to-end conceptualization and implementation of data strategies will be critical to success.

If you are excited about the opportunity to lead a dynamic Data/Information Management Team and drive impactful insights through data analytics solutions, we encourage you to apply for this position and be a part of our talented team at AIM, Gurugram.
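The posting highlights data-quality monitoring and scorecards built with Python and PySpark. As an illustration only, here is a minimal sketch of the kind of data-quality check such a scorecard might start from; the table and column names are hypothetical, not taken from the role.

```python
# Minimal sketch: a PySpark data-quality scorecard computing null rates
# per column (the table name is hypothetical, not from the posting).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("legal-ops-dq-scorecard").getOrCreate()

df = spark.table("legal_ops.matters")  # hypothetical source table

# Null-rate per column, expressed as a percentage of total rows.
total = df.count()
null_rates = df.select([
    (F.sum(F.col(c).isNull().cast("int")) / F.lit(total) * 100).alias(c)
    for c in df.columns
])

null_rates.show(truncate=False)
```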
Posted 5 days ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
This is a data engineer position in which you will design, develop, implement, and maintain data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. Your main objective will be to define optimal solutions for data collection, processing, and warehousing, particularly within the banking & finance domain. You must have expertise in Spark Java development for big data processing, Python, and Apache Spark.

You will design, code, and test data systems and integrate them into the internal infrastructure. Your responsibilities will include ensuring high-quality software development with complete documentation, developing and optimizing scalable Spark Java-based data pipelines, designing and implementing distributed computing solutions for risk modeling, pricing, and regulatory compliance, ensuring efficient data storage and retrieval using Big Data technologies, implementing best practices for Spark performance tuning, maintaining high code quality through testing, CI/CD pipelines, and version control, working on batch processing frameworks for market risk analytics, and promoting unit/functional testing and code inspection processes. You will also collaborate with business stakeholders, Business Analysts, and data scientists to understand and interpret complex datasets.

Qualifications:
- 5-8 years of experience working in data ecosystems
- 4-5 years of hands-on experience with Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other Big Data frameworks
- 3+ years of experience with relational SQL and NoSQL databases such as Oracle, MongoDB, and HBase
- Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
- Data integration, migration, and large-scale ETL experience
- Data modeling experience
- Experience building and optimizing big data pipelines, architectures, and datasets
- Strong analytical skills and experience working with unstructured datasets
- Experience with technologies such as Confluent Kafka, Red Hat jBPM, CI/CD build pipelines, Git, Bitbucket, Jira, external cloud platforms, container technologies, and supporting frameworks
- Highly effective interpersonal and communication skills
- Experience with the software development life cycle

Education:
- Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain

This is a full-time position in the Data Architecture job family group within the Technology sector.
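The role centres on tuned Spark batch pipelines for market risk analytics. A minimal sketch of such a pipeline follows, written in PySpark for brevity even though the posting asks for Spark Java; paths, column names, and tuning values are illustrative assumptions.

```python
# Minimal sketch of a tuned batch pipeline (PySpark for brevity; the role
# itself calls for Spark Java). Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("market-risk-batch")
    .config("spark.sql.shuffle.partitions", "200")  # tune to data volume
    .getOrCreate()
)

trades = spark.read.parquet("/data/raw/trades")        # hypothetical inputs
positions = spark.read.parquet("/data/raw/positions")

# Broadcast the smaller side to avoid a full shuffle join.
enriched = trades.join(F.broadcast(positions), "book_id", "left")

daily_risk = (
    enriched
    .groupBy("book_id", "as_of_date")
    .agg(F.sum("exposure").alias("total_exposure"))
)

# Partition output by business date for efficient downstream reads.
daily_risk.write.mode("overwrite").partitionBy("as_of_date").parquet("/data/curated/daily_risk")
```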
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As an AWS Consultant specializing in Infrastructure, Data & AI, and Databricks, you will play a crucial role in designing, implementing, and optimizing AWS infrastructure solutions. You will deliver secure and scalable data solutions using a range of AWS services and platforms, and architect and implement ETL/ELT pipelines, data lakes, and distributed compute frameworks. You will also work on automation and infrastructure as code using tools like CloudFormation or Terraform, and manage deployments through AWS CodePipeline, GitHub Actions, or Jenkins.

Collaboration with internal teams and clients to gather requirements, assess current-state environments, and define cloud transformation strategies will be a key aspect of your role. You will support pre-sales and delivery cycles by contributing to RFPs, SOWs, LOEs, solution blueprints, and technical documentation, and you will ensure best practices in cloud security, cost governance, and compliance.

The ideal candidate for this position has 3 to 5 years of hands-on experience with AWS services, a Bachelor's degree or equivalent experience, and a strong understanding of cloud networking, IAM, security best practices, and hybrid connectivity. Proficiency in Databricks on AWS and experience with data modeling, ETL frameworks, and structured/unstructured data are required. You should also have working knowledge of DevOps tools and processes in the AWS ecosystem, strong documentation skills, and excellent communication abilities to translate business needs into technical solutions.

Preferred certifications for this role include AWS Certified Solutions Architect - Associate or Professional, AWS Certified Data Analytics - Specialty, and Databricks Certified Data Engineer Associate/Professional (a plus).
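The posting mentions infrastructure as code with CloudFormation or Terraform. As a hedged illustration, the sketch below deploys a CloudFormation stack with boto3; the stack name, template file, and parameters are placeholders, not part of the role.

```python
# Minimal sketch: deploying a CloudFormation stack with boto3 (stack name,
# template file, and parameters are hypothetical placeholders).
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("data_lake_stack.yaml") as f:   # hypothetical template
    template_body = f.read()

cfn.create_stack(
    StackName="analytics-data-lake",
    TemplateBody=template_body,
    Parameters=[{"ParameterKey": "Environment", "ParameterValue": "dev"}],
    Capabilities=["CAPABILITY_NAMED_IAM"],
)

# Block until the stack finishes creating.
waiter = cfn.get_waiter("stack_create_complete")
waiter.wait(StackName="analytics-data-lake")
```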
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Elait is a digital data management and cloud solutions provider based in Bengaluru, offering innovative solutions across various industry verticals using leading technologies such as Ab Initio, Microsoft, and Snowflake. Our consultants are experts in data governance, architecture, high-volume data processing, data integration, and more. We have a rich catalog of accelerators and frameworks that guide our customers in data engineering, enabling them to work in agile environments with quicker delivery timelines and cost reductions.

As a Metadata Model Developer at Elait, your responsibilities will include designing, developing, and implementing metadata models within the Ab Initio MD Hub environment. You will configure and customize MD Hub components, build custom extractors for data integration, implement efficient data processing solutions using Ab Initio tools, and ensure data quality and accuracy within the metadata repository. Additionally, you will support data governance initiatives through the implementation of data quality rules, data lineage tracking, and data classification systems, while also troubleshooting and resolving metadata-related issues.

The ideal candidate for this role has 3-6 years of hands-on experience with Ab Initio tools such as GDE (Graphical Development Environment), Express>It, MDH (Metadata Hub), and PDL (Parallel Data Language). Proficiency in MD Hub features and customization, experience building custom data extractors, a strong understanding of data modeling and ETL processes, knowledge of data governance frameworks, and familiarity with DPDP compliance and GDPR are preferred. A Bachelor's or Master's degree in Computer Science, Engineering, or a related field is required for this position.
Posted 5 days ago
3.0 - 7.0 years
0 - 0 Lacs
karnataka
On-site
The ideal candidate for this role should be an immediate joiner; the salary range is 15-20 LPA. Your responsibilities will include the following:

- Design and Development: Gather requirements, design solutions, and create high-level design artifacts.
- Coding: Deliver high-quality code for assigned modules.
- Testing: Lead validation for all types of testing activities.
- Implementation: Support activities related to implementation, transition, and warranty.
- Data Quality: Establish data-quality metrics and requirements, and define policies and procedures for access to data.
- Platform Management: Design and implement MDM solutions, including data models, data integration, data quality, data governance, and data security.
- Documentation: Maintain comprehensive documentation for all service processes and incidents.
- Customer Collaboration: Work with customers on solution brainstorming and solution design.

The ideal candidate should have experience with MDM development, Informatica MDM tools (MDM Hub, Data Director, the provisioning tool, and ActiveVOS), data modeling principles, data integration, and SaaS implementation techniques.
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
You should have a minimum of 2 years of experience in SAP MDG to apply for this position. As an SAP MDG Consultant, you will oversee end-to-end Master Data Governance (MDG) implementations, including design, configuration, development, deployment, and continuous support. The ideal candidate possesses comprehensive functional and technical knowledge of SAP MDG and is adept at driving efficient master data processes across various business domains.

Your key responsibilities in this role will include:
- Demonstrating a strong background in SAP MDG design, configuration, and implementation.
- Being proficient in data modeling, UI/FPM configuration, and workflow design.
- Having hands-on experience with BRF+, DRF, and ABAP enhancements.
- Integrating MDG with SAP ECC, S/4HANA, and other relevant systems.
- Understanding data governance principles, the data lifecycle, and quality frameworks to ensure effective data management.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As an MDM Project Manager, you will play a crucial role in planning, coordinating, and overseeing the implementation of Master Data Management (MDM) projects. Your primary responsibilities will include ensuring that projects are completed on time, within budget, and in alignment with all business requirements. You will manage data governance, data quality, and stakeholder engagement throughout the project lifecycle, collaborating closely with cross-functional teams to define data standards, cleanse data, and integrate MDM solutions with existing systems. Your key duties will span project management, data governance, stakeholder management, technical expertise, and training and support.

In terms of project planning and execution, you will develop detailed project plans, timelines, and budgets for MDM initiatives. You will define project scope, deliverables, and success metrics, manage project risks and mitigation strategies, and monitor project progress, making adjustments as needed to ensure timely delivery.

Regarding data governance and quality, you will establish and enforce data governance policies and procedures to maintain data consistency and accuracy. You will define data quality standards and metrics, lead data cleansing and deduplication activities, and identify and address data quality issues throughout the project lifecycle. Additionally, you will cleanse and standardize master data by applying data quality rules, enriching data with relevant information, and ensuring data integrity.

Stakeholder management will be another critical aspect of your role. You will facilitate communication and collaboration with cross-functional teams, including business stakeholders, IT teams, and data stewards. Gathering business requirements and translating them into technical specifications for the MDM solution, managing stakeholder expectations, and addressing concerns throughout the project will be key responsibilities.

Your technical expertise will involve understanding MDM concepts, best practices, and available MDM tools. You will work with technical teams to design and implement MDM architecture, data integration processes, and data mapping; oversee data migration from legacy systems to the MDM platform; participate in the design and implementation of MDM solutions; and manage and maintain the MDM platform, including user access controls, data mapping, and workflow configurations. A strong command of SQL queries to access and manipulate data within relational databases is also required.

Furthermore, you will develop and deliver training programs for end-users on MDM processes and data management practices, and provide ongoing support and troubleshooting for MDM-related issues.

To excel in this role, you should possess strong project management skills, a deep understanding of Master Data Management principles and data governance best practices, excellent communication and stakeholder management abilities, technical proficiency in data integration, data modeling, and MDM tools, business acumen with the ability to translate business requirements into technical solutions, and strong analytical and problem-solving skills.
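The description repeatedly references cleansing, standardizing, and de-duplicating master data. A minimal pandas sketch of that kind of step is shown below; the file, columns, and survivorship rule are illustrative assumptions, not the employer's actual process.

```python
# Minimal sketch: cleansing and de-duplicating customer master data with
# pandas (file and column names are hypothetical).
import pandas as pd

customers = pd.read_csv("customer_master_extract.csv")  # hypothetical extract

# Standardize key fields before matching.
customers["email"] = customers["email"].str.strip().str.lower()
customers["name"] = customers["name"].str.strip().str.title()
customers["phone"] = customers["phone"].str.replace(r"\D", "", regex=True)

# Simple survivorship rule: keep the most recently updated record per email.
golden = (
    customers.sort_values("last_updated", ascending=False)
             .drop_duplicates(subset="email", keep="first")
)

golden.to_csv("customer_golden_records.csv", index=False)
```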
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
As a Cloud Backend Developer at Smart Home Solutions, you will play a crucial role in designing and developing various software products. Your primary responsibility will be to implement prototypes and final applications efficiently and within the set timelines, both creating new products and enhancing existing ones. Your tasks will include developing rapid prototypes, sample data sets, and data models that align with user interface requirements and business objectives. Effective communication and consistent feedback during the design phase will be essential for creating secure, high-speed, and effective backend integrations.

We are seeking a self-motivated individual with exceptional problem-solving skills, clear thinking, and a proactive approach to work. Your constructive and enthusiastic work style will be pivotal in ensuring the successful delivery of our next-generation initiatives.

Key responsibilities:
- Designing and developing scalable and flexible AWS cloud solutions to establish connections with databases and applications
- Collaborating with front-end developers and utilizing UI design capabilities for seamless integration
- Leading backend development efforts to meet project requirements and timelines effectively

If you have 4-8 years of experience in the field and are ready to take on this exciting opportunity, we look forward to your application. This is a full-time position based in Hyderabad with a notice period of 0-15 days. Join us in shaping the future of digital cloud web technologies and make a significant impact on the industry!
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
As a Machine Learning Manager at our company, you will lead a high-performing team and drive the development of innovative machine learning solutions. Your expertise in machine learning techniques, programming languages such as Python, R, Java, and Scala, data structures, data modeling, and software architecture will be crucial for success in this role.

Your role will involve developing and implementing machine learning models to solve complex business problems, collaborating with cross-functional teams to translate business objectives into technical solutions, and designing scalable machine learning systems. You will also conduct exploratory data analysis, feature engineering, and model evaluation, staying up to date on the latest advancements in machine learning research and technologies.

In addition to technical excellence, strong leadership capabilities are essential in this role. You will mentor junior team members, drive best practices in coding, testing, and documentation, and lead end-to-end machine learning initiatives aligned with business goals. Your expertise in defining scalable ML architectures, selecting appropriate tools and frameworks, and leading cross-functional teams to deliver impactful ML solutions will be critical.

The preferred candidate has at least 6 years of experience in machine learning and a Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field. Proficiency in programming languages, hands-on experience with machine learning frameworks, and a solid understanding of data structures and software architecture are required. Experience with MLOps and MLaaS is highly advantageous, along with strong problem-solving skills and excellent communication abilities.

Joining our team offers the opportunity to focus on impactful work in a flexible and supportive environment, maintain a healthy work-life balance, and collaborate with a motivated, goal-oriented team. You will benefit from a competitive compensation package, work with cutting-edge technologies, and see your work make a tangible impact on the lives of millions of customers.
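Since the posting covers exploratory analysis, feature engineering, and model evaluation in Python, here is a minimal scikit-learn sketch of a train-and-evaluate loop on synthetic data; the actual business problem, features, and model choice are not specified in the role and are assumed here.

```python
# Minimal sketch: training and evaluating a classifier with scikit-learn on
# synthetic data (the real problem and features are not given in the posting).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real feature matrix and label vector.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate on the held-out split.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```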
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
delhi
On-site
As a Sales Planning & Analysis professional, you will support the National Sales Manager (NSM) in developing region-wise quarterly and annual sales plans. Your role will involve conducting performance diagnostics by region, aligning strategies with sales targets, and providing actionable insights through data modeling and advanced analytics.

In the area of Field Force Alignment & Execution Oversight, you will coordinate closely with Regional Business Managers (RBMs) and Area Business Managers (ABMs) to ensure that quarterly plans are well understood and implemented. Monitoring the potential and performance of doctors/institutions, supporting resource allocation, and auditing field execution standards to identify process deviations or gaps will also be key aspects of your role.

You will also oversee Stock Movement & Demand Forecasting by tracking regional stock movement patterns to prevent excess inventory or shortfalls. This includes detecting and addressing early warning signs of product dumping, as well as collaborating with supply chain and demand planning teams to refine forecasts based on real-time trends and market dynamics.

In the domain of Strategic Reporting & Collaboration, you will generate regular performance dashboards, market opportunity reports, and deviation alerts for leadership review. You will act as a strategic partner to marketing and medical affairs in rolling out region-specific campaigns or initiatives, and provide input into incentive planning and budgeting from a regional performance perspective.

Overall, this role requires a detail-oriented individual with strong analytical skills, the ability to collaborate effectively with cross-functional teams, and a strategic mindset to drive sales planning and analysis initiatives successfully.
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You should have 3-5 years of experience in ETL/DWH and database testing. You must be proficient in ETL testing tools and frameworks, with hands-on experience creating and executing test scripts, and hands-on experience testing data completeness and quality for various data feeds. Conceptual knowledge of data warehousing concepts and data modeling is required for this role. You should be well versed in relational databases, non-relational databases, data streams, and file stores, and have hands-on experience creating and executing stored procedures, functions, tables, views, and cursors.

As part of the job, you will work closely with development and product teams to understand requirements and ensure that quality considerations are integrated into the SDLC. You should possess excellent analytical and problem-solving skills with attention to detail. Developing metrics and reports on software quality, test coverage, and the effectiveness of QA processes will be part of your responsibilities. Hands-on experience with at least one test management tool, from requirement mapping through closure, is required, and experience working in Scrum/Agile methodologies is preferred. You should have a proven ability to work collaboratively in a team environment and communicate effectively with stakeholders.

The locations available for this position are Pune, Bangalore, Hyderabad, and Chennai.
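Data completeness testing of the kind described above often reduces to comparing source and target counts per load. The sketch below shows one such check; sqlite3 and the toy tables keep it self-contained, whereas a real engagement would point at the project's actual source and warehouse connections.

```python
# Minimal sketch: an ETL completeness check comparing source and target row
# counts per load date (sqlite3 keeps the example self-contained).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, load_date TEXT);
    CREATE TABLE dwh_orders (order_id INTEGER, load_date TEXT);
    INSERT INTO src_orders VALUES (1, '2024-01-01'), (2, '2024-01-01'), (3, '2024-01-02');
    INSERT INTO dwh_orders VALUES (1, '2024-01-01'), (2, '2024-01-01');
""")

query = """
    SELECT s.load_date, s.src_count, COALESCE(t.tgt_count, 0) AS tgt_count
    FROM (SELECT load_date, COUNT(*) AS src_count FROM src_orders GROUP BY load_date) s
    LEFT JOIN (SELECT load_date, COUNT(*) AS tgt_count FROM dwh_orders GROUP BY load_date) t
      ON s.load_date = t.load_date
"""
for load_date, src_count, tgt_count in conn.execute(query):
    status = "OK" if src_count == tgt_count else "MISMATCH"
    print(f"{load_date}: source={src_count} target={tgt_count} -> {status}")
```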
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
As a skilled and motivated Technical Lead Golang Developer at Omniful in Gurugram, you will play a crucial role in designing and developing efficient, scalable, and high-performing software solutions using Golang. Your responsibilities will include guiding a team of engineers, making key architecture decisions, and actively contributing to the entire software development lifecycle from concept to deployment.

You will lead the design, development, testing, and deployment of backend services and APIs using Golang, and drive architectural decisions and system design for distributed systems and microservices. Mentoring and guiding junior developers on best practices, code quality, and development standards will also be part of your role. Collaboration with product managers, frontend developers, and QA team members is essential to deliver robust and scalable solutions. You will write clean, maintainable, and well-documented code, troubleshoot and resolve complex technical issues and bugs, and conduct code reviews to ensure adherence to development and security standards.

Your expertise should include proficiency in Golang, hands-on experience building web services and backend systems, a solid understanding of data structures, algorithms, and design patterns, and experience with concurrency models and performance optimization in Golang. Strong experience building and consuming RESTful APIs, gRPC, and GraphQL is preferred, along with experience in API versioning and documentation. A deep understanding of microservices architecture, experience with message queues such as Kafka, RabbitMQ, and NATS, event-driven architecture, and proficiency in containerization and orchestration tools like Docker and Kubernetes are desired. Familiarity with unit testing, integration testing, and test automation frameworks in Golang, along with CI/CD pipelines using tools like Jenkins, GitLab CI, GitHub Actions, or similar, is also important. Experience with both SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis) databases and an understanding of data modeling, indexing, and query optimization are required. Basic knowledge of authentication, authorization, and secure coding practices is essential, and exposure to cloud platforms like AWS, GCP, or Azure is a plus.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field, along with a minimum of 4 years of industry experience in backend development, preferably with Golang.
Posted 5 days ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
As an Analytics Manager specializing in Power BI, Python, Tableau, and SQL within the insurance domain, you will be responsible for designing, developing, and implementing Power BI dashboards. Your expertise in data warehousing, ETL processes, and data governance will be crucial for this role.

Your key responsibilities will include leading, mentoring, and developing a team of data analysts and data scientists, and providing strategic direction to ensure the timely delivery of analytical projects. You will define and implement the company's data analytics strategy, collaborate with stakeholders to understand data needs, conduct complex data analysis, and translate findings into strategic recommendations. You will oversee the development of interactive dashboards, reports, and visualizations to make data insights accessible to technical and non-technical stakeholders. Ensuring data integrity across systems, enforcing data governance policies, and collaborating with cross-functional teams will also be part of your role.

To qualify for this position, you should have a Bachelor's degree in Data Science, Statistics, Computer Science, Engineering, or a related field. With 10+ years of experience in data analysis, including at least 2 years in a managerial role, you should be proficient in SQL, Python, R, Tableau, and Power BI. Strong knowledge of data modeling, ETL processes, and database management is essential, along with exceptional problem-solving skills and the ability to communicate technical concepts to non-technical stakeholders. Experience managing and growing teams of data professionals, strong project management skills, and domain knowledge in insurance are desirable. Staying current with the latest data analytics trends and technologies and leading data-driven projects from initiation to execution will be key aspects of this role.
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Manager of HR at XPO India Shared Services in Hyderabad, you will play a crucial role in the Business Intelligence Division by designing, developing, and maintaining interactive data visualizations and reports using Looker. Your responsibilities will include creating and optimizing data models to support business requirements, integrating Looker reports into other applications for enhanced business capabilities, and monitoring and optimizing the performance of Looker reports and dashboards. Additionally, you will collaborate with business stakeholders to understand their data visualization and business intelligence needs, while implementing security measures on data to ensure compliance with data governance policies.

To succeed in this role, you should possess a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with at least 2 years of experience in data analysis, data visualization, and business intelligence using BI tools such as Looker, Power BI, or Tableau. Proficiency in writing SQL queries, a solid understanding of data warehouse and data modeling concepts, strong analytical and problem-solving skills, and excellent communication and teamwork skills are essential. Experience with cloud platforms such as Google Cloud Platform and Google BigQuery, programming languages like Python and R, version control tools such as GitHub, SVN, and TFS, as well as Google Cloud Platform or Looker certification, would be advantageous.

Join XPO and be part of something big. We offer competitive compensation and a generous benefits package, including medical insurance, OPD benefits, term insurance, accidental insurance, and more. Office timings are from 2 pm to 11 pm, providing you with a conducive environment to thrive in your role as an Associate Analyst, Business Intelligence.
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As an ETL Testing & Big Data professional, you will be responsible for designing and implementing ETL test strategies based on business requirements. Your role involves reviewing and analyzing ETL source code, as well as developing and executing test plans and test cases for ETL processes. Data validation and reconciliation using SQL queries will be a key aspect of your responsibilities. Monitoring ETL jobs, resolving issues affecting data accuracy, and performing performance testing on ETL processes with a focus on optimization are crucial tasks in this role. Ensuring data quality and integrity across various data sources, along with coordinating with development teams to troubleshoot issues and suggest improvements, is essential for success.

You will be expected to use automation tools to improve the efficiency of testing processes and conduct regression testing after ETL releases or updates. Documenting test results, issues, and proposals for resolution, as well as supporting business users with data-related queries, are integral parts of your responsibilities. Staying current with the latest trends in ETL testing and big data technologies, working closely with data architects to ensure effective data modeling, and participating in technical discussions to contribute to knowledge sharing are key aspects of this role.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in ETL testing and big data environments.
- Strong proficiency in SQL and data modeling techniques.
- Hands-on experience with the Hadoop ecosystem and related tools.
- Familiarity with ETL tools such as Informatica, Talend, or similar.
- Experience with data quality frameworks and methodologies.
- Knowledge of big data technologies like Spark, Hive, or Pig.
- Excellent analytical and problem-solving skills.
- Proficient communication skills for effective collaboration.
- Ability to manage multiple tasks and meet deadlines efficiently.
- Experience in Java or scripting languages is a plus.
- Strong attention to detail and a commitment to delivering quality work.
- Certifications in data management or testing are a plus.
- Ability to work independently and as part of a team.
- Willingness to adapt to evolving technologies and methodologies.

Skills required: scripting languages, data modeling, data quality frameworks, Hive, Talend, analytical skills, SQL, performance testing, automation tools, Pig, Hadoop ecosystem, ETL testing, Informatica, Hadoop, data quality, big data, Java, regression testing, Spark.
Posted 5 days ago
15.0 - 20.0 years
0 Lacs
ahmedabad, gujarat
On-site
We are seeking a skilled Lead Data Engineer to join the dynamic team at e.l.f. Beauty, Inc. In this role, you will be responsible for designing, developing, and maintaining data pipelines, integrations, and data warehouse infrastructure. You will collaborate with data scientists, analysts, and business stakeholders to ensure data is accurate, secure, and accessible to all users.

Responsibilities:
- Design and build scalable data pipeline architecture capable of handling large data volumes
- Develop ELT/ETL pipelines to extract, load, and transform data from various sources into the data warehouse
- Optimize and maintain data infrastructure for high availability and performance
- Collaborate with data scientists and analysts to implement improvements to data pipelines and models
- Develop and maintain data models to support business requirements
- Ensure data security and compliance with governance policies
- Identify and troubleshoot data quality issues
- Automate and streamline data management processes
- Stay updated on emerging data technologies and trends for continuous improvement of the data infrastructure
- Analyze data products and requirements to align with the data strategy
- Assist in extracting and researching data for cross-functional business partners
- Enhance the efficiency, automation, and accuracy of existing reports
- Follow best practices in data querying and manipulation to preserve data integrity

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field
- 15+ years of experience as a Data Engineer or in a related role
- Experience with Snowflake, including building, maintaining, and documenting data pipelines
- Strong knowledge of Snowflake concepts such as RBAC management, virtual warehouses, file formats, streams, zero-copy clone, time travel, etc.
- Proficiency in SQL development, ELT/ETL tools, data standardization, cleansing, enrichment, and modeling
- Skills in one or more programming languages such as Python, Java, or C#
- Experience with cloud computing platforms such as AWS, Azure, or GCP
- Knowledge of ELT/ETL processes, data warehousing, data modeling, security, and governance best practices
- Strong problem-solving and analytical skills
- Excellent communication and collaboration skills

This job description provides an overview of the responsibilities and qualifications required for the Lead Data Engineer position at e.l.f. Beauty, Inc. It is not an exhaustive list and may be subject to change at the supervisor's discretion.
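The requirements centre on Snowflake ELT pipelines built in Python. A minimal sketch using the official snowflake-connector-python package follows; the account, credentials, stage, and table names are placeholders, and the SQL is illustrative only.

```python
# Minimal sketch: loading staged files into Snowflake with the official
# Python connector (account, credentials, stage, and table names are
# placeholders, not real values).
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # placeholder
    user="ELT_SERVICE",       # placeholder
    password="***",           # placeholder; use a secrets manager in practice
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
try:
    # COPY INTO pulls files from an external stage into the raw table.
    cur.execute(
        "COPY INTO RAW.ORDERS FROM @ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
    )
    # A simple transform step materializes a cleaned version of the load.
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.ORDERS AS
        SELECT ORDER_ID, CUSTOMER_ID, TRY_TO_DATE(ORDER_DATE) AS ORDER_DATE, AMOUNT
        FROM RAW.ORDERS
        WHERE ORDER_ID IS NOT NULL
    """)
finally:
    cur.close()
    conn.close()
```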
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Engineering Senior Specialist (Databricks) at Nasdaq Bangalore, you will be joining the Bangalore technology center in India, where innovation and effectiveness are the driving forces. Nasdaq is at the forefront of revolutionizing markets and constantly evolves by adopting new technologies to create innovative solutions, aiming to shape the future.

In this role, your primary responsibility will be to analyze defined business requirements and provide analytical insights, modeling, dimensional modeling, and testing to design solutions that meet customer needs effectively. You will focus on understanding business data needs and translating them into adaptable, extensible, and sustainable data structures. As a Databricks Data Engineer, you will design, build, and maintain data pipelines within the Databricks Lakehouse Platform, enabling efficient data processing, analysis, and reporting for data-driven initiatives. You will use the Databricks Lakehouse Platform for data engineering tasks, implement ETL tasks using Apache Spark SQL and Python, and develop ETL pipelines following the Medallion Architecture. Moreover, you will add new sources to the Lakehouse platform, review technology platforms on the AWS cloud, supervise data extraction methods, resolve technical issues, and ensure project delivery within the assigned timeline and budget. You will also lead administrative tasks, ensuring completeness and accuracy in administration processes.

To excel in this role, you are expected to have 8-10 years of overall experience, with at least 5-6 years of specific data engineering experience on Databricks. Proficiency in SQL and Python for data manipulation, knowledge of modern data technologies and cloud computing platforms like AWS, data modeling, architecture, and best practices, and familiarity with AI/ML Ops in Databricks are essential. A Bachelor's/Master's degree in a relevant field or an equivalent qualification is required. Knowledge of Terraform and certifications in relevant fields would be advantageous.

Nasdaq offers a vibrant and entrepreneurial work environment where taking initiative, challenging the status quo, and embracing intelligent risks are encouraged. The company values diversity, inclusivity, and work-life balance in a hybrid-first environment. As an employee, you can benefit from various perks such as an annual monetary bonus, becoming a Nasdaq shareholder, health insurance, flexible working schedules, internal mentorship programs, and a wide selection of online learning resources.

If you believe you possess the required skills and experience for this role, we encourage you to submit your application in English as soon as possible. The selection process is ongoing, and we aim to get back to you within 2-3 weeks. Nasdaq is committed to providing reasonable accommodations to individuals with disabilities throughout the job application and interview process, ensuring equal access to employment opportunities. If you require any accommodations, please reach out to us to discuss your needs.
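The role calls for ETL pipelines on Databricks following the Medallion Architecture. Below is a minimal bronze-to-silver sketch in PySpark; the catalog, schema, and path names are hypothetical, and a Databricks runtime is assumed to provide the `spark` session.

```python
# Minimal sketch of a bronze-to-silver step in a Medallion-style pipeline on
# Databricks (catalog, schema, and path names are hypothetical). Assumes a
# Databricks runtime where `spark` is already provided.
from pyspark.sql import functions as F

# Bronze: land raw files as-is, keeping ingestion metadata.
bronze = (
    spark.read.json("/Volumes/raw/market_feed/")   # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.mode("append").saveAsTable("lakehouse.bronze.market_feed")

# Silver: apply typing, de-duplication, and basic quality filters.
silver = (
    spark.table("lakehouse.bronze.market_feed")
    .dropDuplicates(["trade_id"])
    .withColumn("trade_date", F.to_date("trade_date"))
    .filter(F.col("trade_id").isNotNull())
)
silver.write.mode("overwrite").saveAsTable("lakehouse.silver.market_feed")
```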
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Business Intelligence (BI) Developer Foundational Team Lead, you will be responsible for establishing the foundation of the BI function in our organization. The role requires a combination of technical expertise and strong business understanding, particularly in the multifamily real estate domain. Your primary objective will be to design, develop, and manage end-to-end BI solutions that facilitate data-driven decisions and operational excellence. In addition to analytical skills, effective communication and training abilities will be crucial to enhance user adoption and maximize the business value derived from BI solutions.

Your key responsibilities will include developing BI solutions by creating interactive dashboards, visualizations, and reports using tools such as Power BI or equivalent platforms like Tableau. You will also handle data engineering tasks, including designing, maintaining, and optimizing data models and pipelines to ensure data integrity, scalability, and performance. Advanced scripting skills will be required to write complex SQL queries and use M script (Power Query) for data transformation and cleaning. Integration across the Power Platform, including Power Apps, Copilot Studio, and Microsoft Fabric, will be essential for comprehensive business solutions. Understanding the business needs, especially in the multifamily real estate sector, and translating them into scalable technical solutions will be a critical aspect of your role, as will conducting training sessions, supporting business users, and collaborating with cross-functional teams to gather requirements and facilitate strategic decision-making. You will also document processes, models, and dashboards while ensuring data quality and compliance with data governance practices.

To qualify for this role, you should have a minimum of 3 years of hands-on BI development experience with SQL, Power BI, and data modeling. Proficiency in Power BI, SQL, and Power Platform components (Power Apps, Copilot Studio, Fabric) is essential, and knowledge of Tableau would be advantageous. Experience or familiarity with multifamily real estate and related domains, along with excellent communication skills to convey complex insights clearly and train non-technical stakeholders, are required. A lifelong-learner mindset, coupled with the ability to work independently and adapt to technological advancements, will be beneficial in this role.

Preferred qualifications include experience with LIHTC (Low-Income Housing Tax Credit) projects, exposure to data governance and data hygiene best practices, a proven ability to lead or mentor junior developers in a BI environment, and a degree in Computer Science, Information Technology, Data Analytics, or a related field.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Technical Lead in Data Governance at CitiusTech, you will play a crucial role in an Agile team by designing and constructing healthcare applications and implementing new features while adhering to the highest coding and development standards. Your responsibilities will include owning and managing metadata documentation, such as table and column descriptions, across enterprise data platforms. You will generate, present, and explain metric definitions and metadata standards to various stakeholder groups. Collaboration with data owners and stewards to ensure consistency and accuracy of business definitions and data lineage will be a key aspect of your role. Additionally, you will support the implementation and maintenance of the Advanced Analytics Governance Framework and work closely with analytics teams to ensure that governed metrics align with business objectives and are consistently applied across dashboards and reports. You will also maintain and enhance metadata repositories using tools like Databricks Unity Catalog and Metric Insights, and promote the adoption of AI tools that leverage metadata to improve data discovery and usage. Acting as a liaison between technical teams and business stakeholders to ensure that metadata is understandable and actionable, as well as contributing to the continuous improvement of governance processes, standards, and documentation practices, will also be part of your responsibilities.

The ideal candidate for this role has 5-7 years of experience, an Engineering degree (BE/ME/BTech/MTech/BSc/MSc), and a background in information systems, data science, or business analytics. Mandatory technical skills include over 3 years of experience in data governance, metadata management, or data cataloging; a strong understanding of data modeling, data lineage, and business glossary concepts; excellent communication skills; and familiarity with tools like Databricks Unity Catalog and Metric Insights. Knowledge of healthcare data and analytics, as well as experience with Epic and other EHRs, is also required.

Join CitiusTech and be part of an organization committed to combining the best of IT services, consulting, products, accelerators, and frameworks with a client-first mindset and next-gen tech understanding to humanize healthcare and make a positive impact on human lives. Embrace our core values of Passion, Respect, Openness, Unity, and Depth (PROUD) of knowledge, and enjoy a fun, transparent, non-hierarchical, diverse work culture focused on continuous learning and work-life balance. Experience a rewarding career with comprehensive benefits at CitiusTech, where you can thrive both personally and professionally. Let's shape the future of healthcare together and positively impact human lives. To learn more about CitiusTech, visit https://www.citiustech.com/careers. Happy applying!
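Owning table and column descriptions in Databricks Unity Catalog, as described above, can be scripted with Spark SQL. The sketch below is illustrative only; the catalog, table, and column names are hypothetical, and a Databricks session providing `spark` is assumed.

```python
# Minimal sketch: maintaining table and column descriptions in Unity Catalog
# via Spark SQL on Databricks (names are hypothetical; assumes a Databricks
# session where `spark` is available).
table = "analytics.clinical.encounters"  # hypothetical governed table

spark.sql(
    f"COMMENT ON TABLE {table} IS "
    "'One row per patient encounter, sourced from the EHR feed.'"
)

column_descriptions = {
    "encounter_id": "Unique identifier assigned by the source EHR.",
    "admit_ts": "Admission timestamp in UTC.",
    "discharge_ts": "Discharge timestamp in UTC; null while the encounter is open.",
}
for column, description in column_descriptions.items():
    # ALTER COLUMN ... COMMENT updates the column-level description.
    spark.sql(f"ALTER TABLE {table} ALTER COLUMN {column} COMMENT '{description}'")
```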
Posted 5 days ago
1.0 - 5.0 years
0 Lacs
maharashtra
On-site
As a global leader in assurance, tax, transaction, and advisory services, EY is committed to hiring and developing passionate individuals to contribute to building a better working world. We foster a culture that believes in providing training, opportunities, and creative freedom to help individuals reach their full potential. At EY, we focus not only on who you are at present, but also on who you aspire to become. We believe that your career is yours to shape, offering limitless potential, and we are dedicated to providing motivating and fulfilling experiences throughout your professional journey to support you in becoming your best self.

The opportunity available is for the role of Consultant-FS-Business Consulting Risk-CNS - Risk - Digital Risk in Mumbai within the Financial Services sector. Today's financial services institutions face significant challenges such as comprehensive regulatory changes, digital transformation, convergence, and disruption from non-traditional competitors, while also meeting increasing demands for trust and transparency. In response to these complex issues, our team of proficient business strategists, technologists, and industry leaders brings fresh perspectives and sector knowledge across banking and capital markets, insurance, and wealth and asset management. This collaboration results in innovative problem-solving, breakthrough performance gains, and sustainable value creation.

Within the CNS - Risk - Digital Risk sector, EY Consulting is dedicated to transforming businesses through the power of people, technology, and innovation. Our client-centric approach focuses on delivering long-term value by addressing our clients' most strategic challenges. EY Consulting comprises three sub-service lines: Business Consulting (including Performance Improvement and Risk Consulting), Technology Consulting, and People Advisory Services. We assist clients in identifying and managing the interplay between upside and downside risks so they can make informed decisions that align with their future business strategies and objectives across Enterprise Risk, Technology Risk, and Financial Services Risk.

Your responsibilities in this role include demonstrating technical excellence by understanding project requirements, engaging with key stakeholders, providing timely updates to seniors, preparing reports and presentations, attending training sessions, and delivering outputs in line with EY's quality standards. You will also be responsible for multitasking and managing multiple projects as directed by managers.

To qualify for this role, you must hold a Master's degree in computer science, information technology, or business administration, be a Chartered Accountant or Certified Internal Auditor, or have a Bachelor's in Engineering, along with 1 to 3 years of relevant experience. We are looking for individuals who can work collaboratively across client departments, adhere to commercial and legal requirements, offer practical solutions to complex problems, and demonstrate agility, curiosity, mindfulness, positive energy, adaptability, and creativity.

EY offers a dynamic work environment with numerous opportunities for growth and learning. With a vast client base, a global team of over 300,000 professionals, and a strong presence in India, EY is a leading employer known for market-leading growth and innovation. We are committed to investing in the skills and development of our people, providing personalized career journeys, and promoting inclusivity to maintain a balanced and supportive work environment. If you meet the criteria outlined above and are eager to contribute to building a better working world, we encourage you to apply and join us on this exciting journey. Apply now to be part of our team at EY.
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Software Developer on the Business Intelligence team at Mediaocean, you will play a crucial role in designing, developing, and maintaining microservices that support BI data pipelines and analytics platforms. Your responsibilities will include building and optimizing ETL processes using Python and SQL, collaborating with various teams to deliver high-impact BI solutions, implementing CI/CD pipelines for seamless deployment, and applying OOP principles to develop modular and testable code. You will monitor and troubleshoot production systems to ensure high availability and performance, and contribute to data modeling and architecture decisions for scalable BI infrastructure.

Hands-on experience in software development with a focus on BI or data-centric applications, proficiency in Python, a strong understanding of SQL and relational databases, and familiarity with Jenkins, Git, microservices architecture, and containerization tools will be essential for success in this role. You should also possess excellent problem-solving skills, attention to detail, and strong communication and collaboration abilities. Experience in Adtech or digital marketing domains, exposure to BI tools like Sisense and Jasper, an understanding of data warehousing, and familiarity with Agile methodologies and sprint-based development will be advantageous.

At Mediaocean, we value transferable skills and education, so even if you do not meet every requirement listed, we encourage you to apply. Join our team and be part of an innovative company that is shaping the future of the advertising ecosystem.
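The role combines microservices with Python/SQL ETL for BI. As an illustration under stated assumptions, here is a minimal FastAPI service that fronts a toy ETL step; the endpoints, SQL, and local SQLite store are hypothetical and not Mediaocean's actual stack.

```python
# Minimal sketch: a small FastAPI microservice fronting a toy BI ETL step
# (endpoints, schema, and the local SQLite store are hypothetical).
import sqlite3

from fastapi import FastAPI

app = FastAPI(title="bi-etl-service")
DB_PATH = "bi.db"  # hypothetical local store standing in for a real warehouse


def run_daily_spend_etl() -> int:
    """Aggregate raw ad events into a daily spend table and return its row count."""
    conn = sqlite3.connect(DB_PATH)
    with conn:
        # Ensure a raw table exists so the sketch is self-contained.
        conn.execute(
            "CREATE TABLE IF NOT EXISTS raw_events "
            "(campaign_id TEXT, event_ts TEXT, cost REAL)"
        )
        conn.execute("DROP TABLE IF EXISTS daily_spend")
        conn.execute("""
            CREATE TABLE daily_spend AS
            SELECT campaign_id, DATE(event_ts) AS spend_date, SUM(cost) AS spend
            FROM raw_events
            GROUP BY campaign_id, DATE(event_ts)
        """)
        rows = conn.execute("SELECT COUNT(*) FROM daily_spend").fetchone()[0]
    conn.close()
    return rows


@app.post("/etl/daily-spend")
def trigger_daily_spend():
    """Rebuild the daily_spend aggregate and report how many rows it now holds."""
    return {"rows_loaded": run_daily_spend_etl()}


@app.get("/health")
def health():
    return {"status": "ok"}
```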
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
kochi, kerala
On-site
We are hiring a Senior Java Developer to join the team at P Square Solutions, which is part of Neology Inc (www.neology.com). The company is looking to fill two open positions for individuals with 8 to 10 years of experience in IT products & services and IT consulting. This is a full-time position based in Smart City, Kochi, Kerala. Shift timing may vary by project, typically falling within day/evening shifts.

As a Senior Java Developer, you are expected to be an experienced technical expert with a deep understanding of software development. The ideal candidate has 8-10 years of experience, preferably in Java, strong architectural skills, and the ability to design and implement complex systems effectively.

Your key responsibilities will include designing and developing robust, scalable Java applications, architecting and implementing middleware/API solutions, creating optimized database schemas and data models, designing efficient technical processes and workflows, solving complex technical challenges independently, maintaining high code quality standards through reviews and best practices, and collaborating with cross-functional teams to deliver technical solutions.

In addition to technical competencies, you are expected to demonstrate soft skills such as advanced problem-solving abilities, strong communication skills to convey technical concepts to diverse audiences, the capacity to learn quickly, professionalism in client interactions, effective knowledge sharing with team members, the ability to work independently with minimal supervision, and flexibility in adapting to changing requirements and technologies. Other desirable competencies include experience with cloud platforms (AWS, Azure, or GCP), knowledge of microservices architecture, experience with distributed systems, familiarity with DevOps practices and tools, and contributions to open-source projects.

P Square Solutions LLC, a part of Neology Inc, is a leading firm in toll systems solutions and systems integration services. The company is committed to delivering innovative toll solutions and exceptional service to clients, with core values of integrity, collaboration, and excellence. It offers a good work culture, career opportunities with competitive salaries, excellent employee benefits, opportunities for learning and career growth, work-life balance through a balanced leave policy, and a holistic approach to talent development and nurturing work culture. The company values feedback and is dedicated to enhancing the work environment at P-Square.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
A career in our Advisory Acceleration Centre is the natural extension of PwC's leading class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,
Posted 5 days ago
4.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
The position is based in Hyderabad, India, within the IT department at Company XYZ. As a member of the Data Product Managers team, you will play a crucial role in developing quality data collection processes, ensuring the integrity of data foundations, and facilitating rapid access to data for decision-making and innovation by business leaders, data scientists, and data engineers.

Your key responsibilities will include:
- Taking ownership of end-to-end Data + Analytics delivery execution, optimizing resource allocation and sequencing.
- Overseeing the design, construction, and management of business-ready data within the Enterprise Data Foundation.
- Facilitating coordination between Domains, Products, and Projects to drive re-use and reduce redundancy in the D+A and broader PepsiCo portfolio.
- Creating data roadmaps to meet hydration targets and support timely delivery for global data initiatives.
- Managing delivery against key technical milestones and reporting progress against hydration milestones.
- Leading cross-chapter Pod resources towards shared goals and participating in Planning to align domain goals with program/product milestones.
- Collaborating with multiple stakeholders to define Epic and feature definitions and guide user stories for delivery.
- Designing and documenting data product artifacts required by the team and ensuring technical documentation is accessible to key stakeholders.
- Cataloging data in the Data Foundation for easy access by business stakeholders and tracking the benefits of insights for leadership acceptance.
- Working with program Data Product Managers to manage expectations and eliminate disconnects on what can be addressed.

The ideal candidate should possess:
- 9+ years of experience in Product Management, Data Analytics, Data Science, or Data Management and Operations in business-facing functions.
- 4+ years of experience leading or building advanced analytics and big data solutions, large-scale data modeling, or enterprise SaaS.
- Experience in a data-centric business environment and identifying sources of value from data analytics across core business domains.
- Familiarity with data governance and stewardship principles and tools.
- A history of working in agile environments and successfully delivering complex products.
- Strong communication skills to convey complex information in a clear and concise manner.
- Leadership skills with a team-player attitude to drive end-to-end implementation of use cases under time pressure.
- The ability to drive innovative solutions using data science, feature engineering, and machine learning.
- Proficiency in team development and talent cultivation across various experience levels.
Posted 5 days ago
3.0 - 23.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Yardi Report Developer at Relay Human Cloud, you will be responsible for designing, developing, and maintaining custom reports and data visualization solutions within the Yardi property management software. Your role is integral to providing accurate insights and supporting decision-making for our US-based clients' property management operations.

Your key responsibilities will include developing and maintaining custom YSR reports, collaborating with stakeholders to understand reporting needs, creating dynamic reports and dashboards, troubleshooting report issues, documenting processes, staying updated on Yardi software features, assisting in ETL processes, performing ad-hoc data analysis, and providing training to end-users.

To qualify for this role, you should be proficient in English, hold a Bachelor's degree in computer science or a related field, have at least 3 years of experience with Yardi property management software and expertise in YSR reporting, possess strong SQL and data modeling skills, be familiar with report development tools like Yardi Voyager, SSRS, and Power BI, and exhibit problem-solving abilities and attention to detail. Preferred qualifications include experience in real estate or property management, knowledge of ETL tools, and familiarity with data visualization best practices.

If you are a self-motivated individual with excellent analytical skills and a passion for delivering high-quality reports, we invite you to join our team at Relay Human Cloud and contribute to our mission of connecting companies with top international talent.
Posted 5 days ago